2026-03-10T07:41:53.794 INFO:root:teuthology version: 1.2.4.dev6+g1c580df7a
2026-03-10T07:41:53.804 DEBUG:teuthology.report:Pushing job info to http://localhost:8080
2026-03-10T07:41:53.831 INFO:teuthology.run:Config:
archive_path: /archive/kyr-2026-03-10_01:00:38-orch-squid-none-default-vps/954
branch: squid
description: orch/cephadm/mds_upgrade_sequence/{bluestore-bitmap centos_9.stream conf/{client mds mgr mon osd} fail_fs/yes kernel overrides/{ignorelist_health ignorelist_upgrade ignorelist_wrongly_marked_down pg-warn pg_health syntax} roles tasks/{0-from/reef/{v18.2.1} 1-volume/{0-create 1-ranks/1 2-allow_standby_replay/no 3-inline/yes 4-verify} 2-client/fuse 3-upgrade-mgr-staggered 4-config-upgrade/{fail_fs} 5-upgrade-with-workload 6-verify}}
email: null
first_in_suite: false
flavor: default
job_id: '954'
last_in_suite: false
machine_type: vps
meta:
- desc: 'setup ceph/v18.2.1 '
name: kyr-2026-03-10_01:00:38-orch-squid-none-default-vps
no_nested_subset: false
os_type: centos
os_version: 9.stream
overrides:
  admin_socket:
    branch: squid
  ansible.cephlab:
    branch: main
    skip_tags: nagios,monitoring-scripts,hostname,pubkeys,zap,sudoers,kerberos,ntp-client,resolvconf,cpan,nfs
    vars:
      timezone: UTC
  ceph:
    cluster-conf:
      mgr:
        client mount timeout: 30
        debug client: 20
        debug mgr: 20
        debug ms: 1
        mon warn on pool no app: false
    conf:
      client:
        client mount timeout: 600
        debug client: 20
        debug ms: 1
        rados mon op timeout: 900
        rados osd op timeout: 900
      global:
        mon pg warn min per osd: 0
      mds:
        debug mds: 20
        debug mds balancer: 20
        debug ms: 1
        mds debug frag: true
        mds debug scatterstat: true
        mds op complaint time: 180
        mds verify scatter: true
        osd op complaint time: 180
        rados mon op timeout: 900
        rados osd op timeout: 900
      mgr:
        debug mgr: 20
        debug ms: 1
      mon:
        debug mon: 20
        debug ms: 1
        debug paxos: 20
        mon down mkfs grace: 300
        mon op complaint time: 120
      osd:
        bdev async discard: true
        bdev enable discard: true
        bluestore allocator: bitmap
        bluestore block size: 96636764160
        bluestore fsck on mount: true
        debug bluefs: 1/20
        debug bluestore: 1/20
        debug ms: 1
        debug osd: 20
        debug rocksdb: 4/10
        mon osd backfillfull_ratio: 0.85
        mon osd full ratio: 0.9
        mon osd nearfull ratio: 0.8
        osd failsafe full ratio: 0.95
        osd mclock iops capacity threshold hdd: 49000
        osd objectstore: bluestore
        osd op complaint time: 180
    flavor: default
    fs: xfs
    log-ignorelist:
    - \(MDS_ALL_DOWN\)
    - \(MDS_UP_LESS_THAN_MAX\)
    - FS_DEGRADED
    - filesystem is degraded
    - FS_INLINE_DATA_DEPRECATED
    - FS_WITH_FAILED_MDS
    - MDS_ALL_DOWN
    - filesystem is offline
    - is offline because no MDS
    - MDS_DAMAGE
    - MDS_DEGRADED
    - MDS_FAILED
    - MDS_INSUFFICIENT_STANDBY
    - MDS_UP_LESS_THAN_MAX
    - online, but wants
    - filesystem is online with fewer MDS than max_mds
    - POOL_APP_NOT_ENABLED
    - do not have an application enabled
    - overall HEALTH_
    - Replacing daemon
    - deprecated feature inline_data
    - MGR_MODULE_ERROR
    - OSD_DOWN
    - osds down
    - overall HEALTH_
    - \(OSD_DOWN\)
    - \(OSD_
    - but it is still running
    - is not responding
    - MON_DOWN
    - PG_AVAILABILITY
    - PG_DEGRADED
    - Reduced data availability
    - Degraded data redundancy
    - pg .* is stuck inactive
    - pg .* is .*degraded
    - pg .* is stuck peering
    sha1: e911bdebe5c8faa3800735d1568fcdca65db60df
  ceph-deploy:
    bluestore: true
    conf:
      client:
        log file: /var/log/ceph/ceph-$name.$pid.log
      mon: {}
      osd:
        bdev async discard: true
        bdev enable discard: true
        bluestore block size: 96636764160
        bluestore fsck on mount: true
        debug bluefs: 1/20
        debug bluestore: 1/20
        debug rocksdb: 4/10
        mon osd backfillfull_ratio: 0.85
        mon osd full ratio: 0.9
        mon osd nearfull ratio: 0.8
        osd failsafe full ratio: 0.95
        osd objectstore: bluestore
    fs: xfs
  install:
    ceph:
      flavor: default
      sha1: e911bdebe5c8faa3800735d1568fcdca65db60df
    extra_system_packages:
      deb:
      - python3-xmltodict
      - python3-jmespath
      rpm:
      - bzip2
      - perl-Test-Harness
      - python3-xmltodict
      - python3-jmespath
  kclient:
    syntax: v1
  selinux:
    allowlist:
    - scontext=system_u:system_r:logrotate_t:s0
    - scontext=system_u:system_r:getty_t:s0
  thrashosds:
    bdev_inject_crash: 2
    bdev_inject_crash_probability: 0.5
  workunit:
    branch: tt-squid
    sha1: 75a68fd8ca3f918fe9466b4c0bb385b7fc260a9b
owner: kyr
priority: 1000
repo: https://github.com/ceph/ceph.git
roles:
- - host.a
  - client.0
  - osd.0
  - osd.1
  - osd.2
- - host.b
  - client.1
  - osd.3
  - osd.4
  - osd.5
seed: 8043
sha1: e911bdebe5c8faa3800735d1568fcdca65db60df
sleep_before_teardown: 0
subset: 1/64
suite: orch
suite_branch: tt-squid
suite_path: /home/teuthos/src/github.com_kshtsk_ceph_75a68fd8ca3f918fe9466b4c0bb385b7fc260a9b/qa
suite_relpath: qa
suite_repo: https://github.com/kshtsk/ceph.git
suite_sha1: 75a68fd8ca3f918fe9466b4c0bb385b7fc260a9b
targets:
  vm05.local: ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBLjvvLfyCbJfqoKPTuRTH6g/4/E/vkESTQxDsJ5NTg7U2jdn4vp+aA9LaLHDR4WpL6ZmMwCeUK4HeUkxLFIW8yE=
  vm08.local: ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBPbUPsBaLkFaFvyeSyL0Fro+8bal4F4qWZruFdfreV4B0+MAR2QsEfuss4YK7XHDwhT9QyNtVGz3xw+lhz4/M9I=
tasks:
- install:
    exclude_packages:
    - ceph-volume
    tag: v18.2.1
- print: '**** done install task...'
- cephadm:
    compiled_cephadm_branch: reef
    conf:
      osd:
        osd_class_default_list: '*'
        osd_class_load_list: '*'
    image: quay.io/ceph/ceph:v18.2.1
    roleless: true
- print: '**** done end installing v18.2.1 cephadm ...'
- cephadm.shell:
    host.a:
    - ceph config set mgr mgr/cephadm/use_repo_digest true --force
- print: '**** done cephadm.shell ceph config set mgr...'
- cephadm.shell:
    host.a:
    - ceph orch status
    - ceph orch ps
    - ceph orch ls
    - ceph orch host ls
    - ceph orch device ls
- cephadm.shell:
    host.a:
    - ceph fs volume create cephfs --placement=4
    - ceph fs dump
- cephadm.shell:
    host.a:
    - ceph fs set cephfs max_mds 1
- cephadm.shell:
    host.a:
    - ceph fs set cephfs allow_standby_replay false
- cephadm.shell:
    host.a:
    - ceph fs set cephfs inline_data true --yes-i-really-really-mean-it
- cephadm.shell:
    host.a:
    - ceph fs dump
    - ceph --format=json fs dump | jq -e ".filesystems | length == 1"
    - while ! ceph --format=json mds versions | jq -e ". | add == 4"; do sleep 1; done
- fs.pre_upgrade_save: null
- ceph-fuse: null
- print: '**** done client'
- parallel:
  - upgrade-tasks
  - workload-tasks
- cephadm.shell:
    host.a:
    - ceph fs dump
- fs.post_upgrade_checks: null
teuthology:
  fragments_dropped: []
  meta: {}
  postmerge:
  - "local kernel = py_attrgetter(yaml).get('kernel')\nif kernel ~= nil then\n local\ \ branch = py_attrgetter(kernel).get('branch')\n if branch and not kernel.branch:find\ \ \"-all$\" then\n log.debug(\"removing default kernel specification: %s\"\ , kernel)\n py_attrgetter(kernel).pop('branch', nil)\n py_attrgetter(kernel).pop('deb',\ \ nil)\n py_attrgetter(kernel).pop('flavor', nil)\n py_attrgetter(kernel).pop('kdb',\ \ nil)\n py_attrgetter(kernel).pop('koji', nil)\n py_attrgetter(kernel).pop('koji_task',\ \ nil)\n py_attrgetter(kernel).pop('rpm', nil)\n py_attrgetter(kernel).pop('sha1',\ \ nil)\n py_attrgetter(kernel).pop('tag', nil)\n end\nend\n"
  variables:
    fail_fs: true
teuthology_branch: clyso-debian-13
teuthology_repo: https://github.com/clyso/teuthology
teuthology_sha1: 1c580df7a9c7c2aadc272da296344fd99f27c444
timestamp: 2026-03-10_01:00:38
tube: vps
upgrade-tasks:
  sequential:
  - cephadm.shell:
      env:
      - sha1
      host.a:
      - ceph config set mon mon_warn_on_insecure_global_id_reclaim false --force
      - ceph config set mon mon_warn_on_insecure_global_id_reclaim_allowed false --force
      - ceph config set global log_to_journald false --force
      - ceph orch upgrade start --image quay.ceph.io/ceph-ci/ceph:$sha1 --daemon-types mgr
      - while ceph orch upgrade status | jq '.in_progress' | grep true && ! ceph orch upgrade status | jq '.message' | grep Error ; do ceph orch ps ; ceph versions ; ceph orch upgrade status ; sleep 30 ; done
      - ceph versions | jq -e '.mgr | length == 1'
      - ceph versions | jq -e '.mgr | keys' | grep $sha1
      - ceph versions | jq -e '.overall | length == 2'
      - ceph orch upgrade check quay.ceph.io/ceph-ci/ceph:$sha1 | jq -e '.up_to_date | length == 2'
      - ceph orch ps
  - cephadm.shell:
      env:
      - sha1
      host.a:
      - ceph config set mgr mgr/orchestrator/fail_fs true
  - cephadm.shell:
      env:
      - sha1
      host.a:
      - ceph config set mon mon_warn_on_insecure_global_id_reclaim false --force
      - ceph config set mon mon_warn_on_insecure_global_id_reclaim_allowed false --force
      - ceph config set global log_to_journald false --force
      - ceph orch upgrade start --image quay.ceph.io/ceph-ci/ceph:$sha1
  - cephadm.shell:
      env:
      - sha1
      host.a:
      - while ceph orch upgrade status | jq '.in_progress' | grep true && ! ceph orch upgrade status | jq '.message' | grep Error ; do ceph orch ps ; ceph versions ; ceph fs dump; ceph orch upgrade status ; ceph health detail ; sleep 30 ; done
      - ceph orch ps
      - ceph orch upgrade status
      - ceph health detail
      - ceph versions
      - echo "wait for servicemap items w/ changing names to refresh"
      - sleep 60
      - ceph orch ps
      - ceph versions
      - ceph versions | jq -e '.overall | length == 1'
      - ceph versions | jq -e '.overall | keys' | grep $sha1
user: kyr
verbose: false
worker_log: /home/teuthos/.teuthology/dispatcher/dispatcher.vps.611473
workload-tasks:
  sequential:
  - workunit:
      clients:
        all:
        - suites/fsstress.sh
2026-03-10T07:41:53.831 INFO:teuthology.run:suite_path is set to /home/teuthos/src/github.com_kshtsk_ceph_75a68fd8ca3f918fe9466b4c0bb385b7fc260a9b/qa; will attempt to use it
2026-03-10T07:41:53.831 INFO:teuthology.run:Found tasks at /home/teuthos/src/github.com_kshtsk_ceph_75a68fd8ca3f918fe9466b4c0bb385b7fc260a9b/qa/tasks
2026-03-10T07:41:53.831 INFO:teuthology.run_tasks:Running task internal.check_packages...
2026-03-10T07:41:53.832 INFO:teuthology.task.internal:Checking packages...
2026-03-10T07:41:53.832 INFO:teuthology.task.internal:Checking packages for os_type 'centos', flavor 'default' and ceph hash 'e911bdebe5c8faa3800735d1568fcdca65db60df'
2026-03-10T07:41:53.832 WARNING:teuthology.packaging:More than one of ref, tag, branch, or sha1 supplied; using branch
2026-03-10T07:41:53.832 INFO:teuthology.packaging:ref: None
2026-03-10T07:41:53.832 INFO:teuthology.packaging:tag: None
2026-03-10T07:41:53.832 INFO:teuthology.packaging:branch: squid
2026-03-10T07:41:53.832 INFO:teuthology.packaging:sha1: e911bdebe5c8faa3800735d1568fcdca65db60df
2026-03-10T07:41:53.832 DEBUG:teuthology.packaging:Querying https://shaman.ceph.com/api/search?status=ready&project=ceph&flavor=default&distros=centos%2F9%2Fx86_64&ref=squid
2026-03-10T07:41:54.589 INFO:teuthology.task.internal:Found packages for ceph version 19.2.3-678.ge911bdeb
2026-03-10T07:41:54.590 INFO:teuthology.run_tasks:Running task internal.buildpackages_prep...
2026-03-10T07:41:54.591 INFO:teuthology.task.internal:no buildpackages task found
2026-03-10T07:41:54.591 INFO:teuthology.run_tasks:Running task internal.save_config...
2026-03-10T07:41:54.591 INFO:teuthology.task.internal:Saving configuration
2026-03-10T07:41:54.600 INFO:teuthology.run_tasks:Running task internal.check_lock...
2026-03-10T07:41:54.601 INFO:teuthology.task.internal.check_lock:Checking locks...
2026-03-10T07:41:54.607 DEBUG:teuthology.task.internal.check_lock:machine status is {'name': 'vm05.local', 'description': '/archive/kyr-2026-03-10_01:00:38-orch-squid-none-default-vps/954', 'up': True, 'machine_type': 'vps', 'is_vm': True, 'vm_host': {'name': 'localhost', 'description': None, 'up': True, 'machine_type': 'libvirt', 'is_vm': False, 'vm_host': None, 'os_type': None, 'os_version': None, 'arch': None, 'locked': True, 'locked_since': None, 'locked_by': None, 'mac_address': None, 'ssh_pub_key': None}, 'os_type': 'centos', 'os_version': '9.stream', 'arch': 'x86_64', 'locked': True, 'locked_since': '2026-03-10 07:40:40.165165', 'locked_by': 'kyr', 'mac_address': '52:55:00:00:00:05', 'ssh_pub_key': 'ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBLjvvLfyCbJfqoKPTuRTH6g/4/E/vkESTQxDsJ5NTg7U2jdn4vp+aA9LaLHDR4WpL6ZmMwCeUK4HeUkxLFIW8yE='}
2026-03-10T07:41:54.611 DEBUG:teuthology.task.internal.check_lock:machine status is {'name': 'vm08.local', 'description': '/archive/kyr-2026-03-10_01:00:38-orch-squid-none-default-vps/954', 'up': True, 'machine_type': 'vps', 'is_vm': True, 'vm_host': {'name': 'localhost', 'description': None, 'up': True, 'machine_type': 'libvirt', 'is_vm': False, 'vm_host': None, 'os_type': None, 'os_version': None, 'arch': None, 'locked': True, 'locked_since': None, 'locked_by': None, 'mac_address': None, 'ssh_pub_key': None}, 'os_type': 'centos', 'os_version': '9.stream', 'arch': 'x86_64', 'locked': True, 'locked_since': '2026-03-10 07:40:40.165569', 'locked_by': 'kyr', 'mac_address': '52:55:00:00:00:08', 'ssh_pub_key': 'ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBPbUPsBaLkFaFvyeSyL0Fro+8bal4F4qWZruFdfreV4B0+MAR2QsEfuss4YK7XHDwhT9QyNtVGz3xw+lhz4/M9I='}
2026-03-10T07:41:54.611 INFO:teuthology.run_tasks:Running task internal.add_remotes...
2026-03-10T07:41:54.612 INFO:teuthology.task.internal:roles: ubuntu@vm05.local - ['host.a', 'client.0', 'osd.0', 'osd.1', 'osd.2']
2026-03-10T07:41:54.612 INFO:teuthology.task.internal:roles: ubuntu@vm08.local - ['host.b', 'client.1', 'osd.3', 'osd.4', 'osd.5']
2026-03-10T07:41:54.612 INFO:teuthology.run_tasks:Running task console_log...
2026-03-10T07:41:54.617 DEBUG:teuthology.task.console_log:vm05 does not support IPMI; excluding
2026-03-10T07:41:54.622 DEBUG:teuthology.task.console_log:vm08 does not support IPMI; excluding
2026-03-10T07:41:54.622 DEBUG:teuthology.exit:Installing handler: Handler(exiter=, func=.kill_console_loggers at 0x7f3b2f27a170>, signals=[15])
2026-03-10T07:41:54.622 INFO:teuthology.run_tasks:Running task internal.connect...
2026-03-10T07:41:54.623 INFO:teuthology.task.internal:Opening connections...
2026-03-10T07:41:54.623 DEBUG:teuthology.task.internal:connecting to ubuntu@vm05.local
2026-03-10T07:41:54.623 DEBUG:teuthology.orchestra.connection:{'hostname': 'vm05.local', 'username': 'ubuntu', 'timeout': 60}
2026-03-10T07:41:54.680 DEBUG:teuthology.task.internal:connecting to ubuntu@vm08.local
2026-03-10T07:41:54.680 DEBUG:teuthology.orchestra.connection:{'hostname': 'vm08.local', 'username': 'ubuntu', 'timeout': 60}
2026-03-10T07:41:54.740 INFO:teuthology.run_tasks:Running task internal.push_inventory...
2026-03-10T07:41:54.741 DEBUG:teuthology.orchestra.run.vm05:> uname -m
2026-03-10T07:41:54.791 INFO:teuthology.orchestra.run.vm05.stdout:x86_64
2026-03-10T07:41:54.791 DEBUG:teuthology.orchestra.run.vm05:> cat /etc/os-release
2026-03-10T07:41:54.845 INFO:teuthology.orchestra.run.vm05.stdout:NAME="CentOS Stream"
2026-03-10T07:41:54.845 INFO:teuthology.orchestra.run.vm05.stdout:VERSION="9"
2026-03-10T07:41:54.845 INFO:teuthology.orchestra.run.vm05.stdout:ID="centos"
2026-03-10T07:41:54.845 INFO:teuthology.orchestra.run.vm05.stdout:ID_LIKE="rhel fedora"
2026-03-10T07:41:54.845 INFO:teuthology.orchestra.run.vm05.stdout:VERSION_ID="9"
2026-03-10T07:41:54.845 INFO:teuthology.orchestra.run.vm05.stdout:PLATFORM_ID="platform:el9"
2026-03-10T07:41:54.845 INFO:teuthology.orchestra.run.vm05.stdout:PRETTY_NAME="CentOS Stream 9"
2026-03-10T07:41:54.845 INFO:teuthology.orchestra.run.vm05.stdout:ANSI_COLOR="0;31"
2026-03-10T07:41:54.845 INFO:teuthology.orchestra.run.vm05.stdout:LOGO="fedora-logo-icon"
2026-03-10T07:41:54.845 INFO:teuthology.orchestra.run.vm05.stdout:CPE_NAME="cpe:/o:centos:centos:9"
2026-03-10T07:41:54.845 INFO:teuthology.orchestra.run.vm05.stdout:HOME_URL="https://centos.org/"
2026-03-10T07:41:54.845 INFO:teuthology.orchestra.run.vm05.stdout:BUG_REPORT_URL="https://issues.redhat.com/"
2026-03-10T07:41:54.846 INFO:teuthology.orchestra.run.vm05.stdout:REDHAT_SUPPORT_PRODUCT="Red Hat Enterprise Linux 9"
2026-03-10T07:41:54.846 INFO:teuthology.orchestra.run.vm05.stdout:REDHAT_SUPPORT_PRODUCT_VERSION="CentOS Stream"
2026-03-10T07:41:54.846 INFO:teuthology.lock.ops:Updating vm05.local on lock server
2026-03-10T07:41:54.850 DEBUG:teuthology.orchestra.run.vm08:> uname -m
2026-03-10T07:41:54.865 INFO:teuthology.orchestra.run.vm08.stdout:x86_64
2026-03-10T07:41:54.865 DEBUG:teuthology.orchestra.run.vm08:> cat /etc/os-release
2026-03-10T07:41:54.919 INFO:teuthology.orchestra.run.vm08.stdout:NAME="CentOS Stream"
2026-03-10T07:41:54.919 INFO:teuthology.orchestra.run.vm08.stdout:VERSION="9"
2026-03-10T07:41:54.919 INFO:teuthology.orchestra.run.vm08.stdout:ID="centos"
2026-03-10T07:41:54.919 INFO:teuthology.orchestra.run.vm08.stdout:ID_LIKE="rhel fedora"
2026-03-10T07:41:54.919 INFO:teuthology.orchestra.run.vm08.stdout:VERSION_ID="9"
2026-03-10T07:41:54.919 INFO:teuthology.orchestra.run.vm08.stdout:PLATFORM_ID="platform:el9"
2026-03-10T07:41:54.919 INFO:teuthology.orchestra.run.vm08.stdout:PRETTY_NAME="CentOS Stream 9"
2026-03-10T07:41:54.919 INFO:teuthology.orchestra.run.vm08.stdout:ANSI_COLOR="0;31"
2026-03-10T07:41:54.919 INFO:teuthology.orchestra.run.vm08.stdout:LOGO="fedora-logo-icon"
2026-03-10T07:41:54.919 INFO:teuthology.orchestra.run.vm08.stdout:CPE_NAME="cpe:/o:centos:centos:9"
2026-03-10T07:41:54.919 INFO:teuthology.orchestra.run.vm08.stdout:HOME_URL="https://centos.org/"
2026-03-10T07:41:54.919 INFO:teuthology.orchestra.run.vm08.stdout:BUG_REPORT_URL="https://issues.redhat.com/"
2026-03-10T07:41:54.919 INFO:teuthology.orchestra.run.vm08.stdout:REDHAT_SUPPORT_PRODUCT="Red Hat Enterprise Linux 9"
2026-03-10T07:41:54.919 INFO:teuthology.orchestra.run.vm08.stdout:REDHAT_SUPPORT_PRODUCT_VERSION="CentOS Stream"
2026-03-10T07:41:54.920 INFO:teuthology.lock.ops:Updating vm08.local on lock server
2026-03-10T07:41:54.924 INFO:teuthology.run_tasks:Running task internal.serialize_remote_roles...
2026-03-10T07:41:54.926 INFO:teuthology.run_tasks:Running task internal.check_conflict...
2026-03-10T07:41:54.927 INFO:teuthology.task.internal:Checking for old test directory...
2026-03-10T07:41:54.927 DEBUG:teuthology.orchestra.run.vm05:> test '!' -e /home/ubuntu/cephtest
2026-03-10T07:41:54.929 DEBUG:teuthology.orchestra.run.vm08:> test '!' -e /home/ubuntu/cephtest
2026-03-10T07:41:54.973 INFO:teuthology.run_tasks:Running task internal.check_ceph_data...
2026-03-10T07:41:54.974 INFO:teuthology.task.internal:Checking for non-empty /var/lib/ceph...
2026-03-10T07:41:54.974 DEBUG:teuthology.orchestra.run.vm05:> test -z $(ls -A /var/lib/ceph)
2026-03-10T07:41:54.983 DEBUG:teuthology.orchestra.run.vm08:> test -z $(ls -A /var/lib/ceph)
2026-03-10T07:41:54.995 INFO:teuthology.orchestra.run.vm05.stderr:ls: cannot access '/var/lib/ceph': No such file or directory
2026-03-10T07:41:55.027 INFO:teuthology.orchestra.run.vm08.stderr:ls: cannot access '/var/lib/ceph': No such file or directory
2026-03-10T07:41:55.028 INFO:teuthology.run_tasks:Running task internal.vm_setup...
2026-03-10T07:41:55.036 DEBUG:teuthology.orchestra.run.vm05:> test -e /ceph-qa-ready
2026-03-10T07:41:55.051 DEBUG:teuthology.orchestra.run:got remote process result: 1
2026-03-10T07:41:55.242 DEBUG:teuthology.orchestra.run.vm08:> test -e /ceph-qa-ready
2026-03-10T07:41:55.255 DEBUG:teuthology.orchestra.run:got remote process result: 1
2026-03-10T07:41:55.446 INFO:teuthology.run_tasks:Running task internal.base...
2026-03-10T07:41:55.447 INFO:teuthology.task.internal:Creating test directory...
2026-03-10T07:41:55.448 DEBUG:teuthology.orchestra.run.vm05:> mkdir -p -m0755 -- /home/ubuntu/cephtest
2026-03-10T07:41:55.450 DEBUG:teuthology.orchestra.run.vm08:> mkdir -p -m0755 -- /home/ubuntu/cephtest
2026-03-10T07:41:55.464 INFO:teuthology.run_tasks:Running task internal.archive_upload...
2026-03-10T07:41:55.465 INFO:teuthology.run_tasks:Running task internal.archive...
2026-03-10T07:41:55.466 INFO:teuthology.task.internal:Creating archive directory...
2026-03-10T07:41:55.466 DEBUG:teuthology.orchestra.run.vm05:> install -d -m0755 -- /home/ubuntu/cephtest/archive
2026-03-10T07:41:55.504 DEBUG:teuthology.orchestra.run.vm08:> install -d -m0755 -- /home/ubuntu/cephtest/archive
2026-03-10T07:41:55.522 INFO:teuthology.run_tasks:Running task internal.coredump...
2026-03-10T07:41:55.523 INFO:teuthology.task.internal:Enabling coredump saving...
2026-03-10T07:41:55.523 DEBUG:teuthology.orchestra.run.vm05:> test -f /run/.containerenv -o -f /.dockerenv
2026-03-10T07:41:55.572 DEBUG:teuthology.orchestra.run:got remote process result: 1
2026-03-10T07:41:55.573 DEBUG:teuthology.orchestra.run.vm08:> test -f /run/.containerenv -o -f /.dockerenv
2026-03-10T07:41:55.589 DEBUG:teuthology.orchestra.run:got remote process result: 1
2026-03-10T07:41:55.589 DEBUG:teuthology.orchestra.run.vm05:> install -d -m0755 -- /home/ubuntu/cephtest/archive/coredump && sudo sysctl -w kernel.core_pattern=/home/ubuntu/cephtest/archive/coredump/%t.%p.core && echo kernel.core_pattern=/home/ubuntu/cephtest/archive/coredump/%t.%p.core | sudo tee -a /etc/sysctl.conf
2026-03-10T07:41:55.615 DEBUG:teuthology.orchestra.run.vm08:> install -d -m0755 -- /home/ubuntu/cephtest/archive/coredump && sudo sysctl -w kernel.core_pattern=/home/ubuntu/cephtest/archive/coredump/%t.%p.core && echo kernel.core_pattern=/home/ubuntu/cephtest/archive/coredump/%t.%p.core | sudo tee -a /etc/sysctl.conf
2026-03-10T07:41:55.640 INFO:teuthology.orchestra.run.vm05.stdout:kernel.core_pattern = /home/ubuntu/cephtest/archive/coredump/%t.%p.core
2026-03-10T07:41:55.649 INFO:teuthology.orchestra.run.vm05.stdout:kernel.core_pattern=/home/ubuntu/cephtest/archive/coredump/%t.%p.core
2026-03-10T07:41:55.652 INFO:teuthology.orchestra.run.vm08.stdout:kernel.core_pattern = /home/ubuntu/cephtest/archive/coredump/%t.%p.core
2026-03-10T07:41:55.661 INFO:teuthology.orchestra.run.vm08.stdout:kernel.core_pattern=/home/ubuntu/cephtest/archive/coredump/%t.%p.core
2026-03-10T07:41:55.662 INFO:teuthology.run_tasks:Running task internal.sudo...
2026-03-10T07:41:55.664 INFO:teuthology.task.internal:Configuring sudo...
2026-03-10T07:41:55.664 DEBUG:teuthology.orchestra.run.vm05:> sudo sed -i.orig.teuthology -e 's/^\([^#]*\) \(requiretty\)/\1 !\2/g' -e 's/^\([^#]*\) !\(visiblepw\)/\1 \2/g' /etc/sudoers
2026-03-10T07:41:55.692 DEBUG:teuthology.orchestra.run.vm08:> sudo sed -i.orig.teuthology -e 's/^\([^#]*\) \(requiretty\)/\1 !\2/g' -e 's/^\([^#]*\) !\(visiblepw\)/\1 \2/g' /etc/sudoers
2026-03-10T07:41:55.731 INFO:teuthology.run_tasks:Running task internal.syslog...
2026-03-10T07:41:55.734 INFO:teuthology.task.internal.syslog:Starting syslog monitoring...
2026-03-10T07:41:55.734 DEBUG:teuthology.orchestra.run.vm05:> mkdir -p -m0755 -- /home/ubuntu/cephtest/archive/syslog
2026-03-10T07:41:55.756 DEBUG:teuthology.orchestra.run.vm08:> mkdir -p -m0755 -- /home/ubuntu/cephtest/archive/syslog
2026-03-10T07:41:55.786 DEBUG:teuthology.orchestra.run.vm05:> install -m 666 /dev/null /home/ubuntu/cephtest/archive/syslog/kern.log
2026-03-10T07:41:55.832 DEBUG:teuthology.orchestra.run.vm05:> install -m 666 /dev/null /home/ubuntu/cephtest/archive/syslog/misc.log
2026-03-10T07:41:55.890 DEBUG:teuthology.orchestra.run.vm05:> set -ex
2026-03-10T07:41:55.890 DEBUG:teuthology.orchestra.run.vm05:> sudo dd of=/etc/rsyslog.d/80-cephtest.conf
2026-03-10T07:41:55.947 DEBUG:teuthology.orchestra.run.vm08:> install -m 666 /dev/null /home/ubuntu/cephtest/archive/syslog/kern.log
2026-03-10T07:41:55.972 DEBUG:teuthology.orchestra.run.vm08:> install -m 666 /dev/null /home/ubuntu/cephtest/archive/syslog/misc.log
2026-03-10T07:41:56.030 DEBUG:teuthology.orchestra.run.vm08:> set -ex
2026-03-10T07:41:56.030 DEBUG:teuthology.orchestra.run.vm08:> sudo dd of=/etc/rsyslog.d/80-cephtest.conf
2026-03-10T07:41:56.091 DEBUG:teuthology.orchestra.run.vm05:> sudo service rsyslog restart
2026-03-10T07:41:56.092 DEBUG:teuthology.orchestra.run.vm08:> sudo service rsyslog restart
2026-03-10T07:41:56.116 INFO:teuthology.orchestra.run.vm05.stderr:Redirecting to /bin/systemctl restart rsyslog.service
2026-03-10T07:41:56.158 INFO:teuthology.orchestra.run.vm08.stderr:Redirecting to /bin/systemctl restart rsyslog.service
2026-03-10T07:41:56.615 INFO:teuthology.run_tasks:Running task internal.timer...
2026-03-10T07:41:56.616 INFO:teuthology.task.internal:Starting timer...
2026-03-10T07:41:56.616 INFO:teuthology.run_tasks:Running task pcp...
2026-03-10T07:41:56.619 INFO:teuthology.run_tasks:Running task selinux...
2026-03-10T07:41:56.621 DEBUG:teuthology.task:Applying overrides for task selinux: {'allowlist': ['scontext=system_u:system_r:logrotate_t:s0', 'scontext=system_u:system_r:getty_t:s0']}
2026-03-10T07:41:56.621 INFO:teuthology.task.selinux:Excluding vm05: VMs are not yet supported
2026-03-10T07:41:56.621 INFO:teuthology.task.selinux:Excluding vm08: VMs are not yet supported
2026-03-10T07:41:56.621 DEBUG:teuthology.task.selinux:Getting current SELinux state
2026-03-10T07:41:56.621 DEBUG:teuthology.task.selinux:Existing SELinux modes: {}
2026-03-10T07:41:56.621 INFO:teuthology.task.selinux:Putting SELinux into permissive mode
2026-03-10T07:41:56.621 INFO:teuthology.run_tasks:Running task ansible.cephlab...
2026-03-10T07:41:56.623 DEBUG:teuthology.task:Applying overrides for task ansible.cephlab: {'branch': 'main', 'skip_tags': 'nagios,monitoring-scripts,hostname,pubkeys,zap,sudoers,kerberos,ntp-client,resolvconf,cpan,nfs', 'vars': {'timezone': 'UTC'}}
2026-03-10T07:41:56.623 DEBUG:teuthology.repo_utils:Setting repo remote to https://github.com/ceph/ceph-cm-ansible.git
2026-03-10T07:41:56.624 INFO:teuthology.repo_utils:Fetching github.com_ceph_ceph-cm-ansible_main from origin
2026-03-10T07:41:57.111 DEBUG:teuthology.repo_utils:Resetting repo at /home/teuthos/src/github.com_ceph_ceph-cm-ansible_main to origin/main
2026-03-10T07:41:57.116 INFO:teuthology.task.ansible:Playbook: [{'import_playbook': 'ansible_managed.yml'}, {'import_playbook': 'teuthology.yml'}, {'hosts': 'testnodes', 'tasks': [{'set_fact': {'ran_from_cephlab_playbook': True}}]}, {'import_playbook': 'testnodes.yml'}, {'import_playbook': 'container-host.yml'}, {'import_playbook': 'cobbler.yml'}, {'import_playbook': 'paddles.yml'}, {'import_playbook': 'pulpito.yml'}, {'hosts': 'testnodes', 'become': True, 'tasks': [{'name': 'Touch /ceph-qa-ready', 'file': {'path': '/ceph-qa-ready', 'state': 'touch'}, 'when': 'ran_from_cephlab_playbook|bool'}]}]
2026-03-10T07:41:57.116 DEBUG:teuthology.task.ansible:Running ansible-playbook -v --extra-vars '{"ansible_ssh_user": "ubuntu", "timezone": "UTC"}' -i /tmp/teuth_ansible_inventorycab5maxj --limit vm05.local,vm08.local /home/teuthos/src/github.com_ceph_ceph-cm-ansible_main/cephlab.yml --skip-tags nagios,monitoring-scripts,hostname,pubkeys,zap,sudoers,kerberos,ntp-client,resolvconf,cpan,nfs
2026-03-10T07:44:09.447 DEBUG:teuthology.task.ansible:Reconnecting to [Remote(name='ubuntu@vm05.local'), Remote(name='ubuntu@vm08.local')]
2026-03-10T07:44:09.448 INFO:teuthology.orchestra.remote:Trying to reconnect to host 'ubuntu@vm05.local'
2026-03-10T07:44:09.448 DEBUG:teuthology.orchestra.connection:{'hostname': 'vm05.local', 'username': 'ubuntu', 'timeout': 60}
2026-03-10T07:44:09.511 DEBUG:teuthology.orchestra.run.vm05:> true
2026-03-10T07:44:09.606 INFO:teuthology.orchestra.remote:Successfully reconnected to host 'ubuntu@vm05.local'
2026-03-10T07:44:09.606 INFO:teuthology.orchestra.remote:Trying to reconnect to host 'ubuntu@vm08.local'
2026-03-10T07:44:09.606 DEBUG:teuthology.orchestra.connection:{'hostname': 'vm08.local', 'username': 'ubuntu', 'timeout': 60}
2026-03-10T07:44:09.671 DEBUG:teuthology.orchestra.run.vm08:> true
2026-03-10T07:44:09.747 INFO:teuthology.orchestra.remote:Successfully reconnected to host 'ubuntu@vm08.local'
2026-03-10T07:44:09.747 INFO:teuthology.run_tasks:Running task clock...
2026-03-10T07:44:09.749 INFO:teuthology.task.clock:Syncing clocks and checking initial clock skew...
2026-03-10T07:44:09.749 INFO:teuthology.orchestra.run:Running command with timeout 360
2026-03-10T07:44:09.749 DEBUG:teuthology.orchestra.run.vm05:> sudo systemctl stop ntp.service || sudo systemctl stop ntpd.service || sudo systemctl stop chronyd.service ; sudo ntpd -gq || sudo chronyc makestep ; sudo systemctl start ntp.service || sudo systemctl start ntpd.service || sudo systemctl start chronyd.service ; PATH=/usr/bin:/usr/sbin ntpq -p || PATH=/usr/bin:/usr/sbin chronyc sources || true
2026-03-10T07:44:09.753 INFO:teuthology.orchestra.run:Running command with timeout 360
2026-03-10T07:44:09.753 DEBUG:teuthology.orchestra.run.vm08:> sudo systemctl stop ntp.service || sudo systemctl stop ntpd.service || sudo systemctl stop chronyd.service ; sudo ntpd -gq || sudo chronyc makestep ; sudo systemctl start ntp.service || sudo systemctl start ntpd.service || sudo systemctl start chronyd.service ; PATH=/usr/bin:/usr/sbin ntpq -p || PATH=/usr/bin:/usr/sbin chronyc sources || true
2026-03-10T07:44:09.784 INFO:teuthology.orchestra.run.vm05.stderr:Failed to stop ntp.service: Unit ntp.service not loaded.
2026-03-10T07:44:09.802 INFO:teuthology.orchestra.run.vm05.stderr:Failed to stop ntpd.service: Unit ntpd.service not loaded.
2026-03-10T07:44:09.825 INFO:teuthology.orchestra.run.vm08.stderr:Failed to stop ntp.service: Unit ntp.service not loaded.
2026-03-10T07:44:09.829 INFO:teuthology.orchestra.run.vm05.stderr:sudo: ntpd: command not found
2026-03-10T07:44:09.843 INFO:teuthology.orchestra.run.vm05.stdout:506 Cannot talk to daemon
2026-03-10T07:44:09.846 INFO:teuthology.orchestra.run.vm08.stderr:Failed to stop ntpd.service: Unit ntpd.service not loaded.
2026-03-10T07:44:09.859 INFO:teuthology.orchestra.run.vm05.stderr:Failed to start ntp.service: Unit ntp.service not found.
2026-03-10T07:44:09.877 INFO:teuthology.orchestra.run.vm05.stderr:Failed to start ntpd.service: Unit ntpd.service not found.
2026-03-10T07:44:09.879 INFO:teuthology.orchestra.run.vm08.stderr:sudo: ntpd: command not found
2026-03-10T07:44:09.896 INFO:teuthology.orchestra.run.vm08.stdout:506 Cannot talk to daemon
2026-03-10T07:44:09.915 INFO:teuthology.orchestra.run.vm08.stderr:Failed to start ntp.service: Unit ntp.service not found.
2026-03-10T07:44:09.927 INFO:teuthology.orchestra.run.vm05.stderr:bash: line 1: ntpq: command not found
2026-03-10T07:44:09.932 INFO:teuthology.orchestra.run.vm08.stderr:Failed to start ntpd.service: Unit ntpd.service not found.
2026-03-10T07:44:09.979 INFO:teuthology.orchestra.run.vm08.stderr:bash: line 1: ntpq: command not found
2026-03-10T07:44:10.011 INFO:teuthology.orchestra.run.vm08.stdout:MS Name/IP address Stratum Poll Reach LastRx Last sample
2026-03-10T07:44:10.011 INFO:teuthology.orchestra.run.vm08.stdout:===============================================================================
2026-03-10T07:44:10.011 INFO:teuthology.orchestra.run.vm08.stdout:^? server1a.meinberg.de 0 6 0 - +0ns[ +0ns] +/- 0ns
2026-03-10T07:44:10.011 INFO:teuthology.orchestra.run.vm08.stdout:^? time2.sebhosting.de 0 6 0 - +0ns[ +0ns] +/- 0ns
2026-03-10T07:44:10.011 INFO:teuthology.orchestra.run.vm08.stdout:^? rdbl.cntx.de 0 6 0 - +0ns[ +0ns] +/- 0ns
2026-03-10T07:44:10.011 INFO:teuthology.orchestra.run.vm08.stdout:^? ntp0.rrze.uni-erlangen.de 0 6 0 - +0ns[ +0ns] +/- 0ns
2026-03-10T07:44:10.011 INFO:teuthology.orchestra.run.vm05.stdout:MS Name/IP address Stratum Poll Reach LastRx Last sample
2026-03-10T07:44:10.011 INFO:teuthology.orchestra.run.vm05.stdout:===============================================================================
2026-03-10T07:44:10.011 INFO:teuthology.orchestra.run.vm05.stdout:^? time2.sebhosting.de 0 6 0 - +0ns[ +0ns] +/- 0ns
2026-03-10T07:44:10.011 INFO:teuthology.orchestra.run.vm05.stdout:^? rdbl.cntx.de 0 6 0 - +0ns[ +0ns] +/- 0ns
2026-03-10T07:44:10.011 INFO:teuthology.orchestra.run.vm05.stdout:^? ntp0.rrze.uni-erlangen.de 0 6 0 - +0ns[ +0ns] +/- 0ns
2026-03-10T07:44:10.011 INFO:teuthology.orchestra.run.vm05.stdout:^? server1a.meinberg.de 0 6 0 - +0ns[ +0ns] +/- 0ns
2026-03-10T07:44:10.011 INFO:teuthology.run_tasks:Running task install...
2026-03-10T07:44:10.013 DEBUG:teuthology.task.install:project ceph
2026-03-10T07:44:10.013 DEBUG:teuthology.task.install:INSTALL overrides: {'ceph': {'flavor': 'default', 'sha1': 'e911bdebe5c8faa3800735d1568fcdca65db60df'}, 'extra_system_packages': {'deb': ['python3-xmltodict', 'python3-jmespath'], 'rpm': ['bzip2', 'perl-Test-Harness', 'python3-xmltodict', 'python3-jmespath']}}
2026-03-10T07:44:10.013 DEBUG:teuthology.task.install:config {'exclude_packages': ['ceph-volume'], 'tag': 'v18.2.1', 'flavor': 'default', 'sha1': 'e911bdebe5c8faa3800735d1568fcdca65db60df', 'extra_system_packages': {'deb': ['python3-xmltodict', 'python3-jmespath'], 'rpm': ['bzip2', 'perl-Test-Harness', 'python3-xmltodict', 'python3-jmespath']}}
2026-03-10T07:44:10.013 INFO:teuthology.task.install:Using flavor: default
2026-03-10T07:44:10.016 DEBUG:teuthology.task.install:Package list is: {'deb': ['ceph', 'cephadm', 'ceph-mds', 'ceph-mgr', 'ceph-common', 'ceph-fuse', 'ceph-test', 'radosgw', 'python3-rados', 'python3-rgw', 'python3-cephfs', 'python3-rbd', 'libcephfs2', 'libcephfs-dev', 'librados2', 'librbd1', 'rbd-fuse'], 'rpm': ['ceph-radosgw', 'ceph-test', 'ceph', 'ceph-base', 'cephadm', 'ceph-immutable-object-cache', 'ceph-mgr', 'ceph-mgr-dashboard', 'ceph-mgr-diskprediction-local', 'ceph-mgr-rook', 'ceph-mgr-cephadm', 'ceph-fuse', 'librados-devel', 'libcephfs2', 'libcephfs-devel', 'librados2', 'librbd1', 'python3-rados', 'python3-rgw', 'python3-cephfs', 'python3-rbd', 'rbd-fuse', 'rbd-mirror', 'rbd-nbd']}
2026-03-10T07:44:10.016 INFO:teuthology.task.install:extra packages: []
2026-03-10T07:44:10.016 DEBUG:teuthology.task.install.rpm:_update_package_list_and_install: config is {'branch': None, 'cleanup': None, 'debuginfo': None, 'downgrade_packages': [], 'exclude_packages': ['ceph-volume'], 'extra_packages': [], 'extra_system_packages': {'deb': ['python3-xmltodict', 'python3-jmespath'], 'rpm': ['bzip2', 'perl-Test-Harness', 'python3-xmltodict', 'python3-jmespath']}, 'extras': None, 'enable_coprs': [], 'flavor': 'default', 'install_ceph_packages': True, 'packages': {}, 'project': 'ceph', 'repos_only': False, 'sha1': 'e911bdebe5c8faa3800735d1568fcdca65db60df', 'tag': 'v18.2.1', 'wait_for_package': False}
2026-03-10T07:44:10.016 WARNING:teuthology.packaging:More than one of ref, tag, branch, or sha1 supplied; using tag
2026-03-10T07:44:10.016 INFO:teuthology.packaging:ref: None
2026-03-10T07:44:10.016 INFO:teuthology.packaging:tag: v18.2.1
2026-03-10T07:44:10.016 INFO:teuthology.packaging:branch: None
2026-03-10T07:44:10.016 INFO:teuthology.packaging:sha1: e911bdebe5c8faa3800735d1568fcdca65db60df
2026-03-10T07:44:10.701 DEBUG:teuthology.repo_utils:git ls-remote https://github.com/ceph/ceph v18.2.1^{} -> 7fe91d5d5842e04be3b4f514d6dd990c54b29c76
2026-03-10T07:44:10.701 DEBUG:teuthology.packaging:Querying https://shaman.ceph.com/api/search?status=ready&project=ceph&flavor=default&distros=centos%2F9%2Fx86_64&sha1=7fe91d5d5842e04be3b4f514d6dd990c54b29c76
2026-03-10T07:44:10.702 DEBUG:teuthology.task.install.rpm:_update_package_list_and_install: config is {'branch': None, 'cleanup': None, 'debuginfo': None, 'downgrade_packages': [], 'exclude_packages': ['ceph-volume'], 'extra_packages': [], 'extra_system_packages': {'deb': ['python3-xmltodict', 'python3-jmespath'], 'rpm': ['bzip2', 'perl-Test-Harness', 'python3-xmltodict', 'python3-jmespath']}, 'extras': None, 'enable_coprs': [], 'flavor': 'default', 'install_ceph_packages': True, 'packages': {}, 'project': 'ceph', 'repos_only': False, 'sha1': 'e911bdebe5c8faa3800735d1568fcdca65db60df', 'tag': 'v18.2.1', 'wait_for_package': False}
2026-03-10T07:44:10.702 WARNING:teuthology.packaging:More than one of ref, tag, branch, or sha1 supplied; using tag
2026-03-10T07:44:10.702 INFO:teuthology.packaging:ref: None
2026-03-10T07:44:10.702 INFO:teuthology.packaging:tag: v18.2.1
2026-03-10T07:44:10.702 INFO:teuthology.packaging:branch: None
2026-03-10T07:44:10.702 INFO:teuthology.packaging:sha1: e911bdebe5c8faa3800735d1568fcdca65db60df
2026-03-10T07:44:10.702 DEBUG:teuthology.packaging:Querying https://shaman.ceph.com/api/search?status=ready&project=ceph&flavor=default&distros=centos%2F9%2Fx86_64&sha1=7fe91d5d5842e04be3b4f514d6dd990c54b29c76
2026-03-10T07:44:11.290 INFO:teuthology.task.install.rpm:Pulling from https://chacra.ceph.com/r/ceph/reef/7fe91d5d5842e04be3b4f514d6dd990c54b29c76/centos/9/flavors/default/
2026-03-10T07:44:11.290 INFO:teuthology.task.install.rpm:Package version is 18.2.1-0
2026-03-10T07:44:11.445 INFO:teuthology.task.install.rpm:Pulling from https://chacra.ceph.com/r/ceph/reef/7fe91d5d5842e04be3b4f514d6dd990c54b29c76/centos/9/flavors/default/
2026-03-10T07:44:11.445 INFO:teuthology.task.install.rpm:Package version is 18.2.1-0
2026-03-10T07:44:11.634 INFO:teuthology.packaging:Writing yum repo:
[ceph]
name=ceph packages for $basearch
baseurl=https://chacra.ceph.com/r/ceph/reef/7fe91d5d5842e04be3b4f514d6dd990c54b29c76/centos/9/flavors/default/$basearch
enabled=1
gpgcheck=0
type=rpm-md
[ceph-noarch] name=ceph noarch packages baseurl=https://chacra.ceph.com/r/ceph/reef/7fe91d5d5842e04be3b4f514d6dd990c54b29c76/centos/9/flavors/default/noarch enabled=1 gpgcheck=0 type=rpm-md [ceph-source] name=ceph source packages baseurl=https://chacra.ceph.com/r/ceph/reef/7fe91d5d5842e04be3b4f514d6dd990c54b29c76/centos/9/flavors/default/SRPMS enabled=1 gpgcheck=0 type=rpm-md 2026-03-10T07:44:11.634 DEBUG:teuthology.orchestra.run.vm05:> set -ex 2026-03-10T07:44:11.634 DEBUG:teuthology.orchestra.run.vm05:> sudo dd of=/etc/yum.repos.d/ceph.repo 2026-03-10T07:44:11.670 INFO:teuthology.task.install.rpm:Installing packages: ceph-radosgw, ceph-test, ceph, ceph-base, cephadm, ceph-immutable-object-cache, ceph-mgr, ceph-mgr-dashboard, ceph-mgr-diskprediction-local, ceph-mgr-rook, ceph-mgr-cephadm, ceph-fuse, librados-devel, libcephfs2, libcephfs-devel, librados2, librbd1, python3-rados, python3-rgw, python3-cephfs, python3-rbd, rbd-fuse, rbd-mirror, rbd-nbd, bzip2, perl-Test-Harness, python3-xmltodict, python3-jmespath on remote rpm x86_64 2026-03-10T07:44:11.670 WARNING:teuthology.packaging:More than one of ref, tag, branch, or sha1 supplied; using tag 2026-03-10T07:44:11.670 INFO:teuthology.packaging:ref: None 2026-03-10T07:44:11.670 INFO:teuthology.packaging:tag: v18.2.1 2026-03-10T07:44:11.670 INFO:teuthology.packaging:branch: None 2026-03-10T07:44:11.670 INFO:teuthology.packaging:sha1: e911bdebe5c8faa3800735d1568fcdca65db60df 2026-03-10T07:44:11.671 DEBUG:teuthology.orchestra.run.vm05:> if test -f /etc/yum.repos.d/ceph.repo ; then sudo sed -i -e ':a;N;$!ba;s/enabled=1\ngpg/enabled=1\npriority=1\ngpg/g' -e 's;ref/[a-zA-Z0-9_-]*/;ref/v18.2.1/;g' /etc/yum.repos.d/ceph.repo ; fi 2026-03-10T07:44:11.742 DEBUG:teuthology.orchestra.run.vm05:> sudo touch -a /etc/yum/pluginconf.d/priorities.conf ; test -e /etc/yum/pluginconf.d/priorities.conf.orig || sudo cp -af /etc/yum/pluginconf.d/priorities.conf /etc/yum/pluginconf.d/priorities.conf.orig 2026-03-10T07:44:11.818 
INFO:teuthology.packaging:Writing yum repo: [ceph] name=ceph packages for $basearch baseurl=https://chacra.ceph.com/r/ceph/reef/7fe91d5d5842e04be3b4f514d6dd990c54b29c76/centos/9/flavors/default/$basearch enabled=1 gpgcheck=0 type=rpm-md [ceph-noarch] name=ceph noarch packages baseurl=https://chacra.ceph.com/r/ceph/reef/7fe91d5d5842e04be3b4f514d6dd990c54b29c76/centos/9/flavors/default/noarch enabled=1 gpgcheck=0 type=rpm-md [ceph-source] name=ceph source packages baseurl=https://chacra.ceph.com/r/ceph/reef/7fe91d5d5842e04be3b4f514d6dd990c54b29c76/centos/9/flavors/default/SRPMS enabled=1 gpgcheck=0 type=rpm-md 2026-03-10T07:44:11.818 DEBUG:teuthology.orchestra.run.vm08:> set -ex 2026-03-10T07:44:11.818 DEBUG:teuthology.orchestra.run.vm08:> sudo dd of=/etc/yum.repos.d/ceph.repo 2026-03-10T07:44:11.829 DEBUG:teuthology.orchestra.run.vm05:> grep check_obsoletes /etc/yum/pluginconf.d/priorities.conf && sudo sed -i 's/check_obsoletes.*0/check_obsoletes = 1/g' /etc/yum/pluginconf.d/priorities.conf || echo 'check_obsoletes = 1' | sudo tee -a /etc/yum/pluginconf.d/priorities.conf 2026-03-10T07:44:11.853 INFO:teuthology.task.install.rpm:Installing packages: ceph-radosgw, ceph-test, ceph, ceph-base, cephadm, ceph-immutable-object-cache, ceph-mgr, ceph-mgr-dashboard, ceph-mgr-diskprediction-local, ceph-mgr-rook, ceph-mgr-cephadm, ceph-fuse, librados-devel, libcephfs2, libcephfs-devel, librados2, librbd1, python3-rados, python3-rgw, python3-cephfs, python3-rbd, rbd-fuse, rbd-mirror, rbd-nbd, bzip2, perl-Test-Harness, python3-xmltodict, python3-jmespath on remote rpm x86_64 2026-03-10T07:44:11.853 WARNING:teuthology.packaging:More than one of ref, tag, branch, or sha1 supplied; using tag 2026-03-10T07:44:11.853 INFO:teuthology.packaging:ref: None 2026-03-10T07:44:11.853 INFO:teuthology.packaging:tag: v18.2.1 2026-03-10T07:44:11.853 INFO:teuthology.packaging:branch: None 2026-03-10T07:44:11.853 INFO:teuthology.packaging:sha1: e911bdebe5c8faa3800735d1568fcdca65db60df 
2026-03-10T07:44:11.854 DEBUG:teuthology.orchestra.run.vm08:> if test -f /etc/yum.repos.d/ceph.repo ; then sudo sed -i -e ':a;N;$!ba;s/enabled=1\ngpg/enabled=1\npriority=1\ngpg/g' -e 's;ref/[a-zA-Z0-9_-]*/;ref/v18.2.1/;g' /etc/yum.repos.d/ceph.repo ; fi
2026-03-10T07:44:11.866 INFO:teuthology.orchestra.run.vm05.stdout:check_obsoletes = 1
2026-03-10T07:44:11.867 DEBUG:teuthology.orchestra.run.vm05:> sudo yum clean all
2026-03-10T07:44:11.933 DEBUG:teuthology.orchestra.run.vm08:> sudo touch -a /etc/yum/pluginconf.d/priorities.conf ; test -e /etc/yum/pluginconf.d/priorities.conf.orig || sudo cp -af /etc/yum/pluginconf.d/priorities.conf /etc/yum/pluginconf.d/priorities.conf.orig
2026-03-10T07:44:12.019 DEBUG:teuthology.orchestra.run.vm08:> grep check_obsoletes /etc/yum/pluginconf.d/priorities.conf && sudo sed -i 's/check_obsoletes.*0/check_obsoletes = 1/g' /etc/yum/pluginconf.d/priorities.conf || echo 'check_obsoletes = 1' | sudo tee -a /etc/yum/pluginconf.d/priorities.conf
2026-03-10T07:44:12.051 INFO:teuthology.orchestra.run.vm08.stdout:check_obsoletes = 1
2026-03-10T07:44:12.053 DEBUG:teuthology.orchestra.run.vm08:> sudo yum clean all
2026-03-10T07:44:12.064 INFO:teuthology.orchestra.run.vm05.stdout:41 files removed
2026-03-10T07:44:12.092 DEBUG:teuthology.orchestra.run.vm05:> sudo yum -y install ceph-radosgw ceph-test ceph ceph-base cephadm ceph-immutable-object-cache ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-cephadm ceph-fuse librados-devel libcephfs2 libcephfs-devel librados2 librbd1 python3-rados python3-rgw python3-cephfs python3-rbd rbd-fuse rbd-mirror rbd-nbd bzip2 perl-Test-Harness python3-xmltodict python3-jmespath
2026-03-10T07:44:12.278 INFO:teuthology.orchestra.run.vm08.stdout:41 files removed
2026-03-10T07:44:12.314 DEBUG:teuthology.orchestra.run.vm08:> sudo yum -y install ceph-radosgw ceph-test ceph ceph-base cephadm ceph-immutable-object-cache ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-cephadm ceph-fuse librados-devel libcephfs2 libcephfs-devel librados2 librbd1 python3-rados python3-rgw python3-cephfs python3-rbd rbd-fuse rbd-mirror rbd-nbd bzip2 perl-Test-Harness python3-xmltodict python3-jmespath
2026-03-10T07:44:13.131 INFO:teuthology.orchestra.run.vm05.stdout:ceph packages for x86_64 89 kB/s | 76 kB 00:00
2026-03-10T07:44:13.320 INFO:teuthology.orchestra.run.vm08.stdout:ceph packages for x86_64 95 kB/s | 76 kB 00:00
2026-03-10T07:44:13.772 INFO:teuthology.orchestra.run.vm05.stdout:ceph noarch packages 15 kB/s | 9.4 kB 00:00
2026-03-10T07:44:13.956 INFO:teuthology.orchestra.run.vm08.stdout:ceph noarch packages 15 kB/s | 9.4 kB 00:00
2026-03-10T07:44:14.408 INFO:teuthology.orchestra.run.vm05.stdout:ceph source packages 3.5 kB/s | 2.2 kB 00:00
2026-03-10T07:44:14.584 INFO:teuthology.orchestra.run.vm08.stdout:ceph source packages 3.6 kB/s | 2.2 kB 00:00
2026-03-10T07:44:15.509 INFO:teuthology.orchestra.run.vm08.stdout:CentOS Stream 9 - BaseOS 9.9 MB/s | 8.9 MB 00:00
2026-03-10T07:44:16.482 INFO:teuthology.orchestra.run.vm05.stdout:CentOS Stream 9 - BaseOS 4.3 MB/s | 8.9 MB 00:02
2026-03-10T07:44:17.037 INFO:teuthology.orchestra.run.vm08.stdout:CentOS Stream 9 - AppStream 37 MB/s | 27 MB 00:00
2026-03-10T07:44:18.967 INFO:teuthology.orchestra.run.vm05.stdout:CentOS Stream 9 - AppStream 15 MB/s | 27 MB 00:01
2026-03-10T07:44:21.725 INFO:teuthology.orchestra.run.vm08.stdout:CentOS Stream 9 - CRB 4.0 MB/s | 8.0 MB 00:01
2026-03-10T07:44:22.378 INFO:teuthology.orchestra.run.vm05.stdout:CentOS Stream 9 - CRB 11 MB/s | 8.0 MB 00:00
2026-03-10T07:44:23.048 INFO:teuthology.orchestra.run.vm08.stdout:CentOS Stream 9 - Extras packages 41 kB/s | 20 kB 00:00
2026-03-10T07:44:23.986 INFO:teuthology.orchestra.run.vm08.stdout:Extra Packages for Enterprise Linux 24 MB/s | 20 MB 00:00
2026-03-10T07:44:24.191 INFO:teuthology.orchestra.run.vm05.stdout:CentOS Stream 9 - Extras packages 21 kB/s | 20 kB 00:00
2026-03-10T07:44:25.465 INFO:teuthology.orchestra.run.vm05.stdout:Extra Packages for Enterprise Linux 17 MB/s | 20 MB 00:01
2026-03-10T07:44:28.454 INFO:teuthology.orchestra.run.vm08.stdout:lab-extras 64 kB/s | 50 kB 00:00
2026-03-10T07:44:29.790 INFO:teuthology.orchestra.run.vm08.stdout:Package librados2-2:16.2.4-5.el9.x86_64 is already installed.
2026-03-10T07:44:29.790 INFO:teuthology.orchestra.run.vm08.stdout:Package librbd1-2:16.2.4-5.el9.x86_64 is already installed.
2026-03-10T07:44:29.794 INFO:teuthology.orchestra.run.vm08.stdout:Package bzip2-1.0.8-11.el9.x86_64 is already installed.
2026-03-10T07:44:29.795 INFO:teuthology.orchestra.run.vm08.stdout:Package perl-Test-Harness-1:3.42-461.el9.noarch is already installed.
2026-03-10T07:44:29.821 INFO:teuthology.orchestra.run.vm08.stdout:Dependencies resolved.
2026-03-10T07:44:29.825 INFO:teuthology.orchestra.run.vm08.stdout:================================================================================
2026-03-10T07:44:29.825 INFO:teuthology.orchestra.run.vm08.stdout: Package Arch Version Repository Size
2026-03-10T07:44:29.825 INFO:teuthology.orchestra.run.vm08.stdout:================================================================================
2026-03-10T07:44:29.825 INFO:teuthology.orchestra.run.vm08.stdout:Installing:
2026-03-10T07:44:29.825 INFO:teuthology.orchestra.run.vm08.stdout: ceph x86_64 2:18.2.1-0.el9 ceph 6.4 k
2026-03-10T07:44:29.825 INFO:teuthology.orchestra.run.vm08.stdout: ceph-base x86_64 2:18.2.1-0.el9 ceph 5.2 M
2026-03-10T07:44:29.825 INFO:teuthology.orchestra.run.vm08.stdout: ceph-fuse x86_64 2:18.2.1-0.el9 ceph 839 k
2026-03-10T07:44:29.825 INFO:teuthology.orchestra.run.vm08.stdout: ceph-immutable-object-cache x86_64 2:18.2.1-0.el9 ceph 142 k
2026-03-10T07:44:29.825 INFO:teuthology.orchestra.run.vm08.stdout: ceph-mgr x86_64 2:18.2.1-0.el9 ceph 1.4 M
2026-03-10T07:44:29.825 INFO:teuthology.orchestra.run.vm08.stdout: ceph-mgr-cephadm noarch 2:18.2.1-0.el9 ceph-noarch 132 k
2026-03-10T07:44:29.825 INFO:teuthology.orchestra.run.vm08.stdout: ceph-mgr-dashboard noarch 2:18.2.1-0.el9 ceph-noarch 1.8 M
2026-03-10T07:44:29.825 INFO:teuthology.orchestra.run.vm08.stdout: ceph-mgr-diskprediction-local noarch 2:18.2.1-0.el9 ceph-noarch 7.4 M
2026-03-10T07:44:29.825 INFO:teuthology.orchestra.run.vm08.stdout: ceph-mgr-rook noarch 2:18.2.1-0.el9 ceph-noarch 50 k
2026-03-10T07:44:29.825 INFO:teuthology.orchestra.run.vm08.stdout: ceph-radosgw x86_64 2:18.2.1-0.el9 ceph 7.7 M
2026-03-10T07:44:29.825 INFO:teuthology.orchestra.run.vm08.stdout: ceph-test x86_64 2:18.2.1-0.el9 ceph 40 M
2026-03-10T07:44:29.825 INFO:teuthology.orchestra.run.vm08.stdout: cephadm noarch 2:18.2.1-0.el9 ceph-noarch 221 k
2026-03-10T07:44:29.825 INFO:teuthology.orchestra.run.vm08.stdout: libcephfs-devel x86_64 2:18.2.1-0.el9 ceph 31 k
2026-03-10T07:44:29.826 INFO:teuthology.orchestra.run.vm08.stdout: libcephfs2 x86_64 2:18.2.1-0.el9 ceph 658 k
2026-03-10T07:44:29.826 INFO:teuthology.orchestra.run.vm08.stdout: librados-devel x86_64 2:18.2.1-0.el9 ceph 127 k
2026-03-10T07:44:29.826 INFO:teuthology.orchestra.run.vm08.stdout: python3-cephfs x86_64 2:18.2.1-0.el9 ceph 161 k
2026-03-10T07:44:29.826 INFO:teuthology.orchestra.run.vm08.stdout: python3-jmespath noarch 1.0.1-1.el9 appstream 48 k
2026-03-10T07:44:29.826 INFO:teuthology.orchestra.run.vm08.stdout: python3-rados x86_64 2:18.2.1-0.el9 ceph 321 k
2026-03-10T07:44:29.826 INFO:teuthology.orchestra.run.vm08.stdout: python3-rbd x86_64 2:18.2.1-0.el9 ceph 297 k
2026-03-10T07:44:29.826 INFO:teuthology.orchestra.run.vm08.stdout: python3-rgw x86_64 2:18.2.1-0.el9 ceph 99 k
2026-03-10T07:44:29.826 INFO:teuthology.orchestra.run.vm08.stdout: python3-xmltodict noarch 0.12.0-15.el9 epel 22 k
2026-03-10T07:44:29.826 INFO:teuthology.orchestra.run.vm08.stdout: rbd-fuse x86_64 2:18.2.1-0.el9 ceph 86 k
2026-03-10T07:44:29.826 INFO:teuthology.orchestra.run.vm08.stdout: rbd-mirror x86_64 2:18.2.1-0.el9 ceph 3.0 M
2026-03-10T07:44:29.826 INFO:teuthology.orchestra.run.vm08.stdout: rbd-nbd x86_64 2:18.2.1-0.el9 ceph 171 k
2026-03-10T07:44:29.826 INFO:teuthology.orchestra.run.vm08.stdout:Upgrading:
2026-03-10T07:44:29.826 INFO:teuthology.orchestra.run.vm08.stdout: librados2 x86_64 2:18.2.1-0.el9 ceph 3.3 M
2026-03-10T07:44:29.826 INFO:teuthology.orchestra.run.vm08.stdout: librbd1 x86_64 2:18.2.1-0.el9 ceph 3.0 M
2026-03-10T07:44:29.826 INFO:teuthology.orchestra.run.vm08.stdout:Installing dependencies:
2026-03-10T07:44:29.826 INFO:teuthology.orchestra.run.vm08.stdout: boost-program-options x86_64 1.75.0-13.el9 appstream 104 k
2026-03-10T07:44:29.826 INFO:teuthology.orchestra.run.vm08.stdout: ceph-common x86_64 2:18.2.1-0.el9 ceph 18 M
2026-03-10T07:44:29.826 INFO:teuthology.orchestra.run.vm08.stdout: ceph-grafana-dashboards noarch 2:18.2.1-0.el9 ceph-noarch 23 k
2026-03-10T07:44:29.826 INFO:teuthology.orchestra.run.vm08.stdout: ceph-mds x86_64 2:18.2.1-0.el9 ceph 2.1 M
2026-03-10T07:44:29.826 INFO:teuthology.orchestra.run.vm08.stdout: ceph-mgr-modules-core noarch 2:18.2.1-0.el9 ceph-noarch 242 k
2026-03-10T07:44:29.826 INFO:teuthology.orchestra.run.vm08.stdout: ceph-mon x86_64 2:18.2.1-0.el9 ceph 4.4 M
2026-03-10T07:44:29.826 INFO:teuthology.orchestra.run.vm08.stdout: ceph-osd x86_64 2:18.2.1-0.el9 ceph 18 M
2026-03-10T07:44:29.826 INFO:teuthology.orchestra.run.vm08.stdout: ceph-prometheus-alerts noarch 2:18.2.1-0.el9 ceph-noarch 15 k
2026-03-10T07:44:29.826 INFO:teuthology.orchestra.run.vm08.stdout: ceph-selinux x86_64 2:18.2.1-0.el9 ceph 24 k
2026-03-10T07:44:29.826 INFO:teuthology.orchestra.run.vm08.stdout: flexiblas x86_64 3.0.4-9.el9 appstream 30 k
2026-03-10T07:44:29.826 INFO:teuthology.orchestra.run.vm08.stdout: flexiblas-netlib x86_64 3.0.4-9.el9 appstream 3.0 M
2026-03-10T07:44:29.826 INFO:teuthology.orchestra.run.vm08.stdout: flexiblas-openblas-openmp x86_64 3.0.4-9.el9 appstream 15 k
2026-03-10T07:44:29.826 INFO:teuthology.orchestra.run.vm08.stdout: fmt x86_64 8.1.1-5.el9 epel 111 k
2026-03-10T07:44:29.826 INFO:teuthology.orchestra.run.vm08.stdout: gperftools-libs x86_64 2.9.1-3.el9 epel 308 k
2026-03-10T07:44:29.826 INFO:teuthology.orchestra.run.vm08.stdout: ledmon-libs x86_64 1.1.0-3.el9 baseos 40 k
2026-03-10T07:44:29.826 INFO:teuthology.orchestra.run.vm08.stdout: libarrow x86_64 9.0.0-15.el9 epel 4.4 M
2026-03-10T07:44:29.826 INFO:teuthology.orchestra.run.vm08.stdout: libarrow-doc noarch 9.0.0-15.el9 epel 25 k
2026-03-10T07:44:29.826 INFO:teuthology.orchestra.run.vm08.stdout: libcephsqlite x86_64 2:18.2.1-0.el9 ceph 165 k
2026-03-10T07:44:29.826 INFO:teuthology.orchestra.run.vm08.stdout: libconfig x86_64 1.7.2-9.el9 baseos 72 k
2026-03-10T07:44:29.826 INFO:teuthology.orchestra.run.vm08.stdout: libgfortran x86_64 11.5.0-14.el9 baseos 794 k
2026-03-10T07:44:29.826 INFO:teuthology.orchestra.run.vm08.stdout: liboath x86_64 2.6.12-1.el9 epel 49 k
2026-03-10T07:44:29.826 INFO:teuthology.orchestra.run.vm08.stdout: libpmemobj x86_64 1.12.1-1.el9 appstream 160 k
2026-03-10T07:44:29.826 INFO:teuthology.orchestra.run.vm08.stdout: libquadmath x86_64 11.5.0-14.el9 baseos 184 k
2026-03-10T07:44:29.826 INFO:teuthology.orchestra.run.vm08.stdout: librabbitmq x86_64 0.11.0-7.el9 appstream 45 k
2026-03-10T07:44:29.826 INFO:teuthology.orchestra.run.vm08.stdout: libradosstriper1 x86_64 2:18.2.1-0.el9 ceph 474 k
2026-03-10T07:44:29.826 INFO:teuthology.orchestra.run.vm08.stdout: librdkafka x86_64 1.6.1-102.el9 appstream 662 k
2026-03-10T07:44:29.826 INFO:teuthology.orchestra.run.vm08.stdout: librgw2 x86_64 2:18.2.1-0.el9 ceph 4.5 M
2026-03-10T07:44:29.826 INFO:teuthology.orchestra.run.vm08.stdout: libstoragemgmt x86_64 1.10.1-1.el9 appstream 246 k
2026-03-10T07:44:29.826 INFO:teuthology.orchestra.run.vm08.stdout: libunwind x86_64 1.6.2-1.el9 epel 67 k
2026-03-10T07:44:29.826 INFO:teuthology.orchestra.run.vm08.stdout: libxslt x86_64 1.1.34-12.el9 appstream 233 k
2026-03-10T07:44:29.826 INFO:teuthology.orchestra.run.vm08.stdout: lttng-ust x86_64 2.12.0-6.el9 appstream 292 k
2026-03-10T07:44:29.826 INFO:teuthology.orchestra.run.vm08.stdout: mailcap noarch 2.1.49-5.el9 baseos 33 k
2026-03-10T07:44:29.826 INFO:teuthology.orchestra.run.vm08.stdout: openblas x86_64 0.3.29-1.el9 appstream 42 k
2026-03-10T07:44:29.826 INFO:teuthology.orchestra.run.vm08.stdout: openblas-openmp x86_64 0.3.29-1.el9 appstream 5.3 M
2026-03-10T07:44:29.826 INFO:teuthology.orchestra.run.vm08.stdout: parquet-libs x86_64 9.0.0-15.el9 epel 838 k
2026-03-10T07:44:29.826 INFO:teuthology.orchestra.run.vm08.stdout: python3-asyncssh noarch 2.13.2-5.el9 epel 548 k
2026-03-10T07:44:29.826 INFO:teuthology.orchestra.run.vm08.stdout: python3-autocommand noarch 2.2.2-8.el9 epel 29 k
2026-03-10T07:44:29.826 INFO:teuthology.orchestra.run.vm08.stdout: python3-babel noarch 2.9.1-2.el9 appstream 6.0 M
2026-03-10T07:44:29.826 INFO:teuthology.orchestra.run.vm08.stdout: python3-backports-tarfile noarch 1.2.0-1.el9 epel 60 k
2026-03-10T07:44:29.826 INFO:teuthology.orchestra.run.vm08.stdout: python3-bcrypt x86_64 3.2.2-1.el9 epel 43 k
2026-03-10T07:44:29.826 INFO:teuthology.orchestra.run.vm08.stdout: python3-cachetools noarch 4.2.4-1.el9 epel 32 k
2026-03-10T07:44:29.826 INFO:teuthology.orchestra.run.vm08.stdout: python3-ceph-argparse x86_64 2:18.2.1-0.el9 ceph 45 k
2026-03-10T07:44:29.826 INFO:teuthology.orchestra.run.vm08.stdout: python3-ceph-common x86_64 2:18.2.1-0.el9 ceph 124 k
2026-03-10T07:44:29.827 INFO:teuthology.orchestra.run.vm08.stdout: python3-certifi noarch 2023.05.07-4.el9 epel 14 k
2026-03-10T07:44:29.827 INFO:teuthology.orchestra.run.vm08.stdout: python3-cffi x86_64 1.14.5-5.el9 baseos 253 k
2026-03-10T07:44:29.827 INFO:teuthology.orchestra.run.vm08.stdout: python3-cheroot noarch 10.0.1-4.el9 epel 173 k
2026-03-10T07:44:29.827 INFO:teuthology.orchestra.run.vm08.stdout: python3-cherrypy noarch 18.6.1-2.el9 epel 358 k
2026-03-10T07:44:29.827 INFO:teuthology.orchestra.run.vm08.stdout: python3-cryptography x86_64 36.0.1-5.el9 baseos 1.2 M
2026-03-10T07:44:29.827 INFO:teuthology.orchestra.run.vm08.stdout: python3-devel x86_64 3.9.25-3.el9 appstream 244 k
2026-03-10T07:44:29.827 INFO:teuthology.orchestra.run.vm08.stdout: python3-google-auth noarch 1:2.45.0-1.el9 epel 254 k
2026-03-10T07:44:29.827 INFO:teuthology.orchestra.run.vm08.stdout: python3-jaraco noarch 8.2.1-3.el9 epel 11 k
2026-03-10T07:44:29.827 INFO:teuthology.orchestra.run.vm08.stdout: python3-jaraco-classes noarch 3.2.1-5.el9 epel 18 k
2026-03-10T07:44:29.827 INFO:teuthology.orchestra.run.vm08.stdout: python3-jaraco-collections noarch 3.0.0-8.el9 epel 23 k
2026-03-10T07:44:29.827 INFO:teuthology.orchestra.run.vm08.stdout: python3-jaraco-context noarch 6.0.1-3.el9 epel 20 k
2026-03-10T07:44:29.827 INFO:teuthology.orchestra.run.vm08.stdout: python3-jaraco-functools noarch 3.5.0-2.el9 epel 19 k
2026-03-10T07:44:29.827 INFO:teuthology.orchestra.run.vm08.stdout: python3-jaraco-text noarch 4.0.0-2.el9 epel 26 k
2026-03-10T07:44:29.827 INFO:teuthology.orchestra.run.vm08.stdout: python3-jinja2 noarch 2.11.3-8.el9 appstream 249 k
2026-03-10T07:44:29.827 INFO:teuthology.orchestra.run.vm08.stdout: python3-jwt noarch 2.4.0-1.el9 epel 41 k
2026-03-10T07:44:29.827 INFO:teuthology.orchestra.run.vm08.stdout: python3-kubernetes noarch 1:26.1.0-3.el9 epel 1.0 M
2026-03-10T07:44:29.827 INFO:teuthology.orchestra.run.vm08.stdout: python3-libstoragemgmt x86_64 1.10.1-1.el9 appstream 177 k
2026-03-10T07:44:29.827 INFO:teuthology.orchestra.run.vm08.stdout: python3-logutils noarch 0.3.5-21.el9 epel 46 k
2026-03-10T07:44:29.827 INFO:teuthology.orchestra.run.vm08.stdout: python3-mako noarch 1.1.4-6.el9 appstream 172 k
2026-03-10T07:44:29.827 INFO:teuthology.orchestra.run.vm08.stdout: python3-markupsafe x86_64 1.1.1-12.el9 appstream 35 k
2026-03-10T07:44:29.827 INFO:teuthology.orchestra.run.vm08.stdout: python3-more-itertools noarch 8.12.0-2.el9 epel 79 k
2026-03-10T07:44:29.827 INFO:teuthology.orchestra.run.vm08.stdout: python3-natsort noarch 7.1.1-5.el9 epel 58 k
2026-03-10T07:44:29.827 INFO:teuthology.orchestra.run.vm08.stdout: python3-numpy x86_64 1:1.23.5-2.el9 appstream 6.1 M
2026-03-10T07:44:29.827 INFO:teuthology.orchestra.run.vm08.stdout: python3-numpy-f2py x86_64 1:1.23.5-2.el9 appstream 442 k
2026-03-10T07:44:29.827 INFO:teuthology.orchestra.run.vm08.stdout: python3-pecan noarch 1.4.2-3.el9 epel 272 k
2026-03-10T07:44:29.827 INFO:teuthology.orchestra.run.vm08.stdout: python3-ply noarch 3.11-14.el9 baseos 106 k
2026-03-10T07:44:29.827 INFO:teuthology.orchestra.run.vm08.stdout: python3-portend noarch 3.1.0-2.el9 epel 16 k
2026-03-10T07:44:29.827 INFO:teuthology.orchestra.run.vm08.stdout: python3-pyOpenSSL noarch 21.0.0-1.el9 epel 90 k
2026-03-10T07:44:29.827 INFO:teuthology.orchestra.run.vm08.stdout: python3-pyasn1 noarch 0.4.8-7.el9 appstream 157 k
2026-03-10T07:44:29.827 INFO:teuthology.orchestra.run.vm08.stdout: python3-pyasn1-modules noarch 0.4.8-7.el9 appstream 277 k
2026-03-10T07:44:29.827 INFO:teuthology.orchestra.run.vm08.stdout: python3-pycparser noarch 2.20-6.el9 baseos 135 k
2026-03-10T07:44:29.827 INFO:teuthology.orchestra.run.vm08.stdout: python3-repoze-lru noarch 0.7-16.el9 epel 31 k
2026-03-10T07:44:29.827 INFO:teuthology.orchestra.run.vm08.stdout: python3-requests noarch 2.25.1-10.el9 baseos 126 k
2026-03-10T07:44:29.827 INFO:teuthology.orchestra.run.vm08.stdout: python3-requests-oauthlib noarch 1.3.0-12.el9 appstream 54 k
2026-03-10T07:44:29.827 INFO:teuthology.orchestra.run.vm08.stdout: python3-routes noarch 2.5.1-5.el9 epel 188 k
2026-03-10T07:44:29.827 INFO:teuthology.orchestra.run.vm08.stdout: python3-rsa noarch 4.9-2.el9 epel 59 k
2026-03-10T07:44:29.827 INFO:teuthology.orchestra.run.vm08.stdout: python3-scipy x86_64 1.9.3-2.el9 appstream 19 M
2026-03-10T07:44:29.827 INFO:teuthology.orchestra.run.vm08.stdout: python3-tempora noarch 5.0.0-2.el9 epel 36 k
2026-03-10T07:44:29.827 INFO:teuthology.orchestra.run.vm08.stdout: python3-toml noarch 0.10.2-6.el9 appstream 42 k
2026-03-10T07:44:29.827 INFO:teuthology.orchestra.run.vm08.stdout: python3-typing-extensions noarch 4.15.0-1.el9 epel 86 k
2026-03-10T07:44:29.827 INFO:teuthology.orchestra.run.vm08.stdout: python3-urllib3 noarch 1.26.5-7.el9 baseos 218 k
2026-03-10T07:44:29.827 INFO:teuthology.orchestra.run.vm08.stdout: python3-webob noarch 1.8.8-2.el9 epel 230 k
2026-03-10T07:44:29.827 INFO:teuthology.orchestra.run.vm08.stdout: python3-websocket-client noarch 1.2.3-2.el9 epel 90 k
2026-03-10T07:44:29.827 INFO:teuthology.orchestra.run.vm08.stdout: python3-werkzeug noarch 2.0.3-3.el9.1 epel 427 k
2026-03-10T07:44:29.827 INFO:teuthology.orchestra.run.vm08.stdout: python3-zc-lockfile noarch 2.0-10.el9 epel 20 k
2026-03-10T07:44:29.827 INFO:teuthology.orchestra.run.vm08.stdout: re2 x86_64 1:20211101-20.el9 epel 191 k
2026-03-10T07:44:29.827 INFO:teuthology.orchestra.run.vm08.stdout: socat x86_64 1.7.4.1-8.el9 appstream 303 k
2026-03-10T07:44:29.827 INFO:teuthology.orchestra.run.vm08.stdout: thrift x86_64 0.15.0-4.el9 epel 1.6 M
2026-03-10T07:44:29.827 INFO:teuthology.orchestra.run.vm08.stdout: xmlstarlet x86_64 1.6.1-20.el9 appstream 64 k
2026-03-10T07:44:29.827 INFO:teuthology.orchestra.run.vm08.stdout:Installing weak dependencies:
2026-03-10T07:44:29.827 INFO:teuthology.orchestra.run.vm08.stdout: python3-jwt+crypto noarch 2.4.0-1.el9 epel 9.0 k
2026-03-10T07:44:29.827 INFO:teuthology.orchestra.run.vm08.stdout:
2026-03-10T07:44:29.827 INFO:teuthology.orchestra.run.vm08.stdout:Transaction Summary
2026-03-10T07:44:29.827 INFO:teuthology.orchestra.run.vm08.stdout:================================================================================
2026-03-10T07:44:29.827 INFO:teuthology.orchestra.run.vm08.stdout:Install 117 Packages
2026-03-10T07:44:29.827 INFO:teuthology.orchestra.run.vm08.stdout:Upgrade 2 Packages
2026-03-10T07:44:29.827 INFO:teuthology.orchestra.run.vm08.stdout:
2026-03-10T07:44:29.828 INFO:teuthology.orchestra.run.vm08.stdout:Total download size: 182 M
2026-03-10T07:44:29.828 INFO:teuthology.orchestra.run.vm08.stdout:Downloading Packages:
2026-03-10T07:44:29.923 INFO:teuthology.orchestra.run.vm05.stdout:lab-extras 64 kB/s | 50 kB 00:00
2026-03-10T07:44:30.890 INFO:teuthology.orchestra.run.vm08.stdout:(1/119): ceph-18.2.1-0.el9.x86_64.rpm 21 kB/s | 6.4 kB 00:00
2026-03-10T07:44:31.222 INFO:teuthology.orchestra.run.vm05.stdout:Package librados2-2:16.2.4-5.el9.x86_64 is already installed.
2026-03-10T07:44:31.223 INFO:teuthology.orchestra.run.vm05.stdout:Package librbd1-2:16.2.4-5.el9.x86_64 is already installed.
2026-03-10T07:44:31.226 INFO:teuthology.orchestra.run.vm05.stdout:Package bzip2-1.0.8-11.el9.x86_64 is already installed.
2026-03-10T07:44:31.227 INFO:teuthology.orchestra.run.vm05.stdout:Package perl-Test-Harness-1:3.42-461.el9.noarch is already installed.
2026-03-10T07:44:31.252 INFO:teuthology.orchestra.run.vm05.stdout:Dependencies resolved.
2026-03-10T07:44:31.256 INFO:teuthology.orchestra.run.vm05.stdout:================================================================================
2026-03-10T07:44:31.256 INFO:teuthology.orchestra.run.vm05.stdout: Package Arch Version Repository Size
2026-03-10T07:44:31.256 INFO:teuthology.orchestra.run.vm05.stdout:================================================================================
2026-03-10T07:44:31.256 INFO:teuthology.orchestra.run.vm05.stdout:Installing:
2026-03-10T07:44:31.256 INFO:teuthology.orchestra.run.vm05.stdout: ceph x86_64 2:18.2.1-0.el9 ceph 6.4 k
2026-03-10T07:44:31.256 INFO:teuthology.orchestra.run.vm05.stdout: ceph-base x86_64 2:18.2.1-0.el9 ceph 5.2 M
2026-03-10T07:44:31.256 INFO:teuthology.orchestra.run.vm05.stdout: ceph-fuse x86_64 2:18.2.1-0.el9 ceph 839 k
2026-03-10T07:44:31.256 INFO:teuthology.orchestra.run.vm05.stdout: ceph-immutable-object-cache x86_64 2:18.2.1-0.el9 ceph 142 k
2026-03-10T07:44:31.256 INFO:teuthology.orchestra.run.vm05.stdout: ceph-mgr x86_64 2:18.2.1-0.el9 ceph 1.4 M
2026-03-10T07:44:31.256 INFO:teuthology.orchestra.run.vm05.stdout: ceph-mgr-cephadm noarch 2:18.2.1-0.el9 ceph-noarch 132 k
2026-03-10T07:44:31.256 INFO:teuthology.orchestra.run.vm05.stdout: ceph-mgr-dashboard noarch 2:18.2.1-0.el9 ceph-noarch 1.8 M
2026-03-10T07:44:31.256 INFO:teuthology.orchestra.run.vm05.stdout: ceph-mgr-diskprediction-local noarch 2:18.2.1-0.el9 ceph-noarch 7.4 M
2026-03-10T07:44:31.256 INFO:teuthology.orchestra.run.vm05.stdout: ceph-mgr-rook noarch 2:18.2.1-0.el9 ceph-noarch 50 k
2026-03-10T07:44:31.256 INFO:teuthology.orchestra.run.vm05.stdout: ceph-radosgw x86_64 2:18.2.1-0.el9 ceph 7.7 M
2026-03-10T07:44:31.256 INFO:teuthology.orchestra.run.vm05.stdout: ceph-test x86_64 2:18.2.1-0.el9 ceph 40 M
2026-03-10T07:44:31.256 INFO:teuthology.orchestra.run.vm05.stdout: cephadm noarch 2:18.2.1-0.el9 ceph-noarch 221 k
2026-03-10T07:44:31.256 INFO:teuthology.orchestra.run.vm05.stdout: libcephfs-devel x86_64 2:18.2.1-0.el9 ceph 31 k
2026-03-10T07:44:31.256 INFO:teuthology.orchestra.run.vm05.stdout: libcephfs2 x86_64 2:18.2.1-0.el9 ceph 658 k
2026-03-10T07:44:31.256 INFO:teuthology.orchestra.run.vm05.stdout: librados-devel x86_64 2:18.2.1-0.el9 ceph 127 k
2026-03-10T07:44:31.256 INFO:teuthology.orchestra.run.vm05.stdout: python3-cephfs x86_64 2:18.2.1-0.el9 ceph 161 k
2026-03-10T07:44:31.256 INFO:teuthology.orchestra.run.vm05.stdout: python3-jmespath noarch 1.0.1-1.el9 appstream 48 k
2026-03-10T07:44:31.256 INFO:teuthology.orchestra.run.vm05.stdout: python3-rados x86_64 2:18.2.1-0.el9 ceph 321 k
2026-03-10T07:44:31.256 INFO:teuthology.orchestra.run.vm05.stdout: python3-rbd x86_64 2:18.2.1-0.el9 ceph 297 k
2026-03-10T07:44:31.256 INFO:teuthology.orchestra.run.vm05.stdout: python3-rgw x86_64 2:18.2.1-0.el9 ceph 99 k
2026-03-10T07:44:31.256 INFO:teuthology.orchestra.run.vm05.stdout: python3-xmltodict noarch 0.12.0-15.el9 epel 22 k
2026-03-10T07:44:31.256 INFO:teuthology.orchestra.run.vm05.stdout: rbd-fuse x86_64 2:18.2.1-0.el9 ceph 86 k
2026-03-10T07:44:31.256 INFO:teuthology.orchestra.run.vm05.stdout: rbd-mirror x86_64 2:18.2.1-0.el9 ceph 3.0 M
2026-03-10T07:44:31.256 INFO:teuthology.orchestra.run.vm05.stdout: rbd-nbd x86_64 2:18.2.1-0.el9 ceph 171 k
2026-03-10T07:44:31.256 INFO:teuthology.orchestra.run.vm05.stdout:Upgrading:
2026-03-10T07:44:31.256 INFO:teuthology.orchestra.run.vm05.stdout: librados2 x86_64 2:18.2.1-0.el9 ceph 3.3 M
2026-03-10T07:44:31.256 INFO:teuthology.orchestra.run.vm05.stdout: librbd1 x86_64 2:18.2.1-0.el9 ceph 3.0 M
2026-03-10T07:44:31.256 INFO:teuthology.orchestra.run.vm05.stdout:Installing dependencies:
2026-03-10T07:44:31.256 INFO:teuthology.orchestra.run.vm05.stdout: boost-program-options x86_64 1.75.0-13.el9 appstream 104 k
2026-03-10T07:44:31.256 INFO:teuthology.orchestra.run.vm05.stdout: ceph-common x86_64 2:18.2.1-0.el9 ceph 18 M
2026-03-10T07:44:31.256 INFO:teuthology.orchestra.run.vm05.stdout: ceph-grafana-dashboards noarch 2:18.2.1-0.el9 ceph-noarch 23 k
2026-03-10T07:44:31.256 INFO:teuthology.orchestra.run.vm05.stdout: ceph-mds x86_64 2:18.2.1-0.el9 ceph 2.1 M
2026-03-10T07:44:31.256 INFO:teuthology.orchestra.run.vm05.stdout: ceph-mgr-modules-core noarch 2:18.2.1-0.el9 ceph-noarch 242 k
2026-03-10T07:44:31.256 INFO:teuthology.orchestra.run.vm05.stdout: ceph-mon x86_64 2:18.2.1-0.el9 ceph 4.4 M
2026-03-10T07:44:31.256 INFO:teuthology.orchestra.run.vm05.stdout: ceph-osd x86_64 2:18.2.1-0.el9 ceph 18 M
2026-03-10T07:44:31.256 INFO:teuthology.orchestra.run.vm05.stdout: ceph-prometheus-alerts noarch 2:18.2.1-0.el9 ceph-noarch 15 k
2026-03-10T07:44:31.256 INFO:teuthology.orchestra.run.vm05.stdout: ceph-selinux x86_64 2:18.2.1-0.el9 ceph 24 k
2026-03-10T07:44:31.256 INFO:teuthology.orchestra.run.vm05.stdout: flexiblas x86_64 3.0.4-9.el9 appstream 30 k
2026-03-10T07:44:31.256 INFO:teuthology.orchestra.run.vm05.stdout: flexiblas-netlib x86_64 3.0.4-9.el9 appstream 3.0 M
2026-03-10T07:44:31.256 INFO:teuthology.orchestra.run.vm05.stdout: flexiblas-openblas-openmp x86_64 3.0.4-9.el9 appstream 15 k
2026-03-10T07:44:31.256 INFO:teuthology.orchestra.run.vm05.stdout: fmt x86_64 8.1.1-5.el9 epel 111 k
2026-03-10T07:44:31.256 INFO:teuthology.orchestra.run.vm05.stdout: gperftools-libs x86_64 2.9.1-3.el9 epel 308 k
2026-03-10T07:44:31.256 INFO:teuthology.orchestra.run.vm05.stdout: ledmon-libs x86_64 1.1.0-3.el9 baseos 40 k
2026-03-10T07:44:31.256 INFO:teuthology.orchestra.run.vm05.stdout: libarrow x86_64 9.0.0-15.el9 epel 4.4 M
2026-03-10T07:44:31.256 INFO:teuthology.orchestra.run.vm05.stdout: libarrow-doc noarch 9.0.0-15.el9 epel 25 k
2026-03-10T07:44:31.256 INFO:teuthology.orchestra.run.vm05.stdout: libcephsqlite x86_64 2:18.2.1-0.el9 ceph 165 k
2026-03-10T07:44:31.256 INFO:teuthology.orchestra.run.vm05.stdout: libconfig x86_64 1.7.2-9.el9 baseos 72 k
2026-03-10T07:44:31.256 INFO:teuthology.orchestra.run.vm05.stdout: libgfortran x86_64 11.5.0-14.el9 baseos 794 k
2026-03-10T07:44:31.256 INFO:teuthology.orchestra.run.vm05.stdout: liboath x86_64 2.6.12-1.el9 epel 49 k
2026-03-10T07:44:31.256 INFO:teuthology.orchestra.run.vm05.stdout: libpmemobj x86_64 1.12.1-1.el9 appstream 160 k
2026-03-10T07:44:31.256 INFO:teuthology.orchestra.run.vm05.stdout: libquadmath x86_64 11.5.0-14.el9 baseos 184 k
2026-03-10T07:44:31.256 INFO:teuthology.orchestra.run.vm05.stdout: librabbitmq x86_64 0.11.0-7.el9 appstream 45 k
2026-03-10T07:44:31.256 INFO:teuthology.orchestra.run.vm05.stdout: libradosstriper1 x86_64 2:18.2.1-0.el9 ceph 474 k
2026-03-10T07:44:31.257 INFO:teuthology.orchestra.run.vm05.stdout: librdkafka x86_64 1.6.1-102.el9 appstream 662 k
2026-03-10T07:44:31.257 INFO:teuthology.orchestra.run.vm05.stdout: librgw2 x86_64 2:18.2.1-0.el9 ceph 4.5 M
2026-03-10T07:44:31.257 INFO:teuthology.orchestra.run.vm05.stdout: libstoragemgmt x86_64 1.10.1-1.el9 appstream 246 k
2026-03-10T07:44:31.257 INFO:teuthology.orchestra.run.vm05.stdout: libunwind x86_64 1.6.2-1.el9 epel 67 k
2026-03-10T07:44:31.257 INFO:teuthology.orchestra.run.vm05.stdout: libxslt x86_64 1.1.34-12.el9 appstream 233 k
2026-03-10T07:44:31.257 INFO:teuthology.orchestra.run.vm05.stdout: lttng-ust x86_64 2.12.0-6.el9 appstream 292 k
2026-03-10T07:44:31.257 INFO:teuthology.orchestra.run.vm05.stdout: mailcap noarch 2.1.49-5.el9 baseos 33 k
2026-03-10T07:44:31.257 INFO:teuthology.orchestra.run.vm05.stdout: openblas x86_64 0.3.29-1.el9 appstream 42 k
2026-03-10T07:44:31.257 INFO:teuthology.orchestra.run.vm05.stdout: openblas-openmp x86_64 0.3.29-1.el9 appstream 5.3 M
2026-03-10T07:44:31.257 INFO:teuthology.orchestra.run.vm05.stdout: parquet-libs x86_64 9.0.0-15.el9 epel 838 k
2026-03-10T07:44:31.257 INFO:teuthology.orchestra.run.vm05.stdout: python3-asyncssh noarch 2.13.2-5.el9 epel 548 k
2026-03-10T07:44:31.257 INFO:teuthology.orchestra.run.vm05.stdout: python3-autocommand noarch 2.2.2-8.el9 epel 29 k
2026-03-10T07:44:31.257 INFO:teuthology.orchestra.run.vm05.stdout: python3-babel noarch 2.9.1-2.el9 appstream 6.0 M
2026-03-10T07:44:31.257 INFO:teuthology.orchestra.run.vm05.stdout: python3-backports-tarfile noarch 1.2.0-1.el9 epel 60 k
2026-03-10T07:44:31.257 INFO:teuthology.orchestra.run.vm05.stdout: python3-bcrypt x86_64 3.2.2-1.el9 epel 43 k
2026-03-10T07:44:31.257 INFO:teuthology.orchestra.run.vm05.stdout: python3-cachetools noarch 4.2.4-1.el9 epel 32 k
2026-03-10T07:44:31.257 INFO:teuthology.orchestra.run.vm05.stdout: python3-ceph-argparse x86_64 2:18.2.1-0.el9 ceph 45 k
2026-03-10T07:44:31.257 INFO:teuthology.orchestra.run.vm05.stdout: python3-ceph-common x86_64 2:18.2.1-0.el9 ceph 124 k
2026-03-10T07:44:31.257 INFO:teuthology.orchestra.run.vm05.stdout: python3-certifi noarch 2023.05.07-4.el9 epel 14 k
2026-03-10T07:44:31.257 INFO:teuthology.orchestra.run.vm05.stdout: python3-cffi x86_64 1.14.5-5.el9 baseos 253 k
2026-03-10T07:44:31.257 INFO:teuthology.orchestra.run.vm05.stdout: python3-cheroot noarch 10.0.1-4.el9 epel 173 k
2026-03-10T07:44:31.257 INFO:teuthology.orchestra.run.vm05.stdout: python3-cherrypy noarch 18.6.1-2.el9 epel 358 k
2026-03-10T07:44:31.257 INFO:teuthology.orchestra.run.vm05.stdout:
python3-cryptography x86_64 36.0.1-5.el9 baseos 1.2 M 2026-03-10T07:44:31.257 INFO:teuthology.orchestra.run.vm05.stdout: python3-devel x86_64 3.9.25-3.el9 appstream 244 k 2026-03-10T07:44:31.257 INFO:teuthology.orchestra.run.vm05.stdout: python3-google-auth noarch 1:2.45.0-1.el9 epel 254 k 2026-03-10T07:44:31.257 INFO:teuthology.orchestra.run.vm05.stdout: python3-jaraco noarch 8.2.1-3.el9 epel 11 k 2026-03-10T07:44:31.257 INFO:teuthology.orchestra.run.vm05.stdout: python3-jaraco-classes noarch 3.2.1-5.el9 epel 18 k 2026-03-10T07:44:31.257 INFO:teuthology.orchestra.run.vm05.stdout: python3-jaraco-collections noarch 3.0.0-8.el9 epel 23 k 2026-03-10T07:44:31.257 INFO:teuthology.orchestra.run.vm05.stdout: python3-jaraco-context noarch 6.0.1-3.el9 epel 20 k 2026-03-10T07:44:31.257 INFO:teuthology.orchestra.run.vm05.stdout: python3-jaraco-functools noarch 3.5.0-2.el9 epel 19 k 2026-03-10T07:44:31.257 INFO:teuthology.orchestra.run.vm05.stdout: python3-jaraco-text noarch 4.0.0-2.el9 epel 26 k 2026-03-10T07:44:31.257 INFO:teuthology.orchestra.run.vm05.stdout: python3-jinja2 noarch 2.11.3-8.el9 appstream 249 k 2026-03-10T07:44:31.257 INFO:teuthology.orchestra.run.vm05.stdout: python3-jwt noarch 2.4.0-1.el9 epel 41 k 2026-03-10T07:44:31.257 INFO:teuthology.orchestra.run.vm05.stdout: python3-kubernetes noarch 1:26.1.0-3.el9 epel 1.0 M 2026-03-10T07:44:31.257 INFO:teuthology.orchestra.run.vm05.stdout: python3-libstoragemgmt x86_64 1.10.1-1.el9 appstream 177 k 2026-03-10T07:44:31.257 INFO:teuthology.orchestra.run.vm05.stdout: python3-logutils noarch 0.3.5-21.el9 epel 46 k 2026-03-10T07:44:31.257 INFO:teuthology.orchestra.run.vm05.stdout: python3-mako noarch 1.1.4-6.el9 appstream 172 k 2026-03-10T07:44:31.257 INFO:teuthology.orchestra.run.vm05.stdout: python3-markupsafe x86_64 1.1.1-12.el9 appstream 35 k 2026-03-10T07:44:31.257 INFO:teuthology.orchestra.run.vm05.stdout: python3-more-itertools noarch 8.12.0-2.el9 epel 79 k 2026-03-10T07:44:31.257 
INFO:teuthology.orchestra.run.vm05.stdout: python3-natsort noarch 7.1.1-5.el9 epel 58 k 2026-03-10T07:44:31.257 INFO:teuthology.orchestra.run.vm05.stdout: python3-numpy x86_64 1:1.23.5-2.el9 appstream 6.1 M 2026-03-10T07:44:31.257 INFO:teuthology.orchestra.run.vm05.stdout: python3-numpy-f2py x86_64 1:1.23.5-2.el9 appstream 442 k 2026-03-10T07:44:31.257 INFO:teuthology.orchestra.run.vm05.stdout: python3-pecan noarch 1.4.2-3.el9 epel 272 k 2026-03-10T07:44:31.257 INFO:teuthology.orchestra.run.vm05.stdout: python3-ply noarch 3.11-14.el9 baseos 106 k 2026-03-10T07:44:31.257 INFO:teuthology.orchestra.run.vm05.stdout: python3-portend noarch 3.1.0-2.el9 epel 16 k 2026-03-10T07:44:31.257 INFO:teuthology.orchestra.run.vm05.stdout: python3-pyOpenSSL noarch 21.0.0-1.el9 epel 90 k 2026-03-10T07:44:31.257 INFO:teuthology.orchestra.run.vm05.stdout: python3-pyasn1 noarch 0.4.8-7.el9 appstream 157 k 2026-03-10T07:44:31.257 INFO:teuthology.orchestra.run.vm05.stdout: python3-pyasn1-modules noarch 0.4.8-7.el9 appstream 277 k 2026-03-10T07:44:31.257 INFO:teuthology.orchestra.run.vm05.stdout: python3-pycparser noarch 2.20-6.el9 baseos 135 k 2026-03-10T07:44:31.257 INFO:teuthology.orchestra.run.vm05.stdout: python3-repoze-lru noarch 0.7-16.el9 epel 31 k 2026-03-10T07:44:31.257 INFO:teuthology.orchestra.run.vm05.stdout: python3-requests noarch 2.25.1-10.el9 baseos 126 k 2026-03-10T07:44:31.257 INFO:teuthology.orchestra.run.vm05.stdout: python3-requests-oauthlib noarch 1.3.0-12.el9 appstream 54 k 2026-03-10T07:44:31.257 INFO:teuthology.orchestra.run.vm05.stdout: python3-routes noarch 2.5.1-5.el9 epel 188 k 2026-03-10T07:44:31.257 INFO:teuthology.orchestra.run.vm05.stdout: python3-rsa noarch 4.9-2.el9 epel 59 k 2026-03-10T07:44:31.257 INFO:teuthology.orchestra.run.vm05.stdout: python3-scipy x86_64 1.9.3-2.el9 appstream 19 M 2026-03-10T07:44:31.257 INFO:teuthology.orchestra.run.vm05.stdout: python3-tempora noarch 5.0.0-2.el9 epel 36 k 2026-03-10T07:44:31.257 
INFO:teuthology.orchestra.run.vm05.stdout: python3-toml noarch 0.10.2-6.el9 appstream 42 k 2026-03-10T07:44:31.257 INFO:teuthology.orchestra.run.vm05.stdout: python3-typing-extensions noarch 4.15.0-1.el9 epel 86 k 2026-03-10T07:44:31.257 INFO:teuthology.orchestra.run.vm05.stdout: python3-urllib3 noarch 1.26.5-7.el9 baseos 218 k 2026-03-10T07:44:31.257 INFO:teuthology.orchestra.run.vm05.stdout: python3-webob noarch 1.8.8-2.el9 epel 230 k 2026-03-10T07:44:31.257 INFO:teuthology.orchestra.run.vm05.stdout: python3-websocket-client noarch 1.2.3-2.el9 epel 90 k 2026-03-10T07:44:31.257 INFO:teuthology.orchestra.run.vm05.stdout: python3-werkzeug noarch 2.0.3-3.el9.1 epel 427 k 2026-03-10T07:44:31.257 INFO:teuthology.orchestra.run.vm05.stdout: python3-zc-lockfile noarch 2.0-10.el9 epel 20 k 2026-03-10T07:44:31.257 INFO:teuthology.orchestra.run.vm05.stdout: re2 x86_64 1:20211101-20.el9 epel 191 k 2026-03-10T07:44:31.257 INFO:teuthology.orchestra.run.vm05.stdout: socat x86_64 1.7.4.1-8.el9 appstream 303 k 2026-03-10T07:44:31.258 INFO:teuthology.orchestra.run.vm05.stdout: thrift x86_64 0.15.0-4.el9 epel 1.6 M 2026-03-10T07:44:31.258 INFO:teuthology.orchestra.run.vm05.stdout: xmlstarlet x86_64 1.6.1-20.el9 appstream 64 k 2026-03-10T07:44:31.258 INFO:teuthology.orchestra.run.vm05.stdout:Installing weak dependencies: 2026-03-10T07:44:31.258 INFO:teuthology.orchestra.run.vm05.stdout: python3-jwt+crypto noarch 2.4.0-1.el9 epel 9.0 k 2026-03-10T07:44:31.258 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-10T07:44:31.258 INFO:teuthology.orchestra.run.vm05.stdout:Transaction Summary 2026-03-10T07:44:31.258 INFO:teuthology.orchestra.run.vm05.stdout:================================================================================ 2026-03-10T07:44:31.258 INFO:teuthology.orchestra.run.vm05.stdout:Install 117 Packages 2026-03-10T07:44:31.258 INFO:teuthology.orchestra.run.vm05.stdout:Upgrade 2 Packages 2026-03-10T07:44:31.258 INFO:teuthology.orchestra.run.vm05.stdout: 
2026-03-10T07:44:31.258 INFO:teuthology.orchestra.run.vm05.stdout:Total download size: 182 M 2026-03-10T07:44:31.258 INFO:teuthology.orchestra.run.vm05.stdout:Downloading Packages: 2026-03-10T07:44:32.129 INFO:teuthology.orchestra.run.vm05.stdout:(1/119): ceph-18.2.1-0.el9.x86_64.rpm 21 kB/s | 6.4 kB 00:00 2026-03-10T07:44:33.134 INFO:teuthology.orchestra.run.vm05.stdout:(2/119): ceph-base-18.2.1-0.el9.x86_64.rpm 4.0 MB/s | 5.2 MB 00:01 2026-03-10T07:44:33.766 INFO:teuthology.orchestra.run.vm08.stdout:(2/119): ceph-fuse-18.2.1-0.el9.x86_64.rpm 292 kB/s | 839 kB 00:02 2026-03-10T07:44:34.061 INFO:teuthology.orchestra.run.vm05.stdout:(3/119): ceph-immutable-object-cache-18.2.1-0.e 153 kB/s | 142 kB 00:00 2026-03-10T07:44:34.064 INFO:teuthology.orchestra.run.vm08.stdout:(3/119): ceph-immutable-object-cache-18.2.1-0.e 478 kB/s | 142 kB 00:00 2026-03-10T07:44:34.368 INFO:teuthology.orchestra.run.vm05.stdout:(4/119): ceph-mds-18.2.1-0.el9.x86_64.rpm 6.9 MB/s | 2.1 MB 00:00 2026-03-10T07:44:34.575 INFO:teuthology.orchestra.run.vm05.stdout:(5/119): ceph-mgr-18.2.1-0.el9.x86_64.rpm 7.0 MB/s | 1.4 MB 00:00 2026-03-10T07:44:34.757 INFO:teuthology.orchestra.run.vm05.stdout:(6/119): ceph-common-18.2.1-0.el9.x86_64.rpm 6.3 MB/s | 18 MB 00:02 2026-03-10T07:44:35.182 INFO:teuthology.orchestra.run.vm05.stdout:(7/119): ceph-mon-18.2.1-0.el9.x86_64.rpm 7.3 MB/s | 4.4 MB 00:00 2026-03-10T07:44:36.100 INFO:teuthology.orchestra.run.vm05.stdout:(8/119): ceph-radosgw-18.2.1-0.el9.x86_64.rpm 8.4 MB/s | 7.7 MB 00:00 2026-03-10T07:44:36.140 INFO:teuthology.orchestra.run.vm05.stdout:(9/119): ceph-fuse-18.2.1-0.el9.x86_64.rpm 209 kB/s | 839 kB 00:04 2026-03-10T07:44:36.154 INFO:teuthology.orchestra.run.vm08.stdout:(4/119): ceph-mds-18.2.1-0.el9.x86_64.rpm 1.0 MB/s | 2.1 MB 00:02 2026-03-10T07:44:36.199 INFO:teuthology.orchestra.run.vm05.stdout:(10/119): ceph-selinux-18.2.1-0.el9.x86_64.rpm 242 kB/s | 24 kB 00:00 2026-03-10T07:44:36.299 INFO:teuthology.orchestra.run.vm05.stdout:(11/119): 
libcephfs-devel-18.2.1-0.el9.x86_64.r 312 kB/s | 31 kB 00:00 2026-03-10T07:44:36.405 INFO:teuthology.orchestra.run.vm05.stdout:(12/119): libcephfs2-18.2.1-0.el9.x86_64.rpm 6.1 MB/s | 658 kB 00:00 2026-03-10T07:44:36.505 INFO:teuthology.orchestra.run.vm05.stdout:(13/119): libcephsqlite-18.2.1-0.el9.x86_64.rpm 1.6 MB/s | 165 kB 00:00 2026-03-10T07:44:36.606 INFO:teuthology.orchestra.run.vm05.stdout:(14/119): librados-devel-18.2.1-0.el9.x86_64.rp 1.2 MB/s | 127 kB 00:00 2026-03-10T07:44:36.710 INFO:teuthology.orchestra.run.vm05.stdout:(15/119): libradosstriper1-18.2.1-0.el9.x86_64. 4.5 MB/s | 474 kB 00:00 2026-03-10T07:44:36.779 INFO:teuthology.orchestra.run.vm05.stdout:(16/119): ceph-osd-18.2.1-0.el9.x86_64.rpm 8.7 MB/s | 18 MB 00:02 2026-03-10T07:44:36.834 INFO:teuthology.orchestra.run.vm08.stdout:(5/119): ceph-base-18.2.1-0.el9.x86_64.rpm 851 kB/s | 5.2 MB 00:06 2026-03-10T07:44:36.879 INFO:teuthology.orchestra.run.vm05.stdout:(17/119): python3-ceph-argparse-18.2.1-0.el9.x8 449 kB/s | 45 kB 00:00 2026-03-10T07:44:36.950 INFO:teuthology.orchestra.run.vm08.stdout:(6/119): ceph-mgr-18.2.1-0.el9.x86_64.rpm 1.8 MB/s | 1.4 MB 00:00 2026-03-10T07:44:36.979 INFO:teuthology.orchestra.run.vm05.stdout:(18/119): python3-ceph-common-18.2.1-0.el9.x86_ 1.2 MB/s | 124 kB 00:00 2026-03-10T07:44:37.078 INFO:teuthology.orchestra.run.vm05.stdout:(19/119): python3-cephfs-18.2.1-0.el9.x86_64.rp 1.6 MB/s | 161 kB 00:00 2026-03-10T07:44:37.181 INFO:teuthology.orchestra.run.vm05.stdout:(20/119): python3-rados-18.2.1-0.el9.x86_64.rpm 3.1 MB/s | 321 kB 00:00 2026-03-10T07:44:37.222 INFO:teuthology.orchestra.run.vm05.stdout:(21/119): librgw2-18.2.1-0.el9.x86_64.rpm 8.7 MB/s | 4.5 MB 00:00 2026-03-10T07:44:37.283 INFO:teuthology.orchestra.run.vm05.stdout:(22/119): python3-rbd-18.2.1-0.el9.x86_64.rpm 2.9 MB/s | 297 kB 00:00 2026-03-10T07:44:37.322 INFO:teuthology.orchestra.run.vm05.stdout:(23/119): python3-rgw-18.2.1-0.el9.x86_64.rpm 992 kB/s | 99 kB 00:00 2026-03-10T07:44:37.383 
INFO:teuthology.orchestra.run.vm05.stdout:(24/119): rbd-fuse-18.2.1-0.el9.x86_64.rpm 865 kB/s | 86 kB 00:00 2026-03-10T07:44:37.483 INFO:teuthology.orchestra.run.vm05.stdout:(25/119): rbd-nbd-18.2.1-0.el9.x86_64.rpm 1.7 MB/s | 171 kB 00:00 2026-03-10T07:44:37.582 INFO:teuthology.orchestra.run.vm05.stdout:(26/119): ceph-grafana-dashboards-18.2.1-0.el9. 234 kB/s | 23 kB 00:00 2026-03-10T07:44:37.683 INFO:teuthology.orchestra.run.vm05.stdout:(27/119): ceph-mgr-cephadm-18.2.1-0.el9.noarch. 1.3 MB/s | 132 kB 00:00 2026-03-10T07:44:37.731 INFO:teuthology.orchestra.run.vm05.stdout:(28/119): rbd-mirror-18.2.1-0.el9.x86_64.rpm 7.3 MB/s | 3.0 MB 00:00 2026-03-10T07:44:37.894 INFO:teuthology.orchestra.run.vm05.stdout:(29/119): ceph-mgr-dashboard-18.2.1-0.el9.noarc 8.3 MB/s | 1.8 MB 00:00 2026-03-10T07:44:37.996 INFO:teuthology.orchestra.run.vm05.stdout:(30/119): ceph-mgr-modules-core-18.2.1-0.el9.no 2.3 MB/s | 242 kB 00:00 2026-03-10T07:44:38.096 INFO:teuthology.orchestra.run.vm05.stdout:(31/119): ceph-mgr-rook-18.2.1-0.el9.noarch.rpm 504 kB/s | 50 kB 00:00 2026-03-10T07:44:38.134 INFO:teuthology.orchestra.run.vm08.stdout:(7/119): ceph-mon-18.2.1-0.el9.x86_64.rpm 3.4 MB/s | 4.4 MB 00:01 2026-03-10T07:44:38.195 INFO:teuthology.orchestra.run.vm05.stdout:(32/119): ceph-prometheus-alerts-18.2.1-0.el9.n 147 kB/s | 15 kB 00:00 2026-03-10T07:44:38.296 INFO:teuthology.orchestra.run.vm05.stdout:(33/119): cephadm-18.2.1-0.el9.noarch.rpm 2.1 MB/s | 221 kB 00:00 2026-03-10T07:44:38.451 INFO:teuthology.orchestra.run.vm05.stdout:(34/119): ledmon-libs-1.1.0-3.el9.x86_64.rpm 262 kB/s | 40 kB 00:00 2026-03-10T07:44:38.548 INFO:teuthology.orchestra.run.vm05.stdout:(35/119): libconfig-1.7.2-9.el9.x86_64.rpm 743 kB/s | 72 kB 00:00 2026-03-10T07:44:38.707 INFO:teuthology.orchestra.run.vm05.stdout:(36/119): libgfortran-11.5.0-14.el9.x86_64.rpm 4.9 MB/s | 794 kB 00:00 2026-03-10T07:44:38.767 INFO:teuthology.orchestra.run.vm05.stdout:(37/119): libquadmath-11.5.0-14.el9.x86_64.rpm 3.0 MB/s | 184 kB 
00:00 2026-03-10T07:44:38.836 INFO:teuthology.orchestra.run.vm05.stdout:(38/119): ceph-mgr-diskprediction-local-18.2.1- 6.7 MB/s | 7.4 MB 00:01 2026-03-10T07:44:38.837 INFO:teuthology.orchestra.run.vm05.stdout:(39/119): mailcap-2.1.49-5.el9.noarch.rpm 476 kB/s | 33 kB 00:00 2026-03-10T07:44:38.979 INFO:teuthology.orchestra.run.vm05.stdout:(40/119): python3-cryptography-36.0.1-5.el9.x86 8.8 MB/s | 1.2 MB 00:00 2026-03-10T07:44:39.027 INFO:teuthology.orchestra.run.vm05.stdout:(41/119): python3-ply-3.11-14.el9.noarch.rpm 2.2 MB/s | 106 kB 00:00 2026-03-10T07:44:39.075 INFO:teuthology.orchestra.run.vm05.stdout:(42/119): python3-pycparser-2.20-6.el9.noarch.r 2.8 MB/s | 135 kB 00:00 2026-03-10T07:44:39.108 INFO:teuthology.orchestra.run.vm05.stdout:(43/119): python3-cffi-1.14.5-5.el9.x86_64.rpm 934 kB/s | 253 kB 00:00 2026-03-10T07:44:39.140 INFO:teuthology.orchestra.run.vm05.stdout:(44/119): python3-requests-2.25.1-10.el9.noarch 1.9 MB/s | 126 kB 00:00 2026-03-10T07:44:39.359 INFO:teuthology.orchestra.run.vm08.stdout:(8/119): ceph-radosgw-18.2.1-0.el9.x86_64.rpm 6.3 MB/s | 7.7 MB 00:01 2026-03-10T07:44:39.374 INFO:teuthology.orchestra.run.vm05.stdout:(45/119): boost-program-options-1.75.0-13.el9.x 444 kB/s | 104 kB 00:00 2026-03-10T07:44:39.418 INFO:teuthology.orchestra.run.vm05.stdout:(46/119): python3-urllib3-1.26.5-7.el9.noarch.r 703 kB/s | 218 kB 00:00 2026-03-10T07:44:39.418 INFO:teuthology.orchestra.run.vm08.stdout:(9/119): ceph-common-18.2.1-0.el9.x86_64.rpm 2.1 MB/s | 18 MB 00:08 2026-03-10T07:44:39.425 INFO:teuthology.orchestra.run.vm05.stdout:(47/119): flexiblas-3.0.4-9.el9.x86_64.rpm 592 kB/s | 30 kB 00:00 2026-03-10T07:44:39.462 INFO:teuthology.orchestra.run.vm08.stdout:(10/119): ceph-selinux-18.2.1-0.el9.x86_64.rpm 232 kB/s | 24 kB 00:00 2026-03-10T07:44:39.466 INFO:teuthology.orchestra.run.vm05.stdout:(48/119): flexiblas-openblas-openmp-3.0.4-9.el9 361 kB/s | 15 kB 00:00 2026-03-10T07:44:39.548 INFO:teuthology.orchestra.run.vm05.stdout:(49/119): 
libpmemobj-1.12.1-1.el9.x86_64.rpm 1.9 MB/s | 160 kB 00:00 2026-03-10T07:44:39.572 INFO:teuthology.orchestra.run.vm08.stdout:(11/119): libcephfs-devel-18.2.1-0.el9.x86_64.r 283 kB/s | 31 kB 00:00 2026-03-10T07:44:39.589 INFO:teuthology.orchestra.run.vm05.stdout:(50/119): librabbitmq-0.11.0-7.el9.x86_64.rpm 1.1 MB/s | 45 kB 00:00 2026-03-10T07:44:39.680 INFO:teuthology.orchestra.run.vm05.stdout:(51/119): librdkafka-1.6.1-102.el9.x86_64.rpm 7.2 MB/s | 662 kB 00:00 2026-03-10T07:44:39.727 INFO:teuthology.orchestra.run.vm05.stdout:(52/119): libstoragemgmt-1.10.1-1.el9.x86_64.rp 5.1 MB/s | 246 kB 00:00 2026-03-10T07:44:39.757 INFO:teuthology.orchestra.run.vm05.stdout:(53/119): flexiblas-netlib-3.0.4-9.el9.x86_64.r 8.8 MB/s | 3.0 MB 00:00 2026-03-10T07:44:39.773 INFO:teuthology.orchestra.run.vm05.stdout:(54/119): libxslt-1.1.34-12.el9.x86_64.rpm 5.0 MB/s | 233 kB 00:00 2026-03-10T07:44:39.799 INFO:teuthology.orchestra.run.vm05.stdout:(55/119): lttng-ust-2.12.0-6.el9.x86_64.rpm 6.8 MB/s | 292 kB 00:00 2026-03-10T07:44:39.814 INFO:teuthology.orchestra.run.vm05.stdout:(56/119): openblas-0.3.29-1.el9.x86_64.rpm 1.0 MB/s | 42 kB 00:00 2026-03-10T07:44:40.059 INFO:teuthology.orchestra.run.vm05.stdout:(57/119): openblas-openmp-0.3.29-1.el9.x86_64.r 20 MB/s | 5.3 MB 00:00 2026-03-10T07:44:40.101 INFO:teuthology.orchestra.run.vm05.stdout:(58/119): python3-devel-3.9.25-3.el9.x86_64.rpm 5.8 MB/s | 244 kB 00:00 2026-03-10T07:44:40.120 INFO:teuthology.orchestra.run.vm05.stdout:(59/119): python3-babel-2.9.1-2.el9.noarch.rpm 20 MB/s | 6.0 MB 00:00 2026-03-10T07:44:40.141 INFO:teuthology.orchestra.run.vm05.stdout:(60/119): python3-jinja2-2.11.3-8.el9.noarch.rp 6.0 MB/s | 249 kB 00:00 2026-03-10T07:44:40.162 INFO:teuthology.orchestra.run.vm05.stdout:(61/119): python3-jmespath-1.0.1-1.el9.noarch.r 1.1 MB/s | 48 kB 00:00 2026-03-10T07:44:40.181 INFO:teuthology.orchestra.run.vm05.stdout:(62/119): python3-libstoragemgmt-1.10.1-1.el9.x 4.4 MB/s | 177 kB 00:00 2026-03-10T07:44:40.206 
INFO:teuthology.orchestra.run.vm05.stdout:(63/119): python3-mako-1.1.4-6.el9.noarch.rpm 3.8 MB/s | 172 kB 00:00 2026-03-10T07:44:40.219 INFO:teuthology.orchestra.run.vm05.stdout:(64/119): python3-markupsafe-1.1.1-12.el9.x86_6 919 kB/s | 35 kB 00:00 2026-03-10T07:44:40.262 INFO:teuthology.orchestra.run.vm05.stdout:(65/119): python3-numpy-f2py-1.23.5-2.el9.x86_6 10 MB/s | 442 kB 00:00 2026-03-10T07:44:40.318 INFO:teuthology.orchestra.run.vm08.stdout:(12/119): ceph-osd-18.2.1-0.el9.x86_64.rpm 5.2 MB/s | 18 MB 00:03 2026-03-10T07:44:40.318 INFO:teuthology.orchestra.run.vm05.stdout:(66/119): python3-pyasn1-0.4.8-7.el9.noarch.rpm 2.7 MB/s | 157 kB 00:00 2026-03-10T07:44:40.360 INFO:teuthology.orchestra.run.vm05.stdout:(67/119): python3-pyasn1-modules-0.4.8-7.el9.no 6.6 MB/s | 277 kB 00:00 2026-03-10T07:44:40.398 INFO:teuthology.orchestra.run.vm05.stdout:(68/119): python3-requests-oauthlib-1.3.0-12.el 1.4 MB/s | 54 kB 00:00 2026-03-10T07:44:40.426 INFO:teuthology.orchestra.run.vm08.stdout:(13/119): libcephsqlite-18.2.1-0.el9.x86_64.rpm 1.5 MB/s | 165 kB 00:00 2026-03-10T07:44:40.527 INFO:teuthology.orchestra.run.vm08.stdout:(14/119): librados-devel-18.2.1-0.el9.x86_64.rp 1.2 MB/s | 127 kB 00:00 2026-03-10T07:44:40.540 INFO:teuthology.orchestra.run.vm05.stdout:(69/119): python3-numpy-1.23.5-2.el9.x86_64.rpm 18 MB/s | 6.1 MB 00:00 2026-03-10T07:44:40.581 INFO:teuthology.orchestra.run.vm05.stdout:(70/119): python3-toml-0.10.2-6.el9.noarch.rpm 1.0 MB/s | 42 kB 00:00 2026-03-10T07:44:40.630 INFO:teuthology.orchestra.run.vm05.stdout:(71/119): socat-1.7.4.1-8.el9.x86_64.rpm 6.2 MB/s | 303 kB 00:00 2026-03-10T07:44:40.671 INFO:teuthology.orchestra.run.vm05.stdout:(72/119): xmlstarlet-1.6.1-20.el9.x86_64.rpm 1.5 MB/s | 64 kB 00:00 2026-03-10T07:44:40.806 INFO:teuthology.orchestra.run.vm08.stdout:(15/119): libcephfs2-18.2.1-0.el9.x86_64.rpm 533 kB/s | 658 kB 00:01 2026-03-10T07:44:40.954 INFO:teuthology.orchestra.run.vm05.stdout:(73/119): fmt-8.1.1-5.el9.x86_64.rpm 391 kB/s | 111 
kB 00:00 2026-03-10T07:44:41.093 INFO:teuthology.orchestra.run.vm05.stdout:(74/119): gperftools-libs-2.9.1-3.el9.x86_64.rp 2.2 MB/s | 308 kB 00:00 2026-03-10T07:44:41.170 INFO:teuthology.orchestra.run.vm05.stdout:(75/119): python3-scipy-1.9.3-2.el9.x86_64.rpm 25 MB/s | 19 MB 00:00 2026-03-10T07:44:41.351 INFO:teuthology.orchestra.run.vm05.stdout:(76/119): libarrow-9.0.0-15.el9.x86_64.rpm 17 MB/s | 4.4 MB 00:00 2026-03-10T07:44:41.368 INFO:teuthology.orchestra.run.vm05.stdout:(77/119): libarrow-doc-9.0.0-15.el9.noarch.rpm 125 kB/s | 25 kB 00:00 2026-03-10T07:44:41.398 INFO:teuthology.orchestra.run.vm05.stdout:(78/119): liboath-2.6.12-1.el9.x86_64.rpm 1.0 MB/s | 49 kB 00:00 2026-03-10T07:44:41.454 INFO:teuthology.orchestra.run.vm05.stdout:(79/119): parquet-libs-9.0.0-15.el9.x86_64.rpm 15 MB/s | 838 kB 00:00 2026-03-10T07:44:41.464 INFO:teuthology.orchestra.run.vm08.stdout:(16/119): libradosstriper1-18.2.1-0.el9.x86_64. 506 kB/s | 474 kB 00:00 2026-03-10T07:44:41.480 INFO:teuthology.orchestra.run.vm05.stdout:(80/119): libunwind-1.6.2-1.el9.x86_64.rpm 602 kB/s | 67 kB 00:00 2026-03-10T07:44:41.510 INFO:teuthology.orchestra.run.vm05.stdout:(81/119): python3-asyncssh-2.13.2-5.el9.noarch. 
9.6 MB/s | 548 kB 00:00 2026-03-10T07:44:41.523 INFO:teuthology.orchestra.run.vm08.stdout:(17/119): librgw2-18.2.1-0.el9.x86_64.rpm 6.2 MB/s | 4.5 MB 00:00 2026-03-10T07:44:41.537 INFO:teuthology.orchestra.run.vm05.stdout:(82/119): python3-autocommand-2.2.2-8.el9.noarc 523 kB/s | 29 kB 00:00 2026-03-10T07:44:41.559 INFO:teuthology.orchestra.run.vm05.stdout:(83/119): python3-backports-tarfile-1.2.0-1.el9 1.2 MB/s | 60 kB 00:00 2026-03-10T07:44:41.564 INFO:teuthology.orchestra.run.vm08.stdout:(18/119): python3-ceph-argparse-18.2.1-0.el9.x8 449 kB/s | 45 kB 00:00 2026-03-10T07:44:41.599 INFO:teuthology.orchestra.run.vm05.stdout:(84/119): python3-bcrypt-3.2.2-1.el9.x86_64.rpm 702 kB/s | 43 kB 00:00 2026-03-10T07:44:41.606 INFO:teuthology.orchestra.run.vm05.stdout:(85/119): python3-cachetools-4.2.4-1.el9.noarch 691 kB/s | 32 kB 00:00 2026-03-10T07:44:41.630 INFO:teuthology.orchestra.run.vm08.stdout:(19/119): python3-ceph-common-18.2.1-0.el9.x86_ 1.1 MB/s | 124 kB 00:00 2026-03-10T07:44:41.650 INFO:teuthology.orchestra.run.vm05.stdout:(86/119): python3-certifi-2023.05.07-4.el9.noar 277 kB/s | 14 kB 00:00 2026-03-10T07:44:41.655 INFO:teuthology.orchestra.run.vm05.stdout:(87/119): python3-cheroot-10.0.1-4.el9.noarch.r 3.5 MB/s | 173 kB 00:00 2026-03-10T07:44:41.666 INFO:teuthology.orchestra.run.vm08.stdout:(20/119): python3-cephfs-18.2.1-0.el9.x86_64.rp 1.6 MB/s | 161 kB 00:00 2026-03-10T07:44:41.704 INFO:teuthology.orchestra.run.vm05.stdout:(88/119): python3-google-auth-2.45.0-1.el9.noar 5.0 MB/s | 254 kB 00:00 2026-03-10T07:44:41.733 INFO:teuthology.orchestra.run.vm08.stdout:(21/119): python3-rados-18.2.1-0.el9.x86_64.rpm 3.1 MB/s | 321 kB 00:00 2026-03-10T07:44:41.751 INFO:teuthology.orchestra.run.vm05.stdout:(89/119): python3-jaraco-8.2.1-3.el9.noarch.rpm 230 kB/s | 11 kB 00:00 2026-03-10T07:44:41.768 INFO:teuthology.orchestra.run.vm08.stdout:(22/119): python3-rbd-18.2.1-0.el9.x86_64.rpm 2.8 MB/s | 297 kB 00:00 2026-03-10T07:44:41.798 
INFO:teuthology.orchestra.run.vm05.stdout:(90/119): python3-jaraco-classes-3.2.1-5.el9.no 379 kB/s | 18 kB 00:00 2026-03-10T07:44:41.799 INFO:teuthology.orchestra.run.vm05.stdout:(91/119): python3-cherrypy-18.6.1-2.el9.noarch. 2.3 MB/s | 358 kB 00:00 2026-03-10T07:44:41.835 INFO:teuthology.orchestra.run.vm08.stdout:(23/119): python3-rgw-18.2.1-0.el9.x86_64.rpm 977 kB/s | 99 kB 00:00 2026-03-10T07:44:41.846 INFO:teuthology.orchestra.run.vm05.stdout:(92/119): python3-jaraco-collections-3.0.0-8.el 489 kB/s | 23 kB 00:00 2026-03-10T07:44:41.846 INFO:teuthology.orchestra.run.vm05.stdout:(93/119): python3-jaraco-context-6.0.1-3.el9.no 420 kB/s | 20 kB 00:00 2026-03-10T07:44:41.869 INFO:teuthology.orchestra.run.vm08.stdout:(24/119): rbd-fuse-18.2.1-0.el9.x86_64.rpm 856 kB/s | 86 kB 00:00 2026-03-10T07:44:41.893 INFO:teuthology.orchestra.run.vm05.stdout:(94/119): python3-jaraco-functools-3.5.0-2.el9. 417 kB/s | 19 kB 00:00 2026-03-10T07:44:41.894 INFO:teuthology.orchestra.run.vm05.stdout:(95/119): python3-jaraco-text-4.0.0-2.el9.noarc 558 kB/s | 26 kB 00:00 2026-03-10T07:44:41.941 INFO:teuthology.orchestra.run.vm05.stdout:(96/119): python3-jwt+crypto-2.4.0-1.el9.noarch 191 kB/s | 9.0 kB 00:00 2026-03-10T07:44:41.942 INFO:teuthology.orchestra.run.vm05.stdout:(97/119): python3-jwt-2.4.0-1.el9.noarch.rpm 849 kB/s | 41 kB 00:00 2026-03-10T07:44:41.971 INFO:teuthology.orchestra.run.vm08.stdout:(25/119): rbd-nbd-18.2.1-0.el9.x86_64.rpm 1.6 MB/s | 171 kB 00:00 2026-03-10T07:44:42.000 INFO:teuthology.orchestra.run.vm05.stdout:(98/119): python3-kubernetes-26.1.0-3.el9.noarc 17 MB/s | 1.0 MB 00:00 2026-03-10T07:44:42.000 INFO:teuthology.orchestra.run.vm05.stdout:(99/119): python3-logutils-0.3.5-21.el9.noarch. 801 kB/s | 46 kB 00:00 2026-03-10T07:44:42.048 INFO:teuthology.orchestra.run.vm05.stdout:(100/119): python3-more-itertools-8.12.0-2.el9. 
1.6 MB/s | 79 kB 00:00 2026-03-10T07:44:42.052 INFO:teuthology.orchestra.run.vm05.stdout:(101/119): python3-natsort-7.1.1-5.el9.noarch.r 1.1 MB/s | 58 kB 00:00 2026-03-10T07:44:42.070 INFO:teuthology.orchestra.run.vm08.stdout:(26/119): ceph-grafana-dashboards-18.2.1-0.el9. 232 kB/s | 23 kB 00:00 2026-03-10T07:44:42.119 INFO:teuthology.orchestra.run.vm05.stdout:(102/119): python3-pecan-1.4.2-3.el9.noarch.rpm 3.8 MB/s | 272 kB 00:00 2026-03-10T07:44:42.120 INFO:teuthology.orchestra.run.vm05.stdout:(103/119): python3-portend-3.1.0-2.el9.noarch.r 242 kB/s | 16 kB 00:00 2026-03-10T07:44:42.172 INFO:teuthology.orchestra.run.vm08.stdout:(27/119): ceph-mgr-cephadm-18.2.1-0.el9.noarch. 1.3 MB/s | 132 kB 00:00 2026-03-10T07:44:42.175 INFO:teuthology.orchestra.run.vm05.stdout:(104/119): python3-pyOpenSSL-21.0.0-1.el9.noarc 1.6 MB/s | 90 kB 00:00 2026-03-10T07:44:42.177 INFO:teuthology.orchestra.run.vm05.stdout:(105/119): python3-repoze-lru-0.7-16.el9.noarch 542 kB/s | 31 kB 00:00 2026-03-10T07:44:42.224 INFO:teuthology.orchestra.run.vm05.stdout:(106/119): python3-routes-2.5.1-5.el9.noarch.rp 3.8 MB/s | 188 kB 00:00 2026-03-10T07:44:42.229 INFO:teuthology.orchestra.run.vm05.stdout:(107/119): python3-rsa-4.9-2.el9.noarch.rpm 1.1 MB/s | 59 kB 00:00 2026-03-10T07:44:42.271 INFO:teuthology.orchestra.run.vm05.stdout:(108/119): python3-tempora-5.0.0-2.el9.noarch.r 768 kB/s | 36 kB 00:00 2026-03-10T07:44:42.281 INFO:teuthology.orchestra.run.vm05.stdout:(109/119): python3-typing-extensions-4.15.0-1.e 1.6 MB/s | 86 kB 00:00 2026-03-10T07:44:42.322 INFO:teuthology.orchestra.run.vm05.stdout:(110/119): python3-webob-1.8.8-2.el9.noarch.rpm 4.5 MB/s | 230 kB 00:00 2026-03-10T07:44:42.334 INFO:teuthology.orchestra.run.vm05.stdout:(111/119): python3-websocket-client-1.2.3-2.el9 1.7 MB/s | 90 kB 00:00 2026-03-10T07:44:42.341 INFO:teuthology.orchestra.run.vm08.stdout:(28/119): rbd-mirror-18.2.1-0.el9.x86_64.rpm 5.9 MB/s | 3.0 MB 00:00 2026-03-10T07:44:42.374 
INFO:teuthology.orchestra.run.vm05.stdout:(112/119): python3-werkzeug-2.0.3-3.el9.1.noarc 8.0 MB/s | 427 kB 00:00
2026-03-10T07:44:42.381 INFO:teuthology.orchestra.run.vm05.stdout:(113/119): python3-xmltodict-0.12.0-15.el9.noar 479 kB/s | 22 kB 00:00
2026-03-10T07:44:42.422 INFO:teuthology.orchestra.run.vm05.stdout:(114/119): python3-zc-lockfile-2.0-10.el9.noarc 423 kB/s | 20 kB 00:00
2026-03-10T07:44:42.442 INFO:teuthology.orchestra.run.vm05.stdout:(115/119): re2-20211101-20.el9.x86_64.rpm 3.1 MB/s | 191 kB 00:00
2026-03-10T07:44:42.551 INFO:teuthology.orchestra.run.vm05.stdout:(116/119): thrift-0.15.0-4.el9.x86_64.rpm 12 MB/s | 1.6 MB 00:00
2026-03-10T07:44:42.581 INFO:teuthology.orchestra.run.vm08.stdout:(29/119): ceph-mgr-dashboard-18.2.1-0.el9.noarc 4.3 MB/s | 1.8 MB 00:00
2026-03-10T07:44:42.684 INFO:teuthology.orchestra.run.vm08.stdout:(30/119): ceph-mgr-modules-core-18.2.1-0.el9.no 2.3 MB/s | 242 kB 00:00
2026-03-10T07:44:42.785 INFO:teuthology.orchestra.run.vm08.stdout:(31/119): ceph-mgr-rook-18.2.1-0.el9.noarch.rpm 496 kB/s | 50 kB 00:00
2026-03-10T07:44:42.886 INFO:teuthology.orchestra.run.vm08.stdout:(32/119): ceph-prometheus-alerts-18.2.1-0.el9.n 146 kB/s | 15 kB 00:00
2026-03-10T07:44:42.989 INFO:teuthology.orchestra.run.vm08.stdout:(33/119): cephadm-18.2.1-0.el9.noarch.rpm 2.1 MB/s | 221 kB 00:00
2026-03-10T07:44:43.405 INFO:teuthology.orchestra.run.vm08.stdout:(34/119): ledmon-libs-1.1.0-3.el9.x86_64.rpm 97 kB/s | 40 kB 00:00
2026-03-10T07:44:43.437 INFO:teuthology.orchestra.run.vm05.stdout:(117/119): librados2-18.2.1-0.el9.x86_64.rpm 3.3 MB/s | 3.3 MB 00:00
2026-03-10T07:44:43.457 INFO:teuthology.orchestra.run.vm08.stdout:(35/119): ceph-mgr-diskprediction-local-18.2.1- 6.6 MB/s | 7.4 MB 00:01
2026-03-10T07:44:43.457 INFO:teuthology.orchestra.run.vm05.stdout:(118/119): librbd1-18.2.1-0.el9.x86_64.rpm 3.3 MB/s | 3.0 MB 00:00
2026-03-10T07:44:43.612 INFO:teuthology.orchestra.run.vm08.stdout:(36/119): libconfig-1.7.2-9.el9.x86_64.rpm 347 kB/s | 72 kB 00:00
2026-03-10T07:44:43.921 INFO:teuthology.orchestra.run.vm08.stdout:(37/119): libgfortran-11.5.0-14.el9.x86_64.rpm 1.7 MB/s | 794 kB 00:00
2026-03-10T07:44:43.948 INFO:teuthology.orchestra.run.vm08.stdout:(38/119): mailcap-2.1.49-5.el9.noarch.rpm 1.2 MB/s | 33 kB 00:00
2026-03-10T07:44:44.036 INFO:teuthology.orchestra.run.vm08.stdout:(39/119): libquadmath-11.5.0-14.el9.x86_64.rpm 435 kB/s | 184 kB 00:00
2026-03-10T07:44:44.462 INFO:teuthology.orchestra.run.vm08.stdout:(40/119): python3-cffi-1.14.5-5.el9.x86_64.rpm 493 kB/s | 253 kB 00:00
2026-03-10T07:44:44.583 INFO:teuthology.orchestra.run.vm08.stdout:(41/119): python3-cryptography-36.0.1-5.el9.x86 2.3 MB/s | 1.2 MB 00:00
2026-03-10T07:44:44.677 INFO:teuthology.orchestra.run.vm08.stdout:(42/119): python3-ply-3.11-14.el9.noarch.rpm 496 kB/s | 106 kB 00:00
2026-03-10T07:44:45.182 INFO:teuthology.orchestra.run.vm08.stdout:(43/119): ceph-test-18.2.1-0.el9.x86_64.rpm 6.9 MB/s | 40 MB 00:05
2026-03-10T07:44:45.183 INFO:teuthology.orchestra.run.vm08.stdout:(44/119): python3-pycparser-2.20-6.el9.noarch.r 225 kB/s | 135 kB 00:00
2026-03-10T07:44:45.315 INFO:teuthology.orchestra.run.vm08.stdout:(45/119): python3-requests-2.25.1-10.el9.noarch 198 kB/s | 126 kB 00:00
2026-03-10T07:44:45.374 INFO:teuthology.orchestra.run.vm08.stdout:(46/119): boost-program-options-1.75.0-13.el9.x 546 kB/s | 104 kB 00:00
2026-03-10T07:44:45.416 INFO:teuthology.orchestra.run.vm08.stdout:(47/119): flexiblas-3.0.4-9.el9.x86_64.rpm 294 kB/s | 30 kB 00:00
2026-03-10T07:44:45.478 INFO:teuthology.orchestra.run.vm08.stdout:(48/119): flexiblas-openblas-openmp-3.0.4-9.el9 248 kB/s | 15 kB 00:00
2026-03-10T07:44:45.617 INFO:teuthology.orchestra.run.vm08.stdout:(49/119): libpmemobj-1.12.1-1.el9.x86_64.rpm 1.1 MB/s | 160 kB 00:00
2026-03-10T07:44:45.662 INFO:teuthology.orchestra.run.vm08.stdout:(50/119): librabbitmq-0.11.0-7.el9.x86_64.rpm 1.0 MB/s | 45 kB 00:00
2026-03-10T07:44:45.797 INFO:teuthology.orchestra.run.vm08.stdout:(51/119): python3-urllib3-1.26.5-7.el9.noarch.r 354 kB/s | 218 kB 00:00
2026-03-10T07:44:45.877 INFO:teuthology.orchestra.run.vm08.stdout:(52/119): flexiblas-netlib-3.0.4-9.el9.x86_64.r 6.0 MB/s | 3.0 MB 00:00
2026-03-10T07:44:45.883 INFO:teuthology.orchestra.run.vm08.stdout:(53/119): librdkafka-1.6.1-102.el9.x86_64.rpm 2.9 MB/s | 662 kB 00:00
2026-03-10T07:44:45.946 INFO:teuthology.orchestra.run.vm08.stdout:(54/119): libxslt-1.1.34-12.el9.x86_64.rpm 3.3 MB/s | 233 kB 00:00
2026-03-10T07:44:45.990 INFO:teuthology.orchestra.run.vm08.stdout:(55/119): lttng-ust-2.12.0-6.el9.x86_64.rpm 2.7 MB/s | 292 kB 00:00
2026-03-10T07:44:45.992 INFO:teuthology.orchestra.run.vm08.stdout:(56/119): openblas-0.3.29-1.el9.x86_64.rpm 922 kB/s | 42 kB 00:00
2026-03-10T07:44:46.028 INFO:teuthology.orchestra.run.vm08.stdout:(57/119): libstoragemgmt-1.10.1-1.el9.x86_64.rp 1.0 MB/s | 246 kB 00:00
2026-03-10T07:44:46.141 INFO:teuthology.orchestra.run.vm08.stdout:(58/119): python3-devel-3.9.25-3.el9.x86_64.rpm 2.1 MB/s | 244 kB 00:00
2026-03-10T07:44:46.241 INFO:teuthology.orchestra.run.vm08.stdout:(59/119): python3-jinja2-2.11.3-8.el9.noarch.rp 2.4 MB/s | 249 kB 00:00
2026-03-10T07:44:46.288 INFO:teuthology.orchestra.run.vm08.stdout:(60/119): python3-jmespath-1.0.1-1.el9.noarch.r 1.0 MB/s | 48 kB 00:00
2026-03-10T07:44:46.326 INFO:teuthology.orchestra.run.vm08.stdout:(61/119): python3-babel-2.9.1-2.el9.noarch.rpm 18 MB/s | 6.0 MB 00:00
2026-03-10T07:44:46.365 INFO:teuthology.orchestra.run.vm08.stdout:(62/119): python3-libstoragemgmt-1.10.1-1.el9.x 2.3 MB/s | 177 kB 00:00
2026-03-10T07:44:46.403 INFO:teuthology.orchestra.run.vm08.stdout:(63/119): openblas-openmp-0.3.29-1.el9.x86_64.r 13 MB/s | 5.3 MB 00:00
2026-03-10T07:44:46.404 INFO:teuthology.orchestra.run.vm08.stdout:(64/119): python3-mako-1.1.4-6.el9.noarch.rpm 2.2 MB/s | 172 kB 00:00
2026-03-10T07:44:46.423 INFO:teuthology.orchestra.run.vm08.stdout:(65/119): python3-markupsafe-1.1.1-12.el9.x86_6 600 kB/s | 35 kB 00:00
2026-03-10T07:44:46.496 INFO:teuthology.orchestra.run.vm08.stdout:(66/119): python3-pyasn1-0.4.8-7.el9.noarch.rpm 2.1 MB/s | 157 kB 00:00
2026-03-10T07:44:46.535 INFO:teuthology.orchestra.run.vm08.stdout:(67/119): python3-numpy-f2py-1.23.5-2.el9.x86_6 3.3 MB/s | 442 kB 00:00
2026-03-10T07:44:46.560 INFO:teuthology.orchestra.run.vm08.stdout:(68/119): python3-pyasn1-modules-0.4.8-7.el9.no 4.3 MB/s | 277 kB 00:00
2026-03-10T07:44:46.586 INFO:teuthology.orchestra.run.vm08.stdout:(69/119): python3-requests-oauthlib-1.3.0-12.el 1.0 MB/s | 54 kB 00:00
2026-03-10T07:44:46.651 INFO:teuthology.orchestra.run.vm08.stdout:(70/119): python3-toml-0.10.2-6.el9.noarch.rpm 651 kB/s | 42 kB 00:00
2026-03-10T07:44:46.707 INFO:teuthology.orchestra.run.vm08.stdout:(71/119): python3-numpy-1.23.5-2.el9.x86_64.rpm 20 MB/s | 6.1 MB 00:00
2026-03-10T07:44:46.713 INFO:teuthology.orchestra.run.vm08.stdout:(72/119): socat-1.7.4.1-8.el9.x86_64.rpm 4.8 MB/s | 303 kB 00:00
2026-03-10T07:44:46.756 INFO:teuthology.orchestra.run.vm08.stdout:(73/119): xmlstarlet-1.6.1-20.el9.x86_64.rpm 1.3 MB/s | 64 kB 00:00
2026-03-10T07:44:46.966 INFO:teuthology.orchestra.run.vm08.stdout:(74/119): fmt-8.1.1-5.el9.x86_64.rpm 437 kB/s | 111 kB 00:00
2026-03-10T07:44:47.195 INFO:teuthology.orchestra.run.vm08.stdout:(75/119): gperftools-libs-2.9.1-3.el9.x86_64.rp 701 kB/s | 308 kB 00:00
2026-03-10T07:44:47.420 INFO:teuthology.orchestra.run.vm08.stdout:(76/119): python3-scipy-1.9.3-2.el9.x86_64.rpm 22 MB/s | 19 MB 00:00
2026-03-10T07:44:47.420 INFO:teuthology.orchestra.run.vm08.stdout:(77/119): libarrow-doc-9.0.0-15.el9.noarch.rpm 110 kB/s | 25 kB 00:00
2026-03-10T07:44:47.575 INFO:teuthology.orchestra.run.vm08.stdout:(78/119): liboath-2.6.12-1.el9.x86_64.rpm 315 kB/s | 49 kB 00:00
2026-03-10T07:44:47.645 INFO:teuthology.orchestra.run.vm08.stdout:(79/119): libunwind-1.6.2-1.el9.x86_64.rpm 300 kB/s | 67 kB 00:00
2026-03-10T07:44:48.072 INFO:teuthology.orchestra.run.vm05.stdout:(119/119): ceph-test-18.2.1-0.el9.x86_64.rpm 3.3 MB/s | 40 MB 00:11
2026-03-10T07:44:48.076 INFO:teuthology.orchestra.run.vm05.stdout:--------------------------------------------------------------------------------
2026-03-10T07:44:48.076 INFO:teuthology.orchestra.run.vm05.stdout:Total 11 MB/s | 182 MB 00:16
2026-03-10T07:44:48.343 INFO:teuthology.orchestra.run.vm08.stdout:(80/119): python3-asyncssh-2.13.2-5.el9.noarch. 787 kB/s | 548 kB 00:00
2026-03-10T07:44:48.438 INFO:teuthology.orchestra.run.vm08.stdout:(81/119): parquet-libs-9.0.0-15.el9.x86_64.rpm 971 kB/s | 838 kB 00:00
2026-03-10T07:44:48.443 INFO:teuthology.orchestra.run.vm08.stdout:(82/119): python3-autocommand-2.2.2-8.el9.noarc 293 kB/s | 29 kB 00:00
2026-03-10T07:44:48.497 INFO:teuthology.orchestra.run.vm05.stdout:Running transaction check
2026-03-10T07:44:48.537 INFO:teuthology.orchestra.run.vm08.stdout:(83/119): python3-backports-tarfile-1.2.0-1.el9 611 kB/s | 60 kB 00:00
2026-03-10T07:44:48.540 INFO:teuthology.orchestra.run.vm05.stdout:Transaction check succeeded.
2026-03-10T07:44:48.540 INFO:teuthology.orchestra.run.vm05.stdout:Running transaction test
2026-03-10T07:44:48.566 INFO:teuthology.orchestra.run.vm08.stdout:(84/119): python3-bcrypt-3.2.2-1.el9.x86_64.rpm 354 kB/s | 43 kB 00:00
2026-03-10T07:44:48.629 INFO:teuthology.orchestra.run.vm08.stdout:(85/119): python3-cachetools-4.2.4-1.el9.noarch 350 kB/s | 32 kB 00:00
2026-03-10T07:44:48.652 INFO:teuthology.orchestra.run.vm08.stdout:(86/119): python3-certifi-2023.05.07-4.el9.noar 166 kB/s | 14 kB 00:00
2026-03-10T07:44:48.806 INFO:teuthology.orchestra.run.vm08.stdout:(87/119): python3-cheroot-10.0.1-4.el9.noarch.r 981 kB/s | 173 kB 00:00
2026-03-10T07:44:48.929 INFO:teuthology.orchestra.run.vm08.stdout:(88/119): python3-cherrypy-18.6.1-2.el9.noarch. 1.3 MB/s | 358 kB 00:00
2026-03-10T07:44:48.992 INFO:teuthology.orchestra.run.vm08.stdout:(89/119): python3-google-auth-2.45.0-1.el9.noar 1.3 MB/s | 254 kB 00:00
2026-03-10T07:44:48.997 INFO:teuthology.orchestra.run.vm08.stdout:(90/119): python3-jaraco-8.2.1-3.el9.noarch.rpm 156 kB/s | 11 kB 00:00
2026-03-10T07:44:49.038 INFO:teuthology.orchestra.run.vm08.stdout:(91/119): python3-jaraco-classes-3.2.1-5.el9.no 389 kB/s | 18 kB 00:00
2026-03-10T07:44:49.038 INFO:teuthology.orchestra.run.vm08.stdout:(92/119): python3-jaraco-collections-3.0.0-8.el 568 kB/s | 23 kB 00:00
2026-03-10T07:44:49.119 INFO:teuthology.orchestra.run.vm08.stdout:(93/119): python3-jaraco-context-6.0.1-3.el9.no 244 kB/s | 20 kB 00:00
2026-03-10T07:44:49.119 INFO:teuthology.orchestra.run.vm08.stdout:(94/119): python3-jaraco-functools-3.5.0-2.el9. 241 kB/s | 19 kB 00:00
2026-03-10T07:44:49.174 INFO:teuthology.orchestra.run.vm08.stdout:(95/119): python3-jaraco-text-4.0.0-2.el9.noarc 478 kB/s | 26 kB 00:00
2026-03-10T07:44:49.175 INFO:teuthology.orchestra.run.vm08.stdout:(96/119): python3-jwt+crypto-2.4.0-1.el9.noarch 163 kB/s | 9.0 kB 00:00
2026-03-10T07:44:49.257 INFO:teuthology.orchestra.run.vm08.stdout:(97/119): python3-jwt-2.4.0-1.el9.noarch.rpm 496 kB/s | 41 kB 00:00
2026-03-10T07:44:49.261 INFO:teuthology.orchestra.run.vm05.stdout:Transaction test succeeded.
2026-03-10T07:44:49.261 INFO:teuthology.orchestra.run.vm05.stdout:Running transaction
2026-03-10T07:44:49.346 INFO:teuthology.orchestra.run.vm08.stdout:(98/119): libarrow-9.0.0-15.el9.x86_64.rpm 1.9 MB/s | 4.4 MB 00:02
2026-03-10T07:44:49.347 INFO:teuthology.orchestra.run.vm08.stdout:(99/119): python3-logutils-0.3.5-21.el9.noarch. 513 kB/s | 46 kB 00:00
2026-03-10T07:44:49.441 INFO:teuthology.orchestra.run.vm08.stdout:(100/119): python3-natsort-7.1.1-5.el9.noarch.r 615 kB/s | 58 kB 00:00
2026-03-10T07:44:49.457 INFO:teuthology.orchestra.run.vm08.stdout:(101/119): python3-more-itertools-8.12.0-2.el9. 715 kB/s | 79 kB 00:00
2026-03-10T07:44:49.473 INFO:teuthology.orchestra.run.vm08.stdout:(102/119): python3-kubernetes-26.1.0-3.el9.noar 3.4 MB/s | 1.0 MB 00:00
2026-03-10T07:44:49.527 INFO:teuthology.orchestra.run.vm08.stdout:(103/119): python3-pecan-1.4.2-3.el9.noarch.rpm 3.1 MB/s | 272 kB 00:00
2026-03-10T07:44:49.527 INFO:teuthology.orchestra.run.vm08.stdout:(104/119): python3-portend-3.1.0-2.el9.noarch.r 233 kB/s | 16 kB 00:00
2026-03-10T07:44:49.529 INFO:teuthology.orchestra.run.vm08.stdout:(105/119): python3-pyOpenSSL-21.0.0-1.el9.noarc 1.6 MB/s | 90 kB 00:00
2026-03-10T07:44:49.559 INFO:teuthology.orchestra.run.vm08.stdout:(106/119): python3-repoze-lru-0.7-16.el9.noarch 965 kB/s | 31 kB 00:00
2026-03-10T07:44:49.591 INFO:teuthology.orchestra.run.vm08.stdout:(107/119): python3-routes-2.5.1-5.el9.noarch.rp 2.9 MB/s | 188 kB 00:00
2026-03-10T07:44:49.591 INFO:teuthology.orchestra.run.vm08.stdout:(108/119): python3-rsa-4.9-2.el9.noarch.rpm 950 kB/s | 59 kB 00:00
2026-03-10T07:44:49.592 INFO:teuthology.orchestra.run.vm08.stdout:(109/119): python3-tempora-5.0.0-2.el9.noarch.r 1.1 MB/s | 36 kB 00:00
2026-03-10T07:44:49.624 INFO:teuthology.orchestra.run.vm08.stdout:(110/119): python3-typing-extensions-4.15.0-1.e 2.6 MB/s | 86 kB 00:00
2026-03-10T07:44:49.655 INFO:teuthology.orchestra.run.vm08.stdout:(111/119): python3-webob-1.8.8-2.el9.noarch.rpm 3.6 MB/s | 230 kB 00:00
2026-03-10T07:44:49.656 INFO:teuthology.orchestra.run.vm08.stdout:(112/119): python3-websocket-client-1.2.3-2.el9 1.4 MB/s | 90 kB 00:00
2026-03-10T07:44:49.716 INFO:teuthology.orchestra.run.vm08.stdout:(113/119): python3-xmltodict-0.12.0-15.el9.noar 365 kB/s | 22 kB 00:00
2026-03-10T07:44:49.717 INFO:teuthology.orchestra.run.vm08.stdout:(114/119): python3-zc-lockfile-2.0-10.el9.noarc 332 kB/s | 20 kB 00:00
2026-03-10T07:44:49.718 INFO:teuthology.orchestra.run.vm08.stdout:(115/119): python3-werkzeug-2.0.3-3.el9.1.noarc 4.4 MB/s | 427 kB 00:00
2026-03-10T07:44:49.750 INFO:teuthology.orchestra.run.vm08.stdout:(116/119): re2-20211101-20.el9.x86_64.rpm 5.6 MB/s | 191 kB 00:00
2026-03-10T07:44:49.972 INFO:teuthology.orchestra.run.vm08.stdout:(117/119): thrift-0.15.0-4.el9.x86_64.rpm 6.2 MB/s | 1.6 MB 00:00
2026-03-10T07:44:50.040 INFO:teuthology.orchestra.run.vm05.stdout:  Preparing        : 1/1
2026-03-10T07:44:50.048 INFO:teuthology.orchestra.run.vm05.stdout:  Installing       : thrift-0.15.0-4.el9.x86_64 1/121
2026-03-10T07:44:50.060 INFO:teuthology.orchestra.run.vm05.stdout:  Installing       : python3-more-itertools-8.12.0-2.el9.noarch 2/121
2026-03-10T07:44:50.226 INFO:teuthology.orchestra.run.vm05.stdout:  Installing       : lttng-ust-2.12.0-6.el9.x86_64 3/121
2026-03-10T07:44:50.228 INFO:teuthology.orchestra.run.vm05.stdout:  Upgrading        : librados2-2:18.2.1-0.el9.x86_64 4/121
2026-03-10T07:44:50.272 INFO:teuthology.orchestra.run.vm05.stdout:  Running scriptlet: librados2-2:18.2.1-0.el9.x86_64 4/121
2026-03-10T07:44:50.273 INFO:teuthology.orchestra.run.vm05.stdout:  Installing       : libcephfs2-2:18.2.1-0.el9.x86_64 5/121
2026-03-10T07:44:50.303 INFO:teuthology.orchestra.run.vm05.stdout:  Running scriptlet: libcephfs2-2:18.2.1-0.el9.x86_64 5/121
2026-03-10T07:44:50.312 INFO:teuthology.orchestra.run.vm05.stdout:  Installing       : python3-rados-2:18.2.1-0.el9.x86_64 6/121
2026-03-10T07:44:50.316 INFO:teuthology.orchestra.run.vm05.stdout:  Installing       : librdkafka-1.6.1-102.el9.x86_64 7/121
2026-03-10T07:44:50.319 INFO:teuthology.orchestra.run.vm05.stdout:  Installing       : librabbitmq-0.11.0-7.el9.x86_64 8/121
2026-03-10T07:44:50.328 INFO:teuthology.orchestra.run.vm05.stdout:  Installing       : python3-jaraco-8.2.1-3.el9.noarch 9/121
2026-03-10T07:44:50.329 INFO:teuthology.orchestra.run.vm05.stdout:  Installing       : libcephsqlite-2:18.2.1-0.el9.x86_64 10/121
2026-03-10T07:44:50.364 INFO:teuthology.orchestra.run.vm05.stdout:  Running scriptlet: libcephsqlite-2:18.2.1-0.el9.x86_64 10/121
2026-03-10T07:44:50.366 INFO:teuthology.orchestra.run.vm05.stdout:  Installing       : libradosstriper1-2:18.2.1-0.el9.x86_64 11/121
2026-03-10T07:44:50.413 INFO:teuthology.orchestra.run.vm05.stdout:  Running scriptlet: libradosstriper1-2:18.2.1-0.el9.x86_64 11/121
2026-03-10T07:44:50.419 INFO:teuthology.orchestra.run.vm05.stdout:  Installing       : python3-werkzeug-2.0.3-3.el9.1.noarch 12/121
2026-03-10T07:44:50.446 INFO:teuthology.orchestra.run.vm05.stdout:  Installing       : liboath-2.6.12-1.el9.x86_64 13/121
2026-03-10T07:44:50.455 INFO:teuthology.orchestra.run.vm05.stdout:  Installing       : python3-pyasn1-0.4.8-7.el9.noarch 14/121
2026-03-10T07:44:50.458 INFO:teuthology.orchestra.run.vm05.stdout:  Installing       : python3-markupsafe-1.1.1-12.el9.x86_64 15/121
2026-03-10T07:44:50.487 INFO:teuthology.orchestra.run.vm05.stdout:  Installing       : flexiblas-3.0.4-9.el9.x86_64 16/121
2026-03-10T07:44:50.504 INFO:teuthology.orchestra.run.vm05.stdout:  Installing       : python3-urllib3-1.26.5-7.el9.noarch 17/121
2026-03-10T07:44:50.508 INFO:teuthology.orchestra.run.vm05.stdout:  Installing       : python3-requests-2.25.1-10.el9.noarch 18/121
2026-03-10T07:44:50.517 INFO:teuthology.orchestra.run.vm05.stdout:  Installing       : libquadmath-11.5.0-14.el9.x86_64 19/121
2026-03-10T07:44:50.519 INFO:teuthology.orchestra.run.vm05.stdout:  Installing       : libgfortran-11.5.0-14.el9.x86_64 20/121
2026-03-10T07:44:50.524 INFO:teuthology.orchestra.run.vm05.stdout:  Installing       : ledmon-libs-1.1.0-3.el9.x86_64 21/121
2026-03-10T07:44:50.535 INFO:teuthology.orchestra.run.vm05.stdout:  Installing       : python3-ceph-argparse-2:18.2.1-0.el9.x86_64 22/121
2026-03-10T07:44:50.549 INFO:teuthology.orchestra.run.vm05.stdout:  Installing       : python3-cephfs-2:18.2.1-0.el9.x86_64 23/121
2026-03-10T07:44:50.578 INFO:teuthology.orchestra.run.vm05.stdout:  Installing       : python3-requests-oauthlib-1.3.0-12.el9.noarch 24/121
2026-03-10T07:44:50.639 INFO:teuthology.orchestra.run.vm05.stdout:  Installing       : python3-mako-1.1.4-6.el9.noarch 25/121
2026-03-10T07:44:50.655 INFO:teuthology.orchestra.run.vm05.stdout:  Installing       : python3-pyasn1-modules-0.4.8-7.el9.noarch 26/121
2026-03-10T07:44:50.655 INFO:teuthology.orchestra.run.vm08.stdout:(118/119): librbd1-18.2.1-0.el9.x86_64.rpm 3.3 MB/s | 3.0 MB 00:00
2026-03-10T07:44:50.663 INFO:teuthology.orchestra.run.vm05.stdout:  Installing       : python3-rsa-4.9-2.el9.noarch 27/121
2026-03-10T07:44:50.672 INFO:teuthology.orchestra.run.vm05.stdout:  Installing       : python3-jaraco-classes-3.2.1-5.el9.noarch 28/121
2026-03-10T07:44:50.676 INFO:teuthology.orchestra.run.vm05.stdout:  Installing       : librados-devel-2:18.2.1-0.el9.x86_64 29/121
2026-03-10T07:44:50.713 INFO:teuthology.orchestra.run.vm05.stdout:  Installing       : re2-1:20211101-20.el9.x86_64 30/121
2026-03-10T07:44:50.719 INFO:teuthology.orchestra.run.vm05.stdout:  Installing       : libarrow-9.0.0-15.el9.x86_64 31/121
2026-03-10T07:44:50.737 INFO:teuthology.orchestra.run.vm05.stdout:  Installing       : python3-zc-lockfile-2.0-10.el9.noarch 32/121
2026-03-10T07:44:50.762 INFO:teuthology.orchestra.run.vm05.stdout:  Installing       : python3-websocket-client-1.2.3-2.el9.noarch 33/121
2026-03-10T07:44:50.769 INFO:teuthology.orchestra.run.vm05.stdout:  Installing       : python3-webob-1.8.8-2.el9.noarch 34/121
2026-03-10T07:44:50.775 INFO:teuthology.orchestra.run.vm05.stdout:  Installing       : python3-typing-extensions-4.15.0-1.el9.noarch 35/121
2026-03-10T07:44:50.790 INFO:teuthology.orchestra.run.vm05.stdout:  Installing       : python3-repoze-lru-0.7-16.el9.noarch 36/121
2026-03-10T07:44:50.801 INFO:teuthology.orchestra.run.vm05.stdout:  Installing       : python3-routes-2.5.1-5.el9.noarch 37/121
2026-03-10T07:44:50.814 INFO:teuthology.orchestra.run.vm05.stdout:  Installing       : python3-natsort-7.1.1-5.el9.noarch 38/121
2026-03-10T07:44:50.821 INFO:teuthology.orchestra.run.vm08.stdout:(119/119): librados2-18.2.1-0.el9.x86_64.rpm 3.0 MB/s | 3.3 MB 00:01
2026-03-10T07:44:50.824 INFO:teuthology.orchestra.run.vm08.stdout:--------------------------------------------------------------------------------
2026-03-10T07:44:50.824 INFO:teuthology.orchestra.run.vm08.stdout:Total 8.7 MB/s | 182 MB 00:20
2026-03-10T07:44:50.881 INFO:teuthology.orchestra.run.vm05.stdout:  Installing       : python3-logutils-0.3.5-21.el9.noarch 39/121
2026-03-10T07:44:50.890 INFO:teuthology.orchestra.run.vm05.stdout:  Installing       : python3-pecan-1.4.2-3.el9.noarch 40/121
2026-03-10T07:44:50.900 INFO:teuthology.orchestra.run.vm05.stdout:  Installing       : python3-certifi-2023.05.07-4.el9.noarch 41/121
2026-03-10T07:44:50.949 INFO:teuthology.orchestra.run.vm05.stdout:  Installing       : python3-cachetools-4.2.4-1.el9.noarch 42/121
2026-03-10T07:44:51.240 INFO:teuthology.orchestra.run.vm08.stdout:Running transaction check
2026-03-10T07:44:51.294 INFO:teuthology.orchestra.run.vm08.stdout:Transaction check succeeded.
2026-03-10T07:44:51.294 INFO:teuthology.orchestra.run.vm08.stdout:Running transaction test
2026-03-10T07:44:51.329 INFO:teuthology.orchestra.run.vm05.stdout:  Installing       : python3-google-auth-1:2.45.0-1.el9.noarch 43/121
2026-03-10T07:44:51.345 INFO:teuthology.orchestra.run.vm05.stdout:  Installing       : python3-kubernetes-1:26.1.0-3.el9.noarch 44/121
2026-03-10T07:44:51.351 INFO:teuthology.orchestra.run.vm05.stdout:  Installing       : python3-backports-tarfile-1.2.0-1.el9.noarch 45/121
2026-03-10T07:44:51.359 INFO:teuthology.orchestra.run.vm05.stdout:  Installing       : python3-jaraco-context-6.0.1-3.el9.noarch 46/121
2026-03-10T07:44:51.365 INFO:teuthology.orchestra.run.vm05.stdout:  Installing       : python3-autocommand-2.2.2-8.el9.noarch 47/121
2026-03-10T07:44:51.373 INFO:teuthology.orchestra.run.vm05.stdout:  Installing       : libunwind-1.6.2-1.el9.x86_64 48/121
2026-03-10T07:44:51.377 INFO:teuthology.orchestra.run.vm05.stdout:  Installing       : gperftools-libs-2.9.1-3.el9.x86_64 49/121
2026-03-10T07:44:51.380 INFO:teuthology.orchestra.run.vm05.stdout:  Installing       : libarrow-doc-9.0.0-15.el9.noarch 50/121
2026-03-10T07:44:51.391 INFO:teuthology.orchestra.run.vm05.stdout:  Installing       : fmt-8.1.1-5.el9.x86_64 51/121
2026-03-10T07:44:51.399 INFO:teuthology.orchestra.run.vm05.stdout:  Installing       : socat-1.7.4.1-8.el9.x86_64 52/121
2026-03-10T07:44:51.405 INFO:teuthology.orchestra.run.vm05.stdout:  Installing       : python3-toml-0.10.2-6.el9.noarch 53/121
2026-03-10T07:44:51.414 INFO:teuthology.orchestra.run.vm05.stdout:  Installing       : python3-jaraco-functools-3.5.0-2.el9.noarch 54/121
2026-03-10T07:44:51.419 INFO:teuthology.orchestra.run.vm05.stdout:  Installing       : python3-jaraco-text-4.0.0-2.el9.noarch 55/121
2026-03-10T07:44:51.433 INFO:teuthology.orchestra.run.vm05.stdout:  Installing       : python3-jaraco-collections-3.0.0-8.el9.noarch 56/121
2026-03-10T07:44:51.440 INFO:teuthology.orchestra.run.vm05.stdout:  Installing       : python3-tempora-5.0.0-2.el9.noarch 57/121
2026-03-10T07:44:51.484 INFO:teuthology.orchestra.run.vm05.stdout:  Installing       : python3-portend-3.1.0-2.el9.noarch 58/121
2026-03-10T07:44:51.769 INFO:teuthology.orchestra.run.vm05.stdout:  Installing       : python3-devel-3.9.25-3.el9.x86_64 59/121
2026-03-10T07:44:51.800 INFO:teuthology.orchestra.run.vm05.stdout:  Installing       : python3-babel-2.9.1-2.el9.noarch 60/121
2026-03-10T07:44:51.807 INFO:teuthology.orchestra.run.vm05.stdout:  Installing       : python3-jinja2-2.11.3-8.el9.noarch 61/121
2026-03-10T07:44:51.868 INFO:teuthology.orchestra.run.vm05.stdout:  Installing       : openblas-0.3.29-1.el9.x86_64 62/121
2026-03-10T07:44:51.871 INFO:teuthology.orchestra.run.vm05.stdout:  Installing       : openblas-openmp-0.3.29-1.el9.x86_64 63/121
2026-03-10T07:44:51.895 INFO:teuthology.orchestra.run.vm05.stdout:  Installing       : flexiblas-openblas-openmp-3.0.4-9.el9.x86_64 64/121
2026-03-10T07:44:52.048 INFO:teuthology.orchestra.run.vm08.stdout:Transaction test succeeded.
2026-03-10T07:44:52.049 INFO:teuthology.orchestra.run.vm08.stdout:Running transaction
2026-03-10T07:44:52.282 INFO:teuthology.orchestra.run.vm05.stdout:  Installing       : flexiblas-netlib-3.0.4-9.el9.x86_64 65/121
2026-03-10T07:44:52.371 INFO:teuthology.orchestra.run.vm05.stdout:  Installing       : python3-numpy-1:1.23.5-2.el9.x86_64 66/121
2026-03-10T07:44:52.868 INFO:teuthology.orchestra.run.vm08.stdout:  Preparing        : 1/1
2026-03-10T07:44:52.876 INFO:teuthology.orchestra.run.vm08.stdout:  Installing       : thrift-0.15.0-4.el9.x86_64 1/121
2026-03-10T07:44:52.889 INFO:teuthology.orchestra.run.vm08.stdout:  Installing       : python3-more-itertools-8.12.0-2.el9.noarch 2/121
2026-03-10T07:44:53.059 INFO:teuthology.orchestra.run.vm08.stdout:  Installing       : lttng-ust-2.12.0-6.el9.x86_64 3/121
2026-03-10T07:44:53.062 INFO:teuthology.orchestra.run.vm08.stdout:  Upgrading        : librados2-2:18.2.1-0.el9.x86_64 4/121
2026-03-10T07:44:53.106 INFO:teuthology.orchestra.run.vm08.stdout:  Running scriptlet: librados2-2:18.2.1-0.el9.x86_64 4/121
2026-03-10T07:44:53.109 INFO:teuthology.orchestra.run.vm08.stdout:  Installing       : libcephfs2-2:18.2.1-0.el9.x86_64 5/121
2026-03-10T07:44:53.138 INFO:teuthology.orchestra.run.vm08.stdout:  Running scriptlet: libcephfs2-2:18.2.1-0.el9.x86_64 5/121
2026-03-10T07:44:53.147 INFO:teuthology.orchestra.run.vm08.stdout:  Installing       : python3-rados-2:18.2.1-0.el9.x86_64 6/121
2026-03-10T07:44:53.152 INFO:teuthology.orchestra.run.vm08.stdout:  Installing       : librdkafka-1.6.1-102.el9.x86_64 7/121
2026-03-10T07:44:53.154 INFO:teuthology.orchestra.run.vm08.stdout:  Installing       : librabbitmq-0.11.0-7.el9.x86_64 8/121
2026-03-10T07:44:53.163 INFO:teuthology.orchestra.run.vm08.stdout:  Installing       : python3-jaraco-8.2.1-3.el9.noarch 9/121
2026-03-10T07:44:53.165 INFO:teuthology.orchestra.run.vm08.stdout:  Installing       : libcephsqlite-2:18.2.1-0.el9.x86_64 10/121
2026-03-10T07:44:53.182 INFO:teuthology.orchestra.run.vm05.stdout:  Installing       : python3-numpy-f2py-1:1.23.5-2.el9.x86_64 67/121
2026-03-10T07:44:53.236 INFO:teuthology.orchestra.run.vm05.stdout:  Installing       : python3-scipy-1.9.3-2.el9.x86_64 68/121
2026-03-10T07:44:53.244 INFO:teuthology.orchestra.run.vm05.stdout:  Installing       : libxslt-1.1.34-12.el9.x86_64 69/121
2026-03-10T07:44:53.249 INFO:teuthology.orchestra.run.vm05.stdout:  Installing       : xmlstarlet-1.6.1-20.el9.x86_64 70/121
2026-03-10T07:44:53.264 INFO:teuthology.orchestra.run.vm08.stdout:  Running scriptlet: libcephsqlite-2:18.2.1-0.el9.x86_64 10/121
2026-03-10T07:44:53.265 INFO:teuthology.orchestra.run.vm08.stdout:  Installing       : libradosstriper1-2:18.2.1-0.el9.x86_64 11/121
2026-03-10T07:44:53.314 INFO:teuthology.orchestra.run.vm08.stdout:  Running scriptlet: libradosstriper1-2:18.2.1-0.el9.x86_64 11/121
2026-03-10T07:44:53.320 INFO:teuthology.orchestra.run.vm08.stdout:  Installing       : python3-werkzeug-2.0.3-3.el9.1.noarch 12/121
2026-03-10T07:44:53.345 INFO:teuthology.orchestra.run.vm08.stdout:  Installing       : liboath-2.6.12-1.el9.x86_64 13/121
2026-03-10T07:44:53.355 INFO:teuthology.orchestra.run.vm08.stdout:  Installing       : python3-pyasn1-0.4.8-7.el9.noarch 14/121
2026-03-10T07:44:53.359 INFO:teuthology.orchestra.run.vm08.stdout:  Installing       : python3-markupsafe-1.1.1-12.el9.x86_64 15/121
2026-03-10T07:44:53.386 INFO:teuthology.orchestra.run.vm08.stdout:  Installing       : flexiblas-3.0.4-9.el9.x86_64 16/121
2026-03-10T07:44:53.401 INFO:teuthology.orchestra.run.vm05.stdout:  Installing       : libpmemobj-1.12.1-1.el9.x86_64 71/121
2026-03-10T07:44:53.403 INFO:teuthology.orchestra.run.vm08.stdout:  Installing       : python3-urllib3-1.26.5-7.el9.noarch 17/121
2026-03-10T07:44:53.405 INFO:teuthology.orchestra.run.vm05.stdout:  Upgrading        : librbd1-2:18.2.1-0.el9.x86_64 72/121
2026-03-10T07:44:53.408 INFO:teuthology.orchestra.run.vm08.stdout:  Installing       : python3-requests-2.25.1-10.el9.noarch 18/121
2026-03-10T07:44:53.416 INFO:teuthology.orchestra.run.vm08.stdout:  Installing       : libquadmath-11.5.0-14.el9.x86_64 19/121
2026-03-10T07:44:53.419 INFO:teuthology.orchestra.run.vm08.stdout:  Installing       : libgfortran-11.5.0-14.el9.x86_64 20/121
2026-03-10T07:44:53.425 INFO:teuthology.orchestra.run.vm08.stdout:  Installing       : ledmon-libs-1.1.0-3.el9.x86_64 21/121
2026-03-10T07:44:53.436 INFO:teuthology.orchestra.run.vm08.stdout:  Installing       : python3-ceph-argparse-2:18.2.1-0.el9.x86_64 22/121
2026-03-10T07:44:53.440 INFO:teuthology.orchestra.run.vm05.stdout:  Running scriptlet: librbd1-2:18.2.1-0.el9.x86_64 72/121
2026-03-10T07:44:53.444 INFO:teuthology.orchestra.run.vm05.stdout:  Installing       : python3-rbd-2:18.2.1-0.el9.x86_64 73/121
2026-03-10T07:44:53.450 INFO:teuthology.orchestra.run.vm08.stdout:  Installing       : python3-cephfs-2:18.2.1-0.el9.x86_64 23/121
2026-03-10T07:44:53.452 INFO:teuthology.orchestra.run.vm05.stdout:  Installing       : boost-program-options-1.75.0-13.el9.x86_64 74/121
2026-03-10T07:44:53.482 INFO:teuthology.orchestra.run.vm08.stdout:  Installing       : python3-requests-oauthlib-1.3.0-12.el9.noarch 24/121
2026-03-10T07:44:53.546 INFO:teuthology.orchestra.run.vm08.stdout:  Installing       : python3-mako-1.1.4-6.el9.noarch 25/121
2026-03-10T07:44:53.564 INFO:teuthology.orchestra.run.vm08.stdout:  Installing       : python3-pyasn1-modules-0.4.8-7.el9.noarch 26/121
2026-03-10T07:44:53.573 INFO:teuthology.orchestra.run.vm08.stdout:  Installing       : python3-rsa-4.9-2.el9.noarch 27/121
2026-03-10T07:44:53.583 INFO:teuthology.orchestra.run.vm08.stdout:  Installing       : python3-jaraco-classes-3.2.1-5.el9.noarch 28/121
2026-03-10T07:44:53.588 INFO:teuthology.orchestra.run.vm08.stdout:  Installing       : librados-devel-2:18.2.1-0.el9.x86_64 29/121
2026-03-10T07:44:53.625 INFO:teuthology.orchestra.run.vm08.stdout:  Installing       : re2-1:20211101-20.el9.x86_64 30/121
2026-03-10T07:44:53.632 INFO:teuthology.orchestra.run.vm08.stdout:  Installing       : libarrow-9.0.0-15.el9.x86_64 31/121
2026-03-10T07:44:53.650 INFO:teuthology.orchestra.run.vm08.stdout:  Installing       : python3-zc-lockfile-2.0-10.el9.noarch 32/121
2026-03-10T07:44:53.670 INFO:teuthology.orchestra.run.vm05.stdout:  Installing       : parquet-libs-9.0.0-15.el9.x86_64 75/121
2026-03-10T07:44:53.673 INFO:teuthology.orchestra.run.vm05.stdout:  Installing       : librgw2-2:18.2.1-0.el9.x86_64 76/121
2026-03-10T07:44:53.676 INFO:teuthology.orchestra.run.vm08.stdout:  Installing       : python3-websocket-client-1.2.3-2.el9.noarch 33/121
2026-03-10T07:44:53.684 INFO:teuthology.orchestra.run.vm08.stdout:  Installing       : python3-webob-1.8.8-2.el9.noarch 34/121
2026-03-10T07:44:53.691 INFO:teuthology.orchestra.run.vm08.stdout:  Installing       : python3-typing-extensions-4.15.0-1.el9.noarch 35/121
2026-03-10T07:44:53.693 INFO:teuthology.orchestra.run.vm05.stdout:  Running scriptlet: librgw2-2:18.2.1-0.el9.x86_64 76/121
2026-03-10T07:44:53.702 INFO:teuthology.orchestra.run.vm05.stdout:  Installing       : python3-rgw-2:18.2.1-0.el9.x86_64 77/121
2026-03-10T07:44:53.705 INFO:teuthology.orchestra.run.vm08.stdout:  Installing       : python3-repoze-lru-0.7-16.el9.noarch 36/121
2026-03-10T07:44:53.720 INFO:teuthology.orchestra.run.vm08.stdout:  Installing       : python3-routes-2.5.1-5.el9.noarch 37/121
2026-03-10T07:44:53.720 INFO:teuthology.orchestra.run.vm05.stdout:  Installing       : python3-ply-3.11-14.el9.noarch 78/121
2026-03-10T07:44:53.734 INFO:teuthology.orchestra.run.vm08.stdout:  Installing       : python3-natsort-7.1.1-5.el9.noarch 38/121
2026-03-10T07:44:53.742 INFO:teuthology.orchestra.run.vm05.stdout:  Installing       : python3-pycparser-2.20-6.el9.noarch 79/121
2026-03-10T07:44:53.804 INFO:teuthology.orchestra.run.vm08.stdout:  Installing       : python3-logutils-0.3.5-21.el9.noarch 39/121
2026-03-10T07:44:53.813 INFO:teuthology.orchestra.run.vm08.stdout:  Installing       : python3-pecan-1.4.2-3.el9.noarch 40/121
2026-03-10T07:44:53.825 INFO:teuthology.orchestra.run.vm08.stdout:  Installing       : python3-certifi-2023.05.07-4.el9.noarch 41/121
2026-03-10T07:44:53.834 INFO:teuthology.orchestra.run.vm05.stdout:  Installing       : python3-cffi-1.14.5-5.el9.x86_64 80/121
2026-03-10T07:44:53.848 INFO:teuthology.orchestra.run.vm05.stdout:  Installing       : python3-cryptography-36.0.1-5.el9.x86_64 81/121
2026-03-10T07:44:53.876 INFO:teuthology.orchestra.run.vm08.stdout:  Installing       : python3-cachetools-4.2.4-1.el9.noarch 42/121
2026-03-10T07:44:53.877 INFO:teuthology.orchestra.run.vm05.stdout:  Installing       : python3-pyOpenSSL-21.0.0-1.el9.noarch 82/121
2026-03-10T07:44:53.916 INFO:teuthology.orchestra.run.vm05.stdout:  Installing       : python3-cheroot-10.0.1-4.el9.noarch 83/121
2026-03-10T07:44:53.980 INFO:teuthology.orchestra.run.vm05.stdout:  Installing       : python3-cherrypy-18.6.1-2.el9.noarch 84/121
2026-03-10T07:44:53.994 INFO:teuthology.orchestra.run.vm05.stdout:  Installing       : python3-asyncssh-2.13.2-5.el9.noarch 85/121
2026-03-10T07:44:53.997 INFO:teuthology.orchestra.run.vm05.stdout:  Installing       : python3-jwt-2.4.0-1.el9.noarch 86/121
2026-03-10T07:44:54.005 INFO:teuthology.orchestra.run.vm05.stdout:  Installing       : python3-jwt+crypto-2.4.0-1.el9.noarch 87/121
2026-03-10T07:44:54.009 INFO:teuthology.orchestra.run.vm05.stdout:  Installing       : python3-bcrypt-3.2.2-1.el9.x86_64 88/121
2026-03-10T07:44:54.015 INFO:teuthology.orchestra.run.vm05.stdout:  Installing       : mailcap-2.1.49-5.el9.noarch 89/121
2026-03-10T07:44:54.018 INFO:teuthology.orchestra.run.vm05.stdout:  Installing       : libconfig-1.7.2-9.el9.x86_64 90/121
2026-03-10T07:44:54.038 INFO:teuthology.orchestra.run.vm05.stdout:  Running scriptlet: libstoragemgmt-1.10.1-1.el9.x86_64 91/121
2026-03-10T07:44:54.038 INFO:teuthology.orchestra.run.vm05.stdout:Creating group 'libstoragemgmt' with GID 994.
2026-03-10T07:44:54.038 INFO:teuthology.orchestra.run.vm05.stdout:Creating user 'libstoragemgmt' (daemon account for libstoragemgmt) with UID 994 and GID 994.
2026-03-10T07:44:54.038 INFO:teuthology.orchestra.run.vm05.stdout:
2026-03-10T07:44:54.051 INFO:teuthology.orchestra.run.vm05.stdout:  Installing       : libstoragemgmt-1.10.1-1.el9.x86_64 91/121
2026-03-10T07:44:54.079 INFO:teuthology.orchestra.run.vm05.stdout:  Running scriptlet: libstoragemgmt-1.10.1-1.el9.x86_64 91/121
2026-03-10T07:44:54.079 INFO:teuthology.orchestra.run.vm05.stdout:Created symlink /etc/systemd/system/multi-user.target.wants/libstoragemgmt.service → /usr/lib/systemd/system/libstoragemgmt.service.
2026-03-10T07:44:54.079 INFO:teuthology.orchestra.run.vm05.stdout:
2026-03-10T07:44:54.097 INFO:teuthology.orchestra.run.vm05.stdout:  Installing       : python3-libstoragemgmt-1.10.1-1.el9.x86_64 92/121
2026-03-10T07:44:54.150 INFO:teuthology.orchestra.run.vm05.stdout:  Running scriptlet: cephadm-2:18.2.1-0.el9.noarch 93/121
2026-03-10T07:44:54.153 INFO:teuthology.orchestra.run.vm05.stdout:  Installing       : cephadm-2:18.2.1-0.el9.noarch 93/121
2026-03-10T07:44:54.159 INFO:teuthology.orchestra.run.vm05.stdout:  Installing       : ceph-prometheus-alerts-2:18.2.1-0.el9.noarch 94/121
2026-03-10T07:44:54.187 INFO:teuthology.orchestra.run.vm05.stdout:  Installing       : ceph-grafana-dashboards-2:18.2.1-0.el9.noarch 95/121
2026-03-10T07:44:54.191 INFO:teuthology.orchestra.run.vm05.stdout:  Installing       : python3-ceph-common-2:18.2.1-0.el9.x86_64 96/121
2026-03-10T07:44:54.273 INFO:teuthology.orchestra.run.vm08.stdout:  Installing       : python3-google-auth-1:2.45.0-1.el9.noarch 43/121
2026-03-10T07:44:54.291 INFO:teuthology.orchestra.run.vm08.stdout:  Installing       : python3-kubernetes-1:26.1.0-3.el9.noarch 44/121
2026-03-10T07:44:54.297 INFO:teuthology.orchestra.run.vm08.stdout:  Installing       : python3-backports-tarfile-1.2.0-1.el9.noarch 45/121
2026-03-10T07:44:54.305 INFO:teuthology.orchestra.run.vm08.stdout:  Installing       : python3-jaraco-context-6.0.1-3.el9.noarch 46/121
2026-03-10T07:44:54.310 INFO:teuthology.orchestra.run.vm08.stdout:  Installing       : python3-autocommand-2.2.2-8.el9.noarch 47/121
2026-03-10T07:44:54.318 INFO:teuthology.orchestra.run.vm08.stdout:  Installing       : libunwind-1.6.2-1.el9.x86_64 48/121
2026-03-10T07:44:54.322 INFO:teuthology.orchestra.run.vm08.stdout:  Installing       : gperftools-libs-2.9.1-3.el9.x86_64 49/121
2026-03-10T07:44:54.325 INFO:teuthology.orchestra.run.vm08.stdout:  Installing       : libarrow-doc-9.0.0-15.el9.noarch 50/121
2026-03-10T07:44:54.336 INFO:teuthology.orchestra.run.vm08.stdout:  Installing       : fmt-8.1.1-5.el9.x86_64 51/121
2026-03-10T07:44:54.345 INFO:teuthology.orchestra.run.vm08.stdout:  Installing       : socat-1.7.4.1-8.el9.x86_64 52/121
2026-03-10T07:44:54.350 INFO:teuthology.orchestra.run.vm08.stdout:  Installing       : python3-toml-0.10.2-6.el9.noarch 53/121
2026-03-10T07:44:54.358 INFO:teuthology.orchestra.run.vm08.stdout:  Installing       : python3-jaraco-functools-3.5.0-2.el9.noarch 54/121
2026-03-10T07:44:54.364 INFO:teuthology.orchestra.run.vm08.stdout:  Installing       : python3-jaraco-text-4.0.0-2.el9.noarch 55/121
2026-03-10T07:44:54.374 INFO:teuthology.orchestra.run.vm08.stdout:  Installing       : python3-jaraco-collections-3.0.0-8.el9.noarch 56/121
2026-03-10T07:44:54.379 INFO:teuthology.orchestra.run.vm08.stdout:  Installing       : python3-tempora-5.0.0-2.el9.noarch 57/121
2026-03-10T07:44:54.422 INFO:teuthology.orchestra.run.vm08.stdout:  Installing       : python3-portend-3.1.0-2.el9.noarch 58/121
2026-03-10T07:44:54.704 INFO:teuthology.orchestra.run.vm08.stdout:  Installing       : python3-devel-3.9.25-3.el9.x86_64 59/121
2026-03-10T07:44:54.736 INFO:teuthology.orchestra.run.vm08.stdout:  Installing       : python3-babel-2.9.1-2.el9.noarch 60/121
2026-03-10T07:44:54.743 INFO:teuthology.orchestra.run.vm08.stdout:  Installing       : python3-jinja2-2.11.3-8.el9.noarch 61/121
2026-03-10T07:44:54.816 INFO:teuthology.orchestra.run.vm08.stdout:  Installing       : openblas-0.3.29-1.el9.x86_64 62/121
2026-03-10T07:44:54.820 INFO:teuthology.orchestra.run.vm08.stdout:  Installing       : openblas-openmp-0.3.29-1.el9.x86_64 63/121
2026-03-10T07:44:54.846 INFO:teuthology.orchestra.run.vm08.stdout:  Installing       : flexiblas-openblas-openmp-3.0.4-9.el9.x86_64 64/121
2026-03-10T07:44:55.194 INFO:teuthology.orchestra.run.vm05.stdout:  Running scriptlet: ceph-common-2:18.2.1-0.el9.x86_64 97/121
2026-03-10T07:44:55.201 INFO:teuthology.orchestra.run.vm05.stdout:  Installing       : ceph-common-2:18.2.1-0.el9.x86_64 97/121
2026-03-10T07:44:55.281 INFO:teuthology.orchestra.run.vm08.stdout:  Installing       : flexiblas-netlib-3.0.4-9.el9.x86_64 65/121
2026-03-10T07:44:55.380 INFO:teuthology.orchestra.run.vm08.stdout:  Installing       : python3-numpy-1:1.23.5-2.el9.x86_64 66/121
2026-03-10T07:44:55.530 INFO:teuthology.orchestra.run.vm05.stdout:  Running scriptlet: ceph-common-2:18.2.1-0.el9.x86_64 97/121
2026-03-10T07:44:55.596 INFO:teuthology.orchestra.run.vm05.stdout:  Installing       : ceph-base-2:18.2.1-0.el9.x86_64 98/121
2026-03-10T07:44:55.638 INFO:teuthology.orchestra.run.vm05.stdout:  Running scriptlet: ceph-base-2:18.2.1-0.el9.x86_64 98/121
2026-03-10T07:44:55.638 INFO:teuthology.orchestra.run.vm05.stdout:Created symlink /etc/systemd/system/multi-user.target.wants/ceph.target → /usr/lib/systemd/system/ceph.target.
2026-03-10T07:44:55.638 INFO:teuthology.orchestra.run.vm05.stdout:Created symlink /etc/systemd/system/ceph.target.wants/ceph-crash.service → /usr/lib/systemd/system/ceph-crash.service.
2026-03-10T07:44:55.638 INFO:teuthology.orchestra.run.vm05.stdout:
2026-03-10T07:44:55.644 INFO:teuthology.orchestra.run.vm05.stdout:  Installing       : ceph-selinux-2:18.2.1-0.el9.x86_64 99/121
2026-03-10T07:44:56.300 INFO:teuthology.orchestra.run.vm08.stdout:  Installing       : python3-numpy-f2py-1:1.23.5-2.el9.x86_64 67/121
2026-03-10T07:44:56.333 INFO:teuthology.orchestra.run.vm08.stdout:  Installing       : python3-scipy-1.9.3-2.el9.x86_64 68/121
2026-03-10T07:44:56.340 INFO:teuthology.orchestra.run.vm08.stdout:  Installing       : libxslt-1.1.34-12.el9.x86_64 69/121
2026-03-10T07:44:56.346 INFO:teuthology.orchestra.run.vm08.stdout:  Installing       : xmlstarlet-1.6.1-20.el9.x86_64 70/121
2026-03-10T07:44:56.514 INFO:teuthology.orchestra.run.vm08.stdout:  Installing       : libpmemobj-1.12.1-1.el9.x86_64 71/121
2026-03-10T07:44:56.517 INFO:teuthology.orchestra.run.vm08.stdout:  Upgrading        : librbd1-2:18.2.1-0.el9.x86_64 72/121
2026-03-10T07:44:56.548 INFO:teuthology.orchestra.run.vm08.stdout:  Running scriptlet: librbd1-2:18.2.1-0.el9.x86_64 72/121
2026-03-10T07:44:56.552 INFO:teuthology.orchestra.run.vm08.stdout:  Installing       : python3-rbd-2:18.2.1-0.el9.x86_64 73/121
2026-03-10T07:44:56.561 INFO:teuthology.orchestra.run.vm08.stdout:  Installing       : boost-program-options-1.75.0-13.el9.x86_64 74/121
2026-03-10T07:44:56.796 INFO:teuthology.orchestra.run.vm08.stdout:  Installing       : parquet-libs-9.0.0-15.el9.x86_64 75/121
2026-03-10T07:44:56.799 INFO:teuthology.orchestra.run.vm08.stdout:  Installing       : librgw2-2:18.2.1-0.el9.x86_64 76/121
2026-03-10T07:44:56.819 INFO:teuthology.orchestra.run.vm08.stdout:  Running scriptlet: librgw2-2:18.2.1-0.el9.x86_64 76/121
2026-03-10T07:44:56.828 INFO:teuthology.orchestra.run.vm08.stdout:  Installing       : python3-rgw-2:18.2.1-0.el9.x86_64 77/121
2026-03-10T07:44:56.850 INFO:teuthology.orchestra.run.vm08.stdout:  Installing       : python3-ply-3.11-14.el9.noarch 78/121
2026-03-10T07:44:56.875 INFO:teuthology.orchestra.run.vm08.stdout:  Installing       : python3-pycparser-2.20-6.el9.noarch
79/121 2026-03-10T07:44:56.985 INFO:teuthology.orchestra.run.vm08.stdout: Installing : python3-cffi-1.14.5-5.el9.x86_64 80/121 2026-03-10T07:44:57.018 INFO:teuthology.orchestra.run.vm08.stdout: Installing : python3-cryptography-36.0.1-5.el9.x86_64 81/121 2026-03-10T07:44:57.054 INFO:teuthology.orchestra.run.vm08.stdout: Installing : python3-pyOpenSSL-21.0.0-1.el9.noarch 82/121 2026-03-10T07:44:57.100 INFO:teuthology.orchestra.run.vm08.stdout: Installing : python3-cheroot-10.0.1-4.el9.noarch 83/121 2026-03-10T07:44:57.171 INFO:teuthology.orchestra.run.vm08.stdout: Installing : python3-cherrypy-18.6.1-2.el9.noarch 84/121 2026-03-10T07:44:57.186 INFO:teuthology.orchestra.run.vm08.stdout: Installing : python3-asyncssh-2.13.2-5.el9.noarch 85/121 2026-03-10T07:44:57.189 INFO:teuthology.orchestra.run.vm08.stdout: Installing : python3-jwt-2.4.0-1.el9.noarch 86/121 2026-03-10T07:44:57.197 INFO:teuthology.orchestra.run.vm08.stdout: Installing : python3-jwt+crypto-2.4.0-1.el9.noarch 87/121 2026-03-10T07:44:57.203 INFO:teuthology.orchestra.run.vm08.stdout: Installing : python3-bcrypt-3.2.2-1.el9.x86_64 88/121 2026-03-10T07:44:57.208 INFO:teuthology.orchestra.run.vm08.stdout: Installing : mailcap-2.1.49-5.el9.noarch 89/121 2026-03-10T07:44:57.211 INFO:teuthology.orchestra.run.vm08.stdout: Installing : libconfig-1.7.2-9.el9.x86_64 90/121 2026-03-10T07:44:57.230 INFO:teuthology.orchestra.run.vm08.stdout: Running scriptlet: libstoragemgmt-1.10.1-1.el9.x86_64 91/121 2026-03-10T07:44:57.230 INFO:teuthology.orchestra.run.vm08.stdout:Creating group 'libstoragemgmt' with GID 994. 2026-03-10T07:44:57.230 INFO:teuthology.orchestra.run.vm08.stdout:Creating user 'libstoragemgmt' (daemon account for libstoragemgmt) with UID 994 and GID 994. 
2026-03-10T07:44:57.230 INFO:teuthology.orchestra.run.vm08.stdout: 2026-03-10T07:44:57.243 INFO:teuthology.orchestra.run.vm08.stdout: Installing : libstoragemgmt-1.10.1-1.el9.x86_64 91/121 2026-03-10T07:44:57.280 INFO:teuthology.orchestra.run.vm08.stdout: Running scriptlet: libstoragemgmt-1.10.1-1.el9.x86_64 91/121 2026-03-10T07:44:57.280 INFO:teuthology.orchestra.run.vm08.stdout:Created symlink /etc/systemd/system/multi-user.target.wants/libstoragemgmt.service → /usr/lib/systemd/system/libstoragemgmt.service. 2026-03-10T07:44:57.280 INFO:teuthology.orchestra.run.vm08.stdout: 2026-03-10T07:44:57.303 INFO:teuthology.orchestra.run.vm08.stdout: Installing : python3-libstoragemgmt-1.10.1-1.el9.x86_64 92/121 2026-03-10T07:44:57.368 INFO:teuthology.orchestra.run.vm08.stdout: Running scriptlet: cephadm-2:18.2.1-0.el9.noarch 93/121 2026-03-10T07:44:57.371 INFO:teuthology.orchestra.run.vm08.stdout: Installing : cephadm-2:18.2.1-0.el9.noarch 93/121 2026-03-10T07:44:57.376 INFO:teuthology.orchestra.run.vm08.stdout: Installing : ceph-prometheus-alerts-2:18.2.1-0.el9.noarch 94/121 2026-03-10T07:44:57.407 INFO:teuthology.orchestra.run.vm08.stdout: Installing : ceph-grafana-dashboards-2:18.2.1-0.el9.noarch 95/121 2026-03-10T07:44:57.412 INFO:teuthology.orchestra.run.vm08.stdout: Installing : python3-ceph-common-2:18.2.1-0.el9.x86_64 96/121 2026-03-10T07:44:58.414 INFO:teuthology.orchestra.run.vm08.stdout: Running scriptlet: ceph-common-2:18.2.1-0.el9.x86_64 97/121 2026-03-10T07:44:58.420 INFO:teuthology.orchestra.run.vm08.stdout: Installing : ceph-common-2:18.2.1-0.el9.x86_64 97/121 2026-03-10T07:44:58.729 INFO:teuthology.orchestra.run.vm08.stdout: Running scriptlet: ceph-common-2:18.2.1-0.el9.x86_64 97/121 2026-03-10T07:44:58.803 INFO:teuthology.orchestra.run.vm08.stdout: Installing : ceph-base-2:18.2.1-0.el9.x86_64 98/121 2026-03-10T07:44:58.853 INFO:teuthology.orchestra.run.vm08.stdout: Running scriptlet: ceph-base-2:18.2.1-0.el9.x86_64 98/121 2026-03-10T07:44:58.853 
INFO:teuthology.orchestra.run.vm08.stdout:Created symlink /etc/systemd/system/multi-user.target.wants/ceph.target → /usr/lib/systemd/system/ceph.target. 2026-03-10T07:44:58.853 INFO:teuthology.orchestra.run.vm08.stdout:Created symlink /etc/systemd/system/ceph.target.wants/ceph-crash.service → /usr/lib/systemd/system/ceph-crash.service. 2026-03-10T07:44:58.853 INFO:teuthology.orchestra.run.vm08.stdout: 2026-03-10T07:44:58.858 INFO:teuthology.orchestra.run.vm08.stdout: Installing : ceph-selinux-2:18.2.1-0.el9.x86_64 99/121 2026-03-10T07:45:02.173 INFO:teuthology.orchestra.run.vm05.stdout: Running scriptlet: ceph-selinux-2:18.2.1-0.el9.x86_64 99/121 2026-03-10T07:45:02.173 INFO:teuthology.orchestra.run.vm05.stdout:skipping the directory /sys 2026-03-10T07:45:02.173 INFO:teuthology.orchestra.run.vm05.stdout:skipping the directory /proc 2026-03-10T07:45:02.173 INFO:teuthology.orchestra.run.vm05.stdout:skipping the directory /mnt 2026-03-10T07:45:02.173 INFO:teuthology.orchestra.run.vm05.stdout:skipping the directory /var/tmp 2026-03-10T07:45:02.173 INFO:teuthology.orchestra.run.vm05.stdout:skipping the directory /home 2026-03-10T07:45:02.173 INFO:teuthology.orchestra.run.vm05.stdout:skipping the directory /root 2026-03-10T07:45:02.173 INFO:teuthology.orchestra.run.vm05.stdout:skipping the directory /tmp 2026-03-10T07:45:02.173 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-10T07:45:02.206 INFO:teuthology.orchestra.run.vm05.stdout: Installing : ceph-mgr-cephadm-2:18.2.1-0.el9.noarch 100/121 2026-03-10T07:45:02.337 INFO:teuthology.orchestra.run.vm05.stdout: Running scriptlet: ceph-mgr-cephadm-2:18.2.1-0.el9.noarch 100/121 2026-03-10T07:45:02.342 INFO:teuthology.orchestra.run.vm05.stdout: Installing : ceph-mgr-dashboard-2:18.2.1-0.el9.noarch 101/121 2026-03-10T07:45:02.897 INFO:teuthology.orchestra.run.vm05.stdout: Running scriptlet: ceph-mgr-dashboard-2:18.2.1-0.el9.noarch 101/121 2026-03-10T07:45:02.905 INFO:teuthology.orchestra.run.vm05.stdout: Installing : 
ceph-mgr-diskprediction-local-2:18.2.1-0.el9.noa 102/121 2026-03-10T07:45:02.968 INFO:teuthology.orchestra.run.vm05.stdout: Running scriptlet: ceph-mgr-diskprediction-local-2:18.2.1-0.el9.noa 102/121 2026-03-10T07:45:03.049 INFO:teuthology.orchestra.run.vm05.stdout: Installing : ceph-mgr-modules-core-2:18.2.1-0.el9.noarch 103/121 2026-03-10T07:45:03.052 INFO:teuthology.orchestra.run.vm05.stdout: Installing : ceph-mgr-2:18.2.1-0.el9.x86_64 104/121 2026-03-10T07:45:03.081 INFO:teuthology.orchestra.run.vm05.stdout: Running scriptlet: ceph-mgr-2:18.2.1-0.el9.x86_64 104/121 2026-03-10T07:45:03.081 INFO:teuthology.orchestra.run.vm05.stdout:Glob pattern passed to enable, but globs are not supported for this. 2026-03-10T07:45:03.081 INFO:teuthology.orchestra.run.vm05.stdout:Invalid unit name "ceph-mgr@*.service" escaped as "ceph-mgr@\x2a.service". 2026-03-10T07:45:03.081 INFO:teuthology.orchestra.run.vm05.stdout:Created symlink /etc/systemd/system/multi-user.target.wants/ceph-mgr.target → /usr/lib/systemd/system/ceph-mgr.target. 2026-03-10T07:45:03.081 INFO:teuthology.orchestra.run.vm05.stdout:Created symlink /etc/systemd/system/ceph.target.wants/ceph-mgr.target → /usr/lib/systemd/system/ceph-mgr.target. 2026-03-10T07:45:03.081 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-10T07:45:03.095 INFO:teuthology.orchestra.run.vm05.stdout: Installing : ceph-mgr-rook-2:18.2.1-0.el9.noarch 105/121 2026-03-10T07:45:03.208 INFO:teuthology.orchestra.run.vm05.stdout: Running scriptlet: ceph-mgr-rook-2:18.2.1-0.el9.noarch 105/121 2026-03-10T07:45:03.211 INFO:teuthology.orchestra.run.vm05.stdout: Installing : ceph-mds-2:18.2.1-0.el9.x86_64 106/121 2026-03-10T07:45:03.235 INFO:teuthology.orchestra.run.vm05.stdout: Running scriptlet: ceph-mds-2:18.2.1-0.el9.x86_64 106/121 2026-03-10T07:45:03.235 INFO:teuthology.orchestra.run.vm05.stdout:Glob pattern passed to enable, but globs are not supported for this. 
2026-03-10T07:45:03.235 INFO:teuthology.orchestra.run.vm05.stdout:Invalid unit name "ceph-mds@*.service" escaped as "ceph-mds@\x2a.service". 2026-03-10T07:45:03.235 INFO:teuthology.orchestra.run.vm05.stdout:Created symlink /etc/systemd/system/multi-user.target.wants/ceph-mds.target → /usr/lib/systemd/system/ceph-mds.target. 2026-03-10T07:45:03.235 INFO:teuthology.orchestra.run.vm05.stdout:Created symlink /etc/systemd/system/ceph.target.wants/ceph-mds.target → /usr/lib/systemd/system/ceph-mds.target. 2026-03-10T07:45:03.235 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-10T07:45:03.472 INFO:teuthology.orchestra.run.vm05.stdout: Installing : ceph-mon-2:18.2.1-0.el9.x86_64 107/121 2026-03-10T07:45:03.493 INFO:teuthology.orchestra.run.vm05.stdout: Running scriptlet: ceph-mon-2:18.2.1-0.el9.x86_64 107/121 2026-03-10T07:45:03.493 INFO:teuthology.orchestra.run.vm05.stdout:Glob pattern passed to enable, but globs are not supported for this. 2026-03-10T07:45:03.493 INFO:teuthology.orchestra.run.vm05.stdout:Invalid unit name "ceph-mon@*.service" escaped as "ceph-mon@\x2a.service". 2026-03-10T07:45:03.493 INFO:teuthology.orchestra.run.vm05.stdout:Created symlink /etc/systemd/system/multi-user.target.wants/ceph-mon.target → /usr/lib/systemd/system/ceph-mon.target. 2026-03-10T07:45:03.493 INFO:teuthology.orchestra.run.vm05.stdout:Created symlink /etc/systemd/system/ceph.target.wants/ceph-mon.target → /usr/lib/systemd/system/ceph-mon.target. 2026-03-10T07:45:03.493 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-10T07:45:04.464 INFO:teuthology.orchestra.run.vm05.stdout: Installing : ceph-osd-2:18.2.1-0.el9.x86_64 108/121 2026-03-10T07:45:04.489 INFO:teuthology.orchestra.run.vm05.stdout: Running scriptlet: ceph-osd-2:18.2.1-0.el9.x86_64 108/121 2026-03-10T07:45:04.489 INFO:teuthology.orchestra.run.vm05.stdout:Glob pattern passed to enable, but globs are not supported for this. 
2026-03-10T07:45:04.489 INFO:teuthology.orchestra.run.vm05.stdout:Invalid unit name "ceph-osd@*.service" escaped as "ceph-osd@\x2a.service". 2026-03-10T07:45:04.489 INFO:teuthology.orchestra.run.vm05.stdout:Created symlink /etc/systemd/system/multi-user.target.wants/ceph-osd.target → /usr/lib/systemd/system/ceph-osd.target. 2026-03-10T07:45:04.489 INFO:teuthology.orchestra.run.vm05.stdout:Created symlink /etc/systemd/system/ceph.target.wants/ceph-osd.target → /usr/lib/systemd/system/ceph-osd.target. 2026-03-10T07:45:04.489 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-10T07:45:04.938 INFO:teuthology.orchestra.run.vm05.stdout: Installing : ceph-2:18.2.1-0.el9.x86_64 109/121 2026-03-10T07:45:05.000 INFO:teuthology.orchestra.run.vm05.stdout: Installing : ceph-radosgw-2:18.2.1-0.el9.x86_64 110/121 2026-03-10T07:45:05.023 INFO:teuthology.orchestra.run.vm05.stdout: Running scriptlet: ceph-radosgw-2:18.2.1-0.el9.x86_64 110/121 2026-03-10T07:45:05.023 INFO:teuthology.orchestra.run.vm05.stdout:Glob pattern passed to enable, but globs are not supported for this. 2026-03-10T07:45:05.023 INFO:teuthology.orchestra.run.vm05.stdout:Invalid unit name "ceph-radosgw@*.service" escaped as "ceph-radosgw@\x2a.service". 2026-03-10T07:45:05.023 INFO:teuthology.orchestra.run.vm05.stdout:Created symlink /etc/systemd/system/multi-user.target.wants/ceph-radosgw.target → /usr/lib/systemd/system/ceph-radosgw.target. 2026-03-10T07:45:05.023 INFO:teuthology.orchestra.run.vm05.stdout:Created symlink /etc/systemd/system/ceph.target.wants/ceph-radosgw.target → /usr/lib/systemd/system/ceph-radosgw.target. 
2026-03-10T07:45:05.023 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-10T07:45:05.098 INFO:teuthology.orchestra.run.vm05.stdout: Installing : ceph-immutable-object-cache-2:18.2.1-0.el9.x86_6 111/121 2026-03-10T07:45:05.120 INFO:teuthology.orchestra.run.vm05.stdout: Running scriptlet: ceph-immutable-object-cache-2:18.2.1-0.el9.x86_6 111/121 2026-03-10T07:45:05.121 INFO:teuthology.orchestra.run.vm05.stdout:Glob pattern passed to enable, but globs are not supported for this. 2026-03-10T07:45:05.121 INFO:teuthology.orchestra.run.vm05.stdout:Invalid unit name "ceph-immutable-object-cache@*.service" escaped as "ceph-immutable-object-cache@\x2a.service". 2026-03-10T07:45:05.121 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-10T07:45:05.272 INFO:teuthology.orchestra.run.vm05.stdout: Installing : rbd-mirror-2:18.2.1-0.el9.x86_64 112/121 2026-03-10T07:45:05.295 INFO:teuthology.orchestra.run.vm05.stdout: Running scriptlet: rbd-mirror-2:18.2.1-0.el9.x86_64 112/121 2026-03-10T07:45:05.295 INFO:teuthology.orchestra.run.vm05.stdout:Glob pattern passed to enable, but globs are not supported for this. 2026-03-10T07:45:05.295 INFO:teuthology.orchestra.run.vm05.stdout:Invalid unit name "ceph-rbd-mirror@*.service" escaped as "ceph-rbd-mirror@\x2a.service". 2026-03-10T07:45:05.295 INFO:teuthology.orchestra.run.vm05.stdout:Created symlink /etc/systemd/system/multi-user.target.wants/ceph-rbd-mirror.target → /usr/lib/systemd/system/ceph-rbd-mirror.target. 2026-03-10T07:45:05.295 INFO:teuthology.orchestra.run.vm05.stdout:Created symlink /etc/systemd/system/ceph.target.wants/ceph-rbd-mirror.target → /usr/lib/systemd/system/ceph-rbd-mirror.target. 
2026-03-10T07:45:05.296 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-10T07:45:05.735 INFO:teuthology.orchestra.run.vm08.stdout: Running scriptlet: ceph-selinux-2:18.2.1-0.el9.x86_64 99/121 2026-03-10T07:45:05.735 INFO:teuthology.orchestra.run.vm08.stdout:skipping the directory /sys 2026-03-10T07:45:05.736 INFO:teuthology.orchestra.run.vm08.stdout:skipping the directory /proc 2026-03-10T07:45:05.736 INFO:teuthology.orchestra.run.vm08.stdout:skipping the directory /mnt 2026-03-10T07:45:05.736 INFO:teuthology.orchestra.run.vm08.stdout:skipping the directory /var/tmp 2026-03-10T07:45:05.736 INFO:teuthology.orchestra.run.vm08.stdout:skipping the directory /home 2026-03-10T07:45:05.736 INFO:teuthology.orchestra.run.vm08.stdout:skipping the directory /root 2026-03-10T07:45:05.736 INFO:teuthology.orchestra.run.vm08.stdout:skipping the directory /tmp 2026-03-10T07:45:05.736 INFO:teuthology.orchestra.run.vm08.stdout: 2026-03-10T07:45:05.767 INFO:teuthology.orchestra.run.vm08.stdout: Installing : ceph-mgr-cephadm-2:18.2.1-0.el9.noarch 100/121 2026-03-10T07:45:05.896 INFO:teuthology.orchestra.run.vm08.stdout: Running scriptlet: ceph-mgr-cephadm-2:18.2.1-0.el9.noarch 100/121 2026-03-10T07:45:05.901 INFO:teuthology.orchestra.run.vm08.stdout: Installing : ceph-mgr-dashboard-2:18.2.1-0.el9.noarch 101/121 2026-03-10T07:45:06.440 INFO:teuthology.orchestra.run.vm08.stdout: Running scriptlet: ceph-mgr-dashboard-2:18.2.1-0.el9.noarch 101/121 2026-03-10T07:45:06.448 INFO:teuthology.orchestra.run.vm08.stdout: Installing : ceph-mgr-diskprediction-local-2:18.2.1-0.el9.noa 102/121 2026-03-10T07:45:06.519 INFO:teuthology.orchestra.run.vm08.stdout: Running scriptlet: ceph-mgr-diskprediction-local-2:18.2.1-0.el9.noa 102/121 2026-03-10T07:45:06.602 INFO:teuthology.orchestra.run.vm08.stdout: Installing : ceph-mgr-modules-core-2:18.2.1-0.el9.noarch 103/121 2026-03-10T07:45:06.605 INFO:teuthology.orchestra.run.vm08.stdout: Installing : ceph-mgr-2:18.2.1-0.el9.x86_64 104/121 
2026-03-10T07:45:06.628 INFO:teuthology.orchestra.run.vm08.stdout: Running scriptlet: ceph-mgr-2:18.2.1-0.el9.x86_64 104/121 2026-03-10T07:45:06.628 INFO:teuthology.orchestra.run.vm08.stdout:Glob pattern passed to enable, but globs are not supported for this. 2026-03-10T07:45:06.628 INFO:teuthology.orchestra.run.vm08.stdout:Invalid unit name "ceph-mgr@*.service" escaped as "ceph-mgr@\x2a.service". 2026-03-10T07:45:06.628 INFO:teuthology.orchestra.run.vm08.stdout:Created symlink /etc/systemd/system/multi-user.target.wants/ceph-mgr.target → /usr/lib/systemd/system/ceph-mgr.target. 2026-03-10T07:45:06.628 INFO:teuthology.orchestra.run.vm08.stdout:Created symlink /etc/systemd/system/ceph.target.wants/ceph-mgr.target → /usr/lib/systemd/system/ceph-mgr.target. 2026-03-10T07:45:06.628 INFO:teuthology.orchestra.run.vm08.stdout: 2026-03-10T07:45:06.642 INFO:teuthology.orchestra.run.vm08.stdout: Installing : ceph-mgr-rook-2:18.2.1-0.el9.noarch 105/121 2026-03-10T07:45:06.758 INFO:teuthology.orchestra.run.vm08.stdout: Running scriptlet: ceph-mgr-rook-2:18.2.1-0.el9.noarch 105/121 2026-03-10T07:45:06.761 INFO:teuthology.orchestra.run.vm08.stdout: Installing : ceph-mds-2:18.2.1-0.el9.x86_64 106/121 2026-03-10T07:45:06.786 INFO:teuthology.orchestra.run.vm08.stdout: Running scriptlet: ceph-mds-2:18.2.1-0.el9.x86_64 106/121 2026-03-10T07:45:06.786 INFO:teuthology.orchestra.run.vm08.stdout:Glob pattern passed to enable, but globs are not supported for this. 2026-03-10T07:45:06.786 INFO:teuthology.orchestra.run.vm08.stdout:Invalid unit name "ceph-mds@*.service" escaped as "ceph-mds@\x2a.service". 2026-03-10T07:45:06.786 INFO:teuthology.orchestra.run.vm08.stdout:Created symlink /etc/systemd/system/multi-user.target.wants/ceph-mds.target → /usr/lib/systemd/system/ceph-mds.target. 2026-03-10T07:45:06.786 INFO:teuthology.orchestra.run.vm08.stdout:Created symlink /etc/systemd/system/ceph.target.wants/ceph-mds.target → /usr/lib/systemd/system/ceph-mds.target. 
2026-03-10T07:45:06.786 INFO:teuthology.orchestra.run.vm08.stdout: 2026-03-10T07:45:07.033 INFO:teuthology.orchestra.run.vm08.stdout: Installing : ceph-mon-2:18.2.1-0.el9.x86_64 107/121 2026-03-10T07:45:07.055 INFO:teuthology.orchestra.run.vm08.stdout: Running scriptlet: ceph-mon-2:18.2.1-0.el9.x86_64 107/121 2026-03-10T07:45:07.055 INFO:teuthology.orchestra.run.vm08.stdout:Glob pattern passed to enable, but globs are not supported for this. 2026-03-10T07:45:07.055 INFO:teuthology.orchestra.run.vm08.stdout:Invalid unit name "ceph-mon@*.service" escaped as "ceph-mon@\x2a.service". 2026-03-10T07:45:07.055 INFO:teuthology.orchestra.run.vm08.stdout:Created symlink /etc/systemd/system/multi-user.target.wants/ceph-mon.target → /usr/lib/systemd/system/ceph-mon.target. 2026-03-10T07:45:07.055 INFO:teuthology.orchestra.run.vm08.stdout:Created symlink /etc/systemd/system/ceph.target.wants/ceph-mon.target → /usr/lib/systemd/system/ceph-mon.target. 2026-03-10T07:45:07.055 INFO:teuthology.orchestra.run.vm08.stdout: 2026-03-10T07:45:07.405 INFO:teuthology.orchestra.run.vm05.stdout: Installing : ceph-test-2:18.2.1-0.el9.x86_64 113/121 2026-03-10T07:45:07.417 INFO:teuthology.orchestra.run.vm05.stdout: Installing : rbd-fuse-2:18.2.1-0.el9.x86_64 114/121 2026-03-10T07:45:07.423 INFO:teuthology.orchestra.run.vm05.stdout: Installing : rbd-nbd-2:18.2.1-0.el9.x86_64 115/121 2026-03-10T07:45:07.463 INFO:teuthology.orchestra.run.vm05.stdout: Installing : libcephfs-devel-2:18.2.1-0.el9.x86_64 116/121 2026-03-10T07:45:07.470 INFO:teuthology.orchestra.run.vm05.stdout: Installing : ceph-fuse-2:18.2.1-0.el9.x86_64 117/121 2026-03-10T07:45:07.479 INFO:teuthology.orchestra.run.vm05.stdout: Installing : python3-xmltodict-0.12.0-15.el9.noarch 118/121 2026-03-10T07:45:07.484 INFO:teuthology.orchestra.run.vm05.stdout: Installing : python3-jmespath-1.0.1-1.el9.noarch 119/121 2026-03-10T07:45:07.484 INFO:teuthology.orchestra.run.vm05.stdout: Cleanup : librbd1-2:16.2.4-5.el9.x86_64 120/121 
2026-03-10T07:45:07.500 INFO:teuthology.orchestra.run.vm05.stdout: Running scriptlet: librbd1-2:16.2.4-5.el9.x86_64 120/121 2026-03-10T07:45:07.500 INFO:teuthology.orchestra.run.vm05.stdout: Cleanup : librados2-2:16.2.4-5.el9.x86_64 121/121 2026-03-10T07:45:07.898 INFO:teuthology.orchestra.run.vm08.stdout: Installing : ceph-osd-2:18.2.1-0.el9.x86_64 108/121 2026-03-10T07:45:07.926 INFO:teuthology.orchestra.run.vm08.stdout: Running scriptlet: ceph-osd-2:18.2.1-0.el9.x86_64 108/121 2026-03-10T07:45:07.926 INFO:teuthology.orchestra.run.vm08.stdout:Glob pattern passed to enable, but globs are not supported for this. 2026-03-10T07:45:07.926 INFO:teuthology.orchestra.run.vm08.stdout:Invalid unit name "ceph-osd@*.service" escaped as "ceph-osd@\x2a.service". 2026-03-10T07:45:07.926 INFO:teuthology.orchestra.run.vm08.stdout:Created symlink /etc/systemd/system/multi-user.target.wants/ceph-osd.target → /usr/lib/systemd/system/ceph-osd.target. 2026-03-10T07:45:07.926 INFO:teuthology.orchestra.run.vm08.stdout:Created symlink /etc/systemd/system/ceph.target.wants/ceph-osd.target → /usr/lib/systemd/system/ceph-osd.target. 2026-03-10T07:45:07.926 INFO:teuthology.orchestra.run.vm08.stdout: 2026-03-10T07:45:08.369 INFO:teuthology.orchestra.run.vm08.stdout: Installing : ceph-2:18.2.1-0.el9.x86_64 109/121 2026-03-10T07:45:08.405 INFO:teuthology.orchestra.run.vm08.stdout: Installing : ceph-radosgw-2:18.2.1-0.el9.x86_64 110/121 2026-03-10T07:45:08.428 INFO:teuthology.orchestra.run.vm08.stdout: Running scriptlet: ceph-radosgw-2:18.2.1-0.el9.x86_64 110/121 2026-03-10T07:45:08.428 INFO:teuthology.orchestra.run.vm08.stdout:Glob pattern passed to enable, but globs are not supported for this. 2026-03-10T07:45:08.429 INFO:teuthology.orchestra.run.vm08.stdout:Invalid unit name "ceph-radosgw@*.service" escaped as "ceph-radosgw@\x2a.service". 
2026-03-10T07:45:08.429 INFO:teuthology.orchestra.run.vm08.stdout:Created symlink /etc/systemd/system/multi-user.target.wants/ceph-radosgw.target → /usr/lib/systemd/system/ceph-radosgw.target. 2026-03-10T07:45:08.429 INFO:teuthology.orchestra.run.vm08.stdout:Created symlink /etc/systemd/system/ceph.target.wants/ceph-radosgw.target → /usr/lib/systemd/system/ceph-radosgw.target. 2026-03-10T07:45:08.429 INFO:teuthology.orchestra.run.vm08.stdout: 2026-03-10T07:45:08.440 INFO:teuthology.orchestra.run.vm08.stdout: Installing : ceph-immutable-object-cache-2:18.2.1-0.el9.x86_6 111/121 2026-03-10T07:45:08.463 INFO:teuthology.orchestra.run.vm08.stdout: Running scriptlet: ceph-immutable-object-cache-2:18.2.1-0.el9.x86_6 111/121 2026-03-10T07:45:08.463 INFO:teuthology.orchestra.run.vm08.stdout:Glob pattern passed to enable, but globs are not supported for this. 2026-03-10T07:45:08.463 INFO:teuthology.orchestra.run.vm08.stdout:Invalid unit name "ceph-immutable-object-cache@*.service" escaped as "ceph-immutable-object-cache@\x2a.service". 2026-03-10T07:45:08.463 INFO:teuthology.orchestra.run.vm08.stdout: 2026-03-10T07:45:08.627 INFO:teuthology.orchestra.run.vm08.stdout: Installing : rbd-mirror-2:18.2.1-0.el9.x86_64 112/121 2026-03-10T07:45:08.653 INFO:teuthology.orchestra.run.vm08.stdout: Running scriptlet: rbd-mirror-2:18.2.1-0.el9.x86_64 112/121 2026-03-10T07:45:08.653 INFO:teuthology.orchestra.run.vm08.stdout:Glob pattern passed to enable, but globs are not supported for this. 2026-03-10T07:45:08.653 INFO:teuthology.orchestra.run.vm08.stdout:Invalid unit name "ceph-rbd-mirror@*.service" escaped as "ceph-rbd-mirror@\x2a.service". 2026-03-10T07:45:08.653 INFO:teuthology.orchestra.run.vm08.stdout:Created symlink /etc/systemd/system/multi-user.target.wants/ceph-rbd-mirror.target → /usr/lib/systemd/system/ceph-rbd-mirror.target. 
2026-03-10T07:45:08.653 INFO:teuthology.orchestra.run.vm08.stdout:Created symlink /etc/systemd/system/ceph.target.wants/ceph-rbd-mirror.target → /usr/lib/systemd/system/ceph-rbd-mirror.target. 2026-03-10T07:45:08.653 INFO:teuthology.orchestra.run.vm08.stdout: 2026-03-10T07:45:08.669 INFO:teuthology.orchestra.run.vm05.stdout: Running scriptlet: librados2-2:16.2.4-5.el9.x86_64 121/121 2026-03-10T07:45:08.669 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : ceph-2:18.2.1-0.el9.x86_64 1/121 2026-03-10T07:45:08.669 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : ceph-base-2:18.2.1-0.el9.x86_64 2/121 2026-03-10T07:45:08.670 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : ceph-common-2:18.2.1-0.el9.x86_64 3/121 2026-03-10T07:45:08.670 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : ceph-fuse-2:18.2.1-0.el9.x86_64 4/121 2026-03-10T07:45:08.670 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : ceph-immutable-object-cache-2:18.2.1-0.el9.x86_6 5/121 2026-03-10T07:45:08.670 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : ceph-mds-2:18.2.1-0.el9.x86_64 6/121 2026-03-10T07:45:08.670 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : ceph-mgr-2:18.2.1-0.el9.x86_64 7/121 2026-03-10T07:45:08.670 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : ceph-mon-2:18.2.1-0.el9.x86_64 8/121 2026-03-10T07:45:08.670 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : ceph-osd-2:18.2.1-0.el9.x86_64 9/121 2026-03-10T07:45:08.670 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : ceph-radosgw-2:18.2.1-0.el9.x86_64 10/121 2026-03-10T07:45:08.670 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : ceph-selinux-2:18.2.1-0.el9.x86_64 11/121 2026-03-10T07:45:08.670 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : ceph-test-2:18.2.1-0.el9.x86_64 12/121 2026-03-10T07:45:08.670 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : libcephfs-devel-2:18.2.1-0.el9.x86_64 13/121 2026-03-10T07:45:08.670 INFO:teuthology.orchestra.run.vm05.stdout: 
Verifying : libcephfs2-2:18.2.1-0.el9.x86_64 14/121 2026-03-10T07:45:08.670 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : libcephsqlite-2:18.2.1-0.el9.x86_64 15/121 2026-03-10T07:45:08.670 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : librados-devel-2:18.2.1-0.el9.x86_64 16/121 2026-03-10T07:45:08.670 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : libradosstriper1-2:18.2.1-0.el9.x86_64 17/121 2026-03-10T07:45:08.670 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : librgw2-2:18.2.1-0.el9.x86_64 18/121 2026-03-10T07:45:08.670 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-ceph-argparse-2:18.2.1-0.el9.x86_64 19/121 2026-03-10T07:45:08.670 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-ceph-common-2:18.2.1-0.el9.x86_64 20/121 2026-03-10T07:45:08.670 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-cephfs-2:18.2.1-0.el9.x86_64 21/121 2026-03-10T07:45:08.670 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-rados-2:18.2.1-0.el9.x86_64 22/121 2026-03-10T07:45:08.670 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-rbd-2:18.2.1-0.el9.x86_64 23/121 2026-03-10T07:45:08.670 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-rgw-2:18.2.1-0.el9.x86_64 24/121 2026-03-10T07:45:08.670 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : rbd-fuse-2:18.2.1-0.el9.x86_64 25/121 2026-03-10T07:45:08.670 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : rbd-mirror-2:18.2.1-0.el9.x86_64 26/121 2026-03-10T07:45:08.670 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : rbd-nbd-2:18.2.1-0.el9.x86_64 27/121 2026-03-10T07:45:08.670 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : ceph-grafana-dashboards-2:18.2.1-0.el9.noarch 28/121 2026-03-10T07:45:08.670 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : ceph-mgr-cephadm-2:18.2.1-0.el9.noarch 29/121 2026-03-10T07:45:08.670 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : 
ceph-mgr-dashboard-2:18.2.1-0.el9.noarch 30/121 2026-03-10T07:45:08.670 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : ceph-mgr-diskprediction-local-2:18.2.1-0.el9.noa 31/121 2026-03-10T07:45:08.670 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : ceph-mgr-modules-core-2:18.2.1-0.el9.noarch 32/121 2026-03-10T07:45:08.672 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : ceph-mgr-rook-2:18.2.1-0.el9.noarch 33/121 2026-03-10T07:45:08.672 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : ceph-prometheus-alerts-2:18.2.1-0.el9.noarch 34/121 2026-03-10T07:45:08.672 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : cephadm-2:18.2.1-0.el9.noarch 35/121 2026-03-10T07:45:08.672 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : ledmon-libs-1.1.0-3.el9.x86_64 36/121 2026-03-10T07:45:08.672 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : libconfig-1.7.2-9.el9.x86_64 37/121 2026-03-10T07:45:08.672 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : libgfortran-11.5.0-14.el9.x86_64 38/121 2026-03-10T07:45:08.672 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : libquadmath-11.5.0-14.el9.x86_64 39/121 2026-03-10T07:45:08.672 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : mailcap-2.1.49-5.el9.noarch 40/121 2026-03-10T07:45:08.672 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-cffi-1.14.5-5.el9.x86_64 41/121 2026-03-10T07:45:08.672 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-cryptography-36.0.1-5.el9.x86_64 42/121 2026-03-10T07:45:08.672 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-ply-3.11-14.el9.noarch 43/121 2026-03-10T07:45:08.672 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-pycparser-2.20-6.el9.noarch 44/121 2026-03-10T07:45:08.672 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-requests-2.25.1-10.el9.noarch 45/121 2026-03-10T07:45:08.672 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-urllib3-1.26.5-7.el9.noarch 46/121 
2026-03-10T07:45:08.672 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : boost-program-options-1.75.0-13.el9.x86_64 47/121
2026-03-10T07:45:08.672 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : flexiblas-3.0.4-9.el9.x86_64 48/121
2026-03-10T07:45:08.672 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : flexiblas-netlib-3.0.4-9.el9.x86_64 49/121
2026-03-10T07:45:08.672 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : flexiblas-openblas-openmp-3.0.4-9.el9.x86_64 50/121
2026-03-10T07:45:08.672 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : libpmemobj-1.12.1-1.el9.x86_64 51/121
2026-03-10T07:45:08.672 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : librabbitmq-0.11.0-7.el9.x86_64 52/121
2026-03-10T07:45:08.672 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : librdkafka-1.6.1-102.el9.x86_64 53/121
2026-03-10T07:45:08.672 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : libstoragemgmt-1.10.1-1.el9.x86_64 54/121
2026-03-10T07:45:08.672 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : libxslt-1.1.34-12.el9.x86_64 55/121
2026-03-10T07:45:08.672 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : lttng-ust-2.12.0-6.el9.x86_64 56/121
2026-03-10T07:45:08.672 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : openblas-0.3.29-1.el9.x86_64 57/121
2026-03-10T07:45:08.672 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : openblas-openmp-0.3.29-1.el9.x86_64 58/121
2026-03-10T07:45:08.672 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-babel-2.9.1-2.el9.noarch 59/121
2026-03-10T07:45:08.672 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-devel-3.9.25-3.el9.x86_64 60/121
2026-03-10T07:45:08.672 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-jinja2-2.11.3-8.el9.noarch 61/121
2026-03-10T07:45:08.672 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-jmespath-1.0.1-1.el9.noarch 62/121
2026-03-10T07:45:08.672 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-libstoragemgmt-1.10.1-1.el9.x86_64 63/121
2026-03-10T07:45:08.672 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-mako-1.1.4-6.el9.noarch 64/121
2026-03-10T07:45:08.672 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-markupsafe-1.1.1-12.el9.x86_64 65/121
2026-03-10T07:45:08.672 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-numpy-1:1.23.5-2.el9.x86_64 66/121
2026-03-10T07:45:08.672 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-numpy-f2py-1:1.23.5-2.el9.x86_64 67/121
2026-03-10T07:45:08.672 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-pyasn1-0.4.8-7.el9.noarch 68/121
2026-03-10T07:45:08.672 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-pyasn1-modules-0.4.8-7.el9.noarch 69/121
2026-03-10T07:45:08.672 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-requests-oauthlib-1.3.0-12.el9.noarch 70/121
2026-03-10T07:45:08.672 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-scipy-1.9.3-2.el9.x86_64 71/121
2026-03-10T07:45:08.672 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-toml-0.10.2-6.el9.noarch 72/121
2026-03-10T07:45:08.672 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : socat-1.7.4.1-8.el9.x86_64 73/121
2026-03-10T07:45:08.672 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : xmlstarlet-1.6.1-20.el9.x86_64 74/121
2026-03-10T07:45:08.672 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : fmt-8.1.1-5.el9.x86_64 75/121
2026-03-10T07:45:08.672 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : gperftools-libs-2.9.1-3.el9.x86_64 76/121
2026-03-10T07:45:08.673 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : libarrow-9.0.0-15.el9.x86_64 77/121
2026-03-10T07:45:08.673 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : libarrow-doc-9.0.0-15.el9.noarch 78/121
2026-03-10T07:45:08.673 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : liboath-2.6.12-1.el9.x86_64 79/121
2026-03-10T07:45:08.673 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : libunwind-1.6.2-1.el9.x86_64 80/121
2026-03-10T07:45:08.673 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : parquet-libs-9.0.0-15.el9.x86_64 81/121
2026-03-10T07:45:08.673 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-asyncssh-2.13.2-5.el9.noarch 82/121
2026-03-10T07:45:08.673 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-autocommand-2.2.2-8.el9.noarch 83/121
2026-03-10T07:45:08.673 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-backports-tarfile-1.2.0-1.el9.noarch 84/121
2026-03-10T07:45:08.673 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-bcrypt-3.2.2-1.el9.x86_64 85/121
2026-03-10T07:45:08.673 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-cachetools-4.2.4-1.el9.noarch 86/121
2026-03-10T07:45:08.673 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-certifi-2023.05.07-4.el9.noarch 87/121
2026-03-10T07:45:08.673 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-cheroot-10.0.1-4.el9.noarch 88/121
2026-03-10T07:45:08.673 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-cherrypy-18.6.1-2.el9.noarch 89/121
2026-03-10T07:45:08.673 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-google-auth-1:2.45.0-1.el9.noarch 90/121
2026-03-10T07:45:08.673 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-jaraco-8.2.1-3.el9.noarch 91/121
2026-03-10T07:45:08.673 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-jaraco-classes-3.2.1-5.el9.noarch 92/121
2026-03-10T07:45:08.673 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-jaraco-collections-3.0.0-8.el9.noarch 93/121
2026-03-10T07:45:08.673 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-jaraco-context-6.0.1-3.el9.noarch 94/121
2026-03-10T07:45:08.673 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-jaraco-functools-3.5.0-2.el9.noarch 95/121
2026-03-10T07:45:08.673 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-jaraco-text-4.0.0-2.el9.noarch 96/121
2026-03-10T07:45:08.673 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-jwt+crypto-2.4.0-1.el9.noarch 97/121
2026-03-10T07:45:08.673 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-jwt-2.4.0-1.el9.noarch 98/121
2026-03-10T07:45:08.673 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-kubernetes-1:26.1.0-3.el9.noarch 99/121
2026-03-10T07:45:08.673 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-logutils-0.3.5-21.el9.noarch 100/121
2026-03-10T07:45:08.673 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-more-itertools-8.12.0-2.el9.noarch 101/121
2026-03-10T07:45:08.673 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-natsort-7.1.1-5.el9.noarch 102/121
2026-03-10T07:45:08.673 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-pecan-1.4.2-3.el9.noarch 103/121
2026-03-10T07:45:08.673 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-portend-3.1.0-2.el9.noarch 104/121
2026-03-10T07:45:08.673 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-pyOpenSSL-21.0.0-1.el9.noarch 105/121
2026-03-10T07:45:08.673 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-repoze-lru-0.7-16.el9.noarch 106/121
2026-03-10T07:45:08.673 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-routes-2.5.1-5.el9.noarch 107/121
2026-03-10T07:45:08.673 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-rsa-4.9-2.el9.noarch 108/121
2026-03-10T07:45:08.673 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-tempora-5.0.0-2.el9.noarch 109/121
2026-03-10T07:45:08.673 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-typing-extensions-4.15.0-1.el9.noarch 110/121
2026-03-10T07:45:08.674 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-webob-1.8.8-2.el9.noarch 111/121
2026-03-10T07:45:08.674 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-websocket-client-1.2.3-2.el9.noarch 112/121
2026-03-10T07:45:08.674 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-werkzeug-2.0.3-3.el9.1.noarch 113/121
2026-03-10T07:45:08.674 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-xmltodict-0.12.0-15.el9.noarch 114/121
2026-03-10T07:45:08.674 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-zc-lockfile-2.0-10.el9.noarch 115/121
2026-03-10T07:45:08.674 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : re2-1:20211101-20.el9.x86_64 116/121
2026-03-10T07:45:08.674 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : thrift-0.15.0-4.el9.x86_64 117/121
2026-03-10T07:45:08.674 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : librados2-2:18.2.1-0.el9.x86_64 118/121
2026-03-10T07:45:08.674 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : librados2-2:16.2.4-5.el9.x86_64 119/121
2026-03-10T07:45:08.674 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : librbd1-2:18.2.1-0.el9.x86_64 120/121
2026-03-10T07:45:08.779 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : librbd1-2:16.2.4-5.el9.x86_64 121/121
2026-03-10T07:45:08.780 INFO:teuthology.orchestra.run.vm05.stdout:
2026-03-10T07:45:08.780 INFO:teuthology.orchestra.run.vm05.stdout:Upgraded:
2026-03-10T07:45:08.780 INFO:teuthology.orchestra.run.vm05.stdout: librados2-2:18.2.1-0.el9.x86_64 librbd1-2:18.2.1-0.el9.x86_64
2026-03-10T07:45:08.780 INFO:teuthology.orchestra.run.vm05.stdout:Installed:
2026-03-10T07:45:08.780 INFO:teuthology.orchestra.run.vm05.stdout: boost-program-options-1.75.0-13.el9.x86_64
2026-03-10T07:45:08.780 INFO:teuthology.orchestra.run.vm05.stdout: ceph-2:18.2.1-0.el9.x86_64
2026-03-10T07:45:08.780 INFO:teuthology.orchestra.run.vm05.stdout: ceph-base-2:18.2.1-0.el9.x86_64
2026-03-10T07:45:08.780 INFO:teuthology.orchestra.run.vm05.stdout: ceph-common-2:18.2.1-0.el9.x86_64
2026-03-10T07:45:08.780 INFO:teuthology.orchestra.run.vm05.stdout: ceph-fuse-2:18.2.1-0.el9.x86_64
2026-03-10T07:45:08.780 INFO:teuthology.orchestra.run.vm05.stdout: ceph-grafana-dashboards-2:18.2.1-0.el9.noarch
2026-03-10T07:45:08.780 INFO:teuthology.orchestra.run.vm05.stdout: ceph-immutable-object-cache-2:18.2.1-0.el9.x86_64
2026-03-10T07:45:08.780 INFO:teuthology.orchestra.run.vm05.stdout: ceph-mds-2:18.2.1-0.el9.x86_64
2026-03-10T07:45:08.780 INFO:teuthology.orchestra.run.vm05.stdout: ceph-mgr-2:18.2.1-0.el9.x86_64
2026-03-10T07:45:08.780 INFO:teuthology.orchestra.run.vm05.stdout: ceph-mgr-cephadm-2:18.2.1-0.el9.noarch
2026-03-10T07:45:08.780 INFO:teuthology.orchestra.run.vm05.stdout: ceph-mgr-dashboard-2:18.2.1-0.el9.noarch
2026-03-10T07:45:08.780 INFO:teuthology.orchestra.run.vm05.stdout: ceph-mgr-diskprediction-local-2:18.2.1-0.el9.noarch
2026-03-10T07:45:08.780 INFO:teuthology.orchestra.run.vm05.stdout: ceph-mgr-modules-core-2:18.2.1-0.el9.noarch
2026-03-10T07:45:08.780 INFO:teuthology.orchestra.run.vm05.stdout: ceph-mgr-rook-2:18.2.1-0.el9.noarch
2026-03-10T07:45:08.780 INFO:teuthology.orchestra.run.vm05.stdout: ceph-mon-2:18.2.1-0.el9.x86_64
2026-03-10T07:45:08.780 INFO:teuthology.orchestra.run.vm05.stdout: ceph-osd-2:18.2.1-0.el9.x86_64
2026-03-10T07:45:08.780 INFO:teuthology.orchestra.run.vm05.stdout: ceph-prometheus-alerts-2:18.2.1-0.el9.noarch
2026-03-10T07:45:08.780 INFO:teuthology.orchestra.run.vm05.stdout: ceph-radosgw-2:18.2.1-0.el9.x86_64
2026-03-10T07:45:08.780 INFO:teuthology.orchestra.run.vm05.stdout: ceph-selinux-2:18.2.1-0.el9.x86_64
2026-03-10T07:45:08.780 INFO:teuthology.orchestra.run.vm05.stdout: ceph-test-2:18.2.1-0.el9.x86_64
2026-03-10T07:45:08.780 INFO:teuthology.orchestra.run.vm05.stdout: cephadm-2:18.2.1-0.el9.noarch
2026-03-10T07:45:08.780 INFO:teuthology.orchestra.run.vm05.stdout: flexiblas-3.0.4-9.el9.x86_64
2026-03-10T07:45:08.780 INFO:teuthology.orchestra.run.vm05.stdout: flexiblas-netlib-3.0.4-9.el9.x86_64
2026-03-10T07:45:08.780 INFO:teuthology.orchestra.run.vm05.stdout: flexiblas-openblas-openmp-3.0.4-9.el9.x86_64
2026-03-10T07:45:08.780 INFO:teuthology.orchestra.run.vm05.stdout: fmt-8.1.1-5.el9.x86_64
2026-03-10T07:45:08.780 INFO:teuthology.orchestra.run.vm05.stdout: gperftools-libs-2.9.1-3.el9.x86_64
2026-03-10T07:45:08.780 INFO:teuthology.orchestra.run.vm05.stdout: ledmon-libs-1.1.0-3.el9.x86_64
2026-03-10T07:45:08.780 INFO:teuthology.orchestra.run.vm05.stdout: libarrow-9.0.0-15.el9.x86_64
2026-03-10T07:45:08.780 INFO:teuthology.orchestra.run.vm05.stdout: libarrow-doc-9.0.0-15.el9.noarch
2026-03-10T07:45:08.780 INFO:teuthology.orchestra.run.vm05.stdout: libcephfs-devel-2:18.2.1-0.el9.x86_64
2026-03-10T07:45:08.780 INFO:teuthology.orchestra.run.vm05.stdout: libcephfs2-2:18.2.1-0.el9.x86_64
2026-03-10T07:45:08.780 INFO:teuthology.orchestra.run.vm05.stdout: libcephsqlite-2:18.2.1-0.el9.x86_64
2026-03-10T07:45:08.780 INFO:teuthology.orchestra.run.vm05.stdout: libconfig-1.7.2-9.el9.x86_64
2026-03-10T07:45:08.780 INFO:teuthology.orchestra.run.vm05.stdout: libgfortran-11.5.0-14.el9.x86_64
2026-03-10T07:45:08.780 INFO:teuthology.orchestra.run.vm05.stdout: liboath-2.6.12-1.el9.x86_64
2026-03-10T07:45:08.780 INFO:teuthology.orchestra.run.vm05.stdout: libpmemobj-1.12.1-1.el9.x86_64
2026-03-10T07:45:08.780 INFO:teuthology.orchestra.run.vm05.stdout: libquadmath-11.5.0-14.el9.x86_64
2026-03-10T07:45:08.780 INFO:teuthology.orchestra.run.vm05.stdout: librabbitmq-0.11.0-7.el9.x86_64
2026-03-10T07:45:08.780 INFO:teuthology.orchestra.run.vm05.stdout: librados-devel-2:18.2.1-0.el9.x86_64
2026-03-10T07:45:08.781 INFO:teuthology.orchestra.run.vm05.stdout: libradosstriper1-2:18.2.1-0.el9.x86_64
2026-03-10T07:45:08.781 INFO:teuthology.orchestra.run.vm05.stdout: librdkafka-1.6.1-102.el9.x86_64
2026-03-10T07:45:08.781 INFO:teuthology.orchestra.run.vm05.stdout: librgw2-2:18.2.1-0.el9.x86_64
2026-03-10T07:45:08.781 INFO:teuthology.orchestra.run.vm05.stdout: libstoragemgmt-1.10.1-1.el9.x86_64
2026-03-10T07:45:08.781 INFO:teuthology.orchestra.run.vm05.stdout: libunwind-1.6.2-1.el9.x86_64
2026-03-10T07:45:08.781 INFO:teuthology.orchestra.run.vm05.stdout: libxslt-1.1.34-12.el9.x86_64
2026-03-10T07:45:08.781 INFO:teuthology.orchestra.run.vm05.stdout: lttng-ust-2.12.0-6.el9.x86_64
2026-03-10T07:45:08.781 INFO:teuthology.orchestra.run.vm05.stdout: mailcap-2.1.49-5.el9.noarch
2026-03-10T07:45:08.781 INFO:teuthology.orchestra.run.vm05.stdout: openblas-0.3.29-1.el9.x86_64
2026-03-10T07:45:08.781 INFO:teuthology.orchestra.run.vm05.stdout: openblas-openmp-0.3.29-1.el9.x86_64
2026-03-10T07:45:08.781 INFO:teuthology.orchestra.run.vm05.stdout: parquet-libs-9.0.0-15.el9.x86_64
2026-03-10T07:45:08.781 INFO:teuthology.orchestra.run.vm05.stdout: python3-asyncssh-2.13.2-5.el9.noarch
2026-03-10T07:45:08.781 INFO:teuthology.orchestra.run.vm05.stdout: python3-autocommand-2.2.2-8.el9.noarch
2026-03-10T07:45:08.781 INFO:teuthology.orchestra.run.vm05.stdout: python3-babel-2.9.1-2.el9.noarch
2026-03-10T07:45:08.781 INFO:teuthology.orchestra.run.vm05.stdout: python3-backports-tarfile-1.2.0-1.el9.noarch
2026-03-10T07:45:08.781 INFO:teuthology.orchestra.run.vm05.stdout: python3-bcrypt-3.2.2-1.el9.x86_64
2026-03-10T07:45:08.781 INFO:teuthology.orchestra.run.vm05.stdout: python3-cachetools-4.2.4-1.el9.noarch
2026-03-10T07:45:08.781 INFO:teuthology.orchestra.run.vm05.stdout: python3-ceph-argparse-2:18.2.1-0.el9.x86_64
2026-03-10T07:45:08.781 INFO:teuthology.orchestra.run.vm05.stdout: python3-ceph-common-2:18.2.1-0.el9.x86_64
2026-03-10T07:45:08.781 INFO:teuthology.orchestra.run.vm05.stdout: python3-cephfs-2:18.2.1-0.el9.x86_64
2026-03-10T07:45:08.781 INFO:teuthology.orchestra.run.vm05.stdout: python3-certifi-2023.05.07-4.el9.noarch
2026-03-10T07:45:08.781 INFO:teuthology.orchestra.run.vm05.stdout: python3-cffi-1.14.5-5.el9.x86_64
2026-03-10T07:45:08.781 INFO:teuthology.orchestra.run.vm05.stdout: python3-cheroot-10.0.1-4.el9.noarch
2026-03-10T07:45:08.781 INFO:teuthology.orchestra.run.vm05.stdout: python3-cherrypy-18.6.1-2.el9.noarch
2026-03-10T07:45:08.781 INFO:teuthology.orchestra.run.vm05.stdout: python3-cryptography-36.0.1-5.el9.x86_64
2026-03-10T07:45:08.781 INFO:teuthology.orchestra.run.vm05.stdout: python3-devel-3.9.25-3.el9.x86_64
2026-03-10T07:45:08.781 INFO:teuthology.orchestra.run.vm05.stdout: python3-google-auth-1:2.45.0-1.el9.noarch
2026-03-10T07:45:08.781 INFO:teuthology.orchestra.run.vm05.stdout: python3-jaraco-8.2.1-3.el9.noarch
2026-03-10T07:45:08.781 INFO:teuthology.orchestra.run.vm05.stdout: python3-jaraco-classes-3.2.1-5.el9.noarch
2026-03-10T07:45:08.781 INFO:teuthology.orchestra.run.vm05.stdout: python3-jaraco-collections-3.0.0-8.el9.noarch
2026-03-10T07:45:08.781 INFO:teuthology.orchestra.run.vm05.stdout: python3-jaraco-context-6.0.1-3.el9.noarch
2026-03-10T07:45:08.781 INFO:teuthology.orchestra.run.vm05.stdout: python3-jaraco-functools-3.5.0-2.el9.noarch
2026-03-10T07:45:08.781 INFO:teuthology.orchestra.run.vm05.stdout: python3-jaraco-text-4.0.0-2.el9.noarch
2026-03-10T07:45:08.781 INFO:teuthology.orchestra.run.vm05.stdout: python3-jinja2-2.11.3-8.el9.noarch
2026-03-10T07:45:08.781 INFO:teuthology.orchestra.run.vm05.stdout: python3-jmespath-1.0.1-1.el9.noarch
2026-03-10T07:45:08.781 INFO:teuthology.orchestra.run.vm05.stdout: python3-jwt-2.4.0-1.el9.noarch
2026-03-10T07:45:08.781 INFO:teuthology.orchestra.run.vm05.stdout: python3-jwt+crypto-2.4.0-1.el9.noarch
2026-03-10T07:45:08.781 INFO:teuthology.orchestra.run.vm05.stdout: python3-kubernetes-1:26.1.0-3.el9.noarch
2026-03-10T07:45:08.781 INFO:teuthology.orchestra.run.vm05.stdout: python3-libstoragemgmt-1.10.1-1.el9.x86_64
2026-03-10T07:45:08.781 INFO:teuthology.orchestra.run.vm05.stdout: python3-logutils-0.3.5-21.el9.noarch
2026-03-10T07:45:08.781 INFO:teuthology.orchestra.run.vm05.stdout: python3-mako-1.1.4-6.el9.noarch
2026-03-10T07:45:08.781 INFO:teuthology.orchestra.run.vm05.stdout: python3-markupsafe-1.1.1-12.el9.x86_64
2026-03-10T07:45:08.781 INFO:teuthology.orchestra.run.vm05.stdout: python3-more-itertools-8.12.0-2.el9.noarch
2026-03-10T07:45:08.781 INFO:teuthology.orchestra.run.vm05.stdout: python3-natsort-7.1.1-5.el9.noarch
2026-03-10T07:45:08.781 INFO:teuthology.orchestra.run.vm05.stdout: python3-numpy-1:1.23.5-2.el9.x86_64
2026-03-10T07:45:08.781 INFO:teuthology.orchestra.run.vm05.stdout: python3-numpy-f2py-1:1.23.5-2.el9.x86_64
2026-03-10T07:45:08.781 INFO:teuthology.orchestra.run.vm05.stdout: python3-pecan-1.4.2-3.el9.noarch
2026-03-10T07:45:08.782 INFO:teuthology.orchestra.run.vm05.stdout: python3-ply-3.11-14.el9.noarch
2026-03-10T07:45:08.782 INFO:teuthology.orchestra.run.vm05.stdout: python3-portend-3.1.0-2.el9.noarch
2026-03-10T07:45:08.782 INFO:teuthology.orchestra.run.vm05.stdout: python3-pyOpenSSL-21.0.0-1.el9.noarch
2026-03-10T07:45:08.782 INFO:teuthology.orchestra.run.vm05.stdout: python3-pyasn1-0.4.8-7.el9.noarch
2026-03-10T07:45:08.782 INFO:teuthology.orchestra.run.vm05.stdout: python3-pyasn1-modules-0.4.8-7.el9.noarch
2026-03-10T07:45:08.782 INFO:teuthology.orchestra.run.vm05.stdout: python3-pycparser-2.20-6.el9.noarch
2026-03-10T07:45:08.782 INFO:teuthology.orchestra.run.vm05.stdout: python3-rados-2:18.2.1-0.el9.x86_64
2026-03-10T07:45:08.782 INFO:teuthology.orchestra.run.vm05.stdout: python3-rbd-2:18.2.1-0.el9.x86_64
2026-03-10T07:45:08.782 INFO:teuthology.orchestra.run.vm05.stdout: python3-repoze-lru-0.7-16.el9.noarch
2026-03-10T07:45:08.782 INFO:teuthology.orchestra.run.vm05.stdout: python3-requests-2.25.1-10.el9.noarch
2026-03-10T07:45:08.782 INFO:teuthology.orchestra.run.vm05.stdout: python3-requests-oauthlib-1.3.0-12.el9.noarch
2026-03-10T07:45:08.782 INFO:teuthology.orchestra.run.vm05.stdout: python3-rgw-2:18.2.1-0.el9.x86_64
2026-03-10T07:45:08.782 INFO:teuthology.orchestra.run.vm05.stdout: python3-routes-2.5.1-5.el9.noarch
2026-03-10T07:45:08.782 INFO:teuthology.orchestra.run.vm05.stdout: python3-rsa-4.9-2.el9.noarch
2026-03-10T07:45:08.782 INFO:teuthology.orchestra.run.vm05.stdout: python3-scipy-1.9.3-2.el9.x86_64
2026-03-10T07:45:08.782 INFO:teuthology.orchestra.run.vm05.stdout: python3-tempora-5.0.0-2.el9.noarch
2026-03-10T07:45:08.782 INFO:teuthology.orchestra.run.vm05.stdout: python3-toml-0.10.2-6.el9.noarch
2026-03-10T07:45:08.782 INFO:teuthology.orchestra.run.vm05.stdout: python3-typing-extensions-4.15.0-1.el9.noarch
2026-03-10T07:45:08.782 INFO:teuthology.orchestra.run.vm05.stdout: python3-urllib3-1.26.5-7.el9.noarch
2026-03-10T07:45:08.782 INFO:teuthology.orchestra.run.vm05.stdout: python3-webob-1.8.8-2.el9.noarch
2026-03-10T07:45:08.782 INFO:teuthology.orchestra.run.vm05.stdout: python3-websocket-client-1.2.3-2.el9.noarch
2026-03-10T07:45:08.782 INFO:teuthology.orchestra.run.vm05.stdout: python3-werkzeug-2.0.3-3.el9.1.noarch
2026-03-10T07:45:08.782 INFO:teuthology.orchestra.run.vm05.stdout: python3-xmltodict-0.12.0-15.el9.noarch
2026-03-10T07:45:08.782 INFO:teuthology.orchestra.run.vm05.stdout: python3-zc-lockfile-2.0-10.el9.noarch
2026-03-10T07:45:08.782 INFO:teuthology.orchestra.run.vm05.stdout: rbd-fuse-2:18.2.1-0.el9.x86_64
2026-03-10T07:45:08.782 INFO:teuthology.orchestra.run.vm05.stdout: rbd-mirror-2:18.2.1-0.el9.x86_64
2026-03-10T07:45:08.782 INFO:teuthology.orchestra.run.vm05.stdout: rbd-nbd-2:18.2.1-0.el9.x86_64
2026-03-10T07:45:08.782 INFO:teuthology.orchestra.run.vm05.stdout: re2-1:20211101-20.el9.x86_64
2026-03-10T07:45:08.782 INFO:teuthology.orchestra.run.vm05.stdout: socat-1.7.4.1-8.el9.x86_64
2026-03-10T07:45:08.782 INFO:teuthology.orchestra.run.vm05.stdout: thrift-0.15.0-4.el9.x86_64
2026-03-10T07:45:08.782 INFO:teuthology.orchestra.run.vm05.stdout: xmlstarlet-1.6.1-20.el9.x86_64
2026-03-10T07:45:08.782 INFO:teuthology.orchestra.run.vm05.stdout:
2026-03-10T07:45:08.782 INFO:teuthology.orchestra.run.vm05.stdout:Complete!
2026-03-10T07:45:08.872 DEBUG:teuthology.parallel:result is None
2026-03-10T07:45:10.662 INFO:teuthology.orchestra.run.vm08.stdout: Installing : ceph-test-2:18.2.1-0.el9.x86_64 113/121
2026-03-10T07:45:10.674 INFO:teuthology.orchestra.run.vm08.stdout: Installing : rbd-fuse-2:18.2.1-0.el9.x86_64 114/121
2026-03-10T07:45:10.678 INFO:teuthology.orchestra.run.vm08.stdout: Installing : rbd-nbd-2:18.2.1-0.el9.x86_64 115/121
2026-03-10T07:45:10.718 INFO:teuthology.orchestra.run.vm08.stdout: Installing : libcephfs-devel-2:18.2.1-0.el9.x86_64 116/121
2026-03-10T07:45:10.724 INFO:teuthology.orchestra.run.vm08.stdout: Installing : ceph-fuse-2:18.2.1-0.el9.x86_64 117/121
2026-03-10T07:45:10.733 INFO:teuthology.orchestra.run.vm08.stdout: Installing : python3-xmltodict-0.12.0-15.el9.noarch 118/121
2026-03-10T07:45:10.738 INFO:teuthology.orchestra.run.vm08.stdout: Installing : python3-jmespath-1.0.1-1.el9.noarch 119/121
2026-03-10T07:45:10.738 INFO:teuthology.orchestra.run.vm08.stdout: Cleanup : librbd1-2:16.2.4-5.el9.x86_64 120/121
2026-03-10T07:45:10.752 INFO:teuthology.orchestra.run.vm08.stdout: Running scriptlet: librbd1-2:16.2.4-5.el9.x86_64 120/121
2026-03-10T07:45:10.753 INFO:teuthology.orchestra.run.vm08.stdout: Cleanup : librados2-2:16.2.4-5.el9.x86_64 121/121
2026-03-10T07:45:11.894 INFO:teuthology.orchestra.run.vm08.stdout: Running scriptlet: librados2-2:16.2.4-5.el9.x86_64 121/121
2026-03-10T07:45:11.894 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : ceph-2:18.2.1-0.el9.x86_64 1/121
2026-03-10T07:45:11.894 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : ceph-base-2:18.2.1-0.el9.x86_64 2/121
2026-03-10T07:45:11.894 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : ceph-common-2:18.2.1-0.el9.x86_64 3/121
2026-03-10T07:45:11.894 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : ceph-fuse-2:18.2.1-0.el9.x86_64 4/121
2026-03-10T07:45:11.894 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : ceph-immutable-object-cache-2:18.2.1-0.el9.x86_6 5/121
2026-03-10T07:45:11.894 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : ceph-mds-2:18.2.1-0.el9.x86_64 6/121
2026-03-10T07:45:11.894 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : ceph-mgr-2:18.2.1-0.el9.x86_64 7/121
2026-03-10T07:45:11.894 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : ceph-mon-2:18.2.1-0.el9.x86_64 8/121
2026-03-10T07:45:11.894 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : ceph-osd-2:18.2.1-0.el9.x86_64 9/121
2026-03-10T07:45:11.894 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : ceph-radosgw-2:18.2.1-0.el9.x86_64 10/121
2026-03-10T07:45:11.894 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : ceph-selinux-2:18.2.1-0.el9.x86_64 11/121
2026-03-10T07:45:11.894 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : ceph-test-2:18.2.1-0.el9.x86_64 12/121
2026-03-10T07:45:11.894 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : libcephfs-devel-2:18.2.1-0.el9.x86_64 13/121
2026-03-10T07:45:11.894 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : libcephfs2-2:18.2.1-0.el9.x86_64 14/121
2026-03-10T07:45:11.894 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : libcephsqlite-2:18.2.1-0.el9.x86_64 15/121
2026-03-10T07:45:11.894 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : librados-devel-2:18.2.1-0.el9.x86_64 16/121
2026-03-10T07:45:11.894 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : libradosstriper1-2:18.2.1-0.el9.x86_64 17/121
2026-03-10T07:45:11.894 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : librgw2-2:18.2.1-0.el9.x86_64 18/121
2026-03-10T07:45:11.894 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-ceph-argparse-2:18.2.1-0.el9.x86_64 19/121
2026-03-10T07:45:11.895 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-ceph-common-2:18.2.1-0.el9.x86_64 20/121
2026-03-10T07:45:11.895 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-cephfs-2:18.2.1-0.el9.x86_64 21/121
2026-03-10T07:45:11.895 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-rados-2:18.2.1-0.el9.x86_64 22/121
2026-03-10T07:45:11.895 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-rbd-2:18.2.1-0.el9.x86_64 23/121
2026-03-10T07:45:11.895 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-rgw-2:18.2.1-0.el9.x86_64 24/121
2026-03-10T07:45:11.895 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : rbd-fuse-2:18.2.1-0.el9.x86_64 25/121
2026-03-10T07:45:11.895 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : rbd-mirror-2:18.2.1-0.el9.x86_64 26/121
2026-03-10T07:45:11.895 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : rbd-nbd-2:18.2.1-0.el9.x86_64 27/121
2026-03-10T07:45:11.895 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : ceph-grafana-dashboards-2:18.2.1-0.el9.noarch 28/121
2026-03-10T07:45:11.896 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : ceph-mgr-cephadm-2:18.2.1-0.el9.noarch 29/121
2026-03-10T07:45:11.896 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : ceph-mgr-dashboard-2:18.2.1-0.el9.noarch 30/121
2026-03-10T07:45:11.896 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : ceph-mgr-diskprediction-local-2:18.2.1-0.el9.noa 31/121
2026-03-10T07:45:11.896 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : ceph-mgr-modules-core-2:18.2.1-0.el9.noarch 32/121
2026-03-10T07:45:11.896 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : ceph-mgr-rook-2:18.2.1-0.el9.noarch 33/121
2026-03-10T07:45:11.896 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : ceph-prometheus-alerts-2:18.2.1-0.el9.noarch 34/121
2026-03-10T07:45:11.896 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : cephadm-2:18.2.1-0.el9.noarch 35/121
2026-03-10T07:45:11.896 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : ledmon-libs-1.1.0-3.el9.x86_64 36/121
2026-03-10T07:45:11.896 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : libconfig-1.7.2-9.el9.x86_64 37/121
2026-03-10T07:45:11.896 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : libgfortran-11.5.0-14.el9.x86_64 38/121
2026-03-10T07:45:11.896 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : libquadmath-11.5.0-14.el9.x86_64 39/121
2026-03-10T07:45:11.896 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : mailcap-2.1.49-5.el9.noarch 40/121
2026-03-10T07:45:11.896 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-cffi-1.14.5-5.el9.x86_64 41/121
2026-03-10T07:45:11.896 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-cryptography-36.0.1-5.el9.x86_64 42/121
2026-03-10T07:45:11.896 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-ply-3.11-14.el9.noarch 43/121
2026-03-10T07:45:11.896 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-pycparser-2.20-6.el9.noarch 44/121
2026-03-10T07:45:11.896 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-requests-2.25.1-10.el9.noarch 45/121
2026-03-10T07:45:11.896 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-urllib3-1.26.5-7.el9.noarch 46/121
2026-03-10T07:45:11.896 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : boost-program-options-1.75.0-13.el9.x86_64 47/121
2026-03-10T07:45:11.896 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : flexiblas-3.0.4-9.el9.x86_64 48/121
2026-03-10T07:45:11.896 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : flexiblas-netlib-3.0.4-9.el9.x86_64 49/121
2026-03-10T07:45:11.896 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : flexiblas-openblas-openmp-3.0.4-9.el9.x86_64 50/121
2026-03-10T07:45:11.896 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : libpmemobj-1.12.1-1.el9.x86_64 51/121
2026-03-10T07:45:11.896 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : librabbitmq-0.11.0-7.el9.x86_64 52/121
2026-03-10T07:45:11.896 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : librdkafka-1.6.1-102.el9.x86_64 53/121
2026-03-10T07:45:11.896 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : libstoragemgmt-1.10.1-1.el9.x86_64 54/121
2026-03-10T07:45:11.896 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : libxslt-1.1.34-12.el9.x86_64 55/121
2026-03-10T07:45:11.896 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : lttng-ust-2.12.0-6.el9.x86_64 56/121
2026-03-10T07:45:11.896 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : openblas-0.3.29-1.el9.x86_64 57/121
2026-03-10T07:45:11.896 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : openblas-openmp-0.3.29-1.el9.x86_64 58/121
2026-03-10T07:45:11.896 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-babel-2.9.1-2.el9.noarch 59/121
2026-03-10T07:45:11.896 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-devel-3.9.25-3.el9.x86_64 60/121
2026-03-10T07:45:11.896 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-jinja2-2.11.3-8.el9.noarch 61/121
2026-03-10T07:45:11.896 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-jmespath-1.0.1-1.el9.noarch 62/121
2026-03-10T07:45:11.896 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-libstoragemgmt-1.10.1-1.el9.x86_64 63/121
2026-03-10T07:45:11.896 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-mako-1.1.4-6.el9.noarch 64/121
2026-03-10T07:45:11.896 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-markupsafe-1.1.1-12.el9.x86_64 65/121
2026-03-10T07:45:11.896 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-numpy-1:1.23.5-2.el9.x86_64 66/121
2026-03-10T07:45:11.896 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-numpy-f2py-1:1.23.5-2.el9.x86_64 67/121
2026-03-10T07:45:11.897 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-pyasn1-0.4.8-7.el9.noarch 68/121
2026-03-10T07:45:11.897 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-pyasn1-modules-0.4.8-7.el9.noarch 69/121
2026-03-10T07:45:11.897 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-requests-oauthlib-1.3.0-12.el9.noarch 70/121
2026-03-10T07:45:11.897 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-scipy-1.9.3-2.el9.x86_64 71/121
2026-03-10T07:45:11.897 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-toml-0.10.2-6.el9.noarch 72/121
2026-03-10T07:45:11.897 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : socat-1.7.4.1-8.el9.x86_64 73/121
2026-03-10T07:45:11.897 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : xmlstarlet-1.6.1-20.el9.x86_64 74/121
2026-03-10T07:45:11.897 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : fmt-8.1.1-5.el9.x86_64 75/121
2026-03-10T07:45:11.897 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : gperftools-libs-2.9.1-3.el9.x86_64 76/121
2026-03-10T07:45:11.897 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : libarrow-9.0.0-15.el9.x86_64 77/121
2026-03-10T07:45:11.897 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : libarrow-doc-9.0.0-15.el9.noarch 78/121
2026-03-10T07:45:11.897 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : liboath-2.6.12-1.el9.x86_64 79/121
2026-03-10T07:45:11.897 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : libunwind-1.6.2-1.el9.x86_64 80/121
2026-03-10T07:45:11.897 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : parquet-libs-9.0.0-15.el9.x86_64 81/121
2026-03-10T07:45:11.897 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-asyncssh-2.13.2-5.el9.noarch 82/121
2026-03-10T07:45:11.897 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-autocommand-2.2.2-8.el9.noarch 83/121
2026-03-10T07:45:11.897 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-backports-tarfile-1.2.0-1.el9.noarch 84/121
2026-03-10T07:45:11.897 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-bcrypt-3.2.2-1.el9.x86_64 85/121
2026-03-10T07:45:11.897 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-cachetools-4.2.4-1.el9.noarch 86/121
2026-03-10T07:45:11.897 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-certifi-2023.05.07-4.el9.noarch 87/121
2026-03-10T07:45:11.897 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-cheroot-10.0.1-4.el9.noarch 88/121
2026-03-10T07:45:11.897 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-cherrypy-18.6.1-2.el9.noarch 89/121
2026-03-10T07:45:11.897 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-google-auth-1:2.45.0-1.el9.noarch 90/121
2026-03-10T07:45:11.897 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-jaraco-8.2.1-3.el9.noarch 91/121
2026-03-10T07:45:11.897 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-jaraco-classes-3.2.1-5.el9.noarch 92/121
2026-03-10T07:45:11.897 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-jaraco-collections-3.0.0-8.el9.noarch 93/121
2026-03-10T07:45:11.897 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-jaraco-context-6.0.1-3.el9.noarch 94/121
2026-03-10T07:45:11.897 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-jaraco-functools-3.5.0-2.el9.noarch 95/121
2026-03-10T07:45:11.897 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-jaraco-text-4.0.0-2.el9.noarch 96/121
2026-03-10T07:45:11.897 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-jwt+crypto-2.4.0-1.el9.noarch 97/121
2026-03-10T07:45:11.897 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-jwt-2.4.0-1.el9.noarch 98/121
2026-03-10T07:45:11.897 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-kubernetes-1:26.1.0-3.el9.noarch 99/121
2026-03-10T07:45:11.897 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-logutils-0.3.5-21.el9.noarch 100/121
2026-03-10T07:45:11.897 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-more-itertools-8.12.0-2.el9.noarch 101/121
2026-03-10T07:45:11.897 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-natsort-7.1.1-5.el9.noarch 102/121
2026-03-10T07:45:11.897 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-pecan-1.4.2-3.el9.noarch 103/121
2026-03-10T07:45:11.898 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-portend-3.1.0-2.el9.noarch 104/121
2026-03-10T07:45:11.898 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-pyOpenSSL-21.0.0-1.el9.noarch 105/121
2026-03-10T07:45:11.898 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-repoze-lru-0.7-16.el9.noarch 106/121
2026-03-10T07:45:11.898 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-routes-2.5.1-5.el9.noarch 107/121
2026-03-10T07:45:11.898 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-rsa-4.9-2.el9.noarch 108/121
2026-03-10T07:45:11.898 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-tempora-5.0.0-2.el9.noarch 109/121
2026-03-10T07:45:11.898 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-typing-extensions-4.15.0-1.el9.noarch 110/121
2026-03-10T07:45:11.898 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-webob-1.8.8-2.el9.noarch 111/121
2026-03-10T07:45:11.898 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-websocket-client-1.2.3-2.el9.noarch 112/121
2026-03-10T07:45:11.898 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-werkzeug-2.0.3-3.el9.1.noarch 113/121
2026-03-10T07:45:11.898 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-xmltodict-0.12.0-15.el9.noarch 114/121
2026-03-10T07:45:11.898 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-zc-lockfile-2.0-10.el9.noarch 115/121
2026-03-10T07:45:11.898 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : re2-1:20211101-20.el9.x86_64 116/121
2026-03-10T07:45:11.898 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : thrift-0.15.0-4.el9.x86_64 117/121
2026-03-10T07:45:11.898 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : librados2-2:18.2.1-0.el9.x86_64 118/121
2026-03-10T07:45:11.898 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : librados2-2:16.2.4-5.el9.x86_64 119/121
2026-03-10T07:45:11.898 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : librbd1-2:18.2.1-0.el9.x86_64 120/121
2026-03-10T07:45:11.994 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : librbd1-2:16.2.4-5.el9.x86_64 121/121
2026-03-10T07:45:11.995 INFO:teuthology.orchestra.run.vm08.stdout:
2026-03-10T07:45:11.995 INFO:teuthology.orchestra.run.vm08.stdout:Upgraded:
2026-03-10T07:45:11.995 INFO:teuthology.orchestra.run.vm08.stdout: librados2-2:18.2.1-0.el9.x86_64 librbd1-2:18.2.1-0.el9.x86_64
2026-03-10T07:45:11.995 INFO:teuthology.orchestra.run.vm08.stdout:Installed:
2026-03-10T07:45:11.995 INFO:teuthology.orchestra.run.vm08.stdout: boost-program-options-1.75.0-13.el9.x86_64
2026-03-10T07:45:11.995 INFO:teuthology.orchestra.run.vm08.stdout: ceph-2:18.2.1-0.el9.x86_64
2026-03-10T07:45:11.995 INFO:teuthology.orchestra.run.vm08.stdout: ceph-base-2:18.2.1-0.el9.x86_64
2026-03-10T07:45:11.995 INFO:teuthology.orchestra.run.vm08.stdout: ceph-common-2:18.2.1-0.el9.x86_64
2026-03-10T07:45:11.995 INFO:teuthology.orchestra.run.vm08.stdout: ceph-fuse-2:18.2.1-0.el9.x86_64
2026-03-10T07:45:11.995 INFO:teuthology.orchestra.run.vm08.stdout: ceph-grafana-dashboards-2:18.2.1-0.el9.noarch
2026-03-10T07:45:11.995 INFO:teuthology.orchestra.run.vm08.stdout: ceph-immutable-object-cache-2:18.2.1-0.el9.x86_64
2026-03-10T07:45:11.995 INFO:teuthology.orchestra.run.vm08.stdout: ceph-mds-2:18.2.1-0.el9.x86_64
2026-03-10T07:45:11.995 INFO:teuthology.orchestra.run.vm08.stdout: ceph-mgr-2:18.2.1-0.el9.x86_64
2026-03-10T07:45:11.995 INFO:teuthology.orchestra.run.vm08.stdout: ceph-mgr-cephadm-2:18.2.1-0.el9.noarch
2026-03-10T07:45:11.995 INFO:teuthology.orchestra.run.vm08.stdout: ceph-mgr-dashboard-2:18.2.1-0.el9.noarch
2026-03-10T07:45:11.995 INFO:teuthology.orchestra.run.vm08.stdout: ceph-mgr-diskprediction-local-2:18.2.1-0.el9.noarch
2026-03-10T07:45:11.995 INFO:teuthology.orchestra.run.vm08.stdout: ceph-mgr-modules-core-2:18.2.1-0.el9.noarch
2026-03-10T07:45:11.995 INFO:teuthology.orchestra.run.vm08.stdout: ceph-mgr-rook-2:18.2.1-0.el9.noarch
2026-03-10T07:45:11.995 INFO:teuthology.orchestra.run.vm08.stdout: ceph-mon-2:18.2.1-0.el9.x86_64
2026-03-10T07:45:11.995 INFO:teuthology.orchestra.run.vm08.stdout: ceph-osd-2:18.2.1-0.el9.x86_64
2026-03-10T07:45:11.995 INFO:teuthology.orchestra.run.vm08.stdout: ceph-prometheus-alerts-2:18.2.1-0.el9.noarch
2026-03-10T07:45:11.995 INFO:teuthology.orchestra.run.vm08.stdout: ceph-radosgw-2:18.2.1-0.el9.x86_64
2026-03-10T07:45:11.995 INFO:teuthology.orchestra.run.vm08.stdout: ceph-selinux-2:18.2.1-0.el9.x86_64
2026-03-10T07:45:11.995 INFO:teuthology.orchestra.run.vm08.stdout: ceph-test-2:18.2.1-0.el9.x86_64
2026-03-10T07:45:11.995 INFO:teuthology.orchestra.run.vm08.stdout: cephadm-2:18.2.1-0.el9.noarch
2026-03-10T07:45:11.995 INFO:teuthology.orchestra.run.vm08.stdout: flexiblas-3.0.4-9.el9.x86_64
2026-03-10T07:45:11.995 INFO:teuthology.orchestra.run.vm08.stdout: flexiblas-netlib-3.0.4-9.el9.x86_64
2026-03-10T07:45:11.995 INFO:teuthology.orchestra.run.vm08.stdout: flexiblas-openblas-openmp-3.0.4-9.el9.x86_64
2026-03-10T07:45:11.995 INFO:teuthology.orchestra.run.vm08.stdout: fmt-8.1.1-5.el9.x86_64
2026-03-10T07:45:11.995 INFO:teuthology.orchestra.run.vm08.stdout: gperftools-libs-2.9.1-3.el9.x86_64
2026-03-10T07:45:11.995 INFO:teuthology.orchestra.run.vm08.stdout: ledmon-libs-1.1.0-3.el9.x86_64
2026-03-10T07:45:11.995 INFO:teuthology.orchestra.run.vm08.stdout: libarrow-9.0.0-15.el9.x86_64
2026-03-10T07:45:11.995 INFO:teuthology.orchestra.run.vm08.stdout: libarrow-doc-9.0.0-15.el9.noarch
2026-03-10T07:45:11.995 INFO:teuthology.orchestra.run.vm08.stdout: libcephfs-devel-2:18.2.1-0.el9.x86_64
2026-03-10T07:45:11.995 INFO:teuthology.orchestra.run.vm08.stdout: libcephfs2-2:18.2.1-0.el9.x86_64
2026-03-10T07:45:11.995 INFO:teuthology.orchestra.run.vm08.stdout: libcephsqlite-2:18.2.1-0.el9.x86_64
2026-03-10T07:45:11.995 INFO:teuthology.orchestra.run.vm08.stdout: libconfig-1.7.2-9.el9.x86_64
2026-03-10T07:45:11.995 INFO:teuthology.orchestra.run.vm08.stdout: libgfortran-11.5.0-14.el9.x86_64
2026-03-10T07:45:11.996 INFO:teuthology.orchestra.run.vm08.stdout: liboath-2.6.12-1.el9.x86_64
2026-03-10T07:45:11.996 INFO:teuthology.orchestra.run.vm08.stdout: libpmemobj-1.12.1-1.el9.x86_64
2026-03-10T07:45:11.996 INFO:teuthology.orchestra.run.vm08.stdout: libquadmath-11.5.0-14.el9.x86_64
2026-03-10T07:45:11.996 INFO:teuthology.orchestra.run.vm08.stdout: librabbitmq-0.11.0-7.el9.x86_64
2026-03-10T07:45:11.996 INFO:teuthology.orchestra.run.vm08.stdout: librados-devel-2:18.2.1-0.el9.x86_64
2026-03-10T07:45:11.996 INFO:teuthology.orchestra.run.vm08.stdout: libradosstriper1-2:18.2.1-0.el9.x86_64
2026-03-10T07:45:11.996 INFO:teuthology.orchestra.run.vm08.stdout: librdkafka-1.6.1-102.el9.x86_64
2026-03-10T07:45:11.996 INFO:teuthology.orchestra.run.vm08.stdout: librgw2-2:18.2.1-0.el9.x86_64
2026-03-10T07:45:11.996 INFO:teuthology.orchestra.run.vm08.stdout: libstoragemgmt-1.10.1-1.el9.x86_64
2026-03-10T07:45:11.996 INFO:teuthology.orchestra.run.vm08.stdout: libunwind-1.6.2-1.el9.x86_64
2026-03-10T07:45:11.996 INFO:teuthology.orchestra.run.vm08.stdout: libxslt-1.1.34-12.el9.x86_64
2026-03-10T07:45:11.996 INFO:teuthology.orchestra.run.vm08.stdout: lttng-ust-2.12.0-6.el9.x86_64
2026-03-10T07:45:11.996 INFO:teuthology.orchestra.run.vm08.stdout: mailcap-2.1.49-5.el9.noarch
2026-03-10T07:45:11.996 INFO:teuthology.orchestra.run.vm08.stdout: openblas-0.3.29-1.el9.x86_64
2026-03-10T07:45:11.996 INFO:teuthology.orchestra.run.vm08.stdout: openblas-openmp-0.3.29-1.el9.x86_64
2026-03-10T07:45:11.996 INFO:teuthology.orchestra.run.vm08.stdout: parquet-libs-9.0.0-15.el9.x86_64
2026-03-10T07:45:11.996 INFO:teuthology.orchestra.run.vm08.stdout: python3-asyncssh-2.13.2-5.el9.noarch
2026-03-10T07:45:11.996 INFO:teuthology.orchestra.run.vm08.stdout: python3-autocommand-2.2.2-8.el9.noarch
2026-03-10T07:45:11.996 INFO:teuthology.orchestra.run.vm08.stdout: python3-babel-2.9.1-2.el9.noarch
2026-03-10T07:45:11.996 INFO:teuthology.orchestra.run.vm08.stdout: python3-backports-tarfile-1.2.0-1.el9.noarch
2026-03-10T07:45:11.996 INFO:teuthology.orchestra.run.vm08.stdout: python3-bcrypt-3.2.2-1.el9.x86_64
2026-03-10T07:45:11.996 INFO:teuthology.orchestra.run.vm08.stdout: python3-cachetools-4.2.4-1.el9.noarch
2026-03-10T07:45:11.996 INFO:teuthology.orchestra.run.vm08.stdout: python3-ceph-argparse-2:18.2.1-0.el9.x86_64
2026-03-10T07:45:11.996 INFO:teuthology.orchestra.run.vm08.stdout: python3-ceph-common-2:18.2.1-0.el9.x86_64
2026-03-10T07:45:11.996 INFO:teuthology.orchestra.run.vm08.stdout: python3-cephfs-2:18.2.1-0.el9.x86_64
2026-03-10T07:45:11.996 INFO:teuthology.orchestra.run.vm08.stdout: python3-certifi-2023.05.07-4.el9.noarch
2026-03-10T07:45:11.996 INFO:teuthology.orchestra.run.vm08.stdout: python3-cffi-1.14.5-5.el9.x86_64
2026-03-10T07:45:11.996 INFO:teuthology.orchestra.run.vm08.stdout: python3-cheroot-10.0.1-4.el9.noarch
2026-03-10T07:45:11.996 INFO:teuthology.orchestra.run.vm08.stdout: python3-cherrypy-18.6.1-2.el9.noarch
2026-03-10T07:45:11.996 INFO:teuthology.orchestra.run.vm08.stdout: python3-cryptography-36.0.1-5.el9.x86_64
2026-03-10T07:45:11.996 INFO:teuthology.orchestra.run.vm08.stdout: python3-devel-3.9.25-3.el9.x86_64
2026-03-10T07:45:11.996 INFO:teuthology.orchestra.run.vm08.stdout: python3-google-auth-1:2.45.0-1.el9.noarch
2026-03-10T07:45:11.996 INFO:teuthology.orchestra.run.vm08.stdout: python3-jaraco-8.2.1-3.el9.noarch
2026-03-10T07:45:11.996 INFO:teuthology.orchestra.run.vm08.stdout: python3-jaraco-classes-3.2.1-5.el9.noarch
2026-03-10T07:45:11.996 INFO:teuthology.orchestra.run.vm08.stdout: python3-jaraco-collections-3.0.0-8.el9.noarch
2026-03-10T07:45:11.996 INFO:teuthology.orchestra.run.vm08.stdout: python3-jaraco-context-6.0.1-3.el9.noarch
2026-03-10T07:45:11.996 INFO:teuthology.orchestra.run.vm08.stdout: python3-jaraco-functools-3.5.0-2.el9.noarch
2026-03-10T07:45:11.996 INFO:teuthology.orchestra.run.vm08.stdout: python3-jaraco-text-4.0.0-2.el9.noarch
2026-03-10T07:45:11.996 INFO:teuthology.orchestra.run.vm08.stdout: python3-jinja2-2.11.3-8.el9.noarch
2026-03-10T07:45:11.996 INFO:teuthology.orchestra.run.vm08.stdout: python3-jmespath-1.0.1-1.el9.noarch
2026-03-10T07:45:11.997 INFO:teuthology.orchestra.run.vm08.stdout: python3-jwt-2.4.0-1.el9.noarch
2026-03-10T07:45:11.997 INFO:teuthology.orchestra.run.vm08.stdout: python3-jwt+crypto-2.4.0-1.el9.noarch
2026-03-10T07:45:11.997 INFO:teuthology.orchestra.run.vm08.stdout: python3-kubernetes-1:26.1.0-3.el9.noarch
2026-03-10T07:45:11.997 INFO:teuthology.orchestra.run.vm08.stdout: python3-libstoragemgmt-1.10.1-1.el9.x86_64
2026-03-10T07:45:11.997 INFO:teuthology.orchestra.run.vm08.stdout: python3-logutils-0.3.5-21.el9.noarch
2026-03-10T07:45:11.997 INFO:teuthology.orchestra.run.vm08.stdout: python3-mako-1.1.4-6.el9.noarch
2026-03-10T07:45:11.997 INFO:teuthology.orchestra.run.vm08.stdout: python3-markupsafe-1.1.1-12.el9.x86_64
2026-03-10T07:45:11.997 INFO:teuthology.orchestra.run.vm08.stdout: python3-more-itertools-8.12.0-2.el9.noarch
2026-03-10T07:45:11.997 INFO:teuthology.orchestra.run.vm08.stdout: python3-natsort-7.1.1-5.el9.noarch
2026-03-10T07:45:11.997 INFO:teuthology.orchestra.run.vm08.stdout: python3-numpy-1:1.23.5-2.el9.x86_64
2026-03-10T07:45:11.997 INFO:teuthology.orchestra.run.vm08.stdout: python3-numpy-f2py-1:1.23.5-2.el9.x86_64
2026-03-10T07:45:11.997 INFO:teuthology.orchestra.run.vm08.stdout: python3-pecan-1.4.2-3.el9.noarch
2026-03-10T07:45:11.997 INFO:teuthology.orchestra.run.vm08.stdout: python3-ply-3.11-14.el9.noarch
2026-03-10T07:45:11.997 INFO:teuthology.orchestra.run.vm08.stdout: python3-portend-3.1.0-2.el9.noarch
2026-03-10T07:45:11.997 INFO:teuthology.orchestra.run.vm08.stdout: python3-pyOpenSSL-21.0.0-1.el9.noarch
2026-03-10T07:45:11.997 INFO:teuthology.orchestra.run.vm08.stdout: python3-pyasn1-0.4.8-7.el9.noarch
2026-03-10T07:45:11.997 INFO:teuthology.orchestra.run.vm08.stdout: python3-pyasn1-modules-0.4.8-7.el9.noarch
2026-03-10T07:45:11.997 INFO:teuthology.orchestra.run.vm08.stdout: python3-pycparser-2.20-6.el9.noarch
2026-03-10T07:45:11.997 INFO:teuthology.orchestra.run.vm08.stdout: python3-rados-2:18.2.1-0.el9.x86_64
2026-03-10T07:45:11.997 INFO:teuthology.orchestra.run.vm08.stdout: python3-rbd-2:18.2.1-0.el9.x86_64
2026-03-10T07:45:11.997 INFO:teuthology.orchestra.run.vm08.stdout: python3-repoze-lru-0.7-16.el9.noarch
2026-03-10T07:45:11.997 INFO:teuthology.orchestra.run.vm08.stdout: python3-requests-2.25.1-10.el9.noarch
2026-03-10T07:45:11.997 INFO:teuthology.orchestra.run.vm08.stdout: python3-requests-oauthlib-1.3.0-12.el9.noarch
2026-03-10T07:45:11.997 INFO:teuthology.orchestra.run.vm08.stdout: python3-rgw-2:18.2.1-0.el9.x86_64
2026-03-10T07:45:11.997 INFO:teuthology.orchestra.run.vm08.stdout: python3-routes-2.5.1-5.el9.noarch
2026-03-10T07:45:11.997 INFO:teuthology.orchestra.run.vm08.stdout: python3-rsa-4.9-2.el9.noarch
2026-03-10T07:45:11.997 INFO:teuthology.orchestra.run.vm08.stdout: python3-scipy-1.9.3-2.el9.x86_64
2026-03-10T07:45:11.997 INFO:teuthology.orchestra.run.vm08.stdout: python3-tempora-5.0.0-2.el9.noarch
2026-03-10T07:45:11.997 INFO:teuthology.orchestra.run.vm08.stdout: python3-toml-0.10.2-6.el9.noarch
2026-03-10T07:45:11.997 INFO:teuthology.orchestra.run.vm08.stdout: python3-typing-extensions-4.15.0-1.el9.noarch
2026-03-10T07:45:11.997 INFO:teuthology.orchestra.run.vm08.stdout: python3-urllib3-1.26.5-7.el9.noarch
2026-03-10T07:45:11.997 INFO:teuthology.orchestra.run.vm08.stdout: python3-webob-1.8.8-2.el9.noarch
2026-03-10T07:45:11.997 INFO:teuthology.orchestra.run.vm08.stdout: python3-websocket-client-1.2.3-2.el9.noarch
2026-03-10T07:45:11.997 INFO:teuthology.orchestra.run.vm08.stdout: python3-werkzeug-2.0.3-3.el9.1.noarch
2026-03-10T07:45:11.997 INFO:teuthology.orchestra.run.vm08.stdout: python3-xmltodict-0.12.0-15.el9.noarch
2026-03-10T07:45:11.997 INFO:teuthology.orchestra.run.vm08.stdout: python3-zc-lockfile-2.0-10.el9.noarch
2026-03-10T07:45:11.997 INFO:teuthology.orchestra.run.vm08.stdout: rbd-fuse-2:18.2.1-0.el9.x86_64
2026-03-10T07:45:11.997 INFO:teuthology.orchestra.run.vm08.stdout: rbd-mirror-2:18.2.1-0.el9.x86_64
2026-03-10T07:45:11.997 INFO:teuthology.orchestra.run.vm08.stdout: rbd-nbd-2:18.2.1-0.el9.x86_64
2026-03-10T07:45:11.997 INFO:teuthology.orchestra.run.vm08.stdout: re2-1:20211101-20.el9.x86_64
2026-03-10T07:45:11.997 INFO:teuthology.orchestra.run.vm08.stdout: socat-1.7.4.1-8.el9.x86_64
2026-03-10T07:45:11.997 INFO:teuthology.orchestra.run.vm08.stdout: thrift-0.15.0-4.el9.x86_64
2026-03-10T07:45:11.998 INFO:teuthology.orchestra.run.vm08.stdout: xmlstarlet-1.6.1-20.el9.x86_64
2026-03-10T07:45:11.998 INFO:teuthology.orchestra.run.vm08.stdout:
2026-03-10T07:45:11.998 INFO:teuthology.orchestra.run.vm08.stdout:Complete!
2026-03-10T07:45:12.078 DEBUG:teuthology.parallel:result is None
2026-03-10T07:45:12.079 WARNING:teuthology.packaging:More than one of ref, tag, branch, or sha1 supplied; using tag
2026-03-10T07:45:12.079 INFO:teuthology.packaging:ref: None
2026-03-10T07:45:12.079 INFO:teuthology.packaging:tag: v18.2.1
2026-03-10T07:45:12.079 INFO:teuthology.packaging:branch: None
2026-03-10T07:45:12.079 INFO:teuthology.packaging:sha1: e911bdebe5c8faa3800735d1568fcdca65db60df
2026-03-10T07:45:12.079 DEBUG:teuthology.packaging:Querying https://shaman.ceph.com/api/search?status=ready&project=ceph&flavor=default&distros=centos%2F9%2Fx86_64&sha1=7fe91d5d5842e04be3b4f514d6dd990c54b29c76
2026-03-10T07:45:12.742 DEBUG:teuthology.orchestra.run.vm05:> rpm -q ceph --qf '%{VERSION}-%{RELEASE}'
2026-03-10T07:45:12.761 INFO:teuthology.orchestra.run.vm05.stdout:18.2.1-0.el9
2026-03-10T07:45:12.761 INFO:teuthology.packaging:The installed version of ceph is 18.2.1-0.el9
2026-03-10T07:45:12.761 INFO:teuthology.task.install:The correct ceph version 18.2.1-0 is installed.
2026-03-10T07:45:12.762 WARNING:teuthology.packaging:More than one of ref, tag, branch, or sha1 supplied; using tag
2026-03-10T07:45:12.762 INFO:teuthology.packaging:ref: None
2026-03-10T07:45:12.762 INFO:teuthology.packaging:tag: v18.2.1
2026-03-10T07:45:12.762 INFO:teuthology.packaging:branch: None
2026-03-10T07:45:12.762 INFO:teuthology.packaging:sha1: e911bdebe5c8faa3800735d1568fcdca65db60df
2026-03-10T07:45:12.762 DEBUG:teuthology.packaging:Querying https://shaman.ceph.com/api/search?status=ready&project=ceph&flavor=default&distros=centos%2F9%2Fx86_64&sha1=7fe91d5d5842e04be3b4f514d6dd990c54b29c76
2026-03-10T07:45:13.348 DEBUG:teuthology.orchestra.run.vm08:> rpm -q ceph --qf '%{VERSION}-%{RELEASE}'
2026-03-10T07:45:13.368 INFO:teuthology.orchestra.run.vm08.stdout:18.2.1-0.el9
2026-03-10T07:45:13.369 INFO:teuthology.packaging:The installed version of ceph is 18.2.1-0.el9
2026-03-10T07:45:13.369 INFO:teuthology.task.install:The correct ceph version 18.2.1-0 is installed.
2026-03-10T07:45:13.370 INFO:teuthology.task.install.util:Shipping valgrind.supp...
2026-03-10T07:45:13.370 DEBUG:teuthology.orchestra.run.vm05:> set -ex
2026-03-10T07:45:13.370 DEBUG:teuthology.orchestra.run.vm05:> sudo dd of=/home/ubuntu/cephtest/valgrind.supp
2026-03-10T07:45:13.398 DEBUG:teuthology.orchestra.run.vm08:> set -ex
2026-03-10T07:45:13.398 DEBUG:teuthology.orchestra.run.vm08:> sudo dd of=/home/ubuntu/cephtest/valgrind.supp
2026-03-10T07:45:13.438 INFO:teuthology.task.install.util:Shipping 'daemon-helper'...
2026-03-10T07:45:13.439 DEBUG:teuthology.orchestra.run.vm05:> set -ex
2026-03-10T07:45:13.439 DEBUG:teuthology.orchestra.run.vm05:> sudo dd of=/usr/bin/daemon-helper
2026-03-10T07:45:13.465 DEBUG:teuthology.orchestra.run.vm05:> sudo chmod a=rx -- /usr/bin/daemon-helper
2026-03-10T07:45:13.531 DEBUG:teuthology.orchestra.run.vm08:> set -ex
2026-03-10T07:45:13.531 DEBUG:teuthology.orchestra.run.vm08:> sudo dd of=/usr/bin/daemon-helper
2026-03-10T07:45:13.555 DEBUG:teuthology.orchestra.run.vm08:> sudo chmod a=rx -- /usr/bin/daemon-helper
2026-03-10T07:45:13.619 INFO:teuthology.task.install.util:Shipping 'adjust-ulimits'...
2026-03-10T07:45:13.619 DEBUG:teuthology.orchestra.run.vm05:> set -ex
2026-03-10T07:45:13.619 DEBUG:teuthology.orchestra.run.vm05:> sudo dd of=/usr/bin/adjust-ulimits
2026-03-10T07:45:13.643 DEBUG:teuthology.orchestra.run.vm05:> sudo chmod a=rx -- /usr/bin/adjust-ulimits
2026-03-10T07:45:13.707 DEBUG:teuthology.orchestra.run.vm08:> set -ex
2026-03-10T07:45:13.708 DEBUG:teuthology.orchestra.run.vm08:> sudo dd of=/usr/bin/adjust-ulimits
2026-03-10T07:45:13.736 DEBUG:teuthology.orchestra.run.vm08:> sudo chmod a=rx -- /usr/bin/adjust-ulimits
2026-03-10T07:45:13.802 INFO:teuthology.task.install.util:Shipping 'stdin-killer'...
2026-03-10T07:45:13.802 DEBUG:teuthology.orchestra.run.vm05:> set -ex
2026-03-10T07:45:13.802 DEBUG:teuthology.orchestra.run.vm05:> sudo dd of=/usr/bin/stdin-killer
2026-03-10T07:45:13.826 DEBUG:teuthology.orchestra.run.vm05:> sudo chmod a=rx -- /usr/bin/stdin-killer
2026-03-10T07:45:13.893 DEBUG:teuthology.orchestra.run.vm08:> set -ex
2026-03-10T07:45:13.893 DEBUG:teuthology.orchestra.run.vm08:> sudo dd of=/usr/bin/stdin-killer
2026-03-10T07:45:13.916 DEBUG:teuthology.orchestra.run.vm08:> sudo chmod a=rx -- /usr/bin/stdin-killer
2026-03-10T07:45:13.980 INFO:teuthology.run_tasks:Running task print...
2026-03-10T07:45:13.983 INFO:teuthology.task.print:**** done install task...
2026-03-10T07:45:13.983 INFO:teuthology.run_tasks:Running task cephadm...
2026-03-10T07:45:14.028 INFO:tasks.cephadm:Config: {'compiled_cephadm_branch': 'reef', 'conf': {'osd': {'osd_class_default_list': '*', 'osd_class_load_list': '*', 'bdev async discard': True, 'bdev enable discard': True, 'bluestore allocator': 'bitmap', 'bluestore block size': 96636764160, 'bluestore fsck on mount': True, 'debug bluefs': '1/20', 'debug bluestore': '1/20', 'debug ms': 1, 'debug osd': 20, 'debug rocksdb': '4/10', 'mon osd backfillfull_ratio': 0.85, 'mon osd full ratio': 0.9, 'mon osd nearfull ratio': 0.8, 'osd failsafe full ratio': 0.95, 'osd mclock iops capacity threshold hdd': 49000, 'osd objectstore': 'bluestore', 'osd op complaint time': 180}, 'client': {'client mount timeout': 600, 'debug client': 20, 'debug ms': 1, 'rados mon op timeout': 900, 'rados osd op timeout': 900}, 'global': {'mon pg warn min per osd': 0}, 'mds': {'debug mds': 20, 'debug mds balancer': 20, 'debug ms': 1, 'mds debug frag': True, 'mds debug scatterstat': True, 'mds op complaint time': 180, 'mds verify scatter': True, 'osd op complaint time': 180, 'rados mon op timeout': 900, 'rados osd op timeout': 900}, 'mgr': {'debug mgr': 20, 'debug ms': 1}, 'mon': {'debug mon': 20, 'debug ms': 1, 'debug paxos': 20, 'mon down mkfs grace': 300, 'mon op complaint time': 120}}, 'image': 'quay.io/ceph/ceph:v18.2.1', 'roleless': True, 'cluster-conf': {'mgr': {'client mount timeout': 30, 'debug client': 20, 'debug mgr': 20, 'debug ms': 1, 'mon warn on pool no app': False}}, 'flavor': 'default', 'fs': 'xfs', 'log-ignorelist': ['\\(MDS_ALL_DOWN\\)', '\\(MDS_UP_LESS_THAN_MAX\\)', 'FS_DEGRADED', 'filesystem is degraded', 'FS_INLINE_DATA_DEPRECATED', 'FS_WITH_FAILED_MDS', 'MDS_ALL_DOWN', 'filesystem is offline', 'is offline because no MDS', 'MDS_DAMAGE', 'MDS_DEGRADED', 'MDS_FAILED', 'MDS_INSUFFICIENT_STANDBY', 'MDS_UP_LESS_THAN_MAX', 'online, but wants', 'filesystem is online with fewer MDS than max_mds', 'POOL_APP_NOT_ENABLED', 'do not have an application enabled', 'overall HEALTH_', 'Replacing daemon', 'deprecated feature inline_data', 'MGR_MODULE_ERROR', 'OSD_DOWN', 'osds down', 'overall HEALTH_', '\\(OSD_DOWN\\)', '\\(OSD_', 'but it is still running', 'is not responding', 'MON_DOWN', 'PG_AVAILABILITY', 'PG_DEGRADED', 'Reduced data availability', 'Degraded data redundancy', 'pg .* is stuck inactive', 'pg .* is .*degraded', 'pg .* is stuck peering'], 'sha1': 'e911bdebe5c8faa3800735d1568fcdca65db60df'}
2026-03-10T07:45:14.028 INFO:tasks.cephadm:Cluster image is quay.io/ceph/ceph:v18.2.1
2026-03-10T07:45:14.028 INFO:tasks.cephadm:Cluster fsid is 12e9780e-1c55-11f1-8896-79f7c2e9b508
2026-03-10T07:45:14.028 INFO:tasks.cephadm:Choosing monitor IPs and ports...
2026-03-10T07:45:14.028 INFO:tasks.cephadm:No mon roles; fabricating mons
2026-03-10T07:45:14.028 INFO:tasks.cephadm:Monitor IPs: {'mon.vm05': '192.168.123.105', 'mon.vm08': '192.168.123.108'}
2026-03-10T07:45:14.028 INFO:tasks.cephadm:Normalizing hostnames...
2026-03-10T07:45:14.028 DEBUG:teuthology.orchestra.run.vm05:> sudo hostname $(hostname -s)
2026-03-10T07:45:14.055 DEBUG:teuthology.orchestra.run.vm08:> sudo hostname $(hostname -s)
2026-03-10T07:45:14.081 INFO:tasks.cephadm:Downloading "compiled" cephadm from cachra for reef
2026-03-10T07:45:14.081 DEBUG:teuthology.packaging:Querying https://shaman.ceph.com/api/search?status=ready&project=ceph&flavor=default&distros=centos%2F9%2Fx86_64&sha1=e911bdebe5c8faa3800735d1568fcdca65db60df
2026-03-10T07:45:14.695 INFO:tasks.cephadm:builder_project result: [{'url': 'https://3.chacra.ceph.com/r/ceph/squid/e911bdebe5c8faa3800735d1568fcdca65db60df/centos/9/flavors/default/', 'chacra_url': 'https://3.chacra.ceph.com/repos/ceph/squid/e911bdebe5c8faa3800735d1568fcdca65db60df/centos/9/flavors/default/', 'ref': 'squid', 'sha1': 'e911bdebe5c8faa3800735d1568fcdca65db60df', 'distro': 'centos', 'distro_version': '9', 'distro_codename': None, 'modified': '2026-02-25 18:55:15.146628', 'status': 'ready', 'flavor': 'default', 'project': 'ceph', 'archs': ['source', 'x86_64'], 'extra': {'version': '19.2.3-678-ge911bdeb', 'package_manager_version': '19.2.3-678.ge911bdeb', 'build_url': 'https://jenkins.ceph.com/job/ceph-dev-pipeline/3275/', 'root_build_cause': '', 'node_name': '10.20.192.26+soko16', 'job_name': 'ceph-dev-pipeline'}}]
2026-03-10T07:45:15.496 INFO:tasks.util.chacra:got chacra host 3.chacra.ceph.com, ref reef, sha1 ab47f43c099b2cbae6e21342fe673ce251da54d6 from https://shaman.ceph.com/api/search/?project=ceph&distros=centos%2F9%2Fx86_64&flavor=default&ref=reef
2026-03-10T07:45:15.498 INFO:tasks.cephadm:Discovered cachra url: https://3.chacra.ceph.com/binaries/ceph/reef/ab47f43c099b2cbae6e21342fe673ce251da54d6/centos/9/x86_64/flavors/default/cephadm
2026-03-10T07:45:15.498 INFO:tasks.cephadm:Downloading cephadm from url: https://3.chacra.ceph.com/binaries/ceph/reef/ab47f43c099b2cbae6e21342fe673ce251da54d6/centos/9/x86_64/flavors/default/cephadm
2026-03-10T07:45:15.498 DEBUG:teuthology.orchestra.run.vm05:> curl --silent -L https://3.chacra.ceph.com/binaries/ceph/reef/ab47f43c099b2cbae6e21342fe673ce251da54d6/centos/9/x86_64/flavors/default/cephadm > /home/ubuntu/cephtest/cephadm && ls -l /home/ubuntu/cephtest/cephadm
2026-03-10T07:45:16.811 INFO:teuthology.orchestra.run.vm05.stdout:-rw-r--r--. 1 ubuntu ubuntu 217462 Mar 10 07:45 /home/ubuntu/cephtest/cephadm
2026-03-10T07:45:16.811 DEBUG:teuthology.orchestra.run.vm08:> curl --silent -L https://3.chacra.ceph.com/binaries/ceph/reef/ab47f43c099b2cbae6e21342fe673ce251da54d6/centos/9/x86_64/flavors/default/cephadm > /home/ubuntu/cephtest/cephadm && ls -l /home/ubuntu/cephtest/cephadm
2026-03-10T07:45:18.095 INFO:teuthology.orchestra.run.vm08.stdout:-rw-r--r--. 1 ubuntu ubuntu 217462 Mar 10 07:45 /home/ubuntu/cephtest/cephadm
2026-03-10T07:45:18.095 DEBUG:teuthology.orchestra.run.vm05:> test -s /home/ubuntu/cephtest/cephadm && test $(stat -c%s /home/ubuntu/cephtest/cephadm) -gt 1000 && chmod +x /home/ubuntu/cephtest/cephadm
2026-03-10T07:45:18.111 DEBUG:teuthology.orchestra.run.vm08:> test -s /home/ubuntu/cephtest/cephadm && test $(stat -c%s /home/ubuntu/cephtest/cephadm) -gt 1000 && chmod +x /home/ubuntu/cephtest/cephadm
2026-03-10T07:45:18.162 INFO:tasks.cephadm:Pulling image quay.io/ceph/ceph:v18.2.1 on all hosts...
2026-03-10T07:45:18.162 DEBUG:teuthology.orchestra.run.vm05:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 pull
2026-03-10T07:45:18.164 DEBUG:teuthology.orchestra.run.vm08:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 pull
2026-03-10T07:45:18.307 INFO:teuthology.orchestra.run.vm05.stderr:Pulling container image quay.io/ceph/ceph:v18.2.1...
2026-03-10T07:45:18.346 INFO:teuthology.orchestra.run.vm08.stderr:Pulling container image quay.io/ceph/ceph:v18.2.1...
2026-03-10T07:45:36.726 INFO:teuthology.orchestra.run.vm08.stdout:{
2026-03-10T07:45:36.726 INFO:teuthology.orchestra.run.vm08.stdout: "ceph_version": "ceph version 18.2.1 (7fe91d5d5842e04be3b4f514d6dd990c54b29c76) reef (stable)",
2026-03-10T07:45:36.726 INFO:teuthology.orchestra.run.vm08.stdout: "image_id": "5be31c24972a920012b90a9769e8313e2490c82aee752aefd49af9cf4c0f3fcf",
2026-03-10T07:45:36.726 INFO:teuthology.orchestra.run.vm08.stdout: "repo_digests": [
2026-03-10T07:45:36.726 INFO:teuthology.orchestra.run.vm08.stdout: "quay.io/ceph/ceph@sha256:9f35728f6070a596500c0804814a12ab6b98e05067316dc64876fb4b28d04af3",
2026-03-10T07:45:36.726 INFO:teuthology.orchestra.run.vm08.stdout: "quay.io/ceph/ceph@sha256:e8e55db8b4fd270dbec25bc764437a2a3abb707971c4dba5f559fb83018049dc"
2026-03-10T07:45:36.727 INFO:teuthology.orchestra.run.vm08.stdout: ]
2026-03-10T07:45:36.727 INFO:teuthology.orchestra.run.vm08.stdout:}
2026-03-10T07:45:37.544 INFO:teuthology.orchestra.run.vm05.stdout:{
2026-03-10T07:45:37.544 INFO:teuthology.orchestra.run.vm05.stdout: "ceph_version": "ceph version 18.2.1 (7fe91d5d5842e04be3b4f514d6dd990c54b29c76) reef (stable)",
2026-03-10T07:45:37.544 INFO:teuthology.orchestra.run.vm05.stdout: "image_id": "5be31c24972a920012b90a9769e8313e2490c82aee752aefd49af9cf4c0f3fcf",
2026-03-10T07:45:37.544 INFO:teuthology.orchestra.run.vm05.stdout: "repo_digests": [
2026-03-10T07:45:37.544 INFO:teuthology.orchestra.run.vm05.stdout: "quay.io/ceph/ceph@sha256:9f35728f6070a596500c0804814a12ab6b98e05067316dc64876fb4b28d04af3",
2026-03-10T07:45:37.544 INFO:teuthology.orchestra.run.vm05.stdout: "quay.io/ceph/ceph@sha256:e8e55db8b4fd270dbec25bc764437a2a3abb707971c4dba5f559fb83018049dc"
2026-03-10T07:45:37.544 INFO:teuthology.orchestra.run.vm05.stdout: ]
2026-03-10T07:45:37.544 INFO:teuthology.orchestra.run.vm05.stdout:}
2026-03-10T07:45:37.554 DEBUG:teuthology.orchestra.run.vm05:> sudo mkdir -p /etc/ceph
2026-03-10T07:45:37.578 DEBUG:teuthology.orchestra.run.vm08:> sudo mkdir -p /etc/ceph
2026-03-10T07:45:37.603 DEBUG:teuthology.orchestra.run.vm05:> sudo chmod 777 /etc/ceph
2026-03-10T07:45:37.641 DEBUG:teuthology.orchestra.run.vm08:> sudo chmod 777 /etc/ceph
2026-03-10T07:45:37.667 INFO:tasks.cephadm:Writing seed config...
2026-03-10T07:45:37.668 INFO:tasks.cephadm: override: [osd] osd_class_default_list = *
2026-03-10T07:45:37.668 INFO:tasks.cephadm: override: [osd] osd_class_load_list = *
2026-03-10T07:45:37.668 INFO:tasks.cephadm: override: [osd] bdev async discard = True
2026-03-10T07:45:37.668 INFO:tasks.cephadm: override: [osd] bdev enable discard = True
2026-03-10T07:45:37.668 INFO:tasks.cephadm: override: [osd] bluestore allocator = bitmap
2026-03-10T07:45:37.668 INFO:tasks.cephadm: override: [osd] bluestore block size = 96636764160
2026-03-10T07:45:37.668 INFO:tasks.cephadm: override: [osd] bluestore fsck on mount = True
2026-03-10T07:45:37.668 INFO:tasks.cephadm: override: [osd] debug bluefs = 1/20
2026-03-10T07:45:37.668 INFO:tasks.cephadm: override: [osd] debug bluestore = 1/20
2026-03-10T07:45:37.668 INFO:tasks.cephadm: override: [osd] debug ms = 1
2026-03-10T07:45:37.668 INFO:tasks.cephadm: override: [osd] debug osd = 20
2026-03-10T07:45:37.668 INFO:tasks.cephadm: override: [osd] debug rocksdb = 4/10
2026-03-10T07:45:37.668 INFO:tasks.cephadm: override: [osd] mon osd backfillfull_ratio = 0.85
2026-03-10T07:45:37.668 INFO:tasks.cephadm: override: [osd] mon osd full ratio = 0.9
2026-03-10T07:45:37.668 INFO:tasks.cephadm: override: [osd] mon osd nearfull ratio = 0.8
2026-03-10T07:45:37.668 INFO:tasks.cephadm: override: [osd] osd failsafe full ratio = 0.95
2026-03-10T07:45:37.668 INFO:tasks.cephadm: override: [osd] osd mclock iops capacity threshold hdd = 49000
2026-03-10T07:45:37.668 INFO:tasks.cephadm: override: [osd] osd objectstore = bluestore
2026-03-10T07:45:37.668 INFO:tasks.cephadm: override: [osd] osd op complaint time = 180
2026-03-10T07:45:37.668 INFO:tasks.cephadm: override: [client] client mount timeout = 600
2026-03-10T07:45:37.668 INFO:tasks.cephadm: override: [client] debug client = 20
2026-03-10T07:45:37.668 INFO:tasks.cephadm: override: [client] debug ms = 1
2026-03-10T07:45:37.668 INFO:tasks.cephadm: override: [client] rados mon op timeout = 900
2026-03-10T07:45:37.668 INFO:tasks.cephadm: override: [client] rados osd op timeout = 900
2026-03-10T07:45:37.668 INFO:tasks.cephadm: override: [global] mon pg warn min per osd = 0
2026-03-10T07:45:37.668 INFO:tasks.cephadm: override: [mds] debug mds = 20
2026-03-10T07:45:37.668 INFO:tasks.cephadm: override: [mds] debug mds balancer = 20
2026-03-10T07:45:37.668 INFO:tasks.cephadm: override: [mds] debug ms = 1
2026-03-10T07:45:37.668 INFO:tasks.cephadm: override: [mds] mds debug frag = True
2026-03-10T07:45:37.668 INFO:tasks.cephadm: override: [mds] mds debug scatterstat = True
2026-03-10T07:45:37.669 INFO:tasks.cephadm: override: [mds] mds op complaint time = 180
2026-03-10T07:45:37.669 INFO:tasks.cephadm: override: [mds] mds verify scatter = True
2026-03-10T07:45:37.669 INFO:tasks.cephadm: override: [mds] osd op complaint time = 180
2026-03-10T07:45:37.669 INFO:tasks.cephadm: override: [mds] rados mon op timeout = 900
2026-03-10T07:45:37.669 INFO:tasks.cephadm: override: [mds] rados osd op timeout = 900
2026-03-10T07:45:37.669 INFO:tasks.cephadm: override: [mgr] debug mgr = 20
2026-03-10T07:45:37.669 INFO:tasks.cephadm: override: [mgr] debug ms = 1
2026-03-10T07:45:37.669 INFO:tasks.cephadm: override: [mon] debug mon = 20
2026-03-10T07:45:37.669 INFO:tasks.cephadm: override: [mon] debug ms = 1
2026-03-10T07:45:37.669 INFO:tasks.cephadm: override: [mon] debug paxos = 20
2026-03-10T07:45:37.669 INFO:tasks.cephadm: override: [mon] mon down mkfs grace = 300
2026-03-10T07:45:37.669 INFO:tasks.cephadm: override: [mon] mon op complaint time = 120
2026-03-10T07:45:37.669 DEBUG:teuthology.orchestra.run.vm05:> set -ex
2026-03-10T07:45:37.669 DEBUG:teuthology.orchestra.run.vm05:> dd of=/home/ubuntu/cephtest/seed.ceph.conf
2026-03-10T07:45:37.696 DEBUG:tasks.cephadm:Final config:
[global]
# make logging friendly to teuthology
log_to_file = true
log_to_stderr = false
log to journald = false
mon cluster log to file = true
mon cluster log file level = debug
mon clock drift allowed = 1.000
# replicate across OSDs, not hosts
osd crush chooseleaf type = 0
#osd pool default size = 2
osd pool default erasure code profile = plugin=jerasure technique=reed_sol_van k=2 m=1 crush-failure-domain=osd
# enable some debugging
auth debug = true
ms die on old message = true
ms die on bug = true
debug asserts on shutdown = true
# adjust warnings
mon max pg per osd = 10000  # >= luminous
mon pg warn max object skew = 0
mon osd allow primary affinity = true
mon osd allow pg remap = true
mon warn on legacy crush tunables = false
mon warn on crush straw calc version zero = false
mon warn on no sortbitwise = false
mon warn on osd down out interval zero = false
mon warn on too few osds = false
mon_warn_on_pool_pg_num_not_power_of_two = false
# disable pg_autoscaler by default for new pools
osd_pool_default_pg_autoscale_mode = off
# tests delete pools
mon allow pool delete = true
fsid = 12e9780e-1c55-11f1-8896-79f7c2e9b508
mon pg warn min per osd = 0
[osd]
osd scrub load threshold = 5.0
osd scrub max interval = 600
osd mclock profile = high_recovery_ops
osd recover clone overlap = true
osd recovery max chunk = 1048576
osd deep scrub update digest min age = 30
osd map max advance = 10
osd memory target autotune = true
# debugging
osd debug shutdown = true
osd debug op order = true
osd debug verify stray on activate = true
osd debug pg log writeout = true
osd debug verify cached snaps = true
osd debug verify missing on start = true
osd debug misdirected ops = true
osd op queue = debug_random
osd op queue cut off = debug_random
osd shutdown pgref assert = true
bdev debug aio = true
osd sloppy crc = true
osd_class_default_list = *
osd_class_load_list = *
bdev async discard = True
bdev enable discard = True
bluestore allocator = bitmap
bluestore block size = 96636764160
bluestore fsck on mount = True
debug bluefs = 1/20
debug bluestore = 1/20
debug ms = 1
debug osd = 20
debug rocksdb = 4/10
mon osd backfillfull_ratio = 0.85
mon osd full ratio = 0.9
mon osd nearfull ratio = 0.8
osd failsafe full ratio = 0.95
osd mclock iops capacity threshold hdd = 49000
osd objectstore = bluestore
osd op complaint time = 180
[mgr]
mon reweight min pgs per osd = 4
mon reweight min bytes per osd = 10
mgr/telemetry/nag = false
debug mgr = 20
debug ms = 1
[mon]
mon data avail warn = 5
mon mgr mkfs grace = 240
mon reweight min pgs per osd = 4
mon osd reporter subtree level = osd
mon osd prime pg temp = true
mon reweight min bytes per osd = 10
# rotate auth tickets quickly to exercise renewal paths
auth mon ticket ttl = 660  # 11m
auth service ticket ttl = 240  # 4m
# don't complain about global id reclaim
mon_warn_on_insecure_global_id_reclaim = false
mon_warn_on_insecure_global_id_reclaim_allowed = false
debug mon = 20
debug ms = 1
debug paxos = 20
mon down mkfs grace = 300
mon op complaint time = 120
[client.rgw]
rgw cache enabled = true
rgw enable ops log = true
rgw enable usage log = true
[client]
client mount timeout = 600
debug client = 20
debug ms = 1
rados mon op timeout = 900
rados osd op timeout = 900
[mds]
debug mds = 20
debug mds balancer = 20
debug ms = 1
mds debug frag = True
mds debug scatterstat = True
mds op complaint time = 180
mds verify scatter = True
osd op complaint time = 180
rados mon op timeout = 900
rados osd op timeout = 900
2026-03-10T07:45:37.696 DEBUG:teuthology.orchestra.run.vm05:mon.vm05> sudo journalctl -f -n 0 -u ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508@mon.vm05.service
2026-03-10T07:45:37.738 INFO:tasks.cephadm:Bootstrapping...
2026-03-10T07:45:37.739 DEBUG:teuthology.orchestra.run.vm05:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 -v bootstrap --fsid 12e9780e-1c55-11f1-8896-79f7c2e9b508 --config /home/ubuntu/cephtest/seed.ceph.conf --output-config /etc/ceph/ceph.conf --output-keyring /etc/ceph/ceph.client.admin.keyring --output-pub-ssh-key /home/ubuntu/cephtest/ceph.pub --mon-ip 192.168.123.105 --skip-admin-label && sudo chmod +r /etc/ceph/ceph.client.admin.keyring
2026-03-10T07:45:37.857 INFO:teuthology.orchestra.run.vm05.stdout:--------------------------------------------------------------------------------
2026-03-10T07:45:37.857 INFO:teuthology.orchestra.run.vm05.stdout:cephadm ['--image', 'quay.io/ceph/ceph:v18.2.1', '-v', 'bootstrap', '--fsid', '12e9780e-1c55-11f1-8896-79f7c2e9b508', '--config', '/home/ubuntu/cephtest/seed.ceph.conf', '--output-config', '/etc/ceph/ceph.conf', '--output-keyring', '/etc/ceph/ceph.client.admin.keyring', '--output-pub-ssh-key', '/home/ubuntu/cephtest/ceph.pub', '--mon-ip', '192.168.123.105', '--skip-admin-label']
2026-03-10T07:45:37.876 INFO:teuthology.orchestra.run.vm05.stdout:/bin/podman: stdout 5.8.0
2026-03-10T07:45:37.876 INFO:teuthology.orchestra.run.vm05.stderr:Specifying an fsid for your cluster offers no advantages and may increase the likelihood of fsid conflicts.
2026-03-10T07:45:37.876 INFO:teuthology.orchestra.run.vm05.stdout:Verifying podman|docker is present...
2026-03-10T07:45:37.892 INFO:teuthology.orchestra.run.vm05.stdout:/bin/podman: stdout 5.8.0
2026-03-10T07:45:37.893 INFO:teuthology.orchestra.run.vm05.stdout:Verifying lvm2 is present...
2026-03-10T07:45:37.893 INFO:teuthology.orchestra.run.vm05.stdout:Verifying time synchronization is in place...
2026-03-10T07:45:37.899 INFO:teuthology.orchestra.run.vm05.stdout:Non-zero exit code 1 from systemctl is-enabled chrony.service
2026-03-10T07:45:37.899 INFO:teuthology.orchestra.run.vm05.stdout:systemctl: stderr Failed to get unit file state for chrony.service: No such file or directory
2026-03-10T07:45:37.904 INFO:teuthology.orchestra.run.vm05.stdout:Non-zero exit code 3 from systemctl is-active chrony.service
2026-03-10T07:45:37.904 INFO:teuthology.orchestra.run.vm05.stdout:systemctl: stdout inactive
2026-03-10T07:45:37.910 INFO:teuthology.orchestra.run.vm05.stdout:systemctl: stdout enabled
2026-03-10T07:45:37.914 INFO:teuthology.orchestra.run.vm05.stdout:systemctl: stdout active
2026-03-10T07:45:37.914 INFO:teuthology.orchestra.run.vm05.stdout:Unit chronyd.service is enabled and running
2026-03-10T07:45:37.914 INFO:teuthology.orchestra.run.vm05.stdout:Repeating the final host check...
2026-03-10T07:45:37.932 INFO:teuthology.orchestra.run.vm05.stdout:/bin/podman: stdout 5.8.0
2026-03-10T07:45:37.932 INFO:teuthology.orchestra.run.vm05.stdout:podman (/bin/podman) version 5.8.0 is present
2026-03-10T07:45:37.932 INFO:teuthology.orchestra.run.vm05.stdout:systemctl is present
2026-03-10T07:45:37.932 INFO:teuthology.orchestra.run.vm05.stdout:lvcreate is present
2026-03-10T07:45:37.938 INFO:teuthology.orchestra.run.vm05.stdout:Non-zero exit code 1 from systemctl is-enabled chrony.service
2026-03-10T07:45:37.938 INFO:teuthology.orchestra.run.vm05.stdout:systemctl: stderr Failed to get unit file state for chrony.service: No such file or directory
2026-03-10T07:45:37.943 INFO:teuthology.orchestra.run.vm05.stdout:Non-zero exit code 3 from systemctl is-active chrony.service
2026-03-10T07:45:37.943 INFO:teuthology.orchestra.run.vm05.stdout:systemctl: stdout inactive
2026-03-10T07:45:37.948 INFO:teuthology.orchestra.run.vm05.stdout:systemctl: stdout enabled
2026-03-10T07:45:37.953 INFO:teuthology.orchestra.run.vm05.stdout:systemctl: stdout active
2026-03-10T07:45:37.953 INFO:teuthology.orchestra.run.vm05.stdout:Unit chronyd.service is enabled and running
2026-03-10T07:45:37.953 INFO:teuthology.orchestra.run.vm05.stdout:Host looks OK
2026-03-10T07:45:37.953 INFO:teuthology.orchestra.run.vm05.stdout:Cluster fsid: 12e9780e-1c55-11f1-8896-79f7c2e9b508
2026-03-10T07:45:37.953 INFO:teuthology.orchestra.run.vm05.stdout:Acquiring lock 140688236757968 on /run/cephadm/12e9780e-1c55-11f1-8896-79f7c2e9b508.lock
2026-03-10T07:45:37.953 INFO:teuthology.orchestra.run.vm05.stdout:Lock 140688236757968 acquired on /run/cephadm/12e9780e-1c55-11f1-8896-79f7c2e9b508.lock
2026-03-10T07:45:37.954 INFO:teuthology.orchestra.run.vm05.stdout:Verifying IP 192.168.123.105 port 3300 ...
2026-03-10T07:45:37.954 INFO:teuthology.orchestra.run.vm05.stdout:Verifying IP 192.168.123.105 port 6789 ...
2026-03-10T07:45:37.954 INFO:teuthology.orchestra.run.vm05.stdout:Base mon IP(s) is [192.168.123.105:3300, 192.168.123.105:6789], mon addrv is [v2:192.168.123.105:3300,v1:192.168.123.105:6789]
2026-03-10T07:45:37.957 INFO:teuthology.orchestra.run.vm05.stdout:/sbin/ip: stdout default via 192.168.123.1 dev eth0 proto dhcp src 192.168.123.105 metric 100
2026-03-10T07:45:37.957 INFO:teuthology.orchestra.run.vm05.stdout:/sbin/ip: stdout 192.168.123.0/24 dev eth0 proto kernel scope link src 192.168.123.105 metric 100
2026-03-10T07:45:37.959 INFO:teuthology.orchestra.run.vm05.stdout:/sbin/ip: stdout ::1 dev lo proto kernel metric 256 pref medium
2026-03-10T07:45:37.959 INFO:teuthology.orchestra.run.vm05.stdout:/sbin/ip: stdout fe80::/64 dev eth0 proto kernel metric 1024 pref medium
2026-03-10T07:45:37.961 INFO:teuthology.orchestra.run.vm05.stdout:/sbin/ip: stdout 1: lo: mtu 65536 state UNKNOWN qlen 1000
2026-03-10T07:45:37.961 INFO:teuthology.orchestra.run.vm05.stdout:/sbin/ip: stdout inet6 ::1/128 scope host
2026-03-10T07:45:37.961 INFO:teuthology.orchestra.run.vm05.stdout:/sbin/ip: stdout valid_lft forever preferred_lft forever
2026-03-10T07:45:37.961 INFO:teuthology.orchestra.run.vm05.stdout:/sbin/ip: stdout 2: eth0: mtu 1500 state UP qlen 1000
2026-03-10T07:45:37.961 INFO:teuthology.orchestra.run.vm05.stdout:/sbin/ip: stdout inet6 fe80::5055:ff:fe00:5/64 scope link noprefixroute
2026-03-10T07:45:37.961 INFO:teuthology.orchestra.run.vm05.stdout:/sbin/ip: stdout valid_lft forever preferred_lft forever
2026-03-10T07:45:37.961 INFO:teuthology.orchestra.run.vm05.stdout:Mon IP `192.168.123.105` is in CIDR network `192.168.123.0/24`
2026-03-10T07:45:37.961 INFO:teuthology.orchestra.run.vm05.stdout:Mon IP `192.168.123.105` is in CIDR network `192.168.123.0/24`
2026-03-10T07:45:37.961 INFO:teuthology.orchestra.run.vm05.stdout:Inferred mon public CIDR from local network configuration ['192.168.123.0/24', '192.168.123.0/24']
2026-03-10T07:45:37.962 INFO:teuthology.orchestra.run.vm05.stdout:Internal network (--cluster-network) has not been provided, OSD replication will default to the public_network
2026-03-10T07:45:37.962 INFO:teuthology.orchestra.run.vm05.stdout:Pulling container image quay.io/ceph/ceph:v18.2.1...
2026-03-10T07:45:39.263 INFO:teuthology.orchestra.run.vm05.stdout:/bin/podman: stdout 5be31c24972a920012b90a9769e8313e2490c82aee752aefd49af9cf4c0f3fcf
2026-03-10T07:45:39.264 INFO:teuthology.orchestra.run.vm05.stdout:/bin/podman: stderr Trying to pull quay.io/ceph/ceph:v18.2.1...
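cephadm derives the mon addrv string logged above (`[v2:192.168.123.105:3300,v1:192.168.123.105:6789]`) from the `--mon-ip` argument: one msgr2 endpoint on port 3300 and one legacy v1 endpoint on port 6789. A small sketch of parsing such an addrv string back into (protocol, ip, port) tuples — an illustrative helper, not cephadm's own code:

```python
import re

def parse_addrv(addrv):
    """Parse a Ceph mon addrv like
    '[v2:192.168.123.105:3300,v1:192.168.123.105:6789]'
    into (protocol, ip, port) tuples."""
    out = []
    for entry in addrv.strip("[]").split(","):
        m = re.match(r"(v[12]):(.+):(\d+)$", entry)
        if not m:
            raise ValueError(f"unrecognized mon addr: {entry!r}")
        proto, ip, port = m.groups()
        out.append((proto, ip, int(port)))
    return out
```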
2026-03-10T07:45:39.264 INFO:teuthology.orchestra.run.vm05.stdout:/bin/podman: stderr Getting image source signatures
2026-03-10T07:45:39.264 INFO:teuthology.orchestra.run.vm05.stdout:/bin/podman: stderr Copying blob sha256:7feca07754707458c3945cf0062cf4dabc512f6d90fe1a9a1370b362b6011124
2026-03-10T07:45:39.264 INFO:teuthology.orchestra.run.vm05.stdout:/bin/podman: stderr Copying blob sha256:a733d3c618b71f19c168ebecd1953429dce2c1631835ca182e9551c36dce5989
2026-03-10T07:45:39.264 INFO:teuthology.orchestra.run.vm05.stdout:/bin/podman: stderr Copying config sha256:5be31c24972a920012b90a9769e8313e2490c82aee752aefd49af9cf4c0f3fcf
2026-03-10T07:45:39.264 INFO:teuthology.orchestra.run.vm05.stdout:/bin/podman: stderr Writing manifest to image destination
2026-03-10T07:45:39.425 INFO:teuthology.orchestra.run.vm05.stdout:ceph: stdout ceph version 18.2.1 (7fe91d5d5842e04be3b4f514d6dd990c54b29c76) reef (stable)
2026-03-10T07:45:39.425 INFO:teuthology.orchestra.run.vm05.stdout:Ceph version: ceph version 18.2.1 (7fe91d5d5842e04be3b4f514d6dd990c54b29c76) reef (stable)
2026-03-10T07:45:39.426 INFO:teuthology.orchestra.run.vm05.stdout:Extracting ceph user uid/gid from container image...
2026-03-10T07:45:39.494 INFO:teuthology.orchestra.run.vm05.stdout:stat: stdout 167 167
2026-03-10T07:45:39.494 INFO:teuthology.orchestra.run.vm05.stdout:Creating initial keys...
2026-03-10T07:45:39.608 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph-authtool: stdout AQAjzK9pqu0XIhAAtTiHJM1trlQlKkDDpvWZZQ==
2026-03-10T07:45:39.694 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph-authtool: stdout AQAjzK9piRNpKBAAIp7LEpfw01ysYebKfzSIhg==
2026-03-10T07:45:39.814 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph-authtool: stdout AQAjzK9pGvs3LhAAfq7efO9AxSVO+YA7XNcg+g==
2026-03-10T07:45:39.814 INFO:teuthology.orchestra.run.vm05.stdout:Creating initial monmap...
2026-03-10T07:45:39.924 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/monmaptool: stdout /usr/bin/monmaptool: monmap file /tmp/monmap
2026-03-10T07:45:39.924 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/monmaptool: stdout setting min_mon_release = pacific
2026-03-10T07:45:39.924 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/monmaptool: stdout /usr/bin/monmaptool: set fsid to 12e9780e-1c55-11f1-8896-79f7c2e9b508
2026-03-10T07:45:39.924 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/monmaptool: stdout /usr/bin/monmaptool: writing epoch 0 to /tmp/monmap (1 monitors)
2026-03-10T07:45:39.924 INFO:teuthology.orchestra.run.vm05.stdout:monmaptool for vm05 [v2:192.168.123.105:3300,v1:192.168.123.105:6789] on /usr/bin/monmaptool: monmap file /tmp/monmap
2026-03-10T07:45:39.924 INFO:teuthology.orchestra.run.vm05.stdout:setting min_mon_release = pacific
2026-03-10T07:45:39.924 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/monmaptool: set fsid to 12e9780e-1c55-11f1-8896-79f7c2e9b508
2026-03-10T07:45:39.925 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/monmaptool: writing epoch 0 to /tmp/monmap (1 monitors)
2026-03-10T07:45:39.925 INFO:teuthology.orchestra.run.vm05.stdout:
2026-03-10T07:45:39.925 INFO:teuthology.orchestra.run.vm05.stdout:Creating mon...
2026-03-10T07:45:40.038 INFO:teuthology.orchestra.run.vm05.stdout:create mon.vm05 on
2026-03-10T07:45:40.200 INFO:teuthology.orchestra.run.vm05.stdout:systemctl: stderr Removed "/etc/systemd/system/multi-user.target.wants/ceph.target".
2026-03-10T07:45:40.317 INFO:teuthology.orchestra.run.vm05.stdout:systemctl: stderr Created symlink /etc/systemd/system/multi-user.target.wants/ceph.target → /etc/systemd/system/ceph.target.
2026-03-10T07:45:40.438 INFO:teuthology.orchestra.run.vm05.stdout:systemctl: stderr Created symlink /etc/systemd/system/multi-user.target.wants/ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508.target → /etc/systemd/system/ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508.target.
2026-03-10T07:45:40.438 INFO:teuthology.orchestra.run.vm05.stdout:systemctl: stderr Created symlink /etc/systemd/system/ceph.target.wants/ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508.target → /etc/systemd/system/ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508.target.
2026-03-10T07:45:40.577 INFO:teuthology.orchestra.run.vm05.stdout:Non-zero exit code 1 from systemctl reset-failed ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508@mon.vm05
2026-03-10T07:45:40.577 INFO:teuthology.orchestra.run.vm05.stdout:systemctl: stderr Failed to reset failed state of unit ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508@mon.vm05.service: Unit ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508@mon.vm05.service not loaded.
2026-03-10T07:45:40.716 INFO:teuthology.orchestra.run.vm05.stdout:systemctl: stderr Created symlink /etc/systemd/system/ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508.target.wants/ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508@mon.vm05.service → /etc/systemd/system/ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508@.service.
2026-03-10T07:45:40.878 INFO:teuthology.orchestra.run.vm05.stdout:firewalld does not appear to be present
2026-03-10T07:45:40.879 INFO:teuthology.orchestra.run.vm05.stdout:Not possible to enable service . firewalld.service is not available
2026-03-10T07:45:40.879 INFO:teuthology.orchestra.run.vm05.stdout:Waiting for mon to start...
2026-03-10T07:45:40.879 INFO:teuthology.orchestra.run.vm05.stdout:Waiting for mon...
2026-03-10T07:45:41.097 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout cluster:
2026-03-10T07:45:41.097 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout id: 12e9780e-1c55-11f1-8896-79f7c2e9b508
2026-03-10T07:45:41.097 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout health: HEALTH_OK
2026-03-10T07:45:41.097 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout
2026-03-10T07:45:41.097 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout services:
2026-03-10T07:45:41.097 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout mon: 1 daemons, quorum vm05 (age 0.157898s)
2026-03-10T07:45:41.097 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout mgr: no daemons active
2026-03-10T07:45:41.097 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout osd: 0 osds: 0 up, 0 in
2026-03-10T07:45:41.097 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout
2026-03-10T07:45:41.097 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout data:
2026-03-10T07:45:41.097 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout pools: 0 pools, 0 pgs
2026-03-10T07:45:41.097 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout objects: 0 objects, 0 B
2026-03-10T07:45:41.097 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout usage: 0 B used, 0 B / 0 B avail
2026-03-10T07:45:41.097 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout pgs:
2026-03-10T07:45:41.097 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout
2026-03-10T07:45:41.097 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:41.013+0000 7fc075991700 1 Processor -- start
2026-03-10T07:45:41.097 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:41.013+0000 7fc075991700 1 -- start start
2026-03-10T07:45:41.097 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:41.013+0000 7fc075991700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fc07007b760 0x7fc07007bb80 unknown :-1 s=NONE pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T07:45:41.097 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:41.013+0000 7fc075991700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fc07007c150 con 0x7fc07007b760
2026-03-10T07:45:41.097 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:41.014+0000 7fc06effd700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fc07007b760 0x7fc07007bb80 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-10T07:45:41.097 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:41.014+0000 7fc06effd700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fc07007b760 0x7fc07007bb80 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.105:36204/0 (socket says 192.168.123.105:36204)
2026-03-10T07:45:41.097 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:41.014+0000 7fc06effd700 1 -- 192.168.123.105:0/2594567335 learned_addr learned my addr 192.168.123.105:0/2594567335 (peer_addr_for_me v2:192.168.123.105:0/0)
2026-03-10T07:45:41.097 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:41.014+0000 7fc06effd700 1 -- 192.168.123.105:0/2594567335 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fc07007c9b0 con 0x7fc07007b760
2026-03-10T07:45:41.097 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:41.014+0000 7fc06effd700 1 --2- 192.168.123.105:0/2594567335 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fc07007b760 0x7fc07007bb80 secure :-1 s=READY pgs=1 cs=0 l=1 rev1=1 crypto rx=0x7fc06800b0d0 tx=0x7fc06800b490 comp rx=0 tx=0).ready entity=mon.0 client_cookie=9b546d2b9a477d98 server_cookie=0 in_seq=0 out_seq=0
2026-03-10T07:45:41.097 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:41.015+0000 7fc06dffb700 1 -- 192.168.123.105:0/2594567335 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fc06800e070 con 0x7fc07007b760
2026-03-10T07:45:41.097 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:41.015+0000 7fc06dffb700 1 -- 192.168.123.105:0/2594567335 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(0 keys) v1 ==== 4+0+0 (secure 0 0 0) 0x7fc06800ba80 con 0x7fc07007b760
2026-03-10T07:45:41.097 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:41.015+0000 7fc075991700 1 -- 192.168.123.105:0/2594567335 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fc07007b760 msgr2=0x7fc07007bb80 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T07:45:41.097 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:41.015+0000 7fc075991700 1 --2- 192.168.123.105:0/2594567335 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fc07007b760 0x7fc07007bb80 secure :-1 s=READY pgs=1 cs=0 l=1 rev1=1 crypto rx=0x7fc06800b0d0 tx=0x7fc06800b490 comp rx=0 tx=0).stop
2026-03-10T07:45:41.097 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:41.015+0000 7fc075991700 1 -- 192.168.123.105:0/2594567335 shutdown_connections
2026-03-10T07:45:41.097 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:41.015+0000 7fc075991700 1 --2- 192.168.123.105:0/2594567335 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fc07007b760 0x7fc07007bb80 unknown :-1 s=CLOSED pgs=1 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T07:45:41.097 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:41.015+0000 7fc075991700 1 -- 192.168.123.105:0/2594567335 >> 192.168.123.105:0/2594567335 conn(0x7fc070103f50 msgr2=0x7fc070106370 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-10T07:45:41.097 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:41.016+0000 7fc075991700 1 -- 192.168.123.105:0/2594567335 shutdown_connections
2026-03-10T07:45:41.097 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:41.016+0000 7fc075991700 1 -- 192.168.123.105:0/2594567335 wait complete.
2026-03-10T07:45:41.097 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:41.016+0000 7fc075991700 1 Processor -- start
2026-03-10T07:45:41.097 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:41.017+0000 7fc075991700 1 -- start start
2026-03-10T07:45:41.097 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:41.017+0000 7fc075991700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fc07007b760 0x7fc07019fff0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T07:45:41.097 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:41.017+0000 7fc06effd700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fc07007b760 0x7fc07019fff0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-10T07:45:41.097 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:41.017+0000 7fc06effd700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fc07007b760 0x7fc07019fff0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.105:36206/0 (socket says 192.168.123.105:36206)
2026-03-10T07:45:41.097 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:41.017+0000 7fc06effd700 1 -- 192.168.123.105:0/1782725947 learned_addr learned my addr 192.168.123.105:0/1782725947 (peer_addr_for_me v2:192.168.123.105:0/0)
2026-03-10T07:45:41.097 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:41.017+0000 7fc075991700 1 -- 192.168.123.105:0/1782725947 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fc06800bbf0 con 0x7fc07007b760
2026-03-10T07:45:41.097 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:41.017+0000 7fc06effd700 1 -- 192.168.123.105:0/1782725947 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fc068009d20 con 0x7fc07007b760
2026-03-10T07:45:41.097 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:41.018+0000 7fc06effd700 1 --2- 192.168.123.105:0/1782725947 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fc07007b760 0x7fc07019fff0 secure :-1 s=READY pgs=2 cs=0 l=1 rev1=1 crypto rx=0x7fc07007c440 tx=0x7fc068004ad0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-10T07:45:41.097 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:41.018+0000 7fc07498f700 1 -- 192.168.123.105:0/1782725947 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fc0680047c0 con 0x7fc07007b760
2026-03-10T07:45:41.097 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:41.018+0000 7fc07498f700 1 -- 192.168.123.105:0/1782725947 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(0 keys) v1 ==== 4+0+0 (secure 0 0 0) 0x7fc068004920 con 0x7fc07007b760
2026-03-10T07:45:41.098 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:41.018+0000 7fc07498f700 1 -- 192.168.123.105:0/1782725947 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fc06801e430 con 0x7fc07007b760
2026-03-10T07:45:41.098 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:41.019+0000 7fc075991700 1 -- 192.168.123.105:0/1782725947 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fc0701a0530 con 0x7fc07007b760
2026-03-10T07:45:41.098 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:41.019+0000 7fc07498f700 1 -- 192.168.123.105:0/1782725947 <== mon.0 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 1) v1 ==== 660+0+0 (secure 0 0 0) 0x7fc06801e890 con 0x7fc07007b760
2026-03-10T07:45:41.098 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:41.019+0000 7fc075991700 1 -- 192.168.123.105:0/1782725947 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fc0701a09b0 con 0x7fc07007b760
2026-03-10T07:45:41.098 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:41.020+0000 7fc07498f700 1 -- 192.168.123.105:0/1782725947 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(1..1 src has 1..1) v4 ==== 725+0+0 (secure 0 0 0) 0x7fc068027d10 con 0x7fc07007b760
2026-03-10T07:45:41.098 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:41.020+0000 7fc075991700 1 -- 192.168.123.105:0/1782725947 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fc07019a0b0 con 0x7fc07007b760
2026-03-10T07:45:41.098 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:41.022+0000 7fc07498f700 1 -- 192.168.123.105:0/1782725947 <== mon.0 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+72446 (secure 0 0 0) 0x7fc068046b10 con 0x7fc07007b760
2026-03-10T07:45:41.098 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:41.060+0000 7fc075991700 1 -- 192.168.123.105:0/1782725947 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "status"} v 0) v1 -- 0x7fc0700623c0 con 0x7fc07007b760
2026-03-10T07:45:41.098 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:41.061+0000 7fc07498f700 1 -- 192.168.123.105:0/1782725947 <== mon.0 v2:192.168.123.105:3300/0 7 ==== mon_command_ack([{"prefix": "status"}]=0 v0) v1 ==== 54+0+320 (secure 0 0 0) 0x7fc068035030 con 0x7fc07007b760
2026-03-10T07:45:41.098 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:41.062+0000 7fc075991700 1 -- 192.168.123.105:0/1782725947 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fc07007b760 msgr2=0x7fc07019fff0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T07:45:41.098 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:41.062+0000 7fc075991700 1 --2- 192.168.123.105:0/1782725947 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fc07007b760 0x7fc07019fff0 secure :-1 s=READY pgs=2 cs=0 l=1 rev1=1 crypto rx=0x7fc07007c440 tx=0x7fc068004ad0 comp rx=0 tx=0).stop
2026-03-10T07:45:41.098 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:41.063+0000 7fc075991700 1 -- 192.168.123.105:0/1782725947 shutdown_connections
2026-03-10T07:45:41.098 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:41.063+0000 7fc075991700 1 --2- 192.168.123.105:0/1782725947 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fc07007b760 0x7fc07019fff0 unknown :-1 s=CLOSED pgs=2 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T07:45:41.098 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:41.063+0000 7fc075991700 1 -- 192.168.123.105:0/1782725947 >> 192.168.123.105:0/1782725947 conn(0x7fc070103f50 msgr2=0x7fc0701056a0 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-10T07:45:41.098 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:41.063+0000 7fc075991700 1 -- 192.168.123.105:0/1782725947 shutdown_connections
2026-03-10T07:45:41.098 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:41.063+0000 7fc075991700 1 -- 192.168.123.105:0/1782725947 wait complete.
2026-03-10T07:45:41.098 INFO:teuthology.orchestra.run.vm05.stdout:mon is available
2026-03-10T07:45:41.098 INFO:teuthology.orchestra.run.vm05.stdout:Assimilating anything we can from ceph.conf...
2026-03-10T07:45:41.320 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout
2026-03-10T07:45:41.321 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout [global]
2026-03-10T07:45:41.321 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout fsid = 12e9780e-1c55-11f1-8896-79f7c2e9b508
2026-03-10T07:45:41.321 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout mon_host = [v2:192.168.123.105:3300,v1:192.168.123.105:6789]
2026-03-10T07:45:41.321 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout mon_osd_allow_pg_remap = true
2026-03-10T07:45:41.321 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout mon_osd_allow_primary_affinity = true
2026-03-10T07:45:41.321 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout mon_warn_on_no_sortbitwise = false
2026-03-10T07:45:41.321 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout osd_crush_chooseleaf_type = 0
2026-03-10T07:45:41.321 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout
2026-03-10T07:45:41.321 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout [mgr]
2026-03-10T07:45:41.321 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout mgr/telemetry/nag = false
2026-03-10T07:45:41.321 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout
2026-03-10T07:45:41.321 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout [osd]
2026-03-10T07:45:41.321 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout mon_osd_backfillfull_ratio = 0.85
2026-03-10T07:45:41.321 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout mon_osd_full_ratio = 0.9
2026-03-10T07:45:41.321 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout mon_osd_nearfull_ratio = 0.8
2026-03-10T07:45:41.321 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout osd_map_max_advance = 10
2026-03-10T07:45:41.321 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout osd_sloppy_crc = true
2026-03-10T07:45:41.321 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:41.223+0000 7f93aa6cf700 1 Processor -- start
2026-03-10T07:45:41.321 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:41.223+0000 7f93aa6cf700 1 -- start start
2026-03-10T07:45:41.321 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:41.224+0000 7f93aa6cf700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f93a4104fb0 0x7f93a41073e0 unknown :-1 s=NONE pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T07:45:41.321 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:41.224+0000 7f93aa6cf700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f93a4074720 con 0x7f93a4104fb0
2026-03-10T07:45:41.321 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:41.224+0000 7f93a3fff700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f93a4104fb0 0x7f93a41073e0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-10T07:45:41.321 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:41.224+0000 7f93a3fff700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f93a4104fb0 0x7f93a41073e0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.105:36208/0 (socket says 192.168.123.105:36208)
2026-03-10T07:45:41.321 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:41.224+0000 7f93a3fff700 1 -- 192.168.123.105:0/180858887 learned_addr learned my addr 192.168.123.105:0/180858887 (peer_addr_for_me v2:192.168.123.105:0/0)
2026-03-10T07:45:41.321 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:41.224+0000 7f93a3fff700 1 -- 192.168.123.105:0/180858887 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f93a4107920 con 0x7f93a4104fb0
2026-03-10T07:45:41.321 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:41.224+0000 7f93a3fff700 1 --2- 192.168.123.105:0/180858887 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f93a4104fb0 0x7f93a41073e0 secure :-1 s=READY pgs=3 cs=0 l=1 rev1=1 crypto rx=0x7f9394009a90 tx=0x7f9394009da0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=63181088756c4ed9 server_cookie=0 in_seq=0 out_seq=0
2026-03-10T07:45:41.321 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:41.225+0000 7f93a2ffd700 1 -- 192.168.123.105:0/180858887 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f9394004030 con 0x7f93a4104fb0
2026-03-10T07:45:41.321 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:41.225+0000 7f93a2ffd700 1 -- 192.168.123.105:0/180858887 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(0 keys) v1 ==== 4+0+0 (secure 0 0 0) 0x7f9394004190 con 0x7f93a4104fb0
2026-03-10T07:45:41.321 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:41.225+0000 7f93a2ffd700 1 -- 192.168.123.105:0/180858887 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f9394004320 con 0x7f93a4104fb0
2026-03-10T07:45:41.321 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:41.225+0000 7f93aa6cf700 1 -- 192.168.123.105:0/180858887 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f93a4104fb0 msgr2=0x7f93a41073e0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T07:45:41.321 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:41.225+0000 7f93aa6cf700 1 --2- 192.168.123.105:0/180858887 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f93a4104fb0 0x7f93a41073e0 secure :-1 s=READY pgs=3 cs=0 l=1 rev1=1 crypto rx=0x7f9394009a90 tx=0x7f9394009da0 comp rx=0 tx=0).stop
2026-03-10T07:45:41.321 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:41.226+0000 7f93aa6cf700 1 -- 192.168.123.105:0/180858887 shutdown_connections
2026-03-10T07:45:41.321 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:41.226+0000 7f93aa6cf700 1 --2- 192.168.123.105:0/180858887 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f93a4104fb0 0x7f93a41073e0 unknown :-1 s=CLOSED pgs=3 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T07:45:41.321 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:41.226+0000 7f93aa6cf700 1 -- 192.168.123.105:0/180858887 >> 192.168.123.105:0/180858887 conn(0x7f93a4100bd0 msgr2=0x7f93a4103030 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-10T07:45:41.321 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:41.226+0000
7f93aa6cf700 1 -- 192.168.123.105:0/180858887 shutdown_connections 2026-03-10T07:45:41.321 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:41.226+0000 7f93aa6cf700 1 -- 192.168.123.105:0/180858887 wait complete. 2026-03-10T07:45:41.321 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:41.226+0000 7f93aa6cf700 1 Processor -- start 2026-03-10T07:45:41.321 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:41.226+0000 7f93aa6cf700 1 -- start start 2026-03-10T07:45:41.321 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:41.226+0000 7f93aa6cf700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f93a4104fb0 0x7f93a41a00a0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:45:41.321 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:41.226+0000 7f93aa6cf700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f93a41a05e0 con 0x7f93a4104fb0 2026-03-10T07:45:41.321 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:41.227+0000 7f93a3fff700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f93a4104fb0 0x7f93a41a00a0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:45:41.321 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:41.227+0000 7f93a3fff700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f93a4104fb0 0x7f93a41a00a0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.105:36214/0 (socket says 192.168.123.105:36214) 2026-03-10T07:45:41.321 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 
2026-03-10T07:45:41.227+0000 7f93a3fff700 1 -- 192.168.123.105:0/3829011038 learned_addr learned my addr 192.168.123.105:0/3829011038 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T07:45:41.321 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:41.227+0000 7f93a3fff700 1 -- 192.168.123.105:0/3829011038 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f9394009740 con 0x7f93a4104fb0 2026-03-10T07:45:41.321 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:41.227+0000 7f93a3fff700 1 --2- 192.168.123.105:0/3829011038 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f93a4104fb0 0x7f93a41a00a0 secure :-1 s=READY pgs=4 cs=0 l=1 rev1=1 crypto rx=0x7f93a4073b60 tx=0x7f9394004750 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:45:41.321 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:41.227+0000 7f93a17fa700 1 -- 192.168.123.105:0/3829011038 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f9394004550 con 0x7f93a4104fb0 2026-03-10T07:45:41.321 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:41.227+0000 7f93a17fa700 1 -- 192.168.123.105:0/3829011038 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(0 keys) v1 ==== 4+0+0 (secure 0 0 0) 0x7f9394020070 con 0x7f93a4104fb0 2026-03-10T07:45:41.322 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:41.227+0000 7f93aa6cf700 1 -- 192.168.123.105:0/3829011038 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f93a41a07e0 con 0x7f93a4104fb0 2026-03-10T07:45:41.322 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:41.227+0000 7f93a17fa700 1 -- 192.168.123.105:0/3829011038 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 
==== 214+0+0 (secure 0 0 0) 0x7f93940036a0 con 0x7f93a4104fb0 2026-03-10T07:45:41.322 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:41.227+0000 7f93aa6cf700 1 -- 192.168.123.105:0/3829011038 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f93a41a0c00 con 0x7f93a4104fb0 2026-03-10T07:45:41.322 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:41.228+0000 7f93a17fa700 1 -- 192.168.123.105:0/3829011038 <== mon.0 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 1) v1 ==== 660+0+0 (secure 0 0 0) 0x7f9394022030 con 0x7f93a4104fb0 2026-03-10T07:45:41.322 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:41.228+0000 7f93a17fa700 1 -- 192.168.123.105:0/3829011038 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(1..1 src has 1..1) v4 ==== 725+0+0 (secure 0 0 0) 0x7f939401bc90 con 0x7f93a4104fb0 2026-03-10T07:45:41.322 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:41.229+0000 7f93aa6cf700 1 -- 192.168.123.105:0/3829011038 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f93a419a290 con 0x7f93a4104fb0 2026-03-10T07:45:41.322 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:41.230+0000 7f93a17fa700 1 -- 192.168.123.105:0/3829011038 <== mon.0 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+72446 (secure 0 0 0) 0x7f939401b450 con 0x7f93a4104fb0 2026-03-10T07:45:41.322 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:41.266+0000 7f93aa6cf700 1 -- 192.168.123.105:0/3829011038 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "config assimilate-conf"} v 0) v1 -- 0x7f93a40623c0 con 0x7f93a4104fb0 2026-03-10T07:45:41.322 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: 
stderr 2026-03-10T07:45:41.271+0000 7f93a17fa700 1 -- 192.168.123.105:0/3829011038 <== mon.0 v2:192.168.123.105:3300/0 7 ==== config(27 keys) v1 ==== 1037+0+0 (secure 0 0 0) 0x7f9394003bf0 con 0x7f93a4104fb0 2026-03-10T07:45:41.322 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:41.271+0000 7f93a17fa700 1 -- 192.168.123.105:0/3829011038 <== mon.0 v2:192.168.123.105:3300/0 8 ==== mon_command_ack([{"prefix": "config assimilate-conf"}]=0 v2) v1 ==== 70+0+435 (secure 0 0 0) 0x7f93a40623c0 con 0x7f93a4104fb0 2026-03-10T07:45:41.322 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:41.272+0000 7f93aa6cf700 1 -- 192.168.123.105:0/3829011038 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f93a4104fb0 msgr2=0x7f93a41a00a0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:45:41.322 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:41.272+0000 7f93aa6cf700 1 --2- 192.168.123.105:0/3829011038 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f93a4104fb0 0x7f93a41a00a0 secure :-1 s=READY pgs=4 cs=0 l=1 rev1=1 crypto rx=0x7f93a4073b60 tx=0x7f9394004750 comp rx=0 tx=0).stop 2026-03-10T07:45:41.322 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:41.272+0000 7f93aa6cf700 1 -- 192.168.123.105:0/3829011038 shutdown_connections 2026-03-10T07:45:41.322 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:41.273+0000 7f93aa6cf700 1 --2- 192.168.123.105:0/3829011038 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f93a4104fb0 0x7f93a41a00a0 unknown :-1 s=CLOSED pgs=4 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:45:41.322 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:41.273+0000 7f93aa6cf700 1 -- 192.168.123.105:0/3829011038 >> 192.168.123.105:0/3829011038 conn(0x7f93a4100bd0 msgr2=0x7f93a4102370 unknown :-1 
s=STATE_NONE l=0).mark_down 2026-03-10T07:45:41.322 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:41.273+0000 7f93aa6cf700 1 -- 192.168.123.105:0/3829011038 shutdown_connections 2026-03-10T07:45:41.322 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:41.273+0000 7f93aa6cf700 1 -- 192.168.123.105:0/3829011038 wait complete. 2026-03-10T07:45:41.322 INFO:teuthology.orchestra.run.vm05.stdout:Generating new minimal ceph.conf... 2026-03-10T07:45:41.536 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:41.447+0000 7fd77a311700 1 Processor -- start 2026-03-10T07:45:41.536 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:41.447+0000 7fd77a311700 1 -- start start 2026-03-10T07:45:41.536 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:41.447+0000 7fd77a311700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fd774108990 0x7fd774108db0 unknown :-1 s=NONE pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:45:41.536 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:41.447+0000 7fd77a311700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fd774109380 con 0x7fd774108990 2026-03-10T07:45:41.536 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:41.447+0000 7fd773fff700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fd774108990 0x7fd774108db0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:45:41.536 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:41.447+0000 7fd773fff700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fd774108990 0x7fd774108db0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 
l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.105:36218/0 (socket says 192.168.123.105:36218) 2026-03-10T07:45:41.536 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:41.447+0000 7fd773fff700 1 -- 192.168.123.105:0/3529221407 learned_addr learned my addr 192.168.123.105:0/3529221407 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T07:45:41.536 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:41.448+0000 7fd773fff700 1 -- 192.168.123.105:0/3529221407 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fd774109b90 con 0x7fd774108990 2026-03-10T07:45:41.536 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:41.448+0000 7fd773fff700 1 --2- 192.168.123.105:0/3529221407 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fd774108990 0x7fd774108db0 secure :-1 s=READY pgs=5 cs=0 l=1 rev1=1 crypto rx=0x7fd75c009cf0 tx=0x7fd75c00b0e0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=a0edc75331bdc941 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:45:41.536 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:41.448+0000 7fd772ffd700 1 -- 192.168.123.105:0/3529221407 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fd75c004030 con 0x7fd774108990 2026-03-10T07:45:41.536 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:41.448+0000 7fd772ffd700 1 -- 192.168.123.105:0/3529221407 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(27 keys) v1 ==== 1037+0+0 (secure 0 0 0) 0x7fd75c00b810 con 0x7fd774108990 2026-03-10T07:45:41.536 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:41.448+0000 7fd77a311700 1 -- 192.168.123.105:0/3529221407 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fd774108990 
msgr2=0x7fd774108db0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:45:41.536 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:41.448+0000 7fd77a311700 1 --2- 192.168.123.105:0/3529221407 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fd774108990 0x7fd774108db0 secure :-1 s=READY pgs=5 cs=0 l=1 rev1=1 crypto rx=0x7fd75c009cf0 tx=0x7fd75c00b0e0 comp rx=0 tx=0).stop 2026-03-10T07:45:41.536 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:41.449+0000 7fd77a311700 1 -- 192.168.123.105:0/3529221407 shutdown_connections 2026-03-10T07:45:41.536 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:41.449+0000 7fd77a311700 1 --2- 192.168.123.105:0/3529221407 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fd774108990 0x7fd774108db0 unknown :-1 s=CLOSED pgs=5 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:45:41.536 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:41.449+0000 7fd77a311700 1 -- 192.168.123.105:0/3529221407 >> 192.168.123.105:0/3529221407 conn(0x7fd774103f50 msgr2=0x7fd774106370 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:45:41.536 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:41.449+0000 7fd77a311700 1 -- 192.168.123.105:0/3529221407 shutdown_connections 2026-03-10T07:45:41.536 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:41.449+0000 7fd77a311700 1 -- 192.168.123.105:0/3529221407 wait complete. 
2026-03-10T07:45:41.536 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:41.450+0000 7fd77a311700 1 Processor -- start
2026-03-10T07:45:41.536 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:41.450+0000 7fd77a311700 1 -- start start
2026-03-10T07:45:41.536 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:41.450+0000 7fd77a311700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fd774108990 0x7fd77419c650 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T07:45:41.537 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:41.450+0000 7fd77a311700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fd774109380 con 0x7fd774108990
2026-03-10T07:45:41.537 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:41.450+0000 7fd773fff700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fd774108990 0x7fd77419c650 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-10T07:45:41.537 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:41.450+0000 7fd773fff700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fd774108990 0x7fd77419c650 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.105:36232/0 (socket says 192.168.123.105:36232)
2026-03-10T07:45:41.537 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:41.450+0000 7fd773fff700 1 -- 192.168.123.105:0/2291424170 learned_addr learned my addr 192.168.123.105:0/2291424170 (peer_addr_for_me v2:192.168.123.105:0/0)
2026-03-10T07:45:41.537 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:41.450+0000 7fd773fff700 1 -- 192.168.123.105:0/2291424170 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fd75c009740 con 0x7fd774108990
2026-03-10T07:45:41.537 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:41.450+0000 7fd773fff700 1 --2- 192.168.123.105:0/2291424170 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fd774108990 0x7fd77419c650 secure :-1 s=READY pgs=6 cs=0 l=1 rev1=1 crypto rx=0x7fd75c003770 tx=0x7fd75c003e10 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-10T07:45:41.537 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:41.451+0000 7fd7717fa700 1 -- 192.168.123.105:0/2291424170 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fd75c004170 con 0x7fd774108990
2026-03-10T07:45:41.537 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:41.451+0000 7fd77a311700 1 -- 192.168.123.105:0/2291424170 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fd77419cb90 con 0x7fd774108990
2026-03-10T07:45:41.537 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:41.451+0000 7fd77a311700 1 -- 192.168.123.105:0/2291424170 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fd77419cfb0 con 0x7fd774108990
2026-03-10T07:45:41.537 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:41.451+0000 7fd7717fa700 1 -- 192.168.123.105:0/2291424170 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(27 keys) v1 ==== 1037+0+0 (secure 0 0 0) 0x7fd75c0042d0 con 0x7fd774108990
2026-03-10T07:45:41.537 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:41.451+0000 7fd7717fa700 1 -- 192.168.123.105:0/2291424170 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fd75c01a430 con 0x7fd774108990
2026-03-10T07:45:41.537 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:41.451+0000 7fd7717fa700 1 -- 192.168.123.105:0/2291424170 <== mon.0 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 1) v1 ==== 660+0+0 (secure 0 0 0) 0x7fd75c01a950 con 0x7fd774108990
2026-03-10T07:45:41.537 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:41.451+0000 7fd7717fa700 1 -- 192.168.123.105:0/2291424170 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(1..1 src has 1..1) v4 ==== 725+0+0 (secure 0 0 0) 0x7fd75c01e070 con 0x7fd774108990
2026-03-10T07:45:41.537 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:41.451+0000 7fd77a311700 1 -- 192.168.123.105:0/2291424170 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fd77419d260 con 0x7fd774108990
2026-03-10T07:45:41.537 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:41.453+0000 7fd7717fa700 1 -- 192.168.123.105:0/2291424170 <== mon.0 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+72446 (secure 0 0 0) 0x7fd75c004440 con 0x7fd774108990
2026-03-10T07:45:41.537 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:41.489+0000 7fd77a311700 1 -- 192.168.123.105:0/2291424170 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "config generate-minimal-conf"} v 0) v1 -- 0x7fd77404fa20 con 0x7fd774108990
2026-03-10T07:45:41.537 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:41.489+0000 7fd7717fa700 1 -- 192.168.123.105:0/2291424170 <== mon.0 v2:192.168.123.105:3300/0 7 ==== mon_command_ack([{"prefix": "config generate-minimal-conf"}]=0 v2) v1 ==== 76+0+181 (secure 0 0 0) 0x7fd75c02d430 con 0x7fd774108990
2026-03-10T07:45:41.537 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:41.490+0000 7fd77a311700 1 -- 192.168.123.105:0/2291424170 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fd774108990 msgr2=0x7fd77419c650 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T07:45:41.537 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:41.490+0000 7fd77a311700 1 --2- 192.168.123.105:0/2291424170 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fd774108990 0x7fd77419c650 secure :-1 s=READY pgs=6 cs=0 l=1 rev1=1 crypto rx=0x7fd75c003770 tx=0x7fd75c003e10 comp rx=0 tx=0).stop
2026-03-10T07:45:41.537 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:41.490+0000 7fd77a311700 1 -- 192.168.123.105:0/2291424170 shutdown_connections
2026-03-10T07:45:41.537 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:41.490+0000 7fd77a311700 1 --2- 192.168.123.105:0/2291424170 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fd774108990 0x7fd77419c650 unknown :-1 s=CLOSED pgs=6 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T07:45:41.537 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:41.490+0000 7fd77a311700 1 -- 192.168.123.105:0/2291424170 >> 192.168.123.105:0/2291424170 conn(0x7fd774103f50 msgr2=0x7fd774104cf0 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-10T07:45:41.537 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:41.491+0000 7fd77a311700 1 -- 192.168.123.105:0/2291424170 shutdown_connections
2026-03-10T07:45:41.537 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:41.491+0000 7fd77a311700 1 -- 192.168.123.105:0/2291424170 wait complete.
2026-03-10T07:45:41.537 INFO:teuthology.orchestra.run.vm05.stdout:Restarting the monitor...
2026-03-10T07:45:41.643 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:45:41 vm05 podman[50302]: 2026-03-10 07:45:41.630893816 +0000 UTC m=+0.034814791 container died 0114b57998e6411a26b60894da0383405b917ec0addc0cf0c2db133299274e6e (image=quay.io/ceph/ceph:v18.2.1, name=ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508-mon-vm05, maintainer=Guillaume Abrioux , org.label-schema.schema-version=1.0, GIT_BRANCH=HEAD, GIT_COMMIT=1617517b9622744995c4661989e3c30d036a7cfd, GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=HEAD, io.buildah.version=1.29.1, org.label-schema.build-date=20240222, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 8 Base Image, org.label-schema.vendor=CentOS, CEPH_POINT_RELEASE=-18.2.1, GIT_CLEAN=True, ceph=True)
2026-03-10T07:45:41.854 INFO:teuthology.orchestra.run.vm05.stdout:Setting public_network to 192.168.123.0/24 in global config section
2026-03-10T07:45:41.911 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:45:41 vm05 podman[50302]: 2026-03-10 07:45:41.649166535 +0000 UTC m=+0.053087510 container remove 0114b57998e6411a26b60894da0383405b917ec0addc0cf0c2db133299274e6e (image=quay.io/ceph/ceph:v18.2.1, name=ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508-mon-vm05, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, CEPH_POINT_RELEASE=-18.2.1, GIT_CLEAN=True, GIT_COMMIT=1617517b9622744995c4661989e3c30d036a7cfd, org.label-schema.build-date=20240222, RELEASE=HEAD, org.label-schema.vendor=CentOS, GIT_BRANCH=HEAD, io.buildah.version=1.29.1, maintainer=Guillaume Abrioux , org.label-schema.name=CentOS Stream 8 Base Image, GIT_REPO=https://github.com/ceph/ceph-container.git, ceph=True)
2026-03-10T07:45:41.911 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:45:41 vm05 bash[50302]: ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508-mon-vm05
2026-03-10T07:45:41.911 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:45:41 vm05 systemd[1]: ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508@mon.vm05.service: Deactivated successfully.
2026-03-10T07:45:41.911 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:45:41 vm05 systemd[1]: Stopped Ceph mon.vm05 for 12e9780e-1c55-11f1-8896-79f7c2e9b508.
2026-03-10T07:45:41.911 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:45:41 vm05 systemd[1]: Starting Ceph mon.vm05 for 12e9780e-1c55-11f1-8896-79f7c2e9b508...
2026-03-10T07:45:41.911 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:45:41 vm05 podman[50371]: 2026-03-10 07:45:41.803901702 +0000 UTC m=+0.015903599 container create 2a459bf05146cbba941e46798fca428be9fb365229bdc64db44d9ab8b736af7f (image=quay.io/ceph/ceph:v18.2.1, name=ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508-mon-vm05, GIT_COMMIT=1617517b9622744995c4661989e3c30d036a7cfd, org.label-schema.license=GPLv2, GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=HEAD, org.label-schema.vendor=CentOS, maintainer=Guillaume Abrioux , org.label-schema.schema-version=1.0, io.buildah.version=1.29.1, GIT_BRANCH=HEAD, CEPH_POINT_RELEASE=-18.2.1, ceph=True, org.label-schema.name=CentOS Stream 8 Base Image, org.label-schema.build-date=20240222)
2026-03-10T07:45:41.911 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:45:41 vm05 podman[50371]: 2026-03-10 07:45:41.8402855 +0000 UTC m=+0.052287397 container init 2a459bf05146cbba941e46798fca428be9fb365229bdc64db44d9ab8b736af7f (image=quay.io/ceph/ceph:v18.2.1, name=ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508-mon-vm05, maintainer=Guillaume Abrioux , org.label-schema.schema-version=1.0, GIT_BRANCH=HEAD, GIT_COMMIT=1617517b9622744995c4661989e3c30d036a7cfd, ceph=True, org.label-schema.name=CentOS Stream 8 Base Image, RELEASE=HEAD, org.label-schema.build-date=20240222, io.buildah.version=1.29.1, CEPH_POINT_RELEASE=-18.2.1, org.label-schema.license=GPLv2, GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git, org.label-schema.vendor=CentOS)
2026-03-10T07:45:41.911 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:45:41 vm05 podman[50371]: 2026-03-10 07:45:41.842860316 +0000 UTC m=+0.054862202 container start 2a459bf05146cbba941e46798fca428be9fb365229bdc64db44d9ab8b736af7f (image=quay.io/ceph/ceph:v18.2.1, name=ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508-mon-vm05, org.label-schema.name=CentOS Stream 8 Base Image, GIT_CLEAN=True, RELEASE=HEAD, maintainer=Guillaume Abrioux , org.label-schema.schema-version=1.0, GIT_BRANCH=HEAD, CEPH_POINT_RELEASE=-18.2.1, ceph=True, GIT_REPO=https://github.com/ceph/ceph-container.git, org.label-schema.build-date=20240222, org.label-schema.vendor=CentOS, io.buildah.version=1.29.1, GIT_COMMIT=1617517b9622744995c4661989e3c30d036a7cfd, org.label-schema.license=GPLv2)
2026-03-10T07:45:41.911 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:45:41 vm05 bash[50371]: 2a459bf05146cbba941e46798fca428be9fb365229bdc64db44d9ab8b736af7f
2026-03-10T07:45:41.911 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:45:41 vm05 podman[50371]: 2026-03-10 07:45:41.796808216 +0000 UTC m=+0.008810113 image pull 5be31c24972a920012b90a9769e8313e2490c82aee752aefd49af9cf4c0f3fcf quay.io/ceph/ceph:v18.2.1
2026-03-10T07:45:41.911 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:45:41 vm05 systemd[1]: Started Ceph mon.vm05 for 12e9780e-1c55-11f1-8896-79f7c2e9b508.
2026-03-10T07:45:41.911 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:45:41 vm05 ceph-mon[50387]: set uid:gid to 167:167 (ceph:ceph)
2026-03-10T07:45:41.911 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:45:41 vm05 ceph-mon[50387]: ceph version 18.2.1 (7fe91d5d5842e04be3b4f514d6dd990c54b29c76) reef (stable), process ceph-mon, pid 2
2026-03-10T07:45:41.911 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:45:41 vm05 ceph-mon[50387]: pidfile_write: ignore empty --pid-file
2026-03-10T07:45:41.911 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:45:41 vm05 ceph-mon[50387]: load: jerasure load: lrc
2026-03-10T07:45:41.911 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:45:41 vm05 ceph-mon[50387]: rocksdb: RocksDB version: 7.9.2
2026-03-10T07:45:41.911 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:45:41 vm05 ceph-mon[50387]: rocksdb: Git sha 0
2026-03-10T07:45:41.911 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:45:41 vm05 ceph-mon[50387]: rocksdb: Compile date 2023-12-11 22:07:34
2026-03-10T07:45:41.911 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:45:41 vm05 ceph-mon[50387]: rocksdb: DB SUMMARY
2026-03-10T07:45:41.911 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:45:41 vm05 ceph-mon[50387]: rocksdb: DB Session ID: K2C0S43GXTYIEA535J19
2026-03-10T07:45:41.911 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:45:41 vm05 ceph-mon[50387]: rocksdb: CURRENT file: CURRENT
2026-03-10T07:45:41.911 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:45:41 vm05 ceph-mon[50387]: rocksdb: IDENTITY file: IDENTITY
2026-03-10T07:45:41.911 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:45:41 vm05 ceph-mon[50387]: rocksdb: MANIFEST file: MANIFEST-000010 size: 179 Bytes
2026-03-10T07:45:41.911 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:45:41 vm05 ceph-mon[50387]: rocksdb: SST files in /var/lib/ceph/mon/ceph-vm05/store.db dir, Total Num: 1, files: 000008.sst
2026-03-10T07:45:41.911 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:45:41 vm05 ceph-mon[50387]: rocksdb: Write Ahead Log file in /var/lib/ceph/mon/ceph-vm05/store.db: 000009.log size: 89048 ;
2026-03-10T07:45:41.911 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:45:41 vm05 ceph-mon[50387]: rocksdb: Options.error_if_exists: 0
2026-03-10T07:45:41.911 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:45:41 vm05 ceph-mon[50387]: rocksdb: Options.create_if_missing: 0
2026-03-10T07:45:41.911 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:45:41 vm05 ceph-mon[50387]: rocksdb: Options.paranoid_checks: 1
2026-03-10T07:45:41.911 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:45:41 vm05 ceph-mon[50387]: rocksdb: Options.flush_verify_memtable_count: 1
2026-03-10T07:45:41.912 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:45:41 vm05 ceph-mon[50387]: rocksdb: Options.track_and_verify_wals_in_manifest: 0
2026-03-10T07:45:41.912 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:45:41 vm05 ceph-mon[50387]: rocksdb: Options.verify_sst_unique_id_in_manifest: 1
2026-03-10T07:45:41.912 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:45:41 vm05 ceph-mon[50387]: rocksdb: Options.env: 0x555bd6a5a720
2026-03-10T07:45:41.912 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:45:41 vm05 ceph-mon[50387]: rocksdb: Options.fs: PosixFileSystem
2026-03-10T07:45:41.912 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:45:41 vm05 ceph-mon[50387]: rocksdb: Options.info_log: 0x555bd864f360
2026-03-10T07:45:41.912 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:45:41 vm05 ceph-mon[50387]: rocksdb: Options.max_file_opening_threads: 16
2026-03-10T07:45:41.912 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:45:41 vm05 ceph-mon[50387]: rocksdb: Options.statistics: (nil)
2026-03-10T07:45:41.912 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:45:41 vm05 ceph-mon[50387]: rocksdb: Options.use_fsync: 0
2026-03-10T07:45:41.912 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:45:41 vm05 ceph-mon[50387]: rocksdb: Options.max_log_file_size: 0
2026-03-10T07:45:41.912 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:45:41 vm05 ceph-mon[50387]: rocksdb: Options.max_manifest_file_size: 1073741824
2026-03-10T07:45:41.912 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:45:41 vm05 ceph-mon[50387]: rocksdb: Options.log_file_time_to_roll: 0
2026-03-10T07:45:41.912 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:45:41 vm05 ceph-mon[50387]: rocksdb: Options.keep_log_file_num: 1000
2026-03-10T07:45:41.912 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:45:41 vm05 ceph-mon[50387]: rocksdb: Options.recycle_log_file_num: 0
2026-03-10T07:45:41.912 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:45:41 vm05 ceph-mon[50387]: rocksdb: Options.allow_fallocate: 1
2026-03-10T07:45:41.912 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:45:41 vm05 ceph-mon[50387]: rocksdb: Options.allow_mmap_reads: 0
2026-03-10T07:45:41.912 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:45:41 vm05 ceph-mon[50387]: rocksdb: Options.allow_mmap_writes: 0
2026-03-10T07:45:41.912 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:45:41 vm05 ceph-mon[50387]: rocksdb: Options.use_direct_reads: 0
2026-03-10T07:45:41.912 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:45:41 vm05 ceph-mon[50387]: rocksdb: Options.use_direct_io_for_flush_and_compaction: 0
2026-03-10T07:45:41.912 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:45:41 vm05 ceph-mon[50387]: rocksdb: Options.create_missing_column_families: 0
2026-03-10T07:45:41.912 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:45:41 vm05 ceph-mon[50387]: rocksdb: Options.db_log_dir:
2026-03-10T07:45:41.912 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:45:41 vm05 ceph-mon[50387]: rocksdb: Options.wal_dir:
2026-03-10T07:45:41.912 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:45:41 vm05 ceph-mon[50387]: rocksdb: Options.table_cache_numshardbits: 6
2026-03-10T07:45:41.912 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:45:41
vm05 ceph-mon[50387]: rocksdb: Options.WAL_ttl_seconds: 0 2026-03-10T07:45:41.912 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:45:41 vm05 ceph-mon[50387]: rocksdb: Options.WAL_size_limit_MB: 0 2026-03-10T07:45:41.912 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:45:41 vm05 ceph-mon[50387]: rocksdb: Options.max_write_batch_group_size_bytes: 1048576 2026-03-10T07:45:41.912 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:45:41 vm05 ceph-mon[50387]: rocksdb: Options.manifest_preallocation_size: 4194304 2026-03-10T07:45:41.912 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:45:41 vm05 ceph-mon[50387]: rocksdb: Options.is_fd_close_on_exec: 1 2026-03-10T07:45:41.912 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:45:41 vm05 ceph-mon[50387]: rocksdb: Options.advise_random_on_open: 1 2026-03-10T07:45:41.912 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:45:41 vm05 ceph-mon[50387]: rocksdb: Options.db_write_buffer_size: 0 2026-03-10T07:45:41.912 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:45:41 vm05 ceph-mon[50387]: rocksdb: Options.write_buffer_manager: 0x555bd78de320 2026-03-10T07:45:41.912 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:45:41 vm05 ceph-mon[50387]: rocksdb: Options.access_hint_on_compaction_start: 1 2026-03-10T07:45:41.912 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:45:41 vm05 ceph-mon[50387]: rocksdb: Options.random_access_max_buffer_size: 1048576 2026-03-10T07:45:41.912 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:45:41 vm05 ceph-mon[50387]: rocksdb: Options.use_adaptive_mutex: 0 2026-03-10T07:45:41.912 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:45:41 vm05 ceph-mon[50387]: rocksdb: Options.rate_limiter: (nil) 2026-03-10T07:45:41.912 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:45:41 vm05 ceph-mon[50387]: rocksdb: Options.sst_file_manager.rate_bytes_per_sec: 0 2026-03-10T07:45:41.912 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:45:41 vm05 ceph-mon[50387]: rocksdb: 
Options.wal_recovery_mode: 2 2026-03-10T07:45:41.912 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:45:41 vm05 ceph-mon[50387]: rocksdb: Options.enable_thread_tracking: 0 2026-03-10T07:45:41.912 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:45:41 vm05 ceph-mon[50387]: rocksdb: Options.enable_pipelined_write: 0 2026-03-10T07:45:41.912 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:45:41 vm05 ceph-mon[50387]: rocksdb: Options.unordered_write: 0 2026-03-10T07:45:41.912 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:45:41 vm05 ceph-mon[50387]: rocksdb: Options.allow_concurrent_memtable_write: 1 2026-03-10T07:45:41.912 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:45:41 vm05 ceph-mon[50387]: rocksdb: Options.enable_write_thread_adaptive_yield: 1 2026-03-10T07:45:41.912 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:45:41 vm05 ceph-mon[50387]: rocksdb: Options.write_thread_max_yield_usec: 100 2026-03-10T07:45:41.912 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:45:41 vm05 ceph-mon[50387]: rocksdb: Options.write_thread_slow_yield_usec: 3 2026-03-10T07:45:41.912 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:45:41 vm05 ceph-mon[50387]: rocksdb: Options.row_cache: None 2026-03-10T07:45:41.912 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:45:41 vm05 ceph-mon[50387]: rocksdb: Options.wal_filter: None 2026-03-10T07:45:41.912 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:45:41 vm05 ceph-mon[50387]: rocksdb: Options.avoid_flush_during_recovery: 0 2026-03-10T07:45:41.912 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:45:41 vm05 ceph-mon[50387]: rocksdb: Options.allow_ingest_behind: 0 2026-03-10T07:45:41.912 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:45:41 vm05 ceph-mon[50387]: rocksdb: Options.two_write_queues: 0 2026-03-10T07:45:41.912 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:45:41 vm05 ceph-mon[50387]: rocksdb: Options.manual_wal_flush: 0 2026-03-10T07:45:41.912 
INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:45:41 vm05 ceph-mon[50387]: rocksdb: Options.wal_compression: 0 2026-03-10T07:45:41.912 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:45:41 vm05 ceph-mon[50387]: rocksdb: Options.atomic_flush: 0 2026-03-10T07:45:41.912 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:45:41 vm05 ceph-mon[50387]: rocksdb: Options.avoid_unnecessary_blocking_io: 0 2026-03-10T07:45:41.912 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:45:41 vm05 ceph-mon[50387]: rocksdb: Options.persist_stats_to_disk: 0 2026-03-10T07:45:41.912 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:45:41 vm05 ceph-mon[50387]: rocksdb: Options.write_dbid_to_manifest: 0 2026-03-10T07:45:41.912 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:45:41 vm05 ceph-mon[50387]: rocksdb: Options.log_readahead_size: 0 2026-03-10T07:45:41.912 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:45:41 vm05 ceph-mon[50387]: rocksdb: Options.file_checksum_gen_factory: Unknown 2026-03-10T07:45:41.912 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:45:41 vm05 ceph-mon[50387]: rocksdb: Options.best_efforts_recovery: 0 2026-03-10T07:45:41.912 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:45:41 vm05 ceph-mon[50387]: rocksdb: Options.max_bgerror_resume_count: 2147483647 2026-03-10T07:45:41.913 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:45:41 vm05 ceph-mon[50387]: rocksdb: Options.bgerror_resume_retry_interval: 1000000 2026-03-10T07:45:41.913 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:45:41 vm05 ceph-mon[50387]: rocksdb: Options.allow_data_in_errors: 0 2026-03-10T07:45:41.913 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:45:41 vm05 ceph-mon[50387]: rocksdb: Options.db_host_id: __hostname__ 2026-03-10T07:45:41.913 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:45:41 vm05 ceph-mon[50387]: rocksdb: Options.enforce_single_del_contracts: true 2026-03-10T07:45:41.913 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:45:41 vm05 
ceph-mon[50387]: rocksdb: Options.max_background_jobs: 2 2026-03-10T07:45:41.913 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:45:41 vm05 ceph-mon[50387]: rocksdb: Options.max_background_compactions: -1 2026-03-10T07:45:41.913 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:45:41 vm05 ceph-mon[50387]: rocksdb: Options.max_subcompactions: 1 2026-03-10T07:45:41.913 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:45:41 vm05 ceph-mon[50387]: rocksdb: Options.avoid_flush_during_shutdown: 0 2026-03-10T07:45:41.913 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:45:41 vm05 ceph-mon[50387]: rocksdb: Options.writable_file_max_buffer_size: 1048576 2026-03-10T07:45:41.913 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:45:41 vm05 ceph-mon[50387]: rocksdb: Options.delayed_write_rate : 16777216 2026-03-10T07:45:41.913 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:45:41 vm05 ceph-mon[50387]: rocksdb: Options.max_total_wal_size: 0 2026-03-10T07:45:41.913 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:45:41 vm05 ceph-mon[50387]: rocksdb: Options.delete_obsolete_files_period_micros: 21600000000 2026-03-10T07:45:41.913 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:45:41 vm05 ceph-mon[50387]: rocksdb: Options.stats_dump_period_sec: 600 2026-03-10T07:45:41.913 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:45:41 vm05 ceph-mon[50387]: rocksdb: Options.stats_persist_period_sec: 600 2026-03-10T07:45:41.913 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:45:41 vm05 ceph-mon[50387]: rocksdb: Options.stats_history_buffer_size: 1048576 2026-03-10T07:45:41.913 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:45:41 vm05 ceph-mon[50387]: rocksdb: Options.max_open_files: -1 2026-03-10T07:45:41.913 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:45:41 vm05 ceph-mon[50387]: rocksdb: Options.bytes_per_sync: 0 2026-03-10T07:45:41.913 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:45:41 vm05 ceph-mon[50387]: rocksdb: 
Options.wal_bytes_per_sync: 0 2026-03-10T07:45:41.913 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:45:41 vm05 ceph-mon[50387]: rocksdb: Options.strict_bytes_per_sync: 0 2026-03-10T07:45:41.913 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:45:41 vm05 ceph-mon[50387]: rocksdb: Options.compaction_readahead_size: 0 2026-03-10T07:45:41.913 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:45:41 vm05 ceph-mon[50387]: rocksdb: Options.max_background_flushes: -1 2026-03-10T07:45:41.913 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:45:41 vm05 ceph-mon[50387]: rocksdb: Compression algorithms supported: 2026-03-10T07:45:41.913 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:45:41 vm05 ceph-mon[50387]: rocksdb: kZSTDNotFinalCompression supported: 0 2026-03-10T07:45:41.913 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:45:41 vm05 ceph-mon[50387]: rocksdb: kZSTD supported: 0 2026-03-10T07:45:41.913 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:45:41 vm05 ceph-mon[50387]: rocksdb: kXpressCompression supported: 0 2026-03-10T07:45:41.913 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:45:41 vm05 ceph-mon[50387]: rocksdb: kLZ4HCCompression supported: 1 2026-03-10T07:45:41.913 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:45:41 vm05 ceph-mon[50387]: rocksdb: kZlibCompression supported: 1 2026-03-10T07:45:41.913 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:45:41 vm05 ceph-mon[50387]: rocksdb: kSnappyCompression supported: 1 2026-03-10T07:45:41.913 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:45:41 vm05 ceph-mon[50387]: rocksdb: kLZ4Compression supported: 1 2026-03-10T07:45:41.913 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:45:41 vm05 ceph-mon[50387]: rocksdb: kBZip2Compression supported: 0 2026-03-10T07:45:41.913 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:45:41 vm05 ceph-mon[50387]: rocksdb: Fast CRC32 supported: Supported on x86 2026-03-10T07:45:41.913 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 
07:45:41 vm05 ceph-mon[50387]: rocksdb: DMutex implementation: pthread_mutex_t 2026-03-10T07:45:41.913 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:45:41 vm05 ceph-mon[50387]: rocksdb: [db/version_set.cc:5527] Recovering from manifest file: /var/lib/ceph/mon/ceph-vm05/store.db/MANIFEST-000010 2026-03-10T07:45:41.913 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:45:41 vm05 ceph-mon[50387]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [default]: 2026-03-10T07:45:41.913 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:45:41 vm05 ceph-mon[50387]: rocksdb: Options.comparator: leveldb.BytewiseComparator 2026-03-10T07:45:41.913 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:45:41 vm05 ceph-mon[50387]: rocksdb: Options.merge_operator: 2026-03-10T07:45:41.913 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:45:41 vm05 ceph-mon[50387]: rocksdb: Options.compaction_filter: None 2026-03-10T07:45:41.913 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:45:41 vm05 ceph-mon[50387]: rocksdb: Options.compaction_filter_factory: None 2026-03-10T07:45:41.913 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:45:41 vm05 ceph-mon[50387]: rocksdb: Options.sst_partitioner_factory: None 2026-03-10T07:45:41.913 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:45:41 vm05 ceph-mon[50387]: rocksdb: Options.memtable_factory: SkipListFactory 2026-03-10T07:45:41.913 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:45:41 vm05 ceph-mon[50387]: rocksdb: Options.table_factory: BlockBasedTable 2026-03-10T07:45:41.913 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:45:41 vm05 ceph-mon[50387]: rocksdb: table_factory options: flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x555bd864f480) 2026-03-10T07:45:41.913 INFO:journalctl@ceph.mon.vm05.vm05.stdout: cache_index_and_filter_blocks: 1 2026-03-10T07:45:41.913 INFO:journalctl@ceph.mon.vm05.vm05.stdout: cache_index_and_filter_blocks_with_high_priority: 0 2026-03-10T07:45:41.913 
INFO:journalctl@ceph.mon.vm05.vm05.stdout: pin_l0_filter_and_index_blocks_in_cache: 0 2026-03-10T07:45:41.913 INFO:journalctl@ceph.mon.vm05.vm05.stdout: pin_top_level_index_and_filter: 1 2026-03-10T07:45:41.913 INFO:journalctl@ceph.mon.vm05.vm05.stdout: index_type: 0 2026-03-10T07:45:41.913 INFO:journalctl@ceph.mon.vm05.vm05.stdout: data_block_index_type: 0 2026-03-10T07:45:41.913 INFO:journalctl@ceph.mon.vm05.vm05.stdout: index_shortening: 1 2026-03-10T07:45:41.913 INFO:journalctl@ceph.mon.vm05.vm05.stdout: data_block_hash_table_util_ratio: 0.750000 2026-03-10T07:45:41.913 INFO:journalctl@ceph.mon.vm05.vm05.stdout: checksum: 4 2026-03-10T07:45:41.913 INFO:journalctl@ceph.mon.vm05.vm05.stdout: no_block_cache: 0 2026-03-10T07:45:41.913 INFO:journalctl@ceph.mon.vm05.vm05.stdout: block_cache: 0x555bd7961350 2026-03-10T07:45:41.913 INFO:journalctl@ceph.mon.vm05.vm05.stdout: block_cache_name: BinnedLRUCache 2026-03-10T07:45:41.913 INFO:journalctl@ceph.mon.vm05.vm05.stdout: block_cache_options: 2026-03-10T07:45:41.913 INFO:journalctl@ceph.mon.vm05.vm05.stdout: capacity : 536870912 2026-03-10T07:45:41.913 INFO:journalctl@ceph.mon.vm05.vm05.stdout: num_shard_bits : 4 2026-03-10T07:45:41.913 INFO:journalctl@ceph.mon.vm05.vm05.stdout: strict_capacity_limit : 0 2026-03-10T07:45:41.913 INFO:journalctl@ceph.mon.vm05.vm05.stdout: high_pri_pool_ratio: 0.000 2026-03-10T07:45:41.913 INFO:journalctl@ceph.mon.vm05.vm05.stdout: block_cache_compressed: (nil) 2026-03-10T07:45:41.913 INFO:journalctl@ceph.mon.vm05.vm05.stdout: persistent_cache: (nil) 2026-03-10T07:45:41.913 INFO:journalctl@ceph.mon.vm05.vm05.stdout: block_size: 4096 2026-03-10T07:45:41.913 INFO:journalctl@ceph.mon.vm05.vm05.stdout: block_size_deviation: 10 2026-03-10T07:45:41.913 INFO:journalctl@ceph.mon.vm05.vm05.stdout: block_restart_interval: 16 2026-03-10T07:45:41.913 INFO:journalctl@ceph.mon.vm05.vm05.stdout: index_block_restart_interval: 1 2026-03-10T07:45:41.913 INFO:journalctl@ceph.mon.vm05.vm05.stdout: 
metadata_block_size: 4096 2026-03-10T07:45:41.914 INFO:journalctl@ceph.mon.vm05.vm05.stdout: partition_filters: 0 2026-03-10T07:45:41.914 INFO:journalctl@ceph.mon.vm05.vm05.stdout: use_delta_encoding: 1 2026-03-10T07:45:41.914 INFO:journalctl@ceph.mon.vm05.vm05.stdout: filter_policy: bloomfilter 2026-03-10T07:45:41.914 INFO:journalctl@ceph.mon.vm05.vm05.stdout: whole_key_filtering: 1 2026-03-10T07:45:41.914 INFO:journalctl@ceph.mon.vm05.vm05.stdout: verify_compression: 0 2026-03-10T07:45:41.914 INFO:journalctl@ceph.mon.vm05.vm05.stdout: read_amp_bytes_per_bit: 0 2026-03-10T07:45:41.914 INFO:journalctl@ceph.mon.vm05.vm05.stdout: format_version: 5 2026-03-10T07:45:41.914 INFO:journalctl@ceph.mon.vm05.vm05.stdout: enable_index_compression: 1 2026-03-10T07:45:41.914 INFO:journalctl@ceph.mon.vm05.vm05.stdout: block_align: 0 2026-03-10T07:45:41.914 INFO:journalctl@ceph.mon.vm05.vm05.stdout: max_auto_readahead_size: 262144 2026-03-10T07:45:41.914 INFO:journalctl@ceph.mon.vm05.vm05.stdout: prepopulate_block_cache: 0 2026-03-10T07:45:41.914 INFO:journalctl@ceph.mon.vm05.vm05.stdout: initial_auto_readahead_size: 8192 2026-03-10T07:45:41.914 INFO:journalctl@ceph.mon.vm05.vm05.stdout: num_file_reads_for_auto_readahead: 2 2026-03-10T07:45:41.914 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:45:41 vm05 ceph-mon[50387]: rocksdb: Options.write_buffer_size: 33554432 2026-03-10T07:45:41.914 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:45:41 vm05 ceph-mon[50387]: rocksdb: Options.max_write_buffer_number: 2 2026-03-10T07:45:41.914 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:45:41 vm05 ceph-mon[50387]: rocksdb: Options.compression: NoCompression 2026-03-10T07:45:41.914 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:45:41 vm05 ceph-mon[50387]: rocksdb: Options.bottommost_compression: Disabled 2026-03-10T07:45:41.914 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:45:41 vm05 ceph-mon[50387]: rocksdb: Options.prefix_extractor: nullptr 2026-03-10T07:45:41.914 
INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:45:41 vm05 ceph-mon[50387]: rocksdb: Options.memtable_insert_with_hint_prefix_extractor: nullptr 2026-03-10T07:45:41.914 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:45:41 vm05 ceph-mon[50387]: rocksdb: Options.num_levels: 7 2026-03-10T07:45:41.914 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:45:41 vm05 ceph-mon[50387]: rocksdb: Options.min_write_buffer_number_to_merge: 1 2026-03-10T07:45:41.914 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:45:41 vm05 ceph-mon[50387]: rocksdb: Options.max_write_buffer_number_to_maintain: 0 2026-03-10T07:45:41.914 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:45:41 vm05 ceph-mon[50387]: rocksdb: Options.max_write_buffer_size_to_maintain: 0 2026-03-10T07:45:41.914 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:45:41 vm05 ceph-mon[50387]: rocksdb: Options.bottommost_compression_opts.window_bits: -14 2026-03-10T07:45:41.914 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:45:41 vm05 ceph-mon[50387]: rocksdb: Options.bottommost_compression_opts.level: 32767 2026-03-10T07:45:41.914 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:45:41 vm05 ceph-mon[50387]: rocksdb: Options.bottommost_compression_opts.strategy: 0 2026-03-10T07:45:41.914 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:45:41 vm05 ceph-mon[50387]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0 2026-03-10T07:45:41.914 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:45:41 vm05 ceph-mon[50387]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0 2026-03-10T07:45:41.914 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:45:41 vm05 ceph-mon[50387]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1 2026-03-10T07:45:41.914 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:45:41 vm05 ceph-mon[50387]: rocksdb: Options.bottommost_compression_opts.enabled: false 2026-03-10T07:45:41.914 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 
07:45:41 vm05 ceph-mon[50387]: rocksdb: Options.bottommost_compression_opts.max_dict_buffer_bytes: 0 2026-03-10T07:45:41.914 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:45:41 vm05 ceph-mon[50387]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true 2026-03-10T07:45:41.914 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:45:41 vm05 ceph-mon[50387]: rocksdb: Options.compression_opts.window_bits: -14 2026-03-10T07:45:41.914 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:45:41 vm05 ceph-mon[50387]: rocksdb: Options.compression_opts.level: 32767 2026-03-10T07:45:41.914 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:45:41 vm05 ceph-mon[50387]: rocksdb: Options.compression_opts.strategy: 0 2026-03-10T07:45:41.914 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:45:41 vm05 ceph-mon[50387]: rocksdb: Options.compression_opts.max_dict_bytes: 0 2026-03-10T07:45:41.914 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:45:41 vm05 ceph-mon[50387]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0 2026-03-10T07:45:41.914 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:45:41 vm05 ceph-mon[50387]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true 2026-03-10T07:45:41.914 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:45:41 vm05 ceph-mon[50387]: rocksdb: Options.compression_opts.parallel_threads: 1 2026-03-10T07:45:41.914 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:45:41 vm05 ceph-mon[50387]: rocksdb: Options.compression_opts.enabled: false 2026-03-10T07:45:41.914 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:45:41 vm05 ceph-mon[50387]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0 2026-03-10T07:45:41.914 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:45:41 vm05 ceph-mon[50387]: rocksdb: Options.level0_file_num_compaction_trigger: 4 2026-03-10T07:45:41.914 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:45:41 vm05 ceph-mon[50387]: rocksdb: 
Options.level0_slowdown_writes_trigger: 20 2026-03-10T07:45:41.914 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:45:41 vm05 ceph-mon[50387]: rocksdb: Options.level0_stop_writes_trigger: 36 2026-03-10T07:45:41.914 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:45:41 vm05 ceph-mon[50387]: rocksdb: Options.target_file_size_base: 67108864 2026-03-10T07:45:41.914 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:45:41 vm05 ceph-mon[50387]: rocksdb: Options.target_file_size_multiplier: 1 2026-03-10T07:45:41.914 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:45:41 vm05 ceph-mon[50387]: rocksdb: Options.max_bytes_for_level_base: 268435456 2026-03-10T07:45:41.914 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:45:41 vm05 ceph-mon[50387]: rocksdb: Options.level_compaction_dynamic_level_bytes: 1 2026-03-10T07:45:41.914 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:45:41 vm05 ceph-mon[50387]: rocksdb: Options.max_bytes_for_level_multiplier: 10.000000 2026-03-10T07:45:41.914 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:45:41 vm05 ceph-mon[50387]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1 2026-03-10T07:45:41.914 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:45:41 vm05 ceph-mon[50387]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1 2026-03-10T07:45:41.914 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:45:41 vm05 ceph-mon[50387]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1 2026-03-10T07:45:41.914 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:45:41 vm05 ceph-mon[50387]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1 2026-03-10T07:45:41.914 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:45:41 vm05 ceph-mon[50387]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1 2026-03-10T07:45:41.914 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:45:41 vm05 ceph-mon[50387]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1 2026-03-10T07:45:41.914 
INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:45:41 vm05 ceph-mon[50387]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1 2026-03-10T07:45:41.914 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:45:41 vm05 ceph-mon[50387]: rocksdb: Options.max_sequential_skip_in_iterations: 8 2026-03-10T07:45:41.914 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:45:41 vm05 ceph-mon[50387]: rocksdb: Options.max_compaction_bytes: 1677721600 2026-03-10T07:45:41.914 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:45:41 vm05 ceph-mon[50387]: rocksdb: Options.ignore_max_compaction_bytes_for_input: true 2026-03-10T07:45:41.914 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:45:41 vm05 ceph-mon[50387]: rocksdb: Options.arena_block_size: 1048576 2026-03-10T07:45:41.914 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:45:41 vm05 ceph-mon[50387]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736 2026-03-10T07:45:41.914 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:45:41 vm05 ceph-mon[50387]: rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944 2026-03-10T07:45:41.914 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:45:41 vm05 ceph-mon[50387]: rocksdb: Options.disable_auto_compactions: 0 2026-03-10T07:45:41.914 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:45:41 vm05 ceph-mon[50387]: rocksdb: Options.compaction_style: kCompactionStyleLevel 2026-03-10T07:45:41.914 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:45:41 vm05 ceph-mon[50387]: rocksdb: Options.compaction_pri: kMinOverlappingRatio 2026-03-10T07:45:41.915 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:45:41 vm05 ceph-mon[50387]: rocksdb: Options.compaction_options_universal.size_ratio: 1 2026-03-10T07:45:41.915 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:45:41 vm05 ceph-mon[50387]: rocksdb: Options.compaction_options_universal.min_merge_width: 2 2026-03-10T07:45:41.915 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:45:41 vm05 
ceph-mon[50387]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295 2026-03-10T07:45:41.915 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:45:41 vm05 ceph-mon[50387]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200 2026-03-10T07:45:41.915 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:45:41 vm05 ceph-mon[50387]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1 2026-03-10T07:45:41.915 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:45:41 vm05 ceph-mon[50387]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize 2026-03-10T07:45:41.915 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:45:41 vm05 ceph-mon[50387]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824 2026-03-10T07:45:41.915 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:45:41 vm05 ceph-mon[50387]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0 2026-03-10T07:45:41.915 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:45:41 vm05 ceph-mon[50387]: rocksdb: Options.table_properties_collectors: 2026-03-10T07:45:41.915 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:45:41 vm05 ceph-mon[50387]: rocksdb: Options.inplace_update_support: 0 2026-03-10T07:45:41.915 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:45:41 vm05 ceph-mon[50387]: rocksdb: Options.inplace_update_num_locks: 10000 2026-03-10T07:45:41.915 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:45:41 vm05 ceph-mon[50387]: rocksdb: Options.memtable_prefix_bloom_size_ratio: 0.000000 2026-03-10T07:45:41.915 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:45:41 vm05 ceph-mon[50387]: rocksdb: Options.memtable_whole_key_filtering: 0 2026-03-10T07:45:41.915 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:45:41 vm05 ceph-mon[50387]: rocksdb: Options.memtable_huge_page_size: 0 2026-03-10T07:45:41.915 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:45:41 vm05 
ceph-mon[50387]: rocksdb: Options.bloom_locality: 0 2026-03-10T07:45:41.915 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:45:41 vm05 ceph-mon[50387]: rocksdb: Options.max_successive_merges: 0 2026-03-10T07:45:41.915 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:45:41 vm05 ceph-mon[50387]: rocksdb: Options.optimize_filters_for_hits: 0 2026-03-10T07:45:41.915 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:45:41 vm05 ceph-mon[50387]: rocksdb: Options.paranoid_file_checks: 0 2026-03-10T07:45:41.915 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:45:41 vm05 ceph-mon[50387]: rocksdb: Options.force_consistency_checks: 1 2026-03-10T07:45:41.915 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:45:41 vm05 ceph-mon[50387]: rocksdb: Options.report_bg_io_stats: 0 2026-03-10T07:45:41.915 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:45:41 vm05 ceph-mon[50387]: rocksdb: Options.ttl: 2592000 2026-03-10T07:45:41.915 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:45:41 vm05 ceph-mon[50387]: rocksdb: Options.periodic_compaction_seconds: 0 2026-03-10T07:45:41.915 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:45:41 vm05 ceph-mon[50387]: rocksdb: Options.preclude_last_level_data_seconds: 0 2026-03-10T07:45:41.915 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:45:41 vm05 ceph-mon[50387]: rocksdb: Options.preserve_internal_time_seconds: 0 2026-03-10T07:45:41.915 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:45:41 vm05 ceph-mon[50387]: rocksdb: Options.enable_blob_files: false 2026-03-10T07:45:41.915 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:45:41 vm05 ceph-mon[50387]: rocksdb: Options.min_blob_size: 0 2026-03-10T07:45:41.915 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:45:41 vm05 ceph-mon[50387]: rocksdb: Options.blob_file_size: 268435456 2026-03-10T07:45:41.915 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:45:41 vm05 ceph-mon[50387]: rocksdb: Options.blob_compression_type: NoCompression 2026-03-10T07:45:41.915 
INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:45:41 vm05 ceph-mon[50387]: rocksdb: Options.enable_blob_garbage_collection: false 2026-03-10T07:45:41.915 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:45:41 vm05 ceph-mon[50387]: rocksdb: Options.blob_garbage_collection_age_cutoff: 0.250000 2026-03-10T07:45:41.915 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:45:41 vm05 ceph-mon[50387]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000 2026-03-10T07:45:41.915 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:45:41 vm05 ceph-mon[50387]: rocksdb: Options.blob_compaction_readahead_size: 0 2026-03-10T07:45:41.915 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:45:41 vm05 ceph-mon[50387]: rocksdb: Options.blob_file_starting_level: 0 2026-03-10T07:45:41.915 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:45:41 vm05 ceph-mon[50387]: rocksdb: Options.experimental_mempurge_threshold: 0.000000 2026-03-10T07:45:41.915 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:45:41 vm05 ceph-mon[50387]: rocksdb: [db/version_set.cc:5566] Recovered from manifest file:/var/lib/ceph/mon/ceph-vm05/store.db/MANIFEST-000010 succeeded,manifest_file_number is 10, next_file_number is 12, last_sequence is 5, log_number is 5,prev_log_number is 0,max_column_family is 0,min_log_number_to_keep is 5 2026-03-10T07:45:41.915 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:45:41 vm05 ceph-mon[50387]: rocksdb: [db/version_set.cc:5581] Column family [default] (ID 0), log number is 5 2026-03-10T07:45:41.915 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:45:41 vm05 ceph-mon[50387]: rocksdb: [db/db_impl/db_impl_open.cc:539] DB ID: 07859557-12af-4cf8-b0fd-7862ae4579b0 2026-03-10T07:45:41.915 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:45:41 vm05 ceph-mon[50387]: rocksdb: EVENT_LOG_v1 {"time_micros": 1773128741867356, "job": 1, "event": "recovery_started", "wal_files": [9]} 2026-03-10T07:45:41.915 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 
07:45:41 vm05 ceph-mon[50387]: rocksdb: [db/db_impl/db_impl_open.cc:1043] Recovering log #9 mode 2 2026-03-10T07:45:41.915 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:45:41 vm05 ceph-mon[50387]: rocksdb: EVENT_LOG_v1 {"time_micros": 1773128741869245, "cf_name": "default", "job": 1, "event": "table_file_creation", "file_number": 13, "file_size": 84711, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 8, "largest_seqno": 287, "table_properties": {"data_size": 82789, "index_size": 209, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 709, "raw_key_size": 13288, "raw_average_key_size": 51, "raw_value_size": 75614, "raw_average_value_size": 293, "num_data_blocks": 9, "num_entries": 258, "num_filter_entries": 258, "num_deletions": 3, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1773128741, "oldest_key_time": 0, "file_creation_time": 0, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "07859557-12af-4cf8-b0fd-7862ae4579b0", "db_session_id": "K2C0S43GXTYIEA535J19", "orig_file_number": 13, "seqno_to_time_mapping": "N/A"}} 2026-03-10T07:45:41.915 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:45:41 vm05 ceph-mon[50387]: rocksdb: EVENT_LOG_v1 {"time_micros": 1773128741869295, "job": 1, "event": "recovery_finished"} 2026-03-10T07:45:41.915 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:45:41 vm05 ceph-mon[50387]: rocksdb: 
[db/version_set.cc:5047] Creating manifest 15 2026-03-10T07:45:41.915 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:45:41 vm05 ceph-mon[50387]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-vm05/store.db/000009.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 2026-03-10T07:45:41.915 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:45:41 vm05 ceph-mon[50387]: rocksdb: [db/db_impl/db_impl_open.cc:1987] SstFileManager instance 0x555bd79fe000 2026-03-10T07:45:41.915 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:45:41 vm05 ceph-mon[50387]: rocksdb: DB pointer 0x555bd79ea000 2026-03-10T07:45:42.100 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:42.007+0000 7fbbe95b0700 1 Processor -- start 2026-03-10T07:45:42.100 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:42.008+0000 7fbbe95b0700 1 -- start start 2026-03-10T07:45:42.100 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:42.008+0000 7fbbe95b0700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fbbe4105650 0x7fbbe4105a70 unknown :-1 s=NONE pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:45:42.100 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:42.009+0000 7fbbe95b0700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fbbe4106040 con 0x7fbbe4105650 2026-03-10T07:45:42.100 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:42.009+0000 7fbbe2ffd700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fbbe4105650 0x7fbbe4105a70 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:45:42.100 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 
2026-03-10T07:45:42.009+0000 7fbbe2ffd700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fbbe4105650 0x7fbbe4105a70 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.105:36240/0 (socket says 192.168.123.105:36240) 2026-03-10T07:45:42.100 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:42.009+0000 7fbbe2ffd700 1 -- 192.168.123.105:0/2620823705 learned_addr learned my addr 192.168.123.105:0/2620823705 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T07:45:42.100 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:42.010+0000 7fbbe2ffd700 1 -- 192.168.123.105:0/2620823705 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fbbe4106850 con 0x7fbbe4105650 2026-03-10T07:45:42.100 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:42.010+0000 7fbbe2ffd700 1 --2- 192.168.123.105:0/2620823705 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fbbe4105650 0x7fbbe4105a70 secure :-1 s=READY pgs=1 cs=0 l=1 rev1=1 crypto rx=0x7fbbcc00bf90 tx=0x7fbbcc00d5d0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=f72ce93fee5afc2d server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:45:42.100 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:42.010+0000 7fbbe1ffb700 1 -- 192.168.123.105:0/2620823705 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fbbcc00dcc0 con 0x7fbbe4105650 2026-03-10T07:45:42.100 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:42.010+0000 7fbbe1ffb700 1 -- 192.168.123.105:0/2620823705 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(27 keys) v1 ==== 1037+0+0 (secure 0 0 0) 0x7fbbcc00de20 con 0x7fbbe4105650 2026-03-10T07:45:42.100 
INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:42.010+0000 7fbbe1ffb700 1 -- 192.168.123.105:0/2620823705 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fbbcc003e20 con 0x7fbbe4105650 2026-03-10T07:45:42.100 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:42.011+0000 7fbbe95b0700 1 -- 192.168.123.105:0/2620823705 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fbbe4105650 msgr2=0x7fbbe4105a70 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:45:42.100 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:42.011+0000 7fbbe95b0700 1 --2- 192.168.123.105:0/2620823705 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fbbe4105650 0x7fbbe4105a70 secure :-1 s=READY pgs=1 cs=0 l=1 rev1=1 crypto rx=0x7fbbcc00bf90 tx=0x7fbbcc00d5d0 comp rx=0 tx=0).stop 2026-03-10T07:45:42.100 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:42.011+0000 7fbbe95b0700 1 -- 192.168.123.105:0/2620823705 shutdown_connections 2026-03-10T07:45:42.100 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:42.011+0000 7fbbe95b0700 1 --2- 192.168.123.105:0/2620823705 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fbbe4105650 0x7fbbe4105a70 unknown :-1 s=CLOSED pgs=1 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:45:42.101 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:42.011+0000 7fbbe95b0700 1 -- 192.168.123.105:0/2620823705 >> 192.168.123.105:0/2620823705 conn(0x7fbbe4100bd0 msgr2=0x7fbbe4103030 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:45:42.101 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:42.011+0000 7fbbe95b0700 1 -- 192.168.123.105:0/2620823705 shutdown_connections 2026-03-10T07:45:42.101 
INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:42.011+0000 7fbbe95b0700 1 -- 192.168.123.105:0/2620823705 wait complete. 2026-03-10T07:45:42.101 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:42.012+0000 7fbbe95b0700 1 Processor -- start 2026-03-10T07:45:42.101 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:42.012+0000 7fbbe95b0700 1 -- start start 2026-03-10T07:45:42.101 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:42.012+0000 7fbbe95b0700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fbbe4105650 0x7fbbe4198100 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:45:42.101 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:42.012+0000 7fbbe95b0700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fbbe4198640 con 0x7fbbe4105650 2026-03-10T07:45:42.101 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:42.012+0000 7fbbe2ffd700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fbbe4105650 0x7fbbe4198100 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:45:42.101 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:42.013+0000 7fbbe2ffd700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fbbe4105650 0x7fbbe4198100 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.105:36252/0 (socket says 192.168.123.105:36252) 2026-03-10T07:45:42.101 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:42.013+0000 7fbbe2ffd700 1 -- 192.168.123.105:0/1072182292 learned_addr 
learned my addr 192.168.123.105:0/1072182292 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T07:45:42.101 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:42.013+0000 7fbbe2ffd700 1 -- 192.168.123.105:0/1072182292 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fbbcc00b9e0 con 0x7fbbe4105650 2026-03-10T07:45:42.101 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:42.013+0000 7fbbe2ffd700 1 --2- 192.168.123.105:0/1072182292 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fbbe4105650 0x7fbbe4198100 secure :-1 s=READY pgs=2 cs=0 l=1 rev1=1 crypto rx=0x7fbbcc00da10 tx=0x7fbbcc0115c0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:45:42.101 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:42.013+0000 7fbbdbfff700 1 -- 192.168.123.105:0/1072182292 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fbbcc0119f0 con 0x7fbbe4105650 2026-03-10T07:45:42.101 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:42.013+0000 7fbbdbfff700 1 -- 192.168.123.105:0/1072182292 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(27 keys) v1 ==== 1037+0+0 (secure 0 0 0) 0x7fbbcc0193f0 con 0x7fbbe4105650 2026-03-10T07:45:42.101 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:42.013+0000 7fbbe95b0700 1 -- 192.168.123.105:0/1072182292 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fbbe4198840 con 0x7fbbe4105650 2026-03-10T07:45:42.101 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:42.013+0000 7fbbe95b0700 1 -- 192.168.123.105:0/1072182292 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fbbe4198c60 con 0x7fbbe4105650 
2026-03-10T07:45:42.101 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:42.014+0000 7fbbdbfff700 1 -- 192.168.123.105:0/1072182292 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fbbcc0119f0 con 0x7fbbe4105650 2026-03-10T07:45:42.101 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:42.014+0000 7fbbdbfff700 1 -- 192.168.123.105:0/1072182292 <== mon.0 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 1) v1 ==== 660+0+0 (secure 0 0 0) 0x7fbbcc019560 con 0x7fbbe4105650 2026-03-10T07:45:42.101 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:42.014+0000 7fbbdbfff700 1 -- 192.168.123.105:0/1072182292 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(1..1 src has 1..1) v4 ==== 725+0+0 (secure 0 0 0) 0x7fbbcc024c40 con 0x7fbbe4105650 2026-03-10T07:45:42.101 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:42.014+0000 7fbbe95b0700 1 -- 192.168.123.105:0/1072182292 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fbbe4191ee0 con 0x7fbbe4105650 2026-03-10T07:45:42.101 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:42.016+0000 7fbbdbfff700 1 -- 192.168.123.105:0/1072182292 <== mon.0 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+72446 (secure 0 0 0) 0x7fbbcc01f070 con 0x7fbbe4105650 2026-03-10T07:45:42.101 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:42.050+0000 7fbbe95b0700 1 -- 192.168.123.105:0/1072182292 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command([{prefix=config set, name=public_network}] v 0) v1 -- 0x7fbbe404fa20 con 0x7fbbe4105650 2026-03-10T07:45:42.101 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:42.053+0000 7fbbdbfff700 1 -- 
192.168.123.105:0/1072182292 <== mon.0 v2:192.168.123.105:3300/0 7 ==== config(28 keys) v1 ==== 1075+0+0 (secure 0 0 0) 0x7fbbcc028030 con 0x7fbbe4105650 2026-03-10T07:45:42.101 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:42.053+0000 7fbbdbfff700 1 -- 192.168.123.105:0/1072182292 <== mon.0 v2:192.168.123.105:3300/0 8 ==== mon_command_ack([{prefix=config set, name=public_network}]=0 v3) v1 ==== 130+0+0 (secure 0 0 0) 0x7fbbcc01f070 con 0x7fbbe4105650 2026-03-10T07:45:42.101 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:42.054+0000 7fbbe95b0700 1 -- 192.168.123.105:0/1072182292 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fbbe4105650 msgr2=0x7fbbe4198100 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:45:42.101 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:42.054+0000 7fbbe95b0700 1 --2- 192.168.123.105:0/1072182292 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fbbe4105650 0x7fbbe4198100 secure :-1 s=READY pgs=2 cs=0 l=1 rev1=1 crypto rx=0x7fbbcc00da10 tx=0x7fbbcc0115c0 comp rx=0 tx=0).stop 2026-03-10T07:45:42.101 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:42.054+0000 7fbbe95b0700 1 -- 192.168.123.105:0/1072182292 shutdown_connections 2026-03-10T07:45:42.101 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:42.054+0000 7fbbe95b0700 1 --2- 192.168.123.105:0/1072182292 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fbbe4105650 0x7fbbe4198100 unknown :-1 s=CLOSED pgs=2 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:45:42.101 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:42.054+0000 7fbbe95b0700 1 -- 192.168.123.105:0/1072182292 >> 192.168.123.105:0/1072182292 conn(0x7fbbe4100bd0 msgr2=0x7fbbe418f020 unknown :-1 s=STATE_NONE l=0).mark_down 
2026-03-10T07:45:42.101 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:42.055+0000 7fbbe95b0700 1 -- 192.168.123.105:0/1072182292 shutdown_connections 2026-03-10T07:45:42.101 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:42.055+0000 7fbbe95b0700 1 -- 192.168.123.105:0/1072182292 wait complete. 2026-03-10T07:45:42.101 INFO:teuthology.orchestra.run.vm05.stdout:Wrote config to /etc/ceph/ceph.conf 2026-03-10T07:45:42.102 INFO:teuthology.orchestra.run.vm05.stdout:Wrote keyring to /etc/ceph/ceph.client.admin.keyring 2026-03-10T07:45:42.102 INFO:teuthology.orchestra.run.vm05.stdout:Creating mgr... 2026-03-10T07:45:42.102 INFO:teuthology.orchestra.run.vm05.stdout:Verifying port 0.0.0.0:9283 ... 2026-03-10T07:45:42.103 INFO:teuthology.orchestra.run.vm05.stdout:Verifying port 0.0.0.0:8765 ... 2026-03-10T07:45:42.103 INFO:teuthology.orchestra.run.vm05.stdout:Verifying port 0.0.0.0:8443 ... 2026-03-10T07:45:42.170 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:45:41 vm05 ceph-mon[50387]: mon.vm05 is new leader, mons vm05 in quorum (ranks 0) 2026-03-10T07:45:42.170 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:45:41 vm05 ceph-mon[50387]: monmap e1: 1 mons at {vm05=[v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0]} removed_ranks: {} disallowed_leaders: {} 2026-03-10T07:45:42.170 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:45:41 vm05 ceph-mon[50387]: fsmap 2026-03-10T07:45:42.170 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:45:41 vm05 ceph-mon[50387]: osdmap e1: 0 total, 0 up, 0 in 2026-03-10T07:45:42.170 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:45:41 vm05 ceph-mon[50387]: mgrmap e1: no daemons active 2026-03-10T07:45:42.253 INFO:teuthology.orchestra.run.vm05.stdout:Non-zero exit code 1 from systemctl reset-failed ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508@mgr.vm05.blexke 2026-03-10T07:45:42.253 INFO:teuthology.orchestra.run.vm05.stdout:systemctl: stderr Failed to reset 
failed state of unit ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508@mgr.vm05.blexke.service: Unit ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508@mgr.vm05.blexke.service not loaded. 2026-03-10T07:45:42.377 INFO:teuthology.orchestra.run.vm05.stdout:systemctl: stderr Created symlink /etc/systemd/system/ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508.target.wants/ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508@mgr.vm05.blexke.service → /etc/systemd/system/ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508@.service. 2026-03-10T07:45:42.532 INFO:teuthology.orchestra.run.vm05.stdout:firewalld does not appear to be present 2026-03-10T07:45:42.532 INFO:teuthology.orchestra.run.vm05.stdout:Not possible to enable service . firewalld.service is not available 2026-03-10T07:45:42.532 INFO:teuthology.orchestra.run.vm05.stdout:firewalld does not appear to be present 2026-03-10T07:45:42.532 INFO:teuthology.orchestra.run.vm05.stdout:Not possible to open ports <[9283, 8765, 8443]>. firewalld.service is not available 2026-03-10T07:45:42.532 INFO:teuthology.orchestra.run.vm05.stdout:Waiting for mgr to start... 2026-03-10T07:45:42.532 INFO:teuthology.orchestra.run.vm05.stdout:Waiting for mgr... 
2026-03-10T07:45:42.823 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout 2026-03-10T07:45:42.823 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout { 2026-03-10T07:45:42.823 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "fsid": "12e9780e-1c55-11f1-8896-79f7c2e9b508", 2026-03-10T07:45:42.823 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "health": { 2026-03-10T07:45:42.823 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "status": "HEALTH_OK", 2026-03-10T07:45:42.823 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "checks": {}, 2026-03-10T07:45:42.823 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "mutes": [] 2026-03-10T07:45:42.823 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout }, 2026-03-10T07:45:42.823 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "election_epoch": 5, 2026-03-10T07:45:42.823 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "quorum": [ 2026-03-10T07:45:42.823 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout 0 2026-03-10T07:45:42.823 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout ], 2026-03-10T07:45:42.823 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "quorum_names": [ 2026-03-10T07:45:42.823 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "vm05" 2026-03-10T07:45:42.823 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout ], 2026-03-10T07:45:42.823 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "quorum_age": 0, 2026-03-10T07:45:42.823 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "monmap": { 2026-03-10T07:45:42.823 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "epoch": 1, 2026-03-10T07:45:42.823 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "min_mon_release_name": "reef", 2026-03-10T07:45:42.823 
INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "num_mons": 1 2026-03-10T07:45:42.823 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout }, 2026-03-10T07:45:42.823 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "osdmap": { 2026-03-10T07:45:42.823 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "epoch": 1, 2026-03-10T07:45:42.823 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "num_osds": 0, 2026-03-10T07:45:42.825 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "num_up_osds": 0, 2026-03-10T07:45:42.825 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "osd_up_since": 0, 2026-03-10T07:45:42.825 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "num_in_osds": 0, 2026-03-10T07:45:42.825 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "osd_in_since": 0, 2026-03-10T07:45:42.825 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "num_remapped_pgs": 0 2026-03-10T07:45:42.825 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout }, 2026-03-10T07:45:42.825 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "pgmap": { 2026-03-10T07:45:42.825 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "pgs_by_state": [], 2026-03-10T07:45:42.825 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "num_pgs": 0, 2026-03-10T07:45:42.825 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "num_pools": 0, 2026-03-10T07:45:42.825 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "num_objects": 0, 2026-03-10T07:45:42.825 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "data_bytes": 0, 2026-03-10T07:45:42.825 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "bytes_used": 0, 2026-03-10T07:45:42.825 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "bytes_avail": 0, 2026-03-10T07:45:42.825 
INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "bytes_total": 0 2026-03-10T07:45:42.825 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout }, 2026-03-10T07:45:42.825 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "fsmap": { 2026-03-10T07:45:42.825 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "epoch": 1, 2026-03-10T07:45:42.825 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "by_rank": [], 2026-03-10T07:45:42.825 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "up:standby": 0 2026-03-10T07:45:42.825 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout }, 2026-03-10T07:45:42.825 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "mgrmap": { 2026-03-10T07:45:42.825 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "available": false, 2026-03-10T07:45:42.825 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "num_standbys": 0, 2026-03-10T07:45:42.825 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "modules": [ 2026-03-10T07:45:42.825 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "iostat", 2026-03-10T07:45:42.825 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "nfs", 2026-03-10T07:45:42.825 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "restful" 2026-03-10T07:45:42.825 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout ], 2026-03-10T07:45:42.825 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "services": {} 2026-03-10T07:45:42.825 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout }, 2026-03-10T07:45:42.825 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "servicemap": { 2026-03-10T07:45:42.825 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "epoch": 1, 2026-03-10T07:45:42.825 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "modified": "2026-03-10T07:45:40.905462+0000", 
2026-03-10T07:45:42.825 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "services": {} 2026-03-10T07:45:42.825 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout }, 2026-03-10T07:45:42.825 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "progress_events": {} 2026-03-10T07:45:42.825 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout } 2026-03-10T07:45:42.825 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:42.675+0000 7fe85d067700 1 Processor -- start 2026-03-10T07:45:42.825 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:42.675+0000 7fe85d067700 1 -- start start 2026-03-10T07:45:42.825 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:42.675+0000 7fe85d067700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fe8581087a0 0x7fe858108bc0 unknown :-1 s=NONE pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:45:42.825 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:42.675+0000 7fe85d067700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fe858109190 con 0x7fe8581087a0 2026-03-10T07:45:42.825 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:42.677+0000 7fe857fff700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fe8581087a0 0x7fe858108bc0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:45:42.825 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:42.677+0000 7fe857fff700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fe8581087a0 0x7fe858108bc0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am 
v2:192.168.123.105:36286/0 (socket says 192.168.123.105:36286) 2026-03-10T07:45:42.825 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:42.677+0000 7fe857fff700 1 -- 192.168.123.105:0/590014804 learned_addr learned my addr 192.168.123.105:0/590014804 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T07:45:42.825 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:42.677+0000 7fe857fff700 1 -- 192.168.123.105:0/590014804 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fe8581099a0 con 0x7fe8581087a0 2026-03-10T07:45:42.825 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:42.677+0000 7fe857fff700 1 --2- 192.168.123.105:0/590014804 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fe8581087a0 0x7fe858108bc0 secure :-1 s=READY pgs=5 cs=0 l=1 rev1=1 crypto rx=0x7fe848009a90 tx=0x7fe848009da0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=d9d1e4855e9fcf02 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:45:42.825 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:42.679+0000 7fe856ffd700 1 -- 192.168.123.105:0/590014804 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fe848004030 con 0x7fe8581087a0 2026-03-10T07:45:42.825 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:42.679+0000 7fe856ffd700 1 -- 192.168.123.105:0/590014804 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1075+0+0 (secure 0 0 0) 0x7fe84800b7e0 con 0x7fe8581087a0 2026-03-10T07:45:42.825 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:42.679+0000 7fe856ffd700 1 -- 192.168.123.105:0/590014804 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fe8480039f0 con 0x7fe8581087a0 2026-03-10T07:45:42.825 
INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:42.680+0000 7fe85d067700 1 -- 192.168.123.105:0/590014804 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fe8581087a0 msgr2=0x7fe858108bc0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:45:42.826 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:42.680+0000 7fe85d067700 1 --2- 192.168.123.105:0/590014804 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fe8581087a0 0x7fe858108bc0 secure :-1 s=READY pgs=5 cs=0 l=1 rev1=1 crypto rx=0x7fe848009a90 tx=0x7fe848009da0 comp rx=0 tx=0).stop 2026-03-10T07:45:42.826 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:42.683+0000 7fe85d067700 1 -- 192.168.123.105:0/590014804 shutdown_connections 2026-03-10T07:45:42.826 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:42.683+0000 7fe85d067700 1 --2- 192.168.123.105:0/590014804 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fe8581087a0 0x7fe858108bc0 unknown :-1 s=CLOSED pgs=5 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:45:42.826 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:42.683+0000 7fe85d067700 1 -- 192.168.123.105:0/590014804 >> 192.168.123.105:0/590014804 conn(0x7fe85807bbb0 msgr2=0x7fe8581064e0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:45:42.826 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:42.684+0000 7fe85d067700 1 -- 192.168.123.105:0/590014804 shutdown_connections 2026-03-10T07:45:42.826 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:42.685+0000 7fe85d067700 1 -- 192.168.123.105:0/590014804 wait complete. 
2026-03-10T07:45:42.826 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:42.685+0000 7fe85d067700 1 Processor -- start 2026-03-10T07:45:42.826 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:42.685+0000 7fe85d067700 1 -- start start 2026-03-10T07:45:42.826 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:42.685+0000 7fe85d067700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fe8581087a0 0x7fe8581a9190 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:45:42.826 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:42.685+0000 7fe85d067700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fe8581a96d0 con 0x7fe8581087a0 2026-03-10T07:45:42.826 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:42.685+0000 7fe857fff700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fe8581087a0 0x7fe8581a9190 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:45:42.826 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:42.686+0000 7fe857fff700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fe8581087a0 0x7fe8581a9190 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.105:36292/0 (socket says 192.168.123.105:36292) 2026-03-10T07:45:42.826 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:42.686+0000 7fe857fff700 1 -- 192.168.123.105:0/1690117156 learned_addr learned my addr 192.168.123.105:0/1690117156 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T07:45:42.826 
INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:42.686+0000 7fe857fff700 1 -- 192.168.123.105:0/1690117156 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fe848009740 con 0x7fe8581087a0 2026-03-10T07:45:42.826 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:42.686+0000 7fe857fff700 1 --2- 192.168.123.105:0/1690117156 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fe8581087a0 0x7fe8581a9190 secure :-1 s=READY pgs=6 cs=0 l=1 rev1=1 crypto rx=0x7fe84800beb0 tx=0x7fe848003c60 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:45:42.826 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:42.687+0000 7fe8557fa700 1 -- 192.168.123.105:0/1690117156 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fe848004080 con 0x7fe8581087a0 2026-03-10T07:45:42.826 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:42.687+0000 7fe85d067700 1 -- 192.168.123.105:0/1690117156 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fe85810a820 con 0x7fe8581087a0 2026-03-10T07:45:42.826 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:42.687+0000 7fe85d067700 1 -- 192.168.123.105:0/1690117156 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fe8581a9b60 con 0x7fe8581087a0 2026-03-10T07:45:42.826 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:42.687+0000 7fe8557fa700 1 -- 192.168.123.105:0/1690117156 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1075+0+0 (secure 0 0 0) 0x7fe84802b430 con 0x7fe8581087a0 2026-03-10T07:45:42.826 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:42.687+0000 7fe8557fa700 1 
-- 192.168.123.105:0/1690117156 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fe84801a430 con 0x7fe8581087a0 2026-03-10T07:45:42.826 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:42.688+0000 7fe8557fa700 1 -- 192.168.123.105:0/1690117156 <== mon.0 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 1) v1 ==== 660+0+0 (secure 0 0 0) 0x7fe84801a650 con 0x7fe8581087a0 2026-03-10T07:45:42.826 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:42.688+0000 7fe8557fa700 1 -- 192.168.123.105:0/1690117156 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(1..1 src has 1..1) v4 ==== 725+0+0 (secure 0 0 0) 0x7fe848011ad0 con 0x7fe8581087a0 2026-03-10T07:45:42.826 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:42.689+0000 7fe85d067700 1 -- 192.168.123.105:0/1690117156 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fe844005320 con 0x7fe8581087a0 2026-03-10T07:45:42.826 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:42.690+0000 7fe8557fa700 1 -- 192.168.123.105:0/1690117156 <== mon.0 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+72446 (secure 0 0 0) 0x7fe84802b9d0 con 0x7fe8581087a0 2026-03-10T07:45:42.826 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:42.728+0000 7fe85d067700 1 -- 192.168.123.105:0/1690117156 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "status", "format": "json-pretty"} v 0) v1 -- 0x7fe844005190 con 0x7fe8581087a0 2026-03-10T07:45:42.826 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:42.729+0000 7fe8557fa700 1 -- 192.168.123.105:0/1690117156 <== mon.0 v2:192.168.123.105:3300/0 7 ==== mon_command_ack([{"prefix": "status", "format": "json-pretty"}]=0 
v0) v1 ==== 79+0+1241 (secure 0 0 0) 0x7fe848011ce0 con 0x7fe8581087a0 2026-03-10T07:45:42.826 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:42.733+0000 7fe83effd700 1 -- 192.168.123.105:0/1690117156 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fe8581087a0 msgr2=0x7fe8581a9190 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:45:42.826 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:42.733+0000 7fe83effd700 1 --2- 192.168.123.105:0/1690117156 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fe8581087a0 0x7fe8581a9190 secure :-1 s=READY pgs=6 cs=0 l=1 rev1=1 crypto rx=0x7fe84800beb0 tx=0x7fe848003c60 comp rx=0 tx=0).stop 2026-03-10T07:45:42.826 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:42.733+0000 7fe83effd700 1 -- 192.168.123.105:0/1690117156 shutdown_connections 2026-03-10T07:45:42.826 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:42.733+0000 7fe83effd700 1 --2- 192.168.123.105:0/1690117156 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fe8581087a0 0x7fe8581a9190 unknown :-1 s=CLOSED pgs=6 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:45:42.826 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:42.734+0000 7fe83effd700 1 -- 192.168.123.105:0/1690117156 >> 192.168.123.105:0/1690117156 conn(0x7fe85807bbb0 msgr2=0x7fe85807d450 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:45:42.826 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:42.734+0000 7fe83effd700 1 -- 192.168.123.105:0/1690117156 shutdown_connections 2026-03-10T07:45:42.826 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:42.734+0000 7fe83effd700 1 -- 192.168.123.105:0/1690117156 wait complete. 
2026-03-10T07:45:42.826 INFO:teuthology.orchestra.run.vm05.stdout:mgr not available, waiting (1/15)... 2026-03-10T07:45:43.407 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:45:43 vm05 ceph-mon[50387]: from='client.? 192.168.123.105:0/1072182292' entity='client.admin' 2026-03-10T07:45:43.407 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:45:43 vm05 ceph-mon[50387]: from='client.? 192.168.123.105:0/1690117156' entity='client.admin' cmd=[{"prefix": "status", "format": "json-pretty"}]: dispatch 2026-03-10T07:45:45.087 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout 2026-03-10T07:45:45.087 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout { 2026-03-10T07:45:45.087 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "fsid": "12e9780e-1c55-11f1-8896-79f7c2e9b508", 2026-03-10T07:45:45.087 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "health": { 2026-03-10T07:45:45.087 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "status": "HEALTH_OK", 2026-03-10T07:45:45.087 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "checks": {}, 2026-03-10T07:45:45.087 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "mutes": [] 2026-03-10T07:45:45.087 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout }, 2026-03-10T07:45:45.087 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "election_epoch": 5, 2026-03-10T07:45:45.087 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "quorum": [ 2026-03-10T07:45:45.087 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout 0 2026-03-10T07:45:45.087 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout ], 2026-03-10T07:45:45.087 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "quorum_names": [ 2026-03-10T07:45:45.087 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "vm05" 2026-03-10T07:45:45.087 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: 
stdout ], 2026-03-10T07:45:45.087 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "quorum_age": 3, 2026-03-10T07:45:45.087 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "monmap": { 2026-03-10T07:45:45.087 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "epoch": 1, 2026-03-10T07:45:45.087 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "min_mon_release_name": "reef", 2026-03-10T07:45:45.087 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "num_mons": 1 2026-03-10T07:45:45.087 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout }, 2026-03-10T07:45:45.087 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "osdmap": { 2026-03-10T07:45:45.087 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "epoch": 1, 2026-03-10T07:45:45.087 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "num_osds": 0, 2026-03-10T07:45:45.087 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "num_up_osds": 0, 2026-03-10T07:45:45.087 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "osd_up_since": 0, 2026-03-10T07:45:45.089 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "num_in_osds": 0, 2026-03-10T07:45:45.089 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "osd_in_since": 0, 2026-03-10T07:45:45.089 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "num_remapped_pgs": 0 2026-03-10T07:45:45.089 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout }, 2026-03-10T07:45:45.089 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "pgmap": { 2026-03-10T07:45:45.089 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "pgs_by_state": [], 2026-03-10T07:45:45.089 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "num_pgs": 0, 2026-03-10T07:45:45.089 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "num_pools": 0, 2026-03-10T07:45:45.089 
INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "num_objects": 0, 2026-03-10T07:45:45.089 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "data_bytes": 0, 2026-03-10T07:45:45.089 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "bytes_used": 0, 2026-03-10T07:45:45.089 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "bytes_avail": 0, 2026-03-10T07:45:45.089 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "bytes_total": 0 2026-03-10T07:45:45.089 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout }, 2026-03-10T07:45:45.089 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "fsmap": { 2026-03-10T07:45:45.089 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "epoch": 1, 2026-03-10T07:45:45.089 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "by_rank": [], 2026-03-10T07:45:45.089 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "up:standby": 0 2026-03-10T07:45:45.089 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout }, 2026-03-10T07:45:45.089 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "mgrmap": { 2026-03-10T07:45:45.089 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "available": false, 2026-03-10T07:45:45.089 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "num_standbys": 0, 2026-03-10T07:45:45.089 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "modules": [ 2026-03-10T07:45:45.089 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "iostat", 2026-03-10T07:45:45.089 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "nfs", 2026-03-10T07:45:45.089 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "restful" 2026-03-10T07:45:45.089 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout ], 2026-03-10T07:45:45.089 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "services": {} 
2026-03-10T07:45:45.089 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout }, 2026-03-10T07:45:45.089 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "servicemap": { 2026-03-10T07:45:45.089 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "epoch": 1, 2026-03-10T07:45:45.089 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "modified": "2026-03-10T07:45:40.905462+0000", 2026-03-10T07:45:45.089 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "services": {} 2026-03-10T07:45:45.089 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout }, 2026-03-10T07:45:45.089 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "progress_events": {} 2026-03-10T07:45:45.089 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout } 2026-03-10T07:45:45.089 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:44.966+0000 7f235d39c700 1 Processor -- start 2026-03-10T07:45:45.089 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:44.966+0000 7f235d39c700 1 -- start start 2026-03-10T07:45:45.089 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:44.966+0000 7f235d39c700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f23580721d0 0x7f23580725f0 unknown :-1 s=NONE pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:45:45.089 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:44.966+0000 7f235d39c700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f2358072bc0 con 0x7f23580721d0 2026-03-10T07:45:45.089 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:44.967+0000 7f2357fff700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f23580721d0 0x7f23580725f0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 
tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:45:45.089 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:44.967+0000 7f2357fff700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f23580721d0 0x7f23580725f0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.105:36296/0 (socket says 192.168.123.105:36296) 2026-03-10T07:45:45.089 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:44.967+0000 7f2357fff700 1 -- 192.168.123.105:0/1634888542 learned_addr learned my addr 192.168.123.105:0/1634888542 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T07:45:45.089 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:44.967+0000 7f2357fff700 1 -- 192.168.123.105:0/1634888542 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f235810e1c0 con 0x7f23580721d0 2026-03-10T07:45:45.089 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:44.968+0000 7f2357fff700 1 --2- 192.168.123.105:0/1634888542 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f23580721d0 0x7f23580725f0 secure :-1 s=READY pgs=7 cs=0 l=1 rev1=1 crypto rx=0x7f234800ab30 tx=0x7f2348010730 comp rx=0 tx=0).ready entity=mon.0 client_cookie=d1d960aae550da2c server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:45:45.089 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:44.969+0000 7f2356ffd700 1 -- 192.168.123.105:0/1634888542 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f2348010e00 con 0x7f23580721d0 2026-03-10T07:45:45.089 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:44.969+0000 7f2356ffd700 1 -- 192.168.123.105:0/1634888542 <== mon.0 v2:192.168.123.105:3300/0 2 ==== 
config(28 keys) v1 ==== 1075+0+0 (secure 0 0 0) 0x7f23480044d0 con 0x7f23580721d0 2026-03-10T07:45:45.089 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:44.969+0000 7f235d39c700 1 -- 192.168.123.105:0/1634888542 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f23580721d0 msgr2=0x7f23580725f0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:45:45.089 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:44.969+0000 7f235d39c700 1 --2- 192.168.123.105:0/1634888542 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f23580721d0 0x7f23580725f0 secure :-1 s=READY pgs=7 cs=0 l=1 rev1=1 crypto rx=0x7f234800ab30 tx=0x7f2348010730 comp rx=0 tx=0).stop 2026-03-10T07:45:45.090 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:44.970+0000 7f235d39c700 1 -- 192.168.123.105:0/1634888542 shutdown_connections 2026-03-10T07:45:45.090 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:44.970+0000 7f235d39c700 1 --2- 192.168.123.105:0/1634888542 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f23580721d0 0x7f23580725f0 unknown :-1 s=CLOSED pgs=7 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:45:45.090 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:44.970+0000 7f235d39c700 1 -- 192.168.123.105:0/1634888542 >> 192.168.123.105:0/1634888542 conn(0x7f235806d320 msgr2=0x7f235806f760 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:45:45.090 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:44.970+0000 7f235d39c700 1 -- 192.168.123.105:0/1634888542 shutdown_connections 2026-03-10T07:45:45.090 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:44.970+0000 7f235d39c700 1 -- 192.168.123.105:0/1634888542 wait complete. 
2026-03-10T07:45:45.090 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:44.971+0000 7f235d39c700 1 Processor -- start 2026-03-10T07:45:45.090 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:44.971+0000 7f235d39c700 1 -- start start 2026-03-10T07:45:45.090 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:44.971+0000 7f235d39c700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f23581b18c0 0x7f23581b1ce0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:45:45.090 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:44.971+0000 7f235d39c700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f234801a410 con 0x7f23581b18c0 2026-03-10T07:45:45.090 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:44.972+0000 7f2357fff700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f23581b18c0 0x7f23581b1ce0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:45:45.090 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:44.972+0000 7f2357fff700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f23581b18c0 0x7f23581b1ce0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.105:36306/0 (socket says 192.168.123.105:36306) 2026-03-10T07:45:45.090 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:44.972+0000 7f2357fff700 1 -- 192.168.123.105:0/1158369573 learned_addr learned my addr 192.168.123.105:0/1158369573 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T07:45:45.090 
INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:44.972+0000 7f2357fff700 1 -- 192.168.123.105:0/1158369573 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f234800a7e0 con 0x7f23581b18c0 2026-03-10T07:45:45.090 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:44.973+0000 7f2357fff700 1 --2- 192.168.123.105:0/1158369573 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f23581b18c0 0x7f23581b1ce0 secure :-1 s=READY pgs=8 cs=0 l=1 rev1=1 crypto rx=0x7f234800bbd0 tx=0x7f2348003980 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:45:45.090 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:44.973+0000 7f23557fa700 1 -- 192.168.123.105:0/1158369573 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f2348003bd0 con 0x7f23581b18c0 2026-03-10T07:45:45.090 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:44.973+0000 7f235d39c700 1 -- 192.168.123.105:0/1158369573 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f23581b2220 con 0x7f23581b18c0 2026-03-10T07:45:45.090 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:44.973+0000 7f235d39c700 1 -- 192.168.123.105:0/1158369573 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f23581b4ea0 con 0x7f23581b18c0 2026-03-10T07:45:45.090 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:44.974+0000 7f23557fa700 1 -- 192.168.123.105:0/1158369573 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1075+0+0 (secure 0 0 0) 0x7f234800f070 con 0x7f23581b18c0 2026-03-10T07:45:45.090 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:44.974+0000 7f23557fa700 1 
-- 192.168.123.105:0/1158369573 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f23480229a0 con 0x7f23581b18c0 2026-03-10T07:45:45.090 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:44.974+0000 7f23557fa700 1 -- 192.168.123.105:0/1158369573 <== mon.0 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 1) v1 ==== 660+0+0 (secure 0 0 0) 0x7f2348018070 con 0x7f23581b18c0 2026-03-10T07:45:45.090 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:44.974+0000 7f23557fa700 1 -- 192.168.123.105:0/1158369573 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(1..1 src has 1..1) v4 ==== 725+0+0 (secure 0 0 0) 0x7f234802c8e0 con 0x7f23581b18c0 2026-03-10T07:45:45.090 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:44.975+0000 7f235d39c700 1 -- 192.168.123.105:0/1158369573 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f2344005320 con 0x7f23581b18c0 2026-03-10T07:45:45.090 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:44.977+0000 7f23557fa700 1 -- 192.168.123.105:0/1158369573 <== mon.0 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+72446 (secure 0 0 0) 0x7f2348027070 con 0x7f23581b18c0 2026-03-10T07:45:45.090 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:45.014+0000 7f235d39c700 1 -- 192.168.123.105:0/1158369573 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "status", "format": "json-pretty"} v 0) v1 -- 0x7f2344005190 con 0x7f23581b18c0 2026-03-10T07:45:45.090 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:45.015+0000 7f23557fa700 1 -- 192.168.123.105:0/1158369573 <== mon.0 v2:192.168.123.105:3300/0 7 ==== mon_command_ack([{"prefix": "status", "format": "json-pretty"}]=0 
v0) v1 ==== 79+0+1241 (secure 0 0 0) 0x7f2348003d30 con 0x7f23581b18c0 2026-03-10T07:45:45.090 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:45.021+0000 7f233effd700 1 -- 192.168.123.105:0/1158369573 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f23581b18c0 msgr2=0x7f23581b1ce0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:45:45.090 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:45.021+0000 7f233effd700 1 --2- 192.168.123.105:0/1158369573 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f23581b18c0 0x7f23581b1ce0 secure :-1 s=READY pgs=8 cs=0 l=1 rev1=1 crypto rx=0x7f234800bbd0 tx=0x7f2348003980 comp rx=0 tx=0).stop 2026-03-10T07:45:45.090 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:45.021+0000 7f233effd700 1 -- 192.168.123.105:0/1158369573 shutdown_connections 2026-03-10T07:45:45.090 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:45.021+0000 7f233effd700 1 --2- 192.168.123.105:0/1158369573 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f23581b18c0 0x7f23581b1ce0 unknown :-1 s=CLOSED pgs=8 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:45:45.090 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:45.021+0000 7f233effd700 1 -- 192.168.123.105:0/1158369573 >> 192.168.123.105:0/1158369573 conn(0x7f235806d320 msgr2=0x7f235806dd70 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:45:45.090 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:45.022+0000 7f233effd700 1 -- 192.168.123.105:0/1158369573 shutdown_connections 2026-03-10T07:45:45.090 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:45.022+0000 7f233effd700 1 -- 192.168.123.105:0/1158369573 wait complete. 
2026-03-10T07:45:45.090 INFO:teuthology.orchestra.run.vm05.stdout:mgr not available, waiting (2/15)... 2026-03-10T07:45:45.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:45:45 vm05 ceph-mon[50387]: from='client.? 192.168.123.105:0/1158369573' entity='client.admin' cmd=[{"prefix": "status", "format": "json-pretty"}]: dispatch 2026-03-10T07:45:47.302 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout 2026-03-10T07:45:47.302 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout { 2026-03-10T07:45:47.302 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "fsid": "12e9780e-1c55-11f1-8896-79f7c2e9b508", 2026-03-10T07:45:47.302 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "health": { 2026-03-10T07:45:47.302 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "status": "HEALTH_OK", 2026-03-10T07:45:47.302 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "checks": {}, 2026-03-10T07:45:47.302 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "mutes": [] 2026-03-10T07:45:47.302 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout }, 2026-03-10T07:45:47.302 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "election_epoch": 5, 2026-03-10T07:45:47.302 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "quorum": [ 2026-03-10T07:45:47.302 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout 0 2026-03-10T07:45:47.302 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout ], 2026-03-10T07:45:47.302 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "quorum_names": [ 2026-03-10T07:45:47.302 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "vm05" 2026-03-10T07:45:47.302 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout ], 2026-03-10T07:45:47.302 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "quorum_age": 5, 2026-03-10T07:45:47.302 
INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "monmap": { 2026-03-10T07:45:47.302 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "epoch": 1, 2026-03-10T07:45:47.302 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "min_mon_release_name": "reef", 2026-03-10T07:45:47.302 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "num_mons": 1 2026-03-10T07:45:47.302 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout }, 2026-03-10T07:45:47.302 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "osdmap": { 2026-03-10T07:45:47.302 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "epoch": 1, 2026-03-10T07:45:47.302 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "num_osds": 0, 2026-03-10T07:45:47.302 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "num_up_osds": 0, 2026-03-10T07:45:47.302 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "osd_up_since": 0, 2026-03-10T07:45:47.303 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "num_in_osds": 0, 2026-03-10T07:45:47.303 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "osd_in_since": 0, 2026-03-10T07:45:47.303 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "num_remapped_pgs": 0 2026-03-10T07:45:47.303 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout }, 2026-03-10T07:45:47.303 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "pgmap": { 2026-03-10T07:45:47.303 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "pgs_by_state": [], 2026-03-10T07:45:47.303 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "num_pgs": 0, 2026-03-10T07:45:47.303 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "num_pools": 0, 2026-03-10T07:45:47.303 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "num_objects": 0, 2026-03-10T07:45:47.303 
INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "data_bytes": 0,
2026-03-10T07:45:47.303 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "bytes_used": 0,
2026-03-10T07:45:47.303 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "bytes_avail": 0,
2026-03-10T07:45:47.303 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "bytes_total": 0
2026-03-10T07:45:47.303 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout },
2026-03-10T07:45:47.303 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "fsmap": {
2026-03-10T07:45:47.303 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "epoch": 1,
2026-03-10T07:45:47.303 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "by_rank": [],
2026-03-10T07:45:47.303 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "up:standby": 0
2026-03-10T07:45:47.303 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout },
2026-03-10T07:45:47.303 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "mgrmap": {
2026-03-10T07:45:47.303 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "available": false,
2026-03-10T07:45:47.303 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "num_standbys": 0,
2026-03-10T07:45:47.303 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "modules": [
2026-03-10T07:45:47.303 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "iostat",
2026-03-10T07:45:47.303 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "nfs",
2026-03-10T07:45:47.303 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "restful"
2026-03-10T07:45:47.303 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout ],
2026-03-10T07:45:47.303 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "services": {}
2026-03-10T07:45:47.303 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout },
2026-03-10T07:45:47.303 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "servicemap": {
2026-03-10T07:45:47.303 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "epoch": 1,
2026-03-10T07:45:47.303 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "modified": "2026-03-10T07:45:40.905462+0000",
2026-03-10T07:45:47.303 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "services": {}
2026-03-10T07:45:47.303 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout },
2026-03-10T07:45:47.303 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "progress_events": {}
2026-03-10T07:45:47.303 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout }
2026-03-10T07:45:47.303 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:47.211+0000 7f613af12700 1 Processor -- start
2026-03-10T07:45:47.303 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:47.211+0000 7f613af12700 1 -- start start
2026-03-10T07:45:47.303 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:47.212+0000 7f613af12700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f6134106760 0x7f6134106b80 unknown :-1 s=NONE pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T07:45:47.303 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:47.212+0000 7f613af12700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f6134107150 con 0x7f6134106760
2026-03-10T07:45:47.303 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:47.212+0000 7f6138cae700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f6134106760 0x7f6134106b80 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-10T07:45:47.303 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:47.212+0000 7f6138cae700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f6134106760 0x7f6134106b80 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.105:36382/0 (socket says 192.168.123.105:36382)
2026-03-10T07:45:47.303 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:47.212+0000 7f6138cae700 1 -- 192.168.123.105:0/3103902920 learned_addr learned my addr 192.168.123.105:0/3103902920 (peer_addr_for_me v2:192.168.123.105:0/0)
2026-03-10T07:45:47.303 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:47.212+0000 7f6138cae700 1 -- 192.168.123.105:0/3103902920 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f6134107960 con 0x7f6134106760
2026-03-10T07:45:47.303 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:47.213+0000 7f6138cae700 1 --2- 192.168.123.105:0/3103902920 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f6134106760 0x7f6134106b80 secure :-1 s=READY pgs=16 cs=0 l=1 rev1=1 crypto rx=0x7f6124009cf0 tx=0x7f612400b0e0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=8ca0df3178b9349a server_cookie=0 in_seq=0 out_seq=0
2026-03-10T07:45:47.303 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:47.213+0000 7f61337fe700 1 -- 192.168.123.105:0/3103902920 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f6124004030 con 0x7f6134106760
2026-03-10T07:45:47.303 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:47.213+0000 7f61337fe700 1 -- 192.168.123.105:0/3103902920 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1075+0+0 (secure 0 0 0) 0x7f612400b810 con 0x7f6134106760
2026-03-10T07:45:47.303 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:47.213+0000 7f61337fe700 1 -- 192.168.123.105:0/3103902920 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f6124003a90 con 0x7f6134106760
2026-03-10T07:45:47.303 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:47.213+0000 7f613af12700 1 -- 192.168.123.105:0/3103902920 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f6134106760 msgr2=0x7f6134106b80 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T07:45:47.303 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:47.213+0000 7f613af12700 1 --2- 192.168.123.105:0/3103902920 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f6134106760 0x7f6134106b80 secure :-1 s=READY pgs=16 cs=0 l=1 rev1=1 crypto rx=0x7f6124009cf0 tx=0x7f612400b0e0 comp rx=0 tx=0).stop
2026-03-10T07:45:47.303 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:47.213+0000 7f613af12700 1 -- 192.168.123.105:0/3103902920 shutdown_connections
2026-03-10T07:45:47.303 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:47.213+0000 7f613af12700 1 --2- 192.168.123.105:0/3103902920 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f6134106760 0x7f6134106b80 unknown :-1 s=CLOSED pgs=16 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T07:45:47.303 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:47.213+0000 7f613af12700 1 -- 192.168.123.105:0/3103902920 >> 192.168.123.105:0/3103902920 conn(0x7f6134101ce0 msgr2=0x7f6134104140 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-10T07:45:47.303 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:47.214+0000 7f613af12700 1 -- 192.168.123.105:0/3103902920 shutdown_connections
2026-03-10T07:45:47.303 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:47.214+0000 7f613af12700 1 -- 192.168.123.105:0/3103902920 wait complete.
2026-03-10T07:45:47.303 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:47.214+0000 7f613af12700 1 Processor -- start
2026-03-10T07:45:47.303 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:47.214+0000 7f613af12700 1 -- start start
2026-03-10T07:45:47.303 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:47.214+0000 7f613af12700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f6134198650 0x7f6134198a70 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T07:45:47.303 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:47.214+0000 7f613af12700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f6134198fb0 con 0x7f6134198650
2026-03-10T07:45:47.303 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:47.215+0000 7f6138cae700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f6134198650 0x7f6134198a70 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-10T07:45:47.303 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:47.215+0000 7f6138cae700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f6134198650 0x7f6134198a70 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.105:36394/0 (socket says 192.168.123.105:36394)
2026-03-10T07:45:47.303 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:47.215+0000 7f6138cae700 1 -- 192.168.123.105:0/3424430280 learned_addr learned my addr 192.168.123.105:0/3424430280 (peer_addr_for_me v2:192.168.123.105:0/0)
2026-03-10T07:45:47.303 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:47.215+0000 7f6138cae700 1 -- 192.168.123.105:0/3424430280 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f6124009740 con 0x7f6134198650
2026-03-10T07:45:47.303 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:47.215+0000 7f6138cae700 1 --2- 192.168.123.105:0/3424430280 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f6134198650 0x7f6134198a70 secure :-1 s=READY pgs=17 cs=0 l=1 rev1=1 crypto rx=0x7f6124000c00 tx=0x7f6124003db0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-10T07:45:47.303 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:47.215+0000 7f6131ffb700 1 -- 192.168.123.105:0/3424430280 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f6124004010 con 0x7f6134198650
2026-03-10T07:45:47.303 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:47.216+0000 7f6131ffb700 1 -- 192.168.123.105:0/3424430280 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1075+0+0 (secure 0 0 0) 0x7f6124024470 con 0x7f6134198650
2026-03-10T07:45:47.304 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:47.216+0000 7f613af12700 1 -- 192.168.123.105:0/3424430280 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f61341087e0 con 0x7f6134198650
2026-03-10T07:45:47.304 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:47.216+0000 7f6131ffb700 1 -- 192.168.123.105:0/3424430280 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f612401a440 con 0x7f6134198650
2026-03-10T07:45:47.304 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:47.216+0000 7f613af12700 1 -- 192.168.123.105:0/3424430280 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f613419bc30 con 0x7f6134198650
2026-03-10T07:45:47.304 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:47.216+0000 7f6131ffb700 1 -- 192.168.123.105:0/3424430280 <== mon.0 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 2) v1 ==== 45034+0+0 (secure 0 0 0) 0x7f6124021070 con 0x7f6134198650
2026-03-10T07:45:47.304 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:47.217+0000 7f6131ffb700 1 -- 192.168.123.105:0/3424430280 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(1..1 src has 1..1) v4 ==== 725+0+0 (secure 0 0 0) 0x7f612404cf50 con 0x7f6134198650
2026-03-10T07:45:47.304 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:47.217+0000 7f613af12700 1 -- 192.168.123.105:0/3424430280 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f6134192050 con 0x7f6134198650
2026-03-10T07:45:47.304 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:47.218+0000 7f6131ffb700 1 -- 192.168.123.105:0/3424430280 <== mon.0 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+72446 (secure 0 0 0) 0x7f612401a5a0 con 0x7f6134198650
2026-03-10T07:45:47.304 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:47.256+0000 7f613af12700 1 -- 192.168.123.105:0/3424430280 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "status", "format": "json-pretty"} v 0) v1 -- 0x7f613404fa20 con 0x7f6134198650
2026-03-10T07:45:47.304 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:47.257+0000 7f6131ffb700 1 -- 192.168.123.105:0/3424430280 <== mon.0 v2:192.168.123.105:3300/0 7 ==== mon_command_ack([{"prefix": "status", "format": "json-pretty"}]=0 v0) v1 ==== 79+0+1241 (secure 0 0 0) 0x7f612404b090 con 0x7f6134198650
2026-03-10T07:45:47.304 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:47.258+0000 7f613af12700 1 -- 192.168.123.105:0/3424430280 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f6134198650 msgr2=0x7f6134198a70 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T07:45:47.304 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:47.258+0000 7f613af12700 1 --2- 192.168.123.105:0/3424430280 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f6134198650 0x7f6134198a70 secure :-1 s=READY pgs=17 cs=0 l=1 rev1=1 crypto rx=0x7f6124000c00 tx=0x7f6124003db0 comp rx=0 tx=0).stop
2026-03-10T07:45:47.304 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:47.258+0000 7f613af12700 1 -- 192.168.123.105:0/3424430280 shutdown_connections
2026-03-10T07:45:47.304 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:47.258+0000 7f613af12700 1 --2- 192.168.123.105:0/3424430280 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f6134198650 0x7f6134198a70 unknown :-1 s=CLOSED pgs=17 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T07:45:47.304 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:47.258+0000 7f613af12700 1 -- 192.168.123.105:0/3424430280 >> 192.168.123.105:0/3424430280 conn(0x7f6134101ce0 msgr2=0x7f61341039c0 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-10T07:45:47.304 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:47.259+0000 7f613af12700 1 -- 192.168.123.105:0/3424430280 shutdown_connections
2026-03-10T07:45:47.304 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:47.259+0000 7f613af12700 1 -- 192.168.123.105:0/3424430280 wait complete.
2026-03-10T07:45:47.304 INFO:teuthology.orchestra.run.vm05.stdout:mgr not available, waiting (3/15)...
2026-03-10T07:45:48.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:45:47 vm05 ceph-mon[50387]: Activating manager daemon vm05.blexke
2026-03-10T07:45:48.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:45:47 vm05 ceph-mon[50387]: mgrmap e2: vm05.blexke(active, starting, since 0.00334911s)
2026-03-10T07:45:48.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:45:47 vm05 ceph-mon[50387]: from='mgr.14100 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "mon metadata", "id": "vm05"}]: dispatch
2026-03-10T07:45:48.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:45:47 vm05 ceph-mon[50387]: from='mgr.14100 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "mgr metadata", "who": "vm05.blexke", "id": "vm05.blexke"}]: dispatch
2026-03-10T07:45:48.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:45:47 vm05 ceph-mon[50387]: from='mgr.14100 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "mds metadata"}]: dispatch
2026-03-10T07:45:48.158 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:45:47 vm05 ceph-mon[50387]: from='mgr.14100 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "osd metadata"}]: dispatch
2026-03-10T07:45:48.158 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:45:47 vm05 ceph-mon[50387]: from='mgr.14100 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "mon metadata"}]: dispatch
2026-03-10T07:45:48.158 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:45:47 vm05 ceph-mon[50387]: Manager daemon vm05.blexke is now available
2026-03-10T07:45:48.158 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:45:47 vm05 ceph-mon[50387]: from='mgr.14100 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/vm05.blexke/mirror_snapshot_schedule"}]: dispatch
2026-03-10T07:45:48.158 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:45:47 vm05 ceph-mon[50387]: from='mgr.14100 192.168.123.105:0/2' entity='mgr.vm05.blexke'
2026-03-10T07:45:48.158 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:45:47 vm05 ceph-mon[50387]: from='mgr.14100 192.168.123.105:0/2' entity='mgr.vm05.blexke'
2026-03-10T07:45:48.158 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:45:47 vm05 ceph-mon[50387]: from='mgr.14100 192.168.123.105:0/2' entity='mgr.vm05.blexke'
2026-03-10T07:45:48.158 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:45:47 vm05 ceph-mon[50387]: from='mgr.14100 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/vm05.blexke/trash_purge_schedule"}]: dispatch
2026-03-10T07:45:48.158 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:45:47 vm05 ceph-mon[50387]: from='client.? 192.168.123.105:0/3424430280' entity='client.admin' cmd=[{"prefix": "status", "format": "json-pretty"}]: dispatch
2026-03-10T07:45:49.328 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:45:48 vm05 ceph-mon[50387]: mgrmap e3: vm05.blexke(active, since 1.00776s)
2026-03-10T07:45:49.624 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout
2026-03-10T07:45:49.625 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout {
2026-03-10T07:45:49.625 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "fsid": "12e9780e-1c55-11f1-8896-79f7c2e9b508",
2026-03-10T07:45:49.625 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "health": {
2026-03-10T07:45:49.625 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "status": "HEALTH_OK",
2026-03-10T07:45:49.625 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "checks": {},
2026-03-10T07:45:49.625 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "mutes": []
2026-03-10T07:45:49.625 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout },
2026-03-10T07:45:49.625 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "election_epoch": 5,
2026-03-10T07:45:49.625 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "quorum": [
2026-03-10T07:45:49.625 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout 0
2026-03-10T07:45:49.625 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout ],
2026-03-10T07:45:49.625 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "quorum_names": [
2026-03-10T07:45:49.625 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "vm05"
2026-03-10T07:45:49.625 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout ],
2026-03-10T07:45:49.625 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "quorum_age": 7,
2026-03-10T07:45:49.625 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "monmap": {
2026-03-10T07:45:49.625 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "epoch": 1,
2026-03-10T07:45:49.625 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "min_mon_release_name": "reef",
2026-03-10T07:45:49.625 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "num_mons": 1
2026-03-10T07:45:49.625 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout },
2026-03-10T07:45:49.625 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "osdmap": {
2026-03-10T07:45:49.625 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "epoch": 1,
2026-03-10T07:45:49.625 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "num_osds": 0,
2026-03-10T07:45:49.625 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "num_up_osds": 0,
2026-03-10T07:45:49.625 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "osd_up_since": 0,
2026-03-10T07:45:49.626 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "num_in_osds": 0,
2026-03-10T07:45:49.626 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "osd_in_since": 0,
2026-03-10T07:45:49.626 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "num_remapped_pgs": 0
2026-03-10T07:45:49.626 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout },
2026-03-10T07:45:49.626 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "pgmap": {
2026-03-10T07:45:49.626 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "pgs_by_state": [],
2026-03-10T07:45:49.626 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "num_pgs": 0,
2026-03-10T07:45:49.626 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "num_pools": 0,
2026-03-10T07:45:49.626 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "num_objects": 0,
2026-03-10T07:45:49.626 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "data_bytes": 0,
2026-03-10T07:45:49.626 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "bytes_used": 0,
2026-03-10T07:45:49.626 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "bytes_avail": 0,
2026-03-10T07:45:49.626 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "bytes_total": 0
2026-03-10T07:45:49.626 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout },
2026-03-10T07:45:49.626 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "fsmap": {
2026-03-10T07:45:49.626 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "epoch": 1,
2026-03-10T07:45:49.626 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "by_rank": [],
2026-03-10T07:45:49.626 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "up:standby": 0
2026-03-10T07:45:49.626 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout },
2026-03-10T07:45:49.626 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "mgrmap": {
2026-03-10T07:45:49.626 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "available": true,
2026-03-10T07:45:49.626 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "num_standbys": 0,
2026-03-10T07:45:49.626 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "modules": [
2026-03-10T07:45:49.626 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "iostat",
2026-03-10T07:45:49.626 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "nfs",
2026-03-10T07:45:49.626 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "restful"
2026-03-10T07:45:49.626 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout ],
2026-03-10T07:45:49.626 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "services": {}
2026-03-10T07:45:49.626 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout },
2026-03-10T07:45:49.627 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "servicemap": {
2026-03-10T07:45:49.627 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "epoch": 1,
2026-03-10T07:45:49.627 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "modified": "2026-03-10T07:45:40.905462+0000",
2026-03-10T07:45:49.627 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "services": {}
2026-03-10T07:45:49.627 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout },
2026-03-10T07:45:49.627 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "progress_events": {}
2026-03-10T07:45:49.627 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout }
2026-03-10T07:45:49.627 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:49.421+0000 7fd5afea0700 1 Processor -- start
2026-03-10T07:45:49.627 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:49.421+0000 7fd5afea0700 1 -- start start
2026-03-10T07:45:49.628 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:49.421+0000 7fd5afea0700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fd5a8072b70 0x7fd5a810fdc0 unknown :-1 s=NONE pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T07:45:49.628 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:49.421+0000 7fd5afea0700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fd5a8110390 con 0x7fd5a8072b70
2026-03-10T07:45:49.628 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:49.422+0000 7fd5adc3c700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fd5a8072b70 0x7fd5a810fdc0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-10T07:45:49.628 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:49.422+0000 7fd5adc3c700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fd5a8072b70 0x7fd5a810fdc0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.105:38070/0 (socket says 192.168.123.105:38070)
2026-03-10T07:45:49.628 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:49.422+0000 7fd5adc3c700 1 -- 192.168.123.105:0/1038164491 learned_addr learned my addr 192.168.123.105:0/1038164491 (peer_addr_for_me v2:192.168.123.105:0/0)
2026-03-10T07:45:49.628 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:49.422+0000 7fd5adc3c700 1 -- 192.168.123.105:0/1038164491 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fd5a81104d0 con 0x7fd5a8072b70
2026-03-10T07:45:49.628 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:49.422+0000 7fd5adc3c700 1 --2- 192.168.123.105:0/1038164491 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fd5a8072b70 0x7fd5a810fdc0 secure :-1 s=READY pgs=18 cs=0 l=1 rev1=1 crypto rx=0x7fd5a400d180 tx=0x7fd5a400d490 comp rx=0 tx=0).ready entity=mon.0 client_cookie=102805462644aa75 server_cookie=0 in_seq=0 out_seq=0
2026-03-10T07:45:49.628 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:49.422+0000 7fd5acc3a700 1 -- 192.168.123.105:0/1038164491 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fd5a4010070 con 0x7fd5a8072b70
2026-03-10T07:45:49.628 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:49.423+0000 7fd5acc3a700 1 -- 192.168.123.105:0/1038164491 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1075+0+0 (secure 0 0 0) 0x7fd5a4004030 con 0x7fd5a8072b70
2026-03-10T07:45:49.628 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:49.423+0000 7fd5afea0700 1 -- 192.168.123.105:0/1038164491 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fd5a8072b70 msgr2=0x7fd5a810fdc0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T07:45:49.628 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:49.423+0000 7fd5afea0700 1 --2- 192.168.123.105:0/1038164491 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fd5a8072b70 0x7fd5a810fdc0 secure :-1 s=READY pgs=18 cs=0 l=1 rev1=1 crypto rx=0x7fd5a400d180 tx=0x7fd5a400d490 comp rx=0 tx=0).stop
2026-03-10T07:45:49.628 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:49.423+0000 7fd5afea0700 1 -- 192.168.123.105:0/1038164491 shutdown_connections
2026-03-10T07:45:49.628 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:49.423+0000 7fd5afea0700 1 --2- 192.168.123.105:0/1038164491 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fd5a8072b70 0x7fd5a810fdc0 unknown :-1 s=CLOSED pgs=18 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T07:45:49.628 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:49.423+0000 7fd5afea0700 1 -- 192.168.123.105:0/1038164491 >> 192.168.123.105:0/1038164491 conn(0x7fd5a806d660 msgr2=0x7fd5a806fac0 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-10T07:45:49.628 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:49.423+0000 7fd5afea0700 1 -- 192.168.123.105:0/1038164491 shutdown_connections
2026-03-10T07:45:49.628 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:49.423+0000 7fd5afea0700 1 -- 192.168.123.105:0/1038164491 wait complete.
2026-03-10T07:45:49.628 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:49.424+0000 7fd5afea0700 1 Processor -- start
2026-03-10T07:45:49.628 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:49.424+0000 7fd5afea0700 1 -- start start
2026-03-10T07:45:49.628 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:49.424+0000 7fd5afea0700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fd5a8072b70 0x7fd5a81a4300 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T07:45:49.628 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:49.424+0000 7fd5afea0700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fd5a4003bb0 con 0x7fd5a8072b70
2026-03-10T07:45:49.628 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:49.424+0000 7fd5adc3c700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fd5a8072b70 0x7fd5a81a4300 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-10T07:45:49.628 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:49.424+0000 7fd5adc3c700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fd5a8072b70 0x7fd5a81a4300 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.105:38078/0 (socket says 192.168.123.105:38078)
2026-03-10T07:45:49.628 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:49.424+0000 7fd5adc3c700 1 -- 192.168.123.105:0/4060093756 learned_addr learned my addr 192.168.123.105:0/4060093756 (peer_addr_for_me v2:192.168.123.105:0/0)
2026-03-10T07:45:49.628 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:49.424+0000 7fd5adc3c700 1 -- 192.168.123.105:0/4060093756 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fd5a40087c0 con 0x7fd5a8072b70
2026-03-10T07:45:49.628 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:49.424+0000 7fd5adc3c700 1 --2- 192.168.123.105:0/4060093756 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fd5a8072b70 0x7fd5a81a4300 secure :-1 s=READY pgs=19 cs=0 l=1 rev1=1 crypto rx=0x7fd5a4008c40 tx=0x7fd5a4008d20 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-10T07:45:49.628 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:49.424+0000 7fd59effd700 1 -- 192.168.123.105:0/4060093756 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fd5a4010050 con 0x7fd5a8072b70
2026-03-10T07:45:49.628 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:49.425+0000 7fd59effd700 1 -- 192.168.123.105:0/4060093756 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1075+0+0 (secure 0 0 0) 0x7fd5a4004620 con 0x7fd5a8072b70
2026-03-10T07:45:49.628 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:49.425+0000 7fd59effd700 1 -- 192.168.123.105:0/4060093756 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fd5a40164e0 con 0x7fd5a8072b70
2026-03-10T07:45:49.628 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:49.425+0000 7fd5afea0700 1 -- 192.168.123.105:0/4060093756 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fd5a81a4840 con 0x7fd5a8072b70
2026-03-10T07:45:49.628 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:49.425+0000 7fd5afea0700 1 -- 192.168.123.105:0/4060093756 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fd5a81a0d60 con 0x7fd5a8072b70
2026-03-10T07:45:49.628 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:49.426+0000 7fd59effd700 1 -- 192.168.123.105:0/4060093756 <== mon.0 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 4) v1 ==== 45267+0+0 (secure 0 0 0) 0x7fd5a4004180 con 0x7fd5a8072b70
2026-03-10T07:45:49.628 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:49.426+0000 7fd59effd700 1 --2- 192.168.123.105:0/4060093756 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fd594038370 0x7fd59403a830 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T07:45:49.628 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:49.426+0000 7fd59effd700 1 -- 192.168.123.105:0/4060093756 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(1..1 src has 1..1) v4 ==== 725+0+0 (secure 0 0 0) 0x7fd5a404aeb0 con 0x7fd5a8072b70
2026-03-10T07:45:49.628 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:49.426+0000 7fd5ad43b700 1 --2- 192.168.123.105:0/4060093756 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fd594038370 0x7fd59403a830 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-10T07:45:49.628 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:49.426+0000 7fd5ad43b700 1 --2- 192.168.123.105:0/4060093756 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fd594038370 0x7fd59403a830 secure :-1 s=READY pgs=6 cs=0 l=1 rev1=1 crypto rx=0x7fd598006fd0 tx=0x7fd598006e40 comp rx=0 tx=0).ready entity=mgr.14100 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-10T07:45:49.628 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:49.426+0000 7fd5afea0700 1 -- 192.168.123.105:0/4060093756 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fd5a804fa20 con 0x7fd5a8072b70
2026-03-10T07:45:49.628 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:49.429+0000 7fd59effd700 1 -- 192.168.123.105:0/4060093756 <== mon.0 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7fd5a401b070 con 0x7fd5a8072b70
2026-03-10T07:45:49.628 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:49.576+0000 7fd5afea0700 1 -- 192.168.123.105:0/4060093756 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "status", "format": "json-pretty"} v 0) v1 -- 0x7fd5a80623c0 con 0x7fd5a8072b70
2026-03-10T07:45:49.628 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:49.576+0000 7fd59effd700 1 -- 192.168.123.105:0/4060093756 <== mon.0 v2:192.168.123.105:3300/0 7 ==== mon_command_ack([{"prefix": "status", "format": "json-pretty"}]=0 v0) v1 ==== 79+0+1240 (secure 0 0 0) 0x7fd5a4016640 con 0x7fd5a8072b70
2026-03-10T07:45:49.628 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:49.580+0000 7fd5afea0700 1 -- 192.168.123.105:0/4060093756 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fd594038370 msgr2=0x7fd59403a830 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T07:45:49.628 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:49.580+0000 7fd5afea0700 1 --2- 192.168.123.105:0/4060093756 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fd594038370 0x7fd59403a830 secure :-1 s=READY pgs=6 cs=0 l=1 rev1=1 crypto rx=0x7fd598006fd0 tx=0x7fd598006e40 comp rx=0 tx=0).stop
2026-03-10T07:45:49.628 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:49.580+0000 7fd5afea0700 1 -- 192.168.123.105:0/4060093756 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fd5a8072b70 msgr2=0x7fd5a81a4300 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T07:45:49.629 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:49.580+0000 7fd5afea0700 1 --2- 192.168.123.105:0/4060093756 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fd5a8072b70 0x7fd5a81a4300 secure :-1 s=READY pgs=19 cs=0 l=1 rev1=1 crypto rx=0x7fd5a4008c40 tx=0x7fd5a4008d20 comp rx=0 tx=0).stop
2026-03-10T07:45:49.629 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:49.581+0000 7fd5afea0700 1 -- 192.168.123.105:0/4060093756 shutdown_connections
2026-03-10T07:45:49.629 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:49.581+0000 7fd5afea0700 1 --2- 192.168.123.105:0/4060093756 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fd594038370 0x7fd59403a830 unknown :-1 s=CLOSED pgs=6 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T07:45:49.629 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:49.581+0000 7fd5afea0700 1 --2- 192.168.123.105:0/4060093756 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fd5a8072b70 0x7fd5a81a4300 unknown :-1 s=CLOSED pgs=19 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T07:45:49.629 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:49.581+0000 7fd5afea0700 1 -- 192.168.123.105:0/4060093756 >> 192.168.123.105:0/4060093756 conn(0x7fd5a806d660 msgr2=0x7fd5a806fdb0 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-10T07:45:49.629 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:49.581+0000 7fd5afea0700 1 -- 192.168.123.105:0/4060093756 shutdown_connections
2026-03-10T07:45:49.629 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:49.581+0000 7fd5afea0700 1 -- 192.168.123.105:0/4060093756 wait complete.
2026-03-10T07:45:49.629 INFO:teuthology.orchestra.run.vm05.stdout:mgr is available
2026-03-10T07:45:49.910 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout
2026-03-10T07:45:49.910 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout [global]
2026-03-10T07:45:49.911 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout fsid = 12e9780e-1c55-11f1-8896-79f7c2e9b508
2026-03-10T07:45:49.911 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout mon_osd_allow_pg_remap = true
2026-03-10T07:45:49.911 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout mon_osd_allow_primary_affinity = true
2026-03-10T07:45:49.911 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout mon_warn_on_no_sortbitwise = false
2026-03-10T07:45:49.911 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout osd_crush_chooseleaf_type = 0
2026-03-10T07:45:49.911 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout
2026-03-10T07:45:49.911 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout [mgr]
2026-03-10T07:45:49.911 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout mgr/telemetry/nag = false
2026-03-10T07:45:49.911 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout
2026-03-10T07:45:49.911 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout [osd] 2026-03-10T07:45:49.911 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout mon_osd_backfillfull_ratio = 0.85 2026-03-10T07:45:49.911 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout mon_osd_full_ratio = 0.9 2026-03-10T07:45:49.911 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout mon_osd_nearfull_ratio = 0.8 2026-03-10T07:45:49.911 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout osd_map_max_advance = 10 2026-03-10T07:45:49.911 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout osd_sloppy_crc = true 2026-03-10T07:45:49.911 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:49.742+0000 7fe62376b700 1 Processor -- start 2026-03-10T07:45:49.911 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:49.743+0000 7fe62376b700 1 -- start start 2026-03-10T07:45:49.911 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:49.743+0000 7fe62376b700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fe61c108970 0x7fe61c108d90 unknown :-1 s=NONE pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:45:49.911 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:49.743+0000 7fe62376b700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fe61c109360 con 0x7fe61c108970 2026-03-10T07:45:49.911 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:49.743+0000 7fe621507700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fe61c108970 0x7fe61c108d90 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:45:49.911 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 
2026-03-10T07:45:49.743+0000 7fe621507700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fe61c108970 0x7fe61c108d90 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.105:38094/0 (socket says 192.168.123.105:38094) 2026-03-10T07:45:49.911 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:49.743+0000 7fe621507700 1 -- 192.168.123.105:0/1800243783 learned_addr learned my addr 192.168.123.105:0/1800243783 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T07:45:49.911 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:49.744+0000 7fe621507700 1 -- 192.168.123.105:0/1800243783 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fe61c109b70 con 0x7fe61c108970 2026-03-10T07:45:49.911 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:49.744+0000 7fe621507700 1 --2- 192.168.123.105:0/1800243783 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fe61c108970 0x7fe61c108d90 secure :-1 s=READY pgs=20 cs=0 l=1 rev1=1 crypto rx=0x7fe60c009a90 tx=0x7fe60c009da0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=75ace2677bd6295e server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:45:49.911 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:49.744+0000 7fe613fff700 1 -- 192.168.123.105:0/1800243783 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fe60c004030 con 0x7fe61c108970 2026-03-10T07:45:49.911 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:49.744+0000 7fe613fff700 1 -- 192.168.123.105:0/1800243783 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1075+0+0 (secure 0 0 0) 0x7fe60c00b7e0 con 0x7fe61c108970 2026-03-10T07:45:49.911 
INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:49.744+0000 7fe62376b700 1 -- 192.168.123.105:0/1800243783 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fe61c108970 msgr2=0x7fe61c108d90 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:45:49.911 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:49.744+0000 7fe62376b700 1 --2- 192.168.123.105:0/1800243783 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fe61c108970 0x7fe61c108d90 secure :-1 s=READY pgs=20 cs=0 l=1 rev1=1 crypto rx=0x7fe60c009a90 tx=0x7fe60c009da0 comp rx=0 tx=0).stop 2026-03-10T07:45:49.911 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:49.745+0000 7fe62376b700 1 -- 192.168.123.105:0/1800243783 shutdown_connections 2026-03-10T07:45:49.911 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:49.745+0000 7fe62376b700 1 --2- 192.168.123.105:0/1800243783 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fe61c108970 0x7fe61c108d90 unknown :-1 s=CLOSED pgs=20 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:45:49.911 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:49.745+0000 7fe62376b700 1 -- 192.168.123.105:0/1800243783 >> 192.168.123.105:0/1800243783 conn(0x7fe61c07be30 msgr2=0x7fe61c1064e0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:45:49.911 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:49.745+0000 7fe62376b700 1 -- 192.168.123.105:0/1800243783 shutdown_connections 2026-03-10T07:45:49.911 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:49.745+0000 7fe62376b700 1 -- 192.168.123.105:0/1800243783 wait complete. 
2026-03-10T07:45:49.911 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:49.745+0000 7fe62376b700 1 Processor -- start 2026-03-10T07:45:49.911 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:49.745+0000 7fe62376b700 1 -- start start 2026-03-10T07:45:49.911 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:49.747+0000 7fe62376b700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fe61c108970 0x7fe61c19ca60 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:45:49.911 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:49.747+0000 7fe621507700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fe61c108970 0x7fe61c19ca60 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:45:49.911 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:49.747+0000 7fe621507700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fe61c108970 0x7fe61c19ca60 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.105:38108/0 (socket says 192.168.123.105:38108) 2026-03-10T07:45:49.911 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:49.747+0000 7fe621507700 1 -- 192.168.123.105:0/894436647 learned_addr learned my addr 192.168.123.105:0/894436647 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T07:45:49.911 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:49.747+0000 7fe62376b700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fe61c109360 con 0x7fe61c108970 2026-03-10T07:45:49.911 
INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:49.747+0000 7fe621507700 1 -- 192.168.123.105:0/894436647 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fe60c009740 con 0x7fe61c108970 2026-03-10T07:45:49.911 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:49.747+0000 7fe621507700 1 --2- 192.168.123.105:0/894436647 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fe61c108970 0x7fe61c19ca60 secure :-1 s=READY pgs=21 cs=0 l=1 rev1=1 crypto rx=0x7fe60c00bdf0 tx=0x7fe60c00bed0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:45:49.911 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:49.747+0000 7fe6127fc700 1 -- 192.168.123.105:0/894436647 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fe60c003f40 con 0x7fe61c108970 2026-03-10T07:45:49.911 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:49.747+0000 7fe6127fc700 1 -- 192.168.123.105:0/894436647 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1075+0+0 (secure 0 0 0) 0x7fe60c004540 con 0x7fe61c108970 2026-03-10T07:45:49.911 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:49.747+0000 7fe6127fc700 1 -- 192.168.123.105:0/894436647 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fe60c024de0 con 0x7fe61c108970 2026-03-10T07:45:49.911 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:49.747+0000 7fe62376b700 1 -- 192.168.123.105:0/894436647 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fe61c19cfa0 con 0x7fe61c108970 2026-03-10T07:45:49.911 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:49.747+0000 7fe62376b700 1 -- 
192.168.123.105:0/894436647 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fe61c19d3c0 con 0x7fe61c108970 2026-03-10T07:45:49.911 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:49.747+0000 7fe6127fc700 1 -- 192.168.123.105:0/894436647 <== mon.0 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 4) v1 ==== 45267+0+0 (secure 0 0 0) 0x7fe60c01b440 con 0x7fe61c108970 2026-03-10T07:45:49.911 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:49.747+0000 7fe6127fc700 1 --2- 192.168.123.105:0/894436647 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fe608038390 0x7fe60803a850 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:45:49.911 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:49.747+0000 7fe6127fc700 1 -- 192.168.123.105:0/894436647 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(1..1 src has 1..1) v4 ==== 725+0+0 (secure 0 0 0) 0x7fe60c04cec0 con 0x7fe61c108970 2026-03-10T07:45:49.911 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:49.748+0000 7fe620d06700 1 --2- 192.168.123.105:0/894436647 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fe608038390 0x7fe60803a850 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:45:49.911 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:49.748+0000 7fe62376b700 1 -- 192.168.123.105:0/894436647 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fe61c196910 con 0x7fe61c108970 2026-03-10T07:45:49.911 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:49.751+0000 7fe6127fc700 1 -- 192.168.123.105:0/894436647 <== mon.0 v2:192.168.123.105:3300/0 6 ==== 
mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7fe60c01f030 con 0x7fe61c108970 2026-03-10T07:45:49.911 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:49.751+0000 7fe620d06700 1 --2- 192.168.123.105:0/894436647 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fe608038390 0x7fe60803a850 secure :-1 s=READY pgs=7 cs=0 l=1 rev1=1 crypto rx=0x7fe618006fd0 tx=0x7fe618006e40 comp rx=0 tx=0).ready entity=mgr.14100 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:45:49.911 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:49.851+0000 7fe62376b700 1 -- 192.168.123.105:0/894436647 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "config assimilate-conf"} v 0) v1 -- 0x7fe61c19d670 con 0x7fe61c108970 2026-03-10T07:45:49.911 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:49.856+0000 7fe6127fc700 1 -- 192.168.123.105:0/894436647 <== mon.0 v2:192.168.123.105:3300/0 7 ==== mon_command_ack([{"prefix": "config assimilate-conf"}]=0 v3) v1 ==== 70+0+373 (secure 0 0 0) 0x7fe60c04a020 con 0x7fe61c108970 2026-03-10T07:45:49.911 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:49.859+0000 7fe62376b700 1 -- 192.168.123.105:0/894436647 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fe608038390 msgr2=0x7fe60803a850 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:45:49.911 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:49.859+0000 7fe62376b700 1 --2- 192.168.123.105:0/894436647 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fe608038390 0x7fe60803a850 secure :-1 s=READY pgs=7 cs=0 l=1 rev1=1 crypto rx=0x7fe618006fd0 tx=0x7fe618006e40 comp rx=0 tx=0).stop 2026-03-10T07:45:49.911 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 
2026-03-10T07:45:49.859+0000 7fe62376b700 1 -- 192.168.123.105:0/894436647 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fe61c108970 msgr2=0x7fe61c19ca60 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:45:49.911 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:49.859+0000 7fe62376b700 1 --2- 192.168.123.105:0/894436647 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fe61c108970 0x7fe61c19ca60 secure :-1 s=READY pgs=21 cs=0 l=1 rev1=1 crypto rx=0x7fe60c00bdf0 tx=0x7fe60c00bed0 comp rx=0 tx=0).stop 2026-03-10T07:45:49.911 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:49.859+0000 7fe62376b700 1 -- 192.168.123.105:0/894436647 shutdown_connections 2026-03-10T07:45:49.911 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:49.859+0000 7fe62376b700 1 --2- 192.168.123.105:0/894436647 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fe608038390 0x7fe60803a850 unknown :-1 s=CLOSED pgs=7 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:45:49.912 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:49.859+0000 7fe62376b700 1 --2- 192.168.123.105:0/894436647 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fe61c108970 0x7fe61c19ca60 unknown :-1 s=CLOSED pgs=21 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:45:49.912 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:49.859+0000 7fe62376b700 1 -- 192.168.123.105:0/894436647 >> 192.168.123.105:0/894436647 conn(0x7fe61c07be30 msgr2=0x7fe61c107490 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:45:49.912 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:49.860+0000 7fe62376b700 1 -- 192.168.123.105:0/894436647 shutdown_connections 2026-03-10T07:45:49.912 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: 
stderr 2026-03-10T07:45:49.860+0000 7fe62376b700 1 -- 192.168.123.105:0/894436647 wait complete. 2026-03-10T07:45:49.912 INFO:teuthology.orchestra.run.vm05.stdout:Enabling cephadm module... 2026-03-10T07:45:50.160 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:45:49 vm05 ceph-mon[50387]: mgrmap e4: vm05.blexke(active, since 2s) 2026-03-10T07:45:50.160 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:45:49 vm05 ceph-mon[50387]: from='client.? 192.168.123.105:0/4060093756' entity='client.admin' cmd=[{"prefix": "status", "format": "json-pretty"}]: dispatch 2026-03-10T07:45:50.160 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:45:49 vm05 ceph-mon[50387]: from='client.? 192.168.123.105:0/894436647' entity='client.admin' cmd=[{"prefix": "config assimilate-conf"}]: dispatch 2026-03-10T07:45:51.086 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:50.032+0000 7f3d0466b700 1 Processor -- start 2026-03-10T07:45:51.086 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:50.032+0000 7f3d0466b700 1 -- start start 2026-03-10T07:45:51.086 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:50.033+0000 7f3d0466b700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3cfc105650 0x7f3cfc105a70 unknown :-1 s=NONE pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:45:51.086 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:50.033+0000 7f3d0466b700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f3cfc106040 con 0x7f3cfc105650 2026-03-10T07:45:51.086 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:50.033+0000 7f3d02407700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3cfc105650 0x7f3cfc105a70 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload 
supported=3 required=0 2026-03-10T07:45:51.086 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:50.033+0000 7f3d02407700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3cfc105650 0x7f3cfc105a70 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.105:38124/0 (socket says 192.168.123.105:38124) 2026-03-10T07:45:51.086 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:50.033+0000 7f3d02407700 1 -- 192.168.123.105:0/2174914035 learned_addr learned my addr 192.168.123.105:0/2174914035 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T07:45:51.086 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:50.033+0000 7f3d02407700 1 -- 192.168.123.105:0/2174914035 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f3cfc106850 con 0x7f3cfc105650 2026-03-10T07:45:51.086 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:50.034+0000 7f3d02407700 1 --2- 192.168.123.105:0/2174914035 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3cfc105650 0x7f3cfc105a70 secure :-1 s=READY pgs=22 cs=0 l=1 rev1=1 crypto rx=0x7f3cec009cf0 tx=0x7f3cec00b0e0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=bdd77cf8a5405ae6 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:45:51.086 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:50.034+0000 7f3d01405700 1 -- 192.168.123.105:0/2174914035 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f3cec004030 con 0x7f3cfc105650 2026-03-10T07:45:51.086 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:50.034+0000 7f3d01405700 1 -- 192.168.123.105:0/2174914035 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1075+0+0 
(secure 0 0 0) 0x7f3cec00b810 con 0x7f3cfc105650 2026-03-10T07:45:51.086 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:50.034+0000 7f3d0466b700 1 -- 192.168.123.105:0/2174914035 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3cfc105650 msgr2=0x7f3cfc105a70 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:45:51.086 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:50.034+0000 7f3d0466b700 1 --2- 192.168.123.105:0/2174914035 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3cfc105650 0x7f3cfc105a70 secure :-1 s=READY pgs=22 cs=0 l=1 rev1=1 crypto rx=0x7f3cec009cf0 tx=0x7f3cec00b0e0 comp rx=0 tx=0).stop 2026-03-10T07:45:51.086 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:50.034+0000 7f3d0466b700 1 -- 192.168.123.105:0/2174914035 shutdown_connections 2026-03-10T07:45:51.086 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:50.034+0000 7f3d0466b700 1 --2- 192.168.123.105:0/2174914035 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3cfc105650 0x7f3cfc105a70 unknown :-1 s=CLOSED pgs=22 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:45:51.086 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:50.034+0000 7f3d0466b700 1 -- 192.168.123.105:0/2174914035 >> 192.168.123.105:0/2174914035 conn(0x7f3cfc100bd0 msgr2=0x7f3cfc103030 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:45:51.086 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:50.035+0000 7f3d0466b700 1 -- 192.168.123.105:0/2174914035 shutdown_connections 2026-03-10T07:45:51.086 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:50.035+0000 7f3d0466b700 1 -- 192.168.123.105:0/2174914035 wait complete. 
2026-03-10T07:45:51.086 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:50.035+0000 7f3d0466b700 1 Processor -- start 2026-03-10T07:45:51.086 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:50.035+0000 7f3d0466b700 1 -- start start 2026-03-10T07:45:51.086 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:50.035+0000 7f3d0466b700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3cfc19c810 0x7f3cfc19cc30 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:45:51.086 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:50.036+0000 7f3d02407700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3cfc19c810 0x7f3cfc19cc30 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:45:51.086 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:50.036+0000 7f3d02407700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3cfc19c810 0x7f3cfc19cc30 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.105:38130/0 (socket says 192.168.123.105:38130) 2026-03-10T07:45:51.086 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:50.036+0000 7f3d02407700 1 -- 192.168.123.105:0/3386930890 learned_addr learned my addr 192.168.123.105:0/3386930890 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T07:45:51.086 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:50.035+0000 7f3d0466b700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f3cfc106040 con 0x7f3cfc19c810 2026-03-10T07:45:51.086 
INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:50.036+0000 7f3d02407700 1 -- 192.168.123.105:0/3386930890 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f3cec009740 con 0x7f3cfc19c810 2026-03-10T07:45:51.086 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:50.036+0000 7f3d02407700 1 --2- 192.168.123.105:0/3386930890 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3cfc19c810 0x7f3cfc19cc30 secure :-1 s=READY pgs=23 cs=0 l=1 rev1=1 crypto rx=0x7f3cec009cc0 tx=0x7f3cec003cb0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:45:51.086 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:50.036+0000 7f3cf37fe700 1 -- 192.168.123.105:0/3386930890 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f3cec003ed0 con 0x7f3cfc19c810 2026-03-10T07:45:51.086 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:50.036+0000 7f3cf37fe700 1 -- 192.168.123.105:0/3386930890 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1075+0+0 (secure 0 0 0) 0x7f3cec0044d0 con 0x7f3cfc19c810 2026-03-10T07:45:51.086 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:50.036+0000 7f3cf37fe700 1 -- 192.168.123.105:0/3386930890 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f3cec01ac60 con 0x7f3cfc19c810 2026-03-10T07:45:51.086 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:50.036+0000 7f3d0466b700 1 -- 192.168.123.105:0/3386930890 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f3cfc19d170 con 0x7f3cfc19c810 2026-03-10T07:45:51.086 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:50.036+0000 7f3d0466b700 1 
-- 192.168.123.105:0/3386930890 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f3cfc19fdf0 con 0x7f3cfc19c810 2026-03-10T07:45:51.086 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:50.037+0000 7f3cf37fe700 1 -- 192.168.123.105:0/3386930890 <== mon.0 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 4) v1 ==== 45267+0+0 (secure 0 0 0) 0x7f3cec011420 con 0x7f3cfc19c810 2026-03-10T07:45:51.086 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:50.037+0000 7f3cf37fe700 1 --2- 192.168.123.105:0/3386930890 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f3ce8038330 0x7f3ce803a7f0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:45:51.086 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:50.037+0000 7f3cf37fe700 1 -- 192.168.123.105:0/3386930890 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(1..1 src has 1..1) v4 ==== 725+0+0 (secure 0 0 0) 0x7f3cec04c560 con 0x7f3cfc19c810 2026-03-10T07:45:51.086 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:50.037+0000 7f3d01c06700 1 --2- 192.168.123.105:0/3386930890 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f3ce8038330 0x7f3ce803a7f0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:45:51.086 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:50.038+0000 7f3d0466b700 1 -- 192.168.123.105:0/3386930890 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f3cfc04fa90 con 0x7f3cfc19c810 2026-03-10T07:45:51.086 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:50.040+0000 7f3cf37fe700 1 -- 192.168.123.105:0/3386930890 <== mon.0 
v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7f3cec01f020 con 0x7f3cfc19c810 2026-03-10T07:45:51.086 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:50.041+0000 7f3d01c06700 1 --2- 192.168.123.105:0/3386930890 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f3ce8038330 0x7f3ce803a7f0 secure :-1 s=READY pgs=8 cs=0 l=1 rev1=1 crypto rx=0x7f3cf8006fd0 tx=0x7f3cf8006e40 comp rx=0 tx=0).ready entity=mgr.14100 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:45:51.086 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:50.172+0000 7f3d0466b700 1 -- 192.168.123.105:0/3386930890 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "mgr module enable", "module": "cephadm"} v 0) v1 -- 0x7f3cfc19ffe0 con 0x7f3cfc19c810 2026-03-10T07:45:51.086 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:50.989+0000 7f3cf37fe700 1 -- 192.168.123.105:0/3386930890 <== mon.0 v2:192.168.123.105:3300/0 7 ==== mgrmap(e 5) v1 ==== 45278+0+0 (secure 0 0 0) 0x7f3cec011a00 con 0x7f3cfc19c810 2026-03-10T07:45:51.086 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:50.995+0000 7f3cf37fe700 1 -- 192.168.123.105:0/3386930890 <== mon.0 v2:192.168.123.105:3300/0 8 ==== mon_command_ack([{"prefix": "mgr module enable", "module": "cephadm"}]=0 v5) v1 ==== 86+0+0 (secure 0 0 0) 0x7f3cec01a4f0 con 0x7f3cfc19c810 2026-03-10T07:45:51.086 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:50.998+0000 7f3d0466b700 1 -- 192.168.123.105:0/3386930890 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f3ce8038330 msgr2=0x7f3ce803a7f0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:45:51.086 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 
2026-03-10T07:45:50.998+0000 7f3d0466b700 1 --2- 192.168.123.105:0/3386930890 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f3ce8038330 0x7f3ce803a7f0 secure :-1 s=READY pgs=8 cs=0 l=1 rev1=1 crypto rx=0x7f3cf8006fd0 tx=0x7f3cf8006e40 comp rx=0 tx=0).stop 2026-03-10T07:45:51.086 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:50.998+0000 7f3d0466b700 1 -- 192.168.123.105:0/3386930890 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3cfc19c810 msgr2=0x7f3cfc19cc30 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:45:51.086 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:50.998+0000 7f3d0466b700 1 --2- 192.168.123.105:0/3386930890 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3cfc19c810 0x7f3cfc19cc30 secure :-1 s=READY pgs=23 cs=0 l=1 rev1=1 crypto rx=0x7f3cec009cc0 tx=0x7f3cec003cb0 comp rx=0 tx=0).stop 2026-03-10T07:45:51.086 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:50.998+0000 7f3d0466b700 1 -- 192.168.123.105:0/3386930890 shutdown_connections 2026-03-10T07:45:51.086 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:50.998+0000 7f3d0466b700 1 --2- 192.168.123.105:0/3386930890 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f3ce8038330 0x7f3ce803a7f0 unknown :-1 s=CLOSED pgs=8 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:45:51.086 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:50.998+0000 7f3d0466b700 1 --2- 192.168.123.105:0/3386930890 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3cfc19c810 0x7f3cfc19cc30 unknown :-1 s=CLOSED pgs=23 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:45:51.086 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:50.998+0000 7f3d0466b700 1 -- 192.168.123.105:0/3386930890 >> 
192.168.123.105:0/3386930890 conn(0x7f3cfc100bd0 msgr2=0x7f3cfc074130 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:45:51.086 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:50.998+0000 7f3d0466b700 1 -- 192.168.123.105:0/3386930890 shutdown_connections 2026-03-10T07:45:51.086 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:50.998+0000 7f3d0466b700 1 -- 192.168.123.105:0/3386930890 wait complete. 2026-03-10T07:45:51.366 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:45:50 vm05 ceph-mon[50387]: from='client.? 192.168.123.105:0/3386930890' entity='client.admin' cmd=[{"prefix": "mgr module enable", "module": "cephadm"}]: dispatch 2026-03-10T07:45:51.398 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout { 2026-03-10T07:45:51.398 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "epoch": 5, 2026-03-10T07:45:51.398 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "available": true, 2026-03-10T07:45:51.398 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "active_name": "vm05.blexke", 2026-03-10T07:45:51.398 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "num_standby": 0 2026-03-10T07:45:51.398 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout } 2026-03-10T07:45:51.398 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:51.219+0000 7f5a0e3ac700 1 Processor -- start 2026-03-10T07:45:51.398 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:51.219+0000 7f5a0e3ac700 1 -- start start 2026-03-10T07:45:51.398 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:51.219+0000 7f5a0e3ac700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f5a08071ce0 0x7f5a08072100 unknown :-1 s=NONE pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:45:51.398 
INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:51.219+0000 7f5a0e3ac700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f5a080726d0 con 0x7f5a08071ce0 2026-03-10T07:45:51.398 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:51.220+0000 7f5a0d3aa700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f5a08071ce0 0x7f5a08072100 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:45:51.398 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:51.220+0000 7f5a0d3aa700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f5a08071ce0 0x7f5a08072100 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.105:38164/0 (socket says 192.168.123.105:38164) 2026-03-10T07:45:51.398 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:51.220+0000 7f5a0d3aa700 1 -- 192.168.123.105:0/1185988291 learned_addr learned my addr 192.168.123.105:0/1185988291 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T07:45:51.398 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:51.220+0000 7f5a0d3aa700 1 -- 192.168.123.105:0/1185988291 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f5a08072810 con 0x7f5a08071ce0 2026-03-10T07:45:51.398 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:51.221+0000 7f5a0d3aa700 1 --2- 192.168.123.105:0/1185988291 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f5a08071ce0 0x7f5a08072100 secure :-1 s=READY pgs=26 cs=0 l=1 rev1=1 crypto rx=0x7f5a0400d0d0 tx=0x7f5a0400d3e0 comp rx=0 tx=0).ready entity=mon.0 
client_cookie=3435532ce0facadc server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:45:51.398 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:51.221+0000 7f59fffff700 1 -- 192.168.123.105:0/1185988291 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f5a04010070 con 0x7f5a08071ce0 2026-03-10T07:45:51.398 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:51.221+0000 7f59fffff700 1 -- 192.168.123.105:0/1185988291 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1075+0+0 (secure 0 0 0) 0x7f5a04004030 con 0x7f5a08071ce0 2026-03-10T07:45:51.398 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:51.221+0000 7f5a0e3ac700 1 -- 192.168.123.105:0/1185988291 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f5a08071ce0 msgr2=0x7f5a08072100 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:45:51.398 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:51.221+0000 7f5a0e3ac700 1 --2- 192.168.123.105:0/1185988291 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f5a08071ce0 0x7f5a08072100 secure :-1 s=READY pgs=26 cs=0 l=1 rev1=1 crypto rx=0x7f5a0400d0d0 tx=0x7f5a0400d3e0 comp rx=0 tx=0).stop 2026-03-10T07:45:51.398 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:51.221+0000 7f5a0e3ac700 1 -- 192.168.123.105:0/1185988291 shutdown_connections 2026-03-10T07:45:51.398 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:51.221+0000 7f5a0e3ac700 1 --2- 192.168.123.105:0/1185988291 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f5a08071ce0 0x7f5a08072100 unknown :-1 s=CLOSED pgs=26 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:45:51.398 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:51.221+0000 7f5a0e3ac700 1 -- 
192.168.123.105:0/1185988291 >> 192.168.123.105:0/1185988291 conn(0x7f5a0806d320 msgr2=0x7f5a0806f760 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:45:51.398 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:51.221+0000 7f5a0e3ac700 1 -- 192.168.123.105:0/1185988291 shutdown_connections 2026-03-10T07:45:51.398 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:51.221+0000 7f5a0e3ac700 1 -- 192.168.123.105:0/1185988291 wait complete. 2026-03-10T07:45:51.398 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:51.222+0000 7f5a0e3ac700 1 Processor -- start 2026-03-10T07:45:51.398 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:51.222+0000 7f5a0e3ac700 1 -- start start 2026-03-10T07:45:51.398 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:51.222+0000 7f5a0e3ac700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f5a08086de0 0x7f5a08087200 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:45:51.398 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:51.222+0000 7f5a0e3ac700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f5a04003b60 con 0x7f5a08086de0 2026-03-10T07:45:51.398 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:51.222+0000 7f5a0d3aa700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f5a08086de0 0x7f5a08087200 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:45:51.398 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:51.222+0000 7f5a0d3aa700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f5a08086de0 0x7f5a08087200 unknown :-1 s=HELLO_CONNECTING 
pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.105:38180/0 (socket says 192.168.123.105:38180) 2026-03-10T07:45:51.398 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:51.222+0000 7f5a0d3aa700 1 -- 192.168.123.105:0/3020461515 learned_addr learned my addr 192.168.123.105:0/3020461515 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T07:45:51.398 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:51.223+0000 7f5a0d3aa700 1 -- 192.168.123.105:0/3020461515 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f5a040088c0 con 0x7f5a08086de0 2026-03-10T07:45:51.398 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:51.223+0000 7f5a0d3aa700 1 --2- 192.168.123.105:0/3020461515 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f5a08086de0 0x7f5a08087200 secure :-1 s=READY pgs=27 cs=0 l=1 rev1=1 crypto rx=0x7f5a040039a0 tx=0x7f5a04004200 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:45:51.398 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:51.223+0000 7f59fe7fc700 1 -- 192.168.123.105:0/3020461515 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f5a04010050 con 0x7f5a08086de0 2026-03-10T07:45:51.398 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:51.223+0000 7f5a0e3ac700 1 -- 192.168.123.105:0/3020461515 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f5a08087740 con 0x7f5a08086de0 2026-03-10T07:45:51.398 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:51.223+0000 7f5a0e3ac700 1 -- 192.168.123.105:0/3020461515 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) 
v3 -- 0x7f5a080883b0 con 0x7f5a08086de0 2026-03-10T07:45:51.398 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:51.224+0000 7f59fe7fc700 1 -- 192.168.123.105:0/3020461515 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1075+0+0 (secure 0 0 0) 0x7f5a040045b0 con 0x7f5a08086de0 2026-03-10T07:45:51.398 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:51.224+0000 7f59fe7fc700 1 -- 192.168.123.105:0/3020461515 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f5a0401f730 con 0x7f5a08086de0 2026-03-10T07:45:51.398 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:51.225+0000 7f5a0e3ac700 1 -- 192.168.123.105:0/3020461515 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f59ec005320 con 0x7f5a08086de0 2026-03-10T07:45:51.398 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:51.225+0000 7f59fe7fc700 1 -- 192.168.123.105:0/3020461515 <== mon.0 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 5) v1 ==== 45278+0+0 (secure 0 0 0) 0x7f5a0401d040 con 0x7f5a08086de0 2026-03-10T07:45:51.398 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:51.225+0000 7f59fe7fc700 1 --2- 192.168.123.105:0/3020461515 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f59f40383e0 0x7f59f403a8a0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:45:51.398 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:51.225+0000 7f5a0cba9700 1 -- 192.168.123.105:0/3020461515 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f59f40383e0 msgr2=0x7f59f403a8a0 unknown :-1 s=STATE_CONNECTING_RE l=1).process reconnect failed to v2:192.168.123.105:6800/2 2026-03-10T07:45:51.398 
INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:51.225+0000 7f5a0cba9700 1 --2- 192.168.123.105:0/3020461515 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f59f40383e0 0x7f59f403a8a0 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._fault waiting 0.200000 2026-03-10T07:45:51.398 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:51.225+0000 7f59fe7fc700 1 -- 192.168.123.105:0/3020461515 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(1..1 src has 1..1) v4 ==== 725+0+0 (secure 0 0 0) 0x7f5a0404d9d0 con 0x7f5a08086de0 2026-03-10T07:45:51.398 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:51.228+0000 7f59fe7fc700 1 -- 192.168.123.105:0/3020461515 <== mon.0 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7f5a04015930 con 0x7f5a08086de0 2026-03-10T07:45:51.398 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:51.363+0000 7f5a0e3ac700 1 -- 192.168.123.105:0/3020461515 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "mgr stat"} v 0) v1 -- 0x7f59ec006200 con 0x7f5a08086de0 2026-03-10T07:45:51.398 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:51.366+0000 7f59fe7fc700 1 -- 192.168.123.105:0/3020461515 <== mon.0 v2:192.168.123.105:3300/0 7 ==== mon_command_ack([{"prefix": "mgr stat"}]=0 v5) v1 ==== 56+0+98 (secure 0 0 0) 0x7f5a04023080 con 0x7f5a08086de0 2026-03-10T07:45:51.399 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:51.368+0000 7f59f3fff700 1 -- 192.168.123.105:0/3020461515 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f59f40383e0 msgr2=0x7f59f403a8a0 unknown :-1 s=STATE_CONNECTING l=1).mark_down 2026-03-10T07:45:51.399 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 
2026-03-10T07:45:51.368+0000 7f59f3fff700 1 --2- 192.168.123.105:0/3020461515 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f59f40383e0 0x7f59f403a8a0 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:45:51.399 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:51.368+0000 7f59f3fff700 1 -- 192.168.123.105:0/3020461515 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f5a08086de0 msgr2=0x7f5a08087200 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:45:51.399 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:51.368+0000 7f59f3fff700 1 --2- 192.168.123.105:0/3020461515 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f5a08086de0 0x7f5a08087200 secure :-1 s=READY pgs=27 cs=0 l=1 rev1=1 crypto rx=0x7f5a040039a0 tx=0x7f5a04004200 comp rx=0 tx=0).stop 2026-03-10T07:45:51.399 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:51.369+0000 7f59f3fff700 1 -- 192.168.123.105:0/3020461515 shutdown_connections 2026-03-10T07:45:51.399 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:51.369+0000 7f59f3fff700 1 --2- 192.168.123.105:0/3020461515 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f59f40383e0 0x7f59f403a8a0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:45:51.399 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:51.369+0000 7f59f3fff700 1 --2- 192.168.123.105:0/3020461515 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f5a08086de0 0x7f5a08087200 unknown :-1 s=CLOSED pgs=27 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:45:51.399 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:51.369+0000 7f59f3fff700 1 -- 192.168.123.105:0/3020461515 >> 
192.168.123.105:0/3020461515 conn(0x7f5a0806d320 msgr2=0x7f5a0806dd00 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:45:51.399 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:51.369+0000 7f59f3fff700 1 -- 192.168.123.105:0/3020461515 shutdown_connections 2026-03-10T07:45:51.399 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:51.369+0000 7f59f3fff700 1 -- 192.168.123.105:0/3020461515 wait complete. 2026-03-10T07:45:51.399 INFO:teuthology.orchestra.run.vm05.stdout:Waiting for the mgr to restart... 2026-03-10T07:45:51.399 INFO:teuthology.orchestra.run.vm05.stdout:Waiting for mgr epoch 5... 2026-03-10T07:45:52.383 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:45:51 vm05 ceph-mon[50387]: from='client.? 192.168.123.105:0/3386930890' entity='client.admin' cmd='[{"prefix": "mgr module enable", "module": "cephadm"}]': finished 2026-03-10T07:45:52.383 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:45:51 vm05 ceph-mon[50387]: mgrmap e5: vm05.blexke(active, since 4s) 2026-03-10T07:45:52.383 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:45:51 vm05 ceph-mon[50387]: from='client.? 
192.168.123.105:0/3020461515' entity='client.admin' cmd=[{"prefix": "mgr stat"}]: dispatch 2026-03-10T07:45:55.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:45:55 vm05 ceph-mon[50387]: Active manager daemon vm05.blexke restarted 2026-03-10T07:45:55.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:45:55 vm05 ceph-mon[50387]: Activating manager daemon vm05.blexke 2026-03-10T07:45:55.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:45:55 vm05 ceph-mon[50387]: osdmap e2: 0 total, 0 up, 0 in 2026-03-10T07:45:55.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:45:55 vm05 ceph-mon[50387]: mgrmap e6: vm05.blexke(active, starting, since 0.00436178s) 2026-03-10T07:45:55.908 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:45:55 vm05 ceph-mon[50387]: from='mgr.14120 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "mon metadata", "id": "vm05"}]: dispatch 2026-03-10T07:45:55.908 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:45:55 vm05 ceph-mon[50387]: from='mgr.14120 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "mgr metadata", "who": "vm05.blexke", "id": "vm05.blexke"}]: dispatch 2026-03-10T07:45:55.908 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:45:55 vm05 ceph-mon[50387]: from='mgr.14120 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "mds metadata"}]: dispatch 2026-03-10T07:45:55.908 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:45:55 vm05 ceph-mon[50387]: from='mgr.14120 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "osd metadata"}]: dispatch 2026-03-10T07:45:55.908 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:45:55 vm05 ceph-mon[50387]: from='mgr.14120 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "mon metadata"}]: dispatch 2026-03-10T07:45:55.908 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:45:55 vm05 ceph-mon[50387]: Manager daemon vm05.blexke is now available 2026-03-10T07:45:55.908 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 
10 07:45:55 vm05 ceph-mon[50387]: from='mgr.14120 192.168.123.105:0/2' entity='mgr.vm05.blexke' 2026-03-10T07:45:55.908 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:45:55 vm05 ceph-mon[50387]: from='mgr.14120 192.168.123.105:0/2' entity='mgr.vm05.blexke' 2026-03-10T07:45:55.908 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:45:55 vm05 ceph-mon[50387]: from='mgr.14120 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T07:45:55.908 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:45:55 vm05 ceph-mon[50387]: from='mgr.14120 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T07:45:55.908 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:45:55 vm05 ceph-mon[50387]: from='mgr.14120 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T07:45:57.404 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout { 2026-03-10T07:45:57.404 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "mgrmap_epoch": 7, 2026-03-10T07:45:57.404 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "initialized": true 2026-03-10T07:45:57.404 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout } 2026-03-10T07:45:57.404 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:51.536+0000 7f2fe288f700 1 Processor -- start 2026-03-10T07:45:57.404 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:51.537+0000 7f2fe288f700 1 -- start start 2026-03-10T07:45:57.404 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:51.537+0000 7f2fe288f700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f2fdc071da0 0x7f2fdc0721c0 unknown :-1 s=NONE pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:45:57.404 
INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:51.537+0000 7f2fe288f700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f2fdc072700 con 0x7f2fdc071da0 2026-03-10T07:45:57.404 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:51.537+0000 7f2fdbfff700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f2fdc071da0 0x7f2fdc0721c0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:45:57.404 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:51.537+0000 7f2fdbfff700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f2fdc071da0 0x7f2fdc0721c0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.105:38194/0 (socket says 192.168.123.105:38194) 2026-03-10T07:45:57.404 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:51.537+0000 7f2fdbfff700 1 -- 192.168.123.105:0/646226041 learned_addr learned my addr 192.168.123.105:0/646226041 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T07:45:57.404 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:51.538+0000 7f2fdbfff700 1 -- 192.168.123.105:0/646226041 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f2fdc072840 con 0x7f2fdc071da0 2026-03-10T07:45:57.404 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:51.538+0000 7f2fdbfff700 1 --2- 192.168.123.105:0/646226041 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f2fdc071da0 0x7f2fdc0721c0 secure :-1 s=READY pgs=28 cs=0 l=1 rev1=1 crypto rx=0x7f2fcc009a90 tx=0x7f2fcc009da0 comp rx=0 tx=0).ready entity=mon.0 
client_cookie=6682589bd506e0d7 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:45:57.404 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:51.539+0000 7f2fdaffd700 1 -- 192.168.123.105:0/646226041 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f2fcc004030 con 0x7f2fdc071da0 2026-03-10T07:45:57.404 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:51.539+0000 7f2fdaffd700 1 -- 192.168.123.105:0/646226041 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1075+0+0 (secure 0 0 0) 0x7f2fcc00b7e0 con 0x7f2fdc071da0 2026-03-10T07:45:57.404 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:51.539+0000 7f2fdaffd700 1 -- 192.168.123.105:0/646226041 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f2fcc0039f0 con 0x7f2fdc071da0 2026-03-10T07:45:57.404 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:51.539+0000 7f2fe288f700 1 -- 192.168.123.105:0/646226041 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f2fdc071da0 msgr2=0x7f2fdc0721c0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:45:57.404 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:51.539+0000 7f2fe288f700 1 --2- 192.168.123.105:0/646226041 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f2fdc071da0 0x7f2fdc0721c0 secure :-1 s=READY pgs=28 cs=0 l=1 rev1=1 crypto rx=0x7f2fcc009a90 tx=0x7f2fcc009da0 comp rx=0 tx=0).stop 2026-03-10T07:45:57.404 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:51.539+0000 7f2fe288f700 1 -- 192.168.123.105:0/646226041 shutdown_connections 2026-03-10T07:45:57.404 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:51.539+0000 7f2fe288f700 1 --2- 192.168.123.105:0/646226041 >> 
[v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f2fdc071da0 0x7f2fdc0721c0 secure :-1 s=CLOSED pgs=28 cs=0 l=1 rev1=1 crypto rx=0x7f2fcc009a90 tx=0x7f2fcc009da0 comp rx=0 tx=0).stop 2026-03-10T07:45:57.404 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:51.539+0000 7f2fe288f700 1 -- 192.168.123.105:0/646226041 >> 192.168.123.105:0/646226041 conn(0x7f2fdc06d400 msgr2=0x7f2fdc06f840 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:45:57.404 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:51.539+0000 7f2fe288f700 1 -- 192.168.123.105:0/646226041 shutdown_connections 2026-03-10T07:45:57.404 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:51.539+0000 7f2fe288f700 1 -- 192.168.123.105:0/646226041 wait complete. 2026-03-10T07:45:57.404 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:51.540+0000 7f2fe288f700 1 Processor -- start 2026-03-10T07:45:57.404 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:51.540+0000 7f2fe288f700 1 -- start start 2026-03-10T07:45:57.404 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:51.540+0000 7f2fe288f700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f2fdc1a9250 0x7f2fdc1a9670 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:45:57.404 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:51.540+0000 7f2fe288f700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f2fdc1a9bb0 con 0x7f2fdc1a9250 2026-03-10T07:45:57.404 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:51.540+0000 7f2fdbfff700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f2fdc1a9250 0x7f2fdc1a9670 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 
comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:45:57.404 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:51.540+0000 7f2fdbfff700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f2fdc1a9250 0x7f2fdc1a9670 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.105:38204/0 (socket says 192.168.123.105:38204) 2026-03-10T07:45:57.404 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:51.540+0000 7f2fdbfff700 1 -- 192.168.123.105:0/3751511327 learned_addr learned my addr 192.168.123.105:0/3751511327 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T07:45:57.404 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:51.540+0000 7f2fdbfff700 1 -- 192.168.123.105:0/3751511327 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f2fcc009740 con 0x7f2fdc1a9250 2026-03-10T07:45:57.404 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:51.540+0000 7f2fdbfff700 1 --2- 192.168.123.105:0/3751511327 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f2fdc1a9250 0x7f2fdc1a9670 secure :-1 s=READY pgs=29 cs=0 l=1 rev1=1 crypto rx=0x7f2fdc071a70 tx=0x7f2fcc00bf30 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:45:57.404 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:51.542+0000 7f2fd97fa700 1 -- 192.168.123.105:0/3751511327 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f2fcc003fa0 con 0x7f2fdc1a9250 2026-03-10T07:45:57.404 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:51.542+0000 7f2fe288f700 1 -- 192.168.123.105:0/3751511327 --> 
[v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f2fdc1a9db0 con 0x7f2fdc1a9250 2026-03-10T07:45:57.404 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:51.542+0000 7f2fe288f700 1 -- 192.168.123.105:0/3751511327 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f2fdc07b1c0 con 0x7f2fdc1a9250 2026-03-10T07:45:57.404 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:51.542+0000 7f2fd97fa700 1 -- 192.168.123.105:0/3751511327 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1075+0+0 (secure 0 0 0) 0x7f2fcc0045a0 con 0x7f2fdc1a9250 2026-03-10T07:45:57.404 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:51.542+0000 7f2fd97fa700 1 -- 192.168.123.105:0/3751511327 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f2fcc01b440 con 0x7f2fdc1a9250 2026-03-10T07:45:57.404 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:51.543+0000 7f2fd97fa700 1 -- 192.168.123.105:0/3751511327 <== mon.0 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 5) v1 ==== 45278+0+0 (secure 0 0 0) 0x7f2fcc01b5a0 con 0x7f2fdc1a9250 2026-03-10T07:45:57.404 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:51.543+0000 7f2fd97fa700 1 --2- 192.168.123.105:0/3751511327 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f2fc40383f0 0x7f2fc403a8b0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:45:57.404 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:51.543+0000 7f2fd97fa700 1 -- 192.168.123.105:0/3751511327 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(1..1 src has 1..1) v4 ==== 725+0+0 (secure 0 0 0) 0x7f2fcc04d180 con 0x7f2fdc1a9250 2026-03-10T07:45:57.404 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: 
stderr 2026-03-10T07:45:51.543+0000 7f2fe288f700 1 -- 192.168.123.105:0/3751511327 --> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] -- command(tid 0: {"prefix": "get_command_descriptions"}) v1 -- 0x7f2fc8000d40 con 0x7f2fc40383f0 2026-03-10T07:45:57.404 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:51.543+0000 7f2fdb7fe700 1 -- 192.168.123.105:0/3751511327 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f2fc40383f0 msgr2=0x7f2fc403a8b0 unknown :-1 s=STATE_CONNECTING_RE l=1).process reconnect failed to v2:192.168.123.105:6800/2 2026-03-10T07:45:57.404 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:51.543+0000 7f2fdb7fe700 1 --2- 192.168.123.105:0/3751511327 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f2fc40383f0 0x7f2fc403a8b0 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._fault waiting 0.200000 2026-03-10T07:45:57.404 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:51.744+0000 7f2fdb7fe700 1 -- 192.168.123.105:0/3751511327 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f2fc40383f0 msgr2=0x7f2fc403a8b0 unknown :-1 s=STATE_CONNECTING_RE l=1).process reconnect failed to v2:192.168.123.105:6800/2 2026-03-10T07:45:57.404 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:51.744+0000 7f2fdb7fe700 1 --2- 192.168.123.105:0/3751511327 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f2fc40383f0 0x7f2fc403a8b0 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._fault waiting 0.400000 2026-03-10T07:45:57.404 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:52.145+0000 7f2fdb7fe700 1 -- 192.168.123.105:0/3751511327 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f2fc40383f0 msgr2=0x7f2fc403a8b0 unknown :-1 s=STATE_CONNECTING_RE l=1).process reconnect 
failed to v2:192.168.123.105:6800/2 2026-03-10T07:45:57.404 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:52.145+0000 7f2fdb7fe700 1 --2- 192.168.123.105:0/3751511327 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f2fc40383f0 0x7f2fc403a8b0 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._fault waiting 0.800000 2026-03-10T07:45:57.404 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:52.946+0000 7f2fdb7fe700 1 -- 192.168.123.105:0/3751511327 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f2fc40383f0 msgr2=0x7f2fc403a8b0 unknown :-1 s=STATE_CONNECTING_RE l=1).process reconnect failed to v2:192.168.123.105:6800/2 2026-03-10T07:45:57.404 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:52.946+0000 7f2fdb7fe700 1 --2- 192.168.123.105:0/3751511327 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f2fc40383f0 0x7f2fc403a8b0 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._fault waiting 1.600000 2026-03-10T07:45:57.404 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:54.547+0000 7f2fdb7fe700 1 -- 192.168.123.105:0/3751511327 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f2fc40383f0 msgr2=0x7f2fc403a8b0 unknown :-1 s=STATE_CONNECTING_RE l=1).process reconnect failed to v2:192.168.123.105:6800/2 2026-03-10T07:45:57.404 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:54.547+0000 7f2fdb7fe700 1 --2- 192.168.123.105:0/3751511327 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f2fc40383f0 0x7f2fc403a8b0 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._fault waiting 3.200000 2026-03-10T07:45:57.404 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:55.499+0000 7f2fd97fa700 1 -- 
192.168.123.105:0/3751511327 <== mon.0 v2:192.168.123.105:3300/0 6 ==== mgrmap(e 6) v1 ==== 45045+0+0 (secure 0 0 0) 0x7f2fcc02dce0 con 0x7f2fdc1a9250 2026-03-10T07:45:57.404 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:55.499+0000 7f2fd97fa700 1 -- 192.168.123.105:0/3751511327 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f2fc40383f0 msgr2=0x7f2fc403a8b0 unknown :-1 s=STATE_CONNECTING l=1).mark_down 2026-03-10T07:45:57.404 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:55.499+0000 7f2fd97fa700 1 --2- 192.168.123.105:0/3751511327 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f2fc40383f0 0x7f2fc403a8b0 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:45:57.404 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:57.006+0000 7f2fd97fa700 1 -- 192.168.123.105:0/3751511327 <== mon.0 v2:192.168.123.105:3300/0 7 ==== mgrmap(e 7) v1 ==== 45172+0+0 (secure 0 0 0) 0x7f2fcc04d6e0 con 0x7f2fdc1a9250 2026-03-10T07:45:57.404 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:57.006+0000 7f2fd97fa700 1 --2- 192.168.123.105:0/3751511327 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f2fc40383f0 0x7f2fc403a8b0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:45:57.405 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:57.006+0000 7f2fd97fa700 1 -- 192.168.123.105:0/3751511327 --> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] -- command(tid 0: {"prefix": "get_command_descriptions"}) v1 -- 0x7f2fc8000d40 con 0x7f2fc40383f0 2026-03-10T07:45:57.405 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:57.009+0000 7f2fdb7fe700 1 --2- 192.168.123.105:0/3751511327 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f2fc40383f0 
0x7f2fc403a8b0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:45:57.405 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:57.010+0000 7f2fdb7fe700 1 --2- 192.168.123.105:0/3751511327 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f2fc40383f0 0x7f2fc403a8b0 secure :-1 s=READY pgs=1 cs=0 l=1 rev1=1 crypto rx=0x7f2fd0003a10 tx=0x7f2fd00092b0 comp rx=0 tx=0).ready entity=mgr.14120 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:45:57.405 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:57.011+0000 7f2fd97fa700 1 -- 192.168.123.105:0/3751511327 <== mgr.14120 v2:192.168.123.105:6800/2 1 ==== command_reply(tid 0: 0 ) v1 ==== 8+0+6910 (secure 0 0 0) 0x7f2fc8000d40 con 0x7f2fc40383f0 2026-03-10T07:45:57.405 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:57.014+0000 7f2fe288f700 1 -- 192.168.123.105:0/3751511327 --> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] -- command(tid 1: {"prefix": "mgr_status"}) v1 -- 0x7f2fc8002800 con 0x7f2fc40383f0 2026-03-10T07:45:57.405 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:57.015+0000 7f2fd97fa700 1 -- 192.168.123.105:0/3751511327 <== mgr.14120 v2:192.168.123.105:6800/2 2 ==== command_reply(tid 1: 0 ) v1 ==== 8+0+51 (secure 0 0 0) 0x7f2fc8002800 con 0x7f2fc40383f0 2026-03-10T07:45:57.405 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:57.015+0000 7f2fc2ffd700 1 -- 192.168.123.105:0/3751511327 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f2fc40383f0 msgr2=0x7f2fc403a8b0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:45:57.405 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:57.015+0000 7f2fc2ffd700 1 --2- 192.168.123.105:0/3751511327 >> 
[v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f2fc40383f0 0x7f2fc403a8b0 secure :-1 s=READY pgs=1 cs=0 l=1 rev1=1 crypto rx=0x7f2fd0003a10 tx=0x7f2fd00092b0 comp rx=0 tx=0).stop 2026-03-10T07:45:57.405 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:57.015+0000 7f2fc2ffd700 1 -- 192.168.123.105:0/3751511327 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f2fdc1a9250 msgr2=0x7f2fdc1a9670 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:45:57.405 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:57.015+0000 7f2fc2ffd700 1 --2- 192.168.123.105:0/3751511327 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f2fdc1a9250 0x7f2fdc1a9670 secure :-1 s=READY pgs=29 cs=0 l=1 rev1=1 crypto rx=0x7f2fdc071a70 tx=0x7f2fcc00bf30 comp rx=0 tx=0).stop 2026-03-10T07:45:57.405 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:57.015+0000 7f2fc2ffd700 1 -- 192.168.123.105:0/3751511327 shutdown_connections 2026-03-10T07:45:57.405 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:57.015+0000 7f2fc2ffd700 1 --2- 192.168.123.105:0/3751511327 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f2fc40383f0 0x7f2fc403a8b0 unknown :-1 s=CLOSED pgs=1 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:45:57.405 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:57.015+0000 7f2fc2ffd700 1 --2- 192.168.123.105:0/3751511327 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f2fdc1a9250 0x7f2fdc1a9670 unknown :-1 s=CLOSED pgs=29 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:45:57.405 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:57.015+0000 7f2fc2ffd700 1 -- 192.168.123.105:0/3751511327 >> 192.168.123.105:0/3751511327 conn(0x7f2fdc06d400 msgr2=0x7f2fdc06e0b0 unknown :-1 
s=STATE_NONE l=0).mark_down 2026-03-10T07:45:57.405 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:57.015+0000 7f2fc2ffd700 1 -- 192.168.123.105:0/3751511327 shutdown_connections 2026-03-10T07:45:57.405 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:57.015+0000 7f2fc2ffd700 1 -- 192.168.123.105:0/3751511327 wait complete. 2026-03-10T07:45:57.405 INFO:teuthology.orchestra.run.vm05.stdout:mgr epoch 5 is available 2026-03-10T07:45:57.405 INFO:teuthology.orchestra.run.vm05.stdout:Setting orchestrator backend to cephadm... 2026-03-10T07:45:57.520 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:45:57 vm05 ceph-mon[50387]: Found migration_current of "None". Setting to last migration. 2026-03-10T07:45:57.520 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:45:57 vm05 ceph-mon[50387]: from='mgr.14120 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/vm05.blexke/mirror_snapshot_schedule"}]: dispatch 2026-03-10T07:45:57.520 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:45:57 vm05 ceph-mon[50387]: from='mgr.14120 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/vm05.blexke/trash_purge_schedule"}]: dispatch 2026-03-10T07:45:57.520 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:45:57 vm05 ceph-mon[50387]: from='mgr.14120 192.168.123.105:0/2' entity='mgr.vm05.blexke' 2026-03-10T07:45:57.520 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:45:57 vm05 ceph-mon[50387]: from='mgr.14120 192.168.123.105:0/2' entity='mgr.vm05.blexke' 2026-03-10T07:45:57.521 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:45:57 vm05 ceph-mon[50387]: from='mgr.14120 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T07:45:57.687 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:57.531+0000 7f9cd7b53700 
1 Processor -- start 2026-03-10T07:45:57.687 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:57.532+0000 7f9cd7b53700 1 -- start start 2026-03-10T07:45:57.687 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:57.532+0000 7f9cd7b53700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f9cd0108980 0x7f9cd0108da0 unknown :-1 s=NONE pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:45:57.687 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:57.532+0000 7f9cd7b53700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f9cd0109370 con 0x7f9cd0108980 2026-03-10T07:45:57.687 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:57.533+0000 7f9cd58ef700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f9cd0108980 0x7f9cd0108da0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:45:57.687 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:57.533+0000 7f9cd58ef700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f9cd0108980 0x7f9cd0108da0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.105:38268/0 (socket says 192.168.123.105:38268) 2026-03-10T07:45:57.687 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:57.533+0000 7f9cd58ef700 1 -- 192.168.123.105:0/4287861546 learned_addr learned my addr 192.168.123.105:0/4287861546 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T07:45:57.687 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:57.533+0000 7f9cd58ef700 1 -- 192.168.123.105:0/4287861546 --> 
[v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f9cd0109b80 con 0x7f9cd0108980 2026-03-10T07:45:57.687 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:57.533+0000 7f9cd58ef700 1 --2- 192.168.123.105:0/4287861546 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f9cd0108980 0x7f9cd0108da0 secure :-1 s=READY pgs=37 cs=0 l=1 rev1=1 crypto rx=0x7f9cc0009a90 tx=0x7f9cc0009da0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=bff40ddfdfa874d6 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:45:57.687 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:57.534+0000 7f9cd48ed700 1 -- 192.168.123.105:0/4287861546 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f9cc0004030 con 0x7f9cd0108980 2026-03-10T07:45:57.687 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:57.534+0000 7f9cd48ed700 1 -- 192.168.123.105:0/4287861546 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1075+0+0 (secure 0 0 0) 0x7f9cc000b7e0 con 0x7f9cd0108980 2026-03-10T07:45:57.687 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:57.534+0000 7f9cd48ed700 1 -- 192.168.123.105:0/4287861546 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f9cc0003a40 con 0x7f9cd0108980 2026-03-10T07:45:57.687 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:57.534+0000 7f9cd7b53700 1 -- 192.168.123.105:0/4287861546 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f9cd0108980 msgr2=0x7f9cd0108da0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:45:57.687 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:57.534+0000 7f9cd7b53700 1 --2- 192.168.123.105:0/4287861546 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] 
conn(0x7f9cd0108980 0x7f9cd0108da0 secure :-1 s=READY pgs=37 cs=0 l=1 rev1=1 crypto rx=0x7f9cc0009a90 tx=0x7f9cc0009da0 comp rx=0 tx=0).stop 2026-03-10T07:45:57.687 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:57.534+0000 7f9cd7b53700 1 -- 192.168.123.105:0/4287861546 shutdown_connections 2026-03-10T07:45:57.687 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:57.534+0000 7f9cd7b53700 1 --2- 192.168.123.105:0/4287861546 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f9cd0108980 0x7f9cd0108da0 unknown :-1 s=CLOSED pgs=37 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:45:57.687 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:57.534+0000 7f9cd7b53700 1 -- 192.168.123.105:0/4287861546 >> 192.168.123.105:0/4287861546 conn(0x7f9cd01044d0 msgr2=0x7f9cd01068c0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:45:57.687 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:57.534+0000 7f9cd7b53700 1 -- 192.168.123.105:0/4287861546 shutdown_connections 2026-03-10T07:45:57.688 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:57.534+0000 7f9cd7b53700 1 -- 192.168.123.105:0/4287861546 wait complete. 
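The `_fault waiting 0.200000` … `_fault waiting 3.200000` lines above show the messenger retrying its connection to the restarting mgr with doubling delays. A minimal sketch of that retry pattern, assuming a generic `try_connect` callable and an illustrative 3.2 s ceiling inferred from the log (this is not Ceph's actual msgr implementation):

```python
import time

def backoff_delays(initial=0.2, factor=2.0, ceiling=3.2):
    """Yield doubling retry delays: 0.2, 0.4, 0.8, 1.6, 3.2, 3.2, ..."""
    delay = initial
    while True:
        yield delay
        delay = min(delay * factor, ceiling)

def connect_with_backoff(try_connect, max_attempts=5):
    """Retry `try_connect` (hypothetical callable returning bool),
    sleeping for the next backoff delay after each failure."""
    delays = backoff_delays()
    for _ in range(max_attempts):
        if try_connect():
            return True
        time.sleep(next(delays))
    return False
```

In the log the client gives up on the old mgr address once a new mgrmap epoch arrives (the `mark_down` after `mgrmap(e 6)`), rather than exhausting a fixed attempt count.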
2026-03-10T07:45:57.688 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:57.535+0000 7f9cd7b53700 1 Processor -- start 2026-03-10T07:45:57.688 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:57.535+0000 7f9cd7b53700 1 -- start start 2026-03-10T07:45:57.688 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:57.535+0000 7f9cd7b53700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f9cd019c860 0x7f9cd019cc80 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:45:57.688 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:57.535+0000 7f9cd7b53700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f9cd019d1c0 con 0x7f9cd019c860 2026-03-10T07:45:57.688 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:57.535+0000 7f9cd58ef700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f9cd019c860 0x7f9cd019cc80 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:45:57.688 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:57.536+0000 7f9cd58ef700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f9cd019c860 0x7f9cd019cc80 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.105:38276/0 (socket says 192.168.123.105:38276) 2026-03-10T07:45:57.688 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:57.536+0000 7f9cd58ef700 1 -- 192.168.123.105:0/4231835569 learned_addr learned my addr 192.168.123.105:0/4231835569 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T07:45:57.688 
INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:57.536+0000 7f9cd58ef700 1 -- 192.168.123.105:0/4231835569 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f9cc0009740 con 0x7f9cd019c860 2026-03-10T07:45:57.688 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:57.536+0000 7f9cd58ef700 1 --2- 192.168.123.105:0/4231835569 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f9cd019c860 0x7f9cd019cc80 secure :-1 s=READY pgs=38 cs=0 l=1 rev1=1 crypto rx=0x7f9cc0003710 tx=0x7f9cc0003b00 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:45:57.688 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:57.536+0000 7f9cc6ffd700 1 -- 192.168.123.105:0/4231835569 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f9cc0003fc0 con 0x7f9cd019c860 2026-03-10T07:45:57.688 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:57.536+0000 7f9cc6ffd700 1 -- 192.168.123.105:0/4231835569 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1075+0+0 (secure 0 0 0) 0x7f9cc0024460 con 0x7f9cd019c860 2026-03-10T07:45:57.688 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:57.536+0000 7f9cc6ffd700 1 -- 192.168.123.105:0/4231835569 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f9cc001b440 con 0x7f9cd019c860 2026-03-10T07:45:57.688 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:57.536+0000 7f9cd7b53700 1 -- 192.168.123.105:0/4231835569 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f9cd019d3c0 con 0x7f9cd019c860 2026-03-10T07:45:57.688 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:57.536+0000 7f9cd7b53700 1 
-- 192.168.123.105:0/4231835569 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f9cd007c4d0 con 0x7f9cd019c860 2026-03-10T07:45:57.688 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:57.537+0000 7f9cc6ffd700 1 -- 192.168.123.105:0/4231835569 <== mon.0 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 7) v1 ==== 45172+0+0 (secure 0 0 0) 0x7f9cc00245d0 con 0x7f9cd019c860 2026-03-10T07:45:57.688 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:57.537+0000 7f9cd7b53700 1 -- 192.168.123.105:0/4231835569 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f9cd004fa20 con 0x7f9cd019c860 2026-03-10T07:45:57.688 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:57.537+0000 7f9cc6ffd700 1 --2- 192.168.123.105:0/4231835569 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f9cbc038220 0x7f9cbc03a6e0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:45:57.688 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:57.537+0000 7f9cc6ffd700 1 -- 192.168.123.105:0/4231835569 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(2..2 src has 1..2) v4 ==== 940+0+0 (secure 0 0 0) 0x7f9cc004bfc0 con 0x7f9cd019c860 2026-03-10T07:45:57.688 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:57.540+0000 7f9cd50ee700 1 --2- 192.168.123.105:0/4231835569 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f9cbc038220 0x7f9cbc03a6e0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:45:57.688 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:57.540+0000 7f9cd50ee700 1 --2- 192.168.123.105:0/4231835569 >> 
[v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f9cbc038220 0x7f9cbc03a6e0 secure :-1 s=READY pgs=7 cs=0 l=1 rev1=1 crypto rx=0x7f9ccc006fd0 tx=0x7f9ccc006e40 comp rx=0 tx=0).ready entity=mgr.14120 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:45:57.688 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:57.541+0000 7f9cc6ffd700 1 -- 192.168.123.105:0/4231835569 <== mon.0 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7f9cc0004120 con 0x7f9cd019c860 2026-03-10T07:45:57.688 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:57.648+0000 7f9cd7b53700 1 -- 192.168.123.105:0/4231835569 --> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] -- mgr_command(tid 0: {"prefix": "orch set backend", "module_name": "cephadm", "target": ["mon-mgr", ""]}) v1 -- 0x7f9cd0106920 con 0x7f9cbc038220 2026-03-10T07:45:57.688 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:57.654+0000 7f9cc6ffd700 1 -- 192.168.123.105:0/4231835569 <== mgr.14120 v2:192.168.123.105:6800/2 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+0 (secure 0 0 0) 0x7f9cd0106920 con 0x7f9cbc038220 2026-03-10T07:45:57.688 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:57.659+0000 7f9cd7b53700 1 -- 192.168.123.105:0/4231835569 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f9cbc038220 msgr2=0x7f9cbc03a6e0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:45:57.688 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:57.659+0000 7f9cd7b53700 1 --2- 192.168.123.105:0/4231835569 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f9cbc038220 0x7f9cbc03a6e0 secure :-1 s=READY pgs=7 cs=0 l=1 rev1=1 crypto rx=0x7f9ccc006fd0 tx=0x7f9ccc006e40 comp rx=0 tx=0).stop 2026-03-10T07:45:57.688 
INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:57.659+0000 7f9cd7b53700 1 -- 192.168.123.105:0/4231835569 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f9cd019c860 msgr2=0x7f9cd019cc80 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:45:57.688 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:57.659+0000 7f9cd7b53700 1 --2- 192.168.123.105:0/4231835569 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f9cd019c860 0x7f9cd019cc80 secure :-1 s=READY pgs=38 cs=0 l=1 rev1=1 crypto rx=0x7f9cc0003710 tx=0x7f9cc0003b00 comp rx=0 tx=0).stop 2026-03-10T07:45:57.688 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:57.660+0000 7f9cd7b53700 1 -- 192.168.123.105:0/4231835569 shutdown_connections 2026-03-10T07:45:57.688 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:57.660+0000 7f9cd7b53700 1 --2- 192.168.123.105:0/4231835569 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f9cbc038220 0x7f9cbc03a6e0 unknown :-1 s=CLOSED pgs=7 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:45:57.688 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:57.660+0000 7f9cd7b53700 1 --2- 192.168.123.105:0/4231835569 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f9cd019c860 0x7f9cd019cc80 unknown :-1 s=CLOSED pgs=38 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:45:57.688 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:57.660+0000 7f9cd7b53700 1 -- 192.168.123.105:0/4231835569 >> 192.168.123.105:0/4231835569 conn(0x7f9cd01044d0 msgr2=0x7f9cd0106210 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:45:57.688 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:57.660+0000 7f9cd7b53700 1 -- 192.168.123.105:0/4231835569 shutdown_connections 
2026-03-10T07:45:57.688 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:57.660+0000 7f9cd7b53700 1 -- 192.168.123.105:0/4231835569 wait complete. 2026-03-10T07:45:57.969 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout value unchanged 2026-03-10T07:45:57.969 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:57.804+0000 7f4b1205d700 1 Processor -- start 2026-03-10T07:45:57.969 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:57.804+0000 7f4b1205d700 1 -- start start 2026-03-10T07:45:57.969 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:57.804+0000 7f4b1205d700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f4b0c072b70 0x7f4b0c10fdc0 unknown :-1 s=NONE pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:45:57.969 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:57.804+0000 7f4b1205d700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f4b0c110390 con 0x7f4b0c072b70 2026-03-10T07:45:57.969 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:57.805+0000 7f4b0b7fe700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f4b0c072b70 0x7f4b0c10fdc0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:45:57.969 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:57.805+0000 7f4b0b7fe700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f4b0c072b70 0x7f4b0c10fdc0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.105:38280/0 (socket says 192.168.123.105:38280) 2026-03-10T07:45:57.969 
INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:57.805+0000 7f4b0b7fe700 1 -- 192.168.123.105:0/2400572594 learned_addr learned my addr 192.168.123.105:0/2400572594 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T07:45:57.969 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:57.805+0000 7f4b0b7fe700 1 -- 192.168.123.105:0/2400572594 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f4b0c1104d0 con 0x7f4b0c072b70 2026-03-10T07:45:57.969 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:57.805+0000 7f4b0b7fe700 1 --2- 192.168.123.105:0/2400572594 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f4b0c072b70 0x7f4b0c10fdc0 secure :-1 s=READY pgs=39 cs=0 l=1 rev1=1 crypto rx=0x7f4b04009fb0 tx=0x7f4b0400b450 comp rx=0 tx=0).ready entity=mon.0 client_cookie=272e1a64ead5fbaf server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:45:57.969 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:57.805+0000 7f4b0a7fc700 1 -- 192.168.123.105:0/2400572594 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f4b0400baa0 con 0x7f4b0c072b70 2026-03-10T07:45:57.969 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:57.805+0000 7f4b0a7fc700 1 -- 192.168.123.105:0/2400572594 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1075+0+0 (secure 0 0 0) 0x7f4b0400bc00 con 0x7f4b0c072b70 2026-03-10T07:45:57.969 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:57.805+0000 7f4b0a7fc700 1 -- 192.168.123.105:0/2400572594 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f4b04004750 con 0x7f4b0c072b70 2026-03-10T07:45:57.969 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:57.806+0000 7f4b1205d700 1 -- 
192.168.123.105:0/2400572594 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f4b0c072b70 msgr2=0x7f4b0c10fdc0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:45:57.969 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:57.806+0000 7f4b1205d700 1 --2- 192.168.123.105:0/2400572594 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f4b0c072b70 0x7f4b0c10fdc0 secure :-1 s=READY pgs=39 cs=0 l=1 rev1=1 crypto rx=0x7f4b04009fb0 tx=0x7f4b0400b450 comp rx=0 tx=0).stop 2026-03-10T07:45:57.969 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:57.806+0000 7f4b1205d700 1 -- 192.168.123.105:0/2400572594 shutdown_connections 2026-03-10T07:45:57.969 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:57.806+0000 7f4b1205d700 1 --2- 192.168.123.105:0/2400572594 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f4b0c072b70 0x7f4b0c10fdc0 unknown :-1 s=CLOSED pgs=39 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:45:57.969 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:57.806+0000 7f4b1205d700 1 -- 192.168.123.105:0/2400572594 >> 192.168.123.105:0/2400572594 conn(0x7f4b0c06d660 msgr2=0x7f4b0c06fac0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:45:57.969 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:57.806+0000 7f4b1205d700 1 -- 192.168.123.105:0/2400572594 shutdown_connections 2026-03-10T07:45:57.969 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:57.806+0000 7f4b1205d700 1 -- 192.168.123.105:0/2400572594 wait complete. 
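Each teuthology record above wraps a ceph-CLI stderr line that carries its own timestamp and thread id. A hypothetical helper for splitting the outer record apart (the regex is an assumption based on the format visible in this log, not an official teuthology API):

```python
import re

# Outer record: "<ISO timestamp> <LEVEL>:<logger name>:<payload>"
RECORD = re.compile(
    r"(?P<teuthology_ts>\d{4}-\d{2}-\d{2}T\d{2}:\d{2}:\d{2}\.\d+) "
    r"(?P<level>[A-Z]+):(?P<logger>[\w.@]+):(?P<payload>.*)"
)

def parse_record(line):
    """Split one teuthology log record into timestamp, level, logger,
    and payload (the wrapped ceph stderr/stdout text). Returns None
    if the line does not match the expected shape."""
    m = RECORD.match(line)
    return m.groupdict() if m else None
```

Grepping on the inner `stderr <timestamp>` payloads rather than the outer timestamps is useful here: the CLI buffers its stderr, so many inner records share one outer timestamp (e.g. the burst at `07:45:57.404`).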
2026-03-10T07:45:57.970 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:57.806+0000 7f4b1205d700 1 Processor -- start 2026-03-10T07:45:57.970 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:57.806+0000 7f4b1205d700 1 -- start start 2026-03-10T07:45:57.970 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:57.806+0000 7f4b1205d700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f4b0c072b70 0x7f4b0c1b1a90 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:45:57.970 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:57.806+0000 7f4b1205d700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f4b0c1b1fd0 con 0x7f4b0c072b70 2026-03-10T07:45:57.970 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:57.807+0000 7f4b0b7fe700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f4b0c072b70 0x7f4b0c1b1a90 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:45:57.970 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:57.807+0000 7f4b0b7fe700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f4b0c072b70 0x7f4b0c1b1a90 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.105:38286/0 (socket says 192.168.123.105:38286) 2026-03-10T07:45:57.970 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:57.807+0000 7f4b0b7fe700 1 -- 192.168.123.105:0/2212505473 learned_addr learned my addr 192.168.123.105:0/2212505473 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T07:45:57.970 
INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:57.807+0000 7f4b0b7fe700 1 -- 192.168.123.105:0/2212505473 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f4b04009d20 con 0x7f4b0c072b70 2026-03-10T07:45:57.970 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:57.807+0000 7f4b0b7fe700 1 --2- 192.168.123.105:0/2212505473 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f4b0c072b70 0x7f4b0c1b1a90 secure :-1 s=READY pgs=40 cs=0 l=1 rev1=1 crypto rx=0x7f4b04004b70 tx=0x7f4b0400bb50 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:45:57.970 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:57.808+0000 7f4b08ff9700 1 -- 192.168.123.105:0/2212505473 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f4b040046f0 con 0x7f4b0c072b70 2026-03-10T07:45:57.970 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:57.808+0000 7f4b08ff9700 1 -- 192.168.123.105:0/2212505473 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1075+0+0 (secure 0 0 0) 0x7f4b04010ab0 con 0x7f4b0c072b70 2026-03-10T07:45:57.970 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:57.808+0000 7f4b1205d700 1 -- 192.168.123.105:0/2212505473 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f4b0c1b21d0 con 0x7f4b0c072b70 2026-03-10T07:45:57.970 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:57.808+0000 7f4b1205d700 1 -- 192.168.123.105:0/2212505473 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f4b0c1b2650 con 0x7f4b0c072b70 2026-03-10T07:45:57.970 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:57.809+0000 7f4b08ff9700 1 
-- 192.168.123.105:0/2212505473 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f4b04007a80 con 0x7f4b0c072b70 2026-03-10T07:45:57.970 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:57.809+0000 7f4b08ff9700 1 -- 192.168.123.105:0/2212505473 <== mon.0 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 7) v1 ==== 45172+0+0 (secure 0 0 0) 0x7f4b04007cc0 con 0x7f4b0c072b70 2026-03-10T07:45:57.970 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:57.810+0000 7f4b08ff9700 1 --2- 192.168.123.105:0/2212505473 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f4af4038280 0x7f4af403a740 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:45:57.970 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:57.810+0000 7f4b08ff9700 1 -- 192.168.123.105:0/2212505473 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(2..2 src has 1..2) v4 ==== 940+0+0 (secure 0 0 0) 0x7f4b0404ce20 con 0x7f4b0c072b70 2026-03-10T07:45:57.970 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:57.810+0000 7f4b0affd700 1 --2- 192.168.123.105:0/2212505473 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f4af4038280 0x7f4af403a740 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:45:57.970 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:57.810+0000 7f4b0affd700 1 --2- 192.168.123.105:0/2212505473 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f4af4038280 0x7f4af403a740 secure :-1 s=READY pgs=8 cs=0 l=1 rev1=1 crypto rx=0x7f4b00006fd0 tx=0x7f4b00006e40 comp rx=0 tx=0).ready entity=mgr.14120 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:45:57.970 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 
2026-03-10T07:45:57.810+0000 7f4b1205d700 1 -- 192.168.123.105:0/2212505473 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f4af8005320 con 0x7f4b0c072b70 2026-03-10T07:45:57.970 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:57.813+0000 7f4b08ff9700 1 -- 192.168.123.105:0/2212505473 <== mon.0 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7f4b0402a430 con 0x7f4b0c072b70 2026-03-10T07:45:57.970 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:57.917+0000 7f4b1205d700 1 -- 192.168.123.105:0/2212505473 --> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] -- mgr_command(tid 0: {"prefix": "cephadm set-user", "user": "root", "target": ["mon-mgr", ""]}) v1 -- 0x7f4af8000bf0 con 0x7f4af4038280 2026-03-10T07:45:57.970 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:57.917+0000 7f4b08ff9700 1 -- 192.168.123.105:0/2212505473 <== mgr.14120 v2:192.168.123.105:6800/2 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+16 (secure 0 0 0) 0x7f4af8000bf0 con 0x7f4af4038280 2026-03-10T07:45:57.970 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:57.920+0000 7f4b1205d700 1 -- 192.168.123.105:0/2212505473 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f4af4038280 msgr2=0x7f4af403a740 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:45:57.970 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:57.920+0000 7f4b1205d700 1 --2- 192.168.123.105:0/2212505473 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f4af4038280 0x7f4af403a740 secure :-1 s=READY pgs=8 cs=0 l=1 rev1=1 crypto rx=0x7f4b00006fd0 tx=0x7f4b00006e40 comp rx=0 tx=0).stop 2026-03-10T07:45:57.970 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: 
stderr 2026-03-10T07:45:57.920+0000 7f4b1205d700 1 -- 192.168.123.105:0/2212505473 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f4b0c072b70 msgr2=0x7f4b0c1b1a90 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:45:57.970 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:57.920+0000 7f4b1205d700 1 --2- 192.168.123.105:0/2212505473 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f4b0c072b70 0x7f4b0c1b1a90 secure :-1 s=READY pgs=40 cs=0 l=1 rev1=1 crypto rx=0x7f4b04004b70 tx=0x7f4b0400bb50 comp rx=0 tx=0).stop 2026-03-10T07:45:57.970 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:57.920+0000 7f4b1205d700 1 -- 192.168.123.105:0/2212505473 shutdown_connections 2026-03-10T07:45:57.970 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:57.920+0000 7f4b1205d700 1 --2- 192.168.123.105:0/2212505473 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f4af4038280 0x7f4af403a740 unknown :-1 s=CLOSED pgs=8 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:45:57.970 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:57.921+0000 7f4b1205d700 1 --2- 192.168.123.105:0/2212505473 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f4b0c072b70 0x7f4b0c1b1a90 unknown :-1 s=CLOSED pgs=40 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:45:57.970 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:57.921+0000 7f4b1205d700 1 -- 192.168.123.105:0/2212505473 >> 192.168.123.105:0/2212505473 conn(0x7f4b0c06d660 msgr2=0x7f4b0c06f400 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:45:57.970 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:57.921+0000 7f4b1205d700 1 -- 192.168.123.105:0/2212505473 shutdown_connections 2026-03-10T07:45:57.970 
INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:57.921+0000 7f4b1205d700 1 -- 192.168.123.105:0/2212505473 wait complete. 2026-03-10T07:45:57.970 INFO:teuthology.orchestra.run.vm05.stdout:Generating ssh key... 2026-03-10T07:45:58.334 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:58.079+0000 7f0905f1c700 1 Processor -- start 2026-03-10T07:45:58.334 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:58.081+0000 7f0905f1c700 1 -- start start 2026-03-10T07:45:58.334 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:58.082+0000 7f0905f1c700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f0900106770 0x7f0900106b90 unknown :-1 s=NONE pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:45:58.334 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:58.082+0000 7f0905f1c700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f0900107160 con 0x7f0900106770 2026-03-10T07:45:58.334 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:58.082+0000 7f08ff7fe700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f0900106770 0x7f0900106b90 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:45:58.334 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:58.082+0000 7f08ff7fe700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f0900106770 0x7f0900106b90 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.105:38294/0 (socket says 192.168.123.105:38294) 2026-03-10T07:45:58.334 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 
2026-03-10T07:45:58.082+0000 7f08ff7fe700 1 -- 192.168.123.105:0/1965487126 learned_addr learned my addr 192.168.123.105:0/1965487126 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T07:45:58.334 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:58.083+0000 7f08ff7fe700 1 -- 192.168.123.105:0/1965487126 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f0900107970 con 0x7f0900106770 2026-03-10T07:45:58.334 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:58.083+0000 7f08ff7fe700 1 --2- 192.168.123.105:0/1965487126 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f0900106770 0x7f0900106b90 secure :-1 s=READY pgs=41 cs=0 l=1 rev1=1 crypto rx=0x7f08e8009cf0 tx=0x7f08e800b0e0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=83a231dd0c68d373 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:45:58.334 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:58.083+0000 7f08fe7fc700 1 -- 192.168.123.105:0/1965487126 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f08e8004030 con 0x7f0900106770 2026-03-10T07:45:58.334 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:58.083+0000 7f08fe7fc700 1 -- 192.168.123.105:0/1965487126 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1075+0+0 (secure 0 0 0) 0x7f08e800b810 con 0x7f0900106770 2026-03-10T07:45:58.334 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:58.083+0000 7f08fe7fc700 1 -- 192.168.123.105:0/1965487126 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f08e8003a90 con 0x7f0900106770 2026-03-10T07:45:58.334 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:58.084+0000 7f0905f1c700 1 -- 192.168.123.105:0/1965487126 >> 
[v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f0900106770 msgr2=0x7f0900106b90 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:45:58.334 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:58.084+0000 7f0905f1c700 1 --2- 192.168.123.105:0/1965487126 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f0900106770 0x7f0900106b90 secure :-1 s=READY pgs=41 cs=0 l=1 rev1=1 crypto rx=0x7f08e8009cf0 tx=0x7f08e800b0e0 comp rx=0 tx=0).stop 2026-03-10T07:45:58.334 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:58.084+0000 7f0905f1c700 1 -- 192.168.123.105:0/1965487126 shutdown_connections 2026-03-10T07:45:58.334 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:58.084+0000 7f0905f1c700 1 --2- 192.168.123.105:0/1965487126 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f0900106770 0x7f0900106b90 unknown :-1 s=CLOSED pgs=41 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:45:58.334 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:58.084+0000 7f0905f1c700 1 -- 192.168.123.105:0/1965487126 >> 192.168.123.105:0/1965487126 conn(0x7f0900101d30 msgr2=0x7f0900104150 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:45:58.334 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:58.084+0000 7f0905f1c700 1 -- 192.168.123.105:0/1965487126 shutdown_connections 2026-03-10T07:45:58.334 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:58.085+0000 7f0905f1c700 1 -- 192.168.123.105:0/1965487126 wait complete. 
2026-03-10T07:45:58.334 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:58.085+0000 7f0905f1c700 1 Processor -- start 2026-03-10T07:45:58.334 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:58.085+0000 7f0905f1c700 1 -- start start 2026-03-10T07:45:58.334 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:58.085+0000 7f0905f1c700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f0900106770 0x7f0900198430 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:45:58.334 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:58.085+0000 7f0905f1c700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f0900198970 con 0x7f0900106770 2026-03-10T07:45:58.334 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:58.086+0000 7f08ff7fe700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f0900106770 0x7f0900198430 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:45:58.334 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:58.086+0000 7f08ff7fe700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f0900106770 0x7f0900198430 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.105:38302/0 (socket says 192.168.123.105:38302) 2026-03-10T07:45:58.334 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:58.086+0000 7f08ff7fe700 1 -- 192.168.123.105:0/3864958018 learned_addr learned my addr 192.168.123.105:0/3864958018 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T07:45:58.334 
INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:58.087+0000 7f08ff7fe700 1 -- 192.168.123.105:0/3864958018 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f08e8009740 con 0x7f0900106770 2026-03-10T07:45:58.334 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:58.087+0000 7f08ff7fe700 1 --2- 192.168.123.105:0/3864958018 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f0900106770 0x7f0900198430 secure :-1 s=READY pgs=42 cs=0 l=1 rev1=1 crypto rx=0x7f08e8003770 tx=0x7f08e8003fe0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:45:58.334 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:58.087+0000 7f08fcff9700 1 -- 192.168.123.105:0/3864958018 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f08e8004140 con 0x7f0900106770 2026-03-10T07:45:58.334 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:58.087+0000 7f08fcff9700 1 -- 192.168.123.105:0/3864958018 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1075+0+0 (secure 0 0 0) 0x7f08e80042a0 con 0x7f0900106770 2026-03-10T07:45:58.334 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:58.087+0000 7f08fcff9700 1 -- 192.168.123.105:0/3864958018 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f08e8011420 con 0x7f0900106770 2026-03-10T07:45:58.334 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:58.087+0000 7f0905f1c700 1 -- 192.168.123.105:0/3864958018 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f09001087f0 con 0x7f0900106770 2026-03-10T07:45:58.334 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:58.087+0000 7f0905f1c700 1 
-- 192.168.123.105:0/3864958018 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f0900198e00 con 0x7f0900106770 2026-03-10T07:45:58.334 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:58.088+0000 7f08fcff9700 1 -- 192.168.123.105:0/3864958018 <== mon.0 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 7) v1 ==== 45172+0+0 (secure 0 0 0) 0x7f08e8011580 con 0x7f0900106770 2026-03-10T07:45:58.334 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:58.088+0000 7f08fcff9700 1 --2- 192.168.123.105:0/3864958018 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f08ec03c6d0 0x7f08ec03eb90 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:45:58.334 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:58.088+0000 7f08fcff9700 1 -- 192.168.123.105:0/3864958018 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(2..2 src has 1..2) v4 ==== 940+0+0 (secure 0 0 0) 0x7f08e804cbb0 con 0x7f0900106770 2026-03-10T07:45:58.334 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:58.088+0000 7f08feffd700 1 --2- 192.168.123.105:0/3864958018 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f08ec03c6d0 0x7f08ec03eb90 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:45:58.334 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:58.089+0000 7f0905f1c700 1 -- 192.168.123.105:0/3864958018 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f0900192060 con 0x7f0900106770 2026-03-10T07:45:58.334 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:58.091+0000 7f08feffd700 1 --2- 192.168.123.105:0/3864958018 >> 
[v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f08ec03c6d0 0x7f08ec03eb90 secure :-1 s=READY pgs=9 cs=0 l=1 rev1=1 crypto rx=0x7f08f0006fd0 tx=0x7f08f0006e40 comp rx=0 tx=0).ready entity=mgr.14120 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:45:58.334 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:58.091+0000 7f08fcff9700 1 -- 192.168.123.105:0/3864958018 <== mon.0 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7f08e8020070 con 0x7f0900106770 2026-03-10T07:45:58.334 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:58.193+0000 7f0905f1c700 1 -- 192.168.123.105:0/3864958018 --> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] -- mgr_command(tid 0: {"prefix": "cephadm generate-key", "target": ["mon-mgr", ""]}) v1 -- 0x7f090002ce60 con 0x7f08ec03c6d0 2026-03-10T07:45:58.334 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:58.264+0000 7f08fcff9700 1 -- 192.168.123.105:0/3864958018 <== mon.0 v2:192.168.123.105:3300/0 7 ==== mgrmap(e 8) v1 ==== 45278+0+0 (secure 0 0 0) 0x7f08e8018c70 con 0x7f0900106770 2026-03-10T07:45:58.334 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:58.301+0000 7f08fcff9700 1 -- 192.168.123.105:0/3864958018 <== mgr.14120 v2:192.168.123.105:6800/2 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+0 (secure 0 0 0) 0x7f090002ce60 con 0x7f08ec03c6d0 2026-03-10T07:45:58.334 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:58.304+0000 7f0905f1c700 1 -- 192.168.123.105:0/3864958018 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f08ec03c6d0 msgr2=0x7f08ec03eb90 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:45:58.334 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:58.304+0000 7f0905f1c700 1 --2- 
192.168.123.105:0/3864958018 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f08ec03c6d0 0x7f08ec03eb90 secure :-1 s=READY pgs=9 cs=0 l=1 rev1=1 crypto rx=0x7f08f0006fd0 tx=0x7f08f0006e40 comp rx=0 tx=0).stop 2026-03-10T07:45:58.334 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:58.304+0000 7f0905f1c700 1 -- 192.168.123.105:0/3864958018 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f0900106770 msgr2=0x7f0900198430 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:45:58.334 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:58.304+0000 7f0905f1c700 1 --2- 192.168.123.105:0/3864958018 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f0900106770 0x7f0900198430 secure :-1 s=READY pgs=42 cs=0 l=1 rev1=1 crypto rx=0x7f08e8003770 tx=0x7f08e8003fe0 comp rx=0 tx=0).stop 2026-03-10T07:45:58.334 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:58.304+0000 7f0905f1c700 1 -- 192.168.123.105:0/3864958018 shutdown_connections 2026-03-10T07:45:58.334 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:58.304+0000 7f0905f1c700 1 --2- 192.168.123.105:0/3864958018 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f08ec03c6d0 0x7f08ec03eb90 unknown :-1 s=CLOSED pgs=9 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:45:58.334 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:58.304+0000 7f0905f1c700 1 --2- 192.168.123.105:0/3864958018 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f0900106770 0x7f0900198430 unknown :-1 s=CLOSED pgs=42 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:45:58.334 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:58.304+0000 7f0905f1c700 1 -- 192.168.123.105:0/3864958018 >> 192.168.123.105:0/3864958018 conn(0x7f0900101d30 
msgr2=0x7f0900103890 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:45:58.335 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:58.304+0000 7f0905f1c700 1 -- 192.168.123.105:0/3864958018 shutdown_connections 2026-03-10T07:45:58.335 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:58.304+0000 7f0905f1c700 1 -- 192.168.123.105:0/3864958018 wait complete. 2026-03-10T07:45:58.335 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:45:58 vm05 ceph-mon[50387]: [10/Mar/2026:07:45:56] ENGINE Bus STARTING 2026-03-10T07:45:58.335 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:45:58 vm05 ceph-mon[50387]: [10/Mar/2026:07:45:56] ENGINE Serving on http://192.168.123.105:8765 2026-03-10T07:45:58.335 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:45:58 vm05 ceph-mon[50387]: [10/Mar/2026:07:45:56] ENGINE Serving on https://192.168.123.105:7150 2026-03-10T07:45:58.335 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:45:58 vm05 ceph-mon[50387]: [10/Mar/2026:07:45:56] ENGINE Bus STARTED 2026-03-10T07:45:58.335 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:45:58 vm05 ceph-mon[50387]: mgrmap e7: vm05.blexke(active, since 1.51327s) 2026-03-10T07:45:58.335 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:45:58 vm05 ceph-mon[50387]: from='client.14124 -' entity='client.admin' cmd=[{"prefix": "get_command_descriptions"}]: dispatch 2026-03-10T07:45:58.335 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:45:58 vm05 ceph-mon[50387]: from='client.14124 -' entity='client.admin' cmd=[{"prefix": "mgr_status"}]: dispatch 2026-03-10T07:45:58.335 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:45:58 vm05 ceph-mon[50387]: from='client.14132 -' entity='client.admin' cmd=[{"prefix": "orch set backend", "module_name": "cephadm", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T07:45:58.335 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:45:58 vm05 ceph-mon[50387]: from='mgr.14120 192.168.123.105:0/2' 
entity='mgr.vm05.blexke' 2026-03-10T07:45:58.335 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:45:58 vm05 ceph-mon[50387]: from='mgr.14120 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T07:45:58.335 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:45:58 vm05 ceph-mon[50387]: from='client.14134 -' entity='client.admin' cmd=[{"prefix": "cephadm set-user", "user": "root", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T07:45:58.622 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDyvut0qWxnRXwISmv/mUA3HO6BUSTGqEA6tFNZAuE4tVWUasENK1pKS3o+yVImRTgz9e64jfhEHjSkaEvPh5AXFEDz/yoFvR2/8RFefSEKuXDSX7Mjpg5cXAaO1p6wHxkZfJrJ2mDRxEiiC3a63hkDnJ75eOQn6cQxlTUQYPVd8UCeEMCipS9uDZQunarRqEKYrZ/mNAEx36+9Vj5yS8cfbebzKZi5RG2H6IZDdICEyGMB+XTEUdK1z87vZncs0Vf6ckGiUcacDk4t8RHSxfBzd5yVF7Y/7NPSTJpCX5E6zVW2ldHri06w7XluDFrLVNaueSM7PZwjiDrQciE4u/2yaFv00fsAjEpc58ZvvOdqDT9lcvNRoNNoSjVnvK1AvRmUKaZrBHsCexA6UclTr5nOfasz6NOFSn1iHKZNEHDrOg8UYASszZF8riM6223sU0fBqzT66pptuIP3Fc+iEb2MiJzYFQyYAe6Z9yQ0PMuDkxwhB6g3HN/i4wmTScEFiFU= ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508 2026-03-10T07:45:58.622 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:58.455+0000 7f84e8f09700 1 Processor -- start 2026-03-10T07:45:58.622 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:58.456+0000 7f84e8f09700 1 -- start start 2026-03-10T07:45:58.622 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:58.456+0000 7f84e8f09700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f84e4106760 0x7f84e4106b80 unknown :-1 s=NONE pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:45:58.622 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:58.456+0000 7f84e8f09700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 
-- 0x7f84e4107150 con 0x7f84e4106760 2026-03-10T07:45:58.622 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:58.456+0000 7f84e259c700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f84e4106760 0x7f84e4106b80 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:45:58.622 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:58.456+0000 7f84e259c700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f84e4106760 0x7f84e4106b80 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.105:38316/0 (socket says 192.168.123.105:38316) 2026-03-10T07:45:58.622 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:58.456+0000 7f84e259c700 1 -- 192.168.123.105:0/1556063379 learned_addr learned my addr 192.168.123.105:0/1556063379 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T07:45:58.622 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:58.457+0000 7f84e259c700 1 -- 192.168.123.105:0/1556063379 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f84e4107960 con 0x7f84e4106760 2026-03-10T07:45:58.622 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:58.457+0000 7f84e259c700 1 --2- 192.168.123.105:0/1556063379 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f84e4106760 0x7f84e4106b80 secure :-1 s=READY pgs=43 cs=0 l=1 rev1=1 crypto rx=0x7f84cc009cf0 tx=0x7f84cc00b0e0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=c0a4fc871ced6a1e server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:45:58.622 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:58.457+0000 7f84e159a700 1 -- 
192.168.123.105:0/1556063379 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f84cc004030 con 0x7f84e4106760 2026-03-10T07:45:58.622 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:58.457+0000 7f84e159a700 1 -- 192.168.123.105:0/1556063379 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1075+0+0 (secure 0 0 0) 0x7f84cc00b810 con 0x7f84e4106760 2026-03-10T07:45:58.622 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:58.457+0000 7f84e159a700 1 -- 192.168.123.105:0/1556063379 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f84cc003a90 con 0x7f84e4106760 2026-03-10T07:45:58.622 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:58.457+0000 7f84e8f09700 1 -- 192.168.123.105:0/1556063379 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f84e4106760 msgr2=0x7f84e4106b80 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:45:58.622 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:58.457+0000 7f84e8f09700 1 --2- 192.168.123.105:0/1556063379 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f84e4106760 0x7f84e4106b80 secure :-1 s=READY pgs=43 cs=0 l=1 rev1=1 crypto rx=0x7f84cc009cf0 tx=0x7f84cc00b0e0 comp rx=0 tx=0).stop 2026-03-10T07:45:58.622 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:58.457+0000 7f84e8f09700 1 -- 192.168.123.105:0/1556063379 shutdown_connections 2026-03-10T07:45:58.622 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:58.457+0000 7f84e8f09700 1 --2- 192.168.123.105:0/1556063379 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f84e4106760 0x7f84e4106b80 unknown :-1 s=CLOSED pgs=43 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:45:58.622 
INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:58.457+0000 7f84e8f09700 1 -- 192.168.123.105:0/1556063379 >> 192.168.123.105:0/1556063379 conn(0x7f84e4101ce0 msgr2=0x7f84e4104140 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:45:58.622 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:58.458+0000 7f84e8f09700 1 -- 192.168.123.105:0/1556063379 shutdown_connections 2026-03-10T07:45:58.622 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:58.458+0000 7f84e8f09700 1 -- 192.168.123.105:0/1556063379 wait complete. 2026-03-10T07:45:58.622 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:58.458+0000 7f84e8f09700 1 Processor -- start 2026-03-10T07:45:58.622 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:58.458+0000 7f84e8f09700 1 -- start start 2026-03-10T07:45:58.622 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:58.458+0000 7f84e8f09700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f84e4106760 0x7f84e419c690 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:45:58.622 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:58.458+0000 7f84e8f09700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f84e419cbd0 con 0x7f84e4106760 2026-03-10T07:45:58.622 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:58.459+0000 7f84e259c700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f84e4106760 0x7f84e419c690 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:45:58.622 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:58.459+0000 7f84e259c700 1 --2- >> 
[v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f84e4106760 0x7f84e419c690 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.105:38326/0 (socket says 192.168.123.105:38326) 2026-03-10T07:45:58.622 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:58.459+0000 7f84e259c700 1 -- 192.168.123.105:0/2864692631 learned_addr learned my addr 192.168.123.105:0/2864692631 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T07:45:58.622 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:58.459+0000 7f84e259c700 1 -- 192.168.123.105:0/2864692631 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f84cc009740 con 0x7f84e4106760 2026-03-10T07:45:58.622 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:58.459+0000 7f84e259c700 1 --2- 192.168.123.105:0/2864692631 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f84e4106760 0x7f84e419c690 secure :-1 s=READY pgs=44 cs=0 l=1 rev1=1 crypto rx=0x7f84cc000c00 tx=0x7f84cc011840 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:45:58.622 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:58.459+0000 7f84db7fe700 1 -- 192.168.123.105:0/2864692631 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f84cc011a70 con 0x7f84e4106760 2026-03-10T07:45:58.622 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:58.459+0000 7f84db7fe700 1 -- 192.168.123.105:0/2864692631 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1075+0+0 (secure 0 0 0) 0x7f84cc011bd0 con 0x7f84e4106760 2026-03-10T07:45:58.622 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:58.459+0000 7f84db7fe700 1 -- 
192.168.123.105:0/2864692631 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f84cc01a560 con 0x7f84e4106760 2026-03-10T07:45:58.622 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:58.459+0000 7f84e8f09700 1 -- 192.168.123.105:0/2864692631 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f84e419cdd0 con 0x7f84e4106760 2026-03-10T07:45:58.622 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:58.459+0000 7f84e8f09700 1 -- 192.168.123.105:0/2864692631 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f84e419d1f0 con 0x7f84e4106760 2026-03-10T07:45:58.622 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:58.460+0000 7f84db7fe700 1 -- 192.168.123.105:0/2864692631 <== mon.0 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 8) v1 ==== 45278+0+0 (secure 0 0 0) 0x7f84cc01a6c0 con 0x7f84e4106760 2026-03-10T07:45:58.622 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:58.460+0000 7f84db7fe700 1 --2- 192.168.123.105:0/2864692631 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f84d00383a0 0x7f84d003a860 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:45:58.622 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:58.460+0000 7f84db7fe700 1 -- 192.168.123.105:0/2864692631 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(2..2 src has 1..2) v4 ==== 940+0+0 (secure 0 0 0) 0x7f84cc01e070 con 0x7f84e4106760 2026-03-10T07:45:58.622 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:58.460+0000 7f84e1d9b700 1 --2- 192.168.123.105:0/2864692631 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f84d00383a0 0x7f84d003a860 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp 
rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:45:58.622 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:58.461+0000 7f84e1d9b700 1 --2- 192.168.123.105:0/2864692631 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f84d00383a0 0x7f84d003a860 secure :-1 s=READY pgs=10 cs=0 l=1 rev1=1 crypto rx=0x7f84d4006fd0 tx=0x7f84d4006e40 comp rx=0 tx=0).ready entity=mgr.14120 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:45:58.622 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:58.461+0000 7f84e8f09700 1 -- 192.168.123.105:0/2864692631 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f84e40623c0 con 0x7f84e4106760 2026-03-10T07:45:58.622 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:58.464+0000 7f84db7fe700 1 -- 192.168.123.105:0/2864692631 <== mon.0 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7f84cc02d430 con 0x7f84e4106760 2026-03-10T07:45:58.622 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:58.565+0000 7f84e8f09700 1 -- 192.168.123.105:0/2864692631 --> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] -- mgr_command(tid 0: {"prefix": "cephadm get-pub-key", "target": ["mon-mgr", ""]}) v1 -- 0x7f84e419fcf0 con 0x7f84d00383a0 2026-03-10T07:45:58.622 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:58.566+0000 7f84db7fe700 1 -- 192.168.123.105:0/2864692631 <== mgr.14120 v2:192.168.123.105:6800/2 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+595 (secure 0 0 0) 0x7f84e419fcf0 con 0x7f84d00383a0 2026-03-10T07:45:58.622 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:58.568+0000 7f84e8f09700 1 -- 192.168.123.105:0/2864692631 >> 
[v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f84d00383a0 msgr2=0x7f84d003a860 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:45:58.623 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:58.568+0000 7f84e8f09700 1 --2- 192.168.123.105:0/2864692631 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f84d00383a0 0x7f84d003a860 secure :-1 s=READY pgs=10 cs=0 l=1 rev1=1 crypto rx=0x7f84d4006fd0 tx=0x7f84d4006e40 comp rx=0 tx=0).stop 2026-03-10T07:45:58.623 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:58.569+0000 7f84e8f09700 1 -- 192.168.123.105:0/2864692631 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f84e4106760 msgr2=0x7f84e419c690 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:45:58.623 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:58.569+0000 7f84e8f09700 1 --2- 192.168.123.105:0/2864692631 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f84e4106760 0x7f84e419c690 secure :-1 s=READY pgs=44 cs=0 l=1 rev1=1 crypto rx=0x7f84cc000c00 tx=0x7f84cc011840 comp rx=0 tx=0).stop 2026-03-10T07:45:58.623 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:58.569+0000 7f84e8f09700 1 -- 192.168.123.105:0/2864692631 shutdown_connections 2026-03-10T07:45:58.623 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:58.569+0000 7f84e8f09700 1 --2- 192.168.123.105:0/2864692631 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f84d00383a0 0x7f84d003a860 unknown :-1 s=CLOSED pgs=10 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:45:58.623 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:58.569+0000 7f84e8f09700 1 --2- 192.168.123.105:0/2864692631 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f84e4106760 0x7f84e419c690 unknown :-1 
s=CLOSED pgs=44 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:45:58.623 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:58.569+0000 7f84e8f09700 1 -- 192.168.123.105:0/2864692631 >> 192.168.123.105:0/2864692631 conn(0x7f84e4101ce0 msgr2=0x7f84e41029b0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:45:58.623 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:58.569+0000 7f84e8f09700 1 -- 192.168.123.105:0/2864692631 shutdown_connections 2026-03-10T07:45:58.623 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:58.569+0000 7f84e8f09700 1 -- 192.168.123.105:0/2864692631 wait complete. 2026-03-10T07:45:58.623 INFO:teuthology.orchestra.run.vm05.stdout:Wrote public SSH key to /home/ubuntu/cephtest/ceph.pub 2026-03-10T07:45:58.623 INFO:teuthology.orchestra.run.vm05.stdout:Adding key to root@localhost authorized_keys... 2026-03-10T07:45:58.623 INFO:teuthology.orchestra.run.vm05.stdout:Adding host vm05... 2026-03-10T07:45:59.360 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:45:59 vm05 ceph-mon[50387]: from='client.14136 -' entity='client.admin' cmd=[{"prefix": "cephadm generate-key", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T07:45:59.360 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:45:59 vm05 ceph-mon[50387]: Generating ssh key... 
2026-03-10T07:45:59.360 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:45:59 vm05 ceph-mon[50387]: mgrmap e8: vm05.blexke(active, since 2s) 2026-03-10T07:45:59.360 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:45:59 vm05 ceph-mon[50387]: from='mgr.14120 192.168.123.105:0/2' entity='mgr.vm05.blexke' 2026-03-10T07:45:59.360 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:45:59 vm05 ceph-mon[50387]: from='mgr.14120 192.168.123.105:0/2' entity='mgr.vm05.blexke' 2026-03-10T07:45:59.360 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:45:59 vm05 ceph-mon[50387]: from='client.14138 -' entity='client.admin' cmd=[{"prefix": "cephadm get-pub-key", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T07:45:59.360 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:45:59 vm05 ceph-mon[50387]: from='client.14140 -' entity='client.admin' cmd=[{"prefix": "orch host add", "hostname": "vm05", "addr": "192.168.123.105", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T07:46:00.408 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:46:00 vm05 ceph-mon[50387]: Deploying cephadm binary to vm05 2026-03-10T07:46:00.636 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout Added host 'vm05' with addr '192.168.123.105' 2026-03-10T07:46:00.636 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:58.741+0000 7f16a99c7700 1 Processor -- start 2026-03-10T07:46:00.636 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:58.741+0000 7f16a99c7700 1 -- start start 2026-03-10T07:46:00.636 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:58.742+0000 7f16a99c7700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f16a4104fb0 0x7f16a41073e0 unknown :-1 s=NONE pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:46:00.636 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:58.742+0000 7f16a99c7700 1 -- --> 
[v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f16a4074720 con 0x7f16a4104fb0 2026-03-10T07:46:00.636 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:58.742+0000 7f16a2ffd700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f16a4104fb0 0x7f16a41073e0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:46:00.636 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:58.742+0000 7f16a2ffd700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f16a4104fb0 0x7f16a41073e0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.105:38328/0 (socket says 192.168.123.105:38328) 2026-03-10T07:46:00.636 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:58.742+0000 7f16a2ffd700 1 -- 192.168.123.105:0/2343499805 learned_addr learned my addr 192.168.123.105:0/2343499805 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T07:46:00.636 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:58.742+0000 7f16a2ffd700 1 -- 192.168.123.105:0/2343499805 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f16a4107920 con 0x7f16a4104fb0 2026-03-10T07:46:00.636 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:58.743+0000 7f16a2ffd700 1 --2- 192.168.123.105:0/2343499805 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f16a4104fb0 0x7f16a41073e0 secure :-1 s=READY pgs=45 cs=0 l=1 rev1=1 crypto rx=0x7f168c009a90 tx=0x7f168c009da0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=664f16a30b73a54e server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:46:00.636 
INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:58.743+0000 7f16a1ffb700 1 -- 192.168.123.105:0/2343499805 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f168c004030 con 0x7f16a4104fb0 2026-03-10T07:46:00.636 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:58.743+0000 7f16a1ffb700 1 -- 192.168.123.105:0/2343499805 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1075+0+0 (secure 0 0 0) 0x7f168c00b7e0 con 0x7f16a4104fb0 2026-03-10T07:46:00.636 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:58.743+0000 7f16a99c7700 1 -- 192.168.123.105:0/2343499805 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f16a4104fb0 msgr2=0x7f16a41073e0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:46:00.636 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:58.743+0000 7f16a99c7700 1 --2- 192.168.123.105:0/2343499805 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f16a4104fb0 0x7f16a41073e0 secure :-1 s=READY pgs=45 cs=0 l=1 rev1=1 crypto rx=0x7f168c009a90 tx=0x7f168c009da0 comp rx=0 tx=0).stop 2026-03-10T07:46:00.636 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:58.743+0000 7f16a99c7700 1 -- 192.168.123.105:0/2343499805 shutdown_connections 2026-03-10T07:46:00.636 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:58.743+0000 7f16a99c7700 1 --2- 192.168.123.105:0/2343499805 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f16a4104fb0 0x7f16a41073e0 unknown :-1 s=CLOSED pgs=45 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:46:00.636 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:58.743+0000 7f16a99c7700 1 -- 192.168.123.105:0/2343499805 >> 192.168.123.105:0/2343499805 conn(0x7f16a4100bd0 msgr2=0x7f16a4103030 
unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:46:00.636 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:58.744+0000 7f16a99c7700 1 -- 192.168.123.105:0/2343499805 shutdown_connections 2026-03-10T07:46:00.636 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:58.744+0000 7f16a99c7700 1 -- 192.168.123.105:0/2343499805 wait complete. 2026-03-10T07:46:00.636 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:58.744+0000 7f16a99c7700 1 Processor -- start 2026-03-10T07:46:00.636 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:58.744+0000 7f16a99c7700 1 -- start start 2026-03-10T07:46:00.636 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:58.744+0000 7f16a99c7700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f16a41a0ba0 0x7f16a41a0fc0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:46:00.636 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:58.744+0000 7f16a99c7700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f16a4074720 con 0x7f16a41a0ba0 2026-03-10T07:46:00.636 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:58.745+0000 7f16a2ffd700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f16a41a0ba0 0x7f16a41a0fc0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:46:00.636 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:58.745+0000 7f16a2ffd700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f16a41a0ba0 0x7f16a41a0fc0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says 
I am v2:192.168.123.105:38332/0 (socket says 192.168.123.105:38332) 2026-03-10T07:46:00.636 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:58.745+0000 7f16a2ffd700 1 -- 192.168.123.105:0/2698177082 learned_addr learned my addr 192.168.123.105:0/2698177082 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T07:46:00.636 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:58.745+0000 7f16a2ffd700 1 -- 192.168.123.105:0/2698177082 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f168c009740 con 0x7f16a41a0ba0 2026-03-10T07:46:00.636 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:58.745+0000 7f16a2ffd700 1 --2- 192.168.123.105:0/2698177082 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f16a41a0ba0 0x7f16a41a0fc0 secure :-1 s=READY pgs=46 cs=0 l=1 rev1=1 crypto rx=0x7f168c009130 tx=0x7f168c00be80 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:46:00.636 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:58.745+0000 7f16a89c5700 1 -- 192.168.123.105:0/2698177082 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f168c01a670 con 0x7f16a41a0ba0 2026-03-10T07:46:00.636 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:58.745+0000 7f16a99c7700 1 -- 192.168.123.105:0/2698177082 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f16a41a1500 con 0x7f16a41a0ba0 2026-03-10T07:46:00.636 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:58.746+0000 7f16a99c7700 1 -- 192.168.123.105:0/2698177082 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f16a41a4180 con 0x7f16a41a0ba0 2026-03-10T07:46:00.636 
INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:58.746+0000 7f16a89c5700 1 -- 192.168.123.105:0/2698177082 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1075+0+0 (secure 0 0 0) 0x7f168c01ac70 con 0x7f16a41a0ba0 2026-03-10T07:46:00.636 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:58.746+0000 7f16a89c5700 1 -- 192.168.123.105:0/2698177082 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f168c0044b0 con 0x7f16a41a0ba0 2026-03-10T07:46:00.636 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:58.746+0000 7f16a89c5700 1 -- 192.168.123.105:0/2698177082 <== mon.0 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 8) v1 ==== 45278+0+0 (secure 0 0 0) 0x7f168c02c430 con 0x7f16a41a0ba0 2026-03-10T07:46:00.636 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:58.746+0000 7f16a89c5700 1 --2- 192.168.123.105:0/2698177082 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f16900383e0 0x7f169003a8a0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:46:00.637 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:58.746+0000 7f16a89c5700 1 -- 192.168.123.105:0/2698177082 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(2..2 src has 1..2) v4 ==== 940+0+0 (secure 0 0 0) 0x7f168c04c710 con 0x7f16a41a0ba0 2026-03-10T07:46:00.637 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:58.747+0000 7f16a27fc700 1 --2- 192.168.123.105:0/2698177082 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f16900383e0 0x7f169003a8a0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:46:00.637 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:58.747+0000 
7f16a27fc700 1 --2- 192.168.123.105:0/2698177082 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f16900383e0 0x7f169003a8a0 secure :-1 s=READY pgs=11 cs=0 l=1 rev1=1 crypto rx=0x7f1694006fd0 tx=0x7f1694006e40 comp rx=0 tx=0).ready entity=mgr.14120 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:46:00.637 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:58.747+0000 7f16a99c7700 1 -- 192.168.123.105:0/2698177082 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f1684005320 con 0x7f16a41a0ba0 2026-03-10T07:46:00.637 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:58.750+0000 7f16a89c5700 1 -- 192.168.123.105:0/2698177082 <== mon.0 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7f168c004610 con 0x7f16a41a0ba0 2026-03-10T07:46:00.637 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:45:58.858+0000 7f16a99c7700 1 -- 192.168.123.105:0/2698177082 --> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] -- mgr_command(tid 0: {"prefix": "orch host add", "hostname": "vm05", "addr": "192.168.123.105", "target": ["mon-mgr", ""]}) v1 -- 0x7f1684000bf0 con 0x7f16900383e0 2026-03-10T07:46:00.637 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:00.598+0000 7f16a89c5700 1 -- 192.168.123.105:0/2698177082 <== mgr.14120 v2:192.168.123.105:6800/2 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+46 (secure 0 0 0) 0x7f1684000bf0 con 0x7f16900383e0 2026-03-10T07:46:00.637 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:00.601+0000 7f16a99c7700 1 -- 192.168.123.105:0/2698177082 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f16900383e0 msgr2=0x7f169003a8a0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 
2026-03-10T07:46:00.637 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:00.601+0000 7f16a99c7700 1 --2- 192.168.123.105:0/2698177082 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f16900383e0 0x7f169003a8a0 secure :-1 s=READY pgs=11 cs=0 l=1 rev1=1 crypto rx=0x7f1694006fd0 tx=0x7f1694006e40 comp rx=0 tx=0).stop 2026-03-10T07:46:00.637 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:00.601+0000 7f16a99c7700 1 -- 192.168.123.105:0/2698177082 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f16a41a0ba0 msgr2=0x7f16a41a0fc0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:46:00.637 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:00.601+0000 7f16a99c7700 1 --2- 192.168.123.105:0/2698177082 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f16a41a0ba0 0x7f16a41a0fc0 secure :-1 s=READY pgs=46 cs=0 l=1 rev1=1 crypto rx=0x7f168c009130 tx=0x7f168c00be80 comp rx=0 tx=0).stop 2026-03-10T07:46:00.637 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:00.601+0000 7f16a99c7700 1 -- 192.168.123.105:0/2698177082 shutdown_connections 2026-03-10T07:46:00.637 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:00.601+0000 7f16a99c7700 1 --2- 192.168.123.105:0/2698177082 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f16900383e0 0x7f169003a8a0 unknown :-1 s=CLOSED pgs=11 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:46:00.637 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:00.601+0000 7f16a99c7700 1 --2- 192.168.123.105:0/2698177082 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f16a41a0ba0 0x7f16a41a0fc0 unknown :-1 s=CLOSED pgs=46 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:46:00.637 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 
2026-03-10T07:46:00.601+0000 7f16a99c7700 1 -- 192.168.123.105:0/2698177082 >> 192.168.123.105:0/2698177082 conn(0x7f16a4100bd0 msgr2=0x7f16a41071d0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:46:00.637 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:00.601+0000 7f16a99c7700 1 -- 192.168.123.105:0/2698177082 shutdown_connections 2026-03-10T07:46:00.637 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:00.601+0000 7f16a99c7700 1 -- 192.168.123.105:0/2698177082 wait complete. 2026-03-10T07:46:00.637 INFO:teuthology.orchestra.run.vm05.stdout:Deploying mon service with default placement... 2026-03-10T07:46:00.971 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout Scheduled mon update... 2026-03-10T07:46:00.971 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:00.785+0000 7f40d5192700 1 Processor -- start 2026-03-10T07:46:00.971 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:00.785+0000 7f40d5192700 1 -- start start 2026-03-10T07:46:00.972 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:00.785+0000 7f40d5192700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f40d0071e00 0x7f40d0072220 unknown :-1 s=NONE pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:46:00.972 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:00.785+0000 7f40d5192700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f40d00727f0 con 0x7f40d0071e00 2026-03-10T07:46:00.972 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:00.785+0000 7f40cffff700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f40d0071e00 0x7f40d0072220 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 
required=0 2026-03-10T07:46:00.972 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:00.785+0000 7f40cffff700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f40d0071e00 0x7f40d0072220 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.105:57548/0 (socket says 192.168.123.105:57548) 2026-03-10T07:46:00.972 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:00.785+0000 7f40cffff700 1 -- 192.168.123.105:0/1514957707 learned_addr learned my addr 192.168.123.105:0/1514957707 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T07:46:00.973 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:00.786+0000 7f40cffff700 1 -- 192.168.123.105:0/1514957707 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f40d010ddb0 con 0x7f40d0071e00 2026-03-10T07:46:00.973 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:00.786+0000 7f40cffff700 1 --2- 192.168.123.105:0/1514957707 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f40d0071e00 0x7f40d0072220 secure :-1 s=READY pgs=47 cs=0 l=1 rev1=1 crypto rx=0x7f40c0009cf0 tx=0x7f40c000b0e0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=647abca1da02da7e server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:46:00.973 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:00.786+0000 7f40ceffd700 1 -- 192.168.123.105:0/1514957707 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f40c0004030 con 0x7f40d0071e00 2026-03-10T07:46:00.973 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:00.786+0000 7f40ceffd700 1 -- 192.168.123.105:0/1514957707 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1075+0+0 (secure 0 0 0) 
0x7f40c000b810 con 0x7f40d0071e00 2026-03-10T07:46:00.973 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:00.786+0000 7f40ceffd700 1 -- 192.168.123.105:0/1514957707 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f40c0003a90 con 0x7f40d0071e00 2026-03-10T07:46:00.973 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:00.787+0000 7f40d5192700 1 -- 192.168.123.105:0/1514957707 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f40d0071e00 msgr2=0x7f40d0072220 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:46:00.973 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:00.787+0000 7f40d5192700 1 --2- 192.168.123.105:0/1514957707 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f40d0071e00 0x7f40d0072220 secure :-1 s=READY pgs=47 cs=0 l=1 rev1=1 crypto rx=0x7f40c0009cf0 tx=0x7f40c000b0e0 comp rx=0 tx=0).stop 2026-03-10T07:46:00.973 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:00.789+0000 7f40d5192700 1 -- 192.168.123.105:0/1514957707 shutdown_connections 2026-03-10T07:46:00.973 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:00.789+0000 7f40d5192700 1 --2- 192.168.123.105:0/1514957707 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f40d0071e00 0x7f40d0072220 unknown :-1 s=CLOSED pgs=47 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:46:00.973 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:00.789+0000 7f40d5192700 1 -- 192.168.123.105:0/1514957707 >> 192.168.123.105:0/1514957707 conn(0x7f40d006d320 msgr2=0x7f40d006f760 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:46:00.973 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:00.789+0000 7f40d5192700 1 -- 192.168.123.105:0/1514957707 shutdown_connections 
2026-03-10T07:46:00.973 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:00.789+0000 7f40d5192700 1 -- 192.168.123.105:0/1514957707 wait complete. 2026-03-10T07:46:00.973 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:00.789+0000 7f40d5192700 1 Processor -- start 2026-03-10T07:46:00.973 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:00.789+0000 7f40d5192700 1 -- start start 2026-03-10T07:46:00.973 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:00.789+0000 7f40d5192700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f40d0071e00 0x7f40d01a9030 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:46:00.973 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:00.789+0000 7f40d5192700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f40d01a9570 con 0x7f40d0071e00 2026-03-10T07:46:00.973 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:00.790+0000 7f40cffff700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f40d0071e00 0x7f40d01a9030 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:46:00.973 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:00.790+0000 7f40cffff700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f40d0071e00 0x7f40d01a9030 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.105:57554/0 (socket says 192.168.123.105:57554) 2026-03-10T07:46:00.973 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:00.790+0000 7f40cffff700 1 -- 
192.168.123.105:0/1745697381 learned_addr learned my addr 192.168.123.105:0/1745697381 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T07:46:00.973 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:00.790+0000 7f40cffff700 1 -- 192.168.123.105:0/1745697381 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f40c0009740 con 0x7f40d0071e00 2026-03-10T07:46:00.973 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:00.790+0000 7f40cffff700 1 --2- 192.168.123.105:0/1745697381 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f40d0071e00 0x7f40d01a9030 secure :-1 s=READY pgs=48 cs=0 l=1 rev1=1 crypto rx=0x7f40c0000c00 tx=0x7f40c0011700 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:46:00.973 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:00.790+0000 7f40cd7fa700 1 -- 192.168.123.105:0/1745697381 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f40c0011a70 con 0x7f40d0071e00 2026-03-10T07:46:00.973 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:00.790+0000 7f40cd7fa700 1 -- 192.168.123.105:0/1745697381 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1075+0+0 (secure 0 0 0) 0x7f40c0011bd0 con 0x7f40d0071e00 2026-03-10T07:46:00.973 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:00.791+0000 7f40cd7fa700 1 -- 192.168.123.105:0/1745697381 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f40c001a560 con 0x7f40d0071e00 2026-03-10T07:46:00.973 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:00.791+0000 7f40d5192700 1 -- 192.168.123.105:0/1745697381 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f40d01a9770 
con 0x7f40d0071e00 2026-03-10T07:46:00.973 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:00.791+0000 7f40d5192700 1 -- 192.168.123.105:0/1745697381 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f40d01a9b90 con 0x7f40d0071e00 2026-03-10T07:46:00.973 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:00.792+0000 7f40cd7fa700 1 -- 192.168.123.105:0/1745697381 <== mon.0 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 8) v1 ==== 45278+0+0 (secure 0 0 0) 0x7f40c0011d40 con 0x7f40d0071e00 2026-03-10T07:46:00.973 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:00.792+0000 7f40d5192700 1 -- 192.168.123.105:0/1745697381 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f40d00623c0 con 0x7f40d0071e00 2026-03-10T07:46:00.973 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:00.795+0000 7f40cd7fa700 1 --2- 192.168.123.105:0/1745697381 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f40b8038070 0x7f40b803a530 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:46:00.973 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:00.795+0000 7f40cd7fa700 1 -- 192.168.123.105:0/1745697381 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(2..2 src has 1..2) v4 ==== 940+0+0 (secure 0 0 0) 0x7f40c0028030 con 0x7f40d0071e00 2026-03-10T07:46:00.973 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:00.795+0000 7f40cd7fa700 1 -- 192.168.123.105:0/1745697381 <== mon.0 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7f40c0018960 con 0x7f40d0071e00 2026-03-10T07:46:00.973 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 
2026-03-10T07:46:00.795+0000 7f40cf7fe700 1 --2- 192.168.123.105:0/1745697381 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f40b8038070 0x7f40b803a530 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:46:00.973 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:00.795+0000 7f40cf7fe700 1 --2- 192.168.123.105:0/1745697381 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f40b8038070 0x7f40b803a530 secure :-1 s=READY pgs=12 cs=0 l=1 rev1=1 crypto rx=0x7f40c4006fd0 tx=0x7f40c4006e40 comp rx=0 tx=0).ready entity=mgr.14120 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:46:00.973 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:00.904+0000 7f40d5192700 1 -- 192.168.123.105:0/1745697381 --> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] -- mgr_command(tid 0: {"prefix": "orch apply", "service_type": "mon", "target": ["mon-mgr", ""]}) v1 -- 0x7f40d006e5c0 con 0x7f40b8038070 2026-03-10T07:46:00.973 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:00.911+0000 7f40cd7fa700 1 -- 192.168.123.105:0/1745697381 <== mgr.14120 v2:192.168.123.105:6800/2 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+24 (secure 0 0 0) 0x7f40d006e5c0 con 0x7f40b8038070 2026-03-10T07:46:00.973 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:00.918+0000 7f40d5192700 1 -- 192.168.123.105:0/1745697381 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f40b8038070 msgr2=0x7f40b803a530 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:46:00.973 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:00.918+0000 7f40d5192700 1 --2- 192.168.123.105:0/1745697381 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f40b8038070 0x7f40b803a530 secure :-1 
s=READY pgs=12 cs=0 l=1 rev1=1 crypto rx=0x7f40c4006fd0 tx=0x7f40c4006e40 comp rx=0 tx=0).stop 2026-03-10T07:46:00.973 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:00.918+0000 7f40d5192700 1 -- 192.168.123.105:0/1745697381 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f40d0071e00 msgr2=0x7f40d01a9030 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:46:00.973 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:00.918+0000 7f40d5192700 1 --2- 192.168.123.105:0/1745697381 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f40d0071e00 0x7f40d01a9030 secure :-1 s=READY pgs=48 cs=0 l=1 rev1=1 crypto rx=0x7f40c0000c00 tx=0x7f40c0011700 comp rx=0 tx=0).stop 2026-03-10T07:46:00.973 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:00.918+0000 7f40d5192700 1 -- 192.168.123.105:0/1745697381 shutdown_connections 2026-03-10T07:46:00.973 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:00.918+0000 7f40d5192700 1 --2- 192.168.123.105:0/1745697381 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f40b8038070 0x7f40b803a530 unknown :-1 s=CLOSED pgs=12 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:46:00.973 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:00.918+0000 7f40d5192700 1 --2- 192.168.123.105:0/1745697381 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f40d0071e00 0x7f40d01a9030 unknown :-1 s=CLOSED pgs=48 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:46:00.973 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:00.918+0000 7f40d5192700 1 -- 192.168.123.105:0/1745697381 >> 192.168.123.105:0/1745697381 conn(0x7f40d006d320 msgr2=0x7f40d006deb0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:46:00.973 
INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:00.918+0000 7f40d5192700 1 -- 192.168.123.105:0/1745697381 shutdown_connections 2026-03-10T07:46:00.973 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:00.918+0000 7f40d5192700 1 -- 192.168.123.105:0/1745697381 wait complete. 2026-03-10T07:46:00.973 INFO:teuthology.orchestra.run.vm05.stdout:Deploying mgr service with default placement... 2026-03-10T07:46:01.329 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout Scheduled mgr update... 2026-03-10T07:46:01.329 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:01.133+0000 7f3b125e0700 1 Processor -- start 2026-03-10T07:46:01.330 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:01.134+0000 7f3b125e0700 1 -- start start 2026-03-10T07:46:01.330 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:01.134+0000 7f3b125e0700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3b0c07adc0 0x7f3b0c079220 unknown :-1 s=NONE pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:46:01.330 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:01.134+0000 7f3b125e0700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f3b0c0797f0 con 0x7f3b0c07adc0 2026-03-10T07:46:01.330 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:01.134+0000 7f3b0bfff700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3b0c07adc0 0x7f3b0c079220 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:46:01.330 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:01.134+0000 7f3b0bfff700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] 
conn(0x7f3b0c07adc0 0x7f3b0c079220 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.105:57556/0 (socket says 192.168.123.105:57556) 2026-03-10T07:46:01.330 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:01.135+0000 7f3b0bfff700 1 -- 192.168.123.105:0/1127035873 learned_addr learned my addr 192.168.123.105:0/1127035873 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T07:46:01.330 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:01.135+0000 7f3b0bfff700 1 -- 192.168.123.105:0/1127035873 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f3b0c079930 con 0x7f3b0c07adc0 2026-03-10T07:46:01.330 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:01.135+0000 7f3b0bfff700 1 --2- 192.168.123.105:0/1127035873 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3b0c07adc0 0x7f3b0c079220 secure :-1 s=READY pgs=49 cs=0 l=1 rev1=1 crypto rx=0x7f3af4009a90 tx=0x7f3af4009da0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=c58e651bf8e9e93a server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:46:01.330 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:01.135+0000 7f3b0affd700 1 -- 192.168.123.105:0/1127035873 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f3af4004030 con 0x7f3b0c07adc0 2026-03-10T07:46:01.330 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:01.136+0000 7f3b0affd700 1 -- 192.168.123.105:0/1127035873 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1075+0+0 (secure 0 0 0) 0x7f3af400b7e0 con 0x7f3b0c07adc0 2026-03-10T07:46:01.330 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:01.136+0000 7f3b125e0700 1 -- 192.168.123.105:0/1127035873 >> 
[v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3b0c07adc0 msgr2=0x7f3b0c079220 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:46:01.330 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:01.136+0000 7f3b125e0700 1 --2- 192.168.123.105:0/1127035873 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3b0c07adc0 0x7f3b0c079220 secure :-1 s=READY pgs=49 cs=0 l=1 rev1=1 crypto rx=0x7f3af4009a90 tx=0x7f3af4009da0 comp rx=0 tx=0).stop 2026-03-10T07:46:01.330 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:01.136+0000 7f3b125e0700 1 -- 192.168.123.105:0/1127035873 shutdown_connections 2026-03-10T07:46:01.330 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:01.136+0000 7f3b125e0700 1 --2- 192.168.123.105:0/1127035873 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3b0c07adc0 0x7f3b0c079220 unknown :-1 s=CLOSED pgs=49 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:46:01.330 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:01.136+0000 7f3b125e0700 1 -- 192.168.123.105:0/1127035873 >> 192.168.123.105:0/1127035873 conn(0x7f3b0c101ce0 msgr2=0x7f3b0c104140 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:46:01.330 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:01.136+0000 7f3b125e0700 1 -- 192.168.123.105:0/1127035873 shutdown_connections 2026-03-10T07:46:01.330 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:01.136+0000 7f3b125e0700 1 -- 192.168.123.105:0/1127035873 wait complete. 
2026-03-10T07:46:01.330 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:01.137+0000 7f3b125e0700 1 Processor -- start 2026-03-10T07:46:01.330 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:01.137+0000 7f3b125e0700 1 -- start start 2026-03-10T07:46:01.330 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:01.137+0000 7f3b125e0700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3b0c19c790 0x7f3b0c19cbb0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:46:01.330 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:01.137+0000 7f3b125e0700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f3b0c0797f0 con 0x7f3b0c19c790 2026-03-10T07:46:01.330 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:01.137+0000 7f3b0bfff700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3b0c19c790 0x7f3b0c19cbb0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:46:01.330 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:01.137+0000 7f3b0bfff700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3b0c19c790 0x7f3b0c19cbb0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.105:57568/0 (socket says 192.168.123.105:57568) 2026-03-10T07:46:01.330 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:01.137+0000 7f3b0bfff700 1 -- 192.168.123.105:0/2979574150 learned_addr learned my addr 192.168.123.105:0/2979574150 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T07:46:01.330 
INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:01.138+0000 7f3b0bfff700 1 -- 192.168.123.105:0/2979574150 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f3af4009740 con 0x7f3b0c19c790 2026-03-10T07:46:01.330 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:01.138+0000 7f3b0bfff700 1 --2- 192.168.123.105:0/2979574150 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3b0c19c790 0x7f3b0c19cbb0 secure :-1 s=READY pgs=50 cs=0 l=1 rev1=1 crypto rx=0x7f3af400bd00 tx=0x7f3af400bde0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:46:01.330 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:01.138+0000 7f3b097fa700 1 -- 192.168.123.105:0/2979574150 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f3af4003ec0 con 0x7f3b0c19c790 2026-03-10T07:46:01.330 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:01.138+0000 7f3b097fa700 1 -- 192.168.123.105:0/2979574150 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1075+0+0 (secure 0 0 0) 0x7f3af40044c0 con 0x7f3b0c19c790 2026-03-10T07:46:01.330 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:01.138+0000 7f3b097fa700 1 -- 192.168.123.105:0/2979574150 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f3af401ac80 con 0x7f3b0c19c790 2026-03-10T07:46:01.330 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:01.138+0000 7f3b125e0700 1 -- 192.168.123.105:0/2979574150 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f3b0c19d0f0 con 0x7f3b0c19c790 2026-03-10T07:46:01.330 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:01.138+0000 7f3b125e0700 1 
-- 192.168.123.105:0/2979574150 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f3b0c19fd70 con 0x7f3b0c19c790 2026-03-10T07:46:01.330 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:01.139+0000 7f3b097fa700 1 -- 192.168.123.105:0/2979574150 <== mon.0 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 8) v1 ==== 45278+0+0 (secure 0 0 0) 0x7f3af4011420 con 0x7f3b0c19c790 2026-03-10T07:46:01.330 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:01.139+0000 7f3b097fa700 1 --2- 192.168.123.105:0/2979574150 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f3af80383e0 0x7f3af803a8a0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:46:01.330 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:01.139+0000 7f3b0b7fe700 1 --2- 192.168.123.105:0/2979574150 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f3af80383e0 0x7f3af803a8a0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:46:01.330 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:01.139+0000 7f3b097fa700 1 -- 192.168.123.105:0/2979574150 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(2..2 src has 1..2) v4 ==== 940+0+0 (secure 0 0 0) 0x7f3af404c7b0 con 0x7f3b0c19c790 2026-03-10T07:46:01.330 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:01.140+0000 7f3b0b7fe700 1 --2- 192.168.123.105:0/2979574150 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f3af80383e0 0x7f3af803a8a0 secure :-1 s=READY pgs=13 cs=0 l=1 rev1=1 crypto rx=0x7f3afc006fd0 tx=0x7f3afc006e40 comp rx=0 tx=0).ready entity=mgr.14120 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:46:01.330 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: 
stderr 2026-03-10T07:46:01.140+0000 7f3b125e0700 1 -- 192.168.123.105:0/2979574150 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f3b0c04fa20 con 0x7f3b0c19c790 2026-03-10T07:46:01.330 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:01.143+0000 7f3b097fa700 1 -- 192.168.123.105:0/2979574150 <== mon.0 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7f3af401ade0 con 0x7f3b0c19c790 2026-03-10T07:46:01.330 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:01.147+0000 7f3b097fa700 1 -- 192.168.123.105:0/2979574150 <== mon.0 v2:192.168.123.105:3300/0 7 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f3af401a460 con 0x7f3b0c19c790 2026-03-10T07:46:01.330 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:01.270+0000 7f3b125e0700 1 -- 192.168.123.105:0/2979574150 --> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] -- mgr_command(tid 0: {"prefix": "orch apply", "service_type": "mgr", "target": ["mon-mgr", ""]}) v1 -- 0x7f3b0c102fe0 con 0x7f3af80383e0 2026-03-10T07:46:01.330 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:01.274+0000 7f3b097fa700 1 -- 192.168.123.105:0/2979574150 <== mgr.14120 v2:192.168.123.105:6800/2 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+24 (secure 0 0 0) 0x7f3b0c102fe0 con 0x7f3af80383e0 2026-03-10T07:46:01.330 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:01.276+0000 7f3b125e0700 1 -- 192.168.123.105:0/2979574150 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f3af80383e0 msgr2=0x7f3af803a8a0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:46:01.330 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:01.276+0000 7f3b125e0700 1 --2- 
192.168.123.105:0/2979574150 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f3af80383e0 0x7f3af803a8a0 secure :-1 s=READY pgs=13 cs=0 l=1 rev1=1 crypto rx=0x7f3afc006fd0 tx=0x7f3afc006e40 comp rx=0 tx=0).stop 2026-03-10T07:46:01.330 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:01.276+0000 7f3b125e0700 1 -- 192.168.123.105:0/2979574150 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3b0c19c790 msgr2=0x7f3b0c19cbb0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:46:01.330 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:01.276+0000 7f3b125e0700 1 --2- 192.168.123.105:0/2979574150 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3b0c19c790 0x7f3b0c19cbb0 secure :-1 s=READY pgs=50 cs=0 l=1 rev1=1 crypto rx=0x7f3af400bd00 tx=0x7f3af400bde0 comp rx=0 tx=0).stop 2026-03-10T07:46:01.330 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:01.276+0000 7f3b125e0700 1 -- 192.168.123.105:0/2979574150 shutdown_connections 2026-03-10T07:46:01.330 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:01.276+0000 7f3b125e0700 1 --2- 192.168.123.105:0/2979574150 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f3af80383e0 0x7f3af803a8a0 unknown :-1 s=CLOSED pgs=13 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:46:01.330 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:01.276+0000 7f3b125e0700 1 --2- 192.168.123.105:0/2979574150 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3b0c19c790 0x7f3b0c19cbb0 secure :-1 s=CLOSED pgs=50 cs=0 l=1 rev1=1 crypto rx=0x7f3af400bd00 tx=0x7f3af400bde0 comp rx=0 tx=0).stop 2026-03-10T07:46:01.330 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:01.276+0000 7f3b125e0700 1 -- 192.168.123.105:0/2979574150 >> 192.168.123.105:0/2979574150 
conn(0x7f3b0c101ce0 msgr2=0x7f3b0c1028d0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:46:01.330 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:01.277+0000 7f3b125e0700 1 -- 192.168.123.105:0/2979574150 shutdown_connections 2026-03-10T07:46:01.330 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:01.277+0000 7f3b125e0700 1 -- 192.168.123.105:0/2979574150 wait complete. 2026-03-10T07:46:01.330 INFO:teuthology.orchestra.run.vm05.stdout:Deploying crash service with default placement... 2026-03-10T07:46:01.645 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout Scheduled crash update... 2026-03-10T07:46:01.645 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:01.461+0000 7f9a2411b700 1 Processor -- start 2026-03-10T07:46:01.645 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:01.461+0000 7f9a2411b700 1 -- start start 2026-03-10T07:46:01.645 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:01.462+0000 7f9a2411b700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f9a1c108970 0x7f9a1c108d90 unknown :-1 s=NONE pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:46:01.645 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:01.462+0000 7f9a2411b700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f9a1c109360 con 0x7f9a1c108970 2026-03-10T07:46:01.645 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:01.462+0000 7f9a21eb7700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f9a1c108970 0x7f9a1c108d90 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:46:01.645 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 
2026-03-10T07:46:01.462+0000 7f9a21eb7700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f9a1c108970 0x7f9a1c108d90 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.105:57582/0 (socket says 192.168.123.105:57582) 2026-03-10T07:46:01.645 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:01.462+0000 7f9a21eb7700 1 -- 192.168.123.105:0/2454561413 learned_addr learned my addr 192.168.123.105:0/2454561413 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T07:46:01.645 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:01.462+0000 7f9a21eb7700 1 -- 192.168.123.105:0/2454561413 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f9a1c109b70 con 0x7f9a1c108970 2026-03-10T07:46:01.645 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:01.462+0000 7f9a21eb7700 1 --2- 192.168.123.105:0/2454561413 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f9a1c108970 0x7f9a1c108d90 secure :-1 s=READY pgs=51 cs=0 l=1 rev1=1 crypto rx=0x7f9a0c009a90 tx=0x7f9a0c009da0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=a0a25be383f6ac39 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:46:01.645 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:01.463+0000 7f9a20eb5700 1 -- 192.168.123.105:0/2454561413 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f9a0c004030 con 0x7f9a1c108970 2026-03-10T07:46:01.645 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:01.463+0000 7f9a20eb5700 1 -- 192.168.123.105:0/2454561413 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f9a0c00b7e0 con 0x7f9a1c108970 2026-03-10T07:46:01.645 
INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:01.463+0000 7f9a20eb5700 1 -- 192.168.123.105:0/2454561413 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f9a0c003ae0 con 0x7f9a1c108970 2026-03-10T07:46:01.645 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:01.463+0000 7f9a2411b700 1 -- 192.168.123.105:0/2454561413 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f9a1c108970 msgr2=0x7f9a1c108d90 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:46:01.645 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:01.463+0000 7f9a2411b700 1 --2- 192.168.123.105:0/2454561413 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f9a1c108970 0x7f9a1c108d90 secure :-1 s=READY pgs=51 cs=0 l=1 rev1=1 crypto rx=0x7f9a0c009a90 tx=0x7f9a0c009da0 comp rx=0 tx=0).stop 2026-03-10T07:46:01.645 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:01.464+0000 7f9a2411b700 1 -- 192.168.123.105:0/2454561413 shutdown_connections 2026-03-10T07:46:01.645 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:01.464+0000 7f9a2411b700 1 --2- 192.168.123.105:0/2454561413 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f9a1c108970 0x7f9a1c108d90 unknown :-1 s=CLOSED pgs=51 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:46:01.645 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:01.464+0000 7f9a2411b700 1 -- 192.168.123.105:0/2454561413 >> 192.168.123.105:0/2454561413 conn(0x7f9a1c07be30 msgr2=0x7f9a1c1064e0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:46:01.645 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:01.464+0000 7f9a2411b700 1 -- 192.168.123.105:0/2454561413 shutdown_connections 2026-03-10T07:46:01.645 
INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:01.464+0000 7f9a2411b700 1 -- 192.168.123.105:0/2454561413 wait complete. 2026-03-10T07:46:01.645 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:01.464+0000 7f9a2411b700 1 Processor -- start 2026-03-10T07:46:01.645 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:01.464+0000 7f9a2411b700 1 -- start start 2026-03-10T07:46:01.645 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:01.465+0000 7f9a2411b700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f9a1c108970 0x7f9a1c19c680 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:46:01.645 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:01.465+0000 7f9a2411b700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f9a1c19cbc0 con 0x7f9a1c108970 2026-03-10T07:46:01.645 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:01.465+0000 7f9a21eb7700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f9a1c108970 0x7f9a1c19c680 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:46:01.645 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:01.465+0000 7f9a21eb7700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f9a1c108970 0x7f9a1c19c680 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.105:57588/0 (socket says 192.168.123.105:57588) 2026-03-10T07:46:01.645 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:01.465+0000 7f9a21eb7700 1 -- 192.168.123.105:0/3830529050 learned_addr 
learned my addr 192.168.123.105:0/3830529050 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T07:46:01.645 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:01.465+0000 7f9a21eb7700 1 -- 192.168.123.105:0/3830529050 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f9a0c009740 con 0x7f9a1c108970 2026-03-10T07:46:01.645 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:01.465+0000 7f9a21eb7700 1 --2- 192.168.123.105:0/3830529050 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f9a1c108970 0x7f9a1c19c680 secure :-1 s=READY pgs=52 cs=0 l=1 rev1=1 crypto rx=0x7f9a0c009710 tx=0x7f9a0c00bfa0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:46:01.645 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:01.465+0000 7f9a12ffd700 1 -- 192.168.123.105:0/3830529050 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f9a0c0041a0 con 0x7f9a1c108970 2026-03-10T07:46:01.645 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:01.466+0000 7f9a12ffd700 1 -- 192.168.123.105:0/3830529050 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f9a0c004300 con 0x7f9a1c108970 2026-03-10T07:46:01.645 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:01.466+0000 7f9a12ffd700 1 -- 192.168.123.105:0/3830529050 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f9a0c0114a0 con 0x7f9a1c108970 2026-03-10T07:46:01.645 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:01.466+0000 7f9a2411b700 1 -- 192.168.123.105:0/3830529050 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f9a1c19cdc0 con 0x7f9a1c108970 
2026-03-10T07:46:01.645 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:01.466+0000 7f9a2411b700 1 -- 192.168.123.105:0/3830529050 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f9a1c19d1e0 con 0x7f9a1c108970 2026-03-10T07:46:01.645 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:01.468+0000 7f9a12ffd700 1 -- 192.168.123.105:0/3830529050 <== mon.0 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 8) v1 ==== 45278+0+0 (secure 0 0 0) 0x7f9a0c011600 con 0x7f9a1c108970 2026-03-10T07:46:01.645 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:01.468+0000 7f9a12ffd700 1 --2- 192.168.123.105:0/3830529050 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f9a080383f0 0x7f9a0803a8b0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:46:01.645 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:01.468+0000 7f9a12ffd700 1 -- 192.168.123.105:0/3830529050 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(2..2 src has 1..2) v4 ==== 940+0+0 (secure 0 0 0) 0x7f9a0c04d110 con 0x7f9a1c108970 2026-03-10T07:46:01.645 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:01.470+0000 7f9a2411b700 1 -- 192.168.123.105:0/3830529050 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f9a1c04fa20 con 0x7f9a1c108970 2026-03-10T07:46:01.645 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:01.470+0000 7f9a216b6700 1 --2- 192.168.123.105:0/3830529050 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f9a080383f0 0x7f9a0803a8b0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:46:01.645 
INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:01.473+0000 7f9a216b6700 1 --2- 192.168.123.105:0/3830529050 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f9a080383f0 0x7f9a0803a8b0 secure :-1 s=READY pgs=14 cs=0 l=1 rev1=1 crypto rx=0x7f9a18006fd0 tx=0x7f9a18006e40 comp rx=0 tx=0).ready entity=mgr.14120 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:46:01.645 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:01.474+0000 7f9a12ffd700 1 -- 192.168.123.105:0/3830529050 <== mon.0 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7f9a0c02b430 con 0x7f9a1c108970 2026-03-10T07:46:01.645 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:01.588+0000 7f9a2411b700 1 -- 192.168.123.105:0/3830529050 --> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] -- mgr_command(tid 0: {"prefix": "orch apply", "service_type": "crash", "target": ["mon-mgr", ""]}) v1 -- 0x7f9a1c106420 con 0x7f9a080383f0 2026-03-10T07:46:01.645 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:01.595+0000 7f9a12ffd700 1 -- 192.168.123.105:0/3830529050 <== mgr.14120 v2:192.168.123.105:6800/2 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+26 (secure 0 0 0) 0x7f9a1c106420 con 0x7f9a080383f0 2026-03-10T07:46:01.645 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:01.598+0000 7f9a2411b700 1 -- 192.168.123.105:0/3830529050 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f9a080383f0 msgr2=0x7f9a0803a8b0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:46:01.645 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:01.598+0000 7f9a2411b700 1 --2- 192.168.123.105:0/3830529050 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f9a080383f0 0x7f9a0803a8b0 secure 
:-1 s=READY pgs=14 cs=0 l=1 rev1=1 crypto rx=0x7f9a18006fd0 tx=0x7f9a18006e40 comp rx=0 tx=0).stop 2026-03-10T07:46:01.645 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:01.598+0000 7f9a2411b700 1 -- 192.168.123.105:0/3830529050 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f9a1c108970 msgr2=0x7f9a1c19c680 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:46:01.645 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:01.598+0000 7f9a2411b700 1 --2- 192.168.123.105:0/3830529050 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f9a1c108970 0x7f9a1c19c680 secure :-1 s=READY pgs=52 cs=0 l=1 rev1=1 crypto rx=0x7f9a0c009710 tx=0x7f9a0c00bfa0 comp rx=0 tx=0).stop 2026-03-10T07:46:01.645 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:01.601+0000 7f9a2411b700 1 -- 192.168.123.105:0/3830529050 shutdown_connections 2026-03-10T07:46:01.646 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:01.601+0000 7f9a2411b700 1 --2- 192.168.123.105:0/3830529050 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f9a080383f0 0x7f9a0803a8b0 unknown :-1 s=CLOSED pgs=14 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:46:01.646 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:01.601+0000 7f9a2411b700 1 --2- 192.168.123.105:0/3830529050 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f9a1c108970 0x7f9a1c19c680 unknown :-1 s=CLOSED pgs=52 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:46:01.646 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:01.601+0000 7f9a2411b700 1 -- 192.168.123.105:0/3830529050 >> 192.168.123.105:0/3830529050 conn(0x7f9a1c07be30 msgr2=0x7f9a1c105d10 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:46:01.646 
INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:01.603+0000 7f9a2411b700 1 -- 192.168.123.105:0/3830529050 shutdown_connections 2026-03-10T07:46:01.646 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:01.603+0000 7f9a2411b700 1 -- 192.168.123.105:0/3830529050 wait complete. 2026-03-10T07:46:01.646 INFO:teuthology.orchestra.run.vm05.stdout:Deploying ceph-exporter service with default placement... 2026-03-10T07:46:01.894 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:46:01 vm05 ceph-mon[50387]: from='mgr.14120 192.168.123.105:0/2' entity='mgr.vm05.blexke' 2026-03-10T07:46:01.894 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:46:01 vm05 ceph-mon[50387]: Added host vm05 2026-03-10T07:46:01.894 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:46:01 vm05 ceph-mon[50387]: from='mgr.14120 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T07:46:01.895 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:46:01 vm05 ceph-mon[50387]: from='client.14142 -' entity='client.admin' cmd=[{"prefix": "orch apply", "service_type": "mon", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T07:46:01.895 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:46:01 vm05 ceph-mon[50387]: Saving service mon spec with placement count:5 2026-03-10T07:46:01.895 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:46:01 vm05 ceph-mon[50387]: from='mgr.14120 192.168.123.105:0/2' entity='mgr.vm05.blexke' 2026-03-10T07:46:01.895 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:46:01 vm05 ceph-mon[50387]: from='mgr.14120 192.168.123.105:0/2' entity='mgr.vm05.blexke' 2026-03-10T07:46:01.895 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:46:01 vm05 ceph-mon[50387]: from='mgr.14120 192.168.123.105:0/2' entity='mgr.vm05.blexke' 2026-03-10T07:46:01.895 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:46:01 vm05 ceph-mon[50387]: from='mgr.14120 
192.168.123.105:0/2' entity='mgr.vm05.blexke' 2026-03-10T07:46:01.895 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:46:01 vm05 ceph-mon[50387]: from='mgr.14120 192.168.123.105:0/2' entity='mgr.vm05.blexke' 2026-03-10T07:46:02.001 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout Scheduled ceph-exporter update... 2026-03-10T07:46:02.001 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:01.793+0000 7f303298f700 1 Processor -- start 2026-03-10T07:46:02.001 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:01.794+0000 7f303298f700 1 -- start start 2026-03-10T07:46:02.001 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:01.794+0000 7f303298f700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f302c072b70 0x7f302c10fdc0 unknown :-1 s=NONE pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:46:02.001 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:01.795+0000 7f303298f700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f302c110390 con 0x7f302c072b70 2026-03-10T07:46:02.001 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:01.795+0000 7f302bfff700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f302c072b70 0x7f302c10fdc0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:46:02.002 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:01.795+0000 7f302bfff700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f302c072b70 0x7f302c10fdc0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.105:57592/0 (socket says 192.168.123.105:57592) 
2026-03-10T07:46:02.002 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:01.795+0000 7f302bfff700 1 -- 192.168.123.105:0/4197199706 learned_addr learned my addr 192.168.123.105:0/4197199706 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T07:46:02.002 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:01.795+0000 7f302bfff700 1 -- 192.168.123.105:0/4197199706 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f302c1104d0 con 0x7f302c072b70 2026-03-10T07:46:02.002 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:01.795+0000 7f302bfff700 1 --2- 192.168.123.105:0/4197199706 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f302c072b70 0x7f302c10fdc0 secure :-1 s=READY pgs=53 cs=0 l=1 rev1=1 crypto rx=0x7f301c00ab30 tx=0x7f301c010730 comp rx=0 tx=0).ready entity=mon.0 client_cookie=f7e21f735d79024a server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:46:02.002 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:01.795+0000 7f302affd700 1 -- 192.168.123.105:0/4197199706 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f301c010e00 con 0x7f302c072b70 2026-03-10T07:46:02.002 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:01.795+0000 7f302affd700 1 -- 192.168.123.105:0/4197199706 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f301c004510 con 0x7f302c072b70 2026-03-10T07:46:02.002 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:01.796+0000 7f303298f700 1 -- 192.168.123.105:0/4197199706 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f302c072b70 msgr2=0x7f302c10fdc0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:46:02.002 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 
2026-03-10T07:46:01.796+0000 7f303298f700 1 --2- 192.168.123.105:0/4197199706 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f302c072b70 0x7f302c10fdc0 secure :-1 s=READY pgs=53 cs=0 l=1 rev1=1 crypto rx=0x7f301c00ab30 tx=0x7f301c010730 comp rx=0 tx=0).stop 2026-03-10T07:46:02.002 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:01.796+0000 7f303298f700 1 -- 192.168.123.105:0/4197199706 shutdown_connections 2026-03-10T07:46:02.002 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:01.796+0000 7f303298f700 1 --2- 192.168.123.105:0/4197199706 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f302c072b70 0x7f302c10fdc0 unknown :-1 s=CLOSED pgs=53 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:46:02.002 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:01.796+0000 7f303298f700 1 -- 192.168.123.105:0/4197199706 >> 192.168.123.105:0/4197199706 conn(0x7f302c06d660 msgr2=0x7f302c06fac0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:46:02.002 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:01.797+0000 7f303298f700 1 -- 192.168.123.105:0/4197199706 shutdown_connections 2026-03-10T07:46:02.002 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:01.797+0000 7f303298f700 1 -- 192.168.123.105:0/4197199706 wait complete. 
2026-03-10T07:46:02.002 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:01.797+0000 7f303298f700 1 Processor -- start 2026-03-10T07:46:02.002 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:01.797+0000 7f303298f700 1 -- start start 2026-03-10T07:46:02.002 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:01.797+0000 7f303298f700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f302c072b70 0x7f302c1b1b00 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:46:02.002 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:01.797+0000 7f303298f700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f302c1b2040 con 0x7f302c072b70 2026-03-10T07:46:02.002 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:01.797+0000 7f302bfff700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f302c072b70 0x7f302c1b1b00 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:46:02.002 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:01.797+0000 7f302bfff700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f302c072b70 0x7f302c1b1b00 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.105:57594/0 (socket says 192.168.123.105:57594) 2026-03-10T07:46:02.002 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:01.797+0000 7f302bfff700 1 -- 192.168.123.105:0/2462320749 learned_addr learned my addr 192.168.123.105:0/2462320749 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T07:46:02.002 
INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:01.797+0000 7f302bfff700 1 -- 192.168.123.105:0/2462320749 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f301c00a7e0 con 0x7f302c072b70 2026-03-10T07:46:02.002 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:01.798+0000 7f302bfff700 1 --2- 192.168.123.105:0/2462320749 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f302c072b70 0x7f302c1b1b00 secure :-1 s=READY pgs=54 cs=0 l=1 rev1=1 crypto rx=0x7f301c010bb0 tx=0x7f301c003980 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:46:02.002 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:01.798+0000 7f30297fa700 1 -- 192.168.123.105:0/2462320749 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f301c003b60 con 0x7f302c072b70 2026-03-10T07:46:02.002 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:01.798+0000 7f303298f700 1 -- 192.168.123.105:0/2462320749 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f302c1b2240 con 0x7f302c072b70 2026-03-10T07:46:02.002 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:01.798+0000 7f303298f700 1 -- 192.168.123.105:0/2462320749 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f302c1b26c0 con 0x7f302c072b70 2026-03-10T07:46:02.002 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:01.799+0000 7f30297fa700 1 -- 192.168.123.105:0/2462320749 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f301c00f070 con 0x7f302c072b70 2026-03-10T07:46:02.002 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:01.799+0000 7f30297fa700 1 
-- 192.168.123.105:0/2462320749 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f301c009660 con 0x7f302c072b70 2026-03-10T07:46:02.002 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:01.800+0000 7f30297fa700 1 -- 192.168.123.105:0/2462320749 <== mon.0 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 8) v1 ==== 45278+0+0 (secure 0 0 0) 0x7f301c018070 con 0x7f302c072b70 2026-03-10T07:46:02.002 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:01.800+0000 7f30297fa700 1 --2- 192.168.123.105:0/2462320749 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f3014038410 0x7f301403a8d0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:46:02.002 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:01.800+0000 7f302b7fe700 1 --2- 192.168.123.105:0/2462320749 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f3014038410 0x7f301403a8d0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:46:02.002 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:01.801+0000 7f302b7fe700 1 --2- 192.168.123.105:0/2462320749 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f3014038410 0x7f301403a8d0 secure :-1 s=READY pgs=15 cs=0 l=1 rev1=1 crypto rx=0x7f302400ad80 tx=0x7f30240093f0 comp rx=0 tx=0).ready entity=mgr.14120 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:46:02.002 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:01.801+0000 7f30297fa700 1 -- 192.168.123.105:0/2462320749 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(2..2 src has 1..2) v4 ==== 940+0+0 (secure 0 0 0) 0x7f301c04bf90 con 0x7f302c072b70 2026-03-10T07:46:02.002 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 
2026-03-10T07:46:01.801+0000 7f303298f700 1 -- 192.168.123.105:0/2462320749 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f3018005320 con 0x7f302c072b70 2026-03-10T07:46:02.002 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:01.805+0000 7f30297fa700 1 -- 192.168.123.105:0/2462320749 <== mon.0 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7f301c027070 con 0x7f302c072b70 2026-03-10T07:46:02.002 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:01.933+0000 7f303298f700 1 -- 192.168.123.105:0/2462320749 --> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] -- mgr_command(tid 0: {"prefix": "orch apply", "service_type": "ceph-exporter", "target": ["mon-mgr", ""]}) v1 -- 0x7f3018000bf0 con 0x7f3014038410 2026-03-10T07:46:02.002 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:01.942+0000 7f30297fa700 1 -- 192.168.123.105:0/2462320749 <== mgr.14120 v2:192.168.123.105:6800/2 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+34 (secure 0 0 0) 0x7f3018000bf0 con 0x7f3014038410 2026-03-10T07:46:02.002 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:01.946+0000 7f303298f700 1 -- 192.168.123.105:0/2462320749 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f3014038410 msgr2=0x7f301403a8d0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:46:02.002 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:01.946+0000 7f303298f700 1 --2- 192.168.123.105:0/2462320749 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f3014038410 0x7f301403a8d0 secure :-1 s=READY pgs=15 cs=0 l=1 rev1=1 crypto rx=0x7f302400ad80 tx=0x7f30240093f0 comp rx=0 tx=0).stop 2026-03-10T07:46:02.002 
INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:01.946+0000 7f303298f700 1 -- 192.168.123.105:0/2462320749 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f302c072b70 msgr2=0x7f302c1b1b00 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:46:02.002 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:01.946+0000 7f303298f700 1 --2- 192.168.123.105:0/2462320749 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f302c072b70 0x7f302c1b1b00 secure :-1 s=READY pgs=54 cs=0 l=1 rev1=1 crypto rx=0x7f301c010bb0 tx=0x7f301c003980 comp rx=0 tx=0).stop 2026-03-10T07:46:02.002 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:01.946+0000 7f303298f700 1 -- 192.168.123.105:0/2462320749 shutdown_connections 2026-03-10T07:46:02.002 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:01.946+0000 7f303298f700 1 --2- 192.168.123.105:0/2462320749 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f3014038410 0x7f301403a8d0 unknown :-1 s=CLOSED pgs=15 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:46:02.002 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:01.946+0000 7f303298f700 1 --2- 192.168.123.105:0/2462320749 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f302c072b70 0x7f302c1b1b00 unknown :-1 s=CLOSED pgs=54 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:46:02.002 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:01.946+0000 7f303298f700 1 -- 192.168.123.105:0/2462320749 >> 192.168.123.105:0/2462320749 conn(0x7f302c06d660 msgr2=0x7f302c06f3b0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:46:02.002 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:01.946+0000 7f303298f700 1 -- 192.168.123.105:0/2462320749 shutdown_connections 
2026-03-10T07:46:02.002 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:01.946+0000 7f303298f700 1 -- 192.168.123.105:0/2462320749 wait complete. 2026-03-10T07:46:02.002 INFO:teuthology.orchestra.run.vm05.stdout:Deploying prometheus service with default placement... 2026-03-10T07:46:02.310 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout Scheduled prometheus update... 2026-03-10T07:46:02.311 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:02.129+0000 7fb9fd567700 1 Processor -- start 2026-03-10T07:46:02.311 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:02.129+0000 7fb9fd567700 1 -- start start 2026-03-10T07:46:02.311 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:02.130+0000 7fb9fd567700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb9f81071c0 0x7fb9f81095b0 unknown :-1 s=NONE pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:46:02.311 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:02.130+0000 7fb9fd567700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fb9f8074720 con 0x7fb9f81071c0 2026-03-10T07:46:02.311 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:02.130+0000 7fb9f6ffd700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb9f81071c0 0x7fb9f81095b0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:46:02.311 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:02.130+0000 7fb9f6ffd700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb9f81071c0 0x7fb9f81095b0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer 
v2:192.168.123.105:3300/0 says I am v2:192.168.123.105:57600/0 (socket says 192.168.123.105:57600) 2026-03-10T07:46:02.311 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:02.130+0000 7fb9f6ffd700 1 -- 192.168.123.105:0/4221511364 learned_addr learned my addr 192.168.123.105:0/4221511364 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T07:46:02.311 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:02.130+0000 7fb9f6ffd700 1 -- 192.168.123.105:0/4221511364 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fb9f8109af0 con 0x7fb9f81071c0 2026-03-10T07:46:02.311 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:02.130+0000 7fb9f6ffd700 1 --2- 192.168.123.105:0/4221511364 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb9f81071c0 0x7fb9f81095b0 secure :-1 s=READY pgs=55 cs=0 l=1 rev1=1 crypto rx=0x7fb9e0009a90 tx=0x7fb9e0009da0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=fb5014649f99681e server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:46:02.311 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:02.131+0000 7fb9f5ffb700 1 -- 192.168.123.105:0/4221511364 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fb9e0004030 con 0x7fb9f81071c0 2026-03-10T07:46:02.311 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:02.131+0000 7fb9f5ffb700 1 -- 192.168.123.105:0/4221511364 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fb9e000b7e0 con 0x7fb9f81071c0 2026-03-10T07:46:02.311 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:02.131+0000 7fb9f5ffb700 1 -- 192.168.123.105:0/4221511364 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fb9e0003b30 con 0x7fb9f81071c0 2026-03-10T07:46:02.311 
INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:02.131+0000 7fb9fd567700 1 -- 192.168.123.105:0/4221511364 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb9f81071c0 msgr2=0x7fb9f81095b0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:46:02.311 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:02.131+0000 7fb9fd567700 1 --2- 192.168.123.105:0/4221511364 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb9f81071c0 0x7fb9f81095b0 secure :-1 s=READY pgs=55 cs=0 l=1 rev1=1 crypto rx=0x7fb9e0009a90 tx=0x7fb9e0009da0 comp rx=0 tx=0).stop 2026-03-10T07:46:02.311 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:02.131+0000 7fb9fd567700 1 -- 192.168.123.105:0/4221511364 shutdown_connections 2026-03-10T07:46:02.311 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:02.131+0000 7fb9fd567700 1 --2- 192.168.123.105:0/4221511364 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb9f81071c0 0x7fb9f81095b0 unknown :-1 s=CLOSED pgs=55 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:46:02.311 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:02.131+0000 7fb9fd567700 1 -- 192.168.123.105:0/4221511364 >> 192.168.123.105:0/4221511364 conn(0x7fb9f8100bd0 msgr2=0x7fb9f8103030 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:46:02.311 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:02.131+0000 7fb9fd567700 1 -- 192.168.123.105:0/4221511364 shutdown_connections 2026-03-10T07:46:02.311 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:02.131+0000 7fb9fd567700 1 -- 192.168.123.105:0/4221511364 wait complete. 
2026-03-10T07:46:02.311 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:02.132+0000 7fb9fd567700 1 Processor -- start
2026-03-10T07:46:02.311 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:02.132+0000 7fb9fd567700 1 -- start start
2026-03-10T07:46:02.311 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:02.132+0000 7fb9fd567700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb9f8198420 0x7fb9f8198840 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T07:46:02.311 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:02.132+0000 7fb9fd567700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fb9f8198d80 con 0x7fb9f8198420
2026-03-10T07:46:02.311 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:02.132+0000 7fb9f6ffd700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb9f8198420 0x7fb9f8198840 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-10T07:46:02.311 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:02.132+0000 7fb9f6ffd700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb9f8198420 0x7fb9f8198840 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.105:57614/0 (socket says 192.168.123.105:57614)
2026-03-10T07:46:02.311 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:02.132+0000 7fb9f6ffd700 1 -- 192.168.123.105:0/23910865 learned_addr learned my addr 192.168.123.105:0/23910865 (peer_addr_for_me v2:192.168.123.105:0/0)
2026-03-10T07:46:02.311 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:02.133+0000 7fb9f6ffd700 1 -- 192.168.123.105:0/23910865 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fb9e0009740 con 0x7fb9f8198420
2026-03-10T07:46:02.311 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:02.133+0000 7fb9f6ffd700 1 --2- 192.168.123.105:0/23910865 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb9f8198420 0x7fb9f8198840 secure :-1 s=READY pgs=56 cs=0 l=1 rev1=1 crypto rx=0x7fb9e0003770 tx=0x7fb9e000bfa0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-10T07:46:02.312 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:02.133+0000 7fb9effff700 1 -- 192.168.123.105:0/23910865 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fb9e0004030 con 0x7fb9f8198420
2026-03-10T07:46:02.312 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:02.133+0000 7fb9effff700 1 -- 192.168.123.105:0/23910865 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fb9e0024470 con 0x7fb9f8198420
2026-03-10T07:46:02.312 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:02.133+0000 7fb9fd567700 1 -- 192.168.123.105:0/23910865 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fb9f8198f80 con 0x7fb9f8198420
2026-03-10T07:46:02.312 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:02.133+0000 7fb9effff700 1 -- 192.168.123.105:0/23910865 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fb9e001a440 con 0x7fb9f8198420
2026-03-10T07:46:02.312 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:02.133+0000 7fb9fd567700 1 -- 192.168.123.105:0/23910865 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fb9f819bbd0 con 0x7fb9f8198420
2026-03-10T07:46:02.312 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:02.135+0000 7fb9effff700 1 -- 192.168.123.105:0/23910865 <== mon.0 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 8) v1 ==== 45278+0+0 (secure 0 0 0) 0x7fb9e0021070 con 0x7fb9f8198420
2026-03-10T07:46:02.312 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:02.135+0000 7fb9effff700 1 --2- 192.168.123.105:0/23910865 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fb9e403c7e0 0x7fb9e403eca0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T07:46:02.312 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:02.135+0000 7fb9effff700 1 -- 192.168.123.105:0/23910865 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(2..2 src has 1..2) v4 ==== 940+0+0 (secure 0 0 0) 0x7fb9e004d140 con 0x7fb9f8198420
2026-03-10T07:46:02.312 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:02.136+0000 7fb9f67fc700 1 --2- 192.168.123.105:0/23910865 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fb9e403c7e0 0x7fb9e403eca0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-10T07:46:02.312 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:02.136+0000 7fb9f67fc700 1 --2- 192.168.123.105:0/23910865 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fb9e403c7e0 0x7fb9e403eca0 secure :-1 s=READY pgs=16 cs=0 l=1 rev1=1 crypto rx=0x7fb9e8006fd0 tx=0x7fb9e8006e40 comp rx=0 tx=0).ready entity=mgr.14120 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-10T07:46:02.312 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:02.136+0000 7fb9fd567700 1 -- 192.168.123.105:0/23910865 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fb9f8192140 con 0x7fb9f8198420
2026-03-10T07:46:02.312 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:02.141+0000 7fb9effff700 1 -- 192.168.123.105:0/23910865 <== mon.0 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7fb9e001a5a0 con 0x7fb9f8198420
2026-03-10T07:46:02.312 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:02.263+0000 7fb9fd567700 1 -- 192.168.123.105:0/23910865 --> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] -- mgr_command(tid 0: {"prefix": "orch apply", "service_type": "prometheus", "target": ["mon-mgr", ""]}) v1 -- 0x7fb9f80611d0 con 0x7fb9e403c7e0
2026-03-10T07:46:02.312 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:02.268+0000 7fb9effff700 1 -- 192.168.123.105:0/23910865 <== mgr.14120 v2:192.168.123.105:6800/2 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+31 (secure 0 0 0) 0x7fb9f80611d0 con 0x7fb9e403c7e0
2026-03-10T07:46:02.312 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:02.270+0000 7fb9fd567700 1 -- 192.168.123.105:0/23910865 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fb9e403c7e0 msgr2=0x7fb9e403eca0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T07:46:02.312 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:02.270+0000 7fb9fd567700 1 --2- 192.168.123.105:0/23910865 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fb9e403c7e0 0x7fb9e403eca0 secure :-1 s=READY pgs=16 cs=0 l=1 rev1=1 crypto rx=0x7fb9e8006fd0 tx=0x7fb9e8006e40 comp rx=0 tx=0).stop
2026-03-10T07:46:02.312 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:02.271+0000 7fb9fd567700 1 -- 192.168.123.105:0/23910865 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb9f8198420 msgr2=0x7fb9f8198840 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T07:46:02.312 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:02.271+0000 7fb9fd567700 1 --2- 192.168.123.105:0/23910865 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb9f8198420 0x7fb9f8198840 secure :-1 s=READY pgs=56 cs=0 l=1 rev1=1 crypto rx=0x7fb9e0003770 tx=0x7fb9e000bfa0 comp rx=0 tx=0).stop
2026-03-10T07:46:02.312 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:02.271+0000 7fb9fd567700 1 -- 192.168.123.105:0/23910865 shutdown_connections
2026-03-10T07:46:02.312 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:02.271+0000 7fb9fd567700 1 --2- 192.168.123.105:0/23910865 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fb9e403c7e0 0x7fb9e403eca0 unknown :-1 s=CLOSED pgs=16 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T07:46:02.312 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:02.271+0000 7fb9fd567700 1 --2- 192.168.123.105:0/23910865 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb9f8198420 0x7fb9f8198840 unknown :-1 s=CLOSED pgs=56 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T07:46:02.312 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:02.271+0000 7fb9fd567700 1 -- 192.168.123.105:0/23910865 >> 192.168.123.105:0/23910865 conn(0x7fb9f8100bd0 msgr2=0x7fb9f81018b0 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-10T07:46:02.312 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:02.272+0000 7fb9fd567700 1 -- 192.168.123.105:0/23910865 shutdown_connections
2026-03-10T07:46:02.312 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:02.272+0000 7fb9fd567700 1 -- 192.168.123.105:0/23910865 wait complete.
2026-03-10T07:46:02.312 INFO:teuthology.orchestra.run.vm05.stdout:Deploying grafana service with default placement...
2026-03-10T07:46:02.790 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout Scheduled grafana update...
2026-03-10T07:46:02.790 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:02.454+0000 7fabb262c700 1 Processor -- start
2026-03-10T07:46:02.790 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:02.455+0000 7fabb262c700 1 -- start start
2026-03-10T07:46:02.790 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:02.456+0000 7fabb262c700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fabac108990 0x7fabac108db0 unknown :-1 s=NONE pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T07:46:02.790 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:02.456+0000 7fabb262c700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fabac109380 con 0x7fabac108990
2026-03-10T07:46:02.790 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:02.456+0000 7fababfff700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fabac108990 0x7fabac108db0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-10T07:46:02.790 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:02.456+0000 7fababfff700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fabac108990 0x7fabac108db0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.105:57628/0 (socket says 192.168.123.105:57628)
2026-03-10T07:46:02.790 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:02.456+0000 7fababfff700 1 -- 192.168.123.105:0/4064903082 learned_addr learned my addr 192.168.123.105:0/4064903082 (peer_addr_for_me v2:192.168.123.105:0/0)
2026-03-10T07:46:02.790 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:02.456+0000 7fababfff700 1 -- 192.168.123.105:0/4064903082 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fabac109b90 con 0x7fabac108990
2026-03-10T07:46:02.790 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:02.457+0000 7fababfff700 1 --2- 192.168.123.105:0/4064903082 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fabac108990 0x7fabac108db0 secure :-1 s=READY pgs=57 cs=0 l=1 rev1=1 crypto rx=0x7fab94009cf0 tx=0x7fab9400b0e0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=e54a191eb2499ba5 server_cookie=0 in_seq=0 out_seq=0
2026-03-10T07:46:02.790 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:02.457+0000 7fabaaffd700 1 -- 192.168.123.105:0/4064903082 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fab94004030 con 0x7fabac108990
2026-03-10T07:46:02.790 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:02.457+0000 7fabaaffd700 1 -- 192.168.123.105:0/4064903082 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fab9400b810 con 0x7fabac108990
2026-03-10T07:46:02.790 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:02.457+0000 7fabaaffd700 1 -- 192.168.123.105:0/4064903082 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fab94003b10 con 0x7fabac108990
2026-03-10T07:46:02.790 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:02.458+0000 7fabb262c700 1 -- 192.168.123.105:0/4064903082 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fabac108990 msgr2=0x7fabac108db0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T07:46:02.790 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:02.458+0000 7fabb262c700 1 --2- 192.168.123.105:0/4064903082 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fabac108990 0x7fabac108db0 secure :-1 s=READY pgs=57 cs=0 l=1 rev1=1 crypto rx=0x7fab94009cf0 tx=0x7fab9400b0e0 comp rx=0 tx=0).stop
2026-03-10T07:46:02.790 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:02.458+0000 7fabb262c700 1 -- 192.168.123.105:0/4064903082 shutdown_connections
2026-03-10T07:46:02.790 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:02.458+0000 7fabb262c700 1 --2- 192.168.123.105:0/4064903082 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fabac108990 0x7fabac108db0 unknown :-1 s=CLOSED pgs=57 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T07:46:02.790 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:02.458+0000 7fabb262c700 1 -- 192.168.123.105:0/4064903082 >> 192.168.123.105:0/4064903082 conn(0x7fabac103f50 msgr2=0x7fabac106370 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-10T07:46:02.790 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:02.458+0000 7fabb262c700 1 -- 192.168.123.105:0/4064903082 shutdown_connections
2026-03-10T07:46:02.790 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:02.458+0000 7fabb262c700 1 -- 192.168.123.105:0/4064903082 wait complete.
2026-03-10T07:46:02.790 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:02.458+0000 7fabb262c700 1 Processor -- start
2026-03-10T07:46:02.790 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:02.459+0000 7fabb262c700 1 -- start start
2026-03-10T07:46:02.790 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:02.459+0000 7fabb262c700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fabac108990 0x7fabac19c6d0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T07:46:02.790 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:02.459+0000 7fabb262c700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fabac19cc10 con 0x7fabac108990
2026-03-10T07:46:02.790 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:02.459+0000 7fababfff700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fabac108990 0x7fabac19c6d0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-10T07:46:02.790 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:02.459+0000 7fababfff700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fabac108990 0x7fabac19c6d0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.105:57636/0 (socket says 192.168.123.105:57636)
2026-03-10T07:46:02.790 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:02.459+0000 7fababfff700 1 -- 192.168.123.105:0/2044716941 learned_addr learned my addr 192.168.123.105:0/2044716941 (peer_addr_for_me v2:192.168.123.105:0/0)
2026-03-10T07:46:02.790 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:02.459+0000 7fababfff700 1 -- 192.168.123.105:0/2044716941 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fab94009740 con 0x7fabac108990
2026-03-10T07:46:02.790 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:02.460+0000 7fababfff700 1 --2- 192.168.123.105:0/2044716941 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fabac108990 0x7fabac19c6d0 secure :-1 s=READY pgs=58 cs=0 l=1 rev1=1 crypto rx=0x7fab94000c00 tx=0x7fab94011890 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-10T07:46:02.790 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:02.460+0000 7faba97fa700 1 -- 192.168.123.105:0/2044716941 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fab94011bc0 con 0x7fabac108990
2026-03-10T07:46:02.790 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:02.460+0000 7faba97fa700 1 -- 192.168.123.105:0/2044716941 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fab94011d20 con 0x7fabac108990
2026-03-10T07:46:02.791 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:02.460+0000 7faba97fa700 1 -- 192.168.123.105:0/2044716941 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fab9401a550 con 0x7fabac108990
2026-03-10T07:46:02.791 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:02.460+0000 7fabb262c700 1 -- 192.168.123.105:0/2044716941 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fabac19ce10 con 0x7fabac108990
2026-03-10T07:46:02.791 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:02.460+0000 7fabb262c700 1 -- 192.168.123.105:0/2044716941 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fabac19d230 con 0x7fabac108990
2026-03-10T07:46:02.791 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:02.461+0000 7faba97fa700 1 -- 192.168.123.105:0/2044716941 <== mon.0 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 8) v1 ==== 45278+0+0 (secure 0 0 0) 0x7fab9401b440 con 0x7fabac108990
2026-03-10T07:46:02.791 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:02.461+0000 7fabb262c700 1 -- 192.168.123.105:0/2044716941 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fabac0623c0 con 0x7fabac108990
2026-03-10T07:46:02.791 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:02.461+0000 7faba97fa700 1 --2- 192.168.123.105:0/2044716941 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fab980383f0 0x7fab9803a8b0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T07:46:02.791 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:02.461+0000 7faba97fa700 1 -- 192.168.123.105:0/2044716941 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(2..2 src has 1..2) v4 ==== 940+0+0 (secure 0 0 0) 0x7fab9404c0f0 con 0x7fabac108990
2026-03-10T07:46:02.791 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:02.462+0000 7fabab7fe700 1 --2- 192.168.123.105:0/2044716941 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fab980383f0 0x7fab9803a8b0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-10T07:46:02.791 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:02.463+0000 7fabab7fe700 1 --2- 192.168.123.105:0/2044716941 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fab980383f0 0x7fab9803a8b0 secure :-1 s=READY pgs=17 cs=0 l=1 rev1=1 crypto rx=0x7fab9c006fd0 tx=0x7fab9c006e40 comp rx=0 tx=0).ready entity=mgr.14120 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-10T07:46:02.791 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:02.464+0000 7faba97fa700 1 -- 192.168.123.105:0/2044716941 <== mon.0 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7fab9407f0e0 con 0x7fabac108990
2026-03-10T07:46:02.791 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:02.576+0000 7fabb262c700 1 -- 192.168.123.105:0/2044716941 --> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] -- mgr_command(tid 0: {"prefix": "orch apply", "service_type": "grafana", "target": ["mon-mgr", ""]}) v1 -- 0x7fabac105360 con 0x7fab980383f0
2026-03-10T07:46:02.791 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:02.721+0000 7faba97fa700 1 -- 192.168.123.105:0/2044716941 <== mgr.14120 v2:192.168.123.105:6800/2 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+28 (secure 0 0 0) 0x7fabac105360 con 0x7fab980383f0
2026-03-10T07:46:02.791 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:02.724+0000 7fabb262c700 1 -- 192.168.123.105:0/2044716941 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fab980383f0 msgr2=0x7fab9803a8b0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T07:46:02.791 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:02.724+0000 7fabb262c700 1 --2- 192.168.123.105:0/2044716941 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fab980383f0 0x7fab9803a8b0 secure :-1 s=READY pgs=17 cs=0 l=1 rev1=1 crypto rx=0x7fab9c006fd0 tx=0x7fab9c006e40 comp rx=0 tx=0).stop
2026-03-10T07:46:02.791 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:02.724+0000 7fabb262c700 1 -- 192.168.123.105:0/2044716941 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fabac108990 msgr2=0x7fabac19c6d0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T07:46:02.791 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:02.724+0000 7fabb262c700 1 --2- 192.168.123.105:0/2044716941 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fabac108990 0x7fabac19c6d0 secure :-1 s=READY pgs=58 cs=0 l=1 rev1=1 crypto rx=0x7fab94000c00 tx=0x7fab94011890 comp rx=0 tx=0).stop
2026-03-10T07:46:02.791 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:02.725+0000 7fabb262c700 1 -- 192.168.123.105:0/2044716941 shutdown_connections
2026-03-10T07:46:02.791 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:02.725+0000 7fabb262c700 1 --2- 192.168.123.105:0/2044716941 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fab980383f0 0x7fab9803a8b0 unknown :-1 s=CLOSED pgs=17 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T07:46:02.791 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:02.725+0000 7fabb262c700 1 --2- 192.168.123.105:0/2044716941 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fabac108990 0x7fabac19c6d0 unknown :-1 s=CLOSED pgs=58 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T07:46:02.791 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:02.725+0000 7fabb262c700 1 -- 192.168.123.105:0/2044716941 >> 192.168.123.105:0/2044716941 conn(0x7fabac103f50 msgr2=0x7fabac104c50 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-10T07:46:02.791 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:02.725+0000 7fabb262c700 1 -- 192.168.123.105:0/2044716941 shutdown_connections
2026-03-10T07:46:02.791 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:02.725+0000 7fabb262c700 1 -- 192.168.123.105:0/2044716941 wait complete.
2026-03-10T07:46:02.791 INFO:teuthology.orchestra.run.vm05.stdout:Deploying node-exporter service with default placement...
2026-03-10T07:46:02.950 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:46:02 vm05 ceph-mon[50387]: from='client.14144 -' entity='client.admin' cmd=[{"prefix": "orch apply", "service_type": "mgr", "target": ["mon-mgr", ""]}]: dispatch
2026-03-10T07:46:02.950 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:46:02 vm05 ceph-mon[50387]: Saving service mgr spec with placement count:2
2026-03-10T07:46:02.950 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:46:02 vm05 ceph-mon[50387]: from='client.14146 -' entity='client.admin' cmd=[{"prefix": "orch apply", "service_type": "crash", "target": ["mon-mgr", ""]}]: dispatch
2026-03-10T07:46:02.950 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:46:02 vm05 ceph-mon[50387]: Saving service crash spec with placement *
2026-03-10T07:46:02.950 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:46:02 vm05 ceph-mon[50387]: from='client.14148 -' entity='client.admin' cmd=[{"prefix": "orch apply", "service_type": "ceph-exporter", "target": ["mon-mgr", ""]}]: dispatch
2026-03-10T07:46:02.950 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:46:02 vm05 ceph-mon[50387]: Saving service ceph-exporter spec with placement *
2026-03-10T07:46:02.950 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:46:02 vm05 ceph-mon[50387]: from='mgr.14120 192.168.123.105:0/2' entity='mgr.vm05.blexke'
2026-03-10T07:46:02.950 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:46:02 vm05 ceph-mon[50387]: from='mgr.14120 192.168.123.105:0/2' entity='mgr.vm05.blexke'
2026-03-10T07:46:02.950 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:46:02 vm05 ceph-mon[50387]: from='mgr.14120 192.168.123.105:0/2' entity='mgr.vm05.blexke'
2026-03-10T07:46:02.950 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:46:02 vm05 ceph-mon[50387]: from='mgr.14120 192.168.123.105:0/2' entity='mgr.vm05.blexke'
2026-03-10T07:46:03.115 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout Scheduled node-exporter update...
2026-03-10T07:46:03.116 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:02.948+0000 7f99c0dcf700 1 Processor -- start
2026-03-10T07:46:03.116 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:02.948+0000 7f99c0dcf700 1 -- start start
2026-03-10T07:46:03.116 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:02.949+0000 7f99c0dcf700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f99bc07b760 0x7f99bc07bb80 unknown :-1 s=NONE pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T07:46:03.116 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:02.949+0000 7f99c0dcf700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f99bc07c150 con 0x7f99bc07b760
2026-03-10T07:46:03.116 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:02.949+0000 7f99bb7fe700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f99bc07b760 0x7f99bc07bb80 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-10T07:46:03.116 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:02.949+0000 7f99bb7fe700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f99bc07b760 0x7f99bc07bb80 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.105:57642/0 (socket says 192.168.123.105:57642)
2026-03-10T07:46:03.116 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:02.949+0000 7f99bb7fe700 1 -- 192.168.123.105:0/2699615193 learned_addr learned my addr 192.168.123.105:0/2699615193 (peer_addr_for_me v2:192.168.123.105:0/0)
2026-03-10T07:46:03.116 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:02.950+0000 7f99bb7fe700 1 -- 192.168.123.105:0/2699615193 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f99bc07c9b0 con 0x7f99bc07b760
2026-03-10T07:46:03.116 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:02.950+0000 7f99bb7fe700 1 --2- 192.168.123.105:0/2699615193 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f99bc07b760 0x7f99bc07bb80 secure :-1 s=READY pgs=59 cs=0 l=1 rev1=1 crypto rx=0x7f99b400b0d0 tx=0x7f99b400b490 comp rx=0 tx=0).ready entity=mon.0 client_cookie=62306d14a9d2b263 server_cookie=0 in_seq=0 out_seq=0
2026-03-10T07:46:03.116 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:02.955+0000 7f99ba7fc700 1 -- 192.168.123.105:0/2699615193 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f99b400e070 con 0x7f99bc07b760
2026-03-10T07:46:03.116 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:02.955+0000 7f99ba7fc700 1 -- 192.168.123.105:0/2699615193 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f99b4003a20 con 0x7f99bc07b760
2026-03-10T07:46:03.116 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:02.955+0000 7f99ba7fc700 1 -- 192.168.123.105:0/2699615193 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f99b40046b0 con 0x7f99bc07b760
2026-03-10T07:46:03.116 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:02.958+0000 7f99c0dcf700 1 -- 192.168.123.105:0/2699615193 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f99bc07b760 msgr2=0x7f99bc07bb80 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T07:46:03.116 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:02.958+0000 7f99c0dcf700 1 --2- 192.168.123.105:0/2699615193 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f99bc07b760 0x7f99bc07bb80 secure :-1 s=READY pgs=59 cs=0 l=1 rev1=1 crypto rx=0x7f99b400b0d0 tx=0x7f99b400b490 comp rx=0 tx=0).stop
2026-03-10T07:46:03.116 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:02.958+0000 7f99c0dcf700 1 -- 192.168.123.105:0/2699615193 shutdown_connections
2026-03-10T07:46:03.116 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:02.958+0000 7f99c0dcf700 1 --2- 192.168.123.105:0/2699615193 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f99bc07b760 0x7f99bc07bb80 unknown :-1 s=CLOSED pgs=59 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T07:46:03.116 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:02.958+0000 7f99c0dcf700 1 -- 192.168.123.105:0/2699615193 >> 192.168.123.105:0/2699615193 conn(0x7f99bc103d70 msgr2=0x7f99bc106190 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-10T07:46:03.116 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:02.958+0000 7f99c0dcf700 1 -- 192.168.123.105:0/2699615193 shutdown_connections
2026-03-10T07:46:03.116 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:02.959+0000 7f99c0dcf700 1 -- 192.168.123.105:0/2699615193 wait complete.
2026-03-10T07:46:03.116 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:02.959+0000 7f99c0dcf700 1 Processor -- start
2026-03-10T07:46:03.116 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:02.959+0000 7f99c0dcf700 1 -- start start
2026-03-10T07:46:03.116 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:02.959+0000 7f99c0dcf700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f99bc1a9300 0x7f99bc1a9720 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T07:46:03.116 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:02.959+0000 7f99c0dcf700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f99bc1a9c60 con 0x7f99bc1a9300
2026-03-10T07:46:03.118 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:02.959+0000 7f99bb7fe700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f99bc1a9300 0x7f99bc1a9720 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-10T07:46:03.118 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:02.959+0000 7f99bb7fe700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f99bc1a9300 0x7f99bc1a9720 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.105:57646/0 (socket says 192.168.123.105:57646)
2026-03-10T07:46:03.118 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:02.959+0000 7f99bb7fe700 1 -- 192.168.123.105:0/1330766965 learned_addr learned my addr 192.168.123.105:0/1330766965 (peer_addr_for_me v2:192.168.123.105:0/0)
2026-03-10T07:46:03.118 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:02.960+0000 7f99bb7fe700 1 -- 192.168.123.105:0/1330766965 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f99b4009d20 con 0x7f99bc1a9300
2026-03-10T07:46:03.118 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:02.960+0000 7f99bb7fe700 1 --2- 192.168.123.105:0/1330766965 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f99bc1a9300 0x7f99bc1a9720 secure :-1 s=READY pgs=60 cs=0 l=1 rev1=1 crypto rx=0x7f99b4015040 tx=0x7f99b400bd60 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-10T07:46:03.118 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:02.960+0000 7f99b8ff9700 1 -- 192.168.123.105:0/1330766965 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f99b400e050 con 0x7f99bc1a9300
2026-03-10T07:46:03.118 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:02.960+0000 7f99b8ff9700 1 -- 192.168.123.105:0/1330766965 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f99b40092e0 con 0x7f99bc1a9300
2026-03-10T07:46:03.118 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:02.961+0000 7f99b8ff9700 1 -- 192.168.123.105:0/1330766965 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f99b40128e0 con 0x7f99bc1a9300
2026-03-10T07:46:03.118 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:02.961+0000 7f99c0dcf700 1 -- 192.168.123.105:0/1330766965 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f99bc1a9e60 con 0x7f99bc1a9300
2026-03-10T07:46:03.118 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:02.961+0000 7f99c0dcf700 1 -- 192.168.123.105:0/1330766965 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f99bc1acab0 con 0x7f99bc1a9300
2026-03-10T07:46:03.118 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:02.961+0000 7f99c0dcf700 1 -- 192.168.123.105:0/1330766965 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f99bc04f070 con 0x7f99bc1a9300
2026-03-10T07:46:03.118 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:02.962+0000 7f99b8ff9700 1 -- 192.168.123.105:0/1330766965 <== mon.0 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 8) v1 ==== 45278+0+0 (secure 0 0 0) 0x7f99b4019040 con 0x7f99bc1a9300
2026-03-10T07:46:03.119 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:02.962+0000 7f99b8ff9700 1 --2- 192.168.123.105:0/1330766965 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f99a4037fe0 0x7f99a403a4a0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T07:46:03.119 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:02.962+0000 7f99b8ff9700 1 -- 192.168.123.105:0/1330766965 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(2..2 src has 1..2) v4 ==== 940+0+0 (secure 0 0 0) 0x7f99b404c150 con 0x7f99bc1a9300
2026-03-10T07:46:03.119 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:02.962+0000 7f99baffd700 1 --2- 192.168.123.105:0/1330766965 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f99a4037fe0 0x7f99a403a4a0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-10T07:46:03.119 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:02.962+0000 7f99baffd700 1 --2- 192.168.123.105:0/1330766965 >> 
[v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f99a4037fe0 0x7f99a403a4a0 secure :-1 s=READY pgs=18 cs=0 l=1 rev1=1 crypto rx=0x7f99ac006fd0 tx=0x7f99ac006e40 comp rx=0 tx=0).ready entity=mgr.14120 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:46:03.119 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:02.964+0000 7f99b8ff9700 1 -- 192.168.123.105:0/1330766965 <== mon.0 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7f99b4025ba0 con 0x7f99bc1a9300 2026-03-10T07:46:03.119 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:03.073+0000 7f99c0dcf700 1 -- 192.168.123.105:0/1330766965 --> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] -- mgr_command(tid 0: {"prefix": "orch apply", "service_type": "node-exporter", "target": ["mon-mgr", ""]}) v1 -- 0x7f99bc0611d0 con 0x7f99a4037fe0 2026-03-10T07:46:03.119 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:03.078+0000 7f99b8ff9700 1 -- 192.168.123.105:0/1330766965 <== mgr.14120 v2:192.168.123.105:6800/2 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+34 (secure 0 0 0) 0x7f99bc0611d0 con 0x7f99a4037fe0 2026-03-10T07:46:03.119 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:03.080+0000 7f99c0dcf700 1 -- 192.168.123.105:0/1330766965 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f99a4037fe0 msgr2=0x7f99a403a4a0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:46:03.119 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:03.080+0000 7f99c0dcf700 1 --2- 192.168.123.105:0/1330766965 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f99a4037fe0 0x7f99a403a4a0 secure :-1 s=READY pgs=18 cs=0 l=1 rev1=1 crypto rx=0x7f99ac006fd0 tx=0x7f99ac006e40 comp rx=0 tx=0).stop 2026-03-10T07:46:03.119 
INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:03.080+0000 7f99c0dcf700 1 -- 192.168.123.105:0/1330766965 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f99bc1a9300 msgr2=0x7f99bc1a9720 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:46:03.119 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:03.080+0000 7f99c0dcf700 1 --2- 192.168.123.105:0/1330766965 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f99bc1a9300 0x7f99bc1a9720 secure :-1 s=READY pgs=60 cs=0 l=1 rev1=1 crypto rx=0x7f99b4015040 tx=0x7f99b400bd60 comp rx=0 tx=0).stop 2026-03-10T07:46:03.119 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:03.080+0000 7f99c0dcf700 1 -- 192.168.123.105:0/1330766965 shutdown_connections 2026-03-10T07:46:03.119 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:03.080+0000 7f99c0dcf700 1 --2- 192.168.123.105:0/1330766965 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f99a4037fe0 0x7f99a403a4a0 unknown :-1 s=CLOSED pgs=18 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:46:03.119 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:03.080+0000 7f99c0dcf700 1 --2- 192.168.123.105:0/1330766965 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f99bc1a9300 0x7f99bc1a9720 unknown :-1 s=CLOSED pgs=60 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:46:03.119 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:03.080+0000 7f99c0dcf700 1 -- 192.168.123.105:0/1330766965 >> 192.168.123.105:0/1330766965 conn(0x7f99bc103d70 msgr2=0x7f99bc106050 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:46:03.119 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:03.080+0000 7f99c0dcf700 1 -- 192.168.123.105:0/1330766965 shutdown_connections 
2026-03-10T07:46:03.119 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:03.081+0000 7f99c0dcf700 1 -- 192.168.123.105:0/1330766965 wait complete. 2026-03-10T07:46:03.119 INFO:teuthology.orchestra.run.vm05.stdout:Deploying alertmanager service with default placement... 2026-03-10T07:46:03.422 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout Scheduled alertmanager update... 2026-03-10T07:46:03.422 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:03.253+0000 7f42a0fb1700 1 Processor -- start 2026-03-10T07:46:03.422 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:03.253+0000 7f42a0fb1700 1 -- start start 2026-03-10T07:46:03.422 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:03.254+0000 7f42a0fb1700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f429c071e00 0x7f429c072220 unknown :-1 s=NONE pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:46:03.422 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:03.254+0000 7f42a0fb1700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f429c0727f0 con 0x7f429c071e00 2026-03-10T07:46:03.422 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:03.254+0000 7f429b7fe700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f429c071e00 0x7f429c072220 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:46:03.422 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:03.254+0000 7f429b7fe700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f429c071e00 0x7f429c072220 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer 
v2:192.168.123.105:3300/0 says I am v2:192.168.123.105:57650/0 (socket says 192.168.123.105:57650) 2026-03-10T07:46:03.423 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:03.254+0000 7f429b7fe700 1 -- 192.168.123.105:0/526098939 learned_addr learned my addr 192.168.123.105:0/526098939 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T07:46:03.423 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:03.254+0000 7f429b7fe700 1 -- 192.168.123.105:0/526098939 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f429c10ddb0 con 0x7f429c071e00 2026-03-10T07:46:03.423 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:03.254+0000 7f429b7fe700 1 --2- 192.168.123.105:0/526098939 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f429c071e00 0x7f429c072220 secure :-1 s=READY pgs=61 cs=0 l=1 rev1=1 crypto rx=0x7f428c009cf0 tx=0x7f428c00b0e0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=7593e05aa88eea82 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:46:03.423 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:03.256+0000 7f429a7fc700 1 -- 192.168.123.105:0/526098939 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f428c004030 con 0x7f429c071e00 2026-03-10T07:46:03.423 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:03.256+0000 7f429a7fc700 1 -- 192.168.123.105:0/526098939 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f428c00b810 con 0x7f429c071e00 2026-03-10T07:46:03.423 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:03.256+0000 7f42a0fb1700 1 -- 192.168.123.105:0/526098939 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f429c071e00 msgr2=0x7f429c072220 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 
2026-03-10T07:46:03.423 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:03.256+0000 7f42a0fb1700 1 --2- 192.168.123.105:0/526098939 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f429c071e00 0x7f429c072220 secure :-1 s=READY pgs=61 cs=0 l=1 rev1=1 crypto rx=0x7f428c009cf0 tx=0x7f428c00b0e0 comp rx=0 tx=0).stop 2026-03-10T07:46:03.423 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:03.256+0000 7f42a0fb1700 1 -- 192.168.123.105:0/526098939 shutdown_connections 2026-03-10T07:46:03.423 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:03.256+0000 7f42a0fb1700 1 --2- 192.168.123.105:0/526098939 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f429c071e00 0x7f429c072220 unknown :-1 s=CLOSED pgs=61 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:46:03.423 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:03.256+0000 7f42a0fb1700 1 -- 192.168.123.105:0/526098939 >> 192.168.123.105:0/526098939 conn(0x7f429c06d320 msgr2=0x7f429c06f760 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:46:03.423 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:03.256+0000 7f42a0fb1700 1 -- 192.168.123.105:0/526098939 shutdown_connections 2026-03-10T07:46:03.423 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:03.256+0000 7f42a0fb1700 1 -- 192.168.123.105:0/526098939 wait complete. 
2026-03-10T07:46:03.423 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:03.257+0000 7f42a0fb1700 1 Processor -- start 2026-03-10T07:46:03.423 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:03.257+0000 7f42a0fb1700 1 -- start start 2026-03-10T07:46:03.423 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:03.257+0000 7f42a0fb1700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f429c11d4f0 0x7f429c11bb80 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:46:03.423 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:03.257+0000 7f42a0fb1700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f429c0727f0 con 0x7f429c11d4f0 2026-03-10T07:46:03.423 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:03.257+0000 7f429b7fe700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f429c11d4f0 0x7f429c11bb80 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:46:03.423 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:03.257+0000 7f429b7fe700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f429c11d4f0 0x7f429c11bb80 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.105:57656/0 (socket says 192.168.123.105:57656) 2026-03-10T07:46:03.423 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:03.257+0000 7f429b7fe700 1 -- 192.168.123.105:0/1884979563 learned_addr learned my addr 192.168.123.105:0/1884979563 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T07:46:03.423 
INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:03.257+0000 7f429b7fe700 1 -- 192.168.123.105:0/1884979563 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f428c009740 con 0x7f429c11d4f0 2026-03-10T07:46:03.423 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:03.258+0000 7f429b7fe700 1 --2- 192.168.123.105:0/1884979563 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f429c11d4f0 0x7f429c11bb80 secure :-1 s=READY pgs=62 cs=0 l=1 rev1=1 crypto rx=0x7f428c009cc0 tx=0x7f428c00bfa0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:46:03.424 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:03.258+0000 7f4298ff9700 1 -- 192.168.123.105:0/1884979563 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f428c003950 con 0x7f429c11d4f0 2026-03-10T07:46:03.424 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:03.258+0000 7f4298ff9700 1 -- 192.168.123.105:0/1884979563 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f428c0043c0 con 0x7f429c11d4f0 2026-03-10T07:46:03.424 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:03.259+0000 7f4298ff9700 1 -- 192.168.123.105:0/1884979563 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f428c01ac80 con 0x7f429c11d4f0 2026-03-10T07:46:03.424 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:03.259+0000 7f42a0fb1700 1 -- 192.168.123.105:0/1884979563 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f429c11d910 con 0x7f429c11d4f0 2026-03-10T07:46:03.424 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:03.259+0000 7f42a0fb1700 1 
-- 192.168.123.105:0/1884979563 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f429c11db60 con 0x7f429c11d4f0 2026-03-10T07:46:03.424 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:03.260+0000 7f42a0fb1700 1 -- 192.168.123.105:0/1884979563 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f429c0623c0 con 0x7f429c11d4f0 2026-03-10T07:46:03.424 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:03.263+0000 7f4298ff9700 1 -- 192.168.123.105:0/1884979563 <== mon.0 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 8) v1 ==== 45278+0+0 (secure 0 0 0) 0x7f428c004530 con 0x7f429c11d4f0 2026-03-10T07:46:03.424 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:03.263+0000 7f4298ff9700 1 --2- 192.168.123.105:0/1884979563 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f4284038050 0x7f428403a510 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:46:03.424 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:03.263+0000 7f4298ff9700 1 -- 192.168.123.105:0/1884979563 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(2..2 src has 1..2) v4 ==== 940+0+0 (secure 0 0 0) 0x7f428c04bcf0 con 0x7f429c11d4f0 2026-03-10T07:46:03.424 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:03.263+0000 7f4298ff9700 1 -- 192.168.123.105:0/1884979563 <== mon.0 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7f428c019c90 con 0x7f429c11d4f0 2026-03-10T07:46:03.424 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:03.267+0000 7f429affd700 1 --2- 192.168.123.105:0/1884979563 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f4284038050 
0x7f428403a510 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:46:03.424 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:03.269+0000 7f429affd700 1 --2- 192.168.123.105:0/1884979563 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f4284038050 0x7f428403a510 secure :-1 s=READY pgs=19 cs=0 l=1 rev1=1 crypto rx=0x7f4290006fd0 tx=0x7f4290006e40 comp rx=0 tx=0).ready entity=mgr.14120 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:46:03.424 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:03.374+0000 7f42a0fb1700 1 -- 192.168.123.105:0/1884979563 --> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] -- mgr_command(tid 0: {"prefix": "orch apply", "service_type": "alertmanager", "target": ["mon-mgr", ""]}) v1 -- 0x7f429c06e480 con 0x7f4284038050 2026-03-10T07:46:03.424 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:03.380+0000 7f4298ff9700 1 -- 192.168.123.105:0/1884979563 <== mgr.14120 v2:192.168.123.105:6800/2 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+33 (secure 0 0 0) 0x7f429c06e480 con 0x7f4284038050 2026-03-10T07:46:03.424 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:03.382+0000 7f42a0fb1700 1 -- 192.168.123.105:0/1884979563 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f4284038050 msgr2=0x7f428403a510 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:46:03.424 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:03.382+0000 7f42a0fb1700 1 --2- 192.168.123.105:0/1884979563 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f4284038050 0x7f428403a510 secure :-1 s=READY pgs=19 cs=0 l=1 rev1=1 crypto rx=0x7f4290006fd0 tx=0x7f4290006e40 comp rx=0 tx=0).stop 2026-03-10T07:46:03.424 
INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:03.382+0000 7f42a0fb1700 1 -- 192.168.123.105:0/1884979563 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f429c11d4f0 msgr2=0x7f429c11bb80 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:46:03.424 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:03.382+0000 7f42a0fb1700 1 --2- 192.168.123.105:0/1884979563 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f429c11d4f0 0x7f429c11bb80 secure :-1 s=READY pgs=62 cs=0 l=1 rev1=1 crypto rx=0x7f428c009cc0 tx=0x7f428c00bfa0 comp rx=0 tx=0).stop 2026-03-10T07:46:03.424 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:03.382+0000 7f42a0fb1700 1 -- 192.168.123.105:0/1884979563 shutdown_connections 2026-03-10T07:46:03.424 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:03.382+0000 7f42a0fb1700 1 --2- 192.168.123.105:0/1884979563 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f4284038050 0x7f428403a510 unknown :-1 s=CLOSED pgs=19 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:46:03.424 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:03.382+0000 7f42a0fb1700 1 --2- 192.168.123.105:0/1884979563 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f429c11d4f0 0x7f429c11bb80 unknown :-1 s=CLOSED pgs=62 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:46:03.424 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:03.382+0000 7f42a0fb1700 1 -- 192.168.123.105:0/1884979563 >> 192.168.123.105:0/1884979563 conn(0x7f429c06d320 msgr2=0x7f429c06dd70 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:46:03.424 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:03.383+0000 7f42a0fb1700 1 -- 192.168.123.105:0/1884979563 shutdown_connections 
2026-03-10T07:46:03.424 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:03.383+0000 7f42a0fb1700 1 -- 192.168.123.105:0/1884979563 wait complete. 2026-03-10T07:46:03.746 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:03.561+0000 7f0a69d3c700 1 Processor -- start 2026-03-10T07:46:03.746 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:03.561+0000 7f0a69d3c700 1 -- start start 2026-03-10T07:46:03.746 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:03.562+0000 7f0a69d3c700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f0a641087a0 0x7f0a64108bc0 unknown :-1 s=NONE pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:46:03.746 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:03.562+0000 7f0a69d3c700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f0a64109190 con 0x7f0a641087a0 2026-03-10T07:46:03.747 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:03.562+0000 7f0a68d3a700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f0a641087a0 0x7f0a64108bc0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:46:03.747 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:03.562+0000 7f0a68d3a700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f0a641087a0 0x7f0a64108bc0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.105:57660/0 (socket says 192.168.123.105:57660) 2026-03-10T07:46:03.747 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:03.562+0000 7f0a68d3a700 1 -- 
192.168.123.105:0/3916581614 learned_addr learned my addr 192.168.123.105:0/3916581614 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T07:46:03.747 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:03.563+0000 7f0a68d3a700 1 -- 192.168.123.105:0/3916581614 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f0a641099a0 con 0x7f0a641087a0 2026-03-10T07:46:03.747 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:03.564+0000 7f0a68d3a700 1 --2- 192.168.123.105:0/3916581614 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f0a641087a0 0x7f0a64108bc0 secure :-1 s=READY pgs=63 cs=0 l=1 rev1=1 crypto rx=0x7f0a58009a90 tx=0x7f0a58009da0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=f1ded4c9f150aaa0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:46:03.747 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:03.564+0000 7f0a637fe700 1 -- 192.168.123.105:0/3916581614 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f0a58004030 con 0x7f0a641087a0 2026-03-10T07:46:03.747 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:03.564+0000 7f0a637fe700 1 -- 192.168.123.105:0/3916581614 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f0a5800b7e0 con 0x7f0a641087a0 2026-03-10T07:46:03.747 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:03.565+0000 7f0a637fe700 1 -- 192.168.123.105:0/3916581614 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f0a58003ae0 con 0x7f0a641087a0 2026-03-10T07:46:03.747 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:03.566+0000 7f0a69d3c700 1 -- 192.168.123.105:0/3916581614 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f0a641087a0 
msgr2=0x7f0a64108bc0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:46:03.747 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:03.566+0000 7f0a69d3c700 1 --2- 192.168.123.105:0/3916581614 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f0a641087a0 0x7f0a64108bc0 secure :-1 s=READY pgs=63 cs=0 l=1 rev1=1 crypto rx=0x7f0a58009a90 tx=0x7f0a58009da0 comp rx=0 tx=0).stop 2026-03-10T07:46:03.747 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:03.569+0000 7f0a69d3c700 1 -- 192.168.123.105:0/3916581614 shutdown_connections 2026-03-10T07:46:03.747 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:03.569+0000 7f0a69d3c700 1 --2- 192.168.123.105:0/3916581614 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f0a641087a0 0x7f0a64108bc0 unknown :-1 s=CLOSED pgs=63 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:46:03.747 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:03.569+0000 7f0a69d3c700 1 -- 192.168.123.105:0/3916581614 >> 192.168.123.105:0/3916581614 conn(0x7f0a6407bc30 msgr2=0x7f0a641064e0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:46:03.747 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:03.569+0000 7f0a69d3c700 1 -- 192.168.123.105:0/3916581614 shutdown_connections 2026-03-10T07:46:03.747 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:03.570+0000 7f0a69d3c700 1 -- 192.168.123.105:0/3916581614 wait complete. 
2026-03-10T07:46:03.747 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:03.571+0000 7f0a69d3c700 1 Processor -- start 2026-03-10T07:46:03.747 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:03.571+0000 7f0a69d3c700 1 -- start start 2026-03-10T07:46:03.747 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:03.575+0000 7f0a69d3c700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f0a641087a0 0x7f0a641980b0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:46:03.747 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:03.575+0000 7f0a69d3c700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f0a641985f0 con 0x7f0a641087a0 2026-03-10T07:46:03.747 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:03.581+0000 7f0a68d3a700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f0a641087a0 0x7f0a641980b0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:46:03.747 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:03.581+0000 7f0a68d3a700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f0a641087a0 0x7f0a641980b0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.105:57674/0 (socket says 192.168.123.105:57674) 2026-03-10T07:46:03.747 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:03.581+0000 7f0a68d3a700 1 -- 192.168.123.105:0/3803491507 learned_addr learned my addr 192.168.123.105:0/3803491507 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T07:46:03.747 
INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:03.582+0000 7f0a68d3a700 1 -- 192.168.123.105:0/3803491507 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f0a58009740 con 0x7f0a641087a0 2026-03-10T07:46:03.747 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:03.582+0000 7f0a68d3a700 1 --2- 192.168.123.105:0/3803491507 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f0a641087a0 0x7f0a641980b0 secure :-1 s=READY pgs=64 cs=0 l=1 rev1=1 crypto rx=0x7f0a58009710 tx=0x7f0a58011700 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:46:03.747 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:03.582+0000 7f0a61ffb700 1 -- 192.168.123.105:0/3803491507 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f0a58011b80 con 0x7f0a641087a0 2026-03-10T07:46:03.747 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:03.582+0000 7f0a61ffb700 1 -- 192.168.123.105:0/3803491507 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f0a58011ce0 con 0x7f0a641087a0 2026-03-10T07:46:03.747 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:03.583+0000 7f0a61ffb700 1 -- 192.168.123.105:0/3803491507 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f0a58019630 con 0x7f0a641087a0 2026-03-10T07:46:03.747 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:03.583+0000 7f0a69d3c700 1 -- 192.168.123.105:0/3803491507 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f0a641987f0 con 0x7f0a641087a0 2026-03-10T07:46:03.747 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:03.583+0000 7f0a69d3c700 1 
-- 192.168.123.105:0/3803491507 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f0a64198c10 con 0x7f0a641087a0 2026-03-10T07:46:03.747 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:03.584+0000 7f0a69d3c700 1 -- 192.168.123.105:0/3803491507 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f0a640623c0 con 0x7f0a641087a0 2026-03-10T07:46:03.747 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:03.584+0000 7f0a61ffb700 1 -- 192.168.123.105:0/3803491507 <== mon.0 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 8) v1 ==== 45278+0+0 (secure 0 0 0) 0x7f0a58028020 con 0x7f0a641087a0 2026-03-10T07:46:03.747 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:03.585+0000 7f0a61ffb700 1 --2- 192.168.123.105:0/3803491507 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f0a4c037fe0 0x7f0a4c03a4a0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:46:03.747 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:03.585+0000 7f0a61ffb700 1 -- 192.168.123.105:0/3803491507 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(2..2 src has 1..2) v4 ==== 940+0+0 (secure 0 0 0) 0x7f0a5804c630 con 0x7f0a641087a0 2026-03-10T07:46:03.747 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:03.587+0000 7f0a63fff700 1 --2- 192.168.123.105:0/3803491507 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f0a4c037fe0 0x7f0a4c03a4a0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:46:03.747 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:03.587+0000 7f0a63fff700 1 --2- 192.168.123.105:0/3803491507 >> 
[v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f0a4c037fe0 0x7f0a4c03a4a0 secure :-1 s=READY pgs=20 cs=0 l=1 rev1=1 crypto rx=0x7f0a54006fd0 tx=0x7f0a54006e40 comp rx=0 tx=0).ready entity=mgr.14120 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:46:03.747 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:03.587+0000 7f0a61ffb700 1 -- 192.168.123.105:0/3803491507 <== mon.0 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7f0a58018350 con 0x7f0a641087a0 2026-03-10T07:46:03.747 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:03.709+0000 7f0a69d3c700 1 -- 192.168.123.105:0/3803491507 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command([{prefix=config set, name=mgr/cephadm/container_init}] v 0) v1 -- 0x7f0a64199970 con 0x7f0a641087a0 2026-03-10T07:46:03.747 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:03.714+0000 7f0a61ffb700 1 -- 192.168.123.105:0/3803491507 <== mon.0 v2:192.168.123.105:3300/0 7 ==== mon_command_ack([{prefix=config set, name=mgr/cephadm/container_init}]=0 v7) v1 ==== 142+0+0 (secure 0 0 0) 0x7f0a58021b90 con 0x7f0a641087a0 2026-03-10T07:46:03.747 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:03.717+0000 7f0a69d3c700 1 -- 192.168.123.105:0/3803491507 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f0a4c037fe0 msgr2=0x7f0a4c03a4a0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:46:03.747 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:03.717+0000 7f0a69d3c700 1 --2- 192.168.123.105:0/3803491507 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f0a4c037fe0 0x7f0a4c03a4a0 secure :-1 s=READY pgs=20 cs=0 l=1 rev1=1 crypto rx=0x7f0a54006fd0 tx=0x7f0a54006e40 comp rx=0 tx=0).stop 
2026-03-10T07:46:03.747 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:03.717+0000 7f0a69d3c700 1 -- 192.168.123.105:0/3803491507 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f0a641087a0 msgr2=0x7f0a641980b0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:46:03.747 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:03.718+0000 7f0a69d3c700 1 --2- 192.168.123.105:0/3803491507 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f0a641087a0 0x7f0a641980b0 secure :-1 s=READY pgs=64 cs=0 l=1 rev1=1 crypto rx=0x7f0a58009710 tx=0x7f0a58011700 comp rx=0 tx=0).stop 2026-03-10T07:46:03.747 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:03.718+0000 7f0a69d3c700 1 -- 192.168.123.105:0/3803491507 shutdown_connections 2026-03-10T07:46:03.747 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:03.718+0000 7f0a69d3c700 1 --2- 192.168.123.105:0/3803491507 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f0a4c037fe0 0x7f0a4c03a4a0 unknown :-1 s=CLOSED pgs=20 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:46:03.747 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:03.718+0000 7f0a69d3c700 1 --2- 192.168.123.105:0/3803491507 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f0a641087a0 0x7f0a641980b0 unknown :-1 s=CLOSED pgs=64 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:46:03.747 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:03.718+0000 7f0a69d3c700 1 -- 192.168.123.105:0/3803491507 >> 192.168.123.105:0/3803491507 conn(0x7f0a6407bc30 msgr2=0x7f0a64105d30 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:46:03.747 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:03.718+0000 7f0a69d3c700 1 -- 192.168.123.105:0/3803491507 
shutdown_connections 2026-03-10T07:46:03.747 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:03.718+0000 7f0a69d3c700 1 -- 192.168.123.105:0/3803491507 wait complete. 2026-03-10T07:46:04.020 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:03.872+0000 7fd09771c700 1 Processor -- start 2026-03-10T07:46:04.020 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:03.873+0000 7fd09771c700 1 -- start start 2026-03-10T07:46:04.020 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:03.873+0000 7fd09771c700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fd090108980 0x7fd090108da0 unknown :-1 s=NONE pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:46:04.020 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:03.873+0000 7fd09771c700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fd090109370 con 0x7fd090108980 2026-03-10T07:46:04.020 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:03.873+0000 7fd0954b8700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fd090108980 0x7fd090108da0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:46:04.020 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:03.873+0000 7fd0954b8700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fd090108980 0x7fd090108da0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.105:57690/0 (socket says 192.168.123.105:57690) 2026-03-10T07:46:04.020 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:03.873+0000 7fd0954b8700 1 -- 
192.168.123.105:0/2770847621 learned_addr learned my addr 192.168.123.105:0/2770847621 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T07:46:04.020 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:03.874+0000 7fd0954b8700 1 -- 192.168.123.105:0/2770847621 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fd090109b80 con 0x7fd090108980 2026-03-10T07:46:04.020 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:03.874+0000 7fd0954b8700 1 --2- 192.168.123.105:0/2770847621 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fd090108980 0x7fd090108da0 secure :-1 s=READY pgs=65 cs=0 l=1 rev1=1 crypto rx=0x7fd080009a90 tx=0x7fd080009da0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=30a2aff94f143fcf server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:46:04.020 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:03.874+0000 7fd087fff700 1 -- 192.168.123.105:0/2770847621 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fd080004030 con 0x7fd090108980 2026-03-10T07:46:04.020 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:03.874+0000 7fd087fff700 1 -- 192.168.123.105:0/2770847621 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fd08000b7e0 con 0x7fd090108980 2026-03-10T07:46:04.020 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:03.874+0000 7fd09771c700 1 -- 192.168.123.105:0/2770847621 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fd090108980 msgr2=0x7fd090108da0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:46:04.020 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:03.874+0000 7fd09771c700 1 --2- 192.168.123.105:0/2770847621 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] 
conn(0x7fd090108980 0x7fd090108da0 secure :-1 s=READY pgs=65 cs=0 l=1 rev1=1 crypto rx=0x7fd080009a90 tx=0x7fd080009da0 comp rx=0 tx=0).stop 2026-03-10T07:46:04.020 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:03.875+0000 7fd09771c700 1 -- 192.168.123.105:0/2770847621 shutdown_connections 2026-03-10T07:46:04.020 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:03.875+0000 7fd09771c700 1 --2- 192.168.123.105:0/2770847621 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fd090108980 0x7fd090108da0 unknown :-1 s=CLOSED pgs=65 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:46:04.020 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:03.875+0000 7fd09771c700 1 -- 192.168.123.105:0/2770847621 >> 192.168.123.105:0/2770847621 conn(0x7fd0901044d0 msgr2=0x7fd0901068c0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:46:04.020 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:03.875+0000 7fd09771c700 1 -- 192.168.123.105:0/2770847621 shutdown_connections 2026-03-10T07:46:04.020 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:03.875+0000 7fd09771c700 1 -- 192.168.123.105:0/2770847621 wait complete. 
2026-03-10T07:46:04.020 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:03.875+0000 7fd09771c700 1 Processor -- start 2026-03-10T07:46:04.020 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:03.875+0000 7fd09771c700 1 -- start start 2026-03-10T07:46:04.020 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:03.876+0000 7fd09771c700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fd090108980 0x7fd09019c770 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:46:04.020 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:03.876+0000 7fd09771c700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fd090109370 con 0x7fd090108980 2026-03-10T07:46:04.020 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:03.876+0000 7fd0954b8700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fd090108980 0x7fd09019c770 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:46:04.020 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:03.876+0000 7fd0954b8700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fd090108980 0x7fd09019c770 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.105:57692/0 (socket says 192.168.123.105:57692) 2026-03-10T07:46:04.020 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:03.876+0000 7fd0954b8700 1 -- 192.168.123.105:0/1901412910 learned_addr learned my addr 192.168.123.105:0/1901412910 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T07:46:04.020 
INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:03.876+0000 7fd0954b8700 1 -- 192.168.123.105:0/1901412910 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fd080009740 con 0x7fd090108980 2026-03-10T07:46:04.020 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:03.876+0000 7fd0954b8700 1 --2- 192.168.123.105:0/1901412910 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fd090108980 0x7fd09019c770 secure :-1 s=READY pgs=66 cs=0 l=1 rev1=1 crypto rx=0x7fd080006fa0 tx=0x7fd080003d50 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:46:04.020 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:03.877+0000 7fd0867fc700 1 -- 192.168.123.105:0/1901412910 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fd080004180 con 0x7fd090108980 2026-03-10T07:46:04.020 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:03.877+0000 7fd0867fc700 1 -- 192.168.123.105:0/1901412910 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fd0800042e0 con 0x7fd090108980 2026-03-10T07:46:04.020 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:03.877+0000 7fd0867fc700 1 -- 192.168.123.105:0/1901412910 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fd080011650 con 0x7fd090108980 2026-03-10T07:46:04.020 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:03.877+0000 7fd09771c700 1 -- 192.168.123.105:0/1901412910 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fd09019ccb0 con 0x7fd090108980 2026-03-10T07:46:04.020 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:03.877+0000 7fd09771c700 1 
-- 192.168.123.105:0/1901412910 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fd09019d0d0 con 0x7fd090108980 2026-03-10T07:46:04.020 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:03.878+0000 7fd0867fc700 1 -- 192.168.123.105:0/1901412910 <== mon.0 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 8) v1 ==== 45278+0+0 (secure 0 0 0) 0x7fd0800117b0 con 0x7fd090108980 2026-03-10T07:46:04.020 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:03.878+0000 7fd0867fc700 1 --2- 192.168.123.105:0/1901412910 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fd07c038330 0x7fd07c03a7f0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:46:04.020 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:03.878+0000 7fd0867fc700 1 -- 192.168.123.105:0/1901412910 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(2..2 src has 1..2) v4 ==== 940+0+0 (secure 0 0 0) 0x7fd08004c0a0 con 0x7fd090108980 2026-03-10T07:46:04.020 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:03.878+0000 7fd09771c700 1 -- 192.168.123.105:0/1901412910 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fd09004fa20 con 0x7fd090108980 2026-03-10T07:46:04.020 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:03.881+0000 7fd094cb7700 1 --2- 192.168.123.105:0/1901412910 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fd07c038330 0x7fd07c03a7f0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:46:04.020 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:03.881+0000 7fd094cb7700 1 --2- 192.168.123.105:0/1901412910 >> 
[v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fd07c038330 0x7fd07c03a7f0 secure :-1 s=READY pgs=21 cs=0 l=1 rev1=1 crypto rx=0x7fd08c006fd0 tx=0x7fd08c006e40 comp rx=0 tx=0).ready entity=mgr.14120 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:46:04.020 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:03.881+0000 7fd0867fc700 1 -- 192.168.123.105:0/1901412910 <== mon.0 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7fd08001a430 con 0x7fd090108980 2026-03-10T07:46:04.020 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:03.983+0000 7fd09771c700 1 -- 192.168.123.105:0/1901412910 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command([{prefix=config set, name=mgr/dashboard/ssl_server_port}] v 0) v1 -- 0x7fd09019d380 con 0x7fd090108980 2026-03-10T07:46:04.020 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:03.987+0000 7fd0867fc700 1 -- 192.168.123.105:0/1901412910 <== mon.0 v2:192.168.123.105:3300/0 7 ==== mon_command_ack([{prefix=config set, name=mgr/dashboard/ssl_server_port}]=0 v8) v1 ==== 130+0+0 (secure 0 0 0) 0x7fd080021b90 con 0x7fd090108980 2026-03-10T07:46:04.020 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:03.992+0000 7fd09771c700 1 -- 192.168.123.105:0/1901412910 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fd07c038330 msgr2=0x7fd07c03a7f0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:46:04.020 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:03.992+0000 7fd09771c700 1 --2- 192.168.123.105:0/1901412910 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fd07c038330 0x7fd07c03a7f0 secure :-1 s=READY pgs=21 cs=0 l=1 rev1=1 crypto rx=0x7fd08c006fd0 tx=0x7fd08c006e40 comp rx=0 tx=0).stop 
2026-03-10T07:46:04.020 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:03.992+0000 7fd09771c700 1 -- 192.168.123.105:0/1901412910 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fd090108980 msgr2=0x7fd09019c770 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:46:04.021 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:03.992+0000 7fd09771c700 1 --2- 192.168.123.105:0/1901412910 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fd090108980 0x7fd09019c770 secure :-1 s=READY pgs=66 cs=0 l=1 rev1=1 crypto rx=0x7fd080006fa0 tx=0x7fd080003d50 comp rx=0 tx=0).stop 2026-03-10T07:46:04.021 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:03.992+0000 7fd09771c700 1 -- 192.168.123.105:0/1901412910 shutdown_connections 2026-03-10T07:46:04.021 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:03.992+0000 7fd09771c700 1 --2- 192.168.123.105:0/1901412910 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fd07c038330 0x7fd07c03a7f0 unknown :-1 s=CLOSED pgs=21 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:46:04.021 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:03.992+0000 7fd09771c700 1 --2- 192.168.123.105:0/1901412910 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fd090108980 0x7fd09019c770 unknown :-1 s=CLOSED pgs=66 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:46:04.021 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:03.992+0000 7fd09771c700 1 -- 192.168.123.105:0/1901412910 >> 192.168.123.105:0/1901412910 conn(0x7fd0901044d0 msgr2=0x7fd090106210 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:46:04.021 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:03.992+0000 7fd09771c700 1 -- 192.168.123.105:0/1901412910 
shutdown_connections 2026-03-10T07:46:04.021 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:03.992+0000 7fd09771c700 1 -- 192.168.123.105:0/1901412910 wait complete. 2026-03-10T07:46:04.021 INFO:teuthology.orchestra.run.vm05.stdout:Enabling the dashboard module... 2026-03-10T07:46:04.046 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:46:03 vm05 ceph-mon[50387]: from='client.14150 -' entity='client.admin' cmd=[{"prefix": "orch apply", "service_type": "prometheus", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T07:46:04.046 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:46:03 vm05 ceph-mon[50387]: Saving service prometheus spec with placement count:1 2026-03-10T07:46:04.046 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:46:03 vm05 ceph-mon[50387]: from='client.14152 -' entity='client.admin' cmd=[{"prefix": "orch apply", "service_type": "grafana", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T07:46:04.046 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:46:03 vm05 ceph-mon[50387]: Saving service grafana spec with placement count:1 2026-03-10T07:46:04.046 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:46:03 vm05 ceph-mon[50387]: from='mgr.14120 192.168.123.105:0/2' entity='mgr.vm05.blexke' 2026-03-10T07:46:04.046 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:46:03 vm05 ceph-mon[50387]: from='mgr.14120 192.168.123.105:0/2' entity='mgr.vm05.blexke' 2026-03-10T07:46:04.046 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:46:03 vm05 ceph-mon[50387]: from='mgr.14120 192.168.123.105:0/2' entity='mgr.vm05.blexke' 2026-03-10T07:46:04.046 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:46:03 vm05 ceph-mon[50387]: from='client.? 
192.168.123.105:0/3803491507' entity='client.admin' 2026-03-10T07:46:05.026 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:04.141+0000 7f88a978f700 1 Processor -- start 2026-03-10T07:46:05.026 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:04.142+0000 7f88a978f700 1 -- start start 2026-03-10T07:46:05.026 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:04.142+0000 7f88a978f700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f88a4104fb0 0x7f88a41073e0 unknown :-1 s=NONE pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:46:05.026 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:04.142+0000 7f88a978f700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f88a4074720 con 0x7f88a4104fb0 2026-03-10T07:46:05.026 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:04.142+0000 7f88a2ffd700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f88a4104fb0 0x7f88a41073e0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:46:05.026 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:04.142+0000 7f88a2ffd700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f88a4104fb0 0x7f88a41073e0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.105:57694/0 (socket says 192.168.123.105:57694) 2026-03-10T07:46:05.026 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:04.142+0000 7f88a2ffd700 1 -- 192.168.123.105:0/3040975437 learned_addr learned my addr 192.168.123.105:0/3040975437 (peer_addr_for_me v2:192.168.123.105:0/0) 
2026-03-10T07:46:05.026 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:04.143+0000 7f88a2ffd700 1 -- 192.168.123.105:0/3040975437 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f88a4107920 con 0x7f88a4104fb0 2026-03-10T07:46:05.026 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:04.143+0000 7f88a2ffd700 1 --2- 192.168.123.105:0/3040975437 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f88a4104fb0 0x7f88a41073e0 secure :-1 s=READY pgs=67 cs=0 l=1 rev1=1 crypto rx=0x7f888c009a90 tx=0x7f888c009da0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=c91e7d783d080b79 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:46:05.026 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:04.143+0000 7f88a1ffb700 1 -- 192.168.123.105:0/3040975437 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f888c004030 con 0x7f88a4104fb0 2026-03-10T07:46:05.026 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:04.143+0000 7f88a1ffb700 1 -- 192.168.123.105:0/3040975437 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f888c00b7e0 con 0x7f88a4104fb0 2026-03-10T07:46:05.026 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:04.143+0000 7f88a978f700 1 -- 192.168.123.105:0/3040975437 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f88a4104fb0 msgr2=0x7f88a41073e0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:46:05.026 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:04.143+0000 7f88a978f700 1 --2- 192.168.123.105:0/3040975437 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f88a4104fb0 0x7f88a41073e0 secure :-1 s=READY pgs=67 cs=0 l=1 rev1=1 crypto rx=0x7f888c009a90 tx=0x7f888c009da0 comp rx=0 
tx=0).stop 2026-03-10T07:46:05.026 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:04.144+0000 7f88a978f700 1 -- 192.168.123.105:0/3040975437 shutdown_connections 2026-03-10T07:46:05.026 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:04.144+0000 7f88a978f700 1 --2- 192.168.123.105:0/3040975437 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f88a4104fb0 0x7f88a41073e0 unknown :-1 s=CLOSED pgs=67 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:46:05.026 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:04.144+0000 7f88a978f700 1 -- 192.168.123.105:0/3040975437 >> 192.168.123.105:0/3040975437 conn(0x7f88a4100bd0 msgr2=0x7f88a4103030 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:46:05.026 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:04.144+0000 7f88a978f700 1 -- 192.168.123.105:0/3040975437 shutdown_connections 2026-03-10T07:46:05.026 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:04.144+0000 7f88a978f700 1 -- 192.168.123.105:0/3040975437 wait complete. 
2026-03-10T07:46:05.026 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:04.144+0000 7f88a978f700 1 Processor -- start 2026-03-10T07:46:05.026 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:04.144+0000 7f88a978f700 1 -- start start 2026-03-10T07:46:05.026 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:04.145+0000 7f88a978f700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f88a41a0bd0 0x7f88a41a1010 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:46:05.026 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:04.145+0000 7f88a978f700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f88a4074720 con 0x7f88a41a0bd0 2026-03-10T07:46:05.026 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:04.145+0000 7f88a2ffd700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f88a41a0bd0 0x7f88a41a1010 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:46:05.027 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:04.145+0000 7f88a2ffd700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f88a41a0bd0 0x7f88a41a1010 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.105:57696/0 (socket says 192.168.123.105:57696) 2026-03-10T07:46:05.027 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:04.145+0000 7f88a2ffd700 1 -- 192.168.123.105:0/3901385925 learned_addr learned my addr 192.168.123.105:0/3901385925 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T07:46:05.027 
INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:04.145+0000 7f88a2ffd700 1 -- 192.168.123.105:0/3901385925 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f888c009740 con 0x7f88a41a0bd0 2026-03-10T07:46:05.027 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:04.145+0000 7f88a2ffd700 1 --2- 192.168.123.105:0/3901385925 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f88a41a0bd0 0x7f88a41a1010 secure :-1 s=READY pgs=68 cs=0 l=1 rev1=1 crypto rx=0x7f888c009130 tx=0x7f888c00be30 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:46:05.027 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:04.145+0000 7f889bfff700 1 -- 192.168.123.105:0/3901385925 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f888c003f60 con 0x7f88a41a0bd0 2026-03-10T07:46:05.027 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:04.145+0000 7f889bfff700 1 -- 192.168.123.105:0/3901385925 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f888c0045a0 con 0x7f88a41a0bd0 2026-03-10T07:46:05.027 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:04.145+0000 7f88a978f700 1 -- 192.168.123.105:0/3901385925 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f88a41a1550 con 0x7f88a41a0bd0 2026-03-10T07:46:05.027 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:04.146+0000 7f88a978f700 1 -- 192.168.123.105:0/3901385925 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f88a41a4190 con 0x7f88a41a0bd0 2026-03-10T07:46:05.027 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:04.146+0000 7f889bfff700 1 
-- 192.168.123.105:0/3901385925 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f888c01ad80 con 0x7f88a41a0bd0 2026-03-10T07:46:05.027 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:04.146+0000 7f889bfff700 1 -- 192.168.123.105:0/3901385925 <== mon.0 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 8) v1 ==== 45278+0+0 (secure 0 0 0) 0x7f888c01a6f0 con 0x7f88a41a0bd0 2026-03-10T07:46:05.027 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:04.146+0000 7f889bfff700 1 --2- 192.168.123.105:0/3901385925 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f8890038390 0x7f889003a850 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:46:05.027 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:04.146+0000 7f889bfff700 1 -- 192.168.123.105:0/3901385925 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(2..2 src has 1..2) v4 ==== 940+0+0 (secure 0 0 0) 0x7f888c04ba20 con 0x7f88a41a0bd0 2026-03-10T07:46:05.027 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:04.147+0000 7f88a27fc700 1 --2- 192.168.123.105:0/3901385925 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f8890038390 0x7f889003a850 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:46:05.027 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:04.147+0000 7f88a27fc700 1 --2- 192.168.123.105:0/3901385925 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f8890038390 0x7f889003a850 secure :-1 s=READY pgs=22 cs=0 l=1 rev1=1 crypto rx=0x7f8894006fd0 tx=0x7f8894006e40 comp rx=0 tx=0).ready entity=mgr.14120 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:46:05.027 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 
2026-03-10T07:46:04.147+0000 7f88a978f700 1 -- 192.168.123.105:0/3901385925 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f88a419a290 con 0x7f88a41a0bd0 2026-03-10T07:46:05.027 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:04.150+0000 7f889bfff700 1 -- 192.168.123.105:0/3901385925 <== mon.0 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7f888c011420 con 0x7f88a41a0bd0 2026-03-10T07:46:05.027 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:04.277+0000 7f88a978f700 1 -- 192.168.123.105:0/3901385925 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "mgr module enable", "module": "dashboard"} v 0) v1 -- 0x7f88a404fa20 con 0x7f88a41a0bd0 2026-03-10T07:46:05.027 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:04.991+0000 7f889bfff700 1 -- 192.168.123.105:0/3901385925 <== mon.0 v2:192.168.123.105:3300/0 7 ==== mgrmap(e 9) v1 ==== 45291+0+0 (secure 0 0 0) 0x7f888c0040c0 con 0x7f88a41a0bd0 2026-03-10T07:46:05.027 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:04.991+0000 7f889bfff700 1 -- 192.168.123.105:0/3901385925 <== mon.0 v2:192.168.123.105:3300/0 8 ==== mon_command_ack([{"prefix": "mgr module enable", "module": "dashboard"}]=0 v9) v1 ==== 88+0+0 (secure 0 0 0) 0x7f888c04e580 con 0x7f88a41a0bd0 2026-03-10T07:46:05.027 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:04.994+0000 7f88a978f700 1 -- 192.168.123.105:0/3901385925 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f8890038390 msgr2=0x7f889003a850 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:46:05.027 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:04.994+0000 7f88a978f700 1 
--2- 192.168.123.105:0/3901385925 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f8890038390 0x7f889003a850 secure :-1 s=READY pgs=22 cs=0 l=1 rev1=1 crypto rx=0x7f8894006fd0 tx=0x7f8894006e40 comp rx=0 tx=0).stop 2026-03-10T07:46:05.027 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:04.994+0000 7f88a978f700 1 -- 192.168.123.105:0/3901385925 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f88a41a0bd0 msgr2=0x7f88a41a1010 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:46:05.027 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:04.994+0000 7f88a978f700 1 --2- 192.168.123.105:0/3901385925 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f88a41a0bd0 0x7f88a41a1010 secure :-1 s=READY pgs=68 cs=0 l=1 rev1=1 crypto rx=0x7f888c009130 tx=0x7f888c00be30 comp rx=0 tx=0).stop 2026-03-10T07:46:05.027 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:04.994+0000 7f88a978f700 1 -- 192.168.123.105:0/3901385925 shutdown_connections 2026-03-10T07:46:05.027 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:04.994+0000 7f88a978f700 1 --2- 192.168.123.105:0/3901385925 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f8890038390 0x7f889003a850 unknown :-1 s=CLOSED pgs=22 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:46:05.027 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:04.994+0000 7f88a978f700 1 --2- 192.168.123.105:0/3901385925 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f88a41a0bd0 0x7f88a41a1010 unknown :-1 s=CLOSED pgs=68 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:46:05.027 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:04.994+0000 7f88a978f700 1 -- 192.168.123.105:0/3901385925 >> 192.168.123.105:0/3901385925 conn(0x7f88a4100bd0 
msgr2=0x7f88a41071d0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:46:05.027 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:04.994+0000 7f88a978f700 1 -- 192.168.123.105:0/3901385925 shutdown_connections 2026-03-10T07:46:05.027 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:04.994+0000 7f88a978f700 1 -- 192.168.123.105:0/3901385925 wait complete. 2026-03-10T07:46:05.330 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:46:04 vm05 ceph-mon[50387]: from='client.14154 -' entity='client.admin' cmd=[{"prefix": "orch apply", "service_type": "node-exporter", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T07:46:05.330 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:46:04 vm05 ceph-mon[50387]: Saving service node-exporter spec with placement * 2026-03-10T07:46:05.330 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:46:04 vm05 ceph-mon[50387]: from='client.14156 -' entity='client.admin' cmd=[{"prefix": "orch apply", "service_type": "alertmanager", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T07:46:05.330 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:46:04 vm05 ceph-mon[50387]: Saving service alertmanager spec with placement count:1 2026-03-10T07:46:05.330 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:46:04 vm05 ceph-mon[50387]: from='client.? 192.168.123.105:0/1901412910' entity='client.admin' 2026-03-10T07:46:05.331 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:46:04 vm05 ceph-mon[50387]: from='client.? 
192.168.123.105:0/3901385925' entity='client.admin' cmd=[{"prefix": "mgr module enable", "module": "dashboard"}]: dispatch 2026-03-10T07:46:05.365 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout { 2026-03-10T07:46:05.365 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "epoch": 9, 2026-03-10T07:46:05.365 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "available": true, 2026-03-10T07:46:05.365 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "active_name": "vm05.blexke", 2026-03-10T07:46:05.365 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "num_standby": 0 2026-03-10T07:46:05.365 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout } 2026-03-10T07:46:05.365 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:05.190+0000 7f400e113700 1 Processor -- start 2026-03-10T07:46:05.365 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:05.190+0000 7f400e113700 1 -- start start 2026-03-10T07:46:05.366 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:05.190+0000 7f400e113700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f40080721d0 0x7f40080725f0 unknown :-1 s=NONE pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:46:05.366 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:05.190+0000 7f400e113700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f4008072bc0 con 0x7f40080721d0 2026-03-10T07:46:05.366 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:05.191+0000 7f400d111700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f40080721d0 0x7f40080725f0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:46:05.366 
INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:05.191+0000 7f400d111700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f40080721d0 0x7f40080725f0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.105:57722/0 (socket says 192.168.123.105:57722) 2026-03-10T07:46:05.366 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:05.191+0000 7f400d111700 1 -- 192.168.123.105:0/4205755282 learned_addr learned my addr 192.168.123.105:0/4205755282 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T07:46:05.366 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:05.191+0000 7f400d111700 1 -- 192.168.123.105:0/4205755282 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f400810e1c0 con 0x7f40080721d0 2026-03-10T07:46:05.366 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:05.191+0000 7f400d111700 1 --2- 192.168.123.105:0/4205755282 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f40080721d0 0x7f40080725f0 secure :-1 s=READY pgs=71 cs=0 l=1 rev1=1 crypto rx=0x7f400400d180 tx=0x7f400400d490 comp rx=0 tx=0).ready entity=mon.0 client_cookie=9d57e2d2747bb549 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:46:05.366 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:05.192+0000 7f3ffffff700 1 -- 192.168.123.105:0/4205755282 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f4004010070 con 0x7f40080721d0 2026-03-10T07:46:05.366 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:05.192+0000 7f3ffffff700 1 -- 192.168.123.105:0/4205755282 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f4004004510 con 0x7f40080721d0 
2026-03-10T07:46:05.366 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:05.192+0000 7f400e113700 1 -- 192.168.123.105:0/4205755282 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f40080721d0 msgr2=0x7f40080725f0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:46:05.366 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:05.192+0000 7f400e113700 1 --2- 192.168.123.105:0/4205755282 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f40080721d0 0x7f40080725f0 secure :-1 s=READY pgs=71 cs=0 l=1 rev1=1 crypto rx=0x7f400400d180 tx=0x7f400400d490 comp rx=0 tx=0).stop 2026-03-10T07:46:05.366 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:05.192+0000 7f400e113700 1 -- 192.168.123.105:0/4205755282 shutdown_connections 2026-03-10T07:46:05.366 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:05.192+0000 7f400e113700 1 --2- 192.168.123.105:0/4205755282 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f40080721d0 0x7f40080725f0 unknown :-1 s=CLOSED pgs=71 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:46:05.366 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:05.192+0000 7f400e113700 1 -- 192.168.123.105:0/4205755282 >> 192.168.123.105:0/4205755282 conn(0x7f400806d320 msgr2=0x7f400806f760 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:46:05.366 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:05.192+0000 7f400e113700 1 -- 192.168.123.105:0/4205755282 shutdown_connections 2026-03-10T07:46:05.366 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:05.192+0000 7f400e113700 1 -- 192.168.123.105:0/4205755282 wait complete. 
2026-03-10T07:46:05.366 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:05.193+0000 7f400e113700 1 Processor -- start 2026-03-10T07:46:05.366 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:05.193+0000 7f400e113700 1 -- start start 2026-03-10T07:46:05.366 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:05.193+0000 7f400e113700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f40081a09c0 0x7f40081a0de0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:46:05.366 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:05.193+0000 7f400e113700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f4004003c20 con 0x7f40081a09c0 2026-03-10T07:46:05.366 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:05.193+0000 7f400d111700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f40081a09c0 0x7f40081a0de0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:46:05.366 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:05.193+0000 7f400d111700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f40081a09c0 0x7f40081a0de0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.105:57732/0 (socket says 192.168.123.105:57732) 2026-03-10T07:46:05.366 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:05.193+0000 7f400d111700 1 -- 192.168.123.105:0/3521520082 learned_addr learned my addr 192.168.123.105:0/3521520082 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T07:46:05.366 
INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:05.193+0000 7f400d111700 1 -- 192.168.123.105:0/3521520082 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f40040087c0 con 0x7f40081a09c0 2026-03-10T07:46:05.366 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:05.193+0000 7f400d111700 1 --2- 192.168.123.105:0/3521520082 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f40081a09c0 0x7f40081a0de0 secure :-1 s=READY pgs=72 cs=0 l=1 rev1=1 crypto rx=0x7f4004008c10 tx=0x7f4004008cf0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:46:05.366 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:05.194+0000 7f3ffe7fc700 1 -- 192.168.123.105:0/3521520082 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f4004010050 con 0x7f40081a09c0 2026-03-10T07:46:05.366 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:05.194+0000 7f400e113700 1 -- 192.168.123.105:0/3521520082 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f40081a1320 con 0x7f40081a09c0 2026-03-10T07:46:05.366 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:05.194+0000 7f400e113700 1 -- 192.168.123.105:0/3521520082 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f40081a3fa0 con 0x7f40081a09c0 2026-03-10T07:46:05.366 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:05.194+0000 7f3ffe7fc700 1 -- 192.168.123.105:0/3521520082 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f400400deb0 con 0x7f40081a09c0 2026-03-10T07:46:05.366 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:05.194+0000 7f3ffe7fc700 1 
-- 192.168.123.105:0/3521520082 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f4004016440 con 0x7f40081a09c0 2026-03-10T07:46:05.366 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:05.195+0000 7f400e113700 1 -- 192.168.123.105:0/3521520082 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f3fec005320 con 0x7f40081a09c0 2026-03-10T07:46:05.366 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:05.198+0000 7f3ffe7fc700 1 -- 192.168.123.105:0/3521520082 <== mon.0 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 9) v1 ==== 45291+0+0 (secure 0 0 0) 0x7f40040041f0 con 0x7f40081a09c0 2026-03-10T07:46:05.366 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:05.198+0000 7f3ffe7fc700 1 --2- 192.168.123.105:0/3521520082 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f3ff4038430 0x7f3ff403a8f0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:46:05.366 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:05.198+0000 7f400c910700 1 -- 192.168.123.105:0/3521520082 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f3ff4038430 msgr2=0x7f3ff403a8f0 unknown :-1 s=STATE_CONNECTING_RE l=1).process reconnect failed to v2:192.168.123.105:6800/2 2026-03-10T07:46:05.366 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:05.198+0000 7f400c910700 1 --2- 192.168.123.105:0/3521520082 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f3ff4038430 0x7f3ff403a8f0 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._fault waiting 0.200000 2026-03-10T07:46:05.366 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:05.198+0000 7f3ffe7fc700 1 -- 192.168.123.105:0/3521520082 <== mon.0 
v2:192.168.123.105:3300/0 5 ==== osd_map(2..2 src has 1..2) v4 ==== 940+0+0 (secure 0 0 0) 0x7f400404c220 con 0x7f40081a09c0 2026-03-10T07:46:05.366 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:05.198+0000 7f3ffe7fc700 1 -- 192.168.123.105:0/3521520082 <== mon.0 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7f400404c650 con 0x7f40081a09c0 2026-03-10T07:46:05.366 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:05.329+0000 7f400e113700 1 -- 192.168.123.105:0/3521520082 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "mgr stat"} v 0) v1 -- 0x7f3fec005cc0 con 0x7f40081a09c0 2026-03-10T07:46:05.366 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:05.330+0000 7f3ffe7fc700 1 -- 192.168.123.105:0/3521520082 <== mon.0 v2:192.168.123.105:3300/0 7 ==== mon_command_ack([{"prefix": "mgr stat"}]=0 v9) v1 ==== 56+0+98 (secure 0 0 0) 0x7f400401b030 con 0x7f40081a09c0 2026-03-10T07:46:05.366 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:05.335+0000 7f3ff3fff700 1 -- 192.168.123.105:0/3521520082 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f3ff4038430 msgr2=0x7f3ff403a8f0 unknown :-1 s=STATE_CONNECTING l=1).mark_down 2026-03-10T07:46:05.366 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:05.335+0000 7f3ff3fff700 1 --2- 192.168.123.105:0/3521520082 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f3ff4038430 0x7f3ff403a8f0 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:46:05.367 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:05.335+0000 7f3ff3fff700 1 -- 192.168.123.105:0/3521520082 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f40081a09c0 
msgr2=0x7f40081a0de0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:46:05.367 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:05.335+0000 7f3ff3fff700 1 --2- 192.168.123.105:0/3521520082 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f40081a09c0 0x7f40081a0de0 secure :-1 s=READY pgs=72 cs=0 l=1 rev1=1 crypto rx=0x7f4004008c10 tx=0x7f4004008cf0 comp rx=0 tx=0).stop 2026-03-10T07:46:05.367 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:05.337+0000 7f3ff3fff700 1 -- 192.168.123.105:0/3521520082 shutdown_connections 2026-03-10T07:46:05.367 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:05.337+0000 7f3ff3fff700 1 --2- 192.168.123.105:0/3521520082 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f3ff4038430 0x7f3ff403a8f0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:46:05.367 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:05.337+0000 7f3ff3fff700 1 --2- 192.168.123.105:0/3521520082 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f40081a09c0 0x7f40081a0de0 unknown :-1 s=CLOSED pgs=72 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:46:05.367 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:05.337+0000 7f3ff3fff700 1 -- 192.168.123.105:0/3521520082 >> 192.168.123.105:0/3521520082 conn(0x7f400806d320 msgr2=0x7f400806de90 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:46:05.367 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:05.337+0000 7f3ff3fff700 1 -- 192.168.123.105:0/3521520082 shutdown_connections 2026-03-10T07:46:05.367 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:05.337+0000 7f3ff3fff700 1 -- 192.168.123.105:0/3521520082 wait complete. 
2026-03-10T07:46:05.367 INFO:teuthology.orchestra.run.vm05.stdout:Waiting for the mgr to restart... 2026-03-10T07:46:05.367 INFO:teuthology.orchestra.run.vm05.stdout:Waiting for mgr epoch 9... 2026-03-10T07:46:06.407 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:46:06 vm05 ceph-mon[50387]: from='client.? 192.168.123.105:0/3901385925' entity='client.admin' cmd='[{"prefix": "mgr module enable", "module": "dashboard"}]': finished 2026-03-10T07:46:06.407 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:46:06 vm05 ceph-mon[50387]: mgrmap e9: vm05.blexke(active, since 9s) 2026-03-10T07:46:06.407 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:46:06 vm05 ceph-mon[50387]: from='client.? 192.168.123.105:0/3521520082' entity='client.admin' cmd=[{"prefix": "mgr stat"}]: dispatch 2026-03-10T07:46:09.908 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:46:09 vm05 ceph-mon[50387]: Active manager daemon vm05.blexke restarted 2026-03-10T07:46:09.908 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:46:09 vm05 ceph-mon[50387]: Activating manager daemon vm05.blexke 2026-03-10T07:46:09.908 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:46:09 vm05 ceph-mon[50387]: osdmap e3: 0 total, 0 up, 0 in 2026-03-10T07:46:09.908 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:46:09 vm05 ceph-mon[50387]: mgrmap e10: vm05.blexke(active, starting, since 0.00513863s) 2026-03-10T07:46:09.908 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:46:09 vm05 ceph-mon[50387]: from='mgr.14164 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "mon metadata", "id": "vm05"}]: dispatch 2026-03-10T07:46:09.908 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:46:09 vm05 ceph-mon[50387]: from='mgr.14164 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "mgr metadata", "who": "vm05.blexke", "id": "vm05.blexke"}]: dispatch 2026-03-10T07:46:09.908 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:46:09 vm05 ceph-mon[50387]: from='mgr.14164 192.168.123.105:0/2' 
entity='mgr.vm05.blexke' cmd=[{"prefix": "mds metadata"}]: dispatch 2026-03-10T07:46:09.908 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:46:09 vm05 ceph-mon[50387]: from='mgr.14164 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "osd metadata"}]: dispatch 2026-03-10T07:46:09.908 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:46:09 vm05 ceph-mon[50387]: from='mgr.14164 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "mon metadata"}]: dispatch 2026-03-10T07:46:09.908 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:46:09 vm05 ceph-mon[50387]: Manager daemon vm05.blexke is now available 2026-03-10T07:46:09.908 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:46:09 vm05 ceph-mon[50387]: from='mgr.14164 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T07:46:09.908 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:46:09 vm05 ceph-mon[50387]: from='mgr.14164 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T07:46:09.908 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:46:09 vm05 ceph-mon[50387]: from='mgr.14164 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/vm05.blexke/mirror_snapshot_schedule"}]: dispatch 2026-03-10T07:46:10.567 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout { 2026-03-10T07:46:10.567 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "mgrmap_epoch": 11, 2026-03-10T07:46:10.567 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout "initialized": true 2026-03-10T07:46:10.567 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout } 2026-03-10T07:46:10.567 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:05.504+0000 7f78aef16700 1 Processor -- start 2026-03-10T07:46:10.567 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 
2026-03-10T07:46:05.504+0000 7f78aef16700 1 -- start start 2026-03-10T07:46:10.567 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:05.504+0000 7f78aef16700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f78a00a4de0 0x7f78a00a5200 unknown :-1 s=NONE pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:46:10.567 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:05.504+0000 7f78aef16700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f78a00a57d0 con 0x7f78a00a4de0 2026-03-10T07:46:10.567 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:05.504+0000 7f78accb2700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f78a00a4de0 0x7f78a00a5200 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:46:10.567 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:05.504+0000 7f78accb2700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f78a00a4de0 0x7f78a00a5200 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.105:57736/0 (socket says 192.168.123.105:57736) 2026-03-10T07:46:10.567 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:05.504+0000 7f78accb2700 1 -- 192.168.123.105:0/28278292 learned_addr learned my addr 192.168.123.105:0/28278292 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T07:46:10.567 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:05.505+0000 7f78accb2700 1 -- 192.168.123.105:0/28278292 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f78a00a5fe0 con 0x7f78a00a4de0 
2026-03-10T07:46:10.567 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:05.505+0000 7f78accb2700 1 --2- 192.168.123.105:0/28278292 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f78a00a4de0 0x7f78a00a5200 secure :-1 s=READY pgs=73 cs=0 l=1 rev1=1 crypto rx=0x7f789c00d180 tx=0x7f789c00d490 comp rx=0 tx=0).ready entity=mon.0 client_cookie=e241e3501e615f58 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:46:10.567 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:05.505+0000 7f78a77fe700 1 -- 192.168.123.105:0/28278292 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f789c010070 con 0x7f78a00a4de0 2026-03-10T07:46:10.567 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:05.506+0000 7f78a77fe700 1 -- 192.168.123.105:0/28278292 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f789c004510 con 0x7f78a00a4de0 2026-03-10T07:46:10.567 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:05.506+0000 7f78aef16700 1 -- 192.168.123.105:0/28278292 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f78a00a4de0 msgr2=0x7f78a00a5200 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:46:10.567 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:05.506+0000 7f78aef16700 1 --2- 192.168.123.105:0/28278292 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f78a00a4de0 0x7f78a00a5200 secure :-1 s=READY pgs=73 cs=0 l=1 rev1=1 crypto rx=0x7f789c00d180 tx=0x7f789c00d490 comp rx=0 tx=0).stop 2026-03-10T07:46:10.567 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:05.506+0000 7f78aef16700 1 -- 192.168.123.105:0/28278292 shutdown_connections 2026-03-10T07:46:10.567 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:05.506+0000 
7f78aef16700 1 --2- 192.168.123.105:0/28278292 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f78a00a4de0 0x7f78a00a5200 unknown :-1 s=CLOSED pgs=73 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:46:10.567 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:05.506+0000 7f78aef16700 1 -- 192.168.123.105:0/28278292 >> 192.168.123.105:0/28278292 conn(0x7f78a009ff10 msgr2=0x7f78a00a2370 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:46:10.567 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:05.506+0000 7f78aef16700 1 -- 192.168.123.105:0/28278292 shutdown_connections 2026-03-10T07:46:10.567 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:05.506+0000 7f78aef16700 1 -- 192.168.123.105:0/28278292 wait complete. 2026-03-10T07:46:10.567 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:05.507+0000 7f78aef16700 1 Processor -- start 2026-03-10T07:46:10.567 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:05.507+0000 7f78aef16700 1 -- start start 2026-03-10T07:46:10.567 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:05.507+0000 7f78aef16700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f78a0138780 0x7f78a0138ba0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:46:10.567 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:05.507+0000 7f78aef16700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f789c003c20 con 0x7f78a0138780 2026-03-10T07:46:10.567 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:05.507+0000 7f78accb2700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f78a0138780 0x7f78a0138ba0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 
rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:46:10.567 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:05.507+0000 7f78accb2700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f78a0138780 0x7f78a0138ba0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.105:57748/0 (socket says 192.168.123.105:57748) 2026-03-10T07:46:10.567 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:05.507+0000 7f78accb2700 1 -- 192.168.123.105:0/391228375 learned_addr learned my addr 192.168.123.105:0/391228375 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T07:46:10.567 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:05.507+0000 7f78accb2700 1 -- 192.168.123.105:0/391228375 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f789c0087c0 con 0x7f78a0138780 2026-03-10T07:46:10.567 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:05.507+0000 7f78accb2700 1 --2- 192.168.123.105:0/391228375 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f78a0138780 0x7f78a0138ba0 secure :-1 s=READY pgs=74 cs=0 l=1 rev1=1 crypto rx=0x7f789c008c10 tx=0x7f789c008cf0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:46:10.568 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:05.508+0000 7f78a5ffb700 1 -- 192.168.123.105:0/391228375 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f789c010050 con 0x7f78a0138780 2026-03-10T07:46:10.568 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:05.508+0000 7f78aef16700 1 -- 192.168.123.105:0/391228375 --> 
[v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f78a01390e0 con 0x7f78a0138780 2026-03-10T07:46:10.568 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:05.508+0000 7f78aef16700 1 -- 192.168.123.105:0/391228375 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f78a0139d50 con 0x7f78a0138780 2026-03-10T07:46:10.568 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:05.508+0000 7f78a5ffb700 1 -- 192.168.123.105:0/391228375 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f789c00deb0 con 0x7f78a0138780 2026-03-10T07:46:10.568 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:05.508+0000 7f78a5ffb700 1 -- 192.168.123.105:0/391228375 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f789c016440 con 0x7f78a0138780 2026-03-10T07:46:10.568 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:05.509+0000 7f78a5ffb700 1 -- 192.168.123.105:0/391228375 <== mon.0 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 9) v1 ==== 45291+0+0 (secure 0 0 0) 0x7f789c0041f0 con 0x7f78a0138780 2026-03-10T07:46:10.568 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:05.509+0000 7f78a5ffb700 1 --2- 192.168.123.105:0/391228375 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f7898038430 0x7f789803a8f0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:46:10.568 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:05.509+0000 7f78a7fff700 1 -- 192.168.123.105:0/391228375 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f7898038430 msgr2=0x7f789803a8f0 unknown :-1 s=STATE_CONNECTING_RE l=1).process reconnect failed to v2:192.168.123.105:6800/2 2026-03-10T07:46:10.568 
INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:05.509+0000 7f78a7fff700 1 --2- 192.168.123.105:0/391228375 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f7898038430 0x7f789803a8f0 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._fault waiting 0.200000 2026-03-10T07:46:10.568 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:05.510+0000 7f78a5ffb700 1 -- 192.168.123.105:0/391228375 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(2..2 src has 1..2) v4 ==== 940+0+0 (secure 0 0 0) 0x7f789c04b790 con 0x7f78a0138780 2026-03-10T07:46:10.568 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:05.510+0000 7f78aef16700 1 -- 192.168.123.105:0/391228375 --> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] -- command(tid 0: {"prefix": "get_command_descriptions"}) v1 -- 0x7f788c000d40 con 0x7f7898038430 2026-03-10T07:46:10.568 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:05.710+0000 7f78a7fff700 1 -- 192.168.123.105:0/391228375 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f7898038430 msgr2=0x7f789803a8f0 unknown :-1 s=STATE_CONNECTING_RE l=1).process reconnect failed to v2:192.168.123.105:6800/2 2026-03-10T07:46:10.568 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:05.710+0000 7f78a7fff700 1 --2- 192.168.123.105:0/391228375 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f7898038430 0x7f789803a8f0 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._fault waiting 0.400000 2026-03-10T07:46:10.568 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:06.111+0000 7f78a7fff700 1 -- 192.168.123.105:0/391228375 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f7898038430 msgr2=0x7f789803a8f0 unknown :-1 s=STATE_CONNECTING_RE l=1).process reconnect failed 
to v2:192.168.123.105:6800/2 2026-03-10T07:46:10.568 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:06.111+0000 7f78a7fff700 1 --2- 192.168.123.105:0/391228375 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f7898038430 0x7f789803a8f0 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._fault waiting 0.800000 2026-03-10T07:46:10.568 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:06.912+0000 7f78a7fff700 1 -- 192.168.123.105:0/391228375 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f7898038430 msgr2=0x7f789803a8f0 unknown :-1 s=STATE_CONNECTING_RE l=1).process reconnect failed to v2:192.168.123.105:6800/2 2026-03-10T07:46:10.568 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:06.912+0000 7f78a7fff700 1 --2- 192.168.123.105:0/391228375 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f7898038430 0x7f789803a8f0 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._fault waiting 1.600000 2026-03-10T07:46:10.568 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:08.513+0000 7f78a7fff700 1 -- 192.168.123.105:0/391228375 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f7898038430 msgr2=0x7f789803a8f0 unknown :-1 s=STATE_CONNECTING_RE l=1).process reconnect failed to v2:192.168.123.105:6800/2 2026-03-10T07:46:10.568 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:08.513+0000 7f78a7fff700 1 --2- 192.168.123.105:0/391228375 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f7898038430 0x7f789803a8f0 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._fault waiting 3.200000 2026-03-10T07:46:10.568 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:09.524+0000 7f78a5ffb700 1 -- 192.168.123.105:0/391228375 <== 
mon.0 v2:192.168.123.105:3300/0 6 ==== mgrmap(e 10) v1 ==== 45058+0+0 (secure 0 0 0) 0x7f789c0044a0 con 0x7f78a0138780 2026-03-10T07:46:10.568 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:09.525+0000 7f78a5ffb700 1 -- 192.168.123.105:0/391228375 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f7898038430 msgr2=0x7f789803a8f0 unknown :-1 s=STATE_CONNECTING l=1).mark_down 2026-03-10T07:46:10.568 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:09.525+0000 7f78a5ffb700 1 --2- 192.168.123.105:0/391228375 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f7898038430 0x7f789803a8f0 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:46:10.568 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:10.528+0000 7f78a5ffb700 1 -- 192.168.123.105:0/391228375 <== mon.0 v2:192.168.123.105:3300/0 7 ==== mgrmap(e 11) v1 ==== 45185+0+0 (secure 0 0 0) 0x7f789c04d1a0 con 0x7f78a0138780 2026-03-10T07:46:10.568 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:10.528+0000 7f78a5ffb700 1 --2- 192.168.123.105:0/391228375 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f7898038430 0x7f789803a8f0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:46:10.568 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:10.528+0000 7f78a5ffb700 1 -- 192.168.123.105:0/391228375 --> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] -- command(tid 0: {"prefix": "get_command_descriptions"}) v1 -- 0x7f788c000d40 con 0x7f7898038430 2026-03-10T07:46:10.568 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:10.530+0000 7f78a7fff700 1 --2- 192.168.123.105:0/391228375 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f7898038430 0x7f789803a8f0 unknown :-1 
s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:46:10.568 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:10.530+0000 7f78a7fff700 1 --2- 192.168.123.105:0/391228375 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f7898038430 0x7f789803a8f0 secure :-1 s=READY pgs=1 cs=0 l=1 rev1=1 crypto rx=0x7f78a804f8e0 tx=0x7f78a80690d0 comp rx=0 tx=0).ready entity=mgr.14164 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:46:10.568 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:10.531+0000 7f78a5ffb700 1 -- 192.168.123.105:0/391228375 <== mgr.14164 v2:192.168.123.105:6800/2 1 ==== command_reply(tid 0: 0 ) v1 ==== 8+0+6910 (secure 0 0 0) 0x7f788c000d40 con 0x7f7898038430 2026-03-10T07:46:10.568 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:10.534+0000 7f78937fe700 1 -- 192.168.123.105:0/391228375 --> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] -- command(tid 1: {"prefix": "mgr_status"}) v1 -- 0x7f7880000d10 con 0x7f7898038430 2026-03-10T07:46:10.568 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:10.535+0000 7f78a5ffb700 1 -- 192.168.123.105:0/391228375 <== mgr.14164 v2:192.168.123.105:6800/2 2 ==== command_reply(tid 1: 0 ) v1 ==== 8+0+52 (secure 0 0 0) 0x7f7880000d10 con 0x7f7898038430 2026-03-10T07:46:10.568 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:10.535+0000 7f78937fe700 1 -- 192.168.123.105:0/391228375 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f7898038430 msgr2=0x7f789803a8f0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:46:10.568 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:10.535+0000 7f78937fe700 1 --2- 192.168.123.105:0/391228375 >> 
[v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f7898038430 0x7f789803a8f0 secure :-1 s=READY pgs=1 cs=0 l=1 rev1=1 crypto rx=0x7f78a804f8e0 tx=0x7f78a80690d0 comp rx=0 tx=0).stop 2026-03-10T07:46:10.568 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:10.535+0000 7f78937fe700 1 -- 192.168.123.105:0/391228375 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f78a0138780 msgr2=0x7f78a0138ba0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:46:10.568 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:10.535+0000 7f78937fe700 1 --2- 192.168.123.105:0/391228375 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f78a0138780 0x7f78a0138ba0 secure :-1 s=READY pgs=74 cs=0 l=1 rev1=1 crypto rx=0x7f789c008c10 tx=0x7f789c008cf0 comp rx=0 tx=0).stop 2026-03-10T07:46:10.568 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:10.535+0000 7f78937fe700 1 -- 192.168.123.105:0/391228375 shutdown_connections 2026-03-10T07:46:10.568 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:10.535+0000 7f78937fe700 1 --2- 192.168.123.105:0/391228375 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f7898038430 0x7f789803a8f0 unknown :-1 s=CLOSED pgs=1 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:46:10.568 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:10.535+0000 7f78937fe700 1 --2- 192.168.123.105:0/391228375 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f78a0138780 0x7f78a0138ba0 unknown :-1 s=CLOSED pgs=74 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:46:10.568 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:10.535+0000 7f78937fe700 1 -- 192.168.123.105:0/391228375 >> 192.168.123.105:0/391228375 conn(0x7f78a009ff10 msgr2=0x7f78a00a0970 unknown :-1 s=STATE_NONE 
l=0).mark_down 2026-03-10T07:46:10.568 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:10.535+0000 7f78937fe700 1 -- 192.168.123.105:0/391228375 shutdown_connections 2026-03-10T07:46:10.568 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:10.536+0000 7f78937fe700 1 -- 192.168.123.105:0/391228375 wait complete. 2026-03-10T07:46:10.568 INFO:teuthology.orchestra.run.vm05.stdout:mgr epoch 9 is available 2026-03-10T07:46:10.568 INFO:teuthology.orchestra.run.vm05.stdout:Generating a dashboard self-signed certificate... 2026-03-10T07:46:10.820 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:46:10 vm05 ceph-mon[50387]: from='mgr.14164 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/vm05.blexke/trash_purge_schedule"}]: dispatch 2026-03-10T07:46:10.820 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:46:10 vm05 ceph-mon[50387]: [10/Mar/2026:07:46:09] ENGINE Bus STARTING 2026-03-10T07:46:10.820 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:46:10 vm05 ceph-mon[50387]: [10/Mar/2026:07:46:10] ENGINE Serving on http://192.168.123.105:8765 2026-03-10T07:46:10.820 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:46:10 vm05 ceph-mon[50387]: from='mgr.14164 192.168.123.105:0/2' entity='mgr.vm05.blexke' 2026-03-10T07:46:10.820 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:46:10 vm05 ceph-mon[50387]: mgrmap e11: vm05.blexke(active, since 1.00872s) 2026-03-10T07:46:10.968 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout Self-signed certificate created 2026-03-10T07:46:10.968 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:10.696+0000 7f61aa5a0700 1 Processor -- start 2026-03-10T07:46:10.968 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:10.697+0000 7f61aa5a0700 1 -- start start 2026-03-10T07:46:10.968 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: 
stderr 2026-03-10T07:46:10.697+0000 7f61aa5a0700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f619c097d20 0x7f619c098140 unknown :-1 s=NONE pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:46:10.968 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:10.697+0000 7f61aa5a0700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f619c098710 con 0x7f619c097d20 2026-03-10T07:46:10.968 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:10.697+0000 7f61a959e700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f619c097d20 0x7f619c098140 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:46:10.968 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:10.697+0000 7f61a959e700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f619c097d20 0x7f619c098140 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.105:43654/0 (socket says 192.168.123.105:43654) 2026-03-10T07:46:10.968 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:10.697+0000 7f61a959e700 1 -- 192.168.123.105:0/1146921955 learned_addr learned my addr 192.168.123.105:0/1146921955 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T07:46:10.968 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:10.698+0000 7f61a959e700 1 -- 192.168.123.105:0/1146921955 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f619c098f20 con 0x7f619c097d20 2026-03-10T07:46:10.968 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:10.698+0000 7f61a959e700 1 --2- 
192.168.123.105:0/1146921955 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f619c097d20 0x7f619c098140 secure :-1 s=READY pgs=82 cs=0 l=1 rev1=1 crypto rx=0x7f61a0009a90 tx=0x7f61a0009da0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=56f47a483d3fd240 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:46:10.968 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:10.698+0000 7f619bfff700 1 -- 192.168.123.105:0/1146921955 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f61a0004030 con 0x7f619c097d20 2026-03-10T07:46:10.968 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:10.698+0000 7f619bfff700 1 -- 192.168.123.105:0/1146921955 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f61a000b7e0 con 0x7f619c097d20 2026-03-10T07:46:10.968 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:10.698+0000 7f619bfff700 1 -- 192.168.123.105:0/1146921955 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f61a0003ae0 con 0x7f619c097d20 2026-03-10T07:46:10.968 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:10.698+0000 7f61aa5a0700 1 -- 192.168.123.105:0/1146921955 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f619c097d20 msgr2=0x7f619c098140 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:46:10.968 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:10.698+0000 7f61aa5a0700 1 --2- 192.168.123.105:0/1146921955 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f619c097d20 0x7f619c098140 secure :-1 s=READY pgs=82 cs=0 l=1 rev1=1 crypto rx=0x7f61a0009a90 tx=0x7f61a0009da0 comp rx=0 tx=0).stop 2026-03-10T07:46:10.968 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:10.700+0000 7f61aa5a0700 1 -- 
192.168.123.105:0/1146921955 shutdown_connections 2026-03-10T07:46:10.968 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:10.700+0000 7f61aa5a0700 1 --2- 192.168.123.105:0/1146921955 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f619c097d20 0x7f619c098140 unknown :-1 s=CLOSED pgs=82 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:46:10.968 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:10.700+0000 7f61aa5a0700 1 -- 192.168.123.105:0/1146921955 >> 192.168.123.105:0/1146921955 conn(0x7f619c0932a0 msgr2=0x7f619c095700 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:46:10.968 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:10.700+0000 7f61aa5a0700 1 -- 192.168.123.105:0/1146921955 shutdown_connections 2026-03-10T07:46:10.968 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:10.700+0000 7f61aa5a0700 1 -- 192.168.123.105:0/1146921955 wait complete. 
2026-03-10T07:46:10.968 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:10.700+0000 7f61aa5a0700 1 Processor -- start 2026-03-10T07:46:10.968 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:10.700+0000 7f61aa5a0700 1 -- start start 2026-03-10T07:46:10.968 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:10.700+0000 7f61aa5a0700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f619c097d20 0x7f619c12b910 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:46:10.968 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:10.701+0000 7f61aa5a0700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f619c12be50 con 0x7f619c097d20 2026-03-10T07:46:10.968 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:10.701+0000 7f61a959e700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f619c097d20 0x7f619c12b910 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:46:10.969 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:10.701+0000 7f61a959e700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f619c097d20 0x7f619c12b910 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.105:43656/0 (socket says 192.168.123.105:43656) 2026-03-10T07:46:10.969 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:10.701+0000 7f61a959e700 1 -- 192.168.123.105:0/1636077374 learned_addr learned my addr 192.168.123.105:0/1636077374 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T07:46:10.969 
INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:10.701+0000 7f61a959e700 1 -- 192.168.123.105:0/1636077374 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f61a0009740 con 0x7f619c097d20 2026-03-10T07:46:10.969 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:10.701+0000 7f61a959e700 1 --2- 192.168.123.105:0/1636077374 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f619c097d20 0x7f619c12b910 secure :-1 s=READY pgs=83 cs=0 l=1 rev1=1 crypto rx=0x7f61a0000c00 tx=0x7f61a000bfa0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:46:10.969 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:10.701+0000 7f619a7fc700 1 -- 192.168.123.105:0/1636077374 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f61a0003fe0 con 0x7f619c097d20 2026-03-10T07:46:10.969 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:10.701+0000 7f619a7fc700 1 -- 192.168.123.105:0/1636077374 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f61a001a430 con 0x7f619c097d20 2026-03-10T07:46:10.969 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:10.701+0000 7f61aa5a0700 1 -- 192.168.123.105:0/1636077374 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f619c12c050 con 0x7f619c097d20 2026-03-10T07:46:10.969 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:10.701+0000 7f61aa5a0700 1 -- 192.168.123.105:0/1636077374 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f619c12c4f0 con 0x7f619c097d20 2026-03-10T07:46:10.969 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:10.702+0000 7f619a7fc700 1 
-- 192.168.123.105:0/1636077374 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f61a0011420 con 0x7f619c097d20 2026-03-10T07:46:10.969 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:10.702+0000 7f619a7fc700 1 -- 192.168.123.105:0/1636077374 <== mon.0 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 11) v1 ==== 45185+0+0 (secure 0 0 0) 0x7f61a0028020 con 0x7f619c097d20 2026-03-10T07:46:10.969 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:10.703+0000 7f619a7fc700 1 --2- 192.168.123.105:0/1636077374 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f61900382c0 0x7f619003a780 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:46:10.969 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:10.703+0000 7f619a7fc700 1 -- 192.168.123.105:0/1636077374 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(3..3 src has 1..3) v4 ==== 1069+0+0 (secure 0 0 0) 0x7f61a004bc50 con 0x7f619c097d20 2026-03-10T07:46:10.969 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:10.702+0000 7f61aa5a0700 1 -- 192.168.123.105:0/1636077374 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f6188005320 con 0x7f619c097d20 2026-03-10T07:46:10.969 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:10.704+0000 7f61a8d9d700 1 --2- 192.168.123.105:0/1636077374 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f61900382c0 0x7f619003a780 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:46:10.969 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:10.707+0000 7f619a7fc700 1 -- 192.168.123.105:0/1636077374 <== mon.0 
v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7f61a0019e20 con 0x7f619c097d20 2026-03-10T07:46:10.969 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:10.707+0000 7f61a8d9d700 1 --2- 192.168.123.105:0/1636077374 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f61900382c0 0x7f619003a780 secure :-1 s=READY pgs=7 cs=0 l=1 rev1=1 crypto rx=0x7f6194006fd0 tx=0x7f6194006e40 comp rx=0 tx=0).ready entity=mgr.14164 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:46:10.969 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:10.818+0000 7f61aa5a0700 1 -- 192.168.123.105:0/1636077374 --> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] -- mgr_command(tid 0: {"prefix": "dashboard create-self-signed-cert", "target": ["mon-mgr", ""]}) v1 -- 0x7f6188000bf0 con 0x7f61900382c0 2026-03-10T07:46:10.969 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:10.930+0000 7f619a7fc700 1 -- 192.168.123.105:0/1636077374 <== mgr.14164 v2:192.168.123.105:6800/2 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+32 (secure 0 0 0) 0x7f6188000bf0 con 0x7f61900382c0 2026-03-10T07:46:10.969 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:10.932+0000 7f61aa5a0700 1 -- 192.168.123.105:0/1636077374 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f61900382c0 msgr2=0x7f619003a780 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:46:10.969 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:10.932+0000 7f61aa5a0700 1 --2- 192.168.123.105:0/1636077374 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f61900382c0 0x7f619003a780 secure :-1 s=READY pgs=7 cs=0 l=1 rev1=1 crypto rx=0x7f6194006fd0 tx=0x7f6194006e40 comp rx=0 tx=0).stop 2026-03-10T07:46:10.969 
INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:10.932+0000 7f61aa5a0700 1 -- 192.168.123.105:0/1636077374 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f619c097d20 msgr2=0x7f619c12b910 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:46:10.969 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:10.932+0000 7f61aa5a0700 1 --2- 192.168.123.105:0/1636077374 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f619c097d20 0x7f619c12b910 secure :-1 s=READY pgs=83 cs=0 l=1 rev1=1 crypto rx=0x7f61a0000c00 tx=0x7f61a000bfa0 comp rx=0 tx=0).stop 2026-03-10T07:46:10.969 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:10.932+0000 7f61aa5a0700 1 -- 192.168.123.105:0/1636077374 shutdown_connections 2026-03-10T07:46:10.969 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:10.932+0000 7f61aa5a0700 1 --2- 192.168.123.105:0/1636077374 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f61900382c0 0x7f619003a780 unknown :-1 s=CLOSED pgs=7 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:46:10.969 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:10.933+0000 7f61aa5a0700 1 --2- 192.168.123.105:0/1636077374 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f619c097d20 0x7f619c12b910 unknown :-1 s=CLOSED pgs=83 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:46:10.969 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:10.933+0000 7f61aa5a0700 1 -- 192.168.123.105:0/1636077374 >> 192.168.123.105:0/1636077374 conn(0x7f619c0932a0 msgr2=0x7f619c093e50 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:46:10.969 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:10.933+0000 7f61aa5a0700 1 -- 192.168.123.105:0/1636077374 shutdown_connections 
2026-03-10T07:46:10.969 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:10.933+0000 7f61aa5a0700 1 -- 192.168.123.105:0/1636077374 wait complete. 2026-03-10T07:46:10.969 INFO:teuthology.orchestra.run.vm05.stdout:Creating initial admin user... 2026-03-10T07:46:11.476 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout {"username": "admin", "password": "$2b$12$idZcq0kUWx/2iqcpp4lgIOcc0WUbgVJImY6/uJiPMvsHxYiW3afg6", "roles": ["administrator"], "name": null, "email": null, "lastUpdate": 1773128771, "enabled": true, "pwdExpirationDate": null, "pwdUpdateRequired": true} 2026-03-10T07:46:11.476 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:11.110+0000 7fb43342d700 1 Processor -- start 2026-03-10T07:46:11.476 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:11.111+0000 7fb43342d700 1 -- start start 2026-03-10T07:46:11.476 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:11.111+0000 7fb43342d700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb42c0721d0 0x7fb42c0725f0 unknown :-1 s=NONE pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:46:11.476 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:11.111+0000 7fb43342d700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fb42c072bc0 con 0x7fb42c0721d0 2026-03-10T07:46:11.476 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:11.111+0000 7fb43242b700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb42c0721d0 0x7fb42c0725f0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:46:11.476 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:11.111+0000 7fb43242b700 1 --2- >> 
[v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb42c0721d0 0x7fb42c0725f0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.105:43668/0 (socket says 192.168.123.105:43668) 2026-03-10T07:46:11.476 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:11.111+0000 7fb43242b700 1 -- 192.168.123.105:0/1224968418 learned_addr learned my addr 192.168.123.105:0/1224968418 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T07:46:11.476 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:11.111+0000 7fb43242b700 1 -- 192.168.123.105:0/1224968418 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fb42c10e1c0 con 0x7fb42c0721d0 2026-03-10T07:46:11.476 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:11.111+0000 7fb43242b700 1 --2- 192.168.123.105:0/1224968418 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb42c0721d0 0x7fb42c0725f0 secure :-1 s=READY pgs=84 cs=0 l=1 rev1=1 crypto rx=0x7fb428009cf0 tx=0x7fb42800b0e0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=1a88b88721f30a8a server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:46:11.476 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:11.112+0000 7fb431429700 1 -- 192.168.123.105:0/1224968418 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fb428004030 con 0x7fb42c0721d0 2026-03-10T07:46:11.476 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:11.112+0000 7fb431429700 1 -- 192.168.123.105:0/1224968418 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fb42800b810 con 0x7fb42c0721d0 2026-03-10T07:46:11.476 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:11.112+0000 
7fb43342d700 1 -- 192.168.123.105:0/1224968418 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb42c0721d0 msgr2=0x7fb42c0725f0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:46:11.476 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:11.112+0000 7fb43342d700 1 --2- 192.168.123.105:0/1224968418 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb42c0721d0 0x7fb42c0725f0 secure :-1 s=READY pgs=84 cs=0 l=1 rev1=1 crypto rx=0x7fb428009cf0 tx=0x7fb42800b0e0 comp rx=0 tx=0).stop 2026-03-10T07:46:11.476 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:11.112+0000 7fb43342d700 1 -- 192.168.123.105:0/1224968418 shutdown_connections 2026-03-10T07:46:11.476 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:11.112+0000 7fb43342d700 1 --2- 192.168.123.105:0/1224968418 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb42c0721d0 0x7fb42c0725f0 unknown :-1 s=CLOSED pgs=84 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:46:11.476 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:11.112+0000 7fb43342d700 1 -- 192.168.123.105:0/1224968418 >> 192.168.123.105:0/1224968418 conn(0x7fb42c06d320 msgr2=0x7fb42c06f760 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:46:11.476 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:11.112+0000 7fb43342d700 1 -- 192.168.123.105:0/1224968418 shutdown_connections 2026-03-10T07:46:11.476 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:11.113+0000 7fb43342d700 1 -- 192.168.123.105:0/1224968418 wait complete. 
2026-03-10T07:46:11.476 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:11.113+0000 7fb43342d700 1 Processor -- start 2026-03-10T07:46:11.476 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:11.113+0000 7fb43342d700 1 -- start start 2026-03-10T07:46:11.476 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:11.113+0000 7fb43342d700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb42c0721d0 0x7fb42c1a0cd0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:46:11.476 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:11.113+0000 7fb43342d700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fb42c072bc0 con 0x7fb42c0721d0 2026-03-10T07:46:11.476 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:11.113+0000 7fb43242b700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb42c0721d0 0x7fb42c1a0cd0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:46:11.476 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:11.113+0000 7fb43242b700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb42c0721d0 0x7fb42c1a0cd0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.105:43674/0 (socket says 192.168.123.105:43674) 2026-03-10T07:46:11.476 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:11.113+0000 7fb43242b700 1 -- 192.168.123.105:0/1300466539 learned_addr learned my addr 192.168.123.105:0/1300466539 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T07:46:11.476 
INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:11.113+0000 7fb43242b700 1 -- 192.168.123.105:0/1300466539 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fb428009740 con 0x7fb42c0721d0 2026-03-10T07:46:11.476 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:11.114+0000 7fb43242b700 1 --2- 192.168.123.105:0/1300466539 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb42c0721d0 0x7fb42c1a0cd0 secure :-1 s=READY pgs=85 cs=0 l=1 rev1=1 crypto rx=0x7fb428006e90 tx=0x7fb428003d30 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:46:11.476 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:11.114+0000 7fb4237fe700 1 -- 192.168.123.105:0/1300466539 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fb428003f40 con 0x7fb42c0721d0 2026-03-10T07:46:11.476 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:11.114+0000 7fb43342d700 1 -- 192.168.123.105:0/1300466539 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fb42c1a1210 con 0x7fb42c0721d0 2026-03-10T07:46:11.476 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:11.114+0000 7fb43342d700 1 -- 192.168.123.105:0/1300466539 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fb42c1a1630 con 0x7fb42c0721d0 2026-03-10T07:46:11.476 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:11.115+0000 7fb4237fe700 1 -- 192.168.123.105:0/1300466539 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fb428004580 con 0x7fb42c0721d0 2026-03-10T07:46:11.476 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:11.115+0000 7fb4237fe700 1 
-- 192.168.123.105:0/1300466539 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fb42801ae60 con 0x7fb42c0721d0 2026-03-10T07:46:11.476 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:11.116+0000 7fb43342d700 1 -- 192.168.123.105:0/1300466539 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fb410005320 con 0x7fb42c0721d0 2026-03-10T07:46:11.476 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:11.116+0000 7fb4237fe700 1 -- 192.168.123.105:0/1300466539 <== mon.0 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 11) v1 ==== 45185+0+0 (secure 0 0 0) 0x7fb42801a430 con 0x7fb42c0721d0 2026-03-10T07:46:11.476 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:11.116+0000 7fb4237fe700 1 --2- 192.168.123.105:0/1300466539 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fb418037fa0 0x7fb41803a460 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:46:11.476 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:11.116+0000 7fb4237fe700 1 -- 192.168.123.105:0/1300466539 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(3..3 src has 1..3) v4 ==== 1069+0+0 (secure 0 0 0) 0x7fb42804be00 con 0x7fb42c0721d0 2026-03-10T07:46:11.476 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:11.119+0000 7fb431c2a700 1 --2- 192.168.123.105:0/1300466539 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fb418037fa0 0x7fb41803a460 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:46:11.476 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:11.120+0000 7fb4237fe700 1 -- 192.168.123.105:0/1300466539 <== mon.0 
v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7fb42801a6e0 con 0x7fb42c0721d0 2026-03-10T07:46:11.476 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:11.120+0000 7fb431c2a700 1 --2- 192.168.123.105:0/1300466539 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fb418037fa0 0x7fb41803a460 secure :-1 s=READY pgs=8 cs=0 l=1 rev1=1 crypto rx=0x7fb41c006fd0 tx=0x7fb41c006e40 comp rx=0 tx=0).ready entity=mgr.14164 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:46:11.476 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:11.254+0000 7fb43342d700 1 -- 192.168.123.105:0/1300466539 --> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] -- mgr_command(tid 0: {"prefix": "dashboard ac-user-create", "username": "admin", "rolename": "administrator", "force_password": true, "pwd_update_required": true, "target": ["mon-mgr", ""]}) v1 -- 0x7fb410002430 con 0x7fb418037fa0 2026-03-10T07:46:11.476 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:11.425+0000 7fb4237fe700 1 -- 192.168.123.105:0/1300466539 <== mgr.14164 v2:192.168.123.105:6800/2 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+252 (secure 0 0 0) 0x7fb410002430 con 0x7fb418037fa0 2026-03-10T07:46:11.476 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:11.427+0000 7fb43342d700 1 -- 192.168.123.105:0/1300466539 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fb418037fa0 msgr2=0x7fb41803a460 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:46:11.476 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:11.427+0000 7fb43342d700 1 --2- 192.168.123.105:0/1300466539 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fb418037fa0 0x7fb41803a460 secure :-1 s=READY pgs=8 cs=0 l=1 rev1=1 crypto rx=0x7fb41c006fd0 
tx=0x7fb41c006e40 comp rx=0 tx=0).stop 2026-03-10T07:46:11.476 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:11.427+0000 7fb43342d700 1 -- 192.168.123.105:0/1300466539 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb42c0721d0 msgr2=0x7fb42c1a0cd0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:46:11.476 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:11.427+0000 7fb43342d700 1 --2- 192.168.123.105:0/1300466539 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb42c0721d0 0x7fb42c1a0cd0 secure :-1 s=READY pgs=85 cs=0 l=1 rev1=1 crypto rx=0x7fb428006e90 tx=0x7fb428003d30 comp rx=0 tx=0).stop 2026-03-10T07:46:11.476 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:11.427+0000 7fb43342d700 1 -- 192.168.123.105:0/1300466539 shutdown_connections 2026-03-10T07:46:11.476 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:11.427+0000 7fb43342d700 1 --2- 192.168.123.105:0/1300466539 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fb418037fa0 0x7fb41803a460 unknown :-1 s=CLOSED pgs=8 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:46:11.476 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:11.427+0000 7fb43342d700 1 --2- 192.168.123.105:0/1300466539 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb42c0721d0 0x7fb42c1a0cd0 unknown :-1 s=CLOSED pgs=85 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:46:11.476 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:11.427+0000 7fb43342d700 1 -- 192.168.123.105:0/1300466539 >> 192.168.123.105:0/1300466539 conn(0x7fb42c06d320 msgr2=0x7fb42c10f3e0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:46:11.477 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:11.428+0000 7fb43342d700 1 -- 
192.168.123.105:0/1300466539 shutdown_connections 2026-03-10T07:46:11.477 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:11.428+0000 7fb43342d700 1 -- 192.168.123.105:0/1300466539 wait complete. 2026-03-10T07:46:11.477 INFO:teuthology.orchestra.run.vm05.stdout:Fetching dashboard port number... 2026-03-10T07:46:11.759 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stdout 8443 2026-03-10T07:46:11.760 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:11.608+0000 7fd92ded1700 1 Processor -- start 2026-03-10T07:46:11.760 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:11.609+0000 7fd92ded1700 1 -- start start 2026-03-10T07:46:11.760 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:11.609+0000 7fd92ded1700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fd9281071c0 0x7fd9281095b0 unknown :-1 s=NONE pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:46:11.760 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:11.609+0000 7fd92ded1700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fd928074720 con 0x7fd9281071c0 2026-03-10T07:46:11.760 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:11.610+0000 7fd9277fe700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fd9281071c0 0x7fd9281095b0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:46:11.760 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:11.610+0000 7fd9277fe700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fd9281071c0 0x7fd9281095b0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer 
v2:192.168.123.105:3300/0 says I am v2:192.168.123.105:43680/0 (socket says 192.168.123.105:43680) 2026-03-10T07:46:11.760 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:11.610+0000 7fd9277fe700 1 -- 192.168.123.105:0/2565206417 learned_addr learned my addr 192.168.123.105:0/2565206417 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T07:46:11.760 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:11.610+0000 7fd9277fe700 1 -- 192.168.123.105:0/2565206417 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fd928109af0 con 0x7fd9281071c0 2026-03-10T07:46:11.760 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:11.610+0000 7fd9277fe700 1 --2- 192.168.123.105:0/2565206417 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fd9281071c0 0x7fd9281095b0 secure :-1 s=READY pgs=86 cs=0 l=1 rev1=1 crypto rx=0x7fd918009a90 tx=0x7fd918009da0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=244898c023271004 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:46:11.760 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:11.610+0000 7fd9267fc700 1 -- 192.168.123.105:0/2565206417 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fd91800fbf0 con 0x7fd9281071c0 2026-03-10T07:46:11.760 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:11.610+0000 7fd9267fc700 1 -- 192.168.123.105:0/2565206417 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fd918004510 con 0x7fd9281071c0 2026-03-10T07:46:11.760 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:11.610+0000 7fd9267fc700 1 -- 192.168.123.105:0/2565206417 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fd918017450 con 0x7fd9281071c0 2026-03-10T07:46:11.760 
INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:11.611+0000 7fd92ded1700 1 -- 192.168.123.105:0/2565206417 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fd9281071c0 msgr2=0x7fd9281095b0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:46:11.760 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:11.611+0000 7fd92ded1700 1 --2- 192.168.123.105:0/2565206417 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fd9281071c0 0x7fd9281095b0 secure :-1 s=READY pgs=86 cs=0 l=1 rev1=1 crypto rx=0x7fd918009a90 tx=0x7fd918009da0 comp rx=0 tx=0).stop 2026-03-10T07:46:11.760 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:11.611+0000 7fd92ded1700 1 -- 192.168.123.105:0/2565206417 shutdown_connections 2026-03-10T07:46:11.760 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:11.611+0000 7fd92ded1700 1 --2- 192.168.123.105:0/2565206417 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fd9281071c0 0x7fd9281095b0 unknown :-1 s=CLOSED pgs=86 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:46:11.760 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:11.611+0000 7fd92ded1700 1 -- 192.168.123.105:0/2565206417 >> 192.168.123.105:0/2565206417 conn(0x7fd928100bd0 msgr2=0x7fd928103030 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:46:11.760 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:11.611+0000 7fd92ded1700 1 -- 192.168.123.105:0/2565206417 shutdown_connections 2026-03-10T07:46:11.760 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:11.611+0000 7fd92ded1700 1 -- 192.168.123.105:0/2565206417 wait complete. 
2026-03-10T07:46:11.760 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:11.612+0000 7fd92ded1700 1 Processor -- start 2026-03-10T07:46:11.760 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:11.612+0000 7fd92ded1700 1 -- start start 2026-03-10T07:46:11.760 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:11.612+0000 7fd92ded1700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fd9281071c0 0x7fd9281a0a40 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:46:11.760 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:11.612+0000 7fd92ded1700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fd9281a0f80 con 0x7fd9281071c0 2026-03-10T07:46:11.760 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:11.612+0000 7fd9277fe700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fd9281071c0 0x7fd9281a0a40 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:46:11.760 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:11.612+0000 7fd9277fe700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fd9281071c0 0x7fd9281a0a40 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.105:43682/0 (socket says 192.168.123.105:43682) 2026-03-10T07:46:11.760 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:11.612+0000 7fd9277fe700 1 -- 192.168.123.105:0/4133507688 learned_addr learned my addr 192.168.123.105:0/4133507688 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T07:46:11.760 
INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:11.612+0000 7fd9277fe700 1 -- 192.168.123.105:0/4133507688 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fd918009740 con 0x7fd9281071c0 2026-03-10T07:46:11.760 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:11.612+0000 7fd9277fe700 1 --2- 192.168.123.105:0/4133507688 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fd9281071c0 0x7fd9281a0a40 secure :-1 s=READY pgs=87 cs=0 l=1 rev1=1 crypto rx=0x7fd918000c00 tx=0x7fd918003a70 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:46:11.760 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:11.612+0000 7fd924ff9700 1 -- 192.168.123.105:0/4133507688 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fd91800f390 con 0x7fd9281071c0 2026-03-10T07:46:11.760 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:11.612+0000 7fd924ff9700 1 -- 192.168.123.105:0/4133507688 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fd91800c040 con 0x7fd9281071c0 2026-03-10T07:46:11.760 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:11.612+0000 7fd924ff9700 1 -- 192.168.123.105:0/4133507688 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fd918021a50 con 0x7fd9281071c0 2026-03-10T07:46:11.760 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:11.613+0000 7fd92ded1700 1 -- 192.168.123.105:0/4133507688 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fd9281a1180 con 0x7fd9281071c0 2026-03-10T07:46:11.760 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:11.613+0000 7fd92ded1700 1 
-- 192.168.123.105:0/4133507688 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fd9281a15a0 con 0x7fd9281071c0 2026-03-10T07:46:11.760 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:11.614+0000 7fd924ff9700 1 -- 192.168.123.105:0/4133507688 <== mon.0 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 11) v1 ==== 45185+0+0 (secure 0 0 0) 0x7fd918028030 con 0x7fd9281071c0 2026-03-10T07:46:11.760 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:11.614+0000 7fd924ff9700 1 --2- 192.168.123.105:0/4133507688 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fd9100382d0 0x7fd91003a790 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:46:11.760 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:11.614+0000 7fd924ff9700 1 -- 192.168.123.105:0/4133507688 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(3..3 src has 1..3) v4 ==== 1069+0+0 (secure 0 0 0) 0x7fd91804bb30 con 0x7fd9281071c0 2026-03-10T07:46:11.760 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:11.614+0000 7fd926ffd700 1 --2- 192.168.123.105:0/4133507688 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fd9100382d0 0x7fd91003a790 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:46:11.760 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:11.614+0000 7fd92ded1700 1 -- 192.168.123.105:0/4133507688 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fd928192140 con 0x7fd9281071c0 2026-03-10T07:46:11.760 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:11.617+0000 7fd924ff9700 1 -- 192.168.123.105:0/4133507688 <== mon.0 
v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7fd91801c030 con 0x7fd9281071c0 2026-03-10T07:46:11.760 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:11.617+0000 7fd926ffd700 1 --2- 192.168.123.105:0/4133507688 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fd9100382d0 0x7fd91003a790 secure :-1 s=READY pgs=9 cs=0 l=1 rev1=1 crypto rx=0x7fd91c006fd0 tx=0x7fd91c006e40 comp rx=0 tx=0).ready entity=mgr.14164 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:46:11.760 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:11.717+0000 7fd92ded1700 1 -- 192.168.123.105:0/4133507688 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "config get", "who": "mgr", "key": "mgr/dashboard/ssl_server_port"} v 0) v1 -- 0x7fd9280623c0 con 0x7fd9281071c0 2026-03-10T07:46:11.760 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:11.720+0000 7fd924ff9700 1 -- 192.168.123.105:0/4133507688 <== mon.0 v2:192.168.123.105:3300/0 7 ==== mon_command_ack([{"prefix": "config get", "who": "mgr", "key": "mgr/dashboard/ssl_server_port"}]=0 v8) v1 ==== 112+0+5 (secure 0 0 0) 0x7fd918049020 con 0x7fd9281071c0 2026-03-10T07:46:11.760 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:11.722+0000 7fd92ded1700 1 -- 192.168.123.105:0/4133507688 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fd9100382d0 msgr2=0x7fd91003a790 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:46:11.760 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:11.722+0000 7fd92ded1700 1 --2- 192.168.123.105:0/4133507688 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fd9100382d0 0x7fd91003a790 secure :-1 s=READY pgs=9 cs=0 l=1 rev1=1 crypto rx=0x7fd91c006fd0 tx=0x7fd91c006e40 comp 
rx=0 tx=0).stop 2026-03-10T07:46:11.760 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:11.723+0000 7fd92ded1700 1 -- 192.168.123.105:0/4133507688 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fd9281071c0 msgr2=0x7fd9281a0a40 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:46:11.760 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:11.723+0000 7fd92ded1700 1 --2- 192.168.123.105:0/4133507688 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fd9281071c0 0x7fd9281a0a40 secure :-1 s=READY pgs=87 cs=0 l=1 rev1=1 crypto rx=0x7fd918000c00 tx=0x7fd918003a70 comp rx=0 tx=0).stop 2026-03-10T07:46:11.760 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:11.723+0000 7fd92ded1700 1 -- 192.168.123.105:0/4133507688 shutdown_connections 2026-03-10T07:46:11.760 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:11.723+0000 7fd92ded1700 1 --2- 192.168.123.105:0/4133507688 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fd9100382d0 0x7fd91003a790 unknown :-1 s=CLOSED pgs=9 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:46:11.760 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:11.723+0000 7fd92ded1700 1 --2- 192.168.123.105:0/4133507688 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fd9281071c0 0x7fd9281a0a40 unknown :-1 s=CLOSED pgs=87 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:46:11.760 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:11.723+0000 7fd92ded1700 1 -- 192.168.123.105:0/4133507688 >> 192.168.123.105:0/4133507688 conn(0x7fd928100bd0 msgr2=0x7fd9281018b0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:46:11.760 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:11.723+0000 7fd92ded1700 1 -- 
192.168.123.105:0/4133507688 shutdown_connections 2026-03-10T07:46:11.760 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:11.723+0000 7fd92ded1700 1 -- 192.168.123.105:0/4133507688 wait complete. 2026-03-10T07:46:11.760 INFO:teuthology.orchestra.run.vm05.stdout:firewalld does not appear to be present 2026-03-10T07:46:11.760 INFO:teuthology.orchestra.run.vm05.stdout:Not possible to open ports <[8443]>. firewalld.service is not available 2026-03-10T07:46:11.761 INFO:teuthology.orchestra.run.vm05.stdout:Ceph Dashboard is now available at: 2026-03-10T07:46:11.761 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-10T07:46:11.761 INFO:teuthology.orchestra.run.vm05.stdout: URL: https://vm05.local:8443/ 2026-03-10T07:46:11.761 INFO:teuthology.orchestra.run.vm05.stdout: User: admin 2026-03-10T07:46:11.761 INFO:teuthology.orchestra.run.vm05.stdout: Password: qsqiphovt7 2026-03-10T07:46:11.761 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-10T07:46:11.761 INFO:teuthology.orchestra.run.vm05.stdout:Saving cluster configuration to /var/lib/ceph/12e9780e-1c55-11f1-8896-79f7c2e9b508/config directory 2026-03-10T07:46:11.761 INFO:teuthology.orchestra.run.vm05.stdout:Enabling autotune for osd_memory_target 2026-03-10T07:46:12.003 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:46:11 vm05 ceph-mon[50387]: [10/Mar/2026:07:46:10] ENGINE Serving on https://192.168.123.105:7150 2026-03-10T07:46:12.003 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:46:11 vm05 ceph-mon[50387]: [10/Mar/2026:07:46:10] ENGINE Bus STARTED 2026-03-10T07:46:12.003 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:46:11 vm05 ceph-mon[50387]: from='client.14168 -' entity='client.admin' cmd=[{"prefix": "get_command_descriptions"}]: dispatch 2026-03-10T07:46:12.003 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:46:11 vm05 ceph-mon[50387]: from='client.14168 -' entity='client.admin' cmd=[{"prefix": "mgr_status"}]: dispatch 2026-03-10T07:46:12.003 
INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:46:11 vm05 ceph-mon[50387]: from='client.14176 -' entity='client.admin' cmd=[{"prefix": "dashboard create-self-signed-cert", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T07:46:12.003 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:46:11 vm05 ceph-mon[50387]: from='mgr.14164 192.168.123.105:0/2' entity='mgr.vm05.blexke' 2026-03-10T07:46:12.003 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:46:11 vm05 ceph-mon[50387]: from='mgr.14164 192.168.123.105:0/2' entity='mgr.vm05.blexke' 2026-03-10T07:46:12.003 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:46:11 vm05 ceph-mon[50387]: from='mgr.14164 192.168.123.105:0/2' entity='mgr.vm05.blexke' 2026-03-10T07:46:12.003 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:46:11 vm05 ceph-mon[50387]: from='mgr.14164 192.168.123.105:0/2' entity='mgr.vm05.blexke' 2026-03-10T07:46:12.003 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:46:11 vm05 ceph-mon[50387]: from='client.? 192.168.123.105:0/4133507688' entity='client.admin' cmd=[{"prefix": "config get", "who": "mgr", "key": "mgr/dashboard/ssl_server_port"}]: dispatch 2026-03-10T07:46:12.056 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:11.890+0000 7f31faf5d700 1 Processor -- start 2026-03-10T07:46:12.056 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:11.890+0000 7f31faf5d700 1 -- start start 2026-03-10T07:46:12.056 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:11.890+0000 7f31faf5d700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f31f4108980 0x7f31f4108da0 unknown :-1 s=NONE pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:46:12.056 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:11.890+0000 7f31faf5d700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f31f4109370 con 
0x7f31f4108980 2026-03-10T07:46:12.056 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:11.891+0000 7f31f8cf9700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f31f4108980 0x7f31f4108da0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:46:12.056 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:11.891+0000 7f31f8cf9700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f31f4108980 0x7f31f4108da0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.105:43692/0 (socket says 192.168.123.105:43692) 2026-03-10T07:46:12.056 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:11.891+0000 7f31f8cf9700 1 -- 192.168.123.105:0/236627399 learned_addr learned my addr 192.168.123.105:0/236627399 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T07:46:12.056 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:11.891+0000 7f31f8cf9700 1 -- 192.168.123.105:0/236627399 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f31f4109b80 con 0x7f31f4108980 2026-03-10T07:46:12.056 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:11.891+0000 7f31f8cf9700 1 --2- 192.168.123.105:0/236627399 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f31f4108980 0x7f31f4108da0 secure :-1 s=READY pgs=88 cs=0 l=1 rev1=1 crypto rx=0x7f31e0009cf0 tx=0x7f31e000b0e0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=95ebd3910370c03d server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:46:12.056 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:11.892+0000 7f31f37fe700 1 -- 192.168.123.105:0/236627399 <== mon.0 
v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f31e0004030 con 0x7f31f4108980 2026-03-10T07:46:12.056 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:11.892+0000 7f31f37fe700 1 -- 192.168.123.105:0/236627399 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f31e000b810 con 0x7f31f4108980 2026-03-10T07:46:12.056 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:11.892+0000 7f31faf5d700 1 -- 192.168.123.105:0/236627399 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f31f4108980 msgr2=0x7f31f4108da0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:46:12.056 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:11.892+0000 7f31faf5d700 1 --2- 192.168.123.105:0/236627399 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f31f4108980 0x7f31f4108da0 secure :-1 s=READY pgs=88 cs=0 l=1 rev1=1 crypto rx=0x7f31e0009cf0 tx=0x7f31e000b0e0 comp rx=0 tx=0).stop 2026-03-10T07:46:12.056 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:11.893+0000 7f31faf5d700 1 -- 192.168.123.105:0/236627399 shutdown_connections 2026-03-10T07:46:12.056 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:11.893+0000 7f31faf5d700 1 --2- 192.168.123.105:0/236627399 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f31f4108980 0x7f31f4108da0 unknown :-1 s=CLOSED pgs=88 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:46:12.056 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:11.893+0000 7f31faf5d700 1 -- 192.168.123.105:0/236627399 >> 192.168.123.105:0/236627399 conn(0x7f31f41044d0 msgr2=0x7f31f41068c0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:46:12.056 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:11.893+0000 
7f31faf5d700 1 -- 192.168.123.105:0/236627399 shutdown_connections 2026-03-10T07:46:12.056 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:11.893+0000 7f31faf5d700 1 -- 192.168.123.105:0/236627399 wait complete. 2026-03-10T07:46:12.056 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:11.893+0000 7f31faf5d700 1 Processor -- start 2026-03-10T07:46:12.056 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:11.893+0000 7f31faf5d700 1 -- start start 2026-03-10T07:46:12.056 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:11.894+0000 7f31faf5d700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f31f4108980 0x7f31f419caa0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:46:12.056 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:11.894+0000 7f31f8cf9700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f31f4108980 0x7f31f419caa0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:46:12.056 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:11.894+0000 7f31f8cf9700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f31f4108980 0x7f31f419caa0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.105:43698/0 (socket says 192.168.123.105:43698) 2026-03-10T07:46:12.056 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:11.894+0000 7f31f8cf9700 1 -- 192.168.123.105:0/2804532828 learned_addr learned my addr 192.168.123.105:0/2804532828 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T07:46:12.056 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: 
stderr 2026-03-10T07:46:11.894+0000 7f31faf5d700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f31f4109370 con 0x7f31f4108980 2026-03-10T07:46:12.056 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:11.894+0000 7f31f8cf9700 1 -- 192.168.123.105:0/2804532828 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f31e0009740 con 0x7f31f4108980 2026-03-10T07:46:12.056 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:11.894+0000 7f31f8cf9700 1 --2- 192.168.123.105:0/2804532828 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f31f4108980 0x7f31f419caa0 secure :-1 s=READY pgs=89 cs=0 l=1 rev1=1 crypto rx=0x7f31e0006e90 tx=0x7f31e0003d30 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:46:12.056 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:11.895+0000 7f31f1ffb700 1 -- 192.168.123.105:0/2804532828 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f31e0003f40 con 0x7f31f4108980 2026-03-10T07:46:12.056 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:11.895+0000 7f31f1ffb700 1 -- 192.168.123.105:0/2804532828 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f31e0004580 con 0x7f31f4108980 2026-03-10T07:46:12.056 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:11.895+0000 7f31f1ffb700 1 -- 192.168.123.105:0/2804532828 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f31e001ae60 con 0x7f31f4108980 2026-03-10T07:46:12.056 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:11.895+0000 7f31faf5d700 1 -- 192.168.123.105:0/2804532828 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- 
mon_subscribe({mgrmap=0+}) v3 -- 0x7f31f419cfe0 con 0x7f31f4108980 2026-03-10T07:46:12.056 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:11.895+0000 7f31faf5d700 1 -- 192.168.123.105:0/2804532828 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f31f419d400 con 0x7f31f4108980 2026-03-10T07:46:12.056 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:11.896+0000 7f31f1ffb700 1 -- 192.168.123.105:0/2804532828 <== mon.0 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 11) v1 ==== 45185+0+0 (secure 0 0 0) 0x7f31e001a430 con 0x7f31f4108980 2026-03-10T07:46:12.056 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:11.896+0000 7f31faf5d700 1 -- 192.168.123.105:0/2804532828 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f31f4196950 con 0x7f31f4108980 2026-03-10T07:46:12.056 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:11.896+0000 7f31f1ffb700 1 --2- 192.168.123.105:0/2804532828 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f31e4038270 0x7f31e403a730 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:46:12.056 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:11.896+0000 7f31f1ffb700 1 -- 192.168.123.105:0/2804532828 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(3..3 src has 1..3) v4 ==== 1069+0+0 (secure 0 0 0) 0x7f31e004bdb0 con 0x7f31f4108980 2026-03-10T07:46:12.056 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:11.899+0000 7f31f3fff700 1 --2- 192.168.123.105:0/2804532828 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f31e4038270 0x7f31e403a730 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 
required=0 2026-03-10T07:46:12.056 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:11.899+0000 7f31f1ffb700 1 -- 192.168.123.105:0/2804532828 <== mon.0 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7f31e001a730 con 0x7f31f4108980 2026-03-10T07:46:12.056 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:11.899+0000 7f31f3fff700 1 --2- 192.168.123.105:0/2804532828 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f31e4038270 0x7f31e403a730 secure :-1 s=READY pgs=10 cs=0 l=1 rev1=1 crypto rx=0x7f31e8006fd0 tx=0x7f31e8006e40 comp rx=0 tx=0).ready entity=mgr.14164 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:46:12.057 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:12.002+0000 7f31faf5d700 1 -- 192.168.123.105:0/2804532828 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command([{prefix=config set, name=osd_memory_target_autotune}] v 0) v1 -- 0x7f31f404fa20 con 0x7f31f4108980 2026-03-10T07:46:12.057 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:12.004+0000 7f31f1ffb700 1 -- 192.168.123.105:0/2804532828 <== mon.0 v2:192.168.123.105:3300/0 7 ==== mon_command_ack([{prefix=config set, name=osd_memory_target_autotune}]=0 v8) v1 ==== 127+0+0 (secure 0 0 0) 0x7f31e0018b40 con 0x7f31f4108980 2026-03-10T07:46:12.057 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:12.007+0000 7f31faf5d700 1 -- 192.168.123.105:0/2804532828 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f31e4038270 msgr2=0x7f31e403a730 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:46:12.057 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:12.007+0000 7f31faf5d700 1 --2- 192.168.123.105:0/2804532828 >>
[v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f31e4038270 0x7f31e403a730 secure :-1 s=READY pgs=10 cs=0 l=1 rev1=1 crypto rx=0x7f31e8006fd0 tx=0x7f31e8006e40 comp rx=0 tx=0).stop 2026-03-10T07:46:12.057 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:12.007+0000 7f31faf5d700 1 -- 192.168.123.105:0/2804532828 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f31f4108980 msgr2=0x7f31f419caa0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:46:12.057 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:12.007+0000 7f31faf5d700 1 --2- 192.168.123.105:0/2804532828 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f31f4108980 0x7f31f419caa0 secure :-1 s=READY pgs=89 cs=0 l=1 rev1=1 crypto rx=0x7f31e0006e90 tx=0x7f31e0003d30 comp rx=0 tx=0).stop 2026-03-10T07:46:12.057 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:12.007+0000 7f31faf5d700 1 -- 192.168.123.105:0/2804532828 shutdown_connections 2026-03-10T07:46:12.057 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:12.007+0000 7f31faf5d700 1 --2- 192.168.123.105:0/2804532828 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f31e4038270 0x7f31e403a730 unknown :-1 s=CLOSED pgs=10 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:46:12.057 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:12.007+0000 7f31faf5d700 1 --2- 192.168.123.105:0/2804532828 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f31f4108980 0x7f31f419caa0 unknown :-1 s=CLOSED pgs=89 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:46:12.057 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:12.007+0000 7f31faf5d700 1 -- 192.168.123.105:0/2804532828 >> 192.168.123.105:0/2804532828 conn(0x7f31f41044d0 msgr2=0x7f31f41078d0 unknown :-1 
s=STATE_NONE l=0).mark_down 2026-03-10T07:46:12.057 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:12.007+0000 7f31faf5d700 1 -- 192.168.123.105:0/2804532828 shutdown_connections 2026-03-10T07:46:12.057 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:12.007+0000 7f31faf5d700 1 -- 192.168.123.105:0/2804532828 wait complete. 2026-03-10T07:46:12.381 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:12.174+0000 7f9d02b9a700 1 Processor -- start 2026-03-10T07:46:12.381 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:12.174+0000 7f9d02b9a700 1 -- start start 2026-03-10T07:46:12.381 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:12.175+0000 7f9d02b9a700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f9cfc07c870 0x7f9cfc07cc90 unknown :-1 s=NONE pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:46:12.381 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:12.175+0000 7f9d02b9a700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f9cfc07d260 con 0x7f9cfc07c870 2026-03-10T07:46:12.381 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:12.175+0000 7f9d00936700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f9cfc07c870 0x7f9cfc07cc90 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:46:12.381 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:12.175+0000 7f9d00936700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f9cfc07c870 0x7f9cfc07cc90 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am 
v2:192.168.123.105:43704/0 (socket says 192.168.123.105:43704) 2026-03-10T07:46:12.381 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:12.175+0000 7f9d00936700 1 -- 192.168.123.105:0/2263794705 learned_addr learned my addr 192.168.123.105:0/2263794705 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T07:46:12.381 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:12.175+0000 7f9d00936700 1 -- 192.168.123.105:0/2263794705 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f9cfc07dac0 con 0x7f9cfc07c870 2026-03-10T07:46:12.381 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:12.176+0000 7f9d00936700 1 --2- 192.168.123.105:0/2263794705 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f9cfc07c870 0x7f9cfc07cc90 secure :-1 s=READY pgs=90 cs=0 l=1 rev1=1 crypto rx=0x7f9cec009cf0 tx=0x7f9cec00b0e0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=a59e1111426f58fc server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:46:12.381 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:12.176+0000 7f9cfb7fe700 1 -- 192.168.123.105:0/2263794705 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f9cec004030 con 0x7f9cfc07c870 2026-03-10T07:46:12.381 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:12.176+0000 7f9cfb7fe700 1 -- 192.168.123.105:0/2263794705 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f9cec00b810 con 0x7f9cfc07c870 2026-03-10T07:46:12.381 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:12.176+0000 7f9d02b9a700 1 -- 192.168.123.105:0/2263794705 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f9cfc07c870 msgr2=0x7f9cfc07cc90 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:46:12.381 
INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:12.176+0000 7f9d02b9a700 1 --2- 192.168.123.105:0/2263794705 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f9cfc07c870 0x7f9cfc07cc90 secure :-1 s=READY pgs=90 cs=0 l=1 rev1=1 crypto rx=0x7f9cec009cf0 tx=0x7f9cec00b0e0 comp rx=0 tx=0).stop 2026-03-10T07:46:12.381 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:12.176+0000 7f9d02b9a700 1 -- 192.168.123.105:0/2263794705 shutdown_connections 2026-03-10T07:46:12.381 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:12.176+0000 7f9d02b9a700 1 --2- 192.168.123.105:0/2263794705 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f9cfc07c870 0x7f9cfc07cc90 unknown :-1 s=CLOSED pgs=90 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:46:12.381 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:12.176+0000 7f9d02b9a700 1 -- 192.168.123.105:0/2263794705 >> 192.168.123.105:0/2263794705 conn(0x7f9cfc07be30 msgr2=0x7f9cfc1064e0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:46:12.381 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:12.177+0000 7f9d02b9a700 1 -- 192.168.123.105:0/2263794705 shutdown_connections 2026-03-10T07:46:12.381 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:12.177+0000 7f9d02b9a700 1 -- 192.168.123.105:0/2263794705 wait complete. 
2026-03-10T07:46:12.381 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:12.177+0000 7f9d02b9a700 1 Processor -- start 2026-03-10T07:46:12.381 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:12.177+0000 7f9d02b9a700 1 -- start start 2026-03-10T07:46:12.381 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:12.177+0000 7f9d02b9a700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f9cfc1a0bf0 0x7f9cfc1a1030 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:46:12.381 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:12.177+0000 7f9d02b9a700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f9cfc07d260 con 0x7f9cfc1a0bf0 2026-03-10T07:46:12.381 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:12.178+0000 7f9d00936700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f9cfc1a0bf0 0x7f9cfc1a1030 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:46:12.381 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:12.178+0000 7f9d00936700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f9cfc1a0bf0 0x7f9cfc1a1030 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.105:43708/0 (socket says 192.168.123.105:43708) 2026-03-10T07:46:12.381 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:12.178+0000 7f9d00936700 1 -- 192.168.123.105:0/259534927 learned_addr learned my addr 192.168.123.105:0/259534927 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T07:46:12.381 
INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:12.178+0000 7f9d00936700 1 -- 192.168.123.105:0/259534927 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f9cec009740 con 0x7f9cfc1a0bf0 2026-03-10T07:46:12.381 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:12.178+0000 7f9d00936700 1 --2- 192.168.123.105:0/259534927 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f9cfc1a0bf0 0x7f9cfc1a1030 secure :-1 s=READY pgs=91 cs=0 l=1 rev1=1 crypto rx=0x7f9cec009cc0 tx=0x7f9cec00bfa0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:46:12.381 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:12.178+0000 7f9cf9ffb700 1 -- 192.168.123.105:0/259534927 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f9cec003950 con 0x7f9cfc1a0bf0 2026-03-10T07:46:12.381 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:12.178+0000 7f9cf9ffb700 1 -- 192.168.123.105:0/259534927 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f9cec0043c0 con 0x7f9cfc1a0bf0 2026-03-10T07:46:12.381 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:12.178+0000 7f9cf9ffb700 1 -- 192.168.123.105:0/259534927 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f9cec01aca0 con 0x7f9cfc1a0bf0 2026-03-10T07:46:12.381 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:12.178+0000 7f9d02b9a700 1 -- 192.168.123.105:0/259534927 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f9cfc1a1570 con 0x7f9cfc1a0bf0 2026-03-10T07:46:12.381 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:12.178+0000 7f9d02b9a700 1 -- 
192.168.123.105:0/259534927 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f9cfc1a41b0 con 0x7f9cfc1a0bf0 2026-03-10T07:46:12.381 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:12.179+0000 7f9cf9ffb700 1 -- 192.168.123.105:0/259534927 <== mon.0 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 11) v1 ==== 45185+0+0 (secure 0 0 0) 0x7f9cec004530 con 0x7f9cfc1a0bf0 2026-03-10T07:46:12.381 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:12.179+0000 7f9d02b9a700 1 -- 192.168.123.105:0/259534927 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f9cfc19a200 con 0x7f9cfc1a0bf0 2026-03-10T07:46:12.381 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:12.179+0000 7f9cf9ffb700 1 --2- 192.168.123.105:0/259534927 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f9ce40382b0 0x7f9ce403a770 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:46:12.381 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:12.179+0000 7f9cf9ffb700 1 -- 192.168.123.105:0/259534927 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(3..3 src has 1..3) v4 ==== 1069+0+0 (secure 0 0 0) 0x7f9cec04b570 con 0x7f9cfc1a0bf0 2026-03-10T07:46:12.381 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:12.180+0000 7f9cfbfff700 1 --2- 192.168.123.105:0/259534927 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f9ce40382b0 0x7f9ce403a770 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:46:12.381 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:12.182+0000 7f9cfbfff700 1 --2- 192.168.123.105:0/259534927 >> 
[v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f9ce40382b0 0x7f9ce403a770 secure :-1 s=READY pgs=11 cs=0 l=1 rev1=1 crypto rx=0x7f9cf0006fd0 tx=0x7f9cf0006e40 comp rx=0 tx=0).ready entity=mgr.14164 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:46:12.381 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:12.183+0000 7f9cf9ffb700 1 -- 192.168.123.105:0/259534927 <== mon.0 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7f9cec01ae00 con 0x7f9cfc1a0bf0 2026-03-10T07:46:12.381 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:12.338+0000 7f9d02b9a700 1 -- 192.168.123.105:0/259534927 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command([{prefix=config-key set, key=mgr/dashboard/cluster/status}] v 0) v1 -- 0x7f9cfc0623c0 con 0x7f9cfc1a0bf0 2026-03-10T07:46:12.381 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:12.343+0000 7f9cf9ffb700 1 -- 192.168.123.105:0/259534927 <== mon.0 v2:192.168.123.105:3300/0 7 ==== mon_command_ack([{prefix=config-key set, key=mgr/dashboard/cluster/status}]=0 set mgr/dashboard/cluster/status v34) v1 ==== 153+0+0 (secure 0 0 0) 0x7f9cec01f0c0 con 0x7f9cfc1a0bf0 2026-03-10T07:46:12.381 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:12.346+0000 7f9d02b9a700 1 -- 192.168.123.105:0/259534927 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f9ce40382b0 msgr2=0x7f9ce403a770 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:46:12.381 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:12.346+0000 7f9d02b9a700 1 --2- 192.168.123.105:0/259534927 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f9ce40382b0 0x7f9ce403a770 secure :-1 s=READY pgs=11 cs=0 l=1 rev1=1 crypto
rx=0x7f9cf0006fd0 tx=0x7f9cf0006e40 comp rx=0 tx=0).stop 2026-03-10T07:46:12.381 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:12.346+0000 7f9d02b9a700 1 -- 192.168.123.105:0/259534927 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f9cfc1a0bf0 msgr2=0x7f9cfc1a1030 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:46:12.381 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:12.346+0000 7f9d02b9a700 1 --2- 192.168.123.105:0/259534927 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f9cfc1a0bf0 0x7f9cfc1a1030 secure :-1 s=READY pgs=91 cs=0 l=1 rev1=1 crypto rx=0x7f9cec009cc0 tx=0x7f9cec00bfa0 comp rx=0 tx=0).stop 2026-03-10T07:46:12.381 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:12.346+0000 7f9d02b9a700 1 -- 192.168.123.105:0/259534927 shutdown_connections 2026-03-10T07:46:12.381 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:12.346+0000 7f9d02b9a700 1 --2- 192.168.123.105:0/259534927 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f9ce40382b0 0x7f9ce403a770 unknown :-1 s=CLOSED pgs=11 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:46:12.381 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:12.346+0000 7f9d02b9a700 1 --2- 192.168.123.105:0/259534927 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f9cfc1a0bf0 0x7f9cfc1a1030 unknown :-1 s=CLOSED pgs=91 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:46:12.381 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:12.346+0000 7f9d02b9a700 1 -- 192.168.123.105:0/259534927 >> 192.168.123.105:0/259534927 conn(0x7f9cfc07be30 msgr2=0x7f9cfc107110 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:46:12.381 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:12.347+0000 
7f9d02b9a700 1 -- 192.168.123.105:0/259534927 shutdown_connections 2026-03-10T07:46:12.381 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr 2026-03-10T07:46:12.347+0000 7f9d02b9a700 1 -- 192.168.123.105:0/259534927 wait complete. 2026-03-10T07:46:12.381 INFO:teuthology.orchestra.run.vm05.stdout:/usr/bin/ceph: stderr set mgr/dashboard/cluster/status 2026-03-10T07:46:12.381 INFO:teuthology.orchestra.run.vm05.stdout:You can access the Ceph CLI as following in case of multi-cluster or non-default config: 2026-03-10T07:46:12.382 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-10T07:46:12.382 INFO:teuthology.orchestra.run.vm05.stdout: sudo /home/ubuntu/cephtest/cephadm shell --fsid 12e9780e-1c55-11f1-8896-79f7c2e9b508 -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring 2026-03-10T07:46:12.382 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-10T07:46:12.382 INFO:teuthology.orchestra.run.vm05.stdout:Or, if you are only running a single cluster on this host: 2026-03-10T07:46:12.382 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-10T07:46:12.382 INFO:teuthology.orchestra.run.vm05.stdout: sudo /home/ubuntu/cephtest/cephadm shell 2026-03-10T07:46:12.382 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-10T07:46:12.382 INFO:teuthology.orchestra.run.vm05.stdout:Please consider enabling telemetry to help improve Ceph: 2026-03-10T07:46:12.382 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-10T07:46:12.382 INFO:teuthology.orchestra.run.vm05.stdout: ceph telemetry on 2026-03-10T07:46:12.382 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-10T07:46:12.382 INFO:teuthology.orchestra.run.vm05.stdout:For more information see: 2026-03-10T07:46:12.382 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-10T07:46:12.382 INFO:teuthology.orchestra.run.vm05.stdout: https://docs.ceph.com/en/latest/mgr/telemetry/ 2026-03-10T07:46:12.382 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-10T07:46:12.382 INFO:teuthology.orchestra.run.vm05.stdout:Bootstrap 
complete. 2026-03-10T07:46:12.405 INFO:tasks.cephadm:Fetching config... 2026-03-10T07:46:12.406 DEBUG:teuthology.orchestra.run.vm05:> set -ex 2026-03-10T07:46:12.406 DEBUG:teuthology.orchestra.run.vm05:> dd if=/etc/ceph/ceph.conf of=/dev/stdout 2026-03-10T07:46:12.425 INFO:tasks.cephadm:Fetching client.admin keyring... 2026-03-10T07:46:12.425 DEBUG:teuthology.orchestra.run.vm05:> set -ex 2026-03-10T07:46:12.425 DEBUG:teuthology.orchestra.run.vm05:> dd if=/etc/ceph/ceph.client.admin.keyring of=/dev/stdout 2026-03-10T07:46:12.488 INFO:tasks.cephadm:Fetching mon keyring... 2026-03-10T07:46:12.488 DEBUG:teuthology.orchestra.run.vm05:> set -ex 2026-03-10T07:46:12.488 DEBUG:teuthology.orchestra.run.vm05:> sudo dd if=/var/lib/ceph/12e9780e-1c55-11f1-8896-79f7c2e9b508/mon.vm05/keyring of=/dev/stdout 2026-03-10T07:46:12.562 INFO:tasks.cephadm:Fetching pub ssh key... 2026-03-10T07:46:12.562 DEBUG:teuthology.orchestra.run.vm05:> set -ex 2026-03-10T07:46:12.562 DEBUG:teuthology.orchestra.run.vm05:> dd if=/home/ubuntu/cephtest/ceph.pub of=/dev/stdout 2026-03-10T07:46:12.617 INFO:tasks.cephadm:Installing pub ssh key for root users... 
2026-03-10T07:46:12.618 DEBUG:teuthology.orchestra.run.vm05:> sudo install -d -m 0700 /root/.ssh && echo 'ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDyvut0qWxnRXwISmv/mUA3HO6BUSTGqEA6tFNZAuE4tVWUasENK1pKS3o+yVImRTgz9e64jfhEHjSkaEvPh5AXFEDz/yoFvR2/8RFefSEKuXDSX7Mjpg5cXAaO1p6wHxkZfJrJ2mDRxEiiC3a63hkDnJ75eOQn6cQxlTUQYPVd8UCeEMCipS9uDZQunarRqEKYrZ/mNAEx36+9Vj5yS8cfbebzKZi5RG2H6IZDdICEyGMB+XTEUdK1z87vZncs0Vf6ckGiUcacDk4t8RHSxfBzd5yVF7Y/7NPSTJpCX5E6zVW2ldHri06w7XluDFrLVNaueSM7PZwjiDrQciE4u/2yaFv00fsAjEpc58ZvvOdqDT9lcvNRoNNoSjVnvK1AvRmUKaZrBHsCexA6UclTr5nOfasz6NOFSn1iHKZNEHDrOg8UYASszZF8riM6223sU0fBqzT66pptuIP3Fc+iEb2MiJzYFQyYAe6Z9yQ0PMuDkxwhB6g3HN/i4wmTScEFiFU= ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508' | sudo tee -a /root/.ssh/authorized_keys && sudo chmod 0600 /root/.ssh/authorized_keys 2026-03-10T07:46:12.692 INFO:teuthology.orchestra.run.vm05.stdout:ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDyvut0qWxnRXwISmv/mUA3HO6BUSTGqEA6tFNZAuE4tVWUasENK1pKS3o+yVImRTgz9e64jfhEHjSkaEvPh5AXFEDz/yoFvR2/8RFefSEKuXDSX7Mjpg5cXAaO1p6wHxkZfJrJ2mDRxEiiC3a63hkDnJ75eOQn6cQxlTUQYPVd8UCeEMCipS9uDZQunarRqEKYrZ/mNAEx36+9Vj5yS8cfbebzKZi5RG2H6IZDdICEyGMB+XTEUdK1z87vZncs0Vf6ckGiUcacDk4t8RHSxfBzd5yVF7Y/7NPSTJpCX5E6zVW2ldHri06w7XluDFrLVNaueSM7PZwjiDrQciE4u/2yaFv00fsAjEpc58ZvvOdqDT9lcvNRoNNoSjVnvK1AvRmUKaZrBHsCexA6UclTr5nOfasz6NOFSn1iHKZNEHDrOg8UYASszZF8riM6223sU0fBqzT66pptuIP3Fc+iEb2MiJzYFQyYAe6Z9yQ0PMuDkxwhB6g3HN/i4wmTScEFiFU= ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508 2026-03-10T07:46:12.702 DEBUG:teuthology.orchestra.run.vm08:> sudo install -d -m 0700 /root/.ssh && echo 'ssh-rsa 
AAAAB3NzaC1yc2EAAAADAQABAAABgQDyvut0qWxnRXwISmv/mUA3HO6BUSTGqEA6tFNZAuE4tVWUasENK1pKS3o+yVImRTgz9e64jfhEHjSkaEvPh5AXFEDz/yoFvR2/8RFefSEKuXDSX7Mjpg5cXAaO1p6wHxkZfJrJ2mDRxEiiC3a63hkDnJ75eOQn6cQxlTUQYPVd8UCeEMCipS9uDZQunarRqEKYrZ/mNAEx36+9Vj5yS8cfbebzKZi5RG2H6IZDdICEyGMB+XTEUdK1z87vZncs0Vf6ckGiUcacDk4t8RHSxfBzd5yVF7Y/7NPSTJpCX5E6zVW2ldHri06w7XluDFrLVNaueSM7PZwjiDrQciE4u/2yaFv00fsAjEpc58ZvvOdqDT9lcvNRoNNoSjVnvK1AvRmUKaZrBHsCexA6UclTr5nOfasz6NOFSn1iHKZNEHDrOg8UYASszZF8riM6223sU0fBqzT66pptuIP3Fc+iEb2MiJzYFQyYAe6Z9yQ0PMuDkxwhB6g3HN/i4wmTScEFiFU= ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508' | sudo tee -a /root/.ssh/authorized_keys && sudo chmod 0600 /root/.ssh/authorized_keys 2026-03-10T07:46:12.737 INFO:teuthology.orchestra.run.vm08.stdout:ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDyvut0qWxnRXwISmv/mUA3HO6BUSTGqEA6tFNZAuE4tVWUasENK1pKS3o+yVImRTgz9e64jfhEHjSkaEvPh5AXFEDz/yoFvR2/8RFefSEKuXDSX7Mjpg5cXAaO1p6wHxkZfJrJ2mDRxEiiC3a63hkDnJ75eOQn6cQxlTUQYPVd8UCeEMCipS9uDZQunarRqEKYrZ/mNAEx36+9Vj5yS8cfbebzKZi5RG2H6IZDdICEyGMB+XTEUdK1z87vZncs0Vf6ckGiUcacDk4t8RHSxfBzd5yVF7Y/7NPSTJpCX5E6zVW2ldHri06w7XluDFrLVNaueSM7PZwjiDrQciE4u/2yaFv00fsAjEpc58ZvvOdqDT9lcvNRoNNoSjVnvK1AvRmUKaZrBHsCexA6UclTr5nOfasz6NOFSn1iHKZNEHDrOg8UYASszZF8riM6223sU0fBqzT66pptuIP3Fc+iEb2MiJzYFQyYAe6Z9yQ0PMuDkxwhB6g3HN/i4wmTScEFiFU= ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508 2026-03-10T07:46:12.747 DEBUG:teuthology.orchestra.run.vm05:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 12e9780e-1c55-11f1-8896-79f7c2e9b508 -- ceph config set mgr mgr/cephadm/allow_ptrace true 2026-03-10T07:46:12.902 INFO:teuthology.orchestra.run.vm05.stderr:Inferring config /var/lib/ceph/12e9780e-1c55-11f1-8896-79f7c2e9b508/mon.vm05/config 2026-03-10T07:46:12.935 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:46:12 vm05 ceph-mon[50387]: from='client.14178 -' entity='client.admin' cmd=[{"prefix": "dashboard ac-user-create", "username": "admin", 
"rolename": "administrator", "force_password": true, "pwd_update_required": true, "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T07:46:12.935 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:46:12 vm05 ceph-mon[50387]: from='client.? 192.168.123.105:0/259534927' entity='client.admin' 2026-03-10T07:46:12.935 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:46:12 vm05 ceph-mon[50387]: mgrmap e12: vm05.blexke(active, since 2s) 2026-03-10T07:46:13.169 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:46:13.167+0000 7f6367140700 1 -- 192.168.123.105:0/1016274789 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f6360072230 msgr2=0x7f6360072650 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:46:13.169 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:46:13.167+0000 7f6367140700 1 --2- 192.168.123.105:0/1016274789 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f6360072230 0x7f6360072650 secure :-1 s=READY pgs=92 cs=0 l=1 rev1=1 crypto rx=0x7f635c009b50 tx=0x7f635c009e60 comp rx=0 tx=0).stop 2026-03-10T07:46:13.169 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:46:13.168+0000 7f6367140700 1 -- 192.168.123.105:0/1016274789 shutdown_connections 2026-03-10T07:46:13.169 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:46:13.168+0000 7f6367140700 1 --2- 192.168.123.105:0/1016274789 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f6360072230 0x7f6360072650 unknown :-1 s=CLOSED pgs=92 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:46:13.169 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:46:13.168+0000 7f6367140700 1 -- 192.168.123.105:0/1016274789 >> 192.168.123.105:0/1016274789 conn(0x7f636006d660 msgr2=0x7f636006fac0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:46:13.169 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:46:13.168+0000 7f6367140700 1 -- 192.168.123.105:0/1016274789 shutdown_connections 2026-03-10T07:46:13.169 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:46:13.168+0000 7f6367140700 1 -- 192.168.123.105:0/1016274789 wait complete. 2026-03-10T07:46:13.170 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:46:13.169+0000 7f6367140700 1 Processor -- start 2026-03-10T07:46:13.170 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:46:13.169+0000 7f6367140700 1 -- start start 2026-03-10T07:46:13.170 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:46:13.169+0000 7f6367140700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f6360072230 0x7f63601afa00 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:46:13.170 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:46:13.169+0000 7f6367140700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f63601aff40 con 0x7f6360072230 2026-03-10T07:46:13.170 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:46:13.169+0000 7f636613e700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f6360072230 0x7f63601afa00 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:46:13.170 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:46:13.169+0000 7f636613e700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f6360072230 0x7f63601afa00 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.105:43730/0 (socket says 192.168.123.105:43730) 2026-03-10T07:46:13.170 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:46:13.169+0000 7f636613e700 1 -- 192.168.123.105:0/4015415966 learned_addr learned my addr 192.168.123.105:0/4015415966 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T07:46:13.170 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:46:13.169+0000 7f636613e700 1 -- 192.168.123.105:0/4015415966 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f635c0097e0 con 0x7f6360072230 2026-03-10T07:46:13.171 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:46:13.170+0000 7f636613e700 1 --2- 192.168.123.105:0/4015415966 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f6360072230 0x7f63601afa00 secure :-1 s=READY pgs=93 cs=0 l=1 rev1=1 crypto rx=0x7f635c004f90 tx=0x7f635c0050d0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:46:13.171 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:46:13.170+0000 7f63577fe700 1 -- 192.168.123.105:0/4015415966 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f635c01c070 con 0x7f6360072230 2026-03-10T07:46:13.171 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:46:13.170+0000 7f63577fe700 1 -- 192.168.123.105:0/4015415966 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f635c021470 con 0x7f6360072230 2026-03-10T07:46:13.171 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:46:13.170+0000 7f63577fe700 1 -- 192.168.123.105:0/4015415966 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f635c00f460 con 0x7f6360072230 2026-03-10T07:46:13.171 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:46:13.170+0000 7f6367140700 1 -- 192.168.123.105:0/4015415966 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f63601b0080 con 0x7f6360072230 2026-03-10T07:46:13.171 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:46:13.170+0000 7f6367140700 1 -- 192.168.123.105:0/4015415966 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f63601b03e0 con 
0x7f6360072230 2026-03-10T07:46:13.174 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:46:13.171+0000 7f6367140700 1 -- 192.168.123.105:0/4015415966 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f6360110fd0 con 0x7f6360072230 2026-03-10T07:46:13.174 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:46:13.171+0000 7f63577fe700 1 -- 192.168.123.105:0/4015415966 <== mon.0 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 12) v1 ==== 45291+0+0 (secure 0 0 0) 0x7f635c0052a0 con 0x7f6360072230 2026-03-10T07:46:13.174 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:46:13.171+0000 7f63577fe700 1 --2- 192.168.123.105:0/4015415966 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f634c038020 0x7f634c03a4e0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:46:13.174 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:46:13.171+0000 7f63577fe700 1 -- 192.168.123.105:0/4015415966 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(3..3 src has 1..3) v4 ==== 1069+0+0 (secure 0 0 0) 0x7f635c04cdb0 con 0x7f6360072230 2026-03-10T07:46:13.174 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:46:13.171+0000 7f636593d700 1 --2- 192.168.123.105:0/4015415966 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f634c038020 0x7f634c03a4e0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:46:13.174 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:46:13.172+0000 7f636593d700 1 --2- 192.168.123.105:0/4015415966 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f634c038020 0x7f634c03a4e0 secure :-1 s=READY pgs=12 cs=0 l=1 rev1=1 crypto rx=0x7f6350006fd0 tx=0x7f6350006e40 comp rx=0 tx=0).ready entity=mgr.14164 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:46:13.174 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:46:13.174+0000 7f63577fe700 1 -- 192.168.123.105:0/4015415966 <== mon.0 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7f635c00e9b0 con 0x7f6360072230 2026-03-10T07:46:13.290 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:46:13.289+0000 7f6367140700 1 -- 192.168.123.105:0/4015415966 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command([{prefix=config set, name=mgr/cephadm/allow_ptrace}] v 0) v1 -- 0x7f636004f000 con 0x7f6360072230 2026-03-10T07:46:13.296 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:46:13.295+0000 7f63577fe700 1 -- 192.168.123.105:0/4015415966 <== mon.0 v2:192.168.123.105:3300/0 7 ==== mon_command_ack([{prefix=config set, name=mgr/cephadm/allow_ptrace}]=0 v9) v1 ==== 125+0+0 (secure 0 0 0) 0x7f635c026020 con 0x7f6360072230 2026-03-10T07:46:13.299 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:46:13.298+0000 7f6367140700 1 -- 192.168.123.105:0/4015415966 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f634c038020 msgr2=0x7f634c03a4e0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:46:13.299 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:46:13.298+0000 7f6367140700 1 --2- 192.168.123.105:0/4015415966 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f634c038020 0x7f634c03a4e0 secure :-1 s=READY pgs=12 cs=0 l=1 rev1=1 crypto rx=0x7f6350006fd0 tx=0x7f6350006e40 comp rx=0 tx=0).stop 2026-03-10T07:46:13.299 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:46:13.298+0000 7f6367140700 1 -- 192.168.123.105:0/4015415966 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f6360072230 msgr2=0x7f63601afa00 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:46:13.299 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:46:13.298+0000 7f6367140700 1 --2- 
192.168.123.105:0/4015415966 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f6360072230 0x7f63601afa00 secure :-1 s=READY pgs=93 cs=0 l=1 rev1=1 crypto rx=0x7f635c004f90 tx=0x7f635c0050d0 comp rx=0 tx=0).stop 2026-03-10T07:46:13.299 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:46:13.298+0000 7f6367140700 1 -- 192.168.123.105:0/4015415966 shutdown_connections 2026-03-10T07:46:13.299 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:46:13.298+0000 7f6367140700 1 --2- 192.168.123.105:0/4015415966 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f634c038020 0x7f634c03a4e0 unknown :-1 s=CLOSED pgs=12 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:46:13.299 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:46:13.298+0000 7f6367140700 1 --2- 192.168.123.105:0/4015415966 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f6360072230 0x7f63601afa00 unknown :-1 s=CLOSED pgs=93 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:46:13.299 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:46:13.298+0000 7f6367140700 1 -- 192.168.123.105:0/4015415966 >> 192.168.123.105:0/4015415966 conn(0x7f636006d660 msgr2=0x7f636006e340 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:46:13.299 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:46:13.298+0000 7f6367140700 1 -- 192.168.123.105:0/4015415966 shutdown_connections 2026-03-10T07:46:13.299 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:46:13.299+0000 7f6367140700 1 -- 192.168.123.105:0/4015415966 wait complete. 
2026-03-10T07:46:13.368 INFO:tasks.cephadm:Distributing conf and client.admin keyring to all hosts + 0755 2026-03-10T07:46:13.369 DEBUG:teuthology.orchestra.run.vm05:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 12e9780e-1c55-11f1-8896-79f7c2e9b508 -- ceph orch client-keyring set client.admin '*' --mode 0755 2026-03-10T07:46:13.638 INFO:teuthology.orchestra.run.vm05.stderr:Inferring config /var/lib/ceph/12e9780e-1c55-11f1-8896-79f7c2e9b508/mon.vm05/config 2026-03-10T07:46:13.930 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:46:13.928+0000 7f993a4ad700 1 -- 192.168.123.105:0/2007233056 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f9934101570 msgr2=0x7f9934103960 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:46:13.930 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:46:13.928+0000 7f993a4ad700 1 --2- 192.168.123.105:0/2007233056 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f9934101570 0x7f9934103960 secure :-1 s=READY pgs=94 cs=0 l=1 rev1=1 crypto rx=0x7f991c009b00 tx=0x7f991c009e10 comp rx=0 tx=0).stop 2026-03-10T07:46:13.930 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:46:13.928+0000 7f993a4ad700 1 -- 192.168.123.105:0/2007233056 shutdown_connections 2026-03-10T07:46:13.931 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:46:13.928+0000 7f993a4ad700 1 --2- 192.168.123.105:0/2007233056 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f9934101570 0x7f9934103960 secure :-1 s=CLOSED pgs=94 cs=0 l=1 rev1=1 crypto rx=0x7f991c009b00 tx=0x7f991c009e10 comp rx=0 tx=0).stop 2026-03-10T07:46:13.931 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:46:13.928+0000 7f993a4ad700 1 -- 192.168.123.105:0/2007233056 >> 192.168.123.105:0/2007233056 conn(0x7f99340faf00 msgr2=0x7f99340fd340 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:46:13.931 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:46:13.928+0000 7f993a4ad700 1 -- 192.168.123.105:0/2007233056 shutdown_connections 2026-03-10T07:46:13.931 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:46:13.928+0000 7f993a4ad700 1 -- 192.168.123.105:0/2007233056 wait complete. 2026-03-10T07:46:13.931 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:46:13.928+0000 7f993a4ad700 1 Processor -- start 2026-03-10T07:46:13.931 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:46:13.928+0000 7f993a4ad700 1 -- start start 2026-03-10T07:46:13.931 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:46:13.928+0000 7f993a4ad700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f993406fae0 0x7f993406ff00 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:46:13.931 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:46:13.928+0000 7f993a4ad700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f9934070440 con 0x7f993406fae0 2026-03-10T07:46:13.931 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:46:13.929+0000 7f9933fff700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f993406fae0 0x7f993406ff00 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:46:13.931 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:46:13.929+0000 7f9933fff700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f993406fae0 0x7f993406ff00 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.105:43752/0 (socket says 192.168.123.105:43752) 2026-03-10T07:46:13.931 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:46:13.929+0000 7f9933fff700 1 -- 192.168.123.105:0/3672549050 learned_addr learned my addr 
192.168.123.105:0/3672549050 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T07:46:13.931 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:46:13.929+0000 7f9933fff700 1 -- 192.168.123.105:0/3672549050 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f991c0097e0 con 0x7f993406fae0 2026-03-10T07:46:13.931 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:46:13.929+0000 7f9933fff700 1 --2- 192.168.123.105:0/3672549050 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f993406fae0 0x7f993406ff00 secure :-1 s=READY pgs=95 cs=0 l=1 rev1=1 crypto rx=0x7f991c00b980 tx=0x7f991c003960 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:46:13.931 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:46:13.930+0000 7f99317fa700 1 -- 192.168.123.105:0/3672549050 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f991c004030 con 0x7f993406fae0 2026-03-10T07:46:13.931 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:46:13.930+0000 7f99317fa700 1 -- 192.168.123.105:0/3672549050 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f991c005dc0 con 0x7f993406fae0 2026-03-10T07:46:13.931 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:46:13.930+0000 7f99317fa700 1 -- 192.168.123.105:0/3672549050 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f991c017440 con 0x7f993406fae0 2026-03-10T07:46:13.933 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:46:13.931+0000 7f993a4ad700 1 -- 192.168.123.105:0/3672549050 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f9934070640 con 0x7f993406fae0 2026-03-10T07:46:13.933 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:46:13.931+0000 7f993a4ad700 1 -- 192.168.123.105:0/3672549050 --> 
[v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f9934071300 con 0x7f993406fae0 2026-03-10T07:46:13.935 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:46:13.933+0000 7f993a4ad700 1 -- 192.168.123.105:0/3672549050 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f9934108fe0 con 0x7f993406fae0 2026-03-10T07:46:13.936 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:46:13.935+0000 7f99317fa700 1 -- 192.168.123.105:0/3672549050 <== mon.0 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 12) v1 ==== 45291+0+0 (secure 0 0 0) 0x7f991c003c40 con 0x7f993406fae0 2026-03-10T07:46:13.936 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:46:13.935+0000 7f99317fa700 1 --2- 192.168.123.105:0/3672549050 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f9920038450 0x7f992003a910 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:46:13.936 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:46:13.935+0000 7f99337fe700 1 --2- 192.168.123.105:0/3672549050 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f9920038450 0x7f992003a910 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:46:13.936 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:46:13.935+0000 7f99317fa700 1 -- 192.168.123.105:0/3672549050 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(3..3 src has 1..3) v4 ==== 1069+0+0 (secure 0 0 0) 0x7f991c04c230 con 0x7f993406fae0 2026-03-10T07:46:13.936 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:46:13.935+0000 7f99337fe700 1 --2- 192.168.123.105:0/3672549050 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f9920038450 0x7f992003a910 secure :-1 s=READY pgs=13 cs=0 l=1 rev1=1 crypto rx=0x7f9924006fd0 tx=0x7f9924006e40 comp rx=0 tx=0).ready 
entity=mgr.14164 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:46:13.936 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:46:13.936+0000 7f99317fa700 1 -- 192.168.123.105:0/3672549050 <== mon.0 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7f991c02a970 con 0x7f993406fae0 2026-03-10T07:46:14.045 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:46:14.041+0000 7f993a4ad700 1 -- 192.168.123.105:0/3672549050 --> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] -- mgr_command(tid 0: {"prefix": "orch client-keyring set", "entity": "client.admin", "placement": "*", "mode": "0755", "target": ["mon-mgr", ""]}) v1 -- 0x7f993402cfe0 con 0x7f9920038450 2026-03-10T07:46:14.048 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:46:14.047+0000 7f99317fa700 1 -- 192.168.123.105:0/3672549050 <== mgr.14164 v2:192.168.123.105:6800/2 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+0 (secure 0 0 0) 0x7f993402cfe0 con 0x7f9920038450 2026-03-10T07:46:14.052 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:46:14.050+0000 7f993a4ad700 1 -- 192.168.123.105:0/3672549050 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f9920038450 msgr2=0x7f992003a910 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:46:14.052 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:46:14.050+0000 7f993a4ad700 1 --2- 192.168.123.105:0/3672549050 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f9920038450 0x7f992003a910 secure :-1 s=READY pgs=13 cs=0 l=1 rev1=1 crypto rx=0x7f9924006fd0 tx=0x7f9924006e40 comp rx=0 tx=0).stop 2026-03-10T07:46:14.052 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:46:14.050+0000 7f993a4ad700 1 -- 192.168.123.105:0/3672549050 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f993406fae0 msgr2=0x7f993406ff00 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 
2026-03-10T07:46:14.052 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:46:14.050+0000 7f993a4ad700 1 --2- 192.168.123.105:0/3672549050 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f993406fae0 0x7f993406ff00 secure :-1 s=READY pgs=95 cs=0 l=1 rev1=1 crypto rx=0x7f991c00b980 tx=0x7f991c003960 comp rx=0 tx=0).stop 2026-03-10T07:46:14.052 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:46:14.051+0000 7f993a4ad700 1 -- 192.168.123.105:0/3672549050 shutdown_connections 2026-03-10T07:46:14.052 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:46:14.051+0000 7f993a4ad700 1 --2- 192.168.123.105:0/3672549050 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f9920038450 0x7f992003a910 unknown :-1 s=CLOSED pgs=13 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:46:14.052 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:46:14.051+0000 7f993a4ad700 1 --2- 192.168.123.105:0/3672549050 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f993406fae0 0x7f993406ff00 unknown :-1 s=CLOSED pgs=95 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:46:14.052 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:46:14.051+0000 7f993a4ad700 1 -- 192.168.123.105:0/3672549050 >> 192.168.123.105:0/3672549050 conn(0x7f99340faf00 msgr2=0x7f99340fbbc0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:46:14.052 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:46:14.051+0000 7f993a4ad700 1 -- 192.168.123.105:0/3672549050 shutdown_connections 2026-03-10T07:46:14.052 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:46:14.051+0000 7f993a4ad700 1 -- 192.168.123.105:0/3672549050 wait complete. 
2026-03-10T07:46:14.099 INFO:tasks.cephadm:Writing (initial) conf and keyring to vm08 2026-03-10T07:46:14.099 DEBUG:teuthology.orchestra.run.vm08:> set -ex 2026-03-10T07:46:14.099 DEBUG:teuthology.orchestra.run.vm08:> dd of=/etc/ceph/ceph.conf 2026-03-10T07:46:14.117 DEBUG:teuthology.orchestra.run.vm08:> set -ex 2026-03-10T07:46:14.117 DEBUG:teuthology.orchestra.run.vm08:> dd of=/etc/ceph/ceph.client.admin.keyring 2026-03-10T07:46:14.175 INFO:tasks.cephadm:Adding host vm08 to orchestrator... 2026-03-10T07:46:14.175 DEBUG:teuthology.orchestra.run.vm05:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 12e9780e-1c55-11f1-8896-79f7c2e9b508 -- ceph orch host add vm08 2026-03-10T07:46:14.363 INFO:teuthology.orchestra.run.vm05.stderr:Inferring config /var/lib/ceph/12e9780e-1c55-11f1-8896-79f7c2e9b508/mon.vm05/config 2026-03-10T07:46:14.568 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:46:14 vm05 ceph-mon[50387]: from='client.? 
192.168.123.105:0/4015415966' entity='client.admin' 2026-03-10T07:46:14.568 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:46:14 vm05 ceph-mon[50387]: from='mgr.14164 192.168.123.105:0/2' entity='mgr.vm05.blexke' 2026-03-10T07:46:14.568 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:46:14 vm05 ceph-mon[50387]: from='mgr.14164 192.168.123.105:0/2' entity='mgr.vm05.blexke' 2026-03-10T07:46:14.568 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:46:14 vm05 ceph-mon[50387]: from='mgr.14164 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "config rm", "who": "osd/host:vm05", "name": "osd_memory_target"}]: dispatch 2026-03-10T07:46:14.568 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:46:14 vm05 ceph-mon[50387]: from='mgr.14164 192.168.123.105:0/2' entity='mgr.vm05.blexke' 2026-03-10T07:46:14.568 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:46:14 vm05 ceph-mon[50387]: from='mgr.14164 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "auth get-or-create", "entity": "client.ceph-exporter.vm05", "caps": ["mon", "profile ceph-exporter", "mon", "allow r", "mgr", "allow r", "osd", "allow r"]}]: dispatch 2026-03-10T07:46:14.568 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:46:14 vm05 ceph-mon[50387]: from='mgr.14164 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd='[{"prefix": "auth get-or-create", "entity": "client.ceph-exporter.vm05", "caps": ["mon", "profile ceph-exporter", "mon", "allow r", "mgr", "allow r", "osd", "allow r"]}]': finished 2026-03-10T07:46:14.568 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:46:14 vm05 ceph-mon[50387]: from='mgr.14164 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T07:46:14.568 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:46:14 vm05 ceph-mon[50387]: Deploying daemon ceph-exporter.vm05 on vm05 2026-03-10T07:46:14.568 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:46:14 vm05 ceph-mon[50387]: 
from='client.14188 -' entity='client.admin' cmd=[{"prefix": "orch client-keyring set", "entity": "client.admin", "placement": "*", "mode": "0755", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T07:46:14.568 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:46:14 vm05 ceph-mon[50387]: from='mgr.14164 192.168.123.105:0/2' entity='mgr.vm05.blexke' 2026-03-10T07:46:14.770 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:46:14.767+0000 7fad9e734700 1 -- 192.168.123.105:0/2934421794 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fad981083e0 msgr2=0x7fad98108800 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:46:14.770 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:46:14.767+0000 7fad9e734700 1 --2- 192.168.123.105:0/2934421794 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fad981083e0 0x7fad98108800 secure :-1 s=READY pgs=96 cs=0 l=1 rev1=1 crypto rx=0x7fad90009230 tx=0x7fad90009260 comp rx=0 tx=0).stop 2026-03-10T07:46:14.770 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:46:14.767+0000 7fad9e734700 1 -- 192.168.123.105:0/2934421794 shutdown_connections 2026-03-10T07:46:14.770 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:46:14.767+0000 7fad9e734700 1 --2- 192.168.123.105:0/2934421794 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fad981083e0 0x7fad98108800 unknown :-1 s=CLOSED pgs=96 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:46:14.770 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:46:14.767+0000 7fad9e734700 1 -- 192.168.123.105:0/2934421794 >> 192.168.123.105:0/2934421794 conn(0x7fad9806d660 msgr2=0x7fad9806fac0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:46:14.770 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:46:14.767+0000 7fad9e734700 1 -- 192.168.123.105:0/2934421794 shutdown_connections 2026-03-10T07:46:14.770 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:46:14.767+0000 7fad9e734700 1 
-- 192.168.123.105:0/2934421794 wait complete. 2026-03-10T07:46:14.770 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:46:14.767+0000 7fad9e734700 1 Processor -- start 2026-03-10T07:46:14.770 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:46:14.767+0000 7fad9e734700 1 -- start start 2026-03-10T07:46:14.770 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:46:14.767+0000 7fad9e734700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fad9807eb90 0x7fad9807efb0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:46:14.770 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:46:14.767+0000 7fad9e734700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fad9000bdd0 con 0x7fad9807eb90 2026-03-10T07:46:14.770 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:46:14.767+0000 7fad9d732700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fad9807eb90 0x7fad9807efb0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:46:14.770 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:46:14.767+0000 7fad9d732700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fad9807eb90 0x7fad9807efb0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.105:43780/0 (socket says 192.168.123.105:43780) 2026-03-10T07:46:14.770 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:46:14.767+0000 7fad9d732700 1 -- 192.168.123.105:0/1569814654 learned_addr learned my addr 192.168.123.105:0/1569814654 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T07:46:14.770 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:46:14.767+0000 7fad9d732700 1 -- 192.168.123.105:0/1569814654 --> 
[v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fad90008ee0 con 0x7fad9807eb90 2026-03-10T07:46:14.770 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:46:14.769+0000 7fad9d732700 1 --2- 192.168.123.105:0/1569814654 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fad9807eb90 0x7fad9807efb0 secure :-1 s=READY pgs=97 cs=0 l=1 rev1=1 crypto rx=0x7fad9000bbb0 tx=0x7fad9000bbe0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:46:14.770 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:46:14.769+0000 7fad8e7fc700 1 -- 192.168.123.105:0/1569814654 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fad900054d0 con 0x7fad9807eb90 2026-03-10T07:46:14.770 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:46:14.769+0000 7fad9e734700 1 -- 192.168.123.105:0/1569814654 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fad9807f4f0 con 0x7fad9807eb90 2026-03-10T07:46:14.771 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:46:14.769+0000 7fad9e734700 1 -- 192.168.123.105:0/1569814654 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fad98080160 con 0x7fad9807eb90 2026-03-10T07:46:14.771 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:46:14.770+0000 7fad8e7fc700 1 -- 192.168.123.105:0/1569814654 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fad9001e420 con 0x7fad9807eb90 2026-03-10T07:46:14.771 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:46:14.770+0000 7fad8e7fc700 1 -- 192.168.123.105:0/1569814654 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fad900165b0 con 0x7fad9807eb90 2026-03-10T07:46:14.771 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:46:14.770+0000 7fad8e7fc700 1 -- 
192.168.123.105:0/1569814654 <== mon.0 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 12) v1 ==== 45291+0+0 (secure 0 0 0) 0x7fad9001c070 con 0x7fad9807eb90 2026-03-10T07:46:14.771 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:46:14.770+0000 7fad8e7fc700 1 --2- 192.168.123.105:0/1569814654 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fad84038430 0x7fad8403a8f0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:46:14.774 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:46:14.771+0000 7fad9cf31700 1 --2- 192.168.123.105:0/1569814654 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fad84038430 0x7fad8403a8f0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:46:14.774 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:46:14.771+0000 7fad8e7fc700 1 -- 192.168.123.105:0/1569814654 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(3..3 src has 1..3) v4 ==== 1069+0+0 (secure 0 0 0) 0x7fad9004dc60 con 0x7fad9807eb90 2026-03-10T07:46:14.774 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:46:14.771+0000 7fad9cf31700 1 --2- 192.168.123.105:0/1569814654 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fad84038430 0x7fad8403a8f0 secure :-1 s=READY pgs=14 cs=0 l=1 rev1=1 crypto rx=0x7fad94006fd0 tx=0x7fad94006e40 comp rx=0 tx=0).ready entity=mgr.14164 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:46:14.774 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:46:14.771+0000 7fad9e734700 1 -- 192.168.123.105:0/1569814654 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fad7c005320 con 0x7fad9807eb90 2026-03-10T07:46:14.775 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:46:14.774+0000 7fad8e7fc700 1 -- 192.168.123.105:0/1569814654 <== mon.0 
v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7fad90016710 con 0x7fad9807eb90 2026-03-10T07:46:14.899 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:46:14.898+0000 7fad9e734700 1 -- 192.168.123.105:0/1569814654 --> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] -- mgr_command(tid 0: {"prefix": "orch host add", "hostname": "vm08", "target": ["mon-mgr", ""]}) v1 -- 0x7fad7c000bf0 con 0x7fad84038430 2026-03-10T07:46:16.067 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:46:16.064+0000 7fad8e7fc700 1 -- 192.168.123.105:0/1569814654 <== mon.0 v2:192.168.123.105:3300/0 7 ==== mgrmap(e 13) v1 ==== 45337+0+0 (secure 0 0 0) 0x7fad90027400 con 0x7fad9807eb90 2026-03-10T07:46:16.407 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:46:16 vm05 ceph-mon[50387]: from='client.14190 -' entity='client.admin' cmd=[{"prefix": "orch host add", "hostname": "vm08", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T07:46:16.408 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:46:16 vm05 ceph-mon[50387]: from='mgr.14164 192.168.123.105:0/2' entity='mgr.vm05.blexke' 2026-03-10T07:46:16.408 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:46:16 vm05 ceph-mon[50387]: from='mgr.14164 192.168.123.105:0/2' entity='mgr.vm05.blexke' 2026-03-10T07:46:16.408 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:46:16 vm05 ceph-mon[50387]: from='mgr.14164 192.168.123.105:0/2' entity='mgr.vm05.blexke' 2026-03-10T07:46:16.408 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:46:16 vm05 ceph-mon[50387]: from='mgr.14164 192.168.123.105:0/2' entity='mgr.vm05.blexke' 2026-03-10T07:46:16.408 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:46:16 vm05 ceph-mon[50387]: from='mgr.14164 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "auth get-or-create", "entity": "client.crash.vm05", "caps": ["mon", "profile crash", "mgr", "profile crash"]}]: dispatch 2026-03-10T07:46:16.408 
INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:46:16 vm05 ceph-mon[50387]: from='mgr.14164 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd='[{"prefix": "auth get-or-create", "entity": "client.crash.vm05", "caps": ["mon", "profile crash", "mgr", "profile crash"]}]': finished 2026-03-10T07:46:16.408 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:46:16 vm05 ceph-mon[50387]: from='mgr.14164 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T07:46:16.408 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:46:16 vm05 ceph-mon[50387]: Deploying daemon crash.vm05 on vm05 2026-03-10T07:46:16.408 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:46:16 vm05 ceph-mon[50387]: from='mgr.14164 192.168.123.105:0/2' entity='mgr.vm05.blexke' 2026-03-10T07:46:16.408 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:46:16 vm05 ceph-mon[50387]: from='mgr.14164 192.168.123.105:0/2' entity='mgr.vm05.blexke' 2026-03-10T07:46:16.408 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:46:16 vm05 ceph-mon[50387]: from='mgr.14164 192.168.123.105:0/2' entity='mgr.vm05.blexke' 2026-03-10T07:46:16.408 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:46:16 vm05 ceph-mon[50387]: from='mgr.14164 192.168.123.105:0/2' entity='mgr.vm05.blexke' 2026-03-10T07:46:16.694 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:46:16.689+0000 7fad8e7fc700 1 -- 192.168.123.105:0/1569814654 <== mgr.14164 v2:192.168.123.105:6800/2 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+46 (secure 0 0 0) 0x7fad7c000bf0 con 0x7fad84038430 2026-03-10T07:46:16.694 INFO:teuthology.orchestra.run.vm05.stdout:Added host 'vm08' with addr '192.168.123.108' 2026-03-10T07:46:16.694 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:46:16.693+0000 7fad9e734700 1 -- 192.168.123.105:0/1569814654 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fad84038430 msgr2=0x7fad8403a8f0 secure :-1 s=STATE_CONNECTION_ESTABLISHED 
l=1).mark_down 2026-03-10T07:46:16.694 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:46:16.693+0000 7fad9e734700 1 --2- 192.168.123.105:0/1569814654 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fad84038430 0x7fad8403a8f0 secure :-1 s=READY pgs=14 cs=0 l=1 rev1=1 crypto rx=0x7fad94006fd0 tx=0x7fad94006e40 comp rx=0 tx=0).stop 2026-03-10T07:46:16.694 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:46:16.693+0000 7fad9e734700 1 -- 192.168.123.105:0/1569814654 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fad9807eb90 msgr2=0x7fad9807efb0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:46:16.694 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:46:16.693+0000 7fad9e734700 1 --2- 192.168.123.105:0/1569814654 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fad9807eb90 0x7fad9807efb0 secure :-1 s=READY pgs=97 cs=0 l=1 rev1=1 crypto rx=0x7fad9000bbb0 tx=0x7fad9000bbe0 comp rx=0 tx=0).stop 2026-03-10T07:46:16.695 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:46:16.693+0000 7fad9e734700 1 -- 192.168.123.105:0/1569814654 shutdown_connections 2026-03-10T07:46:16.695 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:46:16.693+0000 7fad9e734700 1 --2- 192.168.123.105:0/1569814654 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fad84038430 0x7fad8403a8f0 unknown :-1 s=CLOSED pgs=14 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:46:16.695 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:46:16.693+0000 7fad9e734700 1 --2- 192.168.123.105:0/1569814654 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fad9807eb90 0x7fad9807efb0 unknown :-1 s=CLOSED pgs=97 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:46:16.695 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:46:16.693+0000 7fad9e734700 1 -- 192.168.123.105:0/1569814654 >> 192.168.123.105:0/1569814654 conn(0x7fad9806d660 
msgr2=0x7fad9806e340 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:46:16.695 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:46:16.693+0000 7fad9e734700 1 -- 192.168.123.105:0/1569814654 shutdown_connections 2026-03-10T07:46:16.695 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:46:16.694+0000 7fad9e734700 1 -- 192.168.123.105:0/1569814654 wait complete. 2026-03-10T07:46:16.841 DEBUG:teuthology.orchestra.run.vm05:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 12e9780e-1c55-11f1-8896-79f7c2e9b508 -- ceph orch host ls --format=json 2026-03-10T07:46:16.992 INFO:teuthology.orchestra.run.vm05.stderr:Inferring config /var/lib/ceph/12e9780e-1c55-11f1-8896-79f7c2e9b508/mon.vm05/config 2026-03-10T07:46:17.267 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:46:17.265+0000 7f56ed85d700 1 -- 192.168.123.105:0/1361303515 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f56e8107090 msgr2=0x7f56e81074b0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:46:17.267 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:46:17.265+0000 7f56ed85d700 1 --2- 192.168.123.105:0/1361303515 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f56e8107090 0x7f56e81074b0 secure :-1 s=READY pgs=99 cs=0 l=1 rev1=1 crypto rx=0x7f56d8009b00 tx=0x7f56d8009e10 comp rx=0 tx=0).stop 2026-03-10T07:46:17.267 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:46:17.266+0000 7f56ed85d700 1 -- 192.168.123.105:0/1361303515 shutdown_connections 2026-03-10T07:46:17.267 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:46:17.266+0000 7f56ed85d700 1 --2- 192.168.123.105:0/1361303515 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f56e8107090 0x7f56e81074b0 unknown :-1 s=CLOSED pgs=99 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:46:17.267 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:46:17.266+0000 7f56ed85d700 1 -- 192.168.123.105:0/1361303515 >> 192.168.123.105:0/1361303515 conn(0x7f56e8076040 msgr2=0x7f56e80784a0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:46:17.267 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:46:17.266+0000 7f56ed85d700 1 -- 192.168.123.105:0/1361303515 shutdown_connections 2026-03-10T07:46:17.267 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:46:17.266+0000 7f56ed85d700 1 -- 192.168.123.105:0/1361303515 wait complete. 2026-03-10T07:46:17.268 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:46:17.267+0000 7f56ed85d700 1 Processor -- start 2026-03-10T07:46:17.268 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:46:17.267+0000 7f56ed85d700 1 -- start start 2026-03-10T07:46:17.268 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:46:17.267+0000 7f56ed85d700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f56e8107090 0x7f56e819c1a0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:46:17.268 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:46:17.267+0000 7f56ed85d700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f56e819c6e0 con 0x7f56e8107090 2026-03-10T07:46:17.268 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:46:17.267+0000 7f56e6ffd700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f56e8107090 0x7f56e819c1a0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:46:17.268 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:46:17.268+0000 7f56e6ffd700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f56e8107090 0x7f56e819c1a0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer 
v2:192.168.123.105:3300/0 says I am v2:192.168.123.105:43806/0 (socket says 192.168.123.105:43806) 2026-03-10T07:46:17.268 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:46:17.268+0000 7f56e6ffd700 1 -- 192.168.123.105:0/3843974860 learned_addr learned my addr 192.168.123.105:0/3843974860 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T07:46:17.268 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:46:17.268+0000 7f56e6ffd700 1 -- 192.168.123.105:0/3843974860 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f56d80097e0 con 0x7f56e8107090 2026-03-10T07:46:17.269 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:46:17.268+0000 7f56e6ffd700 1 --2- 192.168.123.105:0/3843974860 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f56e8107090 0x7f56e819c1a0 secure :-1 s=READY pgs=100 cs=0 l=1 rev1=1 crypto rx=0x7f56d8004750 tx=0x7f56d8005dc0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:46:17.270 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:46:17.268+0000 7f56ec85b700 1 -- 192.168.123.105:0/3843974860 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f56d801c070 con 0x7f56e8107090 2026-03-10T07:46:17.270 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:46:17.268+0000 7f56ec85b700 1 -- 192.168.123.105:0/3843974860 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f56d8021470 con 0x7f56e8107090 2026-03-10T07:46:17.270 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:46:17.268+0000 7f56ec85b700 1 -- 192.168.123.105:0/3843974860 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f56d800f460 con 0x7f56e8107090 2026-03-10T07:46:17.270 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:46:17.268+0000 7f56ed85d700 1 -- 192.168.123.105:0/3843974860 --> 
[v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f56e819c8e0 con 0x7f56e8107090 2026-03-10T07:46:17.270 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:46:17.268+0000 7f56ed85d700 1 -- 192.168.123.105:0/3843974860 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f56e819cd80 con 0x7f56e8107090 2026-03-10T07:46:17.270 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:46:17.269+0000 7f56ec85b700 1 -- 192.168.123.105:0/3843974860 <== mon.0 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 13) v1 ==== 45337+0+0 (secure 0 0 0) 0x7f56d800f5c0 con 0x7f56e8107090 2026-03-10T07:46:17.271 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:46:17.269+0000 7f56ec85b700 1 --2- 192.168.123.105:0/3843974860 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f56d4038440 0x7f56d403a900 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:46:17.271 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:46:17.269+0000 7f56ec85b700 1 -- 192.168.123.105:0/3843974860 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(3..3 src has 1..3) v4 ==== 1069+0+0 (secure 0 0 0) 0x7f56d804d470 con 0x7f56e8107090 2026-03-10T07:46:17.271 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:46:17.270+0000 7f56e67fc700 1 --2- 192.168.123.105:0/3843974860 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f56d4038440 0x7f56d403a900 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:46:17.271 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:46:17.270+0000 7f56e67fc700 1 --2- 192.168.123.105:0/3843974860 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f56d4038440 0x7f56d403a900 secure :-1 s=READY pgs=15 cs=0 l=1 rev1=1 crypto rx=0x7f56d0006fd0 tx=0x7f56d0006e40 comp rx=0 tx=0).ready entity=mgr.14164 
client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:46:17.271 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:46:17.271+0000 7f56ed85d700 1 -- 192.168.123.105:0/3843974860 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f56e8195e20 con 0x7f56e8107090 2026-03-10T07:46:17.274 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:46:17.273+0000 7f56ec85b700 1 -- 192.168.123.105:0/3843974860 <== mon.0 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7f56d8026070 con 0x7f56e8107090 2026-03-10T07:46:17.381 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:46:17 vm05 ceph-mon[50387]: Deploying cephadm binary to vm08 2026-03-10T07:46:17.381 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:46:17 vm05 ceph-mon[50387]: mgrmap e13: vm05.blexke(active, since 6s) 2026-03-10T07:46:17.381 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:46:17 vm05 ceph-mon[50387]: from='mgr.14164 192.168.123.105:0/2' entity='mgr.vm05.blexke' 2026-03-10T07:46:17.381 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:46:17.379+0000 7f56ed85d700 1 -- 192.168.123.105:0/3843974860 --> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] -- mgr_command(tid 0: {"prefix": "orch host ls", "target": ["mon-mgr", ""], "format": "json"}) v1 -- 0x7f56e80611d0 con 0x7f56d4038440 2026-03-10T07:46:17.381 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:46:17.380+0000 7f56ec85b700 1 -- 192.168.123.105:0/3843974860 <== mgr.14164 v2:192.168.123.105:6800/2 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+155 (secure 0 0 0) 0x7f56e80611d0 con 0x7f56d4038440 2026-03-10T07:46:17.382 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-10T07:46:17.382 INFO:teuthology.orchestra.run.vm05.stdout:[{"addr": "192.168.123.105", "hostname": "vm05", "labels": [], "status": ""}, {"addr": "192.168.123.108", "hostname": "vm08", 
"labels": [], "status": ""}] 2026-03-10T07:46:17.384 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:46:17.383+0000 7f56ed85d700 1 -- 192.168.123.105:0/3843974860 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f56d4038440 msgr2=0x7f56d403a900 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:46:17.384 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:46:17.384+0000 7f56ed85d700 1 --2- 192.168.123.105:0/3843974860 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f56d4038440 0x7f56d403a900 secure :-1 s=READY pgs=15 cs=0 l=1 rev1=1 crypto rx=0x7f56d0006fd0 tx=0x7f56d0006e40 comp rx=0 tx=0).stop 2026-03-10T07:46:17.385 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:46:17.384+0000 7f56ed85d700 1 -- 192.168.123.105:0/3843974860 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f56e8107090 msgr2=0x7f56e819c1a0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:46:17.385 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:46:17.384+0000 7f56ed85d700 1 --2- 192.168.123.105:0/3843974860 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f56e8107090 0x7f56e819c1a0 secure :-1 s=READY pgs=100 cs=0 l=1 rev1=1 crypto rx=0x7f56d8004750 tx=0x7f56d8005dc0 comp rx=0 tx=0).stop 2026-03-10T07:46:17.385 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:46:17.384+0000 7f56ed85d700 1 -- 192.168.123.105:0/3843974860 shutdown_connections 2026-03-10T07:46:17.385 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:46:17.384+0000 7f56ed85d700 1 --2- 192.168.123.105:0/3843974860 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f56d4038440 0x7f56d403a900 unknown :-1 s=CLOSED pgs=15 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:46:17.385 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:46:17.385+0000 7f56ed85d700 1 --2- 192.168.123.105:0/3843974860 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] 
conn(0x7f56e8107090 0x7f56e819c1a0 unknown :-1 s=CLOSED pgs=100 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:46:17.385 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:46:17.385+0000 7f56ed85d700 1 -- 192.168.123.105:0/3843974860 >> 192.168.123.105:0/3843974860 conn(0x7f56e8076040 msgr2=0x7f56e8076d20 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:46:17.386 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:46:17.385+0000 7f56ed85d700 1 -- 192.168.123.105:0/3843974860 shutdown_connections 2026-03-10T07:46:17.386 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:46:17.385+0000 7f56ed85d700 1 -- 192.168.123.105:0/3843974860 wait complete. 2026-03-10T07:46:17.431 INFO:tasks.cephadm:Setting crush tunables to default 2026-03-10T07:46:17.431 DEBUG:teuthology.orchestra.run.vm05:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 12e9780e-1c55-11f1-8896-79f7c2e9b508 -- ceph osd crush tunables default 2026-03-10T07:46:17.575 INFO:teuthology.orchestra.run.vm05.stderr:Inferring config /var/lib/ceph/12e9780e-1c55-11f1-8896-79f7c2e9b508/mon.vm05/config 2026-03-10T07:46:17.811 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:46:17.810+0000 7fb402e9c700 1 -- 192.168.123.105:0/3535965498 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb3fc1015d0 msgr2=0x7fb3fc1039c0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:46:17.811 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:46:17.810+0000 7fb402e9c700 1 --2- 192.168.123.105:0/3535965498 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb3fc1015d0 0x7fb3fc1039c0 secure :-1 s=READY pgs=101 cs=0 l=1 rev1=1 crypto rx=0x7fb3ec009b50 tx=0x7fb3ec009e60 comp rx=0 tx=0).stop 2026-03-10T07:46:17.812 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:46:17.811+0000 7fb402e9c700 1 -- 192.168.123.105:0/3535965498 
shutdown_connections 2026-03-10T07:46:17.812 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:46:17.811+0000 7fb402e9c700 1 --2- 192.168.123.105:0/3535965498 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb3fc1015d0 0x7fb3fc1039c0 unknown :-1 s=CLOSED pgs=101 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:46:17.812 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:46:17.811+0000 7fb402e9c700 1 -- 192.168.123.105:0/3535965498 >> 192.168.123.105:0/3535965498 conn(0x7fb3fc0faf20 msgr2=0x7fb3fc0fd360 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:46:17.812 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:46:17.811+0000 7fb402e9c700 1 -- 192.168.123.105:0/3535965498 shutdown_connections 2026-03-10T07:46:17.812 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:46:17.811+0000 7fb402e9c700 1 -- 192.168.123.105:0/3535965498 wait complete. 2026-03-10T07:46:17.812 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:46:17.811+0000 7fb402e9c700 1 Processor -- start 2026-03-10T07:46:17.812 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:46:17.811+0000 7fb402e9c700 1 -- start start 2026-03-10T07:46:17.812 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:46:17.812+0000 7fb402e9c700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb3fc1015d0 0x7fb3fc1939a0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:46:17.812 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:46:17.812+0000 7fb402e9c700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fb3fc193ee0 con 0x7fb3fc1015d0 2026-03-10T07:46:17.812 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:46:17.812+0000 7fb400c38700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb3fc1015d0 0x7fb3fc1939a0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 
tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:46:17.812 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:46:17.812+0000 7fb400c38700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb3fc1015d0 0x7fb3fc1939a0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.105:43838/0 (socket says 192.168.123.105:43838) 2026-03-10T07:46:17.812 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:46:17.812+0000 7fb400c38700 1 -- 192.168.123.105:0/209538277 learned_addr learned my addr 192.168.123.105:0/209538277 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T07:46:17.813 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:46:17.812+0000 7fb400c38700 1 -- 192.168.123.105:0/209538277 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fb3ec0097e0 con 0x7fb3fc1015d0 2026-03-10T07:46:17.813 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:46:17.812+0000 7fb400c38700 1 --2- 192.168.123.105:0/209538277 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb3fc1015d0 0x7fb3fc1939a0 secure :-1 s=READY pgs=102 cs=0 l=1 rev1=1 crypto rx=0x7fb3ec005f50 tx=0x7fb3ec0050d0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:46:17.815 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:46:17.813+0000 7fb3f9ffb700 1 -- 192.168.123.105:0/209538277 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fb3ec01c070 con 0x7fb3fc1015d0 2026-03-10T07:46:17.815 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:46:17.813+0000 7fb3f9ffb700 1 -- 192.168.123.105:0/209538277 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fb3ec021470 con 0x7fb3fc1015d0 2026-03-10T07:46:17.815 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:46:17.813+0000 7fb3f9ffb700 1 -- 192.168.123.105:0/209538277 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fb3ec00f460 con 0x7fb3fc1015d0 2026-03-10T07:46:17.815 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:46:17.813+0000 7fb402e9c700 1 -- 192.168.123.105:0/209538277 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fb3fc1940e0 con 0x7fb3fc1015d0 2026-03-10T07:46:17.815 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:46:17.813+0000 7fb402e9c700 1 -- 192.168.123.105:0/209538277 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fb3fc194580 con 0x7fb3fc1015d0 2026-03-10T07:46:17.815 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:46:17.813+0000 7fb3f9ffb700 1 -- 192.168.123.105:0/209538277 <== mon.0 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 13) v1 ==== 45337+0+0 (secure 0 0 0) 0x7fb3ec00f5e0 con 0x7fb3fc1015d0 2026-03-10T07:46:17.815 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:46:17.814+0000 7fb3f9ffb700 1 --2- 192.168.123.105:0/209538277 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fb3e4038470 0x7fb3e403a930 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:46:17.815 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:46:17.814+0000 7fb3f9ffb700 1 -- 192.168.123.105:0/209538277 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(3..3 src has 1..3) v4 ==== 1069+0+0 (secure 0 0 0) 0x7fb3ec04d3a0 con 0x7fb3fc1015d0 2026-03-10T07:46:17.815 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:46:17.814+0000 7fb3fbfff700 1 --2- 192.168.123.105:0/209538277 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fb3e4038470 0x7fb3e403a930 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 
required=0 2026-03-10T07:46:17.815 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:46:17.814+0000 7fb402e9c700 1 -- 192.168.123.105:0/209538277 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fb3fc18d600 con 0x7fb3fc1015d0 2026-03-10T07:46:17.818 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:46:17.817+0000 7fb3f9ffb700 1 -- 192.168.123.105:0/209538277 <== mon.0 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7fb3ec026070 con 0x7fb3fc1015d0 2026-03-10T07:46:17.818 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:46:17.817+0000 7fb3fbfff700 1 --2- 192.168.123.105:0/209538277 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fb3e4038470 0x7fb3e403a930 secure :-1 s=READY pgs=16 cs=0 l=1 rev1=1 crypto rx=0x7fb3f0006fd0 tx=0x7fb3f0006e40 comp rx=0 tx=0).ready entity=mgr.14164 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:46:17.921 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:46:17.920+0000 7fb402e9c700 1 -- 192.168.123.105:0/209538277 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "osd crush tunables", "profile": "default"} v 0) v1 -- 0x7fb3fc0623c0 con 0x7fb3fc1015d0 2026-03-10T07:46:18.072 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:46:18.071+0000 7fb3f9ffb700 1 -- 192.168.123.105:0/209538277 <== mon.0 v2:192.168.123.105:3300/0 7 ==== mon_command_ack([{"prefix": "osd crush tunables", "profile": "default"}]=0 adjusted tunables profile to default v4) v1 ==== 124+0+0 (secure 0 0 0) 0x7fb3ec029950 con 0x7fb3fc1015d0 2026-03-10T07:46:18.075 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:46:18.074+0000 7fb402e9c700 1 -- 192.168.123.105:0/209538277 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fb3e4038470 msgr2=0x7fb3e403a930 secure :-1 s=STATE_CONNECTION_ESTABLISHED 
l=1).mark_down 2026-03-10T07:46:18.075 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:46:18.074+0000 7fb402e9c700 1 --2- 192.168.123.105:0/209538277 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fb3e4038470 0x7fb3e403a930 secure :-1 s=READY pgs=16 cs=0 l=1 rev1=1 crypto rx=0x7fb3f0006fd0 tx=0x7fb3f0006e40 comp rx=0 tx=0).stop 2026-03-10T07:46:18.075 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:46:18.074+0000 7fb402e9c700 1 -- 192.168.123.105:0/209538277 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb3fc1015d0 msgr2=0x7fb3fc1939a0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:46:18.075 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:46:18.074+0000 7fb402e9c700 1 --2- 192.168.123.105:0/209538277 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb3fc1015d0 0x7fb3fc1939a0 secure :-1 s=READY pgs=102 cs=0 l=1 rev1=1 crypto rx=0x7fb3ec005f50 tx=0x7fb3ec0050d0 comp rx=0 tx=0).stop 2026-03-10T07:46:18.075 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:46:18.075+0000 7fb402e9c700 1 -- 192.168.123.105:0/209538277 shutdown_connections 2026-03-10T07:46:18.075 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:46:18.075+0000 7fb402e9c700 1 --2- 192.168.123.105:0/209538277 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fb3e4038470 0x7fb3e403a930 unknown :-1 s=CLOSED pgs=16 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:46:18.076 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:46:18.075+0000 7fb402e9c700 1 --2- 192.168.123.105:0/209538277 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb3fc1015d0 0x7fb3fc1939a0 unknown :-1 s=CLOSED pgs=102 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:46:18.076 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:46:18.075+0000 7fb402e9c700 1 -- 192.168.123.105:0/209538277 >> 192.168.123.105:0/209538277 conn(0x7fb3fc0faf20 
msgr2=0x7fb3fc0fbbe0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:46:18.076 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:46:18.075+0000 7fb402e9c700 1 -- 192.168.123.105:0/209538277 shutdown_connections 2026-03-10T07:46:18.076 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:46:18.075+0000 7fb402e9c700 1 -- 192.168.123.105:0/209538277 wait complete. 2026-03-10T07:46:18.076 INFO:teuthology.orchestra.run.vm05.stderr:adjusted tunables profile to default 2026-03-10T07:46:18.126 INFO:tasks.cephadm:Adding mon.vm05 on vm05 2026-03-10T07:46:18.127 INFO:tasks.cephadm:Adding mon.vm08 on vm08 2026-03-10T07:46:18.127 DEBUG:teuthology.orchestra.run.vm08:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 12e9780e-1c55-11f1-8896-79f7c2e9b508 -- ceph orch apply mon '2;vm05:192.168.123.105=vm05;vm08:192.168.123.108=vm08' 2026-03-10T07:46:18.262 INFO:teuthology.orchestra.run.vm08.stderr:Inferring config /etc/ceph/ceph.conf 2026-03-10T07:46:18.301 INFO:teuthology.orchestra.run.vm08.stderr:Inferring config /etc/ceph/ceph.conf 2026-03-10T07:46:18.407 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:46:18 vm05 ceph-mon[50387]: Deploying daemon node-exporter.vm05 on vm05 2026-03-10T07:46:18.408 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:46:18 vm05 ceph-mon[50387]: Added host vm08 2026-03-10T07:46:18.408 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:46:18 vm05 ceph-mon[50387]: from='client.? 
192.168.123.105:0/209538277' entity='client.admin' cmd=[{"prefix": "osd crush tunables", "profile": "default"}]: dispatch 2026-03-10T07:46:19.387 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:19.385+0000 7f5d135df700 1 -- 192.168.123.108:0/3867612499 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f5d0c074dc0 msgr2=0x7f5d0c073220 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:46:19.387 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:19.385+0000 7f5d135df700 1 --2- 192.168.123.108:0/3867612499 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f5d0c074dc0 0x7f5d0c073220 secure :-1 s=READY pgs=103 cs=0 l=1 rev1=1 crypto rx=0x7f5d00009b00 tx=0x7f5d00009e10 comp rx=0 tx=0).stop 2026-03-10T07:46:19.387 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:19.386+0000 7f5d135df700 1 -- 192.168.123.108:0/3867612499 shutdown_connections 2026-03-10T07:46:19.387 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:19.386+0000 7f5d135df700 1 --2- 192.168.123.108:0/3867612499 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f5d0c074dc0 0x7f5d0c073220 unknown :-1 s=CLOSED pgs=103 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:46:19.387 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:19.386+0000 7f5d135df700 1 -- 192.168.123.108:0/3867612499 >> 192.168.123.108:0/3867612499 conn(0x7f5d0c0fc020 msgr2=0x7f5d0c0fe480 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:46:19.387 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:19.386+0000 7f5d135df700 1 -- 192.168.123.108:0/3867612499 shutdown_connections 2026-03-10T07:46:19.387 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:19.386+0000 7f5d135df700 1 -- 192.168.123.108:0/3867612499 wait complete. 
2026-03-10T07:46:19.388 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:19.387+0000 7f5d135df700 1 Processor -- start 2026-03-10T07:46:19.388 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:19.387+0000 7f5d135df700 1 -- start start 2026-03-10T07:46:19.388 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:19.387+0000 7f5d135df700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f5d0c074dc0 0x7f5d0c19c1a0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:46:19.388 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:19.387+0000 7f5d135df700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f5d0c19c6e0 con 0x7f5d0c074dc0 2026-03-10T07:46:19.389 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:19.387+0000 7f5d1137b700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f5d0c074dc0 0x7f5d0c19c1a0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:46:19.389 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:19.388+0000 7f5d1137b700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f5d0c074dc0 0x7f5d0c19c1a0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.108:42806/0 (socket says 192.168.123.108:42806) 2026-03-10T07:46:19.389 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:19.388+0000 7f5d1137b700 1 -- 192.168.123.108:0/2979826939 learned_addr learned my addr 192.168.123.108:0/2979826939 (peer_addr_for_me v2:192.168.123.108:0/0) 2026-03-10T07:46:19.389 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:19.388+0000 7f5d1137b700 1 -- 192.168.123.108:0/2979826939 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- 
mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f5d000097e0 con 0x7f5d0c074dc0 2026-03-10T07:46:19.389 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:19.388+0000 7f5d1137b700 1 --2- 192.168.123.108:0/2979826939 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f5d0c074dc0 0x7f5d0c19c1a0 secure :-1 s=READY pgs=104 cs=0 l=1 rev1=1 crypto rx=0x7f5d00009fd0 tx=0x7f5d00005e70 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:46:19.390 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:19.388+0000 7f5cfe7fc700 1 -- 192.168.123.108:0/2979826939 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f5d0001d070 con 0x7f5d0c074dc0 2026-03-10T07:46:19.390 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:19.389+0000 7f5d135df700 1 -- 192.168.123.108:0/2979826939 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f5d0c19c8e0 con 0x7f5d0c074dc0 2026-03-10T07:46:19.390 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:19.389+0000 7f5d135df700 1 -- 192.168.123.108:0/2979826939 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f5d0c19cd80 con 0x7f5d0c074dc0 2026-03-10T07:46:19.391 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:19.389+0000 7f5cfe7fc700 1 -- 192.168.123.108:0/2979826939 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f5d00022470 con 0x7f5d0c074dc0 2026-03-10T07:46:19.391 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:19.389+0000 7f5cfe7fc700 1 -- 192.168.123.108:0/2979826939 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f5d0000f460 con 0x7f5d0c074dc0 2026-03-10T07:46:19.391 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:19.390+0000 7f5cfe7fc700 1 -- 192.168.123.108:0/2979826939 <== mon.0 
v2:192.168.123.105:3300/0 4 ==== mgrmap(e 13) v1 ==== 45337+0+0 (secure 0 0 0) 0x7f5d0000f610 con 0x7f5d0c074dc0 2026-03-10T07:46:19.391 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:19.390+0000 7f5cfe7fc700 1 --2- 192.168.123.108:0/2979826939 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f5cf8038480 0x7f5cf803a940 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:46:19.391 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:19.390+0000 7f5d135df700 1 -- 192.168.123.108:0/2979826939 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f5cf0005320 con 0x7f5d0c074dc0 2026-03-10T07:46:19.391 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:19.390+0000 7f5cfe7fc700 1 -- 192.168.123.108:0/2979826939 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(4..4 src has 1..4) v4 ==== 1069+0+0 (secure 0 0 0) 0x7f5d0004d4e0 con 0x7f5d0c074dc0 2026-03-10T07:46:19.392 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:19.390+0000 7f5d10b7a700 1 --2- 192.168.123.108:0/2979826939 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f5cf8038480 0x7f5cf803a940 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:46:19.392 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:19.391+0000 7f5d10b7a700 1 --2- 192.168.123.108:0/2979826939 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f5cf8038480 0x7f5cf803a940 secure :-1 s=READY pgs=17 cs=0 l=1 rev1=1 crypto rx=0x7f5d08006fd0 tx=0x7f5d08006e40 comp rx=0 tx=0).ready entity=mgr.14164 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:46:19.394 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:19.393+0000 7f5cfe7fc700 1 -- 192.168.123.108:0/2979826939 <== mon.0 v2:192.168.123.105:3300/0 6 ==== 
mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7f5d0002a950 con 0x7f5d0c074dc0 2026-03-10T07:46:19.407 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:46:19 vm05 ceph-mon[50387]: from='client.14193 -' entity='client.admin' cmd=[{"prefix": "orch host ls", "target": ["mon-mgr", ""], "format": "json"}]: dispatch 2026-03-10T07:46:19.407 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:46:19 vm05 ceph-mon[50387]: from='client.? 192.168.123.105:0/209538277' entity='client.admin' cmd='[{"prefix": "osd crush tunables", "profile": "default"}]': finished 2026-03-10T07:46:19.407 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:46:19 vm05 ceph-mon[50387]: osdmap e4: 0 total, 0 up, 0 in 2026-03-10T07:46:19.407 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:46:19 vm05 ceph-mon[50387]: from='mgr.14164 192.168.123.105:0/2' entity='mgr.vm05.blexke' 2026-03-10T07:46:19.407 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:46:19 vm05 ceph-mon[50387]: from='mgr.14164 192.168.123.105:0/2' entity='mgr.vm05.blexke' 2026-03-10T07:46:19.407 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:46:19 vm05 ceph-mon[50387]: from='mgr.14164 192.168.123.105:0/2' entity='mgr.vm05.blexke' 2026-03-10T07:46:19.407 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:46:19 vm05 ceph-mon[50387]: from='mgr.14164 192.168.123.105:0/2' entity='mgr.vm05.blexke' 2026-03-10T07:46:19.501 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:19.499+0000 7f5d135df700 1 -- 192.168.123.108:0/2979826939 --> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] -- mgr_command(tid 0: {"prefix": "orch apply", "service_type": "mon", "placement": "2;vm05:192.168.123.105=vm05;vm08:192.168.123.108=vm08", "target": ["mon-mgr", ""]}) v1 -- 0x7f5cf0000c90 con 0x7f5cf8038480 2026-03-10T07:46:19.506 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:19.504+0000 7f5cfe7fc700 1 -- 192.168.123.108:0/2979826939 <== mgr.14164 v2:192.168.123.105:6800/2 
1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+24 (secure 0 0 0) 0x7f5cf0000c90 con 0x7f5cf8038480 2026-03-10T07:46:19.506 INFO:teuthology.orchestra.run.vm08.stdout:Scheduled mon update... 2026-03-10T07:46:19.508 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:19.507+0000 7f5d135df700 1 -- 192.168.123.108:0/2979826939 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f5cf8038480 msgr2=0x7f5cf803a940 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:46:19.508 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:19.507+0000 7f5d135df700 1 --2- 192.168.123.108:0/2979826939 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f5cf8038480 0x7f5cf803a940 secure :-1 s=READY pgs=17 cs=0 l=1 rev1=1 crypto rx=0x7f5d08006fd0 tx=0x7f5d08006e40 comp rx=0 tx=0).stop 2026-03-10T07:46:19.508 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:19.507+0000 7f5d135df700 1 -- 192.168.123.108:0/2979826939 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f5d0c074dc0 msgr2=0x7f5d0c19c1a0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:46:19.508 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:19.507+0000 7f5d135df700 1 --2- 192.168.123.108:0/2979826939 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f5d0c074dc0 0x7f5d0c19c1a0 secure :-1 s=READY pgs=104 cs=0 l=1 rev1=1 crypto rx=0x7f5d00009fd0 tx=0x7f5d00005e70 comp rx=0 tx=0).stop 2026-03-10T07:46:19.508 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:19.507+0000 7f5d135df700 1 -- 192.168.123.108:0/2979826939 shutdown_connections 2026-03-10T07:46:19.508 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:19.507+0000 7f5d135df700 1 --2- 192.168.123.108:0/2979826939 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f5cf8038480 0x7f5cf803a940 unknown :-1 s=CLOSED pgs=17 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:46:19.508 
INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:19.507+0000 7f5d135df700 1 --2- 192.168.123.108:0/2979826939 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f5d0c074dc0 0x7f5d0c19c1a0 unknown :-1 s=CLOSED pgs=104 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:46:19.508 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:19.507+0000 7f5d135df700 1 -- 192.168.123.108:0/2979826939 >> 192.168.123.108:0/2979826939 conn(0x7f5d0c0fc020 msgr2=0x7f5d0c0fcd00 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:46:19.508 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:19.507+0000 7f5d135df700 1 -- 192.168.123.108:0/2979826939 shutdown_connections 2026-03-10T07:46:19.509 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:19.507+0000 7f5d135df700 1 -- 192.168.123.108:0/2979826939 wait complete. 2026-03-10T07:46:19.581 DEBUG:teuthology.orchestra.run.vm08:mon.vm08> sudo journalctl -f -n 0 -u ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508@mon.vm08.service 2026-03-10T07:46:19.583 INFO:tasks.cephadm:Waiting for 2 mons in monmap... 
2026-03-10T07:46:19.583 DEBUG:teuthology.orchestra.run.vm08:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 12e9780e-1c55-11f1-8896-79f7c2e9b508 -- ceph mon dump -f json 2026-03-10T07:46:19.756 INFO:teuthology.orchestra.run.vm08.stderr:Inferring config /etc/ceph/ceph.conf 2026-03-10T07:46:19.789 INFO:teuthology.orchestra.run.vm08.stderr:Inferring config /etc/ceph/ceph.conf 2026-03-10T07:46:20.025 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:20.023+0000 7efc0dbcb700 1 -- 192.168.123.108:0/3493830281 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7efc08102c90 msgr2=0x7efc081030b0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:46:20.025 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:20.023+0000 7efc0dbcb700 1 --2- 192.168.123.108:0/3493830281 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7efc08102c90 0x7efc081030b0 secure :-1 s=READY pgs=105 cs=0 l=1 rev1=1 crypto rx=0x7efbf8009b00 tx=0x7efbf8009e10 comp rx=0 tx=0).stop 2026-03-10T07:46:20.025 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:20.024+0000 7efc0dbcb700 1 -- 192.168.123.108:0/3493830281 shutdown_connections 2026-03-10T07:46:20.025 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:20.024+0000 7efc0dbcb700 1 --2- 192.168.123.108:0/3493830281 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7efc08102c90 0x7efc081030b0 unknown :-1 s=CLOSED pgs=105 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:46:20.026 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:20.024+0000 7efc0dbcb700 1 -- 192.168.123.108:0/3493830281 >> 192.168.123.108:0/3493830281 conn(0x7efc080fe230 msgr2=0x7efc08100670 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:46:20.026 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:20.024+0000 7efc0dbcb700 1 -- 192.168.123.108:0/3493830281 
shutdown_connections 2026-03-10T07:46:20.026 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:20.024+0000 7efc0dbcb700 1 -- 192.168.123.108:0/3493830281 wait complete. 2026-03-10T07:46:20.026 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:20.025+0000 7efc0dbcb700 1 Processor -- start 2026-03-10T07:46:20.026 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:20.025+0000 7efc0dbcb700 1 -- start start 2026-03-10T07:46:20.026 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:20.025+0000 7efc0dbcb700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7efc08102c90 0x7efc08197da0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:46:20.027 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:20.025+0000 7efc0dbcb700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7efc081982e0 con 0x7efc08102c90 2026-03-10T07:46:20.027 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:20.026+0000 7efc077fe700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7efc08102c90 0x7efc08197da0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:46:20.027 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:20.026+0000 7efc077fe700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7efc08102c90 0x7efc08197da0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.108:42828/0 (socket says 192.168.123.108:42828) 2026-03-10T07:46:20.027 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:20.026+0000 7efc077fe700 1 -- 192.168.123.108:0/2367610636 learned_addr learned my addr 192.168.123.108:0/2367610636 (peer_addr_for_me v2:192.168.123.108:0/0) 2026-03-10T07:46:20.027 
INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:20.026+0000 7efc077fe700 1 -- 192.168.123.108:0/2367610636 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7efbf80097e0 con 0x7efc08102c90 2026-03-10T07:46:20.027 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:20.026+0000 7efc077fe700 1 --2- 192.168.123.108:0/2367610636 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7efc08102c90 0x7efc08197da0 secure :-1 s=READY pgs=106 cs=0 l=1 rev1=1 crypto rx=0x7efbf8004750 tx=0x7efbf8005dc0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:46:20.028 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:20.026+0000 7efc04ff9700 1 -- 192.168.123.108:0/2367610636 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7efbf801c070 con 0x7efc08102c90 2026-03-10T07:46:20.028 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:20.027+0000 7efc0dbcb700 1 -- 192.168.123.108:0/2367610636 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7efc081984e0 con 0x7efc08102c90 2026-03-10T07:46:20.028 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:20.027+0000 7efc0dbcb700 1 -- 192.168.123.108:0/2367610636 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7efc08198980 con 0x7efc08102c90 2026-03-10T07:46:20.029 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:20.027+0000 7efc04ff9700 1 -- 192.168.123.108:0/2367610636 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7efbf8021470 con 0x7efc08102c90 2026-03-10T07:46:20.029 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:20.027+0000 7efc04ff9700 1 -- 192.168.123.108:0/2367610636 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7efbf800f460 con 
0x7efc08102c90 2026-03-10T07:46:20.029 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:20.028+0000 7efc04ff9700 1 -- 192.168.123.108:0/2367610636 <== mon.0 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 13) v1 ==== 45337+0+0 (secure 0 0 0) 0x7efbf800f5c0 con 0x7efc08102c90 2026-03-10T07:46:20.029 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:20.028+0000 7efc04ff9700 1 --2- 192.168.123.108:0/2367610636 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7efbf4038440 0x7efbf403a900 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:46:20.029 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:20.028+0000 7efc04ff9700 1 -- 192.168.123.108:0/2367610636 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(4..4 src has 1..4) v4 ==== 1069+0+0 (secure 0 0 0) 0x7efbf804d490 con 0x7efc08102c90 2026-03-10T07:46:20.029 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:20.028+0000 7efc06ffd700 1 --2- 192.168.123.108:0/2367610636 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7efbf4038440 0x7efbf403a900 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:46:20.029 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:20.028+0000 7efc0dbcb700 1 -- 192.168.123.108:0/2367610636 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7efbe8005320 con 0x7efc08102c90 2026-03-10T07:46:20.030 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:20.029+0000 7efc06ffd700 1 --2- 192.168.123.108:0/2367610636 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7efbf4038440 0x7efbf403a900 secure :-1 s=READY pgs=18 cs=0 l=1 rev1=1 crypto rx=0x7efbf0006fd0 tx=0x7efbf0006e40 comp rx=0 tx=0).ready entity=mgr.14164 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:46:20.032 
INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:20.031+0000 7efc04ff9700 1 -- 192.168.123.108:0/2367610636 <== mon.0 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7efbf8026020 con 0x7efc08102c90 2026-03-10T07:46:20.173 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:20.172+0000 7efc0dbcb700 1 -- 192.168.123.108:0/2367610636 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "mon dump", "format": "json"} v 0) v1 -- 0x7efbe8005190 con 0x7efc08102c90 2026-03-10T07:46:20.174 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:20.173+0000 7efc04ff9700 1 -- 192.168.123.108:0/2367610636 <== mon.0 v2:192.168.123.105:3300/0 7 ==== mon_command_ack([{"prefix": "mon dump", "format": "json"}]=0 dumped monmap epoch 1 v1) v1 ==== 95+0+751 (secure 0 0 0) 0x7efbf8029720 con 0x7efc08102c90 2026-03-10T07:46:20.174 INFO:teuthology.orchestra.run.vm08.stdout: 2026-03-10T07:46:20.174 INFO:teuthology.orchestra.run.vm08.stdout:{"epoch":1,"fsid":"12e9780e-1c55-11f1-8896-79f7c2e9b508","modified":"2026-03-10T07:45:39.881211Z","created":"2026-03-10T07:45:39.881211Z","min_mon_release":18,"min_mon_release_name":"reef","election_strategy":1,"disallowed_leaders: ":"","stretch_mode":false,"tiebreaker_mon":"","removed_ranks: ":"","features":{"persistent":["kraken","luminous","mimic","osdmap-prune","nautilus","octopus","pacific","elector-pinging","quincy","reef"],"optional":[]},"mons":[{"rank":0,"name":"vm05","public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.105:3300","nonce":0},{"type":"v1","addr":"192.168.123.105:6789","nonce":0}]},"addr":"192.168.123.105:6789/0","public_addr":"192.168.123.105:6789/0","priority":0,"weight":0,"crush_location":"{}"}],"quorum":[0]} 2026-03-10T07:46:20.176 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:20.175+0000 7efc0dbcb700 1 -- 192.168.123.108:0/2367610636 >> 
[v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7efbf4038440 msgr2=0x7efbf403a900 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:46:20.176 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:20.175+0000 7efc0dbcb700 1 --2- 192.168.123.108:0/2367610636 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7efbf4038440 0x7efbf403a900 secure :-1 s=READY pgs=18 cs=0 l=1 rev1=1 crypto rx=0x7efbf0006fd0 tx=0x7efbf0006e40 comp rx=0 tx=0).stop 2026-03-10T07:46:20.176 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:20.175+0000 7efc0dbcb700 1 -- 192.168.123.108:0/2367610636 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7efc08102c90 msgr2=0x7efc08197da0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:46:20.176 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:20.175+0000 7efc0dbcb700 1 --2- 192.168.123.108:0/2367610636 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7efc08102c90 0x7efc08197da0 secure :-1 s=READY pgs=106 cs=0 l=1 rev1=1 crypto rx=0x7efbf8004750 tx=0x7efbf8005dc0 comp rx=0 tx=0).stop 2026-03-10T07:46:20.176 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:20.175+0000 7efc0dbcb700 1 -- 192.168.123.108:0/2367610636 shutdown_connections 2026-03-10T07:46:20.177 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:20.175+0000 7efc0dbcb700 1 --2- 192.168.123.108:0/2367610636 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7efbf4038440 0x7efbf403a900 unknown :-1 s=CLOSED pgs=18 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:46:20.177 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:20.176+0000 7efc0dbcb700 1 --2- 192.168.123.108:0/2367610636 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7efc08102c90 0x7efc08197da0 unknown :-1 s=CLOSED pgs=106 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:46:20.177 
INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:20.176+0000 7efc0dbcb700 1 -- 192.168.123.108:0/2367610636 >> 192.168.123.108:0/2367610636 conn(0x7efc080fe230 msgr2=0x7efc080feef0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:46:20.177 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:20.176+0000 7efc0dbcb700 1 -- 192.168.123.108:0/2367610636 shutdown_connections 2026-03-10T07:46:20.177 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:20.176+0000 7efc0dbcb700 1 -- 192.168.123.108:0/2367610636 wait complete. 2026-03-10T07:46:20.178 INFO:teuthology.orchestra.run.vm08.stderr:dumped monmap epoch 1 2026-03-10T07:46:20.407 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:46:20 vm05 ceph-mon[50387]: Deploying daemon alertmanager.vm05 on vm05 2026-03-10T07:46:20.408 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:46:20 vm05 ceph-mon[50387]: from='mgr.14164 192.168.123.105:0/2' entity='mgr.vm05.blexke' 2026-03-10T07:46:20.408 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:46:20 vm05 ceph-mon[50387]: from='mgr.14164 192.168.123.105:0/2' entity='mgr.vm05.blexke' 2026-03-10T07:46:21.218 INFO:tasks.cephadm:Waiting for 2 mons in monmap... 
2026-03-10T07:46:21.218 DEBUG:teuthology.orchestra.run.vm08:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 12e9780e-1c55-11f1-8896-79f7c2e9b508 -- ceph mon dump -f json 2026-03-10T07:46:21.358 INFO:teuthology.orchestra.run.vm08.stderr:Inferring config /etc/ceph/ceph.conf 2026-03-10T07:46:21.395 INFO:teuthology.orchestra.run.vm08.stderr:Inferring config /etc/ceph/ceph.conf 2026-03-10T07:46:21.407 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:46:21 vm05 ceph-mon[50387]: from='client.14197 -' entity='client.admin' cmd=[{"prefix": "orch apply", "service_type": "mon", "placement": "2;vm05:192.168.123.105=vm05;vm08:192.168.123.108=vm08", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T07:46:21.408 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:46:21 vm05 ceph-mon[50387]: Saving service mon spec with placement vm05:192.168.123.105=vm05;vm08:192.168.123.108=vm08;count:2 2026-03-10T07:46:21.408 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:46:21 vm05 ceph-mon[50387]: from='client.? 
192.168.123.108:0/2367610636' entity='client.admin' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch 2026-03-10T07:46:21.640 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:21.638+0000 7f3d68ee1700 1 -- 192.168.123.108:0/2535743421 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3d640fe6e0 msgr2=0x7f3d640feb00 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:46:21.640 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:21.638+0000 7f3d68ee1700 1 --2- 192.168.123.108:0/2535743421 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3d640fe6e0 0x7f3d640feb00 secure :-1 s=READY pgs=107 cs=0 l=1 rev1=1 crypto rx=0x7f3d54009b00 tx=0x7f3d54009e10 comp rx=0 tx=0).stop 2026-03-10T07:46:21.640 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:21.638+0000 7f3d68ee1700 1 -- 192.168.123.108:0/2535743421 shutdown_connections 2026-03-10T07:46:21.640 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:21.638+0000 7f3d68ee1700 1 --2- 192.168.123.108:0/2535743421 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3d640fe6e0 0x7f3d640feb00 unknown :-1 s=CLOSED pgs=107 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:46:21.640 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:21.638+0000 7f3d68ee1700 1 -- 192.168.123.108:0/2535743421 >> 192.168.123.108:0/2535743421 conn(0x7f3d640fa240 msgr2=0x7f3d640fc6a0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:46:21.640 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:21.639+0000 7f3d68ee1700 1 -- 192.168.123.108:0/2535743421 shutdown_connections 2026-03-10T07:46:21.640 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:21.639+0000 7f3d68ee1700 1 -- 192.168.123.108:0/2535743421 wait complete. 
2026-03-10T07:46:21.640 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:21.639+0000 7f3d68ee1700 1 Processor -- start 2026-03-10T07:46:21.640 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:21.639+0000 7f3d68ee1700 1 -- start start 2026-03-10T07:46:21.640 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:21.639+0000 7f3d68ee1700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3d640fe6e0 0x7f3d6418f940 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:46:21.640 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:21.639+0000 7f3d68ee1700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f3d6418fe80 con 0x7f3d640fe6e0 2026-03-10T07:46:21.641 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:21.640+0000 7f3d6259c700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3d640fe6e0 0x7f3d6418f940 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:46:21.641 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:21.640+0000 7f3d6259c700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3d640fe6e0 0x7f3d6418f940 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.108:42850/0 (socket says 192.168.123.108:42850) 2026-03-10T07:46:21.641 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:21.640+0000 7f3d6259c700 1 -- 192.168.123.108:0/3715234342 learned_addr learned my addr 192.168.123.108:0/3715234342 (peer_addr_for_me v2:192.168.123.108:0/0) 2026-03-10T07:46:21.641 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:21.640+0000 7f3d6259c700 1 -- 192.168.123.108:0/3715234342 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- 
mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f3d540097e0 con 0x7f3d640fe6e0 2026-03-10T07:46:21.641 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:21.640+0000 7f3d6259c700 1 --2- 192.168.123.108:0/3715234342 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3d640fe6e0 0x7f3d6418f940 secure :-1 s=READY pgs=108 cs=0 l=1 rev1=1 crypto rx=0x7f3d54009fd0 tx=0x7f3d54005e70 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:46:21.641 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:21.640+0000 7f3d5b7fe700 1 -- 192.168.123.108:0/3715234342 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f3d5401d070 con 0x7f3d640fe6e0 2026-03-10T07:46:21.641 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:21.640+0000 7f3d68ee1700 1 -- 192.168.123.108:0/3715234342 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f3d64190080 con 0x7f3d640fe6e0 2026-03-10T07:46:21.642 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:21.640+0000 7f3d68ee1700 1 -- 192.168.123.108:0/3715234342 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f3d6418c5f0 con 0x7f3d640fe6e0 2026-03-10T07:46:21.643 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:21.641+0000 7f3d5b7fe700 1 -- 192.168.123.108:0/3715234342 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f3d54022470 con 0x7f3d640fe6e0 2026-03-10T07:46:21.643 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:21.641+0000 7f3d5b7fe700 1 -- 192.168.123.108:0/3715234342 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f3d5400f460 con 0x7f3d640fe6e0 2026-03-10T07:46:21.643 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:21.641+0000 7f3d5b7fe700 1 -- 192.168.123.108:0/3715234342 <== mon.0 
v2:192.168.123.105:3300/0 4 ==== mgrmap(e 13) v1 ==== 45337+0+0 (secure 0 0 0) 0x7f3d5400f610 con 0x7f3d640fe6e0 2026-03-10T07:46:21.643 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:21.641+0000 7f3d5b7fe700 1 --2- 192.168.123.108:0/3715234342 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f3d44038480 0x7f3d4403a940 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:46:21.643 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:21.641+0000 7f3d5b7fe700 1 -- 192.168.123.108:0/3715234342 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(4..4 src has 1..4) v4 ==== 1069+0+0 (secure 0 0 0) 0x7f3d5404d4e0 con 0x7f3d640fe6e0 2026-03-10T07:46:21.643 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:21.642+0000 7f3d5bfff700 1 --2- 192.168.123.108:0/3715234342 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f3d44038480 0x7f3d4403a940 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:46:21.643 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:21.642+0000 7f3d68ee1700 1 -- 192.168.123.108:0/3715234342 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f3d48005320 con 0x7f3d640fe6e0 2026-03-10T07:46:21.643 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:21.642+0000 7f3d5bfff700 1 --2- 192.168.123.108:0/3715234342 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f3d44038480 0x7f3d4403a940 secure :-1 s=READY pgs=19 cs=0 l=1 rev1=1 crypto rx=0x7f3d4c006fd0 tx=0x7f3d4c006e40 comp rx=0 tx=0).ready entity=mgr.14164 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:46:21.646 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:21.645+0000 7f3d5b7fe700 1 -- 192.168.123.108:0/3715234342 <== mon.0 v2:192.168.123.105:3300/0 6 ==== 
mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7f3d54027070 con 0x7f3d640fe6e0 2026-03-10T07:46:21.799 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:21.797+0000 7f3d68ee1700 1 -- 192.168.123.108:0/3715234342 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "mon dump", "format": "json"} v 0) v1 -- 0x7f3d48005190 con 0x7f3d640fe6e0 2026-03-10T07:46:21.799 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:21.798+0000 7f3d5b7fe700 1 -- 192.168.123.108:0/3715234342 <== mon.0 v2:192.168.123.105:3300/0 7 ==== mon_command_ack([{"prefix": "mon dump", "format": "json"}]=0 dumped monmap epoch 1 v1) v1 ==== 95+0+751 (secure 0 0 0) 0x7f3d5402a720 con 0x7f3d640fe6e0 2026-03-10T07:46:21.800 INFO:teuthology.orchestra.run.vm08.stdout: 2026-03-10T07:46:21.800 INFO:teuthology.orchestra.run.vm08.stdout:{"epoch":1,"fsid":"12e9780e-1c55-11f1-8896-79f7c2e9b508","modified":"2026-03-10T07:45:39.881211Z","created":"2026-03-10T07:45:39.881211Z","min_mon_release":18,"min_mon_release_name":"reef","election_strategy":1,"disallowed_leaders: ":"","stretch_mode":false,"tiebreaker_mon":"","removed_ranks: ":"","features":{"persistent":["kraken","luminous","mimic","osdmap-prune","nautilus","octopus","pacific","elector-pinging","quincy","reef"],"optional":[]},"mons":[{"rank":0,"name":"vm05","public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.105:3300","nonce":0},{"type":"v1","addr":"192.168.123.105:6789","nonce":0}]},"addr":"192.168.123.105:6789/0","public_addr":"192.168.123.105:6789/0","priority":0,"weight":0,"crush_location":"{}"}],"quorum":[0]} 2026-03-10T07:46:21.802 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:21.801+0000 7f3d68ee1700 1 -- 192.168.123.108:0/3715234342 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f3d44038480 msgr2=0x7f3d4403a940 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:46:21.802 
INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:21.801+0000 7f3d68ee1700 1 --2- 192.168.123.108:0/3715234342 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f3d44038480 0x7f3d4403a940 secure :-1 s=READY pgs=19 cs=0 l=1 rev1=1 crypto rx=0x7f3d4c006fd0 tx=0x7f3d4c006e40 comp rx=0 tx=0).stop 2026-03-10T07:46:21.802 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:21.801+0000 7f3d68ee1700 1 -- 192.168.123.108:0/3715234342 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3d640fe6e0 msgr2=0x7f3d6418f940 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:46:21.802 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:21.801+0000 7f3d68ee1700 1 --2- 192.168.123.108:0/3715234342 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3d640fe6e0 0x7f3d6418f940 secure :-1 s=READY pgs=108 cs=0 l=1 rev1=1 crypto rx=0x7f3d54009fd0 tx=0x7f3d54005e70 comp rx=0 tx=0).stop 2026-03-10T07:46:21.803 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:21.801+0000 7f3d68ee1700 1 -- 192.168.123.108:0/3715234342 shutdown_connections 2026-03-10T07:46:21.803 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:21.801+0000 7f3d68ee1700 1 --2- 192.168.123.108:0/3715234342 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f3d44038480 0x7f3d4403a940 unknown :-1 s=CLOSED pgs=19 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:46:21.803 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:21.802+0000 7f3d68ee1700 1 --2- 192.168.123.108:0/3715234342 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3d640fe6e0 0x7f3d6418f940 unknown :-1 s=CLOSED pgs=108 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:46:21.803 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:21.802+0000 7f3d68ee1700 1 -- 192.168.123.108:0/3715234342 >> 192.168.123.108:0/3715234342 conn(0x7f3d640fa240 msgr2=0x7f3d640faf20 unknown :-1 s=STATE_NONE 
l=0).mark_down 2026-03-10T07:46:21.803 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:21.802+0000 7f3d68ee1700 1 -- 192.168.123.108:0/3715234342 shutdown_connections 2026-03-10T07:46:21.803 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:21.802+0000 7f3d68ee1700 1 -- 192.168.123.108:0/3715234342 wait complete. 2026-03-10T07:46:21.804 INFO:teuthology.orchestra.run.vm08.stderr:dumped monmap epoch 1 2026-03-10T07:46:22.279 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:46:22 vm05 ceph-mon[50387]: from='client.? 192.168.123.108:0/3715234342' entity='client.admin' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch 2026-03-10T07:46:22.850 INFO:tasks.cephadm:Waiting for 2 mons in monmap... 2026-03-10T07:46:22.850 DEBUG:teuthology.orchestra.run.vm08:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 12e9780e-1c55-11f1-8896-79f7c2e9b508 -- ceph mon dump -f json 2026-03-10T07:46:22.982 INFO:teuthology.orchestra.run.vm08.stderr:Inferring config /etc/ceph/ceph.conf 2026-03-10T07:46:23.016 INFO:teuthology.orchestra.run.vm08.stderr:Inferring config /etc/ceph/ceph.conf 2026-03-10T07:46:23.261 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:23.259+0000 7fb43faa0700 1 -- 192.168.123.108:0/1433965951 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb438074dc0 msgr2=0x7fb438073220 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:46:23.261 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:23.259+0000 7fb43faa0700 1 --2- 192.168.123.108:0/1433965951 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb438074dc0 0x7fb438073220 secure :-1 s=READY pgs=109 cs=0 l=1 rev1=1 crypto rx=0x7fb428009b00 tx=0x7fb428009e10 comp rx=0 tx=0).stop 2026-03-10T07:46:23.261 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:23.260+0000 7fb43faa0700 1 -- 192.168.123.108:0/1433965951 
shutdown_connections 2026-03-10T07:46:23.261 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:23.260+0000 7fb43faa0700 1 --2- 192.168.123.108:0/1433965951 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb438074dc0 0x7fb438073220 unknown :-1 s=CLOSED pgs=109 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:46:23.261 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:23.260+0000 7fb43faa0700 1 -- 192.168.123.108:0/1433965951 >> 192.168.123.108:0/1433965951 conn(0x7fb4380fc000 msgr2=0x7fb4380fe440 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:46:23.261 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:23.260+0000 7fb43faa0700 1 -- 192.168.123.108:0/1433965951 shutdown_connections 2026-03-10T07:46:23.261 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:23.260+0000 7fb43faa0700 1 -- 192.168.123.108:0/1433965951 wait complete. 2026-03-10T07:46:23.262 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:23.260+0000 7fb43faa0700 1 Processor -- start 2026-03-10T07:46:23.262 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:23.261+0000 7fb43faa0700 1 -- start start 2026-03-10T07:46:23.262 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:23.261+0000 7fb43faa0700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb438074dc0 0x7fb43819c160 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:46:23.262 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:23.261+0000 7fb43faa0700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fb43819c6a0 con 0x7fb438074dc0 2026-03-10T07:46:23.262 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:23.261+0000 7fb43d83c700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb438074dc0 0x7fb43819c160 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 
tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:46:23.262 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:23.261+0000 7fb43d83c700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb438074dc0 0x7fb43819c160 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.108:42864/0 (socket says 192.168.123.108:42864) 2026-03-10T07:46:23.262 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:23.261+0000 7fb43d83c700 1 -- 192.168.123.108:0/237090117 learned_addr learned my addr 192.168.123.108:0/237090117 (peer_addr_for_me v2:192.168.123.108:0/0) 2026-03-10T07:46:23.262 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:23.261+0000 7fb43d83c700 1 -- 192.168.123.108:0/237090117 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fb4280097e0 con 0x7fb438074dc0 2026-03-10T07:46:23.262 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:23.261+0000 7fb43d83c700 1 --2- 192.168.123.108:0/237090117 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb438074dc0 0x7fb43819c160 secure :-1 s=READY pgs=110 cs=0 l=1 rev1=1 crypto rx=0x7fb428004750 tx=0x7fb428005dc0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:46:23.263 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:23.262+0000 7fb42effd700 1 -- 192.168.123.108:0/237090117 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fb42801c070 con 0x7fb438074dc0 2026-03-10T07:46:23.263 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:23.262+0000 7fb43faa0700 1 -- 192.168.123.108:0/237090117 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fb43819c8a0 con 0x7fb438074dc0 2026-03-10T07:46:23.263 
INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:23.262+0000 7fb43faa0700 1 -- 192.168.123.108:0/237090117 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fb43819cd40 con 0x7fb438074dc0 2026-03-10T07:46:23.263 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:23.262+0000 7fb42effd700 1 -- 192.168.123.108:0/237090117 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fb428021470 con 0x7fb438074dc0 2026-03-10T07:46:23.263 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:23.262+0000 7fb42effd700 1 -- 192.168.123.108:0/237090117 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fb42800f460 con 0x7fb438074dc0 2026-03-10T07:46:23.264 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:23.262+0000 7fb42effd700 1 -- 192.168.123.108:0/237090117 <== mon.0 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 13) v1 ==== 45337+0+0 (secure 0 0 0) 0x7fb42800f5c0 con 0x7fb438074dc0 2026-03-10T07:46:23.264 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:23.263+0000 7fb42effd700 1 --2- 192.168.123.108:0/237090117 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fb424038490 0x7fb42403a950 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:46:23.264 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:23.263+0000 7fb42effd700 1 -- 192.168.123.108:0/237090117 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(4..4 src has 1..4) v4 ==== 1069+0+0 (secure 0 0 0) 0x7fb42804c2f0 con 0x7fb438074dc0 2026-03-10T07:46:23.264 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:23.263+0000 7fb43faa0700 1 -- 192.168.123.108:0/237090117 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fb41c005320 con 0x7fb438074dc0 2026-03-10T07:46:23.264 
INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:23.263+0000 7fb43d03b700 1 --2- 192.168.123.108:0/237090117 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fb424038490 0x7fb42403a950 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:46:23.264 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:23.263+0000 7fb43d03b700 1 --2- 192.168.123.108:0/237090117 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fb424038490 0x7fb42403a950 secure :-1 s=READY pgs=20 cs=0 l=1 rev1=1 crypto rx=0x7fb434006fd0 tx=0x7fb434006e40 comp rx=0 tx=0).ready entity=mgr.14164 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:46:23.267 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:23.266+0000 7fb42effd700 1 -- 192.168.123.108:0/237090117 <== mon.0 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7fb428026070 con 0x7fb438074dc0 2026-03-10T07:46:23.408 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:23.407+0000 7fb43faa0700 1 -- 192.168.123.108:0/237090117 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "mon dump", "format": "json"} v 0) v1 -- 0x7fb41c005190 con 0x7fb438074dc0 2026-03-10T07:46:23.409 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:23.408+0000 7fb42effd700 1 -- 192.168.123.108:0/237090117 <== mon.0 v2:192.168.123.105:3300/0 7 ==== mon_command_ack([{"prefix": "mon dump", "format": "json"}]=0 dumped monmap epoch 1 v1) v1 ==== 95+0+751 (secure 0 0 0) 0x7fb428029540 con 0x7fb438074dc0 2026-03-10T07:46:23.410 INFO:teuthology.orchestra.run.vm08.stdout: 2026-03-10T07:46:23.410 
INFO:teuthology.orchestra.run.vm08.stdout:{"epoch":1,"fsid":"12e9780e-1c55-11f1-8896-79f7c2e9b508","modified":"2026-03-10T07:45:39.881211Z","created":"2026-03-10T07:45:39.881211Z","min_mon_release":18,"min_mon_release_name":"reef","election_strategy":1,"disallowed_leaders: ":"","stretch_mode":false,"tiebreaker_mon":"","removed_ranks: ":"","features":{"persistent":["kraken","luminous","mimic","osdmap-prune","nautilus","octopus","pacific","elector-pinging","quincy","reef"],"optional":[]},"mons":[{"rank":0,"name":"vm05","public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.105:3300","nonce":0},{"type":"v1","addr":"192.168.123.105:6789","nonce":0}]},"addr":"192.168.123.105:6789/0","public_addr":"192.168.123.105:6789/0","priority":0,"weight":0,"crush_location":"{}"}],"quorum":[0]} 2026-03-10T07:46:23.412 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:23.410+0000 7fb43faa0700 1 -- 192.168.123.108:0/237090117 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fb424038490 msgr2=0x7fb42403a950 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:46:23.412 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:23.411+0000 7fb43faa0700 1 --2- 192.168.123.108:0/237090117 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fb424038490 0x7fb42403a950 secure :-1 s=READY pgs=20 cs=0 l=1 rev1=1 crypto rx=0x7fb434006fd0 tx=0x7fb434006e40 comp rx=0 tx=0).stop 2026-03-10T07:46:23.412 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:23.411+0000 7fb43faa0700 1 -- 192.168.123.108:0/237090117 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb438074dc0 msgr2=0x7fb43819c160 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:46:23.412 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:23.411+0000 7fb43faa0700 1 --2- 192.168.123.108:0/237090117 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb438074dc0 0x7fb43819c160 secure :-1 s=READY pgs=110 cs=0 l=1 
rev1=1 crypto rx=0x7fb428004750 tx=0x7fb428005dc0 comp rx=0 tx=0).stop 2026-03-10T07:46:23.412 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:23.411+0000 7fb43faa0700 1 -- 192.168.123.108:0/237090117 shutdown_connections 2026-03-10T07:46:23.412 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:23.411+0000 7fb43faa0700 1 --2- 192.168.123.108:0/237090117 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fb424038490 0x7fb42403a950 unknown :-1 s=CLOSED pgs=20 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:46:23.413 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:23.411+0000 7fb43faa0700 1 --2- 192.168.123.108:0/237090117 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb438074dc0 0x7fb43819c160 unknown :-1 s=CLOSED pgs=110 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:46:23.413 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:23.412+0000 7fb43faa0700 1 -- 192.168.123.108:0/237090117 >> 192.168.123.108:0/237090117 conn(0x7fb4380fc000 msgr2=0x7fb4380fccc0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:46:23.413 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:23.412+0000 7fb43faa0700 1 -- 192.168.123.108:0/237090117 shutdown_connections 2026-03-10T07:46:23.413 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:23.412+0000 7fb43faa0700 1 -- 192.168.123.108:0/237090117 wait complete. 
2026-03-10T07:46:23.414 INFO:teuthology.orchestra.run.vm08.stderr:dumped monmap epoch 1 2026-03-10T07:46:23.657 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:46:23 vm05 ceph-mon[50387]: from='mgr.14164 192.168.123.105:0/2' entity='mgr.vm05.blexke' 2026-03-10T07:46:23.657 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:46:23 vm05 ceph-mon[50387]: from='mgr.14164 192.168.123.105:0/2' entity='mgr.vm05.blexke' 2026-03-10T07:46:23.657 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:46:23 vm05 ceph-mon[50387]: from='mgr.14164 192.168.123.105:0/2' entity='mgr.vm05.blexke' 2026-03-10T07:46:23.657 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:46:23 vm05 ceph-mon[50387]: from='mgr.14164 192.168.123.105:0/2' entity='mgr.vm05.blexke' 2026-03-10T07:46:23.657 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:46:23 vm05 ceph-mon[50387]: from='mgr.14164 192.168.123.105:0/2' entity='mgr.vm05.blexke' 2026-03-10T07:46:23.657 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:46:23 vm05 ceph-mon[50387]: from='mgr.14164 192.168.123.105:0/2' entity='mgr.vm05.blexke' 2026-03-10T07:46:23.657 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:46:23 vm05 ceph-mon[50387]: Regenerating cephadm self-signed grafana TLS certificates 2026-03-10T07:46:23.658 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:46:23 vm05 ceph-mon[50387]: from='mgr.14164 192.168.123.105:0/2' entity='mgr.vm05.blexke' 2026-03-10T07:46:23.658 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:46:23 vm05 ceph-mon[50387]: from='mgr.14164 192.168.123.105:0/2' entity='mgr.vm05.blexke' 2026-03-10T07:46:23.658 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:46:23 vm05 ceph-mon[50387]: from='mgr.14164 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "dashboard set-grafana-api-ssl-verify", "value": "false"}]: dispatch 2026-03-10T07:46:23.658 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:46:23 vm05 ceph-mon[50387]: from='mon.0 -' entity='mon.' 
cmd=[{"prefix": "dashboard set-grafana-api-ssl-verify", "value": "false"}]: dispatch 2026-03-10T07:46:23.658 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:46:23 vm05 ceph-mon[50387]: from='mgr.14164 192.168.123.105:0/2' entity='mgr.vm05.blexke' 2026-03-10T07:46:23.658 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:46:23 vm05 ceph-mon[50387]: Deploying daemon grafana.vm05 on vm05 2026-03-10T07:46:24.474 INFO:tasks.cephadm:Waiting for 2 mons in monmap... 2026-03-10T07:46:24.474 DEBUG:teuthology.orchestra.run.vm08:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 12e9780e-1c55-11f1-8896-79f7c2e9b508 -- ceph mon dump -f json 2026-03-10T07:46:24.611 INFO:teuthology.orchestra.run.vm08.stderr:Inferring config /etc/ceph/ceph.conf 2026-03-10T07:46:24.646 INFO:teuthology.orchestra.run.vm08.stderr:Inferring config /etc/ceph/ceph.conf 2026-03-10T07:46:24.657 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:46:24 vm05 ceph-mon[50387]: from='client.? 
192.168.123.108:0/237090117' entity='client.admin' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch 2026-03-10T07:46:24.906 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:24.904+0000 7ff141e75700 1 -- 192.168.123.108:0/2049128960 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff13c1014f0 msgr2=0x7ff13c1038e0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:46:24.906 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:24.904+0000 7ff141e75700 1 --2- 192.168.123.108:0/2049128960 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff13c1014f0 0x7ff13c1038e0 secure :-1 s=READY pgs=111 cs=0 l=1 rev1=1 crypto rx=0x7ff124009b00 tx=0x7ff124009e10 comp rx=0 tx=0).stop 2026-03-10T07:46:24.906 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:24.905+0000 7ff141e75700 1 -- 192.168.123.108:0/2049128960 shutdown_connections 2026-03-10T07:46:24.906 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:24.905+0000 7ff141e75700 1 --2- 192.168.123.108:0/2049128960 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff13c1014f0 0x7ff13c1038e0 unknown :-1 s=CLOSED pgs=111 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:46:24.906 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:24.905+0000 7ff141e75700 1 -- 192.168.123.108:0/2049128960 >> 192.168.123.108:0/2049128960 conn(0x7ff13c0faf00 msgr2=0x7ff13c0fd340 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:46:24.906 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:24.905+0000 7ff141e75700 1 -- 192.168.123.108:0/2049128960 shutdown_connections 2026-03-10T07:46:24.906 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:24.905+0000 7ff141e75700 1 -- 192.168.123.108:0/2049128960 wait complete. 
2026-03-10T07:46:24.906 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:24.905+0000 7ff141e75700 1 Processor -- start 2026-03-10T07:46:24.907 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:24.906+0000 7ff141e75700 1 -- start start 2026-03-10T07:46:24.907 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:24.906+0000 7ff141e75700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff13c1014f0 0x7ff13c195b60 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:46:24.907 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:24.906+0000 7ff141e75700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7ff13c1960a0 con 0x7ff13c1014f0 2026-03-10T07:46:24.907 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:24.906+0000 7ff13b7fe700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff13c1014f0 0x7ff13c195b60 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:46:24.907 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:24.906+0000 7ff13b7fe700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff13c1014f0 0x7ff13c195b60 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.108:42878/0 (socket says 192.168.123.108:42878) 2026-03-10T07:46:24.907 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:24.906+0000 7ff13b7fe700 1 -- 192.168.123.108:0/3234843583 learned_addr learned my addr 192.168.123.108:0/3234843583 (peer_addr_for_me v2:192.168.123.108:0/0) 2026-03-10T07:46:24.907 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:24.906+0000 7ff13b7fe700 1 -- 192.168.123.108:0/3234843583 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- 
mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7ff1240097e0 con 0x7ff13c1014f0 2026-03-10T07:46:24.908 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:24.906+0000 7ff13b7fe700 1 --2- 192.168.123.108:0/3234843583 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff13c1014f0 0x7ff13c195b60 secure :-1 s=READY pgs=112 cs=0 l=1 rev1=1 crypto rx=0x7ff124004f40 tx=0x7ff124005e70 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:46:24.909 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:24.907+0000 7ff138ff9700 1 -- 192.168.123.108:0/3234843583 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7ff12401c070 con 0x7ff13c1014f0 2026-03-10T07:46:24.909 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:24.907+0000 7ff138ff9700 1 -- 192.168.123.108:0/3234843583 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7ff1240053b0 con 0x7ff13c1014f0 2026-03-10T07:46:24.909 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:24.907+0000 7ff141e75700 1 -- 192.168.123.108:0/3234843583 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7ff13c1962a0 con 0x7ff13c1014f0 2026-03-10T07:46:24.909 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:24.907+0000 7ff141e75700 1 -- 192.168.123.108:0/3234843583 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7ff13c196740 con 0x7ff13c1014f0 2026-03-10T07:46:24.909 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:24.907+0000 7ff138ff9700 1 -- 192.168.123.108:0/3234843583 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7ff12400f460 con 0x7ff13c1014f0 2026-03-10T07:46:24.909 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:24.908+0000 7ff138ff9700 1 -- 192.168.123.108:0/3234843583 <== mon.0 
v2:192.168.123.105:3300/0 4 ==== mgrmap(e 13) v1 ==== 45337+0+0 (secure 0 0 0) 0x7ff124021470 con 0x7ff13c1014f0 2026-03-10T07:46:24.909 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:24.908+0000 7ff141e75700 1 -- 192.168.123.108:0/3234843583 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7ff13c18f7f0 con 0x7ff13c1014f0 2026-03-10T07:46:24.909 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:24.908+0000 7ff138ff9700 1 --2- 192.168.123.108:0/3234843583 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7ff1280383a0 0x7ff12803a860 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:46:24.909 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:24.908+0000 7ff138ff9700 1 -- 192.168.123.108:0/3234843583 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(4..4 src has 1..4) v4 ==== 1069+0+0 (secure 0 0 0) 0x7ff12404c3c0 con 0x7ff13c1014f0 2026-03-10T07:46:24.909 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:24.908+0000 7ff13affd700 1 --2- 192.168.123.108:0/3234843583 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7ff1280383a0 0x7ff12803a860 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:46:24.910 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:24.908+0000 7ff13affd700 1 --2- 192.168.123.108:0/3234843583 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7ff1280383a0 0x7ff12803a860 secure :-1 s=READY pgs=21 cs=0 l=1 rev1=1 crypto rx=0x7ff12c006fd0 tx=0x7ff12c006e40 comp rx=0 tx=0).ready entity=mgr.14164 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:46:24.912 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:24.911+0000 7ff138ff9700 1 -- 192.168.123.108:0/3234843583 <== mon.0 v2:192.168.123.105:3300/0 6 ==== 
mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7ff12400f6e0 con 0x7ff13c1014f0 2026-03-10T07:46:25.066 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:25.063+0000 7ff141e75700 1 -- 192.168.123.108:0/3234843583 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "mon dump", "format": "json"} v 0) v1 -- 0x7ff13c0623c0 con 0x7ff13c1014f0 2026-03-10T07:46:25.066 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:25.064+0000 7ff138ff9700 1 -- 192.168.123.108:0/3234843583 <== mon.0 v2:192.168.123.105:3300/0 7 ==== mon_command_ack([{"prefix": "mon dump", "format": "json"}]=0 dumped monmap epoch 1 v1) v1 ==== 95+0+751 (secure 0 0 0) 0x7ff124026030 con 0x7ff13c1014f0 2026-03-10T07:46:25.066 INFO:teuthology.orchestra.run.vm08.stdout: 2026-03-10T07:46:25.066 INFO:teuthology.orchestra.run.vm08.stdout:{"epoch":1,"fsid":"12e9780e-1c55-11f1-8896-79f7c2e9b508","modified":"2026-03-10T07:45:39.881211Z","created":"2026-03-10T07:45:39.881211Z","min_mon_release":18,"min_mon_release_name":"reef","election_strategy":1,"disallowed_leaders: ":"","stretch_mode":false,"tiebreaker_mon":"","removed_ranks: ":"","features":{"persistent":["kraken","luminous","mimic","osdmap-prune","nautilus","octopus","pacific","elector-pinging","quincy","reef"],"optional":[]},"mons":[{"rank":0,"name":"vm05","public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.105:3300","nonce":0},{"type":"v1","addr":"192.168.123.105:6789","nonce":0}]},"addr":"192.168.123.105:6789/0","public_addr":"192.168.123.105:6789/0","priority":0,"weight":0,"crush_location":"{}"}],"quorum":[0]} 2026-03-10T07:46:25.068 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:25.067+0000 7ff141e75700 1 -- 192.168.123.108:0/3234843583 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7ff1280383a0 msgr2=0x7ff12803a860 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:46:25.068 
INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:25.067+0000 7ff141e75700 1 --2- 192.168.123.108:0/3234843583 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7ff1280383a0 0x7ff12803a860 secure :-1 s=READY pgs=21 cs=0 l=1 rev1=1 crypto rx=0x7ff12c006fd0 tx=0x7ff12c006e40 comp rx=0 tx=0).stop 2026-03-10T07:46:25.068 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:25.067+0000 7ff141e75700 1 -- 192.168.123.108:0/3234843583 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff13c1014f0 msgr2=0x7ff13c195b60 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:46:25.068 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:25.067+0000 7ff141e75700 1 --2- 192.168.123.108:0/3234843583 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff13c1014f0 0x7ff13c195b60 secure :-1 s=READY pgs=112 cs=0 l=1 rev1=1 crypto rx=0x7ff124004f40 tx=0x7ff124005e70 comp rx=0 tx=0).stop 2026-03-10T07:46:25.068 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:25.067+0000 7ff141e75700 1 -- 192.168.123.108:0/3234843583 shutdown_connections 2026-03-10T07:46:25.069 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:25.067+0000 7ff141e75700 1 --2- 192.168.123.108:0/3234843583 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7ff1280383a0 0x7ff12803a860 unknown :-1 s=CLOSED pgs=21 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:46:25.069 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:25.067+0000 7ff141e75700 1 --2- 192.168.123.108:0/3234843583 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff13c1014f0 0x7ff13c195b60 unknown :-1 s=CLOSED pgs=112 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:46:25.069 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:25.067+0000 7ff141e75700 1 -- 192.168.123.108:0/3234843583 >> 192.168.123.108:0/3234843583 conn(0x7ff13c0faf00 msgr2=0x7ff13c0fbbc0 unknown :-1 s=STATE_NONE 
l=0).mark_down 2026-03-10T07:46:25.069 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:25.067+0000 7ff141e75700 1 -- 192.168.123.108:0/3234843583 shutdown_connections 2026-03-10T07:46:25.069 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:25.067+0000 7ff141e75700 1 -- 192.168.123.108:0/3234843583 wait complete. 2026-03-10T07:46:25.069 INFO:teuthology.orchestra.run.vm08.stderr:dumped monmap epoch 1 2026-03-10T07:46:25.908 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:46:25 vm05 ceph-mon[50387]: from='mgr.14164 192.168.123.105:0/2' entity='mgr.vm05.blexke' 2026-03-10T07:46:25.908 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:46:25 vm05 ceph-mon[50387]: from='client.? 192.168.123.108:0/3234843583' entity='client.admin' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch 2026-03-10T07:46:26.134 INFO:tasks.cephadm:Waiting for 2 mons in monmap... 2026-03-10T07:46:26.134 DEBUG:teuthology.orchestra.run.vm08:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 12e9780e-1c55-11f1-8896-79f7c2e9b508 -- ceph mon dump -f json 2026-03-10T07:46:26.268 INFO:teuthology.orchestra.run.vm08.stderr:Inferring config /etc/ceph/ceph.conf 2026-03-10T07:46:26.312 INFO:teuthology.orchestra.run.vm08.stderr:Inferring config /etc/ceph/ceph.conf 2026-03-10T07:46:26.548 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:26.546+0000 7f458c707700 1 -- 192.168.123.108:0/3930842236 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f4584102cb0 msgr2=0x7f45841030d0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:46:26.548 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:26.546+0000 7f458c707700 1 --2- 192.168.123.108:0/3930842236 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f4584102cb0 0x7f45841030d0 secure :-1 s=READY pgs=113 cs=0 l=1 rev1=1 crypto rx=0x7f4574009b00 tx=0x7f4574009e10 comp rx=0 
tx=0).stop 2026-03-10T07:46:26.548 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:26.547+0000 7f458c707700 1 -- 192.168.123.108:0/3930842236 shutdown_connections 2026-03-10T07:46:26.548 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:26.547+0000 7f458c707700 1 --2- 192.168.123.108:0/3930842236 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f4584102cb0 0x7f45841030d0 unknown :-1 s=CLOSED pgs=113 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:46:26.548 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:26.547+0000 7f458c707700 1 -- 192.168.123.108:0/3930842236 >> 192.168.123.108:0/3930842236 conn(0x7f45840fe250 msgr2=0x7f4584100690 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:46:26.548 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:26.547+0000 7f458c707700 1 -- 192.168.123.108:0/3930842236 shutdown_connections 2026-03-10T07:46:26.548 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:26.547+0000 7f458c707700 1 -- 192.168.123.108:0/3930842236 wait complete. 
2026-03-10T07:46:26.548 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:26.547+0000 7f458c707700 1 Processor -- start 2026-03-10T07:46:26.548 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:26.547+0000 7f458c707700 1 -- start start 2026-03-10T07:46:26.549 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:26.547+0000 7f458c707700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f4584102cb0 0x7f4584197e20 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:46:26.549 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:26.547+0000 7f458c707700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f4584198360 con 0x7f4584102cb0 2026-03-10T07:46:26.549 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:26.548+0000 7f458a4a3700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f4584102cb0 0x7f4584197e20 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:46:26.549 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:26.548+0000 7f458a4a3700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f4584102cb0 0x7f4584197e20 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.108:46554/0 (socket says 192.168.123.108:46554) 2026-03-10T07:46:26.549 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:26.548+0000 7f458a4a3700 1 -- 192.168.123.108:0/350815027 learned_addr learned my addr 192.168.123.108:0/350815027 (peer_addr_for_me v2:192.168.123.108:0/0) 2026-03-10T07:46:26.549 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:26.548+0000 7f458a4a3700 1 -- 192.168.123.108:0/350815027 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- 
mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f45740097e0 con 0x7f4584102cb0 2026-03-10T07:46:26.549 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:26.548+0000 7f458a4a3700 1 --2- 192.168.123.108:0/350815027 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f4584102cb0 0x7f4584197e20 secure :-1 s=READY pgs=114 cs=0 l=1 rev1=1 crypto rx=0x7f4574004d40 tx=0x7f4574004e20 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:46:26.549 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:26.548+0000 7f457b7fe700 1 -- 192.168.123.108:0/350815027 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f457401c070 con 0x7f4584102cb0 2026-03-10T07:46:26.549 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:26.548+0000 7f458c707700 1 -- 192.168.123.108:0/350815027 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f4584198560 con 0x7f4584102cb0 2026-03-10T07:46:26.550 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:26.548+0000 7f458c707700 1 -- 192.168.123.108:0/350815027 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f4584198a00 con 0x7f4584102cb0 2026-03-10T07:46:26.550 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:26.548+0000 7f457b7fe700 1 -- 192.168.123.108:0/350815027 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f45740054e0 con 0x7f4584102cb0 2026-03-10T07:46:26.550 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:26.548+0000 7f457b7fe700 1 -- 192.168.123.108:0/350815027 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f4574003b70 con 0x7f4584102cb0 2026-03-10T07:46:26.550 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:26.549+0000 7f457b7fe700 1 -- 192.168.123.108:0/350815027 <== mon.0 v2:192.168.123.105:3300/0 4 
==== mgrmap(e 13) v1 ==== 45337+0+0 (secure 0 0 0) 0x7f4574005000 con 0x7f4584102cb0 2026-03-10T07:46:26.551 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:26.549+0000 7f457b7fe700 1 --2- 192.168.123.108:0/350815027 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f4570038440 0x7f457003a900 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:46:26.551 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:26.549+0000 7f457b7fe700 1 -- 192.168.123.108:0/350815027 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(4..4 src has 1..4) v4 ==== 1069+0+0 (secure 0 0 0) 0x7f457404c1a0 con 0x7f4584102cb0 2026-03-10T07:46:26.551 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:26.549+0000 7f458c707700 1 -- 192.168.123.108:0/350815027 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f4568005320 con 0x7f4584102cb0 2026-03-10T07:46:26.551 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:26.550+0000 7f4589ca2700 1 --2- 192.168.123.108:0/350815027 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f4570038440 0x7f457003a900 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:46:26.551 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:26.550+0000 7f4589ca2700 1 --2- 192.168.123.108:0/350815027 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f4570038440 0x7f457003a900 secure :-1 s=READY pgs=22 cs=0 l=1 rev1=1 crypto rx=0x7f4580006fd0 tx=0x7f4580006e40 comp rx=0 tx=0).ready entity=mgr.14164 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:46:26.553 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:26.552+0000 7f457b7fe700 1 -- 192.168.123.108:0/350815027 <== mon.0 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": 
"get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7f4574004500 con 0x7f4584102cb0 2026-03-10T07:46:26.701 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:26.699+0000 7f458c707700 1 -- 192.168.123.108:0/350815027 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "mon dump", "format": "json"} v 0) v1 -- 0x7f4568005190 con 0x7f4584102cb0 2026-03-10T07:46:26.701 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:26.700+0000 7f457b7fe700 1 -- 192.168.123.108:0/350815027 <== mon.0 v2:192.168.123.105:3300/0 7 ==== mon_command_ack([{"prefix": "mon dump", "format": "json"}]=0 dumped monmap epoch 1 v1) v1 ==== 95+0+751 (secure 0 0 0) 0x7f45740203c0 con 0x7f4584102cb0 2026-03-10T07:46:26.701 INFO:teuthology.orchestra.run.vm08.stdout: 2026-03-10T07:46:26.702 INFO:teuthology.orchestra.run.vm08.stdout:{"epoch":1,"fsid":"12e9780e-1c55-11f1-8896-79f7c2e9b508","modified":"2026-03-10T07:45:39.881211Z","created":"2026-03-10T07:45:39.881211Z","min_mon_release":18,"min_mon_release_name":"reef","election_strategy":1,"disallowed_leaders: ":"","stretch_mode":false,"tiebreaker_mon":"","removed_ranks: ":"","features":{"persistent":["kraken","luminous","mimic","osdmap-prune","nautilus","octopus","pacific","elector-pinging","quincy","reef"],"optional":[]},"mons":[{"rank":0,"name":"vm05","public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.105:3300","nonce":0},{"type":"v1","addr":"192.168.123.105:6789","nonce":0}]},"addr":"192.168.123.105:6789/0","public_addr":"192.168.123.105:6789/0","priority":0,"weight":0,"crush_location":"{}"}],"quorum":[0]} 2026-03-10T07:46:26.704 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:26.702+0000 7f458c707700 1 -- 192.168.123.108:0/350815027 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f4570038440 msgr2=0x7f457003a900 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:46:26.704 
INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:26.702+0000 7f458c707700 1 --2- 192.168.123.108:0/350815027 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f4570038440 0x7f457003a900 secure :-1 s=READY pgs=22 cs=0 l=1 rev1=1 crypto rx=0x7f4580006fd0 tx=0x7f4580006e40 comp rx=0 tx=0).stop 2026-03-10T07:46:26.704 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:26.702+0000 7f458c707700 1 -- 192.168.123.108:0/350815027 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f4584102cb0 msgr2=0x7f4584197e20 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:46:26.704 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:26.702+0000 7f458c707700 1 --2- 192.168.123.108:0/350815027 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f4584102cb0 0x7f4584197e20 secure :-1 s=READY pgs=114 cs=0 l=1 rev1=1 crypto rx=0x7f4574004d40 tx=0x7f4574004e20 comp rx=0 tx=0).stop 2026-03-10T07:46:26.704 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:26.703+0000 7f458c707700 1 -- 192.168.123.108:0/350815027 shutdown_connections 2026-03-10T07:46:26.704 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:26.703+0000 7f458c707700 1 --2- 192.168.123.108:0/350815027 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f4570038440 0x7f457003a900 unknown :-1 s=CLOSED pgs=22 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:46:26.704 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:26.703+0000 7f458c707700 1 --2- 192.168.123.108:0/350815027 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f4584102cb0 0x7f4584197e20 unknown :-1 s=CLOSED pgs=114 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:46:26.704 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:26.703+0000 7f458c707700 1 -- 192.168.123.108:0/350815027 >> 192.168.123.108:0/350815027 conn(0x7f45840fe250 msgr2=0x7f45840fef10 unknown :-1 s=STATE_NONE 
l=0).mark_down 2026-03-10T07:46:26.704 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:26.703+0000 7f458c707700 1 -- 192.168.123.108:0/350815027 shutdown_connections 2026-03-10T07:46:26.704 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:26.703+0000 7f458c707700 1 -- 192.168.123.108:0/350815027 wait complete. 2026-03-10T07:46:26.704 INFO:teuthology.orchestra.run.vm08.stderr:dumped monmap epoch 1 2026-03-10T07:46:27.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:46:26 vm05 ceph-mon[50387]: from='client.? 192.168.123.108:0/350815027' entity='client.admin' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch 2026-03-10T07:46:27.766 INFO:tasks.cephadm:Waiting for 2 mons in monmap... 2026-03-10T07:46:27.766 DEBUG:teuthology.orchestra.run.vm08:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 12e9780e-1c55-11f1-8896-79f7c2e9b508 -- ceph mon dump -f json 2026-03-10T07:46:27.906 INFO:teuthology.orchestra.run.vm08.stderr:Inferring config /etc/ceph/ceph.conf 2026-03-10T07:46:27.944 INFO:teuthology.orchestra.run.vm08.stderr:Inferring config /etc/ceph/ceph.conf 2026-03-10T07:46:28.208 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:28.206+0000 7f0fb32bc700 1 -- 192.168.123.108:0/2890945824 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f0fac101590 msgr2=0x7f0fac103980 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:46:28.208 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:28.206+0000 7f0fb32bc700 1 --2- 192.168.123.108:0/2890945824 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f0fac101590 0x7f0fac103980 secure :-1 s=READY pgs=115 cs=0 l=1 rev1=1 crypto rx=0x7f0f9c009b00 tx=0x7f0f9c009e10 comp rx=0 tx=0).stop 2026-03-10T07:46:28.208 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:28.207+0000 7f0fb32bc700 1 -- 192.168.123.108:0/2890945824 shutdown_connections 
2026-03-10T07:46:28.208 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:28.207+0000 7f0fb32bc700 1 --2- 192.168.123.108:0/2890945824 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f0fac101590 0x7f0fac103980 unknown :-1 s=CLOSED pgs=115 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:46:28.208 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:28.207+0000 7f0fb32bc700 1 -- 192.168.123.108:0/2890945824 >> 192.168.123.108:0/2890945824 conn(0x7f0fac0faf00 msgr2=0x7f0fac0fd340 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:46:28.208 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:28.207+0000 7f0fb32bc700 1 -- 192.168.123.108:0/2890945824 shutdown_connections 2026-03-10T07:46:28.208 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:28.207+0000 7f0fb32bc700 1 -- 192.168.123.108:0/2890945824 wait complete. 2026-03-10T07:46:28.209 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:28.208+0000 7f0fb32bc700 1 Processor -- start 2026-03-10T07:46:28.209 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:28.208+0000 7f0fb32bc700 1 -- start start 2026-03-10T07:46:28.209 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:28.208+0000 7f0fb32bc700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f0fac101590 0x7f0fac100ef0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:46:28.209 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:28.208+0000 7f0fb32bc700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f0fac101430 con 0x7f0fac101590 2026-03-10T07:46:28.209 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:28.208+0000 7f0fb1058700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f0fac101590 0x7f0fac100ef0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload 
supported=3 required=0 2026-03-10T07:46:28.209 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:28.208+0000 7f0fb1058700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f0fac101590 0x7f0fac100ef0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.108:46582/0 (socket says 192.168.123.108:46582) 2026-03-10T07:46:28.209 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:28.208+0000 7f0fb1058700 1 -- 192.168.123.108:0/983173839 learned_addr learned my addr 192.168.123.108:0/983173839 (peer_addr_for_me v2:192.168.123.108:0/0) 2026-03-10T07:46:28.210 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:28.209+0000 7f0fb1058700 1 -- 192.168.123.108:0/983173839 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f0f9c0097e0 con 0x7f0fac101590 2026-03-10T07:46:28.210 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:28.209+0000 7f0fb1058700 1 --2- 192.168.123.108:0/983173839 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f0fac101590 0x7f0fac100ef0 secure :-1 s=READY pgs=116 cs=0 l=1 rev1=1 crypto rx=0x7f0f9c004f40 tx=0x7f0f9c005e70 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:46:28.210 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:28.209+0000 7f0fa27fc700 1 -- 192.168.123.108:0/983173839 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f0f9c01c070 con 0x7f0fac101590 2026-03-10T07:46:28.210 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:28.209+0000 7f0fb32bc700 1 -- 192.168.123.108:0/983173839 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f0fac0ff540 con 0x7f0fac101590 2026-03-10T07:46:28.210 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:28.209+0000 
7f0fa27fc700 1 -- 192.168.123.108:0/983173839 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f0f9c0053b0 con 0x7f0fac101590 2026-03-10T07:46:28.211 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:28.209+0000 7f0fa27fc700 1 -- 192.168.123.108:0/983173839 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f0f9c00f550 con 0x7f0fac101590 2026-03-10T07:46:28.212 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:28.209+0000 7f0fb32bc700 1 -- 192.168.123.108:0/983173839 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f0fac0ff9e0 con 0x7f0fac101590 2026-03-10T07:46:28.212 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:28.210+0000 7f0fa27fc700 1 -- 192.168.123.108:0/983173839 <== mon.0 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 13) v1 ==== 45337+0+0 (secure 0 0 0) 0x7f0f9c00f6b0 con 0x7f0fac101590 2026-03-10T07:46:28.212 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:28.210+0000 7f0fb32bc700 1 -- 192.168.123.108:0/983173839 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f0fac04fa20 con 0x7f0fac101590 2026-03-10T07:46:28.212 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:28.210+0000 7f0fa27fc700 1 --2- 192.168.123.108:0/983173839 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f0f98040c50 0x7f0f98043110 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:46:28.212 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:28.210+0000 7f0fa27fc700 1 -- 192.168.123.108:0/983173839 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(4..4 src has 1..4) v4 ==== 1069+0+0 (secure 0 0 0) 0x7f0f9c04d4a0 con 0x7f0fac101590 2026-03-10T07:46:28.212 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:28.211+0000 7f0fb0857700 1 --2- 
192.168.123.108:0/983173839 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f0f98040c50 0x7f0f98043110 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:46:28.212 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:28.211+0000 7f0fb0857700 1 --2- 192.168.123.108:0/983173839 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f0f98040c50 0x7f0f98043110 secure :-1 s=READY pgs=23 cs=0 l=1 rev1=1 crypto rx=0x7f0fa8006fd0 tx=0x7f0fa8006e40 comp rx=0 tx=0).ready entity=mgr.14164 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:46:28.214 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:28.213+0000 7f0fa27fc700 1 -- 192.168.123.108:0/983173839 <== mon.0 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7f0f9c029950 con 0x7f0fac101590 2026-03-10T07:46:28.359 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:28.357+0000 7f0fb32bc700 1 -- 192.168.123.108:0/983173839 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "mon dump", "format": "json"} v 0) v1 -- 0x7f0fac18f800 con 0x7f0fac101590 2026-03-10T07:46:28.361 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:28.359+0000 7f0fa27fc700 1 -- 192.168.123.108:0/983173839 <== mon.0 v2:192.168.123.105:3300/0 7 ==== mon_command_ack([{"prefix": "mon dump", "format": "json"}]=0 dumped monmap epoch 1 v1) v1 ==== 95+0+751 (secure 0 0 0) 0x7f0fac18f800 con 0x7f0fac101590 2026-03-10T07:46:28.361 INFO:teuthology.orchestra.run.vm08.stdout: 2026-03-10T07:46:28.361 INFO:teuthology.orchestra.run.vm08.stdout:{"epoch":1,"fsid":"12e9780e-1c55-11f1-8896-79f7c2e9b508","modified":"2026-03-10T07:45:39.881211Z","created":"2026-03-10T07:45:39.881211Z","min_mon_release":18,"min_mon_release_name":"reef","election_strategy":1,"disallowed_leaders: 
":"","stretch_mode":false,"tiebreaker_mon":"","removed_ranks: ":"","features":{"persistent":["kraken","luminous","mimic","osdmap-prune","nautilus","octopus","pacific","elector-pinging","quincy","reef"],"optional":[]},"mons":[{"rank":0,"name":"vm05","public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.105:3300","nonce":0},{"type":"v1","addr":"192.168.123.105:6789","nonce":0}]},"addr":"192.168.123.105:6789/0","public_addr":"192.168.123.105:6789/0","priority":0,"weight":0,"crush_location":"{}"}],"quorum":[0]} 2026-03-10T07:46:28.363 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:28.362+0000 7f0fb32bc700 1 -- 192.168.123.108:0/983173839 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f0f98040c50 msgr2=0x7f0f98043110 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:46:28.363 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:28.362+0000 7f0fb32bc700 1 --2- 192.168.123.108:0/983173839 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f0f98040c50 0x7f0f98043110 secure :-1 s=READY pgs=23 cs=0 l=1 rev1=1 crypto rx=0x7f0fa8006fd0 tx=0x7f0fa8006e40 comp rx=0 tx=0).stop 2026-03-10T07:46:28.363 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:28.362+0000 7f0fb32bc700 1 -- 192.168.123.108:0/983173839 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f0fac101590 msgr2=0x7f0fac100ef0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:46:28.363 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:28.362+0000 7f0fb32bc700 1 --2- 192.168.123.108:0/983173839 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f0fac101590 0x7f0fac100ef0 secure :-1 s=READY pgs=116 cs=0 l=1 rev1=1 crypto rx=0x7f0f9c004f40 tx=0x7f0f9c005e70 comp rx=0 tx=0).stop 2026-03-10T07:46:28.363 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:28.362+0000 7f0fb32bc700 1 -- 192.168.123.108:0/983173839 shutdown_connections 2026-03-10T07:46:28.363 
INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:28.362+0000 7f0fb32bc700 1 --2- 192.168.123.108:0/983173839 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f0f98040c50 0x7f0f98043110 unknown :-1 s=CLOSED pgs=23 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:46:28.363 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:28.362+0000 7f0fb32bc700 1 --2- 192.168.123.108:0/983173839 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f0fac101590 0x7f0fac100ef0 unknown :-1 s=CLOSED pgs=116 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:46:28.363 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:28.362+0000 7f0fb32bc700 1 -- 192.168.123.108:0/983173839 >> 192.168.123.108:0/983173839 conn(0x7f0fac0faf00 msgr2=0x7f0fac0fbbc0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:46:28.363 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:28.362+0000 7f0fb32bc700 1 -- 192.168.123.108:0/983173839 shutdown_connections 2026-03-10T07:46:28.364 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:28.362+0000 7f0fb32bc700 1 -- 192.168.123.108:0/983173839 wait complete. 2026-03-10T07:46:28.364 INFO:teuthology.orchestra.run.vm08.stderr:dumped monmap epoch 1 2026-03-10T07:46:28.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:46:28 vm05 ceph-mon[50387]: from='client.? 192.168.123.108:0/983173839' entity='client.admin' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch 2026-03-10T07:46:29.419 INFO:tasks.cephadm:Waiting for 2 mons in monmap... 
2026-03-10T07:46:29.419 DEBUG:teuthology.orchestra.run.vm08:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 12e9780e-1c55-11f1-8896-79f7c2e9b508 -- ceph mon dump -f json
2026-03-10T07:46:29.564 INFO:teuthology.orchestra.run.vm08.stderr:Inferring config /etc/ceph/ceph.conf
2026-03-10T07:46:29.601 INFO:teuthology.orchestra.run.vm08.stderr:Inferring config /etc/ceph/ceph.conf
2026-03-10T07:46:29.883 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:29.880+0000 7f06f2da4700 1 -- 192.168.123.108:0/1757848642 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f06ec102cb0 msgr2=0x7f06ec1030d0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T07:46:29.883 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:29.880+0000 7f06f2da4700 1 --2- 192.168.123.108:0/1757848642 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f06ec102cb0 0x7f06ec1030d0 secure :-1 s=READY pgs=117 cs=0 l=1 rev1=1 crypto rx=0x7f06dc009b00 tx=0x7f06dc009e10 comp rx=0 tx=0).stop
2026-03-10T07:46:29.883 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:29.881+0000 7f06f2da4700 1 -- 192.168.123.108:0/1757848642 shutdown_connections
2026-03-10T07:46:29.883 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:29.881+0000 7f06f2da4700 1 --2- 192.168.123.108:0/1757848642 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f06ec102cb0 0x7f06ec1030d0 unknown :-1 s=CLOSED pgs=117 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T07:46:29.883 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:29.881+0000 7f06f2da4700 1 -- 192.168.123.108:0/1757848642 >> 192.168.123.108:0/1757848642 conn(0x7f06ec0fe250 msgr2=0x7f06ec100690 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-10T07:46:29.883 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:29.881+0000 7f06f2da4700 1 -- 192.168.123.108:0/1757848642 shutdown_connections
2026-03-10T07:46:29.883 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:29.881+0000 7f06f2da4700 1 -- 192.168.123.108:0/1757848642 wait complete.
2026-03-10T07:46:29.883 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:29.882+0000 7f06f2da4700 1 Processor -- start
2026-03-10T07:46:29.883 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:29.882+0000 7f06f2da4700 1 -- start start
2026-03-10T07:46:29.883 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:29.882+0000 7f06f2da4700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f06ec102cb0 0x7f06ec078b40 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T07:46:29.883 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:29.882+0000 7f06f2da4700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f06ec079080 con 0x7f06ec102cb0
2026-03-10T07:46:29.883 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:29.882+0000 7f06f0b40700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f06ec102cb0 0x7f06ec078b40 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-10T07:46:29.883 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:29.882+0000 7f06f0b40700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f06ec102cb0 0x7f06ec078b40 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.108:46598/0 (socket says 192.168.123.108:46598)
2026-03-10T07:46:29.883 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:29.882+0000 7f06f0b40700 1 -- 192.168.123.108:0/519037469 learned_addr learned my addr 192.168.123.108:0/519037469 (peer_addr_for_me v2:192.168.123.108:0/0)
2026-03-10T07:46:29.883 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:29.882+0000 7f06f0b40700 1 -- 192.168.123.108:0/519037469 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f06dc0097e0 con 0x7f06ec102cb0
2026-03-10T07:46:29.884 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:29.883+0000 7f06f0b40700 1 --2- 192.168.123.108:0/519037469 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f06ec102cb0 0x7f06ec078b40 secure :-1 s=READY pgs=118 cs=0 l=1 rev1=1 crypto rx=0x7f06dc004d10 tx=0x7f06dc004df0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-10T07:46:29.884 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:29.883+0000 7f06e9ffb700 1 -- 192.168.123.108:0/519037469 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f06dc01d070 con 0x7f06ec102cb0
2026-03-10T07:46:29.885 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:29.883+0000 7f06f2da4700 1 -- 192.168.123.108:0/519037469 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f06ec079280 con 0x7f06ec102cb0
2026-03-10T07:46:29.885 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:29.883+0000 7f06f2da4700 1 -- 192.168.123.108:0/519037469 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f06ec0757f0 con 0x7f06ec102cb0
2026-03-10T07:46:29.885 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:29.883+0000 7f06e9ffb700 1 -- 192.168.123.108:0/519037469 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f06dc0056f0 con 0x7f06ec102cb0
2026-03-10T07:46:29.885 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:29.883+0000 7f06e9ffb700 1 -- 192.168.123.108:0/519037469 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f06dc00f460 con 0x7f06ec102cb0
2026-03-10T07:46:29.885 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:29.884+0000 7f06e9ffb700 1 -- 192.168.123.108:0/519037469 <== mon.0 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 13) v1 ==== 45337+0+0 (secure 0 0 0) 0x7f06dc00f710 con 0x7f06ec102cb0
2026-03-10T07:46:29.885 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:29.884+0000 7f06e9ffb700 1 --2- 192.168.123.108:0/519037469 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f06d4038430 0x7f06d403a8f0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T07:46:29.885 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:29.884+0000 7f06ebfff700 1 --2- 192.168.123.108:0/519037469 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f06d4038430 0x7f06d403a8f0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-10T07:46:29.885 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:29.884+0000 7f06f2da4700 1 -- 192.168.123.108:0/519037469 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f06ec079410 con 0x7f06ec102cb0
2026-03-10T07:46:29.885 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:29.884+0000 7f06e9ffb700 1 -- 192.168.123.108:0/519037469 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(4..4 src has 1..4) v4 ==== 1069+0+0 (secure 0 0 0) 0x7f06dc04d410 con 0x7f06ec102cb0
2026-03-10T07:46:29.886 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:29.885+0000 7f06ebfff700 1 --2- 192.168.123.108:0/519037469 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f06d4038430 0x7f06d403a8f0 secure :-1 s=READY pgs=24 cs=0 l=1 rev1=1 crypto rx=0x7f06e0006fd0 tx=0x7f06e0006e40 comp rx=0 tx=0).ready entity=mgr.14164 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-10T07:46:29.888 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:29.887+0000 7f06e9ffb700 1 -- 192.168.123.108:0/519037469 <== mon.0 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7f06dc027070 con 0x7f06ec102cb0
2026-03-10T07:46:30.037 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:30.035+0000 7f06f2da4700 1 -- 192.168.123.108:0/519037469 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "mon dump", "format": "json"} v 0) v1 -- 0x7f06ec04fa20 con 0x7f06ec102cb0
2026-03-10T07:46:30.037 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:30.035+0000 7f06e9ffb700 1 -- 192.168.123.108:0/519037469 <== mon.0 v2:192.168.123.105:3300/0 7 ==== mon_command_ack([{"prefix": "mon dump", "format": "json"}]=0 dumped monmap epoch 1 v1) v1 ==== 95+0+751 (secure 0 0 0) 0x7f06dc02ab60 con 0x7f06ec102cb0
2026-03-10T07:46:30.038 INFO:teuthology.orchestra.run.vm08.stdout:
2026-03-10T07:46:30.038 INFO:teuthology.orchestra.run.vm08.stdout:{"epoch":1,"fsid":"12e9780e-1c55-11f1-8896-79f7c2e9b508","modified":"2026-03-10T07:45:39.881211Z","created":"2026-03-10T07:45:39.881211Z","min_mon_release":18,"min_mon_release_name":"reef","election_strategy":1,"disallowed_leaders: ":"","stretch_mode":false,"tiebreaker_mon":"","removed_ranks: ":"","features":{"persistent":["kraken","luminous","mimic","osdmap-prune","nautilus","octopus","pacific","elector-pinging","quincy","reef"],"optional":[]},"mons":[{"rank":0,"name":"vm05","public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.105:3300","nonce":0},{"type":"v1","addr":"192.168.123.105:6789","nonce":0}]},"addr":"192.168.123.105:6789/0","public_addr":"192.168.123.105:6789/0","priority":0,"weight":0,"crush_location":"{}"}],"quorum":[0]}
2026-03-10T07:46:30.040 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:30.039+0000 7f06f2da4700 1 -- 192.168.123.108:0/519037469 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f06d4038430 msgr2=0x7f06d403a8f0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T07:46:30.040 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:30.039+0000 7f06f2da4700 1 --2- 192.168.123.108:0/519037469 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f06d4038430 0x7f06d403a8f0 secure :-1 s=READY pgs=24 cs=0 l=1 rev1=1 crypto rx=0x7f06e0006fd0 tx=0x7f06e0006e40 comp rx=0 tx=0).stop
2026-03-10T07:46:30.040 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:30.039+0000 7f06f2da4700 1 -- 192.168.123.108:0/519037469 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f06ec102cb0 msgr2=0x7f06ec078b40 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T07:46:30.040 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:30.039+0000 7f06f2da4700 1 --2- 192.168.123.108:0/519037469 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f06ec102cb0 0x7f06ec078b40 secure :-1 s=READY pgs=118 cs=0 l=1 rev1=1 crypto rx=0x7f06dc004d10 tx=0x7f06dc004df0 comp rx=0 tx=0).stop
2026-03-10T07:46:30.040 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:30.039+0000 7f06f2da4700 1 -- 192.168.123.108:0/519037469 shutdown_connections
2026-03-10T07:46:30.041 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:30.039+0000 7f06f2da4700 1 --2- 192.168.123.108:0/519037469 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f06d4038430 0x7f06d403a8f0 unknown :-1 s=CLOSED pgs=24 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T07:46:30.041 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:30.040+0000 7f06f2da4700 1 --2- 192.168.123.108:0/519037469 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f06ec102cb0 0x7f06ec078b40 unknown :-1 s=CLOSED pgs=118 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T07:46:30.041 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:30.040+0000 7f06f2da4700 1 -- 192.168.123.108:0/519037469 >> 192.168.123.108:0/519037469 conn(0x7f06ec0fe250 msgr2=0x7f06ec0fef10 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-10T07:46:30.041 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:30.040+0000 7f06f2da4700 1 -- 192.168.123.108:0/519037469 shutdown_connections
2026-03-10T07:46:30.041 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:30.040+0000 7f06f2da4700 1 -- 192.168.123.108:0/519037469 wait complete.
2026-03-10T07:46:30.042 INFO:teuthology.orchestra.run.vm08.stderr:dumped monmap epoch 1
2026-03-10T07:46:31.111 INFO:tasks.cephadm:Waiting for 2 mons in monmap...
2026-03-10T07:46:31.112 DEBUG:teuthology.orchestra.run.vm08:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 12e9780e-1c55-11f1-8896-79f7c2e9b508 -- ceph mon dump -f json
2026-03-10T07:46:31.158 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:46:30 vm05 ceph-mon[50387]: pgmap v4: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail
2026-03-10T07:46:31.158 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:46:30 vm05 ceph-mon[50387]: from='client.? 192.168.123.108:0/519037469' entity='client.admin' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
2026-03-10T07:46:31.252 INFO:teuthology.orchestra.run.vm08.stderr:Inferring config /etc/ceph/ceph.conf
2026-03-10T07:46:31.289 INFO:teuthology.orchestra.run.vm08.stderr:Inferring config /etc/ceph/ceph.conf
2026-03-10T07:46:31.706 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:31.704+0000 7fb017c78700 1 -- 192.168.123.108:0/101365938 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb010074dc0 msgr2=0x7fb010073220 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T07:46:31.706 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:31.704+0000 7fb017c78700 1 --2- 192.168.123.108:0/101365938 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb010074dc0 0x7fb010073220 secure :-1 s=READY pgs=119 cs=0 l=1 rev1=1 crypto rx=0x7fb004009b00 tx=0x7fb004009e10 comp rx=0 tx=0).stop
2026-03-10T07:46:31.706 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:31.705+0000 7fb017c78700 1 -- 192.168.123.108:0/101365938 shutdown_connections
2026-03-10T07:46:31.706 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:31.705+0000 7fb017c78700 1 --2- 192.168.123.108:0/101365938 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb010074dc0 0x7fb010073220 unknown :-1 s=CLOSED pgs=119 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T07:46:31.706 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:31.705+0000 7fb017c78700 1 -- 192.168.123.108:0/101365938 >> 192.168.123.108:0/101365938 conn(0x7fb0100fc000 msgr2=0x7fb0100fe440 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-10T07:46:31.706 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:31.705+0000 7fb017c78700 1 -- 192.168.123.108:0/101365938 shutdown_connections
2026-03-10T07:46:31.706 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:31.705+0000 7fb017c78700 1 -- 192.168.123.108:0/101365938 wait complete.
2026-03-10T07:46:31.706 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:31.705+0000 7fb017c78700 1 Processor -- start
2026-03-10T07:46:31.707 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:31.706+0000 7fb017c78700 1 -- start start
2026-03-10T07:46:31.707 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:31.706+0000 7fb017c78700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb010074dc0 0x7fb01019c160 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T07:46:31.707 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:31.706+0000 7fb017c78700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fb01019c6a0 con 0x7fb010074dc0
2026-03-10T07:46:31.707 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:31.706+0000 7fb015a14700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb010074dc0 0x7fb01019c160 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-10T07:46:31.707 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:31.706+0000 7fb015a14700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb010074dc0 0x7fb01019c160 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.108:46618/0 (socket says 192.168.123.108:46618)
2026-03-10T07:46:31.707 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:31.706+0000 7fb015a14700 1 -- 192.168.123.108:0/1569555262 learned_addr learned my addr 192.168.123.108:0/1569555262 (peer_addr_for_me v2:192.168.123.108:0/0)
2026-03-10T07:46:31.707 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:31.706+0000 7fb015a14700 1 -- 192.168.123.108:0/1569555262 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fb0040097e0 con 0x7fb010074dc0
2026-03-10T07:46:31.707 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:31.706+0000 7fb015a14700 1 --2- 192.168.123.108:0/1569555262 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb010074dc0 0x7fb01019c160 secure :-1 s=READY pgs=120 cs=0 l=1 rev1=1 crypto rx=0x7fb004000c00 tx=0x7fb004004740 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-10T07:46:31.708 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:31.707+0000 7fb002ffd700 1 -- 192.168.123.108:0/1569555262 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fb00401c070 con 0x7fb010074dc0
2026-03-10T07:46:31.708 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:31.707+0000 7fb017c78700 1 -- 192.168.123.108:0/1569555262 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fb01019c8a0 con 0x7fb010074dc0
2026-03-10T07:46:31.708 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:31.707+0000 7fb017c78700 1 -- 192.168.123.108:0/1569555262 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fb01019cd40 con 0x7fb010074dc0
2026-03-10T07:46:31.709 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:31.707+0000 7fb002ffd700 1 -- 192.168.123.108:0/1569555262 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fb0040053b0 con 0x7fb010074dc0
2026-03-10T07:46:31.709 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:31.707+0000 7fb002ffd700 1 -- 192.168.123.108:0/1569555262 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fb00400f460 con 0x7fb010074dc0
2026-03-10T07:46:31.709 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:31.708+0000 7fb002ffd700 1 -- 192.168.123.108:0/1569555262 <== mon.0 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 13) v1 ==== 45337+0+0 (secure 0 0 0) 0x7fb00400f6b0 con 0x7fb010074dc0
2026-03-10T07:46:31.709 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:31.708+0000 7fb017c78700 1 -- 192.168.123.108:0/1569555262 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fb010195e00 con 0x7fb010074dc0
2026-03-10T07:46:31.709 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:31.708+0000 7fb002ffd700 1 --2- 192.168.123.108:0/1569555262 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7faffc038490 0x7faffc03a950 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T07:46:31.709 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:31.708+0000 7fb002ffd700 1 -- 192.168.123.108:0/1569555262 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(4..4 src has 1..4) v4 ==== 1069+0+0 (secure 0 0 0) 0x7fb00404d630 con 0x7fb010074dc0
2026-03-10T07:46:31.709 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:31.708+0000 7fb015213700 1 --2- 192.168.123.108:0/1569555262 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7faffc038490 0x7faffc03a950 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-10T07:46:31.710 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:31.708+0000 7fb015213700 1 --2- 192.168.123.108:0/1569555262 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7faffc038490 0x7faffc03a950 secure :-1 s=READY pgs=25 cs=0 l=1 rev1=1 crypto rx=0x7fb00c006fd0 tx=0x7fb00c006e40 comp rx=0 tx=0).ready entity=mgr.14164 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-10T07:46:31.712 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:31.711+0000 7fb002ffd700 1 -- 192.168.123.108:0/1569555262 <== mon.0 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7fb004017440 con 0x7fb010074dc0
2026-03-10T07:46:31.852 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:31.850+0000 7fb017c78700 1 -- 192.168.123.108:0/1569555262 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "mon dump", "format": "json"} v 0) v1 -- 0x7fb0100623c0 con 0x7fb010074dc0
2026-03-10T07:46:31.852 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:31.851+0000 7fb002ffd700 1 -- 192.168.123.108:0/1569555262 <== mon.0 v2:192.168.123.105:3300/0 7 ==== mon_command_ack([{"prefix": "mon dump", "format": "json"}]=0 dumped monmap epoch 1 v1) v1 ==== 95+0+751 (secure 0 0 0) 0x7fb004026020 con 0x7fb010074dc0
2026-03-10T07:46:31.852 INFO:teuthology.orchestra.run.vm08.stdout:
2026-03-10T07:46:31.852 INFO:teuthology.orchestra.run.vm08.stdout:{"epoch":1,"fsid":"12e9780e-1c55-11f1-8896-79f7c2e9b508","modified":"2026-03-10T07:45:39.881211Z","created":"2026-03-10T07:45:39.881211Z","min_mon_release":18,"min_mon_release_name":"reef","election_strategy":1,"disallowed_leaders: ":"","stretch_mode":false,"tiebreaker_mon":"","removed_ranks: ":"","features":{"persistent":["kraken","luminous","mimic","osdmap-prune","nautilus","octopus","pacific","elector-pinging","quincy","reef"],"optional":[]},"mons":[{"rank":0,"name":"vm05","public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.105:3300","nonce":0},{"type":"v1","addr":"192.168.123.105:6789","nonce":0}]},"addr":"192.168.123.105:6789/0","public_addr":"192.168.123.105:6789/0","priority":0,"weight":0,"crush_location":"{}"}],"quorum":[0]}
2026-03-10T07:46:31.854 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:31.853+0000 7fb017c78700 1 -- 192.168.123.108:0/1569555262 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7faffc038490 msgr2=0x7faffc03a950 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T07:46:31.854 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:31.853+0000 7fb017c78700 1 --2- 192.168.123.108:0/1569555262 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7faffc038490 0x7faffc03a950 secure :-1 s=READY pgs=25 cs=0 l=1 rev1=1 crypto rx=0x7fb00c006fd0 tx=0x7fb00c006e40 comp rx=0 tx=0).stop
2026-03-10T07:46:31.854 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:31.853+0000 7fb017c78700 1 -- 192.168.123.108:0/1569555262 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb010074dc0 msgr2=0x7fb01019c160 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T07:46:31.855 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:31.853+0000 7fb017c78700 1 --2- 192.168.123.108:0/1569555262 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb010074dc0 0x7fb01019c160 secure :-1 s=READY pgs=120 cs=0 l=1 rev1=1 crypto rx=0x7fb004000c00 tx=0x7fb004004740 comp rx=0 tx=0).stop
2026-03-10T07:46:31.855 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:31.853+0000 7fb017c78700 1 -- 192.168.123.108:0/1569555262 shutdown_connections
2026-03-10T07:46:31.855 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:31.853+0000 7fb017c78700 1 --2- 192.168.123.108:0/1569555262 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7faffc038490 0x7faffc03a950 unknown :-1 s=CLOSED pgs=25 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T07:46:31.855 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:31.853+0000 7fb017c78700 1 --2- 192.168.123.108:0/1569555262 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb010074dc0 0x7fb01019c160 unknown :-1 s=CLOSED pgs=120 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T07:46:31.855 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:31.853+0000 7fb017c78700 1 -- 192.168.123.108:0/1569555262 >> 192.168.123.108:0/1569555262 conn(0x7fb0100fc000 msgr2=0x7fb0100fccc0 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-10T07:46:31.855 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:31.854+0000 7fb017c78700 1 -- 192.168.123.108:0/1569555262 shutdown_connections
2026-03-10T07:46:31.855 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:31.854+0000 7fb017c78700 1 -- 192.168.123.108:0/1569555262 wait complete.
2026-03-10T07:46:31.855 INFO:teuthology.orchestra.run.vm08.stderr:dumped monmap epoch 1
2026-03-10T07:46:32.409 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:46:32 vm05 ceph-mon[50387]: from='client.? 192.168.123.108:0/1569555262' entity='client.admin' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
2026-03-10T07:46:32.945 INFO:tasks.cephadm:Waiting for 2 mons in monmap...
2026-03-10T07:46:32.945 DEBUG:teuthology.orchestra.run.vm08:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 12e9780e-1c55-11f1-8896-79f7c2e9b508 -- ceph mon dump -f json
2026-03-10T07:46:33.086 INFO:teuthology.orchestra.run.vm08.stderr:Inferring config /etc/ceph/ceph.conf
2026-03-10T07:46:33.125 INFO:teuthology.orchestra.run.vm08.stderr:Inferring config /etc/ceph/ceph.conf
2026-03-10T07:46:33.179 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:46:33 vm05 ceph-mon[50387]: pgmap v5: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail
2026-03-10T07:46:33.180 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:46:33 vm05 ceph-mon[50387]: from='mgr.14164 192.168.123.105:0/2' entity='mgr.vm05.blexke'
2026-03-10T07:46:33.180 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:46:33 vm05 ceph-mon[50387]: from='mgr.14164 192.168.123.105:0/2' entity='mgr.vm05.blexke'
2026-03-10T07:46:33.180 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:46:33 vm05 ceph-mon[50387]: from='mgr.14164 192.168.123.105:0/2' entity='mgr.vm05.blexke'
2026-03-10T07:46:33.180 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:46:33 vm05 ceph-mon[50387]: from='mgr.14164 192.168.123.105:0/2' entity='mgr.vm05.blexke'
2026-03-10T07:46:33.180 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:46:33 vm05 ceph-mon[50387]: from='mgr.14164 192.168.123.105:0/2' entity='mgr.vm05.blexke'
2026-03-10T07:46:33.180 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:46:33 vm05 ceph-mon[50387]: from='mgr.14164 192.168.123.105:0/2' entity='mgr.vm05.blexke'
2026-03-10T07:46:33.180 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:46:33 vm05 ceph-mon[50387]: from='mgr.14164 192.168.123.105:0/2' entity='mgr.vm05.blexke'
2026-03-10T07:46:33.180 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:46:33 vm05 ceph-mon[50387]: from='mgr.14164 192.168.123.105:0/2' entity='mgr.vm05.blexke'
2026-03-10T07:46:33.372 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:33.369+0000 7f16e615d700 1 -- 192.168.123.108:0/1597289276 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f16e01014f0 msgr2=0x7f16e01038e0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T07:46:33.372 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:33.369+0000 7f16e615d700 1 --2- 192.168.123.108:0/1597289276 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f16e01014f0 0x7f16e01038e0 secure :-1 s=READY pgs=121 cs=0 l=1 rev1=1 crypto rx=0x7f16c8009b00 tx=0x7f16c8009e10 comp rx=0 tx=0).stop
2026-03-10T07:46:33.372 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:33.370+0000 7f16e615d700 1 -- 192.168.123.108:0/1597289276 shutdown_connections
2026-03-10T07:46:33.372 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:33.370+0000 7f16e615d700 1 --2- 192.168.123.108:0/1597289276 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f16e01014f0 0x7f16e01038e0 unknown :-1 s=CLOSED pgs=121 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T07:46:33.372 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:33.370+0000 7f16e615d700 1 -- 192.168.123.108:0/1597289276 >> 192.168.123.108:0/1597289276 conn(0x7f16e00faf00 msgr2=0x7f16e00fd340 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-10T07:46:33.372 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:33.370+0000 7f16e615d700 1 -- 192.168.123.108:0/1597289276 shutdown_connections
2026-03-10T07:46:33.372 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:33.371+0000 7f16e615d700 1 -- 192.168.123.108:0/1597289276 wait complete.
2026-03-10T07:46:33.372 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:33.371+0000 7f16e615d700 1 Processor -- start
2026-03-10T07:46:33.372 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:33.371+0000 7f16e615d700 1 -- start start
2026-03-10T07:46:33.373 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:33.372+0000 7f16e615d700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f16e01014f0 0x7f16e0100e50 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T07:46:33.373 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:33.372+0000 7f16e615d700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f16e0101390 con 0x7f16e01014f0
2026-03-10T07:46:33.373 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:33.372+0000 7f16df7fe700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f16e01014f0 0x7f16e0100e50 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-10T07:46:33.373 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:33.372+0000 7f16df7fe700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f16e01014f0 0x7f16e0100e50 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.108:46634/0 (socket says 192.168.123.108:46634)
2026-03-10T07:46:33.373 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:33.372+0000 7f16df7fe700 1 -- 192.168.123.108:0/1067341082 learned_addr learned my addr 192.168.123.108:0/1067341082 (peer_addr_for_me v2:192.168.123.108:0/0)
2026-03-10T07:46:33.374 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:33.373+0000 7f16df7fe700 1 -- 192.168.123.108:0/1067341082 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f16c80097e0 con 0x7f16e01014f0
2026-03-10T07:46:33.374 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:33.373+0000 7f16df7fe700 1 --2- 192.168.123.108:0/1067341082 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f16e01014f0 0x7f16e0100e50 secure :-1 s=READY pgs=122 cs=0 l=1 rev1=1 crypto rx=0x7f16c8004f40 tx=0x7f16c8005e70 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-10T07:46:33.375 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:33.373+0000 7f16dcff9700 1 -- 192.168.123.108:0/1067341082 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f16c801c070 con 0x7f16e01014f0
2026-03-10T07:46:33.375 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:33.373+0000 7f16dcff9700 1 -- 192.168.123.108:0/1067341082 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f16c80053b0 con 0x7f16e01014f0
2026-03-10T07:46:33.375 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:33.373+0000 7f16dcff9700 1 -- 192.168.123.108:0/1067341082 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f16c800f590 con 0x7f16e01014f0
2026-03-10T07:46:33.375 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:33.373+0000 7f16e615d700 1 -- 192.168.123.108:0/1067341082 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f16e00ff4a0 con 0x7f16e01014f0
2026-03-10T07:46:33.375 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:33.373+0000 7f16e615d700 1 -- 192.168.123.108:0/1067341082 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f16e00ff940 con 0x7f16e01014f0
2026-03-10T07:46:33.376 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:33.374+0000 7f16dcff9700 1 -- 192.168.123.108:0/1067341082 <== mon.0 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 13) v1 ==== 45337+0+0 (secure 0 0 0) 0x7f16c800f6f0 con 0x7f16e01014f0
2026-03-10T07:46:33.376 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:33.374+0000 7f16e615d700 1 -- 192.168.123.108:0/1067341082 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f16e004fa90 con 0x7f16e01014f0
2026-03-10T07:46:33.376 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:33.375+0000 7f16dcff9700 1 --2- 192.168.123.108:0/1067341082 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f16cc0383f0 0x7f16cc03a8b0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T07:46:33.376 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:33.375+0000 7f16dcff9700 1 -- 192.168.123.108:0/1067341082 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(4..4 src has 1..4) v4 ==== 1069+0+0 (secure 0 0 0) 0x7f16c804c330 con 0x7f16e01014f0
2026-03-10T07:46:33.378 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:33.377+0000 7f16deffd700 1 --2- 192.168.123.108:0/1067341082 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f16cc0383f0 0x7f16cc03a8b0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-10T07:46:33.379 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:33.377+0000 7f16dcff9700 1 -- 192.168.123.108:0/1067341082 <== mon.0 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7f16c8029950 con 0x7f16e01014f0
2026-03-10T07:46:33.379 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:33.377+0000 7f16deffd700 1 --2- 192.168.123.108:0/1067341082 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f16cc0383f0 0x7f16cc03a8b0 secure :-1 s=READY pgs=26 cs=0 l=1 rev1=1 crypto rx=0x7f16d0006fd0 tx=0x7f16d0006e40 comp rx=0 tx=0).ready entity=mgr.14164 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-10T07:46:33.522 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:33.521+0000 7f16e615d700 1 -- 192.168.123.108:0/1067341082 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "mon dump", "format": "json"} v 0) v1 -- 0x7f16e00623f0 con 0x7f16e01014f0
2026-03-10T07:46:33.523 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:33.522+0000 7f16dcff9700 1 -- 192.168.123.108:0/1067341082 <== mon.0 v2:192.168.123.105:3300/0 7 ==== mon_command_ack([{"prefix": "mon dump", "format": "json"}]=0 dumped monmap epoch 1 v1) v1 ==== 95+0+751 (secure 0 0 0) 0x7f16c8026030 con 0x7f16e01014f0
2026-03-10T07:46:33.524 INFO:teuthology.orchestra.run.vm08.stdout:
2026-03-10T07:46:33.524 INFO:teuthology.orchestra.run.vm08.stdout:{"epoch":1,"fsid":"12e9780e-1c55-11f1-8896-79f7c2e9b508","modified":"2026-03-10T07:45:39.881211Z","created":"2026-03-10T07:45:39.881211Z","min_mon_release":18,"min_mon_release_name":"reef","election_strategy":1,"disallowed_leaders: ":"","stretch_mode":false,"tiebreaker_mon":"","removed_ranks: ":"","features":{"persistent":["kraken","luminous","mimic","osdmap-prune","nautilus","octopus","pacific","elector-pinging","quincy","reef"],"optional":[]},"mons":[{"rank":0,"name":"vm05","public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.105:3300","nonce":0},{"type":"v1","addr":"192.168.123.105:6789","nonce":0}]},"addr":"192.168.123.105:6789/0","public_addr":"192.168.123.105:6789/0","priority":0,"weight":0,"crush_location":"{}"}],"quorum":[0]}
2026-03-10T07:46:33.526 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:33.525+0000 7f16e615d700 1 -- 192.168.123.108:0/1067341082 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f16cc0383f0 msgr2=0x7f16cc03a8b0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T07:46:33.526 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:33.525+0000 7f16e615d700 1 --2- 192.168.123.108:0/1067341082 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f16cc0383f0 0x7f16cc03a8b0 secure :-1 s=READY pgs=26 cs=0 l=1 rev1=1 crypto rx=0x7f16d0006fd0 tx=0x7f16d0006e40 comp rx=0 tx=0).stop
2026-03-10T07:46:33.526 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:33.525+0000 7f16e615d700 1 -- 192.168.123.108:0/1067341082 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f16e01014f0 msgr2=0x7f16e0100e50 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T07:46:33.526 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:33.525+0000 7f16e615d700 1 --2- 192.168.123.108:0/1067341082 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f16e01014f0 0x7f16e0100e50 secure :-1 s=READY pgs=122 cs=0 l=1 rev1=1 crypto rx=0x7f16c8004f40 tx=0x7f16c8005e70 comp rx=0 tx=0).stop
2026-03-10T07:46:33.527 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:33.525+0000 7f16e615d700 1 -- 192.168.123.108:0/1067341082 shutdown_connections
2026-03-10T07:46:33.527 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:33.526+0000 7f16e615d700 1 --2- 192.168.123.108:0/1067341082 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f16cc0383f0 0x7f16cc03a8b0 unknown :-1 s=CLOSED pgs=26 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T07:46:33.527 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:33.526+0000 7f16e615d700 1 --2- 192.168.123.108:0/1067341082 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f16e01014f0 0x7f16e0100e50 unknown :-1 s=CLOSED pgs=122 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T07:46:33.527 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:33.526+0000 7f16e615d700 1 -- 192.168.123.108:0/1067341082 >> 192.168.123.108:0/1067341082 conn(0x7f16e00faf00 msgr2=0x7f16e00fbbc0 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-10T07:46:33.527 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:33.526+0000 7f16e615d700 1 -- 192.168.123.108:0/1067341082 shutdown_connections
2026-03-10T07:46:33.527 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:33.526+0000 7f16e615d700 1 -- 192.168.123.108:0/1067341082 wait complete.
2026-03-10T07:46:33.528 INFO:teuthology.orchestra.run.vm08.stderr:dumped monmap epoch 1
2026-03-10T07:46:34.407 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:46:34 vm05 ceph-mon[50387]: Deploying daemon prometheus.vm05 on vm05
2026-03-10T07:46:34.408 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:46:34 vm05 ceph-mon[50387]: from='client.? 192.168.123.108:0/1067341082' entity='client.admin' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
2026-03-10T07:46:34.585 INFO:tasks.cephadm:Waiting for 2 mons in monmap...
2026-03-10T07:46:34.586 DEBUG:teuthology.orchestra.run.vm08:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 12e9780e-1c55-11f1-8896-79f7c2e9b508 -- ceph mon dump -f json 2026-03-10T07:46:34.720 INFO:teuthology.orchestra.run.vm08.stderr:Inferring config /etc/ceph/ceph.conf 2026-03-10T07:46:34.758 INFO:teuthology.orchestra.run.vm08.stderr:Inferring config /etc/ceph/ceph.conf 2026-03-10T07:46:35.006 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:35.004+0000 7f8b4828c700 1 -- 192.168.123.108:0/2831198542 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f8b400ff360 msgr2=0x7f8b400ff780 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:46:35.006 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:35.004+0000 7f8b4828c700 1 --2- 192.168.123.108:0/2831198542 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f8b400ff360 0x7f8b400ff780 secure :-1 s=READY pgs=123 cs=0 l=1 rev1=1 crypto rx=0x7f8b30009b00 tx=0x7f8b30009e10 comp rx=0 tx=0).stop 2026-03-10T07:46:35.006 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:35.005+0000 7f8b4828c700 1 -- 192.168.123.108:0/2831198542 shutdown_connections 2026-03-10T07:46:35.006 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:35.005+0000 7f8b4828c700 1 --2- 192.168.123.108:0/2831198542 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f8b400ff360 0x7f8b400ff780 unknown :-1 s=CLOSED pgs=123 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:46:35.006 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:35.005+0000 7f8b4828c700 1 -- 192.168.123.108:0/2831198542 >> 192.168.123.108:0/2831198542 conn(0x7f8b400faf00 msgr2=0x7f8b400fd340 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:46:35.006 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:35.005+0000 7f8b4828c700 1 -- 192.168.123.108:0/2831198542 
shutdown_connections 2026-03-10T07:46:35.006 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:35.005+0000 7f8b4828c700 1 -- 192.168.123.108:0/2831198542 wait complete. 2026-03-10T07:46:35.007 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:35.005+0000 7f8b4828c700 1 Processor -- start 2026-03-10T07:46:35.007 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:35.006+0000 7f8b4828c700 1 -- start start 2026-03-10T07:46:35.007 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:35.006+0000 7f8b4828c700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f8b400ff360 0x7f8b40104ee0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:46:35.007 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:35.006+0000 7f8b4828c700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f8b40105420 con 0x7f8b400ff360 2026-03-10T07:46:35.007 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:35.006+0000 7f8b46028700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f8b400ff360 0x7f8b40104ee0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:46:35.007 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:35.006+0000 7f8b46028700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f8b400ff360 0x7f8b40104ee0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.108:46662/0 (socket says 192.168.123.108:46662) 2026-03-10T07:46:35.007 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:35.006+0000 7f8b46028700 1 -- 192.168.123.108:0/2131133528 learned_addr learned my addr 192.168.123.108:0/2131133528 (peer_addr_for_me v2:192.168.123.108:0/0) 2026-03-10T07:46:35.007 
INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:35.006+0000 7f8b46028700 1 -- 192.168.123.108:0/2131133528 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f8b300097e0 con 0x7f8b400ff360 2026-03-10T07:46:35.008 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:35.007+0000 7f8b46028700 1 --2- 192.168.123.108:0/2131133528 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f8b400ff360 0x7f8b40104ee0 secure :-1 s=READY pgs=124 cs=0 l=1 rev1=1 crypto rx=0x7f8b30004f40 tx=0x7f8b30005e70 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:46:35.008 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:35.007+0000 7f8b377fe700 1 -- 192.168.123.108:0/2131133528 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f8b3001c070 con 0x7f8b400ff360 2026-03-10T07:46:35.009 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:35.007+0000 7f8b377fe700 1 -- 192.168.123.108:0/2131133528 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f8b300053b0 con 0x7f8b400ff360 2026-03-10T07:46:35.009 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:35.007+0000 7f8b4828c700 1 -- 192.168.123.108:0/2131133528 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f8b40105620 con 0x7f8b400ff360 2026-03-10T07:46:35.009 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:35.007+0000 7f8b377fe700 1 -- 192.168.123.108:0/2131133528 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f8b3000f460 con 0x7f8b400ff360 2026-03-10T07:46:35.009 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:35.007+0000 7f8b4828c700 1 -- 192.168.123.108:0/2131133528 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f8b40101b90 con 
0x7f8b400ff360 2026-03-10T07:46:35.009 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:35.008+0000 7f8b377fe700 1 -- 192.168.123.108:0/2131133528 <== mon.0 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 13) v1 ==== 45337+0+0 (secure 0 0 0) 0x7f8b30021470 con 0x7f8b400ff360 2026-03-10T07:46:35.009 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:35.008+0000 7f8b4828c700 1 -- 192.168.123.108:0/2131133528 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f8b401057b0 con 0x7f8b400ff360 2026-03-10T07:46:35.009 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:35.008+0000 7f8b377fe700 1 --2- 192.168.123.108:0/2131133528 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f8b2c0383f0 0x7f8b2c03a8b0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:46:35.009 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:35.008+0000 7f8b377fe700 1 -- 192.168.123.108:0/2131133528 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(4..4 src has 1..4) v4 ==== 1069+0+0 (secure 0 0 0) 0x7f8b3004c2a0 con 0x7f8b400ff360 2026-03-10T07:46:35.012 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:35.010+0000 7f8b45827700 1 --2- 192.168.123.108:0/2131133528 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f8b2c0383f0 0x7f8b2c03a8b0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:46:35.012 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:35.011+0000 7f8b377fe700 1 -- 192.168.123.108:0/2131133528 <== mon.0 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7f8b3000f5e0 con 0x7f8b400ff360 2026-03-10T07:46:35.012 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:35.011+0000 7f8b45827700 1 --2- 
192.168.123.108:0/2131133528 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f8b2c0383f0 0x7f8b2c03a8b0 secure :-1 s=READY pgs=27 cs=0 l=1 rev1=1 crypto rx=0x7f8b3c006fd0 tx=0x7f8b3c006e40 comp rx=0 tx=0).ready entity=mgr.14164 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:46:35.156 INFO:teuthology.orchestra.run.vm08.stdout: 2026-03-10T07:46:35.156 INFO:teuthology.orchestra.run.vm08.stdout:{"epoch":1,"fsid":"12e9780e-1c55-11f1-8896-79f7c2e9b508","modified":"2026-03-10T07:45:39.881211Z","created":"2026-03-10T07:45:39.881211Z","min_mon_release":18,"min_mon_release_name":"reef","election_strategy":1,"disallowed_leaders: ":"","stretch_mode":false,"tiebreaker_mon":"","removed_ranks: ":"","features":{"persistent":["kraken","luminous","mimic","osdmap-prune","nautilus","octopus","pacific","elector-pinging","quincy","reef"],"optional":[]},"mons":[{"rank":0,"name":"vm05","public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.105:3300","nonce":0},{"type":"v1","addr":"192.168.123.105:6789","nonce":0}]},"addr":"192.168.123.105:6789/0","public_addr":"192.168.123.105:6789/0","priority":0,"weight":0,"crush_location":"{}"}],"quorum":[0]} 2026-03-10T07:46:35.156 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:35.152+0000 7f8b4828c700 1 -- 192.168.123.108:0/2131133528 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "mon dump", "format": "json"} v 0) v1 -- 0x7f8b4004fa20 con 0x7f8b400ff360 2026-03-10T07:46:35.156 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:35.154+0000 7f8b377fe700 1 -- 192.168.123.108:0/2131133528 <== mon.0 v2:192.168.123.105:3300/0 7 ==== mon_command_ack([{"prefix": "mon dump", "format": "json"}]=0 dumped monmap epoch 1 v1) v1 ==== 95+0+751 (secure 0 0 0) 0x7f8b30030300 con 0x7f8b400ff360 2026-03-10T07:46:35.157 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:35.156+0000 7f8b4828c700 1 -- 192.168.123.108:0/2131133528 >> 
[v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f8b2c0383f0 msgr2=0x7f8b2c03a8b0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:46:35.157 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:35.156+0000 7f8b4828c700 1 --2- 192.168.123.108:0/2131133528 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f8b2c0383f0 0x7f8b2c03a8b0 secure :-1 s=READY pgs=27 cs=0 l=1 rev1=1 crypto rx=0x7f8b3c006fd0 tx=0x7f8b3c006e40 comp rx=0 tx=0).stop 2026-03-10T07:46:35.157 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:35.156+0000 7f8b4828c700 1 -- 192.168.123.108:0/2131133528 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f8b400ff360 msgr2=0x7f8b40104ee0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:46:35.157 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:35.156+0000 7f8b4828c700 1 --2- 192.168.123.108:0/2131133528 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f8b400ff360 0x7f8b40104ee0 secure :-1 s=READY pgs=124 cs=0 l=1 rev1=1 crypto rx=0x7f8b30004f40 tx=0x7f8b30005e70 comp rx=0 tx=0).stop 2026-03-10T07:46:35.157 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:35.156+0000 7f8b4828c700 1 -- 192.168.123.108:0/2131133528 shutdown_connections 2026-03-10T07:46:35.157 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:35.156+0000 7f8b4828c700 1 --2- 192.168.123.108:0/2131133528 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f8b2c0383f0 0x7f8b2c03a8b0 unknown :-1 s=CLOSED pgs=27 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:46:35.157 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:35.156+0000 7f8b4828c700 1 --2- 192.168.123.108:0/2131133528 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f8b400ff360 0x7f8b40104ee0 unknown :-1 s=CLOSED pgs=124 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:46:35.157 
INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:35.156+0000 7f8b4828c700 1 -- 192.168.123.108:0/2131133528 >> 192.168.123.108:0/2131133528 conn(0x7f8b400faf00 msgr2=0x7f8b400fbbc0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:46:35.157 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:35.156+0000 7f8b4828c700 1 -- 192.168.123.108:0/2131133528 shutdown_connections 2026-03-10T07:46:35.157 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:35.156+0000 7f8b4828c700 1 -- 192.168.123.108:0/2131133528 wait complete. 2026-03-10T07:46:35.158 INFO:teuthology.orchestra.run.vm08.stderr:dumped monmap epoch 1 2026-03-10T07:46:35.407 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:46:35 vm05 ceph-mon[50387]: pgmap v6: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail 2026-03-10T07:46:35.408 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:46:35 vm05 ceph-mon[50387]: from='mgr.14164 192.168.123.105:0/2' entity='mgr.vm05.blexke' 2026-03-10T07:46:36.221 INFO:tasks.cephadm:Waiting for 2 mons in monmap... 2026-03-10T07:46:36.222 DEBUG:teuthology.orchestra.run.vm08:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 12e9780e-1c55-11f1-8896-79f7c2e9b508 -- ceph mon dump -f json 2026-03-10T07:46:36.357 INFO:teuthology.orchestra.run.vm08.stderr:Inferring config /etc/ceph/ceph.conf 2026-03-10T07:46:36.387 INFO:teuthology.orchestra.run.vm08.stderr:Inferring config /etc/ceph/ceph.conf 2026-03-10T07:46:36.408 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:46:36 vm05 ceph-mon[50387]: from='client.? 
192.168.123.108:0/2131133528' entity='client.admin' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch 2026-03-10T07:46:36.408 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:46:36 vm05 ceph-mon[50387]: pgmap v7: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail 2026-03-10T07:46:36.608 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:36.606+0000 7f95f0ad9700 1 -- 192.168.123.108:0/215396530 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f95ec0ff2f0 msgr2=0x7f95ec101720 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:46:36.608 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:36.606+0000 7f95f0ad9700 1 --2- 192.168.123.108:0/215396530 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f95ec0ff2f0 0x7f95ec101720 secure :-1 s=READY pgs=125 cs=0 l=1 rev1=1 crypto rx=0x7f95dc009b00 tx=0x7f95dc009e10 comp rx=0 tx=0).stop 2026-03-10T07:46:36.608 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:36.606+0000 7f95f0ad9700 1 -- 192.168.123.108:0/215396530 shutdown_connections 2026-03-10T07:46:36.608 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:36.606+0000 7f95f0ad9700 1 --2- 192.168.123.108:0/215396530 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f95ec0ff2f0 0x7f95ec101720 unknown :-1 s=CLOSED pgs=125 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:46:36.608 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:36.607+0000 7f95f0ad9700 1 -- 192.168.123.108:0/215396530 >> 192.168.123.108:0/215396530 conn(0x7f95ec0faf30 msgr2=0x7f95ec0fd350 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:46:36.608 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:36.607+0000 7f95f0ad9700 1 -- 192.168.123.108:0/215396530 shutdown_connections 2026-03-10T07:46:36.608 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:36.607+0000 7f95f0ad9700 1 -- 192.168.123.108:0/215396530 wait complete. 
2026-03-10T07:46:36.608 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:36.607+0000 7f95f0ad9700 1 Processor -- start 2026-03-10T07:46:36.609 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:36.607+0000 7f95f0ad9700 1 -- start start 2026-03-10T07:46:36.609 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:36.608+0000 7f95f0ad9700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f95ec0ff2f0 0x7f95ec19c170 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:46:36.609 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:36.608+0000 7f95f0ad9700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f95ec19c6b0 con 0x7f95ec0ff2f0 2026-03-10T07:46:36.609 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:36.608+0000 7f95ea59c700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f95ec0ff2f0 0x7f95ec19c170 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:46:36.609 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:36.608+0000 7f95ea59c700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f95ec0ff2f0 0x7f95ec19c170 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.108:33620/0 (socket says 192.168.123.108:33620) 2026-03-10T07:46:36.609 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:36.608+0000 7f95ea59c700 1 -- 192.168.123.108:0/4019805140 learned_addr learned my addr 192.168.123.108:0/4019805140 (peer_addr_for_me v2:192.168.123.108:0/0) 2026-03-10T07:46:36.609 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:36.608+0000 7f95ea59c700 1 -- 192.168.123.108:0/4019805140 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- 
mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f95dc0097e0 con 0x7f95ec0ff2f0 2026-03-10T07:46:36.610 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:36.609+0000 7f95ea59c700 1 --2- 192.168.123.108:0/4019805140 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f95ec0ff2f0 0x7f95ec19c170 secure :-1 s=READY pgs=126 cs=0 l=1 rev1=1 crypto rx=0x7f95dc004f40 tx=0x7f95dc005e70 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:46:36.610 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:36.609+0000 7f95e37fe700 1 -- 192.168.123.108:0/4019805140 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f95dc01c070 con 0x7f95ec0ff2f0 2026-03-10T07:46:36.610 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:36.609+0000 7f95e37fe700 1 -- 192.168.123.108:0/4019805140 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f95dc0053b0 con 0x7f95ec0ff2f0 2026-03-10T07:46:36.610 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:36.609+0000 7f95f0ad9700 1 -- 192.168.123.108:0/4019805140 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f95ec19c8b0 con 0x7f95ec0ff2f0 2026-03-10T07:46:36.611 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:36.609+0000 7f95e37fe700 1 -- 192.168.123.108:0/4019805140 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f95dc00f460 con 0x7f95ec0ff2f0 2026-03-10T07:46:36.611 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:36.609+0000 7f95f0ad9700 1 -- 192.168.123.108:0/4019805140 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f95ec19cd50 con 0x7f95ec0ff2f0 2026-03-10T07:46:36.611 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:36.610+0000 7f95e37fe700 1 -- 192.168.123.108:0/4019805140 <== mon.0 
v2:192.168.123.105:3300/0 4 ==== mgrmap(e 13) v1 ==== 45337+0+0 (secure 0 0 0) 0x7f95dc021470 con 0x7f95ec0ff2f0 2026-03-10T07:46:36.612 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:36.610+0000 7f95f0ad9700 1 -- 192.168.123.108:0/4019805140 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f95ec1965e0 con 0x7f95ec0ff2f0 2026-03-10T07:46:36.612 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:36.610+0000 7f95e37fe700 1 --2- 192.168.123.108:0/4019805140 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f95d8038440 0x7f95d803a900 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:46:36.612 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:36.610+0000 7f95e37fe700 1 -- 192.168.123.108:0/4019805140 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(4..4 src has 1..4) v4 ==== 1069+0+0 (secure 0 0 0) 0x7f95dc04c310 con 0x7f95ec0ff2f0 2026-03-10T07:46:36.614 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:36.612+0000 7f95e9d9b700 1 --2- 192.168.123.108:0/4019805140 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f95d8038440 0x7f95d803a900 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:46:36.614 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:36.613+0000 7f95e37fe700 1 -- 192.168.123.108:0/4019805140 <== mon.0 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7f95dc00f6f0 con 0x7f95ec0ff2f0 2026-03-10T07:46:36.614 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:36.613+0000 7f95e9d9b700 1 --2- 192.168.123.108:0/4019805140 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f95d8038440 0x7f95d803a900 secure :-1 s=READY pgs=28 cs=0 l=1 rev1=1 crypto 
rx=0x7f95d4006fd0 tx=0x7f95d4006e40 comp rx=0 tx=0).ready entity=mgr.14164 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:46:36.756 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:36.753+0000 7f95f0ad9700 1 -- 192.168.123.108:0/4019805140 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "mon dump", "format": "json"} v 0) v1 -- 0x7f95ec0623c0 con 0x7f95ec0ff2f0 2026-03-10T07:46:36.756 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:36.755+0000 7f95e37fe700 1 -- 192.168.123.108:0/4019805140 <== mon.0 v2:192.168.123.105:3300/0 7 ==== mon_command_ack([{"prefix": "mon dump", "format": "json"}]=0 dumped monmap epoch 1 v1) v1 ==== 95+0+751 (secure 0 0 0) 0x7f95dc026030 con 0x7f95ec0ff2f0 2026-03-10T07:46:36.756 INFO:teuthology.orchestra.run.vm08.stdout: 2026-03-10T07:46:36.756 INFO:teuthology.orchestra.run.vm08.stdout:{"epoch":1,"fsid":"12e9780e-1c55-11f1-8896-79f7c2e9b508","modified":"2026-03-10T07:45:39.881211Z","created":"2026-03-10T07:45:39.881211Z","min_mon_release":18,"min_mon_release_name":"reef","election_strategy":1,"disallowed_leaders: ":"","stretch_mode":false,"tiebreaker_mon":"","removed_ranks: ":"","features":{"persistent":["kraken","luminous","mimic","osdmap-prune","nautilus","octopus","pacific","elector-pinging","quincy","reef"],"optional":[]},"mons":[{"rank":0,"name":"vm05","public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.105:3300","nonce":0},{"type":"v1","addr":"192.168.123.105:6789","nonce":0}]},"addr":"192.168.123.105:6789/0","public_addr":"192.168.123.105:6789/0","priority":0,"weight":0,"crush_location":"{}"}],"quorum":[0]} 2026-03-10T07:46:36.758 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:36.757+0000 7f95f0ad9700 1 -- 192.168.123.108:0/4019805140 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f95d8038440 msgr2=0x7f95d803a900 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:46:36.759 
INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:36.757+0000 7f95f0ad9700 1 --2- 192.168.123.108:0/4019805140 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f95d8038440 0x7f95d803a900 secure :-1 s=READY pgs=28 cs=0 l=1 rev1=1 crypto rx=0x7f95d4006fd0 tx=0x7f95d4006e40 comp rx=0 tx=0).stop 2026-03-10T07:46:36.759 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:36.757+0000 7f95f0ad9700 1 -- 192.168.123.108:0/4019805140 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f95ec0ff2f0 msgr2=0x7f95ec19c170 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:46:36.759 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:36.757+0000 7f95f0ad9700 1 --2- 192.168.123.108:0/4019805140 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f95ec0ff2f0 0x7f95ec19c170 secure :-1 s=READY pgs=126 cs=0 l=1 rev1=1 crypto rx=0x7f95dc004f40 tx=0x7f95dc005e70 comp rx=0 tx=0).stop 2026-03-10T07:46:36.759 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:36.758+0000 7f95f0ad9700 1 -- 192.168.123.108:0/4019805140 shutdown_connections 2026-03-10T07:46:36.759 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:36.758+0000 7f95f0ad9700 1 --2- 192.168.123.108:0/4019805140 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f95d8038440 0x7f95d803a900 unknown :-1 s=CLOSED pgs=28 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:46:36.759 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:36.758+0000 7f95f0ad9700 1 --2- 192.168.123.108:0/4019805140 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f95ec0ff2f0 0x7f95ec19c170 unknown :-1 s=CLOSED pgs=126 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:46:36.759 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:36.758+0000 7f95f0ad9700 1 -- 192.168.123.108:0/4019805140 >> 192.168.123.108:0/4019805140 conn(0x7f95ec0faf30 msgr2=0x7f95ec0fbba0 unknown :-1 s=STATE_NONE 
l=0).mark_down 2026-03-10T07:46:36.759 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:36.758+0000 7f95f0ad9700 1 -- 192.168.123.108:0/4019805140 shutdown_connections 2026-03-10T07:46:36.759 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:36.758+0000 7f95f0ad9700 1 -- 192.168.123.108:0/4019805140 wait complete. 2026-03-10T07:46:36.760 INFO:teuthology.orchestra.run.vm08.stderr:dumped monmap epoch 1 2026-03-10T07:46:37.408 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:46:37 vm05 ceph-mon[50387]: from='client.? 192.168.123.108:0/4019805140' entity='client.admin' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch 2026-03-10T07:46:37.822 INFO:tasks.cephadm:Waiting for 2 mons in monmap... 2026-03-10T07:46:37.822 DEBUG:teuthology.orchestra.run.vm08:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 12e9780e-1c55-11f1-8896-79f7c2e9b508 -- ceph mon dump -f json 2026-03-10T07:46:37.959 INFO:teuthology.orchestra.run.vm08.stderr:Inferring config /etc/ceph/ceph.conf 2026-03-10T07:46:37.994 INFO:teuthology.orchestra.run.vm08.stderr:Inferring config /etc/ceph/ceph.conf 2026-03-10T07:46:38.224 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:38.223+0000 7f38d2bc7700 1 -- 192.168.123.108:0/1216189351 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f38cc074dc0 msgr2=0x7f38cc073220 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:46:38.225 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:38.223+0000 7f38d2bc7700 1 --2- 192.168.123.108:0/1216189351 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f38cc074dc0 0x7f38cc073220 secure :-1 s=READY pgs=127 cs=0 l=1 rev1=1 crypto rx=0x7f38bc009b00 tx=0x7f38bc009e10 comp rx=0 tx=0).stop 2026-03-10T07:46:38.225 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:38.223+0000 7f38d2bc7700 1 -- 192.168.123.108:0/1216189351 
shutdown_connections 2026-03-10T07:46:38.225 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:38.223+0000 7f38d2bc7700 1 --2- 192.168.123.108:0/1216189351 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f38cc074dc0 0x7f38cc073220 unknown :-1 s=CLOSED pgs=127 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:46:38.225 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:38.223+0000 7f38d2bc7700 1 -- 192.168.123.108:0/1216189351 >> 192.168.123.108:0/1216189351 conn(0x7f38cc0fbe60 msgr2=0x7f38cc0fe2c0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:46:38.225 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:38.224+0000 7f38d2bc7700 1 -- 192.168.123.108:0/1216189351 shutdown_connections 2026-03-10T07:46:38.225 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:38.224+0000 7f38d2bc7700 1 -- 192.168.123.108:0/1216189351 wait complete. 2026-03-10T07:46:38.225 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:38.224+0000 7f38d2bc7700 1 Processor -- start 2026-03-10T07:46:38.225 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:38.224+0000 7f38d2bc7700 1 -- start start 2026-03-10T07:46:38.226 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:38.225+0000 7f38d2bc7700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f38cc19c1f0 0x7f38cc19c610 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:46:38.226 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:38.225+0000 7f38d2bc7700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f38cc19cb50 con 0x7f38cc19c1f0 2026-03-10T07:46:38.226 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:38.225+0000 7f38d1bc5700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f38cc19c1f0 0x7f38cc19c610 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 
tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:46:38.226 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:38.225+0000 7f38d1bc5700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f38cc19c1f0 0x7f38cc19c610 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.108:33638/0 (socket says 192.168.123.108:33638) 2026-03-10T07:46:38.226 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:38.225+0000 7f38d1bc5700 1 -- 192.168.123.108:0/4083037967 learned_addr learned my addr 192.168.123.108:0/4083037967 (peer_addr_for_me v2:192.168.123.108:0/0) 2026-03-10T07:46:38.226 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:38.225+0000 7f38d1bc5700 1 -- 192.168.123.108:0/4083037967 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f38bc0097e0 con 0x7f38cc19c1f0 2026-03-10T07:46:38.227 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:38.226+0000 7f38d1bc5700 1 --2- 192.168.123.108:0/4083037967 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f38cc19c1f0 0x7f38cc19c610 secure :-1 s=READY pgs=128 cs=0 l=1 rev1=1 crypto rx=0x7f38bc009fd0 tx=0x7f38bc005e70 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:46:38.227 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:38.226+0000 7f38c2ffd700 1 -- 192.168.123.108:0/4083037967 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f38bc01c070 con 0x7f38cc19c1f0 2026-03-10T07:46:38.227 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:38.226+0000 7f38d2bc7700 1 -- 192.168.123.108:0/4083037967 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f38cc19cd50 con 0x7f38cc19c1f0 2026-03-10T07:46:38.227 
INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:38.226+0000 7f38d2bc7700 1 -- 192.168.123.108:0/4083037967 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f38cc19f9a0 con 0x7f38cc19c1f0 2026-03-10T07:46:38.228 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:38.227+0000 7f38c2ffd700 1 -- 192.168.123.108:0/4083037967 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f38bc00b810 con 0x7f38cc19c1f0 2026-03-10T07:46:38.228 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:38.227+0000 7f38c2ffd700 1 -- 192.168.123.108:0/4083037967 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f38bc021e50 con 0x7f38cc19c1f0 2026-03-10T07:46:38.228 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:38.227+0000 7f38c2ffd700 1 -- 192.168.123.108:0/4083037967 <== mon.0 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 13) v1 ==== 45337+0+0 (secure 0 0 0) 0x7f38bc00f460 con 0x7f38cc19c1f0 2026-03-10T07:46:38.229 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:38.227+0000 7f38d2bc7700 1 -- 192.168.123.108:0/4083037967 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f38b0005320 con 0x7f38cc19c1f0 2026-03-10T07:46:38.229 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:38.227+0000 7f38c2ffd700 1 --2- 192.168.123.108:0/4083037967 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f38b80383f0 0x7f38b803a8b0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:46:38.229 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:38.227+0000 7f38c2ffd700 1 -- 192.168.123.108:0/4083037967 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(4..4 src has 1..4) v4 ==== 1069+0+0 (secure 0 0 0) 0x7f38bc04d300 con 0x7f38cc19c1f0 2026-03-10T07:46:38.231 
INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:38.230+0000 7f38d13c4700 1 --2- 192.168.123.108:0/4083037967 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f38b80383f0 0x7f38b803a8b0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:46:38.231 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:38.230+0000 7f38c2ffd700 1 -- 192.168.123.108:0/4083037967 <== mon.0 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7f38bc017440 con 0x7f38cc19c1f0 2026-03-10T07:46:38.232 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:38.230+0000 7f38d13c4700 1 --2- 192.168.123.108:0/4083037967 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f38b80383f0 0x7f38b803a8b0 secure :-1 s=READY pgs=29 cs=0 l=1 rev1=1 crypto rx=0x7f38c8006fd0 tx=0x7f38c8006e40 comp rx=0 tx=0).ready entity=mgr.14164 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:46:38.374 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:38.372+0000 7f38d2bc7700 1 -- 192.168.123.108:0/4083037967 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "mon dump", "format": "json"} v 0) v1 -- 0x7f38b0005190 con 0x7f38cc19c1f0 2026-03-10T07:46:38.375 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:38.373+0000 7f38c2ffd700 1 -- 192.168.123.108:0/4083037967 <== mon.0 v2:192.168.123.105:3300/0 7 ==== mon_command_ack([{"prefix": "mon dump", "format": "json"}]=0 dumped monmap epoch 1 v1) v1 ==== 95+0+751 (secure 0 0 0) 0x7f38bc026030 con 0x7f38cc19c1f0 2026-03-10T07:46:38.375 INFO:teuthology.orchestra.run.vm08.stdout: 2026-03-10T07:46:38.375 
INFO:teuthology.orchestra.run.vm08.stdout:{"epoch":1,"fsid":"12e9780e-1c55-11f1-8896-79f7c2e9b508","modified":"2026-03-10T07:45:39.881211Z","created":"2026-03-10T07:45:39.881211Z","min_mon_release":18,"min_mon_release_name":"reef","election_strategy":1,"disallowed_leaders: ":"","stretch_mode":false,"tiebreaker_mon":"","removed_ranks: ":"","features":{"persistent":["kraken","luminous","mimic","osdmap-prune","nautilus","octopus","pacific","elector-pinging","quincy","reef"],"optional":[]},"mons":[{"rank":0,"name":"vm05","public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.105:3300","nonce":0},{"type":"v1","addr":"192.168.123.105:6789","nonce":0}]},"addr":"192.168.123.105:6789/0","public_addr":"192.168.123.105:6789/0","priority":0,"weight":0,"crush_location":"{}"}],"quorum":[0]} 2026-03-10T07:46:38.378 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:38.376+0000 7f38d2bc7700 1 -- 192.168.123.108:0/4083037967 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f38b80383f0 msgr2=0x7f38b803a8b0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:46:38.378 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:38.376+0000 7f38d2bc7700 1 --2- 192.168.123.108:0/4083037967 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f38b80383f0 0x7f38b803a8b0 secure :-1 s=READY pgs=29 cs=0 l=1 rev1=1 crypto rx=0x7f38c8006fd0 tx=0x7f38c8006e40 comp rx=0 tx=0).stop 2026-03-10T07:46:38.378 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:38.377+0000 7f38d2bc7700 1 -- 192.168.123.108:0/4083037967 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f38cc19c1f0 msgr2=0x7f38cc19c610 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:46:38.378 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:38.377+0000 7f38d2bc7700 1 --2- 192.168.123.108:0/4083037967 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f38cc19c1f0 0x7f38cc19c610 secure :-1 s=READY pgs=128 cs=0 
l=1 rev1=1 crypto rx=0x7f38bc009fd0 tx=0x7f38bc005e70 comp rx=0 tx=0).stop 2026-03-10T07:46:38.378 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:38.377+0000 7f38d2bc7700 1 -- 192.168.123.108:0/4083037967 shutdown_connections 2026-03-10T07:46:38.378 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:38.377+0000 7f38d2bc7700 1 --2- 192.168.123.108:0/4083037967 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f38b80383f0 0x7f38b803a8b0 unknown :-1 s=CLOSED pgs=29 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:46:38.378 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:38.377+0000 7f38d2bc7700 1 --2- 192.168.123.108:0/4083037967 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f38cc19c1f0 0x7f38cc19c610 unknown :-1 s=CLOSED pgs=128 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:46:38.379 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:38.377+0000 7f38d2bc7700 1 -- 192.168.123.108:0/4083037967 >> 192.168.123.108:0/4083037967 conn(0x7f38cc0fbe60 msgr2=0x7f38cc0fcb40 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:46:38.379 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:38.378+0000 7f38d2bc7700 1 -- 192.168.123.108:0/4083037967 shutdown_connections 2026-03-10T07:46:38.379 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:38.378+0000 7f38d2bc7700 1 -- 192.168.123.108:0/4083037967 wait complete. 
2026-03-10T07:46:38.379 INFO:teuthology.orchestra.run.vm08.stderr:dumped monmap epoch 1 2026-03-10T07:46:39.158 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:46:38 vm05 ceph-mon[50387]: pgmap v8: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail 2026-03-10T07:46:39.158 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:46:38 vm05 ceph-mon[50387]: from='mgr.14164 192.168.123.105:0/2' entity='mgr.vm05.blexke' 2026-03-10T07:46:39.158 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:46:38 vm05 ceph-mon[50387]: from='mgr.14164 192.168.123.105:0/2' entity='mgr.vm05.blexke' 2026-03-10T07:46:39.158 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:46:38 vm05 ceph-mon[50387]: from='mgr.14164 192.168.123.105:0/2' entity='mgr.vm05.blexke' 2026-03-10T07:46:39.158 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:46:38 vm05 ceph-mon[50387]: from='mgr.14164 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "mgr module enable", "module": "prometheus"}]: dispatch 2026-03-10T07:46:39.158 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:46:38 vm05 ceph-mon[50387]: from='client.? 192.168.123.108:0/4083037967' entity='client.admin' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch 2026-03-10T07:46:39.428 INFO:tasks.cephadm:Waiting for 2 mons in monmap... 
2026-03-10T07:46:39.429 DEBUG:teuthology.orchestra.run.vm08:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 12e9780e-1c55-11f1-8896-79f7c2e9b508 -- ceph mon dump -f json 2026-03-10T07:46:39.579 INFO:teuthology.orchestra.run.vm08.stderr:Inferring config /etc/ceph/ceph.conf 2026-03-10T07:46:39.615 INFO:teuthology.orchestra.run.vm08.stderr:Inferring config /etc/ceph/ceph.conf 2026-03-10T07:46:39.846 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:39.844+0000 7faa19f04700 1 -- 192.168.123.108:0/4250930357 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7faa14102ca0 msgr2=0x7faa141030c0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:46:39.846 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:39.844+0000 7faa19f04700 1 --2- 192.168.123.108:0/4250930357 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7faa14102ca0 0x7faa141030c0 secure :-1 s=READY pgs=131 cs=0 l=1 rev1=1 crypto rx=0x7fa9fc009b00 tx=0x7fa9fc009e10 comp rx=0 tx=0).stop 2026-03-10T07:46:39.846 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:39.845+0000 7faa19f04700 1 -- 192.168.123.108:0/4250930357 shutdown_connections 2026-03-10T07:46:39.846 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:39.845+0000 7faa19f04700 1 --2- 192.168.123.108:0/4250930357 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7faa14102ca0 0x7faa141030c0 unknown :-1 s=CLOSED pgs=131 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:46:39.846 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:39.845+0000 7faa19f04700 1 -- 192.168.123.108:0/4250930357 >> 192.168.123.108:0/4250930357 conn(0x7faa140fe220 msgr2=0x7faa14100680 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:46:39.847 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:39.845+0000 7faa19f04700 1 -- 192.168.123.108:0/4250930357 
shutdown_connections 2026-03-10T07:46:39.847 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:39.845+0000 7faa19f04700 1 -- 192.168.123.108:0/4250930357 wait complete. 2026-03-10T07:46:39.847 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:39.846+0000 7faa19f04700 1 Processor -- start 2026-03-10T07:46:39.847 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:39.846+0000 7faa19f04700 1 -- start start 2026-03-10T07:46:39.847 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:39.846+0000 7faa19f04700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7faa14102ca0 0x7faa14197d60 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:46:39.847 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:39.846+0000 7faa19f04700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7faa141982a0 con 0x7faa14102ca0 2026-03-10T07:46:39.848 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:39.847+0000 7faa137fe700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7faa14102ca0 0x7faa14197d60 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:46:39.848 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:39.847+0000 7faa137fe700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7faa14102ca0 0x7faa14197d60 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.108:33656/0 (socket says 192.168.123.108:33656) 2026-03-10T07:46:39.848 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:39.847+0000 7faa137fe700 1 -- 192.168.123.108:0/1493688247 learned_addr learned my addr 192.168.123.108:0/1493688247 (peer_addr_for_me v2:192.168.123.108:0/0) 2026-03-10T07:46:39.848 
INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:39.847+0000 7faa137fe700 1 -- 192.168.123.108:0/1493688247 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fa9fc0097e0 con 0x7faa14102ca0 2026-03-10T07:46:39.848 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:39.847+0000 7faa137fe700 1 --2- 192.168.123.108:0/1493688247 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7faa14102ca0 0x7faa14197d60 secure :-1 s=READY pgs=132 cs=0 l=1 rev1=1 crypto rx=0x7fa9fc004750 tx=0x7fa9fc005dc0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:46:39.848 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:39.847+0000 7faa10ff9700 1 -- 192.168.123.108:0/1493688247 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fa9fc01c070 con 0x7faa14102ca0 2026-03-10T07:46:39.849 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:39.848+0000 7faa19f04700 1 -- 192.168.123.108:0/1493688247 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7faa141984a0 con 0x7faa14102ca0 2026-03-10T07:46:39.849 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:39.848+0000 7faa19f04700 1 -- 192.168.123.108:0/1493688247 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7faa14198940 con 0x7faa14102ca0 2026-03-10T07:46:39.850 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:39.848+0000 7faa10ff9700 1 -- 192.168.123.108:0/1493688247 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fa9fc021470 con 0x7faa14102ca0 2026-03-10T07:46:39.850 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:39.848+0000 7faa10ff9700 1 -- 192.168.123.108:0/1493688247 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fa9fc00f460 con 
0x7faa14102ca0 2026-03-10T07:46:39.850 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:39.849+0000 7faa10ff9700 1 -- 192.168.123.108:0/1493688247 <== mon.0 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 14) v1 ==== 45351+0+0 (secure 0 0 0) 0x7fa9fc00f600 con 0x7faa14102ca0 2026-03-10T07:46:39.850 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:39.849+0000 7faa10ff9700 1 --2- 192.168.123.108:0/1493688247 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7faa00038440 0x7faa0003a900 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:46:39.850 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:39.849+0000 7faa10ff9700 1 -- 192.168.123.108:0/1493688247 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(4..4 src has 1..4) v4 ==== 1069+0+0 (secure 0 0 0) 0x7fa9fc04d4c0 con 0x7faa14102ca0 2026-03-10T07:46:39.850 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:39.849+0000 7faa12ffd700 1 -- 192.168.123.108:0/1493688247 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7faa00038440 msgr2=0x7faa0003a900 unknown :-1 s=STATE_CONNECTING_RE l=1).process reconnect failed to v2:192.168.123.105:6800/2 2026-03-10T07:46:39.850 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:39.849+0000 7faa12ffd700 1 --2- 192.168.123.108:0/1493688247 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7faa00038440 0x7faa0003a900 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._fault waiting 0.200000 2026-03-10T07:46:39.850 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:39.849+0000 7faa19f04700 1 -- 192.168.123.108:0/1493688247 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7faa14191a30 con 0x7faa14102ca0 2026-03-10T07:46:39.853 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:39.852+0000 7faa10ff9700 1 -- 192.168.123.108:0/1493688247 <== 
mon.0 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7fa9fc026070 con 0x7faa14102ca0 2026-03-10T07:46:40.007 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:40.005+0000 7faa19f04700 1 -- 192.168.123.108:0/1493688247 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "mon dump", "format": "json"} v 0) v1 -- 0x7faa1402d0c0 con 0x7faa14102ca0 2026-03-10T07:46:40.007 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:40.006+0000 7faa10ff9700 1 -- 192.168.123.108:0/1493688247 <== mon.0 v2:192.168.123.105:3300/0 7 ==== mon_command_ack([{"prefix": "mon dump", "format": "json"}]=0 dumped monmap epoch 1 v1) v1 ==== 95+0+751 (secure 0 0 0) 0x7fa9fc029950 con 0x7faa14102ca0 2026-03-10T07:46:40.008 INFO:teuthology.orchestra.run.vm08.stdout: 2026-03-10T07:46:40.008 INFO:teuthology.orchestra.run.vm08.stdout:{"epoch":1,"fsid":"12e9780e-1c55-11f1-8896-79f7c2e9b508","modified":"2026-03-10T07:45:39.881211Z","created":"2026-03-10T07:45:39.881211Z","min_mon_release":18,"min_mon_release_name":"reef","election_strategy":1,"disallowed_leaders: ":"","stretch_mode":false,"tiebreaker_mon":"","removed_ranks: ":"","features":{"persistent":["kraken","luminous","mimic","osdmap-prune","nautilus","octopus","pacific","elector-pinging","quincy","reef"],"optional":[]},"mons":[{"rank":0,"name":"vm05","public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.105:3300","nonce":0},{"type":"v1","addr":"192.168.123.105:6789","nonce":0}]},"addr":"192.168.123.105:6789/0","public_addr":"192.168.123.105:6789/0","priority":0,"weight":0,"crush_location":"{}"}],"quorum":[0]} 2026-03-10T07:46:40.011 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:40.009+0000 7faa19f04700 1 -- 192.168.123.108:0/1493688247 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7faa00038440 msgr2=0x7faa0003a900 unknown :-1 s=STATE_CONNECTING l=1).mark_down 
2026-03-10T07:46:40.011 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:40.009+0000 7faa19f04700 1 --2- 192.168.123.108:0/1493688247 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7faa00038440 0x7faa0003a900 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:46:40.011 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:40.010+0000 7faa19f04700 1 -- 192.168.123.108:0/1493688247 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7faa14102ca0 msgr2=0x7faa14197d60 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:46:40.011 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:40.010+0000 7faa19f04700 1 --2- 192.168.123.108:0/1493688247 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7faa14102ca0 0x7faa14197d60 secure :-1 s=READY pgs=132 cs=0 l=1 rev1=1 crypto rx=0x7fa9fc004750 tx=0x7fa9fc005dc0 comp rx=0 tx=0).stop 2026-03-10T07:46:40.011 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:40.010+0000 7faa19f04700 1 -- 192.168.123.108:0/1493688247 shutdown_connections 2026-03-10T07:46:40.011 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:40.010+0000 7faa19f04700 1 --2- 192.168.123.108:0/1493688247 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7faa00038440 0x7faa0003a900 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:46:40.011 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:40.010+0000 7faa19f04700 1 --2- 192.168.123.108:0/1493688247 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7faa14102ca0 0x7faa14197d60 unknown :-1 s=CLOSED pgs=132 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:46:40.011 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:40.010+0000 7faa19f04700 1 -- 192.168.123.108:0/1493688247 >> 192.168.123.108:0/1493688247 conn(0x7faa140fe220 msgr2=0x7faa140fef00 unknown :-1 
s=STATE_NONE l=0).mark_down 2026-03-10T07:46:40.012 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:40.011+0000 7faa19f04700 1 -- 192.168.123.108:0/1493688247 shutdown_connections 2026-03-10T07:46:40.012 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:40.011+0000 7faa19f04700 1 -- 192.168.123.108:0/1493688247 wait complete. 2026-03-10T07:46:40.013 INFO:teuthology.orchestra.run.vm08.stderr:dumped monmap epoch 1 2026-03-10T07:46:40.031 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:46:39 vm05 ceph-mon[50387]: from='mgr.14164 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd='[{"prefix": "mgr module enable", "module": "prometheus"}]': finished 2026-03-10T07:46:40.031 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:46:39 vm05 ceph-mon[50387]: mgrmap e14: vm05.blexke(active, since 29s) 2026-03-10T07:46:41.036 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:46:40 vm05 ceph-mon[50387]: from='client.? 192.168.123.108:0/1493688247' entity='client.admin' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch 2026-03-10T07:46:41.064 INFO:tasks.cephadm:Waiting for 2 mons in monmap... 
2026-03-10T07:46:41.064 DEBUG:teuthology.orchestra.run.vm08:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 12e9780e-1c55-11f1-8896-79f7c2e9b508 -- ceph mon dump -f json 2026-03-10T07:46:41.197 INFO:teuthology.orchestra.run.vm08.stderr:Inferring config /etc/ceph/ceph.conf 2026-03-10T07:46:41.236 INFO:teuthology.orchestra.run.vm08.stderr:Inferring config /etc/ceph/ceph.conf 2026-03-10T07:46:41.496 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:41.494+0000 7fb3277d4700 1 -- 192.168.123.108:0/520052143 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb320101580 msgr2=0x7fb320103970 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:46:41.496 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:41.494+0000 7fb3277d4700 1 --2- 192.168.123.108:0/520052143 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb320101580 0x7fb320103970 secure :-1 s=READY pgs=133 cs=0 l=1 rev1=1 crypto rx=0x7fb310009b00 tx=0x7fb310009e10 comp rx=0 tx=0).stop 2026-03-10T07:46:41.497 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:41.495+0000 7fb3277d4700 1 -- 192.168.123.108:0/520052143 shutdown_connections 2026-03-10T07:46:41.497 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:41.495+0000 7fb3277d4700 1 --2- 192.168.123.108:0/520052143 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb320101580 0x7fb320103970 unknown :-1 s=CLOSED pgs=133 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:46:41.497 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:41.495+0000 7fb3277d4700 1 -- 192.168.123.108:0/520052143 >> 192.168.123.108:0/520052143 conn(0x7fb3200faed0 msgr2=0x7fb3200fd310 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:46:41.497 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:41.495+0000 7fb3277d4700 1 -- 192.168.123.108:0/520052143 
shutdown_connections 2026-03-10T07:46:41.497 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:41.495+0000 7fb3277d4700 1 -- 192.168.123.108:0/520052143 wait complete. 2026-03-10T07:46:41.497 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:41.496+0000 7fb3277d4700 1 Processor -- start 2026-03-10T07:46:41.497 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:41.496+0000 7fb3277d4700 1 -- start start 2026-03-10T07:46:41.497 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:41.496+0000 7fb3277d4700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb320101580 0x7fb320195ae0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:46:41.497 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:41.496+0000 7fb3277d4700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fb320196020 con 0x7fb320101580 2026-03-10T07:46:41.498 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:41.497+0000 7fb325570700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb320101580 0x7fb320195ae0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:46:41.498 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:41.497+0000 7fb325570700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb320101580 0x7fb320195ae0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.108:33672/0 (socket says 192.168.123.108:33672) 2026-03-10T07:46:41.498 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:41.497+0000 7fb325570700 1 -- 192.168.123.108:0/3346900188 learned_addr learned my addr 192.168.123.108:0/3346900188 (peer_addr_for_me v2:192.168.123.108:0/0) 2026-03-10T07:46:41.498 
INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:41.497+0000 7fb325570700 1 -- 192.168.123.108:0/3346900188 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fb3100097e0 con 0x7fb320101580 2026-03-10T07:46:41.498 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:41.497+0000 7fb325570700 1 --2- 192.168.123.108:0/3346900188 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb320101580 0x7fb320195ae0 secure :-1 s=READY pgs=134 cs=0 l=1 rev1=1 crypto rx=0x7fb310004f40 tx=0x7fb310005e70 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:46:41.499 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:41.498+0000 7fb3167fc700 1 -- 192.168.123.108:0/3346900188 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fb31001c070 con 0x7fb320101580 2026-03-10T07:46:41.499 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:41.498+0000 7fb3277d4700 1 -- 192.168.123.108:0/3346900188 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fb320196220 con 0x7fb320101580 2026-03-10T07:46:41.499 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:41.498+0000 7fb3277d4700 1 -- 192.168.123.108:0/3346900188 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fb3201966c0 con 0x7fb320101580 2026-03-10T07:46:41.500 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:41.498+0000 7fb3167fc700 1 -- 192.168.123.108:0/3346900188 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fb3100053b0 con 0x7fb320101580 2026-03-10T07:46:41.500 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:41.498+0000 7fb3167fc700 1 -- 192.168.123.108:0/3346900188 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fb31000f460 con 
0x7fb320101580 2026-03-10T07:46:41.500 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:41.499+0000 7fb3277d4700 1 -- 192.168.123.108:0/3346900188 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fb32018f7d0 con 0x7fb320101580 2026-03-10T07:46:41.500 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:41.499+0000 7fb3167fc700 1 -- 192.168.123.108:0/3346900188 <== mon.0 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 14) v1 ==== 45351+0+0 (secure 0 0 0) 0x7fb310005520 con 0x7fb320101580 2026-03-10T07:46:41.500 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:41.499+0000 7fb3167fc700 1 --2- 192.168.123.108:0/3346900188 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fb30c038440 0x7fb30c03a900 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:46:41.500 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:41.499+0000 7fb324d6f700 1 -- 192.168.123.108:0/3346900188 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fb30c038440 msgr2=0x7fb30c03a900 unknown :-1 s=STATE_CONNECTING_RE l=1).process reconnect failed to v2:192.168.123.105:6800/2 2026-03-10T07:46:41.500 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:41.499+0000 7fb324d6f700 1 --2- 192.168.123.108:0/3346900188 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fb30c038440 0x7fb30c03a900 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._fault waiting 0.200000 2026-03-10T07:46:41.500 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:41.499+0000 7fb3167fc700 1 -- 192.168.123.108:0/3346900188 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(4..4 src has 1..4) v4 ==== 1069+0+0 (secure 0 0 0) 0x7fb3100203d0 con 0x7fb320101580 2026-03-10T07:46:41.503 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:41.502+0000 7fb3167fc700 1 -- 192.168.123.108:0/3346900188 <== 
mon.0 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7fb31000f760 con 0x7fb320101580 2026-03-10T07:46:41.648 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:41.646+0000 7fb3277d4700 1 -- 192.168.123.108:0/3346900188 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "mon dump", "format": "json"} v 0) v1 -- 0x7fb32002cc70 con 0x7fb320101580 2026-03-10T07:46:41.649 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:41.647+0000 7fb3167fc700 1 -- 192.168.123.108:0/3346900188 <== mon.0 v2:192.168.123.105:3300/0 7 ==== mon_command_ack([{"prefix": "mon dump", "format": "json"}]=0 dumped monmap epoch 1 v1) v1 ==== 95+0+751 (secure 0 0 0) 0x7fb31000f760 con 0x7fb320101580 2026-03-10T07:46:41.649 INFO:teuthology.orchestra.run.vm08.stdout: 2026-03-10T07:46:41.649 INFO:teuthology.orchestra.run.vm08.stdout:{"epoch":1,"fsid":"12e9780e-1c55-11f1-8896-79f7c2e9b508","modified":"2026-03-10T07:45:39.881211Z","created":"2026-03-10T07:45:39.881211Z","min_mon_release":18,"min_mon_release_name":"reef","election_strategy":1,"disallowed_leaders: ":"","stretch_mode":false,"tiebreaker_mon":"","removed_ranks: ":"","features":{"persistent":["kraken","luminous","mimic","osdmap-prune","nautilus","octopus","pacific","elector-pinging","quincy","reef"],"optional":[]},"mons":[{"rank":0,"name":"vm05","public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.105:3300","nonce":0},{"type":"v1","addr":"192.168.123.105:6789","nonce":0}]},"addr":"192.168.123.105:6789/0","public_addr":"192.168.123.105:6789/0","priority":0,"weight":0,"crush_location":"{}"}],"quorum":[0]} 2026-03-10T07:46:41.651 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:41.650+0000 7fb3277d4700 1 -- 192.168.123.108:0/3346900188 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fb30c038440 msgr2=0x7fb30c03a900 unknown :-1 s=STATE_CONNECTING l=1).mark_down 
2026-03-10T07:46:41.651 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:41.650+0000 7fb3277d4700 1 --2- 192.168.123.108:0/3346900188 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fb30c038440 0x7fb30c03a900 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:46:41.651 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:41.650+0000 7fb3277d4700 1 -- 192.168.123.108:0/3346900188 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb320101580 msgr2=0x7fb320195ae0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:46:41.651 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:41.650+0000 7fb3277d4700 1 --2- 192.168.123.108:0/3346900188 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb320101580 0x7fb320195ae0 secure :-1 s=READY pgs=134 cs=0 l=1 rev1=1 crypto rx=0x7fb310004f40 tx=0x7fb310005e70 comp rx=0 tx=0).stop 2026-03-10T07:46:41.651 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:41.650+0000 7fb3277d4700 1 -- 192.168.123.108:0/3346900188 shutdown_connections 2026-03-10T07:46:41.651 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:41.650+0000 7fb3277d4700 1 --2- 192.168.123.108:0/3346900188 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fb30c038440 0x7fb30c03a900 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:46:41.651 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:41.650+0000 7fb3277d4700 1 --2- 192.168.123.108:0/3346900188 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb320101580 0x7fb320195ae0 unknown :-1 s=CLOSED pgs=134 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:46:41.651 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:41.650+0000 7fb3277d4700 1 -- 192.168.123.108:0/3346900188 >> 192.168.123.108:0/3346900188 conn(0x7fb3200faed0 msgr2=0x7fb3200fbb90 unknown :-1 
s=STATE_NONE l=0).mark_down 2026-03-10T07:46:41.652 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:41.650+0000 7fb3277d4700 1 -- 192.168.123.108:0/3346900188 shutdown_connections 2026-03-10T07:46:41.652 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:41.650+0000 7fb3277d4700 1 -- 192.168.123.108:0/3346900188 wait complete. 2026-03-10T07:46:41.652 INFO:teuthology.orchestra.run.vm08.stderr:dumped monmap epoch 1 2026-03-10T07:46:41.865 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:46:41 vm05 ceph-mon[50387]: from='client.? 192.168.123.108:0/3346900188' entity='client.admin' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch 2026-03-10T07:46:42.693 INFO:tasks.cephadm:Waiting for 2 mons in monmap... 2026-03-10T07:46:42.694 DEBUG:teuthology.orchestra.run.vm08:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 12e9780e-1c55-11f1-8896-79f7c2e9b508 -- ceph mon dump -f json 2026-03-10T07:46:42.840 INFO:teuthology.orchestra.run.vm08.stderr:Inferring config /etc/ceph/ceph.conf 2026-03-10T07:46:42.877 INFO:teuthology.orchestra.run.vm08.stderr:Inferring config /etc/ceph/ceph.conf 2026-03-10T07:46:43.158 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:43.157+0000 7f2430b0b700 1 -- 192.168.123.108:0/2434420067 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f242c0ff980 msgr2=0x7f242c0ffda0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:46:43.158 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:43.157+0000 7f2430b0b700 1 --2- 192.168.123.108:0/2434420067 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f242c0ff980 0x7f242c0ffda0 secure :-1 s=READY pgs=135 cs=0 l=1 rev1=1 crypto rx=0x7f2414009b00 tx=0x7f2414009e10 comp rx=0 tx=0).stop 2026-03-10T07:46:43.159 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:43.157+0000 7f2430b0b700 1 -- 192.168.123.108:0/2434420067 
shutdown_connections 2026-03-10T07:46:43.159 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:43.157+0000 7f2430b0b700 1 --2- 192.168.123.108:0/2434420067 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f242c0ff980 0x7f242c0ffda0 unknown :-1 s=CLOSED pgs=135 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:46:43.159 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:43.157+0000 7f2430b0b700 1 -- 192.168.123.108:0/2434420067 >> 192.168.123.108:0/2434420067 conn(0x7f242c0faf20 msgr2=0x7f242c0fd360 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:46:43.159 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:43.158+0000 7f2430b0b700 1 -- 192.168.123.108:0/2434420067 shutdown_connections 2026-03-10T07:46:43.159 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:43.158+0000 7f2430b0b700 1 -- 192.168.123.108:0/2434420067 wait complete. 2026-03-10T07:46:43.159 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:43.158+0000 7f2430b0b700 1 Processor -- start 2026-03-10T07:46:43.159 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:43.158+0000 7f2430b0b700 1 -- start start 2026-03-10T07:46:43.159 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:43.158+0000 7f2430b0b700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f242c0ff980 0x7f242c197dc0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:46:43.159 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:43.158+0000 7f2430b0b700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f242c198300 con 0x7f242c0ff980 2026-03-10T07:46:43.160 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:43.159+0000 7f242a59c700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f242c0ff980 0x7f242c197dc0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 
tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:46:43.160 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:43.159+0000 7f242a59c700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f242c0ff980 0x7f242c197dc0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.108:33690/0 (socket says 192.168.123.108:33690) 2026-03-10T07:46:43.160 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:43.159+0000 7f242a59c700 1 -- 192.168.123.108:0/3901491198 learned_addr learned my addr 192.168.123.108:0/3901491198 (peer_addr_for_me v2:192.168.123.108:0/0) 2026-03-10T07:46:43.160 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:43.159+0000 7f242a59c700 1 -- 192.168.123.108:0/3901491198 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f24140097e0 con 0x7f242c0ff980 2026-03-10T07:46:43.160 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:43.159+0000 7f242a59c700 1 --2- 192.168.123.108:0/3901491198 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f242c0ff980 0x7f242c197dc0 secure :-1 s=READY pgs=136 cs=0 l=1 rev1=1 crypto rx=0x7f2414004750 tx=0x7f2414005dc0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:46:43.160 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:43.159+0000 7f24237fe700 1 -- 192.168.123.108:0/3901491198 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f241401c070 con 0x7f242c0ff980 2026-03-10T07:46:43.160 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:43.159+0000 7f2430b0b700 1 -- 192.168.123.108:0/3901491198 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f242c198500 con 0x7f242c0ff980 2026-03-10T07:46:43.160 
INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:43.159+0000 7f2430b0b700 1 -- 192.168.123.108:0/3901491198 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f242c1989a0 con 0x7f242c0ff980 2026-03-10T07:46:43.161 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:43.160+0000 7f24237fe700 1 -- 192.168.123.108:0/3901491198 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f2414021470 con 0x7f242c0ff980 2026-03-10T07:46:43.161 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:43.160+0000 7f24237fe700 1 -- 192.168.123.108:0/3901491198 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f241400f460 con 0x7f242c0ff980 2026-03-10T07:46:43.161 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:43.160+0000 7f24237fe700 1 -- 192.168.123.108:0/3901491198 <== mon.0 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 14) v1 ==== 45351+0+0 (secure 0 0 0) 0x7f241400f6d0 con 0x7f242c0ff980 2026-03-10T07:46:43.162 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:43.160+0000 7f2430b0b700 1 -- 192.168.123.108:0/3901491198 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f242c191b20 con 0x7f242c0ff980 2026-03-10T07:46:43.162 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:43.161+0000 7f24237fe700 1 --2- 192.168.123.108:0/3901491198 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f2418038490 0x7f241803a950 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:46:43.162 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:43.161+0000 7f24237fe700 1 -- 192.168.123.108:0/3901491198 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(4..4 src has 1..4) v4 ==== 1069+0+0 (secure 0 0 0) 0x7f241404c470 con 0x7f242c0ff980 2026-03-10T07:46:43.162 
INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:43.161+0000 7f2429d9b700 1 -- 192.168.123.108:0/3901491198 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f2418038490 msgr2=0x7f241803a950 unknown :-1 s=STATE_CONNECTING_RE l=1).process reconnect failed to v2:192.168.123.105:6800/2 2026-03-10T07:46:43.162 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:43.161+0000 7f2429d9b700 1 --2- 192.168.123.108:0/3901491198 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f2418038490 0x7f241803a950 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._fault waiting 0.200000 2026-03-10T07:46:43.164 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:43.163+0000 7f24237fe700 1 -- 192.168.123.108:0/3901491198 <== mon.0 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7f2414017490 con 0x7f242c0ff980 2026-03-10T07:46:43.271 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:43.270+0000 7f24237fe700 1 -- 192.168.123.108:0/3901491198 <== mon.0 v2:192.168.123.105:3300/0 7 ==== mgrmap(e 15) v1 ==== 45072+0+0 (secure 0 0 0) 0x7f2414021a10 con 0x7f242c0ff980 2026-03-10T07:46:43.271 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:43.270+0000 7f24237fe700 1 -- 192.168.123.108:0/3901491198 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f2418038490 msgr2=0x7f241803a950 unknown :-1 s=STATE_CONNECTING l=1).mark_down 2026-03-10T07:46:43.271 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:43.270+0000 7f24237fe700 1 --2- 192.168.123.108:0/3901491198 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f2418038490 0x7f241803a950 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:46:43.311 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:43.309+0000 7f2430b0b700 1 -- 192.168.123.108:0/3901491198 --> 
[v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "mon dump", "format": "json"} v 0) v1 -- 0x7f242c0623c0 con 0x7f242c0ff980 2026-03-10T07:46:43.311 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:43.310+0000 7f24237fe700 1 -- 192.168.123.108:0/3901491198 <== mon.0 v2:192.168.123.105:3300/0 8 ==== mon_command_ack([{"prefix": "mon dump", "format": "json"}]=0 dumped monmap epoch 1 v1) v1 ==== 95+0+751 (secure 0 0 0) 0x7f2414026070 con 0x7f242c0ff980 2026-03-10T07:46:43.311 INFO:teuthology.orchestra.run.vm08.stdout: 2026-03-10T07:46:43.311 INFO:teuthology.orchestra.run.vm08.stdout:{"epoch":1,"fsid":"12e9780e-1c55-11f1-8896-79f7c2e9b508","modified":"2026-03-10T07:45:39.881211Z","created":"2026-03-10T07:45:39.881211Z","min_mon_release":18,"min_mon_release_name":"reef","election_strategy":1,"disallowed_leaders: ":"","stretch_mode":false,"tiebreaker_mon":"","removed_ranks: ":"","features":{"persistent":["kraken","luminous","mimic","osdmap-prune","nautilus","octopus","pacific","elector-pinging","quincy","reef"],"optional":[]},"mons":[{"rank":0,"name":"vm05","public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.105:3300","nonce":0},{"type":"v1","addr":"192.168.123.105:6789","nonce":0}]},"addr":"192.168.123.105:6789/0","public_addr":"192.168.123.105:6789/0","priority":0,"weight":0,"crush_location":"{}"}],"quorum":[0]} 2026-03-10T07:46:43.313 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:43.312+0000 7f2430b0b700 1 -- 192.168.123.108:0/3901491198 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f242c0ff980 msgr2=0x7f242c197dc0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:46:43.313 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:43.312+0000 7f2430b0b700 1 --2- 192.168.123.108:0/3901491198 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f242c0ff980 0x7f242c197dc0 secure :-1 s=READY pgs=136 cs=0 l=1 rev1=1 crypto rx=0x7f2414004750 tx=0x7f2414005dc0 
comp rx=0 tx=0).stop 2026-03-10T07:46:43.313 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:43.312+0000 7f2430b0b700 1 -- 192.168.123.108:0/3901491198 shutdown_connections 2026-03-10T07:46:43.313 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:43.312+0000 7f2430b0b700 1 --2- 192.168.123.108:0/3901491198 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f2418038490 0x7f241803a950 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:46:43.313 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:43.312+0000 7f2430b0b700 1 --2- 192.168.123.108:0/3901491198 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f242c0ff980 0x7f242c197dc0 unknown :-1 s=CLOSED pgs=136 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:46:43.313 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:43.312+0000 7f2430b0b700 1 -- 192.168.123.108:0/3901491198 >> 192.168.123.108:0/3901491198 conn(0x7f242c0faf20 msgr2=0x7f242c0692d0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:46:43.314 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:43.312+0000 7f2430b0b700 1 -- 192.168.123.108:0/3901491198 shutdown_connections 2026-03-10T07:46:43.314 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:43.313+0000 7f2430b0b700 1 -- 192.168.123.108:0/3901491198 wait complete. 
2026-03-10T07:46:43.314 INFO:teuthology.orchestra.run.vm08.stderr:dumped monmap epoch 1 2026-03-10T07:46:43.657 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:46:43 vm05 ceph-mon[50387]: Active manager daemon vm05.blexke restarted 2026-03-10T07:46:43.658 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:46:43 vm05 ceph-mon[50387]: Activating manager daemon vm05.blexke 2026-03-10T07:46:43.658 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:46:43 vm05 ceph-mon[50387]: osdmap e5: 0 total, 0 up, 0 in 2026-03-10T07:46:43.658 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:46:43 vm05 ceph-mon[50387]: mgrmap e15: vm05.blexke(active, starting, since 0.00471434s) 2026-03-10T07:46:43.658 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:46:43 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "mon metadata", "id": "vm05"}]: dispatch 2026-03-10T07:46:43.658 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:46:43 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "mgr metadata", "who": "vm05.blexke", "id": "vm05.blexke"}]: dispatch 2026-03-10T07:46:43.658 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:46:43 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "mds metadata"}]: dispatch 2026-03-10T07:46:43.658 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:46:43 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "osd metadata"}]: dispatch 2026-03-10T07:46:43.658 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:46:43 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "mon metadata"}]: dispatch 2026-03-10T07:46:43.658 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:46:43 vm05 ceph-mon[50387]: Manager daemon vm05.blexke is now available 2026-03-10T07:46:43.658 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 
10 07:46:43 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' 2026-03-10T07:46:43.658 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:46:43 vm05 ceph-mon[50387]: from='client.? 192.168.123.108:0/3901491198' entity='client.admin' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch 2026-03-10T07:46:43.658 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:46:43 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T07:46:43.658 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:46:43 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/vm05.blexke/mirror_snapshot_schedule"}]: dispatch 2026-03-10T07:46:43.658 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:46:43 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T07:46:43.658 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:46:43 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T07:46:44.364 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:46:44 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/vm05.blexke/trash_purge_schedule"}]: dispatch 2026-03-10T07:46:44.364 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:46:44 vm05 ceph-mon[50387]: [10/Mar/2026:07:46:43] ENGINE Bus STARTING 2026-03-10T07:46:44.364 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:46:44 vm05 ceph-mon[50387]: [10/Mar/2026:07:46:43] ENGINE Serving on http://192.168.123.105:8765 2026-03-10T07:46:44.364 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:46:44 vm05 ceph-mon[50387]: 
[10/Mar/2026:07:46:43] ENGINE Serving on https://192.168.123.105:7150 2026-03-10T07:46:44.364 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:46:44 vm05 ceph-mon[50387]: [10/Mar/2026:07:46:43] ENGINE Bus STARTED 2026-03-10T07:46:44.364 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:46:44 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' 2026-03-10T07:46:44.364 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:46:44 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' 2026-03-10T07:46:44.364 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:46:44 vm05 ceph-mon[50387]: mgrmap e16: vm05.blexke(active, since 1.01088s) 2026-03-10T07:46:44.371 INFO:tasks.cephadm:Waiting for 2 mons in monmap... 2026-03-10T07:46:44.371 DEBUG:teuthology.orchestra.run.vm08:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 12e9780e-1c55-11f1-8896-79f7c2e9b508 -- ceph mon dump -f json 2026-03-10T07:46:44.521 INFO:teuthology.orchestra.run.vm08.stderr:Inferring config /etc/ceph/ceph.conf 2026-03-10T07:46:44.567 INFO:teuthology.orchestra.run.vm08.stderr:Inferring config /etc/ceph/ceph.conf 2026-03-10T07:46:44.855 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:44.853+0000 7fb30f1fc700 1 -- 192.168.123.108:0/1737281619 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb3081083e0 msgr2=0x7fb308108800 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:46:44.855 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:44.853+0000 7fb30f1fc700 1 --2- 192.168.123.108:0/1737281619 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb3081083e0 0x7fb308108800 secure :-1 s=READY pgs=144 cs=0 l=1 rev1=1 crypto rx=0x7fb304007780 tx=0x7fb30400c050 comp rx=0 tx=0).stop 2026-03-10T07:46:44.855 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:44.853+0000 7fb30f1fc700 1 
-- 192.168.123.108:0/1737281619 shutdown_connections 2026-03-10T07:46:44.855 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:44.853+0000 7fb30f1fc700 1 --2- 192.168.123.108:0/1737281619 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb3081083e0 0x7fb308108800 unknown :-1 s=CLOSED pgs=144 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:46:44.855 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:44.853+0000 7fb30f1fc700 1 -- 192.168.123.108:0/1737281619 >> 192.168.123.108:0/1737281619 conn(0x7fb30806d660 msgr2=0x7fb30806fac0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:46:44.856 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:44.854+0000 7fb30f1fc700 1 -- 192.168.123.108:0/1737281619 shutdown_connections 2026-03-10T07:46:44.856 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:44.854+0000 7fb30f1fc700 1 -- 192.168.123.108:0/1737281619 wait complete. 2026-03-10T07:46:44.856 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:44.854+0000 7fb30f1fc700 1 Processor -- start 2026-03-10T07:46:44.856 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:44.855+0000 7fb30f1fc700 1 -- start start 2026-03-10T07:46:44.856 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:44.855+0000 7fb30f1fc700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb308080f00 0x7fb308081320 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:46:44.856 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:44.855+0000 7fb30f1fc700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fb304003680 con 0x7fb308080f00 2026-03-10T07:46:44.856 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:44.855+0000 7fb30e1fa700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb308080f00 0x7fb308081320 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 
tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:46:44.856 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:44.855+0000 7fb30e1fa700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb308080f00 0x7fb308081320 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.108:33714/0 (socket says 192.168.123.108:33714) 2026-03-10T07:46:44.856 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:44.855+0000 7fb30e1fa700 1 -- 192.168.123.108:0/287599883 learned_addr learned my addr 192.168.123.108:0/287599883 (peer_addr_for_me v2:192.168.123.108:0/0) 2026-03-10T07:46:44.857 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:44.855+0000 7fb30e1fa700 1 -- 192.168.123.108:0/287599883 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fb304007430 con 0x7fb308080f00 2026-03-10T07:46:44.857 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:44.856+0000 7fb30e1fa700 1 --2- 192.168.123.108:0/287599883 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb308080f00 0x7fb308081320 secure :-1 s=READY pgs=145 cs=0 l=1 rev1=1 crypto rx=0x7fb304007400 tx=0x7fb30400c7b0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:46:44.857 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:44.856+0000 7fb2ff7fe700 1 -- 192.168.123.108:0/287599883 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fb30400f040 con 0x7fb308080f00 2026-03-10T07:46:44.857 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:44.856+0000 7fb2ff7fe700 1 -- 192.168.123.108:0/287599883 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fb30400a5b0 con 0x7fb308080f00 2026-03-10T07:46:44.857 
INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:44.856+0000 7fb30f1fc700 1 -- 192.168.123.108:0/287599883 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fb308081860 con 0x7fb308080f00 2026-03-10T07:46:44.857 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:44.856+0000 7fb30f1fc700 1 -- 192.168.123.108:0/287599883 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fb3080824d0 con 0x7fb308080f00 2026-03-10T07:46:44.858 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:44.857+0000 7fb2ff7fe700 1 -- 192.168.123.108:0/287599883 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fb304008640 con 0x7fb308080f00 2026-03-10T07:46:44.858 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:44.857+0000 7fb2ff7fe700 1 -- 192.168.123.108:0/287599883 <== mon.0 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 16) v1 ==== 45199+0+0 (secure 0 0 0) 0x7fb30401a070 con 0x7fb308080f00 2026-03-10T07:46:44.858 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:44.857+0000 7fb2fd7fa700 1 -- 192.168.123.108:0/287599883 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fb30804f000 con 0x7fb308080f00 2026-03-10T07:46:44.858 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:44.857+0000 7fb2ff7fe700 1 --2- 192.168.123.108:0/287599883 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fb2f4037fa0 0x7fb2f403a460 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:46:44.858 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:44.857+0000 7fb2ff7fe700 1 -- 192.168.123.108:0/287599883 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(5..5 src has 1..5) v4 ==== 1198+0+0 (secure 0 0 0) 0x7fb30404b890 con 0x7fb308080f00 2026-03-10T07:46:44.859 
INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:44.857+0000 7fb30d9f9700 1 --2- 192.168.123.108:0/287599883 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fb2f4037fa0 0x7fb2f403a460 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:46:44.859 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:44.858+0000 7fb30d9f9700 1 --2- 192.168.123.108:0/287599883 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fb2f4037fa0 0x7fb2f403a460 secure :-1 s=READY pgs=6 cs=0 l=1 rev1=1 crypto rx=0x7fb30000ad30 tx=0x7fb3000093f0 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:46:44.861 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:44.860+0000 7fb2ff7fe700 1 -- 192.168.123.108:0/287599883 <== mon.0 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7fb30401d5b0 con 0x7fb308080f00 2026-03-10T07:46:45.029 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:45.028+0000 7fb2fd7fa700 1 -- 192.168.123.108:0/287599883 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "mon dump", "format": "json"} v 0) v1 -- 0x7fb3080623c0 con 0x7fb308080f00 2026-03-10T07:46:45.029 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:45.028+0000 7fb2ff7fe700 1 -- 192.168.123.108:0/287599883 <== mon.0 v2:192.168.123.105:3300/0 7 ==== mon_command_ack([{"prefix": "mon dump", "format": "json"}]=0 dumped monmap epoch 1 v1) v1 ==== 95+0+751 (secure 0 0 0) 0x7fb304018090 con 0x7fb308080f00 2026-03-10T07:46:45.029 INFO:teuthology.orchestra.run.vm08.stdout: 2026-03-10T07:46:45.029 
INFO:teuthology.orchestra.run.vm08.stdout:{"epoch":1,"fsid":"12e9780e-1c55-11f1-8896-79f7c2e9b508","modified":"2026-03-10T07:45:39.881211Z","created":"2026-03-10T07:45:39.881211Z","min_mon_release":18,"min_mon_release_name":"reef","election_strategy":1,"disallowed_leaders: ":"","stretch_mode":false,"tiebreaker_mon":"","removed_ranks: ":"","features":{"persistent":["kraken","luminous","mimic","osdmap-prune","nautilus","octopus","pacific","elector-pinging","quincy","reef"],"optional":[]},"mons":[{"rank":0,"name":"vm05","public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.105:3300","nonce":0},{"type":"v1","addr":"192.168.123.105:6789","nonce":0}]},"addr":"192.168.123.105:6789/0","public_addr":"192.168.123.105:6789/0","priority":0,"weight":0,"crush_location":"{}"}],"quorum":[0]} 2026-03-10T07:46:45.033 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:45.032+0000 7fb2fd7fa700 1 -- 192.168.123.108:0/287599883 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fb2f4037fa0 msgr2=0x7fb2f403a460 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:46:45.033 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:45.032+0000 7fb2fd7fa700 1 --2- 192.168.123.108:0/287599883 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fb2f4037fa0 0x7fb2f403a460 secure :-1 s=READY pgs=6 cs=0 l=1 rev1=1 crypto rx=0x7fb30000ad30 tx=0x7fb3000093f0 comp rx=0 tx=0).stop 2026-03-10T07:46:45.033 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:45.032+0000 7fb2fd7fa700 1 -- 192.168.123.108:0/287599883 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb308080f00 msgr2=0x7fb308081320 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:46:45.033 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:45.032+0000 7fb2fd7fa700 1 --2- 192.168.123.108:0/287599883 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb308080f00 0x7fb308081320 secure :-1 s=READY pgs=145 cs=0 l=1 
rev1=1 crypto rx=0x7fb304007400 tx=0x7fb30400c7b0 comp rx=0 tx=0).stop 2026-03-10T07:46:45.034 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:45.033+0000 7fb2fd7fa700 1 -- 192.168.123.108:0/287599883 shutdown_connections 2026-03-10T07:46:45.034 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:45.033+0000 7fb2fd7fa700 1 --2- 192.168.123.108:0/287599883 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fb2f4037fa0 0x7fb2f403a460 unknown :-1 s=CLOSED pgs=6 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:46:45.034 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:45.033+0000 7fb2fd7fa700 1 --2- 192.168.123.108:0/287599883 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb308080f00 0x7fb308081320 secure :-1 s=CLOSED pgs=145 cs=0 l=1 rev1=1 crypto rx=0x7fb304007400 tx=0x7fb30400c7b0 comp rx=0 tx=0).stop 2026-03-10T07:46:45.034 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:45.033+0000 7fb2fd7fa700 1 -- 192.168.123.108:0/287599883 >> 192.168.123.108:0/287599883 conn(0x7fb30806d660 msgr2=0x7fb30806e340 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:46:45.034 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:45.033+0000 7fb2fd7fa700 1 -- 192.168.123.108:0/287599883 shutdown_connections 2026-03-10T07:46:45.034 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:45.033+0000 7fb2fd7fa700 1 -- 192.168.123.108:0/287599883 wait complete. 
2026-03-10T07:46:45.040 INFO:teuthology.orchestra.run.vm08.stderr:dumped monmap epoch 1 2026-03-10T07:46:45.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:46:45 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' 2026-03-10T07:46:45.908 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:46:45 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' 2026-03-10T07:46:45.908 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:46:45 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' 2026-03-10T07:46:45.908 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:46:45 vm05 ceph-mon[50387]: from='client.? 192.168.123.108:0/287599883' entity='client.admin' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch 2026-03-10T07:46:45.908 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:46:45 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' 2026-03-10T07:46:45.908 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:46:45 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' 2026-03-10T07:46:45.908 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:46:45 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "config rm", "who": "osd/host:vm05", "name": "osd_memory_target"}]: dispatch 2026-03-10T07:46:46.132 INFO:tasks.cephadm:Waiting for 2 mons in monmap... 
2026-03-10T07:46:46.132 DEBUG:teuthology.orchestra.run.vm08:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 12e9780e-1c55-11f1-8896-79f7c2e9b508 -- ceph mon dump -f json 2026-03-10T07:46:46.291 INFO:teuthology.orchestra.run.vm08.stderr:Inferring config /etc/ceph/ceph.conf 2026-03-10T07:46:46.337 INFO:teuthology.orchestra.run.vm08.stderr:Inferring config /etc/ceph/ceph.conf 2026-03-10T07:46:46.620 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:46.618+0000 7f1873321700 1 -- 192.168.123.108:0/4120434899 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f186c106ec0 msgr2=0x7f186c1072e0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:46:46.620 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:46.618+0000 7f1873321700 1 --2- 192.168.123.108:0/4120434899 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f186c106ec0 0x7f186c1072e0 secure :-1 s=READY pgs=146 cs=0 l=1 rev1=1 crypto rx=0x7f1868009b10 tx=0x7f1868009e20 comp rx=0 tx=0).stop 2026-03-10T07:46:46.621 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:46.619+0000 7f1873321700 1 -- 192.168.123.108:0/4120434899 shutdown_connections 2026-03-10T07:46:46.621 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:46.619+0000 7f1873321700 1 --2- 192.168.123.108:0/4120434899 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f186c106ec0 0x7f186c1072e0 unknown :-1 s=CLOSED pgs=146 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:46:46.621 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:46.619+0000 7f1873321700 1 -- 192.168.123.108:0/4120434899 >> 192.168.123.108:0/4120434899 conn(0x7f186c075f00 msgr2=0x7f186c078360 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:46:46.622 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:46.620+0000 7f1873321700 1 -- 192.168.123.108:0/4120434899 
shutdown_connections 2026-03-10T07:46:46.623 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:46.622+0000 7f1873321700 1 -- 192.168.123.108:0/4120434899 wait complete. 2026-03-10T07:46:46.624 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:46.622+0000 7f1873321700 1 Processor -- start 2026-03-10T07:46:46.625 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:46.624+0000 7f1873321700 1 -- start start 2026-03-10T07:46:46.625 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:46.624+0000 7f1873321700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f186c106ec0 0x7f186c19c050 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:46:46.626 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:46.624+0000 7f1873321700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f186c19c590 con 0x7f186c106ec0 2026-03-10T07:46:46.626 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:46.624+0000 7f187231f700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f186c106ec0 0x7f186c19c050 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:46:46.626 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:46.625+0000 7f187231f700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f186c106ec0 0x7f186c19c050 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.108:49520/0 (socket says 192.168.123.108:49520) 2026-03-10T07:46:46.626 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:46.625+0000 7f187231f700 1 -- 192.168.123.108:0/2380430570 learned_addr learned my addr 192.168.123.108:0/2380430570 (peer_addr_for_me v2:192.168.123.108:0/0) 2026-03-10T07:46:46.627 
INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:46.625+0000 7f187231f700 1 -- 192.168.123.108:0/2380430570 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f1868009770 con 0x7f186c106ec0 2026-03-10T07:46:46.628 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:46.627+0000 7f187231f700 1 --2- 192.168.123.108:0/2380430570 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f186c106ec0 0x7f186c19c050 secure :-1 s=READY pgs=147 cs=0 l=1 rev1=1 crypto rx=0x7f186800b5c0 tx=0x7f186800fa00 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:46:46.629 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:46.627+0000 7f18637fe700 1 -- 192.168.123.108:0/2380430570 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f186801c070 con 0x7f186c106ec0 2026-03-10T07:46:46.629 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:46.627+0000 7f1873321700 1 -- 192.168.123.108:0/2380430570 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f186c19c790 con 0x7f186c106ec0 2026-03-10T07:46:46.629 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:46.627+0000 7f1873321700 1 -- 192.168.123.108:0/2380430570 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f186c19cc30 con 0x7f186c106ec0 2026-03-10T07:46:46.629 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:46.628+0000 7f1873321700 1 -- 192.168.123.108:0/2380430570 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f186c195bb0 con 0x7f186c106ec0 2026-03-10T07:46:46.629 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:46.628+0000 7f18637fe700 1 -- 192.168.123.108:0/2380430570 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 
0) 0x7f186800fd30 con 0x7f186c106ec0 2026-03-10T07:46:46.630 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:46.628+0000 7f18637fe700 1 -- 192.168.123.108:0/2380430570 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f1868017720 con 0x7f186c106ec0 2026-03-10T07:46:46.633 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:46.631+0000 7f18637fe700 1 -- 192.168.123.108:0/2380430570 <== mon.0 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 17) v1 ==== 45397+0+0 (secure 0 0 0) 0x7f1868021c70 con 0x7f186c106ec0 2026-03-10T07:46:46.633 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:46.631+0000 7f18637fe700 1 --2- 192.168.123.108:0/2380430570 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f18580385c0 0x7f185803aa80 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:46:46.633 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:46.631+0000 7f18637fe700 1 -- 192.168.123.108:0/2380430570 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(5..5 src has 1..5) v4 ==== 1198+0+0 (secure 0 0 0) 0x7f186804cd00 con 0x7f186c106ec0 2026-03-10T07:46:46.633 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:46.632+0000 7f1871b1e700 1 --2- 192.168.123.108:0/2380430570 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f18580385c0 0x7f185803aa80 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:46:46.633 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:46.632+0000 7f18637fe700 1 -- 192.168.123.108:0/2380430570 <== mon.0 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7f1868079160 con 0x7f186c106ec0 2026-03-10T07:46:46.637 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:46.634+0000 7f1871b1e700 1 --2- 192.168.123.108:0/2380430570 
>> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f18580385c0 0x7f185803aa80 secure :-1 s=READY pgs=7 cs=0 l=1 rev1=1 crypto rx=0x7f185c006fd0 tx=0x7f185c006e40 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:46:46.797 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:46.795+0000 7f1873321700 1 -- 192.168.123.108:0/2380430570 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "mon dump", "format": "json"} v 0) v1 -- 0x7f186c0623c0 con 0x7f186c106ec0 2026-03-10T07:46:46.797 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:46.796+0000 7f18637fe700 1 -- 192.168.123.108:0/2380430570 <== mon.0 v2:192.168.123.105:3300/0 7 ==== mon_command_ack([{"prefix": "mon dump", "format": "json"}]=0 dumped monmap epoch 1 v1) v1 ==== 95+0+751 (secure 0 0 0) 0x7f1868026070 con 0x7f186c106ec0 2026-03-10T07:46:46.797 INFO:teuthology.orchestra.run.vm08.stdout: 2026-03-10T07:46:46.797 INFO:teuthology.orchestra.run.vm08.stdout:{"epoch":1,"fsid":"12e9780e-1c55-11f1-8896-79f7c2e9b508","modified":"2026-03-10T07:45:39.881211Z","created":"2026-03-10T07:45:39.881211Z","min_mon_release":18,"min_mon_release_name":"reef","election_strategy":1,"disallowed_leaders: ":"","stretch_mode":false,"tiebreaker_mon":"","removed_ranks: ":"","features":{"persistent":["kraken","luminous","mimic","osdmap-prune","nautilus","octopus","pacific","elector-pinging","quincy","reef"],"optional":[]},"mons":[{"rank":0,"name":"vm05","public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.105:3300","nonce":0},{"type":"v1","addr":"192.168.123.105:6789","nonce":0}]},"addr":"192.168.123.105:6789/0","public_addr":"192.168.123.105:6789/0","priority":0,"weight":0,"crush_location":"{}"}],"quorum":[0]} 2026-03-10T07:46:46.800 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:46.799+0000 7f1873321700 1 -- 192.168.123.108:0/2380430570 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] 
conn(0x7f18580385c0 msgr2=0x7f185803aa80 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:46:46.800 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:46.799+0000 7f1873321700 1 --2- 192.168.123.108:0/2380430570 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f18580385c0 0x7f185803aa80 secure :-1 s=READY pgs=7 cs=0 l=1 rev1=1 crypto rx=0x7f185c006fd0 tx=0x7f185c006e40 comp rx=0 tx=0).stop 2026-03-10T07:46:46.800 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:46.799+0000 7f1873321700 1 -- 192.168.123.108:0/2380430570 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f186c106ec0 msgr2=0x7f186c19c050 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:46:46.800 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:46.799+0000 7f1873321700 1 --2- 192.168.123.108:0/2380430570 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f186c106ec0 0x7f186c19c050 secure :-1 s=READY pgs=147 cs=0 l=1 rev1=1 crypto rx=0x7f186800b5c0 tx=0x7f186800fa00 comp rx=0 tx=0).stop 2026-03-10T07:46:46.801 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:46.800+0000 7f1873321700 1 -- 192.168.123.108:0/2380430570 shutdown_connections 2026-03-10T07:46:46.801 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:46.800+0000 7f1873321700 1 --2- 192.168.123.108:0/2380430570 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f18580385c0 0x7f185803aa80 unknown :-1 s=CLOSED pgs=7 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:46:46.801 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:46.800+0000 7f1873321700 1 --2- 192.168.123.108:0/2380430570 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f186c106ec0 0x7f186c19c050 unknown :-1 s=CLOSED pgs=147 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:46:46.801 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:46.800+0000 7f1873321700 1 -- 
192.168.123.108:0/2380430570 >> 192.168.123.108:0/2380430570 conn(0x7f186c075f00 msgr2=0x7f186c076be0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:46:46.801 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:46.800+0000 7f1873321700 1 -- 192.168.123.108:0/2380430570 shutdown_connections 2026-03-10T07:46:46.801 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:46.800+0000 7f1873321700 1 -- 192.168.123.108:0/2380430570 wait complete. 2026-03-10T07:46:46.801 INFO:teuthology.orchestra.run.vm08.stderr:dumped monmap epoch 1 2026-03-10T07:46:47.408 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:46:46 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' 2026-03-10T07:46:47.408 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:46:46 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' 2026-03-10T07:46:47.408 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:46:46 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' 2026-03-10T07:46:47.408 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:46:46 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' 2026-03-10T07:46:47.408 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:46:46 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "config rm", "who": "osd/host:vm08", "name": "osd_memory_target"}]: dispatch 2026-03-10T07:46:47.408 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:46:46 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T07:46:47.408 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:46:46 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T07:46:47.408 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 
10 07:46:46 vm05 ceph-mon[50387]: Updating vm05:/etc/ceph/ceph.conf 2026-03-10T07:46:47.408 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:46:46 vm05 ceph-mon[50387]: Updating vm08:/etc/ceph/ceph.conf 2026-03-10T07:46:47.408 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:46:46 vm05 ceph-mon[50387]: Updating vm08:/var/lib/ceph/12e9780e-1c55-11f1-8896-79f7c2e9b508/config/ceph.conf 2026-03-10T07:46:47.408 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:46:46 vm05 ceph-mon[50387]: Updating vm05:/var/lib/ceph/12e9780e-1c55-11f1-8896-79f7c2e9b508/config/ceph.conf 2026-03-10T07:46:47.408 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:46:46 vm05 ceph-mon[50387]: mgrmap e17: vm05.blexke(active, since 2s) 2026-03-10T07:46:47.408 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:46:46 vm05 ceph-mon[50387]: Updating vm08:/etc/ceph/ceph.client.admin.keyring 2026-03-10T07:46:47.408 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:46:46 vm05 ceph-mon[50387]: Updating vm05:/etc/ceph/ceph.client.admin.keyring 2026-03-10T07:46:47.408 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:46:46 vm05 ceph-mon[50387]: Updating vm05:/var/lib/ceph/12e9780e-1c55-11f1-8896-79f7c2e9b508/config/ceph.client.admin.keyring 2026-03-10T07:46:47.408 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:46:46 vm05 ceph-mon[50387]: from='client.? 192.168.123.108:0/2380430570' entity='client.admin' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch 2026-03-10T07:46:47.860 INFO:tasks.cephadm:Waiting for 2 mons in monmap... 
2026-03-10T07:46:47.860 DEBUG:teuthology.orchestra.run.vm08:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 12e9780e-1c55-11f1-8896-79f7c2e9b508 -- ceph mon dump -f json 2026-03-10T07:46:48.138 INFO:teuthology.orchestra.run.vm08.stderr:Inferring config /var/lib/ceph/12e9780e-1c55-11f1-8896-79f7c2e9b508/config/ceph.conf 2026-03-10T07:46:48.408 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:46:48 vm05 ceph-mon[50387]: Updating vm08:/var/lib/ceph/12e9780e-1c55-11f1-8896-79f7c2e9b508/config/ceph.client.admin.keyring 2026-03-10T07:46:48.408 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:46:48 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' 2026-03-10T07:46:48.408 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:46:48 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' 2026-03-10T07:46:48.408 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:46:48 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' 2026-03-10T07:46:48.408 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:46:48 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' 2026-03-10T07:46:48.408 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:46:48 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' 2026-03-10T07:46:48.408 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:46:48 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "auth get-or-create", "entity": "client.ceph-exporter.vm08", "caps": ["mon", "profile ceph-exporter", "mon", "allow r", "mgr", "allow r", "osd", "allow r"]}]: dispatch 2026-03-10T07:46:48.408 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:46:48 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd='[{"prefix": "auth 
get-or-create", "entity": "client.ceph-exporter.vm08", "caps": ["mon", "profile ceph-exporter", "mon", "allow r", "mgr", "allow r", "osd", "allow r"]}]': finished 2026-03-10T07:46:48.408 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:46:48 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T07:46:48.408 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:46:48 vm05 ceph-mon[50387]: Deploying daemon ceph-exporter.vm08 on vm08 2026-03-10T07:46:48.435 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:48.432+0000 7fb716bee700 1 -- 192.168.123.108:0/1790512111 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb71010df10 msgr2=0x7fb71010e2f0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:46:48.435 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:48.432+0000 7fb716bee700 1 --2- 192.168.123.108:0/1790512111 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb71010df10 0x7fb71010e2f0 secure :-1 s=READY pgs=149 cs=0 l=1 rev1=1 crypto rx=0x7fb700009b00 tx=0x7fb700009e10 comp rx=0 tx=0).stop 2026-03-10T07:46:48.435 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:48.433+0000 7fb716bee700 1 -- 192.168.123.108:0/1790512111 shutdown_connections 2026-03-10T07:46:48.435 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:48.433+0000 7fb716bee700 1 --2- 192.168.123.108:0/1790512111 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb71010df10 0x7fb71010e2f0 unknown :-1 s=CLOSED pgs=149 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:46:48.435 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:48.433+0000 7fb716bee700 1 -- 192.168.123.108:0/1790512111 >> 192.168.123.108:0/1790512111 conn(0x7fb71006cb50 msgr2=0x7fb71006cf60 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:46:48.436 
INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:48.433+0000 7fb716bee700 1 -- 192.168.123.108:0/1790512111 shutdown_connections 2026-03-10T07:46:48.436 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:48.433+0000 7fb716bee700 1 -- 192.168.123.108:0/1790512111 wait complete. 2026-03-10T07:46:48.436 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:48.434+0000 7fb716bee700 1 Processor -- start 2026-03-10T07:46:48.436 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:48.434+0000 7fb716bee700 1 -- start start 2026-03-10T07:46:48.436 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:48.434+0000 7fb716bee700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb7101aaa50 0x7fb7101aae30 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:46:48.436 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:48.434+0000 7fb716bee700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fb700012070 con 0x7fb7101aaa50 2026-03-10T07:46:48.437 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:48.435+0000 7fb71498a700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb7101aaa50 0x7fb7101aae30 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:46:48.437 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:48.435+0000 7fb71498a700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb7101aaa50 0x7fb7101aae30 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.108:49540/0 (socket says 192.168.123.108:49540) 2026-03-10T07:46:48.437 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:48.435+0000 7fb71498a700 1 -- 192.168.123.108:0/2579408392 learned_addr learned my addr 
192.168.123.108:0/2579408392 (peer_addr_for_me v2:192.168.123.108:0/0) 2026-03-10T07:46:48.437 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:48.435+0000 7fb71498a700 1 -- 192.168.123.108:0/2579408392 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fb7000097e0 con 0x7fb7101aaa50 2026-03-10T07:46:48.437 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:48.435+0000 7fb71498a700 1 --2- 192.168.123.108:0/2579408392 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb7101aaa50 0x7fb7101aae30 secure :-1 s=READY pgs=150 cs=0 l=1 rev1=1 crypto rx=0x7fb700006010 tx=0x7fb70000bba0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:46:48.437 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:48.436+0000 7fb70dffb700 1 -- 192.168.123.108:0/2579408392 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fb70001c070 con 0x7fb7101aaa50 2026-03-10T07:46:48.438 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:48.436+0000 7fb716bee700 1 -- 192.168.123.108:0/2579408392 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fb7101ab3d0 con 0x7fb7101aaa50 2026-03-10T07:46:48.438 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:48.436+0000 7fb716bee700 1 -- 192.168.123.108:0/2579408392 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fb7101a4d20 con 0x7fb7101aaa50 2026-03-10T07:46:48.438 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:48.436+0000 7fb70dffb700 1 -- 192.168.123.108:0/2579408392 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fb700003d70 con 0x7fb7101aaa50 2026-03-10T07:46:48.439 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:48.437+0000 7fb70dffb700 1 -- 192.168.123.108:0/2579408392 <== mon.0 
v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fb700017440 con 0x7fb7101aaa50 2026-03-10T07:46:48.439 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:48.438+0000 7fb70dffb700 1 -- 192.168.123.108:0/2579408392 <== mon.0 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 17) v1 ==== 45397+0+0 (secure 0 0 0) 0x7fb700017620 con 0x7fb7101aaa50 2026-03-10T07:46:48.439 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:48.438+0000 7fb70dffb700 1 --2- 192.168.123.108:0/2579408392 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fb6f8038610 0x7fb6f803aad0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:46:48.439 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:48.438+0000 7fb70dffb700 1 -- 192.168.123.108:0/2579408392 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(5..5 src has 1..5) v4 ==== 1198+0+0 (secure 0 0 0) 0x7fb70004d290 con 0x7fb7101aaa50 2026-03-10T07:46:48.440 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:48.438+0000 7fb70ffff700 1 --2- 192.168.123.108:0/2579408392 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fb6f8038610 0x7fb6f803aad0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:46:48.440 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:48.439+0000 7fb70ffff700 1 --2- 192.168.123.108:0/2579408392 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fb6f8038610 0x7fb6f803aad0 secure :-1 s=READY pgs=8 cs=0 l=1 rev1=1 crypto rx=0x7fb704006fd0 tx=0x7fb704006e40 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:46:48.440 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:48.439+0000 7fb716bee700 1 -- 192.168.123.108:0/2579408392 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": 
"get_command_descriptions"} v 0) v1 -- 0x7fb6fc005320 con 0x7fb7101aaa50 2026-03-10T07:46:48.444 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:48.442+0000 7fb70dffb700 1 -- 192.168.123.108:0/2579408392 <== mon.0 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7fb70003d080 con 0x7fb7101aaa50 2026-03-10T07:46:48.589 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:48.586+0000 7fb716bee700 1 -- 192.168.123.108:0/2579408392 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "mon dump", "format": "json"} v 0) v1 -- 0x7fb6fc0059f0 con 0x7fb7101aaa50 2026-03-10T07:46:48.589 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:48.587+0000 7fb70dffb700 1 -- 192.168.123.108:0/2579408392 <== mon.0 v2:192.168.123.105:3300/0 7 ==== mon_command_ack([{"prefix": "mon dump", "format": "json"}]=0 dumped monmap epoch 1 v1) v1 ==== 95+0+751 (secure 0 0 0) 0x7fb700015350 con 0x7fb7101aaa50 2026-03-10T07:46:48.589 INFO:teuthology.orchestra.run.vm08.stdout: 2026-03-10T07:46:48.589 INFO:teuthology.orchestra.run.vm08.stdout:{"epoch":1,"fsid":"12e9780e-1c55-11f1-8896-79f7c2e9b508","modified":"2026-03-10T07:45:39.881211Z","created":"2026-03-10T07:45:39.881211Z","min_mon_release":18,"min_mon_release_name":"reef","election_strategy":1,"disallowed_leaders: ":"","stretch_mode":false,"tiebreaker_mon":"","removed_ranks: ":"","features":{"persistent":["kraken","luminous","mimic","osdmap-prune","nautilus","octopus","pacific","elector-pinging","quincy","reef"],"optional":[]},"mons":[{"rank":0,"name":"vm05","public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.105:3300","nonce":0},{"type":"v1","addr":"192.168.123.105:6789","nonce":0}]},"addr":"192.168.123.105:6789/0","public_addr":"192.168.123.105:6789/0","priority":0,"weight":0,"crush_location":"{}"}],"quorum":[0]} 2026-03-10T07:46:48.591 
INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:48.590+0000 7fb716bee700 1 -- 192.168.123.108:0/2579408392 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fb6f8038610 msgr2=0x7fb6f803aad0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:46:48.591 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:48.590+0000 7fb716bee700 1 --2- 192.168.123.108:0/2579408392 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fb6f8038610 0x7fb6f803aad0 secure :-1 s=READY pgs=8 cs=0 l=1 rev1=1 crypto rx=0x7fb704006fd0 tx=0x7fb704006e40 comp rx=0 tx=0).stop 2026-03-10T07:46:48.591 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:48.590+0000 7fb716bee700 1 -- 192.168.123.108:0/2579408392 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb7101aaa50 msgr2=0x7fb7101aae30 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:46:48.591 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:48.590+0000 7fb716bee700 1 --2- 192.168.123.108:0/2579408392 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb7101aaa50 0x7fb7101aae30 secure :-1 s=READY pgs=150 cs=0 l=1 rev1=1 crypto rx=0x7fb700006010 tx=0x7fb70000bba0 comp rx=0 tx=0).stop 2026-03-10T07:46:48.591 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:48.590+0000 7fb716bee700 1 -- 192.168.123.108:0/2579408392 shutdown_connections 2026-03-10T07:46:48.591 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:48.590+0000 7fb716bee700 1 --2- 192.168.123.108:0/2579408392 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fb6f8038610 0x7fb6f803aad0 unknown :-1 s=CLOSED pgs=8 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:46:48.591 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:48.590+0000 7fb716bee700 1 --2- 192.168.123.108:0/2579408392 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb7101aaa50 0x7fb7101aae30 unknown :-1 s=CLOSED pgs=150 
cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:46:48.591 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:48.590+0000 7fb716bee700 1 -- 192.168.123.108:0/2579408392 >> 192.168.123.108:0/2579408392 conn(0x7fb71006cb50 msgr2=0x7fb71010a630 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:46:48.591 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:48.590+0000 7fb716bee700 1 -- 192.168.123.108:0/2579408392 shutdown_connections 2026-03-10T07:46:48.591 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:48.590+0000 7fb716bee700 1 -- 192.168.123.108:0/2579408392 wait complete. 2026-03-10T07:46:48.593 INFO:teuthology.orchestra.run.vm08.stderr:dumped monmap epoch 1 2026-03-10T07:46:49.638 INFO:tasks.cephadm:Waiting for 2 mons in monmap... 2026-03-10T07:46:49.639 DEBUG:teuthology.orchestra.run.vm08:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 12e9780e-1c55-11f1-8896-79f7c2e9b508 -- ceph mon dump -f json 2026-03-10T07:46:49.908 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:46:49 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' 2026-03-10T07:46:49.908 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:46:49 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' 2026-03-10T07:46:49.908 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:46:49 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' 2026-03-10T07:46:49.908 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:46:49 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' 2026-03-10T07:46:49.908 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:46:49 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "auth get-or-create", "entity": "client.crash.vm08", "caps": ["mon", "profile crash", "mgr", 
"profile crash"]}]: dispatch 2026-03-10T07:46:49.908 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:46:49 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd='[{"prefix": "auth get-or-create", "entity": "client.crash.vm08", "caps": ["mon", "profile crash", "mgr", "profile crash"]}]': finished 2026-03-10T07:46:49.908 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:46:49 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T07:46:49.908 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:46:49 vm05 ceph-mon[50387]: Deploying daemon crash.vm08 on vm08 2026-03-10T07:46:49.908 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:46:49 vm05 ceph-mon[50387]: from='client.? 192.168.123.108:0/2579408392' entity='client.admin' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch 2026-03-10T07:46:49.908 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:46:49 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' 2026-03-10T07:46:49.908 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:46:49 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' 2026-03-10T07:46:49.908 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:46:49 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' 2026-03-10T07:46:49.908 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:46:49 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' 2026-03-10T07:46:49.908 INFO:teuthology.orchestra.run.vm08.stderr:Inferring config /var/lib/ceph/12e9780e-1c55-11f1-8896-79f7c2e9b508/config/ceph.conf 2026-03-10T07:46:50.168 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:50.166+0000 7fdc4c826700 1 -- 192.168.123.108:0/2306012434 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fdc44102e70 msgr2=0x7fdc44103250 secure :-1 
s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:46:50.168 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:50.166+0000 7fdc4c826700 1 --2- 192.168.123.108:0/2306012434 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fdc44102e70 0x7fdc44103250 secure :-1 s=READY pgs=151 cs=0 l=1 rev1=1 crypto rx=0x7fdc3c009b00 tx=0x7fdc3c009e10 comp rx=0 tx=0).stop 2026-03-10T07:46:50.168 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:50.167+0000 7fdc4c826700 1 -- 192.168.123.108:0/2306012434 shutdown_connections 2026-03-10T07:46:50.168 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:50.167+0000 7fdc4c826700 1 --2- 192.168.123.108:0/2306012434 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fdc44102e70 0x7fdc44103250 unknown :-1 s=CLOSED pgs=151 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:46:50.168 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:50.167+0000 7fdc4c826700 1 -- 192.168.123.108:0/2306012434 >> 192.168.123.108:0/2306012434 conn(0x7fdc440fe760 msgr2=0x7fdc44100b80 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:46:50.168 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:50.167+0000 7fdc4c826700 1 -- 192.168.123.108:0/2306012434 shutdown_connections 2026-03-10T07:46:50.168 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:50.167+0000 7fdc4c826700 1 -- 192.168.123.108:0/2306012434 wait complete. 
2026-03-10T07:46:50.169 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:50.167+0000 7fdc4c826700 1 Processor -- start 2026-03-10T07:46:50.169 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:50.168+0000 7fdc4c826700 1 -- start start 2026-03-10T07:46:50.169 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:50.168+0000 7fdc4c826700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fdc44102e70 0x7fdc441980c0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:46:50.169 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:50.168+0000 7fdc4c826700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fdc44198600 con 0x7fdc44102e70 2026-03-10T07:46:50.169 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:50.168+0000 7fdc4a5c2700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fdc44102e70 0x7fdc441980c0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:46:50.169 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:50.168+0000 7fdc4a5c2700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fdc44102e70 0x7fdc441980c0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.108:49558/0 (socket says 192.168.123.108:49558) 2026-03-10T07:46:50.169 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:50.168+0000 7fdc4a5c2700 1 -- 192.168.123.108:0/1288164391 learned_addr learned my addr 192.168.123.108:0/1288164391 (peer_addr_for_me v2:192.168.123.108:0/0) 2026-03-10T07:46:50.170 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:50.169+0000 7fdc4a5c2700 1 -- 192.168.123.108:0/1288164391 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- 
mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fdc3c0097e0 con 0x7fdc44102e70 2026-03-10T07:46:50.170 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:50.169+0000 7fdc4a5c2700 1 --2- 192.168.123.108:0/1288164391 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fdc44102e70 0x7fdc441980c0 secure :-1 s=READY pgs=152 cs=0 l=1 rev1=1 crypto rx=0x7fdc3c005b40 tx=0x7fdc3c004dc0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:46:50.171 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:50.169+0000 7fdc377fe700 1 -- 192.168.123.108:0/1288164391 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fdc3c01c070 con 0x7fdc44102e70 2026-03-10T07:46:50.171 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:50.169+0000 7fdc377fe700 1 -- 192.168.123.108:0/1288164391 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fdc3c0056f0 con 0x7fdc44102e70 2026-03-10T07:46:50.171 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:50.169+0000 7fdc377fe700 1 -- 192.168.123.108:0/1288164391 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7fdc3c00f460 con 0x7fdc44102e70 2026-03-10T07:46:50.171 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:50.169+0000 7fdc4c826700 1 -- 192.168.123.108:0/1288164391 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fdc44198800 con 0x7fdc44102e70 2026-03-10T07:46:50.171 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:50.169+0000 7fdc4c826700 1 -- 192.168.123.108:0/1288164391 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fdc44198c20 con 0x7fdc44102e70 2026-03-10T07:46:50.172 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:50.170+0000 7fdc4c826700 1 -- 192.168.123.108:0/1288164391 --> 
[v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fdc4404fa20 con 0x7fdc44102e70 2026-03-10T07:46:50.172 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:50.170+0000 7fdc377fe700 1 -- 192.168.123.108:0/1288164391 <== mon.0 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 17) v1 ==== 45397+0+0 (secure 0 0 0) 0x7fdc3c005210 con 0x7fdc44102e70 2026-03-10T07:46:50.172 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:50.170+0000 7fdc377fe700 1 --2- 192.168.123.108:0/1288164391 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fdc300384e0 0x7fdc3003a9a0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:46:50.172 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:50.170+0000 7fdc377fe700 1 -- 192.168.123.108:0/1288164391 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(5..5 src has 1..5) v4 ==== 1198+0+0 (secure 0 0 0) 0x7fdc3c04c5c0 con 0x7fdc44102e70 2026-03-10T07:46:50.172 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:50.170+0000 7fdc49dc1700 1 --2- 192.168.123.108:0/1288164391 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fdc300384e0 0x7fdc3003a9a0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:46:50.172 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:50.171+0000 7fdc49dc1700 1 --2- 192.168.123.108:0/1288164391 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fdc300384e0 0x7fdc3003a9a0 secure :-1 s=READY pgs=9 cs=0 l=1 rev1=1 crypto rx=0x7fdc38006fd0 tx=0x7fdc38006e40 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:46:50.174 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:50.173+0000 7fdc377fe700 1 -- 192.168.123.108:0/1288164391 <== mon.0 v2:192.168.123.105:3300/0 6 ==== 
mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7fdc3c02ab50 con 0x7fdc44102e70 2026-03-10T07:46:50.318 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:50.317+0000 7fdc4c826700 1 -- 192.168.123.108:0/1288164391 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "mon dump", "format": "json"} v 0) v1 -- 0x7fdc440623c0 con 0x7fdc44102e70 2026-03-10T07:46:50.319 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:50.318+0000 7fdc377fe700 1 -- 192.168.123.108:0/1288164391 <== mon.0 v2:192.168.123.105:3300/0 7 ==== mon_command_ack([{"prefix": "mon dump", "format": "json"}]=0 dumped monmap epoch 1 v1) v1 ==== 95+0+751 (secure 0 0 0) 0x7fdc3c026020 con 0x7fdc44102e70 2026-03-10T07:46:50.320 INFO:teuthology.orchestra.run.vm08.stdout: 2026-03-10T07:46:50.320 INFO:teuthology.orchestra.run.vm08.stdout:{"epoch":1,"fsid":"12e9780e-1c55-11f1-8896-79f7c2e9b508","modified":"2026-03-10T07:45:39.881211Z","created":"2026-03-10T07:45:39.881211Z","min_mon_release":18,"min_mon_release_name":"reef","election_strategy":1,"disallowed_leaders: ":"","stretch_mode":false,"tiebreaker_mon":"","removed_ranks: ":"","features":{"persistent":["kraken","luminous","mimic","osdmap-prune","nautilus","octopus","pacific","elector-pinging","quincy","reef"],"optional":[]},"mons":[{"rank":0,"name":"vm05","public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.105:3300","nonce":0},{"type":"v1","addr":"192.168.123.105:6789","nonce":0}]},"addr":"192.168.123.105:6789/0","public_addr":"192.168.123.105:6789/0","priority":0,"weight":0,"crush_location":"{}"}],"quorum":[0]} 2026-03-10T07:46:50.321 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:50.320+0000 7fdc4c826700 1 -- 192.168.123.108:0/1288164391 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fdc300384e0 msgr2=0x7fdc3003a9a0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:46:50.322 
INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:50.320+0000 7fdc4c826700 1 --2- 192.168.123.108:0/1288164391 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fdc300384e0 0x7fdc3003a9a0 secure :-1 s=READY pgs=9 cs=0 l=1 rev1=1 crypto rx=0x7fdc38006fd0 tx=0x7fdc38006e40 comp rx=0 tx=0).stop 2026-03-10T07:46:50.322 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:50.321+0000 7fdc4c826700 1 -- 192.168.123.108:0/1288164391 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fdc44102e70 msgr2=0x7fdc441980c0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:46:50.322 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:50.321+0000 7fdc4c826700 1 --2- 192.168.123.108:0/1288164391 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fdc44102e70 0x7fdc441980c0 secure :-1 s=READY pgs=152 cs=0 l=1 rev1=1 crypto rx=0x7fdc3c005b40 tx=0x7fdc3c004dc0 comp rx=0 tx=0).stop 2026-03-10T07:46:50.322 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:50.321+0000 7fdc4c826700 1 -- 192.168.123.108:0/1288164391 shutdown_connections 2026-03-10T07:46:50.322 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:50.321+0000 7fdc4c826700 1 --2- 192.168.123.108:0/1288164391 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fdc300384e0 0x7fdc3003a9a0 unknown :-1 s=CLOSED pgs=9 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:46:50.322 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:50.321+0000 7fdc4c826700 1 --2- 192.168.123.108:0/1288164391 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fdc44102e70 0x7fdc441980c0 unknown :-1 s=CLOSED pgs=152 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:46:50.322 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:50.321+0000 7fdc4c826700 1 -- 192.168.123.108:0/1288164391 >> 192.168.123.108:0/1288164391 conn(0x7fdc440fe760 msgr2=0x7fdc441070e0 unknown :-1 s=STATE_NONE 
l=0).mark_down 2026-03-10T07:46:50.323 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:50.321+0000 7fdc4c826700 1 -- 192.168.123.108:0/1288164391 shutdown_connections 2026-03-10T07:46:50.323 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:50.322+0000 7fdc4c826700 1 -- 192.168.123.108:0/1288164391 wait complete. 2026-03-10T07:46:50.324 INFO:teuthology.orchestra.run.vm08.stderr:dumped monmap epoch 1 2026-03-10T07:46:50.908 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:46:50 vm05 ceph-mon[50387]: Deploying daemon node-exporter.vm08 on vm08 2026-03-10T07:46:50.908 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:46:50 vm05 ceph-mon[50387]: from='client.? 192.168.123.108:0/1288164391' entity='client.admin' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch 2026-03-10T07:46:51.386 INFO:tasks.cephadm:Waiting for 2 mons in monmap... 2026-03-10T07:46:51.386 DEBUG:teuthology.orchestra.run.vm08:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 12e9780e-1c55-11f1-8896-79f7c2e9b508 -- ceph mon dump -f json 2026-03-10T07:46:51.564 INFO:teuthology.orchestra.run.vm08.stderr:Inferring config /var/lib/ceph/12e9780e-1c55-11f1-8896-79f7c2e9b508/config/ceph.conf 2026-03-10T07:46:51.843 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:51.841+0000 7f34e19ed700 1 -- 192.168.123.108:0/275932353 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f34dc10c210 msgr2=0x7f34dc10c5f0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:46:51.843 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:51.841+0000 7f34e19ed700 1 --2- 192.168.123.108:0/275932353 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f34dc10c210 0x7f34dc10c5f0 secure :-1 s=READY pgs=153 cs=0 l=1 rev1=1 crypto rx=0x7f34cc007780 tx=0x7f34cc00c050 comp rx=0 tx=0).stop 2026-03-10T07:46:51.843 
INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:51.841+0000 7f34e19ed700 1 -- 192.168.123.108:0/275932353 shutdown_connections 2026-03-10T07:46:51.843 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:51.841+0000 7f34e19ed700 1 --2- 192.168.123.108:0/275932353 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f34dc10c210 0x7f34dc10c5f0 unknown :-1 s=CLOSED pgs=153 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:46:51.843 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:51.841+0000 7f34e19ed700 1 -- 192.168.123.108:0/275932353 >> 192.168.123.108:0/275932353 conn(0x7f34dc06c990 msgr2=0x7f34dc06cda0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:46:51.843 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:51.841+0000 7f34e19ed700 1 -- 192.168.123.108:0/275932353 shutdown_connections 2026-03-10T07:46:51.843 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:51.841+0000 7f34e19ed700 1 -- 192.168.123.108:0/275932353 wait complete. 
2026-03-10T07:46:51.843 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:51.842+0000 7f34e19ed700 1 Processor -- start 2026-03-10T07:46:51.843 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:51.842+0000 7f34e19ed700 1 -- start start 2026-03-10T07:46:51.843 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:51.842+0000 7f34e19ed700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f34dc137520 0x7f34dc134540 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:46:51.843 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:51.842+0000 7f34e19ed700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f34cc003680 con 0x7f34dc137520 2026-03-10T07:46:51.843 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:51.842+0000 7f34daffd700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f34dc137520 0x7f34dc134540 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:46:51.843 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:51.842+0000 7f34daffd700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f34dc137520 0x7f34dc134540 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.108:49566/0 (socket says 192.168.123.108:49566) 2026-03-10T07:46:51.844 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:51.842+0000 7f34daffd700 1 -- 192.168.123.108:0/2628189415 learned_addr learned my addr 192.168.123.108:0/2628189415 (peer_addr_for_me v2:192.168.123.108:0/0) 2026-03-10T07:46:51.844 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:51.842+0000 7f34daffd700 1 -- 192.168.123.108:0/2628189415 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- 
mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f34cc007430 con 0x7f34dc137520 2026-03-10T07:46:51.844 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:51.843+0000 7f34daffd700 1 --2- 192.168.123.108:0/2628189415 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f34dc137520 0x7f34dc134540 secure :-1 s=READY pgs=154 cs=0 l=1 rev1=1 crypto rx=0x7f34cc00a010 tx=0x7f34cc00c7b0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:46:51.844 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:51.843+0000 7f34e09eb700 1 -- 192.168.123.108:0/2628189415 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f34cc00f040 con 0x7f34dc137520 2026-03-10T07:46:51.844 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:51.843+0000 7f34e19ed700 1 -- 192.168.123.108:0/2628189415 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f34dc134ae0 con 0x7f34dc137520 2026-03-10T07:46:51.844 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:51.843+0000 7f34e19ed700 1 -- 192.168.123.108:0/2628189415 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f34dc134f60 con 0x7f34dc137520 2026-03-10T07:46:51.844 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:51.843+0000 7f34e09eb700 1 -- 192.168.123.108:0/2628189415 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f34cc00cb60 con 0x7f34dc137520 2026-03-10T07:46:51.845 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:51.843+0000 7f34e09eb700 1 -- 192.168.123.108:0/2628189415 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f34cc008600 con 0x7f34dc137520 2026-03-10T07:46:51.845 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:51.844+0000 7f34e09eb700 1 -- 192.168.123.108:0/2628189415 <== mon.0 
v2:192.168.123.105:3300/0 4 ==== mgrmap(e 17) v1 ==== 45397+0+0 (secure 0 0 0) 0x7f34cc01a070 con 0x7f34dc137520 2026-03-10T07:46:51.845 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:51.844+0000 7f34e09eb700 1 --2- 192.168.123.108:0/2628189415 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f34c4038570 0x7f34c403aa30 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:46:51.845 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:51.844+0000 7f34e09eb700 1 -- 192.168.123.108:0/2628189415 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(5..5 src has 1..5) v4 ==== 1198+0+0 (secure 0 0 0) 0x7f34cc04cc00 con 0x7f34dc137520 2026-03-10T07:46:51.845 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:51.844+0000 7f34da7fc700 1 --2- 192.168.123.108:0/2628189415 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f34c4038570 0x7f34c403aa30 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:46:51.846 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:51.845+0000 7f34da7fc700 1 --2- 192.168.123.108:0/2628189415 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f34c4038570 0x7f34c403aa30 secure :-1 s=READY pgs=10 cs=0 l=1 rev1=1 crypto rx=0x7f34d400ad30 tx=0x7f34d40093f0 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:46:51.846 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:51.845+0000 7f34e19ed700 1 -- 192.168.123.108:0/2628189415 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f34c8005320 con 0x7f34dc137520 2026-03-10T07:46:51.849 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:51.848+0000 7f34e09eb700 1 -- 192.168.123.108:0/2628189415 <== mon.0 v2:192.168.123.105:3300/0 6 ==== 
mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7f34cc018020 con 0x7f34dc137520 2026-03-10T07:46:51.997 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:51.994+0000 7f34e19ed700 1 -- 192.168.123.108:0/2628189415 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "mon dump", "format": "json"} v 0) v1 -- 0x7f34c8005190 con 0x7f34dc137520 2026-03-10T07:46:51.997 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:51.996+0000 7f34e09eb700 1 -- 192.168.123.108:0/2628189415 <== mon.0 v2:192.168.123.105:3300/0 7 ==== mon_command_ack([{"prefix": "mon dump", "format": "json"}]=0 dumped monmap epoch 1 v1) v1 ==== 95+0+751 (secure 0 0 0) 0x7f34cc0277d0 con 0x7f34dc137520 2026-03-10T07:46:51.997 INFO:teuthology.orchestra.run.vm08.stdout: 2026-03-10T07:46:51.997 INFO:teuthology.orchestra.run.vm08.stdout:{"epoch":1,"fsid":"12e9780e-1c55-11f1-8896-79f7c2e9b508","modified":"2026-03-10T07:45:39.881211Z","created":"2026-03-10T07:45:39.881211Z","min_mon_release":18,"min_mon_release_name":"reef","election_strategy":1,"disallowed_leaders: ":"","stretch_mode":false,"tiebreaker_mon":"","removed_ranks: ":"","features":{"persistent":["kraken","luminous","mimic","osdmap-prune","nautilus","octopus","pacific","elector-pinging","quincy","reef"],"optional":[]},"mons":[{"rank":0,"name":"vm05","public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.105:3300","nonce":0},{"type":"v1","addr":"192.168.123.105:6789","nonce":0}]},"addr":"192.168.123.105:6789/0","public_addr":"192.168.123.105:6789/0","priority":0,"weight":0,"crush_location":"{}"}],"quorum":[0]} 2026-03-10T07:46:52.000 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:51.999+0000 7f34e19ed700 1 -- 192.168.123.108:0/2628189415 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f34c4038570 msgr2=0x7f34c403aa30 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:46:52.000 
INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:51.999+0000 7f34e19ed700 1 --2- 192.168.123.108:0/2628189415 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f34c4038570 0x7f34c403aa30 secure :-1 s=READY pgs=10 cs=0 l=1 rev1=1 crypto rx=0x7f34d400ad30 tx=0x7f34d40093f0 comp rx=0 tx=0).stop 2026-03-10T07:46:52.003 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:52.002+0000 7f34e19ed700 1 -- 192.168.123.108:0/2628189415 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f34dc137520 msgr2=0x7f34dc134540 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:46:52.003 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:52.002+0000 7f34e19ed700 1 --2- 192.168.123.108:0/2628189415 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f34dc137520 0x7f34dc134540 secure :-1 s=READY pgs=154 cs=0 l=1 rev1=1 crypto rx=0x7f34cc00a010 tx=0x7f34cc00c7b0 comp rx=0 tx=0).stop 2026-03-10T07:46:52.003 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:52.002+0000 7f34e19ed700 1 -- 192.168.123.108:0/2628189415 shutdown_connections 2026-03-10T07:46:52.003 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:52.002+0000 7f34e19ed700 1 --2- 192.168.123.108:0/2628189415 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f34c4038570 0x7f34c403aa30 unknown :-1 s=CLOSED pgs=10 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:46:52.003 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:52.002+0000 7f34e19ed700 1 --2- 192.168.123.108:0/2628189415 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f34dc137520 0x7f34dc134540 unknown :-1 s=CLOSED pgs=154 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:46:52.003 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:52.002+0000 7f34e19ed700 1 -- 192.168.123.108:0/2628189415 >> 192.168.123.108:0/2628189415 conn(0x7f34dc06c990 msgr2=0x7f34dc10b720 unknown :-1 s=STATE_NONE 
l=0).mark_down 2026-03-10T07:46:52.003 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:52.002+0000 7f34e19ed700 1 -- 192.168.123.108:0/2628189415 shutdown_connections 2026-03-10T07:46:52.003 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:52.002+0000 7f34e19ed700 1 -- 192.168.123.108:0/2628189415 wait complete. 2026-03-10T07:46:52.004 INFO:teuthology.orchestra.run.vm08.stderr:dumped monmap epoch 1 2026-03-10T07:46:52.407 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:46:52 vm05 ceph-mon[50387]: from='client.? 192.168.123.108:0/2628189415' entity='client.admin' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch 2026-03-10T07:46:53.080 INFO:tasks.cephadm:Waiting for 2 mons in monmap... 2026-03-10T07:46:53.081 DEBUG:teuthology.orchestra.run.vm08:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 12e9780e-1c55-11f1-8896-79f7c2e9b508 -- ceph mon dump -f json 2026-03-10T07:46:53.408 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:46:53 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' 2026-03-10T07:46:53.408 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:46:53 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' 2026-03-10T07:46:53.408 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:46:53 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' 2026-03-10T07:46:53.408 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:46:53 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' 2026-03-10T07:46:53.408 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:46:53 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "auth get-or-create", "entity": "mgr.vm08.orfpog", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]: dispatch 2026-03-10T07:46:53.408 
INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:46:53 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd='[{"prefix": "auth get-or-create", "entity": "mgr.vm08.orfpog", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]': finished 2026-03-10T07:46:53.408 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:46:53 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "mgr services"}]: dispatch 2026-03-10T07:46:53.408 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:46:53 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T07:46:53.408 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:46:53 vm05 ceph-mon[50387]: Deploying daemon mgr.vm08.orfpog on vm08 2026-03-10T07:46:53.408 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:46:53 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' 2026-03-10T07:46:53.408 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:46:53 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' 2026-03-10T07:46:53.408 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:46:53 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' 2026-03-10T07:46:53.408 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:46:53 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' 2026-03-10T07:46:53.408 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:46:53 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "auth get", "entity": "mon."}]: dispatch 2026-03-10T07:46:53.408 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:46:53 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 
2026-03-10T07:46:53.510 INFO:teuthology.orchestra.run.vm08.stderr:Inferring config /var/lib/ceph/12e9780e-1c55-11f1-8896-79f7c2e9b508/config/ceph.conf 2026-03-10T07:46:54.062 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:54.060+0000 7f85a4587700 1 -- 192.168.123.108:0/4080173199 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f859c0818b0 msgr2=0x7f859c0796f0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:46:54.062 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:54.060+0000 7f85a4587700 1 --2- 192.168.123.108:0/4080173199 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f859c0818b0 0x7f859c0796f0 secure :-1 s=READY pgs=157 cs=0 l=1 rev1=1 crypto rx=0x7f8598009b00 tx=0x7f8598009e10 comp rx=0 tx=0).stop 2026-03-10T07:46:54.062 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:54.060+0000 7f85a4587700 1 -- 192.168.123.108:0/4080173199 shutdown_connections 2026-03-10T07:46:54.062 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:54.060+0000 7f85a4587700 1 --2- 192.168.123.108:0/4080173199 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f859c0818b0 0x7f859c0796f0 unknown :-1 s=CLOSED pgs=157 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:46:54.062 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:54.060+0000 7f85a4587700 1 -- 192.168.123.108:0/4080173199 >> 192.168.123.108:0/4080173199 conn(0x7f859c0767d0 msgr2=0x7f859c078bf0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:46:54.063 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:54.061+0000 7f85a4587700 1 -- 192.168.123.108:0/4080173199 shutdown_connections 2026-03-10T07:46:54.063 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:54.061+0000 7f85a4587700 1 -- 192.168.123.108:0/4080173199 wait complete. 
2026-03-10T07:46:54.063 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:54.061+0000 7f85a4587700 1 Processor -- start 2026-03-10T07:46:54.063 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:54.061+0000 7f85a4587700 1 -- start start 2026-03-10T07:46:54.063 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:54.062+0000 7f85a4587700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f859c06ff50 0x7f859c070330 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:46:54.063 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:54.062+0000 7f85a4587700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f8598012070 con 0x7f859c06ff50 2026-03-10T07:46:54.063 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:54.062+0000 7f85a2323700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f859c06ff50 0x7f859c070330 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:46:54.063 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:54.062+0000 7f85a2323700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f859c06ff50 0x7f859c070330 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.108:49594/0 (socket says 192.168.123.108:49594) 2026-03-10T07:46:54.063 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:54.062+0000 7f85a2323700 1 -- 192.168.123.108:0/2856745707 learned_addr learned my addr 192.168.123.108:0/2856745707 (peer_addr_for_me v2:192.168.123.108:0/0) 2026-03-10T07:46:54.063 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:54.062+0000 7f85a2323700 1 -- 192.168.123.108:0/2856745707 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- 
mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f85980097e0 con 0x7f859c06ff50 2026-03-10T07:46:54.064 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:54.062+0000 7f85a2323700 1 --2- 192.168.123.108:0/2856745707 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f859c06ff50 0x7f859c070330 secure :-1 s=READY pgs=158 cs=0 l=1 rev1=1 crypto rx=0x7f8598006010 tx=0x7f859800bba0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:46:54.064 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:54.062+0000 7f85937fe700 1 -- 192.168.123.108:0/2856745707 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f859801c070 con 0x7f859c06ff50 2026-03-10T07:46:54.065 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:54.064+0000 7f85a4587700 1 -- 192.168.123.108:0/2856745707 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f859c0708d0 con 0x7f859c06ff50 2026-03-10T07:46:54.065 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:54.064+0000 7f85a4587700 1 -- 192.168.123.108:0/2856745707 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f859c070d50 con 0x7f859c06ff50 2026-03-10T07:46:54.065 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:54.064+0000 7f85937fe700 1 -- 192.168.123.108:0/2856745707 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f8598003d70 con 0x7f859c06ff50 2026-03-10T07:46:54.065 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:54.064+0000 7f85937fe700 1 -- 192.168.123.108:0/2856745707 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 214+0+0 (secure 0 0 0) 0x7f8598017440 con 0x7f859c06ff50 2026-03-10T07:46:54.066 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:54.065+0000 7f85937fe700 1 -- 192.168.123.108:0/2856745707 <== mon.0 
v2:192.168.123.105:3300/0 4 ==== mgrmap(e 17) v1 ==== 45397+0+0 (secure 0 0 0) 0x7f85980175a0 con 0x7f859c06ff50 2026-03-10T07:46:54.066 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:54.065+0000 7f85937fe700 1 --2- 192.168.123.108:0/2856745707 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f85880385c0 0x7f858803aa80 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:46:54.066 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:54.065+0000 7f85937fe700 1 -- 192.168.123.108:0/2856745707 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(5..5 src has 1..5) v4 ==== 1198+0+0 (secure 0 0 0) 0x7f859804d230 con 0x7f859c06ff50 2026-03-10T07:46:54.066 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:54.065+0000 7f85a1b22700 1 --2- 192.168.123.108:0/2856745707 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f85880385c0 0x7f858803aa80 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:46:54.067 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:54.065+0000 7f85a4587700 1 -- 192.168.123.108:0/2856745707 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f8580005320 con 0x7f859c06ff50 2026-03-10T07:46:54.067 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:54.066+0000 7f85a1b22700 1 --2- 192.168.123.108:0/2856745707 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f85880385c0 0x7f858803aa80 secure :-1 s=READY pgs=11 cs=0 l=1 rev1=1 crypto rx=0x7f859400ad30 tx=0x7f85940093f0 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:46:54.072 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:54.071+0000 7f85937fe700 1 -- 192.168.123.108:0/2856745707 <== mon.0 v2:192.168.123.105:3300/0 6 ==== 
mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7f859802a430 con 0x7f859c06ff50 2026-03-10T07:46:54.235 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:54.233+0000 7f85a4587700 1 -- 192.168.123.108:0/2856745707 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "mon dump", "format": "json"} v 0) v1 -- 0x7f8580005190 con 0x7f859c06ff50 2026-03-10T07:46:54.235 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:54.234+0000 7f85937fe700 1 -- 192.168.123.108:0/2856745707 <== mon.0 v2:192.168.123.105:3300/0 7 ==== mon_command_ack([{"prefix": "mon dump", "format": "json"}]=0 dumped monmap epoch 1 v1) v1 ==== 95+0+751 (secure 0 0 0) 0x7f859801fb60 con 0x7f859c06ff50 2026-03-10T07:46:54.235 INFO:teuthology.orchestra.run.vm08.stdout: 2026-03-10T07:46:54.235 INFO:teuthology.orchestra.run.vm08.stdout:{"epoch":1,"fsid":"12e9780e-1c55-11f1-8896-79f7c2e9b508","modified":"2026-03-10T07:45:39.881211Z","created":"2026-03-10T07:45:39.881211Z","min_mon_release":18,"min_mon_release_name":"reef","election_strategy":1,"disallowed_leaders: ":"","stretch_mode":false,"tiebreaker_mon":"","removed_ranks: ":"","features":{"persistent":["kraken","luminous","mimic","osdmap-prune","nautilus","octopus","pacific","elector-pinging","quincy","reef"],"optional":[]},"mons":[{"rank":0,"name":"vm05","public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.105:3300","nonce":0},{"type":"v1","addr":"192.168.123.105:6789","nonce":0}]},"addr":"192.168.123.105:6789/0","public_addr":"192.168.123.105:6789/0","priority":0,"weight":0,"crush_location":"{}"}],"quorum":[0]} 2026-03-10T07:46:54.239 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:54.237+0000 7f85917fa700 1 -- 192.168.123.108:0/2856745707 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f85880385c0 msgr2=0x7f858803aa80 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:46:54.239 
INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:54.237+0000 7f85917fa700 1 --2- 192.168.123.108:0/2856745707 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f85880385c0 0x7f858803aa80 secure :-1 s=READY pgs=11 cs=0 l=1 rev1=1 crypto rx=0x7f859400ad30 tx=0x7f85940093f0 comp rx=0 tx=0).stop 2026-03-10T07:46:54.239 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:54.237+0000 7f85917fa700 1 -- 192.168.123.108:0/2856745707 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f859c06ff50 msgr2=0x7f859c070330 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:46:54.239 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:54.237+0000 7f85917fa700 1 --2- 192.168.123.108:0/2856745707 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f859c06ff50 0x7f859c070330 secure :-1 s=READY pgs=158 cs=0 l=1 rev1=1 crypto rx=0x7f8598006010 tx=0x7f859800bba0 comp rx=0 tx=0).stop 2026-03-10T07:46:54.239 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:54.238+0000 7f85917fa700 1 -- 192.168.123.108:0/2856745707 shutdown_connections 2026-03-10T07:46:54.239 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:54.238+0000 7f85917fa700 1 --2- 192.168.123.108:0/2856745707 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f85880385c0 0x7f858803aa80 unknown :-1 s=CLOSED pgs=11 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:46:54.239 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:54.238+0000 7f85917fa700 1 --2- 192.168.123.108:0/2856745707 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f859c06ff50 0x7f859c070330 unknown :-1 s=CLOSED pgs=158 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:46:54.239 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:54.238+0000 7f85917fa700 1 -- 192.168.123.108:0/2856745707 >> 192.168.123.108:0/2856745707 conn(0x7f859c0767d0 msgr2=0x7f859c077440 unknown :-1 s=STATE_NONE 
l=0).mark_down 2026-03-10T07:46:54.241 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:54.240+0000 7f85917fa700 1 -- 192.168.123.108:0/2856745707 shutdown_connections 2026-03-10T07:46:54.241 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:46:54.240+0000 7f85917fa700 1 -- 192.168.123.108:0/2856745707 wait complete. 2026-03-10T07:46:54.244 INFO:teuthology.orchestra.run.vm08.stderr:dumped monmap epoch 1 2026-03-10T07:46:54.477 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:46:54 vm08 ceph-mon[59917]: rocksdb: Compile date 2023-12-11 22:07:34 2026-03-10T07:46:54.478 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:46:54 vm08 ceph-mon[59917]: rocksdb: DB SUMMARY 2026-03-10T07:46:54.478 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:46:54 vm08 ceph-mon[59917]: rocksdb: DB Session ID: E09ZIAO6FCG2W73INDVQ 2026-03-10T07:46:54.657 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:46:54 vm05 ceph-mon[50387]: Deploying daemon mon.vm08 on vm08 2026-03-10T07:46:54.657 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:46:54 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' 2026-03-10T07:46:54.657 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:46:54 vm05 ceph-mon[50387]: from='client.? 
192.168.123.108:0/2856745707' entity='client.admin' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch 2026-03-10T07:46:54.920 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:46:54 vm08 ceph-mon[59917]: rocksdb: CURRENT file: CURRENT 2026-03-10T07:46:54.920 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:46:54 vm08 ceph-mon[59917]: rocksdb: IDENTITY file: IDENTITY 2026-03-10T07:46:54.920 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:46:54 vm08 ceph-mon[59917]: rocksdb: MANIFEST file: MANIFEST-000005 size: 59 Bytes 2026-03-10T07:46:54.920 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:46:54 vm08 ceph-mon[59917]: rocksdb: SST files in /var/lib/ceph/mon/ceph-vm08/store.db dir, Total Num: 0, files: 2026-03-10T07:46:54.920 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:46:54 vm08 ceph-mon[59917]: rocksdb: Write Ahead Log file in /var/lib/ceph/mon/ceph-vm08/store.db: 000004.log size: 511 ; 2026-03-10T07:46:54.920 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:46:54 vm08 ceph-mon[59917]: rocksdb: Options.error_if_exists: 0 2026-03-10T07:46:54.920 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:46:54 vm08 ceph-mon[59917]: rocksdb: Options.create_if_missing: 0 2026-03-10T07:46:54.920 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:46:54 vm08 ceph-mon[59917]: rocksdb: Options.paranoid_checks: 1 2026-03-10T07:46:54.920 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:46:54 vm08 ceph-mon[59917]: rocksdb: Options.flush_verify_memtable_count: 1 2026-03-10T07:46:54.920 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:46:54 vm08 ceph-mon[59917]: rocksdb: Options.track_and_verify_wals_in_manifest: 0 2026-03-10T07:46:54.920 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:46:54 vm08 ceph-mon[59917]: rocksdb: Options.verify_sst_unique_id_in_manifest: 1 2026-03-10T07:46:54.920 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:46:54 vm08 ceph-mon[59917]: rocksdb: Options.env: 0x55c4a49d3720 2026-03-10T07:46:54.920 
INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:46:54 vm08 ceph-mon[59917]: rocksdb: Options.fs: PosixFileSystem 2026-03-10T07:46:54.920 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:46:54 vm08 ceph-mon[59917]: rocksdb: Options.info_log: 0x55c4a62f3360 2026-03-10T07:46:54.920 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:46:54 vm08 ceph-mon[59917]: rocksdb: Options.max_file_opening_threads: 16 2026-03-10T07:46:54.920 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:46:54 vm08 ceph-mon[59917]: rocksdb: Options.statistics: (nil) 2026-03-10T07:46:54.920 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:46:54 vm08 ceph-mon[59917]: rocksdb: Options.use_fsync: 0 2026-03-10T07:46:54.920 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:46:54 vm08 ceph-mon[59917]: rocksdb: Options.max_log_file_size: 0 2026-03-10T07:46:54.920 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:46:54 vm08 ceph-mon[59917]: rocksdb: Options.max_manifest_file_size: 1073741824 2026-03-10T07:46:54.920 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:46:54 vm08 ceph-mon[59917]: rocksdb: Options.log_file_time_to_roll: 0 2026-03-10T07:46:54.920 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:46:54 vm08 ceph-mon[59917]: rocksdb: Options.keep_log_file_num: 1000 2026-03-10T07:46:54.920 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:46:54 vm08 ceph-mon[59917]: rocksdb: Options.recycle_log_file_num: 0 2026-03-10T07:46:54.920 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:46:54 vm08 ceph-mon[59917]: rocksdb: Options.allow_fallocate: 1 2026-03-10T07:46:54.920 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:46:54 vm08 ceph-mon[59917]: rocksdb: Options.allow_mmap_reads: 0 2026-03-10T07:46:54.920 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:46:54 vm08 ceph-mon[59917]: rocksdb: Options.allow_mmap_writes: 0 2026-03-10T07:46:54.920 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:46:54 vm08 ceph-mon[59917]: rocksdb: Options.use_direct_reads: 0 
2026-03-10T07:46:54.920 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:46:54 vm08 ceph-mon[59917]: rocksdb: Options.use_direct_io_for_flush_and_compaction: 0 2026-03-10T07:46:54.920 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:46:54 vm08 ceph-mon[59917]: rocksdb: Options.create_missing_column_families: 0 2026-03-10T07:46:54.920 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:46:54 vm08 ceph-mon[59917]: rocksdb: Options.db_log_dir: 2026-03-10T07:46:54.920 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:46:54 vm08 ceph-mon[59917]: rocksdb: Options.wal_dir: 2026-03-10T07:46:54.920 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:46:54 vm08 ceph-mon[59917]: rocksdb: Options.table_cache_numshardbits: 6 2026-03-10T07:46:54.920 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:46:54 vm08 ceph-mon[59917]: rocksdb: Options.WAL_ttl_seconds: 0 2026-03-10T07:46:54.920 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:46:54 vm08 ceph-mon[59917]: rocksdb: Options.WAL_size_limit_MB: 0 2026-03-10T07:46:54.920 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:46:54 vm08 ceph-mon[59917]: rocksdb: Options.max_write_batch_group_size_bytes: 1048576 2026-03-10T07:46:54.920 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:46:54 vm08 ceph-mon[59917]: rocksdb: Options.manifest_preallocation_size: 4194304 2026-03-10T07:46:54.920 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:46:54 vm08 ceph-mon[59917]: rocksdb: Options.is_fd_close_on_exec: 1 2026-03-10T07:46:54.920 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:46:54 vm08 ceph-mon[59917]: rocksdb: Options.advise_random_on_open: 1 2026-03-10T07:46:54.920 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:46:54 vm08 ceph-mon[59917]: rocksdb: Options.db_write_buffer_size: 0 2026-03-10T07:46:54.920 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:46:54 vm08 ceph-mon[59917]: rocksdb: Options.write_buffer_manager: 0x55c4a5564140 2026-03-10T07:46:54.920 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 
07:46:54 vm08 ceph-mon[59917]: rocksdb: Options.access_hint_on_compaction_start: 1 2026-03-10T07:46:54.920 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:46:54 vm08 ceph-mon[59917]: rocksdb: Options.random_access_max_buffer_size: 1048576 2026-03-10T07:46:54.920 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:46:54 vm08 ceph-mon[59917]: rocksdb: Options.use_adaptive_mutex: 0 2026-03-10T07:46:54.920 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:46:54 vm08 ceph-mon[59917]: rocksdb: Options.rate_limiter: (nil) 2026-03-10T07:46:54.920 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:46:54 vm08 ceph-mon[59917]: rocksdb: Options.sst_file_manager.rate_bytes_per_sec: 0 2026-03-10T07:46:54.920 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:46:54 vm08 ceph-mon[59917]: rocksdb: Options.wal_recovery_mode: 2 2026-03-10T07:46:54.920 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:46:54 vm08 ceph-mon[59917]: rocksdb: Options.enable_thread_tracking: 0 2026-03-10T07:46:54.920 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:46:54 vm08 ceph-mon[59917]: rocksdb: Options.enable_pipelined_write: 0 2026-03-10T07:46:54.920 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:46:54 vm08 ceph-mon[59917]: rocksdb: Options.unordered_write: 0 2026-03-10T07:46:54.920 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:46:54 vm08 ceph-mon[59917]: rocksdb: Options.allow_concurrent_memtable_write: 1 2026-03-10T07:46:54.920 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:46:54 vm08 ceph-mon[59917]: rocksdb: Options.enable_write_thread_adaptive_yield: 1 2026-03-10T07:46:54.920 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:46:54 vm08 ceph-mon[59917]: rocksdb: Options.write_thread_max_yield_usec: 100 2026-03-10T07:46:54.921 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:46:54 vm08 ceph-mon[59917]: rocksdb: Options.write_thread_slow_yield_usec: 3 2026-03-10T07:46:54.921 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:46:54 vm08 ceph-mon[59917]: rocksdb: 
Options.row_cache: None 2026-03-10T07:46:54.921 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:46:54 vm08 ceph-mon[59917]: rocksdb: Options.wal_filter: None 2026-03-10T07:46:54.921 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:46:54 vm08 ceph-mon[59917]: rocksdb: Options.avoid_flush_during_recovery: 0 2026-03-10T07:46:54.921 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:46:54 vm08 ceph-mon[59917]: rocksdb: Options.allow_ingest_behind: 0 2026-03-10T07:46:54.921 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:46:54 vm08 ceph-mon[59917]: rocksdb: Options.two_write_queues: 0 2026-03-10T07:46:54.921 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:46:54 vm08 ceph-mon[59917]: rocksdb: Options.manual_wal_flush: 0 2026-03-10T07:46:54.921 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:46:54 vm08 ceph-mon[59917]: rocksdb: Options.wal_compression: 0 2026-03-10T07:46:54.921 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:46:54 vm08 ceph-mon[59917]: rocksdb: Options.atomic_flush: 0 2026-03-10T07:46:54.921 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:46:54 vm08 ceph-mon[59917]: rocksdb: Options.avoid_unnecessary_blocking_io: 0 2026-03-10T07:46:54.921 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:46:54 vm08 ceph-mon[59917]: rocksdb: Options.persist_stats_to_disk: 0 2026-03-10T07:46:54.921 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:46:54 vm08 ceph-mon[59917]: rocksdb: Options.write_dbid_to_manifest: 0 2026-03-10T07:46:54.921 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:46:54 vm08 ceph-mon[59917]: rocksdb: Options.log_readahead_size: 0 2026-03-10T07:46:54.921 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:46:54 vm08 ceph-mon[59917]: rocksdb: Options.file_checksum_gen_factory: Unknown 2026-03-10T07:46:54.921 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:46:54 vm08 ceph-mon[59917]: rocksdb: Options.best_efforts_recovery: 0 2026-03-10T07:46:54.921 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:46:54 vm08 
ceph-mon[59917]: rocksdb: Options.max_bgerror_resume_count: 2147483647 2026-03-10T07:46:54.921 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:46:54 vm08 ceph-mon[59917]: rocksdb: Options.bgerror_resume_retry_interval: 1000000 2026-03-10T07:46:54.921 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:46:54 vm08 ceph-mon[59917]: rocksdb: Options.allow_data_in_errors: 0 2026-03-10T07:46:54.921 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:46:54 vm08 ceph-mon[59917]: rocksdb: Options.db_host_id: __hostname__ 2026-03-10T07:46:54.921 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:46:54 vm08 ceph-mon[59917]: rocksdb: Options.enforce_single_del_contracts: true 2026-03-10T07:46:54.921 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:46:54 vm08 ceph-mon[59917]: rocksdb: Options.max_background_jobs: 2 2026-03-10T07:46:54.921 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:46:54 vm08 ceph-mon[59917]: rocksdb: Options.max_background_compactions: -1 2026-03-10T07:46:54.921 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:46:54 vm08 ceph-mon[59917]: rocksdb: Options.max_subcompactions: 1 2026-03-10T07:46:54.921 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:46:54 vm08 ceph-mon[59917]: rocksdb: Options.avoid_flush_during_shutdown: 0 2026-03-10T07:46:54.921 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:46:54 vm08 ceph-mon[59917]: rocksdb: Options.writable_file_max_buffer_size: 1048576 2026-03-10T07:46:54.921 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:46:54 vm08 ceph-mon[59917]: rocksdb: Options.delayed_write_rate : 16777216 2026-03-10T07:46:54.921 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:46:54 vm08 ceph-mon[59917]: rocksdb: Options.max_total_wal_size: 0 2026-03-10T07:46:54.921 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:46:54 vm08 ceph-mon[59917]: rocksdb: Options.delete_obsolete_files_period_micros: 21600000000 2026-03-10T07:46:54.921 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:46:54 vm08 ceph-mon[59917]: 
rocksdb: Options.stats_dump_period_sec: 600 2026-03-10T07:46:54.921 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:46:54 vm08 ceph-mon[59917]: rocksdb: Options.stats_persist_period_sec: 600 2026-03-10T07:46:54.921 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:46:54 vm08 ceph-mon[59917]: rocksdb: Options.stats_history_buffer_size: 1048576 2026-03-10T07:46:54.921 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:46:54 vm08 ceph-mon[59917]: rocksdb: Options.max_open_files: -1 2026-03-10T07:46:54.921 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:46:54 vm08 ceph-mon[59917]: rocksdb: Options.bytes_per_sync: 0 2026-03-10T07:46:54.921 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:46:54 vm08 ceph-mon[59917]: rocksdb: Options.wal_bytes_per_sync: 0 2026-03-10T07:46:54.921 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:46:54 vm08 ceph-mon[59917]: rocksdb: Options.strict_bytes_per_sync: 0 2026-03-10T07:46:54.921 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:46:54 vm08 ceph-mon[59917]: rocksdb: Options.compaction_readahead_size: 0 2026-03-10T07:46:54.921 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:46:54 vm08 ceph-mon[59917]: rocksdb: Options.max_background_flushes: -1 2026-03-10T07:46:54.921 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:46:54 vm08 ceph-mon[59917]: rocksdb: Compression algorithms supported: 2026-03-10T07:46:54.921 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:46:54 vm08 ceph-mon[59917]: rocksdb: kZSTDNotFinalCompression supported: 0 2026-03-10T07:46:54.921 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:46:54 vm08 ceph-mon[59917]: rocksdb: kZSTD supported: 0 2026-03-10T07:46:54.921 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:46:54 vm08 ceph-mon[59917]: rocksdb: kXpressCompression supported: 0 2026-03-10T07:46:54.921 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:46:54 vm08 ceph-mon[59917]: rocksdb: kLZ4HCCompression supported: 1 2026-03-10T07:46:54.921 
INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:46:54 vm08 ceph-mon[59917]: rocksdb: kZlibCompression supported: 1 2026-03-10T07:46:54.921 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:46:54 vm08 ceph-mon[59917]: rocksdb: kSnappyCompression supported: 1 2026-03-10T07:46:54.921 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:46:54 vm08 ceph-mon[59917]: rocksdb: kLZ4Compression supported: 1 2026-03-10T07:46:54.921 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:46:54 vm08 ceph-mon[59917]: rocksdb: kBZip2Compression supported: 0 2026-03-10T07:46:54.921 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:46:54 vm08 ceph-mon[59917]: rocksdb: Fast CRC32 supported: Supported on x86 2026-03-10T07:46:54.921 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:46:54 vm08 ceph-mon[59917]: rocksdb: DMutex implementation: pthread_mutex_t 2026-03-10T07:46:54.921 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:46:54 vm08 ceph-mon[59917]: rocksdb: [db/version_set.cc:5527] Recovering from manifest file: /var/lib/ceph/mon/ceph-vm08/store.db/MANIFEST-000005 2026-03-10T07:46:54.921 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:46:54 vm08 ceph-mon[59917]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [default]: 2026-03-10T07:46:54.921 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:46:54 vm08 ceph-mon[59917]: rocksdb: Options.comparator: leveldb.BytewiseComparator 2026-03-10T07:46:54.921 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:46:54 vm08 ceph-mon[59917]: rocksdb: Options.merge_operator: 2026-03-10T07:46:54.921 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:46:54 vm08 ceph-mon[59917]: rocksdb: Options.compaction_filter: None 2026-03-10T07:46:54.921 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:46:54 vm08 ceph-mon[59917]: rocksdb: Options.compaction_filter_factory: None 2026-03-10T07:46:54.921 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:46:54 vm08 ceph-mon[59917]: rocksdb: 
Options.sst_partitioner_factory: None 2026-03-10T07:46:54.921 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:46:54 vm08 ceph-mon[59917]: rocksdb: Options.memtable_factory: SkipListFactory 2026-03-10T07:46:54.921 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:46:54 vm08 ceph-mon[59917]: rocksdb: Options.table_factory: BlockBasedTable 2026-03-10T07:46:54.921 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:46:54 vm08 ceph-mon[59917]: rocksdb: table_factory options: flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55c4a62f3480) 2026-03-10T07:46:54.921 INFO:journalctl@ceph.mon.vm08.vm08.stdout: cache_index_and_filter_blocks: 1 2026-03-10T07:46:54.921 INFO:journalctl@ceph.mon.vm08.vm08.stdout: cache_index_and_filter_blocks_with_high_priority: 0 2026-03-10T07:46:54.921 INFO:journalctl@ceph.mon.vm08.vm08.stdout: pin_l0_filter_and_index_blocks_in_cache: 0 2026-03-10T07:46:54.922 INFO:journalctl@ceph.mon.vm08.vm08.stdout: pin_top_level_index_and_filter: 1 2026-03-10T07:46:54.922 INFO:journalctl@ceph.mon.vm08.vm08.stdout: index_type: 0 2026-03-10T07:46:54.922 INFO:journalctl@ceph.mon.vm08.vm08.stdout: data_block_index_type: 0 2026-03-10T07:46:54.922 INFO:journalctl@ceph.mon.vm08.vm08.stdout: index_shortening: 1 2026-03-10T07:46:54.922 INFO:journalctl@ceph.mon.vm08.vm08.stdout: data_block_hash_table_util_ratio: 0.750000 2026-03-10T07:46:54.922 INFO:journalctl@ceph.mon.vm08.vm08.stdout: checksum: 4 2026-03-10T07:46:54.922 INFO:journalctl@ceph.mon.vm08.vm08.stdout: no_block_cache: 0 2026-03-10T07:46:54.922 INFO:journalctl@ceph.mon.vm08.vm08.stdout: block_cache: 0x55c4a56051f0 2026-03-10T07:46:54.922 INFO:journalctl@ceph.mon.vm08.vm08.stdout: block_cache_name: BinnedLRUCache 2026-03-10T07:46:54.922 INFO:journalctl@ceph.mon.vm08.vm08.stdout: block_cache_options: 2026-03-10T07:46:54.922 INFO:journalctl@ceph.mon.vm08.vm08.stdout: capacity : 536870912 2026-03-10T07:46:54.922 INFO:journalctl@ceph.mon.vm08.vm08.stdout: num_shard_bits : 4 
2026-03-10T07:46:54.922 INFO:journalctl@ceph.mon.vm08.vm08.stdout: strict_capacity_limit : 0 2026-03-10T07:46:54.922 INFO:journalctl@ceph.mon.vm08.vm08.stdout: high_pri_pool_ratio: 0.000 2026-03-10T07:46:54.922 INFO:journalctl@ceph.mon.vm08.vm08.stdout: block_cache_compressed: (nil) 2026-03-10T07:46:54.922 INFO:journalctl@ceph.mon.vm08.vm08.stdout: persistent_cache: (nil) 2026-03-10T07:46:54.922 INFO:journalctl@ceph.mon.vm08.vm08.stdout: block_size: 4096 2026-03-10T07:46:54.922 INFO:journalctl@ceph.mon.vm08.vm08.stdout: block_size_deviation: 10 2026-03-10T07:46:54.922 INFO:journalctl@ceph.mon.vm08.vm08.stdout: block_restart_interval: 16 2026-03-10T07:46:54.922 INFO:journalctl@ceph.mon.vm08.vm08.stdout: index_block_restart_interval: 1 2026-03-10T07:46:54.922 INFO:journalctl@ceph.mon.vm08.vm08.stdout: metadata_block_size: 4096 2026-03-10T07:46:54.922 INFO:journalctl@ceph.mon.vm08.vm08.stdout: partition_filters: 0 2026-03-10T07:46:54.922 INFO:journalctl@ceph.mon.vm08.vm08.stdout: use_delta_encoding: 1 2026-03-10T07:46:54.922 INFO:journalctl@ceph.mon.vm08.vm08.stdout: filter_policy: bloomfilter 2026-03-10T07:46:54.922 INFO:journalctl@ceph.mon.vm08.vm08.stdout: whole_key_filtering: 1 2026-03-10T07:46:54.922 INFO:journalctl@ceph.mon.vm08.vm08.stdout: verify_compression: 0 2026-03-10T07:46:54.922 INFO:journalctl@ceph.mon.vm08.vm08.stdout: read_amp_bytes_per_bit: 0 2026-03-10T07:46:54.922 INFO:journalctl@ceph.mon.vm08.vm08.stdout: format_version: 5 2026-03-10T07:46:54.922 INFO:journalctl@ceph.mon.vm08.vm08.stdout: enable_index_compression: 1 2026-03-10T07:46:54.922 INFO:journalctl@ceph.mon.vm08.vm08.stdout: block_align: 0 2026-03-10T07:46:54.922 INFO:journalctl@ceph.mon.vm08.vm08.stdout: max_auto_readahead_size: 262144 2026-03-10T07:46:54.922 INFO:journalctl@ceph.mon.vm08.vm08.stdout: prepopulate_block_cache: 0 2026-03-10T07:46:54.922 INFO:journalctl@ceph.mon.vm08.vm08.stdout: initial_auto_readahead_size: 8192 2026-03-10T07:46:54.922 
INFO:journalctl@ceph.mon.vm08.vm08.stdout: num_file_reads_for_auto_readahead: 2 2026-03-10T07:46:54.922 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:46:54 vm08 ceph-mon[59917]: rocksdb: Options.write_buffer_size: 33554432 2026-03-10T07:46:54.922 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:46:54 vm08 ceph-mon[59917]: rocksdb: Options.max_write_buffer_number: 2 2026-03-10T07:46:54.922 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:46:54 vm08 ceph-mon[59917]: rocksdb: Options.compression: NoCompression 2026-03-10T07:46:54.922 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:46:54 vm08 ceph-mon[59917]: rocksdb: Options.bottommost_compression: Disabled 2026-03-10T07:46:54.922 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:46:54 vm08 ceph-mon[59917]: rocksdb: Options.prefix_extractor: nullptr 2026-03-10T07:46:54.922 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:46:54 vm08 ceph-mon[59917]: rocksdb: Options.memtable_insert_with_hint_prefix_extractor: nullptr 2026-03-10T07:46:54.922 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:46:54 vm08 ceph-mon[59917]: rocksdb: Options.num_levels: 7 2026-03-10T07:46:54.922 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:46:54 vm08 ceph-mon[59917]: rocksdb: Options.min_write_buffer_number_to_merge: 1 2026-03-10T07:46:54.922 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:46:54 vm08 ceph-mon[59917]: rocksdb: Options.max_write_buffer_number_to_maintain: 0 2026-03-10T07:46:54.922 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:46:54 vm08 ceph-mon[59917]: rocksdb: Options.max_write_buffer_size_to_maintain: 0 2026-03-10T07:46:54.922 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:46:54 vm08 ceph-mon[59917]: rocksdb: Options.bottommost_compression_opts.window_bits: -14 2026-03-10T07:46:54.922 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:46:54 vm08 ceph-mon[59917]: rocksdb: Options.bottommost_compression_opts.level: 32767 2026-03-10T07:46:54.922 
INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:46:54 vm08 ceph-mon[59917]: rocksdb: Options.bottommost_compression_opts.strategy: 0 2026-03-10T07:46:54.922 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:46:54 vm08 ceph-mon[59917]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0 2026-03-10T07:46:54.922 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:46:54 vm08 ceph-mon[59917]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0 2026-03-10T07:46:54.922 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:46:54 vm08 ceph-mon[59917]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1 2026-03-10T07:46:54.922 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:46:54 vm08 ceph-mon[59917]: rocksdb: Options.bottommost_compression_opts.enabled: false 2026-03-10T07:46:54.922 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:46:54 vm08 ceph-mon[59917]: rocksdb: Options.bottommost_compression_opts.max_dict_buffer_bytes: 0 2026-03-10T07:46:54.922 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:46:54 vm08 ceph-mon[59917]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true 2026-03-10T07:46:54.922 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:46:54 vm08 ceph-mon[59917]: rocksdb: Options.compression_opts.window_bits: -14 2026-03-10T07:46:54.922 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:46:54 vm08 ceph-mon[59917]: rocksdb: Options.compression_opts.level: 32767 2026-03-10T07:46:54.922 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:46:54 vm08 ceph-mon[59917]: rocksdb: Options.compression_opts.strategy: 0 2026-03-10T07:46:54.922 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:46:54 vm08 ceph-mon[59917]: rocksdb: Options.compression_opts.max_dict_bytes: 0 2026-03-10T07:46:54.922 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:46:54 vm08 ceph-mon[59917]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0 2026-03-10T07:46:54.922 
INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:46:54 vm08 ceph-mon[59917]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true 2026-03-10T07:46:54.922 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:46:54 vm08 ceph-mon[59917]: rocksdb: Options.compression_opts.parallel_threads: 1 2026-03-10T07:46:54.922 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:46:54 vm08 ceph-mon[59917]: rocksdb: Options.compression_opts.enabled: false 2026-03-10T07:46:54.922 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:46:54 vm08 ceph-mon[59917]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0 2026-03-10T07:46:54.922 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:46:54 vm08 ceph-mon[59917]: rocksdb: Options.level0_file_num_compaction_trigger: 4 2026-03-10T07:46:54.923 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:46:54 vm08 ceph-mon[59917]: rocksdb: Options.level0_slowdown_writes_trigger: 20 2026-03-10T07:46:54.923 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:46:54 vm08 ceph-mon[59917]: rocksdb: Options.level0_stop_writes_trigger: 36 2026-03-10T07:46:54.923 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:46:54 vm08 ceph-mon[59917]: rocksdb: Options.target_file_size_base: 67108864 2026-03-10T07:46:54.923 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:46:54 vm08 ceph-mon[59917]: rocksdb: Options.target_file_size_multiplier: 1 2026-03-10T07:46:54.923 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:46:54 vm08 ceph-mon[59917]: rocksdb: Options.max_bytes_for_level_base: 268435456 2026-03-10T07:46:54.923 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:46:54 vm08 ceph-mon[59917]: rocksdb: Options.level_compaction_dynamic_level_bytes: 1 2026-03-10T07:46:54.923 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:46:54 vm08 ceph-mon[59917]: rocksdb: Options.max_bytes_for_level_multiplier: 10.000000 2026-03-10T07:46:54.923 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:46:54 vm08 ceph-mon[59917]: rocksdb: 
Options.max_bytes_for_level_multiplier_addtl[0]: 1 2026-03-10T07:46:54.923 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:46:54 vm08 ceph-mon[59917]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1 2026-03-10T07:46:54.923 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:46:54 vm08 ceph-mon[59917]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1 2026-03-10T07:46:54.923 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:46:54 vm08 ceph-mon[59917]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1 2026-03-10T07:46:54.923 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:46:54 vm08 ceph-mon[59917]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1 2026-03-10T07:46:54.923 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:46:54 vm08 ceph-mon[59917]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1 2026-03-10T07:46:54.923 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:46:54 vm08 ceph-mon[59917]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1 2026-03-10T07:46:54.923 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:46:54 vm08 ceph-mon[59917]: rocksdb: Options.max_sequential_skip_in_iterations: 8 2026-03-10T07:46:54.923 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:46:54 vm08 ceph-mon[59917]: rocksdb: Options.max_compaction_bytes: 1677721600 2026-03-10T07:46:54.923 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:46:54 vm08 ceph-mon[59917]: rocksdb: Options.ignore_max_compaction_bytes_for_input: true 2026-03-10T07:46:54.923 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:46:54 vm08 ceph-mon[59917]: rocksdb: Options.arena_block_size: 1048576 2026-03-10T07:46:54.923 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:46:54 vm08 ceph-mon[59917]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736 2026-03-10T07:46:54.923 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:46:54 vm08 ceph-mon[59917]: rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944 
2026-03-10T07:46:54.923 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:46:54 vm08 ceph-mon[59917]: rocksdb: Options.disable_auto_compactions: 0 2026-03-10T07:46:54.923 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:46:54 vm08 ceph-mon[59917]: rocksdb: Options.compaction_style: kCompactionStyleLevel 2026-03-10T07:46:54.923 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:46:54 vm08 ceph-mon[59917]: rocksdb: Options.compaction_pri: kMinOverlappingRatio 2026-03-10T07:46:54.923 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:46:54 vm08 ceph-mon[59917]: rocksdb: Options.compaction_options_universal.size_ratio: 1 2026-03-10T07:46:54.923 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:46:54 vm08 ceph-mon[59917]: rocksdb: Options.compaction_options_universal.min_merge_width: 2 2026-03-10T07:46:54.923 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:46:54 vm08 ceph-mon[59917]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295 2026-03-10T07:46:54.923 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:46:54 vm08 ceph-mon[59917]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200 2026-03-10T07:46:54.923 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:46:54 vm08 ceph-mon[59917]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1 2026-03-10T07:46:54.923 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:46:54 vm08 ceph-mon[59917]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize 2026-03-10T07:46:54.923 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:46:54 vm08 ceph-mon[59917]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824 2026-03-10T07:46:54.923 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:46:54 vm08 ceph-mon[59917]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0 2026-03-10T07:46:54.923 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:46:54 vm08 ceph-mon[59917]: rocksdb: 
Options.table_properties_collectors: 2026-03-10T07:46:54.923 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:46:54 vm08 ceph-mon[59917]: rocksdb: Options.inplace_update_support: 0 2026-03-10T07:46:54.923 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:46:54 vm08 ceph-mon[59917]: rocksdb: Options.inplace_update_num_locks: 10000 2026-03-10T07:46:54.923 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:46:54 vm08 ceph-mon[59917]: rocksdb: Options.memtable_prefix_bloom_size_ratio: 0.000000 2026-03-10T07:46:54.923 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:46:54 vm08 ceph-mon[59917]: rocksdb: Options.memtable_whole_key_filtering: 0 2026-03-10T07:46:54.923 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:46:54 vm08 ceph-mon[59917]: rocksdb: Options.memtable_huge_page_size: 0 2026-03-10T07:46:54.923 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:46:54 vm08 ceph-mon[59917]: rocksdb: Options.bloom_locality: 0 2026-03-10T07:46:54.923 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:46:54 vm08 ceph-mon[59917]: rocksdb: Options.max_successive_merges: 0 2026-03-10T07:46:54.923 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:46:54 vm08 ceph-mon[59917]: rocksdb: Options.optimize_filters_for_hits: 0 2026-03-10T07:46:54.923 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:46:54 vm08 ceph-mon[59917]: rocksdb: Options.paranoid_file_checks: 0 2026-03-10T07:46:54.923 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:46:54 vm08 ceph-mon[59917]: rocksdb: Options.force_consistency_checks: 1 2026-03-10T07:46:54.923 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:46:54 vm08 ceph-mon[59917]: rocksdb: Options.report_bg_io_stats: 0 2026-03-10T07:46:54.923 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:46:54 vm08 ceph-mon[59917]: rocksdb: Options.ttl: 2592000 2026-03-10T07:46:54.923 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:46:54 vm08 ceph-mon[59917]: rocksdb: Options.periodic_compaction_seconds: 0 2026-03-10T07:46:54.923 
INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:46:54 vm08 ceph-mon[59917]: rocksdb: Options.preclude_last_level_data_seconds: 0 2026-03-10T07:46:54.923 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:46:54 vm08 ceph-mon[59917]: rocksdb: Options.preserve_internal_time_seconds: 0 2026-03-10T07:46:54.923 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:46:54 vm08 ceph-mon[59917]: rocksdb: Options.enable_blob_files: false 2026-03-10T07:46:54.923 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:46:54 vm08 ceph-mon[59917]: rocksdb: Options.min_blob_size: 0 2026-03-10T07:46:54.923 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:46:54 vm08 ceph-mon[59917]: rocksdb: Options.blob_file_size: 268435456 2026-03-10T07:46:54.923 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:46:54 vm08 ceph-mon[59917]: rocksdb: Options.blob_compression_type: NoCompression 2026-03-10T07:46:54.923 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:46:54 vm08 ceph-mon[59917]: rocksdb: Options.enable_blob_garbage_collection: false 2026-03-10T07:46:54.923 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:46:54 vm08 ceph-mon[59917]: rocksdb: Options.blob_garbage_collection_age_cutoff: 0.250000 2026-03-10T07:46:54.923 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:46:54 vm08 ceph-mon[59917]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000 2026-03-10T07:46:54.923 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:46:54 vm08 ceph-mon[59917]: rocksdb: Options.blob_compaction_readahead_size: 0 2026-03-10T07:46:54.923 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:46:54 vm08 ceph-mon[59917]: rocksdb: Options.blob_file_starting_level: 0 2026-03-10T07:46:54.923 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:46:54 vm08 ceph-mon[59917]: rocksdb: Options.experimental_mempurge_threshold: 0.000000 2026-03-10T07:46:54.923 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:46:54 vm08 ceph-mon[59917]: rocksdb: [db/version_set.cc:5566] Recovered from 
manifest file:/var/lib/ceph/mon/ceph-vm08/store.db/MANIFEST-000005 succeeded,manifest_file_number is 5, next_file_number is 7, last_sequence is 0, log_number is 0,prev_log_number is 0,max_column_family is 0,min_log_number_to_keep is 0 2026-03-10T07:46:54.923 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:46:54 vm08 ceph-mon[59917]: rocksdb: [db/version_set.cc:5581] Column family [default] (ID 0), log number is 0 2026-03-10T07:46:54.923 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:46:54 vm08 ceph-mon[59917]: rocksdb: [db/db_impl/db_impl_open.cc:539] DB ID: d0e6f634-0d76-4fd8-b0e5-605ccc480124 2026-03-10T07:46:54.923 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:46:54 vm08 ceph-mon[59917]: rocksdb: EVENT_LOG_v1 {"time_micros": 1773128814481285, "job": 1, "event": "recovery_started", "wal_files": [4]} 2026-03-10T07:46:54.923 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:46:54 vm08 ceph-mon[59917]: rocksdb: [db/db_impl/db_impl_open.cc:1043] Recovering log #4 mode 2 2026-03-10T07:46:54.923 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:46:54 vm08 ceph-mon[59917]: rocksdb: EVENT_LOG_v1 {"time_micros": 1773128814481923, "cf_name": "default", "job": 1, "event": "table_file_creation", "file_number": 8, "file_size": 1617, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 1, "largest_seqno": 5, "table_properties": {"data_size": 523, "index_size": 31, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 69, "raw_key_size": 115, "raw_average_key_size": 23, "raw_value_size": 401, "raw_average_value_size": 80, "num_data_blocks": 1, "num_entries": 5, "num_filter_entries": 5, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", 
"prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1773128814, "oldest_key_time": 0, "file_creation_time": 0, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "d0e6f634-0d76-4fd8-b0e5-605ccc480124", "db_session_id": "E09ZIAO6FCG2W73INDVQ", "orig_file_number": 8, "seqno_to_time_mapping": "N/A"}} 2026-03-10T07:46:54.924 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:46:54 vm08 ceph-mon[59917]: rocksdb: EVENT_LOG_v1 {"time_micros": 1773128814482007, "job": 1, "event": "recovery_finished"} 2026-03-10T07:46:54.924 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:46:54 vm08 ceph-mon[59917]: rocksdb: [db/version_set.cc:5047] Creating manifest 10 2026-03-10T07:46:54.924 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:46:54 vm08 ceph-mon[59917]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-vm08/store.db/000004.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 2026-03-10T07:46:54.924 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:46:54 vm08 ceph-mon[59917]: rocksdb: [db/db_impl/db_impl_open.cc:1987] SstFileManager instance 0x55c4a56a2000 2026-03-10T07:46:54.924 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:46:54 vm08 ceph-mon[59917]: rocksdb: DB pointer 0x55c4a568e000 2026-03-10T07:46:54.924 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:46:54 vm08 ceph-mon[59917]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS ------- 2026-03-10T07:46:54.924 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:46:54 vm08 ceph-mon[59917]: rocksdb: [db/db_impl/db_impl.cc:1111] 2026-03-10T07:46:54.924 INFO:journalctl@ceph.mon.vm08.vm08.stdout: ** DB Stats ** 2026-03-10T07:46:54.924 
INFO:journalctl@ceph.mon.vm08.vm08.stdout: Uptime(secs): 0.0 total, 0.0 interval 2026-03-10T07:46:54.924 INFO:journalctl@ceph.mon.vm08.vm08.stdout: Cumulative writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 GB, 0.00 MB/s 2026-03-10T07:46:54.924 INFO:journalctl@ceph.mon.vm08.vm08.stdout: Cumulative WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s 2026-03-10T07:46:54.924 INFO:journalctl@ceph.mon.vm08.vm08.stdout: Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent 2026-03-10T07:46:54.924 INFO:journalctl@ceph.mon.vm08.vm08.stdout: Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s 2026-03-10T07:46:54.924 INFO:journalctl@ceph.mon.vm08.vm08.stdout: Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s 2026-03-10T07:46:54.924 INFO:journalctl@ceph.mon.vm08.vm08.stdout: Interval stall: 00:00:0.000 H:M:S, 0.0 percent 2026-03-10T07:46:54.924 INFO:journalctl@ceph.mon.vm08.vm08.stdout: 2026-03-10T07:46:54.924 INFO:journalctl@ceph.mon.vm08.vm08.stdout: ** Compaction Stats [default] ** 2026-03-10T07:46:54.924 INFO:journalctl@ceph.mon.vm08.vm08.stdout: Level Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB) 2026-03-10T07:46:54.924 INFO:journalctl@ceph.mon.vm08.vm08.stdout: ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------ 2026-03-10T07:46:54.924 INFO:journalctl@ceph.mon.vm08.vm08.stdout: L0 1/0 1.58 KB 0.2 0.0 0.0 0.0 0.0 0.0 0.0 1.0 0.0 2.6 0.00 0.00 1 0.001 0 0 0.0 0.0 2026-03-10T07:46:54.924 INFO:journalctl@ceph.mon.vm08.vm08.stdout: Sum 1/0 1.58 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 1.0 0.0 2.6 0.00 0.00 1 0.001 0 0 0.0 0.0 2026-03-10T07:46:54.924 
INFO:journalctl@ceph.mon.vm08.vm08.stdout: Int 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 1.0 0.0 2.6 0.00 0.00 1 0.001 0 0 0.0 0.0 2026-03-10T07:46:54.924 INFO:journalctl@ceph.mon.vm08.vm08.stdout: 2026-03-10T07:46:54.924 INFO:journalctl@ceph.mon.vm08.vm08.stdout: ** Compaction Stats [default] ** 2026-03-10T07:46:54.924 INFO:journalctl@ceph.mon.vm08.vm08.stdout: Priority Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB) 2026-03-10T07:46:54.924 INFO:journalctl@ceph.mon.vm08.vm08.stdout: --------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------- 2026-03-10T07:46:54.924 INFO:journalctl@ceph.mon.vm08.vm08.stdout: User 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 2.6 0.00 0.00 1 0.001 0 0 0.0 0.0 2026-03-10T07:46:54.924 INFO:journalctl@ceph.mon.vm08.vm08.stdout: 2026-03-10T07:46:54.924 INFO:journalctl@ceph.mon.vm08.vm08.stdout: Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0 2026-03-10T07:46:54.924 INFO:journalctl@ceph.mon.vm08.vm08.stdout: 2026-03-10T07:46:54.924 INFO:journalctl@ceph.mon.vm08.vm08.stdout: Uptime(secs): 0.0 total, 0.0 interval 2026-03-10T07:46:54.924 INFO:journalctl@ceph.mon.vm08.vm08.stdout: Flush(GB): cumulative 0.000, interval 0.000 2026-03-10T07:46:54.924 INFO:journalctl@ceph.mon.vm08.vm08.stdout: AddFile(GB): cumulative 0.000, interval 0.000 2026-03-10T07:46:54.924 INFO:journalctl@ceph.mon.vm08.vm08.stdout: AddFile(Total Files): cumulative 0, interval 0 2026-03-10T07:46:54.924 INFO:journalctl@ceph.mon.vm08.vm08.stdout: AddFile(L0 Files): cumulative 0, interval 0 2026-03-10T07:46:54.924 INFO:journalctl@ceph.mon.vm08.vm08.stdout: AddFile(Keys): cumulative 0, interval 0 2026-03-10T07:46:54.924 INFO:journalctl@ceph.mon.vm08.vm08.stdout: Cumulative compaction: 0.00 
GB write, 0.35 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds 2026-03-10T07:46:54.924 INFO:journalctl@ceph.mon.vm08.vm08.stdout: Interval compaction: 0.00 GB write, 0.35 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds 2026-03-10T07:46:54.924 INFO:journalctl@ceph.mon.vm08.vm08.stdout: Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count 2026-03-10T07:46:54.924 INFO:journalctl@ceph.mon.vm08.vm08.stdout: Block cache BinnedLRUCache@0x55c4a56051f0#2 capacity: 512.00 MB usage: 0.22 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 0 last_secs: 5e-06 secs_since: 0 2026-03-10T07:46:54.924 INFO:journalctl@ceph.mon.vm08.vm08.stdout: Block cache entry stats(count,size,portion): FilterBlock(1,0.11 KB,2.08616e-05%) IndexBlock(1,0.11 KB,2.08616e-05%) Misc(1,0.00 KB,0%) 2026-03-10T07:46:54.924 INFO:journalctl@ceph.mon.vm08.vm08.stdout: 2026-03-10T07:46:54.924 INFO:journalctl@ceph.mon.vm08.vm08.stdout: ** File Read Latency Histogram By Level [default] ** 2026-03-10T07:46:54.924 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:46:54 vm08 ceph-mon[59917]: mon.vm08 does not exist in monmap, will attempt to join an existing cluster 2026-03-10T07:46:54.924 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:46:54 vm08 ceph-mon[59917]: using public_addr v2:192.168.123.108:0/0 -> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] 2026-03-10T07:46:54.924 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:46:54 vm08 ceph-mon[59917]: starting mon.vm08 rank -1 at public addrs [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] at bind addrs [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] mon_data /var/lib/ceph/mon/ceph-vm08 fsid 12e9780e-1c55-11f1-8896-79f7c2e9b508 2026-03-10T07:46:54.924 
INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:46:54 vm08 ceph-mon[59917]: mon.vm08@-1(???) e0 preinit fsid 12e9780e-1c55-11f1-8896-79f7c2e9b508 2026-03-10T07:46:54.924 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:46:54 vm08 ceph-mon[59917]: mon.vm08@-1(synchronizing).mds e1 new map 2026-03-10T07:46:54.924 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:46:54 vm08 ceph-mon[59917]: mon.vm08@-1(synchronizing).mds e1 print_map 2026-03-10T07:46:54.924 INFO:journalctl@ceph.mon.vm08.vm08.stdout: e1 2026-03-10T07:46:54.924 INFO:journalctl@ceph.mon.vm08.vm08.stdout: enable_multiple, ever_enabled_multiple: 1,1 2026-03-10T07:46:54.924 INFO:journalctl@ceph.mon.vm08.vm08.stdout: default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2} 2026-03-10T07:46:54.924 INFO:journalctl@ceph.mon.vm08.vm08.stdout: legacy client fscid: -1 2026-03-10T07:46:54.924 INFO:journalctl@ceph.mon.vm08.vm08.stdout: 2026-03-10T07:46:54.924 INFO:journalctl@ceph.mon.vm08.vm08.stdout: No filesystems configured 2026-03-10T07:46:54.924 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:46:54 vm08 ceph-mon[59917]: mon.vm08@-1(synchronizing).osd e0 _set_cache_ratios kv ratio 0.25 inc ratio 0.375 full ratio 0.375 2026-03-10T07:46:54.924 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:46:54 vm08 ceph-mon[59917]: mon.vm08@-1(synchronizing).osd e0 register_cache_with_pcm pcm target: 2147483648 pcm max: 1020054732 pcm min: 134217728 inc_osd_cache size: 1 2026-03-10T07:46:54.924 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:46:54 vm08 ceph-mon[59917]: mon.vm08@-1(synchronizing).osd e1 e1: 0 total, 0 up, 0 in 2026-03-10T07:46:54.924 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:46:54 vm08 ceph-mon[59917]: mon.vm08@-1(synchronizing).osd e2 e2: 0 total, 0 up, 0 in 2026-03-10T07:46:54.924 
INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:46:54 vm08 ceph-mon[59917]: mon.vm08@-1(synchronizing).osd e3 e3: 0 total, 0 up, 0 in 2026-03-10T07:46:54.924 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:46:54 vm08 ceph-mon[59917]: mon.vm08@-1(synchronizing).osd e4 e4: 0 total, 0 up, 0 in 2026-03-10T07:46:54.924 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:46:54 vm08 ceph-mon[59917]: mon.vm08@-1(synchronizing).osd e5 e5: 0 total, 0 up, 0 in 2026-03-10T07:46:54.924 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:46:54 vm08 ceph-mon[59917]: mon.vm08@-1(synchronizing).osd e5 crush map has features 3314932999778484224, adjusting msgr requires 2026-03-10T07:46:54.924 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:46:54 vm08 ceph-mon[59917]: mon.vm08@-1(synchronizing).osd e5 crush map has features 288514050185494528, adjusting msgr requires 2026-03-10T07:46:54.925 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:46:54 vm08 ceph-mon[59917]: mon.vm08@-1(synchronizing).osd e5 crush map has features 288514050185494528, adjusting msgr requires 2026-03-10T07:46:54.925 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:46:54 vm08 ceph-mon[59917]: mon.vm08@-1(synchronizing).osd e5 crush map has features 288514050185494528, adjusting msgr requires 2026-03-10T07:46:54.925 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:46:54 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' 2026-03-10T07:46:54.925 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:46:54 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' 2026-03-10T07:46:54.925 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:46:54 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' 2026-03-10T07:46:54.925 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:46:54 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' 2026-03-10T07:46:54.925 
INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:46:54 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "config rm", "who": "osd/host:vm08", "name": "osd_memory_target"}]: dispatch 2026-03-10T07:46:54.925 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:46:54 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T07:46:54.925 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:46:54 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T07:46:54.925 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:46:54 vm08 ceph-mon[59917]: Updating vm05:/etc/ceph/ceph.conf 2026-03-10T07:46:54.925 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:46:54 vm08 ceph-mon[59917]: Updating vm08:/etc/ceph/ceph.conf 2026-03-10T07:46:54.925 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:46:54 vm08 ceph-mon[59917]: Updating vm08:/var/lib/ceph/12e9780e-1c55-11f1-8896-79f7c2e9b508/config/ceph.conf 2026-03-10T07:46:54.925 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:46:54 vm08 ceph-mon[59917]: Updating vm05:/var/lib/ceph/12e9780e-1c55-11f1-8896-79f7c2e9b508/config/ceph.conf 2026-03-10T07:46:54.925 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:46:54 vm08 ceph-mon[59917]: mgrmap e17: vm05.blexke(active, since 2s) 2026-03-10T07:46:54.925 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:46:54 vm08 ceph-mon[59917]: Updating vm08:/etc/ceph/ceph.client.admin.keyring 2026-03-10T07:46:54.925 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:46:54 vm08 ceph-mon[59917]: Updating vm05:/etc/ceph/ceph.client.admin.keyring 2026-03-10T07:46:54.925 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:46:54 vm08 ceph-mon[59917]: Updating vm05:/var/lib/ceph/12e9780e-1c55-11f1-8896-79f7c2e9b508/config/ceph.client.admin.keyring 
2026-03-10T07:46:54.925 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:46:54 vm08 ceph-mon[59917]: from='client.? 192.168.123.108:0/2380430570' entity='client.admin' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch 2026-03-10T07:46:54.925 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:46:54 vm08 ceph-mon[59917]: Updating vm08:/var/lib/ceph/12e9780e-1c55-11f1-8896-79f7c2e9b508/config/ceph.client.admin.keyring 2026-03-10T07:46:54.925 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:46:54 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' 2026-03-10T07:46:54.925 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:46:54 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' 2026-03-10T07:46:54.925 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:46:54 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' 2026-03-10T07:46:54.925 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:46:54 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' 2026-03-10T07:46:54.925 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:46:54 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' 2026-03-10T07:46:54.925 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:46:54 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "auth get-or-create", "entity": "client.ceph-exporter.vm08", "caps": ["mon", "profile ceph-exporter", "mon", "allow r", "mgr", "allow r", "osd", "allow r"]}]: dispatch 2026-03-10T07:46:54.925 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:46:54 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd='[{"prefix": "auth get-or-create", "entity": "client.ceph-exporter.vm08", "caps": ["mon", "profile ceph-exporter", "mon", "allow r", "mgr", "allow r", "osd", "allow r"]}]': finished 2026-03-10T07:46:54.925 
INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:46:54 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T07:46:54.925 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:46:54 vm08 ceph-mon[59917]: Deploying daemon ceph-exporter.vm08 on vm08 2026-03-10T07:46:54.925 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:46:54 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' 2026-03-10T07:46:54.925 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:46:54 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' 2026-03-10T07:46:54.925 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:46:54 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' 2026-03-10T07:46:54.925 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:46:54 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' 2026-03-10T07:46:54.925 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:46:54 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "auth get-or-create", "entity": "client.crash.vm08", "caps": ["mon", "profile crash", "mgr", "profile crash"]}]: dispatch 2026-03-10T07:46:54.925 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:46:54 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd='[{"prefix": "auth get-or-create", "entity": "client.crash.vm08", "caps": ["mon", "profile crash", "mgr", "profile crash"]}]': finished 2026-03-10T07:46:54.925 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:46:54 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T07:46:54.925 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:46:54 vm08 ceph-mon[59917]: Deploying daemon crash.vm08 on vm08 
2026-03-10T07:46:54.925 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:46:54 vm08 ceph-mon[59917]: from='client.? 192.168.123.108:0/2579408392' entity='client.admin' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
2026-03-10T07:46:54.925 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:46:54 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke'
2026-03-10T07:46:54.925 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:46:54 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke'
2026-03-10T07:46:54.925 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:46:54 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke'
2026-03-10T07:46:54.925 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:46:54 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke'
2026-03-10T07:46:54.925 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:46:54 vm08 ceph-mon[59917]: Deploying daemon node-exporter.vm08 on vm08
2026-03-10T07:46:54.925 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:46:54 vm08 ceph-mon[59917]: from='client.? 192.168.123.108:0/1288164391' entity='client.admin' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
2026-03-10T07:46:54.925 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:46:54 vm08 ceph-mon[59917]: from='client.? 192.168.123.108:0/2628189415' entity='client.admin' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
2026-03-10T07:46:54.925 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:46:54 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke'
2026-03-10T07:46:54.925 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:46:54 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke'
2026-03-10T07:46:54.925 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:46:54 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke'
2026-03-10T07:46:54.925 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:46:54 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke'
2026-03-10T07:46:54.925 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:46:54 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "auth get-or-create", "entity": "mgr.vm08.orfpog", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]: dispatch
2026-03-10T07:46:54.925 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:46:54 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd='[{"prefix": "auth get-or-create", "entity": "mgr.vm08.orfpog", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]': finished
2026-03-10T07:46:54.925 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:46:54 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "mgr services"}]: dispatch
2026-03-10T07:46:54.925 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:46:54 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-10T07:46:54.925 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:46:54 vm08 ceph-mon[59917]: Deploying daemon mgr.vm08.orfpog on vm08
2026-03-10T07:46:54.925 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:46:54 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke'
2026-03-10T07:46:54.925 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:46:54 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke'
2026-03-10T07:46:54.925 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:46:54 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke'
2026-03-10T07:46:54.925 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:46:54 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke'
2026-03-10T07:46:54.925 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:46:54 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "auth get", "entity": "mon."}]: dispatch
2026-03-10T07:46:54.925 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:46:54 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-10T07:46:54.925 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:46:54 vm08 ceph-mon[59917]: Deploying daemon mon.vm08 on vm08
2026-03-10T07:46:54.925 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:46:54 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke'
2026-03-10T07:46:54.925 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:46:54 vm08 ceph-mon[59917]: from='client.? 192.168.123.108:0/2856745707' entity='client.admin' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
2026-03-10T07:46:54.925 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:46:54 vm08 ceph-mon[59917]: mon.vm08@-1(synchronizing).paxosservice(auth 1..8) refresh upgraded, format 0 -> 3
2026-03-10T07:46:54.925 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:46:54 vm08 ceph-mon[59917]: mon.vm08@-1(synchronizing) e1 handle_conf_change mon_allow_pool_delete,mon_cluster_log_to_file
2026-03-10T07:46:54.926 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:46:54 vm08 ceph-mon[59917]: expand_channel_meta expand map: {default=false}
2026-03-10T07:46:54.926 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:46:54 vm08 ceph-mon[59917]: expand_channel_meta from 'false' to 'false'
2026-03-10T07:46:54.926 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:46:54 vm08 ceph-mon[59917]: expand_channel_meta expanded map: {default=false}
2026-03-10T07:46:54.926 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:46:54 vm08 ceph-mon[59917]: expand_channel_meta expand map: {default=info}
2026-03-10T07:46:54.926 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:46:54 vm08 ceph-mon[59917]: expand_channel_meta from 'info' to 'info'
2026-03-10T07:46:54.926 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:46:54 vm08 ceph-mon[59917]: expand_channel_meta expanded map: {default=info}
2026-03-10T07:46:54.926 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:46:54 vm08 ceph-mon[59917]: expand_channel_meta expand map: {default=daemon}
2026-03-10T07:46:54.926 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:46:54 vm08 ceph-mon[59917]: expand_channel_meta from 'daemon' to 'daemon'
2026-03-10T07:46:54.926 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:46:54 vm08 ceph-mon[59917]: expand_channel_meta expanded map: {default=daemon}
2026-03-10T07:46:54.926 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:46:54 vm08 ceph-mon[59917]: expand_channel_meta expand map: {default=debug}
2026-03-10T07:46:54.926 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:46:54 vm08 ceph-mon[59917]: expand_channel_meta from 'debug' to 'debug'
2026-03-10T07:46:54.926 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:46:54 vm08 ceph-mon[59917]: expand_channel_meta expanded map: {default=debug}
2026-03-10T07:46:55.302 INFO:tasks.cephadm:Waiting for 2 mons in monmap...
2026-03-10T07:46:55.302 DEBUG:teuthology.orchestra.run.vm08:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 12e9780e-1c55-11f1-8896-79f7c2e9b508 -- ceph mon dump -f json
2026-03-10T07:46:55.474 INFO:teuthology.orchestra.run.vm08.stderr:Inferring config /var/lib/ceph/12e9780e-1c55-11f1-8896-79f7c2e9b508/mon.vm08/config
2026-03-10T07:47:00.047 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:46:59 vm05 ceph-mon[50387]: mon.vm05 calling monitor election
2026-03-10T07:47:00.047 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:46:59 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "mon metadata", "id": "vm05"}]: dispatch
2026-03-10T07:47:00.047 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:46:59 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "mon metadata", "id": "vm08"}]: dispatch
2026-03-10T07:47:00.047 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:46:59 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "mon metadata", "id": "vm08"}]: dispatch
2026-03-10T07:47:00.047 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:46:59 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "mon metadata", "id": "vm08"}]: dispatch
2026-03-10T07:47:00.047 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:46:59 vm05 ceph-mon[50387]: mon.vm08 calling monitor election
2026-03-10T07:47:00.047 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:46:59 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "mon metadata", "id": "vm08"}]: dispatch
2026-03-10T07:47:00.047 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:46:59 vm05 ceph-mon[50387]: from='mgr.? 192.168.123.108:0/779307159' entity='mgr.vm08.orfpog' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/vm08.orfpog/crt"}]: dispatch
2026-03-10T07:47:00.047 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:46:59 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
2026-03-10T07:47:00.047 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:46:59 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "mon metadata", "id": "vm08"}]: dispatch
2026-03-10T07:47:00.047 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:46:59 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "mon metadata", "id": "vm08"}]: dispatch
2026-03-10T07:47:00.047 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:46:59 vm05 ceph-mon[50387]: mon.vm05 is new leader, mons vm05,vm08 in quorum (ranks 0,1)
2026-03-10T07:47:00.047 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:46:59 vm05 ceph-mon[50387]: monmap e2: 2 mons at {vm05=[v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0],vm08=[v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0]} removed_ranks: {} disallowed_leaders: {}
2026-03-10T07:47:00.047 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:46:59 vm05 ceph-mon[50387]: fsmap
2026-03-10T07:47:00.047 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:46:59 vm05 ceph-mon[50387]: osdmap e5: 0 total, 0 up, 0 in
2026-03-10T07:47:00.047 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:46:59 vm05 ceph-mon[50387]: mgrmap e17: vm05.blexke(active, since 16s)
2026-03-10T07:47:00.047 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:46:59 vm05 ceph-mon[50387]: Standby manager daemon vm08.orfpog started
2026-03-10T07:47:00.048 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:46:59 vm05 ceph-mon[50387]: from='mgr.? 192.168.123.108:0/779307159' entity='mgr.vm08.orfpog' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/crt"}]: dispatch
2026-03-10T07:47:00.048 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:46:59 vm05 ceph-mon[50387]: overall HEALTH_OK
2026-03-10T07:47:00.048 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:46:59 vm05 ceph-mon[50387]: from='mgr.? 192.168.123.108:0/779307159' entity='mgr.vm08.orfpog' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/vm08.orfpog/key"}]: dispatch
2026-03-10T07:47:00.048 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:46:59 vm05 ceph-mon[50387]: from='mgr.? 192.168.123.108:0/779307159' entity='mgr.vm08.orfpog' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/key"}]: dispatch
2026-03-10T07:47:00.048 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:46:59 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke'
2026-03-10T07:47:00.048 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:46:59 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke'
2026-03-10T07:47:00.048 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:46:59 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke'
2026-03-10T07:47:00.149 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:46:59 vm08 ceph-mon[59917]: mon.vm05 calling monitor election
2026-03-10T07:47:00.149 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:46:59 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "mon metadata", "id": "vm05"}]: dispatch
2026-03-10T07:47:00.149 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:46:59 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "mon metadata", "id": "vm08"}]: dispatch
2026-03-10T07:47:00.149 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:46:59 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "mon metadata", "id": "vm08"}]: dispatch
2026-03-10T07:47:00.149 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:46:59 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "mon metadata", "id": "vm08"}]: dispatch
2026-03-10T07:47:00.149 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:46:59 vm08 ceph-mon[59917]: mon.vm08 calling monitor election
2026-03-10T07:47:00.149 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:46:59 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "mon metadata", "id": "vm08"}]: dispatch
2026-03-10T07:47:00.149 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:46:59 vm08 ceph-mon[59917]: from='mgr.? 192.168.123.108:0/779307159' entity='mgr.vm08.orfpog' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/vm08.orfpog/crt"}]: dispatch
2026-03-10T07:47:00.149 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:46:59 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
2026-03-10T07:47:00.149 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:46:59 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "mon metadata", "id": "vm08"}]: dispatch
2026-03-10T07:47:00.149 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:46:59 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "mon metadata", "id": "vm08"}]: dispatch
2026-03-10T07:47:00.149 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:46:59 vm08 ceph-mon[59917]: mon.vm05 is new leader, mons vm05,vm08 in quorum (ranks 0,1)
2026-03-10T07:47:00.149 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:46:59 vm08 ceph-mon[59917]: monmap e2: 2 mons at {vm05=[v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0],vm08=[v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0]} removed_ranks: {} disallowed_leaders: {}
2026-03-10T07:47:00.149 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:46:59 vm08 ceph-mon[59917]: fsmap
2026-03-10T07:47:00.149 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:46:59 vm08 ceph-mon[59917]: osdmap e5: 0 total, 0 up, 0 in
2026-03-10T07:47:00.149 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:46:59 vm08 ceph-mon[59917]: mgrmap e17: vm05.blexke(active, since 16s)
2026-03-10T07:47:00.149 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:46:59 vm08 ceph-mon[59917]: Standby manager daemon vm08.orfpog started
2026-03-10T07:47:00.149 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:46:59 vm08 ceph-mon[59917]: from='mgr.? 192.168.123.108:0/779307159' entity='mgr.vm08.orfpog' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/crt"}]: dispatch
2026-03-10T07:47:00.149 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:46:59 vm08 ceph-mon[59917]: overall HEALTH_OK
2026-03-10T07:47:00.149 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:46:59 vm08 ceph-mon[59917]: from='mgr.? 192.168.123.108:0/779307159' entity='mgr.vm08.orfpog' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/vm08.orfpog/key"}]: dispatch
2026-03-10T07:47:00.149 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:46:59 vm08 ceph-mon[59917]: from='mgr.? 192.168.123.108:0/779307159' entity='mgr.vm08.orfpog' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/key"}]: dispatch
2026-03-10T07:47:00.149 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:46:59 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke'
2026-03-10T07:47:00.149 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:46:59 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke'
2026-03-10T07:47:00.149 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:46:59 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke'
2026-03-10T07:47:00.151 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:47:00.149+0000 7f1f39b27700 1 -- 192.168.123.108:0/49240544 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f1f3410de10 msgr2=0x7f1f1c005610 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T07:47:00.151 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:47:00.149+0000 7f1f39b27700 1 --2- 192.168.123.108:0/49240544 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f1f3410de10 0x7f1f1c005610 secure :-1 s=READY pgs=161 cs=0 l=1 rev1=1 crypto rx=0x7f1f24005fa0 tx=0x7f1f2400ff70 comp rx=0 tx=0).stop
2026-03-10T07:47:00.151 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:47:00.150+0000 7f1f39b27700 1 -- 192.168.123.108:0/49240544 shutdown_connections
2026-03-10T07:47:00.151 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:47:00.150+0000 7f1f39b27700 1 --2- 192.168.123.108:0/49240544 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f1f3410de10 0x7f1f1c005610 unknown :-1 s=CLOSED pgs=161 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T07:47:00.151 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:47:00.150+0000 7f1f39b27700 1 -- 192.168.123.108:0/49240544 >> 192.168.123.108:0/49240544 conn(0x7f1f3406ba60 msgr2=0x7f1f3406be70 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-10T07:47:00.151 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:47:00.150+0000 7f1f39b27700 1 -- 192.168.123.108:0/49240544 shutdown_connections
2026-03-10T07:47:00.152 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:47:00.150+0000 7f1f39b27700 1 -- 192.168.123.108:0/49240544 wait complete.
2026-03-10T07:47:00.152 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:47:00.150+0000 7f1f39b27700 1 Processor -- start
2026-03-10T07:47:00.152 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:47:00.151+0000 7f1f39b27700 1 -- start start
2026-03-10T07:47:00.152 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:47:00.151+0000 7f1f39b27700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f1f3410de10 0x7f1f341b2380 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T07:47:00.152 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:47:00.151+0000 7f1f39b27700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f1f341b28c0 0x7f1f341add30 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T07:47:00.152 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:47:00.151+0000 7f1f39b27700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f1f341ae270 con 0x7f1f341b28c0
2026-03-10T07:47:00.152 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:47:00.151+0000 7f1f39b27700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f1f341ae3e0 con 0x7f1f3410de10
2026-03-10T07:47:00.152 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:47:00.151+0000 7f1f33fff700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f1f341b28c0 0x7f1f341add30 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-10T07:47:00.152 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:47:00.151+0000 7f1f33fff700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f1f341b28c0 0x7f1f341add30 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.108:40412/0 (socket says 192.168.123.108:40412)
2026-03-10T07:47:00.152 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:47:00.151+0000 7f1f33fff700 1 -- 192.168.123.108:0/2773066084 learned_addr learned my addr 192.168.123.108:0/2773066084 (peer_addr_for_me v2:192.168.123.108:0/0)
2026-03-10T07:47:00.153 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:47:00.151+0000 7f1f33fff700 1 -- 192.168.123.108:0/2773066084 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f1f3410de10 msgr2=0x7f1f341b2380 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T07:47:00.153 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:47:00.151+0000 7f1f33fff700 1 --2- 192.168.123.108:0/2773066084 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f1f3410de10 0x7f1f341b2380 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T07:47:00.153 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:47:00.151+0000 7f1f33fff700 1 -- 192.168.123.108:0/2773066084 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f1f2400f970 con 0x7f1f341b28c0
2026-03-10T07:47:00.153 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:47:00.151+0000 7f1f33fff700 1 --2- 192.168.123.108:0/2773066084 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f1f341b28c0 0x7f1f341add30 secure :-1 s=READY pgs=162 cs=0 l=1 rev1=1 crypto rx=0x7f1f2800b770 tx=0x7f1f2800bb30 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-10T07:47:00.153 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:47:00.151+0000 7f1f31ffb700 1 -- 192.168.123.108:0/2773066084 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f1f2800f820 con 0x7f1f341b28c0
2026-03-10T07:47:00.153 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:47:00.152+0000 7f1f39b27700 1 -- 192.168.123.108:0/2773066084 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f1f341ae670 con 0x7f1f341b28c0
2026-03-10T07:47:00.153 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:47:00.152+0000 7f1f39b27700 1 -- 192.168.123.108:0/2773066084 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f1f34076fe0 con 0x7f1f341b28c0
2026-03-10T07:47:00.154 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:47:00.152+0000 7f1f31ffb700 1 -- 192.168.123.108:0/2773066084 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f1f2800fe60 con 0x7f1f341b28c0
2026-03-10T07:47:00.154 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:47:00.152+0000 7f1f31ffb700 1 -- 192.168.123.108:0/2773066084 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f1f2800d400 con 0x7f1f341b28c0
2026-03-10T07:47:00.154 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:47:00.153+0000 7f1f31ffb700 1 -- 192.168.123.108:0/2773066084 <== mon.0 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 18) v1 ==== 90252+0+0 (secure 0 0 0) 0x7f1f2800d6c0 con 0x7f1f341b28c0
2026-03-10T07:47:00.155 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:47:00.153+0000 7f1f31ffb700 1 --2- 192.168.123.108:0/2773066084 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f1f2006c5b0 0x7f1f2006ea70 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T07:47:00.155 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:47:00.154+0000 7f1f31ffb700 1 -- 192.168.123.108:0/2773066084 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(5..5 src has 1..5) v4 ==== 1198+0+0 (secure 0 0 0) 0x7f1f2808b590 con 0x7f1f341b28c0
2026-03-10T07:47:00.155 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:47:00.154+0000 7f1f38b25700 1 --2- 192.168.123.108:0/2773066084 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f1f2006c5b0 0x7f1f2006ea70 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-10T07:47:00.155 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:47:00.154+0000 7f1f39b27700 1 -- 192.168.123.108:0/2773066084 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f1f1c002170 con 0x7f1f341b28c0
2026-03-10T07:47:00.155 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:47:00.154+0000 7f1f38b25700 1 --2- 192.168.123.108:0/2773066084 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f1f2006c5b0 0x7f1f2006ea70 secure :-1 s=READY pgs=21 cs=0 l=1 rev1=1 crypto rx=0x7f1f24020040 tx=0x7f1f2401d520 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-10T07:47:00.158 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:47:00.157+0000 7f1f31ffb700 1 -- 192.168.123.108:0/2773066084 <== mon.0 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7f1f2805a970 con 0x7f1f341b28c0
2026-03-10T07:47:00.314 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:47:00.309+0000 7f1f39b27700 1 -- 192.168.123.108:0/2773066084 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "mon dump", "format": "json"} v 0) v1 -- 0x7f1f1c001fe0 con 0x7f1f341b28c0
2026-03-10T07:47:00.317 INFO:teuthology.orchestra.run.vm08.stdout:
2026-03-10T07:47:00.317 INFO:teuthology.orchestra.run.vm08.stdout:{"epoch":2,"fsid":"12e9780e-1c55-11f1-8896-79f7c2e9b508","modified":"2026-03-10T07:46:54.511496Z","created":"2026-03-10T07:45:39.881211Z","min_mon_release":18,"min_mon_release_name":"reef","election_strategy":1,"disallowed_leaders: ":"","stretch_mode":false,"tiebreaker_mon":"","removed_ranks: ":"","features":{"persistent":["kraken","luminous","mimic","osdmap-prune","nautilus","octopus","pacific","elector-pinging","quincy","reef"],"optional":[]},"mons":[{"rank":0,"name":"vm05","public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.105:3300","nonce":0},{"type":"v1","addr":"192.168.123.105:6789","nonce":0}]},"addr":"192.168.123.105:6789/0","public_addr":"192.168.123.105:6789/0","priority":0,"weight":0,"crush_location":"{}"},{"rank":1,"name":"vm08","public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.108:3300","nonce":0},{"type":"v1","addr":"192.168.123.108:6789","nonce":0}]},"addr":"192.168.123.108:6789/0","public_addr":"192.168.123.108:6789/0","priority":0,"weight":0,"crush_location":"{}"}],"quorum":[0,1]}
2026-03-10T07:47:00.318 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:47:00.314+0000 7f1f31ffb700 1 -- 192.168.123.108:0/2773066084 <== mon.0 v2:192.168.123.105:3300/0 7 ==== mon_command_ack([{"prefix": "mon dump", "format": "json"}]=0 dumped monmap epoch 2 v2) v1 ==== 95+0+1032 (secure 0 0 0) 0x7f1f2805a500 con 0x7f1f341b28c0
2026-03-10T07:47:00.318 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:47:00.317+0000 7f1f39b27700 1 -- 192.168.123.108:0/2773066084 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f1f2006c5b0 msgr2=0x7f1f2006ea70 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T07:47:00.318 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:47:00.317+0000 7f1f39b27700 1 --2- 192.168.123.108:0/2773066084 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f1f2006c5b0 0x7f1f2006ea70 secure :-1 s=READY pgs=21 cs=0 l=1 rev1=1 crypto rx=0x7f1f24020040 tx=0x7f1f2401d520 comp rx=0 tx=0).stop
2026-03-10T07:47:00.318 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:47:00.317+0000 7f1f39b27700 1 -- 192.168.123.108:0/2773066084 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f1f341b28c0 msgr2=0x7f1f341add30 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T07:47:00.318 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:47:00.317+0000 7f1f39b27700 1 --2- 192.168.123.108:0/2773066084 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f1f341b28c0 0x7f1f341add30 secure :-1 s=READY pgs=162 cs=0 l=1 rev1=1 crypto rx=0x7f1f2800b770 tx=0x7f1f2800bb30 comp rx=0 tx=0).stop
2026-03-10T07:47:00.318 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:47:00.317+0000 7f1f39b27700 1 -- 192.168.123.108:0/2773066084 shutdown_connections
2026-03-10T07:47:00.318 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:47:00.317+0000 7f1f39b27700 1 --2- 192.168.123.108:0/2773066084 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f1f2006c5b0 0x7f1f2006ea70 unknown :-1 s=CLOSED pgs=21 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T07:47:00.318 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:47:00.317+0000 7f1f39b27700 1 --2- 192.168.123.108:0/2773066084 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f1f3410de10 0x7f1f341b2380 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T07:47:00.318 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:47:00.317+0000 7f1f39b27700 1 --2- 192.168.123.108:0/2773066084 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f1f341b28c0 0x7f1f341add30 unknown :-1 s=CLOSED pgs=162 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T07:47:00.319 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:47:00.317+0000 7f1f39b27700 1 -- 192.168.123.108:0/2773066084 >> 192.168.123.108:0/2773066084 conn(0x7f1f3406ba60 msgr2=0x7f1f3410fdf0 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-10T07:47:00.319 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:47:00.317+0000 7f1f39b27700 1 -- 192.168.123.108:0/2773066084 shutdown_connections
2026-03-10T07:47:00.319 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:47:00.317+0000 7f1f39b27700 1 -- 192.168.123.108:0/2773066084 wait complete.
2026-03-10T07:47:00.327 INFO:teuthology.orchestra.run.vm08.stderr:dumped monmap epoch 2
2026-03-10T07:47:00.370 INFO:tasks.cephadm:Generating final ceph.conf file...
2026-03-10T07:47:00.370 DEBUG:teuthology.orchestra.run.vm05:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 12e9780e-1c55-11f1-8896-79f7c2e9b508 -- ceph config generate-minimal-conf
2026-03-10T07:47:00.523 INFO:teuthology.orchestra.run.vm05.stderr:Inferring config /var/lib/ceph/12e9780e-1c55-11f1-8896-79f7c2e9b508/mon.vm05/config
2026-03-10T07:47:00.830 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:47:00.827+0000 7f72b153b700 1 -- 192.168.123.105:0/1670452178 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f72ac107ff0 msgr2=0x7f72ac10edf0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T07:47:00.830 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:47:00.827+0000 7f72b153b700 1 --2- 192.168.123.105:0/1670452178 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f72ac107ff0 0x7f72ac10edf0 secure :-1 s=READY pgs=163 cs=0 l=1 rev1=1 crypto rx=0x7f729c009b50 tx=0x7f729c009e60 comp rx=0 tx=0).stop
2026-03-10T07:47:00.831 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:47:00.827+0000 7f72b153b700 1 -- 192.168.123.105:0/1670452178 shutdown_connections
2026-03-10T07:47:00.831 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:47:00.827+0000 7f72b153b700 1 --2- 192.168.123.105:0/1670452178 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f72ac107ff0 0x7f72ac10edf0 unknown :-1 s=CLOSED pgs=163 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T07:47:00.831 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:47:00.827+0000 7f72b153b700 1 -- 192.168.123.105:0/1670452178 >> 192.168.123.105:0/1670452178 conn(0x7f72ac06c970 msgr2=0x7f72ac06cd80 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-10T07:47:00.831 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:47:00.827+0000 7f72b153b700 1 -- 192.168.123.105:0/1670452178 shutdown_connections
2026-03-10T07:47:00.831 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:47:00.827+0000 7f72b153b700 1 -- 192.168.123.105:0/1670452178 wait complete.
2026-03-10T07:47:00.831 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:47:00.828+0000 7f72b153b700 1 Processor -- start
2026-03-10T07:47:00.831 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:47:00.828+0000 7f72b153b700 1 -- start start
2026-03-10T07:47:00.831 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:47:00.828+0000 7f72b153b700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f72ac107ff0 0x7f72ac1a4fc0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T07:47:00.831 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:47:00.828+0000 7f72b153b700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f72ac1a5500 0x7f72ac1a9780 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T07:47:00.831 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:47:00.828+0000 7f72b153b700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f72ac1a9cc0 con 0x7f72ac1a5500
2026-03-10T07:47:00.831 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:47:00.828+0000 7f72b153b700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f72ac1a9e30 con 0x7f72ac107ff0
2026-03-10T07:47:00.831 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:47:00.828+0000 7f72ab7fe700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f72ac1a5500 0x7f72ac1a9780 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-10T07:47:00.831 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:47:00.828+0000 7f72ab7fe700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f72ac1a5500 0x7f72ac1a9780 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.105:52998/0 (socket says 192.168.123.105:52998)
2026-03-10T07:47:00.831 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:47:00.828+0000 7f72ab7fe700 1 -- 192.168.123.105:0/4085091647 learned_addr learned my addr 192.168.123.105:0/4085091647 (peer_addr_for_me v2:192.168.123.105:0/0)
2026-03-10T07:47:00.831 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:47:00.828+0000 7f72ab7fe700 1 -- 192.168.123.105:0/4085091647 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f72ac107ff0 msgr2=0x7f72ac1a4fc0 unknown :-1 s=STATE_CONNECTING l=1).mark_down
2026-03-10T07:47:00.831 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:47:00.828+0000 7f72ab7fe700 1 --2- 192.168.123.105:0/4085091647 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f72ac107ff0 0x7f72ac1a4fc0 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T07:47:00.831 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:47:00.828+0000 7f72ab7fe700 1 -- 192.168.123.105:0/4085091647 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f729c0097e0 con 0x7f72ac1a5500
2026-03-10T07:47:00.831 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:47:00.828+0000 7f72ab7fe700 1 --2- 192.168.123.105:0/4085091647 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f72ac1a5500 0x7f72ac1a9780 secure :-1 s=READY pgs=164 cs=0 l=1 rev1=1 crypto rx=0x7f72a000ba70 tx=0x7f72a000be30 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-10T07:47:00.831 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:47:00.829+0000 7f72a97fa700 1 -- 192.168.123.105:0/4085091647 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f72a000c7e0 con 0x7f72ac1a5500
2026-03-10T07:47:00.831 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:47:00.829+0000 7f72a97fa700 1 -- 192.168.123.105:0/4085091647 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f72a000ce20 con 0x7f72ac1a5500
2026-03-10T07:47:00.831 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:47:00.829+0000 7f72a97fa700 1 -- 192.168.123.105:0/4085091647 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f72a0012550 con 0x7f72ac1a5500
2026-03-10T07:47:00.831 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:47:00.830+0000 7f72b153b700 1 -- 192.168.123.105:0/4085091647 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f72ac1a9fd0 con 0x7f72ac1a5500
2026-03-10T07:47:00.831 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:47:00.830+0000 7f72b153b700 1 -- 192.168.123.105:0/4085091647 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f72ac1aa4f0 con 0x7f72ac1a5500
2026-03-10T07:47:00.836 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:47:00.830+0000 7f72b153b700 1 -- 192.168.123.105:0/4085091647 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f72ac10d560 con 0x7f72ac1a5500
2026-03-10T07:47:00.837 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:47:00.835+0000 7f72a97fa700 1 -- 192.168.123.105:0/4085091647 <== mon.0 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 18) v1 ==== 90252+0+0 (secure 0 0 0) 0x7f72a0014440 con 0x7f72ac1a5500
2026-03-10T07:47:00.837 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:47:00.835+0000 7f72a97fa700 1 --2- 192.168.123.105:0/4085091647 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f729406c680 0x7f729406eb40 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T07:47:00.837 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:47:00.835+0000 7f72abfff700 1 --2- 192.168.123.105:0/4085091647 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f729406c680 0x7f729406eb40 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-10T07:47:00.837 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:47:00.836+0000 7f72abfff700 1 --2- 192.168.123.105:0/4085091647 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f729406c680 0x7f729406eb40 secure :-1 s=READY pgs=24 cs=0 l=1 rev1=1 crypto rx=0x7f729c009b20 tx=0x7f729c000bc0 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-10T07:47:00.837 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:47:00.836+0000 7f72a97fa700 1 -- 192.168.123.105:0/4085091647 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(5..5 src has 1..5) v4 ==== 1198+0+0 (secure 0 0 0) 0x7f72a008a170 con 0x7f72ac1a5500
2026-03-10T07:47:00.837 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:47:00.836+0000 7f72a97fa700 1 -- 192.168.123.105:0/4085091647 <== mon.0 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7f72a008a5a0 con 0x7f72ac1a5500
2026-03-10T07:47:00.962 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:00 vm05 ceph-mon[50387]: mgrmap e18: vm05.blexke(active, since 16s), standbys: vm08.orfpog
2026-03-10T07:47:00.962 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:00 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "mgr metadata", "who": "vm08.orfpog", "id": "vm08.orfpog"}]: dispatch
2026-03-10T07:47:00.962 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:00 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch
2026-03-10T07:47:00.962 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:00 vm05 ceph-mon[50387]: from='client.? 192.168.123.108:0/2773066084' entity='client.admin' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
2026-03-10T07:47:00.962 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:00 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "mon metadata", "id": "vm08"}]: dispatch
2026-03-10T07:47:00.962 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:00 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke'
2026-03-10T07:47:00.962 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:00 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke'
2026-03-10T07:47:00.962 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:00 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-10T07:47:00.962 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:00 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
2026-03-10T07:47:00.962 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:47:00.959+0000 7f72b153b700 1 -- 192.168.123.105:0/4085091647 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "config generate-minimal-conf"} v 0) v1 -- 0x7f72ac04f000 con 0x7f72ac1a5500
2026-03-10T07:47:00.962 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:47:00.961+0000 7f72a97fa700 1 -- 192.168.123.105:0/4085091647 <== mon.0 v2:192.168.123.105:3300/0 7 ==== mon_command_ack([{"prefix": "config generate-minimal-conf"}]=0 v10) v1 ==== 76+0+235 (secure 0 0 0) 0x7f72ac04f000 con 0x7f72ac1a5500
2026-03-10T07:47:00.963 INFO:teuthology.orchestra.run.vm05.stdout:# minimal ceph.conf for 12e9780e-1c55-11f1-8896-79f7c2e9b508
2026-03-10T07:47:00.963 INFO:teuthology.orchestra.run.vm05.stdout:[global]
2026-03-10T07:47:00.963 INFO:teuthology.orchestra.run.vm05.stdout: fsid = 12e9780e-1c55-11f1-8896-79f7c2e9b508
2026-03-10T07:47:00.963 INFO:teuthology.orchestra.run.vm05.stdout: mon_host = [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0]
2026-03-10T07:47:00.965 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:47:00.964+0000 7f729affd700 1 -- 192.168.123.105:0/4085091647 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f729406c680 msgr2=0x7f729406eb40 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T07:47:00.965 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:47:00.964+0000 7f729affd700 1 --2- 192.168.123.105:0/4085091647 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f729406c680 0x7f729406eb40 secure :-1 s=READY pgs=24 cs=0 l=1 rev1=1 crypto rx=0x7f729c009b20 tx=0x7f729c000bc0 comp rx=0 tx=0).stop
2026-03-10T07:47:00.966 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:47:00.965+0000 7f729affd700 1 -- 192.168.123.105:0/4085091647 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f72ac1a5500 msgr2=0x7f72ac1a9780 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T07:47:00.966 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:47:00.965+0000 7f729affd700 1 --2- 192.168.123.105:0/4085091647 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f72ac1a5500 0x7f72ac1a9780 secure :-1 s=READY pgs=164 cs=0 l=1 rev1=1 crypto rx=0x7f72a000ba70 tx=0x7f72a000be30 comp rx=0 tx=0).stop
2026-03-10T07:47:00.966 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:47:00.965+0000 7f729affd700 1 -- 192.168.123.105:0/4085091647 shutdown_connections
2026-03-10T07:47:00.966 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:47:00.965+0000 7f729affd700 1 --2- 192.168.123.105:0/4085091647 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f729406c680 0x7f729406eb40 unknown :-1 s=CLOSED pgs=24 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T07:47:00.966 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:47:00.965+0000 7f729affd700 1 --2- 192.168.123.105:0/4085091647 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f72ac107ff0 0x7f72ac1a4fc0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T07:47:00.966 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:47:00.965+0000 7f729affd700 1 --2- 192.168.123.105:0/4085091647 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f72ac1a5500 0x7f72ac1a9780 unknown :-1 s=CLOSED pgs=164 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T07:47:00.966 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:47:00.966+0000 7f729affd700 1 -- 192.168.123.105:0/4085091647 >> 192.168.123.105:0/4085091647 conn(0x7f72ac06c970 msgr2=0x7f72ac10e850 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-10T07:47:00.967 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:47:00.966+0000 7f729affd700 1 -- 192.168.123.105:0/4085091647 shutdown_connections
2026-03-10T07:47:00.967 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:47:00.966+0000 7f729affd700 1 -- 192.168.123.105:0/4085091647 wait complete.
2026-03-10T07:47:01.030 INFO:tasks.cephadm:Distributing (final) config and client.admin keyring...
2026-03-10T07:47:01.030 DEBUG:teuthology.orchestra.run.vm05:> set -ex
2026-03-10T07:47:01.030 DEBUG:teuthology.orchestra.run.vm05:> sudo dd of=/etc/ceph/ceph.conf
2026-03-10T07:47:01.066 DEBUG:teuthology.orchestra.run.vm05:> set -ex
2026-03-10T07:47:01.067 DEBUG:teuthology.orchestra.run.vm05:> sudo dd of=/etc/ceph/ceph.client.admin.keyring
2026-03-10T07:47:01.140 DEBUG:teuthology.orchestra.run.vm08:> set -ex
2026-03-10T07:47:01.140 DEBUG:teuthology.orchestra.run.vm08:> sudo dd of=/etc/ceph/ceph.conf
2026-03-10T07:47:01.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:00 vm08 ceph-mon[59917]: mgrmap e18: vm05.blexke(active, since 16s), standbys: vm08.orfpog
2026-03-10T07:47:01.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:00 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "mgr metadata", "who": "vm08.orfpog", "id": "vm08.orfpog"}]: dispatch
2026-03-10T07:47:01.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:00 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch
2026-03-10T07:47:01.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:00 vm08 ceph-mon[59917]: from='client.? 192.168.123.108:0/2773066084' entity='client.admin' cmd=[{"prefix": "mon dump", "format": "json"}]: dispatch
2026-03-10T07:47:01.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:00 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "mon metadata", "id": "vm08"}]: dispatch
2026-03-10T07:47:01.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:00 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke'
2026-03-10T07:47:01.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:00 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke'
2026-03-10T07:47:01.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:00 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-10T07:47:01.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:00 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
2026-03-10T07:47:01.171 DEBUG:teuthology.orchestra.run.vm08:> set -ex
2026-03-10T07:47:01.171 DEBUG:teuthology.orchestra.run.vm08:> sudo dd of=/etc/ceph/ceph.client.admin.keyring
2026-03-10T07:47:01.242 INFO:tasks.cephadm:Deploying OSDs...
2026-03-10T07:47:01.242 DEBUG:teuthology.orchestra.run.vm05:> set -ex
2026-03-10T07:47:01.242 DEBUG:teuthology.orchestra.run.vm05:> dd if=/scratch_devs of=/dev/stdout
2026-03-10T07:47:01.256 DEBUG:teuthology.orchestra.run:got remote process result: 1
2026-03-10T07:47:01.256 DEBUG:teuthology.orchestra.run.vm05:> ls /dev/[sv]d?
2026-03-10T07:47:01.316 INFO:teuthology.orchestra.run.vm05.stdout:/dev/vda
2026-03-10T07:47:01.317 INFO:teuthology.orchestra.run.vm05.stdout:/dev/vdb
2026-03-10T07:47:01.317 INFO:teuthology.orchestra.run.vm05.stdout:/dev/vdc
2026-03-10T07:47:01.317 INFO:teuthology.orchestra.run.vm05.stdout:/dev/vdd
2026-03-10T07:47:01.317 INFO:teuthology.orchestra.run.vm05.stdout:/dev/vde
2026-03-10T07:47:01.317 WARNING:teuthology.misc:Removing root device: /dev/vda from device list
2026-03-10T07:47:01.317 DEBUG:teuthology.misc:devs=['/dev/vdb', '/dev/vdc', '/dev/vdd', '/dev/vde']
2026-03-10T07:47:01.317 DEBUG:teuthology.orchestra.run.vm05:> stat /dev/vdb
2026-03-10T07:47:01.377 INFO:teuthology.orchestra.run.vm05.stdout: File: /dev/vdb
2026-03-10T07:47:01.377 INFO:teuthology.orchestra.run.vm05.stdout: Size: 0 Blocks: 0 IO Block: 512 block special file
2026-03-10T07:47:01.377 INFO:teuthology.orchestra.run.vm05.stdout:Device: 6h/6d Inode: 254 Links: 1 Device type: fc,10
2026-03-10T07:47:01.377 INFO:teuthology.orchestra.run.vm05.stdout:Access: (0660/brw-rw----) Uid: ( 0/ root) Gid: ( 6/ disk)
2026-03-10T07:47:01.377 INFO:teuthology.orchestra.run.vm05.stdout:Context: system_u:object_r:fixed_disk_device_t:s0
2026-03-10T07:47:01.377 INFO:teuthology.orchestra.run.vm05.stdout:Access: 2026-03-10 07:46:13.814202293 +0000
2026-03-10T07:47:01.377 INFO:teuthology.orchestra.run.vm05.stdout:Modify: 2026-03-10 07:40:48.053000000 +0000
2026-03-10T07:47:01.377 INFO:teuthology.orchestra.run.vm05.stdout:Change: 2026-03-10 07:40:48.053000000 +0000
2026-03-10T07:47:01.377 INFO:teuthology.orchestra.run.vm05.stdout: Birth: 2026-03-10 07:40:46.244000000 +0000
2026-03-10T07:47:01.377 DEBUG:teuthology.orchestra.run.vm05:> sudo dd if=/dev/vdb of=/dev/null count=1
2026-03-10T07:47:01.445 INFO:teuthology.orchestra.run.vm05.stderr:1+0 records in
2026-03-10T07:47:01.445 INFO:teuthology.orchestra.run.vm05.stderr:1+0 records out
2026-03-10T07:47:01.445 INFO:teuthology.orchestra.run.vm05.stderr:512 bytes copied, 0.000598279 s, 856 kB/s
2026-03-10T07:47:01.446 DEBUG:teuthology.orchestra.run.vm05:> ! mount | grep -v devtmpfs | grep -q /dev/vdb
2026-03-10T07:47:01.513 DEBUG:teuthology.orchestra.run.vm05:> stat /dev/vdc
2026-03-10T07:47:01.571 INFO:teuthology.orchestra.run.vm05.stdout: File: /dev/vdc
2026-03-10T07:47:01.571 INFO:teuthology.orchestra.run.vm05.stdout: Size: 0 Blocks: 0 IO Block: 512 block special file
2026-03-10T07:47:01.571 INFO:teuthology.orchestra.run.vm05.stdout:Device: 6h/6d Inode: 255 Links: 1 Device type: fc,20
2026-03-10T07:47:01.571 INFO:teuthology.orchestra.run.vm05.stdout:Access: (0660/brw-rw----) Uid: ( 0/ root) Gid: ( 6/ disk)
2026-03-10T07:47:01.571 INFO:teuthology.orchestra.run.vm05.stdout:Context: system_u:object_r:fixed_disk_device_t:s0
2026-03-10T07:47:01.571 INFO:teuthology.orchestra.run.vm05.stdout:Access: 2026-03-10 07:46:13.879201995 +0000
2026-03-10T07:47:01.571 INFO:teuthology.orchestra.run.vm05.stdout:Modify: 2026-03-10 07:40:48.043000000 +0000
2026-03-10T07:47:01.571 INFO:teuthology.orchestra.run.vm05.stdout:Change: 2026-03-10 07:40:48.043000000 +0000
2026-03-10T07:47:01.571 INFO:teuthology.orchestra.run.vm05.stdout: Birth: 2026-03-10 07:40:46.250000000 +0000
2026-03-10T07:47:01.571 DEBUG:teuthology.orchestra.run.vm05:> sudo dd if=/dev/vdc of=/dev/null count=1
2026-03-10T07:47:01.635 INFO:teuthology.orchestra.run.vm05.stderr:1+0 records in
2026-03-10T07:47:01.635 INFO:teuthology.orchestra.run.vm05.stderr:1+0 records out
2026-03-10T07:47:01.635 INFO:teuthology.orchestra.run.vm05.stderr:512 bytes copied, 0.000196538 s, 2.6 MB/s
2026-03-10T07:47:01.636 DEBUG:teuthology.orchestra.run.vm05:> ! mount | grep -v devtmpfs | grep -q /dev/vdc
2026-03-10T07:47:01.717 DEBUG:teuthology.orchestra.run.vm05:> stat /dev/vdd
2026-03-10T07:47:01.800 INFO:teuthology.orchestra.run.vm05.stdout: File: /dev/vdd
2026-03-10T07:47:01.800 INFO:teuthology.orchestra.run.vm05.stdout: Size: 0 Blocks: 0 IO Block: 512 block special file
2026-03-10T07:47:01.800 INFO:teuthology.orchestra.run.vm05.stdout:Device: 6h/6d Inode: 256 Links: 1 Device type: fc,30
2026-03-10T07:47:01.800 INFO:teuthology.orchestra.run.vm05.stdout:Access: (0660/brw-rw----) Uid: ( 0/ root) Gid: ( 6/ disk)
2026-03-10T07:47:01.800 INFO:teuthology.orchestra.run.vm05.stdout:Context: system_u:object_r:fixed_disk_device_t:s0
2026-03-10T07:47:01.800 INFO:teuthology.orchestra.run.vm05.stdout:Access: 2026-03-10 07:46:13.932201753 +0000
2026-03-10T07:47:01.800 INFO:teuthology.orchestra.run.vm05.stdout:Modify: 2026-03-10 07:40:48.042000000 +0000
2026-03-10T07:47:01.800 INFO:teuthology.orchestra.run.vm05.stdout:Change: 2026-03-10 07:40:48.042000000 +0000
2026-03-10T07:47:01.800 INFO:teuthology.orchestra.run.vm05.stdout: Birth: 2026-03-10 07:40:46.255000000 +0000
2026-03-10T07:47:01.800 DEBUG:teuthology.orchestra.run.vm05:> sudo dd if=/dev/vdd of=/dev/null count=1
2026-03-10T07:47:01.873 INFO:teuthology.orchestra.run.vm05.stderr:1+0 records in
2026-03-10T07:47:01.873 INFO:teuthology.orchestra.run.vm05.stderr:1+0 records out
2026-03-10T07:47:01.873 INFO:teuthology.orchestra.run.vm05.stderr:512 bytes copied, 0.000140232 s, 3.7 MB/s
2026-03-10T07:47:01.874 DEBUG:teuthology.orchestra.run.vm05:> ! mount | grep -v devtmpfs | grep -q /dev/vdd
2026-03-10T07:47:01.936 DEBUG:teuthology.orchestra.run.vm05:> stat /dev/vde
2026-03-10T07:47:01.996 INFO:teuthology.orchestra.run.vm05.stdout: File: /dev/vde
2026-03-10T07:47:01.996 INFO:teuthology.orchestra.run.vm05.stdout: Size: 0 Blocks: 0 IO Block: 512 block special file
2026-03-10T07:47:01.996 INFO:teuthology.orchestra.run.vm05.stdout:Device: 6h/6d Inode: 257 Links: 1 Device type: fc,40
2026-03-10T07:47:01.996 INFO:teuthology.orchestra.run.vm05.stdout:Access: (0660/brw-rw----) Uid: ( 0/ root) Gid: ( 6/ disk)
2026-03-10T07:47:01.996 INFO:teuthology.orchestra.run.vm05.stdout:Context: system_u:object_r:fixed_disk_device_t:s0
2026-03-10T07:47:01.997 INFO:teuthology.orchestra.run.vm05.stdout:Access: 2026-03-10 07:46:13.984201514 +0000
2026-03-10T07:47:01.997 INFO:teuthology.orchestra.run.vm05.stdout:Modify: 2026-03-10 07:40:48.034000000 +0000
2026-03-10T07:47:01.997 INFO:teuthology.orchestra.run.vm05.stdout:Change: 2026-03-10 07:40:48.034000000 +0000
2026-03-10T07:47:01.997 INFO:teuthology.orchestra.run.vm05.stdout: Birth: 2026-03-10 07:40:46.260000000 +0000
2026-03-10T07:47:01.997 DEBUG:teuthology.orchestra.run.vm05:> sudo dd if=/dev/vde of=/dev/null count=1
2026-03-10T07:47:02.062 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:01 vm05 ceph-mon[50387]: Updating vm05:/etc/ceph/ceph.conf
2026-03-10T07:47:02.063 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:01 vm05 ceph-mon[50387]: Updating vm08:/etc/ceph/ceph.conf
2026-03-10T07:47:02.063 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:01 vm05 ceph-mon[50387]: Updating vm08:/var/lib/ceph/12e9780e-1c55-11f1-8896-79f7c2e9b508/config/ceph.conf
2026-03-10T07:47:02.063 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:01 vm05 ceph-mon[50387]: from='client.? 192.168.123.105:0/4085091647' entity='client.admin' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-10T07:47:02.063 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:01 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke'
2026-03-10T07:47:02.063 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:01 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke'
2026-03-10T07:47:02.063 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:01 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke'
2026-03-10T07:47:02.063 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:01 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke'
2026-03-10T07:47:02.063 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:01 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke'
2026-03-10T07:47:02.063 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:01 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "auth get", "entity": "mon."}]: dispatch
2026-03-10T07:47:02.063 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:01 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "config get", "who": "mon", "key": "public_network"}]: dispatch
2026-03-10T07:47:02.063 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:01 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-10T07:47:02.065 INFO:teuthology.orchestra.run.vm05.stderr:1+0 records in
2026-03-10T07:47:02.065 INFO:teuthology.orchestra.run.vm05.stderr:1+0 records out
2026-03-10T07:47:02.065 INFO:teuthology.orchestra.run.vm05.stderr:512 bytes copied, 0.000140643 s, 3.6 MB/s
2026-03-10T07:47:02.067 DEBUG:teuthology.orchestra.run.vm05:> ! mount | grep -v devtmpfs | grep -q /dev/vde
2026-03-10T07:47:02.148 DEBUG:teuthology.orchestra.run.vm08:> set -ex
2026-03-10T07:47:02.148 DEBUG:teuthology.orchestra.run.vm08:> dd if=/scratch_devs of=/dev/stdout
2026-03-10T07:47:02.163 DEBUG:teuthology.orchestra.run:got remote process result: 1
2026-03-10T07:47:02.164 DEBUG:teuthology.orchestra.run.vm08:> ls /dev/[sv]d?
2026-03-10T07:47:02.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:01 vm08 ceph-mon[59917]: Updating vm05:/etc/ceph/ceph.conf
2026-03-10T07:47:02.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:01 vm08 ceph-mon[59917]: Updating vm08:/etc/ceph/ceph.conf
2026-03-10T07:47:02.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:01 vm08 ceph-mon[59917]: Updating vm08:/var/lib/ceph/12e9780e-1c55-11f1-8896-79f7c2e9b508/config/ceph.conf
2026-03-10T07:47:02.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:01 vm08 ceph-mon[59917]: from='client.? 192.168.123.105:0/4085091647' entity='client.admin' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-10T07:47:02.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:01 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke'
2026-03-10T07:47:02.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:01 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke'
2026-03-10T07:47:02.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:01 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke'
2026-03-10T07:47:02.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:01 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke'
2026-03-10T07:47:02.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:01 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke'
2026-03-10T07:47:02.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:01 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "auth get", "entity": "mon."}]: dispatch
2026-03-10T07:47:02.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:01 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "config get", "who": "mon", "key": "public_network"}]: dispatch
2026-03-10T07:47:02.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:01 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-10T07:47:02.185 INFO:teuthology.orchestra.run.vm08.stdout:/dev/vda
2026-03-10T07:47:02.185 INFO:teuthology.orchestra.run.vm08.stdout:/dev/vdb
2026-03-10T07:47:02.185 INFO:teuthology.orchestra.run.vm08.stdout:/dev/vdc
2026-03-10T07:47:02.185 INFO:teuthology.orchestra.run.vm08.stdout:/dev/vdd
2026-03-10T07:47:02.185 INFO:teuthology.orchestra.run.vm08.stdout:/dev/vde
2026-03-10T07:47:02.185 WARNING:teuthology.misc:Removing root device: /dev/vda from device list
2026-03-10T07:47:02.185 DEBUG:teuthology.misc:devs=['/dev/vdb', '/dev/vdc', '/dev/vdd', '/dev/vde']
2026-03-10T07:47:02.185 DEBUG:teuthology.orchestra.run.vm08:> stat /dev/vdb
2026-03-10T07:47:02.244 INFO:teuthology.orchestra.run.vm08.stdout: File: /dev/vdb
2026-03-10T07:47:02.244 INFO:teuthology.orchestra.run.vm08.stdout: Size: 0 Blocks: 0 IO Block: 512 block special file
2026-03-10T07:47:02.244 INFO:teuthology.orchestra.run.vm08.stdout:Device: 6h/6d Inode: 221 Links: 1 Device type: fc,10
2026-03-10T07:47:02.244 INFO:teuthology.orchestra.run.vm08.stdout:Access: (0660/brw-rw----) Uid: ( 0/ root) Gid: ( 6/ disk)
2026-03-10T07:47:02.244 INFO:teuthology.orchestra.run.vm08.stdout:Context: system_u:object_r:fixed_disk_device_t:s0
2026-03-10T07:47:02.244 INFO:teuthology.orchestra.run.vm08.stdout:Access: 2026-03-10 07:46:45.595719144 +0000
2026-03-10T07:47:02.244 INFO:teuthology.orchestra.run.vm08.stdout:Modify: 2026-03-10 07:41:25.052000000 +0000
2026-03-10T07:47:02.244 INFO:teuthology.orchestra.run.vm08.stdout:Change: 2026-03-10 07:41:25.052000000 +0000
2026-03-10T07:47:02.244 INFO:teuthology.orchestra.run.vm08.stdout: Birth: 2026-03-10 07:41:23.195000000 +0000
2026-03-10T07:47:02.244 DEBUG:teuthology.orchestra.run.vm08:> sudo dd if=/dev/vdb of=/dev/null count=1
2026-03-10T07:47:02.309 INFO:teuthology.orchestra.run.vm08.stderr:1+0 records in
2026-03-10T07:47:02.309 INFO:teuthology.orchestra.run.vm08.stderr:1+0 records out
2026-03-10T07:47:02.309 INFO:teuthology.orchestra.run.vm08.stderr:512 bytes copied, 0.000191378 s, 2.7 MB/s
2026-03-10T07:47:02.310 DEBUG:teuthology.orchestra.run.vm08:> ! mount | grep -v devtmpfs | grep -q /dev/vdb
2026-03-10T07:47:02.366 DEBUG:teuthology.orchestra.run.vm08:> stat /dev/vdc
2026-03-10T07:47:02.426 INFO:teuthology.orchestra.run.vm08.stdout: File: /dev/vdc
2026-03-10T07:47:02.426 INFO:teuthology.orchestra.run.vm08.stdout: Size: 0 Blocks: 0 IO Block: 512 block special file
2026-03-10T07:47:02.426 INFO:teuthology.orchestra.run.vm08.stdout:Device: 6h/6d Inode: 224 Links: 1 Device type: fc,20
2026-03-10T07:47:02.426 INFO:teuthology.orchestra.run.vm08.stdout:Access: (0660/brw-rw----) Uid: ( 0/ root) Gid: ( 6/ disk)
2026-03-10T07:47:02.426 INFO:teuthology.orchestra.run.vm08.stdout:Context: system_u:object_r:fixed_disk_device_t:s0
2026-03-10T07:47:02.426 INFO:teuthology.orchestra.run.vm08.stdout:Access: 2026-03-10 07:46:45.652719141 +0000
2026-03-10T07:47:02.426 INFO:teuthology.orchestra.run.vm08.stdout:Modify: 2026-03-10 07:41:25.084000000 +0000
2026-03-10T07:47:02.426 INFO:teuthology.orchestra.run.vm08.stdout:Change: 2026-03-10 07:41:25.084000000 +0000
2026-03-10T07:47:02.426 INFO:teuthology.orchestra.run.vm08.stdout: Birth: 2026-03-10 07:41:23.202000000 +0000
2026-03-10T07:47:02.426 DEBUG:teuthology.orchestra.run.vm08:> sudo dd if=/dev/vdc of=/dev/null count=1
2026-03-10T07:47:02.491 INFO:teuthology.orchestra.run.vm08.stderr:1+0 records in
2026-03-10T07:47:02.491 INFO:teuthology.orchestra.run.vm08.stderr:1+0 records out
2026-03-10T07:47:02.491 INFO:teuthology.orchestra.run.vm08.stderr:512 bytes copied, 0.000184394 s, 2.8 MB/s
2026-03-10T07:47:02.492 DEBUG:teuthology.orchestra.run.vm08:> ! mount | grep -v devtmpfs | grep -q /dev/vdc
2026-03-10T07:47:02.550 DEBUG:teuthology.orchestra.run.vm08:> stat /dev/vdd
2026-03-10T07:47:02.609 INFO:teuthology.orchestra.run.vm08.stdout: File: /dev/vdd
2026-03-10T07:47:02.609 INFO:teuthology.orchestra.run.vm08.stdout: Size: 0 Blocks: 0 IO Block: 512 block special file
2026-03-10T07:47:02.609 INFO:teuthology.orchestra.run.vm08.stdout:Device: 6h/6d Inode: 225 Links: 1 Device type: fc,30
2026-03-10T07:47:02.609 INFO:teuthology.orchestra.run.vm08.stdout:Access: (0660/brw-rw----) Uid: ( 0/ root) Gid: ( 6/ disk)
2026-03-10T07:47:02.609 INFO:teuthology.orchestra.run.vm08.stdout:Context: system_u:object_r:fixed_disk_device_t:s0
2026-03-10T07:47:02.609 INFO:teuthology.orchestra.run.vm08.stdout:Access: 2026-03-10 07:46:45.723719139 +0000
2026-03-10T07:47:02.609 INFO:teuthology.orchestra.run.vm08.stdout:Modify: 2026-03-10 07:41:25.059000000 +0000
2026-03-10T07:47:02.609 INFO:teuthology.orchestra.run.vm08.stdout:Change: 2026-03-10 07:41:25.059000000 +0000
2026-03-10T07:47:02.609 INFO:teuthology.orchestra.run.vm08.stdout: Birth: 2026-03-10 07:41:23.210000000 +0000
2026-03-10T07:47:02.610 DEBUG:teuthology.orchestra.run.vm08:> sudo dd if=/dev/vdd of=/dev/null count=1
2026-03-10T07:47:02.674 INFO:teuthology.orchestra.run.vm08.stderr:1+0 records in
2026-03-10T07:47:02.674 INFO:teuthology.orchestra.run.vm08.stderr:1+0 records out
2026-03-10T07:47:02.674 INFO:teuthology.orchestra.run.vm08.stderr:512 bytes copied, 0.000189905 s, 2.7 MB/s
2026-03-10T07:47:02.675 DEBUG:teuthology.orchestra.run.vm08:> ! mount | grep -v devtmpfs | grep -q /dev/vdd
2026-03-10T07:47:02.731 DEBUG:teuthology.orchestra.run.vm08:> stat /dev/vde
2026-03-10T07:47:02.787 INFO:teuthology.orchestra.run.vm08.stdout: File: /dev/vde
2026-03-10T07:47:02.787 INFO:teuthology.orchestra.run.vm08.stdout: Size: 0 Blocks: 0 IO Block: 512 block special file
2026-03-10T07:47:02.787 INFO:teuthology.orchestra.run.vm08.stdout:Device: 6h/6d Inode: 229 Links: 1 Device type: fc,40
2026-03-10T07:47:02.787 INFO:teuthology.orchestra.run.vm08.stdout:Access: (0660/brw-rw----) Uid: ( 0/ root) Gid: ( 6/ disk)
2026-03-10T07:47:02.787 INFO:teuthology.orchestra.run.vm08.stdout:Context: system_u:object_r:fixed_disk_device_t:s0
2026-03-10T07:47:02.787 INFO:teuthology.orchestra.run.vm08.stdout:Access: 2026-03-10 07:46:45.794719136 +0000
2026-03-10T07:47:02.787 INFO:teuthology.orchestra.run.vm08.stdout:Modify: 2026-03-10 07:41:25.098000000 +0000
2026-03-10T07:47:02.787 INFO:teuthology.orchestra.run.vm08.stdout:Change: 2026-03-10 07:41:25.098000000 +0000
2026-03-10T07:47:02.787 INFO:teuthology.orchestra.run.vm08.stdout: Birth: 2026-03-10 07:41:23.217000000 +0000
2026-03-10T07:47:02.788 DEBUG:teuthology.orchestra.run.vm08:> sudo dd if=/dev/vde of=/dev/null count=1
2026-03-10T07:47:02.850 INFO:teuthology.orchestra.run.vm08.stderr:1+0 records in
2026-03-10T07:47:02.850 INFO:teuthology.orchestra.run.vm08.stderr:1+0 records out
2026-03-10T07:47:02.850 INFO:teuthology.orchestra.run.vm08.stderr:512 bytes copied, 0.00018127 s, 2.8 MB/s
2026-03-10T07:47:02.851 DEBUG:teuthology.orchestra.run.vm08:> ! mount | grep -v devtmpfs | grep -q /dev/vde
2026-03-10T07:47:02.911 INFO:tasks.cephadm:Deploying osd.0 on vm05 with /dev/vde...
2026-03-10T07:47:02.911 DEBUG:teuthology.orchestra.run.vm05:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 ceph-volume -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 12e9780e-1c55-11f1-8896-79f7c2e9b508 -- lvm zap /dev/vde
2026-03-10T07:47:03.083 INFO:teuthology.orchestra.run.vm05.stderr:Inferring config /var/lib/ceph/12e9780e-1c55-11f1-8896-79f7c2e9b508/mon.vm05/config
2026-03-10T07:47:03.158 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:02 vm05 ceph-mon[50387]: Updating vm05:/var/lib/ceph/12e9780e-1c55-11f1-8896-79f7c2e9b508/config/ceph.conf
2026-03-10T07:47:03.158 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:02 vm05 ceph-mon[50387]: Reconfiguring mon.vm05 (unknown last config time)...
2026-03-10T07:47:03.158 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:02 vm05 ceph-mon[50387]: Reconfiguring daemon mon.vm05 on vm05
2026-03-10T07:47:03.158 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:02 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke'
2026-03-10T07:47:03.158 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:02 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke'
2026-03-10T07:47:03.158 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:02 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "auth get-or-create", "entity": "mgr.vm05.blexke", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]: dispatch
2026-03-10T07:47:03.158 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:02 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "mgr services"}]: dispatch
2026-03-10T07:47:03.158 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:02 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-10T07:47:03.158 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:02 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke'
2026-03-10T07:47:03.158 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:02 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke'
2026-03-10T07:47:03.158 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:02 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "auth get-or-create", "entity": "client.ceph-exporter.vm05", "caps": ["mon", "profile ceph-exporter", "mon", "allow r", "mgr", "allow r", "osd", "allow r"]}]: dispatch
2026-03-10T07:47:03.158 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:02 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-10T07:47:03.158 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:02 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke'
2026-03-10T07:47:03.158 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:02 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke'
2026-03-10T07:47:03.158 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:02 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "auth get-or-create", "entity": "client.crash.vm05", "caps": ["mon", "profile crash", "mgr", "profile crash"]}]: dispatch
2026-03-10T07:47:03.158 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:02 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-10T07:47:03.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:02 vm08 ceph-mon[59917]: Updating vm05:/var/lib/ceph/12e9780e-1c55-11f1-8896-79f7c2e9b508/config/ceph.conf
2026-03-10T07:47:03.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:02 vm08 ceph-mon[59917]: Reconfiguring mon.vm05 (unknown last config time)...
2026-03-10T07:47:03.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:02 vm08 ceph-mon[59917]: Reconfiguring daemon mon.vm05 on vm05
2026-03-10T07:47:03.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:02 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke'
2026-03-10T07:47:03.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:02 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke'
2026-03-10T07:47:03.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:02 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "auth get-or-create", "entity": "mgr.vm05.blexke", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]: dispatch
2026-03-10T07:47:03.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:02 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "mgr services"}]: dispatch
2026-03-10T07:47:03.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:02 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-10T07:47:03.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:02 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke'
2026-03-10T07:47:03.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:02 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke'
2026-03-10T07:47:03.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:02 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "auth get-or-create", "entity": "client.ceph-exporter.vm05", "caps": ["mon", "profile ceph-exporter", "mon", "allow r", "mgr", "allow r", "osd", "allow r"]}]: dispatch
2026-03-10T07:47:03.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:02 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-10T07:47:03.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:02 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke'
2026-03-10T07:47:03.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:02 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke'
2026-03-10T07:47:03.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:02 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "auth get-or-create", "entity": "client.crash.vm05", "caps": ["mon", "profile crash", "mgr", "profile crash"]}]: dispatch
2026-03-10T07:47:03.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:02 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-10T07:47:03.908 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:03 vm05 ceph-mon[50387]: Reconfiguring mgr.vm05.blexke (unknown last config time)...
2026-03-10T07:47:03.908 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:03 vm05 ceph-mon[50387]: Reconfiguring daemon mgr.vm05.blexke on vm05
2026-03-10T07:47:03.908 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:03 vm05 ceph-mon[50387]: Reconfiguring ceph-exporter.vm05 (monmap changed)...
2026-03-10T07:47:03.908 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:03 vm05 ceph-mon[50387]: Reconfiguring daemon ceph-exporter.vm05 on vm05
2026-03-10T07:47:03.908 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:03 vm05 ceph-mon[50387]: Reconfiguring crash.vm05 (monmap changed)...
2026-03-10T07:47:03.908 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:03 vm05 ceph-mon[50387]: Reconfiguring daemon crash.vm05 on vm05
2026-03-10T07:47:03.908 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:03 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke'
2026-03-10T07:47:03.908 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:03 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke'
2026-03-10T07:47:04.109 INFO:teuthology.orchestra.run.vm05.stdout:
2026-03-10T07:47:04.123 DEBUG:teuthology.orchestra.run.vm05:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 12e9780e-1c55-11f1-8896-79f7c2e9b508 -- ceph orch daemon add osd vm05:/dev/vde
2026-03-10T07:47:04.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:03 vm08 ceph-mon[59917]: Reconfiguring mgr.vm05.blexke (unknown last config time)...
2026-03-10T07:47:04.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:03 vm08 ceph-mon[59917]: Reconfiguring daemon mgr.vm05.blexke on vm05
2026-03-10T07:47:04.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:03 vm08 ceph-mon[59917]: Reconfiguring ceph-exporter.vm05 (monmap changed)...
2026-03-10T07:47:04.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:03 vm08 ceph-mon[59917]: Reconfiguring daemon ceph-exporter.vm05 on vm05
2026-03-10T07:47:04.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:03 vm08 ceph-mon[59917]: Reconfiguring crash.vm05 (monmap changed)...
2026-03-10T07:47:04.169 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:03 vm08 ceph-mon[59917]: Reconfiguring daemon crash.vm05 on vm05
2026-03-10T07:47:04.169 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:03 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke'
2026-03-10T07:47:04.169 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:03 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke'
2026-03-10T07:47:04.350 INFO:teuthology.orchestra.run.vm05.stderr:Inferring config /var/lib/ceph/12e9780e-1c55-11f1-8896-79f7c2e9b508/mon.vm05/config
2026-03-10T07:47:04.648 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:47:04.645+0000 7ff66359e700 1 -- 192.168.123.105:0/2566362560 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff664075a10 msgr2=0x7ff664077ea0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T07:47:04.648 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:47:04.645+0000 7ff66359e700 1 --2- 192.168.123.105:0/2566362560 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff664075a10 0x7ff664077ea0 secure :-1 s=READY pgs=165 cs=0 l=1 rev1=1 crypto rx=0x7ff654007780 tx=0x7ff65400c050 comp rx=0 tx=0).stop
2026-03-10T07:47:04.648 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:47:04.645+0000 7ff66359e700 1 -- 192.168.123.105:0/2566362560 shutdown_connections
2026-03-10T07:47:04.649 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:47:04.645+0000 7ff66359e700 1 --2- 192.168.123.105:0/2566362560 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff664075a10 0x7ff664077ea0 unknown :-1 s=CLOSED pgs=165 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T07:47:04.649 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:47:04.645+0000 7ff66359e700 1 --2- 192.168.123.105:0/2566362560 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7ff664072b20 0x7ff664072f40 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T07:47:04.649 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:47:04.645+0000 7ff66359e700 1 -- 192.168.123.105:0/2566362560 >> 192.168.123.105:0/2566362560 conn(0x7ff66406daa0 msgr2=0x7ff66406ff00 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-10T07:47:04.649 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:47:04.645+0000 7ff66359e700 1 -- 192.168.123.105:0/2566362560 shutdown_connections
2026-03-10T07:47:04.649 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:47:04.645+0000 7ff66359e700 1 -- 192.168.123.105:0/2566362560 wait complete.
2026-03-10T07:47:04.649 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:47:04.645+0000 7ff66359e700 1 Processor -- start
2026-03-10T07:47:04.649 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:47:04.645+0000 7ff66359e700 1 -- start start
2026-03-10T07:47:04.650 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:47:04.645+0000 7ff66359e700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff664072b20 0x7ff664083160 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T07:47:04.650 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:47:04.645+0000 7ff66359e700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7ff6640836a0 0x7ff6641b31b0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T07:47:04.650 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:47:04.645+0000 7ff66359e700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7ff664083b20 con 0x7ff664072b20
2026-03-10T07:47:04.650 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:47:04.645+0000 7ff66359e700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7ff664083c90 con 0x7ff6640836a0
2026-03-10T07:47:04.650 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:47:04.645+0000 7ff66259c700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff664072b20 0x7ff664083160 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-10T07:47:04.650 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:47:04.645+0000 7ff66259c700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff664072b20 0x7ff664083160 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.105:53022/0 (socket says 192.168.123.105:53022)
2026-03-10T07:47:04.650 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:47:04.645+0000 7ff66259c700 1 -- 192.168.123.105:0/944024763 learned_addr learned my addr 192.168.123.105:0/944024763 (peer_addr_for_me v2:192.168.123.105:0/0)
2026-03-10T07:47:04.650 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:47:04.646+0000 7ff661d9b700 1 --2- 192.168.123.105:0/944024763 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7ff6640836a0 0x7ff6641b31b0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-10T07:47:04.650 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:47:04.647+0000 7ff66259c700 1 -- 192.168.123.105:0/944024763 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7ff6640836a0 msgr2=0x7ff6641b31b0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T07:47:04.651 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:47:04.647+0000 7ff66259c700 1 --2- 192.168.123.105:0/944024763 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7ff6640836a0 0x7ff6641b31b0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T07:47:04.651 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:47:04.647+0000 7ff66259c700 1 -- 192.168.123.105:0/944024763 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7ff654007430 con 0x7ff664072b20
2026-03-10T07:47:04.651 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:47:04.648+0000 7ff66259c700 1 --2- 192.168.123.105:0/944024763 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff664072b20 0x7ff664083160 secure :-1 s=READY pgs=166 cs=0 l=1 rev1=1 crypto rx=0x7ff65c009fd0 tx=0x7ff65c051470 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-10T07:47:04.651 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:47:04.650+0000 7ff6537fe700 1 -- 192.168.123.105:0/944024763 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7ff65c051ee0 con 0x7ff664072b20
2026-03-10T07:47:04.652 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:47:04.650+0000 7ff66359e700 1 -- 192.168.123.105:0/944024763 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7ff664107df0 con 0x7ff664072b20
2026-03-10T07:47:04.652 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:47:04.650+0000 7ff66359e700 1 -- 192.168.123.105:0/944024763 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7ff664108310 con 0x7ff664072b20
2026-03-10T07:47:04.653 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:47:04.652+0000 7ff6537fe700 1 -- 192.168.123.105:0/944024763 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7ff65c05a070 con 0x7ff664072b20
2026-03-10T07:47:04.653 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:47:04.652+0000 7ff6537fe700 1 -- 192.168.123.105:0/944024763 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7ff65c061ba0 con 0x7ff664072b20
2026-03-10T07:47:04.653 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:47:04.652+0000 7ff6537fe700 1 -- 192.168.123.105:0/944024763 <== mon.0 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 18) v1 ==== 90252+0+0 (secure 0 0 0) 0x7ff65c06b430 con 0x7ff664072b20
2026-03-10T07:47:04.653 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:47:04.652+0000 7ff6537fe700 1 --2- 192.168.123.105:0/944024763 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7ff64c06c650 0x7ff64c06eb10 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T07:47:04.655 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:47:04.653+0000 7ff661d9b700 1 --2- 192.168.123.105:0/944024763 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7ff64c06c650 0x7ff64c06eb10 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-10T07:47:04.655 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:47:04.654+0000 7ff661d9b700 1 --2- 192.168.123.105:0/944024763 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7ff64c06c650 0x7ff64c06eb10 secure :-1 s=READY pgs=26 cs=0 l=1 rev1=1 crypto rx=0x7ff65400c4d0 tx=0x7ff6540058e0 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-10T07:47:04.655 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:47:04.655+0000 7ff6537fe700 1 -- 192.168.123.105:0/944024763 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(5..5 src has 1..5) v4 ==== 1198+0+0 (secure 0 0 0) 0x7ff65c05c070 con 0x7ff664072b20
2026-03-10T07:47:04.656 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:47:04.655+0000 7ff66359e700 1 -- 192.168.123.105:0/944024763 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7ff644005320 con 0x7ff664072b20
2026-03-10T07:47:04.659 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:47:04.658+0000 7ff6537fe700 1 -- 192.168.123.105:0/944024763 <== mon.0 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7ff65c09f560 con 0x7ff664072b20
2026-03-10T07:47:04.812 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:47:04.811+0000 7ff66359e700 1 -- 192.168.123.105:0/944024763 --> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] -- mgr_command(tid 0: {"prefix": "orch daemon add osd", "svc_arg": "vm05:/dev/vde", "target": ["mon-mgr", ""]}) v1 -- 0x7ff644000bf0 con 0x7ff64c06c650
2026-03-10T07:47:04.908 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:04 vm05 ceph-mon[50387]: Reconfiguring alertmanager.vm05 (dependencies changed)...
2026-03-10T07:47:04.908 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:04 vm05 ceph-mon[50387]: Reconfiguring daemon alertmanager.vm05 on vm05
2026-03-10T07:47:04.908 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:04 vm05 ceph-mon[50387]: pgmap v3: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail
2026-03-10T07:47:05.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:04 vm08 ceph-mon[59917]: Reconfiguring alertmanager.vm05 (dependencies changed)...
2026-03-10T07:47:05.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:04 vm08 ceph-mon[59917]: Reconfiguring daemon alertmanager.vm05 on vm05
2026-03-10T07:47:05.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:04 vm08 ceph-mon[59917]: pgmap v3: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail
2026-03-10T07:47:05.847 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:05 vm05 ceph-mon[50387]: from='client.14262 -' entity='client.admin' cmd=[{"prefix": "orch daemon add osd", "svc_arg": "vm05:/dev/vde", "target": ["mon-mgr", ""]}]: dispatch
2026-03-10T07:47:05.847 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:05 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke'
2026-03-10T07:47:05.847 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:05 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
2026-03-10T07:47:05.847 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:05 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke'
2026-03-10T07:47:05.847 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:05 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
2026-03-10T07:47:05.847 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:05 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-10T07:47:05.847 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:05 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke'
2026-03-10T07:47:05.847 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:05 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke'
2026-03-10T07:47:06.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:05 vm08 ceph-mon[59917]: from='client.14262 -' entity='client.admin' cmd=[{"prefix": "orch daemon add osd", "svc_arg": "vm05:/dev/vde", "target": ["mon-mgr", ""]}]: dispatch
2026-03-10T07:47:06.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:05 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke'
2026-03-10T07:47:06.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:05 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
2026-03-10T07:47:06.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:05 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke'
2026-03-10T07:47:06.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:05 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
2026-03-10T07:47:06.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:05 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-10T07:47:06.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:05 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke'
2026-03-10T07:47:06.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:05 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke'
2026-03-10T07:47:06.936 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:06 vm05 ceph-mon[50387]: Reconfiguring grafana.vm05 (dependencies changed)...
2026-03-10T07:47:06.936 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:06 vm05 ceph-mon[50387]: Reconfiguring daemon grafana.vm05 on vm05
2026-03-10T07:47:06.936 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:06 vm05 ceph-mon[50387]: pgmap v4: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail
2026-03-10T07:47:06.936 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:06 vm05 ceph-mon[50387]: Reconfiguring prometheus.vm05 (dependencies changed)...
2026-03-10T07:47:06.936 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:06 vm05 ceph-mon[50387]: from='client.? 192.168.123.105:0/1677563936' entity='client.bootstrap-osd' cmd=[{"prefix": "osd new", "uuid": "b7030a1b-b939-419d-85b0-9e818f756cd8"}]: dispatch
2026-03-10T07:47:06.936 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:06 vm05 ceph-mon[50387]: from='client.? 192.168.123.105:0/1677563936' entity='client.bootstrap-osd' cmd='[{"prefix": "osd new", "uuid": "b7030a1b-b939-419d-85b0-9e818f756cd8"}]': finished
2026-03-10T07:47:06.936 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:06 vm05 ceph-mon[50387]: osdmap e6: 1 total, 0 up, 1 in
2026-03-10T07:47:06.936 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:06 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "osd metadata", "id": 0}]: dispatch
2026-03-10T07:47:07.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:06 vm08 ceph-mon[59917]: Reconfiguring grafana.vm05 (dependencies changed)...
2026-03-10T07:47:07.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:06 vm08 ceph-mon[59917]: Reconfiguring daemon grafana.vm05 on vm05
2026-03-10T07:47:07.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:06 vm08 ceph-mon[59917]: pgmap v4: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail
2026-03-10T07:47:07.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:06 vm08 ceph-mon[59917]: Reconfiguring prometheus.vm05 (dependencies changed)...
2026-03-10T07:47:07.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:06 vm08 ceph-mon[59917]: from='client.? 192.168.123.105:0/1677563936' entity='client.bootstrap-osd' cmd=[{"prefix": "osd new", "uuid": "b7030a1b-b939-419d-85b0-9e818f756cd8"}]: dispatch
2026-03-10T07:47:07.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:06 vm08 ceph-mon[59917]: from='client.? 192.168.123.105:0/1677563936' entity='client.bootstrap-osd' cmd='[{"prefix": "osd new", "uuid": "b7030a1b-b939-419d-85b0-9e818f756cd8"}]': finished
2026-03-10T07:47:07.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:06 vm08 ceph-mon[59917]: osdmap e6: 1 total, 0 up, 1 in
2026-03-10T07:47:07.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:06 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "osd metadata", "id": 0}]: dispatch
2026-03-10T07:47:08.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:07 vm05 ceph-mon[50387]: Reconfiguring daemon prometheus.vm05 on vm05
2026-03-10T07:47:08.158 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:07 vm05 ceph-mon[50387]: from='client.? 192.168.123.105:0/1425869221' entity='client.bootstrap-osd' cmd=[{"prefix": "mon getmap"}]: dispatch
2026-03-10T07:47:08.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:07 vm08 ceph-mon[59917]: Reconfiguring daemon prometheus.vm05 on vm05
2026-03-10T07:47:08.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:07 vm08 ceph-mon[59917]: from='client.? 192.168.123.105:0/1425869221' entity='client.bootstrap-osd' cmd=[{"prefix": "mon getmap"}]: dispatch
2026-03-10T07:47:09.158 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:08 vm05 ceph-mon[50387]: pgmap v6: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail
2026-03-10T07:47:09.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:08 vm08 ceph-mon[59917]: pgmap v6: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail
2026-03-10T07:47:11.094 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:10 vm08 ceph-mon[59917]: pgmap v7: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail
2026-03-10T07:47:11.094 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:10 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke'
2026-03-10T07:47:11.094 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:10 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke'
2026-03-10T07:47:11.094 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:10 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "auth get-or-create", "entity": "client.ceph-exporter.vm08", "caps": ["mon", "profile ceph-exporter", "mon", "allow r", "mgr", "allow r", "osd", "allow r"]}]: dispatch
2026-03-10T07:47:11.094 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:10 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-10T07:47:11.094 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:10 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke'
2026-03-10T07:47:11.094 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:10 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke'
2026-03-10T07:47:11.151 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:10 vm05 ceph-mon[50387]: pgmap v7: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail
2026-03-10T07:47:11.151 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:10 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke'
2026-03-10T07:47:11.151 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:10 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke'
2026-03-10T07:47:11.151 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:10 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "auth get-or-create", "entity": "client.ceph-exporter.vm08", "caps": ["mon", "profile ceph-exporter", "mon", "allow r", "mgr", "allow r", "osd", "allow r"]}]: dispatch
2026-03-10T07:47:11.151 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:10 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-10T07:47:11.151 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:10 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke'
2026-03-10T07:47:11.151 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:10 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke'
2026-03-10T07:47:12.049 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:11 vm05 ceph-mon[50387]: Reconfiguring ceph-exporter.vm08 (monmap changed)...
2026-03-10T07:47:12.049 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:11 vm05 ceph-mon[50387]: Reconfiguring daemon ceph-exporter.vm08 on vm08
2026-03-10T07:47:12.049 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:11 vm05 ceph-mon[50387]: Reconfiguring crash.vm08 (monmap changed)...
2026-03-10T07:47:12.049 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:11 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "auth get-or-create", "entity": "client.crash.vm08", "caps": ["mon", "profile crash", "mgr", "profile crash"]}]: dispatch
2026-03-10T07:47:12.049 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:11 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-10T07:47:12.049 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:11 vm05 ceph-mon[50387]: Reconfiguring daemon crash.vm08 on vm08
2026-03-10T07:47:12.049 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:11 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "auth get", "entity": "osd.0"}]: dispatch
2026-03-10T07:47:12.049 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:11 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-10T07:47:12.049 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:11 vm05 ceph-mon[50387]: Deploying daemon osd.0 on vm05
2026-03-10T07:47:12.049 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:11 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke'
2026-03-10T07:47:12.049 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:11 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke'
2026-03-10T07:47:12.049 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:11 vm05 ceph-mon[50387]: Reconfiguring mgr.vm08.orfpog (monmap changed)...
2026-03-10T07:47:12.050 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:11 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "auth get-or-create", "entity": "mgr.vm08.orfpog", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]: dispatch 2026-03-10T07:47:12.050 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:11 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "mgr services"}]: dispatch 2026-03-10T07:47:12.050 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:11 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T07:47:12.050 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:11 vm05 ceph-mon[50387]: Reconfiguring daemon mgr.vm08.orfpog on vm08 2026-03-10T07:47:12.050 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:11 vm05 ceph-mon[50387]: pgmap v8: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail 2026-03-10T07:47:12.050 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:11 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' 2026-03-10T07:47:12.050 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:11 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' 2026-03-10T07:47:12.050 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:11 vm05 ceph-mon[50387]: Reconfiguring mon.vm08 (monmap changed)... 
2026-03-10T07:47:12.050 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:11 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "auth get", "entity": "mon."}]: dispatch 2026-03-10T07:47:12.050 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:11 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "config get", "who": "mon", "key": "public_network"}]: dispatch 2026-03-10T07:47:12.050 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:11 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T07:47:12.050 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:11 vm05 ceph-mon[50387]: Reconfiguring daemon mon.vm08 on vm08 2026-03-10T07:47:12.169 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:11 vm08 ceph-mon[59917]: Reconfiguring ceph-exporter.vm08 (monmap changed)... 2026-03-10T07:47:12.169 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:11 vm08 ceph-mon[59917]: Reconfiguring daemon ceph-exporter.vm08 on vm08 2026-03-10T07:47:12.169 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:11 vm08 ceph-mon[59917]: Reconfiguring crash.vm08 (monmap changed)... 
2026-03-10T07:47:12.169 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:11 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "auth get-or-create", "entity": "client.crash.vm08", "caps": ["mon", "profile crash", "mgr", "profile crash"]}]: dispatch 2026-03-10T07:47:12.169 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:11 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T07:47:12.169 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:11 vm08 ceph-mon[59917]: Reconfiguring daemon crash.vm08 on vm08 2026-03-10T07:47:12.169 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:11 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "auth get", "entity": "osd.0"}]: dispatch 2026-03-10T07:47:12.169 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:11 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T07:47:12.169 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:11 vm08 ceph-mon[59917]: Deploying daemon osd.0 on vm05 2026-03-10T07:47:12.169 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:11 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' 2026-03-10T07:47:12.169 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:11 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' 2026-03-10T07:47:12.169 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:11 vm08 ceph-mon[59917]: Reconfiguring mgr.vm08.orfpog (monmap changed)... 
2026-03-10T07:47:12.169 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:11 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "auth get-or-create", "entity": "mgr.vm08.orfpog", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]: dispatch 2026-03-10T07:47:12.169 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:11 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "mgr services"}]: dispatch 2026-03-10T07:47:12.170 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:11 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T07:47:12.170 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:11 vm08 ceph-mon[59917]: Reconfiguring daemon mgr.vm08.orfpog on vm08 2026-03-10T07:47:12.170 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:11 vm08 ceph-mon[59917]: pgmap v8: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail 2026-03-10T07:47:12.170 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:11 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' 2026-03-10T07:47:12.170 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:11 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' 2026-03-10T07:47:12.170 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:11 vm08 ceph-mon[59917]: Reconfiguring mon.vm08 (monmap changed)... 
2026-03-10T07:47:12.170 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:11 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "auth get", "entity": "mon."}]: dispatch 2026-03-10T07:47:12.170 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:11 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "config get", "who": "mon", "key": "public_network"}]: dispatch 2026-03-10T07:47:12.170 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:11 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T07:47:12.170 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:11 vm08 ceph-mon[59917]: Reconfiguring daemon mon.vm08 on vm08 2026-03-10T07:47:13.158 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:12 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' 2026-03-10T07:47:13.158 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:12 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' 2026-03-10T07:47:13.158 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:12 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "dashboard get-alertmanager-api-host"}]: dispatch 2026-03-10T07:47:13.158 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:12 vm05 ceph-mon[50387]: from='mon.0 -' entity='mon.' cmd=[{"prefix": "dashboard get-alertmanager-api-host"}]: dispatch 2026-03-10T07:47:13.158 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:12 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "dashboard set-alertmanager-api-host", "value": "http://vm05.local:9093"}]: dispatch 2026-03-10T07:47:13.158 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:12 vm05 ceph-mon[50387]: from='mon.0 -' entity='mon.' 
cmd=[{"prefix": "dashboard set-alertmanager-api-host", "value": "http://vm05.local:9093"}]: dispatch 2026-03-10T07:47:13.158 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:12 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' 2026-03-10T07:47:13.158 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:12 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "dashboard get-grafana-api-url"}]: dispatch 2026-03-10T07:47:13.158 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:12 vm05 ceph-mon[50387]: from='mon.0 -' entity='mon.' cmd=[{"prefix": "dashboard get-grafana-api-url"}]: dispatch 2026-03-10T07:47:13.159 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:12 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "dashboard set-grafana-api-url", "value": "https://vm05.local:3000"}]: dispatch 2026-03-10T07:47:13.159 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:12 vm05 ceph-mon[50387]: from='mon.0 -' entity='mon.' cmd=[{"prefix": "dashboard set-grafana-api-url", "value": "https://vm05.local:3000"}]: dispatch 2026-03-10T07:47:13.159 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:12 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' 2026-03-10T07:47:13.159 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:12 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "dashboard get-prometheus-api-host"}]: dispatch 2026-03-10T07:47:13.159 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:12 vm05 ceph-mon[50387]: from='mon.0 -' entity='mon.' 
cmd=[{"prefix": "dashboard get-prometheus-api-host"}]: dispatch 2026-03-10T07:47:13.159 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:12 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "dashboard set-prometheus-api-host", "value": "http://vm05.local:9095"}]: dispatch 2026-03-10T07:47:13.159 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:12 vm05 ceph-mon[50387]: from='mon.0 -' entity='mon.' cmd=[{"prefix": "dashboard set-prometheus-api-host", "value": "http://vm05.local:9095"}]: dispatch 2026-03-10T07:47:13.159 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:12 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' 2026-03-10T07:47:13.159 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:12 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T07:47:13.159 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:12 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' 2026-03-10T07:47:13.159 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:12 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' 2026-03-10T07:47:13.159 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:12 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' 2026-03-10T07:47:13.159 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:12 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' 2026-03-10T07:47:13.169 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:12 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' 2026-03-10T07:47:13.169 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:12 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' 2026-03-10T07:47:13.169 
INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:12 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "dashboard get-alertmanager-api-host"}]: dispatch 2026-03-10T07:47:13.169 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:12 vm08 ceph-mon[59917]: from='mon.0 -' entity='mon.' cmd=[{"prefix": "dashboard get-alertmanager-api-host"}]: dispatch 2026-03-10T07:47:13.169 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:12 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "dashboard set-alertmanager-api-host", "value": "http://vm05.local:9093"}]: dispatch 2026-03-10T07:47:13.169 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:12 vm08 ceph-mon[59917]: from='mon.0 -' entity='mon.' cmd=[{"prefix": "dashboard set-alertmanager-api-host", "value": "http://vm05.local:9093"}]: dispatch 2026-03-10T07:47:13.169 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:12 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' 2026-03-10T07:47:13.169 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:12 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "dashboard get-grafana-api-url"}]: dispatch 2026-03-10T07:47:13.169 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:12 vm08 ceph-mon[59917]: from='mon.0 -' entity='mon.' cmd=[{"prefix": "dashboard get-grafana-api-url"}]: dispatch 2026-03-10T07:47:13.170 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:12 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "dashboard set-grafana-api-url", "value": "https://vm05.local:3000"}]: dispatch 2026-03-10T07:47:13.170 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:12 vm08 ceph-mon[59917]: from='mon.0 -' entity='mon.' 
cmd=[{"prefix": "dashboard set-grafana-api-url", "value": "https://vm05.local:3000"}]: dispatch 2026-03-10T07:47:13.170 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:12 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' 2026-03-10T07:47:13.170 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:12 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "dashboard get-prometheus-api-host"}]: dispatch 2026-03-10T07:47:13.170 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:12 vm08 ceph-mon[59917]: from='mon.0 -' entity='mon.' cmd=[{"prefix": "dashboard get-prometheus-api-host"}]: dispatch 2026-03-10T07:47:13.170 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:12 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "dashboard set-prometheus-api-host", "value": "http://vm05.local:9095"}]: dispatch 2026-03-10T07:47:13.170 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:12 vm08 ceph-mon[59917]: from='mon.0 -' entity='mon.' 
cmd=[{"prefix": "dashboard set-prometheus-api-host", "value": "http://vm05.local:9095"}]: dispatch 2026-03-10T07:47:13.170 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:12 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' 2026-03-10T07:47:13.170 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:12 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T07:47:13.170 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:12 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' 2026-03-10T07:47:13.170 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:12 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' 2026-03-10T07:47:13.170 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:12 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' 2026-03-10T07:47:13.170 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:12 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' 2026-03-10T07:47:13.782 INFO:teuthology.orchestra.run.vm05.stdout:Created osd(s) 0 on host 'vm05' 2026-03-10T07:47:13.782 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:47:13.779+0000 7ff6537fe700 1 -- 192.168.123.105:0/944024763 <== mgr.14223 v2:192.168.123.105:6800/2 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+32 (secure 0 0 0) 0x7ff644000bf0 con 0x7ff64c06c650 2026-03-10T07:47:13.782 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:47:13.781+0000 7ff66359e700 1 -- 192.168.123.105:0/944024763 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7ff64c06c650 msgr2=0x7ff64c06eb10 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:47:13.782 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:47:13.782+0000 7ff66359e700 1 --2- 192.168.123.105:0/944024763 >> 
[v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7ff64c06c650 0x7ff64c06eb10 secure :-1 s=READY pgs=26 cs=0 l=1 rev1=1 crypto rx=0x7ff65400c4d0 tx=0x7ff6540058e0 comp rx=0 tx=0).stop 2026-03-10T07:47:13.782 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:47:13.782+0000 7ff66359e700 1 -- 192.168.123.105:0/944024763 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff664072b20 msgr2=0x7ff664083160 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:47:13.782 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:47:13.782+0000 7ff66359e700 1 --2- 192.168.123.105:0/944024763 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff664072b20 0x7ff664083160 secure :-1 s=READY pgs=166 cs=0 l=1 rev1=1 crypto rx=0x7ff65c009fd0 tx=0x7ff65c051470 comp rx=0 tx=0).stop 2026-03-10T07:47:13.783 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:47:13.782+0000 7ff66359e700 1 -- 192.168.123.105:0/944024763 shutdown_connections 2026-03-10T07:47:13.783 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:47:13.782+0000 7ff66359e700 1 --2- 192.168.123.105:0/944024763 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7ff64c06c650 0x7ff64c06eb10 unknown :-1 s=CLOSED pgs=26 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:47:13.783 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:47:13.782+0000 7ff66359e700 1 --2- 192.168.123.105:0/944024763 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff664072b20 0x7ff664083160 unknown :-1 s=CLOSED pgs=166 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:47:13.783 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:47:13.782+0000 7ff66359e700 1 --2- 192.168.123.105:0/944024763 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7ff6640836a0 0x7ff6641b31b0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:47:13.783 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:47:13.782+0000 7ff66359e700 1 -- 192.168.123.105:0/944024763 >> 192.168.123.105:0/944024763 conn(0x7ff66406daa0 msgr2=0x7ff66406ff00 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:47:13.783 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:47:13.782+0000 7ff66359e700 1 -- 192.168.123.105:0/944024763 shutdown_connections 2026-03-10T07:47:13.783 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:47:13.782+0000 7ff66359e700 1 -- 192.168.123.105:0/944024763 wait complete. 2026-03-10T07:47:13.824 DEBUG:teuthology.orchestra.run.vm05:osd.0> sudo journalctl -f -n 0 -u ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508@osd.0.service 2026-03-10T07:47:13.825 INFO:tasks.cephadm:Deploying osd.1 on vm05 with /dev/vdd... 2026-03-10T07:47:13.825 DEBUG:teuthology.orchestra.run.vm05:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 ceph-volume -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 12e9780e-1c55-11f1-8896-79f7c2e9b508 -- lvm zap /dev/vdd 2026-03-10T07:47:14.041 INFO:teuthology.orchestra.run.vm05.stderr:Inferring config /var/lib/ceph/12e9780e-1c55-11f1-8896-79f7c2e9b508/mon.vm05/config 2026-03-10T07:47:14.076 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:13 vm05 ceph-mon[50387]: pgmap v9: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail 2026-03-10T07:47:14.076 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:13 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T07:47:14.076 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:13 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' 2026-03-10T07:47:14.076 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:13 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' 2026-03-10T07:47:14.076 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 
07:47:13 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T07:47:14.076 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:13 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T07:47:14.076 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:13 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' 2026-03-10T07:47:14.077 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:13 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' 2026-03-10T07:47:14.077 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:13 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' 2026-03-10T07:47:14.077 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:13 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T07:47:14.169 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:13 vm08 ceph-mon[59917]: pgmap v9: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail 2026-03-10T07:47:14.169 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:13 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T07:47:14.169 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:13 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' 2026-03-10T07:47:14.169 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:13 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' 2026-03-10T07:47:14.169 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:13 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' 
cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T07:47:14.169 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:13 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T07:47:14.169 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:13 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' 2026-03-10T07:47:14.169 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:13 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' 2026-03-10T07:47:14.169 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:13 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' 2026-03-10T07:47:14.169 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:13 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T07:47:14.662 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-10T07:47:14.678 DEBUG:teuthology.orchestra.run.vm05:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 12e9780e-1c55-11f1-8896-79f7c2e9b508 -- ceph orch daemon add osd vm05:/dev/vdd 2026-03-10T07:47:14.882 INFO:teuthology.orchestra.run.vm05.stderr:Inferring config /var/lib/ceph/12e9780e-1c55-11f1-8896-79f7c2e9b508/mon.vm05/config 2026-03-10T07:47:15.237 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:47:15.231+0000 7fa031f61700 1 -- 192.168.123.105:0/1593844920 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fa0240a5df0 msgr2=0x7fa0240a6270 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:47:15.237 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:47:15.231+0000 7fa031f61700 1 --2- 192.168.123.105:0/1593844920 >> 
[v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fa0240a5df0 0x7fa0240a6270 secure :-1 s=READY pgs=172 cs=0 l=1 rev1=1 crypto rx=0x7fa018009a60 tx=0x7fa018009d70 comp rx=0 tx=0).stop 2026-03-10T07:47:15.237 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:47:15.231+0000 7fa031f61700 1 -- 192.168.123.105:0/1593844920 shutdown_connections 2026-03-10T07:47:15.237 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:47:15.231+0000 7fa031f61700 1 --2- 192.168.123.105:0/1593844920 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fa0240a5df0 0x7fa0240a6270 unknown :-1 s=CLOSED pgs=172 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:47:15.237 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:47:15.231+0000 7fa031f61700 1 --2- 192.168.123.105:0/1593844920 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fa0240a4cb0 0x7fa0240a50d0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:47:15.237 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:47:15.231+0000 7fa031f61700 1 -- 192.168.123.105:0/1593844920 >> 192.168.123.105:0/1593844920 conn(0x7fa0240a0170 msgr2=0x7fa0240a25d0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:47:15.237 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:47:15.231+0000 7fa031f61700 1 -- 192.168.123.105:0/1593844920 shutdown_connections 2026-03-10T07:47:15.237 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:47:15.231+0000 7fa031f61700 1 -- 192.168.123.105:0/1593844920 wait complete. 
2026-03-10T07:47:15.237 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:47:15.231+0000 7fa031f61700 1 Processor -- start 2026-03-10T07:47:15.237 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:47:15.231+0000 7fa031f61700 1 -- start start 2026-03-10T07:47:15.237 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:47:15.231+0000 7fa031f61700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fa0240a4cb0 0x7fa02400ce90 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:47:15.237 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:47:15.231+0000 7fa031f61700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fa0240a5df0 0x7fa02400d3d0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:47:15.237 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:47:15.231+0000 7fa031f61700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fa02400d9f0 con 0x7fa0240a4cb0 2026-03-10T07:47:15.237 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:47:15.231+0000 7fa031f61700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fa02400db30 con 0x7fa0240a5df0 2026-03-10T07:47:15.238 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:47:15.232+0000 7fa030f5f700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fa0240a4cb0 0x7fa02400ce90 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:47:15.238 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:47:15.232+0000 7fa030f5f700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fa0240a4cb0 0x7fa02400ce90 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am 
v2:192.168.123.105:42354/0 (socket says 192.168.123.105:42354) 2026-03-10T07:47:15.238 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:47:15.232+0000 7fa030f5f700 1 -- 192.168.123.105:0/3756944773 learned_addr learned my addr 192.168.123.105:0/3756944773 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T07:47:15.238 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:47:15.232+0000 7fa02bfff700 1 --2- 192.168.123.105:0/3756944773 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fa0240a5df0 0x7fa02400d3d0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:47:15.238 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:47:15.232+0000 7fa030f5f700 1 -- 192.168.123.105:0/3756944773 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fa0240a5df0 msgr2=0x7fa02400d3d0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:47:15.238 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:47:15.232+0000 7fa030f5f700 1 --2- 192.168.123.105:0/3756944773 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fa0240a5df0 0x7fa02400d3d0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:47:15.238 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:47:15.232+0000 7fa030f5f700 1 -- 192.168.123.105:0/3756944773 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fa018009710 con 0x7fa0240a4cb0 2026-03-10T07:47:15.238 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:47:15.232+0000 7fa030f5f700 1 --2- 192.168.123.105:0/3756944773 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fa0240a4cb0 0x7fa02400ce90 secure :-1 s=READY pgs=173 cs=0 l=1 rev1=1 crypto rx=0x7fa020051910 tx=0x7fa020051c20 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 
2026-03-10T07:47:15.238 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:47:15.232+0000 7fa029ffb700 1 -- 192.168.123.105:0/3756944773 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fa02004f970 con 0x7fa0240a4cb0 2026-03-10T07:47:15.238 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:47:15.232+0000 7fa031f61700 1 -- 192.168.123.105:0/3756944773 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fa0240125e0 con 0x7fa0240a4cb0 2026-03-10T07:47:15.238 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:47:15.232+0000 7fa031f61700 1 -- 192.168.123.105:0/3756944773 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fa024012b30 con 0x7fa0240a4cb0 2026-03-10T07:47:15.238 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:47:15.233+0000 7fa029ffb700 1 -- 192.168.123.105:0/3756944773 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fa020054470 con 0x7fa0240a4cb0 2026-03-10T07:47:15.238 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:47:15.233+0000 7fa029ffb700 1 -- 192.168.123.105:0/3756944773 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fa020053670 con 0x7fa0240a4cb0 2026-03-10T07:47:15.238 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:47:15.233+0000 7fa029ffb700 1 -- 192.168.123.105:0/3756944773 <== mon.0 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 18) v1 ==== 90252+0+0 (secure 0 0 0) 0x7fa020053890 con 0x7fa0240a4cb0 2026-03-10T07:47:15.238 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:47:15.234+0000 7fa029ffb700 1 --2- 192.168.123.105:0/3756944773 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fa01c06c650 0x7fa01c06eb10 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:47:15.238 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:47:15.234+0000 7fa02bfff700 1 --2- 192.168.123.105:0/3756944773 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fa01c06c650 0x7fa01c06eb10 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:47:15.238 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:47:15.234+0000 7fa029ffb700 1 -- 192.168.123.105:0/3756944773 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(6..6 src has 1..6) v4 ==== 1313+0+0 (secure 0 0 0) 0x7fa0200cf7f0 con 0x7fa0240a4cb0 2026-03-10T07:47:15.238 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:47:15.234+0000 7fa02bfff700 1 --2- 192.168.123.105:0/3756944773 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fa01c06c650 0x7fa01c06eb10 secure :-1 s=READY pgs=29 cs=0 l=1 rev1=1 crypto rx=0x7fa018003a40 tx=0x7fa0180058e0 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:47:15.238 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:47:15.237+0000 7fa031f61700 1 -- 192.168.123.105:0/3756944773 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fa010005320 con 0x7fa0240a4cb0 2026-03-10T07:47:15.241 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:47:15.240+0000 7fa029ffb700 1 -- 192.168.123.105:0/3756944773 <== mon.0 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7fa020064080 con 0x7fa0240a4cb0 2026-03-10T07:47:15.361 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:47:15.358+0000 7fa031f61700 1 -- 192.168.123.105:0/3756944773 --> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] -- mgr_command(tid 0: {"prefix": "orch daemon add osd", "svc_arg": "vm05:/dev/vdd", "target": ["mon-mgr", ""]}) v1 -- 0x7fa010000bf0 con 
0x7fa01c06c650 2026-03-10T07:47:15.756 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:15 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' 2026-03-10T07:47:15.756 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:15 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' 2026-03-10T07:47:15.756 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:15 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch 2026-03-10T07:47:15.756 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:15 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch 2026-03-10T07:47:15.756 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:15 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T07:47:15.756 INFO:journalctl@ceph.osd.0.vm05.stdout:Mar 10 07:47:15 vm05 ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508-osd-0[68323]: 2026-03-10T07:47:15.685+0000 7fe33a360640 -1 osd.0 0 log_to_monitors true 2026-03-10T07:47:15.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:15 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' 2026-03-10T07:47:15.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:15 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' 2026-03-10T07:47:15.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:15 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch 2026-03-10T07:47:15.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:15 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' 
entity='mgr.vm05.blexke' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch 2026-03-10T07:47:15.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:15 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T07:47:16.908 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:16 vm05 ceph-mon[50387]: pgmap v10: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail 2026-03-10T07:47:16.908 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:16 vm05 ceph-mon[50387]: from='client.14278 -' entity='client.admin' cmd=[{"prefix": "orch daemon add osd", "svc_arg": "vm05:/dev/vdd", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T07:47:16.908 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:16 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' 2026-03-10T07:47:16.908 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:16 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' 2026-03-10T07:47:16.908 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:16 vm05 ceph-mon[50387]: from='osd.0 [v2:192.168.123.105:6802/196713793,v1:192.168.123.105:6803/196713793]' entity='osd.0' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["0"]}]: dispatch 2026-03-10T07:47:16.908 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:16 vm05 ceph-mon[50387]: from='client.? 
192.168.123.105:0/908853801' entity='client.bootstrap-osd' cmd=[{"prefix": "osd new", "uuid": "725f8dca-94da-4c18-aefa-9e9f529cccd4"}]: dispatch 2026-03-10T07:47:16.908 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:16 vm05 ceph-mon[50387]: from='osd.0 [v2:192.168.123.105:6802/196713793,v1:192.168.123.105:6803/196713793]' entity='osd.0' cmd='[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["0"]}]': finished 2026-03-10T07:47:16.908 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:16 vm05 ceph-mon[50387]: from='client.? 192.168.123.105:0/908853801' entity='client.bootstrap-osd' cmd='[{"prefix": "osd new", "uuid": "725f8dca-94da-4c18-aefa-9e9f529cccd4"}]': finished 2026-03-10T07:47:16.908 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:16 vm05 ceph-mon[50387]: osdmap e7: 2 total, 0 up, 2 in 2026-03-10T07:47:16.908 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:16 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "osd metadata", "id": 0}]: dispatch 2026-03-10T07:47:16.908 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:16 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "osd metadata", "id": 1}]: dispatch 2026-03-10T07:47:16.908 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:16 vm05 ceph-mon[50387]: from='osd.0 [v2:192.168.123.105:6802/196713793,v1:192.168.123.105:6803/196713793]' entity='osd.0' cmd=[{"prefix": "osd crush create-or-move", "id": 0, "weight":0.0195, "args": ["host=vm05", "root=default"]}]: dispatch 2026-03-10T07:47:16.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:16 vm08 ceph-mon[59917]: pgmap v10: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail 2026-03-10T07:47:16.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:16 vm08 ceph-mon[59917]: from='client.14278 -' entity='client.admin' cmd=[{"prefix": "orch daemon add osd", "svc_arg": "vm05:/dev/vdd", "target": ["mon-mgr", ""]}]: 
dispatch 2026-03-10T07:47:16.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:16 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' 2026-03-10T07:47:16.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:16 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' 2026-03-10T07:47:16.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:16 vm08 ceph-mon[59917]: from='osd.0 [v2:192.168.123.105:6802/196713793,v1:192.168.123.105:6803/196713793]' entity='osd.0' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["0"]}]: dispatch 2026-03-10T07:47:16.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:16 vm08 ceph-mon[59917]: from='client.? 192.168.123.105:0/908853801' entity='client.bootstrap-osd' cmd=[{"prefix": "osd new", "uuid": "725f8dca-94da-4c18-aefa-9e9f529cccd4"}]: dispatch 2026-03-10T07:47:16.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:16 vm08 ceph-mon[59917]: from='osd.0 [v2:192.168.123.105:6802/196713793,v1:192.168.123.105:6803/196713793]' entity='osd.0' cmd='[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["0"]}]': finished 2026-03-10T07:47:16.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:16 vm08 ceph-mon[59917]: from='client.? 
192.168.123.105:0/908853801' entity='client.bootstrap-osd' cmd='[{"prefix": "osd new", "uuid": "725f8dca-94da-4c18-aefa-9e9f529cccd4"}]': finished 2026-03-10T07:47:16.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:16 vm08 ceph-mon[59917]: osdmap e7: 2 total, 0 up, 2 in 2026-03-10T07:47:16.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:16 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "osd metadata", "id": 0}]: dispatch 2026-03-10T07:47:16.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:16 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "osd metadata", "id": 1}]: dispatch 2026-03-10T07:47:16.919 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:16 vm08 ceph-mon[59917]: from='osd.0 [v2:192.168.123.105:6802/196713793,v1:192.168.123.105:6803/196713793]' entity='osd.0' cmd=[{"prefix": "osd crush create-or-move", "id": 0, "weight":0.0195, "args": ["host=vm05", "root=default"]}]: dispatch 2026-03-10T07:47:17.908 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:17 vm05 ceph-mon[50387]: from='client.? 
192.168.123.105:0/1405431714' entity='client.bootstrap-osd' cmd=[{"prefix": "mon getmap"}]: dispatch 2026-03-10T07:47:17.908 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:17 vm05 ceph-mon[50387]: from='osd.0 [v2:192.168.123.105:6802/196713793,v1:192.168.123.105:6803/196713793]' entity='osd.0' cmd='[{"prefix": "osd crush create-or-move", "id": 0, "weight":0.0195, "args": ["host=vm05", "root=default"]}]': finished 2026-03-10T07:47:17.908 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:17 vm05 ceph-mon[50387]: osdmap e8: 2 total, 0 up, 2 in 2026-03-10T07:47:17.908 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:17 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "osd metadata", "id": 0}]: dispatch 2026-03-10T07:47:17.908 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:17 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "osd metadata", "id": 1}]: dispatch 2026-03-10T07:47:17.908 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:17 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "osd metadata", "id": 0}]: dispatch 2026-03-10T07:47:17.908 INFO:journalctl@ceph.osd.0.vm05.stdout:Mar 10 07:47:17 vm05 ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508-osd-0[68323]: 2026-03-10T07:47:17.423+0000 7fe32f1c3700 -1 osd.0 0 waiting for initial osdmap 2026-03-10T07:47:17.908 INFO:journalctl@ceph.osd.0.vm05.stdout:Mar 10 07:47:17 vm05 ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508-osd-0[68323]: 2026-03-10T07:47:17.428+0000 7fe328fb4700 -1 osd.0 8 set_numa_affinity unable to identify public interface '' numa node: (2) No such file or directory 2026-03-10T07:47:17.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:17 vm08 ceph-mon[59917]: from='client.? 
192.168.123.105:0/1405431714' entity='client.bootstrap-osd' cmd=[{"prefix": "mon getmap"}]: dispatch 2026-03-10T07:47:17.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:17 vm08 ceph-mon[59917]: from='osd.0 [v2:192.168.123.105:6802/196713793,v1:192.168.123.105:6803/196713793]' entity='osd.0' cmd='[{"prefix": "osd crush create-or-move", "id": 0, "weight":0.0195, "args": ["host=vm05", "root=default"]}]': finished 2026-03-10T07:47:17.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:17 vm08 ceph-mon[59917]: osdmap e8: 2 total, 0 up, 2 in 2026-03-10T07:47:17.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:17 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "osd metadata", "id": 0}]: dispatch 2026-03-10T07:47:17.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:17 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "osd metadata", "id": 1}]: dispatch 2026-03-10T07:47:17.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:17 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "osd metadata", "id": 0}]: dispatch 2026-03-10T07:47:18.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:18 vm05 ceph-mon[50387]: pgmap v12: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail 2026-03-10T07:47:18.908 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:18 vm05 ceph-mon[50387]: osd.0 [v2:192.168.123.105:6802/196713793,v1:192.168.123.105:6803/196713793] boot 2026-03-10T07:47:18.908 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:18 vm05 ceph-mon[50387]: osdmap e9: 2 total, 1 up, 2 in 2026-03-10T07:47:18.908 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:18 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "osd metadata", "id": 0}]: dispatch 2026-03-10T07:47:18.908 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:18 vm05 
ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "osd metadata", "id": 1}]: dispatch 2026-03-10T07:47:18.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:18 vm08 ceph-mon[59917]: pgmap v12: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail 2026-03-10T07:47:18.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:18 vm08 ceph-mon[59917]: osd.0 [v2:192.168.123.105:6802/196713793,v1:192.168.123.105:6803/196713793] boot 2026-03-10T07:47:18.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:18 vm08 ceph-mon[59917]: osdmap e9: 2 total, 1 up, 2 in 2026-03-10T07:47:18.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:18 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "osd metadata", "id": 0}]: dispatch 2026-03-10T07:47:18.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:18 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "osd metadata", "id": 1}]: dispatch 2026-03-10T07:47:19.880 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:19 vm05 ceph-mon[50387]: purged_snaps scrub starts 2026-03-10T07:47:19.880 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:19 vm05 ceph-mon[50387]: purged_snaps scrub ok 2026-03-10T07:47:19.880 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:19 vm05 ceph-mon[50387]: osdmap e10: 2 total, 1 up, 2 in 2026-03-10T07:47:19.880 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:19 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "osd metadata", "id": 1}]: dispatch 2026-03-10T07:47:19.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:19 vm08 ceph-mon[59917]: purged_snaps scrub starts 2026-03-10T07:47:19.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:19 vm08 ceph-mon[59917]: purged_snaps scrub ok 2026-03-10T07:47:19.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:19 vm08 
ceph-mon[59917]: osdmap e10: 2 total, 1 up, 2 in 2026-03-10T07:47:19.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:19 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "osd metadata", "id": 1}]: dispatch 2026-03-10T07:47:20.908 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:20 vm05 ceph-mon[50387]: pgmap v15: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail 2026-03-10T07:47:20.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:20 vm08 ceph-mon[59917]: pgmap v15: 0 pgs: ; 0 B data, 0 B used, 0 B / 0 B avail 2026-03-10T07:47:21.870 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:21 vm05 ceph-mon[50387]: Detected new or changed devices on vm05 2026-03-10T07:47:21.870 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:21 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' 2026-03-10T07:47:21.870 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:21 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' 2026-03-10T07:47:21.870 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:21 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "config rm", "who": "osd/host:vm05", "name": "osd_memory_target"}]: dispatch 2026-03-10T07:47:21.870 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:21 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T07:47:21.870 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:21 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T07:47:21.870 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:21 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' 2026-03-10T07:47:21.870 
INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:21 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "auth get", "entity": "osd.1"}]: dispatch 2026-03-10T07:47:21.870 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:21 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T07:47:21.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:21 vm08 ceph-mon[59917]: Detected new or changed devices on vm05 2026-03-10T07:47:21.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:21 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' 2026-03-10T07:47:21.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:21 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' 2026-03-10T07:47:21.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:21 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "config rm", "who": "osd/host:vm05", "name": "osd_memory_target"}]: dispatch 2026-03-10T07:47:21.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:21 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T07:47:21.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:21 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T07:47:21.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:21 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' 2026-03-10T07:47:21.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:21 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "auth get", "entity": "osd.1"}]: 
dispatch 2026-03-10T07:47:21.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:21 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T07:47:22.908 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:22 vm05 ceph-mon[50387]: Deploying daemon osd.1 on vm05 2026-03-10T07:47:22.908 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:22 vm05 ceph-mon[50387]: pgmap v17: 0 pgs: ; 0 B data, 26 MiB used, 20 GiB / 20 GiB avail 2026-03-10T07:47:22.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:22 vm08 ceph-mon[59917]: Deploying daemon osd.1 on vm05 2026-03-10T07:47:22.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:22 vm08 ceph-mon[59917]: pgmap v17: 0 pgs: ; 0 B data, 26 MiB used, 20 GiB / 20 GiB avail 2026-03-10T07:47:23.724 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:23 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' 2026-03-10T07:47:23.724 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:23 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' 2026-03-10T07:47:23.724 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:23 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T07:47:23.724 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:23 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' 2026-03-10T07:47:23.724 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:23 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' 2026-03-10T07:47:23.750 INFO:teuthology.orchestra.run.vm05.stdout:Created osd(s) 1 on host 'vm05' 2026-03-10T07:47:23.750 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:47:23.747+0000 7fa029ffb700 1 -- 192.168.123.105:0/3756944773 <== mgr.14223 
v2:192.168.123.105:6800/2 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+32 (secure 0 0 0) 0x7fa010000bf0 con 0x7fa01c06c650 2026-03-10T07:47:23.752 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:47:23.751+0000 7fa031f61700 1 -- 192.168.123.105:0/3756944773 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fa01c06c650 msgr2=0x7fa01c06eb10 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:47:23.752 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:47:23.751+0000 7fa031f61700 1 --2- 192.168.123.105:0/3756944773 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fa01c06c650 0x7fa01c06eb10 secure :-1 s=READY pgs=29 cs=0 l=1 rev1=1 crypto rx=0x7fa018003a40 tx=0x7fa0180058e0 comp rx=0 tx=0).stop 2026-03-10T07:47:23.752 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:47:23.751+0000 7fa031f61700 1 -- 192.168.123.105:0/3756944773 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fa0240a4cb0 msgr2=0x7fa02400ce90 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:47:23.752 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:47:23.751+0000 7fa031f61700 1 --2- 192.168.123.105:0/3756944773 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fa0240a4cb0 0x7fa02400ce90 secure :-1 s=READY pgs=173 cs=0 l=1 rev1=1 crypto rx=0x7fa020051910 tx=0x7fa020051c20 comp rx=0 tx=0).stop 2026-03-10T07:47:23.752 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:47:23.751+0000 7fa031f61700 1 -- 192.168.123.105:0/3756944773 shutdown_connections 2026-03-10T07:47:23.752 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:47:23.751+0000 7fa031f61700 1 --2- 192.168.123.105:0/3756944773 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fa01c06c650 0x7fa01c06eb10 unknown :-1 s=CLOSED pgs=29 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:47:23.752 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:47:23.751+0000 7fa031f61700 1 
--2- 192.168.123.105:0/3756944773 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fa0240a4cb0 0x7fa02400ce90 unknown :-1 s=CLOSED pgs=173 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:47:23.752 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:47:23.751+0000 7fa031f61700 1 --2- 192.168.123.105:0/3756944773 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fa0240a5df0 0x7fa02400d3d0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:47:23.752 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:47:23.751+0000 7fa031f61700 1 -- 192.168.123.105:0/3756944773 >> 192.168.123.105:0/3756944773 conn(0x7fa0240a0170 msgr2=0x7fa0240a9020 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:47:23.752 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:47:23.751+0000 7fa031f61700 1 -- 192.168.123.105:0/3756944773 shutdown_connections 2026-03-10T07:47:23.752 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:47:23.751+0000 7fa031f61700 1 -- 192.168.123.105:0/3756944773 wait complete. 2026-03-10T07:47:23.834 DEBUG:teuthology.orchestra.run.vm05:osd.1> sudo journalctl -f -n 0 -u ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508@osd.1.service 2026-03-10T07:47:23.836 INFO:tasks.cephadm:Deploying osd.2 on vm05 with /dev/vdc... 
2026-03-10T07:47:23.836 DEBUG:teuthology.orchestra.run.vm05:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 ceph-volume -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 12e9780e-1c55-11f1-8896-79f7c2e9b508 -- lvm zap /dev/vdc 2026-03-10T07:47:23.941 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:23 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' 2026-03-10T07:47:23.941 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:23 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' 2026-03-10T07:47:23.941 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:23 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T07:47:23.941 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:23 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' 2026-03-10T07:47:23.941 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:23 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' 2026-03-10T07:47:24.067 INFO:teuthology.orchestra.run.vm05.stderr:Inferring config /var/lib/ceph/12e9780e-1c55-11f1-8896-79f7c2e9b508/mon.vm05/config 2026-03-10T07:47:24.684 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-10T07:47:24.702 DEBUG:teuthology.orchestra.run.vm05:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 12e9780e-1c55-11f1-8896-79f7c2e9b508 -- ceph orch daemon add osd vm05:/dev/vdc 2026-03-10T07:47:24.908 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:24 vm05 ceph-mon[50387]: pgmap v18: 0 pgs: ; 0 B data, 26 MiB used, 20 GiB / 20 GiB avail 2026-03-10T07:47:24.908 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:24 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' 
2026-03-10T07:47:24.908 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:24 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' 2026-03-10T07:47:24.908 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:24 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' 2026-03-10T07:47:24.908 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:24 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' 2026-03-10T07:47:24.908 INFO:journalctl@ceph.osd.1.vm05.stdout:Mar 10 07:47:24 vm05 ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508-osd-1[75139]: 2026-03-10T07:47:24.878+0000 7fbe70532640 -1 osd.1 0 log_to_monitors true 2026-03-10T07:47:24.923 INFO:teuthology.orchestra.run.vm05.stderr:Inferring config /var/lib/ceph/12e9780e-1c55-11f1-8896-79f7c2e9b508/mon.vm05/config 2026-03-10T07:47:25.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:24 vm08 ceph-mon[59917]: pgmap v18: 0 pgs: ; 0 B data, 26 MiB used, 20 GiB / 20 GiB avail 2026-03-10T07:47:25.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:24 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' 2026-03-10T07:47:25.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:24 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' 2026-03-10T07:47:25.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:24 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' 2026-03-10T07:47:25.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:24 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' 2026-03-10T07:47:25.258 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:47:25.256+0000 7f3714c02700 1 -- 192.168.123.105:0/445029775 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3710102ea0 msgr2=0x7f37101032c0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 
2026-03-10T07:47:25.258 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:47:25.256+0000 7f3714c02700 1 --2- 192.168.123.105:0/445029775 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3710102ea0 0x7f37101032c0 secure :-1 s=READY pgs=181 cs=0 l=1 rev1=1 crypto rx=0x7f3700009b00 tx=0x7f3700009e10 comp rx=0 tx=0).stop 2026-03-10T07:47:25.258 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:47:25.256+0000 7f3714c02700 1 -- 192.168.123.105:0/445029775 shutdown_connections 2026-03-10T07:47:25.258 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:47:25.256+0000 7f3714c02700 1 --2- 192.168.123.105:0/445029775 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f3710104090 0x7f3710104510 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:47:25.258 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:47:25.256+0000 7f3714c02700 1 --2- 192.168.123.105:0/445029775 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3710102ea0 0x7f37101032c0 unknown :-1 s=CLOSED pgs=181 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:47:25.258 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:47:25.256+0000 7f3714c02700 1 -- 192.168.123.105:0/445029775 >> 192.168.123.105:0/445029775 conn(0x7f37100fe470 msgr2=0x7f37101008d0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:47:25.258 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:47:25.256+0000 7f3714c02700 1 -- 192.168.123.105:0/445029775 shutdown_connections 2026-03-10T07:47:25.258 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:47:25.256+0000 7f3714c02700 1 -- 192.168.123.105:0/445029775 wait complete. 
2026-03-10T07:47:25.258 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:47:25.257+0000 7f3714c02700 1 Processor -- start 2026-03-10T07:47:25.258 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:47:25.257+0000 7f3714c02700 1 -- start start 2026-03-10T07:47:25.259 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:47:25.257+0000 7f3714c02700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3710102ea0 0x7f3710198660 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:47:25.259 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:47:25.257+0000 7f370e59c700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3710102ea0 0x7f3710198660 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:47:25.259 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:47:25.258+0000 7f370e59c700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3710102ea0 0x7f3710198660 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.105:36656/0 (socket says 192.168.123.105:36656) 2026-03-10T07:47:25.259 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:47:25.258+0000 7f370e59c700 1 -- 192.168.123.105:0/4228835381 learned_addr learned my addr 192.168.123.105:0/4228835381 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T07:47:25.259 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:47:25.258+0000 7f3714c02700 1 --2- 192.168.123.105:0/4228835381 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f3710104090 0x7f3710198ba0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:47:25.259 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:47:25.258+0000 7f3714c02700 1 -- 192.168.123.105:0/4228835381 
--> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f37101991c0 con 0x7f3710102ea0 2026-03-10T07:47:25.259 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:47:25.258+0000 7f3714c02700 1 -- 192.168.123.105:0/4228835381 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f3710199300 con 0x7f3710104090 2026-03-10T07:47:25.259 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:47:25.258+0000 7f370dd9b700 1 --2- 192.168.123.105:0/4228835381 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f3710104090 0x7f3710198ba0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:47:25.260 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:47:25.258+0000 7f370e59c700 1 -- 192.168.123.105:0/4228835381 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f3710104090 msgr2=0x7f3710198ba0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:47:25.260 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:47:25.258+0000 7f370e59c700 1 --2- 192.168.123.105:0/4228835381 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f3710104090 0x7f3710198ba0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:47:25.260 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:47:25.258+0000 7f370e59c700 1 -- 192.168.123.105:0/4228835381 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f37000097e0 con 0x7f3710102ea0 2026-03-10T07:47:25.260 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:47:25.258+0000 7f370e59c700 1 --2- 192.168.123.105:0/4228835381 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3710102ea0 0x7f3710198660 secure :-1 s=READY pgs=182 cs=0 l=1 rev1=1 crypto rx=0x7f3700006010 tx=0x7f3700004c30 comp rx=0 
tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:47:25.260 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:47:25.258+0000 7f36ff7fe700 1 -- 192.168.123.105:0/4228835381 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f370001d070 con 0x7f3710102ea0 2026-03-10T07:47:25.260 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:47:25.259+0000 7f36ff7fe700 1 -- 192.168.123.105:0/4228835381 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f370000bd80 con 0x7f3710102ea0 2026-03-10T07:47:25.260 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:47:25.259+0000 7f36ff7fe700 1 -- 192.168.123.105:0/4228835381 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f370000f970 con 0x7f3710102ea0 2026-03-10T07:47:25.260 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:47:25.259+0000 7f3714c02700 1 -- 192.168.123.105:0/4228835381 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f371019dd50 con 0x7f3710102ea0 2026-03-10T07:47:25.261 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:47:25.259+0000 7f3714c02700 1 -- 192.168.123.105:0/4228835381 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f371019e1c0 con 0x7f3710102ea0 2026-03-10T07:47:25.261 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:47:25.260+0000 7f3714c02700 1 -- 192.168.123.105:0/4228835381 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f371004ea90 con 0x7f3710102ea0 2026-03-10T07:47:25.264 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:47:25.260+0000 7f36ff7fe700 1 -- 192.168.123.105:0/4228835381 <== mon.0 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 18) v1 ==== 90252+0+0 (secure 0 0 0) 0x7f370000fad0 con 0x7f3710102ea0 
2026-03-10T07:47:25.264 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:47:25.261+0000 7f36ff7fe700 1 --2- 192.168.123.105:0/4228835381 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f36f806c290 0x7f36f806e750 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:47:25.264 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:47:25.263+0000 7f370dd9b700 1 --2- 192.168.123.105:0/4228835381 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f36f806c290 0x7f36f806e750 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:47:25.264 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:47:25.263+0000 7f370dd9b700 1 --2- 192.168.123.105:0/4228835381 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f36f806c290 0x7f36f806e750 secure :-1 s=READY pgs=35 cs=0 l=1 rev1=1 crypto rx=0x7f3710102bd0 tx=0x7f3704009380 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:47:25.265 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:47:25.263+0000 7f36ff7fe700 1 -- 192.168.123.105:0/4228835381 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(10..10 src has 1..10) v4 ==== 1915+0+0 (secure 0 0 0) 0x7f370005b690 con 0x7f3710102ea0 2026-03-10T07:47:25.265 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:47:25.264+0000 7f36ff7fe700 1 -- 192.168.123.105:0/4228835381 <== mon.0 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7f3700092050 con 0x7f3710102ea0 2026-03-10T07:47:25.372 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:47:25.370+0000 7f3714c02700 1 -- 192.168.123.105:0/4228835381 --> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] -- mgr_command(tid 0: {"prefix": "orch daemon add osd", "svc_arg": "vm05:/dev/vdc", "target": 
["mon-mgr", ""]}) v1 -- 0x7f37101086e0 con 0x7f36f806c290 2026-03-10T07:47:26.005 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:25 vm05 ceph-mon[50387]: from='osd.1 [v2:192.168.123.105:6810/1508516757,v1:192.168.123.105:6811/1508516757]' entity='osd.1' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["1"]}]: dispatch 2026-03-10T07:47:26.005 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:25 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch 2026-03-10T07:47:26.005 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:25 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch 2026-03-10T07:47:26.005 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:25 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T07:47:26.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:25 vm08 ceph-mon[59917]: from='osd.1 [v2:192.168.123.105:6810/1508516757,v1:192.168.123.105:6811/1508516757]' entity='osd.1' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["1"]}]: dispatch 2026-03-10T07:47:26.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:25 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch 2026-03-10T07:47:26.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:25 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch 2026-03-10T07:47:26.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:25 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' 
entity='mgr.vm05.blexke' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T07:47:26.753 INFO:journalctl@ceph.osd.1.vm05.stdout:Mar 10 07:47:26 vm05 ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508-osd-1[75139]: 2026-03-10T07:47:26.487+0000 7fbe65395700 -1 osd.1 0 waiting for initial osdmap 2026-03-10T07:47:26.753 INFO:journalctl@ceph.osd.1.vm05.stdout:Mar 10 07:47:26 vm05 ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508-osd-1[75139]: 2026-03-10T07:47:26.501+0000 7fbe6198b700 -1 osd.1 12 set_numa_affinity unable to identify public interface '' numa node: (2) No such file or directory 2026-03-10T07:47:26.753 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:26 vm05 ceph-mon[50387]: pgmap v19: 0 pgs: ; 0 B data, 26 MiB used, 20 GiB / 20 GiB avail 2026-03-10T07:47:26.753 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:26 vm05 ceph-mon[50387]: from='client.14298 -' entity='client.admin' cmd=[{"prefix": "orch daemon add osd", "svc_arg": "vm05:/dev/vdc", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T07:47:26.753 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:26 vm05 ceph-mon[50387]: from='osd.1 [v2:192.168.123.105:6810/1508516757,v1:192.168.123.105:6811/1508516757]' entity='osd.1' cmd='[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["1"]}]': finished 2026-03-10T07:47:26.753 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:26 vm05 ceph-mon[50387]: osdmap e11: 2 total, 1 up, 2 in 2026-03-10T07:47:26.753 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:26 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "osd metadata", "id": 1}]: dispatch 2026-03-10T07:47:26.753 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:26 vm05 ceph-mon[50387]: from='osd.1 [v2:192.168.123.105:6810/1508516757,v1:192.168.123.105:6811/1508516757]' entity='osd.1' cmd=[{"prefix": "osd crush create-or-move", "id": 1, "weight":0.0195, "args": ["host=vm05", "root=default"]}]: dispatch 
2026-03-10T07:47:26.753 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:26 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' 2026-03-10T07:47:26.753 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:26 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' 2026-03-10T07:47:26.753 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:26 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "config rm", "who": "osd/host:vm05", "name": "osd_memory_target"}]: dispatch 2026-03-10T07:47:26.753 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:26 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T07:47:26.753 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:26 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T07:47:26.753 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:26 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' 2026-03-10T07:47:26.753 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:26 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T07:47:26.753 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:26 vm05 ceph-mon[50387]: from='client.? 
192.168.123.105:0/161393598' entity='client.bootstrap-osd' cmd=[{"prefix": "osd new", "uuid": "ef60316b-3b61-4644-aa31-97cef548ba7e"}]: dispatch 2026-03-10T07:47:26.753 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:26 vm05 ceph-mon[50387]: from='osd.1 [v2:192.168.123.105:6810/1508516757,v1:192.168.123.105:6811/1508516757]' entity='osd.1' cmd='[{"prefix": "osd crush create-or-move", "id": 1, "weight":0.0195, "args": ["host=vm05", "root=default"]}]': finished 2026-03-10T07:47:26.753 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:26 vm05 ceph-mon[50387]: from='client.? 192.168.123.105:0/161393598' entity='client.bootstrap-osd' cmd='[{"prefix": "osd new", "uuid": "ef60316b-3b61-4644-aa31-97cef548ba7e"}]': finished 2026-03-10T07:47:26.754 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:26 vm05 ceph-mon[50387]: osdmap e12: 3 total, 1 up, 3 in 2026-03-10T07:47:26.754 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:26 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "osd metadata", "id": 1}]: dispatch 2026-03-10T07:47:26.754 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:26 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch 2026-03-10T07:47:26.754 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:26 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "osd metadata", "id": 1}]: dispatch 2026-03-10T07:47:26.754 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:26 vm08 ceph-mon[59917]: pgmap v19: 0 pgs: ; 0 B data, 26 MiB used, 20 GiB / 20 GiB avail 2026-03-10T07:47:26.754 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:26 vm08 ceph-mon[59917]: from='client.14298 -' entity='client.admin' cmd=[{"prefix": "orch daemon add osd", "svc_arg": "vm05:/dev/vdc", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T07:47:26.754 
INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:26 vm08 ceph-mon[59917]: from='osd.1 [v2:192.168.123.105:6810/1508516757,v1:192.168.123.105:6811/1508516757]' entity='osd.1' cmd='[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["1"]}]': finished 2026-03-10T07:47:26.754 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:26 vm08 ceph-mon[59917]: osdmap e11: 2 total, 1 up, 2 in 2026-03-10T07:47:26.754 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:26 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "osd metadata", "id": 1}]: dispatch 2026-03-10T07:47:26.754 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:26 vm08 ceph-mon[59917]: from='osd.1 [v2:192.168.123.105:6810/1508516757,v1:192.168.123.105:6811/1508516757]' entity='osd.1' cmd=[{"prefix": "osd crush create-or-move", "id": 1, "weight":0.0195, "args": ["host=vm05", "root=default"]}]: dispatch 2026-03-10T07:47:26.754 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:26 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' 2026-03-10T07:47:26.754 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:26 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' 2026-03-10T07:47:26.754 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:26 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "config rm", "who": "osd/host:vm05", "name": "osd_memory_target"}]: dispatch 2026-03-10T07:47:26.754 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:26 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T07:47:26.754 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:26 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 
2026-03-10T07:47:26.754 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:26 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' 2026-03-10T07:47:26.754 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:26 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T07:47:26.754 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:26 vm08 ceph-mon[59917]: from='client.? 192.168.123.105:0/161393598' entity='client.bootstrap-osd' cmd=[{"prefix": "osd new", "uuid": "ef60316b-3b61-4644-aa31-97cef548ba7e"}]: dispatch 2026-03-10T07:47:26.754 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:26 vm08 ceph-mon[59917]: from='osd.1 [v2:192.168.123.105:6810/1508516757,v1:192.168.123.105:6811/1508516757]' entity='osd.1' cmd='[{"prefix": "osd crush create-or-move", "id": 1, "weight":0.0195, "args": ["host=vm05", "root=default"]}]': finished 2026-03-10T07:47:26.754 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:26 vm08 ceph-mon[59917]: from='client.? 
192.168.123.105:0/161393598' entity='client.bootstrap-osd' cmd='[{"prefix": "osd new", "uuid": "ef60316b-3b61-4644-aa31-97cef548ba7e"}]': finished 2026-03-10T07:47:26.754 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:26 vm08 ceph-mon[59917]: osdmap e12: 3 total, 1 up, 3 in 2026-03-10T07:47:26.754 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:26 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "osd metadata", "id": 1}]: dispatch 2026-03-10T07:47:26.754 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:26 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch 2026-03-10T07:47:26.754 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:26 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "osd metadata", "id": 1}]: dispatch 2026-03-10T07:47:28.158 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:27 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' 2026-03-10T07:47:28.158 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:27 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' 2026-03-10T07:47:28.158 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:27 vm05 ceph-mon[50387]: from='client.? 
192.168.123.105:0/2026380902' entity='client.bootstrap-osd' cmd=[{"prefix": "mon getmap"}]: dispatch 2026-03-10T07:47:28.158 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:27 vm05 ceph-mon[50387]: osd.1 [v2:192.168.123.105:6810/1508516757,v1:192.168.123.105:6811/1508516757] boot 2026-03-10T07:47:28.158 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:27 vm05 ceph-mon[50387]: osdmap e13: 3 total, 2 up, 3 in 2026-03-10T07:47:28.158 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:27 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "osd metadata", "id": 1}]: dispatch 2026-03-10T07:47:28.158 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:27 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch 2026-03-10T07:47:28.158 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:27 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' 2026-03-10T07:47:28.158 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:27 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' 2026-03-10T07:47:28.158 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:27 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T07:47:28.158 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:27 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T07:47:28.158 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:27 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' 2026-03-10T07:47:28.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:27 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' 
2026-03-10T07:47:28.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:27 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' 2026-03-10T07:47:28.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:27 vm08 ceph-mon[59917]: from='client.? 192.168.123.105:0/2026380902' entity='client.bootstrap-osd' cmd=[{"prefix": "mon getmap"}]: dispatch 2026-03-10T07:47:28.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:27 vm08 ceph-mon[59917]: osd.1 [v2:192.168.123.105:6810/1508516757,v1:192.168.123.105:6811/1508516757] boot 2026-03-10T07:47:28.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:27 vm08 ceph-mon[59917]: osdmap e13: 3 total, 2 up, 3 in 2026-03-10T07:47:28.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:27 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "osd metadata", "id": 1}]: dispatch 2026-03-10T07:47:28.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:27 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch 2026-03-10T07:47:28.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:27 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' 2026-03-10T07:47:28.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:27 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' 2026-03-10T07:47:28.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:27 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T07:47:28.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:27 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T07:47:28.168 
INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:27 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' 2026-03-10T07:47:29.158 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:28 vm05 ceph-mon[50387]: purged_snaps scrub starts 2026-03-10T07:47:29.158 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:28 vm05 ceph-mon[50387]: purged_snaps scrub ok 2026-03-10T07:47:29.158 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:28 vm05 ceph-mon[50387]: pgmap v22: 0 pgs: ; 0 B data, 26 MiB used, 20 GiB / 20 GiB avail 2026-03-10T07:47:29.158 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:28 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T07:47:29.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:28 vm08 ceph-mon[59917]: purged_snaps scrub starts 2026-03-10T07:47:29.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:28 vm08 ceph-mon[59917]: purged_snaps scrub ok 2026-03-10T07:47:29.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:28 vm08 ceph-mon[59917]: pgmap v22: 0 pgs: ; 0 B data, 26 MiB used, 20 GiB / 20 GiB avail 2026-03-10T07:47:29.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:28 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T07:47:30.103 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:29 vm05 ceph-mon[50387]: osdmap e14: 3 total, 2 up, 3 in 2026-03-10T07:47:30.103 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:29 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch 2026-03-10T07:47:30.103 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:29 vm05 ceph-mon[50387]: pgmap v25: 0 pgs: ; 0 B data, 52 MiB used, 40 GiB / 40 GiB avail 
2026-03-10T07:47:30.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:29 vm08 ceph-mon[59917]: osdmap e14: 3 total, 2 up, 3 in 2026-03-10T07:47:30.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:29 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch 2026-03-10T07:47:30.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:29 vm08 ceph-mon[59917]: pgmap v25: 0 pgs: ; 0 B data, 52 MiB used, 40 GiB / 40 GiB avail 2026-03-10T07:47:31.088 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:30 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "auth get", "entity": "osd.2"}]: dispatch 2026-03-10T07:47:31.088 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:30 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T07:47:31.088 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:30 vm05 ceph-mon[50387]: Deploying daemon osd.2 on vm05 2026-03-10T07:47:31.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:30 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "auth get", "entity": "osd.2"}]: dispatch 2026-03-10T07:47:31.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:30 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T07:47:31.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:30 vm08 ceph-mon[59917]: Deploying daemon osd.2 on vm05 2026-03-10T07:47:31.966 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:31 vm05 ceph-mon[50387]: pgmap v26: 0 pgs: ; 0 B data, 52 MiB used, 40 GiB / 40 GiB avail 2026-03-10T07:47:32.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:31 vm08 ceph-mon[59917]: pgmap v26: 0 pgs: ; 0 B data, 52 
MiB used, 40 GiB / 40 GiB avail 2026-03-10T07:47:33.561 INFO:teuthology.orchestra.run.vm05.stdout:Created osd(s) 2 on host 'vm05' 2026-03-10T07:47:33.561 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:47:33.552+0000 7f36ff7fe700 1 -- 192.168.123.105:0/4228835381 <== mgr.14223 v2:192.168.123.105:6800/2 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+32 (secure 0 0 0) 0x7f37101086e0 con 0x7f36f806c290 2026-03-10T07:47:33.561 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:47:33.554+0000 7f3714c02700 1 -- 192.168.123.105:0/4228835381 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f36f806c290 msgr2=0x7f36f806e750 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:47:33.561 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:47:33.554+0000 7f3714c02700 1 --2- 192.168.123.105:0/4228835381 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f36f806c290 0x7f36f806e750 secure :-1 s=READY pgs=35 cs=0 l=1 rev1=1 crypto rx=0x7f3710102bd0 tx=0x7f3704009380 comp rx=0 tx=0).stop 2026-03-10T07:47:33.561 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:47:33.554+0000 7f3714c02700 1 -- 192.168.123.105:0/4228835381 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3710102ea0 msgr2=0x7f3710198660 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:47:33.561 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:47:33.554+0000 7f3714c02700 1 --2- 192.168.123.105:0/4228835381 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3710102ea0 0x7f3710198660 secure :-1 s=READY pgs=182 cs=0 l=1 rev1=1 crypto rx=0x7f3700006010 tx=0x7f3700004c30 comp rx=0 tx=0).stop 2026-03-10T07:47:33.561 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:47:33.554+0000 7f3714c02700 1 -- 192.168.123.105:0/4228835381 shutdown_connections 2026-03-10T07:47:33.561 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:47:33.554+0000 7f3714c02700 1 --2- 192.168.123.105:0/4228835381 
>> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f36f806c290 0x7f36f806e750 unknown :-1 s=CLOSED pgs=35 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:47:33.561 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:47:33.554+0000 7f3714c02700 1 --2- 192.168.123.105:0/4228835381 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3710102ea0 0x7f3710198660 unknown :-1 s=CLOSED pgs=182 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:47:33.561 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:47:33.554+0000 7f3714c02700 1 --2- 192.168.123.105:0/4228835381 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f3710104090 0x7f3710198ba0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:47:33.561 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:47:33.554+0000 7f3714c02700 1 -- 192.168.123.105:0/4228835381 >> 192.168.123.105:0/4228835381 conn(0x7f37100fe470 msgr2=0x7f3710100740 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:47:33.561 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:47:33.554+0000 7f3714c02700 1 -- 192.168.123.105:0/4228835381 shutdown_connections 2026-03-10T07:47:33.561 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:47:33.554+0000 7f3714c02700 1 -- 192.168.123.105:0/4228835381 wait complete. 2026-03-10T07:47:33.627 DEBUG:teuthology.orchestra.run.vm05:osd.2> sudo journalctl -f -n 0 -u ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508@osd.2.service 2026-03-10T07:47:33.629 INFO:tasks.cephadm:Deploying osd.3 on vm08 with /dev/vde... 
2026-03-10T07:47:33.629 DEBUG:teuthology.orchestra.run.vm08:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 ceph-volume -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 12e9780e-1c55-11f1-8896-79f7c2e9b508 -- lvm zap /dev/vde
2026-03-10T07:47:33.659 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:33 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke'
2026-03-10T07:47:33.659 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:33 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke'
2026-03-10T07:47:33.659 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:33 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch
2026-03-10T07:47:33.659 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:33 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke'
2026-03-10T07:47:33.765 INFO:teuthology.orchestra.run.vm08.stderr:Inferring config /var/lib/ceph/12e9780e-1c55-11f1-8896-79f7c2e9b508/mon.vm08/config
2026-03-10T07:47:33.793 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:33 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke'
2026-03-10T07:47:33.793 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:33 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke'
2026-03-10T07:47:33.793 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:33 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch
2026-03-10T07:47:33.793 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:33 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke'
2026-03-10T07:47:34.287 INFO:teuthology.orchestra.run.vm08.stdout:
2026-03-10T07:47:34.302 DEBUG:teuthology.orchestra.run.vm08:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 12e9780e-1c55-11f1-8896-79f7c2e9b508 -- ceph orch daemon add osd vm08:/dev/vde
2026-03-10T07:47:34.455 INFO:teuthology.orchestra.run.vm08.stderr:Inferring config /var/lib/ceph/12e9780e-1c55-11f1-8896-79f7c2e9b508/mon.vm08/config
2026-03-10T07:47:34.704 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:47:34.702+0000 7fbf20575700 1 -- 192.168.123.108:0/4197099151 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fbf18101a50 msgr2=0x7fbf18103e40 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T07:47:34.704 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:47:34.702+0000 7fbf20575700 1 --2- 192.168.123.108:0/4197099151 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fbf18101a50 0x7fbf18103e40 secure :-1 s=READY pgs=5 cs=0 l=1 rev1=1 crypto rx=0x7fbf08009b50 tx=0x7fbf08009e60 comp rx=0 tx=0).stop
2026-03-10T07:47:34.704 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:47:34.702+0000 7fbf20575700 1 -- 192.168.123.108:0/4197099151 shutdown_connections
2026-03-10T07:47:34.704 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:47:34.702+0000 7fbf20575700 1 --2- 192.168.123.108:0/4197099151 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fbf18104380 0x7fbf18106770 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T07:47:34.704 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:47:34.702+0000 7fbf20575700 1 --2- 192.168.123.108:0/4197099151 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fbf18101a50 0x7fbf18103e40 unknown :-1 s=CLOSED pgs=5 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T07:47:34.704 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:47:34.702+0000 7fbf20575700 1 -- 192.168.123.108:0/4197099151 >> 192.168.123.108:0/4197099151 conn(0x7fbf180fb380 msgr2=0x7fbf180fd7e0 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-10T07:47:34.704 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:47:34.703+0000 7fbf20575700 1 -- 192.168.123.108:0/4197099151 shutdown_connections
2026-03-10T07:47:34.704 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:47:34.703+0000 7fbf20575700 1 -- 192.168.123.108:0/4197099151 wait complete.
2026-03-10T07:47:34.704 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:47:34.703+0000 7fbf20575700 1 Processor -- start
2026-03-10T07:47:34.704 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:47:34.703+0000 7fbf20575700 1 -- start start
2026-03-10T07:47:34.704 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:47:34.703+0000 7fbf20575700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fbf18101a50 0x7fbf1819ce10 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T07:47:34.705 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:47:34.703+0000 7fbf20575700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fbf18104380 0x7fbf1819d350 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T07:47:34.705 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:47:34.703+0000 7fbf20575700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fbf1819d970 con 0x7fbf18101a50
2026-03-10T07:47:34.705 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:47:34.703+0000 7fbf20575700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fbf1819dab0 con 0x7fbf18104380
2026-03-10T07:47:34.705 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:47:34.703+0000 7fbf1db10700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fbf18104380 0x7fbf1819d350 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-10T07:47:34.705 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:47:34.703+0000 7fbf1db10700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fbf18104380 0x7fbf1819d350 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.108:3300/0 says I am v2:192.168.123.108:50584/0 (socket says 192.168.123.108:50584)
2026-03-10T07:47:34.705 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:47:34.703+0000 7fbf1db10700 1 -- 192.168.123.108:0/2643579946 learned_addr learned my addr 192.168.123.108:0/2643579946 (peer_addr_for_me v2:192.168.123.108:0/0)
2026-03-10T07:47:34.705 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:47:34.703+0000 7fbf1e311700 1 --2- 192.168.123.108:0/2643579946 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fbf18101a50 0x7fbf1819ce10 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-10T07:47:34.705 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:47:34.704+0000 7fbf1db10700 1 -- 192.168.123.108:0/2643579946 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fbf18101a50 msgr2=0x7fbf1819ce10 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T07:47:34.705 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:47:34.704+0000 7fbf1db10700 1 --2- 192.168.123.108:0/2643579946 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fbf18101a50 0x7fbf1819ce10 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T07:47:34.705 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:47:34.704+0000 7fbf1db10700 1 -- 192.168.123.108:0/2643579946 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fbf080097e0 con 0x7fbf18104380
2026-03-10T07:47:34.705 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:47:34.704+0000 7fbf1db10700 1 --2- 192.168.123.108:0/2643579946 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fbf18104380 0x7fbf1819d350 secure :-1 s=READY pgs=6 cs=0 l=1 rev1=1 crypto rx=0x7fbf1400d8d0 tx=0x7fbf1400dbe0 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-10T07:47:34.706 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:47:34.704+0000 7fbf0f7fe700 1 -- 192.168.123.108:0/2643579946 <== mon.1 v2:192.168.123.108:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fbf14009880 con 0x7fbf18104380
2026-03-10T07:47:34.706 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:47:34.704+0000 7fbf20575700 1 -- 192.168.123.108:0/2643579946 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fbf181a2560 con 0x7fbf18104380
2026-03-10T07:47:34.706 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:47:34.704+0000 7fbf20575700 1 -- 192.168.123.108:0/2643579946 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fbf181a2ab0 con 0x7fbf18104380
2026-03-10T07:47:34.706 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:47:34.704+0000 7fbf0f7fe700 1 -- 192.168.123.108:0/2643579946 <== mon.1 v2:192.168.123.108:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fbf14010460 con 0x7fbf18104380
2026-03-10T07:47:34.706 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:47:34.704+0000 7fbf0f7fe700 1 -- 192.168.123.108:0/2643579946 <== mon.1 v2:192.168.123.108:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fbf1400f5d0 con 0x7fbf18104380
2026-03-10T07:47:34.707 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:47:34.706+0000 7fbf20575700 1 -- 192.168.123.108:0/2643579946 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fbefc005320 con 0x7fbf18104380
2026-03-10T07:47:34.710 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:47:34.709+0000 7fbf0f7fe700 1 -- 192.168.123.108:0/2643579946 <== mon.1 v2:192.168.123.108:3300/0 4 ==== mgrmap(e 18) v1 ==== 90252+0+0 (secure 0 0 0) 0x7fbf140099e0 con 0x7fbf18104380
2026-03-10T07:47:34.710 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:47:34.709+0000 7fbf0f7fe700 1 --2- 192.168.123.108:0/2643579946 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fbf0406c460 0x7fbf0406e920 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T07:47:34.711 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:47:34.709+0000 7fbf0f7fe700 1 -- 192.168.123.108:0/2643579946 <== mon.1 v2:192.168.123.108:3300/0 5 ==== osd_map(14..14 src has 1..14) v4 ==== 2347+0+0 (secure 0 0 0) 0x7fbf1408ac30 con 0x7fbf18104380
2026-03-10T07:47:34.711 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:47:34.710+0000 7fbf1e311700 1 --2- 192.168.123.108:0/2643579946 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fbf0406c460 0x7fbf0406e920 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-10T07:47:34.711 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:47:34.710+0000 7fbf0f7fe700 1 -- 192.168.123.108:0/2643579946 <== mon.1 v2:192.168.123.108:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7fbf140b7180 con 0x7fbf18104380
2026-03-10T07:47:34.712 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:47:34.710+0000 7fbf1e311700 1 --2- 192.168.123.108:0/2643579946 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fbf0406c460 0x7fbf0406e920 secure :-1 s=READY pgs=41 cs=0 l=1 rev1=1 crypto rx=0x7fbf08006010 tx=0x7fbf08005ea0 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-10T07:47:34.798 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:34 vm08 ceph-mon[59917]: pgmap v27: 0 pgs: ; 0 B data, 52 MiB used, 40 GiB / 40 GiB avail
2026-03-10T07:47:34.798 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:34 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke'
2026-03-10T07:47:34.798 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:34 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke'
2026-03-10T07:47:34.798 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:34 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke'
2026-03-10T07:47:34.829 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:47:34.827+0000 7fbf20575700 1 -- 192.168.123.108:0/2643579946 --> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] -- mgr_command(tid 0: {"prefix": "orch daemon add osd", "svc_arg": "vm08:/dev/vde", "target": ["mon-mgr", ""]}) v1 -- 0x7fbefc000bf0 con 0x7fbf0406c460
2026-03-10T07:47:34.908 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:34 vm05 ceph-mon[50387]: pgmap v27: 0 pgs: ; 0 B data, 52 MiB used, 40 GiB / 40 GiB avail
2026-03-10T07:47:34.908 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:34 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke'
2026-03-10T07:47:34.908 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:34 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke'
2026-03-10T07:47:34.908 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:34 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke'
2026-03-10T07:47:35.408 INFO:journalctl@ceph.osd.2.vm05.stdout:Mar 10 07:47:35 vm05 ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508-osd-2[82489]: 2026-03-10T07:47:35.192+0000 7f3f005fd640 -1 osd.2 0 log_to_monitors true
2026-03-10T07:47:35.787 INFO:journalctl@ceph.osd.2.vm05.stdout:Mar 10 07:47:35 vm05 ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508-osd-2[82489]: 2026-03-10T07:47:35.683+0000 7f3ef6c63700 -1 osd.2 0 waiting for initial osdmap
2026-03-10T07:47:35.787 INFO:journalctl@ceph.osd.2.vm05.stdout:Mar 10 07:47:35 vm05 ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508-osd-2[82489]: 2026-03-10T07:47:35.690+0000 7f3eef251700 -1 osd.2 16 set_numa_affinity unable to identify public interface '' numa node: (2) No such file or directory
2026-03-10T07:47:35.788 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:35 vm05 ceph-mon[50387]: from='client.24135 -' entity='client.admin' cmd=[{"prefix": "orch daemon add osd", "svc_arg": "vm08:/dev/vde", "target": ["mon-mgr", ""]}]: dispatch
2026-03-10T07:47:35.788 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:35 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
2026-03-10T07:47:35.788 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:35 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
2026-03-10T07:47:35.788 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:35 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-10T07:47:35.788 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:35 vm05 ceph-mon[50387]: from='osd.2 [v2:192.168.123.105:6818/4116227648,v1:192.168.123.105:6819/4116227648]' entity='osd.2' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["2"]}]: dispatch
2026-03-10T07:47:35.788 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:35 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke'
2026-03-10T07:47:35.788 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:35 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke'
2026-03-10T07:47:35.788 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:35 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "config rm", "who": "osd/host:vm05", "name": "osd_memory_target"}]: dispatch
2026-03-10T07:47:35.788 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:35 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-10T07:47:35.788 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:35 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
2026-03-10T07:47:35.788 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:35 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke'
2026-03-10T07:47:35.788 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:35 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch
2026-03-10T07:47:35.919 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:35 vm08 ceph-mon[59917]: from='client.24135 -' entity='client.admin' cmd=[{"prefix": "orch daemon add osd", "svc_arg": "vm08:/dev/vde", "target": ["mon-mgr", ""]}]: dispatch
2026-03-10T07:47:35.919 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:35 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch
2026-03-10T07:47:35.919 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:35 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch
2026-03-10T07:47:35.919 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:35 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-10T07:47:35.919 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:35 vm08 ceph-mon[59917]: from='osd.2 [v2:192.168.123.105:6818/4116227648,v1:192.168.123.105:6819/4116227648]' entity='osd.2' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["2"]}]: dispatch
2026-03-10T07:47:35.919 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:35 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke'
2026-03-10T07:47:35.919 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:35 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke'
2026-03-10T07:47:35.919 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:35 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "config rm", "who": "osd/host:vm05", "name": "osd_memory_target"}]: dispatch
2026-03-10T07:47:35.919 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:35 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-10T07:47:35.919 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:35 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
2026-03-10T07:47:35.919 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:35 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke'
2026-03-10T07:47:35.919 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:35 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch
2026-03-10T07:47:36.657 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:36 vm05 ceph-mon[50387]: pgmap v28: 0 pgs: ; 0 B data, 52 MiB used, 40 GiB / 40 GiB avail
2026-03-10T07:47:36.658 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:36 vm05 ceph-mon[50387]: Detected new or changed devices on vm05
2026-03-10T07:47:36.658 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:36 vm05 ceph-mon[50387]: from='osd.2 [v2:192.168.123.105:6818/4116227648,v1:192.168.123.105:6819/4116227648]' entity='osd.2' cmd='[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["2"]}]': finished
2026-03-10T07:47:36.658 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:36 vm05 ceph-mon[50387]: osdmap e15: 3 total, 2 up, 3 in
2026-03-10T07:47:36.658 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:36 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch
2026-03-10T07:47:36.658 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:36 vm05 ceph-mon[50387]: from='osd.2 [v2:192.168.123.105:6818/4116227648,v1:192.168.123.105:6819/4116227648]' entity='osd.2' cmd=[{"prefix": "osd crush create-or-move", "id": 2, "weight":0.0195, "args": ["host=vm05", "root=default"]}]: dispatch
2026-03-10T07:47:36.658 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:36 vm05 ceph-mon[50387]: from='client.? 192.168.123.108:0/923918602' entity='client.bootstrap-osd' cmd=[{"prefix": "osd new", "uuid": "4cc7cd7e-7756-40b9-9bc8-029e26495239"}]: dispatch
2026-03-10T07:47:36.658 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:36 vm05 ceph-mon[50387]: from='client.? ' entity='client.bootstrap-osd' cmd=[{"prefix": "osd new", "uuid": "4cc7cd7e-7756-40b9-9bc8-029e26495239"}]: dispatch
2026-03-10T07:47:36.658 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:36 vm05 ceph-mon[50387]: from='osd.2 [v2:192.168.123.105:6818/4116227648,v1:192.168.123.105:6819/4116227648]' entity='osd.2' cmd='[{"prefix": "osd crush create-or-move", "id": 2, "weight":0.0195, "args": ["host=vm05", "root=default"]}]': finished
2026-03-10T07:47:36.658 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:36 vm05 ceph-mon[50387]: from='client.? ' entity='client.bootstrap-osd' cmd='[{"prefix": "osd new", "uuid": "4cc7cd7e-7756-40b9-9bc8-029e26495239"}]': finished
2026-03-10T07:47:36.658 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:36 vm05 ceph-mon[50387]: osdmap e16: 4 total, 2 up, 4 in
2026-03-10T07:47:36.658 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:36 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch
2026-03-10T07:47:36.658 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:36 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "osd metadata", "id": 3}]: dispatch
2026-03-10T07:47:36.658 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:36 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch
2026-03-10T07:47:36.658 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:36 vm05 ceph-mon[50387]: from='client.? 192.168.123.108:0/1327158800' entity='client.bootstrap-osd' cmd=[{"prefix": "mon getmap"}]: dispatch
2026-03-10T07:47:36.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:36 vm08 ceph-mon[59917]: pgmap v28: 0 pgs: ; 0 B data, 52 MiB used, 40 GiB / 40 GiB avail
2026-03-10T07:47:36.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:36 vm08 ceph-mon[59917]: Detected new or changed devices on vm05
2026-03-10T07:47:36.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:36 vm08 ceph-mon[59917]: from='osd.2 [v2:192.168.123.105:6818/4116227648,v1:192.168.123.105:6819/4116227648]' entity='osd.2' cmd='[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["2"]}]': finished
2026-03-10T07:47:36.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:36 vm08 ceph-mon[59917]: osdmap e15: 3 total, 2 up, 3 in
2026-03-10T07:47:36.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:36 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch
2026-03-10T07:47:36.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:36 vm08 ceph-mon[59917]: from='osd.2 [v2:192.168.123.105:6818/4116227648,v1:192.168.123.105:6819/4116227648]' entity='osd.2' cmd=[{"prefix": "osd crush create-or-move", "id": 2, "weight":0.0195, "args": ["host=vm05", "root=default"]}]: dispatch
2026-03-10T07:47:36.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:36 vm08 ceph-mon[59917]: from='client.? 192.168.123.108:0/923918602' entity='client.bootstrap-osd' cmd=[{"prefix": "osd new", "uuid": "4cc7cd7e-7756-40b9-9bc8-029e26495239"}]: dispatch
2026-03-10T07:47:36.919 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:36 vm08 ceph-mon[59917]: from='client.? ' entity='client.bootstrap-osd' cmd=[{"prefix": "osd new", "uuid": "4cc7cd7e-7756-40b9-9bc8-029e26495239"}]: dispatch
2026-03-10T07:47:36.919 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:36 vm08 ceph-mon[59917]: from='osd.2 [v2:192.168.123.105:6818/4116227648,v1:192.168.123.105:6819/4116227648]' entity='osd.2' cmd='[{"prefix": "osd crush create-or-move", "id": 2, "weight":0.0195, "args": ["host=vm05", "root=default"]}]': finished
2026-03-10T07:47:36.919 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:36 vm08 ceph-mon[59917]: from='client.? ' entity='client.bootstrap-osd' cmd='[{"prefix": "osd new", "uuid": "4cc7cd7e-7756-40b9-9bc8-029e26495239"}]': finished
2026-03-10T07:47:36.919 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:36 vm08 ceph-mon[59917]: osdmap e16: 4 total, 2 up, 4 in
2026-03-10T07:47:36.919 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:36 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch
2026-03-10T07:47:36.919 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:36 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "osd metadata", "id": 3}]: dispatch
2026-03-10T07:47:36.919 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:36 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch
2026-03-10T07:47:36.919 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:36 vm08 ceph-mon[59917]: from='client.? 192.168.123.108:0/1327158800' entity='client.bootstrap-osd' cmd=[{"prefix": "mon getmap"}]: dispatch
2026-03-10T07:47:37.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:37 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke'
2026-03-10T07:47:37.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:37 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke'
2026-03-10T07:47:37.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:37 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-10T07:47:37.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:37 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
2026-03-10T07:47:37.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:37 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke'
2026-03-10T07:47:37.908 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:37 vm05 ceph-mon[50387]: osd.2 [v2:192.168.123.105:6818/4116227648,v1:192.168.123.105:6819/4116227648] boot
2026-03-10T07:47:37.908 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:37 vm05 ceph-mon[50387]: osdmap e17: 4 total, 3 up, 4 in
2026-03-10T07:47:37.908 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:37 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch
2026-03-10T07:47:37.908 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:37 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "osd metadata", "id": 3}]: dispatch
2026-03-10T07:47:37.908 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:37 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch
2026-03-10T07:47:37.908 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:37 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "osd pool create", "format": "json", "pool": ".mgr", "pg_num": 1, "pg_num_min": 1, "pg_num_max": 32, "yes_i_really_mean_it": true}]: dispatch
2026-03-10T07:47:37.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:37 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke'
2026-03-10T07:47:37.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:37 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke'
2026-03-10T07:47:37.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:37 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-10T07:47:37.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:37 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
2026-03-10T07:47:37.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:37 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke'
2026-03-10T07:47:37.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:37 vm08 ceph-mon[59917]: osd.2 [v2:192.168.123.105:6818/4116227648,v1:192.168.123.105:6819/4116227648] boot
2026-03-10T07:47:37.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:37 vm08 ceph-mon[59917]: osdmap e17: 4 total, 3 up, 4 in
2026-03-10T07:47:37.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:37 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch
2026-03-10T07:47:37.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:37 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "osd metadata", "id": 3}]: dispatch
2026-03-10T07:47:37.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:37 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch
2026-03-10T07:47:37.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:37 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "osd pool create", "format": "json", "pool": ".mgr", "pg_num": 1, "pg_num_min": 1, "pg_num_max": 32, "yes_i_really_mean_it": true}]: dispatch
2026-03-10T07:47:38.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:38 vm05 ceph-mon[50387]: purged_snaps scrub starts
2026-03-10T07:47:38.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:38 vm05 ceph-mon[50387]: purged_snaps scrub ok
2026-03-10T07:47:38.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:38 vm05 ceph-mon[50387]: pgmap v32: 0 pgs: ; 0 B data, 52 MiB used, 40 GiB / 40 GiB avail
2026-03-10T07:47:38.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:38 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd='[{"prefix": "osd pool create", "format": "json", "pool": ".mgr", "pg_num": 1, "pg_num_min": 1, "pg_num_max": 32, "yes_i_really_mean_it": true}]': finished
2026-03-10T07:47:38.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:38 vm05 ceph-mon[50387]: osdmap e18: 4 total, 3 up, 4 in
2026-03-10T07:47:38.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:38 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "osd metadata", "id": 3}]: dispatch
2026-03-10T07:47:38.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:38 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "osd pool application enable", "format": "json", "pool": ".mgr", "app": "mgr", "yes_i_really_mean_it": true}]: dispatch
2026-03-10T07:47:38.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:38 vm08 ceph-mon[59917]: purged_snaps scrub starts
2026-03-10T07:47:38.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:38 vm08 ceph-mon[59917]: purged_snaps scrub ok
2026-03-10T07:47:38.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:38 vm08 ceph-mon[59917]: pgmap v32: 0 pgs: ; 0 B data, 52 MiB used, 40 GiB / 40 GiB avail
2026-03-10T07:47:38.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:38 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd='[{"prefix": "osd pool create", "format": "json", "pool": ".mgr", "pg_num": 1, "pg_num_min": 1, "pg_num_max": 32, "yes_i_really_mean_it": true}]': finished
2026-03-10T07:47:38.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:38 vm08 ceph-mon[59917]: osdmap e18: 4 total, 3 up, 4 in
2026-03-10T07:47:38.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:38 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "osd metadata", "id": 3}]: dispatch
2026-03-10T07:47:38.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:38 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "osd pool application enable", "format": "json", "pool": ".mgr", "app": "mgr", "yes_i_really_mean_it": true}]: dispatch
2026-03-10T07:47:40.112 INFO:journalctl@ceph.osd.1.vm05.stdout:Mar 10 07:47:40 vm05 sudo[87801]: ceph : PWD=/ ; USER=root ; COMMAND=/usr/sbin/smartctl -x --json=o /dev/vdd
2026-03-10T07:47:40.112 INFO:journalctl@ceph.osd.1.vm05.stdout:Mar 10 07:47:40 vm05 sudo[87801]: pam_systemd(sudo:session): Failed to connect to system bus: No such file or directory
2026-03-10T07:47:40.112 INFO:journalctl@ceph.osd.1.vm05.stdout:Mar 10 07:47:40 vm05 sudo[87801]: pam_unix(sudo:session): session opened for user root by (uid=167)
2026-03-10T07:47:40.112 INFO:journalctl@ceph.osd.1.vm05.stdout:Mar 10 07:47:40 vm05 sudo[87801]: pam_unix(sudo:session): session closed for user root
2026-03-10T07:47:40.112 INFO:journalctl@ceph.osd.0.vm05.stdout:Mar 10 07:47:39 vm05 sudo[87798]: ceph : PWD=/ ; USER=root ; COMMAND=/usr/sbin/smartctl -x --json=o /dev/vde
2026-03-10T07:47:40.113 INFO:journalctl@ceph.osd.0.vm05.stdout:Mar 10 07:47:39 vm05 sudo[87798]: pam_systemd(sudo:session): Failed to connect to system bus: No such file or directory
2026-03-10T07:47:40.113 INFO:journalctl@ceph.osd.0.vm05.stdout:Mar 10 07:47:39 vm05 sudo[87798]: pam_unix(sudo:session): session opened for user root by (uid=167)
2026-03-10T07:47:40.113 INFO:journalctl@ceph.osd.0.vm05.stdout:Mar 10 07:47:39 vm05 sudo[87798]: pam_unix(sudo:session): session closed for user root
2026-03-10T07:47:40.389 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:40 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd='[{"prefix": "osd pool application enable", "format": "json", "pool": ".mgr", "app": "mgr", "yes_i_really_mean_it": true}]': finished
2026-03-10T07:47:40.389 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:40 vm05 ceph-mon[50387]: osdmap e19: 4 total, 3 up, 4 in
2026-03-10T07:47:40.389 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:40 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "osd metadata", "id": 3}]: dispatch
2026-03-10T07:47:40.389 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:40 vm05 ceph-mon[50387]: pgmap v35: 1 pgs: 1 creating+peering; 0 B data, 79 MiB used, 60 GiB / 60 GiB avail
2026-03-10T07:47:40.389 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:40 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "auth get", "entity": "osd.3"}]: dispatch
2026-03-10T07:47:40.389 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:40 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-10T07:47:40.390 INFO:journalctl@ceph.osd.2.vm05.stdout:Mar 10 07:47:40 vm05 sudo[87804]: ceph : PWD=/ ; USER=root ; COMMAND=/usr/sbin/smartctl -x --json=o /dev/vdc
2026-03-10T07:47:40.390 INFO:journalctl@ceph.osd.2.vm05.stdout:Mar 10 07:47:40 vm05 sudo[87804]: pam_systemd(sudo:session): Failed to connect to system bus: No such file or directory
2026-03-10T07:47:40.390 INFO:journalctl@ceph.osd.2.vm05.stdout:Mar 10 07:47:40 vm05 sudo[87804]: pam_unix(sudo:session): session opened for user root by (uid=167)
2026-03-10T07:47:40.390 INFO:journalctl@ceph.osd.2.vm05.stdout:Mar 10 07:47:40 vm05 sudo[87804]: pam_unix(sudo:session): session closed for user root
2026-03-10T07:47:40.419 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:40 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd='[{"prefix": "osd pool application enable", "format": "json", "pool": ".mgr", "app": "mgr", "yes_i_really_mean_it": true}]': finished
2026-03-10T07:47:40.419 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:40 vm08 ceph-mon[59917]: osdmap e19: 4 total, 3 up, 4 in
2026-03-10T07:47:40.419 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:40 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "osd metadata", "id": 3}]: dispatch
2026-03-10T07:47:40.419 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:40 vm08 ceph-mon[59917]: pgmap v35: 1 pgs: 1 creating+peering; 0 B data, 79 MiB used, 60 GiB / 60 GiB avail
2026-03-10T07:47:40.419 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:40 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "auth get", "entity": "osd.3"}]: dispatch
2026-03-10T07:47:40.419 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:40 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-10T07:47:40.657 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:40 vm05 sudo[87807]: ceph : PWD=/ ; USER=root ; COMMAND=/usr/sbin/smartctl -x --json=o /dev/vda
2026-03-10T07:47:40.658 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:40 vm05 sudo[87807]: pam_systemd(sudo:session): Failed to connect to system bus: No such file or directory
2026-03-10T07:47:40.658 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:40 vm05 sudo[87807]: pam_unix(sudo:session): session opened for user root by (uid=167)
2026-03-10T07:47:40.658 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:40 vm05 sudo[87807]: pam_unix(sudo:session): session closed for user root
2026-03-10T07:47:40.723 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:40 vm08 sudo[65485]: ceph : PWD=/ ; USER=root ; COMMAND=/usr/sbin/smartctl -x --json=o /dev/vda
2026-03-10T07:47:40.724 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:40 vm08 sudo[65485]: pam_systemd(sudo:session): Failed to connect to system bus: No such file or directory
2026-03-10T07:47:40.724 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:40 vm08 sudo[65485]: pam_unix(sudo:session): session opened for user root by (uid=167)
2026-03-10T07:47:40.724 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:40 vm08 sudo[65485]: pam_unix(sudo:session): session closed for user root
2026-03-10T07:47:41.332 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:41 vm08 ceph-mon[59917]: Deploying daemon osd.3 on vm08
2026-03-10T07:47:41.332 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:41 vm08 ceph-mon[59917]: osdmap e20: 4 total, 3 up, 4 in
2026-03-10T07:47:41.332 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:41 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "osd metadata", "id": 3}]: dispatch
2026-03-10T07:47:41.332
INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:41 vm08 ceph-mon[59917]: from='admin socket' entity='admin socket' cmd='smart' args=[json]: dispatch 2026-03-10T07:47:41.332 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:41 vm08 ceph-mon[59917]: from='admin socket' entity='admin socket' cmd=smart args=[json]: finished 2026-03-10T07:47:41.332 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:41 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "mon metadata", "id": "vm05"}]: dispatch 2026-03-10T07:47:41.332 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:41 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "mon metadata", "id": "vm08"}]: dispatch 2026-03-10T07:47:41.333 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:41 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "mon metadata", "id": "vm05"}]: dispatch 2026-03-10T07:47:41.333 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:41 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "mon metadata", "id": "vm08"}]: dispatch 2026-03-10T07:47:41.333 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:41 vm08 ceph-mon[59917]: from='admin socket' entity='admin socket' cmd='smart' args=[json]: dispatch 2026-03-10T07:47:41.333 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:41 vm08 ceph-mon[59917]: from='admin socket' entity='admin socket' cmd=smart args=[json]: finished 2026-03-10T07:47:41.407 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:41 vm05 ceph-mon[50387]: Deploying daemon osd.3 on vm08 2026-03-10T07:47:41.408 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:41 vm05 ceph-mon[50387]: osdmap e20: 4 total, 3 up, 4 in 2026-03-10T07:47:41.408 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:41 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' 
entity='mgr.vm05.blexke' cmd=[{"prefix": "osd metadata", "id": 3}]: dispatch 2026-03-10T07:47:41.408 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:41 vm05 ceph-mon[50387]: from='admin socket' entity='admin socket' cmd='smart' args=[json]: dispatch 2026-03-10T07:47:41.408 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:41 vm05 ceph-mon[50387]: from='admin socket' entity='admin socket' cmd=smart args=[json]: finished 2026-03-10T07:47:41.408 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:41 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "mon metadata", "id": "vm05"}]: dispatch 2026-03-10T07:47:41.408 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:41 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "mon metadata", "id": "vm08"}]: dispatch 2026-03-10T07:47:41.408 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:41 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "mon metadata", "id": "vm05"}]: dispatch 2026-03-10T07:47:41.408 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:41 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "mon metadata", "id": "vm08"}]: dispatch 2026-03-10T07:47:41.408 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:41 vm05 ceph-mon[50387]: from='admin socket' entity='admin socket' cmd='smart' args=[json]: dispatch 2026-03-10T07:47:41.408 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:41 vm05 ceph-mon[50387]: from='admin socket' entity='admin socket' cmd=smart args=[json]: finished 2026-03-10T07:47:42.128 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:47:42.126+0000 7fbf0f7fe700 1 -- 192.168.123.108:0/2643579946 <== mon.1 v2:192.168.123.108:3300/0 7 ==== mgrmap(e 19) v1 ==== 90308+0+0 (secure 0 0 0) 0x7fbf14056550 con 0x7fbf18104380 2026-03-10T07:47:42.511 
INFO:teuthology.orchestra.run.vm08.stdout:Created osd(s) 3 on host 'vm08' 2026-03-10T07:47:42.511 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:47:42.509+0000 7fbf0f7fe700 1 -- 192.168.123.108:0/2643579946 <== mgr.14223 v2:192.168.123.105:6800/2 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+32 (secure 0 0 0) 0x7fbefc000bf0 con 0x7fbf0406c460 2026-03-10T07:47:42.515 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:47:42.512+0000 7fbf20575700 1 -- 192.168.123.108:0/2643579946 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fbf0406c460 msgr2=0x7fbf0406e920 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:47:42.515 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:47:42.512+0000 7fbf20575700 1 --2- 192.168.123.108:0/2643579946 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fbf0406c460 0x7fbf0406e920 secure :-1 s=READY pgs=41 cs=0 l=1 rev1=1 crypto rx=0x7fbf08006010 tx=0x7fbf08005ea0 comp rx=0 tx=0).stop 2026-03-10T07:47:42.515 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:47:42.512+0000 7fbf20575700 1 -- 192.168.123.108:0/2643579946 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fbf18104380 msgr2=0x7fbf1819d350 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:47:42.515 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:47:42.512+0000 7fbf20575700 1 --2- 192.168.123.108:0/2643579946 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fbf18104380 0x7fbf1819d350 secure :-1 s=READY pgs=6 cs=0 l=1 rev1=1 crypto rx=0x7fbf1400d8d0 tx=0x7fbf1400dbe0 comp rx=0 tx=0).stop 2026-03-10T07:47:42.515 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:47:42.512+0000 7fbf20575700 1 -- 192.168.123.108:0/2643579946 shutdown_connections 2026-03-10T07:47:42.515 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:47:42.512+0000 7fbf20575700 1 --2- 192.168.123.108:0/2643579946 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] 
conn(0x7fbf0406c460 0x7fbf0406e920 unknown :-1 s=CLOSED pgs=41 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:47:42.515 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:47:42.512+0000 7fbf20575700 1 --2- 192.168.123.108:0/2643579946 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fbf18101a50 0x7fbf1819ce10 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:47:42.515 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:47:42.512+0000 7fbf20575700 1 --2- 192.168.123.108:0/2643579946 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fbf18104380 0x7fbf1819d350 unknown :-1 s=CLOSED pgs=6 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:47:42.515 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:47:42.512+0000 7fbf20575700 1 -- 192.168.123.108:0/2643579946 >> 192.168.123.108:0/2643579946 conn(0x7fbf180fb380 msgr2=0x7fbf180fd7e0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:47:42.515 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:47:42.512+0000 7fbf20575700 1 -- 192.168.123.108:0/2643579946 shutdown_connections 2026-03-10T07:47:42.515 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:47:42.512+0000 7fbf20575700 1 -- 192.168.123.108:0/2643579946 wait complete. 2026-03-10T07:47:42.563 DEBUG:teuthology.orchestra.run.vm08:osd.3> sudo journalctl -f -n 0 -u ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508@osd.3.service 2026-03-10T07:47:42.565 INFO:tasks.cephadm:Deploying osd.4 on vm08 with /dev/vdd... 
2026-03-10T07:47:42.565 DEBUG:teuthology.orchestra.run.vm08:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 ceph-volume -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 12e9780e-1c55-11f1-8896-79f7c2e9b508 -- lvm zap /dev/vdd 2026-03-10T07:47:42.633 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:42 vm08 ceph-mon[59917]: pgmap v37: 1 pgs: 1 creating+peering; 0 B data, 79 MiB used, 60 GiB / 60 GiB avail 2026-03-10T07:47:42.633 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:42 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' 2026-03-10T07:47:42.633 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:42 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' 2026-03-10T07:47:42.633 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:42 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T07:47:42.633 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:42 vm08 ceph-mon[59917]: mgrmap e19: vm05.blexke(active, since 58s), standbys: vm08.orfpog 2026-03-10T07:47:42.633 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:42 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' 2026-03-10T07:47:42.633 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:42 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' 2026-03-10T07:47:42.763 INFO:teuthology.orchestra.run.vm08.stderr:Inferring config /var/lib/ceph/12e9780e-1c55-11f1-8896-79f7c2e9b508/mon.vm08/config 2026-03-10T07:47:42.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:42 vm05 ceph-mon[50387]: pgmap v37: 1 pgs: 1 creating+peering; 0 B data, 79 MiB used, 60 GiB / 60 GiB avail 2026-03-10T07:47:42.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:42 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' 
entity='mgr.vm05.blexke' 2026-03-10T07:47:42.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:42 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' 2026-03-10T07:47:42.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:42 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T07:47:42.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:42 vm05 ceph-mon[50387]: mgrmap e19: vm05.blexke(active, since 58s), standbys: vm08.orfpog 2026-03-10T07:47:42.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:42 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' 2026-03-10T07:47:42.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:42 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' 2026-03-10T07:47:43.286 INFO:teuthology.orchestra.run.vm08.stdout: 2026-03-10T07:47:43.298 DEBUG:teuthology.orchestra.run.vm08:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 12e9780e-1c55-11f1-8896-79f7c2e9b508 -- ceph orch daemon add osd vm08:/dev/vdd 2026-03-10T07:47:43.536 INFO:teuthology.orchestra.run.vm08.stderr:Inferring config /var/lib/ceph/12e9780e-1c55-11f1-8896-79f7c2e9b508/mon.vm08/config 2026-03-10T07:47:43.539 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:43 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' 2026-03-10T07:47:43.539 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:43 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' 2026-03-10T07:47:43.539 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:43 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 
2026-03-10T07:47:43.841 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:47:43.839+0000 7ff16b5ed700 1 -- 192.168.123.108:0/1275915815 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff164107d90 msgr2=0x7ff16410a1c0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:47:43.841 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:47:43.839+0000 7ff16b5ed700 1 --2- 192.168.123.108:0/1275915815 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff164107d90 0x7ff16410a1c0 secure :-1 s=READY pgs=193 cs=0 l=1 rev1=1 crypto rx=0x7ff158009b00 tx=0x7ff158009e10 comp rx=0 tx=0).stop 2026-03-10T07:47:43.842 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:47:43.840+0000 7ff16b5ed700 1 -- 192.168.123.108:0/1275915815 shutdown_connections 2026-03-10T07:47:43.842 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:47:43.840+0000 7ff16b5ed700 1 --2- 192.168.123.108:0/1275915815 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7ff16410a700 0x7ff16410cb90 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:47:43.842 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:47:43.840+0000 7ff16b5ed700 1 --2- 192.168.123.108:0/1275915815 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff164107d90 0x7ff16410a1c0 unknown :-1 s=CLOSED pgs=193 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:47:43.842 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:47:43.841+0000 7ff16b5ed700 1 -- 192.168.123.108:0/1275915815 >> 192.168.123.108:0/1275915815 conn(0x7ff16406daa0 msgr2=0x7ff16406ff00 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:47:43.842 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:47:43.841+0000 7ff16b5ed700 1 -- 192.168.123.108:0/1275915815 shutdown_connections 2026-03-10T07:47:43.843 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:47:43.841+0000 7ff16b5ed700 1 -- 192.168.123.108:0/1275915815 
wait complete. 2026-03-10T07:47:43.843 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:47:43.842+0000 7ff16b5ed700 1 Processor -- start 2026-03-10T07:47:43.843 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:47:43.842+0000 7ff16b5ed700 1 -- start start 2026-03-10T07:47:43.844 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:47:43.843+0000 7ff16b5ed700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff164107d90 0x7ff1641adcb0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:47:43.844 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:47:43.843+0000 7ff16b5ed700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7ff16410a700 0x7ff1641ae1f0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:47:43.844 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:47:43.843+0000 7ff16b5ed700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7ff1641ae810 con 0x7ff164107d90 2026-03-10T07:47:43.844 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:47:43.843+0000 7ff16b5ed700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7ff1641ae950 con 0x7ff16410a700 2026-03-10T07:47:43.845 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:47:43.843+0000 7ff169dea700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7ff16410a700 0x7ff1641ae1f0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:47:43.845 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:47:43.844+0000 7ff16a5eb700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff164107d90 0x7ff1641adcb0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 
required=0 2026-03-10T07:47:43.845 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:47:43.844+0000 7ff16a5eb700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff164107d90 0x7ff1641adcb0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.108:59154/0 (socket says 192.168.123.108:59154) 2026-03-10T07:47:43.845 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:47:43.844+0000 7ff16a5eb700 1 -- 192.168.123.108:0/3717741071 learned_addr learned my addr 192.168.123.108:0/3717741071 (peer_addr_for_me v2:192.168.123.108:0/0) 2026-03-10T07:47:43.845 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:47:43.844+0000 7ff169dea700 1 -- 192.168.123.108:0/3717741071 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff164107d90 msgr2=0x7ff1641adcb0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:47:43.845 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:47:43.844+0000 7ff169dea700 1 --2- 192.168.123.108:0/3717741071 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff164107d90 0x7ff1641adcb0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:47:43.845 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:47:43.844+0000 7ff169dea700 1 -- 192.168.123.108:0/3717741071 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7ff15c009710 con 0x7ff16410a700 2026-03-10T07:47:43.846 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:47:43.845+0000 7ff169dea700 1 --2- 192.168.123.108:0/3717741071 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7ff16410a700 0x7ff1641ae1f0 secure :-1 s=READY pgs=14 cs=0 l=1 rev1=1 crypto rx=0x7ff15c00eda0 tx=0x7ff15c00c5b0 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:47:43.846 
INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:47:43.845+0000 7ff1577fe700 1 -- 192.168.123.108:0/3717741071 <== mon.1 v2:192.168.123.108:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7ff15c00cd70 con 0x7ff16410a700 2026-03-10T07:47:43.847 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:47:43.845+0000 7ff16b5ed700 1 -- 192.168.123.108:0/3717741071 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7ff1580097e0 con 0x7ff16410a700 2026-03-10T07:47:43.847 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:47:43.845+0000 7ff16b5ed700 1 -- 192.168.123.108:0/3717741071 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7ff1641b3630 con 0x7ff16410a700 2026-03-10T07:47:43.847 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:47:43.845+0000 7ff1577fe700 1 -- 192.168.123.108:0/3717741071 <== mon.1 v2:192.168.123.108:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7ff15c004500 con 0x7ff16410a700 2026-03-10T07:47:43.847 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:47:43.845+0000 7ff1577fe700 1 -- 192.168.123.108:0/3717741071 <== mon.1 v2:192.168.123.108:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7ff15c003ea0 con 0x7ff16410a700 2026-03-10T07:47:43.848 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:47:43.847+0000 7ff16b5ed700 1 -- 192.168.123.108:0/3717741071 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7ff148005320 con 0x7ff16410a700 2026-03-10T07:47:43.849 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:47:43.848+0000 7ff1577fe700 1 -- 192.168.123.108:0/3717741071 <== mon.1 v2:192.168.123.108:3300/0 4 ==== mgrmap(e 19) v1 ==== 90308+0+0 (secure 0 0 0) 0x7ff15c004000 con 0x7ff16410a700 2026-03-10T07:47:43.849 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:47:43.848+0000 7ff1577fe700 1 --2- 
192.168.123.108:0/3717741071 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7ff15006c600 0x7ff15006eac0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:47:43.849 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:47:43.848+0000 7ff1577fe700 1 -- 192.168.123.108:0/3717741071 <== mon.1 v2:192.168.123.108:3300/0 5 ==== osd_map(20..20 src has 1..20) v4 ==== 3165+0+0 (secure 0 0 0) 0x7ff15c014070 con 0x7ff16410a700 2026-03-10T07:47:43.849 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:47:43.848+0000 7ff16a5eb700 1 --2- 192.168.123.108:0/3717741071 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7ff15006c600 0x7ff15006eac0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:47:43.850 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:47:43.849+0000 7ff16a5eb700 1 --2- 192.168.123.108:0/3717741071 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7ff15006c600 0x7ff15006eac0 secure :-1 s=READY pgs=48 cs=0 l=1 rev1=1 crypto rx=0x7ff158005190 tx=0x7ff15801a040 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:47:43.851 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:47:43.850+0000 7ff1577fe700 1 -- 192.168.123.108:0/3717741071 <== mon.1 v2:192.168.123.108:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7ff15c059e90 con 0x7ff16410a700 2026-03-10T07:47:43.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:43 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' 2026-03-10T07:47:43.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:43 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' 2026-03-10T07:47:43.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:43 
vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T07:47:43.986 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:47:43.984+0000 7ff16b5ed700 1 -- 192.168.123.108:0/3717741071 --> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] -- mgr_command(tid 0: {"prefix": "orch daemon add osd", "svc_arg": "vm08:/dev/vdd", "target": ["mon-mgr", ""]}) v1 -- 0x7ff148000bf0 con 0x7ff15006c600 2026-03-10T07:47:44.173 INFO:journalctl@ceph.osd.3.vm08.stdout:Mar 10 07:47:43 vm08 ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508-osd-3[65774]: 2026-03-10T07:47:43.935+0000 7fd7aa4f2640 -1 osd.3 0 log_to_monitors true 2026-03-10T07:47:44.669 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:44 vm08 ceph-mon[59917]: pgmap v38: 1 pgs: 1 creating+peering; 0 B data, 79 MiB used, 60 GiB / 60 GiB avail 2026-03-10T07:47:44.669 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:44 vm08 ceph-mon[59917]: from='osd.3 ' entity='osd.3' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["3"]}]: dispatch 2026-03-10T07:47:44.669 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:44 vm08 ceph-mon[59917]: from='osd.3 [v2:192.168.123.108:6800/2127000679,v1:192.168.123.108:6801/2127000679]' entity='osd.3' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["3"]}]: dispatch 2026-03-10T07:47:44.669 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:44 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch 2026-03-10T07:47:44.669 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:44 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch 2026-03-10T07:47:44.669 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:44 vm08 
ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T07:47:44.669 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:44 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' 2026-03-10T07:47:44.669 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:44 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' 2026-03-10T07:47:44.669 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:44 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "config rm", "who": "osd/host:vm08", "name": "osd_memory_target"}]: dispatch 2026-03-10T07:47:44.669 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:44 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T07:47:44.669 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:44 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T07:47:44.669 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:44 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' 2026-03-10T07:47:44.669 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:44 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T07:47:44.669 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:44 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T07:47:44.669 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:44 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "auth get", 
"entity": "client.admin"}]: dispatch 2026-03-10T07:47:44.669 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:44 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' 2026-03-10T07:47:44.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:44 vm05 ceph-mon[50387]: pgmap v38: 1 pgs: 1 creating+peering; 0 B data, 79 MiB used, 60 GiB / 60 GiB avail 2026-03-10T07:47:44.908 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:44 vm05 ceph-mon[50387]: from='osd.3 ' entity='osd.3' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["3"]}]: dispatch 2026-03-10T07:47:44.908 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:44 vm05 ceph-mon[50387]: from='osd.3 [v2:192.168.123.108:6800/2127000679,v1:192.168.123.108:6801/2127000679]' entity='osd.3' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["3"]}]: dispatch 2026-03-10T07:47:44.908 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:44 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch 2026-03-10T07:47:44.908 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:44 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch 2026-03-10T07:47:44.908 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:44 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T07:47:44.908 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:44 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' 2026-03-10T07:47:44.908 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:44 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' 2026-03-10T07:47:44.908 
INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:44 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "config rm", "who": "osd/host:vm08", "name": "osd_memory_target"}]: dispatch 2026-03-10T07:47:44.908 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:44 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T07:47:44.908 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:44 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T07:47:44.908 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:44 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' 2026-03-10T07:47:44.908 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:44 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T07:47:44.908 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:44 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T07:47:44.908 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:44 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T07:47:44.908 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:44 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' 2026-03-10T07:47:45.277 INFO:journalctl@ceph.osd.3.vm08.stdout:Mar 10 07:47:44 vm08 ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508-osd-3[65774]: 2026-03-10T07:47:44.925+0000 7fd7a0b58700 -1 osd.3 0 waiting for initial osdmap 2026-03-10T07:47:45.277 INFO:journalctl@ceph.osd.3.vm08.stdout:Mar 10 
07:47:44 vm08 ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508-osd-3[65774]: 2026-03-10T07:47:44.935+0000 7fd799146700 -1 osd.3 22 set_numa_affinity unable to identify public interface '' numa node: (2) No such file or directory 2026-03-10T07:47:45.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:45 vm05 ceph-mon[50387]: from='client.24157 -' entity='client.admin' cmd=[{"prefix": "orch daemon add osd", "svc_arg": "vm08:/dev/vdd", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T07:47:45.908 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:45 vm05 ceph-mon[50387]: Detected new or changed devices on vm08 2026-03-10T07:47:45.908 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:45 vm05 ceph-mon[50387]: from='osd.3 ' entity='osd.3' cmd='[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["3"]}]': finished 2026-03-10T07:47:45.908 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:45 vm05 ceph-mon[50387]: osdmap e21: 4 total, 3 up, 4 in 2026-03-10T07:47:45.908 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:45 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "osd metadata", "id": 3}]: dispatch 2026-03-10T07:47:45.908 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:45 vm05 ceph-mon[50387]: from='osd.3 ' entity='osd.3' cmd=[{"prefix": "osd crush create-or-move", "id": 3, "weight":0.0195, "args": ["host=vm08", "root=default"]}]: dispatch 2026-03-10T07:47:45.908 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:45 vm05 ceph-mon[50387]: from='osd.3 [v2:192.168.123.108:6800/2127000679,v1:192.168.123.108:6801/2127000679]' entity='osd.3' cmd=[{"prefix": "osd crush create-or-move", "id": 3, "weight":0.0195, "args": ["host=vm08", "root=default"]}]: dispatch 2026-03-10T07:47:45.908 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:45 vm05 ceph-mon[50387]: from='client.? 
192.168.123.108:0/4022292325' entity='client.bootstrap-osd' cmd=[{"prefix": "osd new", "uuid": "efa282d4-a651-4903-bdb8-2e148038e567"}]: dispatch 2026-03-10T07:47:45.908 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:45 vm05 ceph-mon[50387]: from='client.? ' entity='client.bootstrap-osd' cmd=[{"prefix": "osd new", "uuid": "efa282d4-a651-4903-bdb8-2e148038e567"}]: dispatch 2026-03-10T07:47:45.908 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:45 vm05 ceph-mon[50387]: from='osd.3 ' entity='osd.3' cmd='[{"prefix": "osd crush create-or-move", "id": 3, "weight":0.0195, "args": ["host=vm08", "root=default"]}]': finished 2026-03-10T07:47:45.908 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:45 vm05 ceph-mon[50387]: from='client.? ' entity='client.bootstrap-osd' cmd='[{"prefix": "osd new", "uuid": "efa282d4-a651-4903-bdb8-2e148038e567"}]': finished 2026-03-10T07:47:45.908 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:45 vm05 ceph-mon[50387]: osdmap e22: 5 total, 3 up, 5 in 2026-03-10T07:47:45.908 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:45 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "osd metadata", "id": 3}]: dispatch 2026-03-10T07:47:45.908 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:45 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "osd metadata", "id": 4}]: dispatch 2026-03-10T07:47:45.908 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:45 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "osd metadata", "id": 3}]: dispatch 2026-03-10T07:47:45.908 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:45 vm05 ceph-mon[50387]: from='client.? 
192.168.123.108:0/3631343751' entity='client.bootstrap-osd' cmd=[{"prefix": "mon getmap"}]: dispatch 2026-03-10T07:47:45.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:45 vm08 ceph-mon[59917]: from='client.24157 -' entity='client.admin' cmd=[{"prefix": "orch daemon add osd", "svc_arg": "vm08:/dev/vdd", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T07:47:45.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:45 vm08 ceph-mon[59917]: Detected new or changed devices on vm08 2026-03-10T07:47:45.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:45 vm08 ceph-mon[59917]: from='osd.3 ' entity='osd.3' cmd='[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["3"]}]': finished 2026-03-10T07:47:45.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:45 vm08 ceph-mon[59917]: osdmap e21: 4 total, 3 up, 4 in 2026-03-10T07:47:45.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:45 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "osd metadata", "id": 3}]: dispatch 2026-03-10T07:47:45.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:45 vm08 ceph-mon[59917]: from='osd.3 ' entity='osd.3' cmd=[{"prefix": "osd crush create-or-move", "id": 3, "weight":0.0195, "args": ["host=vm08", "root=default"]}]: dispatch 2026-03-10T07:47:45.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:45 vm08 ceph-mon[59917]: from='osd.3 [v2:192.168.123.108:6800/2127000679,v1:192.168.123.108:6801/2127000679]' entity='osd.3' cmd=[{"prefix": "osd crush create-or-move", "id": 3, "weight":0.0195, "args": ["host=vm08", "root=default"]}]: dispatch 2026-03-10T07:47:45.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:45 vm08 ceph-mon[59917]: from='client.? 
192.168.123.108:0/4022292325' entity='client.bootstrap-osd' cmd=[{"prefix": "osd new", "uuid": "efa282d4-a651-4903-bdb8-2e148038e567"}]: dispatch 2026-03-10T07:47:45.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:45 vm08 ceph-mon[59917]: from='client.? ' entity='client.bootstrap-osd' cmd=[{"prefix": "osd new", "uuid": "efa282d4-a651-4903-bdb8-2e148038e567"}]: dispatch 2026-03-10T07:47:45.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:45 vm08 ceph-mon[59917]: from='osd.3 ' entity='osd.3' cmd='[{"prefix": "osd crush create-or-move", "id": 3, "weight":0.0195, "args": ["host=vm08", "root=default"]}]': finished 2026-03-10T07:47:45.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:45 vm08 ceph-mon[59917]: from='client.? ' entity='client.bootstrap-osd' cmd='[{"prefix": "osd new", "uuid": "efa282d4-a651-4903-bdb8-2e148038e567"}]': finished 2026-03-10T07:47:45.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:45 vm08 ceph-mon[59917]: osdmap e22: 5 total, 3 up, 5 in 2026-03-10T07:47:45.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:45 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "osd metadata", "id": 3}]: dispatch 2026-03-10T07:47:45.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:45 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "osd metadata", "id": 4}]: dispatch 2026-03-10T07:47:45.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:45 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "osd metadata", "id": 3}]: dispatch 2026-03-10T07:47:45.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:45 vm08 ceph-mon[59917]: from='client.? 
192.168.123.108:0/3631343751' entity='client.bootstrap-osd' cmd=[{"prefix": "mon getmap"}]: dispatch 2026-03-10T07:47:46.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:46 vm05 ceph-mon[50387]: pgmap v41: 1 pgs: 1 active+clean; 449 KiB data, 80 MiB used, 60 GiB / 60 GiB avail 2026-03-10T07:47:46.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:46 vm05 ceph-mon[50387]: osd.3 [v2:192.168.123.108:6800/2127000679,v1:192.168.123.108:6801/2127000679] boot 2026-03-10T07:47:46.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:46 vm05 ceph-mon[50387]: osdmap e23: 5 total, 4 up, 5 in 2026-03-10T07:47:46.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:46 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "osd metadata", "id": 3}]: dispatch 2026-03-10T07:47:46.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:46 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "osd metadata", "id": 4}]: dispatch 2026-03-10T07:47:46.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:46 vm08 ceph-mon[59917]: pgmap v41: 1 pgs: 1 active+clean; 449 KiB data, 80 MiB used, 60 GiB / 60 GiB avail 2026-03-10T07:47:46.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:46 vm08 ceph-mon[59917]: osd.3 [v2:192.168.123.108:6800/2127000679,v1:192.168.123.108:6801/2127000679] boot 2026-03-10T07:47:46.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:46 vm08 ceph-mon[59917]: osdmap e23: 5 total, 4 up, 5 in 2026-03-10T07:47:46.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:46 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "osd metadata", "id": 3}]: dispatch 2026-03-10T07:47:46.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:46 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "osd metadata", "id": 4}]: dispatch 
2026-03-10T07:47:48.402 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:47 vm08 ceph-mon[59917]: purged_snaps scrub starts 2026-03-10T07:47:48.402 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:47 vm08 ceph-mon[59917]: purged_snaps scrub ok 2026-03-10T07:47:48.402 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:47 vm08 ceph-mon[59917]: osdmap e24: 5 total, 4 up, 5 in 2026-03-10T07:47:48.402 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:47 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "osd metadata", "id": 4}]: dispatch 2026-03-10T07:47:48.402 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:47 vm08 ceph-mon[59917]: pgmap v44: 1 pgs: 1 active+clean; 449 KiB data, 107 MiB used, 80 GiB / 80 GiB avail 2026-03-10T07:47:48.407 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:47 vm05 ceph-mon[50387]: purged_snaps scrub starts 2026-03-10T07:47:48.407 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:47 vm05 ceph-mon[50387]: purged_snaps scrub ok 2026-03-10T07:47:48.407 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:47 vm05 ceph-mon[50387]: osdmap e24: 5 total, 4 up, 5 in 2026-03-10T07:47:48.407 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:47 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "osd metadata", "id": 4}]: dispatch 2026-03-10T07:47:48.407 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:47 vm05 ceph-mon[50387]: pgmap v44: 1 pgs: 1 active+clean; 449 KiB data, 107 MiB used, 80 GiB / 80 GiB avail 2026-03-10T07:47:49.054 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:48 vm08 ceph-mon[59917]: osdmap e25: 5 total, 4 up, 5 in 2026-03-10T07:47:49.054 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:48 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "osd metadata", "id": 4}]: dispatch 2026-03-10T07:47:49.407 
INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:48 vm05 ceph-mon[50387]: osdmap e25: 5 total, 4 up, 5 in 2026-03-10T07:47:49.407 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:48 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "osd metadata", "id": 4}]: dispatch 2026-03-10T07:47:50.199 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:49 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "auth get", "entity": "osd.4"}]: dispatch 2026-03-10T07:47:50.199 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:49 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T07:47:50.199 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:49 vm08 ceph-mon[59917]: Deploying daemon osd.4 on vm08 2026-03-10T07:47:50.199 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:49 vm08 ceph-mon[59917]: pgmap v46: 1 pgs: 1 active+clean; 449 KiB data, 107 MiB used, 80 GiB / 80 GiB avail; 103 KiB/s, 0 objects/s recovering 2026-03-10T07:47:50.257 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:49 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "auth get", "entity": "osd.4"}]: dispatch 2026-03-10T07:47:50.257 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:49 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T07:47:50.257 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:49 vm05 ceph-mon[50387]: Deploying daemon osd.4 on vm08 2026-03-10T07:47:50.257 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:49 vm05 ceph-mon[50387]: pgmap v46: 1 pgs: 1 active+clean; 449 KiB data, 107 MiB used, 80 GiB / 80 GiB avail; 103 KiB/s, 0 objects/s recovering 2026-03-10T07:47:51.666 
INFO:teuthology.orchestra.run.vm08.stdout:Created osd(s) 4 on host 'vm08' 2026-03-10T07:47:51.666 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:47:51.664+0000 7ff1577fe700 1 -- 192.168.123.108:0/3717741071 <== mgr.14223 v2:192.168.123.105:6800/2 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+32 (secure 0 0 0) 0x7ff148000bf0 con 0x7ff15006c600 2026-03-10T07:47:51.668 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:47:51.666+0000 7ff16b5ed700 1 -- 192.168.123.108:0/3717741071 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7ff15006c600 msgr2=0x7ff15006eac0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:47:51.668 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:47:51.666+0000 7ff16b5ed700 1 --2- 192.168.123.108:0/3717741071 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7ff15006c600 0x7ff15006eac0 secure :-1 s=READY pgs=48 cs=0 l=1 rev1=1 crypto rx=0x7ff158005190 tx=0x7ff15801a040 comp rx=0 tx=0).stop 2026-03-10T07:47:51.668 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:47:51.666+0000 7ff16b5ed700 1 -- 192.168.123.108:0/3717741071 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7ff16410a700 msgr2=0x7ff1641ae1f0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:47:51.668 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:47:51.666+0000 7ff16b5ed700 1 --2- 192.168.123.108:0/3717741071 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7ff16410a700 0x7ff1641ae1f0 secure :-1 s=READY pgs=14 cs=0 l=1 rev1=1 crypto rx=0x7ff15c00eda0 tx=0x7ff15c00c5b0 comp rx=0 tx=0).stop 2026-03-10T07:47:51.668 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:47:51.666+0000 7ff16b5ed700 1 -- 192.168.123.108:0/3717741071 shutdown_connections 2026-03-10T07:47:51.668 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:47:51.666+0000 7ff16b5ed700 1 --2- 192.168.123.108:0/3717741071 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] 
conn(0x7ff15006c600 0x7ff15006eac0 unknown :-1 s=CLOSED pgs=48 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:47:51.668 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:47:51.666+0000 7ff16b5ed700 1 --2- 192.168.123.108:0/3717741071 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff164107d90 0x7ff1641adcb0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:47:51.668 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:47:51.666+0000 7ff16b5ed700 1 --2- 192.168.123.108:0/3717741071 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7ff16410a700 0x7ff1641ae1f0 unknown :-1 s=CLOSED pgs=14 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:47:51.668 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:47:51.666+0000 7ff16b5ed700 1 -- 192.168.123.108:0/3717741071 >> 192.168.123.108:0/3717741071 conn(0x7ff16406daa0 msgr2=0x7ff16406ff00 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:47:51.668 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:47:51.666+0000 7ff16b5ed700 1 -- 192.168.123.108:0/3717741071 shutdown_connections 2026-03-10T07:47:51.668 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:47:51.666+0000 7ff16b5ed700 1 -- 192.168.123.108:0/3717741071 wait complete. 2026-03-10T07:47:51.720 DEBUG:teuthology.orchestra.run.vm08:osd.4> sudo journalctl -f -n 0 -u ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508@osd.4.service 2026-03-10T07:47:51.721 INFO:tasks.cephadm:Deploying osd.5 on vm08 with /dev/vdc... 
2026-03-10T07:47:51.721 DEBUG:teuthology.orchestra.run.vm08:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 ceph-volume -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 12e9780e-1c55-11f1-8896-79f7c2e9b508 -- lvm zap /dev/vdc 2026-03-10T07:47:51.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:51 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' 2026-03-10T07:47:51.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:51 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' 2026-03-10T07:47:51.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:51 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T07:47:51.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:51 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' 2026-03-10T07:47:51.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:51 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' 2026-03-10T07:47:51.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:51 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' 2026-03-10T07:47:51.928 INFO:teuthology.orchestra.run.vm08.stderr:Inferring config /var/lib/ceph/12e9780e-1c55-11f1-8896-79f7c2e9b508/mon.vm08/config 2026-03-10T07:47:52.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:51 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' 2026-03-10T07:47:52.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:51 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' 2026-03-10T07:47:52.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:51 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "config 
dump", "format": "json"}]: dispatch 2026-03-10T07:47:52.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:51 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' 2026-03-10T07:47:52.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:51 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' 2026-03-10T07:47:52.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:51 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' 2026-03-10T07:47:52.470 INFO:teuthology.orchestra.run.vm08.stdout: 2026-03-10T07:47:52.483 DEBUG:teuthology.orchestra.run.vm08:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 12e9780e-1c55-11f1-8896-79f7c2e9b508 -- ceph orch daemon add osd vm08:/dev/vdc 2026-03-10T07:47:52.684 INFO:journalctl@ceph.osd.4.vm08.stdout:Mar 10 07:47:52 vm08 ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508-osd-4[72016]: 2026-03-10T07:47:52.621+0000 7f3767801640 -1 osd.4 0 log_to_monitors true 2026-03-10T07:47:52.738 INFO:teuthology.orchestra.run.vm08.stderr:Inferring config /var/lib/ceph/12e9780e-1c55-11f1-8896-79f7c2e9b508/mon.vm08/config 2026-03-10T07:47:52.966 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:52 vm08 ceph-mon[59917]: pgmap v47: 1 pgs: 1 active+clean; 449 KiB data, 107 MiB used, 80 GiB / 80 GiB avail; 75 KiB/s, 0 objects/s recovering 2026-03-10T07:47:52.966 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:52 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' 2026-03-10T07:47:52.966 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:52 vm08 ceph-mon[59917]: from='osd.4 [v2:192.168.123.108:6808/551946261,v1:192.168.123.108:6809/551946261]' entity='osd.4' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["4"]}]: dispatch 2026-03-10T07:47:52.966 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 
07:47:52 vm08 ceph-mon[59917]: from='osd.4 ' entity='osd.4' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["4"]}]: dispatch 2026-03-10T07:47:52.993 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:47:52.992+0000 7f64a45f6700 1 -- 192.168.123.108:0/2744817623 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f649c107d90 msgr2=0x7f649c10a1c0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:47:52.993 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:47:52.992+0000 7f64a45f6700 1 --2- 192.168.123.108:0/2744817623 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f649c107d90 0x7f649c10a1c0 secure :-1 s=READY pgs=23 cs=0 l=1 rev1=1 crypto rx=0x7f6498009b50 tx=0x7f6498009e60 comp rx=0 tx=0).stop 2026-03-10T07:47:52.993 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:47:52.992+0000 7f64a45f6700 1 -- 192.168.123.108:0/2744817623 shutdown_connections 2026-03-10T07:47:52.993 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:47:52.992+0000 7f64a45f6700 1 --2- 192.168.123.108:0/2744817623 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f649c10a700 0x7f649c10cb90 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:47:52.993 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:47:52.992+0000 7f64a45f6700 1 --2- 192.168.123.108:0/2744817623 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f649c107d90 0x7f649c10a1c0 unknown :-1 s=CLOSED pgs=23 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:47:52.994 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:47:52.992+0000 7f64a45f6700 1 -- 192.168.123.108:0/2744817623 >> 192.168.123.108:0/2744817623 conn(0x7f649c06dae0 msgr2=0x7f649c06ff40 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:47:52.994 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:47:52.992+0000 7f64a45f6700 1 -- 192.168.123.108:0/2744817623 
shutdown_connections 2026-03-10T07:47:52.994 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:47:52.992+0000 7f64a45f6700 1 -- 192.168.123.108:0/2744817623 wait complete. 2026-03-10T07:47:52.994 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:47:52.993+0000 7f64a45f6700 1 Processor -- start 2026-03-10T07:47:52.994 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:47:52.993+0000 7f64a45f6700 1 -- start start 2026-03-10T07:47:52.994 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:47:52.993+0000 7f64a45f6700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f649c107d90 0x7f649c116a80 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:47:52.994 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:47:52.993+0000 7f64a45f6700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f649c10a700 0x7f649c116fc0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:47:52.994 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:47:52.993+0000 7f64a45f6700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f649c1175e0 con 0x7f649c107d90 2026-03-10T07:47:52.994 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:47:52.993+0000 7f64a45f6700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f649c1b3120 con 0x7f649c10a700 2026-03-10T07:47:52.995 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:47:52.993+0000 7f64a2392700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f649c107d90 0x7f649c116a80 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:47:52.995 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:47:52.993+0000 7f64a1b91700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] 
conn(0x7f649c10a700 0x7f649c116fc0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:47:52.995 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:47:52.993+0000 7f64a1b91700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f649c10a700 0x7f649c116fc0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.108:3300/0 says I am v2:192.168.123.108:58590/0 (socket says 192.168.123.108:58590) 2026-03-10T07:47:52.995 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:47:52.993+0000 7f64a1b91700 1 -- 192.168.123.108:0/2323280314 learned_addr learned my addr 192.168.123.108:0/2323280314 (peer_addr_for_me v2:192.168.123.108:0/0) 2026-03-10T07:47:52.995 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:47:52.993+0000 7f64a1b91700 1 -- 192.168.123.108:0/2323280314 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f649c107d90 msgr2=0x7f649c116a80 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:47:52.995 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:47:52.993+0000 7f64a1b91700 1 --2- 192.168.123.108:0/2323280314 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f649c107d90 0x7f649c116a80 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:47:52.995 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:47:52.993+0000 7f64a1b91700 1 -- 192.168.123.108:0/2323280314 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f64980097e0 con 0x7f649c10a700 2026-03-10T07:47:52.996 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:47:52.994+0000 7f64a1b91700 1 --2- 192.168.123.108:0/2323280314 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f649c10a700 0x7f649c116fc0 secure :-1 s=READY pgs=24 cs=0 l=1 
rev1=1 crypto rx=0x7f648c00dc40 tx=0x7f648c00df50 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:47:52.996 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:47:52.994+0000 7f64937fe700 1 -- 192.168.123.108:0/2323280314 <== mon.1 v2:192.168.123.108:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f648c0098e0 con 0x7f649c10a700 2026-03-10T07:47:52.996 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:47:52.994+0000 7f64a45f6700 1 -- 192.168.123.108:0/2323280314 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f649c1b3320 con 0x7f649c10a700 2026-03-10T07:47:52.996 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:47:52.994+0000 7f64a45f6700 1 -- 192.168.123.108:0/2323280314 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f649c1b3820 con 0x7f649c10a700 2026-03-10T07:47:52.996 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:47:52.994+0000 7f64937fe700 1 -- 192.168.123.108:0/2323280314 <== mon.1 v2:192.168.123.108:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f648c010460 con 0x7f649c10a700 2026-03-10T07:47:52.996 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:47:52.994+0000 7f64937fe700 1 -- 192.168.123.108:0/2323280314 <== mon.1 v2:192.168.123.108:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f648c00f5d0 con 0x7f649c10a700 2026-03-10T07:47:52.996 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:47:52.995+0000 7f64a45f6700 1 -- 192.168.123.108:0/2323280314 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f649c110c60 con 0x7f649c10a700 2026-03-10T07:47:53.000 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:47:52.996+0000 7f64937fe700 1 -- 192.168.123.108:0/2323280314 <== mon.1 v2:192.168.123.108:3300/0 4 ==== mgrmap(e 19) v1 ==== 90308+0+0 (secure 
0 0 0) 0x7f648c0105d0 con 0x7f649c10a700 2026-03-10T07:47:53.002 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:47:52.996+0000 7f64937fe700 1 --2- 192.168.123.108:0/2323280314 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f648806c4e0 0x7f648806e9a0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:47:53.002 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:47:52.996+0000 7f64937fe700 1 -- 192.168.123.108:0/2323280314 <== mon.1 v2:192.168.123.108:3300/0 5 ==== osd_map(25..25 src has 1..25) v4 ==== 3697+0+0 (secure 0 0 0) 0x7f648c08a780 con 0x7f649c10a700 2026-03-10T07:47:53.002 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:47:52.996+0000 7f64a2392700 1 --2- 192.168.123.108:0/2323280314 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f648806c4e0 0x7f648806e9a0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:47:53.002 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:47:52.999+0000 7f64937fe700 1 -- 192.168.123.108:0/2323280314 <== mon.1 v2:192.168.123.108:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7f648c058e20 con 0x7f649c10a700 2026-03-10T07:47:53.002 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:47:52.999+0000 7f64a2392700 1 --2- 192.168.123.108:0/2323280314 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f648806c4e0 0x7f648806e9a0 secure :-1 s=READY pgs=54 cs=0 l=1 rev1=1 crypto rx=0x7f6498006010 tx=0x7f6498005ea0 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:47:53.119 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:47:53.117+0000 7f64a45f6700 1 -- 192.168.123.108:0/2323280314 --> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] -- mgr_command(tid 0: {"prefix": "orch daemon add osd", 
"svc_arg": "vm08:/dev/vdc", "target": ["mon-mgr", ""]}) v1 -- 0x7f649c0611d0 con 0x7f648806c4e0 2026-03-10T07:47:53.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:52 vm05 ceph-mon[50387]: pgmap v47: 1 pgs: 1 active+clean; 449 KiB data, 107 MiB used, 80 GiB / 80 GiB avail; 75 KiB/s, 0 objects/s recovering 2026-03-10T07:47:53.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:52 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' 2026-03-10T07:47:53.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:52 vm05 ceph-mon[50387]: from='osd.4 [v2:192.168.123.108:6808/551946261,v1:192.168.123.108:6809/551946261]' entity='osd.4' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["4"]}]: dispatch 2026-03-10T07:47:53.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:52 vm05 ceph-mon[50387]: from='osd.4 ' entity='osd.4' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["4"]}]: dispatch 2026-03-10T07:47:54.370 INFO:journalctl@ceph.osd.4.vm08.stdout:Mar 10 07:47:54 vm08 ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508-osd-4[72016]: 2026-03-10T07:47:54.033+0000 7f375de67700 -1 osd.4 0 waiting for initial osdmap 2026-03-10T07:47:54.370 INFO:journalctl@ceph.osd.4.vm08.stdout:Mar 10 07:47:54 vm08 ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508-osd-4[72016]: 2026-03-10T07:47:54.047+0000 7f3756455700 -1 osd.4 27 set_numa_affinity unable to identify public interface '' numa node: (2) No such file or directory 2026-03-10T07:47:54.370 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:54 vm08 ceph-mon[59917]: from='osd.4 ' entity='osd.4' cmd='[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["4"]}]': finished 2026-03-10T07:47:54.370 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:54 vm08 ceph-mon[59917]: osdmap e26: 5 total, 4 up, 5 in 2026-03-10T07:47:54.370 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:54 vm08 ceph-mon[59917]: from='mgr.14223 
192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "osd metadata", "id": 4}]: dispatch 2026-03-10T07:47:54.370 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:54 vm08 ceph-mon[59917]: from='osd.4 ' entity='osd.4' cmd=[{"prefix": "osd crush create-or-move", "id": 4, "weight":0.0195, "args": ["host=vm08", "root=default"]}]: dispatch 2026-03-10T07:47:54.370 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:54 vm08 ceph-mon[59917]: from='osd.4 [v2:192.168.123.108:6808/551946261,v1:192.168.123.108:6809/551946261]' entity='osd.4' cmd=[{"prefix": "osd crush create-or-move", "id": 4, "weight":0.0195, "args": ["host=vm08", "root=default"]}]: dispatch 2026-03-10T07:47:54.370 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:54 vm08 ceph-mon[59917]: from='client.24177 -' entity='client.admin' cmd=[{"prefix": "orch daemon add osd", "svc_arg": "vm08:/dev/vdc", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T07:47:54.370 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:54 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch 2026-03-10T07:47:54.370 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:54 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch 2026-03-10T07:47:54.370 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:54 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T07:47:54.370 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:54 vm08 ceph-mon[59917]: pgmap v49: 1 pgs: 1 active+clean; 449 KiB data, 107 MiB used, 80 GiB / 80 GiB avail; 71 KiB/s, 0 objects/s recovering 2026-03-10T07:47:54.370 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:54 vm08 ceph-mon[59917]: Detected new or changed devices on 
vm08 2026-03-10T07:47:54.371 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:54 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' 2026-03-10T07:47:54.371 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:54 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' 2026-03-10T07:47:54.371 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:54 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "config rm", "who": "osd/host:vm08", "name": "osd_memory_target"}]: dispatch 2026-03-10T07:47:54.371 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:54 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T07:47:54.371 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:54 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T07:47:54.371 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:54 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' 2026-03-10T07:47:54.371 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:54 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T07:47:54.371 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:54 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T07:47:54.371 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:54 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T07:47:54.371 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:54 vm08 
ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' 2026-03-10T07:47:54.371 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:54 vm08 ceph-mon[59917]: from='client.? ' entity='client.bootstrap-osd' cmd=[{"prefix": "osd new", "uuid": "ef1ae183-14f2-4400-85d6-c48b79ef2819"}]: dispatch 2026-03-10T07:47:54.371 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:54 vm08 ceph-mon[59917]: from='client.? 192.168.123.108:0/411533236' entity='client.bootstrap-osd' cmd=[{"prefix": "osd new", "uuid": "ef1ae183-14f2-4400-85d6-c48b79ef2819"}]: dispatch 2026-03-10T07:47:54.371 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:54 vm08 ceph-mon[59917]: from='osd.4 ' entity='osd.4' cmd='[{"prefix": "osd crush create-or-move", "id": 4, "weight":0.0195, "args": ["host=vm08", "root=default"]}]': finished 2026-03-10T07:47:54.371 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:54 vm08 ceph-mon[59917]: from='client.? ' entity='client.bootstrap-osd' cmd='[{"prefix": "osd new", "uuid": "ef1ae183-14f2-4400-85d6-c48b79ef2819"}]': finished 2026-03-10T07:47:54.371 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:54 vm08 ceph-mon[59917]: osdmap e27: 6 total, 4 up, 6 in 2026-03-10T07:47:54.371 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:54 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "osd metadata", "id": 4}]: dispatch 2026-03-10T07:47:54.371 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:54 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "osd metadata", "id": 5}]: dispatch 2026-03-10T07:47:54.371 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:54 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "osd metadata", "id": 4}]: dispatch 2026-03-10T07:47:54.407 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:54 vm05 ceph-mon[50387]: from='osd.4 ' entity='osd.4' 
cmd='[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["4"]}]': finished 2026-03-10T07:47:54.407 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:54 vm05 ceph-mon[50387]: osdmap e26: 5 total, 4 up, 5 in 2026-03-10T07:47:54.408 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:54 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "osd metadata", "id": 4}]: dispatch 2026-03-10T07:47:54.408 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:54 vm05 ceph-mon[50387]: from='osd.4 ' entity='osd.4' cmd=[{"prefix": "osd crush create-or-move", "id": 4, "weight":0.0195, "args": ["host=vm08", "root=default"]}]: dispatch 2026-03-10T07:47:54.408 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:54 vm05 ceph-mon[50387]: from='osd.4 [v2:192.168.123.108:6808/551946261,v1:192.168.123.108:6809/551946261]' entity='osd.4' cmd=[{"prefix": "osd crush create-or-move", "id": 4, "weight":0.0195, "args": ["host=vm08", "root=default"]}]: dispatch 2026-03-10T07:47:54.408 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:54 vm05 ceph-mon[50387]: from='client.24177 -' entity='client.admin' cmd=[{"prefix": "orch daemon add osd", "svc_arg": "vm08:/dev/vdc", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T07:47:54.408 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:54 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "osd tree", "states": ["destroyed"], "format": "json"}]: dispatch 2026-03-10T07:47:54.408 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:54 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "auth get", "entity": "client.bootstrap-osd"}]: dispatch 2026-03-10T07:47:54.408 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:54 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 
2026-03-10T07:47:54.408 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:54 vm05 ceph-mon[50387]: pgmap v49: 1 pgs: 1 active+clean; 449 KiB data, 107 MiB used, 80 GiB / 80 GiB avail; 71 KiB/s, 0 objects/s recovering 2026-03-10T07:47:54.408 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:54 vm05 ceph-mon[50387]: Detected new or changed devices on vm08 2026-03-10T07:47:54.408 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:54 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' 2026-03-10T07:47:54.408 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:54 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' 2026-03-10T07:47:54.408 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:54 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "config rm", "who": "osd/host:vm08", "name": "osd_memory_target"}]: dispatch 2026-03-10T07:47:54.408 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:54 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T07:47:54.408 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:54 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T07:47:54.408 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:54 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' 2026-03-10T07:47:54.408 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:54 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T07:47:54.408 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:54 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "config 
generate-minimal-conf"}]: dispatch 2026-03-10T07:47:54.408 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:54 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T07:47:54.408 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:54 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' 2026-03-10T07:47:54.408 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:54 vm05 ceph-mon[50387]: from='client.? ' entity='client.bootstrap-osd' cmd=[{"prefix": "osd new", "uuid": "ef1ae183-14f2-4400-85d6-c48b79ef2819"}]: dispatch 2026-03-10T07:47:54.408 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:54 vm05 ceph-mon[50387]: from='client.? 192.168.123.108:0/411533236' entity='client.bootstrap-osd' cmd=[{"prefix": "osd new", "uuid": "ef1ae183-14f2-4400-85d6-c48b79ef2819"}]: dispatch 2026-03-10T07:47:54.408 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:54 vm05 ceph-mon[50387]: from='osd.4 ' entity='osd.4' cmd='[{"prefix": "osd crush create-or-move", "id": 4, "weight":0.0195, "args": ["host=vm08", "root=default"]}]': finished 2026-03-10T07:47:54.408 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:54 vm05 ceph-mon[50387]: from='client.? 
' entity='client.bootstrap-osd' cmd='[{"prefix": "osd new", "uuid": "ef1ae183-14f2-4400-85d6-c48b79ef2819"}]': finished 2026-03-10T07:47:54.408 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:54 vm05 ceph-mon[50387]: osdmap e27: 6 total, 4 up, 6 in 2026-03-10T07:47:54.408 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:54 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "osd metadata", "id": 4}]: dispatch 2026-03-10T07:47:54.408 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:54 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "osd metadata", "id": 5}]: dispatch 2026-03-10T07:47:54.408 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:54 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "osd metadata", "id": 4}]: dispatch 2026-03-10T07:47:55.407 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:55 vm05 ceph-mon[50387]: from='client.? 
192.168.123.108:0/1687291630' entity='client.bootstrap-osd' cmd=[{"prefix": "mon getmap"}]: dispatch 2026-03-10T07:47:55.407 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:55 vm05 ceph-mon[50387]: osd.4 [v2:192.168.123.108:6808/551946261,v1:192.168.123.108:6809/551946261] boot 2026-03-10T07:47:55.407 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:55 vm05 ceph-mon[50387]: osdmap e28: 6 total, 5 up, 6 in 2026-03-10T07:47:55.407 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:55 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "osd metadata", "id": 4}]: dispatch 2026-03-10T07:47:55.407 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:55 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "osd metadata", "id": 5}]: dispatch 2026-03-10T07:47:55.418 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:55 vm08 ceph-mon[59917]: from='client.? 192.168.123.108:0/1687291630' entity='client.bootstrap-osd' cmd=[{"prefix": "mon getmap"}]: dispatch 2026-03-10T07:47:55.419 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:55 vm08 ceph-mon[59917]: osd.4 [v2:192.168.123.108:6808/551946261,v1:192.168.123.108:6809/551946261] boot 2026-03-10T07:47:55.419 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:55 vm08 ceph-mon[59917]: osdmap e28: 6 total, 5 up, 6 in 2026-03-10T07:47:55.419 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:55 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "osd metadata", "id": 4}]: dispatch 2026-03-10T07:47:55.419 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:55 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "osd metadata", "id": 5}]: dispatch 2026-03-10T07:47:56.407 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:56 vm05 ceph-mon[50387]: purged_snaps scrub starts 2026-03-10T07:47:56.407 
INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:56 vm05 ceph-mon[50387]: purged_snaps scrub ok 2026-03-10T07:47:56.407 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:56 vm05 ceph-mon[50387]: pgmap v52: 1 pgs: 1 active+clean; 449 KiB data, 133 MiB used, 100 GiB / 100 GiB avail 2026-03-10T07:47:56.418 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:56 vm08 ceph-mon[59917]: purged_snaps scrub starts 2026-03-10T07:47:56.418 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:56 vm08 ceph-mon[59917]: purged_snaps scrub ok 2026-03-10T07:47:56.418 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:56 vm08 ceph-mon[59917]: pgmap v52: 1 pgs: 1 active+clean; 449 KiB data, 133 MiB used, 100 GiB / 100 GiB avail 2026-03-10T07:47:57.407 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:57 vm05 ceph-mon[50387]: osdmap e29: 6 total, 5 up, 6 in 2026-03-10T07:47:57.407 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:57 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "osd metadata", "id": 5}]: dispatch 2026-03-10T07:47:57.418 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:57 vm08 ceph-mon[59917]: osdmap e29: 6 total, 5 up, 6 in 2026-03-10T07:47:57.418 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:57 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "osd metadata", "id": 5}]: dispatch 2026-03-10T07:47:58.137 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:58 vm08 ceph-mon[59917]: pgmap v54: 1 pgs: 1 active+clean; 449 KiB data, 133 MiB used, 100 GiB / 100 GiB avail 2026-03-10T07:47:58.407 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:58 vm05 ceph-mon[50387]: pgmap v54: 1 pgs: 1 active+clean; 449 KiB data, 133 MiB used, 100 GiB / 100 GiB avail 2026-03-10T07:47:59.315 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:59 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' 
cmd=[{"prefix": "auth get", "entity": "osd.5"}]: dispatch 2026-03-10T07:47:59.316 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:59 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T07:47:59.316 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:59 vm08 ceph-mon[59917]: Deploying daemon osd.5 on vm08 2026-03-10T07:47:59.316 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:47:59 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T07:47:59.407 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:59 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "auth get", "entity": "osd.5"}]: dispatch 2026-03-10T07:47:59.407 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:59 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T07:47:59.407 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:59 vm05 ceph-mon[50387]: Deploying daemon osd.5 on vm08 2026-03-10T07:47:59.407 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:47:59 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T07:48:00.142 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:48:00 vm08 ceph-mon[59917]: pgmap v55: 1 pgs: 1 active+clean; 449 KiB data, 133 MiB used, 100 GiB / 100 GiB avail 2026-03-10T07:48:00.142 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:48:00 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' 2026-03-10T07:48:00.143 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:48:00 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' 
2026-03-10T07:48:00.143 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:48:00 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T07:48:00.407 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:48:00 vm05 ceph-mon[50387]: pgmap v55: 1 pgs: 1 active+clean; 449 KiB data, 133 MiB used, 100 GiB / 100 GiB avail 2026-03-10T07:48:00.407 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:48:00 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' 2026-03-10T07:48:00.407 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:48:00 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' 2026-03-10T07:48:00.408 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:48:00 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T07:48:00.851 INFO:teuthology.orchestra.run.vm08.stdout:Created osd(s) 5 on host 'vm08' 2026-03-10T07:48:00.852 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:48:00.846+0000 7f64937fe700 1 -- 192.168.123.108:0/2323280314 <== mgr.14223 v2:192.168.123.105:6800/2 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+32 (secure 0 0 0) 0x7f649c0611d0 con 0x7f648806c4e0 2026-03-10T07:48:00.852 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:48:00.849+0000 7f64a45f6700 1 -- 192.168.123.108:0/2323280314 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f648806c4e0 msgr2=0x7f648806e9a0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:48:00.852 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:48:00.849+0000 7f64a45f6700 1 --2- 192.168.123.108:0/2323280314 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f648806c4e0 0x7f648806e9a0 secure :-1 s=READY pgs=54 cs=0 l=1 rev1=1 crypto rx=0x7f6498006010 tx=0x7f6498005ea0 comp rx=0 tx=0).stop 
2026-03-10T07:48:00.852 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:48:00.849+0000 7f64a45f6700 1 -- 192.168.123.108:0/2323280314 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f649c10a700 msgr2=0x7f649c116fc0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:48:00.852 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:48:00.849+0000 7f64a45f6700 1 --2- 192.168.123.108:0/2323280314 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f649c10a700 0x7f649c116fc0 secure :-1 s=READY pgs=24 cs=0 l=1 rev1=1 crypto rx=0x7f648c00dc40 tx=0x7f648c00df50 comp rx=0 tx=0).stop 2026-03-10T07:48:00.852 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:48:00.849+0000 7f64a45f6700 1 -- 192.168.123.108:0/2323280314 shutdown_connections 2026-03-10T07:48:00.852 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:48:00.849+0000 7f64a45f6700 1 --2- 192.168.123.108:0/2323280314 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f648806c4e0 0x7f648806e9a0 unknown :-1 s=CLOSED pgs=54 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:48:00.852 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:48:00.849+0000 7f64a45f6700 1 --2- 192.168.123.108:0/2323280314 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f649c107d90 0x7f649c116a80 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:48:00.852 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:48:00.849+0000 7f64a45f6700 1 --2- 192.168.123.108:0/2323280314 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f649c10a700 0x7f649c116fc0 unknown :-1 s=CLOSED pgs=24 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:48:00.852 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:48:00.849+0000 7f64a45f6700 1 -- 192.168.123.108:0/2323280314 >> 192.168.123.108:0/2323280314 conn(0x7f649c06dae0 msgr2=0x7f649c06ff40 unknown :-1 s=STATE_NONE 
l=0).mark_down 2026-03-10T07:48:00.852 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:48:00.850+0000 7f64a45f6700 1 -- 192.168.123.108:0/2323280314 shutdown_connections 2026-03-10T07:48:00.852 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:48:00.850+0000 7f64a45f6700 1 -- 192.168.123.108:0/2323280314 wait complete. 2026-03-10T07:48:00.912 DEBUG:teuthology.orchestra.run.vm08:osd.5> sudo journalctl -f -n 0 -u ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508@osd.5.service 2026-03-10T07:48:00.919 INFO:tasks.cephadm:Waiting for 6 OSDs to come up... 2026-03-10T07:48:00.919 DEBUG:teuthology.orchestra.run.vm05:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 12e9780e-1c55-11f1-8896-79f7c2e9b508 -- ceph osd stat -f json 2026-03-10T07:48:01.087 INFO:teuthology.orchestra.run.vm05.stderr:Inferring config /var/lib/ceph/12e9780e-1c55-11f1-8896-79f7c2e9b508/mon.vm05/config 2026-03-10T07:48:01.349 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:01.346+0000 7f8e9d9c8700 1 -- 192.168.123.105:0/4109131249 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f8e98104380 msgr2=0x7f8e981047e0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:48:01.350 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:01.346+0000 7f8e9d9c8700 1 --2- 192.168.123.105:0/4109131249 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f8e98104380 0x7f8e981047e0 secure :-1 s=READY pgs=195 cs=0 l=1 rev1=1 crypto rx=0x7f8e88009b00 tx=0x7f8e88009e10 comp rx=0 tx=0).stop 2026-03-10T07:48:01.350 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:01.347+0000 7f8e9d9c8700 1 -- 192.168.123.105:0/4109131249 shutdown_connections 2026-03-10T07:48:01.350 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:01.347+0000 7f8e9d9c8700 1 --2- 192.168.123.105:0/4109131249 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] 
conn(0x7f8e98104380 0x7f8e981047e0 unknown :-1 s=CLOSED pgs=195 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:48:01.350 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:01.347+0000 7f8e9d9c8700 1 --2- 192.168.123.105:0/4109131249 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f8e98103180 0x7f8e981035a0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:48:01.350 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:01.347+0000 7f8e9d9c8700 1 -- 192.168.123.105:0/4109131249 >> 192.168.123.105:0/4109131249 conn(0x7f8e980fe720 msgr2=0x7f8e98100b60 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:48:01.350 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:01.348+0000 7f8e9d9c8700 1 -- 192.168.123.105:0/4109131249 shutdown_connections 2026-03-10T07:48:01.350 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:01.348+0000 7f8e9d9c8700 1 -- 192.168.123.105:0/4109131249 wait complete. 
2026-03-10T07:48:01.350 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:01.348+0000 7f8e9d9c8700 1 Processor -- start 2026-03-10T07:48:01.350 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:01.348+0000 7f8e9d9c8700 1 -- start start 2026-03-10T07:48:01.350 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:01.349+0000 7f8e9d9c8700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f8e98103180 0x7f8e98198a80 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:48:01.351 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:01.349+0000 7f8e9d9c8700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f8e98104380 0x7f8e98198fc0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:48:01.351 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:01.349+0000 7f8e9d9c8700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f8e981995e0 con 0x7f8e98104380 2026-03-10T07:48:01.351 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:01.349+0000 7f8e9d9c8700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f8e98199720 con 0x7f8e98103180 2026-03-10T07:48:01.351 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:01.349+0000 7f8e967fc700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f8e98104380 0x7f8e98198fc0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:48:01.351 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:01.349+0000 7f8e967fc700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f8e98104380 0x7f8e98198fc0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am 
v2:192.168.123.105:46044/0 (socket says 192.168.123.105:46044) 2026-03-10T07:48:01.351 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:01.349+0000 7f8e967fc700 1 -- 192.168.123.105:0/204965995 learned_addr learned my addr 192.168.123.105:0/204965995 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T07:48:01.351 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:01.349+0000 7f8e967fc700 1 -- 192.168.123.105:0/204965995 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f8e98103180 msgr2=0x7f8e98198a80 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:48:01.351 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:01.349+0000 7f8e96ffd700 1 --2- 192.168.123.105:0/204965995 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f8e98103180 0x7f8e98198a80 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:48:01.351 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:01.349+0000 7f8e967fc700 1 --2- 192.168.123.105:0/204965995 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f8e98103180 0x7f8e98198a80 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:48:01.351 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:01.350+0000 7f8e967fc700 1 -- 192.168.123.105:0/204965995 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f8e880097e0 con 0x7f8e98104380 2026-03-10T07:48:01.351 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:01.350+0000 7f8e96ffd700 1 --2- 192.168.123.105:0/204965995 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f8e98103180 0x7f8e98198a80 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request state changed! 
2026-03-10T07:48:01.352 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:01.350+0000 7f8e967fc700 1 --2- 192.168.123.105:0/204965995 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f8e98104380 0x7f8e98198fc0 secure :-1 s=READY pgs=196 cs=0 l=1 rev1=1 crypto rx=0x7f8e88005230 tx=0x7f8e880056c0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:48:01.352 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:01.350+0000 7f8e9c9c6700 1 -- 192.168.123.105:0/204965995 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f8e8801d070 con 0x7f8e98104380 2026-03-10T07:48:01.352 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:01.350+0000 7f8e9c9c6700 1 -- 192.168.123.105:0/204965995 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f8e8800bc50 con 0x7f8e98104380 2026-03-10T07:48:01.352 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:01.350+0000 7f8e9c9c6700 1 -- 192.168.123.105:0/204965995 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f8e8800f800 con 0x7f8e98104380 2026-03-10T07:48:01.352 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:01.350+0000 7f8e9d9c8700 1 -- 192.168.123.105:0/204965995 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f8e9819e170 con 0x7f8e98104380 2026-03-10T07:48:01.353 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:01.350+0000 7f8e9d9c8700 1 -- 192.168.123.105:0/204965995 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f8e9819e660 con 0x7f8e98104380 2026-03-10T07:48:01.354 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:01.352+0000 7f8e9c9c6700 1 -- 192.168.123.105:0/204965995 <== mon.0 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 19) v1 ==== 90308+0+0 (secure 0 0 0) 0x7f8e8800f960 con 
0x7f8e98104380
2026-03-10T07:48:01.354 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:01.352+0000 7f8e9d9c8700 1 -- 192.168.123.105:0/204965995 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f8e98066e80 con 0x7f8e98104380
2026-03-10T07:48:01.357 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:01.355+0000 7f8e9c9c6700 1 --2- 192.168.123.105:0/204965995 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f8e8406c600 0x7f8e8406eac0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T07:48:01.357 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:01.355+0000 7f8e9c9c6700 1 -- 192.168.123.105:0/204965995 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(29..29 src has 1..29) v4 ==== 4129+0+0 (secure 0 0 0) 0x7f8e8808cc80 con 0x7f8e98104380
2026-03-10T07:48:01.357 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:01.355+0000 7f8e96ffd700 1 --2- 192.168.123.105:0/204965995 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f8e8406c600 0x7f8e8406eac0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-10T07:48:01.357 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:01.356+0000 7f8e9c9c6700 1 -- 192.168.123.105:0/204965995 <== mon.0 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7f8e8805c400 con 0x7f8e98104380
2026-03-10T07:48:01.358 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:01.356+0000 7f8e96ffd700 1 --2- 192.168.123.105:0/204965995 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f8e8406c600 0x7f8e8406eac0 secure :-1 s=READY pgs=59 cs=0 l=1 rev1=1 crypto rx=0x7f8e981041e0 tx=0x7f8e80008040 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-10T07:48:01.466 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:01.464+0000 7f8e9d9c8700 1 -- 192.168.123.105:0/204965995 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "osd stat", "format": "json"} v 0) v1 -- 0x7f8e9819ea30 con 0x7f8e98104380
2026-03-10T07:48:01.467 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:01.465+0000 7f8e9c9c6700 1 -- 192.168.123.105:0/204965995 <== mon.0 v2:192.168.123.105:3300/0 7 ==== mon_command_ack([{"prefix": "osd stat", "format": "json"}]=0 v29) v1 ==== 74+0+130 (secure 0 0 0) 0x7f8e8805bf90 con 0x7f8e98104380
2026-03-10T07:48:01.467 INFO:teuthology.orchestra.run.vm05.stdout:
2026-03-10T07:48:01.469 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:01.467+0000 7f8e9d9c8700 1 -- 192.168.123.105:0/204965995 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f8e8406c600 msgr2=0x7f8e8406eac0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T07:48:01.469 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:01.467+0000 7f8e9d9c8700 1 --2- 192.168.123.105:0/204965995 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f8e8406c600 0x7f8e8406eac0 secure :-1 s=READY pgs=59 cs=0 l=1 rev1=1 crypto rx=0x7f8e981041e0 tx=0x7f8e80008040 comp rx=0 tx=0).stop
2026-03-10T07:48:01.469 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:01.467+0000 7f8e9d9c8700 1 -- 192.168.123.105:0/204965995 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f8e98104380 msgr2=0x7f8e98198fc0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T07:48:01.469 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:01.467+0000 7f8e9d9c8700 1 --2- 192.168.123.105:0/204965995 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f8e98104380 0x7f8e98198fc0 secure :-1 s=READY pgs=196 cs=0 l=1 rev1=1 crypto rx=0x7f8e88005230 tx=0x7f8e880056c0 comp rx=0 tx=0).stop
2026-03-10T07:48:01.469 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:01.467+0000 7f8e9d9c8700 1 -- 192.168.123.105:0/204965995 shutdown_connections
2026-03-10T07:48:01.469 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:01.467+0000 7f8e9d9c8700 1 --2- 192.168.123.105:0/204965995 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f8e8406c600 0x7f8e8406eac0 unknown :-1 s=CLOSED pgs=59 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T07:48:01.469 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:01.467+0000 7f8e9d9c8700 1 --2- 192.168.123.105:0/204965995 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f8e98103180 0x7f8e98198a80 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T07:48:01.469 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:01.467+0000 7f8e9d9c8700 1 --2- 192.168.123.105:0/204965995 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f8e98104380 0x7f8e98198fc0 unknown :-1 s=CLOSED pgs=196 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T07:48:01.470 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:01.467+0000 7f8e9d9c8700 1 -- 192.168.123.105:0/204965995 >> 192.168.123.105:0/204965995 conn(0x7f8e980fe720 msgr2=0x7f8e981075b0 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-10T07:48:01.470 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:01.468+0000 7f8e9d9c8700 1 -- 192.168.123.105:0/204965995 shutdown_connections
2026-03-10T07:48:01.470 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:01.468+0000 7f8e9d9c8700 1 -- 192.168.123.105:0/204965995 wait complete.
2026-03-10T07:48:01.539 INFO:teuthology.orchestra.run.vm05.stdout:{"epoch":29,"num_osds":6,"num_up_osds":5,"osd_up_since":1773128875,"num_in_osds":6,"osd_in_since":1773128874,"num_remapped_pgs":0}
2026-03-10T07:48:01.821 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:48:01 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke'
2026-03-10T07:48:01.821 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:48:01 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke'
2026-03-10T07:48:01.821 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:48:01 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke'
2026-03-10T07:48:01.821 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:48:01 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke'
2026-03-10T07:48:01.821 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:48:01 vm05 ceph-mon[50387]: from='client.? 192.168.123.105:0/204965995' entity='client.admin' cmd=[{"prefix": "osd stat", "format": "json"}]: dispatch
2026-03-10T07:48:02.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:48:01 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke'
2026-03-10T07:48:02.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:48:01 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke'
2026-03-10T07:48:02.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:48:01 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke'
2026-03-10T07:48:02.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:48:01 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke'
2026-03-10T07:48:02.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:48:01 vm08 ceph-mon[59917]: from='client.? 192.168.123.105:0/204965995' entity='client.admin' cmd=[{"prefix": "osd stat", "format": "json"}]: dispatch
2026-03-10T07:48:02.540 DEBUG:teuthology.orchestra.run.vm05:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 12e9780e-1c55-11f1-8896-79f7c2e9b508 -- ceph osd stat -f json
2026-03-10T07:48:02.691 INFO:teuthology.orchestra.run.vm05.stderr:Inferring config /var/lib/ceph/12e9780e-1c55-11f1-8896-79f7c2e9b508/mon.vm05/config
2026-03-10T07:48:02.828 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:48:02 vm05 ceph-mon[50387]: pgmap v56: 1 pgs: 1 active+clean; 449 KiB data, 133 MiB used, 100 GiB / 100 GiB avail
2026-03-10T07:48:02.828 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:48:02 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke'
2026-03-10T07:48:02.828 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:48:02 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke'
2026-03-10T07:48:02.961 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:02.958+0000 7f7537ded700 1 -- 192.168.123.105:0/10602656 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f75300ff700 msgr2=0x7f75300ffb20 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T07:48:02.961 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:02.958+0000 7f7537ded700 1 --2- 192.168.123.105:0/10602656 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f75300ff700 0x7f75300ffb20 secure :-1 s=READY pgs=197 cs=0 l=1 rev1=1 crypto rx=0x7f7524009b00 tx=0x7f7524009e10 comp rx=0 tx=0).stop
2026-03-10T07:48:02.961 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:02.959+0000 7f7537ded700 1 -- 192.168.123.105:0/10602656 shutdown_connections
2026-03-10T07:48:02.961 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:02.959+0000 7f7537ded700 1 --2- 192.168.123.105:0/10602656 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f7530100060 0x7f75301004e0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T07:48:02.961 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:02.959+0000 7f7537ded700 1 --2- 192.168.123.105:0/10602656 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f75300ff700 0x7f75300ffb20 unknown :-1 s=CLOSED pgs=197 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T07:48:02.961 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:02.959+0000 7f7537ded700 1 -- 192.168.123.105:0/10602656 >> 192.168.123.105:0/10602656 conn(0x7f75300fb260 msgr2=0x7f75300fd6e0 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-10T07:48:02.961 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:02.960+0000 7f7537ded700 1 -- 192.168.123.105:0/10602656 shutdown_connections
2026-03-10T07:48:02.961 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:02.960+0000 7f7537ded700 1 -- 192.168.123.105:0/10602656 wait complete.
2026-03-10T07:48:02.962 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:02.960+0000 7f7537ded700 1 Processor -- start
2026-03-10T07:48:02.962 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:02.960+0000 7f7537ded700 1 -- start start
2026-03-10T07:48:02.962 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:02.961+0000 7f7537ded700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f75300ff700 0x7f7530198860 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T07:48:02.963 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:02.961+0000 7f7537ded700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f7530100060 0x7f7530198da0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T07:48:02.963 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:02.961+0000 7f7537ded700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f75301993c0 con 0x7f75300ff700
2026-03-10T07:48:02.963 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:02.961+0000 7f7537ded700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f7530199500 con 0x7f7530100060
2026-03-10T07:48:02.963 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:02.961+0000 7f7535388700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f7530100060 0x7f7530198da0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-10T07:48:02.963 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:02.961+0000 7f7535388700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f7530100060 0x7f7530198da0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.108:3300/0 says I am v2:192.168.123.105:39814/0 (socket says 192.168.123.105:39814)
2026-03-10T07:48:02.963 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:02.961+0000 7f7535388700 1 -- 192.168.123.105:0/1604062667 learned_addr learned my addr 192.168.123.105:0/1604062667 (peer_addr_for_me v2:192.168.123.105:0/0)
2026-03-10T07:48:02.963 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:02.961+0000 7f7535b89700 1 --2- 192.168.123.105:0/1604062667 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f75300ff700 0x7f7530198860 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-10T07:48:02.963 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:02.962+0000 7f7535b89700 1 -- 192.168.123.105:0/1604062667 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f7530100060 msgr2=0x7f7530198da0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T07:48:02.963 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:02.962+0000 7f7535b89700 1 --2- 192.168.123.105:0/1604062667 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f7530100060 0x7f7530198da0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T07:48:02.964 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:02.962+0000 7f7535b89700 1 -- 192.168.123.105:0/1604062667 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f75240097e0 con 0x7f75300ff700
2026-03-10T07:48:02.964 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:02.962+0000 7f7535388700 1 --2- 192.168.123.105:0/1604062667 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f7530100060 0x7f7530198da0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_auth_done state changed!
2026-03-10T07:48:02.964 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:02.962+0000 7f7535b89700 1 --2- 192.168.123.105:0/1604062667 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f75300ff700 0x7f7530198860 secure :-1 s=READY pgs=198 cs=0 l=1 rev1=1 crypto rx=0x7f7524009fd0 tx=0x7f7524004990 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-10T07:48:02.969 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:02.963+0000 7f7522ffd700 1 -- 192.168.123.105:0/1604062667 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f752401d070 con 0x7f75300ff700
2026-03-10T07:48:02.969 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:02.963+0000 7f7522ffd700 1 -- 192.168.123.105:0/1604062667 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f7524022470 con 0x7f75300ff700
2026-03-10T07:48:02.969 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:02.963+0000 7f7522ffd700 1 -- 192.168.123.105:0/1604062667 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f752400f630 con 0x7f75300ff700
2026-03-10T07:48:02.969 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:02.963+0000 7f7537ded700 1 -- 192.168.123.105:0/1604062667 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f753019df50 con 0x7f75300ff700
2026-03-10T07:48:02.969 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:02.963+0000 7f7537ded700 1 -- 192.168.123.105:0/1604062667 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f753019e3c0 con 0x7f75300ff700
2026-03-10T07:48:02.969 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:02.964+0000 7f7522ffd700 1 -- 192.168.123.105:0/1604062667 <== mon.0 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 19) v1 ==== 90308+0+0 (secure 0 0 0) 0x7f752400f790 con 0x7f75300ff700
2026-03-10T07:48:02.969 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:02.964+0000 7f7537ded700 1 -- 192.168.123.105:0/1604062667 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f753004ea90 con 0x7f75300ff700
2026-03-10T07:48:02.969 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:02.967+0000 7f7522ffd700 1 --2- 192.168.123.105:0/1604062667 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f751c06c4e0 0x7f751c06e9a0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T07:48:02.969 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:02.967+0000 7f7522ffd700 1 -- 192.168.123.105:0/1604062667 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(29..29 src has 1..29) v4 ==== 4129+0+0 (secure 0 0 0) 0x7f752408d9a0 con 0x7f75300ff700
2026-03-10T07:48:02.969 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:02.967+0000 7f7535388700 1 --2- 192.168.123.105:0/1604062667 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f751c06c4e0 0x7f751c06e9a0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-10T07:48:02.970 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:02.968+0000 7f7535388700 1 --2- 192.168.123.105:0/1604062667 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f751c06c4e0 0x7f751c06e9a0 secure :-1 s=READY pgs=60 cs=0 l=1 rev1=1 crypto rx=0x7f752c005950 tx=0x7f752c00b500 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-10T07:48:02.970 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:02.968+0000 7f7522ffd700 1 -- 192.168.123.105:0/1604062667 <== mon.0 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7f752405c1d0 con 0x7f75300ff700
2026-03-10T07:48:03.075 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:03.073+0000 7f7537ded700 1 -- 192.168.123.105:0/1604062667 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "osd stat", "format": "json"} v 0) v1 -- 0x7f753019e830 con 0x7f75300ff700
2026-03-10T07:48:03.076 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:03.074+0000 7f7522ffd700 1 -- 192.168.123.105:0/1604062667 <== mon.0 v2:192.168.123.105:3300/0 7 ==== mon_command_ack([{"prefix": "osd stat", "format": "json"}]=0 v29) v1 ==== 74+0+130 (secure 0 0 0) 0x7f752405bd60 con 0x7f75300ff700
2026-03-10T07:48:03.076 INFO:teuthology.orchestra.run.vm05.stdout:
2026-03-10T07:48:03.078 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:48:02 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "config rm", "who": "osd/host:vm08", "name": "osd_memory_target"}]: dispatch
2026-03-10T07:48:03.078 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:48:02 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-10T07:48:03.078 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:48:02 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
2026-03-10T07:48:03.079 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:48:02 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke'
2026-03-10T07:48:03.079 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:48:02 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch
2026-03-10T07:48:03.079 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:48:02 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-10T07:48:03.079 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:48:02 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
2026-03-10T07:48:03.079 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:48:02 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke'
2026-03-10T07:48:03.080 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:03.076+0000 7f7537ded700 1 -- 192.168.123.105:0/1604062667 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f751c06c4e0 msgr2=0x7f751c06e9a0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T07:48:03.080 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:03.076+0000 7f7537ded700 1 --2- 192.168.123.105:0/1604062667 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f751c06c4e0 0x7f751c06e9a0 secure :-1 s=READY pgs=60 cs=0 l=1 rev1=1 crypto rx=0x7f752c005950 tx=0x7f752c00b500 comp rx=0 tx=0).stop
2026-03-10T07:48:03.080 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:03.077+0000 7f7537ded700 1 -- 192.168.123.105:0/1604062667 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f75300ff700 msgr2=0x7f7530198860 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T07:48:03.080 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:03.077+0000 7f7537ded700 1 --2- 192.168.123.105:0/1604062667 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f75300ff700 0x7f7530198860 secure :-1 s=READY pgs=198 cs=0 l=1 rev1=1 crypto rx=0x7f7524009fd0 tx=0x7f7524004990 comp rx=0 tx=0).stop
2026-03-10T07:48:03.080 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:03.079+0000 7f7537ded700 1 -- 192.168.123.105:0/1604062667 shutdown_connections
2026-03-10T07:48:03.080 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:03.079+0000 7f7537ded700 1 --2- 192.168.123.105:0/1604062667 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f751c06c4e0 0x7f751c06e9a0 unknown :-1 s=CLOSED pgs=60 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T07:48:03.080 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:03.079+0000 7f7537ded700 1 --2- 192.168.123.105:0/1604062667 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f75300ff700 0x7f7530198860 unknown :-1 s=CLOSED pgs=198 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T07:48:03.081 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:03.079+0000 7f7537ded700 1 --2- 192.168.123.105:0/1604062667 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f7530100060 0x7f7530198da0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T07:48:03.081 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:03.079+0000 7f7537ded700 1 -- 192.168.123.105:0/1604062667 >> 192.168.123.105:0/1604062667 conn(0x7f75300fb260 msgr2=0x7f7530107d30 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-10T07:48:03.081 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:03.079+0000 7f7537ded700 1 -- 192.168.123.105:0/1604062667 shutdown_connections
2026-03-10T07:48:03.081 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:03.079+0000 7f7537ded700 1 -- 192.168.123.105:0/1604062667 wait complete.
2026-03-10T07:48:03.164 INFO:teuthology.orchestra.run.vm05.stdout:{"epoch":29,"num_osds":6,"num_up_osds":5,"osd_up_since":1773128875,"num_in_osds":6,"osd_in_since":1773128874,"num_remapped_pgs":0}
2026-03-10T07:48:03.168 INFO:journalctl@ceph.osd.5.vm08.stdout:Mar 10 07:48:03 vm08 ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508-osd-5[78393]: 2026-03-10T07:48:03.054+0000 7fc86bfe9640 -1 osd.5 0 log_to_monitors true
2026-03-10T07:48:03.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:48:02 vm08 ceph-mon[59917]: pgmap v56: 1 pgs: 1 active+clean; 449 KiB data, 133 MiB used, 100 GiB / 100 GiB avail
2026-03-10T07:48:03.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:48:02 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke'
2026-03-10T07:48:03.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:48:02 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke'
2026-03-10T07:48:03.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:48:02 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "config rm", "who": "osd/host:vm08", "name": "osd_memory_target"}]: dispatch
2026-03-10T07:48:03.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:48:02 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-10T07:48:03.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:48:02 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
2026-03-10T07:48:03.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:48:02 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke'
2026-03-10T07:48:03.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:48:02 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch
2026-03-10T07:48:03.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:48:02 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-10T07:48:03.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:48:02 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
2026-03-10T07:48:03.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:48:02 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke'
2026-03-10T07:48:04.158 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:48:03 vm05 ceph-mon[50387]: Detected new or changed devices on vm08
2026-03-10T07:48:04.158 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:48:03 vm05 ceph-mon[50387]: from='osd.5 ' entity='osd.5' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["5"]}]: dispatch
2026-03-10T07:48:04.158 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:48:03 vm05 ceph-mon[50387]: from='osd.5 [v2:192.168.123.108:6816/2677071054,v1:192.168.123.108:6817/2677071054]' entity='osd.5' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["5"]}]: dispatch
2026-03-10T07:48:04.158 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:48:03 vm05 ceph-mon[50387]: from='client.? 192.168.123.105:0/1604062667' entity='client.admin' cmd=[{"prefix": "osd stat", "format": "json"}]: dispatch
2026-03-10T07:48:04.165 DEBUG:teuthology.orchestra.run.vm05:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 12e9780e-1c55-11f1-8896-79f7c2e9b508 -- ceph osd stat -f json
2026-03-10T07:48:04.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:48:03 vm08 ceph-mon[59917]: Detected new or changed devices on vm08
2026-03-10T07:48:04.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:48:03 vm08 ceph-mon[59917]: from='osd.5 ' entity='osd.5' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["5"]}]: dispatch
2026-03-10T07:48:04.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:48:03 vm08 ceph-mon[59917]: from='osd.5 [v2:192.168.123.108:6816/2677071054,v1:192.168.123.108:6817/2677071054]' entity='osd.5' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["5"]}]: dispatch
2026-03-10T07:48:04.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:48:03 vm08 ceph-mon[59917]: from='client.? 192.168.123.105:0/1604062667' entity='client.admin' cmd=[{"prefix": "osd stat", "format": "json"}]: dispatch
2026-03-10T07:48:04.313 INFO:teuthology.orchestra.run.vm05.stderr:Inferring config /var/lib/ceph/12e9780e-1c55-11f1-8896-79f7c2e9b508/mon.vm05/config
2026-03-10T07:48:04.560 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:04.558+0000 7fb2e424f700 1 -- 192.168.123.105:0/1579091389 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb2dc100100 msgr2=0x7fb2dc100580 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T07:48:04.561 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:04.558+0000 7fb2e424f700 1 --2- 192.168.123.105:0/1579091389 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb2dc100100 0x7fb2dc100580 secure :-1 s=READY pgs=199 cs=0 l=1 rev1=1 crypto rx=0x7fb2d0009b00 tx=0x7fb2d0009e10 comp rx=0 tx=0).stop
2026-03-10T07:48:04.561 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:04.559+0000 7fb2e424f700 1 -- 192.168.123.105:0/1579091389 shutdown_connections
2026-03-10T07:48:04.561 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:04.559+0000 7fb2e424f700 1 --2- 192.168.123.105:0/1579091389 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb2dc100100 0x7fb2dc100580 unknown :-1 s=CLOSED pgs=199 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T07:48:04.561 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:04.559+0000 7fb2e424f700 1 --2- 192.168.123.105:0/1579091389 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fb2dc0ff7a0 0x7fb2dc0ffbc0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T07:48:04.561 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:04.559+0000 7fb2e424f700 1 -- 192.168.123.105:0/1579091389 >> 192.168.123.105:0/1579091389 conn(0x7fb2dc0fb380 msgr2=0x7fb2dc0fd800 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-10T07:48:04.561 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:04.559+0000 7fb2e424f700 1 -- 192.168.123.105:0/1579091389 shutdown_connections
2026-03-10T07:48:04.561 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:04.560+0000 7fb2e424f700 1 -- 192.168.123.105:0/1579091389 wait complete.
2026-03-10T07:48:04.562 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:04.560+0000 7fb2e424f700 1 Processor -- start
2026-03-10T07:48:04.562 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:04.560+0000 7fb2e424f700 1 -- start start
2026-03-10T07:48:04.562 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:04.560+0000 7fb2e424f700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb2dc0ff7a0 0x7fb2dc198a00 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T07:48:04.562 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:04.560+0000 7fb2e424f700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fb2dc100100 0x7fb2dc198f40 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T07:48:04.562 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:04.560+0000 7fb2e424f700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fb2dc199560 con 0x7fb2dc0ff7a0
2026-03-10T07:48:04.562 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:04.560+0000 7fb2e424f700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fb2dc1996a0 con 0x7fb2dc100100
2026-03-10T07:48:04.563 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:04.560+0000 7fb2e1feb700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb2dc0ff7a0 0x7fb2dc198a00 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-10T07:48:04.563 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:04.560+0000 7fb2e1feb700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb2dc0ff7a0 0x7fb2dc198a00 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.105:46066/0 (socket says 192.168.123.105:46066)
2026-03-10T07:48:04.563 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:04.560+0000 7fb2e1feb700 1 -- 192.168.123.105:0/2062227793 learned_addr learned my addr 192.168.123.105:0/2062227793 (peer_addr_for_me v2:192.168.123.105:0/0)
2026-03-10T07:48:04.563 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:04.561+0000 7fb2e1feb700 1 -- 192.168.123.105:0/2062227793 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fb2dc100100 msgr2=0x7fb2dc198f40 unknown :-1 s=STATE_CONNECTING_RE l=1).mark_down
2026-03-10T07:48:04.563 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:04.561+0000 7fb2e1feb700 1 --2- 192.168.123.105:0/2062227793 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fb2dc100100 0x7fb2dc198f40 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T07:48:04.563 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:04.561+0000 7fb2e1feb700 1 -- 192.168.123.105:0/2062227793 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fb2d00097e0 con 0x7fb2dc0ff7a0
2026-03-10T07:48:04.564 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:04.561+0000 7fb2e1feb700 1 --2- 192.168.123.105:0/2062227793 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb2dc0ff7a0 0x7fb2dc198a00 secure :-1 s=READY pgs=200 cs=0 l=1 rev1=1 crypto rx=0x7fb2d800d8d0 tx=0x7fb2d800dc90 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-10T07:48:04.564 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:04.561+0000 7fb2ceffd700 1 -- 192.168.123.105:0/2062227793 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fb2d8009940 con 0x7fb2dc0ff7a0
2026-03-10T07:48:04.564 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:04.561+0000 7fb2ceffd700 1 -- 192.168.123.105:0/2062227793 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fb2d8010460 con 0x7fb2dc0ff7a0
2026-03-10T07:48:04.564 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:04.561+0000 7fb2ceffd700 1 -- 192.168.123.105:0/2062227793 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fb2d800f5d0 con 0x7fb2dc0ff7a0
2026-03-10T07:48:04.564 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:04.561+0000 7fb2e424f700 1 -- 192.168.123.105:0/2062227793 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fb2dc19e150 con 0x7fb2dc0ff7a0
2026-03-10T07:48:04.564 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:04.561+0000 7fb2e424f700 1 -- 192.168.123.105:0/2062227793 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fb2dc19e6a0 con 0x7fb2dc0ff7a0
2026-03-10T07:48:04.564 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:04.562+0000 7fb2ceffd700 1 -- 192.168.123.105:0/2062227793 <== mon.0 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 19) v1 ==== 90308+0+0 (secure 0 0 0) 0x7fb2d800f730 con 0x7fb2dc0ff7a0
2026-03-10T07:48:04.565 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:04.563+0000 7fb2ceffd700 1 --2- 192.168.123.105:0/2062227793 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fb2c806c600 0x7fb2c806eac0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T07:48:04.565 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:04.563+0000 7fb2e17ea700 1 --2- 192.168.123.105:0/2062227793 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fb2c806c600 0x7fb2c806eac0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-10T07:48:04.565 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:04.564+0000 7fb2e424f700 1 -- 192.168.123.105:0/2062227793 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fb2dc10be00 con 0x7fb2dc0ff7a0
2026-03-10T07:48:04.568 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:04.564+0000 7fb2ceffd700 1 -- 192.168.123.105:0/2062227793 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(30..30 src has 1..30) v4 ==== 4150+0+0 (secure 0 0 0) 0x7fb2d808c1b0 con 0x7fb2dc0ff7a0
2026-03-10T07:48:04.568 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:04.567+0000 7fb2e17ea700 1 --2- 192.168.123.105:0/2062227793 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fb2c806c600 0x7fb2c806eac0 secure :-1 s=READY pgs=61 cs=0 l=1 rev1=1 crypto rx=0x7fb2d0005b40 tx=0x7fb2d0005ad0 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-10T07:48:04.568 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:04.567+0000 7fb2ceffd700 1 -- 192.168.123.105:0/2062227793 <== mon.0 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7fb2d805a710 con 0x7fb2dc0ff7a0
2026-03-10T07:48:04.674 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:04.672+0000 7fb2e424f700 1 -- 192.168.123.105:0/2062227793 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "osd stat", "format": "json"} v 0) v1 -- 0x7fb2dc10bff0 con 0x7fb2dc0ff7a0
2026-03-10T07:48:04.674 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:04.672+0000 7fb2ceffd700 1
-- 192.168.123.105:0/2062227793 <== mon.0 v2:192.168.123.105:3300/0 7 ==== mon_command_ack([{"prefix": "osd stat", "format": "json"}]=0 v30) v1 ==== 74+0+130 (secure 0 0 0) 0x7fb2d8059230 con 0x7fb2dc0ff7a0 2026-03-10T07:48:04.674 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-10T07:48:04.676 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:04.674+0000 7fb2e424f700 1 -- 192.168.123.105:0/2062227793 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fb2c806c600 msgr2=0x7fb2c806eac0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:48:04.676 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:04.674+0000 7fb2e424f700 1 --2- 192.168.123.105:0/2062227793 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fb2c806c600 0x7fb2c806eac0 secure :-1 s=READY pgs=61 cs=0 l=1 rev1=1 crypto rx=0x7fb2d0005b40 tx=0x7fb2d0005ad0 comp rx=0 tx=0).stop 2026-03-10T07:48:04.676 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:04.674+0000 7fb2e424f700 1 -- 192.168.123.105:0/2062227793 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb2dc0ff7a0 msgr2=0x7fb2dc198a00 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:48:04.677 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:04.674+0000 7fb2e424f700 1 --2- 192.168.123.105:0/2062227793 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb2dc0ff7a0 0x7fb2dc198a00 secure :-1 s=READY pgs=200 cs=0 l=1 rev1=1 crypto rx=0x7fb2d800d8d0 tx=0x7fb2d800dc90 comp rx=0 tx=0).stop 2026-03-10T07:48:04.677 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:04.675+0000 7fb2e424f700 1 -- 192.168.123.105:0/2062227793 shutdown_connections 2026-03-10T07:48:04.677 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:04.675+0000 7fb2e424f700 1 --2- 192.168.123.105:0/2062227793 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fb2c806c600 0x7fb2c806eac0 unknown :-1 s=CLOSED pgs=61 cs=0 l=1 rev1=1 
crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:48:04.677 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:04.675+0000 7fb2e424f700 1 --2- 192.168.123.105:0/2062227793 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb2dc0ff7a0 0x7fb2dc198a00 unknown :-1 s=CLOSED pgs=200 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:48:04.677 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:04.675+0000 7fb2e424f700 1 --2- 192.168.123.105:0/2062227793 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fb2dc100100 0x7fb2dc198f40 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:48:04.677 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:04.675+0000 7fb2e424f700 1 -- 192.168.123.105:0/2062227793 >> 192.168.123.105:0/2062227793 conn(0x7fb2dc0fb380 msgr2=0x7fb2dc107dd0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:48:04.677 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:04.675+0000 7fb2e424f700 1 -- 192.168.123.105:0/2062227793 shutdown_connections 2026-03-10T07:48:04.677 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:04.675+0000 7fb2e424f700 1 -- 192.168.123.105:0/2062227793 wait complete. 
2026-03-10T07:48:04.724 INFO:teuthology.orchestra.run.vm05.stdout:{"epoch":30,"num_osds":6,"num_up_osds":5,"osd_up_since":1773128875,"num_in_osds":6,"osd_in_since":1773128874,"num_remapped_pgs":0} 2026-03-10T07:48:05.158 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:48:04 vm05 ceph-mon[50387]: pgmap v57: 1 pgs: 1 active+clean; 449 KiB data, 133 MiB used, 100 GiB / 100 GiB avail 2026-03-10T07:48:05.158 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:48:04 vm05 ceph-mon[50387]: from='osd.5 ' entity='osd.5' cmd='[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["5"]}]': finished 2026-03-10T07:48:05.158 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:48:04 vm05 ceph-mon[50387]: osdmap e30: 6 total, 5 up, 6 in 2026-03-10T07:48:05.158 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:48:04 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "osd metadata", "id": 5}]: dispatch 2026-03-10T07:48:05.158 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:48:04 vm05 ceph-mon[50387]: from='osd.5 ' entity='osd.5' cmd=[{"prefix": "osd crush create-or-move", "id": 5, "weight":0.0195, "args": ["host=vm08", "root=default"]}]: dispatch 2026-03-10T07:48:05.158 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:48:04 vm05 ceph-mon[50387]: from='osd.5 [v2:192.168.123.108:6816/2677071054,v1:192.168.123.108:6817/2677071054]' entity='osd.5' cmd=[{"prefix": "osd crush create-or-move", "id": 5, "weight":0.0195, "args": ["host=vm08", "root=default"]}]: dispatch 2026-03-10T07:48:05.158 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:48:04 vm05 ceph-mon[50387]: from='client.? 
192.168.123.105:0/2062227793' entity='client.admin' cmd=[{"prefix": "osd stat", "format": "json"}]: dispatch 2026-03-10T07:48:05.168 INFO:journalctl@ceph.osd.5.vm08.stdout:Mar 10 07:48:04 vm08 ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508-osd-5[78393]: 2026-03-10T07:48:04.842+0000 7fc86264f700 -1 osd.5 0 waiting for initial osdmap 2026-03-10T07:48:05.168 INFO:journalctl@ceph.osd.5.vm08.stdout:Mar 10 07:48:04 vm08 ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508-osd-5[78393]: 2026-03-10T07:48:04.854+0000 7fc85cc41700 -1 osd.5 31 set_numa_affinity unable to identify public interface '' numa node: (2) No such file or directory 2026-03-10T07:48:05.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:48:04 vm08 ceph-mon[59917]: pgmap v57: 1 pgs: 1 active+clean; 449 KiB data, 133 MiB used, 100 GiB / 100 GiB avail 2026-03-10T07:48:05.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:48:04 vm08 ceph-mon[59917]: from='osd.5 ' entity='osd.5' cmd='[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["5"]}]': finished 2026-03-10T07:48:05.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:48:04 vm08 ceph-mon[59917]: osdmap e30: 6 total, 5 up, 6 in 2026-03-10T07:48:05.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:48:04 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "osd metadata", "id": 5}]: dispatch 2026-03-10T07:48:05.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:48:04 vm08 ceph-mon[59917]: from='osd.5 ' entity='osd.5' cmd=[{"prefix": "osd crush create-or-move", "id": 5, "weight":0.0195, "args": ["host=vm08", "root=default"]}]: dispatch 2026-03-10T07:48:05.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:48:04 vm08 ceph-mon[59917]: from='osd.5 [v2:192.168.123.108:6816/2677071054,v1:192.168.123.108:6817/2677071054]' entity='osd.5' cmd=[{"prefix": "osd crush create-or-move", "id": 5, "weight":0.0195, "args": ["host=vm08", "root=default"]}]: dispatch 2026-03-10T07:48:05.168 
INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:48:04 vm08 ceph-mon[59917]: from='client.? 192.168.123.105:0/2062227793' entity='client.admin' cmd=[{"prefix": "osd stat", "format": "json"}]: dispatch 2026-03-10T07:48:05.726 DEBUG:teuthology.orchestra.run.vm05:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 12e9780e-1c55-11f1-8896-79f7c2e9b508 -- ceph osd stat -f json 2026-03-10T07:48:05.885 INFO:teuthology.orchestra.run.vm05.stderr:Inferring config /var/lib/ceph/12e9780e-1c55-11f1-8896-79f7c2e9b508/mon.vm05/config 2026-03-10T07:48:06.033 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:48:05 vm05 ceph-mon[50387]: from='osd.5 ' entity='osd.5' cmd='[{"prefix": "osd crush create-or-move", "id": 5, "weight":0.0195, "args": ["host=vm08", "root=default"]}]': finished 2026-03-10T07:48:06.033 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:48:05 vm05 ceph-mon[50387]: osdmap e31: 6 total, 5 up, 6 in 2026-03-10T07:48:06.033 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:48:05 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "osd metadata", "id": 5}]: dispatch 2026-03-10T07:48:06.033 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:48:05 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "osd metadata", "id": 5}]: dispatch 2026-03-10T07:48:06.033 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:48:05 vm05 ceph-mon[50387]: pgmap v60: 1 pgs: 1 active+clean; 449 KiB data, 133 MiB used, 100 GiB / 100 GiB avail 2026-03-10T07:48:06.138 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:06.136+0000 7f7588a9c700 1 -- 192.168.123.105:0/2398242342 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f7584103120 msgr2=0x7f7584103540 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:48:06.138 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:06.136+0000 7f7588a9c700 1 --2- 192.168.123.105:0/2398242342 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f7584103120 0x7f7584103540 secure :-1 s=READY pgs=201 cs=0 l=1 rev1=1 crypto rx=0x7f756c009b00 tx=0x7f756c009e10 comp rx=0 tx=0).stop 2026-03-10T07:48:06.138 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:06.137+0000 7f7588a9c700 1 -- 192.168.123.105:0/2398242342 shutdown_connections 2026-03-10T07:48:06.138 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:06.137+0000 7f7588a9c700 1 --2- 192.168.123.105:0/2398242342 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f7584104320 0x7f7584104780 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:48:06.139 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:06.137+0000 7f7588a9c700 1 --2- 192.168.123.105:0/2398242342 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f7584103120 0x7f7584103540 unknown :-1 s=CLOSED pgs=201 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:48:06.139 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:06.137+0000 7f7588a9c700 1 -- 192.168.123.105:0/2398242342 >> 192.168.123.105:0/2398242342 conn(0x7f75840fe6c0 msgr2=0x7f7584100b00 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:48:06.139 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:06.137+0000 7f7588a9c700 1 -- 192.168.123.105:0/2398242342 shutdown_connections 2026-03-10T07:48:06.139 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:06.137+0000 7f7588a9c700 1 -- 192.168.123.105:0/2398242342 wait complete. 
2026-03-10T07:48:06.139 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:06.137+0000 7f7588a9c700 1 Processor -- start 2026-03-10T07:48:06.139 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:06.137+0000 7f7588a9c700 1 -- start start 2026-03-10T07:48:06.139 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:06.137+0000 7f7588a9c700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f7584103120 0x7f75841946c0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:48:06.139 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:06.137+0000 7f7588a9c700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f7584104320 0x7f7584194c00 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:48:06.139 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:06.137+0000 7f7588a9c700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f75841951d0 con 0x7f7584104320 2026-03-10T07:48:06.139 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:06.137+0000 7f7588a9c700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f7584195310 con 0x7f7584103120 2026-03-10T07:48:06.139 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:06.138+0000 7f7581d9b700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f7584104320 0x7f7584194c00 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:48:06.140 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:06.138+0000 7f7581d9b700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f7584104320 0x7f7584194c00 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am 
v2:192.168.123.105:46092/0 (socket says 192.168.123.105:46092) 2026-03-10T07:48:06.140 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:06.138+0000 7f7581d9b700 1 -- 192.168.123.105:0/2040553052 learned_addr learned my addr 192.168.123.105:0/2040553052 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T07:48:06.140 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:06.138+0000 7f758259c700 1 --2- 192.168.123.105:0/2040553052 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f7584103120 0x7f75841946c0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:48:06.140 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:06.138+0000 7f7581d9b700 1 -- 192.168.123.105:0/2040553052 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f7584103120 msgr2=0x7f75841946c0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:48:06.140 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:06.138+0000 7f7581d9b700 1 --2- 192.168.123.105:0/2040553052 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f7584103120 0x7f75841946c0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:48:06.140 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:06.138+0000 7f7581d9b700 1 -- 192.168.123.105:0/2040553052 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f756c0097e0 con 0x7f7584104320 2026-03-10T07:48:06.140 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:06.138+0000 7f758259c700 1 --2- 192.168.123.105:0/2040553052 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f7584103120 0x7f75841946c0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request state changed! 
2026-03-10T07:48:06.140 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:06.138+0000 7f7581d9b700 1 --2- 192.168.123.105:0/2040553052 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f7584104320 0x7f7584194c00 secure :-1 s=READY pgs=202 cs=0 l=1 rev1=1 crypto rx=0x7f757400b700 tx=0x7f757400bac0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:48:06.140 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:06.138+0000 7f757b7fe700 1 -- 192.168.123.105:0/2040553052 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f7574010840 con 0x7f7584104320 2026-03-10T07:48:06.141 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:06.139+0000 7f7588a9c700 1 -- 192.168.123.105:0/2040553052 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f758406a8d0 con 0x7f7584104320 2026-03-10T07:48:06.141 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:06.139+0000 7f7588a9c700 1 -- 192.168.123.105:0/2040553052 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f758406add0 con 0x7f7584104320 2026-03-10T07:48:06.141 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:06.140+0000 7f757b7fe700 1 -- 192.168.123.105:0/2040553052 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f7574010e80 con 0x7f7584104320 2026-03-10T07:48:06.142 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:06.140+0000 7f757b7fe700 1 -- 192.168.123.105:0/2040553052 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f757400d590 con 0x7f7584104320 2026-03-10T07:48:06.142 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:06.140+0000 7f7588a9c700 1 -- 192.168.123.105:0/2040553052 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": 
"get_command_descriptions"} v 0) v1 -- 0x7f7564005320 con 0x7f7584104320 2026-03-10T07:48:06.144 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:06.143+0000 7f757b7fe700 1 -- 192.168.123.105:0/2040553052 <== mon.0 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 19) v1 ==== 90308+0+0 (secure 0 0 0) 0x7f757400d770 con 0x7f7584104320 2026-03-10T07:48:06.145 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:06.143+0000 7f757b7fe700 1 --2- 192.168.123.105:0/2040553052 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f757006c360 0x7f757006e820 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:48:06.145 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:06.144+0000 7f757b7fe700 1 -- 192.168.123.105:0/2040553052 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(32..32 src has 1..32) v4 ==== 4446+0+0 (secure 0 0 0) 0x7f757408c0f0 con 0x7f7584104320 2026-03-10T07:48:06.146 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:06.144+0000 7f758259c700 1 --2- 192.168.123.105:0/2040553052 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f757006c360 0x7f757006e820 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:48:06.146 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:06.144+0000 7f758259c700 1 --2- 192.168.123.105:0/2040553052 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f757006c360 0x7f757006e820 secure :-1 s=READY pgs=64 cs=0 l=1 rev1=1 crypto rx=0x7f756c006010 tx=0x7f756c005dc0 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:48:06.146 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:06.144+0000 7f757b7fe700 1 -- 192.168.123.105:0/2040553052 <== mon.0 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 
0 0) 0x7f75740bf0e0 con 0x7f7584104320 2026-03-10T07:48:06.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:48:05 vm08 ceph-mon[59917]: from='osd.5 ' entity='osd.5' cmd='[{"prefix": "osd crush create-or-move", "id": 5, "weight":0.0195, "args": ["host=vm08", "root=default"]}]': finished 2026-03-10T07:48:06.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:48:05 vm08 ceph-mon[59917]: osdmap e31: 6 total, 5 up, 6 in 2026-03-10T07:48:06.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:48:05 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "osd metadata", "id": 5}]: dispatch 2026-03-10T07:48:06.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:48:05 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "osd metadata", "id": 5}]: dispatch 2026-03-10T07:48:06.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:48:05 vm08 ceph-mon[59917]: pgmap v60: 1 pgs: 1 active+clean; 449 KiB data, 133 MiB used, 100 GiB / 100 GiB avail 2026-03-10T07:48:06.255 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:06.253+0000 7f7588a9c700 1 -- 192.168.123.105:0/2040553052 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "osd stat", "format": "json"} v 0) v1 -- 0x7f75640059f0 con 0x7f7584104320 2026-03-10T07:48:06.255 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:06.254+0000 7f757b7fe700 1 -- 192.168.123.105:0/2040553052 <== mon.0 v2:192.168.123.105:3300/0 7 ==== mon_command_ack([{"prefix": "osd stat", "format": "json"}]=0 v32) v1 ==== 74+0+130 (secure 0 0 0) 0x7f757405a4b0 con 0x7f7584104320 2026-03-10T07:48:06.256 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-10T07:48:06.258 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:06.256+0000 7f7588a9c700 1 -- 192.168.123.105:0/2040553052 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f757006c360 msgr2=0x7f757006e820 secure :-1 
s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:48:06.258 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:06.256+0000 7f7588a9c700 1 --2- 192.168.123.105:0/2040553052 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f757006c360 0x7f757006e820 secure :-1 s=READY pgs=64 cs=0 l=1 rev1=1 crypto rx=0x7f756c006010 tx=0x7f756c005dc0 comp rx=0 tx=0).stop 2026-03-10T07:48:06.258 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:06.256+0000 7f7588a9c700 1 -- 192.168.123.105:0/2040553052 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f7584104320 msgr2=0x7f7584194c00 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:48:06.258 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:06.256+0000 7f7588a9c700 1 --2- 192.168.123.105:0/2040553052 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f7584104320 0x7f7584194c00 secure :-1 s=READY pgs=202 cs=0 l=1 rev1=1 crypto rx=0x7f757400b700 tx=0x7f757400bac0 comp rx=0 tx=0).stop 2026-03-10T07:48:06.258 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:06.256+0000 7f7588a9c700 1 -- 192.168.123.105:0/2040553052 shutdown_connections 2026-03-10T07:48:06.258 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:06.256+0000 7f7588a9c700 1 --2- 192.168.123.105:0/2040553052 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f757006c360 0x7f757006e820 unknown :-1 s=CLOSED pgs=64 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:48:06.258 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:06.256+0000 7f7588a9c700 1 --2- 192.168.123.105:0/2040553052 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f7584103120 0x7f75841946c0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:48:06.258 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:06.256+0000 7f7588a9c700 1 --2- 192.168.123.105:0/2040553052 >> 
[v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f7584104320 0x7f7584194c00 unknown :-1 s=CLOSED pgs=202 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:48:06.258 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:06.256+0000 7f7588a9c700 1 -- 192.168.123.105:0/2040553052 >> 192.168.123.105:0/2040553052 conn(0x7f75840fe6c0 msgr2=0x7f7584107550 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:48:06.258 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:06.256+0000 7f7588a9c700 1 -- 192.168.123.105:0/2040553052 shutdown_connections 2026-03-10T07:48:06.258 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:06.256+0000 7f7588a9c700 1 -- 192.168.123.105:0/2040553052 wait complete. 2026-03-10T07:48:06.309 INFO:teuthology.orchestra.run.vm05.stdout:{"epoch":32,"num_osds":6,"num_up_osds":6,"osd_up_since":1773128885,"num_in_osds":6,"osd_in_since":1773128874,"num_remapped_pgs":0} 2026-03-10T07:48:06.310 DEBUG:teuthology.orchestra.run.vm05:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell --fsid 12e9780e-1c55-11f1-8896-79f7c2e9b508 -- ceph osd dump --format=json 2026-03-10T07:48:06.463 INFO:teuthology.orchestra.run.vm05.stderr:Inferring config /var/lib/ceph/12e9780e-1c55-11f1-8896-79f7c2e9b508/mon.vm05/config 2026-03-10T07:48:06.693 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:06.691+0000 7f760bc1f700 1 -- 192.168.123.105:0/1202119954 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f7604068730 msgr2=0x7f7604068b10 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:48:06.693 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:06.691+0000 7f760bc1f700 1 --2- 192.168.123.105:0/1202119954 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f7604068730 0x7f7604068b10 secure :-1 s=READY pgs=203 cs=0 l=1 rev1=1 crypto rx=0x7f75f4009b00 tx=0x7f75f4009e10 comp rx=0 tx=0).stop 2026-03-10T07:48:06.693 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:06.692+0000 7f760bc1f700 1 -- 192.168.123.105:0/1202119954 shutdown_connections 2026-03-10T07:48:06.694 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:06.692+0000 7f760bc1f700 1 --2- 192.168.123.105:0/1202119954 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f76040690e0 0x7f7604105b50 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:48:06.694 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:06.692+0000 7f760bc1f700 1 --2- 192.168.123.105:0/1202119954 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f7604068730 0x7f7604068b10 unknown :-1 s=CLOSED pgs=203 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:48:06.694 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:06.692+0000 7f760bc1f700 1 -- 192.168.123.105:0/1202119954 >> 192.168.123.105:0/1202119954 conn(0x7f7604075960 msgr2=0x7f7604075d70 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:48:06.694 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:06.692+0000 7f760bc1f700 1 -- 192.168.123.105:0/1202119954 shutdown_connections 2026-03-10T07:48:06.694 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:06.692+0000 7f760bc1f700 1 -- 192.168.123.105:0/1202119954 wait complete. 
2026-03-10T07:48:06.694 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:06.692+0000 7f760bc1f700 1 Processor -- start 2026-03-10T07:48:06.694 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:06.692+0000 7f760bc1f700 1 -- start start 2026-03-10T07:48:06.694 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:06.693+0000 7f760bc1f700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f7604068730 0x7f760419d260 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:48:06.694 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:06.693+0000 7f760bc1f700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f76040690e0 0x7f760419d7a0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:48:06.694 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:06.693+0000 7f760bc1f700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f760419de80 con 0x7f76040690e0 2026-03-10T07:48:06.694 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:06.693+0000 7f760bc1f700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f76041a1c10 con 0x7f7604068730 2026-03-10T07:48:06.694 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:06.693+0000 7f76099bb700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f7604068730 0x7f760419d260 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:48:06.695 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:06.693+0000 7f76099bb700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f7604068730 0x7f760419d260 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.108:3300/0 says I am 
v2:192.168.123.105:39862/0 (socket says 192.168.123.105:39862) 2026-03-10T07:48:06.695 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:06.693+0000 7f76099bb700 1 -- 192.168.123.105:0/3394686354 learned_addr learned my addr 192.168.123.105:0/3394686354 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T07:48:06.695 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:06.693+0000 7f76091ba700 1 --2- 192.168.123.105:0/3394686354 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f76040690e0 0x7f760419d7a0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:48:06.695 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:06.693+0000 7f76091ba700 1 -- 192.168.123.105:0/3394686354 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f7604068730 msgr2=0x7f760419d260 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:48:06.695 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:06.693+0000 7f76091ba700 1 --2- 192.168.123.105:0/3394686354 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f7604068730 0x7f760419d260 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:48:06.695 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:06.693+0000 7f76091ba700 1 -- 192.168.123.105:0/3394686354 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f75f40097e0 con 0x7f76040690e0 2026-03-10T07:48:06.695 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:06.693+0000 7f76091ba700 1 --2- 192.168.123.105:0/3394686354 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f76040690e0 0x7f760419d7a0 secure :-1 s=READY pgs=204 cs=0 l=1 rev1=1 crypto rx=0x7f7604073d30 tx=0x7f760000dbb0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 
2026-03-10T07:48:06.696 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:06.694+0000 7f75faffd700 1 -- 192.168.123.105:0/3394686354 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f760000f840 con 0x7f76040690e0 2026-03-10T07:48:06.696 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:06.694+0000 7f75faffd700 1 -- 192.168.123.105:0/3394686354 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f760000fe80 con 0x7f76040690e0 2026-03-10T07:48:06.696 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:06.694+0000 7f75faffd700 1 -- 192.168.123.105:0/3394686354 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f760000e5c0 con 0x7f76040690e0 2026-03-10T07:48:06.696 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:06.694+0000 7f760bc1f700 1 -- 192.168.123.105:0/3394686354 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f76041a1ef0 con 0x7f76040690e0 2026-03-10T07:48:06.696 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:06.694+0000 7f760bc1f700 1 -- 192.168.123.105:0/3394686354 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f76041a2440 con 0x7f76040690e0 2026-03-10T07:48:06.698 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:06.696+0000 7f75faffd700 1 -- 192.168.123.105:0/3394686354 <== mon.0 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 19) v1 ==== 90308+0+0 (secure 0 0 0) 0x7f7600010460 con 0x7f76040690e0 2026-03-10T07:48:06.698 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:06.696+0000 7f75faffd700 1 --2- 192.168.123.105:0/3394686354 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f75f006c4e0 0x7f75f006e9a0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:48:06.698 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:06.696+0000 7f75faffd700 1 -- 192.168.123.105:0/3394686354 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(32..32 src has 1..32) v4 ==== 4446+0+0 (secure 0 0 0) 0x7f760008afc0 con 0x7f76040690e0 2026-03-10T07:48:06.698 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:06.696+0000 7f76099bb700 1 --2- 192.168.123.105:0/3394686354 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f75f006c4e0 0x7f75f006e9a0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:48:06.698 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:06.696+0000 7f760bc1f700 1 -- 192.168.123.105:0/3394686354 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f76041094e0 con 0x7f76040690e0 2026-03-10T07:48:06.698 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:06.697+0000 7f76099bb700 1 --2- 192.168.123.105:0/3394686354 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f75f006c4e0 0x7f75f006e9a0 secure :-1 s=READY pgs=65 cs=0 l=1 rev1=1 crypto rx=0x7f75f4006010 tx=0x7f75f4005a90 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:48:06.701 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:06.699+0000 7f75faffd700 1 -- 192.168.123.105:0/3394686354 <== mon.0 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7f7600055c00 con 0x7f76040690e0 2026-03-10T07:48:06.805 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:06.803+0000 7f760bc1f700 1 -- 192.168.123.105:0/3394686354 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "osd dump", "format": "json"} v 0) v1 -- 0x7f760404ea90 con 0x7f76040690e0 2026-03-10T07:48:06.808 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:06.806+0000 7f75faffd700 1 -- 192.168.123.105:0/3394686354 <== mon.0 v2:192.168.123.105:3300/0 7 ==== mon_command_ack([{"prefix": "osd dump", "format": "json"}]=0 v32) v1 ==== 74+0+11236 (secure 0 0 0) 0x7f7600020070 con 0x7f76040690e0 2026-03-10T07:48:06.808 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-10T07:48:06.808 INFO:teuthology.orchestra.run.vm05.stdout:{"epoch":32,"fsid":"12e9780e-1c55-11f1-8896-79f7c2e9b508","created":"2026-03-10T07:45:40.905180+0000","modified":"2026-03-10T07:48:05.841179+0000","last_up_change":"2026-03-10T07:48:05.841179+0000","last_in_change":"2026-03-10T07:47:54.023889+0000","flags":"sortbitwise,recovery_deletes,purged_snapdirs,pglog_hardlimit","flags_num":5799936,"flags_set":["pglog_hardlimit","purged_snapdirs","recovery_deletes","sortbitwise"],"crush_version":14,"full_ratio":0.94999998807907104,"backfillfull_ratio":0.89999997615814209,"nearfull_ratio":0.85000002384185791,"cluster_snapshot":"","pool_max":1,"max_osd":6,"require_min_compat_client":"luminous","min_compat_client":"jewel","require_osd_release":"reef","allow_crimson":false,"pools":[{"pool":1,"pool_name":".mgr","create_time":"2026-03-10T07:47:37.474083+0000","flags":1,"flags_names":"hashpspool","type":1,"size":3,"min_size":2,"crush_rule":0,"peering_crush_bucket_count":0,"peering_crush_bucket_target":0,"peering_crush_bucket_barrier":0,"peering_crush_bucket_mandatory_member":2147483647,"object_hash":2,"pg_autoscale_mode":"off","pg_num":1,"pg_placement_num":1,"pg_placement_num_target":1,"pg_num_target":1,"pg_num_pending":1,"last_pg_merge_meta":{"source_pgid":"0.0","ready_epoch":0,"last_epoch_started":0,"last_epoch_clean":0,"source_version":"0'0","target_version":"0'0"},"last_change":"20","last_force_op_resend":"0","last_force_op_resend_prenautilus":"0","last_force_op_resend_preluminous":"0","auid":0,"snap_mode":"selfmanaged","snap_seq":0,"snap_epoch":0,"pool_snaps":[],"removed_snaps":"[]","quota_max_bytes":0,"quota
_max_objects":0,"tiers":[],"tier_of":-1,"read_tier":-1,"write_tier":-1,"cache_mode":"none","target_max_bytes":0,"target_max_objects":0,"cache_target_dirty_ratio_micro":400000,"cache_target_dirty_high_ratio_micro":600000,"cache_target_full_ratio_micro":800000,"cache_min_flush_age":0,"cache_min_evict_age":0,"erasure_code_profile":"","hit_set_params":{"type":"none"},"hit_set_period":0,"hit_set_count":0,"use_gmt_hitset":true,"min_read_recency_for_promote":0,"min_write_recency_for_promote":0,"hit_set_grade_decay_rate":0,"hit_set_search_last_n":0,"grade_table":[],"stripe_width":0,"expected_num_objects":0,"fast_read":false,"options":{"pg_num_max":32,"pg_num_min":1},"application_metadata":{"mgr":{}},"read_balance":{"score_acting":6,"score_stable":6,"optimal_score":0.5,"raw_score_acting":3,"raw_score_stable":3,"primary_affinity_weighted":1,"average_primary_affinity":1,"average_primary_affinity_weighted":1}}],"osds":[{"osd":0,"uuid":"b7030a1b-b939-419d-85b0-9e818f756cd8","up":1,"in":1,"weight":1,"primary_affinity":1,"last_clean_begin":0,"last_clean_end":0,"up_from":9,"up_thru":0,"down_at":0,"lost_at":0,"public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.105:6802","nonce":196713793},{"type":"v1","addr":"192.168.123.105:6803","nonce":196713793}]},"cluster_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.105:6804","nonce":196713793},{"type":"v1","addr":"192.168.123.105:6805","nonce":196713793}]},"heartbeat_back_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.105:6808","nonce":196713793},{"type":"v1","addr":"192.168.123.105:6809","nonce":196713793}]},"heartbeat_front_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.105:6806","nonce":196713793},{"type":"v1","addr":"192.168.123.105:6807","nonce":196713793}]},"public_addr":"192.168.123.105:6803/196713793","cluster_addr":"192.168.123.105:6805/196713793","heartbeat_back_addr":"192.168.123.105:6809/196713793","heartbeat_front_addr":"192.168.123.105:6807/196713793","state":["exists","up"]},{"osd":1,"uuid":"725f8dca-94d
a-4c18-aefa-9e9f529cccd4","up":1,"in":1,"weight":1,"primary_affinity":1,"last_clean_begin":0,"last_clean_end":0,"up_from":13,"up_thru":24,"down_at":0,"lost_at":0,"public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.105:6810","nonce":1508516757},{"type":"v1","addr":"192.168.123.105:6811","nonce":1508516757}]},"cluster_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.105:6812","nonce":1508516757},{"type":"v1","addr":"192.168.123.105:6813","nonce":1508516757}]},"heartbeat_back_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.105:6816","nonce":1508516757},{"type":"v1","addr":"192.168.123.105:6817","nonce":1508516757}]},"heartbeat_front_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.105:6814","nonce":1508516757},{"type":"v1","addr":"192.168.123.105:6815","nonce":1508516757}]},"public_addr":"192.168.123.105:6811/1508516757","cluster_addr":"192.168.123.105:6813/1508516757","heartbeat_back_addr":"192.168.123.105:6817/1508516757","heartbeat_front_addr":"192.168.123.105:6815/1508516757","state":["exists","up"]},{"osd":2,"uuid":"ef60316b-3b61-4644-aa31-97cef548ba7e","up":1,"in":1,"weight":1,"primary_affinity":1,"last_clean_begin":0,"last_clean_end":0,"up_from":17,"up_thru":0,"down_at":0,"lost_at":0,"public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.105:6818","nonce":4116227648},{"type":"v1","addr":"192.168.123.105:6819","nonce":4116227648}]},"cluster_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.105:6820","nonce":4116227648},{"type":"v1","addr":"192.168.123.105:6821","nonce":4116227648}]},"heartbeat_back_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.105:6824","nonce":4116227648},{"type":"v1","addr":"192.168.123.105:6825","nonce":4116227648}]},"heartbeat_front_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.105:6822","nonce":4116227648},{"type":"v1","addr":"192.168.123.105:6823","nonce":4116227648}]},"public_addr":"192.168.123.105:6819/4116227648","cluster_addr":"192.168.123.105:6821/4116227648","heartbeat_back_addr":"192.168.123.105:6
825/4116227648","heartbeat_front_addr":"192.168.123.105:6823/4116227648","state":["exists","up"]},{"osd":3,"uuid":"4cc7cd7e-7756-40b9-9bc8-029e26495239","up":1,"in":1,"weight":1,"primary_affinity":1,"last_clean_begin":0,"last_clean_end":0,"up_from":23,"up_thru":27,"down_at":0,"lost_at":0,"public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.108:6800","nonce":2127000679},{"type":"v1","addr":"192.168.123.108:6801","nonce":2127000679}]},"cluster_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.108:6802","nonce":2127000679},{"type":"v1","addr":"192.168.123.108:6803","nonce":2127000679}]},"heartbeat_back_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.108:6806","nonce":2127000679},{"type":"v1","addr":"192.168.123.108:6807","nonce":2127000679}]},"heartbeat_front_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.108:6804","nonce":2127000679},{"type":"v1","addr":"192.168.123.108:6805","nonce":2127000679}]},"public_addr":"192.168.123.108:6801/2127000679","cluster_addr":"192.168.123.108:6803/2127000679","heartbeat_back_addr":"192.168.123.108:6807/2127000679","heartbeat_front_addr":"192.168.123.108:6805/2127000679","state":["exists","up"]},{"osd":4,"uuid":"efa282d4-a651-4903-bdb8-2e148038e567","up":1,"in":1,"weight":1,"primary_affinity":1,"last_clean_begin":0,"last_clean_end":0,"up_from":28,"up_thru":0,"down_at":0,"lost_at":0,"public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.108:6808","nonce":551946261},{"type":"v1","addr":"192.168.123.108:6809","nonce":551946261}]},"cluster_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.108:6810","nonce":551946261},{"type":"v1","addr":"192.168.123.108:6811","nonce":551946261}]},"heartbeat_back_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.108:6814","nonce":551946261},{"type":"v1","addr":"192.168.123.108:6815","nonce":551946261}]},"heartbeat_front_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.108:6812","nonce":551946261},{"type":"v1","addr":"192.168.123.108:6813","nonce":551946261}]},"public_addr":"192
.168.123.108:6809/551946261","cluster_addr":"192.168.123.108:6811/551946261","heartbeat_back_addr":"192.168.123.108:6815/551946261","heartbeat_front_addr":"192.168.123.108:6813/551946261","state":["exists","up"]},{"osd":5,"uuid":"ef1ae183-14f2-4400-85d6-c48b79ef2819","up":1,"in":1,"weight":1,"primary_affinity":1,"last_clean_begin":0,"last_clean_end":0,"up_from":32,"up_thru":0,"down_at":0,"lost_at":0,"public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.108:6816","nonce":2677071054},{"type":"v1","addr":"192.168.123.108:6817","nonce":2677071054}]},"cluster_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.108:6818","nonce":2677071054},{"type":"v1","addr":"192.168.123.108:6819","nonce":2677071054}]},"heartbeat_back_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.108:6822","nonce":2677071054},{"type":"v1","addr":"192.168.123.108:6823","nonce":2677071054}]},"heartbeat_front_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.108:6820","nonce":2677071054},{"type":"v1","addr":"192.168.123.108:6821","nonce":2677071054}]},"public_addr":"192.168.123.108:6817/2677071054","cluster_addr":"192.168.123.108:6819/2677071054","heartbeat_back_addr":"192.168.123.108:6823/2677071054","heartbeat_front_addr":"192.168.123.108:6821/2677071054","state":["exists","up"]}],"osd_xinfo":[{"osd":0,"down_stamp":"0.000000","laggy_probability":0,"laggy_interval":0,"features":4540138322906710015,"old_weight":0,"last_purged_snaps_scrub":"2026-03-10T07:47:16.699601+0000","dead_epoch":0},{"osd":1,"down_stamp":"0.000000","laggy_probability":0,"laggy_interval":0,"features":4540138322906710015,"old_weight":0,"last_purged_snaps_scrub":"2026-03-10T07:47:25.867149+0000","dead_epoch":0},{"osd":2,"down_stamp":"0.000000","laggy_probability":0,"laggy_interval":0,"features":4540138322906710015,"old_weight":0,"last_purged_snaps_scrub":"2026-03-10T07:47:36.171124+0000","dead_epoch":0},{"osd":3,"down_stamp":"0.000000","laggy_probability":0,"laggy_interval":0,"features":4540138322906710015,"old_weight":0,"l
ast_purged_snaps_scrub":"2026-03-10T07:47:44.953296+0000","dead_epoch":0},{"osd":4,"down_stamp":"0.000000","laggy_probability":0,"laggy_interval":0,"features":4540138322906710015,"old_weight":0,"last_purged_snaps_scrub":"2026-03-10T07:47:53.656121+0000","dead_epoch":0},{"osd":5,"down_stamp":"0.000000","laggy_probability":0,"laggy_interval":0,"features":4540138322906710015,"old_weight":0,"last_purged_snaps_scrub":"0.000000","dead_epoch":0}],"pg_upmap":[],"pg_upmap_items":[],"pg_upmap_primaries":[],"pg_temp":[],"primary_temp":[],"blocklist":{"192.168.123.105:0/4259283135":"2026-03-11T07:46:43.266824+0000","192.168.123.105:0/1998618536":"2026-03-11T07:46:43.266824+0000","192.168.123.105:0/2989100693":"2026-03-11T07:46:09.521991+0000","192.168.123.105:0/1261368764":"2026-03-11T07:46:09.521991+0000","192.168.123.105:6801/2":"2026-03-11T07:45:55.496146+0000","192.168.123.105:0/77887181":"2026-03-11T07:46:43.266824+0000","192.168.123.105:0/2454546307":"2026-03-11T07:45:55.496146+0000","192.168.123.105:6800/2":"2026-03-11T07:45:55.496146+0000","192.168.123.105:0/3548984633":"2026-03-11T07:45:55.496146+0000","192.168.123.105:0/4118350925":"2026-03-11T07:45:55.496146+0000","192.168.123.105:0/597549933":"2026-03-11T07:46:09.521991+0000"},"range_blocklist":{},"erasure_code_profiles":{"default":{"crush-failure-domain":"osd","k":"2","m":"1","plugin":"jerasure","technique":"reed_sol_van"}},"removed_snaps_queue":[],"new_removed_snaps":[],"new_purged_snaps":[],"crush_node_flags":{},"device_class_flags":{},"stretch_mode":{"stretch_mode_enabled":false,"stretch_bucket_count":0,"degraded_stretch_mode":0,"recovering_stretch_mode":0,"stretch_mode_bucket":0}} 2026-03-10T07:48:06.810 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:06.808+0000 7f760bc1f700 1 -- 192.168.123.105:0/3394686354 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f75f006c4e0 msgr2=0x7f75f006e9a0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:48:06.810 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:06.808+0000 7f760bc1f700 1 --2- 192.168.123.105:0/3394686354 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f75f006c4e0 0x7f75f006e9a0 secure :-1 s=READY pgs=65 cs=0 l=1 rev1=1 crypto rx=0x7f75f4006010 tx=0x7f75f4005a90 comp rx=0 tx=0).stop 2026-03-10T07:48:06.810 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:06.809+0000 7f760bc1f700 1 -- 192.168.123.105:0/3394686354 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f76040690e0 msgr2=0x7f760419d7a0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:48:06.810 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:06.809+0000 7f760bc1f700 1 --2- 192.168.123.105:0/3394686354 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f76040690e0 0x7f760419d7a0 secure :-1 s=READY pgs=204 cs=0 l=1 rev1=1 crypto rx=0x7f7604073d30 tx=0x7f760000dbb0 comp rx=0 tx=0).stop 2026-03-10T07:48:06.810 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:06.809+0000 7f760bc1f700 1 -- 192.168.123.105:0/3394686354 shutdown_connections 2026-03-10T07:48:06.811 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:06.809+0000 7f760bc1f700 1 --2- 192.168.123.105:0/3394686354 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f75f006c4e0 0x7f75f006e9a0 unknown :-1 s=CLOSED pgs=65 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:48:06.811 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:06.809+0000 7f760bc1f700 1 --2- 192.168.123.105:0/3394686354 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f7604068730 0x7f760419d260 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:48:06.811 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:06.809+0000 7f760bc1f700 1 --2- 192.168.123.105:0/3394686354 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f76040690e0 0x7f760419d7a0 unknown 
:-1 s=CLOSED pgs=204 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:48:06.811 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:06.809+0000 7f760bc1f700 1 -- 192.168.123.105:0/3394686354 >> 192.168.123.105:0/3394686354 conn(0x7f7604075960 msgr2=0x7f76040ff7e0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:48:06.811 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:06.809+0000 7f760bc1f700 1 -- 192.168.123.105:0/3394686354 shutdown_connections 2026-03-10T07:48:06.811 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:06.809+0000 7f760bc1f700 1 -- 192.168.123.105:0/3394686354 wait complete. 2026-03-10T07:48:06.882 INFO:tasks.cephadm.ceph_manager.ceph:[{'pool': 1, 'pool_name': '.mgr', 'create_time': '2026-03-10T07:47:37.474083+0000', 'flags': 1, 'flags_names': 'hashpspool', 'type': 1, 'size': 3, 'min_size': 2, 'crush_rule': 0, 'peering_crush_bucket_count': 0, 'peering_crush_bucket_target': 0, 'peering_crush_bucket_barrier': 0, 'peering_crush_bucket_mandatory_member': 2147483647, 'object_hash': 2, 'pg_autoscale_mode': 'off', 'pg_num': 1, 'pg_placement_num': 1, 'pg_placement_num_target': 1, 'pg_num_target': 1, 'pg_num_pending': 1, 'last_pg_merge_meta': {'source_pgid': '0.0', 'ready_epoch': 0, 'last_epoch_started': 0, 'last_epoch_clean': 0, 'source_version': "0'0", 'target_version': "0'0"}, 'last_change': '20', 'last_force_op_resend': '0', 'last_force_op_resend_prenautilus': '0', 'last_force_op_resend_preluminous': '0', 'auid': 0, 'snap_mode': 'selfmanaged', 'snap_seq': 0, 'snap_epoch': 0, 'pool_snaps': [], 'removed_snaps': '[]', 'quota_max_bytes': 0, 'quota_max_objects': 0, 'tiers': [], 'tier_of': -1, 'read_tier': -1, 'write_tier': -1, 'cache_mode': 'none', 'target_max_bytes': 0, 'target_max_objects': 0, 'cache_target_dirty_ratio_micro': 400000, 'cache_target_dirty_high_ratio_micro': 600000, 'cache_target_full_ratio_micro': 800000, 'cache_min_flush_age': 0, 'cache_min_evict_age': 0, 'erasure_code_profile': '', 
'hit_set_params': {'type': 'none'}, 'hit_set_period': 0, 'hit_set_count': 0, 'use_gmt_hitset': True, 'min_read_recency_for_promote': 0, 'min_write_recency_for_promote': 0, 'hit_set_grade_decay_rate': 0, 'hit_set_search_last_n': 0, 'grade_table': [], 'stripe_width': 0, 'expected_num_objects': 0, 'fast_read': False, 'options': {'pg_num_max': 32, 'pg_num_min': 1}, 'application_metadata': {'mgr': {}}, 'read_balance': {'score_acting': 6, 'score_stable': 6, 'optimal_score': 0.5, 'raw_score_acting': 3, 'raw_score_stable': 3, 'primary_affinity_weighted': 1, 'average_primary_affinity': 1, 'average_primary_affinity_weighted': 1}}] 2026-03-10T07:48:06.882 DEBUG:teuthology.orchestra.run.vm05:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell --fsid 12e9780e-1c55-11f1-8896-79f7c2e9b508 -- ceph osd pool get .mgr pg_num 2026-03-10T07:48:07.026 INFO:teuthology.orchestra.run.vm05.stderr:Inferring config /var/lib/ceph/12e9780e-1c55-11f1-8896-79f7c2e9b508/mon.vm05/config 2026-03-10T07:48:07.060 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:48:06 vm05 ceph-mon[50387]: purged_snaps scrub starts 2026-03-10T07:48:07.061 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:48:06 vm05 ceph-mon[50387]: purged_snaps scrub ok 2026-03-10T07:48:07.061 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:48:06 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "osd metadata", "id": 5}]: dispatch 2026-03-10T07:48:07.061 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:48:06 vm05 ceph-mon[50387]: osd.5 [v2:192.168.123.108:6816/2677071054,v1:192.168.123.108:6817/2677071054] boot 2026-03-10T07:48:07.061 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:48:06 vm05 ceph-mon[50387]: osdmap e32: 6 total, 6 up, 6 in 2026-03-10T07:48:07.061 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:48:06 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "osd metadata", "id": 5}]: 
dispatch 2026-03-10T07:48:07.061 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:48:06 vm05 ceph-mon[50387]: from='client.? 192.168.123.105:0/2040553052' entity='client.admin' cmd=[{"prefix": "osd stat", "format": "json"}]: dispatch 2026-03-10T07:48:07.061 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:48:06 vm05 ceph-mon[50387]: from='client.? 192.168.123.105:0/3394686354' entity='client.admin' cmd=[{"prefix": "osd dump", "format": "json"}]: dispatch 2026-03-10T07:48:07.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:48:06 vm08 ceph-mon[59917]: purged_snaps scrub starts 2026-03-10T07:48:07.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:48:06 vm08 ceph-mon[59917]: purged_snaps scrub ok 2026-03-10T07:48:07.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:48:06 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "osd metadata", "id": 5}]: dispatch 2026-03-10T07:48:07.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:48:06 vm08 ceph-mon[59917]: osd.5 [v2:192.168.123.108:6816/2677071054,v1:192.168.123.108:6817/2677071054] boot 2026-03-10T07:48:07.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:48:06 vm08 ceph-mon[59917]: osdmap e32: 6 total, 6 up, 6 in 2026-03-10T07:48:07.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:48:06 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "osd metadata", "id": 5}]: dispatch 2026-03-10T07:48:07.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:48:06 vm08 ceph-mon[59917]: from='client.? 192.168.123.105:0/2040553052' entity='client.admin' cmd=[{"prefix": "osd stat", "format": "json"}]: dispatch 2026-03-10T07:48:07.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:48:06 vm08 ceph-mon[59917]: from='client.? 
192.168.123.105:0/3394686354' entity='client.admin' cmd=[{"prefix": "osd dump", "format": "json"}]: dispatch 2026-03-10T07:48:07.266 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:07.263+0000 7fb5f4ac6700 1 -- 192.168.123.105:0/494947456 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb5f0103cf0 msgr2=0x7fb5f0107d40 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:48:07.266 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:07.263+0000 7fb5f4ac6700 1 --2- 192.168.123.105:0/494947456 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb5f0103cf0 0x7fb5f0107d40 secure :-1 s=READY pgs=205 cs=0 l=1 rev1=1 crypto rx=0x7fb5e0009b00 tx=0x7fb5e0009e10 comp rx=0 tx=0).stop 2026-03-10T07:48:07.266 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:07.264+0000 7fb5f4ac6700 1 -- 192.168.123.105:0/494947456 shutdown_connections 2026-03-10T07:48:07.266 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:07.264+0000 7fb5f4ac6700 1 --2- 192.168.123.105:0/494947456 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb5f0103cf0 0x7fb5f0107d40 unknown :-1 s=CLOSED pgs=205 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:48:07.266 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:07.264+0000 7fb5f4ac6700 1 --2- 192.168.123.105:0/494947456 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fb5f0103340 0x7fb5f0103720 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:48:07.266 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:07.264+0000 7fb5f4ac6700 1 -- 192.168.123.105:0/494947456 >> 192.168.123.105:0/494947456 conn(0x7fb5f00feb90 msgr2=0x7fb5f0100fb0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:48:07.266 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:07.264+0000 7fb5f4ac6700 1 -- 192.168.123.105:0/494947456 shutdown_connections 2026-03-10T07:48:07.266 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:07.264+0000 7fb5f4ac6700 1 -- 192.168.123.105:0/494947456 wait complete. 2026-03-10T07:48:07.267 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:07.265+0000 7fb5f4ac6700 1 Processor -- start 2026-03-10T07:48:07.267 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:07.265+0000 7fb5f4ac6700 1 -- start start 2026-03-10T07:48:07.267 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:07.265+0000 7fb5f4ac6700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb5f0103340 0x7fb5f0198e40 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:48:07.267 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:07.265+0000 7fb5f4ac6700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fb5f0103cf0 0x7fb5f0199380 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:48:07.267 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:07.265+0000 7fb5f4ac6700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fb5f0199a60 con 0x7fb5f0103340 2026-03-10T07:48:07.267 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:07.265+0000 7fb5f4ac6700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fb5f019d7f0 con 0x7fb5f0103cf0 2026-03-10T07:48:07.267 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:07.266+0000 7fb5edd9b700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fb5f0103cf0 0x7fb5f0199380 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:48:07.267 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:07.266+0000 7fb5edd9b700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fb5f0103cf0 0x7fb5f0199380 unknown :-1 
s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.108:3300/0 says I am v2:192.168.123.105:39878/0 (socket says 192.168.123.105:39878) 2026-03-10T07:48:07.267 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:07.266+0000 7fb5edd9b700 1 -- 192.168.123.105:0/2491175809 learned_addr learned my addr 192.168.123.105:0/2491175809 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T07:48:07.268 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:07.266+0000 7fb5ee59c700 1 --2- 192.168.123.105:0/2491175809 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb5f0103340 0x7fb5f0198e40 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:48:07.268 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:07.266+0000 7fb5edd9b700 1 -- 192.168.123.105:0/2491175809 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb5f0103340 msgr2=0x7fb5f0198e40 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:48:07.268 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:07.266+0000 7fb5edd9b700 1 --2- 192.168.123.105:0/2491175809 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb5f0103340 0x7fb5f0198e40 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:48:07.268 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:07.266+0000 7fb5edd9b700 1 -- 192.168.123.105:0/2491175809 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fb5e00097e0 con 0x7fb5f0103cf0 2026-03-10T07:48:07.268 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:07.266+0000 7fb5ee59c700 1 --2- 192.168.123.105:0/2491175809 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb5f0103340 0x7fb5f0198e40 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto 
rx=0 tx=0 comp rx=0 tx=0).handle_auth_reply_more state changed! 2026-03-10T07:48:07.268 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:07.267+0000 7fb5edd9b700 1 --2- 192.168.123.105:0/2491175809 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fb5f0103cf0 0x7fb5f0199380 secure :-1 s=READY pgs=31 cs=0 l=1 rev1=1 crypto rx=0x7fb5e0009fd0 tx=0x7fb5e00049e0 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:48:07.268 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:07.267+0000 7fb5e77fe700 1 -- 192.168.123.105:0/2491175809 <== mon.1 v2:192.168.123.108:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fb5e001d070 con 0x7fb5f0103cf0 2026-03-10T07:48:07.268 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:07.267+0000 7fb5f4ac6700 1 -- 192.168.123.105:0/2491175809 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fb5f019da70 con 0x7fb5f0103cf0 2026-03-10T07:48:07.269 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:07.267+0000 7fb5f4ac6700 1 -- 192.168.123.105:0/2491175809 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fb5f019df60 con 0x7fb5f0103cf0 2026-03-10T07:48:07.269 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:07.267+0000 7fb5e77fe700 1 -- 192.168.123.105:0/2491175809 <== mon.1 v2:192.168.123.108:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fb5e000bc50 con 0x7fb5f0103cf0 2026-03-10T07:48:07.269 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:07.267+0000 7fb5e77fe700 1 -- 192.168.123.105:0/2491175809 <== mon.1 v2:192.168.123.108:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fb5e000f740 con 0x7fb5f0103cf0 2026-03-10T07:48:07.270 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:07.268+0000 7fb5e77fe700 1 -- 192.168.123.105:0/2491175809 <== mon.1 v2:192.168.123.108:3300/0 4 ==== 
mgrmap(e 19) v1 ==== 90308+0+0 (secure 0 0 0) 0x7fb5e000f8a0 con 0x7fb5f0103cf0 2026-03-10T07:48:07.270 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:07.269+0000 7fb5e77fe700 1 --2- 192.168.123.105:0/2491175809 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fb5dc06c530 0x7fb5dc06e9f0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:48:07.271 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:07.269+0000 7fb5ee59c700 1 --2- 192.168.123.105:0/2491175809 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fb5dc06c530 0x7fb5dc06e9f0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:48:07.271 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:07.269+0000 7fb5ee59c700 1 --2- 192.168.123.105:0/2491175809 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fb5dc06c530 0x7fb5dc06e9f0 secure :-1 s=READY pgs=67 cs=0 l=1 rev1=1 crypto rx=0x7fb5d8009a20 tx=0x7fb5d8008040 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:48:07.271 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:07.269+0000 7fb5e77fe700 1 -- 192.168.123.105:0/2491175809 <== mon.1 v2:192.168.123.108:3300/0 5 ==== osd_map(33..33 src has 1..33) v4 ==== 4446+0+0 (secure 0 0 0) 0x7fb5e008cac0 con 0x7fb5f0103cf0 2026-03-10T07:48:07.271 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:07.270+0000 7fb5f4ac6700 1 -- 192.168.123.105:0/2491175809 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fb5d0005320 con 0x7fb5f0103cf0 2026-03-10T07:48:07.274 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:07.272+0000 7fb5e77fe700 1 -- 192.168.123.105:0/2491175809 <== mon.1 v2:192.168.123.108:3300/0 6 ==== mon_command_ack([{"prefix": 
"get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7fb5e0027080 con 0x7fb5f0103cf0 2026-03-10T07:48:07.381 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:07.377+0000 7fb5f4ac6700 1 -- 192.168.123.105:0/2491175809 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "osd pool get", "pool": ".mgr", "var": "pg_num"} v 0) v1 -- 0x7fb5d0005f70 con 0x7fb5f0103cf0 2026-03-10T07:48:07.381 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:07.379+0000 7fb5e77fe700 1 -- 192.168.123.105:0/2491175809 <== mon.1 v2:192.168.123.108:3300/0 7 ==== mon_command_ack([{"prefix": "osd pool get", "pool": ".mgr", "var": "pg_num"}]=0 v33) v1 ==== 93+0+10 (secure 0 0 0) 0x7fb5e0058820 con 0x7fb5f0103cf0 2026-03-10T07:48:07.381 INFO:teuthology.orchestra.run.vm05.stdout:pg_num: 1 2026-03-10T07:48:07.383 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:07.381+0000 7fb5f4ac6700 1 -- 192.168.123.105:0/2491175809 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fb5dc06c530 msgr2=0x7fb5dc06e9f0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:48:07.383 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:07.381+0000 7fb5f4ac6700 1 --2- 192.168.123.105:0/2491175809 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fb5dc06c530 0x7fb5dc06e9f0 secure :-1 s=READY pgs=67 cs=0 l=1 rev1=1 crypto rx=0x7fb5d8009a20 tx=0x7fb5d8008040 comp rx=0 tx=0).stop 2026-03-10T07:48:07.383 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:07.382+0000 7fb5f4ac6700 1 -- 192.168.123.105:0/2491175809 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fb5f0103cf0 msgr2=0x7fb5f0199380 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:48:07.383 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:07.382+0000 7fb5f4ac6700 1 --2- 192.168.123.105:0/2491175809 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fb5f0103cf0 
0x7fb5f0199380 secure :-1 s=READY pgs=31 cs=0 l=1 rev1=1 crypto rx=0x7fb5e0009fd0 tx=0x7fb5e00049e0 comp rx=0 tx=0).stop 2026-03-10T07:48:07.383 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:07.382+0000 7fb5f4ac6700 1 -- 192.168.123.105:0/2491175809 shutdown_connections 2026-03-10T07:48:07.384 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:07.382+0000 7fb5f4ac6700 1 --2- 192.168.123.105:0/2491175809 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fb5dc06c530 0x7fb5dc06e9f0 unknown :-1 s=CLOSED pgs=67 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:48:07.384 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:07.382+0000 7fb5f4ac6700 1 --2- 192.168.123.105:0/2491175809 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb5f0103340 0x7fb5f0198e40 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:48:07.384 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:07.382+0000 7fb5f4ac6700 1 --2- 192.168.123.105:0/2491175809 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fb5f0103cf0 0x7fb5f0199380 unknown :-1 s=CLOSED pgs=31 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:48:07.384 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:07.382+0000 7fb5f4ac6700 1 -- 192.168.123.105:0/2491175809 >> 192.168.123.105:0/2491175809 conn(0x7fb5f00feb90 msgr2=0x7fb5f0100fb0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:48:07.384 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:07.382+0000 7fb5f4ac6700 1 -- 192.168.123.105:0/2491175809 shutdown_connections 2026-03-10T07:48:07.384 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:07.382+0000 7fb5f4ac6700 1 -- 192.168.123.105:0/2491175809 wait complete. 2026-03-10T07:48:07.424 INFO:tasks.cephadm:Setting up client nodes... 
2026-03-10T07:48:07.425 DEBUG:teuthology.orchestra.run.vm05:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 12e9780e-1c55-11f1-8896-79f7c2e9b508 -- ceph auth get-or-create client.0 mon 'allow *' osd 'allow *' mds 'allow *' mgr 'allow *' 2026-03-10T07:48:07.572 INFO:teuthology.orchestra.run.vm05.stderr:Inferring config /var/lib/ceph/12e9780e-1c55-11f1-8896-79f7c2e9b508/mon.vm05/config 2026-03-10T07:48:07.811 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:07.809+0000 7f2f75fb7700 1 -- 192.168.123.105:0/2053470600 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f2f70102070 msgr2=0x7f2f701024f0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:48:07.811 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:07.809+0000 7f2f75fb7700 1 --2- 192.168.123.105:0/2053470600 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f2f70102070 0x7f2f701024f0 secure :-1 s=READY pgs=206 cs=0 l=1 rev1=1 crypto rx=0x7f2f60009b00 tx=0x7f2f60009e10 comp rx=0 tx=0).stop 2026-03-10T07:48:07.812 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:07.810+0000 7f2f75fb7700 1 -- 192.168.123.105:0/2053470600 shutdown_connections 2026-03-10T07:48:07.812 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:07.810+0000 7f2f75fb7700 1 --2- 192.168.123.105:0/2053470600 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f2f70102070 0x7f2f701024f0 unknown :-1 s=CLOSED pgs=206 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:48:07.812 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:07.810+0000 7f2f75fb7700 1 --2- 192.168.123.105:0/2053470600 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f2f70100f10 0x7f2f70101330 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:48:07.812 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:07.810+0000 7f2f75fb7700 1 -- 192.168.123.105:0/2053470600 >> 192.168.123.105:0/2053470600 conn(0x7f2f700fc4b0 msgr2=0x7f2f700fe8f0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:48:07.812 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:07.810+0000 7f2f75fb7700 1 -- 192.168.123.105:0/2053470600 shutdown_connections 2026-03-10T07:48:07.812 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:07.810+0000 7f2f75fb7700 1 -- 192.168.123.105:0/2053470600 wait complete. 2026-03-10T07:48:07.812 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:07.810+0000 7f2f75fb7700 1 Processor -- start 2026-03-10T07:48:07.812 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:07.811+0000 7f2f75fb7700 1 -- start start 2026-03-10T07:48:07.813 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:07.811+0000 7f2f75fb7700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f2f70100f10 0x7f2f701967b0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:48:07.813 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:07.811+0000 7f2f75fb7700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f2f70102070 0x7f2f70196cf0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:48:07.813 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:07.811+0000 7f2f75fb7700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f2f70197310 con 0x7f2f70100f10 2026-03-10T07:48:07.813 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:07.811+0000 7f2f6effd700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f2f70102070 0x7f2f70196cf0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:48:07.813 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:07.811+0000 7f2f6effd700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f2f70102070 0x7f2f70196cf0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.108:3300/0 says I am v2:192.168.123.105:39888/0 (socket says 192.168.123.105:39888) 2026-03-10T07:48:07.813 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:07.811+0000 7f2f6effd700 1 -- 192.168.123.105:0/2285378371 learned_addr learned my addr 192.168.123.105:0/2285378371 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T07:48:07.813 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:07.811+0000 7f2f6f7fe700 1 --2- 192.168.123.105:0/2285378371 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f2f70100f10 0x7f2f701967b0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:48:07.813 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:07.811+0000 7f2f75fb7700 1 -- 192.168.123.105:0/2285378371 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f2f70197450 con 0x7f2f70102070 2026-03-10T07:48:07.813 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:07.812+0000 7f2f6f7fe700 1 -- 192.168.123.105:0/2285378371 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f2f70102070 msgr2=0x7f2f70196cf0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:48:07.813 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:07.812+0000 7f2f6f7fe700 1 --2- 192.168.123.105:0/2285378371 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f2f70102070 0x7f2f70196cf0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:48:07.813 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:07.812+0000 7f2f6f7fe700 1 
-- 192.168.123.105:0/2285378371 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f2f600097e0 con 0x7f2f70100f10 2026-03-10T07:48:07.814 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:07.812+0000 7f2f6f7fe700 1 --2- 192.168.123.105:0/2285378371 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f2f70100f10 0x7f2f701967b0 secure :-1 s=READY pgs=207 cs=0 l=1 rev1=1 crypto rx=0x7f2f5800eab0 tx=0x7f2f5800edc0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:48:07.815 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:07.812+0000 7f2f6cff9700 1 -- 192.168.123.105:0/2285378371 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f2f5800cb20 con 0x7f2f70100f10 2026-03-10T07:48:07.815 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:07.812+0000 7f2f6cff9700 1 -- 192.168.123.105:0/2285378371 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f2f5800cc80 con 0x7f2f70100f10 2026-03-10T07:48:07.815 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:07.812+0000 7f2f6cff9700 1 -- 192.168.123.105:0/2285378371 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f2f58018860 con 0x7f2f70100f10 2026-03-10T07:48:07.815 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:07.812+0000 7f2f75fb7700 1 -- 192.168.123.105:0/2285378371 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f2f7019bf00 con 0x7f2f70100f10 2026-03-10T07:48:07.815 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:07.812+0000 7f2f75fb7700 1 -- 192.168.123.105:0/2285378371 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f2f7019c3c0 con 0x7f2f70100f10 2026-03-10T07:48:07.818 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:07.814+0000 7f2f6cff9700 1 -- 192.168.123.105:0/2285378371 <== mon.0 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 19) v1 ==== 90308+0+0 (secure 0 0 0) 0x7f2f580189c0 con 0x7f2f70100f10 2026-03-10T07:48:07.818 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:07.814+0000 7f2f75fb7700 1 -- 192.168.123.105:0/2285378371 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f2f70066e80 con 0x7f2f70100f10 2026-03-10T07:48:07.818 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:07.814+0000 7f2f6cff9700 1 --2- 192.168.123.105:0/2285378371 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f2f5c06c600 0x7f2f5c06eac0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:48:07.818 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:07.814+0000 7f2f6cff9700 1 -- 192.168.123.105:0/2285378371 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(33..33 src has 1..33) v4 ==== 4446+0+0 (secure 0 0 0) 0x7f2f58014070 con 0x7f2f70100f10 2026-03-10T07:48:07.818 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:07.817+0000 7f2f6effd700 1 --2- 192.168.123.105:0/2285378371 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f2f5c06c600 0x7f2f5c06eac0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:48:07.819 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:07.817+0000 7f2f6effd700 1 --2- 192.168.123.105:0/2285378371 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f2f5c06c600 0x7f2f5c06eac0 secure :-1 s=READY pgs=68 cs=0 l=1 rev1=1 crypto rx=0x7f2f60005200 tx=0x7f2f6001a040 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:48:07.819 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:07.817+0000 7f2f6cff9700 1 -- 192.168.123.105:0/2285378371 <== mon.0 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7f2f5805b390 con 0x7f2f70100f10 2026-03-10T07:48:07.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:48:07 vm05 ceph-mon[50387]: osdmap e33: 6 total, 6 up, 6 in 2026-03-10T07:48:07.908 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:48:07 vm05 ceph-mon[50387]: pgmap v63: 1 pgs: 1 active+clean; 449 KiB data, 160 MiB used, 120 GiB / 120 GiB avail 2026-03-10T07:48:07.908 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:48:07 vm05 ceph-mon[50387]: from='client.? 192.168.123.105:0/2491175809' entity='client.admin' cmd=[{"prefix": "osd pool get", "pool": ".mgr", "var": "pg_num"}]: dispatch 2026-03-10T07:48:07.974 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:07.972+0000 7f2f75fb7700 1 -- 192.168.123.105:0/2285378371 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "auth get-or-create", "entity": "client.0", "caps": ["mon", "allow *", "osd", "allow *", "mds", "allow *", "mgr", "allow *"]} v 0) v1 -- 0x7f2f7019c710 con 0x7f2f70100f10 2026-03-10T07:48:07.998 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:07.996+0000 7f2f6cff9700 1 -- 192.168.123.105:0/2285378371 <== mon.0 v2:192.168.123.105:3300/0 7 ==== mon_command_ack([{"prefix": "auth get-or-create", "entity": "client.0", "caps": ["mon", "allow *", "osd", "allow *", "mds", "allow *", "mgr", "allow *"]}]=0 v16) v1 ==== 170+0+59 (secure 0 0 0) 0x7f2f58063480 con 0x7f2f70100f10 2026-03-10T07:48:07.998 INFO:teuthology.orchestra.run.vm05.stdout:[client.0] 2026-03-10T07:48:07.998 INFO:teuthology.orchestra.run.vm05.stdout: key = AQC3zK9pIy8cOhAAg8BtDVRVoTuJ3er8kHXvCw== 2026-03-10T07:48:08.000 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:07.999+0000 7f2f75fb7700 1 -- 
192.168.123.105:0/2285378371 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f2f5c06c600 msgr2=0x7f2f5c06eac0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:48:08.000 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:07.999+0000 7f2f75fb7700 1 --2- 192.168.123.105:0/2285378371 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f2f5c06c600 0x7f2f5c06eac0 secure :-1 s=READY pgs=68 cs=0 l=1 rev1=1 crypto rx=0x7f2f60005200 tx=0x7f2f6001a040 comp rx=0 tx=0).stop 2026-03-10T07:48:08.000 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:07.999+0000 7f2f75fb7700 1 -- 192.168.123.105:0/2285378371 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f2f70100f10 msgr2=0x7f2f701967b0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:48:08.000 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:07.999+0000 7f2f75fb7700 1 --2- 192.168.123.105:0/2285378371 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f2f70100f10 0x7f2f701967b0 secure :-1 s=READY pgs=207 cs=0 l=1 rev1=1 crypto rx=0x7f2f5800eab0 tx=0x7f2f5800edc0 comp rx=0 tx=0).stop 2026-03-10T07:48:08.001 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:07.999+0000 7f2f75fb7700 1 -- 192.168.123.105:0/2285378371 shutdown_connections 2026-03-10T07:48:08.001 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:07.999+0000 7f2f75fb7700 1 --2- 192.168.123.105:0/2285378371 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f2f5c06c600 0x7f2f5c06eac0 unknown :-1 s=CLOSED pgs=68 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:48:08.001 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:07.999+0000 7f2f75fb7700 1 --2- 192.168.123.105:0/2285378371 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f2f70100f10 0x7f2f701967b0 unknown :-1 s=CLOSED pgs=207 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:48:08.001 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:07.999+0000 7f2f75fb7700 1 --2- 192.168.123.105:0/2285378371 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f2f70102070 0x7f2f70196cf0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:48:08.001 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:07.999+0000 7f2f75fb7700 1 -- 192.168.123.105:0/2285378371 >> 192.168.123.105:0/2285378371 conn(0x7f2f700fc4b0 msgr2=0x7f2f70105330 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:48:08.001 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:07.999+0000 7f2f75fb7700 1 -- 192.168.123.105:0/2285378371 shutdown_connections 2026-03-10T07:48:08.001 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:08.000+0000 7f2f75fb7700 1 -- 192.168.123.105:0/2285378371 wait complete. 2026-03-10T07:48:08.063 DEBUG:teuthology.orchestra.run.vm05:> set -ex 2026-03-10T07:48:08.063 DEBUG:teuthology.orchestra.run.vm05:> sudo dd of=/etc/ceph/ceph.client.0.keyring 2026-03-10T07:48:08.063 DEBUG:teuthology.orchestra.run.vm05:> sudo chmod 0644 /etc/ceph/ceph.client.0.keyring 2026-03-10T07:48:08.097 DEBUG:teuthology.orchestra.run.vm08:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 12e9780e-1c55-11f1-8896-79f7c2e9b508 -- ceph auth get-or-create client.1 mon 'allow *' osd 'allow *' mds 'allow *' mgr 'allow *' 2026-03-10T07:48:08.125 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:48:07 vm08 ceph-mon[59917]: osdmap e33: 6 total, 6 up, 6 in 2026-03-10T07:48:08.125 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:48:07 vm08 ceph-mon[59917]: pgmap v63: 1 pgs: 1 active+clean; 449 KiB data, 160 MiB used, 120 GiB / 120 GiB avail 2026-03-10T07:48:08.125 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:48:07 vm08 ceph-mon[59917]: from='client.? 
192.168.123.105:0/2491175809' entity='client.admin' cmd=[{"prefix": "osd pool get", "pool": ".mgr", "var": "pg_num"}]: dispatch 2026-03-10T07:48:08.247 INFO:teuthology.orchestra.run.vm08.stderr:Inferring config /var/lib/ceph/12e9780e-1c55-11f1-8896-79f7c2e9b508/mon.vm08/config 2026-03-10T07:48:08.498 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:48:08.496+0000 7f8a0832b700 1 -- 192.168.123.108:0/1331924465 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f8a00104340 msgr2=0x7f8a001047a0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:48:08.498 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:48:08.496+0000 7f8a0832b700 1 --2- 192.168.123.108:0/1331924465 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f8a00104340 0x7f8a001047a0 secure :-1 s=READY pgs=32 cs=0 l=1 rev1=1 crypto rx=0x7f89fc009b00 tx=0x7f89fc009e10 comp rx=0 tx=0).stop 2026-03-10T07:48:08.498 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:48:08.497+0000 7f8a0832b700 1 -- 192.168.123.108:0/1331924465 shutdown_connections 2026-03-10T07:48:08.498 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:48:08.497+0000 7f8a0832b700 1 --2- 192.168.123.108:0/1331924465 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f8a00104340 0x7f8a001047a0 unknown :-1 s=CLOSED pgs=32 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:48:08.498 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:48:08.497+0000 7f8a0832b700 1 --2- 192.168.123.108:0/1331924465 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f8a00103140 0x7f8a00103560 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:48:08.498 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:48:08.497+0000 7f8a0832b700 1 -- 192.168.123.108:0/1331924465 >> 192.168.123.108:0/1331924465 conn(0x7f8a000fe6c0 msgr2=0x7f8a00100b20 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:48:08.498 
INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:48:08.497+0000 7f8a0832b700 1 -- 192.168.123.108:0/1331924465 shutdown_connections 2026-03-10T07:48:08.498 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:48:08.497+0000 7f8a0832b700 1 -- 192.168.123.108:0/1331924465 wait complete. 2026-03-10T07:48:08.499 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:48:08.497+0000 7f8a0832b700 1 Processor -- start 2026-03-10T07:48:08.499 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:48:08.498+0000 7f8a0832b700 1 -- start start 2026-03-10T07:48:08.499 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:48:08.498+0000 7f8a0832b700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f8a00103140 0x7f8a00198a30 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:48:08.499 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:48:08.498+0000 7f8a0832b700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f8a00104340 0x7f8a00198f70 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:48:08.499 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:48:08.498+0000 7f8a0832b700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f8a00199590 con 0x7f8a00104340 2026-03-10T07:48:08.499 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:48:08.498+0000 7f8a0832b700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f8a001996d0 con 0x7f8a00103140 2026-03-10T07:48:08.499 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:48:08.498+0000 7f8a060c7700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f8a00103140 0x7f8a00198a30 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:48:08.499 
INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:48:08.498+0000 7f8a060c7700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f8a00103140 0x7f8a00198a30 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.108:3300/0 says I am v2:192.168.123.108:36372/0 (socket says 192.168.123.108:36372) 2026-03-10T07:48:08.500 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:48:08.498+0000 7f8a060c7700 1 -- 192.168.123.108:0/3142043566 learned_addr learned my addr 192.168.123.108:0/3142043566 (peer_addr_for_me v2:192.168.123.108:0/0) 2026-03-10T07:48:08.500 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:48:08.498+0000 7f8a058c6700 1 --2- 192.168.123.108:0/3142043566 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f8a00104340 0x7f8a00198f70 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:48:08.500 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:48:08.498+0000 7f8a060c7700 1 -- 192.168.123.108:0/3142043566 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f8a00104340 msgr2=0x7f8a00198f70 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:48:08.500 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:48:08.499+0000 7f8a060c7700 1 --2- 192.168.123.108:0/3142043566 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f8a00104340 0x7f8a00198f70 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:48:08.500 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:48:08.499+0000 7f8a060c7700 1 -- 192.168.123.108:0/3142043566 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f89fc0097e0 con 0x7f8a00103140 2026-03-10T07:48:08.500 
INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:48:08.499+0000 7f8a058c6700 1 --2- 192.168.123.108:0/3142043566 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f8a00104340 0x7f8a00198f70 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request state changed! 2026-03-10T07:48:08.500 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:48:08.499+0000 7f8a060c7700 1 --2- 192.168.123.108:0/3142043566 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f8a00103140 0x7f8a00198a30 secure :-1 s=READY pgs=33 cs=0 l=1 rev1=1 crypto rx=0x7f89f000d9d0 tx=0x7f89f000dd90 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:48:08.500 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:48:08.499+0000 7f89f77fe700 1 -- 192.168.123.108:0/3142043566 <== mon.1 v2:192.168.123.108:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f89f00098e0 con 0x7f8a00103140 2026-03-10T07:48:08.504 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:48:08.499+0000 7f8a0832b700 1 -- 192.168.123.108:0/3142043566 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f8a0019e180 con 0x7f8a00103140 2026-03-10T07:48:08.504 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:48:08.499+0000 7f89f77fe700 1 -- 192.168.123.108:0/3142043566 <== mon.1 v2:192.168.123.108:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f89f0010460 con 0x7f8a00103140 2026-03-10T07:48:08.505 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:48:08.499+0000 7f8a0832b700 1 -- 192.168.123.108:0/3142043566 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f8a0019e6d0 con 0x7f8a00103140 2026-03-10T07:48:08.505 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:48:08.500+0000 7f89f77fe700 1 -- 192.168.123.108:0/3142043566 <== mon.1 v2:192.168.123.108:3300/0 3 ==== mon_map 
magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f89f000f5d0 con 0x7f8a00103140 2026-03-10T07:48:08.505 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:48:08.501+0000 7f89f77fe700 1 -- 192.168.123.108:0/3142043566 <== mon.1 v2:192.168.123.108:3300/0 4 ==== mgrmap(e 19) v1 ==== 90308+0+0 (secure 0 0 0) 0x7f89f000f730 con 0x7f8a00103140 2026-03-10T07:48:08.505 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:48:08.501+0000 7f8a0832b700 1 -- 192.168.123.108:0/3142043566 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f89e4005320 con 0x7f8a00103140 2026-03-10T07:48:08.505 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:48:08.502+0000 7f89f77fe700 1 --2- 192.168.123.108:0/3142043566 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f89ec06c530 0x7f89ec06e9f0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:48:08.505 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:48:08.502+0000 7f89f77fe700 1 -- 192.168.123.108:0/3142043566 <== mon.1 v2:192.168.123.108:3300/0 5 ==== osd_map(33..33 src has 1..33) v4 ==== 4446+0+0 (secure 0 0 0) 0x7f89f008aee0 con 0x7f8a00103140 2026-03-10T07:48:08.505 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:48:08.503+0000 7f89f77fe700 1 -- 192.168.123.108:0/3142043566 <== mon.1 v2:192.168.123.108:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7f89f005a3b0 con 0x7f8a00103140 2026-03-10T07:48:08.505 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:48:08.504+0000 7f8a058c6700 1 --2- 192.168.123.108:0/3142043566 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f89ec06c530 0x7f89ec06e9f0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:48:08.507 
INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:48:08.506+0000 7f8a058c6700 1 --2- 192.168.123.108:0/3142043566 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f89ec06c530 0x7f89ec06e9f0 secure :-1 s=READY pgs=69 cs=0 l=1 rev1=1 crypto rx=0x7f89fc000c00 tx=0x7f89fc005c00 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:48:08.656 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:48:08.653+0000 7f8a0832b700 1 -- 192.168.123.108:0/3142043566 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "auth get-or-create", "entity": "client.1", "caps": ["mon", "allow *", "osd", "allow *", "mds", "allow *", "mgr", "allow *"]} v 0) v1 -- 0x7f89e4005190 con 0x7f8a00103140 2026-03-10T07:48:08.662 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:48:08.660+0000 7f89f77fe700 1 -- 192.168.123.108:0/3142043566 <== mon.1 v2:192.168.123.108:3300/0 7 ==== mon_command_ack([{"prefix": "auth get-or-create", "entity": "client.1", "caps": ["mon", "allow *", "osd", "allow *", "mds", "allow *", "mgr", "allow *"]}]=0 v17) v1 ==== 170+0+59 (secure 0 0 0) 0x7f89f00170f0 con 0x7f8a00103140 2026-03-10T07:48:08.662 INFO:teuthology.orchestra.run.vm08.stdout:[client.1] 2026-03-10T07:48:08.662 INFO:teuthology.orchestra.run.vm08.stdout: key = AQC4zK9pHjwcJxAAnZW+qximzbIlqbX0efTQJQ== 2026-03-10T07:48:08.664 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:48:08.663+0000 7f8a0832b700 1 -- 192.168.123.108:0/3142043566 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f89ec06c530 msgr2=0x7f89ec06e9f0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:48:08.664 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:48:08.663+0000 7f8a0832b700 1 --2- 192.168.123.108:0/3142043566 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f89ec06c530 0x7f89ec06e9f0 secure :-1 s=READY pgs=69 cs=0 l=1 rev1=1 crypto rx=0x7f89fc000c00 
tx=0x7f89fc005c00 comp rx=0 tx=0).stop 2026-03-10T07:48:08.664 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:48:08.663+0000 7f8a0832b700 1 -- 192.168.123.108:0/3142043566 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f8a00103140 msgr2=0x7f8a00198a30 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:48:08.664 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:48:08.663+0000 7f8a0832b700 1 --2- 192.168.123.108:0/3142043566 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f8a00103140 0x7f8a00198a30 secure :-1 s=READY pgs=33 cs=0 l=1 rev1=1 crypto rx=0x7f89f000d9d0 tx=0x7f89f000dd90 comp rx=0 tx=0).stop 2026-03-10T07:48:08.664 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:48:08.663+0000 7f8a0832b700 1 -- 192.168.123.108:0/3142043566 shutdown_connections 2026-03-10T07:48:08.664 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:48:08.663+0000 7f8a0832b700 1 --2- 192.168.123.108:0/3142043566 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f89ec06c530 0x7f89ec06e9f0 unknown :-1 s=CLOSED pgs=69 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:48:08.664 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:48:08.663+0000 7f8a0832b700 1 --2- 192.168.123.108:0/3142043566 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f8a00103140 0x7f8a00198a30 unknown :-1 s=CLOSED pgs=33 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:48:08.664 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:48:08.663+0000 7f8a0832b700 1 --2- 192.168.123.108:0/3142043566 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f8a00104340 0x7f8a00198f70 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:48:08.664 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:48:08.663+0000 7f8a0832b700 1 -- 192.168.123.108:0/3142043566 >> 192.168.123.108:0/3142043566 conn(0x7f8a000fe6c0 
msgr2=0x7f8a00107570 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:48:08.664 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:48:08.663+0000 7f8a0832b700 1 -- 192.168.123.108:0/3142043566 shutdown_connections 2026-03-10T07:48:08.665 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:48:08.663+0000 7f8a0832b700 1 -- 192.168.123.108:0/3142043566 wait complete. 2026-03-10T07:48:08.709 DEBUG:teuthology.orchestra.run.vm08:> set -ex 2026-03-10T07:48:08.709 DEBUG:teuthology.orchestra.run.vm08:> sudo dd of=/etc/ceph/ceph.client.1.keyring 2026-03-10T07:48:08.709 DEBUG:teuthology.orchestra.run.vm08:> sudo chmod 0644 /etc/ceph/ceph.client.1.keyring 2026-03-10T07:48:08.785 INFO:tasks.ceph:Waiting until ceph daemons up and pgs clean... 2026-03-10T07:48:08.785 INFO:tasks.cephadm.ceph_manager.ceph:waiting for mgr available 2026-03-10T07:48:08.785 DEBUG:teuthology.orchestra.run.vm05:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell --fsid 12e9780e-1c55-11f1-8896-79f7c2e9b508 -- ceph mgr dump --format=json 2026-03-10T07:48:08.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:48:08 vm08 ceph-mon[59917]: from='client.? 192.168.123.105:0/2285378371' entity='client.admin' cmd=[{"prefix": "auth get-or-create", "entity": "client.0", "caps": ["mon", "allow *", "osd", "allow *", "mds", "allow *", "mgr", "allow *"]}]: dispatch 2026-03-10T07:48:08.919 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:48:08 vm08 ceph-mon[59917]: from='client.? 192.168.123.105:0/2285378371' entity='client.admin' cmd='[{"prefix": "auth get-or-create", "entity": "client.0", "caps": ["mon", "allow *", "osd", "allow *", "mds", "allow *", "mgr", "allow *"]}]': finished 2026-03-10T07:48:08.919 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:48:08 vm08 ceph-mon[59917]: from='client.? 
192.168.123.108:0/3142043566' entity='client.admin' cmd=[{"prefix": "auth get-or-create", "entity": "client.1", "caps": ["mon", "allow *", "osd", "allow *", "mds", "allow *", "mgr", "allow *"]}]: dispatch 2026-03-10T07:48:08.919 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:48:08 vm08 ceph-mon[59917]: from='client.? ' entity='client.admin' cmd=[{"prefix": "auth get-or-create", "entity": "client.1", "caps": ["mon", "allow *", "osd", "allow *", "mds", "allow *", "mgr", "allow *"]}]: dispatch 2026-03-10T07:48:08.919 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:48:08 vm08 ceph-mon[59917]: from='client.? ' entity='client.admin' cmd='[{"prefix": "auth get-or-create", "entity": "client.1", "caps": ["mon", "allow *", "osd", "allow *", "mds", "allow *", "mgr", "allow *"]}]': finished 2026-03-10T07:48:08.931 INFO:teuthology.orchestra.run.vm05.stderr:Inferring config /var/lib/ceph/12e9780e-1c55-11f1-8896-79f7c2e9b508/mon.vm05/config 2026-03-10T07:48:09.068 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:48:08 vm05 ceph-mon[50387]: from='client.? 192.168.123.105:0/2285378371' entity='client.admin' cmd=[{"prefix": "auth get-or-create", "entity": "client.0", "caps": ["mon", "allow *", "osd", "allow *", "mds", "allow *", "mgr", "allow *"]}]: dispatch 2026-03-10T07:48:09.068 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:48:08 vm05 ceph-mon[50387]: from='client.? 192.168.123.105:0/2285378371' entity='client.admin' cmd='[{"prefix": "auth get-or-create", "entity": "client.0", "caps": ["mon", "allow *", "osd", "allow *", "mds", "allow *", "mgr", "allow *"]}]': finished 2026-03-10T07:48:09.068 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:48:08 vm05 ceph-mon[50387]: from='client.? 
192.168.123.108:0/3142043566' entity='client.admin' cmd=[{"prefix": "auth get-or-create", "entity": "client.1", "caps": ["mon", "allow *", "osd", "allow *", "mds", "allow *", "mgr", "allow *"]}]: dispatch 2026-03-10T07:48:09.068 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:48:08 vm05 ceph-mon[50387]: from='client.? ' entity='client.admin' cmd=[{"prefix": "auth get-or-create", "entity": "client.1", "caps": ["mon", "allow *", "osd", "allow *", "mds", "allow *", "mgr", "allow *"]}]: dispatch 2026-03-10T07:48:09.068 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:48:08 vm05 ceph-mon[50387]: from='client.? ' entity='client.admin' cmd='[{"prefix": "auth get-or-create", "entity": "client.1", "caps": ["mon", "allow *", "osd", "allow *", "mds", "allow *", "mgr", "allow *"]}]': finished 2026-03-10T07:48:09.174 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:09.172+0000 7feb5d0a2700 1 -- 192.168.123.105:0/794181132 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7feb580752c0 msgr2=0x7feb580756a0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:48:09.174 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:09.172+0000 7feb5d0a2700 1 --2- 192.168.123.105:0/794181132 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7feb580752c0 0x7feb580756a0 secure :-1 s=READY pgs=208 cs=0 l=1 rev1=1 crypto rx=0x7feb40009b50 tx=0x7feb40009e60 comp rx=0 tx=0).stop 2026-03-10T07:48:09.174 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:09.172+0000 7feb5d0a2700 1 -- 192.168.123.105:0/794181132 shutdown_connections 2026-03-10T07:48:09.174 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:09.172+0000 7feb5d0a2700 1 --2- 192.168.123.105:0/794181132 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7feb58075be0 0x7feb58111840 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:48:09.174 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:09.172+0000 7feb5d0a2700 1 --2- 192.168.123.105:0/794181132 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7feb580752c0 0x7feb580756a0 unknown :-1 s=CLOSED pgs=208 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:48:09.174 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:09.172+0000 7feb5d0a2700 1 -- 192.168.123.105:0/794181132 >> 192.168.123.105:0/794181132 conn(0x7feb580feb90 msgr2=0x7feb58100fb0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:48:09.174 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:09.173+0000 7feb5d0a2700 1 -- 192.168.123.105:0/794181132 shutdown_connections 2026-03-10T07:48:09.174 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:09.173+0000 7feb5d0a2700 1 -- 192.168.123.105:0/794181132 wait complete. 2026-03-10T07:48:09.175 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:09.173+0000 7feb5d0a2700 1 Processor -- start 2026-03-10T07:48:09.175 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:09.173+0000 7feb5d0a2700 1 -- start start 2026-03-10T07:48:09.175 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:09.173+0000 7feb5d0a2700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7feb580752c0 0x7feb5819d160 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:48:09.175 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:09.173+0000 7feb56d9d700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7feb580752c0 0x7feb5819d160 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:48:09.175 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:09.174+0000 7feb56d9d700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7feb580752c0 0x7feb5819d160 unknown :-1 s=HELLO_CONNECTING pgs=0 
cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.105:38288/0 (socket says 192.168.123.105:38288) 2026-03-10T07:48:09.175 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:09.174+0000 7feb5d0a2700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7feb58075be0 0x7feb5819d6a0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:48:09.175 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:09.174+0000 7feb5d0a2700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7feb5819dd80 con 0x7feb580752c0 2026-03-10T07:48:09.176 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:09.174+0000 7feb5d0a2700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7feb581a1b10 con 0x7feb58075be0 2026-03-10T07:48:09.176 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:09.174+0000 7feb56d9d700 1 -- 192.168.123.105:0/3467897404 learned_addr learned my addr 192.168.123.105:0/3467897404 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T07:48:09.176 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:09.174+0000 7feb5659c700 1 --2- 192.168.123.105:0/3467897404 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7feb58075be0 0x7feb5819d6a0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:48:09.176 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:09.174+0000 7feb56d9d700 1 -- 192.168.123.105:0/3467897404 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7feb58075be0 msgr2=0x7feb5819d6a0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:48:09.176 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:09.174+0000 7feb56d9d700 1 --2- 192.168.123.105:0/3467897404 >> 
[v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7feb58075be0 0x7feb5819d6a0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:48:09.176 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:09.174+0000 7feb56d9d700 1 -- 192.168.123.105:0/3467897404 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7feb400097e0 con 0x7feb580752c0 2026-03-10T07:48:09.176 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:09.174+0000 7feb56d9d700 1 --2- 192.168.123.105:0/3467897404 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7feb580752c0 0x7feb5819d160 secure :-1 s=READY pgs=209 cs=0 l=1 rev1=1 crypto rx=0x7feb40005e30 tx=0x7feb40005f10 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:48:09.177 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:09.174+0000 7feb4ffff700 1 -- 192.168.123.105:0/3467897404 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7feb4001d070 con 0x7feb580752c0 2026-03-10T07:48:09.177 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:09.174+0000 7feb4ffff700 1 -- 192.168.123.105:0/3467897404 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7feb40022470 con 0x7feb580752c0 2026-03-10T07:48:09.177 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:09.174+0000 7feb4ffff700 1 -- 192.168.123.105:0/3467897404 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7feb4000f670 con 0x7feb580752c0 2026-03-10T07:48:09.177 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:09.174+0000 7feb5d0a2700 1 -- 192.168.123.105:0/3467897404 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7feb581a1df0 con 0x7feb580752c0 2026-03-10T07:48:09.177 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:09.174+0000 7feb5d0a2700 1 -- 192.168.123.105:0/3467897404 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7feb581a2340 con 0x7feb580752c0 2026-03-10T07:48:09.180 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:09.176+0000 7feb4ffff700 1 -- 192.168.123.105:0/3467897404 <== mon.0 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 19) v1 ==== 90308+0+0 (secure 0 0 0) 0x7feb4000f7d0 con 0x7feb580752c0 2026-03-10T07:48:09.180 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:09.176+0000 7feb5d0a2700 1 -- 192.168.123.105:0/3467897404 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7feb5810efb0 con 0x7feb580752c0 2026-03-10T07:48:09.181 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:09.179+0000 7feb4ffff700 1 --2- 192.168.123.105:0/3467897404 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7feb4406c4e0 0x7feb4406e9a0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:48:09.181 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:09.179+0000 7feb4ffff700 1 -- 192.168.123.105:0/3467897404 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(33..33 src has 1..33) v4 ==== 4446+0+0 (secure 0 0 0) 0x7feb4008daa0 con 0x7feb580752c0 2026-03-10T07:48:09.181 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:09.179+0000 7feb4ffff700 1 -- 192.168.123.105:0/3467897404 <== mon.0 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7feb40022ac0 con 0x7feb580752c0 2026-03-10T07:48:09.181 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:09.179+0000 7feb5659c700 1 --2- 192.168.123.105:0/3467897404 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7feb4406c4e0 0x7feb4406e9a0 unknown :-1 s=BANNER_CONNECTING pgs=0 
cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:48:09.181 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:09.180+0000 7feb5659c700 1 --2- 192.168.123.105:0/3467897404 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7feb4406c4e0 0x7feb4406e9a0 secure :-1 s=READY pgs=70 cs=0 l=1 rev1=1 crypto rx=0x7feb5819e780 tx=0x7feb48009450 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:48:09.313 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:09.311+0000 7feb5d0a2700 1 -- 192.168.123.105:0/3467897404 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "mgr dump", "format": "json"} v 0) v1 -- 0x7feb5804ea90 con 0x7feb580752c0 2026-03-10T07:48:09.315 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:09.313+0000 7feb4ffff700 1 -- 192.168.123.105:0/3467897404 <== mon.0 v2:192.168.123.105:3300/0 7 ==== mon_command_ack([{"prefix": "mgr dump", "format": "json"}]=0 v19) v1 ==== 74+0+173028 (secure 0 0 0) 0x7feb40058710 con 0x7feb580752c0 2026-03-10T07:48:09.316 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-10T07:48:09.318 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:09.316+0000 7feb5d0a2700 1 -- 192.168.123.105:0/3467897404 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7feb4406c4e0 msgr2=0x7feb4406e9a0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:48:09.318 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:09.316+0000 7feb5d0a2700 1 --2- 192.168.123.105:0/3467897404 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7feb4406c4e0 0x7feb4406e9a0 secure :-1 s=READY pgs=70 cs=0 l=1 rev1=1 crypto rx=0x7feb5819e780 tx=0x7feb48009450 comp rx=0 tx=0).stop 2026-03-10T07:48:09.318 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:09.317+0000 7feb5d0a2700 1 -- 192.168.123.105:0/3467897404 >> 
[v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7feb580752c0 msgr2=0x7feb5819d160 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:48:09.318 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:09.317+0000 7feb5d0a2700 1 --2- 192.168.123.105:0/3467897404 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7feb580752c0 0x7feb5819d160 secure :-1 s=READY pgs=209 cs=0 l=1 rev1=1 crypto rx=0x7feb40005e30 tx=0x7feb40005f10 comp rx=0 tx=0).stop 2026-03-10T07:48:09.319 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:09.317+0000 7feb5d0a2700 1 -- 192.168.123.105:0/3467897404 shutdown_connections 2026-03-10T07:48:09.319 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:09.317+0000 7feb5d0a2700 1 --2- 192.168.123.105:0/3467897404 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7feb4406c4e0 0x7feb4406e9a0 unknown :-1 s=CLOSED pgs=70 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:48:09.319 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:09.317+0000 7feb5d0a2700 1 --2- 192.168.123.105:0/3467897404 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7feb580752c0 0x7feb5819d160 unknown :-1 s=CLOSED pgs=209 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:48:09.319 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:09.317+0000 7feb5d0a2700 1 --2- 192.168.123.105:0/3467897404 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7feb58075be0 0x7feb5819d6a0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:48:09.319 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:09.317+0000 7feb5d0a2700 1 -- 192.168.123.105:0/3467897404 >> 192.168.123.105:0/3467897404 conn(0x7feb580feb90 msgr2=0x7feb580ff800 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:48:09.319 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:09.317+0000 7feb5d0a2700 1 -- 
192.168.123.105:0/3467897404 shutdown_connections 2026-03-10T07:48:09.319 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:09.317+0000 7feb5d0a2700 1 -- 192.168.123.105:0/3467897404 wait complete. 2026-03-10T07:48:09.384 INFO:teuthology.orchestra.run.vm05.stdout:{"epoch":19,"active_gid":14223,"active_name":"vm05.blexke","active_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.105:6800","nonce":2},{"type":"v1","addr":"192.168.123.105:6801","nonce":2}]},"active_addr":"192.168.123.105:6801/2","active_change":"2026-03-10T07:46:43.267060+0000","active_mgr_features":4540138322906710015,"available":true,"standbys":[{"gid":14248,"name":"vm08.orfpog","mgr_features":4540138322906710015,"available_modules":[{"name":"alerts","can_run":true,"error_string":"","module_options":{"interval":{"name":"interval","type":"secs","level":"advanced","flags":1,"default_value":"60","min":"","max":"","enum_allowed":[],"desc":"How frequently to reexamine health status","long_desc":"","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"smtp_destination":{"name":"smtp_destination","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_a
llowed":[],"desc":"Email address to send alerts to","long_desc":"","tags":[],"see_also":[]},"smtp_from_name":{"name":"smtp_from_name","type":"str","level":"advanced","flags":1,"default_value":"Ceph","min":"","max":"","enum_allowed":[],"desc":"Email From: name","long_desc":"","tags":[],"see_also":[]},"smtp_host":{"name":"smtp_host","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"SMTP server","long_desc":"","tags":[],"see_also":[]},"smtp_password":{"name":"smtp_password","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"Password to authenticate with","long_desc":"","tags":[],"see_also":[]},"smtp_port":{"name":"smtp_port","type":"int","level":"advanced","flags":1,"default_value":"465","min":"","max":"","enum_allowed":[],"desc":"SMTP port","long_desc":"","tags":[],"see_also":[]},"smtp_sender":{"name":"smtp_sender","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"SMTP envelope sender","long_desc":"","tags":[],"see_also":[]},"smtp_ssl":{"name":"smtp_ssl","type":"bool","level":"advanced","flags":1,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"Use SSL to connect to SMTP server","long_desc":"","tags":[],"see_also":[]},"smtp_user":{"name":"smtp_user","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"User to authenticate as","long_desc":"","tags":[],"see_also":[]}}},{"name":"balancer","can_run":true,"error_string":"","module_options":{"active":{"name":"active","type":"bool","level":"advanced","flags":1,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"automatically balance PGs across cluster","long_desc":"","tags":[],"see_also":[]},"begin_time":{"name":"begin_time","type":"str","level":"advanced","flags":1,"default_value":"0000","min":"","max":"","enum_allowed":[],"desc":"beginning time of day to automatically 
balance","long_desc":"This is a time of day in the format HHMM.","tags":[],"see_also":[]},"begin_weekday":{"name":"begin_weekday","type":"uint","level":"advanced","flags":1,"default_value":"0","min":"0","max":"6","enum_allowed":[],"desc":"Restrict automatic balancing to this day of the week or later","long_desc":"0 = Sunday, 1 = Monday, etc.","tags":[],"see_also":[]},"crush_compat_max_iterations":{"name":"crush_compat_max_iterations","type":"uint","level":"advanced","flags":1,"default_value":"25","min":"1","max":"250","enum_allowed":[],"desc":"maximum number of iterations to attempt optimization","long_desc":"","tags":[],"see_also":[]},"crush_compat_metrics":{"name":"crush_compat_metrics","type":"str","level":"advanced","flags":1,"default_value":"pgs,objects,bytes","min":"","max":"","enum_allowed":[],"desc":"metrics with which to calculate OSD utilization","long_desc":"Value is a list of one or more of \"pgs\", \"objects\", or \"bytes\", and indicates which metrics to use to balance utilization.","tags":[],"see_also":[]},"crush_compat_step":{"name":"crush_compat_step","type":"float","level":"advanced","flags":1,"default_value":"0.5","min":"0.001","max":"0.999","enum_allowed":[],"desc":"aggressiveness of optimization","long_desc":".99 is very aggressive, .01 is less aggressive","tags":[],"see_also":[]},"end_time":{"name":"end_time","type":"str","level":"advanced","flags":1,"default_value":"2359","min":"","max":"","enum_allowed":[],"desc":"ending time of day to automatically balance","long_desc":"This is a time of day in the format HHMM.","tags":[],"see_also":[]},"end_weekday":{"name":"end_weekday","type":"uint","level":"advanced","flags":1,"default_value":"0","min":"0","max":"6","enum_allowed":[],"desc":"Restrict automatic balancing to days of the week earlier than this","long_desc":"0 = Sunday, 1 = Monday, 
etc.","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"min_score":{"name":"min_score","type":"float","level":"advanced","flags":1,"default_value":"0","min":"","max":"","enum_allowed":[],"desc":"minimum score, below which no optimization is attempted","long_desc":"","tags":[],"see_also":[]},"mode":{"name":"mode","type":"str","level":"advanced","flags":1,"default_value":"upmap","min":"","max":"","enum_allowed":["crush-compat","none","upmap"],"desc":"Balancer mode","long_desc":"","tags":[],"see_also":[]},"pool_ids":{"name":"pool_ids","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"pools which the automatic balancing will be limited to","long_desc":"","tags":[],"see_also":[]},"sleep_interval":{"name":"sleep_interval","type":"secs","level":"advanced","flags":1,"default_value":"60","min":"","max":"","enum_allowed":[],"desc":"how frequently to wake up and attempt optimization","long_desc":"","tags":[],"see_also":[]},"upmap_max_deviation":{"name":"upmap_max_deviation","type":"int","level":"advanced","flags":1,"default_value":"5","min":"1","max":"","enum_allowed":[],"desc":"deviation below which 
no optimization is attempted","long_desc":"If the number of PGs are within this count then no optimization is attempted","tags":[],"see_also":[]},"upmap_max_optimizations":{"name":"upmap_max_optimizations","type":"uint","level":"advanced","flags":1,"default_value":"10","min":"","max":"","enum_allowed":[],"desc":"maximum upmap optimizations to make per attempt","long_desc":"","tags":[],"see_also":[]}}},{"name":"cephadm","can_run":true,"error_string":"","module_options":{"agent_down_multiplier":{"name":"agent_down_multiplier","type":"float","level":"advanced","flags":0,"default_value":"3.0","min":"","max":"","enum_allowed":[],"desc":"Multiplied by agent refresh rate to calculate how long agent must not report before being marked down","long_desc":"","tags":[],"see_also":[]},"agent_refresh_rate":{"name":"agent_refresh_rate","type":"secs","level":"advanced","flags":0,"default_value":"20","min":"","max":"","enum_allowed":[],"desc":"How often agent on each host will try to gather and send metadata","long_desc":"","tags":[],"see_also":[]},"agent_starting_port":{"name":"agent_starting_port","type":"int","level":"advanced","flags":0,"default_value":"4721","min":"","max":"","enum_allowed":[],"desc":"First port agent will try to bind to (will also try up to next 1000 subsequent ports if blocked)","long_desc":"","tags":[],"see_also":[]},"allow_ptrace":{"name":"allow_ptrace","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"allow SYS_PTRACE capability on ceph containers","long_desc":"The SYS_PTRACE capability is needed to attach to a process with gdb or strace. 
Enabling this options can allow debugging daemons that encounter problems at runtime.","tags":[],"see_also":[]},"autotune_interval":{"name":"autotune_interval","type":"secs","level":"advanced","flags":0,"default_value":"600","min":"","max":"","enum_allowed":[],"desc":"how frequently to autotune daemon memory","long_desc":"","tags":[],"see_also":[]},"autotune_memory_target_ratio":{"name":"autotune_memory_target_ratio","type":"float","level":"advanced","flags":0,"default_value":"0.7","min":"","max":"","enum_allowed":[],"desc":"ratio of total system memory to divide amongst autotuned daemons","long_desc":"","tags":[],"see_also":[]},"cgroups_split":{"name":"cgroups_split","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"Pass --cgroups=split when cephadm creates containers (currently podman only)","long_desc":"","tags":[],"see_also":[]},"config_checks_enabled":{"name":"config_checks_enabled","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"Enable or disable the cephadm configuration analysis","long_desc":"","tags":[],"see_also":[]},"config_dashboard":{"name":"config_dashboard","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"manage configs like API endpoints in Dashboard.","long_desc":"","tags":[],"see_also":[]},"container_image_alertmanager":{"name":"container_image_alertmanager","type":"str","level":"advanced","flags":0,"default_value":"quay.io/prometheus/alertmanager:v0.25.0","min":"","max":"","enum_allowed":[],"desc":"Prometheus container image","long_desc":"","tags":[],"see_also":[]},"container_image_base":{"name":"container_image_base","type":"str","level":"advanced","flags":1,"default_value":"quay.io/ceph/ceph","min":"","max":"","enum_allowed":[],"desc":"Container image name, without the 
tag","long_desc":"","tags":[],"see_also":[]},"container_image_elasticsearch":{"name":"container_image_elasticsearch","type":"str","level":"advanced","flags":0,"default_value":"quay.io/omrizeneva/elasticsearch:6.8.23","min":"","max":"","enum_allowed":[],"desc":"elasticsearch container image","long_desc":"","tags":[],"see_also":[]},"container_image_grafana":{"name":"container_image_grafana","type":"str","level":"advanced","flags":0,"default_value":"quay.io/ceph/ceph-grafana:9.4.7","min":"","max":"","enum_allowed":[],"desc":"Prometheus container image","long_desc":"","tags":[],"see_also":[]},"container_image_haproxy":{"name":"container_image_haproxy","type":"str","level":"advanced","flags":0,"default_value":"quay.io/ceph/haproxy:2.3","min":"","max":"","enum_allowed":[],"desc":"HAproxy container image","long_desc":"","tags":[],"see_also":[]},"container_image_jaeger_agent":{"name":"container_image_jaeger_agent","type":"str","level":"advanced","flags":0,"default_value":"quay.io/jaegertracing/jaeger-agent:1.29","min":"","max":"","enum_allowed":[],"desc":"Jaeger agent container image","long_desc":"","tags":[],"see_also":[]},"container_image_jaeger_collector":{"name":"container_image_jaeger_collector","type":"str","level":"advanced","flags":0,"default_value":"quay.io/jaegertracing/jaeger-collector:1.29","min":"","max":"","enum_allowed":[],"desc":"Jaeger collector container image","long_desc":"","tags":[],"see_also":[]},"container_image_jaeger_query":{"name":"container_image_jaeger_query","type":"str","level":"advanced","flags":0,"default_value":"quay.io/jaegertracing/jaeger-query:1.29","min":"","max":"","enum_allowed":[],"desc":"Jaeger query container image","long_desc":"","tags":[],"see_also":[]},"container_image_keepalived":{"name":"container_image_keepalived","type":"str","level":"advanced","flags":0,"default_value":"quay.io/ceph/keepalived:2.2.4","min":"","max":"","enum_allowed":[],"desc":"Keepalived container 
image","long_desc":"","tags":[],"see_also":[]},"container_image_loki":{"name":"container_image_loki","type":"str","level":"advanced","flags":0,"default_value":"docker.io/grafana/loki:2.4.0","min":"","max":"","enum_allowed":[],"desc":"Loki container image","long_desc":"","tags":[],"see_also":[]},"container_image_node_exporter":{"name":"container_image_node_exporter","type":"str","level":"advanced","flags":0,"default_value":"quay.io/prometheus/node-exporter:v1.5.0","min":"","max":"","enum_allowed":[],"desc":"Prometheus container image","long_desc":"","tags":[],"see_also":[]},"container_image_nvmeof":{"name":"container_image_nvmeof","type":"str","level":"advanced","flags":0,"default_value":"quay.io/ceph/nvmeof:0.0.2","min":"","max":"","enum_allowed":[],"desc":"Nvme-of container image","long_desc":"","tags":[],"see_also":[]},"container_image_prometheus":{"name":"container_image_prometheus","type":"str","level":"advanced","flags":0,"default_value":"quay.io/prometheus/prometheus:v2.43.0","min":"","max":"","enum_allowed":[],"desc":"Prometheus container image","long_desc":"","tags":[],"see_also":[]},"container_image_promtail":{"name":"container_image_promtail","type":"str","level":"advanced","flags":0,"default_value":"docker.io/grafana/promtail:2.4.0","min":"","max":"","enum_allowed":[],"desc":"Promtail container image","long_desc":"","tags":[],"see_also":[]},"container_image_snmp_gateway":{"name":"container_image_snmp_gateway","type":"str","level":"advanced","flags":0,"default_value":"docker.io/maxwo/snmp-notifier:v1.2.1","min":"","max":"","enum_allowed":[],"desc":"SNMP Gateway container image","long_desc":"","tags":[],"see_also":[]},"container_init":{"name":"container_init","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"Run podman/docker with 
`--init`","long_desc":"","tags":[],"see_also":[]},"daemon_cache_timeout":{"name":"daemon_cache_timeout","type":"secs","level":"advanced","flags":0,"default_value":"600","min":"","max":"","enum_allowed":[],"desc":"seconds to cache service (daemon) inventory","long_desc":"","tags":[],"see_also":[]},"default_cephadm_command_timeout":{"name":"default_cephadm_command_timeout","type":"secs","level":"advanced","flags":0,"default_value":"900","min":"","max":"","enum_allowed":[],"desc":"Default timeout applied to cephadm commands run directly on the host (in seconds)","long_desc":"","tags":[],"see_also":[]},"default_registry":{"name":"default_registry","type":"str","level":"advanced","flags":0,"default_value":"docker.io","min":"","max":"","enum_allowed":[],"desc":"Search-registry to which we should normalize unqualified image names. This is not the default registry","long_desc":"","tags":[],"see_also":[]},"device_cache_timeout":{"name":"device_cache_timeout","type":"secs","level":"advanced","flags":0,"default_value":"1800","min":"","max":"","enum_allowed":[],"desc":"seconds to cache device inventory","long_desc":"","tags":[],"see_also":[]},"device_enhanced_scan":{"name":"device_enhanced_scan","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"Use libstoragemgmt during device scans","long_desc":"","tags":[],"see_also":[]},"facts_cache_timeout":{"name":"facts_cache_timeout","type":"secs","level":"advanced","flags":0,"default_value":"60","min":"","max":"","enum_allowed":[],"desc":"seconds to cache host facts data","long_desc":"","tags":[],"see_also":[]},"host_check_interval":{"name":"host_check_interval","type":"secs","level":"advanced","flags":0,"default_value":"600","min":"","max":"","enum_allowed":[],"desc":"how frequently to perform a host 
check","long_desc":"","tags":[],"see_also":[]},"inventory_list_all":{"name":"inventory_list_all","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"Whether ceph-volume inventory should report more devices (mostly mappers (LVs / mpaths), partitions...)","long_desc":"","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_refresh_metadata":{"name":"log_refresh_metadata","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"Log all refresh metadata. Includes daemon, device, and host info collected regularly. Only has effect if logging at debug level","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"log to the \"cephadm\" cluster log channel\"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"manage_etc_ceph_ceph_conf":{"name":"manage_etc_ceph_ceph_conf","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"Manage and own /etc/ceph/ceph.conf on the 
hosts.","long_desc":"","tags":[],"see_also":[]},"manage_etc_ceph_ceph_conf_hosts":{"name":"manage_etc_ceph_ceph_conf_hosts","type":"str","level":"advanced","flags":0,"default_value":"*","min":"","max":"","enum_allowed":[],"desc":"PlacementSpec describing on which hosts to manage /etc/ceph/ceph.conf","long_desc":"","tags":[],"see_also":[]},"max_count_per_host":{"name":"max_count_per_host","type":"int","level":"advanced","flags":0,"default_value":"10","min":"","max":"","enum_allowed":[],"desc":"max number of daemons per service per host","long_desc":"","tags":[],"see_also":[]},"max_osd_draining_count":{"name":"max_osd_draining_count","type":"int","level":"advanced","flags":0,"default_value":"10","min":"","max":"","enum_allowed":[],"desc":"max number of osds that will be drained simultaneously when osds are removed","long_desc":"","tags":[],"see_also":[]},"migration_current":{"name":"migration_current","type":"int","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"internal - do not modify","long_desc":"","tags":[],"see_also":[]},"mode":{"name":"mode","type":"str","level":"advanced","flags":0,"default_value":"root","min":"","max":"","enum_allowed":["cephadm-package","root"],"desc":"mode for remote execution of cephadm","long_desc":"","tags":[],"see_also":[]},"prometheus_alerts_path":{"name":"prometheus_alerts_path","type":"str","level":"advanced","flags":0,"default_value":"/etc/prometheus/ceph/ceph_default_alerts.yml","min":"","max":"","enum_allowed":[],"desc":"location of alerts to include in prometheus deployments","long_desc":"","tags":[],"see_also":[]},"registry_insecure":{"name":"registry_insecure","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"Registry is to be considered insecure (no TLS available). 
Only for development purposes.","long_desc":"","tags":[],"see_also":[]},"registry_password":{"name":"registry_password","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"Custom repository password. Only used for logging into a registry.","long_desc":"","tags":[],"see_also":[]},"registry_url":{"name":"registry_url","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"Registry url for login purposes. This is not the default registry","long_desc":"","tags":[],"see_also":[]},"registry_username":{"name":"registry_username","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"Custom repository username. Only used for logging into a registry.","long_desc":"","tags":[],"see_also":[]},"secure_monitoring_stack":{"name":"secure_monitoring_stack","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"Enable TLS security for all the monitoring stack daemons","long_desc":"","tags":[],"see_also":[]},"service_discovery_port":{"name":"service_discovery_port","type":"int","level":"advanced","flags":0,"default_value":"8765","min":"","max":"","enum_allowed":[],"desc":"cephadm service discovery port","long_desc":"","tags":[],"see_also":[]},"ssh_config_file":{"name":"ssh_config_file","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"customized SSH config file to connect to managed hosts","long_desc":"","tags":[],"see_also":[]},"use_agent":{"name":"use_agent","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"Use cephadm agent on each host to gather and send 
metadata","long_desc":"","tags":[],"see_also":[]},"use_repo_digest":{"name":"use_repo_digest","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"Automatically convert image tags to image digest. Make sure all daemons use the same image","long_desc":"","tags":[],"see_also":[]},"warn_on_failed_host_check":{"name":"warn_on_failed_host_check","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"raise a health warning if the host check fails","long_desc":"","tags":[],"see_also":[]},"warn_on_stray_daemons":{"name":"warn_on_stray_daemons","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"raise a health warning if daemons are detected that are not managed by cephadm","long_desc":"","tags":[],"see_also":[]},"warn_on_stray_hosts":{"name":"warn_on_stray_hosts","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"raise a health warning if daemons are detected on a host that is not managed by 
cephadm","long_desc":"","tags":[],"see_also":[]}}},{"name":"crash","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"retain_interval":{"name":"retain_interval","type":"secs","level":"advanced","flags":1,"default_value":"31536000","min":"","max":"","enum_allowed":[],"desc":"how long to retain crashes before pruning them","long_desc":"","tags":[],"see_also":[]},"warn_recent_interval":{"name":"warn_recent_interval","type":"secs","level":"advanced","flags":1,"default_value":"1209600","min":"","max":"","enum_allowed":[],"desc":"time interval in which to warn about recent 
crashes","long_desc":"","tags":[],"see_also":[]}}},{"name":"dashboard","can_run":true,"error_string":"","module_options":{"ACCOUNT_LOCKOUT_ATTEMPTS":{"name":"ACCOUNT_LOCKOUT_ATTEMPTS","type":"int","level":"advanced","flags":0,"default_value":"10","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"ALERTMANAGER_API_HOST":{"name":"ALERTMANAGER_API_HOST","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"ALERTMANAGER_API_SSL_VERIFY":{"name":"ALERTMANAGER_API_SSL_VERIFY","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"AUDIT_API_ENABLED":{"name":"AUDIT_API_ENABLED","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"AUDIT_API_LOG_PAYLOAD":{"name":"AUDIT_API_LOG_PAYLOAD","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"ENABLE_BROWSABLE_API":{"name":"ENABLE_BROWSABLE_API","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"FEATURE_TOGGLE_CEPHFS":{"name":"FEATURE_TOGGLE_CEPHFS","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"FEATURE_TOGGLE_DASHBOARD":{"name":"FEATURE_TOGGLE_DASHBOARD","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"FEATURE_TOGGLE_ISCSI":{"name":"FEATURE_TOGGLE_ISCSI","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"
FEATURE_TOGGLE_MIRRORING":{"name":"FEATURE_TOGGLE_MIRRORING","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"FEATURE_TOGGLE_NFS":{"name":"FEATURE_TOGGLE_NFS","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"FEATURE_TOGGLE_RBD":{"name":"FEATURE_TOGGLE_RBD","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"FEATURE_TOGGLE_RGW":{"name":"FEATURE_TOGGLE_RGW","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"GANESHA_CLUSTERS_RADOS_POOL_NAMESPACE":{"name":"GANESHA_CLUSTERS_RADOS_POOL_NAMESPACE","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"GRAFANA_API_PASSWORD":{"name":"GRAFANA_API_PASSWORD","type":"str","level":"advanced","flags":0,"default_value":"admin","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"GRAFANA_API_SSL_VERIFY":{"name":"GRAFANA_API_SSL_VERIFY","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"GRAFANA_API_URL":{"name":"GRAFANA_API_URL","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"GRAFANA_API_USERNAME":{"name":"GRAFANA_API_USERNAME","type":"str","level":"advanced","flags":0,"default_value":"admin","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"GRAFANA_FRONTEND_API_URL":{"name":"GRAFANA_FRONTEND_API_URL","type":"str","level":"advanced","flags":0,"default_value":"","min":"","
max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"GRAFANA_UPDATE_DASHBOARDS":{"name":"GRAFANA_UPDATE_DASHBOARDS","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"ISCSI_API_SSL_VERIFICATION":{"name":"ISCSI_API_SSL_VERIFICATION","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"ISSUE_TRACKER_API_KEY":{"name":"ISSUE_TRACKER_API_KEY","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"PROMETHEUS_API_HOST":{"name":"PROMETHEUS_API_HOST","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"PROMETHEUS_API_SSL_VERIFY":{"name":"PROMETHEUS_API_SSL_VERIFY","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"PWD_POLICY_CHECK_COMPLEXITY_ENABLED":{"name":"PWD_POLICY_CHECK_COMPLEXITY_ENABLED","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"PWD_POLICY_CHECK_EXCLUSION_LIST_ENABLED":{"name":"PWD_POLICY_CHECK_EXCLUSION_LIST_ENABLED","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"PWD_POLICY_CHECK_LENGTH_ENABLED":{"name":"PWD_POLICY_CHECK_LENGTH_ENABLED","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"PWD_POLICY_CHECK_OLDPWD_ENABLED":{"name":"PWD_POLICY_CHECK_OLDPWD_ENABLED","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","
enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"PWD_POLICY_CHECK_REPETITIVE_CHARS_ENABLED":{"name":"PWD_POLICY_CHECK_REPETITIVE_CHARS_ENABLED","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"PWD_POLICY_CHECK_SEQUENTIAL_CHARS_ENABLED":{"name":"PWD_POLICY_CHECK_SEQUENTIAL_CHARS_ENABLED","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"PWD_POLICY_CHECK_USERNAME_ENABLED":{"name":"PWD_POLICY_CHECK_USERNAME_ENABLED","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"PWD_POLICY_ENABLED":{"name":"PWD_POLICY_ENABLED","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"PWD_POLICY_EXCLUSION_LIST":{"name":"PWD_POLICY_EXCLUSION_LIST","type":"str","level":"advanced","flags":0,"default_value":"osd,host,dashboard,pool,block,nfs,ceph,monitors,gateway,logs,crush,maps","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"PWD_POLICY_MIN_COMPLEXITY":{"name":"PWD_POLICY_MIN_COMPLEXITY","type":"int","level":"advanced","flags":0,"default_value":"10","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"PWD_POLICY_MIN_LENGTH":{"name":"PWD_POLICY_MIN_LENGTH","type":"int","level":"advanced","flags":0,"default_value":"8","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"REST_REQUESTS_TIMEOUT":{"name":"REST_REQUESTS_TIMEOUT","type":"int","level":"advanced","flags":0,"default_value":"45","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"RGW_API_ACCESS_KEY":{"name":"RGW_API_ACCESS_KEY","type":"str","level":"advanced","flags":0,"def
ault_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"RGW_API_ADMIN_RESOURCE":{"name":"RGW_API_ADMIN_RESOURCE","type":"str","level":"advanced","flags":0,"default_value":"admin","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"RGW_API_SECRET_KEY":{"name":"RGW_API_SECRET_KEY","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"RGW_API_SSL_VERIFY":{"name":"RGW_API_SSL_VERIFY","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"UNSAFE_TLS_v1_2":{"name":"UNSAFE_TLS_v1_2","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"USER_PWD_EXPIRATION_SPAN":{"name":"USER_PWD_EXPIRATION_SPAN","type":"int","level":"advanced","flags":0,"default_value":"0","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"USER_PWD_EXPIRATION_WARNING_1":{"name":"USER_PWD_EXPIRATION_WARNING_1","type":"int","level":"advanced","flags":0,"default_value":"10","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"USER_PWD_EXPIRATION_WARNING_2":{"name":"USER_PWD_EXPIRATION_WARNING_2","type":"int","level":"advanced","flags":0,"default_value":"5","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"cross_origin_url":{"name":"cross_origin_url","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"crt_file":{"name":"crt_file","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"debug":{"name":"debug","type":"bool","level":"advanced","flags":0,"defa
ult_value":"False","min":"","max":"","enum_allowed":[],"desc":"Enable/disable debug options","long_desc":"","tags":[],"see_also":[]},"jwt_token_ttl":{"name":"jwt_token_ttl","type":"int","level":"advanced","flags":0,"default_value":"28800","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"key_file":{"name":"key_file","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"motd":{"name":"motd","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"The message of the 
day","long_desc":"","tags":[],"see_also":[]},"redirect_resolve_ip_addr":{"name":"redirect_resolve_ip_addr","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"server_addr":{"name":"server_addr","type":"str","level":"advanced","flags":0,"default_value":"::","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"server_port":{"name":"server_port","type":"int","level":"advanced","flags":0,"default_value":"8080","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"ssl":{"name":"ssl","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"ssl_server_port":{"name":"ssl_server_port","type":"int","level":"advanced","flags":0,"default_value":"8443","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"standby_behaviour":{"name":"standby_behaviour","type":"str","level":"advanced","flags":0,"default_value":"redirect","min":"","max":"","enum_allowed":["error","redirect"],"desc":"","long_desc":"","tags":[],"see_also":[]},"standby_error_status_code":{"name":"standby_error_status_code","type":"int","level":"advanced","flags":0,"default_value":"500","min":"400","max":"599","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"url_prefix":{"name":"url_prefix","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"devicehealth","can_run":true,"error_string":"","module_options":{"enable_monitoring":{"name":"enable_monitoring","type":"bool","level":"advanced","flags":1,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"monitor device health 
metrics","long_desc":"","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"mark_out_threshold":{"name":"mark_out_threshold","type":"secs","level":"advanced","flags":1,"default_value":"2419200","min":"","max":"","enum_allowed":[],"desc":"automatically mark OSD if it may fail before this long","long_desc":"","tags":[],"see_also":[]},"pool_name":{"name":"pool_name","type":"str","level":"advanced","flags":1,"default_value":"device_health_metrics","min":"","max":"","enum_allowed":[],"desc":"name of pool in which to store device health metrics","long_desc":"","tags":[],"see_also":[]},"retention_period":{"name":"retention_period","type":"secs","level":"advanced","flags":1,"default_value":"15552000","min":"","max":"","enum_allowed":[],"desc":"how long to retain device health metrics","long_desc":"","tags":[],"see_also":[]},"scrape_frequency":{"name":"scrape_frequency","type":"secs","level":"advanced","flags":1,"default_value":"86400","min":"","max":"","enum_allowed":[],"desc":"how frequently to scrape device health 
metrics","long_desc":"","tags":[],"see_also":[]},"self_heal":{"name":"self_heal","type":"bool","level":"advanced","flags":1,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"preemptively heal cluster around devices that may fail","long_desc":"","tags":[],"see_also":[]},"sleep_interval":{"name":"sleep_interval","type":"secs","level":"advanced","flags":1,"default_value":"600","min":"","max":"","enum_allowed":[],"desc":"how frequently to wake up and check device health","long_desc":"","tags":[],"see_also":[]},"warn_threshold":{"name":"warn_threshold","type":"secs","level":"advanced","flags":1,"default_value":"7257600","min":"","max":"","enum_allowed":[],"desc":"raise health warning if OSD may fail before this long","long_desc":"","tags":[],"see_also":[]}}},{"name":"diskprediction_local","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"predict_interval":{"name":"predict_interval","type":"str","level":"advanced","flags":0,"default_value":"86400","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"predictor_model":{"name":"predictor_model","type":"str","level":"advanced","fl
ags":0,"default_value":"prophetstor","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"sleep_interval":{"name":"sleep_interval","type":"str","level":"advanced","flags":0,"default_value":"600","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"influx","can_run":false,"error_string":"influxdb python module not found","module_options":{"batch_size":{"name":"batch_size","type":"int","level":"advanced","flags":0,"default_value":"5000","min":"","max":"","enum_allowed":[],"desc":"How big batches of data points should be when sending to InfluxDB.","long_desc":"","tags":[],"see_also":[]},"database":{"name":"database","type":"str","level":"advanced","flags":0,"default_value":"ceph","min":"","max":"","enum_allowed":[],"desc":"InfluxDB database name. You will need to create this database and grant write privileges to the configured username or the username must have admin privileges to create it.","long_desc":"","tags":[],"see_also":[]},"hostname":{"name":"hostname","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"InfluxDB server hostname","long_desc":"","tags":[],"see_also":[]},"interval":{"name":"interval","type":"secs","level":"advanced","flags":0,"default_value":"30","min":"5","max":"","enum_allowed":[],"desc":"Time between reports to InfluxDB. 
Default 30 seconds.","long_desc":"","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"password":{"name":"password","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"password of InfluxDB server user","long_desc":"","tags":[],"see_also":[]},"port":{"name":"port","type":"int","level":"advanced","flags":0,"default_value":"8086","min":"","max":"","enum_allowed":[],"desc":"InfluxDB server port","long_desc":"","tags":[],"see_also":[]},"ssl":{"name":"ssl","type":"str","level":"advanced","flags":0,"default_value":"false","min":"","max":"","enum_allowed":[],"desc":"Use https connection for InfluxDB server. 
Use \"true\" or \"false\".","long_desc":"","tags":[],"see_also":[]},"threads":{"name":"threads","type":"int","level":"advanced","flags":0,"default_value":"5","min":"1","max":"32","enum_allowed":[],"desc":"How many worker threads should be spawned for sending data to InfluxDB.","long_desc":"","tags":[],"see_also":[]},"username":{"name":"username","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"username of InfluxDB server user","long_desc":"","tags":[],"see_also":[]},"verify_ssl":{"name":"verify_ssl","type":"str","level":"advanced","flags":0,"default_value":"true","min":"","max":"","enum_allowed":[],"desc":"Verify https cert for InfluxDB server. Use \"true\" or \"false\".","long_desc":"","tags":[],"see_also":[]}}},{"name":"insights","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"iostat","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":
[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"k8sevents","can_run":true,"error_string":"","module_options":{"ceph_event_retention_days":{"name":"ceph_event_retention_days","type":"int","level":"advanced","flags":0,"default_value":"7","min":"","max":"","enum_allowed":[],"desc":"Days to hold ceph event information within local cache","long_desc":"","tags":[],"see_also":[]},"config_check_secs":{"name":"config_check_secs","type":"int","level":"advanced","flags":0,"default_value":"10","min":"10","max":"","enum_allowed":[],"desc":"interval (secs) to check for cluster configuration 
changes","long_desc":"","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"localpool","can_run":true,"error_string":"","module_options":{"failure_domain":{"name":"failure_domain","type":"str","level":"advanced","flags":1,"default_value":"host","min":"","max":"","enum_allowed":[],"desc":"failure domain for any created local pool","long_desc":"what failure domain we should separate data replicas 
across.","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"min_size":{"name":"min_size","type":"int","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"default min_size for any created local pool","long_desc":"value to set min_size to (unchanged from Ceph's default if this option is not set)","tags":[],"see_also":[]},"num_rep":{"name":"num_rep","type":"int","level":"advanced","flags":1,"default_value":"3","min":"","max":"","enum_allowed":[],"desc":"default replica count for any created local pool","long_desc":"","tags":[],"see_also":[]},"pg_num":{"name":"pg_num","type":"int","level":"advanced","flags":1,"default_value":"128","min":"","max":"","enum_allowed":[],"desc":"default pg_num for any created local pool","long_desc":"","tags":[],"see_also":[]},"prefix":{"name":"prefix","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"name prefix for any created local pool","long_desc":"","tags":[],"see_also":[]},"subtree":{"name":"subtree","type":"str","level":"advanced","flags":1,"default_value":"rack","min":"","max":"","enum_allowed":[],"desc":"CRUSH level 
for which to create a local pool","long_desc":"which CRUSH subtree type the module should create a pool for.","tags":[],"see_also":[]}}},{"name":"mds_autoscaler","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"mirroring","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also
":[]}}},{"name":"nfs","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"orchestrator","can_run":true,"error_string":"","module_options":{"fail_fs":{"name":"fail_fs","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"Fail filesystem for rapid multi-rank mds 
upgrade","long_desc":"","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"orchestrator":{"name":"orchestrator","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["cephadm","rook","test_orchestrator"],"desc":"Orchestrator 
backend","long_desc":"","tags":[],"see_also":[]}}},{"name":"osd_perf_query","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"osd_support","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"pg_autoscaler","can_run":true,"error_string":"","module_options":{"
log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"sleep_interval":{"name":"sleep_interval","type":"secs","level":"advanced","flags":0,"default_value":"60","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"threshold":{"name":"threshold","type":"float","level":"advanced","flags":0,"default_value":"3.0","min":"1.0","max":"","enum_allowed":[],"desc":"scaling threshold","long_desc":"The factor by which the `NEW PG_NUM` must vary from the current `PG_NUM` before being accepted. 
Cannot be less than 1.0","tags":[],"see_also":[]}}},{"name":"progress","can_run":true,"error_string":"","module_options":{"allow_pg_recovery_event":{"name":"allow_pg_recovery_event","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"allow the module to show pg recovery progress","long_desc":"","tags":[],"see_also":[]},"enabled":{"name":"enabled","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"max_completed_events":{"name":"max_completed_events","type":"int","level":"advanced","flags":1,"default_value":"50","min":"","max":"","enum_allowed":[],"desc":"number of past completed events to remember","long_desc":"","tags":[],"see_also":[]},"sleep_interval":{"name":"sleep_interval","type":"secs","level":"advanced","flags":1,"default_value":"5","min":"","max":"","enum_allowed":[],"desc":"how long the module is going to 
sleep","long_desc":"","tags":[],"see_also":[]}}},{"name":"prometheus","can_run":true,"error_string":"","module_options":{"cache":{"name":"cache","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"exclude_perf_counters":{"name":"exclude_perf_counters","type":"bool","level":"advanced","flags":1,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"Do not include perf-counters in the metrics output","long_desc":"Gathering perf-counters from a single Prometheus exporter can degrade ceph-mgr performance, especially in large clusters. Instead, Ceph-exporter daemons are now used by default for perf-counter gathering. This should only be disabled when no ceph-exporters are deployed.","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"rbd_stats_pools":{"name":"rbd_stats_pools","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"rbd_stats_pools_refresh_interval":{"name":"rbd_stats_pools_refresh_interval","type":"int","level":"advanced","flags":0,"def
ault_value":"300","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"scrape_interval":{"name":"scrape_interval","type":"float","level":"advanced","flags":0,"default_value":"15.0","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"server_addr":{"name":"server_addr","type":"str","level":"advanced","flags":0,"default_value":"::","min":"","max":"","enum_allowed":[],"desc":"the IPv4 or IPv6 address on which the module listens for HTTP requests","long_desc":"","tags":[],"see_also":[]},"server_port":{"name":"server_port","type":"int","level":"advanced","flags":1,"default_value":"9283","min":"","max":"","enum_allowed":[],"desc":"the port on which the module listens for HTTP requests","long_desc":"","tags":[],"see_also":[]},"stale_cache_strategy":{"name":"stale_cache_strategy","type":"str","level":"advanced","flags":0,"default_value":"log","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"standby_behaviour":{"name":"standby_behaviour","type":"str","level":"advanced","flags":1,"default_value":"default","min":"","max":"","enum_allowed":["default","error"],"desc":"","long_desc":"","tags":[],"see_also":[]},"standby_error_status_code":{"name":"standby_error_status_code","type":"int","level":"advanced","flags":1,"default_value":"500","min":"400","max":"599","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"rbd_support","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str
","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"max_concurrent_snap_create":{"name":"max_concurrent_snap_create","type":"int","level":"advanced","flags":0,"default_value":"10","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"mirror_snapshot_schedule":{"name":"mirror_snapshot_schedule","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"trash_purge_schedule":{"name":"trash_purge_schedule","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"restful","can_run":true,"error_string":"","module_options":{"enable_auth":{"name":"enable_auth","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"key_file":{"name":"key_file","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_a
llowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"server_addr":{"name":"server_addr","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"server_port":{"name":"server_port","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"rgw","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"rook","can_run":true,"error_string":"","module_options":{"drive_group_interval":{"name":"drive_group_interval","type":"float","level":"advanced","flags":0,"default_value":"300.0","min":"","max":"","enum_allowed":[],"desc":"interval in seconds between re-application of applied 
drive_groups","long_desc":"","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"storage_class":{"name":"storage_class","type":"str","level":"advanced","flags":0,"default_value":"local","min":"","max":"","enum_allowed":[],"desc":"storage class name for LSO-discovered 
PVs","long_desc":"","tags":[],"see_also":[]}}},{"name":"selftest","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"roption1":{"name":"roption1","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"roption2":{"name":"roption2","type":"str","level":"advanced","flags":0,"default_value":"xyz","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"rwoption1":{"name":"rwoption1","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"rwoption2":{"name":"rwoption2","type":"int","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"rwoption3":{"name":"rwoption3","type":"float","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"rwoption4":{"name":"rwoption4","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[
],"desc":"","long_desc":"","tags":[],"see_also":[]},"rwoption5":{"name":"rwoption5","type":"bool","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"rwoption6":{"name":"rwoption6","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"rwoption7":{"name":"rwoption7","type":"int","level":"advanced","flags":0,"default_value":"","min":"1","max":"42","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"testkey":{"name":"testkey","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"testlkey":{"name":"testlkey","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"testnewline":{"name":"testnewline","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"snap_schedule","can_run":true,"error_string":"","module_options":{"allow_m_granularity":{"name":"allow_m_granularity","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"allow minute scheduled snapshots","long_desc":"","tags":[],"see_also":[]},"dump_on_update":{"name":"dump_on_update","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"dump database to debug log on 
update","long_desc":"","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"stats","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"status","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"",
"min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"telegraf","can_run":true,"error_string":"","module_options":{"address":{"name":"address","type":"str","level":"advanced","flags":0,"default_value":"unixgram:///tmp/telegraf.sock","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"interval":{"name":"interval","type":"secs","level":"advanced","flags":0,"default_value":"15","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","mi
n":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"telemetry","can_run":true,"error_string":"","module_options":{"channel_basic":{"name":"channel_basic","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"Share basic cluster information (size, version)","long_desc":"","tags":[],"see_also":[]},"channel_crash":{"name":"channel_crash","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"Share metadata about Ceph daemon crashes (version, stack traces, etc)","long_desc":"","tags":[],"see_also":[]},"channel_device":{"name":"channel_device","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"Share device health metrics (e.g., SMART data, minus potentially identifying info like serial numbers)","long_desc":"","tags":[],"see_also":[]},"channel_ident":{"name":"channel_ident","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"Share a user-provided description and/or contact email for the cluster","long_desc":"","tags":[],"see_also":[]},"channel_perf":{"name":"channel_perf","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"Share various performance metrics of a 
cluster","long_desc":"","tags":[],"see_also":[]},"contact":{"name":"contact","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"description":{"name":"description","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"device_url":{"name":"device_url","type":"str","level":"advanced","flags":0,"default_value":"https://telemetry.ceph.com/device","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"enabled":{"name":"enabled","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"interval":{"name":"interval","type":"int","level":"advanced","flags":0,"default_value":"24","min":"8","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"last_opt_revision":{"name":"last_opt_revision","type":"int","level":"advanced","flags":0,"default_value":"1","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"leaderboard":{"name":"leaderboard","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"leaderboard_description":{"name":"leaderboard_description","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_t
o_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"organization":{"name":"organization","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"proxy":{"name":"proxy","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"url":{"name":"url","type":"str","level":"advanced","flags":0,"default_value":"https://telemetry.ceph.com/report","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"test_orchestrator","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"volumes","can_run":true,"error_string":"","module_options"
:{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"max_concurrent_clones":{"name":"max_concurrent_clones","type":"int","level":"advanced","flags":0,"default_value":"4","min":"","max":"","enum_allowed":[],"desc":"Number of asynchronous cloner threads","long_desc":"","tags":[],"see_also":[]},"snapshot_clone_delay":{"name":"snapshot_clone_delay","type":"int","level":"advanced","flags":0,"default_value":"0","min":"","max":"","enum_allowed":[],"desc":"Delay clone begin operation by snapshot_clone_delay 
seconds","long_desc":"","tags":[],"see_also":[]}}},{"name":"zabbix","can_run":true,"error_string":"","module_options":{"discovery_interval":{"name":"discovery_interval","type":"uint","level":"advanced","flags":0,"default_value":"100","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"identifier":{"name":"identifier","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"interval":{"name":"interval","type":"secs","level":"advanced","flags":0,"default_value":"60","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"zabbix_host":{"name":"zabbix_host","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"zabbix_port":{"name":"zabbix_port","type":"int","level":"advanced","flags":0,"default_value":"10051","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"zabbix_sender":{"name":"zabbix_sender","type":"str","level":"advanced","flags":0,"defau
lt_value":"/usr/bin/zabbix_sender","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}}]}],"modules":["cephadm","dashboard","iostat","nfs","prometheus","restful"],"available_modules":[{"name":"alerts","can_run":true,"error_string":"","module_options":{"interval":{"name":"interval","type":"secs","level":"advanced","flags":1,"default_value":"60","min":"","max":"","enum_allowed":[],"desc":"How frequently to reexamine health status","long_desc":"","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"smtp_destination":{"name":"smtp_destination","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"Email address to send alerts to","long_desc":"","tags":[],"see_also":[]},"smtp_from_name":{"name":"smtp_from_name","type":"str","level":"advanced","flags":1,"default_value":"Ceph","min":"","max":"","enum_allowed":[],"desc":"Email From: name","long_desc":"","tags":[],"see_also":[]},"smtp_host":{"name":"smtp_host","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"SMTP 
server","long_desc":"","tags":[],"see_also":[]},"smtp_password":{"name":"smtp_password","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"Password to authenticate with","long_desc":"","tags":[],"see_also":[]},"smtp_port":{"name":"smtp_port","type":"int","level":"advanced","flags":1,"default_value":"465","min":"","max":"","enum_allowed":[],"desc":"SMTP port","long_desc":"","tags":[],"see_also":[]},"smtp_sender":{"name":"smtp_sender","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"SMTP envelope sender","long_desc":"","tags":[],"see_also":[]},"smtp_ssl":{"name":"smtp_ssl","type":"bool","level":"advanced","flags":1,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"Use SSL to connect to SMTP server","long_desc":"","tags":[],"see_also":[]},"smtp_user":{"name":"smtp_user","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"User to authenticate as","long_desc":"","tags":[],"see_also":[]}}},{"name":"balancer","can_run":true,"error_string":"","module_options":{"active":{"name":"active","type":"bool","level":"advanced","flags":1,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"automatically balance PGs across cluster","long_desc":"","tags":[],"see_also":[]},"begin_time":{"name":"begin_time","type":"str","level":"advanced","flags":1,"default_value":"0000","min":"","max":"","enum_allowed":[],"desc":"beginning time of day to automatically balance","long_desc":"This is a time of day in the format HHMM.","tags":[],"see_also":[]},"begin_weekday":{"name":"begin_weekday","type":"uint","level":"advanced","flags":1,"default_value":"0","min":"0","max":"6","enum_allowed":[],"desc":"Restrict automatic balancing to this day of the week or later","long_desc":"0 = Sunday, 1 = Monday, 
etc.","tags":[],"see_also":[]},"crush_compat_max_iterations":{"name":"crush_compat_max_iterations","type":"uint","level":"advanced","flags":1,"default_value":"25","min":"1","max":"250","enum_allowed":[],"desc":"maximum number of iterations to attempt optimization","long_desc":"","tags":[],"see_also":[]},"crush_compat_metrics":{"name":"crush_compat_metrics","type":"str","level":"advanced","flags":1,"default_value":"pgs,objects,bytes","min":"","max":"","enum_allowed":[],"desc":"metrics with which to calculate OSD utilization","long_desc":"Value is a list of one or more of \"pgs\", \"objects\", or \"bytes\", and indicates which metrics to use to balance utilization.","tags":[],"see_also":[]},"crush_compat_step":{"name":"crush_compat_step","type":"float","level":"advanced","flags":1,"default_value":"0.5","min":"0.001","max":"0.999","enum_allowed":[],"desc":"aggressiveness of optimization","long_desc":".99 is very aggressive, .01 is less aggressive","tags":[],"see_also":[]},"end_time":{"name":"end_time","type":"str","level":"advanced","flags":1,"default_value":"2359","min":"","max":"","enum_allowed":[],"desc":"ending time of day to automatically balance","long_desc":"This is a time of day in the format HHMM.","tags":[],"see_also":[]},"end_weekday":{"name":"end_weekday","type":"uint","level":"advanced","flags":1,"default_value":"0","min":"0","max":"6","enum_allowed":[],"desc":"Restrict automatic balancing to days of the week earlier than this","long_desc":"0 = Sunday, 1 = Monday, 
etc.","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"min_score":{"name":"min_score","type":"float","level":"advanced","flags":1,"default_value":"0","min":"","max":"","enum_allowed":[],"desc":"minimum score, below which no optimization is attempted","long_desc":"","tags":[],"see_also":[]},"mode":{"name":"mode","type":"str","level":"advanced","flags":1,"default_value":"upmap","min":"","max":"","enum_allowed":["crush-compat","none","upmap"],"desc":"Balancer mode","long_desc":"","tags":[],"see_also":[]},"pool_ids":{"name":"pool_ids","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"pools which the automatic balancing will be limited to","long_desc":"","tags":[],"see_also":[]},"sleep_interval":{"name":"sleep_interval","type":"secs","level":"advanced","flags":1,"default_value":"60","min":"","max":"","enum_allowed":[],"desc":"how frequently to wake up and attempt optimization","long_desc":"","tags":[],"see_also":[]},"upmap_max_deviation":{"name":"upmap_max_deviation","type":"int","level":"advanced","flags":1,"default_value":"5","min":"1","max":"","enum_allowed":[],"desc":"deviation below which 
no optimization is attempted","long_desc":"If the number of PGs are within this count then no optimization is attempted","tags":[],"see_also":[]},"upmap_max_optimizations":{"name":"upmap_max_optimizations","type":"uint","level":"advanced","flags":1,"default_value":"10","min":"","max":"","enum_allowed":[],"desc":"maximum upmap optimizations to make per attempt","long_desc":"","tags":[],"see_also":[]}}},{"name":"cephadm","can_run":true,"error_string":"","module_options":{"agent_down_multiplier":{"name":"agent_down_multiplier","type":"float","level":"advanced","flags":0,"default_value":"3.0","min":"","max":"","enum_allowed":[],"desc":"Multiplied by agent refresh rate to calculate how long agent must not report before being marked down","long_desc":"","tags":[],"see_also":[]},"agent_refresh_rate":{"name":"agent_refresh_rate","type":"secs","level":"advanced","flags":0,"default_value":"20","min":"","max":"","enum_allowed":[],"desc":"How often agent on each host will try to gather and send metadata","long_desc":"","tags":[],"see_also":[]},"agent_starting_port":{"name":"agent_starting_port","type":"int","level":"advanced","flags":0,"default_value":"4721","min":"","max":"","enum_allowed":[],"desc":"First port agent will try to bind to (will also try up to next 1000 subsequent ports if blocked)","long_desc":"","tags":[],"see_also":[]},"allow_ptrace":{"name":"allow_ptrace","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"allow SYS_PTRACE capability on ceph containers","long_desc":"The SYS_PTRACE capability is needed to attach to a process with gdb or strace. 
Enabling this options can allow debugging daemons that encounter problems at runtime.","tags":[],"see_also":[]},"autotune_interval":{"name":"autotune_interval","type":"secs","level":"advanced","flags":0,"default_value":"600","min":"","max":"","enum_allowed":[],"desc":"how frequently to autotune daemon memory","long_desc":"","tags":[],"see_also":[]},"autotune_memory_target_ratio":{"name":"autotune_memory_target_ratio","type":"float","level":"advanced","flags":0,"default_value":"0.7","min":"","max":"","enum_allowed":[],"desc":"ratio of total system memory to divide amongst autotuned daemons","long_desc":"","tags":[],"see_also":[]},"cgroups_split":{"name":"cgroups_split","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"Pass --cgroups=split when cephadm creates containers (currently podman only)","long_desc":"","tags":[],"see_also":[]},"config_checks_enabled":{"name":"config_checks_enabled","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"Enable or disable the cephadm configuration analysis","long_desc":"","tags":[],"see_also":[]},"config_dashboard":{"name":"config_dashboard","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"manage configs like API endpoints in Dashboard.","long_desc":"","tags":[],"see_also":[]},"container_image_alertmanager":{"name":"container_image_alertmanager","type":"str","level":"advanced","flags":0,"default_value":"quay.io/prometheus/alertmanager:v0.25.0","min":"","max":"","enum_allowed":[],"desc":"Prometheus container image","long_desc":"","tags":[],"see_also":[]},"container_image_base":{"name":"container_image_base","type":"str","level":"advanced","flags":1,"default_value":"quay.io/ceph/ceph","min":"","max":"","enum_allowed":[],"desc":"Container image name, without the 
tag","long_desc":"","tags":[],"see_also":[]},"container_image_elasticsearch":{"name":"container_image_elasticsearch","type":"str","level":"advanced","flags":0,"default_value":"quay.io/omrizeneva/elasticsearch:6.8.23","min":"","max":"","enum_allowed":[],"desc":"elasticsearch container image","long_desc":"","tags":[],"see_also":[]},"container_image_grafana":{"name":"container_image_grafana","type":"str","level":"advanced","flags":0,"default_value":"quay.io/ceph/ceph-grafana:9.4.7","min":"","max":"","enum_allowed":[],"desc":"Prometheus container image","long_desc":"","tags":[],"see_also":[]},"container_image_haproxy":{"name":"container_image_haproxy","type":"str","level":"advanced","flags":0,"default_value":"quay.io/ceph/haproxy:2.3","min":"","max":"","enum_allowed":[],"desc":"HAproxy container image","long_desc":"","tags":[],"see_also":[]},"container_image_jaeger_agent":{"name":"container_image_jaeger_agent","type":"str","level":"advanced","flags":0,"default_value":"quay.io/jaegertracing/jaeger-agent:1.29","min":"","max":"","enum_allowed":[],"desc":"Jaeger agent container image","long_desc":"","tags":[],"see_also":[]},"container_image_jaeger_collector":{"name":"container_image_jaeger_collector","type":"str","level":"advanced","flags":0,"default_value":"quay.io/jaegertracing/jaeger-collector:1.29","min":"","max":"","enum_allowed":[],"desc":"Jaeger collector container image","long_desc":"","tags":[],"see_also":[]},"container_image_jaeger_query":{"name":"container_image_jaeger_query","type":"str","level":"advanced","flags":0,"default_value":"quay.io/jaegertracing/jaeger-query:1.29","min":"","max":"","enum_allowed":[],"desc":"Jaeger query container image","long_desc":"","tags":[],"see_also":[]},"container_image_keepalived":{"name":"container_image_keepalived","type":"str","level":"advanced","flags":0,"default_value":"quay.io/ceph/keepalived:2.2.4","min":"","max":"","enum_allowed":[],"desc":"Keepalived container 
image","long_desc":"","tags":[],"see_also":[]},"container_image_loki":{"name":"container_image_loki","type":"str","level":"advanced","flags":0,"default_value":"docker.io/grafana/loki:2.4.0","min":"","max":"","enum_allowed":[],"desc":"Loki container image","long_desc":"","tags":[],"see_also":[]},"container_image_node_exporter":{"name":"container_image_node_exporter","type":"str","level":"advanced","flags":0,"default_value":"quay.io/prometheus/node-exporter:v1.5.0","min":"","max":"","enum_allowed":[],"desc":"Prometheus container image","long_desc":"","tags":[],"see_also":[]},"container_image_nvmeof":{"name":"container_image_nvmeof","type":"str","level":"advanced","flags":0,"default_value":"quay.io/ceph/nvmeof:0.0.2","min":"","max":"","enum_allowed":[],"desc":"Nvme-of container image","long_desc":"","tags":[],"see_also":[]},"container_image_prometheus":{"name":"container_image_prometheus","type":"str","level":"advanced","flags":0,"default_value":"quay.io/prometheus/prometheus:v2.43.0","min":"","max":"","enum_allowed":[],"desc":"Prometheus container image","long_desc":"","tags":[],"see_also":[]},"container_image_promtail":{"name":"container_image_promtail","type":"str","level":"advanced","flags":0,"default_value":"docker.io/grafana/promtail:2.4.0","min":"","max":"","enum_allowed":[],"desc":"Promtail container image","long_desc":"","tags":[],"see_also":[]},"container_image_snmp_gateway":{"name":"container_image_snmp_gateway","type":"str","level":"advanced","flags":0,"default_value":"docker.io/maxwo/snmp-notifier:v1.2.1","min":"","max":"","enum_allowed":[],"desc":"SNMP Gateway container image","long_desc":"","tags":[],"see_also":[]},"container_init":{"name":"container_init","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"Run podman/docker with 
`--init`","long_desc":"","tags":[],"see_also":[]},"daemon_cache_timeout":{"name":"daemon_cache_timeout","type":"secs","level":"advanced","flags":0,"default_value":"600","min":"","max":"","enum_allowed":[],"desc":"seconds to cache service (daemon) inventory","long_desc":"","tags":[],"see_also":[]},"default_cephadm_command_timeout":{"name":"default_cephadm_command_timeout","type":"secs","level":"advanced","flags":0,"default_value":"900","min":"","max":"","enum_allowed":[],"desc":"Default timeout applied to cephadm commands run directly on the host (in seconds)","long_desc":"","tags":[],"see_also":[]},"default_registry":{"name":"default_registry","type":"str","level":"advanced","flags":0,"default_value":"docker.io","min":"","max":"","enum_allowed":[],"desc":"Search-registry to which we should normalize unqualified image names. This is not the default registry","long_desc":"","tags":[],"see_also":[]},"device_cache_timeout":{"name":"device_cache_timeout","type":"secs","level":"advanced","flags":0,"default_value":"1800","min":"","max":"","enum_allowed":[],"desc":"seconds to cache device inventory","long_desc":"","tags":[],"see_also":[]},"device_enhanced_scan":{"name":"device_enhanced_scan","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"Use libstoragemgmt during device scans","long_desc":"","tags":[],"see_also":[]},"facts_cache_timeout":{"name":"facts_cache_timeout","type":"secs","level":"advanced","flags":0,"default_value":"60","min":"","max":"","enum_allowed":[],"desc":"seconds to cache host facts data","long_desc":"","tags":[],"see_also":[]},"host_check_interval":{"name":"host_check_interval","type":"secs","level":"advanced","flags":0,"default_value":"600","min":"","max":"","enum_allowed":[],"desc":"how frequently to perform a host 
check","long_desc":"","tags":[],"see_also":[]},"inventory_list_all":{"name":"inventory_list_all","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"Whether ceph-volume inventory should report more devices (mostly mappers (LVs / mpaths), partitions...)","long_desc":"","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_refresh_metadata":{"name":"log_refresh_metadata","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"Log all refresh metadata. Includes daemon, device, and host info collected regularly. Only has effect if logging at debug level","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"log to the \"cephadm\" cluster log channel\"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"manage_etc_ceph_ceph_conf":{"name":"manage_etc_ceph_ceph_conf","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"Manage and own /etc/ceph/ceph.conf on the 
hosts.","long_desc":"","tags":[],"see_also":[]},"manage_etc_ceph_ceph_conf_hosts":{"name":"manage_etc_ceph_ceph_conf_hosts","type":"str","level":"advanced","flags":0,"default_value":"*","min":"","max":"","enum_allowed":[],"desc":"PlacementSpec describing on which hosts to manage /etc/ceph/ceph.conf","long_desc":"","tags":[],"see_also":[]},"max_count_per_host":{"name":"max_count_per_host","type":"int","level":"advanced","flags":0,"default_value":"10","min":"","max":"","enum_allowed":[],"desc":"max number of daemons per service per host","long_desc":"","tags":[],"see_also":[]},"max_osd_draining_count":{"name":"max_osd_draining_count","type":"int","level":"advanced","flags":0,"default_value":"10","min":"","max":"","enum_allowed":[],"desc":"max number of osds that will be drained simultaneously when osds are removed","long_desc":"","tags":[],"see_also":[]},"migration_current":{"name":"migration_current","type":"int","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"internal - do not modify","long_desc":"","tags":[],"see_also":[]},"mode":{"name":"mode","type":"str","level":"advanced","flags":0,"default_value":"root","min":"","max":"","enum_allowed":["cephadm-package","root"],"desc":"mode for remote execution of cephadm","long_desc":"","tags":[],"see_also":[]},"prometheus_alerts_path":{"name":"prometheus_alerts_path","type":"str","level":"advanced","flags":0,"default_value":"/etc/prometheus/ceph/ceph_default_alerts.yml","min":"","max":"","enum_allowed":[],"desc":"location of alerts to include in prometheus deployments","long_desc":"","tags":[],"see_also":[]},"registry_insecure":{"name":"registry_insecure","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"Registry is to be considered insecure (no TLS available). 
Only for development purposes.","long_desc":"","tags":[],"see_also":[]},"registry_password":{"name":"registry_password","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"Custom repository password. Only used for logging into a registry.","long_desc":"","tags":[],"see_also":[]},"registry_url":{"name":"registry_url","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"Registry url for login purposes. This is not the default registry","long_desc":"","tags":[],"see_also":[]},"registry_username":{"name":"registry_username","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"Custom repository username. Only used for logging into a registry.","long_desc":"","tags":[],"see_also":[]},"secure_monitoring_stack":{"name":"secure_monitoring_stack","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"Enable TLS security for all the monitoring stack daemons","long_desc":"","tags":[],"see_also":[]},"service_discovery_port":{"name":"service_discovery_port","type":"int","level":"advanced","flags":0,"default_value":"8765","min":"","max":"","enum_allowed":[],"desc":"cephadm service discovery port","long_desc":"","tags":[],"see_also":[]},"ssh_config_file":{"name":"ssh_config_file","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"customized SSH config file to connect to managed hosts","long_desc":"","tags":[],"see_also":[]},"use_agent":{"name":"use_agent","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"Use cephadm agent on each host to gather and send 
metadata","long_desc":"","tags":[],"see_also":[]},"use_repo_digest":{"name":"use_repo_digest","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"Automatically convert image tags to image digest. Make sure all daemons use the same image","long_desc":"","tags":[],"see_also":[]},"warn_on_failed_host_check":{"name":"warn_on_failed_host_check","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"raise a health warning if the host check fails","long_desc":"","tags":[],"see_also":[]},"warn_on_stray_daemons":{"name":"warn_on_stray_daemons","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"raise a health warning if daemons are detected that are not managed by cephadm","long_desc":"","tags":[],"see_also":[]},"warn_on_stray_hosts":{"name":"warn_on_stray_hosts","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"raise a health warning if daemons are detected on a host that is not managed by 
cephadm","long_desc":"","tags":[],"see_also":[]}}},{"name":"crash","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"retain_interval":{"name":"retain_interval","type":"secs","level":"advanced","flags":1,"default_value":"31536000","min":"","max":"","enum_allowed":[],"desc":"how long to retain crashes before pruning them","long_desc":"","tags":[],"see_also":[]},"warn_recent_interval":{"name":"warn_recent_interval","type":"secs","level":"advanced","flags":1,"default_value":"1209600","min":"","max":"","enum_allowed":[],"desc":"time interval in which to warn about recent 
crashes","long_desc":"","tags":[],"see_also":[]}}},{"name":"dashboard","can_run":true,"error_string":"","module_options":{"ACCOUNT_LOCKOUT_ATTEMPTS":{"name":"ACCOUNT_LOCKOUT_ATTEMPTS","type":"int","level":"advanced","flags":0,"default_value":"10","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"ALERTMANAGER_API_HOST":{"name":"ALERTMANAGER_API_HOST","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"ALERTMANAGER_API_SSL_VERIFY":{"name":"ALERTMANAGER_API_SSL_VERIFY","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"AUDIT_API_ENABLED":{"name":"AUDIT_API_ENABLED","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"AUDIT_API_LOG_PAYLOAD":{"name":"AUDIT_API_LOG_PAYLOAD","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"ENABLE_BROWSABLE_API":{"name":"ENABLE_BROWSABLE_API","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"FEATURE_TOGGLE_CEPHFS":{"name":"FEATURE_TOGGLE_CEPHFS","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"FEATURE_TOGGLE_DASHBOARD":{"name":"FEATURE_TOGGLE_DASHBOARD","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"FEATURE_TOGGLE_ISCSI":{"name":"FEATURE_TOGGLE_ISCSI","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"
FEATURE_TOGGLE_MIRRORING":{"name":"FEATURE_TOGGLE_MIRRORING","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"FEATURE_TOGGLE_NFS":{"name":"FEATURE_TOGGLE_NFS","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"FEATURE_TOGGLE_RBD":{"name":"FEATURE_TOGGLE_RBD","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"FEATURE_TOGGLE_RGW":{"name":"FEATURE_TOGGLE_RGW","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"GANESHA_CLUSTERS_RADOS_POOL_NAMESPACE":{"name":"GANESHA_CLUSTERS_RADOS_POOL_NAMESPACE","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"GRAFANA_API_PASSWORD":{"name":"GRAFANA_API_PASSWORD","type":"str","level":"advanced","flags":0,"default_value":"admin","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"GRAFANA_API_SSL_VERIFY":{"name":"GRAFANA_API_SSL_VERIFY","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"GRAFANA_API_URL":{"name":"GRAFANA_API_URL","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"GRAFANA_API_USERNAME":{"name":"GRAFANA_API_USERNAME","type":"str","level":"advanced","flags":0,"default_value":"admin","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"GRAFANA_FRONTEND_API_URL":{"name":"GRAFANA_FRONTEND_API_URL","type":"str","level":"advanced","flags":0,"default_value":"","min":"","
max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"GRAFANA_UPDATE_DASHBOARDS":{"name":"GRAFANA_UPDATE_DASHBOARDS","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"ISCSI_API_SSL_VERIFICATION":{"name":"ISCSI_API_SSL_VERIFICATION","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"ISSUE_TRACKER_API_KEY":{"name":"ISSUE_TRACKER_API_KEY","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"PROMETHEUS_API_HOST":{"name":"PROMETHEUS_API_HOST","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"PROMETHEUS_API_SSL_VERIFY":{"name":"PROMETHEUS_API_SSL_VERIFY","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"PWD_POLICY_CHECK_COMPLEXITY_ENABLED":{"name":"PWD_POLICY_CHECK_COMPLEXITY_ENABLED","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"PWD_POLICY_CHECK_EXCLUSION_LIST_ENABLED":{"name":"PWD_POLICY_CHECK_EXCLUSION_LIST_ENABLED","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"PWD_POLICY_CHECK_LENGTH_ENABLED":{"name":"PWD_POLICY_CHECK_LENGTH_ENABLED","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"PWD_POLICY_CHECK_OLDPWD_ENABLED":{"name":"PWD_POLICY_CHECK_OLDPWD_ENABLED","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","
enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"PWD_POLICY_CHECK_REPETITIVE_CHARS_ENABLED":{"name":"PWD_POLICY_CHECK_REPETITIVE_CHARS_ENABLED","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"PWD_POLICY_CHECK_SEQUENTIAL_CHARS_ENABLED":{"name":"PWD_POLICY_CHECK_SEQUENTIAL_CHARS_ENABLED","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"PWD_POLICY_CHECK_USERNAME_ENABLED":{"name":"PWD_POLICY_CHECK_USERNAME_ENABLED","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"PWD_POLICY_ENABLED":{"name":"PWD_POLICY_ENABLED","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"PWD_POLICY_EXCLUSION_LIST":{"name":"PWD_POLICY_EXCLUSION_LIST","type":"str","level":"advanced","flags":0,"default_value":"osd,host,dashboard,pool,block,nfs,ceph,monitors,gateway,logs,crush,maps","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"PWD_POLICY_MIN_COMPLEXITY":{"name":"PWD_POLICY_MIN_COMPLEXITY","type":"int","level":"advanced","flags":0,"default_value":"10","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"PWD_POLICY_MIN_LENGTH":{"name":"PWD_POLICY_MIN_LENGTH","type":"int","level":"advanced","flags":0,"default_value":"8","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"REST_REQUESTS_TIMEOUT":{"name":"REST_REQUESTS_TIMEOUT","type":"int","level":"advanced","flags":0,"default_value":"45","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"RGW_API_ACCESS_KEY":{"name":"RGW_API_ACCESS_KEY","type":"str","level":"advanced","flags":0,"def
ault_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"RGW_API_ADMIN_RESOURCE":{"name":"RGW_API_ADMIN_RESOURCE","type":"str","level":"advanced","flags":0,"default_value":"admin","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"RGW_API_SECRET_KEY":{"name":"RGW_API_SECRET_KEY","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"RGW_API_SSL_VERIFY":{"name":"RGW_API_SSL_VERIFY","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"UNSAFE_TLS_v1_2":{"name":"UNSAFE_TLS_v1_2","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"USER_PWD_EXPIRATION_SPAN":{"name":"USER_PWD_EXPIRATION_SPAN","type":"int","level":"advanced","flags":0,"default_value":"0","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"USER_PWD_EXPIRATION_WARNING_1":{"name":"USER_PWD_EXPIRATION_WARNING_1","type":"int","level":"advanced","flags":0,"default_value":"10","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"USER_PWD_EXPIRATION_WARNING_2":{"name":"USER_PWD_EXPIRATION_WARNING_2","type":"int","level":"advanced","flags":0,"default_value":"5","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"cross_origin_url":{"name":"cross_origin_url","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"crt_file":{"name":"crt_file","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"debug":{"name":"debug","type":"bool","level":"advanced","flags":0,"defa
ult_value":"False","min":"","max":"","enum_allowed":[],"desc":"Enable/disable debug options","long_desc":"","tags":[],"see_also":[]},"jwt_token_ttl":{"name":"jwt_token_ttl","type":"int","level":"advanced","flags":0,"default_value":"28800","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"key_file":{"name":"key_file","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"motd":{"name":"motd","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"The message of the 
day","long_desc":"","tags":[],"see_also":[]},"redirect_resolve_ip_addr":{"name":"redirect_resolve_ip_addr","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"server_addr":{"name":"server_addr","type":"str","level":"advanced","flags":0,"default_value":"::","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"server_port":{"name":"server_port","type":"int","level":"advanced","flags":0,"default_value":"8080","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"ssl":{"name":"ssl","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"ssl_server_port":{"name":"ssl_server_port","type":"int","level":"advanced","flags":0,"default_value":"8443","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"standby_behaviour":{"name":"standby_behaviour","type":"str","level":"advanced","flags":0,"default_value":"redirect","min":"","max":"","enum_allowed":["error","redirect"],"desc":"","long_desc":"","tags":[],"see_also":[]},"standby_error_status_code":{"name":"standby_error_status_code","type":"int","level":"advanced","flags":0,"default_value":"500","min":"400","max":"599","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"url_prefix":{"name":"url_prefix","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"devicehealth","can_run":true,"error_string":"","module_options":{"enable_monitoring":{"name":"enable_monitoring","type":"bool","level":"advanced","flags":1,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"monitor device health 
metrics","long_desc":"","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"mark_out_threshold":{"name":"mark_out_threshold","type":"secs","level":"advanced","flags":1,"default_value":"2419200","min":"","max":"","enum_allowed":[],"desc":"automatically mark OSD if it may fail before this long","long_desc":"","tags":[],"see_also":[]},"pool_name":{"name":"pool_name","type":"str","level":"advanced","flags":1,"default_value":"device_health_metrics","min":"","max":"","enum_allowed":[],"desc":"name of pool in which to store device health metrics","long_desc":"","tags":[],"see_also":[]},"retention_period":{"name":"retention_period","type":"secs","level":"advanced","flags":1,"default_value":"15552000","min":"","max":"","enum_allowed":[],"desc":"how long to retain device health metrics","long_desc":"","tags":[],"see_also":[]},"scrape_frequency":{"name":"scrape_frequency","type":"secs","level":"advanced","flags":1,"default_value":"86400","min":"","max":"","enum_allowed":[],"desc":"how frequently to scrape device health 
metrics","long_desc":"","tags":[],"see_also":[]},"self_heal":{"name":"self_heal","type":"bool","level":"advanced","flags":1,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"preemptively heal cluster around devices that may fail","long_desc":"","tags":[],"see_also":[]},"sleep_interval":{"name":"sleep_interval","type":"secs","level":"advanced","flags":1,"default_value":"600","min":"","max":"","enum_allowed":[],"desc":"how frequently to wake up and check device health","long_desc":"","tags":[],"see_also":[]},"warn_threshold":{"name":"warn_threshold","type":"secs","level":"advanced","flags":1,"default_value":"7257600","min":"","max":"","enum_allowed":[],"desc":"raise health warning if OSD may fail before this long","long_desc":"","tags":[],"see_also":[]}}},{"name":"diskprediction_local","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"predict_interval":{"name":"predict_interval","type":"str","level":"advanced","flags":0,"default_value":"86400","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"predictor_model":{"name":"predictor_model","type":"str","level":"advanced","fl
ags":0,"default_value":"prophetstor","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"sleep_interval":{"name":"sleep_interval","type":"str","level":"advanced","flags":0,"default_value":"600","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"influx","can_run":false,"error_string":"influxdb python module not found","module_options":{"batch_size":{"name":"batch_size","type":"int","level":"advanced","flags":0,"default_value":"5000","min":"","max":"","enum_allowed":[],"desc":"How big batches of data points should be when sending to InfluxDB.","long_desc":"","tags":[],"see_also":[]},"database":{"name":"database","type":"str","level":"advanced","flags":0,"default_value":"ceph","min":"","max":"","enum_allowed":[],"desc":"InfluxDB database name. You will need to create this database and grant write privileges to the configured username or the username must have admin privileges to create it.","long_desc":"","tags":[],"see_also":[]},"hostname":{"name":"hostname","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"InfluxDB server hostname","long_desc":"","tags":[],"see_also":[]},"interval":{"name":"interval","type":"secs","level":"advanced","flags":0,"default_value":"30","min":"5","max":"","enum_allowed":[],"desc":"Time between reports to InfluxDB. 
Default 30 seconds.","long_desc":"","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"password":{"name":"password","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"password of InfluxDB server user","long_desc":"","tags":[],"see_also":[]},"port":{"name":"port","type":"int","level":"advanced","flags":0,"default_value":"8086","min":"","max":"","enum_allowed":[],"desc":"InfluxDB server port","long_desc":"","tags":[],"see_also":[]},"ssl":{"name":"ssl","type":"str","level":"advanced","flags":0,"default_value":"false","min":"","max":"","enum_allowed":[],"desc":"Use https connection for InfluxDB server. 
Use \"true\" or \"false\".","long_desc":"","tags":[],"see_also":[]},"threads":{"name":"threads","type":"int","level":"advanced","flags":0,"default_value":"5","min":"1","max":"32","enum_allowed":[],"desc":"How many worker threads should be spawned for sending data to InfluxDB.","long_desc":"","tags":[],"see_also":[]},"username":{"name":"username","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"username of InfluxDB server user","long_desc":"","tags":[],"see_also":[]},"verify_ssl":{"name":"verify_ssl","type":"str","level":"advanced","flags":0,"default_value":"true","min":"","max":"","enum_allowed":[],"desc":"Verify https cert for InfluxDB server. Use \"true\" or \"false\".","long_desc":"","tags":[],"see_also":[]}}},{"name":"insights","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"iostat","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":
[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"k8sevents","can_run":true,"error_string":"","module_options":{"ceph_event_retention_days":{"name":"ceph_event_retention_days","type":"int","level":"advanced","flags":0,"default_value":"7","min":"","max":"","enum_allowed":[],"desc":"Days to hold ceph event information within local cache","long_desc":"","tags":[],"see_also":[]},"config_check_secs":{"name":"config_check_secs","type":"int","level":"advanced","flags":0,"default_value":"10","min":"10","max":"","enum_allowed":[],"desc":"interval (secs) to check for cluster configuration 
changes","long_desc":"","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"localpool","can_run":true,"error_string":"","module_options":{"failure_domain":{"name":"failure_domain","type":"str","level":"advanced","flags":1,"default_value":"host","min":"","max":"","enum_allowed":[],"desc":"failure domain for any created local pool","long_desc":"what failure domain we should separate data replicas 
across.","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"min_size":{"name":"min_size","type":"int","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"default min_size for any created local pool","long_desc":"value to set min_size to (unchanged from Ceph's default if this option is not set)","tags":[],"see_also":[]},"num_rep":{"name":"num_rep","type":"int","level":"advanced","flags":1,"default_value":"3","min":"","max":"","enum_allowed":[],"desc":"default replica count for any created local pool","long_desc":"","tags":[],"see_also":[]},"pg_num":{"name":"pg_num","type":"int","level":"advanced","flags":1,"default_value":"128","min":"","max":"","enum_allowed":[],"desc":"default pg_num for any created local pool","long_desc":"","tags":[],"see_also":[]},"prefix":{"name":"prefix","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"name prefix for any created local pool","long_desc":"","tags":[],"see_also":[]},"subtree":{"name":"subtree","type":"str","level":"advanced","flags":1,"default_value":"rack","min":"","max":"","enum_allowed":[],"desc":"CRUSH level 
for which to create a local pool","long_desc":"which CRUSH subtree type the module should create a pool for.","tags":[],"see_also":[]}}},{"name":"mds_autoscaler","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"mirroring","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also
":[]}}},{"name":"nfs","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"orchestrator","can_run":true,"error_string":"","module_options":{"fail_fs":{"name":"fail_fs","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"Fail filesystem for rapid multi-rank mds 
upgrade","long_desc":"","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"orchestrator":{"name":"orchestrator","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["cephadm","rook","test_orchestrator"],"desc":"Orchestrator 
backend","long_desc":"","tags":[],"see_also":[]}}},{"name":"osd_perf_query","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"osd_support","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"pg_autoscaler","can_run":true,"error_string":"","module_options":{"
log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"sleep_interval":{"name":"sleep_interval","type":"secs","level":"advanced","flags":0,"default_value":"60","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"threshold":{"name":"threshold","type":"float","level":"advanced","flags":0,"default_value":"3.0","min":"1.0","max":"","enum_allowed":[],"desc":"scaling threshold","long_desc":"The factor by which the `NEW PG_NUM` must vary from the current`PG_NUM` before being accepted. 
Cannot be less than 1.0","tags":[],"see_also":[]}}},{"name":"progress","can_run":true,"error_string":"","module_options":{"allow_pg_recovery_event":{"name":"allow_pg_recovery_event","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"allow the module to show pg recovery progress","long_desc":"","tags":[],"see_also":[]},"enabled":{"name":"enabled","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"max_completed_events":{"name":"max_completed_events","type":"int","level":"advanced","flags":1,"default_value":"50","min":"","max":"","enum_allowed":[],"desc":"number of past completed events to remember","long_desc":"","tags":[],"see_also":[]},"sleep_interval":{"name":"sleep_interval","type":"secs","level":"advanced","flags":1,"default_value":"5","min":"","max":"","enum_allowed":[],"desc":"how long the module is going to 
sleep","long_desc":"","tags":[],"see_also":[]}}},{"name":"prometheus","can_run":true,"error_string":"","module_options":{"cache":{"name":"cache","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"exclude_perf_counters":{"name":"exclude_perf_counters","type":"bool","level":"advanced","flags":1,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"Do not include perf-counters in the metrics output","long_desc":"Gathering perf-counters from a single Prometheus exporter can degrade ceph-mgr performance, especially in large clusters. Instead, Ceph-exporter daemons are now used by default for perf-counter gathering. This should only be disabled when no ceph-exporters are deployed.","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"rbd_stats_pools":{"name":"rbd_stats_pools","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"rbd_stats_pools_refresh_interval":{"name":"rbd_stats_pools_refresh_interval","type":"int","level":"advanced","flags":0,"def
ault_value":"300","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"scrape_interval":{"name":"scrape_interval","type":"float","level":"advanced","flags":0,"default_value":"15.0","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"server_addr":{"name":"server_addr","type":"str","level":"advanced","flags":0,"default_value":"::","min":"","max":"","enum_allowed":[],"desc":"the IPv4 or IPv6 address on which the module listens for HTTP requests","long_desc":"","tags":[],"see_also":[]},"server_port":{"name":"server_port","type":"int","level":"advanced","flags":1,"default_value":"9283","min":"","max":"","enum_allowed":[],"desc":"the port on which the module listens for HTTP requests","long_desc":"","tags":[],"see_also":[]},"stale_cache_strategy":{"name":"stale_cache_strategy","type":"str","level":"advanced","flags":0,"default_value":"log","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"standby_behaviour":{"name":"standby_behaviour","type":"str","level":"advanced","flags":1,"default_value":"default","min":"","max":"","enum_allowed":["default","error"],"desc":"","long_desc":"","tags":[],"see_also":[]},"standby_error_status_code":{"name":"standby_error_status_code","type":"int","level":"advanced","flags":1,"default_value":"500","min":"400","max":"599","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"rbd_support","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str
","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"max_concurrent_snap_create":{"name":"max_concurrent_snap_create","type":"int","level":"advanced","flags":0,"default_value":"10","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"mirror_snapshot_schedule":{"name":"mirror_snapshot_schedule","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"trash_purge_schedule":{"name":"trash_purge_schedule","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"restful","can_run":true,"error_string":"","module_options":{"enable_auth":{"name":"enable_auth","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"key_file":{"name":"key_file","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_a
llowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"server_addr":{"name":"server_addr","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"server_port":{"name":"server_port","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"rgw","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"rook","can_run":true,"error_string":"","module_options":{"drive_group_interval":{"name":"drive_group_interval","type":"float","level":"advanced","flags":0,"default_value":"300.0","min":"","max":"","enum_allowed":[],"desc":"interval in seconds between re-application of applied 
drive_groups","long_desc":"","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"storage_class":{"name":"storage_class","type":"str","level":"advanced","flags":0,"default_value":"local","min":"","max":"","enum_allowed":[],"desc":"storage class name for LSO-discovered 
PVs","long_desc":"","tags":[],"see_also":[]}}},{"name":"selftest","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"roption1":{"name":"roption1","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"roption2":{"name":"roption2","type":"str","level":"advanced","flags":0,"default_value":"xyz","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"rwoption1":{"name":"rwoption1","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"rwoption2":{"name":"rwoption2","type":"int","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"rwoption3":{"name":"rwoption3","type":"float","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"rwoption4":{"name":"rwoption4","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[
],"desc":"","long_desc":"","tags":[],"see_also":[]},"rwoption5":{"name":"rwoption5","type":"bool","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"rwoption6":{"name":"rwoption6","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"rwoption7":{"name":"rwoption7","type":"int","level":"advanced","flags":0,"default_value":"","min":"1","max":"42","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"testkey":{"name":"testkey","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"testlkey":{"name":"testlkey","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"testnewline":{"name":"testnewline","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"snap_schedule","can_run":true,"error_string":"","module_options":{"allow_m_granularity":{"name":"allow_m_granularity","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"allow minute scheduled snapshots","long_desc":"","tags":[],"see_also":[]},"dump_on_update":{"name":"dump_on_update","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"dump database to debug log on 
update","long_desc":"","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"stats","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"status","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"",
"min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"telegraf","can_run":true,"error_string":"","module_options":{"address":{"name":"address","type":"str","level":"advanced","flags":0,"default_value":"unixgram:///tmp/telegraf.sock","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"interval":{"name":"interval","type":"secs","level":"advanced","flags":0,"default_value":"15","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","mi
n":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"telemetry","can_run":true,"error_string":"","module_options":{"channel_basic":{"name":"channel_basic","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"Share basic cluster information (size, version)","long_desc":"","tags":[],"see_also":[]},"channel_crash":{"name":"channel_crash","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"Share metadata about Ceph daemon crashes (version, stack straces, etc)","long_desc":"","tags":[],"see_also":[]},"channel_device":{"name":"channel_device","type":"bool","level":"advanced","flags":0,"default_value":"True","min":"","max":"","enum_allowed":[],"desc":"Share device health metrics (e.g., SMART data, minus potentially identifying info like serial numbers)","long_desc":"","tags":[],"see_also":[]},"channel_ident":{"name":"channel_ident","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"Share a user-provided description and/or contact email for the cluster","long_desc":"","tags":[],"see_also":[]},"channel_perf":{"name":"channel_perf","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"Share various performance metrics of a 
cluster","long_desc":"","tags":[],"see_also":[]},"contact":{"name":"contact","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"description":{"name":"description","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"device_url":{"name":"device_url","type":"str","level":"advanced","flags":0,"default_value":"https://telemetry.ceph.com/device","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"enabled":{"name":"enabled","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"interval":{"name":"interval","type":"int","level":"advanced","flags":0,"default_value":"24","min":"8","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"last_opt_revision":{"name":"last_opt_revision","type":"int","level":"advanced","flags":0,"default_value":"1","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"leaderboard":{"name":"leaderboard","type":"bool","level":"advanced","flags":0,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"leaderboard_description":{"name":"leaderboard_description","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_t
o_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"organization":{"name":"organization","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"proxy":{"name":"proxy","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"url":{"name":"url","type":"str","level":"advanced","flags":0,"default_value":"https://telemetry.ceph.com/report","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"test_orchestrator","can_run":true,"error_string":"","module_options":{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}},{"name":"volumes","can_run":true,"error_string":"","module_options"
:{"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"max_concurrent_clones":{"name":"max_concurrent_clones","type":"int","level":"advanced","flags":0,"default_value":"4","min":"","max":"","enum_allowed":[],"desc":"Number of asynchronous cloner threads","long_desc":"","tags":[],"see_also":[]},"snapshot_clone_delay":{"name":"snapshot_clone_delay","type":"int","level":"advanced","flags":0,"default_value":"0","min":"","max":"","enum_allowed":[],"desc":"Delay clone begin operation by snapshot_clone_delay 
seconds","long_desc":"","tags":[],"see_also":[]}}},{"name":"zabbix","can_run":true,"error_string":"","module_options":{"discovery_interval":{"name":"discovery_interval","type":"uint","level":"advanced","flags":0,"default_value":"100","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"identifier":{"name":"identifier","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"interval":{"name":"interval","type":"secs","level":"advanced","flags":0,"default_value":"60","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_level":{"name":"log_level","type":"str","level":"advanced","flags":1,"default_value":"","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster":{"name":"log_to_cluster","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_cluster_level":{"name":"log_to_cluster_level","type":"str","level":"advanced","flags":1,"default_value":"info","min":"","max":"","enum_allowed":["","critical","debug","error","info","warning"],"desc":"","long_desc":"","tags":[],"see_also":[]},"log_to_file":{"name":"log_to_file","type":"bool","level":"advanced","flags":1,"default_value":"False","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"zabbix_host":{"name":"zabbix_host","type":"str","level":"advanced","flags":0,"default_value":"","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"zabbix_port":{"name":"zabbix_port","type":"int","level":"advanced","flags":0,"default_value":"10051","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]},"zabbix_sender":{"name":"zabbix_sender","type":"str","level":"advanced","flags":0,"defau
lt_value":"/usr/bin/zabbix_sender","min":"","max":"","enum_allowed":[],"desc":"","long_desc":"","tags":[],"see_also":[]}}}],"services":{"dashboard":"https://192.168.123.105:8443/","prometheus":"http://192.168.123.105:9283/"},"always_on_modules":{"octopus":["balancer","crash","devicehealth","orchestrator","pg_autoscaler","progress","rbd_support","status","telemetry","volumes"],"pacific":["balancer","crash","devicehealth","orchestrator","pg_autoscaler","progress","rbd_support","status","telemetry","volumes"],"quincy":["balancer","crash","devicehealth","orchestrator","pg_autoscaler","progress","rbd_support","status","telemetry","volumes"],"reef":["balancer","crash","devicehealth","orchestrator","pg_autoscaler","progress","rbd_support","status","telemetry","volumes"]},"last_failure_osd_epoch":5,"active_clients":[{"name":"devicehealth","addrvec":[{"type":"v2","addr":"192.168.123.105:0","nonce":177716193}]},{"name":"libcephsqlite","addrvec":[{"type":"v2","addr":"192.168.123.105:0","nonce":3448907780}]},{"name":"rbd_support","addrvec":[{"type":"v2","addr":"192.168.123.105:0","nonce":2233014865}]},{"name":"volumes","addrvec":[{"type":"v2","addr":"192.168.123.105:0","nonce":3200565544}]}]} 2026-03-10T07:48:09.385 INFO:tasks.cephadm.ceph_manager.ceph:mgr available! 
2026-03-10T07:48:09.385 INFO:tasks.cephadm.ceph_manager.ceph:waiting for all up 2026-03-10T07:48:09.385 DEBUG:teuthology.orchestra.run.vm05:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell --fsid 12e9780e-1c55-11f1-8896-79f7c2e9b508 -- ceph osd dump --format=json 2026-03-10T07:48:09.527 INFO:teuthology.orchestra.run.vm05.stderr:Inferring config /var/lib/ceph/12e9780e-1c55-11f1-8896-79f7c2e9b508/mon.vm05/config 2026-03-10T07:48:09.768 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:09.765+0000 7f1054399700 1 -- 192.168.123.105:0/3394416640 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f104c073130 msgr2=0x7f104c073510 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:48:09.768 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:09.765+0000 7f1054399700 1 --2- 192.168.123.105:0/3394416640 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f104c073130 0x7f104c073510 secure :-1 s=READY pgs=210 cs=0 l=1 rev1=1 crypto rx=0x7f103c009b00 tx=0x7f103c009e10 comp rx=0 tx=0).stop 2026-03-10T07:48:09.768 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:09.766+0000 7f1054399700 1 -- 192.168.123.105:0/3394416640 shutdown_connections 2026-03-10T07:48:09.768 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:09.766+0000 7f1054399700 1 --2- 192.168.123.105:0/3394416640 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f104c073a50 0x7f104c111940 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:48:09.768 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:09.766+0000 7f1054399700 1 --2- 192.168.123.105:0/3394416640 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f104c073130 0x7f104c073510 unknown :-1 s=CLOSED pgs=210 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:48:09.768 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:09.766+0000 7f1054399700 1 -- 
192.168.123.105:0/3394416640 >> 192.168.123.105:0/3394416640 conn(0x7f104c0fc920 msgr2=0x7f104c0fed40 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:48:09.768 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:09.766+0000 7f1054399700 1 -- 192.168.123.105:0/3394416640 shutdown_connections 2026-03-10T07:48:09.768 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:09.766+0000 7f1054399700 1 -- 192.168.123.105:0/3394416640 wait complete. 2026-03-10T07:48:09.769 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:09.767+0000 7f1054399700 1 Processor -- start 2026-03-10T07:48:09.769 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:09.767+0000 7f1054399700 1 -- start start 2026-03-10T07:48:09.769 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:09.767+0000 7f1054399700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f104c073130 0x7f104c19d110 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:48:09.769 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:09.767+0000 7f1054399700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f104c073a50 0x7f104c19d650 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:48:09.769 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:09.767+0000 7f1054399700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f104c19dd30 con 0x7f104c073a50 2026-03-10T07:48:09.769 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:09.768+0000 7f1054399700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f104c1a1ac0 con 0x7f104c073130 2026-03-10T07:48:09.770 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:09.768+0000 7f1051934700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f104c073a50 0x7f104c19d650 unknown :-1 
s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:48:09.770 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:09.768+0000 7f1051934700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f104c073a50 0x7f104c19d650 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.105:38318/0 (socket says 192.168.123.105:38318) 2026-03-10T07:48:09.770 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:09.768+0000 7f1051934700 1 -- 192.168.123.105:0/757863316 learned_addr learned my addr 192.168.123.105:0/757863316 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T07:48:09.770 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:09.768+0000 7f1052135700 1 --2- 192.168.123.105:0/757863316 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f104c073130 0x7f104c19d110 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:48:09.770 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:09.768+0000 7f1051934700 1 -- 192.168.123.105:0/757863316 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f104c073130 msgr2=0x7f104c19d110 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:48:09.770 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:09.768+0000 7f1051934700 1 --2- 192.168.123.105:0/757863316 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f104c073130 0x7f104c19d110 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:48:09.770 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:09.768+0000 7f1051934700 1 -- 192.168.123.105:0/757863316 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- 
mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f103c0097e0 con 0x7f104c073a50 2026-03-10T07:48:09.770 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:09.769+0000 7f1052135700 1 --2- 192.168.123.105:0/757863316 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f104c073130 0x7f104c19d110 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_auth_done state changed! 2026-03-10T07:48:09.771 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:09.769+0000 7f1051934700 1 --2- 192.168.123.105:0/757863316 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f104c073a50 0x7f104c19d650 secure :-1 s=READY pgs=211 cs=0 l=1 rev1=1 crypto rx=0x7f104800eb10 tx=0x7f104800ee20 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:48:09.771 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:09.769+0000 7f10437fe700 1 -- 192.168.123.105:0/757863316 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f104800cc40 con 0x7f104c073a50 2026-03-10T07:48:09.771 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:09.769+0000 7f10437fe700 1 -- 192.168.123.105:0/757863316 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f104800cda0 con 0x7f104c073a50 2026-03-10T07:48:09.771 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:09.769+0000 7f1054399700 1 -- 192.168.123.105:0/757863316 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f104c1a1da0 con 0x7f104c073a50 2026-03-10T07:48:09.772 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:09.769+0000 7f10437fe700 1 -- 192.168.123.105:0/757863316 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f1048018810 con 0x7f104c073a50 2026-03-10T07:48:09.773 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:09.769+0000 
7f1054399700 1 -- 192.168.123.105:0/757863316 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f104c1a22f0 con 0x7f104c073a50 2026-03-10T07:48:09.773 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:09.771+0000 7f10437fe700 1 -- 192.168.123.105:0/757863316 <== mon.0 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 19) v1 ==== 90308+0+0 (secure 0 0 0) 0x7f1048018a80 con 0x7f104c073a50 2026-03-10T07:48:09.773 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:09.771+0000 7f10437fe700 1 --2- 192.168.123.105:0/757863316 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f103806c550 0x7f103806ea10 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:48:09.773 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:09.771+0000 7f10437fe700 1 -- 192.168.123.105:0/757863316 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(33..33 src has 1..33) v4 ==== 4446+0+0 (secure 0 0 0) 0x7f1048014070 con 0x7f104c073a50 2026-03-10T07:48:09.774 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:09.772+0000 7f1054399700 1 -- 192.168.123.105:0/757863316 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f104c1a1f30 con 0x7f104c073a50 2026-03-10T07:48:09.774 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:09.772+0000 7f1052135700 1 --2- 192.168.123.105:0/757863316 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f103806c550 0x7f103806ea10 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:48:09.777 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:09.775+0000 7f10437fe700 1 -- 192.168.123.105:0/757863316 <== mon.0 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 
0x7f104c1a1f30 con 0x7f104c073a50 2026-03-10T07:48:09.779 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:09.778+0000 7f1052135700 1 --2- 192.168.123.105:0/757863316 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f103806c550 0x7f103806ea10 secure :-1 s=READY pgs=71 cs=0 l=1 rev1=1 crypto rx=0x7f103c00b5c0 tx=0x7f103c005fb0 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:48:09.867 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:48:09 vm05 ceph-mon[50387]: pgmap v64: 1 pgs: 1 active+clean; 449 KiB data, 160 MiB used, 120 GiB / 120 GiB avail 2026-03-10T07:48:09.867 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:48:09 vm05 ceph-mon[50387]: from='client.? 192.168.123.105:0/3467897404' entity='client.admin' cmd=[{"prefix": "mgr dump", "format": "json"}]: dispatch 2026-03-10T07:48:09.886 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:09.883+0000 7f1054399700 1 -- 192.168.123.105:0/757863316 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "osd dump", "format": "json"} v 0) v1 -- 0x7f104c1a1f30 con 0x7f104c073a50 2026-03-10T07:48:09.886 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:09.884+0000 7f10437fe700 1 -- 192.168.123.105:0/757863316 <== mon.0 v2:192.168.123.105:3300/0 7 ==== mon_command_ack([{"prefix": "osd dump", "format": "json"}]=0 v33) v1 ==== 74+0+11259 (secure 0 0 0) 0x7f104c1a1f30 con 0x7f104c073a50 2026-03-10T07:48:09.886 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-10T07:48:09.886 
INFO:teuthology.orchestra.run.vm05.stdout:{"epoch":33,"fsid":"12e9780e-1c55-11f1-8896-79f7c2e9b508","created":"2026-03-10T07:45:40.905180+0000","modified":"2026-03-10T07:48:06.849648+0000","last_up_change":"2026-03-10T07:48:05.841179+0000","last_in_change":"2026-03-10T07:47:54.023889+0000","flags":"sortbitwise,recovery_deletes,purged_snapdirs,pglog_hardlimit","flags_num":5799936,"flags_set":["pglog_hardlimit","purged_snapdirs","recovery_deletes","sortbitwise"],"crush_version":14,"full_ratio":0.94999998807907104,"backfillfull_ratio":0.89999997615814209,"nearfull_ratio":0.85000002384185791,"cluster_snapshot":"","pool_max":1,"max_osd":6,"require_min_compat_client":"luminous","min_compat_client":"jewel","require_osd_release":"reef","allow_crimson":false,"pools":[{"pool":1,"pool_name":".mgr","create_time":"2026-03-10T07:47:37.474083+0000","flags":1,"flags_names":"hashpspool","type":1,"size":3,"min_size":2,"crush_rule":0,"peering_crush_bucket_count":0,"peering_crush_bucket_target":0,"peering_crush_bucket_barrier":0,"peering_crush_bucket_mandatory_member":2147483647,"object_hash":2,"pg_autoscale_mode":"off","pg_num":1,"pg_placement_num":1,"pg_placement_num_target":1,"pg_num_target":1,"pg_num_pending":1,"last_pg_merge_meta":{"source_pgid":"0.0","ready_epoch":0,"last_epoch_started":0,"last_epoch_clean":0,"source_version":"0'0","target_version":"0'0"},"last_change":"20","last_force_op_resend":"0","last_force_op_resend_prenautilus":"0","last_force_op_resend_preluminous":"0","auid":0,"snap_mode":"selfmanaged","snap_seq":0,"snap_epoch":0,"pool_snaps":[],"removed_snaps":"[]","quota_max_bytes":0,"quota_max_objects":0,"tiers":[],"tier_of":-1,"read_tier":-1,"write_tier":-1,"cache_mode":"none","target_max_bytes":0,"target_max_objects":0,"cache_target_dirty_ratio_micro":400000,"cache_target_dirty_high_ratio_micro":600000,"cache_target_full_ratio_micro":800000,"cache_min_flush_age":0,"cache_min_evict_age":0,"erasure_code_profile":"","hit_set_params":{"type":"none"},"hit_set_period":0,"
hit_set_count":0,"use_gmt_hitset":true,"min_read_recency_for_promote":0,"min_write_recency_for_promote":0,"hit_set_grade_decay_rate":0,"hit_set_search_last_n":0,"grade_table":[],"stripe_width":0,"expected_num_objects":0,"fast_read":false,"options":{"pg_num_max":32,"pg_num_min":1},"application_metadata":{"mgr":{}},"read_balance":{"score_acting":6,"score_stable":6,"optimal_score":0.5,"raw_score_acting":3,"raw_score_stable":3,"primary_affinity_weighted":1,"average_primary_affinity":1,"average_primary_affinity_weighted":1}}],"osds":[{"osd":0,"uuid":"b7030a1b-b939-419d-85b0-9e818f756cd8","up":1,"in":1,"weight":1,"primary_affinity":1,"last_clean_begin":0,"last_clean_end":0,"up_from":9,"up_thru":0,"down_at":0,"lost_at":0,"public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.105:6802","nonce":196713793},{"type":"v1","addr":"192.168.123.105:6803","nonce":196713793}]},"cluster_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.105:6804","nonce":196713793},{"type":"v1","addr":"192.168.123.105:6805","nonce":196713793}]},"heartbeat_back_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.105:6808","nonce":196713793},{"type":"v1","addr":"192.168.123.105:6809","nonce":196713793}]},"heartbeat_front_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.105:6806","nonce":196713793},{"type":"v1","addr":"192.168.123.105:6807","nonce":196713793}]},"public_addr":"192.168.123.105:6803/196713793","cluster_addr":"192.168.123.105:6805/196713793","heartbeat_back_addr":"192.168.123.105:6809/196713793","heartbeat_front_addr":"192.168.123.105:6807/196713793","state":["exists","up"]},{"osd":1,"uuid":"725f8dca-94da-4c18-aefa-9e9f529cccd4","up":1,"in":1,"weight":1,"primary_affinity":1,"last_clean_begin":0,"last_clean_end":0,"up_from":13,"up_thru":24,"down_at":0,"lost_at":0,"public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.105:6810","nonce":1508516757},{"type":"v1","addr":"192.168.123.105:6811","nonce":1508516757}]},"cluster_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.105:6812
","nonce":1508516757},{"type":"v1","addr":"192.168.123.105:6813","nonce":1508516757}]},"heartbeat_back_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.105:6816","nonce":1508516757},{"type":"v1","addr":"192.168.123.105:6817","nonce":1508516757}]},"heartbeat_front_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.105:6814","nonce":1508516757},{"type":"v1","addr":"192.168.123.105:6815","nonce":1508516757}]},"public_addr":"192.168.123.105:6811/1508516757","cluster_addr":"192.168.123.105:6813/1508516757","heartbeat_back_addr":"192.168.123.105:6817/1508516757","heartbeat_front_addr":"192.168.123.105:6815/1508516757","state":["exists","up"]},{"osd":2,"uuid":"ef60316b-3b61-4644-aa31-97cef548ba7e","up":1,"in":1,"weight":1,"primary_affinity":1,"last_clean_begin":0,"last_clean_end":0,"up_from":17,"up_thru":0,"down_at":0,"lost_at":0,"public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.105:6818","nonce":4116227648},{"type":"v1","addr":"192.168.123.105:6819","nonce":4116227648}]},"cluster_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.105:6820","nonce":4116227648},{"type":"v1","addr":"192.168.123.105:6821","nonce":4116227648}]},"heartbeat_back_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.105:6824","nonce":4116227648},{"type":"v1","addr":"192.168.123.105:6825","nonce":4116227648}]},"heartbeat_front_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.105:6822","nonce":4116227648},{"type":"v1","addr":"192.168.123.105:6823","nonce":4116227648}]},"public_addr":"192.168.123.105:6819/4116227648","cluster_addr":"192.168.123.105:6821/4116227648","heartbeat_back_addr":"192.168.123.105:6825/4116227648","heartbeat_front_addr":"192.168.123.105:6823/4116227648","state":["exists","up"]},{"osd":3,"uuid":"4cc7cd7e-7756-40b9-9bc8-029e26495239","up":1,"in":1,"weight":1,"primary_affinity":1,"last_clean_begin":0,"last_clean_end":0,"up_from":23,"up_thru":27,"down_at":0,"lost_at":0,"public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.108:6800","nonce":2127000679},{"type"
:"v1","addr":"192.168.123.108:6801","nonce":2127000679}]},"cluster_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.108:6802","nonce":2127000679},{"type":"v1","addr":"192.168.123.108:6803","nonce":2127000679}]},"heartbeat_back_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.108:6806","nonce":2127000679},{"type":"v1","addr":"192.168.123.108:6807","nonce":2127000679}]},"heartbeat_front_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.108:6804","nonce":2127000679},{"type":"v1","addr":"192.168.123.108:6805","nonce":2127000679}]},"public_addr":"192.168.123.108:6801/2127000679","cluster_addr":"192.168.123.108:6803/2127000679","heartbeat_back_addr":"192.168.123.108:6807/2127000679","heartbeat_front_addr":"192.168.123.108:6805/2127000679","state":["exists","up"]},{"osd":4,"uuid":"efa282d4-a651-4903-bdb8-2e148038e567","up":1,"in":1,"weight":1,"primary_affinity":1,"last_clean_begin":0,"last_clean_end":0,"up_from":28,"up_thru":0,"down_at":0,"lost_at":0,"public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.108:6808","nonce":551946261},{"type":"v1","addr":"192.168.123.108:6809","nonce":551946261}]},"cluster_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.108:6810","nonce":551946261},{"type":"v1","addr":"192.168.123.108:6811","nonce":551946261}]},"heartbeat_back_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.108:6814","nonce":551946261},{"type":"v1","addr":"192.168.123.108:6815","nonce":551946261}]},"heartbeat_front_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.108:6812","nonce":551946261},{"type":"v1","addr":"192.168.123.108:6813","nonce":551946261}]},"public_addr":"192.168.123.108:6809/551946261","cluster_addr":"192.168.123.108:6811/551946261","heartbeat_back_addr":"192.168.123.108:6815/551946261","heartbeat_front_addr":"192.168.123.108:6813/551946261","state":["exists","up"]},{"osd":5,"uuid":"ef1ae183-14f2-4400-85d6-c48b79ef2819","up":1,"in":1,"weight":1,"primary_affinity":1,"last_clean_begin":0,"last_clean_end":0,"up_from":32,"up_thru":0,"down_a
t":0,"lost_at":0,"public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.108:6816","nonce":2677071054},{"type":"v1","addr":"192.168.123.108:6817","nonce":2677071054}]},"cluster_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.108:6818","nonce":2677071054},{"type":"v1","addr":"192.168.123.108:6819","nonce":2677071054}]},"heartbeat_back_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.108:6822","nonce":2677071054},{"type":"v1","addr":"192.168.123.108:6823","nonce":2677071054}]},"heartbeat_front_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.108:6820","nonce":2677071054},{"type":"v1","addr":"192.168.123.108:6821","nonce":2677071054}]},"public_addr":"192.168.123.108:6817/2677071054","cluster_addr":"192.168.123.108:6819/2677071054","heartbeat_back_addr":"192.168.123.108:6823/2677071054","heartbeat_front_addr":"192.168.123.108:6821/2677071054","state":["exists","up"]}],"osd_xinfo":[{"osd":0,"down_stamp":"0.000000","laggy_probability":0,"laggy_interval":0,"features":4540138322906710015,"old_weight":0,"last_purged_snaps_scrub":"2026-03-10T07:47:16.699601+0000","dead_epoch":0},{"osd":1,"down_stamp":"0.000000","laggy_probability":0,"laggy_interval":0,"features":4540138322906710015,"old_weight":0,"last_purged_snaps_scrub":"2026-03-10T07:47:25.867149+0000","dead_epoch":0},{"osd":2,"down_stamp":"0.000000","laggy_probability":0,"laggy_interval":0,"features":4540138322906710015,"old_weight":0,"last_purged_snaps_scrub":"2026-03-10T07:47:36.171124+0000","dead_epoch":0},{"osd":3,"down_stamp":"0.000000","laggy_probability":0,"laggy_interval":0,"features":4540138322906710015,"old_weight":0,"last_purged_snaps_scrub":"2026-03-10T07:47:44.953296+0000","dead_epoch":0},{"osd":4,"down_stamp":"0.000000","laggy_probability":0,"laggy_interval":0,"features":4540138322906710015,"old_weight":0,"last_purged_snaps_scrub":"2026-03-10T07:47:53.656121+0000","dead_epoch":0},{"osd":5,"down_stamp":"0.000000","laggy_probability":0,"laggy_interval":0,"features":4540138322906710015,"old_weight
":0,"last_purged_snaps_scrub":"2026-03-10T07:48:04.028751+0000","dead_epoch":0}],"pg_upmap":[],"pg_upmap_items":[],"pg_upmap_primaries":[],"pg_temp":[],"primary_temp":[],"blocklist":{"192.168.123.105:0/4259283135":"2026-03-11T07:46:43.266824+0000","192.168.123.105:0/1998618536":"2026-03-11T07:46:43.266824+0000","192.168.123.105:0/2989100693":"2026-03-11T07:46:09.521991+0000","192.168.123.105:0/1261368764":"2026-03-11T07:46:09.521991+0000","192.168.123.105:6801/2":"2026-03-11T07:45:55.496146+0000","192.168.123.105:0/77887181":"2026-03-11T07:46:43.266824+0000","192.168.123.105:0/2454546307":"2026-03-11T07:45:55.496146+0000","192.168.123.105:6800/2":"2026-03-11T07:45:55.496146+0000","192.168.123.105:0/3548984633":"2026-03-11T07:45:55.496146+0000","192.168.123.105:0/4118350925":"2026-03-11T07:45:55.496146+0000","192.168.123.105:0/597549933":"2026-03-11T07:46:09.521991+0000"},"range_blocklist":{},"erasure_code_profiles":{"default":{"crush-failure-domain":"osd","k":"2","m":"1","plugin":"jerasure","technique":"reed_sol_van"}},"removed_snaps_queue":[],"new_removed_snaps":[],"new_purged_snaps":[],"crush_node_flags":{},"device_class_flags":{},"stretch_mode":{"stretch_mode_enabled":false,"stretch_bucket_count":0,"degraded_stretch_mode":0,"recovering_stretch_mode":0,"stretch_mode_bucket":0}} 2026-03-10T07:48:09.889 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:09.887+0000 7f1054399700 1 -- 192.168.123.105:0/757863316 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f103806c550 msgr2=0x7f103806ea10 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:48:09.889 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:09.887+0000 7f1054399700 1 --2- 192.168.123.105:0/757863316 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f103806c550 0x7f103806ea10 secure :-1 s=READY pgs=71 cs=0 l=1 rev1=1 crypto rx=0x7f103c00b5c0 tx=0x7f103c005fb0 comp rx=0 tx=0).stop 2026-03-10T07:48:09.889 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:09.887+0000 7f1054399700 1 -- 192.168.123.105:0/757863316 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f104c073a50 msgr2=0x7f104c19d650 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:48:09.889 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:09.887+0000 7f1054399700 1 --2- 192.168.123.105:0/757863316 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f104c073a50 0x7f104c19d650 secure :-1 s=READY pgs=211 cs=0 l=1 rev1=1 crypto rx=0x7f104800eb10 tx=0x7f104800ee20 comp rx=0 tx=0).stop 2026-03-10T07:48:09.889 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:09.887+0000 7f1054399700 1 -- 192.168.123.105:0/757863316 shutdown_connections 2026-03-10T07:48:09.889 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:09.887+0000 7f1054399700 1 --2- 192.168.123.105:0/757863316 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f103806c550 0x7f103806ea10 secure :-1 s=CLOSED pgs=71 cs=0 l=1 rev1=1 crypto rx=0x7f103c00b5c0 tx=0x7f103c005fb0 comp rx=0 tx=0).stop 2026-03-10T07:48:09.889 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:09.887+0000 7f1054399700 1 --2- 192.168.123.105:0/757863316 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f104c073130 0x7f104c19d110 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:48:09.889 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:09.887+0000 7f1054399700 1 --2- 192.168.123.105:0/757863316 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f104c073a50 0x7f104c19d650 unknown :-1 s=CLOSED pgs=211 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:48:09.889 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:09.887+0000 7f1054399700 1 -- 192.168.123.105:0/757863316 >> 192.168.123.105:0/757863316 conn(0x7f104c0fc920 msgr2=0x7f104c103450 unknown :-1 s=STATE_NONE 
l=0).mark_down 2026-03-10T07:48:09.889 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:09.887+0000 7f1054399700 1 -- 192.168.123.105:0/757863316 shutdown_connections 2026-03-10T07:48:09.889 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:09.887+0000 7f1054399700 1 -- 192.168.123.105:0/757863316 wait complete. 2026-03-10T07:48:09.945 INFO:tasks.cephadm.ceph_manager.ceph:all up! 2026-03-10T07:48:09.945 DEBUG:teuthology.orchestra.run.vm05:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell --fsid 12e9780e-1c55-11f1-8896-79f7c2e9b508 -- ceph osd dump --format=json 2026-03-10T07:48:10.105 INFO:teuthology.orchestra.run.vm05.stderr:Inferring config /var/lib/ceph/12e9780e-1c55-11f1-8896-79f7c2e9b508/mon.vm05/config 2026-03-10T07:48:10.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:48:09 vm08 ceph-mon[59917]: pgmap v64: 1 pgs: 1 active+clean; 449 KiB data, 160 MiB used, 120 GiB / 120 GiB avail 2026-03-10T07:48:10.169 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:48:09 vm08 ceph-mon[59917]: from='client.? 
192.168.123.105:0/3467897404' entity='client.admin' cmd=[{"prefix": "mgr dump", "format": "json"}]: dispatch 2026-03-10T07:48:10.361 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:10.359+0000 7f15e9ffb700 1 -- 192.168.123.105:0/1380719089 <== mon.1 v2:192.168.123.108:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f15e40048b0 con 0x7f15ec10f340 2026-03-10T07:48:10.361 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:10.359+0000 7f15ebfff700 1 -- 192.168.123.105:0/1380719089 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f15ec10f340 msgr2=0x7f15ec10f720 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:48:10.361 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:10.359+0000 7f15ebfff700 1 --2- 192.168.123.105:0/1380719089 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f15ec10f340 0x7f15ec10f720 secure :-1 s=READY pgs=34 cs=0 l=1 rev1=1 crypto rx=0x7f15e400b210 tx=0x7f15e400b520 comp rx=0 tx=0).stop 2026-03-10T07:48:10.361 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:10.359+0000 7f15ebfff700 1 -- 192.168.123.105:0/1380719089 shutdown_connections 2026-03-10T07:48:10.361 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:10.359+0000 7f15ebfff700 1 --2- 192.168.123.105:0/1380719089 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f15ec10d0f0 0x7f15ec10d570 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:48:10.361 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:10.359+0000 7f15ebfff700 1 --2- 192.168.123.105:0/1380719089 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f15ec10f340 0x7f15ec10f720 unknown :-1 s=CLOSED pgs=34 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:48:10.361 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:10.359+0000 7f15ebfff700 1 -- 192.168.123.105:0/1380719089 >> 192.168.123.105:0/1380719089 
conn(0x7f15ec06ce20 msgr2=0x7f15ec06d230 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:48:10.361 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:10.360+0000 7f15ebfff700 1 -- 192.168.123.105:0/1380719089 shutdown_connections 2026-03-10T07:48:10.362 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:10.360+0000 7f15ebfff700 1 -- 192.168.123.105:0/1380719089 wait complete. 2026-03-10T07:48:10.362 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:10.360+0000 7f15ebfff700 1 Processor -- start 2026-03-10T07:48:10.362 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:10.360+0000 7f15ebfff700 1 -- start start 2026-03-10T07:48:10.362 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:10.360+0000 7f15ebfff700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f15ec10d0f0 0x7f15ec11c020 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:48:10.362 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:10.360+0000 7f15ebfff700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f15ec10f340 0x7f15ec116fd0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:48:10.362 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:10.360+0000 7f15ebfff700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f15ec1176b0 con 0x7f15ec10f340 2026-03-10T07:48:10.362 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:10.360+0000 7f15ebfff700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f15ec117820 con 0x7f15ec10d0f0 2026-03-10T07:48:10.362 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:10.361+0000 7f15eaffd700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f15ec10d0f0 0x7f15ec11c020 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 
tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:48:10.362 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:10.361+0000 7f15eaffd700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f15ec10d0f0 0x7f15ec11c020 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.108:3300/0 says I am v2:192.168.123.105:43694/0 (socket says 192.168.123.105:43694) 2026-03-10T07:48:10.362 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:10.361+0000 7f15eaffd700 1 -- 192.168.123.105:0/1274460931 learned_addr learned my addr 192.168.123.105:0/1274460931 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T07:48:10.363 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:10.361+0000 7f15eaffd700 1 -- 192.168.123.105:0/1274460931 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f15ec10f340 msgr2=0x7f15ec116fd0 unknown :-1 s=STATE_CONNECTING l=1).mark_down 2026-03-10T07:48:10.363 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:10.361+0000 7f15eaffd700 1 --2- 192.168.123.105:0/1274460931 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f15ec10f340 0x7f15ec116fd0 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:48:10.363 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:10.361+0000 7f15eaffd700 1 -- 192.168.123.105:0/1274460931 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f15e4009e30 con 0x7f15ec10d0f0 2026-03-10T07:48:10.363 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:10.361+0000 7f15eaffd700 1 --2- 192.168.123.105:0/1274460931 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f15ec10d0f0 0x7f15ec11c020 secure :-1 s=READY pgs=35 cs=0 l=1 rev1=1 crypto rx=0x7f15e4015040 tx=0x7f15e40043b0 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 
out_seq=0 2026-03-10T07:48:10.363 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:10.361+0000 7f15f094f700 1 -- 192.168.123.105:0/1274460931 <== mon.1 v2:192.168.123.108:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f15e400e070 con 0x7f15ec10d0f0 2026-03-10T07:48:10.366 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:10.361+0000 7f15ebfff700 1 -- 192.168.123.105:0/1274460931 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f15ec117aa0 con 0x7f15ec10d0f0 2026-03-10T07:48:10.366 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:10.361+0000 7f15ebfff700 1 -- 192.168.123.105:0/1274460931 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f15ec117e20 con 0x7f15ec10d0f0 2026-03-10T07:48:10.366 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:10.362+0000 7f15ebfff700 1 -- 192.168.123.105:0/1274460931 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f15ec1114f0 con 0x7f15ec10d0f0 2026-03-10T07:48:10.366 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:10.364+0000 7f15f094f700 1 -- 192.168.123.105:0/1274460931 <== mon.1 v2:192.168.123.108:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f15e4007a80 con 0x7f15ec10d0f0 2026-03-10T07:48:10.367 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:10.364+0000 7f15f094f700 1 -- 192.168.123.105:0/1274460931 <== mon.1 v2:192.168.123.108:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f15e4012c00 con 0x7f15ec10d0f0 2026-03-10T07:48:10.367 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:10.365+0000 7f15f094f700 1 -- 192.168.123.105:0/1274460931 <== mon.1 v2:192.168.123.108:3300/0 4 ==== mgrmap(e 19) v1 ==== 90308+0+0 (secure 0 0 0) 0x7f15e4019040 con 0x7f15ec10d0f0 2026-03-10T07:48:10.371 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:10.365+0000 7f15f094f700 1 --2- 192.168.123.105:0/1274460931 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f15d406c2e0 0x7f15d406e7a0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:48:10.371 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:10.365+0000 7f15f094f700 1 -- 192.168.123.105:0/1274460931 <== mon.1 v2:192.168.123.108:3300/0 5 ==== osd_map(33..33 src has 1..33) v4 ==== 4446+0+0 (secure 0 0 0) 0x7f15e408c690 con 0x7f15ec10d0f0 2026-03-10T07:48:10.371 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:10.366+0000 7f15ea7fc700 1 --2- 192.168.123.105:0/1274460931 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f15d406c2e0 0x7f15d406e7a0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:48:10.371 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:10.366+0000 7f15ea7fc700 1 --2- 192.168.123.105:0/1274460931 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f15d406c2e0 0x7f15d406e7a0 secure :-1 s=READY pgs=72 cs=0 l=1 rev1=1 crypto rx=0x7f15ec118870 tx=0x7f15e0009500 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:48:10.372 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:10.370+0000 7f15f094f700 1 -- 192.168.123.105:0/1274460931 <== mon.1 v2:192.168.123.108:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7f15e40572d0 con 0x7f15ec10d0f0 2026-03-10T07:48:10.482 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:10.480+0000 7f15ebfff700 1 -- 192.168.123.105:0/1274460931 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "osd dump", "format": "json"} v 0) v1 -- 0x7f15ec04f2e0 con 0x7f15ec10d0f0 
2026-03-10T07:48:10.483 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:10.481+0000 7f15f094f700 1 -- 192.168.123.105:0/1274460931 <== mon.1 v2:192.168.123.108:3300/0 7 ==== mon_command_ack([{"prefix": "osd dump", "format": "json"}]=0 v33) v1 ==== 74+0+11259 (secure 0 0 0) 0x7f15e405a8f0 con 0x7f15ec10d0f0 2026-03-10T07:48:10.483 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-10T07:48:10.483 INFO:teuthology.orchestra.run.vm05.stdout:{"epoch":33,"fsid":"12e9780e-1c55-11f1-8896-79f7c2e9b508","created":"2026-03-10T07:45:40.905180+0000","modified":"2026-03-10T07:48:06.849648+0000","last_up_change":"2026-03-10T07:48:05.841179+0000","last_in_change":"2026-03-10T07:47:54.023889+0000","flags":"sortbitwise,recovery_deletes,purged_snapdirs,pglog_hardlimit","flags_num":5799936,"flags_set":["pglog_hardlimit","purged_snapdirs","recovery_deletes","sortbitwise"],"crush_version":14,"full_ratio":0.94999998807907104,"backfillfull_ratio":0.89999997615814209,"nearfull_ratio":0.85000002384185791,"cluster_snapshot":"","pool_max":1,"max_osd":6,"require_min_compat_client":"luminous","min_compat_client":"jewel","require_osd_release":"reef","allow_crimson":false,"pools":[{"pool":1,"pool_name":".mgr","create_time":"2026-03-10T07:47:37.474083+0000","flags":1,"flags_names":"hashpspool","type":1,"size":3,"min_size":2,"crush_rule":0,"peering_crush_bucket_count":0,"peering_crush_bucket_target":0,"peering_crush_bucket_barrier":0,"peering_crush_bucket_mandatory_member":2147483647,"object_hash":2,"pg_autoscale_mode":"off","pg_num":1,"pg_placement_num":1,"pg_placement_num_target":1,"pg_num_target":1,"pg_num_pending":1,"last_pg_merge_meta":{"source_pgid":"0.0","ready_epoch":0,"last_epoch_started":0,"last_epoch_clean":0,"source_version":"0'0","target_version":"0'0"},"last_change":"20","last_force_op_resend":"0","last_force_op_resend_prenautilus":"0","last_force_op_resend_preluminous":"0","auid":0,"snap_mode":"selfmanaged","snap_seq":0,"snap_epoch":0,"pool_snaps":[],"removed_snaps":"[]","q
uota_max_bytes":0,"quota_max_objects":0,"tiers":[],"tier_of":-1,"read_tier":-1,"write_tier":-1,"cache_mode":"none","target_max_bytes":0,"target_max_objects":0,"cache_target_dirty_ratio_micro":400000,"cache_target_dirty_high_ratio_micro":600000,"cache_target_full_ratio_micro":800000,"cache_min_flush_age":0,"cache_min_evict_age":0,"erasure_code_profile":"","hit_set_params":{"type":"none"},"hit_set_period":0,"hit_set_count":0,"use_gmt_hitset":true,"min_read_recency_for_promote":0,"min_write_recency_for_promote":0,"hit_set_grade_decay_rate":0,"hit_set_search_last_n":0,"grade_table":[],"stripe_width":0,"expected_num_objects":0,"fast_read":false,"options":{"pg_num_max":32,"pg_num_min":1},"application_metadata":{"mgr":{}},"read_balance":{"score_acting":6,"score_stable":6,"optimal_score":0.5,"raw_score_acting":3,"raw_score_stable":3,"primary_affinity_weighted":1,"average_primary_affinity":1,"average_primary_affinity_weighted":1}}],"osds":[{"osd":0,"uuid":"b7030a1b-b939-419d-85b0-9e818f756cd8","up":1,"in":1,"weight":1,"primary_affinity":1,"last_clean_begin":0,"last_clean_end":0,"up_from":9,"up_thru":0,"down_at":0,"lost_at":0,"public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.105:6802","nonce":196713793},{"type":"v1","addr":"192.168.123.105:6803","nonce":196713793}]},"cluster_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.105:6804","nonce":196713793},{"type":"v1","addr":"192.168.123.105:6805","nonce":196713793}]},"heartbeat_back_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.105:6808","nonce":196713793},{"type":"v1","addr":"192.168.123.105:6809","nonce":196713793}]},"heartbeat_front_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.105:6806","nonce":196713793},{"type":"v1","addr":"192.168.123.105:6807","nonce":196713793}]},"public_addr":"192.168.123.105:6803/196713793","cluster_addr":"192.168.123.105:6805/196713793","heartbeat_back_addr":"192.168.123.105:6809/196713793","heartbeat_front_addr":"192.168.123.105:6807/196713793","state":["exists","up"]},{"osd
":1,"uuid":"725f8dca-94da-4c18-aefa-9e9f529cccd4","up":1,"in":1,"weight":1,"primary_affinity":1,"last_clean_begin":0,"last_clean_end":0,"up_from":13,"up_thru":24,"down_at":0,"lost_at":0,"public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.105:6810","nonce":1508516757},{"type":"v1","addr":"192.168.123.105:6811","nonce":1508516757}]},"cluster_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.105:6812","nonce":1508516757},{"type":"v1","addr":"192.168.123.105:6813","nonce":1508516757}]},"heartbeat_back_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.105:6816","nonce":1508516757},{"type":"v1","addr":"192.168.123.105:6817","nonce":1508516757}]},"heartbeat_front_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.105:6814","nonce":1508516757},{"type":"v1","addr":"192.168.123.105:6815","nonce":1508516757}]},"public_addr":"192.168.123.105:6811/1508516757","cluster_addr":"192.168.123.105:6813/1508516757","heartbeat_back_addr":"192.168.123.105:6817/1508516757","heartbeat_front_addr":"192.168.123.105:6815/1508516757","state":["exists","up"]},{"osd":2,"uuid":"ef60316b-3b61-4644-aa31-97cef548ba7e","up":1,"in":1,"weight":1,"primary_affinity":1,"last_clean_begin":0,"last_clean_end":0,"up_from":17,"up_thru":0,"down_at":0,"lost_at":0,"public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.105:6818","nonce":4116227648},{"type":"v1","addr":"192.168.123.105:6819","nonce":4116227648}]},"cluster_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.105:6820","nonce":4116227648},{"type":"v1","addr":"192.168.123.105:6821","nonce":4116227648}]},"heartbeat_back_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.105:6824","nonce":4116227648},{"type":"v1","addr":"192.168.123.105:6825","nonce":4116227648}]},"heartbeat_front_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.105:6822","nonce":4116227648},{"type":"v1","addr":"192.168.123.105:6823","nonce":4116227648}]},"public_addr":"192.168.123.105:6819/4116227648","cluster_addr":"192.168.123.105:6821/4116227648","heartbeat_back_
addr":"192.168.123.105:6825/4116227648","heartbeat_front_addr":"192.168.123.105:6823/4116227648","state":["exists","up"]},{"osd":3,"uuid":"4cc7cd7e-7756-40b9-9bc8-029e26495239","up":1,"in":1,"weight":1,"primary_affinity":1,"last_clean_begin":0,"last_clean_end":0,"up_from":23,"up_thru":27,"down_at":0,"lost_at":0,"public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.108:6800","nonce":2127000679},{"type":"v1","addr":"192.168.123.108:6801","nonce":2127000679}]},"cluster_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.108:6802","nonce":2127000679},{"type":"v1","addr":"192.168.123.108:6803","nonce":2127000679}]},"heartbeat_back_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.108:6806","nonce":2127000679},{"type":"v1","addr":"192.168.123.108:6807","nonce":2127000679}]},"heartbeat_front_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.108:6804","nonce":2127000679},{"type":"v1","addr":"192.168.123.108:6805","nonce":2127000679}]},"public_addr":"192.168.123.108:6801/2127000679","cluster_addr":"192.168.123.108:6803/2127000679","heartbeat_back_addr":"192.168.123.108:6807/2127000679","heartbeat_front_addr":"192.168.123.108:6805/2127000679","state":["exists","up"]},{"osd":4,"uuid":"efa282d4-a651-4903-bdb8-2e148038e567","up":1,"in":1,"weight":1,"primary_affinity":1,"last_clean_begin":0,"last_clean_end":0,"up_from":28,"up_thru":0,"down_at":0,"lost_at":0,"public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.108:6808","nonce":551946261},{"type":"v1","addr":"192.168.123.108:6809","nonce":551946261}]},"cluster_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.108:6810","nonce":551946261},{"type":"v1","addr":"192.168.123.108:6811","nonce":551946261}]},"heartbeat_back_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.108:6814","nonce":551946261},{"type":"v1","addr":"192.168.123.108:6815","nonce":551946261}]},"heartbeat_front_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.108:6812","nonce":551946261},{"type":"v1","addr":"192.168.123.108:6813","nonce":5519462
61}]},"public_addr":"192.168.123.108:6809/551946261","cluster_addr":"192.168.123.108:6811/551946261","heartbeat_back_addr":"192.168.123.108:6815/551946261","heartbeat_front_addr":"192.168.123.108:6813/551946261","state":["exists","up"]},{"osd":5,"uuid":"ef1ae183-14f2-4400-85d6-c48b79ef2819","up":1,"in":1,"weight":1,"primary_affinity":1,"last_clean_begin":0,"last_clean_end":0,"up_from":32,"up_thru":0,"down_at":0,"lost_at":0,"public_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.108:6816","nonce":2677071054},{"type":"v1","addr":"192.168.123.108:6817","nonce":2677071054}]},"cluster_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.108:6818","nonce":2677071054},{"type":"v1","addr":"192.168.123.108:6819","nonce":2677071054}]},"heartbeat_back_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.108:6822","nonce":2677071054},{"type":"v1","addr":"192.168.123.108:6823","nonce":2677071054}]},"heartbeat_front_addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.108:6820","nonce":2677071054},{"type":"v1","addr":"192.168.123.108:6821","nonce":2677071054}]},"public_addr":"192.168.123.108:6817/2677071054","cluster_addr":"192.168.123.108:6819/2677071054","heartbeat_back_addr":"192.168.123.108:6823/2677071054","heartbeat_front_addr":"192.168.123.108:6821/2677071054","state":["exists","up"]}],"osd_xinfo":[{"osd":0,"down_stamp":"0.000000","laggy_probability":0,"laggy_interval":0,"features":4540138322906710015,"old_weight":0,"last_purged_snaps_scrub":"2026-03-10T07:47:16.699601+0000","dead_epoch":0},{"osd":1,"down_stamp":"0.000000","laggy_probability":0,"laggy_interval":0,"features":4540138322906710015,"old_weight":0,"last_purged_snaps_scrub":"2026-03-10T07:47:25.867149+0000","dead_epoch":0},{"osd":2,"down_stamp":"0.000000","laggy_probability":0,"laggy_interval":0,"features":4540138322906710015,"old_weight":0,"last_purged_snaps_scrub":"2026-03-10T07:47:36.171124+0000","dead_epoch":0},{"osd":3,"down_stamp":"0.000000","laggy_probability":0,"laggy_interval":0,"features":4540138322906
710015,"old_weight":0,"last_purged_snaps_scrub":"2026-03-10T07:47:44.953296+0000","dead_epoch":0},{"osd":4,"down_stamp":"0.000000","laggy_probability":0,"laggy_interval":0,"features":4540138322906710015,"old_weight":0,"last_purged_snaps_scrub":"2026-03-10T07:47:53.656121+0000","dead_epoch":0},{"osd":5,"down_stamp":"0.000000","laggy_probability":0,"laggy_interval":0,"features":4540138322906710015,"old_weight":0,"last_purged_snaps_scrub":"2026-03-10T07:48:04.028751+0000","dead_epoch":0}],"pg_upmap":[],"pg_upmap_items":[],"pg_upmap_primaries":[],"pg_temp":[],"primary_temp":[],"blocklist":{"192.168.123.105:0/4259283135":"2026-03-11T07:46:43.266824+0000","192.168.123.105:0/1998618536":"2026-03-11T07:46:43.266824+0000","192.168.123.105:0/2989100693":"2026-03-11T07:46:09.521991+0000","192.168.123.105:0/1261368764":"2026-03-11T07:46:09.521991+0000","192.168.123.105:6801/2":"2026-03-11T07:45:55.496146+0000","192.168.123.105:0/77887181":"2026-03-11T07:46:43.266824+0000","192.168.123.105:0/2454546307":"2026-03-11T07:45:55.496146+0000","192.168.123.105:6800/2":"2026-03-11T07:45:55.496146+0000","192.168.123.105:0/3548984633":"2026-03-11T07:45:55.496146+0000","192.168.123.105:0/4118350925":"2026-03-11T07:45:55.496146+0000","192.168.123.105:0/597549933":"2026-03-11T07:46:09.521991+0000"},"range_blocklist":{},"erasure_code_profiles":{"default":{"crush-failure-domain":"osd","k":"2","m":"1","plugin":"jerasure","technique":"reed_sol_van"}},"removed_snaps_queue":[],"new_removed_snaps":[],"new_purged_snaps":[],"crush_node_flags":{},"device_class_flags":{},"stretch_mode":{"stretch_mode_enabled":false,"stretch_bucket_count":0,"degraded_stretch_mode":0,"recovering_stretch_mode":0,"stretch_mode_bucket":0}} 2026-03-10T07:48:10.485 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:10.484+0000 7f15ebfff700 1 -- 192.168.123.105:0/1274460931 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f15d406c2e0 msgr2=0x7f15d406e7a0 secure :-1 s=STATE_CONNECTION_ESTABLISHED 
l=1).mark_down 2026-03-10T07:48:10.485 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:10.484+0000 7f15ebfff700 1 --2- 192.168.123.105:0/1274460931 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f15d406c2e0 0x7f15d406e7a0 secure :-1 s=READY pgs=72 cs=0 l=1 rev1=1 crypto rx=0x7f15ec118870 tx=0x7f15e0009500 comp rx=0 tx=0).stop 2026-03-10T07:48:10.485 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:10.484+0000 7f15ebfff700 1 -- 192.168.123.105:0/1274460931 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f15ec10d0f0 msgr2=0x7f15ec11c020 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:48:10.485 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:10.484+0000 7f15ebfff700 1 --2- 192.168.123.105:0/1274460931 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f15ec10d0f0 0x7f15ec11c020 secure :-1 s=READY pgs=35 cs=0 l=1 rev1=1 crypto rx=0x7f15e4015040 tx=0x7f15e40043b0 comp rx=0 tx=0).stop 2026-03-10T07:48:10.486 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:10.484+0000 7f15ebfff700 1 -- 192.168.123.105:0/1274460931 shutdown_connections 2026-03-10T07:48:10.486 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:10.484+0000 7f15ebfff700 1 --2- 192.168.123.105:0/1274460931 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f15d406c2e0 0x7f15d406e7a0 unknown :-1 s=CLOSED pgs=72 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:48:10.486 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:10.484+0000 7f15ebfff700 1 --2- 192.168.123.105:0/1274460931 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f15ec10d0f0 0x7f15ec11c020 unknown :-1 s=CLOSED pgs=35 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:48:10.486 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:10.484+0000 7f15ebfff700 1 --2- 192.168.123.105:0/1274460931 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] 
conn(0x7f15ec10f340 0x7f15ec116fd0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:48:10.486 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:10.484+0000 7f15ebfff700 1 -- 192.168.123.105:0/1274460931 >> 192.168.123.105:0/1274460931 conn(0x7f15ec06ce20 msgr2=0x7f15ec070520 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:48:10.486 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:10.484+0000 7f15ebfff700 1 -- 192.168.123.105:0/1274460931 shutdown_connections 2026-03-10T07:48:10.486 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:10.484+0000 7f15ebfff700 1 -- 192.168.123.105:0/1274460931 wait complete. 2026-03-10T07:48:10.530 DEBUG:teuthology.orchestra.run.vm05:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell --fsid 12e9780e-1c55-11f1-8896-79f7c2e9b508 -- ceph tell osd.0 flush_pg_stats 2026-03-10T07:48:10.530 DEBUG:teuthology.orchestra.run.vm05:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell --fsid 12e9780e-1c55-11f1-8896-79f7c2e9b508 -- ceph tell osd.1 flush_pg_stats 2026-03-10T07:48:10.530 DEBUG:teuthology.orchestra.run.vm05:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell --fsid 12e9780e-1c55-11f1-8896-79f7c2e9b508 -- ceph tell osd.2 flush_pg_stats 2026-03-10T07:48:10.530 DEBUG:teuthology.orchestra.run.vm05:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell --fsid 12e9780e-1c55-11f1-8896-79f7c2e9b508 -- ceph tell osd.3 flush_pg_stats 2026-03-10T07:48:10.531 DEBUG:teuthology.orchestra.run.vm05:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell --fsid 12e9780e-1c55-11f1-8896-79f7c2e9b508 -- ceph tell osd.4 flush_pg_stats 2026-03-10T07:48:10.531 DEBUG:teuthology.orchestra.run.vm05:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell --fsid 12e9780e-1c55-11f1-8896-79f7c2e9b508 -- ceph tell osd.5 flush_pg_stats 
2026-03-10T07:48:10.870 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:48:10 vm05 ceph-mon[50387]: from='client.? 192.168.123.105:0/757863316' entity='client.admin' cmd=[{"prefix": "osd dump", "format": "json"}]: dispatch 2026-03-10T07:48:10.871 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:48:10 vm05 ceph-mon[50387]: from='client.? 192.168.123.105:0/1274460931' entity='client.admin' cmd=[{"prefix": "osd dump", "format": "json"}]: dispatch 2026-03-10T07:48:10.995 INFO:teuthology.orchestra.run.vm05.stderr:Inferring config /var/lib/ceph/12e9780e-1c55-11f1-8896-79f7c2e9b508/mon.vm05/config 2026-03-10T07:48:11.017 INFO:teuthology.orchestra.run.vm05.stderr:Inferring config /var/lib/ceph/12e9780e-1c55-11f1-8896-79f7c2e9b508/mon.vm05/config 2026-03-10T07:48:11.025 INFO:teuthology.orchestra.run.vm05.stderr:Inferring config /var/lib/ceph/12e9780e-1c55-11f1-8896-79f7c2e9b508/mon.vm05/config 2026-03-10T07:48:11.028 INFO:teuthology.orchestra.run.vm05.stderr:Inferring config /var/lib/ceph/12e9780e-1c55-11f1-8896-79f7c2e9b508/mon.vm05/config 2026-03-10T07:48:11.071 INFO:teuthology.orchestra.run.vm05.stderr:Inferring config /var/lib/ceph/12e9780e-1c55-11f1-8896-79f7c2e9b508/mon.vm05/config 2026-03-10T07:48:11.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:48:10 vm08 ceph-mon[59917]: from='client.? 192.168.123.105:0/757863316' entity='client.admin' cmd=[{"prefix": "osd dump", "format": "json"}]: dispatch 2026-03-10T07:48:11.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:48:10 vm08 ceph-mon[59917]: from='client.? 
192.168.123.105:0/1274460931' entity='client.admin' cmd=[{"prefix": "osd dump", "format": "json"}]: dispatch 2026-03-10T07:48:11.187 INFO:teuthology.orchestra.run.vm05.stderr:Inferring config /var/lib/ceph/12e9780e-1c55-11f1-8896-79f7c2e9b508/mon.vm05/config 2026-03-10T07:48:11.857 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:11.855+0000 7f270e0f1700 1 -- 192.168.123.105:0/2954639048 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f270810f660 msgr2=0x7f2708107d90 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:48:11.857 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:11.855+0000 7f270e0f1700 1 --2- 192.168.123.105:0/2954639048 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f270810f660 0x7f2708107d90 secure :-1 s=READY pgs=212 cs=0 l=1 rev1=1 crypto rx=0x7f26f8009b00 tx=0x7f26f8009e10 comp rx=0 tx=0).stop 2026-03-10T07:48:11.860 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:11.858+0000 7f270e0f1700 1 -- 192.168.123.105:0/2954639048 shutdown_connections 2026-03-10T07:48:11.860 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:11.858+0000 7f270e0f1700 1 --2- 192.168.123.105:0/2954639048 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f27081082d0 0x7f2708108750 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:48:11.860 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:11.858+0000 7f270e0f1700 1 --2- 192.168.123.105:0/2954639048 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f270810f660 0x7f2708107d90 unknown :-1 s=CLOSED pgs=212 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:48:11.860 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:11.858+0000 7f270e0f1700 1 -- 192.168.123.105:0/2954639048 >> 192.168.123.105:0/2954639048 conn(0x7f270806d0f0 msgr2=0x7f270806d500 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:48:11.860 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:11.858+0000 7f270e0f1700 1 -- 192.168.123.105:0/2954639048 shutdown_connections 2026-03-10T07:48:11.860 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:11.858+0000 7f270e0f1700 1 -- 192.168.123.105:0/2954639048 wait complete. 2026-03-10T07:48:11.860 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:11.859+0000 7f270e0f1700 1 Processor -- start 2026-03-10T07:48:11.861 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:11.860+0000 7f270e0f1700 1 -- start start 2026-03-10T07:48:11.862 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:11.860+0000 7f270e0f1700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f27081082d0 0x7f27081ab6e0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:48:11.862 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:11.860+0000 7f270e0f1700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f270810f660 0x7f27081abc20 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:48:11.862 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:11.860+0000 7f270e0f1700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f27081ac2b0 con 0x7f27081082d0 2026-03-10T07:48:11.862 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:11.860+0000 7f270e0f1700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f27081a5760 con 0x7f270810f660 2026-03-10T07:48:11.862 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:11.861+0000 7f2706ffd700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f270810f660 0x7f27081abc20 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:48:11.862 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:11.861+0000 7f2706ffd700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f270810f660 0x7f27081abc20 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.108:3300/0 says I am v2:192.168.123.105:43710/0 (socket says 192.168.123.105:43710) 2026-03-10T07:48:11.862 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:11.861+0000 7f2706ffd700 1 -- 192.168.123.105:0/3365644358 learned_addr learned my addr 192.168.123.105:0/3365644358 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T07:48:11.862 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:11.861+0000 7f27077fe700 1 --2- 192.168.123.105:0/3365644358 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f27081082d0 0x7f27081ab6e0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:48:11.863 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:11.861+0000 7f2706ffd700 1 -- 192.168.123.105:0/3365644358 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f27081082d0 msgr2=0x7f27081ab6e0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:48:11.863 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:11.861+0000 7f2706ffd700 1 --2- 192.168.123.105:0/3365644358 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f27081082d0 0x7f27081ab6e0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:48:11.863 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:11.861+0000 7f2706ffd700 1 -- 192.168.123.105:0/3365644358 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f26f80097e0 con 0x7f270810f660 2026-03-10T07:48:11.863 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:11.861+0000 7f2706ffd700 1 --2- 192.168.123.105:0/3365644358 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f270810f660 0x7f27081abc20 secure :-1 s=READY pgs=36 cs=0 l=1 rev1=1 crypto rx=0x7f26fc00eb10 tx=0x7f26fc00ee20 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:48:11.864 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:11.862+0000 7f2704ff9700 1 -- 192.168.123.105:0/3365644358 <== mon.1 v2:192.168.123.108:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f26fc00cc40 con 0x7f270810f660 2026-03-10T07:48:11.864 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:11.862+0000 7f2704ff9700 1 -- 192.168.123.105:0/3365644358 <== mon.1 v2:192.168.123.108:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f26fc00cda0 con 0x7f270810f660 2026-03-10T07:48:11.865 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:11.863+0000 7f2704ff9700 1 -- 192.168.123.105:0/3365644358 <== mon.1 v2:192.168.123.108:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f26fc018810 con 0x7f270810f660 2026-03-10T07:48:11.865 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:11.863+0000 7f270e0f1700 1 -- 192.168.123.105:0/3365644358 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f27081a5a40 con 0x7f270810f660 2026-03-10T07:48:11.865 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:11.863+0000 7f270e0f1700 1 -- 192.168.123.105:0/3365644358 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f27081a5f60 con 0x7f270810f660 2026-03-10T07:48:11.865 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:11.864+0000 7f270e0f1700 1 -- 192.168.123.105:0/3365644358 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_get_version(what=osdmap handle=1) v1 -- 0x7f270811d280 con 
0x7f270810f660 2026-03-10T07:48:11.867 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:11.865+0000 7f2704ff9700 1 -- 192.168.123.105:0/3365644358 <== mon.1 v2:192.168.123.108:3300/0 4 ==== mgrmap(e 19) v1 ==== 90308+0+0 (secure 0 0 0) 0x7f26fc018970 con 0x7f270810f660 2026-03-10T07:48:11.867 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:11.865+0000 7f2704ff9700 1 --2- 192.168.123.105:0/3365644358 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f26f006c380 0x7f26f006e840 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:48:11.867 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:11.866+0000 7f27077fe700 1 --2- 192.168.123.105:0/3365644358 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f26f006c380 0x7f26f006e840 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:48:11.867 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:11.866+0000 7f2704ff9700 1 -- 192.168.123.105:0/3365644358 <== mon.1 v2:192.168.123.108:3300/0 5 ==== osd_map(33..33 src has 1..33) v4 ==== 4446+0+0 (secure 0 0 0) 0x7f26fc014070 con 0x7f270810f660 2026-03-10T07:48:11.868 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:11.866+0000 7f27077fe700 1 --2- 192.168.123.105:0/3365644358 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f26f006c380 0x7f26f006e840 secure :-1 s=READY pgs=73 cs=0 l=1 rev1=1 crypto rx=0x7f26f800b5c0 tx=0x7f26f800bfa0 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:48:11.868 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:11.866+0000 7f2704ff9700 1 --2- 192.168.123.105:0/3365644358 >> [v2:192.168.123.108:6808/551946261,v1:192.168.123.108:6809/551946261] conn(0x7f26f0071db0 0x7f26f00741d0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 
2026-03-10T07:48:11.868 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:11.866+0000 7f2704ff9700 1 -- 192.168.123.105:0/3365644358 --> [v2:192.168.123.108:6808/551946261,v1:192.168.123.108:6809/551946261] -- command(tid 1: {"prefix": "get_command_descriptions"}) v1 -- 0x7f26f0074880 con 0x7f26f0071db0 2026-03-10T07:48:11.868 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:11.867+0000 7f2707fff700 1 --2- 192.168.123.105:0/3365644358 >> [v2:192.168.123.108:6808/551946261,v1:192.168.123.108:6809/551946261] conn(0x7f26f0071db0 0x7f26f00741d0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:48:11.868 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:11.867+0000 7f2704ff9700 1 -- 192.168.123.105:0/3365644358 <== mon.1 v2:192.168.123.108:3300/0 6 ==== mon_get_version_reply(handle=1 version=33) v2 ==== 24+0+0 (secure 0 0 0) 0x7f26fc08c6d0 con 0x7f270810f660 2026-03-10T07:48:11.872 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:11.870+0000 7f2707fff700 1 --2- 192.168.123.105:0/3365644358 >> [v2:192.168.123.108:6808/551946261,v1:192.168.123.108:6809/551946261] conn(0x7f26f0071db0 0x7f26f00741d0 crc :-1 s=READY pgs=4 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).ready entity=osd.4 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:48:11.873 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:11.872+0000 7f2704ff9700 1 -- 192.168.123.105:0/3365644358 <== osd.4 v2:192.168.123.108:6808/551946261 1 ==== command_reply(tid 1: 0 ) v1 ==== 8+0+25106 (crc 0 0 0) 0x7f26f0074880 con 0x7f26f0071db0 2026-03-10T07:48:11.884 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:11.882+0000 7f574648b700 1 -- 192.168.123.105:0/1854480886 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f57401082d0 msgr2=0x7f5740108750 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:48:11.884 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:11.882+0000 7f574648b700 1 --2- 192.168.123.105:0/1854480886 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f57401082d0 0x7f5740108750 secure :-1 s=READY pgs=213 cs=0 l=1 rev1=1 crypto rx=0x7f5734009a60 tx=0x7f5734009d70 comp rx=0 tx=0).stop 2026-03-10T07:48:11.888 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:11.886+0000 7f574648b700 1 -- 192.168.123.105:0/1854480886 shutdown_connections 2026-03-10T07:48:11.888 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:48:11 vm05 ceph-mon[50387]: pgmap v65: 1 pgs: 1 active+clean; 449 KiB data, 160 MiB used, 120 GiB / 120 GiB avail 2026-03-10T07:48:11.891 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:11.886+0000 7f574648b700 1 --2- 192.168.123.105:0/1854480886 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f57401082d0 0x7f5740108750 unknown :-1 s=CLOSED pgs=213 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:48:11.891 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:11.886+0000 7f574648b700 1 --2- 192.168.123.105:0/1854480886 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f574010f660 0x7f5740107d90 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:48:11.891 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:11.886+0000 7f574648b700 1 -- 192.168.123.105:0/1854480886 >> 192.168.123.105:0/1854480886 conn(0x7f574006d0f0 msgr2=0x7f574006d500 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:48:11.891 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:11.889+0000 7f574648b700 1 -- 192.168.123.105:0/1854480886 shutdown_connections 2026-03-10T07:48:11.891 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:11.889+0000 7f574648b700 1 -- 192.168.123.105:0/1854480886 wait complete. 
2026-03-10T07:48:11.891 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:11.890+0000 7f574648b700 1 Processor -- start 2026-03-10T07:48:11.892 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:11.890+0000 7f574648b700 1 -- start start 2026-03-10T07:48:11.893 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:11.890+0000 7f574648b700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f57401082d0 0x7f5740117db0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:48:11.893 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:11.890+0000 7f574648b700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f574010f660 0x7f5740112e20 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:48:11.893 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:11.890+0000 7f574648b700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f5740113360 con 0x7f57401082d0 2026-03-10T07:48:11.893 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:11.890+0000 7f574648b700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f57401134d0 con 0x7f574010f660 2026-03-10T07:48:11.893 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:11.890+0000 7f573ffff700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f57401082d0 0x7f5740117db0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:48:11.893 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:11.890+0000 7f573ffff700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f57401082d0 0x7f5740117db0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am 
v2:192.168.123.105:38350/0 (socket says 192.168.123.105:38350) 2026-03-10T07:48:11.893 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:11.890+0000 7f573ffff700 1 -- 192.168.123.105:0/493186551 learned_addr learned my addr 192.168.123.105:0/493186551 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T07:48:11.893 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:11.890+0000 7f573f7fe700 1 --2- 192.168.123.105:0/493186551 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f574010f660 0x7f5740112e20 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:48:11.893 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:11.891+0000 7f573ffff700 1 -- 192.168.123.105:0/493186551 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f574010f660 msgr2=0x7f5740112e20 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:48:11.893 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:11.891+0000 7f573ffff700 1 --2- 192.168.123.105:0/493186551 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f574010f660 0x7f5740112e20 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:48:11.893 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:11.891+0000 7f573ffff700 1 -- 192.168.123.105:0/493186551 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f5734009710 con 0x7f57401082d0 2026-03-10T07:48:11.893 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:11.891+0000 7f573ffff700 1 --2- 192.168.123.105:0/493186551 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f57401082d0 0x7f5740117db0 secure :-1 s=READY pgs=214 cs=0 l=1 rev1=1 crypto rx=0x7f573000ea30 tx=0x7f573000edf0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 
2026-03-10T07:48:11.894 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:11.891+0000 7f573d7fa700 1 -- 192.168.123.105:0/493186551 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f573000cc40 con 0x7f57401082d0 2026-03-10T07:48:11.894 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:11.891+0000 7f573d7fa700 1 -- 192.168.123.105:0/493186551 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f573000cda0 con 0x7f57401082d0 2026-03-10T07:48:11.894 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:11.891+0000 7f573d7fa700 1 -- 192.168.123.105:0/493186551 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f5730010430 con 0x7f57401082d0 2026-03-10T07:48:11.894 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:11.892+0000 7f574648b700 1 -- 192.168.123.105:0/493186551 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f57401137b0 con 0x7f57401082d0 2026-03-10T07:48:11.894 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:11.892+0000 7f574648b700 1 -- 192.168.123.105:0/493186551 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f57401b8690 con 0x7f57401082d0 2026-03-10T07:48:11.896 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:11.893+0000 7f573d7fa700 1 -- 192.168.123.105:0/493186551 <== mon.0 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 19) v1 ==== 90308+0+0 (secure 0 0 0) 0x7f5730004880 con 0x7f57401082d0 2026-03-10T07:48:11.896 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:11.894+0000 7f573d7fa700 1 --2- 192.168.123.105:0/493186551 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f572806c530 0x7f572806e9f0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:48:11.896 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:11.894+0000 7f573d7fa700 1 -- 192.168.123.105:0/493186551 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(33..33 src has 1..33) v4 ==== 4446+0+0 (secure 0 0 0) 0x7f5730014070 con 0x7f57401082d0 2026-03-10T07:48:11.896 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:11.894+0000 7f573f7fe700 1 --2- 192.168.123.105:0/493186551 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f572806c530 0x7f572806e9f0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:48:11.898 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:11.894+0000 7f573f7fe700 1 --2- 192.168.123.105:0/493186551 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f572806c530 0x7f572806e9f0 secure :-1 s=READY pgs=74 cs=0 l=1 rev1=1 crypto rx=0x7f573400b5c0 tx=0x7f57340058e0 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:48:11.898 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:11.894+0000 7f574648b700 1 --2- 192.168.123.105:0/493186551 >> [v2:192.168.123.108:6816/2677071054,v1:192.168.123.108:6817/2677071054] conn(0x7f57400611d0 0x7f57400615b0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:48:11.898 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:11.894+0000 7f574648b700 1 -- 192.168.123.105:0/493186551 --> [v2:192.168.123.108:6816/2677071054,v1:192.168.123.108:6817/2677071054] -- command(tid 1: {"prefix": "get_command_descriptions"}) v1 -- 0x7f574004ea90 con 0x7f57400611d0 2026-03-10T07:48:11.899 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:11.895+0000 7f5744a28700 1 --2- 192.168.123.105:0/493186551 >> [v2:192.168.123.108:6816/2677071054,v1:192.168.123.108:6817/2677071054] conn(0x7f57400611d0 0x7f57400615b0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 
crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:48:11.904 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:11.901+0000 7f5744a28700 1 --2- 192.168.123.105:0/493186551 >> [v2:192.168.123.108:6816/2677071054,v1:192.168.123.108:6817/2677071054] conn(0x7f57400611d0 0x7f57400615b0 crc :-1 s=READY pgs=6 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).ready entity=osd.5 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:48:11.906 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:11.902+0000 7f573d7fa700 1 -- 192.168.123.105:0/493186551 <== osd.5 v2:192.168.123.108:6816/2677071054 1 ==== command_reply(tid 1: 0 ) v1 ==== 8+0+25106 (crc 0 0 0) 0x7f574004ea90 con 0x7f57400611d0 2026-03-10T07:48:11.912 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:11.910+0000 7f1beec07700 1 -- 192.168.123.105:0/3980063765 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f1be81082d0 msgr2=0x7f1be8108750 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:48:11.912 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:11.910+0000 7f1beec07700 1 --2- 192.168.123.105:0/3980063765 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f1be81082d0 0x7f1be8108750 secure :-1 s=READY pgs=215 cs=0 l=1 rev1=1 crypto rx=0x7f1bdc009b00 tx=0x7f1bdc009e10 comp rx=0 tx=0).stop 2026-03-10T07:48:11.912 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:11.910+0000 7f1beec07700 1 -- 192.168.123.105:0/3980063765 shutdown_connections 2026-03-10T07:48:11.912 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:11.910+0000 7f1beec07700 1 --2- 192.168.123.105:0/3980063765 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f1be81082d0 0x7f1be8108750 unknown :-1 s=CLOSED pgs=215 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:48:11.912 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:11.910+0000 7f1beec07700 1 --2- 
192.168.123.105:0/3980063765 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f1be810f660 0x7f1be8107d90 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:48:11.912 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:11.910+0000 7f1beec07700 1 -- 192.168.123.105:0/3980063765 >> 192.168.123.105:0/3980063765 conn(0x7f1be806d0f0 msgr2=0x7f1be806d500 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:48:11.913 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:11.911+0000 7f1beec07700 1 -- 192.168.123.105:0/3980063765 shutdown_connections 2026-03-10T07:48:11.913 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:11.911+0000 7f1beec07700 1 -- 192.168.123.105:0/3980063765 wait complete. 2026-03-10T07:48:11.913 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:11.911+0000 7f1beec07700 1 Processor -- start 2026-03-10T07:48:11.913 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:11.912+0000 7f1beec07700 1 -- start start 2026-03-10T07:48:11.914 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:11.902+0000 7f270e0f1700 1 -- 192.168.123.105:0/3365644358 --> [v2:192.168.123.108:6808/551946261,v1:192.168.123.108:6809/551946261] -- command(tid 2: {"prefix": "flush_pg_stats"}) v1 -- 0x7f270804ea90 con 0x7f26f0071db0 2026-03-10T07:48:11.914 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:11.913+0000 7f2704ff9700 1 -- 192.168.123.105:0/3365644358 <== osd.4 v2:192.168.123.108:6808/551946261 2 ==== command_reply(tid 2: 0 ) v1 ==== 8+0+12 (crc 0 0 0) 0x7f270804ea90 con 0x7f26f0071db0 2026-03-10T07:48:11.914 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:11.912+0000 7f1beec07700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f1be810f660 0x7f1be8117e20 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:48:11.915 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:11.912+0000 
7f1beec07700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f1be8112e20 0x7f1be81132a0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:48:11.916 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:11.912+0000 7f1beec07700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f1be81137e0 con 0x7f1be8112e20 2026-03-10T07:48:11.916 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:11.912+0000 7f1beec07700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f1be8113950 con 0x7f1be810f660 2026-03-10T07:48:11.916 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:11.914+0000 7f574648b700 1 -- 192.168.123.105:0/493186551 --> [v2:192.168.123.108:6816/2677071054,v1:192.168.123.108:6817/2677071054] -- command(tid 2: {"prefix": "flush_pg_stats"}) v1 -- 0x7f5740066e80 con 0x7f57400611d0 2026-03-10T07:48:11.916 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:11.914+0000 7f1be7fff700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f1be8112e20 0x7f1be81132a0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:48:11.916 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:11.915+0000 7f573d7fa700 1 -- 192.168.123.105:0/493186551 <== osd.5 v2:192.168.123.108:6816/2677071054 2 ==== command_reply(tid 2: 0 ) v1 ==== 8+0+12 (crc 0 0 0) 0x7f5740066e80 con 0x7f57400611d0 2026-03-10T07:48:11.917 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:11.915+0000 7f5726ffd700 1 -- 192.168.123.105:0/493186551 >> [v2:192.168.123.108:6816/2677071054,v1:192.168.123.108:6817/2677071054] conn(0x7f57400611d0 msgr2=0x7f57400615b0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:48:11.917 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:11.915+0000 
7f5726ffd700 1 --2- 192.168.123.105:0/493186551 >> [v2:192.168.123.108:6816/2677071054,v1:192.168.123.108:6817/2677071054] conn(0x7f57400611d0 0x7f57400615b0 crc :-1 s=READY pgs=6 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:48:11.917 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:11.915+0000 7f5726ffd700 1 -- 192.168.123.105:0/493186551 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f572806c530 msgr2=0x7f572806e9f0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:48:11.917 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:11.915+0000 7f5726ffd700 1 --2- 192.168.123.105:0/493186551 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f572806c530 0x7f572806e9f0 secure :-1 s=READY pgs=74 cs=0 l=1 rev1=1 crypto rx=0x7f573400b5c0 tx=0x7f57340058e0 comp rx=0 tx=0).stop 2026-03-10T07:48:11.917 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:11.915+0000 7f5726ffd700 1 -- 192.168.123.105:0/493186551 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f57401082d0 msgr2=0x7f5740117db0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:48:11.917 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:11.915+0000 7f5726ffd700 1 --2- 192.168.123.105:0/493186551 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f57401082d0 0x7f5740117db0 secure :-1 s=READY pgs=214 cs=0 l=1 rev1=1 crypto rx=0x7f573000ea30 tx=0x7f573000edf0 comp rx=0 tx=0).stop 2026-03-10T07:48:11.917 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:11.916+0000 7f5726ffd700 1 -- 192.168.123.105:0/493186551 shutdown_connections 2026-03-10T07:48:11.917 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:11.916+0000 7f5726ffd700 1 --2- 192.168.123.105:0/493186551 >> [v2:192.168.123.108:6816/2677071054,v1:192.168.123.108:6817/2677071054] conn(0x7f57400611d0 0x7f57400615b0 unknown :-1 s=CLOSED pgs=6 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 
tx=0).stop 2026-03-10T07:48:11.917 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:11.916+0000 7f5726ffd700 1 --2- 192.168.123.105:0/493186551 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f572806c530 0x7f572806e9f0 unknown :-1 s=CLOSED pgs=74 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:48:11.917 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:11.916+0000 7f5726ffd700 1 --2- 192.168.123.105:0/493186551 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f57401082d0 0x7f5740117db0 unknown :-1 s=CLOSED pgs=214 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:48:11.917 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:11.916+0000 7f5726ffd700 1 --2- 192.168.123.105:0/493186551 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f574010f660 0x7f5740112e20 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:48:11.917 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:11.916+0000 7f5726ffd700 1 -- 192.168.123.105:0/493186551 >> 192.168.123.105:0/493186551 conn(0x7f574006d0f0 msgr2=0x7f5740071870 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:48:11.918 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:11.916+0000 7f5726ffd700 1 -- 192.168.123.105:0/493186551 shutdown_connections 2026-03-10T07:48:11.918 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:11.916+0000 7f5726ffd700 1 -- 192.168.123.105:0/493186551 wait complete. 
2026-03-10T07:48:11.918 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:11.914+0000 7f1be7fff700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f1be8112e20 0x7f1be81132a0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.105:38384/0 (socket says 192.168.123.105:38384) 2026-03-10T07:48:11.918 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:11.914+0000 7f1be7fff700 1 -- 192.168.123.105:0/4291512782 learned_addr learned my addr 192.168.123.105:0/4291512782 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T07:48:11.919 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:11.917+0000 7f1bec9a3700 1 --2- 192.168.123.105:0/4291512782 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f1be810f660 0x7f1be8117e20 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:48:11.919 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:11.917+0000 7f270e0f1700 1 -- 192.168.123.105:0/3365644358 >> [v2:192.168.123.108:6808/551946261,v1:192.168.123.108:6809/551946261] conn(0x7f26f0071db0 msgr2=0x7f26f00741d0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:48:11.919 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:11.917+0000 7f270e0f1700 1 --2- 192.168.123.105:0/3365644358 >> [v2:192.168.123.108:6808/551946261,v1:192.168.123.108:6809/551946261] conn(0x7f26f0071db0 0x7f26f00741d0 crc :-1 s=READY pgs=4 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:48:11.920 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:11.917+0000 7f270e0f1700 1 -- 192.168.123.105:0/3365644358 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f26f006c380 msgr2=0x7f26f006e840 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:48:11.920 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:11.917+0000 7f270e0f1700 1 --2- 192.168.123.105:0/3365644358 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f26f006c380 0x7f26f006e840 secure :-1 s=READY pgs=73 cs=0 l=1 rev1=1 crypto rx=0x7f26f800b5c0 tx=0x7f26f800bfa0 comp rx=0 tx=0).stop 2026-03-10T07:48:11.920 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:11.918+0000 7f1bec9a3700 1 -- 192.168.123.105:0/4291512782 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f1be8112e20 msgr2=0x7f1be81132a0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:48:11.920 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:11.918+0000 7f1bec9a3700 1 --2- 192.168.123.105:0/4291512782 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f1be8112e20 0x7f1be81132a0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:48:11.920 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:11.918+0000 7f1bec9a3700 1 -- 192.168.123.105:0/4291512782 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f1bdc0097e0 con 0x7f1be810f660 2026-03-10T07:48:11.920 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:11.917+0000 7f270e0f1700 1 -- 192.168.123.105:0/3365644358 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f270810f660 msgr2=0x7f27081abc20 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:48:11.920 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:11.917+0000 7f270e0f1700 1 --2- 192.168.123.105:0/3365644358 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f270810f660 0x7f27081abc20 secure :-1 s=READY pgs=36 cs=0 l=1 rev1=1 crypto rx=0x7f26fc00eb10 tx=0x7f26fc00ee20 comp rx=0 tx=0).stop 2026-03-10T07:48:11.920 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:11.918+0000 7f270e0f1700 1 -- 
192.168.123.105:0/3365644358 shutdown_connections 2026-03-10T07:48:11.920 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:11.918+0000 7f270e0f1700 1 --2- 192.168.123.105:0/3365644358 >> [v2:192.168.123.108:6808/551946261,v1:192.168.123.108:6809/551946261] conn(0x7f26f0071db0 0x7f26f00741d0 unknown :-1 s=CLOSED pgs=4 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:48:11.921 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:11.918+0000 7f270e0f1700 1 --2- 192.168.123.105:0/3365644358 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f26f006c380 0x7f26f006e840 unknown :-1 s=CLOSED pgs=73 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:48:11.921 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:11.918+0000 7f270e0f1700 1 --2- 192.168.123.105:0/3365644358 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f27081082d0 0x7f27081ab6e0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:48:11.921 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:11.918+0000 7f270e0f1700 1 --2- 192.168.123.105:0/3365644358 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f270810f660 0x7f27081abc20 unknown :-1 s=CLOSED pgs=36 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:48:11.921 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:11.918+0000 7f270e0f1700 1 -- 192.168.123.105:0/3365644358 >> 192.168.123.105:0/3365644358 conn(0x7f270806d0f0 msgr2=0x7f270810d3d0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:48:11.921 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:11.919+0000 7f270e0f1700 1 -- 192.168.123.105:0/3365644358 shutdown_connections 2026-03-10T07:48:11.921 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:11.920+0000 7f1bec9a3700 1 --2- 192.168.123.105:0/4291512782 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f1be810f660 0x7f1be8117e20 secure 
:-1 s=READY pgs=37 cs=0 l=1 rev1=1 crypto rx=0x7f1bd800eab0 tx=0x7f1bd800edc0 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:48:11.922 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:11.920+0000 7f270e0f1700 1 -- 192.168.123.105:0/3365644358 wait complete. 2026-03-10T07:48:11.928 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:11.925+0000 7f1be5ffb700 1 -- 192.168.123.105:0/4291512782 <== mon.1 v2:192.168.123.108:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f1bd800cb20 con 0x7f1be810f660 2026-03-10T07:48:11.929 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:11.925+0000 7f1be5ffb700 1 -- 192.168.123.105:0/4291512782 <== mon.1 v2:192.168.123.108:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f1bd800cc80 con 0x7f1be810f660 2026-03-10T07:48:11.929 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:11.925+0000 7f1be5ffb700 1 -- 192.168.123.105:0/4291512782 <== mon.1 v2:192.168.123.108:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f1bd8018930 con 0x7f1be810f660 2026-03-10T07:48:11.929 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:11.926+0000 7f1beec07700 1 -- 192.168.123.105:0/4291512782 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f1be8113c30 con 0x7f1be810f660 2026-03-10T07:48:11.929 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:11.926+0000 7f1beec07700 1 -- 192.168.123.105:0/4291512782 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f1be81b8860 con 0x7f1be810f660 2026-03-10T07:48:11.930 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:11.917+0000 7f464ed2d700 1 -- 192.168.123.105:0/3408954446 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f464810d0f0 msgr2=0x7f464810d570 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:48:11.930 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:11.917+0000 7f464ed2d700 1 --2- 192.168.123.105:0/3408954446 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f464810d0f0 0x7f464810d570 secure :-1 s=READY pgs=216 cs=0 l=1 rev1=1 crypto rx=0x7f4644009b50 tx=0x7f4644009e60 comp rx=0 tx=0).stop 2026-03-10T07:48:11.930 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:11.928+0000 7f464ed2d700 1 -- 192.168.123.105:0/3408954446 shutdown_connections 2026-03-10T07:48:11.931 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:11.929+0000 7f1be5ffb700 1 -- 192.168.123.105:0/4291512782 <== mon.1 v2:192.168.123.108:3300/0 4 ==== mgrmap(e 19) v1 ==== 90308+0+0 (secure 0 0 0) 0x7f1bd8018a90 con 0x7f1be810f660 2026-03-10T07:48:11.932 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:11.929+0000 7f1be5ffb700 1 --2- 192.168.123.105:0/4291512782 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f1bd006c530 0x7f1bd006e9f0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:48:11.932 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:11.930+0000 7f1be5ffb700 1 -- 192.168.123.105:0/4291512782 <== mon.1 v2:192.168.123.108:3300/0 5 ==== osd_map(33..33 src has 1..33) v4 ==== 4446+0+0 (secure 0 0 0) 0x7f1bd8014070 con 0x7f1be810f660 2026-03-10T07:48:11.933 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:11.928+0000 7f464ed2d700 1 --2- 192.168.123.105:0/3408954446 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f464810d0f0 0x7f464810d570 unknown :-1 s=CLOSED pgs=216 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:48:11.933 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:11.928+0000 7f464ed2d700 1 --2- 192.168.123.105:0/3408954446 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f464810f340 0x7f464810f720 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 
2026-03-10T07:48:11.933 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:11.928+0000 7f464ed2d700 1 -- 192.168.123.105:0/3408954446 >> 192.168.123.105:0/3408954446 conn(0x7f464806ce20 msgr2=0x7f464806d230 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:48:11.933 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:11.931+0000 7f1be7fff700 1 --2- 192.168.123.105:0/4291512782 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f1bd006c530 0x7f1bd006e9f0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:48:11.933 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:11.931+0000 7f1beec07700 1 --2- 192.168.123.105:0/4291512782 >> [v2:192.168.123.105:6818/4116227648,v1:192.168.123.105:6819/4116227648] conn(0x7f1be80611d0 0x7f1be80615b0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:48:11.933 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:11.932+0000 7f464ed2d700 1 -- 192.168.123.105:0/3408954446 shutdown_connections 2026-03-10T07:48:11.936 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:11.934+0000 7f464ed2d700 1 -- 192.168.123.105:0/3408954446 wait complete. 
2026-03-10T07:48:11.937 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:11.935+0000 7f1bed1a4700 1 --2- 192.168.123.105:0/4291512782 >> [v2:192.168.123.105:6818/4116227648,v1:192.168.123.105:6819/4116227648] conn(0x7f1be80611d0 0x7f1be80615b0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:48:11.937 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:11.935+0000 7f1be7fff700 1 --2- 192.168.123.105:0/4291512782 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f1bd006c530 0x7f1bd006e9f0 secure :-1 s=READY pgs=75 cs=0 l=1 rev1=1 crypto rx=0x7f1bdc005fd0 tx=0x7f1bdc019040 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:48:11.937 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:11.935+0000 7f1bed1a4700 1 --2- 192.168.123.105:0/4291512782 >> [v2:192.168.123.105:6818/4116227648,v1:192.168.123.105:6819/4116227648] conn(0x7f1be80611d0 0x7f1be80615b0 crc :-1 s=READY pgs=6 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).ready entity=osd.2 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:48:11.937 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:11.936+0000 7f1beec07700 1 -- 192.168.123.105:0/4291512782 --> [v2:192.168.123.105:6818/4116227648,v1:192.168.123.105:6819/4116227648] -- command(tid 1: {"prefix": "get_command_descriptions"}) v1 -- 0x7f1be804ea90 con 0x7f1be80611d0 2026-03-10T07:48:11.938 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:11.934+0000 7f464ed2d700 1 Processor -- start 2026-03-10T07:48:11.938 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:11.936+0000 7f464ed2d700 1 -- start start 2026-03-10T07:48:11.939 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:11.936+0000 7f464ed2d700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f464810d0f0 0x7f4648117050 unknown :-1 s=NONE pgs=0 
cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:48:11.939 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:11.936+0000 7f464ed2d700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f464811c050 0x7f4648117590 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:48:11.939 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:11.936+0000 7f464ed2d700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f4648117b60 con 0x7f464811c050 2026-03-10T07:48:11.939 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:11.936+0000 7f464ed2d700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f4648117ca0 con 0x7f464810d0f0 2026-03-10T07:48:11.939 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:11.936+0000 7f464d52a700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f464811c050 0x7f4648117590 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:48:11.939 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:11.936+0000 7f464d52a700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f464811c050 0x7f4648117590 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.105:38398/0 (socket says 192.168.123.105:38398) 2026-03-10T07:48:11.939 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:11.936+0000 7f464d52a700 1 -- 192.168.123.105:0/2542494304 learned_addr learned my addr 192.168.123.105:0/2542494304 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T07:48:11.939 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:11.937+0000 7f464d52a700 1 -- 192.168.123.105:0/2542494304 >> 
[v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f464810d0f0 msgr2=0x7f4648117050 unknown :-1 s=STATE_CONNECTING l=1).mark_down 2026-03-10T07:48:11.940 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:11.937+0000 7f464d52a700 1 --2- 192.168.123.105:0/2542494304 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f464810d0f0 0x7f4648117050 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:48:11.940 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:11.937+0000 7f464d52a700 1 -- 192.168.123.105:0/2542494304 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f46440097e0 con 0x7f464811c050 2026-03-10T07:48:11.940 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:11.938+0000 7f1be5ffb700 1 -- 192.168.123.105:0/4291512782 <== osd.2 v2:192.168.123.105:6818/4116227648 1 ==== command_reply(tid 1: 0 ) v1 ==== 8+0+25106 (crc 0 0 0) 0x7f1be804ea90 con 0x7f1be80611d0 2026-03-10T07:48:11.940 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:11.937+0000 7f464d52a700 1 --2- 192.168.123.105:0/2542494304 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f464811c050 0x7f4648117590 secure :-1 s=READY pgs=217 cs=0 l=1 rev1=1 crypto rx=0x7f4644006010 tx=0x7f464400b870 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:48:11.940 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:11.938+0000 7f463effd700 1 -- 192.168.123.105:0/2542494304 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f464401d070 con 0x7f464811c050 2026-03-10T07:48:11.940 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:11.938+0000 7f463effd700 1 -- 192.168.123.105:0/2542494304 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f464400bc80 con 0x7f464811c050 2026-03-10T07:48:11.940 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:11.938+0000 7f463effd700 1 -- 192.168.123.105:0/2542494304 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f4644021920 con 0x7f464811c050 2026-03-10T07:48:11.941 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:11.938+0000 7f464ed2d700 1 -- 192.168.123.105:0/2542494304 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f46481b84a0 con 0x7f464811c050 2026-03-10T07:48:11.941 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:11.939+0000 7f464ed2d700 1 -- 192.168.123.105:0/2542494304 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f46481b8800 con 0x7f464811c050 2026-03-10T07:48:11.945 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:11.941+0000 7f463effd700 1 -- 192.168.123.105:0/2542494304 <== mon.0 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 19) v1 ==== 90308+0+0 (secure 0 0 0) 0x7f464400f460 con 0x7f464811c050 2026-03-10T07:48:11.946 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:11.944+0000 7f463effd700 1 --2- 192.168.123.105:0/2542494304 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f463406c600 0x7f463406eac0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:48:11.947 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:11.945+0000 7f464dd2b700 1 --2- 192.168.123.105:0/2542494304 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f463406c600 0x7f463406eac0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:48:11.947 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:11.946+0000 7f463effd700 1 -- 192.168.123.105:0/2542494304 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(33..33 src has 1..33) v4 ==== 4446+0+0 (secure 0 0 0) 
0x7f464408c6a0 con 0x7f464811c050 2026-03-10T07:48:11.948 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:11.946+0000 7f464dd2b700 1 --2- 192.168.123.105:0/2542494304 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f463406c600 0x7f463406eac0 secure :-1 s=READY pgs=76 cs=0 l=1 rev1=1 crypto rx=0x7f4648072e70 tx=0x7f4638006c60 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:48:11.948 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:11.947+0000 7f464ed2d700 1 --2- 192.168.123.105:0/2542494304 >> [v2:192.168.123.105:6810/1508516757,v1:192.168.123.105:6811/1508516757] conn(0x7f462c001610 0x7f462c003ad0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:48:11.949 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:11.947+0000 7f464ed2d700 1 -- 192.168.123.105:0/2542494304 --> [v2:192.168.123.105:6810/1508516757,v1:192.168.123.105:6811/1508516757] -- command(tid 1: {"prefix": "get_command_descriptions"}) v1 -- 0x7f462c006c00 con 0x7f462c001610 2026-03-10T07:48:11.949 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:11.947+0000 7f464e52c700 1 --2- 192.168.123.105:0/2542494304 >> [v2:192.168.123.105:6810/1508516757,v1:192.168.123.105:6811/1508516757] conn(0x7f462c001610 0x7f462c003ad0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:48:11.950 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:11.948+0000 7f464e52c700 1 --2- 192.168.123.105:0/2542494304 >> [v2:192.168.123.105:6810/1508516757,v1:192.168.123.105:6811/1508516757] conn(0x7f462c001610 0x7f462c003ad0 crc :-1 s=READY pgs=9 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).ready entity=osd.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:48:11.950 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:11.948+0000 7f463effd700 1 -- 
192.168.123.105:0/2542494304 <== osd.1 v2:192.168.123.105:6810/1508516757 1 ==== command_reply(tid 1: 0 ) v1 ==== 8+0+25106 (crc 0 0 0) 0x7f462c006c00 con 0x7f462c001610 2026-03-10T07:48:11.977 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:11.975+0000 7f464ed2d700 1 -- 192.168.123.105:0/2542494304 --> [v2:192.168.123.105:6810/1508516757,v1:192.168.123.105:6811/1508516757] -- command(tid 2: {"prefix": "flush_pg_stats"}) v1 -- 0x7f462c005ce0 con 0x7f462c001610 2026-03-10T07:48:11.979 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:11.977+0000 7f463effd700 1 -- 192.168.123.105:0/2542494304 <== osd.1 v2:192.168.123.105:6810/1508516757 2 ==== command_reply(tid 2: 0 ) v1 ==== 8+0+11 (crc 0 0 0) 0x7f462c005ce0 con 0x7f462c001610 2026-03-10T07:48:11.979 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:11.978+0000 7f464ed2d700 1 -- 192.168.123.105:0/2542494304 >> [v2:192.168.123.105:6810/1508516757,v1:192.168.123.105:6811/1508516757] conn(0x7f462c001610 msgr2=0x7f462c003ad0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:48:11.979 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:11.978+0000 7f464ed2d700 1 --2- 192.168.123.105:0/2542494304 >> [v2:192.168.123.105:6810/1508516757,v1:192.168.123.105:6811/1508516757] conn(0x7f462c001610 0x7f462c003ad0 crc :-1 s=READY pgs=9 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:48:11.982 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:11.979+0000 7f464ed2d700 1 -- 192.168.123.105:0/2542494304 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f463406c600 msgr2=0x7f463406eac0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:48:11.982 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:11.979+0000 7f1beec07700 1 -- 192.168.123.105:0/4291512782 --> [v2:192.168.123.105:6818/4116227648,v1:192.168.123.105:6819/4116227648] -- command(tid 2: {"prefix": "flush_pg_stats"}) v1 -- 0x7f1be8066e80 con 
0x7f1be80611d0 2026-03-10T07:48:11.982 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:11.980+0000 7f1be5ffb700 1 -- 192.168.123.105:0/4291512782 <== osd.2 v2:192.168.123.105:6818/4116227648 2 ==== command_reply(tid 2: 0 ) v1 ==== 8+0+11 (crc 0 0 0) 0x7f1be8066e80 con 0x7f1be80611d0 2026-03-10T07:48:11.982 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:11.980+0000 7f464ed2d700 1 --2- 192.168.123.105:0/2542494304 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f463406c600 0x7f463406eac0 secure :-1 s=READY pgs=76 cs=0 l=1 rev1=1 crypto rx=0x7f4648072e70 tx=0x7f4638006c60 comp rx=0 tx=0).stop 2026-03-10T07:48:11.983 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:11.981+0000 7f1bcf7fe700 1 -- 192.168.123.105:0/4291512782 >> [v2:192.168.123.105:6818/4116227648,v1:192.168.123.105:6819/4116227648] conn(0x7f1be80611d0 msgr2=0x7f1be80615b0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:48:11.983 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:11.981+0000 7f1bcf7fe700 1 --2- 192.168.123.105:0/4291512782 >> [v2:192.168.123.105:6818/4116227648,v1:192.168.123.105:6819/4116227648] conn(0x7f1be80611d0 0x7f1be80615b0 crc :-1 s=READY pgs=6 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:48:11.983 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:11.981+0000 7f1bcf7fe700 1 -- 192.168.123.105:0/4291512782 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f1bd006c530 msgr2=0x7f1bd006e9f0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:48:11.983 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:11.981+0000 7f1bcf7fe700 1 --2- 192.168.123.105:0/4291512782 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f1bd006c530 0x7f1bd006e9f0 secure :-1 s=READY pgs=75 cs=0 l=1 rev1=1 crypto rx=0x7f1bdc005fd0 tx=0x7f1bdc019040 comp rx=0 tx=0).stop 2026-03-10T07:48:11.983 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:11.981+0000 7f1bcf7fe700 1 -- 192.168.123.105:0/4291512782 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f1be810f660 msgr2=0x7f1be8117e20 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:48:11.983 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:11.981+0000 7f1bcf7fe700 1 --2- 192.168.123.105:0/4291512782 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f1be810f660 0x7f1be8117e20 secure :-1 s=READY pgs=37 cs=0 l=1 rev1=1 crypto rx=0x7f1bd800eab0 tx=0x7f1bd800edc0 comp rx=0 tx=0).stop 2026-03-10T07:48:11.983 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:11.981+0000 7f1bcf7fe700 1 -- 192.168.123.105:0/4291512782 shutdown_connections 2026-03-10T07:48:11.983 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:11.981+0000 7f1bcf7fe700 1 --2- 192.168.123.105:0/4291512782 >> [v2:192.168.123.105:6818/4116227648,v1:192.168.123.105:6819/4116227648] conn(0x7f1be80611d0 0x7f1be80615b0 unknown :-1 s=CLOSED pgs=6 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:48:11.983 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:11.981+0000 7f1bcf7fe700 1 --2- 192.168.123.105:0/4291512782 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f1bd006c530 0x7f1bd006e9f0 unknown :-1 s=CLOSED pgs=75 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:48:11.983 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:11.981+0000 7f1bcf7fe700 1 --2- 192.168.123.105:0/4291512782 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f1be810f660 0x7f1be8117e20 unknown :-1 s=CLOSED pgs=37 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:48:11.983 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:11.981+0000 7f1bcf7fe700 1 --2- 192.168.123.105:0/4291512782 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f1be8112e20 0x7f1be81132a0 unknown :-1 
s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:48:11.983 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:11.981+0000 7f1bcf7fe700 1 -- 192.168.123.105:0/4291512782 >> 192.168.123.105:0/4291512782 conn(0x7f1be806d0f0 msgr2=0x7f1be8071870 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:48:11.983 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:11.982+0000 7f1bcf7fe700 1 -- 192.168.123.105:0/4291512782 shutdown_connections 2026-03-10T07:48:11.983 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:11.982+0000 7f1bcf7fe700 1 -- 192.168.123.105:0/4291512782 wait complete. 2026-03-10T07:48:11.984 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:11.981+0000 7f464ed2d700 1 -- 192.168.123.105:0/2542494304 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f464811c050 msgr2=0x7f4648117590 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:48:11.984 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:11.982+0000 7f464ed2d700 1 --2- 192.168.123.105:0/2542494304 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f464811c050 0x7f4648117590 secure :-1 s=READY pgs=217 cs=0 l=1 rev1=1 crypto rx=0x7f4644006010 tx=0x7f464400b870 comp rx=0 tx=0).stop 2026-03-10T07:48:11.984 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:11.982+0000 7f464ed2d700 1 -- 192.168.123.105:0/2542494304 shutdown_connections 2026-03-10T07:48:11.984 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:11.983+0000 7f464ed2d700 1 --2- 192.168.123.105:0/2542494304 >> [v2:192.168.123.105:6810/1508516757,v1:192.168.123.105:6811/1508516757] conn(0x7f462c001610 0x7f462c003ad0 unknown :-1 s=CLOSED pgs=9 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:48:11.984 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:11.983+0000 7f464ed2d700 1 --2- 192.168.123.105:0/2542494304 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] 
conn(0x7f463406c600 0x7f463406eac0 unknown :-1 s=CLOSED pgs=76 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:48:11.984 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:11.983+0000 7f464ed2d700 1 --2- 192.168.123.105:0/2542494304 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f464810d0f0 0x7f4648117050 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:48:11.984 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:11.983+0000 7f464ed2d700 1 --2- 192.168.123.105:0/2542494304 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f464811c050 0x7f4648117590 unknown :-1 s=CLOSED pgs=217 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:48:11.985 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:11.984+0000 7f464ed2d700 1 -- 192.168.123.105:0/2542494304 >> 192.168.123.105:0/2542494304 conn(0x7f464806ce20 msgr2=0x7f4648109de0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:48:11.986 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:11.985+0000 7f464ed2d700 1 -- 192.168.123.105:0/2542494304 shutdown_connections 2026-03-10T07:48:11.987 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:11.986+0000 7f464ed2d700 1 -- 192.168.123.105:0/2542494304 wait complete. 
2026-03-10T07:48:12.015 INFO:teuthology.orchestra.run.vm05.stdout:137438953475 2026-03-10T07:48:12.016 DEBUG:teuthology.orchestra.run.vm05:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell --fsid 12e9780e-1c55-11f1-8896-79f7c2e9b508 -- ceph osd last-stat-seq osd.5 2026-03-10T07:48:12.030 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:12.024+0000 7f346e9d1700 1 -- 192.168.123.105:0/3429443742 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f34681089a0 msgr2=0x7f346810be70 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:48:12.030 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:12.024+0000 7f346e9d1700 1 --2- 192.168.123.105:0/3429443742 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f34681089a0 0x7f346810be70 secure :-1 s=READY pgs=218 cs=0 l=1 rev1=1 crypto rx=0x7f345c009b50 tx=0x7f345c009e60 comp rx=0 tx=0).stop 2026-03-10T07:48:12.032 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:12.029+0000 7f346e9d1700 1 -- 192.168.123.105:0/3429443742 shutdown_connections 2026-03-10T07:48:12.032 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:12.029+0000 7f346e9d1700 1 --2- 192.168.123.105:0/3429443742 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f34681089a0 0x7f346810be70 unknown :-1 s=CLOSED pgs=218 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:48:12.032 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:12.029+0000 7f346e9d1700 1 --2- 192.168.123.105:0/3429443742 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f3468107ff0 0x7f34681083d0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:48:12.032 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:12.029+0000 7f346e9d1700 1 -- 192.168.123.105:0/3429443742 >> 192.168.123.105:0/3429443742 conn(0x7f346806ce20 msgr2=0x7f346806d230 unknown :-1 s=STATE_NONE l=0).mark_down 
2026-03-10T07:48:12.036 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:12.029+0000 7f346e9d1700 1 -- 192.168.123.105:0/3429443742 shutdown_connections 2026-03-10T07:48:12.036 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:12.029+0000 7f346e9d1700 1 -- 192.168.123.105:0/3429443742 wait complete. 2026-03-10T07:48:12.036 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:12.029+0000 7f346e9d1700 1 Processor -- start 2026-03-10T07:48:12.036 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:12.029+0000 7f346e9d1700 1 -- start start 2026-03-10T07:48:12.036 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:12.029+0000 7f346e9d1700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3468107ff0 0x7f346807e210 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:48:12.036 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:12.029+0000 7f346e9d1700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f346807e750 0x7f34681b8e10 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:48:12.036 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:12.029+0000 7f346e9d1700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f346807ebd0 con 0x7f3468107ff0 2026-03-10T07:48:12.036 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:12.029+0000 7f346e9d1700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f346807ed40 con 0x7f346807e750 2026-03-10T07:48:12.036 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:12.030+0000 7f3467fff700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3468107ff0 0x7f346807e210 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:48:12.036 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:12.030+0000 7f3467fff700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3468107ff0 0x7f346807e210 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.105:38410/0 (socket says 192.168.123.105:38410) 2026-03-10T07:48:12.036 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:12.030+0000 7f3467fff700 1 -- 192.168.123.105:0/3438613343 learned_addr learned my addr 192.168.123.105:0/3438613343 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T07:48:12.036 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:12.030+0000 7f3467fff700 1 -- 192.168.123.105:0/3438613343 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f346807e750 msgr2=0x7f34681b8e10 unknown :-1 s=STATE_CONNECTING l=1).mark_down 2026-03-10T07:48:12.036 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:12.030+0000 7f3467fff700 1 --2- 192.168.123.105:0/3438613343 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f346807e750 0x7f34681b8e10 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:48:12.036 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:12.030+0000 7f3467fff700 1 -- 192.168.123.105:0/3438613343 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f345c0097e0 con 0x7f3468107ff0 2026-03-10T07:48:12.036 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:12.030+0000 7f3467fff700 1 --2- 192.168.123.105:0/3438613343 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3468107ff0 0x7f346807e210 secure :-1 s=READY pgs=219 cs=0 l=1 rev1=1 crypto rx=0x7f345800ed70 tx=0x7f345800c5b0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:48:12.036 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:12.030+0000 7f34657fa700 1 -- 192.168.123.105:0/3438613343 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f3458009980 con 0x7f3468107ff0 2026-03-10T07:48:12.036 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:12.031+0000 7f346e9d1700 1 -- 192.168.123.105:0/3438613343 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f34681b9350 con 0x7f3468107ff0 2026-03-10T07:48:12.036 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:12.031+0000 7f346e9d1700 1 -- 192.168.123.105:0/3438613343 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f346807a660 con 0x7f3468107ff0 2026-03-10T07:48:12.036 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:12.034+0000 7f34657fa700 1 -- 192.168.123.105:0/3438613343 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f345800cd70 con 0x7f3468107ff0 2026-03-10T07:48:12.037 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:12.034+0000 7f34657fa700 1 -- 192.168.123.105:0/3438613343 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f34580189c0 con 0x7f3468107ff0 2026-03-10T07:48:12.037 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:12.034+0000 7f34657fa700 1 -- 192.168.123.105:0/3438613343 <== mon.0 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 19) v1 ==== 90308+0+0 (secure 0 0 0) 0x7f3458018be0 con 0x7f3468107ff0 2026-03-10T07:48:12.039 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:12.036+0000 7f34657fa700 1 --2- 192.168.123.105:0/3438613343 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f345006c6e0 0x7f345006eba0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:48:12.042 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:12.036+0000 
7f34677fe700 1 --2- 192.168.123.105:0/3438613343 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f345006c6e0 0x7f345006eba0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:48:12.042 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:12.036+0000 7f34677fe700 1 --2- 192.168.123.105:0/3438613343 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f345006c6e0 0x7f345006eba0 secure :-1 s=READY pgs=77 cs=0 l=1 rev1=1 crypto rx=0x7f345c009b20 tx=0x7f345c005bc0 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:48:12.042 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:12.036+0000 7f34657fa700 1 -- 192.168.123.105:0/3438613343 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(33..33 src has 1..33) v4 ==== 4446+0+0 (secure 0 0 0) 0x7f3458014070 con 0x7f3468107ff0 2026-03-10T07:48:12.042 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:12.036+0000 7f346e9d1700 1 --2- 192.168.123.105:0/3438613343 >> [v2:192.168.123.108:6800/2127000679,v1:192.168.123.108:6801/2127000679] conn(0x7f3454001610 0x7f3454003ad0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:48:12.042 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:12.036+0000 7f346e9d1700 1 -- 192.168.123.105:0/3438613343 --> [v2:192.168.123.108:6800/2127000679,v1:192.168.123.108:6801/2127000679] -- command(tid 1: {"prefix": "get_command_descriptions"}) v1 -- 0x7f3454006c00 con 0x7f3454001610 2026-03-10T07:48:12.042 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:12.037+0000 7f346cf6e700 1 --2- 192.168.123.105:0/3438613343 >> [v2:192.168.123.108:6800/2127000679,v1:192.168.123.108:6801/2127000679] conn(0x7f3454001610 0x7f3454003ad0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload 
supported=3 required=0 2026-03-10T07:48:12.042 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:12.038+0000 7f346cf6e700 1 --2- 192.168.123.105:0/3438613343 >> [v2:192.168.123.108:6800/2127000679,v1:192.168.123.108:6801/2127000679] conn(0x7f3454001610 0x7f3454003ad0 crc :-1 s=READY pgs=6 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).ready entity=osd.3 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:48:12.042 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:12.039+0000 7f34657fa700 1 -- 192.168.123.105:0/3438613343 <== osd.3 v2:192.168.123.108:6800/2127000679 1 ==== command_reply(tid 1: 0 ) v1 ==== 8+0+25106 (crc 0 0 0) 0x7f3454006c00 con 0x7f3454001610 2026-03-10T07:48:12.091 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:12.088+0000 7f346e9d1700 1 -- 192.168.123.105:0/3438613343 --> [v2:192.168.123.108:6800/2127000679,v1:192.168.123.108:6801/2127000679] -- command(tid 2: {"prefix": "flush_pg_stats"}) v1 -- 0x7f3454005ce0 con 0x7f3454001610 2026-03-10T07:48:12.093 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:12.090+0000 7f34657fa700 1 -- 192.168.123.105:0/3438613343 <== osd.3 v2:192.168.123.108:6800/2127000679 2 ==== command_reply(tid 2: 0 ) v1 ==== 8+0+11 (crc 0 0 0) 0x7f3454005ce0 con 0x7f3454001610 2026-03-10T07:48:12.098 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:12.095+0000 7f346e9d1700 1 -- 192.168.123.105:0/3438613343 >> [v2:192.168.123.108:6800/2127000679,v1:192.168.123.108:6801/2127000679] conn(0x7f3454001610 msgr2=0x7f3454003ad0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:48:12.098 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:12.095+0000 7f346e9d1700 1 --2- 192.168.123.105:0/3438613343 >> [v2:192.168.123.108:6800/2127000679,v1:192.168.123.108:6801/2127000679] conn(0x7f3454001610 0x7f3454003ad0 crc :-1 s=READY pgs=6 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:48:12.099 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:12.096+0000 7f346e9d1700 1 -- 192.168.123.105:0/3438613343 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f345006c6e0 msgr2=0x7f345006eba0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:48:12.099 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:12.096+0000 7f346e9d1700 1 --2- 192.168.123.105:0/3438613343 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f345006c6e0 0x7f345006eba0 secure :-1 s=READY pgs=77 cs=0 l=1 rev1=1 crypto rx=0x7f345c009b20 tx=0x7f345c005bc0 comp rx=0 tx=0).stop 2026-03-10T07:48:12.099 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:12.096+0000 7f346e9d1700 1 -- 192.168.123.105:0/3438613343 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3468107ff0 msgr2=0x7f346807e210 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:48:12.099 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:12.096+0000 7f346e9d1700 1 --2- 192.168.123.105:0/3438613343 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3468107ff0 0x7f346807e210 secure :-1 s=READY pgs=219 cs=0 l=1 rev1=1 crypto rx=0x7f345800ed70 tx=0x7f345800c5b0 comp rx=0 tx=0).stop 2026-03-10T07:48:12.100 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:12.098+0000 7f346e9d1700 1 -- 192.168.123.105:0/3438613343 shutdown_connections 2026-03-10T07:48:12.100 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:12.098+0000 7f346e9d1700 1 --2- 192.168.123.105:0/3438613343 >> [v2:192.168.123.108:6800/2127000679,v1:192.168.123.108:6801/2127000679] conn(0x7f3454001610 0x7f3454003ad0 unknown :-1 s=CLOSED pgs=6 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:48:12.100 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:12.098+0000 7f346e9d1700 1 --2- 192.168.123.105:0/3438613343 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f345006c6e0 0x7f345006eba0 unknown 
:-1 s=CLOSED pgs=77 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:48:12.100 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:12.098+0000 7f346e9d1700 1 --2- 192.168.123.105:0/3438613343 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3468107ff0 0x7f346807e210 unknown :-1 s=CLOSED pgs=219 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:48:12.100 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:12.098+0000 7f346e9d1700 1 --2- 192.168.123.105:0/3438613343 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f346807e750 0x7f34681b8e10 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:48:12.100 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:12.098+0000 7f346e9d1700 1 -- 192.168.123.105:0/3438613343 >> 192.168.123.105:0/3438613343 conn(0x7f346806ce20 msgr2=0x7f3468070600 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:48:12.101 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:12.098+0000 7f346e9d1700 1 -- 192.168.123.105:0/3438613343 shutdown_connections 2026-03-10T07:48:12.101 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:12.099+0000 7f346e9d1700 1 -- 192.168.123.105:0/3438613343 wait complete. 
2026-03-10T07:48:12.125 INFO:teuthology.orchestra.run.vm05.stdout:73014444040 2026-03-10T07:48:12.125 DEBUG:teuthology.orchestra.run.vm05:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell --fsid 12e9780e-1c55-11f1-8896-79f7c2e9b508 -- ceph osd last-stat-seq osd.2 2026-03-10T07:48:12.128 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:12.125+0000 7f246d231700 1 -- 192.168.123.105:0/3275107314 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f246810f340 msgr2=0x7f246810f720 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:48:12.128 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:12.125+0000 7f246d231700 1 --2- 192.168.123.105:0/3275107314 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f246810f340 0x7f246810f720 secure :-1 s=READY pgs=220 cs=0 l=1 rev1=1 crypto rx=0x7f2458009b50 tx=0x7f2458009e60 comp rx=0 tx=0).stop 2026-03-10T07:48:12.128 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:12.125+0000 7f246d231700 1 -- 192.168.123.105:0/3275107314 shutdown_connections 2026-03-10T07:48:12.128 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:12.125+0000 7f246d231700 1 --2- 192.168.123.105:0/3275107314 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f246810d0f0 0x7f246810d570 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:48:12.128 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:12.125+0000 7f246d231700 1 --2- 192.168.123.105:0/3275107314 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f246810f340 0x7f246810f720 secure :-1 s=CLOSED pgs=220 cs=0 l=1 rev1=1 crypto rx=0x7f2458009b50 tx=0x7f2458009e60 comp rx=0 tx=0).stop 2026-03-10T07:48:12.128 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:12.125+0000 7f246d231700 1 -- 192.168.123.105:0/3275107314 >> 192.168.123.105:0/3275107314 conn(0x7f246806ce20 msgr2=0x7f246806d230 unknown :-1 s=STATE_NONE 
l=0).mark_down 2026-03-10T07:48:12.143 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:12.129+0000 7f246d231700 1 -- 192.168.123.105:0/3275107314 shutdown_connections 2026-03-10T07:48:12.143 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:12.131+0000 7f246d231700 1 -- 192.168.123.105:0/3275107314 wait complete. 2026-03-10T07:48:12.143 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:12.131+0000 7f246d231700 1 Processor -- start 2026-03-10T07:48:12.143 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:12.131+0000 7f246d231700 1 -- start start 2026-03-10T07:48:12.143 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:12.131+0000 7f246d231700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f246810d0f0 0x7f24681171a0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:48:12.143 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:12.131+0000 7f246d231700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f246811c1a0 0x7f24681176e0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:48:12.143 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:12.131+0000 7f246d231700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f2468117dc0 con 0x7f246811c1a0 2026-03-10T07:48:12.143 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:12.131+0000 7f246d231700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f2468117f30 con 0x7f246810d0f0 2026-03-10T07:48:12.143 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:12.133+0000 7f24677fe700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f246811c1a0 0x7f24681176e0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 
2026-03-10T07:48:12.143 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:12.133+0000 7f24677fe700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f246811c1a0 0x7f24681176e0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.105:38434/0 (socket says 192.168.123.105:38434) 2026-03-10T07:48:12.143 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:12.133+0000 7f24677fe700 1 -- 192.168.123.105:0/1470270227 learned_addr learned my addr 192.168.123.105:0/1470270227 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T07:48:12.143 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:12.133+0000 7f24677fe700 1 -- 192.168.123.105:0/1470270227 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f246810d0f0 msgr2=0x7f24681171a0 unknown :-1 s=STATE_CONNECTING_RE l=1).mark_down 2026-03-10T07:48:12.143 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:12.133+0000 7f24677fe700 1 --2- 192.168.123.105:0/1470270227 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f246810d0f0 0x7f24681171a0 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:48:12.143 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:12.133+0000 7f24677fe700 1 -- 192.168.123.105:0/1470270227 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f24580097e0 con 0x7f246811c1a0 2026-03-10T07:48:12.143 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:12.133+0000 7f24677fe700 1 --2- 192.168.123.105:0/1470270227 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f246811c1a0 0x7f24681176e0 secure :-1 s=READY pgs=221 cs=0 l=1 rev1=1 crypto rx=0x7f245c00ed70 tx=0x7f245c00c5b0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:48:12.143 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:12.134+0000 7f24657fa700 1 -- 192.168.123.105:0/1470270227 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f245c009980 con 0x7f246811c1a0 2026-03-10T07:48:12.143 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:12.134+0000 7f24657fa700 1 -- 192.168.123.105:0/1470270227 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f245c00cd70 con 0x7f246811c1a0 2026-03-10T07:48:12.143 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:12.134+0000 7f24657fa700 1 -- 192.168.123.105:0/1470270227 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f245c0189c0 con 0x7f246811c1a0 2026-03-10T07:48:12.143 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:12.134+0000 7f246d231700 1 -- 192.168.123.105:0/1470270227 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f24681b8400 con 0x7f246811c1a0 2026-03-10T07:48:12.143 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:12.134+0000 7f246d231700 1 -- 192.168.123.105:0/1470270227 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f24681b8950 con 0x7f246811c1a0 2026-03-10T07:48:12.143 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:12.135+0000 7f244effd700 1 -- 192.168.123.105:0/1470270227 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_get_version(what=osdmap handle=1) v1 -- 0x7f2454000ff0 con 0x7f246811c1a0 2026-03-10T07:48:12.146 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:12.137+0000 7f24657fa700 1 -- 192.168.123.105:0/1470270227 <== mon.0 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 19) v1 ==== 90308+0+0 (secure 0 0 0) 0x7f245c018b20 con 0x7f246811c1a0 2026-03-10T07:48:12.146 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:12.143+0000 7f24657fa700 1 --2- 
192.168.123.105:0/1470270227 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f245006c600 0x7f245006eac0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:48:12.146 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:12.143+0000 7f24657fa700 1 -- 192.168.123.105:0/1470270227 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(33..33 src has 1..33) v4 ==== 4446+0+0 (secure 0 0 0) 0x7f245c014070 con 0x7f246811c1a0 2026-03-10T07:48:12.146 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:12.143+0000 7f24657fa700 1 --2- 192.168.123.105:0/1470270227 >> [v2:192.168.123.105:6802/196713793,v1:192.168.123.105:6803/196713793] conn(0x7f24500721a0 0x7f24500745c0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:48:12.146 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:12.143+0000 7f24657fa700 1 -- 192.168.123.105:0/1470270227 --> [v2:192.168.123.105:6802/196713793,v1:192.168.123.105:6803/196713793] -- command(tid 1: {"prefix": "get_command_descriptions"}) v1 -- 0x7f2450074c70 con 0x7f24500721a0 2026-03-10T07:48:12.146 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:12.143+0000 7f24657fa700 1 -- 192.168.123.105:0/1470270227 <== mon.0 v2:192.168.123.105:3300/0 6 ==== mon_get_version_reply(handle=1 version=33) v2 ==== 24+0+0 (secure 0 0 0) 0x7f245c08bf90 con 0x7f246811c1a0 2026-03-10T07:48:12.146 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:12.144+0000 7f246ca30700 1 --2- 192.168.123.105:0/1470270227 >> [v2:192.168.123.105:6802/196713793,v1:192.168.123.105:6803/196713793] conn(0x7f24500721a0 0x7f24500745c0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:48:12.146 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:12.144+0000 7f246ca30700 1 --2- 192.168.123.105:0/1470270227 >> 
[v2:192.168.123.105:6802/196713793,v1:192.168.123.105:6803/196713793] conn(0x7f24500721a0 0x7f24500745c0 crc :-1 s=READY pgs=7 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).ready entity=osd.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:48:12.160 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:12.151+0000 7f2467fff700 1 --2- 192.168.123.105:0/1470270227 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f245006c600 0x7f245006eac0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:48:12.161 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:12.151+0000 7f24657fa700 1 -- 192.168.123.105:0/1470270227 <== osd.0 v2:192.168.123.105:6802/196713793 1 ==== command_reply(tid 1: 0 ) v1 ==== 8+0+25106 (crc 0 0 0) 0x7f2450074c70 con 0x7f24500721a0 2026-03-10T07:48:12.161 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:12.152+0000 7f2467fff700 1 --2- 192.168.123.105:0/1470270227 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f245006c600 0x7f245006eac0 secure :-1 s=READY pgs=78 cs=0 l=1 rev1=1 crypto rx=0x7f2458005e50 tx=0x7f2458005da0 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:48:12.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:48:11 vm08 ceph-mon[59917]: pgmap v65: 1 pgs: 1 active+clean; 449 KiB data, 160 MiB used, 120 GiB / 120 GiB avail 2026-03-10T07:48:12.175 INFO:teuthology.orchestra.run.vm05.stdout:55834574858 2026-03-10T07:48:12.176 DEBUG:teuthology.orchestra.run.vm05:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell --fsid 12e9780e-1c55-11f1-8896-79f7c2e9b508 -- ceph osd last-stat-seq osd.1 2026-03-10T07:48:12.187 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:12.182+0000 7f244effd700 1 -- 192.168.123.105:0/1470270227 --> 
[v2:192.168.123.105:6802/196713793,v1:192.168.123.105:6803/196713793] -- command(tid 2: {"prefix": "flush_pg_stats"}) v1 -- 0x7f2454002d70 con 0x7f24500721a0 2026-03-10T07:48:12.187 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:12.183+0000 7f24657fa700 1 -- 192.168.123.105:0/1470270227 <== osd.0 v2:192.168.123.105:6802/196713793 2 ==== command_reply(tid 2: 0 ) v1 ==== 8+0+11 (crc 0 0 0) 0x7f2454002d70 con 0x7f24500721a0 2026-03-10T07:48:12.209 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:12.187+0000 7f246d231700 1 -- 192.168.123.105:0/1470270227 >> [v2:192.168.123.105:6802/196713793,v1:192.168.123.105:6803/196713793] conn(0x7f24500721a0 msgr2=0x7f24500745c0 crc :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:48:12.209 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:12.187+0000 7f246d231700 1 --2- 192.168.123.105:0/1470270227 >> [v2:192.168.123.105:6802/196713793,v1:192.168.123.105:6803/196713793] conn(0x7f24500721a0 0x7f24500745c0 crc :-1 s=READY pgs=7 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:48:12.209 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:12.189+0000 7f246d231700 1 -- 192.168.123.105:0/1470270227 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f245006c600 msgr2=0x7f245006eac0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:48:12.209 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:12.189+0000 7f246d231700 1 --2- 192.168.123.105:0/1470270227 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f245006c600 0x7f245006eac0 secure :-1 s=READY pgs=78 cs=0 l=1 rev1=1 crypto rx=0x7f2458005e50 tx=0x7f2458005da0 comp rx=0 tx=0).stop 2026-03-10T07:48:12.209 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:12.189+0000 7f246d231700 1 -- 192.168.123.105:0/1470270227 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f246811c1a0 msgr2=0x7f24681176e0 secure :-1 s=STATE_CONNECTION_ESTABLISHED 
l=1).mark_down 2026-03-10T07:48:12.209 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:12.189+0000 7f246d231700 1 --2- 192.168.123.105:0/1470270227 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f246811c1a0 0x7f24681176e0 secure :-1 s=READY pgs=221 cs=0 l=1 rev1=1 crypto rx=0x7f245c00ed70 tx=0x7f245c00c5b0 comp rx=0 tx=0).stop 2026-03-10T07:48:12.209 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:12.190+0000 7f246d231700 1 -- 192.168.123.105:0/1470270227 shutdown_connections 2026-03-10T07:48:12.209 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:12.190+0000 7f246d231700 1 --2- 192.168.123.105:0/1470270227 >> [v2:192.168.123.105:6802/196713793,v1:192.168.123.105:6803/196713793] conn(0x7f24500721a0 0x7f24500745c0 unknown :-1 s=CLOSED pgs=7 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:48:12.209 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:12.190+0000 7f246d231700 1 --2- 192.168.123.105:0/1470270227 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f245006c600 0x7f245006eac0 unknown :-1 s=CLOSED pgs=78 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:48:12.209 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:12.190+0000 7f246d231700 1 --2- 192.168.123.105:0/1470270227 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f246810d0f0 0x7f24681171a0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:48:12.209 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:12.190+0000 7f246d231700 1 --2- 192.168.123.105:0/1470270227 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f246811c1a0 0x7f24681176e0 unknown :-1 s=CLOSED pgs=221 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:48:12.209 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:12.190+0000 7f246d231700 1 -- 192.168.123.105:0/1470270227 >> 192.168.123.105:0/1470270227 
conn(0x7f246806ce20 msgr2=0x7f2468070690 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:48:12.209 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:12.192+0000 7f246d231700 1 -- 192.168.123.105:0/1470270227 shutdown_connections 2026-03-10T07:48:12.209 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:12.194+0000 7f246d231700 1 -- 192.168.123.105:0/1470270227 wait complete. 2026-03-10T07:48:12.251 INFO:teuthology.orchestra.run.vm05.stdout:120259084293 2026-03-10T07:48:12.251 DEBUG:teuthology.orchestra.run.vm05:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell --fsid 12e9780e-1c55-11f1-8896-79f7c2e9b508 -- ceph osd last-stat-seq osd.4 2026-03-10T07:48:12.278 INFO:teuthology.orchestra.run.vm05.stdout:98784247815 2026-03-10T07:48:12.278 DEBUG:teuthology.orchestra.run.vm05:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell --fsid 12e9780e-1c55-11f1-8896-79f7c2e9b508 -- ceph osd last-stat-seq osd.3 2026-03-10T07:48:12.280 INFO:teuthology.orchestra.run.vm05.stdout:38654705676 2026-03-10T07:48:12.280 DEBUG:teuthology.orchestra.run.vm05:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell --fsid 12e9780e-1c55-11f1-8896-79f7c2e9b508 -- ceph osd last-stat-seq osd.0 2026-03-10T07:48:12.442 INFO:teuthology.orchestra.run.vm05.stderr:Inferring config /var/lib/ceph/12e9780e-1c55-11f1-8896-79f7c2e9b508/mon.vm05/config 2026-03-10T07:48:12.690 INFO:teuthology.orchestra.run.vm05.stderr:Inferring config /var/lib/ceph/12e9780e-1c55-11f1-8896-79f7c2e9b508/mon.vm05/config 2026-03-10T07:48:12.705 INFO:teuthology.orchestra.run.vm05.stderr:Inferring config /var/lib/ceph/12e9780e-1c55-11f1-8896-79f7c2e9b508/mon.vm05/config 2026-03-10T07:48:12.879 INFO:teuthology.orchestra.run.vm05.stderr:Inferring config /var/lib/ceph/12e9780e-1c55-11f1-8896-79f7c2e9b508/mon.vm05/config 2026-03-10T07:48:12.970 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:12.967+0000 7f0d63fff700 1 -- 
192.168.123.105:0/4176020687 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f0d64100e90 msgr2=0x7f0d64101270 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:48:12.970 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:12.967+0000 7f0d63fff700 1 --2- 192.168.123.105:0/4176020687 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f0d64100e90 0x7f0d64101270 secure :-1 s=READY pgs=38 cs=0 l=1 rev1=1 crypto rx=0x7f0d4c009a60 tx=0x7f0d4c009d70 comp rx=0 tx=0).stop 2026-03-10T07:48:12.983 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:12.980+0000 7f0d63fff700 1 -- 192.168.123.105:0/4176020687 shutdown_connections 2026-03-10T07:48:12.983 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:12.980+0000 7f0d63fff700 1 --2- 192.168.123.105:0/4176020687 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f0d64101840 0x7f0d64105890 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:48:12.983 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:12.980+0000 7f0d63fff700 1 --2- 192.168.123.105:0/4176020687 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f0d64100e90 0x7f0d64101270 unknown :-1 s=CLOSED pgs=38 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:48:12.983 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:12.980+0000 7f0d63fff700 1 -- 192.168.123.105:0/4176020687 >> 192.168.123.105:0/4176020687 conn(0x7f0d640fc760 msgr2=0x7f0d640feb80 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:48:12.983 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:12.980+0000 7f0d63fff700 1 -- 192.168.123.105:0/4176020687 shutdown_connections 2026-03-10T07:48:12.983 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:12.981+0000 7f0d63fff700 1 -- 192.168.123.105:0/4176020687 wait complete. 
2026-03-10T07:48:12.983 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:12.981+0000 7f0d63fff700 1 Processor -- start 2026-03-10T07:48:12.986 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:12.984+0000 7f0d63fff700 1 -- start start 2026-03-10T07:48:12.989 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:12.987+0000 7f0d63fff700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f0d64100e90 0x7f0d64196950 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:48:12.989 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:12.987+0000 7f0d63fff700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f0d64101840 0x7f0d64196e90 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:48:12.989 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:12.987+0000 7f0d63fff700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f0d64197520 con 0x7f0d64101840 2026-03-10T07:48:12.989 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:12.987+0000 7f0d63fff700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f0d6419b330 con 0x7f0d64100e90 2026-03-10T07:48:12.989 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:12.987+0000 7f0d62ffd700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f0d64100e90 0x7f0d64196950 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:48:12.989 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:12.987+0000 7f0d62ffd700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f0d64100e90 0x7f0d64196950 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.108:3300/0 says I am 
v2:192.168.123.105:43792/0 (socket says 192.168.123.105:43792) 2026-03-10T07:48:12.989 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:12.987+0000 7f0d62ffd700 1 -- 192.168.123.105:0/568617628 learned_addr learned my addr 192.168.123.105:0/568617628 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T07:48:12.989 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:12.987+0000 7f0d62ffd700 1 -- 192.168.123.105:0/568617628 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f0d64101840 msgr2=0x7f0d64196e90 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:48:12.989 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:12.987+0000 7f0d62ffd700 1 --2- 192.168.123.105:0/568617628 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f0d64101840 0x7f0d64196e90 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:48:12.990 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:12.988+0000 7f0d62ffd700 1 -- 192.168.123.105:0/568617628 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f0d4c009710 con 0x7f0d64100e90 2026-03-10T07:48:12.990 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:12.988+0000 7f0d62ffd700 1 --2- 192.168.123.105:0/568617628 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f0d64100e90 0x7f0d64196950 secure :-1 s=READY pgs=39 cs=0 l=1 rev1=1 crypto rx=0x7f0d4c00f6c0 tx=0x7f0d4c00f7a0 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:48:12.992 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:12.988+0000 7f0d5bfff700 1 -- 192.168.123.105:0/568617628 <== mon.1 v2:192.168.123.108:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f0d4c01d070 con 0x7f0d64100e90 2026-03-10T07:48:12.992 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:12.989+0000 7f0d5bfff700 1 -- 
192.168.123.105:0/568617628 <== mon.1 v2:192.168.123.108:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f0d4c00bb40 con 0x7f0d64100e90 2026-03-10T07:48:12.992 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:12.989+0000 7f0d5bfff700 1 -- 192.168.123.105:0/568617628 <== mon.1 v2:192.168.123.108:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f0d4c017620 con 0x7f0d64100e90 2026-03-10T07:48:12.992 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:12.989+0000 7f0d63fff700 1 -- 192.168.123.105:0/568617628 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f0d6419b5b0 con 0x7f0d64100e90 2026-03-10T07:48:12.992 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:12.989+0000 7f0d63fff700 1 -- 192.168.123.105:0/568617628 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f0d6419ba40 con 0x7f0d64100e90 2026-03-10T07:48:12.997 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:12.990+0000 7f0d5bfff700 1 -- 192.168.123.105:0/568617628 <== mon.1 v2:192.168.123.108:3300/0 4 ==== mgrmap(e 19) v1 ==== 90308+0+0 (secure 0 0 0) 0x7f0d4c0229e0 con 0x7f0d64100e90 2026-03-10T07:48:12.997 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:12.990+0000 7f0d63fff700 1 -- 192.168.123.105:0/568617628 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f0d641091e0 con 0x7f0d64100e90 2026-03-10T07:48:12.997 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:12.990+0000 7f0d5bfff700 1 --2- 192.168.123.105:0/568617628 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f0d5006c4e0 0x7f0d5006e9a0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:48:12.997 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:12.990+0000 7f0d5bfff700 1 -- 192.168.123.105:0/568617628 <== mon.1 
v2:192.168.123.108:3300/0 5 ==== osd_map(33..33 src has 1..33) v4 ==== 4446+0+0 (secure 0 0 0) 0x7f0d4c08c970 con 0x7f0d64100e90 2026-03-10T07:48:13.000 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:12.998+0000 7f0d5bfff700 1 -- 192.168.123.105:0/568617628 <== mon.1 v2:192.168.123.108:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7f0d4c05b070 con 0x7f0d64100e90 2026-03-10T07:48:13.000 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:12.998+0000 7f0d627fc700 1 --2- 192.168.123.105:0/568617628 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f0d5006c4e0 0x7f0d5006e9a0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:48:13.002 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:13.000+0000 7f0d627fc700 1 --2- 192.168.123.105:0/568617628 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f0d5006c4e0 0x7f0d5006e9a0 secure :-1 s=READY pgs=79 cs=0 l=1 rev1=1 crypto rx=0x7f0d54005950 tx=0x7f0d540058e0 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:48:13.129 INFO:teuthology.orchestra.run.vm05.stderr:Inferring config /var/lib/ceph/12e9780e-1c55-11f1-8896-79f7c2e9b508/mon.vm05/config 2026-03-10T07:48:13.150 INFO:teuthology.orchestra.run.vm05.stderr:Inferring config /var/lib/ceph/12e9780e-1c55-11f1-8896-79f7c2e9b508/mon.vm05/config 2026-03-10T07:48:13.181 INFO:teuthology.orchestra.run.vm05.stdout:137438953473 2026-03-10T07:48:13.181 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:13.176+0000 7f0d63fff700 1 -- 192.168.123.105:0/568617628 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "osd last-stat-seq", "id": 5} v 0) v1 -- 0x7f0d6404f2e0 con 0x7f0d64100e90 2026-03-10T07:48:13.181 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:13.178+0000 7f0d5bfff700 1 -- 192.168.123.105:0/568617628 <== mon.1 v2:192.168.123.108:3300/0 7 ==== mon_command_ack([{"prefix": "osd last-stat-seq", "id": 5}]=0 v0) v1 ==== 74+0+13 (secure 0 0 0) 0x7f0d4c027070 con 0x7f0d64100e90 2026-03-10T07:48:13.184 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:13.181+0000 7f0d59ffb700 1 -- 192.168.123.105:0/568617628 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f0d5006c4e0 msgr2=0x7f0d5006e9a0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:48:13.184 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:13.181+0000 7f0d59ffb700 1 --2- 192.168.123.105:0/568617628 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f0d5006c4e0 0x7f0d5006e9a0 secure :-1 s=READY pgs=79 cs=0 l=1 rev1=1 crypto rx=0x7f0d54005950 tx=0x7f0d540058e0 comp rx=0 tx=0).stop 2026-03-10T07:48:13.184 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:13.181+0000 7f0d59ffb700 1 -- 192.168.123.105:0/568617628 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f0d64100e90 msgr2=0x7f0d64196950 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:48:13.184 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:13.181+0000 7f0d59ffb700 1 --2- 192.168.123.105:0/568617628 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f0d64100e90 0x7f0d64196950 secure :-1 s=READY pgs=39 cs=0 l=1 rev1=1 crypto rx=0x7f0d4c00f6c0 tx=0x7f0d4c00f7a0 comp rx=0 tx=0).stop 2026-03-10T07:48:13.184 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:13.181+0000 7f0d59ffb700 1 -- 192.168.123.105:0/568617628 shutdown_connections 2026-03-10T07:48:13.184 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:13.181+0000 7f0d59ffb700 1 --2- 192.168.123.105:0/568617628 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f0d5006c4e0 0x7f0d5006e9a0 unknown :-1 s=CLOSED pgs=79 cs=0 l=1 
rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:48:13.184 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:13.181+0000 7f0d59ffb700 1 --2- 192.168.123.105:0/568617628 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f0d64100e90 0x7f0d64196950 unknown :-1 s=CLOSED pgs=39 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:48:13.185 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:13.181+0000 7f0d59ffb700 1 --2- 192.168.123.105:0/568617628 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f0d64101840 0x7f0d64196e90 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:48:13.185 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:13.182+0000 7f0d59ffb700 1 -- 192.168.123.105:0/568617628 >> 192.168.123.105:0/568617628 conn(0x7f0d640fc760 msgr2=0x7f0d640feb20 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:48:13.185 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:13.182+0000 7f0d59ffb700 1 -- 192.168.123.105:0/568617628 shutdown_connections 2026-03-10T07:48:13.185 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:13.182+0000 7f0d59ffb700 1 -- 192.168.123.105:0/568617628 wait complete. 2026-03-10T07:48:13.272 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:48:13 vm05 ceph-mon[50387]: from='client.? 192.168.123.105:0/568617628' entity='client.admin' cmd=[{"prefix": "osd last-stat-seq", "id": 5}]: dispatch 2026-03-10T07:48:13.337 INFO:tasks.cephadm.ceph_manager.ceph:need seq 137438953475 got 137438953473 for osd.5 2026-03-10T07:48:13.561 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:48:13 vm08 ceph-mon[59917]: from='client.? 
192.168.123.105:0/568617628' entity='client.admin' cmd=[{"prefix": "osd last-stat-seq", "id": 5}]: dispatch 2026-03-10T07:48:13.577 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:13.571+0000 7f6387fff700 1 -- 192.168.123.105:0/2316066161 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f638810f340 msgr2=0x7f638810f720 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:48:13.577 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:13.571+0000 7f6387fff700 1 --2- 192.168.123.105:0/2316066161 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f638810f340 0x7f638810f720 secure :-1 s=READY pgs=222 cs=0 l=1 rev1=1 crypto rx=0x7f6378009ab0 tx=0x7f6378009dc0 comp rx=0 tx=0).stop 2026-03-10T07:48:13.591 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:13.585+0000 7f6387fff700 1 -- 192.168.123.105:0/2316066161 shutdown_connections 2026-03-10T07:48:13.592 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:13.585+0000 7f6387fff700 1 --2- 192.168.123.105:0/2316066161 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f638810d0f0 0x7f638810d570 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:48:13.592 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:13.585+0000 7f6387fff700 1 --2- 192.168.123.105:0/2316066161 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f638810f340 0x7f638810f720 unknown :-1 s=CLOSED pgs=222 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:48:13.592 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:13.585+0000 7f6387fff700 1 -- 192.168.123.105:0/2316066161 >> 192.168.123.105:0/2316066161 conn(0x7f638806ce20 msgr2=0x7f638806d230 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:48:13.592 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:13.585+0000 7f6387fff700 1 -- 192.168.123.105:0/2316066161 shutdown_connections 2026-03-10T07:48:13.592 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:13.585+0000 7f6387fff700 1 -- 192.168.123.105:0/2316066161 wait complete. 2026-03-10T07:48:13.592 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:13.585+0000 7f6387fff700 1 Processor -- start 2026-03-10T07:48:13.592 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:13.585+0000 7f6387fff700 1 -- start start 2026-03-10T07:48:13.592 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:13.586+0000 7f6387fff700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f638810d0f0 0x7f638811bfe0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:48:13.592 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:13.586+0000 7f6387fff700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f638810f340 0x7f6388116fe0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:48:13.592 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:13.586+0000 7f6387fff700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f63881176c0 con 0x7f638810f340 2026-03-10T07:48:13.592 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:13.586+0000 7f63867fc700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f638810f340 0x7f6388116fe0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:48:13.592 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:13.586+0000 7f63867fc700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f638810f340 0x7f6388116fe0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.105:38488/0 (socket says 192.168.123.105:38488) 2026-03-10T07:48:13.592 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:13.586+0000 7f63867fc700 1 -- 192.168.123.105:0/1160935539 learned_addr learned my addr 192.168.123.105:0/1160935539 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T07:48:13.592 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:13.586+0000 7f6387fff700 1 -- 192.168.123.105:0/1160935539 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f6388117830 con 0x7f638810d0f0 2026-03-10T07:48:13.592 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:13.587+0000 7f6386ffd700 1 --2- 192.168.123.105:0/1160935539 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f638810d0f0 0x7f638811bfe0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:48:13.592 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:13.587+0000 7f63867fc700 1 -- 192.168.123.105:0/1160935539 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f638810d0f0 msgr2=0x7f638811bfe0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:48:13.592 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:13.587+0000 7f63867fc700 1 --2- 192.168.123.105:0/1160935539 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f638810d0f0 0x7f638811bfe0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:48:13.592 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:13.587+0000 7f63867fc700 1 -- 192.168.123.105:0/1160935539 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f6378009710 con 0x7f638810f340 2026-03-10T07:48:13.592 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:13.587+0000 7f63867fc700 1 --2- 192.168.123.105:0/1160935539 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f638810f340 0x7f6388116fe0 
secure :-1 s=READY pgs=223 cs=0 l=1 rev1=1 crypto rx=0x7f637c00d900 tx=0x7f637c00dcc0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:48:13.592 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:13.587+0000 7f636ffff700 1 -- 192.168.123.105:0/1160935539 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f637c0098e0 con 0x7f638810f340 2026-03-10T07:48:13.592 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:13.587+0000 7f636ffff700 1 -- 192.168.123.105:0/1160935539 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f637c010460 con 0x7f638810f340 2026-03-10T07:48:13.592 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:13.587+0000 7f6387fff700 1 -- 192.168.123.105:0/1160935539 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f6388117b10 con 0x7f638810f340 2026-03-10T07:48:13.592 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:13.587+0000 7f6387fff700 1 -- 192.168.123.105:0/1160935539 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f63881b85d0 con 0x7f638810f340 2026-03-10T07:48:13.592 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:13.588+0000 7f636ffff700 1 -- 192.168.123.105:0/1160935539 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f637c00f640 con 0x7f638810f340 2026-03-10T07:48:13.592 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:13.589+0000 7f636ffff700 1 -- 192.168.123.105:0/1160935539 <== mon.0 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 19) v1 ==== 90308+0+0 (secure 0 0 0) 0x7f637c00f7a0 con 0x7f638810f340 2026-03-10T07:48:13.594 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:13.592+0000 7f636ffff700 1 --2- 192.168.123.105:0/1160935539 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f637006c600 
0x7f637006eac0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:48:13.594 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:13.592+0000 7f6386ffd700 1 --2- 192.168.123.105:0/1160935539 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f637006c600 0x7f637006eac0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:48:13.597 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:13.594+0000 7f6386ffd700 1 --2- 192.168.123.105:0/1160935539 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f637006c600 0x7f637006eac0 secure :-1 s=READY pgs=80 cs=0 l=1 rev1=1 crypto rx=0x7f637800c010 tx=0x7f637800be20 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:48:13.597 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:13.595+0000 7f636ffff700 1 -- 192.168.123.105:0/1160935539 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(33..33 src has 1..33) v4 ==== 4446+0+0 (secure 0 0 0) 0x7f637c08c2c0 con 0x7f638810f340 2026-03-10T07:48:13.597 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:13.595+0000 7f6387fff700 1 -- 192.168.123.105:0/1160935539 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f6374005320 con 0x7f638810f340 2026-03-10T07:48:13.605 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:13.599+0000 7f636ffff700 1 -- 192.168.123.105:0/1160935539 <== mon.0 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7f637c05a990 con 0x7f638810f340 2026-03-10T07:48:13.628 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:13.623+0000 7fb37c374700 1 -- 192.168.123.105:0/2884652799 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb3741089a0 
msgr2=0x7fb37410be70 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:48:13.628 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:13.623+0000 7fb37c374700 1 --2- 192.168.123.105:0/2884652799 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb3741089a0 0x7fb37410be70 secure :-1 s=READY pgs=224 cs=0 l=1 rev1=1 crypto rx=0x7fb36c00cd40 tx=0x7fb36c00a320 comp rx=0 tx=0).stop 2026-03-10T07:48:13.628 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:13.623+0000 7fb37c374700 1 -- 192.168.123.105:0/2884652799 shutdown_connections 2026-03-10T07:48:13.628 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:13.623+0000 7fb37c374700 1 --2- 192.168.123.105:0/2884652799 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb3741089a0 0x7fb37410be70 unknown :-1 s=CLOSED pgs=224 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:48:13.628 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:13.623+0000 7fb37c374700 1 --2- 192.168.123.105:0/2884652799 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fb374107ff0 0x7fb3741083d0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:48:13.628 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:13.623+0000 7fb37c374700 1 -- 192.168.123.105:0/2884652799 >> 192.168.123.105:0/2884652799 conn(0x7fb37406ce20 msgr2=0x7fb37406d230 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:48:13.628 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:13.623+0000 7fb37c374700 1 -- 192.168.123.105:0/2884652799 shutdown_connections 2026-03-10T07:48:13.628 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:13.624+0000 7fb37c374700 1 -- 192.168.123.105:0/2884652799 wait complete. 
2026-03-10T07:48:13.628 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:13.625+0000 7fb37c374700 1 Processor -- start 2026-03-10T07:48:13.628 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:13.625+0000 7fb37c374700 1 -- start start 2026-03-10T07:48:13.628 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:13.625+0000 7fb37c374700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fb374107ff0 0x7fb37407cf10 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:48:13.628 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:13.625+0000 7fb37c374700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb37407d450 0x7fb37407d8d0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:48:13.628 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:13.625+0000 7fb37c374700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fb374081a90 con 0x7fb37407d450 2026-03-10T07:48:13.628 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:13.625+0000 7fb37c374700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fb374081c00 con 0x7fb374107ff0 2026-03-10T07:48:13.628 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:13.625+0000 7fb37990f700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb37407d450 0x7fb37407d8d0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:48:13.628 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:13.625+0000 7fb37990f700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb37407d450 0x7fb37407d8d0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am 
v2:192.168.123.105:38504/0 (socket says 192.168.123.105:38504) 2026-03-10T07:48:13.628 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:13.625+0000 7fb37990f700 1 -- 192.168.123.105:0/2882746531 learned_addr learned my addr 192.168.123.105:0/2882746531 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T07:48:13.628 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:13.625+0000 7fb37a110700 1 --2- 192.168.123.105:0/2882746531 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fb374107ff0 0x7fb37407cf10 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:48:13.629 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:13.625+0000 7fb37990f700 1 -- 192.168.123.105:0/2882746531 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fb374107ff0 msgr2=0x7fb37407cf10 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:48:13.629 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:13.625+0000 7fb37990f700 1 --2- 192.168.123.105:0/2882746531 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fb374107ff0 0x7fb37407cf10 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:48:13.629 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:13.625+0000 7fb37990f700 1 -- 192.168.123.105:0/2882746531 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fb36c00c9f0 con 0x7fb37407d450 2026-03-10T07:48:13.633 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:13.627+0000 7fb37990f700 1 --2- 192.168.123.105:0/2882746531 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb37407d450 0x7fb37407d8d0 secure :-1 s=READY pgs=225 cs=0 l=1 rev1=1 crypto rx=0x7fb36c00bb40 tx=0x7fb36c00bb70 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 
2026-03-10T07:48:13.634 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:13.627+0000 7fb36b7fe700 1 -- 192.168.123.105:0/2882746531 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fb36c00dea0 con 0x7fb37407d450 2026-03-10T07:48:13.634 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:13.627+0000 7fb37c374700 1 -- 192.168.123.105:0/2882746531 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fb374081e80 con 0x7fb37407d450 2026-03-10T07:48:13.634 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:13.627+0000 7fb37c374700 1 -- 192.168.123.105:0/2882746531 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fb3740823d0 con 0x7fb37407d450 2026-03-10T07:48:13.634 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:13.631+0000 7fb36b7fe700 1 -- 192.168.123.105:0/2882746531 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fb36c009d70 con 0x7fb37407d450 2026-03-10T07:48:13.634 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:13.631+0000 7fb36b7fe700 1 -- 192.168.123.105:0/2882746531 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fb36c01f920 con 0x7fb37407d450 2026-03-10T07:48:13.635 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:13.633+0000 7fb36b7fe700 1 -- 192.168.123.105:0/2882746531 <== mon.0 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 19) v1 ==== 90308+0+0 (secure 0 0 0) 0x7fb36c004030 con 0x7fb37407d450 2026-03-10T07:48:13.639 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:13.637+0000 7fb36b7fe700 1 --2- 192.168.123.105:0/2882746531 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fb36006c530 0x7fb36006e9f0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:48:13.641 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:13.638+0000 7fb37a110700 1 --2- 192.168.123.105:0/2882746531 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fb36006c530 0x7fb36006e9f0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:48:13.641 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:13.638+0000 7ffb7f281700 1 -- 192.168.123.105:0/277442789 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ffb781089a0 msgr2=0x7ffb7810be70 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:48:13.641 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:13.638+0000 7ffb7f281700 1 --2- 192.168.123.105:0/277442789 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ffb781089a0 0x7ffb7810be70 secure :-1 s=READY pgs=226 cs=0 l=1 rev1=1 crypto rx=0x7ffb7000b600 tx=0x7ffb7000b910 comp rx=0 tx=0).stop 2026-03-10T07:48:13.641 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:13.638+0000 7ffb7f281700 1 -- 192.168.123.105:0/277442789 shutdown_connections 2026-03-10T07:48:13.641 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:13.638+0000 7ffb7f281700 1 --2- 192.168.123.105:0/277442789 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ffb781089a0 0x7ffb7810be70 unknown :-1 s=CLOSED pgs=226 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:48:13.641 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:13.638+0000 7ffb7f281700 1 --2- 192.168.123.105:0/277442789 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7ffb78107ff0 0x7ffb781083d0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:48:13.641 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:13.638+0000 7ffb7f281700 1 -- 192.168.123.105:0/277442789 >> 192.168.123.105:0/277442789 conn(0x7ffb7806ce20 msgr2=0x7ffb7806d230 
unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:48:13.642 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:13.639+0000 7fb36b7fe700 1 -- 192.168.123.105:0/2882746531 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(33..33 src has 1..33) v4 ==== 4446+0+0 (secure 0 0 0) 0x7fb36c08d6b0 con 0x7fb37407d450 2026-03-10T07:48:13.642 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:13.639+0000 7fb37a110700 1 --2- 192.168.123.105:0/2882746531 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fb36006c530 0x7fb36006e9f0 secure :-1 s=READY pgs=81 cs=0 l=1 rev1=1 crypto rx=0x7fb370005950 tx=0x7fb37000f820 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:48:13.642 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:13.638+0000 7ffb7f281700 1 -- 192.168.123.105:0/277442789 shutdown_connections 2026-03-10T07:48:13.642 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:13.638+0000 7ffb7f281700 1 -- 192.168.123.105:0/277442789 wait complete. 
2026-03-10T07:48:13.642 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:13.638+0000 7ffb7f281700 1 Processor -- start 2026-03-10T07:48:13.643 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:13.638+0000 7ffb7f281700 1 -- start start 2026-03-10T07:48:13.643 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:13.639+0000 7ffb7f281700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ffb78107ff0 0x7ffb7807cfc0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:48:13.643 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:13.639+0000 7ffb7f281700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7ffb7807d500 0x7ffb7807d980 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:48:13.643 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:13.639+0000 7ffb7f281700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7ffb78081bd0 con 0x7ffb78107ff0 2026-03-10T07:48:13.643 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:13.639+0000 7ffb7f281700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7ffb78081d10 con 0x7ffb7807d500 2026-03-10T07:48:13.643 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:13.639+0000 7ffb7d01d700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ffb78107ff0 0x7ffb7807cfc0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:48:13.643 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:13.639+0000 7ffb7d01d700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ffb78107ff0 0x7ffb7807cfc0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am 
v2:192.168.123.105:38524/0 (socket says 192.168.123.105:38524) 2026-03-10T07:48:13.643 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:13.639+0000 7ffb7d01d700 1 -- 192.168.123.105:0/1972252519 learned_addr learned my addr 192.168.123.105:0/1972252519 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T07:48:13.643 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:13.639+0000 7ffb7d01d700 1 -- 192.168.123.105:0/1972252519 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7ffb7807d500 msgr2=0x7ffb7807d980 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:48:13.643 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:13.639+0000 7ffb7d01d700 1 --2- 192.168.123.105:0/1972252519 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7ffb7807d500 0x7ffb7807d980 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:48:13.643 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:13.639+0000 7ffb7d01d700 1 -- 192.168.123.105:0/1972252519 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7ffb7000b050 con 0x7ffb78107ff0 2026-03-10T07:48:13.643 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:13.639+0000 7ffb7d01d700 1 --2- 192.168.123.105:0/1972252519 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ffb78107ff0 0x7ffb7807cfc0 secure :-1 s=READY pgs=227 cs=0 l=1 rev1=1 crypto rx=0x7ffb7400ba70 tx=0x7ffb7400be30 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:48:13.643 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:13.640+0000 7ffb6e7fc700 1 -- 192.168.123.105:0/1972252519 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7ffb7400c760 con 0x7ffb78107ff0 2026-03-10T07:48:13.644 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:13.640+0000 7ffb7f281700 1 -- 
192.168.123.105:0/1972252519 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7ffb78081ff0 con 0x7ffb78107ff0 2026-03-10T07:48:13.654 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:13.645+0000 7fb3697fa700 1 -- 192.168.123.105:0/2882746531 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fb358005320 con 0x7fb37407d450 2026-03-10T07:48:13.654 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:13.649+0000 7fb36b7fe700 1 -- 192.168.123.105:0/2882746531 <== mon.0 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7fb36c007e60 con 0x7fb37407d450 2026-03-10T07:48:13.654 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:13.640+0000 7ffb7f281700 1 -- 192.168.123.105:0/1972252519 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7ffb78082540 con 0x7ffb78107ff0 2026-03-10T07:48:13.654 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:13.650+0000 7ffb6e7fc700 1 -- 192.168.123.105:0/1972252519 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7ffb7400cda0 con 0x7ffb78107ff0 2026-03-10T07:48:13.654 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:13.650+0000 7ffb6e7fc700 1 -- 192.168.123.105:0/1972252519 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7ffb74012550 con 0x7ffb78107ff0 2026-03-10T07:48:13.654 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:13.650+0000 7ffb6e7fc700 1 -- 192.168.123.105:0/1972252519 <== mon.0 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 19) v1 ==== 90308+0+0 (secure 0 0 0) 0x7ffb74012770 con 0x7ffb78107ff0 2026-03-10T07:48:13.654 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:13.651+0000 7ffb6e7fc700 1 --2- 192.168.123.105:0/1972252519 >> 
[v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7ffb6406e8f0 0x7ffb64070db0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:48:13.654 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:13.651+0000 7ffb7c81c700 1 --2- 192.168.123.105:0/1972252519 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7ffb6406e8f0 0x7ffb64070db0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:48:13.654 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:13.652+0000 7ffb7c81c700 1 --2- 192.168.123.105:0/1972252519 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7ffb6406e8f0 0x7ffb64070db0 secure :-1 s=READY pgs=82 cs=0 l=1 rev1=1 crypto rx=0x7ffb7000bd90 tx=0x7ffb70009f90 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:48:13.654 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:13.652+0000 7ffb6e7fc700 1 -- 192.168.123.105:0/1972252519 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(33..33 src has 1..33) v4 ==== 4446+0+0 (secure 0 0 0) 0x7ffb7408b900 con 0x7ffb78107ff0 2026-03-10T07:48:13.654 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:13.652+0000 7ffb7f281700 1 -- 192.168.123.105:0/1972252519 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7ffb5c005320 con 0x7ffb78107ff0 2026-03-10T07:48:13.657 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:13.655+0000 7ffb6e7fc700 1 -- 192.168.123.105:0/1972252519 <== mon.0 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7ffb74056540 con 0x7ffb78107ff0 2026-03-10T07:48:13.800 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:13.798+0000 7ffb7f281700 1 -- 192.168.123.105:0/1972252519 --> 
[v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "osd last-stat-seq", "id": 2} v 0) v1 -- 0x7ffb5c0059f0 con 0x7ffb78107ff0 2026-03-10T07:48:13.804 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:13.802+0000 7ffb6e7fc700 1 -- 192.168.123.105:0/1972252519 <== mon.0 v2:192.168.123.105:3300/0 7 ==== mon_command_ack([{"prefix": "osd last-stat-seq", "id": 2}]=0 v0) v1 ==== 74+0+12 (secure 0 0 0) 0x7ffb74019070 con 0x7ffb78107ff0 2026-03-10T07:48:13.804 INFO:teuthology.orchestra.run.vm05.stdout:73014444039 2026-03-10T07:48:13.807 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:13.805+0000 7ffb7f281700 1 -- 192.168.123.105:0/1972252519 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7ffb6406e8f0 msgr2=0x7ffb64070db0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:48:13.807 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:13.805+0000 7ffb7f281700 1 --2- 192.168.123.105:0/1972252519 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7ffb6406e8f0 0x7ffb64070db0 secure :-1 s=READY pgs=82 cs=0 l=1 rev1=1 crypto rx=0x7ffb7000bd90 tx=0x7ffb70009f90 comp rx=0 tx=0).stop 2026-03-10T07:48:13.807 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:13.805+0000 7ffb7f281700 1 -- 192.168.123.105:0/1972252519 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ffb78107ff0 msgr2=0x7ffb7807cfc0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:48:13.807 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:13.805+0000 7ffb7f281700 1 --2- 192.168.123.105:0/1972252519 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ffb78107ff0 0x7ffb7807cfc0 secure :-1 s=READY pgs=227 cs=0 l=1 rev1=1 crypto rx=0x7ffb7400ba70 tx=0x7ffb7400be30 comp rx=0 tx=0).stop 2026-03-10T07:48:13.807 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:13.805+0000 7ffb7f281700 1 -- 192.168.123.105:0/1972252519 shutdown_connections 
2026-03-10T07:48:13.807 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:13.805+0000 7ffb7f281700 1 --2- 192.168.123.105:0/1972252519 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7ffb6406e8f0 0x7ffb64070db0 unknown :-1 s=CLOSED pgs=82 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:48:13.807 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:13.805+0000 7ffb7f281700 1 --2- 192.168.123.105:0/1972252519 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ffb78107ff0 0x7ffb7807cfc0 unknown :-1 s=CLOSED pgs=227 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:48:13.807 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:13.806+0000 7ffb7f281700 1 --2- 192.168.123.105:0/1972252519 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7ffb7807d500 0x7ffb7807d980 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:48:13.807 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:13.806+0000 7ffb7f281700 1 -- 192.168.123.105:0/1972252519 >> 192.168.123.105:0/1972252519 conn(0x7ffb7806ce20 msgr2=0x7ffb780706e0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:48:13.807 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:13.806+0000 7ffb7f281700 1 -- 192.168.123.105:0/1972252519 shutdown_connections 2026-03-10T07:48:13.807 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:13.806+0000 7ffb7f281700 1 -- 192.168.123.105:0/1972252519 wait complete. 
2026-03-10T07:48:13.872 INFO:tasks.cephadm.ceph_manager.ceph:need seq 73014444040 got 73014444039 for osd.2 2026-03-10T07:48:13.928 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:13.924+0000 7f94ecef1700 1 -- 192.168.123.105:0/1400110206 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f94e810d0f0 msgr2=0x7f94e810d570 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:48:13.928 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:13.924+0000 7f94ecef1700 1 --2- 192.168.123.105:0/1400110206 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f94e810d0f0 0x7f94e810d570 secure :-1 s=READY pgs=228 cs=0 l=1 rev1=1 crypto rx=0x7f94d8009b50 tx=0x7f94d8009e60 comp rx=0 tx=0).stop 2026-03-10T07:48:13.928 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:13.924+0000 7f94ecef1700 1 -- 192.168.123.105:0/1400110206 shutdown_connections 2026-03-10T07:48:13.928 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:13.924+0000 7f94ecef1700 1 --2- 192.168.123.105:0/1400110206 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f94e810d0f0 0x7f94e810d570 unknown :-1 s=CLOSED pgs=228 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:48:13.928 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:13.924+0000 7f94ecef1700 1 --2- 192.168.123.105:0/1400110206 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f94e810f340 0x7f94e810f720 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:48:13.928 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:13.924+0000 7f94ecef1700 1 -- 192.168.123.105:0/1400110206 >> 192.168.123.105:0/1400110206 conn(0x7f94e806ce20 msgr2=0x7f94e806d230 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:48:13.929 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:13.928+0000 7f94ecef1700 1 -- 192.168.123.105:0/1400110206 shutdown_connections 2026-03-10T07:48:13.930 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:13.928+0000 7f94ecef1700 1 -- 192.168.123.105:0/1400110206 wait complete. 2026-03-10T07:48:13.930 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:13.928+0000 7f94ecef1700 1 Processor -- start 2026-03-10T07:48:13.930 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:13.928+0000 7f94ecef1700 1 -- start start 2026-03-10T07:48:13.930 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:13.928+0000 7f94ecef1700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f94e810d0f0 0x7f94e819cfe0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:48:13.930 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:13.928+0000 7f94ecef1700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f94e810f340 0x7f94e819d520 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:48:13.930 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:13.928+0000 7f94ecef1700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f94e819dc00 con 0x7f94e810f340 2026-03-10T07:48:13.930 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:13.928+0000 7f94ecef1700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f94e81a1990 con 0x7f94e810d0f0 2026-03-10T07:48:13.930 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:13.928+0000 7f94e6ffd700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f94e810f340 0x7f94e819d520 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:48:13.930 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:13.928+0000 7f94e6ffd700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f94e810f340 0x7f94e819d520 unknown :-1 
s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.105:38542/0 (socket says 192.168.123.105:38542) 2026-03-10T07:48:13.930 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:13.928+0000 7f94e6ffd700 1 -- 192.168.123.105:0/3821559708 learned_addr learned my addr 192.168.123.105:0/3821559708 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T07:48:13.930 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:13.929+0000 7f94e6ffd700 1 -- 192.168.123.105:0/3821559708 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f94e810d0f0 msgr2=0x7f94e819cfe0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:48:13.930 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:13.929+0000 7f94e6ffd700 1 --2- 192.168.123.105:0/3821559708 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f94e810d0f0 0x7f94e819cfe0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:48:13.930 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:13.929+0000 7f94e6ffd700 1 -- 192.168.123.105:0/3821559708 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f94d80097e0 con 0x7f94e810f340 2026-03-10T07:48:13.931 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:13.929+0000 7f94e6ffd700 1 --2- 192.168.123.105:0/3821559708 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f94e810f340 0x7f94e819d520 secure :-1 s=READY pgs=229 cs=0 l=1 rev1=1 crypto rx=0x7f94d8000c00 tx=0x7f94d801c930 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:48:13.931 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:13.929+0000 7f94e4ff9700 1 -- 192.168.123.105:0/3821559708 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f94d802f070 con 
0x7f94e810f340 2026-03-10T07:48:13.931 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:13.929+0000 7f94e4ff9700 1 -- 192.168.123.105:0/3821559708 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f94d801cc30 con 0x7f94e810f340 2026-03-10T07:48:13.931 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:13.930+0000 7f94e4ff9700 1 -- 192.168.123.105:0/3821559708 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f94d8028760 con 0x7f94e810f340 2026-03-10T07:48:13.937 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:13.932+0000 7f94ecef1700 1 -- 192.168.123.105:0/3821559708 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f94e81a1c10 con 0x7f94e810f340 2026-03-10T07:48:13.937 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:13.932+0000 7f94ecef1700 1 -- 192.168.123.105:0/3821559708 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f94e81a2080 con 0x7f94e810f340 2026-03-10T07:48:13.937 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:13.932+0000 7f94ecef1700 1 -- 192.168.123.105:0/3821559708 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f94e81973e0 con 0x7f94e810f340 2026-03-10T07:48:13.937 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:13.935+0000 7f94e4ff9700 1 -- 192.168.123.105:0/3821559708 <== mon.0 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 19) v1 ==== 90308+0+0 (secure 0 0 0) 0x7f94d80288c0 con 0x7f94e810f340 2026-03-10T07:48:13.937 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:13.936+0000 7f94e4ff9700 1 --2- 192.168.123.105:0/3821559708 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f94d006c2e0 0x7f94d006e7a0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:48:13.938 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:13.936+0000 7f94e77fe700 1 --2- 192.168.123.105:0/3821559708 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f94d006c2e0 0x7f94d006e7a0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:48:13.938 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:13.936+0000 7f94e77fe700 1 --2- 192.168.123.105:0/3821559708 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f94d006c2e0 0x7f94d006e7a0 secure :-1 s=READY pgs=83 cs=0 l=1 rev1=1 crypto rx=0x7f94dc009770 tx=0x7f94dc006cd0 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:48:13.938 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:13.936+0000 7f94e4ff9700 1 -- 192.168.123.105:0/3821559708 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(33..33 src has 1..33) v4 ==== 4446+0+0 (secure 0 0 0) 0x7f94d809df60 con 0x7f94e810f340 2026-03-10T07:48:13.938 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:13.937+0000 7f94e4ff9700 1 -- 192.168.123.105:0/3821559708 <== mon.0 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7f94d809e400 con 0x7f94e810f340 2026-03-10T07:48:13.944 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:13.942+0000 7fb3697fa700 1 -- 192.168.123.105:0/2882746531 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "osd last-stat-seq", "id": 1} v 0) v1 -- 0x7fb358005190 con 0x7fb37407d450 2026-03-10T07:48:13.944 INFO:teuthology.orchestra.run.vm05.stdout:55834574857 2026-03-10T07:48:13.944 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:13.943+0000 7fb36b7fe700 1 -- 192.168.123.105:0/2882746531 <== mon.0 v2:192.168.123.105:3300/0 7 ==== mon_command_ack([{"prefix": "osd last-stat-seq", "id": 1}]=0 v0) v1 ==== 
74+0+12 (secure 0 0 0) 0x7fb36c024090 con 0x7fb37407d450 2026-03-10T07:48:13.955 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:13.952+0000 7fb37c374700 1 -- 192.168.123.105:0/2882746531 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fb36006c530 msgr2=0x7fb36006e9f0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:48:13.955 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:13.952+0000 7fb37c374700 1 --2- 192.168.123.105:0/2882746531 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fb36006c530 0x7fb36006e9f0 secure :-1 s=READY pgs=81 cs=0 l=1 rev1=1 crypto rx=0x7fb370005950 tx=0x7fb37000f820 comp rx=0 tx=0).stop 2026-03-10T07:48:13.955 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:13.952+0000 7fb37c374700 1 -- 192.168.123.105:0/2882746531 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb37407d450 msgr2=0x7fb37407d8d0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:48:13.955 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:13.952+0000 7fb37c374700 1 --2- 192.168.123.105:0/2882746531 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb37407d450 0x7fb37407d8d0 secure :-1 s=READY pgs=225 cs=0 l=1 rev1=1 crypto rx=0x7fb36c00bb40 tx=0x7fb36c00bb70 comp rx=0 tx=0).stop 2026-03-10T07:48:13.955 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:13.953+0000 7fb37c374700 1 -- 192.168.123.105:0/2882746531 shutdown_connections 2026-03-10T07:48:13.955 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:13.953+0000 7fb37c374700 1 --2- 192.168.123.105:0/2882746531 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fb36006c530 0x7fb36006e9f0 unknown :-1 s=CLOSED pgs=81 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:48:13.955 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:13.953+0000 7fb37c374700 1 --2- 192.168.123.105:0/2882746531 >> 
[v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fb374107ff0 0x7fb37407cf10 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:48:13.955 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:13.953+0000 7fb37c374700 1 --2- 192.168.123.105:0/2882746531 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb37407d450 0x7fb37407d8d0 unknown :-1 s=CLOSED pgs=225 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:48:13.955 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:13.953+0000 7fb37c374700 1 -- 192.168.123.105:0/2882746531 >> 192.168.123.105:0/2882746531 conn(0x7fb37406ce20 msgr2=0x7fb3740705e0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:48:13.955 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:13.954+0000 7fb37c374700 1 -- 192.168.123.105:0/2882746531 shutdown_connections 2026-03-10T07:48:13.956 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:13.954+0000 7f335f3d7700 1 -- 192.168.123.105:0/3482627769 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f335810f380 msgr2=0x7f335810f760 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:48:13.957 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:13.954+0000 7fb37c374700 1 -- 192.168.123.105:0/2882746531 wait complete. 
2026-03-10T07:48:13.957 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:13.954+0000 7f335f3d7700 1 --2- 192.168.123.105:0/3482627769 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f335810f380 0x7f335810f760 secure :-1 s=READY pgs=230 cs=0 l=1 rev1=1 crypto rx=0x7f3354009b00 tx=0x7f3354009e10 comp rx=0 tx=0).stop 2026-03-10T07:48:13.957 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:13.956+0000 7f335f3d7700 1 -- 192.168.123.105:0/3482627769 shutdown_connections 2026-03-10T07:48:13.957 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:13.956+0000 7f335f3d7700 1 --2- 192.168.123.105:0/3482627769 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f335810cf00 0x7f335810d380 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:48:13.957 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:13.956+0000 7f335f3d7700 1 --2- 192.168.123.105:0/3482627769 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f335810f380 0x7f335810f760 unknown :-1 s=CLOSED pgs=230 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:48:13.958 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:13.956+0000 7f335f3d7700 1 -- 192.168.123.105:0/3482627769 >> 192.168.123.105:0/3482627769 conn(0x7f335806ce10 msgr2=0x7f335806d220 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:48:13.958 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:13.956+0000 7f335f3d7700 1 -- 192.168.123.105:0/3482627769 shutdown_connections 2026-03-10T07:48:13.959 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:13.957+0000 7f335f3d7700 1 -- 192.168.123.105:0/3482627769 wait complete. 
2026-03-10T07:48:13.960 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:13.958+0000 7f335f3d7700 1 Processor -- start 2026-03-10T07:48:13.964 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:13.961+0000 7f335f3d7700 1 -- start start 2026-03-10T07:48:13.964 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:13.961+0000 7f335f3d7700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f335810cf00 0x7f33581130a0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:48:13.964 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:13.961+0000 7f335f3d7700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f335810f380 0x7f33581135e0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:48:13.964 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:13.961+0000 7f335f3d7700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f33581181f0 con 0x7f335810cf00 2026-03-10T07:48:13.964 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:13.961+0000 7f335f3d7700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f3358118360 con 0x7f335810f380 2026-03-10T07:48:13.964 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:13.962+0000 7f335e3d5700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f335810cf00 0x7f33581130a0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:48:13.964 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:13.962+0000 7f335e3d5700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f335810cf00 0x7f33581130a0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am 
v2:192.168.123.105:38556/0 (socket says 192.168.123.105:38556) 2026-03-10T07:48:13.964 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:13.962+0000 7f335e3d5700 1 -- 192.168.123.105:0/4132964427 learned_addr learned my addr 192.168.123.105:0/4132964427 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T07:48:13.964 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:13.962+0000 7f335dbd4700 1 --2- 192.168.123.105:0/4132964427 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f335810f380 0x7f33581135e0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:48:13.964 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:13.963+0000 7f335dbd4700 1 -- 192.168.123.105:0/4132964427 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f335810cf00 msgr2=0x7f33581130a0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:48:13.964 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:13.963+0000 7f335dbd4700 1 --2- 192.168.123.105:0/4132964427 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f335810cf00 0x7f33581130a0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:48:13.964 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:13.963+0000 7f335dbd4700 1 -- 192.168.123.105:0/4132964427 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f33540097e0 con 0x7f335810f380 2026-03-10T07:48:13.965 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:13.963+0000 7f335dbd4700 1 --2- 192.168.123.105:0/4132964427 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f335810f380 0x7f33581135e0 secure :-1 s=READY pgs=40 cs=0 l=1 rev1=1 crypto rx=0x7f335000c390 tx=0x7f335000c6a0 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 
2026-03-10T07:48:13.965 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:13.963+0000 7f334f7fe700 1 -- 192.168.123.105:0/4132964427 <== mon.1 v2:192.168.123.108:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f335000e030 con 0x7f335810f380 2026-03-10T07:48:13.965 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:13.964+0000 7f335f3d7700 1 -- 192.168.123.105:0/4132964427 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f3358113be0 con 0x7f335810f380 2026-03-10T07:48:13.965 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:13.964+0000 7f335f3d7700 1 -- 192.168.123.105:0/4132964427 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f33581b8570 con 0x7f335810f380 2026-03-10T07:48:13.966 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:13.964+0000 7f334f7fe700 1 -- 192.168.123.105:0/4132964427 <== mon.1 v2:192.168.123.108:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f335000f040 con 0x7f335810f380 2026-03-10T07:48:13.968 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:13.966+0000 7f334f7fe700 1 -- 192.168.123.105:0/4132964427 <== mon.1 v2:192.168.123.108:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f3350014630 con 0x7f335810f380 2026-03-10T07:48:13.968 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:13.966+0000 7f334f7fe700 1 -- 192.168.123.105:0/4132964427 <== mon.1 v2:192.168.123.108:3300/0 4 ==== mgrmap(e 19) v1 ==== 90308+0+0 (secure 0 0 0) 0x7f3350014870 con 0x7f335810f380 2026-03-10T07:48:13.970 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:13.968+0000 7f334f7fe700 1 --2- 192.168.123.105:0/4132964427 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f334406c400 0x7f334406e8c0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:48:13.970 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:13.969+0000 7f335e3d5700 1 --2- 192.168.123.105:0/4132964427 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f334406c400 0x7f334406e8c0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:48:13.976 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:13.974+0000 7f335e3d5700 1 --2- 192.168.123.105:0/4132964427 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f334406c400 0x7f334406e8c0 secure :-1 s=READY pgs=84 cs=0 l=1 rev1=1 crypto rx=0x7f3354005fd0 tx=0x7f335400b540 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:48:13.976 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:13.974+0000 7f334f7fe700 1 -- 192.168.123.105:0/4132964427 <== mon.1 v2:192.168.123.108:3300/0 5 ==== osd_map(33..33 src has 1..33) v4 ==== 4446+0+0 (secure 0 0 0) 0x7f335008c5d0 con 0x7f335810f380 2026-03-10T07:48:13.976 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:13.975+0000 7f335f3d7700 1 -- 192.168.123.105:0/4132964427 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f333c005320 con 0x7f335810f380 2026-03-10T07:48:13.999 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:13.994+0000 7f334f7fe700 1 -- 192.168.123.105:0/4132964427 <== mon.1 v2:192.168.123.108:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7f3350057210 con 0x7f335810f380 2026-03-10T07:48:14.009 INFO:tasks.cephadm.ceph_manager.ceph:need seq 55834574858 got 55834574857 for osd.1 2026-03-10T07:48:14.009 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:14.002+0000 7f6387fff700 1 -- 192.168.123.105:0/1160935539 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "osd 
last-stat-seq", "id": 4} v 0) v1 -- 0x7f6374005190 con 0x7f638810f340 2026-03-10T07:48:14.009 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:14.007+0000 7f636ffff700 1 -- 192.168.123.105:0/1160935539 <== mon.0 v2:192.168.123.105:3300/0 7 ==== mon_command_ack([{"prefix": "osd last-stat-seq", "id": 4}]=0 v0) v1 ==== 74+0+13 (secure 0 0 0) 0x7f637c020020 con 0x7f638810f340 2026-03-10T07:48:14.009 INFO:teuthology.orchestra.run.vm05.stdout:120259084292 2026-03-10T07:48:14.012 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:14.010+0000 7f636dffb700 1 -- 192.168.123.105:0/1160935539 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f637006c600 msgr2=0x7f637006eac0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:48:14.012 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:14.010+0000 7f636dffb700 1 --2- 192.168.123.105:0/1160935539 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f637006c600 0x7f637006eac0 secure :-1 s=READY pgs=80 cs=0 l=1 rev1=1 crypto rx=0x7f637800c010 tx=0x7f637800be20 comp rx=0 tx=0).stop 2026-03-10T07:48:14.012 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:14.011+0000 7f636dffb700 1 -- 192.168.123.105:0/1160935539 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f638810f340 msgr2=0x7f6388116fe0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:48:14.012 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:14.011+0000 7f636dffb700 1 --2- 192.168.123.105:0/1160935539 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f638810f340 0x7f6388116fe0 secure :-1 s=READY pgs=223 cs=0 l=1 rev1=1 crypto rx=0x7f637c00d900 tx=0x7f637c00dcc0 comp rx=0 tx=0).stop 2026-03-10T07:48:14.013 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:14.011+0000 7f636dffb700 1 -- 192.168.123.105:0/1160935539 shutdown_connections 2026-03-10T07:48:14.013 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:14.011+0000 7f636dffb700 1 --2- 192.168.123.105:0/1160935539 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f637006c600 0x7f637006eac0 unknown :-1 s=CLOSED pgs=80 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:48:14.013 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:14.011+0000 7f636dffb700 1 --2- 192.168.123.105:0/1160935539 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f638810d0f0 0x7f638811bfe0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:48:14.013 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:14.011+0000 7f636dffb700 1 --2- 192.168.123.105:0/1160935539 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f638810f340 0x7f6388116fe0 unknown :-1 s=CLOSED pgs=223 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:48:14.013 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:14.011+0000 7f636dffb700 1 -- 192.168.123.105:0/1160935539 >> 192.168.123.105:0/1160935539 conn(0x7f638806ce20 msgr2=0x7f6388070530 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:48:14.013 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:14.012+0000 7f636dffb700 1 -- 192.168.123.105:0/1160935539 shutdown_connections 2026-03-10T07:48:14.013 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:14.012+0000 7f636dffb700 1 -- 192.168.123.105:0/1160935539 wait complete. 
2026-03-10T07:48:14.129 INFO:tasks.cephadm.ceph_manager.ceph:need seq 120259084293 got 120259084292 for osd.4 2026-03-10T07:48:14.136 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:14.134+0000 7f94ecef1700 1 -- 192.168.123.105:0/3821559708 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "osd last-stat-seq", "id": 3} v 0) v1 -- 0x7f94e819e340 con 0x7f94e810f340 2026-03-10T07:48:14.136 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:14.134+0000 7f94e4ff9700 1 -- 192.168.123.105:0/3821559708 <== mon.0 v2:192.168.123.105:3300/0 7 ==== mon_command_ack([{"prefix": "osd last-stat-seq", "id": 3}]=0 v0) v1 ==== 74+0+12 (secure 0 0 0) 0x7f94d8038090 con 0x7f94e810f340 2026-03-10T07:48:14.136 INFO:teuthology.orchestra.run.vm05.stdout:98784247813 2026-03-10T07:48:14.138 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:14.136+0000 7f94ecef1700 1 -- 192.168.123.105:0/3821559708 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f94d006c2e0 msgr2=0x7f94d006e7a0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:48:14.138 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:14.136+0000 7f94ecef1700 1 --2- 192.168.123.105:0/3821559708 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f94d006c2e0 0x7f94d006e7a0 secure :-1 s=READY pgs=83 cs=0 l=1 rev1=1 crypto rx=0x7f94dc009770 tx=0x7f94dc006cd0 comp rx=0 tx=0).stop 2026-03-10T07:48:14.138 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:14.137+0000 7f94ecef1700 1 -- 192.168.123.105:0/3821559708 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f94e810f340 msgr2=0x7f94e819d520 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:48:14.138 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:14.137+0000 7f94ecef1700 1 --2- 192.168.123.105:0/3821559708 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f94e810f340 0x7f94e819d520 secure :-1 s=READY 
pgs=229 cs=0 l=1 rev1=1 crypto rx=0x7f94d8000c00 tx=0x7f94d801c930 comp rx=0 tx=0).stop 2026-03-10T07:48:14.139 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:14.137+0000 7f94ecef1700 1 -- 192.168.123.105:0/3821559708 shutdown_connections 2026-03-10T07:48:14.139 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:14.137+0000 7f94ecef1700 1 --2- 192.168.123.105:0/3821559708 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f94d006c2e0 0x7f94d006e7a0 unknown :-1 s=CLOSED pgs=83 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:48:14.139 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:14.137+0000 7f94ecef1700 1 --2- 192.168.123.105:0/3821559708 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f94e810d0f0 0x7f94e819cfe0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:48:14.139 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:14.137+0000 7f94ecef1700 1 --2- 192.168.123.105:0/3821559708 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f94e810f340 0x7f94e819d520 unknown :-1 s=CLOSED pgs=229 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:48:14.139 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:14.137+0000 7f94ecef1700 1 -- 192.168.123.105:0/3821559708 >> 192.168.123.105:0/3821559708 conn(0x7f94e806ce20 msgr2=0x7f94e8109c50 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:48:14.139 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:14.137+0000 7f94ecef1700 1 -- 192.168.123.105:0/3821559708 shutdown_connections 2026-03-10T07:48:14.139 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:14.137+0000 7f94ecef1700 1 -- 192.168.123.105:0/3821559708 wait complete. 
2026-03-10T07:48:14.165 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:14.163+0000 7f335f3d7700 1 -- 192.168.123.105:0/4132964427 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "osd last-stat-seq", "id": 0} v 0) v1 -- 0x7f333c0059f0 con 0x7f335810f380 2026-03-10T07:48:14.169 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:14.166+0000 7f334f7fe700 1 -- 192.168.123.105:0/4132964427 <== mon.1 v2:192.168.123.108:3300/0 7 ==== mon_command_ack([{"prefix": "osd last-stat-seq", "id": 0}]=0 v0) v1 ==== 74+0+12 (secure 0 0 0) 0x7f335005a830 con 0x7f335810f380 2026-03-10T07:48:14.169 INFO:teuthology.orchestra.run.vm05.stdout:38654705675 2026-03-10T07:48:14.182 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:14.179+0000 7f334d7fa700 1 -- 192.168.123.105:0/4132964427 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f334406c400 msgr2=0x7f334406e8c0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:48:14.182 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:14.179+0000 7f334d7fa700 1 --2- 192.168.123.105:0/4132964427 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f334406c400 0x7f334406e8c0 secure :-1 s=READY pgs=84 cs=0 l=1 rev1=1 crypto rx=0x7f3354005fd0 tx=0x7f335400b540 comp rx=0 tx=0).stop 2026-03-10T07:48:14.182 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:14.179+0000 7f334d7fa700 1 -- 192.168.123.105:0/4132964427 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f335810f380 msgr2=0x7f33581135e0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:48:14.182 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:14.179+0000 7f334d7fa700 1 --2- 192.168.123.105:0/4132964427 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f335810f380 0x7f33581135e0 secure :-1 s=READY pgs=40 cs=0 l=1 rev1=1 crypto rx=0x7f335000c390 tx=0x7f335000c6a0 comp rx=0 tx=0).stop 2026-03-10T07:48:14.182 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:14.179+0000 7f334d7fa700 1 -- 192.168.123.105:0/4132964427 shutdown_connections 2026-03-10T07:48:14.182 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:14.179+0000 7f334d7fa700 1 --2- 192.168.123.105:0/4132964427 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f334406c400 0x7f334406e8c0 unknown :-1 s=CLOSED pgs=84 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:48:14.182 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:14.179+0000 7f334d7fa700 1 --2- 192.168.123.105:0/4132964427 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f335810cf00 0x7f33581130a0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:48:14.182 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:14.179+0000 7f334d7fa700 1 --2- 192.168.123.105:0/4132964427 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f335810f380 0x7f33581135e0 unknown :-1 s=CLOSED pgs=40 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:48:14.182 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:14.179+0000 7f334d7fa700 1 -- 192.168.123.105:0/4132964427 >> 192.168.123.105:0/4132964427 conn(0x7f335806ce10 msgr2=0x7f3358119450 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:48:14.182 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:14.180+0000 7f334d7fa700 1 -- 192.168.123.105:0/4132964427 shutdown_connections 2026-03-10T07:48:14.182 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:14.180+0000 7f334d7fa700 1 -- 192.168.123.105:0/4132964427 wait complete. 
2026-03-10T07:48:14.190 INFO:tasks.cephadm.ceph_manager.ceph:need seq 98784247815 got 98784247813 for osd.3 2026-03-10T07:48:14.253 INFO:tasks.cephadm.ceph_manager.ceph:need seq 38654705676 got 38654705675 for osd.0 2026-03-10T07:48:14.338 DEBUG:teuthology.orchestra.run.vm05:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell --fsid 12e9780e-1c55-11f1-8896-79f7c2e9b508 -- ceph osd last-stat-seq osd.5 2026-03-10T07:48:14.364 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:48:14 vm05 ceph-mon[50387]: pgmap v66: 1 pgs: 1 active+clean; 449 KiB data, 160 MiB used, 120 GiB / 120 GiB avail 2026-03-10T07:48:14.364 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:48:14 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T07:48:14.364 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:48:14 vm05 ceph-mon[50387]: from='client.? 192.168.123.105:0/1972252519' entity='client.admin' cmd=[{"prefix": "osd last-stat-seq", "id": 2}]: dispatch 2026-03-10T07:48:14.364 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:48:14 vm05 ceph-mon[50387]: from='client.? 192.168.123.105:0/2882746531' entity='client.admin' cmd=[{"prefix": "osd last-stat-seq", "id": 1}]: dispatch 2026-03-10T07:48:14.364 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:48:14 vm05 ceph-mon[50387]: from='client.? 192.168.123.105:0/1160935539' entity='client.admin' cmd=[{"prefix": "osd last-stat-seq", "id": 4}]: dispatch 2026-03-10T07:48:14.364 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:48:14 vm05 ceph-mon[50387]: from='client.? 192.168.123.105:0/3821559708' entity='client.admin' cmd=[{"prefix": "osd last-stat-seq", "id": 3}]: dispatch 2026-03-10T07:48:14.364 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:48:14 vm05 ceph-mon[50387]: from='client.? 
192.168.123.105:0/4132964427' entity='client.admin' cmd=[{"prefix": "osd last-stat-seq", "id": 0}]: dispatch 2026-03-10T07:48:14.485 INFO:teuthology.orchestra.run.vm05.stderr:Inferring config /var/lib/ceph/12e9780e-1c55-11f1-8896-79f7c2e9b508/mon.vm05/config 2026-03-10T07:48:14.668 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:48:14 vm08 ceph-mon[59917]: pgmap v66: 1 pgs: 1 active+clean; 449 KiB data, 160 MiB used, 120 GiB / 120 GiB avail 2026-03-10T07:48:14.668 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:48:14 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T07:48:14.668 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:48:14 vm08 ceph-mon[59917]: from='client.? 192.168.123.105:0/1972252519' entity='client.admin' cmd=[{"prefix": "osd last-stat-seq", "id": 2}]: dispatch 2026-03-10T07:48:14.668 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:48:14 vm08 ceph-mon[59917]: from='client.? 192.168.123.105:0/2882746531' entity='client.admin' cmd=[{"prefix": "osd last-stat-seq", "id": 1}]: dispatch 2026-03-10T07:48:14.668 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:48:14 vm08 ceph-mon[59917]: from='client.? 192.168.123.105:0/1160935539' entity='client.admin' cmd=[{"prefix": "osd last-stat-seq", "id": 4}]: dispatch 2026-03-10T07:48:14.668 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:48:14 vm08 ceph-mon[59917]: from='client.? 192.168.123.105:0/3821559708' entity='client.admin' cmd=[{"prefix": "osd last-stat-seq", "id": 3}]: dispatch 2026-03-10T07:48:14.668 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:48:14 vm08 ceph-mon[59917]: from='client.? 
192.168.123.105:0/4132964427' entity='client.admin' cmd=[{"prefix": "osd last-stat-seq", "id": 0}]: dispatch 2026-03-10T07:48:14.708 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:14.705+0000 7f49ddf26700 1 -- 192.168.123.105:0/3306425930 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f49d8073130 msgr2=0x7f49d8073510 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:48:14.708 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:14.705+0000 7f49ddf26700 1 --2- 192.168.123.105:0/3306425930 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f49d8073130 0x7f49d8073510 secure :-1 s=READY pgs=231 cs=0 l=1 rev1=1 crypto rx=0x7f49c0009b00 tx=0x7f49c0009e10 comp rx=0 tx=0).stop 2026-03-10T07:48:14.708 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:14.706+0000 7f49ddf26700 1 -- 192.168.123.105:0/3306425930 shutdown_connections 2026-03-10T07:48:14.708 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:14.706+0000 7f49ddf26700 1 --2- 192.168.123.105:0/3306425930 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f49d8073a50 0x7f49d8111990 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:48:14.708 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:14.706+0000 7f49ddf26700 1 --2- 192.168.123.105:0/3306425930 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f49d8073130 0x7f49d8073510 unknown :-1 s=CLOSED pgs=231 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:48:14.708 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:14.706+0000 7f49ddf26700 1 -- 192.168.123.105:0/3306425930 >> 192.168.123.105:0/3306425930 conn(0x7f49d80fc9b0 msgr2=0x7f49d80fedd0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:48:14.708 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:14.706+0000 7f49ddf26700 1 -- 192.168.123.105:0/3306425930 shutdown_connections 2026-03-10T07:48:14.708 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:14.707+0000 7f49ddf26700 1 -- 192.168.123.105:0/3306425930 wait complete. 2026-03-10T07:48:14.709 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:14.707+0000 7f49ddf26700 1 Processor -- start 2026-03-10T07:48:14.709 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:14.707+0000 7f49ddf26700 1 -- start start 2026-03-10T07:48:14.709 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:14.708+0000 7f49ddf26700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f49d8073a50 0x7f49d819d390 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:48:14.709 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:14.708+0000 7f49ddf26700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f49d819d8d0 0x7f49d81a1d40 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:48:14.709 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:14.708+0000 7f49ddf26700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f49d819def0 con 0x7f49d8073a50 2026-03-10T07:48:14.709 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:14.708+0000 7f49ddf26700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f49d819e060 con 0x7f49d819d8d0 2026-03-10T07:48:14.710 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:14.708+0000 7f49d77fe700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f49d8073a50 0x7f49d819d390 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:48:14.710 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:14.708+0000 7f49d6ffd700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f49d819d8d0 0x7f49d81a1d40 unknown :-1 
s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:48:14.710 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:14.708+0000 7f49d6ffd700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f49d819d8d0 0x7f49d81a1d40 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.108:3300/0 says I am v2:192.168.123.105:43870/0 (socket says 192.168.123.105:43870) 2026-03-10T07:48:14.710 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:14.708+0000 7f49d6ffd700 1 -- 192.168.123.105:0/229393783 learned_addr learned my addr 192.168.123.105:0/229393783 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T07:48:14.710 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:14.708+0000 7f49d6ffd700 1 -- 192.168.123.105:0/229393783 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f49d8073a50 msgr2=0x7f49d819d390 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:48:14.710 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:14.709+0000 7f49d6ffd700 1 --2- 192.168.123.105:0/229393783 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f49d8073a50 0x7f49d819d390 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:48:14.710 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:14.709+0000 7f49d6ffd700 1 -- 192.168.123.105:0/229393783 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f49c00097e0 con 0x7f49d819d8d0 2026-03-10T07:48:14.710 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:14.709+0000 7f49d77fe700 1 --2- 192.168.123.105:0/229393783 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f49d8073a50 0x7f49d819d390 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 
tx=0).handle_auth_done state changed! 2026-03-10T07:48:14.711 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:14.709+0000 7f49d6ffd700 1 --2- 192.168.123.105:0/229393783 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f49d819d8d0 0x7f49d81a1d40 secure :-1 s=READY pgs=41 cs=0 l=1 rev1=1 crypto rx=0x7f49c800d8d0 tx=0x7f49c800dbe0 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:48:14.711 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:14.709+0000 7f49d4ff9700 1 -- 192.168.123.105:0/229393783 <== mon.1 v2:192.168.123.108:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f49c8009880 con 0x7f49d819d8d0 2026-03-10T07:48:14.711 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:14.709+0000 7f49d4ff9700 1 -- 192.168.123.105:0/229393783 <== mon.1 v2:192.168.123.108:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f49c8010460 con 0x7f49d819d8d0 2026-03-10T07:48:14.711 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:14.709+0000 7f49ddf26700 1 -- 192.168.123.105:0/229393783 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f49d81a2340 con 0x7f49d819d8d0 2026-03-10T07:48:14.712 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:14.709+0000 7f49d4ff9700 1 -- 192.168.123.105:0/229393783 <== mon.1 v2:192.168.123.108:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f49c800f5d0 con 0x7f49d819d8d0 2026-03-10T07:48:14.712 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:14.709+0000 7f49ddf26700 1 -- 192.168.123.105:0/229393783 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f49d81a2890 con 0x7f49d819d8d0 2026-03-10T07:48:14.712 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:14.710+0000 7f49ddf26700 1 -- 192.168.123.105:0/229393783 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- 
mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f49d810f110 con 0x7f49d819d8d0 2026-03-10T07:48:14.714 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:14.712+0000 7f49d4ff9700 1 -- 192.168.123.105:0/229393783 <== mon.1 v2:192.168.123.108:3300/0 4 ==== mgrmap(e 19) v1 ==== 90308+0+0 (secure 0 0 0) 0x7f49c80105d0 con 0x7f49d819d8d0 2026-03-10T07:48:14.714 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:14.712+0000 7f49d4ff9700 1 --2- 192.168.123.105:0/229393783 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f49c406c4e0 0x7f49c406e9a0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:48:14.714 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:14.712+0000 7f49d4ff9700 1 -- 192.168.123.105:0/229393783 <== mon.1 v2:192.168.123.108:3300/0 5 ==== osd_map(33..33 src has 1..33) v4 ==== 4446+0+0 (secure 0 0 0) 0x7f49c8020030 con 0x7f49d819d8d0 2026-03-10T07:48:14.714 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:14.712+0000 7f49d77fe700 1 --2- 192.168.123.105:0/229393783 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f49c406c4e0 0x7f49c406e9a0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:48:14.714 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:14.713+0000 7f49d77fe700 1 --2- 192.168.123.105:0/229393783 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f49c406c4e0 0x7f49c406e9a0 secure :-1 s=READY pgs=85 cs=0 l=1 rev1=1 crypto rx=0x7f49c0000c00 tx=0x7f49c000b560 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:48:14.717 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:14.715+0000 7f49d4ff9700 1 -- 192.168.123.105:0/229393783 <== mon.1 v2:192.168.123.108:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 
72+0+180422 (secure 0 0 0) 0x7f49c8056270 con 0x7f49d819d8d0 2026-03-10T07:48:14.818 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:14.816+0000 7f49ddf26700 1 -- 192.168.123.105:0/229393783 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "osd last-stat-seq", "id": 5} v 0) v1 -- 0x7f49d8066eb0 con 0x7f49d819d8d0 2026-03-10T07:48:14.819 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:14.817+0000 7f49d4ff9700 1 -- 192.168.123.105:0/229393783 <== mon.1 v2:192.168.123.108:3300/0 7 ==== mon_command_ack([{"prefix": "osd last-stat-seq", "id": 5}]=0 v0) v1 ==== 74+0+13 (secure 0 0 0) 0x7f49c8059890 con 0x7f49d819d8d0 2026-03-10T07:48:14.819 INFO:teuthology.orchestra.run.vm05.stdout:137438953475 2026-03-10T07:48:14.821 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:14.819+0000 7f49ddf26700 1 -- 192.168.123.105:0/229393783 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f49c406c4e0 msgr2=0x7f49c406e9a0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:48:14.821 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:14.819+0000 7f49ddf26700 1 --2- 192.168.123.105:0/229393783 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f49c406c4e0 0x7f49c406e9a0 secure :-1 s=READY pgs=85 cs=0 l=1 rev1=1 crypto rx=0x7f49c0000c00 tx=0x7f49c000b560 comp rx=0 tx=0).stop 2026-03-10T07:48:14.821 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:14.820+0000 7f49ddf26700 1 -- 192.168.123.105:0/229393783 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f49d819d8d0 msgr2=0x7f49d81a1d40 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:48:14.821 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:14.820+0000 7f49ddf26700 1 --2- 192.168.123.105:0/229393783 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f49d819d8d0 0x7f49d81a1d40 secure :-1 s=READY pgs=41 cs=0 l=1 rev1=1 crypto rx=0x7f49c800d8d0 
tx=0x7f49c800dbe0 comp rx=0 tx=0).stop 2026-03-10T07:48:14.821 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:14.820+0000 7f49ddf26700 1 -- 192.168.123.105:0/229393783 shutdown_connections 2026-03-10T07:48:14.821 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:14.820+0000 7f49ddf26700 1 --2- 192.168.123.105:0/229393783 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f49c406c4e0 0x7f49c406e9a0 unknown :-1 s=CLOSED pgs=85 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:48:14.822 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:14.820+0000 7f49ddf26700 1 --2- 192.168.123.105:0/229393783 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f49d8073a50 0x7f49d819d390 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:48:14.822 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:14.820+0000 7f49ddf26700 1 --2- 192.168.123.105:0/229393783 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f49d819d8d0 0x7f49d81a1d40 unknown :-1 s=CLOSED pgs=41 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:48:14.822 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:14.820+0000 7f49ddf26700 1 -- 192.168.123.105:0/229393783 >> 192.168.123.105:0/229393783 conn(0x7f49d80fc9b0 msgr2=0x7f49d81034a0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:48:14.822 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:14.820+0000 7f49ddf26700 1 -- 192.168.123.105:0/229393783 shutdown_connections 2026-03-10T07:48:14.822 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:14.820+0000 7f49ddf26700 1 -- 192.168.123.105:0/229393783 wait complete. 
2026-03-10T07:48:14.873 DEBUG:teuthology.orchestra.run.vm05:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell --fsid 12e9780e-1c55-11f1-8896-79f7c2e9b508 -- ceph osd last-stat-seq osd.2 2026-03-10T07:48:14.881 INFO:tasks.cephadm.ceph_manager.ceph:need seq 137438953475 got 137438953475 for osd.5 2026-03-10T07:48:14.881 DEBUG:teuthology.parallel:result is None 2026-03-10T07:48:15.010 DEBUG:teuthology.orchestra.run.vm05:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell --fsid 12e9780e-1c55-11f1-8896-79f7c2e9b508 -- ceph osd last-stat-seq osd.1 2026-03-10T07:48:15.014 INFO:teuthology.orchestra.run.vm05.stderr:Inferring config /var/lib/ceph/12e9780e-1c55-11f1-8896-79f7c2e9b508/mon.vm05/config 2026-03-10T07:48:15.130 DEBUG:teuthology.orchestra.run.vm05:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell --fsid 12e9780e-1c55-11f1-8896-79f7c2e9b508 -- ceph osd last-stat-seq osd.4 2026-03-10T07:48:15.190 DEBUG:teuthology.orchestra.run.vm05:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell --fsid 12e9780e-1c55-11f1-8896-79f7c2e9b508 -- ceph osd last-stat-seq osd.3 2026-03-10T07:48:15.253 DEBUG:teuthology.orchestra.run.vm05:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell --fsid 12e9780e-1c55-11f1-8896-79f7c2e9b508 -- ceph osd last-stat-seq osd.0 2026-03-10T07:48:15.264 INFO:teuthology.orchestra.run.vm05.stderr:Inferring config /var/lib/ceph/12e9780e-1c55-11f1-8896-79f7c2e9b508/mon.vm05/config 2026-03-10T07:48:15.324 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:15.318+0000 7f95cee08700 1 -- 192.168.123.105:0/868248696 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f95c810d310 msgr2=0x7f95c810d6f0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:48:15.324 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:15.318+0000 7f95cee08700 1 --2- 192.168.123.105:0/868248696 >> 
[v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f95c810d310 0x7f95c810d6f0 secure :-1 s=READY pgs=232 cs=0 l=1 rev1=1 crypto rx=0x7f95c4008790 tx=0x7f95c4008aa0 comp rx=0 tx=0).stop 2026-03-10T07:48:15.324 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:15.318+0000 7f95cee08700 1 -- 192.168.123.105:0/868248696 shutdown_connections 2026-03-10T07:48:15.324 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:15.318+0000 7f95cee08700 1 --2- 192.168.123.105:0/868248696 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f95c8107d90 0x7f95c81081f0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:48:15.324 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:15.318+0000 7f95cee08700 1 --2- 192.168.123.105:0/868248696 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f95c810d310 0x7f95c810d6f0 unknown :-1 s=CLOSED pgs=232 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:48:15.324 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:15.318+0000 7f95cee08700 1 -- 192.168.123.105:0/868248696 >> 192.168.123.105:0/868248696 conn(0x7f95c806ce20 msgr2=0x7f95c806d230 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:48:15.324 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:15.319+0000 7f95cee08700 1 -- 192.168.123.105:0/868248696 shutdown_connections 2026-03-10T07:48:15.324 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:15.319+0000 7f95cee08700 1 -- 192.168.123.105:0/868248696 wait complete. 
2026-03-10T07:48:15.324 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:15.319+0000 7f95cee08700 1 Processor -- start 2026-03-10T07:48:15.324 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:15.319+0000 7f95cee08700 1 -- start start 2026-03-10T07:48:15.324 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:15.319+0000 7f95cee08700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f95c8107d90 0x7f95c80798c0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:48:15.324 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:15.319+0000 7f95cee08700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f95c8079e00 0x7f95c807a280 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:48:15.324 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:15.319+0000 7f95cee08700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f95c8080930 con 0x7f95c8079e00 2026-03-10T07:48:15.324 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:15.319+0000 7f95cee08700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f95c807e440 con 0x7f95c8107d90 2026-03-10T07:48:15.324 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:15.319+0000 7f95cd605700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f95c8079e00 0x7f95c807a280 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:48:15.324 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:15.319+0000 7f95cd605700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f95c8079e00 0x7f95c807a280 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am 
v2:192.168.123.105:38594/0 (socket says 192.168.123.105:38594) 2026-03-10T07:48:15.324 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:15.319+0000 7f95cd605700 1 -- 192.168.123.105:0/2379461168 learned_addr learned my addr 192.168.123.105:0/2379461168 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T07:48:15.324 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:15.319+0000 7f95cd605700 1 -- 192.168.123.105:0/2379461168 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f95c8107d90 msgr2=0x7f95c80798c0 unknown :-1 s=STATE_CONNECTING l=1).mark_down 2026-03-10T07:48:15.324 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:15.319+0000 7f95cd605700 1 --2- 192.168.123.105:0/2379461168 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f95c8107d90 0x7f95c80798c0 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:48:15.324 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:15.319+0000 7f95cd605700 1 -- 192.168.123.105:0/2379461168 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f95c4008440 con 0x7f95c8079e00 2026-03-10T07:48:15.325 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:15.320+0000 7f95cd605700 1 --2- 192.168.123.105:0/2379461168 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f95c8079e00 0x7f95c807a280 secure :-1 s=READY pgs=233 cs=0 l=1 rev1=1 crypto rx=0x7f95c00060b0 tx=0x7f95c0008a00 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:48:15.325 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:15.320+0000 7f95beffd700 1 -- 192.168.123.105:0/2379461168 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f95c000d730 con 0x7f95c8079e00 2026-03-10T07:48:15.325 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:15.321+0000 7f95cee08700 1 -- 
192.168.123.105:0/2379461168 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f95c807e6c0 con 0x7f95c8079e00 2026-03-10T07:48:15.325 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:15.321+0000 7f95cee08700 1 -- 192.168.123.105:0/2379461168 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f95c807ec10 con 0x7f95c8079e00 2026-03-10T07:48:15.325 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:15.321+0000 7f95beffd700 1 -- 192.168.123.105:0/2379461168 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f95c0004bb0 con 0x7f95c8079e00 2026-03-10T07:48:15.325 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:15.321+0000 7f95beffd700 1 -- 192.168.123.105:0/2379461168 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f95c0005230 con 0x7f95c8079e00 2026-03-10T07:48:15.325 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:15.321+0000 7f95cee08700 1 -- 192.168.123.105:0/2379461168 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f95ac005320 con 0x7f95c8079e00 2026-03-10T07:48:15.327 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:15.322+0000 7f95beffd700 1 -- 192.168.123.105:0/2379461168 <== mon.0 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 19) v1 ==== 90308+0+0 (secure 0 0 0) 0x7f95c0004d20 con 0x7f95c8079e00 2026-03-10T07:48:15.327 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:15.325+0000 7f95beffd700 1 --2- 192.168.123.105:0/2379461168 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f95b406c600 0x7f95b406eac0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:48:15.327 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:15.325+0000 7f95cde06700 1 --2- 192.168.123.105:0/2379461168 >> 
[v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f95b406c600 0x7f95b406eac0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:48:15.327 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:15.325+0000 7f95beffd700 1 -- 192.168.123.105:0/2379461168 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(33..33 src has 1..33) v4 ==== 4446+0+0 (secure 0 0 0) 0x7f95c008ae20 con 0x7f95c8079e00 2026-03-10T07:48:15.330 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:15.326+0000 7f95cde06700 1 --2- 192.168.123.105:0/2379461168 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f95b406c600 0x7f95b406eac0 secure :-1 s=READY pgs=86 cs=0 l=1 rev1=1 crypto rx=0x7f95c4005b40 tx=0x7f95c4005ab0 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:48:15.330 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:15.328+0000 7f95beffd700 1 -- 192.168.123.105:0/2379461168 <== mon.0 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7f95c0055a60 con 0x7f95c8079e00 2026-03-10T07:48:15.420 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:48:15 vm05 ceph-mon[50387]: from='client.? 
192.168.123.105:0/229393783' entity='client.admin' cmd=[{"prefix": "osd last-stat-seq", "id": 5}]: dispatch 2026-03-10T07:48:15.503 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:15.501+0000 7f95cee08700 1 -- 192.168.123.105:0/2379461168 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "osd last-stat-seq", "id": 2} v 0) v1 -- 0x7f95ac005190 con 0x7f95c8079e00 2026-03-10T07:48:15.503 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:15.501+0000 7f95beffd700 1 -- 192.168.123.105:0/2379461168 <== mon.0 v2:192.168.123.105:3300/0 7 ==== mon_command_ack([{"prefix": "osd last-stat-seq", "id": 2}]=0 v0) v1 ==== 74+0+12 (secure 0 0 0) 0x7f95c0059080 con 0x7f95c8079e00 2026-03-10T07:48:15.507 INFO:teuthology.orchestra.run.vm05.stdout:73014444041 2026-03-10T07:48:15.510 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:15.506+0000 7f95cee08700 1 -- 192.168.123.105:0/2379461168 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f95b406c600 msgr2=0x7f95b406eac0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:48:15.510 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:15.506+0000 7f95cee08700 1 --2- 192.168.123.105:0/2379461168 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f95b406c600 0x7f95b406eac0 secure :-1 s=READY pgs=86 cs=0 l=1 rev1=1 crypto rx=0x7f95c4005b40 tx=0x7f95c4005ab0 comp rx=0 tx=0).stop 2026-03-10T07:48:15.510 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:15.506+0000 7f95cee08700 1 -- 192.168.123.105:0/2379461168 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f95c8079e00 msgr2=0x7f95c807a280 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:48:15.510 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:15.506+0000 7f95cee08700 1 --2- 192.168.123.105:0/2379461168 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f95c8079e00 0x7f95c807a280 secure :-1 s=READY 
pgs=233 cs=0 l=1 rev1=1 crypto rx=0x7f95c00060b0 tx=0x7f95c0008a00 comp rx=0 tx=0).stop 2026-03-10T07:48:15.510 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:15.508+0000 7f95cee08700 1 -- 192.168.123.105:0/2379461168 shutdown_connections 2026-03-10T07:48:15.510 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:15.508+0000 7f95cee08700 1 --2- 192.168.123.105:0/2379461168 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f95b406c600 0x7f95b406eac0 unknown :-1 s=CLOSED pgs=86 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:48:15.510 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:15.508+0000 7f95cee08700 1 --2- 192.168.123.105:0/2379461168 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f95c8107d90 0x7f95c80798c0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:48:15.510 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:15.508+0000 7f95cee08700 1 --2- 192.168.123.105:0/2379461168 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f95c8079e00 0x7f95c807a280 unknown :-1 s=CLOSED pgs=233 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:48:15.510 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:15.508+0000 7f95cee08700 1 -- 192.168.123.105:0/2379461168 >> 192.168.123.105:0/2379461168 conn(0x7f95c806ce20 msgr2=0x7f95c806fec0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:48:15.510 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:15.508+0000 7f95cee08700 1 -- 192.168.123.105:0/2379461168 shutdown_connections 2026-03-10T07:48:15.510 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:15.508+0000 7f95cee08700 1 -- 192.168.123.105:0/2379461168 wait complete. 
2026-03-10T07:48:15.584 INFO:teuthology.orchestra.run.vm05.stderr:Inferring config /var/lib/ceph/12e9780e-1c55-11f1-8896-79f7c2e9b508/mon.vm05/config 2026-03-10T07:48:15.610 INFO:tasks.cephadm.ceph_manager.ceph:need seq 73014444040 got 73014444041 for osd.2 2026-03-10T07:48:15.611 DEBUG:teuthology.parallel:result is None 2026-03-10T07:48:15.668 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:48:15 vm08 ceph-mon[59917]: from='client.? 192.168.123.105:0/229393783' entity='client.admin' cmd=[{"prefix": "osd last-stat-seq", "id": 5}]: dispatch 2026-03-10T07:48:15.775 INFO:teuthology.orchestra.run.vm05.stderr:Inferring config /var/lib/ceph/12e9780e-1c55-11f1-8896-79f7c2e9b508/mon.vm05/config 2026-03-10T07:48:15.811 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:15.809+0000 7fca70623700 1 -- 192.168.123.105:0/1409051051 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fca6810f660 msgr2=0x7fca68107d90 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:48:15.812 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:15.809+0000 7fca70623700 1 --2- 192.168.123.105:0/1409051051 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fca6810f660 0x7fca68107d90 secure :-1 s=READY pgs=234 cs=0 l=1 rev1=1 crypto rx=0x7fca64009b00 tx=0x7fca64009e10 comp rx=0 tx=0).stop 2026-03-10T07:48:15.812 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:15.811+0000 7fca70623700 1 -- 192.168.123.105:0/1409051051 shutdown_connections 2026-03-10T07:48:15.812 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:15.811+0000 7fca70623700 1 --2- 192.168.123.105:0/1409051051 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fca681082d0 0x7fca68108750 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:48:15.812 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:15.811+0000 7fca70623700 1 --2- 192.168.123.105:0/1409051051 >> 
[v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fca6810f660 0x7fca68107d90 unknown :-1 s=CLOSED pgs=234 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:48:15.812 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:15.811+0000 7fca70623700 1 -- 192.168.123.105:0/1409051051 >> 192.168.123.105:0/1409051051 conn(0x7fca6806d0f0 msgr2=0x7fca6806d500 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:48:15.814 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:15.812+0000 7fca70623700 1 -- 192.168.123.105:0/1409051051 shutdown_connections 2026-03-10T07:48:15.814 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:15.812+0000 7fca70623700 1 -- 192.168.123.105:0/1409051051 wait complete. 2026-03-10T07:48:15.816 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:15.812+0000 7fca70623700 1 Processor -- start 2026-03-10T07:48:15.816 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:15.812+0000 7fca70623700 1 -- start start 2026-03-10T07:48:15.816 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:15.813+0000 7fca70623700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fca681082d0 0x7fca68119eb0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:48:15.816 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:15.813+0000 7fca70623700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fca6810f660 0x7fca68114f00 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:48:15.817 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:15.813+0000 7fca70623700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fca681154d0 con 0x7fca6810f660 2026-03-10T07:48:15.817 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:15.813+0000 7fca70623700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap 
magic: 0 v1 -- 0x7fca68115640 con 0x7fca681082d0 2026-03-10T07:48:15.817 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:15.813+0000 7fca6dbbe700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fca6810f660 0x7fca68114f00 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:48:15.817 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:15.813+0000 7fca6dbbe700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fca6810f660 0x7fca68114f00 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.105:38614/0 (socket says 192.168.123.105:38614) 2026-03-10T07:48:15.817 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:15.813+0000 7fca6dbbe700 1 -- 192.168.123.105:0/2246322375 learned_addr learned my addr 192.168.123.105:0/2246322375 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T07:48:15.817 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:15.813+0000 7fca6e3bf700 1 --2- 192.168.123.105:0/2246322375 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fca681082d0 0x7fca68119eb0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:48:15.817 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:15.813+0000 7fca6dbbe700 1 -- 192.168.123.105:0/2246322375 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fca681082d0 msgr2=0x7fca68119eb0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:48:15.817 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:15.813+0000 7fca6dbbe700 1 --2- 192.168.123.105:0/2246322375 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fca681082d0 0x7fca68119eb0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 
rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:48:15.817 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:15.813+0000 7fca6dbbe700 1 -- 192.168.123.105:0/2246322375 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fca640097e0 con 0x7fca6810f660 2026-03-10T07:48:15.817 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:15.814+0000 7fca6dbbe700 1 --2- 192.168.123.105:0/2246322375 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fca6810f660 0x7fca68114f00 secure :-1 s=READY pgs=235 cs=0 l=1 rev1=1 crypto rx=0x7fca5800eb10 tx=0x7fca5800ee20 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:48:15.817 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:15.814+0000 7fca5f7fe700 1 -- 192.168.123.105:0/2246322375 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fca5800cc40 con 0x7fca6810f660 2026-03-10T07:48:15.817 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:15.814+0000 7fca5f7fe700 1 -- 192.168.123.105:0/2246322375 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fca5800cda0 con 0x7fca6810f660 2026-03-10T07:48:15.819 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:15.815+0000 7fca5f7fe700 1 -- 192.168.123.105:0/2246322375 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fca58018810 con 0x7fca6810f660 2026-03-10T07:48:15.819 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:15.815+0000 7fca70623700 1 -- 192.168.123.105:0/2246322375 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fca68115920 con 0x7fca6810f660 2026-03-10T07:48:15.819 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:15.815+0000 7fca70623700 1 -- 192.168.123.105:0/2246322375 --> 
[v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fca681b8690 con 0x7fca6810f660 2026-03-10T07:48:15.819 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:15.816+0000 7fca5f7fe700 1 -- 192.168.123.105:0/2246322375 <== mon.0 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 19) v1 ==== 90308+0+0 (secure 0 0 0) 0x7fca58018970 con 0x7fca6810f660 2026-03-10T07:48:15.819 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:15.816+0000 7fca5f7fe700 1 --2- 192.168.123.105:0/2246322375 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fca5406c380 0x7fca5406e840 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:48:15.819 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:15.817+0000 7fca6e3bf700 1 --2- 192.168.123.105:0/2246322375 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fca5406c380 0x7fca5406e840 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:48:15.819 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:15.817+0000 7fca6e3bf700 1 --2- 192.168.123.105:0/2246322375 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fca5406c380 0x7fca5406e840 secure :-1 s=READY pgs=87 cs=0 l=1 rev1=1 crypto rx=0x7fca6400b5c0 tx=0x7fca64005fb0 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:48:15.819 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:15.817+0000 7fca5f7fe700 1 -- 192.168.123.105:0/2246322375 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(33..33 src has 1..33) v4 ==== 4446+0+0 (secure 0 0 0) 0x7fca58014070 con 0x7fca6810f660 2026-03-10T07:48:15.819 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:15.817+0000 7fca70623700 1 -- 192.168.123.105:0/2246322375 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": 
"get_command_descriptions"} v 0) v1 -- 0x7fca6804ea90 con 0x7fca6810f660 2026-03-10T07:48:15.822 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:15.820+0000 7fca5f7fe700 1 -- 192.168.123.105:0/2246322375 <== mon.0 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7fca58056070 con 0x7fca6810f660 2026-03-10T07:48:15.839 INFO:teuthology.orchestra.run.vm05.stderr:Inferring config /var/lib/ceph/12e9780e-1c55-11f1-8896-79f7c2e9b508/mon.vm05/config 2026-03-10T07:48:16.028 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:16.026+0000 7fca70623700 1 -- 192.168.123.105:0/2246322375 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "osd last-stat-seq", "id": 1} v 0) v1 -- 0x7fca68066ef0 con 0x7fca6810f660 2026-03-10T07:48:16.033 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:16.027+0000 7fca5f7fe700 1 -- 192.168.123.105:0/2246322375 <== mon.0 v2:192.168.123.105:3300/0 7 ==== mon_command_ack([{"prefix": "osd last-stat-seq", "id": 1}]=0 v0) v1 ==== 74+0+12 (secure 0 0 0) 0x7fca68066ef0 con 0x7fca6810f660 2026-03-10T07:48:16.035 INFO:teuthology.orchestra.run.vm05.stdout:55834574858 2026-03-10T07:48:16.042 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:16.039+0000 7fca70623700 1 -- 192.168.123.105:0/2246322375 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fca5406c380 msgr2=0x7fca5406e840 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:48:16.042 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:16.039+0000 7fca70623700 1 --2- 192.168.123.105:0/2246322375 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fca5406c380 0x7fca5406e840 secure :-1 s=READY pgs=87 cs=0 l=1 rev1=1 crypto rx=0x7fca6400b5c0 tx=0x7fca64005fb0 comp rx=0 tx=0).stop 2026-03-10T07:48:16.042 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:16.039+0000 7fca70623700 1 -- 
192.168.123.105:0/2246322375 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fca6810f660 msgr2=0x7fca68114f00 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:48:16.042 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:16.039+0000 7fca70623700 1 --2- 192.168.123.105:0/2246322375 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fca6810f660 0x7fca68114f00 secure :-1 s=READY pgs=235 cs=0 l=1 rev1=1 crypto rx=0x7fca5800eb10 tx=0x7fca5800ee20 comp rx=0 tx=0).stop 2026-03-10T07:48:16.042 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:16.040+0000 7fca70623700 1 -- 192.168.123.105:0/2246322375 shutdown_connections 2026-03-10T07:48:16.042 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:16.040+0000 7fca70623700 1 --2- 192.168.123.105:0/2246322375 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fca5406c380 0x7fca5406e840 unknown :-1 s=CLOSED pgs=87 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:48:16.042 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:16.040+0000 7fca70623700 1 --2- 192.168.123.105:0/2246322375 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fca681082d0 0x7fca68119eb0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:48:16.042 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:16.040+0000 7fca70623700 1 --2- 192.168.123.105:0/2246322375 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fca6810f660 0x7fca68114f00 unknown :-1 s=CLOSED pgs=235 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:48:16.042 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:16.040+0000 7fca70623700 1 -- 192.168.123.105:0/2246322375 >> 192.168.123.105:0/2246322375 conn(0x7fca6806d0f0 msgr2=0x7fca6806f960 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:48:16.042 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:16.040+0000 
7fca70623700 1 -- 192.168.123.105:0/2246322375 shutdown_connections 2026-03-10T07:48:16.045 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:16.043+0000 7fca70623700 1 -- 192.168.123.105:0/2246322375 wait complete. 2026-03-10T07:48:16.115 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:16.110+0000 7f7cc67e4700 1 -- 192.168.123.105:0/4099366570 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f7cb80a55b0 msgr2=0x7f7cb80b78c0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:48:16.115 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:16.110+0000 7f7cc67e4700 1 --2- 192.168.123.105:0/4099366570 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f7cb80a55b0 0x7f7cb80b78c0 secure :-1 s=READY pgs=236 cs=0 l=1 rev1=1 crypto rx=0x7f7cbc009b00 tx=0x7f7cbc009e10 comp rx=0 tx=0).stop 2026-03-10T07:48:16.115 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:16.111+0000 7f7cc67e4700 1 -- 192.168.123.105:0/4099366570 shutdown_connections 2026-03-10T07:48:16.115 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:16.111+0000 7f7cc67e4700 1 --2- 192.168.123.105:0/4099366570 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f7cb80a55b0 0x7f7cb80b78c0 unknown :-1 s=CLOSED pgs=236 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:48:16.115 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:16.111+0000 7f7cc67e4700 1 --2- 192.168.123.105:0/4099366570 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f7cb80a4c90 0x7f7cb80a5070 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:48:16.115 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:16.111+0000 7f7cc67e4700 1 -- 192.168.123.105:0/4099366570 >> 192.168.123.105:0/4099366570 conn(0x7f7cb801a6e0 msgr2=0x7f7cb801aaf0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:48:16.117 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:16.112+0000 7f7cc67e4700 1 -- 192.168.123.105:0/4099366570 shutdown_connections 2026-03-10T07:48:16.117 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:16.112+0000 7f7cc67e4700 1 -- 192.168.123.105:0/4099366570 wait complete. 2026-03-10T07:48:16.117 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:16.112+0000 7f7cc67e4700 1 Processor -- start 2026-03-10T07:48:16.117 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:16.112+0000 7f7cc67e4700 1 -- start start 2026-03-10T07:48:16.117 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:16.112+0000 7f7cc67e4700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f7cb80a4c90 0x7f7cb8142fc0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:48:16.118 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:16.112+0000 7f7cc67e4700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f7cb8143500 0x7f7cb8147970 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:48:16.118 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:16.112+0000 7f7cc67e4700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f7cb8143b20 con 0x7f7cb8143500 2026-03-10T07:48:16.118 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:16.112+0000 7f7cc67e4700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f7cb8143c90 con 0x7f7cb80a4c90 2026-03-10T07:48:16.118 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:16.114+0000 7f7cc57e2700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f7cb80a4c90 0x7f7cb8142fc0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:48:16.118 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:16.114+0000 7f7cc57e2700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f7cb80a4c90 0x7f7cb8142fc0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.108:3300/0 says I am v2:192.168.123.105:43956/0 (socket says 192.168.123.105:43956) 2026-03-10T07:48:16.118 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:16.114+0000 7f7cc57e2700 1 -- 192.168.123.105:0/1307258525 learned_addr learned my addr 192.168.123.105:0/1307258525 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T07:48:16.118 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:16.114+0000 7f7cc4fe1700 1 --2- 192.168.123.105:0/1307258525 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f7cb8143500 0x7f7cb8147970 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:48:16.118 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:16.114+0000 7f7cc57e2700 1 -- 192.168.123.105:0/1307258525 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f7cb8143500 msgr2=0x7f7cb8147970 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:48:16.118 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:16.114+0000 7f7cc57e2700 1 --2- 192.168.123.105:0/1307258525 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f7cb8143500 0x7f7cb8147970 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:48:16.118 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:16.114+0000 7f7cc57e2700 1 -- 192.168.123.105:0/1307258525 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f7cbc0097e0 con 0x7f7cb80a4c90 2026-03-10T07:48:16.118 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:16.114+0000 7f7cc4fe1700 1 --2- 192.168.123.105:0/1307258525 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f7cb8143500 0x7f7cb8147970 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_auth_reply_more state changed! 2026-03-10T07:48:16.118 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:16.117+0000 7f7cc57e2700 1 --2- 192.168.123.105:0/1307258525 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f7cb80a4c90 0x7f7cb8142fc0 secure :-1 s=READY pgs=42 cs=0 l=1 rev1=1 crypto rx=0x7f7cc0066b20 tx=0x7f7cc0072a50 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:48:16.121 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:16.118+0000 7f7cb67fc700 1 -- 192.168.123.105:0/1307258525 <== mon.1 v2:192.168.123.108:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f7cc0074970 con 0x7f7cb80a4c90 2026-03-10T07:48:16.121 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:16.118+0000 7f7cb67fc700 1 -- 192.168.123.105:0/1307258525 <== mon.1 v2:192.168.123.108:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f7cc00733c0 con 0x7f7cb80a4c90 2026-03-10T07:48:16.121 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:16.118+0000 7f7cc67e4700 1 -- 192.168.123.105:0/1307258525 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f7cb8147f70 con 0x7f7cb80a4c90 2026-03-10T07:48:16.121 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:16.118+0000 7f7cc67e4700 1 -- 192.168.123.105:0/1307258525 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f7cb8148490 con 0x7f7cb80a4c90 2026-03-10T07:48:16.121 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:16.118+0000 7f7cc67e4700 1 -- 192.168.123.105:0/1307258525 --> 
[v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f7cb81487c0 con 0x7f7cb80a4c90 2026-03-10T07:48:16.121 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:16.119+0000 7f7cb67fc700 1 -- 192.168.123.105:0/1307258525 <== mon.1 v2:192.168.123.108:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f7cc007b570 con 0x7f7cb80a4c90 2026-03-10T07:48:16.124 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:16.122+0000 7f7cb67fc700 1 -- 192.168.123.105:0/1307258525 <== mon.1 v2:192.168.123.108:3300/0 4 ==== mgrmap(e 19) v1 ==== 90308+0+0 (secure 0 0 0) 0x7f7cc007b750 con 0x7f7cb80a4c90 2026-03-10T07:48:16.124 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:16.122+0000 7f7cb67fc700 1 --2- 192.168.123.105:0/1307258525 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f7cac06c6d0 0x7f7cac06eb90 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:48:16.124 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:16.122+0000 7f7cb67fc700 1 -- 192.168.123.105:0/1307258525 <== mon.1 v2:192.168.123.108:3300/0 5 ==== osd_map(33..33 src has 1..33) v4 ==== 4446+0+0 (secure 0 0 0) 0x7f7cc00ef8d0 con 0x7f7cb80a4c90 2026-03-10T07:48:16.124 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:16.122+0000 7f7cc4fe1700 1 --2- 192.168.123.105:0/1307258525 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f7cac06c6d0 0x7f7cac06eb90 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:48:16.124 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:16.123+0000 7f7cc4fe1700 1 --2- 192.168.123.105:0/1307258525 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f7cac06c6d0 0x7f7cac06eb90 secure :-1 s=READY pgs=88 cs=0 l=1 rev1=1 crypto rx=0x7f7cbc00b5c0 tx=0x7f7cbc011040 comp rx=0 
tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:48:16.125 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:16.123+0000 7f7cb67fc700 1 -- 192.168.123.105:0/1307258525 <== mon.1 v2:192.168.123.108:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7f7cc00ba510 con 0x7f7cb80a4c90 2026-03-10T07:48:16.136 INFO:tasks.cephadm.ceph_manager.ceph:need seq 55834574858 got 55834574858 for osd.1 2026-03-10T07:48:16.136 DEBUG:teuthology.parallel:result is None 2026-03-10T07:48:16.304 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:16.301+0000 7f7cc67e4700 1 -- 192.168.123.105:0/1307258525 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "osd last-stat-seq", "id": 4} v 0) v1 -- 0x7f7cb8144350 con 0x7f7cb80a4c90 2026-03-10T07:48:16.308 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:16.306+0000 7f7cb67fc700 1 -- 192.168.123.105:0/1307258525 <== mon.1 v2:192.168.123.108:3300/0 7 ==== mon_command_ack([{"prefix": "osd last-stat-seq", "id": 4}]=0 v0) v1 ==== 74+0+13 (secure 0 0 0) 0x7f7cc00bdb30 con 0x7f7cb80a4c90 2026-03-10T07:48:16.308 INFO:teuthology.orchestra.run.vm05.stdout:120259084294 2026-03-10T07:48:16.316 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:16.312+0000 7f7cc67e4700 1 -- 192.168.123.105:0/1307258525 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f7cac06c6d0 msgr2=0x7f7cac06eb90 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:48:16.316 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:16.312+0000 7f7cc67e4700 1 --2- 192.168.123.105:0/1307258525 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f7cac06c6d0 0x7f7cac06eb90 secure :-1 s=READY pgs=88 cs=0 l=1 rev1=1 crypto rx=0x7f7cbc00b5c0 tx=0x7f7cbc011040 comp rx=0 tx=0).stop 2026-03-10T07:48:16.316 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:16.312+0000 
7f7cc67e4700 1 -- 192.168.123.105:0/1307258525 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f7cb80a4c90 msgr2=0x7f7cb8142fc0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:48:16.316 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:16.312+0000 7f7cc67e4700 1 --2- 192.168.123.105:0/1307258525 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f7cb80a4c90 0x7f7cb8142fc0 secure :-1 s=READY pgs=42 cs=0 l=1 rev1=1 crypto rx=0x7f7cc0066b20 tx=0x7f7cc0072a50 comp rx=0 tx=0).stop 2026-03-10T07:48:16.317 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:16.312+0000 7f7cc67e4700 1 -- 192.168.123.105:0/1307258525 shutdown_connections 2026-03-10T07:48:16.317 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:16.312+0000 7f7cc67e4700 1 --2- 192.168.123.105:0/1307258525 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f7cac06c6d0 0x7f7cac06eb90 unknown :-1 s=CLOSED pgs=88 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:48:16.317 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:16.312+0000 7f7cc67e4700 1 --2- 192.168.123.105:0/1307258525 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f7cb80a4c90 0x7f7cb8142fc0 secure :-1 s=CLOSED pgs=42 cs=0 l=1 rev1=1 crypto rx=0x7f7cc0066b20 tx=0x7f7cc0072a50 comp rx=0 tx=0).stop 2026-03-10T07:48:16.317 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:16.312+0000 7f7cc67e4700 1 --2- 192.168.123.105:0/1307258525 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f7cb8143500 0x7f7cb8147970 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:48:16.317 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:16.312+0000 7f7cc67e4700 1 -- 192.168.123.105:0/1307258525 >> 192.168.123.105:0/1307258525 conn(0x7f7cb801a6e0 msgr2=0x7f7cb80a3c80 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:48:16.317 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:16.313+0000 7f7cc67e4700 1 -- 192.168.123.105:0/1307258525 shutdown_connections 2026-03-10T07:48:16.317 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:16.313+0000 7f7cc67e4700 1 -- 192.168.123.105:0/1307258525 wait complete. 2026-03-10T07:48:16.362 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:16.360+0000 7f855dd94700 1 -- 192.168.123.105:0/2316411696 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f855810d0f0 msgr2=0x7f855810d570 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:48:16.362 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:16.360+0000 7f855dd94700 1 --2- 192.168.123.105:0/2316411696 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f855810d0f0 0x7f855810d570 secure :-1 s=READY pgs=237 cs=0 l=1 rev1=1 crypto rx=0x7f8544009b00 tx=0x7f8544009e10 comp rx=0 tx=0).stop 2026-03-10T07:48:16.362 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:16.360+0000 7f855dd94700 1 -- 192.168.123.105:0/2316411696 shutdown_connections 2026-03-10T07:48:16.362 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:16.360+0000 7f855dd94700 1 --2- 192.168.123.105:0/2316411696 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f855810d0f0 0x7f855810d570 unknown :-1 s=CLOSED pgs=237 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:48:16.362 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:16.360+0000 7f855dd94700 1 --2- 192.168.123.105:0/2316411696 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f855810f340 0x7f855810f720 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:48:16.362 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:16.360+0000 7f855dd94700 1 -- 192.168.123.105:0/2316411696 >> 192.168.123.105:0/2316411696 conn(0x7f855806ce20 msgr2=0x7f855806d230 unknown :-1 s=STATE_NONE l=0).mark_down 
2026-03-10T07:48:16.362 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:16.361+0000 7f855dd94700 1 -- 192.168.123.105:0/2316411696 shutdown_connections 2026-03-10T07:48:16.362 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:16.361+0000 7f855dd94700 1 -- 192.168.123.105:0/2316411696 wait complete. 2026-03-10T07:48:16.363 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:16.361+0000 7f855dd94700 1 Processor -- start 2026-03-10T07:48:16.363 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:16.361+0000 7f855dd94700 1 -- start start 2026-03-10T07:48:16.363 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:16.361+0000 7f855dd94700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f855810d0f0 0x7f85581a5970 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:48:16.363 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:16.361+0000 7f855dd94700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f855810f340 0x7f85581a5eb0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:48:16.363 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:16.361+0000 7f855dd94700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f85581a6540 con 0x7f855810f340 2026-03-10T07:48:16.363 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:16.361+0000 7f855dd94700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f85581aa350 con 0x7f855810d0f0 2026-03-10T07:48:16.367 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:16.362+0000 7f8557fff700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f855810f340 0x7f85581a5eb0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:48:16.367 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:16.362+0000 7f8557fff700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f855810f340 0x7f85581a5eb0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.105:38666/0 (socket says 192.168.123.105:38666) 2026-03-10T07:48:16.367 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:16.362+0000 7f8557fff700 1 -- 192.168.123.105:0/2869833025 learned_addr learned my addr 192.168.123.105:0/2869833025 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T07:48:16.367 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:16.362+0000 7f855cd92700 1 --2- 192.168.123.105:0/2869833025 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f855810d0f0 0x7f85581a5970 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:48:16.367 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:16.362+0000 7f8557fff700 1 -- 192.168.123.105:0/2869833025 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f855810d0f0 msgr2=0x7f85581a5970 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:48:16.367 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:16.362+0000 7f8557fff700 1 --2- 192.168.123.105:0/2869833025 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f855810d0f0 0x7f85581a5970 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:48:16.367 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:16.362+0000 7f8557fff700 1 -- 192.168.123.105:0/2869833025 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f85440097e0 con 0x7f855810f340 2026-03-10T07:48:16.367 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:16.362+0000 7f8557fff700 1 --2- 192.168.123.105:0/2869833025 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f855810f340 0x7f85581a5eb0 secure :-1 s=READY pgs=238 cs=0 l=1 rev1=1 crypto rx=0x7f85440052f0 tx=0x7f8544003680 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:48:16.367 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:16.365+0000 7f8555ffb700 1 -- 192.168.123.105:0/2869833025 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f854401d070 con 0x7f855810f340 2026-03-10T07:48:16.367 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:16.365+0000 7f855dd94700 1 -- 192.168.123.105:0/2869833025 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f85581aa5d0 con 0x7f855810f340 2026-03-10T07:48:16.367 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:16.365+0000 7f855dd94700 1 -- 192.168.123.105:0/2869833025 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f85581aaac0 con 0x7f855810f340 2026-03-10T07:48:16.370 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:16.366+0000 7f8555ffb700 1 -- 192.168.123.105:0/2869833025 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f8544003c90 con 0x7f855810f340 2026-03-10T07:48:16.371 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:16.366+0000 7f855dd94700 1 -- 192.168.123.105:0/2869833025 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f8558119d00 con 0x7f855810f340 2026-03-10T07:48:16.371 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:16.369+0000 7f8555ffb700 1 -- 192.168.123.105:0/2869833025 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 
0x7f8544021820 con 0x7f855810f340 2026-03-10T07:48:16.371 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:16.369+0000 7f8555ffb700 1 -- 192.168.123.105:0/2869833025 <== mon.0 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 19) v1 ==== 90308+0+0 (secure 0 0 0) 0x7f854402b430 con 0x7f855810f340 2026-03-10T07:48:16.372 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:16.370+0000 7f8555ffb700 1 --2- 192.168.123.105:0/2869833025 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f854806c6d0 0x7f854806eb90 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:48:16.372 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:16.370+0000 7f855cd92700 1 --2- 192.168.123.105:0/2869833025 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f854806c6d0 0x7f854806eb90 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:48:16.372 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:16.371+0000 7f8555ffb700 1 -- 192.168.123.105:0/2869833025 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(33..33 src has 1..33) v4 ==== 4446+0+0 (secure 0 0 0) 0x7f854408da00 con 0x7f855810f340 2026-03-10T07:48:16.372 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:16.371+0000 7f855cd92700 1 --2- 192.168.123.105:0/2869833025 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f854806c6d0 0x7f854806eb90 secure :-1 s=READY pgs=89 cs=0 l=1 rev1=1 crypto rx=0x7f854c005950 tx=0x7f854c00b410 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:48:16.373 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:16.371+0000 7f8555ffb700 1 -- 192.168.123.105:0/2869833025 <== mon.0 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7f85440931d0 con 0x7f855810f340 
2026-03-10T07:48:16.418 INFO:tasks.cephadm.ceph_manager.ceph:need seq 120259084293 got 120259084294 for osd.4 2026-03-10T07:48:16.418 DEBUG:teuthology.parallel:result is None 2026-03-10T07:48:16.422 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:16.420+0000 7f53d2ba9700 1 -- 192.168.123.105:0/793421851 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f53cc1082d0 msgr2=0x7f53cc108750 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:48:16.422 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:16.420+0000 7f53d2ba9700 1 --2- 192.168.123.105:0/793421851 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f53cc1082d0 0x7f53cc108750 secure :-1 s=READY pgs=239 cs=0 l=1 rev1=1 crypto rx=0x7f53c0009b50 tx=0x7f53c0009e60 comp rx=0 tx=0).stop 2026-03-10T07:48:16.422 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:16.420+0000 7f53d2ba9700 1 -- 192.168.123.105:0/793421851 shutdown_connections 2026-03-10T07:48:16.422 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:16.420+0000 7f53d2ba9700 1 --2- 192.168.123.105:0/793421851 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f53cc1082d0 0x7f53cc108750 unknown :-1 s=CLOSED pgs=239 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:48:16.422 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:16.420+0000 7f53d2ba9700 1 --2- 192.168.123.105:0/793421851 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f53cc10f660 0x7f53cc107d90 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:48:16.422 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:16.420+0000 7f53d2ba9700 1 -- 192.168.123.105:0/793421851 >> 192.168.123.105:0/793421851 conn(0x7f53cc06d0f0 msgr2=0x7f53cc06d500 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:48:16.422 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:16.420+0000 7f53d2ba9700 1 -- 
192.168.123.105:0/793421851 shutdown_connections 2026-03-10T07:48:16.422 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:16.420+0000 7f53d2ba9700 1 -- 192.168.123.105:0/793421851 wait complete. 2026-03-10T07:48:16.423 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:16.421+0000 7f53d2ba9700 1 Processor -- start 2026-03-10T07:48:16.423 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:16.421+0000 7f53d2ba9700 1 -- start start 2026-03-10T07:48:16.423 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:16.421+0000 7f53d2ba9700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f53cc1082d0 0x7f53cc117ea0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:48:16.423 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:16.421+0000 7f53d2ba9700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f53cc10f660 0x7f53cc112ea0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:48:16.423 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:16.421+0000 7f53d2ba9700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f53cc113470 con 0x7f53cc10f660 2026-03-10T07:48:16.423 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:16.421+0000 7f53d2ba9700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f53cc1135e0 con 0x7f53cc1082d0 2026-03-10T07:48:16.423 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:16.421+0000 7f53cbfff700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f53cc10f660 0x7f53cc112ea0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:48:16.423 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:16.421+0000 7f53cbfff700 1 --2- >> 
[v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f53cc10f660 0x7f53cc112ea0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.105:38678/0 (socket says 192.168.123.105:38678) 2026-03-10T07:48:16.423 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:16.421+0000 7f53cbfff700 1 -- 192.168.123.105:0/3828877738 learned_addr learned my addr 192.168.123.105:0/3828877738 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T07:48:16.424 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:16.421+0000 7f53d0945700 1 --2- 192.168.123.105:0/3828877738 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f53cc1082d0 0x7f53cc117ea0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:48:16.424 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:16.421+0000 7f53cbfff700 1 -- 192.168.123.105:0/3828877738 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f53cc1082d0 msgr2=0x7f53cc117ea0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:48:16.424 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:16.421+0000 7f53cbfff700 1 --2- 192.168.123.105:0/3828877738 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f53cc1082d0 0x7f53cc117ea0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:48:16.424 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:16.421+0000 7f53cbfff700 1 -- 192.168.123.105:0/3828877738 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f53c00097e0 con 0x7f53cc10f660 2026-03-10T07:48:16.424 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:16.422+0000 7f53cbfff700 1 --2- 192.168.123.105:0/3828877738 >> 
[v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f53cc10f660 0x7f53cc112ea0 secure :-1 s=READY pgs=240 cs=0 l=1 rev1=1 crypto rx=0x7f53c0005950 tx=0x7f53c00049c0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:48:16.424 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:16.422+0000 7f53c9ffb700 1 -- 192.168.123.105:0/3828877738 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f53c001d070 con 0x7f53cc10f660 2026-03-10T07:48:16.424 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:16.422+0000 7f53c9ffb700 1 -- 192.168.123.105:0/3828877738 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f53c000bb80 con 0x7f53cc10f660 2026-03-10T07:48:16.425 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:16.423+0000 7f53c9ffb700 1 -- 192.168.123.105:0/3828877738 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f53c000f740 con 0x7f53cc10f660 2026-03-10T07:48:16.425 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:16.423+0000 7f53d2ba9700 1 -- 192.168.123.105:0/3828877738 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f53cc113860 con 0x7f53cc10f660 2026-03-10T07:48:16.425 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:16.423+0000 7f53d2ba9700 1 -- 192.168.123.105:0/3828877738 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f53cc113d20 con 0x7f53cc10f660 2026-03-10T07:48:16.425 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:16.423+0000 7f53d2ba9700 1 -- 192.168.123.105:0/3828877738 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f53cc04ea90 con 0x7f53cc10f660 2026-03-10T07:48:16.429 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:16.427+0000 
7f53c9ffb700 1 -- 192.168.123.105:0/3828877738 <== mon.0 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 19) v1 ==== 90308+0+0 (secure 0 0 0) 0x7f53c0004d50 con 0x7f53cc10f660 2026-03-10T07:48:16.429 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:16.427+0000 7f53c9ffb700 1 --2- 192.168.123.105:0/3828877738 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f53b406c380 0x7f53b406e840 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:48:16.429 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:16.427+0000 7f53c9ffb700 1 -- 192.168.123.105:0/3828877738 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(33..33 src has 1..33) v4 ==== 4446+0+0 (secure 0 0 0) 0x7f53c008ccc0 con 0x7f53cc10f660 2026-03-10T07:48:16.429 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:16.427+0000 7f53c9ffb700 1 -- 192.168.123.105:0/3828877738 <== mon.0 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7f53c00902a0 con 0x7f53cc10f660 2026-03-10T07:48:16.432 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:16.430+0000 7f53d0945700 1 --2- 192.168.123.105:0/3828877738 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f53b406c380 0x7f53b406e840 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:48:16.432 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:16.430+0000 7f53d0945700 1 --2- 192.168.123.105:0/3828877738 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f53b406c380 0x7f53b406e840 secure :-1 s=READY pgs=90 cs=0 l=1 rev1=1 crypto rx=0x7f53bc006fd0 tx=0x7f53bc008040 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:48:16.506 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:48:16 vm05 ceph-mon[50387]: pgmap v67: 1 pgs: 1 
active+clean; 449 KiB data, 160 MiB used, 120 GiB / 120 GiB avail 2026-03-10T07:48:16.506 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:48:16 vm05 ceph-mon[50387]: from='client.? 192.168.123.105:0/2379461168' entity='client.admin' cmd=[{"prefix": "osd last-stat-seq", "id": 2}]: dispatch 2026-03-10T07:48:16.506 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:48:16 vm05 ceph-mon[50387]: from='client.? 192.168.123.105:0/2246322375' entity='client.admin' cmd=[{"prefix": "osd last-stat-seq", "id": 1}]: dispatch 2026-03-10T07:48:16.506 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:16.504+0000 7f855dd94700 1 -- 192.168.123.105:0/2869833025 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "osd last-stat-seq", "id": 3} v 0) v1 -- 0x7f855804ea90 con 0x7f855810f340 2026-03-10T07:48:16.509 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:16.507+0000 7f8555ffb700 1 -- 192.168.123.105:0/2869833025 <== mon.0 v2:192.168.123.105:3300/0 7 ==== mon_command_ack([{"prefix": "osd last-stat-seq", "id": 3}]=0 v0) v1 ==== 74+0+12 (secure 0 0 0) 0x7f854405c0d0 con 0x7f855810f340 2026-03-10T07:48:16.509 INFO:teuthology.orchestra.run.vm05.stdout:98784247815 2026-03-10T07:48:16.512 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:16.510+0000 7f855dd94700 1 -- 192.168.123.105:0/2869833025 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f854806c6d0 msgr2=0x7f854806eb90 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:48:16.512 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:16.510+0000 7f855dd94700 1 --2- 192.168.123.105:0/2869833025 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f854806c6d0 0x7f854806eb90 secure :-1 s=READY pgs=89 cs=0 l=1 rev1=1 crypto rx=0x7f854c005950 tx=0x7f854c00b410 comp rx=0 tx=0).stop 2026-03-10T07:48:16.512 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:16.510+0000 7f855dd94700 1 -- 192.168.123.105:0/2869833025 >> 
[v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f855810f340 msgr2=0x7f85581a5eb0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:48:16.512 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:16.510+0000 7f855dd94700 1 --2- 192.168.123.105:0/2869833025 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f855810f340 0x7f85581a5eb0 secure :-1 s=READY pgs=238 cs=0 l=1 rev1=1 crypto rx=0x7f85440052f0 tx=0x7f8544003680 comp rx=0 tx=0).stop 2026-03-10T07:48:16.512 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:16.510+0000 7f855dd94700 1 -- 192.168.123.105:0/2869833025 shutdown_connections 2026-03-10T07:48:16.512 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:16.510+0000 7f855dd94700 1 --2- 192.168.123.105:0/2869833025 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f854806c6d0 0x7f854806eb90 unknown :-1 s=CLOSED pgs=89 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:48:16.512 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:16.510+0000 7f855dd94700 1 --2- 192.168.123.105:0/2869833025 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f855810d0f0 0x7f85581a5970 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:48:16.512 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:16.510+0000 7f855dd94700 1 --2- 192.168.123.105:0/2869833025 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f855810f340 0x7f85581a5eb0 unknown :-1 s=CLOSED pgs=238 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:48:16.512 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:16.510+0000 7f855dd94700 1 -- 192.168.123.105:0/2869833025 >> 192.168.123.105:0/2869833025 conn(0x7f855806ce20 msgr2=0x7f85580706a0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:48:16.512 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:16.510+0000 7f855dd94700 1 -- 
192.168.123.105:0/2869833025 shutdown_connections 2026-03-10T07:48:16.512 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:16.510+0000 7f855dd94700 1 -- 192.168.123.105:0/2869833025 wait complete. 2026-03-10T07:48:16.553 INFO:tasks.cephadm.ceph_manager.ceph:need seq 98784247815 got 98784247815 for osd.3 2026-03-10T07:48:16.553 DEBUG:teuthology.parallel:result is None 2026-03-10T07:48:16.560 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:16.558+0000 7f53d2ba9700 1 -- 192.168.123.105:0/3828877738 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "osd last-stat-seq", "id": 0} v 0) v1 -- 0x7f53cc066e80 con 0x7f53cc10f660 2026-03-10T07:48:16.560 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:16.559+0000 7f53c9ffb700 1 -- 192.168.123.105:0/3828877738 <== mon.0 v2:192.168.123.105:3300/0 7 ==== mon_command_ack([{"prefix": "osd last-stat-seq", "id": 0}]=0 v0) v1 ==== 74+0+12 (secure 0 0 0) 0x7f53c0027090 con 0x7f53cc10f660 2026-03-10T07:48:16.560 INFO:teuthology.orchestra.run.vm05.stdout:38654705677 2026-03-10T07:48:16.562 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:16.561+0000 7f53d2ba9700 1 -- 192.168.123.105:0/3828877738 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f53b406c380 msgr2=0x7f53b406e840 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:48:16.563 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:16.561+0000 7f53d2ba9700 1 --2- 192.168.123.105:0/3828877738 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f53b406c380 0x7f53b406e840 secure :-1 s=READY pgs=90 cs=0 l=1 rev1=1 crypto rx=0x7f53bc006fd0 tx=0x7f53bc008040 comp rx=0 tx=0).stop 2026-03-10T07:48:16.563 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:16.561+0000 7f53d2ba9700 1 -- 192.168.123.105:0/3828877738 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f53cc10f660 msgr2=0x7f53cc112ea0 secure :-1 s=STATE_CONNECTION_ESTABLISHED 
l=1).mark_down 2026-03-10T07:48:16.563 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:16.561+0000 7f53d2ba9700 1 --2- 192.168.123.105:0/3828877738 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f53cc10f660 0x7f53cc112ea0 secure :-1 s=READY pgs=240 cs=0 l=1 rev1=1 crypto rx=0x7f53c0005950 tx=0x7f53c00049c0 comp rx=0 tx=0).stop 2026-03-10T07:48:16.563 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:16.561+0000 7f53d2ba9700 1 -- 192.168.123.105:0/3828877738 shutdown_connections 2026-03-10T07:48:16.563 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:16.561+0000 7f53d2ba9700 1 --2- 192.168.123.105:0/3828877738 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f53b406c380 0x7f53b406e840 unknown :-1 s=CLOSED pgs=90 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:48:16.563 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:16.561+0000 7f53d2ba9700 1 --2- 192.168.123.105:0/3828877738 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f53cc1082d0 0x7f53cc117ea0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:48:16.563 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:16.561+0000 7f53d2ba9700 1 --2- 192.168.123.105:0/3828877738 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f53cc10f660 0x7f53cc112ea0 unknown :-1 s=CLOSED pgs=240 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:48:16.563 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:16.561+0000 7f53d2ba9700 1 -- 192.168.123.105:0/3828877738 >> 192.168.123.105:0/3828877738 conn(0x7f53cc06d0f0 msgr2=0x7f53cc10d3f0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:48:16.563 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:16.562+0000 7f53d2ba9700 1 -- 192.168.123.105:0/3828877738 shutdown_connections 2026-03-10T07:48:16.563 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:16.562+0000 
7f53d2ba9700 1 -- 192.168.123.105:0/3828877738 wait complete. 2026-03-10T07:48:16.625 INFO:tasks.cephadm.ceph_manager.ceph:need seq 38654705676 got 38654705677 for osd.0 2026-03-10T07:48:16.625 DEBUG:teuthology.parallel:result is None 2026-03-10T07:48:16.626 INFO:tasks.cephadm.ceph_manager.ceph:waiting for clean 2026-03-10T07:48:16.626 DEBUG:teuthology.orchestra.run.vm05:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell --fsid 12e9780e-1c55-11f1-8896-79f7c2e9b508 -- ceph pg dump --format=json 2026-03-10T07:48:16.668 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:48:16 vm08 ceph-mon[59917]: pgmap v67: 1 pgs: 1 active+clean; 449 KiB data, 160 MiB used, 120 GiB / 120 GiB avail 2026-03-10T07:48:16.668 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:48:16 vm08 ceph-mon[59917]: from='client.? 192.168.123.105:0/2379461168' entity='client.admin' cmd=[{"prefix": "osd last-stat-seq", "id": 2}]: dispatch 2026-03-10T07:48:16.668 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:48:16 vm08 ceph-mon[59917]: from='client.? 
192.168.123.105:0/2246322375' entity='client.admin' cmd=[{"prefix": "osd last-stat-seq", "id": 1}]: dispatch 2026-03-10T07:48:16.813 INFO:teuthology.orchestra.run.vm05.stderr:Inferring config /var/lib/ceph/12e9780e-1c55-11f1-8896-79f7c2e9b508/mon.vm05/config 2026-03-10T07:48:17.067 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:17.065+0000 7f521de77700 1 -- 192.168.123.105:0/1756675594 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f5218073130 msgr2=0x7f5218073510 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:48:17.067 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:17.065+0000 7f521de77700 1 --2- 192.168.123.105:0/1756675594 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f5218073130 0x7f5218073510 secure :-1 s=READY pgs=241 cs=0 l=1 rev1=1 crypto rx=0x7f5200009b50 tx=0x7f5200009e60 comp rx=0 tx=0).stop 2026-03-10T07:48:17.067 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:17.066+0000 7f521de77700 1 -- 192.168.123.105:0/1756675594 shutdown_connections 2026-03-10T07:48:17.067 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:17.066+0000 7f521de77700 1 --2- 192.168.123.105:0/1756675594 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f5218073a50 0x7f5218111990 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:48:17.067 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:17.066+0000 7f521de77700 1 --2- 192.168.123.105:0/1756675594 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f5218073130 0x7f5218073510 unknown :-1 s=CLOSED pgs=241 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:48:17.067 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:17.066+0000 7f521de77700 1 -- 192.168.123.105:0/1756675594 >> 192.168.123.105:0/1756675594 conn(0x7f52180fc9b0 msgr2=0x7f52180fedd0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:48:17.067 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:17.066+0000 7f521de77700 1 -- 192.168.123.105:0/1756675594 shutdown_connections 2026-03-10T07:48:17.068 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:17.066+0000 7f521de77700 1 -- 192.168.123.105:0/1756675594 wait complete. 2026-03-10T07:48:17.068 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:17.066+0000 7f521de77700 1 Processor -- start 2026-03-10T07:48:17.068 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:17.067+0000 7f521de77700 1 -- start start 2026-03-10T07:48:17.068 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:17.067+0000 7f521de77700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f5218073130 0x7f5218072900 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:48:17.068 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:17.067+0000 7f521de77700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f5218073a50 0x7f521806d900 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:48:17.069 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:17.067+0000 7f521de77700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f521806ded0 con 0x7f5218073a50 2026-03-10T07:48:17.069 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:17.067+0000 7f521de77700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f521806e040 con 0x7f5218073130 2026-03-10T07:48:17.069 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:17.067+0000 7f52177fe700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f5218073130 0x7f5218072900 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:48:17.069 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:17.067+0000 7f52177fe700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f5218073130 0x7f5218072900 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.108:3300/0 says I am v2:192.168.123.105:44014/0 (socket says 192.168.123.105:44014) 2026-03-10T07:48:17.069 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:17.067+0000 7f52177fe700 1 -- 192.168.123.105:0/1969433676 learned_addr learned my addr 192.168.123.105:0/1969433676 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T07:48:17.069 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:17.067+0000 7f52177fe700 1 -- 192.168.123.105:0/1969433676 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f5218073a50 msgr2=0x7f521806d900 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:48:17.069 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:17.067+0000 7f5216ffd700 1 --2- 192.168.123.105:0/1969433676 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f5218073a50 0x7f521806d900 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:48:17.069 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:17.068+0000 7f52177fe700 1 --2- 192.168.123.105:0/1969433676 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f5218073a50 0x7f521806d900 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:48:17.069 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:17.068+0000 7f52177fe700 1 -- 192.168.123.105:0/1969433676 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f52000097e0 con 0x7f5218073130 2026-03-10T07:48:17.069 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:17.068+0000 7f5216ffd700 1 --2- 192.168.123.105:0/1969433676 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f5218073a50 0x7f521806d900 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request state changed! 2026-03-10T07:48:17.069 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:17.068+0000 7f52177fe700 1 --2- 192.168.123.105:0/1969433676 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f5218073130 0x7f5218072900 secure :-1 s=READY pgs=43 cs=0 l=1 rev1=1 crypto rx=0x7f52000094d0 tx=0x7f52000049e0 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:48:17.070 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:17.068+0000 7f5214ff9700 1 -- 192.168.123.105:0/1969433676 <== mon.1 v2:192.168.123.108:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f520001d070 con 0x7f5218073130 2026-03-10T07:48:17.070 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:17.068+0000 7f521de77700 1 -- 192.168.123.105:0/1969433676 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f521806e320 con 0x7f5218073130 2026-03-10T07:48:17.070 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:17.068+0000 7f5214ff9700 1 -- 192.168.123.105:0/1969433676 <== mon.1 v2:192.168.123.108:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f520000bc30 con 0x7f5218073130 2026-03-10T07:48:17.070 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:17.068+0000 7f5214ff9700 1 -- 192.168.123.105:0/1969433676 <== mon.1 v2:192.168.123.108:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f520000f780 con 0x7f5218073130 2026-03-10T07:48:17.070 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:17.068+0000 7f521de77700 1 -- 192.168.123.105:0/1969433676 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] 
-- mon_subscribe({osdmap=0}) v3 -- 0x7f52181aff00 con 0x7f5218073130 2026-03-10T07:48:17.071 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:17.069+0000 7f521de77700 1 -- 192.168.123.105:0/1969433676 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f521810f110 con 0x7f5218073130 2026-03-10T07:48:17.071 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:17.070+0000 7f5214ff9700 1 -- 192.168.123.105:0/1969433676 <== mon.1 v2:192.168.123.108:3300/0 4 ==== mgrmap(e 19) v1 ==== 90308+0+0 (secure 0 0 0) 0x7f520000f8e0 con 0x7f5218073130 2026-03-10T07:48:17.072 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:17.070+0000 7f5214ff9700 1 --2- 192.168.123.105:0/1969433676 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f520406c490 0x7f520406e950 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:48:17.072 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:17.070+0000 7f5216ffd700 1 --2- 192.168.123.105:0/1969433676 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f520406c490 0x7f520406e950 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:48:17.072 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:17.070+0000 7f5214ff9700 1 -- 192.168.123.105:0/1969433676 <== mon.1 v2:192.168.123.108:3300/0 5 ==== osd_map(33..33 src has 1..33) v4 ==== 4446+0+0 (secure 0 0 0) 0x7f520000bda0 con 0x7f5218073130 2026-03-10T07:48:17.072 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:17.071+0000 7f5216ffd700 1 --2- 192.168.123.105:0/1969433676 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f520406c490 0x7f520406e950 secure :-1 s=READY pgs=91 cs=0 l=1 rev1=1 crypto rx=0x7f521806f0f0 tx=0x7f5208008040 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 
in_seq=0 out_seq=0 2026-03-10T07:48:17.074 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:17.072+0000 7f5214ff9700 1 -- 192.168.123.105:0/1969433676 <== mon.1 v2:192.168.123.108:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7f5200092050 con 0x7f5218073130 2026-03-10T07:48:17.177 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:17.175+0000 7f521de77700 1 -- 192.168.123.105:0/1969433676 --> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] -- mgr_command(tid 0: {"prefix": "pg dump", "target": ["mon-mgr", ""], "format": "json"}) v1 -- 0x7f521806ef10 con 0x7f520406c490 2026-03-10T07:48:17.178 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:17.176+0000 7f5214ff9700 1 -- 192.168.123.105:0/1969433676 <== mgr.14223 v2:192.168.123.105:6800/2 1 ==== mgr_command_reply(tid 0: 0 dumped all) v1 ==== 18+0+19134 (secure 0 0 0) 0x7f521806ef10 con 0x7f520406c490 2026-03-10T07:48:17.178 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-10T07:48:17.180 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:17.179+0000 7f521de77700 1 -- 192.168.123.105:0/1969433676 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f520406c490 msgr2=0x7f520406e950 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:48:17.180 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:17.179+0000 7f521de77700 1 --2- 192.168.123.105:0/1969433676 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f520406c490 0x7f520406e950 secure :-1 s=READY pgs=91 cs=0 l=1 rev1=1 crypto rx=0x7f521806f0f0 tx=0x7f5208008040 comp rx=0 tx=0).stop 2026-03-10T07:48:17.181 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:17.179+0000 7f521de77700 1 -- 192.168.123.105:0/1969433676 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f5218073130 msgr2=0x7f5218072900 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:48:17.181 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:17.179+0000 7f521de77700 1 --2- 192.168.123.105:0/1969433676 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f5218073130 0x7f5218072900 secure :-1 s=READY pgs=43 cs=0 l=1 rev1=1 crypto rx=0x7f52000094d0 tx=0x7f52000049e0 comp rx=0 tx=0).stop 2026-03-10T07:48:17.181 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:17.179+0000 7f521de77700 1 -- 192.168.123.105:0/1969433676 shutdown_connections 2026-03-10T07:48:17.181 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:17.179+0000 7f521de77700 1 --2- 192.168.123.105:0/1969433676 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f520406c490 0x7f520406e950 unknown :-1 s=CLOSED pgs=91 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:48:17.181 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:17.179+0000 7f521de77700 1 --2- 192.168.123.105:0/1969433676 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f5218073130 0x7f5218072900 unknown :-1 s=CLOSED pgs=43 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:48:17.181 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:17.179+0000 7f521de77700 1 --2- 192.168.123.105:0/1969433676 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f5218073a50 0x7f521806d900 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:48:17.181 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:17.180+0000 7f521de77700 1 -- 192.168.123.105:0/1969433676 >> 192.168.123.105:0/1969433676 conn(0x7f52180fc9b0 msgr2=0x7f52181034a0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:48:17.181 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:17.180+0000 7f521de77700 1 -- 192.168.123.105:0/1969433676 shutdown_connections 2026-03-10T07:48:17.181 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:17.180+0000 7f521de77700 1 -- 192.168.123.105:0/1969433676 wait 
complete. 2026-03-10T07:48:17.182 INFO:teuthology.orchestra.run.vm05.stderr:dumped all 2026-03-10T07:48:17.224 INFO:teuthology.orchestra.run.vm05.stdout:{"pg_ready":true,"pg_map":{"version":67,"stamp":"2026-03-10T07:48:15.286183+0000","last_osdmap_epoch":0,"last_pg_scan":0,"pg_stats_sum":{"stat_sum":{"num_bytes":459280,"num_objects":2,"num_object_clones":0,"num_object_copies":6,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":2,"num_whiteouts":0,"num_read":96,"num_read_kb":82,"num_write":113,"num_write_kb":1372,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"store_stats":{"total":0,"available":0,"internally_reserved":0,"allocated":0,"data_stored":0,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":0},"log_size":76,"ondisk_log_size":76,"up":3,"acting":3,"num_store_stats":0},"osd_stats_sum":{"up_from":0,"seq":0,"num_pgs":3,"num_osds":6,"num_per_pool_osds":6,"num_per_pool_omap_osds":3,"kb":125804544,"kb_used":163644,"kb_used_data":3084,"kb_used_omap":0,"kb_used_meta":160512,"kb_avail":125640900,"statfs":{"total":128823853056,"available":128656281600,"internally_reserved":0,"allocated":3158016,"data_stored":2043408,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":164364288},"hb_peers":[],"snap_trim_queue_len":0,"num_snap_t
rimming":0,"num_shards_repaired":0,"op_queue_age_hist":{"histogram":[],"upper_bound":1},"perf_stat":{"commit_latency_ms":0,"apply_latency_ms":0,"commit_latency_ns":0,"apply_latency_ns":0},"alerts":[],"network_ping_times":[]},"pg_stats_delta":{"stat_sum":{"num_bytes":0,"num_objects":0,"num_object_clones":0,"num_object_copies":0,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":0,"num_whiteouts":0,"num_read":0,"num_read_kb":0,"num_write":0,"num_write_kb":0,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"store_stats":{"total":0,"available":0,"internally_reserved":0,"allocated":0,"data_stored":0,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":0},"log_size":0,"ondisk_log_size":0,"up":0,"acting":0,"num_store_stats":0,"stamp_delta":"9.436200"},"pg_stats":[{"pgid":"1.0","version":"20'76","reported_seq":138,"reported_epoch":32,"state":"active+clean","last_fresh":"2026-03-10T07:48:05.851521+0000","last_change":"2026-03-10T07:47:55.035052+0000","last_active":"2026-03-10T07:48:05.851521+0000","last_peered":"2026-03-10T07:48:05.851521+0000","last_clean":"2026-03-10T07:48:05.851521+0000","last_became_active":"2026-03-10T07:47:55.034906+0000","last_became_peered":"2026-03-10T07:47:55.034906+0000","last_unstale":"2026-03-10T07:48:05.851521+0000","last_undegraded":"2026-03-10T07:48:05.851521+0000","las
t_fullsized":"2026-03-10T07:48:05.851521+0000","mapping_epoch":27,"log_start":"0'0","ondisk_log_start":"0'0","created":18,"last_epoch_clean":28,"parent":"0.0","parent_split_bits":0,"last_scrub":"0'0","last_scrub_stamp":"2026-03-10T07:47:38.098498+0000","last_deep_scrub":"0'0","last_deep_scrub_stamp":"2026-03-10T07:47:38.098498+0000","last_clean_scrub_stamp":"2026-03-10T07:47:38.098498+0000","objects_scrubbed":0,"log_size":76,"log_dups_size":0,"ondisk_log_size":76,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":0,"scrub_schedule":"periodic scrub scheduled @ 2026-03-11T14:39:24.759973+0000","scrub_duration":0,"objects_trimmed":0,"snaptrim_duration":0,"stat_sum":{"num_bytes":459280,"num_objects":2,"num_object_clones":0,"num_object_copies":6,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":2,"num_whiteouts":0,"num_read":96,"num_read_kb":82,"num_write":113,"num_write_kb":1372,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[3,0,1],"acting":[3,0,1],"avail_no_missing":[],"object_location_counts":[],"blocked_by":[],"up_primary":3,"acting_primary":3,"purged_snaps":[]}],"pool_stats":[{"poolid":1,"num_pg":1,"stat_sum":{"num_bytes":459280,"num_objects":2,"num_object_clones":
0,"num_object_copies":6,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":2,"num_whiteouts":0,"num_read":96,"num_read_kb":82,"num_write":113,"num_write_kb":1372,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"store_stats":{"total":0,"available":0,"internally_reserved":0,"allocated":1388544,"data_stored":1377840,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":0},"log_size":76,"ondisk_log_size":76,"up":3,"acting":3,"num_store_stats":4}],"osd_stats":[{"osd":5,"up_from":32,"seq":137438953475,"num_pgs":0,"num_osds":1,"num_per_pool_osds":1,"num_per_pool_omap_osds":0,"kb":20967424,"kb_used":27048,"kb_used_data":288,"kb_used_omap":0,"kb_used_meta":26752,"kb_avail":20940376,"statfs":{"total":21470642176,"available":21442945024,"internally_reserved":0,"allocated":294912,"data_stored":110928,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":27394048},"hb_peers":[0,1,2,3,4],"snap_trim_queue_len":0,"num_snap_trimming":0,"num_shards_repaired":0,"op_queue_age_hist":{"histogram":[],"upper_bound":1},"perf_stat":{"commit_latency_ms":0,"apply_latency_ms":0,"commit_latency_ns":0,"apply_latency_ns":0},"alerts":[],"network_ping_times":[{"osd":0,"last update":"Thu Jan 1 00:00:00 
1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.72299999999999998}]},{"osd":1,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.78300000000000003}]},{"osd":2,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.70199999999999996}]},{"osd":3,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.72999999999999998}]},{"osd":4,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.87}]}]},{"osd":4,"up_from":28,"seq":120259084294,"num_pgs":0,"num_osds":1,"num_per_pool_osds":1,"num_per_pool_omap_osds":0,"kb":20967424,"kb_used":27048,"kb_used_data":288,"kb_used_omap":0,"kb_used_meta":26752,"kb_avail":20940376,"statfs":{"total":21470642176,"available":21442945024,"internally_reserved":0,"allocated":294912,"data_stored":110928,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":27394048},"hb_peers":[0,1,2,3,5],"snap_trim_queue_len":0,"num_snap_trimming":0,"num_shards_repaired":0,"op_queue_age_hist":{"histogram":[],"upper_bound":1},"perf_stat":{"commit_latency_ms":0,"apply_latency_ms":0,"commit_latency_ns":0,"apply_latency_ns":0},"alerts":[],"network_ping_times":[{"osd":0,"last update":"Thu Jan 1 00:00:00 
1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.41199999999999998}]},{"osd":1,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.39600000000000002}]},{"osd":2,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.57499999999999996}]},{"osd":3,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.56499999999999995}]},{"osd":5,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.47699999999999998}]}]},{"osd":3,"up_from":23,"seq":98784247815,"num_pgs":1,"num_osds":1,"num_per_pool_osds":1,"num_per_pool_omap_osds":1,"kb":20967424,"kb_used":27500,"kb_used_data":740,"kb_used_omap":0,"kb_used_meta":26752,"kb_avail":20939924,"statfs":{"total":21470642176,"available":21442482176,"internally_reserved":0,"allocated":757760,"data_stored":570208,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":27394048},"hb_peers":[0,1,2,4,5],"snap_trim_queue_len":0,"num_snap_trimming":0,"num_shards_repaired":0,"op_queue_age_hist":{"histogram":[],"upper_bound":1},"perf_stat":{"commit_latency_ms":0,"apply_latency_ms":0,"commit_latency_ns":0,"apply_latency_ns":0},"alerts":[],"network_ping_times":[{"osd":0,"last update":"Thu Jan 1 00:00:00 
1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.36399999999999999}]},{"osd":1,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.32100000000000001}]},{"osd":2,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.44700000000000001}]},{"osd":4,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.309}]},{"osd":5,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.40899999999999997}]}]},{"osd":2,"up_from":17,"seq":73014444041,"num_pgs":0,"num_osds":1,"num_per_pool_osds":1,"num_per_pool_omap_osds":0,"kb":20967424,"kb_used":27048,"kb_used_data":288,"kb_used_omap":0,"kb_used_meta":26752,"kb_avail":20940376,"statfs":{"total":21470642176,"available":21442945024,"internally_reserved":0,"allocated":294912,"data_stored":110928,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":27394048},"hb_peers":[0,1,3,4,5],"snap_trim_queue_len":0,"num_snap_trimming":0,"num_shards_repaired":0,"op_queue_age_hist":{"histogram":[],"upper_bound":1},"perf_stat":{"commit_latency_ms":0,"apply_latency_ms":0,"commit_latency_ns":0,"apply_latency_ns":0},"alerts":[],"network_ping_times":[{"osd":0,"last update":"Thu Jan 1 00:00:00 
1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.52700000000000002}]},{"osd":1,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.34200000000000003}]},{"osd":3,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.54100000000000004}]},{"osd":4,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.65900000000000003}]},{"osd":5,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.54900000000000004}]}]},{"osd":0,"up_from":9,"seq":38654705677,"num_pgs":1,"num_osds":1,"num_per_pool_osds":1,"num_per_pool_omap_osds":1,"kb":20967424,"kb_used":27500,"kb_used_data":740,"kb_used_omap":0,"kb_used_meta":26752,"kb_avail":20939924,"statfs":{"total":21470642176,"available":21442482176,"internally_reserved":0,"allocated":757760,"data_stored":570208,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":27394048},"hb_peers":[1,2,3,4,5],"snap_trim_queue_len":0,"num_snap_trimming":0,"num_shards_repaired":0,"op_queue_age_hist":{"histogram":[],"upper_bound":1},"perf_stat":{"commit_latency_ms":0,"apply_latency_ms":0,"commit_latency_ns":0,"apply_latency_ns":0},"alerts":[],"network_ping_times":[{"osd":1,"last update":"Thu Jan 1 00:00:00 
1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.66200000000000003}]},{"osd":2,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.59799999999999998}]},{"osd":3,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.83899999999999997}]},{"osd":4,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.70799999999999996}]},{"osd":5,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.78400000000000003}]}]},{"osd":1,"up_from":13,"seq":55834574859,"num_pgs":1,"num_osds":1,"num_per_pool_osds":1,"num_per_pool_omap_osds":1,"kb":20967424,"kb_used":27500,"kb_used_data":740,"kb_used_omap":0,"kb_used_meta":26752,"kb_avail":20939924,"statfs":{"total":21470642176,"available":21442482176,"internally_reserved":0,"allocated":757760,"data_stored":570208,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":27394048},"hb_peers":[0,2,3,4,5],"snap_trim_queue_len":0,"num_snap_trimming":0,"num_shards_repaired":0,"op_queue_age_hist":{"histogram":[],"upper_bound":1},"perf_stat":{"commit_latency_ms":0,"apply_latency_ms":0,"commit_latency_ns":0,"apply_latency_ns":0},"alerts":[],"network_ping_times":[{"osd":0,"last update":"Thu Jan 1 00:00:00 
1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.42399999999999999}]},{"osd":2,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.35299999999999998}]},{"osd":3,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.50800000000000001}]},{"osd":4,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.63}]},{"osd":5,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.499}]}]}],"pool_statfs":[{"poolid":1,"osd":0,"total":0,"available":0,"internally_reserved":0,"allocated":462848,"data_stored":459280,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":0},{"poolid":1,"osd":1,"total":0,"available":0,"internally_reserved":0,"allocated":462848,"data_stored":459280,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":0},{"poolid":1,"osd":2,"total":0,"available":0,"internally_reserved":0,"allocated":0,"data_stored":0,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":0},{"poolid":1,"osd":3,"total":0,"available":0,"internally_reserved":0,"allocated":462848,"data_stored":459280,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":0}]}} 
2026-03-10T07:48:17.224 DEBUG:teuthology.orchestra.run.vm05:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell --fsid 12e9780e-1c55-11f1-8896-79f7c2e9b508 -- ceph pg dump --format=json 2026-03-10T07:48:17.370 INFO:teuthology.orchestra.run.vm05.stderr:Inferring config /var/lib/ceph/12e9780e-1c55-11f1-8896-79f7c2e9b508/mon.vm05/config 2026-03-10T07:48:17.428 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:48:17 vm05 ceph-mon[50387]: from='client.? 192.168.123.105:0/1307258525' entity='client.admin' cmd=[{"prefix": "osd last-stat-seq", "id": 4}]: dispatch 2026-03-10T07:48:17.428 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:48:17 vm05 ceph-mon[50387]: from='client.? 192.168.123.105:0/2869833025' entity='client.admin' cmd=[{"prefix": "osd last-stat-seq", "id": 3}]: dispatch 2026-03-10T07:48:17.428 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:48:17 vm05 ceph-mon[50387]: from='client.? 192.168.123.105:0/3828877738' entity='client.admin' cmd=[{"prefix": "osd last-stat-seq", "id": 0}]: dispatch 2026-03-10T07:48:17.621 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:17.618+0000 7f28e58fe700 1 -- 192.168.123.105:0/904451 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f28e0103cf0 msgr2=0x7f28e0107d40 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:48:17.621 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:17.618+0000 7f28e58fe700 1 --2- 192.168.123.105:0/904451 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f28e0103cf0 0x7f28e0107d40 secure :-1 s=READY pgs=242 cs=0 l=1 rev1=1 crypto rx=0x7f28d0009b00 tx=0x7f28d0009e10 comp rx=0 tx=0).stop 2026-03-10T07:48:17.621 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:17.619+0000 7f28e58fe700 1 -- 192.168.123.105:0/904451 shutdown_connections 2026-03-10T07:48:17.621 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:17.619+0000 7f28e58fe700 1 --2- 192.168.123.105:0/904451 >> 
[v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f28e0103cf0 0x7f28e0107d40 unknown :-1 s=CLOSED pgs=242 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:48:17.621 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:17.619+0000 7f28e58fe700 1 --2- 192.168.123.105:0/904451 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f28e0103340 0x7f28e0103720 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:48:17.621 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:17.619+0000 7f28e58fe700 1 -- 192.168.123.105:0/904451 >> 192.168.123.105:0/904451 conn(0x7f28e00feb90 msgr2=0x7f28e0100fb0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:48:17.621 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:17.619+0000 7f28e58fe700 1 -- 192.168.123.105:0/904451 shutdown_connections 2026-03-10T07:48:17.621 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:17.619+0000 7f28e58fe700 1 -- 192.168.123.105:0/904451 wait complete. 
2026-03-10T07:48:17.621 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:17.620+0000 7f28e58fe700 1 Processor -- start 2026-03-10T07:48:17.622 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:17.620+0000 7f28e58fe700 1 -- start start 2026-03-10T07:48:17.622 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:17.620+0000 7f28e58fe700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f28e0103340 0x7f28e0198dc0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:48:17.622 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:17.620+0000 7f28e58fe700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f28e0103cf0 0x7f28e0199300 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:48:17.622 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:17.620+0000 7f28e58fe700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f28e01999e0 con 0x7f28e0103cf0 2026-03-10T07:48:17.622 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:17.620+0000 7f28e58fe700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f28e019d770 con 0x7f28e0103340 2026-03-10T07:48:17.622 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:17.620+0000 7f28de7fc700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f28e0103cf0 0x7f28e0199300 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:48:17.622 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:17.621+0000 7f28de7fc700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f28e0103cf0 0x7f28e0199300 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am 
v2:192.168.123.105:38726/0 (socket says 192.168.123.105:38726) 2026-03-10T07:48:17.622 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:17.621+0000 7f28de7fc700 1 -- 192.168.123.105:0/3029648700 learned_addr learned my addr 192.168.123.105:0/3029648700 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T07:48:17.622 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:17.621+0000 7f28de7fc700 1 -- 192.168.123.105:0/3029648700 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f28e0103340 msgr2=0x7f28e0198dc0 unknown :-1 s=STATE_CONNECTING l=1).mark_down 2026-03-10T07:48:17.623 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:17.621+0000 7f28de7fc700 1 --2- 192.168.123.105:0/3029648700 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f28e0103340 0x7f28e0198dc0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:48:17.623 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:17.621+0000 7f28de7fc700 1 -- 192.168.123.105:0/3029648700 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f28d00097e0 con 0x7f28e0103cf0 2026-03-10T07:48:17.623 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:17.621+0000 7f28de7fc700 1 --2- 192.168.123.105:0/3029648700 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f28e0103cf0 0x7f28e0199300 secure :-1 s=READY pgs=243 cs=0 l=1 rev1=1 crypto rx=0x7f28d0009fd0 tx=0x7f28d0004970 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:48:17.623 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:17.621+0000 7f28e48fc700 1 -- 192.168.123.105:0/3029648700 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f28d001d070 con 0x7f28e0103cf0 2026-03-10T07:48:17.623 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:17.622+0000 7f28e58fe700 1 -- 
192.168.123.105:0/3029648700 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f28e019d9f0 con 0x7f28e0103cf0 2026-03-10T07:48:17.623 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:17.622+0000 7f28e48fc700 1 -- 192.168.123.105:0/3029648700 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f28d000bc50 con 0x7f28e0103cf0 2026-03-10T07:48:17.624 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:17.622+0000 7f28e58fe700 1 -- 192.168.123.105:0/3029648700 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f28e019dee0 con 0x7f28e0103cf0 2026-03-10T07:48:17.625 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:17.623+0000 7f28d67fc700 1 -- 192.168.123.105:0/3029648700 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f28c40052f0 con 0x7f28e0103cf0 2026-03-10T07:48:17.627 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:17.626+0000 7f28e48fc700 1 -- 192.168.123.105:0/3029648700 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f28d0022470 con 0x7f28e0103cf0 2026-03-10T07:48:17.628 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:17.626+0000 7f28e48fc700 1 -- 192.168.123.105:0/3029648700 <== mon.0 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 19) v1 ==== 90308+0+0 (secure 0 0 0) 0x7f28d0022650 con 0x7f28e0103cf0 2026-03-10T07:48:17.628 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:17.627+0000 7f28e48fc700 1 --2- 192.168.123.105:0/3029648700 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f28cc06c4e0 0x7f28cc06e9a0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:48:17.628 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:17.627+0000 7f28e48fc700 1 -- 192.168.123.105:0/3029648700 <== mon.0 
v2:192.168.123.105:3300/0 5 ==== osd_map(33..33 src has 1..33) v4 ==== 4446+0+0 (secure 0 0 0) 0x7f28d008d690 con 0x7f28e0103cf0 2026-03-10T07:48:17.629 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:17.627+0000 7f28e48fc700 1 -- 192.168.123.105:0/3029648700 <== mon.0 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7f28d008db10 con 0x7f28e0103cf0 2026-03-10T07:48:17.629 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:17.627+0000 7f28deffd700 1 --2- 192.168.123.105:0/3029648700 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f28cc06c4e0 0x7f28cc06e9a0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:48:17.629 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:17.627+0000 7f28deffd700 1 --2- 192.168.123.105:0/3029648700 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f28cc06c4e0 0x7f28cc06e9a0 secure :-1 s=READY pgs=92 cs=0 l=1 rev1=1 crypto rx=0x7f28c8005950 tx=0x7f28c80058e0 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:48:17.668 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:48:17 vm08 ceph-mon[59917]: from='client.? 192.168.123.105:0/1307258525' entity='client.admin' cmd=[{"prefix": "osd last-stat-seq", "id": 4}]: dispatch 2026-03-10T07:48:17.668 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:48:17 vm08 ceph-mon[59917]: from='client.? 192.168.123.105:0/2869833025' entity='client.admin' cmd=[{"prefix": "osd last-stat-seq", "id": 3}]: dispatch 2026-03-10T07:48:17.668 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:48:17 vm08 ceph-mon[59917]: from='client.? 
192.168.123.105:0/3828877738' entity='client.admin' cmd=[{"prefix": "osd last-stat-seq", "id": 0}]: dispatch 2026-03-10T07:48:17.729 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:17.727+0000 7f28d67fc700 1 -- 192.168.123.105:0/3029648700 --> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] -- mgr_command(tid 0: {"prefix": "pg dump", "target": ["mon-mgr", ""], "format": "json"}) v1 -- 0x7f28c4000bc0 con 0x7f28cc06c4e0 2026-03-10T07:48:17.732 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:17.730+0000 7f28e48fc700 1 -- 192.168.123.105:0/3029648700 <== mgr.14223 v2:192.168.123.105:6800/2 1 ==== mgr_command_reply(tid 0: 0 dumped all) v1 ==== 18+0+19108 (secure 0 0 0) 0x7f28c4000bc0 con 0x7f28cc06c4e0 2026-03-10T07:48:17.732 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-10T07:48:17.734 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:17.732+0000 7f28d67fc700 1 -- 192.168.123.105:0/3029648700 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f28cc06c4e0 msgr2=0x7f28cc06e9a0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:48:17.734 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:17.732+0000 7f28d67fc700 1 --2- 192.168.123.105:0/3029648700 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f28cc06c4e0 0x7f28cc06e9a0 secure :-1 s=READY pgs=92 cs=0 l=1 rev1=1 crypto rx=0x7f28c8005950 tx=0x7f28c80058e0 comp rx=0 tx=0).stop 2026-03-10T07:48:17.734 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:17.732+0000 7f28d67fc700 1 -- 192.168.123.105:0/3029648700 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f28e0103cf0 msgr2=0x7f28e0199300 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:48:17.734 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:17.732+0000 7f28d67fc700 1 --2- 192.168.123.105:0/3029648700 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f28e0103cf0 0x7f28e0199300 secure :-1 s=READY 
pgs=243 cs=0 l=1 rev1=1 crypto rx=0x7f28d0009fd0 tx=0x7f28d0004970 comp rx=0 tx=0).stop 2026-03-10T07:48:17.734 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:17.733+0000 7f28d67fc700 1 -- 192.168.123.105:0/3029648700 shutdown_connections 2026-03-10T07:48:17.734 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:17.733+0000 7f28d67fc700 1 --2- 192.168.123.105:0/3029648700 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f28cc06c4e0 0x7f28cc06e9a0 unknown :-1 s=CLOSED pgs=92 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:48:17.734 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:17.733+0000 7f28d67fc700 1 --2- 192.168.123.105:0/3029648700 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f28e0103340 0x7f28e0198dc0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:48:17.734 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:17.733+0000 7f28d67fc700 1 --2- 192.168.123.105:0/3029648700 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f28e0103cf0 0x7f28e0199300 unknown :-1 s=CLOSED pgs=243 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:48:17.735 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:17.733+0000 7f28d67fc700 1 -- 192.168.123.105:0/3029648700 >> 192.168.123.105:0/3029648700 conn(0x7f28e00feb90 msgr2=0x7f28e01075b0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:48:17.735 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:17.733+0000 7f28d67fc700 1 -- 192.168.123.105:0/3029648700 shutdown_connections 2026-03-10T07:48:17.735 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:17.733+0000 7f28d67fc700 1 -- 192.168.123.105:0/3029648700 wait complete. 
2026-03-10T07:48:17.736 INFO:teuthology.orchestra.run.vm05.stderr:dumped all 2026-03-10T07:48:17.778 INFO:teuthology.orchestra.run.vm05.stdout:{"pg_ready":true,"pg_map":{"version":68,"stamp":"2026-03-10T07:48:17.286521+0000","last_osdmap_epoch":0,"last_pg_scan":0,"pg_stats_sum":{"stat_sum":{"num_bytes":459280,"num_objects":2,"num_object_clones":0,"num_object_copies":6,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":2,"num_whiteouts":0,"num_read":96,"num_read_kb":82,"num_write":113,"num_write_kb":1372,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"store_stats":{"total":0,"available":0,"internally_reserved":0,"allocated":0,"data_stored":0,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":0},"log_size":76,"ondisk_log_size":76,"up":3,"acting":3,"num_store_stats":0},"osd_stats_sum":{"up_from":0,"seq":0,"num_pgs":3,"num_osds":6,"num_per_pool_osds":6,"num_per_pool_omap_osds":3,"kb":125804544,"kb_used":163644,"kb_used_data":3084,"kb_used_omap":0,"kb_used_meta":160512,"kb_avail":125640900,"statfs":{"total":128823853056,"available":128656281600,"internally_reserved":0,"allocated":3158016,"data_stored":2043408,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":164364288},"hb_peers":[],"snap_trim_queue_len":0,"num_snap_trimming":0
,"num_shards_repaired":0,"op_queue_age_hist":{"histogram":[],"upper_bound":1},"perf_stat":{"commit_latency_ms":0,"apply_latency_ms":0,"commit_latency_ns":0,"apply_latency_ns":0},"alerts":[],"network_ping_times":[]},"pg_stats_delta":{"stat_sum":{"num_bytes":0,"num_objects":0,"num_object_clones":0,"num_object_copies":0,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":0,"num_whiteouts":0,"num_read":0,"num_read_kb":0,"num_write":0,"num_write_kb":0,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"store_stats":{"total":0,"available":0,"internally_reserved":0,"allocated":0,"data_stored":0,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":0},"log_size":0,"ondisk_log_size":0,"up":0,"acting":0,"num_store_stats":0,"stamp_delta":"10.428763"},"pg_stats":[{"pgid":"1.0","version":"20'76","reported_seq":138,"reported_epoch":32,"state":"active+clean","last_fresh":"2026-03-10T07:48:05.851521+0000","last_change":"2026-03-10T07:47:55.035052+0000","last_active":"2026-03-10T07:48:05.851521+0000","last_peered":"2026-03-10T07:48:05.851521+0000","last_clean":"2026-03-10T07:48:05.851521+0000","last_became_active":"2026-03-10T07:47:55.034906+0000","last_became_peered":"2026-03-10T07:47:55.034906+0000","last_unstale":"2026-03-10T07:48:05.851521+0000","last_undegraded":"2026-03-10T07:48:05.851521+0000","last_fullsiz
ed":"2026-03-10T07:48:05.851521+0000","mapping_epoch":27,"log_start":"0'0","ondisk_log_start":"0'0","created":18,"last_epoch_clean":28,"parent":"0.0","parent_split_bits":0,"last_scrub":"0'0","last_scrub_stamp":"2026-03-10T07:47:38.098498+0000","last_deep_scrub":"0'0","last_deep_scrub_stamp":"2026-03-10T07:47:38.098498+0000","last_clean_scrub_stamp":"2026-03-10T07:47:38.098498+0000","objects_scrubbed":0,"log_size":76,"log_dups_size":0,"ondisk_log_size":76,"stats_invalid":false,"dirty_stats_invalid":false,"omap_stats_invalid":false,"hitset_stats_invalid":false,"hitset_bytes_stats_invalid":false,"pin_stats_invalid":false,"manifest_stats_invalid":false,"snaptrimq_len":0,"last_scrub_duration":0,"scrub_schedule":"periodic scrub scheduled @ 2026-03-11T14:39:24.759973+0000","scrub_duration":0,"objects_trimmed":0,"snaptrim_duration":0,"stat_sum":{"num_bytes":459280,"num_objects":2,"num_object_clones":0,"num_object_copies":6,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":2,"num_whiteouts":0,"num_read":96,"num_read_kb":82,"num_write":113,"num_write_kb":1372,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"up":[3,0,1],"acting":[3,0,1],"avail_no_missing":[],"object_location_counts":[],"blocked_by":[],"up_primary":3,"acting_primary":3,"purged_snaps":[]}],"pool_stats":[{"poolid":1,"num_pg":1,"stat_sum":{"num_bytes":459280,"num_objects":2,"num_object_clones":0,"num_ob
ject_copies":6,"num_objects_missing_on_primary":0,"num_objects_missing":0,"num_objects_degraded":0,"num_objects_misplaced":0,"num_objects_unfound":0,"num_objects_dirty":2,"num_whiteouts":0,"num_read":96,"num_read_kb":82,"num_write":113,"num_write_kb":1372,"num_scrub_errors":0,"num_shallow_scrub_errors":0,"num_deep_scrub_errors":0,"num_objects_recovered":0,"num_bytes_recovered":0,"num_keys_recovered":0,"num_objects_omap":0,"num_objects_hit_set_archive":0,"num_bytes_hit_set_archive":0,"num_flush":0,"num_flush_kb":0,"num_evict":0,"num_evict_kb":0,"num_promote":0,"num_flush_mode_high":0,"num_flush_mode_low":0,"num_evict_mode_some":0,"num_evict_mode_full":0,"num_objects_pinned":0,"num_legacy_snapsets":0,"num_large_omap_objects":0,"num_objects_manifest":0,"num_omap_bytes":0,"num_omap_keys":0,"num_objects_repaired":0},"store_stats":{"total":0,"available":0,"internally_reserved":0,"allocated":1388544,"data_stored":1377840,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":0},"log_size":76,"ondisk_log_size":76,"up":3,"acting":3,"num_store_stats":4}],"osd_stats":[{"osd":5,"up_from":32,"seq":137438953476,"num_pgs":0,"num_osds":1,"num_per_pool_osds":1,"num_per_pool_omap_osds":0,"kb":20967424,"kb_used":27048,"kb_used_data":288,"kb_used_omap":0,"kb_used_meta":26752,"kb_avail":20940376,"statfs":{"total":21470642176,"available":21442945024,"internally_reserved":0,"allocated":294912,"data_stored":110928,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":27394048},"hb_peers":[0,1,2,3,4],"snap_trim_queue_len":0,"num_snap_trimming":0,"num_shards_repaired":0,"op_queue_age_hist":{"histogram":[],"upper_bound":1},"perf_stat":{"commit_latency_ms":0,"apply_latency_ms":0,"commit_latency_ns":0,"apply_latency_ns":0},"alerts":[],"network_ping_times":[{"osd":0,"last update":"Thu Jan 1 00:00:00 
1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.44600000000000001}]},{"osd":1,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.56699999999999995}]},{"osd":2,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.65400000000000003}]},{"osd":3,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.40500000000000003}]},{"osd":4,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.376}]}]},{"osd":4,"up_from":28,"seq":120259084294,"num_pgs":0,"num_osds":1,"num_per_pool_osds":1,"num_per_pool_omap_osds":0,"kb":20967424,"kb_used":27048,"kb_used_data":288,"kb_used_omap":0,"kb_used_meta":26752,"kb_avail":20940376,"statfs":{"total":21470642176,"available":21442945024,"internally_reserved":0,"allocated":294912,"data_stored":110928,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":27394048},"hb_peers":[0,1,2,3,5],"snap_trim_queue_len":0,"num_snap_trimming":0,"num_shards_repaired":0,"op_queue_age_hist":{"histogram":[],"upper_bound":1},"perf_stat":{"commit_latency_ms":0,"apply_latency_ms":0,"commit_latency_ns":0,"apply_latency_ns":0},"alerts":[],"network_ping_times":[{"osd":0,"last update":"Thu Jan 1 00:00:00 
1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.41199999999999998}]},{"osd":1,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.39600000000000002}]},{"osd":2,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.57499999999999996}]},{"osd":3,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.56499999999999995}]},{"osd":5,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.47699999999999998}]}]},{"osd":3,"up_from":23,"seq":98784247816,"num_pgs":1,"num_osds":1,"num_per_pool_osds":1,"num_per_pool_omap_osds":1,"kb":20967424,"kb_used":27500,"kb_used_data":740,"kb_used_omap":0,"kb_used_meta":26752,"kb_avail":20939924,"statfs":{"total":21470642176,"available":21442482176,"internally_reserved":0,"allocated":757760,"data_stored":570208,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":27394048},"hb_peers":[0,1,2,4,5],"snap_trim_queue_len":0,"num_snap_trimming":0,"num_shards_repaired":0,"op_queue_age_hist":{"histogram":[],"upper_bound":1},"perf_stat":{"commit_latency_ms":0,"apply_latency_ms":0,"commit_latency_ns":0,"apply_latency_ns":0},"alerts":[],"network_ping_times":[{"osd":0,"last update":"Thu Jan 1 00:00:00 
1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.441}]},{"osd":1,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.36499999999999999}]},{"osd":2,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.54500000000000004}]},{"osd":4,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.378}]},{"osd":5,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.214}]}]},{"osd":2,"up_from":17,"seq":73014444041,"num_pgs":0,"num_osds":1,"num_per_pool_osds":1,"num_per_pool_omap_osds":0,"kb":20967424,"kb_used":27048,"kb_used_data":288,"kb_used_omap":0,"kb_used_meta":26752,"kb_avail":20940376,"statfs":{"total":21470642176,"available":21442945024,"internally_reserved":0,"allocated":294912,"data_stored":110928,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":27394048},"hb_peers":[0,1,3,4,5],"snap_trim_queue_len":0,"num_snap_trimming":0,"num_shards_repaired":0,"op_queue_age_hist":{"histogram":[],"upper_bound":1},"perf_stat":{"commit_latency_ms":0,"apply_latency_ms":0,"commit_latency_ns":0,"apply_latency_ns":0},"alerts":[],"network_ping_times":[{"osd":0,"last update":"Thu Jan 1 00:00:00 
1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.52700000000000002}]},{"osd":1,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.34200000000000003}]},{"osd":3,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.54100000000000004}]},{"osd":4,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.65900000000000003}]},{"osd":5,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.54900000000000004}]}]},{"osd":0,"up_from":9,"seq":38654705677,"num_pgs":1,"num_osds":1,"num_per_pool_osds":1,"num_per_pool_omap_osds":1,"kb":20967424,"kb_used":27500,"kb_used_data":740,"kb_used_omap":0,"kb_used_meta":26752,"kb_avail":20939924,"statfs":{"total":21470642176,"available":21442482176,"internally_reserved":0,"allocated":757760,"data_stored":570208,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":27394048},"hb_peers":[1,2,3,4,5],"snap_trim_queue_len":0,"num_snap_trimming":0,"num_shards_repaired":0,"op_queue_age_hist":{"histogram":[],"upper_bound":1},"perf_stat":{"commit_latency_ms":0,"apply_latency_ms":0,"commit_latency_ns":0,"apply_latency_ns":0},"alerts":[],"network_ping_times":[{"osd":1,"last update":"Thu Jan 1 00:00:00 
1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.66200000000000003}]},{"osd":2,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.59799999999999998}]},{"osd":3,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.83899999999999997}]},{"osd":4,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.70799999999999996}]},{"osd":5,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.78400000000000003}]}]},{"osd":1,"up_from":13,"seq":55834574859,"num_pgs":1,"num_osds":1,"num_per_pool_osds":1,"num_per_pool_omap_osds":1,"kb":20967424,"kb_used":27500,"kb_used_data":740,"kb_used_omap":0,"kb_used_meta":26752,"kb_avail":20939924,"statfs":{"total":21470642176,"available":21442482176,"internally_reserved":0,"allocated":757760,"data_stored":570208,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":27394048},"hb_peers":[0,2,3,4,5],"snap_trim_queue_len":0,"num_snap_trimming":0,"num_shards_repaired":0,"op_queue_age_hist":{"histogram":[],"upper_bound":1},"perf_stat":{"commit_latency_ms":0,"apply_latency_ms":0,"commit_latency_ns":0,"apply_latency_ns":0},"alerts":[],"network_ping_times":[{"osd":0,"last update":"Thu Jan 1 00:00:00 
1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.42399999999999999}]},{"osd":2,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.35299999999999998}]},{"osd":3,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.50800000000000001}]},{"osd":4,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.63}]},{"osd":5,"last update":"Thu Jan 1 00:00:00 1970","interfaces":[{"interface":"back","average":{"1min":0,"5min":0,"15min":0},"min":{"1min":0,"5min":0,"15min":0},"max":{"1min":0,"5min":0,"15min":0},"last":0.499}]}]}],"pool_statfs":[{"poolid":1,"osd":0,"total":0,"available":0,"internally_reserved":0,"allocated":462848,"data_stored":459280,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":0},{"poolid":1,"osd":1,"total":0,"available":0,"internally_reserved":0,"allocated":462848,"data_stored":459280,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":0},{"poolid":1,"osd":2,"total":0,"available":0,"internally_reserved":0,"allocated":0,"data_stored":0,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":0},{"poolid":1,"osd":3,"total":0,"available":0,"internally_reserved":0,"allocated":462848,"data_stored":459280,"data_compressed":0,"data_compressed_allocated":0,"data_compressed_original":0,"omap_allocated":0,"internal_metadata":0}]}} 
2026-03-10T07:48:17.778 INFO:tasks.cephadm.ceph_manager.ceph:clean! 2026-03-10T07:48:17.778 INFO:tasks.ceph:Waiting until ceph cluster ceph is healthy... 2026-03-10T07:48:17.779 INFO:tasks.cephadm.ceph_manager.ceph:wait_until_healthy 2026-03-10T07:48:17.779 DEBUG:teuthology.orchestra.run.vm05:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell --fsid 12e9780e-1c55-11f1-8896-79f7c2e9b508 -- ceph health --format=json 2026-03-10T07:48:17.944 INFO:teuthology.orchestra.run.vm05.stderr:Inferring config /var/lib/ceph/12e9780e-1c55-11f1-8896-79f7c2e9b508/mon.vm05/config 2026-03-10T07:48:18.182 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:18.180+0000 7f0100467700 1 -- 192.168.123.105:0/2342129015 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f00f81020e0 msgr2=0x7f00f81024c0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:48:18.182 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:18.180+0000 7f0100467700 1 --2- 192.168.123.105:0/2342129015 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f00f81020e0 0x7f00f81024c0 secure :-1 s=READY pgs=244 cs=0 l=1 rev1=1 crypto rx=0x7f00f4009b00 tx=0x7f00f4009e10 comp rx=0 tx=0).stop 2026-03-10T07:48:18.183 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:18.181+0000 7f0100467700 1 -- 192.168.123.105:0/2342129015 shutdown_connections 2026-03-10T07:48:18.183 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:18.181+0000 7f0100467700 1 --2- 192.168.123.105:0/2342129015 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f00f8102a00 0x7f00f810aef0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:48:18.183 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:18.181+0000 7f0100467700 1 --2- 192.168.123.105:0/2342129015 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f00f81020e0 0x7f00f81024c0 unknown :-1 s=CLOSED pgs=244 cs=0 l=1 rev1=1 
crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:48:18.183 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:18.181+0000 7f0100467700 1 -- 192.168.123.105:0/2342129015 >> 192.168.123.105:0/2342129015 conn(0x7f00f80fb830 msgr2=0x7f00f80fdc50 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:48:18.183 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:18.181+0000 7f0100467700 1 -- 192.168.123.105:0/2342129015 shutdown_connections 2026-03-10T07:48:18.183 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:18.181+0000 7f0100467700 1 -- 192.168.123.105:0/2342129015 wait complete. 2026-03-10T07:48:18.183 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:18.182+0000 7f0100467700 1 Processor -- start 2026-03-10T07:48:18.184 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:18.182+0000 7f0100467700 1 -- start start 2026-03-10T07:48:18.184 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:18.182+0000 7f0100467700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f00f81020e0 0x7f00f819ca00 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:48:18.184 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:18.182+0000 7f0100467700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f00f8102a00 0x7f00f819cf40 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:48:18.184 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:18.182+0000 7f00fe203700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f00f81020e0 0x7f00f819ca00 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:48:18.184 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:18.182+0000 7f00fe203700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f00f81020e0 0x7f00f819ca00 
unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.105:38742/0 (socket says 192.168.123.105:38742) 2026-03-10T07:48:18.184 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:18.182+0000 7f00fe203700 1 -- 192.168.123.105:0/956398141 learned_addr learned my addr 192.168.123.105:0/956398141 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T07:48:18.185 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:18.182+0000 7f0100467700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f00f819d5d0 con 0x7f00f81020e0 2026-03-10T07:48:18.185 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:18.182+0000 7f0100467700 1 -- 192.168.123.105:0/956398141 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f00f8196a80 con 0x7f00f8102a00 2026-03-10T07:48:18.185 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:18.183+0000 7f00fda02700 1 --2- 192.168.123.105:0/956398141 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f00f8102a00 0x7f00f819cf40 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:48:18.185 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:18.183+0000 7f00fe203700 1 -- 192.168.123.105:0/956398141 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f00f8102a00 msgr2=0x7f00f819cf40 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:48:18.185 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:18.183+0000 7f00fe203700 1 --2- 192.168.123.105:0/956398141 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f00f8102a00 0x7f00f819cf40 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:48:18.185 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:18.183+0000 7f00fe203700 1 -- 192.168.123.105:0/956398141 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f00f40097e0 con 0x7f00f81020e0 2026-03-10T07:48:18.185 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:18.183+0000 7f00fda02700 1 --2- 192.168.123.105:0/956398141 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f00f8102a00 0x7f00f819cf40 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_auth_done state changed! 2026-03-10T07:48:18.185 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:18.184+0000 7f00fe203700 1 --2- 192.168.123.105:0/956398141 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f00f81020e0 0x7f00f819ca00 secure :-1 s=READY pgs=245 cs=0 l=1 rev1=1 crypto rx=0x7f00f4004900 tx=0x7f00f4004930 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:48:18.187 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:18.184+0000 7f00ef7fe700 1 -- 192.168.123.105:0/956398141 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f00f401d070 con 0x7f00f81020e0 2026-03-10T07:48:18.187 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:18.184+0000 7f00ef7fe700 1 -- 192.168.123.105:0/956398141 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f00f400bc50 con 0x7f00f81020e0 2026-03-10T07:48:18.187 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:18.184+0000 7f00ef7fe700 1 -- 192.168.123.105:0/956398141 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f00f400f700 con 0x7f00f81020e0 2026-03-10T07:48:18.187 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:18.184+0000 7f0100467700 1 -- 192.168.123.105:0/956398141 --> 
[v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f00f8196d00 con 0x7f00f81020e0 2026-03-10T07:48:18.188 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:18.184+0000 7f0100467700 1 -- 192.168.123.105:0/956398141 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f00f81971f0 con 0x7f00f81020e0 2026-03-10T07:48:18.188 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:18.185+0000 7f00ef7fe700 1 -- 192.168.123.105:0/956398141 <== mon.0 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 19) v1 ==== 90308+0+0 (secure 0 0 0) 0x7f00f400f860 con 0x7f00f81020e0 2026-03-10T07:48:18.188 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:18.185+0000 7f00ef7fe700 1 --2- 192.168.123.105:0/956398141 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f00e406c5b0 0x7f00e406ea70 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:48:18.188 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:18.186+0000 7f00ef7fe700 1 -- 192.168.123.105:0/956398141 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(33..33 src has 1..33) v4 ==== 4446+0+0 (secure 0 0 0) 0x7f00f408dad0 con 0x7f00f81020e0 2026-03-10T07:48:18.188 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:18.186+0000 7f00fda02700 1 --2- 192.168.123.105:0/956398141 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f00e406c5b0 0x7f00e406ea70 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:48:18.188 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:18.186+0000 7f0100467700 1 -- 192.168.123.105:0/956398141 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f00f81085f0 con 0x7f00f81020e0 2026-03-10T07:48:18.191 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:18.189+0000 7f00ef7fe700 1 -- 192.168.123.105:0/956398141 <== mon.0 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7f00f405c250 con 0x7f00f81020e0 2026-03-10T07:48:18.191 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:18.189+0000 7f00fda02700 1 --2- 192.168.123.105:0/956398141 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f00e406c5b0 0x7f00e406ea70 secure :-1 s=READY pgs=93 cs=0 l=1 rev1=1 crypto rx=0x7f00f80fcf70 tx=0x7f00e8009380 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:48:18.256 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:48:18 vm05 ceph-mon[50387]: from='client.24275 -' entity='client.admin' cmd=[{"prefix": "pg dump", "target": ["mon-mgr", ""], "format": "json"}]: dispatch 2026-03-10T07:48:18.256 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:48:18 vm05 ceph-mon[50387]: pgmap v68: 1 pgs: 1 active+clean; 449 KiB data, 160 MiB used, 120 GiB / 120 GiB avail 2026-03-10T07:48:18.321 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:18.319+0000 7f0100467700 1 -- 192.168.123.105:0/956398141 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "health", "format": "json"} v 0) v1 -- 0x7f00f8068a10 con 0x7f00f81020e0 2026-03-10T07:48:18.322 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:18.320+0000 7f00ef7fe700 1 -- 192.168.123.105:0/956398141 <== mon.0 v2:192.168.123.105:3300/0 7 ==== mon_command_ack([{"prefix": "health", "format": "json"}]=0 v0) v1 ==== 72+0+46 (secure 0 0 0) 0x7f00f4027070 con 0x7f00f81020e0 2026-03-10T07:48:18.322 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-10T07:48:18.322 INFO:teuthology.orchestra.run.vm05.stdout:{"status":"HEALTH_OK","checks":{},"mutes":[]} 2026-03-10T07:48:18.324 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:18.322+0000 7f0100467700 1 -- 192.168.123.105:0/956398141 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f00e406c5b0 msgr2=0x7f00e406ea70 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:48:18.324 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:18.322+0000 7f0100467700 1 --2- 192.168.123.105:0/956398141 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f00e406c5b0 0x7f00e406ea70 secure :-1 s=READY pgs=93 cs=0 l=1 rev1=1 crypto rx=0x7f00f80fcf70 tx=0x7f00e8009380 comp rx=0 tx=0).stop 2026-03-10T07:48:18.324 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:18.323+0000 7f0100467700 1 -- 192.168.123.105:0/956398141 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f00f81020e0 msgr2=0x7f00f819ca00 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:48:18.324 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:18.323+0000 7f0100467700 1 --2- 192.168.123.105:0/956398141 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f00f81020e0 0x7f00f819ca00 secure :-1 s=READY pgs=245 cs=0 l=1 rev1=1 crypto rx=0x7f00f4004900 tx=0x7f00f4004930 comp rx=0 tx=0).stop 2026-03-10T07:48:18.324 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:18.323+0000 7f0100467700 1 -- 192.168.123.105:0/956398141 shutdown_connections 2026-03-10T07:48:18.325 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:18.323+0000 7f0100467700 1 --2- 192.168.123.105:0/956398141 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f00e406c5b0 0x7f00e406ea70 unknown :-1 s=CLOSED pgs=93 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:48:18.325 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:18.323+0000 7f0100467700 1 --2- 192.168.123.105:0/956398141 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f00f81020e0 0x7f00f819ca00 unknown :-1 s=CLOSED pgs=245 cs=0 
l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:48:18.325 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:18.323+0000 7f0100467700 1 --2- 192.168.123.105:0/956398141 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f00f8102a00 0x7f00f819cf40 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:48:18.325 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:18.323+0000 7f0100467700 1 -- 192.168.123.105:0/956398141 >> 192.168.123.105:0/956398141 conn(0x7f00f80fb830 msgr2=0x7f00f8105730 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:48:18.325 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:18.323+0000 7f0100467700 1 -- 192.168.123.105:0/956398141 shutdown_connections 2026-03-10T07:48:18.325 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:18.323+0000 7f0100467700 1 -- 192.168.123.105:0/956398141 wait complete. 2026-03-10T07:48:18.386 INFO:tasks.cephadm.ceph_manager.ceph:wait_until_healthy done 2026-03-10T07:48:18.387 INFO:tasks.cephadm:Setup complete, yielding 2026-03-10T07:48:18.387 INFO:teuthology.run_tasks:Running task print... 2026-03-10T07:48:18.388 INFO:teuthology.task.print:**** done end installing v18.2.1 cephadm ... 2026-03-10T07:48:18.388 INFO:teuthology.run_tasks:Running task cephadm.shell... 
2026-03-10T07:48:18.391 INFO:tasks.cephadm:Running commands on role host.a host ubuntu@vm05.local 2026-03-10T07:48:18.391 DEBUG:teuthology.orchestra.run.vm05:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 12e9780e-1c55-11f1-8896-79f7c2e9b508 -- bash -c 'ceph config set mgr mgr/cephadm/use_repo_digest true --force' 2026-03-10T07:48:18.535 INFO:teuthology.orchestra.run.vm05.stderr:Inferring config /var/lib/ceph/12e9780e-1c55-11f1-8896-79f7c2e9b508/mon.vm05/config 2026-03-10T07:48:18.560 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:48:18 vm05 ceph-mon[50387]: from='client.14466 -' entity='client.admin' cmd=[{"prefix": "pg dump", "target": ["mon-mgr", ""], "format": "json"}]: dispatch 2026-03-10T07:48:18.582 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:48:18 vm08 ceph-mon[59917]: from='client.24275 -' entity='client.admin' cmd=[{"prefix": "pg dump", "target": ["mon-mgr", ""], "format": "json"}]: dispatch 2026-03-10T07:48:18.582 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:48:18 vm08 ceph-mon[59917]: pgmap v68: 1 pgs: 1 active+clean; 449 KiB data, 160 MiB used, 120 GiB / 120 GiB avail 2026-03-10T07:48:18.583 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:48:18 vm08 ceph-mon[59917]: from='client.14466 -' entity='client.admin' cmd=[{"prefix": "pg dump", "target": ["mon-mgr", ""], "format": "json"}]: dispatch 2026-03-10T07:48:18.774 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:18.772+0000 7f835aeb4700 1 -- 192.168.123.105:0/111138822 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f8354074d40 msgr2=0x7f83540731a0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:48:18.774 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:18.772+0000 7f835aeb4700 1 --2- 192.168.123.105:0/111138822 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f8354074d40 0x7f83540731a0 secure :-1 s=READY 
pgs=44 cs=0 l=1 rev1=1 crypto rx=0x7f8344009b00 tx=0x7f8344009e10 comp rx=0 tx=0).stop 2026-03-10T07:48:18.774 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:18.773+0000 7f835aeb4700 1 -- 192.168.123.105:0/111138822 shutdown_connections 2026-03-10T07:48:18.774 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:18.773+0000 7f835aeb4700 1 --2- 192.168.123.105:0/111138822 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f8354073770 0x7f8354073bf0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:48:18.774 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:18.773+0000 7f835aeb4700 1 --2- 192.168.123.105:0/111138822 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f8354074d40 0x7f83540731a0 unknown :-1 s=CLOSED pgs=44 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:48:18.774 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:18.773+0000 7f835aeb4700 1 -- 192.168.123.105:0/111138822 >> 192.168.123.105:0/111138822 conn(0x7f83540fc4c0 msgr2=0x7f83540fe920 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:48:18.775 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:18.773+0000 7f835aeb4700 1 -- 192.168.123.105:0/111138822 shutdown_connections 2026-03-10T07:48:18.775 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:18.773+0000 7f835aeb4700 1 -- 192.168.123.105:0/111138822 wait complete. 
2026-03-10T07:48:18.775 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:18.774+0000 7f835aeb4700 1 Processor -- start 2026-03-10T07:48:18.776 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:18.774+0000 7f835aeb4700 1 -- start start 2026-03-10T07:48:18.776 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:18.774+0000 7f835aeb4700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f8354073770 0x7f8354198990 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:48:18.776 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:18.774+0000 7f835aeb4700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f8354074d40 0x7f8354198ed0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:48:18.776 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:18.774+0000 7f835aeb4700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f83541994f0 con 0x7f8354073770 2026-03-10T07:48:18.776 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:18.774+0000 7f835aeb4700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f8354199630 con 0x7f8354074d40 2026-03-10T07:48:18.776 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:18.774+0000 7f8358c50700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f8354073770 0x7f8354198990 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:48:18.776 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:18.775+0000 7f8353fff700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f8354074d40 0x7f8354198ed0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 
2026-03-10T07:48:18.776 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:18.775+0000 7f8353fff700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f8354074d40 0x7f8354198ed0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.108:3300/0 says I am v2:192.168.123.105:44058/0 (socket says 192.168.123.105:44058) 2026-03-10T07:48:18.776 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:18.775+0000 7f8353fff700 1 -- 192.168.123.105:0/4068355817 learned_addr learned my addr 192.168.123.105:0/4068355817 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T07:48:18.777 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:18.775+0000 7f8358c50700 1 -- 192.168.123.105:0/4068355817 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f8354074d40 msgr2=0x7f8354198ed0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:48:18.777 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:18.775+0000 7f8358c50700 1 --2- 192.168.123.105:0/4068355817 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f8354074d40 0x7f8354198ed0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:48:18.777 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:18.775+0000 7f8358c50700 1 -- 192.168.123.105:0/4068355817 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f8348009710 con 0x7f8354073770 2026-03-10T07:48:18.777 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:18.775+0000 7f8358c50700 1 --2- 192.168.123.105:0/4068355817 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f8354073770 0x7f8354198990 secure :-1 s=READY pgs=246 cs=0 l=1 rev1=1 crypto rx=0x7f8344009b00 tx=0x7f8344004950 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:48:18.778 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:18.776+0000 7f8351ffb700 1 -- 192.168.123.105:0/4068355817 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f834401d070 con 0x7f8354073770 2026-03-10T07:48:18.778 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:18.776+0000 7f8351ffb700 1 -- 192.168.123.105:0/4068355817 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f834400bb40 con 0x7f8354073770 2026-03-10T07:48:18.778 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:18.776+0000 7f8351ffb700 1 -- 192.168.123.105:0/4068355817 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f8344003c00 con 0x7f8354073770 2026-03-10T07:48:18.778 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:18.776+0000 7f835aeb4700 1 -- 192.168.123.105:0/4068355817 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f83440097e0 con 0x7f8354073770 2026-03-10T07:48:18.778 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:18.776+0000 7f835aeb4700 1 -- 192.168.123.105:0/4068355817 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f8354100c00 con 0x7f8354073770 2026-03-10T07:48:18.779 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:18.777+0000 7f835aeb4700 1 -- 192.168.123.105:0/4068355817 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f8354066e80 con 0x7f8354073770 2026-03-10T07:48:18.782 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:18.780+0000 7f8351ffb700 1 -- 192.168.123.105:0/4068355817 <== mon.0 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 19) v1 ==== 90308+0+0 (secure 0 0 0) 0x7f8344003d60 con 0x7f8354073770 2026-03-10T07:48:18.782 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:18.780+0000 7f8351ffb700 1 --2- 
192.168.123.105:0/4068355817 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f833c06c600 0x7f833c06eac0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:48:18.783 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:18.780+0000 7f8351ffb700 1 -- 192.168.123.105:0/4068355817 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(33..33 src has 1..33) v4 ==== 4446+0+0 (secure 0 0 0) 0x7f834408d570 con 0x7f8354073770 2026-03-10T07:48:18.783 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:18.780+0000 7f8351ffb700 1 -- 192.168.123.105:0/4068355817 <== mon.0 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7f834408d9f0 con 0x7f8354073770 2026-03-10T07:48:18.783 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:18.781+0000 7f8353fff700 1 --2- 192.168.123.105:0/4068355817 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f833c06c600 0x7f833c06eac0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:48:18.783 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:18.781+0000 7f8353fff700 1 --2- 192.168.123.105:0/4068355817 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f833c06c600 0x7f833c06eac0 secure :-1 s=READY pgs=94 cs=0 l=1 rev1=1 crypto rx=0x7f8348009fd0 tx=0x7f8348009450 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:48:18.889 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:18.886+0000 7f835aeb4700 1 -- 192.168.123.105:0/4068355817 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command([{prefix=config set, name=mgr/cephadm/use_repo_digest}] v 0) v1 -- 0x7f8354100ee0 con 0x7f8354073770 2026-03-10T07:48:18.897 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:18.896+0000 
7f8351ffb700 1 -- 192.168.123.105:0/4068355817 <== mon.0 v2:192.168.123.105:3300/0 7 ==== mon_command_ack([{prefix=config set, name=mgr/cephadm/use_repo_digest}]=0 v14) v1 ==== 143+0+0 (secure 0 0 0) 0x7f834405bc40 con 0x7f8354073770 2026-03-10T07:48:18.900 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:18.898+0000 7f835aeb4700 1 -- 192.168.123.105:0/4068355817 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f833c06c600 msgr2=0x7f833c06eac0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:48:18.900 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:18.898+0000 7f835aeb4700 1 --2- 192.168.123.105:0/4068355817 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f833c06c600 0x7f833c06eac0 secure :-1 s=READY pgs=94 cs=0 l=1 rev1=1 crypto rx=0x7f8348009fd0 tx=0x7f8348009450 comp rx=0 tx=0).stop 2026-03-10T07:48:18.900 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:18.898+0000 7f835aeb4700 1 -- 192.168.123.105:0/4068355817 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f8354073770 msgr2=0x7f8354198990 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:48:18.900 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:18.898+0000 7f835aeb4700 1 --2- 192.168.123.105:0/4068355817 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f8354073770 0x7f8354198990 secure :-1 s=READY pgs=246 cs=0 l=1 rev1=1 crypto rx=0x7f8344009b00 tx=0x7f8344004950 comp rx=0 tx=0).stop 2026-03-10T07:48:18.901 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:18.899+0000 7f835aeb4700 1 -- 192.168.123.105:0/4068355817 shutdown_connections 2026-03-10T07:48:18.901 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:18.899+0000 7f835aeb4700 1 --2- 192.168.123.105:0/4068355817 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f833c06c600 0x7f833c06eac0 unknown :-1 s=CLOSED pgs=94 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0
tx=0).stop 2026-03-10T07:48:18.901 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:18.899+0000 7f835aeb4700 1 --2- 192.168.123.105:0/4068355817 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f8354073770 0x7f8354198990 unknown :-1 s=CLOSED pgs=246 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:48:18.901 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:18.899+0000 7f835aeb4700 1 --2- 192.168.123.105:0/4068355817 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f8354074d40 0x7f8354198ed0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:48:18.901 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:18.899+0000 7f835aeb4700 1 -- 192.168.123.105:0/4068355817 >> 192.168.123.105:0/4068355817 conn(0x7f83540fc4c0 msgr2=0x7f8354106da0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:48:18.901 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:18.899+0000 7f835aeb4700 1 -- 192.168.123.105:0/4068355817 shutdown_connections 2026-03-10T07:48:18.901 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:18.899+0000 7f835aeb4700 1 -- 192.168.123.105:0/4068355817 wait complete. 2026-03-10T07:48:18.943 INFO:teuthology.run_tasks:Running task print... 2026-03-10T07:48:18.945 INFO:teuthology.task.print:**** done cephadm.shell ceph config set mgr... 2026-03-10T07:48:18.945 INFO:teuthology.run_tasks:Running task cephadm.shell... 
2026-03-10T07:48:18.947 INFO:tasks.cephadm:Running commands on role host.a host ubuntu@vm05.local 2026-03-10T07:48:18.947 DEBUG:teuthology.orchestra.run.vm05:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 12e9780e-1c55-11f1-8896-79f7c2e9b508 -- bash -c 'ceph orch status' 2026-03-10T07:48:19.114 INFO:teuthology.orchestra.run.vm05.stderr:Inferring config /var/lib/ceph/12e9780e-1c55-11f1-8896-79f7c2e9b508/mon.vm05/config 2026-03-10T07:48:19.371 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:19.369+0000 7ff76e768700 1 -- 192.168.123.105:0/898168554 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff768103150 msgr2=0x7ff768103570 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:48:19.371 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:19.369+0000 7ff76e768700 1 --2- 192.168.123.105:0/898168554 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff768103150 0x7ff768103570 secure :-1 s=READY pgs=247 cs=0 l=1 rev1=1 crypto rx=0x7ff750009b50 tx=0x7ff750009e60 comp rx=0 tx=0).stop 2026-03-10T07:48:19.372 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:19.370+0000 7ff76e768700 1 -- 192.168.123.105:0/898168554 shutdown_connections 2026-03-10T07:48:19.372 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:19.370+0000 7ff76e768700 1 --2- 192.168.123.105:0/898168554 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7ff768104350 0x7ff7681047b0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:48:19.372 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:19.370+0000 7ff76e768700 1 --2- 192.168.123.105:0/898168554 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff768103150 0x7ff768103570 unknown :-1 s=CLOSED pgs=247 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:48:19.372 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:19.370+0000 7ff76e768700 1 -- 192.168.123.105:0/898168554 >> 192.168.123.105:0/898168554 conn(0x7ff7680fe6d0 msgr2=0x7ff768100b30 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:48:19.372 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:19.370+0000 7ff76e768700 1 -- 192.168.123.105:0/898168554 shutdown_connections 2026-03-10T07:48:19.372 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:19.370+0000 7ff76e768700 1 -- 192.168.123.105:0/898168554 wait complete. 2026-03-10T07:48:19.372 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:19.371+0000 7ff76e768700 1 Processor -- start 2026-03-10T07:48:19.373 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:19.371+0000 7ff76e768700 1 -- start start 2026-03-10T07:48:19.373 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:19.371+0000 7ff76e768700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7ff768104350 0x7ff768198ce0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:48:19.373 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:19.371+0000 7ff76e768700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff768199220 0x7ff76819e250 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:48:19.373 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:19.371+0000 7ff76e768700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7ff7681996a0 con 0x7ff768199220 2026-03-10T07:48:19.373 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:19.371+0000 7ff76e768700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7ff768199810 con 0x7ff768104350 2026-03-10T07:48:19.373 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:19.371+0000 7ff7677fe700 1 --2- >> 
[v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff768199220 0x7ff76819e250 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:48:19.373 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:19.371+0000 7ff7677fe700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff768199220 0x7ff76819e250 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.105:46808/0 (socket says 192.168.123.105:46808) 2026-03-10T07:48:19.373 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:19.371+0000 7ff7677fe700 1 -- 192.168.123.105:0/2355757718 learned_addr learned my addr 192.168.123.105:0/2355757718 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T07:48:19.373 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:19.372+0000 7ff7677fe700 1 -- 192.168.123.105:0/2355757718 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7ff768104350 msgr2=0x7ff768198ce0 unknown :-1 s=STATE_CONNECTING l=1).mark_down 2026-03-10T07:48:19.374 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:19.372+0000 7ff767fff700 1 --2- 192.168.123.105:0/2355757718 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7ff768104350 0x7ff768198ce0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:48:19.374 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:19.372+0000 7ff7677fe700 1 --2- 192.168.123.105:0/2355757718 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7ff768104350 0x7ff768198ce0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:48:19.374 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:19.372+0000 7ff7677fe700 1 -- 
192.168.123.105:0/2355757718 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7ff7500097e0 con 0x7ff768199220 2026-03-10T07:48:19.374 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:19.372+0000 7ff767fff700 1 --2- 192.168.123.105:0/2355757718 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7ff768104350 0x7ff768198ce0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request state changed! 2026-03-10T07:48:19.374 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:19.372+0000 7ff7677fe700 1 --2- 192.168.123.105:0/2355757718 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff768199220 0x7ff76819e250 secure :-1 s=READY pgs=248 cs=0 l=1 rev1=1 crypto rx=0x7ff75800eb10 tx=0x7ff75800eed0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:48:19.375 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:19.372+0000 7ff7657fa700 1 -- 192.168.123.105:0/2355757718 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7ff75800cca0 con 0x7ff768199220 2026-03-10T07:48:19.375 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:19.372+0000 7ff7657fa700 1 -- 192.168.123.105:0/2355757718 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7ff75800ce00 con 0x7ff768199220 2026-03-10T07:48:19.375 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:19.372+0000 7ff7657fa700 1 -- 192.168.123.105:0/2355757718 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7ff7580189c0 con 0x7ff768199220 2026-03-10T07:48:19.375 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:19.373+0000 7ff76e768700 1 -- 192.168.123.105:0/2355757718 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7ff76819e7f0 con 0x7ff768199220 
2026-03-10T07:48:19.375 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:19.373+0000 7ff76e768700 1 -- 192.168.123.105:0/2355757718 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7ff76819ed10 con 0x7ff768199220 2026-03-10T07:48:19.376 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:19.374+0000 7ff76e768700 1 -- 192.168.123.105:0/2355757718 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7ff768066e80 con 0x7ff768199220 2026-03-10T07:48:19.376 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:19.374+0000 7ff7657fa700 1 -- 192.168.123.105:0/2355757718 <== mon.0 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 19) v1 ==== 90308+0+0 (secure 0 0 0) 0x7ff758018b20 con 0x7ff768199220 2026-03-10T07:48:19.379 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:19.377+0000 7ff7657fa700 1 --2- 192.168.123.105:0/2355757718 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7ff75406c5b0 0x7ff75406ea70 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:48:19.379 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:19.377+0000 7ff7657fa700 1 -- 192.168.123.105:0/2355757718 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(33..33 src has 1..33) v4 ==== 4446+0+0 (secure 0 0 0) 0x7ff758014070 con 0x7ff768199220 2026-03-10T07:48:19.379 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:19.377+0000 7ff767fff700 1 --2- 192.168.123.105:0/2355757718 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7ff75406c5b0 0x7ff75406ea70 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:48:19.379 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:19.377+0000 7ff7657fa700 1 -- 192.168.123.105:0/2355757718 <== mon.0 v2:192.168.123.105:3300/0 6 ==== 
mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7ff758056900 con 0x7ff768199220 2026-03-10T07:48:19.379 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:19.378+0000 7ff767fff700 1 --2- 192.168.123.105:0/2355757718 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7ff75406c5b0 0x7ff75406ea70 secure :-1 s=READY pgs=95 cs=0 l=1 rev1=1 crypto rx=0x7ff76819a1c0 tx=0x7ff7500058e0 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:48:19.407 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:48:19 vm05 ceph-mon[50387]: from='client.? 192.168.123.105:0/956398141' entity='client.admin' cmd=[{"prefix": "health", "format": "json"}]: dispatch 2026-03-10T07:48:19.407 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:48:19 vm05 ceph-mon[50387]: from='client.? 192.168.123.105:0/4068355817' entity='client.admin' 2026-03-10T07:48:19.407 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:48:19 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T07:48:19.407 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:48:19 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T07:48:19.407 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:48:19 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T07:48:19.407 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:48:19 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' 2026-03-10T07:48:19.490 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:19.488+0000 7ff76e768700 1 -- 192.168.123.105:0/2355757718 --> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] -- 
mgr_command(tid 0: {"prefix": "orch status", "target": ["mon-mgr", ""]}) v1 -- 0x7ff76819eff0 con 0x7ff75406c5b0 2026-03-10T07:48:19.490 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:19.489+0000 7ff7657fa700 1 -- 192.168.123.105:0/2355757718 <== mgr.14223 v2:192.168.123.105:6800/2 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+43 (secure 0 0 0) 0x7ff76819eff0 con 0x7ff75406c5b0 2026-03-10T07:48:19.491 INFO:teuthology.orchestra.run.vm05.stdout:Backend: cephadm 2026-03-10T07:48:19.491 INFO:teuthology.orchestra.run.vm05.stdout:Available: Yes 2026-03-10T07:48:19.491 INFO:teuthology.orchestra.run.vm05.stdout:Paused: No 2026-03-10T07:48:19.493 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:19.491+0000 7ff76e768700 1 -- 192.168.123.105:0/2355757718 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7ff75406c5b0 msgr2=0x7ff75406ea70 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:48:19.493 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:19.491+0000 7ff76e768700 1 --2- 192.168.123.105:0/2355757718 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7ff75406c5b0 0x7ff75406ea70 secure :-1 s=READY pgs=95 cs=0 l=1 rev1=1 crypto rx=0x7ff76819a1c0 tx=0x7ff7500058e0 comp rx=0 tx=0).stop 2026-03-10T07:48:19.493 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:19.491+0000 7ff76e768700 1 -- 192.168.123.105:0/2355757718 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff768199220 msgr2=0x7ff76819e250 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:48:19.493 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:19.491+0000 7ff76e768700 1 --2- 192.168.123.105:0/2355757718 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff768199220 0x7ff76819e250 secure :-1 s=READY pgs=248 cs=0 l=1 rev1=1 crypto rx=0x7ff75800eb10 tx=0x7ff75800eed0 comp rx=0 tx=0).stop 2026-03-10T07:48:19.493 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:19.491+0000 7ff76e768700 1 -- 192.168.123.105:0/2355757718 shutdown_connections 2026-03-10T07:48:19.493 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:19.492+0000 7ff76e768700 1 --2- 192.168.123.105:0/2355757718 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7ff75406c5b0 0x7ff75406ea70 unknown :-1 s=CLOSED pgs=95 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:48:19.493 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:19.492+0000 7ff76e768700 1 --2- 192.168.123.105:0/2355757718 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7ff768104350 0x7ff768198ce0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:48:19.493 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:19.492+0000 7ff76e768700 1 --2- 192.168.123.105:0/2355757718 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff768199220 0x7ff76819e250 unknown :-1 s=CLOSED pgs=248 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:48:19.493 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:19.492+0000 7ff76e768700 1 -- 192.168.123.105:0/2355757718 >> 192.168.123.105:0/2355757718 conn(0x7ff7680fe6d0 msgr2=0x7ff768107580 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:48:19.493 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:19.492+0000 7ff76e768700 1 -- 192.168.123.105:0/2355757718 shutdown_connections 2026-03-10T07:48:19.494 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:19.492+0000 7ff76e768700 1 -- 192.168.123.105:0/2355757718 wait complete. 
2026-03-10T07:48:19.556 DEBUG:teuthology.orchestra.run.vm05:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 12e9780e-1c55-11f1-8896-79f7c2e9b508 -- bash -c 'ceph orch ps' 2026-03-10T07:48:19.668 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:48:19 vm08 ceph-mon[59917]: from='client.? 192.168.123.105:0/956398141' entity='client.admin' cmd=[{"prefix": "health", "format": "json"}]: dispatch 2026-03-10T07:48:19.668 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:48:19 vm08 ceph-mon[59917]: from='client.? 192.168.123.105:0/4068355817' entity='client.admin' 2026-03-10T07:48:19.668 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:48:19 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T07:48:19.668 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:48:19 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T07:48:19.668 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:48:19 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T07:48:19.668 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:48:19 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' 2026-03-10T07:48:19.697 INFO:teuthology.orchestra.run.vm05.stderr:Inferring config /var/lib/ceph/12e9780e-1c55-11f1-8896-79f7c2e9b508/mon.vm05/config 2026-03-10T07:48:19.939 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:19.937+0000 7f7e0efdc700 1 -- 192.168.123.105:0/1870690572 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f7e080737f0 msgr2=0x7f7e08073c70 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:48:19.939 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:19.937+0000 7f7e0efdc700 1 --2- 192.168.123.105:0/1870690572 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f7e080737f0 0x7f7e08073c70 secure :-1 s=READY pgs=249 cs=0 l=1 rev1=1 crypto rx=0x7f7dfc009b50 tx=0x7f7dfc009e60 comp rx=0 tx=0).stop 2026-03-10T07:48:19.939 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:19.938+0000 7f7e0efdc700 1 -- 192.168.123.105:0/1870690572 shutdown_connections 2026-03-10T07:48:19.939 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:19.938+0000 7f7e0efdc700 1 --2- 192.168.123.105:0/1870690572 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f7e080737f0 0x7f7e08073c70 unknown :-1 s=CLOSED pgs=249 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:48:19.939 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:19.938+0000 7f7e0efdc700 1 --2- 192.168.123.105:0/1870690572 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f7e08074dc0 0x7f7e08073220 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:48:19.939 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:19.938+0000 7f7e0efdc700 1 -- 192.168.123.105:0/1870690572 >> 192.168.123.105:0/1870690572 conn(0x7f7e080fc4b0 msgr2=0x7f7e080fe8f0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:48:19.940 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:19.938+0000 7f7e0efdc700 1 -- 192.168.123.105:0/1870690572 shutdown_connections 2026-03-10T07:48:19.940 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:19.938+0000 7f7e0efdc700 1 -- 192.168.123.105:0/1870690572 wait complete. 
2026-03-10T07:48:19.940 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:19.939+0000 7f7e0efdc700 1 Processor -- start 2026-03-10T07:48:19.940 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:19.939+0000 7f7e0efdc700 1 -- start start 2026-03-10T07:48:19.940 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:19.939+0000 7f7e0efdc700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f7e080737f0 0x7f7e0819ceb0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:48:19.941 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:19.939+0000 7f7e0efdc700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f7e08074dc0 0x7f7e0819d3f0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:48:19.941 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:19.939+0000 7f7e0efdc700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f7e0819da10 con 0x7f7e08074dc0 2026-03-10T07:48:19.941 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:19.939+0000 7f7e0efdc700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f7e0819db50 con 0x7f7e080737f0 2026-03-10T07:48:19.941 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:19.939+0000 7f7e07fff700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f7e08074dc0 0x7f7e0819d3f0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:48:19.941 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:19.939+0000 7f7e07fff700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f7e08074dc0 0x7f7e0819d3f0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am 
v2:192.168.123.105:46832/0 (socket says 192.168.123.105:46832) 2026-03-10T07:48:19.941 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:19.939+0000 7f7e07fff700 1 -- 192.168.123.105:0/790402791 learned_addr learned my addr 192.168.123.105:0/790402791 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T07:48:19.941 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:19.940+0000 7f7e0cd78700 1 --2- 192.168.123.105:0/790402791 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f7e080737f0 0x7f7e0819ceb0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:48:19.941 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:19.940+0000 7f7e07fff700 1 -- 192.168.123.105:0/790402791 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f7e080737f0 msgr2=0x7f7e0819ceb0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:48:19.941 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:19.940+0000 7f7e07fff700 1 --2- 192.168.123.105:0/790402791 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f7e080737f0 0x7f7e0819ceb0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:48:19.942 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:19.940+0000 7f7e07fff700 1 -- 192.168.123.105:0/790402791 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f7dfc0097e0 con 0x7f7e08074dc0 2026-03-10T07:48:19.942 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:19.940+0000 7f7e0cd78700 1 --2- 192.168.123.105:0/790402791 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f7e080737f0 0x7f7e0819ceb0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_auth_reply_more state changed! 
2026-03-10T07:48:19.942 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:19.940+0000 7f7e07fff700 1 --2- 192.168.123.105:0/790402791 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f7e08074dc0 0x7f7e0819d3f0 secure :-1 s=READY pgs=250 cs=0 l=1 rev1=1 crypto rx=0x7f7dfc006010 tx=0x7f7dfc00b920 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:48:19.943 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:19.941+0000 7f7e05ffb700 1 -- 192.168.123.105:0/790402791 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f7dfc01d070 con 0x7f7e08074dc0 2026-03-10T07:48:19.943 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:19.941+0000 7f7e05ffb700 1 -- 192.168.123.105:0/790402791 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f7dfc004500 con 0x7f7e08074dc0 2026-03-10T07:48:19.943 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:19.941+0000 7f7e05ffb700 1 -- 192.168.123.105:0/790402791 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f7dfc00f670 con 0x7f7e08074dc0 2026-03-10T07:48:19.943 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:19.941+0000 7f7e0efdc700 1 -- 192.168.123.105:0/790402791 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f7e081a25a0 con 0x7f7e08074dc0 2026-03-10T07:48:19.943 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:19.941+0000 7f7e0efdc700 1 -- 192.168.123.105:0/790402791 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f7e081a2a30 con 0x7f7e08074dc0 2026-03-10T07:48:19.946 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:19.942+0000 7f7e05ffb700 1 -- 192.168.123.105:0/790402791 <== mon.0 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 19) v1 ==== 90308+0+0 (secure 0 0 0) 0x7f7dfc003e90 con 
0x7f7e08074dc0 2026-03-10T07:48:19.947 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:19.942+0000 7f7e0efdc700 1 -- 192.168.123.105:0/790402791 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f7e08066e80 con 0x7f7e08074dc0 2026-03-10T07:48:19.947 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:19.944+0000 7f7e05ffb700 1 --2- 192.168.123.105:0/790402791 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f7df806c490 0x7f7df806e950 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:48:19.947 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:19.944+0000 7f7e05ffb700 1 -- 192.168.123.105:0/790402791 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(33..33 src has 1..33) v4 ==== 4446+0+0 (secure 0 0 0) 0x7f7dfc08cd10 con 0x7f7e08074dc0 2026-03-10T07:48:19.947 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:19.945+0000 7f7e0cd78700 1 --2- 192.168.123.105:0/790402791 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f7df806c490 0x7f7df806e950 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:48:19.947 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:19.945+0000 7f7e05ffb700 1 -- 192.168.123.105:0/790402791 <== mon.0 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7f7dfc05b480 con 0x7f7e08074dc0 2026-03-10T07:48:19.947 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:19.945+0000 7f7e0cd78700 1 --2- 192.168.123.105:0/790402791 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f7df806c490 0x7f7df806e950 secure :-1 s=READY pgs=96 cs=0 l=1 rev1=1 crypto rx=0x7f7e080751a0 tx=0x7f7df4008040 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 
out_seq=0 2026-03-10T07:48:20.056 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:20.053+0000 7f7e0efdc700 1 -- 192.168.123.105:0/790402791 --> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] -- mgr_command(tid 0: {"prefix": "orch ps", "target": ["mon-mgr", ""]}) v1 -- 0x7f7e08103fc0 con 0x7f7df806c490 2026-03-10T07:48:20.065 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:20.063+0000 7f7e05ffb700 1 -- 192.168.123.105:0/790402791 <== mgr.14223 v2:192.168.123.105:6800/2 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+2640 (secure 0 0 0) 0x7f7e08103fc0 con 0x7f7df806c490 2026-03-10T07:48:20.065 INFO:teuthology.orchestra.run.vm05.stdout:NAME HOST PORTS STATUS REFRESHED AGE MEM USE MEM LIM VERSION IMAGE ID CONTAINER ID 2026-03-10T07:48:20.065 INFO:teuthology.orchestra.run.vm05.stdout:alertmanager.vm05 vm05 *:9093,9094 running (75s) 43s ago 117s 22.6M - 0.25.0 c8568f914cd2 f87529717116 2026-03-10T07:48:20.065 INFO:teuthology.orchestra.run.vm05.stdout:ceph-exporter.vm05 vm05 running (2m) 43s ago 2m 7838k - 18.2.1 5be31c24972a 26c4db858175 2026-03-10T07:48:20.065 INFO:teuthology.orchestra.run.vm05.stdout:ceph-exporter.vm08 vm08 running (91s) 19s ago 91s 8014k - 18.2.1 5be31c24972a 209e2398a09c 2026-03-10T07:48:20.065 INFO:teuthology.orchestra.run.vm05.stdout:crash.vm05 vm05 running (2m) 43s ago 2m 7419k - 18.2.1 5be31c24972a d3d7b92c8ac3 2026-03-10T07:48:20.065 INFO:teuthology.orchestra.run.vm05.stdout:crash.vm08 vm08 running (90s) 19s ago 90s 7415k - 18.2.1 5be31c24972a 96136e0195f7 2026-03-10T07:48:20.065 INFO:teuthology.orchestra.run.vm05.stdout:grafana.vm05 vm05 *:3000 running (74s) 43s ago 107s 76.9M - 9.4.7 954c08fa6188 35089be30fc6 2026-03-10T07:48:20.065 INFO:teuthology.orchestra.run.vm05.stdout:mgr.vm05.blexke vm05 *:9283,8765,8443 running (2m) 43s ago 2m 490M - 18.2.1 5be31c24972a 4af6d7f6e0f4 2026-03-10T07:48:20.065 INFO:teuthology.orchestra.run.vm05.stdout:mgr.vm08.orfpog vm08 *:8443,9283,8765 running (87s) 19s ago 87s 447M - 18.2.1 
5be31c24972a 7b89b610a4ab 2026-03-10T07:48:20.065 INFO:teuthology.orchestra.run.vm05.stdout:mon.vm05 vm05 running (2m) 43s ago 2m 44.5M 2048M 18.2.1 5be31c24972a 2a459bf05146 2026-03-10T07:48:20.065 INFO:teuthology.orchestra.run.vm05.stdout:mon.vm08 vm08 running (85s) 19s ago 85s 42.0M 2048M 18.2.1 5be31c24972a e01dfb712474 2026-03-10T07:48:20.065 INFO:teuthology.orchestra.run.vm05.stdout:node-exporter.vm05 vm05 *:9100 running (2m) 43s ago 2m 13.7M - 1.5.0 0da6a335fe13 cb6188e5fa06 2026-03-10T07:48:20.065 INFO:teuthology.orchestra.run.vm05.stdout:node-exporter.vm08 vm08 *:9100 running (88s) 19s ago 88s 13.7M - 1.5.0 0da6a335fe13 f73da8e379d9 2026-03-10T07:48:20.065 INFO:teuthology.orchestra.run.vm05.stdout:osd.0 vm05 running (67s) 43s ago 67s 37.8M 4096M 18.2.1 5be31c24972a 9b7c5ea48cea 2026-03-10T07:48:20.065 INFO:teuthology.orchestra.run.vm05.stdout:osd.1 vm05 running (57s) 43s ago 57s 41.3M 4096M 18.2.1 5be31c24972a 88e0b65b2c93 2026-03-10T07:48:20.065 INFO:teuthology.orchestra.run.vm05.stdout:osd.2 vm05 running (47s) 43s ago 47s 34.1M 4096M 18.2.1 5be31c24972a b8d3cbeb49f1 2026-03-10T07:48:20.065 INFO:teuthology.orchestra.run.vm05.stdout:osd.3 vm08 running (38s) 19s ago 38s 43.5M 4096M 18.2.1 5be31c24972a 0a62c54a86c0 2026-03-10T07:48:20.066 INFO:teuthology.orchestra.run.vm05.stdout:osd.4 vm08 running (29s) 19s ago 29s 39.0M 4096M 18.2.1 5be31c24972a bd748b691ccd 2026-03-10T07:48:20.066 INFO:teuthology.orchestra.run.vm05.stdout:osd.5 vm08 running (20s) 19s ago 20s 12.2M 4096M 18.2.1 5be31c24972a 9f08820ae98b 2026-03-10T07:48:20.066 INFO:teuthology.orchestra.run.vm05.stdout:prometheus.vm05 vm05 *:9095 running (69s) 43s ago 102s 32.4M - 2.43.0 a07b618ecd1d bcb499ab4929 2026-03-10T07:48:20.067 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:20.066+0000 7f7e0efdc700 1 -- 192.168.123.105:0/790402791 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f7df806c490 msgr2=0x7f7df806e950 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 
2026-03-10T07:48:20.067 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:20.066+0000 7f7e0efdc700 1 --2- 192.168.123.105:0/790402791 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f7df806c490 0x7f7df806e950 secure :-1 s=READY pgs=96 cs=0 l=1 rev1=1 crypto rx=0x7f7e080751a0 tx=0x7f7df4008040 comp rx=0 tx=0).stop 2026-03-10T07:48:20.067 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:20.066+0000 7f7e0efdc700 1 -- 192.168.123.105:0/790402791 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f7e08074dc0 msgr2=0x7f7e0819d3f0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:48:20.067 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:20.066+0000 7f7e0efdc700 1 --2- 192.168.123.105:0/790402791 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f7e08074dc0 0x7f7e0819d3f0 secure :-1 s=READY pgs=250 cs=0 l=1 rev1=1 crypto rx=0x7f7dfc006010 tx=0x7f7dfc00b920 comp rx=0 tx=0).stop 2026-03-10T07:48:20.068 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:20.066+0000 7f7e0efdc700 1 -- 192.168.123.105:0/790402791 shutdown_connections 2026-03-10T07:48:20.068 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:20.066+0000 7f7e0efdc700 1 --2- 192.168.123.105:0/790402791 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f7df806c490 0x7f7df806e950 unknown :-1 s=CLOSED pgs=96 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:48:20.068 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:20.066+0000 7f7e0efdc700 1 --2- 192.168.123.105:0/790402791 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f7e080737f0 0x7f7e0819ceb0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:48:20.068 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:20.066+0000 7f7e0efdc700 1 --2- 192.168.123.105:0/790402791 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f7e08074dc0 
0x7f7e0819d3f0 unknown :-1 s=CLOSED pgs=250 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:48:20.068 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:20.066+0000 7f7e0efdc700 1 -- 192.168.123.105:0/790402791 >> 192.168.123.105:0/790402791 conn(0x7f7e080fc4b0 msgr2=0x7f7e081028a0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:48:20.068 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:20.067+0000 7f7e0efdc700 1 -- 192.168.123.105:0/790402791 shutdown_connections 2026-03-10T07:48:20.068 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:20.067+0000 7f7e0efdc700 1 -- 192.168.123.105:0/790402791 wait complete. 2026-03-10T07:48:20.128 DEBUG:teuthology.orchestra.run.vm05:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 12e9780e-1c55-11f1-8896-79f7c2e9b508 -- bash -c 'ceph orch ls' 2026-03-10T07:48:20.272 INFO:teuthology.orchestra.run.vm05.stderr:Inferring config /var/lib/ceph/12e9780e-1c55-11f1-8896-79f7c2e9b508/mon.vm05/config 2026-03-10T07:48:20.310 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:48:20 vm05 ceph-mon[50387]: pgmap v69: 1 pgs: 1 active+clean; 449 KiB data, 160 MiB used, 120 GiB / 120 GiB avail 2026-03-10T07:48:20.310 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:48:20 vm05 ceph-mon[50387]: from='client.14478 -' entity='client.admin' cmd=[{"prefix": "orch status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T07:48:20.526 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:20.524+0000 7f5f2ad6d700 1 -- 192.168.123.105:0/2817363149 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f5f24074dc0 msgr2=0x7f5f24073220 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:48:20.526 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:20.524+0000 7f5f2ad6d700 1 --2- 192.168.123.105:0/2817363149 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] 
conn(0x7f5f24074dc0 0x7f5f24073220 secure :-1 s=READY pgs=251 cs=0 l=1 rev1=1 crypto rx=0x7f5f14009b00 tx=0x7f5f14009e10 comp rx=0 tx=0).stop 2026-03-10T07:48:20.526 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:20.525+0000 7f5f2ad6d700 1 -- 192.168.123.105:0/2817363149 shutdown_connections 2026-03-10T07:48:20.526 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:20.525+0000 7f5f2ad6d700 1 --2- 192.168.123.105:0/2817363149 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f5f240737f0 0x7f5f24073c70 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:48:20.526 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:20.525+0000 7f5f2ad6d700 1 --2- 192.168.123.105:0/2817363149 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f5f24074dc0 0x7f5f24073220 unknown :-1 s=CLOSED pgs=251 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:48:20.526 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:20.525+0000 7f5f2ad6d700 1 -- 192.168.123.105:0/2817363149 >> 192.168.123.105:0/2817363149 conn(0x7f5f240fc460 msgr2=0x7f5f240fe8e0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:48:20.527 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:20.525+0000 7f5f2ad6d700 1 -- 192.168.123.105:0/2817363149 shutdown_connections 2026-03-10T07:48:20.527 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:20.525+0000 7f5f2ad6d700 1 -- 192.168.123.105:0/2817363149 wait complete. 
2026-03-10T07:48:20.527 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:20.525+0000 7f5f2ad6d700 1 Processor -- start 2026-03-10T07:48:20.527 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:20.526+0000 7f5f2ad6d700 1 -- start start 2026-03-10T07:48:20.527 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:20.526+0000 7f5f2ad6d700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f5f240737f0 0x7f5f24198a10 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:48:20.528 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:20.526+0000 7f5f2ad6d700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f5f24074dc0 0x7f5f24198f50 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:48:20.528 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:20.526+0000 7f5f2ad6d700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f5f241994e0 con 0x7f5f240737f0 2026-03-10T07:48:20.528 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:20.526+0000 7f5f2ad6d700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f5f24199620 con 0x7f5f24074dc0 2026-03-10T07:48:20.528 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:20.526+0000 7f5f28b09700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f5f240737f0 0x7f5f24198a10 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:48:20.529 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:20.526+0000 7f5f28b09700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f5f240737f0 0x7f5f24198a10 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am 
v2:192.168.123.105:46848/0 (socket says 192.168.123.105:46848) 2026-03-10T07:48:20.529 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:20.526+0000 7f5f28b09700 1 -- 192.168.123.105:0/2925052900 learned_addr learned my addr 192.168.123.105:0/2925052900 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T07:48:20.529 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:20.526+0000 7f5f28b09700 1 -- 192.168.123.105:0/2925052900 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f5f24074dc0 msgr2=0x7f5f24198f50 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:48:20.529 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:20.526+0000 7f5f23fff700 1 --2- 192.168.123.105:0/2925052900 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f5f24074dc0 0x7f5f24198f50 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:48:20.529 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:20.526+0000 7f5f28b09700 1 --2- 192.168.123.105:0/2925052900 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f5f24074dc0 0x7f5f24198f50 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:48:20.530 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:20.526+0000 7f5f28b09700 1 -- 192.168.123.105:0/2925052900 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f5f140097e0 con 0x7f5f240737f0 2026-03-10T07:48:20.530 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:20.526+0000 7f5f28b09700 1 --2- 192.168.123.105:0/2925052900 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f5f240737f0 0x7f5f24198a10 secure :-1 s=READY pgs=252 cs=0 l=1 rev1=1 crypto rx=0x7f5f14009fd0 tx=0x7f5f14004ab0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 
2026-03-10T07:48:20.530 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:20.527+0000 7f5f21ffb700 1 -- 192.168.123.105:0/2925052900 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f5f1400bcd0 con 0x7f5f240737f0 2026-03-10T07:48:20.530 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:20.527+0000 7f5f21ffb700 1 -- 192.168.123.105:0/2925052900 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f5f14021d30 con 0x7f5f240737f0 2026-03-10T07:48:20.530 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:20.527+0000 7f5f21ffb700 1 -- 192.168.123.105:0/2925052900 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f5f1400fe30 con 0x7f5f240737f0 2026-03-10T07:48:20.531 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:20.527+0000 7f5f2ad6d700 1 -- 192.168.123.105:0/2925052900 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f5f241008c0 con 0x7f5f240737f0 2026-03-10T07:48:20.531 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:20.527+0000 7f5f2ad6d700 1 -- 192.168.123.105:0/2925052900 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f5f24100d80 con 0x7f5f240737f0 2026-03-10T07:48:20.531 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:20.528+0000 7f5f21ffb700 1 -- 192.168.123.105:0/2925052900 <== mon.0 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 19) v1 ==== 90308+0+0 (secure 0 0 0) 0x7f5f14034430 con 0x7f5f240737f0 2026-03-10T07:48:20.531 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:20.528+0000 7f5f21ffb700 1 --2- 192.168.123.105:0/2925052900 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f5f0c06c5b0 0x7f5f0c06ea70 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:48:20.531 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:20.528+0000 7f5f21ffb700 1 -- 192.168.123.105:0/2925052900 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(33..33 src has 1..33) v4 ==== 4446+0+0 (secure 0 0 0) 0x7f5f14096630 con 0x7f5f240737f0 2026-03-10T07:48:20.531 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:20.529+0000 7f5f23fff700 1 --2- 192.168.123.105:0/2925052900 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f5f0c06c5b0 0x7f5f0c06ea70 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:48:20.531 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:20.530+0000 7f5f23fff700 1 --2- 192.168.123.105:0/2925052900 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f5f0c06c5b0 0x7f5f0c06ea70 secure :-1 s=READY pgs=97 cs=0 l=1 rev1=1 crypto rx=0x7f5f24074af0 tx=0x7f5f18005d70 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:48:20.532 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:20.530+0000 7f5f2ad6d700 1 -- 192.168.123.105:0/2925052900 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f5f24066e80 con 0x7f5f240737f0 2026-03-10T07:48:20.534 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:20.533+0000 7f5f21ffb700 1 -- 192.168.123.105:0/2925052900 <== mon.0 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7f5f14064c80 con 0x7f5f240737f0 2026-03-10T07:48:20.640 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:20.638+0000 7f5f2ad6d700 1 -- 192.168.123.105:0/2925052900 --> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] -- mgr_command(tid 0: {"prefix": "orch ls", "target": ["mon-mgr", ""]}) v1 -- 0x7f5f24108400 con 0x7f5f0c06c5b0 2026-03-10T07:48:20.645 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:20.643+0000 7f5f21ffb700 1 -- 192.168.123.105:0/2925052900 <== mgr.14223 v2:192.168.123.105:6800/2 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+1150 (secure 0 0 0) 0x7f5f24108400 con 0x7f5f0c06c5b0 2026-03-10T07:48:20.645 INFO:teuthology.orchestra.run.vm05.stdout:NAME PORTS RUNNING REFRESHED AGE PLACEMENT 2026-03-10T07:48:20.645 INFO:teuthology.orchestra.run.vm05.stdout:alertmanager ?:9093,9094 1/1 44s ago 2m count:1 2026-03-10T07:48:20.645 INFO:teuthology.orchestra.run.vm05.stdout:ceph-exporter 2/2 44s ago 2m * 2026-03-10T07:48:20.645 INFO:teuthology.orchestra.run.vm05.stdout:crash 2/2 44s ago 2m * 2026-03-10T07:48:20.645 INFO:teuthology.orchestra.run.vm05.stdout:grafana ?:3000 1/1 44s ago 2m count:1 2026-03-10T07:48:20.645 INFO:teuthology.orchestra.run.vm05.stdout:mgr 2/2 44s ago 2m count:2 2026-03-10T07:48:20.645 INFO:teuthology.orchestra.run.vm05.stdout:mon 2/2 44s ago 2m vm05:192.168.123.105=vm05;vm08:192.168.123.108=vm08;count:2 2026-03-10T07:48:20.645 INFO:teuthology.orchestra.run.vm05.stdout:node-exporter ?:9100 2/2 44s ago 2m * 2026-03-10T07:48:20.645 INFO:teuthology.orchestra.run.vm05.stdout:osd 6 44s ago - 2026-03-10T07:48:20.645 INFO:teuthology.orchestra.run.vm05.stdout:prometheus ?:9095 1/1 44s ago 2m count:1 2026-03-10T07:48:20.647 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:20.646+0000 7f5f2ad6d700 1 -- 192.168.123.105:0/2925052900 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f5f0c06c5b0 msgr2=0x7f5f0c06ea70 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:48:20.647 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:20.646+0000 7f5f2ad6d700 1 --2- 192.168.123.105:0/2925052900 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f5f0c06c5b0 0x7f5f0c06ea70 secure :-1 s=READY pgs=97 cs=0 l=1 rev1=1 crypto rx=0x7f5f24074af0 tx=0x7f5f18005d70 comp rx=0 tx=0).stop 2026-03-10T07:48:20.647 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:20.646+0000 7f5f2ad6d700 1 -- 192.168.123.105:0/2925052900 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f5f240737f0 msgr2=0x7f5f24198a10 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:48:20.648 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:20.646+0000 7f5f2ad6d700 1 --2- 192.168.123.105:0/2925052900 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f5f240737f0 0x7f5f24198a10 secure :-1 s=READY pgs=252 cs=0 l=1 rev1=1 crypto rx=0x7f5f14009fd0 tx=0x7f5f14004ab0 comp rx=0 tx=0).stop 2026-03-10T07:48:20.648 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:20.646+0000 7f5f2ad6d700 1 -- 192.168.123.105:0/2925052900 shutdown_connections 2026-03-10T07:48:20.648 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:20.646+0000 7f5f2ad6d700 1 --2- 192.168.123.105:0/2925052900 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f5f0c06c5b0 0x7f5f0c06ea70 unknown :-1 s=CLOSED pgs=97 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:48:20.648 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:20.646+0000 7f5f2ad6d700 1 --2- 192.168.123.105:0/2925052900 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f5f240737f0 0x7f5f24198a10 unknown :-1 s=CLOSED pgs=252 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:48:20.648 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:20.646+0000 7f5f2ad6d700 1 --2- 192.168.123.105:0/2925052900 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f5f24074dc0 0x7f5f24198f50 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:48:20.648 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:20.646+0000 7f5f2ad6d700 1 -- 192.168.123.105:0/2925052900 >> 192.168.123.105:0/2925052900 conn(0x7f5f240fc460 msgr2=0x7f5f24106ce0 unknown :-1 s=STATE_NONE l=0).mark_down 
2026-03-10T07:48:20.648 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:20.646+0000 7f5f2ad6d700 1 -- 192.168.123.105:0/2925052900 shutdown_connections 2026-03-10T07:48:20.648 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:20.646+0000 7f5f2ad6d700 1 -- 192.168.123.105:0/2925052900 wait complete. 2026-03-10T07:48:20.668 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:48:20 vm08 ceph-mon[59917]: pgmap v69: 1 pgs: 1 active+clean; 449 KiB data, 160 MiB used, 120 GiB / 120 GiB avail 2026-03-10T07:48:20.668 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:48:20 vm08 ceph-mon[59917]: from='client.14478 -' entity='client.admin' cmd=[{"prefix": "orch status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T07:48:20.706 DEBUG:teuthology.orchestra.run.vm05:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 12e9780e-1c55-11f1-8896-79f7c2e9b508 -- bash -c 'ceph orch host ls' 2026-03-10T07:48:20.845 INFO:teuthology.orchestra.run.vm05.stderr:Inferring config /var/lib/ceph/12e9780e-1c55-11f1-8896-79f7c2e9b508/mon.vm05/config 2026-03-10T07:48:21.089 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:21.087+0000 7f3b367b9700 1 -- 192.168.123.105:0/3590434324 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3b30103150 msgr2=0x7f3b30103570 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:48:21.090 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:21.087+0000 7f3b367b9700 1 --2- 192.168.123.105:0/3590434324 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3b30103150 0x7f3b30103570 secure :-1 s=READY pgs=253 cs=0 l=1 rev1=1 crypto rx=0x7f3b18009b00 tx=0x7f3b18009e10 comp rx=0 tx=0).stop 2026-03-10T07:48:21.090 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:21.088+0000 7f3b367b9700 1 -- 192.168.123.105:0/3590434324 shutdown_connections 2026-03-10T07:48:21.090 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:21.088+0000 7f3b367b9700 1 --2- 192.168.123.105:0/3590434324 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f3b30104350 0x7f3b301047b0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:48:21.090 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:21.088+0000 7f3b367b9700 1 --2- 192.168.123.105:0/3590434324 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3b30103150 0x7f3b30103570 unknown :-1 s=CLOSED pgs=253 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:48:21.090 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:21.088+0000 7f3b367b9700 1 -- 192.168.123.105:0/3590434324 >> 192.168.123.105:0/3590434324 conn(0x7f3b300fe6d0 msgr2=0x7f3b30100b30 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:48:21.090 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:21.088+0000 7f3b367b9700 1 -- 192.168.123.105:0/3590434324 shutdown_connections 2026-03-10T07:48:21.090 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:21.088+0000 7f3b367b9700 1 -- 192.168.123.105:0/3590434324 wait complete. 
2026-03-10T07:48:21.090 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:21.089+0000 7f3b367b9700 1 Processor -- start 2026-03-10T07:48:21.090 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:21.089+0000 7f3b367b9700 1 -- start start 2026-03-10T07:48:21.090 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:21.089+0000 7f3b367b9700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3b30103150 0x7f3b30198aa0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:48:21.091 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:21.089+0000 7f3b367b9700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f3b30104350 0x7f3b30198fe0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:48:21.091 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:21.089+0000 7f3b367b9700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f3b30199600 con 0x7f3b30103150 2026-03-10T07:48:21.091 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:21.089+0000 7f3b367b9700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f3b30199740 con 0x7f3b30104350 2026-03-10T07:48:21.091 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:21.089+0000 7f3b2ffff700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3b30103150 0x7f3b30198aa0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:48:21.091 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:21.089+0000 7f3b2ffff700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3b30103150 0x7f3b30198aa0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am 
v2:192.168.123.105:46864/0 (socket says 192.168.123.105:46864) 2026-03-10T07:48:21.091 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:21.089+0000 7f3b2ffff700 1 -- 192.168.123.105:0/2970041848 learned_addr learned my addr 192.168.123.105:0/2970041848 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T07:48:21.091 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:21.089+0000 7f3b2ffff700 1 -- 192.168.123.105:0/2970041848 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f3b30104350 msgr2=0x7f3b30198fe0 unknown :-1 s=STATE_CONNECTING_RE l=1).mark_down 2026-03-10T07:48:21.091 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:21.089+0000 7f3b2ffff700 1 --2- 192.168.123.105:0/2970041848 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f3b30104350 0x7f3b30198fe0 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:48:21.091 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:21.089+0000 7f3b2ffff700 1 -- 192.168.123.105:0/2970041848 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f3b180097e0 con 0x7f3b30103150 2026-03-10T07:48:21.091 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:21.090+0000 7f3b2ffff700 1 --2- 192.168.123.105:0/2970041848 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3b30103150 0x7f3b30198aa0 secure :-1 s=READY pgs=254 cs=0 l=1 rev1=1 crypto rx=0x7f3b18004a30 tx=0x7f3b18004b10 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:48:21.092 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:21.090+0000 7f3b2d7fa700 1 -- 192.168.123.105:0/2970041848 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f3b1801d070 con 0x7f3b30103150 2026-03-10T07:48:21.092 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:21.090+0000 7f3b2d7fa700 1 -- 
192.168.123.105:0/2970041848 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f3b1800bcd0 con 0x7f3b30103150 2026-03-10T07:48:21.093 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:21.090+0000 7f3b2d7fa700 1 -- 192.168.123.105:0/2970041848 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f3b180218c0 con 0x7f3b30103150 2026-03-10T07:48:21.093 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:21.090+0000 7f3b367b9700 1 -- 192.168.123.105:0/2970041848 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f3b3019e190 con 0x7f3b30103150 2026-03-10T07:48:21.093 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:21.090+0000 7f3b367b9700 1 -- 192.168.123.105:0/2970041848 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f3b3019e680 con 0x7f3b30103150 2026-03-10T07:48:21.093 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:21.091+0000 7f3b2d7fa700 1 -- 192.168.123.105:0/2970041848 <== mon.0 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 19) v1 ==== 90308+0+0 (secure 0 0 0) 0x7f3b1802b430 con 0x7f3b30103150 2026-03-10T07:48:21.093 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:21.092+0000 7f3b367b9700 1 -- 192.168.123.105:0/2970041848 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f3b30066e80 con 0x7f3b30103150 2026-03-10T07:48:21.094 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:21.092+0000 7f3b2d7fa700 1 --2- 192.168.123.105:0/2970041848 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f3b1c06c490 0x7f3b1c06e950 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:48:21.094 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:21.092+0000 7f3b2d7fa700 1 -- 192.168.123.105:0/2970041848 <== mon.0 
v2:192.168.123.105:3300/0 5 ==== osd_map(33..33 src has 1..33) v4 ==== 4446+0+0 (secure 0 0 0) 0x7f3b1808cac0 con 0x7f3b30103150 2026-03-10T07:48:21.094 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:21.092+0000 7f3b2f7fe700 1 --2- 192.168.123.105:0/2970041848 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f3b1c06c490 0x7f3b1c06e950 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:48:21.096 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:21.094+0000 7f3b2f7fe700 1 --2- 192.168.123.105:0/2970041848 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f3b1c06c490 0x7f3b1c06e950 secure :-1 s=READY pgs=98 cs=0 l=1 rev1=1 crypto rx=0x7f3b20009730 tx=0x7f3b20006cb0 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:48:21.096 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:21.095+0000 7f3b2d7fa700 1 -- 192.168.123.105:0/2970041848 <== mon.0 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7f3b1805c0d0 con 0x7f3b30103150 2026-03-10T07:48:21.209 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:21.207+0000 7f3b367b9700 1 -- 192.168.123.105:0/2970041848 --> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] -- mgr_command(tid 0: {"prefix": "orch host ls", "target": ["mon-mgr", ""]}) v1 -- 0x7f3b30108ca0 con 0x7f3b1c06c490 2026-03-10T07:48:21.210 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:21.209+0000 7f3b2d7fa700 1 -- 192.168.123.105:0/2970041848 <== mgr.14223 v2:192.168.123.105:6800/2 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+139 (secure 0 0 0) 0x7f3b30108ca0 con 0x7f3b1c06c490 2026-03-10T07:48:21.210 INFO:teuthology.orchestra.run.vm05.stdout:HOST ADDR LABELS STATUS 2026-03-10T07:48:21.210 INFO:teuthology.orchestra.run.vm05.stdout:vm05 
192.168.123.105 2026-03-10T07:48:21.210 INFO:teuthology.orchestra.run.vm05.stdout:vm08 192.168.123.108 2026-03-10T07:48:21.210 INFO:teuthology.orchestra.run.vm05.stdout:2 hosts in cluster 2026-03-10T07:48:21.212 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:21.211+0000 7f3b367b9700 1 -- 192.168.123.105:0/2970041848 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f3b1c06c490 msgr2=0x7f3b1c06e950 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:48:21.212 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:21.211+0000 7f3b367b9700 1 --2- 192.168.123.105:0/2970041848 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f3b1c06c490 0x7f3b1c06e950 secure :-1 s=READY pgs=98 cs=0 l=1 rev1=1 crypto rx=0x7f3b20009730 tx=0x7f3b20006cb0 comp rx=0 tx=0).stop 2026-03-10T07:48:21.213 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:21.211+0000 7f3b367b9700 1 -- 192.168.123.105:0/2970041848 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3b30103150 msgr2=0x7f3b30198aa0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:48:21.213 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:21.211+0000 7f3b367b9700 1 --2- 192.168.123.105:0/2970041848 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3b30103150 0x7f3b30198aa0 secure :-1 s=READY pgs=254 cs=0 l=1 rev1=1 crypto rx=0x7f3b18004a30 tx=0x7f3b18004b10 comp rx=0 tx=0).stop 2026-03-10T07:48:21.213 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:21.211+0000 7f3b367b9700 1 -- 192.168.123.105:0/2970041848 shutdown_connections 2026-03-10T07:48:21.213 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:21.211+0000 7f3b367b9700 1 --2- 192.168.123.105:0/2970041848 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f3b1c06c490 0x7f3b1c06e950 unknown :-1 s=CLOSED pgs=98 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:48:21.213 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:21.211+0000 7f3b367b9700 1 --2- 192.168.123.105:0/2970041848 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3b30103150 0x7f3b30198aa0 unknown :-1 s=CLOSED pgs=254 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:48:21.213 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:21.211+0000 7f3b367b9700 1 --2- 192.168.123.105:0/2970041848 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f3b30104350 0x7f3b30198fe0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:48:21.213 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:21.211+0000 7f3b367b9700 1 -- 192.168.123.105:0/2970041848 >> 192.168.123.105:0/2970041848 conn(0x7f3b300fe6d0 msgr2=0x7f3b30107580 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:48:21.213 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:21.211+0000 7f3b367b9700 1 -- 192.168.123.105:0/2970041848 shutdown_connections 2026-03-10T07:48:21.213 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:21.211+0000 7f3b367b9700 1 -- 192.168.123.105:0/2970041848 wait complete. 
2026-03-10T07:48:21.272 DEBUG:teuthology.orchestra.run.vm05:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 12e9780e-1c55-11f1-8896-79f7c2e9b508 -- bash -c 'ceph orch device ls' 2026-03-10T07:48:21.409 INFO:teuthology.orchestra.run.vm05.stderr:Inferring config /var/lib/ceph/12e9780e-1c55-11f1-8896-79f7c2e9b508/mon.vm05/config 2026-03-10T07:48:21.460 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:48:21 vm05 ceph-mon[50387]: from='client.14482 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T07:48:21.460 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:48:21 vm05 ceph-mon[50387]: from='client.14486 -' entity='client.admin' cmd=[{"prefix": "orch ls", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T07:48:21.650 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:21.648+0000 7fa7c28e1700 1 -- 192.168.123.105:0/4168680828 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fa7bc0737f0 msgr2=0x7fa7bc073c70 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:48:21.650 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:21.648+0000 7fa7c28e1700 1 --2- 192.168.123.105:0/4168680828 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fa7bc0737f0 0x7fa7bc073c70 secure :-1 s=READY pgs=255 cs=0 l=1 rev1=1 crypto rx=0x7fa7ac009b00 tx=0x7fa7ac009e10 comp rx=0 tx=0).stop 2026-03-10T07:48:21.651 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:21.649+0000 7fa7c28e1700 1 -- 192.168.123.105:0/4168680828 shutdown_connections 2026-03-10T07:48:21.651 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:21.649+0000 7fa7c28e1700 1 --2- 192.168.123.105:0/4168680828 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fa7bc0737f0 0x7fa7bc073c70 unknown :-1 s=CLOSED pgs=255 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:48:21.651 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:21.649+0000 7fa7c28e1700 1 --2- 192.168.123.105:0/4168680828 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fa7bc074dc0 0x7fa7bc073220 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:48:21.651 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:21.649+0000 7fa7c28e1700 1 -- 192.168.123.105:0/4168680828 >> 192.168.123.105:0/4168680828 conn(0x7fa7bc0fc4c0 msgr2=0x7fa7bc0fe920 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:48:21.651 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:21.649+0000 7fa7c28e1700 1 -- 192.168.123.105:0/4168680828 shutdown_connections 2026-03-10T07:48:21.651 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:21.649+0000 7fa7c28e1700 1 -- 192.168.123.105:0/4168680828 wait complete. 2026-03-10T07:48:21.651 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:21.650+0000 7fa7c28e1700 1 Processor -- start 2026-03-10T07:48:21.651 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:21.650+0000 7fa7c28e1700 1 -- start start 2026-03-10T07:48:21.652 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:21.650+0000 7fa7c28e1700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fa7bc0737f0 0x7fa7bc19cdf0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:48:21.652 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:21.650+0000 7fa7c28e1700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fa7bc074dc0 0x7fa7bc19d330 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:48:21.652 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:21.650+0000 7fa7c28e1700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fa7bc19d8c0 con 0x7fa7bc0737f0 2026-03-10T07:48:21.652 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:21.650+0000 7fa7c28e1700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fa7bc19da00 con 0x7fa7bc074dc0 2026-03-10T07:48:21.652 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:21.650+0000 7fa7bb7fe700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fa7bc074dc0 0x7fa7bc19d330 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:48:21.652 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:21.650+0000 7fa7bb7fe700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fa7bc074dc0 0x7fa7bc19d330 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.108:3300/0 says I am v2:192.168.123.105:60108/0 (socket says 192.168.123.105:60108) 2026-03-10T07:48:21.652 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:21.650+0000 7fa7bb7fe700 1 -- 192.168.123.105:0/1411462624 learned_addr learned my addr 192.168.123.105:0/1411462624 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T07:48:21.652 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:21.651+0000 7fa7bbfff700 1 --2- 192.168.123.105:0/1411462624 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fa7bc0737f0 0x7fa7bc19cdf0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:48:21.652 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:21.651+0000 7fa7bb7fe700 1 -- 192.168.123.105:0/1411462624 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fa7bc0737f0 msgr2=0x7fa7bc19cdf0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:48:21.652 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:21.651+0000 7fa7bb7fe700 1 --2- 
192.168.123.105:0/1411462624 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fa7bc0737f0 0x7fa7bc19cdf0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:48:21.652 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:21.651+0000 7fa7bb7fe700 1 -- 192.168.123.105:0/1411462624 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fa7ac0097e0 con 0x7fa7bc074dc0 2026-03-10T07:48:21.652 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:21.651+0000 7fa7bbfff700 1 --2- 192.168.123.105:0/1411462624 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fa7bc0737f0 0x7fa7bc19cdf0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request state changed! 2026-03-10T07:48:21.653 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:21.651+0000 7fa7bb7fe700 1 --2- 192.168.123.105:0/1411462624 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fa7bc074dc0 0x7fa7bc19d330 secure :-1 s=READY pgs=45 cs=0 l=1 rev1=1 crypto rx=0x7fa7ac000c00 tx=0x7fa7ac0056c0 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:48:21.653 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:21.651+0000 7fa7b97fa700 1 -- 192.168.123.105:0/1411462624 <== mon.1 v2:192.168.123.108:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fa7ac01d070 con 0x7fa7bc074dc0 2026-03-10T07:48:21.653 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:21.651+0000 7fa7c28e1700 1 -- 192.168.123.105:0/1411462624 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fa7bc1a2460 con 0x7fa7bc074dc0 2026-03-10T07:48:21.654 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:21.651+0000 7fa7c28e1700 1 -- 192.168.123.105:0/1411462624 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- 
mon_subscribe({osdmap=0}) v3 -- 0x7fa7bc1a2920 con 0x7fa7bc074dc0 2026-03-10T07:48:21.654 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:21.652+0000 7fa7b97fa700 1 -- 192.168.123.105:0/1411462624 <== mon.1 v2:192.168.123.108:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fa7ac00bc50 con 0x7fa7bc074dc0 2026-03-10T07:48:21.654 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:21.652+0000 7fa7b97fa700 1 -- 192.168.123.105:0/1411462624 <== mon.1 v2:192.168.123.108:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fa7ac00f800 con 0x7fa7bc074dc0 2026-03-10T07:48:21.655 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:21.652+0000 7fa7c28e1700 1 -- 192.168.123.105:0/1411462624 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fa7bc066e80 con 0x7fa7bc074dc0 2026-03-10T07:48:21.655 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:21.653+0000 7fa7b97fa700 1 -- 192.168.123.105:0/1411462624 <== mon.1 v2:192.168.123.108:3300/0 4 ==== mgrmap(e 19) v1 ==== 90308+0+0 (secure 0 0 0) 0x7fa7ac022ae0 con 0x7fa7bc074dc0 2026-03-10T07:48:21.655 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:21.653+0000 7fa7b97fa700 1 --2- 192.168.123.105:0/1411462624 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fa7a806c530 0x7fa7a806e9f0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:48:21.655 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:21.653+0000 7fa7b97fa700 1 -- 192.168.123.105:0/1411462624 <== mon.1 v2:192.168.123.108:3300/0 5 ==== osd_map(33..33 src has 1..33) v4 ==== 4446+0+0 (secure 0 0 0) 0x7fa7ac08cc60 con 0x7fa7bc074dc0 2026-03-10T07:48:21.655 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:21.653+0000 7fa7bbfff700 1 --2- 192.168.123.105:0/1411462624 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fa7a806c530 0x7fa7a806e9f0 
unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:48:21.655 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:21.653+0000 7fa7bbfff700 1 --2- 192.168.123.105:0/1411462624 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fa7a806c530 0x7fa7a806e9f0 secure :-1 s=READY pgs=99 cs=0 l=1 rev1=1 crypto rx=0x7fa7a4005fd0 tx=0x7fa7a4005e20 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:48:21.657 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:21.656+0000 7fa7b97fa700 1 -- 192.168.123.105:0/1411462624 <== mon.1 v2:192.168.123.108:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7fa7ac05b360 con 0x7fa7bc074dc0 2026-03-10T07:48:21.668 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:48:21 vm08 ceph-mon[59917]: from='client.14482 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T07:48:21.668 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:48:21 vm08 ceph-mon[59917]: from='client.14486 -' entity='client.admin' cmd=[{"prefix": "orch ls", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T07:48:21.765 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:21.761+0000 7fa7c28e1700 1 -- 192.168.123.105:0/1411462624 --> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] -- mgr_command(tid 0: {"prefix": "orch device ls", "target": ["mon-mgr", ""]}) v1 -- 0x7fa7bc103ee0 con 0x7fa7a806c530 2026-03-10T07:48:21.765 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:21.763+0000 7fa7b97fa700 1 -- 192.168.123.105:0/1411462624 <== mgr.14223 v2:192.168.123.105:6800/2 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+1278 (secure 0 0 0) 0x7fa7bc103ee0 con 0x7fa7a806c530 2026-03-10T07:48:21.765 INFO:teuthology.orchestra.run.vm05.stdout:HOST PATH TYPE DEVICE ID SIZE 
AVAILABLE REFRESHED REJECT REASONS 2026-03-10T07:48:21.765 INFO:teuthology.orchestra.run.vm05.stdout:vm05 /dev/vdb hdd DWNBRSTVMM05001 20.0G Yes 46s ago 2026-03-10T07:48:21.765 INFO:teuthology.orchestra.run.vm05.stdout:vm05 /dev/vdc hdd DWNBRSTVMM05002 20.0G No 46s ago Has a FileSystem, Insufficient space (<10 extents) on vgs, LVM detected 2026-03-10T07:48:21.765 INFO:teuthology.orchestra.run.vm05.stdout:vm05 /dev/vdd hdd DWNBRSTVMM05003 20.0G No 46s ago Has a FileSystem, Insufficient space (<10 extents) on vgs, LVM detected 2026-03-10T07:48:21.765 INFO:teuthology.orchestra.run.vm05.stdout:vm05 /dev/vde hdd DWNBRSTVMM05004 20.0G No 46s ago Has a FileSystem, Insufficient space (<10 extents) on vgs, LVM detected 2026-03-10T07:48:21.765 INFO:teuthology.orchestra.run.vm05.stdout:vm08 /dev/vdb hdd DWNBRSTVMM08001 20.0G Yes 19s ago 2026-03-10T07:48:21.765 INFO:teuthology.orchestra.run.vm05.stdout:vm08 /dev/vdc hdd DWNBRSTVMM08002 20.0G No 19s ago Has a FileSystem, Insufficient space (<10 extents) on vgs, LVM detected 2026-03-10T07:48:21.765 INFO:teuthology.orchestra.run.vm05.stdout:vm08 /dev/vdd hdd DWNBRSTVMM08003 20.0G No 19s ago Has a FileSystem, Insufficient space (<10 extents) on vgs, LVM detected 2026-03-10T07:48:21.765 INFO:teuthology.orchestra.run.vm05.stdout:vm08 /dev/vde hdd DWNBRSTVMM08004 20.0G No 19s ago Has a FileSystem, Insufficient space (<10 extents) on vgs, LVM detected 2026-03-10T07:48:21.767 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:21.765+0000 7fa7c28e1700 1 -- 192.168.123.105:0/1411462624 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fa7a806c530 msgr2=0x7fa7a806e9f0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:48:21.767 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:21.765+0000 7fa7c28e1700 1 --2- 192.168.123.105:0/1411462624 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fa7a806c530 0x7fa7a806e9f0 secure :-1 s=READY pgs=99 cs=0 l=1 rev1=1 crypto 
rx=0x7fa7a4005fd0 tx=0x7fa7a4005e20 comp rx=0 tx=0).stop 2026-03-10T07:48:21.767 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:21.765+0000 7fa7c28e1700 1 -- 192.168.123.105:0/1411462624 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fa7bc074dc0 msgr2=0x7fa7bc19d330 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:48:21.767 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:21.765+0000 7fa7c28e1700 1 --2- 192.168.123.105:0/1411462624 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fa7bc074dc0 0x7fa7bc19d330 secure :-1 s=READY pgs=45 cs=0 l=1 rev1=1 crypto rx=0x7fa7ac000c00 tx=0x7fa7ac0056c0 comp rx=0 tx=0).stop 2026-03-10T07:48:21.767 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:21.766+0000 7fa7c28e1700 1 -- 192.168.123.105:0/1411462624 shutdown_connections 2026-03-10T07:48:21.767 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:21.766+0000 7fa7c28e1700 1 --2- 192.168.123.105:0/1411462624 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fa7a806c530 0x7fa7a806e9f0 unknown :-1 s=CLOSED pgs=99 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:48:21.767 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:21.766+0000 7fa7c28e1700 1 --2- 192.168.123.105:0/1411462624 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fa7bc0737f0 0x7fa7bc19cdf0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:48:21.767 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:21.766+0000 7fa7c28e1700 1 --2- 192.168.123.105:0/1411462624 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fa7bc074dc0 0x7fa7bc19d330 unknown :-1 s=CLOSED pgs=45 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:48:21.767 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:21.766+0000 7fa7c28e1700 1 -- 192.168.123.105:0/1411462624 >> 192.168.123.105:0/1411462624 
conn(0x7fa7bc0fc4c0 msgr2=0x7fa7bc1027c0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:48:21.767 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:21.766+0000 7fa7c28e1700 1 -- 192.168.123.105:0/1411462624 shutdown_connections 2026-03-10T07:48:21.768 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:21.766+0000 7fa7c28e1700 1 -- 192.168.123.105:0/1411462624 wait complete. 2026-03-10T07:48:21.831 INFO:teuthology.run_tasks:Running task cephadm.shell... 2026-03-10T07:48:21.833 INFO:tasks.cephadm:Running commands on role host.a host ubuntu@vm05.local 2026-03-10T07:48:21.833 DEBUG:teuthology.orchestra.run.vm05:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 12e9780e-1c55-11f1-8896-79f7c2e9b508 -- bash -c 'ceph fs volume create cephfs --placement=4' 2026-03-10T07:48:21.990 INFO:teuthology.orchestra.run.vm05.stderr:Inferring config /var/lib/ceph/12e9780e-1c55-11f1-8896-79f7c2e9b508/mon.vm05/config 2026-03-10T07:48:22.234 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:22.231+0000 7f96eed8f700 1 -- 192.168.123.105:0/276487965 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f96e8104340 msgr2=0x7f96e81047a0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:48:22.234 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:22.231+0000 7f96eed8f700 1 --2- 192.168.123.105:0/276487965 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f96e8104340 0x7f96e81047a0 secure :-1 s=READY pgs=46 cs=0 l=1 rev1=1 crypto rx=0x7f96dc009a60 tx=0x7f96dc009d70 comp rx=0 tx=0).stop 2026-03-10T07:48:22.234 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:22.232+0000 7f96eed8f700 1 -- 192.168.123.105:0/276487965 shutdown_connections 2026-03-10T07:48:22.234 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:22.232+0000 7f96eed8f700 1 --2- 192.168.123.105:0/276487965 >> 
[v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f96e8104340 0x7f96e81047a0 unknown :-1 s=CLOSED pgs=46 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:48:22.234 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:22.232+0000 7f96eed8f700 1 --2- 192.168.123.105:0/276487965 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f96e8103140 0x7f96e8103560 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:48:22.234 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:22.232+0000 7f96eed8f700 1 -- 192.168.123.105:0/276487965 >> 192.168.123.105:0/276487965 conn(0x7f96e80fe6c0 msgr2=0x7f96e8100b20 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:48:22.234 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:22.232+0000 7f96eed8f700 1 -- 192.168.123.105:0/276487965 shutdown_connections 2026-03-10T07:48:22.234 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:22.233+0000 7f96eed8f700 1 -- 192.168.123.105:0/276487965 wait complete. 
2026-03-10T07:48:22.235 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:22.233+0000 7f96eed8f700 1 Processor -- start 2026-03-10T07:48:22.235 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:22.233+0000 7f96eed8f700 1 -- start start 2026-03-10T07:48:22.235 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:22.233+0000 7f96eed8f700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f96e8103140 0x7f96e81989c0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:48:22.235 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:22.233+0000 7f96eed8f700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f96e8104340 0x7f96e8198f00 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:48:22.236 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:22.233+0000 7f96eed8f700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f96e8199520 con 0x7f96e8104340 2026-03-10T07:48:22.236 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:22.233+0000 7f96eed8f700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f96e8199660 con 0x7f96e8103140 2026-03-10T07:48:22.236 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:22.234+0000 7f96ecb2b700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f96e8103140 0x7f96e81989c0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:48:22.236 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:22.234+0000 7f96e7fff700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f96e8104340 0x7f96e8198f00 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 
2026-03-10T07:48:22.236 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:22.234+0000 7f96e7fff700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f96e8104340 0x7f96e8198f00 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.105:46912/0 (socket says 192.168.123.105:46912) 2026-03-10T07:48:22.236 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:22.234+0000 7f96e7fff700 1 -- 192.168.123.105:0/2788291836 learned_addr learned my addr 192.168.123.105:0/2788291836 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T07:48:22.236 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:22.234+0000 7f96e7fff700 1 -- 192.168.123.105:0/2788291836 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f96e8103140 msgr2=0x7f96e81989c0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:48:22.236 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:22.234+0000 7f96e7fff700 1 --2- 192.168.123.105:0/2788291836 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f96e8103140 0x7f96e81989c0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:48:22.236 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:22.234+0000 7f96e7fff700 1 -- 192.168.123.105:0/2788291836 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f96d80097e0 con 0x7f96e8104340 2026-03-10T07:48:22.236 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:22.234+0000 7f96ecb2b700 1 --2- 192.168.123.105:0/2788291836 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f96e8103140 0x7f96e81989c0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request state changed! 
2026-03-10T07:48:22.236 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:22.234+0000 7f96e7fff700 1 --2- 192.168.123.105:0/2788291836 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f96e8104340 0x7f96e8198f00 secure :-1 s=READY pgs=256 cs=0 l=1 rev1=1 crypto rx=0x7f96dc0096a0 tx=0x7f96dc00f880 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:48:22.236 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:22.235+0000 7f96e5ffb700 1 -- 192.168.123.105:0/2788291836 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f96dc01d070 con 0x7f96e8104340 2026-03-10T07:48:22.237 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:22.235+0000 7f96eed8f700 1 -- 192.168.123.105:0/2788291836 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f96dc009710 con 0x7f96e8104340 2026-03-10T07:48:22.237 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:22.235+0000 7f96eed8f700 1 -- 192.168.123.105:0/2788291836 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f96e819e410 con 0x7f96e8104340 2026-03-10T07:48:22.237 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:22.235+0000 7f96e5ffb700 1 -- 192.168.123.105:0/2788291836 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f96dc00fe60 con 0x7f96e8104340 2026-03-10T07:48:22.237 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:22.235+0000 7f96e5ffb700 1 -- 192.168.123.105:0/2788291836 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f96dc0177d0 con 0x7f96e8104340 2026-03-10T07:48:22.242 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:22.237+0000 7f96eed8f700 1 -- 192.168.123.105:0/2788291836 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": 
"get_command_descriptions"} v 0) v1 -- 0x7f96d4005320 con 0x7f96e8104340 2026-03-10T07:48:22.242 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:22.237+0000 7f96e5ffb700 1 -- 192.168.123.105:0/2788291836 <== mon.0 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 19) v1 ==== 90308+0+0 (secure 0 0 0) 0x7f96dc017a10 con 0x7f96e8104340 2026-03-10T07:48:22.242 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:22.238+0000 7f96e5ffb700 1 --2- 192.168.123.105:0/2788291836 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f96d006c490 0x7f96d006e950 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:48:22.242 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:22.238+0000 7f96ecb2b700 1 --2- 192.168.123.105:0/2788291836 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f96d006c490 0x7f96d006e950 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:48:22.242 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:22.238+0000 7f96ecb2b700 1 --2- 192.168.123.105:0/2788291836 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f96d006c490 0x7f96d006e950 secure :-1 s=READY pgs=100 cs=0 l=1 rev1=1 crypto rx=0x7f96e81041a0 tx=0x7f96d8009500 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:48:22.242 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:22.238+0000 7f96e5ffb700 1 -- 192.168.123.105:0/2788291836 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(33..33 src has 1..33) v4 ==== 4446+0+0 (secure 0 0 0) 0x7f96dc08c660 con 0x7f96e8104340 2026-03-10T07:48:22.242 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:22.241+0000 7f96e5ffb700 1 -- 192.168.123.105:0/2788291836 <== mon.0 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 
0 0 0) 0x7f96dc05ad60 con 0x7f96e8104340 2026-03-10T07:48:22.274 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:48:22 vm05 ceph-mon[50387]: from='client.14490 -' entity='client.admin' cmd=[{"prefix": "orch host ls", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T07:48:22.274 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:48:22 vm05 ceph-mon[50387]: pgmap v70: 1 pgs: 1 active+clean; 449 KiB data, 160 MiB used, 120 GiB / 120 GiB avail 2026-03-10T07:48:22.274 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:48:22 vm05 ceph-mon[50387]: from='client.24289 -' entity='client.admin' cmd=[{"prefix": "orch device ls", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T07:48:22.366 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:22.364+0000 7f96eed8f700 1 -- 192.168.123.105:0/2788291836 --> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] -- mgr_command(tid 0: {"prefix": "fs volume create", "name": "cephfs", "placement": "4", "target": ["mon-mgr", ""]}) v1 -- 0x7f96d4000bf0 con 0x7f96d006c490 2026-03-10T07:48:22.668 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:48:22 vm08 ceph-mon[59917]: from='client.14490 -' entity='client.admin' cmd=[{"prefix": "orch host ls", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T07:48:22.668 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:48:22 vm08 ceph-mon[59917]: pgmap v70: 1 pgs: 1 active+clean; 449 KiB data, 160 MiB used, 120 GiB / 120 GiB avail 2026-03-10T07:48:22.668 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:48:22 vm08 ceph-mon[59917]: from='client.24289 -' entity='client.admin' cmd=[{"prefix": "orch device ls", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T07:48:23.606 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:48:23 vm08 ceph-mon[59917]: from='client.14496 -' entity='client.admin' cmd=[{"prefix": "fs volume create", "name": "cephfs", "placement": "4", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T07:48:23.606 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:48:23 vm08 
ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "osd pool create", "pool": "cephfs.cephfs.meta"}]: dispatch 2026-03-10T07:48:23.657 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:48:23 vm05 ceph-mon[50387]: from='client.14496 -' entity='client.admin' cmd=[{"prefix": "fs volume create", "name": "cephfs", "placement": "4", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T07:48:23.657 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:48:23 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "osd pool create", "pool": "cephfs.cephfs.meta"}]: dispatch 2026-03-10T07:48:24.332 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:24.330+0000 7f96e5ffb700 1 -- 192.168.123.105:0/2788291836 <== mgr.14223 v2:192.168.123.105:6800/2 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+0 (secure 0 0 0) 0x7f96d4000bf0 con 0x7f96d006c490 2026-03-10T07:48:24.334 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:24.333+0000 7f96eed8f700 1 -- 192.168.123.105:0/2788291836 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f96d006c490 msgr2=0x7f96d006e950 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:48:24.335 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:24.333+0000 7f96eed8f700 1 --2- 192.168.123.105:0/2788291836 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f96d006c490 0x7f96d006e950 secure :-1 s=READY pgs=100 cs=0 l=1 rev1=1 crypto rx=0x7f96e81041a0 tx=0x7f96d8009500 comp rx=0 tx=0).stop 2026-03-10T07:48:24.335 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:24.333+0000 7f96eed8f700 1 -- 192.168.123.105:0/2788291836 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f96e8104340 msgr2=0x7f96e8198f00 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:48:24.335 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:24.333+0000 7f96eed8f700 1 --2- 
192.168.123.105:0/2788291836 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f96e8104340 0x7f96e8198f00 secure :-1 s=READY pgs=256 cs=0 l=1 rev1=1 crypto rx=0x7f96dc0096a0 tx=0x7f96dc00f880 comp rx=0 tx=0).stop 2026-03-10T07:48:24.335 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:24.333+0000 7f96eed8f700 1 -- 192.168.123.105:0/2788291836 shutdown_connections 2026-03-10T07:48:24.335 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:24.333+0000 7f96eed8f700 1 --2- 192.168.123.105:0/2788291836 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f96d006c490 0x7f96d006e950 unknown :-1 s=CLOSED pgs=100 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:48:24.335 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:24.333+0000 7f96eed8f700 1 --2- 192.168.123.105:0/2788291836 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f96e8103140 0x7f96e81989c0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:48:24.335 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:24.333+0000 7f96eed8f700 1 --2- 192.168.123.105:0/2788291836 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f96e8104340 0x7f96e8198f00 unknown :-1 s=CLOSED pgs=256 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:48:24.335 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:24.333+0000 7f96eed8f700 1 -- 192.168.123.105:0/2788291836 >> 192.168.123.105:0/2788291836 conn(0x7f96e80fe6c0 msgr2=0x7f96e8107570 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:48:24.335 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:24.333+0000 7f96eed8f700 1 -- 192.168.123.105:0/2788291836 shutdown_connections 2026-03-10T07:48:24.335 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:24.333+0000 7f96eed8f700 1 -- 192.168.123.105:0/2788291836 wait complete. 
2026-03-10T07:48:24.402 DEBUG:teuthology.orchestra.run.vm05:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 12e9780e-1c55-11f1-8896-79f7c2e9b508 -- bash -c 'ceph fs dump' 2026-03-10T07:48:24.585 INFO:teuthology.orchestra.run.vm05.stderr:Inferring config /var/lib/ceph/12e9780e-1c55-11f1-8896-79f7c2e9b508/mon.vm05/config 2026-03-10T07:48:24.618 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:48:24 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd='[{"prefix": "osd pool create", "pool": "cephfs.cephfs.meta"}]': finished 2026-03-10T07:48:24.618 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:48:24 vm05 ceph-mon[50387]: osdmap e34: 6 total, 6 up, 6 in 2026-03-10T07:48:24.618 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:48:24 vm05 ceph-mon[50387]: pgmap v72: 33 pgs: 32 unknown, 1 active+clean; 449 KiB data, 160 MiB used, 120 GiB / 120 GiB avail 2026-03-10T07:48:24.618 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:48:24 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"bulk": true, "prefix": "osd pool create", "pool": "cephfs.cephfs.data"}]: dispatch 2026-03-10T07:48:24.618 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:48:24 vm05 ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508-mon-vm05[50383]: 2026-03-10T07:48:24.308+0000 7f1d80cd3700 -1 log_channel(cluster) log [ERR] : Health check failed: 1 filesystem is offline (MDS_ALL_DOWN) 2026-03-10T07:48:24.668 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:48:24 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd='[{"prefix": "osd pool create", "pool": "cephfs.cephfs.meta"}]': finished 2026-03-10T07:48:24.668 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:48:24 vm08 ceph-mon[59917]: osdmap e34: 6 total, 6 up, 6 in 2026-03-10T07:48:24.668 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 
07:48:24 vm08 ceph-mon[59917]: pgmap v72: 33 pgs: 32 unknown, 1 active+clean; 449 KiB data, 160 MiB used, 120 GiB / 120 GiB avail 2026-03-10T07:48:24.668 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:48:24 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"bulk": true, "prefix": "osd pool create", "pool": "cephfs.cephfs.data"}]: dispatch 2026-03-10T07:48:24.951 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:24.949+0000 7f5e42eb0700 1 -- 192.168.123.105:0/485635707 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f5e3c107d90 msgr2=0x7f5e3c10a1c0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:48:24.951 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:24.949+0000 7f5e42eb0700 1 --2- 192.168.123.105:0/485635707 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f5e3c107d90 0x7f5e3c10a1c0 secure :-1 s=READY pgs=257 cs=0 l=1 rev1=1 crypto rx=0x7f5e38009b00 tx=0x7f5e38009e10 comp rx=0 tx=0).stop 2026-03-10T07:48:24.952 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:24.949+0000 7f5e42eb0700 1 -- 192.168.123.105:0/485635707 shutdown_connections 2026-03-10T07:48:24.952 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:24.949+0000 7f5e42eb0700 1 --2- 192.168.123.105:0/485635707 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f5e3c10a700 0x7f5e3c10cb90 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:48:24.952 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:24.949+0000 7f5e42eb0700 1 --2- 192.168.123.105:0/485635707 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f5e3c107d90 0x7f5e3c10a1c0 unknown :-1 s=CLOSED pgs=257 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:48:24.952 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:24.949+0000 7f5e42eb0700 1 -- 192.168.123.105:0/485635707 >> 192.168.123.105:0/485635707 
conn(0x7f5e3c06daa0 msgr2=0x7f5e3c06ff00 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:48:24.952 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:24.950+0000 7f5e42eb0700 1 -- 192.168.123.105:0/485635707 shutdown_connections 2026-03-10T07:48:24.952 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:24.950+0000 7f5e42eb0700 1 -- 192.168.123.105:0/485635707 wait complete. 2026-03-10T07:48:24.952 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:24.950+0000 7f5e42eb0700 1 Processor -- start 2026-03-10T07:48:24.952 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:24.950+0000 7f5e42eb0700 1 -- start start 2026-03-10T07:48:24.952 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:24.950+0000 7f5e42eb0700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f5e3c107d90 0x7f5e3c116ba0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:48:24.952 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:24.950+0000 7f5e42eb0700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f5e3c10a700 0x7f5e3c117100 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:48:24.952 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:24.950+0000 7f5e42eb0700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f5e3c1aeff0 con 0x7f5e3c107d90 2026-03-10T07:48:24.952 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:24.950+0000 7f5e42eb0700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f5e3c1af160 con 0x7f5e3c10a700 2026-03-10T07:48:24.952 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:24.951+0000 7f5e416ad700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f5e3c10a700 0x7f5e3c117100 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 
tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:48:24.952 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:24.951+0000 7f5e416ad700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f5e3c10a700 0x7f5e3c117100 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.108:3300/0 says I am v2:192.168.123.105:60146/0 (socket says 192.168.123.105:60146) 2026-03-10T07:48:24.952 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:24.951+0000 7f5e416ad700 1 -- 192.168.123.105:0/3318912806 learned_addr learned my addr 192.168.123.105:0/3318912806 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T07:48:24.953 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:24.951+0000 7f5e41eae700 1 --2- 192.168.123.105:0/3318912806 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f5e3c107d90 0x7f5e3c116ba0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:48:24.953 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:24.951+0000 7f5e416ad700 1 -- 192.168.123.105:0/3318912806 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f5e3c107d90 msgr2=0x7f5e3c116ba0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:48:24.953 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:24.951+0000 7f5e416ad700 1 --2- 192.168.123.105:0/3318912806 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f5e3c107d90 0x7f5e3c116ba0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:48:24.953 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:24.951+0000 7f5e416ad700 1 -- 192.168.123.105:0/3318912806 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f5e380097e0 con 0x7f5e3c10a700 
2026-03-10T07:48:24.953 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:24.952+0000 7f5e41eae700 1 --2- 192.168.123.105:0/3318912806 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f5e3c107d90 0x7f5e3c116ba0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request state changed! 2026-03-10T07:48:24.953 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:24.952+0000 7f5e416ad700 1 --2- 192.168.123.105:0/3318912806 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f5e3c10a700 0x7f5e3c117100 secure :-1 s=READY pgs=47 cs=0 l=1 rev1=1 crypto rx=0x7f5e2c00b700 tx=0x7f5e2c00bac0 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:48:24.954 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:24.953+0000 7f5e32ffd700 1 -- 192.168.123.105:0/3318912806 <== mon.1 v2:192.168.123.108:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f5e2c010840 con 0x7f5e3c10a700 2026-03-10T07:48:24.955 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:24.953+0000 7f5e42eb0700 1 -- 192.168.123.105:0/3318912806 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f5e3c1af3e0 con 0x7f5e3c10a700 2026-03-10T07:48:24.955 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:24.953+0000 7f5e42eb0700 1 -- 192.168.123.105:0/3318912806 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f5e3c1af930 con 0x7f5e3c10a700 2026-03-10T07:48:24.955 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:24.953+0000 7f5e42eb0700 1 -- 192.168.123.105:0/3318912806 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f5e3c110c60 con 0x7f5e3c10a700 2026-03-10T07:48:24.955 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:24.954+0000 7f5e32ffd700 1 -- 192.168.123.105:0/3318912806 
<== mon.1 v2:192.168.123.108:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f5e2c010e80 con 0x7f5e3c10a700 2026-03-10T07:48:24.955 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:24.954+0000 7f5e32ffd700 1 -- 192.168.123.105:0/3318912806 <== mon.1 v2:192.168.123.108:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f5e2c00d590 con 0x7f5e3c10a700 2026-03-10T07:48:24.957 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:24.955+0000 7f5e32ffd700 1 -- 192.168.123.105:0/3318912806 <== mon.1 v2:192.168.123.108:3300/0 4 ==== mgrmap(e 19) v1 ==== 90308+0+0 (secure 0 0 0) 0x7f5e2c00f3e0 con 0x7f5e3c10a700 2026-03-10T07:48:24.957 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:24.955+0000 7f5e32ffd700 1 --2- 192.168.123.105:0/3318912806 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f5e2806c600 0x7f5e2806eac0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:48:24.957 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:24.955+0000 7f5e32ffd700 1 -- 192.168.123.105:0/3318912806 <== mon.1 v2:192.168.123.108:3300/0 5 ==== osd_map(36..36 src has 1..36) v4 ==== 5276+0+0 (secure 0 0 0) 0x7f5e2c08b5a0 con 0x7f5e3c10a700 2026-03-10T07:48:24.957 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:24.955+0000 7f5e41eae700 1 --2- 192.168.123.105:0/3318912806 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f5e2806c600 0x7f5e2806eac0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:48:24.957 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:24.956+0000 7f5e41eae700 1 --2- 192.168.123.105:0/3318912806 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f5e2806c600 0x7f5e2806eac0 secure :-1 s=READY pgs=101 cs=0 l=1 rev1=1 crypto rx=0x7f5e38000c00 tx=0x7f5e38005c00 comp rx=0 tx=0).ready entity=mgr.14223 
client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:48:24.959 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:24.957+0000 7f5e32ffd700 1 -- 192.168.123.105:0/3318912806 <== mon.1 v2:192.168.123.108:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7f5e2c055ea0 con 0x7f5e3c10a700 2026-03-10T07:48:25.112 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:25.110+0000 7f5e42eb0700 1 -- 192.168.123.105:0/3318912806 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "fs dump"} v 0) v1 -- 0x7f5e3c04ea90 con 0x7f5e3c10a700 2026-03-10T07:48:25.113 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:25.111+0000 7f5e32ffd700 1 -- 192.168.123.105:0/3318912806 <== mon.1 v2:192.168.123.108:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump"}]=0 dumped fsmap epoch 2 v2) v1 ==== 75+0+1093 (secure 0 0 0) 0x7f5e2c0594c0 con 0x7f5e3c10a700 2026-03-10T07:48:25.115 INFO:teuthology.orchestra.run.vm05.stdout:e2 2026-03-10T07:48:25.115 INFO:teuthology.orchestra.run.vm05.stdout:enable_multiple, ever_enabled_multiple: 1,1 2026-03-10T07:48:25.115 INFO:teuthology.orchestra.run.vm05.stdout:default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2} 2026-03-10T07:48:25.115 INFO:teuthology.orchestra.run.vm05.stdout:legacy client fscid: 1 2026-03-10T07:48:25.115 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-10T07:48:25.115 INFO:teuthology.orchestra.run.vm05.stdout:Filesystem 'cephfs' (1) 2026-03-10T07:48:25.115 INFO:teuthology.orchestra.run.vm05.stdout:fs_name cephfs 2026-03-10T07:48:25.115 INFO:teuthology.orchestra.run.vm05.stdout:epoch 2 2026-03-10T07:48:25.115 INFO:teuthology.orchestra.run.vm05.stdout:flags 12 joinable allow_snaps allow_multimds_snaps 
2026-03-10T07:48:25.115 INFO:teuthology.orchestra.run.vm05.stdout:created 2026-03-10T07:48:24.309293+0000 2026-03-10T07:48:25.115 INFO:teuthology.orchestra.run.vm05.stdout:modified 2026-03-10T07:48:24.309342+0000 2026-03-10T07:48:25.115 INFO:teuthology.orchestra.run.vm05.stdout:tableserver 0 2026-03-10T07:48:25.115 INFO:teuthology.orchestra.run.vm05.stdout:root 0 2026-03-10T07:48:25.115 INFO:teuthology.orchestra.run.vm05.stdout:session_timeout 60 2026-03-10T07:48:25.115 INFO:teuthology.orchestra.run.vm05.stdout:session_autoclose 300 2026-03-10T07:48:25.115 INFO:teuthology.orchestra.run.vm05.stdout:max_file_size 1099511627776 2026-03-10T07:48:25.116 INFO:teuthology.orchestra.run.vm05.stdout:required_client_features {} 2026-03-10T07:48:25.116 INFO:teuthology.orchestra.run.vm05.stdout:last_failure 0 2026-03-10T07:48:25.116 INFO:teuthology.orchestra.run.vm05.stdout:last_failure_osd_epoch 0 2026-03-10T07:48:25.116 INFO:teuthology.orchestra.run.vm05.stdout:compat compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2} 2026-03-10T07:48:25.116 INFO:teuthology.orchestra.run.vm05.stdout:max_mds 1 2026-03-10T07:48:25.116 INFO:teuthology.orchestra.run.vm05.stdout:in 2026-03-10T07:48:25.116 INFO:teuthology.orchestra.run.vm05.stdout:up {} 2026-03-10T07:48:25.116 INFO:teuthology.orchestra.run.vm05.stdout:failed 2026-03-10T07:48:25.116 INFO:teuthology.orchestra.run.vm05.stdout:damaged 2026-03-10T07:48:25.116 INFO:teuthology.orchestra.run.vm05.stdout:stopped 2026-03-10T07:48:25.116 INFO:teuthology.orchestra.run.vm05.stdout:data_pools [3] 2026-03-10T07:48:25.116 INFO:teuthology.orchestra.run.vm05.stdout:metadata_pool 2 2026-03-10T07:48:25.116 INFO:teuthology.orchestra.run.vm05.stdout:inline_data disabled 2026-03-10T07:48:25.116 
INFO:teuthology.orchestra.run.vm05.stdout:balancer 2026-03-10T07:48:25.116 INFO:teuthology.orchestra.run.vm05.stdout:bal_rank_mask -1 2026-03-10T07:48:25.116 INFO:teuthology.orchestra.run.vm05.stdout:standby_count_wanted 0 2026-03-10T07:48:25.116 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-10T07:48:25.116 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-10T07:48:25.117 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:25.115+0000 7f5e30ff9700 1 -- 192.168.123.105:0/3318912806 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f5e2806c600 msgr2=0x7f5e2806eac0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:48:25.117 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:25.115+0000 7f5e30ff9700 1 --2- 192.168.123.105:0/3318912806 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f5e2806c600 0x7f5e2806eac0 secure :-1 s=READY pgs=101 cs=0 l=1 rev1=1 crypto rx=0x7f5e38000c00 tx=0x7f5e38005c00 comp rx=0 tx=0).stop 2026-03-10T07:48:25.117 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:25.115+0000 7f5e30ff9700 1 -- 192.168.123.105:0/3318912806 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f5e3c10a700 msgr2=0x7f5e3c117100 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:48:25.117 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:25.115+0000 7f5e30ff9700 1 --2- 192.168.123.105:0/3318912806 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f5e3c10a700 0x7f5e3c117100 secure :-1 s=READY pgs=47 cs=0 l=1 rev1=1 crypto rx=0x7f5e2c00b700 tx=0x7f5e2c00bac0 comp rx=0 tx=0).stop 2026-03-10T07:48:25.117 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:25.116+0000 7f5e30ff9700 1 -- 192.168.123.105:0/3318912806 shutdown_connections 2026-03-10T07:48:25.117 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:25.116+0000 7f5e30ff9700 1 --2- 192.168.123.105:0/3318912806 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] 
conn(0x7f5e2806c600 0x7f5e2806eac0 unknown :-1 s=CLOSED pgs=101 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:48:25.117 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:25.116+0000 7f5e30ff9700 1 --2- 192.168.123.105:0/3318912806 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f5e3c107d90 0x7f5e3c116ba0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:48:25.117 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:25.116+0000 7f5e30ff9700 1 --2- 192.168.123.105:0/3318912806 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f5e3c10a700 0x7f5e3c117100 unknown :-1 s=CLOSED pgs=47 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:48:25.118 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:25.116+0000 7f5e30ff9700 1 -- 192.168.123.105:0/3318912806 >> 192.168.123.105:0/3318912806 conn(0x7f5e3c06daa0 msgr2=0x7f5e3c06ff00 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:48:25.118 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:25.116+0000 7f5e30ff9700 1 -- 192.168.123.105:0/3318912806 shutdown_connections 2026-03-10T07:48:25.118 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:25.116+0000 7f5e30ff9700 1 -- 192.168.123.105:0/3318912806 wait complete. 2026-03-10T07:48:25.122 INFO:teuthology.orchestra.run.vm05.stderr:dumped fsmap epoch 2 2026-03-10T07:48:25.257 INFO:teuthology.run_tasks:Running task cephadm.shell... 
2026-03-10T07:48:25.259 INFO:tasks.cephadm:Running commands on role host.a host ubuntu@vm05.local 2026-03-10T07:48:25.260 DEBUG:teuthology.orchestra.run.vm05:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 12e9780e-1c55-11f1-8896-79f7c2e9b508 -- bash -c 'ceph fs set cephfs max_mds 1' 2026-03-10T07:48:25.488 INFO:teuthology.orchestra.run.vm05.stderr:Inferring config /var/lib/ceph/12e9780e-1c55-11f1-8896-79f7c2e9b508/mon.vm05/config 2026-03-10T07:48:25.531 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:48:25 vm05 ceph-mon[50387]: Health check failed: 1 pool(s) do not have an application enabled (POOL_APP_NOT_ENABLED) 2026-03-10T07:48:25.532 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:48:25 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd='[{"bulk": true, "prefix": "osd pool create", "pool": "cephfs.cephfs.data"}]': finished 2026-03-10T07:48:25.532 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:48:25 vm05 ceph-mon[50387]: osdmap e35: 6 total, 6 up, 6 in 2026-03-10T07:48:25.532 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:48:25 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "fs new", "fs_name": "cephfs", "metadata": "cephfs.cephfs.meta", "data": "cephfs.cephfs.data"}]: dispatch 2026-03-10T07:48:25.532 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:48:25 vm05 ceph-mon[50387]: Health check failed: 1 filesystem is offline (MDS_ALL_DOWN) 2026-03-10T07:48:25.532 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:48:25 vm05 ceph-mon[50387]: Health check failed: 1 filesystem is online with fewer MDS than max_mds (MDS_UP_LESS_THAN_MAX) 2026-03-10T07:48:25.532 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:48:25 vm05 ceph-mon[50387]: osdmap e36: 6 total, 6 up, 6 in 2026-03-10T07:48:25.532 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:48:25 vm05 
ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd='[{"prefix": "fs new", "fs_name": "cephfs", "metadata": "cephfs.cephfs.meta", "data": "cephfs.cephfs.data"}]': finished 2026-03-10T07:48:25.532 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:48:25 vm05 ceph-mon[50387]: fsmap cephfs:0 2026-03-10T07:48:25.532 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:48:25 vm05 ceph-mon[50387]: Saving service mds.cephfs spec with placement count:4 2026-03-10T07:48:25.532 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:48:25 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' 2026-03-10T07:48:25.532 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:48:25 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T07:48:25.532 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:48:25 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T07:48:25.532 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:48:25 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T07:48:25.532 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:48:25 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' 2026-03-10T07:48:25.532 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:48:25 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "auth get-or-create", "entity": "mds.cephfs.vm05.omfhnh", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]: dispatch 2026-03-10T07:48:25.532 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:48:25 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' 
cmd='[{"prefix": "auth get-or-create", "entity": "mds.cephfs.vm05.omfhnh", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]': finished 2026-03-10T07:48:25.532 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:48:25 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T07:48:25.532 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:48:25 vm05 ceph-mon[50387]: Deploying daemon mds.cephfs.vm05.omfhnh on vm05 2026-03-10T07:48:25.532 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:48:25 vm05 ceph-mon[50387]: from='client.? 192.168.123.105:0/3318912806' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch 2026-03-10T07:48:25.668 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:48:25 vm08 ceph-mon[59917]: Health check failed: 1 pool(s) do not have an application enabled (POOL_APP_NOT_ENABLED) 2026-03-10T07:48:25.669 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:48:25 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd='[{"bulk": true, "prefix": "osd pool create", "pool": "cephfs.cephfs.data"}]': finished 2026-03-10T07:48:25.669 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:48:25 vm08 ceph-mon[59917]: osdmap e35: 6 total, 6 up, 6 in 2026-03-10T07:48:25.669 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:48:25 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "fs new", "fs_name": "cephfs", "metadata": "cephfs.cephfs.meta", "data": "cephfs.cephfs.data"}]: dispatch 2026-03-10T07:48:25.669 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:48:25 vm08 ceph-mon[59917]: Health check failed: 1 filesystem is offline (MDS_ALL_DOWN) 2026-03-10T07:48:25.669 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:48:25 vm08 ceph-mon[59917]: Health check failed: 1 filesystem is online with fewer MDS than max_mds (MDS_UP_LESS_THAN_MAX) 
2026-03-10T07:48:25.669 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:48:25 vm08 ceph-mon[59917]: osdmap e36: 6 total, 6 up, 6 in 2026-03-10T07:48:25.669 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:48:25 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd='[{"prefix": "fs new", "fs_name": "cephfs", "metadata": "cephfs.cephfs.meta", "data": "cephfs.cephfs.data"}]': finished 2026-03-10T07:48:25.669 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:48:25 vm08 ceph-mon[59917]: fsmap cephfs:0 2026-03-10T07:48:25.669 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:48:25 vm08 ceph-mon[59917]: Saving service mds.cephfs spec with placement count:4 2026-03-10T07:48:25.669 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:48:25 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' 2026-03-10T07:48:25.669 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:48:25 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T07:48:25.669 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:48:25 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T07:48:25.669 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:48:25 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T07:48:25.669 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:48:25 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' 2026-03-10T07:48:25.669 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:48:25 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "auth get-or-create", "entity": "mds.cephfs.vm05.omfhnh", "caps": ["mon", "profile mds", "osd", 
"allow rw tag cephfs *=*", "mds", "allow"]}]: dispatch 2026-03-10T07:48:25.669 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:48:25 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd='[{"prefix": "auth get-or-create", "entity": "mds.cephfs.vm05.omfhnh", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]': finished 2026-03-10T07:48:25.669 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:48:25 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T07:48:25.669 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:48:25 vm08 ceph-mon[59917]: Deploying daemon mds.cephfs.vm05.omfhnh on vm05 2026-03-10T07:48:25.669 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:48:25 vm08 ceph-mon[59917]: from='client.? 192.168.123.105:0/3318912806' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch 2026-03-10T07:48:25.764 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:25.762+0000 7fc1af96c700 1 -- 192.168.123.105:0/250510094 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fc1a8100f00 msgr2=0x7fc1a8101320 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:48:25.765 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:25.762+0000 7fc1af96c700 1 --2- 192.168.123.105:0/250510094 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fc1a8100f00 0x7fc1a8101320 secure :-1 s=READY pgs=258 cs=0 l=1 rev1=1 crypto rx=0x7fc1a4009b50 tx=0x7fc1a4009e60 comp rx=0 tx=0).stop 2026-03-10T07:48:25.765 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:25.763+0000 7fc1af96c700 1 -- 192.168.123.105:0/250510094 shutdown_connections 2026-03-10T07:48:25.765 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:25.763+0000 7fc1af96c700 1 --2- 192.168.123.105:0/250510094 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fc1a8102060 
0x7fc1a81024e0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:48:25.765 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:25.763+0000 7fc1af96c700 1 --2- 192.168.123.105:0/250510094 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fc1a8100f00 0x7fc1a8101320 unknown :-1 s=CLOSED pgs=258 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:48:25.765 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:25.763+0000 7fc1af96c700 1 -- 192.168.123.105:0/250510094 >> 192.168.123.105:0/250510094 conn(0x7fc1a80fc460 msgr2=0x7fc1a80fe8e0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:48:25.765 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:25.763+0000 7fc1af96c700 1 -- 192.168.123.105:0/250510094 shutdown_connections 2026-03-10T07:48:25.765 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:25.763+0000 7fc1af96c700 1 -- 192.168.123.105:0/250510094 wait complete. 2026-03-10T07:48:25.765 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:25.764+0000 7fc1af96c700 1 Processor -- start 2026-03-10T07:48:25.766 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:25.764+0000 7fc1af96c700 1 -- start start 2026-03-10T07:48:25.766 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:25.764+0000 7fc1af96c700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fc1a8100f00 0x7fc1a8071e10 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:48:25.766 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:25.764+0000 7fc1af96c700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fc1a8102060 0x7fc1a8072350 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:48:25.766 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:25.764+0000 7fc1af96c700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- 
mon_getmap magic: 0 v1 -- 0x7fc1a8072970 con 0x7fc1a8102060 2026-03-10T07:48:25.767 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:25.764+0000 7fc1af96c700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fc1a8072ab0 con 0x7fc1a8100f00 2026-03-10T07:48:25.767 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:25.764+0000 7fc1acf07700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fc1a8102060 0x7fc1a8072350 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:48:25.767 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:25.764+0000 7fc1acf07700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fc1a8102060 0x7fc1a8072350 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.105:46952/0 (socket says 192.168.123.105:46952) 2026-03-10T07:48:25.767 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:25.764+0000 7fc1acf07700 1 -- 192.168.123.105:0/1624240953 learned_addr learned my addr 192.168.123.105:0/1624240953 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T07:48:25.767 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:25.764+0000 7fc1acf07700 1 -- 192.168.123.105:0/1624240953 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fc1a8100f00 msgr2=0x7fc1a8071e10 unknown :-1 s=STATE_CONNECTING l=1).mark_down 2026-03-10T07:48:25.767 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:25.764+0000 7fc1acf07700 1 --2- 192.168.123.105:0/1624240953 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fc1a8100f00 0x7fc1a8071e10 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:48:25.767 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:25.764+0000 
7fc1acf07700 1 -- 192.168.123.105:0/1624240953 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fc1a40097e0 con 0x7fc1a8102060 2026-03-10T07:48:25.767 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:25.765+0000 7fc1acf07700 1 --2- 192.168.123.105:0/1624240953 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fc1a8102060 0x7fc1a8072350 secure :-1 s=READY pgs=259 cs=0 l=1 rev1=1 crypto rx=0x7fc19c00d8d0 tx=0x7fc19c00dc90 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:48:25.767 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:25.765+0000 7fc19a7fc700 1 -- 192.168.123.105:0/1624240953 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fc19c009940 con 0x7fc1a8102060 2026-03-10T07:48:25.767 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:25.765+0000 7fc19a7fc700 1 -- 192.168.123.105:0/1624240953 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fc19c010460 con 0x7fc1a8102060 2026-03-10T07:48:25.767 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:25.765+0000 7fc19a7fc700 1 -- 192.168.123.105:0/1624240953 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fc19c00f5d0 con 0x7fc1a8102060 2026-03-10T07:48:25.767 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:25.765+0000 7fc1af96c700 1 -- 192.168.123.105:0/1624240953 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fc1a8073540 con 0x7fc1a8102060 2026-03-10T07:48:25.767 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:25.766+0000 7fc1af96c700 1 -- 192.168.123.105:0/1624240953 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fc1a8073a90 con 0x7fc1a8102060 2026-03-10T07:48:25.768 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:25.766+0000 7fc1af96c700 1 -- 192.168.123.105:0/1624240953 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fc1a8066e80 con 0x7fc1a8102060 2026-03-10T07:48:25.771 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:25.767+0000 7fc19a7fc700 1 -- 192.168.123.105:0/1624240953 <== mon.0 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 19) v1 ==== 90308+0+0 (secure 0 0 0) 0x7fc19c0105d0 con 0x7fc1a8102060 2026-03-10T07:48:25.772 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:25.767+0000 7fc19a7fc700 1 --2- 192.168.123.105:0/1624240953 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fc19406c4e0 0x7fc19406e9a0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:48:25.772 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:25.768+0000 7fc19a7fc700 1 -- 192.168.123.105:0/1624240953 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(37..37 src has 1..37) v4 ==== 5276+0+0 (secure 0 0 0) 0x7fc19c08b610 con 0x7fc1a8102060 2026-03-10T07:48:25.772 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:25.770+0000 7fc19a7fc700 1 -- 192.168.123.105:0/1624240953 <== mon.0 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7fc19c059710 con 0x7fc1a8102060 2026-03-10T07:48:25.772 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:25.770+0000 7fc1ad708700 1 --2- 192.168.123.105:0/1624240953 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fc19406c4e0 0x7fc19406e9a0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:48:25.773 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:25.771+0000 7fc1ad708700 1 --2- 192.168.123.105:0/1624240953 >> 
[v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fc19406c4e0 0x7fc19406e9a0 secure :-1 s=READY pgs=102 cs=0 l=1 rev1=1 crypto rx=0x7fc1a4009b20 tx=0x7fc1a4005fb0 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:48:25.901 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:25.900+0000 7fc1af96c700 1 -- 192.168.123.105:0/1624240953 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "fs set", "fs_name": "cephfs", "var": "max_mds", "val": "1"} v 0) v1 -- 0x7fc1a8074290 con 0x7fc1a8102060 2026-03-10T07:48:26.386 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:26.384+0000 7fc19a7fc700 1 -- 192.168.123.105:0/1624240953 <== mon.0 v2:192.168.123.105:3300/0 7 ==== mon_command_ack([{"prefix": "fs set", "fs_name": "cephfs", "var": "max_mds", "val": "1"}]=0 v3) v1 ==== 105+0+0 (secure 0 0 0) 0x7fc19c059530 con 0x7fc1a8102060 2026-03-10T07:48:26.388 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:26.386+0000 7fc1af96c700 1 -- 192.168.123.105:0/1624240953 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fc19406c4e0 msgr2=0x7fc19406e9a0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:48:26.388 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:26.386+0000 7fc1af96c700 1 --2- 192.168.123.105:0/1624240953 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fc19406c4e0 0x7fc19406e9a0 secure :-1 s=READY pgs=102 cs=0 l=1 rev1=1 crypto rx=0x7fc1a4009b20 tx=0x7fc1a4005fb0 comp rx=0 tx=0).stop 2026-03-10T07:48:26.388 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:26.386+0000 7fc1af96c700 1 -- 192.168.123.105:0/1624240953 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fc1a8102060 msgr2=0x7fc1a8072350 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:48:26.388 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:26.386+0000 7fc1af96c700 1 --2- 
192.168.123.105:0/1624240953 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fc1a8102060 0x7fc1a8072350 secure :-1 s=READY pgs=259 cs=0 l=1 rev1=1 crypto rx=0x7fc19c00d8d0 tx=0x7fc19c00dc90 comp rx=0 tx=0).stop 2026-03-10T07:48:26.388 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:26.387+0000 7fc1af96c700 1 -- 192.168.123.105:0/1624240953 shutdown_connections 2026-03-10T07:48:26.388 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:26.387+0000 7fc1af96c700 1 --2- 192.168.123.105:0/1624240953 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fc19406c4e0 0x7fc19406e9a0 unknown :-1 s=CLOSED pgs=102 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:48:26.388 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:26.387+0000 7fc1af96c700 1 --2- 192.168.123.105:0/1624240953 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fc1a8100f00 0x7fc1a8071e10 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:48:26.388 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:26.387+0000 7fc1af96c700 1 --2- 192.168.123.105:0/1624240953 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fc1a8102060 0x7fc1a8072350 unknown :-1 s=CLOSED pgs=259 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:48:26.388 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:26.387+0000 7fc1af96c700 1 -- 192.168.123.105:0/1624240953 >> 192.168.123.105:0/1624240953 conn(0x7fc1a80fc460 msgr2=0x7fc1a8105320 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:48:26.388 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:26.387+0000 7fc1af96c700 1 -- 192.168.123.105:0/1624240953 shutdown_connections 2026-03-10T07:48:26.389 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:26.387+0000 7fc1af96c700 1 -- 192.168.123.105:0/1624240953 wait complete. 
2026-03-10T07:48:26.459 INFO:teuthology.run_tasks:Running task cephadm.shell... 2026-03-10T07:48:26.461 INFO:tasks.cephadm:Running commands on role host.a host ubuntu@vm05.local 2026-03-10T07:48:26.462 DEBUG:teuthology.orchestra.run.vm05:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 12e9780e-1c55-11f1-8896-79f7c2e9b508 -- bash -c 'ceph fs set cephfs allow_standby_replay false' 2026-03-10T07:48:26.643 INFO:teuthology.orchestra.run.vm05.stderr:Inferring config /var/lib/ceph/12e9780e-1c55-11f1-8896-79f7c2e9b508/mon.vm05/config 2026-03-10T07:48:26.657 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:48:26 vm05 ceph-mon[50387]: pgmap v75: 65 pgs: 13 creating+peering, 10 active+clean, 42 unknown; 449 KiB data, 160 MiB used, 120 GiB / 120 GiB avail 2026-03-10T07:48:26.657 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:48:26 vm05 ceph-mon[50387]: osdmap e37: 6 total, 6 up, 6 in 2026-03-10T07:48:26.657 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:48:26 vm05 ceph-mon[50387]: Health check cleared: POOL_APP_NOT_ENABLED (was: 1 pool(s) do not have an application enabled) 2026-03-10T07:48:26.657 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:48:26 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' 2026-03-10T07:48:26.658 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:48:26 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' 2026-03-10T07:48:26.658 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:48:26 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' 2026-03-10T07:48:26.658 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:48:26 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "auth get-or-create", "entity": "mds.cephfs.vm08.ybmbgd", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", 
"allow"]}]: dispatch 2026-03-10T07:48:26.658 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:48:26 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd='[{"prefix": "auth get-or-create", "entity": "mds.cephfs.vm08.ybmbgd", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]': finished 2026-03-10T07:48:26.658 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:48:26 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T07:48:26.658 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:48:26 vm05 ceph-mon[50387]: Deploying daemon mds.cephfs.vm08.ybmbgd on vm08 2026-03-10T07:48:26.658 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:48:26 vm05 ceph-mon[50387]: from='client.? 192.168.123.105:0/1624240953' entity='client.admin' cmd=[{"prefix": "fs set", "fs_name": "cephfs", "var": "max_mds", "val": "1"}]: dispatch 2026-03-10T07:48:26.668 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:48:26 vm08 ceph-mon[59917]: pgmap v75: 65 pgs: 13 creating+peering, 10 active+clean, 42 unknown; 449 KiB data, 160 MiB used, 120 GiB / 120 GiB avail 2026-03-10T07:48:26.668 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:48:26 vm08 ceph-mon[59917]: osdmap e37: 6 total, 6 up, 6 in 2026-03-10T07:48:26.668 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:48:26 vm08 ceph-mon[59917]: Health check cleared: POOL_APP_NOT_ENABLED (was: 1 pool(s) do not have an application enabled) 2026-03-10T07:48:26.668 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:48:26 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' 2026-03-10T07:48:26.668 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:48:26 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' 2026-03-10T07:48:26.668 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:48:26 vm08 ceph-mon[59917]: 
from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' 2026-03-10T07:48:26.668 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:48:26 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "auth get-or-create", "entity": "mds.cephfs.vm08.ybmbgd", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]: dispatch 2026-03-10T07:48:26.668 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:48:26 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd='[{"prefix": "auth get-or-create", "entity": "mds.cephfs.vm08.ybmbgd", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]': finished 2026-03-10T07:48:26.668 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:48:26 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T07:48:26.668 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:48:26 vm08 ceph-mon[59917]: Deploying daemon mds.cephfs.vm08.ybmbgd on vm08 2026-03-10T07:48:26.669 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:48:26 vm08 ceph-mon[59917]: from='client.? 
192.168.123.105:0/1624240953' entity='client.admin' cmd=[{"prefix": "fs set", "fs_name": "cephfs", "var": "max_mds", "val": "1"}]: dispatch 2026-03-10T07:48:26.979 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:26.975+0000 7f0ae765b700 1 -- 192.168.123.105:0/3857024733 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f0ae0108750 msgr2=0x7f0ae0108bb0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:48:26.979 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:26.975+0000 7f0ae765b700 1 --2- 192.168.123.105:0/3857024733 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f0ae0108750 0x7f0ae0108bb0 secure :-1 s=READY pgs=260 cs=0 l=1 rev1=1 crypto rx=0x7f0adc009b00 tx=0x7f0adc009e10 comp rx=0 tx=0).stop 2026-03-10T07:48:26.979 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:26.978+0000 7f0ae765b700 1 -- 192.168.123.105:0/3857024733 shutdown_connections 2026-03-10T07:48:26.979 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:26.978+0000 7f0ae765b700 1 --2- 192.168.123.105:0/3857024733 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f0ae0108750 0x7f0ae0108bb0 unknown :-1 s=CLOSED pgs=260 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:48:26.979 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:26.978+0000 7f0ae765b700 1 --2- 192.168.123.105:0/3857024733 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f0ae0107550 0x7f0ae0107970 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:48:26.979 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:26.978+0000 7f0ae765b700 1 -- 192.168.123.105:0/3857024733 >> 192.168.123.105:0/3857024733 conn(0x7f0ae0076500 msgr2=0x7f0ae0078960 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:48:26.980 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:26.978+0000 7f0ae765b700 1 -- 192.168.123.105:0/3857024733 shutdown_connections 
2026-03-10T07:48:26.980 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:26.979+0000 7f0ae765b700 1 -- 192.168.123.105:0/3857024733 wait complete. 2026-03-10T07:48:26.982 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:26.979+0000 7f0ae765b700 1 Processor -- start 2026-03-10T07:48:26.982 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:26.981+0000 7f0ae765b700 1 -- start start 2026-03-10T07:48:26.983 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:26.981+0000 7f0ae765b700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f0ae0107550 0x7f0ae019ce70 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:48:26.983 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:26.981+0000 7f0ae765b700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f0ae0108750 0x7f0ae019d3b0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:48:26.983 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:26.981+0000 7f0ae765b700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f0ae019d9d0 con 0x7f0ae0107550 2026-03-10T07:48:26.983 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:26.981+0000 7f0ae4bf6700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f0ae0108750 0x7f0ae019d3b0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:48:26.983 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:26.981+0000 7f0ae765b700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f0ae019db10 con 0x7f0ae0108750 2026-03-10T07:48:26.983 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:26.981+0000 7f0ae4bf6700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f0ae0108750 
0x7f0ae019d3b0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.108:3300/0 says I am v2:192.168.123.105:60192/0 (socket says 192.168.123.105:60192) 2026-03-10T07:48:26.983 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:26.981+0000 7f0ae4bf6700 1 -- 192.168.123.105:0/3776558681 learned_addr learned my addr 192.168.123.105:0/3776558681 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T07:48:26.983 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:26.982+0000 7f0ae4bf6700 1 -- 192.168.123.105:0/3776558681 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f0ae0107550 msgr2=0x7f0ae019ce70 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:48:26.983 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:26.982+0000 7f0ae53f7700 1 --2- 192.168.123.105:0/3776558681 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f0ae0107550 0x7f0ae019ce70 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:48:26.984 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:26.982+0000 7f0ae4bf6700 1 --2- 192.168.123.105:0/3776558681 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f0ae0107550 0x7f0ae019ce70 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:48:26.984 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:26.982+0000 7f0ae4bf6700 1 -- 192.168.123.105:0/3776558681 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f0adc0097e0 con 0x7f0ae0108750 2026-03-10T07:48:26.984 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:26.982+0000 7f0ae4bf6700 1 --2- 192.168.123.105:0/3776558681 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f0ae0108750 0x7f0ae019d3b0 secure :-1 s=READY pgs=52 
cs=0 l=1 rev1=1 crypto rx=0x7f0adc005230 tx=0x7f0adc0056c0 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:48:26.984 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:26.982+0000 7f0ae53f7700 1 --2- 192.168.123.105:0/3776558681 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f0ae0107550 0x7f0ae019ce70 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request state changed! 2026-03-10T07:48:26.984 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:26.982+0000 7f0ad27fc700 1 -- 192.168.123.105:0/3776558681 <== mon.1 v2:192.168.123.108:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f0adc01d070 con 0x7f0ae0108750 2026-03-10T07:48:26.984 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:26.982+0000 7f0ad27fc700 1 -- 192.168.123.105:0/3776558681 <== mon.1 v2:192.168.123.108:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f0adc00bc50 con 0x7f0ae0108750 2026-03-10T07:48:26.984 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:26.982+0000 7f0ad27fc700 1 -- 192.168.123.105:0/3776558681 <== mon.1 v2:192.168.123.108:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f0adc00f850 con 0x7f0ae0108750 2026-03-10T07:48:26.984 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:26.983+0000 7f0ae765b700 1 -- 192.168.123.105:0/3776558681 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f0ae01a2560 con 0x7f0ae0108750 2026-03-10T07:48:26.984 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:26.983+0000 7f0ae765b700 1 -- 192.168.123.105:0/3776558681 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f0ae01a2a00 con 0x7f0ae0108750 2026-03-10T07:48:26.986 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:26.984+0000 7f0ae765b700 1 -- 192.168.123.105:0/3776558681 --> 
[v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f0ac4005320 con 0x7f0ae0108750 2026-03-10T07:48:26.986 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:26.984+0000 7f0ad27fc700 1 -- 192.168.123.105:0/3776558681 <== mon.1 v2:192.168.123.108:3300/0 4 ==== mgrmap(e 19) v1 ==== 90308+0+0 (secure 0 0 0) 0x7f0adc00fac0 con 0x7f0ae0108750 2026-03-10T07:48:26.986 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:26.984+0000 7f0ad27fc700 1 --2- 192.168.123.105:0/3776558681 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f0acc06c530 0x7f0acc06e9f0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:48:26.986 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:26.984+0000 7f0ad27fc700 1 -- 192.168.123.105:0/3776558681 <== mon.1 v2:192.168.123.108:3300/0 5 ==== osd_map(38..38 src has 1..38) v4 ==== 5276+0+0 (secure 0 0 0) 0x7f0adc08cfa0 con 0x7f0ae0108750 2026-03-10T07:48:26.987 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:26.986+0000 7f0ae53f7700 1 --2- 192.168.123.105:0/3776558681 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f0acc06c530 0x7f0acc06e9f0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:48:26.988 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:26.987+0000 7f0ad27fc700 1 -- 192.168.123.105:0/3776558681 <== mon.1 v2:192.168.123.108:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7f0adc05b2e0 con 0x7f0ae0108750 2026-03-10T07:48:26.988 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:26.987+0000 7f0ae53f7700 1 --2- 192.168.123.105:0/3776558681 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f0acc06c530 0x7f0acc06e9f0 secure :-1 s=READY pgs=105 cs=0 l=1 rev1=1 crypto 
rx=0x7f0ae01085b0 tx=0x7f0ad4008040 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:48:27.149 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:27.147+0000 7f0ae765b700 1 -- 192.168.123.105:0/3776558681 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "fs set", "fs_name": "cephfs", "var": "allow_standby_replay", "val": "false"} v 0) v1 -- 0x7f0ac4005f70 con 0x7f0ae0108750 2026-03-10T07:48:27.334 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:48:27 vm08 ceph-mon[59917]: osdmap e38: 6 total, 6 up, 6 in 2026-03-10T07:48:27.334 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:48:27 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' 2026-03-10T07:48:27.334 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:48:27 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' 2026-03-10T07:48:27.334 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:48:27 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' 2026-03-10T07:48:27.334 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:48:27 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "auth get-or-create", "entity": "mds.cephfs.vm05.pavqil", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]: dispatch 2026-03-10T07:48:27.334 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:48:27 vm08 ceph-mon[59917]: mds.? [v2:192.168.123.105:6826/723078808,v1:192.168.123.105:6827/723078808] up:boot 2026-03-10T07:48:27.334 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:48:27 vm08 ceph-mon[59917]: from='client.? 
192.168.123.105:0/1624240953' entity='client.admin' cmd='[{"prefix": "fs set", "fs_name": "cephfs", "var": "max_mds", "val": "1"}]': finished 2026-03-10T07:48:27.334 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:48:27 vm08 ceph-mon[59917]: mds.? [v2:192.168.123.108:6824/3692157290,v1:192.168.123.108:6825/3692157290] up:boot 2026-03-10T07:48:27.334 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:48:27 vm08 ceph-mon[59917]: daemon mds.cephfs.vm08.ybmbgd assigned to filesystem cephfs as rank 0 (now has 1 ranks) 2026-03-10T07:48:27.334 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:48:27 vm08 ceph-mon[59917]: Health check cleared: MDS_ALL_DOWN (was: 1 filesystem is offline) 2026-03-10T07:48:27.334 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:48:27 vm08 ceph-mon[59917]: Health check cleared: MDS_UP_LESS_THAN_MAX (was: 1 filesystem is online with fewer MDS than max_mds) 2026-03-10T07:48:27.334 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:48:27 vm08 ceph-mon[59917]: Cluster is now healthy 2026-03-10T07:48:27.334 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:48:27 vm08 ceph-mon[59917]: fsmap cephfs:0 2 up:standby 2026-03-10T07:48:27.334 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:48:27 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm05.omfhnh"}]: dispatch 2026-03-10T07:48:27.337 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:48:27 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm08.ybmbgd"}]: dispatch 2026-03-10T07:48:27.338 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:48:27 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd='[{"prefix": "auth get-or-create", "entity": "mds.cephfs.vm05.pavqil", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]': finished 2026-03-10T07:48:27.338 
INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:48:27 vm08 ceph-mon[59917]: fsmap cephfs:1 {0=cephfs.vm08.ybmbgd=up:creating} 1 up:standby 2026-03-10T07:48:27.338 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:48:27 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T07:48:27.338 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:48:27 vm08 ceph-mon[59917]: Deploying daemon mds.cephfs.vm05.pavqil on vm05 2026-03-10T07:48:27.338 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:48:27 vm08 ceph-mon[59917]: daemon mds.cephfs.vm08.ybmbgd is now active in filesystem cephfs as rank 0 2026-03-10T07:48:27.338 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:48:27 vm08 ceph-mon[59917]: mds.? [v2:192.168.123.108:6824/3692157290,v1:192.168.123.108:6825/3692157290] up:active 2026-03-10T07:48:27.338 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:48:27 vm08 ceph-mon[59917]: fsmap cephfs:1 {0=cephfs.vm08.ybmbgd=up:active} 1 up:standby 2026-03-10T07:48:27.338 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:48:27 vm08 ceph-mon[59917]: from='client.? 192.168.123.105:0/3776558681' entity='client.admin' cmd=[{"prefix": "fs set", "fs_name": "cephfs", "var": "allow_standby_replay", "val": "false"}]: dispatch 2026-03-10T07:48:27.338 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:48:27 vm08 ceph-mon[59917]: from='client.? 
' entity='client.admin' cmd=[{"prefix": "fs set", "fs_name": "cephfs", "var": "allow_standby_replay", "val": "false"}]: dispatch 2026-03-10T07:48:27.658 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:48:27 vm05 ceph-mon[50387]: osdmap e38: 6 total, 6 up, 6 in 2026-03-10T07:48:27.658 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:48:27 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' 2026-03-10T07:48:27.658 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:48:27 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' 2026-03-10T07:48:27.658 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:48:27 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' 2026-03-10T07:48:27.658 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:48:27 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "auth get-or-create", "entity": "mds.cephfs.vm05.pavqil", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]: dispatch 2026-03-10T07:48:27.658 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:48:27 vm05 ceph-mon[50387]: mds.? [v2:192.168.123.105:6826/723078808,v1:192.168.123.105:6827/723078808] up:boot 2026-03-10T07:48:27.658 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:48:27 vm05 ceph-mon[50387]: from='client.? 192.168.123.105:0/1624240953' entity='client.admin' cmd='[{"prefix": "fs set", "fs_name": "cephfs", "var": "max_mds", "val": "1"}]': finished 2026-03-10T07:48:27.658 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:48:27 vm05 ceph-mon[50387]: mds.? 
[v2:192.168.123.108:6824/3692157290,v1:192.168.123.108:6825/3692157290] up:boot 2026-03-10T07:48:27.658 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:48:27 vm05 ceph-mon[50387]: daemon mds.cephfs.vm08.ybmbgd assigned to filesystem cephfs as rank 0 (now has 1 ranks) 2026-03-10T07:48:27.658 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:48:27 vm05 ceph-mon[50387]: Health check cleared: MDS_ALL_DOWN (was: 1 filesystem is offline) 2026-03-10T07:48:27.658 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:48:27 vm05 ceph-mon[50387]: Health check cleared: MDS_UP_LESS_THAN_MAX (was: 1 filesystem is online with fewer MDS than max_mds) 2026-03-10T07:48:27.658 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:48:27 vm05 ceph-mon[50387]: Cluster is now healthy 2026-03-10T07:48:27.658 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:48:27 vm05 ceph-mon[50387]: fsmap cephfs:0 2 up:standby 2026-03-10T07:48:27.658 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:48:27 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm05.omfhnh"}]: dispatch 2026-03-10T07:48:27.658 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:48:27 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm08.ybmbgd"}]: dispatch 2026-03-10T07:48:27.658 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:48:27 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd='[{"prefix": "auth get-or-create", "entity": "mds.cephfs.vm05.pavqil", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]': finished 2026-03-10T07:48:27.658 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:48:27 vm05 ceph-mon[50387]: fsmap cephfs:1 {0=cephfs.vm08.ybmbgd=up:creating} 1 up:standby 2026-03-10T07:48:27.658 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:48:27 vm05 ceph-mon[50387]: 
from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T07:48:27.658 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:48:27 vm05 ceph-mon[50387]: Deploying daemon mds.cephfs.vm05.pavqil on vm05 2026-03-10T07:48:27.658 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:48:27 vm05 ceph-mon[50387]: daemon mds.cephfs.vm08.ybmbgd is now active in filesystem cephfs as rank 0 2026-03-10T07:48:27.658 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:48:27 vm05 ceph-mon[50387]: mds.? [v2:192.168.123.108:6824/3692157290,v1:192.168.123.108:6825/3692157290] up:active 2026-03-10T07:48:27.658 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:48:27 vm05 ceph-mon[50387]: fsmap cephfs:1 {0=cephfs.vm08.ybmbgd=up:active} 1 up:standby 2026-03-10T07:48:27.658 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:48:27 vm05 ceph-mon[50387]: from='client.? 192.168.123.105:0/3776558681' entity='client.admin' cmd=[{"prefix": "fs set", "fs_name": "cephfs", "var": "allow_standby_replay", "val": "false"}]: dispatch 2026-03-10T07:48:27.658 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:48:27 vm05 ceph-mon[50387]: from='client.? 
' entity='client.admin' cmd=[{"prefix": "fs set", "fs_name": "cephfs", "var": "allow_standby_replay", "val": "false"}]: dispatch 2026-03-10T07:48:28.132 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:28.128+0000 7f0ad27fc700 1 -- 192.168.123.105:0/3776558681 <== mon.1 v2:192.168.123.108:3300/0 7 ==== mon_command_ack([{"prefix": "fs set", "fs_name": "cephfs", "var": "allow_standby_replay", "val": "false"}]=0 v6) v1 ==== 122+0+0 (secure 0 0 0) 0x7f0adc05ae70 con 0x7f0ae0108750 2026-03-10T07:48:28.133 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:28.131+0000 7f0acbfff700 1 -- 192.168.123.105:0/3776558681 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f0acc06c530 msgr2=0x7f0acc06e9f0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:48:28.133 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:28.131+0000 7f0acbfff700 1 --2- 192.168.123.105:0/3776558681 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f0acc06c530 0x7f0acc06e9f0 secure :-1 s=READY pgs=105 cs=0 l=1 rev1=1 crypto rx=0x7f0ae01085b0 tx=0x7f0ad4008040 comp rx=0 tx=0).stop 2026-03-10T07:48:28.133 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:28.131+0000 7f0acbfff700 1 -- 192.168.123.105:0/3776558681 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f0ae0108750 msgr2=0x7f0ae019d3b0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:48:28.133 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:28.131+0000 7f0acbfff700 1 --2- 192.168.123.105:0/3776558681 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f0ae0108750 0x7f0ae019d3b0 secure :-1 s=READY pgs=52 cs=0 l=1 rev1=1 crypto rx=0x7f0adc005230 tx=0x7f0adc0056c0 comp rx=0 tx=0).stop 2026-03-10T07:48:28.133 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:28.131+0000 7f0acbfff700 1 -- 192.168.123.105:0/3776558681 shutdown_connections 2026-03-10T07:48:28.133 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:28.131+0000 7f0acbfff700 1 --2- 192.168.123.105:0/3776558681 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f0acc06c530 0x7f0acc06e9f0 unknown :-1 s=CLOSED pgs=105 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:48:28.133 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:28.131+0000 7f0acbfff700 1 --2- 192.168.123.105:0/3776558681 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f0ae0107550 0x7f0ae019ce70 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:48:28.133 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:28.131+0000 7f0acbfff700 1 --2- 192.168.123.105:0/3776558681 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f0ae0108750 0x7f0ae019d3b0 unknown :-1 s=CLOSED pgs=52 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:48:28.133 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:28.131+0000 7f0acbfff700 1 -- 192.168.123.105:0/3776558681 >> 192.168.123.105:0/3776558681 conn(0x7f0ae0076500 msgr2=0x7f0ae010b980 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:48:28.133 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:28.131+0000 7f0acbfff700 1 -- 192.168.123.105:0/3776558681 shutdown_connections 2026-03-10T07:48:28.133 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:28.131+0000 7f0acbfff700 1 -- 192.168.123.105:0/3776558681 wait complete. 2026-03-10T07:48:28.183 INFO:teuthology.run_tasks:Running task cephadm.shell... 
2026-03-10T07:48:28.185 INFO:tasks.cephadm:Running commands on role host.a host ubuntu@vm05.local 2026-03-10T07:48:28.185 DEBUG:teuthology.orchestra.run.vm05:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 12e9780e-1c55-11f1-8896-79f7c2e9b508 -- bash -c 'ceph fs set cephfs inline_data true --yes-i-really-really-mean-it' 2026-03-10T07:48:28.340 INFO:teuthology.orchestra.run.vm05.stderr:Inferring config /var/lib/ceph/12e9780e-1c55-11f1-8896-79f7c2e9b508/mon.vm05/config 2026-03-10T07:48:28.628 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:48:28 vm08 ceph-mon[59917]: pgmap v78: 65 pgs: 13 creating+peering, 38 active+clean, 14 unknown; 450 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 767 B/s wr, 3 op/s 2026-03-10T07:48:28.628 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:48:28 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' 2026-03-10T07:48:28.628 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:48:28 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' 2026-03-10T07:48:28.628 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:48:28 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' 2026-03-10T07:48:28.628 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:48:28 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "auth get-or-create", "entity": "mds.cephfs.vm08.dgsaon", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]: dispatch 2026-03-10T07:48:28.628 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:48:28 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd='[{"prefix": "auth get-or-create", "entity": "mds.cephfs.vm08.dgsaon", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]': finished 
2026-03-10T07:48:28.628 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:48:28 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T07:48:28.628 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:48:28 vm08 ceph-mon[59917]: Deploying daemon mds.cephfs.vm08.dgsaon on vm08 2026-03-10T07:48:28.628 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:48:28 vm08 ceph-mon[59917]: from='client.? ' entity='client.admin' cmd='[{"prefix": "fs set", "fs_name": "cephfs", "var": "allow_standby_replay", "val": "false"}]': finished 2026-03-10T07:48:28.628 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:48:28 vm08 ceph-mon[59917]: mds.? [v2:192.168.123.105:6828/427555544,v1:192.168.123.105:6829/427555544] up:boot 2026-03-10T07:48:28.628 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:48:28 vm08 ceph-mon[59917]: fsmap cephfs:1 {0=cephfs.vm08.ybmbgd=up:active} 2 up:standby 2026-03-10T07:48:28.628 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:48:28 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm05.pavqil"}]: dispatch 2026-03-10T07:48:28.628 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:48:28 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T07:48:28.628 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:48:28 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' 2026-03-10T07:48:28.628 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:48:28 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' 2026-03-10T07:48:28.628 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:48:28 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' 2026-03-10T07:48:28.628 
INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:48:28 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' 2026-03-10T07:48:28.628 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:48:28 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' 2026-03-10T07:48:28.657 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:48:28 vm05 ceph-mon[50387]: pgmap v78: 65 pgs: 13 creating+peering, 38 active+clean, 14 unknown; 450 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 767 B/s wr, 3 op/s 2026-03-10T07:48:28.658 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:48:28 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' 2026-03-10T07:48:28.658 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:48:28 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' 2026-03-10T07:48:28.658 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:48:28 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' 2026-03-10T07:48:28.658 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:48:28 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "auth get-or-create", "entity": "mds.cephfs.vm08.dgsaon", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]: dispatch 2026-03-10T07:48:28.658 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:48:28 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd='[{"prefix": "auth get-or-create", "entity": "mds.cephfs.vm08.dgsaon", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]': finished 2026-03-10T07:48:28.658 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:48:28 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T07:48:28.658 
INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:48:28 vm05 ceph-mon[50387]: Deploying daemon mds.cephfs.vm08.dgsaon on vm08 2026-03-10T07:48:28.658 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:48:28 vm05 ceph-mon[50387]: from='client.? ' entity='client.admin' cmd='[{"prefix": "fs set", "fs_name": "cephfs", "var": "allow_standby_replay", "val": "false"}]': finished 2026-03-10T07:48:28.658 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:48:28 vm05 ceph-mon[50387]: mds.? [v2:192.168.123.105:6828/427555544,v1:192.168.123.105:6829/427555544] up:boot 2026-03-10T07:48:28.658 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:48:28 vm05 ceph-mon[50387]: fsmap cephfs:1 {0=cephfs.vm08.ybmbgd=up:active} 2 up:standby 2026-03-10T07:48:28.658 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:48:28 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm05.pavqil"}]: dispatch 2026-03-10T07:48:28.658 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:48:28 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T07:48:28.658 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:48:28 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' 2026-03-10T07:48:28.658 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:48:28 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' 2026-03-10T07:48:28.658 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:48:28 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' 2026-03-10T07:48:28.658 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:48:28 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' 2026-03-10T07:48:28.658 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:48:28 vm05 ceph-mon[50387]: from='mgr.14223 
192.168.123.105:0/2' entity='mgr.vm05.blexke' 2026-03-10T07:48:28.671 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:28.668+0000 7f465b59e700 1 -- 192.168.123.105:0/2629332550 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f465c100c90 msgr2=0x7f465c1010b0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:48:28.671 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:28.668+0000 7f465b59e700 1 --2- 192.168.123.105:0/2629332550 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f465c100c90 0x7f465c1010b0 secure :-1 s=READY pgs=263 cs=0 l=1 rev1=1 crypto rx=0x7f464c009b00 tx=0x7f464c009e10 comp rx=0 tx=0).stop 2026-03-10T07:48:28.676 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:28.674+0000 7f465b59e700 1 -- 192.168.123.105:0/2629332550 shutdown_connections 2026-03-10T07:48:28.676 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:28.674+0000 7f465b59e700 1 --2- 192.168.123.105:0/2629332550 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f465c101e90 0x7f465c1022f0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:48:28.676 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:28.674+0000 7f465b59e700 1 --2- 192.168.123.105:0/2629332550 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f465c100c90 0x7f465c1010b0 unknown :-1 s=CLOSED pgs=263 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:48:28.676 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:28.674+0000 7f465b59e700 1 -- 192.168.123.105:0/2629332550 >> 192.168.123.105:0/2629332550 conn(0x7f465c0fc210 msgr2=0x7f465c0fe670 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:48:28.676 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:28.674+0000 7f465b59e700 1 -- 192.168.123.105:0/2629332550 shutdown_connections 2026-03-10T07:48:28.676 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:28.674+0000 
7f465b59e700 1 -- 192.168.123.105:0/2629332550 wait complete. 2026-03-10T07:48:28.676 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:28.675+0000 7f465b59e700 1 Processor -- start 2026-03-10T07:48:28.676 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:28.675+0000 7f465b59e700 1 -- start start 2026-03-10T07:48:28.677 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:28.675+0000 7f465b59e700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f465c100c90 0x7f465c196560 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:48:28.677 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:28.675+0000 7f465a59c700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f465c100c90 0x7f465c196560 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:48:28.677 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:28.675+0000 7f465a59c700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f465c100c90 0x7f465c196560 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.105:47028/0 (socket says 192.168.123.105:47028) 2026-03-10T07:48:28.677 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:28.675+0000 7f465a59c700 1 -- 192.168.123.105:0/2864361691 learned_addr learned my addr 192.168.123.105:0/2864361691 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T07:48:28.677 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:28.675+0000 7f465b59e700 1 --2- 192.168.123.105:0/2864361691 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f465c101e90 0x7f465c196aa0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:48:28.677 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:28.675+0000 7f465b59e700 1 -- 192.168.123.105:0/2864361691 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f465c1970c0 con 0x7f465c100c90 2026-03-10T07:48:28.677 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:28.675+0000 7f465b59e700 1 -- 192.168.123.105:0/2864361691 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f465c197200 con 0x7f465c101e90 2026-03-10T07:48:28.677 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:28.676+0000 7f4659d9b700 1 --2- 192.168.123.105:0/2864361691 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f465c101e90 0x7f465c196aa0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:48:28.677 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:28.676+0000 7f465a59c700 1 -- 192.168.123.105:0/2864361691 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f465c101e90 msgr2=0x7f465c196aa0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:48:28.677 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:28.676+0000 7f465a59c700 1 --2- 192.168.123.105:0/2864361691 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f465c101e90 0x7f465c196aa0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:48:28.677 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:28.676+0000 7f465a59c700 1 -- 192.168.123.105:0/2864361691 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f464c0097e0 con 0x7f465c100c90 2026-03-10T07:48:28.678 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:28.676+0000 7f465a59c700 1 --2- 192.168.123.105:0/2864361691 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] 
conn(0x7f465c100c90 0x7f465c196560 secure :-1 s=READY pgs=264 cs=0 l=1 rev1=1 crypto rx=0x7f464c000c00 tx=0x7f464c004990 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:48:28.678 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:28.676+0000 7f464b7fe700 1 -- 192.168.123.105:0/2864361691 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f464c01d070 con 0x7f465c100c90 2026-03-10T07:48:28.678 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:28.676+0000 7f464b7fe700 1 -- 192.168.123.105:0/2864361691 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f464c022470 con 0x7f465c100c90 2026-03-10T07:48:28.679 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:28.676+0000 7f464b7fe700 1 -- 192.168.123.105:0/2864361691 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f464c00f670 con 0x7f465c100c90 2026-03-10T07:48:28.679 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:28.676+0000 7f465b59e700 1 -- 192.168.123.105:0/2864361691 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f465c19bc50 con 0x7f465c100c90 2026-03-10T07:48:28.679 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:28.676+0000 7f465b59e700 1 -- 192.168.123.105:0/2864361691 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f465c19c140 con 0x7f465c100c90 2026-03-10T07:48:28.682 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:28.678+0000 7f465b59e700 1 -- 192.168.123.105:0/2864361691 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f465c04ea90 con 0x7f465c100c90 2026-03-10T07:48:28.682 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:28.678+0000 7f464b7fe700 1 -- 192.168.123.105:0/2864361691 <== mon.0 
v2:192.168.123.105:3300/0 4 ==== mgrmap(e 19) v1 ==== 90308+0+0 (secure 0 0 0) 0x7f464c0225e0 con 0x7f465c100c90 2026-03-10T07:48:28.682 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:28.678+0000 7f464b7fe700 1 --2- 192.168.123.105:0/2864361691 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f464406c530 0x7f464406e9f0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:48:28.682 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:28.678+0000 7f464b7fe700 1 -- 192.168.123.105:0/2864361691 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(38..38 src has 1..38) v4 ==== 5276+0+0 (secure 0 0 0) 0x7f464c08cf30 con 0x7f465c100c90 2026-03-10T07:48:28.683 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:28.681+0000 7f4659d9b700 1 --2- 192.168.123.105:0/2864361691 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f464406c530 0x7f464406e9f0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:48:28.683 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:28.681+0000 7f464b7fe700 1 -- 192.168.123.105:0/2864361691 <== mon.0 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7f464c05b3e0 con 0x7f465c100c90 2026-03-10T07:48:28.683 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:28.682+0000 7f4659d9b700 1 --2- 192.168.123.105:0/2864361691 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f464406c530 0x7f464406e9f0 secure :-1 s=READY pgs=109 cs=0 l=1 rev1=1 crypto rx=0x7f4650005950 tx=0x7f465000b410 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:48:28.811 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:28.808+0000 7f465b59e700 1 -- 192.168.123.105:0/2864361691 --> 
[v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "fs set", "fs_name": "cephfs", "var": "inline_data", "val": "true", "yes_i_really_really_mean_it": true} v 0) v1 -- 0x7f465c19c390 con 0x7f465c100c90 2026-03-10T07:48:29.402 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:29.397+0000 7f464b7fe700 1 -- 192.168.123.105:0/2864361691 <== mon.0 v2:192.168.123.105:3300/0 7 ==== mon_command_ack([{"prefix": "fs set", "fs_name": "cephfs", "var": "inline_data", "val": "true", "yes_i_really_really_mean_it": true}]=0 inline data enabled v7) v1 ==== 168+0+0 (secure 0 0 0) 0x7f464c0270d0 con 0x7f465c100c90 2026-03-10T07:48:29.402 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:29.399+0000 7f46497fa700 1 -- 192.168.123.105:0/2864361691 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f464406c530 msgr2=0x7f464406e9f0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:48:29.402 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:29.399+0000 7f46497fa700 1 --2- 192.168.123.105:0/2864361691 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f464406c530 0x7f464406e9f0 secure :-1 s=READY pgs=109 cs=0 l=1 rev1=1 crypto rx=0x7f4650005950 tx=0x7f465000b410 comp rx=0 tx=0).stop 2026-03-10T07:48:29.402 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:29.399+0000 7f46497fa700 1 -- 192.168.123.105:0/2864361691 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f465c100c90 msgr2=0x7f465c196560 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:48:29.402 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:29.399+0000 7f46497fa700 1 --2- 192.168.123.105:0/2864361691 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f465c100c90 0x7f465c196560 secure :-1 s=READY pgs=264 cs=0 l=1 rev1=1 crypto rx=0x7f464c000c00 tx=0x7f464c004990 comp rx=0 tx=0).stop 2026-03-10T07:48:29.402 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:29.400+0000 7f46497fa700 1 -- 192.168.123.105:0/2864361691 shutdown_connections 2026-03-10T07:48:29.402 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:29.400+0000 7f46497fa700 1 --2- 192.168.123.105:0/2864361691 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f464406c530 0x7f464406e9f0 unknown :-1 s=CLOSED pgs=109 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:48:29.402 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:29.400+0000 7f46497fa700 1 --2- 192.168.123.105:0/2864361691 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f465c100c90 0x7f465c196560 unknown :-1 s=CLOSED pgs=264 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:48:29.402 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:29.400+0000 7f46497fa700 1 --2- 192.168.123.105:0/2864361691 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f465c101e90 0x7f465c196aa0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:48:29.402 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:29.400+0000 7f46497fa700 1 -- 192.168.123.105:0/2864361691 >> 192.168.123.105:0/2864361691 conn(0x7f465c0fc210 msgr2=0x7f465c1050c0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:48:29.402 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:29.400+0000 7f46497fa700 1 -- 192.168.123.105:0/2864361691 shutdown_connections 2026-03-10T07:48:29.403 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:29.401+0000 7f46497fa700 1 -- 192.168.123.105:0/2864361691 wait complete. 2026-03-10T07:48:29.405 INFO:teuthology.orchestra.run.vm05.stderr:inline data enabled 2026-03-10T07:48:29.474 INFO:teuthology.run_tasks:Running task cephadm.shell... 
2026-03-10T07:48:29.476 INFO:tasks.cephadm:Running commands on role host.a host ubuntu@vm05.local 2026-03-10T07:48:29.476 DEBUG:teuthology.orchestra.run.vm05:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 12e9780e-1c55-11f1-8896-79f7c2e9b508 -- bash -c 'ceph fs dump' 2026-03-10T07:48:29.501 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:48:29 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T07:48:29.501 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:48:29 vm05 ceph-mon[50387]: from='client.? 192.168.123.105:0/2864361691' entity='client.admin' cmd=[{"prefix": "fs set", "fs_name": "cephfs", "var": "inline_data", "val": "true", "yes_i_really_really_mean_it": true}]: dispatch 2026-03-10T07:48:29.501 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:48:29 vm05 ceph-mon[50387]: Health check failed: 1 filesystem with deprecated feature inline_data (FS_INLINE_DATA_DEPRECATED) 2026-03-10T07:48:29.501 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:48:29 vm05 ceph-mon[50387]: mds.? [v2:192.168.123.108:6826/2963085185,v1:192.168.123.108:6827/2963085185] up:boot 2026-03-10T07:48:29.501 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:48:29 vm05 ceph-mon[50387]: from='client.? 
192.168.123.105:0/2864361691' entity='client.admin' cmd='[{"prefix": "fs set", "fs_name": "cephfs", "var": "inline_data", "val": "true", "yes_i_really_really_mean_it": true}]': finished 2026-03-10T07:48:29.501 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:48:29 vm05 ceph-mon[50387]: fsmap cephfs:1 {0=cephfs.vm08.ybmbgd=up:active} 3 up:standby 2026-03-10T07:48:29.501 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:48:29 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm08.dgsaon"}]: dispatch 2026-03-10T07:48:29.692 INFO:teuthology.orchestra.run.vm05.stderr:Inferring config /var/lib/ceph/12e9780e-1c55-11f1-8896-79f7c2e9b508/mon.vm05/config 2026-03-10T07:48:29.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:48:29 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T07:48:29.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:48:29 vm08 ceph-mon[59917]: from='client.? 192.168.123.105:0/2864361691' entity='client.admin' cmd=[{"prefix": "fs set", "fs_name": "cephfs", "var": "inline_data", "val": "true", "yes_i_really_really_mean_it": true}]: dispatch 2026-03-10T07:48:29.919 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:48:29 vm08 ceph-mon[59917]: Health check failed: 1 filesystem with deprecated feature inline_data (FS_INLINE_DATA_DEPRECATED) 2026-03-10T07:48:29.919 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:48:29 vm08 ceph-mon[59917]: mds.? [v2:192.168.123.108:6826/2963085185,v1:192.168.123.108:6827/2963085185] up:boot 2026-03-10T07:48:29.919 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:48:29 vm08 ceph-mon[59917]: from='client.? 
192.168.123.105:0/2864361691' entity='client.admin' cmd='[{"prefix": "fs set", "fs_name": "cephfs", "var": "inline_data", "val": "true", "yes_i_really_really_mean_it": true}]': finished 2026-03-10T07:48:29.919 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:48:29 vm08 ceph-mon[59917]: fsmap cephfs:1 {0=cephfs.vm08.ybmbgd=up:active} 3 up:standby 2026-03-10T07:48:29.919 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:48:29 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm08.dgsaon"}]: dispatch 2026-03-10T07:48:29.997 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:29.995+0000 7fb2eb59e700 1 -- 192.168.123.105:0/498564612 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb2ec072b20 msgr2=0x7fb2ec072f40 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:48:29.998 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:29.995+0000 7fb2eb59e700 1 --2- 192.168.123.105:0/498564612 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb2ec072b20 0x7fb2ec072f40 secure :-1 s=READY pgs=265 cs=0 l=1 rev1=1 crypto rx=0x7fb2dc009b50 tx=0x7fb2dc009e60 comp rx=0 tx=0).stop 2026-03-10T07:48:29.998 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:29.996+0000 7fb2eb59e700 1 -- 192.168.123.105:0/498564612 shutdown_connections 2026-03-10T07:48:29.998 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:29.996+0000 7fb2eb59e700 1 --2- 192.168.123.105:0/498564612 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fb2ec075a10 0x7fb2ec077ea0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:48:29.998 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:29.996+0000 7fb2eb59e700 1 --2- 192.168.123.105:0/498564612 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb2ec072b20 0x7fb2ec072f40 unknown :-1 s=CLOSED pgs=265 cs=0 l=1 rev1=1 crypto rx=0 
tx=0 comp rx=0 tx=0).stop 2026-03-10T07:48:29.998 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:29.996+0000 7fb2eb59e700 1 -- 192.168.123.105:0/498564612 >> 192.168.123.105:0/498564612 conn(0x7fb2ec06daa0 msgr2=0x7fb2ec06ff00 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:48:30.000 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:29.996+0000 7fb2eb59e700 1 -- 192.168.123.105:0/498564612 shutdown_connections 2026-03-10T07:48:30.000 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:29.996+0000 7fb2eb59e700 1 -- 192.168.123.105:0/498564612 wait complete. 2026-03-10T07:48:30.000 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:29.996+0000 7fb2eb59e700 1 Processor -- start 2026-03-10T07:48:30.000 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:29.996+0000 7fb2eb59e700 1 -- start start 2026-03-10T07:48:30.001 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:29.996+0000 7fb2eb59e700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fb2ec072b20 0x7fb2ec082dc0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:48:30.001 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:29.996+0000 7fb2eb59e700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb2ec075a10 0x7fb2ec083300 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:48:30.001 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:29.996+0000 7fb2eb59e700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fb2ec0838d0 con 0x7fb2ec075a10 2026-03-10T07:48:30.001 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:29.996+0000 7fb2eb59e700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fb2ec083a40 con 0x7fb2ec072b20 2026-03-10T07:48:30.001 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:29.997+0000 
7fb2e9d9b700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb2ec075a10 0x7fb2ec083300 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:48:30.001 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:29.999+0000 7fb2e9d9b700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb2ec075a10 0x7fb2ec083300 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.105:36984/0 (socket says 192.168.123.105:36984) 2026-03-10T07:48:30.001 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:29.999+0000 7fb2e9d9b700 1 -- 192.168.123.105:0/2320862776 learned_addr learned my addr 192.168.123.105:0/2320862776 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T07:48:30.001 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:29.999+0000 7fb2e9d9b700 1 -- 192.168.123.105:0/2320862776 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fb2ec072b20 msgr2=0x7fb2ec082dc0 unknown :-1 s=STATE_CONNECTING l=1).mark_down 2026-03-10T07:48:30.001 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:29.999+0000 7fb2e9d9b700 1 --2- 192.168.123.105:0/2320862776 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fb2ec072b20 0x7fb2ec082dc0 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:48:30.001 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:29.999+0000 7fb2e9d9b700 1 -- 192.168.123.105:0/2320862776 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fb2dc0097e0 con 0x7fb2ec075a10 2026-03-10T07:48:30.001 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:29.999+0000 7fb2e9d9b700 1 --2- 192.168.123.105:0/2320862776 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] 
conn(0x7fb2ec075a10 0x7fb2ec083300 secure :-1 s=READY pgs=266 cs=0 l=1 rev1=1 crypto rx=0x7fb2e40060b0 tx=0x7fb2e400d750 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:48:30.002 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:30.000+0000 7fb2db7fe700 1 -- 192.168.123.105:0/2320862776 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fb2e4015400 con 0x7fb2ec075a10 2026-03-10T07:48:30.002 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:30.001+0000 7fb2eb59e700 1 -- 192.168.123.105:0/2320862776 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fb2ec12e580 con 0x7fb2ec075a10 2026-03-10T07:48:30.002 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:30.001+0000 7fb2eb59e700 1 -- 192.168.123.105:0/2320862776 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fb2ec12ead0 con 0x7fb2ec075a10 2026-03-10T07:48:30.004 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:30.001+0000 7fb2db7fe700 1 -- 192.168.123.105:0/2320862776 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fb2e400f040 con 0x7fb2ec075a10 2026-03-10T07:48:30.004 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:30.001+0000 7fb2db7fe700 1 -- 192.168.123.105:0/2320862776 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fb2e4014a20 con 0x7fb2ec075a10 2026-03-10T07:48:30.004 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:30.002+0000 7fb2eb59e700 1 -- 192.168.123.105:0/2320862776 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fb2cc005320 con 0x7fb2ec075a10 2026-03-10T07:48:30.009 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:30.003+0000 7fb2db7fe700 1 -- 192.168.123.105:0/2320862776 <== mon.0 
v2:192.168.123.105:3300/0 4 ==== mgrmap(e 19) v1 ==== 90308+0+0 (secure 0 0 0) 0x7fb2e4014cc0 con 0x7fb2ec075a10 2026-03-10T07:48:30.009 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:30.003+0000 7fb2db7fe700 1 --2- 192.168.123.105:0/2320862776 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fb2d406c600 0x7fb2d406eac0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:48:30.009 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:30.003+0000 7fb2db7fe700 1 -- 192.168.123.105:0/2320862776 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(38..38 src has 1..38) v4 ==== 5276+0+0 (secure 0 0 0) 0x7fb2e408b400 con 0x7fb2ec075a10 2026-03-10T07:48:30.009 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:30.005+0000 7fb2ea59c700 1 --2- 192.168.123.105:0/2320862776 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fb2d406c600 0x7fb2d406eac0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:48:30.009 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:30.005+0000 7fb2ea59c700 1 --2- 192.168.123.105:0/2320862776 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fb2d406c600 0x7fb2d406eac0 secure :-1 s=READY pgs=112 cs=0 l=1 rev1=1 crypto rx=0x7fb2dc005cb0 tx=0x7fb2dc005bc0 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:48:30.009 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:30.007+0000 7fb2db7fe700 1 -- 192.168.123.105:0/2320862776 <== mon.0 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7fb2e4055d00 con 0x7fb2ec075a10 2026-03-10T07:48:30.157 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:30.154+0000 7fb2eb59e700 1 -- 192.168.123.105:0/2320862776 --> 
[v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "fs dump"} v 0) v1 -- 0x7fb2cc006200 con 0x7fb2ec075a10 2026-03-10T07:48:30.157 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:30.155+0000 7fb2db7fe700 1 -- 192.168.123.105:0/2320862776 <== mon.0 v2:192.168.123.105:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump"}]=0 dumped fsmap epoch 7 v7) v1 ==== 75+0+1769 (secure 0 0 0) 0x7fb2e4059320 con 0x7fb2ec075a10 2026-03-10T07:48:30.158 INFO:teuthology.orchestra.run.vm05.stdout:e7 2026-03-10T07:48:30.158 INFO:teuthology.orchestra.run.vm05.stdout:enable_multiple, ever_enabled_multiple: 1,1 2026-03-10T07:48:30.158 INFO:teuthology.orchestra.run.vm05.stdout:default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2} 2026-03-10T07:48:30.158 INFO:teuthology.orchestra.run.vm05.stdout:legacy client fscid: 1 2026-03-10T07:48:30.158 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-10T07:48:30.158 INFO:teuthology.orchestra.run.vm05.stdout:Filesystem 'cephfs' (1) 2026-03-10T07:48:30.158 INFO:teuthology.orchestra.run.vm05.stdout:fs_name cephfs 2026-03-10T07:48:30.158 INFO:teuthology.orchestra.run.vm05.stdout:epoch 7 2026-03-10T07:48:30.158 INFO:teuthology.orchestra.run.vm05.stdout:flags 12 joinable allow_snaps allow_multimds_snaps 2026-03-10T07:48:30.158 INFO:teuthology.orchestra.run.vm05.stdout:created 2026-03-10T07:48:24.309293+0000 2026-03-10T07:48:30.158 INFO:teuthology.orchestra.run.vm05.stdout:modified 2026-03-10T07:48:29.392777+0000 2026-03-10T07:48:30.158 INFO:teuthology.orchestra.run.vm05.stdout:tableserver 0 2026-03-10T07:48:30.158 INFO:teuthology.orchestra.run.vm05.stdout:root 0 2026-03-10T07:48:30.158 INFO:teuthology.orchestra.run.vm05.stdout:session_timeout 60 2026-03-10T07:48:30.158 
INFO:teuthology.orchestra.run.vm05.stdout:session_autoclose 300 2026-03-10T07:48:30.158 INFO:teuthology.orchestra.run.vm05.stdout:max_file_size 1099511627776 2026-03-10T07:48:30.158 INFO:teuthology.orchestra.run.vm05.stdout:required_client_features {} 2026-03-10T07:48:30.158 INFO:teuthology.orchestra.run.vm05.stdout:last_failure 0 2026-03-10T07:48:30.158 INFO:teuthology.orchestra.run.vm05.stdout:last_failure_osd_epoch 0 2026-03-10T07:48:30.158 INFO:teuthology.orchestra.run.vm05.stdout:compat compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2} 2026-03-10T07:48:30.158 INFO:teuthology.orchestra.run.vm05.stdout:max_mds 1 2026-03-10T07:48:30.158 INFO:teuthology.orchestra.run.vm05.stdout:in 0 2026-03-10T07:48:30.158 INFO:teuthology.orchestra.run.vm05.stdout:up {0=24303} 2026-03-10T07:48:30.158 INFO:teuthology.orchestra.run.vm05.stdout:failed 2026-03-10T07:48:30.158 INFO:teuthology.orchestra.run.vm05.stdout:damaged 2026-03-10T07:48:30.158 INFO:teuthology.orchestra.run.vm05.stdout:stopped 2026-03-10T07:48:30.158 INFO:teuthology.orchestra.run.vm05.stdout:data_pools [3] 2026-03-10T07:48:30.158 INFO:teuthology.orchestra.run.vm05.stdout:metadata_pool 2 2026-03-10T07:48:30.158 INFO:teuthology.orchestra.run.vm05.stdout:inline_data enabled 2026-03-10T07:48:30.158 INFO:teuthology.orchestra.run.vm05.stdout:balancer 2026-03-10T07:48:30.158 INFO:teuthology.orchestra.run.vm05.stdout:bal_rank_mask -1 2026-03-10T07:48:30.158 INFO:teuthology.orchestra.run.vm05.stdout:standby_count_wanted 1 2026-03-10T07:48:30.158 INFO:teuthology.orchestra.run.vm05.stdout:[mds.cephfs.vm08.ybmbgd{0:24303} state up:active seq 2 addr [v2:192.168.123.108:6824/3692157290,v1:192.168.123.108:6825/3692157290] compat {c=[1],r=[1],i=[7ff]}] 2026-03-10T07:48:30.158 
INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-10T07:48:30.158 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-10T07:48:30.158 INFO:teuthology.orchestra.run.vm05.stdout:Standby daemons: 2026-03-10T07:48:30.158 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-10T07:48:30.158 INFO:teuthology.orchestra.run.vm05.stdout:[mds.cephfs.vm05.pavqil{-1:14512} state up:standby seq 1 addr [v2:192.168.123.105:6828/427555544,v1:192.168.123.105:6829/427555544] compat {c=[1],r=[1],i=[7ff]}] 2026-03-10T07:48:30.158 INFO:teuthology.orchestra.run.vm05.stdout:[mds.cephfs.vm05.omfhnh{-1:24297} state up:standby seq 1 addr [v2:192.168.123.105:6826/723078808,v1:192.168.123.105:6827/723078808] compat {c=[1],r=[1],i=[7ff]}] 2026-03-10T07:48:30.158 INFO:teuthology.orchestra.run.vm05.stdout:[mds.cephfs.vm08.dgsaon{-1:24313} state up:standby seq 1 addr [v2:192.168.123.108:6826/2963085185,v1:192.168.123.108:6827/2963085185] compat {c=[1],r=[1],i=[7ff]}] 2026-03-10T07:48:30.167 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:30.165+0000 7fb2d97fa700 1 -- 192.168.123.105:0/2320862776 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fb2d406c600 msgr2=0x7fb2d406eac0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:48:30.167 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:30.165+0000 7fb2d97fa700 1 --2- 192.168.123.105:0/2320862776 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fb2d406c600 0x7fb2d406eac0 secure :-1 s=READY pgs=112 cs=0 l=1 rev1=1 crypto rx=0x7fb2dc005cb0 tx=0x7fb2dc005bc0 comp rx=0 tx=0).stop 2026-03-10T07:48:30.167 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:30.165+0000 7fb2d97fa700 1 -- 192.168.123.105:0/2320862776 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb2ec075a10 msgr2=0x7fb2ec083300 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:48:30.167 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:30.165+0000 7fb2d97fa700 1 --2- 
192.168.123.105:0/2320862776 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb2ec075a10 0x7fb2ec083300 secure :-1 s=READY pgs=266 cs=0 l=1 rev1=1 crypto rx=0x7fb2e40060b0 tx=0x7fb2e400d750 comp rx=0 tx=0).stop 2026-03-10T07:48:30.168 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:30.167+0000 7fb2d97fa700 1 -- 192.168.123.105:0/2320862776 shutdown_connections 2026-03-10T07:48:30.168 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:30.167+0000 7fb2d97fa700 1 --2- 192.168.123.105:0/2320862776 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fb2d406c600 0x7fb2d406eac0 unknown :-1 s=CLOSED pgs=112 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:48:30.168 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:30.167+0000 7fb2d97fa700 1 --2- 192.168.123.105:0/2320862776 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fb2ec072b20 0x7fb2ec082dc0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:48:30.168 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:30.167+0000 7fb2d97fa700 1 --2- 192.168.123.105:0/2320862776 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb2ec075a10 0x7fb2ec083300 unknown :-1 s=CLOSED pgs=266 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:48:30.168 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:30.167+0000 7fb2d97fa700 1 -- 192.168.123.105:0/2320862776 >> 192.168.123.105:0/2320862776 conn(0x7fb2ec06daa0 msgr2=0x7fb2ec07aa50 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:48:30.168 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:30.167+0000 7fb2d97fa700 1 -- 192.168.123.105:0/2320862776 shutdown_connections 2026-03-10T07:48:30.168 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:30.167+0000 7fb2d97fa700 1 -- 192.168.123.105:0/2320862776 wait complete. 
2026-03-10T07:48:30.169 INFO:teuthology.orchestra.run.vm05.stderr:dumped fsmap epoch 7 2026-03-10T07:48:30.218 DEBUG:teuthology.orchestra.run.vm05:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 12e9780e-1c55-11f1-8896-79f7c2e9b508 -- bash -c 'ceph --format=json fs dump | jq -e ".filesystems | length == 1"' 2026-03-10T07:48:30.452 INFO:teuthology.orchestra.run.vm05.stderr:Inferring config /var/lib/ceph/12e9780e-1c55-11f1-8896-79f7c2e9b508/mon.vm05/config 2026-03-10T07:48:30.557 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:48:30 vm05 ceph-mon[50387]: pgmap v79: 65 pgs: 6 creating+peering, 59 active+clean; 450 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 1023 B/s wr, 6 op/s 2026-03-10T07:48:30.557 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:48:30 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' 2026-03-10T07:48:30.557 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:48:30 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' 2026-03-10T07:48:30.557 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:48:30 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' 2026-03-10T07:48:30.557 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:48:30 vm05 ceph-mon[50387]: from='client.? 
192.168.123.105:0/2320862776' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch 2026-03-10T07:48:30.557 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:48:30 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' 2026-03-10T07:48:30.557 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:48:30 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' 2026-03-10T07:48:30.557 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:48:30 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T07:48:30.557 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:48:30 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T07:48:30.557 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:48:30 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' 2026-03-10T07:48:30.557 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:48:30 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T07:48:30.557 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:48:30 vm05 ceph-mon[50387]: mds.? [v2:192.168.123.105:6826/723078808,v1:192.168.123.105:6827/723078808] up:standby 2026-03-10T07:48:30.557 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:48:30 vm05 ceph-mon[50387]: Dropping low affinity active daemon mds.cephfs.vm08.ybmbgd in favor of higher affinity standby. 
2026-03-10T07:48:30.557 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:48:30 vm05 ceph-mon[50387]: Replacing daemon mds.cephfs.vm08.ybmbgd as rank 0 with standby daemon mds.cephfs.vm05.omfhnh 2026-03-10T07:48:30.557 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:48:30 vm05 ceph-mon[50387]: Health check failed: 1 filesystem is degraded (FS_DEGRADED) 2026-03-10T07:48:30.557 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:48:30 vm05 ceph-mon[50387]: fsmap cephfs:1 {0=cephfs.vm08.ybmbgd=up:active} 3 up:standby 2026-03-10T07:48:30.557 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:48:30 vm05 ceph-mon[50387]: osdmap e39: 6 total, 6 up, 6 in 2026-03-10T07:48:30.557 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:48:30 vm05 ceph-mon[50387]: fsmap cephfs:1/1 {0=cephfs.vm05.omfhnh=up:replay} 2 up:standby 2026-03-10T07:48:30.752 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:48:30 vm08 ceph-mon[59917]: pgmap v79: 65 pgs: 6 creating+peering, 59 active+clean; 450 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 1023 B/s wr, 6 op/s 2026-03-10T07:48:30.752 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:48:30 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' 2026-03-10T07:48:30.752 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:48:30 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' 2026-03-10T07:48:30.752 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:48:30 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' 2026-03-10T07:48:30.752 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:48:30 vm08 ceph-mon[59917]: from='client.? 
192.168.123.105:0/2320862776' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch 2026-03-10T07:48:30.752 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:48:30 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' 2026-03-10T07:48:30.752 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:48:30 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' 2026-03-10T07:48:30.752 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:48:30 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T07:48:30.752 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:48:30 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T07:48:30.752 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:48:30 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' 2026-03-10T07:48:30.752 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:48:30 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T07:48:30.752 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:48:30 vm08 ceph-mon[59917]: mds.? [v2:192.168.123.105:6826/723078808,v1:192.168.123.105:6827/723078808] up:standby 2026-03-10T07:48:30.752 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:48:30 vm08 ceph-mon[59917]: Dropping low affinity active daemon mds.cephfs.vm08.ybmbgd in favor of higher affinity standby. 
2026-03-10T07:48:30.752 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:48:30 vm08 ceph-mon[59917]: Replacing daemon mds.cephfs.vm08.ybmbgd as rank 0 with standby daemon mds.cephfs.vm05.omfhnh 2026-03-10T07:48:30.752 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:48:30 vm08 ceph-mon[59917]: Health check failed: 1 filesystem is degraded (FS_DEGRADED) 2026-03-10T07:48:30.752 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:48:30 vm08 ceph-mon[59917]: fsmap cephfs:1 {0=cephfs.vm08.ybmbgd=up:active} 3 up:standby 2026-03-10T07:48:30.752 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:48:30 vm08 ceph-mon[59917]: osdmap e39: 6 total, 6 up, 6 in 2026-03-10T07:48:30.752 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:48:30 vm08 ceph-mon[59917]: fsmap cephfs:1/1 {0=cephfs.vm05.omfhnh=up:replay} 2 up:standby 2026-03-10T07:48:30.770 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:30.765+0000 7fd9de8d2700 1 -- 192.168.123.105:0/1704455951 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fd9d8075c80 msgr2=0x7fd9d8078110 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:48:30.770 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:30.765+0000 7fd9de8d2700 1 --2- 192.168.123.105:0/1704455951 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fd9d8075c80 0x7fd9d8078110 secure :-1 s=READY pgs=56 cs=0 l=1 rev1=1 crypto rx=0x7fd9c800bc70 tx=0x7fd9c800bf80 comp rx=0 tx=0).stop 2026-03-10T07:48:30.770 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:30.768+0000 7fd9de8d2700 1 -- 192.168.123.105:0/1704455951 shutdown_connections 2026-03-10T07:48:30.770 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:30.768+0000 7fd9de8d2700 1 --2- 192.168.123.105:0/1704455951 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fd9d8075c80 0x7fd9d8078110 unknown :-1 s=CLOSED pgs=56 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:48:30.770 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:30.768+0000 7fd9de8d2700 1 --2- 192.168.123.105:0/1704455951 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fd9d8072d90 0x7fd9d80731b0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:48:30.770 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:30.768+0000 7fd9de8d2700 1 -- 192.168.123.105:0/1704455951 >> 192.168.123.105:0/1704455951 conn(0x7fd9d806dda0 msgr2=0x7fd9d8070220 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:48:30.771 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:30.770+0000 7fd9de8d2700 1 -- 192.168.123.105:0/1704455951 shutdown_connections 2026-03-10T07:48:30.771 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:30.770+0000 7fd9de8d2700 1 -- 192.168.123.105:0/1704455951 wait complete. 2026-03-10T07:48:30.772 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:30.770+0000 7fd9de8d2700 1 Processor -- start 2026-03-10T07:48:30.772 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:30.770+0000 7fd9de8d2700 1 -- start start 2026-03-10T07:48:30.772 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:30.770+0000 7fd9de8d2700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fd9d8072d90 0x7fd9d81b2190 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:48:30.772 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:30.770+0000 7fd9de8d2700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fd9d8075c80 0x7fd9d81b26d0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:48:30.772 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:30.770+0000 7fd9de8d2700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fd9d81b2cf0 con 0x7fd9d8075c80 2026-03-10T07:48:30.772 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:30.770+0000 7fd9de8d2700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fd9d81b2e30 con 0x7fd9d8072d90 2026-03-10T07:48:30.773 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:30.771+0000 7fd9d7fff700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fd9d8072d90 0x7fd9d81b2190 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:48:30.773 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:30.771+0000 7fd9d7fff700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fd9d8072d90 0x7fd9d81b2190 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.108:3300/0 says I am v2:192.168.123.105:50746/0 (socket says 192.168.123.105:50746) 2026-03-10T07:48:30.773 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:30.771+0000 7fd9d7fff700 1 -- 192.168.123.105:0/497526156 learned_addr learned my addr 192.168.123.105:0/497526156 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T07:48:30.773 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:30.771+0000 7fd9d77fe700 1 --2- 192.168.123.105:0/497526156 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fd9d8075c80 0x7fd9d81b26d0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:48:30.773 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:30.771+0000 7fd9d7fff700 1 -- 192.168.123.105:0/497526156 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fd9d8075c80 msgr2=0x7fd9d81b26d0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:48:30.773 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:30.771+0000 7fd9d7fff700 1 --2- 
192.168.123.105:0/497526156 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fd9d8075c80 0x7fd9d81b26d0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:48:30.773 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:30.771+0000 7fd9d7fff700 1 -- 192.168.123.105:0/497526156 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fd9c800b920 con 0x7fd9d8072d90 2026-03-10T07:48:30.773 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:30.771+0000 7fd9d7fff700 1 --2- 192.168.123.105:0/497526156 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fd9d8072d90 0x7fd9d81b2190 secure :-1 s=READY pgs=57 cs=0 l=1 rev1=1 crypto rx=0x7fd9d000b3f0 tx=0x7fd9d000b700 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:48:30.773 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:30.771+0000 7fd9d57fa700 1 -- 192.168.123.105:0/497526156 <== mon.1 v2:192.168.123.108:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fd9d0003bb0 con 0x7fd9d8072d90 2026-03-10T07:48:30.773 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:30.771+0000 7fd9de8d2700 1 -- 192.168.123.105:0/497526156 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fd9d81b78e0 con 0x7fd9d8072d90 2026-03-10T07:48:30.774 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:30.772+0000 7fd9de8d2700 1 -- 192.168.123.105:0/497526156 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fd9d81b7e30 con 0x7fd9d8072d90 2026-03-10T07:48:30.774 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:30.772+0000 7fd9d57fa700 1 -- 192.168.123.105:0/497526156 <== mon.1 v2:192.168.123.108:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fd9d000be90 con 0x7fd9d8072d90 2026-03-10T07:48:30.775 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:30.772+0000 7fd9d57fa700 1 -- 192.168.123.105:0/497526156 <== mon.1 v2:192.168.123.108:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fd9d0004470 con 0x7fd9d8072d90 2026-03-10T07:48:30.775 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:30.774+0000 7fd9d57fa700 1 -- 192.168.123.105:0/497526156 <== mon.1 v2:192.168.123.108:3300/0 4 ==== mgrmap(e 19) v1 ==== 90308+0+0 (secure 0 0 0) 0x7fd9d00046b0 con 0x7fd9d8072d90 2026-03-10T07:48:30.776 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:30.774+0000 7fd9d57fa700 1 --2- 192.168.123.105:0/497526156 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fd9c006c490 0x7fd9c006e950 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:48:30.776 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:30.774+0000 7fd9d77fe700 1 --2- 192.168.123.105:0/497526156 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fd9c006c490 0x7fd9c006e950 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:48:30.776 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:30.774+0000 7fd9d57fa700 1 -- 192.168.123.105:0/497526156 <== mon.1 v2:192.168.123.108:3300/0 5 ==== osd_map(39..39 src has 1..39) v4 ==== 5362+0+0 (secure 0 0 0) 0x7fd9d0013030 con 0x7fd9d8072d90 2026-03-10T07:48:30.776 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:30.774+0000 7fd9de8d2700 1 -- 192.168.123.105:0/497526156 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fd9c4005320 con 0x7fd9d8072d90 2026-03-10T07:48:30.776 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:30.775+0000 7fd9d77fe700 1 --2- 192.168.123.105:0/497526156 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fd9c006c490 
0x7fd9c006e950 secure :-1 s=READY pgs=113 cs=0 l=1 rev1=1 crypto rx=0x7fd9c80059d0 tx=0x7fd9c8005940 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:48:30.779 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:30.778+0000 7fd9d57fa700 1 -- 192.168.123.105:0/497526156 <== mon.1 v2:192.168.123.108:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7fd9d005a0f0 con 0x7fd9d8072d90 2026-03-10T07:48:30.921 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:30.916+0000 7fd9de8d2700 1 -- 192.168.123.105:0/497526156 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "fs dump", "format": "json"} v 0) v1 -- 0x7fd9c4005f70 con 0x7fd9d8072d90 2026-03-10T07:48:30.921 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:30.918+0000 7fd9d57fa700 1 -- 192.168.123.105:0/497526156 <== mon.1 v2:192.168.123.108:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump", "format": "json"}]=0 dumped fsmap epoch 9 v9) v1 ==== 93+0+3963 (secure 0 0 0) 0x7fd9d0059c80 con 0x7fd9d8072d90 2026-03-10T07:48:30.923 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:30.922+0000 7fd9beffd700 1 -- 192.168.123.105:0/497526156 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fd9c006c490 msgr2=0x7fd9c006e950 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:48:30.924 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:30.922+0000 7fd9beffd700 1 --2- 192.168.123.105:0/497526156 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fd9c006c490 0x7fd9c006e950 secure :-1 s=READY pgs=113 cs=0 l=1 rev1=1 crypto rx=0x7fd9c80059d0 tx=0x7fd9c8005940 comp rx=0 tx=0).stop 2026-03-10T07:48:30.924 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:30.922+0000 7fd9beffd700 1 -- 192.168.123.105:0/497526156 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fd9d8072d90 
msgr2=0x7fd9d81b2190 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:48:30.924 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:30.922+0000 7fd9beffd700 1 --2- 192.168.123.105:0/497526156 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fd9d8072d90 0x7fd9d81b2190 secure :-1 s=READY pgs=57 cs=0 l=1 rev1=1 crypto rx=0x7fd9d000b3f0 tx=0x7fd9d000b700 comp rx=0 tx=0).stop 2026-03-10T07:48:30.924 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:30.922+0000 7fd9beffd700 1 -- 192.168.123.105:0/497526156 shutdown_connections 2026-03-10T07:48:30.924 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:30.922+0000 7fd9beffd700 1 --2- 192.168.123.105:0/497526156 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fd9c006c490 0x7fd9c006e950 unknown :-1 s=CLOSED pgs=113 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:48:30.924 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:30.922+0000 7fd9beffd700 1 --2- 192.168.123.105:0/497526156 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fd9d8072d90 0x7fd9d81b2190 unknown :-1 s=CLOSED pgs=57 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:48:30.924 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:30.923+0000 7fd9beffd700 1 --2- 192.168.123.105:0/497526156 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fd9d8075c80 0x7fd9d81b26d0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:48:30.924 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:30.923+0000 7fd9beffd700 1 -- 192.168.123.105:0/497526156 >> 192.168.123.105:0/497526156 conn(0x7fd9d806dda0 msgr2=0x7fd9d80777b0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:48:30.925 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:30.924+0000 7fd9beffd700 1 -- 192.168.123.105:0/497526156 shutdown_connections 2026-03-10T07:48:30.925 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:30.924+0000 7fd9beffd700 1 -- 192.168.123.105:0/497526156 wait complete. 2026-03-10T07:48:30.933 INFO:teuthology.orchestra.run.vm05.stderr:dumped fsmap epoch 9 2026-03-10T07:48:30.943 INFO:teuthology.orchestra.run.vm05.stdout:true 2026-03-10T07:48:30.992 DEBUG:teuthology.orchestra.run.vm05:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 12e9780e-1c55-11f1-8896-79f7c2e9b508 -- bash -c 'while ! ceph --format=json mds versions | jq -e ". | add == 4"; do sleep 1; done' 2026-03-10T07:48:31.189 INFO:teuthology.orchestra.run.vm05.stderr:Inferring config /var/lib/ceph/12e9780e-1c55-11f1-8896-79f7c2e9b508/mon.vm05/config 2026-03-10T07:48:31.514 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:31.511+0000 7f8b812ff700 1 -- 192.168.123.105:0/4280373322 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f8b7c072b20 msgr2=0x7f8b7c072f40 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:48:31.515 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:31.511+0000 7f8b812ff700 1 --2- 192.168.123.105:0/4280373322 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f8b7c072b20 0x7f8b7c072f40 secure :-1 s=READY pgs=268 cs=0 l=1 rev1=1 crypto rx=0x7f8b6c008790 tx=0x7f8b6c008aa0 comp rx=0 tx=0).stop 2026-03-10T07:48:31.515 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:31.511+0000 7f8b812ff700 1 -- 192.168.123.105:0/4280373322 shutdown_connections 2026-03-10T07:48:31.515 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:31.511+0000 7f8b812ff700 1 --2- 192.168.123.105:0/4280373322 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f8b7c075a10 0x7f8b7c077ea0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:48:31.515 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:31.511+0000 
7f8b812ff700 1 --2- 192.168.123.105:0/4280373322 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f8b7c072b20 0x7f8b7c072f40 unknown :-1 s=CLOSED pgs=268 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:48:31.515 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:31.511+0000 7f8b812ff700 1 -- 192.168.123.105:0/4280373322 >> 192.168.123.105:0/4280373322 conn(0x7f8b7c06daa0 msgr2=0x7f8b7c06ff00 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:48:31.515 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:31.511+0000 7f8b812ff700 1 -- 192.168.123.105:0/4280373322 shutdown_connections 2026-03-10T07:48:31.515 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:31.511+0000 7f8b812ff700 1 -- 192.168.123.105:0/4280373322 wait complete. 2026-03-10T07:48:31.515 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:31.512+0000 7f8b812ff700 1 Processor -- start 2026-03-10T07:48:31.515 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:31.512+0000 7f8b812ff700 1 -- start start 2026-03-10T07:48:31.515 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:31.512+0000 7f8b812ff700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f8b7c075a10 0x7f8b7c083130 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:48:31.515 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:31.512+0000 7f8b812ff700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f8b7c083670 0x7f8b7c1b3120 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:48:31.515 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:31.512+0000 7f8b812ff700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f8b7c083b80 con 0x7f8b7c075a10 2026-03-10T07:48:31.515 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:31.512+0000 7f8b812ff700 1 -- --> 
[v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f8b7c083cf0 con 0x7f8b7c083670 2026-03-10T07:48:31.515 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:31.512+0000 7f8b7affd700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f8b7c075a10 0x7f8b7c083130 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:48:31.515 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:31.512+0000 7f8b7affd700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f8b7c075a10 0x7f8b7c083130 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.105:37018/0 (socket says 192.168.123.105:37018) 2026-03-10T07:48:31.515 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:31.512+0000 7f8b7affd700 1 -- 192.168.123.105:0/3904165526 learned_addr learned my addr 192.168.123.105:0/3904165526 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T07:48:31.515 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:31.512+0000 7f8b7a7fc700 1 --2- 192.168.123.105:0/3904165526 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f8b7c083670 0x7f8b7c1b3120 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:48:31.515 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:31.512+0000 7f8b7affd700 1 -- 192.168.123.105:0/3904165526 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f8b7c083670 msgr2=0x7f8b7c1b3120 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:48:31.515 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:31.512+0000 7f8b7affd700 1 --2- 192.168.123.105:0/3904165526 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] 
conn(0x7f8b7c083670 0x7f8b7c1b3120 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:48:31.515 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:31.512+0000 7f8b7affd700 1 -- 192.168.123.105:0/3904165526 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f8b6c008440 con 0x7f8b7c075a10 2026-03-10T07:48:31.515 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:31.512+0000 7f8b7affd700 1 --2- 192.168.123.105:0/3904165526 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f8b7c075a10 0x7f8b7c083130 secure :-1 s=READY pgs=269 cs=0 l=1 rev1=1 crypto rx=0x7f8b6c000c00 tx=0x7f8b6c009fc0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:48:31.517 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:31.513+0000 7f8b63fff700 1 -- 192.168.123.105:0/3904165526 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f8b6c004090 con 0x7f8b7c075a10 2026-03-10T07:48:31.517 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:31.513+0000 7f8b812ff700 1 -- 192.168.123.105:0/3904165526 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f8b7c1b36c0 con 0x7f8b7c075a10 2026-03-10T07:48:31.517 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:31.513+0000 7f8b812ff700 1 -- 192.168.123.105:0/3904165526 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f8b7c1b3c10 con 0x7f8b7c075a10 2026-03-10T07:48:31.517 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:31.514+0000 7f8b63fff700 1 -- 192.168.123.105:0/3904165526 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f8b6c00b610 con 0x7f8b7c075a10 2026-03-10T07:48:31.517 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:31.514+0000 7f8b63fff700 1 -- 
192.168.123.105:0/3904165526 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f8b6c00a2e0 con 0x7f8b7c075a10 2026-03-10T07:48:31.517 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:31.515+0000 7f8b63fff700 1 -- 192.168.123.105:0/3904165526 <== mon.0 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 19) v1 ==== 90308+0+0 (secure 0 0 0) 0x7f8b6c00a440 con 0x7f8b7c075a10 2026-03-10T07:48:31.517 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:31.515+0000 7f8b63fff700 1 --2- 192.168.123.105:0/3904165526 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f8b6406c600 0x7f8b6406eac0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:48:31.517 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:31.515+0000 7f8b7a7fc700 1 --2- 192.168.123.105:0/3904165526 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f8b6406c600 0x7f8b6406eac0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:48:31.517 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:31.515+0000 7f8b63fff700 1 -- 192.168.123.105:0/3904165526 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(39..39 src has 1..39) v4 ==== 5362+0+0 (secure 0 0 0) 0x7f8b6c095470 con 0x7f8b7c075a10 2026-03-10T07:48:31.517 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:31.516+0000 7f8b7a7fc700 1 --2- 192.168.123.105:0/3904165526 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f8b6406c600 0x7f8b6406eac0 secure :-1 s=READY pgs=115 cs=0 l=1 rev1=1 crypto rx=0x7f8b74000c50 tx=0x7f8b740012b0 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:48:31.518 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:31.516+0000 7f8b812ff700 1 -- 192.168.123.105:0/3904165526 --> 
[v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f8b68005320 con 0x7f8b7c075a10 2026-03-10T07:48:31.526 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:31.518+0000 7f8b63fff700 1 -- 192.168.123.105:0/3904165526 <== mon.0 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7f8b6c05fd10 con 0x7f8b7c075a10 2026-03-10T07:48:31.695 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:31.693+0000 7f8b812ff700 1 -- 192.168.123.105:0/3904165526 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "mds versions", "format": "json"} v 0) v1 -- 0x7f8b68005190 con 0x7f8b7c075a10 2026-03-10T07:48:31.697 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:31.696+0000 7f8b63fff700 1 -- 192.168.123.105:0/3904165526 <== mon.0 v2:192.168.123.105:3300/0 7 ==== mon_command_ack([{"prefix": "mds versions", "format": "json"}]=0 v10) v1 ==== 78+0+83 (secure 0 0 0) 0x7f8b6c063330 con 0x7f8b7c075a10 2026-03-10T07:48:31.700 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:31.698+0000 7f8b61ffb700 1 -- 192.168.123.105:0/3904165526 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f8b6406c600 msgr2=0x7f8b6406eac0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:48:31.700 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:31.698+0000 7f8b61ffb700 1 --2- 192.168.123.105:0/3904165526 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f8b6406c600 0x7f8b6406eac0 secure :-1 s=READY pgs=115 cs=0 l=1 rev1=1 crypto rx=0x7f8b74000c50 tx=0x7f8b740012b0 comp rx=0 tx=0).stop 2026-03-10T07:48:31.700 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:31.698+0000 7f8b61ffb700 1 -- 192.168.123.105:0/3904165526 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f8b7c075a10 msgr2=0x7f8b7c083130 secure :-1 
s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:48:31.700 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:31.698+0000 7f8b61ffb700 1 --2- 192.168.123.105:0/3904165526 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f8b7c075a10 0x7f8b7c083130 secure :-1 s=READY pgs=269 cs=0 l=1 rev1=1 crypto rx=0x7f8b6c000c00 tx=0x7f8b6c009fc0 comp rx=0 tx=0).stop 2026-03-10T07:48:31.700 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:31.698+0000 7f8b61ffb700 1 -- 192.168.123.105:0/3904165526 shutdown_connections 2026-03-10T07:48:31.700 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:31.698+0000 7f8b61ffb700 1 --2- 192.168.123.105:0/3904165526 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f8b6406c600 0x7f8b6406eac0 unknown :-1 s=CLOSED pgs=115 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:48:31.700 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:31.698+0000 7f8b61ffb700 1 --2- 192.168.123.105:0/3904165526 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f8b7c075a10 0x7f8b7c083130 unknown :-1 s=CLOSED pgs=269 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:48:31.700 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:31.698+0000 7f8b61ffb700 1 --2- 192.168.123.105:0/3904165526 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f8b7c083670 0x7f8b7c1b3120 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:48:31.700 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:31.698+0000 7f8b61ffb700 1 -- 192.168.123.105:0/3904165526 >> 192.168.123.105:0/3904165526 conn(0x7f8b7c06daa0 msgr2=0x7f8b7c06ff00 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:48:31.700 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:31.698+0000 7f8b61ffb700 1 -- 192.168.123.105:0/3904165526 shutdown_connections 2026-03-10T07:48:31.700 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:31.698+0000 7f8b61ffb700 1 -- 192.168.123.105:0/3904165526 wait complete. 2026-03-10T07:48:31.709 INFO:teuthology.orchestra.run.vm05.stdout:true 2026-03-10T07:48:31.750 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:48:31 vm05 ceph-mon[50387]: from='client.? 192.168.123.105:0/497526156' entity='client.admin' cmd=[{"prefix": "fs dump", "format": "json"}]: dispatch 2026-03-10T07:48:31.750 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:48:31 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' 2026-03-10T07:48:31.750 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:48:31 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' 2026-03-10T07:48:31.750 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:48:31 vm05 ceph-mon[50387]: mds.? [v2:192.168.123.105:6826/723078808,v1:192.168.123.105:6827/723078808] up:reconnect 2026-03-10T07:48:31.750 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:48:31 vm05 ceph-mon[50387]: mds.? [v2:192.168.123.108:6824/3815585988,v1:192.168.123.108:6825/3815585988] up:boot 2026-03-10T07:48:31.750 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:48:31 vm05 ceph-mon[50387]: fsmap cephfs:1/1 {0=cephfs.vm05.omfhnh=up:reconnect} 3 up:standby 2026-03-10T07:48:31.750 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:48:31 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm08.ybmbgd"}]: dispatch 2026-03-10T07:48:31.752 INFO:teuthology.run_tasks:Running task fs.pre_upgrade_save... 2026-03-10T07:48:31.756 DEBUG:teuthology.orchestra.run.vm05:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell --fsid 12e9780e-1c55-11f1-8896-79f7c2e9b508 -- ceph fs dump --format=json 2026-03-10T07:48:31.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:48:31 vm08 ceph-mon[59917]: from='client.? 
192.168.123.105:0/497526156' entity='client.admin' cmd=[{"prefix": "fs dump", "format": "json"}]: dispatch 2026-03-10T07:48:31.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:48:31 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' 2026-03-10T07:48:31.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:48:31 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' 2026-03-10T07:48:31.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:48:31 vm08 ceph-mon[59917]: mds.? [v2:192.168.123.105:6826/723078808,v1:192.168.123.105:6827/723078808] up:reconnect 2026-03-10T07:48:31.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:48:31 vm08 ceph-mon[59917]: mds.? [v2:192.168.123.108:6824/3815585988,v1:192.168.123.108:6825/3815585988] up:boot 2026-03-10T07:48:31.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:48:31 vm08 ceph-mon[59917]: fsmap cephfs:1/1 {0=cephfs.vm05.omfhnh=up:reconnect} 3 up:standby 2026-03-10T07:48:31.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:48:31 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm08.ybmbgd"}]: dispatch 2026-03-10T07:48:31.966 INFO:teuthology.orchestra.run.vm05.stderr:Inferring config /var/lib/ceph/12e9780e-1c55-11f1-8896-79f7c2e9b508/mon.vm05/config 2026-03-10T07:48:32.259 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:32.255+0000 7f36e8f0a700 1 -- 192.168.123.105:0/2244393052 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f36dc09bde0 msgr2=0x7f36dc09c1c0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:48:32.259 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:32.255+0000 7f36e8f0a700 1 --2- 192.168.123.105:0/2244393052 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f36dc09bde0 0x7f36dc09c1c0 secure :-1 s=READY pgs=270 cs=0 l=1 rev1=1 crypto rx=0x7f36d8009ab0 tx=0x7f36d8009dc0 
comp rx=0 tx=0).stop 2026-03-10T07:48:32.260 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:32.258+0000 7f36e8f0a700 1 -- 192.168.123.105:0/2244393052 shutdown_connections 2026-03-10T07:48:32.260 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:32.258+0000 7f36e8f0a700 1 --2- 192.168.123.105:0/2244393052 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f36dc095e90 0x7f36dc0962f0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:48:32.260 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:32.258+0000 7f36e8f0a700 1 --2- 192.168.123.105:0/2244393052 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f36dc09bde0 0x7f36dc09c1c0 unknown :-1 s=CLOSED pgs=270 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:48:32.260 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:32.258+0000 7f36e8f0a700 1 -- 192.168.123.105:0/2244393052 >> 192.168.123.105:0/2244393052 conn(0x7f36dc00b8e0 msgr2=0x7f36dc00bcf0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:48:32.260 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:32.258+0000 7f36e8f0a700 1 -- 192.168.123.105:0/2244393052 shutdown_connections 2026-03-10T07:48:32.260 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:32.258+0000 7f36e8f0a700 1 -- 192.168.123.105:0/2244393052 wait complete. 
2026-03-10T07:48:32.264 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:32.261+0000 7f36e8f0a700 1 Processor -- start 2026-03-10T07:48:32.266 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:32.263+0000 7f36e8f0a700 1 -- start start 2026-03-10T07:48:32.266 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:32.264+0000 7f36e8f0a700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f36dc095e90 0x7f36dc132ee0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:48:32.266 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:32.264+0000 7f36e8f0a700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f36dc09bde0 0x7f36dc133420 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:48:32.266 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:32.264+0000 7f36e8f0a700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f36dc133a40 con 0x7f36dc09bde0 2026-03-10T07:48:32.266 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:32.264+0000 7f36e8f0a700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f36dc133bb0 con 0x7f36dc095e90 2026-03-10T07:48:32.266 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:32.264+0000 7f36e37fe700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f36dc095e90 0x7f36dc132ee0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:48:32.266 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:32.264+0000 7f36e37fe700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f36dc095e90 0x7f36dc132ee0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.108:3300/0 says I am 
v2:192.168.123.105:50774/0 (socket says 192.168.123.105:50774) 2026-03-10T07:48:32.266 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:32.264+0000 7f36e37fe700 1 -- 192.168.123.105:0/1455013209 learned_addr learned my addr 192.168.123.105:0/1455013209 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T07:48:32.267 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:32.264+0000 7f36e2ffd700 1 --2- 192.168.123.105:0/1455013209 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f36dc09bde0 0x7f36dc133420 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:48:32.267 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:32.265+0000 7f36e37fe700 1 -- 192.168.123.105:0/1455013209 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f36dc09bde0 msgr2=0x7f36dc133420 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:48:32.267 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:32.265+0000 7f36e37fe700 1 --2- 192.168.123.105:0/1455013209 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f36dc09bde0 0x7f36dc133420 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:48:32.267 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:32.265+0000 7f36e37fe700 1 -- 192.168.123.105:0/1455013209 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f36d8009710 con 0x7f36dc095e90 2026-03-10T07:48:32.267 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:32.265+0000 7f36e37fe700 1 --2- 192.168.123.105:0/1455013209 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f36dc095e90 0x7f36dc132ee0 secure :-1 s=READY pgs=58 cs=0 l=1 rev1=1 crypto rx=0x7f36d8000c00 tx=0x7f36d800ba90 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 
2026-03-10T07:48:32.268 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:32.265+0000 7f36e0ff9700 1 -- 192.168.123.105:0/1455013209 <== mon.1 v2:192.168.123.108:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f36d801d070 con 0x7f36dc095e90 2026-03-10T07:48:32.268 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:32.265+0000 7f36e8f0a700 1 -- 192.168.123.105:0/1455013209 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f36dc13be90 con 0x7f36dc095e90 2026-03-10T07:48:32.268 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:32.266+0000 7f36e8f0a700 1 -- 192.168.123.105:0/1455013209 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f36dc13c350 con 0x7f36dc095e90 2026-03-10T07:48:32.268 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:32.266+0000 7f36e0ff9700 1 -- 192.168.123.105:0/1455013209 <== mon.1 v2:192.168.123.108:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f36d8005480 con 0x7f36dc095e90 2026-03-10T07:48:32.268 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:32.266+0000 7f36e0ff9700 1 -- 192.168.123.105:0/1455013209 <== mon.1 v2:192.168.123.108:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f36d8022700 con 0x7f36dc095e90 2026-03-10T07:48:32.269 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:32.267+0000 7f36e0ff9700 1 -- 192.168.123.105:0/1455013209 <== mon.1 v2:192.168.123.108:3300/0 4 ==== mgrmap(e 19) v1 ==== 90308+0+0 (secure 0 0 0) 0x7f36d80173f0 con 0x7f36dc095e90 2026-03-10T07:48:32.270 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:32.268+0000 7f36e0ff9700 1 --2- 192.168.123.105:0/1455013209 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f36d406c550 0x7f36d406ea10 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:48:32.270 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:32.268+0000 7f36e0ff9700 1 -- 192.168.123.105:0/1455013209 <== mon.1 v2:192.168.123.108:3300/0 5 ==== osd_map(39..39 src has 1..39) v4 ==== 5362+0+0 (secure 0 0 0) 0x7f36d808e060 con 0x7f36dc095e90 2026-03-10T07:48:32.270 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:32.268+0000 7f36e8f0a700 1 -- 192.168.123.105:0/1455013209 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f36c8005320 con 0x7f36dc095e90 2026-03-10T07:48:32.271 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:32.269+0000 7f36e2ffd700 1 --2- 192.168.123.105:0/1455013209 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f36d406c550 0x7f36d406ea10 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:48:32.273 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:32.271+0000 7f36e2ffd700 1 --2- 192.168.123.105:0/1455013209 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f36d406c550 0x7f36d406ea10 secure :-1 s=READY pgs=116 cs=0 l=1 rev1=1 crypto rx=0x7f36dc0a0570 tx=0x7f36d0009500 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:48:32.273 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:32.271+0000 7f36e0ff9700 1 -- 192.168.123.105:0/1455013209 <== mon.1 v2:192.168.123.108:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7f36d8058070 con 0x7f36dc095e90 2026-03-10T07:48:32.419 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:32.417+0000 7f36e8f0a700 1 -- 192.168.123.105:0/1455013209 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "fs dump", "format": "json"} v 0) v1 -- 0x7f36c8005f70 con 0x7f36dc095e90 2026-03-10T07:48:32.419 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:32.418+0000 7f36e0ff9700 1 -- 192.168.123.105:0/1455013209 <== mon.1 v2:192.168.123.108:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump", "format": "json"}]=0 dumped fsmap epoch 10 v10) v1 ==== 94+0+4751 (secure 0 0 0) 0x7f36d80588a0 con 0x7f36dc095e90 2026-03-10T07:48:32.420 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-10T07:48:32.420 INFO:teuthology.orchestra.run.vm05.stdout:{"epoch":10,"default_fscid":1,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"feature_flags":{"enable_multiple":true,"ever_enabled_multiple":true},"standbys":[{"gid":14512,"name":"cephfs.vm05.pavqil","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.105:6829/427555544","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.105:6828","nonce":427555544},{"type":"v1","addr":"192.168.123.105:6829","nonce":427555544}]},"join_fscid":-1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm 
v2"}},"epoch":6},{"gid":14524,"name":"cephfs.vm08.ybmbgd","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.108:6825/3815585988","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.108:6824","nonce":3815585988},{"type":"v1","addr":"192.168.123.108:6825","nonce":3815585988}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"epoch":10},{"gid":24313,"name":"cephfs.vm08.dgsaon","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.108:6827/2963085185","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.108:6826","nonce":2963085185},{"type":"v1","addr":"192.168.123.108:6827","nonce":2963085185}]},"join_fscid":-1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm 
v2"}},"epoch":7}],"filesystems":[{"mdsmap":{"epoch":10,"flags":18,"flags_state":{"joinable":true,"allow_snaps":true,"allow_multimds_snaps":true,"allow_standby_replay":false,"refuse_client_session":false},"ever_allowed_features":0,"explicitly_allowed_features":0,"created":"2026-03-10T07:48:24.309293+0000","modified":"2026-03-10T07:48:31.410530+0000","tableserver":0,"root":0,"session_timeout":60,"session_autoclose":300,"required_client_features":{},"max_file_size":1099511627776,"last_failure":0,"last_failure_osd_epoch":39,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"max_mds":1,"in":[0],"up":{"mds_0":24297},"failed":[],"damaged":[],"stopped":[],"info":{"gid_24297":{"gid":24297,"name":"cephfs.vm05.omfhnh","rank":0,"incarnation":9,"state":"up:reconnect","state_seq":3,"addr":"192.168.123.105:6827/723078808","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.105:6826","nonce":723078808},{"type":"v1","addr":"192.168.123.105:6827","nonce":723078808}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}}}},"data_pools":[3],"metadata_pool":2,"enabled":true,"fs_name":"cephfs","balancer":"","bal_rank_mask":"-1","standby_count_wanted":1},"id":1}]} 2026-03-10T07:48:32.422 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:32.421+0000 7f36e8f0a700 1 -- 192.168.123.105:0/1455013209 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f36d406c550 msgr2=0x7f36d406ea10 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:48:32.422 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:32.421+0000 7f36e8f0a700 1 --2- 192.168.123.105:0/1455013209 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f36d406c550 0x7f36d406ea10 secure :-1 s=READY pgs=116 cs=0 l=1 rev1=1 crypto rx=0x7f36dc0a0570 tx=0x7f36d0009500 comp rx=0 tx=0).stop 2026-03-10T07:48:32.422 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:32.421+0000 7f36e8f0a700 1 -- 192.168.123.105:0/1455013209 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f36dc095e90 msgr2=0x7f36dc132ee0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:48:32.422 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:32.421+0000 7f36e8f0a700 1 --2- 192.168.123.105:0/1455013209 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f36dc095e90 0x7f36dc132ee0 secure :-1 s=READY pgs=58 cs=0 l=1 rev1=1 crypto rx=0x7f36d8000c00 tx=0x7f36d800ba90 comp rx=0 tx=0).stop 2026-03-10T07:48:32.423 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:32.421+0000 7f36e8f0a700 1 -- 192.168.123.105:0/1455013209 shutdown_connections 2026-03-10T07:48:32.423 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:32.421+0000 7f36e8f0a700 1 --2- 192.168.123.105:0/1455013209 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f36d406c550 0x7f36d406ea10 unknown :-1 s=CLOSED pgs=116 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:48:32.423 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:32.421+0000 7f36e8f0a700 1 --2- 192.168.123.105:0/1455013209 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f36dc095e90 0x7f36dc132ee0 unknown :-1 s=CLOSED 
pgs=58 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:48:32.423 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:32.421+0000 7f36e8f0a700 1 --2- 192.168.123.105:0/1455013209 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f36dc09bde0 0x7f36dc133420 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:48:32.423 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:32.421+0000 7f36e8f0a700 1 -- 192.168.123.105:0/1455013209 >> 192.168.123.105:0/1455013209 conn(0x7f36dc00b8e0 msgr2=0x7f36dc094e90 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:48:32.423 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:32.422+0000 7f36e8f0a700 1 -- 192.168.123.105:0/1455013209 shutdown_connections 2026-03-10T07:48:32.423 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:32.422+0000 7f36e8f0a700 1 -- 192.168.123.105:0/1455013209 wait complete. 2026-03-10T07:48:32.424 INFO:teuthology.orchestra.run.vm05.stderr:dumped fsmap epoch 10 2026-03-10T07:48:32.488 DEBUG:tasks.fs:fs fscid=1,name=cephfs state = {'epoch': 10, 'max_mds': 1, 'flags': 18} 2026-03-10T07:48:32.488 INFO:teuthology.run_tasks:Running task ceph-fuse... 2026-03-10T07:48:32.499 INFO:tasks.ceph_fuse:Running ceph_fuse task... 
2026-03-10T07:48:32.499 INFO:tasks.ceph_fuse:config is {'client.0': {}, 'client.1': {}} 2026-03-10T07:48:32.499 INFO:tasks.ceph_fuse:client.0 config is {} 2026-03-10T07:48:32.500 INFO:tasks.cephfs.mount:cephfs_mntpt = None 2026-03-10T07:48:32.500 INFO:tasks.cephfs.mount:hostfs_mntpt = /home/ubuntu/cephtest/mnt.0 2026-03-10T07:48:32.500 INFO:tasks.ceph_fuse:client.1 config is {} 2026-03-10T07:48:32.500 INFO:tasks.cephfs.mount:cephfs_mntpt = None 2026-03-10T07:48:32.500 INFO:tasks.cephfs.mount:hostfs_mntpt = /home/ubuntu/cephtest/mnt.1 2026-03-10T07:48:32.500 INFO:teuthology.orchestra.run:Running command with timeout 300 2026-03-10T07:48:32.500 DEBUG:teuthology.orchestra.run.vm05:> ip netns list 2026-03-10T07:48:32.514 INFO:teuthology.orchestra.run:Running command with timeout 300 2026-03-10T07:48:32.515 DEBUG:teuthology.orchestra.run.vm05:> sudo ip link delete ceph-brx 2026-03-10T07:48:32.576 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:48:32 vm05 ceph-mon[50387]: pgmap v81: 65 pgs: 65 active+clean; 451 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 2.3 KiB/s wr, 7 op/s 2026-03-10T07:48:32.576 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:48:32 vm05 ceph-mon[50387]: from='client.? 
192.168.123.105:0/3904165526' entity='client.admin' cmd=[{"prefix": "mds versions", "format": "json"}]: dispatch 2026-03-10T07:48:32.576 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:48:32 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' 2026-03-10T07:48:32.576 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:48:32 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' 2026-03-10T07:48:32.576 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:48:32 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T07:48:32.576 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:48:32 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T07:48:32.576 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:48:32 vm05 ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' 2026-03-10T07:48:32.576 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:48:32 vm05 ceph-mon[50387]: from='client.? 192.168.123.105:0/1455013209' entity='client.admin' cmd=[{"prefix": "fs dump", "format": "json"}]: dispatch 2026-03-10T07:48:32.576 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:48:32 vm05 ceph-mon[50387]: mds.? [v2:192.168.123.105:6826/723078808,v1:192.168.123.105:6827/723078808] up:rejoin 2026-03-10T07:48:32.576 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:48:32 vm05 ceph-mon[50387]: mds.? [v2:192.168.123.105:6828/427555544,v1:192.168.123.105:6829/427555544] up:standby 2026-03-10T07:48:32.576 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:48:32 vm05 ceph-mon[50387]: mds.? 
[v2:192.168.123.108:6826/2963085185,v1:192.168.123.108:6827/2963085185] up:standby 2026-03-10T07:48:32.576 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:48:32 vm05 ceph-mon[50387]: fsmap cephfs:1/1 {0=cephfs.vm05.omfhnh=up:rejoin} 3 up:standby 2026-03-10T07:48:32.576 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:48:32 vm05 ceph-mon[50387]: daemon mds.cephfs.vm05.omfhnh is now active in filesystem cephfs as rank 0 2026-03-10T07:48:32.582 INFO:teuthology.orchestra.run.vm05.stderr:Cannot find device "ceph-brx" 2026-03-10T07:48:32.583 DEBUG:teuthology.orchestra.run:got remote process result: 1 2026-03-10T07:48:32.583 INFO:teuthology.orchestra.run:Running command with timeout 300 2026-03-10T07:48:32.583 DEBUG:teuthology.orchestra.run.vm08:> ip netns list 2026-03-10T07:48:32.599 INFO:teuthology.orchestra.run:Running command with timeout 300 2026-03-10T07:48:32.599 DEBUG:teuthology.orchestra.run.vm08:> sudo ip link delete ceph-brx 2026-03-10T07:48:32.667 INFO:teuthology.orchestra.run.vm08.stderr:Cannot find device "ceph-brx" 2026-03-10T07:48:32.669 DEBUG:teuthology.orchestra.run:got remote process result: 1 2026-03-10T07:48:32.669 INFO:tasks.ceph_fuse:Mounting ceph-fuse clients... 2026-03-10T07:48:32.669 DEBUG:tasks.ceph_fuse:passing mntargs=[] 2026-03-10T07:48:32.669 DEBUG:teuthology.orchestra.run.vm05:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell --fsid 12e9780e-1c55-11f1-8896-79f7c2e9b508 -- ceph fs ls 2026-03-10T07:48:32.808 INFO:teuthology.orchestra.run.vm05.stderr:Inferring config /var/lib/ceph/12e9780e-1c55-11f1-8896-79f7c2e9b508/mon.vm05/config 2026-03-10T07:48:32.919 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:48:32 vm08 ceph-mon[59917]: pgmap v81: 65 pgs: 65 active+clean; 451 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 2.3 KiB/s wr, 7 op/s 2026-03-10T07:48:32.919 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:48:32 vm08 ceph-mon[59917]: from='client.? 
192.168.123.105:0/3904165526' entity='client.admin' cmd=[{"prefix": "mds versions", "format": "json"}]: dispatch 2026-03-10T07:48:32.919 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:48:32 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' 2026-03-10T07:48:32.919 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:48:32 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' 2026-03-10T07:48:32.919 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:48:32 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T07:48:32.919 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:48:32 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T07:48:32.919 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:48:32 vm08 ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' 2026-03-10T07:48:32.919 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:48:32 vm08 ceph-mon[59917]: from='client.? 192.168.123.105:0/1455013209' entity='client.admin' cmd=[{"prefix": "fs dump", "format": "json"}]: dispatch 2026-03-10T07:48:32.919 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:48:32 vm08 ceph-mon[59917]: mds.? [v2:192.168.123.105:6826/723078808,v1:192.168.123.105:6827/723078808] up:rejoin 2026-03-10T07:48:32.919 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:48:32 vm08 ceph-mon[59917]: mds.? [v2:192.168.123.105:6828/427555544,v1:192.168.123.105:6829/427555544] up:standby 2026-03-10T07:48:32.919 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:48:32 vm08 ceph-mon[59917]: mds.? 
[v2:192.168.123.108:6826/2963085185,v1:192.168.123.108:6827/2963085185] up:standby 2026-03-10T07:48:32.919 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:48:32 vm08 ceph-mon[59917]: fsmap cephfs:1/1 {0=cephfs.vm05.omfhnh=up:rejoin} 3 up:standby 2026-03-10T07:48:32.919 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:48:32 vm08 ceph-mon[59917]: daemon mds.cephfs.vm05.omfhnh is now active in filesystem cephfs as rank 0 2026-03-10T07:48:33.040 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:33.037+0000 7efc34c05700 1 -- 192.168.123.105:0/2627822047 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7efc30102a00 msgr2=0x7efc3010aef0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:48:33.040 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:33.037+0000 7efc34c05700 1 --2- 192.168.123.105:0/2627822047 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7efc30102a00 0x7efc3010aef0 secure :-1 s=READY pgs=271 cs=0 l=1 rev1=1 crypto rx=0x7efc20009b50 tx=0x7efc20009e60 comp rx=0 tx=0).stop 2026-03-10T07:48:33.040 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:33.038+0000 7efc34c05700 1 -- 192.168.123.105:0/2627822047 shutdown_connections 2026-03-10T07:48:33.040 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:33.038+0000 7efc34c05700 1 --2- 192.168.123.105:0/2627822047 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7efc30102a00 0x7efc3010aef0 unknown :-1 s=CLOSED pgs=271 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:48:33.040 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:33.038+0000 7efc34c05700 1 --2- 192.168.123.105:0/2627822047 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7efc301020e0 0x7efc301024c0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:48:33.040 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:33.038+0000 7efc34c05700 1 -- 
192.168.123.105:0/2627822047 >> 192.168.123.105:0/2627822047 conn(0x7efc300fb830 msgr2=0x7efc300fdc50 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:48:33.040 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:33.038+0000 7efc34c05700 1 -- 192.168.123.105:0/2627822047 shutdown_connections 2026-03-10T07:48:33.040 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:33.038+0000 7efc34c05700 1 -- 192.168.123.105:0/2627822047 wait complete. 2026-03-10T07:48:33.040 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:33.039+0000 7efc34c05700 1 Processor -- start 2026-03-10T07:48:33.041 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:33.039+0000 7efc34c05700 1 -- start start 2026-03-10T07:48:33.041 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:33.039+0000 7efc34c05700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7efc301020e0 0x7efc3019c980 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:48:33.041 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:33.039+0000 7efc34c05700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7efc30102a00 0x7efc3019cec0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:48:33.041 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:33.039+0000 7efc34c05700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7efc3019d550 con 0x7efc301020e0 2026-03-10T07:48:33.041 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:33.039+0000 7efc34c05700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7efc30196a50 con 0x7efc30102a00 2026-03-10T07:48:33.041 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:33.040+0000 7efc2dd9b700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7efc30102a00 0x7efc3019cec0 unknown :-1 
s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:48:33.041 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:33.040+0000 7efc2dd9b700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7efc30102a00 0x7efc3019cec0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.108:3300/0 says I am v2:192.168.123.105:50802/0 (socket says 192.168.123.105:50802) 2026-03-10T07:48:33.041 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:33.040+0000 7efc2dd9b700 1 -- 192.168.123.105:0/917797761 learned_addr learned my addr 192.168.123.105:0/917797761 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T07:48:33.041 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:33.040+0000 7efc2dd9b700 1 -- 192.168.123.105:0/917797761 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7efc301020e0 msgr2=0x7efc3019c980 unknown :-1 s=STATE_CONNECTING l=1).mark_down 2026-03-10T07:48:33.042 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:33.040+0000 7efc2e59c700 1 --2- 192.168.123.105:0/917797761 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7efc301020e0 0x7efc3019c980 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:48:33.042 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:33.040+0000 7efc2dd9b700 1 --2- 192.168.123.105:0/917797761 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7efc301020e0 0x7efc3019c980 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:48:33.042 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:33.040+0000 7efc2dd9b700 1 -- 192.168.123.105:0/917797761 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({config=0+,monmap=0+}) 
v3 -- 0x7efc200097e0 con 0x7efc30102a00 2026-03-10T07:48:33.042 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:33.040+0000 7efc2dd9b700 1 --2- 192.168.123.105:0/917797761 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7efc30102a00 0x7efc3019cec0 secure :-1 s=READY pgs=59 cs=0 l=1 rev1=1 crypto rx=0x7efc20000c00 tx=0x7efc200049f0 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:48:33.042 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:33.040+0000 7efc277fe700 1 -- 192.168.123.105:0/917797761 <== mon.1 v2:192.168.123.108:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7efc2001d070 con 0x7efc30102a00 2026-03-10T07:48:33.043 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:33.041+0000 7efc34c05700 1 -- 192.168.123.105:0/917797761 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7efc30196cd0 con 0x7efc30102a00 2026-03-10T07:48:33.043 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:33.041+0000 7efc34c05700 1 -- 192.168.123.105:0/917797761 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7efc30197160 con 0x7efc30102a00 2026-03-10T07:48:33.043 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:33.040+0000 7efc2e59c700 1 --2- 192.168.123.105:0/917797761 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7efc301020e0 0x7efc3019c980 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request state changed! 
2026-03-10T07:48:33.043 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:33.041+0000 7efc277fe700 1 -- 192.168.123.105:0/917797761 <== mon.1 v2:192.168.123.108:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7efc20004500 con 0x7efc30102a00 2026-03-10T07:48:33.043 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:33.041+0000 7efc277fe700 1 -- 192.168.123.105:0/917797761 <== mon.1 v2:192.168.123.108:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7efc20022470 con 0x7efc30102a00 2026-03-10T07:48:33.044 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:33.042+0000 7efc34c05700 1 -- 192.168.123.105:0/917797761 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7efc301085f0 con 0x7efc30102a00 2026-03-10T07:48:33.044 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:33.042+0000 7efc277fe700 1 -- 192.168.123.105:0/917797761 <== mon.1 v2:192.168.123.108:3300/0 4 ==== mgrmap(e 19) v1 ==== 90308+0+0 (secure 0 0 0) 0x7efc2000bc30 con 0x7efc30102a00 2026-03-10T07:48:33.044 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:33.042+0000 7efc277fe700 1 --2- 192.168.123.105:0/917797761 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7efc1c06c550 0x7efc1c06ea10 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:48:33.044 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:33.042+0000 7efc277fe700 1 -- 192.168.123.105:0/917797761 <== mon.1 v2:192.168.123.108:3300/0 5 ==== osd_map(39..39 src has 1..39) v4 ==== 5362+0+0 (secure 0 0 0) 0x7efc2008d2c0 con 0x7efc30102a00 2026-03-10T07:48:33.044 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:33.042+0000 7efc2e59c700 1 --2- 192.168.123.105:0/917797761 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7efc1c06c550 0x7efc1c06ea10 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 
comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:48:33.044 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:33.043+0000 7efc2e59c700 1 --2- 192.168.123.105:0/917797761 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7efc1c06c550 0x7efc1c06ea10 secure :-1 s=READY pgs=117 cs=0 l=1 rev1=1 crypto rx=0x7efc18005d90 tx=0x7efc18005d00 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:48:33.047 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:33.045+0000 7efc277fe700 1 -- 192.168.123.105:0/917797761 <== mon.1 v2:192.168.123.108:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7efc2005b690 con 0x7efc30102a00 2026-03-10T07:48:33.169 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:33.167+0000 7efc34c05700 1 -- 192.168.123.105:0/917797761 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "fs ls"} v 0) v1 -- 0x7efc3004ea90 con 0x7efc30102a00 2026-03-10T07:48:33.169 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:33.168+0000 7efc277fe700 1 -- 192.168.123.105:0/917797761 <== mon.1 v2:192.168.123.108:3300/0 7 ==== mon_command_ack([{"prefix": "fs ls"}]=0 v11) v1 ==== 53+0+83 (secure 0 0 0) 0x7efc2005b220 con 0x7efc30102a00 2026-03-10T07:48:33.169 INFO:teuthology.orchestra.run.vm05.stdout:name: cephfs, metadata pool: cephfs.cephfs.meta, data pools: [cephfs.cephfs.data ] 2026-03-10T07:48:33.171 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:33.170+0000 7efc34c05700 1 -- 192.168.123.105:0/917797761 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7efc1c06c550 msgr2=0x7efc1c06ea10 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:48:33.171 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:33.170+0000 7efc34c05700 1 --2- 192.168.123.105:0/917797761 >> 
[v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7efc1c06c550 0x7efc1c06ea10 secure :-1 s=READY pgs=117 cs=0 l=1 rev1=1 crypto rx=0x7efc18005d90 tx=0x7efc18005d00 comp rx=0 tx=0).stop 2026-03-10T07:48:33.172 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:33.170+0000 7efc34c05700 1 -- 192.168.123.105:0/917797761 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7efc30102a00 msgr2=0x7efc3019cec0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:48:33.172 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:33.170+0000 7efc34c05700 1 --2- 192.168.123.105:0/917797761 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7efc30102a00 0x7efc3019cec0 secure :-1 s=READY pgs=59 cs=0 l=1 rev1=1 crypto rx=0x7efc20000c00 tx=0x7efc200049f0 comp rx=0 tx=0).stop 2026-03-10T07:48:33.172 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:33.170+0000 7efc34c05700 1 -- 192.168.123.105:0/917797761 shutdown_connections 2026-03-10T07:48:33.172 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:33.170+0000 7efc34c05700 1 --2- 192.168.123.105:0/917797761 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7efc1c06c550 0x7efc1c06ea10 unknown :-1 s=CLOSED pgs=117 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:48:33.172 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:33.170+0000 7efc34c05700 1 --2- 192.168.123.105:0/917797761 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7efc301020e0 0x7efc3019c980 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:48:33.172 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:33.170+0000 7efc34c05700 1 --2- 192.168.123.105:0/917797761 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7efc30102a00 0x7efc3019cec0 unknown :-1 s=CLOSED pgs=59 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:48:33.172 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:33.170+0000 7efc34c05700 1 -- 192.168.123.105:0/917797761 >> 192.168.123.105:0/917797761 conn(0x7efc300fb830 msgr2=0x7efc30105730 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:48:33.172 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:33.171+0000 7efc34c05700 1 -- 192.168.123.105:0/917797761 shutdown_connections 2026-03-10T07:48:33.172 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:33.171+0000 7efc34c05700 1 -- 192.168.123.105:0/917797761 wait complete. 2026-03-10T07:48:33.235 INFO:tasks.cephfs.mount:Mounting default Ceph FS; just confirmed its presence on cluster 2026-03-10T07:48:33.235 INFO:tasks.cephfs.mount:Mounting Ceph FS. Following are details of mount; remember "None" represents Python type None - 2026-03-10T07:48:33.235 INFO:tasks.cephfs.mount:self.client_remote.hostname = vm05.local 2026-03-10T07:48:33.235 INFO:tasks.cephfs.mount:self.client.name = client.0 2026-03-10T07:48:33.235 INFO:tasks.cephfs.mount:self.hostfs_mntpt = /home/ubuntu/cephtest/mnt.0 2026-03-10T07:48:33.235 INFO:tasks.cephfs.mount:self.cephfs_name = None 2026-03-10T07:48:33.235 INFO:tasks.cephfs.mount:self.cephfs_mntpt = None 2026-03-10T07:48:33.235 INFO:tasks.cephfs.mount:self.client_keyring_path = None 2026-03-10T07:48:33.235 INFO:tasks.cephfs.mount:Setting the 'None' netns for '/home/ubuntu/cephtest/mnt.0' 2026-03-10T07:48:33.235 INFO:teuthology.orchestra.run:Running command with timeout 300 2026-03-10T07:48:33.235 DEBUG:teuthology.orchestra.run.vm05:> ip addr 2026-03-10T07:48:33.250 INFO:teuthology.orchestra.run.vm05.stdout:1: lo: mtu 65536 qdisc noqueue state UNKNOWN group default qlen 1000 2026-03-10T07:48:33.250 INFO:teuthology.orchestra.run.vm05.stdout: link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00 2026-03-10T07:48:33.251 INFO:teuthology.orchestra.run.vm05.stdout: inet 127.0.0.1/8 scope host lo 2026-03-10T07:48:33.251 INFO:teuthology.orchestra.run.vm05.stdout: valid_lft forever 
preferred_lft forever 2026-03-10T07:48:33.251 INFO:teuthology.orchestra.run.vm05.stdout: inet6 ::1/128 scope host 2026-03-10T07:48:33.251 INFO:teuthology.orchestra.run.vm05.stdout: valid_lft forever preferred_lft forever 2026-03-10T07:48:33.251 INFO:teuthology.orchestra.run.vm05.stdout:2: eth0: mtu 1500 qdisc fq_codel state UP group default qlen 1000 2026-03-10T07:48:33.251 INFO:teuthology.orchestra.run.vm05.stdout: link/ether 52:55:00:00:00:05 brd ff:ff:ff:ff:ff:ff 2026-03-10T07:48:33.251 INFO:teuthology.orchestra.run.vm05.stdout: altname enp0s3 2026-03-10T07:48:33.251 INFO:teuthology.orchestra.run.vm05.stdout: altname ens3 2026-03-10T07:48:33.251 INFO:teuthology.orchestra.run.vm05.stdout: inet 192.168.123.105/24 brd 192.168.123.255 scope global dynamic noprefixroute eth0 2026-03-10T07:48:33.251 INFO:teuthology.orchestra.run.vm05.stdout: valid_lft 3138sec preferred_lft 3138sec 2026-03-10T07:48:33.251 INFO:teuthology.orchestra.run.vm05.stdout: inet6 fe80::5055:ff:fe00:5/64 scope link noprefixroute 2026-03-10T07:48:33.251 INFO:teuthology.orchestra.run.vm05.stdout: valid_lft forever preferred_lft forever 2026-03-10T07:48:33.251 INFO:tasks.cephfs.mount:Setuping the 'ceph-brx' with 192.168.159.254/20 2026-03-10T07:48:33.251 DEBUG:teuthology.orchestra.run.vm05:> (cd / && exec stdin-killer --timeout=300 -- bash -c ' 2026-03-10T07:48:33.251 DEBUG:teuthology.orchestra.run.vm05:> set -e 2026-03-10T07:48:33.251 DEBUG:teuthology.orchestra.run.vm05:> sudo ip link add name ceph-brx type bridge 2026-03-10T07:48:33.251 DEBUG:teuthology.orchestra.run.vm05:> sudo ip addr flush dev ceph-brx 2026-03-10T07:48:33.251 DEBUG:teuthology.orchestra.run.vm05:> sudo ip link set ceph-brx up 2026-03-10T07:48:33.251 DEBUG:teuthology.orchestra.run.vm05:> sudo ip addr add 192.168.159.254/20 brd 192.168.159.255 dev ceph-brx 2026-03-10T07:48:33.251 DEBUG:teuthology.orchestra.run.vm05:> ') 2026-03-10T07:48:33.326 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:33 stdin-killer INFO: 
expiration expected; waiting 300 seconds for command to complete 2026-03-10T07:48:33.400 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:33 stdin-killer INFO: command exited with status 0: exiting normally with same code! 2026-03-10T07:48:33.407 INFO:teuthology.orchestra.run:Running command with timeout 300 2026-03-10T07:48:33.407 DEBUG:teuthology.orchestra.run.vm05:> echo 1 | sudo tee /proc/sys/net/ipv4/ip_forward 2026-03-10T07:48:33.476 INFO:teuthology.orchestra.run.vm05.stdout:1 2026-03-10T07:48:33.477 INFO:teuthology.orchestra.run:Running command with timeout 300 2026-03-10T07:48:33.477 DEBUG:teuthology.orchestra.run.vm05:> ip r 2026-03-10T07:48:33.532 INFO:teuthology.orchestra.run.vm05.stdout:default via 192.168.123.1 dev eth0 proto dhcp src 192.168.123.105 metric 100 2026-03-10T07:48:33.532 INFO:teuthology.orchestra.run.vm05.stdout:192.168.123.0/24 dev eth0 proto kernel scope link src 192.168.123.105 metric 100 2026-03-10T07:48:33.532 INFO:teuthology.orchestra.run.vm05.stdout:192.168.144.0/20 dev ceph-brx proto kernel scope link src 192.168.159.254 2026-03-10T07:48:33.532 DEBUG:teuthology.orchestra.run.vm05:> (cd / && exec stdin-killer --timeout=300 -- bash -c ' 2026-03-10T07:48:33.532 DEBUG:teuthology.orchestra.run.vm05:> set -e 2026-03-10T07:48:33.532 DEBUG:teuthology.orchestra.run.vm05:> sudo iptables -A FORWARD -o eth0 -i ceph-brx -j ACCEPT 2026-03-10T07:48:33.532 DEBUG:teuthology.orchestra.run.vm05:> sudo iptables -A FORWARD -i eth0 -o ceph-brx -j ACCEPT 2026-03-10T07:48:33.532 DEBUG:teuthology.orchestra.run.vm05:> sudo iptables -t nat -A POSTROUTING -s 192.168.159.254/20 -o eth0 -j MASQUERADE 2026-03-10T07:48:33.532 DEBUG:teuthology.orchestra.run.vm05:> ') 2026-03-10T07:48:33.608 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:33 stdin-killer INFO: expiration expected; waiting 300 seconds for command to complete 2026-03-10T07:48:33.670 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:33 stdin-killer INFO: command exited 
with status 0: exiting normally with same code! 2026-03-10T07:48:33.674 INFO:teuthology.orchestra.run:Running command with timeout 300 2026-03-10T07:48:33.675 DEBUG:teuthology.orchestra.run.vm05:> ip netns list 2026-03-10T07:48:33.730 INFO:teuthology.orchestra.run:Running command with timeout 300 2026-03-10T07:48:33.730 DEBUG:teuthology.orchestra.run.vm05:> ip netns list-id 2026-03-10T07:48:33.785 DEBUG:teuthology.orchestra.run.vm05:> (cd / && exec stdin-killer --timeout=300 -- bash -c ' 2026-03-10T07:48:33.785 DEBUG:teuthology.orchestra.run.vm05:> set -e 2026-03-10T07:48:33.785 DEBUG:teuthology.orchestra.run.vm05:> sudo ip netns add ceph-ns--home-ubuntu-cephtest-mnt.0 2026-03-10T07:48:33.785 DEBUG:teuthology.orchestra.run.vm05:> sudo ip netns set ceph-ns--home-ubuntu-cephtest-mnt.0 0 2026-03-10T07:48:33.786 DEBUG:teuthology.orchestra.run.vm05:> ') 2026-03-10T07:48:33.860 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:33 stdin-killer INFO: expiration expected; waiting 300 seconds for command to complete 2026-03-10T07:48:33.868 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:48:33 vm05.local ceph-mon[50387]: from='client.? 192.168.123.105:0/917797761' entity='client.admin' cmd=[{"prefix": "fs ls"}]: dispatch 2026-03-10T07:48:33.868 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:48:33 vm05.local ceph-mon[50387]: Health check cleared: FS_DEGRADED (was: 1 filesystem is degraded) 2026-03-10T07:48:33.868 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:48:33 vm05.local ceph-mon[50387]: mds.? [v2:192.168.123.105:6826/723078808,v1:192.168.123.105:6827/723078808] up:active 2026-03-10T07:48:33.868 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:48:33 vm05.local ceph-mon[50387]: fsmap cephfs:1 {0=cephfs.vm05.omfhnh=up:active} 3 up:standby 2026-03-10T07:48:33.885 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:33 stdin-killer INFO: command exited with status 0: exiting normally with same code! 
2026-03-10T07:48:33.889 INFO:tasks.cephfs.mount:Setuping the netns 'ceph-ns--home-ubuntu-cephtest-mnt.0' with 192.168.144.1/20 2026-03-10T07:48:33.889 DEBUG:teuthology.orchestra.run.vm05:> (cd / && exec stdin-killer --timeout=300 -- bash -c ' 2026-03-10T07:48:33.889 DEBUG:teuthology.orchestra.run.vm05:> set -e 2026-03-10T07:48:33.889 DEBUG:teuthology.orchestra.run.vm05:> sudo ip link add veth0 netns ceph-ns--home-ubuntu-cephtest-mnt.0 type veth peer name brx.0 2026-03-10T07:48:33.889 DEBUG:teuthology.orchestra.run.vm05:> sudo ip netns exec ceph-ns--home-ubuntu-cephtest-mnt.0 ip addr add 192.168.144.1/20 brd 192.168.159.255 dev veth0 2026-03-10T07:48:33.889 DEBUG:teuthology.orchestra.run.vm05:> sudo ip netns exec ceph-ns--home-ubuntu-cephtest-mnt.0 ip link set veth0 up 2026-03-10T07:48:33.889 DEBUG:teuthology.orchestra.run.vm05:> sudo ip netns exec ceph-ns--home-ubuntu-cephtest-mnt.0 ip link set lo up 2026-03-10T07:48:33.889 DEBUG:teuthology.orchestra.run.vm05:> sudo ip netns exec ceph-ns--home-ubuntu-cephtest-mnt.0 ip route add default via 192.168.159.254 2026-03-10T07:48:33.889 DEBUG:teuthology.orchestra.run.vm05:> ') 2026-03-10T07:48:33.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:48:33 vm08 ceph-mon[59917]: from='client.? 192.168.123.105:0/917797761' entity='client.admin' cmd=[{"prefix": "fs ls"}]: dispatch 2026-03-10T07:48:33.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:48:33 vm08 ceph-mon[59917]: Health check cleared: FS_DEGRADED (was: 1 filesystem is degraded) 2026-03-10T07:48:33.919 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:48:33 vm08 ceph-mon[59917]: mds.? 
[v2:192.168.123.105:6826/723078808,v1:192.168.123.105:6827/723078808] up:active 2026-03-10T07:48:33.919 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:48:33 vm08 ceph-mon[59917]: fsmap cephfs:1 {0=cephfs.vm05.omfhnh=up:active} 3 up:standby 2026-03-10T07:48:33.967 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:33 stdin-killer INFO: expiration expected; waiting 300 seconds for command to complete 2026-03-10T07:48:34.036 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:34 stdin-killer INFO: command exited with status 0: exiting normally with same code! 2026-03-10T07:48:34.040 DEBUG:teuthology.orchestra.run.vm05:> (cd / && exec stdin-killer --timeout=300 -- bash -c ' 2026-03-10T07:48:34.040 DEBUG:teuthology.orchestra.run.vm05:> set -e 2026-03-10T07:48:34.040 DEBUG:teuthology.orchestra.run.vm05:> sudo ip link set brx.0 up 2026-03-10T07:48:34.040 DEBUG:teuthology.orchestra.run.vm05:> sudo ip link set dev brx.0 master ceph-brx 2026-03-10T07:48:34.040 DEBUG:teuthology.orchestra.run.vm05:> ') 2026-03-10T07:48:34.117 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:34 stdin-killer INFO: expiration expected; waiting 300 seconds for command to complete 2026-03-10T07:48:34.147 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:34 stdin-killer INFO: command exited with status 0: exiting normally with same code! 
2026-03-10T07:48:34.152 INFO:tasks.cephfs.fuse_mount:Client client.0 config is {} 2026-03-10T07:48:34.152 INFO:teuthology.orchestra.run:Running command with timeout 60 2026-03-10T07:48:34.152 DEBUG:teuthology.orchestra.run.vm05:> mkdir -p -v /home/ubuntu/cephtest/mnt.0 2026-03-10T07:48:34.206 INFO:teuthology.orchestra.run.vm05.stdout:mkdir: created directory '/home/ubuntu/cephtest/mnt.0' 2026-03-10T07:48:34.207 INFO:teuthology.orchestra.run:Running command with timeout 60 2026-03-10T07:48:34.207 DEBUG:teuthology.orchestra.run.vm05:> chmod 0000 /home/ubuntu/cephtest/mnt.0 2026-03-10T07:48:34.263 DEBUG:teuthology.orchestra.run.vm05:> sudo modprobe fuse 2026-03-10T07:48:34.326 DEBUG:teuthology.orchestra.run.vm05:> cat /proc/self/mounts | awk '{print $2}' 2026-03-10T07:48:34.382 INFO:teuthology.orchestra.run.vm05.stdout:/proc 2026-03-10T07:48:34.382 INFO:teuthology.orchestra.run.vm05.stdout:/sys 2026-03-10T07:48:34.382 INFO:teuthology.orchestra.run.vm05.stdout:/dev 2026-03-10T07:48:34.383 INFO:teuthology.orchestra.run.vm05.stdout:/sys/kernel/security 2026-03-10T07:48:34.383 INFO:teuthology.orchestra.run.vm05.stdout:/dev/shm 2026-03-10T07:48:34.383 INFO:teuthology.orchestra.run.vm05.stdout:/dev/pts 2026-03-10T07:48:34.383 INFO:teuthology.orchestra.run.vm05.stdout:/run 2026-03-10T07:48:34.383 INFO:teuthology.orchestra.run.vm05.stdout:/sys/fs/cgroup 2026-03-10T07:48:34.383 INFO:teuthology.orchestra.run.vm05.stdout:/sys/fs/pstore 2026-03-10T07:48:34.383 INFO:teuthology.orchestra.run.vm05.stdout:/sys/fs/bpf 2026-03-10T07:48:34.383 INFO:teuthology.orchestra.run.vm05.stdout:/sys/kernel/config 2026-03-10T07:48:34.383 INFO:teuthology.orchestra.run.vm05.stdout:/ 2026-03-10T07:48:34.383 INFO:teuthology.orchestra.run.vm05.stdout:/var/lib/nfs/rpc_pipefs 2026-03-10T07:48:34.383 INFO:teuthology.orchestra.run.vm05.stdout:/sys/fs/selinux 2026-03-10T07:48:34.383 INFO:teuthology.orchestra.run.vm05.stdout:/proc/sys/fs/binfmt_misc 2026-03-10T07:48:34.383 
INFO:teuthology.orchestra.run.vm05.stdout:/dev/hugepages 2026-03-10T07:48:34.383 INFO:teuthology.orchestra.run.vm05.stdout:/dev/mqueue 2026-03-10T07:48:34.383 INFO:teuthology.orchestra.run.vm05.stdout:/sys/kernel/debug 2026-03-10T07:48:34.383 INFO:teuthology.orchestra.run.vm05.stdout:/sys/kernel/tracing 2026-03-10T07:48:34.383 INFO:teuthology.orchestra.run.vm05.stdout:/run/credentials/systemd-sysctl.service 2026-03-10T07:48:34.383 INFO:teuthology.orchestra.run.vm05.stdout:/sys/fs/fuse/connections 2026-03-10T07:48:34.383 INFO:teuthology.orchestra.run.vm05.stdout:/run/credentials/systemd-sysusers.service 2026-03-10T07:48:34.383 INFO:teuthology.orchestra.run.vm05.stdout:/run/credentials/systemd-tmpfiles-setup-dev.service 2026-03-10T07:48:34.383 INFO:teuthology.orchestra.run.vm05.stdout:/run/credentials/systemd-tmpfiles-setup.service 2026-03-10T07:48:34.383 INFO:teuthology.orchestra.run.vm05.stdout:/run/user/1000 2026-03-10T07:48:34.383 INFO:teuthology.orchestra.run.vm05.stdout:/var/lib/containers/storage/overlay 2026-03-10T07:48:34.383 INFO:teuthology.orchestra.run.vm05.stdout:/var/lib/containers/storage/overlay/637f97e52295e941f6f75b6130fb3e770445184663fc1bb506b5ae2ebe4558f4/merged 2026-03-10T07:48:34.383 INFO:teuthology.orchestra.run.vm05.stdout:/var/lib/containers/storage/overlay/6b02429a0911514f3cd19ac69cf43a8ea5106b6e72a83e9229fe22389d765aaa/merged 2026-03-10T07:48:34.383 INFO:teuthology.orchestra.run.vm05.stdout:/run/user/0 2026-03-10T07:48:34.383 INFO:teuthology.orchestra.run.vm05.stdout:/proc/sys/fs/binfmt_misc 2026-03-10T07:48:34.383 INFO:teuthology.orchestra.run.vm05.stdout:/var/lib/containers/storage/overlay/164242f85a4c1a641b543bda3adc5860b888bb6bc2ae3c45adbac8a2bcd2c101/merged 2026-03-10T07:48:34.383 INFO:teuthology.orchestra.run.vm05.stdout:/var/lib/containers/storage/overlay/68d4bc585f01a784471218acea358a30e27622ae167ee17fd02e7a3ef5f75124/merged 2026-03-10T07:48:34.383 
INFO:teuthology.orchestra.run.vm05.stdout:/var/lib/containers/storage/overlay/71290cd2b149a710b085d062f2d4e8e314264177b41ecf327b45722e7edbfe8a/merged 2026-03-10T07:48:34.383 INFO:teuthology.orchestra.run.vm05.stdout:/var/lib/containers/storage/overlay/369886ed23f0e0b35a8c3d24b4788c43a04dbf7af9315929911f2dbdc72af39e/merged 2026-03-10T07:48:34.383 INFO:teuthology.orchestra.run.vm05.stdout:/var/lib/containers/storage/overlay/3af9d1b0fd692151dd09aa91c77e1daaf37350b33405f4bc227aef4583930895/merged 2026-03-10T07:48:34.383 INFO:teuthology.orchestra.run.vm05.stdout:/var/lib/containers/storage/overlay/dd6cbf62042ede1660195bf3a76682850a7548777003972940a71225d796a240/merged 2026-03-10T07:48:34.383 INFO:teuthology.orchestra.run.vm05.stdout:/var/lib/containers/storage/overlay/600189ffe511a326c280c0fb52c63fe88ece35c67417080949f5893b96d9a8a7/merged 2026-03-10T07:48:34.383 INFO:teuthology.orchestra.run.vm05.stdout:/var/lib/containers/storage/overlay/9682afbef251acf6fa23a1233c6586b09926bb6ffcb7793cd3c76b96ef110f44/merged 2026-03-10T07:48:34.383 INFO:teuthology.orchestra.run.vm05.stdout:/var/lib/containers/storage/overlay/d3a096fbf8e370401ad13e017734abc866da6c53eb44bff5f525b32b02a68043/merged 2026-03-10T07:48:34.383 INFO:teuthology.orchestra.run.vm05.stdout:/var/lib/containers/storage/overlay/8f5eb7d374ef7fce98c5614a62eb94eba689b86f18dcb044c2759827cb8e089b/merged 2026-03-10T07:48:34.383 INFO:teuthology.orchestra.run.vm05.stdout:/var/lib/containers/storage/overlay/04fb334d946dbff6f6aff12640e007f02eb17680a5570302a4e1be5e05f81b8c/merged 2026-03-10T07:48:34.383 INFO:teuthology.orchestra.run.vm05.stdout:/run/netns 2026-03-10T07:48:34.383 INFO:teuthology.orchestra.run.vm05.stdout:/run/netns/ceph-ns--home-ubuntu-cephtest-mnt.0 2026-03-10T07:48:34.383 INFO:teuthology.orchestra.run.vm05.stdout:/run/netns/ceph-ns--home-ubuntu-cephtest-mnt.0 2026-03-10T07:48:34.383 INFO:teuthology.orchestra.run:Running command with timeout 300 2026-03-10T07:48:34.383 DEBUG:teuthology.orchestra.run.vm05:> ls 
/sys/fs/fuse/connections 2026-03-10T07:48:34.438 INFO:tasks.cephfs.fuse_mount:Pre-mount connections: [] 2026-03-10T07:48:34.438 DEBUG:teuthology.orchestra.run.vm05:> (cd /home/ubuntu/cephtest && exec sudo nsenter --net=/var/run/netns/ceph-ns--home-ubuntu-cephtest-mnt.0 sudo adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage daemon-helper kill ceph-fuse -f --admin-socket '/var/run/ceph/$cluster-$name.$pid.asok' /home/ubuntu/cephtest/mnt.0 --id 0) 2026-03-10T07:48:34.480 DEBUG:teuthology.orchestra.run.vm05:> sudo modprobe fuse 2026-03-10T07:48:34.506 DEBUG:teuthology.orchestra.run.vm05:> cat /proc/self/mounts | awk '{print $2}' 2026-03-10T07:48:34.550 INFO:tasks.cephfs.fuse_mount.ceph-fuse.0.vm05.stderr:2026-03-10T07:48:34.546+0000 7f888956f480 -1 init, newargv = 0x564262f01170 newargc=15 2026-03-10T07:48:34.550 INFO:tasks.cephfs.fuse_mount.ceph-fuse.0.vm05.stderr:ceph-fuse[96652]: starting ceph client 2026-03-10T07:48:34.558 INFO:tasks.cephfs.fuse_mount.ceph-fuse.0.vm05.stderr:ceph-fuse[96652]: starting fuse 2026-03-10T07:48:34.570 INFO:teuthology.orchestra.run.vm05.stdout:/proc 2026-03-10T07:48:34.570 INFO:teuthology.orchestra.run.vm05.stdout:/sys 2026-03-10T07:48:34.570 INFO:teuthology.orchestra.run.vm05.stdout:/dev 2026-03-10T07:48:34.570 INFO:teuthology.orchestra.run.vm05.stdout:/sys/kernel/security 2026-03-10T07:48:34.570 INFO:teuthology.orchestra.run.vm05.stdout:/dev/shm 2026-03-10T07:48:34.570 INFO:teuthology.orchestra.run.vm05.stdout:/dev/pts 2026-03-10T07:48:34.570 INFO:teuthology.orchestra.run.vm05.stdout:/run 2026-03-10T07:48:34.570 INFO:teuthology.orchestra.run.vm05.stdout:/sys/fs/cgroup 2026-03-10T07:48:34.570 INFO:teuthology.orchestra.run.vm05.stdout:/sys/fs/pstore 2026-03-10T07:48:34.570 INFO:teuthology.orchestra.run.vm05.stdout:/sys/fs/bpf 2026-03-10T07:48:34.570 INFO:teuthology.orchestra.run.vm05.stdout:/sys/kernel/config 2026-03-10T07:48:34.570 INFO:teuthology.orchestra.run.vm05.stdout:/ 2026-03-10T07:48:34.570 
INFO:teuthology.orchestra.run.vm05.stdout:/var/lib/nfs/rpc_pipefs 2026-03-10T07:48:34.570 INFO:teuthology.orchestra.run.vm05.stdout:/sys/fs/selinux 2026-03-10T07:48:34.570 INFO:teuthology.orchestra.run.vm05.stdout:/proc/sys/fs/binfmt_misc 2026-03-10T07:48:34.570 INFO:teuthology.orchestra.run.vm05.stdout:/dev/hugepages 2026-03-10T07:48:34.570 INFO:teuthology.orchestra.run.vm05.stdout:/dev/mqueue 2026-03-10T07:48:34.570 INFO:teuthology.orchestra.run.vm05.stdout:/sys/kernel/debug 2026-03-10T07:48:34.570 INFO:teuthology.orchestra.run.vm05.stdout:/sys/kernel/tracing 2026-03-10T07:48:34.570 INFO:teuthology.orchestra.run.vm05.stdout:/run/credentials/systemd-sysctl.service 2026-03-10T07:48:34.570 INFO:teuthology.orchestra.run.vm05.stdout:/sys/fs/fuse/connections 2026-03-10T07:48:34.570 INFO:teuthology.orchestra.run.vm05.stdout:/run/credentials/systemd-sysusers.service 2026-03-10T07:48:34.570 INFO:teuthology.orchestra.run.vm05.stdout:/run/credentials/systemd-tmpfiles-setup-dev.service 2026-03-10T07:48:34.570 INFO:teuthology.orchestra.run.vm05.stdout:/run/credentials/systemd-tmpfiles-setup.service 2026-03-10T07:48:34.570 INFO:teuthology.orchestra.run.vm05.stdout:/run/user/1000 2026-03-10T07:48:34.570 INFO:teuthology.orchestra.run.vm05.stdout:/var/lib/containers/storage/overlay 2026-03-10T07:48:34.570 INFO:teuthology.orchestra.run.vm05.stdout:/var/lib/containers/storage/overlay/637f97e52295e941f6f75b6130fb3e770445184663fc1bb506b5ae2ebe4558f4/merged 2026-03-10T07:48:34.570 INFO:teuthology.orchestra.run.vm05.stdout:/var/lib/containers/storage/overlay/6b02429a0911514f3cd19ac69cf43a8ea5106b6e72a83e9229fe22389d765aaa/merged 2026-03-10T07:48:34.570 INFO:teuthology.orchestra.run.vm05.stdout:/run/user/0 2026-03-10T07:48:34.570 INFO:teuthology.orchestra.run.vm05.stdout:/proc/sys/fs/binfmt_misc 2026-03-10T07:48:34.570 INFO:teuthology.orchestra.run.vm05.stdout:/var/lib/containers/storage/overlay/164242f85a4c1a641b543bda3adc5860b888bb6bc2ae3c45adbac8a2bcd2c101/merged 
2026-03-10T07:48:34.570 INFO:teuthology.orchestra.run.vm05.stdout:/var/lib/containers/storage/overlay/68d4bc585f01a784471218acea358a30e27622ae167ee17fd02e7a3ef5f75124/merged 2026-03-10T07:48:34.570 INFO:teuthology.orchestra.run.vm05.stdout:/var/lib/containers/storage/overlay/71290cd2b149a710b085d062f2d4e8e314264177b41ecf327b45722e7edbfe8a/merged 2026-03-10T07:48:34.570 INFO:teuthology.orchestra.run.vm05.stdout:/var/lib/containers/storage/overlay/369886ed23f0e0b35a8c3d24b4788c43a04dbf7af9315929911f2dbdc72af39e/merged 2026-03-10T07:48:34.570 INFO:teuthology.orchestra.run.vm05.stdout:/var/lib/containers/storage/overlay/3af9d1b0fd692151dd09aa91c77e1daaf37350b33405f4bc227aef4583930895/merged 2026-03-10T07:48:34.570 INFO:teuthology.orchestra.run.vm05.stdout:/var/lib/containers/storage/overlay/dd6cbf62042ede1660195bf3a76682850a7548777003972940a71225d796a240/merged 2026-03-10T07:48:34.570 INFO:teuthology.orchestra.run.vm05.stdout:/var/lib/containers/storage/overlay/600189ffe511a326c280c0fb52c63fe88ece35c67417080949f5893b96d9a8a7/merged 2026-03-10T07:48:34.570 INFO:teuthology.orchestra.run.vm05.stdout:/var/lib/containers/storage/overlay/9682afbef251acf6fa23a1233c6586b09926bb6ffcb7793cd3c76b96ef110f44/merged 2026-03-10T07:48:34.570 INFO:teuthology.orchestra.run.vm05.stdout:/var/lib/containers/storage/overlay/d3a096fbf8e370401ad13e017734abc866da6c53eb44bff5f525b32b02a68043/merged 2026-03-10T07:48:34.570 INFO:teuthology.orchestra.run.vm05.stdout:/var/lib/containers/storage/overlay/8f5eb7d374ef7fce98c5614a62eb94eba689b86f18dcb044c2759827cb8e089b/merged 2026-03-10T07:48:34.570 INFO:teuthology.orchestra.run.vm05.stdout:/var/lib/containers/storage/overlay/04fb334d946dbff6f6aff12640e007f02eb17680a5570302a4e1be5e05f81b8c/merged 2026-03-10T07:48:34.570 INFO:teuthology.orchestra.run.vm05.stdout:/run/netns 2026-03-10T07:48:34.571 INFO:teuthology.orchestra.run.vm05.stdout:/run/netns/ceph-ns--home-ubuntu-cephtest-mnt.0 2026-03-10T07:48:34.571 
INFO:teuthology.orchestra.run.vm05.stdout:/run/netns/ceph-ns--home-ubuntu-cephtest-mnt.0 2026-03-10T07:48:34.571 INFO:teuthology.orchestra.run.vm05.stdout:/home/ubuntu/cephtest/mnt.0 2026-03-10T07:48:34.571 INFO:teuthology.orchestra.run:Running command with timeout 300 2026-03-10T07:48:34.571 DEBUG:teuthology.orchestra.run.vm05:> ls /sys/fs/fuse/connections 2026-03-10T07:48:34.626 INFO:teuthology.orchestra.run.vm05.stdout:73 2026-03-10T07:48:34.626 INFO:tasks.cephfs.fuse_mount:Post-mount connections: [73] 2026-03-10T07:48:34.626 DEBUG:teuthology.orchestra.run.vm05:> sudo stdin-killer -- python3 -c ' 2026-03-10T07:48:34.626 DEBUG:teuthology.orchestra.run.vm05:> import glob 2026-03-10T07:48:34.626 DEBUG:teuthology.orchestra.run.vm05:> import re 2026-03-10T07:48:34.626 DEBUG:teuthology.orchestra.run.vm05:> import os 2026-03-10T07:48:34.626 DEBUG:teuthology.orchestra.run.vm05:> import subprocess 2026-03-10T07:48:34.626 DEBUG:teuthology.orchestra.run.vm05:> 2026-03-10T07:48:34.626 DEBUG:teuthology.orchestra.run.vm05:> def _find_admin_socket(client_name): 2026-03-10T07:48:34.626 DEBUG:teuthology.orchestra.run.vm05:> asok_path = "/var/run/ceph/ceph-client.0.*.asok" 2026-03-10T07:48:34.626 DEBUG:teuthology.orchestra.run.vm05:> files = glob.glob(asok_path) 2026-03-10T07:48:34.626 DEBUG:teuthology.orchestra.run.vm05:> mountpoint = "/home/ubuntu/cephtest/mnt.0" 2026-03-10T07:48:34.626 DEBUG:teuthology.orchestra.run.vm05:> 2026-03-10T07:48:34.626 DEBUG:teuthology.orchestra.run.vm05:> # Given a non-glob path, it better be there 2026-03-10T07:48:34.626 DEBUG:teuthology.orchestra.run.vm05:> if "*" not in asok_path: 2026-03-10T07:48:34.626 DEBUG:teuthology.orchestra.run.vm05:> assert(len(files) == 1) 2026-03-10T07:48:34.626 DEBUG:teuthology.orchestra.run.vm05:> return files[0] 2026-03-10T07:48:34.626 DEBUG:teuthology.orchestra.run.vm05:> 2026-03-10T07:48:34.626 DEBUG:teuthology.orchestra.run.vm05:> for f in files: 2026-03-10T07:48:34.627 DEBUG:teuthology.orchestra.run.vm05:> pid = 
re.match(".*\.(\d+)\.asok$", f).group(1) 2026-03-10T07:48:34.627 DEBUG:teuthology.orchestra.run.vm05:> if os.path.exists("/proc/{0}".format(pid)): 2026-03-10T07:48:34.627 DEBUG:teuthology.orchestra.run.vm05:> with open("/proc/{0}/cmdline".format(pid), '"'"'r'"'"') as proc_f: 2026-03-10T07:48:34.627 DEBUG:teuthology.orchestra.run.vm05:> contents = proc_f.read() 2026-03-10T07:48:34.627 DEBUG:teuthology.orchestra.run.vm05:> if mountpoint in contents: 2026-03-10T07:48:34.627 DEBUG:teuthology.orchestra.run.vm05:> return f 2026-03-10T07:48:34.627 DEBUG:teuthology.orchestra.run.vm05:> raise RuntimeError("Client socket {0} not found".format(client_name)) 2026-03-10T07:48:34.627 DEBUG:teuthology.orchestra.run.vm05:> 2026-03-10T07:48:34.627 DEBUG:teuthology.orchestra.run.vm05:> print(_find_admin_socket("client.0")) 2026-03-10T07:48:34.627 DEBUG:teuthology.orchestra.run.vm05:> ' 2026-03-10T07:48:34.719 INFO:teuthology.orchestra.run.vm05.stdout:/var/run/ceph/ceph-client.0.96652.asok 2026-03-10T07:48:34.721 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:34 stdin-killer INFO: command exited with status 0: exiting normally with same code! 
2026-03-10T07:48:34.727 INFO:tasks.cephfs.fuse_mount:Found client admin socket at /var/run/ceph/ceph-client.0.96652.asok 2026-03-10T07:48:34.727 INFO:teuthology.orchestra.run:Running command with timeout 300 2026-03-10T07:48:34.727 DEBUG:teuthology.orchestra.run.vm05:> sudo ceph --admin-daemon /var/run/ceph/ceph-client.0.96652.asok status 2026-03-10T07:48:34.787 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:48:34 vm05.local ceph-mon[50387]: pgmap v82: 65 pgs: 65 active+clean; 451 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 642 B/s rd, 1.8 KiB/s wr, 6 op/s 2026-03-10T07:48:34.831 INFO:teuthology.orchestra.run.vm05.stdout:{ 2026-03-10T07:48:34.831 INFO:teuthology.orchestra.run.vm05.stdout: "metadata": { 2026-03-10T07:48:34.831 INFO:teuthology.orchestra.run.vm05.stdout: "ceph_sha1": "7fe91d5d5842e04be3b4f514d6dd990c54b29c76", 2026-03-10T07:48:34.831 INFO:teuthology.orchestra.run.vm05.stdout: "ceph_version": "ceph version 18.2.1 (7fe91d5d5842e04be3b4f514d6dd990c54b29c76) reef (stable)", 2026-03-10T07:48:34.831 INFO:teuthology.orchestra.run.vm05.stdout: "entity_id": "0", 2026-03-10T07:48:34.831 INFO:teuthology.orchestra.run.vm05.stdout: "hostname": "vm05.local", 2026-03-10T07:48:34.831 INFO:teuthology.orchestra.run.vm05.stdout: "mount_point": "/home/ubuntu/cephtest/mnt.0", 2026-03-10T07:48:34.831 INFO:teuthology.orchestra.run.vm05.stdout: "pid": "96652", 2026-03-10T07:48:34.831 INFO:teuthology.orchestra.run.vm05.stdout: "root": "/" 2026-03-10T07:48:34.831 INFO:teuthology.orchestra.run.vm05.stdout: }, 2026-03-10T07:48:34.831 INFO:teuthology.orchestra.run.vm05.stdout: "dentry_count": 0, 2026-03-10T07:48:34.831 INFO:teuthology.orchestra.run.vm05.stdout: "dentry_pinned_count": 0, 2026-03-10T07:48:34.831 INFO:teuthology.orchestra.run.vm05.stdout: "id": 14542, 2026-03-10T07:48:34.831 INFO:teuthology.orchestra.run.vm05.stdout: "inst": { 2026-03-10T07:48:34.831 INFO:teuthology.orchestra.run.vm05.stdout: "name": { 2026-03-10T07:48:34.831 
INFO:teuthology.orchestra.run.vm05.stdout: "type": "client", 2026-03-10T07:48:34.831 INFO:teuthology.orchestra.run.vm05.stdout: "num": 14542 2026-03-10T07:48:34.831 INFO:teuthology.orchestra.run.vm05.stdout: }, 2026-03-10T07:48:34.831 INFO:teuthology.orchestra.run.vm05.stdout: "addr": { 2026-03-10T07:48:34.831 INFO:teuthology.orchestra.run.vm05.stdout: "type": "v1", 2026-03-10T07:48:34.831 INFO:teuthology.orchestra.run.vm05.stdout: "addr": "192.168.144.1:0", 2026-03-10T07:48:34.831 INFO:teuthology.orchestra.run.vm05.stdout: "nonce": 2345317662 2026-03-10T07:48:34.831 INFO:teuthology.orchestra.run.vm05.stdout: } 2026-03-10T07:48:34.831 INFO:teuthology.orchestra.run.vm05.stdout: }, 2026-03-10T07:48:34.831 INFO:teuthology.orchestra.run.vm05.stdout: "addr": { 2026-03-10T07:48:34.831 INFO:teuthology.orchestra.run.vm05.stdout: "type": "v1", 2026-03-10T07:48:34.831 INFO:teuthology.orchestra.run.vm05.stdout: "addr": "192.168.144.1:0", 2026-03-10T07:48:34.831 INFO:teuthology.orchestra.run.vm05.stdout: "nonce": 2345317662 2026-03-10T07:48:34.831 INFO:teuthology.orchestra.run.vm05.stdout: }, 2026-03-10T07:48:34.831 INFO:teuthology.orchestra.run.vm05.stdout: "inst_str": "client.14542 192.168.144.1:0/2345317662", 2026-03-10T07:48:34.831 INFO:teuthology.orchestra.run.vm05.stdout: "addr_str": "192.168.144.1:0/2345317662", 2026-03-10T07:48:34.831 INFO:teuthology.orchestra.run.vm05.stdout: "inode_count": 1, 2026-03-10T07:48:34.831 INFO:teuthology.orchestra.run.vm05.stdout: "mds_epoch": 12, 2026-03-10T07:48:34.831 INFO:teuthology.orchestra.run.vm05.stdout: "osd_epoch": 39, 2026-03-10T07:48:34.831 INFO:teuthology.orchestra.run.vm05.stdout: "osd_epoch_barrier": 0, 2026-03-10T07:48:34.831 INFO:teuthology.orchestra.run.vm05.stdout: "blocklisted": false, 2026-03-10T07:48:34.831 INFO:teuthology.orchestra.run.vm05.stdout: "fs_name": "cephfs" 2026-03-10T07:48:34.831 INFO:teuthology.orchestra.run.vm05.stdout:} 2026-03-10T07:48:34.836 DEBUG:tasks.ceph_fuse:passing mntargs=[] 
2026-03-10T07:48:34.837 DEBUG:teuthology.orchestra.run.vm05:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell --fsid 12e9780e-1c55-11f1-8896-79f7c2e9b508 -- ceph fs ls 2026-03-10T07:48:34.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:48:34 vm08 ceph-mon[59917]: pgmap v82: 65 pgs: 65 active+clean; 451 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 642 B/s rd, 1.8 KiB/s wr, 6 op/s 2026-03-10T07:48:35.018 INFO:teuthology.orchestra.run.vm05.stderr:Inferring config /var/lib/ceph/12e9780e-1c55-11f1-8896-79f7c2e9b508/mon.vm05/config 2026-03-10T07:48:35.272 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:35.269+0000 7f8be7351700 1 -- 192.168.123.105:0/418927127 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f8be0103340 msgr2=0x7f8be0103720 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:48:35.272 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:35.269+0000 7f8be7351700 1 --2- 192.168.123.105:0/418927127 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f8be0103340 0x7f8be0103720 secure :-1 s=READY pgs=274 cs=0 l=1 rev1=1 crypto rx=0x7f8bd0009b50 tx=0x7f8bd0009e60 comp rx=0 tx=0).stop 2026-03-10T07:48:35.272 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:35.270+0000 7f8be7351700 1 -- 192.168.123.105:0/418927127 shutdown_connections 2026-03-10T07:48:35.272 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:35.270+0000 7f8be7351700 1 --2- 192.168.123.105:0/418927127 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f8be0103cf0 0x7f8be0107d40 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:48:35.272 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:35.270+0000 7f8be7351700 1 --2- 192.168.123.105:0/418927127 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f8be0103340 0x7f8be0103720 unknown :-1 s=CLOSED pgs=274 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 
tx=0).stop 2026-03-10T07:48:35.272 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:35.270+0000 7f8be7351700 1 -- 192.168.123.105:0/418927127 >> 192.168.123.105:0/418927127 conn(0x7f8be00feb90 msgr2=0x7f8be0100fb0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:48:35.272 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:35.271+0000 7f8be7351700 1 -- 192.168.123.105:0/418927127 shutdown_connections 2026-03-10T07:48:35.272 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:35.271+0000 7f8be7351700 1 -- 192.168.123.105:0/418927127 wait complete. 2026-03-10T07:48:35.273 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:35.271+0000 7f8be7351700 1 Processor -- start 2026-03-10T07:48:35.273 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:35.271+0000 7f8be7351700 1 -- start start 2026-03-10T07:48:35.273 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:35.272+0000 7f8be7351700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f8be0103cf0 0x7f8be0199090 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:48:35.274 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:35.272+0000 7f8be7351700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f8be01995d0 0x7f8be019da40 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:48:35.274 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:35.272+0000 7f8be7351700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f8be0199bf0 con 0x7f8be01995d0 2026-03-10T07:48:35.274 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:35.272+0000 7f8be7351700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f8be0199d60 con 0x7f8be0103cf0 2026-03-10T07:48:35.274 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:35.272+0000 7f8be48ec700 1 --2- >> 
[v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f8be01995d0 0x7f8be019da40 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:48:35.274 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:35.272+0000 7f8be48ec700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f8be01995d0 0x7f8be019da40 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.105:37080/0 (socket says 192.168.123.105:37080) 2026-03-10T07:48:35.274 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:35.272+0000 7f8be48ec700 1 -- 192.168.123.105:0/4236183595 learned_addr learned my addr 192.168.123.105:0/4236183595 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T07:48:35.274 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:35.272+0000 7f8be48ec700 1 -- 192.168.123.105:0/4236183595 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f8be0103cf0 msgr2=0x7f8be0199090 unknown :-1 s=STATE_CONNECTING l=1).mark_down 2026-03-10T07:48:35.274 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:35.272+0000 7f8be50ed700 1 --2- 192.168.123.105:0/4236183595 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f8be0103cf0 0x7f8be0199090 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:48:35.274 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:35.273+0000 7f8be48ec700 1 --2- 192.168.123.105:0/4236183595 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f8be0103cf0 0x7f8be0199090 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:48:35.274 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:35.273+0000 7f8be48ec700 1 -- 
192.168.123.105:0/4236183595 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f8bd00097e0 con 0x7f8be01995d0 2026-03-10T07:48:35.274 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:35.273+0000 7f8be50ed700 1 --2- 192.168.123.105:0/4236183595 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f8be0103cf0 0x7f8be0199090 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request state changed! 2026-03-10T07:48:35.274 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:35.273+0000 7f8be48ec700 1 --2- 192.168.123.105:0/4236183595 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f8be01995d0 0x7f8be019da40 secure :-1 s=READY pgs=275 cs=0 l=1 rev1=1 crypto rx=0x7f8bdc009fd0 tx=0x7f8bdc00eea0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:48:35.275 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:35.273+0000 7f8bd67fc700 1 -- 192.168.123.105:0/4236183595 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f8bdc00cca0 con 0x7f8be01995d0 2026-03-10T07:48:35.275 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:35.273+0000 7f8bd67fc700 1 -- 192.168.123.105:0/4236183595 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f8bdc00ce00 con 0x7f8be01995d0 2026-03-10T07:48:35.275 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:35.273+0000 7f8bd67fc700 1 -- 192.168.123.105:0/4236183595 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f8bdc010490 con 0x7f8be01995d0 2026-03-10T07:48:35.275 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:35.273+0000 7f8be7351700 1 -- 192.168.123.105:0/4236183595 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f8be019e040 con 0x7f8be01995d0 
2026-03-10T07:48:35.275 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:35.273+0000 7f8be7351700 1 -- 192.168.123.105:0/4236183595 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f8be019e590 con 0x7f8be01995d0 2026-03-10T07:48:35.276 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:35.274+0000 7f8be7351700 1 -- 192.168.123.105:0/4236183595 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f8be010b6e0 con 0x7f8be01995d0 2026-03-10T07:48:35.277 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:35.276+0000 7f8bd67fc700 1 -- 192.168.123.105:0/4236183595 <== mon.0 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 19) v1 ==== 90308+0+0 (secure 0 0 0) 0x7f8bdc004750 con 0x7f8be01995d0 2026-03-10T07:48:35.278 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:35.276+0000 7f8bd67fc700 1 --2- 192.168.123.105:0/4236183595 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f8bcc06c5b0 0x7f8bcc06ea70 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:48:35.278 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:35.276+0000 7f8bd67fc700 1 -- 192.168.123.105:0/4236183595 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(39..39 src has 1..39) v4 ==== 5362+0+0 (secure 0 0 0) 0x7f8bdc014070 con 0x7f8be01995d0 2026-03-10T07:48:35.279 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:35.277+0000 7f8bd67fc700 1 -- 192.168.123.105:0/4236183595 <== mon.0 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7f8bdc056b20 con 0x7f8be01995d0 2026-03-10T07:48:35.279 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:35.277+0000 7f8be50ed700 1 --2- 192.168.123.105:0/4236183595 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f8bcc06c5b0 0x7f8bcc06ea70 unknown :-1 
s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:48:35.279 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:35.278+0000 7f8be50ed700 1 --2- 192.168.123.105:0/4236183595 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f8bcc06c5b0 0x7f8bcc06ea70 secure :-1 s=READY pgs=118 cs=0 l=1 rev1=1 crypto rx=0x7f8bd0006010 tx=0x7f8bd00058e0 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:48:35.410 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:35.408+0000 7f8be7351700 1 -- 192.168.123.105:0/4236183595 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "fs ls"} v 0) v1 -- 0x7f8be019e870 con 0x7f8be01995d0 2026-03-10T07:48:35.410 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:35.408+0000 7f8bd67fc700 1 -- 192.168.123.105:0/4236183595 <== mon.0 v2:192.168.123.105:3300/0 7 ==== mon_command_ack([{"prefix": "fs ls"}]=0 v12) v1 ==== 53+0+83 (secure 0 0 0) 0x7f8bdc05a140 con 0x7f8be01995d0 2026-03-10T07:48:35.410 INFO:teuthology.orchestra.run.vm05.stdout:name: cephfs, metadata pool: cephfs.cephfs.meta, data pools: [cephfs.cephfs.data ] 2026-03-10T07:48:35.412 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:35.411+0000 7f8be7351700 1 -- 192.168.123.105:0/4236183595 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f8bcc06c5b0 msgr2=0x7f8bcc06ea70 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:48:35.412 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:35.411+0000 7f8be7351700 1 --2- 192.168.123.105:0/4236183595 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f8bcc06c5b0 0x7f8bcc06ea70 secure :-1 s=READY pgs=118 cs=0 l=1 rev1=1 crypto rx=0x7f8bd0006010 tx=0x7f8bd00058e0 comp rx=0 tx=0).stop 2026-03-10T07:48:35.413 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:35.411+0000 
7f8be7351700 1 -- 192.168.123.105:0/4236183595 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f8be01995d0 msgr2=0x7f8be019da40 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:48:35.413 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:35.411+0000 7f8be7351700 1 --2- 192.168.123.105:0/4236183595 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f8be01995d0 0x7f8be019da40 secure :-1 s=READY pgs=275 cs=0 l=1 rev1=1 crypto rx=0x7f8bdc009fd0 tx=0x7f8bdc00eea0 comp rx=0 tx=0).stop 2026-03-10T07:48:35.413 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:35.412+0000 7f8be7351700 1 -- 192.168.123.105:0/4236183595 shutdown_connections 2026-03-10T07:48:35.413 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:35.412+0000 7f8be7351700 1 --2- 192.168.123.105:0/4236183595 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f8bcc06c5b0 0x7f8bcc06ea70 unknown :-1 s=CLOSED pgs=118 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:48:35.413 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:35.412+0000 7f8be7351700 1 --2- 192.168.123.105:0/4236183595 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f8be0103cf0 0x7f8be0199090 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:48:35.413 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:35.412+0000 7f8be7351700 1 --2- 192.168.123.105:0/4236183595 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f8be01995d0 0x7f8be019da40 unknown :-1 s=CLOSED pgs=275 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:48:35.413 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:35.412+0000 7f8be7351700 1 -- 192.168.123.105:0/4236183595 >> 192.168.123.105:0/4236183595 conn(0x7f8be00feb90 msgr2=0x7f8be01002f0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:48:35.414 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:35.412+0000 7f8be7351700 1 -- 192.168.123.105:0/4236183595 shutdown_connections 2026-03-10T07:48:35.414 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:35.412+0000 7f8be7351700 1 -- 192.168.123.105:0/4236183595 wait complete. 2026-03-10T07:48:35.472 INFO:tasks.cephfs.mount:Mounting default Ceph FS; just confirmed its presence on cluster 2026-03-10T07:48:35.472 INFO:tasks.cephfs.mount:Mounting Ceph FS. Following are details of mount; remember "None" represents Python type None - 2026-03-10T07:48:35.472 INFO:tasks.cephfs.mount:self.client_remote.hostname = vm08.local 2026-03-10T07:48:35.472 INFO:tasks.cephfs.mount:self.client.name = client.1 2026-03-10T07:48:35.472 INFO:tasks.cephfs.mount:self.hostfs_mntpt = /home/ubuntu/cephtest/mnt.1 2026-03-10T07:48:35.472 INFO:tasks.cephfs.mount:self.cephfs_name = None 2026-03-10T07:48:35.472 INFO:tasks.cephfs.mount:self.cephfs_mntpt = None 2026-03-10T07:48:35.472 INFO:tasks.cephfs.mount:self.client_keyring_path = None 2026-03-10T07:48:35.472 INFO:tasks.cephfs.mount:Setting the 'None' netns for '/home/ubuntu/cephtest/mnt.1' 2026-03-10T07:48:35.472 INFO:teuthology.orchestra.run:Running command with timeout 300 2026-03-10T07:48:35.472 DEBUG:teuthology.orchestra.run.vm08:> ip addr 2026-03-10T07:48:35.487 INFO:teuthology.orchestra.run.vm08.stdout:1: lo: mtu 65536 qdisc noqueue state UNKNOWN group default qlen 1000 2026-03-10T07:48:35.487 INFO:teuthology.orchestra.run.vm08.stdout: link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00 2026-03-10T07:48:35.487 INFO:teuthology.orchestra.run.vm08.stdout: inet 127.0.0.1/8 scope host lo 2026-03-10T07:48:35.487 INFO:teuthology.orchestra.run.vm08.stdout: valid_lft forever preferred_lft forever 2026-03-10T07:48:35.487 INFO:teuthology.orchestra.run.vm08.stdout: inet6 ::1/128 scope host 2026-03-10T07:48:35.487 INFO:teuthology.orchestra.run.vm08.stdout: valid_lft forever preferred_lft forever 2026-03-10T07:48:35.487 
INFO:teuthology.orchestra.run.vm08.stdout:2: eth0: mtu 1500 qdisc fq_codel state UP group default qlen 1000 2026-03-10T07:48:35.487 INFO:teuthology.orchestra.run.vm08.stdout: link/ether 52:55:00:00:00:08 brd ff:ff:ff:ff:ff:ff 2026-03-10T07:48:35.487 INFO:teuthology.orchestra.run.vm08.stdout: altname enp0s3 2026-03-10T07:48:35.487 INFO:teuthology.orchestra.run.vm08.stdout: altname ens3 2026-03-10T07:48:35.487 INFO:teuthology.orchestra.run.vm08.stdout: inet 192.168.123.108/24 brd 192.168.123.255 scope global dynamic noprefixroute eth0 2026-03-10T07:48:35.487 INFO:teuthology.orchestra.run.vm08.stdout: valid_lft 3172sec preferred_lft 3172sec 2026-03-10T07:48:35.487 INFO:teuthology.orchestra.run.vm08.stdout: inet6 fe80::5055:ff:fe00:8/64 scope link noprefixroute 2026-03-10T07:48:35.487 INFO:teuthology.orchestra.run.vm08.stdout: valid_lft forever preferred_lft forever 2026-03-10T07:48:35.487 INFO:tasks.cephfs.mount:Setuping the 'ceph-brx' with 192.168.159.254/20 2026-03-10T07:48:35.487 DEBUG:teuthology.orchestra.run.vm08:> (cd / && exec stdin-killer --timeout=300 -- bash -c ' 2026-03-10T07:48:35.487 DEBUG:teuthology.orchestra.run.vm08:> set -e 2026-03-10T07:48:35.487 DEBUG:teuthology.orchestra.run.vm08:> sudo ip link add name ceph-brx type bridge 2026-03-10T07:48:35.487 DEBUG:teuthology.orchestra.run.vm08:> sudo ip addr flush dev ceph-brx 2026-03-10T07:48:35.487 DEBUG:teuthology.orchestra.run.vm08:> sudo ip link set ceph-brx up 2026-03-10T07:48:35.487 DEBUG:teuthology.orchestra.run.vm08:> sudo ip addr add 192.168.159.254/20 brd 192.168.159.255 dev ceph-brx 2026-03-10T07:48:35.487 DEBUG:teuthology.orchestra.run.vm08:> ') 2026-03-10T07:48:35.561 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:48:35 stdin-killer INFO: expiration expected; waiting 300 seconds for command to complete 2026-03-10T07:48:35.635 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:48:35 stdin-killer INFO: command exited with status 0: exiting normally with same code! 
2026-03-10T07:48:35.647 INFO:teuthology.orchestra.run:Running command with timeout 300 2026-03-10T07:48:35.647 DEBUG:teuthology.orchestra.run.vm08:> echo 1 | sudo tee /proc/sys/net/ipv4/ip_forward 2026-03-10T07:48:35.716 INFO:teuthology.orchestra.run.vm08.stdout:1 2026-03-10T07:48:35.718 INFO:teuthology.orchestra.run:Running command with timeout 300 2026-03-10T07:48:35.718 DEBUG:teuthology.orchestra.run.vm08:> ip r 2026-03-10T07:48:35.772 INFO:teuthology.orchestra.run.vm08.stdout:default via 192.168.123.1 dev eth0 proto dhcp src 192.168.123.108 metric 100 2026-03-10T07:48:35.772 INFO:teuthology.orchestra.run.vm08.stdout:192.168.123.0/24 dev eth0 proto kernel scope link src 192.168.123.108 metric 100 2026-03-10T07:48:35.772 INFO:teuthology.orchestra.run.vm08.stdout:192.168.144.0/20 dev ceph-brx proto kernel scope link src 192.168.159.254 2026-03-10T07:48:35.772 DEBUG:teuthology.orchestra.run.vm08:> (cd / && exec stdin-killer --timeout=300 -- bash -c ' 2026-03-10T07:48:35.772 DEBUG:teuthology.orchestra.run.vm08:> set -e 2026-03-10T07:48:35.772 DEBUG:teuthology.orchestra.run.vm08:> sudo iptables -A FORWARD -o eth0 -i ceph-brx -j ACCEPT 2026-03-10T07:48:35.772 DEBUG:teuthology.orchestra.run.vm08:> sudo iptables -A FORWARD -i eth0 -o ceph-brx -j ACCEPT 2026-03-10T07:48:35.772 DEBUG:teuthology.orchestra.run.vm08:> sudo iptables -t nat -A POSTROUTING -s 192.168.159.254/20 -o eth0 -j MASQUERADE 2026-03-10T07:48:35.773 DEBUG:teuthology.orchestra.run.vm08:> ') 2026-03-10T07:48:35.847 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:48:35 stdin-killer INFO: expiration expected; waiting 300 seconds for command to complete 2026-03-10T07:48:35.855 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:48:35 vm08 ceph-mon[59917]: from='client.? 
192.168.123.105:0/4236183595' entity='client.admin' cmd=[{"prefix": "fs ls"}]: dispatch 2026-03-10T07:48:35.906 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:48:35 stdin-killer INFO: command exited with status 0: exiting normally with same code! 2026-03-10T07:48:35.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:48:35 vm05.local ceph-mon[50387]: from='client.? 192.168.123.105:0/4236183595' entity='client.admin' cmd=[{"prefix": "fs ls"}]: dispatch 2026-03-10T07:48:35.910 INFO:teuthology.orchestra.run:Running command with timeout 300 2026-03-10T07:48:35.910 DEBUG:teuthology.orchestra.run.vm08:> ip netns list 2026-03-10T07:48:35.966 INFO:teuthology.orchestra.run:Running command with timeout 300 2026-03-10T07:48:35.966 DEBUG:teuthology.orchestra.run.vm08:> ip netns list-id 2026-03-10T07:48:36.023 DEBUG:teuthology.orchestra.run.vm08:> (cd / && exec stdin-killer --timeout=300 -- bash -c ' 2026-03-10T07:48:36.024 DEBUG:teuthology.orchestra.run.vm08:> set -e 2026-03-10T07:48:36.024 DEBUG:teuthology.orchestra.run.vm08:> sudo ip netns add ceph-ns--home-ubuntu-cephtest-mnt.1 2026-03-10T07:48:36.024 DEBUG:teuthology.orchestra.run.vm08:> sudo ip netns set ceph-ns--home-ubuntu-cephtest-mnt.1 0 2026-03-10T07:48:36.024 DEBUG:teuthology.orchestra.run.vm08:> ') 2026-03-10T07:48:36.100 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:48:36 stdin-killer INFO: expiration expected; waiting 300 seconds for command to complete 2026-03-10T07:48:36.124 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:48:36 stdin-killer INFO: command exited with status 0: exiting normally with same code! 
2026-03-10T07:48:36.127 INFO:tasks.cephfs.mount:Setuping the netns 'ceph-ns--home-ubuntu-cephtest-mnt.1' with 192.168.144.1/20 2026-03-10T07:48:36.127 DEBUG:teuthology.orchestra.run.vm08:> (cd / && exec stdin-killer --timeout=300 -- bash -c ' 2026-03-10T07:48:36.127 DEBUG:teuthology.orchestra.run.vm08:> set -e 2026-03-10T07:48:36.128 DEBUG:teuthology.orchestra.run.vm08:> sudo ip link add veth0 netns ceph-ns--home-ubuntu-cephtest-mnt.1 type veth peer name brx.0 2026-03-10T07:48:36.128 DEBUG:teuthology.orchestra.run.vm08:> sudo ip netns exec ceph-ns--home-ubuntu-cephtest-mnt.1 ip addr add 192.168.144.1/20 brd 192.168.159.255 dev veth0 2026-03-10T07:48:36.128 DEBUG:teuthology.orchestra.run.vm08:> sudo ip netns exec ceph-ns--home-ubuntu-cephtest-mnt.1 ip link set veth0 up 2026-03-10T07:48:36.128 DEBUG:teuthology.orchestra.run.vm08:> sudo ip netns exec ceph-ns--home-ubuntu-cephtest-mnt.1 ip link set lo up 2026-03-10T07:48:36.128 DEBUG:teuthology.orchestra.run.vm08:> sudo ip netns exec ceph-ns--home-ubuntu-cephtest-mnt.1 ip route add default via 192.168.159.254 2026-03-10T07:48:36.128 DEBUG:teuthology.orchestra.run.vm08:> ') 2026-03-10T07:48:36.206 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:48:36 stdin-killer INFO: expiration expected; waiting 300 seconds for command to complete 2026-03-10T07:48:36.260 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:48:36 stdin-killer INFO: command exited with status 0: exiting normally with same code! 
2026-03-10T07:48:36.264 DEBUG:teuthology.orchestra.run.vm08:> (cd / && exec stdin-killer --timeout=300 -- bash -c ' 2026-03-10T07:48:36.264 DEBUG:teuthology.orchestra.run.vm08:> set -e 2026-03-10T07:48:36.264 DEBUG:teuthology.orchestra.run.vm08:> sudo ip link set brx.0 up 2026-03-10T07:48:36.264 DEBUG:teuthology.orchestra.run.vm08:> sudo ip link set dev brx.0 master ceph-brx 2026-03-10T07:48:36.264 DEBUG:teuthology.orchestra.run.vm08:> ') 2026-03-10T07:48:36.341 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:48:36 stdin-killer INFO: expiration expected; waiting 300 seconds for command to complete 2026-03-10T07:48:36.371 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:48:36 stdin-killer INFO: command exited with status 0: exiting normally with same code! 2026-03-10T07:48:36.374 INFO:tasks.cephfs.fuse_mount:Client client.1 config is {} 2026-03-10T07:48:36.374 INFO:teuthology.orchestra.run:Running command with timeout 60 2026-03-10T07:48:36.374 DEBUG:teuthology.orchestra.run.vm08:> mkdir -p -v /home/ubuntu/cephtest/mnt.1 2026-03-10T07:48:36.431 INFO:teuthology.orchestra.run.vm08.stdout:mkdir: created directory '/home/ubuntu/cephtest/mnt.1' 2026-03-10T07:48:36.432 INFO:teuthology.orchestra.run:Running command with timeout 60 2026-03-10T07:48:36.432 DEBUG:teuthology.orchestra.run.vm08:> chmod 0000 /home/ubuntu/cephtest/mnt.1 2026-03-10T07:48:36.486 DEBUG:teuthology.orchestra.run.vm08:> sudo modprobe fuse 2026-03-10T07:48:36.553 DEBUG:teuthology.orchestra.run.vm08:> cat /proc/self/mounts | awk '{print $2}' 2026-03-10T07:48:36.609 INFO:teuthology.orchestra.run.vm08.stdout:/proc 2026-03-10T07:48:36.609 INFO:teuthology.orchestra.run.vm08.stdout:/sys 2026-03-10T07:48:36.609 INFO:teuthology.orchestra.run.vm08.stdout:/dev 2026-03-10T07:48:36.609 INFO:teuthology.orchestra.run.vm08.stdout:/sys/kernel/security 2026-03-10T07:48:36.609 INFO:teuthology.orchestra.run.vm08.stdout:/dev/shm 2026-03-10T07:48:36.609 INFO:teuthology.orchestra.run.vm08.stdout:/dev/pts 
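The commands logged above build a per-mount network namespace for the fuse client: create the netns, add a veth pair with one end (`veth0`) inside the namespace, assign it an address in the `ceph-brx` bridge subnet, bring up `veth0` and `lo`, default-route via the bridge address, then enslave the host-side peer (`brx.0`) to `ceph-brx`. A minimal sketch of that sequence as a command generator — the helper name `netns_setup_cmds` is hypothetical, but the command strings and addresses mirror the log:

```python
def netns_setup_cmds(ns, addr, brd, gw):
    """Return the ip(8) commands (run with sudo in the log) that wire a
    netns into the ceph-brx bridge, in the order teuthology issues them."""
    return [
        f"ip netns add {ns}",
        # veth pair: veth0 lands inside the namespace, brx.0 stays on the host
        f"ip link add veth0 netns {ns} type veth peer name brx.0",
        f"ip netns exec {ns} ip addr add {addr} brd {brd} dev veth0",
        f"ip netns exec {ns} ip link set veth0 up",
        f"ip netns exec {ns} ip link set lo up",
        # default route points at the bridge address on the host side
        f"ip netns exec {ns} ip route add default via {gw}",
        "ip link set brx.0 up",
        "ip link set dev brx.0 master ceph-brx",
    ]

cmds = netns_setup_cmds(
    "ceph-ns--home-ubuntu-cephtest-mnt.1",
    "192.168.144.1/20", "192.168.159.255", "192.168.159.254",
)
```

Together with the earlier `iptables` FORWARD/MASQUERADE rules on `eth0`, this gives the namespaced client (192.168.144.1) outbound reachability through the host bridge (192.168.159.254).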
2026-03-10T07:48:36.609 INFO:teuthology.orchestra.run.vm08.stdout:/run 2026-03-10T07:48:36.609 INFO:teuthology.orchestra.run.vm08.stdout:/sys/fs/cgroup 2026-03-10T07:48:36.609 INFO:teuthology.orchestra.run.vm08.stdout:/sys/fs/pstore 2026-03-10T07:48:36.609 INFO:teuthology.orchestra.run.vm08.stdout:/sys/fs/bpf 2026-03-10T07:48:36.609 INFO:teuthology.orchestra.run.vm08.stdout:/sys/kernel/config 2026-03-10T07:48:36.609 INFO:teuthology.orchestra.run.vm08.stdout:/ 2026-03-10T07:48:36.609 INFO:teuthology.orchestra.run.vm08.stdout:/var/lib/nfs/rpc_pipefs 2026-03-10T07:48:36.609 INFO:teuthology.orchestra.run.vm08.stdout:/sys/fs/selinux 2026-03-10T07:48:36.610 INFO:teuthology.orchestra.run.vm08.stdout:/proc/sys/fs/binfmt_misc 2026-03-10T07:48:36.610 INFO:teuthology.orchestra.run.vm08.stdout:/dev/hugepages 2026-03-10T07:48:36.610 INFO:teuthology.orchestra.run.vm08.stdout:/dev/mqueue 2026-03-10T07:48:36.610 INFO:teuthology.orchestra.run.vm08.stdout:/sys/kernel/debug 2026-03-10T07:48:36.610 INFO:teuthology.orchestra.run.vm08.stdout:/sys/kernel/tracing 2026-03-10T07:48:36.610 INFO:teuthology.orchestra.run.vm08.stdout:/run/credentials/systemd-sysctl.service 2026-03-10T07:48:36.610 INFO:teuthology.orchestra.run.vm08.stdout:/sys/fs/fuse/connections 2026-03-10T07:48:36.610 INFO:teuthology.orchestra.run.vm08.stdout:/run/credentials/systemd-sysusers.service 2026-03-10T07:48:36.610 INFO:teuthology.orchestra.run.vm08.stdout:/run/credentials/systemd-tmpfiles-setup-dev.service 2026-03-10T07:48:36.610 INFO:teuthology.orchestra.run.vm08.stdout:/run/credentials/systemd-tmpfiles-setup.service 2026-03-10T07:48:36.610 INFO:teuthology.orchestra.run.vm08.stdout:/run/user/1000 2026-03-10T07:48:36.610 INFO:teuthology.orchestra.run.vm08.stdout:/run/user/0 2026-03-10T07:48:36.610 INFO:teuthology.orchestra.run.vm08.stdout:/proc/sys/fs/binfmt_misc 2026-03-10T07:48:36.610 INFO:teuthology.orchestra.run.vm08.stdout:/var/lib/containers/storage/overlay 2026-03-10T07:48:36.610 
INFO:teuthology.orchestra.run.vm08.stdout:/var/lib/containers/storage/overlay/b87e596ee0e4724da7acfc1e4cdd645e0574da9728c7dfd7b20754b0519152d1/merged 2026-03-10T07:48:36.610 INFO:teuthology.orchestra.run.vm08.stdout:/var/lib/containers/storage/overlay/8f637089ce3445d63b1d3bd7298803bd7570f263430e22aa6b033c539f28b0b8/merged 2026-03-10T07:48:36.610 INFO:teuthology.orchestra.run.vm08.stdout:/var/lib/containers/storage/overlay/d31c608a5f1017f651d67ef297a3d754c489830f8dd4463335c81ecee477a253/merged 2026-03-10T07:48:36.610 INFO:teuthology.orchestra.run.vm08.stdout:/var/lib/containers/storage/overlay/f3ed4a29d9f267f2c24159bb52dfb0b2dd2af7bdd3a6c164c0f467f105cc7379/merged 2026-03-10T07:48:36.610 INFO:teuthology.orchestra.run.vm08.stdout:/var/lib/containers/storage/overlay/9eb51ac595f549d8ecf532ed7f857e615fb425f4cf0f3bced2fb4c3bdeaac7c0/merged 2026-03-10T07:48:36.610 INFO:teuthology.orchestra.run.vm08.stdout:/var/lib/containers/storage/overlay/7ea78bacee378f173a15f52fa44734d7a22c5cc5dafbd87ee90c6262f8c6fc38/merged 2026-03-10T07:48:36.610 INFO:teuthology.orchestra.run.vm08.stdout:/var/lib/containers/storage/overlay/868f0952f0024b322500dd07dfef20878c71cec8aa1d282037c35679d70289cc/merged 2026-03-10T07:48:36.610 INFO:teuthology.orchestra.run.vm08.stdout:/var/lib/containers/storage/overlay/d9595fd2aca4dabedbe04bd0b795c8192c06c3fbc992492322eaab185e34b697/merged 2026-03-10T07:48:36.610 INFO:teuthology.orchestra.run.vm08.stdout:/var/lib/containers/storage/overlay/4ac5cff280bdd398041273e80323313ccc8e12ac4678184f2f9e409adee3c471/merged 2026-03-10T07:48:36.610 INFO:teuthology.orchestra.run.vm08.stdout:/var/lib/containers/storage/overlay/32d6c784ab086c5cc55056f11df3b17479002772a034414b1c713d85a226dbbd/merged 2026-03-10T07:48:36.610 INFO:teuthology.orchestra.run.vm08.stdout:/run/netns 2026-03-10T07:48:36.610 INFO:teuthology.orchestra.run.vm08.stdout:/run/netns/ceph-ns--home-ubuntu-cephtest-mnt.1 2026-03-10T07:48:36.610 
INFO:teuthology.orchestra.run.vm08.stdout:/run/netns/ceph-ns--home-ubuntu-cephtest-mnt.1 2026-03-10T07:48:36.610 INFO:teuthology.orchestra.run:Running command with timeout 300 2026-03-10T07:48:36.610 DEBUG:teuthology.orchestra.run.vm08:> ls /sys/fs/fuse/connections 2026-03-10T07:48:36.664 INFO:tasks.cephfs.fuse_mount:Pre-mount connections: [] 2026-03-10T07:48:36.664 DEBUG:teuthology.orchestra.run.vm08:> (cd /home/ubuntu/cephtest && exec sudo nsenter --net=/var/run/netns/ceph-ns--home-ubuntu-cephtest-mnt.1 sudo adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage daemon-helper kill ceph-fuse -f --admin-socket '/var/run/ceph/$cluster-$name.$pid.asok' /home/ubuntu/cephtest/mnt.1 --id 1) 2026-03-10T07:48:36.668 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:48:36 vm08.local ceph-mon[59917]: pgmap v83: 65 pgs: 65 active+clean; 451 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 3.6 KiB/s rd, 1.6 KiB/s wr, 8 op/s 2026-03-10T07:48:36.669 DEBUG:teuthology.orchestra.run.vm08:> sudo modprobe fuse 2026-03-10T07:48:36.697 DEBUG:teuthology.orchestra.run.vm08:> cat /proc/self/mounts | awk '{print $2}' 2026-03-10T07:48:36.743 INFO:tasks.cephfs.fuse_mount.ceph-fuse.1.vm08.stderr:ceph-fuse[85009]: starting ceph client 2026-03-10T07:48:36.743 INFO:tasks.cephfs.fuse_mount.ceph-fuse.1.vm08.stderr:2026-03-10T07:48:36.742+0000 7f8938bc6480 -1 init, newargv = 0x55e24d4da760 newargc=15 2026-03-10T07:48:36.751 INFO:tasks.cephfs.fuse_mount.ceph-fuse.1.vm08.stderr:ceph-fuse[85009]: starting fuse 2026-03-10T07:48:36.765 INFO:teuthology.orchestra.run.vm08.stdout:/proc 2026-03-10T07:48:36.766 INFO:teuthology.orchestra.run.vm08.stdout:/sys 2026-03-10T07:48:36.766 INFO:teuthology.orchestra.run.vm08.stdout:/dev 2026-03-10T07:48:36.766 INFO:teuthology.orchestra.run.vm08.stdout:/sys/kernel/security 2026-03-10T07:48:36.766 INFO:teuthology.orchestra.run.vm08.stdout:/dev/shm 2026-03-10T07:48:36.766 INFO:teuthology.orchestra.run.vm08.stdout:/dev/pts 2026-03-10T07:48:36.766 
INFO:teuthology.orchestra.run.vm08.stdout:/run 2026-03-10T07:48:36.766 INFO:teuthology.orchestra.run.vm08.stdout:/sys/fs/cgroup 2026-03-10T07:48:36.766 INFO:teuthology.orchestra.run.vm08.stdout:/sys/fs/pstore 2026-03-10T07:48:36.766 INFO:teuthology.orchestra.run.vm08.stdout:/sys/fs/bpf 2026-03-10T07:48:36.766 INFO:teuthology.orchestra.run.vm08.stdout:/sys/kernel/config 2026-03-10T07:48:36.766 INFO:teuthology.orchestra.run.vm08.stdout:/ 2026-03-10T07:48:36.766 INFO:teuthology.orchestra.run.vm08.stdout:/var/lib/nfs/rpc_pipefs 2026-03-10T07:48:36.766 INFO:teuthology.orchestra.run.vm08.stdout:/sys/fs/selinux 2026-03-10T07:48:36.766 INFO:teuthology.orchestra.run.vm08.stdout:/proc/sys/fs/binfmt_misc 2026-03-10T07:48:36.766 INFO:teuthology.orchestra.run.vm08.stdout:/dev/hugepages 2026-03-10T07:48:36.766 INFO:teuthology.orchestra.run.vm08.stdout:/dev/mqueue 2026-03-10T07:48:36.766 INFO:teuthology.orchestra.run.vm08.stdout:/sys/kernel/debug 2026-03-10T07:48:36.766 INFO:teuthology.orchestra.run.vm08.stdout:/sys/kernel/tracing 2026-03-10T07:48:36.766 INFO:teuthology.orchestra.run.vm08.stdout:/run/credentials/systemd-sysctl.service 2026-03-10T07:48:36.766 INFO:teuthology.orchestra.run.vm08.stdout:/sys/fs/fuse/connections 2026-03-10T07:48:36.766 INFO:teuthology.orchestra.run.vm08.stdout:/run/credentials/systemd-sysusers.service 2026-03-10T07:48:36.766 INFO:teuthology.orchestra.run.vm08.stdout:/run/credentials/systemd-tmpfiles-setup-dev.service 2026-03-10T07:48:36.766 INFO:teuthology.orchestra.run.vm08.stdout:/run/credentials/systemd-tmpfiles-setup.service 2026-03-10T07:48:36.766 INFO:teuthology.orchestra.run.vm08.stdout:/run/user/1000 2026-03-10T07:48:36.766 INFO:teuthology.orchestra.run.vm08.stdout:/run/user/0 2026-03-10T07:48:36.766 INFO:teuthology.orchestra.run.vm08.stdout:/proc/sys/fs/binfmt_misc 2026-03-10T07:48:36.766 INFO:teuthology.orchestra.run.vm08.stdout:/var/lib/containers/storage/overlay 2026-03-10T07:48:36.766 
INFO:teuthology.orchestra.run.vm08.stdout:/var/lib/containers/storage/overlay/b87e596ee0e4724da7acfc1e4cdd645e0574da9728c7dfd7b20754b0519152d1/merged 2026-03-10T07:48:36.766 INFO:teuthology.orchestra.run.vm08.stdout:/var/lib/containers/storage/overlay/8f637089ce3445d63b1d3bd7298803bd7570f263430e22aa6b033c539f28b0b8/merged 2026-03-10T07:48:36.766 INFO:teuthology.orchestra.run.vm08.stdout:/var/lib/containers/storage/overlay/d31c608a5f1017f651d67ef297a3d754c489830f8dd4463335c81ecee477a253/merged 2026-03-10T07:48:36.766 INFO:teuthology.orchestra.run.vm08.stdout:/var/lib/containers/storage/overlay/f3ed4a29d9f267f2c24159bb52dfb0b2dd2af7bdd3a6c164c0f467f105cc7379/merged 2026-03-10T07:48:36.766 INFO:teuthology.orchestra.run.vm08.stdout:/var/lib/containers/storage/overlay/9eb51ac595f549d8ecf532ed7f857e615fb425f4cf0f3bced2fb4c3bdeaac7c0/merged 2026-03-10T07:48:36.766 INFO:teuthology.orchestra.run.vm08.stdout:/var/lib/containers/storage/overlay/7ea78bacee378f173a15f52fa44734d7a22c5cc5dafbd87ee90c6262f8c6fc38/merged 2026-03-10T07:48:36.766 INFO:teuthology.orchestra.run.vm08.stdout:/var/lib/containers/storage/overlay/868f0952f0024b322500dd07dfef20878c71cec8aa1d282037c35679d70289cc/merged 2026-03-10T07:48:36.766 INFO:teuthology.orchestra.run.vm08.stdout:/var/lib/containers/storage/overlay/d9595fd2aca4dabedbe04bd0b795c8192c06c3fbc992492322eaab185e34b697/merged 2026-03-10T07:48:36.766 INFO:teuthology.orchestra.run.vm08.stdout:/var/lib/containers/storage/overlay/4ac5cff280bdd398041273e80323313ccc8e12ac4678184f2f9e409adee3c471/merged 2026-03-10T07:48:36.766 INFO:teuthology.orchestra.run.vm08.stdout:/var/lib/containers/storage/overlay/32d6c784ab086c5cc55056f11df3b17479002772a034414b1c713d85a226dbbd/merged 2026-03-10T07:48:36.766 INFO:teuthology.orchestra.run.vm08.stdout:/run/netns 2026-03-10T07:48:36.766 INFO:teuthology.orchestra.run.vm08.stdout:/run/netns/ceph-ns--home-ubuntu-cephtest-mnt.1 2026-03-10T07:48:36.766 
INFO:teuthology.orchestra.run.vm08.stdout:/run/netns/ceph-ns--home-ubuntu-cephtest-mnt.1 2026-03-10T07:48:36.766 INFO:teuthology.orchestra.run.vm08.stdout:/home/ubuntu/cephtest/mnt.1 2026-03-10T07:48:36.767 INFO:teuthology.orchestra.run:Running command with timeout 300 2026-03-10T07:48:36.767 DEBUG:teuthology.orchestra.run.vm08:> ls /sys/fs/fuse/connections 2026-03-10T07:48:36.822 INFO:teuthology.orchestra.run.vm08.stdout:90 2026-03-10T07:48:36.822 INFO:tasks.cephfs.fuse_mount:Post-mount connections: [90] 2026-03-10T07:48:36.823 DEBUG:teuthology.orchestra.run.vm08:> sudo stdin-killer -- python3 -c ' 2026-03-10T07:48:36.823 DEBUG:teuthology.orchestra.run.vm08:> import glob 2026-03-10T07:48:36.823 DEBUG:teuthology.orchestra.run.vm08:> import re 2026-03-10T07:48:36.823 DEBUG:teuthology.orchestra.run.vm08:> import os 2026-03-10T07:48:36.823 DEBUG:teuthology.orchestra.run.vm08:> import subprocess 2026-03-10T07:48:36.823 DEBUG:teuthology.orchestra.run.vm08:> 2026-03-10T07:48:36.823 DEBUG:teuthology.orchestra.run.vm08:> def _find_admin_socket(client_name): 2026-03-10T07:48:36.823 DEBUG:teuthology.orchestra.run.vm08:> asok_path = "/var/run/ceph/ceph-client.1.*.asok" 2026-03-10T07:48:36.823 DEBUG:teuthology.orchestra.run.vm08:> files = glob.glob(asok_path) 2026-03-10T07:48:36.823 DEBUG:teuthology.orchestra.run.vm08:> mountpoint = "/home/ubuntu/cephtest/mnt.1" 2026-03-10T07:48:36.823 DEBUG:teuthology.orchestra.run.vm08:> 2026-03-10T07:48:36.823 DEBUG:teuthology.orchestra.run.vm08:> # Given a non-glob path, it better be there 2026-03-10T07:48:36.823 DEBUG:teuthology.orchestra.run.vm08:> if "*" not in asok_path: 2026-03-10T07:48:36.823 DEBUG:teuthology.orchestra.run.vm08:> assert(len(files) == 1) 2026-03-10T07:48:36.823 DEBUG:teuthology.orchestra.run.vm08:> return files[0] 2026-03-10T07:48:36.823 DEBUG:teuthology.orchestra.run.vm08:> 2026-03-10T07:48:36.823 DEBUG:teuthology.orchestra.run.vm08:> for f in files: 2026-03-10T07:48:36.823 DEBUG:teuthology.orchestra.run.vm08:> pid = 
re.match(".*\.(\d+)\.asok$", f).group(1) 2026-03-10T07:48:36.823 DEBUG:teuthology.orchestra.run.vm08:> if os.path.exists("/proc/{0}".format(pid)): 2026-03-10T07:48:36.823 DEBUG:teuthology.orchestra.run.vm08:> with open("/proc/{0}/cmdline".format(pid), '"'"'r'"'"') as proc_f: 2026-03-10T07:48:36.823 DEBUG:teuthology.orchestra.run.vm08:> contents = proc_f.read() 2026-03-10T07:48:36.823 DEBUG:teuthology.orchestra.run.vm08:> if mountpoint in contents: 2026-03-10T07:48:36.823 DEBUG:teuthology.orchestra.run.vm08:> return f 2026-03-10T07:48:36.823 DEBUG:teuthology.orchestra.run.vm08:> raise RuntimeError("Client socket {0} not found".format(client_name)) 2026-03-10T07:48:36.823 DEBUG:teuthology.orchestra.run.vm08:> 2026-03-10T07:48:36.823 DEBUG:teuthology.orchestra.run.vm08:> print(_find_admin_socket("client.1")) 2026-03-10T07:48:36.823 DEBUG:teuthology.orchestra.run.vm08:> ' 2026-03-10T07:48:36.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:48:36 vm05.local ceph-mon[50387]: pgmap v83: 65 pgs: 65 active+clean; 451 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 3.6 KiB/s rd, 1.6 KiB/s wr, 8 op/s 2026-03-10T07:48:36.921 INFO:teuthology.orchestra.run.vm08.stdout:/var/run/ceph/ceph-client.1.85009.asok 2026-03-10T07:48:36.924 INFO:teuthology.orchestra.run.vm08.stderr:2026-03-10T07:48:36 stdin-killer INFO: command exited with status 0: exiting normally with same code! 
2026-03-10T07:48:36.930 INFO:tasks.cephfs.fuse_mount:Found client admin socket at /var/run/ceph/ceph-client.1.85009.asok 2026-03-10T07:48:36.930 INFO:teuthology.orchestra.run:Running command with timeout 300 2026-03-10T07:48:36.930 DEBUG:teuthology.orchestra.run.vm08:> sudo ceph --admin-daemon /var/run/ceph/ceph-client.1.85009.asok status 2026-03-10T07:48:37.040 INFO:teuthology.orchestra.run.vm08.stdout:{ 2026-03-10T07:48:37.041 INFO:teuthology.orchestra.run.vm08.stdout: "metadata": { 2026-03-10T07:48:37.041 INFO:teuthology.orchestra.run.vm08.stdout: "ceph_sha1": "7fe91d5d5842e04be3b4f514d6dd990c54b29c76", 2026-03-10T07:48:37.041 INFO:teuthology.orchestra.run.vm08.stdout: "ceph_version": "ceph version 18.2.1 (7fe91d5d5842e04be3b4f514d6dd990c54b29c76) reef (stable)", 2026-03-10T07:48:37.041 INFO:teuthology.orchestra.run.vm08.stdout: "entity_id": "1", 2026-03-10T07:48:37.041 INFO:teuthology.orchestra.run.vm08.stdout: "hostname": "vm08.local", 2026-03-10T07:48:37.041 INFO:teuthology.orchestra.run.vm08.stdout: "mount_point": "/home/ubuntu/cephtest/mnt.1", 2026-03-10T07:48:37.041 INFO:teuthology.orchestra.run.vm08.stdout: "pid": "85009", 2026-03-10T07:48:37.041 INFO:teuthology.orchestra.run.vm08.stdout: "root": "/" 2026-03-10T07:48:37.041 INFO:teuthology.orchestra.run.vm08.stdout: }, 2026-03-10T07:48:37.041 INFO:teuthology.orchestra.run.vm08.stdout: "dentry_count": 0, 2026-03-10T07:48:37.041 INFO:teuthology.orchestra.run.vm08.stdout: "dentry_pinned_count": 0, 2026-03-10T07:48:37.041 INFO:teuthology.orchestra.run.vm08.stdout: "id": 24347, 2026-03-10T07:48:37.041 INFO:teuthology.orchestra.run.vm08.stdout: "inst": { 2026-03-10T07:48:37.041 INFO:teuthology.orchestra.run.vm08.stdout: "name": { 2026-03-10T07:48:37.041 INFO:teuthology.orchestra.run.vm08.stdout: "type": "client", 2026-03-10T07:48:37.041 INFO:teuthology.orchestra.run.vm08.stdout: "num": 24347 2026-03-10T07:48:37.041 INFO:teuthology.orchestra.run.vm08.stdout: }, 2026-03-10T07:48:37.041 
INFO:teuthology.orchestra.run.vm08.stdout: "addr": { 2026-03-10T07:48:37.041 INFO:teuthology.orchestra.run.vm08.stdout: "type": "v1", 2026-03-10T07:48:37.041 INFO:teuthology.orchestra.run.vm08.stdout: "addr": "192.168.123.108:0", 2026-03-10T07:48:37.041 INFO:teuthology.orchestra.run.vm08.stdout: "nonce": 2923077033 2026-03-10T07:48:37.041 INFO:teuthology.orchestra.run.vm08.stdout: } 2026-03-10T07:48:37.041 INFO:teuthology.orchestra.run.vm08.stdout: }, 2026-03-10T07:48:37.041 INFO:teuthology.orchestra.run.vm08.stdout: "addr": { 2026-03-10T07:48:37.041 INFO:teuthology.orchestra.run.vm08.stdout: "type": "v1", 2026-03-10T07:48:37.041 INFO:teuthology.orchestra.run.vm08.stdout: "addr": "192.168.123.108:0", 2026-03-10T07:48:37.041 INFO:teuthology.orchestra.run.vm08.stdout: "nonce": 2923077033 2026-03-10T07:48:37.041 INFO:teuthology.orchestra.run.vm08.stdout: }, 2026-03-10T07:48:37.041 INFO:teuthology.orchestra.run.vm08.stdout: "inst_str": "client.24347 192.168.123.108:0/2923077033", 2026-03-10T07:48:37.041 INFO:teuthology.orchestra.run.vm08.stdout: "addr_str": "192.168.123.108:0/2923077033", 2026-03-10T07:48:37.041 INFO:teuthology.orchestra.run.vm08.stdout: "inode_count": 1, 2026-03-10T07:48:37.041 INFO:teuthology.orchestra.run.vm08.stdout: "mds_epoch": 12, 2026-03-10T07:48:37.041 INFO:teuthology.orchestra.run.vm08.stdout: "osd_epoch": 39, 2026-03-10T07:48:37.041 INFO:teuthology.orchestra.run.vm08.stdout: "osd_epoch_barrier": 0, 2026-03-10T07:48:37.041 INFO:teuthology.orchestra.run.vm08.stdout: "blocklisted": false, 2026-03-10T07:48:37.041 INFO:teuthology.orchestra.run.vm08.stdout: "fs_name": "cephfs" 2026-03-10T07:48:37.041 INFO:teuthology.orchestra.run.vm08.stdout:} 2026-03-10T07:48:37.046 INFO:teuthology.orchestra.run:Running command with timeout 300 2026-03-10T07:48:37.047 DEBUG:teuthology.orchestra.run.vm05:> stat --file-system '--printf=%T 2026-03-10T07:48:37.047 DEBUG:teuthology.orchestra.run.vm05:> ' -- /home/ubuntu/cephtest/mnt.0 2026-03-10T07:48:37.063 
INFO:teuthology.orchestra.run.vm05.stdout:fuseblk 2026-03-10T07:48:37.063 INFO:tasks.cephfs.fuse_mount:ceph-fuse is mounted on /home/ubuntu/cephtest/mnt.0 2026-03-10T07:48:37.064 INFO:teuthology.orchestra.run:Running command with timeout 300 2026-03-10T07:48:37.064 DEBUG:teuthology.orchestra.run.vm05:> sudo chmod 1777 /home/ubuntu/cephtest/mnt.0 2026-03-10T07:48:37.136 INFO:teuthology.orchestra.run:Running command with timeout 300 2026-03-10T07:48:37.136 DEBUG:teuthology.orchestra.run.vm08:> stat --file-system '--printf=%T 2026-03-10T07:48:37.136 DEBUG:teuthology.orchestra.run.vm08:> ' -- /home/ubuntu/cephtest/mnt.1 2026-03-10T07:48:37.154 INFO:teuthology.orchestra.run.vm08.stdout:fuseblk 2026-03-10T07:48:37.155 INFO:tasks.cephfs.fuse_mount:ceph-fuse is mounted on /home/ubuntu/cephtest/mnt.1 2026-03-10T07:48:37.155 INFO:teuthology.orchestra.run:Running command with timeout 300 2026-03-10T07:48:37.155 DEBUG:teuthology.orchestra.run.vm08:> sudo chmod 1777 /home/ubuntu/cephtest/mnt.1 2026-03-10T07:48:37.224 INFO:teuthology.run_tasks:Running task print... 2026-03-10T07:48:37.227 INFO:teuthology.task.print:**** done client 2026-03-10T07:48:37.227 INFO:teuthology.run_tasks:Running task parallel... 2026-03-10T07:48:37.230 INFO:teuthology.task.parallel:starting parallel... 2026-03-10T07:48:37.230 INFO:teuthology.task.parallel:In parallel, running task sequential... 2026-03-10T07:48:37.230 INFO:teuthology.task.sequential:In sequential, running task cephadm.shell... 
2026-03-10T07:48:37.230 INFO:tasks.cephadm:Running commands on role host.a host ubuntu@vm05.local 2026-03-10T07:48:37.230 DEBUG:teuthology.orchestra.run.vm05:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 12e9780e-1c55-11f1-8896-79f7c2e9b508 -e sha1=e911bdebe5c8faa3800735d1568fcdca65db60df -- bash -c 'ceph config set mon mon_warn_on_insecure_global_id_reclaim false --force' 2026-03-10T07:48:37.230 INFO:teuthology.task.parallel:In parallel, running task sequential... 2026-03-10T07:48:37.230 INFO:teuthology.task.sequential:In sequential, running task workunit... 2026-03-10T07:48:37.232 INFO:tasks.workunit:Pulling workunits from ref 75a68fd8ca3f918fe9466b4c0bb385b7fc260a9b 2026-03-10T07:48:37.232 INFO:tasks.workunit:Making a separate scratch dir for every client... 2026-03-10T07:48:37.232 INFO:tasks.workunit:timeout=3h 2026-03-10T07:48:37.232 INFO:tasks.workunit:cleanup=True 2026-03-10T07:48:37.232 DEBUG:teuthology.orchestra.run.vm05:> stat -- /home/ubuntu/cephtest/mnt.0 2026-03-10T07:48:37.254 INFO:teuthology.orchestra.run.vm05.stdout: File: /home/ubuntu/cephtest/mnt.0 2026-03-10T07:48:37.254 INFO:teuthology.orchestra.run.vm05.stdout: Size: 0 Blocks: 1 IO Block: 4194304 directory 2026-03-10T07:48:37.254 INFO:teuthology.orchestra.run.vm05.stdout:Device: 49h/73d Inode: 1 Links: 2 2026-03-10T07:48:37.254 INFO:teuthology.orchestra.run.vm05.stdout:Access: (1777/drwxrwxrwt) Uid: ( 0/ root) Gid: ( 0/ root) 2026-03-10T07:48:37.255 INFO:teuthology.orchestra.run.vm05.stdout:Context: system_u:object_r:fusefs_t:s0 2026-03-10T07:48:37.255 INFO:teuthology.orchestra.run.vm05.stdout:Access: 1970-01-01 00:00:00.000000000 +0000 2026-03-10T07:48:37.255 INFO:teuthology.orchestra.run.vm05.stdout:Modify: 2026-03-10 07:48:26.393687267 +0000 2026-03-10T07:48:37.255 INFO:teuthology.orchestra.run.vm05.stdout:Change: 2026-03-10 07:48:37.133447544 +0000 2026-03-10T07:48:37.255 
INFO:teuthology.orchestra.run.vm05.stdout: Birth: - 2026-03-10T07:48:37.255 INFO:tasks.workunit:Did not need to create dir /home/ubuntu/cephtest/mnt.0 2026-03-10T07:48:37.255 DEBUG:teuthology.orchestra.run.vm05:> cd -- /home/ubuntu/cephtest/mnt.0 && sudo install -d -m 0755 --owner=ubuntu -- client.0 2026-03-10T07:48:37.329 DEBUG:teuthology.orchestra.run.vm08:> stat -- /home/ubuntu/cephtest/mnt.1 2026-03-10T07:48:37.348 INFO:teuthology.orchestra.run.vm08.stdout: File: /home/ubuntu/cephtest/mnt.1 2026-03-10T07:48:37.348 INFO:teuthology.orchestra.run.vm08.stdout: Size: 0 Blocks: 1 IO Block: 4194304 directory 2026-03-10T07:48:37.348 INFO:teuthology.orchestra.run.vm08.stdout:Device: 5ah/90d Inode: 1 Links: 3 2026-03-10T07:48:37.348 INFO:teuthology.orchestra.run.vm08.stdout:Access: (1777/drwxrwxrwt) Uid: ( 0/ root) Gid: ( 0/ root) 2026-03-10T07:48:37.349 INFO:teuthology.orchestra.run.vm08.stdout:Context: system_u:object_r:fusefs_t:s0 2026-03-10T07:48:37.349 INFO:teuthology.orchestra.run.vm08.stdout:Access: 1970-01-01 00:00:00.000000000 +0000 2026-03-10T07:48:37.349 INFO:teuthology.orchestra.run.vm08.stdout:Modify: 2026-03-10 07:48:37.324171598 +0000 2026-03-10T07:48:37.349 INFO:teuthology.orchestra.run.vm08.stdout:Change: 2026-03-10 07:48:37.324171598 +0000 2026-03-10T07:48:37.349 INFO:teuthology.orchestra.run.vm08.stdout: Birth: - 2026-03-10T07:48:37.349 INFO:tasks.workunit:Did not need to create dir /home/ubuntu/cephtest/mnt.1 2026-03-10T07:48:37.349 DEBUG:teuthology.orchestra.run.vm08:> cd -- /home/ubuntu/cephtest/mnt.1 && sudo install -d -m 0755 --owner=ubuntu -- client.1 2026-03-10T07:48:37.383 INFO:teuthology.orchestra.run.vm05.stderr:Inferring config /var/lib/ceph/12e9780e-1c55-11f1-8896-79f7c2e9b508/mon.vm05/config 2026-03-10T07:48:37.419 DEBUG:teuthology.orchestra.run.vm05:> rm -rf /home/ubuntu/cephtest/clone.client.0 && git clone https://github.com/kshtsk/ceph.git /home/ubuntu/cephtest/clone.client.0 && cd /home/ubuntu/cephtest/clone.client.0 && git checkout 
75a68fd8ca3f918fe9466b4c0bb385b7fc260a9b 2026-03-10T07:48:37.419 DEBUG:teuthology.orchestra.run.vm08:> rm -rf /home/ubuntu/cephtest/clone.client.1 && git clone https://github.com/kshtsk/ceph.git /home/ubuntu/cephtest/clone.client.1 && cd /home/ubuntu/cephtest/clone.client.1 && git checkout 75a68fd8ca3f918fe9466b4c0bb385b7fc260a9b 2026-03-10T07:48:37.445 INFO:tasks.workunit.client.0.vm05.stderr:Cloning into '/home/ubuntu/cephtest/clone.client.0'... 2026-03-10T07:48:37.477 INFO:tasks.workunit.client.1.vm08.stderr:Cloning into '/home/ubuntu/cephtest/clone.client.1'... 2026-03-10T07:48:37.637 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:37.635+0000 7ff36fa22700 1 -- 192.168.123.105:0/3022451239 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff368103180 msgr2=0x7ff3681035a0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:48:37.638 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:37.635+0000 7ff36fa22700 1 --2- 192.168.123.105:0/3022451239 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff368103180 0x7ff3681035a0 secure :-1 s=READY pgs=276 cs=0 l=1 rev1=1 crypto rx=0x7ff358009b50 tx=0x7ff358009e60 comp rx=0 tx=0).stop 2026-03-10T07:48:37.638 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:37.636+0000 7ff36fa22700 1 -- 192.168.123.105:0/3022451239 shutdown_connections 2026-03-10T07:48:37.638 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:37.636+0000 7ff36fa22700 1 --2- 192.168.123.105:0/3022451239 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7ff368104380 0x7ff3681047e0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:48:37.638 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:37.636+0000 7ff36fa22700 1 --2- 192.168.123.105:0/3022451239 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff368103180 0x7ff3681035a0 unknown :-1 s=CLOSED pgs=276 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 
tx=0).stop 2026-03-10T07:48:37.638 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:37.636+0000 7ff36fa22700 1 -- 192.168.123.105:0/3022451239 >> 192.168.123.105:0/3022451239 conn(0x7ff3680fe720 msgr2=0x7ff368100b60 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:48:37.638 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:37.636+0000 7ff36fa22700 1 -- 192.168.123.105:0/3022451239 shutdown_connections 2026-03-10T07:48:37.638 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:37.636+0000 7ff36fa22700 1 -- 192.168.123.105:0/3022451239 wait complete. 2026-03-10T07:48:37.638 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:37.636+0000 7ff36fa22700 1 Processor -- start 2026-03-10T07:48:37.638 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:37.636+0000 7ff36fa22700 1 -- start start 2026-03-10T07:48:37.638 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:37.637+0000 7ff36fa22700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7ff368103180 0x7ff368078b40 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:48:37.638 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:37.637+0000 7ff36fa22700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff368104380 0x7ff368079080 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:48:37.638 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:37.637+0000 7ff36fa22700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7ff368075630 con 0x7ff368104380 2026-03-10T07:48:37.638 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:37.637+0000 7ff36fa22700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7ff3680757a0 con 0x7ff368103180 2026-03-10T07:48:37.638 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:37.637+0000 7ff36cfbd700 1 
--2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff368104380 0x7ff368079080 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:48:37.638 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:37.637+0000 7ff36cfbd700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff368104380 0x7ff368079080 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.105:37104/0 (socket says 192.168.123.105:37104) 2026-03-10T07:48:37.638 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:37.637+0000 7ff36cfbd700 1 -- 192.168.123.105:0/2612878364 learned_addr learned my addr 192.168.123.105:0/2612878364 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T07:48:37.639 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:37.637+0000 7ff36cfbd700 1 -- 192.168.123.105:0/2612878364 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7ff368103180 msgr2=0x7ff368078b40 unknown :-1 s=STATE_CONNECTING l=1).mark_down 2026-03-10T07:48:37.639 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:37.637+0000 7ff36cfbd700 1 --2- 192.168.123.105:0/2612878364 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7ff368103180 0x7ff368078b40 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:48:37.639 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:37.637+0000 7ff36cfbd700 1 -- 192.168.123.105:0/2612878364 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7ff3580097e0 con 0x7ff368104380 2026-03-10T07:48:37.639 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:37.637+0000 7ff36cfbd700 1 --2- 192.168.123.105:0/2612878364 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff368104380 
0x7ff368079080 secure :-1 s=READY pgs=277 cs=0 l=1 rev1=1 crypto rx=0x7ff36400b700 tx=0x7ff36400bac0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:48:37.639 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:37.638+0000 7ff35e7fc700 1 -- 192.168.123.105:0/2612878364 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7ff364010840 con 0x7ff368104380 2026-03-10T07:48:37.639 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:37.638+0000 7ff35e7fc700 1 -- 192.168.123.105:0/2612878364 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7ff364010e80 con 0x7ff368104380 2026-03-10T07:48:37.639 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:37.638+0000 7ff35e7fc700 1 -- 192.168.123.105:0/2612878364 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7ff36400d590 con 0x7ff368104380 2026-03-10T07:48:37.639 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:37.638+0000 7ff36fa22700 1 -- 192.168.123.105:0/2612878364 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7ff368075a80 con 0x7ff368104380 2026-03-10T07:48:37.640 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:37.639+0000 7ff36fa22700 1 -- 192.168.123.105:0/2612878364 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7ff368075fa0 con 0x7ff368104380 2026-03-10T07:48:37.643 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:37.639+0000 7ff35e7fc700 1 -- 192.168.123.105:0/2612878364 <== mon.0 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 19) v1 ==== 90308+0+0 (secure 0 0 0) 0x7ff36400f3e0 con 0x7ff368104380 2026-03-10T07:48:37.644 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:37.639+0000 7ff36fa22700 1 -- 192.168.123.105:0/2612878364 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- 
mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7ff36804ea90 con 0x7ff368104380 2026-03-10T07:48:37.644 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:37.639+0000 7ff35e7fc700 1 --2- 192.168.123.105:0/2612878364 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7ff35406c490 0x7ff35406e950 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:48:37.644 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:37.640+0000 7ff35e7fc700 1 -- 192.168.123.105:0/2612878364 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(39..39 src has 1..39) v4 ==== 5362+0+0 (secure 0 0 0) 0x7ff36408a7d0 con 0x7ff368104380 2026-03-10T07:48:37.644 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:37.642+0000 7ff35e7fc700 1 -- 192.168.123.105:0/2612878364 <== mon.0 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7ff364095020 con 0x7ff368104380 2026-03-10T07:48:37.644 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:37.642+0000 7ff36d7be700 1 --2- 192.168.123.105:0/2612878364 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7ff35406c490 0x7ff35406e950 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:48:37.644 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:37.642+0000 7ff36d7be700 1 --2- 192.168.123.105:0/2612878364 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7ff35406c490 0x7ff35406e950 secure :-1 s=READY pgs=119 cs=0 l=1 rev1=1 crypto rx=0x7ff358009b20 tx=0x7ff358005fb0 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:48:37.750 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:37.748+0000 7ff36fa22700 1 -- 192.168.123.105:0/2612878364 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] 
-- mon_command([{prefix=config set, name=mon_warn_on_insecure_global_id_reclaim}] v 0) v1 -- 0x7ff368066e80 con 0x7ff368104380 2026-03-10T07:48:37.750 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:37.748+0000 7ff35e7fc700 1 -- 192.168.123.105:0/2612878364 <== mon.0 v2:192.168.123.105:3300/0 7 ==== mon_command_ack([{prefix=config set, name=mon_warn_on_insecure_global_id_reclaim}]=0 v15) v1 ==== 155+0+0 (secure 0 0 0) 0x7ff364058a10 con 0x7ff368104380 2026-03-10T07:48:37.752 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:37.750+0000 7ff36fa22700 1 -- 192.168.123.105:0/2612878364 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7ff35406c490 msgr2=0x7ff35406e950 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:48:37.752 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:37.750+0000 7ff36fa22700 1 --2- 192.168.123.105:0/2612878364 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7ff35406c490 0x7ff35406e950 secure :-1 s=READY pgs=119 cs=0 l=1 rev1=1 crypto rx=0x7ff358009b20 tx=0x7ff358005fb0 comp rx=0 tx=0).stop 2026-03-10T07:48:37.752 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:37.750+0000 7ff36fa22700 1 -- 192.168.123.105:0/2612878364 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff368104380 msgr2=0x7ff368079080 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:48:37.752 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:37.750+0000 7ff36fa22700 1 --2- 192.168.123.105:0/2612878364 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff368104380 0x7ff368079080 secure :-1 s=READY pgs=277 cs=0 l=1 rev1=1 crypto rx=0x7ff36400b700 tx=0x7ff36400bac0 comp rx=0 tx=0).stop 2026-03-10T07:48:37.752 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:37.751+0000 7ff36fa22700 1 -- 192.168.123.105:0/2612878364 shutdown_connections 2026-03-10T07:48:37.752 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:37.751+0000 7ff36fa22700 1 --2- 192.168.123.105:0/2612878364 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7ff35406c490 0x7ff35406e950 unknown :-1 s=CLOSED pgs=119 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:48:37.752 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:37.751+0000 7ff36fa22700 1 --2- 192.168.123.105:0/2612878364 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7ff368103180 0x7ff368078b40 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:48:37.752 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:37.751+0000 7ff36fa22700 1 --2- 192.168.123.105:0/2612878364 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff368104380 0x7ff368079080 unknown :-1 s=CLOSED pgs=277 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:48:37.752 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:37.751+0000 7ff36fa22700 1 -- 192.168.123.105:0/2612878364 >> 192.168.123.105:0/2612878364 conn(0x7ff3680fe720 msgr2=0x7ff3681075b0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:48:37.752 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:37.751+0000 7ff36fa22700 1 -- 192.168.123.105:0/2612878364 shutdown_connections 2026-03-10T07:48:37.752 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:37.751+0000 7ff36fa22700 1 -- 192.168.123.105:0/2612878364 wait complete. 
2026-03-10T07:48:37.831 DEBUG:teuthology.orchestra.run.vm05:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 12e9780e-1c55-11f1-8896-79f7c2e9b508 -e sha1=e911bdebe5c8faa3800735d1568fcdca65db60df -- bash -c 'ceph config set mon mon_warn_on_insecure_global_id_reclaim_allowed false --force' 2026-03-10T07:48:38.024 INFO:teuthology.orchestra.run.vm05.stderr:Inferring config /var/lib/ceph/12e9780e-1c55-11f1-8896-79f7c2e9b508/mon.vm05/config 2026-03-10T07:48:38.258 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:38.255+0000 7f58888eb700 1 -- 192.168.123.105:0/881259627 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f5880104350 msgr2=0x7f58801047b0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:48:38.258 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:38.255+0000 7f58888eb700 1 --2- 192.168.123.105:0/881259627 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f5880104350 0x7f58801047b0 secure :-1 s=READY pgs=278 cs=0 l=1 rev1=1 crypto rx=0x7f5874009b50 tx=0x7f5874009e60 comp rx=0 tx=0).stop 2026-03-10T07:48:38.258 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:38.256+0000 7f58888eb700 1 -- 192.168.123.105:0/881259627 shutdown_connections 2026-03-10T07:48:38.258 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:38.256+0000 7f58888eb700 1 --2- 192.168.123.105:0/881259627 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f5880104350 0x7f58801047b0 unknown :-1 s=CLOSED pgs=278 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:48:38.258 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:38.256+0000 7f58888eb700 1 --2- 192.168.123.105:0/881259627 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f5880103150 0x7f5880103570 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:48:38.258 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:38.256+0000 7f58888eb700 1 -- 192.168.123.105:0/881259627 >> 192.168.123.105:0/881259627 conn(0x7f58800fe6d0 msgr2=0x7f5880100b30 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:48:38.258 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:38.257+0000 7f58888eb700 1 -- 192.168.123.105:0/881259627 shutdown_connections 2026-03-10T07:48:38.258 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:38.257+0000 7f58888eb700 1 -- 192.168.123.105:0/881259627 wait complete. 2026-03-10T07:48:38.259 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:38.257+0000 7f58888eb700 1 Processor -- start 2026-03-10T07:48:38.259 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:38.257+0000 7f58888eb700 1 -- start start 2026-03-10T07:48:38.259 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:38.258+0000 7f58888eb700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f5880103150 0x7f58801989d0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:48:38.259 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:38.258+0000 7f58888eb700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f5880104350 0x7f5880198f10 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:48:38.259 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:38.258+0000 7f58888eb700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f5880199530 con 0x7f5880104350 2026-03-10T07:48:38.260 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:38.258+0000 7f58888eb700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f5880199670 con 0x7f5880103150 2026-03-10T07:48:38.260 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:38.258+0000 7f5885e86700 1 --2- >> 
[v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f5880104350 0x7f5880198f10 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:48:38.260 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:38.258+0000 7f5885e86700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f5880104350 0x7f5880198f10 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.105:37126/0 (socket says 192.168.123.105:37126) 2026-03-10T07:48:38.260 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:38.258+0000 7f5885e86700 1 -- 192.168.123.105:0/1003068209 learned_addr learned my addr 192.168.123.105:0/1003068209 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T07:48:38.260 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:38.258+0000 7f5885e86700 1 -- 192.168.123.105:0/1003068209 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f5880103150 msgr2=0x7f58801989d0 unknown :-1 s=STATE_CONNECTING l=1).mark_down 2026-03-10T07:48:38.260 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:38.258+0000 7f5886687700 1 --2- 192.168.123.105:0/1003068209 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f5880103150 0x7f58801989d0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:48:38.260 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:38.258+0000 7f5885e86700 1 --2- 192.168.123.105:0/1003068209 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f5880103150 0x7f58801989d0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:48:38.260 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:38.258+0000 7f5885e86700 1 -- 
192.168.123.105:0/1003068209 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f58740097e0 con 0x7f5880104350 2026-03-10T07:48:38.260 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:38.259+0000 7f5885e86700 1 --2- 192.168.123.105:0/1003068209 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f5880104350 0x7f5880198f10 secure :-1 s=READY pgs=279 cs=0 l=1 rev1=1 crypto rx=0x7f587400b5c0 tx=0x7f5874005250 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:48:38.261 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:38.259+0000 7f587b7fe700 1 -- 192.168.123.105:0/1003068209 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f587401d070 con 0x7f5880104350 2026-03-10T07:48:38.261 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:38.259+0000 7f587b7fe700 1 -- 192.168.123.105:0/1003068209 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f587400bc30 con 0x7f5880104350 2026-03-10T07:48:38.261 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:38.259+0000 7f587b7fe700 1 -- 192.168.123.105:0/1003068209 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f587400f910 con 0x7f5880104350 2026-03-10T07:48:38.261 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:38.259+0000 7f58888eb700 1 -- 192.168.123.105:0/1003068209 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f588019e0c0 con 0x7f5880104350 2026-03-10T07:48:38.261 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:38.259+0000 7f58888eb700 1 -- 192.168.123.105:0/1003068209 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f5880075530 con 0x7f5880104350 2026-03-10T07:48:38.262 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:38.259+0000 7f5886687700 1 --2- 192.168.123.105:0/1003068209 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f5880103150 0x7f58801989d0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request state changed! 2026-03-10T07:48:38.262 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:38.260+0000 7f58888eb700 1 -- 192.168.123.105:0/1003068209 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f5880066e80 con 0x7f5880104350 2026-03-10T07:48:38.265 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:38.263+0000 7f587b7fe700 1 -- 192.168.123.105:0/1003068209 <== mon.0 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 19) v1 ==== 90308+0+0 (secure 0 0 0) 0x7f5874022be0 con 0x7f5880104350 2026-03-10T07:48:38.265 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:38.264+0000 7f587b7fe700 1 --2- 192.168.123.105:0/1003068209 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f587006c600 0x7f587006eac0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:48:38.265 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:38.264+0000 7f587b7fe700 1 -- 192.168.123.105:0/1003068209 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(39..39 src has 1..39) v4 ==== 5362+0+0 (secure 0 0 0) 0x7f587408d830 con 0x7f5880104350 2026-03-10T07:48:38.265 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:38.264+0000 7f587b7fe700 1 -- 192.168.123.105:0/1003068209 <== mon.0 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7f58740ba180 con 0x7f5880104350 2026-03-10T07:48:38.265 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:38.264+0000 7f5886687700 1 --2- 192.168.123.105:0/1003068209 >> 
[v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f587006c600 0x7f587006eac0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:48:38.266 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:38.264+0000 7f5886687700 1 --2- 192.168.123.105:0/1003068209 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f587006c600 0x7f587006eac0 secure :-1 s=READY pgs=120 cs=0 l=1 rev1=1 crypto rx=0x7f586c005fd0 tx=0x7f586c005f00 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:48:38.375 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:48:38 vm05.local ceph-mon[50387]: pgmap v84: 65 pgs: 65 active+clean; 452 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 5.0 KiB/s rd, 1.4 KiB/s wr, 8 op/s 2026-03-10T07:48:38.375 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:38.373+0000 7f58888eb700 1 -- 192.168.123.105:0/1003068209 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command([{prefix=config set, name=mon_warn_on_insecure_global_id_reclaim_allowed}] v 0) v1 -- 0x7f5880075a60 con 0x7f5880104350 2026-03-10T07:48:38.375 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:38.373+0000 7f587b7fe700 1 -- 192.168.123.105:0/1003068209 <== mon.0 v2:192.168.123.105:3300/0 7 ==== mon_command_ack([{prefix=config set, name=mon_warn_on_insecure_global_id_reclaim_allowed}]=0 v15)=0 v15) v1 ==== 163+0+0 (secure 0 0 0) 0x7f587405b8d0 con 0x7f5880104350 2026-03-10T07:48:38.379 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:38.377+0000 7f58888eb700 1 -- 192.168.123.105:0/1003068209 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f587006c600 msgr2=0x7f587006eac0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:48:38.379 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:38.377+0000 7f58888eb700 1 --2- 
192.168.123.105:0/1003068209 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f587006c600 0x7f587006eac0 secure :-1 s=READY pgs=120 cs=0 l=1 rev1=1 crypto rx=0x7f586c005fd0 tx=0x7f586c005f00 comp rx=0 tx=0).stop 2026-03-10T07:48:38.379 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:38.377+0000 7f58888eb700 1 -- 192.168.123.105:0/1003068209 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f5880104350 msgr2=0x7f5880198f10 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:48:38.379 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:38.378+0000 7f58888eb700 1 --2- 192.168.123.105:0/1003068209 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f5880104350 0x7f5880198f10 secure :-1 s=READY pgs=279 cs=0 l=1 rev1=1 crypto rx=0x7f587400b5c0 tx=0x7f5874005250 comp rx=0 tx=0).stop 2026-03-10T07:48:38.379 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:38.378+0000 7f58888eb700 1 -- 192.168.123.105:0/1003068209 shutdown_connections 2026-03-10T07:48:38.379 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:38.378+0000 7f58888eb700 1 --2- 192.168.123.105:0/1003068209 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f587006c600 0x7f587006eac0 unknown :-1 s=CLOSED pgs=120 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:48:38.380 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:38.378+0000 7f58888eb700 1 --2- 192.168.123.105:0/1003068209 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f5880103150 0x7f58801989d0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:48:38.380 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:38.378+0000 7f58888eb700 1 --2- 192.168.123.105:0/1003068209 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f5880104350 0x7f5880198f10 unknown :-1 s=CLOSED pgs=279 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 
2026-03-10T07:48:38.380 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:38.378+0000 7f58888eb700 1 -- 192.168.123.105:0/1003068209 >> 192.168.123.105:0/1003068209 conn(0x7f58800fe6d0 msgr2=0x7f5880107580 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:48:38.380 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:38.378+0000 7f58888eb700 1 -- 192.168.123.105:0/1003068209 shutdown_connections 2026-03-10T07:48:38.380 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:38.379+0000 7f58888eb700 1 -- 192.168.123.105:0/1003068209 wait complete. 2026-03-10T07:48:38.418 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:48:38 vm08.local ceph-mon[59917]: pgmap v84: 65 pgs: 65 active+clean; 452 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 5.0 KiB/s rd, 1.4 KiB/s wr, 8 op/s 2026-03-10T07:48:38.445 DEBUG:teuthology.orchestra.run.vm05:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 12e9780e-1c55-11f1-8896-79f7c2e9b508 -e sha1=e911bdebe5c8faa3800735d1568fcdca65db60df -- bash -c 'ceph config set global log_to_journald false --force' 2026-03-10T07:48:38.587 INFO:teuthology.orchestra.run.vm05.stderr:Inferring config /var/lib/ceph/12e9780e-1c55-11f1-8896-79f7c2e9b508/mon.vm05/config 2026-03-10T07:48:38.823 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:38.821+0000 7f5af30f7700 1 -- 192.168.123.105:0/645070933 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f5aec0737f0 msgr2=0x7f5aec073c70 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:48:38.824 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:38.821+0000 7f5af30f7700 1 --2- 192.168.123.105:0/645070933 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f5aec0737f0 0x7f5aec073c70 secure :-1 s=READY pgs=280 cs=0 l=1 rev1=1 crypto rx=0x7f5adc009b00 tx=0x7f5adc009e10 comp rx=0 tx=0).stop 2026-03-10T07:48:38.824 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:38.822+0000 7f5af30f7700 1 -- 192.168.123.105:0/645070933 shutdown_connections 2026-03-10T07:48:38.824 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:38.822+0000 7f5af30f7700 1 --2- 192.168.123.105:0/645070933 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f5aec0737f0 0x7f5aec073c70 unknown :-1 s=CLOSED pgs=280 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:48:38.824 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:38.822+0000 7f5af30f7700 1 --2- 192.168.123.105:0/645070933 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f5aec074dc0 0x7f5aec073220 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:48:38.824 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:38.822+0000 7f5af30f7700 1 -- 192.168.123.105:0/645070933 >> 192.168.123.105:0/645070933 conn(0x7f5aec0fc4c0 msgr2=0x7f5aec0fe920 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:48:38.824 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:38.822+0000 7f5af30f7700 1 -- 192.168.123.105:0/645070933 shutdown_connections 2026-03-10T07:48:38.824 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:38.822+0000 7f5af30f7700 1 -- 192.168.123.105:0/645070933 wait complete. 
2026-03-10T07:48:38.824 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:38.823+0000 7f5af30f7700 1 Processor -- start 2026-03-10T07:48:38.824 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:38.823+0000 7f5af30f7700 1 -- start start 2026-03-10T07:48:38.824 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:38.823+0000 7f5af30f7700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f5aec0737f0 0x7f5aec19cdf0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:48:38.825 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:38.823+0000 7f5af30f7700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f5aec074dc0 0x7f5aec19d330 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:48:38.825 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:38.823+0000 7f5af30f7700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f5aec19d950 con 0x7f5aec0737f0 2026-03-10T07:48:38.825 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:38.823+0000 7f5af30f7700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f5aec19da90 con 0x7f5aec074dc0 2026-03-10T07:48:38.825 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:38.823+0000 7f5aebfff700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f5aec074dc0 0x7f5aec19d330 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:48:38.825 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:38.823+0000 7f5aebfff700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f5aec074dc0 0x7f5aec19d330 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.108:3300/0 says I am 
v2:192.168.123.105:50854/0 (socket says 192.168.123.105:50854) 2026-03-10T07:48:38.825 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:38.823+0000 7f5aebfff700 1 -- 192.168.123.105:0/2985193576 learned_addr learned my addr 192.168.123.105:0/2985193576 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T07:48:38.825 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:38.823+0000 7f5af0e93700 1 --2- 192.168.123.105:0/2985193576 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f5aec0737f0 0x7f5aec19cdf0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:48:38.825 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:38.824+0000 7f5af0e93700 1 -- 192.168.123.105:0/2985193576 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f5aec074dc0 msgr2=0x7f5aec19d330 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:48:38.825 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:38.824+0000 7f5af0e93700 1 --2- 192.168.123.105:0/2985193576 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f5aec074dc0 0x7f5aec19d330 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:48:38.825 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:38.824+0000 7f5af0e93700 1 -- 192.168.123.105:0/2985193576 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f5adc0097e0 con 0x7f5aec0737f0 2026-03-10T07:48:38.825 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:38.824+0000 7f5aebfff700 1 --2- 192.168.123.105:0/2985193576 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f5aec074dc0 0x7f5aec19d330 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_auth_reply_more state changed! 
2026-03-10T07:48:38.827 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:38.824+0000 7f5af0e93700 1 --2- 192.168.123.105:0/2985193576 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f5aec0737f0 0x7f5aec19cdf0 secure :-1 s=READY pgs=281 cs=0 l=1 rev1=1 crypto rx=0x7f5aec0751a0 tx=0x7f5ae000dc60 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:48:38.827 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:38.824+0000 7f5ae9ffb700 1 -- 192.168.123.105:0/2985193576 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f5ae00098e0 con 0x7f5aec0737f0 2026-03-10T07:48:38.827 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:38.824+0000 7f5ae9ffb700 1 -- 192.168.123.105:0/2985193576 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f5ae0010460 con 0x7f5aec0737f0 2026-03-10T07:48:38.827 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:38.824+0000 7f5ae9ffb700 1 -- 192.168.123.105:0/2985193576 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f5ae000b5d0 con 0x7f5aec0737f0 2026-03-10T07:48:38.827 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:38.824+0000 7f5af30f7700 1 -- 192.168.123.105:0/2985193576 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f5aec1a2540 con 0x7f5aec0737f0 2026-03-10T07:48:38.827 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:38.825+0000 7f5af30f7700 1 -- 192.168.123.105:0/2985193576 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f5aec1a2a90 con 0x7f5aec0737f0 2026-03-10T07:48:38.828 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:38.826+0000 7f5ae9ffb700 1 -- 192.168.123.105:0/2985193576 <== mon.0 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 19) v1 ==== 90308+0+0 (secure 0 0 0) 0x7f5ae000f5d0 con 
0x7f5aec0737f0 2026-03-10T07:48:38.828 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:38.827+0000 7f5ae9ffb700 1 --2- 192.168.123.105:0/2985193576 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f5ad406c600 0x7f5ad406eac0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:48:38.829 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:38.827+0000 7f5aebfff700 1 --2- 192.168.123.105:0/2985193576 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f5ad406c600 0x7f5ad406eac0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:48:38.829 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:38.827+0000 7f5ae9ffb700 1 -- 192.168.123.105:0/2985193576 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(39..39 src has 1..39) v4 ==== 5362+0+0 (secure 0 0 0) 0x7f5ae008c520 con 0x7f5aec0737f0 2026-03-10T07:48:38.829 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:38.827+0000 7f5af30f7700 1 -- 192.168.123.105:0/2985193576 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f5aec1a26d0 con 0x7f5aec0737f0 2026-03-10T07:48:38.832 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:38.829+0000 7f5aebfff700 1 --2- 192.168.123.105:0/2985193576 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f5ad406c600 0x7f5ad406eac0 secure :-1 s=READY pgs=121 cs=0 l=1 rev1=1 crypto rx=0x7f5adc00b5c0 tx=0x7f5adc005fb0 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:48:38.832 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:38.830+0000 7f5ae9ffb700 1 -- 192.168.123.105:0/2985193576 <== mon.0 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7f5aec1a26d0 
con 0x7f5aec0737f0 2026-03-10T07:48:38.943 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:38.941+0000 7f5af30f7700 1 -- 192.168.123.105:0/2985193576 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command([{prefix=config set, name=log_to_journald}] v 0) v1 -- 0x7f5aec1a26d0 con 0x7f5aec0737f0 2026-03-10T07:48:38.943 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:38.941+0000 7f5ae9ffb700 1 -- 192.168.123.105:0/2985193576 <== mon.0 v2:192.168.123.105:3300/0 7 ==== mon_command_ack([{prefix=config set, name=log_to_journald}]=0 v15)=0 v15) v1 ==== 135+0+0 (secure 0 0 0) 0x7f5aec1a26d0 con 0x7f5aec0737f0 2026-03-10T07:48:38.945 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:38.944+0000 7f5af30f7700 1 -- 192.168.123.105:0/2985193576 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f5ad406c600 msgr2=0x7f5ad406eac0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:48:38.945 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:38.944+0000 7f5af30f7700 1 --2- 192.168.123.105:0/2985193576 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f5ad406c600 0x7f5ad406eac0 secure :-1 s=READY pgs=121 cs=0 l=1 rev1=1 crypto rx=0x7f5adc00b5c0 tx=0x7f5adc005fb0 comp rx=0 tx=0).stop 2026-03-10T07:48:38.945 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:38.944+0000 7f5af30f7700 1 -- 192.168.123.105:0/2985193576 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f5aec0737f0 msgr2=0x7f5aec19cdf0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:48:38.945 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:38.944+0000 7f5af30f7700 1 --2- 192.168.123.105:0/2985193576 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f5aec0737f0 0x7f5aec19cdf0 secure :-1 s=READY pgs=281 cs=0 l=1 rev1=1 crypto rx=0x7f5aec0751a0 tx=0x7f5ae000dc60 comp rx=0 tx=0).stop 2026-03-10T07:48:38.946 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:38.944+0000 7f5af30f7700 1 -- 192.168.123.105:0/2985193576 shutdown_connections 2026-03-10T07:48:38.946 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:38.944+0000 7f5af30f7700 1 --2- 192.168.123.105:0/2985193576 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f5ad406c600 0x7f5ad406eac0 unknown :-1 s=CLOSED pgs=121 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:48:38.946 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:38.944+0000 7f5af30f7700 1 --2- 192.168.123.105:0/2985193576 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f5aec0737f0 0x7f5aec19cdf0 unknown :-1 s=CLOSED pgs=281 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:48:38.946 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:38.944+0000 7f5af30f7700 1 --2- 192.168.123.105:0/2985193576 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f5aec074dc0 0x7f5aec19d330 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:48:38.946 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:38.944+0000 7f5af30f7700 1 -- 192.168.123.105:0/2985193576 >> 192.168.123.105:0/2985193576 conn(0x7f5aec0fc4c0 msgr2=0x7f5aec1027c0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:48:38.946 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:38.945+0000 7f5af30f7700 1 -- 192.168.123.105:0/2985193576 shutdown_connections 2026-03-10T07:48:38.946 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:38.945+0000 7f5af30f7700 1 -- 192.168.123.105:0/2985193576 wait complete. 
2026-03-10T07:48:38.984 DEBUG:teuthology.orchestra.run.vm05:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 12e9780e-1c55-11f1-8896-79f7c2e9b508 -e sha1=e911bdebe5c8faa3800735d1568fcdca65db60df -- bash -c 'ceph orch upgrade start --image quay.ceph.io/ceph-ci/ceph:$sha1 --daemon-types mgr' 2026-03-10T07:48:39.137 INFO:teuthology.orchestra.run.vm05.stderr:Inferring config /var/lib/ceph/12e9780e-1c55-11f1-8896-79f7c2e9b508/mon.vm05/config 2026-03-10T07:48:39.406 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:39.403+0000 7f6d981a2700 1 -- 192.168.123.105:0/2462798195 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f6d90104340 msgr2=0x7f6d901047a0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:48:39.406 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:39.403+0000 7f6d981a2700 1 --2- 192.168.123.105:0/2462798195 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f6d90104340 0x7f6d901047a0 secure :-1 s=READY pgs=282 cs=0 l=1 rev1=1 crypto rx=0x7f6d8c009b00 tx=0x7f6d8c009e10 comp rx=0 tx=0).stop 2026-03-10T07:48:39.406 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:39.404+0000 7f6d981a2700 1 -- 192.168.123.105:0/2462798195 shutdown_connections 2026-03-10T07:48:39.406 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:39.404+0000 7f6d981a2700 1 --2- 192.168.123.105:0/2462798195 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f6d90104340 0x7f6d901047a0 unknown :-1 s=CLOSED pgs=282 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:48:39.406 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:39.404+0000 7f6d981a2700 1 --2- 192.168.123.105:0/2462798195 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f6d90103140 0x7f6d90103560 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 
2026-03-10T07:48:39.406 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:39.404+0000 7f6d981a2700 1 -- 192.168.123.105:0/2462798195 >> 192.168.123.105:0/2462798195 conn(0x7f6d900fe6c0 msgr2=0x7f6d90100b20 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:48:39.406 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:39.405+0000 7f6d981a2700 1 -- 192.168.123.105:0/2462798195 shutdown_connections 2026-03-10T07:48:39.407 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:39.405+0000 7f6d981a2700 1 -- 192.168.123.105:0/2462798195 wait complete. 2026-03-10T07:48:39.407 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:39.405+0000 7f6d981a2700 1 Processor -- start 2026-03-10T07:48:39.407 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:39.405+0000 7f6d981a2700 1 -- start start 2026-03-10T07:48:39.407 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:39.406+0000 7f6d981a2700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f6d90103140 0x7f6d90078b40 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:48:39.407 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:39.406+0000 7f6d981a2700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f6d90104340 0x7f6d90079080 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:48:39.407 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:39.406+0000 7f6d981a2700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f6d900755a0 con 0x7f6d90103140 2026-03-10T07:48:39.408 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:39.406+0000 7f6d981a2700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f6d90075710 con 0x7f6d90104340 2026-03-10T07:48:39.408 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:39.406+0000 7f6d95f3e700 1 --2- >> 
[v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f6d90103140 0x7f6d90078b40 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:48:39.408 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:39.406+0000 7f6d95f3e700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f6d90103140 0x7f6d90078b40 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.105:51932/0 (socket says 192.168.123.105:51932) 2026-03-10T07:48:39.408 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:39.406+0000 7f6d95f3e700 1 -- 192.168.123.105:0/2888545933 learned_addr learned my addr 192.168.123.105:0/2888545933 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T07:48:39.408 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:39.406+0000 7f6d9573d700 1 --2- 192.168.123.105:0/2888545933 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f6d90104340 0x7f6d90079080 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:48:39.408 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:39.406+0000 7f6d95f3e700 1 -- 192.168.123.105:0/2888545933 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f6d90104340 msgr2=0x7f6d90079080 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:48:39.408 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:39.406+0000 7f6d95f3e700 1 --2- 192.168.123.105:0/2888545933 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f6d90104340 0x7f6d90079080 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:48:39.408 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:39.406+0000 7f6d95f3e700 1 -- 
192.168.123.105:0/2888545933 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f6d8c0097e0 con 0x7f6d90103140 2026-03-10T07:48:39.408 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:39.406+0000 7f6d95f3e700 1 --2- 192.168.123.105:0/2888545933 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f6d90103140 0x7f6d90078b40 secure :-1 s=READY pgs=283 cs=0 l=1 rev1=1 crypto rx=0x7f6d8000cc60 tx=0x7f6d800074a0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:48:39.408 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:39.407+0000 7f6d86ffd700 1 -- 192.168.123.105:0/2888545933 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f6d80007af0 con 0x7f6d90103140 2026-03-10T07:48:39.408 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:39.407+0000 7f6d981a2700 1 -- 192.168.123.105:0/2888545933 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f6d900759f0 con 0x7f6d90103140 2026-03-10T07:48:39.409 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:39.407+0000 7f6d981a2700 1 -- 192.168.123.105:0/2888545933 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f6d90075f40 con 0x7f6d90103140 2026-03-10T07:48:39.409 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:39.408+0000 7f6d86ffd700 1 -- 192.168.123.105:0/2888545933 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f6d80004500 con 0x7f6d90103140 2026-03-10T07:48:39.411 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:39.408+0000 7f6d86ffd700 1 -- 192.168.123.105:0/2888545933 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f6d8000f450 con 0x7f6d90103140 2026-03-10T07:48:39.413 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:39.412+0000 7f6d86ffd700 1 -- 192.168.123.105:0/2888545933 <== mon.0 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 19) v1 ==== 90308+0+0 (secure 0 0 0) 0x7f6d80003680 con 0x7f6d90103140 2026-03-10T07:48:39.413 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:39.412+0000 7f6d981a2700 1 -- 192.168.123.105:0/2888545933 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f6d90066e80 con 0x7f6d90103140 2026-03-10T07:48:39.414 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:39.412+0000 7f6d86ffd700 1 --2- 192.168.123.105:0/2888545933 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f6d7c06c2e0 0x7f6d7c06e7a0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:48:39.414 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:39.413+0000 7f6d86ffd700 1 -- 192.168.123.105:0/2888545933 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(39..39 src has 1..39) v4 ==== 5362+0+0 (secure 0 0 0) 0x7f6d8008b6a0 con 0x7f6d90103140 2026-03-10T07:48:39.416 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:39.415+0000 7f6d86ffd700 1 -- 192.168.123.105:0/2888545933 <== mon.0 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7f6d800599d0 con 0x7f6d90103140 2026-03-10T07:48:39.416 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:39.415+0000 7f6d9573d700 1 --2- 192.168.123.105:0/2888545933 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f6d7c06c2e0 0x7f6d7c06e7a0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:48:39.417 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:39.415+0000 7f6d9573d700 1 --2- 192.168.123.105:0/2888545933 >> 
[v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f6d7c06c2e0 0x7f6d7c06e7a0 secure :-1 s=READY pgs=122 cs=0 l=1 rev1=1 crypto rx=0x7f6d8c005340 tx=0x7f6d8c00b540 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:48:39.530 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:48:39.526+0000 7f6d981a2700 1 -- 192.168.123.105:0/2888545933 --> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] -- mgr_command(tid 0: {"prefix": "orch upgrade start", "image": "quay.ceph.io/ceph-ci/ceph:e911bdebe5c8faa3800735d1568fcdca65db60df", "daemon_types": "mgr", "target": ["mon-mgr", ""]}) v1 -- 0x7f6d901a2ec0 con 0x7f6d7c06c2e0 2026-03-10T07:48:40.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:48:40 vm05.local ceph-mon[50387]: pgmap v85: 65 pgs: 65 active+clean; 452 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 8.0 KiB/s rd, 1.2 KiB/s wr, 9 op/s 2026-03-10T07:48:40.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:48:40 vm05.local ceph-mon[50387]: from='client.14564 -' entity='client.admin' cmd=[{"prefix": "orch upgrade start", "image": "quay.ceph.io/ceph-ci/ceph:e911bdebe5c8faa3800735d1568fcdca65db60df", "daemon_types": "mgr", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T07:48:40.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:48:40 vm05.local ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' 2026-03-10T07:48:41.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:48:40 vm08.local ceph-mon[59917]: pgmap v85: 65 pgs: 65 active+clean; 452 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 8.0 KiB/s rd, 1.2 KiB/s wr, 9 op/s 2026-03-10T07:48:41.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:48:40 vm08.local ceph-mon[59917]: from='client.14564 -' entity='client.admin' cmd=[{"prefix": "orch upgrade start", "image": "quay.ceph.io/ceph-ci/ceph:e911bdebe5c8faa3800735d1568fcdca65db60df", "daemon_types": "mgr", "target": ["mon-mgr", ""]}]: dispatch 
2026-03-10T07:48:41.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:48:40 vm08.local ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' 2026-03-10T07:48:43.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:48:42 vm05.local ceph-mon[50387]: pgmap v86: 65 pgs: 65 active+clean; 452 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 7.3 KiB/s rd, 282 B/s wr, 7 op/s 2026-03-10T07:48:43.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:48:42 vm08.local ceph-mon[59917]: pgmap v86: 65 pgs: 65 active+clean; 452 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 7.3 KiB/s rd, 282 B/s wr, 7 op/s 2026-03-10T07:48:44.657 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:48:44 vm05.local ceph-mon[50387]: pgmap v87: 65 pgs: 65 active+clean; 457 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 6.7 KiB/s rd, 682 B/s wr, 6 op/s 2026-03-10T07:48:44.657 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:48:44 vm05.local ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' 2026-03-10T07:48:44.657 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:48:44 vm05.local ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T07:48:44.668 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:48:44 vm08.local ceph-mon[59917]: pgmap v87: 65 pgs: 65 active+clean; 457 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 6.7 KiB/s rd, 682 B/s wr, 6 op/s 2026-03-10T07:48:44.668 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:48:44 vm08.local ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' 2026-03-10T07:48:44.668 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:48:44 vm08.local ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T07:48:46.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:48:46 
vm05.local ceph-mon[50387]: pgmap v88: 65 pgs: 65 active+clean; 457 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 6.2 KiB/s rd, 767 B/s wr, 6 op/s 2026-03-10T07:48:46.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:48:46 vm08.local ceph-mon[59917]: pgmap v88: 65 pgs: 65 active+clean; 457 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 6.2 KiB/s rd, 767 B/s wr, 6 op/s 2026-03-10T07:48:48.418 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:48:48 vm08.local ceph-mon[59917]: pgmap v89: 65 pgs: 65 active+clean; 460 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 4.0 KiB/s rd, 1.1 KiB/s wr, 4 op/s 2026-03-10T07:48:48.657 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:48:48 vm05.local ceph-mon[50387]: pgmap v89: 65 pgs: 65 active+clean; 460 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 4.0 KiB/s rd, 1.1 KiB/s wr, 4 op/s 2026-03-10T07:48:50.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:48:50 vm05.local ceph-mon[50387]: pgmap v90: 65 pgs: 65 active+clean; 460 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 2.5 KiB/s rd, 938 B/s wr, 3 op/s 2026-03-10T07:48:51.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:48:50 vm08.local ceph-mon[59917]: pgmap v90: 65 pgs: 65 active+clean; 460 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 2.5 KiB/s rd, 938 B/s wr, 3 op/s 2026-03-10T07:48:53.407 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:48:53 vm05.local ceph-mon[50387]: pgmap v91: 65 pgs: 65 active+clean; 460 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 938 B/s wr, 0 op/s 2026-03-10T07:48:53.418 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:48:53 vm08.local ceph-mon[59917]: pgmap v91: 65 pgs: 65 active+clean; 460 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 938 B/s wr, 0 op/s 2026-03-10T07:48:54.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:48:54 vm08.local ceph-mon[59917]: pgmap v92: 65 pgs: 65 active+clean; 460 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 938 B/s wr, 0 op/s 2026-03-10T07:48:54.407 
INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:48:54 vm05.local ceph-mon[50387]: pgmap v92: 65 pgs: 65 active+clean; 460 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 938 B/s wr, 0 op/s 2026-03-10T07:48:56.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:48:56 vm05.local ceph-mon[50387]: pgmap v93: 65 pgs: 65 active+clean; 460 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 511 B/s wr, 0 op/s 2026-03-10T07:48:56.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:48:56 vm08.local ceph-mon[59917]: pgmap v93: 65 pgs: 65 active+clean; 460 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 511 B/s wr, 0 op/s 2026-03-10T07:48:58.407 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:48:58 vm05.local ceph-mon[50387]: pgmap v94: 65 pgs: 65 active+clean; 460 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 426 B/s wr, 0 op/s 2026-03-10T07:48:58.418 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:48:58 vm08.local ceph-mon[59917]: pgmap v94: 65 pgs: 65 active+clean; 460 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 426 B/s wr, 0 op/s 2026-03-10T07:48:59.145 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:48:59 vm08.local ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T07:48:59.407 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:48:59 vm05.local ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T07:49:00.407 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:49:00 vm05.local ceph-mon[50387]: pgmap v95: 65 pgs: 65 active+clean; 460 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 85 B/s wr, 0 op/s 2026-03-10T07:49:00.418 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:49:00 vm08.local ceph-mon[59917]: pgmap v95: 65 pgs: 65 active+clean; 460 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 85 B/s wr, 0 op/s 2026-03-10T07:49:03.157 
INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:49:02 vm05.local ceph-mon[50387]: pgmap v96: 65 pgs: 65 active+clean; 460 KiB data, 160 MiB used, 120 GiB / 120 GiB avail 2026-03-10T07:49:03.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:49:02 vm08.local ceph-mon[59917]: pgmap v96: 65 pgs: 65 active+clean; 460 KiB data, 160 MiB used, 120 GiB / 120 GiB avail 2026-03-10T07:49:04.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:49:03 vm08.local ceph-mon[59917]: pgmap v97: 65 pgs: 65 active+clean; 460 KiB data, 160 MiB used, 120 GiB / 120 GiB avail 2026-03-10T07:49:04.407 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:49:03 vm05.local ceph-mon[50387]: pgmap v97: 65 pgs: 65 active+clean; 460 KiB data, 160 MiB used, 120 GiB / 120 GiB avail 2026-03-10T07:49:06.657 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:49:06 vm05.local ceph-mon[50387]: pgmap v98: 65 pgs: 65 active+clean; 460 KiB data, 160 MiB used, 120 GiB / 120 GiB avail 2026-03-10T07:49:06.668 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:49:06 vm08.local ceph-mon[59917]: pgmap v98: 65 pgs: 65 active+clean; 460 KiB data, 160 MiB used, 120 GiB / 120 GiB avail 2026-03-10T07:49:08.668 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:49:08 vm08.local ceph-mon[59917]: pgmap v99: 65 pgs: 65 active+clean; 460 KiB data, 160 MiB used, 120 GiB / 120 GiB avail 2026-03-10T07:49:08.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:49:08 vm05.local ceph-mon[50387]: pgmap v99: 65 pgs: 65 active+clean; 460 KiB data, 160 MiB used, 120 GiB / 120 GiB avail 2026-03-10T07:49:10.742 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:49:10 vm05.local ceph-mon[50387]: pgmap v100: 65 pgs: 65 active+clean; 460 KiB data, 160 MiB used, 120 GiB / 120 GiB avail 2026-03-10T07:49:10.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:49:10 vm08.local ceph-mon[59917]: pgmap v100: 65 pgs: 65 active+clean; 460 KiB data, 160 MiB used, 120 GiB / 120 GiB avail 2026-03-10T07:49:12.907 
INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:49:12 vm05.local ceph-mon[50387]: pgmap v101: 65 pgs: 65 active+clean; 460 KiB data, 160 MiB used, 120 GiB / 120 GiB avail 2026-03-10T07:49:12.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:49:12 vm08.local ceph-mon[59917]: pgmap v101: 65 pgs: 65 active+clean; 460 KiB data, 160 MiB used, 120 GiB / 120 GiB avail 2026-03-10T07:49:13.886 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:49:13 vm08.local ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T07:49:13.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:49:13 vm05.local ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T07:49:15.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:49:14 vm05.local ceph-mon[50387]: pgmap v102: 65 pgs: 65 active+clean; 460 KiB data, 160 MiB used, 120 GiB / 120 GiB avail 2026-03-10T07:49:15.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:49:14 vm08.local ceph-mon[59917]: pgmap v102: 65 pgs: 65 active+clean; 460 KiB data, 160 MiB used, 120 GiB / 120 GiB avail 2026-03-10T07:49:16.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:49:15 vm05.local ceph-mon[50387]: pgmap v103: 65 pgs: 65 active+clean; 460 KiB data, 160 MiB used, 120 GiB / 120 GiB avail 2026-03-10T07:49:16.418 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:49:15 vm08.local ceph-mon[59917]: pgmap v103: 65 pgs: 65 active+clean; 460 KiB data, 160 MiB used, 120 GiB / 120 GiB avail 2026-03-10T07:49:19.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:49:18 vm08.local ceph-mon[59917]: pgmap v104: 65 pgs: 65 active+clean; 460 KiB data, 160 MiB used, 120 GiB / 120 GiB avail 2026-03-10T07:49:19.252 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:49:18 vm05.local ceph-mon[50387]: pgmap v104: 65 pgs: 65 active+clean; 460 KiB 
data, 160 MiB used, 120 GiB / 120 GiB avail 2026-03-10T07:49:19.430 INFO:teuthology.orchestra.run.vm05.stdout:Initiating upgrade to quay.ceph.io/ceph-ci/ceph:e911bdebe5c8faa3800735d1568fcdca65db60df 2026-03-10T07:49:19.431 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:19.428+0000 7f6d86ffd700 1 -- 192.168.123.105:0/2888545933 <== mgr.14223 v2:192.168.123.105:6800/2 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+89 (secure 0 0 0) 0x7f6d901a2ec0 con 0x7f6d7c06c2e0 2026-03-10T07:49:19.433 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:19.430+0000 7f6d981a2700 1 -- 192.168.123.105:0/2888545933 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f6d7c06c2e0 msgr2=0x7f6d7c06e7a0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:49:19.433 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:19.430+0000 7f6d981a2700 1 --2- 192.168.123.105:0/2888545933 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f6d7c06c2e0 0x7f6d7c06e7a0 secure :-1 s=READY pgs=122 cs=0 l=1 rev1=1 crypto rx=0x7f6d8c005340 tx=0x7f6d8c00b540 comp rx=0 tx=0).stop 2026-03-10T07:49:19.433 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:19.430+0000 7f6d981a2700 1 -- 192.168.123.105:0/2888545933 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f6d90103140 msgr2=0x7f6d90078b40 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:49:19.433 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:19.430+0000 7f6d981a2700 1 --2- 192.168.123.105:0/2888545933 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f6d90103140 0x7f6d90078b40 secure :-1 s=READY pgs=283 cs=0 l=1 rev1=1 crypto rx=0x7f6d8000cc60 tx=0x7f6d800074a0 comp rx=0 tx=0).stop 2026-03-10T07:49:19.433 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:19.430+0000 7f6d981a2700 1 -- 192.168.123.105:0/2888545933 shutdown_connections 2026-03-10T07:49:19.433 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:19.430+0000 7f6d981a2700 1 --2- 192.168.123.105:0/2888545933 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f6d7c06c2e0 0x7f6d7c06e7a0 unknown :-1 s=CLOSED pgs=122 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:49:19.433 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:19.431+0000 7f6d981a2700 1 --2- 192.168.123.105:0/2888545933 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f6d90103140 0x7f6d90078b40 unknown :-1 s=CLOSED pgs=283 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:49:19.433 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:19.431+0000 7f6d981a2700 1 --2- 192.168.123.105:0/2888545933 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f6d90104340 0x7f6d90079080 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:49:19.433 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:19.431+0000 7f6d981a2700 1 -- 192.168.123.105:0/2888545933 >> 192.168.123.105:0/2888545933 conn(0x7f6d900fe6c0 msgr2=0x7f6d90107570 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:49:19.433 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:19.431+0000 7f6d981a2700 1 -- 192.168.123.105:0/2888545933 shutdown_connections 2026-03-10T07:49:19.433 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:19.431+0000 7f6d981a2700 1 -- 192.168.123.105:0/2888545933 wait complete. 2026-03-10T07:49:19.507 DEBUG:teuthology.orchestra.run.vm05:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 12e9780e-1c55-11f1-8896-79f7c2e9b508 -e sha1=e911bdebe5c8faa3800735d1568fcdca65db60df -- bash -c 'while ceph orch upgrade status | jq '"'"'.in_progress'"'"' | grep true && ! 
ceph orch upgrade status | jq '"'"'.message'"'"' | grep Error ; do ceph orch ps ; ceph versions ; ceph orch upgrade status ; sleep 30 ; done' 2026-03-10T07:49:19.946 INFO:teuthology.orchestra.run.vm05.stderr:Inferring config /var/lib/ceph/12e9780e-1c55-11f1-8896-79f7c2e9b508/mon.vm05/config 2026-03-10T07:49:20.445 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:20.434+0000 7ffa931df700 1 -- 192.168.123.105:0/1453814925 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ffa8c072b20 msgr2=0x7ffa8c072f40 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:49:20.445 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:20.434+0000 7ffa931df700 1 --2- 192.168.123.105:0/1453814925 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ffa8c072b20 0x7ffa8c072f40 secure :-1 s=READY pgs=284 cs=0 l=1 rev1=1 crypto rx=0x7ffa8400b600 tx=0x7ffa8400b910 comp rx=0 tx=0).stop 2026-03-10T07:49:20.445 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:20.435+0000 7ffa931df700 1 -- 192.168.123.105:0/1453814925 shutdown_connections 2026-03-10T07:49:20.445 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:20.435+0000 7ffa931df700 1 --2- 192.168.123.105:0/1453814925 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7ffa8c075a10 0x7ffa8c077ea0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:49:20.445 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:20.435+0000 7ffa931df700 1 --2- 192.168.123.105:0/1453814925 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ffa8c072b20 0x7ffa8c072f40 unknown :-1 s=CLOSED pgs=284 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:49:20.445 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:20.435+0000 7ffa931df700 1 -- 192.168.123.105:0/1453814925 >> 192.168.123.105:0/1453814925 conn(0x7ffa8c06daa0 msgr2=0x7ffa8c06ff00 unknown :-1 s=STATE_NONE l=0).mark_down 
2026-03-10T07:49:20.445 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:20.435+0000 7ffa931df700 1 -- 192.168.123.105:0/1453814925 shutdown_connections 2026-03-10T07:49:20.445 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:20.435+0000 7ffa931df700 1 -- 192.168.123.105:0/1453814925 wait complete. 2026-03-10T07:49:20.445 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:20.435+0000 7ffa931df700 1 Processor -- start 2026-03-10T07:49:20.445 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:20.435+0000 7ffa931df700 1 -- start start 2026-03-10T07:49:20.445 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:20.435+0000 7ffa931df700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7ffa8c075a10 0x7ffa8c1aedf0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:49:20.445 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:20.435+0000 7ffa931df700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ffa8c1af330 0x7ffa8c1b43a0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:49:20.445 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:20.435+0000 7ffa931df700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7ffa8c1af840 con 0x7ffa8c1af330 2026-03-10T07:49:20.445 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:20.435+0000 7ffa931df700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7ffa8c1af9b0 con 0x7ffa8c075a10 2026-03-10T07:49:20.445 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:20.436+0000 7ffa919dc700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ffa8c1af330 0x7ffa8c1b43a0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:49:20.445 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:20.436+0000 7ffa919dc700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ffa8c1af330 0x7ffa8c1b43a0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.105:34514/0 (socket says 192.168.123.105:34514) 2026-03-10T07:49:20.445 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:20.436+0000 7ffa919dc700 1 -- 192.168.123.105:0/1132445032 learned_addr learned my addr 192.168.123.105:0/1132445032 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T07:49:20.445 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:20.436+0000 7ffa921dd700 1 --2- 192.168.123.105:0/1132445032 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7ffa8c075a10 0x7ffa8c1aedf0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:49:20.445 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:20.436+0000 7ffa919dc700 1 -- 192.168.123.105:0/1132445032 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7ffa8c075a10 msgr2=0x7ffa8c1aedf0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:49:20.445 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:20.436+0000 7ffa919dc700 1 --2- 192.168.123.105:0/1132445032 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7ffa8c075a10 0x7ffa8c1aedf0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:49:20.445 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:20.436+0000 7ffa919dc700 1 -- 192.168.123.105:0/1132445032 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7ffa8400b050 con 0x7ffa8c1af330 2026-03-10T07:49:20.445 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:20.436+0000 7ffa919dc700 1 --2- 192.168.123.105:0/1132445032 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ffa8c1af330 0x7ffa8c1b43a0 secure :-1 s=READY pgs=285 cs=0 l=1 rev1=1 crypto rx=0x7ffa8800c970 tx=0x7ffa8800cc80 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:49:20.445 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:20.437+0000 7ffa837fe700 1 -- 192.168.123.105:0/1132445032 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7ffa88008940 con 0x7ffa8c1af330 2026-03-10T07:49:20.445 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:20.437+0000 7ffa931df700 1 -- 192.168.123.105:0/1132445032 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7ffa8c07c460 con 0x7ffa8c1af330 2026-03-10T07:49:20.445 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:20.437+0000 7ffa931df700 1 -- 192.168.123.105:0/1132445032 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7ffa8c07c9b0 con 0x7ffa8c1af330 2026-03-10T07:49:20.445 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:20.438+0000 7ffa837fe700 1 -- 192.168.123.105:0/1132445032 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7ffa8800f460 con 0x7ffa8c1af330 2026-03-10T07:49:20.445 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:20.438+0000 7ffa837fe700 1 -- 192.168.123.105:0/1132445032 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7ffa88018610 con 0x7ffa8c1af330 2026-03-10T07:49:20.445 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:20.439+0000 7ffa837fe700 1 -- 192.168.123.105:0/1132445032 <== mon.0 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 19) v1 ==== 90308+0+0 (secure 0 0 0) 0x7ffa880213e0 con 0x7ffa8c1af330 
2026-03-10T07:49:20.445 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:20.439+0000 7ffa837fe700 1 --2- 192.168.123.105:0/1132445032 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7ffa7806c610 0x7ffa7806ead0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:49:20.445 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:20.439+0000 7ffa921dd700 1 --2- 192.168.123.105:0/1132445032 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7ffa7806c610 0x7ffa7806ead0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:49:20.445 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:20.439+0000 7ffa837fe700 1 -- 192.168.123.105:0/1132445032 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(39..39 src has 1..39) v4 ==== 5362+0+0 (secure 0 0 0) 0x7ffa8808ca60 con 0x7ffa8c1af330 2026-03-10T07:49:20.447 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:20.440+0000 7ffa931df700 1 -- 192.168.123.105:0/1132445032 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7ffa70005320 con 0x7ffa8c1af330 2026-03-10T07:49:20.447 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:20.443+0000 7ffa921dd700 1 --2- 192.168.123.105:0/1132445032 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7ffa7806c610 0x7ffa7806ead0 secure :-1 s=READY pgs=123 cs=0 l=1 rev1=1 crypto rx=0x7ffa84000f80 tx=0x7ffa8400bd50 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:49:20.447 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:20.443+0000 7ffa837fe700 1 -- 192.168.123.105:0/1132445032 <== mon.0 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7ffa8805ad90 con 
0x7ffa8c1af330 2026-03-10T07:49:20.598 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:20.594+0000 7ffa931df700 1 -- 192.168.123.105:0/1132445032 --> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7ffa70000bf0 con 0x7ffa7806c610 2026-03-10T07:49:20.598 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:49:20 vm05.local ceph-mon[50387]: pgmap v105: 65 pgs: 65 active+clean; 460 KiB data, 160 MiB used, 120 GiB / 120 GiB avail 2026-03-10T07:49:20.598 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:49:20 vm05.local ceph-mon[50387]: Upgrade: Started with target quay.ceph.io/ceph-ci/ceph:e911bdebe5c8faa3800735d1568fcdca65db60df 2026-03-10T07:49:20.598 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:49:20 vm05.local ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' 2026-03-10T07:49:20.598 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:49:20 vm05.local ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T07:49:20.602 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:20.598+0000 7ffa837fe700 1 -- 192.168.123.105:0/1132445032 <== mgr.14223 v2:192.168.123.105:6800/2 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+259 (secure 0 0 0) 0x7ffa70000bf0 con 0x7ffa7806c610 2026-03-10T07:49:20.604 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:20.602+0000 7ffa817fa700 1 -- 192.168.123.105:0/1132445032 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7ffa7806c610 msgr2=0x7ffa7806ead0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:49:20.604 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:20.602+0000 7ffa817fa700 1 --2- 192.168.123.105:0/1132445032 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7ffa7806c610 0x7ffa7806ead0 secure :-1 s=READY pgs=123 cs=0 l=1 
rev1=1 crypto rx=0x7ffa84000f80 tx=0x7ffa8400bd50 comp rx=0 tx=0).stop 2026-03-10T07:49:20.604 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:20.602+0000 7ffa817fa700 1 -- 192.168.123.105:0/1132445032 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ffa8c1af330 msgr2=0x7ffa8c1b43a0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:49:20.604 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:20.602+0000 7ffa817fa700 1 --2- 192.168.123.105:0/1132445032 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ffa8c1af330 0x7ffa8c1b43a0 secure :-1 s=READY pgs=285 cs=0 l=1 rev1=1 crypto rx=0x7ffa8800c970 tx=0x7ffa8800cc80 comp rx=0 tx=0).stop 2026-03-10T07:49:20.604 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:20.602+0000 7ffa817fa700 1 -- 192.168.123.105:0/1132445032 shutdown_connections 2026-03-10T07:49:20.604 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:20.602+0000 7ffa817fa700 1 --2- 192.168.123.105:0/1132445032 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7ffa7806c610 0x7ffa7806ead0 unknown :-1 s=CLOSED pgs=123 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:49:20.604 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:20.602+0000 7ffa817fa700 1 --2- 192.168.123.105:0/1132445032 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7ffa8c075a10 0x7ffa8c1aedf0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:49:20.604 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:20.602+0000 7ffa817fa700 1 --2- 192.168.123.105:0/1132445032 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ffa8c1af330 0x7ffa8c1b43a0 unknown :-1 s=CLOSED pgs=285 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:49:20.604 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:20.602+0000 7ffa817fa700 1 -- 192.168.123.105:0/1132445032 >> 
192.168.123.105:0/1132445032 conn(0x7ffa8c06daa0 msgr2=0x7ffa8c06def0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:49:20.604 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:20.602+0000 7ffa817fa700 1 -- 192.168.123.105:0/1132445032 shutdown_connections 2026-03-10T07:49:20.604 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:20.603+0000 7ffa817fa700 1 -- 192.168.123.105:0/1132445032 wait complete. 2026-03-10T07:49:20.619 INFO:teuthology.orchestra.run.vm05.stdout:true 2026-03-10T07:49:20.705 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:20.702+0000 7f853235f700 1 -- 192.168.123.105:0/1080605795 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f852c072b20 msgr2=0x7f852c072f40 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:49:20.705 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:20.702+0000 7f853235f700 1 --2- 192.168.123.105:0/1080605795 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f852c072b20 0x7f852c072f40 secure :-1 s=READY pgs=286 cs=0 l=1 rev1=1 crypto rx=0x7f8528009b00 tx=0x7f8528009e10 comp rx=0 tx=0).stop 2026-03-10T07:49:20.705 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:20.703+0000 7f853235f700 1 -- 192.168.123.105:0/1080605795 shutdown_connections 2026-03-10T07:49:20.705 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:20.703+0000 7f853235f700 1 --2- 192.168.123.105:0/1080605795 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f852c075a10 0x7f852c077ea0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:49:20.705 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:20.703+0000 7f853235f700 1 --2- 192.168.123.105:0/1080605795 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f852c072b20 0x7f852c072f40 unknown :-1 s=CLOSED pgs=286 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:49:20.705 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:20.703+0000 7f853235f700 1 -- 192.168.123.105:0/1080605795 >> 192.168.123.105:0/1080605795 conn(0x7f852c06daa0 msgr2=0x7f852c06ff00 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:49:20.708 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:20.703+0000 7f853235f700 1 -- 192.168.123.105:0/1080605795 shutdown_connections 2026-03-10T07:49:20.708 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:20.703+0000 7f853235f700 1 -- 192.168.123.105:0/1080605795 wait complete. 2026-03-10T07:49:20.708 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:20.703+0000 7f853235f700 1 Processor -- start 2026-03-10T07:49:20.708 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:20.703+0000 7f853235f700 1 -- start start 2026-03-10T07:49:20.709 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:20.703+0000 7f853235f700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f852c075a10 0x7f852c083080 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:49:20.709 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:20.703+0000 7f853235f700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f852c0835c0 0x7f852c1b3090 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:49:20.709 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:20.703+0000 7f853235f700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f852c083b00 con 0x7f852c0835c0 2026-03-10T07:49:20.709 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:20.703+0000 7f853235f700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f852c083c70 con 0x7f852c075a10 2026-03-10T07:49:20.709 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:20.705+0000 7f8530b5c700 1 --2- >> 
[v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f852c0835c0 0x7f852c1b3090 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:49:20.709 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:20.705+0000 7f8530b5c700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f852c0835c0 0x7f852c1b3090 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.105:34524/0 (socket says 192.168.123.105:34524) 2026-03-10T07:49:20.709 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:20.705+0000 7f8530b5c700 1 -- 192.168.123.105:0/3452204026 learned_addr learned my addr 192.168.123.105:0/3452204026 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T07:49:20.709 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:20.705+0000 7f8530b5c700 1 -- 192.168.123.105:0/3452204026 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f852c075a10 msgr2=0x7f852c083080 unknown :-1 s=STATE_CONNECTING_RE l=1).mark_down 2026-03-10T07:49:20.709 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:20.705+0000 7f8530b5c700 1 --2- 192.168.123.105:0/3452204026 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f852c075a10 0x7f852c083080 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:49:20.709 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:20.705+0000 7f8530b5c700 1 -- 192.168.123.105:0/3452204026 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f85280097e0 con 0x7f852c0835c0 2026-03-10T07:49:20.709 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:20.705+0000 7f8530b5c700 1 --2- 192.168.123.105:0/3452204026 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f852c0835c0 
0x7f852c1b3090 secure :-1 s=READY pgs=287 cs=0 l=1 rev1=1 crypto rx=0x7f852400e910 tx=0x7f852400ecd0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:49:20.709 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:20.706+0000 7f85227fc700 1 -- 192.168.123.105:0/3452204026 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f8524004d60 con 0x7f852c0835c0 2026-03-10T07:49:20.709 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:20.706+0000 7f853235f700 1 -- 192.168.123.105:0/3452204026 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f852c1b36f0 con 0x7f852c0835c0 2026-03-10T07:49:20.712 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:20.706+0000 7f853235f700 1 -- 192.168.123.105:0/3452204026 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f852c1b3bc0 con 0x7f852c0835c0 2026-03-10T07:49:20.712 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:20.706+0000 7f853235f700 1 -- 192.168.123.105:0/3452204026 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f852c04ea90 con 0x7f852c0835c0 2026-03-10T07:49:20.712 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:20.711+0000 7f85227fc700 1 -- 192.168.123.105:0/3452204026 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f8524007d10 con 0x7f852c0835c0 2026-03-10T07:49:20.713 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:20.711+0000 7f85227fc700 1 -- 192.168.123.105:0/3452204026 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f852400b740 con 0x7f852c0835c0 2026-03-10T07:49:20.713 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:20.711+0000 7f85227fc700 1 -- 192.168.123.105:0/3452204026 <== mon.0 
v2:192.168.123.105:3300/0 4 ==== mgrmap(e 19) v1 ==== 90308+0+0 (secure 0 0 0) 0x7f852400b960 con 0x7f852c0835c0 2026-03-10T07:49:20.713 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:20.712+0000 7f85227fc700 1 --2- 192.168.123.105:0/3452204026 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f851806c6d0 0x7f851806eb90 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:49:20.714 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:20.712+0000 7f853135d700 1 --2- 192.168.123.105:0/3452204026 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f851806c6d0 0x7f851806eb90 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:49:20.714 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:20.712+0000 7f853135d700 1 --2- 192.168.123.105:0/3452204026 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f851806c6d0 0x7f851806eb90 secure :-1 s=READY pgs=124 cs=0 l=1 rev1=1 crypto rx=0x7f8528009b00 tx=0x7f852800d750 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:49:20.714 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:20.712+0000 7f85227fc700 1 -- 192.168.123.105:0/3452204026 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(39..39 src has 1..39) v4 ==== 5362+0+0 (secure 0 0 0) 0x7f852408cce0 con 0x7f852c0835c0 2026-03-10T07:49:20.714 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:20.713+0000 7f85227fc700 1 -- 192.168.123.105:0/3452204026 <== mon.0 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7f85240912a0 con 0x7f852c0835c0 2026-03-10T07:49:20.857 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:20.854+0000 7f853235f700 1 -- 192.168.123.105:0/3452204026 --> 
[v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7f852c077710 con 0x7f851806c6d0 2026-03-10T07:49:20.858 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:20.856+0000 7f85227fc700 1 -- 192.168.123.105:0/3452204026 <== mgr.14223 v2:192.168.123.105:6800/2 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+259 (secure 0 0 0) 0x7f852c077710 con 0x7f851806c6d0 2026-03-10T07:49:20.865 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:20.859+0000 7f8517fff700 1 -- 192.168.123.105:0/3452204026 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f851806c6d0 msgr2=0x7f851806eb90 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:49:20.865 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:20.859+0000 7f8517fff700 1 --2- 192.168.123.105:0/3452204026 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f851806c6d0 0x7f851806eb90 secure :-1 s=READY pgs=124 cs=0 l=1 rev1=1 crypto rx=0x7f8528009b00 tx=0x7f852800d750 comp rx=0 tx=0).stop 2026-03-10T07:49:20.865 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:20.859+0000 7f8517fff700 1 -- 192.168.123.105:0/3452204026 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f852c0835c0 msgr2=0x7f852c1b3090 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:49:20.865 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:20.859+0000 7f8517fff700 1 --2- 192.168.123.105:0/3452204026 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f852c0835c0 0x7f852c1b3090 secure :-1 s=READY pgs=287 cs=0 l=1 rev1=1 crypto rx=0x7f852400e910 tx=0x7f852400ecd0 comp rx=0 tx=0).stop 2026-03-10T07:49:20.865 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:20.863+0000 7f8517fff700 1 -- 192.168.123.105:0/3452204026 shutdown_connections 2026-03-10T07:49:20.865 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:20.863+0000 
7f8517fff700 1 --2- 192.168.123.105:0/3452204026 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f851806c6d0 0x7f851806eb90 unknown :-1 s=CLOSED pgs=124 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:49:20.865 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:20.863+0000 7f8517fff700 1 --2- 192.168.123.105:0/3452204026 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f852c075a10 0x7f852c083080 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:49:20.865 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:20.863+0000 7f8517fff700 1 --2- 192.168.123.105:0/3452204026 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f852c0835c0 0x7f852c1b3090 unknown :-1 s=CLOSED pgs=287 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:49:20.865 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:20.863+0000 7f8517fff700 1 -- 192.168.123.105:0/3452204026 >> 192.168.123.105:0/3452204026 conn(0x7f852c06daa0 msgr2=0x7f852c06ff00 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:49:20.866 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:20.863+0000 7f8517fff700 1 -- 192.168.123.105:0/3452204026 shutdown_connections 2026-03-10T07:49:20.866 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:20.865+0000 7f8517fff700 1 -- 192.168.123.105:0/3452204026 wait complete. 
2026-03-10T07:49:20.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:49:20 vm08.local ceph-mon[59917]: pgmap v105: 65 pgs: 65 active+clean; 460 KiB data, 160 MiB used, 120 GiB / 120 GiB avail 2026-03-10T07:49:20.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:49:20 vm08.local ceph-mon[59917]: Upgrade: Started with target quay.ceph.io/ceph-ci/ceph:e911bdebe5c8faa3800735d1568fcdca65db60df 2026-03-10T07:49:20.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:49:20 vm08.local ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' 2026-03-10T07:49:20.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:49:20 vm08.local ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T07:49:20.953 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:20.951+0000 7fbe765ba700 1 -- 192.168.123.105:0/2807325505 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fbe70075a10 msgr2=0x7fbe70077ea0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:49:20.953 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:20.951+0000 7fbe765ba700 1 --2- 192.168.123.105:0/2807325505 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fbe70075a10 0x7fbe70077ea0 secure :-1 s=READY pgs=288 cs=0 l=1 rev1=1 crypto rx=0x7fbe6800b3a0 tx=0x7fbe6800b6b0 comp rx=0 tx=0).stop 2026-03-10T07:49:20.953 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:20.951+0000 7fbe765ba700 1 -- 192.168.123.105:0/2807325505 shutdown_connections 2026-03-10T07:49:20.953 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:20.951+0000 7fbe765ba700 1 --2- 192.168.123.105:0/2807325505 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fbe70075a10 0x7fbe70077ea0 unknown :-1 s=CLOSED pgs=288 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:49:20.953 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:20.951+0000 7fbe765ba700 1 --2- 192.168.123.105:0/2807325505 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fbe70072b20 0x7fbe70072f40 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:49:20.953 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:20.951+0000 7fbe765ba700 1 -- 192.168.123.105:0/2807325505 >> 192.168.123.105:0/2807325505 conn(0x7fbe7006daa0 msgr2=0x7fbe7006ff00 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:49:20.954 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:20.951+0000 7fbe765ba700 1 -- 192.168.123.105:0/2807325505 shutdown_connections 2026-03-10T07:49:20.954 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:20.951+0000 7fbe765ba700 1 -- 192.168.123.105:0/2807325505 wait complete. 2026-03-10T07:49:20.954 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:20.951+0000 7fbe765ba700 1 Processor -- start 2026-03-10T07:49:20.955 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:20.951+0000 7fbe765ba700 1 -- start start 2026-03-10T07:49:20.955 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:20.952+0000 7fbe765ba700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fbe70072b20 0x7fbe70083080 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:49:20.955 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:20.952+0000 7fbe765ba700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fbe700835c0 0x7fbe701b3090 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:49:20.955 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:20.952+0000 7fbe765ba700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fbe70083b00 con 0x7fbe70072b20 2026-03-10T07:49:20.955 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:20.952+0000 7fbe765ba700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fbe70083c70 con 0x7fbe700835c0 2026-03-10T07:49:20.955 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:20.952+0000 7fbe755b8700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fbe70072b20 0x7fbe70083080 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:49:20.955 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:20.952+0000 7fbe74db7700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fbe700835c0 0x7fbe701b3090 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:49:20.955 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:20.952+0000 7fbe74db7700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fbe700835c0 0x7fbe701b3090 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.108:3300/0 says I am v2:192.168.123.105:54062/0 (socket says 192.168.123.105:54062) 2026-03-10T07:49:20.955 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:20.952+0000 7fbe74db7700 1 -- 192.168.123.105:0/430889848 learned_addr learned my addr 192.168.123.105:0/430889848 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T07:49:20.955 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:20.952+0000 7fbe74db7700 1 -- 192.168.123.105:0/430889848 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fbe70072b20 msgr2=0x7fbe70083080 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:49:20.955 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:20.952+0000 7fbe74db7700 1 --2- 192.168.123.105:0/430889848 >> 
[v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fbe70072b20 0x7fbe70083080 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:49:20.955 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:20.952+0000 7fbe74db7700 1 -- 192.168.123.105:0/430889848 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fbe6800b050 con 0x7fbe700835c0 2026-03-10T07:49:20.955 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:20.952+0000 7fbe755b8700 1 --2- 192.168.123.105:0/430889848 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fbe70072b20 0x7fbe70083080 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_auth_done state changed! 2026-03-10T07:49:20.955 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:20.952+0000 7fbe74db7700 1 --2- 192.168.123.105:0/430889848 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fbe700835c0 0x7fbe701b3090 secure :-1 s=READY pgs=62 cs=0 l=1 rev1=1 crypto rx=0x7fbe6800bb30 tx=0x7fbe68007b60 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:49:20.955 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:20.953+0000 7fbe667fc700 1 -- 192.168.123.105:0/430889848 <== mon.1 v2:192.168.123.108:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fbe6800e050 con 0x7fbe700835c0 2026-03-10T07:49:20.955 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:20.953+0000 7fbe765ba700 1 -- 192.168.123.105:0/430889848 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fbe701b3690 con 0x7fbe700835c0 2026-03-10T07:49:20.955 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:20.953+0000 7fbe765ba700 1 -- 192.168.123.105:0/430889848 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fbe701b3b60 con 
0x7fbe700835c0 2026-03-10T07:49:20.955 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:20.954+0000 7fbe667fc700 1 -- 192.168.123.105:0/430889848 <== mon.1 v2:192.168.123.108:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fbe68003e80 con 0x7fbe700835c0 2026-03-10T07:49:20.956 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:20.954+0000 7fbe667fc700 1 -- 192.168.123.105:0/430889848 <== mon.1 v2:192.168.123.108:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fbe6801bb90 con 0x7fbe700835c0 2026-03-10T07:49:20.957 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:20.955+0000 7fbe667fc700 1 -- 192.168.123.105:0/430889848 <== mon.1 v2:192.168.123.108:3300/0 4 ==== mgrmap(e 19) v1 ==== 90308+0+0 (secure 0 0 0) 0x7fbe68019040 con 0x7fbe700835c0 2026-03-10T07:49:20.957 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:20.956+0000 7fbe667fc700 1 --2- 192.168.123.105:0/430889848 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fbe5c06e750 0x7fbe5c070c10 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:49:20.957 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:20.956+0000 7fbe755b8700 1 --2- 192.168.123.105:0/430889848 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fbe5c06e750 0x7fbe5c070c10 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:49:20.957 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:20.956+0000 7fbe667fc700 1 -- 192.168.123.105:0/430889848 <== mon.1 v2:192.168.123.108:3300/0 5 ==== osd_map(39..39 src has 1..39) v4 ==== 5362+0+0 (secure 0 0 0) 0x7fbe6808dae0 con 0x7fbe700835c0 2026-03-10T07:49:20.958 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:20.956+0000 7fbe755b8700 1 --2- 192.168.123.105:0/430889848 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] 
conn(0x7fbe5c06e750 0x7fbe5c070c10 secure :-1 s=READY pgs=125 cs=0 l=1 rev1=1 crypto rx=0x7fbe6c005950 tx=0x7fbe6c00a300 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:49:20.958 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:20.956+0000 7fbe765ba700 1 -- 192.168.123.105:0/430889848 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fbe54005320 con 0x7fbe700835c0 2026-03-10T07:49:20.962 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:20.960+0000 7fbe667fc700 1 -- 192.168.123.105:0/430889848 <== mon.1 v2:192.168.123.108:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7fbe68058300 con 0x7fbe700835c0 2026-03-10T07:49:21.118 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:21.111+0000 7fbe765ba700 1 -- 192.168.123.105:0/430889848 --> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] -- mgr_command(tid 0: {"prefix": "orch ps", "target": ["mon-mgr", ""]}) v1 -- 0x7fbe54000bf0 con 0x7fbe5c06e750 2026-03-10T07:49:21.124 INFO:teuthology.orchestra.run.vm05.stdout:NAME HOST PORTS STATUS REFRESHED AGE MEM USE MEM LIM VERSION IMAGE ID CONTAINER ID 2026-03-10T07:49:21.124 INFO:teuthology.orchestra.run.vm05.stdout:alertmanager.vm05 vm05 *:9093,9094 running (2m) 48s ago 2m 24.7M - 0.25.0 c8568f914cd2 f87529717116 2026-03-10T07:49:21.124 INFO:teuthology.orchestra.run.vm05.stdout:ceph-exporter.vm05 vm05 running (3m) 48s ago 3m 8120k - 18.2.1 5be31c24972a 26c4db858175 2026-03-10T07:49:21.124 INFO:teuthology.orchestra.run.vm05.stdout:ceph-exporter.vm08 vm08 running (2m) 49s ago 2m 8216k - 18.2.1 5be31c24972a 209e2398a09c 2026-03-10T07:49:21.124 INFO:teuthology.orchestra.run.vm05.stdout:crash.vm05 vm05 running (3m) 48s ago 3m 7419k - 18.2.1 5be31c24972a d3d7b92c8ac3 2026-03-10T07:49:21.124 INFO:teuthology.orchestra.run.vm05.stdout:crash.vm08 vm08 running 
(2m) 49s ago 2m 7415k - 18.2.1 5be31c24972a 96136e0195f7 2026-03-10T07:49:21.124 INFO:teuthology.orchestra.run.vm05.stdout:grafana.vm05 vm05 *:3000 running (2m) 48s ago 2m 80.7M - 9.4.7 954c08fa6188 35089be30fc6 2026-03-10T07:49:21.124 INFO:teuthology.orchestra.run.vm05.stdout:mds.cephfs.vm05.omfhnh vm05 running (55s) 48s ago 55s 16.1M - 18.2.1 5be31c24972a e23de179e09c 2026-03-10T07:49:21.125 INFO:teuthology.orchestra.run.vm05.stdout:mds.cephfs.vm05.pavqil vm05 running (53s) 48s ago 53s 13.3M - 18.2.1 5be31c24972a 5b9e5afa214c 2026-03-10T07:49:21.125 INFO:teuthology.orchestra.run.vm05.stdout:mds.cephfs.vm08.dgsaon vm08 running (52s) 49s ago 52s 13.9M - 18.2.1 5be31c24972a 1696aee522b5 2026-03-10T07:49:21.125 INFO:teuthology.orchestra.run.vm05.stdout:mds.cephfs.vm08.ybmbgd vm08 running (54s) 49s ago 54s 10.9M - 18.2.1 5be31c24972a 30b0e51cd2ed 2026-03-10T07:49:21.125 INFO:teuthology.orchestra.run.vm05.stdout:mgr.vm05.blexke vm05 *:9283,8765,8443 running (3m) 48s ago 3m 501M - 18.2.1 5be31c24972a 4af6d7f6e0f4 2026-03-10T07:49:21.125 INFO:teuthology.orchestra.run.vm05.stdout:mgr.vm08.orfpog vm08 *:8443,9283,8765 running (2m) 49s ago 2m 448M - 18.2.1 5be31c24972a 7b89b610a4ab 2026-03-10T07:49:21.125 INFO:teuthology.orchestra.run.vm05.stdout:mon.vm05 vm05 running (3m) 48s ago 3m 50.4M 2048M 18.2.1 5be31c24972a 2a459bf05146 2026-03-10T07:49:21.125 INFO:teuthology.orchestra.run.vm05.stdout:mon.vm08 vm08 running (2m) 49s ago 2m 45.2M 2048M 18.2.1 5be31c24972a e01dfb712474 2026-03-10T07:49:21.125 INFO:teuthology.orchestra.run.vm05.stdout:node-exporter.vm05 vm05 *:9100 running (3m) 48s ago 3m 14.0M - 1.5.0 0da6a335fe13 cb6188e5fa06 2026-03-10T07:49:21.125 INFO:teuthology.orchestra.run.vm05.stdout:node-exporter.vm08 vm08 *:9100 running (2m) 49s ago 2m 15.9M - 1.5.0 0da6a335fe13 f73da8e379d9 2026-03-10T07:49:21.125 INFO:teuthology.orchestra.run.vm05.stdout:osd.0 vm05 running (2m) 48s ago 2m 46.6M 4096M 18.2.1 5be31c24972a 9b7c5ea48cea 2026-03-10T07:49:21.125 
INFO:teuthology.orchestra.run.vm05.stdout:osd.1 vm05 running (118s) 48s ago 118s 50.0M 4096M 18.2.1 5be31c24972a 88e0b65b2c93 2026-03-10T07:49:21.125 INFO:teuthology.orchestra.run.vm05.stdout:osd.2 vm05 running (108s) 48s ago 108s 46.8M 4096M 18.2.1 5be31c24972a b8d3cbeb49f1 2026-03-10T07:49:21.125 INFO:teuthology.orchestra.run.vm05.stdout:osd.3 vm08 running (99s) 49s ago 99s 46.8M 4096M 18.2.1 5be31c24972a 0a62c54a86c0 2026-03-10T07:49:21.125 INFO:teuthology.orchestra.run.vm05.stdout:osd.4 vm08 running (90s) 49s ago 90s 43.3M 4096M 18.2.1 5be31c24972a bd748b691ccd 2026-03-10T07:49:21.125 INFO:teuthology.orchestra.run.vm05.stdout:osd.5 vm08 running (81s) 49s ago 81s 43.0M 4096M 18.2.1 5be31c24972a 9f08820ae98b 2026-03-10T07:49:21.125 INFO:teuthology.orchestra.run.vm05.stdout:prometheus.vm05 vm05 *:9095 running (2m) 48s ago 2m 37.0M - 2.43.0 a07b618ecd1d bcb499ab4929 2026-03-10T07:49:21.125 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:21.121+0000 7fbe667fc700 1 -- 192.168.123.105:0/430889848 <== mgr.14223 v2:192.168.123.105:6800/2 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+3288 (secure 0 0 0) 0x7fbe54000bf0 con 0x7fbe5c06e750 2026-03-10T07:49:21.127 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:21.125+0000 7fbe5bfff700 1 -- 192.168.123.105:0/430889848 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fbe5c06e750 msgr2=0x7fbe5c070c10 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:49:21.127 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:21.125+0000 7fbe5bfff700 1 --2- 192.168.123.105:0/430889848 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fbe5c06e750 0x7fbe5c070c10 secure :-1 s=READY pgs=125 cs=0 l=1 rev1=1 crypto rx=0x7fbe6c005950 tx=0x7fbe6c00a300 comp rx=0 tx=0).stop 2026-03-10T07:49:21.127 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:21.125+0000 7fbe5bfff700 1 -- 192.168.123.105:0/430889848 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] 
conn(0x7fbe700835c0 msgr2=0x7fbe701b3090 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:49:21.127 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:21.125+0000 7fbe5bfff700 1 --2- 192.168.123.105:0/430889848 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fbe700835c0 0x7fbe701b3090 secure :-1 s=READY pgs=62 cs=0 l=1 rev1=1 crypto rx=0x7fbe6800bb30 tx=0x7fbe68007b60 comp rx=0 tx=0).stop 2026-03-10T07:49:21.130 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:21.127+0000 7fbe5bfff700 1 -- 192.168.123.105:0/430889848 shutdown_connections 2026-03-10T07:49:21.130 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:21.127+0000 7fbe5bfff700 1 --2- 192.168.123.105:0/430889848 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fbe5c06e750 0x7fbe5c070c10 unknown :-1 s=CLOSED pgs=125 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:49:21.130 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:21.127+0000 7fbe5bfff700 1 --2- 192.168.123.105:0/430889848 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fbe70072b20 0x7fbe70083080 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:49:21.130 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:21.127+0000 7fbe5bfff700 1 --2- 192.168.123.105:0/430889848 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fbe700835c0 0x7fbe701b3090 unknown :-1 s=CLOSED pgs=62 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:49:21.130 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:21.127+0000 7fbe5bfff700 1 -- 192.168.123.105:0/430889848 >> 192.168.123.105:0/430889848 conn(0x7fbe7006daa0 msgr2=0x7fbe7006ff00 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:49:21.130 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:21.128+0000 7fbe5bfff700 1 -- 192.168.123.105:0/430889848 shutdown_connections 2026-03-10T07:49:21.130 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:21.128+0000 7fbe5bfff700 1 -- 192.168.123.105:0/430889848 wait complete. 2026-03-10T07:49:21.229 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:21.227+0000 7f1b7f2c5700 1 -- 192.168.123.105:0/704926197 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f1b78072b20 msgr2=0x7f1b78072f40 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:49:21.230 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:21.227+0000 7f1b7f2c5700 1 --2- 192.168.123.105:0/704926197 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f1b78072b20 0x7f1b78072f40 secure :-1 s=READY pgs=289 cs=0 l=1 rev1=1 crypto rx=0x7f1b740099c0 tx=0x7f1b74009cd0 comp rx=0 tx=0).stop 2026-03-10T07:49:21.230 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:21.228+0000 7f1b7f2c5700 1 -- 192.168.123.105:0/704926197 shutdown_connections 2026-03-10T07:49:21.230 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:21.228+0000 7f1b7f2c5700 1 --2- 192.168.123.105:0/704926197 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f1b78075a10 0x7f1b78077ea0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:49:21.230 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:21.228+0000 7f1b7f2c5700 1 --2- 192.168.123.105:0/704926197 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f1b78072b20 0x7f1b78072f40 unknown :-1 s=CLOSED pgs=289 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:49:21.230 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:21.228+0000 7f1b7f2c5700 1 -- 192.168.123.105:0/704926197 >> 192.168.123.105:0/704926197 conn(0x7f1b7806daa0 msgr2=0x7f1b7806ff00 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:49:21.230 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:21.228+0000 7f1b7f2c5700 1 -- 192.168.123.105:0/704926197 shutdown_connections 
2026-03-10T07:49:21.230 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:21.228+0000 7f1b7f2c5700 1 -- 192.168.123.105:0/704926197 wait complete. 2026-03-10T07:49:21.231 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:21.228+0000 7f1b7f2c5700 1 Processor -- start 2026-03-10T07:49:21.231 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:21.228+0000 7f1b7f2c5700 1 -- start start 2026-03-10T07:49:21.231 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:21.228+0000 7f1b7f2c5700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f1b78075a10 0x7f1b78083070 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:49:21.231 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:21.228+0000 7f1b7f2c5700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f1b780835b0 0x7f1b781b30d0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:49:21.231 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:21.228+0000 7f1b7f2c5700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f1b78083ac0 con 0x7f1b78075a10 2026-03-10T07:49:21.231 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:21.228+0000 7f1b7f2c5700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f1b78083c30 con 0x7f1b780835b0 2026-03-10T07:49:21.231 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:21.229+0000 7f1b7dac2700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f1b780835b0 0x7f1b781b30d0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:49:21.231 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:21.229+0000 7f1b7dac2700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f1b780835b0 
0x7f1b781b30d0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.108:3300/0 says I am v2:192.168.123.105:54084/0 (socket says 192.168.123.105:54084) 2026-03-10T07:49:21.231 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:21.229+0000 7f1b7dac2700 1 -- 192.168.123.105:0/4024007762 learned_addr learned my addr 192.168.123.105:0/4024007762 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T07:49:21.231 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:21.229+0000 7f1b7e2c3700 1 --2- 192.168.123.105:0/4024007762 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f1b78075a10 0x7f1b78083070 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:49:21.231 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:21.229+0000 7f1b7e2c3700 1 -- 192.168.123.105:0/4024007762 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f1b780835b0 msgr2=0x7f1b781b30d0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:49:21.231 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:21.229+0000 7f1b7e2c3700 1 --2- 192.168.123.105:0/4024007762 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f1b780835b0 0x7f1b781b30d0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:49:21.231 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:21.229+0000 7f1b7e2c3700 1 -- 192.168.123.105:0/4024007762 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f1b740096b0 con 0x7f1b78075a10 2026-03-10T07:49:21.231 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:21.230+0000 7f1b7e2c3700 1 --2- 192.168.123.105:0/4024007762 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f1b78075a10 0x7f1b78083070 secure :-1 s=READY 
pgs=290 cs=0 l=1 rev1=1 crypto rx=0x7f1b74005b40 tx=0x7f1b74005280 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:49:21.232 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:21.230+0000 7f1b6f7fe700 1 -- 192.168.123.105:0/4024007762 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f1b74003bf0 con 0x7f1b78075a10 2026-03-10T07:49:21.233 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:21.230+0000 7f1b7f2c5700 1 -- 192.168.123.105:0/4024007762 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f1b781b3610 con 0x7f1b78075a10 2026-03-10T07:49:21.233 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:21.230+0000 7f1b7f2c5700 1 -- 192.168.123.105:0/4024007762 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f1b781b3b00 con 0x7f1b78075a10 2026-03-10T07:49:21.234 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:21.233+0000 7f1b6f7fe700 1 -- 192.168.123.105:0/4024007762 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f1b74003d50 con 0x7f1b78075a10 2026-03-10T07:49:21.234 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:21.233+0000 7f1b6f7fe700 1 -- 192.168.123.105:0/4024007762 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f1b740219b0 con 0x7f1b78075a10 2026-03-10T07:49:21.234 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:21.233+0000 7f1b6f7fe700 1 -- 192.168.123.105:0/4024007762 <== mon.0 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 19) v1 ==== 90308+0+0 (secure 0 0 0) 0x7f1b74005400 con 0x7f1b78075a10 2026-03-10T07:49:21.235 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:21.234+0000 7f1b6f7fe700 1 --2- 192.168.123.105:0/4024007762 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f1b6406c6d0 0x7f1b6406eb90 
unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:49:21.236 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:21.234+0000 7f1b7dac2700 1 --2- 192.168.123.105:0/4024007762 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f1b6406c6d0 0x7f1b6406eb90 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:49:21.236 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:21.234+0000 7f1b7dac2700 1 --2- 192.168.123.105:0/4024007762 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f1b6406c6d0 0x7f1b6406eb90 secure :-1 s=READY pgs=126 cs=0 l=1 rev1=1 crypto rx=0x7f1b7000bfd0 tx=0x7f1b7000bf20 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:49:21.236 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:21.234+0000 7f1b6f7fe700 1 -- 192.168.123.105:0/4024007762 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(39..39 src has 1..39) v4 ==== 5362+0+0 (secure 0 0 0) 0x7f1b7408d810 con 0x7f1b78075a10 2026-03-10T07:49:21.236 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:21.235+0000 7f1b7f2c5700 1 -- 192.168.123.105:0/4024007762 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f1b5c005320 con 0x7f1b78075a10 2026-03-10T07:49:21.239 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:21.237+0000 7f1b6f7fe700 1 -- 192.168.123.105:0/4024007762 <== mon.0 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7f1b74058160 con 0x7f1b78075a10 2026-03-10T07:49:21.415 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:21.413+0000 7f1b7f2c5700 1 -- 192.168.123.105:0/4024007762 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": 
"versions"} v 0) v1 -- 0x7f1b5c006200 con 0x7f1b78075a10 2026-03-10T07:49:21.416 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:21.414+0000 7f1b6f7fe700 1 -- 192.168.123.105:0/4024007762 <== mon.0 v2:192.168.123.105:3300/0 7 ==== mon_command_ack([{"prefix": "versions"}]=0 v0) v1 ==== 56+0+558 (secure 0 0 0) 0x7f1b7405b780 con 0x7f1b78075a10 2026-03-10T07:49:21.416 INFO:teuthology.orchestra.run.vm05.stdout:{ 2026-03-10T07:49:21.417 INFO:teuthology.orchestra.run.vm05.stdout: "mon": { 2026-03-10T07:49:21.417 INFO:teuthology.orchestra.run.vm05.stdout: "ceph version 18.2.1 (7fe91d5d5842e04be3b4f514d6dd990c54b29c76) reef (stable)": 2 2026-03-10T07:49:21.417 INFO:teuthology.orchestra.run.vm05.stdout: }, 2026-03-10T07:49:21.417 INFO:teuthology.orchestra.run.vm05.stdout: "mgr": { 2026-03-10T07:49:21.417 INFO:teuthology.orchestra.run.vm05.stdout: "ceph version 18.2.1 (7fe91d5d5842e04be3b4f514d6dd990c54b29c76) reef (stable)": 2 2026-03-10T07:49:21.417 INFO:teuthology.orchestra.run.vm05.stdout: }, 2026-03-10T07:49:21.417 INFO:teuthology.orchestra.run.vm05.stdout: "osd": { 2026-03-10T07:49:21.417 INFO:teuthology.orchestra.run.vm05.stdout: "ceph version 18.2.1 (7fe91d5d5842e04be3b4f514d6dd990c54b29c76) reef (stable)": 6 2026-03-10T07:49:21.417 INFO:teuthology.orchestra.run.vm05.stdout: }, 2026-03-10T07:49:21.417 INFO:teuthology.orchestra.run.vm05.stdout: "mds": { 2026-03-10T07:49:21.417 INFO:teuthology.orchestra.run.vm05.stdout: "ceph version 18.2.1 (7fe91d5d5842e04be3b4f514d6dd990c54b29c76) reef (stable)": 4 2026-03-10T07:49:21.417 INFO:teuthology.orchestra.run.vm05.stdout: }, 2026-03-10T07:49:21.417 INFO:teuthology.orchestra.run.vm05.stdout: "overall": { 2026-03-10T07:49:21.417 INFO:teuthology.orchestra.run.vm05.stdout: "ceph version 18.2.1 (7fe91d5d5842e04be3b4f514d6dd990c54b29c76) reef (stable)": 14 2026-03-10T07:49:21.417 INFO:teuthology.orchestra.run.vm05.stdout: } 2026-03-10T07:49:21.417 INFO:teuthology.orchestra.run.vm05.stdout:} 2026-03-10T07:49:21.422 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:21.420+0000 7f1b6d7fa700 1 -- 192.168.123.105:0/4024007762 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f1b6406c6d0 msgr2=0x7f1b6406eb90 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:49:21.422 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:21.420+0000 7f1b6d7fa700 1 --2- 192.168.123.105:0/4024007762 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f1b6406c6d0 0x7f1b6406eb90 secure :-1 s=READY pgs=126 cs=0 l=1 rev1=1 crypto rx=0x7f1b7000bfd0 tx=0x7f1b7000bf20 comp rx=0 tx=0).stop 2026-03-10T07:49:21.422 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:21.420+0000 7f1b6d7fa700 1 -- 192.168.123.105:0/4024007762 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f1b78075a10 msgr2=0x7f1b78083070 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:49:21.422 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:21.420+0000 7f1b6d7fa700 1 --2- 192.168.123.105:0/4024007762 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f1b78075a10 0x7f1b78083070 secure :-1 s=READY pgs=290 cs=0 l=1 rev1=1 crypto rx=0x7f1b74005b40 tx=0x7f1b74005280 comp rx=0 tx=0).stop 2026-03-10T07:49:21.422 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:21.420+0000 7f1b6d7fa700 1 -- 192.168.123.105:0/4024007762 shutdown_connections 2026-03-10T07:49:21.422 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:21.420+0000 7f1b6d7fa700 1 --2- 192.168.123.105:0/4024007762 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f1b6406c6d0 0x7f1b6406eb90 unknown :-1 s=CLOSED pgs=126 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:49:21.422 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:21.420+0000 7f1b6d7fa700 1 --2- 192.168.123.105:0/4024007762 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f1b78075a10 0x7f1b78083070 secure :-1 s=CLOSED 
pgs=290 cs=0 l=1 rev1=1 crypto rx=0x7f1b74005b40 tx=0x7f1b74005280 comp rx=0 tx=0).stop 2026-03-10T07:49:21.422 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:21.420+0000 7f1b6d7fa700 1 --2- 192.168.123.105:0/4024007762 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f1b780835b0 0x7f1b781b30d0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:49:21.422 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:21.420+0000 7f1b6d7fa700 1 -- 192.168.123.105:0/4024007762 >> 192.168.123.105:0/4024007762 conn(0x7f1b7806daa0 msgr2=0x7f1b7806ff00 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:49:21.422 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:21.421+0000 7f1b6d7fa700 1 -- 192.168.123.105:0/4024007762 shutdown_connections 2026-03-10T07:49:21.423 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:21.421+0000 7f1b6d7fa700 1 -- 192.168.123.105:0/4024007762 wait complete. 2026-03-10T07:49:21.510 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:21.508+0000 7fe23a32d700 1 -- 192.168.123.105:0/193651807 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fe234075a40 msgr2=0x7fe234077ed0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:49:21.510 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:21.508+0000 7fe23a32d700 1 --2- 192.168.123.105:0/193651807 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fe234075a40 0x7fe234077ed0 secure :-1 s=READY pgs=291 cs=0 l=1 rev1=1 crypto rx=0x7fe22c00b780 tx=0x7fe22c00ba90 comp rx=0 tx=0).stop 2026-03-10T07:49:21.510 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:21.508+0000 7fe23a32d700 1 -- 192.168.123.105:0/193651807 shutdown_connections 2026-03-10T07:49:21.510 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:21.508+0000 7fe23a32d700 1 --2- 192.168.123.105:0/193651807 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fe234075a40 
0x7fe234077ed0 unknown :-1 s=CLOSED pgs=291 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:49:21.510 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:21.508+0000 7fe23a32d700 1 --2- 192.168.123.105:0/193651807 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fe234072b50 0x7fe234072f70 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:49:21.510 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:21.508+0000 7fe23a32d700 1 -- 192.168.123.105:0/193651807 >> 192.168.123.105:0/193651807 conn(0x7fe23406dae0 msgr2=0x7fe23406ff40 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:49:21.510 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:21.508+0000 7fe23a32d700 1 -- 192.168.123.105:0/193651807 shutdown_connections 2026-03-10T07:49:21.510 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:21.508+0000 7fe23a32d700 1 -- 192.168.123.105:0/193651807 wait complete. 2026-03-10T07:49:21.511 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:21.509+0000 7fe23a32d700 1 Processor -- start 2026-03-10T07:49:21.511 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:21.509+0000 7fe23a32d700 1 -- start start 2026-03-10T07:49:21.511 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:21.509+0000 7fe23a32d700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fe234072b50 0x7fe234083100 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:49:21.511 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:21.509+0000 7fe23a32d700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fe234083640 0x7fe23412e400 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:49:21.511 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:21.509+0000 7fe23a32d700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- 
mon_getmap magic: 0 v1 -- 0x7fe234083b80 con 0x7fe234072b50 2026-03-10T07:49:21.511 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:21.509+0000 7fe23a32d700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fe234083cf0 con 0x7fe234083640 2026-03-10T07:49:21.511 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:21.509+0000 7fe233fff700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fe234072b50 0x7fe234083100 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:49:21.511 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:21.509+0000 7fe2337fe700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fe234083640 0x7fe23412e400 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:49:21.511 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:21.509+0000 7fe2337fe700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fe234083640 0x7fe23412e400 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.108:3300/0 says I am v2:192.168.123.105:54106/0 (socket says 192.168.123.105:54106) 2026-03-10T07:49:21.511 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:21.509+0000 7fe2337fe700 1 -- 192.168.123.105:0/1096665678 learned_addr learned my addr 192.168.123.105:0/1096665678 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T07:49:21.511 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:21.509+0000 7fe233fff700 1 -- 192.168.123.105:0/1096665678 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fe234083640 msgr2=0x7fe23412e400 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:49:21.511 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:21.509+0000 7fe233fff700 1 --2- 192.168.123.105:0/1096665678 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fe234083640 0x7fe23412e400 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:49:21.511 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:21.509+0000 7fe233fff700 1 -- 192.168.123.105:0/1096665678 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fe22c00b050 con 0x7fe234072b50 2026-03-10T07:49:21.512 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:21.510+0000 7fe233fff700 1 --2- 192.168.123.105:0/1096665678 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fe234072b50 0x7fe234083100 secure :-1 s=READY pgs=292 cs=0 l=1 rev1=1 crypto rx=0x7fe22400b700 tx=0x7fe22400ba10 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:49:21.512 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:21.510+0000 7fe2317fa700 1 -- 192.168.123.105:0/1096665678 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fe224011840 con 0x7fe234072b50 2026-03-10T07:49:21.513 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:21.510+0000 7fe23a32d700 1 -- 192.168.123.105:0/1096665678 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fe23412ea60 con 0x7fe234072b50 2026-03-10T07:49:21.513 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:21.510+0000 7fe23a32d700 1 -- 192.168.123.105:0/1096665678 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fe23412ef60 con 0x7fe234072b50 2026-03-10T07:49:21.513 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:21.511+0000 7fe2317fa700 1 -- 192.168.123.105:0/1096665678 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 
==== 1139+0+0 (secure 0 0 0) 0x7fe224011e80 con 0x7fe234072b50 2026-03-10T07:49:21.513 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:21.511+0000 7fe2317fa700 1 -- 192.168.123.105:0/1096665678 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fe22400f550 con 0x7fe234072b50 2026-03-10T07:49:21.514 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:21.512+0000 7fe2317fa700 1 -- 192.168.123.105:0/1096665678 <== mon.0 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 19) v1 ==== 90308+0+0 (secure 0 0 0) 0x7fe22400f6b0 con 0x7fe234072b50 2026-03-10T07:49:21.514 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:21.513+0000 7fe2317fa700 1 --2- 192.168.123.105:0/1096665678 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fe21c06c600 0x7fe21c06eac0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:49:21.515 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:21.513+0000 7fe2337fe700 1 --2- 192.168.123.105:0/1096665678 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fe21c06c600 0x7fe21c06eac0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:49:21.515 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:21.513+0000 7fe2337fe700 1 --2- 192.168.123.105:0/1096665678 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fe21c06c600 0x7fe21c06eac0 secure :-1 s=READY pgs=127 cs=0 l=1 rev1=1 crypto rx=0x7fe22c00b020 tx=0x7fe22c00afb0 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:49:21.515 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:21.514+0000 7fe2317fa700 1 -- 192.168.123.105:0/1096665678 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(39..39 src has 1..39) v4 ==== 5362+0+0 (secure 0 0 0) 0x7fe22408c5d0 con 0x7fe234072b50 
2026-03-10T07:49:21.515 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:21.514+0000 7fe23a32d700 1 -- 192.168.123.105:0/1096665678 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fe220005320 con 0x7fe234072b50
2026-03-10T07:49:21.518 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:21.517+0000 7fe2317fa700 1 -- 192.168.123.105:0/1096665678 <== mon.0 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7fe224056e70 con 0x7fe234072b50
2026-03-10T07:49:21.630 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:21.628+0000 7fe23a32d700 1 -- 192.168.123.105:0/1096665678 --> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7fe220000bf0 con 0x7fe21c06c600
2026-03-10T07:49:21.631 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:21.629+0000 7fe2317fa700 1 -- 192.168.123.105:0/1096665678 <== mgr.14223 v2:192.168.123.105:6800/2 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+351 (secure 0 0 0) 0x7fe220000bf0 con 0x7fe21c06c600
2026-03-10T07:49:21.631 INFO:teuthology.orchestra.run.vm05.stdout:{
2026-03-10T07:49:21.631 INFO:teuthology.orchestra.run.vm05.stdout: "target_image": "quay.ceph.io/ceph-ci/ceph:e911bdebe5c8faa3800735d1568fcdca65db60df",
2026-03-10T07:49:21.631 INFO:teuthology.orchestra.run.vm05.stdout: "in_progress": true,
2026-03-10T07:49:21.631 INFO:teuthology.orchestra.run.vm05.stdout: "which": "Upgrading daemons of type(s) mgr",
2026-03-10T07:49:21.631 INFO:teuthology.orchestra.run.vm05.stdout: "services_complete": [],
2026-03-10T07:49:21.631 INFO:teuthology.orchestra.run.vm05.stdout: "progress": "",
2026-03-10T07:49:21.631 INFO:teuthology.orchestra.run.vm05.stdout: "message": "Doing first pull of quay.ceph.io/ceph-ci/ceph:e911bdebe5c8faa3800735d1568fcdca65db60df image",
2026-03-10T07:49:21.631 INFO:teuthology.orchestra.run.vm05.stdout: "is_paused": false
2026-03-10T07:49:21.631 INFO:teuthology.orchestra.run.vm05.stdout:}
2026-03-10T07:49:21.633 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:21.632+0000 7fe21affd700 1 -- 192.168.123.105:0/1096665678 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fe21c06c600 msgr2=0x7fe21c06eac0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T07:49:21.633 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:21.632+0000 7fe21affd700 1 --2- 192.168.123.105:0/1096665678 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fe21c06c600 0x7fe21c06eac0 secure :-1 s=READY pgs=127 cs=0 l=1 rev1=1 crypto rx=0x7fe22c00b020 tx=0x7fe22c00afb0 comp rx=0 tx=0).stop
2026-03-10T07:49:21.633 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:21.632+0000 7fe21affd700 1 -- 192.168.123.105:0/1096665678 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fe234072b50 msgr2=0x7fe234083100 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T07:49:21.634 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:21.632+0000 7fe21affd700 1 --2- 192.168.123.105:0/1096665678 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fe234072b50 0x7fe234083100 secure :-1 s=READY pgs=292 cs=0 l=1 rev1=1 crypto rx=0x7fe22400b700 tx=0x7fe22400ba10 comp rx=0 tx=0).stop
2026-03-10T07:49:21.634 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:21.632+0000 7fe21affd700 1 -- 192.168.123.105:0/1096665678 shutdown_connections
2026-03-10T07:49:21.634 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:21.632+0000 7fe21affd700 1 --2- 192.168.123.105:0/1096665678 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fe21c06c600 0x7fe21c06eac0 unknown :-1 s=CLOSED pgs=127 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T07:49:21.634 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:21.632+0000 7fe21affd700 1 --2- 192.168.123.105:0/1096665678 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fe234072b50 0x7fe234083100 unknown :-1 s=CLOSED pgs=292 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:49:21.634 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:21.632+0000 7fe21affd700 1 --2- 192.168.123.105:0/1096665678 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fe234083640 0x7fe23412e400 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:49:21.634 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:21.633+0000 7fe21affd700 1 -- 192.168.123.105:0/1096665678 >> 192.168.123.105:0/1096665678 conn(0x7fe23406dae0 msgr2=0x7fe23406ff40 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:49:21.634 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:21.633+0000 7fe21affd700 1 -- 192.168.123.105:0/1096665678 shutdown_connections 2026-03-10T07:49:21.634 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:21.633+0000 7fe21affd700 1 -- 192.168.123.105:0/1096665678 wait complete. 
2026-03-10T07:49:21.733 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:49:21 vm05.local ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T07:49:21.733 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:49:21 vm05.local ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T07:49:21.733 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:49:21 vm05.local ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' 2026-03-10T07:49:21.733 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:49:21 vm05.local ceph-mon[50387]: from='client.14568 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T07:49:21.733 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:49:21 vm05.local ceph-mon[50387]: from='client.? 192.168.123.105:0/4024007762' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T07:49:22.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:49:21 vm08.local ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T07:49:22.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:49:21 vm08.local ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T07:49:22.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:49:21 vm08.local ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' 2026-03-10T07:49:22.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:49:21 vm08.local ceph-mon[59917]: from='client.14568 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T07:49:22.168 
INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:49:21 vm08.local ceph-mon[59917]: from='client.? 192.168.123.105:0/4024007762' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T07:49:23.158 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:49:22 vm05.local ceph-mon[50387]: from='client.14572 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T07:49:23.158 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:49:22 vm05.local ceph-mon[50387]: Upgrade: First pull of quay.ceph.io/ceph-ci/ceph:e911bdebe5c8faa3800735d1568fcdca65db60df 2026-03-10T07:49:23.158 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:49:22 vm05.local ceph-mon[50387]: from='client.24361 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T07:49:23.158 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:49:22 vm05.local ceph-mon[50387]: pgmap v106: 65 pgs: 65 active+clean; 460 KiB data, 160 MiB used, 120 GiB / 120 GiB avail 2026-03-10T07:49:23.158 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:49:22 vm05.local ceph-mon[50387]: from='client.14584 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T07:49:23.158 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:49:22 vm05.local ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' 2026-03-10T07:49:23.158 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:49:22 vm05.local ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T07:49:23.158 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:49:22 vm05.local ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' 2026-03-10T07:49:23.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:49:22 vm08.local ceph-mon[59917]: from='client.14572 -' 
entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T07:49:23.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:49:22 vm08.local ceph-mon[59917]: Upgrade: First pull of quay.ceph.io/ceph-ci/ceph:e911bdebe5c8faa3800735d1568fcdca65db60df 2026-03-10T07:49:23.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:49:22 vm08.local ceph-mon[59917]: from='client.24361 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T07:49:23.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:49:22 vm08.local ceph-mon[59917]: pgmap v106: 65 pgs: 65 active+clean; 460 KiB data, 160 MiB used, 120 GiB / 120 GiB avail 2026-03-10T07:49:23.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:49:22 vm08.local ceph-mon[59917]: from='client.14584 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T07:49:23.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:49:22 vm08.local ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' 2026-03-10T07:49:23.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:49:22 vm08.local ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T07:49:23.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:49:22 vm08.local ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' 2026-03-10T07:49:24.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:49:23 vm05.local ceph-mon[50387]: Upgrade: Target is version 19.2.3-678-ge911bdeb (unknown) 2026-03-10T07:49:24.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:49:23 vm05.local ceph-mon[50387]: Upgrade: Target container is quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, digests 
['quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc'] 2026-03-10T07:49:24.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:49:23 vm05.local ceph-mon[50387]: Upgrade: Need to upgrade myself (mgr.vm05.blexke) 2026-03-10T07:49:24.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:49:23 vm05.local ceph-mon[50387]: Upgrade: Pulling quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc on vm08 2026-03-10T07:49:24.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:49:23 vm08.local ceph-mon[59917]: Upgrade: Target is version 19.2.3-678-ge911bdeb (unknown) 2026-03-10T07:49:24.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:49:23 vm08.local ceph-mon[59917]: Upgrade: Target container is quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, digests ['quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc'] 2026-03-10T07:49:24.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:49:23 vm08.local ceph-mon[59917]: Upgrade: Need to upgrade myself (mgr.vm05.blexke) 2026-03-10T07:49:24.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:49:23 vm08.local ceph-mon[59917]: Upgrade: Pulling quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc on vm08 2026-03-10T07:49:25.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:49:24 vm05.local ceph-mon[50387]: pgmap v107: 65 pgs: 65 active+clean; 460 KiB data, 160 MiB used, 120 GiB / 120 GiB avail 2026-03-10T07:49:25.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:49:24 vm08.local ceph-mon[59917]: pgmap v107: 65 pgs: 65 active+clean; 460 KiB data, 160 MiB used, 120 GiB / 120 GiB avail 2026-03-10T07:49:27.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:49:26 vm05.local ceph-mon[50387]: pgmap v108: 65 pgs: 65 active+clean; 460 KiB data, 160 MiB used, 120 GiB / 120 GiB 
avail 2026-03-10T07:49:27.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:49:26 vm08.local ceph-mon[59917]: pgmap v108: 65 pgs: 65 active+clean; 460 KiB data, 160 MiB used, 120 GiB / 120 GiB avail 2026-03-10T07:49:29.407 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:49:29 vm05.local ceph-mon[50387]: pgmap v109: 65 pgs: 65 active+clean; 460 KiB data, 160 MiB used, 120 GiB / 120 GiB avail 2026-03-10T07:49:29.408 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:49:29 vm05.local ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T07:49:29.418 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:49:29 vm08.local ceph-mon[59917]: pgmap v109: 65 pgs: 65 active+clean; 460 KiB data, 160 MiB used, 120 GiB / 120 GiB avail 2026-03-10T07:49:29.418 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:49:29 vm08.local ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T07:49:30.407 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:49:30 vm05.local ceph-mon[50387]: pgmap v110: 65 pgs: 65 active+clean; 460 KiB data, 160 MiB used, 120 GiB / 120 GiB avail 2026-03-10T07:49:30.418 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:49:30 vm08.local ceph-mon[59917]: pgmap v110: 65 pgs: 65 active+clean; 460 KiB data, 160 MiB used, 120 GiB / 120 GiB avail 2026-03-10T07:49:32.657 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:49:32 vm05.local ceph-mon[50387]: pgmap v111: 65 pgs: 65 active+clean; 460 KiB data, 160 MiB used, 120 GiB / 120 GiB avail 2026-03-10T07:49:32.668 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:49:32 vm08.local ceph-mon[59917]: pgmap v111: 65 pgs: 65 active+clean; 460 KiB data, 160 MiB used, 120 GiB / 120 GiB avail 2026-03-10T07:49:34.407 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:49:33 vm05.local ceph-mon[50387]: pgmap v112: 65 
pgs: 65 active+clean; 460 KiB data, 160 MiB used, 120 GiB / 120 GiB avail 2026-03-10T07:49:34.418 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:49:33 vm08.local ceph-mon[59917]: pgmap v112: 65 pgs: 65 active+clean; 460 KiB data, 160 MiB used, 120 GiB / 120 GiB avail 2026-03-10T07:49:36.568 INFO:tasks.workunit.client.1.vm08.stderr:Note: switching to '75a68fd8ca3f918fe9466b4c0bb385b7fc260a9b'. 2026-03-10T07:49:36.568 INFO:tasks.workunit.client.1.vm08.stderr: 2026-03-10T07:49:36.568 INFO:tasks.workunit.client.1.vm08.stderr:You are in 'detached HEAD' state. You can look around, make experimental 2026-03-10T07:49:36.568 INFO:tasks.workunit.client.1.vm08.stderr:changes and commit them, and you can discard any commits you make in this 2026-03-10T07:49:36.568 INFO:tasks.workunit.client.1.vm08.stderr:state without impacting any branches by switching back to a branch. 2026-03-10T07:49:36.568 INFO:tasks.workunit.client.1.vm08.stderr: 2026-03-10T07:49:36.568 INFO:tasks.workunit.client.1.vm08.stderr:If you want to create a new branch to retain commits you create, you may 2026-03-10T07:49:36.568 INFO:tasks.workunit.client.1.vm08.stderr:do so (now or later) by using -c with the switch command. 
Example: 2026-03-10T07:49:36.568 INFO:tasks.workunit.client.1.vm08.stderr: 2026-03-10T07:49:36.568 INFO:tasks.workunit.client.1.vm08.stderr: git switch -c <new-branch-name> 2026-03-10T07:49:36.568 INFO:tasks.workunit.client.1.vm08.stderr: 2026-03-10T07:49:36.569 INFO:tasks.workunit.client.1.vm08.stderr:Or undo this operation with: 2026-03-10T07:49:36.569 INFO:tasks.workunit.client.1.vm08.stderr: 2026-03-10T07:49:36.569 INFO:tasks.workunit.client.1.vm08.stderr: git switch - 2026-03-10T07:49:36.569 INFO:tasks.workunit.client.1.vm08.stderr: 2026-03-10T07:49:36.569 INFO:tasks.workunit.client.1.vm08.stderr:Turn off this advice by setting config variable advice.detachedHead to false 2026-03-10T07:49:36.569 INFO:tasks.workunit.client.1.vm08.stderr: 2026-03-10T07:49:36.569 INFO:tasks.workunit.client.1.vm08.stderr:HEAD is now at 75a68fd8ca3 qa/suites/orch/cephadm/osds: drop nvme_loop task 2026-03-10T07:49:36.573 DEBUG:teuthology.orchestra.run.vm08:> cd -- /home/ubuntu/cephtest/clone.client.1/qa/workunits && if test -e Makefile ; then make ; fi && find -executable -type f -printf '%P\0' >/home/ubuntu/cephtest/workunits.list.client.1 2026-03-10T07:49:36.632 INFO:tasks.workunit.client.1.vm08.stdout:for d in direct_io fs ; do ( cd $d ; make all ) ; done 2026-03-10T07:49:36.633 INFO:tasks.workunit.client.1.vm08.stdout:make[1]: Entering directory '/home/ubuntu/cephtest/clone.client.1/qa/workunits/direct_io' 2026-03-10T07:49:36.633 INFO:tasks.workunit.client.1.vm08.stdout:cc -Wall -Wextra -D_GNU_SOURCE direct_io_test.c -o direct_io_test 2026-03-10T07:49:36.657 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:49:36 vm05.local ceph-mon[50387]: pgmap v113: 65 pgs: 65 active+clean; 460 KiB data, 160 MiB used, 120 GiB / 120 GiB avail 2026-03-10T07:49:36.668 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:49:36 vm08.local ceph-mon[59917]: pgmap v113: 65 pgs: 65 active+clean; 460 KiB data, 160 MiB used, 120 GiB / 120 GiB avail 2026-03-10T07:49:36.680 INFO:tasks.workunit.client.1.vm08.stdout:cc 
-Wall -Wextra -D_GNU_SOURCE test_sync_io.c -o test_sync_io 2026-03-10T07:49:36.723 INFO:tasks.workunit.client.1.vm08.stdout:cc -Wall -Wextra -D_GNU_SOURCE test_short_dio_read.c -o test_short_dio_read 2026-03-10T07:49:36.751 INFO:tasks.workunit.client.1.vm08.stdout:make[1]: Leaving directory '/home/ubuntu/cephtest/clone.client.1/qa/workunits/direct_io' 2026-03-10T07:49:36.753 INFO:tasks.workunit.client.1.vm08.stdout:make[1]: Entering directory '/home/ubuntu/cephtest/clone.client.1/qa/workunits/fs' 2026-03-10T07:49:36.753 INFO:tasks.workunit.client.1.vm08.stdout:cc -Wall -Wextra -D_GNU_SOURCE test_o_trunc.c -o test_o_trunc 2026-03-10T07:49:36.783 INFO:tasks.workunit.client.1.vm08.stdout:make[1]: Leaving directory '/home/ubuntu/cephtest/clone.client.1/qa/workunits/fs' 2026-03-10T07:49:36.786 DEBUG:teuthology.orchestra.run.vm08:> set -ex 2026-03-10T07:49:36.786 DEBUG:teuthology.orchestra.run.vm08:> dd if=/home/ubuntu/cephtest/workunits.list.client.1 of=/dev/stdout 2026-03-10T07:49:36.842 INFO:tasks.workunit:Running workunits matching suites/fsstress.sh on client.1... 2026-03-10T07:49:36.843 INFO:tasks.workunit:Running workunit suites/fsstress.sh... 
2026-03-10T07:49:36.843 DEBUG:teuthology.orchestra.run.vm08:workunit test suites/fsstress.sh> mkdir -p -- /home/ubuntu/cephtest/mnt.1/client.1/tmp && cd -- /home/ubuntu/cephtest/mnt.1/client.1/tmp && CEPH_CLI_TEST_DUP_COMMAND=1 CEPH_REF=75a68fd8ca3f918fe9466b4c0bb385b7fc260a9b TESTDIR="/home/ubuntu/cephtest" CEPH_ARGS="--cluster ceph" CEPH_ID="1" PATH=$PATH:/usr/sbin CEPH_BASE=/home/ubuntu/cephtest/clone.client.1 CEPH_ROOT=/home/ubuntu/cephtest/clone.client.1 CEPH_MNT=/home/ubuntu/cephtest/mnt.1 adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage timeout 3h /home/ubuntu/cephtest/clone.client.1/qa/workunits/suites/fsstress.sh 2026-03-10T07:49:36.913 INFO:tasks.workunit.client.1.vm08.stderr:+ mkdir -p fsstress 2026-03-10T07:49:36.915 INFO:tasks.workunit.client.1.vm08.stderr:+ pushd fsstress 2026-03-10T07:49:36.915 INFO:tasks.workunit.client.1.vm08.stdout:~/cephtest/mnt.1/client.1/tmp/fsstress ~/cephtest/mnt.1/client.1/tmp 2026-03-10T07:49:36.916 INFO:tasks.workunit.client.1.vm08.stderr:+ wget -q -O ltp-full.tgz http://download.ceph.com/qa/ltp-full-20091231.tgz 2026-03-10T07:49:38.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:49:38 vm05.local ceph-mon[50387]: pgmap v114: 65 pgs: 65 active+clean; 460 KiB data, 160 MiB used, 120 GiB / 120 GiB avail 2026-03-10T07:49:39.051 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:49:38 vm08.local ceph-mon[59917]: pgmap v114: 65 pgs: 65 active+clean; 460 KiB data, 160 MiB used, 120 GiB / 120 GiB avail 2026-03-10T07:49:40.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:49:40 vm05.local ceph-mon[50387]: pgmap v115: 65 pgs: 65 active+clean; 460 KiB data, 160 MiB used, 120 GiB / 120 GiB avail 2026-03-10T07:49:41.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:49:40 vm08.local ceph-mon[59917]: pgmap v115: 65 pgs: 65 active+clean; 460 KiB data, 160 MiB used, 120 GiB / 120 GiB avail 2026-03-10T07:49:41.564 INFO:tasks.workunit.client.0.vm05.stderr:Note: switching to 
'75a68fd8ca3f918fe9466b4c0bb385b7fc260a9b'. 2026-03-10T07:49:41.564 INFO:tasks.workunit.client.0.vm05.stderr: 2026-03-10T07:49:41.564 INFO:tasks.workunit.client.0.vm05.stderr:You are in 'detached HEAD' state. You can look around, make experimental 2026-03-10T07:49:41.564 INFO:tasks.workunit.client.0.vm05.stderr:changes and commit them, and you can discard any commits you make in this 2026-03-10T07:49:41.564 INFO:tasks.workunit.client.0.vm05.stderr:state without impacting any branches by switching back to a branch. 2026-03-10T07:49:41.564 INFO:tasks.workunit.client.0.vm05.stderr: 2026-03-10T07:49:41.564 INFO:tasks.workunit.client.0.vm05.stderr:If you want to create a new branch to retain commits you create, you may 2026-03-10T07:49:41.564 INFO:tasks.workunit.client.0.vm05.stderr:do so (now or later) by using -c with the switch command. Example: 2026-03-10T07:49:41.564 INFO:tasks.workunit.client.0.vm05.stderr: 2026-03-10T07:49:41.564 INFO:tasks.workunit.client.0.vm05.stderr: git switch -c <new-branch-name> 2026-03-10T07:49:41.564 INFO:tasks.workunit.client.0.vm05.stderr: 2026-03-10T07:49:41.564 INFO:tasks.workunit.client.0.vm05.stderr:Or undo this operation with: 2026-03-10T07:49:41.564 INFO:tasks.workunit.client.0.vm05.stderr: 2026-03-10T07:49:41.564 INFO:tasks.workunit.client.0.vm05.stderr: git switch - 2026-03-10T07:49:41.564 INFO:tasks.workunit.client.0.vm05.stderr: 2026-03-10T07:49:41.564 INFO:tasks.workunit.client.0.vm05.stderr:Turn off this advice by setting config variable advice.detachedHead to false 2026-03-10T07:49:41.564 INFO:tasks.workunit.client.0.vm05.stderr: 2026-03-10T07:49:41.564 INFO:tasks.workunit.client.0.vm05.stderr:HEAD is now at 75a68fd8ca3 qa/suites/orch/cephadm/osds: drop nvme_loop task 2026-03-10T07:49:41.569 DEBUG:teuthology.orchestra.run.vm05:> cd -- /home/ubuntu/cephtest/clone.client.0/qa/workunits && if test -e Makefile ; then make ; fi && find -executable -type f -printf '%P\0' >/home/ubuntu/cephtest/workunits.list.client.0 2026-03-10T07:49:41.586 
INFO:tasks.workunit.client.0.vm05.stdout:for d in direct_io fs ; do ( cd $d ; make all ) ; done 2026-03-10T07:49:41.588 INFO:tasks.workunit.client.0.vm05.stdout:make[1]: Entering directory '/home/ubuntu/cephtest/clone.client.0/qa/workunits/direct_io' 2026-03-10T07:49:41.588 INFO:tasks.workunit.client.0.vm05.stdout:cc -Wall -Wextra -D_GNU_SOURCE direct_io_test.c -o direct_io_test 2026-03-10T07:49:41.644 INFO:tasks.workunit.client.0.vm05.stdout:cc -Wall -Wextra -D_GNU_SOURCE test_sync_io.c -o test_sync_io 2026-03-10T07:49:41.680 INFO:tasks.workunit.client.0.vm05.stdout:cc -Wall -Wextra -D_GNU_SOURCE test_short_dio_read.c -o test_short_dio_read 2026-03-10T07:49:41.709 INFO:tasks.workunit.client.0.vm05.stdout:make[1]: Leaving directory '/home/ubuntu/cephtest/clone.client.0/qa/workunits/direct_io' 2026-03-10T07:49:41.710 INFO:tasks.workunit.client.0.vm05.stdout:make[1]: Entering directory '/home/ubuntu/cephtest/clone.client.0/qa/workunits/fs' 2026-03-10T07:49:41.710 INFO:tasks.workunit.client.0.vm05.stdout:cc -Wall -Wextra -D_GNU_SOURCE test_o_trunc.c -o test_o_trunc 2026-03-10T07:49:41.740 INFO:tasks.workunit.client.0.vm05.stdout:make[1]: Leaving directory '/home/ubuntu/cephtest/clone.client.0/qa/workunits/fs' 2026-03-10T07:49:41.743 DEBUG:teuthology.orchestra.run.vm05:> set -ex 2026-03-10T07:49:41.743 DEBUG:teuthology.orchestra.run.vm05:> dd if=/home/ubuntu/cephtest/workunits.list.client.0 of=/dev/stdout 2026-03-10T07:49:41.799 INFO:tasks.workunit:Running workunits matching suites/fsstress.sh on client.0... 2026-03-10T07:49:41.800 INFO:tasks.workunit:Running workunit suites/fsstress.sh... 
2026-03-10T07:49:41.800 DEBUG:teuthology.orchestra.run.vm05:workunit test suites/fsstress.sh> mkdir -p -- /home/ubuntu/cephtest/mnt.0/client.0/tmp && cd -- /home/ubuntu/cephtest/mnt.0/client.0/tmp && CEPH_CLI_TEST_DUP_COMMAND=1 CEPH_REF=75a68fd8ca3f918fe9466b4c0bb385b7fc260a9b TESTDIR="/home/ubuntu/cephtest" CEPH_ARGS="--cluster ceph" CEPH_ID="0" PATH=$PATH:/usr/sbin CEPH_BASE=/home/ubuntu/cephtest/clone.client.0 CEPH_ROOT=/home/ubuntu/cephtest/clone.client.0 CEPH_MNT=/home/ubuntu/cephtest/mnt.0 adjust-ulimits ceph-coverage /home/ubuntu/cephtest/archive/coverage timeout 3h /home/ubuntu/cephtest/clone.client.0/qa/workunits/suites/fsstress.sh 2026-03-10T07:49:41.864 INFO:tasks.workunit.client.0.vm05.stderr:+ mkdir -p fsstress 2026-03-10T07:49:41.866 INFO:tasks.workunit.client.0.vm05.stderr:+ pushd fsstress 2026-03-10T07:49:41.867 INFO:tasks.workunit.client.0.vm05.stdout:~/cephtest/mnt.0/client.0/tmp/fsstress ~/cephtest/mnt.0/client.0/tmp 2026-03-10T07:49:41.867 INFO:tasks.workunit.client.0.vm05.stderr:+ wget -q -O ltp-full.tgz http://download.ceph.com/qa/ltp-full-20091231.tgz 2026-03-10T07:49:43.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:49:42 vm05.local ceph-mon[50387]: pgmap v116: 65 pgs: 65 active+clean; 460 KiB data, 160 MiB used, 120 GiB / 120 GiB avail 2026-03-10T07:49:43.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:49:42 vm08.local ceph-mon[59917]: pgmap v116: 65 pgs: 65 active+clean; 460 KiB data, 160 MiB used, 120 GiB / 120 GiB avail 2026-03-10T07:49:44.137 INFO:tasks.workunit.client.0.vm05.stderr:+ tar xzf ltp-full.tgz 2026-03-10T07:49:44.361 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:49:44 vm08.local ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T07:49:44.657 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:49:44 vm05.local ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' 
cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T07:49:44.928 INFO:tasks.workunit.client.1.vm08.stderr:+ tar xzf ltp-full.tgz 2026-03-10T07:49:45.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:49:45 vm05.local ceph-mon[50387]: pgmap v117: 65 pgs: 65 active+clean; 471 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 1.1 KiB/s wr, 0 op/s 2026-03-10T07:49:45.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:49:45 vm08.local ceph-mon[59917]: pgmap v117: 65 pgs: 65 active+clean; 471 KiB data, 160 MiB used, 120 GiB / 120 GiB avail; 1.1 KiB/s wr, 0 op/s 2026-03-10T07:49:46.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:49:46 vm05.local ceph-mon[50387]: pgmap v118: 65 pgs: 65 active+clean; 4.5 MiB data, 164 MiB used, 120 GiB / 120 GiB avail; 0 B/s rd, 348 KiB/s wr, 1 op/s 2026-03-10T07:49:46.919 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:49:46 vm08.local ceph-mon[59917]: pgmap v118: 65 pgs: 65 active+clean; 4.5 MiB data, 164 MiB used, 120 GiB / 120 GiB avail; 0 B/s rd, 348 KiB/s wr, 1 op/s 2026-03-10T07:49:48.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:49:48 vm05.local ceph-mon[50387]: pgmap v119: 65 pgs: 65 active+clean; 20 MiB data, 194 MiB used, 120 GiB / 120 GiB avail; 0 B/s rd, 1.7 MiB/s wr, 28 op/s 2026-03-10T07:49:48.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:49:48 vm08.local ceph-mon[59917]: pgmap v119: 65 pgs: 65 active+clean; 20 MiB data, 194 MiB used, 120 GiB / 120 GiB avail; 0 B/s rd, 1.7 MiB/s wr, 28 op/s 2026-03-10T07:49:51.407 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:49:50 vm05.local ceph-mon[50387]: pgmap v120: 65 pgs: 65 active+clean; 30 MiB data, 224 MiB used, 120 GiB / 120 GiB avail; 0 B/s rd, 2.5 MiB/s wr, 55 op/s 2026-03-10T07:49:51.418 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:49:50 vm08.local ceph-mon[59917]: pgmap v120: 65 pgs: 65 active+clean; 30 MiB data, 224 MiB used, 120 GiB / 120 GiB avail; 0 B/s rd, 2.5 MiB/s wr, 55 op/s 
2026-03-10T07:49:51.727 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:51.725+0000 7ff1c6766700 1 -- 192.168.123.105:0/4054602531 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7ff1c00ffed0 msgr2=0x7ff1c0100350 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:49:51.727 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:51.725+0000 7ff1c6766700 1 --2- 192.168.123.105:0/4054602531 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7ff1c00ffed0 0x7ff1c0100350 secure :-1 s=READY pgs=63 cs=0 l=1 rev1=1 crypto rx=0x7ff1b0009a60 tx=0x7ff1b0009d70 comp rx=0 tx=0).stop 2026-03-10T07:49:51.728 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:51.727+0000 7ff1c6766700 1 -- 192.168.123.105:0/4054602531 shutdown_connections 2026-03-10T07:49:51.728 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:51.727+0000 7ff1c6766700 1 --2- 192.168.123.105:0/4054602531 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7ff1c00ffed0 0x7ff1c0100350 unknown :-1 s=CLOSED pgs=63 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:49:51.728 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:51.727+0000 7ff1c6766700 1 --2- 192.168.123.105:0/4054602531 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff1c00ff570 0x7ff1c00ff990 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:49:51.728 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:51.727+0000 7ff1c6766700 1 -- 192.168.123.105:0/4054602531 >> 192.168.123.105:0/4054602531 conn(0x7ff1c00fb110 msgr2=0x7ff1c00fd570 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:49:51.729 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:51.727+0000 7ff1c6766700 1 -- 192.168.123.105:0/4054602531 shutdown_connections 2026-03-10T07:49:51.729 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:51.727+0000 7ff1c6766700 1 -- 192.168.123.105:0/4054602531 
wait complete. 2026-03-10T07:49:51.729 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:51.728+0000 7ff1c6766700 1 Processor -- start 2026-03-10T07:49:51.729 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:51.728+0000 7ff1c6766700 1 -- start start 2026-03-10T07:49:51.730 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:51.728+0000 7ff1c6766700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7ff1c00ff570 0x7ff1c01050f0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:49:51.730 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:51.728+0000 7ff1c6766700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff1c00ffed0 0x7ff1c0105630 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:49:51.730 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:51.728+0000 7ff1c6766700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7ff1c0101be0 con 0x7ff1c00ffed0 2026-03-10T07:49:51.730 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:51.728+0000 7ff1c6766700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7ff1c0101d50 con 0x7ff1c00ff570 2026-03-10T07:49:51.730 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:51.728+0000 7ff1c5764700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7ff1c00ff570 0x7ff1c01050f0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:49:51.730 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:51.728+0000 7ff1c5764700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7ff1c00ff570 0x7ff1c01050f0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.108:3300/0 
says I am v2:192.168.123.105:48090/0 (socket says 192.168.123.105:48090) 2026-03-10T07:49:51.730 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:51.728+0000 7ff1c5764700 1 -- 192.168.123.105:0/1116294096 learned_addr learned my addr 192.168.123.105:0/1116294096 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T07:49:51.730 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:51.729+0000 7ff1c5764700 1 -- 192.168.123.105:0/1116294096 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff1c00ffed0 msgr2=0x7ff1c0105630 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:49:51.730 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:51.729+0000 7ff1c5764700 1 --2- 192.168.123.105:0/1116294096 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff1c00ffed0 0x7ff1c0105630 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:49:51.730 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:51.729+0000 7ff1c5764700 1 -- 192.168.123.105:0/1116294096 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7ff1b0009710 con 0x7ff1c00ff570 2026-03-10T07:49:51.730 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:51.729+0000 7ff1c5764700 1 --2- 192.168.123.105:0/1116294096 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7ff1c00ff570 0x7ff1c01050f0 secure :-1 s=READY pgs=64 cs=0 l=1 rev1=1 crypto rx=0x7ff1bc00ea30 tx=0x7ff1bc00edf0 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:49:51.731 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:51.730+0000 7ff1b67fc700 1 -- 192.168.123.105:0/1116294096 <== mon.1 v2:192.168.123.108:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7ff1bc00cc40 con 0x7ff1c00ff570 2026-03-10T07:49:51.731 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:51.730+0000 
7ff1c6766700 1 -- 192.168.123.105:0/1116294096 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7ff1c0102030 con 0x7ff1c00ff570 2026-03-10T07:49:51.731 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:51.730+0000 7ff1c6766700 1 -- 192.168.123.105:0/1116294096 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7ff1c0102580 con 0x7ff1c00ff570 2026-03-10T07:49:51.733 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:51.730+0000 7ff1b67fc700 1 -- 192.168.123.105:0/1116294096 <== mon.1 v2:192.168.123.108:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7ff1bc00cda0 con 0x7ff1c00ff570 2026-03-10T07:49:51.733 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:51.731+0000 7ff1b67fc700 1 -- 192.168.123.105:0/1116294096 <== mon.1 v2:192.168.123.108:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7ff1bc010430 con 0x7ff1c00ff570 2026-03-10T07:49:51.733 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:51.731+0000 7ff1b67fc700 1 -- 192.168.123.105:0/1116294096 <== mon.1 v2:192.168.123.108:3300/0 4 ==== mgrmap(e 19) v1 ==== 90308+0+0 (secure 0 0 0) 0x7ff1bc0106b0 con 0x7ff1c00ff570 2026-03-10T07:49:51.733 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:51.731+0000 7ff1c6766700 1 -- 192.168.123.105:0/1116294096 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7ff1a4005320 con 0x7ff1c00ff570 2026-03-10T07:49:51.733 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:51.731+0000 7ff1b67fc700 1 --2- 192.168.123.105:0/1116294096 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7ff1ac06c320 0x7ff1ac06e7e0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:49:51.733 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:51.731+0000 7ff1b67fc700 1 -- 
192.168.123.105:0/1116294096 <== mon.1 v2:192.168.123.108:3300/0 5 ==== osd_map(39..39 src has 1..39) v4 ==== 5362+0+0 (secure 0 0 0) 0x7ff1bc014070 con 0x7ff1c00ff570 2026-03-10T07:49:51.733 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:51.732+0000 7ff1c4f63700 1 --2- 192.168.123.105:0/1116294096 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7ff1ac06c320 0x7ff1ac06e7e0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:49:51.733 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:51.732+0000 7ff1c4f63700 1 --2- 192.168.123.105:0/1116294096 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7ff1ac06c320 0x7ff1ac06e7e0 secure :-1 s=READY pgs=128 cs=0 l=1 rev1=1 crypto rx=0x7ff1b0009a60 tx=0x7ff1b000b540 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:49:51.735 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:51.734+0000 7ff1b67fc700 1 -- 192.168.123.105:0/1116294096 <== mon.1 v2:192.168.123.108:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7ff1bc05b060 con 0x7ff1c00ff570 2026-03-10T07:49:51.877 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:51.875+0000 7ff1c6766700 1 -- 192.168.123.105:0/1116294096 --> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7ff1a4000bf0 con 0x7ff1ac06c320 2026-03-10T07:49:51.880 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:51.877+0000 7ff1b67fc700 1 -- 192.168.123.105:0/1116294096 <== mgr.14223 v2:192.168.123.105:6800/2 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+434 (secure 0 0 0) 0x7ff1a4000bf0 con 0x7ff1ac06c320 2026-03-10T07:49:51.880 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:51.879+0000 7ff1abfff700 1 -- 
192.168.123.105:0/1116294096 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7ff1ac06c320 msgr2=0x7ff1ac06e7e0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:49:51.880 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:51.879+0000 7ff1abfff700 1 --2- 192.168.123.105:0/1116294096 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7ff1ac06c320 0x7ff1ac06e7e0 secure :-1 s=READY pgs=128 cs=0 l=1 rev1=1 crypto rx=0x7ff1b0009a60 tx=0x7ff1b000b540 comp rx=0 tx=0).stop 2026-03-10T07:49:51.880 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:51.879+0000 7ff1abfff700 1 -- 192.168.123.105:0/1116294096 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7ff1c00ff570 msgr2=0x7ff1c01050f0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:49:51.880 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:51.879+0000 7ff1abfff700 1 --2- 192.168.123.105:0/1116294096 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7ff1c00ff570 0x7ff1c01050f0 secure :-1 s=READY pgs=64 cs=0 l=1 rev1=1 crypto rx=0x7ff1bc00ea30 tx=0x7ff1bc00edf0 comp rx=0 tx=0).stop 2026-03-10T07:49:51.881 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:51.879+0000 7ff1abfff700 1 -- 192.168.123.105:0/1116294096 shutdown_connections 2026-03-10T07:49:51.881 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:51.879+0000 7ff1abfff700 1 --2- 192.168.123.105:0/1116294096 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7ff1ac06c320 0x7ff1ac06e7e0 unknown :-1 s=CLOSED pgs=128 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:49:51.881 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:51.879+0000 7ff1abfff700 1 --2- 192.168.123.105:0/1116294096 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7ff1c00ff570 0x7ff1c01050f0 unknown :-1 s=CLOSED pgs=64 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:49:51.881 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:51.879+0000 7ff1abfff700 1 --2- 192.168.123.105:0/1116294096 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff1c00ffed0 0x7ff1c0105630 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:49:51.881 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:51.879+0000 7ff1abfff700 1 -- 192.168.123.105:0/1116294096 >> 192.168.123.105:0/1116294096 conn(0x7ff1c00fb110 msgr2=0x7ff1c0107ba0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:49:51.881 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:51.879+0000 7ff1abfff700 1 -- 192.168.123.105:0/1116294096 shutdown_connections 2026-03-10T07:49:51.881 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:51.879+0000 7ff1abfff700 1 -- 192.168.123.105:0/1116294096 wait complete. 2026-03-10T07:49:51.889 INFO:teuthology.orchestra.run.vm05.stdout:true 2026-03-10T07:49:52.392 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:52.389+0000 7f05d9bc9700 1 -- 192.168.123.105:0/1278723870 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f05d4075c80 msgr2=0x7f05d4078110 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:49:52.393 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:52.389+0000 7f05d9bc9700 1 --2- 192.168.123.105:0/1278723870 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f05d4075c80 0x7f05d4078110 secure :-1 s=READY pgs=293 cs=0 l=1 rev1=1 crypto rx=0x7f05cc00d3f0 tx=0x7f05cc00d700 comp rx=0 tx=0).stop 2026-03-10T07:49:52.393 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:52.390+0000 7f05d9bc9700 1 -- 192.168.123.105:0/1278723870 shutdown_connections 2026-03-10T07:49:52.393 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:52.390+0000 7f05d9bc9700 1 --2- 192.168.123.105:0/1278723870 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f05d4075c80 0x7f05d4078110 unknown :-1 s=CLOSED 
pgs=293 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:49:52.393 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:52.390+0000 7f05d9bc9700 1 --2- 192.168.123.105:0/1278723870 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f05d4072d90 0x7f05d40731b0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:49:52.393 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:52.390+0000 7f05d9bc9700 1 -- 192.168.123.105:0/1278723870 >> 192.168.123.105:0/1278723870 conn(0x7f05d406dda0 msgr2=0x7f05d4070220 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:49:52.393 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:52.390+0000 7f05d9bc9700 1 -- 192.168.123.105:0/1278723870 shutdown_connections 2026-03-10T07:49:52.393 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:52.390+0000 7f05d9bc9700 1 -- 192.168.123.105:0/1278723870 wait complete. 2026-03-10T07:49:52.393 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:52.390+0000 7f05d9bc9700 1 Processor -- start 2026-03-10T07:49:52.393 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:52.390+0000 7f05d9bc9700 1 -- start start 2026-03-10T07:49:52.393 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:52.390+0000 7f05d9bc9700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f05d4072d90 0x7f05d412bdb0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:49:52.393 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:52.390+0000 7f05d9bc9700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f05d4083bb0 0x7f05d412e300 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:49:52.393 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:52.390+0000 7f05d9bc9700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 
0x7f05d412e8d0 con 0x7f05d4083bb0 2026-03-10T07:49:52.393 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:52.390+0000 7f05d9bc9700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f05d412ea40 con 0x7f05d4072d90 2026-03-10T07:49:52.393 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:52.391+0000 7f05d37fe700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f05d4072d90 0x7f05d412bdb0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:49:52.394 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:52.391+0000 7f05d37fe700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f05d4072d90 0x7f05d412bdb0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.108:3300/0 says I am v2:192.168.123.105:48106/0 (socket says 192.168.123.105:48106) 2026-03-10T07:49:52.394 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:52.391+0000 7f05d37fe700 1 -- 192.168.123.105:0/1123163620 learned_addr learned my addr 192.168.123.105:0/1123163620 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T07:49:52.394 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:52.391+0000 7f05d37fe700 1 -- 192.168.123.105:0/1123163620 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f05d4083bb0 msgr2=0x7f05d412e300 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:49:52.394 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:52.391+0000 7f05d37fe700 1 --2- 192.168.123.105:0/1123163620 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f05d4083bb0 0x7f05d412e300 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:49:52.394 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:52.392+0000 7f05d37fe700 
1 -- 192.168.123.105:0/1123163620 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f05cc007ed0 con 0x7f05d4072d90 2026-03-10T07:49:52.396 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:52.392+0000 7f05d37fe700 1 --2- 192.168.123.105:0/1123163620 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f05d4072d90 0x7f05d412bdb0 secure :-1 s=READY pgs=65 cs=0 l=1 rev1=1 crypto rx=0x7f05c400eb10 tx=0x7f05c400eed0 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:49:52.398 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:52.392+0000 7f05d0ff9700 1 -- 192.168.123.105:0/1123163620 <== mon.1 v2:192.168.123.108:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f05c400cca0 con 0x7f05d4072d90 2026-03-10T07:49:52.398 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:52.392+0000 7f05d0ff9700 1 -- 192.168.123.105:0/1123163620 <== mon.1 v2:192.168.123.108:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f05c400ce00 con 0x7f05d4072d90 2026-03-10T07:49:52.398 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:52.393+0000 7f05d0ff9700 1 -- 192.168.123.105:0/1123163620 <== mon.1 v2:192.168.123.108:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f05c4018910 con 0x7f05d4072d90 2026-03-10T07:49:52.398 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:52.394+0000 7f05d9bc9700 1 -- 192.168.123.105:0/1123163620 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f05d412ed20 con 0x7f05d4072d90 2026-03-10T07:49:52.398 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:52.394+0000 7f05d9bc9700 1 -- 192.168.123.105:0/1123163620 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f05d412f1f0 con 0x7f05d4072d90 2026-03-10T07:49:52.398 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:52.395+0000 7f05d9bc9700 1 -- 192.168.123.105:0/1123163620 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f05d407d560 con 0x7f05d4072d90 2026-03-10T07:49:52.886 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:52.884+0000 7f05d0ff9700 1 -- 192.168.123.105:0/1123163620 <== mon.1 v2:192.168.123.108:3300/0 4 ==== mgrmap(e 19) v1 ==== 90308+0+0 (secure 0 0 0) 0x7f05c4018a70 con 0x7f05d4072d90 2026-03-10T07:49:52.886 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:52.884+0000 7f05d0ff9700 1 --2- 192.168.123.105:0/1123163620 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f05bc06c2e0 0x7f05bc06e7a0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:49:52.886 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:52.884+0000 7f05d0ff9700 1 -- 192.168.123.105:0/1123163620 <== mon.1 v2:192.168.123.108:3300/0 5 ==== osd_map(39..39 src has 1..39) v4 ==== 5362+0+0 (secure 0 0 0) 0x7f05c4014070 con 0x7f05d4072d90 2026-03-10T07:49:52.886 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:52.885+0000 7f05d2ffd700 1 --2- 192.168.123.105:0/1123163620 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f05bc06c2e0 0x7f05bc06e7a0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:49:52.888 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:52.886+0000 7f05d0ff9700 1 -- 192.168.123.105:0/1123163620 <== mon.1 v2:192.168.123.108:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7f05c405a6c0 con 0x7f05d4072d90 2026-03-10T07:49:52.896 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:52.894+0000 7f05d2ffd700 1 --2- 192.168.123.105:0/1123163620 >> 
[v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f05bc06c2e0 0x7f05bc06e7a0 secure :-1 s=READY pgs=129 cs=0 l=1 rev1=1 crypto rx=0x7f05cc00dad0 tx=0x7f05cc006040 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:49:53.039 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:53.036+0000 7f05d9bc9700 1 -- 192.168.123.105:0/1123163620 --> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7f05d402cec0 con 0x7f05bc06c2e0 2026-03-10T07:49:53.040 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:53.039+0000 7f05d0ff9700 1 -- 192.168.123.105:0/1123163620 <== mgr.14223 v2:192.168.123.105:6800/2 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+434 (secure 0 0 0) 0x7f05d402cec0 con 0x7f05bc06c2e0 2026-03-10T07:49:53.043 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:53.042+0000 7f05ba7fc700 1 -- 192.168.123.105:0/1123163620 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f05bc06c2e0 msgr2=0x7f05bc06e7a0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:49:53.044 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:53.042+0000 7f05ba7fc700 1 --2- 192.168.123.105:0/1123163620 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f05bc06c2e0 0x7f05bc06e7a0 secure :-1 s=READY pgs=129 cs=0 l=1 rev1=1 crypto rx=0x7f05cc00dad0 tx=0x7f05cc006040 comp rx=0 tx=0).stop 2026-03-10T07:49:53.044 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:53.042+0000 7f05ba7fc700 1 -- 192.168.123.105:0/1123163620 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f05d4072d90 msgr2=0x7f05d412bdb0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:49:53.044 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:53.042+0000 7f05ba7fc700 1 --2- 192.168.123.105:0/1123163620 >> 
[v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f05d4072d90 0x7f05d412bdb0 secure :-1 s=READY pgs=65 cs=0 l=1 rev1=1 crypto rx=0x7f05c400eb10 tx=0x7f05c400eed0 comp rx=0 tx=0).stop 2026-03-10T07:49:53.044 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:53.043+0000 7f05ba7fc700 1 -- 192.168.123.105:0/1123163620 shutdown_connections 2026-03-10T07:49:53.044 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:53.043+0000 7f05ba7fc700 1 --2- 192.168.123.105:0/1123163620 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f05bc06c2e0 0x7f05bc06e7a0 unknown :-1 s=CLOSED pgs=129 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:49:53.044 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:53.043+0000 7f05ba7fc700 1 --2- 192.168.123.105:0/1123163620 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f05d4072d90 0x7f05d412bdb0 unknown :-1 s=CLOSED pgs=65 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:49:53.044 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:53.043+0000 7f05ba7fc700 1 --2- 192.168.123.105:0/1123163620 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f05d4083bb0 0x7f05d412e300 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:49:53.044 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:53.043+0000 7f05ba7fc700 1 -- 192.168.123.105:0/1123163620 >> 192.168.123.105:0/1123163620 conn(0x7f05d406dda0 msgr2=0x7f05d4077560 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:49:53.045 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:53.043+0000 7f05ba7fc700 1 -- 192.168.123.105:0/1123163620 shutdown_connections 2026-03-10T07:49:53.045 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:53.044+0000 7f05ba7fc700 1 -- 192.168.123.105:0/1123163620 wait complete. 
2026-03-10T07:49:53.147 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:53.146+0000 7fccb56b7700 1 -- 192.168.123.105:0/3985525749 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fccb0075a10 msgr2=0x7fccb0077ea0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:49:53.148 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:49:52 vm05.local ceph-mon[50387]: pgmap v121: 65 pgs: 65 active+clean; 33 MiB data, 254 MiB used, 120 GiB / 120 GiB avail; 0 B/s rd, 2.7 MiB/s wr, 89 op/s 2026-03-10T07:49:53.148 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:53.146+0000 7fccb56b7700 1 --2- 192.168.123.105:0/3985525749 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fccb0075a10 0x7fccb0077ea0 secure :-1 s=READY pgs=294 cs=0 l=1 rev1=1 crypto rx=0x7fcca8009230 tx=0x7fcca8009260 comp rx=0 tx=0).stop 2026-03-10T07:49:53.148 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:53.146+0000 7fccb56b7700 1 -- 192.168.123.105:0/3985525749 shutdown_connections 2026-03-10T07:49:53.148 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:53.146+0000 7fccb56b7700 1 --2- 192.168.123.105:0/3985525749 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fccb0075a10 0x7fccb0077ea0 unknown :-1 s=CLOSED pgs=294 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:49:53.148 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:53.146+0000 7fccb56b7700 1 --2- 192.168.123.105:0/3985525749 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fccb0072b20 0x7fccb0072f40 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:49:53.148 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:53.146+0000 7fccb56b7700 1 -- 192.168.123.105:0/3985525749 >> 192.168.123.105:0/3985525749 conn(0x7fccb006daa0 msgr2=0x7fccb006ff00 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:49:53.149 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:53.147+0000 7fccb56b7700 1 -- 192.168.123.105:0/3985525749 shutdown_connections 2026-03-10T07:49:53.149 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:53.147+0000 7fccb56b7700 1 -- 192.168.123.105:0/3985525749 wait complete. 2026-03-10T07:49:53.149 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:53.147+0000 7fccb56b7700 1 Processor -- start 2026-03-10T07:49:53.149 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:53.147+0000 7fccb56b7700 1 -- start start 2026-03-10T07:49:53.149 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:53.147+0000 7fccb56b7700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fccb0072b20 0x7fccb0081520 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:49:53.149 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:53.147+0000 7fccb56b7700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fccb0081a60 0x7fccb012e170 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:49:53.149 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:53.147+0000 7fccb56b7700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fccb0081ee0 con 0x7fccb0072b20 2026-03-10T07:49:53.149 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:53.147+0000 7fccb56b7700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fccb0082050 con 0x7fccb0081a60 2026-03-10T07:49:53.150 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:53.147+0000 7fccaf7fe700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fccb0081a60 0x7fccb012e170 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:49:53.150 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:53.147+0000 7fccaf7fe700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fccb0081a60 0x7fccb012e170 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.108:3300/0 says I am v2:192.168.123.105:48118/0 (socket says 192.168.123.105:48118) 2026-03-10T07:49:53.150 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:53.147+0000 7fccaf7fe700 1 -- 192.168.123.105:0/1336770476 learned_addr learned my addr 192.168.123.105:0/1336770476 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T07:49:53.150 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:53.147+0000 7fccaffff700 1 --2- 192.168.123.105:0/1336770476 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fccb0072b20 0x7fccb0081520 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:49:53.150 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:53.148+0000 7fccaf7fe700 1 -- 192.168.123.105:0/1336770476 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fccb0072b20 msgr2=0x7fccb0081520 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:49:53.150 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:53.148+0000 7fccaf7fe700 1 --2- 192.168.123.105:0/1336770476 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fccb0072b20 0x7fccb0081520 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:49:53.150 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:53.148+0000 7fccaf7fe700 1 -- 192.168.123.105:0/1336770476 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fcca8008ee0 con 0x7fccb0081a60 2026-03-10T07:49:53.150 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:53.148+0000 7fccaf7fe700 1 --2- 192.168.123.105:0/1336770476 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fccb0081a60 0x7fccb012e170 secure :-1 s=READY pgs=66 cs=0 l=1 rev1=1 crypto rx=0x7fcca80076d0 tx=0x7fcca800ee70 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:49:53.150 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:53.149+0000 7fccad7fa700 1 -- 192.168.123.105:0/1336770476 <== mon.1 v2:192.168.123.108:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fcca8013070 con 0x7fccb0081a60 2026-03-10T07:49:53.152 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:53.150+0000 7fccb56b7700 1 -- 192.168.123.105:0/1336770476 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fccb012e6b0 con 0x7fccb0081a60 2026-03-10T07:49:53.152 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:53.150+0000 7fccb56b7700 1 -- 192.168.123.105:0/1336770476 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fccb012eb50 con 0x7fccb0081a60 2026-03-10T07:49:53.153 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:53.151+0000 7fccad7fa700 1 -- 192.168.123.105:0/1336770476 <== mon.1 v2:192.168.123.108:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fcca8003e80 con 0x7fccb0081a60 2026-03-10T07:49:53.153 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:53.151+0000 7fccad7fa700 1 -- 192.168.123.105:0/1336770476 <== mon.1 v2:192.168.123.108:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fcca80057b0 con 0x7fccb0081a60 2026-03-10T07:49:53.153 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:53.151+0000 7fccb56b7700 1 -- 192.168.123.105:0/1336770476 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 
0x7fccb007b730 con 0x7fccb0081a60 2026-03-10T07:49:53.154 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:53.152+0000 7fccad7fa700 1 -- 192.168.123.105:0/1336770476 <== mon.1 v2:192.168.123.108:3300/0 4 ==== mgrmap(e 19) v1 ==== 90308+0+0 (secure 0 0 0) 0x7fcca8005910 con 0x7fccb0081a60 2026-03-10T07:49:53.154 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:53.153+0000 7fccad7fa700 1 --2- 192.168.123.105:0/1336770476 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fcc9806c600 0x7fcc9806eac0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:49:53.154 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:53.153+0000 7fccaffff700 1 --2- 192.168.123.105:0/1336770476 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fcc9806c600 0x7fcc9806eac0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:49:53.154 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:53.153+0000 7fccad7fa700 1 -- 192.168.123.105:0/1336770476 <== mon.1 v2:192.168.123.108:3300/0 5 ==== osd_map(39..39 src has 1..39) v4 ==== 5362+0+0 (secure 0 0 0) 0x7fcca808d080 con 0x7fccb0081a60 2026-03-10T07:49:53.158 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:53.156+0000 7fccaffff700 1 --2- 192.168.123.105:0/1336770476 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fcc9806c600 0x7fcc9806eac0 secure :-1 s=READY pgs=130 cs=0 l=1 rev1=1 crypto rx=0x7fcca000ac50 tx=0x7fcca000a380 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:49:53.158 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:53.156+0000 7fccad7fa700 1 -- 192.168.123.105:0/1336770476 <== mon.1 v2:192.168.123.108:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7fcca8020080 con 
0x7fccb0081a60 2026-03-10T07:49:53.342 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:53.340+0000 7fccb56b7700 1 -- 192.168.123.105:0/1336770476 --> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] -- mgr_command(tid 0: {"prefix": "orch ps", "target": ["mon-mgr", ""]}) v1 -- 0x7fccb00611d0 con 0x7fcc9806c600 2026-03-10T07:49:53.352 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:53.351+0000 7fccad7fa700 1 -- 192.168.123.105:0/1336770476 <== mgr.14223 v2:192.168.123.105:6800/2 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+3288 (secure 0 0 0) 0x7fccb00611d0 con 0x7fcc9806c600 2026-03-10T07:49:53.353 INFO:teuthology.orchestra.run.vm05.stdout:NAME HOST PORTS STATUS REFRESHED AGE MEM USE MEM LIM VERSION IMAGE ID CONTAINER ID 2026-03-10T07:49:53.353 INFO:teuthology.orchestra.run.vm05.stdout:alertmanager.vm05 vm05 *:9093,9094 running (2m) 81s ago 3m 24.7M - 0.25.0 c8568f914cd2 f87529717116 2026-03-10T07:49:53.353 INFO:teuthology.orchestra.run.vm05.stdout:ceph-exporter.vm05 vm05 running (3m) 81s ago 3m 8120k - 18.2.1 5be31c24972a 26c4db858175 2026-03-10T07:49:53.353 INFO:teuthology.orchestra.run.vm05.stdout:ceph-exporter.vm08 vm08 running (3m) 82s ago 3m 8216k - 18.2.1 5be31c24972a 209e2398a09c 2026-03-10T07:49:53.353 INFO:teuthology.orchestra.run.vm05.stdout:crash.vm05 vm05 running (3m) 81s ago 3m 7419k - 18.2.1 5be31c24972a d3d7b92c8ac3 2026-03-10T07:49:53.353 INFO:teuthology.orchestra.run.vm05.stdout:crash.vm08 vm08 running (3m) 82s ago 3m 7415k - 18.2.1 5be31c24972a 96136e0195f7 2026-03-10T07:49:53.353 INFO:teuthology.orchestra.run.vm05.stdout:grafana.vm05 vm05 *:3000 running (2m) 81s ago 3m 80.7M - 9.4.7 954c08fa6188 35089be30fc6 2026-03-10T07:49:53.353 INFO:teuthology.orchestra.run.vm05.stdout:mds.cephfs.vm05.omfhnh vm05 running (87s) 81s ago 87s 16.1M - 18.2.1 5be31c24972a e23de179e09c 2026-03-10T07:49:53.353 INFO:teuthology.orchestra.run.vm05.stdout:mds.cephfs.vm05.pavqil vm05 running (85s) 81s ago 85s 13.3M - 18.2.1 5be31c24972a 
5b9e5afa214c 2026-03-10T07:49:53.353 INFO:teuthology.orchestra.run.vm05.stdout:mds.cephfs.vm08.dgsaon vm08 running (85s) 82s ago 84s 13.9M - 18.2.1 5be31c24972a 1696aee522b5 2026-03-10T07:49:53.353 INFO:teuthology.orchestra.run.vm05.stdout:mds.cephfs.vm08.ybmbgd vm08 running (87s) 82s ago 87s 10.9M - 18.2.1 5be31c24972a 30b0e51cd2ed 2026-03-10T07:49:53.353 INFO:teuthology.orchestra.run.vm05.stdout:mgr.vm05.blexke vm05 *:9283,8765,8443 running (4m) 81s ago 4m 501M - 18.2.1 5be31c24972a 4af6d7f6e0f4 2026-03-10T07:49:53.353 INFO:teuthology.orchestra.run.vm05.stdout:mgr.vm08.orfpog vm08 *:8443,9283,8765 running (3m) 82s ago 3m 448M - 18.2.1 5be31c24972a 7b89b610a4ab 2026-03-10T07:49:53.353 INFO:teuthology.orchestra.run.vm05.stdout:mon.vm05 vm05 running (4m) 81s ago 4m 50.4M 2048M 18.2.1 5be31c24972a 2a459bf05146 2026-03-10T07:49:53.353 INFO:teuthology.orchestra.run.vm05.stdout:mon.vm08 vm08 running (2m) 82s ago 2m 45.2M 2048M 18.2.1 5be31c24972a e01dfb712474 2026-03-10T07:49:53.353 INFO:teuthology.orchestra.run.vm05.stdout:node-exporter.vm05 vm05 *:9100 running (3m) 81s ago 3m 14.0M - 1.5.0 0da6a335fe13 cb6188e5fa06 2026-03-10T07:49:53.353 INFO:teuthology.orchestra.run.vm05.stdout:node-exporter.vm08 vm08 *:9100 running (3m) 82s ago 3m 15.9M - 1.5.0 0da6a335fe13 f73da8e379d9 2026-03-10T07:49:53.353 INFO:teuthology.orchestra.run.vm05.stdout:osd.0 vm05 running (2m) 81s ago 2m 46.6M 4096M 18.2.1 5be31c24972a 9b7c5ea48cea 2026-03-10T07:49:53.353 INFO:teuthology.orchestra.run.vm05.stdout:osd.1 vm05 running (2m) 81s ago 2m 50.0M 4096M 18.2.1 5be31c24972a 88e0b65b2c93 2026-03-10T07:49:53.353 INFO:teuthology.orchestra.run.vm05.stdout:osd.2 vm05 running (2m) 81s ago 2m 46.8M 4096M 18.2.1 5be31c24972a b8d3cbeb49f1 2026-03-10T07:49:53.353 INFO:teuthology.orchestra.run.vm05.stdout:osd.3 vm08 running (2m) 82s ago 2m 46.8M 4096M 18.2.1 5be31c24972a 0a62c54a86c0 2026-03-10T07:49:53.353 INFO:teuthology.orchestra.run.vm05.stdout:osd.4 vm08 running (2m) 82s ago 2m 43.3M 4096M 18.2.1 
5be31c24972a bd748b691ccd 2026-03-10T07:49:53.353 INFO:teuthology.orchestra.run.vm05.stdout:osd.5 vm08 running (113s) 82s ago 113s 43.0M 4096M 18.2.1 5be31c24972a 9f08820ae98b 2026-03-10T07:49:53.353 INFO:teuthology.orchestra.run.vm05.stdout:prometheus.vm05 vm05 *:9095 running (2m) 81s ago 3m 37.0M - 2.43.0 a07b618ecd1d bcb499ab4929 2026-03-10T07:49:53.356 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:53.354+0000 7fcc96ffd700 1 -- 192.168.123.105:0/1336770476 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fcc9806c600 msgr2=0x7fcc9806eac0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:49:53.356 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:53.354+0000 7fcc96ffd700 1 --2- 192.168.123.105:0/1336770476 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fcc9806c600 0x7fcc9806eac0 secure :-1 s=READY pgs=130 cs=0 l=1 rev1=1 crypto rx=0x7fcca000ac50 tx=0x7fcca000a380 comp rx=0 tx=0).stop 2026-03-10T07:49:53.356 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:53.354+0000 7fcc96ffd700 1 -- 192.168.123.105:0/1336770476 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fccb0081a60 msgr2=0x7fccb012e170 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:49:53.356 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:53.354+0000 7fcc96ffd700 1 --2- 192.168.123.105:0/1336770476 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fccb0081a60 0x7fccb012e170 secure :-1 s=READY pgs=66 cs=0 l=1 rev1=1 crypto rx=0x7fcca80076d0 tx=0x7fcca800ee70 comp rx=0 tx=0).stop 2026-03-10T07:49:53.356 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:53.355+0000 7fcc96ffd700 1 -- 192.168.123.105:0/1336770476 shutdown_connections 2026-03-10T07:49:53.356 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:53.355+0000 7fcc96ffd700 1 --2- 192.168.123.105:0/1336770476 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7fcc9806c600 
0x7fcc9806eac0 unknown :-1 s=CLOSED pgs=130 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:49:53.357 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:53.355+0000 7fcc96ffd700 1 --2- 192.168.123.105:0/1336770476 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fccb0072b20 0x7fccb0081520 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:49:53.357 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:53.355+0000 7fcc96ffd700 1 --2- 192.168.123.105:0/1336770476 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fccb0081a60 0x7fccb012e170 unknown :-1 s=CLOSED pgs=66 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:49:53.357 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:53.355+0000 7fcc96ffd700 1 -- 192.168.123.105:0/1336770476 >> 192.168.123.105:0/1336770476 conn(0x7fccb006daa0 msgr2=0x7fccb006ff00 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:49:53.357 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:53.355+0000 7fcc96ffd700 1 -- 192.168.123.105:0/1336770476 shutdown_connections 2026-03-10T07:49:53.357 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:53.355+0000 7fcc96ffd700 1 -- 192.168.123.105:0/1336770476 wait complete. 
2026-03-10T07:49:53.418 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:49:52 vm08.local ceph-mon[59917]: pgmap v121: 65 pgs: 65 active+clean; 33 MiB data, 254 MiB used, 120 GiB / 120 GiB avail; 0 B/s rd, 2.7 MiB/s wr, 89 op/s 2026-03-10T07:49:53.481 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:53.480+0000 7f90a9119700 1 -- 192.168.123.105:0/3137794969 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f90a4107d90 msgr2=0x7f90a410a1c0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:49:53.481 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:53.480+0000 7f90a9119700 1 --2- 192.168.123.105:0/3137794969 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f90a4107d90 0x7f90a410a1c0 secure :-1 s=READY pgs=67 cs=0 l=1 rev1=1 crypto rx=0x7f9094009b00 tx=0x7f9094009e10 comp rx=0 tx=0).stop 2026-03-10T07:49:53.482 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:53.480+0000 7f90a9119700 1 -- 192.168.123.105:0/3137794969 shutdown_connections 2026-03-10T07:49:53.482 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:53.480+0000 7f90a9119700 1 --2- 192.168.123.105:0/3137794969 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f90a410a700 0x7f90a410cb90 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:49:53.482 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:53.480+0000 7f90a9119700 1 --2- 192.168.123.105:0/3137794969 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f90a4107d90 0x7f90a410a1c0 unknown :-1 s=CLOSED pgs=67 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:49:53.482 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:53.480+0000 7f90a9119700 1 -- 192.168.123.105:0/3137794969 >> 192.168.123.105:0/3137794969 conn(0x7f90a406dae0 msgr2=0x7f90a406ff40 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:49:53.482 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:53.481+0000 7f90a9119700 1 -- 192.168.123.105:0/3137794969 shutdown_connections 2026-03-10T07:49:53.483 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:53.481+0000 7f90a9119700 1 -- 192.168.123.105:0/3137794969 wait complete. 2026-03-10T07:49:53.483 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:53.482+0000 7f90a9119700 1 Processor -- start 2026-03-10T07:49:53.484 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:53.482+0000 7f90a9119700 1 -- start start 2026-03-10T07:49:53.484 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:53.482+0000 7f90a9119700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f90a4107d90 0x7f90a4116d10 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:49:53.484 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:53.482+0000 7f90a9119700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f90a410a700 0x7f90a4117250 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:49:53.484 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:53.482+0000 7f90a9119700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f90a4117870 con 0x7f90a410a700 2026-03-10T07:49:53.484 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:53.482+0000 7f90a9119700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f90a41179b0 con 0x7f90a4107d90 2026-03-10T07:49:53.486 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:53.484+0000 7f90a2d9d700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f90a4107d90 0x7f90a4116d10 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:49:53.486 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:53.484+0000 7f90a259c700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f90a410a700 0x7f90a4117250 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:49:53.486 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:53.484+0000 7f90a259c700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f90a410a700 0x7f90a4117250 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.105:36458/0 (socket says 192.168.123.105:36458) 2026-03-10T07:49:53.486 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:53.484+0000 7f90a259c700 1 -- 192.168.123.105:0/523904509 learned_addr learned my addr 192.168.123.105:0/523904509 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T07:49:53.486 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:53.485+0000 7f90a2d9d700 1 -- 192.168.123.105:0/523904509 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f90a410a700 msgr2=0x7f90a4117250 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:49:53.486 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:53.485+0000 7f90a2d9d700 1 --2- 192.168.123.105:0/523904509 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f90a410a700 0x7f90a4117250 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:49:53.486 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:53.485+0000 7f90a2d9d700 1 -- 192.168.123.105:0/523904509 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f90940097e0 con 0x7f90a4107d90 2026-03-10T07:49:53.486 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:53.485+0000 7f90a2d9d700 1 --2- 
192.168.123.105:0/523904509 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f90a4107d90 0x7f90a4116d10 secure :-1 s=READY pgs=68 cs=0 l=1 rev1=1 crypto rx=0x7f90940052a0 tx=0x7f9094005340 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:49:53.487 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:53.485+0000 7f909bfff700 1 -- 192.168.123.105:0/523904509 <== mon.1 v2:192.168.123.108:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f909401d070 con 0x7f90a4107d90 2026-03-10T07:49:53.487 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:53.485+0000 7f909bfff700 1 -- 192.168.123.105:0/523904509 <== mon.1 v2:192.168.123.108:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f9094003ba0 con 0x7f90a4107d90 2026-03-10T07:49:53.487 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:53.485+0000 7f909bfff700 1 -- 192.168.123.105:0/523904509 <== mon.1 v2:192.168.123.108:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f90940176a0 con 0x7f90a4107d90 2026-03-10T07:49:53.487 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:53.485+0000 7f90a9119700 1 -- 192.168.123.105:0/523904509 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f90a41b3520 con 0x7f90a4107d90 2026-03-10T07:49:53.487 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:53.485+0000 7f90a9119700 1 -- 192.168.123.105:0/523904509 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f90a41b3a10 con 0x7f90a4107d90 2026-03-10T07:49:53.491 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:53.489+0000 7f909bfff700 1 -- 192.168.123.105:0/523904509 <== mon.1 v2:192.168.123.108:3300/0 4 ==== mgrmap(e 19) v1 ==== 90308+0+0 (secure 0 0 0) 0x7f9094017800 con 0x7f90a4107d90 2026-03-10T07:49:53.491 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:53.489+0000 7f909bfff700 1 
--2- 192.168.123.105:0/523904509 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f909006c530 0x7f909006e9f0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:49:53.491 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:53.489+0000 7f909bfff700 1 -- 192.168.123.105:0/523904509 <== mon.1 v2:192.168.123.108:3300/0 5 ==== osd_map(39..39 src has 1..39) v4 ==== 5362+0+0 (secure 0 0 0) 0x7f909408db10 con 0x7f90a4107d90 2026-03-10T07:49:53.492 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:53.490+0000 7f90a259c700 1 --2- 192.168.123.105:0/523904509 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f909006c530 0x7f909006e9f0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:49:53.492 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:53.490+0000 7f90a9119700 1 -- 192.168.123.105:0/523904509 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f9084005320 con 0x7f90a4107d90 2026-03-10T07:49:53.493 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:53.491+0000 7f90a259c700 1 --2- 192.168.123.105:0/523904509 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f909006c530 0x7f909006e9f0 secure :-1 s=READY pgs=131 cs=0 l=1 rev1=1 crypto rx=0x7f909c009e00 tx=0x7f909c00b040 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:49:53.495 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:53.493+0000 7f909bfff700 1 -- 192.168.123.105:0/523904509 <== mon.1 v2:192.168.123.108:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7f909405bdc0 con 0x7f90a4107d90 2026-03-10T07:49:53.742 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:53.740+0000 7f90a9119700 1 -- 
192.168.123.105:0/523904509 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "versions"} v 0) v1 -- 0x7f9084006200 con 0x7f90a4107d90 2026-03-10T07:49:53.742 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:53.741+0000 7f909bfff700 1 -- 192.168.123.105:0/523904509 <== mon.1 v2:192.168.123.108:3300/0 7 ==== mon_command_ack([{"prefix": "versions"}]=0 v0) v1 ==== 56+0+558 (secure 0 0 0) 0x7f909405b950 con 0x7f90a4107d90 2026-03-10T07:49:53.742 INFO:teuthology.orchestra.run.vm05.stdout:{ 2026-03-10T07:49:53.742 INFO:teuthology.orchestra.run.vm05.stdout: "mon": { 2026-03-10T07:49:53.742 INFO:teuthology.orchestra.run.vm05.stdout: "ceph version 18.2.1 (7fe91d5d5842e04be3b4f514d6dd990c54b29c76) reef (stable)": 2 2026-03-10T07:49:53.742 INFO:teuthology.orchestra.run.vm05.stdout: }, 2026-03-10T07:49:53.742 INFO:teuthology.orchestra.run.vm05.stdout: "mgr": { 2026-03-10T07:49:53.742 INFO:teuthology.orchestra.run.vm05.stdout: "ceph version 18.2.1 (7fe91d5d5842e04be3b4f514d6dd990c54b29c76) reef (stable)": 2 2026-03-10T07:49:53.742 INFO:teuthology.orchestra.run.vm05.stdout: }, 2026-03-10T07:49:53.742 INFO:teuthology.orchestra.run.vm05.stdout: "osd": { 2026-03-10T07:49:53.742 INFO:teuthology.orchestra.run.vm05.stdout: "ceph version 18.2.1 (7fe91d5d5842e04be3b4f514d6dd990c54b29c76) reef (stable)": 6 2026-03-10T07:49:53.743 INFO:teuthology.orchestra.run.vm05.stdout: }, 2026-03-10T07:49:53.743 INFO:teuthology.orchestra.run.vm05.stdout: "mds": { 2026-03-10T07:49:53.743 INFO:teuthology.orchestra.run.vm05.stdout: "ceph version 18.2.1 (7fe91d5d5842e04be3b4f514d6dd990c54b29c76) reef (stable)": 4 2026-03-10T07:49:53.743 INFO:teuthology.orchestra.run.vm05.stdout: }, 2026-03-10T07:49:53.743 INFO:teuthology.orchestra.run.vm05.stdout: "overall": { 2026-03-10T07:49:53.743 INFO:teuthology.orchestra.run.vm05.stdout: "ceph version 18.2.1 (7fe91d5d5842e04be3b4f514d6dd990c54b29c76) reef (stable)": 14 2026-03-10T07:49:53.743 
INFO:teuthology.orchestra.run.vm05.stdout: } 2026-03-10T07:49:53.743 INFO:teuthology.orchestra.run.vm05.stdout:} 2026-03-10T07:49:53.745 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:53.744+0000 7f9099ffb700 1 -- 192.168.123.105:0/523904509 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f909006c530 msgr2=0x7f909006e9f0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:49:53.745 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:53.744+0000 7f9099ffb700 1 --2- 192.168.123.105:0/523904509 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f909006c530 0x7f909006e9f0 secure :-1 s=READY pgs=131 cs=0 l=1 rev1=1 crypto rx=0x7f909c009e00 tx=0x7f909c00b040 comp rx=0 tx=0).stop 2026-03-10T07:49:53.745 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:53.744+0000 7f9099ffb700 1 -- 192.168.123.105:0/523904509 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f90a4107d90 msgr2=0x7f90a4116d10 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:49:53.745 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:53.744+0000 7f9099ffb700 1 --2- 192.168.123.105:0/523904509 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f90a4107d90 0x7f90a4116d10 secure :-1 s=READY pgs=68 cs=0 l=1 rev1=1 crypto rx=0x7f90940052a0 tx=0x7f9094005340 comp rx=0 tx=0).stop 2026-03-10T07:49:53.745 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:53.744+0000 7f9099ffb700 1 -- 192.168.123.105:0/523904509 shutdown_connections 2026-03-10T07:49:53.745 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:53.744+0000 7f9099ffb700 1 --2- 192.168.123.105:0/523904509 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f909006c530 0x7f909006e9f0 unknown :-1 s=CLOSED pgs=131 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:49:53.745 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:53.744+0000 7f9099ffb700 1 --2- 
192.168.123.105:0/523904509 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f90a4107d90 0x7f90a4116d10 unknown :-1 s=CLOSED pgs=68 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:49:53.745 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:53.744+0000 7f9099ffb700 1 --2- 192.168.123.105:0/523904509 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f90a410a700 0x7f90a4117250 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:49:53.745 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:53.744+0000 7f9099ffb700 1 -- 192.168.123.105:0/523904509 >> 192.168.123.105:0/523904509 conn(0x7f90a406dae0 msgr2=0x7f90a406ff40 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:49:53.746 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:53.744+0000 7f9099ffb700 1 -- 192.168.123.105:0/523904509 shutdown_connections 2026-03-10T07:49:53.746 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:53.744+0000 7f9099ffb700 1 -- 192.168.123.105:0/523904509 wait complete. 
2026-03-10T07:49:53.833 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:53.831+0000 7f18a4d50700 1 -- 192.168.123.105:0/2715961903 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f18a0075a40 msgr2=0x7f18a0077ed0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:49:53.833 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:53.831+0000 7f18a4d50700 1 --2- 192.168.123.105:0/2715961903 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f18a0075a40 0x7f18a0077ed0 secure :-1 s=READY pgs=295 cs=0 l=1 rev1=1 crypto rx=0x7f189800b780 tx=0x7f189800ba90 comp rx=0 tx=0).stop 2026-03-10T07:49:53.833 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:53.831+0000 7f18a4d50700 1 -- 192.168.123.105:0/2715961903 shutdown_connections 2026-03-10T07:49:53.833 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:53.831+0000 7f18a4d50700 1 --2- 192.168.123.105:0/2715961903 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f18a0075a40 0x7f18a0077ed0 unknown :-1 s=CLOSED pgs=295 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:49:53.833 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:53.831+0000 7f18a4d50700 1 --2- 192.168.123.105:0/2715961903 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f18a0072b50 0x7f18a0072f70 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:49:53.833 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:53.831+0000 7f18a4d50700 1 -- 192.168.123.105:0/2715961903 >> 192.168.123.105:0/2715961903 conn(0x7f18a006dae0 msgr2=0x7f18a006ff40 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:49:53.834 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:53.831+0000 7f18a4d50700 1 -- 192.168.123.105:0/2715961903 shutdown_connections 2026-03-10T07:49:53.834 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:53.831+0000 7f18a4d50700 1 -- 192.168.123.105:0/2715961903 
wait complete. 2026-03-10T07:49:53.834 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:53.831+0000 7f18a4d50700 1 Processor -- start 2026-03-10T07:49:53.834 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:53.832+0000 7f18a4d50700 1 -- start start 2026-03-10T07:49:53.834 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:53.832+0000 7f18a4d50700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f18a0072b50 0x7f18a0083100 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:49:53.834 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:53.832+0000 7f18a4d50700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f18a0083640 0x7f18a012e400 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:49:53.834 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:53.832+0000 7f18a4d50700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f18a0083b80 con 0x7f18a0083640 2026-03-10T07:49:53.834 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:53.832+0000 7f18a4d50700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f18a0083cf0 con 0x7f18a0072b50 2026-03-10T07:49:53.834 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:53.832+0000 7f189dd9b700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f18a0083640 0x7f18a012e400 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:49:53.834 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:53.832+0000 7f189dd9b700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f18a0083640 0x7f18a012e400 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 
says I am v2:192.168.123.105:36480/0 (socket says 192.168.123.105:36480) 2026-03-10T07:49:53.834 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:53.832+0000 7f189dd9b700 1 -- 192.168.123.105:0/3999279459 learned_addr learned my addr 192.168.123.105:0/3999279459 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T07:49:53.834 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:53.832+0000 7f189e59c700 1 --2- 192.168.123.105:0/3999279459 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f18a0072b50 0x7f18a0083100 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:49:53.834 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:53.833+0000 7f189dd9b700 1 -- 192.168.123.105:0/3999279459 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f18a0072b50 msgr2=0x7f18a0083100 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:49:53.834 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:53.833+0000 7f189dd9b700 1 --2- 192.168.123.105:0/3999279459 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f18a0072b50 0x7f18a0083100 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:49:53.834 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:53.833+0000 7f189dd9b700 1 -- 192.168.123.105:0/3999279459 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f189800b050 con 0x7f18a0083640 2026-03-10T07:49:53.834 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:53.833+0000 7f189dd9b700 1 --2- 192.168.123.105:0/3999279459 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f18a0083640 0x7f18a012e400 secure :-1 s=READY pgs=296 cs=0 l=1 rev1=1 crypto rx=0x7f189800b750 tx=0x7f1898009300 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 
out_seq=0 2026-03-10T07:49:53.834 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:53.833+0000 7f188f7fe700 1 -- 192.168.123.105:0/3999279459 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f1898003bb0 con 0x7f18a0083640 2026-03-10T07:49:53.835 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:53.833+0000 7f18a4d50700 1 -- 192.168.123.105:0/3999279459 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f18a012ea00 con 0x7f18a0083640 2026-03-10T07:49:53.835 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:53.833+0000 7f18a4d50700 1 -- 192.168.123.105:0/3999279459 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f18a012ef00 con 0x7f18a0083640 2026-03-10T07:49:53.835 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:53.834+0000 7f188f7fe700 1 -- 192.168.123.105:0/3999279459 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f1898009d70 con 0x7f18a0083640 2026-03-10T07:49:53.835 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:53.834+0000 7f18a4d50700 1 -- 192.168.123.105:0/3999279459 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f18a004ea90 con 0x7f18a0083640 2026-03-10T07:49:53.835 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:53.834+0000 7f188f7fe700 1 -- 192.168.123.105:0/3999279459 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f18980046b0 con 0x7f18a0083640 2026-03-10T07:49:53.841 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:53.839+0000 7f188f7fe700 1 -- 192.168.123.105:0/3999279459 <== mon.0 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 19) v1 ==== 90308+0+0 (secure 0 0 0) 0x7f189802b030 con 0x7f18a0083640 2026-03-10T07:49:53.841 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:53.840+0000 7f188f7fe700 1 --2- 192.168.123.105:0/3999279459 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f188806c600 0x7f188806eac0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:49:53.841 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:53.840+0000 7f188f7fe700 1 -- 192.168.123.105:0/3999279459 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(39..39 src has 1..39) v4 ==== 5362+0+0 (secure 0 0 0) 0x7f189808d1d0 con 0x7f18a0083640 2026-03-10T07:49:53.842 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:53.840+0000 7f188f7fe700 1 -- 192.168.123.105:0/3999279459 <== mon.0 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+180422 (secure 0 0 0) 0x7f189808d650 con 0x7f18a0083640 2026-03-10T07:49:53.845 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:53.842+0000 7f189e59c700 1 --2- 192.168.123.105:0/3999279459 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f188806c600 0x7f188806eac0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:49:53.846 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:53.845+0000 7f189e59c700 1 --2- 192.168.123.105:0/3999279459 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f188806c600 0x7f188806eac0 secure :-1 s=READY pgs=132 cs=0 l=1 rev1=1 crypto rx=0x7f1890009c80 tx=0x7f1890009400 comp rx=0 tx=0).ready entity=mgr.14223 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:49:53.974 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:53.972+0000 7f18a4d50700 1 -- 192.168.123.105:0/3999279459 --> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7f18a0077700 con 
0x7f188806c600 2026-03-10T07:49:53.975 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:53.974+0000 7f188f7fe700 1 -- 192.168.123.105:0/3999279459 <== mgr.14223 v2:192.168.123.105:6800/2 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+434 (secure 0 0 0) 0x7f18a0077700 con 0x7f188806c600 2026-03-10T07:49:53.975 INFO:teuthology.orchestra.run.vm05.stdout:{ 2026-03-10T07:49:53.975 INFO:teuthology.orchestra.run.vm05.stdout: "target_image": "quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc", 2026-03-10T07:49:53.975 INFO:teuthology.orchestra.run.vm05.stdout: "in_progress": true, 2026-03-10T07:49:53.975 INFO:teuthology.orchestra.run.vm05.stdout: "which": "Upgrading daemons of type(s) mgr", 2026-03-10T07:49:53.975 INFO:teuthology.orchestra.run.vm05.stdout: "services_complete": [], 2026-03-10T07:49:53.975 INFO:teuthology.orchestra.run.vm05.stdout: "progress": "0/2 daemons upgraded", 2026-03-10T07:49:53.975 INFO:teuthology.orchestra.run.vm05.stdout: "message": "Pulling quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc image on host vm08", 2026-03-10T07:49:53.975 INFO:teuthology.orchestra.run.vm05.stdout: "is_paused": false 2026-03-10T07:49:53.975 INFO:teuthology.orchestra.run.vm05.stdout:} 2026-03-10T07:49:53.978 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:53.976+0000 7f188d7fa700 1 -- 192.168.123.105:0/3999279459 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f188806c600 msgr2=0x7f188806eac0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:49:53.978 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:53.976+0000 7f188d7fa700 1 --2- 192.168.123.105:0/3999279459 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f188806c600 0x7f188806eac0 secure :-1 s=READY pgs=132 cs=0 l=1 rev1=1 crypto rx=0x7f1890009c80 tx=0x7f1890009400 comp rx=0 tx=0).stop 2026-03-10T07:49:53.978 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:53.976+0000 7f188d7fa700 1 -- 192.168.123.105:0/3999279459 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f18a0083640 msgr2=0x7f18a012e400 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:49:53.978 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:53.976+0000 7f188d7fa700 1 --2- 192.168.123.105:0/3999279459 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f18a0083640 0x7f18a012e400 secure :-1 s=READY pgs=296 cs=0 l=1 rev1=1 crypto rx=0x7f189800b750 tx=0x7f1898009300 comp rx=0 tx=0).stop 2026-03-10T07:49:53.978 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:53.976+0000 7f188d7fa700 1 -- 192.168.123.105:0/3999279459 shutdown_connections 2026-03-10T07:49:53.978 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:53.976+0000 7f188d7fa700 1 --2- 192.168.123.105:0/3999279459 >> [v2:192.168.123.105:6800/2,v1:192.168.123.105:6801/2] conn(0x7f188806c600 0x7f188806eac0 unknown :-1 s=CLOSED pgs=132 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:49:53.978 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:53.976+0000 7f188d7fa700 1 --2- 192.168.123.105:0/3999279459 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f18a0072b50 0x7f18a0083100 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:49:53.978 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:53.976+0000 7f188d7fa700 1 --2- 192.168.123.105:0/3999279459 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f18a0083640 0x7f18a012e400 unknown :-1 s=CLOSED pgs=296 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:49:53.978 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:53.977+0000 7f188d7fa700 1 -- 192.168.123.105:0/3999279459 >> 192.168.123.105:0/3999279459 conn(0x7f18a006dae0 msgr2=0x7f18a006ff40 unknown :-1 s=STATE_NONE l=0).mark_down 
2026-03-10T07:49:53.979 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:53.978+0000 7f188d7fa700 1 -- 192.168.123.105:0/3999279459 shutdown_connections 2026-03-10T07:49:53.979 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:49:53.978+0000 7f188d7fa700 1 -- 192.168.123.105:0/3999279459 wait complete. 2026-03-10T07:49:54.418 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:49:54 vm08.local ceph-mon[59917]: from='client.24371 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T07:49:54.419 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:49:54 vm08.local ceph-mon[59917]: from='client.24375 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T07:49:54.419 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:49:54 vm08.local ceph-mon[59917]: pgmap v122: 65 pgs: 65 active+clean; 41 MiB data, 295 MiB used, 120 GiB / 120 GiB avail; 0 B/s rd, 3.4 MiB/s wr, 141 op/s 2026-03-10T07:49:54.419 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:49:54 vm08.local ceph-mon[59917]: from='client.24377 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T07:49:54.419 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:49:54 vm08.local ceph-mon[59917]: from='client.? 
192.168.123.105:0/523904509' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T07:49:54.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:49:54 vm05.local ceph-mon[50387]: from='client.24371 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T07:49:54.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:49:54 vm05.local ceph-mon[50387]: from='client.24375 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T07:49:54.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:49:54 vm05.local ceph-mon[50387]: pgmap v122: 65 pgs: 65 active+clean; 41 MiB data, 295 MiB used, 120 GiB / 120 GiB avail; 0 B/s rd, 3.4 MiB/s wr, 141 op/s 2026-03-10T07:49:54.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:49:54 vm05.local ceph-mon[50387]: from='client.24377 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T07:49:54.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:49:54 vm05.local ceph-mon[50387]: from='client.? 
192.168.123.105:0/523904509' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T07:49:56.017 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:49:55 vm05.local ceph-mon[50387]: from='client.14600 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T07:49:56.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:49:55 vm08.local ceph-mon[59917]: from='client.14600 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T07:49:57.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:49:56 vm05.local ceph-mon[50387]: pgmap v123: 65 pgs: 65 active+clean; 50 MiB data, 338 MiB used, 120 GiB / 120 GiB avail; 0 B/s rd, 4.2 MiB/s wr, 169 op/s 2026-03-10T07:49:57.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:49:56 vm08.local ceph-mon[59917]: pgmap v123: 65 pgs: 65 active+clean; 50 MiB data, 338 MiB used, 120 GiB / 120 GiB avail; 0 B/s rd, 4.2 MiB/s wr, 169 op/s 2026-03-10T07:49:58.407 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:49:57 vm05.local ceph-mon[50387]: pgmap v124: 65 pgs: 65 active+clean; 68 MiB data, 384 MiB used, 120 GiB / 120 GiB avail; 0 B/s rd, 5.4 MiB/s wr, 210 op/s 2026-03-10T07:49:58.418 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:49:57 vm08.local ceph-mon[59917]: pgmap v124: 65 pgs: 65 active+clean; 68 MiB data, 384 MiB used, 120 GiB / 120 GiB avail; 0 B/s rd, 5.4 MiB/s wr, 210 op/s 2026-03-10T07:49:59.657 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:49:59 vm05.local ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T07:49:59.668 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:49:59 vm08.local ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T07:50:00.809 
INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:50:00 vm08.local ceph-mon[59917]: pgmap v125: 65 pgs: 65 active+clean; 76 MiB data, 409 MiB used, 120 GiB / 120 GiB avail; 0 B/s rd, 4.7 MiB/s wr, 209 op/s 2026-03-10T07:50:00.809 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:50:00 vm08.local ceph-mon[59917]: Health detail: HEALTH_WARN 1 filesystem with deprecated feature inline_data 2026-03-10T07:50:00.809 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:50:00 vm08.local ceph-mon[59917]: [WRN] FS_INLINE_DATA_DEPRECATED: 1 filesystem with deprecated feature inline_data 2026-03-10T07:50:00.809 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:50:00 vm08.local ceph-mon[59917]: fs cephfs has deprecated feature inline_data enabled. 2026-03-10T07:50:00.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:50:00 vm05.local ceph-mon[50387]: pgmap v125: 65 pgs: 65 active+clean; 76 MiB data, 409 MiB used, 120 GiB / 120 GiB avail; 0 B/s rd, 4.7 MiB/s wr, 209 op/s 2026-03-10T07:50:00.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:50:00 vm05.local ceph-mon[50387]: Health detail: HEALTH_WARN 1 filesystem with deprecated feature inline_data 2026-03-10T07:50:00.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:50:00 vm05.local ceph-mon[50387]: [WRN] FS_INLINE_DATA_DEPRECATED: 1 filesystem with deprecated feature inline_data 2026-03-10T07:50:00.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:50:00 vm05.local ceph-mon[50387]: fs cephfs has deprecated feature inline_data enabled. 
2026-03-10T07:50:01.571 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:50:01 vm08.local ceph-mon[59917]: Upgrade: Updating mgr.vm08.orfpog 2026-03-10T07:50:01.571 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:50:01 vm08.local ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' 2026-03-10T07:50:01.571 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:50:01 vm08.local ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "auth get-or-create", "entity": "mgr.vm08.orfpog", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]: dispatch 2026-03-10T07:50:01.571 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:50:01 vm08.local ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "mgr services"}]: dispatch 2026-03-10T07:50:01.571 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:50:01 vm08.local ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T07:50:01.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:50:01 vm05.local ceph-mon[50387]: Upgrade: Updating mgr.vm08.orfpog 2026-03-10T07:50:01.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:50:01 vm05.local ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' 2026-03-10T07:50:01.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:50:01 vm05.local ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "auth get-or-create", "entity": "mgr.vm08.orfpog", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]: dispatch 2026-03-10T07:50:01.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:50:01 vm05.local ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "mgr services"}]: dispatch 2026-03-10T07:50:01.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 
10 07:50:01 vm05.local ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T07:50:02.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:50:02 vm08.local ceph-mon[59917]: Deploying daemon mgr.vm08.orfpog on vm08 2026-03-10T07:50:02.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:50:02 vm08.local ceph-mon[59917]: pgmap v126: 65 pgs: 65 active+clean; 84 MiB data, 445 MiB used, 120 GiB / 120 GiB avail; 0 B/s rd, 4.6 MiB/s wr, 212 op/s 2026-03-10T07:50:03.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:50:02 vm05.local ceph-mon[50387]: Deploying daemon mgr.vm08.orfpog on vm08 2026-03-10T07:50:03.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:50:02 vm05.local ceph-mon[50387]: pgmap v126: 65 pgs: 65 active+clean; 84 MiB data, 445 MiB used, 120 GiB / 120 GiB avail; 0 B/s rd, 4.6 MiB/s wr, 212 op/s 2026-03-10T07:50:03.998 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:50:03 vm08.local ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' 2026-03-10T07:50:03.999 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:50:03 vm08.local ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' 2026-03-10T07:50:03.999 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:50:03 vm08.local ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T07:50:04.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:50:03 vm05.local ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' 2026-03-10T07:50:04.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:50:03 vm05.local ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' 2026-03-10T07:50:04.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:50:03 vm05.local ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' 
entity='mgr.vm05.blexke' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T07:50:05.657 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:50:05 vm05.local ceph-mon[50387]: pgmap v127: 65 pgs: 65 active+clean; 97 MiB data, 511 MiB used, 119 GiB / 120 GiB avail; 0 B/s rd, 5.4 MiB/s wr, 234 op/s 2026-03-10T07:50:05.668 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:50:05 vm08.local ceph-mon[59917]: pgmap v127: 65 pgs: 65 active+clean; 97 MiB data, 511 MiB used, 119 GiB / 120 GiB avail; 0 B/s rd, 5.4 MiB/s wr, 234 op/s 2026-03-10T07:50:06.452 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:50:06 vm08.local ceph-mon[59917]: pgmap v128: 65 pgs: 65 active+clean; 104 MiB data, 610 MiB used, 119 GiB / 120 GiB avail; 0 B/s rd, 5.3 MiB/s wr, 210 op/s 2026-03-10T07:50:06.452 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:50:06 vm08.local ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' 2026-03-10T07:50:06.452 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:50:06 vm08.local ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' 2026-03-10T07:50:06.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:50:06 vm05.local ceph-mon[50387]: pgmap v128: 65 pgs: 65 active+clean; 104 MiB data, 610 MiB used, 119 GiB / 120 GiB avail; 0 B/s rd, 5.3 MiB/s wr, 210 op/s 2026-03-10T07:50:06.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:50:06 vm05.local ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' 2026-03-10T07:50:06.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:50:06 vm05.local ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' 2026-03-10T07:50:07.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:50:07 vm05.local ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' 2026-03-10T07:50:07.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:50:07 vm05.local ceph-mon[50387]: 
from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' 2026-03-10T07:50:07.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:50:07 vm05.local ceph-mon[50387]: Standby manager daemon vm08.orfpog restarted 2026-03-10T07:50:07.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:50:07 vm05.local ceph-mon[50387]: Standby manager daemon vm08.orfpog started 2026-03-10T07:50:07.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:50:07 vm05.local ceph-mon[50387]: from='mgr.? 192.168.123.108:0/503323081' entity='mgr.vm08.orfpog' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/vm08.orfpog/crt"}]: dispatch 2026-03-10T07:50:07.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:50:07 vm05.local ceph-mon[50387]: from='mgr.? 192.168.123.108:0/503323081' entity='mgr.vm08.orfpog' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/crt"}]: dispatch 2026-03-10T07:50:07.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:50:07 vm05.local ceph-mon[50387]: from='mgr.? 192.168.123.108:0/503323081' entity='mgr.vm08.orfpog' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/vm08.orfpog/key"}]: dispatch 2026-03-10T07:50:07.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:50:07 vm05.local ceph-mon[50387]: from='mgr.? 
192.168.123.108:0/503323081' entity='mgr.vm08.orfpog' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/key"}]: dispatch 2026-03-10T07:50:07.919 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:50:07 vm08.local ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' 2026-03-10T07:50:07.919 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:50:07 vm08.local ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' 2026-03-10T07:50:07.919 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:50:07 vm08.local ceph-mon[59917]: Standby manager daemon vm08.orfpog restarted 2026-03-10T07:50:07.919 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:50:07 vm08.local ceph-mon[59917]: Standby manager daemon vm08.orfpog started 2026-03-10T07:50:07.919 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:50:07 vm08.local ceph-mon[59917]: from='mgr.? 192.168.123.108:0/503323081' entity='mgr.vm08.orfpog' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/vm08.orfpog/crt"}]: dispatch 2026-03-10T07:50:07.919 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:50:07 vm08.local ceph-mon[59917]: from='mgr.? 192.168.123.108:0/503323081' entity='mgr.vm08.orfpog' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/crt"}]: dispatch 2026-03-10T07:50:07.919 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:50:07 vm08.local ceph-mon[59917]: from='mgr.? 192.168.123.108:0/503323081' entity='mgr.vm08.orfpog' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/vm08.orfpog/key"}]: dispatch 2026-03-10T07:50:07.919 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:50:07 vm08.local ceph-mon[59917]: from='mgr.? 
192.168.123.108:0/503323081' entity='mgr.vm08.orfpog' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/key"}]: dispatch 2026-03-10T07:50:08.791 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:50:08 vm08.local ceph-mon[59917]: pgmap v129: 65 pgs: 65 active+clean; 117 MiB data, 633 MiB used, 119 GiB / 120 GiB avail; 0 B/s rd, 5.7 MiB/s wr, 220 op/s 2026-03-10T07:50:08.791 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:50:08 vm08.local ceph-mon[59917]: mgrmap e20: vm05.blexke(active, since 3m), standbys: vm08.orfpog 2026-03-10T07:50:08.791 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:50:08 vm08.local ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' 2026-03-10T07:50:08.791 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:50:08 vm08.local ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' 2026-03-10T07:50:08.791 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:50:08 vm08.local ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T07:50:08.791 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:50:08 vm08.local ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T07:50:08.791 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:50:08 vm08.local ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' 2026-03-10T07:50:08.791 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:50:08 vm08.local ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T07:50:08.791 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:50:08 vm08.local ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "mgr fail", "who": "vm05.blexke"}]: dispatch 
2026-03-10T07:50:08.791 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:50:08 vm08.local ceph-mon[59917]: from='mgr.24387 192.168.123.108:0/503323081' entity='mgr.vm08.orfpog' cmd=[{"prefix": "mon metadata", "id": "vm05"}]: dispatch 2026-03-10T07:50:08.791 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:50:08 vm08.local ceph-mon[59917]: osdmap e40: 6 total, 6 up, 6 in 2026-03-10T07:50:08.791 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:50:08 vm08.local ceph-mon[59917]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd='[{"prefix": "mgr fail", "who": "vm05.blexke"}]': finished 2026-03-10T07:50:08.791 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:50:08 vm08.local ceph-mon[59917]: mgrmap e21: vm08.orfpog(active, starting, since 0.0163721s) 2026-03-10T07:50:08.791 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:50:08 vm08.local ceph-mon[59917]: from='mgr.24387 192.168.123.108:0/503323081' entity='mgr.vm08.orfpog' cmd=[{"prefix": "mon metadata", "id": "vm08"}]: dispatch 2026-03-10T07:50:08.791 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:50:08 vm08.local ceph-mon[59917]: from='mgr.24387 192.168.123.108:0/503323081' entity='mgr.vm08.orfpog' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm05.pavqil"}]: dispatch 2026-03-10T07:50:08.791 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:50:08 vm08.local ceph-mon[59917]: from='mgr.24387 192.168.123.108:0/503323081' entity='mgr.vm08.orfpog' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm08.ybmbgd"}]: dispatch 2026-03-10T07:50:08.791 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:50:08 vm08.local ceph-mon[59917]: from='mgr.24387 192.168.123.108:0/503323081' entity='mgr.vm08.orfpog' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm05.omfhnh"}]: dispatch 2026-03-10T07:50:08.791 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:50:08 vm08.local ceph-mon[59917]: from='mgr.24387 192.168.123.108:0/503323081' entity='mgr.vm08.orfpog' cmd=[{"prefix": "mds metadata", "who": 
"cephfs.vm08.dgsaon"}]: dispatch 2026-03-10T07:50:08.791 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:50:08 vm08.local ceph-mon[59917]: from='mgr.24387 192.168.123.108:0/503323081' entity='mgr.vm08.orfpog' cmd=[{"prefix": "mgr metadata", "who": "vm08.orfpog", "id": "vm08.orfpog"}]: dispatch 2026-03-10T07:50:08.791 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:50:08 vm08.local ceph-mon[59917]: from='mgr.24387 192.168.123.108:0/503323081' entity='mgr.vm08.orfpog' cmd=[{"prefix": "osd metadata", "id": 0}]: dispatch 2026-03-10T07:50:08.791 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:50:08 vm08.local ceph-mon[59917]: from='mgr.24387 192.168.123.108:0/503323081' entity='mgr.vm08.orfpog' cmd=[{"prefix": "osd metadata", "id": 1}]: dispatch 2026-03-10T07:50:08.791 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:50:08 vm08.local ceph-mon[59917]: from='mgr.24387 192.168.123.108:0/503323081' entity='mgr.vm08.orfpog' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch 2026-03-10T07:50:08.791 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:50:08 vm08.local ceph-mon[59917]: from='mgr.24387 192.168.123.108:0/503323081' entity='mgr.vm08.orfpog' cmd=[{"prefix": "osd metadata", "id": 3}]: dispatch 2026-03-10T07:50:08.791 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:50:08 vm08.local ceph-mon[59917]: from='mgr.24387 192.168.123.108:0/503323081' entity='mgr.vm08.orfpog' cmd=[{"prefix": "osd metadata", "id": 4}]: dispatch 2026-03-10T07:50:08.791 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:50:08 vm08.local ceph-mon[59917]: from='mgr.24387 192.168.123.108:0/503323081' entity='mgr.vm08.orfpog' cmd=[{"prefix": "osd metadata", "id": 5}]: dispatch 2026-03-10T07:50:08.791 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:50:08 vm08.local ceph-mon[59917]: from='mgr.24387 192.168.123.108:0/503323081' entity='mgr.vm08.orfpog' cmd=[{"prefix": "mds metadata"}]: dispatch 2026-03-10T07:50:08.791 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:50:08 
vm08.local ceph-mon[59917]: from='mgr.24387 192.168.123.108:0/503323081' entity='mgr.vm08.orfpog' cmd=[{"prefix": "osd metadata"}]: dispatch 2026-03-10T07:50:08.791 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:50:08 vm08.local ceph-mon[59917]: from='mgr.24387 192.168.123.108:0/503323081' entity='mgr.vm08.orfpog' cmd=[{"prefix": "mon metadata"}]: dispatch 2026-03-10T07:50:08.795 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:50:08 vm05.local ceph-mon[50387]: pgmap v129: 65 pgs: 65 active+clean; 117 MiB data, 633 MiB used, 119 GiB / 120 GiB avail; 0 B/s rd, 5.7 MiB/s wr, 220 op/s 2026-03-10T07:50:08.795 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:50:08 vm05.local ceph-mon[50387]: mgrmap e20: vm05.blexke(active, since 3m), standbys: vm08.orfpog 2026-03-10T07:50:08.795 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:50:08 vm05.local ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' 2026-03-10T07:50:08.795 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:50:08 vm05.local ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' 2026-03-10T07:50:08.795 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:50:08 vm05.local ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T07:50:08.795 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:50:08 vm05.local ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T07:50:08.795 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:50:08 vm05.local ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' 2026-03-10T07:50:08.795 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:50:08 vm05.local ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 
2026-03-10T07:50:08.795 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:50:08 vm05.local ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "mgr fail", "who": "vm05.blexke"}]: dispatch 2026-03-10T07:50:08.795 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:50:08 vm05.local ceph-mon[50387]: from='mgr.24387 192.168.123.108:0/503323081' entity='mgr.vm08.orfpog' cmd=[{"prefix": "mon metadata", "id": "vm05"}]: dispatch 2026-03-10T07:50:08.795 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:50:08 vm05.local ceph-mon[50387]: osdmap e40: 6 total, 6 up, 6 in 2026-03-10T07:50:08.795 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:50:08 vm05.local ceph-mon[50387]: from='mgr.14223 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd='[{"prefix": "mgr fail", "who": "vm05.blexke"}]': finished 2026-03-10T07:50:08.795 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:50:08 vm05.local ceph-mon[50387]: mgrmap e21: vm08.orfpog(active, starting, since 0.0163721s) 2026-03-10T07:50:08.795 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:50:08 vm05.local ceph-mon[50387]: from='mgr.24387 192.168.123.108:0/503323081' entity='mgr.vm08.orfpog' cmd=[{"prefix": "mon metadata", "id": "vm08"}]: dispatch 2026-03-10T07:50:08.795 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:50:08 vm05.local ceph-mon[50387]: from='mgr.24387 192.168.123.108:0/503323081' entity='mgr.vm08.orfpog' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm05.pavqil"}]: dispatch 2026-03-10T07:50:08.795 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:50:08 vm05.local ceph-mon[50387]: from='mgr.24387 192.168.123.108:0/503323081' entity='mgr.vm08.orfpog' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm08.ybmbgd"}]: dispatch 2026-03-10T07:50:08.795 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:50:08 vm05.local ceph-mon[50387]: from='mgr.24387 192.168.123.108:0/503323081' entity='mgr.vm08.orfpog' cmd=[{"prefix": "mds metadata", "who": 
"cephfs.vm05.omfhnh"}]: dispatch 2026-03-10T07:50:08.795 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:50:08 vm05.local ceph-mon[50387]: from='mgr.24387 192.168.123.108:0/503323081' entity='mgr.vm08.orfpog' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm08.dgsaon"}]: dispatch 2026-03-10T07:50:08.795 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:50:08 vm05.local ceph-mon[50387]: from='mgr.24387 192.168.123.108:0/503323081' entity='mgr.vm08.orfpog' cmd=[{"prefix": "mgr metadata", "who": "vm08.orfpog", "id": "vm08.orfpog"}]: dispatch 2026-03-10T07:50:08.795 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:50:08 vm05.local ceph-mon[50387]: from='mgr.24387 192.168.123.108:0/503323081' entity='mgr.vm08.orfpog' cmd=[{"prefix": "osd metadata", "id": 0}]: dispatch 2026-03-10T07:50:08.795 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:50:08 vm05.local ceph-mon[50387]: from='mgr.24387 192.168.123.108:0/503323081' entity='mgr.vm08.orfpog' cmd=[{"prefix": "osd metadata", "id": 1}]: dispatch 2026-03-10T07:50:08.795 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:50:08 vm05.local ceph-mon[50387]: from='mgr.24387 192.168.123.108:0/503323081' entity='mgr.vm08.orfpog' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch 2026-03-10T07:50:08.795 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:50:08 vm05.local ceph-mon[50387]: from='mgr.24387 192.168.123.108:0/503323081' entity='mgr.vm08.orfpog' cmd=[{"prefix": "osd metadata", "id": 3}]: dispatch 2026-03-10T07:50:08.795 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:50:08 vm05.local ceph-mon[50387]: from='mgr.24387 192.168.123.108:0/503323081' entity='mgr.vm08.orfpog' cmd=[{"prefix": "osd metadata", "id": 4}]: dispatch 2026-03-10T07:50:08.795 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:50:08 vm05.local ceph-mon[50387]: from='mgr.24387 192.168.123.108:0/503323081' entity='mgr.vm08.orfpog' cmd=[{"prefix": "osd metadata", "id": 5}]: dispatch 2026-03-10T07:50:08.796 
INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:50:08 vm05.local ceph-mon[50387]: from='mgr.24387 192.168.123.108:0/503323081' entity='mgr.vm08.orfpog' cmd=[{"prefix": "mds metadata"}]: dispatch 2026-03-10T07:50:08.796 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:50:08 vm05.local ceph-mon[50387]: from='mgr.24387 192.168.123.108:0/503323081' entity='mgr.vm08.orfpog' cmd=[{"prefix": "osd metadata"}]: dispatch 2026-03-10T07:50:08.796 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:50:08 vm05.local ceph-mon[50387]: from='mgr.24387 192.168.123.108:0/503323081' entity='mgr.vm08.orfpog' cmd=[{"prefix": "mon metadata"}]: dispatch 2026-03-10T07:50:09.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:50:09 vm05.local ceph-mon[50387]: Manager daemon vm08.orfpog is now available 2026-03-10T07:50:09.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:50:09 vm05.local ceph-mon[50387]: Migrating agent root cert to cert store 2026-03-10T07:50:09.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:50:09 vm05.local ceph-mon[50387]: from='mgr.24387 ' entity='mgr.vm08.orfpog' 2026-03-10T07:50:09.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:50:09 vm05.local ceph-mon[50387]: Migrating agent root key to cert store 2026-03-10T07:50:09.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:50:09 vm05.local ceph-mon[50387]: from='mgr.24387 ' entity='mgr.vm08.orfpog' 2026-03-10T07:50:09.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:50:09 vm05.local ceph-mon[50387]: Checking for cert/key for grafana.vm05 2026-03-10T07:50:09.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:50:09 vm05.local ceph-mon[50387]: Migrating grafana.vm05 cert to cert store 2026-03-10T07:50:09.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:50:09 vm05.local ceph-mon[50387]: from='mgr.24387 ' entity='mgr.vm08.orfpog' 2026-03-10T07:50:09.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:50:09 vm05.local ceph-mon[50387]: Migrating grafana.vm05 key to 
cert store 2026-03-10T07:50:09.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:50:09 vm05.local ceph-mon[50387]: from='mgr.24387 ' entity='mgr.vm08.orfpog' 2026-03-10T07:50:09.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:50:09 vm05.local ceph-mon[50387]: from='mgr.24387 ' entity='mgr.vm08.orfpog' 2026-03-10T07:50:09.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:50:09 vm05.local ceph-mon[50387]: from='mgr.24387 192.168.123.108:0/503323081' entity='mgr.vm08.orfpog' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T07:50:09.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:50:09 vm05.local ceph-mon[50387]: from='mgr.24387 192.168.123.108:0/503323081' entity='mgr.vm08.orfpog' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T07:50:09.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:50:09 vm05.local ceph-mon[50387]: from='mgr.24387 192.168.123.108:0/503323081' entity='mgr.vm08.orfpog' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/vm08.orfpog/mirror_snapshot_schedule"}]: dispatch 2026-03-10T07:50:09.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:50:09 vm05.local ceph-mon[50387]: from='mgr.24387 ' entity='mgr.vm08.orfpog' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/vm08.orfpog/mirror_snapshot_schedule"}]: dispatch 2026-03-10T07:50:09.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:50:09 vm05.local ceph-mon[50387]: from='mgr.24387 192.168.123.108:0/503323081' entity='mgr.vm08.orfpog' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/vm08.orfpog/trash_purge_schedule"}]: dispatch 2026-03-10T07:50:09.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:50:09 vm05.local ceph-mon[50387]: from='mgr.24387 ' entity='mgr.vm08.orfpog' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/vm08.orfpog/trash_purge_schedule"}]: dispatch 2026-03-10T07:50:09.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:50:09 
vm08.local ceph-mon[59917]: Manager daemon vm08.orfpog is now available 2026-03-10T07:50:09.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:50:09 vm08.local ceph-mon[59917]: Migrating agent root cert to cert store 2026-03-10T07:50:09.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:50:09 vm08.local ceph-mon[59917]: from='mgr.24387 ' entity='mgr.vm08.orfpog' 2026-03-10T07:50:09.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:50:09 vm08.local ceph-mon[59917]: Migrating agent root key to cert store 2026-03-10T07:50:09.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:50:09 vm08.local ceph-mon[59917]: from='mgr.24387 ' entity='mgr.vm08.orfpog' 2026-03-10T07:50:09.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:50:09 vm08.local ceph-mon[59917]: Checking for cert/key for grafana.vm05 2026-03-10T07:50:09.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:50:09 vm08.local ceph-mon[59917]: Migrating grafana.vm05 cert to cert store 2026-03-10T07:50:09.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:50:09 vm08.local ceph-mon[59917]: from='mgr.24387 ' entity='mgr.vm08.orfpog' 2026-03-10T07:50:09.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:50:09 vm08.local ceph-mon[59917]: Migrating grafana.vm05 key to cert store 2026-03-10T07:50:09.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:50:09 vm08.local ceph-mon[59917]: from='mgr.24387 ' entity='mgr.vm08.orfpog' 2026-03-10T07:50:09.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:50:09 vm08.local ceph-mon[59917]: from='mgr.24387 ' entity='mgr.vm08.orfpog' 2026-03-10T07:50:09.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:50:09 vm08.local ceph-mon[59917]: from='mgr.24387 192.168.123.108:0/503323081' entity='mgr.vm08.orfpog' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T07:50:09.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:50:09 vm08.local ceph-mon[59917]: from='mgr.24387 192.168.123.108:0/503323081' 
entity='mgr.vm08.orfpog' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T07:50:09.919 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:50:09 vm08.local ceph-mon[59917]: from='mgr.24387 192.168.123.108:0/503323081' entity='mgr.vm08.orfpog' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/vm08.orfpog/mirror_snapshot_schedule"}]: dispatch 2026-03-10T07:50:09.919 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:50:09 vm08.local ceph-mon[59917]: from='mgr.24387 ' entity='mgr.vm08.orfpog' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/vm08.orfpog/mirror_snapshot_schedule"}]: dispatch 2026-03-10T07:50:09.919 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:50:09 vm08.local ceph-mon[59917]: from='mgr.24387 192.168.123.108:0/503323081' entity='mgr.vm08.orfpog' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/vm08.orfpog/trash_purge_schedule"}]: dispatch 2026-03-10T07:50:09.919 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:50:09 vm08.local ceph-mon[59917]: from='mgr.24387 ' entity='mgr.vm08.orfpog' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/vm08.orfpog/trash_purge_schedule"}]: dispatch 2026-03-10T07:50:11.126 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:50:10 vm05.local ceph-mon[50387]: Deploying cephadm binary to vm05 2026-03-10T07:50:11.126 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:50:10 vm05.local ceph-mon[50387]: Deploying cephadm binary to vm08 2026-03-10T07:50:11.126 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:50:10 vm05.local ceph-mon[50387]: mgrmap e22: vm08.orfpog(active, since 1.22202s) 2026-03-10T07:50:11.126 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:50:10 vm05.local ceph-mon[50387]: pgmap v3: 65 pgs: 65 active+clean; 132 MiB data, 790 MiB used, 119 GiB / 120 GiB avail 2026-03-10T07:50:11.126 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:50:10 vm05.local ceph-mon[50387]: pgmap v4: 65 pgs: 65 active+clean; 132 MiB data, 790 
MiB used, 119 GiB / 120 GiB avail 2026-03-10T07:50:11.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:50:10 vm08.local ceph-mon[59917]: Deploying cephadm binary to vm05 2026-03-10T07:50:11.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:50:10 vm08.local ceph-mon[59917]: Deploying cephadm binary to vm08 2026-03-10T07:50:11.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:50:10 vm08.local ceph-mon[59917]: mgrmap e22: vm08.orfpog(active, since 1.22202s) 2026-03-10T07:50:11.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:50:10 vm08.local ceph-mon[59917]: pgmap v3: 65 pgs: 65 active+clean; 132 MiB data, 790 MiB used, 119 GiB / 120 GiB avail 2026-03-10T07:50:11.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:50:10 vm08.local ceph-mon[59917]: pgmap v4: 65 pgs: 65 active+clean; 132 MiB data, 790 MiB used, 119 GiB / 120 GiB avail 2026-03-10T07:50:12.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:50:11 vm05.local ceph-mon[50387]: [10/Mar/2026:07:50:10] ENGINE Bus STARTING 2026-03-10T07:50:12.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:50:11 vm05.local ceph-mon[50387]: [10/Mar/2026:07:50:10] ENGINE Serving on http://192.168.123.108:8765 2026-03-10T07:50:12.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:50:11 vm05.local ceph-mon[50387]: [10/Mar/2026:07:50:10] ENGINE Serving on https://192.168.123.108:7150 2026-03-10T07:50:12.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:50:11 vm05.local ceph-mon[50387]: [10/Mar/2026:07:50:10] ENGINE Bus STARTED 2026-03-10T07:50:12.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:50:11 vm05.local ceph-mon[50387]: [10/Mar/2026:07:50:10] ENGINE Client ('192.168.123.108', 44938) lost — peer dropped the TLS connection suddenly, during handshake: (6, 'TLS/SSL connection has been closed (EOF) (_ssl.c:1147)') 2026-03-10T07:50:12.418 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:50:11 vm08.local ceph-mon[59917]: [10/Mar/2026:07:50:10] ENGINE Bus STARTING 
2026-03-10T07:50:12.418 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:50:11 vm08.local ceph-mon[59917]: [10/Mar/2026:07:50:10] ENGINE Serving on http://192.168.123.108:8765 2026-03-10T07:50:12.418 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:50:11 vm08.local ceph-mon[59917]: [10/Mar/2026:07:50:10] ENGINE Serving on https://192.168.123.108:7150 2026-03-10T07:50:12.418 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:50:11 vm08.local ceph-mon[59917]: [10/Mar/2026:07:50:10] ENGINE Bus STARTED 2026-03-10T07:50:12.418 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:50:11 vm08.local ceph-mon[59917]: [10/Mar/2026:07:50:10] ENGINE Client ('192.168.123.108', 44938) lost — peer dropped the TLS connection suddenly, during handshake: (6, 'TLS/SSL connection has been closed (EOF) (_ssl.c:1147)') 2026-03-10T07:50:12.989 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:50:12 vm08.local ceph-mon[59917]: mgrmap e23: vm08.orfpog(active, since 3s) 2026-03-10T07:50:12.990 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:50:12 vm08.local ceph-mon[59917]: from='mgr.24387 ' entity='mgr.vm08.orfpog' 2026-03-10T07:50:12.990 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:50:12 vm08.local ceph-mon[59917]: from='mgr.24387 ' entity='mgr.vm08.orfpog' 2026-03-10T07:50:12.990 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:50:12 vm08.local ceph-mon[59917]: pgmap v5: 65 pgs: 65 active+clean; 132 MiB data, 790 MiB used, 119 GiB / 120 GiB avail 2026-03-10T07:50:13.133 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:50:12 vm05.local ceph-mon[50387]: mgrmap e23: vm08.orfpog(active, since 3s) 2026-03-10T07:50:13.133 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:50:12 vm05.local ceph-mon[50387]: from='mgr.24387 ' entity='mgr.vm08.orfpog' 2026-03-10T07:50:13.133 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:50:12 vm05.local ceph-mon[50387]: from='mgr.24387 ' entity='mgr.vm08.orfpog' 2026-03-10T07:50:13.133 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 
07:50:12 vm05.local ceph-mon[50387]: pgmap v5: 65 pgs: 65 active+clean; 132 MiB data, 790 MiB used, 119 GiB / 120 GiB avail 2026-03-10T07:50:13.133 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:50:12 vm05.local ceph-mon[50387]: from='mgr.24387 ' entity='mgr.vm08.orfpog' 2026-03-10T07:50:13.133 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:50:12 vm05.local ceph-mon[50387]: from='mgr.24387 ' entity='mgr.vm08.orfpog' 2026-03-10T07:50:13.418 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:50:12 vm08.local ceph-mon[59917]: from='mgr.24387 ' entity='mgr.vm08.orfpog' 2026-03-10T07:50:13.418 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:50:12 vm08.local ceph-mon[59917]: from='mgr.24387 ' entity='mgr.vm08.orfpog' 2026-03-10T07:50:14.407 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:50:14 vm05.local ceph-mon[50387]: from='mgr.24387 ' entity='mgr.vm08.orfpog' 2026-03-10T07:50:14.407 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:50:14 vm05.local ceph-mon[50387]: from='mgr.24387 ' entity='mgr.vm08.orfpog' 2026-03-10T07:50:14.407 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:50:14 vm05.local ceph-mon[50387]: from='mgr.24387 ' entity='mgr.vm08.orfpog' 2026-03-10T07:50:14.407 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:50:14 vm05.local ceph-mon[50387]: from='mgr.24387 ' entity='mgr.vm08.orfpog' 2026-03-10T07:50:14.407 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:50:14 vm05.local ceph-mon[50387]: from='mgr.24387 192.168.123.108:0/503323081' entity='mgr.vm08.orfpog' cmd=[{"prefix": "config rm", "who": "osd/host:vm08", "name": "osd_memory_target"}]: dispatch 2026-03-10T07:50:14.407 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:50:14 vm05.local ceph-mon[50387]: from='mgr.24387 ' entity='mgr.vm08.orfpog' cmd=[{"prefix": "config rm", "who": "osd/host:vm08", "name": "osd_memory_target"}]: dispatch 2026-03-10T07:50:14.407 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:50:14 vm05.local ceph-mon[50387]: from='mgr.24387 ' 
entity='mgr.vm08.orfpog' 2026-03-10T07:50:14.407 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:50:14 vm05.local ceph-mon[50387]: from='mgr.24387 ' entity='mgr.vm08.orfpog' 2026-03-10T07:50:14.407 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:50:14 vm05.local ceph-mon[50387]: from='mgr.24387 192.168.123.108:0/503323081' entity='mgr.vm08.orfpog' cmd=[{"prefix": "config rm", "who": "osd/host:vm05", "name": "osd_memory_target"}]: dispatch 2026-03-10T07:50:14.407 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:50:14 vm05.local ceph-mon[50387]: from='mgr.24387 ' entity='mgr.vm08.orfpog' cmd=[{"prefix": "config rm", "who": "osd/host:vm05", "name": "osd_memory_target"}]: dispatch 2026-03-10T07:50:14.407 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:50:14 vm05.local ceph-mon[50387]: from='mgr.24387 192.168.123.108:0/503323081' entity='mgr.vm08.orfpog' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T07:50:14.407 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:50:14 vm05.local ceph-mon[50387]: from='mgr.24387 192.168.123.108:0/503323081' entity='mgr.vm08.orfpog' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T07:50:14.418 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:50:14 vm08.local ceph-mon[59917]: from='mgr.24387 ' entity='mgr.vm08.orfpog' 2026-03-10T07:50:14.418 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:50:14 vm08.local ceph-mon[59917]: from='mgr.24387 ' entity='mgr.vm08.orfpog' 2026-03-10T07:50:14.418 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:50:14 vm08.local ceph-mon[59917]: from='mgr.24387 ' entity='mgr.vm08.orfpog' 2026-03-10T07:50:14.418 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:50:14 vm08.local ceph-mon[59917]: from='mgr.24387 ' entity='mgr.vm08.orfpog' 2026-03-10T07:50:14.418 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:50:14 vm08.local ceph-mon[59917]: from='mgr.24387 192.168.123.108:0/503323081' entity='mgr.vm08.orfpog' cmd=[{"prefix": "config 
rm", "who": "osd/host:vm08", "name": "osd_memory_target"}]: dispatch 2026-03-10T07:50:14.418 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:50:14 vm08.local ceph-mon[59917]: from='mgr.24387 ' entity='mgr.vm08.orfpog' cmd=[{"prefix": "config rm", "who": "osd/host:vm08", "name": "osd_memory_target"}]: dispatch 2026-03-10T07:50:14.418 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:50:14 vm08.local ceph-mon[59917]: from='mgr.24387 ' entity='mgr.vm08.orfpog' 2026-03-10T07:50:14.418 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:50:14 vm08.local ceph-mon[59917]: from='mgr.24387 ' entity='mgr.vm08.orfpog' 2026-03-10T07:50:14.418 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:50:14 vm08.local ceph-mon[59917]: from='mgr.24387 192.168.123.108:0/503323081' entity='mgr.vm08.orfpog' cmd=[{"prefix": "config rm", "who": "osd/host:vm05", "name": "osd_memory_target"}]: dispatch 2026-03-10T07:50:14.418 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:50:14 vm08.local ceph-mon[59917]: from='mgr.24387 ' entity='mgr.vm08.orfpog' cmd=[{"prefix": "config rm", "who": "osd/host:vm05", "name": "osd_memory_target"}]: dispatch 2026-03-10T07:50:14.418 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:50:14 vm08.local ceph-mon[59917]: from='mgr.24387 192.168.123.108:0/503323081' entity='mgr.vm08.orfpog' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T07:50:14.418 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:50:14 vm08.local ceph-mon[59917]: from='mgr.24387 192.168.123.108:0/503323081' entity='mgr.vm08.orfpog' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T07:50:15.441 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:50:15 vm05.local ceph-mon[50387]: Updating vm05:/etc/ceph/ceph.conf 2026-03-10T07:50:15.441 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:50:15 vm05.local ceph-mon[50387]: Updating vm08:/etc/ceph/ceph.conf 2026-03-10T07:50:15.441 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:50:15 
vm05.local ceph-mon[50387]: Updating vm08:/var/lib/ceph/12e9780e-1c55-11f1-8896-79f7c2e9b508/config/ceph.conf 2026-03-10T07:50:15.441 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:50:15 vm05.local ceph-mon[50387]: Updating vm05:/var/lib/ceph/12e9780e-1c55-11f1-8896-79f7c2e9b508/config/ceph.conf 2026-03-10T07:50:15.441 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:50:15 vm05.local ceph-mon[50387]: Updating vm08:/etc/ceph/ceph.client.admin.keyring 2026-03-10T07:50:15.441 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:50:15 vm05.local ceph-mon[50387]: Updating vm05:/etc/ceph/ceph.client.admin.keyring 2026-03-10T07:50:15.441 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:50:15 vm05.local ceph-mon[50387]: Updating vm08:/var/lib/ceph/12e9780e-1c55-11f1-8896-79f7c2e9b508/config/ceph.client.admin.keyring 2026-03-10T07:50:15.441 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:50:15 vm05.local ceph-mon[50387]: Updating vm05:/var/lib/ceph/12e9780e-1c55-11f1-8896-79f7c2e9b508/config/ceph.client.admin.keyring 2026-03-10T07:50:15.441 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:50:15 vm05.local ceph-mon[50387]: from='mgr.24387 ' entity='mgr.vm08.orfpog' 2026-03-10T07:50:15.441 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:50:15 vm05.local ceph-mon[50387]: from='mgr.24387 ' entity='mgr.vm08.orfpog' 2026-03-10T07:50:15.441 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:50:15 vm05.local ceph-mon[50387]: from='mgr.24387 ' entity='mgr.vm08.orfpog' 2026-03-10T07:50:15.441 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:50:15 vm05.local ceph-mon[50387]: from='mgr.24387 ' entity='mgr.vm08.orfpog' 2026-03-10T07:50:15.441 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:50:15 vm05.local ceph-mon[50387]: from='mgr.24387 ' entity='mgr.vm08.orfpog' 2026-03-10T07:50:15.441 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:50:15 vm05.local ceph-mon[50387]: Reconfiguring prometheus.vm05 (dependencies changed)... 
2026-03-10T07:50:15.441 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:50:15 vm05.local ceph-mon[50387]: pgmap v6: 65 pgs: 65 active+clean; 132 MiB data, 790 MiB used, 119 GiB / 120 GiB avail 2026-03-10T07:50:15.668 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:50:15 vm08.local ceph-mon[59917]: Updating vm05:/etc/ceph/ceph.conf 2026-03-10T07:50:15.668 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:50:15 vm08.local ceph-mon[59917]: Updating vm08:/etc/ceph/ceph.conf 2026-03-10T07:50:15.668 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:50:15 vm08.local ceph-mon[59917]: Updating vm08:/var/lib/ceph/12e9780e-1c55-11f1-8896-79f7c2e9b508/config/ceph.conf 2026-03-10T07:50:15.668 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:50:15 vm08.local ceph-mon[59917]: Updating vm05:/var/lib/ceph/12e9780e-1c55-11f1-8896-79f7c2e9b508/config/ceph.conf 2026-03-10T07:50:15.668 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:50:15 vm08.local ceph-mon[59917]: Updating vm08:/etc/ceph/ceph.client.admin.keyring 2026-03-10T07:50:15.668 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:50:15 vm08.local ceph-mon[59917]: Updating vm05:/etc/ceph/ceph.client.admin.keyring 2026-03-10T07:50:15.668 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:50:15 vm08.local ceph-mon[59917]: Updating vm08:/var/lib/ceph/12e9780e-1c55-11f1-8896-79f7c2e9b508/config/ceph.client.admin.keyring 2026-03-10T07:50:15.668 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:50:15 vm08.local ceph-mon[59917]: Updating vm05:/var/lib/ceph/12e9780e-1c55-11f1-8896-79f7c2e9b508/config/ceph.client.admin.keyring 2026-03-10T07:50:15.668 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:50:15 vm08.local ceph-mon[59917]: from='mgr.24387 ' entity='mgr.vm08.orfpog' 2026-03-10T07:50:15.668 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:50:15 vm08.local ceph-mon[59917]: from='mgr.24387 ' entity='mgr.vm08.orfpog' 2026-03-10T07:50:15.668 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:50:15 
vm08.local ceph-mon[59917]: from='mgr.24387 ' entity='mgr.vm08.orfpog' 2026-03-10T07:50:15.668 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:50:15 vm08.local ceph-mon[59917]: from='mgr.24387 ' entity='mgr.vm08.orfpog' 2026-03-10T07:50:15.668 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:50:15 vm08.local ceph-mon[59917]: from='mgr.24387 ' entity='mgr.vm08.orfpog' 2026-03-10T07:50:15.668 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:50:15 vm08.local ceph-mon[59917]: Reconfiguring prometheus.vm05 (dependencies changed)... 2026-03-10T07:50:15.668 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:50:15 vm08.local ceph-mon[59917]: pgmap v6: 65 pgs: 65 active+clean; 132 MiB data, 790 MiB used, 119 GiB / 120 GiB avail 2026-03-10T07:50:16.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:50:16 vm05.local ceph-mon[50387]: Reconfiguring daemon prometheus.vm05 on vm05 2026-03-10T07:50:16.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:50:16 vm08.local ceph-mon[59917]: Reconfiguring daemon prometheus.vm05 on vm05 2026-03-10T07:50:17.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:50:17 vm05.local ceph-mon[50387]: pgmap v7: 65 pgs: 65 active+clean; 142 MiB data, 839 MiB used, 119 GiB / 120 GiB avail; 26 KiB/s rd, 1.6 MiB/s wr, 103 op/s 2026-03-10T07:50:17.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:50:17 vm05.local ceph-mon[50387]: from='mgr.24387 ' entity='mgr.vm08.orfpog' 2026-03-10T07:50:17.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:50:17 vm05.local ceph-mon[50387]: from='mgr.24387 ' entity='mgr.vm08.orfpog' 2026-03-10T07:50:17.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:50:17 vm05.local ceph-mon[50387]: from='mgr.24387 192.168.123.108:0/503323081' entity='mgr.vm08.orfpog' cmd=[{"prefix": "dashboard get-prometheus-api-host"}]: dispatch 2026-03-10T07:50:17.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:50:17 vm05.local ceph-mon[50387]: from='mon.? -' entity='mon.' 
cmd=[{"prefix": "dashboard get-prometheus-api-host"}]: dispatch 2026-03-10T07:50:17.908 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:50:17 vm05.local ceph-mon[50387]: from='mgr.24387 192.168.123.108:0/503323081' entity='mgr.vm08.orfpog' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T07:50:17.908 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:50:17 vm05.local ceph-mon[50387]: Upgrade: Need to upgrade myself (mgr.vm08.orfpog) 2026-03-10T07:50:17.908 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:50:17 vm05.local ceph-mon[50387]: Upgrade: Need to upgrade myself (mgr.vm08.orfpog) 2026-03-10T07:50:17.908 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:50:17 vm05.local ceph-mon[50387]: Upgrade: Updating mgr.vm05.blexke 2026-03-10T07:50:17.908 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:50:17 vm05.local ceph-mon[50387]: from='mgr.24387 ' entity='mgr.vm08.orfpog' 2026-03-10T07:50:17.908 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:50:17 vm05.local ceph-mon[50387]: from='mgr.24387 192.168.123.108:0/503323081' entity='mgr.vm08.orfpog' cmd=[{"prefix": "auth get-or-create", "entity": "mgr.vm05.blexke", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]: dispatch 2026-03-10T07:50:17.908 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:50:17 vm05.local ceph-mon[50387]: from='mgr.24387 ' entity='mgr.vm08.orfpog' cmd=[{"prefix": "auth get-or-create", "entity": "mgr.vm05.blexke", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]: dispatch 2026-03-10T07:50:17.908 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:50:17 vm05.local ceph-mon[50387]: from='mgr.24387 192.168.123.108:0/503323081' entity='mgr.vm08.orfpog' cmd=[{"prefix": "mgr services"}]: dispatch 2026-03-10T07:50:17.908 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:50:17 vm05.local ceph-mon[50387]: from='mgr.24387 192.168.123.108:0/503323081' entity='mgr.vm08.orfpog' cmd=[{"prefix": "config generate-minimal-conf"}]: 
dispatch 2026-03-10T07:50:17.908 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:50:17 vm05.local ceph-mon[50387]: Deploying daemon mgr.vm05.blexke on vm05 2026-03-10T07:50:17.908 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:50:17 vm05.local ceph-mon[50387]: Standby manager daemon vm05.blexke started 2026-03-10T07:50:17.908 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:50:17 vm05.local ceph-mon[50387]: from='mgr.? 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/vm05.blexke/crt"}]: dispatch 2026-03-10T07:50:17.908 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:50:17 vm05.local ceph-mon[50387]: from='mgr.? 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/crt"}]: dispatch 2026-03-10T07:50:17.908 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:50:17 vm05.local ceph-mon[50387]: from='mgr.? 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/vm05.blexke/key"}]: dispatch 2026-03-10T07:50:17.908 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:50:17 vm05.local ceph-mon[50387]: from='mgr.? 
192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/key"}]: dispatch 2026-03-10T07:50:17.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:50:17 vm08.local ceph-mon[59917]: pgmap v7: 65 pgs: 65 active+clean; 142 MiB data, 839 MiB used, 119 GiB / 120 GiB avail; 26 KiB/s rd, 1.6 MiB/s wr, 103 op/s 2026-03-10T07:50:17.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:50:17 vm08.local ceph-mon[59917]: from='mgr.24387 ' entity='mgr.vm08.orfpog' 2026-03-10T07:50:17.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:50:17 vm08.local ceph-mon[59917]: from='mgr.24387 ' entity='mgr.vm08.orfpog' 2026-03-10T07:50:17.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:50:17 vm08.local ceph-mon[59917]: from='mgr.24387 192.168.123.108:0/503323081' entity='mgr.vm08.orfpog' cmd=[{"prefix": "dashboard get-prometheus-api-host"}]: dispatch 2026-03-10T07:50:17.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:50:17 vm08.local ceph-mon[59917]: from='mon.? -' entity='mon.' 
cmd=[{"prefix": "dashboard get-prometheus-api-host"}]: dispatch 2026-03-10T07:50:17.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:50:17 vm08.local ceph-mon[59917]: from='mgr.24387 192.168.123.108:0/503323081' entity='mgr.vm08.orfpog' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T07:50:17.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:50:17 vm08.local ceph-mon[59917]: Upgrade: Need to upgrade myself (mgr.vm08.orfpog) 2026-03-10T07:50:17.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:50:17 vm08.local ceph-mon[59917]: Upgrade: Need to upgrade myself (mgr.vm08.orfpog) 2026-03-10T07:50:17.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:50:17 vm08.local ceph-mon[59917]: Upgrade: Updating mgr.vm05.blexke 2026-03-10T07:50:17.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:50:17 vm08.local ceph-mon[59917]: from='mgr.24387 ' entity='mgr.vm08.orfpog' 2026-03-10T07:50:17.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:50:17 vm08.local ceph-mon[59917]: from='mgr.24387 192.168.123.108:0/503323081' entity='mgr.vm08.orfpog' cmd=[{"prefix": "auth get-or-create", "entity": "mgr.vm05.blexke", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]: dispatch 2026-03-10T07:50:17.919 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:50:17 vm08.local ceph-mon[59917]: from='mgr.24387 ' entity='mgr.vm08.orfpog' cmd=[{"prefix": "auth get-or-create", "entity": "mgr.vm05.blexke", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]: dispatch 2026-03-10T07:50:17.919 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:50:17 vm08.local ceph-mon[59917]: from='mgr.24387 192.168.123.108:0/503323081' entity='mgr.vm08.orfpog' cmd=[{"prefix": "mgr services"}]: dispatch 2026-03-10T07:50:17.919 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:50:17 vm08.local ceph-mon[59917]: from='mgr.24387 192.168.123.108:0/503323081' entity='mgr.vm08.orfpog' cmd=[{"prefix": "config generate-minimal-conf"}]: 
dispatch 2026-03-10T07:50:17.919 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:50:17 vm08.local ceph-mon[59917]: Deploying daemon mgr.vm05.blexke on vm05 2026-03-10T07:50:17.919 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:50:17 vm08.local ceph-mon[59917]: Standby manager daemon vm05.blexke started 2026-03-10T07:50:17.919 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:50:17 vm08.local ceph-mon[59917]: from='mgr.? 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/vm05.blexke/crt"}]: dispatch 2026-03-10T07:50:17.919 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:50:17 vm08.local ceph-mon[59917]: from='mgr.? 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/crt"}]: dispatch 2026-03-10T07:50:17.919 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:50:17 vm08.local ceph-mon[59917]: from='mgr.? 192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/vm05.blexke/key"}]: dispatch 2026-03-10T07:50:17.919 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:50:17 vm08.local ceph-mon[59917]: from='mgr.? 
192.168.123.105:0/2' entity='mgr.vm05.blexke' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/key"}]: dispatch 2026-03-10T07:50:19.408 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:50:19 vm05.local ceph-mon[50387]: mgrmap e24: vm08.orfpog(active, since 9s), standbys: vm05.blexke 2026-03-10T07:50:19.408 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:50:19 vm05.local ceph-mon[50387]: from='mgr.24387 192.168.123.108:0/503323081' entity='mgr.vm08.orfpog' cmd=[{"prefix": "mgr metadata", "who": "vm05.blexke", "id": "vm05.blexke"}]: dispatch 2026-03-10T07:50:19.408 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:50:19 vm05.local ceph-mon[50387]: pgmap v8: 65 pgs: 65 active+clean; 142 MiB data, 839 MiB used, 119 GiB / 120 GiB avail; 20 KiB/s rd, 1.2 MiB/s wr, 80 op/s 2026-03-10T07:50:19.408 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:50:19 vm05.local ceph-mon[50387]: from='mgr.24387 ' entity='mgr.vm08.orfpog' 2026-03-10T07:50:19.408 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:50:19 vm05.local ceph-mon[50387]: from='mgr.24387 ' entity='mgr.vm08.orfpog' 2026-03-10T07:50:19.408 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:50:19 vm05.local ceph-mon[50387]: from='mgr.24387 192.168.123.108:0/503323081' entity='mgr.vm08.orfpog' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T07:50:19.668 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:50:19 vm08.local ceph-mon[59917]: mgrmap e24: vm08.orfpog(active, since 9s), standbys: vm05.blexke 2026-03-10T07:50:19.668 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:50:19 vm08.local ceph-mon[59917]: from='mgr.24387 192.168.123.108:0/503323081' entity='mgr.vm08.orfpog' cmd=[{"prefix": "mgr metadata", "who": "vm05.blexke", "id": "vm05.blexke"}]: dispatch 2026-03-10T07:50:19.668 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:50:19 vm08.local ceph-mon[59917]: pgmap v8: 65 pgs: 65 active+clean; 142 MiB data, 839 MiB used, 119 GiB / 120 GiB avail; 20 KiB/s rd, 1.2 
MiB/s wr, 80 op/s 2026-03-10T07:50:19.668 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:50:19 vm08.local ceph-mon[59917]: from='mgr.24387 ' entity='mgr.vm08.orfpog' 2026-03-10T07:50:19.668 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:50:19 vm08.local ceph-mon[59917]: from='mgr.24387 ' entity='mgr.vm08.orfpog' 2026-03-10T07:50:19.668 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:50:19 vm08.local ceph-mon[59917]: from='mgr.24387 192.168.123.108:0/503323081' entity='mgr.vm08.orfpog' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T07:50:21.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:50:21 vm05.local ceph-mon[50387]: pgmap v9: 65 pgs: 65 active+clean; 152 MiB data, 905 MiB used, 119 GiB / 120 GiB avail; 16 KiB/s rd, 2.0 MiB/s wr, 168 op/s 2026-03-10T07:50:21.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:50:21 vm08.local ceph-mon[59917]: pgmap v9: 65 pgs: 65 active+clean; 152 MiB data, 905 MiB used, 119 GiB / 120 GiB avail; 16 KiB/s rd, 2.0 MiB/s wr, 168 op/s 2026-03-10T07:50:23.226 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:50:22 vm05.local ceph-mon[50387]: from='mgr.24387 ' entity='mgr.vm08.orfpog' 2026-03-10T07:50:23.226 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:50:22 vm05.local ceph-mon[50387]: from='mgr.24387 ' entity='mgr.vm08.orfpog' 2026-03-10T07:50:23.226 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:50:22 vm05.local ceph-mon[50387]: pgmap v10: 65 pgs: 65 active+clean; 152 MiB data, 905 MiB used, 119 GiB / 120 GiB avail; 15 KiB/s rd, 1.8 MiB/s wr, 151 op/s 2026-03-10T07:50:23.226 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:50:22 vm05.local ceph-mon[50387]: from='mgr.24387 ' entity='mgr.vm08.orfpog' 2026-03-10T07:50:23.226 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:50:22 vm05.local ceph-mon[50387]: from='mgr.24387 ' entity='mgr.vm08.orfpog' 2026-03-10T07:50:23.418 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:50:22 vm08.local ceph-mon[59917]: 
from='mgr.24387 ' entity='mgr.vm08.orfpog' 2026-03-10T07:50:23.418 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:50:22 vm08.local ceph-mon[59917]: from='mgr.24387 ' entity='mgr.vm08.orfpog' 2026-03-10T07:50:23.418 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:50:22 vm08.local ceph-mon[59917]: pgmap v10: 65 pgs: 65 active+clean; 152 MiB data, 905 MiB used, 119 GiB / 120 GiB avail; 15 KiB/s rd, 1.8 MiB/s wr, 151 op/s 2026-03-10T07:50:23.418 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:50:22 vm08.local ceph-mon[59917]: from='mgr.24387 ' entity='mgr.vm08.orfpog' 2026-03-10T07:50:23.418 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:50:22 vm08.local ceph-mon[59917]: from='mgr.24387 ' entity='mgr.vm08.orfpog' 2026-03-10T07:50:24.179 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:24.177+0000 7f3e5d946700 1 -- 192.168.123.105:0/3534366287 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3e58104340 msgr2=0x7f3e581047a0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:50:24.179 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:24.177+0000 7f3e5d946700 1 --2- 192.168.123.105:0/3534366287 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3e58104340 0x7f3e581047a0 secure :-1 s=READY pgs=301 cs=0 l=1 rev1=1 crypto rx=0x7f3e48009b00 tx=0x7f3e48009e10 comp rx=0 tx=0).stop 2026-03-10T07:50:24.179 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:24.178+0000 7f3e5d946700 1 -- 192.168.123.105:0/3534366287 shutdown_connections 2026-03-10T07:50:24.179 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:24.178+0000 7f3e5d946700 1 --2- 192.168.123.105:0/3534366287 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3e58104340 0x7f3e581047a0 unknown :-1 s=CLOSED pgs=301 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:50:24.179 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:24.178+0000 7f3e5d946700 1 --2- 
192.168.123.105:0/3534366287 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f3e58103140 0x7f3e58103560 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:50:24.179 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:24.178+0000 7f3e5d946700 1 -- 192.168.123.105:0/3534366287 >> 192.168.123.105:0/3534366287 conn(0x7f3e580fe6c0 msgr2=0x7f3e58100b20 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:50:24.180 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:24.178+0000 7f3e5d946700 1 -- 192.168.123.105:0/3534366287 shutdown_connections 2026-03-10T07:50:24.180 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:24.178+0000 7f3e5d946700 1 -- 192.168.123.105:0/3534366287 wait complete. 2026-03-10T07:50:24.180 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:24.179+0000 7f3e5d946700 1 Processor -- start 2026-03-10T07:50:24.185 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:24.179+0000 7f3e5d946700 1 -- start start 2026-03-10T07:50:24.185 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:24.180+0000 7f3e5d946700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3e58103140 0x7f3e581989c0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:50:24.185 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:24.180+0000 7f3e5d946700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f3e58104340 0x7f3e58198f00 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:50:24.185 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:24.180+0000 7f3e5d946700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f3e58199490 con 0x7f3e58103140 2026-03-10T07:50:24.187 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:24.180+0000 7f3e5d946700 1 -- --> 
[v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f3e581995d0 con 0x7f3e58104340 2026-03-10T07:50:24.196 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:24.181+0000 7f3e56ffd700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3e58103140 0x7f3e581989c0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:50:24.196 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:24.181+0000 7f3e56ffd700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3e58103140 0x7f3e581989c0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.105:51216/0 (socket says 192.168.123.105:51216) 2026-03-10T07:50:24.196 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:24.181+0000 7f3e56ffd700 1 -- 192.168.123.105:0/1572472857 learned_addr learned my addr 192.168.123.105:0/1572472857 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T07:50:24.196 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:24.182+0000 7f3e56ffd700 1 -- 192.168.123.105:0/1572472857 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f3e58104340 msgr2=0x7f3e58198f00 unknown :-1 s=STATE_CONNECTING l=1).mark_down 2026-03-10T07:50:24.196 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:24.182+0000 7f3e56ffd700 1 --2- 192.168.123.105:0/1572472857 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f3e58104340 0x7f3e58198f00 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:50:24.196 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:24.182+0000 7f3e56ffd700 1 -- 192.168.123.105:0/1572472857 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f3e480097e0 con 
0x7f3e58103140 2026-03-10T07:50:24.196 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:24.182+0000 7f3e56ffd700 1 --2- 192.168.123.105:0/1572472857 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3e58103140 0x7f3e581989c0 secure :-1 s=READY pgs=302 cs=0 l=1 rev1=1 crypto rx=0x7f3e4000c910 tx=0x7f3e4000ccd0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:50:24.196 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:24.183+0000 7f3e5c944700 1 -- 192.168.123.105:0/1572472857 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f3e400041d0 con 0x7f3e58103140 2026-03-10T07:50:24.196 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:24.184+0000 7f3e5d946700 1 -- 192.168.123.105:0/1572472857 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f3e5806a8d0 con 0x7f3e58103140 2026-03-10T07:50:24.196 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:24.184+0000 7f3e5d946700 1 -- 192.168.123.105:0/1572472857 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f3e5806adf0 con 0x7f3e58103140 2026-03-10T07:50:24.196 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:24.184+0000 7f3e5c944700 1 -- 192.168.123.105:0/1572472857 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f3e40007de0 con 0x7f3e58103140 2026-03-10T07:50:24.196 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:24.184+0000 7f3e5c944700 1 -- 192.168.123.105:0/1572472857 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f3e40003e80 con 0x7f3e58103140 2026-03-10T07:50:24.196 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:24.186+0000 7f3e5d946700 1 -- 192.168.123.105:0/1572472857 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": 
"get_command_descriptions"} v 0) v1 -- 0x7f3e58066e80 con 0x7f3e58103140 2026-03-10T07:50:24.196 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:24.190+0000 7f3e5c944700 1 -- 192.168.123.105:0/1572472857 <== mon.0 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 24) v1 ==== 95238+0+0 (secure 0 0 0) 0x7f3e4000f4b0 con 0x7f3e58103140 2026-03-10T07:50:24.196 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:24.190+0000 7f3e5c944700 1 --2- 192.168.123.105:0/1572472857 >> [v2:192.168.123.108:6828/2231834414,v1:192.168.123.108:6829/2231834414] conn(0x7f3e4407a710 0x7f3e4407cbd0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:50:24.196 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:24.190+0000 7f3e5c944700 1 -- 192.168.123.105:0/1572472857 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(40..40 src has 1..40) v4 ==== 5534+0+0 (secure 0 0 0) 0x7f3e40093950 con 0x7f3e58103140 2026-03-10T07:50:24.196 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:24.193+0000 7f3e567fc700 1 --2- 192.168.123.105:0/1572472857 >> [v2:192.168.123.108:6828/2231834414,v1:192.168.123.108:6829/2231834414] conn(0x7f3e4407a710 0x7f3e4407cbd0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:50:24.196 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:24.194+0000 7f3e567fc700 1 --2- 192.168.123.105:0/1572472857 >> [v2:192.168.123.108:6828/2231834414,v1:192.168.123.108:6829/2231834414] conn(0x7f3e4407a710 0x7f3e4407cbd0 secure :-1 s=READY pgs=19 cs=0 l=1 rev1=1 crypto rx=0x7f3e48009fd0 tx=0x7f3e48005fb0 comp rx=0 tx=0).ready entity=mgr.24387 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:50:24.203 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:24.201+0000 7f3e5c944700 1 -- 192.168.123.105:0/1572472857 <== mon.0 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": 
"get_command_descriptions"}]=0 v0) v1 ==== 72+0+191549 (secure 0 0 0) 0x7f3e4005bef0 con 0x7f3e58103140 2026-03-10T07:50:24.421 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:50:24 vm05.local ceph-mon[50387]: from='mgr.24387 192.168.123.108:0/503323081' entity='mgr.vm08.orfpog' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T07:50:24.491 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:24.488+0000 7f3e5d946700 1 -- 192.168.123.105:0/1572472857 --> [v2:192.168.123.108:6828/2231834414,v1:192.168.123.108:6829/2231834414] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7f3e58108c90 con 0x7f3e4407a710 2026-03-10T07:50:24.493 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:24.491+0000 7f3e5c944700 1 -- 192.168.123.105:0/1572472857 <== mgr.24387 v2:192.168.123.108:6828/2231834414 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+360 (secure 0 0 0) 0x7f3e58108c90 con 0x7f3e4407a710 2026-03-10T07:50:24.503 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:24.501+0000 7f3e5d946700 1 -- 192.168.123.105:0/1572472857 >> [v2:192.168.123.108:6828/2231834414,v1:192.168.123.108:6829/2231834414] conn(0x7f3e4407a710 msgr2=0x7f3e4407cbd0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:50:24.503 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:24.501+0000 7f3e5d946700 1 --2- 192.168.123.105:0/1572472857 >> [v2:192.168.123.108:6828/2231834414,v1:192.168.123.108:6829/2231834414] conn(0x7f3e4407a710 0x7f3e4407cbd0 secure :-1 s=READY pgs=19 cs=0 l=1 rev1=1 crypto rx=0x7f3e48009fd0 tx=0x7f3e48005fb0 comp rx=0 tx=0).stop 2026-03-10T07:50:24.503 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:24.501+0000 7f3e5d946700 1 -- 192.168.123.105:0/1572472857 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3e58103140 msgr2=0x7f3e581989c0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:50:24.503 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:24.501+0000 7f3e5d946700 1 --2- 192.168.123.105:0/1572472857 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3e58103140 0x7f3e581989c0 secure :-1 s=READY pgs=302 cs=0 l=1 rev1=1 crypto rx=0x7f3e4000c910 tx=0x7f3e4000ccd0 comp rx=0 tx=0).stop 2026-03-10T07:50:24.504 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:24.503+0000 7f3e5d946700 1 -- 192.168.123.105:0/1572472857 shutdown_connections 2026-03-10T07:50:24.504 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:24.503+0000 7f3e5d946700 1 --2- 192.168.123.105:0/1572472857 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3e58103140 0x7f3e581989c0 unknown :-1 s=CLOSED pgs=302 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:50:24.504 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:24.503+0000 7f3e5d946700 1 --2- 192.168.123.105:0/1572472857 >> [v2:192.168.123.108:6828/2231834414,v1:192.168.123.108:6829/2231834414] conn(0x7f3e4407a710 0x7f3e4407cbd0 unknown :-1 s=CLOSED pgs=19 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:50:24.504 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:24.503+0000 7f3e5d946700 1 --2- 192.168.123.105:0/1572472857 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f3e58104340 0x7f3e58198f00 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:50:24.504 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:24.503+0000 7f3e5d946700 1 -- 192.168.123.105:0/1572472857 >> 192.168.123.105:0/1572472857 conn(0x7f3e580fe6c0 msgr2=0x7f3e58107570 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:50:24.505 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:24.503+0000 7f3e5d946700 1 -- 192.168.123.105:0/1572472857 shutdown_connections 2026-03-10T07:50:24.505 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:24.503+0000 7f3e5d946700 1 -- 
192.168.123.105:0/1572472857 wait complete.
2026-03-10T07:50:24.529 INFO:teuthology.orchestra.run.vm05.stdout:true
2026-03-10T07:50:24.668 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:50:24 vm08.local ceph-mon[59917]: from='mgr.24387 192.168.123.108:0/503323081' entity='mgr.vm08.orfpog' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
2026-03-10T07:50:24.691 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:24.688+0000 7f090e5e6700 1 -- 192.168.123.105:0/490829203 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f0908075a40 msgr2=0x7f0908077ed0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T07:50:24.691 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:24.688+0000 7f090e5e6700 1 --2- 192.168.123.105:0/490829203 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f0908075a40 0x7f0908077ed0 secure :-1 s=READY pgs=303 cs=0 l=1 rev1=1 crypto rx=0x7f090000cd40 tx=0x7f090000a320 comp rx=0 tx=0).stop
2026-03-10T07:50:24.692 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:24.688+0000 7f090e5e6700 1 -- 192.168.123.105:0/490829203 shutdown_connections
2026-03-10T07:50:24.692 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:24.688+0000 7f090e5e6700 1 --2- 192.168.123.105:0/490829203 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f0908075a40 0x7f0908077ed0 unknown :-1 s=CLOSED pgs=303 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T07:50:24.692 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:24.688+0000 7f090e5e6700 1 --2- 192.168.123.105:0/490829203 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f0908072b50 0x7f0908072f70 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T07:50:24.692 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:24.688+0000 7f090e5e6700 1 -- 192.168.123.105:0/490829203 >> 192.168.123.105:0/490829203 conn(0x7f090806dae0 msgr2=0x7f090806ff40 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-10T07:50:24.692 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:24.688+0000 7f090e5e6700 1 -- 192.168.123.105:0/490829203 shutdown_connections
2026-03-10T07:50:24.692 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:24.688+0000 7f090e5e6700 1 -- 192.168.123.105:0/490829203 wait complete.
2026-03-10T07:50:24.692 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:24.689+0000 7f090e5e6700 1 Processor -- start
2026-03-10T07:50:24.692 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:24.689+0000 7f090e5e6700 1 -- start start
2026-03-10T07:50:24.692 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:24.689+0000 7f090e5e6700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f0908072b50 0x7f0908083090 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T07:50:24.692 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:24.689+0000 7f090e5e6700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f09080835d0 0x7f09081b3120 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T07:50:24.692 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:24.689+0000 7f090e5e6700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f0908083ae0 con 0x7f0908072b50
2026-03-10T07:50:24.692 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:24.689+0000 7f090e5e6700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f0908083c50 con 0x7f09080835d0
2026-03-10T07:50:24.692 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:24.691+0000 7f0907fff700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f0908072b50 0x7f0908083090 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-10T07:50:24.692 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:24.691+0000 7f0907fff700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f0908072b50 0x7f0908083090 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.105:51234/0 (socket says 192.168.123.105:51234)
2026-03-10T07:50:24.692 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:24.691+0000 7f0907fff700 1 -- 192.168.123.105:0/2690103773 learned_addr learned my addr 192.168.123.105:0/2690103773 (peer_addr_for_me v2:192.168.123.105:0/0)
2026-03-10T07:50:24.692 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:24.691+0000 7f09077fe700 1 --2- 192.168.123.105:0/2690103773 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f09080835d0 0x7f09081b3120 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-10T07:50:24.693 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:24.692+0000 7f09077fe700 1 -- 192.168.123.105:0/2690103773 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f0908072b50 msgr2=0x7f0908083090 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T07:50:24.693 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:24.692+0000 7f09077fe700 1 --2- 192.168.123.105:0/2690103773 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f0908072b50 0x7f0908083090 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T07:50:24.693 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:24.692+0000 7f09077fe700 1 -- 192.168.123.105:0/2690103773 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f090000c9f0 con 0x7f09080835d0
2026-03-10T07:50:24.694 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:24.692+0000 7f09077fe700 1 --2- 192.168.123.105:0/2690103773 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f09080835d0 0x7f09081b3120 secure :-1 s=READY pgs=84 cs=0 l=1 rev1=1 crypto rx=0x7f090000cd40 tx=0x7f090000bb10 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-10T07:50:24.694 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:24.692+0000 7f09057fa700 1 -- 192.168.123.105:0/2690103773 <== mon.1 v2:192.168.123.108:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f0900004420 con 0x7f09080835d0
2026-03-10T07:50:24.694 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:24.692+0000 7f090e5e6700 1 -- 192.168.123.105:0/2690103773 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f09081b3660 con 0x7f09080835d0
2026-03-10T07:50:24.694 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:24.692+0000 7f090e5e6700 1 -- 192.168.123.105:0/2690103773 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f09081b3bb0 con 0x7f09080835d0
2026-03-10T07:50:24.694 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:24.693+0000 7f09057fa700 1 -- 192.168.123.105:0/2690103773 <== mon.1 v2:192.168.123.108:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f0900004030 con 0x7f09080835d0
2026-03-10T07:50:24.695 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:24.693+0000 7f09057fa700 1 -- 192.168.123.105:0/2690103773 <== mon.1 v2:192.168.123.108:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f090000c040 con 0x7f09080835d0
2026-03-10T07:50:24.696 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:24.695+0000 7f09057fa700 1 -- 192.168.123.105:0/2690103773 <== mon.1 v2:192.168.123.108:3300/0 4 ==== mgrmap(e 24) v1 ==== 95238+0+0 (secure 0 0 0) 0x7f090000c1a0 con 0x7f09080835d0
2026-03-10T07:50:24.696 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:24.695+0000 7f09057fa700 1 --2- 192.168.123.105:0/2690103773 >> [v2:192.168.123.108:6828/2231834414,v1:192.168.123.108:6829/2231834414] conn(0x7f08f0071e80 0x7f08f0074340 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T07:50:24.698 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:24.695+0000 7f09057fa700 1 -- 192.168.123.105:0/2690103773 <== mon.1 v2:192.168.123.108:3300/0 5 ==== osd_map(40..40 src has 1..40) v4 ==== 5534+0+0 (secure 0 0 0) 0x7f0900022030 con 0x7f09080835d0
2026-03-10T07:50:24.698 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:24.696+0000 7f0907fff700 1 --2- 192.168.123.105:0/2690103773 >> [v2:192.168.123.108:6828/2231834414,v1:192.168.123.108:6829/2231834414] conn(0x7f08f0071e80 0x7f08f0074340 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-10T07:50:24.698 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:24.696+0000 7f090e5e6700 1 -- 192.168.123.105:0/2690103773 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f08f4005320 con 0x7f09080835d0
2026-03-10T07:50:24.698 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:24.697+0000 7f0907fff700 1 --2- 192.168.123.105:0/2690103773 >> [v2:192.168.123.108:6828/2231834414,v1:192.168.123.108:6829/2231834414] conn(0x7f08f0071e80 0x7f08f0074340 secure :-1 s=READY pgs=20 cs=0 l=1 rev1=1 crypto rx=0x7f08f8005950 tx=0x7f08f800a400 comp rx=0 tx=0).ready entity=mgr.24387 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-10T07:50:24.703 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:24.701+0000 7f09057fa700 1 -- 192.168.123.105:0/2690103773 <== mon.1 v2:192.168.123.108:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+191549 (secure 0 0 0) 0x7f0900061730 con 0x7f09080835d0
2026-03-10T07:50:24.918 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:24.916+0000 7f090e5e6700 1 -- 192.168.123.105:0/2690103773 --> [v2:192.168.123.108:6828/2231834414,v1:192.168.123.108:6829/2231834414] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7f08f4000bf0 con 0x7f08f0071e80
2026-03-10T07:50:24.919 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:24.917+0000 7f09057fa700 1 -- 192.168.123.105:0/2690103773 <== mgr.24387 v2:192.168.123.108:6828/2231834414 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+360 (secure 0 0 0) 0x7f08f4000bf0 con 0x7f08f0071e80
2026-03-10T07:50:24.922 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:24.920+0000 7f08eeffd700 1 -- 192.168.123.105:0/2690103773 >> [v2:192.168.123.108:6828/2231834414,v1:192.168.123.108:6829/2231834414] conn(0x7f08f0071e80 msgr2=0x7f08f0074340 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T07:50:24.922 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:24.920+0000 7f08eeffd700 1 --2- 192.168.123.105:0/2690103773 >> [v2:192.168.123.108:6828/2231834414,v1:192.168.123.108:6829/2231834414] conn(0x7f08f0071e80 0x7f08f0074340 secure :-1 s=READY pgs=20 cs=0 l=1 rev1=1 crypto rx=0x7f08f8005950 tx=0x7f08f800a400 comp rx=0 tx=0).stop
2026-03-10T07:50:24.922 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:24.920+0000 7f08eeffd700 1 -- 192.168.123.105:0/2690103773 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f09080835d0 msgr2=0x7f09081b3120 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T07:50:24.922 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:24.920+0000 7f08eeffd700 1 --2- 192.168.123.105:0/2690103773 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f09080835d0 0x7f09081b3120 secure :-1 s=READY pgs=84 cs=0 l=1 rev1=1 crypto rx=0x7f090000cd40 tx=0x7f090000bb10 comp rx=0 tx=0).stop
2026-03-10T07:50:24.922 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:24.920+0000 7f08eeffd700 1 -- 192.168.123.105:0/2690103773 shutdown_connections
2026-03-10T07:50:24.922 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:24.920+0000 7f08eeffd700 1 --2- 192.168.123.105:0/2690103773 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f0908072b50 0x7f0908083090 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T07:50:24.922 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:24.920+0000 7f08eeffd700 1 --2- 192.168.123.105:0/2690103773 >> [v2:192.168.123.108:6828/2231834414,v1:192.168.123.108:6829/2231834414] conn(0x7f08f0071e80 0x7f08f0074340 unknown :-1 s=CLOSED pgs=20 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T07:50:24.922 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:24.920+0000 7f08eeffd700 1 --2- 192.168.123.105:0/2690103773 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f09080835d0 0x7f09081b3120 unknown :-1 s=CLOSED pgs=84 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T07:50:24.922 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:24.920+0000 7f08eeffd700 1 -- 192.168.123.105:0/2690103773 >> 192.168.123.105:0/2690103773 conn(0x7f090806dae0 msgr2=0x7f090806ff40 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-10T07:50:24.922 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:24.920+0000 7f08eeffd700 1 -- 192.168.123.105:0/2690103773 shutdown_connections
2026-03-10T07:50:24.922 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:24.920+0000 7f08eeffd700 1 -- 192.168.123.105:0/2690103773 wait complete.
2026-03-10T07:50:25.025 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:25.023+0000 7f6b91960700 1 -- 192.168.123.105:0/3267344363 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f6b8c075a40 msgr2=0x7f6b8c077ed0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T07:50:25.025 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:25.023+0000 7f6b91960700 1 --2- 192.168.123.105:0/3267344363 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f6b8c075a40 0x7f6b8c077ed0 secure :-1 s=READY pgs=85 cs=0 l=1 rev1=1 crypto rx=0x7f6b84009230 tx=0x7f6b84009260 comp rx=0 tx=0).stop
2026-03-10T07:50:25.025 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:25.023+0000 7f6b91960700 1 -- 192.168.123.105:0/3267344363 shutdown_connections
2026-03-10T07:50:25.025 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:25.023+0000 7f6b91960700 1 --2- 192.168.123.105:0/3267344363 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f6b8c075a40 0x7f6b8c077ed0 secure :-1 s=CLOSED pgs=85 cs=0 l=1 rev1=1 crypto rx=0x7f6b84009230 tx=0x7f6b84009260 comp rx=0 tx=0).stop
2026-03-10T07:50:25.025 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:25.023+0000 7f6b91960700 1 --2- 192.168.123.105:0/3267344363 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f6b8c072b50 0x7f6b8c072f70 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T07:50:25.025 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:25.023+0000 7f6b91960700 1 -- 192.168.123.105:0/3267344363 >> 192.168.123.105:0/3267344363 conn(0x7f6b8c06dae0 msgr2=0x7f6b8c06ff40 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-10T07:50:25.025 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:25.024+0000 7f6b91960700 1 -- 192.168.123.105:0/3267344363 shutdown_connections
2026-03-10T07:50:25.026 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:25.024+0000 7f6b91960700 1 -- 192.168.123.105:0/3267344363 wait complete.
2026-03-10T07:50:25.026 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:25.024+0000 7f6b91960700 1 Processor -- start
2026-03-10T07:50:25.026 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:25.024+0000 7f6b91960700 1 -- start start
2026-03-10T07:50:25.026 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:25.025+0000 7f6b91960700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f6b8c072b50 0x7f6b8c12bdb0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T07:50:25.026 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:25.025+0000 7f6b91960700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f6b8c12c2f0 0x7f6b8c12e780 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T07:50:25.026 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:25.025+0000 7f6b91960700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f6b8c083d80 con 0x7f6b8c072b50
2026-03-10T07:50:25.026 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:25.025+0000 7f6b91960700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f6b8c12ecc0 con 0x7f6b8c12c2f0
2026-03-10T07:50:25.027 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:25.025+0000 7f6b8a7fc700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f6b8c12c2f0 0x7f6b8c12e780 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-10T07:50:25.027 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:25.025+0000 7f6b8a7fc700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f6b8c12c2f0 0x7f6b8c12e780 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.108:3300/0 says I am v2:192.168.123.105:60150/0 (socket says 192.168.123.105:60150)
2026-03-10T07:50:25.027 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:25.025+0000 7f6b8a7fc700 1 -- 192.168.123.105:0/1798524746 learned_addr learned my addr 192.168.123.105:0/1798524746 (peer_addr_for_me v2:192.168.123.105:0/0)
2026-03-10T07:50:25.027 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:25.025+0000 7f6b8a7fc700 1 -- 192.168.123.105:0/1798524746 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f6b8c072b50 msgr2=0x7f6b8c12bdb0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T07:50:25.027 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:25.025+0000 7f6b8a7fc700 1 --2- 192.168.123.105:0/1798524746 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f6b8c072b50 0x7f6b8c12bdb0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T07:50:25.028 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:25.025+0000 7f6b8a7fc700 1 -- 192.168.123.105:0/1798524746 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f6b84008ee0 con 0x7f6b8c12c2f0
2026-03-10T07:50:25.028 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:25.025+0000 7f6b8a7fc700 1 --2- 192.168.123.105:0/1798524746 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f6b8c12c2f0 0x7f6b8c12e780 secure :-1 s=READY pgs=86 cs=0 l=1 rev1=1 crypto rx=0x7f6b84000f80 tx=0x7f6b840085f0 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-10T07:50:25.028 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:25.025+0000 7f6b9095e700 1 -- 192.168.123.105:0/1798524746 <== mon.1 v2:192.168.123.108:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f6b84007cb0 con 0x7f6b8c12c2f0
2026-03-10T07:50:25.028 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:25.026+0000 7f6b91960700 1 -- 192.168.123.105:0/1798524746 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f6b8c12ef40 con 0x7f6b8c12c2f0
2026-03-10T07:50:25.028 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:25.026+0000 7f6b91960700 1 -- 192.168.123.105:0/1798524746 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f6b8c12f490 con 0x7f6b8c12c2f0
2026-03-10T07:50:25.028 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:25.026+0000 7f6b9095e700 1 -- 192.168.123.105:0/1798524746 <== mon.1 v2:192.168.123.108:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f6b84007e10 con 0x7f6b8c12c2f0
2026-03-10T07:50:25.028 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:25.026+0000 7f6b9095e700 1 -- 192.168.123.105:0/1798524746 <== mon.1 v2:192.168.123.108:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f6b8400eba0 con 0x7f6b8c12c2f0
2026-03-10T07:50:25.028 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:25.026+0000 7f6b727fc700 1 -- 192.168.123.105:0/1798524746 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f6b8c04ea90 con 0x7f6b8c12c2f0
2026-03-10T07:50:25.030 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:25.027+0000 7f6b9095e700 1 -- 192.168.123.105:0/1798524746 <== mon.1 v2:192.168.123.108:3300/0 4 ==== mgrmap(e 24) v1 ==== 95238+0+0 (secure 0 0 0) 0x7f6b8400ed00 con 0x7f6b8c12c2f0
2026-03-10T07:50:25.030 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:25.027+0000 7f6b9095e700 1 --2- 192.168.123.105:0/1798524746 >> [v2:192.168.123.108:6828/2231834414,v1:192.168.123.108:6829/2231834414] conn(0x7f6b74071ea0 0x7f6b74074360 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T07:50:25.030 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:25.027+0000 7f6b9095e700 1 -- 192.168.123.105:0/1798524746 <== mon.1 v2:192.168.123.108:3300/0 5 ==== osd_map(40..40 src has 1..40) v4 ==== 5534+0+0 (secure 0 0 0) 0x7f6b84012070 con 0x7f6b8c12c2f0
2026-03-10T07:50:25.032 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:25.031+0000 7f6b8affd700 1 --2- 192.168.123.105:0/1798524746 >> [v2:192.168.123.108:6828/2231834414,v1:192.168.123.108:6829/2231834414] conn(0x7f6b74071ea0 0x7f6b74074360 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-10T07:50:25.035 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:25.031+0000 7f6b9095e700 1 -- 192.168.123.105:0/1798524746 <== mon.1 v2:192.168.123.108:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+191549 (secure 0 0 0) 0x7f6b8405d3d0 con 0x7f6b8c12c2f0
2026-03-10T07:50:25.035 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:25.031+0000 7f6b8affd700 1 --2- 192.168.123.105:0/1798524746 >> [v2:192.168.123.108:6828/2231834414,v1:192.168.123.108:6829/2231834414] conn(0x7f6b74071ea0 0x7f6b74074360 secure :-1 s=READY pgs=21 cs=0 l=1 rev1=1 crypto rx=0x7f6b7c00b3c0 tx=0x7f6b7c00d040 comp rx=0 tx=0).ready entity=mgr.24387 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-10T07:50:25.282 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:25.276+0000 7f6b727fc700 1 -- 192.168.123.105:0/1798524746 --> [v2:192.168.123.108:6828/2231834414,v1:192.168.123.108:6829/2231834414] -- mgr_command(tid 0: {"prefix": "orch ps", "target": ["mon-mgr", ""]}) v1 -- 0x7f6b8c0611d0 con 0x7f6b74071ea0
2026-03-10T07:50:25.282 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:50:25 vm05.local ceph-mon[50387]: pgmap v11: 65 pgs: 65 active+clean; 152 MiB data, 905 MiB used, 119 GiB / 120 GiB avail; 15 KiB/s rd, 1.8 MiB/s wr, 151 op/s
2026-03-10T07:50:25.282 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:50:25 vm05.local ceph-mon[50387]: from='client.14632 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch
2026-03-10T07:50:25.287 INFO:teuthology.orchestra.run.vm05.stdout:NAME HOST PORTS STATUS REFRESHED AGE MEM USE MEM LIM VERSION IMAGE ID CONTAINER ID
2026-03-10T07:50:25.287 INFO:teuthology.orchestra.run.vm05.stdout:alertmanager.vm05 vm05 *:9093,9094 running (3m) 3s ago 4m 25.0M - 0.25.0 c8568f914cd2 f87529717116
2026-03-10T07:50:25.287 INFO:teuthology.orchestra.run.vm05.stdout:ceph-exporter.vm05 vm05 running (4m) 3s ago 4m 8414k - 18.2.1 5be31c24972a 26c4db858175
2026-03-10T07:50:25.287 INFO:teuthology.orchestra.run.vm05.stdout:ceph-exporter.vm08 vm08 running (3m) 13s ago 3m 8350k - 18.2.1 5be31c24972a 209e2398a09c
2026-03-10T07:50:25.287 INFO:teuthology.orchestra.run.vm05.stdout:crash.vm05 vm05 running (4m) 3s ago 4m 7419k - 18.2.1 5be31c24972a d3d7b92c8ac3
2026-03-10T07:50:25.287 INFO:teuthology.orchestra.run.vm05.stdout:crash.vm08 vm08 running (3m) 13s ago 3m 7415k - 18.2.1 5be31c24972a 96136e0195f7
2026-03-10T07:50:25.287 INFO:teuthology.orchestra.run.vm05.stdout:grafana.vm05 vm05 *:3000 running (3m) 3s ago 3m 88.0M - 9.4.7 954c08fa6188 35089be30fc6
2026-03-10T07:50:25.287 INFO:teuthology.orchestra.run.vm05.stdout:mds.cephfs.vm05.omfhnh vm05 running (119s) 3s ago 119s 177M - 18.2.1 5be31c24972a e23de179e09c
2026-03-10T07:50:25.287 INFO:teuthology.orchestra.run.vm05.stdout:mds.cephfs.vm05.pavqil vm05 running (117s) 3s ago 117s 14.9M - 18.2.1 5be31c24972a 5b9e5afa214c
2026-03-10T07:50:25.287 INFO:teuthology.orchestra.run.vm05.stdout:mds.cephfs.vm08.dgsaon vm08 running (116s) 13s ago 116s 15.7M - 18.2.1 5be31c24972a 1696aee522b5
2026-03-10T07:50:25.287 INFO:teuthology.orchestra.run.vm05.stdout:mds.cephfs.vm08.ybmbgd vm08 running (118s) 13s ago 118s 14.2M - 18.2.1 5be31c24972a 30b0e51cd2ed
2026-03-10T07:50:25.287 INFO:teuthology.orchestra.run.vm05.stdout:mgr.vm05.blexke vm05 *:8443,9283,8765 running (6s) 3s ago 4m 62.6M - 19.2.3-678-ge911bdeb 654f31e6858e 5467b6a3bd38
2026-03-10T07:50:25.287 INFO:teuthology.orchestra.run.vm05.stdout:mgr.vm08.orfpog vm08 *:8443,9283,8765 running (22s) 13s ago 3m 536M - 19.2.3-678-ge911bdeb 654f31e6858e 459b5407f826
2026-03-10T07:50:25.287 INFO:teuthology.orchestra.run.vm05.stdout:mon.vm05 vm05 running (4m) 3s ago 4m 59.0M 2048M 18.2.1 5be31c24972a 2a459bf05146
2026-03-10T07:50:25.287 INFO:teuthology.orchestra.run.vm05.stdout:mon.vm08 vm08 running (3m) 13s ago 3m 50.2M 2048M 18.2.1 5be31c24972a e01dfb712474
2026-03-10T07:50:25.287 INFO:teuthology.orchestra.run.vm05.stdout:node-exporter.vm05 vm05 *:9100 running (4m) 3s ago 4m 14.9M - 1.5.0 0da6a335fe13 cb6188e5fa06
2026-03-10T07:50:25.287 INFO:teuthology.orchestra.run.vm05.stdout:node-exporter.vm08 vm08 *:9100 running (3m) 13s ago 3m 16.1M - 1.5.0 0da6a335fe13 f73da8e379d9
2026-03-10T07:50:25.287 INFO:teuthology.orchestra.run.vm05.stdout:osd.0 vm05 running (3m) 3s ago 3m 94.9M 4096M 18.2.1 5be31c24972a 9b7c5ea48cea
2026-03-10T07:50:25.287 INFO:teuthology.orchestra.run.vm05.stdout:osd.1 vm05 running (3m) 3s ago 3m 105M 4096M 18.2.1 5be31c24972a 88e0b65b2c93
2026-03-10T07:50:25.287 INFO:teuthology.orchestra.run.vm05.stdout:osd.2 vm05 running (2m) 3s ago 2m 86.7M 4096M 18.2.1 5be31c24972a b8d3cbeb49f1
2026-03-10T07:50:25.287 INFO:teuthology.orchestra.run.vm05.stdout:osd.3 vm08 running (2m) 13s ago 2m 107M 4096M 18.2.1 5be31c24972a 0a62c54a86c0
2026-03-10T07:50:25.287 INFO:teuthology.orchestra.run.vm05.stdout:osd.4 vm08 running (2m) 13s ago 2m 96.1M 4096M 18.2.1 5be31c24972a bd748b691ccd
2026-03-10T07:50:25.287 INFO:teuthology.orchestra.run.vm05.stdout:osd.5 vm08 running (2m) 13s ago 2m 94.5M 4096M 18.2.1 5be31c24972a 9f08820ae98b
2026-03-10T07:50:25.287 INFO:teuthology.orchestra.run.vm05.stdout:prometheus.vm05 vm05 *:9095 running (9s) 3s ago 3m 37.4M - 2.43.0 a07b618ecd1d 830d841bd5e4
2026-03-10T07:50:25.287 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:25.283+0000 7f6b9095e700 1 -- 192.168.123.105:0/1798524746 <== mgr.24387 v2:192.168.123.108:6828/2231834414 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+3600 (secure 0 0 0) 0x7f6b8c0611d0 con 0x7f6b74071ea0
2026-03-10T07:50:25.288 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:25.287+0000 7f6b727fc700 1 -- 192.168.123.105:0/1798524746 >> [v2:192.168.123.108:6828/2231834414,v1:192.168.123.108:6829/2231834414] conn(0x7f6b74071ea0 msgr2=0x7f6b74074360 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T07:50:25.288 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:25.287+0000 7f6b727fc700 1 --2- 192.168.123.105:0/1798524746 >> [v2:192.168.123.108:6828/2231834414,v1:192.168.123.108:6829/2231834414] conn(0x7f6b74071ea0 0x7f6b74074360 secure :-1 s=READY pgs=21 cs=0 l=1 rev1=1 crypto rx=0x7f6b7c00b3c0 tx=0x7f6b7c00d040 comp rx=0 tx=0).stop
2026-03-10T07:50:25.288 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:25.287+0000 7f6b727fc700 1 -- 192.168.123.105:0/1798524746 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f6b8c12c2f0 msgr2=0x7f6b8c12e780 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T07:50:25.288 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:25.287+0000 7f6b727fc700 1 --2- 192.168.123.105:0/1798524746 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f6b8c12c2f0 0x7f6b8c12e780 secure :-1 s=READY pgs=86 cs=0 l=1 rev1=1 crypto rx=0x7f6b84000f80 tx=0x7f6b840085f0 comp rx=0 tx=0).stop
2026-03-10T07:50:25.294 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:25.289+0000 7f6b727fc700 1 -- 192.168.123.105:0/1798524746 shutdown_connections
2026-03-10T07:50:25.294 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:25.289+0000 7f6b727fc700 1 --2- 192.168.123.105:0/1798524746 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f6b8c072b50 0x7f6b8c12bdb0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T07:50:25.294 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:25.289+0000 7f6b727fc700 1 --2- 192.168.123.105:0/1798524746 >> [v2:192.168.123.108:6828/2231834414,v1:192.168.123.108:6829/2231834414] conn(0x7f6b74071ea0 0x7f6b74074360 unknown :-1 s=CLOSED pgs=21 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T07:50:25.294 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:25.289+0000 7f6b727fc700 1 --2- 192.168.123.105:0/1798524746 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f6b8c12c2f0 0x7f6b8c12e780 unknown :-1 s=CLOSED pgs=86 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T07:50:25.294 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:25.289+0000 7f6b727fc700 1 -- 192.168.123.105:0/1798524746 >> 192.168.123.105:0/1798524746 conn(0x7f6b8c06dae0 msgr2=0x7f6b8c06ff40 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-10T07:50:25.294 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:25.293+0000 7f6b727fc700 1 -- 192.168.123.105:0/1798524746 shutdown_connections
2026-03-10T07:50:25.294 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:25.293+0000 7f6b727fc700 1 -- 192.168.123.105:0/1798524746 wait complete.
2026-03-10T07:50:25.418 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:25.414+0000 7fdec8f48700 1 -- 192.168.123.105:0/3358457373 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fdec4072b20 msgr2=0x7fdec4072f40 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T07:50:25.418 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:25.414+0000 7fdec8f48700 1 --2- 192.168.123.105:0/3358457373 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fdec4072b20 0x7fdec4072f40 secure :-1 s=READY pgs=304 cs=0 l=1 rev1=1 crypto rx=0x7fdeb4009b00 tx=0x7fdeb4009e10 comp rx=0 tx=0).stop
2026-03-10T07:50:25.419 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:25.414+0000 7fdec8f48700 1 -- 192.168.123.105:0/3358457373 shutdown_connections
2026-03-10T07:50:25.419 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:25.414+0000 7fdec8f48700 1 --2- 192.168.123.105:0/3358457373 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fdec4075a10 0x7fdec4077ea0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T07:50:25.419 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:25.414+0000 7fdec8f48700 1 --2- 192.168.123.105:0/3358457373 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fdec4072b20 0x7fdec4072f40 unknown :-1 s=CLOSED pgs=304 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T07:50:25.419 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:25.414+0000 7fdec8f48700 1 -- 192.168.123.105:0/3358457373 >> 192.168.123.105:0/3358457373 conn(0x7fdec406daa0 msgr2=0x7fdec406ff00 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-10T07:50:25.419 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:25.414+0000 7fdec8f48700 1 -- 192.168.123.105:0/3358457373 shutdown_connections
2026-03-10T07:50:25.419 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:25.414+0000 7fdec8f48700 1 -- 192.168.123.105:0/3358457373 wait complete.
2026-03-10T07:50:25.420 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:25.414+0000 7fdec8f48700 1 Processor -- start
2026-03-10T07:50:25.420 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:25.414+0000 7fdec8f48700 1 -- start start
2026-03-10T07:50:25.420 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:25.414+0000 7fdec8f48700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fdec4075a10 0x7fdec4080070 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T07:50:25.420 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:25.414+0000 7fdec8f48700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fdec40805b0 0x7fdec412e470 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T07:50:25.420 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:25.414+0000 7fdec8f48700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fdec4080ac0 con 0x7fdec4075a10
2026-03-10T07:50:25.420 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:25.414+0000 7fdec8f48700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fdec4080c30 con 0x7fdec40805b0
2026-03-10T07:50:25.420 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:25.415+0000 7fdec2ffd700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fdec40805b0 0x7fdec412e470 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-10T07:50:25.420 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:25.415+0000 7fdec2ffd700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fdec40805b0 0x7fdec412e470 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.108:3300/0 says I am v2:192.168.123.105:60164/0 (socket says 192.168.123.105:60164)
2026-03-10T07:50:25.420 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:25.415+0000 7fdec2ffd700 1 -- 192.168.123.105:0/150044798 learned_addr learned my addr 192.168.123.105:0/150044798 (peer_addr_for_me v2:192.168.123.105:0/0)
2026-03-10T07:50:25.420 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:25.415+0000 7fdec37fe700 1 --2- 192.168.123.105:0/150044798 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fdec4075a10 0x7fdec4080070 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-10T07:50:25.420 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:25.416+0000 7fdec2ffd700 1 -- 192.168.123.105:0/150044798 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fdec4075a10 msgr2=0x7fdec4080070 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T07:50:25.420 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:25.416+0000 7fdec2ffd700 1 --2- 192.168.123.105:0/150044798 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fdec4075a10 0x7fdec4080070 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T07:50:25.420 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:25.416+0000 7fdec2ffd700 1 -- 192.168.123.105:0/150044798 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fdeb40097e0 con 0x7fdec40805b0
2026-03-10T07:50:25.420 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:25.416+0000 7fdec2ffd700 1 --2- 192.168.123.105:0/150044798 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fdec40805b0 0x7fdec412e470 secure :-1 s=READY pgs=87 cs=0 l=1 rev1=1 crypto rx=0x7fdebc009e00 tx=0x7fdebc00c620 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-10T07:50:25.420 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:25.416+0000 7fdec0ff9700 1 -- 192.168.123.105:0/150044798 <== mon.1 v2:192.168.123.108:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fdebc011070 con 0x7fdec40805b0 2026-03-10T07:50:25.420 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:25.416+0000 7fdec8f48700 1 -- 192.168.123.105:0/150044798 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fdec412ea10 con 0x7fdec40805b0 2026-03-10T07:50:25.420 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:25.416+0000 7fdec8f48700 1 -- 192.168.123.105:0/150044798 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fdec412ef60 con 0x7fdec40805b0 2026-03-10T07:50:25.420 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:25.416+0000 7fdec0ff9700 1 -- 192.168.123.105:0/150044798 <== mon.1 v2:192.168.123.108:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fdebc004bb0 con 0x7fdec40805b0 2026-03-10T07:50:25.420 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:25.416+0000 7fdec0ff9700 1 -- 192.168.123.105:0/150044798 <== mon.1 v2:192.168.123.108:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fdebc005230 con 0x7fdec40805b0 2026-03-10T07:50:25.420 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:25.417+0000 7fdec8f48700 1 -- 192.168.123.105:0/150044798 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fdec404ea90 con 0x7fdec40805b0 2026-03-10T07:50:25.420 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:25.419+0000 7fdec0ff9700 1 -- 192.168.123.105:0/150044798 <== mon.1 v2:192.168.123.108:3300/0 4 ==== mgrmap(e 24) v1 ==== 95238+0+0 (secure 0 0 0) 0x7fdebc0054a0 con 0x7fdec40805b0 2026-03-10T07:50:25.421 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:25.419+0000 
7fdec0ff9700 1 --2- 192.168.123.105:0/150044798 >> [v2:192.168.123.108:6828/2231834414,v1:192.168.123.108:6829/2231834414] conn(0x7fdeac071ea0 0x7fdeac074360 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:50:25.421 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:25.419+0000 7fdec0ff9700 1 -- 192.168.123.105:0/150044798 <== mon.1 v2:192.168.123.108:3300/0 5 ==== osd_map(40..40 src has 1..40) v4 ==== 5534+0+0 (secure 0 0 0) 0x7fdebc094270 con 0x7fdec40805b0 2026-03-10T07:50:25.421 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:25.419+0000 7fdec37fe700 1 --2- 192.168.123.105:0/150044798 >> [v2:192.168.123.108:6828/2231834414,v1:192.168.123.108:6829/2231834414] conn(0x7fdeac071ea0 0x7fdeac074360 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:50:25.423 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:25.420+0000 7fdec37fe700 1 --2- 192.168.123.105:0/150044798 >> [v2:192.168.123.108:6828/2231834414,v1:192.168.123.108:6829/2231834414] conn(0x7fdeac071ea0 0x7fdeac074360 secure :-1 s=READY pgs=22 cs=0 l=1 rev1=1 crypto rx=0x7fdeb400b5c0 tx=0x7fdeb4005e20 comp rx=0 tx=0).ready entity=mgr.24387 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:50:25.423 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:25.420+0000 7fdec0ff9700 1 -- 192.168.123.105:0/150044798 <== mon.1 v2:192.168.123.108:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+191549 (secure 0 0 0) 0x7fdebc05cee0 con 0x7fdec40805b0 2026-03-10T07:50:25.668 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:50:25 vm08.local ceph-mon[59917]: pgmap v11: 65 pgs: 65 active+clean; 152 MiB data, 905 MiB used, 119 GiB / 120 GiB avail; 15 KiB/s rd, 1.8 MiB/s wr, 151 op/s 2026-03-10T07:50:25.668 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:50:25 vm08.local ceph-mon[59917]: 
from='client.14632 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T07:50:25.689 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:25.687+0000 7fdec8f48700 1 -- 192.168.123.105:0/150044798 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "versions"} v 0) v1 -- 0x7fdec412f240 con 0x7fdec40805b0 2026-03-10T07:50:25.689 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:25.688+0000 7fdec0ff9700 1 -- 192.168.123.105:0/150044798 <== mon.1 v2:192.168.123.108:3300/0 7 ==== mon_command_ack([{"prefix": "versions"}]=0 v0) v1 ==== 56+0+770 (secure 0 0 0) 0x7fdebc05c630 con 0x7fdec40805b0 2026-03-10T07:50:25.689 INFO:teuthology.orchestra.run.vm05.stdout:{ 2026-03-10T07:50:25.690 INFO:teuthology.orchestra.run.vm05.stdout: "mon": { 2026-03-10T07:50:25.690 INFO:teuthology.orchestra.run.vm05.stdout: "ceph version 18.2.1 (7fe91d5d5842e04be3b4f514d6dd990c54b29c76) reef (stable)": 2 2026-03-10T07:50:25.690 INFO:teuthology.orchestra.run.vm05.stdout: }, 2026-03-10T07:50:25.690 INFO:teuthology.orchestra.run.vm05.stdout: "mgr": { 2026-03-10T07:50:25.690 INFO:teuthology.orchestra.run.vm05.stdout: "ceph version 18.2.1 (7fe91d5d5842e04be3b4f514d6dd990c54b29c76) reef (stable)": 1, 2026-03-10T07:50:25.690 INFO:teuthology.orchestra.run.vm05.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 1 2026-03-10T07:50:25.690 INFO:teuthology.orchestra.run.vm05.stdout: }, 2026-03-10T07:50:25.690 INFO:teuthology.orchestra.run.vm05.stdout: "osd": { 2026-03-10T07:50:25.690 INFO:teuthology.orchestra.run.vm05.stdout: "ceph version 18.2.1 (7fe91d5d5842e04be3b4f514d6dd990c54b29c76) reef (stable)": 6 2026-03-10T07:50:25.690 INFO:teuthology.orchestra.run.vm05.stdout: }, 2026-03-10T07:50:25.690 INFO:teuthology.orchestra.run.vm05.stdout: "mds": { 2026-03-10T07:50:25.690 INFO:teuthology.orchestra.run.vm05.stdout: "ceph version 18.2.1 
(7fe91d5d5842e04be3b4f514d6dd990c54b29c76) reef (stable)": 4 2026-03-10T07:50:25.690 INFO:teuthology.orchestra.run.vm05.stdout: }, 2026-03-10T07:50:25.690 INFO:teuthology.orchestra.run.vm05.stdout: "overall": { 2026-03-10T07:50:25.690 INFO:teuthology.orchestra.run.vm05.stdout: "ceph version 18.2.1 (7fe91d5d5842e04be3b4f514d6dd990c54b29c76) reef (stable)": 13, 2026-03-10T07:50:25.690 INFO:teuthology.orchestra.run.vm05.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 1 2026-03-10T07:50:25.690 INFO:teuthology.orchestra.run.vm05.stdout: } 2026-03-10T07:50:25.690 INFO:teuthology.orchestra.run.vm05.stdout:} 2026-03-10T07:50:25.693 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:25.692+0000 7fdec8f48700 1 -- 192.168.123.105:0/150044798 >> [v2:192.168.123.108:6828/2231834414,v1:192.168.123.108:6829/2231834414] conn(0x7fdeac071ea0 msgr2=0x7fdeac074360 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:50:25.693 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:25.692+0000 7fdec8f48700 1 --2- 192.168.123.105:0/150044798 >> [v2:192.168.123.108:6828/2231834414,v1:192.168.123.108:6829/2231834414] conn(0x7fdeac071ea0 0x7fdeac074360 secure :-1 s=READY pgs=22 cs=0 l=1 rev1=1 crypto rx=0x7fdeb400b5c0 tx=0x7fdeb4005e20 comp rx=0 tx=0).stop 2026-03-10T07:50:25.693 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:25.692+0000 7fdec8f48700 1 -- 192.168.123.105:0/150044798 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fdec40805b0 msgr2=0x7fdec412e470 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:50:25.693 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:25.692+0000 7fdec8f48700 1 --2- 192.168.123.105:0/150044798 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fdec40805b0 0x7fdec412e470 secure :-1 s=READY pgs=87 cs=0 l=1 rev1=1 crypto rx=0x7fdebc009e00 tx=0x7fdebc00c620 comp rx=0 tx=0).stop 2026-03-10T07:50:25.693 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:25.692+0000 7fdec8f48700 1 -- 192.168.123.105:0/150044798 shutdown_connections 2026-03-10T07:50:25.693 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:25.692+0000 7fdec8f48700 1 --2- 192.168.123.105:0/150044798 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fdec4075a10 0x7fdec4080070 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:50:25.694 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:25.692+0000 7fdec8f48700 1 --2- 192.168.123.105:0/150044798 >> [v2:192.168.123.108:6828/2231834414,v1:192.168.123.108:6829/2231834414] conn(0x7fdeac071ea0 0x7fdeac074360 unknown :-1 s=CLOSED pgs=22 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:50:25.694 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:25.692+0000 7fdec8f48700 1 --2- 192.168.123.105:0/150044798 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fdec40805b0 0x7fdec412e470 unknown :-1 s=CLOSED pgs=87 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:50:25.694 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:25.692+0000 7fdec8f48700 1 -- 192.168.123.105:0/150044798 >> 192.168.123.105:0/150044798 conn(0x7fdec406daa0 msgr2=0x7fdec406ff00 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:50:25.694 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:25.692+0000 7fdec8f48700 1 -- 192.168.123.105:0/150044798 shutdown_connections 2026-03-10T07:50:25.694 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:25.692+0000 7fdec8f48700 1 -- 192.168.123.105:0/150044798 wait complete. 
2026-03-10T07:50:25.865 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:25.862+0000 7f835b59e700 1 -- 192.168.123.105:0/1608839968 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f835c072b20 msgr2=0x7f835c072f40 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:50:25.865 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:25.862+0000 7f835b59e700 1 --2- 192.168.123.105:0/1608839968 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f835c072b20 0x7f835c072f40 secure :-1 s=READY pgs=305 cs=0 l=1 rev1=1 crypto rx=0x7f834c009a60 tx=0x7f834c009d70 comp rx=0 tx=0).stop 2026-03-10T07:50:25.865 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:25.862+0000 7f835b59e700 1 -- 192.168.123.105:0/1608839968 shutdown_connections 2026-03-10T07:50:25.865 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:25.862+0000 7f835b59e700 1 --2- 192.168.123.105:0/1608839968 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f835c075a10 0x7f835c077ea0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:50:25.865 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:25.862+0000 7f835b59e700 1 --2- 192.168.123.105:0/1608839968 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f835c072b20 0x7f835c072f40 unknown :-1 s=CLOSED pgs=305 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:50:25.865 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:25.862+0000 7f835b59e700 1 -- 192.168.123.105:0/1608839968 >> 192.168.123.105:0/1608839968 conn(0x7f835c06daa0 msgr2=0x7f835c06ff00 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:50:25.865 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:25.862+0000 7f835b59e700 1 -- 192.168.123.105:0/1608839968 shutdown_connections 2026-03-10T07:50:25.865 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:25.862+0000 7f835b59e700 1 -- 192.168.123.105:0/1608839968 
wait complete. 2026-03-10T07:50:25.865 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:25.863+0000 7f835b59e700 1 Processor -- start 2026-03-10T07:50:25.865 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:25.863+0000 7f835b59e700 1 -- start start 2026-03-10T07:50:25.865 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:25.863+0000 7f835b59e700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f835c075a10 0x7f835c0830a0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:50:25.865 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:25.863+0000 7f835b59e700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f835c0835e0 0x7f835c1b3100 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:50:25.865 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:25.863+0000 7f835b59e700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f835c083af0 con 0x7f835c075a10 2026-03-10T07:50:25.865 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:25.863+0000 7f835b59e700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f835c083c60 con 0x7f835c0835e0 2026-03-10T07:50:25.867 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:25.866+0000 7f8359d9b700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f835c0835e0 0x7f835c1b3100 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:50:25.868 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:25.866+0000 7f8359d9b700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f835c0835e0 0x7f835c1b3100 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.108:3300/0 
says I am v2:192.168.123.105:60184/0 (socket says 192.168.123.105:60184) 2026-03-10T07:50:25.868 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:25.866+0000 7f8359d9b700 1 -- 192.168.123.105:0/1208643413 learned_addr learned my addr 192.168.123.105:0/1208643413 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T07:50:25.868 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:25.866+0000 7f835a59c700 1 --2- 192.168.123.105:0/1208643413 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f835c075a10 0x7f835c0830a0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:50:25.870 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:25.868+0000 7f8359d9b700 1 -- 192.168.123.105:0/1208643413 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f835c075a10 msgr2=0x7f835c0830a0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:50:25.870 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:25.868+0000 7f8359d9b700 1 --2- 192.168.123.105:0/1208643413 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f835c075a10 0x7f835c0830a0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:50:25.870 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:25.868+0000 7f8359d9b700 1 -- 192.168.123.105:0/1208643413 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f834c009710 con 0x7f835c0835e0 2026-03-10T07:50:25.870 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:25.869+0000 7f8359d9b700 1 --2- 192.168.123.105:0/1208643413 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f835c0835e0 0x7f835c1b3100 secure :-1 s=READY pgs=88 cs=0 l=1 rev1=1 crypto rx=0x7f835400c370 tx=0x7f835400c730 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 
2026-03-10T07:50:25.874 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:25.869+0000 7f834b7fe700 1 -- 192.168.123.105:0/1208643413 <== mon.1 v2:192.168.123.108:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f8354011070 con 0x7f835c0835e0 2026-03-10T07:50:25.874 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:25.869+0000 7f835b59e700 1 -- 192.168.123.105:0/1208643413 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f835c1b36a0 con 0x7f835c0835e0 2026-03-10T07:50:25.874 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:25.869+0000 7f835b59e700 1 -- 192.168.123.105:0/1208643413 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f835c1b3bf0 con 0x7f835c0835e0 2026-03-10T07:50:25.874 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:25.870+0000 7f835b59e700 1 -- 192.168.123.105:0/1208643413 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f835c04ea90 con 0x7f835c0835e0 2026-03-10T07:50:25.874 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:25.870+0000 7f834b7fe700 1 -- 192.168.123.105:0/1208643413 <== mon.1 v2:192.168.123.108:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f83540158f0 con 0x7f835c0835e0 2026-03-10T07:50:25.874 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:25.872+0000 7f834b7fe700 1 -- 192.168.123.105:0/1208643413 <== mon.1 v2:192.168.123.108:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f8354013a10 con 0x7f835c0835e0 2026-03-10T07:50:25.874 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:25.872+0000 7f834b7fe700 1 -- 192.168.123.105:0/1208643413 <== mon.1 v2:192.168.123.108:3300/0 4 ==== mgrmap(e 24) v1 ==== 95238+0+0 (secure 0 0 0) 0x7f8354013c30 con 0x7f835c0835e0 2026-03-10T07:50:25.874 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:25.872+0000 
7f834b7fe700 1 --2- 192.168.123.105:0/1208643413 >> [v2:192.168.123.108:6828/2231834414,v1:192.168.123.108:6829/2231834414] conn(0x7f8344071ea0 0x7f8344074360 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:50:25.874 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:25.873+0000 7f834b7fe700 1 -- 192.168.123.105:0/1208643413 <== mon.1 v2:192.168.123.108:3300/0 5 ==== osd_map(40..40 src has 1..40) v4 ==== 5534+0+0 (secure 0 0 0) 0x7f8354093310 con 0x7f835c0835e0 2026-03-10T07:50:25.876 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:25.875+0000 7f835a59c700 1 --2- 192.168.123.105:0/1208643413 >> [v2:192.168.123.108:6828/2231834414,v1:192.168.123.108:6829/2231834414] conn(0x7f8344071ea0 0x7f8344074360 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:50:25.877 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:25.875+0000 7f834b7fe700 1 -- 192.168.123.105:0/1208643413 <== mon.1 v2:192.168.123.108:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+191549 (secure 0 0 0) 0x7f835405c030 con 0x7f835c0835e0 2026-03-10T07:50:25.892 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:25.889+0000 7f835a59c700 1 --2- 192.168.123.105:0/1208643413 >> [v2:192.168.123.108:6828/2231834414,v1:192.168.123.108:6829/2231834414] conn(0x7f8344071ea0 0x7f8344074360 secure :-1 s=READY pgs=23 cs=0 l=1 rev1=1 crypto rx=0x7f834c00d7d0 tx=0x7f834c003680 comp rx=0 tx=0).ready entity=mgr.24387 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:50:26.012 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:26.010+0000 7f834b7fe700 1 -- 192.168.123.105:0/1208643413 <== mon.1 v2:192.168.123.108:3300/0 7 ==== mgrmap(e 25) v1 ==== 45072+0+0 (secure 0 0 0) 0x7f83540582a0 con 0x7f835c0835e0 2026-03-10T07:50:26.012 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:26.010+0000 7f834b7fe700 1 -- 192.168.123.105:0/1208643413 >> [v2:192.168.123.108:6828/2231834414,v1:192.168.123.108:6829/2231834414] conn(0x7f8344071ea0 msgr2=0x7f8344074360 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:50:26.012 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:26.010+0000 7f834b7fe700 1 --2- 192.168.123.105:0/1208643413 >> [v2:192.168.123.108:6828/2231834414,v1:192.168.123.108:6829/2231834414] conn(0x7f8344071ea0 0x7f8344074360 secure :-1 s=READY pgs=23 cs=0 l=1 rev1=1 crypto rx=0x7f834c00d7d0 tx=0x7f834c003680 comp rx=0 tx=0).stop 2026-03-10T07:50:26.281 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:50:26 vm08.local ceph-mon[59917]: from='client.24417 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T07:50:26.281 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:50:26 vm08.local ceph-mon[59917]: from='client.24421 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T07:50:26.281 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:50:26 vm08.local ceph-mon[59917]: from='client.? 
192.168.123.105:0/150044798' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T07:50:26.281 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:50:26 vm08.local ceph-mon[59917]: from='mgr.24387 ' entity='mgr.vm08.orfpog' 2026-03-10T07:50:26.281 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:50:26 vm08.local ceph-mon[59917]: from='mgr.24387 ' entity='mgr.vm08.orfpog' 2026-03-10T07:50:26.281 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:50:26 vm08.local ceph-mon[59917]: from='mgr.24387 192.168.123.108:0/503323081' entity='mgr.vm08.orfpog' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T07:50:26.281 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:50:26 vm08.local ceph-mon[59917]: from='mgr.24387 192.168.123.108:0/503323081' entity='mgr.vm08.orfpog' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T07:50:26.281 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:50:26 vm08.local ceph-mon[59917]: from='mgr.24387 ' entity='mgr.vm08.orfpog' 2026-03-10T07:50:26.281 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:50:26 vm08.local ceph-mon[59917]: from='mgr.24387 192.168.123.108:0/503323081' entity='mgr.vm08.orfpog' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T07:50:26.281 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:50:26 vm08.local ceph-mon[59917]: from='mgr.24387 192.168.123.108:0/503323081' entity='mgr.vm08.orfpog' cmd=[{"prefix": "mgr fail", "who": "vm08.orfpog"}]: dispatch 2026-03-10T07:50:26.281 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:50:26 vm08.local ceph-mon[59917]: from='mgr.24387 ' entity='mgr.vm08.orfpog' cmd=[{"prefix": "mgr fail", "who": "vm08.orfpog"}]: dispatch 2026-03-10T07:50:26.281 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:50:26 vm08.local ceph-mon[59917]: osdmap e41: 6 total, 6 up, 6 in 2026-03-10T07:50:26.281 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:50:26 vm08.local ceph-mon[59917]: from='mgr.24387 ' 
entity='mgr.vm08.orfpog' cmd='[{"prefix": "mgr fail", "who": "vm08.orfpog"}]': finished 2026-03-10T07:50:26.281 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:50:26 vm08.local ceph-mon[59917]: mgrmap e25: vm05.blexke(active, starting, since 0.0317599s) 2026-03-10T07:50:26.334 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:50:26 vm05.local ceph-mon[50387]: from='client.24417 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T07:50:26.335 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:50:26 vm05.local ceph-mon[50387]: from='client.24421 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T07:50:26.335 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:50:26 vm05.local ceph-mon[50387]: from='client.? 192.168.123.105:0/150044798' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T07:50:26.335 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:50:26 vm05.local ceph-mon[50387]: from='mgr.24387 ' entity='mgr.vm08.orfpog' 2026-03-10T07:50:26.335 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:50:26 vm05.local ceph-mon[50387]: from='mgr.24387 ' entity='mgr.vm08.orfpog' 2026-03-10T07:50:26.335 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:50:26 vm05.local ceph-mon[50387]: from='mgr.24387 192.168.123.108:0/503323081' entity='mgr.vm08.orfpog' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T07:50:26.335 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:50:26 vm05.local ceph-mon[50387]: from='mgr.24387 192.168.123.108:0/503323081' entity='mgr.vm08.orfpog' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T07:50:26.335 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:50:26 vm05.local ceph-mon[50387]: from='mgr.24387 ' entity='mgr.vm08.orfpog' 2026-03-10T07:50:26.335 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:50:26 vm05.local ceph-mon[50387]: from='mgr.24387 
192.168.123.108:0/503323081' entity='mgr.vm08.orfpog' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T07:50:26.335 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:50:26 vm05.local ceph-mon[50387]: from='mgr.24387 192.168.123.108:0/503323081' entity='mgr.vm08.orfpog' cmd=[{"prefix": "mgr fail", "who": "vm08.orfpog"}]: dispatch 2026-03-10T07:50:26.335 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:50:26 vm05.local ceph-mon[50387]: from='mgr.24387 ' entity='mgr.vm08.orfpog' cmd=[{"prefix": "mgr fail", "who": "vm08.orfpog"}]: dispatch 2026-03-10T07:50:26.335 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:50:26 vm05.local ceph-mon[50387]: osdmap e41: 6 total, 6 up, 6 in 2026-03-10T07:50:26.335 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:50:26 vm05.local ceph-mon[50387]: from='mgr.24387 ' entity='mgr.vm08.orfpog' cmd='[{"prefix": "mgr fail", "who": "vm08.orfpog"}]': finished 2026-03-10T07:50:26.335 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:50:26 vm05.local ceph-mon[50387]: mgrmap e25: vm05.blexke(active, starting, since 0.0317599s) 2026-03-10T07:50:26.971 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:26.969+0000 7f834b7fe700 1 -- 192.168.123.105:0/1208643413 <== mon.1 v2:192.168.123.108:3300/0 8 ==== mgrmap(e 26) v1 ==== 49900+0+0 (secure 0 0 0) 0x7f8354079770 con 0x7f835c0835e0 2026-03-10T07:50:27.486 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:50:27 vm08.local ceph-mon[59917]: Active manager daemon vm05.blexke restarted 2026-03-10T07:50:27.486 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:50:27 vm08.local ceph-mon[59917]: Activating manager daemon vm05.blexke 2026-03-10T07:50:27.486 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:50:27 vm08.local ceph-mon[59917]: from='mgr.? 
192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/vm05.blexke/crt"}]: dispatch 2026-03-10T07:50:27.486 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:50:27 vm08.local ceph-mon[59917]: from='mgr.? 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/crt"}]: dispatch 2026-03-10T07:50:27.486 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:50:27 vm08.local ceph-mon[59917]: osdmap e42: 6 total, 6 up, 6 in 2026-03-10T07:50:27.486 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:50:27 vm08.local ceph-mon[59917]: mgrmap e26: vm05.blexke(active, starting, since 0.0218393s) 2026-03-10T07:50:27.486 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:50:27 vm08.local ceph-mon[59917]: from='mgr.? 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/vm05.blexke/key"}]: dispatch 2026-03-10T07:50:27.486 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:50:27 vm08.local ceph-mon[59917]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/vm05.blexke/key"}]: dispatch 2026-03-10T07:50:27.486 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:50:27 vm08.local ceph-mon[59917]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' cmd=[{"prefix": "mon metadata", "id": "vm05"}]: dispatch 2026-03-10T07:50:27.486 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:50:27 vm08.local ceph-mon[59917]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' cmd=[{"prefix": "mon metadata", "id": "vm08"}]: dispatch 2026-03-10T07:50:27.486 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:50:27 vm08.local ceph-mon[59917]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm05.pavqil"}]: dispatch 2026-03-10T07:50:27.486 
INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:50:27 vm08.local ceph-mon[59917]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm08.ybmbgd"}]: dispatch 2026-03-10T07:50:27.486 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:50:27 vm08.local ceph-mon[59917]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm05.omfhnh"}]: dispatch 2026-03-10T07:50:27.486 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:50:27 vm08.local ceph-mon[59917]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm08.dgsaon"}]: dispatch 2026-03-10T07:50:27.486 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:50:27 vm08.local ceph-mon[59917]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' cmd=[{"prefix": "mgr metadata", "who": "vm05.blexke", "id": "vm05.blexke"}]: dispatch 2026-03-10T07:50:27.486 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:50:27 vm08.local ceph-mon[59917]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' cmd=[{"prefix": "osd metadata", "id": 0}]: dispatch 2026-03-10T07:50:27.486 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:50:27 vm08.local ceph-mon[59917]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' cmd=[{"prefix": "osd metadata", "id": 1}]: dispatch 2026-03-10T07:50:27.486 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:50:27 vm08.local ceph-mon[59917]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch 2026-03-10T07:50:27.486 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:50:27 vm08.local ceph-mon[59917]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' cmd=[{"prefix": "osd metadata", "id": 3}]: dispatch 2026-03-10T07:50:27.486 
INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:50:27 vm08.local ceph-mon[59917]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' cmd=[{"prefix": "osd metadata", "id": 4}]: dispatch 2026-03-10T07:50:27.486 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:50:27 vm08.local ceph-mon[59917]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' cmd=[{"prefix": "osd metadata", "id": 5}]: dispatch 2026-03-10T07:50:27.486 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:50:27 vm08.local ceph-mon[59917]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/key"}]: dispatch 2026-03-10T07:50:27.486 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:50:27 vm08.local ceph-mon[59917]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' cmd=[{"prefix": "mds metadata"}]: dispatch 2026-03-10T07:50:27.486 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:50:27 vm08.local ceph-mon[59917]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' cmd=[{"prefix": "osd metadata"}]: dispatch 2026-03-10T07:50:27.486 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:50:27 vm08.local ceph-mon[59917]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' cmd=[{"prefix": "mon metadata"}]: dispatch 2026-03-10T07:50:27.486 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:50:27 vm08.local ceph-mon[59917]: Manager daemon vm05.blexke is now available 2026-03-10T07:50:27.657 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:50:27 vm05.local ceph-mon[50387]: Active manager daemon vm05.blexke restarted 2026-03-10T07:50:27.657 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:50:27 vm05.local ceph-mon[50387]: Activating manager daemon vm05.blexke 2026-03-10T07:50:27.657 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:50:27 vm05.local ceph-mon[50387]: from='mgr.? 
192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/vm05.blexke/crt"}]: dispatch 2026-03-10T07:50:27.657 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:50:27 vm05.local ceph-mon[50387]: from='mgr.? 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/crt"}]: dispatch 2026-03-10T07:50:27.657 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:50:27 vm05.local ceph-mon[50387]: osdmap e42: 6 total, 6 up, 6 in 2026-03-10T07:50:27.658 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:50:27 vm05.local ceph-mon[50387]: mgrmap e26: vm05.blexke(active, starting, since 0.0218393s) 2026-03-10T07:50:27.658 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:50:27 vm05.local ceph-mon[50387]: from='mgr.? 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/vm05.blexke/key"}]: dispatch 2026-03-10T07:50:27.658 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:50:27 vm05.local ceph-mon[50387]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/vm05.blexke/key"}]: dispatch 2026-03-10T07:50:27.658 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:50:27 vm05.local ceph-mon[50387]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' cmd=[{"prefix": "mon metadata", "id": "vm05"}]: dispatch 2026-03-10T07:50:27.658 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:50:27 vm05.local ceph-mon[50387]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' cmd=[{"prefix": "mon metadata", "id": "vm08"}]: dispatch 2026-03-10T07:50:27.658 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:50:27 vm05.local ceph-mon[50387]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm05.pavqil"}]: dispatch 2026-03-10T07:50:27.658 
INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:50:27 vm05.local ceph-mon[50387]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm08.ybmbgd"}]: dispatch 2026-03-10T07:50:27.658 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:50:27 vm05.local ceph-mon[50387]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm05.omfhnh"}]: dispatch 2026-03-10T07:50:27.658 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:50:27 vm05.local ceph-mon[50387]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm08.dgsaon"}]: dispatch 2026-03-10T07:50:27.658 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:50:27 vm05.local ceph-mon[50387]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' cmd=[{"prefix": "mgr metadata", "who": "vm05.blexke", "id": "vm05.blexke"}]: dispatch 2026-03-10T07:50:27.658 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:50:27 vm05.local ceph-mon[50387]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' cmd=[{"prefix": "osd metadata", "id": 0}]: dispatch 2026-03-10T07:50:27.658 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:50:27 vm05.local ceph-mon[50387]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' cmd=[{"prefix": "osd metadata", "id": 1}]: dispatch 2026-03-10T07:50:27.658 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:50:27 vm05.local ceph-mon[50387]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch 2026-03-10T07:50:27.658 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:50:27 vm05.local ceph-mon[50387]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' cmd=[{"prefix": "osd metadata", "id": 3}]: dispatch 2026-03-10T07:50:27.658 
INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:50:27 vm05.local ceph-mon[50387]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' cmd=[{"prefix": "osd metadata", "id": 4}]: dispatch 2026-03-10T07:50:27.658 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:50:27 vm05.local ceph-mon[50387]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' cmd=[{"prefix": "osd metadata", "id": 5}]: dispatch 2026-03-10T07:50:27.658 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:50:27 vm05.local ceph-mon[50387]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/key"}]: dispatch 2026-03-10T07:50:27.658 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:50:27 vm05.local ceph-mon[50387]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' cmd=[{"prefix": "mds metadata"}]: dispatch 2026-03-10T07:50:27.658 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:50:27 vm05.local ceph-mon[50387]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' cmd=[{"prefix": "osd metadata"}]: dispatch 2026-03-10T07:50:27.658 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:50:27 vm05.local ceph-mon[50387]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' cmd=[{"prefix": "mon metadata"}]: dispatch 2026-03-10T07:50:27.658 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:50:27 vm05.local ceph-mon[50387]: Manager daemon vm05.blexke is now available 2026-03-10T07:50:28.071 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:28.067+0000 7f834b7fe700 1 -- 192.168.123.105:0/1208643413 <== mon.1 v2:192.168.123.108:3300/0 9 ==== mgrmap(e 27) v1 ==== 50027+0+0 (secure 0 0 0) 0x7f8354076400 con 0x7f835c0835e0 2026-03-10T07:50:28.071 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:28.067+0000 7f834b7fe700 1 --2- 192.168.123.105:0/1208643413 >> 
[v2:192.168.123.105:6800/413688438,v1:192.168.123.105:6801/413688438] conn(0x7f8344002a20 0x7f8344035750 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:50:28.071 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:28.068+0000 7f834b7fe700 1 -- 192.168.123.105:0/1208643413 --> [v2:192.168.123.105:6800/413688438,v1:192.168.123.105:6801/413688438] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7f83440360c0 con 0x7f8344002a20 2026-03-10T07:50:28.075 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:28.074+0000 7f835a59c700 1 --2- 192.168.123.105:0/1208643413 >> [v2:192.168.123.105:6800/413688438,v1:192.168.123.105:6801/413688438] conn(0x7f8344002a20 0x7f8344035750 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:50:28.083 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:28.082+0000 7f835a59c700 1 --2- 192.168.123.105:0/1208643413 >> [v2:192.168.123.105:6800/413688438,v1:192.168.123.105:6801/413688438] conn(0x7f8344002a20 0x7f8344035750 secure :-1 s=READY pgs=8 cs=0 l=1 rev1=1 crypto rx=0x7f834c003730 tx=0x7f834c01e000 comp rx=0 tx=0).ready entity=mgr.14628 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:50:28.118 INFO:teuthology.orchestra.run.vm05.stdout:{ 2026-03-10T07:50:28.118 INFO:teuthology.orchestra.run.vm05.stdout: "target_image": "quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc", 2026-03-10T07:50:28.118 INFO:teuthology.orchestra.run.vm05.stdout: "in_progress": true, 2026-03-10T07:50:28.118 INFO:teuthology.orchestra.run.vm05.stdout: "which": "Upgrading daemons of type(s) mgr", 2026-03-10T07:50:28.118 INFO:teuthology.orchestra.run.vm05.stdout: "services_complete": [ 2026-03-10T07:50:28.118 INFO:teuthology.orchestra.run.vm05.stdout: "mgr" 2026-03-10T07:50:28.118 
INFO:teuthology.orchestra.run.vm05.stdout: ], 2026-03-10T07:50:28.118 INFO:teuthology.orchestra.run.vm05.stdout: "progress": "2/2 daemons upgraded", 2026-03-10T07:50:28.118 INFO:teuthology.orchestra.run.vm05.stdout: "message": "", 2026-03-10T07:50:28.118 INFO:teuthology.orchestra.run.vm05.stdout: "is_paused": false 2026-03-10T07:50:28.118 INFO:teuthology.orchestra.run.vm05.stdout:} 2026-03-10T07:50:28.118 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:28.114+0000 7f834b7fe700 1 -- 192.168.123.105:0/1208643413 <== mgr.14628 v2:192.168.123.105:6800/413688438 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+329 (secure 0 0 0) 0x7f83440360c0 con 0x7f8344002a20 2026-03-10T07:50:28.118 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:28.116+0000 7f83497fa700 1 -- 192.168.123.105:0/1208643413 >> [v2:192.168.123.105:6800/413688438,v1:192.168.123.105:6801/413688438] conn(0x7f8344002a20 msgr2=0x7f8344035750 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:50:28.118 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:28.117+0000 7f83497fa700 1 --2- 192.168.123.105:0/1208643413 >> [v2:192.168.123.105:6800/413688438,v1:192.168.123.105:6801/413688438] conn(0x7f8344002a20 0x7f8344035750 secure :-1 s=READY pgs=8 cs=0 l=1 rev1=1 crypto rx=0x7f834c003730 tx=0x7f834c01e000 comp rx=0 tx=0).stop 2026-03-10T07:50:28.118 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:28.117+0000 7f83497fa700 1 -- 192.168.123.105:0/1208643413 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f835c0835e0 msgr2=0x7f835c1b3100 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:50:28.118 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:28.117+0000 7f83497fa700 1 --2- 192.168.123.105:0/1208643413 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f835c0835e0 0x7f835c1b3100 secure :-1 s=READY pgs=88 cs=0 l=1 rev1=1 crypto rx=0x7f835400c370 tx=0x7f835400c730 comp rx=0 tx=0).stop 
2026-03-10T07:50:28.118 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:28.117+0000 7f83497fa700 1 -- 192.168.123.105:0/1208643413 shutdown_connections 2026-03-10T07:50:28.118 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:28.117+0000 7f83497fa700 1 --2- 192.168.123.105:0/1208643413 >> [v2:192.168.123.105:6800/413688438,v1:192.168.123.105:6801/413688438] conn(0x7f8344002a20 0x7f8344035750 unknown :-1 s=CLOSED pgs=8 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:50:28.118 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:28.117+0000 7f83497fa700 1 --2- 192.168.123.105:0/1208643413 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f835c075a10 0x7f835c0830a0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:50:28.118 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:28.117+0000 7f83497fa700 1 --2- 192.168.123.105:0/1208643413 >> [v2:192.168.123.108:6828/2231834414,v1:192.168.123.108:6829/2231834414] conn(0x7f8344071ea0 0x7f8344074360 unknown :-1 s=CLOSED pgs=23 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:50:28.118 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:28.117+0000 7f83497fa700 1 --2- 192.168.123.105:0/1208643413 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f835c0835e0 0x7f835c1b3100 unknown :-1 s=CLOSED pgs=88 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:50:28.119 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:28.117+0000 7f83497fa700 1 -- 192.168.123.105:0/1208643413 >> 192.168.123.105:0/1208643413 conn(0x7f835c06daa0 msgr2=0x7f835c06ff00 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:50:28.119 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:28.117+0000 7f83497fa700 1 -- 192.168.123.105:0/1208643413 shutdown_connections 2026-03-10T07:50:28.119 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:28.117+0000 7f83497fa700 1 -- 
192.168.123.105:0/1208643413 wait complete. 2026-03-10T07:50:28.231 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:50:28 vm05.local ceph-mon[50387]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T07:50:28.231 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:50:28 vm05.local ceph-mon[50387]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/vm05.blexke/mirror_snapshot_schedule"}]: dispatch 2026-03-10T07:50:28.231 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:50:28 vm05.local ceph-mon[50387]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T07:50:28.231 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:50:28 vm05.local ceph-mon[50387]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/vm05.blexke/trash_purge_schedule"}]: dispatch 2026-03-10T07:50:28.231 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:50:28 vm05.local ceph-mon[50387]: mgrmap e27: vm05.blexke(active, since 1.11896s) 2026-03-10T07:50:28.231 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:50:28 vm05.local ceph-mon[50387]: from='client.24429 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T07:50:28.231 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:50:28 vm05.local ceph-mon[50387]: pgmap v3: 65 pgs: 65 active+clean; 173 MiB data, 998 MiB used, 119 GiB / 120 GiB avail 2026-03-10T07:50:28.491 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:50:28 vm08.local ceph-mon[59917]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T07:50:28.491 
INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:50:28 vm08.local ceph-mon[59917]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/vm05.blexke/mirror_snapshot_schedule"}]: dispatch 2026-03-10T07:50:28.491 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:50:28 vm08.local ceph-mon[59917]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T07:50:28.491 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:50:28 vm08.local ceph-mon[59917]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/vm05.blexke/trash_purge_schedule"}]: dispatch 2026-03-10T07:50:28.491 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:50:28 vm08.local ceph-mon[59917]: mgrmap e27: vm05.blexke(active, since 1.11896s) 2026-03-10T07:50:28.491 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:50:28 vm08.local ceph-mon[59917]: from='client.24429 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T07:50:28.491 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:50:28 vm08.local ceph-mon[59917]: pgmap v3: 65 pgs: 65 active+clean; 173 MiB data, 998 MiB used, 119 GiB / 120 GiB avail 2026-03-10T07:50:30.196 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:50:30 vm08.local ceph-mon[59917]: pgmap v4: 65 pgs: 65 active+clean; 173 MiB data, 998 MiB used, 119 GiB / 120 GiB avail 2026-03-10T07:50:30.196 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:50:30 vm08.local ceph-mon[59917]: mgrmap e28: vm05.blexke(active, since 2s) 2026-03-10T07:50:30.196 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:50:30 vm08.local ceph-mon[59917]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' 2026-03-10T07:50:30.196 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:50:30 
vm08.local ceph-mon[59917]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' 2026-03-10T07:50:30.196 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:50:30 vm08.local ceph-mon[59917]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' 2026-03-10T07:50:30.196 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:50:30 vm08.local ceph-mon[59917]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' 2026-03-10T07:50:30.196 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:50:30 vm08.local ceph-mon[59917]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' cmd=[{"prefix": "config rm", "who": "osd/host:vm08", "name": "osd_memory_target"}]: dispatch 2026-03-10T07:50:30.408 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:50:30 vm05.local ceph-mon[50387]: pgmap v4: 65 pgs: 65 active+clean; 173 MiB data, 998 MiB used, 119 GiB / 120 GiB avail 2026-03-10T07:50:30.408 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:50:30 vm05.local ceph-mon[50387]: mgrmap e28: vm05.blexke(active, since 2s) 2026-03-10T07:50:30.408 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:50:30 vm05.local ceph-mon[50387]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' 2026-03-10T07:50:30.408 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:50:30 vm05.local ceph-mon[50387]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' 2026-03-10T07:50:30.408 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:50:30 vm05.local ceph-mon[50387]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' 2026-03-10T07:50:30.408 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:50:30 vm05.local ceph-mon[50387]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' 2026-03-10T07:50:30.408 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:50:30 vm05.local ceph-mon[50387]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' 
cmd=[{"prefix": "config rm", "who": "osd/host:vm08", "name": "osd_memory_target"}]: dispatch 2026-03-10T07:50:31.407 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:50:31 vm05.local ceph-mon[50387]: [10/Mar/2026:07:50:29] ENGINE Bus STARTING 2026-03-10T07:50:31.407 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:50:31 vm05.local ceph-mon[50387]: [10/Mar/2026:07:50:29] ENGINE Serving on https://192.168.123.105:7150 2026-03-10T07:50:31.407 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:50:31 vm05.local ceph-mon[50387]: [10/Mar/2026:07:50:29] ENGINE Client ('192.168.123.105', 35106) lost — peer dropped the TLS connection suddenly, during handshake: (6, 'TLS/SSL connection has been closed (EOF) (_ssl.c:1147)') 2026-03-10T07:50:31.407 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:50:31 vm05.local ceph-mon[50387]: [10/Mar/2026:07:50:29] ENGINE Serving on http://192.168.123.105:8765 2026-03-10T07:50:31.407 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:50:31 vm05.local ceph-mon[50387]: [10/Mar/2026:07:50:29] ENGINE Bus STARTED 2026-03-10T07:50:31.407 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:50:31 vm05.local ceph-mon[50387]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' 2026-03-10T07:50:31.407 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:50:31 vm05.local ceph-mon[50387]: Standby manager daemon vm08.orfpog started 2026-03-10T07:50:31.407 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:50:31 vm05.local ceph-mon[50387]: from='mgr.? 192.168.123.108:0/1946652531' entity='mgr.vm08.orfpog' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/vm08.orfpog/crt"}]: dispatch 2026-03-10T07:50:31.407 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:50:31 vm05.local ceph-mon[50387]: from='mgr.? 
192.168.123.108:0/1946652531' entity='mgr.vm08.orfpog' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/crt"}]: dispatch 2026-03-10T07:50:31.407 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:50:31 vm05.local ceph-mon[50387]: from='mgr.? 192.168.123.108:0/1946652531' entity='mgr.vm08.orfpog' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/vm08.orfpog/key"}]: dispatch 2026-03-10T07:50:31.407 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:50:31 vm05.local ceph-mon[50387]: from='mgr.? 192.168.123.108:0/1946652531' entity='mgr.vm08.orfpog' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/key"}]: dispatch 2026-03-10T07:50:31.407 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:50:31 vm05.local ceph-mon[50387]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' 2026-03-10T07:50:31.418 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:50:31 vm08.local ceph-mon[59917]: [10/Mar/2026:07:50:29] ENGINE Bus STARTING 2026-03-10T07:50:31.418 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:50:31 vm08.local ceph-mon[59917]: [10/Mar/2026:07:50:29] ENGINE Serving on https://192.168.123.105:7150 2026-03-10T07:50:31.418 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:50:31 vm08.local ceph-mon[59917]: [10/Mar/2026:07:50:29] ENGINE Client ('192.168.123.105', 35106) lost — peer dropped the TLS connection suddenly, during handshake: (6, 'TLS/SSL connection has been closed (EOF) (_ssl.c:1147)') 2026-03-10T07:50:31.418 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:50:31 vm08.local ceph-mon[59917]: [10/Mar/2026:07:50:29] ENGINE Serving on http://192.168.123.105:8765 2026-03-10T07:50:31.418 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:50:31 vm08.local ceph-mon[59917]: [10/Mar/2026:07:50:29] ENGINE Bus STARTED 2026-03-10T07:50:31.418 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:50:31 vm08.local ceph-mon[59917]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' 2026-03-10T07:50:31.418 
INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:50:31 vm08.local ceph-mon[59917]: Standby manager daemon vm08.orfpog started 2026-03-10T07:50:31.418 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:50:31 vm08.local ceph-mon[59917]: from='mgr.? 192.168.123.108:0/1946652531' entity='mgr.vm08.orfpog' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/vm08.orfpog/crt"}]: dispatch 2026-03-10T07:50:31.418 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:50:31 vm08.local ceph-mon[59917]: from='mgr.? 192.168.123.108:0/1946652531' entity='mgr.vm08.orfpog' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/crt"}]: dispatch 2026-03-10T07:50:31.418 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:50:31 vm08.local ceph-mon[59917]: from='mgr.? 192.168.123.108:0/1946652531' entity='mgr.vm08.orfpog' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/vm08.orfpog/key"}]: dispatch 2026-03-10T07:50:31.418 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:50:31 vm08.local ceph-mon[59917]: from='mgr.? 
192.168.123.108:0/1946652531' entity='mgr.vm08.orfpog' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/key"}]: dispatch 2026-03-10T07:50:31.418 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:50:31 vm08.local ceph-mon[59917]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' 2026-03-10T07:50:32.137 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:50:32 vm05.local ceph-mon[50387]: pgmap v5: 65 pgs: 65 active+clean; 173 MiB data, 998 MiB used, 119 GiB / 120 GiB avail 2026-03-10T07:50:32.137 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:50:32 vm05.local ceph-mon[50387]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' 2026-03-10T07:50:32.137 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:50:32 vm05.local ceph-mon[50387]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' 2026-03-10T07:50:32.137 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:50:32 vm05.local ceph-mon[50387]: mgrmap e29: vm05.blexke(active, since 4s), standbys: vm08.orfpog 2026-03-10T07:50:32.137 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:50:32 vm05.local ceph-mon[50387]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' cmd=[{"prefix": "mgr metadata", "who": "vm08.orfpog", "id": "vm08.orfpog"}]: dispatch 2026-03-10T07:50:32.418 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:50:32 vm08.local ceph-mon[59917]: pgmap v5: 65 pgs: 65 active+clean; 173 MiB data, 998 MiB used, 119 GiB / 120 GiB avail 2026-03-10T07:50:32.418 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:50:32 vm08.local ceph-mon[59917]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' 2026-03-10T07:50:32.418 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:50:32 vm08.local ceph-mon[59917]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' 2026-03-10T07:50:32.418 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:50:32 vm08.local ceph-mon[59917]: mgrmap e29: 
vm05.blexke(active, since 4s), standbys: vm08.orfpog 2026-03-10T07:50:32.418 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:50:32 vm08.local ceph-mon[59917]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' cmd=[{"prefix": "mgr metadata", "who": "vm08.orfpog", "id": "vm08.orfpog"}]: dispatch 2026-03-10T07:50:34.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:50:33 vm05.local ceph-mon[50387]: pgmap v6: 65 pgs: 65 active+clean; 173 MiB data, 998 MiB used, 119 GiB / 120 GiB avail 2026-03-10T07:50:34.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:50:33 vm05.local ceph-mon[50387]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' 2026-03-10T07:50:34.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:50:33 vm05.local ceph-mon[50387]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' 2026-03-10T07:50:34.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:50:33 vm05.local ceph-mon[50387]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' cmd=[{"prefix": "config rm", "who": "osd/host:vm05", "name": "osd_memory_target"}]: dispatch 2026-03-10T07:50:34.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:50:33 vm05.local ceph-mon[50387]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T07:50:34.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:50:33 vm05.local ceph-mon[50387]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T07:50:34.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:50:33 vm08.local ceph-mon[59917]: pgmap v6: 65 pgs: 65 active+clean; 173 MiB data, 998 MiB used, 119 GiB / 120 GiB avail 2026-03-10T07:50:34.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:50:33 vm08.local ceph-mon[59917]: from='mgr.14628 192.168.123.105:0/1139394548' 
entity='mgr.vm05.blexke' 2026-03-10T07:50:34.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:50:33 vm08.local ceph-mon[59917]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' 2026-03-10T07:50:34.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:50:33 vm08.local ceph-mon[59917]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' cmd=[{"prefix": "config rm", "who": "osd/host:vm05", "name": "osd_memory_target"}]: dispatch 2026-03-10T07:50:34.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:50:33 vm08.local ceph-mon[59917]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T07:50:34.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:50:33 vm08.local ceph-mon[59917]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T07:50:35.059 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:50:35 vm05.local ceph-mon[50387]: Updating vm05:/etc/ceph/ceph.conf 2026-03-10T07:50:35.059 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:50:35 vm05.local ceph-mon[50387]: Updating vm08:/etc/ceph/ceph.conf 2026-03-10T07:50:35.059 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:50:35 vm05.local ceph-mon[50387]: Updating vm08:/var/lib/ceph/12e9780e-1c55-11f1-8896-79f7c2e9b508/config/ceph.conf 2026-03-10T07:50:35.059 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:50:35 vm05.local ceph-mon[50387]: Updating vm05:/var/lib/ceph/12e9780e-1c55-11f1-8896-79f7c2e9b508/config/ceph.conf 2026-03-10T07:50:35.059 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:50:35 vm05.local ceph-mon[50387]: Updating vm08:/etc/ceph/ceph.client.admin.keyring 2026-03-10T07:50:35.059 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:50:35 vm05.local ceph-mon[50387]: Updating vm05:/etc/ceph/ceph.client.admin.keyring 2026-03-10T07:50:35.059 
INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:50:35 vm05.local ceph-mon[50387]: Updating vm08:/var/lib/ceph/12e9780e-1c55-11f1-8896-79f7c2e9b508/config/ceph.client.admin.keyring 2026-03-10T07:50:35.059 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:50:35 vm05.local ceph-mon[50387]: Updating vm05:/var/lib/ceph/12e9780e-1c55-11f1-8896-79f7c2e9b508/config/ceph.client.admin.keyring 2026-03-10T07:50:35.059 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:50:35 vm05.local ceph-mon[50387]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' 2026-03-10T07:50:35.059 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:50:35 vm05.local ceph-mon[50387]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' 2026-03-10T07:50:35.059 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:50:35 vm05.local ceph-mon[50387]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' 2026-03-10T07:50:35.059 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:50:35 vm05.local ceph-mon[50387]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' 2026-03-10T07:50:35.059 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:50:35 vm05.local ceph-mon[50387]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' 2026-03-10T07:50:35.059 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:50:35 vm05.local ceph-mon[50387]: Reconfiguring prometheus.vm05 (dependencies changed)... 
2026-03-10T07:50:35.418 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:50:35 vm08.local ceph-mon[59917]: Updating vm05:/etc/ceph/ceph.conf
2026-03-10T07:50:35.418 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:50:35 vm08.local ceph-mon[59917]: Updating vm08:/etc/ceph/ceph.conf
2026-03-10T07:50:35.418 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:50:35 vm08.local ceph-mon[59917]: Updating vm08:/var/lib/ceph/12e9780e-1c55-11f1-8896-79f7c2e9b508/config/ceph.conf
2026-03-10T07:50:35.418 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:50:35 vm08.local ceph-mon[59917]: Updating vm05:/var/lib/ceph/12e9780e-1c55-11f1-8896-79f7c2e9b508/config/ceph.conf
2026-03-10T07:50:35.418 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:50:35 vm08.local ceph-mon[59917]: Updating vm08:/etc/ceph/ceph.client.admin.keyring
2026-03-10T07:50:35.418 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:50:35 vm08.local ceph-mon[59917]: Updating vm05:/etc/ceph/ceph.client.admin.keyring
2026-03-10T07:50:35.418 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:50:35 vm08.local ceph-mon[59917]: Updating vm08:/var/lib/ceph/12e9780e-1c55-11f1-8896-79f7c2e9b508/config/ceph.client.admin.keyring
2026-03-10T07:50:35.418 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:50:35 vm08.local ceph-mon[59917]: Updating vm05:/var/lib/ceph/12e9780e-1c55-11f1-8896-79f7c2e9b508/config/ceph.client.admin.keyring
2026-03-10T07:50:35.419 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:50:35 vm08.local ceph-mon[59917]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke'
2026-03-10T07:50:35.419 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:50:35 vm08.local ceph-mon[59917]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke'
2026-03-10T07:50:35.419 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:50:35 vm08.local ceph-mon[59917]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke'
2026-03-10T07:50:35.419 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:50:35 vm08.local ceph-mon[59917]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke'
2026-03-10T07:50:35.419 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:50:35 vm08.local ceph-mon[59917]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke'
2026-03-10T07:50:35.419 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:50:35 vm08.local ceph-mon[59917]: Reconfiguring prometheus.vm05 (dependencies changed)...
2026-03-10T07:50:35.833 INFO:tasks.workunit.client.0.vm05.stderr:+ pushd ltp-full-20091231/testcases/kernel/fs/fsstress
2026-03-10T07:50:35.836 INFO:tasks.workunit.client.0.vm05.stdout:~/cephtest/mnt.0/client.0/tmp/fsstress/ltp-full-20091231/testcases/kernel/fs/fsstress ~/cephtest/mnt.0/client.0/tmp/fsstress ~/cephtest/mnt.0/client.0/tmp
2026-03-10T07:50:35.836 INFO:tasks.workunit.client.0.vm05.stderr:+ make
2026-03-10T07:50:36.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:50:36 vm08.local ceph-mon[59917]: Reconfiguring daemon prometheus.vm05 on vm05
2026-03-10T07:50:36.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:50:36 vm08.local ceph-mon[59917]: pgmap v7: 65 pgs: 65 active+clean; 189 MiB data, 1.1 GiB used, 119 GiB / 120 GiB avail; 26 KiB/s rd, 2.4 MiB/s wr, 202 op/s
2026-03-10T07:50:36.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:50:36 vm08.local ceph-mon[59917]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke'
2026-03-10T07:50:36.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:50:36 vm08.local ceph-mon[59917]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke'
2026-03-10T07:50:36.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:50:36 vm08.local ceph-mon[59917]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' cmd=[{"prefix": "dashboard get-prometheus-api-host"}]: dispatch
2026-03-10T07:50:36.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:50:36 vm08.local ceph-mon[59917]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch
2026-03-10T07:50:36.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:50:36 vm08.local ceph-mon[59917]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' cmd=[{"prefix": "auth get-or-create", "entity": "mgr.vm08.orfpog", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]: dispatch
2026-03-10T07:50:36.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:50:36 vm08.local ceph-mon[59917]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' cmd=[{"prefix": "mgr services"}]: dispatch
2026-03-10T07:50:36.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:50:36 vm08.local ceph-mon[59917]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-10T07:50:36.219 INFO:tasks.workunit.client.0.vm05.stdout:cc -DNO_XFS -I/home/ubuntu/cephtest/mnt.0/client.0/tmp/fsstress/ltp-full-20091231/testcases/kernel/fs/fsstress -D_LARGEFILE64_SOURCE -D_GNU_SOURCE -I../../../../include -I../../../../include -L../../../../lib fsstress.c -o fsstress
2026-03-10T07:50:36.360 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:50:36 vm05.local ceph-mon[50387]: Reconfiguring daemon prometheus.vm05 on vm05
2026-03-10T07:50:36.360 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:50:36 vm05.local ceph-mon[50387]: pgmap v7: 65 pgs: 65 active+clean; 189 MiB data, 1.1 GiB used, 119 GiB / 120 GiB avail; 26 KiB/s rd, 2.4 MiB/s wr, 202 op/s
2026-03-10T07:50:36.360 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:50:36 vm05.local ceph-mon[50387]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke'
2026-03-10T07:50:36.360 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:50:36 vm05.local ceph-mon[50387]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke'
2026-03-10T07:50:36.360 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:50:36 vm05.local ceph-mon[50387]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' cmd=[{"prefix": "dashboard get-prometheus-api-host"}]: dispatch
2026-03-10T07:50:36.360 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:50:36 vm05.local ceph-mon[50387]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch
2026-03-10T07:50:36.360 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:50:36 vm05.local ceph-mon[50387]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' cmd=[{"prefix": "auth get-or-create", "entity": "mgr.vm08.orfpog", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]: dispatch
2026-03-10T07:50:36.360 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:50:36 vm05.local ceph-mon[50387]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' cmd=[{"prefix": "mgr services"}]: dispatch
2026-03-10T07:50:36.360 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:50:36 vm05.local ceph-mon[50387]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-10T07:50:36.475 INFO:tasks.workunit.client.0.vm05.stderr:++ readlink -f fsstress
2026-03-10T07:50:36.477 INFO:tasks.workunit.client.0.vm05.stderr:+ BIN=/home/ubuntu/cephtest/mnt.0/client.0/tmp/fsstress/ltp-full-20091231/testcases/kernel/fs/fsstress/fsstress
2026-03-10T07:50:36.477 INFO:tasks.workunit.client.0.vm05.stderr:+ popd
2026-03-10T07:50:36.479 INFO:tasks.workunit.client.0.vm05.stdout:~/cephtest/mnt.0/client.0/tmp/fsstress ~/cephtest/mnt.0/client.0/tmp
2026-03-10T07:50:36.479 INFO:tasks.workunit.client.0.vm05.stderr:+ popd
2026-03-10T07:50:36.479 INFO:tasks.workunit.client.0.vm05.stdout:~/cephtest/mnt.0/client.0/tmp
2026-03-10T07:50:36.480 INFO:tasks.workunit.client.0.vm05.stderr:++ mktemp -d -p .
2026-03-10T07:50:36.482 INFO:tasks.workunit.client.0.vm05.stderr:+ T=./tmp.9XAR9Oh0ly
2026-03-10T07:50:36.482 INFO:tasks.workunit.client.0.vm05.stderr:+ /home/ubuntu/cephtest/mnt.0/client.0/tmp/fsstress/ltp-full-20091231/testcases/kernel/fs/fsstress/fsstress -d ./tmp.9XAR9Oh0ly -l 1 -n 1000 -p 10 -v
2026-03-10T07:50:36.485 INFO:tasks.workunit.client.0.vm05.stdout:seed = 1772660583
2026-03-10T07:50:36.491 INFO:tasks.workunit.client.0.vm05.stdout:2/0: chown . 109 1
2026-03-10T07:50:36.491 INFO:tasks.workunit.client.0.vm05.stdout:2/1: write - no filename
2026-03-10T07:50:36.491 INFO:tasks.workunit.client.0.vm05.stdout:2/2: dwrite - no filename
2026-03-10T07:50:36.491 INFO:tasks.workunit.client.0.vm05.stdout:2/3: fdatasync - no filename
2026-03-10T07:50:36.491 INFO:tasks.workunit.client.0.vm05.stdout:2/4: rename - no filename
2026-03-10T07:50:36.491 INFO:tasks.workunit.client.0.vm05.stdout:2/5: dread - no filename
2026-03-10T07:50:36.492 INFO:tasks.workunit.client.0.vm05.stdout:2/6: write - no filename
2026-03-10T07:50:36.492 INFO:tasks.workunit.client.0.vm05.stdout:2/7: chown . 49716 1
2026-03-10T07:50:36.493 INFO:tasks.workunit.client.0.vm05.stdout:3/0: fdatasync - no filename
2026-03-10T07:50:36.494 INFO:tasks.workunit.client.0.vm05.stdout:0/0: mknod c0 0
2026-03-10T07:50:36.508 INFO:tasks.workunit.client.0.vm05.stdout:2/8: mkdir d0 0
2026-03-10T07:50:36.508 INFO:tasks.workunit.client.0.vm05.stdout:2/9: unlink - no file
2026-03-10T07:50:36.508 INFO:tasks.workunit.client.0.vm05.stdout:2/10: truncate - no filename
2026-03-10T07:50:36.509 INFO:tasks.workunit.client.0.vm05.stdout:4/0: dwrite - no filename
2026-03-10T07:50:36.509 INFO:tasks.workunit.client.0.vm05.stdout:4/1: rmdir - no directory
2026-03-10T07:50:36.509 INFO:tasks.workunit.client.0.vm05.stdout:4/2: rename - no filename
2026-03-10T07:50:36.509 INFO:tasks.workunit.client.0.vm05.stdout:4/3: read - no filename
2026-03-10T07:50:36.510 INFO:tasks.workunit.client.0.vm05.stdout:3/1: getdents . 0
2026-03-10T07:50:36.510 INFO:tasks.workunit.client.0.vm05.stdout:3/2: chown . 167755926 1
2026-03-10T07:50:36.514 INFO:tasks.workunit.client.0.vm05.stdout:5/0: dread - no filename
2026-03-10T07:50:36.514 INFO:tasks.workunit.client.0.vm05.stdout:5/1: truncate - no filename
2026-03-10T07:50:36.516 INFO:tasks.workunit.client.0.vm05.stdout:1/0: creat f0 x:0 0 0
2026-03-10T07:50:36.518 INFO:tasks.workunit.client.0.vm05.stdout:2/11: creat d0/f1 x:0 0 0
2026-03-10T07:50:36.518 INFO:tasks.workunit.client.0.vm05.stdout:2/12: read - d0/f1 zero size
2026-03-10T07:50:36.519 INFO:tasks.workunit.client.0.vm05.stdout:2/13: dread - d0/f1 zero size
2026-03-10T07:50:36.520 INFO:tasks.workunit.client.0.vm05.stdout:6/0: link - no file
2026-03-10T07:50:36.520 INFO:tasks.workunit.client.0.vm05.stdout:6/1: chown . 1170864354 1
2026-03-10T07:50:36.520 INFO:tasks.workunit.client.0.vm05.stdout:6/2: truncate - no filename
2026-03-10T07:50:36.520 INFO:tasks.workunit.client.0.vm05.stdout:6/3: write - no filename
2026-03-10T07:50:36.520 INFO:tasks.workunit.client.0.vm05.stdout:6/4: chown . 449205 1
2026-03-10T07:50:36.521 INFO:tasks.workunit.client.0.vm05.stdout:6/5: chown . 4456 1
2026-03-10T07:50:36.522 INFO:tasks.workunit.client.0.vm05.stdout:4/4: mkdir d0 0
2026-03-10T07:50:36.524 INFO:tasks.workunit.client.0.vm05.stdout:1/1: unlink f0 0
2026-03-10T07:50:36.524 INFO:tasks.workunit.client.0.vm05.stdout:1/2: write - no filename
2026-03-10T07:50:36.526 INFO:tasks.workunit.client.0.vm05.stdout:3/3: creat f0 x:0 0 0
2026-03-10T07:50:36.527 INFO:tasks.workunit.client.0.vm05.stdout:3/4: write f0 [897596,103613] 0
2026-03-10T07:50:36.528 INFO:tasks.workunit.client.0.vm05.stdout:5/2: symlink l0 0
2026-03-10T07:50:36.528 INFO:tasks.workunit.client.0.vm05.stdout:5/3: truncate - no filename
2026-03-10T07:50:36.542 INFO:tasks.workunit.client.0.vm05.stdout:1/3: mknod c1 0
2026-03-10T07:50:36.542 INFO:tasks.workunit.client.0.vm05.stdout:1/4: dread - no filename
2026-03-10T07:50:36.542 INFO:tasks.workunit.client.0.vm05.stdout:1/5: write - no filename
2026-03-10T07:50:36.542 INFO:tasks.workunit.client.0.vm05.stdout:1/6: dread - no filename
2026-03-10T07:50:36.542 INFO:tasks.workunit.client.0.vm05.stdout:3/5: read f0 [601910,46606] 0
2026-03-10T07:50:36.543 INFO:tasks.workunit.client.0.vm05.stdout:5/4: rename l0 to l1 0
2026-03-10T07:50:36.543 INFO:tasks.workunit.client.0.vm05.stdout:5/5: truncate - no filename
2026-03-10T07:50:36.543 INFO:tasks.workunit.client.0.vm05.stdout:6/6: mkdir d0 0
2026-03-10T07:50:36.546 INFO:tasks.workunit.client.0.vm05.stdout:4/5: creat d0/f1 x:0 0 0
2026-03-10T07:50:36.546 INFO:tasks.workunit.client.0.vm05.stdout:4/6: stat d0 0
2026-03-10T07:50:36.552 INFO:tasks.workunit.client.0.vm05.stdout:7/0: creat f0 x:0 0 0
2026-03-10T07:50:36.553 INFO:tasks.workunit.client.0.vm05.stdout:5/6: mkdir d2 0
2026-03-10T07:50:36.554 INFO:tasks.workunit.client.0.vm05.stdout:5/7: stat d2 0
2026-03-10T07:50:36.555 INFO:tasks.workunit.client.0.vm05.stdout:3/6: write f0 [1387081,59903] 0
2026-03-10T07:50:36.570 INFO:tasks.workunit.client.0.vm05.stdout:4/7: rename d0/f1 to d0/f2 0
2026-03-10T07:50:36.570 INFO:tasks.workunit.client.0.vm05.stdout:7/1: mkdir d1 0
2026-03-10T07:50:36.571 INFO:tasks.workunit.client.0.vm05.stdout:6/7: mknod d0/c1 0
2026-03-10T07:50:36.571 INFO:tasks.workunit.client.0.vm05.stdout:6/8: write - no filename
2026-03-10T07:50:36.576 INFO:tasks.workunit.client.0.vm05.stdout:3/7: dwrite f0 [0,4194304] 0
2026-03-10T07:50:36.576 INFO:tasks.workunit.client.0.vm05.stdout:7/2: write f0 [436454,81302] 0
2026-03-10T07:50:36.596 INFO:tasks.workunit.client.0.vm05.stdout:1/7: link c1 c2 0
2026-03-10T07:50:36.596 INFO:tasks.workunit.client.0.vm05.stdout:1/8: write - no filename
2026-03-10T07:50:36.596 INFO:tasks.workunit.client.0.vm05.stdout:1/9: write - no filename
2026-03-10T07:50:36.596 INFO:tasks.workunit.client.0.vm05.stdout:1/10: write - no filename
2026-03-10T07:50:36.598 INFO:tasks.workunit.client.0.vm05.stdout:5/8: creat d2/f3 x:0 0 0
2026-03-10T07:50:36.598 INFO:tasks.workunit.client.0.vm05.stdout:7/3: dread f0 [0,4194304] 0
2026-03-10T07:50:36.603 INFO:tasks.workunit.client.0.vm05.stdout:4/8: symlink d0/l3 0
2026-03-10T07:50:36.611 INFO:tasks.workunit.client.0.vm05.stdout:5/9: write d2/f3 [788307,86947] 0
2026-03-10T07:50:36.611 INFO:tasks.workunit.client.0.vm05.stdout:5/10: chown d2 149034 1
2026-03-10T07:50:36.611 INFO:tasks.workunit.client.0.vm05.stdout:8/0: creat f0 x:0 0 0
2026-03-10T07:50:36.611 INFO:tasks.workunit.client.0.vm05.stdout:8/1: read - f0 zero size
2026-03-10T07:50:36.611 INFO:tasks.workunit.client.0.vm05.stdout:4/9: dread - d0/f2 zero size
2026-03-10T07:50:36.611 INFO:tasks.workunit.client.0.vm05.stdout:5/11: chown d2/f3 96 1
2026-03-10T07:50:36.611 INFO:tasks.workunit.client.0.vm05.stdout:2/14: fsync d0/f1 0
2026-03-10T07:50:36.611 INFO:tasks.workunit.client.0.vm05.stdout:4/10: dread - d0/f2 zero size
2026-03-10T07:50:36.611 INFO:tasks.workunit.client.0.vm05.stdout:3/8: symlink l1 0
2026-03-10T07:50:36.611 INFO:tasks.workunit.client.0.vm05.stdout:3/9: write f0 [3230972,40355] 0
2026-03-10T07:50:36.614 INFO:tasks.workunit.client.0.vm05.stdout:1/11: creat f3 x:0 0 0
2026-03-10T07:50:36.614 INFO:tasks.workunit.client.0.vm05.stdout:1/12: read - f3 zero size
2026-03-10T07:50:36.618 INFO:tasks.workunit.client.0.vm05.stdout:8/2: mkdir d1 0
2026-03-10T07:50:36.619 INFO:tasks.workunit.client.0.vm05.stdout:2/15: creat d0/f2 x:0 0 0
2026-03-10T07:50:36.619 INFO:tasks.workunit.client.0.vm05.stdout:2/16: dread - d0/f2 zero size
2026-03-10T07:50:36.621 INFO:tasks.workunit.client.0.vm05.stdout:4/11: symlink d0/l4 0
2026-03-10T07:50:36.623 INFO:tasks.workunit.client.0.vm05.stdout:4/12: truncate d0/f2 607852 0
2026-03-10T07:50:36.631 INFO:tasks.workunit.client.0.vm05.stdout:3/10: mknod c2 0
2026-03-10T07:50:36.631 INFO:tasks.workunit.client.0.vm05.stdout:3/11: write f0 [1815674,106523] 0
2026-03-10T07:50:36.631 INFO:tasks.workunit.client.0.vm05.stdout:3/12: chown c2 29905066 1
2026-03-10T07:50:36.631 INFO:tasks.workunit.client.0.vm05.stdout:7/4: rename f0 to d1/f2 0
2026-03-10T07:50:36.631 INFO:tasks.workunit.client.0.vm05.stdout:9/0: creat f0 x:0 0 0
2026-03-10T07:50:36.631 INFO:tasks.workunit.client.0.vm05.stdout:7/5: dread d1/f2 [0,4194304] 0
2026-03-10T07:50:36.631 INFO:tasks.workunit.client.0.vm05.stdout:9/1: dread - f0 zero size
2026-03-10T07:50:36.633 INFO:tasks.workunit.client.0.vm05.stdout:2/17: mkdir d0/d3 0
2026-03-10T07:50:36.633 INFO:tasks.workunit.client.0.vm05.stdout:4/13: mknod d0/c5 0
2026-03-10T07:50:36.634 INFO:tasks.workunit.client.0.vm05.stdout:4/14: dread d0/f2 [0,4194304] 0
2026-03-10T07:50:36.635 INFO:tasks.workunit.client.0.vm05.stdout:6/9: link d0/c1 d0/c2 0
2026-03-10T07:50:36.636 INFO:tasks.workunit.client.0.vm05.stdout:6/10: readlink - no filename
2026-03-10T07:50:36.636 INFO:tasks.workunit.client.0.vm05.stdout:3/13: creat f3 x:0 0 0
2026-03-10T07:50:36.637 INFO:tasks.workunit.client.0.vm05.stdout:1/13: chown c1 15019 1
2026-03-10T07:50:36.637 INFO:tasks.workunit.client.0.vm05.stdout:3/14: chown l1 29 1
2026-03-10T07:50:36.638 INFO:tasks.workunit.client.0.vm05.stdout:1/14: dread - f3 zero size
2026-03-10T07:50:36.638 INFO:tasks.workunit.client.0.vm05.stdout:7/6: mkdir d1/d3 0
2026-03-10T07:50:36.639 INFO:tasks.workunit.client.0.vm05.stdout:1/15: write f3 [148479,18857] 0
2026-03-10T07:50:36.641 INFO:tasks.workunit.client.0.vm05.stdout:9/2: creat f1 x:0 0 0
2026-03-10T07:50:36.642 INFO:tasks.workunit.client.0.vm05.stdout:9/3: write f0 [661397,76900] 0
2026-03-10T07:50:36.644 INFO:tasks.workunit.client.0.vm05.stdout:9/4: write f0 [214693,55989] 0
2026-03-10T07:50:36.645 INFO:tasks.workunit.client.0.vm05.stdout:9/5: chown f0 11373 1
2026-03-10T07:50:36.645 INFO:tasks.workunit.client.0.vm05.stdout:9/6: readlink - no filename
2026-03-10T07:50:36.647 INFO:tasks.workunit.client.0.vm05.stdout:1/16: dwrite f3 [0,4194304] 0
2026-03-10T07:50:36.649 INFO:tasks.workunit.client.0.vm05.stdout:8/3: symlink d1/l2 0
2026-03-10T07:50:36.649 INFO:tasks.workunit.client.0.vm05.stdout:4/15: mkdir d0/d6 0
2026-03-10T07:50:36.649 INFO:tasks.workunit.client.0.vm05.stdout:2/18: creat d0/f4 x:0 0 0
2026-03-10T07:50:36.650 INFO:tasks.workunit.client.0.vm05.stdout:2/19: stat d0/f1 0
2026-03-10T07:50:36.652 INFO:tasks.workunit.client.0.vm05.stdout:6/11: creat d0/f3 x:0 0 0
2026-03-10T07:50:36.656 INFO:tasks.workunit.client.0.vm05.stdout:8/4: dwrite f0 [0,4194304] 0
2026-03-10T07:50:36.660 INFO:tasks.workunit.client.0.vm05.stdout:7/7: write d1/f2 [454661,81987] 0
2026-03-10T07:50:36.669 INFO:tasks.workunit.client.0.vm05.stdout:3/15: creat f4 x:0 0 0
2026-03-10T07:50:36.669 INFO:tasks.workunit.client.0.vm05.stdout:7/8: read d1/f2 [396410,94840] 0
2026-03-10T07:50:36.669 INFO:tasks.workunit.client.0.vm05.stdout:9/7: symlink l2 0
2026-03-10T07:50:36.673 INFO:tasks.workunit.client.0.vm05.stdout:1/17: creat f4 x:0 0 0
2026-03-10T07:50:36.685 INFO:tasks.workunit.client.0.vm05.stdout:1/18: read f3 [918142,60014] 0
2026-03-10T07:50:36.685 INFO:tasks.workunit.client.0.vm05.stdout:4/16: creat d0/f7 x:0 0 0
2026-03-10T07:50:36.685 INFO:tasks.workunit.client.0.vm05.stdout:1/19: dwrite f4 [0,4194304] 0
2026-03-10T07:50:36.685 INFO:tasks.workunit.client.0.vm05.stdout:1/20: chown f4 7 1
2026-03-10T07:50:36.688 INFO:tasks.workunit.client.0.vm05.stdout:3/16: symlink l5 0
2026-03-10T07:50:36.692 INFO:tasks.workunit.client.0.vm05.stdout:3/17: dwrite f0 [0,4194304] 0
2026-03-10T07:50:36.694 INFO:tasks.workunit.client.0.vm05.stdout:3/18: dread - f4 zero size
2026-03-10T07:50:36.700 INFO:tasks.workunit.client.0.vm05.stdout:9/8: creat f3 x:0 0 0
2026-03-10T07:50:36.705 INFO:tasks.workunit.client.0.vm05.stdout:3/19: creat f6 x:0 0 0
2026-03-10T07:50:36.708 INFO:tasks.workunit.client.0.vm05.stdout:7/9: rmdir d1/d3 0
2026-03-10T07:50:36.711 INFO:tasks.workunit.client.0.vm05.stdout:9/9: link f1 f4 0
2026-03-10T07:50:36.713 INFO:tasks.workunit.client.0.vm05.stdout:7/10: symlink d1/l4 0
2026-03-10T07:50:36.714 INFO:tasks.workunit.client.0.vm05.stdout:7/11: dread d1/f2 [0,4194304] 0
2026-03-10T07:50:36.714 INFO:tasks.workunit.client.0.vm05.stdout:9/10: mknod c5 0
2026-03-10T07:50:36.714 INFO:tasks.workunit.client.0.vm05.stdout:9/11: readlink l2 0
2026-03-10T07:50:36.718 INFO:tasks.workunit.client.0.vm05.stdout:7/12: link d1/l4 d1/l5 0
2026-03-10T07:50:36.719 INFO:tasks.workunit.client.0.vm05.stdout:9/12: dwrite f3 [0,4194304] 0
2026-03-10T07:50:36.720 INFO:tasks.workunit.client.0.vm05.stdout:7/13: write d1/f2 [801483,49948] 0
2026-03-10T07:50:36.722 INFO:tasks.workunit.client.0.vm05.stdout:9/13: rename f0 to f6 0
2026-03-10T07:50:36.722 INFO:tasks.workunit.client.0.vm05.stdout:9/14: readlink l2 0
2026-03-10T07:50:36.722 INFO:tasks.workunit.client.0.vm05.stdout:9/15: dread f3 [0,4194304] 0
2026-03-10T07:50:36.723 INFO:tasks.workunit.client.0.vm05.stdout:7/14: mkdir d1/d6 0
2026-03-10T07:50:36.732 INFO:tasks.workunit.client.0.vm05.stdout:7/15: getdents d1/d6 0
2026-03-10T07:50:36.733 INFO:tasks.workunit.client.0.vm05.stdout:7/16: unlink d1/l5 0
2026-03-10T07:50:36.734 INFO:tasks.workunit.client.0.vm05.stdout:7/17: rename d1 to d1/d6/d7 22
2026-03-10T07:50:36.741 INFO:tasks.workunit.client.0.vm05.stdout:7/18: creat d1/d6/f8 x:0 0 0
2026-03-10T07:50:36.742 INFO:tasks.workunit.client.0.vm05.stdout:7/19: creat d1/d6/f9 x:0 0 0
2026-03-10T07:50:36.743 INFO:tasks.workunit.client.0.vm05.stdout:7/20: creat d1/fa x:0 0 0
2026-03-10T07:50:36.745 INFO:tasks.workunit.client.0.vm05.stdout:7/21: rename d1/d6/f8 to d1/d6/fb 0
2026-03-10T07:50:36.747 INFO:tasks.workunit.client.0.vm05.stdout:7/22: unlink d1/l4 0
2026-03-10T07:50:36.955 INFO:tasks.workunit.client.0.vm05.stdout:1/21: fdatasync f3 0
2026-03-10T07:50:36.966 INFO:tasks.workunit.client.0.vm05.stdout:2/20: getdents d0 0
2026-03-10T07:50:36.975 INFO:tasks.workunit.client.0.vm05.stdout:2/21: creat d0/f5 x:0 0 0
2026-03-10T07:50:36.975 INFO:tasks.workunit.client.0.vm05.stdout:2/22: dread - d0/f1 zero size
2026-03-10T07:50:36.976 INFO:tasks.workunit.client.0.vm05.stdout:6/12: getdents d0 0
2026-03-10T07:50:36.976 INFO:tasks.workunit.client.0.vm05.stdout:2/23: write d0/f2 [791832,127144] 0
2026-03-10T07:50:36.976 INFO:tasks.workunit.client.0.vm05.stdout:2/24: readlink - no filename
2026-03-10T07:50:36.976 INFO:tasks.workunit.client.0.vm05.stdout:6/13: dread - d0/f3 zero size
2026-03-10T07:50:36.980 INFO:tasks.workunit.client.0.vm05.stdout:4/17: getdents d0 0
2026-03-10T07:50:36.980 INFO:tasks.workunit.client.0.vm05.stdout:4/18: chown d0/l3 825581 1
2026-03-10T07:50:36.980 INFO:tasks.workunit.client.0.vm05.stdout:4/19: read d0/f2 [597159,62837] 0
2026-03-10T07:50:36.981 INFO:tasks.workunit.client.0.vm05.stdout:4/20: dread d0/f2 [0,4194304] 0
2026-03-10T07:50:36.984 INFO:tasks.workunit.client.0.vm05.stdout:8/5: dwrite f0 [4194304,4194304] 0
2026-03-10T07:50:36.987 INFO:tasks.workunit.client.0.vm05.stdout:8/6: dread f0 [0,4194304] 0
2026-03-10T07:50:36.987 INFO:tasks.workunit.client.0.vm05.stdout:8/7: readlink d1/l2 0
2026-03-10T07:50:37.036 INFO:tasks.workunit.client.0.vm05.stdout:2/25: rename d0/f2 to d0/f6 0
2026-03-10T07:50:37.036 INFO:tasks.workunit.client.0.vm05.stdout:6/14: symlink d0/l4 0
2026-03-10T07:50:37.036 INFO:tasks.workunit.client.0.vm05.stdout:6/15: readlink d0/l4 0
2026-03-10T07:50:37.036 INFO:tasks.workunit.client.0.vm05.stdout:6/16: stat d0 0
2026-03-10T07:50:37.037 INFO:tasks.workunit.client.0.vm05.stdout:6/17: truncate d0/f3 772514 0
2026-03-10T07:50:37.044 INFO:tasks.workunit.client.0.vm05.stdout:6/18: dwrite d0/f3 [0,4194304] 0
2026-03-10T07:50:37.054 INFO:tasks.workunit.client.0.vm05.stdout:8/8: mknod d1/c3 0
2026-03-10T07:50:37.054 INFO:tasks.workunit.client.0.vm05.stdout:6/19: chown d0/f3 59 1
2026-03-10T07:50:37.054 INFO:tasks.workunit.client.0.vm05.stdout:8/9: stat f0 0
2026-03-10T07:50:37.054 INFO:tasks.workunit.client.0.vm05.stdout:2/26: creat d0/f7 x:0 0 0
2026-03-10T07:50:37.054 INFO:tasks.workunit.client.0.vm05.stdout:8/10: rename d1/c3 to d1/c4 0
2026-03-10T07:50:37.054 INFO:tasks.workunit.client.0.vm05.stdout:2/27: mkdir d0/d8 0
2026-03-10T07:50:37.054 INFO:tasks.workunit.client.0.vm05.stdout:8/11: write f0 [2723961,19460] 0
2026-03-10T07:50:37.061 INFO:tasks.workunit.client.0.vm05.stdout:6/20: link d0/l4 d0/l5 0
2026-03-10T07:50:37.062 INFO:tasks.workunit.client.0.vm05.stdout:2/28: mknod d0/d8/c9 0
2026-03-10T07:50:37.064 INFO:tasks.workunit.client.0.vm05.stdout:8/12: dwrite f0 [8388608,4194304] 0
2026-03-10T07:50:37.067 INFO:tasks.workunit.client.0.vm05.stdout:8/13: symlink d1/l5 0
2026-03-10T07:50:37.067 INFO:tasks.workunit.client.0.vm05.stdout:8/14: rename d1 to d1/d6 22
2026-03-10T07:50:37.069 INFO:tasks.workunit.client.0.vm05.stdout:2/29: dwrite d0/f7 [0,4194304] 0
2026-03-10T07:50:37.074 INFO:tasks.workunit.client.0.vm05.stdout:2/30: mknod d0/d3/ca 0
2026-03-10T07:50:37.075 INFO:tasks.workunit.client.0.vm05.stdout:8/15: dwrite f0 [4194304,4194304] 0
2026-03-10T07:50:37.088 INFO:tasks.workunit.client.0.vm05.stdout:2/31: dwrite d0/f1 [0,4194304] 0
2026-03-10T07:50:37.089 INFO:tasks.workunit.client.0.vm05.stdout:2/32: chown d0 4 1
2026-03-10T07:50:37.094 INFO:tasks.workunit.client.0.vm05.stdout:2/33: dwrite d0/f1 [0,4194304] 0
2026-03-10T07:50:37.097 INFO:tasks.workunit.client.0.vm05.stdout:2/34: rmdir d0/d8 39
2026-03-10T07:50:37.098 INFO:tasks.workunit.client.0.vm05.stdout:2/35: chown d0/f5 1329645 1
2026-03-10T07:50:37.099 INFO:tasks.workunit.client.0.vm05.stdout:2/36: creat d0/fb x:0 0 0
2026-03-10T07:50:37.099 INFO:tasks.workunit.client.0.vm05.stdout:2/37: write d0/f7 [4117075,16674] 0
2026-03-10T07:50:37.100 INFO:tasks.workunit.client.0.vm05.stdout:2/38: dread - d0/fb zero size
2026-03-10T07:50:37.163 INFO:tasks.workunit.client.0.vm05.stdout:3/20: fsync f0 0
2026-03-10T07:50:37.165 INFO:tasks.workunit.client.0.vm05.stdout:3/21: dread - f3 zero size
2026-03-10T07:50:37.170 INFO:tasks.workunit.client.0.vm05.stdout:3/22: dwrite f6 [0,4194304] 0
2026-03-10T07:50:37.177 INFO:tasks.workunit.client.0.vm05.stdout:3/23: dwrite f4 [0,4194304] 0
2026-03-10T07:50:37.219 INFO:tasks.workunit.client.0.vm05.stdout:3/24: getdents . 0
2026-03-10T07:50:37.219 INFO:tasks.workunit.client.0.vm05.stdout:3/25: readlink l1 0
2026-03-10T07:50:37.222 INFO:tasks.workunit.client.0.vm05.stdout:3/26: dread f6 [0,4194304] 0
2026-03-10T07:50:37.229 INFO:tasks.workunit.client.0.vm05.stdout:3/27: dwrite f6 [0,4194304] 0
2026-03-10T07:50:37.238 INFO:tasks.workunit.client.0.vm05.stdout:3/28: creat f7 x:0 0 0
2026-03-10T07:50:37.238 INFO:tasks.workunit.client.0.vm05.stdout:7/23: fsync d1/d6/fb 0
2026-03-10T07:50:37.238 INFO:tasks.workunit.client.0.vm05.stdout:7/24: truncate d1/fa 413733 0
2026-03-10T07:50:37.240 INFO:tasks.workunit.client.0.vm05.stdout:1/22: truncate f4 2395759 0
2026-03-10T07:50:37.240 INFO:tasks.workunit.client.0.vm05.stdout:1/23: chown f3 53870 1
2026-03-10T07:50:37.240 INFO:tasks.workunit.client.0.vm05.stdout:1/24: read f3 [898037,55412] 0
2026-03-10T07:50:37.359 INFO:tasks.workunit.client.0.vm05.stdout:2/39: getdents d0/d8 0
2026-03-10T07:50:37.359 INFO:tasks.workunit.client.0.vm05.stdout:2/40: stat d0/d3/ca 0
2026-03-10T07:50:37.361 INFO:tasks.workunit.client.0.vm05.stdout:2/41: creat d0/d8/fc x:0 0 0
2026-03-10T07:50:37.363 INFO:tasks.workunit.client.0.vm05.stdout:8/16: truncate f0 12538947 0
2026-03-10T07:50:37.365 INFO:tasks.workunit.client.0.vm05.stdout:8/17: mknod d1/c7 0
2026-03-10T07:50:37.523 INFO:tasks.workunit.client.0.vm05.stdout:3/29: getdents . 0
2026-03-10T07:50:37.524 INFO:tasks.workunit.client.0.vm05.stdout:3/30: mkdir d8 0
2026-03-10T07:50:37.524 INFO:tasks.workunit.client.0.vm05.stdout:3/31: readlink l1 0
2026-03-10T07:50:37.525 INFO:tasks.workunit.client.0.vm05.stdout:3/32: fsync f7 0
2026-03-10T07:50:37.531 INFO:tasks.workunit.client.0.vm05.stdout:1/25: dread f4 [0,4194304] 0
2026-03-10T07:50:37.533 INFO:tasks.workunit.client.0.vm05.stdout:1/26: dread f3 [0,4194304] 0
2026-03-10T07:50:37.533 INFO:tasks.workunit.client.0.vm05.stdout:1/27: fdatasync f3 0
2026-03-10T07:50:37.538 INFO:tasks.workunit.client.0.vm05.stdout:3/33: dwrite f6 [0,4194304] 0
2026-03-10T07:50:37.539 INFO:tasks.workunit.client.0.vm05.stdout:1/28: link c1 c5 0
2026-03-10T07:50:37.547 INFO:tasks.workunit.client.0.vm05.stdout:1/29: dwrite f3 [0,4194304] 0
2026-03-10T07:50:37.548 INFO:tasks.workunit.client.0.vm05.stdout:1/30: read f3 [1551690,125847] 0
2026-03-10T07:50:37.549 INFO:tasks.workunit.client.0.vm05.stdout:3/34: symlink d8/l9 0
2026-03-10T07:50:37.555 INFO:tasks.workunit.client.0.vm05.stdout:3/35: mkdir d8/da 0
2026-03-10T07:50:37.559 INFO:tasks.workunit.client.0.vm05.stdout:3/36: dwrite f6 [0,4194304] 0
2026-03-10T07:50:37.563 INFO:tasks.workunit.client.0.vm05.stdout:3/37: dread f0 [0,4194304] 0
2026-03-10T07:50:37.649 INFO:tasks.workunit.client.0.vm05.stdout:7/25: truncate d1/fa 153613 0
2026-03-10T07:50:37.650 INFO:tasks.workunit.client.0.vm05.stdout:7/26: rename d1 to d1/dc 22
2026-03-10T07:50:37.674 INFO:tasks.workunit.client.0.vm05.stdout:2/42: truncate d0/f1 1890421 0
2026-03-10T07:50:37.675 INFO:tasks.workunit.client.0.vm05.stdout:2/43: symlink d0/d8/ld 0
2026-03-10T07:50:37.676 INFO:tasks.workunit.client.0.vm05.stdout:2/44: creat d0/d8/fe x:0 0 0
2026-03-10T07:50:37.679 INFO:tasks.workunit.client.0.vm05.stdout:2/45: dwrite d0/d8/fc [0,4194304] 0
2026-03-10T07:50:37.683 INFO:tasks.workunit.client.0.vm05.stdout:2/46: dread d0/f6 [0,4194304] 0
2026-03-10T07:50:37.683 INFO:tasks.workunit.client.0.vm05.stdout:2/47: stat d0/f5 0
2026-03-10T07:50:37.684 INFO:tasks.workunit.client.0.vm05.stdout:2/48: chown d0/f1 33651039 1
2026-03-10T07:50:37.684 INFO:tasks.workunit.client.0.vm05.stdout:2/49: stat d0/d3 0
2026-03-10T07:50:37.684 INFO:tasks.workunit.client.0.vm05.stdout:2/50: read - d0/f5 zero size
2026-03-10T07:50:37.686 INFO:tasks.workunit.client.0.vm05.stdout:2/51: rmdir d0/d3 39
2026-03-10T07:50:37.687 INFO:tasks.workunit.client.0.vm05.stdout:2/52: readlink d0/d8/ld 0
2026-03-10T07:50:37.687 INFO:tasks.workunit.client.0.vm05.stdout:2/53: write d0/d8/fe [264286,3990] 0
2026-03-10T07:50:37.689 INFO:tasks.workunit.client.0.vm05.stdout:2/54: mkdir d0/d3/df 0
2026-03-10T07:50:37.741 INFO:tasks.workunit.client.0.vm05.stdout:4/21: sync
2026-03-10T07:50:37.742 INFO:tasks.workunit.client.0.vm05.stdout:9/16: sync
2026-03-10T07:50:37.742 INFO:tasks.workunit.client.0.vm05.stdout:5/12: sync
2026-03-10T07:50:37.742 INFO:tasks.workunit.client.0.vm05.stdout:6/21: sync
2026-03-10T07:50:37.742 INFO:tasks.workunit.client.0.vm05.stdout:0/1: sync
2026-03-10T07:50:37.742 INFO:tasks.workunit.client.0.vm05.stdout:0/2: chown c0 5120 1
2026-03-10T07:50:37.742 INFO:tasks.workunit.client.0.vm05.stdout:8/18: dread f0 [8388608,4194304] 0
2026-03-10T07:50:37.743 INFO:tasks.workunit.client.0.vm05.stdout:8/19: readlink d1/l5 0
2026-03-10T07:50:37.744 INFO:tasks.workunit.client.0.vm05.stdout:4/22: write d0/f2 [583567,51289] 0
2026-03-10T07:50:37.746 INFO:tasks.workunit.client.0.vm05.stdout:8/20: write f0 [1081291,60103] 0
2026-03-10T07:50:37.747 INFO:tasks.workunit.client.0.vm05.stdout:5/13: rename d2/f3 to d2/f4 0
2026-03-10T07:50:37.748 INFO:tasks.workunit.client.0.vm05.stdout:9/17: link f6 f7 0
2026-03-10T07:50:37.751 INFO:tasks.workunit.client.0.vm05.stdout:8/21: mknod d1/c8 0
2026-03-10T07:50:37.751 INFO:tasks.workunit.client.0.vm05.stdout:8/22: chown d1/l5 1043 1
2026-03-10T07:50:37.755 INFO:tasks.workunit.client.0.vm05.stdout:5/14: mkdir d2/d5 0
2026-03-10T07:50:37.757 INFO:tasks.workunit.client.0.vm05.stdout:0/3: link c0 c1 0
2026-03-10T07:50:37.758 INFO:tasks.workunit.client.0.vm05.stdout:4/23: symlink d0/d6/l8 0
2026-03-10T07:50:37.767 INFO:tasks.workunit.client.0.vm05.stdout:8/23: mkdir d1/d9 0
2026-03-10T07:50:37.768 INFO:tasks.workunit.client.0.vm05.stdout:1/31: truncate f4 1026771 0
2026-03-10T07:50:37.771 INFO:tasks.workunit.client.0.vm05.stdout:8/24: dwrite f0 [0,4194304] 0
2026-03-10T07:50:37.774 INFO:tasks.workunit.client.0.vm05.stdout:1/32: dread f3 [0,4194304] 0
2026-03-10T07:50:37.778 INFO:tasks.workunit.client.0.vm05.stdout:8/25: dwrite f0 [8388608,4194304] 0
2026-03-10T07:50:37.781 INFO:tasks.workunit.client.0.vm05.stdout:8/26: dread f0 [8388608,4194304] 0
2026-03-10T07:50:37.783 INFO:tasks.workunit.client.0.vm05.stdout:8/27: write f0 [10549659,74180] 0
2026-03-10T07:50:37.784 INFO:tasks.workunit.client.0.vm05.stdout:0/4: creat f2 x:0 0 0
2026-03-10T07:50:37.786 INFO:tasks.workunit.client.0.vm05.stdout:3/38: truncate f6 334573 0
2026-03-10T07:50:37.788 INFO:tasks.workunit.client.0.vm05.stdout:4/24: mkdir d0/d6/d9 0
2026-03-10T07:50:37.789 INFO:tasks.workunit.client.0.vm05.stdout:4/25: write d0/f7 [479987,32153] 0
2026-03-10T07:50:37.792 INFO:tasks.workunit.client.0.vm05.stdout:0/5: dwrite f2 [0,4194304] 0
2026-03-10T07:50:37.799 INFO:tasks.workunit.client.0.vm05.stdout:2/55: getdents d0/d8 0
2026-03-10T07:50:37.799 INFO:tasks.workunit.client.0.vm05.stdout:1/33: dwrite f3 [0,4194304] 0
2026-03-10T07:50:37.806 INFO:tasks.workunit.client.0.vm05.stdout:8/28: creat d1/fa x:0 0 0
2026-03-10T07:50:37.807 INFO:tasks.workunit.client.0.vm05.stdout:5/15: fsync d2/f4 0
2026-03-10T07:50:37.811 INFO:tasks.workunit.client.0.vm05.stdout:5/16: dwrite d2/f4 [0,4194304] 0
2026-03-10T07:50:37.817 INFO:tasks.workunit.client.0.vm05.stdout:5/17: dread d2/f4 [0,4194304] 0
2026-03-10T07:50:37.821 INFO:tasks.workunit.client.0.vm05.stdout:0/6: mknod c3 0
2026-03-10T07:50:37.827 INFO:tasks.workunit.client.0.vm05.stdout:6/22: truncate d0/f3 3685192 0
2026-03-10T07:50:37.835 INFO:tasks.workunit.client.0.vm05.stdout:8/29: mknod d1/cb 0
2026-03-10T07:50:37.855 INFO:tasks.workunit.client.0.vm05.stdout:2/56: mknod d0/d3/df/c10 0
2026-03-10T07:50:37.855 INFO:tasks.workunit.client.0.vm05.stdout:9/18: truncate f3 2413858 0
2026-03-10T07:50:37.858 INFO:tasks.workunit.client.0.vm05.stdout:7/27: truncate d1/fa 850385 0
2026-03-10T07:50:37.858 INFO:tasks.workunit.client.0.vm05.stdout:8/30: write f0 [579753,112308] 0
2026-03-10T07:50:37.864 INFO:tasks.workunit.client.0.vm05.stdout:8/31: dread f0 [4194304,4194304] 0
2026-03-10T07:50:37.864 INFO:tasks.workunit.client.0.vm05.stdout:8/32: truncate f0 13034218 0
2026-03-10T07:50:37.877 INFO:tasks.workunit.client.0.vm05.stdout:3/39: rmdir d8/da 0
2026-03-10T07:50:37.877 INFO:tasks.workunit.client.0.vm05.stdout:3/40: readlink l1 0
2026-03-10T07:50:37.885 INFO:tasks.workunit.client.0.vm05.stdout:5/18: creat d2/d5/f6 x:0 0 0
2026-03-10T07:50:37.889 INFO:tasks.workunit.client.0.vm05.stdout:4/26: fdatasync d0/f7 0
2026-03-10T07:50:37.890 INFO:tasks.workunit.client.0.vm05.stdout:4/27: write d0/f7 [748653,81776] 0
2026-03-10T07:50:37.892 INFO:tasks.workunit.client.0.vm05.stdout:9/19: mkdir d8 0
2026-03-10T07:50:37.893 INFO:tasks.workunit.client.0.vm05.stdout:4/28: write d0/f2 [954172,38391] 0
2026-03-10T07:50:37.893 INFO:tasks.workunit.client.0.vm05.stdout:4/29: chown d0/l4 33717 1
2026-03-10T07:50:37.894 INFO:tasks.workunit.client.0.vm05.stdout:2/57: mknod d0/d3/df/c11 0
2026-03-10T07:50:37.898 INFO:tasks.workunit.client.0.vm05.stdout:7/28: mkdir d1/d6/dd 0
2026-03-10T07:50:37.898 INFO:tasks.workunit.client.0.vm05.stdout:7/29: rename d1/d6 to d1/d6/dd/de 22
2026-03-10T07:50:37.900 INFO:tasks.workunit.client.0.vm05.stdout:5/19: mknod d2/d5/c7 0
2026-03-10T07:50:37.900 INFO:tasks.workunit.client.0.vm05.stdout:5/20: dread - d2/d5/f6 zero size
2026-03-10T07:50:37.901 INFO:tasks.workunit.client.0.vm05.stdout:5/21: chown d2 2041282058 1
2026-03-10T07:50:37.913 INFO:tasks.workunit.client.0.vm05.stdout:1/34: dread f4 [0,4194304] 0
2026-03-10T07:50:37.924 INFO:tasks.workunit.client.0.vm05.stdout:4/30: rmdir d0/d6 39
2026-03-10T07:50:37.928 INFO:tasks.workunit.client.0.vm05.stdout:4/31: dwrite d0/f7 [0,4194304] 0
2026-03-10T07:50:37.936 INFO:tasks.workunit.client.0.vm05.stdout:8/33: mknod d1/d9/cc 0
2026-03-10T07:50:37.954 INFO:tasks.workunit.client.0.vm05.stdout:3/41: truncate f0 1561804 0
2026-03-10T07:50:37.959 INFO:tasks.workunit.client.0.vm05.stdout:5/22: write d2/f4 [2285226,101128] 0
2026-03-10T07:50:37.986 INFO:tasks.workunit.client.0.vm05.stdout:9/20: creat d8/f9 x:0 0 0
2026-03-10T07:50:37.987 INFO:tasks.workunit.client.0.vm05.stdout:6/23: dread d0/f3 [0,4194304] 0
2026-03-10T07:50:37.996 INFO:tasks.workunit.client.0.vm05.stdout:0/7: truncate f2 568289 0
2026-03-10T07:50:37.998 INFO:tasks.workunit.client.0.vm05.stdout:2/58: mknod d0/c12 0
2026-03-10T07:50:37.999 INFO:tasks.workunit.client.0.vm05.stdout:2/59: fsync d0/d8/fc 0
2026-03-10T07:50:38.005 INFO:tasks.workunit.client.0.vm05.stdout:8/34: write f0 [10319752,74924] 0
2026-03-10T07:50:38.024 INFO:tasks.workunit.client.0.vm05.stdout:3/42: creat d8/fb x:0 0 0
2026-03-10T07:50:38.024 INFO:tasks.workunit.client.0.vm05.stdout:5/23: rmdir d2 39
2026-03-10T07:50:38.024 INFO:tasks.workunit.client.0.vm05.stdout:1/35: rename c2 to c6 0
2026-03-10T07:50:38.034 INFO:tasks.workunit.client.0.vm05.stdout:6/24: sync
2026-03-10T07:50:38.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:50:37 vm05.local ceph-mon[50387]: from='mon.0 -' entity='mon.' cmd=[{"prefix": "dashboard get-prometheus-api-host"}]: dispatch
2026-03-10T07:50:38.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:50:37 vm05.local ceph-mon[50387]: Upgrade: Updating mgr.vm08.orfpog
2026-03-10T07:50:38.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:50:37 vm05.local ceph-mon[50387]: Deploying daemon mgr.vm08.orfpog on vm08
2026-03-10T07:50:38.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:50:37 vm08.local ceph-mon[59917]: from='mon.0 -' entity='mon.' cmd=[{"prefix": "dashboard get-prometheus-api-host"}]: dispatch
2026-03-10T07:50:38.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:50:37 vm08.local ceph-mon[59917]: Upgrade: Updating mgr.vm08.orfpog
2026-03-10T07:50:38.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:50:37 vm08.local ceph-mon[59917]: Deploying daemon mgr.vm08.orfpog on vm08
2026-03-10T07:50:38.199 INFO:tasks.workunit.client.0.vm05.stdout:2/60: mknod d0/d3/df/c13 0
2026-03-10T07:50:38.203 INFO:tasks.workunit.client.0.vm05.stdout:8/35: rename d1/d9 to d1/dd 0
2026-03-10T07:50:38.203 INFO:tasks.workunit.client.0.vm05.stdout:3/43: rename f7 to d8/fc 0
2026-03-10T07:50:38.203 INFO:tasks.workunit.client.0.vm05.stdout:2/61: write d0/f4 [654016,99894] 0
2026-03-10T07:50:38.204 INFO:tasks.workunit.client.0.vm05.stdout:5/24: fsync d2/f4 0
2026-03-10T07:50:38.204 INFO:tasks.workunit.client.0.vm05.stdout:2/62: read d0/f6 [423261,30877] 0
2026-03-10T07:50:38.207 INFO:tasks.workunit.client.0.vm05.stdout:2/63: stat d0/d3/df 0
2026-03-10T07:50:38.211 INFO:tasks.workunit.client.0.vm05.stdout:6/25: mkdir d0/d6 0
2026-03-10T07:50:38.213 INFO:tasks.workunit.client.0.vm05.stdout:8/36: dwrite d1/fa [0,4194304] 0
2026-03-10T07:50:38.213 INFO:tasks.workunit.client.0.vm05.stdout:5/25: dread d2/f4 [0,4194304] 0
2026-03-10T07:50:38.220 INFO:tasks.workunit.client.0.vm05.stdout:8/37: dwrite d1/fa [0,4194304] 0
2026-03-10T07:50:38.221 INFO:tasks.workunit.client.0.vm05.stdout:9/21: fdatasync f3 0
2026-03-10T07:50:38.222
INFO:tasks.workunit.client.0.vm05.stdout:8/38: read f0 [1359372,85043] 0 2026-03-10T07:50:38.224 INFO:tasks.workunit.client.0.vm05.stdout:7/30: truncate d1/fa 1715165 0 2026-03-10T07:50:38.225 INFO:tasks.workunit.client.0.vm05.stdout:7/31: write d1/d6/fb [846321,39789] 0 2026-03-10T07:50:38.236 INFO:tasks.workunit.client.0.vm05.stdout:3/44: creat d8/fd x:0 0 0 2026-03-10T07:50:38.237 INFO:tasks.workunit.client.0.vm05.stdout:7/32: dwrite d1/f2 [0,4194304] 0 2026-03-10T07:50:38.244 INFO:tasks.workunit.client.0.vm05.stdout:2/64: creat d0/d3/df/f14 x:0 0 0 2026-03-10T07:50:38.254 INFO:tasks.workunit.client.0.vm05.stdout:6/26: fdatasync d0/f3 0 2026-03-10T07:50:38.255 INFO:tasks.workunit.client.0.vm05.stdout:7/33: dwrite d1/f2 [0,4194304] 0 2026-03-10T07:50:38.257 INFO:tasks.workunit.client.0.vm05.stdout:5/26: creat d2/f8 x:0 0 0 2026-03-10T07:50:38.303 INFO:tasks.workunit.client.0.vm05.stdout:5/27: rename d2/f4 to d2/f9 0 2026-03-10T07:50:38.303 INFO:tasks.workunit.client.0.vm05.stdout:0/8: getdents . 0 2026-03-10T07:50:38.306 INFO:tasks.workunit.client.0.vm05.stdout:3/45: rename f0 to d8/fe 0 2026-03-10T07:50:38.310 INFO:tasks.workunit.client.0.vm05.stdout:1/36: getdents . 
0 2026-03-10T07:50:38.315 INFO:tasks.workunit.client.0.vm05.stdout:0/9: creat f4 x:0 0 0 2026-03-10T07:50:38.320 INFO:tasks.workunit.client.0.vm05.stdout:5/28: creat d2/d5/fa x:0 0 0 2026-03-10T07:50:38.321 INFO:tasks.workunit.client.0.vm05.stdout:5/29: fdatasync d2/f8 0 2026-03-10T07:50:38.322 INFO:tasks.workunit.client.0.vm05.stdout:3/46: creat d8/ff x:0 0 0 2026-03-10T07:50:38.327 INFO:tasks.workunit.client.0.vm05.stdout:3/47: dwrite d8/fd [0,4194304] 0 2026-03-10T07:50:38.330 INFO:tasks.workunit.client.0.vm05.stdout:5/30: dwrite d2/d5/f6 [0,4194304] 0 2026-03-10T07:50:38.338 INFO:tasks.workunit.client.0.vm05.stdout:0/10: creat f5 x:0 0 0 2026-03-10T07:50:38.342 INFO:tasks.workunit.client.0.vm05.stdout:1/37: rename c6 to c7 0 2026-03-10T07:50:38.345 INFO:tasks.workunit.client.0.vm05.stdout:5/31: creat d2/d5/fb x:0 0 0 2026-03-10T07:50:38.348 INFO:tasks.workunit.client.0.vm05.stdout:2/65: rename d0/d8/ld to d0/l15 0 2026-03-10T07:50:38.348 INFO:tasks.workunit.client.0.vm05.stdout:0/11: dread f2 [0,4194304] 0 2026-03-10T07:50:38.353 INFO:tasks.workunit.client.0.vm05.stdout:2/66: dread d0/d8/fe [0,4194304] 0 2026-03-10T07:50:38.355 INFO:tasks.workunit.client.0.vm05.stdout:5/32: symlink d2/d5/lc 0 2026-03-10T07:50:38.359 INFO:tasks.workunit.client.0.vm05.stdout:0/12: creat f6 x:0 0 0 2026-03-10T07:50:38.364 INFO:tasks.workunit.client.0.vm05.stdout:2/67: dwrite d0/f5 [0,4194304] 0 2026-03-10T07:50:38.366 INFO:tasks.workunit.client.0.vm05.stdout:5/33: mknod d2/d5/cd 0 2026-03-10T07:50:38.366 INFO:tasks.workunit.client.0.vm05.stdout:5/34: read - d2/f8 zero size 2026-03-10T07:50:38.367 INFO:tasks.workunit.client.0.vm05.stdout:5/35: readlink d2/d5/lc 0 2026-03-10T07:50:38.373 INFO:tasks.workunit.client.0.vm05.stdout:0/13: chown c0 5 1 2026-03-10T07:50:38.374 INFO:tasks.workunit.client.0.vm05.stdout:0/14: truncate f4 1044820 0 2026-03-10T07:50:38.377 INFO:tasks.workunit.client.0.vm05.stdout:5/36: dwrite d2/d5/fb [0,4194304] 0 2026-03-10T07:50:38.382 
INFO:tasks.workunit.client.0.vm05.stdout:5/37: readlink d2/d5/lc 0 2026-03-10T07:50:38.387 INFO:tasks.workunit.client.0.vm05.stdout:0/15: unlink f5 0 2026-03-10T07:50:38.393 INFO:tasks.workunit.client.0.vm05.stdout:0/16: symlink l7 0 2026-03-10T07:50:38.401 INFO:tasks.workunit.client.0.vm05.stdout:0/17: mkdir d8 0 2026-03-10T07:50:38.401 INFO:tasks.workunit.client.0.vm05.stdout:0/18: fdatasync f2 0 2026-03-10T07:50:38.401 INFO:tasks.workunit.client.0.vm05.stdout:0/19: fsync f2 0 2026-03-10T07:50:38.406 INFO:tasks.workunit.client.0.vm05.stdout:0/20: link f2 d8/f9 0 2026-03-10T07:50:38.410 INFO:tasks.workunit.client.0.vm05.stdout:0/21: dwrite f4 [0,4194304] 0 2026-03-10T07:50:38.430 INFO:tasks.workunit.client.0.vm05.stdout:0/22: dread f4 [0,4194304] 0 2026-03-10T07:50:38.434 INFO:tasks.workunit.client.0.vm05.stdout:0/23: write f6 [105314,97072] 0 2026-03-10T07:50:38.438 INFO:tasks.workunit.client.0.vm05.stdout:0/24: chown f6 13302871 1 2026-03-10T07:50:38.444 INFO:tasks.workunit.client.0.vm05.stdout:4/32: truncate d0/f7 692808 0 2026-03-10T07:50:38.445 INFO:tasks.workunit.client.0.vm05.stdout:0/25: creat d8/fa x:0 0 0 2026-03-10T07:50:38.456 INFO:tasks.workunit.client.0.vm05.stdout:0/26: rename f2 to d8/fb 0 2026-03-10T07:50:38.460 INFO:tasks.workunit.client.0.vm05.stdout:7/34: dread d1/fa [0,4194304] 0 2026-03-10T07:50:38.469 INFO:tasks.workunit.client.0.vm05.stdout:7/35: symlink d1/lf 0 2026-03-10T07:50:38.473 INFO:tasks.workunit.client.0.vm05.stdout:9/22: dwrite f4 [0,4194304] 0 2026-03-10T07:50:38.475 INFO:tasks.workunit.client.0.vm05.stdout:4/33: rename d0/d6/l8 to d0/la 0 2026-03-10T07:50:38.476 INFO:tasks.workunit.client.0.vm05.stdout:6/27: truncate d0/f3 3200082 0 2026-03-10T07:50:38.479 INFO:tasks.workunit.client.0.vm05.stdout:7/36: symlink d1/d6/l10 0 2026-03-10T07:50:38.483 INFO:tasks.workunit.client.0.vm05.stdout:8/39: truncate f0 5407038 0 2026-03-10T07:50:38.483 INFO:tasks.workunit.client.0.vm05.stdout:4/34: dwrite d0/f2 [0,4194304] 0 
2026-03-10T07:50:38.488 INFO:tasks.workunit.client.0.vm05.stdout:9/23: dread f7 [0,4194304] 0 2026-03-10T07:50:38.497 INFO:tasks.workunit.client.0.vm05.stdout:9/24: dwrite d8/f9 [0,4194304] 0 2026-03-10T07:50:38.502 INFO:tasks.workunit.client.0.vm05.stdout:9/25: read f7 [633535,36610] 0 2026-03-10T07:50:38.518 INFO:tasks.workunit.client.0.vm05.stdout:6/28: creat d0/f7 x:0 0 0 2026-03-10T07:50:38.518 INFO:tasks.workunit.client.0.vm05.stdout:6/29: fdatasync d0/f7 0 2026-03-10T07:50:38.518 INFO:tasks.workunit.client.0.vm05.stdout:3/48: getdents d8 0 2026-03-10T07:50:38.527 INFO:tasks.workunit.client.0.vm05.stdout:7/37: getdents d1/d6/dd 0 2026-03-10T07:50:38.530 INFO:tasks.workunit.client.0.vm05.stdout:2/68: readlink d0/l15 0 2026-03-10T07:50:38.535 INFO:tasks.workunit.client.0.vm05.stdout:1/38: write f4 [724945,43112] 0 2026-03-10T07:50:38.535 INFO:tasks.workunit.client.0.vm05.stdout:2/69: stat d0/f6 0 2026-03-10T07:50:38.535 INFO:tasks.workunit.client.0.vm05.stdout:3/49: dread - f3 zero size 2026-03-10T07:50:38.535 INFO:tasks.workunit.client.0.vm05.stdout:1/39: dread f4 [0,4194304] 0 2026-03-10T07:50:38.535 INFO:tasks.workunit.client.0.vm05.stdout:1/40: write f4 [1188409,39935] 0 2026-03-10T07:50:38.535 INFO:tasks.workunit.client.0.vm05.stdout:2/70: dwrite d0/f7 [0,4194304] 0 2026-03-10T07:50:38.542 INFO:tasks.workunit.client.0.vm05.stdout:5/38: getdents d2/d5 0 2026-03-10T07:50:38.546 INFO:tasks.workunit.client.0.vm05.stdout:3/50: write f4 [999622,113398] 0 2026-03-10T07:50:38.554 INFO:tasks.workunit.client.0.vm05.stdout:1/41: mknod c8 0 2026-03-10T07:50:38.554 INFO:tasks.workunit.client.0.vm05.stdout:1/42: chown f3 63684 1 2026-03-10T07:50:38.554 INFO:tasks.workunit.client.0.vm05.stdout:1/43: stat f4 0 2026-03-10T07:50:38.558 INFO:tasks.workunit.client.0.vm05.stdout:2/71: symlink d0/d3/df/l16 0 2026-03-10T07:50:38.560 INFO:tasks.workunit.client.0.vm05.stdout:5/39: mknod d2/ce 0 2026-03-10T07:50:38.564 INFO:tasks.workunit.client.0.vm05.stdout:5/40: dwrite d2/d5/f6 
[4194304,4194304] 0 2026-03-10T07:50:38.577 INFO:tasks.workunit.client.0.vm05.stdout:5/41: dwrite d2/d5/f6 [4194304,4194304] 0 2026-03-10T07:50:38.591 INFO:tasks.workunit.client.0.vm05.stdout:3/51: mknod d8/c10 0 2026-03-10T07:50:38.594 INFO:tasks.workunit.client.0.vm05.stdout:3/52: read - d8/fb zero size 2026-03-10T07:50:38.594 INFO:tasks.workunit.client.0.vm05.stdout:1/44: write f4 [1695231,8742] 0 2026-03-10T07:50:38.594 INFO:tasks.workunit.client.0.vm05.stdout:8/40: sync 2026-03-10T07:50:38.594 INFO:tasks.workunit.client.0.vm05.stdout:0/27: sync 2026-03-10T07:50:38.595 INFO:tasks.workunit.client.0.vm05.stdout:1/45: write f4 [2277451,57198] 0 2026-03-10T07:50:38.596 INFO:tasks.workunit.client.0.vm05.stdout:3/53: dwrite d8/ff [0,4194304] 0 2026-03-10T07:50:38.601 INFO:tasks.workunit.client.0.vm05.stdout:2/72: symlink d0/d3/df/l17 0 2026-03-10T07:50:38.601 INFO:tasks.workunit.client.0.vm05.stdout:2/73: chown d0/f7 128246 1 2026-03-10T07:50:38.603 INFO:tasks.workunit.client.0.vm05.stdout:7/38: rmdir d1/d6/dd 0 2026-03-10T07:50:38.604 INFO:tasks.workunit.client.0.vm05.stdout:7/39: chown d1/fa 1112531 1 2026-03-10T07:50:38.605 INFO:tasks.workunit.client.0.vm05.stdout:5/42: creat d2/ff x:0 0 0 2026-03-10T07:50:38.605 INFO:tasks.workunit.client.0.vm05.stdout:5/43: chown d2/ff 577543548 1 2026-03-10T07:50:38.606 INFO:tasks.workunit.client.0.vm05.stdout:5/44: truncate d2/f9 4698145 0 2026-03-10T07:50:38.606 INFO:tasks.workunit.client.0.vm05.stdout:5/45: write d2/d5/fa [1027956,110450] 0 2026-03-10T07:50:38.624 INFO:tasks.workunit.client.0.vm05.stdout:4/35: dread d0/f7 [0,4194304] 0 2026-03-10T07:50:38.632 INFO:tasks.workunit.client.0.vm05.stdout:2/74: creat d0/d3/df/f18 x:0 0 0 2026-03-10T07:50:38.632 INFO:tasks.workunit.client.0.vm05.stdout:0/28: creat d8/fc x:0 0 0 2026-03-10T07:50:38.632 INFO:tasks.workunit.client.0.vm05.stdout:2/75: rename d0/d3 to d0/d3/d19 22 2026-03-10T07:50:38.640 INFO:tasks.workunit.client.0.vm05.stdout:7/40: creat d1/f11 x:0 0 0 
2026-03-10T07:50:38.640 INFO:tasks.workunit.client.0.vm05.stdout:5/46: creat d2/d5/f10 x:0 0 0 2026-03-10T07:50:38.644 INFO:tasks.workunit.client.0.vm05.stdout:4/36: unlink d0/f7 0 2026-03-10T07:50:38.652 INFO:tasks.workunit.client.0.vm05.stdout:0/29: mkdir d8/dd 0 2026-03-10T07:50:38.652 INFO:tasks.workunit.client.0.vm05.stdout:2/76: rename d0/c12 to d0/d8/c1a 0 2026-03-10T07:50:38.652 INFO:tasks.workunit.client.0.vm05.stdout:4/37: rename d0/d6 to d0/d6/db 22 2026-03-10T07:50:38.652 INFO:tasks.workunit.client.0.vm05.stdout:2/77: dread - d0/fb zero size 2026-03-10T07:50:38.652 INFO:tasks.workunit.client.0.vm05.stdout:0/30: write d8/fa [232614,10756] 0 2026-03-10T07:50:38.654 INFO:tasks.workunit.client.0.vm05.stdout:7/41: creat d1/d6/f12 x:0 0 0 2026-03-10T07:50:38.655 INFO:tasks.workunit.client.0.vm05.stdout:7/42: read d1/f2 [3260444,122517] 0 2026-03-10T07:50:38.656 INFO:tasks.workunit.client.0.vm05.stdout:5/47: dwrite d2/d5/fb [0,4194304] 0 2026-03-10T07:50:38.658 INFO:tasks.workunit.client.0.vm05.stdout:5/48: readlink d2/d5/lc 0 2026-03-10T07:50:38.658 INFO:tasks.workunit.client.0.vm05.stdout:7/43: chown d1/f2 241 1 2026-03-10T07:50:38.659 INFO:tasks.workunit.client.0.vm05.stdout:7/44: chown d1/d6/fb 267 1 2026-03-10T07:50:38.659 INFO:tasks.workunit.client.0.vm05.stdout:0/31: dwrite d8/fa [0,4194304] 0 2026-03-10T07:50:38.661 INFO:tasks.workunit.client.0.vm05.stdout:5/49: unlink d2/d5/lc 0 2026-03-10T07:50:38.661 INFO:tasks.workunit.client.0.vm05.stdout:7/45: readlink d1/d6/l10 0 2026-03-10T07:50:38.667 INFO:tasks.workunit.client.0.vm05.stdout:4/38: sync 2026-03-10T07:50:38.669 INFO:tasks.workunit.client.0.vm05.stdout:2/78: mknod d0/c1b 0 2026-03-10T07:50:38.674 INFO:tasks.workunit.client.0.vm05.stdout:5/50: rename d2/d5/cd to d2/c11 0 2026-03-10T07:50:38.684 INFO:tasks.workunit.client.0.vm05.stdout:2/79: creat d0/d8/f1c x:0 0 0 2026-03-10T07:50:38.684 INFO:tasks.workunit.client.0.vm05.stdout:1/46: dread f4 [0,4194304] 0 2026-03-10T07:50:38.691 
INFO:tasks.workunit.client.0.vm05.stdout:4/39: dwrite d0/f2 [0,4194304] 0 2026-03-10T07:50:38.704 INFO:tasks.workunit.client.0.vm05.stdout:5/51: mkdir d2/d12 0 2026-03-10T07:50:38.704 INFO:tasks.workunit.client.0.vm05.stdout:1/47: mknod c9 0 2026-03-10T07:50:38.704 INFO:tasks.workunit.client.0.vm05.stdout:2/80: creat d0/d3/f1d x:0 0 0 2026-03-10T07:50:38.705 INFO:tasks.workunit.client.0.vm05.stdout:2/81: chown d0/d3/f1d 115 1 2026-03-10T07:50:38.706 INFO:tasks.workunit.client.0.vm05.stdout:2/82: write d0/d8/fc [2248669,25287] 0 2026-03-10T07:50:38.710 INFO:tasks.workunit.client.0.vm05.stdout:5/52: dread d2/f9 [0,4194304] 0 2026-03-10T07:50:38.716 INFO:tasks.workunit.client.0.vm05.stdout:2/83: unlink d0/d3/df/f14 0 2026-03-10T07:50:38.726 INFO:tasks.workunit.client.0.vm05.stdout:5/53: mknod d2/c13 0 2026-03-10T07:50:38.727 INFO:tasks.workunit.client.0.vm05.stdout:2/84: stat d0/d3 0 2026-03-10T07:50:38.727 INFO:tasks.workunit.client.0.vm05.stdout:5/54: fsync d2/f8 0 2026-03-10T07:50:38.727 INFO:tasks.workunit.client.0.vm05.stdout:1/48: dwrite f4 [0,4194304] 0 2026-03-10T07:50:38.727 INFO:tasks.workunit.client.0.vm05.stdout:1/49: chown c8 798 1 2026-03-10T07:50:38.732 INFO:tasks.workunit.client.0.vm05.stdout:1/50: dread f3 [0,4194304] 0 2026-03-10T07:50:38.732 INFO:tasks.workunit.client.0.vm05.stdout:1/51: stat c8 0 2026-03-10T07:50:38.735 INFO:tasks.workunit.client.0.vm05.stdout:4/40: link d0/la d0/d6/lc 0 2026-03-10T07:50:38.743 INFO:tasks.workunit.client.0.vm05.stdout:5/55: creat d2/d12/f14 x:0 0 0 2026-03-10T07:50:38.746 INFO:tasks.workunit.client.0.vm05.stdout:4/41: symlink d0/d6/ld 0 2026-03-10T07:50:38.749 INFO:tasks.workunit.client.0.vm05.stdout:1/52: unlink f3 0 2026-03-10T07:50:38.751 INFO:tasks.workunit.client.0.vm05.stdout:2/85: creat d0/f1e x:0 0 0 2026-03-10T07:50:38.758 INFO:tasks.workunit.client.0.vm05.stdout:2/86: fsync d0/f4 0 2026-03-10T07:50:38.758 INFO:tasks.workunit.client.0.vm05.stdout:9/26: dwrite f7 [0,4194304] 0 2026-03-10T07:50:38.764 
INFO:tasks.workunit.client.0.vm05.stdout:2/87: creat d0/d3/f1f x:0 0 0 2026-03-10T07:50:38.769 INFO:tasks.workunit.client.0.vm05.stdout:2/88: creat d0/d3/df/f20 x:0 0 0 2026-03-10T07:50:38.773 INFO:tasks.workunit.client.0.vm05.stdout:2/89: rename d0/fb to d0/d3/df/f21 0 2026-03-10T07:50:38.783 INFO:tasks.workunit.client.0.vm05.stdout:2/90: creat d0/f22 x:0 0 0 2026-03-10T07:50:38.788 INFO:tasks.workunit.client.0.vm05.stdout:4/42: fdatasync d0/f2 0 2026-03-10T07:50:38.799 INFO:tasks.workunit.client.0.vm05.stdout:2/91: creat d0/d8/f23 x:0 0 0 2026-03-10T07:50:38.800 INFO:tasks.workunit.client.0.vm05.stdout:2/92: chown d0/d3/df/f18 3879 1 2026-03-10T07:50:38.801 INFO:tasks.workunit.client.0.vm05.stdout:2/93: write d0/d8/f1c [688673,119241] 0 2026-03-10T07:50:38.803 INFO:tasks.workunit.client.0.vm05.stdout:2/94: read - d0/d8/f23 zero size 2026-03-10T07:50:38.807 INFO:tasks.workunit.client.0.vm05.stdout:4/43: symlink d0/le 0 2026-03-10T07:50:38.813 INFO:tasks.workunit.client.0.vm05.stdout:2/95: dwrite d0/d3/f1f [0,4194304] 0 2026-03-10T07:50:38.823 INFO:tasks.workunit.client.0.vm05.stdout:3/54: dwrite d8/fe [0,4194304] 0 2026-03-10T07:50:38.831 INFO:tasks.workunit.client.0.vm05.stdout:3/55: truncate d8/fc 174142 0 2026-03-10T07:50:38.835 INFO:tasks.workunit.client.0.vm05.stdout:3/56: write d8/ff [1146990,94640] 0 2026-03-10T07:50:38.843 INFO:tasks.workunit.client.0.vm05.stdout:2/96: symlink d0/d8/l24 0 2026-03-10T07:50:38.844 INFO:tasks.workunit.client.0.vm05.stdout:2/97: truncate d0/d3/df/f20 563891 0 2026-03-10T07:50:38.846 INFO:tasks.workunit.client.0.vm05.stdout:2/98: chown d0/f22 509441 1 2026-03-10T07:50:38.853 INFO:tasks.workunit.client.0.vm05.stdout:4/44: mkdir d0/df 0 2026-03-10T07:50:38.856 INFO:tasks.workunit.client.0.vm05.stdout:6/30: write d0/f3 [2825998,43796] 0 2026-03-10T07:50:38.857 INFO:tasks.workunit.client.0.vm05.stdout:3/57: write f6 [494101,60886] 0 2026-03-10T07:50:38.862 INFO:tasks.workunit.client.0.vm05.stdout:3/58: mknod d8/c11 0 
2026-03-10T07:50:38.867 INFO:tasks.workunit.client.0.vm05.stdout:6/31: stat d0/l5 0 2026-03-10T07:50:38.867 INFO:tasks.workunit.client.0.vm05.stdout:0/32: write d8/fb [54868,43012] 0 2026-03-10T07:50:38.867 INFO:tasks.workunit.client.0.vm05.stdout:3/59: write f4 [3528151,128711] 0 2026-03-10T07:50:38.867 INFO:tasks.workunit.client.0.vm05.stdout:4/45: chown d0/la 2 1 2026-03-10T07:50:38.868 INFO:tasks.workunit.client.0.vm05.stdout:3/60: readlink l1 0 2026-03-10T07:50:38.868 INFO:tasks.workunit.client.0.vm05.stdout:0/33: chown f4 275642 1 2026-03-10T07:50:38.869 INFO:tasks.workunit.client.0.vm05.stdout:2/99: sync 2026-03-10T07:50:38.869 INFO:tasks.workunit.client.0.vm05.stdout:4/46: read d0/f2 [1592277,121601] 0 2026-03-10T07:50:38.872 INFO:tasks.workunit.client.0.vm05.stdout:7/46: dwrite d1/fa [0,4194304] 0 2026-03-10T07:50:38.875 INFO:tasks.workunit.client.0.vm05.stdout:6/32: rename d0/f3 to d0/f8 0 2026-03-10T07:50:38.881 INFO:tasks.workunit.client.0.vm05.stdout:3/61: creat d8/f12 x:0 0 0 2026-03-10T07:50:38.882 INFO:tasks.workunit.client.0.vm05.stdout:5/56: rmdir d2/d12 39 2026-03-10T07:50:38.882 INFO:tasks.workunit.client.0.vm05.stdout:3/62: dread - d8/f12 zero size 2026-03-10T07:50:38.886 INFO:tasks.workunit.client.0.vm05.stdout:0/34: dwrite d8/fc [0,4194304] 0 2026-03-10T07:50:38.893 INFO:tasks.workunit.client.0.vm05.stdout:4/47: rename d0/df to d0/d6/d10 0 2026-03-10T07:50:38.893 INFO:tasks.workunit.client.0.vm05.stdout:8/41: dwrite f0 [0,4194304] 0 2026-03-10T07:50:38.897 INFO:tasks.workunit.client.0.vm05.stdout:1/53: truncate f4 3602890 0 2026-03-10T07:50:38.901 INFO:tasks.workunit.client.0.vm05.stdout:9/27: truncate f1 3828667 0 2026-03-10T07:50:38.903 INFO:tasks.workunit.client.0.vm05.stdout:2/100: mkdir d0/d3/df/d25 0 2026-03-10T07:50:38.904 INFO:tasks.workunit.client.0.vm05.stdout:6/33: mknod d0/c9 0 2026-03-10T07:50:38.912 INFO:tasks.workunit.client.0.vm05.stdout:1/54: sync 2026-03-10T07:50:38.912 INFO:tasks.workunit.client.0.vm05.stdout:4/48: dwrite 
d0/f2 [0,4194304] 0 2026-03-10T07:50:38.914 INFO:tasks.workunit.client.0.vm05.stdout:7/47: truncate d1/f2 2648449 0 2026-03-10T07:50:38.914 INFO:tasks.workunit.client.0.vm05.stdout:6/34: truncate d0/f7 197377 0 2026-03-10T07:50:38.917 INFO:tasks.workunit.client.0.vm05.stdout:3/63: creat d8/f13 x:0 0 0 2026-03-10T07:50:38.918 INFO:tasks.workunit.client.0.vm05.stdout:3/64: dread - d8/fb zero size 2026-03-10T07:50:38.918 INFO:tasks.workunit.client.0.vm05.stdout:5/57: write d2/f9 [1632280,25009] 0 2026-03-10T07:50:38.918 INFO:tasks.workunit.client.0.vm05.stdout:8/42: creat d1/fe x:0 0 0 2026-03-10T07:50:38.919 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:50:38 vm08.local ceph-mon[59917]: pgmap v8: 65 pgs: 65 active+clean; 189 MiB data, 1.1 GiB used, 119 GiB / 120 GiB avail; 20 KiB/s rd, 1.9 MiB/s wr, 157 op/s 2026-03-10T07:50:38.919 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:50:38 vm08.local ceph-mon[59917]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' 2026-03-10T07:50:38.919 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:50:38 vm08.local ceph-mon[59917]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' 2026-03-10T07:50:38.919 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:50:38 vm08.local ceph-mon[59917]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T07:50:38.923 INFO:tasks.workunit.client.0.vm05.stdout:3/65: write d8/f12 [968445,106974] 0 2026-03-10T07:50:38.923 INFO:tasks.workunit.client.0.vm05.stdout:5/58: truncate d2/d5/f10 530068 0 2026-03-10T07:50:38.924 INFO:tasks.workunit.client.0.vm05.stdout:5/59: write d2/d5/fa [2006673,68951] 0 2026-03-10T07:50:38.925 INFO:tasks.workunit.client.0.vm05.stdout:3/66: write d8/ff [1954830,81793] 0 2026-03-10T07:50:38.929 INFO:tasks.workunit.client.0.vm05.stdout:2/101: unlink d0/d8/f23 0 2026-03-10T07:50:38.932 INFO:tasks.workunit.client.0.vm05.stdout:3/67: 
chown d8/c11 6402 1 2026-03-10T07:50:38.933 INFO:tasks.workunit.client.0.vm05.stdout:3/68: fsync d8/fd 0 2026-03-10T07:50:38.936 INFO:tasks.workunit.client.0.vm05.stdout:0/35: mkdir d8/dd/de 0 2026-03-10T07:50:38.937 INFO:tasks.workunit.client.0.vm05.stdout:7/48: dread d1/d6/fb [0,4194304] 0 2026-03-10T07:50:38.942 INFO:tasks.workunit.client.0.vm05.stdout:8/43: symlink d1/lf 0 2026-03-10T07:50:38.956 INFO:tasks.workunit.client.0.vm05.stdout:6/35: rename d0/f7 to d0/fa 0 2026-03-10T07:50:38.956 INFO:tasks.workunit.client.0.vm05.stdout:4/49: mknod d0/c11 0 2026-03-10T07:50:38.957 INFO:tasks.workunit.client.0.vm05.stdout:6/36: truncate d0/f8 3919133 0 2026-03-10T07:50:38.957 INFO:tasks.workunit.client.0.vm05.stdout:4/50: read d0/f2 [235253,6375] 0 2026-03-10T07:50:38.957 INFO:tasks.workunit.client.0.vm05.stdout:4/51: chown d0/c5 126873 1 2026-03-10T07:50:38.958 INFO:tasks.workunit.client.0.vm05.stdout:4/52: write d0/f2 [4304752,45674] 0 2026-03-10T07:50:38.962 INFO:tasks.workunit.client.0.vm05.stdout:0/36: creat d8/ff x:0 0 0 2026-03-10T07:50:38.966 INFO:tasks.workunit.client.0.vm05.stdout:7/49: symlink d1/l13 0 2026-03-10T07:50:38.971 INFO:tasks.workunit.client.0.vm05.stdout:7/50: truncate d1/f11 84355 0 2026-03-10T07:50:38.971 INFO:tasks.workunit.client.0.vm05.stdout:0/37: dread d8/fa [0,4194304] 0 2026-03-10T07:50:38.971 INFO:tasks.workunit.client.0.vm05.stdout:8/44: creat d1/dd/f10 x:0 0 0 2026-03-10T07:50:38.971 INFO:tasks.workunit.client.0.vm05.stdout:0/38: write f6 [218295,23963] 0 2026-03-10T07:50:38.971 INFO:tasks.workunit.client.0.vm05.stdout:8/45: readlink d1/l2 0 2026-03-10T07:50:38.972 INFO:tasks.workunit.client.0.vm05.stdout:7/51: dread d1/fa [0,4194304] 0 2026-03-10T07:50:38.972 INFO:tasks.workunit.client.0.vm05.stdout:0/39: stat d8/fb 0 2026-03-10T07:50:38.973 INFO:tasks.workunit.client.0.vm05.stdout:7/52: write d1/fa [1999913,104079] 0 2026-03-10T07:50:38.979 INFO:tasks.workunit.client.0.vm05.stdout:6/37: write d0/fa [1056375,86057] 0 
2026-03-10T07:50:38.991 INFO:tasks.workunit.client.0.vm05.stdout:8/46: unlink d1/dd/cc 0 2026-03-10T07:50:38.991 INFO:tasks.workunit.client.0.vm05.stdout:8/47: write d1/dd/f10 [287561,36124] 0 2026-03-10T07:50:38.992 INFO:tasks.workunit.client.0.vm05.stdout:5/60: creat d2/f15 x:0 0 0 2026-03-10T07:50:38.994 INFO:tasks.workunit.client.0.vm05.stdout:0/40: mkdir d8/dd/d10 0 2026-03-10T07:50:39.000 INFO:tasks.workunit.client.0.vm05.stdout:9/28: rename f1 to d8/fa 0 2026-03-10T07:50:39.003 INFO:tasks.workunit.client.0.vm05.stdout:1/55: dwrite f4 [0,4194304] 0 2026-03-10T07:50:39.005 INFO:tasks.workunit.client.0.vm05.stdout:1/56: write f4 [4226413,97149] 0 2026-03-10T07:50:39.014 INFO:tasks.workunit.client.0.vm05.stdout:4/53: mkdir d0/d6/d9/d12 0 2026-03-10T07:50:39.019 INFO:tasks.workunit.client.0.vm05.stdout:7/53: rmdir d1 39 2026-03-10T07:50:39.021 INFO:tasks.workunit.client.0.vm05.stdout:3/69: truncate d8/f12 478995 0 2026-03-10T07:50:39.022 INFO:tasks.workunit.client.0.vm05.stdout:3/70: truncate d8/fc 219007 0 2026-03-10T07:50:39.022 INFO:tasks.workunit.client.0.vm05.stdout:6/38: dwrite d0/f8 [0,4194304] 0 2026-03-10T07:50:39.028 INFO:tasks.workunit.client.0.vm05.stdout:3/71: dwrite d8/ff [0,4194304] 0 2026-03-10T07:50:39.031 INFO:tasks.workunit.client.0.vm05.stdout:3/72: write d8/fd [1192808,5474] 0 2026-03-10T07:50:39.031 INFO:tasks.workunit.client.0.vm05.stdout:3/73: chown d8/c10 30655 1 2026-03-10T07:50:39.035 INFO:tasks.workunit.client.0.vm05.stdout:1/57: sync 2026-03-10T07:50:39.036 INFO:tasks.workunit.client.0.vm05.stdout:1/58: write f4 [841326,23938] 0 2026-03-10T07:50:39.037 INFO:tasks.workunit.client.0.vm05.stdout:1/59: write f4 [970216,7352] 0 2026-03-10T07:50:39.038 INFO:tasks.workunit.client.0.vm05.stdout:3/74: dwrite d8/ff [0,4194304] 0 2026-03-10T07:50:39.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:50:38 vm05.local ceph-mon[50387]: pgmap v8: 65 pgs: 65 active+clean; 189 MiB data, 1.1 GiB used, 119 GiB / 120 GiB avail; 20 KiB/s rd, 1.9 
MiB/s wr, 157 op/s 2026-03-10T07:50:39.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:50:38 vm05.local ceph-mon[50387]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' 2026-03-10T07:50:39.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:50:38 vm05.local ceph-mon[50387]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' 2026-03-10T07:50:39.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:50:38 vm05.local ceph-mon[50387]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T07:50:39.176 INFO:tasks.workunit.client.0.vm05.stdout:8/48: creat d1/dd/f11 x:0 0 0 2026-03-10T07:50:39.176 INFO:tasks.workunit.client.0.vm05.stdout:5/61: chown l1 3397934 1 2026-03-10T07:50:39.176 INFO:tasks.workunit.client.0.vm05.stdout:8/49: fsync d1/fe 0 2026-03-10T07:50:39.176 INFO:tasks.workunit.client.0.vm05.stdout:5/62: chown d2 69313193 1 2026-03-10T07:50:39.176 INFO:tasks.workunit.client.0.vm05.stdout:2/102: getdents d0/d8 0 2026-03-10T07:50:39.178 INFO:tasks.workunit.client.0.vm05.stdout:8/50: write d1/dd/f10 [1338373,125606] 0 2026-03-10T07:50:39.179 INFO:tasks.workunit.client.0.vm05.stdout:8/51: truncate d1/dd/f10 1538720 0 2026-03-10T07:50:39.182 INFO:tasks.workunit.client.0.vm05.stdout:8/52: write d1/fa [1557811,73012] 0 2026-03-10T07:50:39.183 INFO:tasks.workunit.client.0.vm05.stdout:6/39: rmdir d0 39 2026-03-10T07:50:39.183 INFO:tasks.workunit.client.0.vm05.stdout:8/53: stat d1/dd 0 2026-03-10T07:50:39.184 INFO:tasks.workunit.client.0.vm05.stdout:2/103: dwrite d0/d8/f1c [0,4194304] 0 2026-03-10T07:50:39.189 INFO:tasks.workunit.client.0.vm05.stdout:1/60: mkdir da 0 2026-03-10T07:50:39.195 INFO:tasks.workunit.client.0.vm05.stdout:1/61: dread f4 [0,4194304] 0 2026-03-10T07:50:39.195 INFO:tasks.workunit.client.0.vm05.stdout:1/62: fsync f4 0 2026-03-10T07:50:39.200 INFO:tasks.workunit.client.0.vm05.stdout:4/54: mknod 
d0/d6/d9/d12/c13 0 2026-03-10T07:50:39.200 INFO:tasks.workunit.client.0.vm05.stdout:7/54: symlink d1/l14 0 2026-03-10T07:50:39.200 INFO:tasks.workunit.client.0.vm05.stdout:7/55: read d1/f11 [41531,6724] 0 2026-03-10T07:50:39.201 INFO:tasks.workunit.client.0.vm05.stdout:7/56: fdatasync d1/d6/f9 0 2026-03-10T07:50:39.201 INFO:tasks.workunit.client.0.vm05.stdout:7/57: dread - d1/d6/f9 zero size 2026-03-10T07:50:39.208 INFO:tasks.workunit.client.0.vm05.stdout:5/63: creat d2/d12/f16 x:0 0 0 2026-03-10T07:50:39.213 INFO:tasks.workunit.client.0.vm05.stdout:7/58: rename d1/fa to d1/d6/f15 0 2026-03-10T07:50:39.213 INFO:tasks.workunit.client.0.vm05.stdout:5/64: mknod d2/d5/c17 0 2026-03-10T07:50:39.214 INFO:tasks.workunit.client.0.vm05.stdout:1/63: getdents da 0 2026-03-10T07:50:39.219 INFO:tasks.workunit.client.0.vm05.stdout:2/104: dwrite d0/f6 [0,4194304] 0 2026-03-10T07:50:39.219 INFO:tasks.workunit.client.0.vm05.stdout:5/65: write d2/d12/f16 [460877,2378] 0 2026-03-10T07:50:39.224 INFO:tasks.workunit.client.0.vm05.stdout:2/105: chown d0/d3/df/c13 3170 1 2026-03-10T07:50:39.224 INFO:tasks.workunit.client.0.vm05.stdout:2/106: write d0/f4 [1628473,112603] 0 2026-03-10T07:50:39.226 INFO:tasks.workunit.client.0.vm05.stdout:2/107: chown d0/d8/l24 1146331 1 2026-03-10T07:50:39.227 INFO:tasks.workunit.client.0.vm05.stdout:2/108: stat d0/d3/ca 0 2026-03-10T07:50:39.227 INFO:tasks.workunit.client.0.vm05.stdout:2/109: write d0/f7 [2344900,72953] 0 2026-03-10T07:50:39.228 INFO:tasks.workunit.client.0.vm05.stdout:7/59: dread d1/d6/f15 [0,4194304] 0 2026-03-10T07:50:39.231 INFO:tasks.workunit.client.0.vm05.stdout:1/64: creat da/fb x:0 0 0 2026-03-10T07:50:39.232 INFO:tasks.workunit.client.0.vm05.stdout:1/65: dread - da/fb zero size 2026-03-10T07:50:39.237 INFO:tasks.workunit.client.0.vm05.stdout:5/66: dread d2/d12/f16 [0,4194304] 0 2026-03-10T07:50:39.238 INFO:tasks.workunit.client.0.vm05.stdout:7/60: dwrite d1/d6/f9 [0,4194304] 0 2026-03-10T07:50:39.243 
INFO:tasks.workunit.client.0.vm05.stdout:5/67: fdatasync d2/f8 0 2026-03-10T07:50:39.243 INFO:tasks.workunit.client.0.vm05.stdout:7/61: stat d1/lf 0 2026-03-10T07:50:39.243 INFO:tasks.workunit.client.0.vm05.stdout:7/62: read d1/d6/f9 [904838,47207] 0 2026-03-10T07:50:39.258 INFO:tasks.workunit.client.0.vm05.stdout:1/66: dread f4 [4194304,4194304] 0 2026-03-10T07:50:39.268 INFO:tasks.workunit.client.0.vm05.stdout:2/110: dwrite d0/f1 [0,4194304] 0 2026-03-10T07:50:39.279 INFO:tasks.workunit.client.0.vm05.stdout:2/111: write d0/d3/f1d [932023,5123] 0 2026-03-10T07:50:39.279 INFO:tasks.workunit.client.0.vm05.stdout:1/67: dwrite f4 [4194304,4194304] 0 2026-03-10T07:50:39.301 INFO:tasks.workunit.client.0.vm05.stdout:1/68: creat da/fc x:0 0 0 2026-03-10T07:50:39.308 INFO:tasks.workunit.client.0.vm05.stdout:1/69: mkdir da/dd 0 2026-03-10T07:50:39.312 INFO:tasks.workunit.client.0.vm05.stdout:2/112: getdents d0/d3/df 0 2026-03-10T07:50:39.313 INFO:tasks.workunit.client.0.vm05.stdout:3/75: dread d8/f12 [0,4194304] 0 2026-03-10T07:50:39.315 INFO:tasks.workunit.client.0.vm05.stdout:2/113: unlink d0/d8/c9 0 2026-03-10T07:50:39.324 INFO:tasks.workunit.client.0.vm05.stdout:1/70: mknod da/dd/ce 0 2026-03-10T07:50:39.324 INFO:tasks.workunit.client.0.vm05.stdout:1/71: dread - da/fb zero size 2026-03-10T07:50:39.326 INFO:tasks.workunit.client.0.vm05.stdout:3/76: symlink d8/l14 0 2026-03-10T07:50:39.326 INFO:tasks.workunit.client.0.vm05.stdout:3/77: readlink l1 0 2026-03-10T07:50:39.329 INFO:tasks.workunit.client.0.vm05.stdout:7/63: fsync d1/d6/f9 0 2026-03-10T07:50:39.333 INFO:tasks.workunit.client.0.vm05.stdout:3/78: dread f4 [0,4194304] 0 2026-03-10T07:50:39.333 INFO:tasks.workunit.client.0.vm05.stdout:1/72: dwrite da/fb [0,4194304] 0 2026-03-10T07:50:39.336 INFO:tasks.workunit.client.0.vm05.stdout:1/73: fdatasync f4 0 2026-03-10T07:50:39.342 INFO:tasks.workunit.client.0.vm05.stdout:2/114: symlink d0/d3/df/d25/l26 0 2026-03-10T07:50:39.343 
INFO:tasks.workunit.client.0.vm05.stdout:1/74: write da/fc [450275,46964] 0 2026-03-10T07:50:39.343 INFO:tasks.workunit.client.0.vm05.stdout:2/115: stat d0/d8/c1a 0 2026-03-10T07:50:39.344 INFO:tasks.workunit.client.0.vm05.stdout:2/116: truncate d0/d3/df/f21 848258 0 2026-03-10T07:50:39.347 INFO:tasks.workunit.client.0.vm05.stdout:3/79: dread f4 [0,4194304] 0 2026-03-10T07:50:39.352 INFO:tasks.workunit.client.0.vm05.stdout:8/54: rmdir d1/dd 39 2026-03-10T07:50:39.356 INFO:tasks.workunit.client.0.vm05.stdout:2/117: unlink d0/d8/fc 0 2026-03-10T07:50:39.356 INFO:tasks.workunit.client.0.vm05.stdout:2/118: dread - d0/d3/df/f18 zero size 2026-03-10T07:50:39.357 INFO:tasks.workunit.client.0.vm05.stdout:3/80: mknod d8/c15 0 2026-03-10T07:50:39.357 INFO:tasks.workunit.client.0.vm05.stdout:2/119: dread - d0/f22 zero size 2026-03-10T07:50:39.358 INFO:tasks.workunit.client.0.vm05.stdout:3/81: truncate d8/f13 856894 0 2026-03-10T07:50:39.359 INFO:tasks.workunit.client.0.vm05.stdout:1/75: fdatasync da/fb 0 2026-03-10T07:50:39.362 INFO:tasks.workunit.client.0.vm05.stdout:1/76: write da/fc [310663,118727] 0 2026-03-10T07:50:39.365 INFO:tasks.workunit.client.0.vm05.stdout:0/41: truncate f4 3458243 0 2026-03-10T07:50:39.365 INFO:tasks.workunit.client.0.vm05.stdout:0/42: fsync d8/fc 0 2026-03-10T07:50:39.366 INFO:tasks.workunit.client.0.vm05.stdout:9/29: truncate f6 652005 0 2026-03-10T07:50:39.368 INFO:tasks.workunit.client.0.vm05.stdout:1/77: write da/fb [1948774,28955] 0 2026-03-10T07:50:39.368 INFO:tasks.workunit.client.0.vm05.stdout:4/55: rmdir d0 39 2026-03-10T07:50:39.373 INFO:tasks.workunit.client.0.vm05.stdout:2/120: dread d0/d8/f1c [0,4194304] 0 2026-03-10T07:50:39.374 INFO:tasks.workunit.client.0.vm05.stdout:8/55: symlink d1/dd/l12 0 2026-03-10T07:50:39.374 INFO:tasks.workunit.client.0.vm05.stdout:5/68: getdents d2/d12 0 2026-03-10T07:50:39.375 INFO:tasks.workunit.client.0.vm05.stdout:7/64: truncate d1/f2 721160 0 2026-03-10T07:50:39.375 
INFO:tasks.workunit.client.0.vm05.stdout:5/69: write d2/d5/f6 [7501415,63375] 0 2026-03-10T07:50:39.377 INFO:tasks.workunit.client.0.vm05.stdout:5/70: readlink l1 0 2026-03-10T07:50:39.382 INFO:tasks.workunit.client.0.vm05.stdout:1/78: dread da/fc [0,4194304] 0 2026-03-10T07:50:39.382 INFO:tasks.workunit.client.0.vm05.stdout:1/79: readlink - no filename 2026-03-10T07:50:39.384 INFO:tasks.workunit.client.0.vm05.stdout:5/71: dwrite d2/ff [0,4194304] 0 2026-03-10T07:50:39.387 INFO:tasks.workunit.client.0.vm05.stdout:6/40: truncate d0/f8 2551294 0 2026-03-10T07:50:39.391 INFO:tasks.workunit.client.0.vm05.stdout:4/56: read d0/f2 [1545266,104050] 0 2026-03-10T07:50:39.400 INFO:tasks.workunit.client.0.vm05.stdout:2/121: mknod d0/d3/c27 0 2026-03-10T07:50:39.412 INFO:tasks.workunit.client.0.vm05.stdout:2/122: fsync d0/f6 0 2026-03-10T07:50:39.414 INFO:tasks.workunit.client.0.vm05.stdout:7/65: dread d1/f11 [0,4194304] 0 2026-03-10T07:50:39.416 INFO:tasks.workunit.client.0.vm05.stdout:7/66: readlink d1/l13 0 2026-03-10T07:50:39.423 INFO:tasks.workunit.client.0.vm05.stdout:5/72: creat d2/d5/f18 x:0 0 0 2026-03-10T07:50:39.432 INFO:tasks.workunit.client.0.vm05.stdout:4/57: dwrite d0/f2 [0,4194304] 0 2026-03-10T07:50:39.434 INFO:tasks.workunit.client.0.vm05.stdout:4/58: write d0/f2 [4246508,22231] 0 2026-03-10T07:50:39.436 INFO:tasks.workunit.client.0.vm05.stdout:5/73: symlink d2/d12/l19 0 2026-03-10T07:50:39.438 INFO:tasks.workunit.client.0.vm05.stdout:4/59: write d0/f2 [3773209,103869] 0 2026-03-10T07:50:39.443 INFO:tasks.workunit.client.0.vm05.stdout:1/80: rmdir da 39 2026-03-10T07:50:39.443 INFO:tasks.workunit.client.0.vm05.stdout:2/123: symlink d0/l28 0 2026-03-10T07:50:39.444 INFO:tasks.workunit.client.0.vm05.stdout:2/124: dread - d0/f22 zero size 2026-03-10T07:50:39.444 INFO:tasks.workunit.client.0.vm05.stdout:2/125: fdatasync d0/f1 0 2026-03-10T07:50:39.446 INFO:tasks.workunit.client.0.vm05.stdout:2/126: chown d0/d3/df/d25/l26 10299844 1 2026-03-10T07:50:39.455 
INFO:tasks.workunit.client.0.vm05.stdout:6/41: link d0/c9 d0/cb 0 2026-03-10T07:50:39.460 INFO:tasks.workunit.client.0.vm05.stdout:8/56: link d1/c8 d1/c13 0 2026-03-10T07:50:39.468 INFO:tasks.workunit.client.0.vm05.stdout:2/127: creat d0/d3/df/d25/f29 x:0 0 0 2026-03-10T07:50:39.468 INFO:tasks.workunit.client.0.vm05.stdout:9/30: dread f7 [0,4194304] 0 2026-03-10T07:50:39.470 INFO:tasks.workunit.client.0.vm05.stdout:9/31: write f3 [934348,23390] 0 2026-03-10T07:50:39.472 INFO:tasks.workunit.client.0.vm05.stdout:9/32: chown c5 24660 1 2026-03-10T07:50:39.477 INFO:tasks.workunit.client.0.vm05.stdout:4/60: getdents d0/d6/d10 0 2026-03-10T07:50:39.478 INFO:tasks.workunit.client.0.vm05.stdout:2/128: dread d0/f5 [0,4194304] 0 2026-03-10T07:50:39.479 INFO:tasks.workunit.client.0.vm05.stdout:2/129: write d0/d3/df/d25/f29 [614195,35143] 0 2026-03-10T07:50:39.486 INFO:tasks.workunit.client.0.vm05.stdout:7/67: fsync d1/f2 0 2026-03-10T07:50:39.495 INFO:tasks.workunit.client.0.vm05.stdout:9/33: symlink d8/lb 0 2026-03-10T07:50:39.496 INFO:tasks.workunit.client.0.vm05.stdout:4/61: rmdir d0/d6/d9/d12 39 2026-03-10T07:50:39.496 INFO:tasks.workunit.client.0.vm05.stdout:9/34: chown d8 20946398 1 2026-03-10T07:50:39.501 INFO:tasks.workunit.client.0.vm05.stdout:3/82: truncate d8/fd 1161847 0 2026-03-10T07:50:39.508 INFO:tasks.workunit.client.0.vm05.stdout:7/68: rename d1/d6/f12 to d1/f16 0 2026-03-10T07:50:39.513 INFO:tasks.workunit.client.0.vm05.stdout:2/130: mkdir d0/d2a 0 2026-03-10T07:50:39.515 INFO:tasks.workunit.client.0.vm05.stdout:5/74: rmdir d2/d5 39 2026-03-10T07:50:39.516 INFO:tasks.workunit.client.0.vm05.stdout:1/81: truncate da/fc 653768 0 2026-03-10T07:50:39.517 INFO:tasks.workunit.client.0.vm05.stdout:5/75: chown d2/c13 69279 1 2026-03-10T07:50:39.517 INFO:tasks.workunit.client.0.vm05.stdout:1/82: truncate f4 8917724 0 2026-03-10T07:50:39.519 INFO:tasks.workunit.client.0.vm05.stdout:9/35: rename f3 to d8/fc 0 2026-03-10T07:50:39.520 
INFO:tasks.workunit.client.0.vm05.stdout:9/36: chown l2 32891707 1 2026-03-10T07:50:39.521 INFO:tasks.workunit.client.0.vm05.stdout:3/83: dwrite d8/ff [0,4194304] 0 2026-03-10T07:50:39.523 INFO:tasks.workunit.client.0.vm05.stdout:3/84: write f6 [1373670,104103] 0 2026-03-10T07:50:39.531 INFO:tasks.workunit.client.0.vm05.stdout:6/42: link d0/c1 d0/d6/cc 0 2026-03-10T07:50:39.531 INFO:tasks.workunit.client.0.vm05.stdout:2/131: creat d0/d3/df/d25/f2b x:0 0 0 2026-03-10T07:50:39.531 INFO:tasks.workunit.client.0.vm05.stdout:1/83: dwrite da/fb [0,4194304] 0 2026-03-10T07:50:39.542 INFO:tasks.workunit.client.0.vm05.stdout:2/132: mknod d0/d8/c2c 0 2026-03-10T07:50:39.549 INFO:tasks.workunit.client.0.vm05.stdout:3/85: write d8/fc [333831,109316] 0 2026-03-10T07:50:39.549 INFO:tasks.workunit.client.0.vm05.stdout:3/86: stat f4 0 2026-03-10T07:50:39.550 INFO:tasks.workunit.client.0.vm05.stdout:2/133: creat d0/d8/f2d x:0 0 0 2026-03-10T07:50:39.550 INFO:tasks.workunit.client.0.vm05.stdout:9/37: dread f4 [0,4194304] 0 2026-03-10T07:50:39.550 INFO:tasks.workunit.client.0.vm05.stdout:6/43: dwrite d0/fa [0,4194304] 0 2026-03-10T07:50:39.552 INFO:tasks.workunit.client.0.vm05.stdout:3/87: truncate d8/f12 772602 0 2026-03-10T07:50:39.552 INFO:tasks.workunit.client.0.vm05.stdout:2/134: creat d0/d2a/f2e x:0 0 0 2026-03-10T07:50:39.552 INFO:tasks.workunit.client.0.vm05.stdout:1/84: dwrite f4 [0,4194304] 0 2026-03-10T07:50:39.552 INFO:tasks.workunit.client.0.vm05.stdout:3/88: chown d8/fe 2966192 1 2026-03-10T07:50:39.553 INFO:tasks.workunit.client.0.vm05.stdout:2/135: stat d0/d3/df/f21 0 2026-03-10T07:50:39.564 INFO:tasks.workunit.client.0.vm05.stdout:2/136: mkdir d0/d2a/d2f 0 2026-03-10T07:50:39.565 INFO:tasks.workunit.client.0.vm05.stdout:2/137: chown d0/f7 310869 1 2026-03-10T07:50:39.574 INFO:tasks.workunit.client.0.vm05.stdout:2/138: creat d0/d3/f30 x:0 0 0 2026-03-10T07:50:39.578 INFO:tasks.workunit.client.0.vm05.stdout:2/139: link d0/f4 d0/d8/f31 0 2026-03-10T07:50:39.578 
INFO:tasks.workunit.client.0.vm05.stdout:2/140: write d0/d3/df/f18 [976178,58960] 0 2026-03-10T07:50:39.579 INFO:tasks.workunit.client.0.vm05.stdout:4/62: sync 2026-03-10T07:50:39.579 INFO:tasks.workunit.client.0.vm05.stdout:9/38: sync 2026-03-10T07:50:39.582 INFO:tasks.workunit.client.0.vm05.stdout:4/63: symlink d0/d6/d9/d12/l14 0 2026-03-10T07:50:39.586 INFO:tasks.workunit.client.0.vm05.stdout:9/39: creat d8/fd x:0 0 0 2026-03-10T07:50:39.586 INFO:tasks.workunit.client.0.vm05.stdout:4/64: creat d0/d6/f15 x:0 0 0 2026-03-10T07:50:39.586 INFO:tasks.workunit.client.0.vm05.stdout:9/40: dread - d8/fd zero size 2026-03-10T07:50:39.590 INFO:tasks.workunit.client.0.vm05.stdout:4/65: link d0/d6/d9/d12/l14 d0/d6/l16 0 2026-03-10T07:50:39.590 INFO:tasks.workunit.client.0.vm05.stdout:4/66: write d0/d6/f15 [242211,45676] 0 2026-03-10T07:50:39.591 INFO:tasks.workunit.client.0.vm05.stdout:4/67: write d0/f2 [3337898,103081] 0 2026-03-10T07:50:39.595 INFO:tasks.workunit.client.0.vm05.stdout:4/68: mkdir d0/d17 0 2026-03-10T07:50:39.608 INFO:tasks.workunit.client.0.vm05.stdout:4/69: sync 2026-03-10T07:50:39.613 INFO:tasks.workunit.client.0.vm05.stdout:4/70: sync 2026-03-10T07:50:39.618 INFO:tasks.workunit.client.0.vm05.stdout:4/71: mknod d0/d6/c18 0 2026-03-10T07:50:39.645 INFO:tasks.workunit.client.0.vm05.stdout:8/57: truncate f0 325085 0 2026-03-10T07:50:39.645 INFO:tasks.workunit.client.0.vm05.stdout:3/89: dread d8/fd [0,4194304] 0 2026-03-10T07:50:39.658 INFO:tasks.workunit.client.0.vm05.stdout:4/72: fdatasync d0/f2 0 2026-03-10T07:50:39.658 INFO:tasks.workunit.client.0.vm05.stdout:4/73: chown d0/d6/c18 424482 1 2026-03-10T07:50:39.659 INFO:tasks.workunit.client.0.vm05.stdout:4/74: write d0/f2 [5312402,88959] 0 2026-03-10T07:50:39.660 INFO:tasks.workunit.client.0.vm05.stdout:4/75: read d0/d6/f15 [29044,120559] 0 2026-03-10T07:50:39.670 INFO:tasks.workunit.client.0.vm05.stdout:4/76: rmdir d0/d6/d9 39 2026-03-10T07:50:39.670 INFO:tasks.workunit.client.0.vm05.stdout:1/85: fsync 
da/fb 0 2026-03-10T07:50:39.675 INFO:tasks.workunit.client.0.vm05.stdout:0/43: truncate f4 1491502 0 2026-03-10T07:50:39.679 INFO:tasks.workunit.client.0.vm05.stdout:2/141: getdents d0 0 2026-03-10T07:50:39.679 INFO:tasks.workunit.client.0.vm05.stdout:0/44: dread d8/fc [0,4194304] 0 2026-03-10T07:50:39.688 INFO:tasks.workunit.client.0.vm05.stdout:4/77: dread d0/d6/f15 [0,4194304] 0 2026-03-10T07:50:39.688 INFO:tasks.workunit.client.0.vm05.stdout:2/142: mknod d0/d3/df/d25/c32 0 2026-03-10T07:50:39.688 INFO:tasks.workunit.client.0.vm05.stdout:1/86: stat c1 0 2026-03-10T07:50:39.690 INFO:tasks.workunit.client.0.vm05.stdout:2/143: chown d0/d3/df/c13 0 1 2026-03-10T07:50:39.693 INFO:tasks.workunit.client.0.vm05.stdout:5/76: truncate d2/f9 3630085 0 2026-03-10T07:50:39.695 INFO:tasks.workunit.client.0.vm05.stdout:3/90: getdents d8 0 2026-03-10T07:50:39.695 INFO:tasks.workunit.client.0.vm05.stdout:4/78: readlink d0/d6/l16 0 2026-03-10T07:50:39.699 INFO:tasks.workunit.client.0.vm05.stdout:7/69: dwrite d1/f11 [0,4194304] 0 2026-03-10T07:50:39.701 INFO:tasks.workunit.client.0.vm05.stdout:2/144: symlink d0/d3/df/d25/l33 0 2026-03-10T07:50:39.701 INFO:tasks.workunit.client.0.vm05.stdout:2/145: readlink d0/d3/df/l16 0 2026-03-10T07:50:39.702 INFO:tasks.workunit.client.0.vm05.stdout:2/146: dread - d0/f1e zero size 2026-03-10T07:50:39.703 INFO:tasks.workunit.client.0.vm05.stdout:2/147: dread d0/d3/df/f20 [0,4194304] 0 2026-03-10T07:50:39.703 INFO:tasks.workunit.client.0.vm05.stdout:2/148: write d0/d3/df/d25/f29 [1475456,59952] 0 2026-03-10T07:50:39.704 INFO:tasks.workunit.client.0.vm05.stdout:2/149: chown d0/f22 67158024 1 2026-03-10T07:50:39.704 INFO:tasks.workunit.client.0.vm05.stdout:1/87: creat da/ff x:0 0 0 2026-03-10T07:50:39.704 INFO:tasks.workunit.client.0.vm05.stdout:2/150: stat d0/d3/df/c11 0 2026-03-10T07:50:39.707 INFO:tasks.workunit.client.0.vm05.stdout:3/91: mkdir d8/d16 0 2026-03-10T07:50:39.711 INFO:tasks.workunit.client.0.vm05.stdout:3/92: write d8/fe 
[2257409,64585] 0 2026-03-10T07:50:39.721 INFO:tasks.workunit.client.0.vm05.stdout:7/70: creat d1/f17 x:0 0 0 2026-03-10T07:50:39.729 INFO:tasks.workunit.client.0.vm05.stdout:4/79: fsync d0/d6/f15 0 2026-03-10T07:50:39.732 INFO:tasks.workunit.client.0.vm05.stdout:9/41: write d8/fa [2465467,39795] 0 2026-03-10T07:50:39.733 INFO:tasks.workunit.client.0.vm05.stdout:4/80: chown d0/d17 239849 1 2026-03-10T07:50:39.739 INFO:tasks.workunit.client.0.vm05.stdout:4/81: dwrite d0/f2 [4194304,4194304] 0 2026-03-10T07:50:39.741 INFO:tasks.workunit.client.0.vm05.stdout:1/88: mknod da/dd/c10 0 2026-03-10T07:50:39.741 INFO:tasks.workunit.client.0.vm05.stdout:4/82: read d0/f2 [2254932,36453] 0 2026-03-10T07:50:39.742 INFO:tasks.workunit.client.0.vm05.stdout:4/83: truncate d0/d6/f15 882964 0 2026-03-10T07:50:39.743 INFO:tasks.workunit.client.0.vm05.stdout:4/84: write d0/d6/f15 [326513,98408] 0 2026-03-10T07:50:39.750 INFO:tasks.workunit.client.0.vm05.stdout:1/89: dread f4 [0,4194304] 0 2026-03-10T07:50:39.759 INFO:tasks.workunit.client.0.vm05.stdout:3/93: symlink d8/l17 0 2026-03-10T07:50:39.759 INFO:tasks.workunit.client.0.vm05.stdout:2/151: rename d0/d3 to d0/d34 0 2026-03-10T07:50:39.759 INFO:tasks.workunit.client.0.vm05.stdout:7/71: symlink d1/l18 0 2026-03-10T07:50:39.759 INFO:tasks.workunit.client.0.vm05.stdout:5/77: creat d2/f1a x:0 0 0 2026-03-10T07:50:39.759 INFO:tasks.workunit.client.0.vm05.stdout:9/42: creat d8/fe x:0 0 0 2026-03-10T07:50:39.760 INFO:tasks.workunit.client.0.vm05.stdout:7/72: chown d1/f11 95696309 1 2026-03-10T07:50:39.763 INFO:tasks.workunit.client.0.vm05.stdout:6/44: truncate d0/f8 3386355 0 2026-03-10T07:50:39.765 INFO:tasks.workunit.client.0.vm05.stdout:7/73: dread d1/d6/fb [0,4194304] 0 2026-03-10T07:50:39.770 INFO:tasks.workunit.client.0.vm05.stdout:7/74: dread d1/f2 [0,4194304] 0 2026-03-10T07:50:39.772 INFO:tasks.workunit.client.0.vm05.stdout:3/94: dwrite d8/fc [0,4194304] 0 2026-03-10T07:50:39.775 INFO:tasks.workunit.client.0.vm05.stdout:3/95: 
stat f4 0 2026-03-10T07:50:39.780 INFO:tasks.workunit.client.0.vm05.stdout:1/90: creat da/dd/f11 x:0 0 0 2026-03-10T07:50:39.780 INFO:tasks.workunit.client.0.vm05.stdout:4/85: dread d0/d6/f15 [0,4194304] 0 2026-03-10T07:50:39.780 INFO:tasks.workunit.client.0.vm05.stdout:9/43: dwrite d8/fe [0,4194304] 0 2026-03-10T07:50:39.781 INFO:tasks.workunit.client.0.vm05.stdout:3/96: write d8/fb [866833,57567] 0 2026-03-10T07:50:39.788 INFO:tasks.workunit.client.0.vm05.stdout:4/86: dread d0/d6/f15 [0,4194304] 0 2026-03-10T07:50:39.790 INFO:tasks.workunit.client.0.vm05.stdout:7/75: mknod d1/d6/c19 0 2026-03-10T07:50:39.791 INFO:tasks.workunit.client.0.vm05.stdout:7/76: readlink d1/lf 0 2026-03-10T07:50:39.791 INFO:tasks.workunit.client.0.vm05.stdout:7/77: chown d1/l14 15068 1 2026-03-10T07:50:39.792 INFO:tasks.workunit.client.0.vm05.stdout:7/78: write d1/f11 [3515333,125530] 0 2026-03-10T07:50:39.793 INFO:tasks.workunit.client.0.vm05.stdout:7/79: dread d1/d6/fb [0,4194304] 0 2026-03-10T07:50:39.793 INFO:tasks.workunit.client.0.vm05.stdout:7/80: read d1/d6/f9 [299054,33279] 0 2026-03-10T07:50:39.799 INFO:tasks.workunit.client.0.vm05.stdout:9/44: mkdir d8/df 0 2026-03-10T07:50:39.799 INFO:tasks.workunit.client.0.vm05.stdout:1/91: stat da/fc 0 2026-03-10T07:50:39.800 INFO:tasks.workunit.client.0.vm05.stdout:9/45: truncate d8/fd 875042 0 2026-03-10T07:50:39.803 INFO:tasks.workunit.client.0.vm05.stdout:6/45: unlink d0/c1 0 2026-03-10T07:50:39.804 INFO:tasks.workunit.client.0.vm05.stdout:3/97: creat d8/f18 x:0 0 0 2026-03-10T07:50:39.804 INFO:tasks.workunit.client.0.vm05.stdout:3/98: chown d8/l9 45 1 2026-03-10T07:50:39.811 INFO:tasks.workunit.client.0.vm05.stdout:7/81: fsync d1/d6/f15 0 2026-03-10T07:50:39.820 INFO:tasks.workunit.client.0.vm05.stdout:1/92: mkdir da/dd/d12 0 2026-03-10T07:50:39.823 INFO:tasks.workunit.client.0.vm05.stdout:4/87: creat d0/d17/f19 x:0 0 0 2026-03-10T07:50:39.828 INFO:tasks.workunit.client.0.vm05.stdout:7/82: fsync d1/f16 0 2026-03-10T07:50:39.829 
INFO:tasks.workunit.client.0.vm05.stdout:1/93: creat da/dd/f13 x:0 0 0 2026-03-10T07:50:39.832 INFO:tasks.workunit.client.0.vm05.stdout:7/83: chown d1/d6/l10 3055 1 2026-03-10T07:50:39.834 INFO:tasks.workunit.client.0.vm05.stdout:8/58: write f0 [782337,27480] 0 2026-03-10T07:50:39.835 INFO:tasks.workunit.client.0.vm05.stdout:8/59: truncate d1/dd/f11 252987 0 2026-03-10T07:50:39.838 INFO:tasks.workunit.client.0.vm05.stdout:3/99: mkdir d8/d16/d19 0 2026-03-10T07:50:39.842 INFO:tasks.workunit.client.0.vm05.stdout:0/45: truncate f4 434908 0 2026-03-10T07:50:39.844 INFO:tasks.workunit.client.0.vm05.stdout:9/46: rmdir d8/df 0 2026-03-10T07:50:39.844 INFO:tasks.workunit.client.0.vm05.stdout:7/84: write d1/d6/fb [1505395,82529] 0 2026-03-10T07:50:39.846 INFO:tasks.workunit.client.0.vm05.stdout:8/60: mknod d1/dd/c14 0 2026-03-10T07:50:39.848 INFO:tasks.workunit.client.0.vm05.stdout:4/88: sync 2026-03-10T07:50:39.848 INFO:tasks.workunit.client.0.vm05.stdout:7/85: sync 2026-03-10T07:50:39.862 INFO:tasks.workunit.client.0.vm05.stdout:3/100: creat d8/d16/f1a x:0 0 0 2026-03-10T07:50:39.867 INFO:tasks.workunit.client.0.vm05.stdout:3/101: dread d8/fc [0,4194304] 0 2026-03-10T07:50:39.873 INFO:tasks.workunit.client.0.vm05.stdout:3/102: dwrite f3 [0,4194304] 0 2026-03-10T07:50:39.880 INFO:tasks.workunit.client.0.vm05.stdout:7/86: fdatasync d1/f2 0 2026-03-10T07:50:39.888 INFO:tasks.workunit.client.0.vm05.stdout:6/46: dread d0/f8 [0,4194304] 0 2026-03-10T07:50:39.889 INFO:tasks.workunit.client.0.vm05.stdout:6/47: stat d0/l4 0 2026-03-10T07:50:39.890 INFO:tasks.workunit.client.0.vm05.stdout:7/87: dwrite d1/f17 [0,4194304] 0 2026-03-10T07:50:39.891 INFO:tasks.workunit.client.0.vm05.stdout:6/48: write d0/fa [3303554,129778] 0 2026-03-10T07:50:39.902 INFO:tasks.workunit.client.0.vm05.stdout:7/88: write d1/d6/f9 [1246084,46707] 0 2026-03-10T07:50:39.902 INFO:tasks.workunit.client.0.vm05.stdout:2/152: chown d0/d34/df/f18 5 1 2026-03-10T07:50:39.902 
INFO:tasks.workunit.client.0.vm05.stdout:1/94: truncate da/fb 3753175 0 2026-03-10T07:50:39.905 INFO:tasks.workunit.client.0.vm05.stdout:0/46: getdents d8/dd/d10 0 2026-03-10T07:50:39.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:50:39 vm05.local ceph-mon[50387]: pgmap v9: 65 pgs: 65 active+clean; 253 MiB data, 1.4 GiB used, 119 GiB / 120 GiB avail; 131 KiB/s rd, 12 MiB/s wr, 219 op/s 2026-03-10T07:50:39.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:50:39 vm05.local ceph-mon[50387]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' 2026-03-10T07:50:39.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:50:39 vm05.local ceph-mon[50387]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' 2026-03-10T07:50:39.912 INFO:tasks.workunit.client.0.vm05.stdout:3/103: symlink d8/l1b 0 2026-03-10T07:50:39.915 INFO:tasks.workunit.client.0.vm05.stdout:5/78: truncate d2/d5/fa 1439397 0 2026-03-10T07:50:39.916 INFO:tasks.workunit.client.0.vm05.stdout:7/89: symlink d1/l1a 0 2026-03-10T07:50:39.916 INFO:tasks.workunit.client.0.vm05.stdout:2/153: mknod d0/d34/c35 0 2026-03-10T07:50:39.917 INFO:tasks.workunit.client.0.vm05.stdout:2/154: fsync d0/d34/df/d25/f29 0 2026-03-10T07:50:39.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:50:39 vm08.local ceph-mon[59917]: pgmap v9: 65 pgs: 65 active+clean; 253 MiB data, 1.4 GiB used, 119 GiB / 120 GiB avail; 131 KiB/s rd, 12 MiB/s wr, 219 op/s 2026-03-10T07:50:39.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:50:39 vm08.local ceph-mon[59917]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' 2026-03-10T07:50:39.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:50:39 vm08.local ceph-mon[59917]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' 2026-03-10T07:50:39.918 INFO:tasks.workunit.client.0.vm05.stdout:2/155: write d0/d34/df/d25/f29 [1852056,71658] 0 2026-03-10T07:50:39.921 
INFO:tasks.workunit.client.0.vm05.stdout:3/104: unlink d8/f13 0 2026-03-10T07:50:39.921 INFO:tasks.workunit.client.0.vm05.stdout:5/79: stat d2/c11 0 2026-03-10T07:50:39.922 INFO:tasks.workunit.client.0.vm05.stdout:8/61: creat d1/f15 x:0 0 0 2026-03-10T07:50:39.928 INFO:tasks.workunit.client.0.vm05.stdout:7/90: rename d1/d6/f9 to d1/d6/f1b 0 2026-03-10T07:50:39.933 INFO:tasks.workunit.client.0.vm05.stdout:2/156: write d0/f1e [976933,5098] 0 2026-03-10T07:50:39.937 INFO:tasks.workunit.client.0.vm05.stdout:1/95: creat da/dd/d12/f14 x:0 0 0 2026-03-10T07:50:39.939 INFO:tasks.workunit.client.0.vm05.stdout:0/47: getdents d8/dd/d10 0 2026-03-10T07:50:39.940 INFO:tasks.workunit.client.0.vm05.stdout:3/105: mkdir d8/d1c 0 2026-03-10T07:50:39.940 INFO:tasks.workunit.client.0.vm05.stdout:5/80: symlink d2/d12/l1b 0 2026-03-10T07:50:39.940 INFO:tasks.workunit.client.0.vm05.stdout:8/62: dwrite d1/f15 [0,4194304] 0 2026-03-10T07:50:39.940 INFO:tasks.workunit.client.0.vm05.stdout:9/47: rmdir d8 39 2026-03-10T07:50:39.942 INFO:tasks.workunit.client.0.vm05.stdout:7/91: sync 2026-03-10T07:50:39.943 INFO:tasks.workunit.client.0.vm05.stdout:8/63: stat d1/dd/l12 0 2026-03-10T07:50:39.945 INFO:tasks.workunit.client.0.vm05.stdout:1/96: dread da/fc [0,4194304] 0 2026-03-10T07:50:39.947 INFO:tasks.workunit.client.0.vm05.stdout:0/48: dread d8/fa [0,4194304] 0 2026-03-10T07:50:39.967 INFO:tasks.workunit.client.0.vm05.stdout:5/81: symlink d2/d12/l1c 0 2026-03-10T07:50:39.967 INFO:tasks.workunit.client.0.vm05.stdout:5/82: readlink l1 0 2026-03-10T07:50:39.968 INFO:tasks.workunit.client.0.vm05.stdout:7/92: rename d1/f2 to d1/d6/f1c 0 2026-03-10T07:50:39.968 INFO:tasks.workunit.client.0.vm05.stdout:7/93: read d1/d6/f1c [276281,57302] 0 2026-03-10T07:50:39.968 INFO:tasks.workunit.client.0.vm05.stdout:7/94: fsync d1/f11 0 2026-03-10T07:50:39.969 INFO:tasks.workunit.client.0.vm05.stdout:4/89: truncate d0/f2 207084 0 2026-03-10T07:50:39.971 INFO:tasks.workunit.client.0.vm05.stdout:6/49: write d0/f8 
[3618596,105081] 0 2026-03-10T07:50:39.972 INFO:tasks.workunit.client.0.vm05.stdout:3/106: dread d8/fc [0,4194304] 0 2026-03-10T07:50:39.974 INFO:tasks.workunit.client.0.vm05.stdout:2/157: rmdir d0/d8 39 2026-03-10T07:50:39.988 INFO:tasks.workunit.client.0.vm05.stdout:9/48: rmdir d8 39 2026-03-10T07:50:39.988 INFO:tasks.workunit.client.0.vm05.stdout:5/83: symlink d2/d5/l1d 0 2026-03-10T07:50:39.991 INFO:tasks.workunit.client.0.vm05.stdout:5/84: write d2/d12/f14 [556637,15307] 0 2026-03-10T07:50:39.992 INFO:tasks.workunit.client.0.vm05.stdout:5/85: write d2/d5/fb [3051773,67222] 0 2026-03-10T07:50:40.009 INFO:tasks.workunit.client.0.vm05.stdout:5/86: fsync d2/d12/f16 0 2026-03-10T07:50:40.018 INFO:tasks.workunit.client.0.vm05.stdout:5/87: dwrite d2/d5/f10 [0,4194304] 0 2026-03-10T07:50:40.040 INFO:tasks.workunit.client.0.vm05.stdout:1/97: dread da/fb [0,4194304] 0 2026-03-10T07:50:40.041 INFO:tasks.workunit.client.0.vm05.stdout:5/88: dwrite d2/d5/f6 [0,4194304] 0 2026-03-10T07:50:40.050 INFO:tasks.workunit.client.0.vm05.stdout:7/95: rename d1/f17 to d1/d6/f1d 0 2026-03-10T07:50:40.154 INFO:tasks.workunit.client.0.vm05.stdout:6/50: write d0/fa [1442735,63217] 0 2026-03-10T07:50:40.161 INFO:tasks.workunit.client.0.vm05.stdout:4/90: symlink d0/d6/d9/l1a 0 2026-03-10T07:50:40.166 INFO:tasks.workunit.client.0.vm05.stdout:6/51: dwrite d0/fa [4194304,4194304] 0 2026-03-10T07:50:40.174 INFO:tasks.workunit.client.0.vm05.stdout:1/98: symlink da/l15 0 2026-03-10T07:50:40.177 INFO:tasks.workunit.client.0.vm05.stdout:5/89: rename d2/d12/f16 to d2/d5/f1e 0 2026-03-10T07:50:40.178 INFO:tasks.workunit.client.0.vm05.stdout:5/90: dread - d2/d5/f18 zero size 2026-03-10T07:50:40.179 INFO:tasks.workunit.client.0.vm05.stdout:5/91: write d2/d12/f14 [951044,3331] 0 2026-03-10T07:50:40.180 INFO:tasks.workunit.client.0.vm05.stdout:7/96: write d1/d6/f1b [2222429,12773] 0 2026-03-10T07:50:40.181 INFO:tasks.workunit.client.0.vm05.stdout:7/97: truncate d1/d6/f1d 4698421 0 2026-03-10T07:50:40.185 
INFO:tasks.workunit.client.0.vm05.stdout:7/98: dwrite d1/d6/f15 [0,4194304] 0 2026-03-10T07:50:40.188 INFO:tasks.workunit.client.0.vm05.stdout:7/99: chown d1/lf 139 1 2026-03-10T07:50:40.188 INFO:tasks.workunit.client.0.vm05.stdout:7/100: chown d1/l18 794885484 1 2026-03-10T07:50:40.194 INFO:tasks.workunit.client.0.vm05.stdout:8/64: link d1/l2 d1/l16 0 2026-03-10T07:50:40.202 INFO:tasks.workunit.client.0.vm05.stdout:1/99: rename da/ff to da/dd/d12/f16 0 2026-03-10T07:50:40.203 INFO:tasks.workunit.client.0.vm05.stdout:1/100: dread - da/dd/f13 zero size 2026-03-10T07:50:40.206 INFO:tasks.workunit.client.0.vm05.stdout:7/101: write d1/d6/f1d [4315886,99412] 0 2026-03-10T07:50:40.207 INFO:tasks.workunit.client.0.vm05.stdout:7/102: fdatasync d1/f11 0 2026-03-10T07:50:40.208 INFO:tasks.workunit.client.0.vm05.stdout:7/103: truncate d1/d6/f1d 5714336 0 2026-03-10T07:50:40.210 INFO:tasks.workunit.client.0.vm05.stdout:2/158: creat d0/d2a/d2f/f36 x:0 0 0 2026-03-10T07:50:40.214 INFO:tasks.workunit.client.0.vm05.stdout:9/49: link c5 d8/c10 0 2026-03-10T07:50:40.228 INFO:tasks.workunit.client.0.vm05.stdout:4/91: unlink d0/d6/l16 0 2026-03-10T07:50:40.231 INFO:tasks.workunit.client.0.vm05.stdout:5/92: rename d2/d12/l1c to d2/d5/l1f 0 2026-03-10T07:50:40.240 INFO:tasks.workunit.client.0.vm05.stdout:1/101: dwrite da/dd/d12/f16 [0,4194304] 0 2026-03-10T07:50:40.245 INFO:tasks.workunit.client.0.vm05.stdout:7/104: dwrite d1/d6/f1c [0,4194304] 0 2026-03-10T07:50:40.258 INFO:tasks.workunit.client.0.vm05.stdout:9/50: mknod d8/c11 0 2026-03-10T07:50:40.259 INFO:tasks.workunit.client.0.vm05.stdout:9/51: dread f6 [0,4194304] 0 2026-03-10T07:50:40.262 INFO:tasks.workunit.client.0.vm05.stdout:6/52: mknod d0/d6/cd 0 2026-03-10T07:50:40.272 INFO:tasks.workunit.client.0.vm05.stdout:7/105: creat d1/f1e x:0 0 0 2026-03-10T07:50:40.277 INFO:tasks.workunit.client.0.vm05.stdout:7/106: write d1/d6/f15 [1876606,94834] 0 2026-03-10T07:50:40.277 INFO:tasks.workunit.client.0.vm05.stdout:2/159: mkdir 
d0/d34/df/d25/d37 0 2026-03-10T07:50:40.285 INFO:tasks.workunit.client.0.vm05.stdout:6/53: creat d0/d6/fe x:0 0 0 2026-03-10T07:50:40.287 INFO:tasks.workunit.client.0.vm05.stdout:5/93: mkdir d2/d20 0 2026-03-10T07:50:40.287 INFO:tasks.workunit.client.0.vm05.stdout:5/94: truncate d2/d5/f1e 1149403 0 2026-03-10T07:50:40.291 INFO:tasks.workunit.client.0.vm05.stdout:7/107: unlink d1/f1e 0 2026-03-10T07:50:40.294 INFO:tasks.workunit.client.0.vm05.stdout:2/160: mkdir d0/d34/d38 0 2026-03-10T07:50:40.321 INFO:tasks.workunit.client.0.vm05.stdout:6/54: symlink d0/lf 0 2026-03-10T07:50:40.324 INFO:tasks.workunit.client.0.vm05.stdout:5/95: creat d2/d20/f21 x:0 0 0 2026-03-10T07:50:40.324 INFO:tasks.workunit.client.0.vm05.stdout:5/96: dread - d2/f8 zero size 2026-03-10T07:50:40.327 INFO:tasks.workunit.client.0.vm05.stdout:2/161: rename d0/c1b to d0/d2a/d2f/c39 0 2026-03-10T07:50:40.334 INFO:tasks.workunit.client.0.vm05.stdout:6/55: write d0/f8 [1245506,78805] 0 2026-03-10T07:50:40.341 INFO:tasks.workunit.client.0.vm05.stdout:3/107: truncate d8/f12 763938 0 2026-03-10T07:50:40.344 INFO:tasks.workunit.client.0.vm05.stdout:8/65: getdents d1 0 2026-03-10T07:50:40.350 INFO:tasks.workunit.client.0.vm05.stdout:6/56: creat d0/d6/f10 x:0 0 0 2026-03-10T07:50:40.353 INFO:tasks.workunit.client.0.vm05.stdout:0/49: dwrite f4 [0,4194304] 0 2026-03-10T07:50:40.355 INFO:tasks.workunit.client.0.vm05.stdout:3/108: fsync d8/fc 0 2026-03-10T07:50:40.355 INFO:tasks.workunit.client.0.vm05.stdout:3/109: write f6 [714705,115928] 0 2026-03-10T07:50:40.364 INFO:tasks.workunit.client.0.vm05.stdout:8/66: creat d1/dd/f17 x:0 0 0 2026-03-10T07:50:40.368 INFO:tasks.workunit.client.0.vm05.stdout:5/97: symlink d2/l22 0 2026-03-10T07:50:40.370 INFO:tasks.workunit.client.0.vm05.stdout:6/57: chown d0/l4 1243696 1 2026-03-10T07:50:40.370 INFO:tasks.workunit.client.0.vm05.stdout:6/58: readlink d0/l4 0 2026-03-10T07:50:40.374 INFO:tasks.workunit.client.0.vm05.stdout:7/108: write d1/d6/f1d [2088697,26680] 0 
2026-03-10T07:50:40.376 INFO:tasks.workunit.client.0.vm05.stdout:4/92: getdents d0/d6 0 2026-03-10T07:50:40.377 INFO:tasks.workunit.client.0.vm05.stdout:4/93: write d0/d17/f19 [461121,121666] 0 2026-03-10T07:50:40.380 INFO:tasks.workunit.client.0.vm05.stdout:3/110: rename d8/fc to d8/d16/d19/f1d 0 2026-03-10T07:50:40.381 INFO:tasks.workunit.client.0.vm05.stdout:3/111: chown d8/d16/d19/f1d 13699029 1 2026-03-10T07:50:40.383 INFO:tasks.workunit.client.0.vm05.stdout:8/67: mkdir d1/dd/d18 0 2026-03-10T07:50:40.383 INFO:tasks.workunit.client.0.vm05.stdout:8/68: chown d1/dd/l12 0 1 2026-03-10T07:50:40.384 INFO:tasks.workunit.client.0.vm05.stdout:9/52: getdents d8 0 2026-03-10T07:50:40.388 INFO:tasks.workunit.client.0.vm05.stdout:9/53: dwrite d8/fd [0,4194304] 0 2026-03-10T07:50:40.388 INFO:tasks.workunit.client.0.vm05.stdout:9/54: write d8/fa [343618,10044] 0 2026-03-10T07:50:40.389 INFO:tasks.workunit.client.0.vm05.stdout:9/55: read d8/fa [47891,55990] 0 2026-03-10T07:50:40.389 INFO:tasks.workunit.client.0.vm05.stdout:9/56: write d8/f9 [2502399,51910] 0 2026-03-10T07:50:40.390 INFO:tasks.workunit.client.0.vm05.stdout:9/57: write d8/fd [3861890,43614] 0 2026-03-10T07:50:40.392 INFO:tasks.workunit.client.0.vm05.stdout:2/162: sync 2026-03-10T07:50:40.396 INFO:tasks.workunit.client.0.vm05.stdout:5/98: creat d2/d5/f23 x:0 0 0 2026-03-10T07:50:40.406 INFO:tasks.workunit.client.0.vm05.stdout:1/102: dwrite da/fb [0,4194304] 0 2026-03-10T07:50:40.411 INFO:tasks.workunit.client.0.vm05.stdout:7/109: mknod d1/d6/c1f 0 2026-03-10T07:50:40.419 INFO:tasks.workunit.client.0.vm05.stdout:5/99: sync 2026-03-10T07:50:40.419 INFO:tasks.workunit.client.0.vm05.stdout:5/100: dread - d2/d5/f18 zero size 2026-03-10T07:50:40.426 INFO:tasks.workunit.client.0.vm05.stdout:0/50: creat d8/dd/d10/f11 x:0 0 0 2026-03-10T07:50:40.432 INFO:tasks.workunit.client.0.vm05.stdout:3/112: creat d8/d16/f1e x:0 0 0 2026-03-10T07:50:40.436 INFO:tasks.workunit.client.0.vm05.stdout:3/113: dwrite d8/fe [0,4194304] 0 
2026-03-10T07:50:40.440 INFO:tasks.workunit.client.0.vm05.stdout:8/69: creat d1/dd/f19 x:0 0 0 2026-03-10T07:50:40.444 INFO:tasks.workunit.client.0.vm05.stdout:8/70: write d1/dd/f10 [558037,121583] 0 2026-03-10T07:50:40.445 INFO:tasks.workunit.client.0.vm05.stdout:8/71: chown d1/c4 167817 1 2026-03-10T07:50:40.445 INFO:tasks.workunit.client.0.vm05.stdout:8/72: read - d1/fe zero size 2026-03-10T07:50:40.446 INFO:tasks.workunit.client.0.vm05.stdout:8/73: truncate d1/dd/f11 1116806 0 2026-03-10T07:50:40.449 INFO:tasks.workunit.client.0.vm05.stdout:3/114: dwrite d8/fb [0,4194304] 0 2026-03-10T07:50:40.462 INFO:tasks.workunit.client.0.vm05.stdout:8/74: dwrite d1/dd/f19 [0,4194304] 0 2026-03-10T07:50:40.486 INFO:tasks.workunit.client.0.vm05.stdout:9/58: fsync d8/fd 0 2026-03-10T07:50:40.528 INFO:tasks.workunit.client.0.vm05.stdout:1/103: creat da/f17 x:0 0 0 2026-03-10T07:50:40.529 INFO:tasks.workunit.client.0.vm05.stdout:7/110: creat d1/d6/f20 x:0 0 0 2026-03-10T07:50:40.529 INFO:tasks.workunit.client.0.vm05.stdout:1/104: chown c7 4 1 2026-03-10T07:50:40.529 INFO:tasks.workunit.client.0.vm05.stdout:5/101: creat d2/d12/f24 x:0 0 0 2026-03-10T07:50:40.531 INFO:tasks.workunit.client.0.vm05.stdout:4/94: mknod d0/c1b 0 2026-03-10T07:50:40.535 INFO:tasks.workunit.client.0.vm05.stdout:0/51: creat d8/f12 x:0 0 0 2026-03-10T07:50:40.540 INFO:tasks.workunit.client.0.vm05.stdout:7/111: dwrite d1/d6/f15 [0,4194304] 0 2026-03-10T07:50:40.543 INFO:tasks.workunit.client.0.vm05.stdout:3/115: mkdir d8/d1f 0 2026-03-10T07:50:40.546 INFO:tasks.workunit.client.0.vm05.stdout:7/112: read d1/d6/f1c [3592783,65435] 0 2026-03-10T07:50:40.550 INFO:tasks.workunit.client.0.vm05.stdout:3/116: dread d8/ff [0,4194304] 0 2026-03-10T07:50:40.565 INFO:tasks.workunit.client.0.vm05.stdout:9/59: rename d8/fe to d8/f12 0 2026-03-10T07:50:40.588 INFO:tasks.workunit.client.0.vm05.stdout:1/105: creat da/dd/d12/f18 x:0 0 0 2026-03-10T07:50:40.592 INFO:tasks.workunit.client.0.vm05.stdout:4/95: symlink 
d0/d6/d9/d12/l1c 0 2026-03-10T07:50:40.597 INFO:tasks.workunit.client.0.vm05.stdout:5/102: rename d2/d5/f6 to d2/d5/f25 0 2026-03-10T07:50:40.599 INFO:tasks.workunit.client.0.vm05.stdout:0/52: creat d8/f13 x:0 0 0 2026-03-10T07:50:40.599 INFO:tasks.workunit.client.0.vm05.stdout:0/53: fdatasync f4 0 2026-03-10T07:50:40.603 INFO:tasks.workunit.client.0.vm05.stdout:5/103: dwrite d2/f8 [0,4194304] 0 2026-03-10T07:50:40.609 INFO:tasks.workunit.client.0.vm05.stdout:0/54: dread d8/fc [0,4194304] 0 2026-03-10T07:50:40.609 INFO:tasks.workunit.client.0.vm05.stdout:1/106: sync 2026-03-10T07:50:40.613 INFO:tasks.workunit.client.0.vm05.stdout:0/55: write d8/fb [1607993,88070] 0 2026-03-10T07:50:40.619 INFO:tasks.workunit.client.0.vm05.stdout:1/107: dwrite da/dd/d12/f16 [0,4194304] 0 2026-03-10T07:50:40.622 INFO:tasks.workunit.client.0.vm05.stdout:1/108: chown c8 682700 1 2026-03-10T07:50:40.626 INFO:tasks.workunit.client.0.vm05.stdout:0/56: dwrite f4 [0,4194304] 0 2026-03-10T07:50:40.645 INFO:tasks.workunit.client.0.vm05.stdout:7/113: creat d1/f21 x:0 0 0 2026-03-10T07:50:40.648 INFO:tasks.workunit.client.0.vm05.stdout:7/114: truncate d1/f16 659477 0 2026-03-10T07:50:40.649 INFO:tasks.workunit.client.0.vm05.stdout:3/117: stat c2 0 2026-03-10T07:50:40.661 INFO:tasks.workunit.client.0.vm05.stdout:6/59: getdents d0/d6 0 2026-03-10T07:50:40.673 INFO:tasks.workunit.client.0.vm05.stdout:2/163: dwrite d0/d34/df/f20 [0,4194304] 0 2026-03-10T07:50:40.684 INFO:tasks.workunit.client.0.vm05.stdout:2/164: sync 2026-03-10T07:50:40.688 INFO:tasks.workunit.client.0.vm05.stdout:5/104: symlink d2/d5/l26 0 2026-03-10T07:50:40.689 INFO:tasks.workunit.client.0.vm05.stdout:1/109: fdatasync da/dd/d12/f16 0 2026-03-10T07:50:40.690 INFO:tasks.workunit.client.0.vm05.stdout:1/110: chown da/fb 10919 1 2026-03-10T07:50:40.695 INFO:tasks.workunit.client.0.vm05.stdout:1/111: write da/dd/d12/f14 [743671,116566] 0 2026-03-10T07:50:40.725 INFO:tasks.workunit.client.0.vm05.stdout:4/96: truncate d0/f2 177949 0 
2026-03-10T07:50:40.731 INFO:tasks.workunit.client.0.vm05.stdout:4/97: dread d0/d17/f19 [0,4194304] 0 2026-03-10T07:50:40.734 INFO:tasks.workunit.client.0.vm05.stdout:2/165: creat d0/d34/df/f3a x:0 0 0 2026-03-10T07:50:40.734 INFO:tasks.workunit.client.0.vm05.stdout:2/166: readlink d0/d34/df/l16 0 2026-03-10T07:50:40.736 INFO:tasks.workunit.client.0.vm05.stdout:4/98: dwrite d0/d17/f19 [0,4194304] 0 2026-03-10T07:50:40.737 INFO:tasks.workunit.client.0.vm05.stdout:5/105: mknod d2/d5/c27 0 2026-03-10T07:50:40.738 INFO:tasks.workunit.client.0.vm05.stdout:4/99: write d0/d17/f19 [1815683,72273] 0 2026-03-10T07:50:40.747 INFO:tasks.workunit.client.0.vm05.stdout:4/100: dwrite d0/d17/f19 [0,4194304] 0 2026-03-10T07:50:40.749 INFO:tasks.workunit.client.0.vm05.stdout:0/57: mknod d8/dd/de/c14 0 2026-03-10T07:50:40.759 INFO:tasks.workunit.client.0.vm05.stdout:0/58: dwrite d8/fb [0,4194304] 0 2026-03-10T07:50:40.764 INFO:tasks.workunit.client.0.vm05.stdout:7/115: rename d1/d6/f20 to d1/d6/f22 0 2026-03-10T07:50:40.764 INFO:tasks.workunit.client.0.vm05.stdout:0/59: write d8/fb [2825685,15155] 0 2026-03-10T07:50:40.771 INFO:tasks.workunit.client.0.vm05.stdout:0/60: dwrite d8/fc [4194304,4194304] 0 2026-03-10T07:50:40.778 INFO:tasks.workunit.client.0.vm05.stdout:3/118: mknod d8/d1c/c20 0 2026-03-10T07:50:40.780 INFO:tasks.workunit.client.0.vm05.stdout:8/75: link d1/c8 d1/dd/d18/c1a 0 2026-03-10T07:50:40.782 INFO:tasks.workunit.client.0.vm05.stdout:6/60: mkdir d0/d11 0 2026-03-10T07:50:40.782 INFO:tasks.workunit.client.0.vm05.stdout:0/61: dwrite d8/f13 [0,4194304] 0 2026-03-10T07:50:40.783 INFO:tasks.workunit.client.0.vm05.stdout:8/76: stat d1/f15 0 2026-03-10T07:50:40.784 INFO:tasks.workunit.client.0.vm05.stdout:8/77: write d1/dd/f11 [1818116,4751] 0 2026-03-10T07:50:40.785 INFO:tasks.workunit.client.0.vm05.stdout:8/78: readlink d1/lf 0 2026-03-10T07:50:40.786 INFO:tasks.workunit.client.0.vm05.stdout:8/79: dread - d1/fe zero size 2026-03-10T07:50:40.788 
INFO:tasks.workunit.client.0.vm05.stdout:2/167: truncate d0/d8/f2d 223201 0 2026-03-10T07:50:40.800 INFO:tasks.workunit.client.0.vm05.stdout:2/168: dwrite d0/d34/df/f18 [0,4194304] 0 2026-03-10T07:50:40.817 INFO:tasks.workunit.client.0.vm05.stdout:7/116: rename d1/d6/f1c to d1/f23 0 2026-03-10T07:50:40.824 INFO:tasks.workunit.client.0.vm05.stdout:0/62: mknod d8/c15 0 2026-03-10T07:50:40.824 INFO:tasks.workunit.client.0.vm05.stdout:0/63: truncate d8/dd/d10/f11 895807 0 2026-03-10T07:50:40.826 INFO:tasks.workunit.client.0.vm05.stdout:9/60: write f7 [107093,16235] 0 2026-03-10T07:50:40.832 INFO:tasks.workunit.client.0.vm05.stdout:5/106: symlink d2/l28 0 2026-03-10T07:50:40.837 INFO:tasks.workunit.client.0.vm05.stdout:4/101: mknod d0/c1d 0 2026-03-10T07:50:40.842 INFO:tasks.workunit.client.0.vm05.stdout:2/169: creat d0/d8/f3b x:0 0 0 2026-03-10T07:50:40.842 INFO:tasks.workunit.client.0.vm05.stdout:7/117: rename d1/d6/l10 to d1/l24 0 2026-03-10T07:50:40.842 INFO:tasks.workunit.client.0.vm05.stdout:3/119: unlink l5 0 2026-03-10T07:50:40.842 INFO:tasks.workunit.client.0.vm05.stdout:3/120: stat d8/d1c/c20 0 2026-03-10T07:50:40.842 INFO:tasks.workunit.client.0.vm05.stdout:3/121: stat d8/l9 0 2026-03-10T07:50:40.842 INFO:tasks.workunit.client.0.vm05.stdout:3/122: dread d8/fd [0,4194304] 0 2026-03-10T07:50:40.845 INFO:tasks.workunit.client.0.vm05.stdout:6/61: mknod d0/d11/c12 0 2026-03-10T07:50:40.846 INFO:tasks.workunit.client.0.vm05.stdout:0/64: rmdir d8/dd/de 39 2026-03-10T07:50:40.847 INFO:tasks.workunit.client.0.vm05.stdout:0/65: readlink l7 0 2026-03-10T07:50:40.847 INFO:tasks.workunit.client.0.vm05.stdout:7/118: dread d1/d6/f1d [4194304,4194304] 0 2026-03-10T07:50:40.848 INFO:tasks.workunit.client.0.vm05.stdout:7/119: write d1/d6/fb [1506909,33174] 0 2026-03-10T07:50:40.850 INFO:tasks.workunit.client.0.vm05.stdout:9/61: mknod d8/c13 0 2026-03-10T07:50:40.860 INFO:tasks.workunit.client.0.vm05.stdout:3/123: creat d8/d16/d19/f21 x:0 0 0 2026-03-10T07:50:40.860 
INFO:tasks.workunit.client.0.vm05.stdout:3/124: write d8/d16/f1e [805634,129366] 0 2026-03-10T07:50:40.864 INFO:tasks.workunit.client.0.vm05.stdout:6/62: creat d0/d11/f13 x:0 0 0 2026-03-10T07:50:40.866 INFO:tasks.workunit.client.0.vm05.stdout:0/66: symlink d8/dd/d10/l16 0 2026-03-10T07:50:40.874 INFO:tasks.workunit.client.0.vm05.stdout:7/120: fdatasync d1/d6/f22 0 2026-03-10T07:50:40.877 INFO:tasks.workunit.client.0.vm05.stdout:1/112: dwrite f4 [8388608,4194304] 0 2026-03-10T07:50:40.878 INFO:tasks.workunit.client.0.vm05.stdout:1/113: chown da/dd/ce 0 1 2026-03-10T07:50:40.880 INFO:tasks.workunit.client.0.vm05.stdout:7/121: dread d1/d6/f1b [0,4194304] 0 2026-03-10T07:50:40.885 INFO:tasks.workunit.client.0.vm05.stdout:9/62: rename d8/fc to d8/f14 0 2026-03-10T07:50:40.885 INFO:tasks.workunit.client.0.vm05.stdout:5/107: mknod d2/c29 0 2026-03-10T07:50:40.886 INFO:tasks.workunit.client.0.vm05.stdout:2/170: mkdir d0/d3c 0 2026-03-10T07:50:40.887 INFO:tasks.workunit.client.0.vm05.stdout:2/171: chown d0/d34/df/d25/f2b 396 1 2026-03-10T07:50:40.890 INFO:tasks.workunit.client.0.vm05.stdout:5/108: dwrite d2/d5/f18 [0,4194304] 0 2026-03-10T07:50:40.903 INFO:tasks.workunit.client.0.vm05.stdout:3/125: mkdir d8/d22 0 2026-03-10T07:50:40.904 INFO:tasks.workunit.client.0.vm05.stdout:6/63: symlink d0/d11/l14 0 2026-03-10T07:50:40.907 INFO:tasks.workunit.client.0.vm05.stdout:6/64: chown d0/l4 7835376 1 2026-03-10T07:50:40.913 INFO:tasks.workunit.client.0.vm05.stdout:1/114: rmdir da/dd 39 2026-03-10T07:50:40.915 INFO:tasks.workunit.client.0.vm05.stdout:9/63: fsync d8/fa 0 2026-03-10T07:50:40.915 INFO:tasks.workunit.client.0.vm05.stdout:7/122: creat d1/f25 x:0 0 0 2026-03-10T07:50:40.916 INFO:tasks.workunit.client.0.vm05.stdout:2/172: mkdir d0/d8/d3d 0 2026-03-10T07:50:40.920 INFO:tasks.workunit.client.0.vm05.stdout:3/126: unlink f6 0 2026-03-10T07:50:40.920 INFO:tasks.workunit.client.0.vm05.stdout:6/65: dread d0/fa [4194304,4194304] 0 2026-03-10T07:50:40.930 
INFO:tasks.workunit.client.0.vm05.stdout:5/109: link d2/d5/f25 d2/d20/f2a 0 2026-03-10T07:50:40.935 INFO:tasks.workunit.client.0.vm05.stdout:3/127: creat d8/d1c/f23 x:0 0 0 2026-03-10T07:50:40.935 INFO:tasks.workunit.client.0.vm05.stdout:2/173: dread d0/f1e [0,4194304] 0 2026-03-10T07:50:40.937 INFO:tasks.workunit.client.0.vm05.stdout:6/66: truncate d0/d11/f13 169716 0 2026-03-10T07:50:40.937 INFO:tasks.workunit.client.0.vm05.stdout:8/80: truncate d1/dd/f19 2420463 0 2026-03-10T07:50:40.942 INFO:tasks.workunit.client.0.vm05.stdout:9/64: sync 2026-03-10T07:50:40.947 INFO:tasks.workunit.client.0.vm05.stdout:0/67: rmdir d8 39 2026-03-10T07:50:40.949 INFO:tasks.workunit.client.0.vm05.stdout:1/115: dwrite f4 [8388608,4194304] 0 2026-03-10T07:50:40.949 INFO:tasks.workunit.client.0.vm05.stdout:9/65: creat d8/f15 x:0 0 0 2026-03-10T07:50:40.959 INFO:tasks.workunit.client.0.vm05.stdout:1/116: read - da/f17 zero size 2026-03-10T07:50:40.960 INFO:tasks.workunit.client.0.vm05.stdout:7/123: dwrite d1/d6/f15 [0,4194304] 0 2026-03-10T07:50:40.960 INFO:tasks.workunit.client.0.vm05.stdout:8/81: dwrite d1/dd/f17 [0,4194304] 0 2026-03-10T07:50:40.967 INFO:tasks.workunit.client.0.vm05.stdout:4/102: dwrite d0/d17/f19 [4194304,4194304] 0 2026-03-10T07:50:40.973 INFO:tasks.workunit.client.0.vm05.stdout:9/66: dwrite d8/f15 [0,4194304] 0 2026-03-10T07:50:40.981 INFO:tasks.workunit.client.0.vm05.stdout:9/67: dread d8/f14 [0,4194304] 0 2026-03-10T07:50:40.981 INFO:tasks.workunit.client.0.vm05.stdout:9/68: read d8/f14 [380722,14974] 0 2026-03-10T07:50:41.007 INFO:tasks.workunit.client.0.vm05.stdout:7/124: symlink d1/d6/l26 0 2026-03-10T07:50:41.017 INFO:tasks.workunit.client.0.vm05.stdout:9/69: mkdir d8/d16 0 2026-03-10T07:50:41.018 INFO:tasks.workunit.client.0.vm05.stdout:9/70: read f6 [610345,125288] 0 2026-03-10T07:50:41.019 INFO:tasks.workunit.client.0.vm05.stdout:6/67: creat d0/f15 x:0 0 0 2026-03-10T07:50:41.022 INFO:tasks.workunit.client.0.vm05.stdout:1/117: mkdir da/dd/d12/d19 0 
2026-03-10T07:50:41.022 INFO:tasks.workunit.client.0.vm05.stdout:2/174: rmdir d0/d8 39 2026-03-10T07:50:41.035 INFO:tasks.workunit.client.0.vm05.stdout:5/110: rmdir d2/d5 39 2026-03-10T07:50:41.056 INFO:tasks.workunit.client.0.vm05.stdout:6/68: sync 2026-03-10T07:50:41.065 INFO:tasks.workunit.client.0.vm05.stdout:3/128: dwrite d8/f12 [0,4194304] 0 2026-03-10T07:50:41.179 INFO:tasks.workunit.client.0.vm05.stdout:4/103: mknod d0/d6/d9/c1e 0 2026-03-10T07:50:41.186 INFO:tasks.workunit.client.0.vm05.stdout:4/104: dwrite d0/d17/f19 [8388608,4194304] 0 2026-03-10T07:50:41.186 INFO:tasks.workunit.client.0.vm05.stdout:2/175: fsync d0/f6 0 2026-03-10T07:50:41.188 INFO:tasks.workunit.client.0.vm05.stdout:5/111: creat d2/d12/f2b x:0 0 0 2026-03-10T07:50:41.188 INFO:tasks.workunit.client.0.vm05.stdout:6/69: creat d0/d6/f16 x:0 0 0 2026-03-10T07:50:41.189 INFO:tasks.workunit.client.0.vm05.stdout:2/176: write d0/d34/f30 [129886,60823] 0 2026-03-10T07:50:41.190 INFO:tasks.workunit.client.0.vm05.stdout:6/70: truncate d0/d11/f13 406592 0 2026-03-10T07:50:41.200 INFO:tasks.workunit.client.0.vm05.stdout:6/71: dwrite d0/d6/f16 [0,4194304] 0 2026-03-10T07:50:41.202 INFO:tasks.workunit.client.0.vm05.stdout:6/72: stat d0/d6/f16 0 2026-03-10T07:50:41.211 INFO:tasks.workunit.client.0.vm05.stdout:8/82: dwrite d1/dd/f19 [0,4194304] 0 2026-03-10T07:50:41.217 INFO:tasks.workunit.client.0.vm05.stdout:9/71: symlink d8/d16/l17 0 2026-03-10T07:50:41.217 INFO:tasks.workunit.client.0.vm05.stdout:4/105: mknod d0/d6/c1f 0 2026-03-10T07:50:41.217 INFO:tasks.workunit.client.0.vm05.stdout:1/118: creat da/dd/d12/d19/f1a x:0 0 0 2026-03-10T07:50:41.218 INFO:tasks.workunit.client.0.vm05.stdout:0/68: link l7 d8/dd/de/l17 0 2026-03-10T07:50:41.218 INFO:tasks.workunit.client.0.vm05.stdout:0/69: readlink d8/dd/d10/l16 0 2026-03-10T07:50:41.227 INFO:tasks.workunit.client.0.vm05.stdout:0/70: creat d8/dd/de/f18 x:0 0 0 2026-03-10T07:50:41.228 INFO:tasks.workunit.client.0.vm05.stdout:2/177: creat d0/d8/f3e x:0 0 0 
2026-03-10T07:50:41.228 INFO:tasks.workunit.client.0.vm05.stdout:4/106: mkdir d0/d20 0 2026-03-10T07:50:41.228 INFO:tasks.workunit.client.0.vm05.stdout:6/73: link d0/d11/l14 d0/d6/l17 0 2026-03-10T07:50:41.228 INFO:tasks.workunit.client.0.vm05.stdout:0/71: stat d8/dd/de/c14 0 2026-03-10T07:50:41.229 INFO:tasks.workunit.client.0.vm05.stdout:0/72: stat c0 0 2026-03-10T07:50:41.236 INFO:tasks.workunit.client.0.vm05.stdout:2/178: truncate d0/f22 549656 0 2026-03-10T07:50:41.237 INFO:tasks.workunit.client.0.vm05.stdout:4/107: rmdir d0 39 2026-03-10T07:50:41.237 INFO:tasks.workunit.client.0.vm05.stdout:2/179: write d0/d8/f3e [233585,1130] 0 2026-03-10T07:50:41.246 INFO:tasks.workunit.client.0.vm05.stdout:0/73: creat d8/dd/d10/f19 x:0 0 0 2026-03-10T07:50:41.246 INFO:tasks.workunit.client.0.vm05.stdout:2/180: dread d0/d34/f30 [0,4194304] 0 2026-03-10T07:50:41.246 INFO:tasks.workunit.client.0.vm05.stdout:0/74: stat d8/dd/de 0 2026-03-10T07:50:41.247 INFO:tasks.workunit.client.0.vm05.stdout:0/75: write d8/fc [777214,124845] 0 2026-03-10T07:50:41.248 INFO:tasks.workunit.client.0.vm05.stdout:8/83: dread d1/dd/f10 [0,4194304] 0 2026-03-10T07:50:41.252 INFO:tasks.workunit.client.0.vm05.stdout:0/76: dread f6 [0,4194304] 0 2026-03-10T07:50:41.252 INFO:tasks.workunit.client.0.vm05.stdout:0/77: dread - d8/ff zero size 2026-03-10T07:50:41.254 INFO:tasks.workunit.client.0.vm05.stdout:0/78: truncate d8/dd/d10/f11 1141177 0 2026-03-10T07:50:41.257 INFO:tasks.workunit.client.0.vm05.stdout:2/181: chown d0/l15 0 1 2026-03-10T07:50:41.258 INFO:tasks.workunit.client.0.vm05.stdout:8/84: symlink d1/dd/l1b 0 2026-03-10T07:50:41.261 INFO:tasks.workunit.client.0.vm05.stdout:4/108: mknod d0/c21 0 2026-03-10T07:50:41.262 INFO:tasks.workunit.client.0.vm05.stdout:8/85: fsync d1/dd/f19 0 2026-03-10T07:50:41.263 INFO:tasks.workunit.client.0.vm05.stdout:2/182: write d0/f1e [937520,130250] 0 2026-03-10T07:50:41.268 INFO:tasks.workunit.client.0.vm05.stdout:0/79: dwrite d8/dd/d10/f11 [0,4194304] 0 
2026-03-10T07:50:41.270 INFO:tasks.workunit.client.0.vm05.stdout:8/86: unlink d1/dd/l1b 0 2026-03-10T07:50:41.273 INFO:tasks.workunit.client.0.vm05.stdout:6/74: link d0/c2 d0/c18 0 2026-03-10T07:50:41.274 INFO:tasks.workunit.client.0.vm05.stdout:8/87: write d1/dd/f17 [1215111,46306] 0 2026-03-10T07:50:41.276 INFO:tasks.workunit.client.0.vm05.stdout:8/88: fsync d1/dd/f19 0 2026-03-10T07:50:41.277 INFO:tasks.workunit.client.0.vm05.stdout:6/75: write d0/d6/fe [1019649,2301] 0 2026-03-10T07:50:41.288 INFO:tasks.workunit.client.0.vm05.stdout:8/89: rename d1/dd/f10 to d1/dd/d18/f1c 0 2026-03-10T07:50:41.293 INFO:tasks.workunit.client.0.vm05.stdout:8/90: dread d1/dd/f19 [0,4194304] 0 2026-03-10T07:50:41.293 INFO:tasks.workunit.client.0.vm05.stdout:0/80: link d8/dd/de/f18 d8/dd/d10/f1a 0 2026-03-10T07:50:41.294 INFO:tasks.workunit.client.0.vm05.stdout:6/76: symlink d0/l19 0 2026-03-10T07:50:41.296 INFO:tasks.workunit.client.0.vm05.stdout:2/183: getdents d0/d2a/d2f 0 2026-03-10T07:50:41.306 INFO:tasks.workunit.client.0.vm05.stdout:6/77: readlink d0/d6/l17 0 2026-03-10T07:50:41.308 INFO:tasks.workunit.client.0.vm05.stdout:0/81: write d8/fa [2115835,112782] 0 2026-03-10T07:50:41.310 INFO:tasks.workunit.client.0.vm05.stdout:2/184: rename d0/d34/df/d25/l26 to d0/d2a/d2f/l3f 0 2026-03-10T07:50:41.320 INFO:tasks.workunit.client.0.vm05.stdout:6/78: dwrite d0/f15 [0,4194304] 0 2026-03-10T07:50:41.324 INFO:tasks.workunit.client.0.vm05.stdout:2/185: creat d0/d8/d3d/f40 x:0 0 0 2026-03-10T07:50:41.324 INFO:tasks.workunit.client.0.vm05.stdout:6/79: fdatasync d0/d6/f16 0 2026-03-10T07:50:41.328 INFO:tasks.workunit.client.0.vm05.stdout:2/186: write d0/f6 [4809039,89017] 0 2026-03-10T07:50:41.328 INFO:tasks.workunit.client.0.vm05.stdout:6/80: read - d0/d6/f10 zero size 2026-03-10T07:50:41.334 INFO:tasks.workunit.client.0.vm05.stdout:6/81: creat d0/d6/f1a x:0 0 0 2026-03-10T07:50:41.337 INFO:tasks.workunit.client.0.vm05.stdout:6/82: readlink d0/l5 0 2026-03-10T07:50:41.339 
INFO:tasks.workunit.client.0.vm05.stdout:2/187: creat d0/f41 x:0 0 0 2026-03-10T07:50:41.352 INFO:tasks.workunit.client.0.vm05.stdout:2/188: dread d0/d8/f3e [0,4194304] 0 2026-03-10T07:50:41.352 INFO:tasks.workunit.client.0.vm05.stdout:6/83: dread d0/d6/f16 [0,4194304] 0 2026-03-10T07:50:41.352 INFO:tasks.workunit.client.0.vm05.stdout:2/189: truncate d0/f22 1500787 0 2026-03-10T07:50:41.352 INFO:tasks.workunit.client.0.vm05.stdout:6/84: fdatasync d0/d6/f10 0 2026-03-10T07:50:41.356 INFO:tasks.workunit.client.0.vm05.stdout:6/85: getdents d0/d11 0 2026-03-10T07:50:41.361 INFO:tasks.workunit.client.0.vm05.stdout:6/86: dwrite d0/d6/fe [0,4194304] 0 2026-03-10T07:50:41.369 INFO:tasks.workunit.client.0.vm05.stdout:6/87: unlink d0/d6/fe 0 2026-03-10T07:50:41.370 INFO:tasks.workunit.client.0.vm05.stdout:6/88: dread d0/d11/f13 [0,4194304] 0 2026-03-10T07:50:41.376 INFO:tasks.workunit.client.0.vm05.stdout:2/190: sync 2026-03-10T07:50:41.376 INFO:tasks.workunit.client.0.vm05.stdout:0/82: sync 2026-03-10T07:50:41.376 INFO:tasks.workunit.client.0.vm05.stdout:6/89: symlink d0/d6/l1b 0 2026-03-10T07:50:41.376 INFO:tasks.workunit.client.0.vm05.stdout:7/125: write d1/f23 [3791957,21595] 0 2026-03-10T07:50:41.376 INFO:tasks.workunit.client.0.vm05.stdout:2/191: chown d0/d34/d38 164 1 2026-03-10T07:50:41.377 INFO:tasks.workunit.client.0.vm05.stdout:7/126: chown d1/d6 7717721 1 2026-03-10T07:50:41.384 INFO:tasks.workunit.client.0.vm05.stdout:5/112: write d2/d5/f25 [2938256,16428] 0 2026-03-10T07:50:41.385 INFO:tasks.workunit.client.0.vm05.stdout:3/129: write d8/fd [101573,11447] 0 2026-03-10T07:50:41.391 INFO:tasks.workunit.client.0.vm05.stdout:5/113: dwrite d2/d5/f10 [0,4194304] 0 2026-03-10T07:50:41.409 INFO:tasks.workunit.client.0.vm05.stdout:9/72: write f4 [4082937,26519] 0 2026-03-10T07:50:41.444 INFO:tasks.workunit.client.0.vm05.stdout:7/127: creat d1/f27 x:0 0 0 2026-03-10T07:50:41.450 INFO:tasks.workunit.client.0.vm05.stdout:1/119: truncate f4 11986490 0 2026-03-10T07:50:41.451 
INFO:tasks.workunit.client.0.vm05.stdout:1/120: read da/dd/d12/f14 [250309,110441] 0 2026-03-10T07:50:41.454 INFO:tasks.workunit.client.0.vm05.stdout:5/114: chown d2/f9 1049 1 2026-03-10T07:50:41.454 INFO:tasks.workunit.client.0.vm05.stdout:9/73: symlink d8/d16/l18 0 2026-03-10T07:50:41.456 INFO:tasks.workunit.client.0.vm05.stdout:0/83: mknod d8/dd/c1b 0 2026-03-10T07:50:41.457 INFO:tasks.workunit.client.0.vm05.stdout:0/84: chown d8/dd/de/f18 220884941 1 2026-03-10T07:50:41.458 INFO:tasks.workunit.client.0.vm05.stdout:7/128: rename d1/l1a to d1/d6/l28 0 2026-03-10T07:50:41.459 INFO:tasks.workunit.client.0.vm05.stdout:7/129: chown d1/d6/f1b 10 1 2026-03-10T07:50:41.462 INFO:tasks.workunit.client.0.vm05.stdout:7/130: dwrite d1/f25 [0,4194304] 0 2026-03-10T07:50:41.472 INFO:tasks.workunit.client.0.vm05.stdout:1/121: symlink da/dd/d12/l1b 0 2026-03-10T07:50:41.474 INFO:tasks.workunit.client.0.vm05.stdout:5/115: rmdir d2/d5 39 2026-03-10T07:50:41.480 INFO:tasks.workunit.client.0.vm05.stdout:5/116: dread - d2/f15 zero size 2026-03-10T07:50:41.480 INFO:tasks.workunit.client.0.vm05.stdout:4/109: dwrite d0/f2 [0,4194304] 0 2026-03-10T07:50:41.482 INFO:tasks.workunit.client.0.vm05.stdout:8/91: getdents d1/dd 0 2026-03-10T07:50:41.483 INFO:tasks.workunit.client.0.vm05.stdout:4/110: dread d0/f2 [0,4194304] 0 2026-03-10T07:50:41.486 INFO:tasks.workunit.client.0.vm05.stdout:8/92: dwrite f0 [0,4194304] 0 2026-03-10T07:50:41.494 INFO:tasks.workunit.client.0.vm05.stdout:0/85: unlink d8/dd/d10/l16 0 2026-03-10T07:50:41.504 INFO:tasks.workunit.client.0.vm05.stdout:6/90: write d0/fa [8978667,77935] 0 2026-03-10T07:50:41.504 INFO:tasks.workunit.client.0.vm05.stdout:7/131: mknod d1/d6/c29 0 2026-03-10T07:50:41.504 INFO:tasks.workunit.client.0.vm05.stdout:7/132: read d1/f16 [611707,130300] 0 2026-03-10T07:50:41.504 INFO:tasks.workunit.client.0.vm05.stdout:5/117: creat d2/d20/f2c x:0 0 0 2026-03-10T07:50:41.504 INFO:tasks.workunit.client.0.vm05.stdout:4/111: rename d0/d6/c1f to d0/d20/c22 
0 2026-03-10T07:50:41.509 INFO:tasks.workunit.client.0.vm05.stdout:7/133: creat d1/d6/f2a x:0 0 0 2026-03-10T07:50:41.510 INFO:tasks.workunit.client.0.vm05.stdout:3/130: getdents d8/d16/d19 0 2026-03-10T07:50:41.513 INFO:tasks.workunit.client.0.vm05.stdout:5/118: mkdir d2/d12/d2d 0 2026-03-10T07:50:41.520 INFO:tasks.workunit.client.0.vm05.stdout:6/91: sync 2026-03-10T07:50:41.520 INFO:tasks.workunit.client.0.vm05.stdout:6/92: readlink d0/lf 0 2026-03-10T07:50:41.523 INFO:tasks.workunit.client.0.vm05.stdout:3/131: mkdir d8/d1f/d24 0 2026-03-10T07:50:41.527 INFO:tasks.workunit.client.0.vm05.stdout:5/119: dwrite d2/d5/f1e [0,4194304] 0 2026-03-10T07:50:41.528 INFO:tasks.workunit.client.0.vm05.stdout:4/112: creat d0/f23 x:0 0 0 2026-03-10T07:50:41.528 INFO:tasks.workunit.client.0.vm05.stdout:3/132: creat d8/f25 x:0 0 0 2026-03-10T07:50:41.534 INFO:tasks.workunit.client.0.vm05.stdout:3/133: dread d8/fb [0,4194304] 0 2026-03-10T07:50:41.534 INFO:tasks.workunit.client.0.vm05.stdout:5/120: creat d2/d5/f2e x:0 0 0 2026-03-10T07:50:41.536 INFO:tasks.workunit.client.0.vm05.stdout:4/113: stat d0/d6/d10 0 2026-03-10T07:50:41.538 INFO:tasks.workunit.client.0.vm05.stdout:3/134: sync 2026-03-10T07:50:41.550 INFO:tasks.workunit.client.0.vm05.stdout:4/114: creat d0/f24 x:0 0 0 2026-03-10T07:50:41.551 INFO:tasks.workunit.client.0.vm05.stdout:3/135: link d8/l9 d8/d16/l26 0 2026-03-10T07:50:41.553 INFO:tasks.workunit.client.0.vm05.stdout:3/136: fdatasync f4 0 2026-03-10T07:50:41.554 INFO:tasks.workunit.client.0.vm05.stdout:4/115: rmdir d0/d6/d9/d12 39 2026-03-10T07:50:41.554 INFO:tasks.workunit.client.0.vm05.stdout:4/116: chown d0 5500 1 2026-03-10T07:50:41.556 INFO:tasks.workunit.client.0.vm05.stdout:4/117: chown d0/d6/f15 422967 1 2026-03-10T07:50:41.557 INFO:tasks.workunit.client.0.vm05.stdout:4/118: truncate d0/f23 1043123 0 2026-03-10T07:50:41.559 INFO:tasks.workunit.client.0.vm05.stdout:3/137: link f4 d8/d16/d19/f27 0 2026-03-10T07:50:41.560 
INFO:tasks.workunit.client.0.vm05.stdout:4/119: creat d0/d6/d10/f25 x:0 0 0 2026-03-10T07:50:41.562 INFO:tasks.workunit.client.0.vm05.stdout:3/138: mknod d8/d16/c28 0 2026-03-10T07:50:41.563 INFO:tasks.workunit.client.0.vm05.stdout:3/139: write d8/d16/f1e [444452,90838] 0 2026-03-10T07:50:41.563 INFO:tasks.workunit.client.0.vm05.stdout:4/120: dwrite d0/d17/f19 [0,4194304] 0 2026-03-10T07:50:41.567 INFO:tasks.workunit.client.0.vm05.stdout:4/121: mkdir d0/d20/d26 0 2026-03-10T07:50:41.569 INFO:tasks.workunit.client.0.vm05.stdout:3/140: creat d8/d22/f29 x:0 0 0 2026-03-10T07:50:41.571 INFO:tasks.workunit.client.0.vm05.stdout:6/93: fsync d0/fa 0 2026-03-10T07:50:41.572 INFO:tasks.workunit.client.0.vm05.stdout:2/192: write d0/f5 [4134871,8333] 0 2026-03-10T07:50:41.575 INFO:tasks.workunit.client.0.vm05.stdout:6/94: creat d0/d11/f1c x:0 0 0 2026-03-10T07:50:41.580 INFO:tasks.workunit.client.0.vm05.stdout:2/193: link d0/d2a/d2f/f36 d0/d8/f42 0 2026-03-10T07:50:41.580 INFO:tasks.workunit.client.0.vm05.stdout:2/194: chown d0/d34/df/d25/f2b 558939362 1 2026-03-10T07:50:41.582 INFO:tasks.workunit.client.0.vm05.stdout:6/95: creat d0/d6/f1d x:0 0 0 2026-03-10T07:50:41.592 INFO:tasks.workunit.client.0.vm05.stdout:9/74: dwrite f6 [0,4194304] 0 2026-03-10T07:50:41.593 INFO:tasks.workunit.client.0.vm05.stdout:9/75: write d8/f9 [2314452,30235] 0 2026-03-10T07:50:41.595 INFO:tasks.workunit.client.0.vm05.stdout:0/86: rmdir d8/dd/d10 39 2026-03-10T07:50:41.598 INFO:tasks.workunit.client.0.vm05.stdout:2/195: rmdir d0/d3c 0 2026-03-10T07:50:41.600 INFO:tasks.workunit.client.0.vm05.stdout:6/96: sync 2026-03-10T07:50:41.604 INFO:tasks.workunit.client.0.vm05.stdout:0/87: creat d8/f1c x:0 0 0 2026-03-10T07:50:41.604 INFO:tasks.workunit.client.0.vm05.stdout:0/88: dread - d8/dd/de/f18 zero size 2026-03-10T07:50:41.605 INFO:tasks.workunit.client.0.vm05.stdout:0/89: chown d8/dd/de/f18 2042065 1 2026-03-10T07:50:41.605 INFO:tasks.workunit.client.0.vm05.stdout:0/90: stat d8/fb 0 
2026-03-10T07:50:41.608 INFO:tasks.workunit.client.0.vm05.stdout:2/196: dwrite d0/d8/f1c [0,4194304] 0 2026-03-10T07:50:41.609 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:50:41 vm08.local ceph-mon[59917]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' 2026-03-10T07:50:41.609 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:50:41 vm08.local ceph-mon[59917]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' 2026-03-10T07:50:41.610 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:50:41 vm08.local ceph-mon[59917]: pgmap v10: 65 pgs: 65 active+clean; 253 MiB data, 1.4 GiB used, 119 GiB / 120 GiB avail; 119 KiB/s rd, 11 MiB/s wr, 199 op/s 2026-03-10T07:50:41.617 INFO:tasks.workunit.client.0.vm05.stdout:2/197: fsync d0/f1 0 2026-03-10T07:50:41.618 INFO:tasks.workunit.client.0.vm05.stdout:2/198: dread - d0/d34/df/d25/f2b zero size 2026-03-10T07:50:41.627 INFO:tasks.workunit.client.0.vm05.stdout:0/91: mknod d8/dd/d10/c1d 0 2026-03-10T07:50:41.634 INFO:tasks.workunit.client.0.vm05.stdout:2/199: rmdir d0/d34/df/d25/d37 0 2026-03-10T07:50:41.640 INFO:tasks.workunit.client.0.vm05.stdout:2/200: getdents d0/d34/d38 0 2026-03-10T07:50:41.647 INFO:tasks.workunit.client.0.vm05.stdout:1/122: truncate da/dd/d12/f14 1897424 0 2026-03-10T07:50:41.647 INFO:tasks.workunit.client.0.vm05.stdout:1/123: fdatasync da/dd/f11 0 2026-03-10T07:50:41.648 INFO:tasks.workunit.client.0.vm05.stdout:1/124: dread - da/dd/d12/d19/f1a zero size 2026-03-10T07:50:41.648 INFO:tasks.workunit.client.0.vm05.stdout:5/121: getdents d2/d20 0 2026-03-10T07:50:41.649 INFO:tasks.workunit.client.0.vm05.stdout:5/122: chown d2/d5/c27 154 1 2026-03-10T07:50:41.657 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:50:41 vm05.local ceph-mon[50387]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' 2026-03-10T07:50:41.657 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:50:41 vm05.local ceph-mon[50387]: from='mgr.14628 
192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' 2026-03-10T07:50:41.657 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:50:41 vm05.local ceph-mon[50387]: pgmap v10: 65 pgs: 65 active+clean; 253 MiB data, 1.4 GiB used, 119 GiB / 120 GiB avail; 119 KiB/s rd, 11 MiB/s wr, 199 op/s 2026-03-10T07:50:41.658 INFO:tasks.workunit.client.0.vm05.stdout:5/123: sync 2026-03-10T07:50:41.659 INFO:tasks.workunit.client.0.vm05.stdout:5/124: readlink d2/d5/l1d 0 2026-03-10T07:50:41.663 INFO:tasks.workunit.client.0.vm05.stdout:5/125: dwrite d2/d20/f2c [0,4194304] 0 2026-03-10T07:50:41.665 INFO:tasks.workunit.client.0.vm05.stdout:8/93: write f0 [4774053,128460] 0 2026-03-10T07:50:41.667 INFO:tasks.workunit.client.0.vm05.stdout:5/126: creat d2/d12/f2f x:0 0 0 2026-03-10T07:50:41.667 INFO:tasks.workunit.client.0.vm05.stdout:5/127: chown d2 8335 1 2026-03-10T07:50:41.668 INFO:tasks.workunit.client.0.vm05.stdout:8/94: chown d1/dd/d18/c1a 475368 1 2026-03-10T07:50:41.676 INFO:tasks.workunit.client.0.vm05.stdout:5/128: mknod d2/d12/c30 0 2026-03-10T07:50:41.677 INFO:tasks.workunit.client.0.vm05.stdout:5/129: fsync d2/d12/f2b 0 2026-03-10T07:50:41.678 INFO:tasks.workunit.client.0.vm05.stdout:8/95: mknod d1/c1d 0 2026-03-10T07:50:41.693 INFO:tasks.workunit.client.0.vm05.stdout:7/134: getdents d1/d6 0 2026-03-10T07:50:41.693 INFO:tasks.workunit.client.0.vm05.stdout:5/130: creat d2/d20/f31 x:0 0 0 2026-03-10T07:50:41.693 INFO:tasks.workunit.client.0.vm05.stdout:5/131: readlink d2/l22 0 2026-03-10T07:50:41.693 INFO:tasks.workunit.client.0.vm05.stdout:8/96: write d1/fe [394926,47586] 0 2026-03-10T07:50:41.693 INFO:tasks.workunit.client.0.vm05.stdout:7/135: dread d1/f16 [0,4194304] 0 2026-03-10T07:50:41.693 INFO:tasks.workunit.client.0.vm05.stdout:7/136: chown d1/d6/l28 390896 1 2026-03-10T07:50:41.693 INFO:tasks.workunit.client.0.vm05.stdout:3/141: getdents d8 0 2026-03-10T07:50:41.693 INFO:tasks.workunit.client.0.vm05.stdout:3/142: chown d8/c10 7994002 1 2026-03-10T07:50:41.693 
INFO:tasks.workunit.client.0.vm05.stdout:7/137: truncate d1/d6/f2a 887374 0 2026-03-10T07:50:41.693 INFO:tasks.workunit.client.0.vm05.stdout:3/143: chown l1 130605771 1 2026-03-10T07:50:41.693 INFO:tasks.workunit.client.0.vm05.stdout:7/138: truncate d1/f21 676842 0 2026-03-10T07:50:41.693 INFO:tasks.workunit.client.0.vm05.stdout:3/144: dwrite d8/f12 [0,4194304] 0 2026-03-10T07:50:41.693 INFO:tasks.workunit.client.0.vm05.stdout:3/145: chown f3 0 1 2026-03-10T07:50:41.693 INFO:tasks.workunit.client.0.vm05.stdout:3/146: chown d8/d1f/d24 11706 1 2026-03-10T07:50:41.694 INFO:tasks.workunit.client.0.vm05.stdout:3/147: chown d8/d16/f1e 4146 1 2026-03-10T07:50:41.694 INFO:tasks.workunit.client.0.vm05.stdout:3/148: stat d8/d16/c28 0 2026-03-10T07:50:41.703 INFO:tasks.workunit.client.0.vm05.stdout:7/139: unlink d1/d6/f2a 0 2026-03-10T07:50:41.704 INFO:tasks.workunit.client.0.vm05.stdout:3/149: mkdir d8/d1f/d2a 0 2026-03-10T07:50:41.705 INFO:tasks.workunit.client.0.vm05.stdout:3/150: write d8/f25 [937621,125104] 0 2026-03-10T07:50:41.706 INFO:tasks.workunit.client.0.vm05.stdout:3/151: write d8/d16/d19/f21 [415211,117777] 0 2026-03-10T07:50:41.710 INFO:tasks.workunit.client.0.vm05.stdout:3/152: dread d8/fb [0,4194304] 0 2026-03-10T07:50:41.716 INFO:tasks.workunit.client.0.vm05.stdout:3/153: fsync d8/d22/f29 0 2026-03-10T07:50:41.716 INFO:tasks.workunit.client.0.vm05.stdout:5/132: dread d2/d5/fb [0,4194304] 0 2026-03-10T07:50:41.716 INFO:tasks.workunit.client.0.vm05.stdout:3/154: creat d8/d1c/f2b x:0 0 0 2026-03-10T07:50:41.716 INFO:tasks.workunit.client.0.vm05.stdout:5/133: unlink d2/d12/l19 0 2026-03-10T07:50:41.717 INFO:tasks.workunit.client.0.vm05.stdout:3/155: mknod d8/d1f/c2c 0 2026-03-10T07:50:41.718 INFO:tasks.workunit.client.0.vm05.stdout:5/134: creat d2/d20/f32 x:0 0 0 2026-03-10T07:50:41.719 INFO:tasks.workunit.client.0.vm05.stdout:5/135: dread - d2/d5/f23 zero size 2026-03-10T07:50:41.719 INFO:tasks.workunit.client.0.vm05.stdout:5/136: fdatasync d2/f15 0 
2026-03-10T07:50:41.720 INFO:tasks.workunit.client.0.vm05.stdout:7/140: link d1/lf d1/l2b 0 2026-03-10T07:50:41.721 INFO:tasks.workunit.client.0.vm05.stdout:7/141: chown d1/d6/f22 669625 1 2026-03-10T07:50:41.731 INFO:tasks.workunit.client.0.vm05.stdout:7/142: sync 2026-03-10T07:50:41.731 INFO:tasks.workunit.client.0.vm05.stdout:8/97: dread d1/fe [0,4194304] 0 2026-03-10T07:50:41.732 INFO:tasks.workunit.client.0.vm05.stdout:4/122: dwrite d0/d6/f15 [0,4194304] 0 2026-03-10T07:50:41.740 INFO:tasks.workunit.client.0.vm05.stdout:6/97: rmdir d0/d11 39 2026-03-10T07:50:41.740 INFO:tasks.workunit.client.0.vm05.stdout:6/98: readlink d0/l5 0 2026-03-10T07:50:41.745 INFO:tasks.workunit.client.0.vm05.stdout:8/98: dwrite d1/dd/f17 [0,4194304] 0 2026-03-10T07:50:41.746 INFO:tasks.workunit.client.0.vm05.stdout:8/99: write d1/dd/f11 [562313,74242] 0 2026-03-10T07:50:41.753 INFO:tasks.workunit.client.0.vm05.stdout:7/143: creat d1/f2c x:0 0 0 2026-03-10T07:50:41.758 INFO:tasks.workunit.client.0.vm05.stdout:4/123: stat d0/d20/c22 0 2026-03-10T07:50:41.758 INFO:tasks.workunit.client.0.vm05.stdout:9/76: truncate f7 765370 0 2026-03-10T07:50:41.758 INFO:tasks.workunit.client.0.vm05.stdout:7/144: mknod d1/c2d 0 2026-03-10T07:50:41.759 INFO:tasks.workunit.client.0.vm05.stdout:4/124: readlink d0/la 0 2026-03-10T07:50:41.762 INFO:tasks.workunit.client.0.vm05.stdout:4/125: dwrite d0/d6/f15 [0,4194304] 0 2026-03-10T07:50:41.771 INFO:tasks.workunit.client.0.vm05.stdout:0/92: truncate d8/f9 3409690 0 2026-03-10T07:50:41.772 INFO:tasks.workunit.client.0.vm05.stdout:0/93: read - d8/dd/d10/f19 zero size 2026-03-10T07:50:41.774 INFO:tasks.workunit.client.0.vm05.stdout:6/99: mkdir d0/d1e 0 2026-03-10T07:50:41.775 INFO:tasks.workunit.client.0.vm05.stdout:2/201: rename d0/d34 to d0/d8/d43 0 2026-03-10T07:50:41.775 INFO:tasks.workunit.client.0.vm05.stdout:2/202: chown d0/d8/d43/df/f18 12192 1 2026-03-10T07:50:41.776 INFO:tasks.workunit.client.0.vm05.stdout:9/77: symlink d8/l19 0 
2026-03-10T07:50:41.778 INFO:tasks.workunit.client.0.vm05.stdout:7/145: creat d1/d6/f2e x:0 0 0
2026-03-10T07:50:41.781 INFO:tasks.workunit.client.0.vm05.stdout:1/125: rmdir da 39
2026-03-10T07:50:41.781 INFO:tasks.workunit.client.0.vm05.stdout:9/78: mknod d8/d16/c1a 0
2026-03-10T07:50:41.783 INFO:tasks.workunit.client.0.vm05.stdout:7/146: truncate d1/d6/f1b 910577 0
2026-03-10T07:50:41.784 INFO:tasks.workunit.client.0.vm05.stdout:4/126: symlink d0/d6/d9/l27 0
2026-03-10T07:50:41.785 INFO:tasks.workunit.client.0.vm05.stdout:4/127: dread d0/f23 [0,4194304] 0
2026-03-10T07:50:41.786 INFO:tasks.workunit.client.0.vm05.stdout:6/100: mknod d0/d1e/c1f 0
2026-03-10T07:50:41.790 INFO:tasks.workunit.client.0.vm05.stdout:8/100: link d1/c7 d1/dd/d18/c1e 0
2026-03-10T07:50:41.790 INFO:tasks.workunit.client.0.vm05.stdout:8/101: write d1/dd/f17 [2913048,35479] 0
2026-03-10T07:50:41.792 INFO:tasks.workunit.client.0.vm05.stdout:7/147: symlink d1/l2f 0
2026-03-10T07:50:41.793 INFO:tasks.workunit.client.0.vm05.stdout:4/128: stat d0/d6/c18 0
2026-03-10T07:50:41.794 INFO:tasks.workunit.client.0.vm05.stdout:0/94: link d8/c15 d8/c1e 0
2026-03-10T07:50:41.794 INFO:tasks.workunit.client.0.vm05.stdout:0/95: chown d8/dd 0 1
2026-03-10T07:50:41.794 INFO:tasks.workunit.client.0.vm05.stdout:2/203: sync
2026-03-10T07:50:41.795 INFO:tasks.workunit.client.0.vm05.stdout:2/204: write d0/f1e [516279,41192] 0
2026-03-10T07:50:41.797 INFO:tasks.workunit.client.0.vm05.stdout:0/96: dread d8/fa [0,4194304] 0
2026-03-10T07:50:41.800 INFO:tasks.workunit.client.0.vm05.stdout:3/156: rmdir d8/d1c 39
2026-03-10T07:50:41.800 INFO:tasks.workunit.client.0.vm05.stdout:3/157: fsync d8/f25 0
2026-03-10T07:50:41.802 INFO:tasks.workunit.client.0.vm05.stdout:8/102: creat d1/dd/d18/f1f x:0 0 0
2026-03-10T07:50:41.803 INFO:tasks.workunit.client.0.vm05.stdout:5/137: truncate d2/d5/f25 8362363 0
2026-03-10T07:50:41.804 INFO:tasks.workunit.client.0.vm05.stdout:5/138: dread - d2/d12/f2f zero size
2026-03-10T07:50:41.809 INFO:tasks.workunit.client.0.vm05.stdout:8/103: dwrite d1/dd/f17 [0,4194304] 0
2026-03-10T07:50:41.811 INFO:tasks.workunit.client.0.vm05.stdout:5/139: dwrite d2/d5/f18 [0,4194304] 0
2026-03-10T07:50:41.813 INFO:tasks.workunit.client.0.vm05.stdout:5/140: readlink d2/d12/l1b 0
2026-03-10T07:50:41.816 INFO:tasks.workunit.client.0.vm05.stdout:5/141: chown d2/d5/f1e 653977 1
2026-03-10T07:50:41.816 INFO:tasks.workunit.client.0.vm05.stdout:9/79: dwrite f7 [0,4194304] 0
2026-03-10T07:50:41.817 INFO:tasks.workunit.client.0.vm05.stdout:5/142: write d2/d20/f21 [1014612,74333] 0
2026-03-10T07:50:41.836 INFO:tasks.workunit.client.0.vm05.stdout:2/205: mknod d0/d8/d43/df/d25/c44 0
2026-03-10T07:50:41.836 INFO:tasks.workunit.client.0.vm05.stdout:2/206: dread - d0/d8/d43/df/d25/f2b zero size
2026-03-10T07:50:41.836 INFO:tasks.workunit.client.0.vm05.stdout:2/207: dread - d0/f41 zero size
2026-03-10T07:50:41.837 INFO:tasks.workunit.client.0.vm05.stdout:2/208: fdatasync d0/d8/d43/f1d 0
2026-03-10T07:50:41.841 INFO:tasks.workunit.client.0.vm05.stdout:1/126: symlink da/l1c 0
2026-03-10T07:50:41.842 INFO:tasks.workunit.client.0.vm05.stdout:3/158: write d8/d1c/f2b [523605,997] 0
2026-03-10T07:50:41.846 INFO:tasks.workunit.client.0.vm05.stdout:8/104: mkdir d1/dd/d18/d20 0
2026-03-10T07:50:41.847 INFO:tasks.workunit.client.0.vm05.stdout:9/80: creat d8/f1b x:0 0 0
2026-03-10T07:50:41.849 INFO:tasks.workunit.client.0.vm05.stdout:9/81: dread f7 [0,4194304] 0
2026-03-10T07:50:41.850 INFO:tasks.workunit.client.0.vm05.stdout:5/143: unlink d2/d5/f2e 0
2026-03-10T07:50:41.851 INFO:tasks.workunit.client.0.vm05.stdout:3/159: dread d8/d16/f1e [0,4194304] 0
2026-03-10T07:50:41.853 INFO:tasks.workunit.client.0.vm05.stdout:7/148: rename d1/lf to d1/d6/l30 0
2026-03-10T07:50:41.854 INFO:tasks.workunit.client.0.vm05.stdout:7/149: dread - d1/d6/f22 zero size
2026-03-10T07:50:41.854 INFO:tasks.workunit.client.0.vm05.stdout:7/150: chown d1/d6/f15 14074 1
2026-03-10T07:50:41.858 INFO:tasks.workunit.client.0.vm05.stdout:4/129: mkdir d0/d28 0
2026-03-10T07:50:41.858 INFO:tasks.workunit.client.0.vm05.stdout:4/130: fdatasync d0/d17/f19 0
2026-03-10T07:50:41.858 INFO:tasks.workunit.client.0.vm05.stdout:4/131: write d0/f24 [658886,87228] 0
2026-03-10T07:50:41.858 INFO:tasks.workunit.client.0.vm05.stdout:2/209: creat d0/d2a/f45 x:0 0 0
2026-03-10T07:50:41.858 INFO:tasks.workunit.client.0.vm05.stdout:2/210: chown d0/d8/l24 368744876 1
2026-03-10T07:50:41.861 INFO:tasks.workunit.client.0.vm05.stdout:8/105: creat d1/dd/d18/f21 x:0 0 0
2026-03-10T07:50:41.861 INFO:tasks.workunit.client.0.vm05.stdout:9/82: mkdir d8/d16/d1c 0
2026-03-10T07:50:41.863 INFO:tasks.workunit.client.0.vm05.stdout:5/144: mkdir d2/d20/d33 0
2026-03-10T07:50:41.866 INFO:tasks.workunit.client.0.vm05.stdout:7/151: creat d1/d6/f31 x:0 0 0
2026-03-10T07:50:41.867 INFO:tasks.workunit.client.0.vm05.stdout:6/101: rename d0/c2 to d0/c20 0
2026-03-10T07:50:41.867 INFO:tasks.workunit.client.0.vm05.stdout:6/102: fsync d0/d6/f16 0
2026-03-10T07:50:41.868 INFO:tasks.workunit.client.0.vm05.stdout:2/211: rename d0/d8/f3e to d0/d8/d43/df/d25/f46 0
2026-03-10T07:50:41.868 INFO:tasks.workunit.client.0.vm05.stdout:2/212: chown d0/d8/d43/df/d25/c44 348830 1
2026-03-10T07:50:41.873 INFO:tasks.workunit.client.0.vm05.stdout:2/213: dwrite d0/d8/f1c [0,4194304] 0
2026-03-10T07:50:41.881 INFO:tasks.workunit.client.0.vm05.stdout:8/106: creat d1/dd/d18/f22 x:0 0 0
2026-03-10T07:50:41.884 INFO:tasks.workunit.client.0.vm05.stdout:5/145: mknod d2/d12/c34 0
2026-03-10T07:50:41.886 INFO:tasks.workunit.client.0.vm05.stdout:4/132: creat d0/d20/d26/f29 x:0 0 0
2026-03-10T07:50:41.891 INFO:tasks.workunit.client.0.vm05.stdout:0/97: getdents d8 0
2026-03-10T07:50:41.891 INFO:tasks.workunit.client.0.vm05.stdout:6/103: read - d0/d11/f1c zero size
2026-03-10T07:50:41.892 INFO:tasks.workunit.client.0.vm05.stdout:6/104: dwrite d0/d6/f1a [0,4194304] 0
2026-03-10T07:50:41.895 INFO:tasks.workunit.client.0.vm05.stdout:3/160: truncate d8/f25 134508 0
2026-03-10T07:50:41.900 INFO:tasks.workunit.client.0.vm05.stdout:4/133: creat d0/d20/f2a x:0 0 0
2026-03-10T07:50:41.901 INFO:tasks.workunit.client.0.vm05.stdout:4/134: fsync d0/d17/f19 0
2026-03-10T07:50:41.901 INFO:tasks.workunit.client.0.vm05.stdout:8/107: dwrite d1/f15 [0,4194304] 0
2026-03-10T07:50:41.907 INFO:tasks.workunit.client.0.vm05.stdout:8/108: dread d1/dd/f19 [0,4194304] 0
2026-03-10T07:50:41.908 INFO:tasks.workunit.client.0.vm05.stdout:8/109: write d1/dd/f19 [3856057,59809] 0
2026-03-10T07:50:41.910 INFO:tasks.workunit.client.0.vm05.stdout:2/214: mkdir d0/d47 0
2026-03-10T07:50:41.911 INFO:tasks.workunit.client.0.vm05.stdout:2/215: write d0/f41 [309289,71291] 0
2026-03-10T07:50:41.911 INFO:tasks.workunit.client.0.vm05.stdout:2/216: chown d0/f41 7111 1
2026-03-10T07:50:41.912 INFO:tasks.workunit.client.0.vm05.stdout:3/161: creat d8/d16/f2d x:0 0 0
2026-03-10T07:50:41.916 INFO:tasks.workunit.client.0.vm05.stdout:3/162: dwrite d8/d1c/f23 [0,4194304] 0
2026-03-10T07:50:41.918 INFO:tasks.workunit.client.0.vm05.stdout:4/135: chown d0/d6/ld 147923 1
2026-03-10T07:50:41.925 INFO:tasks.workunit.client.0.vm05.stdout:8/110: dwrite d1/fa [0,4194304] 0
2026-03-10T07:50:41.926 INFO:tasks.workunit.client.0.vm05.stdout:8/111: dread d1/fe [0,4194304] 0
2026-03-10T07:50:41.930 INFO:tasks.workunit.client.0.vm05.stdout:5/146: link d2/d12/l1b d2/l35 0
2026-03-10T07:50:41.930 INFO:tasks.workunit.client.0.vm05.stdout:4/136: rename d0/d6/d9/d12/l1c to d0/d6/d9/d12/l2b 0
2026-03-10T07:50:41.930 INFO:tasks.workunit.client.0.vm05.stdout:2/217: creat d0/d47/f48 x:0 0 0
2026-03-10T07:50:41.939 INFO:tasks.workunit.client.0.vm05.stdout:3/163: rename d8/d16/d19/f1d to d8/d16/d19/f2e 0
2026-03-10T07:50:41.939 INFO:tasks.workunit.client.0.vm05.stdout:4/137: dread - d0/d20/d26/f29 zero size
2026-03-10T07:50:41.939 INFO:tasks.workunit.client.0.vm05.stdout:2/218: mkdir d0/d47/d49 0
2026-03-10T07:50:41.939 INFO:tasks.workunit.client.0.vm05.stdout:4/138: creat d0/d17/f2c x:0 0 0
2026-03-10T07:50:41.939 INFO:tasks.workunit.client.0.vm05.stdout:2/219: write d0/f7 [2908652,106051] 0
2026-03-10T07:50:41.939 INFO:tasks.workunit.client.0.vm05.stdout:8/112: dwrite d1/fa [0,4194304] 0
2026-03-10T07:50:41.939 INFO:tasks.workunit.client.0.vm05.stdout:8/113: chown d1/dd/d18 5142328 1
2026-03-10T07:50:41.939 INFO:tasks.workunit.client.0.vm05.stdout:8/114: unlink d1/dd/d18/f1c 0
2026-03-10T07:50:41.940 INFO:tasks.workunit.client.0.vm05.stdout:2/220: symlink d0/l4a 0
2026-03-10T07:50:41.943 INFO:tasks.workunit.client.0.vm05.stdout:8/115: dread d1/fa [0,4194304] 0
2026-03-10T07:50:41.944 INFO:tasks.workunit.client.0.vm05.stdout:8/116: write d1/dd/f19 [3244461,58056] 0
2026-03-10T07:50:41.945 INFO:tasks.workunit.client.0.vm05.stdout:8/117: stat d1/cb 0
2026-03-10T07:50:41.946 INFO:tasks.workunit.client.0.vm05.stdout:8/118: unlink d1/dd/f19 0
2026-03-10T07:50:41.946 INFO:tasks.workunit.client.0.vm05.stdout:8/119: stat d1/cb 0
2026-03-10T07:50:41.947 INFO:tasks.workunit.client.0.vm05.stdout:8/120: mkdir d1/d23 0
2026-03-10T07:50:41.948 INFO:tasks.workunit.client.0.vm05.stdout:8/121: mknod d1/d23/c24 0
2026-03-10T07:50:41.948 INFO:tasks.workunit.client.0.vm05.stdout:8/122: write d1/dd/d18/f22 [70914,115866] 0
2026-03-10T07:50:41.949 INFO:tasks.workunit.client.0.vm05.stdout:8/123: chown d1/dd/l12 408529 1
2026-03-10T07:50:41.953 INFO:tasks.workunit.client.0.vm05.stdout:8/124: unlink d1/c13 0
2026-03-10T07:50:41.964 INFO:tasks.workunit.client.0.vm05.stdout:8/125: dwrite d1/dd/d18/f1f [0,4194304] 0
2026-03-10T07:50:41.967 INFO:tasks.workunit.client.0.vm05.stdout:8/126: truncate d1/dd/f17 4665466 0
2026-03-10T07:50:41.968 INFO:tasks.workunit.client.0.vm05.stdout:8/127: write d1/dd/d18/f22 [282112,73037] 0
2026-03-10T07:50:41.970 INFO:tasks.workunit.client.0.vm05.stdout:8/128: fdatasync d1/dd/d18/f21 0
2026-03-10T07:50:41.978 INFO:tasks.workunit.client.0.vm05.stdout:8/129: creat d1/dd/f25 x:0 0 0
2026-03-10T07:50:42.037 INFO:tasks.workunit.client.0.vm05.stdout:8/130: sync
2026-03-10T07:50:42.038 INFO:tasks.workunit.client.0.vm05.stdout:8/131: unlink f0 0
2026-03-10T07:50:42.039 INFO:tasks.workunit.client.0.vm05.stdout:8/132: symlink d1/d23/l26 0
2026-03-10T07:50:42.142 INFO:tasks.workunit.client.0.vm05.stdout:0/98: fsync d8/f9 0
2026-03-10T07:50:42.143 INFO:tasks.workunit.client.0.vm05.stdout:0/99: stat d8/fb 0
2026-03-10T07:50:42.147 INFO:tasks.workunit.client.0.vm05.stdout:1/127: dwrite da/dd/d12/f14 [0,4194304] 0
2026-03-10T07:50:42.148 INFO:tasks.workunit.client.0.vm05.stdout:0/100: fsync d8/f12 0
2026-03-10T07:50:42.151 INFO:tasks.workunit.client.0.vm05.stdout:9/83: fsync f6 0
2026-03-10T07:50:42.152 INFO:tasks.workunit.client.0.vm05.stdout:6/105: dwrite d0/d11/f13 [0,4194304] 0
2026-03-10T07:50:42.157 INFO:tasks.workunit.client.0.vm05.stdout:6/106: dread d0/f15 [0,4194304] 0
2026-03-10T07:50:42.158 INFO:tasks.workunit.client.0.vm05.stdout:6/107: readlink d0/l19 0
2026-03-10T07:50:42.165 INFO:tasks.workunit.client.0.vm05.stdout:7/152: chown d1/d6/f1b 3 1
2026-03-10T07:50:42.171 INFO:tasks.workunit.client.0.vm05.stdout:4/139: write d0/f23 [936681,37456] 0
2026-03-10T07:50:42.174 INFO:tasks.workunit.client.0.vm05.stdout:1/128: symlink da/dd/d12/d19/l1d 0
2026-03-10T07:50:42.183 INFO:tasks.workunit.client.0.vm05.stdout:6/108: creat d0/d11/f21 x:0 0 0
2026-03-10T07:50:42.186 INFO:tasks.workunit.client.0.vm05.stdout:9/84: dwrite d8/f1b [0,4194304] 0
2026-03-10T07:50:42.186 INFO:tasks.workunit.client.0.vm05.stdout:3/164: dwrite f4 [0,4194304] 0
2026-03-10T07:50:42.187 INFO:tasks.workunit.client.0.vm05.stdout:4/140: write d0/f23 [409282,19412] 0
2026-03-10T07:50:42.189 INFO:tasks.workunit.client.0.vm05.stdout:7/153: dwrite d1/d6/f2e [0,4194304] 0
2026-03-10T07:50:42.189 INFO:tasks.workunit.client.0.vm05.stdout:3/165: dread - d8/d16/f2d zero size
2026-03-10T07:50:42.189 INFO:tasks.workunit.client.0.vm05.stdout:3/166: chown d8/d16 7934329 1
2026-03-10T07:50:42.201 INFO:tasks.workunit.client.0.vm05.stdout:9/85: chown d8/l19 2632291 1
2026-03-10T07:50:42.204 INFO:tasks.workunit.client.0.vm05.stdout:2/221: dwrite d0/d8/f42 [0,4194304] 0
2026-03-10T07:50:42.207 INFO:tasks.workunit.client.0.vm05.stdout:2/222: write d0/d8/d43/df/d25/f2b [838940,100459] 0
2026-03-10T07:50:42.210 INFO:tasks.workunit.client.0.vm05.stdout:7/154: creat d1/d6/f32 x:0 0 0
2026-03-10T07:50:42.230 INFO:tasks.workunit.client.0.vm05.stdout:7/155: dread - d1/d6/f22 zero size
2026-03-10T07:50:42.238 INFO:tasks.workunit.client.0.vm05.stdout:1/129: mknod da/dd/c1e 0
2026-03-10T07:50:42.238 INFO:tasks.workunit.client.0.vm05.stdout:6/109: dwrite d0/d6/f1a [0,4194304] 0
2026-03-10T07:50:42.238 INFO:tasks.workunit.client.0.vm05.stdout:1/130: write da/dd/d12/f18 [450471,96489] 0
2026-03-10T07:50:42.238 INFO:tasks.workunit.client.0.vm05.stdout:6/110: write d0/fa [2308913,40774] 0
2026-03-10T07:50:42.238 INFO:tasks.workunit.client.0.vm05.stdout:0/101: link d8/dd/de/l17 d8/l1f 0
2026-03-10T07:50:42.238 INFO:tasks.workunit.client.0.vm05.stdout:9/86: fdatasync d8/f12 0
2026-03-10T07:50:42.238 INFO:tasks.workunit.client.0.vm05.stdout:3/167: creat d8/d1f/f2f x:0 0 0
2026-03-10T07:50:42.238 INFO:tasks.workunit.client.0.vm05.stdout:3/168: dwrite f3 [0,4194304] 0
2026-03-10T07:50:42.238 INFO:tasks.workunit.client.0.vm05.stdout:7/156: creat d1/f33 x:0 0 0
2026-03-10T07:50:42.238 INFO:tasks.workunit.client.0.vm05.stdout:0/102: chown d8/c15 7508 1
2026-03-10T07:50:42.238 INFO:tasks.workunit.client.0.vm05.stdout:6/111: readlink d0/d11/l14 0
2026-03-10T07:50:42.238 INFO:tasks.workunit.client.0.vm05.stdout:6/112: dread - d0/d6/f10 zero size
2026-03-10T07:50:42.244 INFO:tasks.workunit.client.0.vm05.stdout:9/87: creat d8/d16/f1d x:0 0 0
2026-03-10T07:50:42.244 INFO:tasks.workunit.client.0.vm05.stdout:4/141: link d0/c5 d0/d20/c2d 0
2026-03-10T07:50:42.244 INFO:tasks.workunit.client.0.vm05.stdout:1/131: fdatasync f4 0
2026-03-10T07:50:42.245 INFO:tasks.workunit.client.0.vm05.stdout:3/169: mknod d8/d1f/c30 0
2026-03-10T07:50:42.245 INFO:tasks.workunit.client.0.vm05.stdout:3/170: write f3 [2300660,109690] 0
2026-03-10T07:50:42.247 INFO:tasks.workunit.client.0.vm05.stdout:1/132: write da/dd/f11 [416483,92970] 0
2026-03-10T07:50:42.258 INFO:tasks.workunit.client.0.vm05.stdout:7/157: mkdir d1/d34 0
2026-03-10T07:50:42.258 INFO:tasks.workunit.client.0.vm05.stdout:7/158: chown d1/f21 79 1
2026-03-10T07:50:42.258 INFO:tasks.workunit.client.0.vm05.stdout:7/159: readlink d1/l14 0
2026-03-10T07:50:42.258 INFO:tasks.workunit.client.0.vm05.stdout:9/88: dread d8/f15 [0,4194304] 0
2026-03-10T07:50:42.258 INFO:tasks.workunit.client.0.vm05.stdout:0/103: readlink l7 0
2026-03-10T07:50:42.258 INFO:tasks.workunit.client.0.vm05.stdout:6/113: mkdir d0/d11/d22 0
2026-03-10T07:50:42.258 INFO:tasks.workunit.client.0.vm05.stdout:1/133: mknod da/dd/d12/d19/c1f 0
2026-03-10T07:50:42.258 INFO:tasks.workunit.client.0.vm05.stdout:6/114: write d0/f8 [2153639,59299] 0
2026-03-10T07:50:42.260 INFO:tasks.workunit.client.0.vm05.stdout:1/134: dwrite da/dd/d12/f16 [0,4194304] 0
2026-03-10T07:50:42.265 INFO:tasks.workunit.client.0.vm05.stdout:6/115: dwrite d0/d11/f21 [0,4194304] 0
2026-03-10T07:50:42.265 INFO:tasks.workunit.client.0.vm05.stdout:2/223: sync
2026-03-10T07:50:42.269 INFO:tasks.workunit.client.0.vm05.stdout:6/116: dread - d0/d6/f10 zero size
2026-03-10T07:50:42.273 INFO:tasks.workunit.client.0.vm05.stdout:0/104: creat d8/f20 x:0 0 0
2026-03-10T07:50:42.283 INFO:tasks.workunit.client.0.vm05.stdout:1/135: mkdir da/dd/d12/d19/d20 0
2026-03-10T07:50:42.283 INFO:tasks.workunit.client.0.vm05.stdout:1/136: readlink da/dd/d12/l1b 0
2026-03-10T07:50:42.283 INFO:tasks.workunit.client.0.vm05.stdout:2/224: dwrite d0/f1e [0,4194304] 0
2026-03-10T07:50:42.283 INFO:tasks.workunit.client.0.vm05.stdout:0/105: dread d8/dd/d10/f11 [0,4194304] 0
2026-03-10T07:50:42.286 INFO:tasks.workunit.client.0.vm05.stdout:2/225: dread d0/d8/d43/df/f20 [0,4194304] 0
2026-03-10T07:50:42.295 INFO:tasks.workunit.client.0.vm05.stdout:4/142: link d0/d20/c22 d0/d6/d9/d12/c2e 0
2026-03-10T07:50:42.301 INFO:tasks.workunit.client.0.vm05.stdout:4/143: dwrite d0/d6/f15 [0,4194304] 0
2026-03-10T07:50:42.309 INFO:tasks.workunit.client.0.vm05.stdout:3/171: creat d8/f31 x:0 0 0
2026-03-10T07:50:42.310 INFO:tasks.workunit.client.0.vm05.stdout:4/144: dwrite d0/d17/f2c [0,4194304] 0
2026-03-10T07:50:42.326 INFO:tasks.workunit.client.0.vm05.stdout:0/106: mknod d8/c21 0
2026-03-10T07:50:42.338 INFO:tasks.workunit.client.0.vm05.stdout:1/137: mknod da/dd/d12/d19/c21 0
2026-03-10T07:50:42.339 INFO:tasks.workunit.client.0.vm05.stdout:1/138: write da/fb [3215923,5884] 0
2026-03-10T07:50:42.348 INFO:tasks.workunit.client.0.vm05.stdout:2/226: mknod d0/d8/d43/c4b 0
2026-03-10T07:50:42.348 INFO:tasks.workunit.client.0.vm05.stdout:2/227: stat d0/l4a 0
2026-03-10T07:50:42.348 INFO:tasks.workunit.client.0.vm05.stdout:2/228: chown d0/d2a/d2f 37437 1
2026-03-10T07:50:42.361 INFO:tasks.workunit.client.0.vm05.stdout:8/133: truncate d1/dd/f11 1296250 0
2026-03-10T07:50:42.368 INFO:tasks.workunit.client.0.vm05.stdout:4/145: creat d0/d6/d10/f2f x:0 0 0
2026-03-10T07:50:42.368 INFO:tasks.workunit.client.0.vm05.stdout:4/146: write d0/d6/f15 [4325124,78100] 0
2026-03-10T07:50:42.371 INFO:tasks.workunit.client.0.vm05.stdout:0/107: fdatasync d8/fb 0
2026-03-10T07:50:42.373 INFO:tasks.workunit.client.0.vm05.stdout:9/89: getdents d8 0
2026-03-10T07:50:42.373 INFO:tasks.workunit.client.0.vm05.stdout:9/90: chown d8/d16/c1a 348691793 1
2026-03-10T07:50:42.379 INFO:tasks.workunit.client.0.vm05.stdout:1/139: unlink da/l1c 0
2026-03-10T07:50:42.384 INFO:tasks.workunit.client.0.vm05.stdout:2/229: creat d0/d47/f4c x:0 0 0
2026-03-10T07:50:42.388 INFO:tasks.workunit.client.0.vm05.stdout:6/117: creat d0/f23 x:0 0 0
2026-03-10T07:50:42.389 INFO:tasks.workunit.client.0.vm05.stdout:4/147: write d0/f2 [2653900,24528] 0
2026-03-10T07:50:42.390 INFO:tasks.workunit.client.0.vm05.stdout:4/148: read - d0/d20/f2a zero size
2026-03-10T07:50:42.392 INFO:tasks.workunit.client.0.vm05.stdout:1/140: creat da/dd/d12/f22 x:0 0 0
2026-03-10T07:50:42.396 INFO:tasks.workunit.client.0.vm05.stdout:2/230: dwrite d0/d8/d43/df/f20 [4194304,4194304] 0
2026-03-10T07:50:42.399 INFO:tasks.workunit.client.0.vm05.stdout:6/118: creat d0/d6/f24 x:0 0 0
2026-03-10T07:50:42.399 INFO:tasks.workunit.client.0.vm05.stdout:4/149: symlink d0/d6/d10/l30 0
2026-03-10T07:50:42.404 INFO:tasks.workunit.client.0.vm05.stdout:9/91: creat d8/d16/d1c/f1e x:0 0 0
2026-03-10T07:50:42.406 INFO:tasks.workunit.client.0.vm05.stdout:4/150: mknod d0/d17/c31 0
2026-03-10T07:50:42.413 INFO:tasks.workunit.client.0.vm05.stdout:6/119: symlink d0/d11/l25 0
2026-03-10T07:50:42.413 INFO:tasks.workunit.client.0.vm05.stdout:2/231: mkdir d0/d8/d43/df/d4d 0
2026-03-10T07:50:42.413 INFO:tasks.workunit.client.0.vm05.stdout:2/232: write d0/d8/d43/df/f18 [480988,96294] 0
2026-03-10T07:50:42.413 INFO:tasks.workunit.client.0.vm05.stdout:0/108: creat d8/dd/f22 x:0 0 0
2026-03-10T07:50:42.413 INFO:tasks.workunit.client.0.vm05.stdout:0/109: write d8/ff [166339,70242] 0
2026-03-10T07:50:42.414 INFO:tasks.workunit.client.0.vm05.stdout:4/151: dwrite d0/d6/f15 [0,4194304] 0
2026-03-10T07:50:42.416 INFO:tasks.workunit.client.0.vm05.stdout:4/152: write d0/d6/f15 [1087541,7190] 0
2026-03-10T07:50:42.417 INFO:tasks.workunit.client.0.vm05.stdout:2/233: dread d0/d8/f2d [0,4194304] 0
2026-03-10T07:50:42.421 INFO:tasks.workunit.client.0.vm05.stdout:1/141: symlink da/l23 0
2026-03-10T07:50:42.421 INFO:tasks.workunit.client.0.vm05.stdout:1/142: stat da/l15 0
2026-03-10T07:50:42.425 INFO:tasks.workunit.client.0.vm05.stdout:2/234: dwrite d0/d8/d3d/f40 [0,4194304] 0
2026-03-10T07:50:42.448 INFO:tasks.workunit.client.0.vm05.stdout:0/110: symlink d8/l23 0
2026-03-10T07:50:42.459 INFO:tasks.workunit.client.0.vm05.stdout:1/143: unlink c1 0
2026-03-10T07:50:42.468 INFO:tasks.workunit.client.0.vm05.stdout:2/235: mkdir d0/d8/d43/df/d4e 0
2026-03-10T07:50:42.469 INFO:tasks.workunit.client.0.vm05.stdout:1/144: dread da/fc [0,4194304] 0
2026-03-10T07:50:42.472 INFO:tasks.workunit.client.0.vm05.stdout:6/120: creat d0/f26 x:0 0 0
2026-03-10T07:50:42.475 INFO:tasks.workunit.client.0.vm05.stdout:6/121: dwrite d0/d6/f1d [0,4194304] 0
2026-03-10T07:50:42.477 INFO:tasks.workunit.client.0.vm05.stdout:6/122: dread - d0/d11/f1c zero size
2026-03-10T07:50:42.479 INFO:tasks.workunit.client.0.vm05.stdout:6/123: read - d0/d6/f10 zero size
2026-03-10T07:50:42.480 INFO:tasks.workunit.client.0.vm05.stdout:4/153: rename d0/d20/d26/f29 to d0/d6/f32 0
2026-03-10T07:50:42.482 INFO:tasks.workunit.client.0.vm05.stdout:4/154: read - d0/d6/d10/f25 zero size
2026-03-10T07:50:42.486 INFO:tasks.workunit.client.0.vm05.stdout:6/124: mkdir d0/d6/d27 0
2026-03-10T07:50:42.487 INFO:tasks.workunit.client.0.vm05.stdout:4/155: dread d0/f23 [0,4194304] 0
2026-03-10T07:50:42.491 INFO:tasks.workunit.client.0.vm05.stdout:4/156: dread d0/d6/f15 [0,4194304] 0
2026-03-10T07:50:42.491 INFO:tasks.workunit.client.0.vm05.stdout:4/157: chown d0/c1b 908116898 1
2026-03-10T07:50:42.491 INFO:tasks.workunit.client.0.vm05.stdout:4/158: write d0/d17/f2c [511281,98067] 0
2026-03-10T07:50:42.492 INFO:tasks.workunit.client.0.vm05.stdout:4/159: chown d0/d17/f19 508730078 1
2026-03-10T07:50:42.503 INFO:tasks.workunit.client.0.vm05.stdout:0/111: link d8/c15 d8/dd/c24 0
2026-03-10T07:50:42.510 INFO:tasks.workunit.client.0.vm05.stdout:4/160: creat d0/d28/f33 x:0 0 0
2026-03-10T07:50:42.511 INFO:tasks.workunit.client.0.vm05.stdout:0/112: symlink d8/dd/d10/l25 0
2026-03-10T07:50:42.511 INFO:tasks.workunit.client.0.vm05.stdout:0/113: readlink d8/l23 0
2026-03-10T07:50:42.512 INFO:tasks.workunit.client.0.vm05.stdout:0/114: chown d8/dd/d10/f19 6465 1
2026-03-10T07:50:42.516 INFO:tasks.workunit.client.0.vm05.stdout:6/125: rmdir d0/d6/d27 0
2026-03-10T07:50:42.522 INFO:tasks.workunit.client.0.vm05.stdout:0/115: mkdir d8/dd/d10/d26 0
2026-03-10T07:50:42.522 INFO:tasks.workunit.client.0.vm05.stdout:0/116: dwrite d8/fc [0,4194304] 0
2026-03-10T07:50:42.522 INFO:tasks.workunit.client.0.vm05.stdout:0/117: fdatasync d8/f13 0
2026-03-10T07:50:42.531 INFO:tasks.workunit.client.0.vm05.stdout:6/126: creat d0/f28 x:0 0 0
2026-03-10T07:50:42.535 INFO:tasks.workunit.client.0.vm05.stdout:6/127: dwrite d0/d6/f1d [0,4194304] 0
2026-03-10T07:50:42.542 INFO:tasks.workunit.client.0.vm05.stdout:6/128: dwrite d0/d6/f10 [0,4194304] 0
2026-03-10T07:50:42.545 INFO:tasks.workunit.client.0.vm05.stdout:6/129: chown d0/d6/f1a 5261 1
2026-03-10T07:50:42.546 INFO:tasks.workunit.client.0.vm05.stdout:0/118: rename c0 to d8/dd/c27 0
2026-03-10T07:50:42.546 INFO:tasks.workunit.client.0.vm05.stdout:6/130: chown d0/d11 0 1
2026-03-10T07:50:42.546 INFO:tasks.workunit.client.0.vm05.stdout:6/131: readlink d0/d6/l17 0
2026-03-10T07:50:42.546 INFO:tasks.workunit.client.0.vm05.stdout:0/119: write d8/dd/d10/f19 [299568,108417] 0
2026-03-10T07:50:42.548 INFO:tasks.workunit.client.0.vm05.stdout:0/120: truncate f6 645737 0
2026-03-10T07:50:42.559 INFO:tasks.workunit.client.0.vm05.stdout:0/121: rename c1 to d8/dd/de/c28 0
2026-03-10T07:50:42.564 INFO:tasks.workunit.client.0.vm05.stdout:0/122: dwrite d8/dd/d10/f19 [0,4194304] 0
2026-03-10T07:50:42.648 INFO:tasks.workunit.client.0.vm05.stdout:9/92: fsync d8/d16/f1d 0
2026-03-10T07:50:42.650 INFO:tasks.workunit.client.0.vm05.stdout:1/145: write f4 [6569054,119093] 0
2026-03-10T07:50:42.651 INFO:tasks.workunit.client.0.vm05.stdout:1/146: chown da/dd/ce 41443601 1
2026-03-10T07:50:42.660 INFO:tasks.workunit.client.0.vm05.stdout:7/160: dwrite d1/d6/f1b [0,4194304] 0
2026-03-10T07:50:42.664 INFO:tasks.workunit.client.0.vm05.stdout:9/93: rename f4 to d8/d16/f1f 0
2026-03-10T07:50:42.664 INFO:tasks.workunit.client.0.vm05.stdout:9/94: write d8/f12 [4277248,95692] 0
2026-03-10T07:50:42.675 INFO:tasks.workunit.client.0.vm05.stdout:9/95: mkdir d8/d16/d1c/d20 0
2026-03-10T07:50:42.676 INFO:tasks.workunit.client.0.vm05.stdout:1/147: mknod da/dd/d12/d19/d20/c24 0
2026-03-10T07:50:42.680 INFO:tasks.workunit.client.0.vm05.stdout:7/161: getdents d1 0
2026-03-10T07:50:42.686 INFO:tasks.workunit.client.0.vm05.stdout:1/148: fsync f4 0
2026-03-10T07:50:42.686 INFO:tasks.workunit.client.0.vm05.stdout:1/149: dread - da/dd/f13 zero size
2026-03-10T07:50:42.687 INFO:tasks.workunit.client.0.vm05.stdout:7/162: truncate d1/f11 2376237 0
2026-03-10T07:50:42.690 INFO:tasks.workunit.client.0.vm05.stdout:9/96: dread d8/fa [0,4194304] 0
2026-03-10T07:50:42.691 INFO:tasks.workunit.client.0.vm05.stdout:9/97: write d8/f12 [3710784,30581] 0
2026-03-10T07:50:42.703 INFO:tasks.workunit.client.0.vm05.stdout:7/163: unlink d1/d6/f15 0
2026-03-10T07:50:42.703 INFO:tasks.workunit.client.0.vm05.stdout:7/164: dread - d1/d6/f22 zero size
2026-03-10T07:50:42.706 INFO:tasks.workunit.client.0.vm05.stdout:7/165: dwrite d1/d6/f1b [0,4194304] 0
2026-03-10T07:50:42.708 INFO:tasks.workunit.client.0.vm05.stdout:7/166: chown d1/d6/c19 370358 1
2026-03-10T07:50:42.708 INFO:tasks.workunit.client.0.vm05.stdout:7/167: stat d1/f27 0
2026-03-10T07:50:42.716 INFO:tasks.workunit.client.0.vm05.stdout:9/98: creat d8/d16/f21 x:0 0 0
2026-03-10T07:50:42.724 INFO:tasks.workunit.client.0.vm05.stdout:9/99: mkdir d8/d16/d22 0
2026-03-10T07:50:42.729 INFO:tasks.workunit.client.0.vm05.stdout:8/134: rmdir d1/dd 39
2026-03-10T07:50:42.733 INFO:tasks.workunit.client.0.vm05.stdout:3/172: dwrite d8/ff [0,4194304] 0
2026-03-10T07:50:42.733 INFO:tasks.workunit.client.0.vm05.stdout:7/168: symlink d1/d34/l35 0
2026-03-10T07:50:42.738 INFO:tasks.workunit.client.0.vm05.stdout:9/100: creat d8/d16/f23 x:0 0 0
2026-03-10T07:50:42.739 INFO:tasks.workunit.client.0.vm05.stdout:5/147: dwrite d2/d20/f2a [0,4194304] 0
2026-03-10T07:50:42.764 INFO:tasks.workunit.client.0.vm05.stdout:7/169: creat d1/d34/f36 x:0 0 0
2026-03-10T07:50:42.764 INFO:tasks.workunit.client.0.vm05.stdout:8/135: dwrite d1/dd/f25 [0,4194304] 0
2026-03-10T07:50:42.784 INFO:tasks.workunit.client.0.vm05.stdout:3/173: creat d8/d1f/d2a/f32 x:0 0 0
2026-03-10T07:50:42.784 INFO:tasks.workunit.client.0.vm05.stdout:7/170: unlink d1/d34/f36 0
2026-03-10T07:50:42.784 INFO:tasks.workunit.client.0.vm05.stdout:9/101: rename c5 to d8/c24 0
2026-03-10T07:50:42.785 INFO:tasks.workunit.client.0.vm05.stdout:3/174: read - d8/d16/f1a zero size
2026-03-10T07:50:42.785 INFO:tasks.workunit.client.0.vm05.stdout:7/171: readlink d1/d6/l28 0
2026-03-10T07:50:42.785 INFO:tasks.workunit.client.0.vm05.stdout:7/172: dread - d1/d6/f32 zero size
2026-03-10T07:50:42.786 INFO:tasks.workunit.client.0.vm05.stdout:5/148: creat d2/d12/d2d/f36 x:0 0 0
2026-03-10T07:50:42.791 INFO:tasks.workunit.client.0.vm05.stdout:7/173: dwrite d1/f21 [0,4194304] 0
2026-03-10T07:50:42.796 INFO:tasks.workunit.client.0.vm05.stdout:8/136: symlink d1/l27 0
2026-03-10T07:50:42.798 INFO:tasks.workunit.client.0.vm05.stdout:9/102: creat d8/d16/f25 x:0 0 0
2026-03-10T07:50:42.800 INFO:tasks.workunit.client.0.vm05.stdout:3/175: mkdir d8/d1f/d2a/d33 0
2026-03-10T07:50:42.802 INFO:tasks.workunit.client.0.vm05.stdout:5/149: mknod d2/d20/c37 0
2026-03-10T07:50:42.810 INFO:tasks.workunit.client.0.vm05.stdout:7/174: creat d1/f37 x:0 0 0
2026-03-10T07:50:42.817 INFO:tasks.workunit.client.0.vm05.stdout:5/150: dwrite d2/d5/f1e [0,4194304] 0
2026-03-10T07:50:42.818 INFO:tasks.workunit.client.0.vm05.stdout:9/103: mkdir d8/d16/d1c/d26 0
2026-03-10T07:50:42.818 INFO:tasks.workunit.client.0.vm05.stdout:3/176: mkdir d8/d1f/d2a/d34 0
2026-03-10T07:50:42.818 INFO:tasks.workunit.client.0.vm05.stdout:9/104: chown d8/d16/f1f 88 1
2026-03-10T07:50:42.819 INFO:tasks.workunit.client.0.vm05.stdout:7/175: dread d1/d6/f1b [0,4194304] 0
2026-03-10T07:50:42.829 INFO:tasks.workunit.client.0.vm05.stdout:9/105: dwrite d8/d16/f23 [0,4194304] 0
2026-03-10T07:50:42.830 INFO:tasks.workunit.client.0.vm05.stdout:9/106: fsync f6 0
2026-03-10T07:50:42.839 INFO:tasks.workunit.client.0.vm05.stdout:9/107: dwrite f6 [0,4194304] 0
2026-03-10T07:50:42.848 INFO:tasks.workunit.client.0.vm05.stdout:8/137: unlink d1/dd/d18/c1e 0
2026-03-10T07:50:42.850 INFO:tasks.workunit.client.0.vm05.stdout:9/108: dwrite d8/f1b [0,4194304] 0
2026-03-10T07:50:42.855 INFO:tasks.workunit.client.0.vm05.stdout:5/151: unlink d2/d5/fb 0
2026-03-10T07:50:42.874 INFO:tasks.workunit.client.0.vm05.stdout:2/236: dwrite d0/f4 [0,4194304] 0
2026-03-10T07:50:42.875 INFO:tasks.workunit.client.0.vm05.stdout:4/161: truncate d0/d6/f15 1823056 0
2026-03-10T07:50:42.875 INFO:tasks.workunit.client.0.vm05.stdout:5/152: mkdir d2/d20/d38 0
2026-03-10T07:50:42.876 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:50:42 vm08.local ceph-mon[59917]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke'
2026-03-10T07:50:42.876 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:50:42 vm08.local ceph-mon[59917]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke'
2026-03-10T07:50:42.876 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:50:42 vm08.local ceph-mon[59917]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke'
2026-03-10T07:50:42.877 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:50:42 vm08.local ceph-mon[59917]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke'
2026-03-10T07:50:42.877 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:50:42 vm08.local ceph-mon[59917]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-10T07:50:42.877 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:50:42 vm08.local ceph-mon[59917]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
2026-03-10T07:50:42.877 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:50:42 vm08.local ceph-mon[59917]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke'
2026-03-10T07:50:42.877 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:50:42 vm08.local ceph-mon[59917]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
2026-03-10T07:50:42.877 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:50:42 vm08.local ceph-mon[59917]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch
2026-03-10T07:50:42.877 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:50:42 vm08.local ceph-mon[59917]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' cmd=[{"prefix": "versions"}]: dispatch
2026-03-10T07:50:42.877 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:50:42 vm08.local ceph-mon[59917]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke'
2026-03-10T07:50:42.877 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:50:42 vm08.local ceph-mon[59917]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mgr.vm05.blexke"}]: dispatch
2026-03-10T07:50:42.877 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:50:42 vm08.local ceph-mon[59917]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' cmd='[{"prefix": "config rm", "name": "container_image", "who": "mgr.vm05.blexke"}]': finished
2026-03-10T07:50:42.877 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:50:42 vm08.local ceph-mon[59917]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mgr.vm08.orfpog"}]: dispatch
2026-03-10T07:50:42.877 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:50:42 vm08.local ceph-mon[59917]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' cmd='[{"prefix": "config rm", "name": "container_image", "who": "mgr.vm08.orfpog"}]': finished
2026-03-10T07:50:42.877 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:50:42 vm08.local ceph-mon[59917]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' cmd=[{"prefix": "versions"}]: dispatch
2026-03-10T07:50:42.885 INFO:tasks.workunit.client.0.vm05.stdout:6/132: truncate d0/fa 1096366 0
2026-03-10T07:50:42.885 INFO:tasks.workunit.client.0.vm05.stdout:0/123: getdents d8/dd/de 0
2026-03-10T07:50:42.888 INFO:tasks.workunit.client.0.vm05.stdout:2/237: symlink d0/d8/d43/df/d25/l4f 0
2026-03-10T07:50:42.888 INFO:tasks.workunit.client.0.vm05.stdout:2/238: chown d0/d8/d43/df/d25/c44 14292601 1
2026-03-10T07:50:42.891 INFO:tasks.workunit.client.0.vm05.stdout:2/239: dread d0/d8/d43/df/d25/f2b [0,4194304] 0
2026-03-10T07:50:42.891 INFO:tasks.workunit.client.0.vm05.stdout:0/124: dwrite d8/dd/f22 [0,4194304] 0
2026-03-10T07:50:42.892 INFO:tasks.workunit.client.0.vm05.stdout:3/177: link d8/d22/f29 d8/d1c/f35 0
2026-03-10T07:50:42.897 INFO:tasks.workunit.client.0.vm05.stdout:4/162: creat d0/d20/f34 x:0 0 0
2026-03-10T07:50:42.899 INFO:tasks.workunit.client.0.vm05.stdout:1/150: write da/fc [1483090,69688] 0
2026-03-10T07:50:42.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:50:42 vm05.local ceph-mon[50387]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke'
2026-03-10T07:50:42.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:50:42 vm05.local ceph-mon[50387]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke'
2026-03-10T07:50:42.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:50:42 vm05.local ceph-mon[50387]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke'
2026-03-10T07:50:42.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:50:42 vm05.local ceph-mon[50387]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke'
2026-03-10T07:50:42.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:50:42 vm05.local ceph-mon[50387]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-10T07:50:42.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:50:42 vm05.local ceph-mon[50387]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
2026-03-10T07:50:42.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:50:42 vm05.local ceph-mon[50387]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke'
2026-03-10T07:50:42.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:50:42 vm05.local ceph-mon[50387]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
2026-03-10T07:50:42.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:50:42 vm05.local ceph-mon[50387]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch
2026-03-10T07:50:42.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:50:42 vm05.local ceph-mon[50387]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' cmd=[{"prefix": "versions"}]: dispatch
2026-03-10T07:50:42.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:50:42 vm05.local ceph-mon[50387]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke'
2026-03-10T07:50:42.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:50:42 vm05.local ceph-mon[50387]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mgr.vm05.blexke"}]: dispatch
2026-03-10T07:50:42.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:50:42 vm05.local ceph-mon[50387]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' cmd='[{"prefix": "config rm", "name": "container_image", "who": "mgr.vm05.blexke"}]': finished
2026-03-10T07:50:42.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:50:42 vm05.local ceph-mon[50387]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mgr.vm08.orfpog"}]: dispatch
2026-03-10T07:50:42.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:50:42 vm05.local ceph-mon[50387]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' cmd='[{"prefix": "config rm", "name": "container_image", "who": "mgr.vm08.orfpog"}]': finished
2026-03-10T07:50:42.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:50:42 vm05.local ceph-mon[50387]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' cmd=[{"prefix": "versions"}]: dispatch
2026-03-10T07:50:42.914 INFO:tasks.workunit.client.0.vm05.stdout:2/240: mknod d0/d2a/d2f/c50 0
2026-03-10T07:50:42.916 INFO:tasks.workunit.client.0.vm05.stdout:4/163: creat d0/d6/d9/d12/f35 x:0 0 0
2026-03-10T07:50:42.924 INFO:tasks.workunit.client.0.vm05.stdout:8/138: link d1/c1d d1/dd/d18/d20/c28 0
2026-03-10T07:50:42.933 INFO:tasks.workunit.client.0.vm05.stdout:1/151: write da/dd/d12/f18 [1303270,89973] 0
2026-03-10T07:50:42.935 INFO:tasks.workunit.client.0.vm05.stdout:1/152: write da/dd/d12/d19/f1a [223302,56412] 0
2026-03-10T07:50:42.941 INFO:tasks.workunit.client.0.vm05.stdout:4/164: creat d0/d6/d9/d12/f36 x:0 0 0
2026-03-10T07:50:42.943 INFO:tasks.workunit.client.0.vm05.stdout:9/109: getdents d8 0
2026-03-10T07:50:42.955 INFO:tasks.workunit.client.0.vm05.stdout:1/153: fdatasync f4 0
2026-03-10T07:50:42.955 INFO:tasks.workunit.client.0.vm05.stdout:7/176: write d1/f16 [435064,42572] 0
2026-03-10T07:50:42.958 INFO:tasks.workunit.client.0.vm05.stdout:0/125: creat d8/dd/f29 x:0 0 0
2026-03-10T07:50:42.959 INFO:tasks.workunit.client.0.vm05.stdout:2/241: symlink d0/d8/d43/d38/l51 0
2026-03-10T07:50:42.959 INFO:tasks.workunit.client.0.vm05.stdout:2/242: fsync d0/f5 0
2026-03-10T07:50:42.960 INFO:tasks.workunit.client.0.vm05.stdout:2/243: chown d0/d2a/d2f/c39 15552 1
2026-03-10T07:50:42.961 INFO:tasks.workunit.client.0.vm05.stdout:9/110: mknod d8/d16/c27 0
2026-03-10T07:50:42.961 INFO:tasks.workunit.client.0.vm05.stdout:0/126: dread d8/ff [0,4194304] 0
2026-03-10T07:50:42.963 INFO:tasks.workunit.client.0.vm05.stdout:0/127: chown d8/dd/d10/f1a 62 1
2026-03-10T07:50:42.967 INFO:tasks.workunit.client.0.vm05.stdout:0/128: dread - d8/f20 zero size
2026-03-10T07:50:42.970 INFO:tasks.workunit.client.0.vm05.stdout:5/153: getdents d2/d5 0
2026-03-10T07:50:42.972 INFO:tasks.workunit.client.0.vm05.stdout:0/129: dwrite d8/f1c [0,4194304] 0
2026-03-10T07:50:42.979 INFO:tasks.workunit.client.0.vm05.stdout:9/111: dread d8/f14 [0,4194304] 0
2026-03-10T07:50:42.983 INFO:tasks.workunit.client.0.vm05.stdout:1/154: chown c8 295 1
2026-03-10T07:50:42.983 INFO:tasks.workunit.client.0.vm05.stdout:7/177: mknod d1/d6/c38 0
2026-03-10T07:50:42.986 INFO:tasks.workunit.client.0.vm05.stdout:4/165: mkdir d0/d6/d37 0
2026-03-10T07:50:42.990 INFO:tasks.workunit.client.0.vm05.stdout:5/154: creat d2/d12/d2d/f39 x:0 0 0
2026-03-10T07:50:42.992 INFO:tasks.workunit.client.0.vm05.stdout:5/155: dread - d2/f15 zero size
2026-03-10T07:50:42.992 INFO:tasks.workunit.client.0.vm05.stdout:7/178: unlink d1/l13 0
2026-03-10T07:50:42.993 INFO:tasks.workunit.client.0.vm05.stdout:7/179: dread - d1/f27 zero size
2026-03-10T07:50:42.995 INFO:tasks.workunit.client.0.vm05.stdout:6/133: write d0/f15 [4502501,103303] 0
2026-03-10T07:50:42.998 INFO:tasks.workunit.client.0.vm05.stdout:5/156: fsync d2/d12/d2d/f36 0
2026-03-10T07:50:42.999 INFO:tasks.workunit.client.0.vm05.stdout:2/244: mkdir d0/d52 0
2026-03-10T07:50:43.002 INFO:tasks.workunit.client.0.vm05.stdout:8/139: sync
2026-03-10T07:50:43.006 INFO:tasks.workunit.client.0.vm05.stdout:5/157: write d2/d5/f1e [428254,95948] 0
2026-03-10T07:50:43.011 INFO:tasks.workunit.client.0.vm05.stdout:4/166: symlink d0/d17/l38 0
2026-03-10T07:50:43.017 INFO:tasks.workunit.client.0.vm05.stdout:9/112: mkdir d8/d16/d1c/d26/d28 0
2026-03-10T07:50:43.017 INFO:tasks.workunit.client.0.vm05.stdout:7/180: sync
2026-03-10T07:50:43.023 INFO:tasks.workunit.client.0.vm05.stdout:3/178: truncate f4 3056684 0
2026-03-10T07:50:43.023 INFO:tasks.workunit.client.0.vm05.stdout:0/130: dread d8/f9 [0,4194304] 0
2026-03-10T07:50:43.024 INFO:tasks.workunit.client.0.vm05.stdout:0/131: write f4 [1124290,124998] 0
2026-03-10T07:50:43.026 INFO:tasks.workunit.client.0.vm05.stdout:2/245: dwrite d0/d47/f48 [0,4194304] 0
2026-03-10T07:50:43.026 INFO:tasks.workunit.client.0.vm05.stdout:7/181: dwrite d1/f27 [0,4194304] 0
2026-03-10T07:50:43.027 INFO:tasks.workunit.client.0.vm05.stdout:7/182: dread - d1/d6/f32 zero size
2026-03-10T07:50:43.047 INFO:tasks.workunit.client.0.vm05.stdout:1/155: dread f4 [8388608,4194304] 0
2026-03-10T07:50:43.047 INFO:tasks.workunit.client.0.vm05.stdout:4/167: chown d0/d6/d9/d12/c2e 234856946 1
2026-03-10T07:50:43.047 INFO:tasks.workunit.client.0.vm05.stdout:5/158: creat d2/d12/f3a x:0 0 0
2026-03-10T07:50:43.053 INFO:tasks.workunit.client.0.vm05.stdout:3/179: rename d8/d1f/c2c to d8/d1c/c36 0
2026-03-10T07:50:43.058 INFO:tasks.workunit.client.0.vm05.stdout:2/246: mkdir d0/d8/d43/df/d53 0
2026-03-10T07:50:43.058 INFO:tasks.workunit.client.0.vm05.stdout:7/183: symlink d1/d34/l39 0
2026-03-10T07:50:43.059 INFO:tasks.workunit.client.0.vm05.stdout:7/184: chown d1/d6/c19 498 1
2026-03-10T07:50:43.066 INFO:tasks.workunit.client.0.vm05.stdout:1/156: unlink da/dd/d12/f14 0
2026-03-10T07:50:43.068 INFO:tasks.workunit.client.0.vm05.stdout:6/134: rename d0/f8 to d0/f29 0
2026-03-10T07:50:43.069 INFO:tasks.workunit.client.0.vm05.stdout:1/157: chown da/l15 4900 1
2026-03-10T07:50:43.069 INFO:tasks.workunit.client.0.vm05.stdout:6/135: fdatasync d0/d6/f24 0
2026-03-10T07:50:43.083
INFO:tasks.workunit.client.0.vm05.stdout:7/185: creat d1/f3a x:0 0 0 2026-03-10T07:50:43.105 INFO:tasks.workunit.client.0.vm05.stdout:9/113: truncate d8/f9 2956648 0 2026-03-10T07:50:43.106 INFO:tasks.workunit.client.0.vm05.stdout:3/180: write d8/fb [1897556,119568] 0 2026-03-10T07:50:43.107 INFO:tasks.workunit.client.0.vm05.stdout:5/159: mknod d2/d20/d33/c3b 0 2026-03-10T07:50:43.108 INFO:tasks.workunit.client.0.vm05.stdout:6/136: creat d0/d11/f2a x:0 0 0 2026-03-10T07:50:43.108 INFO:tasks.workunit.client.0.vm05.stdout:9/114: write f7 [1528620,110403] 0 2026-03-10T07:50:43.108 INFO:tasks.workunit.client.0.vm05.stdout:6/137: chown d0/l4 7463 1 2026-03-10T07:50:43.109 INFO:tasks.workunit.client.0.vm05.stdout:5/160: write d2/d20/f2a [4809779,37288] 0 2026-03-10T07:50:43.110 INFO:tasks.workunit.client.0.vm05.stdout:3/181: truncate d8/d1f/d2a/f32 854814 0 2026-03-10T07:50:43.110 INFO:tasks.workunit.client.0.vm05.stdout:9/115: truncate d8/f14 3092573 0 2026-03-10T07:50:43.117 INFO:tasks.workunit.client.0.vm05.stdout:5/161: truncate d2/d5/f10 4763915 0 2026-03-10T07:50:43.120 INFO:tasks.workunit.client.0.vm05.stdout:6/138: dwrite d0/f15 [0,4194304] 0 2026-03-10T07:50:43.127 INFO:tasks.workunit.client.0.vm05.stdout:2/247: creat d0/d47/d49/f54 x:0 0 0 2026-03-10T07:50:43.140 INFO:tasks.workunit.client.0.vm05.stdout:8/140: getdents d1/dd/d18 0 2026-03-10T07:50:43.141 INFO:tasks.workunit.client.0.vm05.stdout:7/186: mkdir d1/d6/d3b 0 2026-03-10T07:50:43.142 INFO:tasks.workunit.client.0.vm05.stdout:4/168: creat d0/d6/f39 x:0 0 0 2026-03-10T07:50:43.149 INFO:tasks.workunit.client.0.vm05.stdout:3/182: mkdir d8/d16/d19/d37 0 2026-03-10T07:50:43.150 INFO:tasks.workunit.client.0.vm05.stdout:0/132: getdents d8/dd 0 2026-03-10T07:50:43.150 INFO:tasks.workunit.client.0.vm05.stdout:0/133: stat d8/fb 0 2026-03-10T07:50:43.151 INFO:tasks.workunit.client.0.vm05.stdout:6/139: dread d0/f29 [0,4194304] 0 2026-03-10T07:50:43.157 INFO:tasks.workunit.client.0.vm05.stdout:6/140: read d0/d11/f13 
[1610291,110626] 0 2026-03-10T07:50:43.158 INFO:tasks.workunit.client.0.vm05.stdout:5/162: creat d2/d5/f3c x:0 0 0 2026-03-10T07:50:43.159 INFO:tasks.workunit.client.0.vm05.stdout:6/141: fdatasync d0/d11/f1c 0 2026-03-10T07:50:43.160 INFO:tasks.workunit.client.0.vm05.stdout:2/248: mknod d0/d8/d43/df/c55 0 2026-03-10T07:50:43.163 INFO:tasks.workunit.client.0.vm05.stdout:2/249: chown d0/d8/d43/d38/l51 147385 1 2026-03-10T07:50:43.174 INFO:tasks.workunit.client.0.vm05.stdout:2/250: chown d0/d8/d43/df/c13 133 1 2026-03-10T07:50:43.175 INFO:tasks.workunit.client.0.vm05.stdout:8/141: write d1/fa [818136,38892] 0 2026-03-10T07:50:43.175 INFO:tasks.workunit.client.0.vm05.stdout:6/142: dwrite d0/f15 [0,4194304] 0 2026-03-10T07:50:43.175 INFO:tasks.workunit.client.0.vm05.stdout:2/251: dwrite d0/d8/f1c [4194304,4194304] 0 2026-03-10T07:50:43.175 INFO:tasks.workunit.client.0.vm05.stdout:1/158: rename da/dd/d12/d19/c1f to da/c25 0 2026-03-10T07:50:43.175 INFO:tasks.workunit.client.0.vm05.stdout:8/142: truncate d1/fa 4732369 0 2026-03-10T07:50:43.175 INFO:tasks.workunit.client.0.vm05.stdout:6/143: fdatasync d0/f26 0 2026-03-10T07:50:43.180 INFO:tasks.workunit.client.0.vm05.stdout:1/159: dread da/dd/d12/d19/f1a [0,4194304] 0 2026-03-10T07:50:43.183 INFO:tasks.workunit.client.0.vm05.stdout:7/187: mkdir d1/d3c 0 2026-03-10T07:50:43.184 INFO:tasks.workunit.client.0.vm05.stdout:7/188: readlink d1/d34/l35 0 2026-03-10T07:50:43.190 INFO:tasks.workunit.client.0.vm05.stdout:4/169: creat d0/d20/d26/f3a x:0 0 0 2026-03-10T07:50:43.190 INFO:tasks.workunit.client.0.vm05.stdout:4/170: dread - d0/d6/f39 zero size 2026-03-10T07:50:43.194 INFO:tasks.workunit.client.0.vm05.stdout:4/171: dwrite d0/d20/f2a [0,4194304] 0 2026-03-10T07:50:43.199 INFO:tasks.workunit.client.0.vm05.stdout:2/252: rmdir d0/d8/d43/df/d25 39 2026-03-10T07:50:43.199 INFO:tasks.workunit.client.0.vm05.stdout:9/116: creat d8/d16/d1c/d26/d28/f29 x:0 0 0 2026-03-10T07:50:43.202 INFO:tasks.workunit.client.0.vm05.stdout:9/117: stat 
d8/d16/f21 0 2026-03-10T07:50:43.203 INFO:tasks.workunit.client.0.vm05.stdout:8/143: creat d1/dd/d18/f29 x:0 0 0 2026-03-10T07:50:43.205 INFO:tasks.workunit.client.0.vm05.stdout:9/118: stat d8/d16/f1d 0 2026-03-10T07:50:43.208 INFO:tasks.workunit.client.0.vm05.stdout:7/189: rmdir d1/d34 39 2026-03-10T07:50:43.209 INFO:tasks.workunit.client.0.vm05.stdout:3/183: mknod d8/c38 0 2026-03-10T07:50:43.210 INFO:tasks.workunit.client.0.vm05.stdout:3/184: stat c2 0 2026-03-10T07:50:43.211 INFO:tasks.workunit.client.0.vm05.stdout:0/134: mkdir d8/dd/d10/d26/d2a 0 2026-03-10T07:50:43.213 INFO:tasks.workunit.client.0.vm05.stdout:5/163: link d2/d5/f10 d2/d5/f3d 0 2026-03-10T07:50:43.213 INFO:tasks.workunit.client.0.vm05.stdout:5/164: write d2/d5/f1e [1204404,33808] 0 2026-03-10T07:50:43.217 INFO:tasks.workunit.client.0.vm05.stdout:8/144: readlink d1/l2 0 2026-03-10T07:50:43.231 INFO:tasks.workunit.client.0.vm05.stdout:2/253: dread d0/d8/d43/df/d25/f29 [0,4194304] 0 2026-03-10T07:50:43.231 INFO:tasks.workunit.client.0.vm05.stdout:2/254: truncate d0/d8/d43/f1f 4598418 0 2026-03-10T07:50:43.232 INFO:tasks.workunit.client.0.vm05.stdout:2/255: dwrite d0/d8/d43/f1d [0,4194304] 0 2026-03-10T07:50:43.232 INFO:tasks.workunit.client.0.vm05.stdout:6/144: link d0/d11/f21 d0/d11/d22/f2b 0 2026-03-10T07:50:43.232 INFO:tasks.workunit.client.0.vm05.stdout:1/160: mkdir da/d26 0 2026-03-10T07:50:43.232 INFO:tasks.workunit.client.0.vm05.stdout:1/161: chown da/l23 13 1 2026-03-10T07:50:43.232 INFO:tasks.workunit.client.0.vm05.stdout:4/172: mkdir d0/d3b 0 2026-03-10T07:50:43.232 INFO:tasks.workunit.client.0.vm05.stdout:3/185: fdatasync d8/d16/f1e 0 2026-03-10T07:50:43.233 INFO:tasks.workunit.client.0.vm05.stdout:1/162: dwrite da/fb [0,4194304] 0 2026-03-10T07:50:43.243 INFO:tasks.workunit.client.0.vm05.stdout:0/135: link d8/fc d8/dd/d10/d26/d2a/f2b 0 2026-03-10T07:50:43.247 INFO:tasks.workunit.client.0.vm05.stdout:8/145: mkdir d1/dd/d18/d20/d2a 0 2026-03-10T07:50:43.253 
INFO:tasks.workunit.client.0.vm05.stdout:0/136: dwrite d8/f1c [0,4194304] 0 2026-03-10T07:50:43.258 INFO:tasks.workunit.client.0.vm05.stdout:1/163: mkdir da/dd/d27 0 2026-03-10T07:50:43.259 INFO:tasks.workunit.client.0.vm05.stdout:1/164: truncate da/dd/f13 1039924 0 2026-03-10T07:50:43.259 INFO:tasks.workunit.client.0.vm05.stdout:1/165: write da/fc [948199,4007] 0 2026-03-10T07:50:43.264 INFO:tasks.workunit.client.0.vm05.stdout:6/145: link d0/f29 d0/d6/f2c 0 2026-03-10T07:50:43.264 INFO:tasks.workunit.client.0.vm05.stdout:6/146: write d0/f15 [1169507,123704] 0 2026-03-10T07:50:43.268 INFO:tasks.workunit.client.0.vm05.stdout:6/147: dwrite d0/f28 [0,4194304] 0 2026-03-10T07:50:43.278 INFO:tasks.workunit.client.0.vm05.stdout:8/146: creat d1/dd/d18/f2b x:0 0 0 2026-03-10T07:50:43.279 INFO:tasks.workunit.client.0.vm05.stdout:1/166: symlink da/dd/d12/l28 0 2026-03-10T07:50:43.279 INFO:tasks.workunit.client.0.vm05.stdout:1/167: read da/fc [264859,124189] 0 2026-03-10T07:50:43.280 INFO:tasks.workunit.client.0.vm05.stdout:1/168: write da/dd/f11 [611059,101666] 0 2026-03-10T07:50:43.280 INFO:tasks.workunit.client.0.vm05.stdout:1/169: write da/dd/f13 [1986023,52886] 0 2026-03-10T07:50:43.282 INFO:tasks.workunit.client.0.vm05.stdout:0/137: read f6 [201944,110886] 0 2026-03-10T07:50:43.287 INFO:tasks.workunit.client.0.vm05.stdout:0/138: dwrite d8/fb [0,4194304] 0 2026-03-10T07:50:43.293 INFO:tasks.workunit.client.0.vm05.stdout:0/139: write f4 [632566,67736] 0 2026-03-10T07:50:43.311 INFO:tasks.workunit.client.0.vm05.stdout:8/147: dwrite d1/fe [0,4194304] 0 2026-03-10T07:50:43.322 INFO:tasks.workunit.client.0.vm05.stdout:1/170: symlink da/dd/d12/d19/l29 0 2026-03-10T07:50:43.324 INFO:tasks.workunit.client.0.vm05.stdout:0/140: symlink d8/dd/d10/l2c 0 2026-03-10T07:50:43.326 INFO:tasks.workunit.client.0.vm05.stdout:4/173: getdents d0/d17 0 2026-03-10T07:50:43.330 INFO:tasks.workunit.client.0.vm05.stdout:6/148: symlink d0/l2d 0 2026-03-10T07:50:43.335 
INFO:tasks.workunit.client.0.vm05.stdout:0/141: unlink d8/ff 0 2026-03-10T07:50:43.336 INFO:tasks.workunit.client.0.vm05.stdout:0/142: write d8/f13 [1112578,54382] 0 2026-03-10T07:50:43.339 INFO:tasks.workunit.client.0.vm05.stdout:4/174: rename d0/d6/d10 to d0/d6/d37/d3c 0 2026-03-10T07:50:43.342 INFO:tasks.workunit.client.0.vm05.stdout:6/149: mkdir d0/d11/d2e 0 2026-03-10T07:50:43.343 INFO:tasks.workunit.client.0.vm05.stdout:0/143: rename d8/dd/d10/f1a to d8/f2d 0 2026-03-10T07:50:43.344 INFO:tasks.workunit.client.0.vm05.stdout:0/144: write d8/dd/d10/f19 [1271945,89190] 0 2026-03-10T07:50:43.349 INFO:tasks.workunit.client.0.vm05.stdout:4/175: link d0/d6/f15 d0/d6/d37/f3d 0 2026-03-10T07:50:43.356 INFO:tasks.workunit.client.0.vm05.stdout:4/176: chown d0/d6 908431958 1 2026-03-10T07:50:43.356 INFO:tasks.workunit.client.0.vm05.stdout:4/177: fdatasync d0/d6/d37/d3c/f2f 0 2026-03-10T07:50:43.356 INFO:tasks.workunit.client.0.vm05.stdout:0/145: creat d8/dd/d10/d26/d2a/f2e x:0 0 0 2026-03-10T07:50:43.356 INFO:tasks.workunit.client.0.vm05.stdout:0/146: dread d8/f9 [0,4194304] 0 2026-03-10T07:50:43.356 INFO:tasks.workunit.client.0.vm05.stdout:4/178: mknod d0/d3b/c3e 0 2026-03-10T07:50:43.356 INFO:tasks.workunit.client.0.vm05.stdout:4/179: fdatasync d0/d20/f34 0 2026-03-10T07:50:43.358 INFO:tasks.workunit.client.0.vm05.stdout:9/119: sync 2026-03-10T07:50:43.368 INFO:tasks.workunit.client.0.vm05.stdout:9/120: dread d8/f14 [0,4194304] 0 2026-03-10T07:50:43.373 INFO:tasks.workunit.client.0.vm05.stdout:9/121: mknod d8/d16/d1c/d20/c2a 0 2026-03-10T07:50:43.373 INFO:tasks.workunit.client.0.vm05.stdout:9/122: truncate d8/fd 4302550 0 2026-03-10T07:50:43.373 INFO:tasks.workunit.client.0.vm05.stdout:9/123: fdatasync d8/f12 0 2026-03-10T07:50:43.374 INFO:tasks.workunit.client.0.vm05.stdout:9/124: readlink d8/d16/l18 0 2026-03-10T07:50:43.381 INFO:tasks.workunit.client.0.vm05.stdout:9/125: creat d8/d16/d22/f2b x:0 0 0 2026-03-10T07:50:43.384 
INFO:tasks.workunit.client.0.vm05.stdout:5/165: sync 2026-03-10T07:50:43.384 INFO:tasks.workunit.client.0.vm05.stdout:8/148: sync 2026-03-10T07:50:43.385 INFO:tasks.workunit.client.0.vm05.stdout:8/149: read - d1/dd/d18/f21 zero size 2026-03-10T07:50:43.387 INFO:tasks.workunit.client.0.vm05.stdout:8/150: read - d1/dd/d18/f21 zero size 2026-03-10T07:50:43.390 INFO:tasks.workunit.client.0.vm05.stdout:6/150: sync 2026-03-10T07:50:43.394 INFO:tasks.workunit.client.0.vm05.stdout:8/151: dwrite d1/dd/d18/f21 [0,4194304] 0 2026-03-10T07:50:43.403 INFO:tasks.workunit.client.0.vm05.stdout:1/171: fdatasync da/fc 0 2026-03-10T07:50:43.414 INFO:tasks.workunit.client.0.vm05.stdout:2/256: write d0/d8/f2d [1183862,53154] 0 2026-03-10T07:50:43.418 INFO:tasks.workunit.client.0.vm05.stdout:7/190: dwrite d1/f11 [0,4194304] 0 2026-03-10T07:50:43.419 INFO:tasks.workunit.client.0.vm05.stdout:1/172: sync 2026-03-10T07:50:43.421 INFO:tasks.workunit.client.0.vm05.stdout:9/126: getdents d8/d16/d1c/d26 0 2026-03-10T07:50:43.435 INFO:tasks.workunit.client.0.vm05.stdout:2/257: dwrite d0/d8/d43/df/f21 [0,4194304] 0 2026-03-10T07:50:43.438 INFO:tasks.workunit.client.0.vm05.stdout:2/258: chown d0/f22 1 1 2026-03-10T07:50:43.451 INFO:tasks.workunit.client.0.vm05.stdout:1/173: mkdir da/dd/d2a 0 2026-03-10T07:50:43.451 INFO:tasks.workunit.client.0.vm05.stdout:8/152: creat d1/f2c x:0 0 0 2026-03-10T07:50:43.452 INFO:tasks.workunit.client.0.vm05.stdout:7/191: fdatasync d1/d6/f1b 0 2026-03-10T07:50:43.453 INFO:tasks.workunit.client.0.vm05.stdout:3/186: dwrite d8/d1c/f35 [0,4194304] 0 2026-03-10T07:50:43.455 INFO:tasks.workunit.client.0.vm05.stdout:9/127: fdatasync d8/fa 0 2026-03-10T07:50:43.459 INFO:tasks.workunit.client.0.vm05.stdout:7/192: chown d1/l24 46 1 2026-03-10T07:50:43.461 INFO:tasks.workunit.client.0.vm05.stdout:8/153: rename d1/dd/d18/d20 to d1/dd/d18/d20/d2a/d2d 22 2026-03-10T07:50:43.461 INFO:tasks.workunit.client.0.vm05.stdout:7/193: read - d1/d6/f22 zero size 2026-03-10T07:50:43.477 
INFO:tasks.workunit.client.0.vm05.stdout:0/147: dread f4 [0,4194304] 0 2026-03-10T07:50:43.478 INFO:tasks.workunit.client.0.vm05.stdout:2/259: read d0/d8/d43/df/d25/f2b [75267,27078] 0 2026-03-10T07:50:43.484 INFO:tasks.workunit.client.0.vm05.stdout:1/174: unlink da/f17 0 2026-03-10T07:50:43.487 INFO:tasks.workunit.client.0.vm05.stdout:9/128: mkdir d8/d16/d1c/d2c 0 2026-03-10T07:50:43.487 INFO:tasks.workunit.client.0.vm05.stdout:9/129: write d8/d16/d22/f2b [529956,58839] 0 2026-03-10T07:50:43.490 INFO:tasks.workunit.client.0.vm05.stdout:8/154: chown d1/c7 119809 1 2026-03-10T07:50:43.497 INFO:tasks.workunit.client.0.vm05.stdout:7/194: unlink d1/f33 0 2026-03-10T07:50:43.499 INFO:tasks.workunit.client.0.vm05.stdout:6/151: write d0/f28 [4723514,79301] 0 2026-03-10T07:50:43.499 INFO:tasks.workunit.client.0.vm05.stdout:6/152: dread - d0/d11/f2a zero size 2026-03-10T07:50:43.500 INFO:tasks.workunit.client.0.vm05.stdout:6/153: chown d0/d1e/c1f 598635 1 2026-03-10T07:50:43.508 INFO:tasks.workunit.client.0.vm05.stdout:2/260: creat d0/d8/d43/d38/f56 x:0 0 0 2026-03-10T07:50:43.509 INFO:tasks.workunit.client.0.vm05.stdout:2/261: write d0/f22 [948840,118394] 0 2026-03-10T07:50:43.509 INFO:tasks.workunit.client.0.vm05.stdout:2/262: read - d0/d8/d43/d38/f56 zero size 2026-03-10T07:50:43.516 INFO:tasks.workunit.client.0.vm05.stdout:2/263: dwrite d0/d8/d43/df/f3a [0,4194304] 0 2026-03-10T07:50:43.536 INFO:tasks.workunit.client.0.vm05.stdout:0/148: unlink l7 0 2026-03-10T07:50:43.544 INFO:tasks.workunit.client.0.vm05.stdout:6/154: truncate d0/fa 744421 0 2026-03-10T07:50:43.545 INFO:tasks.workunit.client.0.vm05.stdout:0/149: creat d8/dd/d10/f2f x:0 0 0 2026-03-10T07:50:43.551 INFO:tasks.workunit.client.0.vm05.stdout:6/155: dread d0/d6/f1d [0,4194304] 0 2026-03-10T07:50:43.555 INFO:tasks.workunit.client.0.vm05.stdout:2/264: creat d0/d8/d43/df/d4d/f57 x:0 0 0 2026-03-10T07:50:43.557 INFO:tasks.workunit.client.0.vm05.stdout:4/180: write d0/f23 [10824,84979] 0 2026-03-10T07:50:43.561 
INFO:tasks.workunit.client.0.vm05.stdout:4/181: read - d0/d6/f32 zero size 2026-03-10T07:50:43.575 INFO:tasks.workunit.client.0.vm05.stdout:4/182: rmdir d0/d20 39 2026-03-10T07:50:43.575 INFO:tasks.workunit.client.0.vm05.stdout:4/183: chown d0/f23 0 1 2026-03-10T07:50:43.577 INFO:tasks.workunit.client.0.vm05.stdout:5/166: truncate d2/d5/f18 1643436 0 2026-03-10T07:50:43.577 INFO:tasks.workunit.client.0.vm05.stdout:5/167: chown d2 4721275 1 2026-03-10T07:50:43.577 INFO:tasks.workunit.client.0.vm05.stdout:5/168: stat d2/d12/f14 0 2026-03-10T07:50:43.578 INFO:tasks.workunit.client.0.vm05.stdout:5/169: fsync d2/f8 0 2026-03-10T07:50:43.578 INFO:tasks.workunit.client.0.vm05.stdout:5/170: truncate d2/d5/f3c 157146 0 2026-03-10T07:50:43.583 INFO:tasks.workunit.client.0.vm05.stdout:6/156: mknod d0/c2f 0 2026-03-10T07:50:43.587 INFO:tasks.workunit.client.0.vm05.stdout:2/265: truncate d0/f5 1582193 0 2026-03-10T07:50:43.589 INFO:tasks.workunit.client.0.vm05.stdout:1/175: write da/dd/d12/d19/f1a [967537,73212] 0 2026-03-10T07:50:43.589 INFO:tasks.workunit.client.0.vm05.stdout:1/176: write da/dd/d12/f18 [281060,97597] 0 2026-03-10T07:50:43.597 INFO:tasks.workunit.client.0.vm05.stdout:3/187: truncate d8/d1c/f23 1635320 0 2026-03-10T07:50:43.600 INFO:tasks.workunit.client.0.vm05.stdout:4/184: symlink d0/d28/l3f 0 2026-03-10T07:50:43.603 INFO:tasks.workunit.client.0.vm05.stdout:4/185: dwrite d0/f23 [0,4194304] 0 2026-03-10T07:50:43.623 INFO:tasks.workunit.client.0.vm05.stdout:9/130: write d8/f15 [3311864,74676] 0 2026-03-10T07:50:43.625 INFO:tasks.workunit.client.0.vm05.stdout:7/195: write d1/d6/f1d [2730713,8745] 0 2026-03-10T07:50:43.637 INFO:tasks.workunit.client.0.vm05.stdout:8/155: write d1/dd/f11 [1394726,79431] 0 2026-03-10T07:50:43.651 INFO:tasks.workunit.client.0.vm05.stdout:0/150: truncate d8/dd/f22 1432008 0 2026-03-10T07:50:43.678 INFO:tasks.workunit.client.0.vm05.stdout:8/156: unlink d1/d23/l26 0 2026-03-10T07:50:43.679 INFO:tasks.workunit.client.0.vm05.stdout:6/157: 
creat d0/d11/d2e/f30 x:0 0 0 2026-03-10T07:50:43.680 INFO:tasks.workunit.client.0.vm05.stdout:1/177: mkdir da/d26/d2b 0 2026-03-10T07:50:43.685 INFO:tasks.workunit.client.0.vm05.stdout:8/157: dwrite d1/fa [0,4194304] 0 2026-03-10T07:50:43.686 INFO:tasks.workunit.client.0.vm05.stdout:9/131: dwrite d8/f14 [0,4194304] 0 2026-03-10T07:50:43.695 INFO:tasks.workunit.client.0.vm05.stdout:9/132: read d8/fd [2637418,25122] 0 2026-03-10T07:50:43.698 INFO:tasks.workunit.client.0.vm05.stdout:8/158: dwrite d1/f2c [0,4194304] 0 2026-03-10T07:50:43.703 INFO:tasks.workunit.client.0.vm05.stdout:0/151: mkdir d8/dd/d10/d26/d2a/d30 0 2026-03-10T07:50:43.714 INFO:tasks.workunit.client.0.vm05.stdout:4/186: creat d0/d20/d26/f40 x:0 0 0 2026-03-10T07:50:43.714 INFO:tasks.workunit.client.0.vm05.stdout:4/187: chown d0/d28/f33 97 1 2026-03-10T07:50:43.715 INFO:tasks.workunit.client.0.vm05.stdout:4/188: write d0/d20/f34 [79987,25430] 0 2026-03-10T07:50:43.719 INFO:tasks.workunit.client.0.vm05.stdout:4/189: dread d0/f23 [0,4194304] 0 2026-03-10T07:50:43.729 INFO:tasks.workunit.client.0.vm05.stdout:5/171: rmdir d2/d20/d38 0 2026-03-10T07:50:43.731 INFO:tasks.workunit.client.0.vm05.stdout:2/266: creat d0/d8/d43/df/f58 x:0 0 0 2026-03-10T07:50:43.735 INFO:tasks.workunit.client.0.vm05.stdout:6/158: rename d0/d1e to d0/d11/d31 0 2026-03-10T07:50:43.744 INFO:tasks.workunit.client.0.vm05.stdout:9/133: unlink d8/d16/c1a 0 2026-03-10T07:50:43.747 INFO:tasks.workunit.client.0.vm05.stdout:8/159: rmdir d1/dd/d18 39 2026-03-10T07:50:43.754 INFO:tasks.workunit.client.0.vm05.stdout:3/188: dwrite d8/d1c/f23 [0,4194304] 0 2026-03-10T07:50:43.760 INFO:tasks.workunit.client.0.vm05.stdout:0/152: fsync d8/fb 0 2026-03-10T07:50:43.764 INFO:tasks.workunit.client.0.vm05.stdout:7/196: creat d1/f3d x:0 0 0 2026-03-10T07:50:43.765 INFO:tasks.workunit.client.0.vm05.stdout:7/197: write d1/f21 [3153730,15496] 0 2026-03-10T07:50:43.769 INFO:tasks.workunit.client.0.vm05.stdout:7/198: dwrite d1/f25 [0,4194304] 0 
2026-03-10T07:50:43.774 INFO:tasks.workunit.client.0.vm05.stdout:7/199: truncate d1/d6/f22 856566 0 2026-03-10T07:50:43.774 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:50:43 vm08.local ceph-mon[59917]: Upgrade: Setting container_image for all mgr 2026-03-10T07:50:43.774 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:50:43 vm08.local ceph-mon[59917]: Upgrade: Setting container_image for all rgw 2026-03-10T07:50:43.774 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:50:43 vm08.local ceph-mon[59917]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' 2026-03-10T07:50:43.774 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:50:43 vm08.local ceph-mon[59917]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T07:50:43.774 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:50:43 vm08.local ceph-mon[59917]: Upgrade: Setting container_image for all rbd-mirror 2026-03-10T07:50:43.774 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:50:43 vm08.local ceph-mon[59917]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' 2026-03-10T07:50:43.774 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:50:43 vm08.local ceph-mon[59917]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T07:50:43.774 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:50:43 vm08.local ceph-mon[59917]: Upgrade: Setting container_image for all cephfs-mirror 2026-03-10T07:50:43.774 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:50:43 vm08.local ceph-mon[59917]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' 2026-03-10T07:50:43.774 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:50:43 vm08.local ceph-mon[59917]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T07:50:43.774 
INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:50:43 vm08.local ceph-mon[59917]: Upgrade: Setting container_image for all iscsi 2026-03-10T07:50:43.774 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:50:43 vm08.local ceph-mon[59917]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' 2026-03-10T07:50:43.774 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:50:43 vm08.local ceph-mon[59917]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T07:50:43.774 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:50:43 vm08.local ceph-mon[59917]: Upgrade: Setting container_image for all nfs 2026-03-10T07:50:43.774 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:50:43 vm08.local ceph-mon[59917]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' 2026-03-10T07:50:43.774 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:50:43 vm08.local ceph-mon[59917]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T07:50:43.774 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:50:43 vm08.local ceph-mon[59917]: Upgrade: Setting container_image for all nvmeof 2026-03-10T07:50:43.774 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:50:43 vm08.local ceph-mon[59917]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' 2026-03-10T07:50:43.774 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:50:43 vm08.local ceph-mon[59917]: pgmap v11: 65 pgs: 65 active+clean; 253 MiB data, 1.4 GiB used, 119 GiB / 120 GiB avail; 119 KiB/s rd, 11 MiB/s wr, 199 op/s 2026-03-10T07:50:43.775 INFO:tasks.workunit.client.0.vm05.stdout:7/200: truncate d1/f21 4861346 0 2026-03-10T07:50:43.783 INFO:tasks.workunit.client.0.vm05.stdout:4/190: unlink d0/l4 0 2026-03-10T07:50:43.783 INFO:tasks.workunit.client.0.vm05.stdout:5/172: mknod d2/d12/c3e 0 2026-03-10T07:50:43.803 
INFO:tasks.workunit.client.0.vm05.stdout:9/134: mknod d8/d16/d1c/d20/c2d 0 2026-03-10T07:50:43.803 INFO:tasks.workunit.client.0.vm05.stdout:8/160: fsync d1/dd/d18/f1f 0 2026-03-10T07:50:43.804 INFO:tasks.workunit.client.0.vm05.stdout:8/161: fsync d1/fa 0 2026-03-10T07:50:43.805 INFO:tasks.workunit.client.0.vm05.stdout:9/135: write d8/f15 [2601809,88456] 0 2026-03-10T07:50:43.807 INFO:tasks.workunit.client.0.vm05.stdout:9/136: stat d8/d16 0 2026-03-10T07:50:43.811 INFO:tasks.workunit.client.0.vm05.stdout:8/162: dwrite d1/f15 [4194304,4194304] 0 2026-03-10T07:50:43.819 INFO:tasks.workunit.client.0.vm05.stdout:9/137: dwrite d8/d16/f21 [0,4194304] 0 2026-03-10T07:50:43.820 INFO:tasks.workunit.client.0.vm05.stdout:9/138: fdatasync d8/d16/d1c/f1e 0 2026-03-10T07:50:43.830 INFO:tasks.workunit.client.0.vm05.stdout:3/189: unlink d8/d16/f1e 0 2026-03-10T07:50:43.839 INFO:tasks.workunit.client.0.vm05.stdout:0/153: creat d8/dd/d10/d26/f31 x:0 0 0 2026-03-10T07:50:43.855 INFO:tasks.workunit.client.0.vm05.stdout:6/159: truncate d0/d6/f16 3437749 0 2026-03-10T07:50:43.865 INFO:tasks.workunit.client.0.vm05.stdout:0/154: truncate d8/dd/d10/f11 4586197 0 2026-03-10T07:50:43.869 INFO:tasks.workunit.client.0.vm05.stdout:2/267: rename d0/d8/d43/c35 to d0/c59 0 2026-03-10T07:50:43.869 INFO:tasks.workunit.client.0.vm05.stdout:0/155: rename d8/dd to d8/dd/d10/d26/d2a/d32 22 2026-03-10T07:50:43.872 INFO:tasks.workunit.client.0.vm05.stdout:8/163: sync 2026-03-10T07:50:43.874 INFO:tasks.workunit.client.0.vm05.stdout:8/164: chown d1/dd/d18/f29 1057066213 1 2026-03-10T07:50:43.875 INFO:tasks.workunit.client.0.vm05.stdout:6/160: rmdir d0/d11 39 2026-03-10T07:50:43.878 INFO:tasks.workunit.client.0.vm05.stdout:1/178: getdents da/dd/d12/d19/d20 0 2026-03-10T07:50:43.885 INFO:tasks.workunit.client.0.vm05.stdout:8/165: sync 2026-03-10T07:50:43.890 INFO:tasks.workunit.client.0.vm05.stdout:3/190: creat d8/d1f/d2a/d34/f39 x:0 0 0 2026-03-10T07:50:43.891 INFO:tasks.workunit.client.0.vm05.stdout:3/191: 
sync 2026-03-10T07:50:43.895 INFO:tasks.workunit.client.0.vm05.stdout:7/201: link d1/f23 d1/d34/f3e 0 2026-03-10T07:50:43.899 INFO:tasks.workunit.client.0.vm05.stdout:8/166: dwrite d1/dd/d18/f21 [0,4194304] 0 2026-03-10T07:50:43.903 INFO:tasks.workunit.client.0.vm05.stdout:5/173: creat d2/f3f x:0 0 0 2026-03-10T07:50:43.903 INFO:tasks.workunit.client.0.vm05.stdout:4/191: creat d0/f41 x:0 0 0 2026-03-10T07:50:43.906 INFO:tasks.workunit.client.0.vm05.stdout:8/167: write d1/dd/f25 [1661791,77763] 0 2026-03-10T07:50:43.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:50:43 vm05.local ceph-mon[50387]: Upgrade: Setting container_image for all mgr 2026-03-10T07:50:43.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:50:43 vm05.local ceph-mon[50387]: Upgrade: Setting container_image for all rgw 2026-03-10T07:50:43.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:50:43 vm05.local ceph-mon[50387]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' 2026-03-10T07:50:43.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:50:43 vm05.local ceph-mon[50387]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T07:50:43.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:50:43 vm05.local ceph-mon[50387]: Upgrade: Setting container_image for all rbd-mirror 2026-03-10T07:50:43.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:50:43 vm05.local ceph-mon[50387]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' 2026-03-10T07:50:43.908 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:50:43 vm05.local ceph-mon[50387]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T07:50:43.908 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:50:43 vm05.local ceph-mon[50387]: Upgrade: Setting container_image for all cephfs-mirror 2026-03-10T07:50:43.908 
INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:50:43 vm05.local ceph-mon[50387]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' 2026-03-10T07:50:43.908 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:50:43 vm05.local ceph-mon[50387]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T07:50:43.908 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:50:43 vm05.local ceph-mon[50387]: Upgrade: Setting container_image for all iscsi 2026-03-10T07:50:43.908 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:50:43 vm05.local ceph-mon[50387]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' 2026-03-10T07:50:43.908 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:50:43 vm05.local ceph-mon[50387]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T07:50:43.908 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:50:43 vm05.local ceph-mon[50387]: Upgrade: Setting container_image for all nfs 2026-03-10T07:50:43.908 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:50:43 vm05.local ceph-mon[50387]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' 2026-03-10T07:50:43.908 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:50:43 vm05.local ceph-mon[50387]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T07:50:43.908 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:50:43 vm05.local ceph-mon[50387]: Upgrade: Setting container_image for all nvmeof 2026-03-10T07:50:43.908 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:50:43 vm05.local ceph-mon[50387]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' 2026-03-10T07:50:43.908 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:50:43 vm05.local ceph-mon[50387]: pgmap v11: 65 pgs: 65 active+clean; 253 MiB data, 1.4 GiB 
used, 119 GiB / 120 GiB avail; 119 KiB/s rd, 11 MiB/s wr, 199 op/s 2026-03-10T07:50:43.908 INFO:tasks.workunit.client.0.vm05.stdout:8/168: fsync d1/fe 0 2026-03-10T07:50:43.910 INFO:tasks.workunit.client.0.vm05.stdout:5/174: dread d2/d5/f25 [0,4194304] 0 2026-03-10T07:50:43.923 INFO:tasks.workunit.client.0.vm05.stdout:4/192: dread d0/d20/f34 [0,4194304] 0 2026-03-10T07:50:43.938 INFO:tasks.workunit.client.0.vm05.stdout:1/179: mknod da/dd/d12/d19/d20/c2c 0 2026-03-10T07:50:43.939 INFO:tasks.workunit.client.0.vm05.stdout:9/139: getdents d8/d16/d1c/d26 0 2026-03-10T07:50:43.946 INFO:tasks.workunit.client.0.vm05.stdout:9/140: chown f7 203679429 1 2026-03-10T07:50:43.951 INFO:tasks.workunit.client.0.vm05.stdout:0/156: rename d8/dd/d10/f11 to d8/f33 0 2026-03-10T07:50:43.951 INFO:tasks.workunit.client.0.vm05.stdout:4/193: dwrite d0/d17/f19 [0,4194304] 0 2026-03-10T07:50:43.953 INFO:tasks.workunit.client.0.vm05.stdout:9/141: chown d8/d16/d1c/f1e 4302567 1 2026-03-10T07:50:43.960 INFO:tasks.workunit.client.0.vm05.stdout:6/161: truncate d0/fa 1597988 0 2026-03-10T07:50:43.960 INFO:tasks.workunit.client.0.vm05.stdout:4/194: write d0/d20/f2a [4129631,26817] 0 2026-03-10T07:50:43.960 INFO:tasks.workunit.client.0.vm05.stdout:7/202: write d1/d6/f1b [4520089,106869] 0 2026-03-10T07:50:43.961 INFO:tasks.workunit.client.0.vm05.stdout:8/169: mknod d1/dd/c2e 0 2026-03-10T07:50:43.962 INFO:tasks.workunit.client.0.vm05.stdout:8/170: stat d1/dd/c2e 0 2026-03-10T07:50:43.974 INFO:tasks.workunit.client.0.vm05.stdout:2/268: dwrite d0/f5 [0,4194304] 0 2026-03-10T07:50:43.974 INFO:tasks.workunit.client.0.vm05.stdout:1/180: dwrite da/dd/d12/f16 [0,4194304] 0 2026-03-10T07:50:43.974 INFO:tasks.workunit.client.0.vm05.stdout:3/192: rename d8/d16/d19/f2e to d8/d22/f3a 0 2026-03-10T07:50:43.986 INFO:tasks.workunit.client.0.vm05.stdout:2/269: chown d0/f1e 5 1 2026-03-10T07:50:43.986 INFO:tasks.workunit.client.0.vm05.stdout:1/181: write da/dd/d12/f18 [462876,66227] 0 2026-03-10T07:50:43.993 
INFO:tasks.workunit.client.0.vm05.stdout:8/171: dread d1/dd/d18/f21 [0,4194304] 0 2026-03-10T07:50:43.993 INFO:tasks.workunit.client.0.vm05.stdout:1/182: chown da/dd/d12/d19/l29 16110789 1 2026-03-10T07:50:43.993 INFO:tasks.workunit.client.0.vm05.stdout:8/172: readlink d1/l2 0 2026-03-10T07:50:43.995 INFO:tasks.workunit.client.0.vm05.stdout:8/173: write d1/dd/f25 [2416922,64435] 0 2026-03-10T07:50:44.001 INFO:tasks.workunit.client.0.vm05.stdout:7/203: symlink d1/d34/l3f 0 2026-03-10T07:50:44.012 INFO:tasks.workunit.client.0.vm05.stdout:4/195: symlink d0/d6/d9/d12/l42 0 2026-03-10T07:50:44.012 INFO:tasks.workunit.client.0.vm05.stdout:4/196: fdatasync d0/d17/f2c 0 2026-03-10T07:50:44.037 INFO:tasks.workunit.client.0.vm05.stdout:9/142: creat d8/d16/d1c/d2c/f2e x:0 0 0 2026-03-10T07:50:44.038 INFO:tasks.workunit.client.0.vm05.stdout:9/143: write d8/d16/f23 [478385,13014] 0 2026-03-10T07:50:44.045 INFO:tasks.workunit.client.0.vm05.stdout:3/193: unlink d8/f31 0 2026-03-10T07:50:44.051 INFO:tasks.workunit.client.0.vm05.stdout:6/162: mknod d0/d11/c32 0 2026-03-10T07:50:44.051 INFO:tasks.workunit.client.0.vm05.stdout:1/183: dread da/dd/d12/f18 [0,4194304] 0 2026-03-10T07:50:44.056 INFO:tasks.workunit.client.0.vm05.stdout:1/184: dwrite f4 [8388608,4194304] 0 2026-03-10T07:50:44.059 INFO:tasks.workunit.client.0.vm05.stdout:1/185: write da/dd/d12/f16 [1581489,37745] 0 2026-03-10T07:50:44.076 INFO:tasks.workunit.client.0.vm05.stdout:2/270: mknod d0/d8/d43/df/d4d/c5a 0 2026-03-10T07:50:44.083 INFO:tasks.workunit.client.0.vm05.stdout:4/197: rename d0/d6/d9/d12/l42 to d0/d17/l43 0 2026-03-10T07:50:44.084 INFO:tasks.workunit.client.0.vm05.stdout:5/175: getdents d2/d20/d33 0 2026-03-10T07:50:44.085 INFO:tasks.workunit.client.0.vm05.stdout:5/176: write d2/d12/d2d/f36 [420381,59143] 0 2026-03-10T07:50:44.086 INFO:tasks.workunit.client.0.vm05.stdout:5/177: write d2/d12/f2b [124896,86960] 0 2026-03-10T07:50:44.093 INFO:tasks.workunit.client.0.vm05.stdout:0/157: rmdir 
d8/dd/d10/d26/d2a/d30 0 2026-03-10T07:50:44.096 INFO:tasks.workunit.client.0.vm05.stdout:0/158: dread d8/f1c [0,4194304] 0 2026-03-10T07:50:44.101 INFO:tasks.workunit.client.0.vm05.stdout:5/178: dread d2/ff [0,4194304] 0 2026-03-10T07:50:44.104 INFO:tasks.workunit.client.0.vm05.stdout:6/163: creat d0/d11/d31/f33 x:0 0 0 2026-03-10T07:50:44.108 INFO:tasks.workunit.client.0.vm05.stdout:8/174: symlink d1/l2f 0 2026-03-10T07:50:44.109 INFO:tasks.workunit.client.0.vm05.stdout:7/204: mknod d1/d3c/c40 0 2026-03-10T07:50:44.112 INFO:tasks.workunit.client.0.vm05.stdout:9/144: rename d8/c11 to d8/d16/d1c/d26/c2f 0 2026-03-10T07:50:44.118 INFO:tasks.workunit.client.0.vm05.stdout:0/159: mkdir d8/dd/d34 0 2026-03-10T07:50:44.125 INFO:tasks.workunit.client.0.vm05.stdout:3/194: fsync d8/d16/d19/f27 0 2026-03-10T07:50:44.125 INFO:tasks.workunit.client.0.vm05.stdout:1/186: link da/dd/d12/f16 da/d26/f2d 0 2026-03-10T07:50:44.128 INFO:tasks.workunit.client.0.vm05.stdout:5/179: fdatasync d2/d12/f2b 0 2026-03-10T07:50:44.129 INFO:tasks.workunit.client.0.vm05.stdout:5/180: chown d2/d5/f3c 12464246 1 2026-03-10T07:50:44.129 INFO:tasks.workunit.client.0.vm05.stdout:1/187: dwrite da/dd/d12/f22 [0,4194304] 0 2026-03-10T07:50:44.133 INFO:tasks.workunit.client.0.vm05.stdout:8/175: creat d1/dd/d18/d20/f30 x:0 0 0 2026-03-10T07:50:44.134 INFO:tasks.workunit.client.0.vm05.stdout:8/176: chown d1/d23/c24 5 1 2026-03-10T07:50:44.143 INFO:tasks.workunit.client.0.vm05.stdout:2/271: rename d0/l15 to d0/d8/d43/l5b 0 2026-03-10T07:50:44.143 INFO:tasks.workunit.client.0.vm05.stdout:2/272: fsync d0/d2a/f45 0 2026-03-10T07:50:44.146 INFO:tasks.workunit.client.0.vm05.stdout:0/160: unlink d8/f12 0 2026-03-10T07:50:44.151 INFO:tasks.workunit.client.0.vm05.stdout:6/164: symlink d0/l34 0 2026-03-10T07:50:44.156 INFO:tasks.workunit.client.0.vm05.stdout:1/188: dread da/dd/d12/f18 [0,4194304] 0 2026-03-10T07:50:44.161 INFO:tasks.workunit.client.0.vm05.stdout:8/177: creat d1/d23/f31 x:0 0 0 2026-03-10T07:50:44.161 
INFO:tasks.workunit.client.0.vm05.stdout:8/178: truncate d1/fe 5021306 0 2026-03-10T07:50:44.162 INFO:tasks.workunit.client.0.vm05.stdout:8/179: write d1/dd/d18/f22 [1377284,122244] 0 2026-03-10T07:50:44.166 INFO:tasks.workunit.client.0.vm05.stdout:7/205: link d1/d6/f22 d1/d6/f41 0 2026-03-10T07:50:44.169 INFO:tasks.workunit.client.0.vm05.stdout:9/145: rename d8/c13 to d8/d16/d22/c30 0 2026-03-10T07:50:44.184 INFO:tasks.workunit.client.0.vm05.stdout:8/180: truncate d1/dd/d18/f21 3654676 0 2026-03-10T07:50:44.185 INFO:tasks.workunit.client.0.vm05.stdout:8/181: dread - d1/dd/d18/d20/f30 zero size 2026-03-10T07:50:44.185 INFO:tasks.workunit.client.0.vm05.stdout:8/182: truncate d1/d23/f31 685920 0 2026-03-10T07:50:44.188 INFO:tasks.workunit.client.0.vm05.stdout:7/206: unlink d1/f23 0 2026-03-10T07:50:44.188 INFO:tasks.workunit.client.0.vm05.stdout:7/207: chown d1/d6/f31 278779 1 2026-03-10T07:50:44.193 INFO:tasks.workunit.client.0.vm05.stdout:6/165: dwrite d0/d11/d22/f2b [0,4194304] 0 2026-03-10T07:50:44.202 INFO:tasks.workunit.client.0.vm05.stdout:6/166: dread d0/d6/f1a [0,4194304] 0 2026-03-10T07:50:44.204 INFO:tasks.workunit.client.0.vm05.stdout:5/181: rename d2/d20/f31 to d2/d12/f40 0 2026-03-10T07:50:44.205 INFO:tasks.workunit.client.0.vm05.stdout:5/182: truncate d2/d12/f2b 1167899 0 2026-03-10T07:50:44.212 INFO:tasks.workunit.client.0.vm05.stdout:2/273: symlink d0/d8/d43/df/d53/l5c 0 2026-03-10T07:50:44.214 INFO:tasks.workunit.client.0.vm05.stdout:4/198: getdents d0/d6/d37/d3c 0 2026-03-10T07:50:44.214 INFO:tasks.workunit.client.0.vm05.stdout:4/199: write d0/f2 [3661289,44736] 0 2026-03-10T07:50:44.216 INFO:tasks.workunit.client.0.vm05.stdout:4/200: dread d0/d20/f2a [0,4194304] 0 2026-03-10T07:50:44.217 INFO:tasks.workunit.client.0.vm05.stdout:0/161: stat d8/dd/c1b 0 2026-03-10T07:50:44.227 INFO:tasks.workunit.client.0.vm05.stdout:8/183: readlink d1/lf 0 2026-03-10T07:50:44.246 INFO:tasks.workunit.client.0.vm05.stdout:6/167: sync 2026-03-10T07:50:44.246 
INFO:tasks.workunit.client.0.vm05.stdout:4/201: sync 2026-03-10T07:50:44.249 INFO:tasks.workunit.client.0.vm05.stdout:6/168: dwrite d0/d11/d22/f2b [0,4194304] 0 2026-03-10T07:50:44.260 INFO:tasks.workunit.client.0.vm05.stdout:7/208: rename d1/f2c to d1/d6/d3b/f42 0 2026-03-10T07:50:44.267 INFO:tasks.workunit.client.0.vm05.stdout:1/189: truncate da/fb 1636581 0 2026-03-10T07:50:44.270 INFO:tasks.workunit.client.0.vm05.stdout:1/190: dwrite da/dd/f11 [0,4194304] 0 2026-03-10T07:50:44.274 INFO:tasks.workunit.client.0.vm05.stdout:1/191: chown da/d26/d2b 27460920 1 2026-03-10T07:50:44.274 INFO:tasks.workunit.client.0.vm05.stdout:5/183: dread d2/d20/f2a [0,4194304] 0 2026-03-10T07:50:44.280 INFO:tasks.workunit.client.0.vm05.stdout:5/184: dread d2/d5/f1e [0,4194304] 0 2026-03-10T07:50:44.290 INFO:tasks.workunit.client.0.vm05.stdout:3/195: link d8/d16/d19/f27 d8/f3b 0 2026-03-10T07:50:44.293 INFO:tasks.workunit.client.0.vm05.stdout:3/196: dwrite d8/d1c/f2b [0,4194304] 0 2026-03-10T07:50:44.327 INFO:tasks.workunit.client.0.vm05.stdout:4/202: rmdir d0/d6 39 2026-03-10T07:50:44.351 INFO:tasks.workunit.client.0.vm05.stdout:7/209: rmdir d1 39 2026-03-10T07:50:44.352 INFO:tasks.workunit.client.0.vm05.stdout:5/185: fsync d2/d20/f2a 0 2026-03-10T07:50:44.360 INFO:tasks.workunit.client.0.vm05.stdout:5/186: dread d2/f8 [0,4194304] 0 2026-03-10T07:50:44.363 INFO:tasks.workunit.client.0.vm05.stdout:5/187: dread d2/d20/f21 [0,4194304] 0 2026-03-10T07:50:44.363 INFO:tasks.workunit.client.0.vm05.stdout:5/188: chown d2/l22 1 1 2026-03-10T07:50:44.365 INFO:tasks.workunit.client.0.vm05.stdout:0/162: mknod d8/dd/d34/c35 0 2026-03-10T07:50:44.371 INFO:tasks.workunit.client.0.vm05.stdout:9/146: link d8/d16/d1c/d26/c2f d8/d16/c31 0 2026-03-10T07:50:44.372 INFO:tasks.workunit.client.0.vm05.stdout:8/184: mknod d1/dd/d18/d20/d2a/c32 0 2026-03-10T07:50:44.373 INFO:tasks.workunit.client.0.vm05.stdout:8/185: dread - d1/dd/d18/d20/f30 zero size 2026-03-10T07:50:44.373 
INFO:tasks.workunit.client.0.vm05.stdout:8/186: dread - d1/dd/d18/f2b zero size 2026-03-10T07:50:44.374 INFO:tasks.workunit.client.0.vm05.stdout:8/187: chown d1/dd/l12 68965 1 2026-03-10T07:50:44.377 INFO:tasks.workunit.client.0.vm05.stdout:4/203: mkdir d0/d17/d44 0 2026-03-10T07:50:44.380 INFO:tasks.workunit.client.0.vm05.stdout:6/169: mkdir d0/d35 0 2026-03-10T07:50:44.381 INFO:tasks.workunit.client.0.vm05.stdout:6/170: chown d0/d6/l17 803 1 2026-03-10T07:50:44.381 INFO:tasks.workunit.client.0.vm05.stdout:6/171: truncate d0/d11/f2a 221447 0 2026-03-10T07:50:44.383 INFO:tasks.workunit.client.0.vm05.stdout:1/192: getdents da/dd/d2a 0 2026-03-10T07:50:44.407 INFO:tasks.workunit.client.0.vm05.stdout:0/163: write d8/dd/f29 [212210,89913] 0 2026-03-10T07:50:44.407 INFO:tasks.workunit.client.0.vm05.stdout:0/164: chown d8/fa 43689548 1 2026-03-10T07:50:44.414 INFO:tasks.workunit.client.0.vm05.stdout:5/189: dwrite d2/d5/f3d [0,4194304] 0 2026-03-10T07:50:44.415 INFO:tasks.workunit.client.0.vm05.stdout:9/147: unlink d8/d16/l17 0 2026-03-10T07:50:44.418 INFO:tasks.workunit.client.0.vm05.stdout:9/148: dread d8/d16/f23 [0,4194304] 0 2026-03-10T07:50:44.431 INFO:tasks.workunit.client.0.vm05.stdout:2/274: getdents d0/d8/d43/df/d53 0 2026-03-10T07:50:44.432 INFO:tasks.workunit.client.0.vm05.stdout:2/275: write d0/d8/f1c [640800,25924] 0 2026-03-10T07:50:44.440 INFO:tasks.workunit.client.0.vm05.stdout:3/197: rename f3 to d8/f3c 0 2026-03-10T07:50:44.446 INFO:tasks.workunit.client.0.vm05.stdout:4/204: mkdir d0/d6/d9/d12/d45 0 2026-03-10T07:50:44.448 INFO:tasks.workunit.client.0.vm05.stdout:6/172: mkdir d0/d35/d36 0 2026-03-10T07:50:44.455 INFO:tasks.workunit.client.0.vm05.stdout:6/173: dwrite d0/f23 [0,4194304] 0 2026-03-10T07:50:44.458 INFO:tasks.workunit.client.0.vm05.stdout:9/149: creat d8/d16/d1c/d20/f32 x:0 0 0 2026-03-10T07:50:44.459 INFO:tasks.workunit.client.0.vm05.stdout:9/150: readlink d8/l19 0 2026-03-10T07:50:44.467 INFO:tasks.workunit.client.0.vm05.stdout:0/165: 
symlink d8/dd/l36 0 2026-03-10T07:50:44.468 INFO:tasks.workunit.client.0.vm05.stdout:3/198: unlink d8/fd 0 2026-03-10T07:50:44.470 INFO:tasks.workunit.client.0.vm05.stdout:3/199: chown d8/d1f/c30 23 1 2026-03-10T07:50:44.475 INFO:tasks.workunit.client.0.vm05.stdout:7/210: dread d1/d34/f3e [0,4194304] 0 2026-03-10T07:50:44.482 INFO:tasks.workunit.client.0.vm05.stdout:6/174: dread d0/d6/f16 [0,4194304] 0 2026-03-10T07:50:44.485 INFO:tasks.workunit.client.0.vm05.stdout:2/276: mknod d0/c5d 0 2026-03-10T07:50:44.486 INFO:tasks.workunit.client.0.vm05.stdout:2/277: write d0/d8/d43/df/f20 [5564834,13364] 0 2026-03-10T07:50:44.487 INFO:tasks.workunit.client.0.vm05.stdout:6/175: dwrite d0/d11/d31/f33 [0,4194304] 0 2026-03-10T07:50:44.498 INFO:tasks.workunit.client.0.vm05.stdout:8/188: rename d1/c8 to d1/dd/d18/d20/d2a/c33 0 2026-03-10T07:50:44.500 INFO:tasks.workunit.client.0.vm05.stdout:3/200: sync 2026-03-10T07:50:44.503 INFO:tasks.workunit.client.0.vm05.stdout:3/201: dwrite d8/fb [4194304,4194304] 0 2026-03-10T07:50:44.510 INFO:tasks.workunit.client.0.vm05.stdout:0/166: write d8/dd/d10/f2f [485449,21194] 0 2026-03-10T07:50:44.514 INFO:tasks.workunit.client.0.vm05.stdout:5/190: link d2/d12/c30 d2/d20/d33/c41 0 2026-03-10T07:50:44.514 INFO:tasks.workunit.client.0.vm05.stdout:5/191: fdatasync d2/d5/f10 0 2026-03-10T07:50:44.515 INFO:tasks.workunit.client.0.vm05.stdout:4/205: getdents d0/d6/d37/d3c 0 2026-03-10T07:50:44.520 INFO:tasks.workunit.client.0.vm05.stdout:9/151: truncate d8/f12 654113 0 2026-03-10T07:50:44.522 INFO:tasks.workunit.client.0.vm05.stdout:9/152: write f6 [627015,33107] 0 2026-03-10T07:50:44.524 INFO:tasks.workunit.client.0.vm05.stdout:2/278: creat d0/d8/d43/f5e x:0 0 0 2026-03-10T07:50:44.524 INFO:tasks.workunit.client.0.vm05.stdout:2/279: fdatasync d0/d2a/f45 0 2026-03-10T07:50:44.526 INFO:tasks.workunit.client.0.vm05.stdout:5/192: sync 2026-03-10T07:50:44.533 INFO:tasks.workunit.client.0.vm05.stdout:1/193: dwrite da/fb [0,4194304] 0 
2026-03-10T07:50:44.538 INFO:tasks.workunit.client.0.vm05.stdout:8/189: mkdir d1/dd/d18/d20/d2a/d34 0 2026-03-10T07:50:44.539 INFO:tasks.workunit.client.0.vm05.stdout:2/280: dread d0/f6 [0,4194304] 0 2026-03-10T07:50:44.546 INFO:tasks.workunit.client.0.vm05.stdout:2/281: dwrite d0/d8/d43/df/f21 [4194304,4194304] 0 2026-03-10T07:50:44.549 INFO:tasks.workunit.client.0.vm05.stdout:4/206: dread d0/f2 [0,4194304] 0 2026-03-10T07:50:44.549 INFO:tasks.workunit.client.0.vm05.stdout:2/282: stat d0/d8/d43/df/c11 0 2026-03-10T07:50:44.554 INFO:tasks.workunit.client.0.vm05.stdout:2/283: dwrite d0/f22 [0,4194304] 0 2026-03-10T07:50:44.554 INFO:tasks.workunit.client.0.vm05.stdout:2/284: stat d0/f41 0 2026-03-10T07:50:44.556 INFO:tasks.workunit.client.0.vm05.stdout:0/167: unlink d8/dd/d10/f2f 0 2026-03-10T07:50:44.564 INFO:tasks.workunit.client.0.vm05.stdout:9/153: mkdir d8/d16/d22/d33 0 2026-03-10T07:50:44.592 INFO:tasks.workunit.client.0.vm05.stdout:1/194: mknod da/d26/c2e 0 2026-03-10T07:50:44.592 INFO:tasks.workunit.client.0.vm05.stdout:6/176: mknod d0/d35/d36/c37 0 2026-03-10T07:50:44.593 INFO:tasks.workunit.client.0.vm05.stdout:8/190: mknod d1/dd/d18/d20/c35 0 2026-03-10T07:50:44.593 INFO:tasks.workunit.client.0.vm05.stdout:4/207: creat d0/d6/d37/f46 x:0 0 0 2026-03-10T07:50:44.597 INFO:tasks.workunit.client.0.vm05.stdout:2/285: mknod d0/d47/d49/c5f 0 2026-03-10T07:50:44.597 INFO:tasks.workunit.client.0.vm05.stdout:2/286: dread - d0/d2a/f2e zero size 2026-03-10T07:50:44.597 INFO:tasks.workunit.client.0.vm05.stdout:2/287: fsync d0/f1 0 2026-03-10T07:50:44.600 INFO:tasks.workunit.client.0.vm05.stdout:7/211: dwrite d1/d6/f22 [0,4194304] 0 2026-03-10T07:50:44.605 INFO:tasks.workunit.client.0.vm05.stdout:7/212: write d1/f21 [2485947,42391] 0 2026-03-10T07:50:44.605 INFO:tasks.workunit.client.0.vm05.stdout:7/213: chown d1/f3a 51058203 1 2026-03-10T07:50:44.611 INFO:tasks.workunit.client.0.vm05.stdout:7/214: sync 2026-03-10T07:50:44.627 
INFO:tasks.workunit.client.0.vm05.stdout:6/177: mknod d0/c38 0 2026-03-10T07:50:44.628 INFO:tasks.workunit.client.0.vm05.stdout:1/195: creat da/dd/d2a/f2f x:0 0 0 2026-03-10T07:50:44.628 INFO:tasks.workunit.client.0.vm05.stdout:6/178: chown d0/f26 12 1 2026-03-10T07:50:44.638 INFO:tasks.workunit.client.0.vm05.stdout:4/208: symlink d0/d17/d44/l47 0 2026-03-10T07:50:44.638 INFO:tasks.workunit.client.0.vm05.stdout:8/191: symlink d1/dd/d18/d20/d2a/d34/l36 0 2026-03-10T07:50:44.639 INFO:tasks.workunit.client.0.vm05.stdout:2/288: mknod d0/c60 0 2026-03-10T07:50:44.640 INFO:tasks.workunit.client.0.vm05.stdout:8/192: fsync d1/f2c 0 2026-03-10T07:50:44.646 INFO:tasks.workunit.client.0.vm05.stdout:6/179: mknod d0/d35/c39 0 2026-03-10T07:50:44.647 INFO:tasks.workunit.client.0.vm05.stdout:5/193: getdents d2/d12 0 2026-03-10T07:50:44.647 INFO:tasks.workunit.client.0.vm05.stdout:4/209: chown d0/d6/d37/f46 2 1 2026-03-10T07:50:44.647 INFO:tasks.workunit.client.0.vm05.stdout:4/210: dread - d0/d20/d26/f40 zero size 2026-03-10T07:50:44.671 INFO:tasks.workunit.client.0.vm05.stdout:0/168: getdents d8/dd/d10 0 2026-03-10T07:50:44.690 INFO:tasks.workunit.client.0.vm05.stdout:3/202: truncate d8/f12 4020102 0 2026-03-10T07:50:44.693 INFO:tasks.workunit.client.0.vm05.stdout:3/203: dwrite d8/fe [0,4194304] 0 2026-03-10T07:50:44.707 INFO:tasks.workunit.client.0.vm05.stdout:9/154: dwrite d8/f9 [0,4194304] 0 2026-03-10T07:50:44.723 INFO:tasks.workunit.client.0.vm05.stdout:1/196: truncate da/dd/d12/d19/f1a 27125 0 2026-03-10T07:50:44.732 INFO:tasks.workunit.client.0.vm05.stdout:4/211: unlink d0/d6/d37/d3c/f25 0 2026-03-10T07:50:44.732 INFO:tasks.workunit.client.0.vm05.stdout:4/212: dread - d0/d20/d26/f40 zero size 2026-03-10T07:50:44.732 INFO:tasks.workunit.client.0.vm05.stdout:7/215: link d1/l18 d1/l43 0 2026-03-10T07:50:44.732 INFO:tasks.workunit.client.0.vm05.stdout:7/216: dread - d1/f3d zero size 2026-03-10T07:50:44.732 INFO:tasks.workunit.client.0.vm05.stdout:3/204: creat d8/d1c/f3d x:0 0 
0 2026-03-10T07:50:44.733 INFO:tasks.workunit.client.0.vm05.stdout:3/205: readlink d8/l1b 0 2026-03-10T07:50:44.735 INFO:tasks.workunit.client.0.vm05.stdout:9/155: dread d8/d16/f23 [0,4194304] 0 2026-03-10T07:50:44.739 INFO:tasks.workunit.client.0.vm05.stdout:2/289: creat d0/d8/d43/df/d4e/f61 x:0 0 0 2026-03-10T07:50:44.739 INFO:tasks.workunit.client.0.vm05.stdout:9/156: chown d8/f1b 0 1 2026-03-10T07:50:44.739 INFO:tasks.workunit.client.0.vm05.stdout:9/157: readlink d8/lb 0 2026-03-10T07:50:44.739 INFO:tasks.workunit.client.0.vm05.stdout:9/158: truncate d8/f1b 4602719 0 2026-03-10T07:50:44.739 INFO:tasks.workunit.client.0.vm05.stdout:9/159: chown d8/f1b 100 1 2026-03-10T07:50:44.742 INFO:tasks.workunit.client.0.vm05.stdout:9/160: dwrite d8/d16/f1d [0,4194304] 0 2026-03-10T07:50:44.755 INFO:tasks.workunit.client.0.vm05.stdout:7/217: rename d1/l24 to d1/d3c/l44 0 2026-03-10T07:50:44.755 INFO:tasks.workunit.client.0.vm05.stdout:7/218: chown d1/d3c 361 1 2026-03-10T07:50:44.759 INFO:tasks.workunit.client.0.vm05.stdout:6/180: write d0/d6/f2c [389801,33883] 0 2026-03-10T07:50:44.759 INFO:tasks.workunit.client.0.vm05.stdout:6/181: fdatasync d0/f15 0 2026-03-10T07:50:44.759 INFO:tasks.workunit.client.0.vm05.stdout:6/182: stat d0/d6/f10 0 2026-03-10T07:50:44.766 INFO:tasks.workunit.client.0.vm05.stdout:8/193: write d1/dd/d18/f21 [252253,64977] 0 2026-03-10T07:50:44.769 INFO:tasks.workunit.client.0.vm05.stdout:5/194: creat d2/f42 x:0 0 0 2026-03-10T07:50:44.772 INFO:tasks.workunit.client.0.vm05.stdout:0/169: mkdir d8/dd/d37 0 2026-03-10T07:50:44.773 INFO:tasks.workunit.client.0.vm05.stdout:0/170: chown d8/dd/d10/l25 6857 1 2026-03-10T07:50:44.775 INFO:tasks.workunit.client.0.vm05.stdout:1/197: symlink da/l30 0 2026-03-10T07:50:44.776 INFO:tasks.workunit.client.0.vm05.stdout:1/198: read da/dd/d12/f22 [2514715,27381] 0 2026-03-10T07:50:44.778 INFO:tasks.workunit.client.0.vm05.stdout:6/183: dread d0/f29 [0,4194304] 0 2026-03-10T07:50:44.792 
INFO:tasks.workunit.client.0.vm05.stdout:4/213: symlink d0/l48 0 2026-03-10T07:50:44.797 INFO:tasks.workunit.client.0.vm05.stdout:9/161: symlink d8/d16/d1c/l34 0 2026-03-10T07:50:44.798 INFO:tasks.workunit.client.0.vm05.stdout:9/162: write d8/f15 [4078239,15658] 0 2026-03-10T07:50:44.804 INFO:tasks.workunit.client.0.vm05.stdout:8/194: rename d1/l27 to d1/dd/d18/d20/d2a/l37 0 2026-03-10T07:50:44.807 INFO:tasks.workunit.client.0.vm05.stdout:7/219: write d1/d34/f3e [4101265,58136] 0 2026-03-10T07:50:44.807 INFO:tasks.workunit.client.0.vm05.stdout:5/195: symlink d2/d20/l43 0 2026-03-10T07:50:44.810 INFO:tasks.workunit.client.0.vm05.stdout:7/220: dwrite d1/d6/f2e [0,4194304] 0 2026-03-10T07:50:44.823 INFO:tasks.workunit.client.0.vm05.stdout:3/206: creat d8/d1f/d24/f3e x:0 0 0 2026-03-10T07:50:44.830 INFO:tasks.workunit.client.0.vm05.stdout:1/199: creat da/dd/d12/f31 x:0 0 0 2026-03-10T07:50:44.836 INFO:tasks.workunit.client.0.vm05.stdout:9/163: rmdir d8/d16/d1c/d2c 39 2026-03-10T07:50:44.836 INFO:tasks.workunit.client.0.vm05.stdout:9/164: dread d8/f14 [0,4194304] 0 2026-03-10T07:50:44.836 INFO:tasks.workunit.client.0.vm05.stdout:8/195: unlink d1/dd/d18/d20/d2a/c32 0 2026-03-10T07:50:44.837 INFO:tasks.workunit.client.0.vm05.stdout:5/196: fdatasync d2/d5/f1e 0 2026-03-10T07:50:44.839 INFO:tasks.workunit.client.0.vm05.stdout:0/171: truncate d8/dd/f22 1435735 0 2026-03-10T07:50:44.848 INFO:tasks.workunit.client.0.vm05.stdout:0/172: dread - d8/dd/d10/d26/f31 zero size 2026-03-10T07:50:44.849 INFO:tasks.workunit.client.0.vm05.stdout:2/290: link d0/d47/d49/c5f d0/d52/c62 0 2026-03-10T07:50:44.849 INFO:tasks.workunit.client.0.vm05.stdout:2/291: write d0/d47/f48 [803769,69762] 0 2026-03-10T07:50:44.849 INFO:tasks.workunit.client.0.vm05.stdout:1/200: stat c9 0 2026-03-10T07:50:44.849 INFO:tasks.workunit.client.0.vm05.stdout:1/201: chown da/dd/d27 25008161 1 2026-03-10T07:50:44.849 INFO:tasks.workunit.client.0.vm05.stdout:9/165: rename d8/d16 to d8/d35 0 2026-03-10T07:50:44.849 
INFO:tasks.workunit.client.0.vm05.stdout:9/166: dread - d8/d35/d1c/d20/f32 zero size 2026-03-10T07:50:44.849 INFO:tasks.workunit.client.0.vm05.stdout:8/196: creat d1/dd/d18/f38 x:0 0 0 2026-03-10T07:50:44.851 INFO:tasks.workunit.client.0.vm05.stdout:7/221: rename d1/d6/l30 to d1/d6/d3b/l45 0 2026-03-10T07:50:44.853 INFO:tasks.workunit.client.0.vm05.stdout:3/207: sync 2026-03-10T07:50:44.854 INFO:tasks.workunit.client.0.vm05.stdout:8/197: sync 2026-03-10T07:50:44.854 INFO:tasks.workunit.client.0.vm05.stdout:8/198: chown d1/f15 2 1 2026-03-10T07:50:44.854 INFO:tasks.workunit.client.0.vm05.stdout:3/208: chown d8/d1f/c30 417 1 2026-03-10T07:50:44.866 INFO:tasks.workunit.client.0.vm05.stdout:8/199: dread d1/dd/d18/f1f [0,4194304] 0 2026-03-10T07:50:44.869 INFO:tasks.workunit.client.0.vm05.stdout:2/292: unlink d0/d8/d43/df/d4e/f61 0 2026-03-10T07:50:44.872 INFO:tasks.workunit.client.0.vm05.stdout:2/293: write d0/f5 [1984887,40193] 0 2026-03-10T07:50:44.878 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:50:44 vm05.local ceph-mon[50387]: Upgrade: Updating node-exporter.vm05 (1/2) 2026-03-10T07:50:44.878 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:50:44 vm05.local ceph-mon[50387]: Deploying daemon node-exporter.vm05 on vm05 2026-03-10T07:50:44.878 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:50:44 vm05.local ceph-mon[50387]: Standby manager daemon vm08.orfpog restarted 2026-03-10T07:50:44.878 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:50:44 vm05.local ceph-mon[50387]: Standby manager daemon vm08.orfpog started 2026-03-10T07:50:44.878 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:50:44 vm05.local ceph-mon[50387]: from='mgr.? 192.168.123.108:0/2523620950' entity='mgr.vm08.orfpog' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/vm08.orfpog/crt"}]: dispatch 2026-03-10T07:50:44.878 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:50:44 vm05.local ceph-mon[50387]: from='mgr.? 
192.168.123.108:0/2523620950' entity='mgr.vm08.orfpog' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/crt"}]: dispatch 2026-03-10T07:50:44.878 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:50:44 vm05.local ceph-mon[50387]: from='mgr.? 192.168.123.108:0/2523620950' entity='mgr.vm08.orfpog' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/vm08.orfpog/key"}]: dispatch 2026-03-10T07:50:44.878 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:50:44 vm05.local ceph-mon[50387]: from='mgr.? 192.168.123.108:0/2523620950' entity='mgr.vm08.orfpog' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/key"}]: dispatch 2026-03-10T07:50:44.882 INFO:tasks.workunit.client.0.vm05.stdout:2/294: dwrite d0/f7 [0,4194304] 0 2026-03-10T07:50:44.889 INFO:tasks.workunit.client.0.vm05.stdout:1/202: rmdir da/d26 39 2026-03-10T07:50:44.890 INFO:tasks.workunit.client.0.vm05.stdout:1/203: chown da/dd/c1e 14 1 2026-03-10T07:50:44.891 INFO:tasks.workunit.client.0.vm05.stdout:6/184: rename d0/f28 to d0/f3a 0 2026-03-10T07:50:44.892 INFO:tasks.workunit.client.0.vm05.stdout:7/222: rmdir d1/d34 39 2026-03-10T07:50:44.893 INFO:tasks.workunit.client.0.vm05.stdout:0/173: link d8/f9 d8/dd/d37/f38 0 2026-03-10T07:50:44.893 INFO:tasks.workunit.client.0.vm05.stdout:3/209: creat d8/d1f/d2a/d34/f3f x:0 0 0 2026-03-10T07:50:44.895 INFO:tasks.workunit.client.0.vm05.stdout:5/197: dwrite d2/d5/fa [0,4194304] 0 2026-03-10T07:50:44.897 INFO:tasks.workunit.client.0.vm05.stdout:3/210: read d8/d16/d19/f21 [395859,102685] 0 2026-03-10T07:50:44.897 INFO:tasks.workunit.client.0.vm05.stdout:8/200: creat d1/dd/d18/d20/d2a/d34/f39 x:0 0 0 2026-03-10T07:50:44.898 INFO:tasks.workunit.client.0.vm05.stdout:2/295: symlink d0/d8/d3d/l63 0 2026-03-10T07:50:44.898 INFO:tasks.workunit.client.0.vm05.stdout:2/296: chown d0/d8/d43/d38/f56 16510824 1 2026-03-10T07:50:44.904 INFO:tasks.workunit.client.0.vm05.stdout:2/297: write d0/d47/f4c [519887,110805] 0 2026-03-10T07:50:44.908 
INFO:tasks.workunit.client.0.vm05.stdout:4/214: getdents d0/d6/d37 0 2026-03-10T07:50:44.918 INFO:tasks.workunit.client.0.vm05.stdout:4/215: chown d0/d3b 19554 1 2026-03-10T07:50:44.918 INFO:tasks.workunit.client.0.vm05.stdout:4/216: dwrite d0/d20/d26/f40 [0,4194304] 0 2026-03-10T07:50:44.919 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:50:44 vm08.local ceph-mon[59917]: Upgrade: Updating node-exporter.vm05 (1/2) 2026-03-10T07:50:44.919 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:50:44 vm08.local ceph-mon[59917]: Deploying daemon node-exporter.vm05 on vm05 2026-03-10T07:50:44.919 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:50:44 vm08.local ceph-mon[59917]: Standby manager daemon vm08.orfpog restarted 2026-03-10T07:50:44.919 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:50:44 vm08.local ceph-mon[59917]: Standby manager daemon vm08.orfpog started 2026-03-10T07:50:44.919 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:50:44 vm08.local ceph-mon[59917]: from='mgr.? 192.168.123.108:0/2523620950' entity='mgr.vm08.orfpog' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/vm08.orfpog/crt"}]: dispatch 2026-03-10T07:50:44.919 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:50:44 vm08.local ceph-mon[59917]: from='mgr.? 192.168.123.108:0/2523620950' entity='mgr.vm08.orfpog' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/crt"}]: dispatch 2026-03-10T07:50:44.919 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:50:44 vm08.local ceph-mon[59917]: from='mgr.? 192.168.123.108:0/2523620950' entity='mgr.vm08.orfpog' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/vm08.orfpog/key"}]: dispatch 2026-03-10T07:50:44.919 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:50:44 vm08.local ceph-mon[59917]: from='mgr.? 
192.168.123.108:0/2523620950' entity='mgr.vm08.orfpog' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/key"}]: dispatch 2026-03-10T07:50:44.922 INFO:tasks.workunit.client.0.vm05.stdout:9/167: mkdir d8/d35/d1c/d36 0 2026-03-10T07:50:44.925 INFO:tasks.workunit.client.0.vm05.stdout:3/211: sync 2026-03-10T07:50:44.930 INFO:tasks.workunit.client.0.vm05.stdout:9/168: dread f7 [0,4194304] 0 2026-03-10T07:50:44.939 INFO:tasks.workunit.client.0.vm05.stdout:5/198: unlink d2/d12/f14 0 2026-03-10T07:50:44.944 INFO:tasks.workunit.client.0.vm05.stdout:1/204: mknod da/dd/d27/c32 0 2026-03-10T07:50:44.951 INFO:tasks.workunit.client.0.vm05.stdout:9/169: symlink d8/d35/d1c/d20/l37 0 2026-03-10T07:50:44.954 INFO:tasks.workunit.client.0.vm05.stdout:0/174: mkdir d8/dd/d37/d39 0 2026-03-10T07:50:44.958 INFO:tasks.workunit.client.0.vm05.stdout:8/201: link d1/dd/f25 d1/dd/d18/d20/d2a/f3a 0 2026-03-10T07:50:44.964 INFO:tasks.workunit.client.0.vm05.stdout:6/185: rmdir d0 39 2026-03-10T07:50:44.972 INFO:tasks.workunit.client.0.vm05.stdout:5/199: fdatasync d2/d5/f1e 0 2026-03-10T07:50:44.974 INFO:tasks.workunit.client.0.vm05.stdout:4/217: dwrite d0/f2 [4194304,4194304] 0 2026-03-10T07:50:44.978 INFO:tasks.workunit.client.0.vm05.stdout:7/223: creat d1/f46 x:0 0 0 2026-03-10T07:50:44.978 INFO:tasks.workunit.client.0.vm05.stdout:3/212: mknod d8/d16/d19/d37/c40 0 2026-03-10T07:50:44.979 INFO:tasks.workunit.client.0.vm05.stdout:9/170: mkdir d8/d35/d38 0 2026-03-10T07:50:44.979 INFO:tasks.workunit.client.0.vm05.stdout:9/171: readlink d8/d35/l18 0 2026-03-10T07:50:44.980 INFO:tasks.workunit.client.0.vm05.stdout:9/172: fdatasync d8/d35/d22/f2b 0 2026-03-10T07:50:44.981 INFO:tasks.workunit.client.0.vm05.stdout:9/173: truncate d8/f9 4756195 0 2026-03-10T07:50:44.982 INFO:tasks.workunit.client.0.vm05.stdout:7/224: dwrite d1/f27 [4194304,4194304] 0 2026-03-10T07:50:44.982 INFO:tasks.workunit.client.0.vm05.stdout:7/225: truncate d1/f37 173727 0 2026-03-10T07:50:44.988 
INFO:tasks.workunit.client.0.vm05.stdout:0/175: mkdir d8/dd/d10/d26/d3a 0 2026-03-10T07:50:44.989 INFO:tasks.workunit.client.0.vm05.stdout:2/298: link d0/d8/d43/c27 d0/d8/c64 0 2026-03-10T07:50:45.008 INFO:tasks.workunit.client.0.vm05.stdout:5/200: mknod d2/d20/d33/c44 0 2026-03-10T07:50:45.014 INFO:tasks.workunit.client.0.vm05.stdout:3/213: mkdir d8/d1f/d24/d41 0 2026-03-10T07:50:45.014 INFO:tasks.workunit.client.0.vm05.stdout:3/214: chown d8/d22/f3a 47147162 1 2026-03-10T07:50:45.017 INFO:tasks.workunit.client.0.vm05.stdout:3/215: dwrite d8/d16/f1a [0,4194304] 0 2026-03-10T07:50:45.018 INFO:tasks.workunit.client.0.vm05.stdout:7/226: mkdir d1/d6/d47 0 2026-03-10T07:50:45.020 INFO:tasks.workunit.client.0.vm05.stdout:2/299: creat d0/d8/f65 x:0 0 0 2026-03-10T07:50:45.021 INFO:tasks.workunit.client.0.vm05.stdout:6/186: mkdir d0/d6/d3b 0 2026-03-10T07:50:45.021 INFO:tasks.workunit.client.0.vm05.stdout:6/187: read d0/d6/f16 [1386919,122500] 0 2026-03-10T07:50:45.042 INFO:tasks.workunit.client.0.vm05.stdout:1/205: dwrite da/d26/f2d [0,4194304] 0 2026-03-10T07:50:45.046 INFO:tasks.workunit.client.0.vm05.stdout:1/206: dread da/dd/d12/f16 [0,4194304] 0 2026-03-10T07:50:45.049 INFO:tasks.workunit.client.0.vm05.stdout:1/207: stat da/fb 0 2026-03-10T07:50:45.050 INFO:tasks.workunit.client.0.vm05.stdout:9/174: creat d8/d35/d1c/d36/f39 x:0 0 0 2026-03-10T07:50:45.051 INFO:tasks.workunit.client.0.vm05.stdout:9/175: write d8/d35/f1f [4663440,13716] 0 2026-03-10T07:50:45.053 INFO:tasks.workunit.client.0.vm05.stdout:5/201: dwrite d2/d20/f2a [0,4194304] 0 2026-03-10T07:50:45.063 INFO:tasks.workunit.client.0.vm05.stdout:3/216: creat d8/d1f/d2a/f42 x:0 0 0 2026-03-10T07:50:45.071 INFO:tasks.workunit.client.0.vm05.stdout:6/188: mknod d0/d11/d31/c3c 0 2026-03-10T07:50:45.075 INFO:tasks.workunit.client.0.vm05.stdout:4/218: creat d0/d6/d9/f49 x:0 0 0 2026-03-10T07:50:45.085 INFO:tasks.workunit.client.0.vm05.stdout:8/202: getdents d1/dd/d18/d20 0 2026-03-10T07:50:45.088 
INFO:tasks.workunit.client.0.vm05.stdout:5/202: dread - d2/f1a zero size
2026-03-10T07:50:45.088 INFO:tasks.workunit.client.0.vm05.stdout:7/227: readlink d1/d34/l3f 0
2026-03-10T07:50:45.090 INFO:tasks.workunit.client.0.vm05.stdout:3/217: truncate d8/f3b 1943230 0
2026-03-10T07:50:45.091 INFO:tasks.workunit.client.0.vm05.stdout:6/189: truncate d0/d6/f2c 4517731 0
2026-03-10T07:50:45.095 INFO:tasks.workunit.client.0.vm05.stdout:4/219: creat d0/d3b/f4a x:0 0 0
2026-03-10T07:50:45.096 INFO:tasks.workunit.client.0.vm05.stdout:4/220: fdatasync d0/d6/d9/d12/f35 0
2026-03-10T07:50:45.099 INFO:tasks.workunit.client.0.vm05.stdout:4/221: dwrite d0/d6/f39 [0,4194304] 0
2026-03-10T07:50:45.106 INFO:tasks.workunit.client.0.vm05.stdout:1/208: dwrite da/dd/d12/f22 [0,4194304] 0
2026-03-10T07:50:45.106 INFO:tasks.workunit.client.0.vm05.stdout:1/209: fsync f4 0
2026-03-10T07:50:45.107 INFO:tasks.workunit.client.0.vm05.stdout:1/210: chown da/l23 21017 1
2026-03-10T07:50:45.112 INFO:tasks.workunit.client.0.vm05.stdout:5/203: dwrite d2/ff [0,4194304] 0
2026-03-10T07:50:45.114 INFO:tasks.workunit.client.0.vm05.stdout:5/204: fsync d2/d20/f32 0
2026-03-10T07:50:45.115 INFO:tasks.workunit.client.0.vm05.stdout:7/228: unlink d1/f25 0
2026-03-10T07:50:45.116 INFO:tasks.workunit.client.0.vm05.stdout:7/229: fsync d1/f3a 0
2026-03-10T07:50:45.119 INFO:tasks.workunit.client.0.vm05.stdout:0/176: creat d8/dd/de/f3b x:0 0 0
2026-03-10T07:50:45.138 INFO:tasks.workunit.client.0.vm05.stdout:6/190: rename d0/d6/l1b to d0/d11/d22/l3d 0
2026-03-10T07:50:45.138 INFO:tasks.workunit.client.0.vm05.stdout:7/230: rename d1/d6/d3b to d1/d6/d3b/d48 22
2026-03-10T07:50:45.140 INFO:tasks.workunit.client.0.vm05.stdout:8/203: symlink d1/dd/l3b 0
2026-03-10T07:50:45.150 INFO:tasks.workunit.client.0.vm05.stdout:4/222: mknod d0/d6/d37/d3c/c4b 0
2026-03-10T07:50:45.150 INFO:tasks.workunit.client.0.vm05.stdout:9/176: link d8/d35/d1c/l34 d8/d35/d22/l3a 0
2026-03-10T07:50:45.154 INFO:tasks.workunit.client.0.vm05.stdout:1/211: creat da/d26/f33 x:0 0 0
2026-03-10T07:50:45.154 INFO:tasks.workunit.client.0.vm05.stdout:1/212: fsync da/fb 0
2026-03-10T07:50:45.155 INFO:tasks.workunit.client.0.vm05.stdout:1/213: write da/d26/f33 [213623,62739] 0
2026-03-10T07:50:45.158 INFO:tasks.workunit.client.0.vm05.stdout:1/214: dwrite da/d26/f33 [0,4194304] 0
2026-03-10T07:50:45.162 INFO:tasks.workunit.client.0.vm05.stdout:5/205: creat d2/d20/d33/f45 x:0 0 0
2026-03-10T07:50:45.162 INFO:tasks.workunit.client.0.vm05.stdout:5/206: read - d2/d12/f40 zero size
2026-03-10T07:50:45.165 INFO:tasks.workunit.client.0.vm05.stdout:5/207: dwrite d2/d5/f25 [0,4194304] 0
2026-03-10T07:50:45.169 INFO:tasks.workunit.client.0.vm05.stdout:2/300: getdents d0/d8/d3d 0
2026-03-10T07:50:45.169 INFO:tasks.workunit.client.0.vm05.stdout:2/301: dread - d0/d8/f3b zero size
2026-03-10T07:50:45.171 INFO:tasks.workunit.client.0.vm05.stdout:8/204: rmdir d1/dd 39
2026-03-10T07:50:45.172 INFO:tasks.workunit.client.0.vm05.stdout:8/205: truncate d1/d23/f31 1677019 0
2026-03-10T07:50:45.174 INFO:tasks.workunit.client.0.vm05.stdout:4/223: creat d0/d3b/f4c x:0 0 0
2026-03-10T07:50:45.178 INFO:tasks.workunit.client.0.vm05.stdout:1/215: rmdir da/dd/d12 39
2026-03-10T07:50:45.189 INFO:tasks.workunit.client.0.vm05.stdout:2/302: unlink d0/c5d 0
2026-03-10T07:50:45.189 INFO:tasks.workunit.client.0.vm05.stdout:6/191: symlink d0/d6/d3b/l3e 0
2026-03-10T07:50:45.189 INFO:tasks.workunit.client.0.vm05.stdout:6/192: dread - d0/f26 zero size
2026-03-10T07:50:45.189 INFO:tasks.workunit.client.0.vm05.stdout:7/231: creat d1/f49 x:0 0 0
2026-03-10T07:50:45.189 INFO:tasks.workunit.client.0.vm05.stdout:7/232: chown d1/f3a 1652 1
2026-03-10T07:50:45.192 INFO:tasks.workunit.client.0.vm05.stdout:2/303: mkdir d0/d8/d66 0
2026-03-10T07:50:45.197 INFO:tasks.workunit.client.0.vm05.stdout:0/177: rename d8/fa to d8/dd/f3c 0
2026-03-10T07:50:45.208 INFO:tasks.workunit.client.0.vm05.stdout:1/216: mkdir da/dd/d12/d34 0
2026-03-10T07:50:45.210 INFO:tasks.workunit.client.0.vm05.stdout:2/304: unlink d0/d8/d43/df/l16 0
2026-03-10T07:50:45.215 INFO:tasks.workunit.client.0.vm05.stdout:0/178: dread d8/dd/d10/f19 [0,4194304] 0
2026-03-10T07:50:45.220 INFO:tasks.workunit.client.0.vm05.stdout:3/218: dwrite d8/f12 [0,4194304] 0
2026-03-10T07:50:45.221 INFO:tasks.workunit.client.0.vm05.stdout:6/193: mknod d0/c3f 0
2026-03-10T07:50:45.221 INFO:tasks.workunit.client.0.vm05.stdout:4/224: creat d0/d6/d9/f4d x:0 0 0
2026-03-10T07:50:45.222 INFO:tasks.workunit.client.0.vm05.stdout:9/177: getdents d8/d35/d1c/d36 0
2026-03-10T07:50:45.223 INFO:tasks.workunit.client.0.vm05.stdout:7/233: symlink d1/d6/d47/l4a 0
2026-03-10T07:50:45.224 INFO:tasks.workunit.client.0.vm05.stdout:7/234: write d1/d6/fb [198494,18785] 0
2026-03-10T07:50:45.225 INFO:tasks.workunit.client.0.vm05.stdout:5/208: fdatasync d2/d20/f2a 0
2026-03-10T07:50:45.228 INFO:tasks.workunit.client.0.vm05.stdout:1/217: mknod da/dd/d12/c35 0
2026-03-10T07:50:45.235 INFO:tasks.workunit.client.0.vm05.stdout:8/206: rename d1/dd/d18/d20/d2a/l37 to d1/dd/d18/d20/l3c 0
2026-03-10T07:50:45.242 INFO:tasks.workunit.client.0.vm05.stdout:0/179: creat d8/dd/d34/f3d x:0 0 0
2026-03-10T07:50:45.242 INFO:tasks.workunit.client.0.vm05.stdout:4/225: unlink d0/la 0
2026-03-10T07:50:45.243 INFO:tasks.workunit.client.0.vm05.stdout:6/194: symlink d0/d35/l40 0
2026-03-10T07:50:45.249 INFO:tasks.workunit.client.0.vm05.stdout:1/218: creat da/dd/d27/f36 x:0 0 0
2026-03-10T07:50:45.256 INFO:tasks.workunit.client.0.vm05.stdout:2/305: dwrite d0/d8/d43/df/d25/f29 [0,4194304] 0
2026-03-10T07:50:45.259 INFO:tasks.workunit.client.0.vm05.stdout:2/306: dwrite d0/d8/f42 [0,4194304] 0
2026-03-10T07:50:45.259 INFO:tasks.workunit.client.0.vm05.stdout:3/219: rename d8/d1c/f3d to d8/d16/d19/d37/f43 0
2026-03-10T07:50:45.260 INFO:tasks.workunit.client.0.vm05.stdout:2/307: chown d0/d8/f1c 1816797289 1
2026-03-10T07:50:45.260 INFO:tasks.workunit.client.0.vm05.stdout:2/308: fsync d0/d8/f3b 0
2026-03-10T07:50:45.267 INFO:tasks.workunit.client.0.vm05.stdout:0/180: creat d8/dd/d34/f3e x:0 0 0
2026-03-10T07:50:45.269 INFO:tasks.workunit.client.0.vm05.stdout:4/226: mkdir d0/d17/d4e 0
2026-03-10T07:50:45.271 INFO:tasks.workunit.client.0.vm05.stdout:6/195: fsync d0/f3a 0
2026-03-10T07:50:45.277 INFO:tasks.workunit.client.0.vm05.stdout:8/207: mknod d1/c3d 0
2026-03-10T07:50:45.298 INFO:tasks.workunit.client.0.vm05.stdout:5/209: rename d2/d20/f21 to d2/d5/f46 0
2026-03-10T07:50:45.305 INFO:tasks.workunit.client.0.vm05.stdout:5/210: dwrite d2/d5/f3d [0,4194304] 0
2026-03-10T07:50:45.306 INFO:tasks.workunit.client.0.vm05.stdout:2/309: rmdir d0/d52 39
2026-03-10T07:50:45.306 INFO:tasks.workunit.client.0.vm05.stdout:5/211: write d2/d5/f10 [4432430,42114] 0
2026-03-10T07:50:45.306 INFO:tasks.workunit.client.0.vm05.stdout:9/178: creat d8/d35/d1c/f3b x:0 0 0
2026-03-10T07:50:45.306 INFO:tasks.workunit.client.0.vm05.stdout:9/179: chown d8 11 1
2026-03-10T07:50:45.306 INFO:tasks.workunit.client.0.vm05.stdout:1/219: rename da/l15 to da/dd/d12/d34/l37 0
2026-03-10T07:50:45.310 INFO:tasks.workunit.client.0.vm05.stdout:0/181: fsync d8/f2d 0
2026-03-10T07:50:45.311 INFO:tasks.workunit.client.0.vm05.stdout:9/180: write d8/d35/d1c/d2c/f2e [1034087,14733] 0
2026-03-10T07:50:45.312 INFO:tasks.workunit.client.0.vm05.stdout:9/181: write d8/d35/d1c/d26/d28/f29 [332081,4927] 0
2026-03-10T07:50:45.313 INFO:tasks.workunit.client.0.vm05.stdout:9/182: chown d8/d35/d1c/d26/c2f 1 1
2026-03-10T07:50:45.314 INFO:tasks.workunit.client.0.vm05.stdout:1/220: creat da/dd/d12/d34/f38 x:0 0 0
2026-03-10T07:50:45.320 INFO:tasks.workunit.client.0.vm05.stdout:9/183: mkdir d8/d35/d3c 0
2026-03-10T07:50:45.329 INFO:tasks.workunit.client.0.vm05.stdout:9/184: write d8/d35/d22/f2b [434400,78067] 0
2026-03-10T07:50:45.329 INFO:tasks.workunit.client.0.vm05.stdout:9/185: write d8/f15 [3610036,32910] 0
2026-03-10T07:50:45.329 INFO:tasks.workunit.client.0.vm05.stdout:9/186: write d8/fa [527659,105355] 0
2026-03-10T07:50:45.329 INFO:tasks.workunit.client.0.vm05.stdout:9/187: dread - d8/d35/f25 zero size
2026-03-10T07:50:45.329 INFO:tasks.workunit.client.0.vm05.stdout:1/221: rename da/dd/d12/d19/l1d to da/dd/d12/l39 0
2026-03-10T07:50:45.329 INFO:tasks.workunit.client.0.vm05.stdout:2/310: link d0/d8/f3b d0/d52/f67 0
2026-03-10T07:50:45.329 INFO:tasks.workunit.client.0.vm05.stdout:2/311: write d0/d8/d43/d38/f56 [122442,129873] 0
2026-03-10T07:50:45.331 INFO:tasks.workunit.client.0.vm05.stdout:9/188: creat d8/d35/d1c/d26/f3d x:0 0 0
2026-03-10T07:50:45.336 INFO:tasks.workunit.client.0.vm05.stdout:9/189: stat d8/d35/d38 0
2026-03-10T07:50:45.337 INFO:tasks.workunit.client.0.vm05.stdout:2/312: dwrite d0/d52/f67 [0,4194304] 0
2026-03-10T07:50:45.339 INFO:tasks.workunit.client.0.vm05.stdout:9/190: mknod d8/d35/d1c/d20/c3e 0
2026-03-10T07:50:45.341 INFO:tasks.workunit.client.0.vm05.stdout:1/222: creat da/f3a x:0 0 0
2026-03-10T07:50:45.343 INFO:tasks.workunit.client.0.vm05.stdout:2/313: readlink d0/d2a/d2f/l3f 0
2026-03-10T07:50:45.343 INFO:tasks.workunit.client.0.vm05.stdout:9/191: creat d8/d35/d22/f3f x:0 0 0
2026-03-10T07:50:45.344 INFO:tasks.workunit.client.0.vm05.stdout:1/223: creat da/dd/d12/d19/f3b x:0 0 0
2026-03-10T07:50:45.344 INFO:tasks.workunit.client.0.vm05.stdout:9/192: chown d8/f1b 281 1
2026-03-10T07:50:45.344 INFO:tasks.workunit.client.0.vm05.stdout:1/224: chown da/dd/d12/f16 6126751 1
2026-03-10T07:50:45.346 INFO:tasks.workunit.client.0.vm05.stdout:9/193: write d8/d35/d1c/d26/d28/f29 [83916,69422] 0
2026-03-10T07:50:45.348 INFO:tasks.workunit.client.0.vm05.stdout:1/225: dwrite da/dd/d12/d19/f3b [0,4194304] 0
2026-03-10T07:50:45.366 INFO:tasks.workunit.client.0.vm05.stdout:2/314: rmdir d0/d47/d49 39
2026-03-10T07:50:45.367 INFO:tasks.workunit.client.0.vm05.stdout:8/208: sync
2026-03-10T07:50:45.367 INFO:tasks.workunit.client.0.vm05.stdout:9/194: creat d8/d35/d1c/f40 x:0 0 0
2026-03-10T07:50:45.372 INFO:tasks.workunit.client.0.vm05.stdout:8/209: dwrite d1/dd/d18/f29 [0,4194304] 0
2026-03-10T07:50:45.373 INFO:tasks.workunit.client.0.vm05.stdout:8/210: write d1/fa [3308354,117440] 0
2026-03-10T07:50:45.374 INFO:tasks.workunit.client.0.vm05.stdout:2/315: dwrite d0/d8/d43/df/f20 [4194304,4194304] 0
2026-03-10T07:50:45.390 INFO:tasks.workunit.client.0.vm05.stdout:0/182: sync
2026-03-10T07:50:45.390 INFO:tasks.workunit.client.0.vm05.stdout:1/226: sync
2026-03-10T07:50:45.390 INFO:tasks.workunit.client.0.vm05.stdout:0/183: chown f4 526096 1
2026-03-10T07:50:45.394 INFO:tasks.workunit.client.0.vm05.stdout:9/195: rename d8/f1b to d8/d35/d22/d33/f41 0
2026-03-10T07:50:45.394 INFO:tasks.workunit.client.0.vm05.stdout:9/196: write d8/fa [3754162,65071] 0
2026-03-10T07:50:45.395 INFO:tasks.workunit.client.0.vm05.stdout:9/197: chown d8/d35/d1c/d36 379 1
2026-03-10T07:50:45.397 INFO:tasks.workunit.client.0.vm05.stdout:8/211: symlink d1/dd/d18/d20/l3e 0
2026-03-10T07:50:45.398 INFO:tasks.workunit.client.0.vm05.stdout:8/212: write d1/dd/d18/f22 [833760,4482] 0
2026-03-10T07:50:45.404 INFO:tasks.workunit.client.0.vm05.stdout:0/184: dread d8/dd/f3c [0,4194304] 0
2026-03-10T07:50:45.404 INFO:tasks.workunit.client.0.vm05.stdout:0/185: rename d8/dd/d10 to d8/dd/d10/d26/d3f 22
2026-03-10T07:50:45.411 INFO:tasks.workunit.client.0.vm05.stdout:7/235: truncate d1/f27 1239880 0
2026-03-10T07:50:45.413 INFO:tasks.workunit.client.0.vm05.stdout:8/213: dread d1/dd/d18/d20/d2a/f3a [0,4194304] 0
2026-03-10T07:50:45.416 INFO:tasks.workunit.client.0.vm05.stdout:8/214: dwrite d1/dd/d18/f1f [4194304,4194304] 0
2026-03-10T07:50:45.424 INFO:tasks.workunit.client.0.vm05.stdout:3/220: dwrite d8/f18 [0,4194304] 0
2026-03-10T07:50:45.429 INFO:tasks.workunit.client.0.vm05.stdout:1/227: rename da/dd/d12/d19/c21 to da/dd/d12/d19/c3c 0
2026-03-10T07:50:45.431 INFO:tasks.workunit.client.0.vm05.stdout:4/227: dwrite d0/d6/d37/f3d [0,4194304] 0
2026-03-10T07:50:45.442 INFO:tasks.workunit.client.0.vm05.stdout:2/316: creat d0/d8/d66/f68 x:0 0 0
2026-03-10T07:50:45.449 INFO:tasks.workunit.client.0.vm05.stdout:0/186: fdatasync f4 0
2026-03-10T07:50:45.451 INFO:tasks.workunit.client.0.vm05.stdout:0/187: dread d8/dd/f3c [0,4194304] 0
2026-03-10T07:50:45.452 INFO:tasks.workunit.client.0.vm05.stdout:0/188: read - d8/dd/d10/d26/f31 zero size
2026-03-10T07:50:45.453 INFO:tasks.workunit.client.0.vm05.stdout:6/196: write d0/f29 [3220728,95837] 0
2026-03-10T07:50:45.454 INFO:tasks.workunit.client.0.vm05.stdout:6/197: fsync d0/f15 0
2026-03-10T07:50:45.454 INFO:tasks.workunit.client.0.vm05.stdout:6/198: chown d0/d6/l17 873082 1
2026-03-10T07:50:45.455 INFO:tasks.workunit.client.0.vm05.stdout:8/215: creat d1/dd/d18/f3f x:0 0 0
2026-03-10T07:50:45.462 INFO:tasks.workunit.client.0.vm05.stdout:2/317: rename d0/d52/f67 to d0/d8/d43/df/d53/f69 0
2026-03-10T07:50:45.466 INFO:tasks.workunit.client.0.vm05.stdout:8/216: mknod d1/dd/d18/d20/d2a/c40 0
2026-03-10T07:50:45.472 INFO:tasks.workunit.client.0.vm05.stdout:8/217: dread d1/dd/f11 [0,4194304] 0
2026-03-10T07:50:45.472 INFO:tasks.workunit.client.0.vm05.stdout:8/218: chown d1/f15 43 1
2026-03-10T07:50:45.473 INFO:tasks.workunit.client.0.vm05.stdout:3/221: creat d8/d1f/d2a/d33/f44 x:0 0 0
2026-03-10T07:50:45.473 INFO:tasks.workunit.client.0.vm05.stdout:1/228: fsync da/dd/d12/d19/f1a 0
2026-03-10T07:50:45.474 INFO:tasks.workunit.client.0.vm05.stdout:3/222: write d8/ff [1913298,105071] 0
2026-03-10T07:50:45.477 INFO:tasks.workunit.client.0.vm05.stdout:2/318: symlink d0/d8/d43/d38/l6a 0
2026-03-10T07:50:45.479 INFO:tasks.workunit.client.0.vm05.stdout:3/223: dwrite d8/d16/f1a [0,4194304] 0
2026-03-10T07:50:45.494 INFO:tasks.workunit.client.0.vm05.stdout:2/319: fsync d0/d8/d43/df/f20 0
2026-03-10T07:50:45.501 INFO:tasks.workunit.client.0.vm05.stdout:9/198: getdents d8/d35 0
2026-03-10T07:50:45.504 INFO:tasks.workunit.client.0.vm05.stdout:1/229: creat da/dd/d12/f3d x:0 0 0
2026-03-10T07:50:45.514 INFO:tasks.workunit.client.0.vm05.stdout:5/212: dwrite d2/d5/f46 [0,4194304] 0
2026-03-10T07:50:45.515 INFO:tasks.workunit.client.0.vm05.stdout:5/213: fdatasync d2/d20/d33/f45 0
2026-03-10T07:50:45.525 INFO:tasks.workunit.client.0.vm05.stdout:0/189: creat d8/dd/f40 x:0 0 0
2026-03-10T07:50:45.528 INFO:tasks.workunit.client.0.vm05.stdout:7/236: truncate d1/d6/f1d 2238645 0
2026-03-10T07:50:45.533 INFO:tasks.workunit.client.0.vm05.stdout:7/237: read d1/d6/f41 [3450320,124052] 0
2026-03-10T07:50:45.533 INFO:tasks.workunit.client.0.vm05.stdout:7/238: dwrite d1/d6/f32 [0,4194304] 0
2026-03-10T07:50:45.533 INFO:tasks.workunit.client.0.vm05.stdout:7/239: chown d1/l18 1 1
2026-03-10T07:50:45.537 INFO:tasks.workunit.client.0.vm05.stdout:3/224: mkdir d8/d1f/d24/d45 0
2026-03-10T07:50:45.543 INFO:tasks.workunit.client.0.vm05.stdout:2/320: stat d0/c59 0
2026-03-10T07:50:45.543 INFO:tasks.workunit.client.0.vm05.stdout:2/321: read - d0/d8/d43/f5e zero size
2026-03-10T07:50:45.547 INFO:tasks.workunit.client.0.vm05.stdout:2/322: dwrite d0/d2a/f45 [0,4194304] 0
2026-03-10T07:50:45.551 INFO:tasks.workunit.client.0.vm05.stdout:2/323: read d0/f41 [277030,11359] 0
2026-03-10T07:50:45.553 INFO:tasks.workunit.client.0.vm05.stdout:2/324: dread d0/d8/f3b [0,4194304] 0
2026-03-10T07:50:45.553 INFO:tasks.workunit.client.0.vm05.stdout:2/325: chown d0/d8/d43/d38/f56 216 1
2026-03-10T07:50:45.556 INFO:tasks.workunit.client.0.vm05.stdout:9/199: symlink d8/d35/l42 0
2026-03-10T07:50:45.559 INFO:tasks.workunit.client.0.vm05.stdout:1/230: mknod da/dd/d12/d34/c3e 0
2026-03-10T07:50:45.559 INFO:tasks.workunit.client.0.vm05.stdout:1/231: chown da/dd/d2a 32 1
2026-03-10T07:50:45.559 INFO:tasks.workunit.client.0.vm05.stdout:1/232: readlink da/dd/d12/d19/l29 0
2026-03-10T07:50:45.561 INFO:tasks.workunit.client.0.vm05.stdout:6/199: write d0/fa [721122,59234] 0
2026-03-10T07:50:45.565 INFO:tasks.workunit.client.0.vm05.stdout:2/326: dread d0/d47/f48 [0,4194304] 0
2026-03-10T07:50:45.581 INFO:tasks.workunit.client.0.vm05.stdout:4/228: chown d0/d6/d37/d3c 1909 1
2026-03-10T07:50:45.583 INFO:tasks.workunit.client.0.vm05.stdout:4/229: dwrite d0/d17/f2c [0,4194304] 0
2026-03-10T07:50:45.586 INFO:tasks.workunit.client.0.vm05.stdout:7/240: mkdir d1/d3c/d4b 0
2026-03-10T07:50:45.590 INFO:tasks.workunit.client.0.vm05.stdout:9/200: rmdir d8/d35 39
2026-03-10T07:50:45.591 INFO:tasks.workunit.client.0.vm05.stdout:6/200: creat d0/d35/f41 x:0 0 0
2026-03-10T07:50:45.594 INFO:tasks.workunit.client.0.vm05.stdout:6/201: dwrite d0/f23 [4194304,4194304] 0
2026-03-10T07:50:45.605 INFO:tasks.workunit.client.0.vm05.stdout:6/202: mknod d0/d11/d2e/c42 0
2026-03-10T07:50:45.614 INFO:tasks.workunit.client.0.vm05.stdout:9/201: link f7 d8/d35/d1c/d26/d28/f43 0
2026-03-10T07:50:45.621 INFO:tasks.workunit.client.0.vm05.stdout:9/202: dread d8/d35/f23 [0,4194304] 0
2026-03-10T07:50:45.621 INFO:tasks.workunit.client.0.vm05.stdout:9/203: dread - d8/d35/d1c/d36/f39 zero size
2026-03-10T07:50:45.621 INFO:tasks.workunit.client.0.vm05.stdout:9/204: dwrite d8/d35/d1c/d36/f39 [0,4194304] 0
2026-03-10T07:50:45.626 INFO:tasks.workunit.client.0.vm05.stdout:9/205: stat d8/f12 0
2026-03-10T07:50:45.630 INFO:tasks.workunit.client.0.vm05.stdout:9/206: creat d8/d35/d3c/f44 x:0 0 0
2026-03-10T07:50:45.647 INFO:tasks.workunit.client.0.vm05.stdout:9/207: dread d8/d35/d1c/d2c/f2e [0,4194304] 0
2026-03-10T07:50:45.664 INFO:tasks.workunit.client.0.vm05.stdout:7/241: write d1/d6/f32 [4298421,84942] 0
2026-03-10T07:50:45.665 INFO:tasks.workunit.client.0.vm05.stdout:5/214: rmdir d2 39
2026-03-10T07:50:45.667 INFO:tasks.workunit.client.0.vm05.stdout:1/233: mknod da/dd/c3f 0
2026-03-10T07:50:45.671 INFO:tasks.workunit.client.0.vm05.stdout:5/215: unlink d2/d20/f2c 0
2026-03-10T07:50:45.673 INFO:tasks.workunit.client.0.vm05.stdout:1/234: unlink da/dd/f13 0
2026-03-10T07:50:45.675 INFO:tasks.workunit.client.0.vm05.stdout:5/216: dwrite d2/d12/d2d/f39 [0,4194304] 0
2026-03-10T07:50:45.675 INFO:tasks.workunit.client.0.vm05.stdout:5/217: read - d2/f3f zero size
2026-03-10T07:50:45.676 INFO:tasks.workunit.client.0.vm05.stdout:5/218: chown d2/d5 160 1
2026-03-10T07:50:45.677 INFO:tasks.workunit.client.0.vm05.stdout:0/190: rmdir d8 39
2026-03-10T07:50:45.677 INFO:tasks.workunit.client.0.vm05.stdout:6/203: fsync d0/d35/f41 0
2026-03-10T07:50:45.682 INFO:tasks.workunit.client.0.vm05.stdout:7/242: getdents d1/d3c 0
2026-03-10T07:50:45.685 INFO:tasks.workunit.client.0.vm05.stdout:2/327: dwrite d0/f6 [4194304,4194304] 0
2026-03-10T07:50:45.686 INFO:tasks.workunit.client.0.vm05.stdout:2/328: truncate d0/d8/d3d/f40 4227334 0
2026-03-10T07:50:45.687 INFO:tasks.workunit.client.0.vm05.stdout:2/329: dread - d0/d8/f65 zero size
2026-03-10T07:50:45.687 INFO:tasks.workunit.client.0.vm05.stdout:7/243: dwrite d1/f16 [0,4194304] 0
2026-03-10T07:50:45.687 INFO:tasks.workunit.client.0.vm05.stdout:2/330: dread - d0/d2a/f2e zero size
2026-03-10T07:50:45.696 INFO:tasks.workunit.client.0.vm05.stdout:6/204: mkdir d0/d35/d36/d43 0
2026-03-10T07:50:45.696 INFO:tasks.workunit.client.0.vm05.stdout:5/219: creat d2/d20/d33/f47 x:0 0 0
2026-03-10T07:50:45.697 INFO:tasks.workunit.client.0.vm05.stdout:2/331: dread d0/d47/f4c [0,4194304] 0
2026-03-10T07:50:45.698 INFO:tasks.workunit.client.0.vm05.stdout:2/332: chown d0/f7 12904 1
2026-03-10T07:50:45.700 INFO:tasks.workunit.client.0.vm05.stdout:0/191: dread d8/dd/f29 [0,4194304] 0
2026-03-10T07:50:45.704 INFO:tasks.workunit.client.0.vm05.stdout:2/333: dwrite d0/d8/d3d/f40 [4194304,4194304] 0
2026-03-10T07:50:45.711 INFO:tasks.workunit.client.0.vm05.stdout:3/225: rmdir d8/d1f 39
2026-03-10T07:50:45.716 INFO:tasks.workunit.client.0.vm05.stdout:4/230: dread - d0/d6/d9/d12/f36 zero size
2026-03-10T07:50:45.717 INFO:tasks.workunit.client.0.vm05.stdout:5/220: symlink d2/d12/d2d/l48 0
2026-03-10T07:50:45.718 INFO:tasks.workunit.client.0.vm05.stdout:9/208: write f6 [1390846,121036] 0
2026-03-10T07:50:45.719 INFO:tasks.workunit.client.0.vm05.stdout:9/209: chown d8/d35/d1c/d26/c2f 1 1
2026-03-10T07:50:45.736 INFO:tasks.workunit.client.0.vm05.stdout:6/205: creat d0/d6/f44 x:0 0 0
2026-03-10T07:50:45.737 INFO:tasks.workunit.client.0.vm05.stdout:0/192: mknod d8/dd/d10/d26/d2a/c41 0
2026-03-10T07:50:45.737 INFO:tasks.workunit.client.0.vm05.stdout:3/226: fsync d8/d1f/d2a/d33/f44 0
2026-03-10T07:50:45.737 INFO:tasks.workunit.client.0.vm05.stdout:7/244: symlink d1/d34/l4c 0
2026-03-10T07:50:45.737 INFO:tasks.workunit.client.0.vm05.stdout:4/231: mkdir d0/d6/d9/d12/d4f 0
2026-03-10T07:50:45.737 INFO:tasks.workunit.client.0.vm05.stdout:5/221: symlink d2/d5/l49 0
2026-03-10T07:50:45.737 INFO:tasks.workunit.client.0.vm05.stdout:9/210: mknod d8/d35/d1c/d36/c45 0
2026-03-10T07:50:45.737 INFO:tasks.workunit.client.0.vm05.stdout:0/193: mknod d8/dd/d10/d26/c42 0
2026-03-10T07:50:45.737 INFO:tasks.workunit.client.0.vm05.stdout:3/227: creat d8/d1f/d2a/d34/f46 x:0 0 0
2026-03-10T07:50:45.737 INFO:tasks.workunit.client.0.vm05.stdout:5/222: stat d2/d12/f3a 0
2026-03-10T07:50:45.737 INFO:tasks.workunit.client.0.vm05.stdout:4/232: unlink d0/d6/d9/c1e 0
2026-03-10T07:50:45.737 INFO:tasks.workunit.client.0.vm05.stdout:9/211: creat d8/d35/d1c/d26/d28/f46 x:0 0 0
2026-03-10T07:50:45.739 INFO:tasks.workunit.client.0.vm05.stdout:3/228: dwrite d8/d22/f3a [0,4194304] 0
2026-03-10T07:50:45.747 INFO:tasks.workunit.client.0.vm05.stdout:5/223: mkdir d2/d12/d2d/d4a 0
2026-03-10T07:50:45.757 INFO:tasks.workunit.client.0.vm05.stdout:9/212: mkdir d8/d35/d22/d33/d47 0
2026-03-10T07:50:45.757 INFO:tasks.workunit.client.0.vm05.stdout:9/213: dread - d8/d35/d1c/d26/f3d zero size
2026-03-10T07:50:45.759 INFO:tasks.workunit.client.0.vm05.stdout:4/233: link d0/d6/d9/l1a d0/d17/d44/l50 0
2026-03-10T07:50:45.764 INFO:tasks.workunit.client.0.vm05.stdout:5/224: getdents d2/d12 0
2026-03-10T07:50:45.766 INFO:tasks.workunit.client.0.vm05.stdout:6/206: dread d0/d11/d22/f2b [0,4194304] 0
2026-03-10T07:50:45.779 INFO:tasks.workunit.client.0.vm05.stdout:6/207: dread - d0/f26 zero size
2026-03-10T07:50:45.779 INFO:tasks.workunit.client.0.vm05.stdout:5/225: mkdir d2/d4b 0
2026-03-10T07:50:45.779 INFO:tasks.workunit.client.0.vm05.stdout:9/214: dwrite d8/f14 [0,4194304] 0
2026-03-10T07:50:45.779 INFO:tasks.workunit.client.0.vm05.stdout:4/234: dwrite d0/d6/d37/d3c/f2f [0,4194304] 0
2026-03-10T07:50:45.779 INFO:tasks.workunit.client.0.vm05.stdout:4/235: truncate d0/d17/f2c 4539286 0
2026-03-10T07:50:45.779 INFO:tasks.workunit.client.0.vm05.stdout:6/208: link d0/d6/f44 d0/d6/f45 0
2026-03-10T07:50:45.779 INFO:tasks.workunit.client.0.vm05.stdout:4/236: mknod d0/d17/d44/c51 0
2026-03-10T07:50:45.779 INFO:tasks.workunit.client.0.vm05.stdout:9/215: creat d8/d35/f48 x:0 0 0
2026-03-10T07:50:45.779 INFO:tasks.workunit.client.0.vm05.stdout:9/216: unlink d8/f12 0
2026-03-10T07:50:45.779 INFO:tasks.workunit.client.0.vm05.stdout:6/209: symlink d0/d11/d2e/l46 0
2026-03-10T07:50:45.782 INFO:tasks.workunit.client.0.vm05.stdout:9/217: creat d8/d35/d1c/f49 x:0 0 0
2026-03-10T07:50:45.782 INFO:tasks.workunit.client.0.vm05.stdout:4/237: link d0/d17/d44/l50 d0/d17/d4e/l52 0
2026-03-10T07:50:45.796 INFO:tasks.workunit.client.0.vm05.stdout:9/218: chown d8/d35/d1c 2790 1
2026-03-10T07:50:45.796 INFO:tasks.workunit.client.0.vm05.stdout:9/219: chown d8/fa 209390 1
2026-03-10T07:50:45.796 INFO:tasks.workunit.client.0.vm05.stdout:4/238: fsync d0/f23 0
2026-03-10T07:50:45.796 INFO:tasks.workunit.client.0.vm05.stdout:4/239: creat d0/d3b/f53 x:0 0 0
2026-03-10T07:50:45.796 INFO:tasks.workunit.client.0.vm05.stdout:4/240: dread - d0/d6/d9/f49 zero size
2026-03-10T07:50:45.796 INFO:tasks.workunit.client.0.vm05.stdout:4/241: fdatasync d0/f24 0
2026-03-10T07:50:45.796 INFO:tasks.workunit.client.0.vm05.stdout:4/242: unlink d0/d17/d4e/l52 0
2026-03-10T07:50:45.796 INFO:tasks.workunit.client.0.vm05.stdout:4/243: rename d0/d20/f34 to d0/d6/d9/f54 0
2026-03-10T07:50:45.796 INFO:tasks.workunit.client.0.vm05.stdout:4/244: chown d0/c5 1 1
2026-03-10T07:50:45.796 INFO:tasks.workunit.client.0.vm05.stdout:4/245: stat d0/d28/f33 0
2026-03-10T07:50:45.796 INFO:tasks.workunit.client.0.vm05.stdout:4/246: chown d0/f2 7240 1
2026-03-10T07:50:45.799 INFO:tasks.workunit.client.0.vm05.stdout:0/194: sync
2026-03-10T07:50:45.800 INFO:tasks.workunit.client.0.vm05.stdout:6/210: dread d0/fa [0,4194304] 0
2026-03-10T07:50:45.808 INFO:tasks.workunit.client.0.vm05.stdout:0/195: dread d8/dd/f29 [0,4194304] 0
2026-03-10T07:50:45.808 INFO:tasks.workunit.client.0.vm05.stdout:6/211: write d0/d11/d2e/f30 [441464,74438] 0
2026-03-10T07:50:45.812 INFO:tasks.workunit.client.0.vm05.stdout:6/212: write d0/d6/f2c [2985411,12684] 0
2026-03-10T07:50:45.818 INFO:tasks.workunit.client.0.vm05.stdout:0/196: symlink d8/dd/d10/d26/d2a/l43 0
2026-03-10T07:50:45.819 INFO:tasks.workunit.client.0.vm05.stdout:9/220: dread d8/f15 [0,4194304] 0
2026-03-10T07:50:45.821 INFO:tasks.workunit.client.0.vm05.stdout:0/197: mknod d8/dd/d10/c44 0
2026-03-10T07:50:45.822 INFO:tasks.workunit.client.0.vm05.stdout:6/213: creat d0/d35/d36/d43/f47 x:0 0 0
2026-03-10T07:50:45.828 INFO:tasks.workunit.client.0.vm05.stdout:0/198: symlink d8/dd/d10/d26/l45 0
2026-03-10T07:50:45.828 INFO:tasks.workunit.client.0.vm05.stdout:0/199: symlink d8/dd/d10/d26/l46 0
2026-03-10T07:50:45.828 INFO:tasks.workunit.client.0.vm05.stdout:0/200: unlink d8/f13 0
2026-03-10T07:50:45.828 INFO:tasks.workunit.client.0.vm05.stdout:0/201: readlink d8/l23 0
2026-03-10T07:50:45.828 INFO:tasks.workunit.client.0.vm05.stdout:6/214: dwrite d0/d11/d22/f2b [0,4194304] 0
2026-03-10T07:50:45.828 INFO:tasks.workunit.client.0.vm05.stdout:0/202: readlink d8/dd/d10/l25 0
2026-03-10T07:50:45.829 INFO:tasks.workunit.client.0.vm05.stdout:0/203: stat d8/dd/d10/d26/c42 0
2026-03-10T07:50:45.829 INFO:tasks.workunit.client.0.vm05.stdout:0/204: read - d8/f2d zero size
2026-03-10T07:50:45.833 INFO:tasks.workunit.client.0.vm05.stdout:6/215: dwrite d0/d11/d2e/f30 [0,4194304] 0
2026-03-10T07:50:45.845 INFO:tasks.workunit.client.0.vm05.stdout:8/219: symlink d1/l41 0
2026-03-10T07:50:45.850 INFO:tasks.workunit.client.0.vm05.stdout:1/235: dwrite da/d26/f33 [4194304,4194304] 0
2026-03-10T07:50:45.852 INFO:tasks.workunit.client.0.vm05.stdout:1/236: read da/dd/d12/f18 [168766,42004] 0
2026-03-10T07:50:45.853 INFO:tasks.workunit.client.0.vm05.stdout:1/237: write da/dd/d2a/f2f [400942,127406] 0
2026-03-10T07:50:45.858 INFO:tasks.workunit.client.0.vm05.stdout:1/238: dread da/d26/f33 [4194304,4194304] 0
2026-03-10T07:50:45.871 INFO:tasks.workunit.client.0.vm05.stdout:1/239: chown da/l23 158798 1
2026-03-10T07:50:45.871 INFO:tasks.workunit.client.0.vm05.stdout:1/240: write da/f3a [748164,126754] 0
2026-03-10T07:50:45.871 INFO:tasks.workunit.client.0.vm05.stdout:1/241: creat da/dd/d27/f40 x:0 0 0
2026-03-10T07:50:45.874 INFO:tasks.workunit.client.0.vm05.stdout:1/242: unlink da/dd/f11 0
2026-03-10T07:50:45.876 INFO:tasks.workunit.client.0.vm05.stdout:1/243: truncate da/dd/d12/d19/f1a 511825 0
2026-03-10T07:50:45.879 INFO:tasks.workunit.client.0.vm05.stdout:2/334: dwrite d0/f41 [0,4194304] 0
2026-03-10T07:50:45.884 INFO:tasks.workunit.client.0.vm05.stdout:2/335: unlink d0/d8/f31 0
2026-03-10T07:50:45.885 INFO:tasks.workunit.client.0.vm05.stdout:1/244: creat da/f41 x:0 0 0
2026-03-10T07:50:45.887 INFO:tasks.workunit.client.0.vm05.stdout:2/336: mknod d0/d8/d43/d38/c6b 0
2026-03-10T07:50:45.887 INFO:tasks.workunit.client.0.vm05.stdout:2/337: write d0/d8/f65 [776560,46578] 0
2026-03-10T07:50:45.893 INFO:tasks.workunit.client.0.vm05.stdout:1/245: mkdir da/dd/d42 0
2026-03-10T07:50:45.894 INFO:tasks.workunit.client.0.vm05.stdout:1/246: chown da/dd/d12/d34/l37 86 1
2026-03-10T07:50:45.896 INFO:tasks.workunit.client.0.vm05.stdout:1/247: creat da/f43 x:0 0 0
2026-03-10T07:50:45.897 INFO:tasks.workunit.client.0.vm05.stdout:1/248: mknod da/dd/d12/d19/d20/c44 0
2026-03-10T07:50:45.898 INFO:tasks.workunit.client.0.vm05.stdout:1/249: write da/dd/d12/d34/f38 [16335,95821] 0
2026-03-10T07:50:45.900 INFO:tasks.workunit.client.0.vm05.stdout:7/245: dwrite d1/f27 [0,4194304] 0
2026-03-10T07:50:45.907 INFO:tasks.workunit.client.0.vm05.stdout:3/229: truncate d8/fb 5280697 0
2026-03-10T07:50:45.915 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:50:45 vm05.local ceph-mon[50387]: mgrmap e30: vm05.blexke(active, since 17s), standbys: vm08.orfpog
2026-03-10T07:50:45.915 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:50:45 vm05.local ceph-mon[50387]: pgmap v12: 65 pgs: 65 active+clean; 579 MiB data, 2.3 GiB used, 118 GiB / 120 GiB avail; 2.4 MiB/s rd, 61 MiB/s wr, 290 op/s
2026-03-10T07:50:45.916 INFO:tasks.workunit.client.0.vm05.stdout:3/230: dwrite d8/f12 [4194304,4194304] 0
2026-03-10T07:50:45.916 INFO:tasks.workunit.client.0.vm05.stdout:5/226: truncate d2/ff 2819505 0
2026-03-10T07:50:45.916 INFO:tasks.workunit.client.0.vm05.stdout:7/246: creat d1/d34/f4d x:0 0 0
2026-03-10T07:50:45.916 INFO:tasks.workunit.client.0.vm05.stdout:3/231: rmdir d8 39
2026-03-10T07:50:45.916 INFO:tasks.workunit.client.0.vm05.stdout:5/227: mkdir d2/d20/d4c 0
2026-03-10T07:50:45.916 INFO:tasks.workunit.client.0.vm05.stdout:7/247: write d1/d6/f31 [322410,29211] 0
2026-03-10T07:50:45.917 INFO:tasks.workunit.client.0.vm05.stdout:7/248: write d1/f46 [939282,112245] 0
2026-03-10T07:50:45.917 INFO:tasks.workunit.client.0.vm05.stdout:7/249: chown d1/d6 273239337 1
2026-03-10T07:50:45.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:50:45 vm08.local ceph-mon[59917]: mgrmap e30: vm05.blexke(active, since 17s), standbys: vm08.orfpog
2026-03-10T07:50:45.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:50:45 vm08.local ceph-mon[59917]: pgmap v12: 65 pgs: 65 active+clean; 579 MiB data, 2.3 GiB used, 118 GiB / 120 GiB avail; 2.4 MiB/s rd, 61 MiB/s wr, 290 op/s
2026-03-10T07:50:45.921 INFO:tasks.workunit.client.0.vm05.stdout:5/228: mkdir d2/d12/d4d 0
2026-03-10T07:50:45.921 INFO:tasks.workunit.client.0.vm05.stdout:5/229: dread - d2/d12/f24 zero size
2026-03-10T07:50:45.922 INFO:tasks.workunit.client.0.vm05.stdout:5/230: fsync d2/d20/f32 0
2026-03-10T07:50:45.923 INFO:tasks.workunit.client.0.vm05.stdout:5/231: mknod d2/d5/c4e 0
2026-03-10T07:50:45.925 INFO:tasks.workunit.client.0.vm05.stdout:5/232: chown d2/d20/f2a 38357540 1
2026-03-10T07:50:45.925 INFO:tasks.workunit.client.0.vm05.stdout:5/233: chown d2/d20/f32 191118564 1
2026-03-10T07:50:45.925 INFO:tasks.workunit.client.0.vm05.stdout:5/234: creat d2/d20/f4f x:0 0 0
2026-03-10T07:50:45.926 INFO:tasks.workunit.client.0.vm05.stdout:3/232: read d8/ff [3384608,47732] 0
2026-03-10T07:50:45.926 INFO:tasks.workunit.client.0.vm05.stdout:7/250: getdents d1 0
2026-03-10T07:50:45.932 INFO:tasks.workunit.client.0.vm05.stdout:7/251: dread d1/d34/f3e [0,4194304] 0
2026-03-10T07:50:45.938 INFO:tasks.workunit.client.0.vm05.stdout:7/252: dwrite d1/f27 [0,4194304] 0
2026-03-10T07:50:45.943 INFO:tasks.workunit.client.0.vm05.stdout:7/253: creat d1/d6/f4e x:0 0 0
2026-03-10T07:50:45.953 INFO:tasks.workunit.client.0.vm05.stdout:7/254: dread - d1/f49 zero size
2026-03-10T07:50:45.953 INFO:tasks.workunit.client.0.vm05.stdout:5/235: symlink d2/d12/d2d/d4a/l50 0
2026-03-10T07:50:45.953 INFO:tasks.workunit.client.0.vm05.stdout:3/233: mkdir d8/d47 0
2026-03-10T07:50:45.953 INFO:tasks.workunit.client.0.vm05.stdout:7/255: dwrite d1/d6/fb [0,4194304] 0
2026-03-10T07:50:45.953 INFO:tasks.workunit.client.0.vm05.stdout:7/256: truncate d1/f3d 225531 0
2026-03-10T07:50:45.953 INFO:tasks.workunit.client.0.vm05.stdout:7/257: chown d1/d6/fb 12 1
2026-03-10T07:50:45.957 INFO:tasks.workunit.client.0.vm05.stdout:3/234: rmdir d8/d1f/d24/d41 0
2026-03-10T07:50:45.958 INFO:tasks.workunit.client.0.vm05.stdout:5/236: dread d2/d12/f2b [0,4194304] 0
2026-03-10T07:50:45.962 INFO:tasks.workunit.client.0.vm05.stdout:5/237: unlink d2/c11 0
2026-03-10T07:50:45.962 INFO:tasks.workunit.client.0.vm05.stdout:7/258: getdents d1/d3c 0
2026-03-10T07:50:45.963 INFO:tasks.workunit.client.0.vm05.stdout:5/238: rmdir d2/d20/d33 39
2026-03-10T07:50:45.965 INFO:tasks.workunit.client.0.vm05.stdout:3/235: getdents d8/d22 0
2026-03-10T07:50:45.966 INFO:tasks.workunit.client.0.vm05.stdout:3/236: mkdir d8/d1c/d48 0
2026-03-10T07:50:45.970 INFO:tasks.workunit.client.0.vm05.stdout:7/259: getdents d1/d3c 0
2026-03-10T07:50:45.970 INFO:tasks.workunit.client.0.vm05.stdout:3/237: write d8/d1f/d2a/d34/f39 [25507,74728] 0
2026-03-10T07:50:45.970 INFO:tasks.workunit.client.0.vm05.stdout:5/239: creat d2/d20/f51 x:0 0 0
2026-03-10T07:50:45.970 INFO:tasks.workunit.client.0.vm05.stdout:5/240: readlink d2/d12/d2d/l48 0
2026-03-10T07:50:45.970 INFO:tasks.workunit.client.0.vm05.stdout:5/241: chown d2/d20 378194 1
2026-03-10T07:50:45.971 INFO:tasks.workunit.client.0.vm05.stdout:3/238: link d8/d16/d19/f21 d8/d1f/f49 0
2026-03-10T07:50:45.971 INFO:tasks.workunit.client.0.vm05.stdout:3/239: mkdir d8/d1f/d2a/d4a 0
2026-03-10T07:50:45.972 INFO:tasks.workunit.client.0.vm05.stdout:3/240: mkdir d8/d1f/d2a/d4b 0
2026-03-10T07:50:45.976 INFO:tasks.workunit.client.0.vm05.stdout:3/241: dwrite d8/d1f/d2a/d34/f3f [0,4194304] 0
2026-03-10T07:50:45.979 INFO:tasks.workunit.client.0.vm05.stdout:3/242: readlink d8/l9 0
2026-03-10T07:50:45.980 INFO:tasks.workunit.client.0.vm05.stdout:3/243: creat d8/d16/f4c x:0 0 0
2026-03-10T07:50:45.983 INFO:tasks.workunit.client.0.vm05.stdout:3/244: link d8/f12 d8/f4d 0
2026-03-10T07:50:45.995 INFO:tasks.workunit.client.0.vm05.stdout:4/247: rename d0/d17 to d0/d6/d9/d12/d45/d55 0
2026-03-10T07:50:45.996 INFO:tasks.workunit.client.0.vm05.stdout:4/248: write d0/d6/d37/d3c/f2f [4168340,29289] 0
2026-03-10T07:50:45.999 INFO:tasks.workunit.client.0.vm05.stdout:4/249: dwrite d0/f23 [0,4194304] 0
2026-03-10T07:50:45.999 INFO:tasks.workunit.client.0.vm05.stdout:4/250: truncate d0/d6/d9/f4d 816805 0
2026-03-10T07:50:46.003 INFO:tasks.workunit.client.0.vm05.stdout:3/245: dread d8/fe [0,4194304] 0
2026-03-10T07:50:46.003 INFO:tasks.workunit.client.0.vm05.stdout:3/246: chown d8/c15 3060551 1
2026-03-10T07:50:46.008 INFO:tasks.workunit.client.0.vm05.stdout:6/216: write d0/d6/f1d [4892538,39856] 0
2026-03-10T07:50:46.008 INFO:tasks.workunit.client.0.vm05.stdout:0/205: write d8/dd/f22 [1752759,55036] 0
2026-03-10T07:50:46.009 INFO:tasks.workunit.client.0.vm05.stdout:6/217: fdatasync d0/d11/f21 0
2026-03-10T07:50:46.016 INFO:tasks.workunit.client.0.vm05.stdout:9/221: rename f7 to d8/d35/d22/f4a 0
2026-03-10T07:50:46.019 INFO:tasks.workunit.client.0.vm05.stdout:4/251: creat d0/d6/d9/d12/d45/d55/f56 x:0 0 0
2026-03-10T07:50:46.020 INFO:tasks.workunit.client.0.vm05.stdout:0/206: mknod d8/dd/d37/c47 0
2026-03-10T07:50:46.021 INFO:tasks.workunit.client.0.vm05.stdout:9/222: unlink d8/fd 0
2026-03-10T07:50:46.022 INFO:tasks.workunit.client.0.vm05.stdout:9/223: chown d8/d35/l42 30 1
2026-03-10T07:50:46.023 INFO:tasks.workunit.client.0.vm05.stdout:8/220: rename d1/l2f to d1/d23/l42 0
2026-03-10T07:50:46.024 INFO:tasks.workunit.client.0.vm05.stdout:4/252: fdatasync d0/d6/f39 0
2026-03-10T07:50:46.025 INFO:tasks.workunit.client.0.vm05.stdout:0/207: mkdir d8/dd/d10/d26/d48 0
2026-03-10T07:50:46.025 INFO:tasks.workunit.client.0.vm05.stdout:9/224: unlink d8/d35/d1c/f1e 0
2026-03-10T07:50:46.027 INFO:tasks.workunit.client.0.vm05.stdout:5/242: rename d2/d5/c17 to d2/d20/d33/c52 0
2026-03-10T07:50:46.027 INFO:tasks.workunit.client.0.vm05.stdout:8/221: dwrite d1/dd/d18/f22 [0,4194304] 0
2026-03-10T07:50:46.029 INFO:tasks.workunit.client.0.vm05.stdout:4/253: rmdir d0/d6/d37 39
2026-03-10T07:50:46.030 INFO:tasks.workunit.client.0.vm05.stdout:9/225: symlink d8/d35/d22/d33/l4b 0
2026-03-10T07:50:46.031 INFO:tasks.workunit.client.0.vm05.stdout:8/222: creat d1/dd/d18/d20/f43 x:0 0 0
2026-03-10T07:50:46.038 INFO:tasks.workunit.client.0.vm05.stdout:9/226: dwrite d8/d35/f21 [0,4194304] 0
2026-03-10T07:50:46.038 INFO:tasks.workunit.client.0.vm05.stdout:4/254: dread - d0/d6/d37/f46 zero size
2026-03-10T07:50:46.042 INFO:tasks.workunit.client.0.vm05.stdout:4/255: write d0/d6/d9/d12/d45/d55/f56 [237479,70432] 0
2026-03-10T07:50:46.064 INFO:tasks.workunit.client.0.vm05.stdout:9/227: creat d8/d35/d1c/f4c x:0 0 0
2026-03-10T07:50:46.065 INFO:tasks.workunit.client.0.vm05.stdout:9/228: read - d8/d35/d1c/d26/f3d zero size
2026-03-10T07:50:46.065 INFO:tasks.workunit.client.0.vm05.stdout:4/256: link d0/l3 d0/d20/d26/l57 0
2026-03-10T07:50:46.065 INFO:tasks.workunit.client.0.vm05.stdout:4/257: dread - d0/d6/d9/d12/f36 zero size
2026-03-10T07:50:46.065 INFO:tasks.workunit.client.0.vm05.stdout:4/258: write d0/d6/d9/d12/f36 [328691,43097] 0
2026-03-10T07:50:46.065 INFO:tasks.workunit.client.0.vm05.stdout:9/229: dwrite d8/d35/f1d [0,4194304] 0
2026-03-10T07:50:46.065 INFO:tasks.workunit.client.0.vm05.stdout:4/259: unlink d0/d6/d9/d12/f35 0
2026-03-10T07:50:46.067 INFO:tasks.workunit.client.0.vm05.stdout:4/260: creat d0/d6/d37/d3c/f58 x:0 0 0
2026-03-10T07:50:46.068 INFO:tasks.workunit.client.0.vm05.stdout:9/230: creat d8/d35/d38/f4d x:0 0 0
2026-03-10T07:50:46.069 INFO:tasks.workunit.client.0.vm05.stdout:9/231: symlink d8/d35/d22/l4e 0
2026-03-10T07:50:46.069 INFO:tasks.workunit.client.0.vm05.stdout:9/232: truncate d8/d35/d1c/f4c 54134 0
2026-03-10T07:50:46.077 INFO:tasks.workunit.client.0.vm05.stdout:9/233: truncate d8/d35/f25 301482 0
2026-03-10T07:50:46.078 INFO:tasks.workunit.client.0.vm05.stdout:9/234: write d8/d35/f48 [854628,51694] 0
2026-03-10T07:50:46.078 INFO:tasks.workunit.client.0.vm05.stdout:2/338: dwrite d0/d8/d43/df/d25/f2b [0,4194304] 0
2026-03-10T07:50:46.081 INFO:tasks.workunit.client.0.vm05.stdout:2/339: chown d0/d8/f2d 9213767 1
2026-03-10T07:50:46.081 INFO:tasks.workunit.client.0.vm05.stdout:9/235: creat d8/d35/d22/f4f x:0 0 0
2026-03-10T07:50:46.087 INFO:tasks.workunit.client.0.vm05.stdout:2/340: creat d0/d8/d66/f6c x:0 0 0
2026-03-10T07:50:46.087 INFO:tasks.workunit.client.0.vm05.stdout:2/341: chown d0/d8/d43/df/d25/c32 2620777 1
2026-03-10T07:50:46.088 INFO:tasks.workunit.client.0.vm05.stdout:2/342: dread - d0/d8/d66/f6c zero size
2026-03-10T07:50:46.089 INFO:tasks.workunit.client.0.vm05.stdout:2/343: write d0/d8/f65 [744216,21993] 0
2026-03-10T07:50:46.090 INFO:tasks.workunit.client.0.vm05.stdout:2/344: rmdir d0/d52 39
2026-03-10T07:50:46.091 INFO:tasks.workunit.client.0.vm05.stdout:2/345: creat d0/d8/d43/df/d53/f6d x:0 0 0
2026-03-10T07:50:46.093 INFO:tasks.workunit.client.0.vm05.stdout:2/346: rename d0/l28 to d0/d8/d43/df/d4e/l6e 0
2026-03-10T07:50:46.094 INFO:tasks.workunit.client.0.vm05.stdout:2/347: mknod d0/d8/d43/c6f 0
2026-03-10T07:50:46.106 INFO:tasks.workunit.client.0.vm05.stdout:9/236: dread d8/d35/d22/f2b [0,4194304] 0
2026-03-10T07:50:46.107 INFO:tasks.workunit.client.0.vm05.stdout:9/237: creat d8/d35/d1c/d36/f50 x:0 0 0
2026-03-10T07:50:46.108 INFO:tasks.workunit.client.0.vm05.stdout:9/238: dread - d8/d35/d1c/d36/f50 zero size
2026-03-10T07:50:46.109 INFO:tasks.workunit.client.0.vm05.stdout:9/239: creat d8/d35/f51 x:0 0 0
2026-03-10T07:50:46.111 INFO:tasks.workunit.client.0.vm05.stdout:9/240: dread d8/f14 [0,4194304] 0
2026-03-10T07:50:46.111 INFO:tasks.workunit.client.0.vm05.stdout:9/241: dread - d8/d35/d1c/f40 zero size
2026-03-10T07:50:46.115 INFO:tasks.workunit.client.0.vm05.stdout:6/218: sync
2026-03-10T07:50:46.115 INFO:tasks.workunit.client.0.vm05.stdout:4/261: sync
2026-03-10T07:50:46.117 INFO:tasks.workunit.client.0.vm05.stdout:9/242: creat d8/d35/d1c/f52 x:0 0 0
2026-03-10T07:50:46.124 INFO:tasks.workunit.client.0.vm05.stdout:9/243: dread d8/fa [0,4194304] 0
2026-03-10T07:50:46.126 INFO:tasks.workunit.client.0.vm05.stdout:9/244: rmdir d8/d35/d1c/d26/d28 39
2026-03-10T07:50:46.128 INFO:tasks.workunit.client.0.vm05.stdout:9/245: rename d8/c24 to d8/d35/d22/d33/c53 0
2026-03-10T07:50:46.141 INFO:tasks.workunit.client.0.vm05.stdout:9/246: sync
2026-03-10T07:50:46.143 INFO:tasks.workunit.client.0.vm05.stdout:9/247: mkdir d8/d35/d1c/d20/d54 0
2026-03-10T07:50:46.188 INFO:tasks.workunit.client.0.vm05.stdout:3/247: dread d8/fb [4194304,4194304] 0
2026-03-10T07:50:46.188 INFO:tasks.workunit.client.0.vm05.stdout:3/248: write d8/d16/f2d [579826,78351] 0
2026-03-10T07:50:46.190 INFO:tasks.workunit.client.0.vm05.stdout:1/250: write da/dd/d12/d19/f1a [1558416,111571] 0
2026-03-10T07:50:46.197 INFO:tasks.workunit.client.0.vm05.stdout:3/249: rename d8/l1b to d8/d16/l4e 0
2026-03-10T07:50:46.197 INFO:tasks.workunit.client.0.vm05.stdout:3/250: write d8/d1f/f2f [705201,80224] 0
2026-03-10T07:50:46.197 INFO:tasks.workunit.client.0.vm05.stdout:1/251: rename da/fb to da/d26/d2b/f45 0
2026-03-10T07:50:46.197 INFO:tasks.workunit.client.0.vm05.stdout:0/208: rmdir d8/dd/d10/d26 39
2026-03-10T07:50:46.197 INFO:tasks.workunit.client.0.vm05.stdout:0/209: dread - d8/dd/d34/f3d zero size
2026-03-10T07:50:46.198 INFO:tasks.workunit.client.0.vm05.stdout:1/252: mknod da/d26/c46 0
2026-03-10T07:50:46.198 INFO:tasks.workunit.client.0.vm05.stdout:1/253: dread - da/dd/d12/f3d zero size
2026-03-10T07:50:46.200 INFO:tasks.workunit.client.0.vm05.stdout:3/251: rename d8/d22/f3a to d8/d1f/d2a/d4a/f4f 0
2026-03-10T07:50:46.202 INFO:tasks.workunit.client.0.vm05.stdout:1/254: mkdir da/dd/d2a/d47 0
2026-03-10T07:50:46.206 INFO:tasks.workunit.client.0.vm05.stdout:3/252: creat d8/d1f/d2a/d33/f50 x:0 0 0
2026-03-10T07:50:46.207 INFO:tasks.workunit.client.0.vm05.stdout:0/210: mknod d8/dd/d10/c49 0
2026-03-10T07:50:46.212 INFO:tasks.workunit.client.0.vm05.stdout:8/223: dwrite d1/dd/f11 [0,4194304] 0
2026-03-10T07:50:46.214 INFO:tasks.workunit.client.0.vm05.stdout:1/255: dread da/d26/d2b/f45 [0,4194304] 0
2026-03-10T07:50:46.214 INFO:tasks.workunit.client.0.vm05.stdout:1/256: chown c8 14714 1
2026-03-10T07:50:46.218 INFO:tasks.workunit.client.0.vm05.stdout:7/260: dread d1/f21 [0,4194304] 0
2026-03-10T07:50:46.223 INFO:tasks.workunit.client.0.vm05.stdout:3/253: mkdir d8/d1f/d2a/d51 0
2026-03-10T07:50:46.223 INFO:tasks.workunit.client.0.vm05.stdout:0/211: creat d8/f4a x:0 0
0 2026-03-10T07:50:46.227 INFO:tasks.workunit.client.0.vm05.stdout:8/224: dread d1/dd/d18/f21 [0,4194304] 0 2026-03-10T07:50:46.231 INFO:tasks.workunit.client.0.vm05.stdout:8/225: fdatasync d1/dd/d18/d20/f43 0 2026-03-10T07:50:46.231 INFO:tasks.workunit.client.0.vm05.stdout:8/226: write d1/dd/d18/f29 [5200871,90732] 0 2026-03-10T07:50:46.231 INFO:tasks.workunit.client.0.vm05.stdout:2/348: write d0/d47/f48 [349988,94416] 0 2026-03-10T07:50:46.234 INFO:tasks.workunit.client.0.vm05.stdout:8/227: dwrite d1/fa [4194304,4194304] 0 2026-03-10T07:50:46.240 INFO:tasks.workunit.client.0.vm05.stdout:1/257: mknod da/dd/d12/c48 0 2026-03-10T07:50:46.241 INFO:tasks.workunit.client.0.vm05.stdout:9/248: getdents d8/d35 0 2026-03-10T07:50:46.243 INFO:tasks.workunit.client.0.vm05.stdout:9/249: truncate d8/d35/d1c/d26/f3d 357195 0 2026-03-10T07:50:46.243 INFO:tasks.workunit.client.0.vm05.stdout:9/250: write d8/f9 [581086,106792] 0 2026-03-10T07:50:46.244 INFO:tasks.workunit.client.0.vm05.stdout:1/258: dwrite da/dd/d12/f3d [0,4194304] 0 2026-03-10T07:50:46.250 INFO:tasks.workunit.client.0.vm05.stdout:4/262: dwrite d0/d20/f2a [0,4194304] 0 2026-03-10T07:50:46.258 INFO:tasks.workunit.client.0.vm05.stdout:7/261: chown d1/d6/d3b/l45 0 1 2026-03-10T07:50:46.258 INFO:tasks.workunit.client.0.vm05.stdout:6/219: truncate d0/f23 7687054 0 2026-03-10T07:50:46.258 INFO:tasks.workunit.client.0.vm05.stdout:5/243: write d2/ff [2272771,66878] 0 2026-03-10T07:50:46.258 INFO:tasks.workunit.client.0.vm05.stdout:3/254: mkdir d8/d16/d52 0 2026-03-10T07:50:46.258 INFO:tasks.workunit.client.0.vm05.stdout:5/244: readlink d2/d12/d2d/d4a/l50 0 2026-03-10T07:50:46.258 INFO:tasks.workunit.client.0.vm05.stdout:5/245: fsync d2/d20/f32 0 2026-03-10T07:50:46.258 INFO:tasks.workunit.client.0.vm05.stdout:5/246: stat d2/d5/f23 0 2026-03-10T07:50:46.259 INFO:tasks.workunit.client.0.vm05.stdout:5/247: dwrite d2/d20/f2a [4194304,4194304] 0 2026-03-10T07:50:46.284 INFO:tasks.workunit.client.0.vm05.stdout:2/349: chown 
d0/d8/c64 264214 1 2026-03-10T07:50:46.285 INFO:tasks.workunit.client.0.vm05.stdout:2/350: write d0/d8/d43/df/d4d/f57 [325323,53040] 0 2026-03-10T07:50:46.292 INFO:tasks.workunit.client.0.vm05.stdout:9/251: unlink d8/d35/c27 0 2026-03-10T07:50:46.298 INFO:tasks.workunit.client.0.vm05.stdout:7/262: rmdir d1/d6 39 2026-03-10T07:50:46.302 INFO:tasks.workunit.client.0.vm05.stdout:4/263: dread d0/d6/f15 [0,4194304] 0 2026-03-10T07:50:46.303 INFO:tasks.workunit.client.0.vm05.stdout:3/255: mkdir d8/d16/d19/d53 0 2026-03-10T07:50:46.305 INFO:tasks.workunit.client.0.vm05.stdout:3/256: dread d8/f12 [4194304,4194304] 0 2026-03-10T07:50:46.306 INFO:tasks.workunit.client.0.vm05.stdout:3/257: write d8/d1f/d24/f3e [323059,72988] 0 2026-03-10T07:50:46.311 INFO:tasks.workunit.client.0.vm05.stdout:5/248: mkdir d2/d20/d33/d53 0 2026-03-10T07:50:46.315 INFO:tasks.workunit.client.0.vm05.stdout:5/249: stat d2/d12/d2d/d4a/l50 0 2026-03-10T07:50:46.316 INFO:tasks.workunit.client.0.vm05.stdout:3/258: dread d8/d1c/f35 [0,4194304] 0 2026-03-10T07:50:46.323 INFO:tasks.workunit.client.0.vm05.stdout:1/259: getdents da/dd/d42 0 2026-03-10T07:50:46.331 INFO:tasks.workunit.client.0.vm05.stdout:6/220: mknod d0/c48 0 2026-03-10T07:50:46.331 INFO:tasks.workunit.client.0.vm05.stdout:2/351: dread d0/d8/d43/f1f [0,4194304] 0 2026-03-10T07:50:46.339 INFO:tasks.workunit.client.0.vm05.stdout:0/212: dwrite d8/dd/f29 [0,4194304] 0 2026-03-10T07:50:46.340 INFO:tasks.workunit.client.0.vm05.stdout:7/263: sync 2026-03-10T07:50:46.345 INFO:tasks.workunit.client.0.vm05.stdout:0/213: dread d8/dd/f3c [0,4194304] 0 2026-03-10T07:50:46.352 INFO:tasks.workunit.client.0.vm05.stdout:5/250: truncate d2/d5/f1e 3048847 0 2026-03-10T07:50:46.352 INFO:tasks.workunit.client.0.vm05.stdout:5/251: stat d2/d12/d2d/l48 0 2026-03-10T07:50:46.355 INFO:tasks.workunit.client.0.vm05.stdout:3/259: dread d8/f25 [0,4194304] 0 2026-03-10T07:50:46.356 INFO:tasks.workunit.client.0.vm05.stdout:3/260: write d8/d1c/f23 [3546669,2713] 0 
2026-03-10T07:50:46.359 INFO:tasks.workunit.client.0.vm05.stdout:9/252: symlink d8/d35/d1c/d36/l55 0 2026-03-10T07:50:46.364 INFO:tasks.workunit.client.0.vm05.stdout:1/260: truncate da/dd/d12/f18 606112 0 2026-03-10T07:50:46.364 INFO:tasks.workunit.client.0.vm05.stdout:1/261: write f4 [9337497,77071] 0 2026-03-10T07:50:46.364 INFO:tasks.workunit.client.0.vm05.stdout:1/262: fdatasync da/dd/d12/d19/f1a 0 2026-03-10T07:50:46.365 INFO:tasks.workunit.client.0.vm05.stdout:6/221: chown d0/c20 5734 1 2026-03-10T07:50:46.370 INFO:tasks.workunit.client.0.vm05.stdout:1/263: dread da/dd/d12/d19/f1a [0,4194304] 0 2026-03-10T07:50:46.371 INFO:tasks.workunit.client.0.vm05.stdout:1/264: chown da/dd/d12/d19/d20/c2c 50368 1 2026-03-10T07:50:46.371 INFO:tasks.workunit.client.0.vm05.stdout:1/265: stat da/dd/c3f 0 2026-03-10T07:50:46.373 INFO:tasks.workunit.client.0.vm05.stdout:4/264: unlink d0/d20/d26/l57 0 2026-03-10T07:50:46.378 INFO:tasks.workunit.client.0.vm05.stdout:6/222: dread d0/d11/d31/f33 [0,4194304] 0 2026-03-10T07:50:46.378 INFO:tasks.workunit.client.0.vm05.stdout:6/223: chown d0/c20 64 1 2026-03-10T07:50:46.381 INFO:tasks.workunit.client.0.vm05.stdout:0/214: chown d8/dd/d10/d26/d2a/c41 4844 1 2026-03-10T07:50:46.383 INFO:tasks.workunit.client.0.vm05.stdout:5/252: mkdir d2/d12/d2d/d54 0 2026-03-10T07:50:46.383 INFO:tasks.workunit.client.0.vm05.stdout:5/253: stat d2/d5/l26 0 2026-03-10T07:50:46.387 INFO:tasks.workunit.client.0.vm05.stdout:3/261: mkdir d8/d22/d54 0 2026-03-10T07:50:46.389 INFO:tasks.workunit.client.0.vm05.stdout:8/228: link d1/dd/l12 d1/l44 0 2026-03-10T07:50:46.393 INFO:tasks.workunit.client.0.vm05.stdout:2/352: symlink d0/d52/l70 0 2026-03-10T07:50:46.397 INFO:tasks.workunit.client.0.vm05.stdout:1/266: symlink da/dd/d2a/l49 0 2026-03-10T07:50:46.399 INFO:tasks.workunit.client.0.vm05.stdout:1/267: dread da/f3a [0,4194304] 0 2026-03-10T07:50:46.401 INFO:tasks.workunit.client.0.vm05.stdout:1/268: read da/dd/d12/d19/f3b [2962038,72816] 0 
2026-03-10T07:50:46.402 INFO:tasks.workunit.client.0.vm05.stdout:1/269: truncate da/dd/d2a/f2f 1558287 0 2026-03-10T07:50:46.403 INFO:tasks.workunit.client.0.vm05.stdout:6/224: creat d0/d6/f49 x:0 0 0 2026-03-10T07:50:46.405 INFO:tasks.workunit.client.0.vm05.stdout:7/264: creat d1/d3c/d4b/f4f x:0 0 0 2026-03-10T07:50:46.406 INFO:tasks.workunit.client.0.vm05.stdout:1/270: dwrite da/dd/d27/f40 [0,4194304] 0 2026-03-10T07:50:46.408 INFO:tasks.workunit.client.0.vm05.stdout:1/271: dread da/dd/d12/d34/f38 [0,4194304] 0 2026-03-10T07:50:46.414 INFO:tasks.workunit.client.0.vm05.stdout:8/229: chown d1/d23/l42 109 1 2026-03-10T07:50:46.415 INFO:tasks.workunit.client.0.vm05.stdout:8/230: fdatasync d1/dd/d18/d20/d2a/d34/f39 0 2026-03-10T07:50:46.418 INFO:tasks.workunit.client.0.vm05.stdout:9/253: mknod d8/d35/d1c/d20/d54/c56 0 2026-03-10T07:50:46.421 INFO:tasks.workunit.client.0.vm05.stdout:8/231: dwrite d1/dd/d18/f2b [0,4194304] 0 2026-03-10T07:50:46.426 INFO:tasks.workunit.client.0.vm05.stdout:2/353: dread d0/f22 [0,4194304] 0 2026-03-10T07:50:46.440 INFO:tasks.workunit.client.0.vm05.stdout:3/262: dread f4 [0,4194304] 0 2026-03-10T07:50:46.443 INFO:tasks.workunit.client.0.vm05.stdout:0/215: mkdir d8/dd/d10/d4b 0 2026-03-10T07:50:46.449 INFO:tasks.workunit.client.0.vm05.stdout:5/254: mknod d2/c55 0 2026-03-10T07:50:46.449 INFO:tasks.workunit.client.0.vm05.stdout:5/255: read - d2/d20/f32 zero size 2026-03-10T07:50:46.449 INFO:tasks.workunit.client.0.vm05.stdout:5/256: readlink d2/d5/l49 0 2026-03-10T07:50:46.468 INFO:tasks.workunit.client.0.vm05.stdout:6/225: write d0/d6/f45 [25831,111032] 0 2026-03-10T07:50:46.468 INFO:tasks.workunit.client.0.vm05.stdout:2/354: write d0/f1e [4182975,52291] 0 2026-03-10T07:50:46.468 INFO:tasks.workunit.client.0.vm05.stdout:6/226: truncate d0/d11/f1c 318855 0 2026-03-10T07:50:46.472 INFO:tasks.workunit.client.0.vm05.stdout:6/227: dwrite d0/f26 [0,4194304] 0 2026-03-10T07:50:46.472 INFO:tasks.workunit.client.0.vm05.stdout:6/228: chown 
d0/d11/d2e/l46 6 1 2026-03-10T07:50:46.503 INFO:tasks.workunit.client.0.vm05.stdout:0/216: dwrite d8/dd/de/f3b [0,4194304] 0 2026-03-10T07:50:46.521 INFO:tasks.workunit.client.0.vm05.stdout:9/254: mkdir d8/d35/d3c/d57 0 2026-03-10T07:50:46.523 INFO:tasks.workunit.client.0.vm05.stdout:7/265: truncate d1/f16 1605403 0 2026-03-10T07:50:46.524 INFO:tasks.workunit.client.0.vm05.stdout:8/232: mkdir d1/d45 0 2026-03-10T07:50:46.524 INFO:tasks.workunit.client.0.vm05.stdout:2/355: mknod d0/d52/c71 0 2026-03-10T07:50:46.527 INFO:tasks.workunit.client.0.vm05.stdout:6/229: mknod d0/d11/d22/c4a 0 2026-03-10T07:50:46.532 INFO:tasks.workunit.client.0.vm05.stdout:6/230: dwrite d0/d6/f10 [4194304,4194304] 0 2026-03-10T07:50:46.533 INFO:tasks.workunit.client.0.vm05.stdout:6/231: chown d0/d6/f45 1346 1 2026-03-10T07:50:46.541 INFO:tasks.workunit.client.0.vm05.stdout:0/217: rename d8/dd/d10/d26/c42 to d8/dd/d10/d26/d48/c4c 0 2026-03-10T07:50:46.544 INFO:tasks.workunit.client.0.vm05.stdout:7/266: mknod d1/d3c/d4b/c50 0 2026-03-10T07:50:46.546 INFO:tasks.workunit.client.0.vm05.stdout:4/265: link d0/l3 d0/d6/d9/l59 0 2026-03-10T07:50:46.547 INFO:tasks.workunit.client.0.vm05.stdout:6/232: mknod d0/d35/d36/c4b 0 2026-03-10T07:50:46.548 INFO:tasks.workunit.client.0.vm05.stdout:6/233: read d0/d11/d2e/f30 [3812334,17081] 0 2026-03-10T07:50:46.549 INFO:tasks.workunit.client.0.vm05.stdout:6/234: fsync d0/d6/f45 0 2026-03-10T07:50:46.551 INFO:tasks.workunit.client.0.vm05.stdout:3/263: link d8/fb d8/d1f/d2a/d4b/f55 0 2026-03-10T07:50:46.551 INFO:tasks.workunit.client.0.vm05.stdout:1/272: getdents da/dd 0 2026-03-10T07:50:46.552 INFO:tasks.workunit.client.0.vm05.stdout:1/273: chown da/dd/d12/f3d 1 1 2026-03-10T07:50:46.553 INFO:tasks.workunit.client.0.vm05.stdout:5/257: link d2/d5/c27 d2/d12/d2d/d54/c56 0 2026-03-10T07:50:46.554 INFO:tasks.workunit.client.0.vm05.stdout:5/258: read d2/d5/f3d [1834412,118546] 0 2026-03-10T07:50:46.563 INFO:tasks.workunit.client.0.vm05.stdout:6/235: unlink d0/d6/f49 
0 2026-03-10T07:50:46.563 INFO:tasks.workunit.client.0.vm05.stdout:6/236: chown d0/c38 2625426 1 2026-03-10T07:50:46.569 INFO:tasks.workunit.client.0.vm05.stdout:9/255: sync 2026-03-10T07:50:46.569 INFO:tasks.workunit.client.0.vm05.stdout:8/233: sync 2026-03-10T07:50:46.569 INFO:tasks.workunit.client.0.vm05.stdout:8/234: fdatasync d1/f15 0 2026-03-10T07:50:46.572 INFO:tasks.workunit.client.0.vm05.stdout:5/259: read d2/d5/f18 [97713,52975] 0 2026-03-10T07:50:46.574 INFO:tasks.workunit.client.0.vm05.stdout:7/267: symlink d1/l51 0 2026-03-10T07:50:46.586 INFO:tasks.workunit.client.0.vm05.stdout:0/218: rename d8/dd/d37/d39 to d8/dd/de/d4d 0 2026-03-10T07:50:46.596 INFO:tasks.workunit.client.0.vm05.stdout:8/235: creat d1/dd/d18/d20/f46 x:0 0 0 2026-03-10T07:50:46.600 INFO:tasks.workunit.client.0.vm05.stdout:3/264: dwrite d8/f3b [0,4194304] 0 2026-03-10T07:50:46.600 INFO:tasks.workunit.client.0.vm05.stdout:9/256: dwrite d8/d35/f25 [0,4194304] 0 2026-03-10T07:50:46.604 INFO:tasks.workunit.client.0.vm05.stdout:8/236: dwrite d1/dd/d18/d20/f46 [0,4194304] 0 2026-03-10T07:50:46.610 INFO:tasks.workunit.client.0.vm05.stdout:0/219: dread d8/fb [0,4194304] 0 2026-03-10T07:50:46.621 INFO:tasks.workunit.client.0.vm05.stdout:2/356: getdents d0/d8/d43/df 0 2026-03-10T07:50:46.622 INFO:tasks.workunit.client.0.vm05.stdout:7/268: write d1/d6/f2e [477870,62249] 0 2026-03-10T07:50:46.631 INFO:tasks.workunit.client.0.vm05.stdout:9/257: symlink d8/d35/d1c/d36/l58 0 2026-03-10T07:50:46.650 INFO:tasks.workunit.client.0.vm05.stdout:2/357: mknod d0/d2a/d2f/c72 0 2026-03-10T07:50:46.651 INFO:tasks.workunit.client.0.vm05.stdout:7/269: creat d1/d6/d47/f52 x:0 0 0 2026-03-10T07:50:46.652 INFO:tasks.workunit.client.0.vm05.stdout:7/270: dread d1/f3d [0,4194304] 0 2026-03-10T07:50:46.652 INFO:tasks.workunit.client.0.vm05.stdout:7/271: chown d1/d6/fb 21162539 1 2026-03-10T07:50:46.653 INFO:tasks.workunit.client.0.vm05.stdout:7/272: write d1/f46 [1678124,40897] 0 2026-03-10T07:50:46.653 
INFO:tasks.workunit.client.0.vm05.stdout:7/273: chown d1/d3c 60892 1 2026-03-10T07:50:46.654 INFO:tasks.workunit.client.0.vm05.stdout:1/274: getdents da/d26/d2b 0 2026-03-10T07:50:46.660 INFO:tasks.workunit.client.0.vm05.stdout:5/260: creat d2/d20/f57 x:0 0 0 2026-03-10T07:50:46.660 INFO:tasks.workunit.client.0.vm05.stdout:5/261: write d2/d20/f57 [86879,6183] 0 2026-03-10T07:50:46.663 INFO:tasks.workunit.client.0.vm05.stdout:2/358: dread d0/d8/d43/df/d25/f29 [0,4194304] 0 2026-03-10T07:50:46.672 INFO:tasks.workunit.client.0.vm05.stdout:7/274: sync 2026-03-10T07:50:46.672 INFO:tasks.workunit.client.0.vm05.stdout:5/262: sync 2026-03-10T07:50:46.673 INFO:tasks.workunit.client.0.vm05.stdout:7/275: write d1/d34/f4d [476275,35523] 0 2026-03-10T07:50:46.673 INFO:tasks.workunit.client.0.vm05.stdout:5/263: chown d2/d20/d33/f45 1 1 2026-03-10T07:50:46.678 INFO:tasks.workunit.client.0.vm05.stdout:9/258: mkdir d8/d35/d1c/d20/d59 0 2026-03-10T07:50:46.686 INFO:tasks.workunit.client.0.vm05.stdout:6/237: truncate d0/d6/f1d 1210786 0 2026-03-10T07:50:46.693 INFO:tasks.workunit.client.0.vm05.stdout:2/359: dread d0/f1 [0,4194304] 0 2026-03-10T07:50:46.701 INFO:tasks.workunit.client.0.vm05.stdout:4/266: rename d0/d6/d37/d3c to d0/d6/d9/d5a 0 2026-03-10T07:50:46.701 INFO:tasks.workunit.client.0.vm05.stdout:4/267: read - d0/d3b/f53 zero size 2026-03-10T07:50:46.701 INFO:tasks.workunit.client.0.vm05.stdout:4/268: truncate d0/d3b/f4c 759334 0 2026-03-10T07:50:46.702 INFO:tasks.workunit.client.0.vm05.stdout:4/269: chown d0/d6/d9/d12/d45/d55 1598 1 2026-03-10T07:50:46.703 INFO:tasks.workunit.client.0.vm05.stdout:1/275: mknod da/dd/d27/c4a 0 2026-03-10T07:50:46.706 INFO:tasks.workunit.client.0.vm05.stdout:5/264: rename d2/d5/l1f to d2/d5/l58 0 2026-03-10T07:50:46.707 INFO:tasks.workunit.client.0.vm05.stdout:5/265: write d2/d20/d33/f47 [526771,53685] 0 2026-03-10T07:50:46.707 INFO:tasks.workunit.client.0.vm05.stdout:5/266: write d2/d5/f23 [514297,91589] 0 2026-03-10T07:50:46.708 
INFO:tasks.workunit.client.0.vm05.stdout:1/276: sync 2026-03-10T07:50:46.711 INFO:tasks.workunit.client.0.vm05.stdout:8/237: write d1/dd/f25 [1560584,102112] 0 2026-03-10T07:50:46.722 INFO:tasks.workunit.client.0.vm05.stdout:9/259: write d8/d35/d22/f3f [245899,33666] 0 2026-03-10T07:50:46.725 INFO:tasks.workunit.client.0.vm05.stdout:0/220: write f6 [800603,41960] 0 2026-03-10T07:50:46.727 INFO:tasks.workunit.client.0.vm05.stdout:6/238: creat d0/d11/d22/f4c x:0 0 0 2026-03-10T07:50:46.727 INFO:tasks.workunit.client.0.vm05.stdout:6/239: dread - d0/d6/f24 zero size 2026-03-10T07:50:46.728 INFO:tasks.workunit.client.0.vm05.stdout:6/240: fdatasync d0/d11/d22/f2b 0 2026-03-10T07:50:46.737 INFO:tasks.workunit.client.0.vm05.stdout:7/276: symlink d1/l53 0 2026-03-10T07:50:46.741 INFO:tasks.workunit.client.0.vm05.stdout:5/267: unlink d2/f3f 0 2026-03-10T07:50:46.745 INFO:tasks.workunit.client.0.vm05.stdout:1/277: rename da/dd/d27/f40 to da/dd/d27/f4b 0 2026-03-10T07:50:46.747 INFO:tasks.workunit.client.0.vm05.stdout:8/238: unlink d1/dd/l3b 0 2026-03-10T07:50:46.748 INFO:tasks.workunit.client.0.vm05.stdout:0/221: truncate d8/f1c 4647089 0 2026-03-10T07:50:46.749 INFO:tasks.workunit.client.0.vm05.stdout:0/222: fdatasync d8/dd/d10/d26/f31 0 2026-03-10T07:50:46.749 INFO:tasks.workunit.client.0.vm05.stdout:0/223: write d8/dd/f29 [1839648,85137] 0 2026-03-10T07:50:46.750 INFO:tasks.workunit.client.0.vm05.stdout:0/224: chown d8/dd/d10/l2c 9732 1 2026-03-10T07:50:46.750 INFO:tasks.workunit.client.0.vm05.stdout:0/225: read - d8/dd/d10/d26/d2a/f2e zero size 2026-03-10T07:50:46.764 INFO:tasks.workunit.client.0.vm05.stdout:3/265: rename d8/fb to d8/d1c/f56 0 2026-03-10T07:50:46.765 INFO:tasks.workunit.client.0.vm05.stdout:4/270: link d0/d3b/f4c d0/d6/d9/d12/d4f/f5b 0 2026-03-10T07:50:46.767 INFO:tasks.workunit.client.0.vm05.stdout:7/277: chown d1/f16 80 1 2026-03-10T07:50:46.772 INFO:tasks.workunit.client.0.vm05.stdout:5/268: creat d2/d12/d2d/d4a/f59 x:0 0 0 2026-03-10T07:50:46.779 
INFO:tasks.workunit.client.0.vm05.stdout:8/239: creat d1/dd/d18/f47 x:0 0 0 2026-03-10T07:50:46.779 INFO:tasks.workunit.client.0.vm05.stdout:8/240: fdatasync d1/fe 0 2026-03-10T07:50:46.779 INFO:tasks.workunit.client.0.vm05.stdout:9/260: creat d8/d35/d22/d33/d47/f5a x:0 0 0 2026-03-10T07:50:46.779 INFO:tasks.workunit.client.0.vm05.stdout:9/261: write d8/d35/d1c/f49 [155917,5446] 0 2026-03-10T07:50:46.779 INFO:tasks.workunit.client.0.vm05.stdout:9/262: read - d8/d35/d1c/d20/f32 zero size 2026-03-10T07:50:46.787 INFO:tasks.workunit.client.0.vm05.stdout:2/360: truncate d0/d8/d43/df/f3a 3257897 0 2026-03-10T07:50:46.789 INFO:tasks.workunit.client.0.vm05.stdout:6/241: dwrite d0/d11/f13 [0,4194304] 0 2026-03-10T07:50:46.795 INFO:tasks.workunit.client.0.vm05.stdout:6/242: dread d0/d11/f13 [0,4194304] 0 2026-03-10T07:50:46.801 INFO:tasks.workunit.client.0.vm05.stdout:2/361: sync 2026-03-10T07:50:46.805 INFO:tasks.workunit.client.0.vm05.stdout:3/266: write d8/fe [1556542,130361] 0 2026-03-10T07:50:46.806 INFO:tasks.workunit.client.0.vm05.stdout:3/267: stat d8/l14 0 2026-03-10T07:50:46.809 INFO:tasks.workunit.client.0.vm05.stdout:4/271: mkdir d0/d3b/d5c 0 2026-03-10T07:50:46.820 INFO:tasks.workunit.client.0.vm05.stdout:7/278: rmdir d1/d3c 39 2026-03-10T07:50:46.820 INFO:tasks.workunit.client.0.vm05.stdout:7/279: dwrite d1/f37 [0,4194304] 0 2026-03-10T07:50:46.820 INFO:tasks.workunit.client.0.vm05.stdout:5/269: unlink d2/ff 0 2026-03-10T07:50:46.820 INFO:tasks.workunit.client.0.vm05.stdout:1/278: mknod da/dd/d42/c4c 0 2026-03-10T07:50:46.820 INFO:tasks.workunit.client.0.vm05.stdout:8/241: mkdir d1/dd/d18/d20/d2a/d48 0 2026-03-10T07:50:46.821 INFO:tasks.workunit.client.0.vm05.stdout:9/263: write f6 [718228,91988] 0 2026-03-10T07:50:46.822 INFO:tasks.workunit.client.0.vm05.stdout:8/242: write d1/dd/d18/f29 [6191226,21589] 0 2026-03-10T07:50:46.823 INFO:tasks.workunit.client.0.vm05.stdout:8/243: truncate d1/dd/d18/f22 4201823 0 2026-03-10T07:50:46.828 
INFO:tasks.workunit.client.0.vm05.stdout:8/244: dwrite d1/dd/d18/f2b [4194304,4194304] 0 2026-03-10T07:50:46.835 INFO:tasks.workunit.client.0.vm05.stdout:2/362: unlink d0/d47/f4c 0 2026-03-10T07:50:46.852 INFO:tasks.workunit.client.0.vm05.stdout:1/279: creat da/dd/d27/f4d x:0 0 0 2026-03-10T07:50:46.854 INFO:tasks.workunit.client.0.vm05.stdout:9/264: rename d8/d35/d1c/d26/f3d to d8/d35/d22/d33/f5b 0 2026-03-10T07:50:46.854 INFO:tasks.workunit.client.0.vm05.stdout:9/265: chown d8/d35/d22/f4a 53154707 1 2026-03-10T07:50:46.855 INFO:tasks.workunit.client.0.vm05.stdout:9/266: rename d8/d35 to d8/d35/d1c/d20/d59/d5c 22 2026-03-10T07:50:46.856 INFO:tasks.workunit.client.0.vm05.stdout:8/245: mkdir d1/dd/d18/d20/d2a/d34/d49 0 2026-03-10T07:50:46.857 INFO:tasks.workunit.client.0.vm05.stdout:3/268: link d8/d1f/d2a/d33/f44 d8/d1f/d2a/d4b/f57 0 2026-03-10T07:50:46.858 INFO:tasks.workunit.client.0.vm05.stdout:2/363: symlink d0/d8/d43/df/d25/l73 0 2026-03-10T07:50:46.859 INFO:tasks.workunit.client.0.vm05.stdout:4/272: mknod d0/d6/c5d 0 2026-03-10T07:50:46.861 INFO:tasks.workunit.client.0.vm05.stdout:7/280: mknod d1/d3c/c54 0 2026-03-10T07:50:46.862 INFO:tasks.workunit.client.0.vm05.stdout:0/226: getdents d8 0 2026-03-10T07:50:46.863 INFO:tasks.workunit.client.0.vm05.stdout:1/280: write da/d26/f33 [4971270,82157] 0 2026-03-10T07:50:46.869 INFO:tasks.workunit.client.0.vm05.stdout:5/270: link d2/d12/d2d/f39 d2/d12/f5a 0 2026-03-10T07:50:46.871 INFO:tasks.workunit.client.0.vm05.stdout:0/227: dread f4 [0,4194304] 0 2026-03-10T07:50:46.872 INFO:tasks.workunit.client.0.vm05.stdout:4/273: sync 2026-03-10T07:50:46.880 INFO:tasks.workunit.client.0.vm05.stdout:2/364: symlink d0/d8/d43/df/d4d/l74 0 2026-03-10T07:50:46.881 INFO:tasks.workunit.client.0.vm05.stdout:2/365: write d0/d8/d43/df/d25/f2b [1838590,1443] 0 2026-03-10T07:50:46.889 INFO:tasks.workunit.client.0.vm05.stdout:6/243: truncate d0/f26 747506 0 2026-03-10T07:50:46.889 INFO:tasks.workunit.client.0.vm05.stdout:6/244: fsync 
d0/d6/f45 0 2026-03-10T07:50:46.895 INFO:tasks.workunit.client.0.vm05.stdout:0/228: rmdir d8/dd/d10/d26/d2a 39 2026-03-10T07:50:46.902 INFO:tasks.workunit.client.0.vm05.stdout:8/246: creat d1/dd/d18/d20/d2a/d48/f4a x:0 0 0 2026-03-10T07:50:46.906 INFO:tasks.workunit.client.0.vm05.stdout:3/269: getdents d8/d47 0 2026-03-10T07:50:46.906 INFO:tasks.workunit.client.0.vm05.stdout:2/366: mknod d0/d8/d66/c75 0 2026-03-10T07:50:46.906 INFO:tasks.workunit.client.0.vm05.stdout:1/281: dwrite da/f3a [0,4194304] 0 2026-03-10T07:50:46.907 INFO:tasks.workunit.client.0.vm05.stdout:6/245: symlink d0/d35/l4d 0 2026-03-10T07:50:46.907 INFO:tasks.workunit.client.0.vm05.stdout:3/270: chown d8/c15 2154124 1 2026-03-10T07:50:46.919 INFO:tasks.workunit.client.0.vm05.stdout:7/281: fsync d1/d6/f1d 0 2026-03-10T07:50:46.920 INFO:tasks.workunit.client.0.vm05.stdout:5/271: mkdir d2/d20/d5b 0 2026-03-10T07:50:46.920 INFO:tasks.workunit.client.0.vm05.stdout:7/282: write d1/f27 [1144943,95792] 0 2026-03-10T07:50:46.920 INFO:tasks.workunit.client.0.vm05.stdout:1/282: sync 2026-03-10T07:50:46.921 INFO:tasks.workunit.client.0.vm05.stdout:0/229: creat d8/f4e x:0 0 0 2026-03-10T07:50:46.921 INFO:tasks.workunit.client.0.vm05.stdout:5/272: truncate d2/f42 934156 0 2026-03-10T07:50:46.921 INFO:tasks.workunit.client.0.vm05.stdout:0/230: write d8/f4e [468066,63809] 0 2026-03-10T07:50:46.922 INFO:tasks.workunit.client.0.vm05.stdout:7/283: chown d1/f3d 2518818 1 2026-03-10T07:50:46.922 INFO:tasks.workunit.client.0.vm05.stdout:9/267: creat d8/d35/f5d x:0 0 0 2026-03-10T07:50:46.923 INFO:tasks.workunit.client.0.vm05.stdout:1/283: chown da/dd/d42/c4c 762554504 1 2026-03-10T07:50:46.923 INFO:tasks.workunit.client.0.vm05.stdout:7/284: dread - d1/d3c/d4b/f4f zero size 2026-03-10T07:50:46.924 INFO:tasks.workunit.client.0.vm05.stdout:7/285: write d1/d3c/d4b/f4f [682160,124611] 0 2026-03-10T07:50:46.928 INFO:tasks.workunit.client.0.vm05.stdout:9/268: dwrite d8/d35/d1c/f49 [0,4194304] 0 2026-03-10T07:50:46.937 
INFO:tasks.workunit.client.0.vm05.stdout:6/246: symlink d0/d6/d3b/l4e 0 2026-03-10T07:50:46.941 INFO:tasks.workunit.client.0.vm05.stdout:4/274: link d0/d20/c2d d0/d3b/d5c/c5e 0 2026-03-10T07:50:46.943 INFO:tasks.workunit.client.0.vm05.stdout:0/231: creat d8/dd/d37/f4f x:0 0 0 2026-03-10T07:50:46.945 INFO:tasks.workunit.client.0.vm05.stdout:1/284: creat da/dd/d12/d19/f4e x:0 0 0 2026-03-10T07:50:46.947 INFO:tasks.workunit.client.0.vm05.stdout:1/285: dwrite da/dd/d12/f31 [0,4194304] 0 2026-03-10T07:50:46.957 INFO:tasks.workunit.client.0.vm05.stdout:6/247: mkdir d0/d11/d4f 0 2026-03-10T07:50:46.959 INFO:tasks.workunit.client.0.vm05.stdout:5/273: mknod d2/c5c 0 2026-03-10T07:50:46.960 INFO:tasks.workunit.client.0.vm05.stdout:5/274: write d2/f8 [582967,53222] 0 2026-03-10T07:50:46.960 INFO:tasks.workunit.client.0.vm05.stdout:5/275: truncate d2/d12/f2f 66364 0 2026-03-10T07:50:46.970 INFO:tasks.workunit.client.0.vm05.stdout:0/232: dread d8/dd/d10/f19 [0,4194304] 0 2026-03-10T07:50:46.971 INFO:tasks.workunit.client.0.vm05.stdout:0/233: chown d8/dd/d10/d26/f31 2236 1 2026-03-10T07:50:46.971 INFO:tasks.workunit.client.0.vm05.stdout:0/234: readlink d8/dd/l36 0 2026-03-10T07:50:46.972 INFO:tasks.workunit.client.0.vm05.stdout:0/235: truncate d8/dd/f22 2664724 0 2026-03-10T07:50:46.977 INFO:tasks.workunit.client.0.vm05.stdout:8/247: link d1/dd/c14 d1/dd/d18/d20/d2a/d34/c4b 0 2026-03-10T07:50:46.978 INFO:tasks.workunit.client.0.vm05.stdout:8/248: write d1/fa [9320375,129923] 0 2026-03-10T07:50:46.978 INFO:tasks.workunit.client.0.vm05.stdout:8/249: chown d1/d45 4852 1 2026-03-10T07:50:46.983 INFO:tasks.workunit.client.0.vm05.stdout:4/275: dwrite d0/d6/f32 [0,4194304] 0 2026-03-10T07:50:46.995 INFO:tasks.workunit.client.0.vm05.stdout:0/236: mknod d8/dd/d37/c50 0 2026-03-10T07:50:46.996 INFO:tasks.workunit.client.0.vm05.stdout:3/271: write d8/d1c/f35 [3910383,106460] 0 2026-03-10T07:50:46.996 INFO:tasks.workunit.client.0.vm05.stdout:3/272: readlink d8/l17 0 2026-03-10T07:50:46.998 
INFO:tasks.workunit.client.0.vm05.stdout:0/237: dread d8/dd/d10/f19 [0,4194304] 0 2026-03-10T07:50:47.002 INFO:tasks.workunit.client.0.vm05.stdout:7/286: dwrite d1/d6/f22 [0,4194304] 0 2026-03-10T07:50:47.003 INFO:tasks.workunit.client.0.vm05.stdout:7/287: chown d1/l53 0 1 2026-03-10T07:50:47.004 INFO:tasks.workunit.client.0.vm05.stdout:7/288: chown d1/d6/f1d 15006 1 2026-03-10T07:50:47.004 INFO:tasks.workunit.client.0.vm05.stdout:7/289: fdatasync d1/d6/d47/f52 0 2026-03-10T07:50:47.011 INFO:tasks.workunit.client.0.vm05.stdout:6/248: write d0/d6/f1a [1550460,50238] 0 2026-03-10T07:50:47.012 INFO:tasks.workunit.client.0.vm05.stdout:6/249: write d0/d6/f24 [339131,120349] 0 2026-03-10T07:50:47.016 INFO:tasks.workunit.client.0.vm05.stdout:5/276: creat d2/d12/d4d/f5d x:0 0 0 2026-03-10T07:50:47.020 INFO:tasks.workunit.client.0.vm05.stdout:4/276: creat d0/d6/d9/d12/d45/d55/f5f x:0 0 0 2026-03-10T07:50:47.028 INFO:tasks.workunit.client.0.vm05.stdout:4/277: write d0/d6/d9/d12/d45/d55/f19 [13134973,28178] 0 2026-03-10T07:50:47.029 INFO:tasks.workunit.client.0.vm05.stdout:9/269: rename d8/d35/d1c/d26/d28/f46 to d8/f5e 0 2026-03-10T07:50:47.029 INFO:tasks.workunit.client.0.vm05.stdout:6/250: symlink d0/d35/d36/d43/l50 0 2026-03-10T07:50:47.029 INFO:tasks.workunit.client.0.vm05.stdout:6/251: readlink d0/d6/d3b/l3e 0 2026-03-10T07:50:47.029 INFO:tasks.workunit.client.0.vm05.stdout:6/252: truncate d0/d35/f41 543572 0 2026-03-10T07:50:47.030 INFO:tasks.workunit.client.0.vm05.stdout:0/238: dwrite d8/dd/d10/d26/d2a/f2e [0,4194304] 0 2026-03-10T07:50:47.032 INFO:tasks.workunit.client.0.vm05.stdout:0/239: write d8/dd/d10/d26/d2a/f2e [2741331,92607] 0 2026-03-10T07:50:47.038 INFO:tasks.workunit.client.0.vm05.stdout:1/286: getdents da 0 2026-03-10T07:50:47.038 INFO:tasks.workunit.client.0.vm05.stdout:1/287: chown da/c25 6 1 2026-03-10T07:50:47.044 INFO:tasks.workunit.client.0.vm05.stdout:8/250: rename d1/c7 to d1/dd/d18/d20/d2a/d34/c4c 0 2026-03-10T07:50:47.045 
INFO:tasks.workunit.client.0.vm05.stdout:8/251: chown d1/dd 23818 1 2026-03-10T07:50:47.047 INFO:tasks.workunit.client.0.vm05.stdout:9/270: creat d8/d35/d22/d33/d47/f5f x:0 0 0 2026-03-10T07:50:47.050 INFO:tasks.workunit.client.0.vm05.stdout:2/367: getdents d0/d8/d43/df 0 2026-03-10T07:50:47.053 INFO:tasks.workunit.client.0.vm05.stdout:4/278: dread d0/d6/d9/d5a/f2f [0,4194304] 0 2026-03-10T07:50:47.054 INFO:tasks.workunit.client.0.vm05.stdout:4/279: write d0/d6/d9/d12/d45/d55/f56 [613035,91044] 0 2026-03-10T07:50:47.057 INFO:tasks.workunit.client.0.vm05.stdout:4/280: dread d0/f24 [0,4194304] 0 2026-03-10T07:50:47.060 INFO:tasks.workunit.client.0.vm05.stdout:4/281: dwrite d0/d6/d9/d12/d45/d55/f56 [0,4194304] 0 2026-03-10T07:50:47.074 INFO:tasks.workunit.client.0.vm05.stdout:0/240: mknod d8/dd/de/d4d/c51 0 2026-03-10T07:50:47.074 INFO:tasks.workunit.client.0.vm05.stdout:0/241: write d8/dd/d10/d26/f31 [123648,46307] 0 2026-03-10T07:50:47.076 INFO:tasks.workunit.client.0.vm05.stdout:5/277: symlink d2/d4b/l5e 0 2026-03-10T07:50:47.077 INFO:tasks.workunit.client.0.vm05.stdout:5/278: readlink d2/d4b/l5e 0 2026-03-10T07:50:47.081 INFO:tasks.workunit.client.0.vm05.stdout:0/242: dread d8/f1c [0,4194304] 0 2026-03-10T07:50:47.082 INFO:tasks.workunit.client.0.vm05.stdout:0/243: write d8/dd/d10/d26/f31 [633908,127969] 0 2026-03-10T07:50:47.084 INFO:tasks.workunit.client.0.vm05.stdout:3/273: rename d8/d16/d19/d53 to d8/d1f/d2a/d33/d58 0 2026-03-10T07:50:47.102 INFO:tasks.workunit.client.0.vm05.stdout:4/282: write d0/d6/d9/d5a/f2f [2933681,105793] 0 2026-03-10T07:50:47.102 INFO:tasks.workunit.client.0.vm05.stdout:4/283: chown d0/d3b/c3e 14623 1 2026-03-10T07:50:47.105 INFO:tasks.workunit.client.0.vm05.stdout:4/284: dwrite d0/f24 [0,4194304] 0 2026-03-10T07:50:47.123 INFO:tasks.workunit.client.0.vm05.stdout:0/244: symlink d8/dd/d10/d26/d48/l52 0 2026-03-10T07:50:47.124 INFO:tasks.workunit.client.0.vm05.stdout:0/245: dread - d8/dd/d34/f3d zero size 2026-03-10T07:50:47.124 
INFO:tasks.workunit.client.0.vm05.stdout:0/246: truncate f6 1278826 0 2026-03-10T07:50:47.129 INFO:tasks.workunit.client.0.vm05.stdout:7/290: rename d1/d6/l28 to d1/d3c/d4b/l55 0 2026-03-10T07:50:47.134 INFO:tasks.workunit.client.0.vm05.stdout:3/274: dread d8/d1f/f49 [0,4194304] 0 2026-03-10T07:50:47.134 INFO:tasks.workunit.client.0.vm05.stdout:3/275: readlink d8/d16/l26 0 2026-03-10T07:50:47.135 INFO:tasks.workunit.client.0.vm05.stdout:8/252: truncate d1/f15 667341 0 2026-03-10T07:50:47.141 INFO:tasks.workunit.client.0.vm05.stdout:9/271: symlink d8/l60 0 2026-03-10T07:50:47.151 INFO:tasks.workunit.client.0.vm05.stdout:2/368: truncate d0/d8/d43/df/f20 1778075 0 2026-03-10T07:50:47.152 INFO:tasks.workunit.client.0.vm05.stdout:2/369: write d0/d8/d43/df/d4d/f57 [372342,6050] 0 2026-03-10T07:50:47.153 INFO:tasks.workunit.client.0.vm05.stdout:2/370: read d0/d47/f48 [2081728,110437] 0 2026-03-10T07:50:47.155 INFO:tasks.workunit.client.0.vm05.stdout:6/253: link d0/c20 d0/d6/c51 0 2026-03-10T07:50:47.160 INFO:tasks.workunit.client.0.vm05.stdout:5/279: creat d2/d20/d5b/f5f x:0 0 0 2026-03-10T07:50:47.160 INFO:tasks.workunit.client.0.vm05.stdout:5/280: dread - d2/d20/f32 zero size 2026-03-10T07:50:47.170 INFO:tasks.workunit.client.0.vm05.stdout:1/288: write da/dd/d12/f18 [388645,78982] 0 2026-03-10T07:50:47.176 INFO:tasks.workunit.client.0.vm05.stdout:7/291: symlink d1/d3c/d4b/l56 0 2026-03-10T07:50:47.187 INFO:tasks.workunit.client.0.vm05.stdout:3/276: unlink d8/f3c 0 2026-03-10T07:50:47.190 INFO:tasks.workunit.client.0.vm05.stdout:3/277: dwrite d8/d1f/d2a/d33/f50 [0,4194304] 0 2026-03-10T07:50:47.193 INFO:tasks.workunit.client.0.vm05.stdout:3/278: write d8/d1c/f35 [42559,111878] 0 2026-03-10T07:50:47.193 INFO:tasks.workunit.client.0.vm05.stdout:3/279: write d8/d1f/d24/f3e [1203708,35751] 0 2026-03-10T07:50:47.193 INFO:tasks.workunit.client.0.vm05.stdout:3/280: write d8/d1f/d24/f3e [1111428,52835] 0 2026-03-10T07:50:47.194 INFO:tasks.workunit.client.0.vm05.stdout:3/281: 
readlink d8/l17 0
2026-03-10T07:50:47.195 INFO:tasks.workunit.client.0.vm05.stdout:3/282: truncate d8/d1f/d2a/f32 1478274 0
2026-03-10T07:50:47.199 INFO:tasks.workunit.client.0.vm05.stdout:5/281: sync
2026-03-10T07:50:47.213 INFO:tasks.workunit.client.0.vm05.stdout:2/371: fsync d0/d8/d43/df/f20 0
2026-03-10T07:50:47.215 INFO:tasks.workunit.client.0.vm05.stdout:4/285: mkdir d0/d6/d60 0
2026-03-10T07:50:47.216 INFO:tasks.workunit.client.0.vm05.stdout:4/286: chown d0/d3b/c3e 416 1
2026-03-10T07:50:47.217 INFO:tasks.workunit.client.0.vm05.stdout:1/289: creat da/dd/d12/d34/f4f x:0 0 0
2026-03-10T07:50:47.219 INFO:tasks.workunit.client.0.vm05.stdout:0/247: creat d8/dd/d10/d26/d3a/f53 x:0 0 0
2026-03-10T07:50:47.221 INFO:tasks.workunit.client.0.vm05.stdout:7/292: fdatasync d1/d6/d3b/f42 0
2026-03-10T07:50:47.224 INFO:tasks.workunit.client.0.vm05.stdout:7/293: dread d1/d3c/d4b/f4f [0,4194304] 0
2026-03-10T07:50:47.234 INFO:tasks.workunit.client.0.vm05.stdout:5/282: rename d2/d20/d33/f47 to d2/d12/d2d/f60 0
2026-03-10T07:50:47.239 INFO:tasks.workunit.client.0.vm05.stdout:8/253: truncate d1/dd/d18/d20/d2a/f3a 3853877 0
2026-03-10T07:50:47.239 INFO:tasks.workunit.client.0.vm05.stdout:8/254: stat d1/l41 0
2026-03-10T07:50:47.240 INFO:tasks.workunit.client.0.vm05.stdout:8/255: read d1/f2c [865269,63105] 0
2026-03-10T07:50:47.269 INFO:tasks.workunit.client.0.vm05.stdout:7/294: mknod d1/d6/d3b/c57 0
2026-03-10T07:50:47.275 INFO:tasks.workunit.client.0.vm05.stdout:0/248: rename d8/f9 to d8/dd/de/d4d/f54 0
2026-03-10T07:50:47.276 INFO:tasks.workunit.client.0.vm05.stdout:0/249: write d8/dd/f29 [948740,62474] 0
2026-03-10T07:50:47.285 INFO:tasks.workunit.client.0.vm05.stdout:3/283: dwrite d8/d1c/f56 [0,4194304] 0
2026-03-10T07:50:47.291 INFO:tasks.workunit.client.0.vm05.stdout:5/283: mkdir d2/d5/d61 0
2026-03-10T07:50:47.291 INFO:tasks.workunit.client.0.vm05.stdout:5/284: truncate d2/f1a 817985 0
2026-03-10T07:50:47.292 INFO:tasks.workunit.client.0.vm05.stdout:5/285: chown d2/c29
7992 1
2026-03-10T07:50:47.295 INFO:tasks.workunit.client.0.vm05.stdout:8/256: rmdir d1/dd/d18/d20/d2a/d48 39
2026-03-10T07:50:47.307 INFO:tasks.workunit.client.0.vm05.stdout:4/287: symlink d0/d3b/d5c/l61 0
2026-03-10T07:50:47.315 INFO:tasks.workunit.client.0.vm05.stdout:1/290: truncate da/d26/f2d 680303 0
2026-03-10T07:50:47.317 INFO:tasks.workunit.client.0.vm05.stdout:7/295: creat d1/d6/f58 x:0 0 0
2026-03-10T07:50:47.324 INFO:tasks.workunit.client.0.vm05.stdout:0/250: dread d8/dd/d37/f38 [0,4194304] 0
2026-03-10T07:50:47.327 INFO:tasks.workunit.client.0.vm05.stdout:3/284: symlink d8/d1f/l59 0
2026-03-10T07:50:47.327 INFO:tasks.workunit.client.0.vm05.stdout:3/285: dread - d8/d16/d19/d37/f43 zero size
2026-03-10T07:50:47.330 INFO:tasks.workunit.client.0.vm05.stdout:0/251: dread d8/f33 [0,4194304] 0
2026-03-10T07:50:47.335 INFO:tasks.workunit.client.0.vm05.stdout:5/286: mkdir d2/d12/d2d/d62 0
2026-03-10T07:50:47.335 INFO:tasks.workunit.client.0.vm05.stdout:5/287: chown d2/d4b 3789820 1
2026-03-10T07:50:47.338 INFO:tasks.workunit.client.0.vm05.stdout:8/257: write d1/dd/d18/d20/d2a/d48/f4a [920065,67228] 0
2026-03-10T07:50:47.339 INFO:tasks.workunit.client.0.vm05.stdout:8/258: chown d1/dd/d18/d20/d2a/d34 16 1
2026-03-10T07:50:47.342 INFO:tasks.workunit.client.0.vm05.stdout:9/272: getdents d8/d35/d22/d33/d47 0
2026-03-10T07:50:47.347 INFO:tasks.workunit.client.0.vm05.stdout:2/372: link d0/d8/d43/df/d53/f6d d0/d47/f76 0
2026-03-10T07:50:47.350 INFO:tasks.workunit.client.0.vm05.stdout:2/373: dread d0/d8/f2d [0,4194304] 0
2026-03-10T07:50:47.351 INFO:tasks.workunit.client.0.vm05.stdout:6/254: getdents d0/d11/d2e 0
2026-03-10T07:50:47.354 INFO:tasks.workunit.client.0.vm05.stdout:4/288: fdatasync d0/d6/d37/f3d 0
2026-03-10T07:50:47.359 INFO:tasks.workunit.client.0.vm05.stdout:1/291: rename da/dd/d27/f4b to da/d26/d2b/f50 0
2026-03-10T07:50:47.372 INFO:tasks.workunit.client.0.vm05.stdout:0/252: readlink d8/dd/de/l17 0
2026-03-10T07:50:47.372
INFO:tasks.workunit.client.0.vm05.stdout:0/253: chown d8/f20 2060 1
2026-03-10T07:50:47.372 INFO:tasks.workunit.client.0.vm05.stdout:0/254: fdatasync d8/dd/d10/d26/d2a/f2e 0
2026-03-10T07:50:47.374 INFO:tasks.workunit.client.0.vm05.stdout:8/259: fdatasync d1/dd/d18/f21 0
2026-03-10T07:50:47.378 INFO:tasks.workunit.client.0.vm05.stdout:8/260: dwrite d1/dd/d18/f22 [0,4194304] 0
2026-03-10T07:50:47.381 INFO:tasks.workunit.client.0.vm05.stdout:6/255: creat d0/d11/d22/f52 x:0 0 0
2026-03-10T07:50:47.381 INFO:tasks.workunit.client.0.vm05.stdout:6/256: chown d0/d11/f13 90743 1
2026-03-10T07:50:47.382 INFO:tasks.workunit.client.0.vm05.stdout:6/257: write d0/d35/f41 [1212459,64201] 0
2026-03-10T07:50:47.383 INFO:tasks.workunit.client.0.vm05.stdout:4/289: symlink d0/d6/d9/d12/d45/d55/l62 0
2026-03-10T07:50:47.385 INFO:tasks.workunit.client.0.vm05.stdout:3/286: rename d8/c10 to d8/d1c/c5a 0
2026-03-10T07:50:47.386 INFO:tasks.workunit.client.0.vm05.stdout:1/292: fsync da/dd/d12/d34/f38 0
2026-03-10T07:50:47.389 INFO:tasks.workunit.client.0.vm05.stdout:7/296: mkdir d1/d34/d59 0
2026-03-10T07:50:47.391 INFO:tasks.workunit.client.0.vm05.stdout:7/297: dread d1/f37 [0,4194304] 0
2026-03-10T07:50:47.393 INFO:tasks.workunit.client.0.vm05.stdout:0/255: fdatasync d8/dd/d10/f19 0
2026-03-10T07:50:47.394 INFO:tasks.workunit.client.0.vm05.stdout:5/288: link d2/d20/d5b/f5f d2/d5/f63 0
2026-03-10T07:50:47.398 INFO:tasks.workunit.client.0.vm05.stdout:4/290: creat d0/d20/f63 x:0 0 0
2026-03-10T07:50:47.403 INFO:tasks.workunit.client.0.vm05.stdout:1/293: rename da/dd/d12/l1b to da/l51 0
2026-03-10T07:50:47.403 INFO:tasks.workunit.client.0.vm05.stdout:7/298: mknod d1/d6/c5a 0
2026-03-10T07:50:47.407 INFO:tasks.workunit.client.0.vm05.stdout:8/261: mkdir d1/dd/d4d 0
2026-03-10T07:50:47.408 INFO:tasks.workunit.client.0.vm05.stdout:8/262: dread - d1/dd/d18/f3f zero size
2026-03-10T07:50:47.408 INFO:tasks.workunit.client.0.vm05.stdout:6/258: truncate d0/d11/d22/f2b 1282387 0
2026-03-10T07:50:47.410
INFO:tasks.workunit.client.0.vm05.stdout:0/256: rename d8/dd/de to d8/dd/de/d55 22
2026-03-10T07:50:47.411 INFO:tasks.workunit.client.0.vm05.stdout:0/257: chown d8/dd/d10/d26/d3a/f53 2067537 1
2026-03-10T07:50:47.412 INFO:tasks.workunit.client.0.vm05.stdout:5/289: mkdir d2/d20/d4c/d64 0
2026-03-10T07:50:47.413 INFO:tasks.workunit.client.0.vm05.stdout:4/291: dwrite d0/d6/f15 [0,4194304] 0
2026-03-10T07:50:47.415 INFO:tasks.workunit.client.0.vm05.stdout:9/273: link d8/d35/d22/d33/l4b d8/l61 0
2026-03-10T07:50:47.416 INFO:tasks.workunit.client.0.vm05.stdout:8/263: symlink d1/dd/d18/d20/l4e 0
2026-03-10T07:50:47.420 INFO:tasks.workunit.client.0.vm05.stdout:9/274: dread d8/d35/d1c/f4c [0,4194304] 0
2026-03-10T07:50:47.427 INFO:tasks.workunit.client.0.vm05.stdout:0/258: rename d8/dd/de to d8/dd/d37/d56 0
2026-03-10T07:50:47.429 INFO:tasks.workunit.client.0.vm05.stdout:8/264: dwrite d1/dd/d18/f47 [0,4194304] 0
2026-03-10T07:50:47.436 INFO:tasks.workunit.client.0.vm05.stdout:4/292: dwrite d0/d6/d9/d12/f36 [0,4194304] 0
2026-03-10T07:50:47.440 INFO:tasks.workunit.client.0.vm05.stdout:9/275: dread d8/d35/d22/d33/f41 [4194304,4194304] 0
2026-03-10T07:50:47.443 INFO:tasks.workunit.client.0.vm05.stdout:9/276: truncate d8/d35/d22/d33/f5b 1279753 0
2026-03-10T07:50:47.445 INFO:tasks.workunit.client.0.vm05.stdout:5/290: link d2/d12/d2d/f39 d2/d5/d61/f65 0
2026-03-10T07:50:47.445 INFO:tasks.workunit.client.0.vm05.stdout:5/291: stat d2/d5/f63 0
2026-03-10T07:50:47.455 INFO:tasks.workunit.client.0.vm05.stdout:4/293: symlink d0/d3b/d5c/l64 0
2026-03-10T07:50:47.460 INFO:tasks.workunit.client.0.vm05.stdout:8/265: getdents d1/d45 0
2026-03-10T07:50:47.463 INFO:tasks.workunit.client.0.vm05.stdout:4/294: mkdir d0/d6/d9/d12/d65 0
2026-03-10T07:50:47.465 INFO:tasks.workunit.client.0.vm05.stdout:9/277: unlink d8/d35/d38/f4d 0
2026-03-10T07:50:47.467 INFO:tasks.workunit.client.0.vm05.stdout:5/292: creat d2/d5/d61/f66 x:0 0 0
2026-03-10T07:50:47.467
INFO:tasks.workunit.client.0.vm05.stdout:5/293: fsync d2/d20/d33/f45 0
2026-03-10T07:50:47.469 INFO:tasks.workunit.client.0.vm05.stdout:4/295: creat d0/d6/d9/d12/d45/f66 x:0 0 0
2026-03-10T07:50:47.471 INFO:tasks.workunit.client.0.vm05.stdout:9/278: mkdir d8/d35/d22/d33/d62 0
2026-03-10T07:50:47.472 INFO:tasks.workunit.client.0.vm05.stdout:9/279: write d8/d35/d1c/d26/d28/f43 [3562862,100883] 0
2026-03-10T07:50:47.477 INFO:tasks.workunit.client.0.vm05.stdout:8/266: link d1/dd/d18/d20/f46 d1/dd/d18/d20/d2a/d34/f4f 0
2026-03-10T07:50:47.478 INFO:tasks.workunit.client.0.vm05.stdout:7/299: sync
2026-03-10T07:50:47.479 INFO:tasks.workunit.client.0.vm05.stdout:7/300: fsync d1/d6/f32 0
2026-03-10T07:50:47.479 INFO:tasks.workunit.client.0.vm05.stdout:7/301: stat d1/d6/f22 0
2026-03-10T07:50:47.485 INFO:tasks.workunit.client.0.vm05.stdout:4/296: stat d0/d6/ld 0
2026-03-10T07:50:47.491 INFO:tasks.workunit.client.0.vm05.stdout:4/297: write d0/d6/d9/d5a/f2f [467193,33138] 0
2026-03-10T07:50:47.491 INFO:tasks.workunit.client.0.vm05.stdout:4/298: dread d0/d6/d9/d5a/f2f [0,4194304] 0
2026-03-10T07:50:47.497 INFO:tasks.workunit.client.0.vm05.stdout:3/287: chown d8/d1c/c5a 218308194 1
2026-03-10T07:50:47.501 INFO:tasks.workunit.client.0.vm05.stdout:9/280: mkdir d8/d35/d1c/d2c/d63 0
2026-03-10T07:50:47.506 INFO:tasks.workunit.client.0.vm05.stdout:2/374: dwrite d0/f4 [0,4194304] 0
2026-03-10T07:50:47.520 INFO:tasks.workunit.client.0.vm05.stdout:8/267: creat d1/dd/d18/d20/d2a/d48/f50 x:0 0 0
2026-03-10T07:50:47.523 INFO:tasks.workunit.client.0.vm05.stdout:0/259: write d8/f2d [715783,55614] 0
2026-03-10T07:50:47.523 INFO:tasks.workunit.client.0.vm05.stdout:7/302: read d1/d6/f1d [1501228,98274] 0
2026-03-10T07:50:47.524 INFO:tasks.workunit.client.0.vm05.stdout:7/303: write d1/d6/f58 [346838,60945] 0
2026-03-10T07:50:47.527 INFO:tasks.workunit.client.0.vm05.stdout:3/288: write d8/d1f/d2a/d4a/f4f [1110588,120628] 0
2026-03-10T07:50:47.536 INFO:tasks.workunit.client.0.vm05.stdout:6/259:
write d0/d11/d22/f2b [1174469,87723] 0
2026-03-10T07:50:47.537 INFO:tasks.workunit.client.0.vm05.stdout:6/260: write d0/d6/f2c [2178917,100242] 0
2026-03-10T07:50:47.537 INFO:tasks.workunit.client.0.vm05.stdout:6/261: fsync d0/f3a 0
2026-03-10T07:50:47.544 INFO:tasks.workunit.client.0.vm05.stdout:3/289: symlink d8/d16/d19/l5b 0
2026-03-10T07:50:47.544 INFO:tasks.workunit.client.0.vm05.stdout:9/281: creat d8/d35/d1c/d2c/d63/f64 x:0 0 0
2026-03-10T07:50:47.549 INFO:tasks.workunit.client.0.vm05.stdout:1/294: symlink da/d26/l52 0
2026-03-10T07:50:47.551 INFO:tasks.workunit.client.0.vm05.stdout:6/262: dread d0/f29 [0,4194304] 0
2026-03-10T07:50:47.555 INFO:tasks.workunit.client.0.vm05.stdout:8/268: symlink d1/dd/d4d/l51 0
2026-03-10T07:50:47.556 INFO:tasks.workunit.client.0.vm05.stdout:5/294: dwrite d2/d5/f1e [0,4194304] 0
2026-03-10T07:50:47.568 INFO:tasks.workunit.client.0.vm05.stdout:7/304: mkdir d1/d5b 0
2026-03-10T07:50:47.570 INFO:tasks.workunit.client.0.vm05.stdout:4/299: creat d0/d6/d9/f67 x:0 0 0
2026-03-10T07:50:47.572 INFO:tasks.workunit.client.0.vm05.stdout:2/375: write d0/d8/d43/f30 [1049468,30255] 0
2026-03-10T07:50:47.572 INFO:tasks.workunit.client.0.vm05.stdout:2/376: stat d0/d8/d43/d38/l6a 0
2026-03-10T07:50:47.574 INFO:tasks.workunit.client.0.vm05.stdout:9/282: write d8/d35/f1d [832875,92807] 0
2026-03-10T07:50:47.576 INFO:tasks.workunit.client.0.vm05.stdout:1/295: fdatasync da/dd/d12/f31 0
2026-03-10T07:50:47.577 INFO:tasks.workunit.client.0.vm05.stdout:7/305: readlink d1/d34/l35 0
2026-03-10T07:50:47.577 INFO:tasks.workunit.client.0.vm05.stdout:1/296: write da/dd/d27/f36 [28294,74148] 0
2026-03-10T07:50:47.578 INFO:tasks.workunit.client.0.vm05.stdout:4/300: creat d0/d28/f68 x:0 0 0
2026-03-10T07:50:47.579 INFO:tasks.workunit.client.0.vm05.stdout:4/301: truncate d0/d28/f33 1037811 0
2026-03-10T07:50:47.580 INFO:tasks.workunit.client.0.vm05.stdout:4/302: write d0/f23 [3400682,9655] 0
2026-03-10T07:50:47.580
INFO:tasks.workunit.client.0.vm05.stdout:2/377: symlink d0/d8/d3d/l77 0
2026-03-10T07:50:47.582 INFO:tasks.workunit.client.0.vm05.stdout:4/303: chown d0/d3b/f53 25492164 1
2026-03-10T07:50:47.586 INFO:tasks.workunit.client.0.vm05.stdout:1/297: dread da/dd/d12/f22 [0,4194304] 0
2026-03-10T07:50:47.599 INFO:tasks.workunit.client.0.vm05.stdout:6/263: creat d0/d11/d4f/f53 x:0 0 0
2026-03-10T07:50:47.600 INFO:tasks.workunit.client.0.vm05.stdout:7/306: unlink d1/f27 0
2026-03-10T07:50:47.600 INFO:tasks.workunit.client.0.vm05.stdout:4/304: mkdir d0/d6/d9/d12/d69 0
2026-03-10T07:50:47.600 INFO:tasks.workunit.client.0.vm05.stdout:2/378: dread d0/d8/d43/df/d25/f29 [0,4194304] 0
2026-03-10T07:50:47.600 INFO:tasks.workunit.client.0.vm05.stdout:3/290: creat d8/f5c x:0 0 0
2026-03-10T07:50:47.600 INFO:tasks.workunit.client.0.vm05.stdout:3/291: write d8/f3b [428549,75131] 0
2026-03-10T07:50:47.600 INFO:tasks.workunit.client.0.vm05.stdout:1/298: unlink da/dd/d12/f3d 0
2026-03-10T07:50:47.601 INFO:tasks.workunit.client.0.vm05.stdout:9/283: sync
2026-03-10T07:50:47.605 INFO:tasks.workunit.client.0.vm05.stdout:7/307: symlink d1/d6/l5c 0
2026-03-10T07:50:47.614 INFO:tasks.workunit.client.0.vm05.stdout:1/299: read da/d26/f2d [23162,91007] 0
2026-03-10T07:50:47.616 INFO:tasks.workunit.client.0.vm05.stdout:8/269: getdents d1/dd/d4d 0
2026-03-10T07:50:47.617 INFO:tasks.workunit.client.0.vm05.stdout:0/260: truncate d8/dd/f22 383401 0
2026-03-10T07:50:47.618 INFO:tasks.workunit.client.0.vm05.stdout:0/261: write f6 [1814057,10852] 0
2026-03-10T07:50:47.618 INFO:tasks.workunit.client.0.vm05.stdout:0/262: readlink d8/dd/d37/d56/l17 0
2026-03-10T07:50:47.624 INFO:tasks.workunit.client.0.vm05.stdout:9/284: fdatasync d8/f14 0
2026-03-10T07:50:47.629 INFO:tasks.workunit.client.0.vm05.stdout:5/295: write d2/d20/d5b/f5f [1003732,5154] 0
2026-03-10T07:50:47.629 INFO:tasks.workunit.client.0.vm05.stdout:5/296: chown d2/c13 0 1
2026-03-10T07:50:47.630 INFO:tasks.workunit.client.0.vm05.stdout:5/297: read
d2/d20/f2a [3019259,101516] 0
2026-03-10T07:50:47.638 INFO:tasks.workunit.client.0.vm05.stdout:1/300: write da/d26/d2b/f45 [2365794,31645] 0
2026-03-10T07:50:47.643 INFO:tasks.workunit.client.0.vm05.stdout:6/264: link d0/d35/d36/c4b d0/d11/d31/c54 0
2026-03-10T07:50:47.657 INFO:tasks.workunit.client.0.vm05.stdout:7/308: symlink d1/d34/d59/l5d 0
2026-03-10T07:50:47.660 INFO:tasks.workunit.client.0.vm05.stdout:7/309: dwrite d1/d6/f1b [0,4194304] 0
2026-03-10T07:50:47.669 INFO:tasks.workunit.client.0.vm05.stdout:5/298: mknod d2/d20/d5b/c67 0
2026-03-10T07:50:47.670 INFO:tasks.workunit.client.0.vm05.stdout:1/301: readlink da/l51 0
2026-03-10T07:50:47.671 INFO:tasks.workunit.client.0.vm05.stdout:8/270: truncate d1/f15 243994 0
2026-03-10T07:50:47.675 INFO:tasks.workunit.client.0.vm05.stdout:8/271: dwrite d1/dd/d18/f3f [0,4194304] 0
2026-03-10T07:50:47.676 INFO:tasks.workunit.client.0.vm05.stdout:6/265: creat d0/d6/d3b/f55 x:0 0 0
2026-03-10T07:50:47.678 INFO:tasks.workunit.client.0.vm05.stdout:0/263: symlink d8/dd/d10/d4b/l57 0
2026-03-10T07:50:47.681 INFO:tasks.workunit.client.0.vm05.stdout:6/266: dread d0/d6/f44 [0,4194304] 0
2026-03-10T07:50:47.682 INFO:tasks.workunit.client.0.vm05.stdout:9/285: symlink d8/d35/d1c/l65 0
2026-03-10T07:50:47.688 INFO:tasks.workunit.client.0.vm05.stdout:7/310: mknod d1/d6/d47/c5e 0
2026-03-10T07:50:47.692 INFO:tasks.workunit.client.0.vm05.stdout:4/305: getdents d0/d28 0
2026-03-10T07:50:47.694 INFO:tasks.workunit.client.0.vm05.stdout:2/379: getdents d0/d8/d43/df/d4e 0
2026-03-10T07:50:47.694 INFO:tasks.workunit.client.0.vm05.stdout:2/380: write d0/d8/f65 [1327934,129798] 0
2026-03-10T07:50:47.695 INFO:tasks.workunit.client.0.vm05.stdout:2/381: write d0/d8/f1c [2332215,2718] 0
2026-03-10T07:50:47.700 INFO:tasks.workunit.client.0.vm05.stdout:3/292: rename d8/d16/d19/f27 to d8/f5d 0
2026-03-10T07:50:47.702 INFO:tasks.workunit.client.0.vm05.stdout:1/302: symlink da/dd/d2a/l53 0
2026-03-10T07:50:47.713
INFO:tasks.workunit.client.0.vm05.stdout:8/272: unlink d1/dd/l12 0
2026-03-10T07:50:47.720 INFO:tasks.workunit.client.0.vm05.stdout:0/264: unlink d8/dd/d10/d26/f31 0
2026-03-10T07:50:47.721 INFO:tasks.workunit.client.0.vm05.stdout:0/265: chown d8/dd/d34/f3d 7 1
2026-03-10T07:50:47.723 INFO:tasks.workunit.client.0.vm05.stdout:8/273: dread d1/fe [0,4194304] 0
2026-03-10T07:50:47.733 INFO:tasks.workunit.client.0.vm05.stdout:9/286: mknod d8/d35/d1c/d26/d28/c66 0
2026-03-10T07:50:47.734 INFO:tasks.workunit.client.0.vm05.stdout:9/287: readlink d8/d35/d1c/d36/l55 0
2026-03-10T07:50:47.734 INFO:tasks.workunit.client.0.vm05.stdout:9/288: write d8/d35/d1c/d20/f32 [724171,6220] 0
2026-03-10T07:50:47.738 INFO:tasks.workunit.client.0.vm05.stdout:7/311: mknod d1/d34/d59/c5f 0
2026-03-10T07:50:47.739 INFO:tasks.workunit.client.0.vm05.stdout:4/306: chown d0/d6/d9/d12/l14 1 1
2026-03-10T07:50:47.751 INFO:tasks.workunit.client.0.vm05.stdout:3/293: symlink d8/d1f/d2a/d33/l5e 0
2026-03-10T07:50:47.756 INFO:tasks.workunit.client.0.vm05.stdout:1/303: creat da/dd/d2a/f54 x:0 0 0
2026-03-10T07:50:47.760 INFO:tasks.workunit.client.0.vm05.stdout:5/299: link d2/d5/f23 d2/d20/d33/d53/f68 0
2026-03-10T07:50:47.775 INFO:tasks.workunit.client.0.vm05.stdout:9/289: creat d8/d35/d1c/d20/d54/f67 x:0 0 0
2026-03-10T07:50:47.777 INFO:tasks.workunit.client.0.vm05.stdout:4/307: sync
2026-03-10T07:50:47.784 INFO:tasks.workunit.client.0.vm05.stdout:7/312: fdatasync d1/f16 0
2026-03-10T07:50:47.788 INFO:tasks.workunit.client.0.vm05.stdout:3/294: mkdir d8/d1f/d2a/d34/d5f 0
2026-03-10T07:50:47.788 INFO:tasks.workunit.client.0.vm05.stdout:3/295: readlink d8/l17 0
2026-03-10T07:50:47.789 INFO:tasks.workunit.client.0.vm05.stdout:1/304: mkdir da/dd/d2a/d55 0
2026-03-10T07:50:47.791 INFO:tasks.workunit.client.0.vm05.stdout:5/300: symlink d2/d5/d61/l69 0
2026-03-10T07:50:47.794 INFO:tasks.workunit.client.0.vm05.stdout:2/382: dwrite d0/d8/d43/df/d25/f29 [0,4194304] 0
2026-03-10T07:50:47.796
INFO:tasks.workunit.client.0.vm05.stdout:5/301: read d2/f9 [2871542,41840] 0
2026-03-10T07:50:47.803 INFO:tasks.workunit.client.0.vm05.stdout:9/290: symlink d8/d35/d1c/d26/d28/l68 0
2026-03-10T07:50:47.806 INFO:tasks.workunit.client.0.vm05.stdout:6/267: write d0/d6/f1d [173337,45701] 0
2026-03-10T07:50:47.806 INFO:tasks.workunit.client.0.vm05.stdout:8/274: write d1/dd/f17 [2690864,15511] 0
2026-03-10T07:50:47.807 INFO:tasks.workunit.client.0.vm05.stdout:6/268: fsync d0/d11/d4f/f53 0
2026-03-10T07:50:47.811 INFO:tasks.workunit.client.0.vm05.stdout:3/296: rename d8/d1f/d2a/d33 to d8/d22/d60 0
2026-03-10T07:50:47.812 INFO:tasks.workunit.client.0.vm05.stdout:7/313: dread d1/d34/f3e [0,4194304] 0
2026-03-10T07:50:47.813 INFO:tasks.workunit.client.0.vm05.stdout:7/314: write d1/d6/f4e [364676,14008] 0
2026-03-10T07:50:47.814 INFO:tasks.workunit.client.0.vm05.stdout:1/305: mknod da/dd/c56 0
2026-03-10T07:50:47.815 INFO:tasks.workunit.client.0.vm05.stdout:1/306: stat da/dd/d12/d34/f4f 0
2026-03-10T07:50:47.815 INFO:tasks.workunit.client.0.vm05.stdout:0/266: link d8/dd/d10/c49 d8/dd/d37/d56/d4d/c58 0
2026-03-10T07:50:47.818 INFO:tasks.workunit.client.0.vm05.stdout:1/307: dwrite da/d26/f33 [4194304,4194304] 0
2026-03-10T07:50:47.819 INFO:tasks.workunit.client.0.vm05.stdout:1/308: fsync da/dd/d12/f18 0
2026-03-10T07:50:47.829 INFO:tasks.workunit.client.0.vm05.stdout:9/291: unlink d8/l60 0
2026-03-10T07:50:47.832 INFO:tasks.workunit.client.0.vm05.stdout:9/292: dwrite d8/d35/d1c/f49 [0,4194304] 0
2026-03-10T07:50:47.840 INFO:tasks.workunit.client.0.vm05.stdout:4/308: creat d0/d6/d9/d12/d69/f6a x:0 0 0
2026-03-10T07:50:47.847 INFO:tasks.workunit.client.0.vm05.stdout:6/269: mkdir d0/d11/d4f/d56 0
2026-03-10T07:50:47.849 INFO:tasks.workunit.client.0.vm05.stdout:3/297: rename d8/d1c/f35 to d8/d22/d60/f61 0
2026-03-10T07:50:47.852 INFO:tasks.workunit.client.0.vm05.stdout:3/298: dwrite d8/f5c [0,4194304] 0
2026-03-10T07:50:47.859 INFO:tasks.workunit.client.0.vm05.stdout:7/315: rmdir
d1 39
2026-03-10T07:50:47.868 INFO:tasks.workunit.client.0.vm05.stdout:2/383: mknod d0/c78 0
2026-03-10T07:50:47.868 INFO:tasks.workunit.client.0.vm05.stdout:5/302: creat d2/d20/d33/d53/f6a x:0 0 0
2026-03-10T07:50:47.868 INFO:tasks.workunit.client.0.vm05.stdout:5/303: readlink d2/d4b/l5e 0
2026-03-10T07:50:47.869 INFO:tasks.workunit.client.0.vm05.stdout:9/293: unlink d8/d35/d1c/f40 0
2026-03-10T07:50:47.869 INFO:tasks.workunit.client.0.vm05.stdout:9/294: readlink d8/d35/d22/l4e 0
2026-03-10T07:50:47.870 INFO:tasks.workunit.client.0.vm05.stdout:2/384: read d0/f6 [2062433,68295] 0
2026-03-10T07:50:47.870 INFO:tasks.workunit.client.0.vm05.stdout:4/309: mknod d0/d6/d9/d12/c6b 0
2026-03-10T07:50:47.870 INFO:tasks.workunit.client.0.vm05.stdout:9/295: stat d8/d35/d1c/d20/l37 0
2026-03-10T07:50:47.871 INFO:tasks.workunit.client.0.vm05.stdout:9/296: chown d8/d35/d1c/d20/l37 21317887 1
2026-03-10T07:50:47.872 INFO:tasks.workunit.client.0.vm05.stdout:6/270: fdatasync d0/d11/d2e/f30 0
2026-03-10T07:50:47.872 INFO:tasks.workunit.client.0.vm05.stdout:4/310: read d0/f2 [7904143,92490] 0
2026-03-10T07:50:47.874 INFO:tasks.workunit.client.0.vm05.stdout:2/385: dwrite d0/d2a/f2e [0,4194304] 0
2026-03-10T07:50:47.881 INFO:tasks.workunit.client.0.vm05.stdout:2/386: dwrite d0/f7 [0,4194304] 0
2026-03-10T07:50:47.897 INFO:tasks.workunit.client.0.vm05.stdout:2/387: dread d0/f5 [0,4194304] 0
2026-03-10T07:50:47.907 INFO:tasks.workunit.client.0.vm05.stdout:1/309: mknod da/c57 0
2026-03-10T07:50:47.913 INFO:tasks.workunit.client.0.vm05.stdout:5/304: read d2/d12/f2b [1116196,75298] 0
2026-03-10T07:50:47.915 INFO:tasks.workunit.client.0.vm05.stdout:6/271: mkdir d0/d11/d57 0
2026-03-10T07:50:47.921 INFO:tasks.workunit.client.0.vm05.stdout:2/388: symlink d0/d8/d43/df/d4d/l79 0
2026-03-10T07:50:47.921 INFO:tasks.workunit.client.0.vm05.stdout:2/389: chown d0/d8/d66 142264 1
2026-03-10T07:50:47.921 INFO:tasks.workunit.client.0.vm05.stdout:2/390: stat d0/d2a/f45 0
2026-03-10T07:50:47.921
INFO:tasks.workunit.client.0.vm05.stdout:2/391: stat d0/d8/d43/d38/l6a 0
2026-03-10T07:50:47.924 INFO:tasks.workunit.client.0.vm05.stdout:7/316: mkdir d1/d34/d59/d60 0
2026-03-10T07:50:47.926 INFO:tasks.workunit.client.0.vm05.stdout:0/267: creat d8/dd/f59 x:0 0 0
2026-03-10T07:50:47.930 INFO:tasks.workunit.client.0.vm05.stdout:0/268: dwrite d8/f2d [0,4194304] 0
2026-03-10T07:50:47.934 INFO:tasks.workunit.client.0.vm05.stdout:0/269: dread - d8/dd/d37/f4f zero size
2026-03-10T07:50:47.939 INFO:tasks.workunit.client.0.vm05.stdout:9/297: symlink d8/d35/l69 0
2026-03-10T07:50:47.940 INFO:tasks.workunit.client.0.vm05.stdout:9/298: chown d8/d35/d1c/d2c 339442291 1
2026-03-10T07:50:47.943 INFO:tasks.workunit.client.0.vm05.stdout:6/272: creat d0/d11/f58 x:0 0 0
2026-03-10T07:50:47.944 INFO:tasks.workunit.client.0.vm05.stdout:8/275: mkdir d1/d52 0
2026-03-10T07:50:47.948 INFO:tasks.workunit.client.0.vm05.stdout:7/317: dread d1/f3d [0,4194304] 0
2026-03-10T07:50:47.955 INFO:tasks.workunit.client.0.vm05.stdout:1/310: mkdir da/dd/d12/d34/d58 0
2026-03-10T07:50:47.955 INFO:tasks.workunit.client.0.vm05.stdout:2/392: dread d0/d8/d43/df/f3a [0,4194304] 0
2026-03-10T07:50:47.958 INFO:tasks.workunit.client.0.vm05.stdout:3/299: getdents d8/d1f 0
2026-03-10T07:50:47.960 INFO:tasks.workunit.client.0.vm05.stdout:5/305: truncate d2/d12/d2d/f36 164346 0
2026-03-10T07:50:47.962 INFO:tasks.workunit.client.0.vm05.stdout:1/311: rmdir da/dd/d12/d34 39
2026-03-10T07:50:47.972 INFO:tasks.workunit.client.0.vm05.stdout:2/393: symlink d0/d2a/l7a 0
2026-03-10T07:50:47.980 INFO:tasks.workunit.client.0.vm05.stdout:7/318: dwrite d1/f3d [0,4194304] 0
2026-03-10T07:50:47.988 INFO:tasks.workunit.client.0.vm05.stdout:4/311: getdents d0/d3b 0
2026-03-10T07:50:47.988 INFO:tasks.workunit.client.0.vm05.stdout:4/312: stat d0/d6/d9/d12/d4f 0
2026-03-10T07:50:47.988 INFO:tasks.workunit.client.0.vm05.stdout:4/313: chown d0/d6/d9/l27 5 1
2026-03-10T07:50:47.988 INFO:tasks.workunit.client.0.vm05.stdout:3/300: symlink
d8/d16/d19/l62 0
2026-03-10T07:50:47.989 INFO:tasks.workunit.client.0.vm05.stdout:3/301: fdatasync d8/d16/f4c 0
2026-03-10T07:50:47.993 INFO:tasks.workunit.client.0.vm05.stdout:5/306: creat d2/d12/f6b x:0 0 0
2026-03-10T07:50:47.997 INFO:tasks.workunit.client.0.vm05.stdout:0/270: rename d8/dd/d10/d26/d48/l52 to d8/dd/d37/d56/l5a 0
2026-03-10T07:50:47.997 INFO:tasks.workunit.client.0.vm05.stdout:0/271: dread - d8/dd/d10/d26/d3a/f53 zero size
2026-03-10T07:50:47.997 INFO:tasks.workunit.client.0.vm05.stdout:2/394: creat d0/d8/f7b x:0 0 0
2026-03-10T07:50:47.997 INFO:tasks.workunit.client.0.vm05.stdout:2/395: stat d0/d8/d43/d38/c6b 0
2026-03-10T07:50:48.000 INFO:tasks.workunit.client.0.vm05.stdout:5/307: fdatasync d2/d20/d33/d53/f68 0
2026-03-10T07:50:48.000 INFO:tasks.workunit.client.0.vm05.stdout:8/276: dread d1/dd/d18/f29 [4194304,4194304] 0
2026-03-10T07:50:48.000 INFO:tasks.workunit.client.0.vm05.stdout:5/308: write d2/f8 [3505425,96704] 0
2026-03-10T07:50:48.003 INFO:tasks.workunit.client.0.vm05.stdout:6/273: rename d0/d6/f2c to d0/d35/d36/f59 0
2026-03-10T07:50:48.004 INFO:tasks.workunit.client.0.vm05.stdout:1/312: mkdir da/d59 0
2026-03-10T07:50:48.005 INFO:tasks.workunit.client.0.vm05.stdout:1/313: dread - da/dd/d12/d19/f4e zero size
2026-03-10T07:50:48.007 INFO:tasks.workunit.client.0.vm05.stdout:7/319: mknod d1/c61 0
2026-03-10T07:50:48.011 INFO:tasks.workunit.client.0.vm05.stdout:3/302: rename d8/d22/d60/f44 to d8/d1c/f63 0
2026-03-10T07:50:48.020 INFO:tasks.workunit.client.0.vm05.stdout:3/303: dwrite d8/d1f/d2a/f32 [0,4194304] 0
2026-03-10T07:50:48.021 INFO:tasks.workunit.client.0.vm05.stdout:3/304: truncate d8/d1f/d2a/d34/f46 544444 0
2026-03-10T07:50:48.025 INFO:tasks.workunit.client.0.vm05.stdout:1/314: dread da/d26/f2d [0,4194304] 0
2026-03-10T07:50:48.038 INFO:tasks.workunit.client.0.vm05.stdout:7/320: write d1/f21 [3097493,23369] 0
2026-03-10T07:50:48.042 INFO:tasks.workunit.client.0.vm05.stdout:9/299: truncate d8/d35/d1c/d26/d28/f43 1139969 0
2026-03-10T07:50:48.056 INFO:tasks.workunit.client.0.vm05.stdout:6/274: getdents d0/d11/d4f/d56 0
2026-03-10T07:50:48.056 INFO:tasks.workunit.client.0.vm05.stdout:3/305: mkdir d8/d1c/d64 0
2026-03-10T07:50:48.056 INFO:tasks.workunit.client.0.vm05.stdout:1/315: write da/f41 [851225,114732] 0
2026-03-10T07:50:48.056 INFO:tasks.workunit.client.0.vm05.stdout:0/272: write d8/dd/d10/d26/d2a/f2b [8986646,31300] 0
2026-03-10T07:50:48.060 INFO:tasks.workunit.client.0.vm05.stdout:9/300: creat d8/d35/d22/f6a x:0 0 0
2026-03-10T07:50:48.065 INFO:tasks.workunit.client.0.vm05.stdout:4/314: rename d0/d6/d9/d12/d45/d55/d44/l50 to d0/d6/d9/d12/d65/l6c 0
2026-03-10T07:50:48.066 INFO:tasks.workunit.client.0.vm05.stdout:4/315: fdatasync d0/d6/d37/f46 0
2026-03-10T07:50:48.069 INFO:tasks.workunit.client.0.vm05.stdout:4/316: dwrite d0/d20/f2a [0,4194304] 0
2026-03-10T07:50:48.072 INFO:tasks.workunit.client.0.vm05.stdout:4/317: dread - d0/f41 zero size
2026-03-10T07:50:48.078 INFO:tasks.workunit.client.0.vm05.stdout:4/318: dread d0/d28/f33 [0,4194304] 0
2026-03-10T07:50:48.080 INFO:tasks.workunit.client.0.vm05.stdout:5/309: dwrite d2/d20/f2a [4194304,4194304] 0
2026-03-10T07:50:48.080 INFO:tasks.workunit.client.0.vm05.stdout:4/319: write d0/d20/f2a [607573,123128] 0
2026-03-10T07:50:48.080 INFO:tasks.workunit.client.0.vm05.stdout:4/320: write d0/d20/d26/f40 [3911890,30924] 0
2026-03-10T07:50:48.093 INFO:tasks.workunit.client.0.vm05.stdout:1/316: dwrite da/dd/d12/d34/f4f [0,4194304] 0
2026-03-10T07:50:48.105 INFO:tasks.workunit.client.0.vm05.stdout:4/321: dread d0/d6/d9/d12/d45/d55/f56 [0,4194304] 0
2026-03-10T07:50:48.108 INFO:tasks.workunit.client.0.vm05.stdout:2/396: rename d0/d47/f76 to d0/d8/d43/df/d53/f7c 0
2026-03-10T07:50:48.112 INFO:tasks.workunit.client.0.vm05.stdout:4/322: chown d0/d6/f39 11 1
2026-03-10T07:50:48.115 INFO:tasks.workunit.client.0.vm05.stdout:3/306: sync
2026-03-10T07:50:48.127 INFO:tasks.workunit.client.0.vm05.stdout:9/301: sync
2026-03-10T07:50:48.128
INFO:tasks.workunit.client.0.vm05.stdout:3/307: readlink d8/d16/d19/l62 0
2026-03-10T07:50:48.128 INFO:tasks.workunit.client.0.vm05.stdout:8/277: rename d1/dd/d18/f1f to d1/d45/f53 0
2026-03-10T07:50:48.128 INFO:tasks.workunit.client.0.vm05.stdout:2/397: readlink d0/d8/d43/l5b 0
2026-03-10T07:50:48.128 INFO:tasks.workunit.client.0.vm05.stdout:5/310: mknod d2/d20/d4c/d64/c6c 0
2026-03-10T07:50:48.128 INFO:tasks.workunit.client.0.vm05.stdout:4/323: mknod d0/d6/d9/d12/d45/d55/d4e/c6d 0
2026-03-10T07:50:48.128 INFO:tasks.workunit.client.0.vm05.stdout:3/308: mknod d8/d22/d60/d58/c65 0
2026-03-10T07:50:48.128 INFO:tasks.workunit.client.0.vm05.stdout:4/324: chown d0/d6/d9/d12/d45/d55/l62 5788 1
2026-03-10T07:50:48.128 INFO:tasks.workunit.client.0.vm05.stdout:4/325: dread - d0/d20/f63 zero size
2026-03-10T07:50:48.128 INFO:tasks.workunit.client.0.vm05.stdout:8/278: dread d1/d23/f31 [0,4194304] 0
2026-03-10T07:50:48.128 INFO:tasks.workunit.client.0.vm05.stdout:8/279: write d1/dd/d18/f47 [3753779,101640] 0
2026-03-10T07:50:48.128 INFO:tasks.workunit.client.0.vm05.stdout:9/302: dwrite d8/d35/f51 [0,4194304] 0
2026-03-10T07:50:48.130 INFO:tasks.workunit.client.0.vm05.stdout:4/326: dread d0/d6/d9/d12/f36 [0,4194304] 0
2026-03-10T07:50:48.133 INFO:tasks.workunit.client.0.vm05.stdout:1/317: rename da/f41 to da/f5a 0
2026-03-10T07:50:48.137 INFO:tasks.workunit.client.0.vm05.stdout:7/321: rename d1/d3c/d4b/l55 to d1/d3c/l62 0
2026-03-10T07:50:48.138 INFO:tasks.workunit.client.0.vm05.stdout:4/327: mkdir d0/d6/d9/d5a/d6e 0
2026-03-10T07:50:48.139 INFO:tasks.workunit.client.0.vm05.stdout:1/318: fdatasync da/dd/d12/d19/f3b 0
2026-03-10T07:50:48.140 INFO:tasks.workunit.client.0.vm05.stdout:1/319: dread - da/dd/d27/f4d zero size
2026-03-10T07:50:48.142 INFO:tasks.workunit.client.0.vm05.stdout:3/309: link d8/ff d8/d1f/d2a/f66 0
2026-03-10T07:50:48.146 INFO:tasks.workunit.client.0.vm05.stdout:0/273: rename f6 to d8/dd/d34/f5b 0
2026-03-10T07:50:48.147
INFO:tasks.workunit.client.0.vm05.stdout:9/303: mkdir d8/d35/d6b 0
2026-03-10T07:50:48.151 INFO:tasks.workunit.client.0.vm05.stdout:8/280: dread d1/dd/d18/d20/d2a/f3a [0,4194304] 0
2026-03-10T07:50:48.153 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:50:48 vm05.local ceph-mon[50387]: pgmap v13: 65 pgs: 65 active+clean; 579 MiB data, 2.3 GiB used, 118 GiB / 120 GiB avail; 2.4 MiB/s rd, 60 MiB/s wr, 173 op/s
2026-03-10T07:50:48.156 INFO:tasks.workunit.client.0.vm05.stdout:3/310: creat d8/d16/f67 x:0 0 0
2026-03-10T07:50:48.158 INFO:tasks.workunit.client.0.vm05.stdout:3/311: dread d8/d1c/f56 [0,4194304] 0
2026-03-10T07:50:48.161 INFO:tasks.workunit.client.0.vm05.stdout:6/275: dwrite d0/d11/d2e/f30 [0,4194304] 0
2026-03-10T07:50:48.163 INFO:tasks.workunit.client.0.vm05.stdout:4/328: rmdir d0/d6/d9/d12/d45/d55 39
2026-03-10T07:50:48.163 INFO:tasks.workunit.client.0.vm05.stdout:2/398: write d0/d8/d43/f1f [1807814,29224] 0
2026-03-10T07:50:48.169 INFO:tasks.workunit.client.0.vm05.stdout:1/320: getdents da/d59 0
2026-03-10T07:50:48.171 INFO:tasks.workunit.client.0.vm05.stdout:5/311: dwrite d2/d5/f18 [0,4194304] 0
2026-03-10T07:50:48.172 INFO:tasks.workunit.client.0.vm05.stdout:6/276: dwrite d0/d6/f45 [0,4194304] 0
2026-03-10T07:50:48.176 INFO:tasks.workunit.client.0.vm05.stdout:6/277: readlink d0/d35/l4d 0
2026-03-10T07:50:48.176 INFO:tasks.workunit.client.0.vm05.stdout:0/274: mknod d8/dd/d37/d56/c5c 0
2026-03-10T07:50:48.182 INFO:tasks.workunit.client.0.vm05.stdout:7/322: dread d1/d6/f32 [0,4194304] 0
2026-03-10T07:50:48.201 INFO:tasks.workunit.client.0.vm05.stdout:7/323: dwrite d1/f49 [0,4194304] 0
2026-03-10T07:50:48.201 INFO:tasks.workunit.client.0.vm05.stdout:2/399: rename d0/d2a/d2f to d0/d8/d3d/d7d 0
2026-03-10T07:50:48.201 INFO:tasks.workunit.client.0.vm05.stdout:9/304: creat d8/d35/d22/d33/d62/f6c x:0 0 0
2026-03-10T07:50:48.201 INFO:tasks.workunit.client.0.vm05.stdout:4/329: mkdir d0/d6/d6f 0
2026-03-10T07:50:48.201
INFO:tasks.workunit.client.0.vm05.stdout:6/278: symlink d0/d11/d22/l5a 0
2026-03-10T07:50:48.201 INFO:tasks.workunit.client.0.vm05.stdout:5/312: read d2/d12/d2d/f39 [1827595,100484] 0
2026-03-10T07:50:48.201 INFO:tasks.workunit.client.0.vm05.stdout:5/313: stat d2/f15 0
2026-03-10T07:50:48.201 INFO:tasks.workunit.client.0.vm05.stdout:1/321: mknod da/c5b 0
2026-03-10T07:50:48.201 INFO:tasks.workunit.client.0.vm05.stdout:8/281: creat d1/dd/d18/d20/d2a/f54 x:0 0 0
2026-03-10T07:50:48.201 INFO:tasks.workunit.client.0.vm05.stdout:1/322: dwrite da/dd/d12/d19/f4e [0,4194304] 0
2026-03-10T07:50:48.208 INFO:tasks.workunit.client.0.vm05.stdout:7/324: dread d1/f16 [0,4194304] 0
2026-03-10T07:50:48.214 INFO:tasks.workunit.client.0.vm05.stdout:9/305: mkdir d8/d35/d22/d33/d62/d6d 0
2026-03-10T07:50:48.215 INFO:tasks.workunit.client.0.vm05.stdout:6/279: rename d0/d11/f2a to d0/d35/d36/f5b 0
2026-03-10T07:50:48.218 INFO:tasks.workunit.client.0.vm05.stdout:0/275: truncate d8/dd/f22 917783 0
2026-03-10T07:50:48.220 INFO:tasks.workunit.client.0.vm05.stdout:3/312: getdents d8/d1f 0
2026-03-10T07:50:48.224 INFO:tasks.workunit.client.0.vm05.stdout:1/323: dwrite da/f43 [0,4194304] 0
2026-03-10T07:50:48.226 INFO:tasks.workunit.client.0.vm05.stdout:1/324: truncate da/d26/d2b/f50 4270982 0
2026-03-10T07:50:48.232 INFO:tasks.workunit.client.0.vm05.stdout:9/306: mknod d8/d35/d22/c6e 0
2026-03-10T07:50:48.233 INFO:tasks.workunit.client.0.vm05.stdout:9/307: chown d8/d35/d1c/d26/c2f 290 1
2026-03-10T07:50:48.234 INFO:tasks.workunit.client.0.vm05.stdout:1/325: dwrite da/dd/d27/f36 [0,4194304] 0
2026-03-10T07:50:48.245 INFO:tasks.workunit.client.0.vm05.stdout:4/330: mknod d0/d6/d9/d5a/d6e/c70 0
2026-03-10T07:50:48.252 INFO:tasks.workunit.client.0.vm05.stdout:2/400: dread d0/d8/f42 [0,4194304] 0
2026-03-10T07:50:48.255 INFO:tasks.workunit.client.0.vm05.stdout:7/325: rename d1/f3d to d1/d3c/f63 0
2026-03-10T07:50:48.259 INFO:tasks.workunit.client.0.vm05.stdout:7/326: truncate d1/d6/d47/f52 994786 0
2026-03-10T07:50:48.259 INFO:tasks.workunit.client.0.vm05.stdout:7/327: write d1/f11 [4497293,114804] 0 2026-03-10T07:50:48.259 INFO:tasks.workunit.client.0.vm05.stdout:0/276: truncate d8/fb 4022990 0 2026-03-10T07:50:48.259 INFO:tasks.workunit.client.0.vm05.stdout:0/277: chown d8/f4e 922 1 2026-03-10T07:50:48.270 INFO:tasks.workunit.client.0.vm05.stdout:7/328: dread d1/f21 [0,4194304] 0 2026-03-10T07:50:48.272 INFO:tasks.workunit.client.0.vm05.stdout:7/329: dread d1/d6/f4e [0,4194304] 0 2026-03-10T07:50:48.273 INFO:tasks.workunit.client.0.vm05.stdout:7/330: read d1/d6/f41 [3203645,2576] 0 2026-03-10T07:50:48.274 INFO:tasks.workunit.client.0.vm05.stdout:7/331: dread d1/f16 [0,4194304] 0 2026-03-10T07:50:48.276 INFO:tasks.workunit.client.0.vm05.stdout:7/332: dread d1/d3c/f63 [0,4194304] 0 2026-03-10T07:50:48.277 INFO:tasks.workunit.client.0.vm05.stdout:8/282: link d1/dd/d18/f47 d1/d45/f55 0 2026-03-10T07:50:48.280 INFO:tasks.workunit.client.0.vm05.stdout:6/280: creat d0/d11/d57/f5c x:0 0 0 2026-03-10T07:50:48.287 INFO:tasks.workunit.client.0.vm05.stdout:0/278: mknod d8/dd/d10/d26/d48/c5d 0 2026-03-10T07:50:48.296 INFO:tasks.workunit.client.0.vm05.stdout:4/331: symlink d0/d6/d9/d12/d4f/l71 0 2026-03-10T07:50:48.297 INFO:tasks.workunit.client.0.vm05.stdout:8/283: dread d1/d23/f31 [0,4194304] 0 2026-03-10T07:50:48.299 INFO:tasks.workunit.client.0.vm05.stdout:6/281: dwrite d0/d11/f13 [0,4194304] 0 2026-03-10T07:50:48.300 INFO:tasks.workunit.client.0.vm05.stdout:6/282: readlink d0/d11/d2e/l46 0 2026-03-10T07:50:48.302 INFO:tasks.workunit.client.0.vm05.stdout:2/401: mkdir d0/d7e 0 2026-03-10T07:50:48.302 INFO:tasks.workunit.client.0.vm05.stdout:5/314: getdents d2/d12 0 2026-03-10T07:50:48.315 INFO:tasks.workunit.client.0.vm05.stdout:9/308: dread d8/d35/f1d [0,4194304] 0 2026-03-10T07:50:48.318 INFO:tasks.workunit.client.0.vm05.stdout:1/326: creat da/f5c x:0 0 0 2026-03-10T07:50:48.318 INFO:tasks.workunit.client.0.vm05.stdout:1/327: readlink da/dd/d12/d19/l29 0 
2026-03-10T07:50:48.318 INFO:tasks.workunit.client.0.vm05.stdout:1/328: chown da/d26/d2b/f50 49962 1 2026-03-10T07:50:48.320 INFO:tasks.workunit.client.0.vm05.stdout:4/332: write d0/d6/d9/d12/d45/d55/f19 [14056215,50899] 0 2026-03-10T07:50:48.326 INFO:tasks.workunit.client.0.vm05.stdout:8/284: dread d1/fe [0,4194304] 0 2026-03-10T07:50:48.327 INFO:tasks.workunit.client.0.vm05.stdout:6/283: rename d0/d11/l14 to d0/d6/l5d 0 2026-03-10T07:50:48.328 INFO:tasks.workunit.client.0.vm05.stdout:6/284: write d0/d6/d3b/f55 [702680,81273] 0 2026-03-10T07:50:48.331 INFO:tasks.workunit.client.0.vm05.stdout:5/315: mknod d2/d20/d33/c6d 0 2026-03-10T07:50:48.334 INFO:tasks.workunit.client.0.vm05.stdout:9/309: fsync d8/d35/d1c/d2c/f2e 0 2026-03-10T07:50:48.334 INFO:tasks.workunit.client.0.vm05.stdout:9/310: truncate d8/d35/d1c/d36/f50 16857 0 2026-03-10T07:50:48.340 INFO:tasks.workunit.client.0.vm05.stdout:1/329: dread da/fc [0,4194304] 0 2026-03-10T07:50:48.343 INFO:tasks.workunit.client.0.vm05.stdout:1/330: dwrite da/d26/d2b/f45 [4194304,4194304] 0 2026-03-10T07:50:48.348 INFO:tasks.workunit.client.0.vm05.stdout:3/313: dwrite d8/d1c/f63 [0,4194304] 0 2026-03-10T07:50:48.353 INFO:tasks.workunit.client.0.vm05.stdout:7/333: dwrite d1/d34/f3e [0,4194304] 0 2026-03-10T07:50:48.363 INFO:tasks.workunit.client.0.vm05.stdout:6/285: stat d0/d11/d22/l3d 0 2026-03-10T07:50:48.363 INFO:tasks.workunit.client.0.vm05.stdout:6/286: chown d0/d11/d2e 37 1 2026-03-10T07:50:48.367 INFO:tasks.workunit.client.0.vm05.stdout:0/279: write d8/f33 [1798702,129238] 0 2026-03-10T07:50:48.371 INFO:tasks.workunit.client.0.vm05.stdout:5/316: creat d2/d20/d5b/f6e x:0 0 0 2026-03-10T07:50:48.371 INFO:tasks.workunit.client.0.vm05.stdout:8/285: creat d1/dd/d18/d20/d2a/d34/d49/f56 x:0 0 0 2026-03-10T07:50:48.374 INFO:tasks.workunit.client.0.vm05.stdout:7/334: unlink d1/d3c/l44 0 2026-03-10T07:50:48.376 INFO:tasks.workunit.client.0.vm05.stdout:7/335: dread d1/d34/f3e [0,4194304] 0 2026-03-10T07:50:48.379 
INFO:tasks.workunit.client.0.vm05.stdout:9/311: mknod d8/d35/d3c/d57/c6f 0 2026-03-10T07:50:48.381 INFO:tasks.workunit.client.0.vm05.stdout:9/312: fsync d8/d35/d1c/d20/f32 0 2026-03-10T07:50:48.381 INFO:tasks.workunit.client.0.vm05.stdout:2/402: rename d0/d8/c2c to d0/c7f 0 2026-03-10T07:50:48.381 INFO:tasks.workunit.client.0.vm05.stdout:7/336: creat d1/d34/d59/f64 x:0 0 0 2026-03-10T07:50:48.383 INFO:tasks.workunit.client.0.vm05.stdout:9/313: write d8/f15 [2493441,59802] 0 2026-03-10T07:50:48.385 INFO:tasks.workunit.client.0.vm05.stdout:8/286: unlink d1/dd/d18/c1a 0 2026-03-10T07:50:48.387 INFO:tasks.workunit.client.0.vm05.stdout:7/337: creat d1/d6/d47/f65 x:0 0 0 2026-03-10T07:50:48.389 INFO:tasks.workunit.client.0.vm05.stdout:8/287: dwrite d1/dd/d18/d20/f43 [0,4194304] 0 2026-03-10T07:50:48.394 INFO:tasks.workunit.client.0.vm05.stdout:8/288: write d1/dd/d18/d20/d2a/d34/f39 [564072,75875] 0 2026-03-10T07:50:48.402 INFO:tasks.workunit.client.0.vm05.stdout:0/280: dread d8/fc [4194304,4194304] 0 2026-03-10T07:50:48.404 INFO:tasks.workunit.client.0.vm05.stdout:1/331: rename da/dd/d12/d34/f4f to da/f5d 0 2026-03-10T07:50:48.405 INFO:tasks.workunit.client.0.vm05.stdout:3/314: rename d8 to d8/d16/d19/d68 22 2026-03-10T07:50:48.405 INFO:tasks.workunit.client.0.vm05.stdout:3/315: chown d8/d1f/d2a/d4b/f57 662 1 2026-03-10T07:50:48.406 INFO:tasks.workunit.client.0.vm05.stdout:9/314: getdents d8/d35/d22/d33/d47 0 2026-03-10T07:50:48.416 INFO:tasks.workunit.client.0.vm05.stdout:8/289: write d1/fa [10087808,85116] 0 2026-03-10T07:50:48.416 INFO:tasks.workunit.client.0.vm05.stdout:2/403: link d0/d8/d43/d38/l6a d0/d8/l80 0 2026-03-10T07:50:48.416 INFO:tasks.workunit.client.0.vm05.stdout:2/404: truncate d0/d8/d43/df/f21 8998218 0 2026-03-10T07:50:48.417 INFO:tasks.workunit.client.0.vm05.stdout:6/287: link d0/cb d0/c5e 0 2026-03-10T07:50:48.418 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:50:48 vm08.local ceph-mon[59917]: pgmap v13: 65 pgs: 65 active+clean; 579 MiB data, 
2.3 GiB used, 118 GiB / 120 GiB avail; 2.4 MiB/s rd, 60 MiB/s wr, 173 op/s 2026-03-10T07:50:48.422 INFO:tasks.workunit.client.0.vm05.stdout:3/316: mkdir d8/d1c/d48/d69 0 2026-03-10T07:50:48.426 INFO:tasks.workunit.client.0.vm05.stdout:9/315: dread d8/d35/f25 [0,4194304] 0 2026-03-10T07:50:48.426 INFO:tasks.workunit.client.0.vm05.stdout:4/333: dwrite d0/d6/d9/d12/d45/d55/f56 [4194304,4194304] 0 2026-03-10T07:50:48.428 INFO:tasks.workunit.client.0.vm05.stdout:4/334: chown d0/d6/d37/f3d 100640705 1 2026-03-10T07:50:48.428 INFO:tasks.workunit.client.0.vm05.stdout:4/335: dread - d0/d6/d9/d5a/f58 zero size 2026-03-10T07:50:48.430 INFO:tasks.workunit.client.0.vm05.stdout:2/405: unlink d0/d47/d49/f54 0 2026-03-10T07:50:48.431 INFO:tasks.workunit.client.0.vm05.stdout:2/406: write d0/d8/d43/f1f [2599113,78400] 0 2026-03-10T07:50:48.438 INFO:tasks.workunit.client.0.vm05.stdout:6/288: dread d0/f23 [4194304,4194304] 0 2026-03-10T07:50:48.444 INFO:tasks.workunit.client.0.vm05.stdout:7/338: rename d1/d6/c19 to d1/d34/c66 0 2026-03-10T07:50:48.449 INFO:tasks.workunit.client.0.vm05.stdout:9/316: mkdir d8/d35/d22/d33/d70 0 2026-03-10T07:50:48.454 INFO:tasks.workunit.client.0.vm05.stdout:1/332: getdents da/dd 0 2026-03-10T07:50:48.454 INFO:tasks.workunit.client.0.vm05.stdout:2/407: mkdir d0/d47/d49/d81 0 2026-03-10T07:50:48.456 INFO:tasks.workunit.client.0.vm05.stdout:6/289: dread d0/d11/d22/f2b [0,4194304] 0 2026-03-10T07:50:48.457 INFO:tasks.workunit.client.0.vm05.stdout:6/290: stat d0/fa 0 2026-03-10T07:50:48.460 INFO:tasks.workunit.client.0.vm05.stdout:9/317: mkdir d8/d35/d38/d71 0 2026-03-10T07:50:48.461 INFO:tasks.workunit.client.0.vm05.stdout:9/318: truncate d8/d35/d1c/d20/d54/f67 532055 0 2026-03-10T07:50:48.461 INFO:tasks.workunit.client.0.vm05.stdout:9/319: dread - d8/d35/d1c/f3b zero size 2026-03-10T07:50:48.462 INFO:tasks.workunit.client.0.vm05.stdout:9/320: truncate d8/d35/d1c/d36/f50 461659 0 2026-03-10T07:50:48.464 INFO:tasks.workunit.client.0.vm05.stdout:4/336: creat 
d0/d6/d60/f72 x:0 0 0 2026-03-10T07:50:48.466 INFO:tasks.workunit.client.0.vm05.stdout:1/333: unlink da/dd/d12/d19/f1a 0 2026-03-10T07:50:48.467 INFO:tasks.workunit.client.0.vm05.stdout:2/408: fsync d0/d8/f42 0 2026-03-10T07:50:48.467 INFO:tasks.workunit.client.0.vm05.stdout:1/334: chown da/dd/d12/d19 0 1 2026-03-10T07:50:48.468 INFO:tasks.workunit.client.0.vm05.stdout:1/335: write da/d26/d2b/f45 [4834269,45213] 0 2026-03-10T07:50:48.473 INFO:tasks.workunit.client.0.vm05.stdout:6/291: creat d0/d11/d57/f5f x:0 0 0 2026-03-10T07:50:48.474 INFO:tasks.workunit.client.0.vm05.stdout:6/292: truncate d0/d11/f1c 587073 0 2026-03-10T07:50:48.475 INFO:tasks.workunit.client.0.vm05.stdout:4/337: creat d0/d20/d26/f73 x:0 0 0 2026-03-10T07:50:48.476 INFO:tasks.workunit.client.0.vm05.stdout:2/409: readlink d0/d8/d43/d38/l6a 0 2026-03-10T07:50:48.476 INFO:tasks.workunit.client.0.vm05.stdout:2/410: dread - d0/d8/d43/f5e zero size 2026-03-10T07:50:48.480 INFO:tasks.workunit.client.0.vm05.stdout:1/336: chown da/l51 116136 1 2026-03-10T07:50:48.484 INFO:tasks.workunit.client.0.vm05.stdout:6/293: mkdir d0/d11/d57/d60 0 2026-03-10T07:50:48.485 INFO:tasks.workunit.client.0.vm05.stdout:3/317: getdents d8 0 2026-03-10T07:50:48.487 INFO:tasks.workunit.client.0.vm05.stdout:3/318: dread d8/d1c/f63 [0,4194304] 0 2026-03-10T07:50:48.489 INFO:tasks.workunit.client.0.vm05.stdout:3/319: dread d8/d1c/f63 [0,4194304] 0 2026-03-10T07:50:48.490 INFO:tasks.workunit.client.0.vm05.stdout:5/317: write d2/f9 [3748304,78723] 0 2026-03-10T07:50:48.491 INFO:tasks.workunit.client.0.vm05.stdout:5/318: stat d2/d5/f23 0 2026-03-10T07:50:48.491 INFO:tasks.workunit.client.0.vm05.stdout:5/319: stat d2/d12/d2d/l48 0 2026-03-10T07:50:48.492 INFO:tasks.workunit.client.0.vm05.stdout:5/320: chown d2/d20/d33/c44 80043876 1 2026-03-10T07:50:48.495 INFO:tasks.workunit.client.0.vm05.stdout:4/338: creat d0/d6/d9/d12/d45/d55/d44/f74 x:0 0 0 2026-03-10T07:50:48.499 INFO:tasks.workunit.client.0.vm05.stdout:1/337: mknod 
da/d26/d2b/c5e 0 2026-03-10T07:50:48.500 INFO:tasks.workunit.client.0.vm05.stdout:6/294: mknod d0/d35/d36/c61 0 2026-03-10T07:50:48.503 INFO:tasks.workunit.client.0.vm05.stdout:3/320: mknod d8/d22/c6a 0 2026-03-10T07:50:48.505 INFO:tasks.workunit.client.0.vm05.stdout:5/321: chown d2/d12/d2d/d54/c56 1370153611 1 2026-03-10T07:50:48.506 INFO:tasks.workunit.client.0.vm05.stdout:5/322: write d2/f42 [257600,57539] 0 2026-03-10T07:50:48.513 INFO:tasks.workunit.client.0.vm05.stdout:1/338: dwrite da/f5a [0,4194304] 0 2026-03-10T07:50:48.515 INFO:tasks.workunit.client.0.vm05.stdout:0/281: write d8/dd/f3c [2805016,100219] 0 2026-03-10T07:50:48.523 INFO:tasks.workunit.client.0.vm05.stdout:6/295: rmdir d0/d11/d4f 39 2026-03-10T07:50:48.523 INFO:tasks.workunit.client.0.vm05.stdout:6/296: write d0/f3a [3410131,80596] 0 2026-03-10T07:50:48.524 INFO:tasks.workunit.client.0.vm05.stdout:6/297: write d0/d6/f1a [4437638,86567] 0 2026-03-10T07:50:48.527 INFO:tasks.workunit.client.0.vm05.stdout:8/290: write d1/d45/f55 [3254342,75346] 0 2026-03-10T07:50:48.530 INFO:tasks.workunit.client.0.vm05.stdout:3/321: mkdir d8/d16/d19/d6b 0 2026-03-10T07:50:48.534 INFO:tasks.workunit.client.0.vm05.stdout:5/323: symlink d2/d12/l6f 0 2026-03-10T07:50:48.543 INFO:tasks.workunit.client.0.vm05.stdout:0/282: dwrite d8/fc [0,4194304] 0 2026-03-10T07:50:48.545 INFO:tasks.workunit.client.0.vm05.stdout:6/298: mknod d0/d35/c62 0 2026-03-10T07:50:48.546 INFO:tasks.workunit.client.0.vm05.stdout:5/324: dread d2/d12/d2d/f60 [0,4194304] 0 2026-03-10T07:50:48.547 INFO:tasks.workunit.client.0.vm05.stdout:6/299: dread d0/d11/d22/f2b [0,4194304] 0 2026-03-10T07:50:48.560 INFO:tasks.workunit.client.0.vm05.stdout:8/291: creat d1/dd/d18/d20/d2a/d48/f57 x:0 0 0 2026-03-10T07:50:48.563 INFO:tasks.workunit.client.0.vm05.stdout:9/321: link d8/d35/l42 d8/d35/d1c/d26/l72 0 2026-03-10T07:50:48.575 INFO:tasks.workunit.client.0.vm05.stdout:7/339: write d1/d6/f22 [724450,94073] 0 2026-03-10T07:50:48.575 
INFO:tasks.workunit.client.0.vm05.stdout:0/283: mkdir d8/dd/d10/d26/d3a/d5e 0 2026-03-10T07:50:48.575 INFO:tasks.workunit.client.0.vm05.stdout:5/325: unlink d2/d12/d2d/f39 0 2026-03-10T07:50:48.575 INFO:tasks.workunit.client.0.vm05.stdout:5/326: stat d2/d20/d4c/d64 0 2026-03-10T07:50:48.575 INFO:tasks.workunit.client.0.vm05.stdout:4/339: getdents d0/d20/d26 0 2026-03-10T07:50:48.575 INFO:tasks.workunit.client.0.vm05.stdout:1/339: rmdir da/d59 0 2026-03-10T07:50:48.575 INFO:tasks.workunit.client.0.vm05.stdout:0/284: unlink d8/f33 0 2026-03-10T07:50:48.577 INFO:tasks.workunit.client.0.vm05.stdout:5/327: readlink d2/d12/l1b 0 2026-03-10T07:50:48.581 INFO:tasks.workunit.client.0.vm05.stdout:2/411: dwrite d0/d8/fe [0,4194304] 0 2026-03-10T07:50:48.583 INFO:tasks.workunit.client.0.vm05.stdout:9/322: unlink d8/c10 0 2026-03-10T07:50:48.587 INFO:tasks.workunit.client.0.vm05.stdout:7/340: mknod d1/c67 0 2026-03-10T07:50:48.587 INFO:tasks.workunit.client.0.vm05.stdout:3/322: dread d8/f4d [0,4194304] 0 2026-03-10T07:50:48.588 INFO:tasks.workunit.client.0.vm05.stdout:7/341: stat d1/d34/d59 0 2026-03-10T07:50:48.588 INFO:tasks.workunit.client.0.vm05.stdout:3/323: readlink d8/d22/d60/l5e 0 2026-03-10T07:50:48.592 INFO:tasks.workunit.client.0.vm05.stdout:4/340: rename d0/f23 to d0/d6/d37/f75 0 2026-03-10T07:50:48.593 INFO:tasks.workunit.client.0.vm05.stdout:1/340: creat da/dd/d12/d34/f5f x:0 0 0 2026-03-10T07:50:48.600 INFO:tasks.workunit.client.0.vm05.stdout:7/342: dread d1/d6/f4e [0,4194304] 0 2026-03-10T07:50:48.617 INFO:tasks.workunit.client.0.vm05.stdout:9/323: dwrite d8/fa [4194304,4194304] 0 2026-03-10T07:50:48.617 INFO:tasks.workunit.client.0.vm05.stdout:5/328: rename d2/d12/d2d/l48 to d2/d5/l70 0 2026-03-10T07:50:48.617 INFO:tasks.workunit.client.0.vm05.stdout:0/285: mknod d8/dd/d37/d56/c5f 0 2026-03-10T07:50:48.617 INFO:tasks.workunit.client.0.vm05.stdout:5/329: dread d2/d20/d5b/f5f [0,4194304] 0 2026-03-10T07:50:48.617 INFO:tasks.workunit.client.0.vm05.stdout:3/324: 
link d8/f5c d8/d1f/f6c 0 2026-03-10T07:50:48.617 INFO:tasks.workunit.client.0.vm05.stdout:4/341: mknod d0/d6/d9/c76 0 2026-03-10T07:50:48.618 INFO:tasks.workunit.client.0.vm05.stdout:9/324: creat d8/d35/d22/d33/f73 x:0 0 0 2026-03-10T07:50:48.618 INFO:tasks.workunit.client.0.vm05.stdout:5/330: rmdir d2/d20/d33/d53 39 2026-03-10T07:50:48.627 INFO:tasks.workunit.client.0.vm05.stdout:3/325: symlink d8/d1f/l6d 0 2026-03-10T07:50:48.627 INFO:tasks.workunit.client.0.vm05.stdout:3/326: dread d8/d1c/f56 [4194304,4194304] 0 2026-03-10T07:50:48.627 INFO:tasks.workunit.client.0.vm05.stdout:9/325: stat d8/d35/d22/c30 0 2026-03-10T07:50:48.627 INFO:tasks.workunit.client.0.vm05.stdout:9/326: write d8/f14 [1301371,98168] 0 2026-03-10T07:50:48.627 INFO:tasks.workunit.client.0.vm05.stdout:9/327: fsync d8/d35/d1c/d36/f50 0 2026-03-10T07:50:48.627 INFO:tasks.workunit.client.0.vm05.stdout:9/328: write d8/f15 [1562705,87429] 0 2026-03-10T07:50:48.627 INFO:tasks.workunit.client.0.vm05.stdout:1/341: rmdir da/dd/d2a/d47 0 2026-03-10T07:50:48.628 INFO:tasks.workunit.client.0.vm05.stdout:6/300: sync 2026-03-10T07:50:48.628 INFO:tasks.workunit.client.0.vm05.stdout:6/301: chown d0/d11/d22/f4c 60694123 1 2026-03-10T07:50:48.631 INFO:tasks.workunit.client.0.vm05.stdout:4/342: mknod d0/d6/d6f/c77 0 2026-03-10T07:50:48.631 INFO:tasks.workunit.client.0.vm05.stdout:4/343: dread - d0/d6/d9/f67 zero size 2026-03-10T07:50:48.635 INFO:tasks.workunit.client.0.vm05.stdout:1/342: dread da/dd/d2a/f2f [0,4194304] 0 2026-03-10T07:50:48.650 INFO:tasks.workunit.client.0.vm05.stdout:1/343: fdatasync da/dd/d12/d19/f4e 0 2026-03-10T07:50:48.650 INFO:tasks.workunit.client.0.vm05.stdout:3/327: rename d8/d1f/d2a/d51 to d8/d22/d60/d6e 0 2026-03-10T07:50:48.650 INFO:tasks.workunit.client.0.vm05.stdout:1/344: dwrite da/dd/d12/f31 [0,4194304] 0 2026-03-10T07:50:48.650 INFO:tasks.workunit.client.0.vm05.stdout:6/302: creat d0/d11/d31/f63 x:0 0 0 2026-03-10T07:50:48.650 INFO:tasks.workunit.client.0.vm05.stdout:4/344: rmdir 
d0/d28 39 2026-03-10T07:50:48.650 INFO:tasks.workunit.client.0.vm05.stdout:1/345: dread da/dd/d12/d19/f4e [0,4194304] 0 2026-03-10T07:50:48.650 INFO:tasks.workunit.client.0.vm05.stdout:6/303: mknod d0/d11/d22/c64 0 2026-03-10T07:50:48.650 INFO:tasks.workunit.client.0.vm05.stdout:5/331: link d2/d5/f23 d2/d5/f71 0 2026-03-10T07:50:48.650 INFO:tasks.workunit.client.0.vm05.stdout:1/346: dread da/f5a [0,4194304] 0 2026-03-10T07:50:48.653 INFO:tasks.workunit.client.0.vm05.stdout:9/329: dread d8/f9 [0,4194304] 0 2026-03-10T07:50:48.653 INFO:tasks.workunit.client.0.vm05.stdout:9/330: chown d8/d35/d1c/d2c 1312 1 2026-03-10T07:50:48.656 INFO:tasks.workunit.client.0.vm05.stdout:4/345: truncate d0/d6/d9/f54 1131969 0 2026-03-10T07:50:48.656 INFO:tasks.workunit.client.0.vm05.stdout:1/347: dwrite da/dd/d12/f18 [0,4194304] 0 2026-03-10T07:50:48.669 INFO:tasks.workunit.client.0.vm05.stdout:1/348: mknod da/dd/d12/d34/c60 0 2026-03-10T07:50:48.669 INFO:tasks.workunit.client.0.vm05.stdout:5/332: rename d2/d12/d2d/d54 to d2/d20/d33/d72 0 2026-03-10T07:50:48.673 INFO:tasks.workunit.client.0.vm05.stdout:1/349: dwrite da/f5a [0,4194304] 0 2026-03-10T07:50:48.680 INFO:tasks.workunit.client.0.vm05.stdout:9/331: rename d8/d35/d22/l3a to d8/d35/d3c/d57/l74 0 2026-03-10T07:50:48.683 INFO:tasks.workunit.client.0.vm05.stdout:9/332: dwrite d8/f15 [0,4194304] 0 2026-03-10T07:50:48.689 INFO:tasks.workunit.client.0.vm05.stdout:9/333: fsync d8/d35/d1c/d2c/d63/f64 0 2026-03-10T07:50:48.690 INFO:tasks.workunit.client.0.vm05.stdout:4/346: rename d0/d28/f68 to d0/d6/f78 0 2026-03-10T07:50:48.691 INFO:tasks.workunit.client.0.vm05.stdout:8/292: rmdir d1/dd/d18 39 2026-03-10T07:50:48.692 INFO:tasks.workunit.client.0.vm05.stdout:5/333: rename d2/d5/f46 to d2/d4b/f73 0 2026-03-10T07:50:48.692 INFO:tasks.workunit.client.0.vm05.stdout:4/347: read d0/d20/d26/f40 [664027,69695] 0 2026-03-10T07:50:48.694 INFO:tasks.workunit.client.0.vm05.stdout:9/334: mkdir d8/d35/d1c/d75 0 2026-03-10T07:50:48.695 
INFO:tasks.workunit.client.0.vm05.stdout:8/293: stat d1/c4 0 2026-03-10T07:50:48.699 INFO:tasks.workunit.client.0.vm05.stdout:1/350: rename c9 to da/dd/c61 0 2026-03-10T07:50:48.700 INFO:tasks.workunit.client.0.vm05.stdout:4/348: mknod d0/d6/d9/d5a/c79 0 2026-03-10T07:50:48.704 INFO:tasks.workunit.client.0.vm05.stdout:5/334: mknod d2/d20/d33/d53/c74 0 2026-03-10T07:50:48.705 INFO:tasks.workunit.client.0.vm05.stdout:1/351: creat da/dd/d12/d19/d20/f62 x:0 0 0 2026-03-10T07:50:48.707 INFO:tasks.workunit.client.0.vm05.stdout:9/335: unlink d8/d35/f23 0 2026-03-10T07:50:48.708 INFO:tasks.workunit.client.0.vm05.stdout:5/335: creat d2/d5/f75 x:0 0 0 2026-03-10T07:50:48.710 INFO:tasks.workunit.client.0.vm05.stdout:9/336: dwrite d8/d35/d22/d33/d47/f5a [0,4194304] 0 2026-03-10T07:50:48.710 INFO:tasks.workunit.client.0.vm05.stdout:1/352: creat da/dd/d2a/f63 x:0 0 0 2026-03-10T07:50:48.712 INFO:tasks.workunit.client.0.vm05.stdout:1/353: dread - da/dd/d12/d19/d20/f62 zero size 2026-03-10T07:50:48.714 INFO:tasks.workunit.client.0.vm05.stdout:5/336: rename d2/d5/c27 to d2/d20/d33/d72/c76 0 2026-03-10T07:50:48.719 INFO:tasks.workunit.client.0.vm05.stdout:1/354: getdents da/dd/d12 0 2026-03-10T07:50:48.730 INFO:tasks.workunit.client.0.vm05.stdout:1/355: getdents da/dd/d2a/d55 0 2026-03-10T07:50:48.730 INFO:tasks.workunit.client.0.vm05.stdout:1/356: mkdir da/dd/d2a/d55/d64 0 2026-03-10T07:50:48.730 INFO:tasks.workunit.client.0.vm05.stdout:1/357: creat da/d26/d2b/f65 x:0 0 0 2026-03-10T07:50:48.730 INFO:tasks.workunit.client.0.vm05.stdout:1/358: write da/f43 [4655941,128005] 0 2026-03-10T07:50:48.730 INFO:tasks.workunit.client.0.vm05.stdout:1/359: truncate da/dd/d12/d34/f38 47813 0 2026-03-10T07:50:48.730 INFO:tasks.workunit.client.0.vm05.stdout:1/360: chown da/dd/d12/d34/f38 28 1 2026-03-10T07:50:48.730 INFO:tasks.workunit.client.0.vm05.stdout:1/361: readlink da/l23 0 2026-03-10T07:50:48.731 INFO:tasks.workunit.client.0.vm05.stdout:1/362: stat da/dd/d42 0 2026-03-10T07:50:48.744 
INFO:tasks.workunit.client.0.vm05.stdout:4/349: sync 2026-03-10T07:50:48.747 INFO:tasks.workunit.client.0.vm05.stdout:4/350: symlink d0/l7a 0 2026-03-10T07:50:48.783 INFO:tasks.workunit.client.0.vm05.stdout:2/412: write d0/d8/d43/df/d53/f6d [398532,50158] 0 2026-03-10T07:50:48.789 INFO:tasks.workunit.client.0.vm05.stdout:2/413: link d0/d8/d66/f68 d0/d8/d43/df/d53/f82 0 2026-03-10T07:50:48.791 INFO:tasks.workunit.client.0.vm05.stdout:2/414: rmdir d0/d8/d43/df 39 2026-03-10T07:50:48.794 INFO:tasks.workunit.client.0.vm05.stdout:7/343: dwrite d1/f16 [0,4194304] 0 2026-03-10T07:50:48.800 INFO:tasks.workunit.client.0.vm05.stdout:2/415: dwrite d0/d8/d43/df/f58 [0,4194304] 0 2026-03-10T07:50:48.817 INFO:tasks.workunit.client.0.vm05.stdout:2/416: symlink d0/d8/d3d/d7d/l83 0 2026-03-10T07:50:48.818 INFO:tasks.workunit.client.0.vm05.stdout:2/417: write d0/d8/d43/f30 [1312822,73652] 0 2026-03-10T07:50:48.821 INFO:tasks.workunit.client.0.vm05.stdout:2/418: dwrite d0/d8/d43/f5e [0,4194304] 0 2026-03-10T07:50:48.823 INFO:tasks.workunit.client.0.vm05.stdout:2/419: truncate d0/d8/d43/df/d53/f69 4602628 0 2026-03-10T07:50:48.829 INFO:tasks.workunit.client.0.vm05.stdout:0/286: dwrite d8/fb [0,4194304] 0 2026-03-10T07:50:48.835 INFO:tasks.workunit.client.0.vm05.stdout:3/328: write f4 [3826352,109087] 0 2026-03-10T07:50:48.835 INFO:tasks.workunit.client.0.vm05.stdout:3/329: read d8/d1f/d2a/f32 [1016764,113940] 0 2026-03-10T07:50:48.835 INFO:tasks.workunit.client.0.vm05.stdout:6/304: truncate d0/d11/f13 1798693 0 2026-03-10T07:50:48.838 INFO:tasks.workunit.client.0.vm05.stdout:6/305: dwrite d0/d11/d57/f5f [0,4194304] 0 2026-03-10T07:50:48.854 INFO:tasks.workunit.client.0.vm05.stdout:7/344: link d1/l43 d1/d34/d59/l68 0 2026-03-10T07:50:48.857 INFO:tasks.workunit.client.0.vm05.stdout:0/287: chown d8/dd/d10/d26/d48/c4c 79 1 2026-03-10T07:50:48.860 INFO:tasks.workunit.client.0.vm05.stdout:3/330: write d8/d1c/f56 [2252116,57996] 0 2026-03-10T07:50:48.862 
INFO:tasks.workunit.client.0.vm05.stdout:2/420: dread d0/d8/d43/f1d [0,4194304] 0 2026-03-10T07:50:48.866 INFO:tasks.workunit.client.0.vm05.stdout:0/288: mknod d8/c60 0 2026-03-10T07:50:48.869 INFO:tasks.workunit.client.0.vm05.stdout:8/294: truncate d1/fa 10011788 0 2026-03-10T07:50:48.873 INFO:tasks.workunit.client.0.vm05.stdout:2/421: write d0/d8/f3b [1880731,68432] 0 2026-03-10T07:50:48.881 INFO:tasks.workunit.client.0.vm05.stdout:2/422: chown d0/d8/d43/f1f 2982335 1 2026-03-10T07:50:48.881 INFO:tasks.workunit.client.0.vm05.stdout:0/289: mknod d8/dd/d10/d4b/c61 0 2026-03-10T07:50:48.881 INFO:tasks.workunit.client.0.vm05.stdout:9/337: truncate d8/d35/f51 3854611 0 2026-03-10T07:50:48.881 INFO:tasks.workunit.client.0.vm05.stdout:5/337: write d2/d12/d4d/f5d [394780,128219] 0 2026-03-10T07:50:48.881 INFO:tasks.workunit.client.0.vm05.stdout:2/423: creat d0/d8/d43/df/d4d/f84 x:0 0 0 2026-03-10T07:50:48.881 INFO:tasks.workunit.client.0.vm05.stdout:5/338: dread d2/f1a [0,4194304] 0 2026-03-10T07:50:48.882 INFO:tasks.workunit.client.0.vm05.stdout:2/424: write d0/d8/f42 [4711550,35257] 0 2026-03-10T07:50:48.898 INFO:tasks.workunit.client.0.vm05.stdout:1/363: truncate da/f5d 3754996 0 2026-03-10T07:50:48.898 INFO:tasks.workunit.client.0.vm05.stdout:1/364: chown da/d26/f33 24459705 1 2026-03-10T07:50:48.900 INFO:tasks.workunit.client.0.vm05.stdout:8/295: creat d1/dd/d18/f58 x:0 0 0 2026-03-10T07:50:48.903 INFO:tasks.workunit.client.0.vm05.stdout:4/351: write d0/d6/d9/d5a/f2f [982972,10148] 0 2026-03-10T07:50:48.907 INFO:tasks.workunit.client.0.vm05.stdout:2/425: creat d0/d8/d43/d38/f85 x:0 0 0 2026-03-10T07:50:48.913 INFO:tasks.workunit.client.0.vm05.stdout:7/345: rename d1/d34/c66 to d1/c69 0 2026-03-10T07:50:48.913 INFO:tasks.workunit.client.0.vm05.stdout:7/346: fsync d1/d34/d59/f64 0 2026-03-10T07:50:48.920 INFO:tasks.workunit.client.0.vm05.stdout:6/306: getdents d0/d11/d4f 0 2026-03-10T07:50:48.932 INFO:tasks.workunit.client.0.vm05.stdout:6/307: write d0/d11/f58 
[158140,65535] 0 2026-03-10T07:50:48.932 INFO:tasks.workunit.client.0.vm05.stdout:6/308: fdatasync d0/d6/f24 0 2026-03-10T07:50:48.932 INFO:tasks.workunit.client.0.vm05.stdout:9/338: rename d8/d35/d1c/d20/d54/f67 to d8/d35/d22/d33/d47/f76 0 2026-03-10T07:50:48.933 INFO:tasks.workunit.client.0.vm05.stdout:2/426: sync 2026-03-10T07:50:48.936 INFO:tasks.workunit.client.0.vm05.stdout:2/427: chown d0/d8/d43/df/c11 1245 1 2026-03-10T07:50:48.942 INFO:tasks.workunit.client.0.vm05.stdout:0/290: creat d8/dd/d37/d56/f62 x:0 0 0 2026-03-10T07:50:48.947 INFO:tasks.workunit.client.0.vm05.stdout:8/296: rename d1/dd/d18/f2b to d1/dd/d18/d20/d2a/d48/f59 0 2026-03-10T07:50:48.949 INFO:tasks.workunit.client.0.vm05.stdout:9/339: creat d8/d35/d1c/d2c/d63/f77 x:0 0 0 2026-03-10T07:50:48.949 INFO:tasks.workunit.client.0.vm05.stdout:9/340: chown d8/d35/d22/d33/d62 206 1 2026-03-10T07:50:48.957 INFO:tasks.workunit.client.0.vm05.stdout:3/331: dwrite d8/d1c/f63 [0,4194304] 0 2026-03-10T07:50:48.968 INFO:tasks.workunit.client.0.vm05.stdout:8/297: truncate d1/dd/f25 639734 0 2026-03-10T07:50:48.968 INFO:tasks.workunit.client.0.vm05.stdout:8/298: write d1/dd/d18/f38 [845303,126306] 0 2026-03-10T07:50:48.969 INFO:tasks.workunit.client.0.vm05.stdout:8/299: readlink d1/dd/d4d/l51 0 2026-03-10T07:50:48.969 INFO:tasks.workunit.client.0.vm05.stdout:8/300: dread - d1/dd/d18/f58 zero size 2026-03-10T07:50:48.976 INFO:tasks.workunit.client.0.vm05.stdout:1/365: fsync da/f5d 0 2026-03-10T07:50:48.979 INFO:tasks.workunit.client.0.vm05.stdout:1/366: dwrite da/dd/d2a/f54 [0,4194304] 0 2026-03-10T07:50:48.983 INFO:tasks.workunit.client.0.vm05.stdout:0/291: mkdir d8/dd/d10/d26/d3a/d5e/d63 0 2026-03-10T07:50:48.983 INFO:tasks.workunit.client.0.vm05.stdout:0/292: chown d8/dd/d10/d26/d2a/c41 88 1 2026-03-10T07:50:48.996 INFO:tasks.workunit.client.0.vm05.stdout:4/352: dwrite d0/d28/f33 [0,4194304] 0 2026-03-10T07:50:48.997 INFO:tasks.workunit.client.0.vm05.stdout:5/339: truncate d2/f8 1970953 0 
2026-03-10T07:50:48.997 INFO:tasks.workunit.client.0.vm05.stdout:4/353: dread - d0/d6/d9/d5a/f58 zero size 2026-03-10T07:50:49.001 INFO:tasks.workunit.client.0.vm05.stdout:4/354: dwrite d0/d6/d9/d5a/f58 [0,4194304] 0 2026-03-10T07:50:49.011 INFO:tasks.workunit.client.0.vm05.stdout:4/355: dwrite d0/f24 [4194304,4194304] 0 2026-03-10T07:50:49.019 INFO:tasks.workunit.client.0.vm05.stdout:7/347: dwrite d1/d6/f1d [0,4194304] 0 2026-03-10T07:50:49.030 INFO:tasks.workunit.client.0.vm05.stdout:8/301: mkdir d1/dd/d18/d20/d2a/d48/d5a 0 2026-03-10T07:50:49.041 INFO:tasks.workunit.client.0.vm05.stdout:1/367: symlink da/dd/d2a/d55/l66 0 2026-03-10T07:50:49.041 INFO:tasks.workunit.client.0.vm05.stdout:1/368: readlink da/l51 0 2026-03-10T07:50:49.041 INFO:tasks.workunit.client.0.vm05.stdout:6/309: truncate d0/d11/d2e/f30 1901921 0 2026-03-10T07:50:49.041 INFO:tasks.workunit.client.0.vm05.stdout:6/310: truncate d0/d11/d57/f5f 4673989 0 2026-03-10T07:50:49.041 INFO:tasks.workunit.client.0.vm05.stdout:2/428: truncate d0/f6 6072999 0 2026-03-10T07:50:49.044 INFO:tasks.workunit.client.0.vm05.stdout:3/332: dwrite d8/d1f/f49 [0,4194304] 0 2026-03-10T07:50:49.046 INFO:tasks.workunit.client.0.vm05.stdout:3/333: write d8/f18 [2208740,100559] 0 2026-03-10T07:50:49.047 INFO:tasks.workunit.client.0.vm05.stdout:4/356: symlink d0/d6/d9/d5a/d6e/l7b 0 2026-03-10T07:50:49.056 INFO:tasks.workunit.client.0.vm05.stdout:1/369: unlink da/d26/d2b/f50 0 2026-03-10T07:50:49.058 INFO:tasks.workunit.client.0.vm05.stdout:6/311: symlink d0/d6/d3b/l65 0 2026-03-10T07:50:49.063 INFO:tasks.workunit.client.0.vm05.stdout:2/429: dwrite d0/d8/d43/f1d [0,4194304] 0 2026-03-10T07:50:49.064 INFO:tasks.workunit.client.0.vm05.stdout:2/430: chown d0/l4a 4108 1 2026-03-10T07:50:49.065 INFO:tasks.workunit.client.0.vm05.stdout:5/340: mkdir d2/d20/d77 0 2026-03-10T07:50:49.065 INFO:tasks.workunit.client.0.vm05.stdout:5/341: chown d2/d20/d33/c3b 1 1 2026-03-10T07:50:49.066 INFO:tasks.workunit.client.0.vm05.stdout:2/431: write 
d0/d8/d43/df/f21 [2108184,77315] 0 2026-03-10T07:50:49.068 INFO:tasks.workunit.client.0.vm05.stdout:7/348: symlink d1/d34/l6a 0 2026-03-10T07:50:49.068 INFO:tasks.workunit.client.0.vm05.stdout:4/357: creat d0/d28/f7c x:0 0 0 2026-03-10T07:50:49.069 INFO:tasks.workunit.client.0.vm05.stdout:7/349: chown d1/d3c/c54 10226945 1 2026-03-10T07:50:49.070 INFO:tasks.workunit.client.0.vm05.stdout:8/302: link d1/dd/d18/f29 d1/dd/d18/d20/f5b 0 2026-03-10T07:50:49.072 INFO:tasks.workunit.client.0.vm05.stdout:7/350: dwrite d1/f49 [4194304,4194304] 0 2026-03-10T07:50:49.073 INFO:tasks.workunit.client.0.vm05.stdout:7/351: fsync d1/d34/f4d 0 2026-03-10T07:50:49.076 INFO:tasks.workunit.client.0.vm05.stdout:6/312: dwrite d0/d11/f21 [0,4194304] 0 2026-03-10T07:50:49.081 INFO:tasks.workunit.client.0.vm05.stdout:6/313: write d0/d6/d3b/f55 [1535264,51451] 0 2026-03-10T07:50:49.093 INFO:tasks.workunit.client.0.vm05.stdout:0/293: rename d8/dd/d10/d26/d48/c4c to d8/dd/d10/d26/c64 0 2026-03-10T07:50:49.095 INFO:tasks.workunit.client.0.vm05.stdout:8/303: unlink d1/d23/c24 0 2026-03-10T07:50:49.098 INFO:tasks.workunit.client.0.vm05.stdout:0/294: creat d8/f65 x:0 0 0 2026-03-10T07:50:49.101 INFO:tasks.workunit.client.0.vm05.stdout:3/334: rename d8/l14 to d8/d1f/d2a/d34/d5f/l6f 0 2026-03-10T07:50:49.118 INFO:tasks.workunit.client.0.vm05.stdout:0/295: mknod d8/dd/d10/d26/d3a/c66 0 2026-03-10T07:50:49.118 INFO:tasks.workunit.client.0.vm05.stdout:0/296: dwrite d8/fc [0,4194304] 0 2026-03-10T07:50:49.118 INFO:tasks.workunit.client.0.vm05.stdout:8/304: link d1/dd/d18/d20/d2a/f54 d1/dd/d18/f5c 0 2026-03-10T07:50:49.118 INFO:tasks.workunit.client.0.vm05.stdout:2/432: getdents d0/d8/d43/d38 0 2026-03-10T07:50:49.118 INFO:tasks.workunit.client.0.vm05.stdout:0/297: rename d8/dd/d10/d4b to d8/dd/d37/d67 0 2026-03-10T07:50:49.118 INFO:tasks.workunit.client.0.vm05.stdout:7/352: link d1/l51 d1/l6b 0 2026-03-10T07:50:49.118 INFO:tasks.workunit.client.0.vm05.stdout:8/305: rmdir d1/dd/d18/d20/d2a/d34 39 
2026-03-10T07:50:49.118 INFO:tasks.workunit.client.0.vm05.stdout:2/433: rmdir d0/d8/d3d 39 2026-03-10T07:50:49.119 INFO:tasks.workunit.client.0.vm05.stdout:8/306: mkdir d1/dd/d18/d20/d2a/d34/d49/d5d 0 2026-03-10T07:50:49.120 INFO:tasks.workunit.client.0.vm05.stdout:0/298: getdents d8/dd/d37/d56 0 2026-03-10T07:50:49.121 INFO:tasks.workunit.client.0.vm05.stdout:0/299: dread - d8/dd/d34/f3d zero size 2026-03-10T07:50:49.121 INFO:tasks.workunit.client.0.vm05.stdout:0/300: fdatasync d8/dd/d37/f38 0 2026-03-10T07:50:49.122 INFO:tasks.workunit.client.0.vm05.stdout:8/307: mkdir d1/dd/d5e 0 2026-03-10T07:50:49.123 INFO:tasks.workunit.client.0.vm05.stdout:8/308: write d1/dd/d18/d20/f43 [4133085,113424] 0 2026-03-10T07:50:49.125 INFO:tasks.workunit.client.0.vm05.stdout:0/301: link d8/dd/d37/d56/c14 d8/dd/d37/d56/d4d/c68 0 2026-03-10T07:50:49.126 INFO:tasks.workunit.client.0.vm05.stdout:0/302: fsync d8/f1c 0 2026-03-10T07:50:49.128 INFO:tasks.workunit.client.0.vm05.stdout:0/303: creat d8/dd/d37/d56/d4d/f69 x:0 0 0 2026-03-10T07:50:49.130 INFO:tasks.workunit.client.0.vm05.stdout:0/304: rename d8/dd/d37/d56/d4d/c68 to d8/dd/d10/d26/d3a/c6a 0 2026-03-10T07:50:49.135 INFO:tasks.workunit.client.0.vm05.stdout:0/305: dwrite d8/dd/f40 [0,4194304] 0 2026-03-10T07:50:49.137 INFO:tasks.workunit.client.0.vm05.stdout:0/306: stat d8/dd/d10/l25 0 2026-03-10T07:50:49.141 INFO:tasks.workunit.client.0.vm05.stdout:0/307: symlink d8/dd/d10/d26/d3a/d5e/l6b 0 2026-03-10T07:50:49.146 INFO:tasks.workunit.client.0.vm05.stdout:0/308: creat d8/dd/d10/f6c x:0 0 0 2026-03-10T07:50:49.146 INFO:tasks.workunit.client.0.vm05.stdout:0/309: mkdir d8/dd/d37/d56/d6d 0 2026-03-10T07:50:49.152 INFO:tasks.workunit.client.0.vm05.stdout:6/314: sync 2026-03-10T07:50:49.152 INFO:tasks.workunit.client.0.vm05.stdout:7/353: sync 2026-03-10T07:50:49.157 INFO:tasks.workunit.client.0.vm05.stdout:7/354: creat d1/d6/f6c x:0 0 0 2026-03-10T07:50:49.163 INFO:tasks.workunit.client.0.vm05.stdout:6/315: mkdir d0/d11/d57/d66 0 
2026-03-10T07:50:49.163 INFO:tasks.workunit.client.0.vm05.stdout:7/355: mknod d1/d5b/c6d 0 2026-03-10T07:50:49.168 INFO:tasks.workunit.client.0.vm05.stdout:9/341: write d8/d35/d22/f4a [368565,47114] 0 2026-03-10T07:50:49.169 INFO:tasks.workunit.client.0.vm05.stdout:9/342: write d8/d35/d1c/d26/d28/f43 [1461952,44196] 0 2026-03-10T07:50:49.170 INFO:tasks.workunit.client.0.vm05.stdout:9/343: truncate d8/d35/d22/d33/d62/f6c 659150 0 2026-03-10T07:50:49.183 INFO:tasks.workunit.client.0.vm05.stdout:9/344: sync 2026-03-10T07:50:49.187 INFO:tasks.workunit.client.0.vm05.stdout:9/345: write d8/d35/f21 [447799,57591] 0 2026-03-10T07:50:49.188 INFO:tasks.workunit.client.0.vm05.stdout:9/346: dread d8/d35/d22/d33/d62/f6c [0,4194304] 0 2026-03-10T07:50:49.277 INFO:tasks.workunit.client.0.vm05.stdout:4/358: dread d0/d6/d37/f3d [0,4194304] 0 2026-03-10T07:50:49.279 INFO:tasks.workunit.client.0.vm05.stdout:4/359: truncate d0/f2 6592382 0 2026-03-10T07:50:49.279 INFO:tasks.workunit.client.0.vm05.stdout:4/360: write d0/d20/f2a [3475625,27070] 0 2026-03-10T07:50:49.283 INFO:tasks.workunit.client.0.vm05.stdout:4/361: creat d0/d6/d9/d12/d45/d55/f7d x:0 0 0 2026-03-10T07:50:49.292 INFO:tasks.workunit.client.0.vm05.stdout:4/362: dread d0/d6/d9/f54 [0,4194304] 0 2026-03-10T07:50:49.293 INFO:tasks.workunit.client.0.vm05.stdout:4/363: fsync d0/d6/d37/f3d 0 2026-03-10T07:50:49.293 INFO:tasks.workunit.client.0.vm05.stdout:4/364: chown d0/d3b/f53 3 1 2026-03-10T07:50:49.294 INFO:tasks.workunit.client.0.vm05.stdout:4/365: creat d0/d6/d9/d12/d45/d55/d44/f7e x:0 0 0 2026-03-10T07:50:49.296 INFO:tasks.workunit.client.0.vm05.stdout:4/366: rename d0/d6/c5d to d0/d6/d9/d12/d45/d55/d44/c7f 0 2026-03-10T07:50:49.300 INFO:tasks.workunit.client.0.vm05.stdout:4/367: dwrite d0/d6/d9/f4d [0,4194304] 0 2026-03-10T07:50:49.306 INFO:tasks.workunit.client.0.vm05.stdout:4/368: dwrite d0/d6/d9/d5a/f58 [0,4194304] 0 2026-03-10T07:50:49.310 INFO:tasks.workunit.client.0.vm05.stdout:6/316: rmdir d0/d6/d3b 39 
2026-03-10T07:50:49.311 INFO:tasks.workunit.client.0.vm05.stdout:1/370: getdents da/d26/d2b 0 2026-03-10T07:50:49.312 INFO:tasks.workunit.client.0.vm05.stdout:1/371: fdatasync da/dd/d12/d34/f5f 0 2026-03-10T07:50:49.314 INFO:tasks.workunit.client.0.vm05.stdout:1/372: read da/d26/f2d [501001,51605] 0 2026-03-10T07:50:49.315 INFO:tasks.workunit.client.0.vm05.stdout:1/373: chown da 454454753 1 2026-03-10T07:50:49.316 INFO:tasks.workunit.client.0.vm05.stdout:1/374: write da/dd/d12/f31 [4603058,68112] 0 2026-03-10T07:50:49.317 INFO:tasks.workunit.client.0.vm05.stdout:1/375: write f4 [10910774,61592] 0 2026-03-10T07:50:49.317 INFO:tasks.workunit.client.0.vm05.stdout:6/317: dwrite d0/d11/f58 [0,4194304] 0 2026-03-10T07:50:49.331 INFO:tasks.workunit.client.0.vm05.stdout:6/318: getdents d0/d11/d57/d60 0 2026-03-10T07:50:49.334 INFO:tasks.workunit.client.0.vm05.stdout:6/319: rename d0/l4 to d0/d35/d36/d43/l67 0 2026-03-10T07:50:49.343 INFO:tasks.workunit.client.0.vm05.stdout:6/320: dread d0/d35/f41 [0,4194304] 0 2026-03-10T07:50:49.346 INFO:tasks.workunit.client.0.vm05.stdout:6/321: dwrite d0/d35/f41 [0,4194304] 0 2026-03-10T07:50:49.350 INFO:tasks.workunit.client.0.vm05.stdout:6/322: getdents d0/d6 0 2026-03-10T07:50:49.352 INFO:tasks.workunit.client.0.vm05.stdout:6/323: fsync d0/d35/d36/f59 0 2026-03-10T07:50:49.352 INFO:tasks.workunit.client.0.vm05.stdout:6/324: write d0/d11/d22/f52 [722225,25546] 0 2026-03-10T07:50:49.398 INFO:tasks.workunit.client.0.vm05.stdout:6/325: sync 2026-03-10T07:50:49.399 INFO:tasks.workunit.client.0.vm05.stdout:8/309: rmdir d1/d23 39 2026-03-10T07:50:49.400 INFO:tasks.workunit.client.0.vm05.stdout:5/342: write d2/d5/d61/f65 [4893568,36482] 0 2026-03-10T07:50:49.407 INFO:tasks.workunit.client.0.vm05.stdout:3/335: getdents d8/d1f/d2a/d4b 0 2026-03-10T07:50:49.409 INFO:tasks.workunit.client.0.vm05.stdout:3/336: creat d8/d1c/d48/f70 x:0 0 0 2026-03-10T07:50:49.410 INFO:tasks.workunit.client.0.vm05.stdout:3/337: symlink d8/d1f/d24/d45/l71 0 
2026-03-10T07:50:49.412 INFO:tasks.workunit.client.0.vm05.stdout:3/338: stat d8/f5d 0 2026-03-10T07:50:49.415 INFO:tasks.workunit.client.0.vm05.stdout:2/434: write d0/d8/f2d [53438,59226] 0 2026-03-10T07:50:49.416 INFO:tasks.workunit.client.0.vm05.stdout:3/339: sync 2026-03-10T07:50:49.420 INFO:tasks.workunit.client.0.vm05.stdout:3/340: dwrite d8/d1f/f2f [0,4194304] 0 2026-03-10T07:50:49.422 INFO:tasks.workunit.client.0.vm05.stdout:3/341: dread - d8/d1f/d2a/f42 zero size 2026-03-10T07:50:49.429 INFO:tasks.workunit.client.0.vm05.stdout:7/356: truncate d1/d34/f3e 2743251 0 2026-03-10T07:50:49.430 INFO:tasks.workunit.client.0.vm05.stdout:7/357: stat d1/d3c 0 2026-03-10T07:50:49.431 INFO:tasks.workunit.client.0.vm05.stdout:9/347: truncate d8/d35/d1c/f49 234294 0 2026-03-10T07:50:49.433 INFO:tasks.workunit.client.0.vm05.stdout:2/435: mknod d0/d8/d43/df/c86 0 2026-03-10T07:50:49.434 INFO:tasks.workunit.client.0.vm05.stdout:2/436: dread - d0/d8/d43/df/d4d/f84 zero size 2026-03-10T07:50:49.435 INFO:tasks.workunit.client.0.vm05.stdout:8/310: dread d1/dd/d18/d20/d2a/f3a [0,4194304] 0 2026-03-10T07:50:49.436 INFO:tasks.workunit.client.0.vm05.stdout:3/342: truncate d8/d16/f2d 1515726 0 2026-03-10T07:50:49.437 INFO:tasks.workunit.client.0.vm05.stdout:2/437: dread d0/d8/d43/d38/f56 [0,4194304] 0 2026-03-10T07:50:49.439 INFO:tasks.workunit.client.0.vm05.stdout:7/358: fdatasync d1/f21 0 2026-03-10T07:50:49.444 INFO:tasks.workunit.client.0.vm05.stdout:0/310: dwrite d8/dd/d37/d56/f18 [0,4194304] 0 2026-03-10T07:50:49.446 INFO:tasks.workunit.client.0.vm05.stdout:7/359: dwrite d1/f11 [0,4194304] 0 2026-03-10T07:50:49.450 INFO:tasks.workunit.client.0.vm05.stdout:2/438: sync 2026-03-10T07:50:49.453 INFO:tasks.workunit.client.0.vm05.stdout:2/439: write d0/d8/fe [3466628,46524] 0 2026-03-10T07:50:49.462 INFO:tasks.workunit.client.0.vm05.stdout:9/348: rmdir d8/d35/d1c/d2c/d63 39 2026-03-10T07:50:49.463 INFO:tasks.workunit.client.0.vm05.stdout:9/349: truncate d8/d35/d22/f6a 532222 0 
2026-03-10T07:50:49.465 INFO:tasks.workunit.client.0.vm05.stdout:8/311: mknod d1/dd/d18/d20/d2a/d48/c5f 0 2026-03-10T07:50:49.467 INFO:tasks.workunit.client.0.vm05.stdout:9/350: dwrite d8/d35/d22/f4f [0,4194304] 0 2026-03-10T07:50:49.471 INFO:tasks.workunit.client.0.vm05.stdout:0/311: creat d8/dd/d10/d26/d2a/f6e x:0 0 0 2026-03-10T07:50:49.482 INFO:tasks.workunit.client.0.vm05.stdout:0/312: readlink d8/dd/l36 0 2026-03-10T07:50:49.482 INFO:tasks.workunit.client.0.vm05.stdout:0/313: stat d8/dd/d34/c35 0 2026-03-10T07:50:49.482 INFO:tasks.workunit.client.0.vm05.stdout:2/440: rename d0/d8/d43/d38/l51 to d0/d47/d49/d81/l87 0 2026-03-10T07:50:49.482 INFO:tasks.workunit.client.0.vm05.stdout:8/312: fdatasync d1/d45/f55 0 2026-03-10T07:50:49.482 INFO:tasks.workunit.client.0.vm05.stdout:3/343: creat d8/d1c/d64/f72 x:0 0 0 2026-03-10T07:50:49.482 INFO:tasks.workunit.client.0.vm05.stdout:3/344: readlink d8/d1f/l59 0 2026-03-10T07:50:49.482 INFO:tasks.workunit.client.0.vm05.stdout:4/369: write d0/d3b/f4c [1235763,17721] 0 2026-03-10T07:50:49.486 INFO:tasks.workunit.client.0.vm05.stdout:6/326: chown d0/d35/d36/d43/l67 72 1 2026-03-10T07:50:49.489 INFO:tasks.workunit.client.0.vm05.stdout:1/376: dwrite da/dd/d12/d34/f38 [0,4194304] 0 2026-03-10T07:50:49.491 INFO:tasks.workunit.client.0.vm05.stdout:9/351: symlink d8/d35/d1c/d36/l78 0 2026-03-10T07:50:49.497 INFO:tasks.workunit.client.0.vm05.stdout:7/360: mkdir d1/d34/d59/d60/d6e 0 2026-03-10T07:50:49.498 INFO:tasks.workunit.client.0.vm05.stdout:4/370: sync 2026-03-10T07:50:49.499 INFO:tasks.workunit.client.0.vm05.stdout:4/371: write d0/d3b/f4c [1779556,57983] 0 2026-03-10T07:50:49.500 INFO:tasks.workunit.client.0.vm05.stdout:6/327: dread d0/f29 [0,4194304] 0 2026-03-10T07:50:49.506 INFO:tasks.workunit.client.0.vm05.stdout:8/313: creat d1/dd/d4d/f60 x:0 0 0 2026-03-10T07:50:49.508 INFO:tasks.workunit.client.0.vm05.stdout:3/345: unlink d8/d1f/d2a/d34/f46 0 2026-03-10T07:50:49.511 INFO:tasks.workunit.client.0.vm05.stdout:1/377: 
rename da/dd/c3f to da/dd/d27/c67 0 2026-03-10T07:50:49.516 INFO:tasks.workunit.client.0.vm05.stdout:5/343: chown d2/f8 123771571 1 2026-03-10T07:50:49.517 INFO:tasks.workunit.client.0.vm05.stdout:0/314: truncate d8/dd/d10/d26/d2a/f2e 1233343 0 2026-03-10T07:50:49.519 INFO:tasks.workunit.client.0.vm05.stdout:4/372: symlink d0/d6/d37/l80 0 2026-03-10T07:50:49.524 INFO:tasks.workunit.client.0.vm05.stdout:6/328: mknod d0/d35/d36/d43/c68 0 2026-03-10T07:50:49.524 INFO:tasks.workunit.client.0.vm05.stdout:8/314: creat d1/dd/d4d/f61 x:0 0 0 2026-03-10T07:50:49.524 INFO:tasks.workunit.client.0.vm05.stdout:4/373: dwrite d0/d6/d9/d12/d45/d55/f56 [0,4194304] 0 2026-03-10T07:50:49.532 INFO:tasks.workunit.client.0.vm05.stdout:1/378: sync 2026-03-10T07:50:49.532 INFO:tasks.workunit.client.0.vm05.stdout:9/352: rename d8/d35/d3c to d8/d35/d1c/d26/d28/d79 0 2026-03-10T07:50:49.536 INFO:tasks.workunit.client.0.vm05.stdout:9/353: dwrite d8/d35/d22/d33/d47/f5f [0,4194304] 0 2026-03-10T07:50:49.540 INFO:tasks.workunit.client.0.vm05.stdout:1/379: sync 2026-03-10T07:50:49.540 INFO:tasks.workunit.client.0.vm05.stdout:5/344: mknod d2/d20/d4c/c78 0 2026-03-10T07:50:49.546 INFO:tasks.workunit.client.0.vm05.stdout:0/315: readlink d8/dd/d37/d56/l17 0 2026-03-10T07:50:49.550 INFO:tasks.workunit.client.0.vm05.stdout:6/329: write d0/d35/d36/f5b [454714,107766] 0 2026-03-10T07:50:49.552 INFO:tasks.workunit.client.0.vm05.stdout:0/316: dread f4 [0,4194304] 0 2026-03-10T07:50:49.559 INFO:tasks.workunit.client.0.vm05.stdout:3/346: rename d8/d16/c28 to d8/d1c/d48/c73 0 2026-03-10T07:50:49.565 INFO:tasks.workunit.client.0.vm05.stdout:5/345: dwrite d2/d12/d4d/f5d [0,4194304] 0 2026-03-10T07:50:49.567 INFO:tasks.workunit.client.0.vm05.stdout:6/330: mkdir d0/d11/d22/d69 0 2026-03-10T07:50:49.571 INFO:tasks.workunit.client.0.vm05.stdout:0/317: mkdir d8/dd/d10/d26/d2a/d6f 0 2026-03-10T07:50:49.574 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:50:49 vm08.local ceph-mon[59917]: from='mgr.14628 
192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' 2026-03-10T07:50:49.574 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:50:49 vm08.local ceph-mon[59917]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' 2026-03-10T07:50:49.574 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:50:49 vm08.local ceph-mon[59917]: Upgrade: Updating node-exporter.vm08 (2/2) 2026-03-10T07:50:49.574 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:50:49 vm08.local ceph-mon[59917]: Deploying daemon node-exporter.vm08 on vm08 2026-03-10T07:50:49.574 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:50:49 vm08.local ceph-mon[59917]: pgmap v14: 65 pgs: 65 active+clean; 840 MiB data, 3.1 GiB used, 117 GiB / 120 GiB avail; 18 MiB/s rd, 108 MiB/s wr, 295 op/s 2026-03-10T07:50:49.574 INFO:tasks.workunit.client.0.vm05.stdout:0/318: write d8/f4e [734058,40525] 0 2026-03-10T07:50:49.576 INFO:tasks.workunit.client.0.vm05.stdout:7/361: getdents d1/d6 0 2026-03-10T07:50:49.576 INFO:tasks.workunit.client.0.vm05.stdout:7/362: fdatasync d1/f3a 0 2026-03-10T07:50:49.576 INFO:tasks.workunit.client.0.vm05.stdout:7/363: chown d1/d6/f31 1329 1 2026-03-10T07:50:49.584 INFO:tasks.workunit.client.0.vm05.stdout:7/364: dread d1/d6/f1b [0,4194304] 0 2026-03-10T07:50:49.587 INFO:tasks.workunit.client.0.vm05.stdout:7/365: chown d1/d6/d3b/f42 658 1 2026-03-10T07:50:49.588 INFO:tasks.workunit.client.0.vm05.stdout:7/366: dwrite d1/f3a [0,4194304] 0 2026-03-10T07:50:49.591 INFO:tasks.workunit.client.0.vm05.stdout:4/374: link d0/d6/d37/l80 d0/d3b/d5c/l81 0 2026-03-10T07:50:49.597 INFO:tasks.workunit.client.0.vm05.stdout:5/346: mknod d2/c79 0 2026-03-10T07:50:49.598 INFO:tasks.workunit.client.0.vm05.stdout:8/315: getdents d1/d45 0 2026-03-10T07:50:49.598 INFO:tasks.workunit.client.0.vm05.stdout:5/347: dread - d2/d20/f51 zero size 2026-03-10T07:50:49.599 INFO:tasks.workunit.client.0.vm05.stdout:8/316: chown d1/dd/d18/d20/d2a/d34/d49/d5d 106326 1 2026-03-10T07:50:49.602 
INFO:tasks.workunit.client.0.vm05.stdout:7/367: creat d1/d34/d59/f6f x:0 0 0 2026-03-10T07:50:49.605 INFO:tasks.workunit.client.0.vm05.stdout:7/368: dwrite d1/d6/d47/f52 [0,4194304] 0 2026-03-10T07:50:49.607 INFO:tasks.workunit.client.0.vm05.stdout:4/375: creat d0/d6/d9/d5a/f82 x:0 0 0 2026-03-10T07:50:49.613 INFO:tasks.workunit.client.0.vm05.stdout:5/348: fsync d2/d12/f24 0 2026-03-10T07:50:49.614 INFO:tasks.workunit.client.0.vm05.stdout:8/317: write d1/f2c [2267191,35453] 0 2026-03-10T07:50:49.614 INFO:tasks.workunit.client.0.vm05.stdout:8/318: stat d1/dd/d4d/f61 0 2026-03-10T07:50:49.619 INFO:tasks.workunit.client.0.vm05.stdout:7/369: dwrite d1/d3c/f63 [0,4194304] 0 2026-03-10T07:50:49.620 INFO:tasks.workunit.client.0.vm05.stdout:5/349: truncate d2/f15 428527 0 2026-03-10T07:50:49.621 INFO:tasks.workunit.client.0.vm05.stdout:5/350: write d2/d12/d4d/f5d [223196,48710] 0 2026-03-10T07:50:49.624 INFO:tasks.workunit.client.0.vm05.stdout:8/319: stat d1/lf 0 2026-03-10T07:50:49.638 INFO:tasks.workunit.client.0.vm05.stdout:0/319: getdents d8/dd/d10/d26/d3a/d5e 0 2026-03-10T07:50:49.638 INFO:tasks.workunit.client.0.vm05.stdout:4/376: creat d0/d6/d9/f83 x:0 0 0 2026-03-10T07:50:49.638 INFO:tasks.workunit.client.0.vm05.stdout:4/377: fsync d0/d6/d9/d12/d45/f66 0 2026-03-10T07:50:49.638 INFO:tasks.workunit.client.0.vm05.stdout:4/378: truncate d0/d3b/f4a 816537 0 2026-03-10T07:50:49.638 INFO:tasks.workunit.client.0.vm05.stdout:5/351: fsync d2/d5/f23 0 2026-03-10T07:50:49.638 INFO:tasks.workunit.client.0.vm05.stdout:5/352: stat d2/d5/d61/f65 0 2026-03-10T07:50:49.638 INFO:tasks.workunit.client.0.vm05.stdout:0/320: mkdir d8/dd/d37/d56/d6d/d70 0 2026-03-10T07:50:49.638 INFO:tasks.workunit.client.0.vm05.stdout:0/321: dread d8/dd/d34/f5b [0,4194304] 0 2026-03-10T07:50:49.638 INFO:tasks.workunit.client.0.vm05.stdout:5/353: link d2/d12/d4d/f5d d2/d12/d2d/f7a 0 2026-03-10T07:50:49.638 INFO:tasks.workunit.client.0.vm05.stdout:0/322: write d8/fb [1840281,73630] 0 
2026-03-10T07:50:49.640 INFO:tasks.workunit.client.0.vm05.stdout:5/354: truncate d2/d12/f40 1036900 0 2026-03-10T07:50:49.642 INFO:tasks.workunit.client.0.vm05.stdout:0/323: creat d8/dd/d10/d26/d3a/d5e/f71 x:0 0 0 2026-03-10T07:50:49.645 INFO:tasks.workunit.client.0.vm05.stdout:0/324: dread f4 [0,4194304] 0 2026-03-10T07:50:49.647 INFO:tasks.workunit.client.0.vm05.stdout:0/325: symlink d8/dd/l72 0 2026-03-10T07:50:49.657 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:50:49 vm05.local ceph-mon[50387]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' 2026-03-10T07:50:49.657 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:50:49 vm05.local ceph-mon[50387]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' 2026-03-10T07:50:49.657 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:50:49 vm05.local ceph-mon[50387]: Upgrade: Updating node-exporter.vm08 (2/2) 2026-03-10T07:50:49.657 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:50:49 vm05.local ceph-mon[50387]: Deploying daemon node-exporter.vm08 on vm08 2026-03-10T07:50:49.657 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:50:49 vm05.local ceph-mon[50387]: pgmap v14: 65 pgs: 65 active+clean; 840 MiB data, 3.1 GiB used, 117 GiB / 120 GiB avail; 18 MiB/s rd, 108 MiB/s wr, 295 op/s 2026-03-10T07:50:49.659 INFO:tasks.workunit.client.0.vm05.stdout:0/326: dread d8/dd/d37/d56/f3b [0,4194304] 0 2026-03-10T07:50:49.695 INFO:tasks.workunit.client.0.vm05.stdout:2/441: unlink d0/d47/d49/d81/l87 0 2026-03-10T07:50:49.696 INFO:tasks.workunit.client.0.vm05.stdout:2/442: creat d0/d52/f88 x:0 0 0 2026-03-10T07:50:49.715 INFO:tasks.workunit.client.0.vm05.stdout:7/370: dread d1/d6/f2e [0,4194304] 0 2026-03-10T07:50:49.716 INFO:tasks.workunit.client.0.vm05.stdout:7/371: mknod d1/d5b/c70 0 2026-03-10T07:50:49.717 INFO:tasks.workunit.client.0.vm05.stdout:7/372: mkdir d1/d3c/d71 0 2026-03-10T07:50:49.719 INFO:tasks.workunit.client.0.vm05.stdout:7/373: unlink d1/d6/f6c 0 
2026-03-10T07:50:49.721 INFO:tasks.workunit.client.0.vm05.stdout:1/380: rename da/dd/d27 to da/dd/d2a/d55/d68 0 2026-03-10T07:50:49.722 INFO:tasks.workunit.client.0.vm05.stdout:1/381: stat da/dd/d42 0 2026-03-10T07:50:49.727 INFO:tasks.workunit.client.0.vm05.stdout:7/374: creat d1/d34/d59/f72 x:0 0 0 2026-03-10T07:50:49.730 INFO:tasks.workunit.client.0.vm05.stdout:7/375: creat d1/d5b/f73 x:0 0 0 2026-03-10T07:50:49.731 INFO:tasks.workunit.client.0.vm05.stdout:7/376: mknod d1/d3c/d71/c74 0 2026-03-10T07:50:49.732 INFO:tasks.workunit.client.0.vm05.stdout:7/377: write d1/f49 [3122440,61083] 0 2026-03-10T07:50:49.732 INFO:tasks.workunit.client.0.vm05.stdout:7/378: read d1/f16 [3376504,91145] 0 2026-03-10T07:50:49.733 INFO:tasks.workunit.client.0.vm05.stdout:7/379: readlink d1/d6/l26 0 2026-03-10T07:50:49.733 INFO:tasks.workunit.client.0.vm05.stdout:7/380: readlink d1/d3c/d4b/l56 0 2026-03-10T07:50:49.737 INFO:tasks.workunit.client.0.vm05.stdout:6/331: rmdir d0/d11 39 2026-03-10T07:50:49.738 INFO:tasks.workunit.client.0.vm05.stdout:3/347: truncate d8/d1f/d2a/d4b/f55 2413181 0 2026-03-10T07:50:49.745 INFO:tasks.workunit.client.0.vm05.stdout:6/332: mknod d0/d11/d22/c6a 0 2026-03-10T07:50:49.749 INFO:tasks.workunit.client.0.vm05.stdout:3/348: mknod d8/d16/d19/d6b/c74 0 2026-03-10T07:50:49.750 INFO:tasks.workunit.client.0.vm05.stdout:8/320: dwrite d1/f15 [0,4194304] 0 2026-03-10T07:50:49.758 INFO:tasks.workunit.client.0.vm05.stdout:4/379: truncate d0/d6/d9/d12/d45/d55/f19 33700 0 2026-03-10T07:50:49.762 INFO:tasks.workunit.client.0.vm05.stdout:5/355: truncate d2/d12/d2d/f60 516924 0 2026-03-10T07:50:49.774 INFO:tasks.workunit.client.0.vm05.stdout:3/349: getdents d8 0 2026-03-10T07:50:49.774 INFO:tasks.workunit.client.0.vm05.stdout:4/380: dread d0/d6/d9/d12/d45/d55/f2c [0,4194304] 0 2026-03-10T07:50:49.774 INFO:tasks.workunit.client.0.vm05.stdout:0/327: dwrite d8/f20 [0,4194304] 0 2026-03-10T07:50:49.777 INFO:tasks.workunit.client.0.vm05.stdout:6/333: sync 
2026-03-10T07:50:49.783 INFO:tasks.workunit.client.0.vm05.stdout:4/381: symlink d0/d20/l84 0 2026-03-10T07:50:49.784 INFO:tasks.workunit.client.0.vm05.stdout:0/328: mknod d8/c73 0 2026-03-10T07:50:49.787 INFO:tasks.workunit.client.0.vm05.stdout:3/350: creat d8/d22/d54/f75 x:0 0 0 2026-03-10T07:50:49.788 INFO:tasks.workunit.client.0.vm05.stdout:3/351: dread - d8/d16/f67 zero size 2026-03-10T07:50:49.790 INFO:tasks.workunit.client.0.vm05.stdout:3/352: dread d8/d1f/f2f [0,4194304] 0 2026-03-10T07:50:49.792 INFO:tasks.workunit.client.0.vm05.stdout:2/443: truncate d0/f22 1581314 0 2026-03-10T07:50:49.793 INFO:tasks.workunit.client.0.vm05.stdout:2/444: fdatasync d0/d8/d43/df/d53/f69 0 2026-03-10T07:50:49.796 INFO:tasks.workunit.client.0.vm05.stdout:6/334: dread d0/f26 [0,4194304] 0 2026-03-10T07:50:49.799 INFO:tasks.workunit.client.0.vm05.stdout:0/329: dwrite d8/dd/d34/f5b [0,4194304] 0 2026-03-10T07:50:49.803 INFO:tasks.workunit.client.0.vm05.stdout:4/382: sync 2026-03-10T07:50:49.810 INFO:tasks.workunit.client.0.vm05.stdout:9/354: chown d8/d35/d1c/f49 4553510 1 2026-03-10T07:50:49.813 INFO:tasks.workunit.client.0.vm05.stdout:2/445: dread d0/f5 [0,4194304] 0 2026-03-10T07:50:49.814 INFO:tasks.workunit.client.0.vm05.stdout:2/446: fsync d0/d8/fe 0 2026-03-10T07:50:49.815 INFO:tasks.workunit.client.0.vm05.stdout:6/335: chown d0/d6/cc 739841 1 2026-03-10T07:50:49.817 INFO:tasks.workunit.client.0.vm05.stdout:1/382: dwrite da/d26/f2d [0,4194304] 0 2026-03-10T07:50:49.818 INFO:tasks.workunit.client.0.vm05.stdout:7/381: truncate d1/d6/f2e 2902159 0 2026-03-10T07:50:49.823 INFO:tasks.workunit.client.0.vm05.stdout:1/383: dwrite da/dd/d12/d19/f3b [0,4194304] 0 2026-03-10T07:50:49.832 INFO:tasks.workunit.client.0.vm05.stdout:6/336: dread d0/f23 [4194304,4194304] 0 2026-03-10T07:50:49.840 INFO:tasks.workunit.client.0.vm05.stdout:9/355: write d8/d35/f25 [871468,112838] 0 2026-03-10T07:50:49.848 INFO:tasks.workunit.client.0.vm05.stdout:9/356: stat d8/d35/d22/c30 0 
2026-03-10T07:50:49.848 INFO:tasks.workunit.client.0.vm05.stdout:7/382: truncate d1/d3c/d4b/f4f 1669927 0 2026-03-10T07:50:49.848 INFO:tasks.workunit.client.0.vm05.stdout:0/330: rename d8/dd/d37/d56/d4d/f54 to d8/dd/d10/d26/d2a/f74 0 2026-03-10T07:50:49.848 INFO:tasks.workunit.client.0.vm05.stdout:1/384: write da/f3a [3665972,64777] 0 2026-03-10T07:50:49.848 INFO:tasks.workunit.client.0.vm05.stdout:1/385: dread - da/dd/d2a/f63 zero size 2026-03-10T07:50:49.848 INFO:tasks.workunit.client.0.vm05.stdout:6/337: unlink d0/d11/d4f/f53 0 2026-03-10T07:50:49.848 INFO:tasks.workunit.client.0.vm05.stdout:7/383: dwrite d1/d6/d47/f65 [0,4194304] 0 2026-03-10T07:50:49.854 INFO:tasks.workunit.client.0.vm05.stdout:6/338: dwrite d0/d35/d36/d43/f47 [0,4194304] 0 2026-03-10T07:50:49.862 INFO:tasks.workunit.client.0.vm05.stdout:0/331: rename f4 to d8/f75 0 2026-03-10T07:50:49.871 INFO:tasks.workunit.client.0.vm05.stdout:8/321: write d1/dd/d18/d20/d2a/f3a [1497612,113306] 0 2026-03-10T07:50:49.871 INFO:tasks.workunit.client.0.vm05.stdout:2/447: creat d0/f89 x:0 0 0 2026-03-10T07:50:49.871 INFO:tasks.workunit.client.0.vm05.stdout:5/356: write d2/d20/d5b/f5f [1034401,4917] 0 2026-03-10T07:50:49.873 INFO:tasks.workunit.client.0.vm05.stdout:0/332: read d8/dd/f3c [911940,114868] 0 2026-03-10T07:50:49.873 INFO:tasks.workunit.client.0.vm05.stdout:0/333: truncate d8/f65 946170 0 2026-03-10T07:50:49.874 INFO:tasks.workunit.client.0.vm05.stdout:0/334: chown d8/dd/d37/d56/d6d/d70 750873 1 2026-03-10T07:50:49.880 INFO:tasks.workunit.client.0.vm05.stdout:7/384: rename d1/c61 to d1/d6/d47/c75 0 2026-03-10T07:50:49.881 INFO:tasks.workunit.client.0.vm05.stdout:3/353: write d8/d1f/d2a/f32 [4694369,93776] 0 2026-03-10T07:50:49.886 INFO:tasks.workunit.client.0.vm05.stdout:2/448: symlink d0/d8/d43/df/d4d/l8a 0 2026-03-10T07:50:49.886 INFO:tasks.workunit.client.0.vm05.stdout:5/357: stat d2/d12/c3e 0 2026-03-10T07:50:49.887 INFO:tasks.workunit.client.0.vm05.stdout:9/357: link d8/d35/d1c/d20/c3e 
d8/d35/d22/d33/d62/d6d/c7a 0 2026-03-10T07:50:49.887 INFO:tasks.workunit.client.0.vm05.stdout:6/339: creat d0/d11/d4f/d56/f6b x:0 0 0 2026-03-10T07:50:49.890 INFO:tasks.workunit.client.0.vm05.stdout:3/354: mkdir d8/d1f/d24/d76 0 2026-03-10T07:50:49.891 INFO:tasks.workunit.client.0.vm05.stdout:2/449: mkdir d0/d8/d43/df/d8b 0 2026-03-10T07:50:49.892 INFO:tasks.workunit.client.0.vm05.stdout:9/358: creat d8/d35/d1c/d20/d54/f7b x:0 0 0 2026-03-10T07:50:49.892 INFO:tasks.workunit.client.0.vm05.stdout:6/340: rmdir d0/d11/d31 39 2026-03-10T07:50:49.894 INFO:tasks.workunit.client.0.vm05.stdout:9/359: dread d8/d35/d1c/f49 [0,4194304] 0 2026-03-10T07:50:49.894 INFO:tasks.workunit.client.0.vm05.stdout:9/360: stat d8/d35/d22/d33/d62 0 2026-03-10T07:50:49.900 INFO:tasks.workunit.client.0.vm05.stdout:9/361: write d8/d35/f5d [173031,128453] 0 2026-03-10T07:50:49.900 INFO:tasks.workunit.client.0.vm05.stdout:8/322: rename d1/l16 to d1/dd/d18/d20/d2a/d34/l62 0 2026-03-10T07:50:49.900 INFO:tasks.workunit.client.0.vm05.stdout:8/323: write d1/dd/d4d/f61 [682693,97150] 0 2026-03-10T07:50:49.900 INFO:tasks.workunit.client.0.vm05.stdout:7/385: link d1/d6/f31 d1/d5b/f76 0 2026-03-10T07:50:49.900 INFO:tasks.workunit.client.0.vm05.stdout:1/386: link da/l30 da/dd/d12/l69 0 2026-03-10T07:50:49.901 INFO:tasks.workunit.client.0.vm05.stdout:2/450: mkdir d0/d2a/d8c 0 2026-03-10T07:50:49.901 INFO:tasks.workunit.client.0.vm05.stdout:5/358: mkdir d2/d20/d7b 0 2026-03-10T07:50:49.902 INFO:tasks.workunit.client.0.vm05.stdout:6/341: mkdir d0/d11/d22/d6c 0 2026-03-10T07:50:49.902 INFO:tasks.workunit.client.0.vm05.stdout:9/362: creat d8/d35/d1c/d26/d28/f7c x:0 0 0 2026-03-10T07:50:49.903 INFO:tasks.workunit.client.0.vm05.stdout:9/363: chown d8/d35/d1c/d20/l37 133 1 2026-03-10T07:50:49.905 INFO:tasks.workunit.client.0.vm05.stdout:1/387: creat da/dd/d2a/f6a x:0 0 0 2026-03-10T07:50:49.906 INFO:tasks.workunit.client.0.vm05.stdout:8/324: dwrite d1/d45/f53 [4194304,4194304] 0 2026-03-10T07:50:49.907 
INFO:tasks.workunit.client.0.vm05.stdout:8/325: stat d1/f15 0 2026-03-10T07:50:49.908 INFO:tasks.workunit.client.0.vm05.stdout:8/326: chown d1/dd/d18/f5c 490 1 2026-03-10T07:50:49.909 INFO:tasks.workunit.client.0.vm05.stdout:8/327: write d1/dd/d18/f38 [1560289,101829] 0 2026-03-10T07:50:49.909 INFO:tasks.workunit.client.0.vm05.stdout:2/451: chown d0/d8/d3d/d7d/l3f 7774247 1 2026-03-10T07:50:49.915 INFO:tasks.workunit.client.0.vm05.stdout:9/364: unlink d8/d35/d1c/d26/d28/c66 0 2026-03-10T07:50:49.915 INFO:tasks.workunit.client.0.vm05.stdout:1/388: rename da/d26/l52 to da/dd/d2a/d55/d68/l6b 0 2026-03-10T07:50:49.925 INFO:tasks.workunit.client.0.vm05.stdout:6/342: truncate d0/d11/d2e/f30 2579277 0 2026-03-10T07:50:49.927 INFO:tasks.workunit.client.0.vm05.stdout:3/355: sync 2026-03-10T07:50:49.927 INFO:tasks.workunit.client.0.vm05.stdout:3/356: read - d8/d1c/d48/f70 zero size 2026-03-10T07:50:49.928 INFO:tasks.workunit.client.0.vm05.stdout:8/328: creat d1/dd/d4d/f63 x:0 0 0 2026-03-10T07:50:49.933 INFO:tasks.workunit.client.0.vm05.stdout:6/343: symlink d0/d35/d36/d43/l6d 0 2026-03-10T07:50:49.933 INFO:tasks.workunit.client.0.vm05.stdout:5/359: sync 2026-03-10T07:50:49.934 INFO:tasks.workunit.client.0.vm05.stdout:5/360: write d2/d5/f1e [2462465,116594] 0 2026-03-10T07:50:49.954 INFO:tasks.workunit.client.0.vm05.stdout:5/361: dread - d2/d20/f51 zero size 2026-03-10T07:50:49.956 INFO:tasks.workunit.client.0.vm05.stdout:5/362: write d2/d5/f18 [2184858,22003] 0 2026-03-10T07:50:49.956 INFO:tasks.workunit.client.0.vm05.stdout:3/357: write d8/f12 [5327095,462] 0 2026-03-10T07:50:49.956 INFO:tasks.workunit.client.0.vm05.stdout:8/329: mkdir d1/dd/d4d/d64 0 2026-03-10T07:50:49.956 INFO:tasks.workunit.client.0.vm05.stdout:2/452: rename d0/d8/d43/df/d25/f46 to d0/d8/d3d/f8d 0 2026-03-10T07:50:49.956 INFO:tasks.workunit.client.0.vm05.stdout:6/344: mkdir d0/d11/d4f/d6e 0 2026-03-10T07:50:49.956 INFO:tasks.workunit.client.0.vm05.stdout:5/363: fdatasync d2/d5/f10 0 
2026-03-10T07:50:49.956 INFO:tasks.workunit.client.0.vm05.stdout:3/358: creat d8/d16/d19/f77 x:0 0 0 2026-03-10T07:50:49.956 INFO:tasks.workunit.client.0.vm05.stdout:8/330: dread d1/dd/d18/f22 [0,4194304] 0 2026-03-10T07:50:49.956 INFO:tasks.workunit.client.0.vm05.stdout:6/345: creat d0/d11/d4f/d56/f6f x:0 0 0 2026-03-10T07:50:49.956 INFO:tasks.workunit.client.0.vm05.stdout:6/346: chown d0/d11/d4f/d56/f6b 80 1 2026-03-10T07:50:49.956 INFO:tasks.workunit.client.0.vm05.stdout:8/331: dread d1/f2c [0,4194304] 0 2026-03-10T07:50:49.956 INFO:tasks.workunit.client.0.vm05.stdout:3/359: mknod d8/d1f/d2a/d34/c78 0 2026-03-10T07:50:49.957 INFO:tasks.workunit.client.0.vm05.stdout:3/360: dread - d8/d1f/d2a/f42 zero size 2026-03-10T07:50:49.957 INFO:tasks.workunit.client.0.vm05.stdout:3/361: fdatasync d8/fe 0 2026-03-10T07:50:49.957 INFO:tasks.workunit.client.0.vm05.stdout:5/364: mknod d2/d20/d7b/c7c 0 2026-03-10T07:50:49.957 INFO:tasks.workunit.client.0.vm05.stdout:8/332: truncate d1/dd/d18/f21 4694812 0 2026-03-10T07:50:49.959 INFO:tasks.workunit.client.0.vm05.stdout:9/365: getdents d8/d35/d1c/d2c/d63 0 2026-03-10T07:50:49.960 INFO:tasks.workunit.client.0.vm05.stdout:5/365: mkdir d2/d20/d33/d53/d7d 0 2026-03-10T07:50:49.962 INFO:tasks.workunit.client.0.vm05.stdout:9/366: write d8/d35/f1f [1393753,19798] 0 2026-03-10T07:50:49.962 INFO:tasks.workunit.client.0.vm05.stdout:9/367: dread - d8/f5e zero size 2026-03-10T07:50:49.964 INFO:tasks.workunit.client.0.vm05.stdout:3/362: link d8/d1f/f6c d8/d1f/f79 0 2026-03-10T07:50:49.964 INFO:tasks.workunit.client.0.vm05.stdout:5/366: write d2/d12/d2d/f7a [2634742,95193] 0 2026-03-10T07:50:49.967 INFO:tasks.workunit.client.0.vm05.stdout:4/383: truncate d0/d6/f32 1040011 0 2026-03-10T07:50:49.971 INFO:tasks.workunit.client.0.vm05.stdout:4/384: write d0/d6/f15 [4208713,95422] 0 2026-03-10T07:50:49.973 INFO:tasks.workunit.client.0.vm05.stdout:3/363: write d8/d22/f29 [1867282,44030] 0 2026-03-10T07:50:49.976 
INFO:tasks.workunit.client.0.vm05.stdout:4/385: mkdir d0/d6/d9/d12/d45/d55/d44/d85 0 2026-03-10T07:50:49.978 INFO:tasks.workunit.client.0.vm05.stdout:8/333: getdents d1/dd/d18/d20 0 2026-03-10T07:50:49.981 INFO:tasks.workunit.client.0.vm05.stdout:3/364: symlink d8/d1f/l7a 0 2026-03-10T07:50:49.983 INFO:tasks.workunit.client.0.vm05.stdout:2/453: sync 2026-03-10T07:50:49.983 INFO:tasks.workunit.client.0.vm05.stdout:6/347: sync 2026-03-10T07:50:49.984 INFO:tasks.workunit.client.0.vm05.stdout:2/454: chown d0/d2a/f2e 25680 1 2026-03-10T07:50:49.985 INFO:tasks.workunit.client.0.vm05.stdout:6/348: dread d0/f26 [0,4194304] 0 2026-03-10T07:50:49.991 INFO:tasks.workunit.client.0.vm05.stdout:6/349: dwrite d0/d11/d4f/d56/f6b [0,4194304] 0 2026-03-10T07:50:49.993 INFO:tasks.workunit.client.0.vm05.stdout:6/350: stat d0/d6/f1d 0 2026-03-10T07:50:50.001 INFO:tasks.workunit.client.0.vm05.stdout:8/334: creat d1/dd/d18/d20/d2a/d48/f65 x:0 0 0 2026-03-10T07:50:50.001 INFO:tasks.workunit.client.0.vm05.stdout:4/386: mknod d0/d6/d9/c86 0 2026-03-10T07:50:50.008 INFO:tasks.workunit.client.0.vm05.stdout:0/335: dwrite d8/dd/d10/d26/d2a/f2e [0,4194304] 0 2026-03-10T07:50:50.015 INFO:tasks.workunit.client.0.vm05.stdout:8/335: dread d1/dd/d18/f38 [0,4194304] 0 2026-03-10T07:50:50.027 INFO:tasks.workunit.client.0.vm05.stdout:9/368: rmdir d8 39 2026-03-10T07:50:50.027 INFO:tasks.workunit.client.0.vm05.stdout:7/386: truncate d1/d3c/f63 361410 0 2026-03-10T07:50:50.027 INFO:tasks.workunit.client.0.vm05.stdout:1/389: write da/dd/d2a/f2f [933726,95383] 0 2026-03-10T07:50:50.032 INFO:tasks.workunit.client.0.vm05.stdout:5/367: write d2/d5/f3d [4428549,40908] 0 2026-03-10T07:50:50.035 INFO:tasks.workunit.client.0.vm05.stdout:3/365: dread d8/d16/f1a [0,4194304] 0 2026-03-10T07:50:50.035 INFO:tasks.workunit.client.0.vm05.stdout:3/366: write d8/d22/d60/f61 [429626,29961] 0 2026-03-10T07:50:50.041 INFO:tasks.workunit.client.0.vm05.stdout:1/390: dread da/d26/d2b/f45 [0,4194304] 0 2026-03-10T07:50:50.042 
INFO:tasks.workunit.client.0.vm05.stdout:1/391: chown da/d26/d2b/c5e 91335868 1 2026-03-10T07:50:50.052 INFO:tasks.workunit.client.0.vm05.stdout:8/336: creat d1/dd/d18/d20/d2a/d34/d49/f66 x:0 0 0 2026-03-10T07:50:50.053 INFO:tasks.workunit.client.0.vm05.stdout:0/336: sync 2026-03-10T07:50:50.060 INFO:tasks.workunit.client.0.vm05.stdout:9/369: sync 2026-03-10T07:50:50.067 INFO:tasks.workunit.client.0.vm05.stdout:6/351: dread d0/d11/d31/f33 [0,4194304] 0 2026-03-10T07:50:50.070 INFO:tasks.workunit.client.0.vm05.stdout:6/352: dread d0/d11/d22/f52 [0,4194304] 0 2026-03-10T07:50:50.071 INFO:tasks.workunit.client.0.vm05.stdout:2/455: symlink d0/d8/d43/df/d8b/l8e 0 2026-03-10T07:50:50.072 INFO:tasks.workunit.client.0.vm05.stdout:2/456: truncate d0/d52/f88 248224 0 2026-03-10T07:50:50.081 INFO:tasks.workunit.client.0.vm05.stdout:7/387: creat d1/d6/f77 x:0 0 0 2026-03-10T07:50:50.082 INFO:tasks.workunit.client.0.vm05.stdout:5/368: fsync d2/d5/fa 0 2026-03-10T07:50:50.099 INFO:tasks.workunit.client.0.vm05.stdout:6/353: truncate d0/f23 7691633 0 2026-03-10T07:50:50.101 INFO:tasks.workunit.client.0.vm05.stdout:2/457: creat d0/d8/d43/df/d8b/f8f x:0 0 0 2026-03-10T07:50:50.102 INFO:tasks.workunit.client.0.vm05.stdout:3/367: write d8/ff [2084719,45599] 0 2026-03-10T07:50:50.104 INFO:tasks.workunit.client.0.vm05.stdout:0/337: dread d8/dd/f3c [0,4194304] 0 2026-03-10T07:50:50.107 INFO:tasks.workunit.client.0.vm05.stdout:7/388: creat d1/d34/d59/f78 x:0 0 0 2026-03-10T07:50:50.108 INFO:tasks.workunit.client.0.vm05.stdout:7/389: stat d1/d6/d47/l4a 0 2026-03-10T07:50:50.108 INFO:tasks.workunit.client.0.vm05.stdout:8/337: dwrite d1/dd/d18/f29 [4194304,4194304] 0 2026-03-10T07:50:50.109 INFO:tasks.workunit.client.0.vm05.stdout:6/354: sync 2026-03-10T07:50:50.109 INFO:tasks.workunit.client.0.vm05.stdout:7/390: chown d1/d5b/f73 1016 1 2026-03-10T07:50:50.112 INFO:tasks.workunit.client.0.vm05.stdout:5/369: unlink d2/c55 0 2026-03-10T07:50:50.115 
INFO:tasks.workunit.client.0.vm05.stdout:5/370: sync 2026-03-10T07:50:50.115 INFO:tasks.workunit.client.0.vm05.stdout:4/387: link d0/d6/d9/d12/d45/d55/c31 d0/d3b/d5c/c87 0 2026-03-10T07:50:50.115 INFO:tasks.workunit.client.0.vm05.stdout:5/371: truncate d2/f42 1319790 0 2026-03-10T07:50:50.116 INFO:tasks.workunit.client.0.vm05.stdout:4/388: chown d0/d20/f63 780155651 1 2026-03-10T07:50:50.117 INFO:tasks.workunit.client.0.vm05.stdout:4/389: dread - d0/d20/d26/f73 zero size 2026-03-10T07:50:50.117 INFO:tasks.workunit.client.0.vm05.stdout:4/390: fsync d0/f24 0 2026-03-10T07:50:50.122 INFO:tasks.workunit.client.0.vm05.stdout:4/391: dwrite d0/d28/f7c [0,4194304] 0 2026-03-10T07:50:50.124 INFO:tasks.workunit.client.0.vm05.stdout:9/370: creat d8/d35/d22/d33/d62/f7d x:0 0 0 2026-03-10T07:50:50.129 INFO:tasks.workunit.client.0.vm05.stdout:4/392: write d0/d6/d9/d12/d45/d55/f5f [903416,27804] 0 2026-03-10T07:50:50.135 INFO:tasks.workunit.client.0.vm05.stdout:0/338: rmdir d8/dd/d10/d26 39 2026-03-10T07:50:50.139 INFO:tasks.workunit.client.0.vm05.stdout:7/391: mkdir d1/d3c/d71/d79 0 2026-03-10T07:50:50.139 INFO:tasks.workunit.client.0.vm05.stdout:7/392: chown d1/d6/c1f 7 1 2026-03-10T07:50:50.141 INFO:tasks.workunit.client.0.vm05.stdout:8/338: dwrite d1/dd/d18/d20/d2a/d48/f59 [0,4194304] 0 2026-03-10T07:50:50.145 INFO:tasks.workunit.client.0.vm05.stdout:1/392: creat da/dd/d12/d19/d20/f6c x:0 0 0 2026-03-10T07:50:50.146 INFO:tasks.workunit.client.0.vm05.stdout:5/372: rename d2/d20/f32 to d2/d20/d33/d53/d7d/f7e 0 2026-03-10T07:50:50.146 INFO:tasks.workunit.client.0.vm05.stdout:0/339: rename d8/dd/d37/d56/d6d to d8/dd/d37/d56/d6d/d70/d76 22 2026-03-10T07:50:50.149 INFO:tasks.workunit.client.0.vm05.stdout:4/393: readlink d0/d6/d9/d12/d45/d55/l43 0 2026-03-10T07:50:50.149 INFO:tasks.workunit.client.0.vm05.stdout:4/394: dread - d0/d6/d9/d12/d45/d55/d44/f74 zero size 2026-03-10T07:50:50.150 INFO:tasks.workunit.client.0.vm05.stdout:3/368: mkdir d8/d16/d52/d7b 0 2026-03-10T07:50:50.151 
INFO:tasks.workunit.client.0.vm05.stdout:3/369: fsync d8/f5d 0 2026-03-10T07:50:50.152 INFO:tasks.workunit.client.0.vm05.stdout:6/355: symlink d0/l70 0 2026-03-10T07:50:50.153 INFO:tasks.workunit.client.0.vm05.stdout:6/356: stat d0/d11/d57/f5c 0 2026-03-10T07:50:50.157 INFO:tasks.workunit.client.0.vm05.stdout:8/339: rename d1/dd/d18/d20/f46 to d1/dd/d4d/d64/f67 0 2026-03-10T07:50:50.159 INFO:tasks.workunit.client.0.vm05.stdout:0/340: truncate d8/dd/d10/d26/d3a/d5e/f71 238387 0 2026-03-10T07:50:50.159 INFO:tasks.workunit.client.0.vm05.stdout:0/341: fsync d8/dd/d34/f3d 0 2026-03-10T07:50:50.160 INFO:tasks.workunit.client.0.vm05.stdout:0/342: stat d8/dd/d10/d26/d2a/f74 0 2026-03-10T07:50:50.164 INFO:tasks.workunit.client.0.vm05.stdout:4/395: creat d0/d6/d9/d5a/f88 x:0 0 0 2026-03-10T07:50:50.166 INFO:tasks.workunit.client.0.vm05.stdout:3/370: creat d8/d16/d19/d37/f7c x:0 0 0 2026-03-10T07:50:50.173 INFO:tasks.workunit.client.0.vm05.stdout:5/373: creat d2/d20/d77/f7f x:0 0 0 2026-03-10T07:50:50.175 INFO:tasks.workunit.client.0.vm05.stdout:0/343: chown d8/dd/d37/d56/f62 525 1 2026-03-10T07:50:50.176 INFO:tasks.workunit.client.0.vm05.stdout:0/344: fdatasync d8/dd/d34/f5b 0 2026-03-10T07:50:50.177 INFO:tasks.workunit.client.0.vm05.stdout:0/345: read d8/f1c [2611757,105183] 0 2026-03-10T07:50:50.177 INFO:tasks.workunit.client.0.vm05.stdout:5/374: dwrite d2/d20/f2a [0,4194304] 0 2026-03-10T07:50:50.184 INFO:tasks.workunit.client.0.vm05.stdout:7/393: creat d1/d34/f7a x:0 0 0 2026-03-10T07:50:50.184 INFO:tasks.workunit.client.0.vm05.stdout:7/394: write d1/d6/f1d [1461890,50410] 0 2026-03-10T07:50:50.188 INFO:tasks.workunit.client.0.vm05.stdout:6/357: link d0/d11/d22/f4c d0/d35/d36/f71 0 2026-03-10T07:50:50.192 INFO:tasks.workunit.client.0.vm05.stdout:8/340: creat d1/dd/d18/d20/d2a/d34/d49/d5d/f68 x:0 0 0 2026-03-10T07:50:50.204 INFO:tasks.workunit.client.0.vm05.stdout:5/375: symlink d2/d20/d33/d72/l80 0 2026-03-10T07:50:50.204 INFO:tasks.workunit.client.0.vm05.stdout:5/376: 
write d2/d12/d4d/f5d [2499719,120399] 0 2026-03-10T07:50:50.204 INFO:tasks.workunit.client.0.vm05.stdout:4/396: mkdir d0/d6/d9/d12/d45/d55/d44/d85/d89 0 2026-03-10T07:50:50.204 INFO:tasks.workunit.client.0.vm05.stdout:4/397: write d0/d6/d9/f67 [606926,98921] 0 2026-03-10T07:50:50.204 INFO:tasks.workunit.client.0.vm05.stdout:4/398: write d0/d6/d9/d12/d45/f66 [342956,83388] 0 2026-03-10T07:50:50.204 INFO:tasks.workunit.client.0.vm05.stdout:4/399: write d0/d6/d9/d12/d45/d55/f7d [896716,41671] 0 2026-03-10T07:50:50.205 INFO:tasks.workunit.client.0.vm05.stdout:6/358: sync 2026-03-10T07:50:50.213 INFO:tasks.workunit.client.0.vm05.stdout:1/393: link da/dd/d12/l39 da/dd/d2a/d55/d64/l6d 0 2026-03-10T07:50:50.223 INFO:tasks.workunit.client.0.vm05.stdout:2/458: write d0/f22 [2274150,6289] 0 2026-03-10T07:50:50.225 INFO:tasks.workunit.client.0.vm05.stdout:5/377: symlink d2/d20/d77/l81 0 2026-03-10T07:50:50.230 INFO:tasks.workunit.client.0.vm05.stdout:8/341: dread d1/fa [4194304,4194304] 0 2026-03-10T07:50:50.233 INFO:tasks.workunit.client.0.vm05.stdout:9/371: truncate d8/fa 1212155 0 2026-03-10T07:50:50.236 INFO:tasks.workunit.client.0.vm05.stdout:6/359: read d0/d11/d2e/f30 [337339,83604] 0 2026-03-10T07:50:50.238 INFO:tasks.workunit.client.0.vm05.stdout:7/395: link d1/d34/d59/f72 d1/d6/d47/f7b 0 2026-03-10T07:50:50.240 INFO:tasks.workunit.client.0.vm05.stdout:1/394: chown da/dd/d12/d34/l37 3 1 2026-03-10T07:50:50.244 INFO:tasks.workunit.client.0.vm05.stdout:5/378: creat d2/d20/d33/d53/d7d/f82 x:0 0 0 2026-03-10T07:50:50.244 INFO:tasks.workunit.client.0.vm05.stdout:5/379: dread - d2/d12/f24 zero size 2026-03-10T07:50:50.246 INFO:tasks.workunit.client.0.vm05.stdout:4/400: link d0/d6/d9/f67 d0/d6/d9/d12/d65/f8a 0 2026-03-10T07:50:50.247 INFO:tasks.workunit.client.0.vm05.stdout:8/342: mknod d1/dd/d18/c69 0 2026-03-10T07:50:50.248 INFO:tasks.workunit.client.0.vm05.stdout:9/372: mknod d8/d35/d1c/d36/c7e 0 2026-03-10T07:50:50.249 INFO:tasks.workunit.client.0.vm05.stdout:3/371: 
getdents d8/d22/d60 0 2026-03-10T07:50:50.251 INFO:tasks.workunit.client.0.vm05.stdout:1/395: chown da/dd/d12/c48 3392210 1 2026-03-10T07:50:50.253 INFO:tasks.workunit.client.0.vm05.stdout:0/346: rename d8/dd/d37/d56/l5a to d8/dd/d37/d56/l77 0 2026-03-10T07:50:50.264 INFO:tasks.workunit.client.0.vm05.stdout:4/401: fdatasync d0/d6/d9/f67 0 2026-03-10T07:50:50.267 INFO:tasks.workunit.client.0.vm05.stdout:9/373: symlink d8/d35/d22/d33/l7f 0 2026-03-10T07:50:50.267 INFO:tasks.workunit.client.0.vm05.stdout:9/374: chown d8/d35/d1c/l65 933 1 2026-03-10T07:50:50.269 INFO:tasks.workunit.client.0.vm05.stdout:3/372: mkdir d8/d1f/d2a/d4a/d7d 0 2026-03-10T07:50:50.270 INFO:tasks.workunit.client.0.vm05.stdout:3/373: write d8/d16/f4c [1009692,99407] 0 2026-03-10T07:50:50.275 INFO:tasks.workunit.client.0.vm05.stdout:1/396: rmdir da/d26 39 2026-03-10T07:50:50.278 INFO:tasks.workunit.client.0.vm05.stdout:2/459: creat d0/d8/d43/f90 x:0 0 0 2026-03-10T07:50:50.278 INFO:tasks.workunit.client.0.vm05.stdout:2/460: truncate d0/d8/d43/f90 249101 0 2026-03-10T07:50:50.280 INFO:tasks.workunit.client.0.vm05.stdout:2/461: dread d0/d8/d43/df/f21 [8388608,4194304] 0 2026-03-10T07:50:50.281 INFO:tasks.workunit.client.0.vm05.stdout:6/360: write d0/d35/d36/f59 [5291875,99258] 0 2026-03-10T07:50:50.281 INFO:tasks.workunit.client.0.vm05.stdout:2/462: read d0/d8/d3d/f40 [3626318,71342] 0 2026-03-10T07:50:50.285 INFO:tasks.workunit.client.0.vm05.stdout:6/361: dwrite d0/d11/d4f/d56/f6f [0,4194304] 0 2026-03-10T07:50:50.287 INFO:tasks.workunit.client.0.vm05.stdout:6/362: chown d0/d6/f16 310398 1 2026-03-10T07:50:50.296 INFO:tasks.workunit.client.0.vm05.stdout:9/375: symlink d8/d35/d1c/d26/d28/l80 0 2026-03-10T07:50:50.296 INFO:tasks.workunit.client.0.vm05.stdout:9/376: dread - d8/d35/d1c/f52 zero size 2026-03-10T07:50:50.297 INFO:tasks.workunit.client.0.vm05.stdout:9/377: truncate d8/d35/d1c/d20/d54/f7b 339175 0 2026-03-10T07:50:50.299 INFO:tasks.workunit.client.0.vm05.stdout:9/378: truncate 
d8/d35/d1c/d26/d28/d79/f44 35753 0 2026-03-10T07:50:50.300 INFO:tasks.workunit.client.0.vm05.stdout:7/396: rename d1/f21 to d1/d34/f7c 0 2026-03-10T07:50:50.301 INFO:tasks.workunit.client.0.vm05.stdout:7/397: truncate d1/d6/d3b/f42 137119 0 2026-03-10T07:50:50.302 INFO:tasks.workunit.client.0.vm05.stdout:1/397: creat da/dd/d12/d19/d20/f6e x:0 0 0 2026-03-10T07:50:50.303 INFO:tasks.workunit.client.0.vm05.stdout:0/347: symlink d8/dd/d10/d26/d3a/d5e/d63/l78 0 2026-03-10T07:50:50.304 INFO:tasks.workunit.client.0.vm05.stdout:0/348: read d8/dd/d10/f19 [2742569,65866] 0 2026-03-10T07:50:50.307 INFO:tasks.workunit.client.0.vm05.stdout:6/363: creat d0/d11/d31/f72 x:0 0 0 2026-03-10T07:50:50.309 INFO:tasks.workunit.client.0.vm05.stdout:6/364: fdatasync d0/d11/f58 0 2026-03-10T07:50:50.309 INFO:tasks.workunit.client.0.vm05.stdout:6/365: chown d0/d35/l4d 28785 1 2026-03-10T07:50:50.309 INFO:tasks.workunit.client.0.vm05.stdout:3/374: creat d8/d1f/d2a/d4a/d7d/f7e x:0 0 0 2026-03-10T07:50:50.313 INFO:tasks.workunit.client.0.vm05.stdout:1/398: creat da/dd/d2a/f6f x:0 0 0 2026-03-10T07:50:50.313 INFO:tasks.workunit.client.0.vm05.stdout:1/399: chown da/dd/d42 461 1 2026-03-10T07:50:50.317 INFO:tasks.workunit.client.0.vm05.stdout:5/380: dwrite d2/f15 [0,4194304] 0 2026-03-10T07:50:50.323 INFO:tasks.workunit.client.0.vm05.stdout:4/402: write d0/d6/d9/d12/f36 [2413476,129397] 0 2026-03-10T07:50:50.323 INFO:tasks.workunit.client.0.vm05.stdout:4/403: fsync d0/d6/d9/f67 0 2026-03-10T07:50:50.332 INFO:tasks.workunit.client.0.vm05.stdout:8/343: getdents d1/dd/d18 0 2026-03-10T07:50:50.334 INFO:tasks.workunit.client.0.vm05.stdout:8/344: stat d1/d45 0 2026-03-10T07:50:50.335 INFO:tasks.workunit.client.0.vm05.stdout:8/345: dwrite d1/dd/f17 [0,4194304] 0 2026-03-10T07:50:50.336 INFO:tasks.workunit.client.0.vm05.stdout:8/346: dread - d1/dd/d18/d20/d2a/d48/f65 zero size 2026-03-10T07:50:50.343 INFO:tasks.workunit.client.0.vm05.stdout:9/379: mkdir d8/d35/d38/d71/d81 0 2026-03-10T07:50:50.343 
INFO:tasks.workunit.client.0.vm05.stdout:1/400: rmdir da/dd/d12 39 2026-03-10T07:50:50.345 INFO:tasks.workunit.client.0.vm05.stdout:5/381: creat d2/d20/d7b/f83 x:0 0 0 2026-03-10T07:50:50.347 INFO:tasks.workunit.client.0.vm05.stdout:4/404: chown d0/d6/c18 14246 1 2026-03-10T07:50:50.352 INFO:tasks.workunit.client.0.vm05.stdout:0/349: mknod d8/dd/d37/d56/d6d/c79 0 2026-03-10T07:50:50.352 INFO:tasks.workunit.client.0.vm05.stdout:8/347: mkdir d1/dd/d4d/d64/d6a 0 2026-03-10T07:50:50.352 INFO:tasks.workunit.client.0.vm05.stdout:7/398: rename d1/d5b/f76 to d1/d34/f7d 0 2026-03-10T07:50:50.354 INFO:tasks.workunit.client.0.vm05.stdout:7/399: dwrite d1/d6/f32 [4194304,4194304] 0 2026-03-10T07:50:50.356 INFO:tasks.workunit.client.0.vm05.stdout:1/401: mkdir da/dd/d2a/d70 0 2026-03-10T07:50:50.359 INFO:tasks.workunit.client.0.vm05.stdout:0/350: mkdir d8/dd/d7a 0 2026-03-10T07:50:50.362 INFO:tasks.workunit.client.0.vm05.stdout:2/463: truncate d0/d8/d43/f1d 1547351 0 2026-03-10T07:50:50.368 INFO:tasks.workunit.client.0.vm05.stdout:9/380: symlink d8/d35/d1c/l82 0 2026-03-10T07:50:50.370 INFO:tasks.workunit.client.0.vm05.stdout:7/400: symlink d1/d6/d47/l7e 0 2026-03-10T07:50:50.370 INFO:tasks.workunit.client.0.vm05.stdout:9/381: read d8/d35/d22/f3f [214267,72383] 0 2026-03-10T07:50:50.371 INFO:tasks.workunit.client.0.vm05.stdout:8/348: truncate d1/d45/f55 3199225 0 2026-03-10T07:50:50.371 INFO:tasks.workunit.client.0.vm05.stdout:9/382: chown d8/d35/d1c/d26/d28/d79/d57/c6f 0 1 2026-03-10T07:50:50.372 INFO:tasks.workunit.client.0.vm05.stdout:8/349: chown d1/dd/d18/d20/d2a/d34/d49 7 1 2026-03-10T07:50:50.373 INFO:tasks.workunit.client.0.vm05.stdout:3/375: getdents d8/d1c 0 2026-03-10T07:50:50.373 INFO:tasks.workunit.client.0.vm05.stdout:0/351: rmdir d8/dd/d10/d26/d3a 39 2026-03-10T07:50:50.373 INFO:tasks.workunit.client.0.vm05.stdout:2/464: creat d0/d52/f91 x:0 0 0 2026-03-10T07:50:50.374 INFO:tasks.workunit.client.0.vm05.stdout:3/376: fsync d8/d16/d19/f21 0 2026-03-10T07:50:50.375 
INFO:tasks.workunit.client.0.vm05.stdout:4/405: rmdir d0/d6/d9/d12/d45/d55/d44/d85/d89 0 2026-03-10T07:50:50.377 INFO:tasks.workunit.client.0.vm05.stdout:7/401: readlink d1/l51 0 2026-03-10T07:50:50.377 INFO:tasks.workunit.client.0.vm05.stdout:0/352: dwrite d8/dd/d10/d26/d2a/f2e [0,4194304] 0 2026-03-10T07:50:50.388 INFO:tasks.workunit.client.0.vm05.stdout:9/383: rename d8/d35/f21 to d8/d35/d38/d71/d81/f83 0 2026-03-10T07:50:50.398 INFO:tasks.workunit.client.0.vm05.stdout:3/377: creat d8/d22/d60/d58/f7f x:0 0 0 2026-03-10T07:50:50.403 INFO:tasks.workunit.client.0.vm05.stdout:7/402: mkdir d1/d6/d3b/d7f 0 2026-03-10T07:50:50.407 INFO:tasks.workunit.client.0.vm05.stdout:7/403: dwrite d1/d34/d59/f64 [0,4194304] 0 2026-03-10T07:50:50.408 INFO:tasks.workunit.client.0.vm05.stdout:7/404: dread d1/d6/d3b/f42 [0,4194304] 0 2026-03-10T07:50:50.411 INFO:tasks.workunit.client.0.vm05.stdout:6/366: truncate d0/d35/d36/d43/f47 4066142 0 2026-03-10T07:50:50.414 INFO:tasks.workunit.client.0.vm05.stdout:8/350: link d1/dd/d18/d20/f5b d1/dd/d5e/f6b 0 2026-03-10T07:50:50.417 INFO:tasks.workunit.client.0.vm05.stdout:6/367: dwrite d0/d6/f45 [0,4194304] 0 2026-03-10T07:50:50.421 INFO:tasks.workunit.client.0.vm05.stdout:2/465: rename d0/c59 to d0/d8/d43/d38/c92 0 2026-03-10T07:50:50.426 INFO:tasks.workunit.client.0.vm05.stdout:2/466: stat d0/d8/d43/df/f18 0 2026-03-10T07:50:50.426 INFO:tasks.workunit.client.0.vm05.stdout:2/467: dwrite d0/d8/d43/df/d4d/f57 [0,4194304] 0 2026-03-10T07:50:50.436 INFO:tasks.workunit.client.0.vm05.stdout:9/384: creat d8/d35/d1c/d26/d28/f84 x:0 0 0 2026-03-10T07:50:50.436 INFO:tasks.workunit.client.0.vm05.stdout:5/382: write d2/d5/f23 [930578,52065] 0 2026-03-10T07:50:50.440 INFO:tasks.workunit.client.0.vm05.stdout:5/383: dwrite d2/d20/d77/f7f [0,4194304] 0 2026-03-10T07:50:50.456 INFO:tasks.workunit.client.0.vm05.stdout:8/351: rmdir d1/dd/d18/d20/d2a/d34/d49/d5d 39 2026-03-10T07:50:50.456 INFO:tasks.workunit.client.0.vm05.stdout:6/368: creat d0/d11/d4f/d56/f73 
x:0 0 0 2026-03-10T07:50:50.457 INFO:tasks.workunit.client.0.vm05.stdout:6/369: dread - d0/d11/d4f/d56/f73 zero size 2026-03-10T07:50:50.458 INFO:tasks.workunit.client.0.vm05.stdout:2/468: creat d0/d8/d43/df/d4d/f93 x:0 0 0 2026-03-10T07:50:50.459 INFO:tasks.workunit.client.0.vm05.stdout:9/385: mkdir d8/d35/d22/d33/d85 0 2026-03-10T07:50:50.460 INFO:tasks.workunit.client.0.vm05.stdout:6/370: unlink d0/c48 0 2026-03-10T07:50:50.461 INFO:tasks.workunit.client.0.vm05.stdout:6/371: chown d0/d11/d22/f4c 17 1 2026-03-10T07:50:50.468 INFO:tasks.workunit.client.0.vm05.stdout:5/384: rmdir d2/d12/d2d/d62 0 2026-03-10T07:50:50.469 INFO:tasks.workunit.client.0.vm05.stdout:5/385: write d2/d5/f3d [260757,83735] 0 2026-03-10T07:50:50.471 INFO:tasks.workunit.client.0.vm05.stdout:9/386: dread d8/d35/f51 [0,4194304] 0 2026-03-10T07:50:50.472 INFO:tasks.workunit.client.0.vm05.stdout:7/405: getdents d1/d34/d59 0 2026-03-10T07:50:50.475 INFO:tasks.workunit.client.0.vm05.stdout:3/378: sync 2026-03-10T07:50:50.475 INFO:tasks.workunit.client.0.vm05.stdout:7/406: dwrite d1/f3a [0,4194304] 0 2026-03-10T07:50:50.482 INFO:tasks.workunit.client.0.vm05.stdout:8/352: sync 2026-03-10T07:50:50.483 INFO:tasks.workunit.client.0.vm05.stdout:8/353: fdatasync d1/dd/d18/d20/d2a/d34/d49/f56 0 2026-03-10T07:50:50.492 INFO:tasks.workunit.client.0.vm05.stdout:6/372: creat d0/d11/d57/d60/f74 x:0 0 0 2026-03-10T07:50:50.496 INFO:tasks.workunit.client.0.vm05.stdout:5/386: creat d2/d12/d4d/f84 x:0 0 0 2026-03-10T07:50:50.500 INFO:tasks.workunit.client.0.vm05.stdout:9/387: dread d8/d35/f48 [0,4194304] 0 2026-03-10T07:50:50.507 INFO:tasks.workunit.client.0.vm05.stdout:1/402: truncate da/dd/d12/d34/f38 2910287 0 2026-03-10T07:50:50.512 INFO:tasks.workunit.client.0.vm05.stdout:3/379: mknod d8/d22/d60/d58/c80 0 2026-03-10T07:50:50.514 INFO:tasks.workunit.client.0.vm05.stdout:6/373: unlink d0/d11/d31/f72 0 2026-03-10T07:50:50.514 INFO:tasks.workunit.client.0.vm05.stdout:6/374: chown d0/d6/f24 2702 1 
2026-03-10T07:50:50.514 INFO:tasks.workunit.client.0.vm05.stdout:1/403: dwrite da/f43 [0,4194304] 0 2026-03-10T07:50:50.518 INFO:tasks.workunit.client.0.vm05.stdout:3/380: sync 2026-03-10T07:50:50.524 INFO:tasks.workunit.client.0.vm05.stdout:5/387: dread d2/f9 [0,4194304] 0 2026-03-10T07:50:50.527 INFO:tasks.workunit.client.0.vm05.stdout:5/388: dwrite d2/d5/f10 [4194304,4194304] 0 2026-03-10T07:50:50.533 INFO:tasks.workunit.client.0.vm05.stdout:8/354: mknod d1/dd/d18/d20/d2a/d34/d49/d5d/c6c 0 2026-03-10T07:50:50.537 INFO:tasks.workunit.client.0.vm05.stdout:8/355: dwrite d1/dd/d4d/f63 [0,4194304] 0 2026-03-10T07:50:50.541 INFO:tasks.workunit.client.0.vm05.stdout:8/356: dread d1/dd/d18/f29 [4194304,4194304] 0 2026-03-10T07:50:50.542 INFO:tasks.workunit.client.0.vm05.stdout:8/357: read d1/dd/d4d/f63 [1758415,109005] 0 2026-03-10T07:50:50.542 INFO:tasks.workunit.client.0.vm05.stdout:6/375: dwrite d0/d11/d31/f33 [4194304,4194304] 0 2026-03-10T07:50:50.549 INFO:tasks.workunit.client.0.vm05.stdout:2/469: link d0/f6 d0/f94 0 2026-03-10T07:50:50.549 INFO:tasks.workunit.client.0.vm05.stdout:3/381: dread d8/d1c/f63 [0,4194304] 0 2026-03-10T07:50:50.551 INFO:tasks.workunit.client.0.vm05.stdout:8/358: dwrite d1/dd/d18/f3f [4194304,4194304] 0 2026-03-10T07:50:50.553 INFO:tasks.workunit.client.0.vm05.stdout:5/389: mknod d2/d20/d5b/c85 0 2026-03-10T07:50:50.557 INFO:tasks.workunit.client.0.vm05.stdout:6/376: dwrite d0/d11/d4f/d56/f6b [0,4194304] 0 2026-03-10T07:50:50.558 INFO:tasks.workunit.client.0.vm05.stdout:1/404: mkdir da/d26/d2b/d71 0 2026-03-10T07:50:50.559 INFO:tasks.workunit.client.0.vm05.stdout:6/377: dread - d0/d11/d4f/d56/f73 zero size 2026-03-10T07:50:50.567 INFO:tasks.workunit.client.0.vm05.stdout:6/378: dwrite d0/d11/d57/f5c [0,4194304] 0 2026-03-10T07:50:50.578 INFO:tasks.workunit.client.0.vm05.stdout:1/405: mknod da/d26/c72 0 2026-03-10T07:50:50.583 INFO:tasks.workunit.client.0.vm05.stdout:8/359: creat d1/dd/d18/d20/d2a/d48/d5a/f6d x:0 0 0 2026-03-10T07:50:50.584 
INFO:tasks.workunit.client.0.vm05.stdout:2/470: link d0/d8/d43/df/d4d/f57 d0/d8/d43/df/d25/f95 0 2026-03-10T07:50:50.591 INFO:tasks.workunit.client.0.vm05.stdout:5/390: mkdir d2/d20/d33/d86 0 2026-03-10T07:50:50.596 INFO:tasks.workunit.client.0.vm05.stdout:6/379: creat d0/d11/d57/d66/f75 x:0 0 0 2026-03-10T07:50:50.596 INFO:tasks.workunit.client.0.vm05.stdout:6/380: write d0/d11/d57/f5f [1967349,120305] 0 2026-03-10T07:50:50.596 INFO:tasks.workunit.client.0.vm05.stdout:2/471: symlink d0/d8/d43/df/l96 0 2026-03-10T07:50:50.596 INFO:tasks.workunit.client.0.vm05.stdout:6/381: write d0/d11/d31/f63 [343979,99830] 0 2026-03-10T07:50:50.604 INFO:tasks.workunit.client.0.vm05.stdout:6/382: symlink d0/l76 0 2026-03-10T07:50:50.604 INFO:tasks.workunit.client.0.vm05.stdout:6/383: fdatasync d0/d11/d57/d60/f74 0 2026-03-10T07:50:50.605 INFO:tasks.workunit.client.0.vm05.stdout:6/384: write d0/d11/d22/f2b [2026400,105872] 0 2026-03-10T07:50:50.613 INFO:tasks.workunit.client.0.vm05.stdout:6/385: dread d0/d6/f1a [0,4194304] 0 2026-03-10T07:50:50.617 INFO:tasks.workunit.client.0.vm05.stdout:6/386: getdents d0/d6/d3b 0 2026-03-10T07:50:50.685 INFO:tasks.workunit.client.0.vm05.stdout:6/387: dread d0/d11/f21 [0,4194304] 0 2026-03-10T07:50:50.694 INFO:tasks.workunit.client.0.vm05.stdout:6/388: getdents d0/d35/d36 0 2026-03-10T07:50:50.694 INFO:tasks.workunit.client.0.vm05.stdout:6/389: truncate d0/d11/d22/f4c 110137 0 2026-03-10T07:50:50.695 INFO:tasks.workunit.client.0.vm05.stdout:6/390: readlink d0/d11/l25 0 2026-03-10T07:50:50.700 INFO:tasks.workunit.client.0.vm05.stdout:4/406: dwrite d0/d6/f32 [0,4194304] 0 2026-03-10T07:50:50.701 INFO:tasks.workunit.client.0.vm05.stdout:4/407: dread - d0/d20/d26/f73 zero size 2026-03-10T07:50:50.703 INFO:tasks.workunit.client.0.vm05.stdout:6/391: unlink d0/d11/d31/f33 0 2026-03-10T07:50:50.705 INFO:tasks.workunit.client.0.vm05.stdout:0/353: truncate d8/dd/d10/d26/d2a/f2b 5175281 0 2026-03-10T07:50:50.710 
INFO:tasks.workunit.client.0.vm05.stdout:6/392: symlink d0/d6/d3b/l77 0 2026-03-10T07:50:50.718 INFO:tasks.workunit.client.0.vm05.stdout:0/354: creat d8/dd/d10/d26/d3a/d5e/f7b x:0 0 0 2026-03-10T07:50:50.718 INFO:tasks.workunit.client.0.vm05.stdout:0/355: readlink d8/dd/d10/d26/d3a/d5e/d63/l78 0 2026-03-10T07:50:50.723 INFO:tasks.workunit.client.0.vm05.stdout:6/393: link d0/l34 d0/d11/d4f/l78 0 2026-03-10T07:50:50.728 INFO:tasks.workunit.client.0.vm05.stdout:0/356: getdents d8/dd/d37/d56 0 2026-03-10T07:50:50.729 INFO:tasks.workunit.client.0.vm05.stdout:0/357: stat d8/dd/f22 0 2026-03-10T07:50:50.730 INFO:tasks.workunit.client.0.vm05.stdout:0/358: chown d8/dd/c1b 465011822 1 2026-03-10T07:50:50.736 INFO:tasks.workunit.client.0.vm05.stdout:4/408: symlink d0/d6/d9/d12/d45/d55/l8b 0 2026-03-10T07:50:50.737 INFO:tasks.workunit.client.0.vm05.stdout:4/409: truncate d0/d6/d9/d12/d45/d55/f5f 1643463 0 2026-03-10T07:50:50.737 INFO:tasks.workunit.client.0.vm05.stdout:4/410: write d0/d3b/f4a [1171552,46333] 0 2026-03-10T07:50:50.740 INFO:tasks.workunit.client.0.vm05.stdout:5/391: dread d2/d12/f5a [0,4194304] 0 2026-03-10T07:50:50.741 INFO:tasks.workunit.client.0.vm05.stdout:5/392: read d2/d5/f10 [3270160,43337] 0 2026-03-10T07:50:50.741 INFO:tasks.workunit.client.0.vm05.stdout:5/393: fsync d2/d12/d2d/d4a/f59 0 2026-03-10T07:50:50.744 INFO:tasks.workunit.client.0.vm05.stdout:4/411: mkdir d0/d6/d9/d8c 0 2026-03-10T07:50:50.765 INFO:tasks.workunit.client.0.vm05.stdout:4/412: sync 2026-03-10T07:50:50.765 INFO:tasks.workunit.client.0.vm05.stdout:6/394: dread d0/d35/d36/d43/f47 [0,4194304] 0 2026-03-10T07:50:50.768 INFO:tasks.workunit.client.0.vm05.stdout:4/413: rmdir d0/d6/d37 39 2026-03-10T07:50:50.769 INFO:tasks.workunit.client.0.vm05.stdout:4/414: fdatasync d0/d6/d9/d5a/f58 0 2026-03-10T07:50:50.769 INFO:tasks.workunit.client.0.vm05.stdout:4/415: read d0/d6/f32 [3538212,106457] 0 2026-03-10T07:50:50.770 INFO:tasks.workunit.client.0.vm05.stdout:6/395: creat d0/d11/d57/d66/f79 
x:0 0 0 2026-03-10T07:50:50.771 INFO:tasks.workunit.client.0.vm05.stdout:6/396: dread - d0/d11/d57/d66/f79 zero size 2026-03-10T07:50:50.773 INFO:tasks.workunit.client.0.vm05.stdout:4/416: sync 2026-03-10T07:50:50.777 INFO:tasks.workunit.client.0.vm05.stdout:6/397: creat d0/d11/d57/f7a x:0 0 0 2026-03-10T07:50:50.779 INFO:tasks.workunit.client.0.vm05.stdout:4/417: creat d0/d6/d9/d12/d45/d55/d4e/f8d x:0 0 0 2026-03-10T07:50:50.782 INFO:tasks.workunit.client.0.vm05.stdout:2/472: chown d0/d8/d43/f1d 9943 1 2026-03-10T07:50:50.788 INFO:tasks.workunit.client.0.vm05.stdout:6/398: creat d0/d11/d57/d66/f7b x:0 0 0 2026-03-10T07:50:50.789 INFO:tasks.workunit.client.0.vm05.stdout:4/418: truncate d0/d20/d26/f40 2687320 0 2026-03-10T07:50:50.792 INFO:tasks.workunit.client.0.vm05.stdout:2/473: creat d0/d8/d43/df/f97 x:0 0 0 2026-03-10T07:50:50.793 INFO:tasks.workunit.client.0.vm05.stdout:6/399: mknod d0/d11/d2e/c7c 0 2026-03-10T07:50:50.796 INFO:tasks.workunit.client.0.vm05.stdout:4/419: symlink d0/d6/d9/d12/d45/d55/d44/d85/l8e 0 2026-03-10T07:50:50.797 INFO:tasks.workunit.client.0.vm05.stdout:6/400: dwrite d0/d6/f45 [0,4194304] 0 2026-03-10T07:50:50.802 INFO:tasks.workunit.client.0.vm05.stdout:4/420: sync 2026-03-10T07:50:50.802 INFO:tasks.workunit.client.0.vm05.stdout:4/421: chown d0/d6/f15 3719 1 2026-03-10T07:50:50.808 INFO:tasks.workunit.client.0.vm05.stdout:7/407: mkdir d1/d6/d80 0 2026-03-10T07:50:50.809 INFO:tasks.workunit.client.0.vm05.stdout:7/408: chown d1/l2b 90538 1 2026-03-10T07:50:50.810 INFO:tasks.workunit.client.0.vm05.stdout:6/401: mkdir d0/d11/d4f/d7d 0 2026-03-10T07:50:50.811 INFO:tasks.workunit.client.0.vm05.stdout:6/402: stat d0/c3f 0 2026-03-10T07:50:50.815 INFO:tasks.workunit.client.0.vm05.stdout:6/403: creat d0/d11/d4f/f7e x:0 0 0 2026-03-10T07:50:50.820 INFO:tasks.workunit.client.0.vm05.stdout:6/404: creat d0/d11/d2e/f7f x:0 0 0 2026-03-10T07:50:50.821 INFO:tasks.workunit.client.0.vm05.stdout:2/474: link d0/d8/d43/c6f d0/d8/d43/c98 0 
2026-03-10T07:50:50.822 INFO:tasks.workunit.client.0.vm05.stdout:4/422: creat d0/d6/d9/f8f x:0 0 0 2026-03-10T07:50:50.823 INFO:tasks.workunit.client.0.vm05.stdout:6/405: write d0/d11/f21 [5034216,86138] 0 2026-03-10T07:50:50.824 INFO:tasks.workunit.client.0.vm05.stdout:4/423: rmdir d0 39 2026-03-10T07:50:50.827 INFO:tasks.workunit.client.0.vm05.stdout:4/424: mknod d0/d6/d9/d12/d45/d55/d44/d85/c90 0 2026-03-10T07:50:50.829 INFO:tasks.workunit.client.0.vm05.stdout:4/425: mkdir d0/d6/d9/d5a/d91 0 2026-03-10T07:50:50.832 INFO:tasks.workunit.client.0.vm05.stdout:4/426: mknod d0/d6/d37/c92 0 2026-03-10T07:50:50.833 INFO:tasks.workunit.client.0.vm05.stdout:0/359: dwrite d8/f1c [0,4194304] 0 2026-03-10T07:50:50.846 INFO:tasks.workunit.client.0.vm05.stdout:0/360: unlink d8/dd/d37/d56/l77 0 2026-03-10T07:50:50.847 INFO:tasks.workunit.client.0.vm05.stdout:0/361: fsync d8/dd/d34/f3e 0 2026-03-10T07:50:50.847 INFO:tasks.workunit.client.0.vm05.stdout:4/427: dread d0/d6/f39 [0,4194304] 0 2026-03-10T07:50:50.848 INFO:tasks.workunit.client.0.vm05.stdout:0/362: write d8/dd/d10/d26/d2a/f6e [649282,46886] 0 2026-03-10T07:50:50.848 INFO:tasks.workunit.client.0.vm05.stdout:4/428: fsync d0/d28/f7c 0 2026-03-10T07:50:50.849 INFO:tasks.workunit.client.0.vm05.stdout:4/429: truncate d0/d20/d26/f73 21209 0 2026-03-10T07:50:50.852 INFO:tasks.workunit.client.0.vm05.stdout:0/363: mknod d8/dd/d10/d26/d2a/c7c 0 2026-03-10T07:50:50.856 INFO:tasks.workunit.client.0.vm05.stdout:4/430: dwrite d0/d6/f32 [0,4194304] 0 2026-03-10T07:50:50.859 INFO:tasks.workunit.client.0.vm05.stdout:1/406: symlink da/dd/d12/l73 0 2026-03-10T07:50:50.859 INFO:tasks.workunit.client.0.vm05.stdout:9/388: rename d8/d35/d1c/d26 to d8/d86 0 2026-03-10T07:50:50.859 INFO:tasks.workunit.client.0.vm05.stdout:1/407: readlink da/dd/d2a/l53 0 2026-03-10T07:50:50.867 INFO:tasks.workunit.client.0.vm05.stdout:1/408: dwrite da/f5c [0,4194304] 0 2026-03-10T07:50:50.875 INFO:tasks.workunit.client.0.vm05.stdout:6/406: dread 
d0/d35/d36/d43/f47 [0,4194304] 0 2026-03-10T07:50:50.878 INFO:tasks.workunit.client.0.vm05.stdout:6/407: dwrite d0/d35/d36/f5b [0,4194304] 0 2026-03-10T07:50:50.879 INFO:tasks.workunit.client.0.vm05.stdout:6/408: stat d0/d11/d2e/c42 0 2026-03-10T07:50:50.881 INFO:tasks.workunit.client.0.vm05.stdout:6/409: read d0/d35/f41 [1020906,36582] 0 2026-03-10T07:50:50.888 INFO:tasks.workunit.client.0.vm05.stdout:3/382: rename d8/d1f/d2a/f32 to d8/d16/d19/f81 0 2026-03-10T07:50:50.892 INFO:tasks.workunit.client.0.vm05.stdout:9/389: dread d8/f14 [0,4194304] 0 2026-03-10T07:50:50.893 INFO:tasks.workunit.client.0.vm05.stdout:9/390: dread d8/d35/d22/d33/f5b [0,4194304] 0 2026-03-10T07:50:50.894 INFO:tasks.workunit.client.0.vm05.stdout:9/391: dread d8/f9 [4194304,4194304] 0 2026-03-10T07:50:50.897 INFO:tasks.workunit.client.0.vm05.stdout:1/409: unlink da/dd/d12/d19/d20/f62 0 2026-03-10T07:50:50.898 INFO:tasks.workunit.client.0.vm05.stdout:9/392: dwrite d8/d35/d1c/d20/f32 [0,4194304] 0 2026-03-10T07:50:50.899 INFO:tasks.workunit.client.0.vm05.stdout:9/393: read d8/d35/d22/f6a [269085,49404] 0 2026-03-10T07:50:50.900 INFO:tasks.workunit.client.0.vm05.stdout:9/394: write d8/d86/d28/d79/f44 [31361,72466] 0 2026-03-10T07:50:50.909 INFO:tasks.workunit.client.0.vm05.stdout:6/410: symlink d0/d11/d4f/d56/l80 0 2026-03-10T07:50:50.910 INFO:tasks.workunit.client.0.vm05.stdout:8/360: rename d1/dd/d18/d20/d2a/d34/d49/d5d/c6c to d1/dd/d4d/d64/d6a/c6e 0 2026-03-10T07:50:50.911 INFO:tasks.workunit.client.0.vm05.stdout:5/394: rmdir d2/d20/d33 39 2026-03-10T07:50:50.913 INFO:tasks.workunit.client.0.vm05.stdout:5/395: dread d2/d20/f57 [0,4194304] 0 2026-03-10T07:50:50.920 INFO:tasks.workunit.client.0.vm05.stdout:7/409: dwrite d1/d6/d3b/f42 [0,4194304] 0 2026-03-10T07:50:50.921 INFO:tasks.workunit.client.0.vm05.stdout:7/410: dread - d1/d6/f77 zero size 2026-03-10T07:50:50.926 INFO:tasks.workunit.client.0.vm05.stdout:7/411: dread d1/d6/d3b/f42 [0,4194304] 0 2026-03-10T07:50:50.932 
INFO:tasks.workunit.client.0.vm05.stdout:6/411: mkdir d0/d11/d2e/d81 0 2026-03-10T07:50:50.934 INFO:tasks.workunit.client.0.vm05.stdout:2/475: rename d0/f1 to d0/d8/d43/df/d8b/f99 0 2026-03-10T07:50:50.937 INFO:tasks.workunit.client.0.vm05.stdout:1/410: fdatasync da/dd/d12/f22 0 2026-03-10T07:50:50.942 INFO:tasks.workunit.client.0.vm05.stdout:6/412: mknod d0/d11/d4f/d56/c82 0 2026-03-10T07:50:50.943 INFO:tasks.workunit.client.0.vm05.stdout:0/364: dwrite d8/dd/d37/d56/f3b [0,4194304] 0 2026-03-10T07:50:50.945 INFO:tasks.workunit.client.0.vm05.stdout:0/365: readlink d8/dd/d37/d67/l57 0 2026-03-10T07:50:50.947 INFO:tasks.workunit.client.0.vm05.stdout:8/361: mkdir d1/d6f 0 2026-03-10T07:50:50.950 INFO:tasks.workunit.client.0.vm05.stdout:6/413: dwrite d0/d11/d2e/f7f [0,4194304] 0 2026-03-10T07:50:50.958 INFO:tasks.workunit.client.0.vm05.stdout:6/414: dread d0/d35/d36/f5b [0,4194304] 0 2026-03-10T07:50:50.969 INFO:tasks.workunit.client.0.vm05.stdout:6/415: read - d0/d11/d57/d60/f74 zero size 2026-03-10T07:50:50.970 INFO:tasks.workunit.client.0.vm05.stdout:2/476: unlink d0/d8/d66/f6c 0 2026-03-10T07:50:50.970 INFO:tasks.workunit.client.0.vm05.stdout:5/396: symlink d2/d20/d33/d72/l87 0 2026-03-10T07:50:50.971 INFO:tasks.workunit.client.0.vm05.stdout:0/366: write d8/dd/d37/d56/f62 [892527,35777] 0 2026-03-10T07:50:50.975 INFO:tasks.workunit.client.0.vm05.stdout:6/416: creat d0/d11/d4f/d56/f83 x:0 0 0 2026-03-10T07:50:50.976 INFO:tasks.workunit.client.0.vm05.stdout:6/417: fsync d0/d11/d57/f5f 0 2026-03-10T07:50:50.979 INFO:tasks.workunit.client.0.vm05.stdout:2/477: dread d0/d8/d43/df/d25/f2b [0,4194304] 0 2026-03-10T07:50:50.980 INFO:tasks.workunit.client.0.vm05.stdout:2/478: chown d0/d8/d43/df/d53/f82 5 1 2026-03-10T07:50:50.980 INFO:tasks.workunit.client.0.vm05.stdout:2/479: write d0/d8/d43/f30 [2046183,4343] 0 2026-03-10T07:50:50.981 INFO:tasks.workunit.client.0.vm05.stdout:2/480: fsync d0/d8/d43/df/d4d/f84 0 2026-03-10T07:50:50.984 
INFO:tasks.workunit.client.0.vm05.stdout:1/411: creat da/dd/d2a/d70/f74 x:0 0 0 2026-03-10T07:50:50.992 INFO:tasks.workunit.client.0.vm05.stdout:8/362: getdents d1/d52 0 2026-03-10T07:50:50.997 INFO:tasks.workunit.client.0.vm05.stdout:0/367: mkdir d8/dd/d37/d56/d6d/d7d 0 2026-03-10T07:50:50.997 INFO:tasks.workunit.client.0.vm05.stdout:1/412: chown da/dd/d2a/f63 162 1 2026-03-10T07:50:50.997 INFO:tasks.workunit.client.0.vm05.stdout:2/481: rename d0/d8/d43/df/d4d/f57 to d0/d8/d43/d38/f9a 0 2026-03-10T07:50:50.997 INFO:tasks.workunit.client.0.vm05.stdout:7/412: getdents d1/d6 0 2026-03-10T07:50:50.997 INFO:tasks.workunit.client.0.vm05.stdout:7/413: write d1/d34/f4d [1023439,115435] 0 2026-03-10T07:50:50.997 INFO:tasks.workunit.client.0.vm05.stdout:7/414: write d1/d34/d59/f64 [805457,12900] 0 2026-03-10T07:50:50.997 INFO:tasks.workunit.client.0.vm05.stdout:0/368: symlink d8/dd/d37/d67/l7e 0 2026-03-10T07:50:50.998 INFO:tasks.workunit.client.0.vm05.stdout:5/397: creat d2/d20/d33/f88 x:0 0 0 2026-03-10T07:50:50.999 INFO:tasks.workunit.client.0.vm05.stdout:7/415: mknod d1/d6/c81 0 2026-03-10T07:50:51.000 INFO:tasks.workunit.client.0.vm05.stdout:5/398: mknod d2/d12/d2d/c89 0 2026-03-10T07:50:51.002 INFO:tasks.workunit.client.0.vm05.stdout:7/416: unlink d1/d6/f1b 0 2026-03-10T07:50:51.002 INFO:tasks.workunit.client.0.vm05.stdout:7/417: write d1/d6/d47/f65 [3161472,1128] 0 2026-03-10T07:50:51.003 INFO:tasks.workunit.client.0.vm05.stdout:5/399: dread d2/f42 [0,4194304] 0 2026-03-10T07:50:51.006 INFO:tasks.workunit.client.0.vm05.stdout:0/369: creat d8/dd/d10/f7f x:0 0 0 2026-03-10T07:50:51.009 INFO:tasks.workunit.client.0.vm05.stdout:0/370: getdents d8/dd/d37/d56 0 2026-03-10T07:50:51.009 INFO:tasks.workunit.client.0.vm05.stdout:0/371: write d8/dd/d37/d56/f62 [1933277,43355] 0 2026-03-10T07:50:51.012 INFO:tasks.workunit.client.0.vm05.stdout:4/431: write d0/d6/d9/d12/d45/d55/f19 [474483,89011] 0 2026-03-10T07:50:51.016 INFO:tasks.workunit.client.0.vm05.stdout:4/432: symlink 
d0/d6/d37/l93 0 2026-03-10T07:50:51.017 INFO:tasks.workunit.client.0.vm05.stdout:0/372: creat d8/dd/d37/d56/d6d/d70/f80 x:0 0 0 2026-03-10T07:50:51.017 INFO:tasks.workunit.client.0.vm05.stdout:0/373: stat d8/dd/d10/d26/d48/c5d 0 2026-03-10T07:50:51.018 INFO:tasks.workunit.client.0.vm05.stdout:0/374: mkdir d8/dd/d37/d81 0 2026-03-10T07:50:51.019 INFO:tasks.workunit.client.0.vm05.stdout:4/433: creat d0/d6/f94 x:0 0 0 2026-03-10T07:50:51.020 INFO:tasks.workunit.client.0.vm05.stdout:4/434: write d0/d6/d37/f3d [2897388,22125] 0 2026-03-10T07:50:51.031 INFO:tasks.workunit.client.0.vm05.stdout:0/375: getdents d8 0 2026-03-10T07:50:51.032 INFO:tasks.workunit.client.0.vm05.stdout:8/363: sync 2026-03-10T07:50:51.032 INFO:tasks.workunit.client.0.vm05.stdout:5/400: sync 2026-03-10T07:50:51.033 INFO:tasks.workunit.client.0.vm05.stdout:0/376: write d8/dd/d10/d26/d3a/f53 [565025,1230] 0 2026-03-10T07:50:51.039 INFO:tasks.workunit.client.0.vm05.stdout:5/401: dwrite d2/f15 [0,4194304] 0 2026-03-10T07:50:51.050 INFO:tasks.workunit.client.0.vm05.stdout:4/435: rename d0/d20/d26 to d0/d6/d95 0 2026-03-10T07:50:51.050 INFO:tasks.workunit.client.0.vm05.stdout:3/383: truncate d8/d1f/d2a/d4b/f57 4064269 0 2026-03-10T07:50:51.051 INFO:tasks.workunit.client.0.vm05.stdout:9/395: truncate d8/d35/d1c/d20/f32 1646490 0 2026-03-10T07:50:51.055 INFO:tasks.workunit.client.0.vm05.stdout:6/418: write d0/d11/f13 [2246602,98241] 0 2026-03-10T07:50:51.055 INFO:tasks.workunit.client.0.vm05.stdout:2/482: write d0/d8/d43/df/d53/f82 [39770,54769] 0 2026-03-10T07:50:51.058 INFO:tasks.workunit.client.0.vm05.stdout:1/413: dwrite da/d26/d2b/f45 [4194304,4194304] 0 2026-03-10T07:50:51.060 INFO:tasks.workunit.client.0.vm05.stdout:1/414: fdatasync da/dd/d12/d19/d20/f6e 0 2026-03-10T07:50:51.066 INFO:tasks.workunit.client.0.vm05.stdout:7/418: truncate d1/d6/f32 6182861 0 2026-03-10T07:50:51.066 INFO:tasks.workunit.client.0.vm05.stdout:8/364: unlink d1/dd/d18/d20/d2a/d34/f4f 0 2026-03-10T07:50:51.066 
INFO:tasks.workunit.client.0.vm05.stdout:0/377: symlink d8/dd/d10/d26/d2a/l82 0 2026-03-10T07:50:51.070 INFO:tasks.workunit.client.0.vm05.stdout:8/365: dwrite d1/dd/d4d/f60 [0,4194304] 0 2026-03-10T07:50:51.072 INFO:tasks.workunit.client.0.vm05.stdout:8/366: write d1/d45/f53 [7077291,79295] 0 2026-03-10T07:50:51.074 INFO:tasks.workunit.client.0.vm05.stdout:6/419: sync 2026-03-10T07:50:51.075 INFO:tasks.workunit.client.0.vm05.stdout:6/420: dread - d0/d11/d57/d60/f74 zero size 2026-03-10T07:50:51.078 INFO:tasks.workunit.client.0.vm05.stdout:8/367: dwrite d1/dd/d4d/f60 [0,4194304] 0 2026-03-10T07:50:51.081 INFO:tasks.workunit.client.0.vm05.stdout:4/436: mkdir d0/d3b/d96 0 2026-03-10T07:50:51.086 INFO:tasks.workunit.client.0.vm05.stdout:8/368: dwrite d1/dd/d18/d20/d2a/d48/f59 [4194304,4194304] 0 2026-03-10T07:50:51.088 INFO:tasks.workunit.client.0.vm05.stdout:4/437: dwrite d0/d6/d9/d12/d45/d55/d4e/f8d [0,4194304] 0 2026-03-10T07:50:51.089 INFO:tasks.workunit.client.0.vm05.stdout:3/384: creat d8/d16/f82 x:0 0 0 2026-03-10T07:50:51.094 INFO:tasks.workunit.client.0.vm05.stdout:9/396: rmdir d8/d35/d22/d33/d62 39 2026-03-10T07:50:51.094 INFO:tasks.workunit.client.0.vm05.stdout:9/397: truncate d8/d86/d28/f84 97509 0 2026-03-10T07:50:51.100 INFO:tasks.workunit.client.0.vm05.stdout:2/483: creat d0/d47/d49/f9b x:0 0 0 2026-03-10T07:50:51.110 INFO:tasks.workunit.client.0.vm05.stdout:7/419: fsync d1/d34/f7d 0 2026-03-10T07:50:51.110 INFO:tasks.workunit.client.0.vm05.stdout:7/420: fdatasync d1/d6/f1d 0 2026-03-10T07:50:51.110 INFO:tasks.workunit.client.0.vm05.stdout:5/402: symlink d2/d20/d33/l8a 0 2026-03-10T07:50:51.118 INFO:tasks.workunit.client.0.vm05.stdout:3/385: sync 2026-03-10T07:50:51.123 INFO:tasks.workunit.client.0.vm05.stdout:3/386: dwrite d8/f4d [0,4194304] 0 2026-03-10T07:50:51.131 INFO:tasks.workunit.client.0.vm05.stdout:1/415: dread da/dd/d12/f16 [0,4194304] 0 2026-03-10T07:50:51.143 INFO:tasks.workunit.client.0.vm05.stdout:5/403: creat d2/d5/d61/f8b x:0 0 0 
2026-03-10T07:50:51.145 INFO:tasks.workunit.client.0.vm05.stdout:0/378: link d8/dd/d10/d26/d2a/f6e d8/dd/d10/d26/d48/f83 0 2026-03-10T07:50:51.146 INFO:tasks.workunit.client.0.vm05.stdout:0/379: write d8/dd/d34/f3d [687220,68644] 0 2026-03-10T07:50:51.149 INFO:tasks.workunit.client.0.vm05.stdout:3/387: rename d8/d22/d60/d58/c80 to d8/d1f/d24/d45/c83 0 2026-03-10T07:50:51.149 INFO:tasks.workunit.client.0.vm05.stdout:3/388: truncate d8/d1c/d48/f70 876194 0 2026-03-10T07:50:51.150 INFO:tasks.workunit.client.0.vm05.stdout:3/389: truncate d8/d16/d19/f77 67894 0 2026-03-10T07:50:51.152 INFO:tasks.workunit.client.0.vm05.stdout:1/416: creat da/dd/d2a/f75 x:0 0 0 2026-03-10T07:50:51.154 INFO:tasks.workunit.client.0.vm05.stdout:9/398: link d8/d35/d22/d33/f5b d8/d35/d38/f87 0 2026-03-10T07:50:51.157 INFO:tasks.workunit.client.0.vm05.stdout:7/421: mkdir d1/d6/d80/d82 0 2026-03-10T07:50:51.162 INFO:tasks.workunit.client.0.vm05.stdout:0/380: mknod d8/dd/d37/c84 0 2026-03-10T07:50:51.166 INFO:tasks.workunit.client.0.vm05.stdout:3/390: dwrite d8/d16/d19/f81 [0,4194304] 0 2026-03-10T07:50:51.167 INFO:tasks.workunit.client.0.vm05.stdout:1/417: rename da/f5a to da/dd/d12/d19/f76 0 2026-03-10T07:50:51.169 INFO:tasks.workunit.client.0.vm05.stdout:4/438: getdents d0/d6/d9/d12 0 2026-03-10T07:50:51.172 INFO:tasks.workunit.client.0.vm05.stdout:9/399: link d8/d35/d1c/d36/f50 d8/d35/d1c/d75/f88 0 2026-03-10T07:50:51.180 INFO:tasks.workunit.client.0.vm05.stdout:7/422: symlink d1/d6/d80/d82/l83 0 2026-03-10T07:50:51.190 INFO:tasks.workunit.client.0.vm05.stdout:5/404: link d2/d12/f2f d2/d20/d4c/f8c 0 2026-03-10T07:50:51.191 INFO:tasks.workunit.client.0.vm05.stdout:7/423: dread d1/f11 [0,4194304] 0 2026-03-10T07:50:51.194 INFO:tasks.workunit.client.0.vm05.stdout:9/400: sync 2026-03-10T07:50:51.196 INFO:tasks.workunit.client.0.vm05.stdout:0/381: creat d8/dd/d10/d26/d2a/d6f/f85 x:0 0 0 2026-03-10T07:50:51.196 INFO:tasks.workunit.client.0.vm05.stdout:4/439: rmdir d0/d6/d9/d12/d65 39 
2026-03-10T07:50:51.197 INFO:tasks.workunit.client.0.vm05.stdout:5/405: write d2/d12/f40 [823736,26737] 0 2026-03-10T07:50:51.200 INFO:tasks.workunit.client.0.vm05.stdout:1/418: creat da/dd/f77 x:0 0 0 2026-03-10T07:50:51.201 INFO:tasks.workunit.client.0.vm05.stdout:1/419: chown da/dd/d12/d19/d20 127465 1 2026-03-10T07:50:51.204 INFO:tasks.workunit.client.0.vm05.stdout:0/382: mkdir d8/dd/d37/d56/d6d/d86 0 2026-03-10T07:50:51.204 INFO:tasks.workunit.client.0.vm05.stdout:9/401: chown d8/d35/d22/d33/d62/d6d/c7a 406608 1 2026-03-10T07:50:51.205 INFO:tasks.workunit.client.0.vm05.stdout:9/402: chown d8/d35/d1c/d20/f32 46168028 1 2026-03-10T07:50:51.206 INFO:tasks.workunit.client.0.vm05.stdout:1/420: symlink da/dd/d2a/d55/d68/l78 0 2026-03-10T07:50:51.209 INFO:tasks.workunit.client.0.vm05.stdout:1/421: dwrite da/dd/d12/d19/f4e [0,4194304] 0 2026-03-10T07:50:51.211 INFO:tasks.workunit.client.0.vm05.stdout:5/406: mkdir d2/d20/d33/d86/d8d 0 2026-03-10T07:50:51.219 INFO:tasks.workunit.client.0.vm05.stdout:0/383: creat d8/dd/d37/d67/f87 x:0 0 0 2026-03-10T07:50:51.221 INFO:tasks.workunit.client.0.vm05.stdout:9/403: symlink d8/d35/d22/d33/d62/l89 0 2026-03-10T07:50:51.224 INFO:tasks.workunit.client.0.vm05.stdout:5/407: mknod d2/d4b/c8e 0 2026-03-10T07:50:51.225 INFO:tasks.workunit.client.0.vm05.stdout:5/408: read - d2/d20/d33/f45 zero size 2026-03-10T07:50:51.225 INFO:tasks.workunit.client.0.vm05.stdout:9/404: dwrite d8/d35/d22/d33/f73 [0,4194304] 0 2026-03-10T07:50:51.229 INFO:tasks.workunit.client.0.vm05.stdout:1/422: link da/f5c da/dd/d12/d19/d20/f79 0 2026-03-10T07:50:51.230 INFO:tasks.workunit.client.0.vm05.stdout:0/384: symlink d8/dd/d10/l88 0 2026-03-10T07:50:51.234 INFO:tasks.workunit.client.0.vm05.stdout:5/409: creat d2/d12/f8f x:0 0 0 2026-03-10T07:50:51.234 INFO:tasks.workunit.client.0.vm05.stdout:5/410: fsync d2/d5/f1e 0 2026-03-10T07:50:51.253 INFO:tasks.workunit.client.0.vm05.stdout:1/423: creat da/dd/d2a/d55/d64/f7a x:0 0 0 2026-03-10T07:50:51.253 
INFO:tasks.workunit.client.0.vm05.stdout:0/385: dread d8/dd/d37/d56/f18 [0,4194304] 0 2026-03-10T07:50:51.253 INFO:tasks.workunit.client.0.vm05.stdout:9/405: dread d8/d35/f1d [0,4194304] 0 2026-03-10T07:50:51.253 INFO:tasks.workunit.client.0.vm05.stdout:5/411: rename d2/d12/f2f to d2/d20/d33/d53/d7d/f90 0 2026-03-10T07:50:51.253 INFO:tasks.workunit.client.0.vm05.stdout:5/412: stat d2/d5/f1e 0 2026-03-10T07:50:51.253 INFO:tasks.workunit.client.0.vm05.stdout:5/413: dread - d2/d20/d5b/f6e zero size 2026-03-10T07:50:51.253 INFO:tasks.workunit.client.0.vm05.stdout:9/406: truncate d8/d35/f48 1802352 0 2026-03-10T07:50:51.253 INFO:tasks.workunit.client.0.vm05.stdout:9/407: unlink d8/d35/d1c/d2c/d63/f77 0 2026-03-10T07:50:51.253 INFO:tasks.workunit.client.0.vm05.stdout:1/424: rename da/dd/d2a/d70/f74 to da/dd/f7b 0 2026-03-10T07:50:51.253 INFO:tasks.workunit.client.0.vm05.stdout:5/414: link d2/d5/d61/l69 d2/d20/d4c/d64/l91 0 2026-03-10T07:50:51.253 INFO:tasks.workunit.client.0.vm05.stdout:5/415: dread - d2/d12/d2d/d4a/f59 zero size 2026-03-10T07:50:51.261 INFO:tasks.workunit.client.0.vm05.stdout:2/484: rmdir d0/d47/d49 39 2026-03-10T07:50:51.263 INFO:tasks.workunit.client.0.vm05.stdout:7/424: write d1/d6/f31 [834831,57306] 0 2026-03-10T07:50:51.265 INFO:tasks.workunit.client.0.vm05.stdout:7/425: dread d1/d6/f4e [0,4194304] 0 2026-03-10T07:50:51.265 INFO:tasks.workunit.client.0.vm05.stdout:7/426: dread - d1/d5b/f73 zero size 2026-03-10T07:50:51.267 INFO:tasks.workunit.client.0.vm05.stdout:0/386: rename d8/l1f to d8/dd/d10/d26/d3a/l89 0 2026-03-10T07:50:51.271 INFO:tasks.workunit.client.0.vm05.stdout:1/425: mkdir da/dd/d2a/d7c 0 2026-03-10T07:50:51.275 INFO:tasks.workunit.client.0.vm05.stdout:6/421: dwrite d0/d11/d22/f52 [0,4194304] 0 2026-03-10T07:50:51.276 INFO:tasks.workunit.client.0.vm05.stdout:5/416: mknod d2/d20/d5b/c92 0 2026-03-10T07:50:51.277 INFO:tasks.workunit.client.0.vm05.stdout:8/369: write d1/fa [2213884,116598] 0 2026-03-10T07:50:51.277 
INFO:tasks.workunit.client.0.vm05.stdout:5/417: chown d2/d5/d61/f66 49720 1 2026-03-10T07:50:51.287 INFO:tasks.workunit.client.0.vm05.stdout:0/387: symlink d8/dd/d37/d56/d6d/d70/l8a 0 2026-03-10T07:50:51.288 INFO:tasks.workunit.client.0.vm05.stdout:0/388: write d8/dd/d34/f3d [1172430,128054] 0 2026-03-10T07:50:51.292 INFO:tasks.workunit.client.0.vm05.stdout:9/408: creat d8/f8a x:0 0 0 2026-03-10T07:50:51.296 INFO:tasks.workunit.client.0.vm05.stdout:8/370: unlink d1/dd/d18/f38 0 2026-03-10T07:50:51.296 INFO:tasks.workunit.client.0.vm05.stdout:8/371: fdatasync d1/dd/f17 0 2026-03-10T07:50:51.297 INFO:tasks.workunit.client.0.vm05.stdout:8/372: write d1/d45/f53 [8885555,13805] 0 2026-03-10T07:50:51.298 INFO:tasks.workunit.client.0.vm05.stdout:8/373: write d1/dd/d4d/f60 [4797304,27713] 0 2026-03-10T07:50:51.304 INFO:tasks.workunit.client.0.vm05.stdout:5/418: mkdir d2/d20/d77/d93 0 2026-03-10T07:50:51.306 INFO:tasks.workunit.client.0.vm05.stdout:3/391: dwrite d8/d1f/d2a/d34/f39 [0,4194304] 0 2026-03-10T07:50:51.319 INFO:tasks.workunit.client.0.vm05.stdout:1/426: creat da/d26/d2b/d71/f7d x:0 0 0 2026-03-10T07:50:51.320 INFO:tasks.workunit.client.0.vm05.stdout:1/427: write da/dd/f77 [403447,35196] 0 2026-03-10T07:50:51.320 INFO:tasks.workunit.client.0.vm05.stdout:1/428: read - da/dd/f7b zero size 2026-03-10T07:50:51.323 INFO:tasks.workunit.client.0.vm05.stdout:6/422: mkdir d0/d11/d22/d6c/d84 0 2026-03-10T07:50:51.327 INFO:tasks.workunit.client.0.vm05.stdout:6/423: fdatasync d0/d6/f44 0 2026-03-10T07:50:51.327 INFO:tasks.workunit.client.0.vm05.stdout:4/440: truncate d0/d6/f32 2029111 0 2026-03-10T07:50:51.327 INFO:tasks.workunit.client.0.vm05.stdout:8/374: chown d1/dd/d18/f47 93 1 2026-03-10T07:50:51.330 INFO:tasks.workunit.client.0.vm05.stdout:8/375: dread d1/dd/f25 [0,4194304] 0 2026-03-10T07:50:51.331 INFO:tasks.workunit.client.0.vm05.stdout:5/419: symlink d2/d20/d33/d72/l94 0 2026-03-10T07:50:51.331 INFO:tasks.workunit.client.0.vm05.stdout:5/420: chown d2/d20 259 1 
2026-03-10T07:50:51.335 INFO:tasks.workunit.client.0.vm05.stdout:3/392: write d8/d16/f1a [1090097,103166] 0 2026-03-10T07:50:51.345 INFO:tasks.workunit.client.0.vm05.stdout:3/393: dread d8/d1c/f2b [0,4194304] 0 2026-03-10T07:50:51.346 INFO:tasks.workunit.client.0.vm05.stdout:3/394: write d8/d16/f82 [782889,23828] 0 2026-03-10T07:50:51.352 INFO:tasks.workunit.client.0.vm05.stdout:1/429: symlink da/dd/d12/d19/l7e 0 2026-03-10T07:50:51.352 INFO:tasks.workunit.client.0.vm05.stdout:1/430: write da/dd/f77 [675203,95324] 0 2026-03-10T07:50:51.356 INFO:tasks.workunit.client.0.vm05.stdout:6/424: mknod d0/d6/c85 0 2026-03-10T07:50:51.358 INFO:tasks.workunit.client.0.vm05.stdout:8/376: creat d1/dd/d18/f70 x:0 0 0 2026-03-10T07:50:51.361 INFO:tasks.workunit.client.0.vm05.stdout:5/421: mknod d2/d12/d2d/c95 0 2026-03-10T07:50:51.361 INFO:tasks.workunit.client.0.vm05.stdout:5/422: chown d2/d20/d5b/c85 299369 1 2026-03-10T07:50:51.377 INFO:tasks.workunit.client.0.vm05.stdout:1/431: dread da/dd/d12/f18 [0,4194304] 0 2026-03-10T07:50:51.379 INFO:tasks.workunit.client.0.vm05.stdout:5/423: creat d2/d20/d4c/d64/f96 x:0 0 0 2026-03-10T07:50:51.379 INFO:tasks.workunit.client.0.vm05.stdout:5/424: chown d2/d20/f57 124191 1 2026-03-10T07:50:51.380 INFO:tasks.workunit.client.0.vm05.stdout:5/425: chown d2/d20/d4c/d64/f96 451 1 2026-03-10T07:50:51.381 INFO:tasks.workunit.client.0.vm05.stdout:5/426: write d2/d5/f25 [8868362,128781] 0 2026-03-10T07:50:51.384 INFO:tasks.workunit.client.0.vm05.stdout:5/427: dread d2/d20/d77/f7f [0,4194304] 0 2026-03-10T07:50:51.404 INFO:tasks.workunit.client.0.vm05.stdout:5/428: truncate d2/d12/d2d/d4a/f59 41875 0 2026-03-10T07:50:51.404 INFO:tasks.workunit.client.0.vm05.stdout:9/409: getdents d8/d35/d1c/d2c/d63 0 2026-03-10T07:50:51.404 INFO:tasks.workunit.client.0.vm05.stdout:8/377: symlink d1/dd/d18/d20/d2a/d34/d49/l71 0 2026-03-10T07:50:51.404 INFO:tasks.workunit.client.0.vm05.stdout:7/427: write d1/d6/d47/f7b [855940,61525] 0 2026-03-10T07:50:51.404 
INFO:tasks.workunit.client.0.vm05.stdout:8/378: dread - d1/dd/d18/d20/d2a/d48/f50 zero size 2026-03-10T07:50:51.404 INFO:tasks.workunit.client.0.vm05.stdout:7/428: chown d1/d6/d47/l4a 882036 1 2026-03-10T07:50:51.404 INFO:tasks.workunit.client.0.vm05.stdout:7/429: dread - d1/d34/d59/f78 zero size 2026-03-10T07:50:51.404 INFO:tasks.workunit.client.0.vm05.stdout:0/389: rename d8/dd/d37/d56/d6d to d8/dd/d10/d26/d8b 0 2026-03-10T07:50:51.404 INFO:tasks.workunit.client.0.vm05.stdout:8/379: write d1/dd/d18/d20/d2a/f3a [777944,31429] 0 2026-03-10T07:50:51.404 INFO:tasks.workunit.client.0.vm05.stdout:7/430: creat d1/d6/f84 x:0 0 0 2026-03-10T07:50:51.404 INFO:tasks.workunit.client.0.vm05.stdout:4/441: rename d0/d6/d95/f73 to d0/d6/d9/d12/d45/d55/d4e/f97 0 2026-03-10T07:50:51.407 INFO:tasks.workunit.client.0.vm05.stdout:2/485: truncate d0/d8/f65 704891 0 2026-03-10T07:50:51.408 INFO:tasks.workunit.client.0.vm05.stdout:8/380: readlink d1/dd/d18/d20/d2a/d34/l62 0 2026-03-10T07:50:51.412 INFO:tasks.workunit.client.0.vm05.stdout:8/381: dwrite d1/dd/d18/d20/d2a/d48/f59 [4194304,4194304] 0 2026-03-10T07:50:51.420 INFO:tasks.workunit.client.0.vm05.stdout:3/395: rename d8/c11 to d8/d22/d60/d6e/c84 0 2026-03-10T07:50:51.421 INFO:tasks.workunit.client.0.vm05.stdout:3/396: dwrite d8/d1c/f23 [0,4194304] 0 2026-03-10T07:50:51.421 INFO:tasks.workunit.client.0.vm05.stdout:4/442: creat d0/d3b/d5c/f98 x:0 0 0 2026-03-10T07:50:51.426 INFO:tasks.workunit.client.0.vm05.stdout:2/486: mkdir d0/d8/d43/df/d9c 0 2026-03-10T07:50:51.434 INFO:tasks.workunit.client.0.vm05.stdout:8/382: mknod d1/dd/d18/d20/c72 0 2026-03-10T07:50:51.437 INFO:tasks.workunit.client.0.vm05.stdout:6/425: sync 2026-03-10T07:50:51.438 INFO:tasks.workunit.client.0.vm05.stdout:1/432: sync 2026-03-10T07:50:51.439 INFO:tasks.workunit.client.0.vm05.stdout:6/426: readlink d0/d6/d3b/l3e 0 2026-03-10T07:50:51.441 INFO:tasks.workunit.client.0.vm05.stdout:5/429: rename d2/d5/f25 to d2/d20/d33/d53/f97 0 2026-03-10T07:50:51.442 
INFO:tasks.workunit.client.0.vm05.stdout:5/430: fsync d2/f15 0 2026-03-10T07:50:51.449 INFO:tasks.workunit.client.0.vm05.stdout:2/487: symlink d0/d2a/l9d 0 2026-03-10T07:50:51.451 INFO:tasks.workunit.client.0.vm05.stdout:7/431: link d1/d6/d3b/l45 d1/d6/d47/l85 0 2026-03-10T07:50:51.459 INFO:tasks.workunit.client.0.vm05.stdout:7/432: dread d1/d34/f7d [0,4194304] 0 2026-03-10T07:50:51.464 INFO:tasks.workunit.client.0.vm05.stdout:9/410: rename d8/d35/d1c/d36 to d8/d35/d1c/d20/d59/d8b 0 2026-03-10T07:50:51.465 INFO:tasks.workunit.client.0.vm05.stdout:9/411: chown d8/d35/d22/d33/d47 2 1 2026-03-10T07:50:51.465 INFO:tasks.workunit.client.0.vm05.stdout:9/412: chown d8/d35/d1c/f3b 1615697 1 2026-03-10T07:50:51.473 INFO:tasks.workunit.client.0.vm05.stdout:8/383: mkdir d1/d73 0 2026-03-10T07:50:51.476 INFO:tasks.workunit.client.0.vm05.stdout:8/384: dread d1/dd/d4d/f61 [0,4194304] 0 2026-03-10T07:50:51.478 INFO:tasks.workunit.client.0.vm05.stdout:8/385: dread d1/dd/d4d/f60 [0,4194304] 0 2026-03-10T07:50:51.483 INFO:tasks.workunit.client.0.vm05.stdout:6/427: mkdir d0/d11/d86 0 2026-03-10T07:50:51.483 INFO:tasks.workunit.client.0.vm05.stdout:6/428: fsync d0/f3a 0 2026-03-10T07:50:51.486 INFO:tasks.workunit.client.0.vm05.stdout:8/386: dread d1/dd/d18/f21 [0,4194304] 0 2026-03-10T07:50:51.486 INFO:tasks.workunit.client.0.vm05.stdout:4/443: rename d0/d6/d9/d12/l2b to d0/d6/d9/d12/d65/l99 0 2026-03-10T07:50:51.490 INFO:tasks.workunit.client.0.vm05.stdout:4/444: dwrite d0/d6/d37/f46 [0,4194304] 0 2026-03-10T07:50:51.495 INFO:tasks.workunit.client.0.vm05.stdout:8/387: dread d1/dd/d18/d20/d2a/d48/f4a [0,4194304] 0 2026-03-10T07:50:51.512 INFO:tasks.workunit.client.0.vm05.stdout:2/488: mknod d0/d47/c9e 0 2026-03-10T07:50:51.527 INFO:tasks.workunit.client.0.vm05.stdout:0/390: truncate d8/dd/d34/f3d 565496 0 2026-03-10T07:50:51.530 INFO:tasks.workunit.client.0.vm05.stdout:4/445: creat d0/d3b/f9a x:0 0 0 2026-03-10T07:50:51.532 INFO:tasks.workunit.client.0.vm05.stdout:5/431: link d2/c13 
d2/d20/d4c/c98 0 2026-03-10T07:50:51.533 INFO:tasks.workunit.client.0.vm05.stdout:5/432: write d2/d5/f23 [914005,71382] 0 2026-03-10T07:50:51.535 INFO:tasks.workunit.client.0.vm05.stdout:8/388: creat d1/dd/d18/d20/d2a/d34/d49/d5d/f74 x:0 0 0 2026-03-10T07:50:51.538 INFO:tasks.workunit.client.0.vm05.stdout:7/433: creat d1/f86 x:0 0 0 2026-03-10T07:50:51.539 INFO:tasks.workunit.client.0.vm05.stdout:7/434: truncate d1/f86 665806 0 2026-03-10T07:50:51.540 INFO:tasks.workunit.client.0.vm05.stdout:6/429: symlink d0/l87 0 2026-03-10T07:50:51.542 INFO:tasks.workunit.client.0.vm05.stdout:1/433: dwrite da/f5c [0,4194304] 0 2026-03-10T07:50:51.543 INFO:tasks.workunit.client.0.vm05.stdout:1/434: readlink da/dd/d2a/d55/l66 0 2026-03-10T07:50:51.544 INFO:tasks.workunit.client.0.vm05.stdout:0/391: sync 2026-03-10T07:50:51.554 INFO:tasks.workunit.client.0.vm05.stdout:4/446: creat d0/d28/f9b x:0 0 0 2026-03-10T07:50:51.556 INFO:tasks.workunit.client.0.vm05.stdout:3/397: dwrite d8/d1c/f56 [0,4194304] 0 2026-03-10T07:50:51.563 INFO:tasks.workunit.client.0.vm05.stdout:0/392: dread d8/dd/d10/f19 [0,4194304] 0 2026-03-10T07:50:51.566 INFO:tasks.workunit.client.0.vm05.stdout:3/398: dwrite d8/fe [4194304,4194304] 0 2026-03-10T07:50:51.569 INFO:tasks.workunit.client.0.vm05.stdout:3/399: chown d8/d1c/c20 163305 1 2026-03-10T07:50:51.572 INFO:tasks.workunit.client.0.vm05.stdout:9/413: link d8/f5e d8/d35/d1c/f8c 0 2026-03-10T07:50:51.572 INFO:tasks.workunit.client.0.vm05.stdout:5/433: creat d2/d12/d2d/d4a/f99 x:0 0 0 2026-03-10T07:50:51.572 INFO:tasks.workunit.client.0.vm05.stdout:9/414: chown d8/d35/d1c/d20/d59/d8b/l55 0 1 2026-03-10T07:50:51.572 INFO:tasks.workunit.client.0.vm05.stdout:5/434: chown d2/d20/d33/d53/c74 1738 1 2026-03-10T07:50:51.573 INFO:tasks.workunit.client.0.vm05.stdout:5/435: readlink d2/d20/d77/l81 0 2026-03-10T07:50:51.586 INFO:tasks.workunit.client.0.vm05.stdout:7/435: dread d1/d6/f22 [0,4194304] 0 2026-03-10T07:50:51.586 INFO:tasks.workunit.client.0.vm05.stdout:7/436: 
dread - d1/d6/f77 zero size 2026-03-10T07:50:51.587 INFO:tasks.workunit.client.0.vm05.stdout:1/435: rmdir da/d26/d2b/d71 39 2026-03-10T07:50:51.588 INFO:tasks.workunit.client.0.vm05.stdout:1/436: read - da/dd/f7b zero size 2026-03-10T07:50:51.591 INFO:tasks.workunit.client.0.vm05.stdout:1/437: dwrite da/dd/d2a/f63 [0,4194304] 0 2026-03-10T07:50:51.602 INFO:tasks.workunit.client.0.vm05.stdout:0/393: rename d8/dd/c1b to d8/dd/d10/d26/d2a/d6f/c8c 0 2026-03-10T07:50:51.602 INFO:tasks.workunit.client.0.vm05.stdout:9/415: rmdir d8/d86/d28/d79/d57 39 2026-03-10T07:50:51.602 INFO:tasks.workunit.client.0.vm05.stdout:5/436: write d2/d12/f6b [519084,116909] 0 2026-03-10T07:50:51.602 INFO:tasks.workunit.client.0.vm05.stdout:9/416: dwrite d8/d86/d28/f84 [0,4194304] 0 2026-03-10T07:50:51.611 INFO:tasks.workunit.client.0.vm05.stdout:6/430: symlink d0/d11/d22/d69/l88 0 2026-03-10T07:50:51.613 INFO:tasks.workunit.client.0.vm05.stdout:7/437: fdatasync d1/d6/f2e 0 2026-03-10T07:50:51.614 INFO:tasks.workunit.client.0.vm05.stdout:4/447: mkdir d0/d6/d9/d12/d9c 0 2026-03-10T07:50:51.639 INFO:tasks.workunit.client.0.vm05.stdout:7/438: mknod d1/d6/d80/d82/c87 0 2026-03-10T07:50:51.641 INFO:tasks.workunit.client.0.vm05.stdout:3/400: rename d8/d1f/f79 to d8/d22/d60/d58/f85 0 2026-03-10T07:50:51.642 INFO:tasks.workunit.client.0.vm05.stdout:3/401: write d8/d1f/f49 [4023375,78710] 0 2026-03-10T07:50:51.645 INFO:tasks.workunit.client.0.vm05.stdout:7/439: dwrite d1/d34/d59/f78 [0,4194304] 0 2026-03-10T07:50:51.648 INFO:tasks.workunit.client.0.vm05.stdout:0/394: creat d8/dd/d37/d81/f8d x:0 0 0 2026-03-10T07:50:51.652 INFO:tasks.workunit.client.0.vm05.stdout:9/417: creat d8/d35/d1c/d75/f8d x:0 0 0 2026-03-10T07:50:51.654 INFO:tasks.workunit.client.0.vm05.stdout:9/418: dread d8/d35/d22/d33/f5b [0,4194304] 0 2026-03-10T07:50:51.654 INFO:tasks.workunit.client.0.vm05.stdout:0/395: dwrite d8/dd/d10/f6c [0,4194304] 0 2026-03-10T07:50:51.661 INFO:tasks.workunit.client.0.vm05.stdout:4/448: symlink 
d0/d6/d9/l9d 0 2026-03-10T07:50:51.669 INFO:tasks.workunit.client.0.vm05.stdout:3/402: rename d8/d1f/d2a/d4b/f55 to d8/d22/f86 0 2026-03-10T07:50:51.670 INFO:tasks.workunit.client.0.vm05.stdout:7/440: truncate d1/d6/f31 289295 0 2026-03-10T07:50:51.677 INFO:tasks.workunit.client.0.vm05.stdout:8/389: write d1/d23/f31 [1387100,40585] 0 2026-03-10T07:50:51.677 INFO:tasks.workunit.client.0.vm05.stdout:8/390: write d1/dd/d18/f3f [2910408,32664] 0 2026-03-10T07:50:51.677 INFO:tasks.workunit.client.0.vm05.stdout:9/419: creat d8/d35/d1c/d75/f8e x:0 0 0 2026-03-10T07:50:51.678 INFO:tasks.workunit.client.0.vm05.stdout:4/449: write d0/d6/d37/f75 [1402549,32526] 0 2026-03-10T07:50:51.682 INFO:tasks.workunit.client.0.vm05.stdout:3/403: mknod d8/d16/d19/c87 0 2026-03-10T07:50:51.686 INFO:tasks.workunit.client.0.vm05.stdout:0/396: write d8/dd/d37/d56/f62 [1794121,124954] 0 2026-03-10T07:50:51.686 INFO:tasks.workunit.client.0.vm05.stdout:0/397: chown d8/dd/d34 297 1 2026-03-10T07:50:51.689 INFO:tasks.workunit.client.0.vm05.stdout:0/398: truncate d8/dd/d37/d67/f87 146185 0 2026-03-10T07:50:51.690 INFO:tasks.workunit.client.0.vm05.stdout:9/420: creat d8/d35/d22/d33/d85/f8f x:0 0 0 2026-03-10T07:50:51.696 INFO:tasks.workunit.client.0.vm05.stdout:4/450: dread d0/d28/f33 [0,4194304] 0 2026-03-10T07:50:51.702 INFO:tasks.workunit.client.0.vm05.stdout:1/438: dwrite da/dd/d12/f22 [0,4194304] 0 2026-03-10T07:50:51.711 INFO:tasks.workunit.client.0.vm05.stdout:7/441: write d1/d6/f2e [3741511,126684] 0 2026-03-10T07:50:51.712 INFO:tasks.workunit.client.0.vm05.stdout:4/451: creat d0/d6/d9/d12/d45/d55/d44/d85/f9e x:0 0 0 2026-03-10T07:50:51.712 INFO:tasks.workunit.client.0.vm05.stdout:1/439: chown da/dd/d2a/f63 36 1 2026-03-10T07:50:51.712 INFO:tasks.workunit.client.0.vm05.stdout:1/440: write f4 [666037,66419] 0 2026-03-10T07:50:51.712 INFO:tasks.workunit.client.0.vm05.stdout:5/437: dwrite d2/d20/d33/d53/d7d/f90 [0,4194304] 0 2026-03-10T07:50:51.713 
INFO:tasks.workunit.client.0.vm05.stdout:4/452: write d0/d6/d9/d12/d45/d55/d44/f74 [633151,75951] 0 2026-03-10T07:50:51.717 INFO:tasks.workunit.client.0.vm05.stdout:1/441: rename da/d26/c72 to da/dd/d12/d19/d20/c7f 0 2026-03-10T07:50:51.721 INFO:tasks.workunit.client.0.vm05.stdout:5/438: creat d2/d20/d4c/f9a x:0 0 0 2026-03-10T07:50:51.724 INFO:tasks.workunit.client.0.vm05.stdout:6/431: dwrite d0/d35/f41 [0,4194304] 0 2026-03-10T07:50:51.726 INFO:tasks.workunit.client.0.vm05.stdout:4/453: dwrite d0/d6/d9/d12/f36 [0,4194304] 0 2026-03-10T07:50:51.727 INFO:tasks.workunit.client.0.vm05.stdout:4/454: write d0/d6/d37/f75 [168127,1267] 0 2026-03-10T07:50:51.732 INFO:tasks.workunit.client.0.vm05.stdout:9/421: sync 2026-03-10T07:50:51.738 INFO:tasks.workunit.client.0.vm05.stdout:2/489: write d0/d8/f65 [979039,31663] 0 2026-03-10T07:50:51.741 INFO:tasks.workunit.client.0.vm05.stdout:1/442: mkdir da/dd/d42/d80 0 2026-03-10T07:50:51.742 INFO:tasks.workunit.client.0.vm05.stdout:5/439: creat d2/d20/d33/d53/d7d/f9b x:0 0 0 2026-03-10T07:50:51.746 INFO:tasks.workunit.client.0.vm05.stdout:6/432: symlink d0/d11/d22/d69/l89 0 2026-03-10T07:50:51.746 INFO:tasks.workunit.client.0.vm05.stdout:4/455: chown d0/d6/d95/f40 306 1 2026-03-10T07:50:51.748 INFO:tasks.workunit.client.0.vm05.stdout:9/422: creat d8/d35/d1c/d20/d54/f90 x:0 0 0 2026-03-10T07:50:51.749 INFO:tasks.workunit.client.0.vm05.stdout:9/423: write d8/d86/d28/d79/f44 [660900,48696] 0 2026-03-10T07:50:51.753 INFO:tasks.workunit.client.0.vm05.stdout:9/424: dread d8/d35/d22/f2b [0,4194304] 0 2026-03-10T07:50:51.754 INFO:tasks.workunit.client.0.vm05.stdout:1/443: chown da/dd/d12/d34/f38 266 1 2026-03-10T07:50:51.755 INFO:tasks.workunit.client.0.vm05.stdout:1/444: write da/dd/d2a/f6a [214131,28229] 0 2026-03-10T07:50:51.757 INFO:tasks.workunit.client.0.vm05.stdout:5/440: symlink d2/d12/d2d/d4a/l9c 0 2026-03-10T07:50:51.759 INFO:tasks.workunit.client.0.vm05.stdout:9/425: fdatasync d8/d35/d1c/f8c 0 2026-03-10T07:50:51.760 
INFO:tasks.workunit.client.0.vm05.stdout:5/441: stat d2/d20/d33/c41 0 2026-03-10T07:50:51.761 INFO:tasks.workunit.client.0.vm05.stdout:4/456: symlink d0/d6/d9/d8c/l9f 0 2026-03-10T07:50:51.762 INFO:tasks.workunit.client.0.vm05.stdout:2/490: rmdir d0/d8/d43/df/d9c 0 2026-03-10T07:50:51.768 INFO:tasks.workunit.client.0.vm05.stdout:5/442: rmdir d2/d5 39 2026-03-10T07:50:51.770 INFO:tasks.workunit.client.0.vm05.stdout:2/491: read d0/d8/d43/d38/f9a [1418863,40295] 0 2026-03-10T07:50:51.774 INFO:tasks.workunit.client.0.vm05.stdout:2/492: truncate d0/f5 2155351 0 2026-03-10T07:50:51.775 INFO:tasks.workunit.client.0.vm05.stdout:4/457: creat d0/d6/fa0 x:0 0 0 2026-03-10T07:50:51.782 INFO:tasks.workunit.client.0.vm05.stdout:2/493: unlink d0/d2a/l9d 0 2026-03-10T07:50:51.783 INFO:tasks.workunit.client.0.vm05.stdout:2/494: read d0/d8/f65 [896668,67357] 0 2026-03-10T07:50:51.784 INFO:tasks.workunit.client.0.vm05.stdout:2/495: chown d0/d8/d3d/d7d/c50 3836231 1 2026-03-10T07:50:51.786 INFO:tasks.workunit.client.0.vm05.stdout:4/458: creat d0/d6/d37/fa1 x:0 0 0 2026-03-10T07:50:51.786 INFO:tasks.workunit.client.0.vm05.stdout:4/459: write d0/d6/d37/fa1 [108635,4732] 0 2026-03-10T07:50:51.790 INFO:tasks.workunit.client.0.vm05.stdout:4/460: dread d0/d6/d9/f54 [0,4194304] 0 2026-03-10T07:50:51.790 INFO:tasks.workunit.client.0.vm05.stdout:4/461: write d0/d6/d9/f4d [2335458,130315] 0 2026-03-10T07:50:51.799 INFO:tasks.workunit.client.0.vm05.stdout:2/496: mknod d0/d8/d43/df/d4d/c9f 0 2026-03-10T07:50:51.799 INFO:tasks.workunit.client.0.vm05.stdout:2/497: chown d0/d8/d43/df/d53/f69 840481 1 2026-03-10T07:50:51.801 INFO:tasks.workunit.client.0.vm05.stdout:3/404: write d8/f25 [626884,58892] 0 2026-03-10T07:50:51.803 INFO:tasks.workunit.client.0.vm05.stdout:8/391: truncate d1/dd/d18/d20/d2a/d48/f59 6932549 0 2026-03-10T07:50:51.808 INFO:tasks.workunit.client.0.vm05.stdout:4/462: mknod d0/d6/d9/d12/d45/d55/d44/ca2 0 2026-03-10T07:50:51.808 INFO:tasks.workunit.client.0.vm05.stdout:4/463: fsync 
d0/d6/d9/f4d 0 2026-03-10T07:50:51.809 INFO:tasks.workunit.client.0.vm05.stdout:4/464: dread - d0/d6/d9/f83 zero size 2026-03-10T07:50:51.813 INFO:tasks.workunit.client.0.vm05.stdout:0/399: dwrite d8/dd/d10/d26/d48/f83 [0,4194304] 0 2026-03-10T07:50:51.820 INFO:tasks.workunit.client.0.vm05.stdout:2/498: mknod d0/d8/d43/d38/ca0 0 2026-03-10T07:50:51.821 INFO:tasks.workunit.client.0.vm05.stdout:7/442: stat d1/d34/f7d 0 2026-03-10T07:50:51.821 INFO:tasks.workunit.client.0.vm05.stdout:7/443: chown d1/d6/f41 21 1 2026-03-10T07:50:51.825 INFO:tasks.workunit.client.0.vm05.stdout:6/433: dwrite d0/d35/d36/f5b [4194304,4194304] 0 2026-03-10T07:50:51.826 INFO:tasks.workunit.client.0.vm05.stdout:6/434: chown d0/d35/d36/f71 11 1 2026-03-10T07:50:51.826 INFO:tasks.workunit.client.0.vm05.stdout:2/499: dwrite d0/d8/f1c [0,4194304] 0 2026-03-10T07:50:51.829 INFO:tasks.workunit.client.0.vm05.stdout:1/445: truncate da/d26/d2b/f45 3054544 0 2026-03-10T07:50:51.829 INFO:tasks.workunit.client.0.vm05.stdout:3/405: creat d8/d16/f88 x:0 0 0 2026-03-10T07:50:51.830 INFO:tasks.workunit.client.0.vm05.stdout:1/446: dread - da/dd/d12/d34/f5f zero size 2026-03-10T07:50:51.833 INFO:tasks.workunit.client.0.vm05.stdout:1/447: truncate da/dd/d12/d34/f5f 27766 0 2026-03-10T07:50:51.834 INFO:tasks.workunit.client.0.vm05.stdout:3/406: dwrite d8/d16/f4c [0,4194304] 0 2026-03-10T07:50:51.846 INFO:tasks.workunit.client.0.vm05.stdout:9/426: write d8/d35/d1c/d20/d59/d8b/f39 [1186313,81703] 0 2026-03-10T07:50:51.858 INFO:tasks.workunit.client.0.vm05.stdout:5/443: write d2/d20/f4f [179348,54152] 0 2026-03-10T07:50:51.875 INFO:tasks.workunit.client.0.vm05.stdout:4/465: dwrite d0/d28/f33 [0,4194304] 0 2026-03-10T07:50:51.875 INFO:tasks.workunit.client.0.vm05.stdout:4/466: fsync d0/d3b/f53 0 2026-03-10T07:50:51.875 INFO:tasks.workunit.client.0.vm05.stdout:4/467: write d0/d6/d9/d12/d45/d55/f7d [737646,114758] 0 2026-03-10T07:50:51.875 INFO:tasks.workunit.client.0.vm05.stdout:4/468: dwrite d0/d3b/f53 [0,4194304] 0 
2026-03-10T07:50:51.875 INFO:tasks.workunit.client.0.vm05.stdout:4/469: read - d0/d6/fa0 zero size 2026-03-10T07:50:51.882 INFO:tasks.workunit.client.0.vm05.stdout:7/444: chown d1/d3c/l62 0 1 2026-03-10T07:50:51.882 INFO:tasks.workunit.client.0.vm05.stdout:7/445: write d1/d6/f1d [3180599,7238] 0 2026-03-10T07:50:51.883 INFO:tasks.workunit.client.0.vm05.stdout:7/446: write d1/d6/d47/f7b [524630,6809] 0 2026-03-10T07:50:51.885 INFO:tasks.workunit.client.0.vm05.stdout:6/435: truncate d0/d35/d36/d43/f47 514009 0 2026-03-10T07:50:51.885 INFO:tasks.workunit.client.0.vm05.stdout:2/500: mknod d0/d8/d3d/d7d/ca1 0 2026-03-10T07:50:51.887 INFO:tasks.workunit.client.0.vm05.stdout:3/407: creat d8/d1f/d2a/d4a/f89 x:0 0 0 2026-03-10T07:50:51.892 INFO:tasks.workunit.client.0.vm05.stdout:3/408: dwrite d8/d16/f82 [0,4194304] 0 2026-03-10T07:50:51.913 INFO:tasks.workunit.client.0.vm05.stdout:3/409: fsync d8/d1f/d2a/d4a/f4f 0 2026-03-10T07:50:51.913 INFO:tasks.workunit.client.0.vm05.stdout:3/410: write d8/fe [8724003,115112] 0 2026-03-10T07:50:51.913 INFO:tasks.workunit.client.0.vm05.stdout:4/470: unlink d0/d6/d9/d12/d45/d55/l38 0 2026-03-10T07:50:51.913 INFO:tasks.workunit.client.0.vm05.stdout:6/436: unlink d0/f3a 0 2026-03-10T07:50:51.913 INFO:tasks.workunit.client.0.vm05.stdout:0/400: link d8/dd/d10/d26/d2a/c41 d8/dd/d10/d26/d8b/c8e 0 2026-03-10T07:50:51.913 INFO:tasks.workunit.client.0.vm05.stdout:4/471: dwrite d0/d6/d60/f72 [0,4194304] 0 2026-03-10T07:50:51.916 INFO:tasks.workunit.client.0.vm05.stdout:6/437: unlink d0/d6/c85 0 2026-03-10T07:50:51.918 INFO:tasks.workunit.client.0.vm05.stdout:1/448: rmdir da/dd/d2a/d7c 0 2026-03-10T07:50:51.921 INFO:tasks.workunit.client.0.vm05.stdout:8/392: rename d1/c1d to d1/dd/d18/d20/d2a/c75 0 2026-03-10T07:50:51.924 INFO:tasks.workunit.client.0.vm05.stdout:0/401: creat d8/dd/d10/d26/d2a/f8f x:0 0 0 2026-03-10T07:50:51.924 INFO:tasks.workunit.client.0.vm05.stdout:0/402: chown d8/dd/d34/c35 1536 1 2026-03-10T07:50:51.925 
INFO:tasks.workunit.client.0.vm05.stdout:0/403: dread - d8/dd/d37/f4f zero size 2026-03-10T07:50:51.926 INFO:tasks.workunit.client.0.vm05.stdout:9/427: sync 2026-03-10T07:50:51.926 INFO:tasks.workunit.client.0.vm05.stdout:2/501: sync 2026-03-10T07:50:51.927 INFO:tasks.workunit.client.0.vm05.stdout:2/502: chown d0/d8/d43/df/l96 156079424 1 2026-03-10T07:50:51.928 INFO:tasks.workunit.client.0.vm05.stdout:2/503: chown d0/d8/d43/d38/f56 21501349 1 2026-03-10T07:50:51.931 INFO:tasks.workunit.client.0.vm05.stdout:8/393: rmdir d1/dd/d18/d20/d2a/d34/d49 39 2026-03-10T07:50:51.943 INFO:tasks.workunit.client.0.vm05.stdout:3/411: rename d8/d1f/d2a/d4b to d8/d1f/d24/d8a 0 2026-03-10T07:50:51.944 INFO:tasks.workunit.client.0.vm05.stdout:3/412: write d8/d1f/d2a/f66 [2857090,2553] 0 2026-03-10T07:50:51.944 INFO:tasks.workunit.client.0.vm05.stdout:8/394: write d1/dd/d18/d20/d2a/d48/f50 [348574,57559] 0 2026-03-10T07:50:51.944 INFO:tasks.workunit.client.0.vm05.stdout:8/395: dread d1/dd/f25 [0,4194304] 0 2026-03-10T07:50:51.944 INFO:tasks.workunit.client.0.vm05.stdout:8/396: chown d1/dd/d18/d20/d2a/d34/l36 936804 1 2026-03-10T07:50:51.944 INFO:tasks.workunit.client.0.vm05.stdout:6/438: truncate d0/d11/d57/f5f 3352149 0 2026-03-10T07:50:51.944 INFO:tasks.workunit.client.0.vm05.stdout:9/428: dwrite d8/d35/d22/d33/f41 [0,4194304] 0 2026-03-10T07:50:51.944 INFO:tasks.workunit.client.0.vm05.stdout:6/439: fdatasync d0/d11/d57/d66/f7b 0 2026-03-10T07:50:51.944 INFO:tasks.workunit.client.0.vm05.stdout:1/449: link da/dd/d2a/f54 da/dd/d2a/d55/d64/f81 0 2026-03-10T07:50:51.944 INFO:tasks.workunit.client.0.vm05.stdout:1/450: stat da/f5d 0 2026-03-10T07:50:51.945 INFO:tasks.workunit.client.0.vm05.stdout:9/429: truncate d8/d35/d1c/d75/f8e 188884 0 2026-03-10T07:50:51.945 INFO:tasks.workunit.client.0.vm05.stdout:4/472: creat d0/fa3 x:0 0 0 2026-03-10T07:50:51.946 INFO:tasks.workunit.client.0.vm05.stdout:3/413: write d8/d22/f86 [450963,121613] 0 2026-03-10T07:50:51.947 
INFO:tasks.workunit.client.0.vm05.stdout:8/397: creat d1/dd/d4d/d64/d6a/f76 x:0 0 0 2026-03-10T07:50:51.947 INFO:tasks.workunit.client.0.vm05.stdout:0/404: write d8/dd/d37/d56/f18 [2820120,61928] 0 2026-03-10T07:50:51.948 INFO:tasks.workunit.client.0.vm05.stdout:2/504: dread d0/d8/d43/df/d53/f7c [0,4194304] 0 2026-03-10T07:50:51.949 INFO:tasks.workunit.client.0.vm05.stdout:0/405: chown d8/fb 5075493 1 2026-03-10T07:50:51.952 INFO:tasks.workunit.client.0.vm05.stdout:6/440: read d0/d6/f1d [416118,111040] 0 2026-03-10T07:50:51.954 INFO:tasks.workunit.client.0.vm05.stdout:6/441: dread d0/d11/f1c [0,4194304] 0 2026-03-10T07:50:51.960 INFO:tasks.workunit.client.0.vm05.stdout:4/473: dwrite d0/d6/d9/d12/d65/f8a [0,4194304] 0 2026-03-10T07:50:51.962 INFO:tasks.workunit.client.0.vm05.stdout:4/474: write d0/d6/f94 [427083,74820] 0 2026-03-10T07:50:51.964 INFO:tasks.workunit.client.0.vm05.stdout:1/451: readlink da/dd/d2a/d55/d64/l6d 0 2026-03-10T07:50:51.978 INFO:tasks.workunit.client.0.vm05.stdout:3/414: mknod d8/d22/d60/d58/c8b 0 2026-03-10T07:50:51.978 INFO:tasks.workunit.client.0.vm05.stdout:2/505: unlink d0/d8/d43/df/c86 0 2026-03-10T07:50:51.978 INFO:tasks.workunit.client.0.vm05.stdout:1/452: stat da/dd/d12/d19/l29 0 2026-03-10T07:50:51.978 INFO:tasks.workunit.client.0.vm05.stdout:1/453: write da/dd/d12/d34/f5f [445858,2719] 0 2026-03-10T07:50:51.978 INFO:tasks.workunit.client.0.vm05.stdout:0/406: symlink d8/l90 0 2026-03-10T07:50:51.978 INFO:tasks.workunit.client.0.vm05.stdout:6/442: symlink d0/d35/d36/l8a 0 2026-03-10T07:50:51.978 INFO:tasks.workunit.client.0.vm05.stdout:3/415: mkdir d8/d1f/d2a/d4a/d8c 0 2026-03-10T07:50:51.978 INFO:tasks.workunit.client.0.vm05.stdout:6/443: mknod d0/d35/d36/d43/c8b 0 2026-03-10T07:50:51.978 INFO:tasks.workunit.client.0.vm05.stdout:9/430: creat d8/d86/d28/d79/f91 x:0 0 0 2026-03-10T07:50:51.980 INFO:tasks.workunit.client.0.vm05.stdout:9/431: read f6 [613486,121853] 0 2026-03-10T07:50:51.980 
INFO:tasks.workunit.client.0.vm05.stdout:3/416: mknod d8/d1f/d2a/d4a/d7d/c8d 0 2026-03-10T07:50:51.980 INFO:tasks.workunit.client.0.vm05.stdout:0/407: dwrite d8/fb [0,4194304] 0 2026-03-10T07:50:51.983 INFO:tasks.workunit.client.0.vm05.stdout:0/408: chown d8/dd/d7a 0 1 2026-03-10T07:50:51.984 INFO:tasks.workunit.client.0.vm05.stdout:0/409: chown d8/dd/d37/d67/l57 4147112 1 2026-03-10T07:50:51.987 INFO:tasks.workunit.client.0.vm05.stdout:1/454: unlink da/dd/d12/d19/c3c 0 2026-03-10T07:50:51.991 INFO:tasks.workunit.client.0.vm05.stdout:6/444: rename d0/d6/cc to d0/d35/c8c 0 2026-03-10T07:50:51.992 INFO:tasks.workunit.client.0.vm05.stdout:6/445: chown d0/d11/d22/d69/l88 3 1 2026-03-10T07:50:51.992 INFO:tasks.workunit.client.0.vm05.stdout:6/446: write d0/d6/f44 [257295,66955] 0 2026-03-10T07:50:51.995 INFO:tasks.workunit.client.0.vm05.stdout:4/475: link d0/d6/d9/d12/d45/d55/l43 d0/d6/d6f/la4 0 2026-03-10T07:50:51.999 INFO:tasks.workunit.client.0.vm05.stdout:3/417: unlink d8/d1c/c20 0 2026-03-10T07:50:52.001 INFO:tasks.workunit.client.0.vm05.stdout:0/410: creat d8/dd/d37/d81/f91 x:0 0 0 2026-03-10T07:50:52.005 INFO:tasks.workunit.client.0.vm05.stdout:0/411: dread d8/dd/d10/d26/d2a/f6e [0,4194304] 0 2026-03-10T07:50:52.009 INFO:tasks.workunit.client.0.vm05.stdout:6/447: creat d0/d6/d3b/f8d x:0 0 0 2026-03-10T07:50:52.016 INFO:tasks.workunit.client.0.vm05.stdout:4/476: creat d0/d6/d9/d12/d69/fa5 x:0 0 0 2026-03-10T07:50:52.018 INFO:tasks.workunit.client.0.vm05.stdout:3/418: creat d8/d22/d60/f8e x:0 0 0 2026-03-10T07:50:52.018 INFO:tasks.workunit.client.0.vm05.stdout:3/419: write d8/d1c/d64/f72 [390175,16590] 0 2026-03-10T07:50:52.019 INFO:tasks.workunit.client.0.vm05.stdout:3/420: dread - d8/d16/f67 zero size 2026-03-10T07:50:52.020 INFO:tasks.workunit.client.0.vm05.stdout:3/421: chown d8/d22/d54/f75 237580842 1 2026-03-10T07:50:52.037 INFO:tasks.workunit.client.0.vm05.stdout:8/398: dread d1/dd/f11 [0,4194304] 0 2026-03-10T07:50:52.041 
INFO:tasks.workunit.client.0.vm05.stdout:2/506: dread d0/d8/d43/df/d8b/f99 [0,4194304] 0 2026-03-10T07:50:52.060 INFO:tasks.workunit.client.0.vm05.stdout:9/432: rename d8/d35/d1c/d2c/f2e to d8/d86/f92 0 2026-03-10T07:50:52.060 INFO:tasks.workunit.client.0.vm05.stdout:4/477: mkdir d0/d6/da6 0 2026-03-10T07:50:52.060 INFO:tasks.workunit.client.0.vm05.stdout:4/478: chown d0/d6/d9/d12/f36 85770 1 2026-03-10T07:50:52.060 INFO:tasks.workunit.client.0.vm05.stdout:4/479: chown d0/d3b/f4c 322207 1 2026-03-10T07:50:52.060 INFO:tasks.workunit.client.0.vm05.stdout:4/480: read - d0/f41 zero size 2026-03-10T07:50:52.060 INFO:tasks.workunit.client.0.vm05.stdout:5/444: dwrite d2/f8 [0,4194304] 0 2026-03-10T07:50:52.061 INFO:tasks.workunit.client.0.vm05.stdout:8/399: creat d1/d52/f77 x:0 0 0 2026-03-10T07:50:52.064 INFO:tasks.workunit.client.0.vm05.stdout:1/455: rename c5 to da/dd/d2a/d55/c82 0 2026-03-10T07:50:52.070 INFO:tasks.workunit.client.0.vm05.stdout:1/456: dread da/dd/f77 [0,4194304] 0 2026-03-10T07:50:52.079 INFO:tasks.workunit.client.0.vm05.stdout:5/445: dwrite d2/d5/d61/f8b [0,4194304] 0 2026-03-10T07:50:52.083 INFO:tasks.workunit.client.0.vm05.stdout:8/400: dread d1/dd/d18/f22 [0,4194304] 0 2026-03-10T07:50:52.090 INFO:tasks.workunit.client.0.vm05.stdout:3/422: rename d8/d1f/d2a/d34/d5f to d8/d8f 0 2026-03-10T07:50:52.090 INFO:tasks.workunit.client.0.vm05.stdout:1/457: fdatasync da/d26/f2d 0 2026-03-10T07:50:52.090 INFO:tasks.workunit.client.0.vm05.stdout:1/458: write da/dd/d12/d19/d20/f6c [934434,44382] 0 2026-03-10T07:50:52.092 INFO:tasks.workunit.client.0.vm05.stdout:7/447: write d1/d34/f7c [1659716,113770] 0 2026-03-10T07:50:52.098 INFO:tasks.workunit.client.0.vm05.stdout:5/446: symlink d2/d20/d33/d53/d7d/l9d 0 2026-03-10T07:50:52.098 INFO:tasks.workunit.client.0.vm05.stdout:5/447: stat d2/d12/d2d/d4a/f59 0 2026-03-10T07:50:52.099 INFO:tasks.workunit.client.0.vm05.stdout:5/448: write d2/d20/d33/d53/d7d/f82 [926107,69952] 0 2026-03-10T07:50:52.101 
INFO:tasks.workunit.client.0.vm05.stdout:8/401: readlink d1/dd/d18/d20/d2a/d34/d49/l71 0 2026-03-10T07:50:52.101 INFO:tasks.workunit.client.0.vm05.stdout:8/402: dread - d1/d52/f77 zero size 2026-03-10T07:50:52.106 INFO:tasks.workunit.client.0.vm05.stdout:7/448: mknod d1/d34/d59/c88 0 2026-03-10T07:50:52.106 INFO:tasks.workunit.client.0.vm05.stdout:7/449: write d1/d6/d47/f7b [1195880,59034] 0 2026-03-10T07:50:52.108 INFO:tasks.workunit.client.0.vm05.stdout:5/449: creat d2/d12/d2d/f9e x:0 0 0 2026-03-10T07:50:52.109 INFO:tasks.workunit.client.0.vm05.stdout:0/412: rename d8/dd/c27 to d8/dd/d37/d56/d4d/c92 0 2026-03-10T07:50:52.110 INFO:tasks.workunit.client.0.vm05.stdout:0/413: readlink d8/dd/d10/d26/d3a/d5e/l6b 0 2026-03-10T07:50:52.111 INFO:tasks.workunit.client.0.vm05.stdout:7/450: creat d1/d3c/f89 x:0 0 0 2026-03-10T07:50:52.113 INFO:tasks.workunit.client.0.vm05.stdout:5/450: mknod d2/d5/d61/c9f 0 2026-03-10T07:50:52.114 INFO:tasks.workunit.client.0.vm05.stdout:8/403: dread d1/f2c [0,4194304] 0 2026-03-10T07:50:52.115 INFO:tasks.workunit.client.0.vm05.stdout:4/481: rename d0/d3b to d0/d20/da7 0 2026-03-10T07:50:52.116 INFO:tasks.workunit.client.0.vm05.stdout:0/414: mknod d8/dd/d10/d26/d2a/d6f/c93 0 2026-03-10T07:50:52.127 INFO:tasks.workunit.client.0.vm05.stdout:7/451: stat d1/d6/f31 0 2026-03-10T07:50:52.127 INFO:tasks.workunit.client.0.vm05.stdout:7/452: write d1/d6/f58 [1253241,37831] 0 2026-03-10T07:50:52.128 INFO:tasks.workunit.client.0.vm05.stdout:7/453: readlink d1/d6/d47/l4a 0 2026-03-10T07:50:52.129 INFO:tasks.workunit.client.0.vm05.stdout:1/459: read da/d26/f33 [4022030,122474] 0 2026-03-10T07:50:52.133 INFO:tasks.workunit.client.0.vm05.stdout:3/423: dread d8/d1f/f6c [0,4194304] 0 2026-03-10T07:50:52.134 INFO:tasks.workunit.client.0.vm05.stdout:3/424: truncate d8/d1f/d2a/d4a/d7d/f7e 246469 0 2026-03-10T07:50:52.135 INFO:tasks.workunit.client.0.vm05.stdout:3/425: chown d8/d1f/l6d 13470 1 2026-03-10T07:50:52.141 
INFO:tasks.workunit.client.0.vm05.stdout:2/507: rmdir d0 39 2026-03-10T07:50:52.143 INFO:tasks.workunit.client.0.vm05.stdout:6/448: truncate d0/d6/f1d 621173 0 2026-03-10T07:50:52.152 INFO:tasks.workunit.client.0.vm05.stdout:4/482: rename d0/d6/d9/d12/d45/d55/d44/f74 to d0/d6/d9/d5a/d6e/fa8 0 2026-03-10T07:50:52.152 INFO:tasks.workunit.client.0.vm05.stdout:0/415: creat d8/dd/d10/d26/d2a/f94 x:0 0 0 2026-03-10T07:50:52.153 INFO:tasks.workunit.client.0.vm05.stdout:9/433: write d8/d35/d1c/f4c [360481,3618] 0 2026-03-10T07:50:52.160 INFO:tasks.workunit.client.0.vm05.stdout:1/460: dwrite da/dd/d12/f16 [0,4194304] 0 2026-03-10T07:50:52.162 INFO:tasks.workunit.client.0.vm05.stdout:1/461: dread da/dd/d2a/f63 [0,4194304] 0 2026-03-10T07:50:52.163 INFO:tasks.workunit.client.0.vm05.stdout:1/462: write da/f43 [2275414,110900] 0 2026-03-10T07:50:52.178 INFO:tasks.workunit.client.0.vm05.stdout:3/426: unlink d8/l17 0 2026-03-10T07:50:52.178 INFO:tasks.workunit.client.0.vm05.stdout:3/427: fdatasync d8/f12 0 2026-03-10T07:50:52.179 INFO:tasks.workunit.client.0.vm05.stdout:2/508: truncate d0/d8/f7b 1037762 0 2026-03-10T07:50:52.181 INFO:tasks.workunit.client.0.vm05.stdout:8/404: mknod d1/dd/d18/d20/d2a/d34/c78 0 2026-03-10T07:50:52.182 INFO:tasks.workunit.client.0.vm05.stdout:8/405: write d1/dd/d18/f70 [120689,105752] 0 2026-03-10T07:50:52.187 INFO:tasks.workunit.client.0.vm05.stdout:4/483: unlink d0/d6/d9/f54 0 2026-03-10T07:50:52.188 INFO:tasks.workunit.client.0.vm05.stdout:4/484: truncate d0/d20/da7/f9a 475395 0 2026-03-10T07:50:52.189 INFO:tasks.workunit.client.0.vm05.stdout:0/416: unlink d8/dd/d37/d56/f3b 0 2026-03-10T07:50:52.191 INFO:tasks.workunit.client.0.vm05.stdout:7/454: truncate d1/d3c/f63 234387 0 2026-03-10T07:50:52.193 INFO:tasks.workunit.client.0.vm05.stdout:9/434: dread d8/d86/d28/f29 [0,4194304] 0 2026-03-10T07:50:52.194 INFO:tasks.workunit.client.0.vm05.stdout:5/451: write d2/d5/f10 [55065,111361] 0 2026-03-10T07:50:52.199 
INFO:tasks.workunit.client.0.vm05.stdout:3/428: creat d8/d8f/f90 x:0 0 0 2026-03-10T07:50:52.202 INFO:tasks.workunit.client.0.vm05.stdout:5/452: dread d2/d12/d4d/f5d [0,4194304] 0 2026-03-10T07:50:52.204 INFO:tasks.workunit.client.0.vm05.stdout:3/429: dwrite d8/d16/f2d [0,4194304] 0 2026-03-10T07:50:52.214 INFO:tasks.workunit.client.0.vm05.stdout:4/485: rename d0/d20/c22 to d0/d20/da7/d96/ca9 0 2026-03-10T07:50:52.224 INFO:tasks.workunit.client.0.vm05.stdout:9/435: unlink d8/d35/d22/d33/d62/l89 0 2026-03-10T07:50:52.224 INFO:tasks.workunit.client.0.vm05.stdout:6/449: link d0/f15 d0/d11/d2e/f8e 0 2026-03-10T07:50:52.225 INFO:tasks.workunit.client.0.vm05.stdout:4/486: symlink d0/d6/d9/d12/d45/d55/d44/laa 0 2026-03-10T07:50:52.226 INFO:tasks.workunit.client.0.vm05.stdout:4/487: read - d0/d6/d9/d5a/f88 zero size 2026-03-10T07:50:52.226 INFO:tasks.workunit.client.0.vm05.stdout:4/488: chown d0/d6/d37 0 1 2026-03-10T07:50:52.227 INFO:tasks.workunit.client.0.vm05.stdout:7/455: dwrite d1/d6/d3b/f42 [0,4194304] 0 2026-03-10T07:50:52.228 INFO:tasks.workunit.client.0.vm05.stdout:7/456: dread - d1/d3c/f89 zero size 2026-03-10T07:50:52.228 INFO:tasks.workunit.client.0.vm05.stdout:7/457: fdatasync d1/d6/f84 0 2026-03-10T07:50:52.233 INFO:tasks.workunit.client.0.vm05.stdout:5/453: symlink d2/d12/la0 0 2026-03-10T07:50:52.233 INFO:tasks.workunit.client.0.vm05.stdout:6/450: rename d0/d6/d3b/f8d to d0/d6/d3b/f8f 0 2026-03-10T07:50:52.240 INFO:tasks.workunit.client.0.vm05.stdout:7/458: rmdir d1/d6/d80/d82 39 2026-03-10T07:50:52.242 INFO:tasks.workunit.client.0.vm05.stdout:5/454: dwrite d2/d20/d4c/f9a [0,4194304] 0 2026-03-10T07:50:52.247 INFO:tasks.workunit.client.0.vm05.stdout:2/509: getdents d0/d8/d3d/d7d 0 2026-03-10T07:50:52.250 INFO:tasks.workunit.client.0.vm05.stdout:0/417: getdents d8/dd/d10/d26/d3a/d5e/d63 0 2026-03-10T07:50:52.252 INFO:tasks.workunit.client.0.vm05.stdout:5/455: rename d2/d20/d77/d93 to d2/d20/d33/d86/d8d/da1 0 2026-03-10T07:50:52.261 
INFO:tasks.workunit.client.0.vm05.stdout:1/463: sync 2026-03-10T07:50:52.261 INFO:tasks.workunit.client.0.vm05.stdout:8/406: sync 2026-03-10T07:50:52.264 INFO:tasks.workunit.client.0.vm05.stdout:8/407: chown d1/dd/d18/d20/d2a/d48/f50 52221 1 2026-03-10T07:50:52.265 INFO:tasks.workunit.client.0.vm05.stdout:5/456: dread d2/d20/d33/d53/f97 [0,4194304] 0 2026-03-10T07:50:52.266 INFO:tasks.workunit.client.0.vm05.stdout:5/457: read - d2/d12/d2d/d4a/f99 zero size 2026-03-10T07:50:52.271 INFO:tasks.workunit.client.0.vm05.stdout:2/510: symlink d0/d47/la2 0 2026-03-10T07:50:52.276 INFO:tasks.workunit.client.0.vm05.stdout:9/436: link d8/d35/d22/d33/c53 d8/d35/d1c/d20/d54/c93 0 2026-03-10T07:50:52.277 INFO:tasks.workunit.client.0.vm05.stdout:9/437: write d8/d35/d22/d33/d62/f7d [586045,124080] 0 2026-03-10T07:50:52.277 INFO:tasks.workunit.client.0.vm05.stdout:9/438: chown d8/d35/d1c/d20/c2d 72 1 2026-03-10T07:50:52.277 INFO:tasks.workunit.client.0.vm05.stdout:1/464: rename da/d26/f2d to da/dd/d2a/d70/f83 0 2026-03-10T07:50:52.285 INFO:tasks.workunit.client.0.vm05.stdout:1/465: dwrite da/dd/d2a/f2f [0,4194304] 0 2026-03-10T07:50:52.286 INFO:tasks.workunit.client.0.vm05.stdout:1/466: write da/dd/d12/d19/d20/f79 [2237890,32597] 0 2026-03-10T07:50:52.287 INFO:tasks.workunit.client.0.vm05.stdout:1/467: chown da/dd/d2a/d55/d68/l78 105327 1 2026-03-10T07:50:52.292 INFO:tasks.workunit.client.0.vm05.stdout:4/489: dread d0/d6/d9/d12/d45/d55/f5f [0,4194304] 0 2026-03-10T07:50:52.294 INFO:tasks.workunit.client.0.vm05.stdout:0/418: dread d8/dd/d34/f5b [0,4194304] 0 2026-03-10T07:50:52.294 INFO:tasks.workunit.client.0.vm05.stdout:4/490: write d0/d20/da7/f53 [324481,123326] 0 2026-03-10T07:50:52.295 INFO:tasks.workunit.client.0.vm05.stdout:0/419: chown d8/dd/d10/d26/d3a/d5e/l6b 4914 1 2026-03-10T07:50:52.300 INFO:tasks.workunit.client.0.vm05.stdout:2/511: symlink d0/d8/d43/df/d8b/la3 0 2026-03-10T07:50:52.309 INFO:tasks.workunit.client.0.vm05.stdout:0/420: mknod d8/dd/d37/d81/c95 0 
2026-03-10T07:50:52.314 INFO:tasks.workunit.client.0.vm05.stdout:5/458: mknod d2/d20/ca2 0 2026-03-10T07:50:52.315 INFO:tasks.workunit.client.0.vm05.stdout:1/468: mknod da/d26/d2b/c84 0 2026-03-10T07:50:52.315 INFO:tasks.workunit.client.0.vm05.stdout:4/491: mknod d0/d6/d9/d12/d45/cab 0 2026-03-10T07:50:52.316 INFO:tasks.workunit.client.0.vm05.stdout:4/492: dread - d0/d6/d9/d12/d69/fa5 zero size 2026-03-10T07:50:52.322 INFO:tasks.workunit.client.0.vm05.stdout:4/493: dwrite d0/d20/f63 [0,4194304] 0 2026-03-10T07:50:52.322 INFO:tasks.workunit.client.0.vm05.stdout:4/494: fdatasync d0/fa3 0 2026-03-10T07:50:52.330 INFO:tasks.workunit.client.0.vm05.stdout:0/421: rmdir d8/dd/d10/d26/d48 39 2026-03-10T07:50:52.332 INFO:tasks.workunit.client.0.vm05.stdout:0/422: dwrite d8/dd/d37/f38 [0,4194304] 0 2026-03-10T07:50:52.347 INFO:tasks.workunit.client.0.vm05.stdout:5/459: write d2/d20/d33/d53/d7d/f7e [348526,67668] 0 2026-03-10T07:50:52.349 INFO:tasks.workunit.client.0.vm05.stdout:3/430: dwrite d8/d1c/f2b [0,4194304] 0 2026-03-10T07:50:52.350 INFO:tasks.workunit.client.0.vm05.stdout:0/423: fdatasync d8/dd/d10/f19 0 2026-03-10T07:50:52.355 INFO:tasks.workunit.client.0.vm05.stdout:1/469: sync 2026-03-10T07:50:52.359 INFO:tasks.workunit.client.0.vm05.stdout:3/431: dwrite d8/d1f/d2a/d34/f39 [0,4194304] 0 2026-03-10T07:50:52.365 INFO:tasks.workunit.client.0.vm05.stdout:0/424: mkdir d8/dd/d37/d67/d96 0 2026-03-10T07:50:52.370 INFO:tasks.workunit.client.0.vm05.stdout:0/425: symlink d8/dd/d10/d26/d3a/l97 0 2026-03-10T07:50:52.373 INFO:tasks.workunit.client.0.vm05.stdout:1/470: mknod da/dd/c85 0 2026-03-10T07:50:52.377 INFO:tasks.workunit.client.0.vm05.stdout:3/432: link d8/d22/f29 d8/d1f/d24/d8a/f91 0 2026-03-10T07:50:52.377 INFO:tasks.workunit.client.0.vm05.stdout:3/433: write d8/d16/d19/f21 [899136,69099] 0 2026-03-10T07:50:52.381 INFO:tasks.workunit.client.0.vm05.stdout:3/434: dwrite d8/f5d [0,4194304] 0 2026-03-10T07:50:52.387 INFO:tasks.workunit.client.0.vm05.stdout:3/435: read f4 
[3893303,101013] 0 2026-03-10T07:50:52.388 INFO:tasks.workunit.client.0.vm05.stdout:3/436: chown d8/d1c/d48/f70 169696161 1 2026-03-10T07:50:52.389 INFO:tasks.workunit.client.0.vm05.stdout:5/460: getdents d2/d12/d2d 0 2026-03-10T07:50:52.393 INFO:tasks.workunit.client.0.vm05.stdout:3/437: mknod d8/d22/d60/d6e/c92 0 2026-03-10T07:50:52.396 INFO:tasks.workunit.client.0.vm05.stdout:5/461: rename d2/d5/d61/c9f to d2/d20/d33/d86/ca3 0 2026-03-10T07:50:52.397 INFO:tasks.workunit.client.0.vm05.stdout:5/462: fdatasync d2/d20/d4c/f8c 0 2026-03-10T07:50:52.398 INFO:tasks.workunit.client.0.vm05.stdout:5/463: write d2/d20/d33/d53/d7d/f9b [18603,74143] 0 2026-03-10T07:50:52.398 INFO:tasks.workunit.client.0.vm05.stdout:5/464: dread - d2/d20/d33/f88 zero size 2026-03-10T07:50:52.398 INFO:tasks.workunit.client.0.vm05.stdout:5/465: stat d2/d20/d5b/c67 0 2026-03-10T07:50:52.402 INFO:tasks.workunit.client.0.vm05.stdout:1/471: mkdir da/dd/d12/d86 0 2026-03-10T07:50:52.412 INFO:tasks.workunit.client.0.vm05.stdout:5/466: dwrite d2/d20/d4c/d64/f96 [0,4194304] 0 2026-03-10T07:50:52.413 INFO:tasks.workunit.client.0.vm05.stdout:5/467: chown d2/d20/d33/d72/l94 43104478 1 2026-03-10T07:50:52.413 INFO:tasks.workunit.client.0.vm05.stdout:1/472: truncate da/dd/d2a/d55/d64/f7a 979279 0 2026-03-10T07:50:52.413 INFO:tasks.workunit.client.0.vm05.stdout:5/468: chown d2/d12/d2d/c95 255361812 1 2026-03-10T07:50:52.413 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:50:52 vm05.local ceph-mon[50387]: pgmap v15: 65 pgs: 65 active+clean; 840 MiB data, 3.1 GiB used, 117 GiB / 120 GiB avail; 18 MiB/s rd, 98 MiB/s wr, 212 op/s 2026-03-10T07:50:52.413 INFO:tasks.workunit.client.0.vm05.stdout:5/469: dread d2/d20/f2a [4194304,4194304] 0 2026-03-10T07:50:52.413 INFO:tasks.workunit.client.0.vm05.stdout:3/438: mknod d8/c93 0 2026-03-10T07:50:52.415 INFO:tasks.workunit.client.0.vm05.stdout:3/439: read d8/d22/f29 [722585,92684] 0 2026-03-10T07:50:52.418 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:50:52 
vm08.local ceph-mon[59917]: pgmap v15: 65 pgs: 65 active+clean; 840 MiB data, 3.1 GiB used, 117 GiB / 120 GiB avail; 18 MiB/s rd, 98 MiB/s wr, 212 op/s 2026-03-10T07:50:52.420 INFO:tasks.workunit.client.0.vm05.stdout:3/440: dwrite d8/d22/d60/f50 [0,4194304] 0 2026-03-10T07:50:52.421 INFO:tasks.workunit.client.0.vm05.stdout:3/441: fsync d8/d1f/d2a/f66 0 2026-03-10T07:50:52.425 INFO:tasks.workunit.client.0.vm05.stdout:7/459: dwrite d1/d3c/d4b/f4f [0,4194304] 0 2026-03-10T07:50:52.430 INFO:tasks.workunit.client.0.vm05.stdout:7/460: write d1/d6/f1d [4370905,128301] 0 2026-03-10T07:50:52.433 INFO:tasks.workunit.client.0.vm05.stdout:1/473: readlink da/dd/d2a/d55/d68/l6b 0 2026-03-10T07:50:52.436 INFO:tasks.workunit.client.0.vm05.stdout:3/442: creat d8/d1c/f94 x:0 0 0 2026-03-10T07:50:52.437 INFO:tasks.workunit.client.0.vm05.stdout:8/408: write d1/dd/d4d/d64/f67 [3582198,76145] 0 2026-03-10T07:50:52.438 INFO:tasks.workunit.client.0.vm05.stdout:9/439: truncate d8/d35/d22/d33/f73 334950 0 2026-03-10T07:50:52.443 INFO:tasks.workunit.client.0.vm05.stdout:3/443: creat d8/d1f/f95 x:0 0 0 2026-03-10T07:50:52.445 INFO:tasks.workunit.client.0.vm05.stdout:2/512: dwrite d0/f6 [4194304,4194304] 0 2026-03-10T07:50:52.447 INFO:tasks.workunit.client.0.vm05.stdout:8/409: creat d1/dd/d18/d20/d2a/d48/f79 x:0 0 0 2026-03-10T07:50:52.452 INFO:tasks.workunit.client.0.vm05.stdout:7/461: mkdir d1/d3c/d71/d79/d8a 0 2026-03-10T07:50:52.452 INFO:tasks.workunit.client.0.vm05.stdout:2/513: write d0/d47/d49/f9b [820809,122007] 0 2026-03-10T07:50:52.458 INFO:tasks.workunit.client.0.vm05.stdout:9/440: dwrite d8/d86/f92 [0,4194304] 0 2026-03-10T07:50:52.461 INFO:tasks.workunit.client.0.vm05.stdout:5/470: sync 2026-03-10T07:50:52.462 INFO:tasks.workunit.client.0.vm05.stdout:1/474: sync 2026-03-10T07:50:52.462 INFO:tasks.workunit.client.0.vm05.stdout:3/444: sync 2026-03-10T07:50:52.466 INFO:tasks.workunit.client.0.vm05.stdout:6/451: truncate d0/d6/f1d 651358 0 2026-03-10T07:50:52.470 
INFO:tasks.workunit.client.0.vm05.stdout:8/410: symlink d1/dd/d18/d20/d2a/l7a 0 2026-03-10T07:50:52.475 INFO:tasks.workunit.client.0.vm05.stdout:8/411: chown d1/dd/d18/d20/d2a/d48/d5a 1767999343 1 2026-03-10T07:50:52.475 INFO:tasks.workunit.client.0.vm05.stdout:1/475: dwrite da/dd/d12/d34/f5f [0,4194304] 0 2026-03-10T07:50:52.476 INFO:tasks.workunit.client.0.vm05.stdout:5/471: dwrite d2/d20/d33/d53/d7d/f9b [0,4194304] 0 2026-03-10T07:50:52.477 INFO:tasks.workunit.client.0.vm05.stdout:5/472: fsync d2/d20/d33/d53/d7d/f82 0 2026-03-10T07:50:52.493 INFO:tasks.workunit.client.0.vm05.stdout:3/445: rename d8/d22/d60/d58 to d8/d1f/d2a/d96 0 2026-03-10T07:50:52.495 INFO:tasks.workunit.client.0.vm05.stdout:3/446: write d8/f12 [9111213,17264] 0 2026-03-10T07:50:52.496 INFO:tasks.workunit.client.0.vm05.stdout:6/452: readlink d0/d6/l5d 0 2026-03-10T07:50:52.499 INFO:tasks.workunit.client.0.vm05.stdout:8/412: unlink d1/dd/d18/d20/d2a/d48/c5f 0 2026-03-10T07:50:52.501 INFO:tasks.workunit.client.0.vm05.stdout:5/473: rename d2/c29 to d2/d20/d7b/ca4 0 2026-03-10T07:50:52.502 INFO:tasks.workunit.client.0.vm05.stdout:9/441: truncate d8/d35/d1c/d75/f88 203650 0 2026-03-10T07:50:52.503 INFO:tasks.workunit.client.0.vm05.stdout:9/442: truncate d8/f8a 468364 0 2026-03-10T07:50:52.504 INFO:tasks.workunit.client.0.vm05.stdout:6/453: rmdir d0/d6/d3b 39 2026-03-10T07:50:52.505 INFO:tasks.workunit.client.0.vm05.stdout:8/413: rmdir d1/dd/d4d/d64/d6a 39 2026-03-10T07:50:52.511 INFO:tasks.workunit.client.0.vm05.stdout:8/414: creat d1/dd/d18/d20/d2a/d48/f7b x:0 0 0 2026-03-10T07:50:52.514 INFO:tasks.workunit.client.0.vm05.stdout:8/415: dread d1/dd/d4d/f61 [0,4194304] 0 2026-03-10T07:50:52.517 INFO:tasks.workunit.client.0.vm05.stdout:8/416: mkdir d1/dd/d18/d20/d2a/d48/d7c 0 2026-03-10T07:50:52.519 INFO:tasks.workunit.client.0.vm05.stdout:8/417: creat d1/d6f/f7d x:0 0 0 2026-03-10T07:50:52.519 INFO:tasks.workunit.client.0.vm05.stdout:8/418: fdatasync d1/dd/d18/f70 0 2026-03-10T07:50:52.520 
INFO:tasks.workunit.client.0.vm05.stdout:9/443: sync 2026-03-10T07:50:52.520 INFO:tasks.workunit.client.0.vm05.stdout:6/454: sync 2026-03-10T07:50:52.523 INFO:tasks.workunit.client.0.vm05.stdout:4/495: truncate d0/d6/d37/f46 5729 0 2026-03-10T07:50:52.523 INFO:tasks.workunit.client.0.vm05.stdout:4/496: stat d0/d6/d9/d12/d45/d55/d4e/c6d 0 2026-03-10T07:50:52.528 INFO:tasks.workunit.client.0.vm05.stdout:6/455: symlink d0/d11/d22/d69/l90 0 2026-03-10T07:50:52.538 INFO:tasks.workunit.client.0.vm05.stdout:6/456: fdatasync d0/d6/f45 0 2026-03-10T07:50:52.539 INFO:tasks.workunit.client.0.vm05.stdout:8/419: mknod d1/dd/d18/d20/d2a/d34/d49/c7e 0 2026-03-10T07:50:52.539 INFO:tasks.workunit.client.0.vm05.stdout:6/457: link d0/d11/d57/d66/f7b d0/d11/d4f/f91 0 2026-03-10T07:50:52.539 INFO:tasks.workunit.client.0.vm05.stdout:6/458: write d0/d11/d2e/f7f [727888,110339] 0 2026-03-10T07:50:52.539 INFO:tasks.workunit.client.0.vm05.stdout:4/497: link d0/d6/d9/l27 d0/d6/d9/d12/d45/d55/lac 0 2026-03-10T07:50:52.539 INFO:tasks.workunit.client.0.vm05.stdout:9/444: getdents d8 0 2026-03-10T07:50:52.539 INFO:tasks.workunit.client.0.vm05.stdout:8/420: mknod d1/dd/d18/d20/d2a/c7f 0 2026-03-10T07:50:52.541 INFO:tasks.workunit.client.0.vm05.stdout:4/498: rename d0/d6/d9/d5a/f88 to d0/d6/d95/fad 0 2026-03-10T07:50:52.543 INFO:tasks.workunit.client.0.vm05.stdout:6/459: mkdir d0/d11/d2e/d81/d92 0 2026-03-10T07:50:52.543 INFO:tasks.workunit.client.0.vm05.stdout:6/460: chown d0/d11/d22/f52 2 1 2026-03-10T07:50:52.544 INFO:tasks.workunit.client.0.vm05.stdout:4/499: write d0/d6/d9/d5a/d6e/fa8 [743276,22424] 0 2026-03-10T07:50:52.545 INFO:tasks.workunit.client.0.vm05.stdout:4/500: write d0/d6/d9/d5a/d6e/fa8 [531490,88324] 0 2026-03-10T07:50:52.546 INFO:tasks.workunit.client.0.vm05.stdout:9/445: mknod d8/d35/d1c/c94 0 2026-03-10T07:50:52.547 INFO:tasks.workunit.client.0.vm05.stdout:8/421: creat d1/dd/d18/d20/d2a/d48/d7c/f80 x:0 0 0 2026-03-10T07:50:52.550 INFO:tasks.workunit.client.0.vm05.stdout:4/501: 
rmdir d0/d6/d9/d12/d45/d55/d44/d85 39 2026-03-10T07:50:52.554 INFO:tasks.workunit.client.0.vm05.stdout:8/422: dwrite d1/dd/d18/d20/d2a/d48/f79 [0,4194304] 0 2026-03-10T07:50:52.556 INFO:tasks.workunit.client.0.vm05.stdout:8/423: chown d1/dd/d18 1037 1 2026-03-10T07:50:52.556 INFO:tasks.workunit.client.0.vm05.stdout:8/424: chown d1/dd/d18/d20/c35 9617278 1 2026-03-10T07:50:52.561 INFO:tasks.workunit.client.0.vm05.stdout:9/446: sync 2026-03-10T07:50:52.561 INFO:tasks.workunit.client.0.vm05.stdout:6/461: dread d0/d6/f10 [0,4194304] 0 2026-03-10T07:50:52.561 INFO:tasks.workunit.client.0.vm05.stdout:9/447: chown d8/d35/d1c 53 1 2026-03-10T07:50:52.561 INFO:tasks.workunit.client.0.vm05.stdout:6/462: dread - d0/d11/d4f/f91 zero size 2026-03-10T07:50:52.563 INFO:tasks.workunit.client.0.vm05.stdout:6/463: chown d0/d11/d2e/c42 57688057 1 2026-03-10T07:50:52.563 INFO:tasks.workunit.client.0.vm05.stdout:6/464: readlink d0/d11/d22/l5a 0 2026-03-10T07:50:52.564 INFO:tasks.workunit.client.0.vm05.stdout:6/465: dread - d0/d11/d57/f7a zero size 2026-03-10T07:50:52.567 INFO:tasks.workunit.client.0.vm05.stdout:4/502: rmdir d0/d20 39 2026-03-10T07:50:52.570 INFO:tasks.workunit.client.0.vm05.stdout:6/466: mkdir d0/d11/d57/d93 0 2026-03-10T07:50:52.573 INFO:tasks.workunit.client.0.vm05.stdout:4/503: chown d0/d20/da7/d5c/c87 3 1 2026-03-10T07:50:52.576 INFO:tasks.workunit.client.0.vm05.stdout:9/448: mkdir d8/d86/d95 0 2026-03-10T07:50:52.578 INFO:tasks.workunit.client.0.vm05.stdout:9/449: read - d8/d86/d28/d79/f91 zero size 2026-03-10T07:50:52.581 INFO:tasks.workunit.client.0.vm05.stdout:8/425: link d1/dd/d18/d20/f30 d1/d45/f81 0 2026-03-10T07:50:52.583 INFO:tasks.workunit.client.0.vm05.stdout:6/467: dread d0/d11/d4f/d56/f6f [0,4194304] 0 2026-03-10T07:50:52.583 INFO:tasks.workunit.client.0.vm05.stdout:6/468: chown d0/c38 7047 1 2026-03-10T07:50:52.585 INFO:tasks.workunit.client.0.vm05.stdout:6/469: dwrite d0/d11/d4f/f7e [0,4194304] 0 2026-03-10T07:50:52.596 
INFO:tasks.workunit.client.0.vm05.stdout:4/504: sync 2026-03-10T07:50:52.599 INFO:tasks.workunit.client.0.vm05.stdout:6/470: fdatasync d0/d6/f1d 0 2026-03-10T07:50:52.604 INFO:tasks.workunit.client.0.vm05.stdout:4/505: mknod d0/cae 0 2026-03-10T07:50:52.608 INFO:tasks.workunit.client.0.vm05.stdout:4/506: truncate d0/d6/f39 4345980 0 2026-03-10T07:50:52.608 INFO:tasks.workunit.client.0.vm05.stdout:4/507: creat d0/d6/d60/faf x:0 0 0 2026-03-10T07:50:52.608 INFO:tasks.workunit.client.0.vm05.stdout:8/426: link d1/l44 d1/dd/d18/d20/d2a/d34/l82 0 2026-03-10T07:50:52.612 INFO:tasks.workunit.client.0.vm05.stdout:8/427: rmdir d1/dd/d18/d20/d2a/d34/d49/d5d 39 2026-03-10T07:50:52.613 INFO:tasks.workunit.client.0.vm05.stdout:6/471: getdents d0/d11/d31 0 2026-03-10T07:50:52.623 INFO:tasks.workunit.client.0.vm05.stdout:0/426: write d8/dd/d34/f3d [431930,36050] 0 2026-03-10T07:50:52.624 INFO:tasks.workunit.client.0.vm05.stdout:0/427: write d8/dd/d10/f6c [2478978,79842] 0 2026-03-10T07:50:52.633 INFO:tasks.workunit.client.0.vm05.stdout:0/428: symlink d8/dd/d10/d26/d3a/d5e/d63/l98 0 2026-03-10T07:50:52.633 INFO:tasks.workunit.client.0.vm05.stdout:0/429: fsync d8/f1c 0 2026-03-10T07:50:52.635 INFO:tasks.workunit.client.0.vm05.stdout:4/508: rename d0/d28/l3f to d0/d6/lb0 0 2026-03-10T07:50:52.636 INFO:tasks.workunit.client.0.vm05.stdout:0/430: truncate d8/dd/d10/d26/d3a/d5e/f7b 128824 0 2026-03-10T07:50:52.636 INFO:tasks.workunit.client.0.vm05.stdout:8/428: creat d1/dd/d4d/d64/d6a/f83 x:0 0 0 2026-03-10T07:50:52.637 INFO:tasks.workunit.client.0.vm05.stdout:8/429: stat d1/dd/d5e 0 2026-03-10T07:50:52.644 INFO:tasks.workunit.client.0.vm05.stdout:8/430: dread d1/dd/f11 [0,4194304] 0 2026-03-10T07:50:52.646 INFO:tasks.workunit.client.0.vm05.stdout:6/472: dread d0/d11/f58 [0,4194304] 0 2026-03-10T07:50:52.647 INFO:tasks.workunit.client.0.vm05.stdout:0/431: mknod d8/dd/d10/d26/d48/c99 0 2026-03-10T07:50:52.648 INFO:tasks.workunit.client.0.vm05.stdout:4/509: dwrite 
d0/d6/d9/d12/d45/d55/d44/d85/f9e [0,4194304] 0 2026-03-10T07:50:52.649 INFO:tasks.workunit.client.0.vm05.stdout:3/447: getdents d8/d22/d60 0 2026-03-10T07:50:52.652 INFO:tasks.workunit.client.0.vm05.stdout:6/473: dwrite d0/d11/d57/d66/f75 [0,4194304] 0 2026-03-10T07:50:52.654 INFO:tasks.workunit.client.0.vm05.stdout:6/474: fdatasync d0/d11/f58 0 2026-03-10T07:50:52.655 INFO:tasks.workunit.client.0.vm05.stdout:3/448: dwrite d8/d1c/f56 [0,4194304] 0 2026-03-10T07:50:52.661 INFO:tasks.workunit.client.0.vm05.stdout:3/449: dread d8/d1f/d2a/d4a/d7d/f7e [0,4194304] 0 2026-03-10T07:50:52.664 INFO:tasks.workunit.client.0.vm05.stdout:7/462: truncate d1/d6/f1d 1451362 0 2026-03-10T07:50:52.673 INFO:tasks.workunit.client.0.vm05.stdout:2/514: truncate d0/d8/d43/df/f58 3565793 0 2026-03-10T07:50:52.681 INFO:tasks.workunit.client.0.vm05.stdout:2/515: chown d0/f89 1117 1 2026-03-10T07:50:52.681 INFO:tasks.workunit.client.0.vm05.stdout:2/516: write d0/d8/f1c [2779758,45087] 0 2026-03-10T07:50:52.681 INFO:tasks.workunit.client.0.vm05.stdout:1/476: truncate da/dd/d2a/d55/d68/f36 3239455 0 2026-03-10T07:50:52.682 INFO:tasks.workunit.client.0.vm05.stdout:5/474: write d2/f42 [1403647,129765] 0 2026-03-10T07:50:52.682 INFO:tasks.workunit.client.0.vm05.stdout:1/477: write da/dd/d2a/f6a [13349,108193] 0 2026-03-10T07:50:52.682 INFO:tasks.workunit.client.0.vm05.stdout:5/475: fsync d2/d5/f1e 0 2026-03-10T07:50:52.689 INFO:tasks.workunit.client.0.vm05.stdout:0/432: dread - d8/f4a zero size 2026-03-10T07:50:52.689 INFO:tasks.workunit.client.0.vm05.stdout:0/433: chown d8/dd/d10/c1d 1 1 2026-03-10T07:50:52.690 INFO:tasks.workunit.client.0.vm05.stdout:0/434: chown c3 22545120 1 2026-03-10T07:50:52.695 INFO:tasks.workunit.client.0.vm05.stdout:0/435: dread d8/dd/d34/f3d [0,4194304] 0 2026-03-10T07:50:52.706 INFO:tasks.workunit.client.0.vm05.stdout:3/450: dread d8/d16/d19/f21 [0,4194304] 0 2026-03-10T07:50:52.706 INFO:tasks.workunit.client.0.vm05.stdout:3/451: chown d8/d1f/f95 5 1 
2026-03-10T07:50:52.706 INFO:tasks.workunit.client.0.vm05.stdout:0/436: dwrite d8/dd/d10/d26/d3a/d5e/f71 [0,4194304] 0 2026-03-10T07:50:52.706 INFO:tasks.workunit.client.0.vm05.stdout:4/510: dread d0/d6/d9/d12/d45/d55/f5f [0,4194304] 0 2026-03-10T07:50:52.708 INFO:tasks.workunit.client.0.vm05.stdout:3/452: dwrite d8/f4d [0,4194304] 0 2026-03-10T07:50:52.708 INFO:tasks.workunit.client.0.vm05.stdout:3/453: write d8/d1f/d24/f3e [1666861,128840] 0 2026-03-10T07:50:52.713 INFO:tasks.workunit.client.0.vm05.stdout:7/463: symlink d1/d6/d3b/l8b 0 2026-03-10T07:50:52.719 INFO:tasks.workunit.client.0.vm05.stdout:8/431: creat d1/dd/d18/d20/d2a/d34/d49/d5d/f84 x:0 0 0 2026-03-10T07:50:52.724 INFO:tasks.workunit.client.0.vm05.stdout:7/464: dwrite d1/d34/d59/f72 [0,4194304] 0 2026-03-10T07:50:52.740 INFO:tasks.workunit.client.0.vm05.stdout:9/450: dwrite d8/d35/d22/f3f [0,4194304] 0 2026-03-10T07:50:52.755 INFO:tasks.workunit.client.0.vm05.stdout:3/454: creat d8/d22/d60/d6e/f97 x:0 0 0 2026-03-10T07:50:52.758 INFO:tasks.workunit.client.0.vm05.stdout:8/432: sync 2026-03-10T07:50:52.758 INFO:tasks.workunit.client.0.vm05.stdout:3/455: dread d8/d16/f2d [0,4194304] 0 2026-03-10T07:50:52.765 INFO:tasks.workunit.client.0.vm05.stdout:9/451: dwrite d8/d35/d38/d71/d81/f83 [0,4194304] 0 2026-03-10T07:50:52.766 INFO:tasks.workunit.client.0.vm05.stdout:3/456: sync 2026-03-10T07:50:52.770 INFO:tasks.workunit.client.0.vm05.stdout:1/478: link da/dd/d12/d19/d20/f79 da/dd/d42/d80/f87 0 2026-03-10T07:50:52.771 INFO:tasks.workunit.client.0.vm05.stdout:1/479: dread - da/dd/f7b zero size 2026-03-10T07:50:52.778 INFO:tasks.workunit.client.0.vm05.stdout:8/433: creat d1/d6f/f85 x:0 0 0 2026-03-10T07:50:52.781 INFO:tasks.workunit.client.0.vm05.stdout:7/465: fdatasync d1/d3c/f63 0 2026-03-10T07:50:52.783 INFO:tasks.workunit.client.0.vm05.stdout:7/466: chown d1/d34/l6a 0 1 2026-03-10T07:50:52.784 INFO:tasks.workunit.client.0.vm05.stdout:7/467: chown d1/l51 110 1 2026-03-10T07:50:52.787 
INFO:tasks.workunit.client.0.vm05.stdout:5/476: getdents d2/d12/d2d 0 2026-03-10T07:50:52.790 INFO:tasks.workunit.client.0.vm05.stdout:1/480: dread da/dd/f77 [0,4194304] 0 2026-03-10T07:50:52.794 INFO:tasks.workunit.client.0.vm05.stdout:0/437: rmdir d8/dd/d7a 0 2026-03-10T07:50:52.795 INFO:tasks.workunit.client.0.vm05.stdout:6/475: link d0/l70 d0/d6/l94 0 2026-03-10T07:50:52.802 INFO:tasks.workunit.client.0.vm05.stdout:9/452: mkdir d8/d86/d28/d79/d57/d96 0 2026-03-10T07:50:52.803 INFO:tasks.workunit.client.0.vm05.stdout:9/453: truncate d8/d35/d1c/d75/f8e 754774 0 2026-03-10T07:50:52.804 INFO:tasks.workunit.client.0.vm05.stdout:7/468: mkdir d1/d34/d59/d60/d8c 0 2026-03-10T07:50:52.806 INFO:tasks.workunit.client.0.vm05.stdout:5/477: creat d2/d20/d4c/fa5 x:0 0 0 2026-03-10T07:50:52.816 INFO:tasks.workunit.client.0.vm05.stdout:6/476: fdatasync d0/d11/d57/d66/f7b 0 2026-03-10T07:50:52.816 INFO:tasks.workunit.client.0.vm05.stdout:4/511: dread d0/d6/d9/d12/d45/d55/f2c [0,4194304] 0 2026-03-10T07:50:52.816 INFO:tasks.workunit.client.0.vm05.stdout:5/478: dwrite d2/d5/d61/f66 [0,4194304] 0 2026-03-10T07:50:52.816 INFO:tasks.workunit.client.0.vm05.stdout:5/479: chown d2/d5/f10 120 1 2026-03-10T07:50:52.816 INFO:tasks.workunit.client.0.vm05.stdout:5/480: fsync d2/d12/f40 0 2026-03-10T07:50:52.818 INFO:tasks.workunit.client.0.vm05.stdout:0/438: sync 2026-03-10T07:50:52.818 INFO:tasks.workunit.client.0.vm05.stdout:6/477: sync 2026-03-10T07:50:52.833 INFO:tasks.workunit.client.0.vm05.stdout:4/512: mkdir d0/d20/db1 0 2026-03-10T07:50:52.834 INFO:tasks.workunit.client.0.vm05.stdout:4/513: write d0/d6/d9/d12/d45/d55/d4e/f8d [674764,85102] 0 2026-03-10T07:50:52.841 INFO:tasks.workunit.client.0.vm05.stdout:0/439: symlink d8/dd/d10/d26/d3a/d5e/l9a 0 2026-03-10T07:50:52.841 INFO:tasks.workunit.client.0.vm05.stdout:0/440: read - d8/dd/d10/d26/d2a/f8f zero size 2026-03-10T07:50:52.842 INFO:tasks.workunit.client.0.vm05.stdout:6/478: mknod d0/d11/d57/d60/c95 0 2026-03-10T07:50:52.843 
INFO:tasks.workunit.client.0.vm05.stdout:6/479: chown d0/d11/d57/f5c 24840537 1 2026-03-10T07:50:52.843 INFO:tasks.workunit.client.0.vm05.stdout:1/481: creat da/d26/d2b/f88 x:0 0 0 2026-03-10T07:50:52.844 INFO:tasks.workunit.client.0.vm05.stdout:1/482: fdatasync da/dd/d12/d34/f5f 0 2026-03-10T07:50:52.845 INFO:tasks.workunit.client.0.vm05.stdout:4/514: creat d0/d20/fb2 x:0 0 0 2026-03-10T07:50:52.847 INFO:tasks.workunit.client.0.vm05.stdout:0/441: unlink d8/dd/d10/d26/d3a/f53 0 2026-03-10T07:50:52.850 INFO:tasks.workunit.client.0.vm05.stdout:0/442: symlink d8/dd/d10/d26/d3a/d5e/d63/l9b 0 2026-03-10T07:50:52.886 INFO:tasks.workunit.client.0.vm05.stdout:0/443: chown d8/dd/d10/d26/d48/c5d 22160 1 2026-03-10T07:50:52.886 INFO:tasks.workunit.client.0.vm05.stdout:7/469: getdents d1/d34/d59 0 2026-03-10T07:50:52.886 INFO:tasks.workunit.client.0.vm05.stdout:4/515: dwrite d0/d6/d9/d12/d45/d55/d4e/f97 [0,4194304] 0 2026-03-10T07:50:52.886 INFO:tasks.workunit.client.0.vm05.stdout:0/444: rmdir d8/dd/d34 39 2026-03-10T07:50:52.886 INFO:tasks.workunit.client.0.vm05.stdout:1/483: mkdir da/d26/d2b/d89 0 2026-03-10T07:50:52.886 INFO:tasks.workunit.client.0.vm05.stdout:0/445: write d8/dd/d37/f4f [118725,91618] 0 2026-03-10T07:50:52.886 INFO:tasks.workunit.client.0.vm05.stdout:0/446: mkdir d8/d9c 0 2026-03-10T07:50:52.886 INFO:tasks.workunit.client.0.vm05.stdout:0/447: readlink d8/dd/d10/d26/d8b/d70/l8a 0 2026-03-10T07:50:52.886 INFO:tasks.workunit.client.0.vm05.stdout:1/484: mknod da/d26/d2b/d71/c8a 0 2026-03-10T07:50:52.886 INFO:tasks.workunit.client.0.vm05.stdout:2/517: dwrite d0/d8/d43/d38/f9a [0,4194304] 0 2026-03-10T07:50:52.886 INFO:tasks.workunit.client.0.vm05.stdout:1/485: symlink da/dd/d2a/d55/d68/l8b 0 2026-03-10T07:50:52.886 INFO:tasks.workunit.client.0.vm05.stdout:1/486: truncate da/dd/d12/d34/f38 3790498 0 2026-03-10T07:50:52.886 INFO:tasks.workunit.client.0.vm05.stdout:2/518: mkdir d0/d8/d43/da4 0 2026-03-10T07:50:52.886 INFO:tasks.workunit.client.0.vm05.stdout:2/519: 
dwrite d0/f22 [0,4194304] 0 2026-03-10T07:50:52.889 INFO:tasks.workunit.client.0.vm05.stdout:1/487: rename da/dd/d12/l73 to da/dd/d2a/l8c 0 2026-03-10T07:50:52.893 INFO:tasks.workunit.client.0.vm05.stdout:2/520: mkdir d0/d8/d3d/d7d/da5 0 2026-03-10T07:50:52.893 INFO:tasks.workunit.client.0.vm05.stdout:1/488: dwrite da/dd/d2a/f6f [0,4194304] 0 2026-03-10T07:50:52.901 INFO:tasks.workunit.client.0.vm05.stdout:1/489: symlink da/dd/l8d 0 2026-03-10T07:50:52.901 INFO:tasks.workunit.client.0.vm05.stdout:2/521: creat d0/d47/fa6 x:0 0 0 2026-03-10T07:50:52.903 INFO:tasks.workunit.client.0.vm05.stdout:1/490: chown da/dd/c10 619769630 1 2026-03-10T07:50:52.906 INFO:tasks.workunit.client.0.vm05.stdout:2/522: symlink d0/d47/la7 0 2026-03-10T07:50:52.908 INFO:tasks.workunit.client.0.vm05.stdout:1/491: mkdir da/dd/d12/d34/d58/d8e 0 2026-03-10T07:50:52.917 INFO:tasks.workunit.client.0.vm05.stdout:0/448: dread d8/dd/d10/d26/d2a/f2b [0,4194304] 0 2026-03-10T07:50:52.917 INFO:tasks.workunit.client.0.vm05.stdout:1/492: fsync da/dd/d2a/f2f 0 2026-03-10T07:50:52.917 INFO:tasks.workunit.client.0.vm05.stdout:0/449: dread d8/fc [0,4194304] 0 2026-03-10T07:50:52.918 INFO:tasks.workunit.client.0.vm05.stdout:2/523: read d0/d2a/f45 [2849773,20113] 0 2026-03-10T07:50:52.918 INFO:tasks.workunit.client.0.vm05.stdout:2/524: fdatasync d0/d8/d43/df/f20 0 2026-03-10T07:50:52.920 INFO:tasks.workunit.client.0.vm05.stdout:2/525: fsync d0/d8/d43/df/f3a 0 2026-03-10T07:50:52.920 INFO:tasks.workunit.client.0.vm05.stdout:2/526: readlink d0/d47/la7 0 2026-03-10T07:50:52.922 INFO:tasks.workunit.client.0.vm05.stdout:4/516: dread d0/d28/f7c [0,4194304] 0 2026-03-10T07:50:52.923 INFO:tasks.workunit.client.0.vm05.stdout:2/527: mkdir d0/d8/d3d/d7d/da5/da8 0 2026-03-10T07:50:52.924 INFO:tasks.workunit.client.0.vm05.stdout:4/517: symlink d0/d6/d6f/lb3 0 2026-03-10T07:50:52.924 INFO:tasks.workunit.client.0.vm05.stdout:4/518: chown d0/d6/d9/d5a/c79 37 1 2026-03-10T07:50:52.924 
INFO:tasks.workunit.client.0.vm05.stdout:4/519: dread - d0/f41 zero size
2026-03-10T07:50:52.925 INFO:tasks.workunit.client.0.vm05.stdout:2/528: rename d0/d8/d43/df/c55 to d0/d8/d43/d38/ca9 0
2026-03-10T07:50:52.925 INFO:tasks.workunit.client.0.vm05.stdout:4/520: chown d0/cae 3787226 1
2026-03-10T07:50:52.926 INFO:tasks.workunit.client.0.vm05.stdout:2/529: mknod d0/d47/d49/d81/caa 0
2026-03-10T07:50:52.926 INFO:tasks.workunit.client.0.vm05.stdout:4/521: write d0/d6/d37/f75 [4408441,101331] 0
2026-03-10T07:50:52.928 INFO:tasks.workunit.client.0.vm05.stdout:2/530: mkdir d0/d47/d49/dab 0
2026-03-10T07:50:52.931 INFO:tasks.workunit.client.0.vm05.stdout:2/531: rename d0/d8/d43/f30 to d0/d8/d3d/d7d/da5/da8/fac 0
2026-03-10T07:50:52.932 INFO:tasks.workunit.client.0.vm05.stdout:2/532: truncate d0/f5 1800475 0
2026-03-10T07:50:52.933 INFO:tasks.workunit.client.0.vm05.stdout:2/533: unlink d0/d8/l24 0
2026-03-10T07:50:52.934 INFO:tasks.workunit.client.0.vm05.stdout:2/534: write d0/d8/d66/f68 [747423,21785] 0
2026-03-10T07:50:52.934 INFO:tasks.workunit.client.0.vm05.stdout:2/535: readlink d0/d8/d3d/l77 0
2026-03-10T07:50:52.935 INFO:tasks.workunit.client.0.vm05.stdout:2/536: read d0/d52/f88 [75351,32124] 0
2026-03-10T07:50:52.937 INFO:tasks.workunit.client.0.vm05.stdout:2/537: creat d0/d2a/d8c/fad x:0 0 0
2026-03-10T07:50:52.938 INFO:tasks.workunit.client.0.vm05.stdout:0/450: sync
2026-03-10T07:50:52.953 INFO:tasks.workunit.client.0.vm05.stdout:2/538: rename d0/d8/f7b to d0/d47/fae 0
2026-03-10T07:50:52.954 INFO:tasks.workunit.client.0.vm05.stdout:0/451: dwrite d8/dd/d34/f3e [0,4194304] 0
2026-03-10T07:50:52.956 INFO:tasks.workunit.client.0.vm05.stdout:2/539: truncate d0/d8/d43/df/d4d/f93 184965 0
2026-03-10T07:50:52.956 INFO:tasks.workunit.client.0.vm05.stdout:2/540: readlink d0/d52/l70 0
2026-03-10T07:50:52.963 INFO:tasks.workunit.client.0.vm05.stdout:0/452: dwrite d8/dd/d10/d26/d2a/d6f/f85 [0,4194304] 0
2026-03-10T07:50:52.967 INFO:tasks.workunit.client.0.vm05.stdout:0/453: symlink d8/dd/d37/l9d 0
2026-03-10T07:50:52.968 INFO:tasks.workunit.client.0.vm05.stdout:2/541: mknod d0/d8/d43/caf 0
2026-03-10T07:50:52.969 INFO:tasks.workunit.client.0.vm05.stdout:2/542: readlink d0/d8/d43/df/d4d/l79 0
2026-03-10T07:50:52.969 INFO:tasks.workunit.client.0.vm05.stdout:0/454: creat d8/dd/d34/f9e x:0 0 0
2026-03-10T07:50:52.975 INFO:tasks.workunit.client.0.vm05.stdout:0/455: dwrite d8/f4e [0,4194304] 0
2026-03-10T07:50:52.977 INFO:tasks.workunit.client.0.vm05.stdout:0/456: chown d8/d9c 6442575 1
2026-03-10T07:50:52.984 INFO:tasks.workunit.client.0.vm05.stdout:0/457: creat d8/dd/d37/d67/f9f x:0 0 0
2026-03-10T07:50:52.986 INFO:tasks.workunit.client.0.vm05.stdout:2/543: dwrite d0/d52/f88 [0,4194304] 0
2026-03-10T07:50:52.988 INFO:tasks.workunit.client.0.vm05.stdout:2/544: unlink d0/d8/d43/f1d 0
2026-03-10T07:50:53.036 INFO:tasks.workunit.client.0.vm05.stdout:2/545: dread d0/d8/d43/df/d53/f82 [0,4194304] 0
2026-03-10T07:50:53.036 INFO:tasks.workunit.client.0.vm05.stdout:2/546: stat d0/d47/la2 0
2026-03-10T07:50:53.054 INFO:tasks.workunit.client.0.vm05.stdout:9/454: dread d8/d35/d22/d33/f73 [0,4194304] 0
2026-03-10T07:50:53.057 INFO:tasks.workunit.client.0.vm05.stdout:8/434: dwrite d1/dd/d18/d20/d2a/d48/f4a [0,4194304] 0
2026-03-10T07:50:53.059 INFO:tasks.workunit.client.0.vm05.stdout:3/457: dwrite d8/d22/f29 [4194304,4194304] 0
2026-03-10T07:50:53.063 INFO:tasks.workunit.client.0.vm05.stdout:3/458: stat d8/d47 0
2026-03-10T07:50:53.080 INFO:tasks.workunit.client.0.vm05.stdout:3/459: dwrite d8/d22/f29 [4194304,4194304] 0
2026-03-10T07:50:53.087 INFO:tasks.workunit.client.0.vm05.stdout:5/481: truncate d2/d5/f18 3673943 0
2026-03-10T07:50:53.088 INFO:tasks.workunit.client.0.vm05.stdout:5/482: chown d2/d5/l26 28233761 1
2026-03-10T07:50:53.091 INFO:tasks.workunit.client.0.vm05.stdout:6/480: write d0/d11/f1c [1107412,30989] 0
2026-03-10T07:50:53.092 INFO:tasks.workunit.client.0.vm05.stdout:2/547: link d0/d8/d43/df/f21 d0/d8/d3d/d7d/da5/fb0 0
2026-03-10T07:50:53.093 INFO:tasks.workunit.client.0.vm05.stdout:6/481: write d0/d6/f44 [1130158,88512] 0
2026-03-10T07:50:53.097 INFO:tasks.workunit.client.0.vm05.stdout:3/460: mkdir d8/d1f/d2a/d98 0
2026-03-10T07:50:53.100 INFO:tasks.workunit.client.0.vm05.stdout:7/470: write d1/d34/f7d [952662,2074] 0
2026-03-10T07:50:53.104 INFO:tasks.workunit.client.0.vm05.stdout:7/471: write d1/d6/f77 [297104,68414] 0
2026-03-10T07:50:53.104 INFO:tasks.workunit.client.0.vm05.stdout:7/472: readlink d1/d6/d3b/l8b 0
2026-03-10T07:50:53.105 INFO:tasks.workunit.client.0.vm05.stdout:2/548: mkdir d0/d47/d49/db1 0
2026-03-10T07:50:53.111 INFO:tasks.workunit.client.0.vm05.stdout:8/435: creat d1/dd/d4d/d64/f86 x:0 0 0
2026-03-10T07:50:53.113 INFO:tasks.workunit.client.0.vm05.stdout:6/482: dread d0/d11/d2e/f7f [0,4194304] 0
2026-03-10T07:50:53.116 INFO:tasks.workunit.client.0.vm05.stdout:0/458: fsync d8/f4e 0
2026-03-10T07:50:53.116 INFO:tasks.workunit.client.0.vm05.stdout:1/493: dread da/dd/d12/d34/f38 [0,4194304] 0
2026-03-10T07:50:53.117 INFO:tasks.workunit.client.0.vm05.stdout:1/494: chown da/d26/c46 72 1
2026-03-10T07:50:53.122 INFO:tasks.workunit.client.0.vm05.stdout:3/461: symlink d8/d1f/d24/d45/l99 0
2026-03-10T07:50:53.123 INFO:tasks.workunit.client.0.vm05.stdout:7/473: rename d1/d34/d59/d60/d6e to d1/d6/d47/d8d 0
2026-03-10T07:50:53.130 INFO:tasks.workunit.client.0.vm05.stdout:1/495: mkdir da/dd/d12/d19/d20/d8f 0
2026-03-10T07:50:53.131 INFO:tasks.workunit.client.0.vm05.stdout:1/496: chown da/d26/d2b/d71/c8a 153585 1
2026-03-10T07:50:53.134 INFO:tasks.workunit.client.0.vm05.stdout:0/459: rename d8/dd/d10/d26/d2a/f6e to d8/dd/d10/d26/d8b/d86/fa0 0
2026-03-10T07:50:53.136 INFO:tasks.workunit.client.0.vm05.stdout:0/460: chown d8/l90 66363 1
2026-03-10T07:50:53.139 INFO:tasks.workunit.client.0.vm05.stdout:1/497: dread da/dd/d12/d19/d20/f6c [0,4194304] 0
2026-03-10T07:50:53.140 INFO:tasks.workunit.client.0.vm05.stdout:1/498: chown da/d26/d2b/f65 7 1
2026-03-10T07:50:53.141 INFO:tasks.workunit.client.0.vm05.stdout:7/474: creat d1/d34/d59/d60/f8e x:0 0 0
2026-03-10T07:50:53.147 INFO:tasks.workunit.client.0.vm05.stdout:0/461: mknod d8/dd/d10/d26/d48/ca1 0
2026-03-10T07:50:53.147 INFO:tasks.workunit.client.0.vm05.stdout:1/499: dwrite da/dd/d2a/f6a [0,4194304] 0
2026-03-10T07:50:53.148 INFO:tasks.workunit.client.0.vm05.stdout:1/500: fdatasync da/f43 0
2026-03-10T07:50:53.149 INFO:tasks.workunit.client.0.vm05.stdout:1/501: write da/d26/d2b/f65 [917123,116508] 0
2026-03-10T07:50:53.161 INFO:tasks.workunit.client.0.vm05.stdout:8/436: creat d1/dd/f87 x:0 0 0
2026-03-10T07:50:53.162 INFO:tasks.workunit.client.0.vm05.stdout:3/462: link d8/d22/d60/l5e d8/d16/d19/d37/l9a 0
2026-03-10T07:50:53.162 INFO:tasks.workunit.client.0.vm05.stdout:6/483: dread d0/f23 [0,4194304] 0
2026-03-10T07:50:53.165 INFO:tasks.workunit.client.0.vm05.stdout:6/484: dread d0/d11/d4f/f7e [0,4194304] 0
2026-03-10T07:50:53.166 INFO:tasks.workunit.client.0.vm05.stdout:0/462: rename d8/dd/d10/d26/d3a/d5e/d63/l98 to d8/dd/d10/d26/d8b/d70/la2 0
2026-03-10T07:50:53.169 INFO:tasks.workunit.client.0.vm05.stdout:9/455: write d8/d35/f1f [820297,87823] 0
2026-03-10T07:50:53.170 INFO:tasks.workunit.client.0.vm05.stdout:5/483: write d2/d12/d2d/f60 [544211,72276] 0
2026-03-10T07:50:53.173 INFO:tasks.workunit.client.0.vm05.stdout:5/484: truncate d2/d20/d33/d53/f97 9993177 0
2026-03-10T07:50:53.174 INFO:tasks.workunit.client.0.vm05.stdout:0/463: dread d8/dd/d37/f38 [0,4194304] 0
2026-03-10T07:50:53.174 INFO:tasks.workunit.client.0.vm05.stdout:4/522: dwrite d0/d6/d37/f46 [0,4194304] 0
2026-03-10T07:50:53.179 INFO:tasks.workunit.client.0.vm05.stdout:2/549: fdatasync d0/d8/d43/df/f58 0
2026-03-10T07:50:53.180 INFO:tasks.workunit.client.0.vm05.stdout:2/550: stat d0/d52/f91 0
2026-03-10T07:50:53.184 INFO:tasks.workunit.client.0.vm05.stdout:4/523: dwrite d0/d28/f9b [0,4194304] 0
2026-03-10T07:50:53.187 INFO:tasks.workunit.client.0.vm05.stdout:7/475: rmdir d1/d6/d80/d82 39
2026-03-10T07:50:53.192 INFO:tasks.workunit.client.0.vm05.stdout:5/485: dread d2/d12/f6b [0,4194304] 0
2026-03-10T07:50:53.204 INFO:tasks.workunit.client.0.vm05.stdout:3/463: symlink d8/d1c/d64/l9b 0
2026-03-10T07:50:53.204 INFO:tasks.workunit.client.0.vm05.stdout:9/456: rmdir d8/d35/d1c/d2c/d63 39
2026-03-10T07:50:53.204 INFO:tasks.workunit.client.0.vm05.stdout:6/485: mkdir d0/d11/d4f/d56/d96 0
2026-03-10T07:50:53.204 INFO:tasks.workunit.client.0.vm05.stdout:4/524: creat d0/d6/d9/d12/d4f/fb4 x:0 0 0
2026-03-10T07:50:53.207 INFO:tasks.workunit.client.0.vm05.stdout:9/457: dread d8/d35/d1c/d20/d59/d8b/f50 [0,4194304] 0
2026-03-10T07:50:53.210 INFO:tasks.workunit.client.0.vm05.stdout:1/502: getdents da/dd/d2a 0
2026-03-10T07:50:53.212 INFO:tasks.workunit.client.0.vm05.stdout:9/458: creat d8/d35/d6b/f97 x:0 0 0
2026-03-10T07:50:53.213 INFO:tasks.workunit.client.0.vm05.stdout:8/437: read d1/dd/d18/d20/d2a/f3a [117396,20543] 0
2026-03-10T07:50:53.214 INFO:tasks.workunit.client.0.vm05.stdout:4/525: fdatasync d0/f2 0
2026-03-10T07:50:53.215 INFO:tasks.workunit.client.0.vm05.stdout:1/503: fsync da/dd/d12/d19/d20/f79 0
2026-03-10T07:50:53.216 INFO:tasks.workunit.client.0.vm05.stdout:6/486: link d0/d6/c51 d0/d11/d57/d66/c97 0
2026-03-10T07:50:53.217 INFO:tasks.workunit.client.0.vm05.stdout:1/504: chown da/d26/d2b/d89 8347959 1
2026-03-10T07:50:53.218 INFO:tasks.workunit.client.0.vm05.stdout:4/526: rmdir d0/d20 39
2026-03-10T07:50:53.222 INFO:tasks.workunit.client.0.vm05.stdout:1/505: creat da/dd/d2a/f90 x:0 0 0
2026-03-10T07:50:53.222 INFO:tasks.workunit.client.0.vm05.stdout:8/438: creat d1/dd/d18/d20/d2a/f88 x:0 0 0
2026-03-10T07:50:53.223 INFO:tasks.workunit.client.0.vm05.stdout:1/506: dread - da/dd/d2a/d55/d68/f4d zero size
2026-03-10T07:50:53.223 INFO:tasks.workunit.client.0.vm05.stdout:8/439: chown d1/dd/d18/d20/d2a/f3a 55 1
2026-03-10T07:50:53.225 INFO:tasks.workunit.client.0.vm05.stdout:8/440: fdatasync d1/dd/d18/f5c 0
2026-03-10T07:50:53.229 INFO:tasks.workunit.client.0.vm05.stdout:6/487: rename d0/d11/d22/f2b to d0/d6/f98 0
2026-03-10T07:50:53.231 INFO:tasks.workunit.client.0.vm05.stdout:9/459: dread d8/d86/d28/f84 [0,4194304] 0
2026-03-10T07:50:53.234 INFO:tasks.workunit.client.0.vm05.stdout:8/441: mknod d1/dd/d18/d20/d2a/c89 0
2026-03-10T07:50:53.235 INFO:tasks.workunit.client.0.vm05.stdout:9/460: dread - d8/d35/d1c/d75/f8d zero size
2026-03-10T07:50:53.240 INFO:tasks.workunit.client.0.vm05.stdout:1/507: creat da/dd/d12/d34/d58/d8e/f91 x:0 0 0
2026-03-10T07:50:53.247 INFO:tasks.workunit.client.0.vm05.stdout:7/476: rmdir d1/d34 39
2026-03-10T07:50:53.251 INFO:tasks.workunit.client.0.vm05.stdout:9/461: dwrite d8/d35/d1c/d75/f8d [0,4194304] 0
2026-03-10T07:50:53.258 INFO:tasks.workunit.client.0.vm05.stdout:8/442: fdatasync d1/dd/d5e/f6b 0
2026-03-10T07:50:53.258 INFO:tasks.workunit.client.0.vm05.stdout:0/464: write d8/dd/f29 [2836985,34293] 0
2026-03-10T07:50:53.259 INFO:tasks.workunit.client.0.vm05.stdout:9/462: chown d8/d86/d28/d79/f91 261430 1
2026-03-10T07:50:53.260 INFO:tasks.workunit.client.0.vm05.stdout:1/508: write da/dd/f7b [411729,64898] 0
2026-03-10T07:50:53.260 INFO:tasks.workunit.client.0.vm05.stdout:0/465: write d8/f2d [3997781,130743] 0
2026-03-10T07:50:53.273 INFO:tasks.workunit.client.0.vm05.stdout:7/477: mknod d1/c8f 0
2026-03-10T07:50:53.273 INFO:tasks.workunit.client.0.vm05.stdout:8/443: creat d1/dd/d4d/f8a x:0 0 0
2026-03-10T07:50:53.275 INFO:tasks.workunit.client.0.vm05.stdout:2/551: dwrite d0/d2a/f45 [0,4194304] 0
2026-03-10T07:50:53.281 INFO:tasks.workunit.client.0.vm05.stdout:2/552: chown d0/d8/d43/c4b 43 1
2026-03-10T07:50:53.281 INFO:tasks.workunit.client.0.vm05.stdout:7/478: dread d1/d6/f31 [0,4194304] 0
2026-03-10T07:50:53.281 INFO:tasks.workunit.client.0.vm05.stdout:2/553: dread d0/d8/d43/df/d4d/f93 [0,4194304] 0
2026-03-10T07:50:53.284 INFO:tasks.workunit.client.0.vm05.stdout:9/463: rename d8/d35/d1c/d75/f8e to d8/d35/d22/f98 0
2026-03-10T07:50:53.288 INFO:tasks.workunit.client.0.vm05.stdout:9/464: read d8/d86/f92 [1407388,47616] 0
2026-03-10T07:50:53.301 INFO:tasks.workunit.client.0.vm05.stdout:2/554: rmdir d0/d8/d3d 39
2026-03-10T07:50:53.315 INFO:tasks.workunit.client.0.vm05.stdout:9/465: rename d8/d35/d1c/d20/d59/d8b/l78 to d8/d35/d1c/d2c/l99 0
2026-03-10T07:50:53.316 INFO:tasks.workunit.client.0.vm05.stdout:9/466: write d8/f14 [3046050,10709] 0
2026-03-10T07:50:53.320 INFO:tasks.workunit.client.0.vm05.stdout:9/467: rename d8/d35/d22/f2b to d8/d35/d22/d33/d62/f9a 0
2026-03-10T07:50:53.321 INFO:tasks.workunit.client.0.vm05.stdout:9/468: rename d8/d35/d22/d33/d47/f76 to d8/d35/d22/f9b 0
2026-03-10T07:50:53.324 INFO:tasks.workunit.client.0.vm05.stdout:9/469: write d8/d35/d22/f3f [4110124,115003] 0
2026-03-10T07:50:53.324 INFO:tasks.workunit.client.0.vm05.stdout:0/466: sync
2026-03-10T07:50:53.324 INFO:tasks.workunit.client.0.vm05.stdout:2/555: sync
2026-03-10T07:50:53.333 INFO:tasks.workunit.client.0.vm05.stdout:0/467: fsync d8/dd/d10/d26/d48/f83 0
2026-03-10T07:50:53.333 INFO:tasks.workunit.client.0.vm05.stdout:2/556: rename d0/d8/d43/df/d25 to d0/d8/d3d/d7d/db2 0
2026-03-10T07:50:53.338 INFO:tasks.workunit.client.0.vm05.stdout:0/468: write d8/f65 [1832420,4184] 0
2026-03-10T07:50:53.340 INFO:tasks.workunit.client.0.vm05.stdout:5/486: write d2/d20/d33/d53/f6a [285557,123236] 0
2026-03-10T07:50:53.343 INFO:tasks.workunit.client.0.vm05.stdout:2/557: mkdir d0/d47/d49/db3 0
2026-03-10T07:50:53.347 INFO:tasks.workunit.client.0.vm05.stdout:0/469: truncate d8/dd/d37/f38 3082314 0
2026-03-10T07:50:53.347 INFO:tasks.workunit.client.0.vm05.stdout:3/464: truncate f4 3512366 0
2026-03-10T07:50:53.351 INFO:tasks.workunit.client.0.vm05.stdout:3/465: read d8/d16/f2d [4026189,54405] 0
2026-03-10T07:50:53.351 INFO:tasks.workunit.client.0.vm05.stdout:0/470: creat d8/dd/d10/d26/d3a/d5e/fa3 x:0 0 0
2026-03-10T07:50:53.351 INFO:tasks.workunit.client.0.vm05.stdout:0/471: readlink d8/l90 0
2026-03-10T07:50:53.352 INFO:tasks.workunit.client.0.vm05.stdout:0/472: fdatasync d8/dd/d37/d56/f18 0
2026-03-10T07:50:53.363 INFO:tasks.workunit.client.0.vm05.stdout:5/487: dread d2/d20/d77/f7f [0,4194304] 0
2026-03-10T07:50:53.363 INFO:tasks.workunit.client.0.vm05.stdout:5/488: write d2/d20/d4c/d64/f96 [683358,108524] 0
2026-03-10T07:50:53.363 INFO:tasks.workunit.client.0.vm05.stdout:1/509: write da/dd/f77 [1243645,106990] 0
2026-03-10T07:50:53.363 INFO:tasks.workunit.client.0.vm05.stdout:4/527: write d0/d6/d9/d12/d45/d55/f5f [483527,8071] 0
2026-03-10T07:50:53.368 INFO:tasks.workunit.client.0.vm05.stdout:8/444: dwrite d1/f2c [0,4194304] 0
2026-03-10T07:50:53.378 INFO:tasks.workunit.client.0.vm05.stdout:6/488: dwrite d0/d6/f16 [0,4194304] 0
2026-03-10T07:50:53.379 INFO:tasks.workunit.client.0.vm05.stdout:6/489: chown d0/d11/d22/l3d 67057 1
2026-03-10T07:50:53.380 INFO:tasks.workunit.client.0.vm05.stdout:6/490: readlink d0/l19 0
2026-03-10T07:50:53.392 INFO:tasks.workunit.client.0.vm05.stdout:0/473: rename d8/dd/d34 to d8/dd/d10/d26/d8b/da4 0
2026-03-10T07:50:53.401 INFO:tasks.workunit.client.0.vm05.stdout:6/491: mknod d0/d11/d4f/c99 0
2026-03-10T07:50:53.401 INFO:tasks.workunit.client.0.vm05.stdout:1/510: dread da/f3a [0,4194304] 0
2026-03-10T07:50:53.401 INFO:tasks.workunit.client.0.vm05.stdout:6/492: fsync d0/d11/d4f/d56/f6b 0
2026-03-10T07:50:53.402 INFO:tasks.workunit.client.0.vm05.stdout:5/489: rename d2/d5/l26 to d2/d12/d4d/la6 0
2026-03-10T07:50:53.405 INFO:tasks.workunit.client.0.vm05.stdout:8/445: symlink d1/dd/d5e/l8b 0
2026-03-10T07:50:53.405 INFO:tasks.workunit.client.0.vm05.stdout:0/474: chown d8/dd/d37/d56/c5c 32606755 1
2026-03-10T07:50:53.409 INFO:tasks.workunit.client.0.vm05.stdout:7/479: dread d1/d34/f7c [0,4194304] 0
2026-03-10T07:50:53.419 INFO:tasks.workunit.client.0.vm05.stdout:5/490: write d2/f1a [1522209,80112] 0
2026-03-10T07:50:53.419 INFO:tasks.workunit.client.0.vm05.stdout:9/470: dwrite d8/d86/d28/f43 [0,4194304] 0
2026-03-10T07:50:53.424 INFO:tasks.workunit.client.0.vm05.stdout:2/558: dread d0/d8/d43/df/f58 [0,4194304] 0
2026-03-10T07:50:53.425 INFO:tasks.workunit.client.0.vm05.stdout:2/559: chown d0/d8/d3d/d7d/db2/f95 469 1
2026-03-10T07:50:53.425 INFO:tasks.workunit.client.0.vm05.stdout:7/480: dread d1/d6/f41 [0,4194304] 0
2026-03-10T07:50:53.432 INFO:tasks.workunit.client.0.vm05.stdout:9/471: dread d8/d35/f5d [0,4194304] 0
2026-03-10T07:50:53.433 INFO:tasks.workunit.client.0.vm05.stdout:9/472: dread d8/f8a [0,4194304] 0
2026-03-10T07:50:53.434 INFO:tasks.workunit.client.0.vm05.stdout:1/511: dread da/dd/d12/f18 [0,4194304] 0
2026-03-10T07:50:53.436 INFO:tasks.workunit.client.0.vm05.stdout:4/528: rename d0/d6/d9/d12/d4f/fb4 to d0/d6/d9/d12/d65/fb5 0
2026-03-10T07:50:53.437 INFO:tasks.workunit.client.0.vm05.stdout:0/475: mknod d8/dd/d10/d26/d3a/d5e/ca5 0
2026-03-10T07:50:53.438 INFO:tasks.workunit.client.0.vm05.stdout:3/466: write d8/d16/f2d [2140567,37915] 0
2026-03-10T07:50:53.445 INFO:tasks.workunit.client.0.vm05.stdout:6/493: creat d0/d11/d4f/d6e/f9a x:0 0 0
2026-03-10T07:50:53.446 INFO:tasks.workunit.client.0.vm05.stdout:3/467: read d8/d1f/d2a/d4a/f4f [1638859,30570] 0
2026-03-10T07:50:53.446 INFO:tasks.workunit.client.0.vm05.stdout:9/473: chown d8/d86/l72 2350 1
2026-03-10T07:50:53.447 INFO:tasks.workunit.client.0.vm05.stdout:8/446: mknod d1/dd/c8c 0
2026-03-10T07:50:53.451 INFO:tasks.workunit.client.0.vm05.stdout:2/560: rename d0/d47 to d0/d7e/db4 0
2026-03-10T07:50:53.453 INFO:tasks.workunit.client.0.vm05.stdout:4/529: mkdir d0/d6/d9/d5a/d6e/db6 0
2026-03-10T07:50:53.454 INFO:tasks.workunit.client.0.vm05.stdout:4/530: readlink d0/d6/d9/d5a/d6e/l7b 0
2026-03-10T07:50:53.457 INFO:tasks.workunit.client.0.vm05.stdout:0/476: creat d8/dd/d10/d26/d3a/d5e/fa6 x:0 0 0
2026-03-10T07:50:53.457 INFO:tasks.workunit.client.0.vm05.stdout:7/481: creat d1/d3c/d71/d79/d8a/f90 x:0 0 0
2026-03-10T07:50:53.457 INFO:tasks.workunit.client.0.vm05.stdout:7/482: chown d1/l2f 246380 1
2026-03-10T07:50:53.458 INFO:tasks.workunit.client.0.vm05.stdout:5/491: symlink d2/d20/d33/la7 0
2026-03-10T07:50:53.458 INFO:tasks.workunit.client.0.vm05.stdout:3/468: creat d8/d16/f9c x:0 0 0
2026-03-10T07:50:53.460 INFO:tasks.workunit.client.0.vm05.stdout:6/494: chown d0/c18 0 1
2026-03-10T07:50:53.464 INFO:tasks.workunit.client.0.vm05.stdout:9/474: unlink d8/d35/d22/d33/f41 0
2026-03-10T07:50:53.469 INFO:tasks.workunit.client.0.vm05.stdout:2/561: symlink d0/d7e/lb5 0
2026-03-10T07:50:53.473 INFO:tasks.workunit.client.0.vm05.stdout:1/512: rename da/dd/d12/d19/f3b to da/d26/f92 0
2026-03-10T07:50:53.482 INFO:tasks.workunit.client.0.vm05.stdout:9/475: creat d8/f9c x:0 0 0
2026-03-10T07:50:53.486 INFO:tasks.workunit.client.0.vm05.stdout:1/513: creat da/dd/d2a/f93 x:0 0 0
2026-03-10T07:50:53.489 INFO:tasks.workunit.client.0.vm05.stdout:2/562: sync
2026-03-10T07:50:53.490 INFO:tasks.workunit.client.0.vm05.stdout:2/563: fdatasync d0/f22 0
2026-03-10T07:50:53.492 INFO:tasks.workunit.client.0.vm05.stdout:2/564: write d0/d8/d43/df/d8b/f8f [560599,30713] 0
2026-03-10T07:50:53.495 INFO:tasks.workunit.client.0.vm05.stdout:7/483: symlink d1/d6/d80/l91 0
2026-03-10T07:50:53.498 INFO:tasks.workunit.client.0.vm05.stdout:6/495: symlink d0/d11/d22/d6c/d84/l9b 0
2026-03-10T07:50:53.500 INFO:tasks.workunit.client.0.vm05.stdout:8/447: truncate d1/f2c 2106334 0
2026-03-10T07:50:53.504 INFO:tasks.workunit.client.0.vm05.stdout:3/469: dwrite d8/d1f/f6c [0,4194304] 0
2026-03-10T07:50:53.509 INFO:tasks.workunit.client.0.vm05.stdout:4/531: rename d0/d20 to d0/d6/d9/d12/d9c/db7 0
2026-03-10T07:50:53.521 INFO:tasks.workunit.client.0.vm05.stdout:0/477: link d8/f20 d8/dd/fa7 0
2026-03-10T07:50:53.524 INFO:tasks.workunit.client.0.vm05.stdout:3/470: dwrite d8/f12 [4194304,4194304] 0
2026-03-10T07:50:53.525 INFO:tasks.workunit.client.0.vm05.stdout:6/496: read - d0/d6/d3b/f8f zero size
2026-03-10T07:50:53.526 INFO:tasks.workunit.client.0.vm05.stdout:8/448: mknod d1/dd/d18/c8d 0
2026-03-10T07:50:53.527 INFO:tasks.workunit.client.0.vm05.stdout:9/476: chown d8/d35/d1c/d75/f88 553 1
2026-03-10T07:50:53.527 INFO:tasks.workunit.client.0.vm05.stdout:0/478: unlink d8/dd/d10/d26/d48/ca1 0
2026-03-10T07:50:53.532 INFO:tasks.workunit.client.0.vm05.stdout:3/471: creat d8/d1c/f9d x:0 0 0
2026-03-10T07:50:53.535 INFO:tasks.workunit.client.0.vm05.stdout:1/514: dwrite da/dd/d12/d34/f38 [0,4194304] 0
2026-03-10T07:50:53.541 INFO:tasks.workunit.client.0.vm05.stdout:8/449: rmdir d1/dd/d5e 39
2026-03-10T07:50:53.542 INFO:tasks.workunit.client.0.vm05.stdout:4/532: symlink d0/d6/d9/d12/d45/lb8 0
2026-03-10T07:50:53.547 INFO:tasks.workunit.client.0.vm05.stdout:3/472: dwrite d8/f25 [0,4194304] 0
2026-03-10T07:50:53.551 INFO:tasks.workunit.client.0.vm05.stdout:0/479: dread d8/dd/d10/d26/d8b/da4/f3d [0,4194304] 0
2026-03-10T07:50:53.555 INFO:tasks.workunit.client.0.vm05.stdout:4/533: dread d0/d6/d9/d12/d45/d55/f19 [0,4194304] 0
2026-03-10T07:50:53.555 INFO:tasks.workunit.client.0.vm05.stdout:0/480: dwrite d8/dd/d10/d26/d3a/d5e/fa3 [0,4194304] 0
2026-03-10T07:50:53.556 INFO:tasks.workunit.client.0.vm05.stdout:4/534: chown d0/l7a 5870589 1
2026-03-10T07:50:53.558 INFO:tasks.workunit.client.0.vm05.stdout:6/497: dread d0/f26 [0,4194304] 0
2026-03-10T07:50:53.559 INFO:tasks.workunit.client.0.vm05.stdout:0/481: write d8/dd/d10/d26/d8b/da4/f5b [74440,130787] 0
2026-03-10T07:50:53.569 INFO:tasks.workunit.client.0.vm05.stdout:0/482: dread - d8/dd/d10/f7f zero size
2026-03-10T07:50:53.569 INFO:tasks.workunit.client.0.vm05.stdout:0/483: dwrite d8/f2d [0,4194304] 0
2026-03-10T07:50:53.569 INFO:tasks.workunit.client.0.vm05.stdout:0/484: fdatasync d8/dd/d10/d26/d2a/d6f/f85 0
2026-03-10T07:50:53.576 INFO:tasks.workunit.client.0.vm05.stdout:3/473: dread d8/d1f/d24/f3e [0,4194304] 0
2026-03-10T07:50:53.576 INFO:tasks.workunit.client.0.vm05.stdout:0/485: dread d8/dd/d10/d26/d2a/f2b [4194304,4194304] 0
2026-03-10T07:50:53.576 INFO:tasks.workunit.client.0.vm05.stdout:8/450: symlink d1/d6f/l8e 0
2026-03-10T07:50:53.582 INFO:tasks.workunit.client.0.vm05.stdout:9/477: unlink d8/d86/c2f 0
2026-03-10T07:50:53.594 INFO:tasks.workunit.client.0.vm05.stdout:5/492: rename d2/d20/d33/d72 to d2/d12/da8 0
2026-03-10T07:50:53.594 INFO:tasks.workunit.client.0.vm05.stdout:7/484: rename d1/d3c/d71/d79 to d1/d3c/d71/d79/d92 22
2026-03-10T07:50:53.595 INFO:tasks.workunit.client.0.vm05.stdout:5/493: dread - d2/d12/f8f zero size
2026-03-10T07:50:53.602 INFO:tasks.workunit.client.0.vm05.stdout:1/515: creat da/dd/d42/d80/f94 x:0 0 0
2026-03-10T07:50:53.610 INFO:tasks.workunit.client.0.vm05.stdout:6/498: rmdir d0/d11/d57/d60 39
2026-03-10T07:50:53.614 INFO:tasks.workunit.client.0.vm05.stdout:3/474: creat d8/d22/d60/d6e/f9e x:0 0 0
2026-03-10T07:50:53.618 INFO:tasks.workunit.client.0.vm05.stdout:4/535: dwrite d0/d6/f78 [0,4194304] 0
2026-03-10T07:50:53.628 INFO:tasks.workunit.client.0.vm05.stdout:4/536: fsync d0/f24 0
2026-03-10T07:50:53.628 INFO:tasks.workunit.client.0.vm05.stdout:2/565: dwrite d0/d8/d43/df/d4d/f93 [0,4194304] 0
2026-03-10T07:50:53.629 INFO:tasks.workunit.client.0.vm05.stdout:7/485: creat d1/d3c/d71/d79/f93 x:0 0 0
2026-03-10T07:50:53.632 INFO:tasks.workunit.client.0.vm05.stdout:5/494: creat d2/d12/d2d/fa9 x:0 0 0
2026-03-10T07:50:53.635 INFO:tasks.workunit.client.0.vm05.stdout:6/499: sync
2026-03-10T07:50:53.644 INFO:tasks.workunit.client.0.vm05.stdout:7/486: dread d1/d6/f58 [0,4194304] 0
2026-03-10T07:50:53.651 INFO:tasks.workunit.client.0.vm05.stdout:3/475: read d8/d1c/f63 [2186225,70274] 0
2026-03-10T07:50:53.652 INFO:tasks.workunit.client.0.vm05.stdout:3/476: chown d8/d1c/c5a 669 1
2026-03-10T07:50:53.657 INFO:tasks.workunit.client.0.vm05.stdout:8/451: mkdir d1/dd/d4d/d64/d8f 0
2026-03-10T07:50:53.660 INFO:tasks.workunit.client.0.vm05.stdout:9/478: symlink d8/d35/d22/d33/d70/l9d 0
2026-03-10T07:50:53.671 INFO:tasks.workunit.client.0.vm05.stdout:5/495: write d2/d12/f5a [1021880,17296] 0
2026-03-10T07:50:53.672 INFO:tasks.workunit.client.0.vm05.stdout:1/516: rename da/dd/d12/d19/d20/c7f to da/dd/d12/d19/d20/d8f/c95 0
2026-03-10T07:50:53.675 INFO:tasks.workunit.client.0.vm05.stdout:6/500: mkdir d0/d35/d36/d43/d9c 0
2026-03-10T07:50:53.677 INFO:tasks.workunit.client.0.vm05.stdout:8/452: dread d1/dd/d18/f3f [4194304,4194304] 0
2026-03-10T07:50:53.686 INFO:tasks.workunit.client.0.vm05.stdout:1/517: dread da/dd/d2a/d70/f83 [0,4194304] 0
2026-03-10T07:50:53.687 INFO:tasks.workunit.client.0.vm05.stdout:6/501: sync
2026-03-10T07:50:53.687 INFO:tasks.workunit.client.0.vm05.stdout:1/518: write da/dd/d2a/d55/d64/f7a [828723,21667] 0
2026-03-10T07:50:53.695 INFO:tasks.workunit.client.0.vm05.stdout:1/519: dread da/dd/d2a/d70/f83 [0,4194304] 0
2026-03-10T07:50:53.708 INFO:tasks.workunit.client.0.vm05.stdout:7/487: write d1/d6/f31 [915347,108904] 0
2026-03-10T07:50:53.711 INFO:tasks.workunit.client.0.vm05.stdout:4/537: mkdir d0/d6/d9/d5a/d6e/db6/db9 0
2026-03-10T07:50:53.713 INFO:tasks.workunit.client.0.vm05.stdout:4/538: read d0/d6/d9/d12/d9c/db7/da7/f9a [401537,82645] 0
2026-03-10T07:50:53.713 INFO:tasks.workunit.client.0.vm05.stdout:4/539: stat d0/d6/d37/c92 0
2026-03-10T07:50:53.726 INFO:tasks.workunit.client.0.vm05.stdout:5/496: creat d2/d20/d33/d86/d8d/da1/faa x:0 0 0
2026-03-10T07:50:53.727 INFO:tasks.workunit.client.0.vm05.stdout:5/497: fdatasync d2/d20/d33/f45 0
2026-03-10T07:50:53.730 INFO:tasks.workunit.client.0.vm05.stdout:2/566: truncate d0/d8/d3d/d7d/db2/f2b 2236664 0
2026-03-10T07:50:53.742 INFO:tasks.workunit.client.0.vm05.stdout:0/486: getdents d8/dd/d37 0
2026-03-10T07:50:53.743 INFO:tasks.workunit.client.0.vm05.stdout:1/520: write da/dd/d2a/d70/f83 [527144,63948] 0
2026-03-10T07:50:53.747 INFO:tasks.workunit.client.0.vm05.stdout:7/488: readlink d1/l18 0
2026-03-10T07:50:53.753 INFO:tasks.workunit.client.0.vm05.stdout:8/453: write d1/dd/d4d/f61 [1205199,87463] 0
2026-03-10T07:50:53.753 INFO:tasks.workunit.client.0.vm05.stdout:3/477: rename f4 to d8/d16/d52/f9f 0
2026-03-10T07:50:53.754 INFO:tasks.workunit.client.0.vm05.stdout:9/479: write d8/d35/f1f [1186009,114947] 0
2026-03-10T07:50:53.758 INFO:tasks.workunit.client.0.vm05.stdout:8/454: fsync d1/dd/d18/d20/f43 0
2026-03-10T07:50:53.762 INFO:tasks.workunit.client.0.vm05.stdout:8/455: write d1/dd/d4d/f8a [903279,83324] 0
2026-03-10T07:50:53.762 INFO:tasks.workunit.client.0.vm05.stdout:0/487: dwrite d8/f2d [4194304,4194304] 0
2026-03-10T07:50:53.766 INFO:tasks.workunit.client.0.vm05.stdout:2/567: dwrite d0/d8/d3d/d7d/da5/da8/fac [0,4194304] 0
2026-03-10T07:50:53.770 INFO:tasks.workunit.client.0.vm05.stdout:2/568: fsync d0/d52/f91 0
2026-03-10T07:50:53.776 INFO:tasks.workunit.client.0.vm05.stdout:7/489: chown d1/d6/d3b/l45 78 1
2026-03-10T07:50:53.783 INFO:tasks.workunit.client.0.vm05.stdout:1/521: mknod da/dd/d12/d34/d58/d8e/c96 0
2026-03-10T07:50:53.783 INFO:tasks.workunit.client.0.vm05.stdout:1/522: fdatasync da/dd/d2a/f6f 0
2026-03-10T07:50:53.792 INFO:tasks.workunit.client.0.vm05.stdout:4/540: rename d0/d28/f7c to d0/d6/da6/fba 0
2026-03-10T07:50:53.800 INFO:tasks.workunit.client.0.vm05.stdout:3/478: rmdir d8/d22/d60 39
2026-03-10T07:50:53.800 INFO:tasks.workunit.client.0.vm05.stdout:8/456: mkdir d1/d45/d90 0
2026-03-10T07:50:53.800 INFO:tasks.workunit.client.0.vm05.stdout:2/569: symlink d0/d7e/db4/d49/d81/lb6 0
2026-03-10T07:50:53.800 INFO:tasks.workunit.client.0.vm05.stdout:1/523: rename da/dd/d2a/f63 to da/d26/d2b/d71/f97 0
2026-03-10T07:50:53.800 INFO:tasks.workunit.client.0.vm05.stdout:9/480: chown d8/d35/d1c/d20/d59/d8b/f39 49267 1
2026-03-10T07:50:53.800 INFO:tasks.workunit.client.0.vm05.stdout:0/488: mknod d8/d9c/ca8 0
2026-03-10T07:50:53.803 INFO:tasks.workunit.client.0.vm05.stdout:4/541: fdatasync d0/d6/d9/d12/d45/d55/f19 0
2026-03-10T07:50:53.807 INFO:tasks.workunit.client.0.vm05.stdout:4/542: chown d0/d6/d9/d12/d45/d55/f5f 173820 1
2026-03-10T07:50:53.808 INFO:tasks.workunit.client.0.vm05.stdout:5/498: getdents d2/d20 0
2026-03-10T07:50:53.808 INFO:tasks.workunit.client.0.vm05.stdout:5/499: fsync d2/d5/d61/f66 0
2026-03-10T07:50:53.812 INFO:tasks.workunit.client.0.vm05.stdout:8/457: dwrite d1/dd/d18/d20/d2a/d48/d7c/f80 [0,4194304] 0
2026-03-10T07:50:53.816 INFO:tasks.workunit.client.0.vm05.stdout:9/481: truncate d8/d35/d22/d33/d62/f7d 1729293 0
2026-03-10T07:50:53.816 INFO:tasks.workunit.client.0.vm05.stdout:0/489: chown d8/fb 203483704 1
2026-03-10T07:50:53.821 INFO:tasks.workunit.client.0.vm05.stdout:8/458: dwrite d1/dd/d4d/d64/d6a/f76 [0,4194304] 0
2026-03-10T07:50:53.821 INFO:tasks.workunit.client.0.vm05.stdout:5/500: mkdir d2/d20/dab 0
2026-03-10T07:50:53.822 INFO:tasks.workunit.client.0.vm05.stdout:0/490: chown d8/dd/f22 31332 1
2026-03-10T07:50:53.824 INFO:tasks.workunit.client.0.vm05.stdout:6/502: truncate d0/d6/f16 3120352 0
2026-03-10T07:50:53.824 INFO:tasks.workunit.client.0.vm05.stdout:1/524: rmdir da/dd/d12/d34/d58/d8e 39
2026-03-10T07:50:53.831 INFO:tasks.workunit.client.0.vm05.stdout:0/491: write d8/dd/d37/f4f [1142942,118552] 0
2026-03-10T07:50:53.839 INFO:tasks.workunit.client.0.vm05.stdout:7/490: truncate d1/d3c/d4b/f4f 670217 0
2026-03-10T07:50:53.839 INFO:tasks.workunit.client.0.vm05.stdout:3/479: truncate d8/d16/f1a 856950 0
2026-03-10T07:50:53.843 INFO:tasks.workunit.client.0.vm05.stdout:8/459: write d1/dd/d18/d20/d2a/d48/d7c/f80 [4886806,80193] 0
2026-03-10T07:50:53.849 INFO:tasks.workunit.client.0.vm05.stdout:5/501: dread d2/d5/d61/f66 [0,4194304] 0
2026-03-10T07:50:53.851 INFO:tasks.workunit.client.0.vm05.stdout:3/480: dwrite d8/d1f/d2a/d4a/f89 [0,4194304] 0
2026-03-10T07:50:53.862 INFO:tasks.workunit.client.0.vm05.stdout:1/525: creat da/d26/d2b/d71/f98 x:0 0 0
2026-03-10T07:50:53.862 INFO:tasks.workunit.client.0.vm05.stdout:0/492: truncate d8/dd/d37/d56/f62 2842104 0
2026-03-10T07:50:53.862 INFO:tasks.workunit.client.0.vm05.stdout:7/491: dwrite d1/d34/f7a [0,4194304] 0
2026-03-10T07:50:53.868 INFO:tasks.workunit.client.0.vm05.stdout:0/493: chown d8/dd/d10/d26/d48/c5d 15195339 1
2026-03-10T07:50:53.868 INFO:tasks.workunit.client.0.vm05.stdout:7/492: fsync d1/d34/d59/f78 0
2026-03-10T07:50:53.869 INFO:tasks.workunit.client.0.vm05.stdout:6/503: creat d0/d11/f9d x:0 0 0
2026-03-10T07:50:53.870 INFO:tasks.workunit.client.0.vm05.stdout:1/526: symlink da/dd/d42/d80/l99 0
2026-03-10T07:50:53.873 INFO:tasks.workunit.client.0.vm05.stdout:2/570: dread d0/d8/d43/df/f18 [0,4194304] 0
2026-03-10T07:50:53.873 INFO:tasks.workunit.client.0.vm05.stdout:8/460: symlink d1/d23/l91 0
2026-03-10T07:50:53.873 INFO:tasks.workunit.client.0.vm05.stdout:3/481: symlink d8/d16/d19/d6b/la0 0
2026-03-10T07:50:53.873 INFO:tasks.workunit.client.0.vm05.stdout:4/543: getdents d0/d6/d9/d12/d45 0
2026-03-10T07:50:53.878 INFO:tasks.workunit.client.0.vm05.stdout:7/493: mknod d1/d5b/c94 0
2026-03-10T07:50:53.878 INFO:tasks.workunit.client.0.vm05.stdout:8/461: creat d1/dd/d18/d20/d2a/d48/f92 x:0 0 0
2026-03-10T07:50:53.878 INFO:tasks.workunit.client.0.vm05.stdout:8/462: chown d1/d6f/f7d 1 1
2026-03-10T07:50:53.885 INFO:tasks.workunit.client.0.vm05.stdout:1/527: mkdir da/dd/d12/d86/d9a 0
2026-03-10T07:50:53.885 INFO:tasks.workunit.client.0.vm05.stdout:3/482: rename d8/d1f/d24/d45/c83 to d8/d16/ca1 0
2026-03-10T07:50:53.886 INFO:tasks.workunit.client.0.vm05.stdout:8/463: chown d1/dd/d5e/l8b 12 1
2026-03-10T07:50:53.886 INFO:tasks.workunit.client.0.vm05.stdout:4/544: readlink d0/d6/d9/d12/d65/l99 0
2026-03-10T07:50:53.889 INFO:tasks.workunit.client.0.vm05.stdout:8/464: readlink d1/l44 0
2026-03-10T07:50:53.889 INFO:tasks.workunit.client.0.vm05.stdout:3/483: mknod d8/ca2 0
2026-03-10T07:50:53.890 INFO:tasks.workunit.client.0.vm05.stdout:7/494: dwrite d1/d34/f7a [0,4194304] 0
2026-03-10T07:50:53.890 INFO:tasks.workunit.client.0.vm05.stdout:6/504: rename d0/d6/f10 to d0/d11/d57/f9e 0
2026-03-10T07:50:53.898 INFO:tasks.workunit.client.0.vm05.stdout:4/545: truncate d0/f2 1008017 0
2026-03-10T07:50:53.898 INFO:tasks.workunit.client.0.vm05.stdout:6/505: getdents d0/d11/d57/d93 0
2026-03-10T07:50:53.898 INFO:tasks.workunit.client.0.vm05.stdout:8/465: link d1/dd/d18/d20/d2a/c7f d1/d6f/c93 0
2026-03-10T07:50:53.898 INFO:tasks.workunit.client.0.vm05.stdout:3/484: write d8/d22/d60/f50 [2402728,82629] 0
2026-03-10T07:50:53.906 INFO:tasks.workunit.client.0.vm05.stdout:3/485: write d8/d1f/d2a/f42 [345280,130590] 0
2026-03-10T07:50:53.910 INFO:tasks.workunit.client.0.vm05.stdout:8/466: creat d1/d52/f94 x:0 0 0
2026-03-10T07:50:53.920 INFO:tasks.workunit.client.0.vm05.stdout:1/528: dwrite da/dd/d12/f16 [0,4194304] 0
2026-03-10T07:50:53.920 INFO:tasks.workunit.client.0.vm05.stdout:1/529: chown da/dd/d2a/f6f 1649282 1
2026-03-10T07:50:53.920 INFO:tasks.workunit.client.0.vm05.stdout:7/495: dwrite d1/d34/d59/f78 [0,4194304] 0
2026-03-10T07:50:53.921 INFO:tasks.workunit.client.0.vm05.stdout:4/546: dread d0/d6/d9/f4d [0,4194304] 0
2026-03-10T07:50:53.922 INFO:tasks.workunit.client.0.vm05.stdout:6/506: sync
2026-03-10T07:50:53.926 INFO:tasks.workunit.client.0.vm05.stdout:9/482: write d8/d35/d38/f87 [606000,49267] 0
2026-03-10T07:50:53.927 INFO:tasks.workunit.client.0.vm05.stdout:7/496: creat d1/d3c/d71/f95 x:0 0 0
2026-03-10T07:50:53.927 INFO:tasks.workunit.client.0.vm05.stdout:4/547: write d0/d6/d9/f4d [1650342,126702] 0
2026-03-10T07:50:53.933 INFO:tasks.workunit.client.0.vm05.stdout:7/497: dread - d1/d3c/d71/d79/d8a/f90 zero size
2026-03-10T07:50:53.933 INFO:tasks.workunit.client.0.vm05.stdout:4/548: symlink d0/d6/da6/lbb 0
2026-03-10T07:50:53.934 INFO:tasks.workunit.client.0.vm05.stdout:4/549: write d0/d6/d60/faf [604464,45464] 0
2026-03-10T07:50:53.940 INFO:tasks.workunit.client.0.vm05.stdout:9/483: rmdir d8/d35/d22/d33/d70 39
2026-03-10T07:50:53.940 INFO:tasks.workunit.client.0.vm05.stdout:6/507: fdatasync d0/d6/f1a 0
2026-03-10T07:50:53.941 INFO:tasks.workunit.client.0.vm05.stdout:7/498: rmdir d1/d34/d59 39
2026-03-10T07:50:53.945 INFO:tasks.workunit.client.0.vm05.stdout:8/467: dread d1/dd/d4d/f61 [0,4194304] 0
2026-03-10T07:50:53.951 INFO:tasks.workunit.client.0.vm05.stdout:9/484: rmdir d8/d86/d28/d79 39
2026-03-10T07:50:53.951 INFO:tasks.workunit.client.0.vm05.stdout:9/485: stat d8/d35/d22/d33/d62/f6c 0
2026-03-10T07:50:53.951 INFO:tasks.workunit.client.0.vm05.stdout:9/486: chown d8/d35/d22/c6e 136365259 1
2026-03-10T07:50:53.951 INFO:tasks.workunit.client.0.vm05.stdout:6/508: dwrite d0/d11/d4f/d56/f6b [0,4194304] 0
2026-03-10T07:50:53.954 INFO:tasks.workunit.client.0.vm05.stdout:4/550: creat d0/d6/d9/d5a/d6e/db6/db9/fbc x:0 0 0
2026-03-10T07:50:53.958 INFO:tasks.workunit.client.0.vm05.stdout:8/468: dwrite d1/dd/f17 [0,4194304] 0
2026-03-10T07:50:53.965 INFO:tasks.workunit.client.0.vm05.stdout:3/486: dread d8/f12 [0,4194304] 0
2026-03-10T07:50:53.979 INFO:tasks.workunit.client.0.vm05.stdout:8/469: dwrite d1/d52/f77 [0,4194304] 0
2026-03-10T07:50:53.979 INFO:tasks.workunit.client.0.vm05.stdout:4/551: dwrite d0/d6/d9/d12/d9c/db7/fb2 [0,4194304] 0
2026-03-10T07:50:53.988 INFO:tasks.workunit.client.0.vm05.stdout:5/502: write d2/d12/d2d/f36 [825320,49974] 0
2026-03-10T07:50:53.993 INFO:tasks.workunit.client.0.vm05.stdout:9/487: write d8/f14 [337006,127402] 0
2026-03-10T07:50:53.993 INFO:tasks.workunit.client.0.vm05.stdout:4/552: read d0/d6/d9/d12/f36 [1875772,119988] 0
2026-03-10T07:50:53.993 INFO:tasks.workunit.client.0.vm05.stdout:4/553: write d0/d6/d9/d5a/f58 [3749691,20343] 0
2026-03-10T07:50:54.001 INFO:tasks.workunit.client.0.vm05.stdout:7/499: link d1/d6/l5c d1/d3c/d71/l96 0
2026-03-10T07:50:54.006 INFO:tasks.workunit.client.0.vm05.stdout:0/494: dwrite d8/dd/fa7 [0,4194304] 0
2026-03-10T07:50:54.007 INFO:tasks.workunit.client.0.vm05.stdout:2/571: dwrite d0/d8/d3d/d7d/db2/f2b [0,4194304] 0
2026-03-10T07:50:54.011 INFO:tasks.workunit.client.0.vm05.stdout:3/487: creat d8/d1f/d24/d8a/fa3 x:0 0 0
2026-03-10T07:50:54.017 INFO:tasks.workunit.client.0.vm05.stdout:0/495: chown d8/dd/d10/d26/d2a/f74 11574791 1
2026-03-10T07:50:54.036 INFO:tasks.workunit.client.0.vm05.stdout:8/470: fdatasync d1/fe 0
2026-03-10T07:50:54.038 INFO:tasks.workunit.client.0.vm05.stdout:8/471: readlink d1/d23/l91 0
2026-03-10T07:50:54.039 INFO:tasks.workunit.client.0.vm05.stdout:6/509: symlink d0/l9f 0 2026-03-10T07:50:54.040 INFO:tasks.workunit.client.0.vm05.stdout:9/488: readlink d8/d35/d22/d33/d70/l9d 0 2026-03-10T07:50:54.040 INFO:tasks.workunit.client.0.vm05.stdout:3/488: mkdir d8/d16/d52/da4 0 2026-03-10T07:50:54.041 INFO:tasks.workunit.client.0.vm05.stdout:0/496: dwrite d8/dd/d10/d26/d2a/f2e [0,4194304] 0 2026-03-10T07:50:54.042 INFO:tasks.workunit.client.0.vm05.stdout:8/472: fsync d1/dd/d18/d20/d2a/d48/d7c/f80 0 2026-03-10T07:50:54.042 INFO:tasks.workunit.client.0.vm05.stdout:8/473: stat d1 0 2026-03-10T07:50:54.048 INFO:tasks.workunit.client.0.vm05.stdout:0/497: truncate d8/dd/d10/d26/d8b/da4/f9e 607445 0 2026-03-10T07:50:54.051 INFO:tasks.workunit.client.0.vm05.stdout:0/498: read - d8/dd/d10/d26/d3a/d5e/fa6 zero size 2026-03-10T07:50:54.054 INFO:tasks.workunit.client.0.vm05.stdout:2/572: rename d0/d52/c71 to d0/d8/d43/df/d8b/cb7 0 2026-03-10T07:50:54.055 INFO:tasks.workunit.client.0.vm05.stdout:2/573: chown d0/d8/d43/c27 67 1 2026-03-10T07:50:54.064 INFO:tasks.workunit.client.0.vm05.stdout:6/510: dread d0/d35/d36/f5b [0,4194304] 0 2026-03-10T07:50:54.080 INFO:tasks.workunit.client.0.vm05.stdout:4/554: symlink d0/d6/d9/d5a/d91/lbd 0 2026-03-10T07:50:54.080 INFO:tasks.workunit.client.0.vm05.stdout:9/489: fdatasync d8/d35/d22/f9b 0 2026-03-10T07:50:54.082 INFO:tasks.workunit.client.0.vm05.stdout:1/530: truncate da/dd/d12/f22 2379161 0 2026-03-10T07:50:54.093 INFO:tasks.workunit.client.0.vm05.stdout:2/574: dread d0/d8/d43/d38/f56 [0,4194304] 0 2026-03-10T07:50:54.094 INFO:tasks.workunit.client.0.vm05.stdout:2/575: chown d0/d8/d3d/l63 44862593 1 2026-03-10T07:50:54.096 INFO:tasks.workunit.client.0.vm05.stdout:6/511: rmdir d0/d11/d4f/d6e 39 2026-03-10T07:50:54.104 INFO:tasks.workunit.client.0.vm05.stdout:5/503: mkdir d2/d20/d33/d86/dac 0 2026-03-10T07:50:54.112 INFO:tasks.workunit.client.0.vm05.stdout:7/500: creat d1/d34/d59/d60/d8c/f97 x:0 0 0 2026-03-10T07:50:54.114 
INFO:tasks.workunit.client.0.vm05.stdout:4/555: fsync d0/d6/d95/f3a 0 2026-03-10T07:50:54.115 INFO:tasks.workunit.client.0.vm05.stdout:4/556: chown d0/d6/d9/d12/d45/d55/d44/f7e 9362 1 2026-03-10T07:50:54.117 INFO:tasks.workunit.client.0.vm05.stdout:0/499: link d8/f65 d8/dd/d37/fa9 0 2026-03-10T07:50:54.121 INFO:tasks.workunit.client.0.vm05.stdout:2/576: unlink d0/d8/d43/df/d53/f69 0 2026-03-10T07:50:54.122 INFO:tasks.workunit.client.0.vm05.stdout:2/577: read d0/d8/d3d/d7d/da5/da8/fac [2935054,128439] 0 2026-03-10T07:50:54.123 INFO:tasks.workunit.client.0.vm05.stdout:6/512: truncate d0/d6/f24 1392508 0 2026-03-10T07:50:54.124 INFO:tasks.workunit.client.0.vm05.stdout:6/513: write d0/d6/f1a [3350105,75701] 0 2026-03-10T07:50:54.124 INFO:tasks.workunit.client.0.vm05.stdout:6/514: write d0/d35/d36/f59 [3749338,57369] 0 2026-03-10T07:50:54.129 INFO:tasks.workunit.client.0.vm05.stdout:6/515: dwrite d0/d11/d57/f5c [0,4194304] 0 2026-03-10T07:50:54.131 INFO:tasks.workunit.client.0.vm05.stdout:6/516: write d0/d11/d31/f63 [741833,21916] 0 2026-03-10T07:50:54.139 INFO:tasks.workunit.client.0.vm05.stdout:9/490: write d8/d35/d1c/f49 [82962,60674] 0 2026-03-10T07:50:54.141 INFO:tasks.workunit.client.0.vm05.stdout:3/489: link d8/d16/d19/d6b/c74 d8/d16/d52/ca5 0 2026-03-10T07:50:54.142 INFO:tasks.workunit.client.0.vm05.stdout:7/501: chown d1/d6/d80/d82/c87 0 1 2026-03-10T07:50:54.143 INFO:tasks.workunit.client.0.vm05.stdout:7/502: chown d1/l18 46568801 1 2026-03-10T07:50:54.143 INFO:tasks.workunit.client.0.vm05.stdout:4/557: mkdir d0/d6/d9/d8c/dbe 0 2026-03-10T07:50:54.144 INFO:tasks.workunit.client.0.vm05.stdout:4/558: write d0/d6/d9/d12/d45/d55/f5f [304911,6118] 0 2026-03-10T07:50:54.152 INFO:tasks.workunit.client.0.vm05.stdout:8/474: link d1/d45/f53 d1/dd/d18/d20/d2a/f95 0 2026-03-10T07:50:54.159 INFO:tasks.workunit.client.0.vm05.stdout:4/559: dread d0/d6/d9/d12/d45/d55/f56 [0,4194304] 0 2026-03-10T07:50:54.162 INFO:tasks.workunit.client.0.vm05.stdout:0/500: dwrite d8/fc 
[0,4194304] 0 2026-03-10T07:50:54.167 INFO:tasks.workunit.client.0.vm05.stdout:0/501: dread d8/dd/d10/d26/d3a/d5e/fa3 [0,4194304] 0 2026-03-10T07:50:54.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:50:53 vm08.local ceph-mon[59917]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' 2026-03-10T07:50:54.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:50:53 vm08.local ceph-mon[59917]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' 2026-03-10T07:50:54.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:50:53 vm08.local ceph-mon[59917]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T07:50:54.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:50:53 vm08.local ceph-mon[59917]: pgmap v16: 65 pgs: 65 active+clean; 840 MiB data, 3.1 GiB used, 117 GiB / 120 GiB avail; 18 MiB/s rd, 98 MiB/s wr, 212 op/s 2026-03-10T07:50:54.169 INFO:tasks.workunit.client.0.vm05.stdout:2/578: creat d0/d7e/db4/d49/fb8 x:0 0 0 2026-03-10T07:50:54.169 INFO:tasks.workunit.client.0.vm05.stdout:2/579: readlink d0/d7e/db4/la7 0 2026-03-10T07:50:54.170 INFO:tasks.workunit.client.0.vm05.stdout:2/580: stat d0/d8/d43/df/f97 0 2026-03-10T07:50:54.174 INFO:tasks.workunit.client.0.vm05.stdout:3/490: unlink d8/ff 0 2026-03-10T07:50:54.175 INFO:tasks.workunit.client.0.vm05.stdout:7/503: mknod d1/d6/c98 0 2026-03-10T07:50:54.179 INFO:tasks.workunit.client.0.vm05.stdout:4/560: chown d0/d6/d9/l59 125951 1 2026-03-10T07:50:54.179 INFO:tasks.workunit.client.0.vm05.stdout:4/561: write d0/d6/f15 [377035,74373] 0 2026-03-10T07:50:54.181 INFO:tasks.workunit.client.0.vm05.stdout:0/502: creat d8/dd/d10/d26/d2a/d6f/faa x:0 0 0 2026-03-10T07:50:54.183 INFO:tasks.workunit.client.0.vm05.stdout:5/504: creat d2/fad x:0 0 0 2026-03-10T07:50:54.184 INFO:tasks.workunit.client.0.vm05.stdout:2/581: symlink d0/d8/d43/d38/lb9 0 2026-03-10T07:50:54.185 
INFO:tasks.workunit.client.0.vm05.stdout:6/517: unlink d0/d35/d36/d43/f47 0 2026-03-10T07:50:54.189 INFO:tasks.workunit.client.0.vm05.stdout:7/504: read d1/d6/f32 [5404957,22334] 0 2026-03-10T07:50:54.194 INFO:tasks.workunit.client.0.vm05.stdout:8/475: rename d1/dd/c14 to d1/dd/d4d/d64/d8f/c96 0 2026-03-10T07:50:54.194 INFO:tasks.workunit.client.0.vm05.stdout:8/476: stat d1/dd/d4d/f61 0 2026-03-10T07:50:54.198 INFO:tasks.workunit.client.0.vm05.stdout:9/491: dwrite d8/d35/d1c/d75/f88 [0,4194304] 0 2026-03-10T07:50:54.202 INFO:tasks.workunit.client.0.vm05.stdout:1/531: link da/dd/d2a/d55/d68/c67 da/dd/d12/d86/c9b 0 2026-03-10T07:50:54.202 INFO:tasks.workunit.client.0.vm05.stdout:1/532: fdatasync da/dd/d2a/f6a 0 2026-03-10T07:50:54.214 INFO:tasks.workunit.client.0.vm05.stdout:5/505: unlink d2/d12/f8f 0 2026-03-10T07:50:54.215 INFO:tasks.workunit.client.0.vm05.stdout:2/582: creat d0/d8/d3d/d7d/db2/fba x:0 0 0 2026-03-10T07:50:54.222 INFO:tasks.workunit.client.0.vm05.stdout:7/505: creat d1/d34/d59/f99 x:0 0 0 2026-03-10T07:50:54.225 INFO:tasks.workunit.client.0.vm05.stdout:6/518: dread d0/d11/d57/f5f [0,4194304] 0 2026-03-10T07:50:54.226 INFO:tasks.workunit.client.0.vm05.stdout:6/519: chown d0/d11/c12 486 1 2026-03-10T07:50:54.226 INFO:tasks.workunit.client.0.vm05.stdout:8/477: symlink d1/dd/d4d/l97 0 2026-03-10T07:50:54.227 INFO:tasks.workunit.client.0.vm05.stdout:6/520: chown d0/d11/d4f/d56/f6b 1002 1 2026-03-10T07:50:54.227 INFO:tasks.workunit.client.0.vm05.stdout:6/521: read - d0/d11/d57/f7a zero size 2026-03-10T07:50:54.229 INFO:tasks.workunit.client.0.vm05.stdout:1/533: creat da/dd/d2a/d70/f9c x:0 0 0 2026-03-10T07:50:54.231 INFO:tasks.workunit.client.0.vm05.stdout:0/503: link d8/dd/d10/d26/d2a/f2e d8/dd/d10/d26/d2a/fab 0 2026-03-10T07:50:54.232 INFO:tasks.workunit.client.0.vm05.stdout:5/506: truncate d2/d20/f51 866919 0 2026-03-10T07:50:54.233 INFO:tasks.workunit.client.0.vm05.stdout:8/478: creat d1/dd/d18/d20/d2a/d48/d5a/f98 x:0 0 0 2026-03-10T07:50:54.235 
INFO:tasks.workunit.client.0.vm05.stdout:9/492: mkdir d8/d35/d22/d33/d62/d6d/d9e 0 2026-03-10T07:50:54.236 INFO:tasks.workunit.client.0.vm05.stdout:4/562: creat d0/d6/d9/d12/d45/d55/d44/fbf x:0 0 0 2026-03-10T07:50:54.236 INFO:tasks.workunit.client.0.vm05.stdout:4/563: fdatasync d0/d6/d37/f46 0 2026-03-10T07:50:54.237 INFO:tasks.workunit.client.0.vm05.stdout:4/564: chown d0/d6/d9/d12/l14 5 1 2026-03-10T07:50:54.238 INFO:tasks.workunit.client.0.vm05.stdout:5/507: symlink d2/d12/da8/lae 0 2026-03-10T07:50:54.238 INFO:tasks.workunit.client.0.vm05.stdout:7/506: creat d1/d6/d3b/d7f/f9a x:0 0 0 2026-03-10T07:50:54.239 INFO:tasks.workunit.client.0.vm05.stdout:8/479: symlink d1/dd/d4d/d64/d8f/l99 0 2026-03-10T07:50:54.240 INFO:tasks.workunit.client.0.vm05.stdout:7/507: write d1/d34/d59/d60/d8c/f97 [664050,41272] 0 2026-03-10T07:50:54.240 INFO:tasks.workunit.client.0.vm05.stdout:7/508: chown d1/d6/d47/l4a 0 1 2026-03-10T07:50:54.242 INFO:tasks.workunit.client.0.vm05.stdout:6/522: mkdir d0/d11/d4f/da0 0 2026-03-10T07:50:54.243 INFO:tasks.workunit.client.0.vm05.stdout:2/583: sync 2026-03-10T07:50:54.247 INFO:tasks.workunit.client.0.vm05.stdout:2/584: dwrite d0/d8/f1c [4194304,4194304] 0 2026-03-10T07:50:54.249 INFO:tasks.workunit.client.0.vm05.stdout:9/493: creat d8/d35/d1c/d20/f9f x:0 0 0 2026-03-10T07:50:54.268 INFO:tasks.workunit.client.0.vm05.stdout:4/565: creat d0/d6/d9/d12/d45/d55/d44/d85/fc0 x:0 0 0 2026-03-10T07:50:54.269 INFO:tasks.workunit.client.0.vm05.stdout:5/508: truncate d2/f9 3410149 0 2026-03-10T07:50:54.272 INFO:tasks.workunit.client.0.vm05.stdout:7/509: rmdir d1/d34/d59/d60 39 2026-03-10T07:50:54.274 INFO:tasks.workunit.client.0.vm05.stdout:7/510: fdatasync d1/d6/f31 0 2026-03-10T07:50:54.274 INFO:tasks.workunit.client.0.vm05.stdout:6/523: truncate d0/d11/d57/f5f 478657 0 2026-03-10T07:50:54.276 INFO:tasks.workunit.client.0.vm05.stdout:1/534: read da/d26/d2b/f65 [871390,114528] 0 2026-03-10T07:50:54.293 INFO:tasks.workunit.client.0.vm05.stdout:9/494: dread 
d8/d35/d22/d33/d47/f5a [0,4194304] 0 2026-03-10T07:50:54.295 INFO:tasks.workunit.client.0.vm05.stdout:3/491: dread d8/f3b [0,4194304] 0 2026-03-10T07:50:54.296 INFO:tasks.workunit.client.0.vm05.stdout:0/504: creat d8/dd/d10/fac x:0 0 0 2026-03-10T07:50:54.301 INFO:tasks.workunit.client.0.vm05.stdout:5/509: dread d2/d12/f40 [0,4194304] 0 2026-03-10T07:50:54.303 INFO:tasks.workunit.client.0.vm05.stdout:8/480: dwrite d1/dd/d18/f21 [0,4194304] 0 2026-03-10T07:50:54.310 INFO:tasks.workunit.client.0.vm05.stdout:2/585: write d0/d8/d3d/d7d/da5/da8/fac [3613274,42063] 0 2026-03-10T07:50:54.311 INFO:tasks.workunit.client.0.vm05.stdout:2/586: chown d0/d8/d43/df/d53/l5c 3 1 2026-03-10T07:50:54.312 INFO:tasks.workunit.client.0.vm05.stdout:4/566: creat d0/d6/d9/d8c/fc1 x:0 0 0 2026-03-10T07:50:54.313 INFO:tasks.workunit.client.0.vm05.stdout:2/587: read d0/f22 [489291,21379] 0 2026-03-10T07:50:54.314 INFO:tasks.workunit.client.0.vm05.stdout:8/481: read d1/dd/d18/f70 [85804,61822] 0 2026-03-10T07:50:54.324 INFO:tasks.workunit.client.0.vm05.stdout:7/511: symlink d1/d6/d3b/d7f/l9b 0 2026-03-10T07:50:54.328 INFO:tasks.workunit.client.0.vm05.stdout:8/482: dread d1/f15 [0,4194304] 0 2026-03-10T07:50:54.329 INFO:tasks.workunit.client.0.vm05.stdout:6/524: symlink d0/d11/d31/la1 0 2026-03-10T07:50:54.331 INFO:tasks.workunit.client.0.vm05.stdout:9/495: creat d8/d35/d22/d33/d62/d6d/fa0 x:0 0 0 2026-03-10T07:50:54.334 INFO:tasks.workunit.client.0.vm05.stdout:4/567: mkdir d0/d6/d6f/dc2 0 2026-03-10T07:50:54.334 INFO:tasks.workunit.client.0.vm05.stdout:4/568: write d0/d6/d9/f4d [493004,73269] 0 2026-03-10T07:50:54.341 INFO:tasks.workunit.client.0.vm05.stdout:4/569: stat d0/d6/d9/d12/d45/d55/f56 0 2026-03-10T07:50:54.341 INFO:tasks.workunit.client.0.vm05.stdout:4/570: write d0/d6/d9/d12/d45/d55/d44/d85/fc0 [311395,12489] 0 2026-03-10T07:50:54.342 INFO:tasks.workunit.client.0.vm05.stdout:2/588: symlink d0/d2a/d8c/lbb 0 2026-03-10T07:50:54.342 INFO:tasks.workunit.client.0.vm05.stdout:8/483: 
unlink d1/dd/f25 0 2026-03-10T07:50:54.342 INFO:tasks.workunit.client.0.vm05.stdout:6/525: stat d0/l70 0 2026-03-10T07:50:54.343 INFO:tasks.workunit.client.0.vm05.stdout:7/512: sync 2026-03-10T07:50:54.345 INFO:tasks.workunit.client.0.vm05.stdout:1/535: symlink da/dd/d12/d34/d58/d8e/l9d 0 2026-03-10T07:50:54.346 INFO:tasks.workunit.client.0.vm05.stdout:9/496: creat d8/d35/d1c/d20/d59/d8b/fa1 x:0 0 0 2026-03-10T07:50:54.347 INFO:tasks.workunit.client.0.vm05.stdout:3/492: mknod d8/d47/ca6 0 2026-03-10T07:50:54.347 INFO:tasks.workunit.client.0.vm05.stdout:3/493: readlink d8/l9 0 2026-03-10T07:50:54.352 INFO:tasks.workunit.client.0.vm05.stdout:2/589: dread d0/d8/d43/df/d53/f82 [0,4194304] 0 2026-03-10T07:50:54.354 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:50:53 vm05.local ceph-mon[50387]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' 2026-03-10T07:50:54.354 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:50:53 vm05.local ceph-mon[50387]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' 2026-03-10T07:50:54.354 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:50:53 vm05.local ceph-mon[50387]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T07:50:54.354 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:50:53 vm05.local ceph-mon[50387]: pgmap v16: 65 pgs: 65 active+clean; 840 MiB data, 3.1 GiB used, 117 GiB / 120 GiB avail; 18 MiB/s rd, 98 MiB/s wr, 212 op/s 2026-03-10T07:50:54.356 INFO:tasks.workunit.client.0.vm05.stdout:3/494: dwrite d8/d22/d60/f8e [0,4194304] 0 2026-03-10T07:50:54.368 INFO:tasks.workunit.client.0.vm05.stdout:4/571: fdatasync d0/d6/d9/d12/d69/f6a 0 2026-03-10T07:50:54.375 INFO:tasks.workunit.client.0.vm05.stdout:8/484: fsync d1/dd/d18/d20/d2a/f95 0 2026-03-10T07:50:54.376 INFO:tasks.workunit.client.0.vm05.stdout:7/513: write d1/d6/f1d [2100226,97174] 0 2026-03-10T07:50:54.376 
INFO:tasks.workunit.client.0.vm05.stdout:1/536: mkdir da/d26/d9e 0 2026-03-10T07:50:54.382 INFO:tasks.workunit.client.0.vm05.stdout:0/505: creat d8/dd/d10/d26/fad x:0 0 0 2026-03-10T07:50:54.383 INFO:tasks.workunit.client.0.vm05.stdout:2/590: creat d0/d8/d3d/d7d/db2/fbc x:0 0 0 2026-03-10T07:50:54.387 INFO:tasks.workunit.client.0.vm05.stdout:8/485: dread d1/dd/d18/f21 [0,4194304] 0 2026-03-10T07:50:54.396 INFO:tasks.workunit.client.0.vm05.stdout:9/497: dwrite d8/d35/f25 [0,4194304] 0 2026-03-10T07:50:54.424 INFO:tasks.workunit.client.0.vm05.stdout:7/514: symlink d1/d6/l9c 0 2026-03-10T07:50:54.433 INFO:tasks.workunit.client.0.vm05.stdout:2/591: creat d0/d8/d3d/d7d/db2/fbd x:0 0 0 2026-03-10T07:50:54.443 INFO:tasks.workunit.client.0.vm05.stdout:7/515: dread d1/d6/f2e [0,4194304] 0 2026-03-10T07:50:54.444 INFO:tasks.workunit.client.0.vm05.stdout:5/510: getdents d2/d20/d33/d53 0 2026-03-10T07:50:54.445 INFO:tasks.workunit.client.0.vm05.stdout:2/592: dwrite d0/d8/d43/d38/f9a [0,4194304] 0 2026-03-10T07:50:54.445 INFO:tasks.workunit.client.0.vm05.stdout:5/511: write d2/d12/f5a [4045334,72518] 0 2026-03-10T07:50:54.464 INFO:tasks.workunit.client.0.vm05.stdout:4/572: creat d0/d6/d9/d8c/dbe/fc3 x:0 0 0 2026-03-10T07:50:54.468 INFO:tasks.workunit.client.0.vm05.stdout:3/495: dread d8/d1f/d2a/d34/f39 [0,4194304] 0 2026-03-10T07:50:54.481 INFO:tasks.workunit.client.0.vm05.stdout:6/526: creat d0/d11/d4f/fa2 x:0 0 0 2026-03-10T07:50:54.485 INFO:tasks.workunit.client.0.vm05.stdout:4/573: dread d0/f24 [4194304,4194304] 0 2026-03-10T07:50:54.508 INFO:tasks.workunit.client.0.vm05.stdout:0/506: write d8/dd/d37/f38 [1380708,89406] 0 2026-03-10T07:50:54.508 INFO:tasks.workunit.client.0.vm05.stdout:9/498: write d8/d35/d22/d33/f73 [50507,110996] 0 2026-03-10T07:50:54.512 INFO:tasks.workunit.client.0.vm05.stdout:9/499: write d8/d35/d1c/d75/f88 [1191124,29194] 0 2026-03-10T07:50:54.513 INFO:tasks.workunit.client.0.vm05.stdout:7/516: dread d1/d6/f58 [0,4194304] 0 2026-03-10T07:50:54.515 
INFO:tasks.workunit.client.0.vm05.stdout:1/537: dwrite da/dd/d2a/f54 [0,4194304] 0 2026-03-10T07:50:54.520 INFO:tasks.workunit.client.0.vm05.stdout:1/538: dread - da/dd/d2a/d70/f9c zero size 2026-03-10T07:50:54.539 INFO:tasks.workunit.client.0.vm05.stdout:3/496: mknod d8/d1f/d24/ca7 0 2026-03-10T07:50:54.539 INFO:tasks.workunit.client.0.vm05.stdout:6/527: creat d0/d11/d2e/d81/fa3 x:0 0 0 2026-03-10T07:50:54.540 INFO:tasks.workunit.client.0.vm05.stdout:4/574: truncate d0/d6/d9/d12/d69/f6a 439603 0 2026-03-10T07:50:54.540 INFO:tasks.workunit.client.0.vm05.stdout:8/486: mkdir d1/dd/d18/d20/d2a/d9a 0 2026-03-10T07:50:54.541 INFO:tasks.workunit.client.0.vm05.stdout:6/528: write d0/d11/d57/f5c [4139050,32924] 0 2026-03-10T07:50:54.550 INFO:tasks.workunit.client.0.vm05.stdout:0/507: unlink d8/dd/d37/d67/l57 0 2026-03-10T07:50:54.551 INFO:tasks.workunit.client.0.vm05.stdout:7/517: unlink d1/d6/d3b/l45 0 2026-03-10T07:50:54.553 INFO:tasks.workunit.client.0.vm05.stdout:9/500: mknod d8/d35/d1c/d20/d59/d8b/ca2 0 2026-03-10T07:50:54.555 INFO:tasks.workunit.client.0.vm05.stdout:3/497: sync 2026-03-10T07:50:54.557 INFO:tasks.workunit.client.0.vm05.stdout:3/498: readlink d8/d1f/l7a 0 2026-03-10T07:50:54.558 INFO:tasks.workunit.client.0.vm05.stdout:1/539: creat da/dd/d2a/d55/d64/f9f x:0 0 0 2026-03-10T07:50:54.562 INFO:tasks.workunit.client.0.vm05.stdout:2/593: dwrite d0/f7 [4194304,4194304] 0 2026-03-10T07:50:54.564 INFO:tasks.workunit.client.0.vm05.stdout:2/594: readlink d0/d7e/db4/d49/d81/lb6 0 2026-03-10T07:50:54.571 INFO:tasks.workunit.client.0.vm05.stdout:0/508: unlink d8/f4e 0 2026-03-10T07:50:54.580 INFO:tasks.workunit.client.0.vm05.stdout:9/501: creat d8/d35/d38/fa3 x:0 0 0 2026-03-10T07:50:54.589 INFO:tasks.workunit.client.0.vm05.stdout:0/509: stat d8/dd/d10/d26/d3a/c6a 0 2026-03-10T07:50:54.589 INFO:tasks.workunit.client.0.vm05.stdout:7/518: mknod d1/c9d 0 2026-03-10T07:50:54.591 INFO:tasks.workunit.client.0.vm05.stdout:3/499: symlink d8/d1f/d24/d76/la8 0 
2026-03-10T07:50:54.592 INFO:tasks.workunit.client.0.vm05.stdout:5/512: getdents d2/d20/d33/d86 0 2026-03-10T07:50:54.592 INFO:tasks.workunit.client.0.vm05.stdout:8/487: mknod d1/dd/d18/d20/d2a/d48/d5a/c9b 0 2026-03-10T07:50:54.593 INFO:tasks.workunit.client.0.vm05.stdout:4/575: rename d0/d6/d9/d5a/d6e/l7b to d0/d6/d9/d12/d45/d55/d44/lc4 0 2026-03-10T07:50:54.594 INFO:tasks.workunit.client.0.vm05.stdout:7/519: mknod d1/d6/d80/d82/c9e 0 2026-03-10T07:50:54.597 INFO:tasks.workunit.client.0.vm05.stdout:7/520: stat d1/d3c/d71/c74 0 2026-03-10T07:50:54.601 INFO:tasks.workunit.client.0.vm05.stdout:9/502: dwrite d8/d35/d22/d33/d62/d6d/fa0 [0,4194304] 0 2026-03-10T07:50:54.603 INFO:tasks.workunit.client.0.vm05.stdout:9/503: stat d8/d35/d22/d33/f73 0 2026-03-10T07:50:54.608 INFO:tasks.workunit.client.0.vm05.stdout:2/595: link d0/d7e/db4/d49/d81/lb6 d0/d8/d3d/d7d/db2/lbe 0 2026-03-10T07:50:54.628 INFO:tasks.workunit.client.0.vm05.stdout:8/488: rmdir d1/dd/d18/d20/d2a/d48/d7c 39 2026-03-10T07:50:54.629 INFO:tasks.workunit.client.0.vm05.stdout:3/500: write d8/d1c/f63 [2149740,20173] 0 2026-03-10T07:50:54.629 INFO:tasks.workunit.client.0.vm05.stdout:6/529: write d0/f15 [4251107,117601] 0 2026-03-10T07:50:54.629 INFO:tasks.workunit.client.0.vm05.stdout:7/521: mknod d1/d5b/c9f 0 2026-03-10T07:50:54.632 INFO:tasks.workunit.client.0.vm05.stdout:1/540: link da/l30 da/d26/d2b/d89/la0 0 2026-03-10T07:50:54.633 INFO:tasks.workunit.client.0.vm05.stdout:2/596: mkdir d0/d8/d43/df/d8b/dbf 0 2026-03-10T07:50:54.636 INFO:tasks.workunit.client.0.vm05.stdout:0/510: rename d8/dd/d37/d56/l17 to d8/d9c/lae 0 2026-03-10T07:50:54.638 INFO:tasks.workunit.client.0.vm05.stdout:1/541: dwrite da/dd/d2a/d55/d64/f7a [0,4194304] 0 2026-03-10T07:50:54.641 INFO:tasks.workunit.client.0.vm05.stdout:1/542: write da/dd/d2a/d55/d68/f4d [610279,114607] 0 2026-03-10T07:50:54.643 INFO:tasks.workunit.client.0.vm05.stdout:1/543: stat da/dd/d2a/d55/d68/c32 0 2026-03-10T07:50:54.647 
INFO:tasks.workunit.client.0.vm05.stdout:3/501: write d8/f12 [6846118,58924] 0 2026-03-10T07:50:54.648 INFO:tasks.workunit.client.0.vm05.stdout:3/502: write d8/d1c/f2b [4596341,114341] 0 2026-03-10T07:50:54.652 INFO:tasks.workunit.client.0.vm05.stdout:8/489: fdatasync d1/dd/d18/d20/d2a/f88 0 2026-03-10T07:50:54.655 INFO:tasks.workunit.client.0.vm05.stdout:9/504: symlink d8/d86/d95/la4 0 2026-03-10T07:50:54.661 INFO:tasks.workunit.client.0.vm05.stdout:4/576: rename d0/d6/d9/c86 to d0/d6/d9/d12/d4f/cc5 0 2026-03-10T07:50:54.664 INFO:tasks.workunit.client.0.vm05.stdout:0/511: rename d8/f2d to d8/faf 0 2026-03-10T07:50:54.664 INFO:tasks.workunit.client.0.vm05.stdout:0/512: truncate d8/dd/d10/fac 1033776 0 2026-03-10T07:50:54.674 INFO:tasks.workunit.client.0.vm05.stdout:8/490: dread - d1/d45/f81 zero size 2026-03-10T07:50:54.680 INFO:tasks.workunit.client.0.vm05.stdout:5/513: getdents d2/d12/d4d 0 2026-03-10T07:50:54.691 INFO:tasks.workunit.client.0.vm05.stdout:2/597: creat d0/d8/d43/da4/fc0 x:0 0 0 2026-03-10T07:50:54.694 INFO:tasks.workunit.client.0.vm05.stdout:4/577: dwrite d0/d6/da6/fba [4194304,4194304] 0 2026-03-10T07:50:54.701 INFO:tasks.workunit.client.0.vm05.stdout:0/513: creat d8/dd/d10/d26/d48/fb0 x:0 0 0 2026-03-10T07:50:54.701 INFO:tasks.workunit.client.0.vm05.stdout:0/514: readlink d8/dd/d10/d26/d3a/l97 0 2026-03-10T07:50:54.704 INFO:tasks.workunit.client.0.vm05.stdout:1/544: creat da/d26/d9e/fa1 x:0 0 0 2026-03-10T07:50:54.710 INFO:tasks.workunit.client.0.vm05.stdout:3/503: write d8/d16/d19/f21 [1298893,44775] 0 2026-03-10T07:50:54.718 INFO:tasks.workunit.client.0.vm05.stdout:3/504: dread - d8/d22/d60/d6e/f97 zero size 2026-03-10T07:50:54.718 INFO:tasks.workunit.client.0.vm05.stdout:3/505: dread d8/d16/d52/f9f [0,4194304] 0 2026-03-10T07:50:54.718 INFO:tasks.workunit.client.0.vm05.stdout:9/505: mknod d8/d86/d28/ca5 0 2026-03-10T07:50:54.720 INFO:tasks.workunit.client.0.vm05.stdout:2/598: dread d0/d8/d3d/d7d/f36 [4194304,4194304] 0 2026-03-10T07:50:54.720 
INFO:tasks.workunit.client.0.vm05.stdout:4/578: creat d0/d6/d9/d5a/d91/fc6 x:0 0 0 2026-03-10T07:50:54.729 INFO:tasks.workunit.client.0.vm05.stdout:1/545: creat da/d26/d2b/d71/fa2 x:0 0 0 2026-03-10T07:50:54.733 INFO:tasks.workunit.client.0.vm05.stdout:8/491: mkdir d1/dd/d18/d20/d2a/d48/d7c/d9c 0 2026-03-10T07:50:54.734 INFO:tasks.workunit.client.0.vm05.stdout:8/492: readlink d1/lf 0 2026-03-10T07:50:54.735 INFO:tasks.workunit.client.0.vm05.stdout:7/522: creat d1/d6/d80/d82/fa0 x:0 0 0 2026-03-10T07:50:54.746 INFO:tasks.workunit.client.0.vm05.stdout:5/514: fdatasync d2/d20/f57 0 2026-03-10T07:50:54.752 INFO:tasks.workunit.client.0.vm05.stdout:3/506: dread d8/d1f/d2a/d96/f85 [0,4194304] 0 2026-03-10T07:50:54.757 INFO:tasks.workunit.client.0.vm05.stdout:4/579: mkdir d0/d6/d9/d12/d69/dc7 0 2026-03-10T07:50:54.758 INFO:tasks.workunit.client.0.vm05.stdout:4/580: fdatasync d0/d6/d9/d5a/f58 0 2026-03-10T07:50:54.764 INFO:tasks.workunit.client.0.vm05.stdout:8/493: creat d1/d45/f9d x:0 0 0 2026-03-10T07:50:54.765 INFO:tasks.workunit.client.0.vm05.stdout:7/523: write d1/d6/f84 [560402,95064] 0 2026-03-10T07:50:54.768 INFO:tasks.workunit.client.0.vm05.stdout:6/530: truncate d0/d11/d4f/d56/f6b 2769860 0 2026-03-10T07:50:54.773 INFO:tasks.workunit.client.0.vm05.stdout:5/515: chown d2/d20/d4c/fa5 3084 1 2026-03-10T07:50:54.774 INFO:tasks.workunit.client.0.vm05.stdout:5/516: write d2/d12/d2d/f9e [757010,45644] 0 2026-03-10T07:50:54.779 INFO:tasks.workunit.client.0.vm05.stdout:2/599: getdents d0/d7e/db4/d49/db1 0 2026-03-10T07:50:54.781 INFO:tasks.workunit.client.0.vm05.stdout:0/515: rename c3 to d8/dd/d10/d26/cb1 0 2026-03-10T07:50:54.783 INFO:tasks.workunit.client.0.vm05.stdout:1/546: link da/dd/d12/d34/f5f da/dd/d12/d19/d20/fa3 0 2026-03-10T07:50:54.784 INFO:tasks.workunit.client.0.vm05.stdout:4/581: creat d0/d6/d9/d12/d45/fc8 x:0 0 0 2026-03-10T07:50:54.789 INFO:tasks.workunit.client.0.vm05.stdout:8/494: mkdir d1/dd/d5e/d9e 0 2026-03-10T07:50:54.793 
INFO:tasks.workunit.client.0.vm05.stdout:9/506: link d8/d35/d22/d33/l7f d8/d35/d22/d33/d62/la6 0 2026-03-10T07:50:54.794 INFO:tasks.workunit.client.0.vm05.stdout:7/524: write d1/d3c/d71/d79/d8a/f90 [138376,96680] 0 2026-03-10T07:50:54.795 INFO:tasks.workunit.client.0.vm05.stdout:4/582: dwrite d0/d6/d9/f67 [4194304,4194304] 0 2026-03-10T07:50:54.797 INFO:tasks.workunit.client.0.vm05.stdout:7/525: chown d1/d3c/d71/d79/d8a 257657923 1 2026-03-10T07:50:54.804 INFO:tasks.workunit.client.0.vm05.stdout:7/526: readlink d1/d6/l26 0 2026-03-10T07:50:54.804 INFO:tasks.workunit.client.0.vm05.stdout:2/600: mknod d0/d8/d66/cc1 0 2026-03-10T07:50:54.805 INFO:tasks.workunit.client.0.vm05.stdout:2/601: truncate d0/d8/d43/df/f97 55273 0 2026-03-10T07:50:54.807 INFO:tasks.workunit.client.0.vm05.stdout:9/507: dread d8/d86/d28/f29 [0,4194304] 0 2026-03-10T07:50:54.808 INFO:tasks.workunit.client.0.vm05.stdout:6/531: mkdir d0/d11/d57/da4 0 2026-03-10T07:50:54.809 INFO:tasks.workunit.client.0.vm05.stdout:7/527: creat d1/d6/d3b/fa1 x:0 0 0 2026-03-10T07:50:54.811 INFO:tasks.workunit.client.0.vm05.stdout:4/583: creat d0/d6/d9/d5a/d91/fc9 x:0 0 0 2026-03-10T07:50:54.811 INFO:tasks.workunit.client.0.vm05.stdout:0/516: symlink d8/dd/d37/d67/d96/lb2 0 2026-03-10T07:50:54.811 INFO:tasks.workunit.client.0.vm05.stdout:5/517: dwrite d2/d20/d7b/f83 [0,4194304] 0 2026-03-10T07:50:54.811 INFO:tasks.workunit.client.0.vm05.stdout:9/508: write d8/d35/d22/d33/f73 [351570,83048] 0 2026-03-10T07:50:54.824 INFO:tasks.workunit.client.0.vm05.stdout:1/547: mknod da/d26/d2b/ca4 0 2026-03-10T07:50:54.826 INFO:tasks.workunit.client.0.vm05.stdout:6/532: creat d0/d11/d22/d6c/fa5 x:0 0 0 2026-03-10T07:50:54.826 INFO:tasks.workunit.client.0.vm05.stdout:4/584: mknod d0/d6/d95/cca 0 2026-03-10T07:50:54.827 INFO:tasks.workunit.client.0.vm05.stdout:0/517: rmdir d8/dd/d37 39 2026-03-10T07:50:54.827 INFO:tasks.workunit.client.0.vm05.stdout:4/585: dread - d0/d6/d9/f83 zero size 2026-03-10T07:50:54.828 
INFO:tasks.workunit.client.0.vm05.stdout:9/509: creat d8/d35/d1c/d20/fa7 x:0 0 0 2026-03-10T07:50:54.831 INFO:tasks.workunit.client.0.vm05.stdout:8/495: sync 2026-03-10T07:50:54.834 INFO:tasks.workunit.client.0.vm05.stdout:5/518: creat d2/d12/d2d/d4a/faf x:0 0 0 2026-03-10T07:50:54.837 INFO:tasks.workunit.client.0.vm05.stdout:0/518: mknod d8/dd/d10/d26/d3a/d5e/cb3 0 2026-03-10T07:50:54.842 INFO:tasks.workunit.client.0.vm05.stdout:6/533: dwrite d0/d11/d57/d66/f79 [0,4194304] 0 2026-03-10T07:50:54.842 INFO:tasks.workunit.client.0.vm05.stdout:2/602: dread d0/d8/d43/df/f21 [0,4194304] 0 2026-03-10T07:50:54.843 INFO:tasks.workunit.client.0.vm05.stdout:4/586: creat d0/d6/da6/fcb x:0 0 0 2026-03-10T07:50:54.848 INFO:tasks.workunit.client.0.vm05.stdout:8/496: symlink d1/d45/l9f 0 2026-03-10T07:50:54.853 INFO:tasks.workunit.client.0.vm05.stdout:9/510: sync 2026-03-10T07:50:54.863 INFO:tasks.workunit.client.0.vm05.stdout:5/519: fdatasync d2/d20/d33/d53/f97 0 2026-03-10T07:50:54.868 INFO:tasks.workunit.client.0.vm05.stdout:7/528: dread d1/d6/f32 [0,4194304] 0 2026-03-10T07:50:54.876 INFO:tasks.workunit.client.0.vm05.stdout:2/603: creat d0/d8/d43/df/fc2 x:0 0 0 2026-03-10T07:50:54.887 INFO:tasks.workunit.client.0.vm05.stdout:5/520: creat d2/d20/d33/d53/fb0 x:0 0 0 2026-03-10T07:50:54.895 INFO:tasks.workunit.client.0.vm05.stdout:6/534: fsync d0/d11/d22/d6c/fa5 0 2026-03-10T07:50:54.896 INFO:tasks.workunit.client.0.vm05.stdout:7/529: creat d1/d6/d3b/d7f/fa2 x:0 0 0 2026-03-10T07:50:54.896 INFO:tasks.workunit.client.0.vm05.stdout:3/507: dwrite d8/d16/f1a [0,4194304] 0 2026-03-10T07:50:54.898 INFO:tasks.workunit.client.0.vm05.stdout:3/508: fsync d8/d22/d60/f8e 0 2026-03-10T07:50:54.898 INFO:tasks.workunit.client.0.vm05.stdout:7/530: chown d1/d6/d47/f52 3360 1 2026-03-10T07:50:54.904 INFO:tasks.workunit.client.0.vm05.stdout:8/497: mkdir d1/dd/d4d/d64/da0 0 2026-03-10T07:50:54.904 INFO:tasks.workunit.client.0.vm05.stdout:9/511: symlink d8/d35/la8 0 2026-03-10T07:50:54.904 
INFO:tasks.workunit.client.0.vm05.stdout:7/531: chown d1/d6/d47/l4a 73 1 2026-03-10T07:50:54.904 INFO:tasks.workunit.client.0.vm05.stdout:5/521: rename d2/d12/f24 to d2/d20/d4c/fb1 0 2026-03-10T07:50:54.905 INFO:tasks.workunit.client.0.vm05.stdout:4/587: fsync d0/d6/d9/d12/d45/d55/f7d 0 2026-03-10T07:50:54.909 INFO:tasks.workunit.client.0.vm05.stdout:9/512: write d8/d35/d22/f4a [420750,115663] 0 2026-03-10T07:50:54.910 INFO:tasks.workunit.client.0.vm05.stdout:0/519: mknod d8/dd/d37/d56/cb4 0 2026-03-10T07:50:54.911 INFO:tasks.workunit.client.0.vm05.stdout:3/509: unlink d8/d1c/f2b 0 2026-03-10T07:50:54.911 INFO:tasks.workunit.client.0.vm05.stdout:0/520: fsync d8/dd/d10/d26/d8b/d86/fa0 0 2026-03-10T07:50:54.915 INFO:tasks.workunit.client.0.vm05.stdout:1/548: link da/dd/d12/f22 da/dd/fa5 0 2026-03-10T07:50:54.915 INFO:tasks.workunit.client.0.vm05.stdout:1/549: stat f4 0 2026-03-10T07:50:54.915 INFO:tasks.workunit.client.0.vm05.stdout:3/510: write d8/d16/d19/f21 [4268130,130823] 0 2026-03-10T07:50:54.916 INFO:tasks.workunit.client.0.vm05.stdout:5/522: symlink d2/d5/lb2 0 2026-03-10T07:50:54.916 INFO:tasks.workunit.client.0.vm05.stdout:0/521: write d8/dd/d10/d26/d2a/f8f [13726,10922] 0 2026-03-10T07:50:54.918 INFO:tasks.workunit.client.0.vm05.stdout:9/513: readlink d8/d35/d22/d33/l7f 0 2026-03-10T07:50:54.918 INFO:tasks.workunit.client.0.vm05.stdout:8/498: chown d1/dd/d18/d20/d2a/d34/l82 71278 1 2026-03-10T07:50:54.920 INFO:tasks.workunit.client.0.vm05.stdout:5/523: truncate d2/d12/d2d/fa9 716934 0 2026-03-10T07:50:54.921 INFO:tasks.workunit.client.0.vm05.stdout:6/535: mkdir d0/d11/d4f/da0/da6 0 2026-03-10T07:50:54.922 INFO:tasks.workunit.client.0.vm05.stdout:4/588: dwrite d0/d6/d9/d12/d9c/db7/da7/d5c/f98 [0,4194304] 0 2026-03-10T07:50:54.923 INFO:tasks.workunit.client.0.vm05.stdout:5/524: write d2/d20/d33/d53/f97 [864401,8396] 0 2026-03-10T07:50:54.925 INFO:tasks.workunit.client.0.vm05.stdout:2/604: creat d0/d8/fc3 x:0 0 0 2026-03-10T07:50:54.931 
INFO:tasks.workunit.client.0.vm05.stdout:4/589: chown d0/d6/d95/fad 245 1 2026-03-10T07:50:54.931 INFO:tasks.workunit.client.0.vm05.stdout:0/522: rmdir d8/dd/d37/d56/d4d 39 2026-03-10T07:50:54.935 INFO:tasks.workunit.client.0.vm05.stdout:3/511: truncate d8/f5d 198556 0 2026-03-10T07:50:54.936 INFO:tasks.workunit.client.0.vm05.stdout:8/499: mkdir d1/dd/d18/d20/d2a/d48/d7c/da1 0 2026-03-10T07:50:54.936 INFO:tasks.workunit.client.0.vm05.stdout:3/512: fdatasync d8/d1c/f56 0 2026-03-10T07:50:54.936 INFO:tasks.workunit.client.0.vm05.stdout:0/523: truncate d8/f75 5124768 0 2026-03-10T07:50:54.941 INFO:tasks.workunit.client.0.vm05.stdout:0/524: stat d8/dd/d10/d26/d3a/c6a 0 2026-03-10T07:50:54.942 INFO:tasks.workunit.client.0.vm05.stdout:2/605: creat d0/d8/d43/d38/fc4 x:0 0 0 2026-03-10T07:50:54.948 INFO:tasks.workunit.client.0.vm05.stdout:7/532: symlink d1/d34/d59/d60/d8c/la3 0 2026-03-10T07:50:54.950 INFO:tasks.workunit.client.0.vm05.stdout:4/590: dwrite d0/d6/d9/d5a/d6e/db6/db9/fbc [0,4194304] 0 2026-03-10T07:50:54.957 INFO:tasks.workunit.client.0.vm05.stdout:4/591: stat d0/d6/d9/d12/d65/l99 0 2026-03-10T07:50:54.978 INFO:tasks.workunit.client.0.vm05.stdout:4/592: dread d0/d28/f9b [0,4194304] 0 2026-03-10T07:50:54.981 INFO:tasks.workunit.client.0.vm05.stdout:3/513: mkdir d8/d1f/d2a/d96/da9 0 2026-03-10T07:50:54.981 INFO:tasks.workunit.client.0.vm05.stdout:5/525: truncate d2/f9 4283426 0 2026-03-10T07:50:54.986 INFO:tasks.workunit.client.0.vm05.stdout:9/514: creat d8/d86/d28/d79/d57/d96/fa9 x:0 0 0 2026-03-10T07:50:55.004 INFO:tasks.workunit.client.0.vm05.stdout:6/536: creat d0/d11/d4f/d6e/fa7 x:0 0 0 2026-03-10T07:50:55.011 INFO:tasks.workunit.client.0.vm05.stdout:4/593: symlink d0/d6/d9/d12/d69/lcc 0 2026-03-10T07:50:55.014 INFO:tasks.workunit.client.0.vm05.stdout:3/514: creat d8/d1c/d48/faa x:0 0 0 2026-03-10T07:50:55.015 INFO:tasks.workunit.client.0.vm05.stdout:5/526: creat d2/d20/d33/d86/fb3 x:0 0 0 2026-03-10T07:50:55.016 
INFO:tasks.workunit.client.0.vm05.stdout:8/500: creat d1/dd/d18/d20/d2a/d48/d7c/da1/fa2 x:0 0 0 2026-03-10T07:50:55.020 INFO:tasks.workunit.client.0.vm05.stdout:4/594: fdatasync d0/d6/d9/d12/d45/d55/d44/d85/f9e 0 2026-03-10T07:50:55.024 INFO:tasks.workunit.client.0.vm05.stdout:0/525: rename d8/dd/d10/d26/d8b/d70/la2 to d8/lb5 0 2026-03-10T07:50:55.024 INFO:tasks.workunit.client.0.vm05.stdout:0/526: chown d8/c73 95887186 1 2026-03-10T07:50:55.025 INFO:tasks.workunit.client.0.vm05.stdout:1/550: dread da/dd/d12/f22 [0,4194304] 0 2026-03-10T07:50:55.029 INFO:tasks.workunit.client.0.vm05.stdout:5/527: dread d2/d5/d61/f8b [0,4194304] 0 2026-03-10T07:50:55.029 INFO:tasks.workunit.client.0.vm05.stdout:4/595: truncate d0/d6/d95/fad 826475 0 2026-03-10T07:50:55.034 INFO:tasks.workunit.client.0.vm05.stdout:4/596: write d0/d6/d9/d12/d45/d55/d44/d85/fc0 [1360384,8500] 0 2026-03-10T07:50:55.035 INFO:tasks.workunit.client.0.vm05.stdout:4/597: write d0/d6/f39 [1466336,82011] 0 2026-03-10T07:50:55.046 INFO:tasks.workunit.client.0.vm05.stdout:1/551: sync 2026-03-10T07:50:55.046 INFO:tasks.workunit.client.0.vm05.stdout:2/606: write d0/d8/f42 [625378,123870] 0 2026-03-10T07:50:55.046 INFO:tasks.workunit.client.0.vm05.stdout:4/598: write d0/d6/d9/d12/d9c/db7/f63 [3614456,81144] 0 2026-03-10T07:50:55.048 INFO:tasks.workunit.client.0.vm05.stdout:6/537: creat d0/d11/d22/d6c/d84/fa8 x:0 0 0 2026-03-10T07:50:55.048 INFO:tasks.workunit.client.0.vm05.stdout:1/552: dread - da/dd/d12/d19/d20/f6e zero size 2026-03-10T07:50:55.048 INFO:tasks.workunit.client.0.vm05.stdout:2/607: dread d0/d8/d43/f90 [0,4194304] 0 2026-03-10T07:50:55.052 INFO:tasks.workunit.client.0.vm05.stdout:6/538: read d0/d11/d57/d66/f79 [3870486,111699] 0 2026-03-10T07:50:55.052 INFO:tasks.workunit.client.0.vm05.stdout:2/608: chown d0/d7e/db4/fa6 962926509 1 2026-03-10T07:50:55.052 INFO:tasks.workunit.client.0.vm05.stdout:6/539: stat d0/d11/f21 0 2026-03-10T07:50:55.055 INFO:tasks.workunit.client.0.vm05.stdout:6/540: dwrite 
d0/d35/f41 [0,4194304] 0 2026-03-10T07:50:55.059 INFO:tasks.workunit.client.0.vm05.stdout:9/515: mknod d8/d35/d1c/d2c/d63/caa 0 2026-03-10T07:50:55.065 INFO:tasks.workunit.client.0.vm05.stdout:0/527: rename d8/dd/d10/fac to d8/dd/d37/d67/fb6 0 2026-03-10T07:50:55.066 INFO:tasks.workunit.client.0.vm05.stdout:5/528: creat d2/d20/d33/d53/d7d/fb4 x:0 0 0 2026-03-10T07:50:55.067 INFO:tasks.workunit.client.0.vm05.stdout:0/528: chown d8/dd/d10/d26/d8b/da4/f3e 3017451 1 2026-03-10T07:50:55.073 INFO:tasks.workunit.client.0.vm05.stdout:5/529: dwrite d2/d20/d33/d86/d8d/da1/faa [0,4194304] 0 2026-03-10T07:50:55.079 INFO:tasks.workunit.client.0.vm05.stdout:5/530: stat d2/d12/d2d/c89 0 2026-03-10T07:50:55.079 INFO:tasks.workunit.client.0.vm05.stdout:5/531: stat d2/d12/f40 0 2026-03-10T07:50:55.079 INFO:tasks.workunit.client.0.vm05.stdout:5/532: chown d2/d5/f3c 107209270 1 2026-03-10T07:50:55.079 INFO:tasks.workunit.client.0.vm05.stdout:5/533: read - d2/fad zero size 2026-03-10T07:50:55.091 INFO:tasks.workunit.client.0.vm05.stdout:0/529: dread d8/f75 [0,4194304] 0 2026-03-10T07:50:55.092 INFO:tasks.workunit.client.0.vm05.stdout:4/599: creat d0/d6/d9/d5a/fcd x:0 0 0 2026-03-10T07:50:55.096 INFO:tasks.workunit.client.0.vm05.stdout:0/530: dread d8/dd/d10/d26/d2a/fab [0,4194304] 0 2026-03-10T07:50:55.102 INFO:tasks.workunit.client.0.vm05.stdout:2/609: symlink d0/d8/d3d/d7d/da5/da8/lc5 0 2026-03-10T07:50:55.102 INFO:tasks.workunit.client.0.vm05.stdout:3/515: creat d8/d16/d52/da4/fab x:0 0 0 2026-03-10T07:50:55.102 INFO:tasks.workunit.client.0.vm05.stdout:9/516: mkdir d8/d35/d22/dab 0 2026-03-10T07:50:55.103 INFO:tasks.workunit.client.0.vm05.stdout:7/533: link d1/d6/f32 d1/fa4 0 2026-03-10T07:50:55.103 INFO:tasks.workunit.client.0.vm05.stdout:5/534: creat d2/d20/d33/d86/fb5 x:0 0 0 2026-03-10T07:50:55.105 INFO:tasks.workunit.client.0.vm05.stdout:6/541: dread d0/d11/d57/f5f [0,4194304] 0 2026-03-10T07:50:55.107 INFO:tasks.workunit.client.0.vm05.stdout:3/516: read d8/d1c/f56 
[1973980,112066] 0 2026-03-10T07:50:55.107 INFO:tasks.workunit.client.0.vm05.stdout:4/600: symlink d0/d6/d9/d12/d45/d55/lce 0 2026-03-10T07:50:55.121 INFO:tasks.workunit.client.0.vm05.stdout:1/553: dwrite da/dd/d12/d19/d20/f6c [0,4194304] 0 2026-03-10T07:50:55.123 INFO:tasks.workunit.client.0.vm05.stdout:1/554: stat da/d26/d2b/d71/fa2 0 2026-03-10T07:50:55.141 INFO:tasks.workunit.client.0.vm05.stdout:2/610: dread - d0/d8/d43/d38/f85 zero size 2026-03-10T07:50:55.142 INFO:tasks.workunit.client.0.vm05.stdout:5/535: fsync d2/d4b/f73 0 2026-03-10T07:50:55.145 INFO:tasks.workunit.client.0.vm05.stdout:1/555: unlink da/f43 0 2026-03-10T07:50:55.145 INFO:tasks.workunit.client.0.vm05.stdout:8/501: getdents d1/dd/d18/d20/d2a/d34 0 2026-03-10T07:50:55.150 INFO:tasks.workunit.client.0.vm05.stdout:4/601: dread d0/d6/d95/f40 [0,4194304] 0 2026-03-10T07:50:55.153 INFO:tasks.workunit.client.0.vm05.stdout:0/531: dwrite d8/f65 [0,4194304] 0 2026-03-10T07:50:55.158 INFO:tasks.workunit.client.0.vm05.stdout:0/532: stat d8/dd/d10/d26/d2a 0 2026-03-10T07:50:55.158 INFO:tasks.workunit.client.0.vm05.stdout:0/533: readlink d8/dd/d10/l2c 0 2026-03-10T07:50:55.158 INFO:tasks.workunit.client.0.vm05.stdout:0/534: chown d8/dd 51350621 1 2026-03-10T07:50:55.165 INFO:tasks.workunit.client.0.vm05.stdout:1/556: mknod da/dd/d12/d19/ca6 0 2026-03-10T07:50:55.165 INFO:tasks.workunit.client.0.vm05.stdout:1/557: fdatasync f4 0 2026-03-10T07:50:55.168 INFO:tasks.workunit.client.0.vm05.stdout:2/611: mkdir d0/d8/dc6 0 2026-03-10T07:50:55.181 INFO:tasks.workunit.client.0.vm05.stdout:7/534: getdents d1 0 2026-03-10T07:50:55.183 INFO:tasks.workunit.client.0.vm05.stdout:9/517: getdents d8/d35/d1c/d2c 0 2026-03-10T07:50:55.183 INFO:tasks.workunit.client.0.vm05.stdout:2/612: truncate d0/f1e 2024926 0 2026-03-10T07:50:55.183 INFO:tasks.workunit.client.0.vm05.stdout:0/535: mkdir d8/dd/d10/db7 0 2026-03-10T07:50:55.185 INFO:tasks.workunit.client.0.vm05.stdout:9/518: write d8/d86/d28/f43 [1016687,39992] 0 
2026-03-10T07:50:55.185 INFO:tasks.workunit.client.0.vm05.stdout:2/613: chown d0/d7e/db4/d49/dab 7958085 1 2026-03-10T07:50:55.185 INFO:tasks.workunit.client.0.vm05.stdout:5/536: rmdir d2/d20/dab 0 2026-03-10T07:50:55.186 INFO:tasks.workunit.client.0.vm05.stdout:3/517: getdents d8/d1c/d48 0 2026-03-10T07:50:55.187 INFO:tasks.workunit.client.0.vm05.stdout:6/542: getdents d0/d6 0 2026-03-10T07:50:55.187 INFO:tasks.workunit.client.0.vm05.stdout:1/558: creat da/d26/d2b/d89/fa7 x:0 0 0 2026-03-10T07:50:55.189 INFO:tasks.workunit.client.0.vm05.stdout:5/537: fsync d2/d12/d2d/d4a/f99 0 2026-03-10T07:50:55.189 INFO:tasks.workunit.client.0.vm05.stdout:3/518: dread - d8/d16/d52/da4/fab zero size 2026-03-10T07:50:55.192 INFO:tasks.workunit.client.0.vm05.stdout:3/519: truncate d8/d16/d19/f21 5256776 0 2026-03-10T07:50:55.193 INFO:tasks.workunit.client.0.vm05.stdout:0/536: creat d8/dd/d10/d26/d2a/d6f/fb8 x:0 0 0 2026-03-10T07:50:55.195 INFO:tasks.workunit.client.0.vm05.stdout:2/614: mknod d0/d52/cc7 0 2026-03-10T07:50:55.197 INFO:tasks.workunit.client.0.vm05.stdout:4/602: dread d0/d6/f32 [0,4194304] 0 2026-03-10T07:50:55.207 INFO:tasks.workunit.client.0.vm05.stdout:8/502: dwrite d1/f15 [0,4194304] 0 2026-03-10T07:50:55.208 INFO:tasks.workunit.client.0.vm05.stdout:6/543: rmdir d0/d11/d57/d93 0 2026-03-10T07:50:55.210 INFO:tasks.workunit.client.0.vm05.stdout:1/559: dwrite da/dd/d2a/f54 [4194304,4194304] 0 2026-03-10T07:50:55.210 INFO:tasks.workunit.client.0.vm05.stdout:1/560: chown f4 1081 1 2026-03-10T07:50:55.210 INFO:tasks.workunit.client.0.vm05.stdout:3/520: dwrite d8/d16/d19/f21 [0,4194304] 0 2026-03-10T07:50:55.213 INFO:tasks.workunit.client.0.vm05.stdout:7/535: fsync d1/d34/f3e 0 2026-03-10T07:50:55.216 INFO:tasks.workunit.client.0.vm05.stdout:0/537: mknod d8/dd/d10/d26/cb9 0 2026-03-10T07:50:55.216 INFO:tasks.workunit.client.0.vm05.stdout:2/615: creat d0/d7e/db4/d49/fc8 x:0 0 0 2026-03-10T07:50:55.223 INFO:tasks.workunit.client.0.vm05.stdout:5/538: dwrite d2/d20/f51 
[0,4194304] 0 2026-03-10T07:50:55.228 INFO:tasks.workunit.client.0.vm05.stdout:3/521: fsync d8/d22/d60/f61 0 2026-03-10T07:50:55.228 INFO:tasks.workunit.client.0.vm05.stdout:3/522: truncate d8/d22/f86 5105070 0 2026-03-10T07:50:55.229 INFO:tasks.workunit.client.0.vm05.stdout:1/561: mkdir da/d26/d9e/da8 0 2026-03-10T07:50:55.229 INFO:tasks.workunit.client.0.vm05.stdout:6/544: write d0/d6/d3b/f8f [667000,73579] 0 2026-03-10T07:50:55.231 INFO:tasks.workunit.client.0.vm05.stdout:4/603: truncate d0/d6/d9/d5a/d6e/db6/db9/fbc 4826461 0 2026-03-10T07:50:55.241 INFO:tasks.workunit.client.0.vm05.stdout:4/604: write d0/d6/d37/f3d [1811829,44603] 0 2026-03-10T07:50:55.242 INFO:tasks.workunit.client.0.vm05.stdout:8/503: unlink d1/dd/d18/d20/d2a/d34/d49/f56 0 2026-03-10T07:50:55.243 INFO:tasks.workunit.client.0.vm05.stdout:3/523: rename d8/d22/d54 to d8/d16/dac 0 2026-03-10T07:50:55.246 INFO:tasks.workunit.client.0.vm05.stdout:5/539: mkdir d2/d20/d4c/db6 0 2026-03-10T07:50:55.250 INFO:tasks.workunit.client.0.vm05.stdout:4/605: truncate d0/d6/d9/d12/d45/d55/d4e/f97 4705613 0 2026-03-10T07:50:55.256 INFO:tasks.workunit.client.0.vm05.stdout:9/519: dread f6 [0,4194304] 0 2026-03-10T07:50:55.256 INFO:tasks.workunit.client.0.vm05.stdout:9/520: fsync d8/d35/d22/d33/d62/d6d/fa0 0 2026-03-10T07:50:55.258 INFO:tasks.workunit.client.0.vm05.stdout:8/504: dwrite d1/dd/d18/f70 [0,4194304] 0 2026-03-10T07:50:55.262 INFO:tasks.workunit.client.0.vm05.stdout:6/545: unlink d0/d35/c62 0 2026-03-10T07:50:55.262 INFO:tasks.workunit.client.0.vm05.stdout:1/562: creat da/dd/d12/d19/fa9 x:0 0 0 2026-03-10T07:50:55.267 INFO:tasks.workunit.client.0.vm05.stdout:2/616: sync 2026-03-10T07:50:55.282 INFO:tasks.workunit.client.0.vm05.stdout:5/540: mknod d2/d12/d2d/cb7 0 2026-03-10T07:50:55.287 INFO:tasks.workunit.client.0.vm05.stdout:4/606: mkdir d0/d6/d9/d12/d65/dcf 0 2026-03-10T07:50:55.289 INFO:tasks.workunit.client.0.vm05.stdout:7/536: dread d1/d6/f77 [0,4194304] 0 2026-03-10T07:50:55.292 
INFO:tasks.workunit.client.0.vm05.stdout:4/607: dwrite d0/d6/d9/d5a/f58 [4194304,4194304] 0 2026-03-10T07:50:55.299 INFO:tasks.workunit.client.0.vm05.stdout:9/521: dread d8/d35/d22/d33/d62/f6c [0,4194304] 0 2026-03-10T07:50:55.302 INFO:tasks.workunit.client.0.vm05.stdout:9/522: dwrite d8/d35/d22/d33/f5b [0,4194304] 0 2026-03-10T07:50:55.325 INFO:tasks.workunit.client.0.vm05.stdout:3/524: mkdir d8/d22/dad 0 2026-03-10T07:50:55.325 INFO:tasks.workunit.client.0.vm05.stdout:5/541: rename d2/d12/da8/lae to d2/d12/d4d/lb8 0 2026-03-10T07:50:55.325 INFO:tasks.workunit.client.0.vm05.stdout:5/542: write d2/d12/f40 [551463,91663] 0 2026-03-10T07:50:55.325 INFO:tasks.workunit.client.0.vm05.stdout:7/537: creat d1/d3c/d71/d79/d8a/fa5 x:0 0 0 2026-03-10T07:50:55.325 INFO:tasks.workunit.client.0.vm05.stdout:1/563: mknod da/caa 0 2026-03-10T07:50:55.325 INFO:tasks.workunit.client.0.vm05.stdout:1/564: truncate da/d26/d2b/d71/f98 886453 0 2026-03-10T07:50:55.325 INFO:tasks.workunit.client.0.vm05.stdout:1/565: dread - da/d26/d2b/d71/fa2 zero size 2026-03-10T07:50:55.325 INFO:tasks.workunit.client.0.vm05.stdout:1/566: chown da/dd/d12/d19/l7e 1124574824 1 2026-03-10T07:50:55.325 INFO:tasks.workunit.client.0.vm05.stdout:1/567: readlink da/dd/d2a/d55/d68/l8b 0 2026-03-10T07:50:55.325 INFO:tasks.workunit.client.0.vm05.stdout:4/608: symlink d0/d6/d9/d5a/ld0 0 2026-03-10T07:50:55.325 INFO:tasks.workunit.client.0.vm05.stdout:1/568: chown f4 1391 1 2026-03-10T07:50:55.325 INFO:tasks.workunit.client.0.vm05.stdout:3/525: creat d8/d1f/d2a/d34/fae x:0 0 0 2026-03-10T07:50:55.325 INFO:tasks.workunit.client.0.vm05.stdout:5/543: symlink d2/d20/d4c/lb9 0 2026-03-10T07:50:55.325 INFO:tasks.workunit.client.0.vm05.stdout:4/609: read d0/d6/d9/d12/d9c/db7/da7/f53 [699326,120511] 0 2026-03-10T07:50:55.326 INFO:tasks.workunit.client.0.vm05.stdout:7/538: fdatasync d1/d6/d47/f52 0 2026-03-10T07:50:55.337 INFO:tasks.workunit.client.0.vm05.stdout:3/526: mkdir d8/d16/d19/daf 0 2026-03-10T07:50:55.337 
INFO:tasks.workunit.client.0.vm05.stdout:5/544: symlink d2/d12/d2d/d4a/lba 0 2026-03-10T07:50:55.337 INFO:tasks.workunit.client.0.vm05.stdout:4/610: mkdir d0/d6/d9/d5a/d6e/dd1 0 2026-03-10T07:50:55.337 INFO:tasks.workunit.client.0.vm05.stdout:3/527: dwrite d8/d16/f2d [0,4194304] 0 2026-03-10T07:50:55.337 INFO:tasks.workunit.client.0.vm05.stdout:7/539: mkdir d1/d3c/d4b/da6 0 2026-03-10T07:50:55.338 INFO:tasks.workunit.client.0.vm05.stdout:1/569: rename c7 to da/dd/d12/d86/cab 0 2026-03-10T07:50:55.340 INFO:tasks.workunit.client.0.vm05.stdout:7/540: read d1/d6/f22 [601727,47935] 0 2026-03-10T07:50:55.349 INFO:tasks.workunit.client.0.vm05.stdout:9/523: creat d8/d35/d22/d33/fac x:0 0 0 2026-03-10T07:50:55.352 INFO:tasks.workunit.client.0.vm05.stdout:7/541: dwrite d1/d6/d80/d82/fa0 [0,4194304] 0 2026-03-10T07:50:55.358 INFO:tasks.workunit.client.0.vm05.stdout:7/542: chown d1/f46 27844 1 2026-03-10T07:50:55.358 INFO:tasks.workunit.client.0.vm05.stdout:9/524: dread d8/d35/d22/f4a [0,4194304] 0 2026-03-10T07:50:55.358 INFO:tasks.workunit.client.0.vm05.stdout:5/545: read - d2/d20/d5b/f6e zero size 2026-03-10T07:50:55.360 INFO:tasks.workunit.client.0.vm05.stdout:9/525: truncate d8/d35/d22/d33/f73 1145549 0 2026-03-10T07:50:55.361 INFO:tasks.workunit.client.0.vm05.stdout:3/528: mknod d8/d1c/d64/cb0 0 2026-03-10T07:50:55.363 INFO:tasks.workunit.client.0.vm05.stdout:4/611: dread d0/d6/d9/d12/d9c/db7/f63 [0,4194304] 0 2026-03-10T07:50:55.369 INFO:tasks.workunit.client.0.vm05.stdout:9/526: dwrite d8/d35/d1c/f49 [0,4194304] 0 2026-03-10T07:50:55.372 INFO:tasks.workunit.client.0.vm05.stdout:5/546: rename d2/d20/d33/d53/f6a to d2/d20/d33/d86/fbb 0 2026-03-10T07:50:55.377 INFO:tasks.workunit.client.0.vm05.stdout:5/547: dread d2/d12/d2d/d4a/f59 [0,4194304] 0 2026-03-10T07:50:55.392 INFO:tasks.workunit.client.0.vm05.stdout:5/548: dwrite d2/f1a [0,4194304] 0 2026-03-10T07:50:55.392 INFO:tasks.workunit.client.0.vm05.stdout:5/549: write d2/d20/d4c/fa5 [268030,91020] 0 
2026-03-10T07:50:55.402 INFO:tasks.workunit.client.0.vm05.stdout:5/550: read d2/d5/f3d [6365172,50759] 0 2026-03-10T07:50:55.426 INFO:tasks.workunit.client.0.vm05.stdout:9/527: dread d8/d35/d38/d71/d81/f83 [0,4194304] 0 2026-03-10T07:50:55.429 INFO:tasks.workunit.client.0.vm05.stdout:1/570: fsync da/d26/d2b/d89/fa7 0 2026-03-10T07:50:55.431 INFO:tasks.workunit.client.0.vm05.stdout:3/529: mknod d8/d1f/d24/cb1 0 2026-03-10T07:50:55.474 INFO:tasks.workunit.client.0.vm05.stdout:5/551: mkdir d2/d20/d7b/dbc 0 2026-03-10T07:50:55.519 INFO:tasks.workunit.client.0.vm05.stdout:0/538: dwrite d8/dd/d10/d26/d8b/da4/f3d [0,4194304] 0 2026-03-10T07:50:55.527 INFO:tasks.workunit.client.0.vm05.stdout:2/617: dwrite d0/d8/d43/df/f21 [4194304,4194304] 0 2026-03-10T07:50:55.528 INFO:tasks.workunit.client.0.vm05.stdout:8/505: dwrite d1/dd/d18/d20/d2a/d48/f57 [0,4194304] 0 2026-03-10T07:50:55.534 INFO:tasks.workunit.client.0.vm05.stdout:0/539: truncate d8/dd/d37/d56/f18 8437253 0 2026-03-10T07:50:55.534 INFO:tasks.workunit.client.0.vm05.stdout:6/546: dwrite d0/d35/d36/f71 [0,4194304] 0 2026-03-10T07:50:55.539 INFO:tasks.workunit.client.0.vm05.stdout:6/547: dwrite d0/d11/d57/d66/f75 [0,4194304] 0 2026-03-10T07:50:55.562 INFO:tasks.workunit.client.0.vm05.stdout:5/552: symlink d2/d20/d33/d53/d7d/lbd 0 2026-03-10T07:50:55.563 INFO:tasks.workunit.client.0.vm05.stdout:2/618: fdatasync d0/d8/f65 0 2026-03-10T07:50:55.563 INFO:tasks.workunit.client.0.vm05.stdout:5/553: chown d2/d20/d33/d53/f97 66719626 1 2026-03-10T07:50:55.579 INFO:tasks.workunit.client.0.vm05.stdout:6/548: fsync d0/d11/d4f/f91 0 2026-03-10T07:50:55.587 INFO:tasks.workunit.client.0.vm05.stdout:3/530: creat d8/d1c/d48/d69/fb2 x:0 0 0 2026-03-10T07:50:55.595 INFO:tasks.workunit.client.0.vm05.stdout:5/554: mkdir d2/d20/d7b/dbe 0 2026-03-10T07:50:55.597 INFO:tasks.workunit.client.0.vm05.stdout:6/549: mknod d0/d11/d31/ca9 0 2026-03-10T07:50:55.601 INFO:tasks.workunit.client.0.vm05.stdout:5/555: dwrite d2/d20/d4c/d64/f96 [0,4194304] 
0 2026-03-10T07:50:55.602 INFO:tasks.workunit.client.0.vm05.stdout:3/531: dwrite d8/d1f/d24/d8a/f91 [4194304,4194304] 0 2026-03-10T07:50:55.608 INFO:tasks.workunit.client.0.vm05.stdout:3/532: write d8/d1f/f49 [1028936,128812] 0 2026-03-10T07:50:55.613 INFO:tasks.workunit.client.0.vm05.stdout:3/533: write d8/d1c/d48/d69/fb2 [17871,56103] 0 2026-03-10T07:50:55.621 INFO:tasks.workunit.client.0.vm05.stdout:7/543: truncate d1/d34/f7c 4229968 0 2026-03-10T07:50:55.621 INFO:tasks.workunit.client.0.vm05.stdout:7/544: stat d1/c69 0 2026-03-10T07:50:55.621 INFO:tasks.workunit.client.0.vm05.stdout:6/550: dread d0/d6/d3b/f55 [0,4194304] 0 2026-03-10T07:50:55.621 INFO:tasks.workunit.client.0.vm05.stdout:7/545: write d1/d34/f4d [2176528,1909] 0 2026-03-10T07:50:55.621 INFO:tasks.workunit.client.0.vm05.stdout:7/546: stat d1/d6/d80 0 2026-03-10T07:50:55.621 INFO:tasks.workunit.client.0.vm05.stdout:6/551: dread d0/d11/d57/f5f [0,4194304] 0 2026-03-10T07:50:55.621 INFO:tasks.workunit.client.0.vm05.stdout:7/547: write d1/d5b/f73 [538792,35993] 0 2026-03-10T07:50:55.630 INFO:tasks.workunit.client.0.vm05.stdout:7/548: mkdir d1/d6/d47/d8d/da7 0 2026-03-10T07:50:55.631 INFO:tasks.workunit.client.0.vm05.stdout:6/552: creat d0/d11/d4f/d6e/faa x:0 0 0 2026-03-10T07:50:55.632 INFO:tasks.workunit.client.0.vm05.stdout:6/553: dread - d0/d11/d4f/d56/f73 zero size 2026-03-10T07:50:55.643 INFO:tasks.workunit.client.0.vm05.stdout:6/554: fsync d0/d11/d57/f9e 0 2026-03-10T07:50:55.651 INFO:tasks.workunit.client.0.vm05.stdout:7/549: dread d1/d34/d59/f72 [0,4194304] 0 2026-03-10T07:50:55.657 INFO:tasks.workunit.client.0.vm05.stdout:7/550: dwrite d1/d3c/d71/d79/f93 [0,4194304] 0 2026-03-10T07:50:55.658 INFO:tasks.workunit.client.0.vm05.stdout:7/551: write d1/d3c/d71/d79/f93 [1424879,83936] 0 2026-03-10T07:50:55.660 INFO:tasks.workunit.client.0.vm05.stdout:7/552: truncate d1/d3c/d71/f95 898460 0 2026-03-10T07:50:55.660 INFO:tasks.workunit.client.0.vm05.stdout:7/553: stat d1/d6/d47/d8d 0 
2026-03-10T07:50:55.673 INFO:tasks.workunit.client.0.vm05.stdout:2/619: dread d0/d8/d43/df/f20 [0,4194304] 0 2026-03-10T07:50:55.719 INFO:tasks.workunit.client.0.vm05.stdout:7/554: sync 2026-03-10T07:50:55.729 INFO:tasks.workunit.client.0.vm05.stdout:7/555: unlink d1/d34/d59/d60/f8e 0 2026-03-10T07:50:55.729 INFO:tasks.workunit.client.0.vm05.stdout:7/556: readlink d1/d3c/d4b/l56 0 2026-03-10T07:50:55.732 INFO:tasks.workunit.client.0.vm05.stdout:7/557: creat d1/d6/d80/d82/fa8 x:0 0 0 2026-03-10T07:50:55.741 INFO:tasks.workunit.client.0.vm05.stdout:7/558: creat d1/d6/d80/d82/fa9 x:0 0 0 2026-03-10T07:50:55.741 INFO:tasks.workunit.client.0.vm05.stdout:7/559: write d1/d6/d3b/f42 [3317725,2902] 0 2026-03-10T07:50:55.745 INFO:tasks.workunit.client.0.vm05.stdout:7/560: symlink d1/d34/d59/d60/d8c/laa 0 2026-03-10T07:50:55.747 INFO:tasks.workunit.client.0.vm05.stdout:7/561: creat d1/d3c/d71/fab x:0 0 0 2026-03-10T07:50:55.770 INFO:tasks.workunit.client.0.vm05.stdout:4/612: truncate d0/d6/f39 1188729 0 2026-03-10T07:50:55.774 INFO:tasks.workunit.client.0.vm05.stdout:4/613: dwrite d0/d6/d9/d8c/fc1 [0,4194304] 0 2026-03-10T07:50:55.788 INFO:tasks.workunit.client.0.vm05.stdout:8/506: dwrite d1/d45/f53 [4194304,4194304] 0 2026-03-10T07:50:55.797 INFO:tasks.workunit.client.0.vm05.stdout:8/507: write d1/dd/d18/d20/d2a/d48/f57 [2948450,68654] 0 2026-03-10T07:50:55.805 INFO:tasks.workunit.client.0.vm05.stdout:4/614: mkdir d0/d6/d9/d12/d45/d55/d4e/dd2 0 2026-03-10T07:50:55.806 INFO:tasks.workunit.client.0.vm05.stdout:1/571: truncate da/dd/d12/d19/f4e 454115 0 2026-03-10T07:50:55.807 INFO:tasks.workunit.client.0.vm05.stdout:0/540: write d8/dd/d37/d56/d4d/f69 [210385,19171] 0 2026-03-10T07:50:55.812 INFO:tasks.workunit.client.0.vm05.stdout:4/615: fdatasync d0/d6/d9/d12/d9c/db7/da7/f9a 0 2026-03-10T07:50:55.813 INFO:tasks.workunit.client.0.vm05.stdout:4/616: truncate d0/d6/d9/d12/d45/d55/f7d 1843437 0 2026-03-10T07:50:55.826 INFO:tasks.workunit.client.0.vm05.stdout:3/534: mkdir 
d8/d1c/db3 0 2026-03-10T07:50:55.835 INFO:tasks.workunit.client.0.vm05.stdout:3/535: dread d8/d1f/d2a/f42 [0,4194304] 0 2026-03-10T07:50:55.847 INFO:tasks.workunit.client.0.vm05.stdout:3/536: link d8/d1f/d2a/d34/fae d8/d8f/fb4 0 2026-03-10T07:50:55.851 INFO:tasks.workunit.client.0.vm05.stdout:5/556: dwrite d2/d20/f57 [0,4194304] 0 2026-03-10T07:50:55.853 INFO:tasks.workunit.client.0.vm05.stdout:3/537: chown d8/d1f/d2a/d4a/d7d/c8d 1160 1 2026-03-10T07:50:55.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:50:55 vm05.local ceph-mon[50387]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' 2026-03-10T07:50:55.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:50:55 vm05.local ceph-mon[50387]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' 2026-03-10T07:50:55.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:50:55 vm05.local ceph-mon[50387]: pgmap v17: 65 pgs: 65 active+clean; 1.1 GiB data, 4.0 GiB used, 116 GiB / 120 GiB avail; 42 MiB/s rd, 157 MiB/s wr, 354 op/s 2026-03-10T07:50:55.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:50:55 vm05.local ceph-mon[50387]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' 2026-03-10T07:50:55.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:50:55 vm05.local ceph-mon[50387]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' 2026-03-10T07:50:55.912 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:50:55 vm08.local ceph-mon[59917]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' 2026-03-10T07:50:55.913 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:50:55 vm08.local ceph-mon[59917]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' 2026-03-10T07:50:55.913 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:50:55 vm08.local ceph-mon[59917]: pgmap v17: 65 pgs: 65 active+clean; 1.1 GiB data, 4.0 GiB used, 116 GiB / 120 GiB avail; 42 MiB/s rd, 157 MiB/s wr, 354 op/s 
2026-03-10T07:50:55.913 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:50:55 vm08.local ceph-mon[59917]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' 2026-03-10T07:50:55.913 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:50:55 vm08.local ceph-mon[59917]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' 2026-03-10T07:50:55.945 INFO:tasks.workunit.client.0.vm05.stdout:6/555: rename d0/l70 to d0/d11/d4f/d7d/lab 0 2026-03-10T07:50:55.947 INFO:tasks.workunit.client.0.vm05.stdout:1/572: rename da/d26/d9e/da8 to da/dd/d2a/d55/d64/dac 0 2026-03-10T07:50:55.947 INFO:tasks.workunit.client.0.vm05.stdout:1/573: fdatasync da/dd/d2a/f75 0 2026-03-10T07:50:55.950 INFO:tasks.workunit.client.0.vm05.stdout:6/556: creat d0/d35/d36/fac x:0 0 0 2026-03-10T07:50:55.950 INFO:tasks.workunit.client.0.vm05.stdout:6/557: chown d0/d11/d57/f5f 419107 1 2026-03-10T07:50:55.977 INFO:tasks.workunit.client.0.vm05.stdout:1/574: dread da/dd/d2a/d70/f83 [0,4194304] 0 2026-03-10T07:50:55.978 INFO:tasks.workunit.client.0.vm05.stdout:1/575: mknod da/dd/d2a/d70/cad 0 2026-03-10T07:50:55.983 INFO:tasks.workunit.client.0.vm05.stdout:4/617: creat d0/d6/d9/d12/fd3 x:0 0 0 2026-03-10T07:50:55.985 INFO:tasks.workunit.client.0.vm05.stdout:2/620: mkdir d0/d8/d43/dc9 0 2026-03-10T07:50:55.987 INFO:tasks.workunit.client.0.vm05.stdout:4/618: unlink d0/d6/f32 0 2026-03-10T07:50:55.987 INFO:tasks.workunit.client.0.vm05.stdout:4/619: readlink d0/d6/d9/d12/d9c/db7/da7/d5c/l61 0 2026-03-10T07:50:55.996 INFO:tasks.workunit.client.0.vm05.stdout:4/620: dwrite d0/d6/d9/d12/d9c/db7/da7/d5c/f98 [0,4194304] 0 2026-03-10T07:50:55.997 INFO:tasks.workunit.client.0.vm05.stdout:1/576: dread da/dd/d2a/f2f [0,4194304] 0 2026-03-10T07:50:56.005 INFO:tasks.workunit.client.0.vm05.stdout:2/621: mknod d0/d8/d43/da4/cca 0 2026-03-10T07:50:56.005 INFO:tasks.workunit.client.0.vm05.stdout:7/562: truncate d1/d6/d47/f65 1919025 0 2026-03-10T07:50:56.006 
INFO:tasks.workunit.client.0.vm05.stdout:2/622: chown d0/d8/d43/df/d4d/l79 188910 1 2026-03-10T07:50:56.007 INFO:tasks.workunit.client.0.vm05.stdout:1/577: mknod da/dd/d12/d86/cae 0 2026-03-10T07:50:56.009 INFO:tasks.workunit.client.0.vm05.stdout:7/563: mkdir d1/d3c/d71/d79/d8a/dac 0 2026-03-10T07:50:56.018 INFO:tasks.workunit.client.0.vm05.stdout:7/564: dwrite d1/d6/f77 [0,4194304] 0 2026-03-10T07:50:56.021 INFO:tasks.workunit.client.0.vm05.stdout:2/623: dread d0/d8/f65 [0,4194304] 0 2026-03-10T07:50:56.026 INFO:tasks.workunit.client.0.vm05.stdout:8/508: dwrite d1/dd/d18/f3f [4194304,4194304] 0 2026-03-10T07:50:56.028 INFO:tasks.workunit.client.0.vm05.stdout:3/538: rename d8/d16/f67 to d8/d1f/d2a/d96/da9/fb5 0 2026-03-10T07:50:56.033 INFO:tasks.workunit.client.0.vm05.stdout:2/624: symlink d0/d7e/db4/d49/d81/lcb 0 2026-03-10T07:50:56.034 INFO:tasks.workunit.client.0.vm05.stdout:8/509: truncate d1/dd/d18/d20/f30 217603 0 2026-03-10T07:50:56.036 INFO:tasks.workunit.client.0.vm05.stdout:2/625: symlink d0/d8/d43/dc9/lcc 0 2026-03-10T07:50:56.036 INFO:tasks.workunit.client.0.vm05.stdout:1/578: read da/d26/f92 [2578129,82607] 0 2026-03-10T07:50:56.040 INFO:tasks.workunit.client.0.vm05.stdout:0/541: dwrite d8/dd/d10/f19 [0,4194304] 0 2026-03-10T07:50:56.044 INFO:tasks.workunit.client.0.vm05.stdout:0/542: mknod d8/dd/d10/d26/d8b/d86/cba 0 2026-03-10T07:50:56.060 INFO:tasks.workunit.client.0.vm05.stdout:2/626: creat d0/d8/fcd x:0 0 0 2026-03-10T07:50:56.062 INFO:tasks.workunit.client.0.vm05.stdout:0/543: mknod d8/d9c/cbb 0 2026-03-10T07:50:56.063 INFO:tasks.workunit.client.0.vm05.stdout:0/544: write d8/f1c [5009020,36245] 0 2026-03-10T07:50:56.067 INFO:tasks.workunit.client.0.vm05.stdout:0/545: creat d8/dd/fbc x:0 0 0 2026-03-10T07:50:56.069 INFO:tasks.workunit.client.0.vm05.stdout:0/546: mknod d8/dd/d10/d26/d3a/d5e/cbd 0 2026-03-10T07:50:56.080 INFO:tasks.workunit.client.0.vm05.stdout:0/547: getdents d8/dd/d10/d26/d3a 0 2026-03-10T07:50:56.081 
INFO:tasks.workunit.client.0.vm05.stdout:0/548: write d8/dd/d10/d26/d3a/d5e/fa6 [267743,26291] 0 2026-03-10T07:50:56.086 INFO:tasks.workunit.client.0.vm05.stdout:0/549: read d8/dd/d10/d26/d8b/da4/f3e [1414612,77425] 0 2026-03-10T07:50:56.092 INFO:tasks.workunit.client.0.vm05.stdout:0/550: creat d8/dd/d10/d26/d8b/d70/fbe x:0 0 0 2026-03-10T07:50:56.092 INFO:tasks.workunit.client.0.vm05.stdout:0/551: dread - d8/dd/d10/d26/d48/fb0 zero size 2026-03-10T07:50:56.096 INFO:tasks.workunit.client.0.vm05.stdout:2/627: dread d0/d8/d43/df/d4d/f93 [0,4194304] 0 2026-03-10T07:50:56.100 INFO:tasks.workunit.client.0.vm05.stdout:5/557: dwrite d2/d5/f18 [0,4194304] 0 2026-03-10T07:50:56.102 INFO:tasks.workunit.client.0.vm05.stdout:9/528: mknod d8/d35/d38/d71/cad 0 2026-03-10T07:50:56.111 INFO:tasks.workunit.client.0.vm05.stdout:1/579: dread da/dd/d12/f18 [0,4194304] 0 2026-03-10T07:50:56.124 INFO:tasks.workunit.client.0.vm05.stdout:5/558: creat d2/d20/d33/d86/d8d/da1/fbf x:0 0 0 2026-03-10T07:50:56.125 INFO:tasks.workunit.client.0.vm05.stdout:5/559: chown d2/d5/f23 0 1 2026-03-10T07:50:56.125 INFO:tasks.workunit.client.0.vm05.stdout:5/560: write d2/d4b/f73 [1594700,8374] 0 2026-03-10T07:50:56.139 INFO:tasks.workunit.client.0.vm05.stdout:4/621: dread d0/d6/d9/d12/d9c/db7/da7/f4a [0,4194304] 0 2026-03-10T07:50:56.140 INFO:tasks.workunit.client.0.vm05.stdout:4/622: readlink d0/d6/d9/d12/d69/lcc 0 2026-03-10T07:50:56.140 INFO:tasks.workunit.client.0.vm05.stdout:4/623: dread - d0/d6/d9/d12/d45/d55/d44/fbf zero size 2026-03-10T07:50:56.148 INFO:tasks.workunit.client.0.vm05.stdout:5/561: mkdir d2/d20/d33/d86/d8d/da1/dc0 0 2026-03-10T07:50:56.153 INFO:tasks.workunit.client.0.vm05.stdout:0/552: link d8/dd/d37/l9d d8/dd/d37/d81/lbf 0 2026-03-10T07:50:56.156 INFO:tasks.workunit.client.0.vm05.stdout:0/553: dread d8/dd/d10/d26/d2a/f2e [0,4194304] 0 2026-03-10T07:50:56.157 INFO:tasks.workunit.client.0.vm05.stdout:6/558: write d0/fa [1883289,22462] 0 2026-03-10T07:50:56.167 
INFO:tasks.workunit.client.0.vm05.stdout:7/565: write d1/d6/f2e [3454710,61329] 0 2026-03-10T07:50:56.169 INFO:tasks.workunit.client.0.vm05.stdout:1/580: mkdir da/d26/d2b/daf 0 2026-03-10T07:50:56.170 INFO:tasks.workunit.client.0.vm05.stdout:2/628: link d0/d8/d43/d38/c6b d0/d8/d43/df/cce 0 2026-03-10T07:50:56.176 INFO:tasks.workunit.client.0.vm05.stdout:4/624: write d0/d6/d9/f8f [330882,126170] 0 2026-03-10T07:50:56.181 INFO:tasks.workunit.client.0.vm05.stdout:1/581: dread da/dd/d12/d34/f38 [0,4194304] 0 2026-03-10T07:50:56.191 INFO:tasks.workunit.client.0.vm05.stdout:0/554: symlink d8/dd/d10/d26/d3a/lc0 0 2026-03-10T07:50:56.192 INFO:tasks.workunit.client.0.vm05.stdout:0/555: read - d8/f4a zero size 2026-03-10T07:50:56.194 INFO:tasks.workunit.client.0.vm05.stdout:3/539: write d8/d1f/d2a/d34/f3f [5017840,78233] 0 2026-03-10T07:50:56.197 INFO:tasks.workunit.client.0.vm05.stdout:8/510: write d1/dd/d18/f21 [987322,67419] 0 2026-03-10T07:50:56.213 INFO:tasks.workunit.client.0.vm05.stdout:7/566: chown d1/d3c/d4b/f4f 2655 1 2026-03-10T07:50:56.225 INFO:tasks.workunit.client.0.vm05.stdout:4/625: dread d0/d6/d9/d12/d4f/f5b [0,4194304] 0 2026-03-10T07:50:56.235 INFO:tasks.workunit.client.0.vm05.stdout:0/556: mknod d8/dd/d10/d26/d2a/d6f/cc1 0 2026-03-10T07:50:56.235 INFO:tasks.workunit.client.0.vm05.stdout:0/557: chown d8/dd/d37 0 1 2026-03-10T07:50:56.236 INFO:tasks.workunit.client.0.vm05.stdout:3/540: fdatasync d8/d16/d19/d37/f43 0 2026-03-10T07:50:56.240 INFO:tasks.workunit.client.0.vm05.stdout:2/629: dread d0/f4 [0,4194304] 0 2026-03-10T07:50:56.246 INFO:tasks.workunit.client.0.vm05.stdout:3/541: dread d8/d1f/f49 [0,4194304] 0 2026-03-10T07:50:56.254 INFO:tasks.workunit.client.0.vm05.stdout:8/511: creat d1/d23/fa3 x:0 0 0 2026-03-10T07:50:56.263 INFO:tasks.workunit.client.0.vm05.stdout:1/582: write da/dd/d12/f18 [3770470,123315] 0 2026-03-10T07:50:56.267 INFO:tasks.workunit.client.0.vm05.stdout:5/562: write d2/d20/d77/f7f [4845507,85991] 0 2026-03-10T07:50:56.270 
INFO:tasks.workunit.client.0.vm05.stdout:6/559: mknod d0/d11/cad 0 2026-03-10T07:50:56.277 INFO:tasks.workunit.client.0.vm05.stdout:5/563: dwrite d2/d12/d2d/d4a/f99 [0,4194304] 0 2026-03-10T07:50:56.278 INFO:tasks.workunit.client.0.vm05.stdout:6/560: truncate d0/d11/d31/f63 882949 0 2026-03-10T07:50:56.278 INFO:tasks.workunit.client.0.vm05.stdout:5/564: stat d2/d20/d33/d53/d7d/f7e 0 2026-03-10T07:50:56.279 INFO:tasks.workunit.client.0.vm05.stdout:5/565: chown d2/d20/d33/d53/d7d/f9b 6885372 1 2026-03-10T07:50:56.280 INFO:tasks.workunit.client.0.vm05.stdout:1/583: dwrite da/dd/d12/f18 [0,4194304] 0 2026-03-10T07:50:56.281 INFO:tasks.workunit.client.0.vm05.stdout:7/567: unlink d1/d6/d80/d82/fa0 0 2026-03-10T07:50:56.311 INFO:tasks.workunit.client.0.vm05.stdout:9/529: dwrite d8/f9 [4194304,4194304] 0 2026-03-10T07:50:56.316 INFO:tasks.workunit.client.0.vm05.stdout:0/558: dwrite d8/faf [8388608,4194304] 0 2026-03-10T07:50:56.317 INFO:tasks.workunit.client.0.vm05.stdout:0/559: write d8/dd/d10/d26/fad [196165,28729] 0 2026-03-10T07:50:56.344 INFO:tasks.workunit.client.0.vm05.stdout:2/630: rename d0/d8/d43/df/d4e/l6e to d0/d7e/db4/d49/dab/lcf 0 2026-03-10T07:50:56.360 INFO:tasks.workunit.client.0.vm05.stdout:6/561: fdatasync d0/d6/d3b/f55 0 2026-03-10T07:50:56.360 INFO:tasks.workunit.client.0.vm05.stdout:6/562: stat d0/d6/d3b 0 2026-03-10T07:50:56.364 INFO:tasks.workunit.client.0.vm05.stdout:5/566: fdatasync d2/d20/d33/d86/fbb 0 2026-03-10T07:50:56.365 INFO:tasks.workunit.client.0.vm05.stdout:9/530: dread d8/d35/d22/d33/f73 [0,4194304] 0 2026-03-10T07:50:56.365 INFO:tasks.workunit.client.0.vm05.stdout:9/531: truncate d8/d35/d22/d33/fac 391600 0 2026-03-10T07:50:56.372 INFO:tasks.workunit.client.0.vm05.stdout:7/568: unlink d1/d6/d80/d82/c9e 0 2026-03-10T07:50:56.372 INFO:tasks.workunit.client.0.vm05.stdout:2/631: dread - d0/f89 zero size 2026-03-10T07:50:56.381 INFO:tasks.workunit.client.0.vm05.stdout:6/563: unlink d0/d35/l40 0 2026-03-10T07:50:56.381 
INFO:tasks.workunit.client.0.vm05.stdout:5/567: readlink d2/d20/l43 0 2026-03-10T07:50:56.383 INFO:tasks.workunit.client.0.vm05.stdout:4/626: rmdir d0/d6/d9/d12/d65/dcf 0 2026-03-10T07:50:56.391 INFO:tasks.workunit.client.0.vm05.stdout:7/569: dread d1/d34/f4d [0,4194304] 0 2026-03-10T07:50:56.392 INFO:tasks.workunit.client.0.vm05.stdout:5/568: dwrite d2/d5/f1e [0,4194304] 0 2026-03-10T07:50:56.403 INFO:tasks.workunit.client.0.vm05.stdout:8/512: rmdir d1/dd/d4d/d64/da0 0 2026-03-10T07:50:56.426 INFO:tasks.workunit.client.0.vm05.stdout:1/584: creat da/d26/d2b/fb0 x:0 0 0 2026-03-10T07:50:56.426 INFO:tasks.workunit.client.0.vm05.stdout:1/585: dread - da/dd/d2a/f75 zero size 2026-03-10T07:50:56.432 INFO:tasks.workunit.client.0.vm05.stdout:7/570: truncate d1/f46 271446 0 2026-03-10T07:50:56.439 INFO:tasks.workunit.client.0.vm05.stdout:9/532: link d8/d35/d22/d33/d62/f7d d8/d35/d1c/d20/d59/fae 0 2026-03-10T07:50:56.440 INFO:tasks.workunit.client.0.vm05.stdout:3/542: truncate d8/d22/d60/f61 88935 0 2026-03-10T07:50:56.456 INFO:tasks.workunit.client.0.vm05.stdout:6/564: symlink d0/d11/d4f/d56/d96/lae 0 2026-03-10T07:50:56.456 INFO:tasks.workunit.client.0.vm05.stdout:0/560: getdents d8/dd/d10/d26/d8b 0 2026-03-10T07:50:56.458 INFO:tasks.workunit.client.0.vm05.stdout:6/565: write d0/d6/d3b/f8f [1213381,124752] 0 2026-03-10T07:50:56.466 INFO:tasks.workunit.client.0.vm05.stdout:4/627: dwrite d0/d6/d9/d12/d45/d55/d44/f7e [0,4194304] 0 2026-03-10T07:50:56.469 INFO:tasks.workunit.client.0.vm05.stdout:2/632: dwrite d0/d8/f65 [0,4194304] 0 2026-03-10T07:50:56.471 INFO:tasks.workunit.client.0.vm05.stdout:7/571: creat d1/d3c/d71/d79/d8a/fad x:0 0 0 2026-03-10T07:50:56.477 INFO:tasks.workunit.client.0.vm05.stdout:7/572: readlink d1/d3c/l62 0 2026-03-10T07:50:56.487 INFO:tasks.workunit.client.0.vm05.stdout:9/533: creat d8/d35/d22/d33/d62/d6d/faf x:0 0 0 2026-03-10T07:50:56.487 INFO:tasks.workunit.client.0.vm05.stdout:3/543: mknod d8/d47/cb6 0 2026-03-10T07:50:56.487 
INFO:tasks.workunit.client.0.vm05.stdout:5/569: mkdir d2/d20/d33/d86/dac/dc1 0 2026-03-10T07:50:56.489 INFO:tasks.workunit.client.0.vm05.stdout:9/534: write d8/d86/d28/d79/f91 [909126,24759] 0 2026-03-10T07:50:56.495 INFO:tasks.workunit.client.0.vm05.stdout:8/513: dwrite d1/dd/d18/f47 [0,4194304] 0 2026-03-10T07:50:56.502 INFO:tasks.workunit.client.0.vm05.stdout:9/535: dwrite d8/d35/d1c/d20/d54/f90 [0,4194304] 0 2026-03-10T07:50:56.506 INFO:tasks.workunit.client.0.vm05.stdout:9/536: dread - d8/d35/d1c/d20/fa7 zero size 2026-03-10T07:50:56.513 INFO:tasks.workunit.client.0.vm05.stdout:2/633: mknod d0/d2a/cd0 0 2026-03-10T07:50:56.513 INFO:tasks.workunit.client.0.vm05.stdout:3/544: symlink d8/d1f/d24/lb7 0 2026-03-10T07:50:56.513 INFO:tasks.workunit.client.0.vm05.stdout:4/628: creat d0/d6/d9/d12/d45/d55/d44/d85/fd4 x:0 0 0 2026-03-10T07:50:56.516 INFO:tasks.workunit.client.0.vm05.stdout:2/634: stat d0/d8/fcd 0 2026-03-10T07:50:56.518 INFO:tasks.workunit.client.0.vm05.stdout:1/586: link da/f5d da/d26/d2b/d89/fb1 0 2026-03-10T07:50:56.519 INFO:tasks.workunit.client.0.vm05.stdout:2/635: dread - d0/d7e/db4/d49/fc8 zero size 2026-03-10T07:50:56.526 INFO:tasks.workunit.client.0.vm05.stdout:8/514: rename d1/dd/d18/d20/d2a/d48/d7c/da1 to d1/dd/d18/d20/d2a/d48/d7c/d9c/da4 0 2026-03-10T07:50:56.532 INFO:tasks.workunit.client.0.vm05.stdout:5/570: mkdir d2/d20/d33/d86/d8d/da1/dc0/dc2 0 2026-03-10T07:50:56.534 INFO:tasks.workunit.client.0.vm05.stdout:6/566: creat d0/d11/d57/faf x:0 0 0 2026-03-10T07:50:56.546 INFO:tasks.workunit.client.0.vm05.stdout:6/567: readlink d0/d11/d22/d69/l88 0 2026-03-10T07:50:56.546 INFO:tasks.workunit.client.0.vm05.stdout:4/629: mknod d0/d6/d9/d12/d9c/db7/da7/cd5 0 2026-03-10T07:50:56.546 INFO:tasks.workunit.client.0.vm05.stdout:9/537: creat d8/d35/d6b/fb0 x:0 0 0 2026-03-10T07:50:56.554 INFO:tasks.workunit.client.0.vm05.stdout:0/561: getdents d8/dd/d37 0 2026-03-10T07:50:56.560 INFO:tasks.workunit.client.0.vm05.stdout:5/571: creat d2/d4b/fc3 x:0 0 0 
2026-03-10T07:50:56.560 INFO:tasks.workunit.client.0.vm05.stdout:5/572: chown d2/d20/d33/d86/dac 403247640 1 2026-03-10T07:50:56.580 INFO:tasks.workunit.client.0.vm05.stdout:9/538: rename d8/d86/d28/f84 to d8/d35/d22/fb1 0 2026-03-10T07:50:56.582 INFO:tasks.workunit.client.0.vm05.stdout:7/573: getdents d1/d6/d3b/d7f 0 2026-03-10T07:50:56.589 INFO:tasks.workunit.client.0.vm05.stdout:0/562: mknod d8/dd/d37/d56/d4d/cc2 0 2026-03-10T07:50:56.597 INFO:tasks.workunit.client.0.vm05.stdout:6/568: mknod d0/d11/d57/cb0 0 2026-03-10T07:50:56.602 INFO:tasks.workunit.client.0.vm05.stdout:9/539: mknod d8/d35/d1c/d20/cb2 0 2026-03-10T07:50:56.603 INFO:tasks.workunit.client.0.vm05.stdout:0/563: truncate d8/dd/d10/d26/d8b/d70/f80 597732 0 2026-03-10T07:50:56.604 INFO:tasks.workunit.client.0.vm05.stdout:1/587: getdents da 0 2026-03-10T07:50:56.609 INFO:tasks.workunit.client.0.vm05.stdout:5/573: sync 2026-03-10T07:50:56.612 INFO:tasks.workunit.client.0.vm05.stdout:5/574: dread d2/d20/f57 [0,4194304] 0 2026-03-10T07:50:56.612 INFO:tasks.workunit.client.0.vm05.stdout:5/575: chown d2/d20/d5b 11308 1 2026-03-10T07:50:56.618 INFO:tasks.workunit.client.0.vm05.stdout:4/630: creat d0/d6/d9/d12/fd6 x:0 0 0 2026-03-10T07:50:56.620 INFO:tasks.workunit.client.0.vm05.stdout:9/540: chown d8/d35/l18 1075 1 2026-03-10T07:50:56.620 INFO:tasks.workunit.client.0.vm05.stdout:9/541: chown d8/d35/d22/d33/d62/f6c 58205 1 2026-03-10T07:50:56.621 INFO:tasks.workunit.client.0.vm05.stdout:9/542: fdatasync d8/d35/d1c/d20/f9f 0 2026-03-10T07:50:56.632 INFO:tasks.workunit.client.0.vm05.stdout:3/545: write d8/d16/d19/d37/f43 [144078,78809] 0 2026-03-10T07:50:56.648 INFO:tasks.workunit.client.0.vm05.stdout:5/576: fsync d2/d5/f10 0 2026-03-10T07:50:56.652 INFO:tasks.workunit.client.0.vm05.stdout:8/515: dwrite d1/dd/d4d/f60 [0,4194304] 0 2026-03-10T07:50:56.652 INFO:tasks.workunit.client.0.vm05.stdout:2/636: truncate d0/d8/d43/df/d4d/f93 2040236 0 2026-03-10T07:50:56.657 
INFO:tasks.workunit.client.0.vm05.stdout:5/577: fsync d2/fad 0 2026-03-10T07:50:56.657 INFO:tasks.workunit.client.0.vm05.stdout:8/516: stat d1/d6f/f7d 0 2026-03-10T07:50:56.673 INFO:tasks.workunit.client.0.vm05.stdout:6/569: write d0/d11/d2e/f30 [1350886,106375] 0 2026-03-10T07:50:56.677 INFO:tasks.workunit.client.0.vm05.stdout:7/574: link d1/d6/l9c d1/d6/d47/lae 0 2026-03-10T07:50:56.677 INFO:tasks.workunit.client.0.vm05.stdout:6/570: write d0/d11/d4f/d6e/faa [1028765,56229] 0 2026-03-10T07:50:56.679 INFO:tasks.workunit.client.0.vm05.stdout:9/543: dwrite d8/d35/d22/f98 [0,4194304] 0 2026-03-10T07:50:56.680 INFO:tasks.workunit.client.0.vm05.stdout:0/564: write d8/dd/f3c [3643211,29850] 0 2026-03-10T07:50:56.685 INFO:tasks.workunit.client.0.vm05.stdout:3/546: unlink d8/d1c/d64/f72 0 2026-03-10T07:50:56.689 INFO:tasks.workunit.client.0.vm05.stdout:1/588: truncate da/dd/d12/f31 1382632 0 2026-03-10T07:50:56.700 INFO:tasks.workunit.client.0.vm05.stdout:2/637: truncate d0/d8/d43/df/f18 3777087 0 2026-03-10T07:50:56.704 INFO:tasks.workunit.client.0.vm05.stdout:4/631: dwrite d0/d6/d9/d5a/f2f [0,4194304] 0 2026-03-10T07:50:56.705 INFO:tasks.workunit.client.0.vm05.stdout:4/632: dread - d0/d6/d9/d12/fd6 zero size 2026-03-10T07:50:56.717 INFO:tasks.workunit.client.0.vm05.stdout:8/517: rmdir d1/d45 39 2026-03-10T07:50:56.720 INFO:tasks.workunit.client.0.vm05.stdout:8/518: write d1/dd/d18/f47 [3607056,93912] 0 2026-03-10T07:50:56.722 INFO:tasks.workunit.client.0.vm05.stdout:4/633: sync 2026-03-10T07:50:56.725 INFO:tasks.workunit.client.0.vm05.stdout:4/634: read - d0/d6/d9/d12/d45/d55/d44/d85/fd4 zero size 2026-03-10T07:50:56.726 INFO:tasks.workunit.client.0.vm05.stdout:4/635: write d0/f24 [7124272,108877] 0 2026-03-10T07:50:56.735 INFO:tasks.workunit.client.0.vm05.stdout:7/575: creat d1/d6/d47/d8d/faf x:0 0 0 2026-03-10T07:50:56.736 INFO:tasks.workunit.client.0.vm05.stdout:7/576: chown d1/f49 94 1 2026-03-10T07:50:56.740 INFO:tasks.workunit.client.0.vm05.stdout:4/636: sync 
2026-03-10T07:50:56.743 INFO:tasks.workunit.client.0.vm05.stdout:6/571: mknod d0/d11/d22/d6c/d84/cb1 0 2026-03-10T07:50:56.746 INFO:tasks.workunit.client.0.vm05.stdout:0/565: unlink d8/dd/d37/fa9 0 2026-03-10T07:50:56.761 INFO:tasks.workunit.client.0.vm05.stdout:3/547: fdatasync d8/d1f/f2f 0 2026-03-10T07:50:56.780 INFO:tasks.workunit.client.0.vm05.stdout:1/589: rename da/dd/d12/d19/d20/f6c to da/fb2 0 2026-03-10T07:50:56.803 INFO:tasks.workunit.client.0.vm05.stdout:4/637: rmdir d0/d6/d9/d12/d45/d55/d4e 39 2026-03-10T07:50:56.804 INFO:tasks.workunit.client.0.vm05.stdout:6/572: creat d0/d35/fb2 x:0 0 0 2026-03-10T07:50:56.818 INFO:tasks.workunit.client.0.vm05.stdout:3/548: rename d8/d16/d19/l62 to d8/d16/d19/lb8 0 2026-03-10T07:50:56.831 INFO:tasks.workunit.client.0.vm05.stdout:2/638: rename d0/d7e/db4 to d0/d8/d66/dd1 0 2026-03-10T07:50:56.844 INFO:tasks.workunit.client.0.vm05.stdout:7/577: truncate d1/d6/d47/f65 2658962 0 2026-03-10T07:50:56.844 INFO:tasks.workunit.client.0.vm05.stdout:4/638: rmdir d0/d6/d9/d5a/d6e/db6/db9 39 2026-03-10T07:50:56.846 INFO:tasks.workunit.client.0.vm05.stdout:4/639: readlink d0/d6/d9/l59 0 2026-03-10T07:50:56.849 INFO:tasks.workunit.client.0.vm05.stdout:9/544: link d8/d35/d1c/d20/d59/d8b/l58 d8/d35/d22/d33/d62/d6d/d9e/lb3 0 2026-03-10T07:50:56.855 INFO:tasks.workunit.client.0.vm05.stdout:8/519: truncate d1/dd/d18/f3f 2026735 0 2026-03-10T07:50:56.855 INFO:tasks.workunit.client.0.vm05.stdout:6/573: dread d0/d11/d2e/f7f [0,4194304] 0 2026-03-10T07:50:56.857 INFO:tasks.workunit.client.0.vm05.stdout:8/520: read - d1/dd/d4d/d64/f86 zero size 2026-03-10T07:50:56.857 INFO:tasks.workunit.client.0.vm05.stdout:1/590: read da/dd/d12/d34/f38 [817576,74354] 0 2026-03-10T07:50:56.857 INFO:tasks.workunit.client.0.vm05.stdout:6/574: write d0/d35/d36/f59 [277006,20994] 0 2026-03-10T07:50:56.858 INFO:tasks.workunit.client.0.vm05.stdout:1/591: dread - da/dd/d2a/d70/f9c zero size 2026-03-10T07:50:56.858 INFO:tasks.workunit.client.0.vm05.stdout:8/521: 
write d1/dd/d4d/d64/f67 [3092598,88419] 0 2026-03-10T07:50:56.862 INFO:tasks.workunit.client.0.vm05.stdout:5/578: getdents d2/d12/d2d/d4a 0 2026-03-10T07:50:56.866 INFO:tasks.workunit.client.0.vm05.stdout:2/639: creat d0/d8/d3d/d7d/da5/fd2 x:0 0 0 2026-03-10T07:50:56.867 INFO:tasks.workunit.client.0.vm05.stdout:0/566: truncate d8/dd/d10/d26/d8b/da4/f5b 187713 0 2026-03-10T07:50:56.867 INFO:tasks.workunit.client.0.vm05.stdout:1/592: dwrite da/dd/d42/d80/f94 [0,4194304] 0 2026-03-10T07:50:56.889 INFO:tasks.workunit.client.0.vm05.stdout:2/640: creat d0/d8/d66/dd1/d49/fd3 x:0 0 0 2026-03-10T07:50:56.890 INFO:tasks.workunit.client.0.vm05.stdout:1/593: write da/d26/d2b/d71/f98 [1113350,78143] 0 2026-03-10T07:50:56.894 INFO:tasks.workunit.client.0.vm05.stdout:9/545: mkdir d8/d35/d22/dab/db4 0 2026-03-10T07:50:56.896 INFO:tasks.workunit.client.0.vm05.stdout:8/522: dread d1/dd/d18/d20/d2a/f3a [0,4194304] 0 2026-03-10T07:50:56.896 INFO:tasks.workunit.client.0.vm05.stdout:6/575: mkdir d0/d11/d57/da4/db3 0 2026-03-10T07:50:56.897 INFO:tasks.workunit.client.0.vm05.stdout:6/576: chown d0/d6/d3b/f8f 23621763 1 2026-03-10T07:50:56.903 INFO:tasks.workunit.client.0.vm05.stdout:3/549: write d8/f3b [1067214,115738] 0 2026-03-10T07:50:56.906 INFO:tasks.workunit.client.0.vm05.stdout:0/567: mkdir d8/dd/d10/db7/dc3 0 2026-03-10T07:50:56.906 INFO:tasks.workunit.client.0.vm05.stdout:2/641: mknod d0/d8/d3d/d7d/da5/cd4 0 2026-03-10T07:50:56.906 INFO:tasks.workunit.client.0.vm05.stdout:7/578: creat d1/d34/fb0 x:0 0 0 2026-03-10T07:50:56.907 INFO:tasks.workunit.client.0.vm05.stdout:7/579: chown d1/d34/f4d 0 1 2026-03-10T07:50:56.907 INFO:tasks.workunit.client.0.vm05.stdout:2/642: read - d0/d8/d3d/d7d/db2/fba zero size 2026-03-10T07:50:56.914 INFO:tasks.workunit.client.0.vm05.stdout:2/643: chown d0/d8/d3d/d7d/db2/f95 605 1 2026-03-10T07:50:56.924 INFO:tasks.workunit.client.0.vm05.stdout:4/640: getdents d0/d6/d9/d12 0 2026-03-10T07:50:56.924 INFO:tasks.workunit.client.0.vm05.stdout:4/641: write 
d0/d6/d9/d12/d9c/db7/fb2 [683261,3973] 0 2026-03-10T07:50:56.926 INFO:tasks.workunit.client.0.vm05.stdout:0/568: creat d8/dd/d10/d26/d8b/da4/fc4 x:0 0 0 2026-03-10T07:50:56.931 INFO:tasks.workunit.client.0.vm05.stdout:0/569: dwrite d8/dd/d10/d26/d48/fb0 [0,4194304] 0 2026-03-10T07:50:56.937 INFO:tasks.workunit.client.0.vm05.stdout:7/580: creat d1/d3c/d71/fb1 x:0 0 0 2026-03-10T07:50:56.938 INFO:tasks.workunit.client.0.vm05.stdout:7/581: readlink d1/d34/l35 0 2026-03-10T07:50:56.953 INFO:tasks.workunit.client.0.vm05.stdout:7/582: read d1/d34/d59/f78 [2276206,121824] 0 2026-03-10T07:50:56.954 INFO:tasks.workunit.client.0.vm05.stdout:7/583: write d1/d6/d80/d82/fa8 [397943,130815] 0 2026-03-10T07:50:56.957 INFO:tasks.workunit.client.0.vm05.stdout:4/642: dread d0/d6/d9/d12/d9c/db7/fb2 [0,4194304] 0 2026-03-10T07:50:56.965 INFO:tasks.workunit.client.0.vm05.stdout:2/644: truncate d0/f89 27959 0 2026-03-10T07:50:56.972 INFO:tasks.workunit.client.0.vm05.stdout:2/645: dwrite d0/d8/fc3 [0,4194304] 0 2026-03-10T07:50:56.974 INFO:tasks.workunit.client.0.vm05.stdout:6/577: mknod d0/d11/d57/cb4 0 2026-03-10T07:50:56.975 INFO:tasks.workunit.client.0.vm05.stdout:6/578: fsync d0/d11/d31/f63 0 2026-03-10T07:50:56.984 INFO:tasks.workunit.client.0.vm05.stdout:5/579: getdents d2/d20/d5b 0 2026-03-10T07:50:56.997 INFO:tasks.workunit.client.0.vm05.stdout:9/546: truncate d8/f14 4089975 0 2026-03-10T07:50:56.998 INFO:tasks.workunit.client.0.vm05.stdout:7/584: mknod d1/d6/d80/d82/cb2 0 2026-03-10T07:50:57.015 INFO:tasks.workunit.client.0.vm05.stdout:7/585: dread d1/d6/f2e [0,4194304] 0 2026-03-10T07:50:57.015 INFO:tasks.workunit.client.0.vm05.stdout:7/586: stat d1/d34/fb0 0 2026-03-10T07:50:57.017 INFO:tasks.workunit.client.0.vm05.stdout:6/579: mknod d0/d11/d22/d6c/d84/cb5 0 2026-03-10T07:50:57.018 INFO:tasks.workunit.client.0.vm05.stdout:7/587: truncate d1/d6/d3b/fa1 256951 0 2026-03-10T07:50:57.021 INFO:tasks.workunit.client.0.vm05.stdout:5/580: creat d2/d20/d4c/fc4 x:0 0 0 
2026-03-10T07:50:57.021 INFO:tasks.workunit.client.0.vm05.stdout:3/550: creat d8/d22/fb9 x:0 0 0 2026-03-10T07:50:57.022 INFO:tasks.workunit.client.0.vm05.stdout:5/581: chown d2/d20/d33/c44 1864736 1 2026-03-10T07:50:57.024 INFO:tasks.workunit.client.0.vm05.stdout:5/582: stat d2/d12/d2d/cb7 0 2026-03-10T07:50:57.029 INFO:tasks.workunit.client.0.vm05.stdout:1/594: rename da/dd/c61 to da/dd/d12/cb3 0 2026-03-10T07:50:57.030 INFO:tasks.workunit.client.0.vm05.stdout:1/595: stat da/dd/d2a/f54 0 2026-03-10T07:50:57.038 INFO:tasks.workunit.client.0.vm05.stdout:8/523: getdents d1 0 2026-03-10T07:50:57.039 INFO:tasks.workunit.client.0.vm05.stdout:4/643: symlink d0/d6/d9/ld7 0 2026-03-10T07:50:57.040 INFO:tasks.workunit.client.0.vm05.stdout:0/570: symlink d8/dd/lc5 0 2026-03-10T07:50:57.048 INFO:tasks.workunit.client.0.vm05.stdout:0/571: write d8/dd/d10/d26/d8b/da4/fc4 [916113,90836] 0 2026-03-10T07:50:57.058 INFO:tasks.workunit.client.0.vm05.stdout:6/580: truncate d0/d6/f16 2408110 0 2026-03-10T07:50:57.062 INFO:tasks.workunit.client.0.vm05.stdout:5/583: creat d2/d5/d61/fc5 x:0 0 0 2026-03-10T07:50:57.073 INFO:tasks.workunit.client.0.vm05.stdout:7/588: dread d1/d6/f32 [4194304,4194304] 0 2026-03-10T07:50:57.074 INFO:tasks.workunit.client.0.vm05.stdout:0/572: sync 2026-03-10T07:50:57.085 INFO:tasks.workunit.client.0.vm05.stdout:2/646: truncate d0/d8/d43/df/d8b/f99 3533264 0 2026-03-10T07:50:57.089 INFO:tasks.workunit.client.0.vm05.stdout:3/551: write d8/d1f/d2a/d4a/d7d/f7e [1222996,29318] 0 2026-03-10T07:50:57.122 INFO:tasks.workunit.client.0.vm05.stdout:4/644: rename d0/d6/d9/d12/d9c/db7/da7/d5c/c87 to d0/d6/d9/d12/d69/cd8 0 2026-03-10T07:50:57.126 INFO:tasks.workunit.client.0.vm05.stdout:4/645: dwrite d0/d6/d9/d5a/d91/fc9 [0,4194304] 0 2026-03-10T07:50:57.136 INFO:tasks.workunit.client.0.vm05.stdout:4/646: sync 2026-03-10T07:50:57.137 INFO:tasks.workunit.client.0.vm05.stdout:5/584: creat d2/d4b/fc6 x:0 0 0 2026-03-10T07:50:57.137 
INFO:tasks.workunit.client.0.vm05.stdout:4/647: chown d0/d6/d6f/lb3 24055 1 2026-03-10T07:50:57.139 INFO:tasks.workunit.client.0.vm05.stdout:1/596: mknod da/dd/cb4 0 2026-03-10T07:50:57.143 INFO:tasks.workunit.client.0.vm05.stdout:7/589: creat d1/d3c/d4b/fb3 x:0 0 0 2026-03-10T07:50:57.152 INFO:tasks.workunit.client.0.vm05.stdout:7/590: write d1/d3c/d71/fab [759409,24879] 0 2026-03-10T07:50:57.153 INFO:tasks.workunit.client.0.vm05.stdout:0/573: mknod d8/dd/d10/d26/d8b/d86/cc6 0 2026-03-10T07:50:57.153 INFO:tasks.workunit.client.0.vm05.stdout:3/552: dread - d8/d1f/d2a/d96/f7f zero size 2026-03-10T07:50:57.161 INFO:tasks.workunit.client.0.vm05.stdout:6/581: rename d0/d11/d4f/d6e to d0/d11/d4f/d56/d96/db6 0 2026-03-10T07:50:57.161 INFO:tasks.workunit.client.0.vm05.stdout:6/582: stat d0/d11/d22/d6c/fa5 0 2026-03-10T07:50:57.164 INFO:tasks.workunit.client.0.vm05.stdout:2/647: dread d0/d8/f3b [0,4194304] 0 2026-03-10T07:50:57.164 INFO:tasks.workunit.client.0.vm05.stdout:2/648: dread - d0/d8/d3d/d7d/db2/fba zero size 2026-03-10T07:50:57.167 INFO:tasks.workunit.client.0.vm05.stdout:6/583: dwrite d0/d6/d3b/f8f [0,4194304] 0 2026-03-10T07:50:57.168 INFO:tasks.workunit.client.0.vm05.stdout:6/584: stat d0/d6/d3b/l65 0 2026-03-10T07:50:57.182 INFO:tasks.workunit.client.0.vm05.stdout:4/648: dread d0/d6/d9/d12/d9c/db7/fb2 [0,4194304] 0 2026-03-10T07:50:57.186 INFO:tasks.workunit.client.0.vm05.stdout:1/597: dwrite da/dd/d2a/d70/f83 [0,4194304] 0 2026-03-10T07:50:57.197 INFO:tasks.workunit.client.0.vm05.stdout:0/574: creat d8/dd/d10/d26/d2a/fc7 x:0 0 0 2026-03-10T07:50:57.199 INFO:tasks.workunit.client.0.vm05.stdout:3/553: chown d8/d1f/d2a/f66 0 1 2026-03-10T07:50:57.199 INFO:tasks.workunit.client.0.vm05.stdout:3/554: write d8/d16/f9c [632327,54216] 0 2026-03-10T07:50:57.200 INFO:tasks.workunit.client.0.vm05.stdout:9/547: getdents d8/d35/d1c/d20 0 2026-03-10T07:50:57.201 INFO:tasks.workunit.client.0.vm05.stdout:9/548: chown d8/d35/d1c/f52 47 1 2026-03-10T07:50:57.202 
INFO:tasks.workunit.client.0.vm05.stdout:2/649: mkdir d0/d8/d66/dd1/d49/d81/dd5 0 2026-03-10T07:50:57.204 INFO:tasks.workunit.client.0.vm05.stdout:6/585: mkdir d0/d11/d4f/d7d/db7 0 2026-03-10T07:50:57.206 INFO:tasks.workunit.client.0.vm05.stdout:1/598: creat da/dd/d12/d19/d20/d8f/fb5 x:0 0 0 2026-03-10T07:50:57.220 INFO:tasks.workunit.client.0.vm05.stdout:8/524: getdents d1/dd/d18/d20/d2a/d48 0 2026-03-10T07:50:57.220 INFO:tasks.workunit.client.0.vm05.stdout:0/575: rmdir d8/dd/d10/d26/d3a/d5e/d63 39 2026-03-10T07:50:57.220 INFO:tasks.workunit.client.0.vm05.stdout:3/555: creat d8/d1c/d48/fba x:0 0 0 2026-03-10T07:50:57.220 INFO:tasks.workunit.client.0.vm05.stdout:3/556: write d8/d1c/f56 [6080357,127445] 0 2026-03-10T07:50:57.221 INFO:tasks.workunit.client.0.vm05.stdout:5/585: creat d2/d20/d33/fc7 x:0 0 0 2026-03-10T07:50:57.221 INFO:tasks.workunit.client.0.vm05.stdout:5/586: dwrite d2/d20/d4c/fc4 [0,4194304] 0 2026-03-10T07:50:57.223 INFO:tasks.workunit.client.0.vm05.stdout:4/649: unlink d0/d6/d9/d12/d45/d55/d4e/c6d 0 2026-03-10T07:50:57.224 INFO:tasks.workunit.client.0.vm05.stdout:5/587: write d2/d20/d33/d53/f97 [7992370,70811] 0 2026-03-10T07:50:57.239 INFO:tasks.workunit.client.0.vm05.stdout:1/599: rmdir da/dd/d42/d80 39 2026-03-10T07:50:57.240 INFO:tasks.workunit.client.0.vm05.stdout:8/525: mkdir d1/dd/d18/d20/d2a/d34/da5 0 2026-03-10T07:50:57.244 INFO:tasks.workunit.client.0.vm05.stdout:0/576: mkdir d8/d9c/dc8 0 2026-03-10T07:50:57.244 INFO:tasks.workunit.client.0.vm05.stdout:8/526: dwrite d1/d45/f9d [0,4194304] 0 2026-03-10T07:50:57.252 INFO:tasks.workunit.client.0.vm05.stdout:3/557: mkdir d8/d16/d52/dbb 0 2026-03-10T07:50:57.252 INFO:tasks.workunit.client.0.vm05.stdout:6/586: truncate d0/d11/d4f/d56/f6b 1214068 0 2026-03-10T07:50:57.252 INFO:tasks.workunit.client.0.vm05.stdout:3/558: chown d8/d16 1383 1 2026-03-10T07:50:57.264 INFO:tasks.workunit.client.0.vm05.stdout:8/527: creat d1/dd/d4d/d64/d6a/fa6 x:0 0 0 2026-03-10T07:50:57.265 
INFO:tasks.workunit.client.0.vm05.stdout:3/559: mkdir d8/d8f/dbc 0 2026-03-10T07:50:57.266 INFO:tasks.workunit.client.0.vm05.stdout:3/560: fsync d8/d16/d52/da4/fab 0 2026-03-10T07:50:57.266 INFO:tasks.workunit.client.0.vm05.stdout:3/561: stat d8/d1f/d2a/d4a/d7d 0 2026-03-10T07:50:57.267 INFO:tasks.workunit.client.0.vm05.stdout:4/650: creat d0/d6/d9/d5a/d6e/db6/db9/fd9 x:0 0 0 2026-03-10T07:50:57.275 INFO:tasks.workunit.client.0.vm05.stdout:1/600: dread da/fb2 [0,4194304] 0 2026-03-10T07:50:57.278 INFO:tasks.workunit.client.0.vm05.stdout:9/549: getdents d8/d35/d22/d33/d47 0 2026-03-10T07:50:57.280 INFO:tasks.workunit.client.0.vm05.stdout:6/587: sync 2026-03-10T07:50:57.281 INFO:tasks.workunit.client.0.vm05.stdout:6/588: read d0/d11/d4f/d56/f6f [1415602,85095] 0 2026-03-10T07:50:57.287 INFO:tasks.workunit.client.0.vm05.stdout:3/562: mkdir d8/d1f/d2a/d34/dbd 0 2026-03-10T07:50:57.293 INFO:tasks.workunit.client.0.vm05.stdout:4/651: readlink d0/d6/d6f/la4 0 2026-03-10T07:50:57.293 INFO:tasks.workunit.client.0.vm05.stdout:4/652: fdatasync d0/d6/d9/d5a/f58 0 2026-03-10T07:50:57.295 INFO:tasks.workunit.client.0.vm05.stdout:7/591: write d1/f86 [733164,117662] 0 2026-03-10T07:50:57.296 INFO:tasks.workunit.client.0.vm05.stdout:7/592: dread - d1/d34/d59/f99 zero size 2026-03-10T07:50:57.301 INFO:tasks.workunit.client.0.vm05.stdout:2/650: dwrite d0/d8/d43/f90 [0,4194304] 0 2026-03-10T07:50:57.314 INFO:tasks.workunit.client.0.vm05.stdout:5/588: link d2/d20/d77/l81 d2/d5/d61/lc8 0 2026-03-10T07:50:57.314 INFO:tasks.workunit.client.0.vm05.stdout:5/589: readlink d2/d5/l1d 0 2026-03-10T07:50:57.316 INFO:tasks.workunit.client.0.vm05.stdout:0/577: link d8/dd/f40 d8/dd/d37/fc9 0 2026-03-10T07:50:57.316 INFO:tasks.workunit.client.0.vm05.stdout:5/590: fdatasync d2/d20/d33/d86/fb3 0 2026-03-10T07:50:57.316 INFO:tasks.workunit.client.0.vm05.stdout:0/578: chown d8/dd/d37/d67/f9f 1596100119 1 2026-03-10T07:50:57.317 INFO:tasks.workunit.client.0.vm05.stdout:5/591: stat d2/d20/d33/d86/fb5 0 
2026-03-10T07:50:57.318 INFO:tasks.workunit.client.0.vm05.stdout:5/592: dread - d2/d20/d33/d53/d7d/fb4 zero size 2026-03-10T07:50:57.329 INFO:tasks.workunit.client.0.vm05.stdout:8/528: dwrite d1/dd/d18/f5c [0,4194304] 0 2026-03-10T07:50:57.331 INFO:tasks.workunit.client.0.vm05.stdout:0/579: dwrite d8/dd/d37/d56/f62 [0,4194304] 0 2026-03-10T07:50:57.344 INFO:tasks.workunit.client.0.vm05.stdout:3/563: rmdir d8/d8f 39 2026-03-10T07:50:57.354 INFO:tasks.workunit.client.0.vm05.stdout:4/653: creat d0/d28/fda x:0 0 0 2026-03-10T07:50:57.356 INFO:tasks.workunit.client.0.vm05.stdout:0/580: read d8/dd/d37/d56/d4d/f69 [118494,123134] 0 2026-03-10T07:50:57.358 INFO:tasks.workunit.client.0.vm05.stdout:0/581: chown d8/dd/d37/d67/c61 9034638 1 2026-03-10T07:50:57.360 INFO:tasks.workunit.client.0.vm05.stdout:7/593: creat d1/d6/d47/d8d/fb4 x:0 0 0 2026-03-10T07:50:57.379 INFO:tasks.workunit.client.0.vm05.stdout:5/593: dread d2/d5/f23 [0,4194304] 0 2026-03-10T07:50:57.383 INFO:tasks.workunit.client.0.vm05.stdout:0/582: sync 2026-03-10T07:50:57.387 INFO:tasks.workunit.client.0.vm05.stdout:0/583: sync 2026-03-10T07:50:57.387 INFO:tasks.workunit.client.0.vm05.stdout:0/584: chown d8/dd/d10/d26/d3a/lc0 7291 1 2026-03-10T07:50:57.389 INFO:tasks.workunit.client.0.vm05.stdout:8/529: chown d1/dd/d18/f3f 606015 1 2026-03-10T07:50:57.392 INFO:tasks.workunit.client.0.vm05.stdout:9/550: write d8/d35/d22/d33/d62/f9a [745142,57647] 0 2026-03-10T07:50:57.395 INFO:tasks.workunit.client.0.vm05.stdout:6/589: write d0/d11/d2e/f7f [887368,129100] 0 2026-03-10T07:50:57.397 INFO:tasks.workunit.client.0.vm05.stdout:6/590: truncate d0/f15 5108040 0 2026-03-10T07:50:57.400 INFO:tasks.workunit.client.0.vm05.stdout:3/564: rename d8/d22/d60/f50 to d8/d1c/d48/d69/fbe 0 2026-03-10T07:50:57.403 INFO:tasks.workunit.client.0.vm05.stdout:4/654: creat d0/d6/d9/d12/d69/fdb x:0 0 0 2026-03-10T07:50:57.407 INFO:tasks.workunit.client.0.vm05.stdout:7/594: symlink d1/d34/d59/d60/d8c/lb5 0 2026-03-10T07:50:57.411 
INFO:tasks.workunit.client.0.vm05.stdout:2/651: unlink d0/d8/d3d/d7d/db2/lbe 0 2026-03-10T07:50:57.412 INFO:tasks.workunit.client.0.vm05.stdout:1/601: link da/dd/d12/d19/d20/fa3 da/d26/d2b/daf/fb6 0 2026-03-10T07:50:57.417 INFO:tasks.workunit.client.0.vm05.stdout:8/530: creat d1/d6f/fa7 x:0 0 0 2026-03-10T07:50:57.417 INFO:tasks.workunit.client.0.vm05.stdout:8/531: write d1/dd/f87 [47627,44308] 0 2026-03-10T07:50:57.418 INFO:tasks.workunit.client.0.vm05.stdout:8/532: write d1/dd/d18/d20/d2a/d48/d5a/f98 [117714,28245] 0 2026-03-10T07:50:57.420 INFO:tasks.workunit.client.0.vm05.stdout:8/533: stat d1/dd/d18/d20/d2a/d48/f59 0 2026-03-10T07:50:57.432 INFO:tasks.workunit.client.0.vm05.stdout:0/585: rename d8/dd/d10/d26/d3a/d5e/l6b to d8/dd/d10/d26/d8b/d86/lca 0 2026-03-10T07:50:57.433 INFO:tasks.workunit.client.0.vm05.stdout:0/586: write d8/dd/fa7 [4621034,16829] 0 2026-03-10T07:50:57.434 INFO:tasks.workunit.client.0.vm05.stdout:0/587: truncate d8/dd/d10/d26/d2a/d6f/faa 906128 0 2026-03-10T07:50:57.441 INFO:tasks.workunit.client.0.vm05.stdout:9/551: write d8/d35/d1c/f3b [617234,124539] 0 2026-03-10T07:50:57.444 INFO:tasks.workunit.client.0.vm05.stdout:4/655: mkdir d0/d6/d9/d12/d69/ddc 0 2026-03-10T07:50:57.445 INFO:tasks.workunit.client.0.vm05.stdout:7/595: unlink d1/d3c/c40 0 2026-03-10T07:50:57.449 INFO:tasks.workunit.client.0.vm05.stdout:5/594: creat d2/d20/d33/d86/dac/dc1/fc9 x:0 0 0 2026-03-10T07:50:57.453 INFO:tasks.workunit.client.0.vm05.stdout:4/656: dread d0/d6/d95/f40 [0,4194304] 0 2026-03-10T07:50:57.468 INFO:tasks.workunit.client.0.vm05.stdout:6/591: truncate d0/fa 1482712 0 2026-03-10T07:50:57.470 INFO:tasks.workunit.client.0.vm05.stdout:3/565: mknod d8/d1f/d2a/d34/dbd/cbf 0 2026-03-10T07:50:57.470 INFO:tasks.workunit.client.0.vm05.stdout:6/592: readlink d0/l19 0 2026-03-10T07:50:57.471 INFO:tasks.workunit.client.0.vm05.stdout:3/566: chown d8/d1f/d24/d45/l99 9425 1 2026-03-10T07:50:57.471 INFO:tasks.workunit.client.0.vm05.stdout:7/596: mknod d1/d3c/d71/cb6 0 
2026-03-10T07:50:57.472 INFO:tasks.workunit.client.0.vm05.stdout:2/652: mknod d0/d8/cd6 0 2026-03-10T07:50:57.479 INFO:tasks.workunit.client.0.vm05.stdout:5/595: mkdir d2/d20/d7b/dca 0 2026-03-10T07:50:57.480 INFO:tasks.workunit.client.0.vm05.stdout:5/596: chown d2/d12/da8/l80 17658 1 2026-03-10T07:50:57.489 INFO:tasks.workunit.client.0.vm05.stdout:1/602: write da/fc [1030783,9519] 0 2026-03-10T07:50:57.490 INFO:tasks.workunit.client.0.vm05.stdout:1/603: dread - da/dd/d12/d34/d58/d8e/f91 zero size 2026-03-10T07:50:57.490 INFO:tasks.workunit.client.0.vm05.stdout:1/604: fdatasync da/d26/d2b/fb0 0 2026-03-10T07:50:57.491 INFO:tasks.workunit.client.0.vm05.stdout:1/605: chown da/dd/d2a/d55/l66 96751383 1 2026-03-10T07:50:57.502 INFO:tasks.workunit.client.0.vm05.stdout:6/593: mkdir d0/d35/d36/db8 0 2026-03-10T07:50:57.503 INFO:tasks.workunit.client.0.vm05.stdout:9/552: write d8/d35/f5d [538348,75715] 0 2026-03-10T07:50:57.518 INFO:tasks.workunit.client.0.vm05.stdout:7/597: mknod d1/d6/d3b/d7f/cb7 0 2026-03-10T07:50:57.522 INFO:tasks.workunit.client.0.vm05.stdout:7/598: dwrite d1/d6/f84 [0,4194304] 0 2026-03-10T07:50:57.527 INFO:tasks.workunit.client.0.vm05.stdout:7/599: fsync d1/d34/f7a 0 2026-03-10T07:50:57.532 INFO:tasks.workunit.client.0.vm05.stdout:8/534: creat d1/dd/d18/d20/d2a/d34/da5/fa8 x:0 0 0 2026-03-10T07:50:57.543 INFO:tasks.workunit.client.0.vm05.stdout:3/567: fsync d8/d22/d60/f61 0 2026-03-10T07:50:57.548 INFO:tasks.workunit.client.0.vm05.stdout:9/553: creat d8/d35/d38/d71/fb5 x:0 0 0 2026-03-10T07:50:57.548 INFO:tasks.workunit.client.0.vm05.stdout:2/653: truncate d0/f89 926882 0 2026-03-10T07:50:57.548 INFO:tasks.workunit.client.0.vm05.stdout:5/597: symlink d2/d20/d7b/dbe/lcb 0 2026-03-10T07:50:57.548 INFO:tasks.workunit.client.0.vm05.stdout:9/554: stat d8/d35/d22/d33/d47/f5f 0 2026-03-10T07:50:57.548 INFO:tasks.workunit.client.0.vm05.stdout:7/600: mkdir d1/d3c/db8 0 2026-03-10T07:50:57.549 INFO:tasks.workunit.client.0.vm05.stdout:7/601: write 
d1/d6/d3b/d7f/f9a [987852,117565] 0 2026-03-10T07:50:57.554 INFO:tasks.workunit.client.0.vm05.stdout:0/588: rename d8/dd/d10/d26/d3a/l89 to d8/dd/d10/lcb 0 2026-03-10T07:50:57.559 INFO:tasks.workunit.client.0.vm05.stdout:3/568: rmdir d8/d1f/d2a/d4a/d7d 39 2026-03-10T07:50:57.561 INFO:tasks.workunit.client.0.vm05.stdout:9/555: dwrite d8/f9 [4194304,4194304] 0 2026-03-10T07:50:57.562 INFO:tasks.workunit.client.0.vm05.stdout:5/598: mknod d2/d20/d5b/ccc 0 2026-03-10T07:50:57.562 INFO:tasks.workunit.client.0.vm05.stdout:2/654: mkdir d0/d8/d3d/d7d/db2/dd7 0 2026-03-10T07:50:57.573 INFO:tasks.workunit.client.0.vm05.stdout:4/657: rename d0/c11 to d0/d6/d9/d12/d9c/db7/db1/cdd 0 2026-03-10T07:50:57.575 INFO:tasks.workunit.client.0.vm05.stdout:4/658: fdatasync d0/d6/d9/f8f 0 2026-03-10T07:50:57.575 INFO:tasks.workunit.client.0.vm05.stdout:9/556: dwrite d8/d35/f5d [0,4194304] 0 2026-03-10T07:50:57.576 INFO:tasks.workunit.client.0.vm05.stdout:9/557: readlink d8/d35/d22/d33/d62/la6 0 2026-03-10T07:50:57.576 INFO:tasks.workunit.client.0.vm05.stdout:0/589: write d8/f75 [5015907,69375] 0 2026-03-10T07:50:57.581 INFO:tasks.workunit.client.0.vm05.stdout:6/594: creat d0/d11/d57/fb9 x:0 0 0 2026-03-10T07:50:57.586 INFO:tasks.workunit.client.0.vm05.stdout:3/569: creat d8/d16/d52/fc0 x:0 0 0 2026-03-10T07:50:57.592 INFO:tasks.workunit.client.0.vm05.stdout:7/602: unlink d1/d6/d47/l85 0 2026-03-10T07:50:57.592 INFO:tasks.workunit.client.0.vm05.stdout:5/599: readlink d2/d12/l6f 0 2026-03-10T07:50:57.592 INFO:tasks.workunit.client.0.vm05.stdout:1/606: rename da/d26/f33 to da/dd/d12/d19/d20/fb7 0 2026-03-10T07:50:57.597 INFO:tasks.workunit.client.0.vm05.stdout:0/590: dread d8/dd/d10/d26/d3a/d5e/fa6 [0,4194304] 0 2026-03-10T07:50:57.599 INFO:tasks.workunit.client.0.vm05.stdout:4/659: mkdir d0/d6/d60/dde 0 2026-03-10T07:50:57.600 INFO:tasks.workunit.client.0.vm05.stdout:4/660: read d0/d6/d95/fad [361131,80036] 0 2026-03-10T07:50:57.600 INFO:tasks.workunit.client.0.vm05.stdout:9/558: stat d8/fa 
0 2026-03-10T07:50:57.602 INFO:tasks.workunit.client.0.vm05.stdout:6/595: symlink d0/d11/d2e/lba 0 2026-03-10T07:50:57.603 INFO:tasks.workunit.client.0.vm05.stdout:6/596: stat d0/d35/d36/d43 0 2026-03-10T07:50:57.605 INFO:tasks.workunit.client.0.vm05.stdout:6/597: write d0/d11/d22/f4c [1564972,55287] 0 2026-03-10T07:50:57.606 INFO:tasks.workunit.client.0.vm05.stdout:6/598: read - d0/d11/d22/d6c/fa5 zero size 2026-03-10T07:50:57.614 INFO:tasks.workunit.client.0.vm05.stdout:3/570: creat d8/d1f/d24/d76/fc1 x:0 0 0 2026-03-10T07:50:57.619 INFO:tasks.workunit.client.0.vm05.stdout:1/607: symlink da/d26/d2b/d89/lb8 0 2026-03-10T07:50:57.620 INFO:tasks.workunit.client.0.vm05.stdout:0/591: fsync d8/dd/d10/d26/d8b/da4/f3e 0 2026-03-10T07:50:57.620 INFO:tasks.workunit.client.0.vm05.stdout:3/571: dwrite d8/f5d [0,4194304] 0 2026-03-10T07:50:57.624 INFO:tasks.workunit.client.0.vm05.stdout:4/661: mknod d0/d6/d9/d12/d4f/cdf 0 2026-03-10T07:50:57.625 INFO:tasks.workunit.client.0.vm05.stdout:9/559: rename d8/d35/d1c/d75/f8d to d8/d35/d22/dab/db4/fb6 0 2026-03-10T07:50:57.628 INFO:tasks.workunit.client.0.vm05.stdout:9/560: read d8/d86/d28/f43 [719489,97646] 0 2026-03-10T07:50:57.638 INFO:tasks.workunit.client.0.vm05.stdout:4/662: dwrite d0/d6/d9/d12/d45/d55/f5f [0,4194304] 0 2026-03-10T07:50:57.638 INFO:tasks.workunit.client.0.vm05.stdout:5/600: mknod d2/d20/d33/d86/d8d/da1/dc0/dc2/ccd 0 2026-03-10T07:50:57.640 INFO:tasks.workunit.client.0.vm05.stdout:6/599: sync 2026-03-10T07:50:57.643 INFO:tasks.workunit.client.0.vm05.stdout:3/572: symlink d8/d16/lc2 0 2026-03-10T07:50:57.645 INFO:tasks.workunit.client.0.vm05.stdout:8/535: link d1/fa d1/dd/d18/d20/d2a/d34/d49/fa9 0 2026-03-10T07:50:57.646 INFO:tasks.workunit.client.0.vm05.stdout:8/536: stat d1/dd/d18/d20/d2a/d48/d7c/d9c 0 2026-03-10T07:50:57.655 INFO:tasks.workunit.client.0.vm05.stdout:8/537: dread - d1/d23/fa3 zero size 2026-03-10T07:50:57.658 INFO:tasks.workunit.client.0.vm05.stdout:3/573: dread d8/d1f/d2a/f66 [0,4194304] 0 
2026-03-10T07:50:57.658 INFO:tasks.workunit.client.0.vm05.stdout:3/574: readlink d8/d1f/l59 0 2026-03-10T07:50:57.659 INFO:tasks.workunit.client.0.vm05.stdout:9/561: symlink d8/d86/d28/d79/d57/lb7 0 2026-03-10T07:50:57.659 INFO:tasks.workunit.client.0.vm05.stdout:4/663: creat d0/d6/d9/d12/d9c/db7/da7/d96/fe0 x:0 0 0 2026-03-10T07:50:57.659 INFO:tasks.workunit.client.0.vm05.stdout:7/603: creat d1/d6/d80/fb9 x:0 0 0 2026-03-10T07:50:57.659 INFO:tasks.workunit.client.0.vm05.stdout:1/608: unlink da/dd/d2a/d55/c82 0 2026-03-10T07:50:57.670 INFO:tasks.workunit.client.0.vm05.stdout:8/538: rename d1/dd/f11 to d1/d45/d90/faa 0 2026-03-10T07:50:57.671 INFO:tasks.workunit.client.0.vm05.stdout:3/575: sync 2026-03-10T07:50:57.671 INFO:tasks.workunit.client.0.vm05.stdout:9/562: sync 2026-03-10T07:50:57.676 INFO:tasks.workunit.client.0.vm05.stdout:7/604: creat d1/d5b/fba x:0 0 0 2026-03-10T07:50:57.683 INFO:tasks.workunit.client.0.vm05.stdout:1/609: dwrite da/dd/d12/d34/f38 [0,4194304] 0 2026-03-10T07:50:57.691 INFO:tasks.workunit.client.0.vm05.stdout:6/600: mknod d0/d11/d4f/d7d/db7/cbb 0 2026-03-10T07:50:57.697 INFO:tasks.workunit.client.0.vm05.stdout:5/601: fsync d2/f9 0 2026-03-10T07:50:57.697 INFO:tasks.workunit.client.0.vm05.stdout:5/602: fsync d2/d20/d77/f7f 0 2026-03-10T07:50:57.703 INFO:tasks.workunit.client.0.vm05.stdout:4/664: rename d0/d6/d9/d5a/d91/fc6 to d0/d6/d9/d12/d69/fe1 0 2026-03-10T07:50:57.713 INFO:tasks.workunit.client.0.vm05.stdout:0/592: rmdir d8/dd/d10 39 2026-03-10T07:50:57.713 INFO:tasks.workunit.client.0.vm05.stdout:8/539: creat d1/dd/d18/d20/d2a/d48/d7c/d9c/fab x:0 0 0 2026-03-10T07:50:57.713 INFO:tasks.workunit.client.0.vm05.stdout:7/605: mknod d1/d3c/d71/d79/d8a/cbb 0 2026-03-10T07:50:57.713 INFO:tasks.workunit.client.0.vm05.stdout:7/606: chown d1/d6/l26 3 1 2026-03-10T07:50:57.714 INFO:tasks.workunit.client.0.vm05.stdout:7/607: readlink d1/d34/d59/d60/d8c/lb5 0 2026-03-10T07:50:57.717 INFO:tasks.workunit.client.0.vm05.stdout:1/610: write 
da/dd/d12/d19/f76 [3704245,86671] 0 2026-03-10T07:50:57.720 INFO:tasks.workunit.client.0.vm05.stdout:7/608: sync 2026-03-10T07:50:57.720 INFO:tasks.workunit.client.0.vm05.stdout:7/609: dread - d1/d3c/d4b/fb3 zero size 2026-03-10T07:50:57.722 INFO:tasks.workunit.client.0.vm05.stdout:1/611: dwrite da/d26/d9e/fa1 [0,4194304] 0 2026-03-10T07:50:57.739 INFO:tasks.workunit.client.0.vm05.stdout:2/655: dwrite d0/d8/d3d/d7d/db2/f29 [0,4194304] 0 2026-03-10T07:50:57.754 INFO:tasks.workunit.client.0.vm05.stdout:6/601: creat d0/d11/d2e/fbc x:0 0 0 2026-03-10T07:50:57.785 INFO:tasks.workunit.client.0.vm05.stdout:5/603: chown d2/d12/f3a 298473 1 2026-03-10T07:50:57.785 INFO:tasks.workunit.client.0.vm05.stdout:0/593: truncate d8/dd/f59 69512 0 2026-03-10T07:50:57.811 INFO:tasks.workunit.client.0.vm05.stdout:3/576: creat d8/d16/d52/d7b/fc3 x:0 0 0 2026-03-10T07:50:57.811 INFO:tasks.workunit.client.0.vm05.stdout:3/577: chown d8/d16/d19/d37/c40 107092715 1 2026-03-10T07:50:57.813 INFO:tasks.workunit.client.0.vm05.stdout:3/578: write d8/d22/d60/f8e [3326817,127765] 0 2026-03-10T07:50:57.817 INFO:tasks.workunit.client.0.vm05.stdout:7/610: rename d1/d34/fb0 to d1/d3c/d71/d79/d8a/fbc 0 2026-03-10T07:50:57.818 INFO:tasks.workunit.client.0.vm05.stdout:7/611: truncate d1/d34/d59/f99 245093 0 2026-03-10T07:50:57.823 INFO:tasks.workunit.client.0.vm05.stdout:1/612: mknod da/dd/d12/d34/cb9 0 2026-03-10T07:50:57.834 INFO:tasks.workunit.client.0.vm05.stdout:4/665: write d0/d6/d60/f72 [827869,33358] 0 2026-03-10T07:50:57.851 INFO:tasks.workunit.client.0.vm05.stdout:0/594: write d8/dd/d10/f19 [615282,118192] 0 2026-03-10T07:50:57.851 INFO:tasks.workunit.client.0.vm05.stdout:0/595: readlink d8/dd/d10/l2c 0 2026-03-10T07:50:57.855 INFO:tasks.workunit.client.0.vm05.stdout:0/596: dwrite d8/dd/d37/f4f [0,4194304] 0 2026-03-10T07:50:57.875 INFO:tasks.workunit.client.0.vm05.stdout:6/602: rename d0/d11/d4f/fa2 to d0/d11/d57/d66/fbd 0 2026-03-10T07:50:57.877 INFO:tasks.workunit.client.0.vm05.stdout:2/656: 
unlink d0/d8/d43/d38/l6a 0 2026-03-10T07:50:57.889 INFO:tasks.workunit.client.0.vm05.stdout:9/563: getdents d8/d35/d1c/d75 0 2026-03-10T07:50:57.889 INFO:tasks.workunit.client.0.vm05.stdout:2/657: read d0/d8/d3d/d7d/da5/fb0 [8290609,12323] 0 2026-03-10T07:50:57.892 INFO:tasks.workunit.client.0.vm05.stdout:8/540: creat d1/dd/d4d/d64/fac x:0 0 0 2026-03-10T07:50:57.894 INFO:tasks.workunit.client.0.vm05.stdout:3/579: unlink d8/d16/ca1 0 2026-03-10T07:50:57.895 INFO:tasks.workunit.client.0.vm05.stdout:3/580: stat d8/c93 0 2026-03-10T07:50:57.896 INFO:tasks.workunit.client.0.vm05.stdout:5/604: write d2/d5/f3d [2078104,99638] 0 2026-03-10T07:50:57.901 INFO:tasks.workunit.client.0.vm05.stdout:5/605: read d2/d12/d2d/f60 [261365,104476] 0 2026-03-10T07:50:57.901 INFO:tasks.workunit.client.0.vm05.stdout:4/666: write d0/d6/d9/d12/d9c/db7/da7/f4c [467806,115308] 0 2026-03-10T07:50:57.902 INFO:tasks.workunit.client.0.vm05.stdout:5/606: fsync d2/d12/d2d/f36 0 2026-03-10T07:50:57.911 INFO:tasks.workunit.client.0.vm05.stdout:7/612: fsync d1/f46 0 2026-03-10T07:50:57.920 INFO:tasks.workunit.client.0.vm05.stdout:0/597: mkdir d8/dd/d10/d26/d3a/d5e/d63/dcc 0 2026-03-10T07:50:57.920 INFO:tasks.workunit.client.0.vm05.stdout:7/613: dwrite d1/d3c/f89 [0,4194304] 0 2026-03-10T07:50:57.931 INFO:tasks.workunit.client.0.vm05.stdout:2/658: dwrite d0/d8/d66/f68 [0,4194304] 0 2026-03-10T07:50:57.938 INFO:tasks.workunit.client.0.vm05.stdout:6/603: dread d0/d6/f44 [0,4194304] 0 2026-03-10T07:50:57.953 INFO:tasks.workunit.client.0.vm05.stdout:5/607: rename d2/d12/f6b to d2/d20/d33/d86/d8d/da1/dc0/fce 0 2026-03-10T07:50:57.954 INFO:tasks.workunit.client.0.vm05.stdout:5/608: stat d2/d5/f3c 0 2026-03-10T07:50:57.956 INFO:tasks.workunit.client.0.vm05.stdout:1/613: symlink da/dd/d12/d86/lba 0 2026-03-10T07:50:57.959 INFO:tasks.workunit.client.0.vm05.stdout:1/614: read da/fc [917320,49931] 0 2026-03-10T07:50:57.965 INFO:tasks.workunit.client.0.vm05.stdout:9/564: mkdir d8/d86/db8 0 2026-03-10T07:50:57.965 
INFO:tasks.workunit.client.0.vm05.stdout:9/565: chown d8/d35/d6b/f97 39145 1 2026-03-10T07:50:57.968 INFO:tasks.workunit.client.0.vm05.stdout:7/614: creat d1/d6/d80/d82/fbd x:0 0 0 2026-03-10T07:50:57.982 INFO:tasks.workunit.client.0.vm05.stdout:6/604: mkdir d0/d6/d3b/dbe 0 2026-03-10T07:50:57.982 INFO:tasks.workunit.client.0.vm05.stdout:5/609: rename d2/d5/f63 to d2/d4b/fcf 0 2026-03-10T07:50:57.982 INFO:tasks.workunit.client.0.vm05.stdout:0/598: mknod d8/dd/d10/d26/d8b/d7d/ccd 0 2026-03-10T07:50:57.982 INFO:tasks.workunit.client.0.vm05.stdout:9/566: symlink d8/d86/d28/d79/d57/lb9 0 2026-03-10T07:50:57.982 INFO:tasks.workunit.client.0.vm05.stdout:7/615: creat d1/d34/d59/d60/fbe x:0 0 0 2026-03-10T07:50:57.985 INFO:tasks.workunit.client.0.vm05.stdout:6/605: mkdir d0/d11/d31/dbf 0 2026-03-10T07:50:57.989 INFO:tasks.workunit.client.0.vm05.stdout:5/610: creat d2/d20/d33/d53/fd0 x:0 0 0 2026-03-10T07:50:57.993 INFO:tasks.workunit.client.0.vm05.stdout:7/616: dwrite d1/d34/f7a [4194304,4194304] 0 2026-03-10T07:50:57.993 INFO:tasks.workunit.client.0.vm05.stdout:0/599: creat d8/dd/d37/d81/fce x:0 0 0 2026-03-10T07:50:57.998 INFO:tasks.workunit.client.0.vm05.stdout:4/667: getdents d0/d6/d9/d12/d45/d55/d44 0 2026-03-10T07:50:58.007 INFO:tasks.workunit.client.0.vm05.stdout:7/617: dwrite d1/d6/d80/d82/fa9 [0,4194304] 0 2026-03-10T07:50:58.007 INFO:tasks.workunit.client.0.vm05.stdout:7/618: truncate d1/d6/d47/d8d/faf 951244 0 2026-03-10T07:50:58.007 INFO:tasks.workunit.client.0.vm05.stdout:7/619: dwrite d1/d6/d47/d8d/fb4 [0,4194304] 0 2026-03-10T07:50:58.011 INFO:tasks.workunit.client.0.vm05.stdout:7/620: chown d1/d34/d59/d60/d8c/lb5 0 1 2026-03-10T07:50:58.019 INFO:tasks.workunit.client.0.vm05.stdout:0/600: symlink d8/dd/d10/db7/dc3/lcf 0 2026-03-10T07:50:58.026 INFO:tasks.workunit.client.0.vm05.stdout:4/668: rename d0/f41 to d0/d6/d9/d12/d9c/db7/da7/fe2 0 2026-03-10T07:50:58.032 INFO:tasks.workunit.client.0.vm05.stdout:7/621: write d1/d6/f22 [1936720,37517] 0 
2026-03-10T07:50:58.039 INFO:tasks.workunit.client.0.vm05.stdout:7/622: getdents d1/d6/d47/d8d/da7 0 2026-03-10T07:50:58.043 INFO:tasks.workunit.client.0.vm05.stdout:7/623: truncate d1/d3c/d71/d79/d8a/f90 808331 0 2026-03-10T07:50:58.043 INFO:tasks.workunit.client.0.vm05.stdout:4/669: creat d0/d6/d9/d12/d45/d55/fe3 x:0 0 0 2026-03-10T07:50:58.044 INFO:tasks.workunit.client.0.vm05.stdout:5/611: sync 2026-03-10T07:50:58.045 INFO:tasks.workunit.client.0.vm05.stdout:6/606: sync 2026-03-10T07:50:58.045 INFO:tasks.workunit.client.0.vm05.stdout:1/615: sync 2026-03-10T07:50:58.058 INFO:tasks.workunit.client.0.vm05.stdout:4/670: dwrite d0/d6/d37/f75 [0,4194304] 0 2026-03-10T07:50:58.059 INFO:tasks.workunit.client.0.vm05.stdout:6/607: dwrite d0/d11/d22/f4c [0,4194304] 0 2026-03-10T07:50:58.061 INFO:tasks.workunit.client.0.vm05.stdout:4/671: write d0/d6/d37/f3d [4230787,90862] 0 2026-03-10T07:50:58.068 INFO:tasks.workunit.client.0.vm05.stdout:5/612: dwrite d2/d20/d4c/d64/f96 [4194304,4194304] 0 2026-03-10T07:50:58.085 INFO:tasks.workunit.client.0.vm05.stdout:1/616: sync 2026-03-10T07:50:58.086 INFO:tasks.workunit.client.0.vm05.stdout:4/672: dwrite d0/d6/d9/f83 [0,4194304] 0 2026-03-10T07:50:58.090 INFO:tasks.workunit.client.0.vm05.stdout:5/613: sync 2026-03-10T07:50:58.091 INFO:tasks.workunit.client.0.vm05.stdout:6/608: dread d0/f23 [0,4194304] 0 2026-03-10T07:50:58.095 INFO:tasks.workunit.client.0.vm05.stdout:6/609: write d0/d11/d57/d66/f75 [2678908,31806] 0 2026-03-10T07:50:58.103 INFO:tasks.workunit.client.0.vm05.stdout:8/541: write d1/dd/d5e/f6b [5517420,4117] 0 2026-03-10T07:50:58.103 INFO:tasks.workunit.client.0.vm05.stdout:3/581: write d8/d16/dac/f75 [950791,130896] 0 2026-03-10T07:50:58.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:50:57 vm05.local ceph-mon[50387]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' 2026-03-10T07:50:58.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:50:57 vm05.local ceph-mon[50387]: from='mgr.14628 
192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' 2026-03-10T07:50:58.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:50:57 vm05.local ceph-mon[50387]: pgmap v18: 65 pgs: 65 active+clean; 1.1 GiB data, 4.0 GiB used, 116 GiB / 120 GiB avail; 40 MiB/s rd, 107 MiB/s wr, 264 op/s 2026-03-10T07:50:58.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:50:57 vm05.local ceph-mon[50387]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T07:50:58.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:50:57 vm05.local ceph-mon[50387]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' 2026-03-10T07:50:58.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:50:57 vm05.local ceph-mon[50387]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' 2026-03-10T07:50:58.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:50:57 vm08.local ceph-mon[59917]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' 2026-03-10T07:50:58.169 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:50:57 vm08.local ceph-mon[59917]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' 2026-03-10T07:50:58.169 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:50:57 vm08.local ceph-mon[59917]: pgmap v18: 65 pgs: 65 active+clean; 1.1 GiB data, 4.0 GiB used, 116 GiB / 120 GiB avail; 40 MiB/s rd, 107 MiB/s wr, 264 op/s 2026-03-10T07:50:58.169 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:50:57 vm08.local ceph-mon[59917]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T07:50:58.169 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:50:57 vm08.local ceph-mon[59917]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' 2026-03-10T07:50:58.169 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 
07:50:57 vm08.local ceph-mon[59917]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' 2026-03-10T07:50:58.171 INFO:tasks.workunit.client.0.vm05.stdout:9/567: write d8/d35/d22/f4f [5105815,47142] 0 2026-03-10T07:50:58.175 INFO:tasks.workunit.client.0.vm05.stdout:5/614: symlink d2/d20/d33/d86/d8d/da1/dc0/ld1 0 2026-03-10T07:50:58.177 INFO:tasks.workunit.client.0.vm05.stdout:5/615: dread - d2/fad zero size 2026-03-10T07:50:58.183 INFO:tasks.workunit.client.0.vm05.stdout:8/542: rmdir d1/d52 39 2026-03-10T07:50:58.184 INFO:tasks.workunit.client.0.vm05.stdout:0/601: dwrite d8/dd/f22 [0,4194304] 0 2026-03-10T07:50:58.188 INFO:tasks.workunit.client.0.vm05.stdout:1/617: dwrite da/d26/f92 [4194304,4194304] 0 2026-03-10T07:50:58.201 INFO:tasks.workunit.client.0.vm05.stdout:7/624: rename d1/d34/l6a to d1/d6/d80/lbf 0 2026-03-10T07:50:58.202 INFO:tasks.workunit.client.0.vm05.stdout:4/673: mkdir d0/de4 0 2026-03-10T07:50:58.202 INFO:tasks.workunit.client.0.vm05.stdout:7/625: fsync d1/d3c/f89 0 2026-03-10T07:50:58.207 INFO:tasks.workunit.client.0.vm05.stdout:2/659: chown d0/d8/d43/da4/cca 149548 1 2026-03-10T07:50:58.212 INFO:tasks.workunit.client.0.vm05.stdout:4/674: dwrite d0/d6/d9/d8c/dbe/fc3 [0,4194304] 0 2026-03-10T07:50:58.226 INFO:tasks.workunit.client.0.vm05.stdout:6/610: mkdir d0/d11/d57/da4/db3/dc0 0 2026-03-10T07:50:58.226 INFO:tasks.workunit.client.0.vm05.stdout:0/602: stat d8/dd/d10/d26/cb1 0 2026-03-10T07:50:58.227 INFO:tasks.workunit.client.0.vm05.stdout:0/603: chown d8/dd/d37/d67/c61 1828 1 2026-03-10T07:50:58.244 INFO:tasks.workunit.client.0.vm05.stdout:7/626: rmdir d1/d6/d80/d82 39 2026-03-10T07:50:58.245 INFO:tasks.workunit.client.0.vm05.stdout:3/582: read d8/d16/d19/f81 [376987,18571] 0 2026-03-10T07:50:58.246 INFO:tasks.workunit.client.0.vm05.stdout:7/627: write d1/d3c/f89 [2156915,127745] 0 2026-03-10T07:50:58.259 INFO:tasks.workunit.client.0.vm05.stdout:4/675: symlink d0/d6/d9/d12/d9c/db7/db1/le5 0 2026-03-10T07:50:58.273 
INFO:tasks.workunit.client.0.vm05.stdout:0/604: truncate d8/dd/d10/d26/d3a/d5e/f71 5004727 0 2026-03-10T07:50:58.273 INFO:tasks.workunit.client.0.vm05.stdout:9/568: link d8/d35/d22/f6a d8/d35/d22/d33/d62/fba 0 2026-03-10T07:50:58.274 INFO:tasks.workunit.client.0.vm05.stdout:1/618: chown da/dd/d12/d86/cab 1952 1 2026-03-10T07:50:58.275 INFO:tasks.workunit.client.0.vm05.stdout:5/616: truncate d2/d20/f51 1408036 0 2026-03-10T07:50:58.285 INFO:tasks.workunit.client.0.vm05.stdout:3/583: dread d8/f18 [0,4194304] 0 2026-03-10T07:50:58.292 INFO:tasks.workunit.client.0.vm05.stdout:3/584: dread d8/d16/f2d [0,4194304] 0 2026-03-10T07:50:58.301 INFO:tasks.workunit.client.0.vm05.stdout:0/605: creat d8/d9c/fd0 x:0 0 0 2026-03-10T07:50:58.304 INFO:tasks.workunit.client.0.vm05.stdout:0/606: dwrite d8/dd/d37/d56/f62 [4194304,4194304] 0 2026-03-10T07:50:58.314 INFO:tasks.workunit.client.0.vm05.stdout:2/660: link d0/d8/d43/df/d53/f82 d0/d8/d66/fd8 0 2026-03-10T07:50:58.317 INFO:tasks.workunit.client.0.vm05.stdout:2/661: write d0/d2a/f45 [2655786,81295] 0 2026-03-10T07:50:58.317 INFO:tasks.workunit.client.0.vm05.stdout:2/662: chown d0/d52/cc7 166 1 2026-03-10T07:50:58.320 INFO:tasks.workunit.client.0.vm05.stdout:9/569: mknod d8/d35/d22/d33/d62/cbb 0 2026-03-10T07:50:58.322 INFO:tasks.workunit.client.0.vm05.stdout:5/617: stat d2/d20/d4c/d64/l91 0 2026-03-10T07:50:58.323 INFO:tasks.workunit.client.0.vm05.stdout:5/618: chown d2/d20/d5b/c92 27256 1 2026-03-10T07:50:58.327 INFO:tasks.workunit.client.0.vm05.stdout:4/676: link d0/d6/d37/f75 d0/d6/d60/fe6 0 2026-03-10T07:50:58.337 INFO:tasks.workunit.client.0.vm05.stdout:3/585: rename d8/d1c/c5a to d8/d1f/cc4 0 2026-03-10T07:50:58.343 INFO:tasks.workunit.client.0.vm05.stdout:1/619: symlink da/dd/lbb 0 2026-03-10T07:50:58.348 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:58.345+0000 7f95ebfff700 1 -- 192.168.123.105:0/1644398674 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f95ec107d90 msgr2=0x7f95ec10a1c0 
secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:50:58.348 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:58.345+0000 7f95ebfff700 1 --2- 192.168.123.105:0/1644398674 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f95ec107d90 0x7f95ec10a1c0 secure :-1 s=READY pgs=95 cs=0 l=1 rev1=1 crypto rx=0x7f95dc009b00 tx=0x7f95dc009e10 comp rx=0 tx=0).stop 2026-03-10T07:50:58.348 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:58.346+0000 7f95ebfff700 1 -- 192.168.123.105:0/1644398674 shutdown_connections 2026-03-10T07:50:58.348 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:58.346+0000 7f95ebfff700 1 --2- 192.168.123.105:0/1644398674 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f95ec10a700 0x7f95ec10cb90 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:50:58.348 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:58.346+0000 7f95ebfff700 1 --2- 192.168.123.105:0/1644398674 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f95ec107d90 0x7f95ec10a1c0 unknown :-1 s=CLOSED pgs=95 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:50:58.348 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:58.346+0000 7f95ebfff700 1 -- 192.168.123.105:0/1644398674 >> 192.168.123.105:0/1644398674 conn(0x7f95ec06daa0 msgr2=0x7f95ec06ff00 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:50:58.348 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:58.346+0000 7f95ebfff700 1 -- 192.168.123.105:0/1644398674 shutdown_connections 2026-03-10T07:50:58.348 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:58.346+0000 7f95ebfff700 1 -- 192.168.123.105:0/1644398674 wait complete. 
2026-03-10T07:50:58.348 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:58.346+0000 7f95ebfff700 1 Processor -- start 2026-03-10T07:50:58.348 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:58.346+0000 7f95ebfff700 1 -- start start 2026-03-10T07:50:58.348 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:58.347+0000 7f95ebfff700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f95ec107d90 0x7f95ec116b40 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:50:58.349 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:58.347+0000 7f95ebfff700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f95ec10a700 0x7f95ec117080 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:50:58.349 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:58.347+0000 7f95ebfff700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f95ec1176a0 con 0x7f95ec107d90 2026-03-10T07:50:58.349 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:58.347+0000 7f95ebfff700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f95ec1177e0 con 0x7f95ec10a700 2026-03-10T07:50:58.349 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:58.347+0000 7f95e3fff700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f95ec10a700 0x7f95ec117080 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:50:58.349 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:58.347+0000 7f95e3fff700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f95ec10a700 0x7f95ec117080 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.108:3300/0 says I am 
v2:192.168.123.105:42738/0 (socket says 192.168.123.105:42738) 2026-03-10T07:50:58.349 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:58.347+0000 7f95e3fff700 1 -- 192.168.123.105:0/3338501354 learned_addr learned my addr 192.168.123.105:0/3338501354 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T07:50:58.349 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:58.347+0000 7f95e3fff700 1 -- 192.168.123.105:0/3338501354 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f95ec107d90 msgr2=0x7f95ec116b40 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:50:58.349 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:58.347+0000 7f95e3fff700 1 --2- 192.168.123.105:0/3338501354 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f95ec107d90 0x7f95ec116b40 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:50:58.349 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:58.347+0000 7f95e3fff700 1 -- 192.168.123.105:0/3338501354 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f95dc0097e0 con 0x7f95ec10a700 2026-03-10T07:50:58.350 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:58.348+0000 7f95e3fff700 1 --2- 192.168.123.105:0/3338501354 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f95ec10a700 0x7f95ec117080 secure :-1 s=READY pgs=96 cs=0 l=1 rev1=1 crypto rx=0x7f95d400b770 tx=0x7f95d400bb30 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:50:58.350 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:58.348+0000 7f95e8ff9700 1 -- 192.168.123.105:0/3338501354 <== mon.1 v2:192.168.123.108:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f95d400f820 con 0x7f95ec10a700 2026-03-10T07:50:58.352 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:58.348+0000 7f95ebfff700 1 -- 
192.168.123.105:0/3338501354 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f95ec1b3400 con 0x7f95ec10a700 2026-03-10T07:50:58.352 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:58.348+0000 7f95ebfff700 1 -- 192.168.123.105:0/3338501354 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f95ec1b3950 con 0x7f95ec10a700 2026-03-10T07:50:58.352 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:58.349+0000 7f95e8ff9700 1 -- 192.168.123.105:0/3338501354 <== mon.1 v2:192.168.123.108:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f95d400fe60 con 0x7f95ec10a700 2026-03-10T07:50:58.352 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:58.349+0000 7f95e8ff9700 1 -- 192.168.123.105:0/3338501354 <== mon.1 v2:192.168.123.108:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f95d400d610 con 0x7f95ec10a700 2026-03-10T07:50:58.352 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:58.350+0000 7f95e8ff9700 1 -- 192.168.123.105:0/3338501354 <== mon.1 v2:192.168.123.108:3300/0 4 ==== mgrmap(e 30) v1 ==== 100066+0+0 (secure 0 0 0) 0x7f95d4017400 con 0x7f95ec10a700 2026-03-10T07:50:58.352 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:58.350+0000 7f95ebfff700 1 -- 192.168.123.105:0/3338501354 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f95ec04ea90 con 0x7f95ec10a700 2026-03-10T07:50:58.352 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:58.350+0000 7f95e8ff9700 1 --2- 192.168.123.105:0/3338501354 >> [v2:192.168.123.105:6800/413688438,v1:192.168.123.105:6801/413688438] conn(0x7f95cc077680 0x7f95cc079b40 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:50:58.352 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:58.351+0000 7f95eaffd700 1 --2- 
192.168.123.105:0/3338501354 >> [v2:192.168.123.105:6800/413688438,v1:192.168.123.105:6801/413688438] conn(0x7f95cc077680 0x7f95cc079b40 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:50:58.352 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:58.351+0000 7f95e8ff9700 1 -- 192.168.123.105:0/3338501354 <== mon.1 v2:192.168.123.108:3300/0 5 ==== osd_map(42..42 src has 1..42) v4 ==== 5878+0+0 (secure 0 0 0) 0x7f95d4099520 con 0x7f95ec10a700 2026-03-10T07:50:58.353 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:58.352+0000 7f95eaffd700 1 --2- 192.168.123.105:0/3338501354 >> [v2:192.168.123.105:6800/413688438,v1:192.168.123.105:6801/413688438] conn(0x7f95cc077680 0x7f95cc079b40 secure :-1 s=READY pgs=23 cs=0 l=1 rev1=1 crypto rx=0x7f95dc00b5c0 tx=0x7f95dc005dc0 comp rx=0 tx=0).ready entity=mgr.14628 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:50:58.355 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:58.354+0000 7f95e8ff9700 1 -- 192.168.123.105:0/3338501354 <== mon.1 v2:192.168.123.108:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+191549 (secure 0 0 0) 0x7f95d4061fb0 con 0x7f95ec10a700 2026-03-10T07:50:58.358 INFO:tasks.workunit.client.0.vm05.stdout:9/570: mkdir d8/d86/d28/d79/d57/dbc 0 2026-03-10T07:50:58.367 INFO:tasks.workunit.client.0.vm05.stdout:4/677: symlink d0/d6/d9/d5a/d91/le7 0 2026-03-10T07:50:58.367 INFO:tasks.workunit.client.0.vm05.stdout:3/586: dread - d8/d16/f88 zero size 2026-03-10T07:50:58.368 INFO:tasks.workunit.client.0.vm05.stdout:4/678: truncate d0/d6/d9/d12/d9c/db7/da7/fe2 734723 0 2026-03-10T07:50:58.368 INFO:tasks.workunit.client.0.vm05.stdout:4/679: chown d0/d6/da6/lbb 1 1 2026-03-10T07:50:58.373 INFO:tasks.workunit.client.0.vm05.stdout:5/619: mknod d2/d20/d33/cd2 0 2026-03-10T07:50:58.401 INFO:tasks.workunit.client.0.vm05.stdout:6/611: dwrite 
d0/d11/d57/d60/f74 [0,4194304] 0 2026-03-10T07:50:58.403 INFO:tasks.workunit.client.0.vm05.stdout:8/543: truncate d1/dd/d4d/d64/d6a/f76 1831172 0 2026-03-10T07:50:58.405 INFO:tasks.workunit.client.0.vm05.stdout:7/628: dwrite d1/d3c/f63 [0,4194304] 0 2026-03-10T07:50:58.411 INFO:tasks.workunit.client.0.vm05.stdout:0/607: write d8/dd/d37/d81/f91 [188086,112245] 0 2026-03-10T07:50:58.412 INFO:tasks.workunit.client.0.vm05.stdout:7/629: readlink d1/d6/d80/l91 0 2026-03-10T07:50:58.416 INFO:tasks.workunit.client.0.vm05.stdout:4/680: truncate d0/d6/d9/d12/d9c/db7/fb2 3153356 0 2026-03-10T07:50:58.426 INFO:tasks.workunit.client.0.vm05.stdout:5/620: truncate d2/d20/d33/f88 686715 0 2026-03-10T07:50:58.427 INFO:tasks.workunit.client.0.vm05.stdout:2/663: write d0/f6 [5123557,96194] 0 2026-03-10T07:50:58.427 INFO:tasks.workunit.client.0.vm05.stdout:9/571: link d8/d35/d1c/f49 d8/d35/d1c/d20/fbd 0 2026-03-10T07:50:58.427 INFO:tasks.workunit.client.0.vm05.stdout:6/612: dread d0/d6/f24 [0,4194304] 0 2026-03-10T07:50:58.433 INFO:tasks.workunit.client.0.vm05.stdout:1/620: creat da/dd/d12/d86/d9a/fbc x:0 0 0 2026-03-10T07:50:58.514 INFO:tasks.workunit.client.0.vm05.stdout:0/608: rmdir d8/dd/d37/d56 39 2026-03-10T07:50:58.514 INFO:tasks.workunit.client.0.vm05.stdout:4/681: readlink d0/d6/d9/d12/d45/lb8 0 2026-03-10T07:50:58.520 INFO:tasks.workunit.client.0.vm05.stdout:8/544: dwrite d1/dd/d18/d20/d2a/d34/d49/d5d/f84 [0,4194304] 0 2026-03-10T07:50:58.521 INFO:tasks.workunit.client.0.vm05.stdout:2/664: write d0/f4 [1246657,16220] 0 2026-03-10T07:50:58.528 INFO:tasks.workunit.client.0.vm05.stdout:7/630: dwrite d1/f49 [4194304,4194304] 0 2026-03-10T07:50:58.574 INFO:tasks.workunit.client.0.vm05.stdout:7/631: sync 2026-03-10T07:50:58.595 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:58.588+0000 7f95ebfff700 1 -- 192.168.123.105:0/3338501354 --> [v2:192.168.123.105:6800/413688438,v1:192.168.123.105:6801/413688438] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": 
["mon-mgr", ""]}) v1 -- 0x7f95ec10b3f0 con 0x7f95cc077680 2026-03-10T07:50:58.595 INFO:tasks.workunit.client.0.vm05.stdout:6/613: symlink d0/d11/d4f/da0/da6/lc1 0 2026-03-10T07:50:58.595 INFO:tasks.workunit.client.0.vm05.stdout:3/587: getdents d8/d16/d52/da4 0 2026-03-10T07:50:58.595 INFO:tasks.workunit.client.0.vm05.stdout:6/614: stat d0/d11/d57/f7a 0 2026-03-10T07:50:58.596 INFO:tasks.workunit.client.0.vm05.stdout:7/632: mknod d1/d5b/cc0 0 2026-03-10T07:50:58.600 INFO:tasks.workunit.client.0.vm05.stdout:7/633: dwrite d1/d34/d59/d60/fbe [0,4194304] 0 2026-03-10T07:50:58.602 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:58.599+0000 7f95e8ff9700 1 -- 192.168.123.105:0/3338501354 <== mgr.14628 v2:192.168.123.105:6800/413688438 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+370 (secure 0 0 0) 0x7f95ec10b3f0 con 0x7f95cc077680 2026-03-10T07:50:58.604 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:58.602+0000 7f95e1ffb700 1 -- 192.168.123.105:0/3338501354 >> [v2:192.168.123.105:6800/413688438,v1:192.168.123.105:6801/413688438] conn(0x7f95cc077680 msgr2=0x7f95cc079b40 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:50:58.604 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:58.602+0000 7f95e1ffb700 1 --2- 192.168.123.105:0/3338501354 >> [v2:192.168.123.105:6800/413688438,v1:192.168.123.105:6801/413688438] conn(0x7f95cc077680 0x7f95cc079b40 secure :-1 s=READY pgs=23 cs=0 l=1 rev1=1 crypto rx=0x7f95dc00b5c0 tx=0x7f95dc005dc0 comp rx=0 tx=0).stop 2026-03-10T07:50:58.604 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:58.602+0000 7f95e1ffb700 1 -- 192.168.123.105:0/3338501354 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f95ec10a700 msgr2=0x7f95ec117080 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:50:58.604 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:58.602+0000 7f95e1ffb700 1 --2- 192.168.123.105:0/3338501354 >> 
[v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f95ec10a700 0x7f95ec117080 secure :-1 s=READY pgs=96 cs=0 l=1 rev1=1 crypto rx=0x7f95d400b770 tx=0x7f95d400bb30 comp rx=0 tx=0).stop 2026-03-10T07:50:58.604 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:58.603+0000 7f95e1ffb700 1 -- 192.168.123.105:0/3338501354 shutdown_connections 2026-03-10T07:50:58.604 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:58.603+0000 7f95e1ffb700 1 --2- 192.168.123.105:0/3338501354 >> [v2:192.168.123.105:6800/413688438,v1:192.168.123.105:6801/413688438] conn(0x7f95cc077680 0x7f95cc079b40 unknown :-1 s=CLOSED pgs=23 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:50:58.604 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:58.603+0000 7f95e1ffb700 1 --2- 192.168.123.105:0/3338501354 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f95ec107d90 0x7f95ec116b40 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:50:58.604 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:58.603+0000 7f95e1ffb700 1 --2- 192.168.123.105:0/3338501354 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f95ec10a700 0x7f95ec117080 unknown :-1 s=CLOSED pgs=96 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:50:58.604 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:58.603+0000 7f95e1ffb700 1 -- 192.168.123.105:0/3338501354 >> 192.168.123.105:0/3338501354 conn(0x7f95ec06daa0 msgr2=0x7f95ec06ff00 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:50:58.605 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:58.603+0000 7f95e1ffb700 1 -- 192.168.123.105:0/3338501354 shutdown_connections 2026-03-10T07:50:58.605 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:58.604+0000 7f95e1ffb700 1 -- 192.168.123.105:0/3338501354 wait complete. 
2026-03-10T07:50:58.610 INFO:tasks.workunit.client.0.vm05.stdout:7/634: dread d1/d6/f84 [0,4194304] 0 2026-03-10T07:50:58.615 INFO:tasks.workunit.client.0.vm05.stdout:4/682: symlink d0/d6/d9/d12/d69/ddc/le8 0 2026-03-10T07:50:58.622 INFO:teuthology.orchestra.run.vm05.stdout:true 2026-03-10T07:50:58.631 INFO:tasks.workunit.client.0.vm05.stdout:5/621: getdents d2/d20/d4c/d64 0 2026-03-10T07:50:58.637 INFO:tasks.workunit.client.0.vm05.stdout:3/588: truncate d8/d1f/d2a/d96/f7f 483803 0 2026-03-10T07:50:58.638 INFO:tasks.workunit.client.0.vm05.stdout:5/622: write d2/d12/f40 [998725,62183] 0 2026-03-10T07:50:58.638 INFO:tasks.workunit.client.0.vm05.stdout:7/635: symlink d1/d6/d3b/lc1 0 2026-03-10T07:50:58.638 INFO:tasks.workunit.client.0.vm05.stdout:7/636: write d1/d34/d59/d60/fbe [3121381,127875] 0 2026-03-10T07:50:58.650 INFO:tasks.workunit.client.0.vm05.stdout:9/572: write d8/d35/d22/f6a [1422868,38190] 0 2026-03-10T07:50:58.650 INFO:tasks.workunit.client.0.vm05.stdout:0/609: creat d8/dd/d37/fd1 x:0 0 0 2026-03-10T07:50:58.653 INFO:tasks.workunit.client.0.vm05.stdout:4/683: creat d0/d6/d60/fe9 x:0 0 0 2026-03-10T07:50:58.654 INFO:tasks.workunit.client.0.vm05.stdout:8/545: creat d1/fad x:0 0 0 2026-03-10T07:50:58.654 INFO:tasks.workunit.client.0.vm05.stdout:1/621: getdents da 0 2026-03-10T07:50:58.656 INFO:tasks.workunit.client.0.vm05.stdout:3/589: unlink d8/d1f/l6d 0 2026-03-10T07:50:58.658 INFO:tasks.workunit.client.0.vm05.stdout:4/684: sync 2026-03-10T07:50:58.666 INFO:tasks.workunit.client.0.vm05.stdout:2/665: write d0/f5 [1740662,115021] 0 2026-03-10T07:50:58.672 INFO:tasks.workunit.client.0.vm05.stdout:6/615: write d0/d35/d36/f5b [7291273,120862] 0 2026-03-10T07:50:58.674 INFO:tasks.workunit.client.0.vm05.stdout:6/616: readlink d0/l76 0 2026-03-10T07:50:58.674 INFO:tasks.workunit.client.0.vm05.stdout:0/610: mknod d8/cd2 0 2026-03-10T07:50:58.684 INFO:tasks.workunit.client.0.vm05.stdout:1/622: mkdir da/d26/d2b/d89/dbd 0 2026-03-10T07:50:58.685 
INFO:tasks.workunit.client.0.vm05.stdout:0/611: dwrite d8/f75 [0,4194304] 0 2026-03-10T07:50:58.686 INFO:tasks.workunit.client.0.vm05.stdout:1/623: chown da/dd/d2a/d55 2688 1 2026-03-10T07:50:58.695 INFO:tasks.workunit.client.0.vm05.stdout:6/617: dwrite d0/d35/d36/f59 [0,4194304] 0 2026-03-10T07:50:58.695 INFO:tasks.workunit.client.0.vm05.stdout:1/624: truncate da/dd/d2a/f75 786015 0 2026-03-10T07:50:58.712 INFO:tasks.workunit.client.0.vm05.stdout:8/546: stat d1/dd/d18/d20/d2a/c7f 0 2026-03-10T07:50:58.726 INFO:tasks.workunit.client.0.vm05.stdout:3/590: unlink d8/d1f/d24/ca7 0 2026-03-10T07:50:58.727 INFO:tasks.workunit.client.0.vm05.stdout:3/591: chown d8/d1f/d24/f3e 3 1 2026-03-10T07:50:58.730 INFO:tasks.workunit.client.0.vm05.stdout:3/592: readlink d8/d1f/d24/lb7 0 2026-03-10T07:50:58.738 INFO:tasks.workunit.client.0.vm05.stdout:6/618: dwrite d0/d11/f9d [0,4194304] 0 2026-03-10T07:50:58.742 INFO:tasks.workunit.client.0.vm05.stdout:5/623: truncate d2/d5/d61/f66 279370 0 2026-03-10T07:50:58.757 INFO:tasks.workunit.client.0.vm05.stdout:1/625: rename da/dd/d12/d34/d58/d8e to da/d26/d2b/daf/dbe 0 2026-03-10T07:50:58.757 INFO:tasks.workunit.client.0.vm05.stdout:4/685: mknod d0/cea 0 2026-03-10T07:50:58.757 INFO:tasks.workunit.client.0.vm05.stdout:7/637: creat d1/d3c/d71/d79/d8a/dac/fc2 x:0 0 0 2026-03-10T07:50:58.758 INFO:tasks.workunit.client.0.vm05.stdout:3/593: mkdir d8/d1f/d24/d76/dc5 0 2026-03-10T07:50:58.766 INFO:tasks.workunit.client.0.vm05.stdout:2/666: creat d0/d8/d66/dd1/d49/db1/fd9 x:0 0 0 2026-03-10T07:50:58.767 INFO:tasks.workunit.client.0.vm05.stdout:7/638: read d1/d6/f41 [1397064,89839] 0 2026-03-10T07:50:58.769 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:58.767+0000 7f3d16eff700 1 -- 192.168.123.105:0/538591105 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f3d10075c80 msgr2=0x7f3d10078110 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:50:58.769 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:58.767+0000 7f3d16eff700 1 --2- 192.168.123.105:0/538591105 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f3d10075c80 0x7f3d10078110 secure :-1 s=READY pgs=97 cs=0 l=1 rev1=1 crypto rx=0x7f3d0800d3f0 tx=0x7f3d0800d700 comp rx=0 tx=0).stop 2026-03-10T07:50:58.769 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:58.767+0000 7f3d16eff700 1 -- 192.168.123.105:0/538591105 shutdown_connections 2026-03-10T07:50:58.769 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:58.767+0000 7f3d16eff700 1 --2- 192.168.123.105:0/538591105 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f3d10075c80 0x7f3d10078110 unknown :-1 s=CLOSED pgs=97 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:50:58.769 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:58.767+0000 7f3d16eff700 1 --2- 192.168.123.105:0/538591105 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3d10072d90 0x7f3d100731b0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:50:58.769 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:58.767+0000 7f3d16eff700 1 -- 192.168.123.105:0/538591105 >> 192.168.123.105:0/538591105 conn(0x7f3d1006dda0 msgr2=0x7f3d10070220 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:50:58.770 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:58.767+0000 7f3d16eff700 1 -- 192.168.123.105:0/538591105 shutdown_connections 2026-03-10T07:50:58.770 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:58.767+0000 7f3d16eff700 1 -- 192.168.123.105:0/538591105 wait complete. 
2026-03-10T07:50:58.770 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:58.767+0000 7f3d16eff700 1 Processor -- start 2026-03-10T07:50:58.770 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:58.767+0000 7f3d16eff700 1 -- start start 2026-03-10T07:50:58.770 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:58.767+0000 7f3d16eff700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3d10072d90 0x7f3d1012bdb0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:50:58.770 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:58.767+0000 7f3d16eff700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f3d10083ad0 0x7f3d1012e300 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:50:58.770 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:58.767+0000 7f3d16eff700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f3d1012e840 con 0x7f3d10072d90 2026-03-10T07:50:58.770 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:58.767+0000 7f3d16eff700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f3d1012e9b0 con 0x7f3d10083ad0 2026-03-10T07:50:58.770 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:58.768+0000 7f3d0ffff700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f3d10083ad0 0x7f3d1012e300 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:50:58.770 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:58.768+0000 7f3d0ffff700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f3d10083ad0 0x7f3d1012e300 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.108:3300/0 says I am 
v2:192.168.123.105:42748/0 (socket says 192.168.123.105:42748) 2026-03-10T07:50:58.770 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:58.768+0000 7f3d0ffff700 1 -- 192.168.123.105:0/1962141481 learned_addr learned my addr 192.168.123.105:0/1962141481 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T07:50:58.770 INFO:tasks.workunit.client.0.vm05.stdout:4/686: sync 2026-03-10T07:50:58.770 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:58.769+0000 7f3d0ffff700 1 -- 192.168.123.105:0/1962141481 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3d10072d90 msgr2=0x7f3d1012bdb0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:50:58.770 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:58.769+0000 7f3d0ffff700 1 --2- 192.168.123.105:0/1962141481 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3d10072d90 0x7f3d1012bdb0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:50:58.770 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:58.769+0000 7f3d0ffff700 1 -- 192.168.123.105:0/1962141481 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f3d08007ed0 con 0x7f3d10083ad0 2026-03-10T07:50:58.770 INFO:tasks.workunit.client.0.vm05.stdout:7/639: chown d1/d34/d59/c88 4415471 1 2026-03-10T07:50:58.771 INFO:tasks.workunit.client.0.vm05.stdout:4/687: stat d0/d6/d60/fe6 0 2026-03-10T07:50:58.771 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:58.770+0000 7f3d0ffff700 1 --2- 192.168.123.105:0/1962141481 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f3d10083ad0 0x7f3d1012e300 secure :-1 s=READY pgs=98 cs=0 l=1 rev1=1 crypto rx=0x7f3d08003c60 tx=0x7f3d08003d40 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:50:58.774 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:58.770+0000 7f3d0dffb700 
1 -- 192.168.123.105:0/1962141481 <== mon.1 v2:192.168.123.108:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f3d0801c070 con 0x7f3d10083ad0 2026-03-10T07:50:58.774 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:58.771+0000 7f3d16eff700 1 -- 192.168.123.105:0/1962141481 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f3d1012ebd0 con 0x7f3d10083ad0 2026-03-10T07:50:58.774 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:58.771+0000 7f3d16eff700 1 -- 192.168.123.105:0/1962141481 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f3d1012f120 con 0x7f3d10083ad0 2026-03-10T07:50:58.774 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:58.771+0000 7f3d0dffb700 1 -- 192.168.123.105:0/1962141481 <== mon.1 v2:192.168.123.108:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f3d0800fcf0 con 0x7f3d10083ad0 2026-03-10T07:50:58.774 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:58.771+0000 7f3d0dffb700 1 -- 192.168.123.105:0/1962141481 <== mon.1 v2:192.168.123.108:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f3d08017d40 con 0x7f3d10083ad0 2026-03-10T07:50:58.774 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:58.771+0000 7f3d16eff700 1 -- 192.168.123.105:0/1962141481 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f3cfc005320 con 0x7f3d10083ad0 2026-03-10T07:50:58.774 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:58.772+0000 7f3d0dffb700 1 -- 192.168.123.105:0/1962141481 <== mon.1 v2:192.168.123.108:3300/0 4 ==== mgrmap(e 30) v1 ==== 100066+0+0 (secure 0 0 0) 0x7f3d08017420 con 0x7f3d10083ad0 2026-03-10T07:50:58.776 INFO:tasks.workunit.client.0.vm05.stdout:0/612: link d8/dd/d37/d81/f91 d8/dd/d37/d81/fd3 0 2026-03-10T07:50:58.777 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:58.773+0000 7f3d0dffb700 1 --2- 192.168.123.105:0/1962141481 >> [v2:192.168.123.105:6800/413688438,v1:192.168.123.105:6801/413688438] conn(0x7f3cf8077780 0x7f3cf8079c40 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:50:58.777 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:58.773+0000 7f3d0dffb700 1 -- 192.168.123.105:0/1962141481 <== mon.1 v2:192.168.123.108:3300/0 5 ==== osd_map(42..42 src has 1..42) v4 ==== 5878+0+0 (secure 0 0 0) 0x7f3d08013070 con 0x7f3d10083ad0 2026-03-10T07:50:58.777 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:58.775+0000 7f3d14c9b700 1 --2- 192.168.123.105:0/1962141481 >> [v2:192.168.123.105:6800/413688438,v1:192.168.123.105:6801/413688438] conn(0x7f3cf8077780 0x7f3cf8079c40 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:50:58.779 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:58.776+0000 7f3d0dffb700 1 -- 192.168.123.105:0/1962141481 <== mon.1 v2:192.168.123.108:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+191549 (secure 0 0 0) 0x7f3d08063ea0 con 0x7f3d10083ad0 2026-03-10T07:50:58.780 INFO:tasks.workunit.client.0.vm05.stdout:3/594: unlink d8/d16/d19/d6b/c74 0 2026-03-10T07:50:58.783 INFO:tasks.workunit.client.0.vm05.stdout:2/667: unlink d0/d8/d43/df/f97 0 2026-03-10T07:50:58.785 INFO:tasks.workunit.client.0.vm05.stdout:7/640: mkdir d1/d6/dc3 0 2026-03-10T07:50:58.787 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:58.780+0000 7f3d14c9b700 1 --2- 192.168.123.105:0/1962141481 >> [v2:192.168.123.105:6800/413688438,v1:192.168.123.105:6801/413688438] conn(0x7f3cf8077780 0x7f3cf8079c40 secure :-1 s=READY pgs=24 cs=0 l=1 rev1=1 crypto rx=0x7f3d000098a0 tx=0x7f3d00006d90 comp rx=0 tx=0).ready entity=mgr.14628 client_cookie=0 server_cookie=0 
in_seq=0 out_seq=0 2026-03-10T07:50:58.794 INFO:tasks.workunit.client.0.vm05.stdout:1/626: dread da/d26/d2b/d71/f97 [0,4194304] 0 2026-03-10T07:50:58.799 INFO:tasks.workunit.client.0.vm05.stdout:9/573: dwrite d8/d35/d22/d33/d62/f6c [0,4194304] 0 2026-03-10T07:50:58.814 INFO:tasks.workunit.client.0.vm05.stdout:3/595: creat d8/d22/d60/fc6 x:0 0 0 2026-03-10T07:50:58.815 INFO:tasks.workunit.client.0.vm05.stdout:8/547: getdents d1/dd/d5e 0 2026-03-10T07:50:58.816 INFO:tasks.workunit.client.0.vm05.stdout:8/548: fdatasync d1/dd/d18/f47 0 2026-03-10T07:50:58.818 INFO:tasks.workunit.client.0.vm05.stdout:8/549: write d1/dd/d18/d20/d2a/d48/f92 [10735,61048] 0 2026-03-10T07:50:58.832 INFO:tasks.workunit.client.0.vm05.stdout:0/613: creat d8/d9c/dc8/fd4 x:0 0 0 2026-03-10T07:50:58.839 INFO:tasks.workunit.client.0.vm05.stdout:6/619: write d0/d11/d57/f5f [160892,110953] 0 2026-03-10T07:50:58.839 INFO:tasks.workunit.client.0.vm05.stdout:5/624: write d2/d5/f23 [1626092,130709] 0 2026-03-10T07:50:58.846 INFO:tasks.workunit.client.0.vm05.stdout:6/620: read - d0/d11/d2e/d81/fa3 zero size 2026-03-10T07:50:58.847 INFO:tasks.workunit.client.0.vm05.stdout:9/574: unlink d8/d35/f25 0 2026-03-10T07:50:58.847 INFO:tasks.workunit.client.0.vm05.stdout:1/627: creat da/dd/d2a/d55/fbf x:0 0 0 2026-03-10T07:50:58.855 INFO:tasks.workunit.client.0.vm05.stdout:9/575: readlink d8/d35/d22/l4e 0 2026-03-10T07:50:58.855 INFO:tasks.workunit.client.0.vm05.stdout:7/641: mknod d1/d6/d80/d82/cc4 0 2026-03-10T07:50:58.861 INFO:tasks.workunit.client.0.vm05.stdout:5/625: truncate d2/f9 4420035 0 2026-03-10T07:50:58.861 INFO:tasks.workunit.client.0.vm05.stdout:1/628: fdatasync da/dd/d2a/d55/fbf 0 2026-03-10T07:50:58.874 INFO:tasks.workunit.client.0.vm05.stdout:4/688: link d0/d6/d95/f3a d0/d6/d9/d12/d9c/db7/feb 0 2026-03-10T07:50:58.874 INFO:tasks.workunit.client.0.vm05.stdout:4/689: chown d0/d6/d9/d12/d9c/db7/db1/le5 113 1 2026-03-10T07:50:58.880 INFO:tasks.workunit.client.0.vm05.stdout:6/621: dread 
d0/d11/d4f/d56/d96/db6/faa [0,4194304] 0 2026-03-10T07:50:58.903 INFO:tasks.workunit.client.0.vm05.stdout:2/668: rename d0/d8/d66/dd1/d49/fc8 to d0/d8/d66/dd1/fda 0 2026-03-10T07:50:58.904 INFO:tasks.workunit.client.0.vm05.stdout:8/550: creat d1/dd/d18/d20/d2a/d9a/fae x:0 0 0 2026-03-10T07:50:58.926 INFO:tasks.workunit.client.0.vm05.stdout:9/576: write d8/d35/d1c/d20/d59/d8b/f39 [311774,106907] 0 2026-03-10T07:50:58.927 INFO:tasks.workunit.client.0.vm05.stdout:5/626: unlink d2/d4b/c8e 0 2026-03-10T07:50:58.928 INFO:tasks.workunit.client.0.vm05.stdout:4/690: fdatasync d0/d6/d9/d12/d9c/db7/da7/f53 0 2026-03-10T07:50:58.929 INFO:tasks.workunit.client.0.vm05.stdout:5/627: fsync d2/d5/d61/f65 0 2026-03-10T07:50:58.946 INFO:tasks.workunit.client.0.vm05.stdout:4/691: sync 2026-03-10T07:50:58.946 INFO:tasks.workunit.client.0.vm05.stdout:1/629: rename da/dd/d12/d19/d20 to da/d26/d2b/daf/dbe/dc0 0 2026-03-10T07:50:58.960 INFO:tasks.workunit.client.0.vm05.stdout:6/622: write d0/d6/f98 [2039930,38534] 0 2026-03-10T07:50:58.960 INFO:tasks.workunit.client.0.vm05.stdout:8/551: dwrite d1/dd/d18/f70 [0,4194304] 0 2026-03-10T07:50:58.960 INFO:tasks.workunit.client.0.vm05.stdout:6/623: stat d0/d35/d36/f5b 0 2026-03-10T07:50:58.961 INFO:tasks.workunit.client.0.vm05.stdout:4/692: write d0/d6/d9/d5a/f58 [541575,103453] 0 2026-03-10T07:50:58.966 INFO:tasks.workunit.client.0.vm05.stdout:3/596: mkdir d8/d8f/dbc/dc7 0 2026-03-10T07:50:58.968 INFO:tasks.workunit.client.0.vm05.stdout:3/597: chown d8/d16/d19/f21 33607 1 2026-03-10T07:50:58.986 INFO:tasks.workunit.client.0.vm05.stdout:9/577: mknod d8/d35/d1c/d20/cbe 0 2026-03-10T07:50:58.986 INFO:tasks.workunit.client.0.vm05.stdout:3/598: dread d8/d1f/d24/d8a/f91 [0,4194304] 0 2026-03-10T07:50:58.989 INFO:tasks.workunit.client.0.vm05.stdout:0/614: write d8/dd/f40 [2863802,111111] 0 2026-03-10T07:50:58.996 INFO:tasks.workunit.client.0.vm05.stdout:9/578: sync 2026-03-10T07:50:59.005 INFO:tasks.workunit.client.0.vm05.stdout:5/628: rmdir 
d2/d20/d33/d86/d8d/da1/dc0/dc2 39 2026-03-10T07:50:59.015 INFO:tasks.workunit.client.0.vm05.stdout:5/629: dwrite d2/d12/d2d/f36 [0,4194304] 0 2026-03-10T07:50:59.037 INFO:tasks.workunit.client.0.vm05.stdout:8/552: creat d1/dd/d4d/d64/d8f/faf x:0 0 0 2026-03-10T07:50:59.044 INFO:tasks.workunit.client.0.vm05.stdout:2/669: dwrite d0/d8/d66/f68 [0,4194304] 0 2026-03-10T07:50:59.046 INFO:tasks.workunit.client.0.vm05.stdout:4/693: unlink d0/d6/d9/f67 0 2026-03-10T07:50:59.056 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:59.054+0000 7f3d16eff700 1 -- 192.168.123.105:0/1962141481 --> [v2:192.168.123.105:6800/413688438,v1:192.168.123.105:6801/413688438] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7f3cfc000bf0 con 0x7f3cf8077780 2026-03-10T07:50:59.057 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:59.055+0000 7f3d0dffb700 1 -- 192.168.123.105:0/1962141481 <== mgr.14628 v2:192.168.123.105:6800/413688438 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+370 (secure 0 0 0) 0x7f3cfc000bf0 con 0x7f3cf8077780 2026-03-10T07:50:59.060 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:59.058+0000 7f3cf77fe700 1 -- 192.168.123.105:0/1962141481 >> [v2:192.168.123.105:6800/413688438,v1:192.168.123.105:6801/413688438] conn(0x7f3cf8077780 msgr2=0x7f3cf8079c40 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:50:59.060 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:59.058+0000 7f3cf77fe700 1 --2- 192.168.123.105:0/1962141481 >> [v2:192.168.123.105:6800/413688438,v1:192.168.123.105:6801/413688438] conn(0x7f3cf8077780 0x7f3cf8079c40 secure :-1 s=READY pgs=24 cs=0 l=1 rev1=1 crypto rx=0x7f3d000098a0 tx=0x7f3d00006d90 comp rx=0 tx=0).stop 2026-03-10T07:50:59.060 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:59.058+0000 7f3cf77fe700 1 -- 192.168.123.105:0/1962141481 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f3d10083ad0 msgr2=0x7f3d1012e300 secure 
:-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:50:59.060 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:59.058+0000 7f3cf77fe700 1 --2- 192.168.123.105:0/1962141481 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f3d10083ad0 0x7f3d1012e300 secure :-1 s=READY pgs=98 cs=0 l=1 rev1=1 crypto rx=0x7f3d08003c60 tx=0x7f3d08003d40 comp rx=0 tx=0).stop 2026-03-10T07:50:59.060 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:59.059+0000 7f3cf77fe700 1 -- 192.168.123.105:0/1962141481 shutdown_connections 2026-03-10T07:50:59.060 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:59.059+0000 7f3cf77fe700 1 --2- 192.168.123.105:0/1962141481 >> [v2:192.168.123.105:6800/413688438,v1:192.168.123.105:6801/413688438] conn(0x7f3cf8077780 0x7f3cf8079c40 unknown :-1 s=CLOSED pgs=24 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:50:59.060 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:59.059+0000 7f3cf77fe700 1 --2- 192.168.123.105:0/1962141481 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3d10072d90 0x7f3d1012bdb0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:50:59.060 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:59.059+0000 7f3cf77fe700 1 --2- 192.168.123.105:0/1962141481 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f3d10083ad0 0x7f3d1012e300 unknown :-1 s=CLOSED pgs=98 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:50:59.060 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:59.059+0000 7f3cf77fe700 1 -- 192.168.123.105:0/1962141481 >> 192.168.123.105:0/1962141481 conn(0x7f3d1006dda0 msgr2=0x7f3d100774f0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:50:59.060 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:59.059+0000 7f3cf77fe700 1 -- 192.168.123.105:0/1962141481 shutdown_connections 2026-03-10T07:50:59.061 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:59.059+0000 7f3cf77fe700 1 -- 192.168.123.105:0/1962141481 wait complete. 2026-03-10T07:50:59.066 INFO:tasks.workunit.client.0.vm05.stdout:0/615: mknod d8/dd/d37/d67/cd5 0 2026-03-10T07:50:59.074 INFO:tasks.workunit.client.0.vm05.stdout:9/579: dwrite d8/d86/d28/f43 [0,4194304] 0 2026-03-10T07:50:59.092 INFO:tasks.workunit.client.0.vm05.stdout:5/630: mkdir d2/d20/d33/d86/dac/dd3 0 2026-03-10T07:50:59.095 INFO:tasks.workunit.client.0.vm05.stdout:8/553: unlink d1/dd/d4d/f61 0 2026-03-10T07:50:59.101 INFO:tasks.workunit.client.0.vm05.stdout:1/630: rename da/dd/d12/cb3 to da/d26/d2b/daf/dbe/cc1 0 2026-03-10T07:50:59.109 INFO:tasks.workunit.client.0.vm05.stdout:5/631: sync 2026-03-10T07:50:59.123 INFO:tasks.workunit.client.0.vm05.stdout:6/624: dwrite d0/d11/d4f/d56/f6b [0,4194304] 0 2026-03-10T07:50:59.132 INFO:tasks.workunit.client.0.vm05.stdout:6/625: write d0/d6/f98 [4571399,103460] 0 2026-03-10T07:50:59.167 INFO:tasks.workunit.client.0.vm05.stdout:3/599: creat d8/d1f/d24/d76/dc5/fc8 x:0 0 0 2026-03-10T07:50:59.168 INFO:tasks.workunit.client.0.vm05.stdout:7/642: link d1/d6/d47/c75 d1/d6/d3b/d7f/cc5 0 2026-03-10T07:50:59.172 INFO:tasks.workunit.client.0.vm05.stdout:0/616: readlink d8/dd/d10/lcb 0 2026-03-10T07:50:59.184 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:59.181+0000 7f724d4ea700 1 -- 192.168.123.105:0/3698583428 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f7248107d90 msgr2=0x7f724810a1c0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:50:59.184 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:59.181+0000 7f724d4ea700 1 --2- 192.168.123.105:0/3698583428 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f7248107d90 0x7f724810a1c0 secure :-1 s=READY pgs=317 cs=0 l=1 rev1=1 crypto rx=0x7f7238009b00 tx=0x7f7238009e10 comp rx=0 tx=0).stop 2026-03-10T07:50:59.184 INFO:tasks.workunit.client.0.vm05.stdout:8/554: mknod 
d1/dd/d18/d20/d2a/d48/d5a/cb0 0 2026-03-10T07:50:59.184 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:59.183+0000 7f724d4ea700 1 -- 192.168.123.105:0/3698583428 shutdown_connections 2026-03-10T07:50:59.184 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:59.183+0000 7f724d4ea700 1 --2- 192.168.123.105:0/3698583428 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f724810a700 0x7f724810cb90 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:50:59.184 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:59.183+0000 7f724d4ea700 1 --2- 192.168.123.105:0/3698583428 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f7248107d90 0x7f724810a1c0 unknown :-1 s=CLOSED pgs=317 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:50:59.185 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:59.183+0000 7f724d4ea700 1 -- 192.168.123.105:0/3698583428 >> 192.168.123.105:0/3698583428 conn(0x7f724806daa0 msgr2=0x7f724806ff00 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:50:59.185 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:59.183+0000 7f724d4ea700 1 -- 192.168.123.105:0/3698583428 shutdown_connections 2026-03-10T07:50:59.185 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:59.183+0000 7f724d4ea700 1 -- 192.168.123.105:0/3698583428 wait complete. 
2026-03-10T07:50:59.185 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:59.184+0000 7f724d4ea700 1 Processor -- start 2026-03-10T07:50:59.186 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:59.184+0000 7f724d4ea700 1 -- start start 2026-03-10T07:50:59.186 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:59.184+0000 7f724d4ea700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f7248107d90 0x7f7248116be0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:50:59.186 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:59.184+0000 7f724d4ea700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f724810a700 0x7f7248117120 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:50:59.186 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:59.184+0000 7f724d4ea700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f72481aef70 con 0x7f724810a700 2026-03-10T07:50:59.186 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:59.184+0000 7f724d4ea700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f72481af0e0 con 0x7f7248107d90 2026-03-10T07:50:59.189 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:59.184+0000 7f72477fe700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f724810a700 0x7f7248117120 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:50:59.189 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:59.184+0000 7f72477fe700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f724810a700 0x7f7248117120 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am 
v2:192.168.123.105:36350/0 (socket says 192.168.123.105:36350) 2026-03-10T07:50:59.189 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:59.184+0000 7f72477fe700 1 -- 192.168.123.105:0/1003931284 learned_addr learned my addr 192.168.123.105:0/1003931284 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T07:50:59.189 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:59.184+0000 7f7247fff700 1 --2- 192.168.123.105:0/1003931284 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f7248107d90 0x7f7248116be0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:50:59.189 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:59.184+0000 7f72477fe700 1 -- 192.168.123.105:0/1003931284 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f7248107d90 msgr2=0x7f7248116be0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:50:59.189 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:59.184+0000 7f72477fe700 1 --2- 192.168.123.105:0/1003931284 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f7248107d90 0x7f7248116be0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:50:59.189 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:59.184+0000 7f72477fe700 1 -- 192.168.123.105:0/1003931284 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f72380097e0 con 0x7f724810a700 2026-03-10T07:50:59.189 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:59.184+0000 7f72477fe700 1 --2- 192.168.123.105:0/1003931284 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f724810a700 0x7f7248117120 secure :-1 s=READY pgs=318 cs=0 l=1 rev1=1 crypto rx=0x7f723c00eb10 tx=0x7f723c00eed0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 
2026-03-10T07:50:59.189 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:59.187+0000 7f72457fa700 1 -- 192.168.123.105:0/1003931284 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f723c00cca0 con 0x7f724810a700 2026-03-10T07:50:59.189 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:59.187+0000 7f72457fa700 1 -- 192.168.123.105:0/1003931284 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f723c00ce00 con 0x7f724810a700 2026-03-10T07:50:59.189 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:59.187+0000 7f72457fa700 1 -- 192.168.123.105:0/1003931284 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f723c018910 con 0x7f724810a700 2026-03-10T07:50:59.189 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:59.187+0000 7f724d4ea700 1 -- 192.168.123.105:0/1003931284 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f72481af360 con 0x7f724810a700 2026-03-10T07:50:59.189 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:59.187+0000 7f724d4ea700 1 -- 192.168.123.105:0/1003931284 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f72481af8b0 con 0x7f724810a700 2026-03-10T07:50:59.190 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:59.188+0000 7f724d4ea700 1 -- 192.168.123.105:0/1003931284 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f7248110c60 con 0x7f724810a700 2026-03-10T07:50:59.193 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:59.189+0000 7f72457fa700 1 -- 192.168.123.105:0/1003931284 <== mon.0 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 30) v1 ==== 100066+0+0 (secure 0 0 0) 0x7f723c018a70 con 0x7f724810a700 2026-03-10T07:50:59.193 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:59.189+0000 
7f72457fa700 1 --2- 192.168.123.105:0/1003931284 >> [v2:192.168.123.105:6800/413688438,v1:192.168.123.105:6801/413688438] conn(0x7f7230077660 0x7f7230079b20 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:50:59.193 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:59.189+0000 7f72457fa700 1 -- 192.168.123.105:0/1003931284 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(42..42 src has 1..42) v4 ==== 5878+0+0 (secure 0 0 0) 0x7f723c014070 con 0x7f724810a700 2026-03-10T07:50:59.193 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:59.192+0000 7f7247fff700 1 --2- 192.168.123.105:0/1003931284 >> [v2:192.168.123.105:6800/413688438,v1:192.168.123.105:6801/413688438] conn(0x7f7230077660 0x7f7230079b20 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:50:59.196 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:59.193+0000 7f7247fff700 1 --2- 192.168.123.105:0/1003931284 >> [v2:192.168.123.105:6800/413688438,v1:192.168.123.105:6801/413688438] conn(0x7f7230077660 0x7f7230079b20 secure :-1 s=READY pgs=25 cs=0 l=1 rev1=1 crypto rx=0x7f7238006010 tx=0x7f723801a040 comp rx=0 tx=0).ready entity=mgr.14628 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:50:59.201 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:59.195+0000 7f72457fa700 1 -- 192.168.123.105:0/1003931284 <== mon.0 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+191549 (secure 0 0 0) 0x7f723c063300 con 0x7f724810a700 2026-03-10T07:50:59.201 INFO:tasks.workunit.client.0.vm05.stdout:1/631: mkdir da/dd/d2a/d55/d64/dc2 0 2026-03-10T07:50:59.201 INFO:tasks.workunit.client.0.vm05.stdout:5/632: creat d2/d12/d2d/d4a/fd4 x:0 0 0 2026-03-10T07:50:59.202 INFO:tasks.workunit.client.0.vm05.stdout:6/626: fdatasync d0/d11/d4f/f7e 0 2026-03-10T07:50:59.202 
INFO:tasks.workunit.client.0.vm05.stdout:9/580: dwrite d8/d35/d38/f87 [4194304,4194304] 0 2026-03-10T07:50:59.206 INFO:tasks.workunit.client.0.vm05.stdout:5/633: write d2/d12/f5a [552731,95114] 0 2026-03-10T07:50:59.210 INFO:tasks.workunit.client.0.vm05.stdout:6/627: write d0/d6/f1a [3074737,115522] 0 2026-03-10T07:50:59.222 INFO:tasks.workunit.client.0.vm05.stdout:4/694: truncate d0/d6/d9/d12/d9c/db7/fb2 3970745 0 2026-03-10T07:50:59.228 INFO:tasks.workunit.client.0.vm05.stdout:3/600: stat d8/d16/d19/d37/l9a 0 2026-03-10T07:50:59.228 INFO:tasks.workunit.client.0.vm05.stdout:6/628: dread d0/d11/d57/d60/f74 [0,4194304] 0 2026-03-10T07:50:59.230 INFO:tasks.workunit.client.0.vm05.stdout:2/670: mkdir d0/d8/d3d/d7d/db2/dd7/ddb 0 2026-03-10T07:50:59.258 INFO:tasks.workunit.client.0.vm05.stdout:0/617: write d8/dd/d10/d26/d3a/d5e/f7b [1120644,37245] 0 2026-03-10T07:50:59.259 INFO:tasks.workunit.client.0.vm05.stdout:1/632: dread da/dd/d12/f18 [0,4194304] 0 2026-03-10T07:50:59.263 INFO:tasks.workunit.client.0.vm05.stdout:9/581: dread - d8/f9c zero size 2026-03-10T07:50:59.265 INFO:tasks.workunit.client.0.vm05.stdout:9/582: chown d8/d35/d1c/f4c 293 1 2026-03-10T07:50:59.265 INFO:tasks.workunit.client.0.vm05.stdout:1/633: chown da/d26/d2b/daf/dbe/dc0/fb7 1525 1 2026-03-10T07:50:59.269 INFO:tasks.workunit.client.0.vm05.stdout:1/634: truncate da/dd/d2a/d55/d64/f9f 489632 0 2026-03-10T07:50:59.270 INFO:tasks.workunit.client.0.vm05.stdout:1/635: chown da/dd/c85 19224 1 2026-03-10T07:50:59.271 INFO:tasks.workunit.client.0.vm05.stdout:1/636: write da/dd/d2a/f75 [294471,63676] 0 2026-03-10T07:50:59.285 INFO:tasks.workunit.client.0.vm05.stdout:7/643: getdents d1/d3c/d4b/da6 0 2026-03-10T07:50:59.298 INFO:tasks.workunit.client.0.vm05.stdout:8/555: symlink d1/d45/lb1 0 2026-03-10T07:50:59.308 INFO:tasks.workunit.client.0.vm05.stdout:7/644: dread d1/d6/d47/f7b [0,4194304] 0 2026-03-10T07:50:59.330 INFO:tasks.workunit.client.0.vm05.stdout:3/601: mknod d8/d1f/d2a/d4a/cc9 0 
2026-03-10T07:50:59.337 INFO:tasks.workunit.client.0.vm05.stdout:6/629: mkdir d0/d11/d2e/d81/d92/dc2 0 2026-03-10T07:50:59.339 INFO:tasks.workunit.client.0.vm05.stdout:6/630: stat d0/d11/d2e/fbc 0 2026-03-10T07:50:59.342 INFO:tasks.workunit.client.0.vm05.stdout:2/671: dread d0/d8/d43/f5e [0,4194304] 0 2026-03-10T07:50:59.352 INFO:tasks.workunit.client.0.vm05.stdout:2/672: dwrite d0/f4 [0,4194304] 0 2026-03-10T07:50:59.364 INFO:tasks.workunit.client.0.vm05.stdout:7/645: rmdir d1/d3c/d71/d79 39 2026-03-10T07:50:59.392 INFO:tasks.workunit.client.0.vm05.stdout:3/602: mkdir d8/d22/d60/d6e/dca 0 2026-03-10T07:50:59.392 INFO:tasks.workunit.client.0.vm05.stdout:6/631: creat d0/d11/d4f/da0/fc3 x:0 0 0 2026-03-10T07:50:59.394 INFO:tasks.workunit.client.0.vm05.stdout:4/695: creat d0/d6/d9/d5a/d6e/db6/fec x:0 0 0 2026-03-10T07:50:59.396 INFO:tasks.workunit.client.0.vm05.stdout:8/556: dread d1/d45/d90/faa [0,4194304] 0 2026-03-10T07:50:59.397 INFO:tasks.workunit.client.0.vm05.stdout:2/673: unlink d0/d8/d43/ca 0 2026-03-10T07:50:59.400 INFO:tasks.workunit.client.0.vm05.stdout:8/557: chown d1/dd/d5e/l8b 787 1 2026-03-10T07:50:59.402 INFO:tasks.workunit.client.0.vm05.stdout:8/558: chown d1/dd/d18/d20/d2a/d48/d5a/f98 7 1 2026-03-10T07:50:59.403 INFO:tasks.workunit.client.0.vm05.stdout:5/634: getdents d2/d12/d2d 0 2026-03-10T07:50:59.404 INFO:tasks.workunit.client.0.vm05.stdout:5/635: read - d2/d20/d33/d53/fb0 zero size 2026-03-10T07:50:59.418 INFO:tasks.workunit.client.0.vm05.stdout:1/637: creat da/dd/d12/d86/fc3 x:0 0 0 2026-03-10T07:50:59.423 INFO:tasks.workunit.client.0.vm05.stdout:3/603: fdatasync d8/d16/f82 0 2026-03-10T07:50:59.429 INFO:tasks.workunit.client.0.vm05.stdout:0/618: getdents d8/dd/d10/db7/dc3 0 2026-03-10T07:50:59.432 INFO:tasks.workunit.client.0.vm05.stdout:4/696: unlink d0/d6/d37/fa1 0 2026-03-10T07:50:59.438 INFO:tasks.workunit.client.0.vm05.stdout:6/632: truncate d0/d6/d3b/f55 1317636 0 2026-03-10T07:50:59.439 INFO:tasks.workunit.client.0.vm05.stdout:8/559: 
mkdir d1/dd/d18/d20/d2a/d34/da5/db2 0 2026-03-10T07:50:59.441 INFO:tasks.workunit.client.0.vm05.stdout:9/583: getdents d8/d35/d22/d33/d62/d6d/d9e 0 2026-03-10T07:50:59.442 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:59.434+0000 7f724d4ea700 1 -- 192.168.123.105:0/1003931284 --> [v2:192.168.123.105:6800/413688438,v1:192.168.123.105:6801/413688438] -- mgr_command(tid 0: {"prefix": "orch ps", "target": ["mon-mgr", ""]}) v1 -- 0x7f72480611d0 con 0x7f7230077660 2026-03-10T07:50:59.442 INFO:tasks.workunit.client.0.vm05.stdout:9/584: chown d8/d35/d22/fb1 32231970 1 2026-03-10T07:50:59.443 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:50:59 vm05.local ceph-mon[50387]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' 2026-03-10T07:50:59.443 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:50:59 vm05.local ceph-mon[50387]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' 2026-03-10T07:50:59.443 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:50:59 vm05.local ceph-mon[50387]: from='client.24453 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T07:50:59.443 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:50:59 vm05.local ceph-mon[50387]: pgmap v19: 65 pgs: 65 active+clean; 1.4 GiB data, 4.7 GiB used, 115 GiB / 120 GiB avail; 60 MiB/s rd, 149 MiB/s wr, 387 op/s 2026-03-10T07:50:59.443 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:50:59 vm05.local ceph-mon[50387]: from='client.24457 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T07:50:59.446 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:59.444+0000 7f72457fa700 1 -- 192.168.123.105:0/1003931284 <== mgr.14628 v2:192.168.123.105:6800/413688438 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+3552 (secure 0 0 0) 0x7f72480611d0 con 0x7f7230077660 2026-03-10T07:50:59.446 
INFO:teuthology.orchestra.run.vm05.stdout:NAME HOST PORTS STATUS REFRESHED AGE MEM USE MEM LIM VERSION IMAGE ID CONTAINER ID
2026-03-10T07:50:59.446 INFO:teuthology.orchestra.run.vm05.stdout:alertmanager.vm05 vm05 *:9093,9094 running (3m) 2s ago 4m 25.2M - 0.25.0 c8568f914cd2 f87529717116
2026-03-10T07:50:59.446 INFO:teuthology.orchestra.run.vm05.stdout:ceph-exporter.vm05 vm05 running (4m) 2s ago 4m 8460k - 18.2.1 5be31c24972a 26c4db858175
2026-03-10T07:50:59.446 INFO:teuthology.orchestra.run.vm05.stdout:ceph-exporter.vm08 vm08 running (4m) 4s ago 4m 8409k - 18.2.1 5be31c24972a 209e2398a09c
2026-03-10T07:50:59.446 INFO:teuthology.orchestra.run.vm05.stdout:crash.vm05 vm05 running (4m) 2s ago 4m 7419k - 18.2.1 5be31c24972a d3d7b92c8ac3
2026-03-10T07:50:59.446 INFO:teuthology.orchestra.run.vm05.stdout:crash.vm08 vm08 running (4m) 4s ago 4m 7415k - 18.2.1 5be31c24972a 96136e0195f7
2026-03-10T07:50:59.446 INFO:teuthology.orchestra.run.vm05.stdout:grafana.vm05 vm05 *:3000 running (3m) 2s ago 4m 89.8M - 9.4.7 954c08fa6188 35089be30fc6
2026-03-10T07:50:59.446 INFO:teuthology.orchestra.run.vm05.stdout:mds.cephfs.vm05.omfhnh vm05 running (2m) 2s ago 2m 224M - 18.2.1 5be31c24972a e23de179e09c
2026-03-10T07:50:59.446 INFO:teuthology.orchestra.run.vm05.stdout:mds.cephfs.vm05.pavqil vm05 running (2m) 2s ago 2m 15.4M - 18.2.1 5be31c24972a 5b9e5afa214c
2026-03-10T07:50:59.446 INFO:teuthology.orchestra.run.vm05.stdout:mds.cephfs.vm08.dgsaon vm08 running (2m) 4s ago 2m 16.4M - 18.2.1 5be31c24972a 1696aee522b5
2026-03-10T07:50:59.446 INFO:teuthology.orchestra.run.vm05.stdout:mds.cephfs.vm08.ybmbgd vm08 running (2m) 4s ago 2m 14.7M - 18.2.1 5be31c24972a 30b0e51cd2ed
2026-03-10T07:50:59.446 INFO:teuthology.orchestra.run.vm05.stdout:mgr.vm05.blexke vm05 *:8443,9283,8765 running (40s) 2s ago 5m 603M - 19.2.3-678-ge911bdeb 654f31e6858e 5467b6a3bd38
2026-03-10T07:50:59.446 INFO:teuthology.orchestra.run.vm05.stdout:mgr.vm08.orfpog vm08 *:8443,9283,8765 running (21s) 4s ago 4m 487M - 19.2.3-678-ge911bdeb 654f31e6858e 7a15d7811d5b
2026-03-10T07:50:59.446 INFO:teuthology.orchestra.run.vm05.stdout:mon.vm05 vm05 running (5m) 2s ago 5m 51.4M 2048M 18.2.1 5be31c24972a 2a459bf05146
2026-03-10T07:50:59.446 INFO:teuthology.orchestra.run.vm05.stdout:mon.vm08 vm08 running (4m) 4s ago 4m 42.1M 2048M 18.2.1 5be31c24972a e01dfb712474
2026-03-10T07:50:59.446 INFO:teuthology.orchestra.run.vm05.stdout:node-exporter.vm05 vm05 *:9100 running (11s) 2s ago 4m 8358k - 1.7.0 72c9c2088986 7cd0b23b4118
2026-03-10T07:50:59.446 INFO:teuthology.orchestra.run.vm05.stdout:node-exporter.vm08 vm08 *:9100 running (6s) 4s ago 4m 5368k - 1.7.0 72c9c2088986 3dd4d91d5881
2026-03-10T07:50:59.446 INFO:teuthology.orchestra.run.vm05.stdout:osd.0 vm05 running (3m) 2s ago 3m 215M 4096M 18.2.1 5be31c24972a 9b7c5ea48cea
2026-03-10T07:50:59.446 INFO:teuthology.orchestra.run.vm05.stdout:osd.1 vm05 running (3m) 2s ago 3m 212M 4096M 18.2.1 5be31c24972a 88e0b65b2c93
2026-03-10T07:50:59.446 INFO:teuthology.orchestra.run.vm05.stdout:osd.2 vm05 running (3m) 2s ago 3m 178M 4096M 18.2.1 5be31c24972a b8d3cbeb49f1
2026-03-10T07:50:59.446 INFO:teuthology.orchestra.run.vm05.stdout:osd.3 vm08 running (3m) 4s ago 3m 257M 4096M 18.2.1 5be31c24972a 0a62c54a86c0
2026-03-10T07:50:59.446 INFO:teuthology.orchestra.run.vm05.stdout:osd.4 vm08 running (3m) 4s ago 3m 196M 4096M 18.2.1 5be31c24972a bd748b691ccd
2026-03-10T07:50:59.446 INFO:teuthology.orchestra.run.vm05.stdout:osd.5 vm08 running (2m) 4s ago 2m 220M 4096M 18.2.1 5be31c24972a 9f08820ae98b
2026-03-10T07:50:59.446 INFO:teuthology.orchestra.run.vm05.stdout:prometheus.vm05 vm05 *:9095 running (24s) 2s ago 4m 45.9M - 2.43.0 a07b618ecd1d 29ccdc951883
2026-03-10T07:50:59.448 INFO:tasks.workunit.client.0.vm05.stdout:9/585: dread d8/d35/d22/f6a [0,4194304] 0
2026-03-10T07:50:59.449 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:59.447+0000 7f722effd700 1 -- 192.168.123.105:0/1003931284 >>
[v2:192.168.123.105:6800/413688438,v1:192.168.123.105:6801/413688438] conn(0x7f7230077660 msgr2=0x7f7230079b20 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:50:59.449 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:59.447+0000 7f722effd700 1 --2- 192.168.123.105:0/1003931284 >> [v2:192.168.123.105:6800/413688438,v1:192.168.123.105:6801/413688438] conn(0x7f7230077660 0x7f7230079b20 secure :-1 s=READY pgs=25 cs=0 l=1 rev1=1 crypto rx=0x7f7238006010 tx=0x7f723801a040 comp rx=0 tx=0).stop 2026-03-10T07:50:59.449 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:59.447+0000 7f722effd700 1 -- 192.168.123.105:0/1003931284 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f724810a700 msgr2=0x7f7248117120 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:50:59.449 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:59.447+0000 7f722effd700 1 --2- 192.168.123.105:0/1003931284 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f724810a700 0x7f7248117120 secure :-1 s=READY pgs=318 cs=0 l=1 rev1=1 crypto rx=0x7f723c00eb10 tx=0x7f723c00eed0 comp rx=0 tx=0).stop 2026-03-10T07:50:59.449 INFO:tasks.workunit.client.0.vm05.stdout:9/586: write d8/f9 [7590104,51423] 0 2026-03-10T07:50:59.450 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:59.449+0000 7f722effd700 1 -- 192.168.123.105:0/1003931284 shutdown_connections 2026-03-10T07:50:59.450 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:59.449+0000 7f722effd700 1 --2- 192.168.123.105:0/1003931284 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f7248107d90 0x7f7248116be0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:50:59.450 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:59.449+0000 7f722effd700 1 --2- 192.168.123.105:0/1003931284 >> [v2:192.168.123.105:6800/413688438,v1:192.168.123.105:6801/413688438] conn(0x7f7230077660 0x7f7230079b20 
unknown :-1 s=CLOSED pgs=25 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:50:59.450 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:59.449+0000 7f722effd700 1 --2- 192.168.123.105:0/1003931284 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f724810a700 0x7f7248117120 unknown :-1 s=CLOSED pgs=318 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:50:59.450 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:59.449+0000 7f722effd700 1 -- 192.168.123.105:0/1003931284 >> 192.168.123.105:0/1003931284 conn(0x7f724806daa0 msgr2=0x7f724806ff00 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:50:59.450 INFO:tasks.workunit.client.0.vm05.stdout:9/587: stat d8/d35/d1c/d20/d59/d8b/c7e 0 2026-03-10T07:50:59.450 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:59.449+0000 7f722effd700 1 -- 192.168.123.105:0/1003931284 shutdown_connections 2026-03-10T07:50:59.451 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:59.449+0000 7f722effd700 1 -- 192.168.123.105:0/1003931284 wait complete. 
2026-03-10T07:50:59.457 INFO:tasks.workunit.client.0.vm05.stdout:7/646: symlink d1/d3c/db8/lc6 0 2026-03-10T07:50:59.470 INFO:tasks.workunit.client.0.vm05.stdout:4/697: unlink d0/d6/d9/d12/d9c/db7/l84 0 2026-03-10T07:50:59.473 INFO:tasks.workunit.client.0.vm05.stdout:3/604: dread d8/d1f/f49 [0,4194304] 0 2026-03-10T07:50:59.481 INFO:tasks.workunit.client.0.vm05.stdout:2/674: fsync d0/d8/f3b 0 2026-03-10T07:50:59.486 INFO:tasks.workunit.client.0.vm05.stdout:6/633: mkdir d0/d11/d22/d6c/d84/dc4 0 2026-03-10T07:50:59.491 INFO:tasks.workunit.client.0.vm05.stdout:8/560: creat d1/d45/d90/fb3 x:0 0 0 2026-03-10T07:50:59.511 INFO:tasks.workunit.client.0.vm05.stdout:5/636: dwrite d2/d20/d33/d86/d8d/da1/dc0/fce [0,4194304] 0 2026-03-10T07:50:59.518 INFO:tasks.workunit.client.0.vm05.stdout:1/638: symlink da/d26/d2b/lc4 0 2026-03-10T07:50:59.526 INFO:tasks.workunit.client.0.vm05.stdout:4/698: dread d0/d6/d60/faf [0,4194304] 0 2026-03-10T07:50:59.529 INFO:tasks.workunit.client.0.vm05.stdout:4/699: dwrite d0/d6/d9/d12/d45/d55/d4e/f97 [0,4194304] 0 2026-03-10T07:50:59.531 INFO:tasks.workunit.client.0.vm05.stdout:2/675: chown d0/d8/fc3 1 1 2026-03-10T07:50:59.536 INFO:tasks.workunit.client.0.vm05.stdout:8/561: creat d1/dd/d18/d20/d2a/d48/d7c/d9c/fb4 x:0 0 0 2026-03-10T07:50:59.542 INFO:tasks.workunit.client.0.vm05.stdout:6/634: dwrite d0/d11/d57/d66/f79 [4194304,4194304] 0 2026-03-10T07:50:59.543 INFO:tasks.workunit.client.0.vm05.stdout:6/635: dread - d0/d11/d57/faf zero size 2026-03-10T07:50:59.550 INFO:tasks.workunit.client.0.vm05.stdout:5/637: mknod d2/d20/d33/d86/dac/cd5 0 2026-03-10T07:50:59.551 INFO:tasks.workunit.client.0.vm05.stdout:5/638: readlink d2/d5/lb2 0 2026-03-10T07:50:59.551 INFO:tasks.workunit.client.0.vm05.stdout:6/636: dread d0/d11/d22/f4c [0,4194304] 0 2026-03-10T07:50:59.552 INFO:tasks.workunit.client.0.vm05.stdout:6/637: fdatasync d0/d11/d57/f5f 0 2026-03-10T07:50:59.552 INFO:tasks.workunit.client.0.vm05.stdout:9/588: fsync d8/f8a 0 2026-03-10T07:50:59.557 
INFO:tasks.workunit.client.0.vm05.stdout:7/647: symlink d1/d3c/d71/d79/d8a/lc7 0 2026-03-10T07:50:59.567 INFO:tasks.workunit.client.0.vm05.stdout:1/639: creat da/d26/d2b/daf/dbe/dc0/d8f/fc5 x:0 0 0 2026-03-10T07:50:59.586 INFO:tasks.workunit.client.0.vm05.stdout:0/619: creat d8/dd/d37/fd6 x:0 0 0 2026-03-10T07:50:59.599 INFO:tasks.workunit.client.0.vm05.stdout:2/676: fdatasync d0/d8/d43/f1f 0 2026-03-10T07:50:59.603 INFO:tasks.workunit.client.0.vm05.stdout:8/562: truncate d1/dd/d18/d20/d2a/f3a 325098 0 2026-03-10T07:50:59.620 INFO:tasks.workunit.client.0.vm05.stdout:7/648: creat d1/d3c/d4b/fc8 x:0 0 0 2026-03-10T07:50:59.624 INFO:tasks.workunit.client.0.vm05.stdout:4/700: fdatasync d0/d6/d9/d12/d45/f66 0 2026-03-10T07:50:59.627 INFO:tasks.workunit.client.0.vm05.stdout:3/605: dwrite d8/d1c/f23 [0,4194304] 0 2026-03-10T07:50:59.640 INFO:tasks.workunit.client.0.vm05.stdout:2/677: unlink d0/d8/d66/dd1/d49/db1/fd9 0 2026-03-10T07:50:59.645 INFO:tasks.workunit.client.0.vm05.stdout:2/678: dwrite d0/d8/d3d/d7d/db2/fba [0,4194304] 0 2026-03-10T07:50:59.647 INFO:tasks.workunit.client.0.vm05.stdout:2/679: chown d0/d8/d43/dc9 330847024 1 2026-03-10T07:50:59.664 INFO:tasks.workunit.client.0.vm05.stdout:2/680: dread d0/d8/d43/df/f3a [0,4194304] 0 2026-03-10T07:50:59.668 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:50:59 vm08.local ceph-mon[59917]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' 2026-03-10T07:50:59.668 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:50:59 vm08.local ceph-mon[59917]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' 2026-03-10T07:50:59.668 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:50:59 vm08.local ceph-mon[59917]: from='client.24453 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T07:50:59.668 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:50:59 vm08.local ceph-mon[59917]: pgmap v19: 65 pgs: 65 active+clean; 1.4 GiB 
data, 4.7 GiB used, 115 GiB / 120 GiB avail; 60 MiB/s rd, 149 MiB/s wr, 387 op/s 2026-03-10T07:50:59.668 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:50:59 vm08.local ceph-mon[59917]: from='client.24457 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T07:50:59.675 INFO:tasks.workunit.client.0.vm05.stdout:6/638: creat d0/d35/d36/d43/d9c/fc5 x:0 0 0 2026-03-10T07:50:59.677 INFO:tasks.workunit.client.0.vm05.stdout:9/589: link d8/d35/d38/f87 d8/d35/d22/d33/d62/fbf 0 2026-03-10T07:50:59.683 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:59.681+0000 7f5a1eb23700 1 -- 192.168.123.105:0/1193296395 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f5a18107d90 msgr2=0x7f5a1810a1c0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:50:59.683 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:59.681+0000 7f5a1eb23700 1 --2- 192.168.123.105:0/1193296395 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f5a18107d90 0x7f5a1810a1c0 secure :-1 s=READY pgs=99 cs=0 l=1 rev1=1 crypto rx=0x7f5a1000b3a0 tx=0x7f5a1000b6b0 comp rx=0 tx=0).stop 2026-03-10T07:50:59.683 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:59.682+0000 7f5a1eb23700 1 -- 192.168.123.105:0/1193296395 shutdown_connections 2026-03-10T07:50:59.683 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:59.682+0000 7f5a1eb23700 1 --2- 192.168.123.105:0/1193296395 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f5a1810a700 0x7f5a1810cb90 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:50:59.683 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:59.682+0000 7f5a1eb23700 1 --2- 192.168.123.105:0/1193296395 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f5a18107d90 0x7f5a1810a1c0 unknown :-1 s=CLOSED pgs=99 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:50:59.683 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:59.682+0000 7f5a1eb23700 1 -- 192.168.123.105:0/1193296395 >> 192.168.123.105:0/1193296395 conn(0x7f5a1806daa0 msgr2=0x7f5a1806ff00 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:50:59.684 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:59.682+0000 7f5a1eb23700 1 -- 192.168.123.105:0/1193296395 shutdown_connections 2026-03-10T07:50:59.684 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:59.682+0000 7f5a1eb23700 1 -- 192.168.123.105:0/1193296395 wait complete. 2026-03-10T07:50:59.685 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:59.683+0000 7f5a1eb23700 1 Processor -- start 2026-03-10T07:50:59.685 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:59.683+0000 7f5a1eb23700 1 -- start start 2026-03-10T07:50:59.685 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:59.683+0000 7f5a1eb23700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f5a1810a700 0x7f5a181a53d0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:50:59.685 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:59.683+0000 7f5a1eb23700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f5a181a5910 0x7f5a18076fe0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:50:59.685 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:59.683+0000 7f5a1eb23700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f5a181a5e20 con 0x7f5a181a5910 2026-03-10T07:50:59.686 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:59.683+0000 7f5a1eb23700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f5a181a5f90 con 0x7f5a1810a700 2026-03-10T07:50:59.686 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:59.683+0000 7f5a1db21700 1 --2- >> 
[v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f5a1810a700 0x7f5a181a53d0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:50:59.686 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:59.684+0000 7f5a1db21700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f5a1810a700 0x7f5a181a53d0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.108:3300/0 says I am v2:192.168.123.105:44532/0 (socket says 192.168.123.105:44532) 2026-03-10T07:50:59.686 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:59.684+0000 7f5a1db21700 1 -- 192.168.123.105:0/1530116858 learned_addr learned my addr 192.168.123.105:0/1530116858 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T07:50:59.686 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:59.684+0000 7f5a1d320700 1 --2- 192.168.123.105:0/1530116858 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f5a181a5910 0x7f5a18076fe0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:50:59.686 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:59.684+0000 7f5a1db21700 1 -- 192.168.123.105:0/1530116858 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f5a181a5910 msgr2=0x7f5a18076fe0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:50:59.686 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:59.684+0000 7f5a1db21700 1 --2- 192.168.123.105:0/1530116858 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f5a181a5910 0x7f5a18076fe0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:50:59.686 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:59.684+0000 7f5a1db21700 1 -- 
192.168.123.105:0/1530116858 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f5a1000b050 con 0x7f5a1810a700 2026-03-10T07:50:59.686 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:59.684+0000 7f5a1d320700 1 --2- 192.168.123.105:0/1530116858 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f5a181a5910 0x7f5a18076fe0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request state changed! 2026-03-10T07:50:59.687 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:59.684+0000 7f5a1db21700 1 --2- 192.168.123.105:0/1530116858 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f5a1810a700 0x7f5a181a53d0 secure :-1 s=READY pgs=100 cs=0 l=1 rev1=1 crypto rx=0x7f5a10007ae0 tx=0x7f5a100095a0 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:50:59.687 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:59.685+0000 7f5a0effd700 1 -- 192.168.123.105:0/1530116858 <== mon.1 v2:192.168.123.108:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f5a1000e070 con 0x7f5a1810a700 2026-03-10T07:50:59.688 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:59.686+0000 7f5a1eb23700 1 -- 192.168.123.105:0/1530116858 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f5a18077520 con 0x7f5a1810a700 2026-03-10T07:50:59.688 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:59.686+0000 7f5a1eb23700 1 -- 192.168.123.105:0/1530116858 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f5a18077a10 con 0x7f5a1810a700 2026-03-10T07:50:59.688 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:59.687+0000 7f5a0effd700 1 -- 192.168.123.105:0/1530116858 <== mon.1 v2:192.168.123.108:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f5a10003d10 con 0x7f5a1810a700 
2026-03-10T07:50:59.689 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:59.688+0000 7f5a0effd700 1 -- 192.168.123.105:0/1530116858 <== mon.1 v2:192.168.123.108:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f5a1001bb90 con 0x7f5a1810a700 2026-03-10T07:50:59.689 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:59.688+0000 7f5a0effd700 1 -- 192.168.123.105:0/1530116858 <== mon.1 v2:192.168.123.108:3300/0 4 ==== mgrmap(e 30) v1 ==== 100066+0+0 (secure 0 0 0) 0x7f5a10019040 con 0x7f5a1810a700 2026-03-10T07:50:59.690 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:59.689+0000 7f5a0effd700 1 --2- 192.168.123.105:0/1530116858 >> [v2:192.168.123.105:6800/413688438,v1:192.168.123.105:6801/413688438] conn(0x7f5a04077860 0x7f5a04079d20 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:50:59.691 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:59.690+0000 7f5a0effd700 1 -- 192.168.123.105:0/1530116858 <== mon.1 v2:192.168.123.108:3300/0 5 ==== osd_map(42..42 src has 1..42) v4 ==== 5878+0+0 (secure 0 0 0) 0x7f5a1009b4a0 con 0x7f5a1810a700 2026-03-10T07:50:59.692 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:59.690+0000 7f5a1eb23700 1 -- 192.168.123.105:0/1530116858 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f59fc005320 con 0x7f5a1810a700 2026-03-10T07:50:59.694 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:59.691+0000 7f5a1d320700 1 --2- 192.168.123.105:0/1530116858 >> [v2:192.168.123.105:6800/413688438,v1:192.168.123.105:6801/413688438] conn(0x7f5a04077860 0x7f5a04079d20 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:50:59.697 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:59.696+0000 7f5a1d320700 1 --2- 192.168.123.105:0/1530116858 >> 
[v2:192.168.123.105:6800/413688438,v1:192.168.123.105:6801/413688438] conn(0x7f5a04077860 0x7f5a04079d20 secure :-1 s=READY pgs=26 cs=0 l=1 rev1=1 crypto rx=0x7f5a08006fd0 tx=0x7f5a08008040 comp rx=0 tx=0).ready entity=mgr.14628 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:50:59.698 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:50:59.696+0000 7f5a0effd700 1 -- 192.168.123.105:0/1530116858 <== mon.1 v2:192.168.123.108:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+191549 (secure 0 0 0) 0x7f5a10063fb0 con 0x7f5a1810a700 2026-03-10T07:50:59.702 INFO:tasks.workunit.client.0.vm05.stdout:5/639: dwrite d2/d5/d61/f66 [0,4194304] 0 2026-03-10T07:50:59.704 INFO:tasks.workunit.client.0.vm05.stdout:5/640: fdatasync d2/d12/d2d/d4a/faf 0 2026-03-10T07:50:59.710 INFO:tasks.workunit.client.0.vm05.stdout:5/641: dwrite d2/d20/d4c/fc4 [0,4194304] 0 2026-03-10T07:50:59.711 INFO:tasks.workunit.client.0.vm05.stdout:5/642: dread - d2/d20/d33/d53/fd0 zero size 2026-03-10T07:50:59.712 INFO:tasks.workunit.client.0.vm05.stdout:5/643: dread - d2/d20/d5b/f6e zero size 2026-03-10T07:50:59.713 INFO:tasks.workunit.client.0.vm05.stdout:5/644: chown d2/d12/d2d/cb7 1137952056 1 2026-03-10T07:50:59.731 INFO:tasks.workunit.client.0.vm05.stdout:7/649: write d1/d3c/d4b/f4f [1449436,107802] 0 2026-03-10T07:50:59.734 INFO:tasks.workunit.client.0.vm05.stdout:0/620: creat d8/dd/d37/d56/d4d/fd7 x:0 0 0 2026-03-10T07:50:59.735 INFO:tasks.workunit.client.0.vm05.stdout:0/621: write d8/dd/d10/d26/d2a/f8f [596824,31503] 0 2026-03-10T07:50:59.735 INFO:tasks.workunit.client.0.vm05.stdout:0/622: chown d8/dd/d37/d56/c5c 334 1 2026-03-10T07:50:59.743 INFO:tasks.workunit.client.0.vm05.stdout:3/606: fdatasync d8/d22/d60/d6e/f9e 0 2026-03-10T07:50:59.751 INFO:tasks.workunit.client.0.vm05.stdout:9/590: mkdir d8/d35/d22/d33/d62/dc0 0 2026-03-10T07:50:59.751 INFO:tasks.workunit.client.0.vm05.stdout:6/639: fsync d0/d11/d4f/d56/f6f 0 
2026-03-10T07:50:59.754 INFO:tasks.workunit.client.0.vm05.stdout:6/640: readlink d0/d35/d36/l8a 0 2026-03-10T07:50:59.762 INFO:tasks.workunit.client.0.vm05.stdout:5/645: mknod d2/d5/d61/cd6 0 2026-03-10T07:50:59.763 INFO:tasks.workunit.client.0.vm05.stdout:7/650: creat d1/d6/d80/d82/fc9 x:0 0 0 2026-03-10T07:50:59.764 INFO:tasks.workunit.client.0.vm05.stdout:5/646: chown d2/d20/d33/d86/d8d/da1/dc0/fce 110478915 1 2026-03-10T07:50:59.764 INFO:tasks.workunit.client.0.vm05.stdout:5/647: write d2/d20/d4c/d64/f96 [8576691,112256] 0 2026-03-10T07:50:59.770 INFO:tasks.workunit.client.0.vm05.stdout:0/623: rename d8/dd/fbc to d8/d9c/fd8 0 2026-03-10T07:50:59.776 INFO:tasks.workunit.client.0.vm05.stdout:3/607: read d8/d1f/d2a/d96/f85 [1716687,112617] 0 2026-03-10T07:50:59.779 INFO:tasks.workunit.client.0.vm05.stdout:1/640: write da/dd/d12/f31 [2356999,65716] 0 2026-03-10T07:50:59.787 INFO:tasks.workunit.client.0.vm05.stdout:6/641: creat d0/d11/d4f/da0/fc6 x:0 0 0 2026-03-10T07:50:59.788 INFO:tasks.workunit.client.0.vm05.stdout:7/651: truncate d1/d34/f3e 3507072 0 2026-03-10T07:50:59.794 INFO:tasks.workunit.client.0.vm05.stdout:9/591: sync 2026-03-10T07:50:59.796 INFO:tasks.workunit.client.0.vm05.stdout:1/641: creat da/d26/d2b/d89/fc6 x:0 0 0 2026-03-10T07:50:59.797 INFO:tasks.workunit.client.0.vm05.stdout:2/681: creat d0/d8/dc6/fdc x:0 0 0 2026-03-10T07:50:59.799 INFO:tasks.workunit.client.0.vm05.stdout:6/642: rmdir d0/d11/d4f/d7d/db7 39 2026-03-10T07:50:59.802 INFO:tasks.workunit.client.0.vm05.stdout:0/624: dread d8/dd/f29 [0,4194304] 0 2026-03-10T07:50:59.803 INFO:tasks.workunit.client.0.vm05.stdout:6/643: dwrite d0/d11/d57/f7a [0,4194304] 0 2026-03-10T07:50:59.804 INFO:tasks.workunit.client.0.vm05.stdout:0/625: write d8/dd/d37/d56/d4d/fd7 [718479,12017] 0 2026-03-10T07:50:59.804 INFO:tasks.workunit.client.0.vm05.stdout:6/644: chown d0/d11/d86 1960391 1 2026-03-10T07:50:59.807 INFO:tasks.workunit.client.0.vm05.stdout:7/652: unlink d1/d34/d59/f72 0 2026-03-10T07:50:59.809 
INFO:tasks.workunit.client.0.vm05.stdout:5/648: mkdir d2/dd7 0 2026-03-10T07:50:59.809 INFO:tasks.workunit.client.0.vm05.stdout:5/649: chown d2/dd7 228839 1 2026-03-10T07:50:59.811 INFO:tasks.workunit.client.0.vm05.stdout:6/645: sync 2026-03-10T07:50:59.814 INFO:tasks.workunit.client.0.vm05.stdout:4/701: getdents d0/d6/d9/d8c 0 2026-03-10T07:50:59.817 INFO:tasks.workunit.client.0.vm05.stdout:9/592: dread - d8/d35/d6b/f97 zero size 2026-03-10T07:50:59.823 INFO:tasks.workunit.client.0.vm05.stdout:5/650: dread d2/d5/f23 [0,4194304] 0 2026-03-10T07:50:59.825 INFO:tasks.workunit.client.0.vm05.stdout:5/651: write d2/d12/d2d/d4a/f99 [3101383,65818] 0 2026-03-10T07:50:59.826 INFO:tasks.workunit.client.0.vm05.stdout:5/652: write d2/d5/f1e [977574,28532] 0 2026-03-10T07:50:59.836 INFO:tasks.workunit.client.0.vm05.stdout:1/642: dread da/dd/d2a/f2f [0,4194304] 0 2026-03-10T07:50:59.838 INFO:tasks.workunit.client.0.vm05.stdout:5/653: dread d2/d20/d4c/d64/f96 [0,4194304] 0 2026-03-10T07:50:59.842 INFO:tasks.workunit.client.0.vm05.stdout:8/563: truncate d1/dd/d18/d20/d2a/d34/d49/d5d/f84 583624 0 2026-03-10T07:50:59.857 INFO:tasks.workunit.client.0.vm05.stdout:5/654: dread d2/d5/f10 [4194304,4194304] 0 2026-03-10T07:50:59.864 INFO:tasks.workunit.client.0.vm05.stdout:4/702: unlink d0/d6/d6f/la4 0 2026-03-10T07:50:59.865 INFO:tasks.workunit.client.0.vm05.stdout:3/608: write d8/d16/f4c [1458302,12027] 0 2026-03-10T07:50:59.875 INFO:tasks.workunit.client.0.vm05.stdout:6/646: rename d0/d11/d57/da4/db3/dc0 to d0/d35/d36/d43/d9c/dc7 0 2026-03-10T07:50:59.875 INFO:tasks.workunit.client.0.vm05.stdout:6/647: readlink d0/l19 0 2026-03-10T07:50:59.894 INFO:tasks.workunit.client.0.vm05.stdout:1/643: dwrite da/dd/d42/d80/f94 [0,4194304] 0 2026-03-10T07:50:59.895 INFO:tasks.workunit.client.0.vm05.stdout:1/644: read - da/d26/d2b/d89/fc6 zero size 2026-03-10T07:50:59.911 INFO:tasks.workunit.client.0.vm05.stdout:3/609: rmdir d8/d1f 39 2026-03-10T07:50:59.928 
INFO:tasks.workunit.client.0.vm05.stdout:7/653: truncate d1/d5b/f73 512414 0 2026-03-10T07:50:59.937 INFO:tasks.workunit.client.0.vm05.stdout:8/564: rename d1/dd/d5e/f6b to d1/dd/d4d/d64/d6a/fb5 0 2026-03-10T07:50:59.937 INFO:tasks.workunit.client.0.vm05.stdout:8/565: readlink d1/d45/l9f 0 2026-03-10T07:50:59.941 INFO:tasks.workunit.client.0.vm05.stdout:2/682: link d0/d8/d66/dd1/fa6 d0/d8/d3d/fdd 0 2026-03-10T07:50:59.956 INFO:tasks.workunit.client.0.vm05.stdout:0/626: creat d8/dd/d10/fd9 x:0 0 0 2026-03-10T07:50:59.957 INFO:tasks.workunit.client.0.vm05.stdout:0/627: read d8/dd/d10/f19 [3775223,73175] 0 2026-03-10T07:50:59.959 INFO:tasks.workunit.client.0.vm05.stdout:5/655: link d2/d20/d33/d86/fbb d2/d20/d33/d86/dac/fd8 0 2026-03-10T07:50:59.960 INFO:tasks.workunit.client.0.vm05.stdout:0/628: dread d8/dd/d37/d81/fd3 [0,4194304] 0 2026-03-10T07:50:59.963 INFO:tasks.workunit.client.0.vm05.stdout:4/703: mkdir d0/d6/d9/d12/d69/dc7/ded 0 2026-03-10T07:50:59.967 INFO:tasks.workunit.client.0.vm05.stdout:9/593: link d8/d35/d1c/d20/f9f d8/d86/fc1 0 2026-03-10T07:50:59.973 INFO:tasks.workunit.client.0.vm05.stdout:1/645: rename da/d26/d2b/f88 to da/dd/d12/d19/fc7 0 2026-03-10T07:50:59.973 INFO:tasks.workunit.client.0.vm05.stdout:1/646: chown da/dd/d2a/f2f 58744679 1 2026-03-10T07:50:59.991 INFO:tasks.workunit.client.0.vm05.stdout:2/683: dwrite d0/d8/d3d/fdd [0,4194304] 0 2026-03-10T07:51:00.005 INFO:tasks.workunit.client.0.vm05.stdout:0/629: symlink d8/dd/d10/d26/d8b/d86/lda 0 2026-03-10T07:51:00.007 INFO:tasks.workunit.client.0.vm05.stdout:4/704: creat d0/d6/d9/d5a/fee x:0 0 0 2026-03-10T07:51:00.011 INFO:tasks.workunit.client.0.vm05.stdout:0/630: dread d8/dd/d37/d56/d4d/fd7 [0,4194304] 0 2026-03-10T07:51:00.013 INFO:tasks.workunit.client.0.vm05.stdout:3/610: creat d8/d1f/d24/d8a/fcb x:0 0 0 2026-03-10T07:51:00.016 INFO:tasks.workunit.client.0.vm05.stdout:9/594: creat d8/d35/d22/d33/d47/fc2 x:0 0 0 2026-03-10T07:51:00.016 INFO:tasks.workunit.client.0.vm05.stdout:9/595: write 
d8/d35/d1c/d20/d54/f90 [58519,119440] 0 2026-03-10T07:51:00.019 INFO:tasks.workunit.client.0.vm05.stdout:7/654: getdents d1/d3c/d4b/da6 0 2026-03-10T07:51:00.020 INFO:tasks.workunit.client.0.vm05.stdout:7/655: read d1/d6/f58 [1074899,41494] 0 2026-03-10T07:51:00.020 INFO:tasks.workunit.client.0.vm05.stdout:7/656: write d1/d3c/d71/f95 [423463,12999] 0 2026-03-10T07:51:00.023 INFO:tasks.workunit.client.0.vm05.stdout:1/647: unlink da/d26/d2b/d89/fc6 0 2026-03-10T07:51:00.030 INFO:tasks.workunit.client.0.vm05.stdout:2/684: creat d0/d8/d66/dd1/d49/fde x:0 0 0 2026-03-10T07:51:00.037 INFO:tasks.workunit.client.0.vm05.stdout:5/656: write d2/d12/d4d/f84 [684833,61643] 0 2026-03-10T07:51:00.045 INFO:tasks.workunit.client.0.vm05.stdout:0/631: rename d8/f4a to d8/dd/d37/d67/d96/fdb 0 2026-03-10T07:51:00.046 INFO:tasks.workunit.client.0.vm05.stdout:0/632: dread - d8/dd/d37/fd1 zero size 2026-03-10T07:51:00.047 INFO:tasks.workunit.client.0.vm05.stdout:0/633: chown d8/dd/d10/lcb 1 1 2026-03-10T07:51:00.060 INFO:tasks.workunit.client.0.vm05.stdout:9/596: symlink d8/d35/d22/lc3 0 2026-03-10T07:51:00.061 INFO:tasks.workunit.client.0.vm05.stdout:9/597: fsync d8/d35/d1c/d20/d59/d8b/f39 0 2026-03-10T07:51:00.062 INFO:tasks.workunit.client.0.vm05.stdout:9/598: write d8/d35/d22/f4a [3890127,66488] 0 2026-03-10T07:51:00.069 INFO:tasks.workunit.client.0.vm05.stdout:8/566: creat d1/dd/d18/d20/d2a/fb6 x:0 0 0 2026-03-10T07:51:00.070 INFO:tasks.workunit.client.0.vm05.stdout:6/648: getdents d0/d11/d57/d60 0 2026-03-10T07:51:00.071 INFO:tasks.workunit.client.0.vm05.stdout:6/649: dread - d0/d11/d22/d6c/fa5 zero size 2026-03-10T07:51:00.076 INFO:tasks.workunit.client.0.vm05.stdout:5/657: symlink d2/d20/d33/d86/dac/dc1/ld9 0 2026-03-10T07:51:00.079 INFO:tasks.workunit.client.0.vm05.stdout:4/705: readlink d0/d6/d9/d12/d65/l6c 0 2026-03-10T07:51:00.114 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:51:00.113+0000 7f5a1eb23700 1 -- 192.168.123.105:0/1530116858 --> 
[v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "versions"} v 0) v1 -- 0x7f59fc005cc0 con 0x7f5a1810a700 2026-03-10T07:51:00.118 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:51:00.115+0000 7f5a0effd700 1 -- 192.168.123.105:0/1530116858 <== mon.1 v2:192.168.123.108:3300/0 7 ==== mon_command_ack([{"prefix": "versions"}]=0 v0) v1 ==== 56+0+679 (secure 0 0 0) 0x7f5a10017070 con 0x7f5a1810a700 2026-03-10T07:51:00.121 INFO:teuthology.orchestra.run.vm05.stdout:{ 2026-03-10T07:51:00.121 INFO:teuthology.orchestra.run.vm05.stdout: "mon": { 2026-03-10T07:51:00.121 INFO:teuthology.orchestra.run.vm05.stdout: "ceph version 18.2.1 (7fe91d5d5842e04be3b4f514d6dd990c54b29c76) reef (stable)": 2 2026-03-10T07:51:00.121 INFO:teuthology.orchestra.run.vm05.stdout: }, 2026-03-10T07:51:00.121 INFO:teuthology.orchestra.run.vm05.stdout: "mgr": { 2026-03-10T07:51:00.121 INFO:teuthology.orchestra.run.vm05.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 2 2026-03-10T07:51:00.121 INFO:teuthology.orchestra.run.vm05.stdout: }, 2026-03-10T07:51:00.121 INFO:teuthology.orchestra.run.vm05.stdout: "osd": { 2026-03-10T07:51:00.121 INFO:teuthology.orchestra.run.vm05.stdout: "ceph version 18.2.1 (7fe91d5d5842e04be3b4f514d6dd990c54b29c76) reef (stable)": 6 2026-03-10T07:51:00.121 INFO:teuthology.orchestra.run.vm05.stdout: }, 2026-03-10T07:51:00.121 INFO:teuthology.orchestra.run.vm05.stdout: "mds": { 2026-03-10T07:51:00.121 INFO:teuthology.orchestra.run.vm05.stdout: "ceph version 18.2.1 (7fe91d5d5842e04be3b4f514d6dd990c54b29c76) reef (stable)": 4 2026-03-10T07:51:00.121 INFO:teuthology.orchestra.run.vm05.stdout: }, 2026-03-10T07:51:00.121 INFO:teuthology.orchestra.run.vm05.stdout: "overall": { 2026-03-10T07:51:00.121 INFO:teuthology.orchestra.run.vm05.stdout: "ceph version 18.2.1 (7fe91d5d5842e04be3b4f514d6dd990c54b29c76) reef (stable)": 12, 2026-03-10T07:51:00.121 INFO:teuthology.orchestra.run.vm05.stdout: 
"ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 2 2026-03-10T07:51:00.121 INFO:teuthology.orchestra.run.vm05.stdout: } 2026-03-10T07:51:00.121 INFO:teuthology.orchestra.run.vm05.stdout:} 2026-03-10T07:51:00.130 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:51:00.128+0000 7f5a0cff9700 1 -- 192.168.123.105:0/1530116858 >> [v2:192.168.123.105:6800/413688438,v1:192.168.123.105:6801/413688438] conn(0x7f5a04077860 msgr2=0x7f5a04079d20 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:51:00.130 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:51:00.128+0000 7f5a0cff9700 1 --2- 192.168.123.105:0/1530116858 >> [v2:192.168.123.105:6800/413688438,v1:192.168.123.105:6801/413688438] conn(0x7f5a04077860 0x7f5a04079d20 secure :-1 s=READY pgs=26 cs=0 l=1 rev1=1 crypto rx=0x7f5a08006fd0 tx=0x7f5a08008040 comp rx=0 tx=0).stop 2026-03-10T07:51:00.130 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:51:00.128+0000 7f5a0cff9700 1 -- 192.168.123.105:0/1530116858 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f5a1810a700 msgr2=0x7f5a181a53d0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:51:00.130 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:51:00.128+0000 7f5a0cff9700 1 --2- 192.168.123.105:0/1530116858 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f5a1810a700 0x7f5a181a53d0 secure :-1 s=READY pgs=100 cs=0 l=1 rev1=1 crypto rx=0x7f5a10007ae0 tx=0x7f5a100095a0 comp rx=0 tx=0).stop 2026-03-10T07:51:00.130 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:51:00.128+0000 7f5a0cff9700 1 -- 192.168.123.105:0/1530116858 shutdown_connections 2026-03-10T07:51:00.130 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:51:00.128+0000 7f5a0cff9700 1 --2- 192.168.123.105:0/1530116858 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f5a1810a700 0x7f5a181a53d0 secure :-1 s=CLOSED pgs=100 cs=0 l=1 rev1=1 crypto 
rx=0x7f5a10007ae0 tx=0x7f5a100095a0 comp rx=0 tx=0).stop 2026-03-10T07:51:00.130 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:51:00.128+0000 7f5a0cff9700 1 --2- 192.168.123.105:0/1530116858 >> [v2:192.168.123.105:6800/413688438,v1:192.168.123.105:6801/413688438] conn(0x7f5a04077860 0x7f5a04079d20 secure :-1 s=CLOSED pgs=26 cs=0 l=1 rev1=1 crypto rx=0x7f5a08006fd0 tx=0x7f5a08008040 comp rx=0 tx=0).stop 2026-03-10T07:51:00.130 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:51:00.128+0000 7f5a0cff9700 1 --2- 192.168.123.105:0/1530116858 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f5a181a5910 0x7f5a18076fe0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:51:00.130 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:51:00.128+0000 7f5a0cff9700 1 -- 192.168.123.105:0/1530116858 >> 192.168.123.105:0/1530116858 conn(0x7f5a1806daa0 msgr2=0x7f5a18109fb0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:51:00.131 INFO:tasks.workunit.client.0.vm05.stdout:3/611: truncate d8/d16/d52/da4/fab 318515 0 2026-03-10T07:51:00.131 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:51:00.130+0000 7f5a0cff9700 1 -- 192.168.123.105:0/1530116858 shutdown_connections 2026-03-10T07:51:00.131 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:51:00.130+0000 7f5a0cff9700 1 -- 192.168.123.105:0/1530116858 wait complete. 
2026-03-10T07:51:00.131 INFO:tasks.workunit.client.0.vm05.stdout:3/612: chown d8/d16/f4c 1089454623 1 2026-03-10T07:51:00.134 INFO:tasks.workunit.client.0.vm05.stdout:1/648: mknod da/dd/d2a/d55/d64/dc2/cc8 0 2026-03-10T07:51:00.135 INFO:tasks.workunit.client.0.vm05.stdout:6/650: mkdir d0/d35/d36/dc8 0 2026-03-10T07:51:00.136 INFO:tasks.workunit.client.0.vm05.stdout:2/685: creat d0/d8/d66/dd1/d49/db3/fdf x:0 0 0 2026-03-10T07:51:00.147 INFO:tasks.workunit.client.0.vm05.stdout:7/657: creat d1/d34/d59/fca x:0 0 0 2026-03-10T07:51:00.147 INFO:tasks.workunit.client.0.vm05.stdout:8/567: mkdir d1/dd/d18/d20/d2a/d34/d49/db7 0 2026-03-10T07:51:00.157 INFO:tasks.workunit.client.0.vm05.stdout:2/686: symlink d0/d8/d43/df/d4d/le0 0 2026-03-10T07:51:00.171 INFO:tasks.workunit.client.0.vm05.stdout:4/706: mkdir d0/d6/d9/d12/d65/def 0 2026-03-10T07:51:00.172 INFO:tasks.workunit.client.0.vm05.stdout:3/613: mknod d8/d8f/dbc/dc7/ccc 0 2026-03-10T07:51:00.173 INFO:tasks.workunit.client.0.vm05.stdout:4/707: read d0/d6/d9/d12/d45/d55/f5f [1904760,105601] 0 2026-03-10T07:51:00.176 INFO:tasks.workunit.client.0.vm05.stdout:3/614: dwrite d8/d22/d60/f8e [4194304,4194304] 0 2026-03-10T07:51:00.184 INFO:tasks.workunit.client.0.vm05.stdout:6/651: creat d0/d11/d4f/d7d/db7/fc9 x:0 0 0 2026-03-10T07:51:00.184 INFO:tasks.workunit.client.0.vm05.stdout:6/652: dread - d0/d11/d57/fb9 zero size 2026-03-10T07:51:00.189 INFO:tasks.workunit.client.0.vm05.stdout:7/658: rmdir d1/d6/d3b/d7f 39 2026-03-10T07:51:00.198 INFO:tasks.workunit.client.0.vm05.stdout:7/659: chown d1/l2f 31235242 1 2026-03-10T07:51:00.199 INFO:tasks.workunit.client.0.vm05.stdout:7/660: write d1/f49 [9213897,64454] 0 2026-03-10T07:51:00.199 INFO:tasks.workunit.client.0.vm05.stdout:9/599: dwrite d8/d35/d22/d33/d62/fba [0,4194304] 0 2026-03-10T07:51:00.199 INFO:tasks.workunit.client.0.vm05.stdout:5/658: rename d2/d20/d33/d86/d8d to d2/d12/dda 0 2026-03-10T07:51:00.199 INFO:tasks.workunit.client.0.vm05.stdout:0/634: truncate 
d8/dd/d10/d26/d48/fb0 3261420 0 2026-03-10T07:51:00.200 INFO:tasks.workunit.client.0.vm05.stdout:2/687: sync 2026-03-10T07:51:00.207 INFO:tasks.workunit.client.0.vm05.stdout:3/615: creat d8/d1f/fcd x:0 0 0 2026-03-10T07:51:00.208 INFO:tasks.workunit.client.0.vm05.stdout:1/649: write da/d26/d2b/d71/f7d [937289,99269] 0 2026-03-10T07:51:00.209 INFO:tasks.workunit.client.0.vm05.stdout:1/650: write da/d26/d2b/d89/fa7 [531280,27047] 0 2026-03-10T07:51:00.210 INFO:tasks.workunit.client.0.vm05.stdout:6/653: mknod d0/d11/d57/da4/cca 0 2026-03-10T07:51:00.217 INFO:tasks.workunit.client.0.vm05.stdout:7/661: dread d1/d34/d59/d60/d8c/f97 [0,4194304] 0 2026-03-10T07:51:00.221 INFO:tasks.workunit.client.0.vm05.stdout:4/708: rename d0/d6/d9/d12/d45/d55/d44/d85/fc0 to d0/d6/d9/d12/d69/dc7/ded/ff0 0 2026-03-10T07:51:00.226 INFO:tasks.workunit.client.0.vm05.stdout:4/709: write d0/d6/d9/d5a/fee [84325,7492] 0 2026-03-10T07:51:00.226 INFO:tasks.workunit.client.0.vm05.stdout:5/659: mkdir d2/d5/ddb 0 2026-03-10T07:51:00.227 INFO:tasks.workunit.client.0.vm05.stdout:0/635: truncate d8/dd/f29 2209686 0 2026-03-10T07:51:00.228 INFO:tasks.workunit.client.0.vm05.stdout:3/616: creat d8/d22/d60/fce x:0 0 0 2026-03-10T07:51:00.230 INFO:tasks.workunit.client.0.vm05.stdout:3/617: read d8/d1f/d2a/f42 [92032,16010] 0 2026-03-10T07:51:00.244 INFO:tasks.workunit.client.0.vm05.stdout:1/651: truncate da/dd/fa5 1554824 0 2026-03-10T07:51:00.247 INFO:tasks.workunit.client.0.vm05.stdout:1/652: dwrite da/d26/d2b/fb0 [0,4194304] 0 2026-03-10T07:51:00.254 INFO:tasks.workunit.client.0.vm05.stdout:8/568: link d1/dd/d4d/l97 d1/dd/d4d/d64/d6a/lb8 0 2026-03-10T07:51:00.260 INFO:tasks.workunit.client.0.vm05.stdout:9/600: rename d8/d35/l18 to d8/d35/d38/lc4 0 2026-03-10T07:51:00.260 INFO:tasks.workunit.client.0.vm05.stdout:9/601: fsync d8/d35/d22/d33/d62/fba 0 2026-03-10T07:51:00.262 INFO:tasks.workunit.client.0.vm05.stdout:9/602: truncate d8/d35/d22/d33/d62/d6d/faf 447882 0 2026-03-10T07:51:00.267 
INFO:tasks.workunit.client.0.vm05.stdout:5/660: symlink d2/d12/d4d/ldc 0 2026-03-10T07:51:00.267 INFO:tasks.workunit.client.0.vm05.stdout:5/661: chown d2/d12/d2d/c95 0 1 2026-03-10T07:51:00.273 INFO:tasks.workunit.client.0.vm05.stdout:4/710: dread d0/d28/f33 [0,4194304] 0 2026-03-10T07:51:00.276 INFO:tasks.workunit.client.0.vm05.stdout:0/636: creat d8/dd/d10/d26/d8b/da4/fdc x:0 0 0 2026-03-10T07:51:00.290 INFO:tasks.workunit.client.0.vm05.stdout:0/637: sync 2026-03-10T07:51:00.294 INFO:tasks.workunit.client.0.vm05.stdout:1/653: rmdir da/d26/d2b/daf/dbe/dc0/d8f 39 2026-03-10T07:51:00.302 INFO:tasks.workunit.client.0.vm05.stdout:6/654: truncate d0/d6/f1a 1963710 0 2026-03-10T07:51:00.302 INFO:tasks.workunit.client.0.vm05.stdout:6/655: dread - d0/d11/d57/d66/f7b zero size 2026-03-10T07:51:00.303 INFO:tasks.workunit.client.0.vm05.stdout:6/656: truncate d0/d11/d57/d66/fbd 899552 0 2026-03-10T07:51:00.307 INFO:tasks.workunit.client.0.vm05.stdout:8/569: mknod d1/dd/d18/d20/d2a/d48/d7c/d9c/da4/cb9 0 2026-03-10T07:51:00.313 INFO:tasks.workunit.client.0.vm05.stdout:9/603: rmdir d8/d35/d38/d71/d81 39 2026-03-10T07:51:00.317 INFO:tasks.workunit.client.0.vm05.stdout:5/662: mkdir d2/d12/da8/ddd 0 2026-03-10T07:51:00.320 INFO:tasks.workunit.client.0.vm05.stdout:2/688: creat d0/d8/d43/fe1 x:0 0 0 2026-03-10T07:51:00.329 INFO:tasks.workunit.client.0.vm05.stdout:0/638: dwrite d8/dd/d10/d26/d3a/d5e/fa3 [4194304,4194304] 0 2026-03-10T07:51:00.331 INFO:tasks.workunit.client.0.vm05.stdout:0/639: chown d8/dd/d10/fd9 95639226 1 2026-03-10T07:51:00.332 INFO:tasks.workunit.client.0.vm05.stdout:0/640: write d8/dd/d37/f4f [721005,61297] 0 2026-03-10T07:51:00.352 INFO:tasks.workunit.client.0.vm05.stdout:0/641: dread d8/f20 [0,4194304] 0 2026-03-10T07:51:00.363 INFO:tasks.workunit.client.0.vm05.stdout:8/570: write d1/dd/d18/d20/f5b [6930726,53527] 0 2026-03-10T07:51:00.365 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:51:00.363+0000 7fc1a48d0700 1 -- 192.168.123.105:0/1604722730 >> 
[v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fc1a010a700 msgr2=0x7fc1a010cb90 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:51:00.365 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:51:00.363+0000 7fc1a48d0700 1 --2- 192.168.123.105:0/1604722730 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fc1a010a700 0x7fc1a010cb90 secure :-1 s=READY pgs=101 cs=0 l=1 rev1=1 crypto rx=0x7fc194009b00 tx=0x7fc194009e10 comp rx=0 tx=0).stop 2026-03-10T07:51:00.365 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:51:00.364+0000 7fc1a48d0700 1 -- 192.168.123.105:0/1604722730 shutdown_connections 2026-03-10T07:51:00.365 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:51:00.364+0000 7fc1a48d0700 1 --2- 192.168.123.105:0/1604722730 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fc1a010a700 0x7fc1a010cb90 unknown :-1 s=CLOSED pgs=101 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:51:00.365 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:51:00.364+0000 7fc1a48d0700 1 --2- 192.168.123.105:0/1604722730 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fc1a0107d90 0x7fc1a010a1c0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:51:00.365 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:51:00.364+0000 7fc1a48d0700 1 -- 192.168.123.105:0/1604722730 >> 192.168.123.105:0/1604722730 conn(0x7fc1a006daa0 msgr2=0x7fc1a006ff00 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:51:00.365 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:51:00.364+0000 7fc1a48d0700 1 -- 192.168.123.105:0/1604722730 shutdown_connections 2026-03-10T07:51:00.366 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:51:00.364+0000 7fc1a48d0700 1 -- 192.168.123.105:0/1604722730 wait complete. 
2026-03-10T07:51:00.367 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:51:00.365+0000 7fc1a48d0700 1 Processor -- start 2026-03-10T07:51:00.367 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:51:00.365+0000 7fc1a48d0700 1 -- start start 2026-03-10T07:51:00.367 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:51:00.365+0000 7fc1a48d0700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fc1a0107d90 0x7fc1a0116a60 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:51:00.367 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:51:00.365+0000 7fc1a48d0700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fc1a0116fa0 0x7fc1a0076fe0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:51:00.367 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:51:00.365+0000 7fc1a48d0700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fc1a01174b0 con 0x7fc1a0107d90 2026-03-10T07:51:00.367 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:51:00.365+0000 7fc1a48d0700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fc1a0117620 con 0x7fc1a0116fa0 2026-03-10T07:51:00.367 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:51:00.365+0000 7fc19f7fe700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fc1a0107d90 0x7fc1a0116a60 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:51:00.367 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:51:00.365+0000 7fc19f7fe700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fc1a0107d90 0x7fc1a0116a60 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am 
v2:192.168.123.105:36386/0 (socket says 192.168.123.105:36386) 2026-03-10T07:51:00.367 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:51:00.365+0000 7fc19f7fe700 1 -- 192.168.123.105:0/4083894101 learned_addr learned my addr 192.168.123.105:0/4083894101 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T07:51:00.367 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:51:00.365+0000 7fc19effd700 1 --2- 192.168.123.105:0/4083894101 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fc1a0116fa0 0x7fc1a0076fe0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:51:00.367 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:51:00.366+0000 7fc19f7fe700 1 -- 192.168.123.105:0/4083894101 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fc1a0116fa0 msgr2=0x7fc1a0076fe0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:51:00.367 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:51:00.366+0000 7fc19f7fe700 1 --2- 192.168.123.105:0/4083894101 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fc1a0116fa0 0x7fc1a0076fe0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:51:00.368 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:51:00.366+0000 7fc19f7fe700 1 -- 192.168.123.105:0/4083894101 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fc1940097e0 con 0x7fc1a0107d90 2026-03-10T07:51:00.368 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:51:00.366+0000 7fc19f7fe700 1 --2- 192.168.123.105:0/4083894101 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fc1a0107d90 0x7fc1a0116a60 secure :-1 s=READY pgs=319 cs=0 l=1 rev1=1 crypto rx=0x7fc19000d8d0 tx=0x7fc19000dbe0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 
2026-03-10T07:51:00.369 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:51:00.367+0000 7fc19cff9700 1 -- 192.168.123.105:0/4083894101 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fc190009880 con 0x7fc1a0107d90 2026-03-10T07:51:00.369 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:51:00.367+0000 7fc1a48d0700 1 -- 192.168.123.105:0/4083894101 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fc1a0077580 con 0x7fc1a0107d90 2026-03-10T07:51:00.369 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:51:00.367+0000 7fc1a48d0700 1 -- 192.168.123.105:0/4083894101 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fc1a0077ad0 con 0x7fc1a0107d90 2026-03-10T07:51:00.370 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:51:00.368+0000 7fc19cff9700 1 -- 192.168.123.105:0/4083894101 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fc19000b6e0 con 0x7fc1a0107d90 2026-03-10T07:51:00.370 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:51:00.368+0000 7fc19cff9700 1 -- 192.168.123.105:0/4083894101 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fc19000f5d0 con 0x7fc1a0107d90 2026-03-10T07:51:00.371 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:51:00.370+0000 7fc19cff9700 1 -- 192.168.123.105:0/4083894101 <== mon.0 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 30) v1 ==== 100066+0+0 (secure 0 0 0) 0x7fc19000f730 con 0x7fc1a0107d90 2026-03-10T07:51:00.372 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:51:00.370+0000 7fc19cff9700 1 --2- 192.168.123.105:0/4083894101 >> [v2:192.168.123.105:6800/413688438,v1:192.168.123.105:6801/413688438] conn(0x7fc188077780 0x7fc188079c40 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:51:00.373 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:51:00.370+0000 7fc1a48d0700 1 -- 192.168.123.105:0/4083894101 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fc18c005320 con 0x7fc1a0107d90 2026-03-10T07:51:00.373 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:51:00.371+0000 7fc19cff9700 1 -- 192.168.123.105:0/4083894101 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(42..42 src has 1..42) v4 ==== 5878+0+0 (secure 0 0 0) 0x7fc19009a4e0 con 0x7fc1a0107d90 2026-03-10T07:51:00.377 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:51:00.375+0000 7fc19effd700 1 --2- 192.168.123.105:0/4083894101 >> [v2:192.168.123.105:6800/413688438,v1:192.168.123.105:6801/413688438] conn(0x7fc188077780 0x7fc188079c40 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:51:00.377 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:51:00.375+0000 7fc19effd700 1 --2- 192.168.123.105:0/4083894101 >> [v2:192.168.123.105:6800/413688438,v1:192.168.123.105:6801/413688438] conn(0x7fc188077780 0x7fc188079c40 secure :-1 s=READY pgs=27 cs=0 l=1 rev1=1 crypto rx=0x7fc194000c00 tx=0x7fc194019040 comp rx=0 tx=0).ready entity=mgr.14628 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:51:00.377 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:51:00.375+0000 7fc19cff9700 1 -- 192.168.123.105:0/4083894101 <== mon.0 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+191549 (secure 0 0 0) 0x7fc190062ff0 con 0x7fc1a0107d90 2026-03-10T07:51:00.381 INFO:tasks.workunit.client.0.vm05.stdout:5/663: creat d2/d20/d4c/d64/fde x:0 0 0 2026-03-10T07:51:00.385 INFO:tasks.workunit.client.0.vm05.stdout:2/689: rename d0/d8/d43/df/d4d/f84 to d0/d8/d66/dd1/d49/d81/dd5/fe2 0 2026-03-10T07:51:00.385 
INFO:tasks.workunit.client.0.vm05.stdout:2/690: chown d0/d8/d3d/d7d/db2/c32 21545132 1 2026-03-10T07:51:00.389 INFO:tasks.workunit.client.0.vm05.stdout:3/618: link d8/d16/f88 d8/d1f/d2a/d96/fcf 0 2026-03-10T07:51:00.414 INFO:tasks.workunit.client.0.vm05.stdout:6/657: dread d0/d6/f1a [0,4194304] 0 2026-03-10T07:51:00.414 INFO:tasks.workunit.client.0.vm05.stdout:1/654: dwrite da/d26/d2b/daf/fb6 [0,4194304] 0 2026-03-10T07:51:00.429 INFO:tasks.workunit.client.0.vm05.stdout:0/642: dread d8/dd/f29 [0,4194304] 0 2026-03-10T07:51:00.436 INFO:tasks.workunit.client.0.vm05.stdout:7/662: getdents d1/d34 0 2026-03-10T07:51:00.437 INFO:tasks.workunit.client.0.vm05.stdout:7/663: chown d1/d3c/d71/fab 9 1 2026-03-10T07:51:00.441 INFO:tasks.workunit.client.0.vm05.stdout:9/604: creat d8/d86/d28/d79/d57/dbc/fc5 x:0 0 0 2026-03-10T07:51:00.446 INFO:tasks.workunit.client.0.vm05.stdout:9/605: write d8/d35/d22/d33/d47/fc2 [755551,17711] 0 2026-03-10T07:51:00.449 INFO:tasks.workunit.client.0.vm05.stdout:2/691: chown d0/d8/c64 114 1 2026-03-10T07:51:00.463 INFO:tasks.workunit.client.0.vm05.stdout:3/619: write d8/d1f/d2a/d96/f85 [1655965,47955] 0 2026-03-10T07:51:00.470 INFO:tasks.workunit.client.0.vm05.stdout:6/658: rename d0/l9f to d0/d11/d22/d6c/d84/dc4/lcb 0 2026-03-10T07:51:00.580 INFO:tasks.workunit.client.0.vm05.stdout:9/606: mkdir d8/d35/d1c/d75/dc6 0 2026-03-10T07:51:00.580 INFO:tasks.workunit.client.0.vm05.stdout:9/607: chown d8/d86/d28 11143 1 2026-03-10T07:51:00.583 INFO:tasks.workunit.client.0.vm05.stdout:5/664: creat d2/d12/da8/ddd/fdf x:0 0 0 2026-03-10T07:51:00.587 INFO:tasks.workunit.client.0.vm05.stdout:4/711: getdents d0/d28 0 2026-03-10T07:51:00.601 INFO:tasks.workunit.client.0.vm05.stdout:0/643: rename d8/dd/d37/d81/f8d to d8/dd/d10/d26/d48/fdd 0 2026-03-10T07:51:00.604 INFO:tasks.workunit.client.0.vm05.stdout:6/659: rmdir d0/d11/d4f/d7d/db7 39 2026-03-10T07:51:00.606 INFO:tasks.workunit.client.0.vm05.stdout:1/655: mknod da/d26/d2b/d89/dbd/cc9 0 2026-03-10T07:51:00.607 
INFO:tasks.workunit.client.0.vm05.stdout:1/656: chown da/c57 226997477 1 2026-03-10T07:51:00.607 INFO:tasks.workunit.client.0.vm05.stdout:1/657: chown da/dd/d2a/d55/d68/c32 71 1 2026-03-10T07:51:00.609 INFO:tasks.workunit.client.0.vm05.stdout:8/571: link d1/dd/d4d/f8a d1/d45/fba 0 2026-03-10T07:51:00.613 INFO:tasks.workunit.client.0.vm05.stdout:9/608: read - d8/d35/d1c/d20/d59/d8b/fa1 zero size 2026-03-10T07:51:00.614 INFO:tasks.workunit.client.0.vm05.stdout:5/665: read d2/d5/f71 [1200888,32838] 0 2026-03-10T07:51:00.618 INFO:tasks.workunit.client.0.vm05.stdout:4/712: truncate d0/d6/d9/d12/d69/fa5 240899 0 2026-03-10T07:51:00.618 INFO:tasks.workunit.client.0.vm05.stdout:3/620: symlink d8/d22/d60/d6e/dca/ld0 0 2026-03-10T07:51:00.619 INFO:tasks.workunit.client.0.vm05.stdout:4/713: dread - d0/d6/d9/d5a/d6e/db6/db9/fd9 zero size 2026-03-10T07:51:00.622 INFO:tasks.workunit.client.0.vm05.stdout:1/658: unlink da/dd/d2a/d55/d68/c67 0 2026-03-10T07:51:00.622 INFO:tasks.workunit.client.0.vm05.stdout:1/659: chown da/dd 27 1 2026-03-10T07:51:00.632 INFO:tasks.workunit.client.0.vm05.stdout:4/714: fdatasync d0/d6/d9/d12/d45/d55/f5f 0 2026-03-10T07:51:00.632 INFO:tasks.workunit.client.0.vm05.stdout:2/692: creat d0/fe3 x:0 0 0 2026-03-10T07:51:00.634 INFO:tasks.workunit.client.0.vm05.stdout:2/693: chown d0/d8/d3d/d7d/db2/c32 219716 1 2026-03-10T07:51:00.645 INFO:tasks.workunit.client.0.vm05.stdout:7/664: getdents d1/d6/d80/d82 0 2026-03-10T07:51:00.657 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:51:00 vm05.local ceph-mon[50387]: from='client.14680 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T07:51:00.657 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:51:00 vm05.local ceph-mon[50387]: from='client.? 
192.168.123.105:0/1530116858' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T07:51:00.665 INFO:tasks.workunit.client.0.vm05.stdout:8/572: rename d1/c4 to d1/d45/cbb 0 2026-03-10T07:51:00.666 INFO:tasks.workunit.client.0.vm05.stdout:8/573: dread - d1/dd/d4d/d64/fac zero size 2026-03-10T07:51:00.667 INFO:tasks.workunit.client.0.vm05.stdout:8/574: chown d1/dd/d18/c69 164482 1 2026-03-10T07:51:00.669 INFO:tasks.workunit.client.0.vm05.stdout:3/621: symlink d8/ld1 0 2026-03-10T07:51:00.672 INFO:tasks.workunit.client.0.vm05.stdout:7/665: sync 2026-03-10T07:51:00.676 INFO:tasks.workunit.client.0.vm05.stdout:2/694: rmdir d0/d2a 39 2026-03-10T07:51:00.677 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:51:00.675+0000 7fc1a48d0700 1 -- 192.168.123.105:0/4083894101 --> [v2:192.168.123.105:6800/413688438,v1:192.168.123.105:6801/413688438] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7fc18c000bf0 con 0x7fc188077780 2026-03-10T07:51:00.680 INFO:tasks.workunit.client.0.vm05.stdout:1/660: truncate da/d26/d2b/d89/fb1 205859 0 2026-03-10T07:51:00.681 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:51:00.680+0000 7fc19cff9700 1 -- 192.168.123.105:0/4083894101 <== mgr.14628 v2:192.168.123.105:6800/413688438 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+370 (secure 0 0 0) 0x7fc18c000bf0 con 0x7fc188077780 2026-03-10T07:51:00.681 INFO:teuthology.orchestra.run.vm05.stdout:{ 2026-03-10T07:51:00.681 INFO:teuthology.orchestra.run.vm05.stdout: "target_image": "quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc", 2026-03-10T07:51:00.681 INFO:teuthology.orchestra.run.vm05.stdout: "in_progress": true, 2026-03-10T07:51:00.681 INFO:teuthology.orchestra.run.vm05.stdout: "which": "Upgrading daemons of type(s) mgr", 2026-03-10T07:51:00.681 INFO:teuthology.orchestra.run.vm05.stdout: "services_complete": [ 2026-03-10T07:51:00.681 
INFO:teuthology.orchestra.run.vm05.stdout: "mgr" 2026-03-10T07:51:00.681 INFO:teuthology.orchestra.run.vm05.stdout: ], 2026-03-10T07:51:00.681 INFO:teuthology.orchestra.run.vm05.stdout: "progress": "2/2 daemons upgraded", 2026-03-10T07:51:00.681 INFO:teuthology.orchestra.run.vm05.stdout: "message": "Currently upgrading node-exporter daemons", 2026-03-10T07:51:00.681 INFO:teuthology.orchestra.run.vm05.stdout: "is_paused": false 2026-03-10T07:51:00.681 INFO:teuthology.orchestra.run.vm05.stdout:} 2026-03-10T07:51:00.682 INFO:tasks.workunit.client.0.vm05.stdout:9/609: creat d8/d35/d38/d71/fc7 x:0 0 0 2026-03-10T07:51:00.687 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:51:00.685+0000 7fc1867fc700 1 -- 192.168.123.105:0/4083894101 >> [v2:192.168.123.105:6800/413688438,v1:192.168.123.105:6801/413688438] conn(0x7fc188077780 msgr2=0x7fc188079c40 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:51:00.687 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:51:00.685+0000 7fc1867fc700 1 --2- 192.168.123.105:0/4083894101 >> [v2:192.168.123.105:6800/413688438,v1:192.168.123.105:6801/413688438] conn(0x7fc188077780 0x7fc188079c40 secure :-1 s=READY pgs=27 cs=0 l=1 rev1=1 crypto rx=0x7fc194000c00 tx=0x7fc194019040 comp rx=0 tx=0).stop 2026-03-10T07:51:00.687 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:51:00.685+0000 7fc1867fc700 1 -- 192.168.123.105:0/4083894101 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fc1a0107d90 msgr2=0x7fc1a0116a60 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:51:00.687 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:51:00.685+0000 7fc1867fc700 1 --2- 192.168.123.105:0/4083894101 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fc1a0107d90 0x7fc1a0116a60 secure :-1 s=READY pgs=319 cs=0 l=1 rev1=1 crypto rx=0x7fc19000d8d0 tx=0x7fc19000dbe0 comp rx=0 tx=0).stop 2026-03-10T07:51:00.688 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:51:00.686+0000 7fc1867fc700 1 -- 192.168.123.105:0/4083894101 shutdown_connections 2026-03-10T07:51:00.688 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:51:00.686+0000 7fc1867fc700 1 --2- 192.168.123.105:0/4083894101 >> [v2:192.168.123.105:6800/413688438,v1:192.168.123.105:6801/413688438] conn(0x7fc188077780 0x7fc188079c40 unknown :-1 s=CLOSED pgs=27 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:51:00.688 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:51:00.686+0000 7fc1867fc700 1 --2- 192.168.123.105:0/4083894101 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fc1a0107d90 0x7fc1a0116a60 unknown :-1 s=CLOSED pgs=319 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:51:00.688 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:51:00.686+0000 7fc1867fc700 1 --2- 192.168.123.105:0/4083894101 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fc1a0116fa0 0x7fc1a0076fe0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:51:00.688 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:51:00.686+0000 7fc1867fc700 1 -- 192.168.123.105:0/4083894101 >> 192.168.123.105:0/4083894101 conn(0x7fc1a006daa0 msgr2=0x7fc1a006e780 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:51:00.689 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:51:00.687+0000 7fc1867fc700 1 -- 192.168.123.105:0/4083894101 shutdown_connections 2026-03-10T07:51:00.689 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:51:00.687+0000 7fc1867fc700 1 -- 192.168.123.105:0/4083894101 wait complete. 
2026-03-10T07:51:00.692 INFO:tasks.workunit.client.0.vm05.stdout:3/622: mkdir d8/d1f/d2a/d34/dd2 0 2026-03-10T07:51:00.696 INFO:tasks.workunit.client.0.vm05.stdout:3/623: dwrite d8/d1c/f56 [0,4194304] 0 2026-03-10T07:51:00.697 INFO:tasks.workunit.client.0.vm05.stdout:3/624: write d8/d22/fb9 [388378,59485] 0 2026-03-10T07:51:00.703 INFO:tasks.workunit.client.0.vm05.stdout:7/666: creat d1/d34/d59/d60/d8c/fcb x:0 0 0 2026-03-10T07:51:00.711 INFO:tasks.workunit.client.0.vm05.stdout:6/660: getdents d0/d11/d22/d6c 0 2026-03-10T07:51:00.716 INFO:tasks.workunit.client.0.vm05.stdout:1/661: unlink da/dd/d12/d86/fc3 0 2026-03-10T07:51:00.720 INFO:tasks.workunit.client.0.vm05.stdout:7/667: sync 2026-03-10T07:51:00.735 INFO:tasks.workunit.client.0.vm05.stdout:3/625: unlink d8/d22/d60/d6e/f9e 0 2026-03-10T07:51:00.735 INFO:tasks.workunit.client.0.vm05.stdout:3/626: write d8/d1f/fcd [859483,111091] 0 2026-03-10T07:51:00.741 INFO:tasks.workunit.client.0.vm05.stdout:6/661: dwrite d0/d6/f45 [0,4194304] 0 2026-03-10T07:51:00.743 INFO:tasks.workunit.client.0.vm05.stdout:6/662: chown d0/d11/d4f/d56/d96/lae 1 1 2026-03-10T07:51:00.755 INFO:tasks.workunit.client.0.vm05.stdout:1/662: creat da/d26/d2b/d89/fca x:0 0 0 2026-03-10T07:51:00.761 INFO:tasks.workunit.client.0.vm05.stdout:7/668: dread d1/d6/f84 [0,4194304] 0 2026-03-10T07:51:00.767 INFO:tasks.workunit.client.0.vm05.stdout:7/669: dread d1/d6/f84 [0,4194304] 0 2026-03-10T07:51:00.767 INFO:tasks.workunit.client.0.vm05.stdout:7/670: chown d1/d34/f4d 168135 1 2026-03-10T07:51:00.767 INFO:tasks.workunit.client.0.vm05.stdout:0/644: getdents d8/dd/d10/d26/d8b 0 2026-03-10T07:51:00.770 INFO:tasks.workunit.client.0.vm05.stdout:6/663: mkdir d0/d11/d57/d60/dcc 0 2026-03-10T07:51:00.779 INFO:tasks.workunit.client.0.vm05.stdout:7/671: mknod d1/d34/d59/d60/d8c/ccc 0 2026-03-10T07:51:00.795 INFO:tasks.workunit.client.0.vm05.stdout:3/627: symlink d8/d16/d52/dbb/ld3 0 2026-03-10T07:51:00.805 INFO:tasks.workunit.client.0.vm05.stdout:7/672: mkdir 
d1/d6/d80/dcd 0 2026-03-10T07:51:00.818 INFO:tasks.workunit.client.0.vm05.stdout:3/628: unlink d8/d1c/d48/fba 0 2026-03-10T07:51:00.822 INFO:tasks.workunit.client.0.vm05.stdout:3/629: dwrite d8/d1c/f56 [4194304,4194304] 0 2026-03-10T07:51:00.833 INFO:tasks.workunit.client.0.vm05.stdout:7/673: creat d1/d3c/d71/d79/d8a/fce x:0 0 0 2026-03-10T07:51:00.847 INFO:tasks.workunit.client.0.vm05.stdout:7/674: dread d1/d6/f1d [0,4194304] 0 2026-03-10T07:51:00.868 INFO:tasks.workunit.client.0.vm05.stdout:2/695: rename d0/d8/d3d/d7d/c50 to d0/d8/d3d/d7d/ce4 0 2026-03-10T07:51:00.873 INFO:tasks.workunit.client.0.vm05.stdout:7/675: rmdir d1/d3c/db8 39 2026-03-10T07:51:00.886 INFO:tasks.workunit.client.0.vm05.stdout:5/666: dwrite d2/d20/d7b/f83 [0,4194304] 0 2026-03-10T07:51:00.891 INFO:tasks.workunit.client.0.vm05.stdout:0/645: creat d8/dd/fde x:0 0 0 2026-03-10T07:51:00.894 INFO:tasks.workunit.client.0.vm05.stdout:6/664: creat d0/d11/fcd x:0 0 0 2026-03-10T07:51:00.894 INFO:tasks.workunit.client.0.vm05.stdout:6/665: chown d0/d11/d31/c3c 306 1 2026-03-10T07:51:00.899 INFO:tasks.workunit.client.0.vm05.stdout:7/676: creat d1/d3c/d71/d79/d8a/dac/fcf x:0 0 0 2026-03-10T07:51:00.902 INFO:tasks.workunit.client.0.vm05.stdout:0/646: mkdir d8/dd/d10/d26/d8b/da4/ddf 0 2026-03-10T07:51:00.906 INFO:tasks.workunit.client.0.vm05.stdout:2/696: mkdir d0/d8/d43/df/d8b/dbf/de5 0 2026-03-10T07:51:00.914 INFO:tasks.workunit.client.0.vm05.stdout:7/677: unlink d1/d6/d3b/f42 0 2026-03-10T07:51:00.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:51:00 vm08.local ceph-mon[59917]: from='client.14680 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T07:51:00.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:51:00 vm08.local ceph-mon[59917]: from='client.? 
192.168.123.105:0/1530116858' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T07:51:00.919 INFO:tasks.workunit.client.0.vm05.stdout:6/666: symlink d0/d11/d2e/d81/d92/dc2/lce 0 2026-03-10T07:51:00.930 INFO:tasks.workunit.client.0.vm05.stdout:4/715: write d0/d6/d9/d12/d45/d55/f19 [1392148,23238] 0 2026-03-10T07:51:00.936 INFO:tasks.workunit.client.0.vm05.stdout:8/575: write d1/dd/d18/d20/d2a/d48/d7c/d9c/da4/fa2 [820511,11889] 0 2026-03-10T07:51:00.943 INFO:tasks.workunit.client.0.vm05.stdout:0/647: fsync d8/dd/d10/f19 0 2026-03-10T07:51:00.949 INFO:tasks.workunit.client.0.vm05.stdout:4/716: sync 2026-03-10T07:51:00.949 INFO:tasks.workunit.client.0.vm05.stdout:8/576: sync 2026-03-10T07:51:00.969 INFO:tasks.workunit.client.0.vm05.stdout:9/610: dwrite d8/d35/d22/d33/d47/f5f [0,4194304] 0 2026-03-10T07:51:00.970 INFO:tasks.workunit.client.0.vm05.stdout:9/611: chown d8/d35/d1c/d20/d59/d8b/c7e 5332 1 2026-03-10T07:51:00.989 INFO:tasks.workunit.client.0.vm05.stdout:4/717: dread d0/d6/d9/d12/d9c/db7/da7/f53 [0,4194304] 0 2026-03-10T07:51:00.996 INFO:tasks.workunit.client.0.vm05.stdout:7/678: symlink d1/d34/d59/d60/d8c/ld0 0 2026-03-10T07:51:01.008 INFO:tasks.workunit.client.0.vm05.stdout:6/667: rename d0/d35/d36/fac to d0/d35/d36/db8/fcf 0 2026-03-10T07:51:01.008 INFO:tasks.workunit.client.0.vm05.stdout:6/668: stat d0/f23 0 2026-03-10T07:51:01.016 INFO:tasks.workunit.client.0.vm05.stdout:5/667: creat d2/d20/d33/fe0 x:0 0 0 2026-03-10T07:51:01.037 INFO:tasks.workunit.client.0.vm05.stdout:3/630: write d8/d16/f1a [2160115,79411] 0 2026-03-10T07:51:01.041 INFO:tasks.workunit.client.0.vm05.stdout:4/718: creat d0/d6/d9/d5a/ff1 x:0 0 0 2026-03-10T07:51:01.062 INFO:tasks.workunit.client.0.vm05.stdout:6/669: dread d0/d6/f16 [0,4194304] 0 2026-03-10T07:51:01.063 INFO:tasks.workunit.client.0.vm05.stdout:3/631: unlink d8/d1c/d64/l9b 0 2026-03-10T07:51:01.071 INFO:tasks.workunit.client.0.vm05.stdout:6/670: dread d0/d35/d36/f5b [4194304,4194304] 0 
2026-03-10T07:51:01.081 INFO:tasks.workunit.client.0.vm05.stdout:1/663: truncate da/d26/d2b/d89/fb1 125098 0 2026-03-10T07:51:01.082 INFO:tasks.workunit.client.0.vm05.stdout:2/697: dwrite d0/d8/d43/df/f21 [4194304,4194304] 0 2026-03-10T07:51:01.096 INFO:tasks.workunit.client.0.vm05.stdout:0/648: write d8/dd/d10/d26/d2a/fab [2407153,395] 0 2026-03-10T07:51:01.123 INFO:tasks.workunit.client.0.vm05.stdout:1/664: dread da/d26/d2b/d71/f97 [0,4194304] 0 2026-03-10T07:51:01.123 INFO:tasks.workunit.client.0.vm05.stdout:1/665: chown da/d26/d2b/d71/f7d 127047880 1 2026-03-10T07:51:01.152 INFO:tasks.workunit.client.0.vm05.stdout:8/577: dwrite d1/d45/f81 [0,4194304] 0 2026-03-10T07:51:01.162 INFO:tasks.workunit.client.0.vm05.stdout:4/719: dwrite d0/d6/d9/d12/d9c/db7/da7/f53 [0,4194304] 0 2026-03-10T07:51:01.165 INFO:tasks.workunit.client.0.vm05.stdout:5/668: truncate d2/d12/dda/da1/faa 3669935 0 2026-03-10T07:51:01.179 INFO:tasks.workunit.client.0.vm05.stdout:3/632: creat d8/d1f/d2a/d96/fd4 x:0 0 0 2026-03-10T07:51:01.181 INFO:tasks.workunit.client.0.vm05.stdout:8/578: sync 2026-03-10T07:51:01.183 INFO:tasks.workunit.client.0.vm05.stdout:9/612: link d8/d35/d22/c30 d8/d35/d1c/d20/cc8 0 2026-03-10T07:51:01.185 INFO:tasks.workunit.client.0.vm05.stdout:3/633: dwrite d8/d22/d60/f8e [0,4194304] 0 2026-03-10T07:51:01.187 INFO:tasks.workunit.client.0.vm05.stdout:7/679: rename d1/d6/fb to d1/d34/d59/fd1 0 2026-03-10T07:51:01.196 INFO:tasks.workunit.client.0.vm05.stdout:0/649: fsync d8/fc 0 2026-03-10T07:51:01.196 INFO:tasks.workunit.client.0.vm05.stdout:1/666: fdatasync da/d26/d2b/daf/dbe/dc0/d8f/fc5 0 2026-03-10T07:51:01.197 INFO:tasks.workunit.client.0.vm05.stdout:0/650: stat d8/dd/d10/d26/d2a 0 2026-03-10T07:51:01.197 INFO:tasks.workunit.client.0.vm05.stdout:1/667: stat da/d26/d2b/daf/dbe/dc0/c44 0 2026-03-10T07:51:01.199 INFO:tasks.workunit.client.0.vm05.stdout:4/720: rmdir d0/d6/d9/d12/d45/d55/d4e 39 2026-03-10T07:51:01.209 INFO:tasks.workunit.client.0.vm05.stdout:5/669: chown 
d2/d20/c37 480 1 2026-03-10T07:51:01.209 INFO:tasks.workunit.client.0.vm05.stdout:4/721: dwrite d0/d6/d9/d12/d9c/db7/da7/f53 [4194304,4194304] 0 2026-03-10T07:51:01.212 INFO:tasks.workunit.client.0.vm05.stdout:4/722: readlink d0/d6/d9/d12/d45/d55/d44/d85/l8e 0 2026-03-10T07:51:01.212 INFO:tasks.workunit.client.0.vm05.stdout:6/671: mknod d0/d11/d4f/cd0 0 2026-03-10T07:51:01.215 INFO:tasks.workunit.client.0.vm05.stdout:6/672: dread d0/d6/f1a [0,4194304] 0 2026-03-10T07:51:01.223 INFO:tasks.workunit.client.0.vm05.stdout:9/613: creat d8/d35/d1c/d20/d54/fc9 x:0 0 0 2026-03-10T07:51:01.227 INFO:tasks.workunit.client.0.vm05.stdout:8/579: rename d1/dd/d18/d20/d2a/d48/f65 to d1/dd/d18/d20/d2a/d34/da5/fbc 0 2026-03-10T07:51:01.229 INFO:tasks.workunit.client.0.vm05.stdout:0/651: mknod d8/dd/d10/d26/d8b/ce0 0 2026-03-10T07:51:01.230 INFO:tasks.workunit.client.0.vm05.stdout:0/652: chown d8/dd/d10/db7 1080 1 2026-03-10T07:51:01.236 INFO:tasks.workunit.client.0.vm05.stdout:6/673: mknod d0/d11/d57/d66/cd1 0 2026-03-10T07:51:01.237 INFO:tasks.workunit.client.0.vm05.stdout:1/668: dread da/d26/d2b/f45 [0,4194304] 0 2026-03-10T07:51:01.237 INFO:tasks.workunit.client.0.vm05.stdout:6/674: chown d0/d11/d2e/f30 298 1 2026-03-10T07:51:01.237 INFO:tasks.workunit.client.0.vm05.stdout:6/675: stat d0/d35/d36/d43/d9c/fc5 0 2026-03-10T07:51:01.238 INFO:tasks.workunit.client.0.vm05.stdout:6/676: truncate d0/d35/fb2 940705 0 2026-03-10T07:51:01.254 INFO:tasks.workunit.client.0.vm05.stdout:9/614: creat d8/d35/d1c/d20/d59/fca x:0 0 0 2026-03-10T07:51:01.257 INFO:tasks.workunit.client.0.vm05.stdout:3/634: mkdir d8/dd5 0 2026-03-10T07:51:01.267 INFO:tasks.workunit.client.0.vm05.stdout:4/723: mkdir d0/d6/d60/dde/df2 0 2026-03-10T07:51:01.271 INFO:tasks.workunit.client.0.vm05.stdout:2/698: dwrite d0/d8/d43/d38/fc4 [0,4194304] 0 2026-03-10T07:51:01.271 INFO:tasks.workunit.client.0.vm05.stdout:2/699: fsync d0/d8/dc6/fdc 0 2026-03-10T07:51:01.272 INFO:tasks.workunit.client.0.vm05.stdout:2/700: chown 
d0/d8/d43/f1f 470115 1 2026-03-10T07:51:01.284 INFO:tasks.workunit.client.0.vm05.stdout:9/615: creat d8/d86/d95/fcb x:0 0 0 2026-03-10T07:51:01.284 INFO:tasks.workunit.client.0.vm05.stdout:7/680: link d1/d3c/d71/fb1 d1/d3c/d71/fd2 0 2026-03-10T07:51:01.288 INFO:tasks.workunit.client.0.vm05.stdout:8/580: truncate d1/f15 162296 0 2026-03-10T07:51:01.292 INFO:tasks.workunit.client.0.vm05.stdout:1/669: dread da/dd/d2a/d55/d68/f36 [0,4194304] 0 2026-03-10T07:51:01.293 INFO:tasks.workunit.client.0.vm05.stdout:1/670: readlink da/dd/d2a/d55/d68/l78 0 2026-03-10T07:51:01.296 INFO:tasks.workunit.client.0.vm05.stdout:1/671: dwrite da/d26/d2b/daf/fb6 [0,4194304] 0 2026-03-10T07:51:01.316 INFO:tasks.workunit.client.0.vm05.stdout:9/616: creat d8/d35/d22/d33/d62/d6d/d9e/fcc x:0 0 0 2026-03-10T07:51:01.316 INFO:tasks.workunit.client.0.vm05.stdout:5/670: link d2/d12/da8/c76 d2/d20/d77/ce1 0 2026-03-10T07:51:01.320 INFO:tasks.workunit.client.0.vm05.stdout:4/724: rename d0/d6/d9/d12/d9c/db7/fb2 to d0/d6/d9/d12/d9c/db7/db1/ff3 0 2026-03-10T07:51:01.320 INFO:tasks.workunit.client.0.vm05.stdout:4/725: fsync d0/d6/d60/fe9 0 2026-03-10T07:51:01.342 INFO:tasks.workunit.client.0.vm05.stdout:7/681: mknod d1/d6/d3b/d7f/cd3 0 2026-03-10T07:51:01.344 INFO:tasks.workunit.client.0.vm05.stdout:9/617: read d8/d35/d22/f3f [861464,56743] 0 2026-03-10T07:51:01.346 INFO:tasks.workunit.client.0.vm05.stdout:6/677: truncate d0/d11/f21 3947395 0 2026-03-10T07:51:01.347 INFO:tasks.workunit.client.0.vm05.stdout:0/653: write d8/dd/f59 [457726,40300] 0 2026-03-10T07:51:01.349 INFO:tasks.workunit.client.0.vm05.stdout:3/635: truncate d8/d16/f1a 3739619 0 2026-03-10T07:51:01.349 INFO:tasks.workunit.client.0.vm05.stdout:0/654: write d8/dd/d10/d26/d3a/d5e/f7b [528438,102206] 0 2026-03-10T07:51:01.354 INFO:tasks.workunit.client.0.vm05.stdout:2/701: write d0/d8/d66/dd1/d49/d81/dd5/fe2 [126548,124269] 0 2026-03-10T07:51:01.357 INFO:tasks.workunit.client.0.vm05.stdout:8/581: mkdir d1/dd/d18/d20/d2a/d34/d49/db7/dbd 0 
2026-03-10T07:51:01.358 INFO:tasks.workunit.client.0.vm05.stdout:1/672: dwrite da/d26/d2b/daf/dbe/dc0/f6e [0,4194304] 0 2026-03-10T07:51:01.361 INFO:tasks.workunit.client.0.vm05.stdout:1/673: chown da/dd/d2a/f93 23690 1 2026-03-10T07:51:01.361 INFO:tasks.workunit.client.0.vm05.stdout:4/726: dread - d0/d6/da6/fcb zero size 2026-03-10T07:51:01.362 INFO:tasks.workunit.client.0.vm05.stdout:9/618: symlink d8/d35/d22/d33/d47/lcd 0 2026-03-10T07:51:01.365 INFO:tasks.workunit.client.0.vm05.stdout:9/619: write d8/d35/d22/d33/d62/f9a [1518119,79152] 0 2026-03-10T07:51:01.365 INFO:tasks.workunit.client.0.vm05.stdout:3/636: fsync d8/d16/d19/d37/f7c 0 2026-03-10T07:51:01.377 INFO:tasks.workunit.client.0.vm05.stdout:7/682: write d1/d6/d3b/d7f/fa2 [271143,5468] 0 2026-03-10T07:51:01.380 INFO:tasks.workunit.client.0.vm05.stdout:0/655: creat d8/dd/d10/d26/d3a/fe1 x:0 0 0 2026-03-10T07:51:01.385 INFO:tasks.workunit.client.0.vm05.stdout:5/671: symlink d2/dd7/le2 0 2026-03-10T07:51:01.387 INFO:tasks.workunit.client.0.vm05.stdout:5/672: write d2/d12/f5a [2584370,8140] 0 2026-03-10T07:51:01.387 INFO:tasks.workunit.client.0.vm05.stdout:8/582: creat d1/dd/d18/d20/d2a/d34/d49/db7/fbe x:0 0 0 2026-03-10T07:51:01.390 INFO:tasks.workunit.client.0.vm05.stdout:4/727: creat d0/d6/d9/d8c/dbe/ff4 x:0 0 0 2026-03-10T07:51:01.399 INFO:tasks.workunit.client.0.vm05.stdout:1/674: sync 2026-03-10T07:51:01.399 INFO:tasks.workunit.client.0.vm05.stdout:7/683: truncate d1/f3a 3736319 0 2026-03-10T07:51:01.399 INFO:tasks.workunit.client.0.vm05.stdout:2/702: fsync d0/d8/f1c 0 2026-03-10T07:51:01.399 INFO:tasks.workunit.client.0.vm05.stdout:5/673: dwrite d2/d12/d4d/f84 [0,4194304] 0 2026-03-10T07:51:01.404 INFO:tasks.workunit.client.0.vm05.stdout:2/703: truncate d0/d8/d66/dd1/d49/fd3 22872 0 2026-03-10T07:51:01.407 INFO:tasks.workunit.client.0.vm05.stdout:2/704: dwrite d0/d8/d43/df/f21 [0,4194304] 0 2026-03-10T07:51:01.417 INFO:tasks.workunit.client.0.vm05.stdout:2/705: dwrite d0/d8/d3d/d7d/da5/fb0 
[8388608,4194304] 0 2026-03-10T07:51:01.420 INFO:tasks.workunit.client.0.vm05.stdout:2/706: write d0/fe3 [895424,74137] 0 2026-03-10T07:51:01.435 INFO:tasks.workunit.client.0.vm05.stdout:4/728: fdatasync d0/d6/d37/f46 0 2026-03-10T07:51:01.462 INFO:tasks.workunit.client.0.vm05.stdout:8/583: dread - d1/d6f/f85 zero size 2026-03-10T07:51:01.464 INFO:tasks.workunit.client.0.vm05.stdout:6/678: getdents d0/d11/d2e/d81/d92/dc2 0 2026-03-10T07:51:01.466 INFO:tasks.workunit.client.0.vm05.stdout:5/674: creat d2/d20/d4c/fe3 x:0 0 0 2026-03-10T07:51:01.467 INFO:tasks.workunit.client.0.vm05.stdout:3/637: dwrite d8/d1f/d24/d8a/fa3 [0,4194304] 0 2026-03-10T07:51:01.484 INFO:tasks.workunit.client.0.vm05.stdout:5/675: dread d2/d5/d61/f65 [0,4194304] 0 2026-03-10T07:51:01.489 INFO:tasks.workunit.client.0.vm05.stdout:5/676: dwrite d2/d20/d33/d86/dac/dc1/fc9 [0,4194304] 0 2026-03-10T07:51:01.490 INFO:tasks.workunit.client.0.vm05.stdout:4/729: read d0/f2 [337513,124207] 0 2026-03-10T07:51:01.490 INFO:tasks.workunit.client.0.vm05.stdout:4/730: chown d0/d6/d95/fad 27070 1 2026-03-10T07:51:01.491 INFO:tasks.workunit.client.0.vm05.stdout:4/731: chown d0/d6/d9/d12/d9c/db7/db1 14928 1 2026-03-10T07:51:01.495 INFO:tasks.workunit.client.0.vm05.stdout:9/620: link d8/d86/d95/la4 d8/d35/d22/d33/d62/d6d/lce 0 2026-03-10T07:51:01.498 INFO:tasks.workunit.client.0.vm05.stdout:1/675: mkdir da/d26/d2b/dcb 0 2026-03-10T07:51:01.502 INFO:tasks.workunit.client.0.vm05.stdout:6/679: mkdir d0/d35/d36/dd2 0 2026-03-10T07:51:01.523 INFO:tasks.workunit.client.0.vm05.stdout:2/707: write d0/f7 [1828130,43085] 0 2026-03-10T07:51:01.524 INFO:tasks.workunit.client.0.vm05.stdout:2/708: dread - d0/d8/dc6/fdc zero size 2026-03-10T07:51:01.535 INFO:tasks.workunit.client.0.vm05.stdout:1/676: mkdir da/d26/d9e/dcc 0 2026-03-10T07:51:01.537 INFO:tasks.workunit.client.0.vm05.stdout:8/584: rename d1/d73 to d1/d45/d90/dbf 0 2026-03-10T07:51:01.537 INFO:tasks.workunit.client.0.vm05.stdout:6/680: mknod 
d0/d11/d4f/d56/d96/db6/cd3 0 2026-03-10T07:51:01.537 INFO:tasks.workunit.client.0.vm05.stdout:1/677: dread - da/d26/d2b/daf/dbe/f91 zero size 2026-03-10T07:51:01.539 INFO:tasks.workunit.client.0.vm05.stdout:6/681: read d0/d11/d4f/d56/f6f [3953560,11365] 0 2026-03-10T07:51:01.551 INFO:tasks.workunit.client.0.vm05.stdout:7/684: mknod d1/d6/d47/d8d/cd4 0 2026-03-10T07:51:01.555 INFO:tasks.workunit.client.0.vm05.stdout:9/621: mkdir d8/d35/d38/d71/d81/dcf 0 2026-03-10T07:51:01.559 INFO:tasks.workunit.client.0.vm05.stdout:6/682: creat d0/d11/d22/d6c/d84/fd4 x:0 0 0 2026-03-10T07:51:01.559 INFO:tasks.workunit.client.0.vm05.stdout:0/656: link d8/dd/d10/d26/d8b/da4/f5b d8/dd/d10/d26/d3a/d5e/fe2 0 2026-03-10T07:51:01.564 INFO:tasks.workunit.client.0.vm05.stdout:3/638: dwrite d8/d1f/f49 [4194304,4194304] 0 2026-03-10T07:51:01.573 INFO:tasks.workunit.client.0.vm05.stdout:4/732: dwrite d0/d6/d9/d12/d9c/db7/f63 [0,4194304] 0 2026-03-10T07:51:01.585 INFO:tasks.workunit.client.0.vm05.stdout:9/622: readlink d8/lb 0 2026-03-10T07:51:01.587 INFO:tasks.workunit.client.0.vm05.stdout:2/709: dwrite d0/d52/f88 [0,4194304] 0 2026-03-10T07:51:01.592 INFO:tasks.workunit.client.0.vm05.stdout:2/710: chown d0/d8/d3d/d7d/da5/fd2 53080 1 2026-03-10T07:51:01.602 INFO:tasks.workunit.client.0.vm05.stdout:5/677: write d2/f9 [172333,33940] 0 2026-03-10T07:51:01.602 INFO:tasks.workunit.client.0.vm05.stdout:1/678: write da/dd/f7b [162079,59513] 0 2026-03-10T07:51:01.603 INFO:tasks.workunit.client.0.vm05.stdout:5/678: chown d2/d5/f75 0 1 2026-03-10T07:51:01.612 INFO:tasks.workunit.client.0.vm05.stdout:0/657: rmdir d8/dd/d37/d67/d96 39 2026-03-10T07:51:01.619 INFO:tasks.workunit.client.0.vm05.stdout:3/639: truncate d8/d1f/d24/d8a/f91 199797 0 2026-03-10T07:51:01.619 INFO:tasks.workunit.client.0.vm05.stdout:3/640: fdatasync d8/d22/d60/fce 0 2026-03-10T07:51:01.620 INFO:tasks.workunit.client.0.vm05.stdout:4/733: mknod d0/d6/d9/d12/d69/ddc/cf5 0 2026-03-10T07:51:01.621 
INFO:tasks.workunit.client.0.vm05.stdout:4/734: stat d0/d6/d9/f83 0 2026-03-10T07:51:01.632 INFO:tasks.workunit.client.0.vm05.stdout:9/623: dwrite d8/d35/d1c/d20/f32 [0,4194304] 0 2026-03-10T07:51:01.658 INFO:tasks.workunit.client.0.vm05.stdout:2/711: rename d0/d8/d66/dd1/la7 to d0/d8/d43/df/d8b/dbf/le6 0 2026-03-10T07:51:01.659 INFO:tasks.workunit.client.0.vm05.stdout:5/679: unlink d2/d20/d33/d86/ca3 0 2026-03-10T07:51:01.659 INFO:tasks.workunit.client.0.vm05.stdout:5/680: write d2/d12/f40 [1178587,32176] 0 2026-03-10T07:51:01.659 INFO:tasks.workunit.client.0.vm05.stdout:8/585: symlink d1/dd/d18/d20/d2a/d48/d7c/d9c/lc0 0 2026-03-10T07:51:01.659 INFO:tasks.workunit.client.0.vm05.stdout:0/658: symlink d8/dd/d37/d56/d4d/le3 0 2026-03-10T07:51:01.659 INFO:tasks.workunit.client.0.vm05.stdout:8/586: stat d1/dd/d4d/d64/d8f 0 2026-03-10T07:51:01.659 INFO:tasks.workunit.client.0.vm05.stdout:1/679: symlink da/d26/d9e/dcc/lcd 0 2026-03-10T07:51:01.659 INFO:tasks.workunit.client.0.vm05.stdout:1/680: readlink da/dd/d2a/l53 0 2026-03-10T07:51:01.669 INFO:tasks.workunit.client.0.vm05.stdout:2/712: symlink d0/d52/le7 0 2026-03-10T07:51:01.684 INFO:tasks.workunit.client.0.vm05.stdout:2/713: chown d0/d8/fcd 4794269 1 2026-03-10T07:51:01.687 INFO:tasks.workunit.client.0.vm05.stdout:6/683: getdents d0/d11/d22 0 2026-03-10T07:51:01.689 INFO:tasks.workunit.client.0.vm05.stdout:5/681: mknod d2/d12/dda/da1/dc0/ce4 0 2026-03-10T07:51:01.691 INFO:tasks.workunit.client.0.vm05.stdout:8/587: symlink d1/dd/d18/d20/d2a/lc1 0 2026-03-10T07:51:01.710 INFO:tasks.workunit.client.0.vm05.stdout:2/714: truncate d0/d8/d43/da4/fc0 710963 0 2026-03-10T07:51:01.713 INFO:tasks.workunit.client.0.vm05.stdout:5/682: symlink d2/dd7/le5 0 2026-03-10T07:51:01.713 INFO:tasks.workunit.client.0.vm05.stdout:5/683: stat d2/d20/d4c/fb1 0 2026-03-10T07:51:01.716 INFO:tasks.workunit.client.0.vm05.stdout:4/735: getdents d0/d6/d60 0 2026-03-10T07:51:01.716 INFO:tasks.workunit.client.0.vm05.stdout:5/684: truncate 
d2/d20/d33/d53/d7d/fb4 692601 0 2026-03-10T07:51:01.719 INFO:tasks.workunit.client.0.vm05.stdout:0/659: dread d8/fb [0,4194304] 0 2026-03-10T07:51:01.723 INFO:tasks.workunit.client.0.vm05.stdout:8/588: sync 2026-03-10T07:51:01.733 INFO:tasks.workunit.client.0.vm05.stdout:7/685: dwrite d1/d34/d59/fd1 [4194304,4194304] 0 2026-03-10T07:51:01.738 INFO:tasks.workunit.client.0.vm05.stdout:7/686: sync 2026-03-10T07:51:01.738 INFO:tasks.workunit.client.0.vm05.stdout:7/687: readlink d1/l6b 0 2026-03-10T07:51:01.743 INFO:tasks.workunit.client.0.vm05.stdout:3/641: dwrite d8/d22/d60/d6e/f97 [0,4194304] 0 2026-03-10T07:51:01.749 INFO:tasks.workunit.client.0.vm05.stdout:1/681: getdents da 0 2026-03-10T07:51:01.757 INFO:tasks.workunit.client.0.vm05.stdout:9/624: dwrite d8/d35/d1c/d20/f9f [0,4194304] 0 2026-03-10T07:51:01.757 INFO:tasks.workunit.client.0.vm05.stdout:5/685: dread d2/d20/d33/d53/d7d/f82 [0,4194304] 0 2026-03-10T07:51:01.757 INFO:tasks.workunit.client.0.vm05.stdout:3/642: sync 2026-03-10T07:51:01.775 INFO:tasks.workunit.client.0.vm05.stdout:6/684: creat d0/d11/d4f/fd5 x:0 0 0 2026-03-10T07:51:01.775 INFO:tasks.workunit.client.0.vm05.stdout:2/715: read d0/d2a/f2e [1835233,56350] 0 2026-03-10T07:51:01.777 INFO:tasks.workunit.client.0.vm05.stdout:4/736: fdatasync d0/d6/d9/d12/d69/dc7/ded/ff0 0 2026-03-10T07:51:01.790 INFO:tasks.workunit.client.0.vm05.stdout:2/716: dread d0/fe3 [0,4194304] 0 2026-03-10T07:51:01.795 INFO:tasks.workunit.client.0.vm05.stdout:5/686: unlink d2/d5/l49 0 2026-03-10T07:51:01.797 INFO:tasks.workunit.client.0.vm05.stdout:9/625: chown d8/d35/d38/lc4 932 1 2026-03-10T07:51:01.808 INFO:tasks.workunit.client.0.vm05.stdout:0/660: write d8/dd/d10/d26/d8b/da4/f9e [798056,88462] 0 2026-03-10T07:51:01.810 INFO:tasks.workunit.client.0.vm05.stdout:0/661: fsync d8/dd/d10/d26/d2a/d6f/faa 0 2026-03-10T07:51:01.814 INFO:tasks.workunit.client.0.vm05.stdout:8/589: dwrite d1/d45/f53 [0,4194304] 0 2026-03-10T07:51:01.817 
INFO:tasks.workunit.client.0.vm05.stdout:7/688: dwrite d1/fa4 [4194304,4194304] 0 2026-03-10T07:51:01.827 INFO:tasks.workunit.client.0.vm05.stdout:7/689: dwrite d1/d6/d80/fb9 [0,4194304] 0 2026-03-10T07:51:01.831 INFO:tasks.workunit.client.0.vm05.stdout:7/690: readlink d1/d6/d80/d82/l83 0 2026-03-10T07:51:01.841 INFO:tasks.workunit.client.0.vm05.stdout:3/643: write d8/d1c/f9d [1030260,37185] 0 2026-03-10T07:51:01.842 INFO:tasks.workunit.client.0.vm05.stdout:6/685: truncate d0/d6/d3b/f55 182890 0 2026-03-10T07:51:01.842 INFO:tasks.workunit.client.0.vm05.stdout:9/626: readlink d8/d86/d95/la4 0 2026-03-10T07:51:01.848 INFO:tasks.workunit.client.0.vm05.stdout:8/590: readlink d1/dd/d18/d20/d2a/d34/l82 0 2026-03-10T07:51:01.848 INFO:tasks.workunit.client.0.vm05.stdout:7/691: rename d1/d3c/d71/d79/d8a/f90 to d1/d5b/fd5 0 2026-03-10T07:51:01.850 INFO:tasks.workunit.client.0.vm05.stdout:0/662: symlink d8/dd/d10/d26/d8b/da4/ddf/le4 0 2026-03-10T07:51:01.853 INFO:tasks.workunit.client.0.vm05.stdout:6/686: mkdir d0/d11/d2e/d81/dd6 0 2026-03-10T07:51:01.853 INFO:tasks.workunit.client.0.vm05.stdout:1/682: link da/dd/d2a/f54 da/dd/d12/d34/d58/fce 0 2026-03-10T07:51:01.853 INFO:tasks.workunit.client.0.vm05.stdout:2/717: creat d0/d8/d66/dd1/fe8 x:0 0 0 2026-03-10T07:51:01.854 INFO:tasks.workunit.client.0.vm05.stdout:1/683: readlink da/dd/d2a/l49 0 2026-03-10T07:51:01.858 INFO:tasks.workunit.client.0.vm05.stdout:8/591: symlink d1/d45/d90/lc2 0 2026-03-10T07:51:01.859 INFO:tasks.workunit.client.0.vm05.stdout:5/687: link d2/d12/d4d/ldc d2/d12/da8/ddd/le6 0 2026-03-10T07:51:01.860 INFO:tasks.workunit.client.0.vm05.stdout:5/688: fdatasync d2/d20/d4c/fc4 0 2026-03-10T07:51:01.866 INFO:tasks.workunit.client.0.vm05.stdout:0/663: mknod d8/dd/d37/d56/d4d/ce5 0 2026-03-10T07:51:01.870 INFO:tasks.workunit.client.0.vm05.stdout:3/644: link d8/d16/f9c d8/d1f/d2a/d96/da9/fd6 0 2026-03-10T07:51:01.873 INFO:tasks.workunit.client.0.vm05.stdout:2/718: creat d0/d8/d66/dd1/d49/dab/fe9 x:0 0 0 
2026-03-10T07:51:01.873 INFO:tasks.workunit.client.0.vm05.stdout:2/719: chown d0/d8/d3d/d7d/db2 255183275 1 2026-03-10T07:51:01.877 INFO:tasks.workunit.client.0.vm05.stdout:1/684: mknod da/d26/d2b/daf/dbe/ccf 0 2026-03-10T07:51:01.878 INFO:tasks.workunit.client.0.vm05.stdout:8/592: fsync d1/dd/d4d/f8a 0 2026-03-10T07:51:01.878 INFO:tasks.workunit.client.0.vm05.stdout:7/692: fdatasync d1/d34/f3e 0 2026-03-10T07:51:01.878 INFO:tasks.workunit.client.0.vm05.stdout:3/645: symlink d8/d1f/d2a/d34/ld7 0 2026-03-10T07:51:01.881 INFO:tasks.workunit.client.0.vm05.stdout:3/646: chown d8/d1f/d2a/d96/da9/fb5 611314265 1 2026-03-10T07:51:01.884 INFO:tasks.workunit.client.0.vm05.stdout:5/689: dread d2/d5/f3c [0,4194304] 0 2026-03-10T07:51:01.888 INFO:tasks.workunit.client.0.vm05.stdout:2/720: dwrite d0/d8/d66/dd1/d49/d81/dd5/fe2 [0,4194304] 0 2026-03-10T07:51:01.900 INFO:tasks.workunit.client.0.vm05.stdout:5/690: sync 2026-03-10T07:51:01.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:51:01 vm05.local ceph-mon[50387]: from='client.14684 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T07:51:01.911 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:51:01 vm05.local ceph-mon[50387]: pgmap v20: 65 pgs: 65 active+clean; 1.4 GiB data, 4.7 GiB used, 115 GiB / 120 GiB avail; 44 MiB/s rd, 101 MiB/s wr, 265 op/s 2026-03-10T07:51:01.911 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:51:01 vm05.local ceph-mon[50387]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' 2026-03-10T07:51:01.911 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:51:01 vm05.local ceph-mon[50387]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' 2026-03-10T07:51:01.911 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:51:01 vm05.local ceph-mon[50387]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 
2026-03-10T07:51:01.911 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:51:01 vm05.local ceph-mon[50387]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T07:51:01.911 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:51:01 vm05.local ceph-mon[50387]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' 2026-03-10T07:51:01.913 INFO:tasks.workunit.client.0.vm05.stdout:0/664: link d8/dd/d37/f4f d8/dd/d10/d26/d3a/d5e/d63/fe6 0 2026-03-10T07:51:01.913 INFO:tasks.workunit.client.0.vm05.stdout:6/687: link d0/d11/d57/f5c d0/d6/d3b/fd7 0 2026-03-10T07:51:01.913 INFO:tasks.workunit.client.0.vm05.stdout:3/647: mknod d8/d16/dac/cd8 0 2026-03-10T07:51:01.917 INFO:tasks.workunit.client.0.vm05.stdout:6/688: write d0/d6/f45 [1959619,49394] 0 2026-03-10T07:51:01.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:51:01 vm08.local ceph-mon[59917]: from='client.14684 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T07:51:01.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:51:01 vm08.local ceph-mon[59917]: pgmap v20: 65 pgs: 65 active+clean; 1.4 GiB data, 4.7 GiB used, 115 GiB / 120 GiB avail; 44 MiB/s rd, 101 MiB/s wr, 265 op/s 2026-03-10T07:51:01.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:51:01 vm08.local ceph-mon[59917]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' 2026-03-10T07:51:01.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:51:01 vm08.local ceph-mon[59917]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' 2026-03-10T07:51:01.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:51:01 vm08.local ceph-mon[59917]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T07:51:01.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 
07:51:01 vm08.local ceph-mon[59917]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T07:51:01.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:51:01 vm08.local ceph-mon[59917]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' 2026-03-10T07:51:01.920 INFO:tasks.workunit.client.0.vm05.stdout:7/693: dread d1/f37 [0,4194304] 0 2026-03-10T07:51:01.926 INFO:tasks.workunit.client.0.vm05.stdout:8/593: creat d1/dd/d5e/d9e/fc3 x:0 0 0 2026-03-10T07:51:01.942 INFO:tasks.workunit.client.0.vm05.stdout:4/737: truncate d0/d6/d9/d5a/f58 6382473 0 2026-03-10T07:51:01.942 INFO:tasks.workunit.client.0.vm05.stdout:3/648: mknod d8/d1f/d24/d76/cd9 0 2026-03-10T07:51:01.943 INFO:tasks.workunit.client.0.vm05.stdout:4/738: chown d0/d6/d9/d8c/fc1 25252599 1 2026-03-10T07:51:01.953 INFO:tasks.workunit.client.0.vm05.stdout:3/649: fsync d8/d1c/f63 0 2026-03-10T07:51:01.975 INFO:tasks.workunit.client.0.vm05.stdout:1/685: link da/d26/d2b/daf/dbe/dc0/fb7 da/dd/d12/d86/d9a/fd0 0 2026-03-10T07:51:01.975 INFO:tasks.workunit.client.0.vm05.stdout:9/627: write d8/f15 [4167663,8429] 0 2026-03-10T07:51:01.981 INFO:tasks.workunit.client.0.vm05.stdout:8/594: truncate d1/dd/d18/d20/d2a/d48/f50 1362862 0 2026-03-10T07:51:01.984 INFO:tasks.workunit.client.0.vm05.stdout:3/650: read d8/d16/d52/f9f [2698305,82965] 0 2026-03-10T07:51:01.987 INFO:tasks.workunit.client.0.vm05.stdout:7/694: rename d1/d6/d3b/fa1 to d1/d6/d47/fd6 0 2026-03-10T07:51:02.001 INFO:tasks.workunit.client.0.vm05.stdout:8/595: fdatasync d1/dd/f17 0 2026-03-10T07:51:02.002 INFO:tasks.workunit.client.0.vm05.stdout:7/695: dwrite d1/d3c/d71/d79/d8a/fad [0,4194304] 0 2026-03-10T07:51:02.011 INFO:tasks.workunit.client.0.vm05.stdout:0/665: getdents d8 0 2026-03-10T07:51:02.011 INFO:tasks.workunit.client.0.vm05.stdout:2/721: write d0/d8/d43/df/d8b/f8f [1417390,32903] 0 2026-03-10T07:51:02.015 
INFO:tasks.workunit.client.0.vm05.stdout:6/689: getdents d0/d11/d22/d6c/d84/dc4 0 2026-03-10T07:51:02.015 INFO:tasks.workunit.client.0.vm05.stdout:5/691: write d2/d20/d4c/d64/f96 [1897700,35894] 0 2026-03-10T07:51:02.015 INFO:tasks.workunit.client.0.vm05.stdout:3/651: mkdir d8/d22/d60/d6e/dca/dda 0 2026-03-10T07:51:02.015 INFO:tasks.workunit.client.0.vm05.stdout:4/739: rename d0/d6/d9/d12/d45/fc8 to d0/d6/d9/d5a/d6e/ff6 0 2026-03-10T07:51:02.016 INFO:tasks.workunit.client.0.vm05.stdout:3/652: dread - d8/d1f/d24/d76/fc1 zero size 2026-03-10T07:51:02.026 INFO:tasks.workunit.client.0.vm05.stdout:0/666: mkdir d8/dd/d10/d26/d8b/da4/de7 0 2026-03-10T07:51:02.034 INFO:tasks.workunit.client.0.vm05.stdout:5/692: mkdir d2/d12/d2d/d4a/de7 0 2026-03-10T07:51:02.036 INFO:tasks.workunit.client.0.vm05.stdout:2/722: sync 2026-03-10T07:51:02.042 INFO:tasks.workunit.client.0.vm05.stdout:6/690: dwrite d0/d11/d57/d66/fbd [0,4194304] 0 2026-03-10T07:51:02.042 INFO:tasks.workunit.client.0.vm05.stdout:1/686: rename da/dd/d12/d19 to da/dd/d2a/d55/d64/dd1 0 2026-03-10T07:51:02.042 INFO:tasks.workunit.client.0.vm05.stdout:1/687: write da/dd/d12/d34/f38 [3966916,119864] 0 2026-03-10T07:51:02.048 INFO:tasks.workunit.client.0.vm05.stdout:1/688: fdatasync da/d26/d2b/d89/fa7 0 2026-03-10T07:51:02.051 INFO:tasks.workunit.client.0.vm05.stdout:8/596: chown d1/dd/d18/d20/d2a/d34/d49/fa9 15 1 2026-03-10T07:51:02.060 INFO:tasks.workunit.client.0.vm05.stdout:8/597: dwrite d1/dd/d5e/d9e/fc3 [0,4194304] 0 2026-03-10T07:51:02.062 INFO:tasks.workunit.client.0.vm05.stdout:6/691: dread d0/d11/d4f/d56/f6f [0,4194304] 0 2026-03-10T07:51:02.089 INFO:tasks.workunit.client.0.vm05.stdout:5/693: symlink d2/d5/le8 0 2026-03-10T07:51:02.090 INFO:tasks.workunit.client.0.vm05.stdout:5/694: chown d2/d20/d33/l8a 2174 1 2026-03-10T07:51:02.096 INFO:tasks.workunit.client.0.vm05.stdout:2/723: rmdir d0/d8/d3d/d7d 39 2026-03-10T07:51:02.097 INFO:tasks.workunit.client.0.vm05.stdout:2/724: chown d0/d8/d43/df/d4d/l79 44789 1 
2026-03-10T07:51:02.133 INFO:tasks.workunit.client.0.vm05.stdout:1/689: mknod da/dd/d12/d34/cd2 0 2026-03-10T07:51:02.133 INFO:tasks.workunit.client.0.vm05.stdout:1/690: chown da/d26/d2b/d71 801699335 1 2026-03-10T07:51:02.161 INFO:tasks.workunit.client.0.vm05.stdout:8/598: chown d1/l5 40946 1 2026-03-10T07:51:02.161 INFO:tasks.workunit.client.0.vm05.stdout:9/628: getdents d8/d35/d1c/d20/d59/d8b 0 2026-03-10T07:51:02.164 INFO:tasks.workunit.client.0.vm05.stdout:0/667: mknod d8/dd/d37/d56/ce8 0 2026-03-10T07:51:02.167 INFO:tasks.workunit.client.0.vm05.stdout:5/695: rmdir d2/d5/d61 39 2026-03-10T07:51:02.188 INFO:tasks.workunit.client.0.vm05.stdout:7/696: rmdir d1/d6/d47/d8d/da7 0 2026-03-10T07:51:02.191 INFO:tasks.workunit.client.0.vm05.stdout:6/692: dwrite d0/d35/d36/f5b [4194304,4194304] 0 2026-03-10T07:51:02.210 INFO:tasks.workunit.client.0.vm05.stdout:6/693: dread d0/f29 [0,4194304] 0 2026-03-10T07:51:02.210 INFO:tasks.workunit.client.0.vm05.stdout:1/691: symlink da/dd/d2a/ld3 0 2026-03-10T07:51:02.211 INFO:tasks.workunit.client.0.vm05.stdout:9/629: creat d8/d35/d38/fd0 x:0 0 0 2026-03-10T07:51:02.211 INFO:tasks.workunit.client.0.vm05.stdout:2/725: unlink d0/d8/d3d/d7d/db2/f95 0 2026-03-10T07:51:02.213 INFO:tasks.workunit.client.0.vm05.stdout:3/653: getdents d8/d1f/d2a/d34 0 2026-03-10T07:51:02.214 INFO:tasks.workunit.client.0.vm05.stdout:3/654: chown d8/d16/d52/da4 70280902 1 2026-03-10T07:51:02.215 INFO:tasks.workunit.client.0.vm05.stdout:4/740: write d0/d6/d9/d5a/f58 [5190246,4400] 0 2026-03-10T07:51:02.216 INFO:tasks.workunit.client.0.vm05.stdout:4/741: chown d0/d6/d6f/c77 1809811 1 2026-03-10T07:51:02.217 INFO:tasks.workunit.client.0.vm05.stdout:1/692: dread da/dd/d12/d34/f5f [0,4194304] 0 2026-03-10T07:51:02.221 INFO:tasks.workunit.client.0.vm05.stdout:1/693: read da/dd/d2a/d55/d64/f9f [331605,20751] 0 2026-03-10T07:51:02.224 INFO:tasks.workunit.client.0.vm05.stdout:9/630: dread d8/d35/d1c/f3b [0,4194304] 0 2026-03-10T07:51:02.235 
INFO:tasks.workunit.client.0.vm05.stdout:6/694: rename d0/lf to d0/d11/d4f/d56/d96/db6/ld8 0 2026-03-10T07:51:02.240 INFO:tasks.workunit.client.0.vm05.stdout:7/697: mkdir d1/d3c/d71/dd7 0 2026-03-10T07:51:02.242 INFO:tasks.workunit.client.0.vm05.stdout:2/726: mkdir d0/d8/d43/da4/dea 0 2026-03-10T07:51:02.247 INFO:tasks.workunit.client.0.vm05.stdout:4/742: creat d0/d6/d95/ff7 x:0 0 0 2026-03-10T07:51:02.247 INFO:tasks.workunit.client.0.vm05.stdout:2/727: dread d0/d8/d43/d38/f56 [0,4194304] 0 2026-03-10T07:51:02.247 INFO:tasks.workunit.client.0.vm05.stdout:1/694: rmdir da/dd/d2a/d55/d68 39 2026-03-10T07:51:02.256 INFO:tasks.workunit.client.0.vm05.stdout:8/599: link d1/d6f/l8e d1/dd/d18/d20/d2a/d34/da5/db2/lc4 0 2026-03-10T07:51:02.258 INFO:tasks.workunit.client.0.vm05.stdout:5/696: rmdir d2/d5/ddb 0 2026-03-10T07:51:02.259 INFO:tasks.workunit.client.0.vm05.stdout:7/698: mknod d1/d34/cd8 0 2026-03-10T07:51:02.264 INFO:tasks.workunit.client.0.vm05.stdout:4/743: sync 2026-03-10T07:51:02.274 INFO:tasks.workunit.client.0.vm05.stdout:7/699: dread d1/d3c/f89 [0,4194304] 0 2026-03-10T07:51:02.286 INFO:tasks.workunit.client.0.vm05.stdout:0/668: dwrite d8/dd/d10/d26/d8b/d70/fbe [0,4194304] 0 2026-03-10T07:51:02.297 INFO:tasks.workunit.client.0.vm05.stdout:9/631: write d8/d35/d22/d33/d85/f8f [391337,9248] 0 2026-03-10T07:51:02.303 INFO:tasks.workunit.client.0.vm05.stdout:3/655: dwrite d8/d16/f1a [0,4194304] 0 2026-03-10T07:51:02.312 INFO:tasks.workunit.client.0.vm05.stdout:2/728: write d0/d2a/f2e [971810,123503] 0 2026-03-10T07:51:02.312 INFO:tasks.workunit.client.0.vm05.stdout:2/729: dread - d0/d8/d66/dd1/d49/fde zero size 2026-03-10T07:51:02.318 INFO:tasks.workunit.client.0.vm05.stdout:5/697: mkdir d2/d12/da8/ddd/de9 0 2026-03-10T07:51:02.319 INFO:tasks.workunit.client.0.vm05.stdout:5/698: dread d2/d12/d2d/d4a/f59 [0,4194304] 0 2026-03-10T07:51:02.328 INFO:tasks.workunit.client.0.vm05.stdout:0/669: dread d8/f20 [0,4194304] 0 2026-03-10T07:51:02.331 
INFO:tasks.workunit.client.0.vm05.stdout:2/730: creat d0/d8/d43/df/feb x:0 0 0 2026-03-10T07:51:02.332 INFO:tasks.workunit.client.0.vm05.stdout:2/731: chown d0/d8/d43/df/fc2 25 1 2026-03-10T07:51:02.339 INFO:tasks.workunit.client.0.vm05.stdout:5/699: mknod d2/d12/da8/ddd/cea 0 2026-03-10T07:51:02.346 INFO:tasks.workunit.client.0.vm05.stdout:7/700: write d1/f3a [2540173,56578] 0 2026-03-10T07:51:02.348 INFO:tasks.workunit.client.0.vm05.stdout:9/632: rename d8/d35/d22/d33/l4b to d8/d35/d22/ld1 0 2026-03-10T07:51:02.350 INFO:tasks.workunit.client.0.vm05.stdout:0/670: read d8/dd/d10/f6c [1677215,99326] 0 2026-03-10T07:51:02.351 INFO:tasks.workunit.client.0.vm05.stdout:0/671: chown d8/dd/d10/d26/d3a 1300493 1 2026-03-10T07:51:02.352 INFO:tasks.workunit.client.0.vm05.stdout:8/600: creat d1/dd/d4d/d64/fc5 x:0 0 0 2026-03-10T07:51:02.359 INFO:tasks.workunit.client.0.vm05.stdout:2/732: write d0/d8/d3d/d7d/db2/fbc [849851,85435] 0 2026-03-10T07:51:02.363 INFO:tasks.workunit.client.0.vm05.stdout:6/695: getdents d0/d11/d22 0 2026-03-10T07:51:02.370 INFO:tasks.workunit.client.0.vm05.stdout:5/700: dread d2/f15 [0,4194304] 0 2026-03-10T07:51:02.373 INFO:tasks.workunit.client.0.vm05.stdout:5/701: dwrite d2/fad [0,4194304] 0 2026-03-10T07:51:02.381 INFO:tasks.workunit.client.0.vm05.stdout:9/633: dread - d8/d35/d1c/d2c/d63/f64 zero size 2026-03-10T07:51:02.383 INFO:tasks.workunit.client.0.vm05.stdout:1/695: link da/d26/d2b/daf/fb6 da/dd/d12/d34/d58/fd4 0 2026-03-10T07:51:02.384 INFO:tasks.workunit.client.0.vm05.stdout:9/634: read d8/d35/f5d [3005684,85174] 0 2026-03-10T07:51:02.389 INFO:tasks.workunit.client.0.vm05.stdout:3/656: link d8/c38 d8/d1c/d48/cdb 0 2026-03-10T07:51:02.394 INFO:tasks.workunit.client.0.vm05.stdout:3/657: dread d8/d1f/f6c [0,4194304] 0 2026-03-10T07:51:02.397 INFO:tasks.workunit.client.0.vm05.stdout:6/696: rename d0/d11/d4f/fd5 to d0/d11/d4f/da0/da6/fd9 0 2026-03-10T07:51:02.399 INFO:tasks.workunit.client.0.vm05.stdout:4/744: creat d0/d6/d9/d12/d45/ff8 x:0 0 0 
2026-03-10T07:51:02.401 INFO:tasks.workunit.client.0.vm05.stdout:5/702: creat d2/d20/d5b/feb x:0 0 0 2026-03-10T07:51:02.405 INFO:tasks.workunit.client.0.vm05.stdout:5/703: dwrite d2/d20/d4c/fe3 [0,4194304] 0 2026-03-10T07:51:02.407 INFO:tasks.workunit.client.0.vm05.stdout:5/704: chown d2/d12/dda/da1/dc0/ld1 723 1 2026-03-10T07:51:02.422 INFO:tasks.workunit.client.0.vm05.stdout:7/701: dwrite d1/d34/f7d [0,4194304] 0 2026-03-10T07:51:02.425 INFO:tasks.workunit.client.0.vm05.stdout:9/635: creat d8/d35/d22/d33/d62/fd2 x:0 0 0 2026-03-10T07:51:02.428 INFO:tasks.workunit.client.0.vm05.stdout:9/636: dwrite d8/d35/d38/d71/fc7 [0,4194304] 0 2026-03-10T07:51:02.436 INFO:tasks.workunit.client.0.vm05.stdout:0/672: write d8/dd/d37/d56/d4d/f69 [1011192,111341] 0 2026-03-10T07:51:02.436 INFO:tasks.workunit.client.0.vm05.stdout:0/673: readlink d8/dd/d10/d26/d3a/lc0 0 2026-03-10T07:51:02.443 INFO:tasks.workunit.client.0.vm05.stdout:2/733: dwrite d0/f22 [0,4194304] 0 2026-03-10T07:51:02.446 INFO:tasks.workunit.client.0.vm05.stdout:9/637: dread d8/d35/d22/f4a [0,4194304] 0 2026-03-10T07:51:02.447 INFO:tasks.workunit.client.0.vm05.stdout:9/638: dread - d8/d86/d28/d79/d57/dbc/fc5 zero size 2026-03-10T07:51:02.459 INFO:tasks.workunit.client.0.vm05.stdout:3/658: mknod d8/d1f/cdc 0 2026-03-10T07:51:02.468 INFO:tasks.workunit.client.0.vm05.stdout:6/697: mkdir d0/d11/d22/d69/dda 0 2026-03-10T07:51:02.470 INFO:tasks.workunit.client.0.vm05.stdout:4/745: rename d0/d6/d9/d12/fd6 to d0/d6/d9/d12/d9c/db7/ff9 0 2026-03-10T07:51:02.486 INFO:tasks.workunit.client.0.vm05.stdout:7/702: mknod d1/d3c/d71/d79/cd9 0 2026-03-10T07:51:02.493 INFO:tasks.workunit.client.0.vm05.stdout:5/705: dread d2/d20/f2a [4194304,4194304] 0 2026-03-10T07:51:02.503 INFO:tasks.workunit.client.0.vm05.stdout:2/734: mknod d0/d8/d43/df/d8b/cec 0 2026-03-10T07:51:02.504 INFO:tasks.workunit.client.0.vm05.stdout:2/735: chown d0/d8/d43/df/c10 9141706 1 2026-03-10T07:51:02.513 INFO:tasks.workunit.client.0.vm05.stdout:9/639: dread 
d8/d35/d1c/d20/d59/d8b/f50 [0,4194304] 0 2026-03-10T07:51:02.514 INFO:tasks.workunit.client.0.vm05.stdout:9/640: truncate d8/d86/d95/fcb 678625 0 2026-03-10T07:51:02.524 INFO:tasks.workunit.client.0.vm05.stdout:3/659: unlink d8/d1f/d24/d8a/fa3 0 2026-03-10T07:51:02.534 INFO:tasks.workunit.client.0.vm05.stdout:6/698: creat d0/d11/d2e/d81/d92/dc2/fdb x:0 0 0 2026-03-10T07:51:02.535 INFO:tasks.workunit.client.0.vm05.stdout:6/699: fdatasync d0/d11/d57/faf 0 2026-03-10T07:51:02.538 INFO:tasks.workunit.client.0.vm05.stdout:8/601: dwrite d1/f15 [0,4194304] 0 2026-03-10T07:51:02.538 INFO:tasks.workunit.client.0.vm05.stdout:1/696: dwrite da/dd/d2a/f93 [0,4194304] 0 2026-03-10T07:51:02.539 INFO:tasks.workunit.client.0.vm05.stdout:8/602: chown d1/dd/d18/d20/d2a/f88 3289 1 2026-03-10T07:51:02.553 INFO:tasks.workunit.client.0.vm05.stdout:7/703: fsync d1/d34/f7c 0 2026-03-10T07:51:02.557 INFO:tasks.workunit.client.0.vm05.stdout:7/704: dread d1/d3c/d71/d79/d8a/fad [0,4194304] 0 2026-03-10T07:51:02.568 INFO:tasks.workunit.client.0.vm05.stdout:5/706: rename d2/d12/d2d/d4a/f59 to d2/d20/d7b/dbc/fec 0 2026-03-10T07:51:02.578 INFO:tasks.workunit.client.0.vm05.stdout:5/707: write d2/d12/f40 [905628,5104] 0 2026-03-10T07:51:02.578 INFO:tasks.workunit.client.0.vm05.stdout:0/674: write d8/dd/f29 [169703,50724] 0 2026-03-10T07:51:02.578 INFO:tasks.workunit.client.0.vm05.stdout:2/736: symlink d0/d8/d43/da4/led 0 2026-03-10T07:51:02.578 INFO:tasks.workunit.client.0.vm05.stdout:0/675: write d8/f75 [778298,28782] 0 2026-03-10T07:51:02.581 INFO:tasks.workunit.client.0.vm05.stdout:0/676: dwrite d8/dd/f22 [0,4194304] 0 2026-03-10T07:51:02.583 INFO:tasks.workunit.client.0.vm05.stdout:3/660: creat d8/d16/d19/d6b/fdd x:0 0 0 2026-03-10T07:51:02.598 INFO:tasks.workunit.client.0.vm05.stdout:6/700: mknod d0/d11/d57/d60/cdc 0 2026-03-10T07:51:02.609 INFO:tasks.workunit.client.0.vm05.stdout:4/746: symlink d0/d6/d9/lfa 0 2026-03-10T07:51:02.613 INFO:tasks.workunit.client.0.vm05.stdout:4/747: dwrite 
d0/d6/f15 [4194304,4194304] 0 2026-03-10T07:51:02.625 INFO:tasks.workunit.client.0.vm05.stdout:9/641: rename d8/d35/d1c/d2c to d8/d35/d1c/d20/dd3 0 2026-03-10T07:51:02.638 INFO:tasks.workunit.client.0.vm05.stdout:2/737: mknod d0/d8/d43/df/cee 0 2026-03-10T07:51:02.639 INFO:tasks.workunit.client.0.vm05.stdout:0/677: truncate d8/dd/d10/d26/d2a/d6f/f85 3280559 0 2026-03-10T07:51:02.639 INFO:tasks.workunit.client.0.vm05.stdout:6/701: fsync d0/d11/d4f/d56/d96/db6/f9a 0 2026-03-10T07:51:02.640 INFO:tasks.workunit.client.0.vm05.stdout:2/738: stat d0/d8/d43/d38/ca0 0 2026-03-10T07:51:02.661 INFO:tasks.workunit.client.0.vm05.stdout:5/708: dwrite d2/f42 [0,4194304] 0 2026-03-10T07:51:02.669 INFO:tasks.workunit.client.0.vm05.stdout:1/697: write da/d26/d2b/daf/dbe/dc0/fb7 [1527116,63078] 0 2026-03-10T07:51:02.670 INFO:tasks.workunit.client.0.vm05.stdout:1/698: write da/d26/d2b/daf/dbe/dc0/f6e [3339700,17409] 0 2026-03-10T07:51:02.671 INFO:tasks.workunit.client.0.vm05.stdout:1/699: stat da/dd/cb4 0 2026-03-10T07:51:02.674 INFO:tasks.workunit.client.0.vm05.stdout:4/748: chown d0/d6/d9/d12/c2e 3316 1 2026-03-10T07:51:02.676 INFO:tasks.workunit.client.0.vm05.stdout:9/642: dread - d8/f5e zero size 2026-03-10T07:51:02.678 INFO:tasks.workunit.client.0.vm05.stdout:4/749: dread d0/d6/d9/d12/d9c/db7/f63 [0,4194304] 0 2026-03-10T07:51:02.680 INFO:tasks.workunit.client.0.vm05.stdout:6/702: read d0/d35/f41 [3356256,97858] 0 2026-03-10T07:51:02.681 INFO:tasks.workunit.client.0.vm05.stdout:6/703: dread - d0/d35/d36/d43/d9c/fc5 zero size 2026-03-10T07:51:02.681 INFO:tasks.workunit.client.0.vm05.stdout:6/704: write d0/d11/d22/d6c/fa5 [689552,47184] 0 2026-03-10T07:51:02.683 INFO:tasks.workunit.client.0.vm05.stdout:6/705: chown d0/d11/d4f/d56/f6f 0 1 2026-03-10T07:51:02.686 INFO:tasks.workunit.client.0.vm05.stdout:4/750: dwrite d0/d6/d95/ff7 [0,4194304] 0 2026-03-10T07:51:02.691 INFO:tasks.workunit.client.0.vm05.stdout:7/705: creat d1/d6/d3b/fda x:0 0 0 2026-03-10T07:51:02.694 
INFO:tasks.workunit.client.0.vm05.stdout:9/643: symlink d8/d35/d1c/d20/d59/d8b/ld4 0 2026-03-10T07:51:02.701 INFO:tasks.workunit.client.0.vm05.stdout:2/739: mkdir d0/d8/def 0 2026-03-10T07:51:02.708 INFO:tasks.workunit.client.0.vm05.stdout:5/709: symlink d2/d5/led 0 2026-03-10T07:51:02.709 INFO:tasks.workunit.client.0.vm05.stdout:5/710: fsync d2/d12/d2d/d4a/faf 0 2026-03-10T07:51:02.709 INFO:tasks.workunit.client.0.vm05.stdout:7/706: rmdir d1/d34 39 2026-03-10T07:51:02.709 INFO:tasks.workunit.client.0.vm05.stdout:8/603: rename d1/dd/d18/f70 to d1/dd/d18/d20/d2a/d34/fc6 0 2026-03-10T07:51:02.709 INFO:tasks.workunit.client.0.vm05.stdout:0/678: creat d8/dd/d37/d56/fe9 x:0 0 0 2026-03-10T07:51:02.710 INFO:tasks.workunit.client.0.vm05.stdout:0/679: chown d8/dd/d10/d26/d8b/d70/fbe 40 1 2026-03-10T07:51:02.716 INFO:tasks.workunit.client.0.vm05.stdout:5/711: dwrite d2/d20/d4c/fb1 [0,4194304] 0 2026-03-10T07:51:02.725 INFO:tasks.workunit.client.0.vm05.stdout:1/700: write da/dd/d2a/d55/d64/f9f [832381,92294] 0 2026-03-10T07:51:02.730 INFO:tasks.workunit.client.0.vm05.stdout:6/706: unlink d0/d11/d31/c54 0 2026-03-10T07:51:02.735 INFO:tasks.workunit.client.0.vm05.stdout:4/751: rename d0/d6/d9/d12/d65/l6c to d0/d6/d9/d12/d9c/lfb 0 2026-03-10T07:51:02.743 INFO:tasks.workunit.client.0.vm05.stdout:4/752: chown d0/d6/d9/d5a/d6e/c70 4025087 1 2026-03-10T07:51:02.743 INFO:tasks.workunit.client.0.vm05.stdout:9/644: unlink d8/d35/l69 0 2026-03-10T07:51:02.743 INFO:tasks.workunit.client.0.vm05.stdout:3/661: link d8/d1f/d24/d8a/f91 d8/d16/d52/fde 0 2026-03-10T07:51:02.743 INFO:tasks.workunit.client.0.vm05.stdout:2/740: symlink d0/d8/d3d/d7d/db2/dd7/ddb/lf0 0 2026-03-10T07:51:02.743 INFO:tasks.workunit.client.0.vm05.stdout:2/741: read d0/d8/d43/d38/f56 [47568,100510] 0 2026-03-10T07:51:02.762 INFO:tasks.workunit.client.0.vm05.stdout:0/680: rmdir d8/dd 39 2026-03-10T07:51:02.762 INFO:tasks.workunit.client.0.vm05.stdout:1/701: creat da/d26/d9e/fd5 x:0 0 0 2026-03-10T07:51:02.764 
INFO:tasks.workunit.client.0.vm05.stdout:4/753: truncate d0/d6/d9/f4d 3174661 0 2026-03-10T07:51:02.768 INFO:tasks.workunit.client.0.vm05.stdout:9/645: creat d8/d35/d1c/d20/dd3/d63/fd5 x:0 0 0 2026-03-10T07:51:02.772 INFO:tasks.workunit.client.0.vm05.stdout:9/646: write d8/d35/d38/d71/fc7 [1974215,51813] 0 2026-03-10T07:51:02.778 INFO:tasks.workunit.client.0.vm05.stdout:6/707: dread d0/d6/f98 [0,4194304] 0 2026-03-10T07:51:02.785 INFO:tasks.workunit.client.0.vm05.stdout:3/662: dwrite d8/d1f/d2a/d96/fcf [0,4194304] 0 2026-03-10T07:51:02.786 INFO:tasks.workunit.client.0.vm05.stdout:3/663: chown d8/d16/f82 172 1 2026-03-10T07:51:02.798 INFO:tasks.workunit.client.0.vm05.stdout:5/712: mknod d2/d5/cee 0 2026-03-10T07:51:02.801 INFO:tasks.workunit.client.0.vm05.stdout:4/754: dread - d0/fa3 zero size 2026-03-10T07:51:02.801 INFO:tasks.workunit.client.0.vm05.stdout:4/755: chown d0/d6/d37/c92 1051646 1 2026-03-10T07:51:02.801 INFO:tasks.workunit.client.0.vm05.stdout:9/647: read f6 [2502213,55732] 0 2026-03-10T07:51:02.807 INFO:tasks.workunit.client.0.vm05.stdout:3/664: mkdir d8/d1f/d24/d45/ddf 0 2026-03-10T07:51:02.810 INFO:tasks.workunit.client.0.vm05.stdout:4/756: mknod d0/d6/d9/d12/d9c/db7/da7/d5c/cfc 0 2026-03-10T07:51:02.810 INFO:tasks.workunit.client.0.vm05.stdout:5/713: truncate d2/d5/d61/f66 4262609 0 2026-03-10T07:51:02.816 INFO:tasks.workunit.client.0.vm05.stdout:7/707: dread d1/d3c/d71/d79/f93 [0,4194304] 0 2026-03-10T07:51:02.821 INFO:tasks.workunit.client.0.vm05.stdout:8/604: rename d1/f2c to d1/dd/d18/d20/d2a/d34/d49/fc7 0 2026-03-10T07:51:02.821 INFO:tasks.workunit.client.0.vm05.stdout:6/708: dwrite d0/d11/d4f/da0/da6/fd9 [0,4194304] 0 2026-03-10T07:51:02.832 INFO:tasks.workunit.client.0.vm05.stdout:5/714: truncate d2/d5/f3d 2631270 0 2026-03-10T07:51:02.838 INFO:tasks.workunit.client.0.vm05.stdout:9/648: dread d8/d35/d22/d33/d62/f9a [0,4194304] 0 2026-03-10T07:51:02.850 INFO:tasks.workunit.client.0.vm05.stdout:1/702: rename da/l51 to da/d26/ld6 0 
2026-03-10T07:51:02.853 INFO:tasks.workunit.client.0.vm05.stdout:7/708: symlink d1/d3c/d71/d79/d8a/ldb 0 2026-03-10T07:51:02.856 INFO:tasks.workunit.client.0.vm05.stdout:4/757: stat d0/d6/d9/d12/d45/d55 0 2026-03-10T07:51:02.858 INFO:tasks.workunit.client.0.vm05.stdout:5/715: chown d2/d5/d61/lc8 1 1 2026-03-10T07:51:02.865 INFO:tasks.workunit.client.0.vm05.stdout:1/703: unlink da/d26/d2b/d71/f97 0 2026-03-10T07:51:02.865 INFO:tasks.workunit.client.0.vm05.stdout:9/649: dread - d8/d35/d1c/d20/fa7 zero size 2026-03-10T07:51:02.866 INFO:tasks.workunit.client.0.vm05.stdout:6/709: mknod d0/d35/d36/dc8/cdd 0 2026-03-10T07:51:02.868 INFO:tasks.workunit.client.0.vm05.stdout:1/704: chown da/d26/d2b/daf/dbe/dc0/c44 106 1 2026-03-10T07:51:02.873 INFO:tasks.workunit.client.0.vm05.stdout:9/650: dwrite d8/d35/d22/d33/d62/fd2 [0,4194304] 0 2026-03-10T07:51:02.877 INFO:tasks.workunit.client.0.vm05.stdout:2/742: write d0/d8/d3d/d7d/f36 [1668275,123515] 0 2026-03-10T07:51:02.887 INFO:tasks.workunit.client.0.vm05.stdout:7/709: symlink d1/d34/d59/d60/ldc 0 2026-03-10T07:51:02.888 INFO:tasks.workunit.client.0.vm05.stdout:6/710: symlink d0/d11/d4f/da0/da6/lde 0 2026-03-10T07:51:02.892 INFO:tasks.workunit.client.0.vm05.stdout:3/665: write d8/f5c [978510,34351] 0 2026-03-10T07:51:02.896 INFO:tasks.workunit.client.0.vm05.stdout:9/651: fsync d8/d35/d22/dab/db4/fb6 0 2026-03-10T07:51:02.898 INFO:tasks.workunit.client.0.vm05.stdout:0/681: getdents d8/dd/d37/d67/d96 0 2026-03-10T07:51:02.899 INFO:tasks.workunit.client.0.vm05.stdout:0/682: write d8/dd/d37/fc9 [168236,10083] 0 2026-03-10T07:51:02.901 INFO:tasks.workunit.client.0.vm05.stdout:8/605: write d1/dd/d18/d20/d2a/d34/da5/fbc [1006124,88925] 0 2026-03-10T07:51:02.905 INFO:tasks.workunit.client.0.vm05.stdout:4/758: write d0/d6/da6/fcb [740152,50174] 0 2026-03-10T07:51:02.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:51:02 vm05.local ceph-mon[50387]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' 
cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T07:51:02.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:51:02 vm05.local ceph-mon[50387]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T07:51:02.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:51:02 vm05.local ceph-mon[50387]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T07:51:02.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:51:02 vm05.local ceph-mon[50387]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T07:51:02.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:51:02 vm05.local ceph-mon[50387]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T07:51:02.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:51:02 vm05.local ceph-mon[50387]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T07:51:02.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:51:02 vm05.local ceph-mon[50387]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T07:51:02.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:51:02 vm05.local ceph-mon[50387]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T07:51:02.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:51:02 vm05.local ceph-mon[50387]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T07:51:02.907 INFO:tasks.workunit.client.0.vm05.stdout:2/743: truncate d0/d8/fc3 1375259 0 2026-03-10T07:51:02.908 INFO:tasks.workunit.client.0.vm05.stdout:2/744: write 
d0/d8/d66/dd1/d49/db3/fdf [550512,69292] 0 2026-03-10T07:51:02.909 INFO:tasks.workunit.client.0.vm05.stdout:8/606: dwrite d1/dd/d18/d20/d2a/d48/d7c/d9c/fb4 [0,4194304] 0 2026-03-10T07:51:02.910 INFO:tasks.workunit.client.0.vm05.stdout:2/745: chown d0/d8/d3d/d7d/db2/f2b 8227721 1 2026-03-10T07:51:02.913 INFO:tasks.workunit.client.0.vm05.stdout:8/607: truncate d1/dd/d4d/d64/d6a/fa6 22787 0 2026-03-10T07:51:02.915 INFO:tasks.workunit.client.0.vm05.stdout:7/710: rmdir d1/d34/d59/d60/d8c 39 2026-03-10T07:51:02.918 INFO:tasks.workunit.client.0.vm05.stdout:7/711: dread d1/f86 [0,4194304] 0 2026-03-10T07:51:02.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:51:02 vm08.local ceph-mon[59917]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T07:51:02.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:51:02 vm08.local ceph-mon[59917]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T07:51:02.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:51:02 vm08.local ceph-mon[59917]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T07:51:02.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:51:02 vm08.local ceph-mon[59917]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T07:51:02.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:51:02 vm08.local ceph-mon[59917]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T07:51:02.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:51:02 vm08.local ceph-mon[59917]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T07:51:02.918 
INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:51:02 vm08.local ceph-mon[59917]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T07:51:02.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:51:02 vm08.local ceph-mon[59917]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T07:51:02.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:51:02 vm08.local ceph-mon[59917]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T07:51:02.918 INFO:tasks.workunit.client.0.vm05.stdout:7/712: fsync d1/d34/d59/fca 0 2026-03-10T07:51:02.926 INFO:tasks.workunit.client.0.vm05.stdout:5/716: write d2/d20/d4c/f8c [1977058,92332] 0 2026-03-10T07:51:02.927 INFO:tasks.workunit.client.0.vm05.stdout:5/717: dread - d2/d12/d2d/d4a/faf zero size 2026-03-10T07:51:02.927 INFO:tasks.workunit.client.0.vm05.stdout:5/718: chown d2/d12/dda/da1 1 1 2026-03-10T07:51:02.935 INFO:tasks.workunit.client.0.vm05.stdout:9/652: creat d8/d35/d1c/d20/d54/fd6 x:0 0 0 2026-03-10T07:51:02.938 INFO:tasks.workunit.client.0.vm05.stdout:9/653: dwrite d8/d35/d22/d33/d62/fba [0,4194304] 0 2026-03-10T07:51:02.946 INFO:tasks.workunit.client.0.vm05.stdout:0/683: mkdir d8/dd/d10/db7/dc3/dea 0 2026-03-10T07:51:02.952 INFO:tasks.workunit.client.0.vm05.stdout:2/746: creat d0/d8/d43/da4/ff1 x:0 0 0 2026-03-10T07:51:02.964 INFO:tasks.workunit.client.0.vm05.stdout:6/711: dwrite d0/d35/f41 [0,4194304] 0 2026-03-10T07:51:02.967 INFO:tasks.workunit.client.0.vm05.stdout:6/712: read d0/d11/d57/d66/f75 [1041549,99024] 0 2026-03-10T07:51:02.976 INFO:tasks.workunit.client.0.vm05.stdout:7/713: symlink d1/d34/d59/d60/ldd 0 2026-03-10T07:51:02.981 INFO:tasks.workunit.client.0.vm05.stdout:5/719: link d2/d20/d33/d53/fd0 d2/d12/da8/ddd/fef 0 2026-03-10T07:51:02.985 INFO:tasks.workunit.client.0.vm05.stdout:9/654: link 
d8/d35/d22/d33/d62/d6d/d9e/fcc d8/d35/d22/dab/fd7 0 2026-03-10T07:51:02.989 INFO:tasks.workunit.client.0.vm05.stdout:0/684: unlink d8/dd/d10/d26/d8b/d70/f80 0 2026-03-10T07:51:02.991 INFO:tasks.workunit.client.0.vm05.stdout:0/685: truncate d8/dd/d37/f4f 4408690 0 2026-03-10T07:51:02.991 INFO:tasks.workunit.client.0.vm05.stdout:1/705: write da/dd/d12/d34/d58/fce [9024531,117698] 0 2026-03-10T07:51:03.000 INFO:tasks.workunit.client.0.vm05.stdout:8/608: creat d1/dd/d18/d20/d2a/fc8 x:0 0 0 2026-03-10T07:51:03.005 INFO:tasks.workunit.client.0.vm05.stdout:7/714: dread d1/d34/f4d [0,4194304] 0 2026-03-10T07:51:03.006 INFO:tasks.workunit.client.0.vm05.stdout:4/759: write d0/d6/d9/d12/f36 [5088755,36593] 0 2026-03-10T07:51:03.008 INFO:tasks.workunit.client.0.vm05.stdout:2/747: write d0/d8/d3d/d7d/db2/f2b [5035833,14878] 0 2026-03-10T07:51:03.013 INFO:tasks.workunit.client.0.vm05.stdout:6/713: write d0/d11/d57/d66/f7b [454311,84887] 0 2026-03-10T07:51:03.015 INFO:tasks.workunit.client.0.vm05.stdout:5/720: fsync d2/d4b/fc3 0 2026-03-10T07:51:03.016 INFO:tasks.workunit.client.0.vm05.stdout:3/666: getdents d8/d1f/d2a/d34/dbd 0 2026-03-10T07:51:03.019 INFO:tasks.workunit.client.0.vm05.stdout:1/706: fsync da/dd/d42/d80/f87 0 2026-03-10T07:51:03.024 INFO:tasks.workunit.client.0.vm05.stdout:7/715: creat d1/d3c/d4b/fde x:0 0 0 2026-03-10T07:51:03.024 INFO:tasks.workunit.client.0.vm05.stdout:7/716: stat d1/d3c/d71/cb6 0 2026-03-10T07:51:03.024 INFO:tasks.workunit.client.0.vm05.stdout:7/717: stat d1/d6/d80/d82/fc9 0 2026-03-10T07:51:03.027 INFO:tasks.workunit.client.0.vm05.stdout:2/748: fdatasync d0/d8/d43/f1f 0 2026-03-10T07:51:03.037 INFO:tasks.workunit.client.0.vm05.stdout:6/714: unlink d0/d11/d31/la1 0 2026-03-10T07:51:03.041 INFO:tasks.workunit.client.0.vm05.stdout:6/715: dwrite d0/d35/fb2 [0,4194304] 0 2026-03-10T07:51:03.043 INFO:tasks.workunit.client.0.vm05.stdout:6/716: chown d0/d35/d36/d43/d9c/fc5 521 1 2026-03-10T07:51:03.045 INFO:tasks.workunit.client.0.vm05.stdout:9/655: 
dwrite d8/d86/d28/f43 [0,4194304] 0 2026-03-10T07:51:03.046 INFO:tasks.workunit.client.0.vm05.stdout:0/686: dwrite d8/dd/d10/d26/d2a/d6f/fb8 [0,4194304] 0 2026-03-10T07:51:03.057 INFO:tasks.workunit.client.0.vm05.stdout:0/687: stat d8/dd/d10/d26/d8b/d7d 0 2026-03-10T07:51:03.066 INFO:tasks.workunit.client.0.vm05.stdout:3/667: creat d8/d8f/dbc/fe0 x:0 0 0 2026-03-10T07:51:03.066 INFO:tasks.workunit.client.0.vm05.stdout:8/609: fdatasync d1/fe 0 2026-03-10T07:51:03.067 INFO:tasks.workunit.client.0.vm05.stdout:3/668: write d8/d22/d60/fc6 [53636,36870] 0 2026-03-10T07:51:03.075 INFO:tasks.workunit.client.0.vm05.stdout:8/610: write d1/dd/d5e/d9e/fc3 [4557398,48939] 0 2026-03-10T07:51:03.082 INFO:tasks.workunit.client.0.vm05.stdout:1/707: rename da/dd/d12/d34/f38 to da/d26/d2b/dcb/fd7 0 2026-03-10T07:51:03.097 INFO:tasks.workunit.client.0.vm05.stdout:6/717: creat d0/d11/d57/da4/fdf x:0 0 0 2026-03-10T07:51:03.100 INFO:tasks.workunit.client.0.vm05.stdout:9/656: mkdir d8/d86/d28/d79/d57/d96/dd8 0 2026-03-10T07:51:03.101 INFO:tasks.workunit.client.0.vm05.stdout:9/657: chown d8/d35/d22/d33/d62/cbb 2 1 2026-03-10T07:51:03.106 INFO:tasks.workunit.client.0.vm05.stdout:6/718: dread d0/d11/d22/f52 [0,4194304] 0 2026-03-10T07:51:03.123 INFO:tasks.workunit.client.0.vm05.stdout:5/721: mknod d2/d12/dda/da1/dc0/dc2/cf0 0 2026-03-10T07:51:03.126 INFO:tasks.workunit.client.0.vm05.stdout:8/611: rmdir d1/dd/d5e/d9e 39 2026-03-10T07:51:03.127 INFO:tasks.workunit.client.0.vm05.stdout:1/708: unlink da/d26/d2b/daf/dbe/l9d 0 2026-03-10T07:51:03.128 INFO:tasks.workunit.client.0.vm05.stdout:4/760: link d0/d6/d9/d12/d69/dc7/ded/ff0 d0/d6/da6/ffd 0 2026-03-10T07:51:03.134 INFO:tasks.workunit.client.0.vm05.stdout:9/658: fsync d8/d35/d1c/d20/fa7 0 2026-03-10T07:51:03.135 INFO:tasks.workunit.client.0.vm05.stdout:6/719: mknod d0/d11/d22/ce0 0 2026-03-10T07:51:03.137 INFO:tasks.workunit.client.0.vm05.stdout:5/722: creat d2/d20/d33/d86/ff1 x:0 0 0 2026-03-10T07:51:03.139 
INFO:tasks.workunit.client.0.vm05.stdout:4/761: mkdir d0/d6/da6/dfe 0 2026-03-10T07:51:03.139 INFO:tasks.workunit.client.0.vm05.stdout:7/718: creat d1/d34/d59/fdf x:0 0 0 2026-03-10T07:51:03.140 INFO:tasks.workunit.client.0.vm05.stdout:9/659: symlink d8/d35/d1c/d20/d59/d8b/ld9 0 2026-03-10T07:51:03.142 INFO:tasks.workunit.client.0.vm05.stdout:9/660: write d8/d35/d1c/d20/dd3/d63/fd5 [520546,79635] 0 2026-03-10T07:51:03.143 INFO:tasks.workunit.client.0.vm05.stdout:6/720: dwrite d0/d35/f41 [4194304,4194304] 0 2026-03-10T07:51:03.157 INFO:tasks.workunit.client.0.vm05.stdout:3/669: sync 2026-03-10T07:51:03.167 INFO:tasks.workunit.client.0.vm05.stdout:3/670: dwrite d8/d1f/d24/d8a/fcb [0,4194304] 0 2026-03-10T07:51:03.175 INFO:tasks.workunit.client.0.vm05.stdout:5/723: rename d2/d5/fa to d2/d12/d2d/d4a/de7/ff2 0 2026-03-10T07:51:03.176 INFO:tasks.workunit.client.0.vm05.stdout:4/762: dread - d0/d6/d9/d12/fd3 zero size 2026-03-10T07:51:03.176 INFO:tasks.workunit.client.0.vm05.stdout:6/721: truncate d0/d6/f24 1230329 0 2026-03-10T07:51:03.177 INFO:tasks.workunit.client.0.vm05.stdout:4/763: stat d0/d6/d9/ld7 0 2026-03-10T07:51:03.178 INFO:tasks.workunit.client.0.vm05.stdout:6/722: fsync d0/d11/d4f/f91 0 2026-03-10T07:51:03.178 INFO:tasks.workunit.client.0.vm05.stdout:4/764: write d0/d6/d9/d5a/d6e/db6/db9/fd9 [708513,35007] 0 2026-03-10T07:51:03.183 INFO:tasks.workunit.client.0.vm05.stdout:1/709: link da/d26/d2b/ca4 da/d26/d2b/d89/dbd/cd8 0 2026-03-10T07:51:03.192 INFO:tasks.workunit.client.0.vm05.stdout:9/661: rename d8/d35/d22/d33/d62/f7d to d8/d86/db8/fda 0 2026-03-10T07:51:03.194 INFO:tasks.workunit.client.0.vm05.stdout:4/765: unlink d0/d6/d9/d8c/dbe/fc3 0 2026-03-10T07:51:03.199 INFO:tasks.workunit.client.0.vm05.stdout:6/723: creat d0/d35/d36/d43/fe1 x:0 0 0 2026-03-10T07:51:03.199 INFO:tasks.workunit.client.0.vm05.stdout:1/710: creat da/d26/fd9 x:0 0 0 2026-03-10T07:51:03.200 INFO:tasks.workunit.client.0.vm05.stdout:7/719: rename d1/d3c/d71/cb6 to d1/d34/d59/d60/d8c/ce0 
0 2026-03-10T07:51:03.200 INFO:tasks.workunit.client.0.vm05.stdout:1/711: fsync da/dd/d2a/d55/fbf 0 2026-03-10T07:51:03.203 INFO:tasks.workunit.client.0.vm05.stdout:9/662: fsync d8/f14 0 2026-03-10T07:51:03.207 INFO:tasks.workunit.client.0.vm05.stdout:4/766: rename d0/d28/f9b to d0/d6/d9/d12/d9c/fff 0 2026-03-10T07:51:03.209 INFO:tasks.workunit.client.0.vm05.stdout:7/720: fsync d1/d6/d80/d82/fa8 0 2026-03-10T07:51:03.214 INFO:tasks.workunit.client.0.vm05.stdout:8/612: truncate d1/d23/fa3 454119 0 2026-03-10T07:51:03.220 INFO:tasks.workunit.client.0.vm05.stdout:4/767: creat d0/d6/d9/d12/d9c/db7/da7/f100 x:0 0 0 2026-03-10T07:51:03.221 INFO:tasks.workunit.client.0.vm05.stdout:1/712: creat da/dd/d12/d86/fda x:0 0 0 2026-03-10T07:51:03.221 INFO:tasks.workunit.client.0.vm05.stdout:6/724: sync 2026-03-10T07:51:03.223 INFO:tasks.workunit.client.0.vm05.stdout:9/663: dread d8/d86/f92 [0,4194304] 0 2026-03-10T07:51:03.229 INFO:tasks.workunit.client.0.vm05.stdout:0/688: write d8/dd/d10/d26/d8b/d86/fa0 [882478,59716] 0 2026-03-10T07:51:03.234 INFO:tasks.workunit.client.0.vm05.stdout:0/689: dwrite d8/d9c/dc8/fd4 [0,4194304] 0 2026-03-10T07:51:03.236 INFO:tasks.workunit.client.0.vm05.stdout:0/690: chown d8/dd/d10/d26/d3a/fe1 231 1 2026-03-10T07:51:03.244 INFO:tasks.workunit.client.0.vm05.stdout:3/671: dread d8/d1c/f9d [0,4194304] 0 2026-03-10T07:51:03.247 INFO:tasks.workunit.client.0.vm05.stdout:8/613: fsync d1/dd/d18/d20/d2a/f54 0 2026-03-10T07:51:03.252 INFO:tasks.workunit.client.0.vm05.stdout:8/614: dread d1/f15 [0,4194304] 0 2026-03-10T07:51:03.257 INFO:tasks.workunit.client.0.vm05.stdout:6/725: mknod d0/d11/d57/d60/ce2 0 2026-03-10T07:51:03.257 INFO:tasks.workunit.client.0.vm05.stdout:2/749: write d0/f1e [2075612,57978] 0 2026-03-10T07:51:03.257 INFO:tasks.workunit.client.0.vm05.stdout:1/713: rename da/d26/d2b/daf/dbe/dc0/d8f to da/dd/d12/d34/ddb 0 2026-03-10T07:51:03.258 INFO:tasks.workunit.client.0.vm05.stdout:1/714: write da/dd/d2a/d55/fbf [395897,75845] 0 
2026-03-10T07:51:03.258 INFO:tasks.workunit.client.0.vm05.stdout:9/664: chown d8/d35/f1d 13223878 1 2026-03-10T07:51:03.264 INFO:tasks.workunit.client.0.vm05.stdout:5/724: dread d2/d20/d7b/f83 [0,4194304] 0 2026-03-10T07:51:03.268 INFO:tasks.workunit.client.0.vm05.stdout:5/725: dwrite d2/d20/d4c/d64/f96 [0,4194304] 0 2026-03-10T07:51:03.283 INFO:tasks.workunit.client.0.vm05.stdout:5/726: sync 2026-03-10T07:51:03.283 INFO:tasks.workunit.client.0.vm05.stdout:5/727: fdatasync d2/f42 0 2026-03-10T07:51:03.291 INFO:tasks.workunit.client.0.vm05.stdout:3/672: rename d8/d16 to d8/d1f/d24/d76/dc5/de1 0 2026-03-10T07:51:03.301 INFO:tasks.workunit.client.0.vm05.stdout:8/615: dread d1/fa [0,4194304] 0 2026-03-10T07:51:03.301 INFO:tasks.workunit.client.0.vm05.stdout:4/768: write d0/d6/d9/d5a/fcd [502578,76030] 0 2026-03-10T07:51:03.301 INFO:tasks.workunit.client.0.vm05.stdout:7/721: dwrite d1/d34/d59/f78 [0,4194304] 0 2026-03-10T07:51:03.301 INFO:tasks.workunit.client.0.vm05.stdout:7/722: stat d1/d3c/d71/d79/d8a/fbc 0 2026-03-10T07:51:03.301 INFO:tasks.workunit.client.0.vm05.stdout:7/723: chown d1/d6/d80/d82/cb2 48152 1 2026-03-10T07:51:03.303 INFO:tasks.workunit.client.0.vm05.stdout:5/728: sync 2026-03-10T07:51:03.308 INFO:tasks.workunit.client.0.vm05.stdout:4/769: dread d0/d28/f33 [0,4194304] 0 2026-03-10T07:51:03.321 INFO:tasks.workunit.client.0.vm05.stdout:1/715: dread da/f3a [0,4194304] 0 2026-03-10T07:51:03.325 INFO:tasks.workunit.client.0.vm05.stdout:2/750: getdents d0/d8/d43/da4/dea 0 2026-03-10T07:51:03.329 INFO:tasks.workunit.client.0.vm05.stdout:5/729: symlink d2/d5/d61/lf3 0 2026-03-10T07:51:03.331 INFO:tasks.workunit.client.0.vm05.stdout:4/770: chown d0/d6/d9/l27 483647 1 2026-03-10T07:51:03.331 INFO:tasks.workunit.client.0.vm05.stdout:7/724: dread d1/d3c/d71/d79/d8a/fad [0,4194304] 0 2026-03-10T07:51:03.332 INFO:tasks.workunit.client.0.vm05.stdout:9/665: symlink d8/d35/d22/d33/ldb 0 2026-03-10T07:51:03.334 INFO:tasks.workunit.client.0.vm05.stdout:0/691: creat 
d8/dd/d37/d56/feb x:0 0 0 2026-03-10T07:51:03.335 INFO:tasks.workunit.client.0.vm05.stdout:1/716: creat da/dd/d2a/d55/d64/dac/fdc x:0 0 0 2026-03-10T07:51:03.342 INFO:tasks.workunit.client.0.vm05.stdout:6/726: dwrite d0/d6/d3b/fd7 [4194304,4194304] 0 2026-03-10T07:51:03.345 INFO:tasks.workunit.client.0.vm05.stdout:2/751: read d0/d8/d43/d38/f9a [416573,5442] 0 2026-03-10T07:51:03.350 INFO:tasks.workunit.client.0.vm05.stdout:9/666: mknod d8/d35/d1c/d20/cdc 0 2026-03-10T07:51:03.351 INFO:tasks.workunit.client.0.vm05.stdout:9/667: write d8/d35/d1c/d20/f9f [1818499,71620] 0 2026-03-10T07:51:03.354 INFO:tasks.workunit.client.0.vm05.stdout:0/692: creat d8/d9c/fec x:0 0 0 2026-03-10T07:51:03.357 INFO:tasks.workunit.client.0.vm05.stdout:1/717: fdatasync da/dd/d2a/d70/f83 0 2026-03-10T07:51:03.358 INFO:tasks.workunit.client.0.vm05.stdout:3/673: creat d8/d22/fe2 x:0 0 0 2026-03-10T07:51:03.358 INFO:tasks.workunit.client.0.vm05.stdout:3/674: readlink d8/d1f/d24/d76/la8 0 2026-03-10T07:51:03.366 INFO:tasks.workunit.client.0.vm05.stdout:8/616: rename d1/d23 to d1/dc9 0 2026-03-10T07:51:03.366 INFO:tasks.workunit.client.0.vm05.stdout:8/617: stat d1/dd/f17 0 2026-03-10T07:51:03.366 INFO:tasks.workunit.client.0.vm05.stdout:2/752: fdatasync d0/d8/d43/d38/f9a 0 2026-03-10T07:51:03.366 INFO:tasks.workunit.client.0.vm05.stdout:4/771: write d0/d6/d9/d12/d65/fb5 [150847,88181] 0 2026-03-10T07:51:03.369 INFO:tasks.workunit.client.0.vm05.stdout:9/668: symlink d8/d86/d28/d79/d57/dbc/ldd 0 2026-03-10T07:51:03.370 INFO:tasks.workunit.client.0.vm05.stdout:9/669: write d8/d86/d28/f43 [946862,69822] 0 2026-03-10T07:51:03.373 INFO:tasks.workunit.client.0.vm05.stdout:9/670: chown d8/d35/d22/d33/d70/l9d 7 1 2026-03-10T07:51:03.375 INFO:tasks.workunit.client.0.vm05.stdout:2/753: sync 2026-03-10T07:51:03.379 INFO:tasks.workunit.client.0.vm05.stdout:2/754: sync 2026-03-10T07:51:03.380 INFO:tasks.workunit.client.0.vm05.stdout:2/755: write d0/d2a/f2e [3073366,105766] 0 2026-03-10T07:51:03.385 
INFO:tasks.workunit.client.0.vm05.stdout:5/730: write d2/d5/d61/f65 [3889171,4182] 0 2026-03-10T07:51:03.386 INFO:tasks.workunit.client.0.vm05.stdout:7/725: write d1/f37 [2747704,48447] 0 2026-03-10T07:51:03.394 INFO:tasks.workunit.client.0.vm05.stdout:1/718: write da/dd/d12/d34/d58/fd4 [3326448,115497] 0 2026-03-10T07:51:03.402 INFO:tasks.workunit.client.0.vm05.stdout:3/675: dread d8/d1f/d24/d8a/f57 [0,4194304] 0 2026-03-10T07:51:03.405 INFO:tasks.workunit.client.0.vm05.stdout:9/671: mkdir d8/d35/d6b/dde 0 2026-03-10T07:51:03.406 INFO:tasks.workunit.client.0.vm05.stdout:9/672: dread - d8/d86/d28/d79/d57/dbc/fc5 zero size 2026-03-10T07:51:03.407 INFO:tasks.workunit.client.0.vm05.stdout:6/727: creat d0/d11/d4f/fe3 x:0 0 0 2026-03-10T07:51:03.407 INFO:tasks.workunit.client.0.vm05.stdout:6/728: chown d0/d35/d36/f5b 1092 1 2026-03-10T07:51:03.408 INFO:tasks.workunit.client.0.vm05.stdout:6/729: chown d0/d11/d4f/d56/d96/lae 46 1 2026-03-10T07:51:03.409 INFO:tasks.workunit.client.0.vm05.stdout:8/618: dread - d1/dd/d18/d20/d2a/d48/d5a/f6d zero size 2026-03-10T07:51:03.409 INFO:tasks.workunit.client.0.vm05.stdout:8/619: chown d1/dd/d4d/d64/d8f/l99 34528 1 2026-03-10T07:51:03.413 INFO:tasks.workunit.client.0.vm05.stdout:7/726: rename d1/d34/f7c to d1/d3c/d71/d79/fe1 0 2026-03-10T07:51:03.415 INFO:tasks.workunit.client.0.vm05.stdout:8/620: sync 2026-03-10T07:51:03.420 INFO:tasks.workunit.client.0.vm05.stdout:2/756: dread d0/d8/f3b [0,4194304] 0 2026-03-10T07:51:03.422 INFO:tasks.workunit.client.0.vm05.stdout:4/772: write d0/d6/f78 [1754309,53873] 0 2026-03-10T07:51:03.424 INFO:tasks.workunit.client.0.vm05.stdout:4/773: sync 2026-03-10T07:51:03.429 INFO:tasks.workunit.client.0.vm05.stdout:0/693: dwrite d8/dd/d10/d26/d3a/d5e/fe2 [0,4194304] 0 2026-03-10T07:51:03.435 INFO:tasks.workunit.client.0.vm05.stdout:1/719: dwrite da/dd/d2a/d55/d64/f7a [0,4194304] 0 2026-03-10T07:51:03.443 INFO:tasks.workunit.client.0.vm05.stdout:6/730: creat d0/d35/d36/dc8/fe4 x:0 0 0 
2026-03-10T07:51:03.443 INFO:tasks.workunit.client.0.vm05.stdout:5/731: mknod d2/d20/d7b/dca/cf4 0 2026-03-10T07:51:03.452 INFO:tasks.workunit.client.0.vm05.stdout:5/732: dread d2/d12/d4d/f84 [0,4194304] 0 2026-03-10T07:51:03.453 INFO:tasks.workunit.client.0.vm05.stdout:7/727: symlink d1/d6/d47/d8d/le2 0 2026-03-10T07:51:03.457 INFO:tasks.workunit.client.0.vm05.stdout:9/673: dwrite d8/d86/d28/d79/f44 [0,4194304] 0 2026-03-10T07:51:03.470 INFO:tasks.workunit.client.0.vm05.stdout:9/674: dread d8/d35/d22/d33/d47/f5a [0,4194304] 0 2026-03-10T07:51:03.472 INFO:tasks.workunit.client.0.vm05.stdout:8/621: dread d1/dd/d4d/d64/d6a/f76 [0,4194304] 0 2026-03-10T07:51:03.476 INFO:tasks.workunit.client.0.vm05.stdout:2/757: creat d0/d8/d43/df/ff2 x:0 0 0 2026-03-10T07:51:03.481 INFO:tasks.workunit.client.0.vm05.stdout:4/774: creat d0/d6/d9/d12/d4f/f101 x:0 0 0 2026-03-10T07:51:03.482 INFO:tasks.workunit.client.0.vm05.stdout:4/775: read d0/d6/d9/d12/d45/d55/f56 [8173242,67267] 0 2026-03-10T07:51:03.492 INFO:tasks.workunit.client.0.vm05.stdout:5/733: rmdir d2/d20/d7b/dbe 39 2026-03-10T07:51:03.496 INFO:tasks.workunit.client.0.vm05.stdout:9/675: sync 2026-03-10T07:51:03.501 INFO:tasks.workunit.client.0.vm05.stdout:8/622: rename d1/d45/f9d to d1/dd/d18/d20/d2a/d9a/fca 0 2026-03-10T07:51:03.502 INFO:tasks.workunit.client.0.vm05.stdout:8/623: readlink d1/dd/d4d/d64/d8f/l99 0 2026-03-10T07:51:03.507 INFO:tasks.workunit.client.0.vm05.stdout:3/676: dwrite d8/d1f/d24/d76/dc5/de1/d52/fde [0,4194304] 0 2026-03-10T07:51:03.508 INFO:tasks.workunit.client.0.vm05.stdout:7/728: write d1/d3c/d71/d79/d8a/fad [985514,25870] 0 2026-03-10T07:51:03.509 INFO:tasks.workunit.client.0.vm05.stdout:7/729: readlink d1/d34/d59/d60/d8c/lb5 0 2026-03-10T07:51:03.515 INFO:tasks.workunit.client.0.vm05.stdout:2/758: read d0/d8/fe [4106808,5962] 0 2026-03-10T07:51:03.517 INFO:tasks.workunit.client.0.vm05.stdout:0/694: mknod d8/dd/d10/d26/d8b/da4/de7/ced 0 2026-03-10T07:51:03.518 
INFO:tasks.workunit.client.0.vm05.stdout:0/695: read d8/dd/d10/d26/d3a/d5e/fe2 [1071866,32438] 0 2026-03-10T07:51:03.520 INFO:tasks.workunit.client.0.vm05.stdout:1/720: symlink da/dd/d12/d34/d58/ldd 0 2026-03-10T07:51:03.523 INFO:tasks.workunit.client.0.vm05.stdout:6/731: creat d0/d11/d4f/d7d/db7/fe5 x:0 0 0 2026-03-10T07:51:03.533 INFO:tasks.workunit.client.0.vm05.stdout:5/734: fsync d2/d20/d33/d86/fb3 0 2026-03-10T07:51:03.533 INFO:tasks.workunit.client.0.vm05.stdout:5/735: write d2/d5/d61/f66 [3479445,43003] 0 2026-03-10T07:51:03.533 INFO:tasks.workunit.client.0.vm05.stdout:5/736: write d2/d12/dda/da1/dc0/fce [746804,107353] 0 2026-03-10T07:51:03.537 INFO:tasks.workunit.client.0.vm05.stdout:3/677: rename d8/d8f/dbc/dc7/ccc to d8/d1f/d24/d76/dc5/ce3 0 2026-03-10T07:51:03.544 INFO:tasks.workunit.client.0.vm05.stdout:8/624: write d1/dd/d18/d20/f43 [2416002,124570] 0 2026-03-10T07:51:03.550 INFO:tasks.workunit.client.0.vm05.stdout:0/696: creat d8/dd/d10/d26/d2a/fee x:0 0 0 2026-03-10T07:51:03.552 INFO:tasks.workunit.client.0.vm05.stdout:1/721: symlink da/dd/d2a/d55/d64/lde 0 2026-03-10T07:51:03.562 INFO:tasks.workunit.client.0.vm05.stdout:5/737: unlink d2/d20/d33/d53/d7d/f90 0 2026-03-10T07:51:03.564 INFO:tasks.workunit.client.0.vm05.stdout:9/676: symlink d8/d35/d22/d33/d62/dc0/ldf 0 2026-03-10T07:51:03.568 INFO:tasks.workunit.client.0.vm05.stdout:3/678: rename d8/d1f/d24/f3e to d8/d1f/d24/d45/ddf/fe4 0 2026-03-10T07:51:03.571 INFO:tasks.workunit.client.0.vm05.stdout:7/730: mknod d1/d34/d59/ce3 0 2026-03-10T07:51:03.572 INFO:tasks.workunit.client.0.vm05.stdout:7/731: chown d1/d3c/d4b/f4f 0 1 2026-03-10T07:51:03.575 INFO:tasks.workunit.client.0.vm05.stdout:3/679: dwrite d8/d1f/d24/d8a/f91 [0,4194304] 0 2026-03-10T07:51:03.583 INFO:tasks.workunit.client.0.vm05.stdout:2/759: mknod d0/cf3 0 2026-03-10T07:51:03.584 INFO:tasks.workunit.client.0.vm05.stdout:4/776: link d0/cae d0/d6/d9/d12/d69/dc7/ded/c102 0 2026-03-10T07:51:03.584 
INFO:tasks.workunit.client.0.vm05.stdout:4/777: stat d0/d6/d9/d5a/d91 0 2026-03-10T07:51:03.585 INFO:tasks.workunit.client.0.vm05.stdout:4/778: truncate d0/d6/d9/f83 5094507 0 2026-03-10T07:51:03.586 INFO:tasks.workunit.client.0.vm05.stdout:4/779: write d0/d6/d9/d5a/f58 [3739093,5557] 0 2026-03-10T07:51:03.589 INFO:tasks.workunit.client.0.vm05.stdout:1/722: symlink da/dd/d2a/d55/d64/ldf 0 2026-03-10T07:51:03.600 INFO:tasks.workunit.client.0.vm05.stdout:5/738: symlink d2/d20/d33/d53/d7d/lf5 0 2026-03-10T07:51:03.600 INFO:tasks.workunit.client.0.vm05.stdout:9/677: mkdir d8/d35/d22/d33/d62/de0 0 2026-03-10T07:51:03.600 INFO:tasks.workunit.client.0.vm05.stdout:0/697: rename d8/dd/d10/d26/d3a/lc0 to d8/dd/d10/d26/d2a/d6f/lef 0 2026-03-10T07:51:03.600 INFO:tasks.workunit.client.0.vm05.stdout:0/698: stat d8/dd/fde 0 2026-03-10T07:51:03.600 INFO:tasks.workunit.client.0.vm05.stdout:3/680: symlink d8/d1f/d24/d76/dc5/de1/d19/d6b/le5 0 2026-03-10T07:51:03.600 INFO:tasks.workunit.client.0.vm05.stdout:3/681: fdatasync d8/d1f/d24/d76/dc5/de1/d52/da4/fab 0 2026-03-10T07:51:03.600 INFO:tasks.workunit.client.0.vm05.stdout:3/682: chown d8/d1f/d2a/d34/dbd 36 1 2026-03-10T07:51:03.600 INFO:tasks.workunit.client.0.vm05.stdout:3/683: stat d8/d1f/d24/d76/dc5/de1/f88 0 2026-03-10T07:51:03.600 INFO:tasks.workunit.client.0.vm05.stdout:4/780: truncate d0/d6/d9/d12/d45/d55/d44/d85/fd4 511614 0 2026-03-10T07:51:03.604 INFO:tasks.workunit.client.0.vm05.stdout:9/678: truncate d8/d35/d22/d33/d47/f5a 4295199 0 2026-03-10T07:51:03.614 INFO:tasks.workunit.client.0.vm05.stdout:7/732: write d1/d3c/f89 [749371,64448] 0 2026-03-10T07:51:03.614 INFO:tasks.workunit.client.0.vm05.stdout:2/760: write d0/d2a/d8c/fad [519781,7536] 0 2026-03-10T07:51:03.619 INFO:tasks.workunit.client.0.vm05.stdout:0/699: dwrite d8/fc [0,4194304] 0 2026-03-10T07:51:03.622 INFO:tasks.workunit.client.0.vm05.stdout:2/761: dwrite d0/d8/d3d/fdd [4194304,4194304] 0 2026-03-10T07:51:03.628 
INFO:tasks.workunit.client.0.vm05.stdout:2/762: dread d0/d2a/d8c/fad [0,4194304] 0 2026-03-10T07:51:03.636 INFO:tasks.workunit.client.0.vm05.stdout:3/684: creat d8/d1f/d2a/d96/da9/fe6 x:0 0 0 2026-03-10T07:51:03.640 INFO:tasks.workunit.client.0.vm05.stdout:3/685: chown d8/d1f/d2a/d34/dbd/cbf 33047319 1 2026-03-10T07:51:03.640 INFO:tasks.workunit.client.0.vm05.stdout:4/781: symlink d0/d6/d60/dde/l103 0 2026-03-10T07:51:03.640 INFO:tasks.workunit.client.0.vm05.stdout:8/625: link d1/dd/d18/d20/d2a/d48/d7c/d9c/lc0 d1/dc9/lcb 0 2026-03-10T07:51:03.640 INFO:tasks.workunit.client.0.vm05.stdout:6/732: getdents d0/d11/d4f/da0/da6 0 2026-03-10T07:51:03.641 INFO:tasks.workunit.client.0.vm05.stdout:6/733: fsync d0/d11/d57/d66/f7b 0 2026-03-10T07:51:03.642 INFO:tasks.workunit.client.0.vm05.stdout:5/739: symlink d2/d20/d7b/dbe/lf6 0 2026-03-10T07:51:03.644 INFO:tasks.workunit.client.0.vm05.stdout:9/679: rename d8/d35 to d8/d86/d28/d79/d57/de1 0 2026-03-10T07:51:03.645 INFO:tasks.workunit.client.0.vm05.stdout:7/733: unlink d1/d34/d59/c88 0 2026-03-10T07:51:03.649 INFO:tasks.workunit.client.0.vm05.stdout:0/700: creat d8/dd/d10/d26/d48/ff0 x:0 0 0 2026-03-10T07:51:03.652 INFO:tasks.workunit.client.0.vm05.stdout:8/626: truncate d1/dd/d18/f5c 1344433 0 2026-03-10T07:51:03.652 INFO:tasks.workunit.client.0.vm05.stdout:1/723: creat da/dd/d12/d86/fe0 x:0 0 0 2026-03-10T07:51:03.655 INFO:tasks.workunit.client.0.vm05.stdout:5/740: creat d2/d12/d2d/d4a/ff7 x:0 0 0 2026-03-10T07:51:03.658 INFO:tasks.workunit.client.0.vm05.stdout:3/686: dread d8/d1c/d48/d69/fb2 [0,4194304] 0 2026-03-10T07:51:03.661 INFO:tasks.workunit.client.0.vm05.stdout:6/734: dread d0/d11/d31/f63 [0,4194304] 0 2026-03-10T07:51:03.661 INFO:tasks.workunit.client.0.vm05.stdout:3/687: read d8/d1f/f6c [2221707,14638] 0 2026-03-10T07:51:03.664 INFO:tasks.workunit.client.0.vm05.stdout:3/688: truncate d8/d22/d60/fce 818122 0 2026-03-10T07:51:03.665 INFO:tasks.workunit.client.0.vm05.stdout:7/734: fsync d1/d6/f84 0 
2026-03-10T07:51:03.665 INFO:tasks.workunit.client.0.vm05.stdout:5/741: dwrite d2/d20/d33/fe0 [0,4194304] 0 2026-03-10T07:51:03.668 INFO:tasks.workunit.client.0.vm05.stdout:7/735: dread - d1/d3c/d71/d79/d8a/fce zero size 2026-03-10T07:51:03.676 INFO:tasks.workunit.client.0.vm05.stdout:0/701: unlink d8/dd/d10/d26/d2a/d6f/fb8 0 2026-03-10T07:51:03.680 INFO:tasks.workunit.client.0.vm05.stdout:8/627: fdatasync d1/dd/d18/f22 0 2026-03-10T07:51:03.685 INFO:tasks.workunit.client.0.vm05.stdout:2/763: write d0/d8/d3d/d7d/db2/fbd [681729,130682] 0 2026-03-10T07:51:03.693 INFO:tasks.workunit.client.0.vm05.stdout:5/742: fdatasync d2/d20/d33/d86/fb3 0 2026-03-10T07:51:03.693 INFO:tasks.workunit.client.0.vm05.stdout:7/736: mkdir d1/d34/d59/d60/d8c/de4 0 2026-03-10T07:51:03.693 INFO:tasks.workunit.client.0.vm05.stdout:5/743: chown d2/d20/d33/fe0 10932 1 2026-03-10T07:51:03.693 INFO:tasks.workunit.client.0.vm05.stdout:0/702: symlink d8/dd/d37/d67/d96/lf1 0 2026-03-10T07:51:03.693 INFO:tasks.workunit.client.0.vm05.stdout:1/724: mknod da/dd/d2a/d55/ce1 0 2026-03-10T07:51:03.694 INFO:tasks.workunit.client.0.vm05.stdout:8/628: fdatasync d1/d6f/fa7 0 2026-03-10T07:51:03.705 INFO:tasks.workunit.client.0.vm05.stdout:4/782: dwrite d0/d6/f39 [0,4194304] 0 2026-03-10T07:51:03.711 INFO:tasks.workunit.client.0.vm05.stdout:2/764: stat d0/d8/l80 0 2026-03-10T07:51:03.712 INFO:tasks.workunit.client.0.vm05.stdout:2/765: fsync d0/f5 0 2026-03-10T07:51:03.716 INFO:tasks.workunit.client.0.vm05.stdout:7/737: dread - d1/d34/d59/f6f zero size 2026-03-10T07:51:03.716 INFO:tasks.workunit.client.0.vm05.stdout:5/744: rename d2/d20/d33/d53/d7d/f9b to d2/d12/dda/da1/dc0/dc2/ff8 0 2026-03-10T07:51:03.719 INFO:tasks.workunit.client.0.vm05.stdout:8/629: fsync d1/dd/d5e/d9e/fc3 0 2026-03-10T07:51:03.726 INFO:tasks.workunit.client.0.vm05.stdout:4/783: symlink d0/d6/d9/d12/d4f/l104 0 2026-03-10T07:51:03.726 INFO:tasks.workunit.client.0.vm05.stdout:4/784: chown d0/d6 34626 1 2026-03-10T07:51:03.726 
INFO:tasks.workunit.client.0.vm05.stdout:3/689: link d8/fe d8/d1f/d24/d76/dc5/de1/d19/d6b/fe7 0 2026-03-10T07:51:03.726 INFO:tasks.workunit.client.0.vm05.stdout:0/703: sync 2026-03-10T07:51:03.727 INFO:tasks.workunit.client.0.vm05.stdout:5/745: read d2/d20/f2a [9361795,52615] 0 2026-03-10T07:51:03.728 INFO:tasks.workunit.client.0.vm05.stdout:8/630: rename d1/dd/d18/d20/d2a/d34/d49/db7 to d1/dd/d4d/dcc 0 2026-03-10T07:51:03.731 INFO:tasks.workunit.client.0.vm05.stdout:4/785: read d0/d6/d9/d12/d9c/db7/da7/f4a [905491,68921] 0 2026-03-10T07:51:03.735 INFO:tasks.workunit.client.0.vm05.stdout:5/746: truncate d2/f1a 3449670 0 2026-03-10T07:51:03.738 INFO:tasks.workunit.client.0.vm05.stdout:0/704: creat d8/d9c/ff2 x:0 0 0 2026-03-10T07:51:03.738 INFO:tasks.workunit.client.0.vm05.stdout:8/631: write d1/dd/d18/d20/d2a/d34/fc6 [2317733,54668] 0 2026-03-10T07:51:03.743 INFO:tasks.workunit.client.0.vm05.stdout:3/690: mknod d8/d22/ce8 0 2026-03-10T07:51:03.749 INFO:tasks.workunit.client.0.vm05.stdout:0/705: creat d8/dd/d37/d67/d96/ff3 x:0 0 0 2026-03-10T07:51:03.751 INFO:tasks.workunit.client.0.vm05.stdout:3/691: sync 2026-03-10T07:51:03.752 INFO:tasks.workunit.client.0.vm05.stdout:8/632: stat d1/dd/d18/d20/d2a/d9a/fca 0 2026-03-10T07:51:03.755 INFO:tasks.workunit.client.0.vm05.stdout:0/706: dwrite d8/d9c/fec [0,4194304] 0 2026-03-10T07:51:03.758 INFO:tasks.workunit.client.0.vm05.stdout:3/692: creat d8/d1f/d24/d76/dc5/fe9 x:0 0 0 2026-03-10T07:51:03.764 INFO:tasks.workunit.client.0.vm05.stdout:6/735: truncate d0/d11/d57/d66/fbd 620626 0 2026-03-10T07:51:03.769 INFO:tasks.workunit.client.0.vm05.stdout:9/680: write d8/d86/d28/d79/d57/de1/f48 [1314752,46106] 0 2026-03-10T07:51:03.776 INFO:tasks.workunit.client.0.vm05.stdout:9/681: fdatasync d8/d86/d28/d79/d57/de1/d1c/d20/f9f 0 2026-03-10T07:51:03.776 INFO:tasks.workunit.client.0.vm05.stdout:1/725: dwrite da/dd/d12/d34/ddb/fb5 [0,4194304] 0 2026-03-10T07:51:03.776 INFO:tasks.workunit.client.0.vm05.stdout:8/633: symlink 
d1/dd/d18/d20/d2a/d9a/lcd 0 2026-03-10T07:51:03.776 INFO:tasks.workunit.client.0.vm05.stdout:7/738: dwrite d1/d34/f3e [0,4194304] 0 2026-03-10T07:51:03.776 INFO:tasks.workunit.client.0.vm05.stdout:9/682: chown d8/d86/d28/d79/d57/de1/d22/f4f 56344786 1 2026-03-10T07:51:03.779 INFO:tasks.workunit.client.0.vm05.stdout:0/707: mknod d8/dd/d37/d81/cf4 0 2026-03-10T07:51:03.783 INFO:tasks.workunit.client.0.vm05.stdout:8/634: symlink d1/dd/d18/d20/d2a/d34/da5/lce 0 2026-03-10T07:51:03.785 INFO:tasks.workunit.client.0.vm05.stdout:2/766: truncate d0/d8/d3d/fdd 3765437 0 2026-03-10T07:51:03.785 INFO:tasks.workunit.client.0.vm05.stdout:6/736: read d0/d11/d22/f4c [757834,123473] 0 2026-03-10T07:51:03.793 INFO:tasks.workunit.client.0.vm05.stdout:3/693: mkdir d8/d1f/d24/d76/dc5/de1/d52/dea 0 2026-03-10T07:51:03.793 INFO:tasks.workunit.client.0.vm05.stdout:9/683: dwrite d8/d86/d28/d79/d57/de1/d1c/d20/d59/fca [0,4194304] 0 2026-03-10T07:51:03.801 INFO:tasks.workunit.client.0.vm05.stdout:7/739: dwrite d1/d34/d59/fdf [0,4194304] 0 2026-03-10T07:51:03.805 INFO:tasks.workunit.client.0.vm05.stdout:3/694: dwrite d8/d1f/d2a/d96/fd4 [0,4194304] 0 2026-03-10T07:51:03.805 INFO:tasks.workunit.client.0.vm05.stdout:3/695: chown d8/d1f/f95 8100 1 2026-03-10T07:51:03.815 INFO:tasks.workunit.client.0.vm05.stdout:2/767: rename d0/d8/fc3 to d0/d8/d43/da4/dea/ff4 0 2026-03-10T07:51:03.818 INFO:tasks.workunit.client.0.vm05.stdout:1/726: dread da/d26/d2b/daf/dbe/dc0/f79 [0,4194304] 0 2026-03-10T07:51:03.822 INFO:tasks.workunit.client.0.vm05.stdout:3/696: mkdir d8/d22/d60/d6e/deb 0 2026-03-10T07:51:03.827 INFO:tasks.workunit.client.0.vm05.stdout:7/740: truncate d1/f46 24410 0 2026-03-10T07:51:03.833 INFO:tasks.workunit.client.0.vm05.stdout:6/737: rmdir d0/d6/d3b/dbe 0 2026-03-10T07:51:03.835 INFO:tasks.workunit.client.0.vm05.stdout:2/768: mknod d0/d8/d43/dc9/cf5 0 2026-03-10T07:51:03.838 INFO:tasks.workunit.client.0.vm05.stdout:1/727: rmdir da/dd/d2a/d70 39 2026-03-10T07:51:03.842 
INFO:tasks.workunit.client.0.vm05.stdout:1/728: dwrite da/dd/d2a/f93 [0,4194304] 0 2026-03-10T07:51:03.846 INFO:tasks.workunit.client.0.vm05.stdout:3/697: symlink d8/d1c/lec 0 2026-03-10T07:51:03.855 INFO:tasks.workunit.client.0.vm05.stdout:0/708: getdents d8/dd/d37/d56 0 2026-03-10T07:51:03.863 INFO:tasks.workunit.client.0.vm05.stdout:4/786: dwrite d0/d28/fda [0,4194304] 0 2026-03-10T07:51:03.872 INFO:tasks.workunit.client.0.vm05.stdout:2/769: creat d0/d8/d66/dd1/d49/dab/ff6 x:0 0 0 2026-03-10T07:51:03.878 INFO:tasks.workunit.client.0.vm05.stdout:3/698: dread - d8/d1c/d48/faa zero size 2026-03-10T07:51:03.880 INFO:tasks.workunit.client.0.vm05.stdout:7/741: mkdir d1/d3c/db8/de5 0 2026-03-10T07:51:03.883 INFO:tasks.workunit.client.0.vm05.stdout:9/684: rename d8/d86/d28/d79/d57/de1/d1c/d20/f9f to d8/fe2 0 2026-03-10T07:51:03.888 INFO:tasks.workunit.client.0.vm05.stdout:1/729: mknod da/d26/d2b/daf/dbe/dc0/ce2 0 2026-03-10T07:51:03.891 INFO:tasks.workunit.client.0.vm05.stdout:7/742: mkdir d1/d5b/de6 0 2026-03-10T07:51:03.896 INFO:tasks.workunit.client.0.vm05.stdout:4/787: symlink d0/d6/d9/d5a/d6e/dd1/l105 0 2026-03-10T07:51:03.902 INFO:tasks.workunit.client.0.vm05.stdout:7/743: unlink d1/d34/d59/d60/d8c/lb5 0 2026-03-10T07:51:03.902 INFO:tasks.workunit.client.0.vm05.stdout:6/738: getdents d0/d35/d36/db8 0 2026-03-10T07:51:03.902 INFO:tasks.workunit.client.0.vm05.stdout:7/744: fdatasync d1/d34/d59/fdf 0 2026-03-10T07:51:03.903 INFO:tasks.workunit.client.0.vm05.stdout:0/709: link d8/dd/d37/f38 d8/dd/d10/d26/d8b/d86/ff5 0 2026-03-10T07:51:03.904 INFO:tasks.workunit.client.0.vm05.stdout:0/710: write d8/dd/d10/d26/d2a/f8f [1346175,23942] 0 2026-03-10T07:51:03.905 INFO:tasks.workunit.client.0.vm05.stdout:0/711: truncate d8/d9c/ff2 381653 0 2026-03-10T07:51:03.905 INFO:tasks.workunit.client.0.vm05.stdout:0/712: readlink d8/dd/lc5 0 2026-03-10T07:51:03.909 INFO:tasks.workunit.client.0.vm05.stdout:4/788: rmdir d0/d6/d9/d12/d45/d55/d44/d85 39 2026-03-10T07:51:03.912 
INFO:tasks.workunit.client.0.vm05.stdout:5/747: truncate d2/d20/d7b/f83 3007134 0 2026-03-10T07:51:03.923 INFO:tasks.workunit.client.0.vm05.stdout:0/713: dread - d8/dd/d10/f7f zero size 2026-03-10T07:51:03.925 INFO:tasks.workunit.client.0.vm05.stdout:9/685: getdents d8/d86/d28/d79/d57/de1/d6b/dde 0 2026-03-10T07:51:03.937 INFO:tasks.workunit.client.0.vm05.stdout:1/730: creat da/dd/d42/fe3 x:0 0 0 2026-03-10T07:51:03.939 INFO:tasks.workunit.client.0.vm05.stdout:5/748: chown d2/d12/dda/da1/faa 68 1 2026-03-10T07:51:03.939 INFO:tasks.workunit.client.0.vm05.stdout:5/749: fsync d2/d20/d33/fe0 0 2026-03-10T07:51:03.940 INFO:tasks.workunit.client.0.vm05.stdout:5/750: chown d2/l22 795 1 2026-03-10T07:51:03.941 INFO:tasks.workunit.client.0.vm05.stdout:6/739: symlink d0/d11/le6 0 2026-03-10T07:51:03.942 INFO:tasks.workunit.client.0.vm05.stdout:9/686: sync 2026-03-10T07:51:03.946 INFO:tasks.workunit.client.0.vm05.stdout:0/714: mknod d8/dd/d10/d26/d8b/d70/cf6 0 2026-03-10T07:51:03.946 INFO:tasks.workunit.client.0.vm05.stdout:8/635: write d1/dd/d18/d20/d2a/d48/f79 [1950169,2126] 0 2026-03-10T07:51:03.947 INFO:tasks.workunit.client.0.vm05.stdout:0/715: write d8/dd/f22 [1647541,75151] 0 2026-03-10T07:51:03.953 INFO:tasks.workunit.client.0.vm05.stdout:2/770: getdents d0/d8/d43/da4/dea 0 2026-03-10T07:51:03.959 INFO:tasks.workunit.client.0.vm05.stdout:5/751: creat d2/d4b/ff9 x:0 0 0 2026-03-10T07:51:03.960 INFO:tasks.workunit.client.0.vm05.stdout:6/740: mkdir d0/d11/d57/da4/db3/de7 0 2026-03-10T07:51:03.960 INFO:tasks.workunit.client.0.vm05.stdout:6/741: read d0/d11/d4f/da0/da6/fd9 [2435411,29329] 0 2026-03-10T07:51:03.969 INFO:tasks.workunit.client.0.vm05.stdout:9/687: mknod d8/d86/d28/d79/d57/de1/d22/dab/db4/ce3 0 2026-03-10T07:51:03.974 INFO:tasks.workunit.client.0.vm05.stdout:5/752: dread - d2/d5/d61/fc5 zero size 2026-03-10T07:51:03.978 INFO:tasks.workunit.client.0.vm05.stdout:3/699: write d8/d8f/fb4 [159042,47117] 0 2026-03-10T07:51:03.979 
INFO:tasks.workunit.client.0.vm05.stdout:3/700: fdatasync d8/d22/fb9 0 2026-03-10T07:51:03.979 INFO:tasks.workunit.client.0.vm05.stdout:7/745: getdents d1/d6/d47/d8d 0 2026-03-10T07:51:04.000 INFO:tasks.workunit.client.0.vm05.stdout:4/789: dwrite d0/d6/da6/fba [0,4194304] 0 2026-03-10T07:51:04.006 INFO:tasks.workunit.client.0.vm05.stdout:6/742: mkdir d0/d11/d4f/d7d/de8 0 2026-03-10T07:51:04.008 INFO:tasks.workunit.client.0.vm05.stdout:6/743: dread d0/d11/d22/f52 [0,4194304] 0 2026-03-10T07:51:04.009 INFO:tasks.workunit.client.0.vm05.stdout:1/731: write da/f5c [1535364,22169] 0 2026-03-10T07:51:04.013 INFO:tasks.workunit.client.0.vm05.stdout:8/636: dread d1/dd/d4d/f60 [0,4194304] 0 2026-03-10T07:51:04.020 INFO:tasks.workunit.client.0.vm05.stdout:0/716: rename d8/dd/d37/d56/d4d/le3 to d8/dd/d10/d26/lf7 0 2026-03-10T07:51:04.026 INFO:tasks.workunit.client.0.vm05.stdout:4/790: rename d0/d6/d9/d5a to d0/d6/d9/d5a/d106 22 2026-03-10T07:51:04.026 INFO:tasks.workunit.client.0.vm05.stdout:9/688: write d8/d86/d28/d79/d57/d96/fa9 [901138,34275] 0 2026-03-10T07:51:04.026 INFO:tasks.workunit.client.0.vm05.stdout:2/771: truncate d0/d8/d3d/f40 482248 0 2026-03-10T07:51:04.033 INFO:tasks.workunit.client.0.vm05.stdout:5/753: write d2/d20/d33/d53/d7d/f82 [1631447,78296] 0 2026-03-10T07:51:04.034 INFO:tasks.workunit.client.0.vm05.stdout:5/754: write d2/d12/dda/da1/dc0/fce [2685998,13512] 0 2026-03-10T07:51:04.042 INFO:tasks.workunit.client.0.vm05.stdout:0/717: mkdir d8/dd/d37/d56/d4d/df8 0 2026-03-10T07:51:04.043 INFO:tasks.workunit.client.0.vm05.stdout:0/718: stat d8/dd/d37/d67/d96/fdb 0 2026-03-10T07:51:04.043 INFO:tasks.workunit.client.0.vm05.stdout:0/719: chown d8/dd/d10/d26/d8b/d7d/ccd 25 1 2026-03-10T07:51:04.044 INFO:tasks.workunit.client.0.vm05.stdout:3/701: write d8/f12 [3404693,42307] 0 2026-03-10T07:51:04.045 INFO:tasks.workunit.client.0.vm05.stdout:9/689: creat d8/d86/db8/fe4 x:0 0 0 2026-03-10T07:51:04.053 INFO:tasks.workunit.client.0.vm05.stdout:5/755: symlink 
d2/d4b/lfa 0 2026-03-10T07:51:04.054 INFO:tasks.workunit.client.0.vm05.stdout:1/732: write da/dd/d2a/f90 [283598,98165] 0 2026-03-10T07:51:04.059 INFO:tasks.workunit.client.0.vm05.stdout:7/746: creat d1/d6/d47/fe7 x:0 0 0 2026-03-10T07:51:04.061 INFO:tasks.workunit.client.0.vm05.stdout:8/637: rename d1/dd/d18/f5c to d1/dd/d4d/dcc/dbd/fcf 0 2026-03-10T07:51:04.062 INFO:tasks.workunit.client.0.vm05.stdout:4/791: symlink d0/de4/l107 0 2026-03-10T07:51:04.087 INFO:tasks.workunit.client.0.vm05.stdout:3/702: dwrite d8/d1c/d48/f70 [0,4194304] 0 2026-03-10T07:51:04.108 INFO:tasks.workunit.client.0.vm05.stdout:9/690: rename d8/d86/d28/d79/d57/de1/d22/dab/db4/fb6 to d8/d86/d28/d79/d57/de1/d22/fe5 0 2026-03-10T07:51:04.110 INFO:tasks.workunit.client.0.vm05.stdout:8/638: truncate d1/dd/f17 4398688 0 2026-03-10T07:51:04.113 INFO:tasks.workunit.client.0.vm05.stdout:8/639: dwrite d1/d45/f53 [8388608,4194304] 0 2026-03-10T07:51:04.115 INFO:tasks.workunit.client.0.vm05.stdout:8/640: readlink d1/dd/d5e/l8b 0 2026-03-10T07:51:04.122 INFO:tasks.workunit.client.0.vm05.stdout:4/792: mkdir d0/d6/d9/d5a/d6e/dd1/d108 0 2026-03-10T07:51:04.123 INFO:tasks.workunit.client.0.vm05.stdout:4/793: chown d0/d6/d9/d12/d9c/db7/da7/f9a 0 1 2026-03-10T07:51:04.127 INFO:tasks.workunit.client.0.vm05.stdout:2/772: creat d0/d8/d66/dd1/ff7 x:0 0 0 2026-03-10T07:51:04.129 INFO:tasks.workunit.client.0.vm05.stdout:6/744: getdents d0/d6/d3b 0 2026-03-10T07:51:04.137 INFO:tasks.workunit.client.0.vm05.stdout:3/703: mknod d8/d1c/d64/ced 0 2026-03-10T07:51:04.140 INFO:tasks.workunit.client.0.vm05.stdout:9/691: fdatasync d8/d86/d28/d79/d57/de1/d1c/f4c 0 2026-03-10T07:51:04.143 INFO:tasks.workunit.client.0.vm05.stdout:7/747: dwrite d1/d34/d59/f64 [4194304,4194304] 0 2026-03-10T07:51:04.147 INFO:tasks.workunit.client.0.vm05.stdout:8/641: fdatasync d1/dd/d18/f21 0 2026-03-10T07:51:04.147 INFO:tasks.workunit.client.0.vm05.stdout:0/720: creat d8/dd/d10/d26/d3a/d5e/ff9 x:0 0 0 2026-03-10T07:51:04.147 
INFO:tasks.workunit.client.0.vm05.stdout:2/773: mkdir d0/d8/d43/df/df8 0 2026-03-10T07:51:04.148 INFO:tasks.workunit.client.0.vm05.stdout:2/774: fsync d0/d8/d3d/d7d/db2/fbd 0 2026-03-10T07:51:04.148 INFO:tasks.workunit.client.0.vm05.stdout:0/721: chown d8/dd/d10/d26/d48/ff0 1860691 1 2026-03-10T07:51:04.149 INFO:tasks.workunit.client.0.vm05.stdout:2/775: dread - d0/d8/d43/d38/f85 zero size 2026-03-10T07:51:04.153 INFO:tasks.workunit.client.0.vm05.stdout:0/722: dwrite d8/dd/f59 [0,4194304] 0 2026-03-10T07:51:04.155 INFO:tasks.workunit.client.0.vm05.stdout:0/723: write d8/dd/d10/d26/d3a/d5e/fa3 [1256859,35745] 0 2026-03-10T07:51:04.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:51:03 vm05.local ceph-mon[50387]: Upgrade: Updating prometheus.vm05 2026-03-10T07:51:04.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:51:03 vm05.local ceph-mon[50387]: Deploying daemon prometheus.vm05 on vm05 2026-03-10T07:51:04.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:51:03 vm05.local ceph-mon[50387]: pgmap v21: 65 pgs: 65 active+clean; 1.4 GiB data, 4.7 GiB used, 115 GiB / 120 GiB avail; 44 MiB/s rd, 101 MiB/s wr, 265 op/s 2026-03-10T07:51:04.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:51:03 vm08.local ceph-mon[59917]: Upgrade: Updating prometheus.vm05 2026-03-10T07:51:04.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:51:03 vm08.local ceph-mon[59917]: Deploying daemon prometheus.vm05 on vm05 2026-03-10T07:51:04.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:51:03 vm08.local ceph-mon[59917]: pgmap v21: 65 pgs: 65 active+clean; 1.4 GiB data, 4.7 GiB used, 115 GiB / 120 GiB avail; 44 MiB/s rd, 101 MiB/s wr, 265 op/s 2026-03-10T07:51:04.171 INFO:tasks.workunit.client.0.vm05.stdout:5/756: dwrite d2/d12/dda/da1/dc0/dc2/ff8 [0,4194304] 0 2026-03-10T07:51:04.178 INFO:tasks.workunit.client.0.vm05.stdout:9/692: mknod d8/d86/d28/d79/d57/d96/ce6 0 2026-03-10T07:51:04.178 INFO:tasks.workunit.client.0.vm05.stdout:9/693: stat 
d8/d86/d28/d79/d57/de1/d1c/d20/cdc 0 2026-03-10T07:51:04.184 INFO:tasks.workunit.client.0.vm05.stdout:6/745: link d0/d11/d4f/fe3 d0/d11/d4f/da0/fe9 0 2026-03-10T07:51:04.184 INFO:tasks.workunit.client.0.vm05.stdout:6/746: chown d0/d11/le6 87294522 1 2026-03-10T07:51:04.185 INFO:tasks.workunit.client.0.vm05.stdout:8/642: dread d1/dc9/fa3 [0,4194304] 0 2026-03-10T07:51:04.187 INFO:tasks.workunit.client.0.vm05.stdout:2/776: fsync d0/d8/d3d/d7d/db2/f29 0 2026-03-10T07:51:04.187 INFO:tasks.workunit.client.0.vm05.stdout:2/777: chown d0/d8/d43/df/d8b/la3 30258382 1 2026-03-10T07:51:04.189 INFO:tasks.workunit.client.0.vm05.stdout:1/733: link da/dd/d12/d34/c60 da/dd/d2a/d55/ce4 0 2026-03-10T07:51:04.192 INFO:tasks.workunit.client.0.vm05.stdout:0/724: creat d8/dd/d37/d56/d4d/ffa x:0 0 0 2026-03-10T07:51:04.193 INFO:tasks.workunit.client.0.vm05.stdout:3/704: creat d8/d1f/d2a/d4a/d7d/fee x:0 0 0 2026-03-10T07:51:04.193 INFO:tasks.workunit.client.0.vm05.stdout:5/757: fdatasync d2/d12/f3a 0 2026-03-10T07:51:04.194 INFO:tasks.workunit.client.0.vm05.stdout:3/705: chown d8/d1f/d24/d76/dc5/de1/f4c 5233 1 2026-03-10T07:51:04.197 INFO:tasks.workunit.client.0.vm05.stdout:9/694: mkdir d8/d86/d28/d79/d57/de1/d22/d33/d62/d6d/de7 0 2026-03-10T07:51:04.206 INFO:tasks.workunit.client.0.vm05.stdout:6/747: dread d0/d11/f13 [0,4194304] 0 2026-03-10T07:51:04.207 INFO:tasks.workunit.client.0.vm05.stdout:8/643: mknod d1/dd/d18/d20/cd0 0 2026-03-10T07:51:04.209 INFO:tasks.workunit.client.0.vm05.stdout:2/778: dread - d0/d8/d3d/d7d/da5/fd2 zero size 2026-03-10T07:51:04.210 INFO:tasks.workunit.client.0.vm05.stdout:2/779: chown d0/d8/d43/d38/lb9 63 1 2026-03-10T07:51:04.221 INFO:tasks.workunit.client.0.vm05.stdout:3/706: dread d8/f25 [0,4194304] 0 2026-03-10T07:51:04.221 INFO:tasks.workunit.client.0.vm05.stdout:3/707: chown d8/d1c/d48/f70 610 1 2026-03-10T07:51:04.224 INFO:tasks.workunit.client.0.vm05.stdout:9/695: mknod d8/d86/d28/d79/d57/de1/d22/d33/d62/d6d/d9e/ce8 0 2026-03-10T07:51:04.227 
INFO:tasks.workunit.client.0.vm05.stdout:1/734: sync 2026-03-10T07:51:04.230 INFO:tasks.workunit.client.0.vm05.stdout:6/748: creat d0/d11/d2e/d81/d92/dc2/fea x:0 0 0 2026-03-10T07:51:04.230 INFO:tasks.workunit.client.0.vm05.stdout:6/749: chown d0/d11/d4f/da0/da6 28 1 2026-03-10T07:51:04.238 INFO:tasks.workunit.client.0.vm05.stdout:5/758: dread d2/d20/d4c/fa5 [0,4194304] 0 2026-03-10T07:51:04.244 INFO:tasks.workunit.client.0.vm05.stdout:8/644: truncate d1/dd/d18/d20/d2a/d34/d49/fc7 2358573 0 2026-03-10T07:51:04.245 INFO:tasks.workunit.client.0.vm05.stdout:0/725: rename d8/dd/d10/d26/d2a/fab to d8/dd/d10/d26/d3a/d5e/ffb 0 2026-03-10T07:51:04.246 INFO:tasks.workunit.client.0.vm05.stdout:0/726: fsync d8/dd/d10/d26/d3a/d5e/ff9 0 2026-03-10T07:51:04.252 INFO:tasks.workunit.client.0.vm05.stdout:0/727: sync 2026-03-10T07:51:04.253 INFO:tasks.workunit.client.0.vm05.stdout:7/748: dwrite d1/d3c/d71/d79/f93 [0,4194304] 0 2026-03-10T07:51:04.266 INFO:tasks.workunit.client.0.vm05.stdout:4/794: write d0/d6/d9/d12/d45/d55/d44/f7e [5031556,22513] 0 2026-03-10T07:51:04.273 INFO:tasks.workunit.client.0.vm05.stdout:2/780: rename d0/d8/d3d/d7d to d0/d8/d66/dd1/d49/df9 0 2026-03-10T07:51:04.279 INFO:tasks.workunit.client.0.vm05.stdout:6/750: write d0/d11/d4f/d56/d96/db6/faa [2054225,3865] 0 2026-03-10T07:51:04.280 INFO:tasks.workunit.client.0.vm05.stdout:9/696: write d8/d86/d28/d79/d57/de1/f51 [3833837,20301] 0 2026-03-10T07:51:04.280 INFO:tasks.workunit.client.0.vm05.stdout:7/749: creat d1/d6/d3b/d7f/fe8 x:0 0 0 2026-03-10T07:51:04.280 INFO:tasks.workunit.client.0.vm05.stdout:7/750: readlink d1/d3c/l62 0 2026-03-10T07:51:04.280 INFO:tasks.workunit.client.0.vm05.stdout:7/751: write d1/d3c/d71/d79/d8a/fce [704592,128245] 0 2026-03-10T07:51:04.284 INFO:tasks.workunit.client.0.vm05.stdout:7/752: dwrite d1/d3c/d4b/f4f [0,4194304] 0 2026-03-10T07:51:04.287 INFO:tasks.workunit.client.0.vm05.stdout:1/735: link da/d26/d9e/fa1 da/fe5 0 2026-03-10T07:51:04.291 
INFO:tasks.workunit.client.0.vm05.stdout:4/795: write d0/d6/d9/d12/d45/d55/f56 [6224045,41259] 0 2026-03-10T07:51:04.300 INFO:tasks.workunit.client.0.vm05.stdout:3/708: rename d8/d22/d60/d6e/deb to d8/d1f/d24/d76/dc5/de1/d19/d37/def 0 2026-03-10T07:51:04.301 INFO:tasks.workunit.client.0.vm05.stdout:3/709: chown d8/d1f/d24/d76/cd9 5348131 1 2026-03-10T07:51:04.304 INFO:tasks.workunit.client.0.vm05.stdout:6/751: creat d0/d35/d36/db8/feb x:0 0 0 2026-03-10T07:51:04.310 INFO:tasks.workunit.client.0.vm05.stdout:0/728: creat d8/dd/d37/d56/d4d/df8/ffc x:0 0 0 2026-03-10T07:51:04.319 INFO:tasks.workunit.client.0.vm05.stdout:5/759: write d2/d20/d4c/f8c [2948509,34023] 0 2026-03-10T07:51:04.320 INFO:tasks.workunit.client.0.vm05.stdout:8/645: truncate d1/d45/f81 2452131 0 2026-03-10T07:51:04.324 INFO:tasks.workunit.client.0.vm05.stdout:4/796: mknod d0/d6/d60/dde/c109 0 2026-03-10T07:51:04.327 INFO:tasks.workunit.client.0.vm05.stdout:6/752: truncate d0/d6/f16 1275429 0 2026-03-10T07:51:04.332 INFO:tasks.workunit.client.0.vm05.stdout:6/753: dwrite d0/d11/d2e/d81/d92/dc2/fdb [0,4194304] 0 2026-03-10T07:51:04.343 INFO:tasks.workunit.client.0.vm05.stdout:9/697: mkdir d8/d86/d28/de9 0 2026-03-10T07:51:04.344 INFO:tasks.workunit.client.0.vm05.stdout:4/797: truncate d0/d6/d9/d12/d45/d55/f7d 324422 0 2026-03-10T07:51:04.344 INFO:tasks.workunit.client.0.vm05.stdout:8/646: fsync d1/fe 0 2026-03-10T07:51:04.349 INFO:tasks.workunit.client.0.vm05.stdout:8/647: dwrite d1/dd/d4d/d64/d6a/fb5 [8388608,4194304] 0 2026-03-10T07:51:04.354 INFO:tasks.workunit.client.0.vm05.stdout:0/729: mkdir d8/dd/d37/dfd 0 2026-03-10T07:51:04.365 INFO:tasks.workunit.client.0.vm05.stdout:6/754: mknod d0/d35/d36/dc8/cec 0 2026-03-10T07:51:04.373 INFO:tasks.workunit.client.0.vm05.stdout:7/753: creat d1/d6/d3b/fe9 x:0 0 0 2026-03-10T07:51:04.374 INFO:tasks.workunit.client.0.vm05.stdout:6/755: write d0/d11/d4f/d56/f6b [299442,51239] 0 2026-03-10T07:51:04.374 INFO:tasks.workunit.client.0.vm05.stdout:6/756: readlink 
d0/d11/d4f/d56/d96/lae 0 2026-03-10T07:51:04.374 INFO:tasks.workunit.client.0.vm05.stdout:8/648: dwrite d1/dd/d5e/d9e/fc3 [0,4194304] 0 2026-03-10T07:51:04.374 INFO:tasks.workunit.client.0.vm05.stdout:4/798: dread d0/d6/d9/d5a/d6e/fa8 [0,4194304] 0 2026-03-10T07:51:04.377 INFO:tasks.workunit.client.0.vm05.stdout:8/649: dread d1/fe [0,4194304] 0 2026-03-10T07:51:04.383 INFO:tasks.workunit.client.0.vm05.stdout:2/781: getdents d0/d8/d66 0 2026-03-10T07:51:04.385 INFO:tasks.workunit.client.0.vm05.stdout:0/730: mknod d8/dd/d10/d26/d8b/da4/ddf/cfe 0 2026-03-10T07:51:04.388 INFO:tasks.workunit.client.0.vm05.stdout:5/760: creat d2/d20/d4c/ffb x:0 0 0 2026-03-10T07:51:04.390 INFO:tasks.workunit.client.0.vm05.stdout:3/710: getdents d8/d22 0 2026-03-10T07:51:04.393 INFO:tasks.workunit.client.0.vm05.stdout:9/698: symlink d8/d86/d28/d79/d57/de1/d22/d33/d62/de0/lea 0 2026-03-10T07:51:04.395 INFO:tasks.workunit.client.0.vm05.stdout:2/782: mknod d0/d52/cfa 0 2026-03-10T07:51:04.397 INFO:tasks.workunit.client.0.vm05.stdout:6/757: creat d0/d11/d31/dbf/fed x:0 0 0 2026-03-10T07:51:04.399 INFO:tasks.workunit.client.0.vm05.stdout:7/754: sync 2026-03-10T07:51:04.400 INFO:tasks.workunit.client.0.vm05.stdout:4/799: sync 2026-03-10T07:51:04.401 INFO:tasks.workunit.client.0.vm05.stdout:4/800: dread d0/d6/d9/d5a/fee [0,4194304] 0 2026-03-10T07:51:04.407 INFO:tasks.workunit.client.0.vm05.stdout:5/761: read d2/d12/f2b [841465,48972] 0 2026-03-10T07:51:04.407 INFO:tasks.workunit.client.0.vm05.stdout:3/711: mkdir d8/d1f/d24/d76/df0 0 2026-03-10T07:51:04.407 INFO:tasks.workunit.client.0.vm05.stdout:9/699: creat d8/d86/d28/d79/d57/de1/d1c/d20/d59/feb x:0 0 0 2026-03-10T07:51:04.409 INFO:tasks.workunit.client.0.vm05.stdout:2/783: mknod d0/d8/d66/dd1/d49/df9/db2/dd7/ddb/cfb 0 2026-03-10T07:51:04.413 INFO:tasks.workunit.client.0.vm05.stdout:7/755: read d1/d6/f84 [2132548,127344] 0 2026-03-10T07:51:04.414 INFO:tasks.workunit.client.0.vm05.stdout:0/731: dread d8/dd/d10/d26/d48/fb0 [0,4194304] 0 
2026-03-10T07:51:04.414 INFO:tasks.workunit.client.0.vm05.stdout:7/756: dread - d1/d3c/d71/d79/d8a/fbc zero size 2026-03-10T07:51:04.415 INFO:tasks.workunit.client.0.vm05.stdout:0/732: readlink d8/dd/d10/l88 0 2026-03-10T07:51:04.416 INFO:tasks.workunit.client.0.vm05.stdout:7/757: truncate d1/d6/d80/d82/fc9 535448 0 2026-03-10T07:51:04.417 INFO:tasks.workunit.client.0.vm05.stdout:6/758: fdatasync d0/d35/d36/db8/fcf 0 2026-03-10T07:51:04.417 INFO:tasks.workunit.client.0.vm05.stdout:5/762: symlink d2/d20/d7b/dbe/lfc 0 2026-03-10T07:51:04.420 INFO:tasks.workunit.client.0.vm05.stdout:6/759: write d0/d11/d2e/d81/d92/dc2/fdb [109104,47196] 0 2026-03-10T07:51:04.422 INFO:tasks.workunit.client.0.vm05.stdout:4/801: dread d0/d6/d9/d12/d45/f66 [0,4194304] 0 2026-03-10T07:51:04.427 INFO:tasks.workunit.client.0.vm05.stdout:3/712: unlink d8/f12 0 2026-03-10T07:51:04.428 INFO:tasks.workunit.client.0.vm05.stdout:3/713: chown d8/d1f/d2a/d96/da9/fd6 1694834 1 2026-03-10T07:51:04.431 INFO:tasks.workunit.client.0.vm05.stdout:2/784: unlink d0/d8/d66/dd1/fae 0 2026-03-10T07:51:04.434 INFO:tasks.workunit.client.0.vm05.stdout:4/802: dwrite d0/d6/d9/d12/f36 [4194304,4194304] 0 2026-03-10T07:51:04.451 INFO:tasks.workunit.client.0.vm05.stdout:7/758: readlink d1/d6/d47/lae 0 2026-03-10T07:51:04.452 INFO:tasks.workunit.client.0.vm05.stdout:7/759: chown d1/d3c/d71/d79/d8a/fad 13517 1 2026-03-10T07:51:04.458 INFO:tasks.workunit.client.0.vm05.stdout:9/700: creat d8/d86/d28/d79/d57/d96/dd8/fec x:0 0 0 2026-03-10T07:51:04.479 INFO:tasks.workunit.client.0.vm05.stdout:9/701: dread d8/d86/d28/d79/d57/de1/d1c/d20/d59/d8b/f39 [0,4194304] 0 2026-03-10T07:51:04.481 INFO:tasks.workunit.client.0.vm05.stdout:5/763: mknod d2/d12/d4d/cfd 0 2026-03-10T07:51:04.481 INFO:tasks.workunit.client.0.vm05.stdout:6/760: mknod d0/d11/d2e/d81/d92/cee 0 2026-03-10T07:51:04.482 INFO:tasks.workunit.client.0.vm05.stdout:6/761: dread - d0/d11/d2e/d81/d92/dc2/fea zero size 2026-03-10T07:51:04.483 
INFO:tasks.workunit.client.0.vm05.stdout:6/762: dread - d0/d11/d57/fb9 zero size 2026-03-10T07:51:04.496 INFO:tasks.workunit.client.0.vm05.stdout:2/785: mkdir d0/d8/d43/dc9/dfc 0 2026-03-10T07:51:04.498 INFO:tasks.workunit.client.0.vm05.stdout:4/803: unlink d0/d28/fda 0 2026-03-10T07:51:04.501 INFO:tasks.workunit.client.0.vm05.stdout:0/733: creat d8/dd/d37/dfd/fff x:0 0 0 2026-03-10T07:51:04.512 INFO:tasks.workunit.client.0.vm05.stdout:1/736: dwrite da/dd/d12/f22 [0,4194304] 0 2026-03-10T07:51:04.538 INFO:tasks.workunit.client.0.vm05.stdout:6/763: rename d0/d6/f98 to d0/d11/d4f/d56/d96/db6/fef 0 2026-03-10T07:51:04.541 INFO:tasks.workunit.client.0.vm05.stdout:2/786: unlink d0/d8/d3d/l63 0 2026-03-10T07:51:04.554 INFO:tasks.workunit.client.0.vm05.stdout:0/734: mkdir d8/d9c/dc8/d100 0 2026-03-10T07:51:04.560 INFO:tasks.workunit.client.0.vm05.stdout:5/764: link d2/d20/d33/d53/fb0 d2/d5/d61/ffe 0 2026-03-10T07:51:04.562 INFO:tasks.workunit.client.0.vm05.stdout:6/764: rmdir d0/d11/d4f/d7d 39 2026-03-10T07:51:04.563 INFO:tasks.workunit.client.0.vm05.stdout:3/714: link d8/l9 d8/d1f/d24/d76/dc5/de1/d19/d37/lf1 0 2026-03-10T07:51:04.565 INFO:tasks.workunit.client.0.vm05.stdout:5/765: truncate d2/f9 5216729 0 2026-03-10T07:51:04.567 INFO:tasks.workunit.client.0.vm05.stdout:1/737: rename da/dd/d42/d80/l99 to da/d26/le6 0 2026-03-10T07:51:04.568 INFO:tasks.workunit.client.0.vm05.stdout:2/787: creat d0/d2a/d8c/ffd x:0 0 0 2026-03-10T07:51:04.575 INFO:tasks.workunit.client.0.vm05.stdout:3/715: readlink d8/d1f/d24/d76/dc5/de1/d19/d6b/la0 0 2026-03-10T07:51:04.577 INFO:tasks.workunit.client.0.vm05.stdout:5/766: rmdir d2/d20 39 2026-03-10T07:51:04.577 INFO:tasks.workunit.client.0.vm05.stdout:9/702: getdents d8/d86/d28/d79/d57/dbc 0 2026-03-10T07:51:04.578 INFO:tasks.workunit.client.0.vm05.stdout:6/765: truncate d0/d11/f13 2718459 0 2026-03-10T07:51:04.582 INFO:tasks.workunit.client.0.vm05.stdout:5/767: chown d2/d12/d2d/d4a 1 1 2026-03-10T07:51:04.589 
INFO:tasks.workunit.client.0.vm05.stdout:1/738: sync 2026-03-10T07:51:04.592 INFO:tasks.workunit.client.0.vm05.stdout:8/650: dwrite d1/dd/d18/d20/d2a/d34/d49/d5d/f84 [0,4194304] 0 2026-03-10T07:51:04.605 INFO:tasks.workunit.client.0.vm05.stdout:0/735: dwrite d8/dd/d10/d26/d2a/fee [0,4194304] 0 2026-03-10T07:51:04.609 INFO:tasks.workunit.client.0.vm05.stdout:5/768: readlink d2/d20/d33/l8a 0 2026-03-10T07:51:04.610 INFO:tasks.workunit.client.0.vm05.stdout:5/769: readlink d2/l28 0 2026-03-10T07:51:04.623 INFO:tasks.workunit.client.0.vm05.stdout:1/739: mknod da/dd/d2a/d55/d64/dd1/ce7 0 2026-03-10T07:51:04.623 INFO:tasks.workunit.client.0.vm05.stdout:2/788: link d0/d8/fcd d0/d8/d43/dc9/dfc/ffe 0 2026-03-10T07:51:04.625 INFO:tasks.workunit.client.0.vm05.stdout:1/740: write da/d26/d2b/d89/fca [997211,25885] 0 2026-03-10T07:51:04.626 INFO:tasks.workunit.client.0.vm05.stdout:1/741: chown da/d26/d2b/daf/dbe/dc0 167266 1 2026-03-10T07:51:04.630 INFO:tasks.workunit.client.0.vm05.stdout:6/766: symlink d0/d11/d4f/lf0 0 2026-03-10T07:51:04.630 INFO:tasks.workunit.client.0.vm05.stdout:1/742: truncate da/dd/d2a/d55/fbf 1355687 0 2026-03-10T07:51:04.632 INFO:tasks.workunit.client.0.vm05.stdout:0/736: fdatasync d8/dd/d37/f4f 0 2026-03-10T07:51:04.637 INFO:tasks.workunit.client.0.vm05.stdout:0/737: dread - d8/dd/d10/d26/d3a/fe1 zero size 2026-03-10T07:51:04.655 INFO:tasks.workunit.client.0.vm05.stdout:2/789: rename d0/d8/d66/dd1/d49/df9/da5/da8/lc5 to d0/d8/d43/df/df8/lff 0 2026-03-10T07:51:04.659 INFO:tasks.workunit.client.0.vm05.stdout:2/790: chown d0/d8/d43/df/c13 1 1 2026-03-10T07:51:04.663 INFO:tasks.workunit.client.0.vm05.stdout:6/767: mknod d0/d11/d4f/da0/da6/cf1 0 2026-03-10T07:51:04.667 INFO:tasks.workunit.client.0.vm05.stdout:7/760: dwrite d1/d6/d80/d82/fa8 [0,4194304] 0 2026-03-10T07:51:04.671 INFO:tasks.workunit.client.0.vm05.stdout:5/770: mknod d2/d12/da8/ddd/de9/cff 0 2026-03-10T07:51:04.671 INFO:tasks.workunit.client.0.vm05.stdout:0/738: write d8/dd/d10/d26/d2a/f2e 
[1165795,86683] 0 2026-03-10T07:51:04.675 INFO:tasks.workunit.client.0.vm05.stdout:5/771: write d2/d12/d2d/d4a/ff7 [639483,51968] 0 2026-03-10T07:51:04.678 INFO:tasks.workunit.client.0.vm05.stdout:2/791: rmdir d0/d8/d43/df/d4d 39 2026-03-10T07:51:04.684 INFO:tasks.workunit.client.0.vm05.stdout:4/804: dwrite d0/d6/d9/d12/d9c/db7/da7/f4c [0,4194304] 0 2026-03-10T07:51:04.705 INFO:tasks.workunit.client.0.vm05.stdout:4/805: dread d0/d6/d9/d12/d9c/db7/da7/f4a [0,4194304] 0 2026-03-10T07:51:04.705 INFO:tasks.workunit.client.0.vm05.stdout:7/761: symlink d1/d34/lea 0 2026-03-10T07:51:04.708 INFO:tasks.workunit.client.0.vm05.stdout:0/739: rmdir d8/dd/d10/d26/d3a/d5e 39 2026-03-10T07:51:04.711 INFO:tasks.workunit.client.0.vm05.stdout:2/792: mknod d0/d8/d66/dd1/d49/df9/db2/dd7/c100 0 2026-03-10T07:51:04.714 INFO:tasks.workunit.client.0.vm05.stdout:6/768: mknod d0/d11/d86/cf2 0 2026-03-10T07:51:04.715 INFO:tasks.workunit.client.0.vm05.stdout:9/703: write d8/d86/d28/d79/d57/de1/d1c/f8c [558501,66670] 0 2026-03-10T07:51:04.729 INFO:tasks.workunit.client.0.vm05.stdout:4/806: dwrite d0/d6/d9/d5a/ff1 [0,4194304] 0 2026-03-10T07:51:04.738 INFO:tasks.workunit.client.0.vm05.stdout:1/743: link da/d26/d2b/ca4 da/d26/d2b/dcb/ce8 0 2026-03-10T07:51:04.738 INFO:tasks.workunit.client.0.vm05.stdout:7/762: symlink d1/d3c/db8/de5/leb 0 2026-03-10T07:51:04.740 INFO:tasks.workunit.client.0.vm05.stdout:3/716: dwrite d8/d1f/d24/d76/dc5/de1/d52/f9f [0,4194304] 0 2026-03-10T07:51:04.750 INFO:tasks.workunit.client.0.vm05.stdout:5/772: link d2/d12/d4d/f5d d2/d20/d33/d53/f100 0 2026-03-10T07:51:04.750 INFO:tasks.workunit.client.0.vm05.stdout:0/740: dwrite d8/dd/d37/d56/feb [0,4194304] 0 2026-03-10T07:51:04.759 INFO:tasks.workunit.client.0.vm05.stdout:2/793: creat d0/d8/d43/df/d4e/f101 x:0 0 0 2026-03-10T07:51:04.791 INFO:tasks.workunit.client.0.vm05.stdout:7/763: mknod d1/d3c/d71/d79/cec 0 2026-03-10T07:51:04.792 INFO:tasks.workunit.client.0.vm05.stdout:6/769: symlink d0/lf3 0 2026-03-10T07:51:04.808 
INFO:tasks.workunit.client.0.vm05.stdout:0/741: unlink d8/dd/d37/d67/cd5 0 2026-03-10T07:51:04.809 INFO:tasks.workunit.client.0.vm05.stdout:8/651: truncate d1/dd/d18/d20/f30 1626635 0 2026-03-10T07:51:04.810 INFO:tasks.workunit.client.0.vm05.stdout:2/794: creat d0/d8/d43/da4/f102 x:0 0 0 2026-03-10T07:51:04.825 INFO:tasks.workunit.client.0.vm05.stdout:7/764: creat d1/d34/fed x:0 0 0 2026-03-10T07:51:04.825 INFO:tasks.workunit.client.0.vm05.stdout:3/717: chown d8/d8f/dbc/dc7 828 1 2026-03-10T07:51:04.825 INFO:tasks.workunit.client.0.vm05.stdout:4/807: write d0/d6/fa0 [881733,128236] 0 2026-03-10T07:51:04.828 INFO:tasks.workunit.client.0.vm05.stdout:1/744: rename da/dd/d12/d86/cae to da/d26/d2b/d89/dbd/ce9 0 2026-03-10T07:51:04.833 INFO:tasks.workunit.client.0.vm05.stdout:1/745: write da/dd/d12/d34/d58/fd4 [662917,65092] 0 2026-03-10T07:51:04.836 INFO:tasks.workunit.client.0.vm05.stdout:8/652: symlink d1/dd/d18/d20/d2a/d34/da5/ld1 0 2026-03-10T07:51:04.840 INFO:tasks.workunit.client.0.vm05.stdout:7/765: truncate d1/d3c/d71/fb1 339475 0 2026-03-10T07:51:04.841 INFO:tasks.workunit.client.0.vm05.stdout:7/766: chown d1/d3c/c54 1316 1 2026-03-10T07:51:04.844 INFO:tasks.workunit.client.0.vm05.stdout:2/795: mkdir d0/d8/d66/dd1/d103 0 2026-03-10T07:51:04.845 INFO:tasks.workunit.client.0.vm05.stdout:9/704: getdents d8/d86/d28/d79/d57/de1/d1c/d20/dd3/d63 0 2026-03-10T07:51:04.845 INFO:tasks.workunit.client.0.vm05.stdout:8/653: creat d1/dd/d18/d20/fd2 x:0 0 0 2026-03-10T07:51:04.852 INFO:tasks.workunit.client.0.vm05.stdout:6/770: link d0/d11/d2e/d81/d92/dc2/fea d0/d11/d57/ff4 0 2026-03-10T07:51:04.853 INFO:tasks.workunit.client.0.vm05.stdout:5/773: dwrite d2/d12/dda/da1/fbf [0,4194304] 0 2026-03-10T07:51:04.860 INFO:tasks.workunit.client.0.vm05.stdout:0/742: link d8/dd/d10/f19 d8/dd/d10/d26/d8b/d86/f101 0 2026-03-10T07:51:04.861 INFO:tasks.workunit.client.0.vm05.stdout:4/808: dwrite d0/d6/da6/fba [4194304,4194304] 0 2026-03-10T07:51:04.871 
INFO:tasks.workunit.client.0.vm05.stdout:1/746: truncate da/dd/d12/d86/d9a/fbc 814901 0 2026-03-10T07:51:04.879 INFO:tasks.workunit.client.0.vm05.stdout:2/796: dwrite d0/d8/d66/dd1/d49/dab/fe9 [0,4194304] 0 2026-03-10T07:51:04.883 INFO:tasks.workunit.client.0.vm05.stdout:7/767: fdatasync d1/d3c/d4b/fb3 0 2026-03-10T07:51:04.885 INFO:tasks.workunit.client.0.vm05.stdout:7/768: readlink d1/d34/d59/d60/d8c/ld0 0 2026-03-10T07:51:04.897 INFO:tasks.workunit.client.0.vm05.stdout:6/771: mknod d0/d35/d36/d43/d9c/cf5 0 2026-03-10T07:51:04.897 INFO:tasks.workunit.client.0.vm05.stdout:5/774: rename d2/d5/l1d to d2/d12/da8/ddd/l101 0 2026-03-10T07:51:04.904 INFO:tasks.workunit.client.0.vm05.stdout:2/797: chown d0/d8/d43/df/d8b/f99 1750737 1 2026-03-10T07:51:04.912 INFO:tasks.workunit.client.0.vm05.stdout:6/772: creat d0/d11/d22/d6c/d84/ff6 x:0 0 0 2026-03-10T07:51:04.914 INFO:tasks.workunit.client.0.vm05.stdout:7/769: link d1/d6/d3b/fda d1/d3c/d71/d79/fee 0 2026-03-10T07:51:04.917 INFO:tasks.workunit.client.0.vm05.stdout:4/809: mknod d0/d6/d9/d5a/d6e/db6/c10a 0 2026-03-10T07:51:04.917 INFO:tasks.workunit.client.0.vm05.stdout:1/747: mknod da/dd/d2a/cea 0 2026-03-10T07:51:04.918 INFO:tasks.workunit.client.0.vm05.stdout:7/770: chown d1/d6/d80/d82 70268 1 2026-03-10T07:51:04.919 INFO:tasks.workunit.client.0.vm05.stdout:2/798: creat d0/d2a/d8c/f104 x:0 0 0 2026-03-10T07:51:04.921 INFO:tasks.workunit.client.0.vm05.stdout:7/771: chown d1/d6/d47/d8d/faf 28475 1 2026-03-10T07:51:04.926 INFO:tasks.workunit.client.0.vm05.stdout:6/773: rename d0/d11/f1c to d0/d35/d36/d43/ff7 0 2026-03-10T07:51:04.929 INFO:tasks.workunit.client.0.vm05.stdout:1/748: stat da/dd/d2a/d55/d68/f36 0 2026-03-10T07:51:04.938 INFO:tasks.workunit.client.0.vm05.stdout:3/718: truncate d8/f3b 1644486 0 2026-03-10T07:51:04.938 INFO:tasks.workunit.client.0.vm05.stdout:6/774: truncate d0/d11/d57/faf 313841 0 2026-03-10T07:51:04.947 INFO:tasks.workunit.client.0.vm05.stdout:7/772: symlink d1/d34/d59/d60/lef 0 
2026-03-10T07:51:04.947 INFO:tasks.workunit.client.0.vm05.stdout:5/775: write d2/f15 [3884565,72585] 0 2026-03-10T07:51:04.950 INFO:tasks.workunit.client.0.vm05.stdout:4/810: rename d0/d6/d9/d5a/f2f to d0/d6/d9/d12/d45/d55/d44/d85/f10b 0 2026-03-10T07:51:04.960 INFO:tasks.workunit.client.0.vm05.stdout:9/705: dwrite d8/d86/d28/d79/d57/de1/d38/d71/fb5 [0,4194304] 0 2026-03-10T07:51:04.960 INFO:tasks.workunit.client.0.vm05.stdout:5/776: truncate d2/d4b/fc6 222290 0 2026-03-10T07:51:04.964 INFO:tasks.workunit.client.0.vm05.stdout:8/654: dwrite d1/dd/d18/d20/d2a/d9a/fae [0,4194304] 0 2026-03-10T07:51:04.979 INFO:tasks.workunit.client.0.vm05.stdout:9/706: chown d8/d86/d28/d79/d57/de1/d1c/f8c 79823865 1 2026-03-10T07:51:04.997 INFO:tasks.workunit.client.0.vm05.stdout:7/773: dwrite d1/f49 [8388608,4194304] 0 2026-03-10T07:51:05.007 INFO:tasks.workunit.client.0.vm05.stdout:1/749: dwrite da/dd/d2a/d55/d64/f7a [0,4194304] 0 2026-03-10T07:51:05.008 INFO:tasks.workunit.client.0.vm05.stdout:0/743: dwrite d8/dd/d10/d26/d3a/d5e/f71 [4194304,4194304] 0 2026-03-10T07:51:05.023 INFO:tasks.workunit.client.0.vm05.stdout:9/707: rmdir d8/d86/d28/d79/d57/de1/d22/d33/d62/de0 39 2026-03-10T07:51:05.043 INFO:tasks.workunit.client.0.vm05.stdout:4/811: rename d0/d6/d9/d12/d45/d55/d44/fbf to d0/d6/d9/d5a/d91/f10c 0 2026-03-10T07:51:05.049 INFO:tasks.workunit.client.0.vm05.stdout:0/744: creat d8/dd/d10/d26/d8b/d7d/f102 x:0 0 0 2026-03-10T07:51:05.049 INFO:tasks.workunit.client.0.vm05.stdout:0/745: readlink d8/dd/d10/db7/dc3/lcf 0 2026-03-10T07:51:05.051 INFO:tasks.workunit.client.0.vm05.stdout:0/746: dread - d8/dd/d37/d56/d4d/ffa zero size 2026-03-10T07:51:05.052 INFO:tasks.workunit.client.0.vm05.stdout:0/747: write d8/dd/f40 [3455178,65507] 0 2026-03-10T07:51:05.052 INFO:tasks.workunit.client.0.vm05.stdout:8/655: mkdir d1/d52/dd3 0 2026-03-10T07:51:05.056 INFO:tasks.workunit.client.0.vm05.stdout:9/708: dread d8/f8a [0,4194304] 0 2026-03-10T07:51:05.063 
INFO:tasks.workunit.client.0.vm05.stdout:3/719: rename d8/d1c/d48/cdb to d8/d22/dad/cf2 0 2026-03-10T07:51:05.071 INFO:tasks.workunit.client.0.vm05.stdout:4/812: read d0/d6/d9/d5a/d91/fc9 [342218,125555] 0 2026-03-10T07:51:05.071 INFO:tasks.workunit.client.0.vm05.stdout:6/775: creat d0/d11/d57/ff8 x:0 0 0 2026-03-10T07:51:05.077 INFO:tasks.workunit.client.0.vm05.stdout:8/656: creat d1/dd/d18/d20/d2a/d9a/fd4 x:0 0 0 2026-03-10T07:51:05.105 INFO:tasks.workunit.client.0.vm05.stdout:2/799: dread d0/d8/d43/df/f20 [0,4194304] 0 2026-03-10T07:51:05.110 INFO:tasks.workunit.client.0.vm05.stdout:5/777: write d2/d20/d33/d86/fb5 [576059,102478] 0 2026-03-10T07:51:05.131 INFO:tasks.workunit.client.0.vm05.stdout:9/709: truncate d8/d86/d28/d79/d57/de1/d38/fa3 241183 0 2026-03-10T07:51:05.132 INFO:tasks.workunit.client.0.vm05.stdout:2/800: rename d0/d8/d66/dd1/d49/d81/caa to d0/d8/d66/dd1/d49/df9/da5/da8/c105 0 2026-03-10T07:51:05.133 INFO:tasks.workunit.client.0.vm05.stdout:3/720: creat d8/d1f/d2a/d4a/d7d/ff3 x:0 0 0 2026-03-10T07:51:05.139 INFO:tasks.workunit.client.0.vm05.stdout:0/748: creat d8/dd/d37/d67/f103 x:0 0 0 2026-03-10T07:51:05.143 INFO:tasks.workunit.client.0.vm05.stdout:7/774: write d1/f16 [836470,80118] 0 2026-03-10T07:51:05.146 INFO:tasks.workunit.client.0.vm05.stdout:9/710: rmdir d8/d86/d28/d79/d57/de1/d1c/d20 39 2026-03-10T07:51:05.147 INFO:tasks.workunit.client.0.vm05.stdout:6/776: link d0/l5 d0/d11/d22/d6c/lf9 0 2026-03-10T07:51:05.148 INFO:tasks.workunit.client.0.vm05.stdout:0/749: chown d8/dd/d10/d26/c64 41401 1 2026-03-10T07:51:05.149 INFO:tasks.workunit.client.0.vm05.stdout:7/775: rename d1/d3c/d71/d79/d8a/cbb to d1/d34/cf0 0 2026-03-10T07:51:05.150 INFO:tasks.workunit.client.0.vm05.stdout:3/721: dwrite d8/d1f/d2a/d4a/d7d/ff3 [0,4194304] 0 2026-03-10T07:51:05.155 INFO:tasks.workunit.client.0.vm05.stdout:8/657: sync 2026-03-10T07:51:05.158 INFO:tasks.workunit.client.0.vm05.stdout:7/776: chown d1/d6/d3b/d7f/cc5 0 1 2026-03-10T07:51:05.166 
INFO:tasks.workunit.client.0.vm05.stdout:8/658: dread - d1/d6f/f85 zero size 2026-03-10T07:51:05.171 INFO:tasks.workunit.client.0.vm05.stdout:0/750: mkdir d8/dd/d37/d104 0 2026-03-10T07:51:05.176 INFO:tasks.workunit.client.0.vm05.stdout:1/750: dwrite da/d26/d9e/fa1 [0,4194304] 0 2026-03-10T07:51:05.176 INFO:tasks.workunit.client.0.vm05.stdout:8/659: dread - d1/dd/d4d/dcc/fbe zero size 2026-03-10T07:51:05.183 INFO:tasks.workunit.client.0.vm05.stdout:9/711: creat d8/d86/d28/d79/d57/de1/d22/d33/d62/d6d/de7/fed x:0 0 0 2026-03-10T07:51:05.187 INFO:tasks.workunit.client.0.vm05.stdout:6/777: rmdir d0/d11/d4f/d7d 39 2026-03-10T07:51:05.191 INFO:tasks.workunit.client.0.vm05.stdout:3/722: unlink d8/d1f/d2a/d4a/d7d/f7e 0 2026-03-10T07:51:05.196 INFO:tasks.workunit.client.0.vm05.stdout:6/778: read d0/d6/f1a [191315,76956] 0 2026-03-10T07:51:05.196 INFO:tasks.workunit.client.0.vm05.stdout:1/751: dwrite da/dd/fa5 [4194304,4194304] 0 2026-03-10T07:51:05.207 INFO:tasks.workunit.client.0.vm05.stdout:0/751: dread d8/d9c/dc8/fd4 [0,4194304] 0 2026-03-10T07:51:05.212 INFO:tasks.workunit.client.0.vm05.stdout:8/660: chown d1/dd/d18/d20/f30 10453489 1 2026-03-10T07:51:05.239 INFO:tasks.workunit.client.0.vm05.stdout:9/712: mkdir d8/d86/d28/d79/d57/de1/d1c/d20/dee 0 2026-03-10T07:51:05.240 INFO:tasks.workunit.client.0.vm05.stdout:1/752: dread da/dd/d2a/f54 [0,4194304] 0 2026-03-10T07:51:05.247 INFO:tasks.workunit.client.0.vm05.stdout:1/753: write da/d26/d2b/d89/fca [856317,870] 0 2026-03-10T07:51:05.247 INFO:tasks.workunit.client.0.vm05.stdout:1/754: chown da/dd/c1e 12129 1 2026-03-10T07:51:05.253 INFO:tasks.workunit.client.0.vm05.stdout:0/752: mknod d8/dd/d37/d104/c105 0 2026-03-10T07:51:05.266 INFO:tasks.workunit.client.0.vm05.stdout:7/777: getdents d1/d6 0 2026-03-10T07:51:05.273 INFO:tasks.workunit.client.0.vm05.stdout:5/778: dwrite d2/f1a [0,4194304] 0 2026-03-10T07:51:05.273 INFO:tasks.workunit.client.0.vm05.stdout:4/813: dwrite d0/d6/d60/f72 [0,4194304] 0 2026-03-10T07:51:05.274 
INFO:tasks.workunit.client.0.vm05.stdout:6/779: creat d0/d11/d4f/d7d/de8/ffa x:0 0 0 2026-03-10T07:51:05.277 INFO:tasks.workunit.client.0.vm05.stdout:6/780: dread - d0/d11/d57/ff4 zero size 2026-03-10T07:51:05.288 INFO:tasks.workunit.client.0.vm05.stdout:0/753: dread d8/dd/d10/d26/d8b/d70/fbe [0,4194304] 0 2026-03-10T07:51:05.289 INFO:tasks.workunit.client.0.vm05.stdout:0/754: chown d8/dd/d37/d67 16513 1 2026-03-10T07:51:05.311 INFO:tasks.workunit.client.0.vm05.stdout:8/661: creat d1/dd/fd5 x:0 0 0 2026-03-10T07:51:05.312 INFO:tasks.workunit.client.0.vm05.stdout:8/662: chown d1/dd/d18/f22 2 1 2026-03-10T07:51:05.323 INFO:tasks.workunit.client.0.vm05.stdout:1/755: mknod da/d26/d2b/daf/dbe/dc0/ceb 0 2026-03-10T07:51:05.323 INFO:tasks.workunit.client.0.vm05.stdout:3/723: write d8/d1f/d2a/d34/f39 [5177067,9001] 0 2026-03-10T07:51:05.334 INFO:tasks.workunit.client.0.vm05.stdout:4/814: symlink d0/d6/d60/l10d 0 2026-03-10T07:51:05.336 INFO:tasks.workunit.client.0.vm05.stdout:7/778: dread d1/d6/f77 [0,4194304] 0 2026-03-10T07:51:05.338 INFO:tasks.workunit.client.0.vm05.stdout:7/779: fsync d1/d6/d3b/d7f/fe8 0 2026-03-10T07:51:05.347 INFO:tasks.workunit.client.0.vm05.stdout:5/779: creat d2/d12/d2d/d4a/de7/f102 x:0 0 0 2026-03-10T07:51:05.350 INFO:tasks.workunit.client.0.vm05.stdout:6/781: unlink d0/d11/d2e/d81/d92/dc2/fdb 0 2026-03-10T07:51:05.351 INFO:tasks.workunit.client.0.vm05.stdout:6/782: dread - d0/d35/d36/db8/fcf zero size 2026-03-10T07:51:05.353 INFO:tasks.workunit.client.0.vm05.stdout:6/783: write d0/d35/d36/db8/feb [522293,108522] 0 2026-03-10T07:51:05.354 INFO:tasks.workunit.client.0.vm05.stdout:0/755: rmdir d8/dd/d10/d26/d8b/d7d 39 2026-03-10T07:51:05.361 INFO:tasks.workunit.client.0.vm05.stdout:8/663: creat d1/dd/d18/d20/d2a/d34/d49/d5d/fd6 x:0 0 0 2026-03-10T07:51:05.363 INFO:tasks.workunit.client.0.vm05.stdout:9/713: creat d8/d86/d28/d79/d57/de1/d22/d33/d62/fef x:0 0 0 2026-03-10T07:51:05.364 INFO:tasks.workunit.client.0.vm05.stdout:9/714: dread - 
d8/d86/d28/d79/d57/de1/d1c/d20/d59/d8b/fa1 zero size 2026-03-10T07:51:05.371 INFO:tasks.workunit.client.0.vm05.stdout:3/724: creat d8/d1f/d24/d8a/ff4 x:0 0 0 2026-03-10T07:51:05.371 INFO:tasks.workunit.client.0.vm05.stdout:1/756: dread da/d26/d2b/d89/fb1 [0,4194304] 0 2026-03-10T07:51:05.375 INFO:tasks.workunit.client.0.vm05.stdout:4/815: rmdir d0/d6/d9/d5a 39 2026-03-10T07:51:05.380 INFO:tasks.workunit.client.0.vm05.stdout:7/780: symlink d1/d3c/d71/d79/d8a/lf1 0 2026-03-10T07:51:05.387 INFO:tasks.workunit.client.0.vm05.stdout:3/725: dread d8/d1f/d24/d8a/fcb [0,4194304] 0 2026-03-10T07:51:05.389 INFO:tasks.workunit.client.0.vm05.stdout:6/784: mknod d0/d11/d4f/d7d/db7/cfb 0 2026-03-10T07:51:05.390 INFO:tasks.workunit.client.0.vm05.stdout:0/756: chown d8/c1e 0 1 2026-03-10T07:51:05.391 INFO:tasks.workunit.client.0.vm05.stdout:6/785: chown d0/c2f 1120197 1 2026-03-10T07:51:05.393 INFO:tasks.workunit.client.0.vm05.stdout:2/801: link d0/d8/d66/dd1/d49/df9/da5/da8/c105 d0/d8/d66/dd1/d49/db3/c106 0 2026-03-10T07:51:05.394 INFO:tasks.workunit.client.0.vm05.stdout:2/802: write d0/d8/d43/da4/ff1 [600004,96090] 0 2026-03-10T07:51:05.395 INFO:tasks.workunit.client.0.vm05.stdout:5/780: dwrite d2/d20/d4c/d64/f96 [0,4194304] 0 2026-03-10T07:51:05.414 INFO:tasks.workunit.client.0.vm05.stdout:1/757: creat da/d26/d2b/d71/fec x:0 0 0 2026-03-10T07:51:05.415 INFO:tasks.workunit.client.0.vm05.stdout:4/816: symlink d0/d6/d9/d12/d45/d55/d44/d85/l10e 0 2026-03-10T07:51:05.415 INFO:tasks.workunit.client.0.vm05.stdout:7/781: creat d1/d34/ff2 x:0 0 0 2026-03-10T07:51:05.420 INFO:tasks.workunit.client.0.vm05.stdout:0/757: creat d8/d9c/dc8/f106 x:0 0 0 2026-03-10T07:51:05.422 INFO:tasks.workunit.client.0.vm05.stdout:8/664: creat d1/d52/dd3/fd7 x:0 0 0 2026-03-10T07:51:05.422 INFO:tasks.workunit.client.0.vm05.stdout:0/758: write d8/dd/d10/d26/d3a/d5e/f71 [2520425,15464] 0 2026-03-10T07:51:05.426 INFO:tasks.workunit.client.0.vm05.stdout:2/803: mkdir d0/d8/d66/dd1/d49/db1/d107 0 
2026-03-10T07:51:05.428 INFO:tasks.workunit.client.0.vm05.stdout:9/715: truncate d8/d86/d28/d79/d57/de1/d38/f87 7587031 0 2026-03-10T07:51:05.432 INFO:tasks.workunit.client.0.vm05.stdout:3/726: getdents d8/d1f/d24/d76/dc5/de1/d19/daf 0 2026-03-10T07:51:05.433 INFO:tasks.workunit.client.0.vm05.stdout:1/758: read da/dd/d2a/f75 [292677,67561] 0 2026-03-10T07:51:05.441 INFO:tasks.workunit.client.0.vm05.stdout:8/665: rmdir d1/d6f 39 2026-03-10T07:51:05.449 INFO:tasks.workunit.client.0.vm05.stdout:7/782: unlink d1/d6/d47/l7e 0 2026-03-10T07:51:05.451 INFO:tasks.workunit.client.0.vm05.stdout:3/727: mknod d8/d1c/d48/d69/cf5 0 2026-03-10T07:51:05.452 INFO:tasks.workunit.client.0.vm05.stdout:0/759: dread d8/dd/d37/d56/d4d/fd7 [0,4194304] 0 2026-03-10T07:51:05.468 INFO:tasks.workunit.client.0.vm05.stdout:1/759: mknod da/dd/d2a/d55/d64/dc2/ced 0 2026-03-10T07:51:05.472 INFO:tasks.workunit.client.0.vm05.stdout:6/786: link d0/d11/d86/cf2 d0/d11/d4f/d56/cfc 0 2026-03-10T07:51:05.473 INFO:tasks.workunit.client.0.vm05.stdout:8/666: dread - d1/d52/f94 zero size 2026-03-10T07:51:05.475 INFO:tasks.workunit.client.0.vm05.stdout:5/781: rename d2/d20/d33/f45 to d2/d20/d4c/f103 0 2026-03-10T07:51:05.475 INFO:tasks.workunit.client.0.vm05.stdout:2/804: sync 2026-03-10T07:51:05.479 INFO:tasks.workunit.client.0.vm05.stdout:9/716: dread d8/f9 [4194304,4194304] 0 2026-03-10T07:51:05.479 INFO:tasks.workunit.client.0.vm05.stdout:3/728: chown d8/d1f/d24/d76/dc5/de1/d52/d7b/fc3 25284 1 2026-03-10T07:51:05.553 INFO:tasks.workunit.client.0.vm05.stdout:6/787: truncate d0/f15 4550056 0 2026-03-10T07:51:05.560 INFO:tasks.workunit.client.0.vm05.stdout:7/783: mknod d1/d6/d3b/cf3 0 2026-03-10T07:51:05.566 INFO:tasks.workunit.client.0.vm05.stdout:4/817: truncate d0/d6/d9/d12/d9c/db7/da7/f53 1823784 0 2026-03-10T07:51:05.569 INFO:tasks.workunit.client.0.vm05.stdout:2/805: dread d0/f94 [0,4194304] 0 2026-03-10T07:51:05.572 INFO:tasks.workunit.client.0.vm05.stdout:8/667: truncate d1/dd/d18/d20/d2a/d34/da5/fa8 
795711 0 2026-03-10T07:51:05.576 INFO:tasks.workunit.client.0.vm05.stdout:9/717: creat d8/d86/d28/d79/d57/de1/d38/ff0 x:0 0 0 2026-03-10T07:51:05.601 INFO:tasks.workunit.client.0.vm05.stdout:6/788: unlink d0/d11/d2e/d81/d92/cee 0 2026-03-10T07:51:05.616 INFO:tasks.workunit.client.0.vm05.stdout:2/806: dread d0/d8/f1c [4194304,4194304] 0 2026-03-10T07:51:05.624 INFO:tasks.workunit.client.0.vm05.stdout:7/784: unlink d1/d3c/d4b/f4f 0 2026-03-10T07:51:05.634 INFO:tasks.workunit.client.0.vm05.stdout:4/818: creat d0/d6/d6f/f10f x:0 0 0 2026-03-10T07:51:05.634 INFO:tasks.workunit.client.0.vm05.stdout:4/819: chown d0/d6/d6f/f10f 3453098 1 2026-03-10T07:51:05.649 INFO:tasks.workunit.client.0.vm05.stdout:4/820: dread d0/d6/d95/ff7 [0,4194304] 0 2026-03-10T07:51:05.657 INFO:tasks.workunit.client.0.vm05.stdout:8/668: rmdir d1/dd/d18/d20/d2a/d9a 39 2026-03-10T07:51:05.661 INFO:tasks.workunit.client.0.vm05.stdout:9/718: fsync d8/d86/d28/d79/d57/de1/d22/d33/d62/f6c 0 2026-03-10T07:51:05.664 INFO:tasks.workunit.client.0.vm05.stdout:1/760: link da/d26/d2b/c5e da/dd/d2a/d55/d64/dd1/cee 0 2026-03-10T07:51:05.687 INFO:tasks.workunit.client.0.vm05.stdout:6/789: symlink d0/d11/d22/d69/lfd 0 2026-03-10T07:51:05.702 INFO:tasks.workunit.client.0.vm05.stdout:4/821: fsync d0/fa3 0 2026-03-10T07:51:05.724 INFO:tasks.workunit.client.0.vm05.stdout:9/719: mknod d8/d86/d28/d79/d57/de1/d1c/d20/dd3/d63/cf1 0 2026-03-10T07:51:05.732 INFO:tasks.workunit.client.0.vm05.stdout:6/790: read d0/d11/d4f/d56/d96/db6/fef [3807865,2612] 0 2026-03-10T07:51:05.753 INFO:tasks.workunit.client.0.vm05.stdout:7/785: link d1/d6/f1d d1/d34/d59/d60/d8c/ff4 0 2026-03-10T07:51:05.754 INFO:tasks.workunit.client.0.vm05.stdout:9/720: dread d8/f5e [0,4194304] 0 2026-03-10T07:51:05.766 INFO:tasks.workunit.client.0.vm05.stdout:8/669: creat d1/dd/d18/d20/d2a/fd8 x:0 0 0 2026-03-10T07:51:05.783 INFO:tasks.workunit.client.0.vm05.stdout:6/791: truncate d0/d35/d36/d43/ff7 1515553 0 2026-03-10T07:51:05.802 
INFO:tasks.workunit.client.0.vm05.stdout:5/782: link d2/d20/d33/d86/dac/dc1/ld9 d2/d12/d2d/l104 0 2026-03-10T07:51:05.812 INFO:tasks.workunit.client.0.vm05.stdout:9/721: dread d8/fa [0,4194304] 0 2026-03-10T07:51:05.814 INFO:tasks.workunit.client.0.vm05.stdout:6/792: creat d0/d35/ffe x:0 0 0 2026-03-10T07:51:05.814 INFO:tasks.workunit.client.0.vm05.stdout:6/793: readlink d0/l87 0 2026-03-10T07:51:05.816 INFO:tasks.workunit.client.0.vm05.stdout:4/822: getdents d0/d6/d9/d8c/dbe 0 2026-03-10T07:51:05.822 INFO:tasks.workunit.client.0.vm05.stdout:6/794: dwrite d0/d11/d22/d6c/d84/fd4 [0,4194304] 0 2026-03-10T07:51:05.839 INFO:tasks.workunit.client.0.vm05.stdout:6/795: mknod d0/d11/d4f/d7d/db7/cff 0 2026-03-10T07:51:05.840 INFO:tasks.workunit.client.0.vm05.stdout:0/760: dwrite d8/dd/f3c [0,4194304] 0 2026-03-10T07:51:05.841 INFO:tasks.workunit.client.0.vm05.stdout:0/761: write d8/dd/d37/d56/d4d/df8/ffc [570030,42547] 0 2026-03-10T07:51:05.842 INFO:tasks.workunit.client.0.vm05.stdout:4/823: dwrite d0/d6/d9/d5a/d6e/db6/db9/fd9 [0,4194304] 0 2026-03-10T07:51:05.843 INFO:tasks.workunit.client.0.vm05.stdout:8/670: dread d1/dd/d18/d20/f43 [0,4194304] 0 2026-03-10T07:51:05.851 INFO:tasks.workunit.client.0.vm05.stdout:5/783: symlink d2/d20/l105 0 2026-03-10T07:51:05.866 INFO:tasks.workunit.client.0.vm05.stdout:8/671: chown d1/dd/d18/d20/d2a/d34/c78 5517 1 2026-03-10T07:51:05.868 INFO:tasks.workunit.client.0.vm05.stdout:6/796: mknod d0/d11/d4f/d7d/db7/c100 0 2026-03-10T07:51:05.877 INFO:tasks.workunit.client.0.vm05.stdout:4/824: creat d0/d6/d9/d12/d69/dc7/f110 x:0 0 0 2026-03-10T07:51:05.882 INFO:tasks.workunit.client.0.vm05.stdout:5/784: creat d2/d20/d77/f106 x:0 0 0 2026-03-10T07:51:05.894 INFO:tasks.workunit.client.0.vm05.stdout:4/825: symlink d0/d6/d9/d8c/dbe/l111 0 2026-03-10T07:51:05.909 INFO:tasks.workunit.client.0.vm05.stdout:8/672: unlink d1/dd/d4d/d64/d6a/c6e 0 2026-03-10T07:51:05.965 INFO:tasks.workunit.client.0.vm05.stdout:6/797: sync 2026-03-10T07:51:05.965 
INFO:tasks.workunit.client.0.vm05.stdout:4/826: sync 2026-03-10T07:51:06.031 INFO:tasks.workunit.client.0.vm05.stdout:3/729: rename d8/d1c/d64/cb0 to d8/d1f/d24/d76/cf6 0 2026-03-10T07:51:06.031 INFO:tasks.workunit.client.0.vm05.stdout:0/762: rename d8/dd/d37 to d8/dd/d37/dfd/d107 22 2026-03-10T07:51:06.032 INFO:tasks.workunit.client.0.vm05.stdout:0/763: chown d8/dd/d10/d26/d8b 55996 1 2026-03-10T07:51:06.048 INFO:tasks.workunit.client.0.vm05.stdout:0/764: dread d8/dd/d10/d26/d3a/d5e/d63/fe6 [0,4194304] 0 2026-03-10T07:51:06.051 INFO:tasks.workunit.client.0.vm05.stdout:7/786: creat d1/d3c/d71/d79/ff5 x:0 0 0 2026-03-10T07:51:06.052 INFO:tasks.workunit.client.0.vm05.stdout:8/673: rename d1/dd/d18/d20/l3e to d1/dd/d4d/dcc/ld9 0 2026-03-10T07:51:06.055 INFO:tasks.workunit.client.0.vm05.stdout:3/730: mkdir d8/d22/d60/df7 0 2026-03-10T07:51:06.077 INFO:tasks.workunit.client.0.vm05.stdout:3/731: dread d8/d1f/f2f [0,4194304] 0 2026-03-10T07:51:06.097 INFO:tasks.workunit.client.0.vm05.stdout:7/787: rmdir d1/d34/d59/d60/d8c 39 2026-03-10T07:51:06.106 INFO:tasks.workunit.client.0.vm05.stdout:7/788: dwrite d1/d34/f7d [0,4194304] 0 2026-03-10T07:51:06.111 INFO:tasks.workunit.client.0.vm05.stdout:7/789: truncate d1/d34/ff2 496164 0 2026-03-10T07:51:06.115 INFO:tasks.workunit.client.0.vm05.stdout:8/674: dread d1/dd/d18/d20/d2a/d48/f57 [0,4194304] 0 2026-03-10T07:51:06.117 INFO:tasks.workunit.client.0.vm05.stdout:4/827: symlink d0/d6/l112 0 2026-03-10T07:51:06.131 INFO:tasks.workunit.client.0.vm05.stdout:2/807: rmdir d0/d8/d66/dd1/d49/db3 39 2026-03-10T07:51:06.135 INFO:tasks.workunit.client.0.vm05.stdout:1/761: write da/dd/d2a/d70/f9c [483862,101927] 0 2026-03-10T07:51:06.147 INFO:tasks.workunit.client.0.vm05.stdout:3/732: dread d8/d1f/d24/d76/dc5/de1/d19/d37/f43 [0,4194304] 0 2026-03-10T07:51:06.151 INFO:tasks.workunit.client.0.vm05.stdout:3/733: read d8/d22/d60/f61 [3457752,104831] 0 2026-03-10T07:51:06.156 INFO:tasks.workunit.client.0.vm05.stdout:4/828: creat 
d0/d6/d9/d12/d4f/f113 x:0 0 0 2026-03-10T07:51:06.157 INFO:tasks.workunit.client.0.vm05.stdout:4/829: chown d0/d6/d37 401 1 2026-03-10T07:51:06.160 INFO:tasks.workunit.client.0.vm05.stdout:9/722: truncate d8/d86/d28/d79/d57/de1/d38/d71/fb5 2566178 0 2026-03-10T07:51:06.162 INFO:tasks.workunit.client.0.vm05.stdout:9/723: stat d8/d86/d28/d79/d57/de1/d22/l4e 0 2026-03-10T07:51:06.165 INFO:tasks.workunit.client.0.vm05.stdout:8/675: mknod d1/dd/d18/d20/d2a/d48/d5a/cda 0 2026-03-10T07:51:06.169 INFO:tasks.workunit.client.0.vm05.stdout:2/808: creat d0/d8/d66/dd1/d49/d81/dd5/f108 x:0 0 0 2026-03-10T07:51:06.172 INFO:tasks.workunit.client.0.vm05.stdout:3/734: dread d8/d1f/d24/d76/dc5/de1/d52/fde [0,4194304] 0 2026-03-10T07:51:06.175 INFO:tasks.workunit.client.0.vm05.stdout:1/762: mkdir da/dd/d12/d34/ddb/def 0 2026-03-10T07:51:06.189 INFO:tasks.workunit.client.0.vm05.stdout:4/830: dread d0/d6/da6/fba [0,4194304] 0 2026-03-10T07:51:06.208 INFO:tasks.workunit.client.0.vm05.stdout:3/735: creat d8/d22/dad/ff8 x:0 0 0 2026-03-10T07:51:06.208 INFO:tasks.workunit.client.0.vm05.stdout:3/736: fsync d8/d22/d60/fc6 0 2026-03-10T07:51:06.214 INFO:tasks.workunit.client.0.vm05.stdout:6/798: write d0/d11/d57/f5f [783400,48612] 0 2026-03-10T07:51:06.218 INFO:tasks.workunit.client.0.vm05.stdout:0/765: unlink d8/c60 0 2026-03-10T07:51:06.226 INFO:tasks.workunit.client.0.vm05.stdout:2/809: truncate d0/f5 163120 0 2026-03-10T07:51:06.234 INFO:tasks.workunit.client.0.vm05.stdout:1/763: unlink da/dd/d2a/d55/d64/dd1/cee 0 2026-03-10T07:51:06.235 INFO:tasks.workunit.client.0.vm05.stdout:5/785: creat d2/f107 x:0 0 0 2026-03-10T07:51:06.241 INFO:tasks.workunit.client.0.vm05.stdout:6/799: dread d0/d11/d57/d60/f74 [0,4194304] 0 2026-03-10T07:51:06.241 INFO:tasks.workunit.client.0.vm05.stdout:3/737: mkdir d8/d22/dad/df9 0 2026-03-10T07:51:06.242 INFO:tasks.workunit.client.0.vm05.stdout:4/831: getdents d0/d6/d9/d5a/d6e/dd1/d108 0 2026-03-10T07:51:06.244 INFO:tasks.workunit.client.0.vm05.stdout:0/766: 
rmdir d8/dd/d10/d26/d48 39 2026-03-10T07:51:06.247 INFO:tasks.workunit.client.0.vm05.stdout:2/810: unlink d0/f94 0 2026-03-10T07:51:06.248 INFO:tasks.workunit.client.0.vm05.stdout:7/790: write d1/d6/f2e [3042099,58039] 0 2026-03-10T07:51:06.257 INFO:tasks.workunit.client.0.vm05.stdout:5/786: mknod d2/d20/d33/d86/dac/dc1/c108 0 2026-03-10T07:51:06.260 INFO:tasks.workunit.client.0.vm05.stdout:4/832: truncate d0/d6/d9/d5a/f82 337924 0 2026-03-10T07:51:06.266 INFO:tasks.workunit.client.0.vm05.stdout:7/791: dwrite d1/d6/d3b/fe9 [0,4194304] 0 2026-03-10T07:51:06.275 INFO:tasks.workunit.client.0.vm05.stdout:8/676: dwrite d1/dd/d18/d20/d2a/d48/d5a/f98 [0,4194304] 0 2026-03-10T07:51:06.296 INFO:tasks.workunit.client.0.vm05.stdout:6/800: mkdir d0/d35/d36/dd2/d101 0 2026-03-10T07:51:06.297 INFO:tasks.workunit.client.0.vm05.stdout:5/787: mkdir d2/d20/d33/d86/dac/dc1/d109 0 2026-03-10T07:51:06.298 INFO:tasks.workunit.client.0.vm05.stdout:9/724: write d8/d86/d28/d79/d57/de1/d38/d71/fb5 [1638908,93536] 0 2026-03-10T07:51:06.311 INFO:tasks.workunit.client.0.vm05.stdout:5/788: read d2/f15 [1438330,126286] 0 2026-03-10T07:51:06.313 INFO:tasks.workunit.client.0.vm05.stdout:2/811: link d0/d8/d43/d38/fc4 d0/d8/d66/dd1/d49/df9/da5/f109 0 2026-03-10T07:51:06.313 INFO:tasks.workunit.client.0.vm05.stdout:4/833: symlink d0/d6/d9/d5a/d6e/db6/db9/l114 0 2026-03-10T07:51:06.313 INFO:tasks.workunit.client.0.vm05.stdout:5/789: stat d2/d20/d5b/ccc 0 2026-03-10T07:51:06.314 INFO:tasks.workunit.client.0.vm05.stdout:1/764: rename da/d26/d2b/d89/lb8 to da/dd/d42/lf0 0 2026-03-10T07:51:06.314 INFO:tasks.workunit.client.0.vm05.stdout:5/790: stat d2/d20/d33/fc7 0 2026-03-10T07:51:06.320 INFO:tasks.workunit.client.0.vm05.stdout:0/767: creat d8/dd/d10/f108 x:0 0 0 2026-03-10T07:51:06.326 INFO:tasks.workunit.client.0.vm05.stdout:9/725: creat d8/d86/d95/ff2 x:0 0 0 2026-03-10T07:51:06.326 INFO:tasks.workunit.client.0.vm05.stdout:9/726: read d8/f8a [452945,30331] 0 2026-03-10T07:51:06.332 
INFO:tasks.workunit.client.0.vm05.stdout:3/738: link d8/d1f/d24/d76/dc5/de1/d19/d37/l9a d8/d1f/d2a/d34/dd2/lfa 0 2026-03-10T07:51:06.333 INFO:tasks.workunit.client.0.vm05.stdout:3/739: chown d8/d1f/d24/d76/dc5/de1/lc2 32 1 2026-03-10T07:51:06.336 INFO:tasks.workunit.client.0.vm05.stdout:2/812: rmdir d0/d8/d3d 39 2026-03-10T07:51:06.336 INFO:tasks.workunit.client.0.vm05.stdout:2/813: fdatasync d0/d8/d66/dd1/ff7 0 2026-03-10T07:51:06.352 INFO:tasks.workunit.client.0.vm05.stdout:7/792: dwrite d1/d3c/d71/fd2 [0,4194304] 0 2026-03-10T07:51:06.355 INFO:tasks.workunit.client.0.vm05.stdout:6/801: dwrite d0/d11/d2e/f30 [0,4194304] 0 2026-03-10T07:51:06.363 INFO:tasks.workunit.client.0.vm05.stdout:8/677: rename d1/dd/d18/d20/d2a/d34/da5/db2 to d1/d52/dd3/ddb 0 2026-03-10T07:51:06.369 INFO:tasks.workunit.client.0.vm05.stdout:7/793: dread d1/d6/f31 [0,4194304] 0 2026-03-10T07:51:06.369 INFO:tasks.workunit.client.0.vm05.stdout:5/791: write d2/d20/d4c/fb1 [3959412,36876] 0 2026-03-10T07:51:06.370 INFO:tasks.workunit.client.0.vm05.stdout:7/794: chown d1/d34/f7d 7681 1 2026-03-10T07:51:06.371 INFO:tasks.workunit.client.0.vm05.stdout:7/795: write d1/d3c/f89 [3023675,73597] 0 2026-03-10T07:51:06.376 INFO:tasks.workunit.client.0.vm05.stdout:4/834: truncate d0/d6/d9/d8c/fc1 3870510 0 2026-03-10T07:51:06.392 INFO:tasks.workunit.client.0.vm05.stdout:0/768: creat d8/dd/d10/d26/d8b/da4/de7/f109 x:0 0 0 2026-03-10T07:51:06.392 INFO:tasks.workunit.client.0.vm05.stdout:9/727: mknod d8/d86/d28/d79/d57/de1/d6b/cf3 0 2026-03-10T07:51:06.392 INFO:tasks.workunit.client.0.vm05.stdout:3/740: fsync d8/d1f/d24/d76/dc5/de1/d19/f81 0 2026-03-10T07:51:06.392 INFO:tasks.workunit.client.0.vm05.stdout:6/802: symlink d0/d35/d36/db8/l102 0 2026-03-10T07:51:06.392 INFO:tasks.workunit.client.0.vm05.stdout:8/678: symlink d1/dd/d18/d20/d2a/d48/d7c/ldc 0 2026-03-10T07:51:06.398 INFO:tasks.workunit.client.0.vm05.stdout:7/796: unlink d1/d6/d80/d82/fa9 0 2026-03-10T07:51:06.399 
INFO:tasks.workunit.client.0.vm05.stdout:4/835: symlink d0/d28/l115 0 2026-03-10T07:51:06.400 INFO:tasks.workunit.client.0.vm05.stdout:0/769: creat d8/dd/d10/db7/f10a x:0 0 0 2026-03-10T07:51:06.401 INFO:tasks.workunit.client.0.vm05.stdout:0/770: chown d8/dd/d37/c84 411 1 2026-03-10T07:51:06.401 INFO:tasks.workunit.client.0.vm05.stdout:9/728: rename d8/d86/d28/d79/d57/lb7 to d8/d86/d28/d79/d57/de1/d1c/d20/lf4 0 2026-03-10T07:51:06.404 INFO:tasks.workunit.client.0.vm05.stdout:0/771: chown d8/dd/d10/d26/d8b/d70/cf6 0 1 2026-03-10T07:51:06.405 INFO:tasks.workunit.client.0.vm05.stdout:2/814: rmdir d0/d8/d43/df/d4d 39 2026-03-10T07:51:06.407 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:51:06 vm05.local ceph-mon[50387]: pgmap v22: 65 pgs: 65 active+clean; 1.6 GiB data, 5.4 GiB used, 115 GiB / 120 GiB avail; 57 MiB/s rd, 133 MiB/s wr, 350 op/s 2026-03-10T07:51:06.409 INFO:tasks.workunit.client.0.vm05.stdout:8/679: symlink d1/d45/d90/ldd 0 2026-03-10T07:51:06.409 INFO:tasks.workunit.client.0.vm05.stdout:5/792: symlink d2/d12/l10a 0 2026-03-10T07:51:06.410 INFO:tasks.workunit.client.0.vm05.stdout:4/836: truncate d0/d6/d9/d5a/d6e/db6/db9/fbc 665912 0 2026-03-10T07:51:06.412 INFO:tasks.workunit.client.0.vm05.stdout:9/729: creat d8/d86/d28/d79/d57/ff5 x:0 0 0 2026-03-10T07:51:06.415 INFO:tasks.workunit.client.0.vm05.stdout:2/815: mkdir d0/d8/d43/df/d4e/d10a 0 2026-03-10T07:51:06.418 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:51:06 vm08.local ceph-mon[59917]: pgmap v22: 65 pgs: 65 active+clean; 1.6 GiB data, 5.4 GiB used, 115 GiB / 120 GiB avail; 57 MiB/s rd, 133 MiB/s wr, 350 op/s 2026-03-10T07:51:06.423 INFO:tasks.workunit.client.0.vm05.stdout:8/680: unlink d1/dd/d18/d20/d2a/d48/d7c/d9c/da4/fa2 0 2026-03-10T07:51:06.429 INFO:tasks.workunit.client.0.vm05.stdout:3/741: mkdir d8/dd5/dfb 0 2026-03-10T07:51:06.431 INFO:tasks.workunit.client.0.vm05.stdout:6/803: creat d0/d11/f103 x:0 0 0 2026-03-10T07:51:06.434 INFO:tasks.workunit.client.0.vm05.stdout:3/742: creat 
d8/d1f/ffc x:0 0 0 2026-03-10T07:51:06.434 INFO:tasks.workunit.client.0.vm05.stdout:4/837: truncate d0/d6/d9/f4d 3247390 0 2026-03-10T07:51:06.435 INFO:tasks.workunit.client.0.vm05.stdout:2/816: link d0/d8/d66/dd1/d49/fde d0/d8/d43/df/d4e/d10a/f10b 0 2026-03-10T07:51:06.436 INFO:tasks.workunit.client.0.vm05.stdout:6/804: mknod d0/d11/d22/d6c/d84/c104 0 2026-03-10T07:51:06.438 INFO:tasks.workunit.client.0.vm05.stdout:8/681: truncate d1/d6f/f7d 364246 0 2026-03-10T07:51:06.442 INFO:tasks.workunit.client.0.vm05.stdout:6/805: rename d0/d11/d31/dbf/fed to d0/d11/d2e/f105 0 2026-03-10T07:51:06.452 INFO:tasks.workunit.client.0.vm05.stdout:8/682: mknod d1/dd/d18/d20/d2a/d48/d7c/d9c/cde 0 2026-03-10T07:51:06.452 INFO:tasks.workunit.client.0.vm05.stdout:2/817: mkdir d0/d8/d43/df/d8b/dbf/de5/d10c 0 2026-03-10T07:51:06.452 INFO:tasks.workunit.client.0.vm05.stdout:2/818: stat d0/d8/d43/df/d8b/dbf/de5 0 2026-03-10T07:51:06.452 INFO:tasks.workunit.client.0.vm05.stdout:8/683: rmdir d1 39 2026-03-10T07:51:06.455 INFO:tasks.workunit.client.0.vm05.stdout:8/684: creat d1/d52/dd3/fdf x:0 0 0 2026-03-10T07:51:06.455 INFO:tasks.workunit.client.0.vm05.stdout:2/819: link d0/d2a/cd0 d0/d8/dc6/c10d 0 2026-03-10T07:51:06.460 INFO:tasks.workunit.client.0.vm05.stdout:8/685: fdatasync d1/dd/d18/d20/d2a/d34/d49/fc7 0 2026-03-10T07:51:06.462 INFO:tasks.workunit.client.0.vm05.stdout:2/820: unlink d0/d8/d66/dd1/d49/dab/lcf 0 2026-03-10T07:51:06.463 INFO:tasks.workunit.client.0.vm05.stdout:8/686: unlink d1/dd/d18/d20/d2a/c7f 0 2026-03-10T07:51:06.472 INFO:tasks.workunit.client.0.vm05.stdout:2/821: truncate d0/d8/d66/dd1/d49/dab/fe9 4263727 0 2026-03-10T07:51:06.472 INFO:tasks.workunit.client.0.vm05.stdout:2/822: stat d0/d8/d43/df/d8b/dbf/de5 0 2026-03-10T07:51:06.472 INFO:tasks.workunit.client.0.vm05.stdout:2/823: chown d0/d8/d3d/fdd 133 1 2026-03-10T07:51:06.472 INFO:tasks.workunit.client.0.vm05.stdout:2/824: truncate d0/d8/f2d 714429 0 2026-03-10T07:51:06.472 
INFO:tasks.workunit.client.0.vm05.stdout:2/825: readlink d0/d8/d43/dc9/lcc 0 2026-03-10T07:51:06.479 INFO:tasks.workunit.client.0.vm05.stdout:2/826: write d0/d8/d66/dd1/d49/d81/dd5/fe2 [1108135,31434] 0 2026-03-10T07:51:06.481 INFO:tasks.workunit.client.0.vm05.stdout:1/765: write da/dd/d2a/f75 [1698183,87942] 0 2026-03-10T07:51:06.482 INFO:tasks.workunit.client.0.vm05.stdout:7/797: sync 2026-03-10T07:51:06.483 INFO:tasks.workunit.client.0.vm05.stdout:0/772: sync 2026-03-10T07:51:06.483 INFO:tasks.workunit.client.0.vm05.stdout:7/798: readlink d1/d34/d59/d60/ldd 0 2026-03-10T07:51:06.489 INFO:tasks.workunit.client.0.vm05.stdout:8/687: sync 2026-03-10T07:51:06.489 INFO:tasks.workunit.client.0.vm05.stdout:1/766: dread da/dd/d2a/d55/d64/f81 [4194304,4194304] 0 2026-03-10T07:51:06.494 INFO:tasks.workunit.client.0.vm05.stdout:8/688: rename d1/dd/d18/d20 to d1/dd/d18/d20/d2a/d48/d7c/d9c/da4/de0 22 2026-03-10T07:51:06.497 INFO:tasks.workunit.client.0.vm05.stdout:1/767: dread da/fe5 [0,4194304] 0 2026-03-10T07:51:06.498 INFO:tasks.workunit.client.0.vm05.stdout:2/827: fsync d0/d8/d66/dd1/fda 0 2026-03-10T07:51:06.512 INFO:tasks.workunit.client.0.vm05.stdout:7/799: creat d1/d3c/d4b/ff6 x:0 0 0 2026-03-10T07:51:06.520 INFO:tasks.workunit.client.0.vm05.stdout:1/768: rmdir da/d26/d2b/d71 39 2026-03-10T07:51:06.523 INFO:tasks.workunit.client.0.vm05.stdout:1/769: stat da/dd/d12/d34/ddb 0 2026-03-10T07:51:06.526 INFO:tasks.workunit.client.0.vm05.stdout:0/773: rmdir d8/dd/d10/d26/d48 39 2026-03-10T07:51:06.531 INFO:tasks.workunit.client.0.vm05.stdout:0/774: read d8/dd/d10/d26/d8b/da4/f5b [2639497,8333] 0 2026-03-10T07:51:06.540 INFO:tasks.workunit.client.0.vm05.stdout:7/800: dread d1/d34/f7a [0,4194304] 0 2026-03-10T07:51:06.563 INFO:tasks.workunit.client.0.vm05.stdout:1/770: rename da/d26/d2b/daf/dbe/cc1 to da/d26/d2b/dcb/cf1 0 2026-03-10T07:51:06.565 INFO:tasks.workunit.client.0.vm05.stdout:1/771: chown da/dd/d2a/d55/d68/l6b 60152019 1 2026-03-10T07:51:06.566 
INFO:tasks.workunit.client.0.vm05.stdout:1/772: chown da/dd/c56 4023986 1 2026-03-10T07:51:06.596 INFO:tasks.workunit.client.0.vm05.stdout:1/773: rename da/d26/d2b/d89/la0 to da/d26/d2b/daf/lf2 0 2026-03-10T07:51:06.596 INFO:tasks.workunit.client.0.vm05.stdout:5/793: write d2/d12/f2b [1244543,118804] 0 2026-03-10T07:51:06.601 INFO:tasks.workunit.client.0.vm05.stdout:1/774: write da/dd/d2a/d55/d64/f81 [2026974,79466] 0 2026-03-10T07:51:06.602 INFO:tasks.workunit.client.0.vm05.stdout:7/801: link d1/d34/d59/d60/d8c/ff4 d1/d5b/ff7 0 2026-03-10T07:51:06.605 INFO:tasks.workunit.client.0.vm05.stdout:7/802: chown d1/d3c/c54 93775 1 2026-03-10T07:51:06.605 INFO:tasks.workunit.client.0.vm05.stdout:1/775: readlink da/dd/d12/d86/lba 0 2026-03-10T07:51:06.612 INFO:tasks.workunit.client.0.vm05.stdout:9/730: dwrite d8/d86/d28/d79/d57/de1/d6b/f97 [0,4194304] 0 2026-03-10T07:51:06.622 INFO:tasks.workunit.client.0.vm05.stdout:1/776: dread da/fc [0,4194304] 0 2026-03-10T07:51:06.623 INFO:tasks.workunit.client.0.vm05.stdout:5/794: link d2/d20/d5b/f6e d2/d20/d5b/f10b 0 2026-03-10T07:51:06.624 INFO:tasks.workunit.client.0.vm05.stdout:5/795: read - d2/d20/d33/d86/ff1 zero size 2026-03-10T07:51:06.625 INFO:tasks.workunit.client.0.vm05.stdout:5/796: fdatasync d2/f42 0 2026-03-10T07:51:06.641 INFO:tasks.workunit.client.0.vm05.stdout:5/797: fdatasync d2/d12/d4d/f84 0 2026-03-10T07:51:06.645 INFO:tasks.workunit.client.0.vm05.stdout:5/798: stat d2/d12/d4d/cfd 0 2026-03-10T07:51:06.646 INFO:tasks.workunit.client.0.vm05.stdout:9/731: rename d8/d86/d28/d79/d57/de1/d1c/d20/dd3/d63/f64 to d8/d86/ff6 0 2026-03-10T07:51:06.649 INFO:tasks.workunit.client.0.vm05.stdout:6/806: write d0/f26 [1415729,32425] 0 2026-03-10T07:51:06.652 INFO:tasks.workunit.client.0.vm05.stdout:3/743: dwrite d8/d1f/d24/d76/dc5/de1/d19/d6b/fe7 [4194304,4194304] 0 2026-03-10T07:51:06.652 INFO:tasks.workunit.client.0.vm05.stdout:4/838: dwrite d0/d6/d95/f3a [0,4194304] 0 2026-03-10T07:51:06.659 
INFO:tasks.workunit.client.0.vm05.stdout:9/732: write d8/d86/d28/d79/d57/de1/d1c/d20/f32 [387673,128547] 0 2026-03-10T07:51:06.675 INFO:tasks.workunit.client.0.vm05.stdout:3/744: dread - d8/d1f/d24/d76/dc5/de1/d19/d37/f7c zero size 2026-03-10T07:51:06.676 INFO:tasks.workunit.client.0.vm05.stdout:5/799: dread d2/d20/d33/d53/d7d/f82 [0,4194304] 0 2026-03-10T07:51:06.705 INFO:tasks.workunit.client.0.vm05.stdout:6/807: readlink d0/d11/d22/d6c/lf9 0 2026-03-10T07:51:06.707 INFO:tasks.workunit.client.0.vm05.stdout:8/689: write d1/dd/d18/f21 [3779189,20328] 0 2026-03-10T07:51:06.717 INFO:tasks.workunit.client.0.vm05.stdout:2/828: dwrite d0/d8/d66/dd1/d49/fb8 [0,4194304] 0 2026-03-10T07:51:06.742 INFO:tasks.workunit.client.0.vm05.stdout:0/775: dwrite d8/dd/d10/d26/d2a/fc7 [0,4194304] 0 2026-03-10T07:51:06.747 INFO:tasks.workunit.client.0.vm05.stdout:9/733: sync 2026-03-10T07:51:06.760 INFO:tasks.workunit.client.0.vm05.stdout:3/745: symlink d8/d1f/d2a/d34/dbd/lfd 0 2026-03-10T07:51:06.763 INFO:tasks.workunit.client.0.vm05.stdout:5/800: rename d2/d5/c4e to d2/d12/da8/ddd/de9/c10c 0 2026-03-10T07:51:06.771 INFO:tasks.workunit.client.0.vm05.stdout:8/690: dread d1/dd/d18/d20/d2a/f3a [0,4194304] 0 2026-03-10T07:51:06.782 INFO:tasks.workunit.client.0.vm05.stdout:7/803: dwrite d1/d3c/d4b/fb3 [0,4194304] 0 2026-03-10T07:51:06.784 INFO:tasks.workunit.client.0.vm05.stdout:0/776: fsync d8/dd/d10/fd9 0 2026-03-10T07:51:06.788 INFO:tasks.workunit.client.0.vm05.stdout:0/777: fdatasync d8/dd/d10/d26/d3a/d5e/ffb 0 2026-03-10T07:51:06.794 INFO:tasks.workunit.client.0.vm05.stdout:7/804: truncate d1/d3c/d71/d79/d8a/fce 1164564 0 2026-03-10T07:51:06.800 INFO:tasks.workunit.client.0.vm05.stdout:1/777: dwrite da/dd/d42/d80/f94 [0,4194304] 0 2026-03-10T07:51:06.800 INFO:tasks.workunit.client.0.vm05.stdout:3/746: creat d8/d22/d60/d6e/ffe x:0 0 0 2026-03-10T07:51:06.800 INFO:tasks.workunit.client.0.vm05.stdout:8/691: read - d1/dd/d4d/d64/d6a/f83 zero size 2026-03-10T07:51:06.800 
INFO:tasks.workunit.client.0.vm05.stdout:8/692: chown d1/d52/f94 4635 1 2026-03-10T07:51:06.803 INFO:tasks.workunit.client.0.vm05.stdout:0/778: mkdir d8/dd/d10/db7/dc3/d10b 0 2026-03-10T07:51:06.803 INFO:tasks.workunit.client.0.vm05.stdout:0/779: chown d8/dd/d10/f6c 13352650 1 2026-03-10T07:51:06.805 INFO:tasks.workunit.client.0.vm05.stdout:7/805: symlink d1/d6/d80/d82/lf8 0 2026-03-10T07:51:06.809 INFO:tasks.workunit.client.0.vm05.stdout:3/747: creat d8/d22/dad/fff x:0 0 0 2026-03-10T07:51:06.811 INFO:tasks.workunit.client.0.vm05.stdout:4/839: getdents d0 0 2026-03-10T07:51:06.812 INFO:tasks.workunit.client.0.vm05.stdout:5/801: symlink d2/d20/d7b/l10d 0 2026-03-10T07:51:06.816 INFO:tasks.workunit.client.0.vm05.stdout:0/780: dwrite d8/dd/f40 [0,4194304] 0 2026-03-10T07:51:06.837 INFO:tasks.workunit.client.0.vm05.stdout:7/806: symlink d1/d6/d80/dcd/lf9 0 2026-03-10T07:51:06.837 INFO:tasks.workunit.client.0.vm05.stdout:3/748: creat d8/dd5/dfb/f100 x:0 0 0 2026-03-10T07:51:06.837 INFO:tasks.workunit.client.0.vm05.stdout:0/781: dread d8/f20 [0,4194304] 0 2026-03-10T07:51:06.837 INFO:tasks.workunit.client.0.vm05.stdout:0/782: chown d8/dd/d10/f6c 18599 1 2026-03-10T07:51:06.841 INFO:tasks.workunit.client.0.vm05.stdout:0/783: readlink d8/dd/d10/d26/d2a/l82 0 2026-03-10T07:51:06.845 INFO:tasks.workunit.client.0.vm05.stdout:0/784: truncate d8/d9c/fd8 602914 0 2026-03-10T07:51:06.846 INFO:tasks.workunit.client.0.vm05.stdout:0/785: chown d8/dd/d10/d26/d3a/d5e/cbd 26810888 1 2026-03-10T07:51:06.853 INFO:tasks.workunit.client.0.vm05.stdout:7/807: creat d1/d6/d3b/ffa x:0 0 0 2026-03-10T07:51:06.853 INFO:tasks.workunit.client.0.vm05.stdout:7/808: chown d1/d34/d59/d60/d8c/laa 65101298 1 2026-03-10T07:51:06.855 INFO:tasks.workunit.client.0.vm05.stdout:7/809: write d1/d3c/f89 [3213348,123797] 0 2026-03-10T07:51:06.855 INFO:tasks.workunit.client.0.vm05.stdout:1/778: dread da/d26/d2b/fb0 [0,4194304] 0 2026-03-10T07:51:06.875 INFO:tasks.workunit.client.0.vm05.stdout:5/802: read 
d2/d4b/f73 [1711136,38168] 0 2026-03-10T07:51:06.883 INFO:tasks.workunit.client.0.vm05.stdout:9/734: getdents d8/d86 0 2026-03-10T07:51:06.883 INFO:tasks.workunit.client.0.vm05.stdout:7/810: symlink d1/d3c/d71/d79/d8a/dac/lfb 0 2026-03-10T07:51:06.884 INFO:tasks.workunit.client.0.vm05.stdout:5/803: dread d2/d5/f71 [0,4194304] 0 2026-03-10T07:51:06.886 INFO:tasks.workunit.client.0.vm05.stdout:5/804: dread - d2/d12/d2d/d4a/de7/f102 zero size 2026-03-10T07:51:06.916 INFO:tasks.workunit.client.0.vm05.stdout:6/808: dwrite d0/d11/f9d [0,4194304] 0 2026-03-10T07:51:06.921 INFO:tasks.workunit.client.0.vm05.stdout:4/840: write d0/d6/d9/f49 [659413,75234] 0 2026-03-10T07:51:06.921 INFO:tasks.workunit.client.0.vm05.stdout:2/829: dwrite d0/d8/d66/dd1/d49/df9/da5/f109 [0,4194304] 0 2026-03-10T07:51:06.921 INFO:tasks.workunit.client.0.vm05.stdout:3/749: write d8/d1f/d2a/d96/da9/fb5 [123131,14294] 0 2026-03-10T07:51:06.927 INFO:tasks.workunit.client.0.vm05.stdout:2/830: dread - d0/d8/d66/dd1/d49/d81/dd5/f108 zero size 2026-03-10T07:51:06.927 INFO:tasks.workunit.client.0.vm05.stdout:8/693: dwrite d1/d52/f77 [0,4194304] 0 2026-03-10T07:51:06.933 INFO:tasks.workunit.client.0.vm05.stdout:2/831: stat d0/d8/d43/d38 0 2026-03-10T07:51:06.935 INFO:tasks.workunit.client.0.vm05.stdout:8/694: stat d1/fe 0 2026-03-10T07:51:06.950 INFO:tasks.workunit.client.0.vm05.stdout:9/735: rename d8/d86/d28/d79/d57/de1/d1c/d20/d59/feb to d8/d86/d28/d79/d57/de1/d22/d33/d47/ff7 0 2026-03-10T07:51:06.960 INFO:tasks.workunit.client.0.vm05.stdout:6/809: mkdir d0/d35/d36/dc8/d106 0 2026-03-10T07:51:06.964 INFO:tasks.workunit.client.0.vm05.stdout:2/832: dread d0/d8/d66/dd1/fa6 [0,4194304] 0 2026-03-10T07:51:06.967 INFO:tasks.workunit.client.0.vm05.stdout:3/750: symlink d8/d1f/d24/d45/l101 0 2026-03-10T07:51:06.974 INFO:tasks.workunit.client.0.vm05.stdout:9/736: dread f6 [0,4194304] 0 2026-03-10T07:51:06.985 INFO:tasks.workunit.client.0.vm05.stdout:5/805: mknod d2/d12/dda/da1/dc0/c10e 0 2026-03-10T07:51:06.988 
INFO:tasks.workunit.client.0.vm05.stdout:5/806: chown d2/d12/f3a 1785827804 1 2026-03-10T07:51:06.993 INFO:tasks.workunit.client.0.vm05.stdout:4/841: read d0/d6/d9/f8f [80825,88573] 0 2026-03-10T07:51:06.994 INFO:tasks.workunit.client.0.vm05.stdout:0/786: getdents d8/dd/d10/db7 0 2026-03-10T07:51:06.998 INFO:tasks.workunit.client.0.vm05.stdout:6/810: rename d0/d35/d36/dc8/cec to d0/d11/d2e/c107 0 2026-03-10T07:51:07.000 INFO:tasks.workunit.client.0.vm05.stdout:2/833: mknod d0/d7e/c10e 0 2026-03-10T07:51:07.013 INFO:tasks.workunit.client.0.vm05.stdout:5/807: fdatasync d2/d20/d4c/fa5 0 2026-03-10T07:51:07.017 INFO:tasks.workunit.client.0.vm05.stdout:4/842: creat d0/d6/d9/d12/d4f/f116 x:0 0 0 2026-03-10T07:51:07.026 INFO:tasks.workunit.client.0.vm05.stdout:4/843: fdatasync d0/d6/d9/d12/d45/d55/f56 0 2026-03-10T07:51:07.030 INFO:tasks.workunit.client.0.vm05.stdout:4/844: stat d0/d6/d9/d5a/f58 0 2026-03-10T07:51:07.030 INFO:tasks.workunit.client.0.vm05.stdout:4/845: fdatasync d0/d6/d9/f83 0 2026-03-10T07:51:07.044 INFO:tasks.workunit.client.0.vm05.stdout:2/834: dwrite d0/d8/d66/dd1/d49/db3/fdf [0,4194304] 0 2026-03-10T07:51:07.050 INFO:tasks.workunit.client.0.vm05.stdout:9/737: sync 2026-03-10T07:51:07.051 INFO:tasks.workunit.client.0.vm05.stdout:2/835: readlink d0/d8/d43/l5b 0 2026-03-10T07:51:07.055 INFO:tasks.workunit.client.0.vm05.stdout:6/811: dread d0/d11/d57/d66/f79 [4194304,4194304] 0 2026-03-10T07:51:07.055 INFO:tasks.workunit.client.0.vm05.stdout:5/808: mknod d2/d20/d33/d53/c10f 0 2026-03-10T07:51:07.071 INFO:tasks.workunit.client.0.vm05.stdout:0/787: rename d8/dd/d10/db7/dc3/lcf to d8/dd/d10/d26/d8b/l10c 0 2026-03-10T07:51:07.072 INFO:tasks.workunit.client.0.vm05.stdout:6/812: dwrite d0/d35/f41 [0,4194304] 0 2026-03-10T07:51:07.084 INFO:tasks.workunit.client.0.vm05.stdout:2/836: dwrite d0/d8/d66/dd1/fda [0,4194304] 0 2026-03-10T07:51:07.088 INFO:tasks.workunit.client.0.vm05.stdout:7/811: dwrite d1/d3c/d71/fab [0,4194304] 0 2026-03-10T07:51:07.100 
INFO:tasks.workunit.client.0.vm05.stdout:1/779: dwrite da/dd/d2a/d55/d64/dd1/f4e [0,4194304] 0 2026-03-10T07:51:07.103 INFO:tasks.workunit.client.0.vm05.stdout:4/846: mknod d0/d6/d60/dde/df2/c117 0 2026-03-10T07:51:07.107 INFO:tasks.workunit.client.0.vm05.stdout:4/847: write d0/d6/d9/d12/d45/d55/d44/f7e [2345029,21926] 0 2026-03-10T07:51:07.120 INFO:tasks.workunit.client.0.vm05.stdout:8/695: dwrite d1/d6f/fa7 [0,4194304] 0 2026-03-10T07:51:07.147 INFO:tasks.workunit.client.0.vm05.stdout:3/751: link d8/d22/d60/l5e d8/d1f/d24/d76/dc5/l102 0 2026-03-10T07:51:07.149 INFO:tasks.workunit.client.0.vm05.stdout:3/752: write d8/d1f/d2a/d96/fcf [1075574,66303] 0 2026-03-10T07:51:07.185 INFO:tasks.workunit.client.0.vm05.stdout:5/809: write d2/d5/f71 [100720,126171] 0 2026-03-10T07:51:07.231 INFO:tasks.workunit.client.0.vm05.stdout:7/812: creat d1/d34/ffc x:0 0 0 2026-03-10T07:51:07.233 INFO:tasks.workunit.client.0.vm05.stdout:7/813: dread - d1/d34/d59/f6f zero size 2026-03-10T07:51:07.257 INFO:tasks.workunit.client.0.vm05.stdout:8/696: chown d1/dd/d18/d20/d2a/d48/f50 245 1 2026-03-10T07:51:07.274 INFO:tasks.workunit.client.0.vm05.stdout:0/788: creat d8/dd/d10/db7/dc3/d10b/f10d x:0 0 0 2026-03-10T07:51:07.292 INFO:tasks.workunit.client.0.vm05.stdout:8/697: stat d1/d6f/f7d 0 2026-03-10T07:51:07.298 INFO:tasks.workunit.client.0.vm05.stdout:7/814: dread d1/d34/d59/fd1 [0,4194304] 0 2026-03-10T07:51:07.308 INFO:tasks.workunit.client.0.vm05.stdout:0/789: symlink d8/dd/d37/dfd/l10e 0 2026-03-10T07:51:07.315 INFO:tasks.workunit.client.0.vm05.stdout:4/848: dwrite d0/fa3 [0,4194304] 0 2026-03-10T07:51:07.315 INFO:tasks.workunit.client.0.vm05.stdout:2/837: link d0/d8/d66/dd1/d49/fd3 d0/d8/d66/dd1/d49/df9/db2/dd7/f10f 0 2026-03-10T07:51:07.324 INFO:tasks.workunit.client.0.vm05.stdout:6/813: link d0/d11/d4f/d56/f83 d0/d35/f108 0 2026-03-10T07:51:07.330 INFO:tasks.workunit.client.0.vm05.stdout:1/780: link da/caa da/d26/d2b/daf/dbe/cf3 0 2026-03-10T07:51:07.330 
INFO:tasks.workunit.client.0.vm05.stdout:8/698: mknod d1/dd/d18/ce1 0 2026-03-10T07:51:07.331 INFO:tasks.workunit.client.0.vm05.stdout:4/849: dwrite d0/d6/f39 [0,4194304] 0 2026-03-10T07:51:07.338 INFO:tasks.workunit.client.0.vm05.stdout:7/815: rmdir d1/d34 39 2026-03-10T07:51:07.350 INFO:tasks.workunit.client.0.vm05.stdout:5/810: creat d2/d20/d4c/db6/f110 x:0 0 0 2026-03-10T07:51:07.351 INFO:tasks.workunit.client.0.vm05.stdout:9/738: getdents d8/d86/d28/d79/d57/de1/d22/d33/d62/de0 0 2026-03-10T07:51:07.356 INFO:tasks.workunit.client.0.vm05.stdout:0/790: unlink d8/dd/d37/d67/d96/lf1 0 2026-03-10T07:51:07.360 INFO:tasks.workunit.client.0.vm05.stdout:2/838: mknod d0/d8/d66/dd1/d49/df9/db2/dd7/c110 0 2026-03-10T07:51:07.375 INFO:tasks.workunit.client.0.vm05.stdout:7/816: chown d1/d34/lea 419049 1 2026-03-10T07:51:07.376 INFO:tasks.workunit.client.0.vm05.stdout:7/817: chown d1/d6/d3b/lc1 23265230 1 2026-03-10T07:51:07.382 INFO:tasks.workunit.client.0.vm05.stdout:3/753: truncate d8/d1f/d2a/d4a/d7d/ff3 1791991 0 2026-03-10T07:51:07.387 INFO:tasks.workunit.client.0.vm05.stdout:8/699: dwrite d1/dd/d18/d20/f43 [0,4194304] 0 2026-03-10T07:51:07.396 INFO:tasks.workunit.client.0.vm05.stdout:4/850: rename d0/d6/d9/d12/d69/fe1 to d0/d6/d9/d12/d9c/db7/da7/d96/f118 0 2026-03-10T07:51:07.405 INFO:tasks.workunit.client.0.vm05.stdout:2/839: dread - d0/d8/d43/fe1 zero size 2026-03-10T07:51:07.406 INFO:tasks.workunit.client.0.vm05.stdout:6/814: dwrite d0/d6/f44 [0,4194304] 0 2026-03-10T07:51:07.415 INFO:tasks.workunit.client.0.vm05.stdout:7/818: rmdir d1/d6 39 2026-03-10T07:51:07.419 INFO:tasks.workunit.client.0.vm05.stdout:3/754: truncate d8/d1f/d24/d8a/f91 4653935 0 2026-03-10T07:51:07.433 INFO:tasks.workunit.client.0.vm05.stdout:4/851: dread d0/d6/d9/d12/d69/fa5 [0,4194304] 0 2026-03-10T07:51:07.434 INFO:tasks.workunit.client.0.vm05.stdout:6/815: fsync d0/f29 0 2026-03-10T07:51:07.434 INFO:tasks.workunit.client.0.vm05.stdout:6/816: chown d0/d11/d22/d6c/d84 8089 1 
2026-03-10T07:51:07.435 INFO:tasks.workunit.client.0.vm05.stdout:4/852: stat d0/d6/d9/l59 0 2026-03-10T07:51:07.436 INFO:tasks.workunit.client.0.vm05.stdout:4/853: chown d0/d6/d9/d12/d4f/cdf 213 1 2026-03-10T07:51:07.440 INFO:tasks.workunit.client.0.vm05.stdout:2/840: mkdir d0/d8/d66/dd1/d49/db1/d111 0 2026-03-10T07:51:07.443 INFO:tasks.workunit.client.0.vm05.stdout:2/841: read - d0/d8/d43/fe1 zero size 2026-03-10T07:51:07.444 INFO:tasks.workunit.client.0.vm05.stdout:7/819: dwrite d1/d3c/d71/d79/d8a/fad [0,4194304] 0 2026-03-10T07:51:07.459 INFO:tasks.workunit.client.0.vm05.stdout:9/739: creat d8/d86/d28/d79/d57/de1/d22/d33/ff8 x:0 0 0 2026-03-10T07:51:07.459 INFO:tasks.workunit.client.0.vm05.stdout:1/781: getdents da/dd/d2a 0 2026-03-10T07:51:07.460 INFO:tasks.workunit.client.0.vm05.stdout:1/782: fdatasync da/dd/d2a/d55/d64/f9f 0 2026-03-10T07:51:07.460 INFO:tasks.workunit.client.0.vm05.stdout:6/817: fdatasync d0/d35/d36/f59 0 2026-03-10T07:51:07.463 INFO:tasks.workunit.client.0.vm05.stdout:5/811: getdents d2/d12/dda/da1/dc0/dc2 0 2026-03-10T07:51:07.463 INFO:tasks.workunit.client.0.vm05.stdout:4/854: symlink d0/d6/d9/d12/d69/dc7/ded/l119 0 2026-03-10T07:51:07.464 INFO:tasks.workunit.client.0.vm05.stdout:7/820: dwrite d1/f49 [0,4194304] 0 2026-03-10T07:51:07.465 INFO:tasks.workunit.client.0.vm05.stdout:2/842: creat d0/d8/d66/dd1/d49/df9/db2/dd7/ddb/f112 x:0 0 0 2026-03-10T07:51:07.474 INFO:tasks.workunit.client.0.vm05.stdout:4/855: dwrite d0/d6/d9/d12/d69/dc7/f110 [0,4194304] 0 2026-03-10T07:51:07.482 INFO:tasks.workunit.client.0.vm05.stdout:5/812: fsync d2/d5/d61/ffe 0 2026-03-10T07:51:07.482 INFO:tasks.workunit.client.0.vm05.stdout:7/821: dwrite d1/d34/d59/f64 [0,4194304] 0 2026-03-10T07:51:07.496 INFO:tasks.workunit.client.0.vm05.stdout:2/843: chown d0/d8/d43/df/d4d/l8a 1457473 1 2026-03-10T07:51:07.502 INFO:tasks.workunit.client.0.vm05.stdout:4/856: dread d0/d6/d9/d12/d4f/f5b [0,4194304] 0 2026-03-10T07:51:07.510 INFO:tasks.workunit.client.0.vm05.stdout:9/740: 
mkdir d8/d86/d28/d79/d57/de1/d22/d33/df9 0 2026-03-10T07:51:07.521 INFO:tasks.workunit.client.0.vm05.stdout:4/857: dread d0/d6/d9/d12/f36 [4194304,4194304] 0 2026-03-10T07:51:07.547 INFO:tasks.workunit.client.0.vm05.stdout:6/818: link d0/d35/d36/dc8/cdd d0/d35/d36/db8/c109 0 2026-03-10T07:51:07.547 INFO:tasks.workunit.client.0.vm05.stdout:6/819: fdatasync d0/f26 0 2026-03-10T07:51:07.559 INFO:tasks.workunit.client.0.vm05.stdout:3/755: sync 2026-03-10T07:51:07.564 INFO:tasks.workunit.client.0.vm05.stdout:5/813: symlink d2/d12/dda/da1/dc0/l111 0 2026-03-10T07:51:07.574 INFO:tasks.workunit.client.0.vm05.stdout:1/783: rename da/d26/d2b/ca4 to da/dd/d2a/d55/cf4 0 2026-03-10T07:51:07.582 INFO:tasks.workunit.client.0.vm05.stdout:6/820: symlink d0/d6/d3b/l10a 0 2026-03-10T07:51:07.585 INFO:tasks.workunit.client.0.vm05.stdout:3/756: truncate d8/d1f/d24/d8a/fcb 1786340 0 2026-03-10T07:51:07.595 INFO:tasks.workunit.client.0.vm05.stdout:0/791: write d8/dd/d37/fd6 [589598,75269] 0 2026-03-10T07:51:07.595 INFO:tasks.workunit.client.0.vm05.stdout:0/792: fsync d8/dd/d37/fc9 0 2026-03-10T07:51:07.596 INFO:tasks.workunit.client.0.vm05.stdout:8/700: write d1/d6f/f85 [308020,80300] 0 2026-03-10T07:51:07.600 INFO:tasks.workunit.client.0.vm05.stdout:8/701: write d1/dd/d18/d20/d2a/d34/d49/d5d/fd6 [649073,69409] 0 2026-03-10T07:51:07.608 INFO:tasks.workunit.client.0.vm05.stdout:3/757: dread d8/d1f/d2a/d34/f39 [0,4194304] 0 2026-03-10T07:51:07.610 INFO:tasks.workunit.client.0.vm05.stdout:6/821: mkdir d0/d11/d22/d10b 0 2026-03-10T07:51:07.611 INFO:tasks.workunit.client.0.vm05.stdout:6/822: read - d0/d11/d2e/f105 zero size 2026-03-10T07:51:07.619 INFO:tasks.workunit.client.0.vm05.stdout:0/793: read d8/dd/d37/d67/f87 [55266,34705] 0 2026-03-10T07:51:07.629 INFO:tasks.workunit.client.0.vm05.stdout:2/844: dwrite d0/d8/d43/df/d53/f82 [0,4194304] 0 2026-03-10T07:51:07.629 INFO:tasks.workunit.client.0.vm05.stdout:3/758: truncate d8/d1f/d24/d76/dc5/de1/d19/f21 2752609 0 2026-03-10T07:51:07.632 
INFO:tasks.workunit.client.0.vm05.stdout:4/858: write d0/d6/d37/f75 [2948726,62849] 0 2026-03-10T07:51:07.641 INFO:tasks.workunit.client.0.vm05.stdout:0/794: mkdir d8/d9c/dc8/d10f 0 2026-03-10T07:51:07.642 INFO:tasks.workunit.client.0.vm05.stdout:1/784: link da/d26/d2b/daf/dbe/dc0/ce2 da/dd/d2a/d55/d64/dac/cf5 0 2026-03-10T07:51:07.650 INFO:tasks.workunit.client.0.vm05.stdout:9/741: dwrite d8/d86/d28/d79/d57/de1/f5d [0,4194304] 0 2026-03-10T07:51:07.656 INFO:tasks.workunit.client.0.vm05.stdout:7/822: dwrite d1/d6/f31 [0,4194304] 0 2026-03-10T07:51:07.674 INFO:tasks.workunit.client.0.vm05.stdout:5/814: write d2/d20/d33/d86/fb3 [180239,98051] 0 2026-03-10T07:51:07.693 INFO:tasks.workunit.client.0.vm05.stdout:8/702: rename d1/dd/d18/d20/d2a/d34/d49/fa9 to d1/dd/d4d/d64/fe2 0 2026-03-10T07:51:07.696 INFO:tasks.workunit.client.0.vm05.stdout:6/823: dwrite d0/d11/d57/d66/f79 [4194304,4194304] 0 2026-03-10T07:51:07.696 INFO:tasks.workunit.client.0.vm05.stdout:6/824: write d0/d6/f44 [3863359,58609] 0 2026-03-10T07:51:07.696 INFO:tasks.workunit.client.0.vm05.stdout:3/759: rename d8/d22/ce8 to d8/d1f/d2a/d4a/d7d/c103 0 2026-03-10T07:51:07.696 INFO:tasks.workunit.client.0.vm05.stdout:8/703: stat d1/dd/d18/d20/d2a/d48/f4a 0 2026-03-10T07:51:07.696 INFO:tasks.workunit.client.0.vm05.stdout:8/704: chown d1/dd/d5e/d9e 104 1 2026-03-10T07:51:07.696 INFO:tasks.workunit.client.0.vm05.stdout:9/742: dread d8/d86/d28/d79/d57/de1/d22/d33/d47/fc2 [0,4194304] 0 2026-03-10T07:51:07.700 INFO:tasks.workunit.client.0.vm05.stdout:7/823: link d1/f16 d1/d6/d80/d82/ffd 0 2026-03-10T07:51:07.706 INFO:tasks.workunit.client.0.vm05.stdout:1/785: dread da/d26/f92 [4194304,4194304] 0 2026-03-10T07:51:07.715 INFO:tasks.workunit.client.0.vm05.stdout:1/786: dwrite da/dd/d2a/d55/d64/dd1/f4e [0,4194304] 0 2026-03-10T07:51:07.720 INFO:tasks.workunit.client.0.vm05.stdout:4/859: sync 2026-03-10T07:51:07.726 INFO:tasks.workunit.client.0.vm05.stdout:0/795: dread d8/dd/d10/d26/d3a/d5e/ffb [0,4194304] 0 
2026-03-10T07:51:07.734 INFO:tasks.workunit.client.0.vm05.stdout:6/825: mknod d0/d11/c10c 0 2026-03-10T07:51:07.750 INFO:tasks.workunit.client.0.vm05.stdout:7/824: creat d1/d34/d59/d60/d8c/de4/ffe x:0 0 0 2026-03-10T07:51:07.754 INFO:tasks.workunit.client.0.vm05.stdout:8/705: read d1/dd/d18/d20/d2a/d34/d49/d5d/fd6 [352409,48843] 0 2026-03-10T07:51:07.760 INFO:tasks.workunit.client.0.vm05.stdout:7/825: dwrite d1/d3c/d71/fd2 [4194304,4194304] 0 2026-03-10T07:51:07.766 INFO:tasks.workunit.client.0.vm05.stdout:7/826: write d1/d6/f2e [4522964,105780] 0 2026-03-10T07:51:07.767 INFO:tasks.workunit.client.0.vm05.stdout:4/860: creat d0/d6/da6/f11a x:0 0 0 2026-03-10T07:51:07.781 INFO:tasks.workunit.client.0.vm05.stdout:2/845: write d0/d8/d43/fe1 [530077,127300] 0 2026-03-10T07:51:07.786 INFO:tasks.workunit.client.0.vm05.stdout:5/815: write d2/d20/d4c/f9a [1208284,72578] 0 2026-03-10T07:51:07.794 INFO:tasks.workunit.client.0.vm05.stdout:4/861: mknod d0/d6/d9/d5a/d6e/db6/db9/c11b 0 2026-03-10T07:51:07.794 INFO:tasks.workunit.client.0.vm05.stdout:4/862: chown d0/d6/d9/d5a/c4b 90558725 1 2026-03-10T07:51:07.797 INFO:tasks.workunit.client.0.vm05.stdout:7/827: mkdir d1/d34/d59/d60/d8c/dff 0 2026-03-10T07:51:07.809 INFO:tasks.workunit.client.0.vm05.stdout:9/743: dwrite d8/d86/d28/d79/d57/de1/d22/d33/f73 [0,4194304] 0 2026-03-10T07:51:07.812 INFO:tasks.workunit.client.0.vm05.stdout:2/846: unlink d0/d8/d43/df/d8b/f99 0 2026-03-10T07:51:07.820 INFO:tasks.workunit.client.0.vm05.stdout:1/787: dwrite da/d26/f92 [4194304,4194304] 0 2026-03-10T07:51:07.824 INFO:tasks.workunit.client.0.vm05.stdout:1/788: fsync da/d26/d9e/fd5 0 2026-03-10T07:51:07.825 INFO:tasks.workunit.client.0.vm05.stdout:4/863: mkdir d0/d11c 0 2026-03-10T07:51:07.826 INFO:tasks.workunit.client.0.vm05.stdout:8/706: write d1/dd/f87 [359945,97333] 0 2026-03-10T07:51:07.836 INFO:tasks.workunit.client.0.vm05.stdout:0/796: write d8/dd/d10/d26/d3a/d5e/d63/fe6 [1154449,78461] 0 2026-03-10T07:51:07.836 
INFO:tasks.workunit.client.0.vm05.stdout:6/826: write d0/d11/d22/d6c/d84/fa8 [293449,109686] 0 2026-03-10T07:51:07.836 INFO:tasks.workunit.client.0.vm05.stdout:0/797: truncate d8/dd/d10/d26/d3a/d5e/ff9 840816 0 2026-03-10T07:51:07.836 INFO:tasks.workunit.client.0.vm05.stdout:8/707: write d1/dd/d18/d20/d2a/d34/d49/d5d/f84 [4169037,76691] 0 2026-03-10T07:51:07.836 INFO:tasks.workunit.client.0.vm05.stdout:8/708: chown d1/dd/d18/d20/d2a/d48/d7c/d9c 130 1 2026-03-10T07:51:07.841 INFO:tasks.workunit.client.0.vm05.stdout:3/760: rename d8/d22/f86 to d8/d1f/d24/d8a/f104 0 2026-03-10T07:51:07.854 INFO:tasks.workunit.client.0.vm05.stdout:1/789: creat da/dd/d12/d86/d9a/ff6 x:0 0 0 2026-03-10T07:51:07.854 INFO:tasks.workunit.client.0.vm05.stdout:1/790: readlink da/dd/d2a/d55/d64/lde 0 2026-03-10T07:51:07.855 INFO:tasks.workunit.client.0.vm05.stdout:1/791: chown da/dd/d12/d34/c3e 150546 1 2026-03-10T07:51:07.859 INFO:tasks.workunit.client.0.vm05.stdout:6/827: symlink d0/d35/d36/db8/l10d 0 2026-03-10T07:51:07.867 INFO:tasks.workunit.client.0.vm05.stdout:6/828: chown d0/d11/d22/f52 187237329 1 2026-03-10T07:51:07.867 INFO:tasks.workunit.client.0.vm05.stdout:0/798: mknod d8/dd/d10/d26/d8b/da4/de7/c110 0 2026-03-10T07:51:07.867 INFO:tasks.workunit.client.0.vm05.stdout:8/709: dread d1/dd/d4d/d64/d6a/fb5 [0,4194304] 0 2026-03-10T07:51:07.867 INFO:tasks.workunit.client.0.vm05.stdout:8/710: write d1/d6f/f85 [464347,129627] 0 2026-03-10T07:51:07.870 INFO:tasks.workunit.client.0.vm05.stdout:8/711: read d1/fe [1700771,55186] 0 2026-03-10T07:51:07.875 INFO:tasks.workunit.client.0.vm05.stdout:1/792: rmdir da/d26 39 2026-03-10T07:51:07.875 INFO:tasks.workunit.client.0.vm05.stdout:1/793: stat da/dd/d2a/d55/d64/f81 0 2026-03-10T07:51:07.888 INFO:tasks.workunit.client.0.vm05.stdout:8/712: write d1/dd/d4d/d64/d6a/fb5 [8947039,12752] 0 2026-03-10T07:51:07.898 INFO:tasks.workunit.client.0.vm05.stdout:8/713: mkdir d1/dd/d18/d20/d2a/de3 0 2026-03-10T07:51:07.898 
INFO:tasks.workunit.client.0.vm05.stdout:8/714: creat d1/dd/d18/d20/fe4 x:0 0 0 2026-03-10T07:51:07.898 INFO:tasks.workunit.client.0.vm05.stdout:8/715: chown d1/dd/d18/d20/d2a/d34/l62 521237 1 2026-03-10T07:51:07.902 INFO:tasks.workunit.client.0.vm05.stdout:6/829: sync 2026-03-10T07:51:07.902 INFO:tasks.workunit.client.0.vm05.stdout:6/830: chown d0/d11/d2e/d81 9 1 2026-03-10T07:51:07.902 INFO:tasks.workunit.client.0.vm05.stdout:8/716: truncate d1/d52/f94 695758 0 2026-03-10T07:51:07.904 INFO:tasks.workunit.client.0.vm05.stdout:8/717: stat d1/dd/d18/d20/d2a/f3a 0 2026-03-10T07:51:07.949 INFO:tasks.workunit.client.0.vm05.stdout:7/828: write d1/d6/d3b/d7f/f9a [2152856,2252] 0 2026-03-10T07:51:07.954 INFO:tasks.workunit.client.0.vm05.stdout:2/847: dwrite d0/d8/d43/f5e [0,4194304] 0 2026-03-10T07:51:07.955 INFO:tasks.workunit.client.0.vm05.stdout:5/816: dwrite d2/d20/d4c/fc4 [0,4194304] 0 2026-03-10T07:51:07.984 INFO:tasks.workunit.client.0.vm05.stdout:5/817: rmdir d2/d12/d4d 39 2026-03-10T07:51:07.985 INFO:tasks.workunit.client.0.vm05.stdout:9/744: dwrite d8/d86/d28/d79/d57/de1/d38/fa3 [0,4194304] 0 2026-03-10T07:51:07.993 INFO:tasks.workunit.client.0.vm05.stdout:0/799: dwrite d8/dd/d37/d67/f87 [0,4194304] 0 2026-03-10T07:51:07.994 INFO:tasks.workunit.client.0.vm05.stdout:0/800: chown d8/dd/d10/d26/d3a/d5e/cb3 7 1 2026-03-10T07:51:08.006 INFO:tasks.workunit.client.0.vm05.stdout:2/848: dread d0/d8/f3b [0,4194304] 0 2026-03-10T07:51:08.006 INFO:tasks.workunit.client.0.vm05.stdout:3/761: mkdir d8/d22/d60/d6e/d105 0 2026-03-10T07:51:08.006 INFO:tasks.workunit.client.0.vm05.stdout:2/849: write d0/d8/d66/dd1/d49/df9/db2/fbc [1204233,124914] 0 2026-03-10T07:51:08.006 INFO:tasks.workunit.client.0.vm05.stdout:1/794: write da/dd/d12/d34/ddb/fc5 [586026,128245] 0 2026-03-10T07:51:08.006 INFO:tasks.workunit.client.0.vm05.stdout:1/795: stat da/dd/d42/fe3 0 2026-03-10T07:51:08.028 INFO:tasks.workunit.client.0.vm05.stdout:5/818: rmdir d2/d20/d33/d86 39 2026-03-10T07:51:08.041 
INFO:tasks.workunit.client.0.vm05.stdout:6/831: write d0/d11/d4f/d56/f73 [294125,18721] 0 2026-03-10T07:51:08.048 INFO:tasks.workunit.client.0.vm05.stdout:0/801: mkdir d8/dd/d10/d26/d8b/da4/de7/d111 0 2026-03-10T07:51:08.071 INFO:tasks.workunit.client.0.vm05.stdout:3/762: mkdir d8/d1f/d24/d76/dc5/de1/d106 0 2026-03-10T07:51:08.078 INFO:tasks.workunit.client.0.vm05.stdout:8/718: dwrite d1/dd/d18/d20/d2a/d34/d49/d5d/f74 [0,4194304] 0 2026-03-10T07:51:08.103 INFO:tasks.workunit.client.0.vm05.stdout:7/829: write d1/d6/d47/f65 [2050486,63604] 0 2026-03-10T07:51:08.103 INFO:tasks.workunit.client.0.vm05.stdout:1/796: fsync da/d26/d9e/fd5 0 2026-03-10T07:51:08.103 INFO:tasks.workunit.client.0.vm05.stdout:1/797: stat da/dd/d12/d34 0 2026-03-10T07:51:08.103 INFO:tasks.workunit.client.0.vm05.stdout:7/830: readlink d1/d34/l39 0 2026-03-10T07:51:08.120 INFO:tasks.workunit.client.0.vm05.stdout:6/832: truncate d0/d11/d4f/f7e 2517542 0 2026-03-10T07:51:08.206 INFO:tasks.workunit.client.0.vm05.stdout:3/763: creat d8/d8f/dbc/f107 x:0 0 0 2026-03-10T07:51:08.209 INFO:tasks.workunit.client.0.vm05.stdout:2/850: mkdir d0/d113 0 2026-03-10T07:51:08.212 INFO:tasks.workunit.client.0.vm05.stdout:4/864: rename d0/d6/d9/d12/d45/d55/d44/l47 to d0/d6/d9/d12/d45/l11d 0 2026-03-10T07:51:08.214 INFO:tasks.workunit.client.0.vm05.stdout:5/819: rmdir d2/d20/d33/d86 39 2026-03-10T07:51:08.214 INFO:tasks.workunit.client.0.vm05.stdout:6/833: creat d0/d6/d3b/f10e x:0 0 0 2026-03-10T07:51:08.214 INFO:tasks.workunit.client.0.vm05.stdout:4/865: dread d0/d6/d9/d12/d9c/db7/da7/f9a [0,4194304] 0 2026-03-10T07:51:08.219 INFO:tasks.workunit.client.0.vm05.stdout:0/802: link d8/dd/f40 d8/dd/d37/d56/d4d/f112 0 2026-03-10T07:51:08.220 INFO:tasks.workunit.client.0.vm05.stdout:0/803: chown d8/dd/d37/d56/ce8 178367 1 2026-03-10T07:51:08.223 INFO:tasks.workunit.client.0.vm05.stdout:0/804: chown d8/dd/d10/d26/d8b/da4/ddf 24004515 1 2026-03-10T07:51:08.237 INFO:tasks.workunit.client.0.vm05.stdout:8/719: rename 
d1/dd/d18/d20 to d1/dd/d4d/d64/d6a/de5 0 2026-03-10T07:51:08.267 INFO:tasks.workunit.client.0.vm05.stdout:5/820: fsync d2/d20/d33/d86/ff1 0 2026-03-10T07:51:08.268 INFO:tasks.workunit.client.0.vm05.stdout:5/821: truncate d2/d12/da8/ddd/fef 395445 0 2026-03-10T07:51:08.286 INFO:tasks.workunit.client.0.vm05.stdout:9/745: link d8/d86/d28/d79/d57/de1/d1c/d20/d59/d8b/l55 d8/d86/d28/d79/d57/de1/d38/d71/lfa 0 2026-03-10T07:51:08.293 INFO:tasks.workunit.client.0.vm05.stdout:0/805: creat d8/dd/d37/d56/d4d/f113 x:0 0 0 2026-03-10T07:51:08.297 INFO:tasks.workunit.client.0.vm05.stdout:2/851: unlink d0/d8/d3d/f40 0 2026-03-10T07:51:08.302 INFO:tasks.workunit.client.0.vm05.stdout:3/764: rename d8/d1f/d24/d76/dc5/de1/d106 to d8/d1f/d108 0 2026-03-10T07:51:08.306 INFO:tasks.workunit.client.0.vm05.stdout:8/720: chown d1/fa 33674 1 2026-03-10T07:51:08.311 INFO:tasks.workunit.client.0.vm05.stdout:1/798: creat da/dd/d12/d34/d58/ff7 x:0 0 0 2026-03-10T07:51:08.314 INFO:tasks.workunit.client.0.vm05.stdout:6/834: mknod d0/d11/c10f 0 2026-03-10T07:51:08.325 INFO:tasks.workunit.client.0.vm05.stdout:0/806: mknod d8/dd/d37/d56/d4d/df8/c114 0 2026-03-10T07:51:08.332 INFO:tasks.workunit.client.0.vm05.stdout:2/852: truncate d0/f22 643765 0 2026-03-10T07:51:08.335 INFO:tasks.workunit.client.0.vm05.stdout:0/807: dread d8/dd/f3c [0,4194304] 0 2026-03-10T07:51:08.336 INFO:tasks.workunit.client.0.vm05.stdout:0/808: fsync d8/dd/d37/fd6 0 2026-03-10T07:51:08.341 INFO:tasks.workunit.client.0.vm05.stdout:7/831: getdents d1 0 2026-03-10T07:51:08.341 INFO:tasks.workunit.client.0.vm05.stdout:7/832: stat d1/f49 0 2026-03-10T07:51:08.356 INFO:tasks.workunit.client.0.vm05.stdout:4/866: write d0/d6/d9/d12/d9c/db7/da7/d96/fe0 [184564,106249] 0 2026-03-10T07:51:08.362 INFO:tasks.workunit.client.0.vm05.stdout:3/765: truncate d8/f18 3259029 0 2026-03-10T07:51:08.363 INFO:tasks.workunit.client.0.vm05.stdout:3/766: stat d8/d1c/c36 0 2026-03-10T07:51:08.364 INFO:tasks.workunit.client.0.vm05.stdout:7/833: symlink 
d1/d3c/d71/d79/d8a/dac/l100 0 2026-03-10T07:51:08.372 INFO:tasks.workunit.client.0.vm05.stdout:6/835: creat d0/d11/d57/d60/dcc/f110 x:0 0 0 2026-03-10T07:51:08.372 INFO:tasks.workunit.client.0.vm05.stdout:9/746: write d8/d86/d28/d79/d57/de1/d22/fb1 [3695388,41147] 0 2026-03-10T07:51:08.404 INFO:tasks.workunit.client.0.vm05.stdout:5/822: write d2/d12/d4d/f5d [5160758,85767] 0 2026-03-10T07:51:08.407 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:51:08 vm05.local ceph-mon[50387]: pgmap v23: 65 pgs: 65 active+clean; 1.6 GiB data, 5.4 GiB used, 115 GiB / 120 GiB avail; 33 MiB/s rd, 74 MiB/s wr, 207 op/s 2026-03-10T07:51:08.408 INFO:tasks.workunit.client.0.vm05.stdout:5/823: read d2/f42 [454291,100070] 0 2026-03-10T07:51:08.411 INFO:tasks.workunit.client.0.vm05.stdout:1/799: link da/dd/d2a/d55/d68/l8b da/d26/d2b/dcb/lf8 0 2026-03-10T07:51:08.413 INFO:tasks.workunit.client.0.vm05.stdout:5/824: truncate d2/d20/d4c/ffb 314866 0 2026-03-10T07:51:08.413 INFO:tasks.workunit.client.0.vm05.stdout:1/800: chown da/d26/d2b/dcb 18583 1 2026-03-10T07:51:08.418 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:51:08 vm08.local ceph-mon[59917]: pgmap v23: 65 pgs: 65 active+clean; 1.6 GiB data, 5.4 GiB used, 115 GiB / 120 GiB avail; 33 MiB/s rd, 74 MiB/s wr, 207 op/s 2026-03-10T07:51:08.421 INFO:tasks.workunit.client.0.vm05.stdout:9/747: truncate d8/d86/d28/d79/d57/de1/d22/d33/d47/fc2 1744804 0 2026-03-10T07:51:08.427 INFO:tasks.workunit.client.0.vm05.stdout:2/853: creat d0/f114 x:0 0 0 2026-03-10T07:51:08.430 INFO:tasks.workunit.client.0.vm05.stdout:0/809: creat d8/dd/d10/f115 x:0 0 0 2026-03-10T07:51:08.436 INFO:tasks.workunit.client.0.vm05.stdout:8/721: getdents d1/dd/d4d/d64/d6a/de5/d2a/d34/d49/d5d 0 2026-03-10T07:51:08.446 INFO:tasks.workunit.client.0.vm05.stdout:0/810: dwrite d8/dd/d37/d56/d4d/ffa [0,4194304] 0 2026-03-10T07:51:08.451 INFO:tasks.workunit.client.0.vm05.stdout:0/811: write d8/dd/d10/d26/d8b/da4/de7/f109 [249607,50897] 0 2026-03-10T07:51:08.453 
INFO:tasks.workunit.client.0.vm05.stdout:7/834: creat d1/d6/dc3/f101 x:0 0 0 2026-03-10T07:51:08.457 INFO:tasks.workunit.client.0.vm05.stdout:3/767: dwrite d8/d1f/d24/d76/fc1 [0,4194304] 0 2026-03-10T07:51:08.461 INFO:tasks.workunit.client.0.vm05.stdout:0/812: stat d8/dd/d10/d26/d8b/d70/l8a 0 2026-03-10T07:51:08.464 INFO:tasks.workunit.client.0.vm05.stdout:3/768: truncate d8/d1f/ffc 368318 0 2026-03-10T07:51:08.467 INFO:tasks.workunit.client.0.vm05.stdout:6/836: symlink d0/d11/d57/l111 0 2026-03-10T07:51:08.475 INFO:tasks.workunit.client.0.vm05.stdout:4/867: write d0/d6/d9/d5a/d6e/ff6 [509530,62210] 0 2026-03-10T07:51:08.481 INFO:tasks.workunit.client.0.vm05.stdout:7/835: dread d1/d6/d47/d8d/fb4 [0,4194304] 0 2026-03-10T07:51:08.494 INFO:tasks.workunit.client.0.vm05.stdout:9/748: dwrite d8/d86/d28/d79/d57/de1/d22/d33/d47/ff7 [0,4194304] 0 2026-03-10T07:51:08.506 INFO:tasks.workunit.client.0.vm05.stdout:8/722: unlink d1/d45/d90/ldd 0 2026-03-10T07:51:08.507 INFO:tasks.workunit.client.0.vm05.stdout:9/749: dread d8/d86/d28/d79/d57/de1/f48 [0,4194304] 0 2026-03-10T07:51:08.509 INFO:tasks.workunit.client.0.vm05.stdout:0/813: dread d8/dd/d37/d56/feb [0,4194304] 0 2026-03-10T07:51:08.517 INFO:tasks.workunit.client.0.vm05.stdout:3/769: unlink d8/d22/d60/fce 0 2026-03-10T07:51:08.518 INFO:tasks.workunit.client.0.vm05.stdout:3/770: chown d8/d1c/d48/d69/fbe 6790672 1 2026-03-10T07:51:08.520 INFO:tasks.workunit.client.0.vm05.stdout:6/837: mkdir d0/d11/d22/d112 0 2026-03-10T07:51:08.523 INFO:tasks.workunit.client.0.vm05.stdout:4/868: read - d0/d6/d9/d12/d45/ff8 zero size 2026-03-10T07:51:08.523 INFO:tasks.workunit.client.0.vm05.stdout:6/838: dread d0/d11/d57/d66/f7b [0,4194304] 0 2026-03-10T07:51:08.529 INFO:tasks.workunit.client.0.vm05.stdout:1/801: link da/dd/d2a/f54 da/dd/ff9 0 2026-03-10T07:51:08.559 INFO:tasks.workunit.client.0.vm05.stdout:8/723: mknod d1/dd/d4d/d64/d6a/de5/d2a/d9a/ce6 0 2026-03-10T07:51:08.559 INFO:tasks.workunit.client.0.vm05.stdout:4/869: fsync 
d0/d6/d9/d5a/d91/f10c 0 2026-03-10T07:51:08.559 INFO:tasks.workunit.client.0.vm05.stdout:7/836: creat d1/d3c/f102 x:0 0 0 2026-03-10T07:51:08.559 INFO:tasks.workunit.client.0.vm05.stdout:0/814: creat d8/dd/f116 x:0 0 0 2026-03-10T07:51:08.559 INFO:tasks.workunit.client.0.vm05.stdout:8/724: truncate d1/dd/d4d/d64/d6a/de5/d2a/d48/d7c/d9c/fab 353029 0 2026-03-10T07:51:08.559 INFO:tasks.workunit.client.0.vm05.stdout:7/837: rmdir d1/d34/d59/d60 39 2026-03-10T07:51:08.559 INFO:tasks.workunit.client.0.vm05.stdout:0/815: symlink d8/dd/d37/d56/l117 0 2026-03-10T07:51:08.559 INFO:tasks.workunit.client.0.vm05.stdout:4/870: dwrite d0/d6/f39 [0,4194304] 0 2026-03-10T07:51:08.559 INFO:tasks.workunit.client.0.vm05.stdout:8/725: fdatasync d1/dd/d4d/d64/d6a/de5/d2a/d48/f4a 0 2026-03-10T07:51:08.559 INFO:tasks.workunit.client.0.vm05.stdout:0/816: chown d8/dd/d10/d26/cb9 21899479 1 2026-03-10T07:51:08.559 INFO:tasks.workunit.client.0.vm05.stdout:8/726: readlink d1/dd/d4d/d64/d6a/de5/d2a/d48/d7c/d9c/lc0 0 2026-03-10T07:51:08.559 INFO:tasks.workunit.client.0.vm05.stdout:0/817: unlink d8/dd/d10/d26/d3a/d5e/ca5 0 2026-03-10T07:51:08.559 INFO:tasks.workunit.client.0.vm05.stdout:0/818: dwrite d8/dd/d10/d26/d3a/fe1 [0,4194304] 0 2026-03-10T07:51:08.559 INFO:tasks.workunit.client.0.vm05.stdout:0/819: chown d8/dd/d10/d26/d3a 1030873609 1 2026-03-10T07:51:08.560 INFO:tasks.workunit.client.0.vm05.stdout:3/771: read d8/fe [3120925,56243] 0 2026-03-10T07:51:08.569 INFO:tasks.workunit.client.0.vm05.stdout:1/802: dread da/dd/d2a/f93 [0,4194304] 0 2026-03-10T07:51:08.570 INFO:tasks.workunit.client.0.vm05.stdout:3/772: mkdir d8/d1c/d109 0 2026-03-10T07:51:08.570 INFO:tasks.workunit.client.0.vm05.stdout:8/727: dread d1/dd/d18/f21 [0,4194304] 0 2026-03-10T07:51:08.572 INFO:tasks.workunit.client.0.vm05.stdout:4/871: getdents d0/d6/d60/dde 0 2026-03-10T07:51:08.574 INFO:tasks.workunit.client.0.vm05.stdout:4/872: write d0/d6/d95/f3a [523184,18406] 0 2026-03-10T07:51:08.574 
INFO:tasks.workunit.client.0.vm05.stdout:8/728: creat d1/dd/d4d/d64/d6a/fe7 x:0 0 0 2026-03-10T07:51:08.578 INFO:tasks.workunit.client.0.vm05.stdout:0/820: rename d8/dd/d10/d26/d2a/f74 to d8/dd/f118 0 2026-03-10T07:51:08.587 INFO:tasks.workunit.client.0.vm05.stdout:0/821: write d8/dd/d37/d67/d96/ff3 [279740,57027] 0 2026-03-10T07:51:08.587 INFO:tasks.workunit.client.0.vm05.stdout:4/873: unlink d0/d6/d60/l10d 0 2026-03-10T07:51:08.587 INFO:tasks.workunit.client.0.vm05.stdout:1/803: dread da/d26/f92 [0,4194304] 0 2026-03-10T07:51:08.587 INFO:tasks.workunit.client.0.vm05.stdout:0/822: creat d8/dd/d37/dfd/f119 x:0 0 0 2026-03-10T07:51:08.587 INFO:tasks.workunit.client.0.vm05.stdout:1/804: rmdir da/d26 39 2026-03-10T07:51:08.587 INFO:tasks.workunit.client.0.vm05.stdout:4/874: mkdir d0/d6/d9/d8c/dbe/d11e 0 2026-03-10T07:51:08.587 INFO:tasks.workunit.client.0.vm05.stdout:8/729: creat d1/dd/d4d/d64/fe8 x:0 0 0 2026-03-10T07:51:08.587 INFO:tasks.workunit.client.0.vm05.stdout:1/805: mkdir da/dd/d2a/d55/d64/dac/dfa 0 2026-03-10T07:51:08.587 INFO:tasks.workunit.client.0.vm05.stdout:4/875: write d0/d6/d9/d12/d9c/db7/da7/d96/f118 [1046964,73352] 0 2026-03-10T07:51:08.587 INFO:tasks.workunit.client.0.vm05.stdout:1/806: readlink da/dd/d2a/d55/d64/dd1/l29 0 2026-03-10T07:51:08.588 INFO:tasks.workunit.client.0.vm05.stdout:1/807: readlink da/dd/d12/d34/l37 0 2026-03-10T07:51:08.590 INFO:tasks.workunit.client.0.vm05.stdout:4/876: rename d0/d6/d6f/c77 to d0/d6/d9/d12/c11f 0 2026-03-10T07:51:08.591 INFO:tasks.workunit.client.0.vm05.stdout:4/877: mkdir d0/d6/d9/d12/d69/d120 0 2026-03-10T07:51:08.592 INFO:tasks.workunit.client.0.vm05.stdout:1/808: read - da/d26/d2b/d71/fa2 zero size 2026-03-10T07:51:08.593 INFO:tasks.workunit.client.0.vm05.stdout:4/878: creat d0/d6/d9/d5a/d6e/db6/db9/f121 x:0 0 0 2026-03-10T07:51:08.595 INFO:tasks.workunit.client.0.vm05.stdout:4/879: read - d0/d6/d9/d12/d69/fdb zero size 2026-03-10T07:51:08.596 INFO:tasks.workunit.client.0.vm05.stdout:4/880: readlink 
d0/d6/d9/d12/d45/d55/l8b 0 2026-03-10T07:51:08.606 INFO:tasks.workunit.client.0.vm05.stdout:4/881: dread d0/d6/d9/d5a/f58 [0,4194304] 0 2026-03-10T07:51:08.690 INFO:tasks.workunit.client.0.vm05.stdout:5/825: sync 2026-03-10T07:51:08.692 INFO:tasks.workunit.client.0.vm05.stdout:5/826: mknod d2/d12/dda/da1/dc0/dc2/c112 0 2026-03-10T07:51:08.693 INFO:tasks.workunit.client.0.vm05.stdout:5/827: symlink d2/d5/d61/l113 0 2026-03-10T07:51:08.695 INFO:tasks.workunit.client.0.vm05.stdout:5/828: link d2/d4b/fc6 d2/d20/d33/f114 0 2026-03-10T07:51:08.697 INFO:tasks.workunit.client.0.vm05.stdout:9/750: sync 2026-03-10T07:51:08.697 INFO:tasks.workunit.client.0.vm05.stdout:3/773: sync 2026-03-10T07:51:08.723 INFO:tasks.workunit.client.0.vm05.stdout:3/774: dread d8/f3b [0,4194304] 0 2026-03-10T07:51:08.729 INFO:tasks.workunit.client.0.vm05.stdout:3/775: creat d8/d1f/d2a/d96/da9/f10a x:0 0 0 2026-03-10T07:51:08.734 INFO:tasks.workunit.client.0.vm05.stdout:3/776: symlink d8/d1f/d24/d8a/l10b 0 2026-03-10T07:51:08.736 INFO:tasks.workunit.client.0.vm05.stdout:3/777: dread - d8/d1f/d2a/d96/da9/fe6 zero size 2026-03-10T07:51:08.742 INFO:tasks.workunit.client.0.vm05.stdout:2/854: dwrite d0/d8/d3d/f8d [0,4194304] 0 2026-03-10T07:51:08.745 INFO:tasks.workunit.client.0.vm05.stdout:3/778: mknod d8/d1f/d24/d76/dc5/de1/c10c 0 2026-03-10T07:51:08.745 INFO:tasks.workunit.client.0.vm05.stdout:6/839: truncate d0/d11/d2e/f30 1203664 0 2026-03-10T07:51:08.750 INFO:tasks.workunit.client.0.vm05.stdout:7/838: write d1/d6/d80/d82/ffd [4593363,34128] 0 2026-03-10T07:51:08.750 INFO:tasks.workunit.client.0.vm05.stdout:7/839: link d1/d6/d80/d82/fa8 d1/d6/d80/dcd/f103 0 2026-03-10T07:51:08.751 INFO:tasks.workunit.client.0.vm05.stdout:5/829: dread d2/d12/d2d/d4a/f99 [0,4194304] 0 2026-03-10T07:51:08.752 INFO:tasks.workunit.client.0.vm05.stdout:2/855: chown d0/d8/d43/df/d8b/dbf/de5 3 1 2026-03-10T07:51:08.775 INFO:tasks.workunit.client.0.vm05.stdout:6/840: creat d0/d11/d86/f113 x:0 0 0 2026-03-10T07:51:08.776 
INFO:tasks.workunit.client.0.vm05.stdout:5/830: dwrite d2/d20/d4c/d64/f96 [4194304,4194304] 0 2026-03-10T07:51:08.780 INFO:tasks.workunit.client.0.vm05.stdout:2/856: mkdir d0/d8/d66/dd1/d49/d81/dd5/d115 0 2026-03-10T07:51:08.784 INFO:tasks.workunit.client.0.vm05.stdout:8/730: write d1/dd/d4d/d64/d6a/de5/d2a/fc8 [823957,64602] 0 2026-03-10T07:51:08.788 INFO:tasks.workunit.client.0.vm05.stdout:0/823: dwrite d8/dd/d10/f19 [0,4194304] 0 2026-03-10T07:51:08.798 INFO:tasks.workunit.client.0.vm05.stdout:3/779: creat d8/d22/d60/d6e/dca/dda/f10d x:0 0 0 2026-03-10T07:51:08.798 INFO:tasks.workunit.client.0.vm05.stdout:6/841: creat d0/d35/d36/d43/d9c/dc7/f114 x:0 0 0 2026-03-10T07:51:08.799 INFO:tasks.workunit.client.0.vm05.stdout:6/842: chown d0/d11/d4f/d56 82838919 1 2026-03-10T07:51:08.800 INFO:tasks.workunit.client.0.vm05.stdout:6/843: write d0/d11/d4f/d7d/db7/fe5 [829504,78598] 0 2026-03-10T07:51:08.806 INFO:tasks.workunit.client.0.vm05.stdout:1/809: truncate da/d26/d2b/dcb/fd7 4164817 0 2026-03-10T07:51:08.812 INFO:tasks.workunit.client.0.vm05.stdout:5/831: dread - d2/d12/da8/ddd/fdf zero size 2026-03-10T07:51:08.812 INFO:tasks.workunit.client.0.vm05.stdout:1/810: chown da/dd/d2a/d55/d68 14013800 1 2026-03-10T07:51:08.812 INFO:tasks.workunit.client.0.vm05.stdout:7/840: rmdir d1/d3c/d71/dd7 0 2026-03-10T07:51:08.812 INFO:tasks.workunit.client.0.vm05.stdout:0/824: creat d8/dd/d10/d26/d8b/d86/f11a x:0 0 0 2026-03-10T07:51:08.812 INFO:tasks.workunit.client.0.vm05.stdout:6/844: creat d0/d35/f115 x:0 0 0 2026-03-10T07:51:08.812 INFO:tasks.workunit.client.0.vm05.stdout:6/845: readlink d0/d11/d4f/lf0 0 2026-03-10T07:51:08.815 INFO:tasks.workunit.client.0.vm05.stdout:4/882: dwrite d0/d6/da6/fcb [0,4194304] 0 2026-03-10T07:51:08.816 INFO:tasks.workunit.client.0.vm05.stdout:5/832: chown d2/d20/d33/d53/d7d/l9d 213659 1 2026-03-10T07:51:08.831 INFO:tasks.workunit.client.0.vm05.stdout:7/841: mkdir d1/d3c/d104 0 2026-03-10T07:51:08.833 INFO:tasks.workunit.client.0.vm05.stdout:9/751: 
dwrite d8/d86/d28/d79/d57/de1/d22/d33/d47/f5f [4194304,4194304] 0 2026-03-10T07:51:08.838 INFO:tasks.workunit.client.0.vm05.stdout:9/752: fsync d8/d86/d28/d79/f91 0 2026-03-10T07:51:08.845 INFO:tasks.workunit.client.0.vm05.stdout:5/833: dread d2/d5/f3d [0,4194304] 0 2026-03-10T07:51:08.845 INFO:tasks.workunit.client.0.vm05.stdout:5/834: stat d2/d5/f3c 0 2026-03-10T07:51:08.846 INFO:tasks.workunit.client.0.vm05.stdout:9/753: fdatasync d8/d86/d28/d79/d57/de1/d1c/d20/d54/f7b 0 2026-03-10T07:51:08.849 INFO:tasks.workunit.client.0.vm05.stdout:5/835: unlink d2/d12/d4d/la6 0 2026-03-10T07:51:08.852 INFO:tasks.workunit.client.0.vm05.stdout:9/754: symlink d8/d86/d28/d79/d57/de1/d22/d33/d47/lfb 0 2026-03-10T07:51:08.858 INFO:tasks.workunit.client.0.vm05.stdout:3/780: sync 2026-03-10T07:51:08.874 INFO:tasks.workunit.client.0.vm05.stdout:7/842: dread d1/d6/f41 [0,4194304] 0 2026-03-10T07:51:08.881 INFO:tasks.workunit.client.0.vm05.stdout:3/781: dread d8/d1f/fcd [0,4194304] 0 2026-03-10T07:51:08.893 INFO:tasks.workunit.client.0.vm05.stdout:5/836: rename d2/d5/f23 to d2/d20/d33/f115 0 2026-03-10T07:51:08.900 INFO:tasks.workunit.client.0.vm05.stdout:2/857: write d0/d8/d43/df/fc2 [181400,79502] 0 2026-03-10T07:51:08.902 INFO:tasks.workunit.client.0.vm05.stdout:8/731: dread d1/dd/d4d/d64/d6a/de5/d2a/d9a/fca [0,4194304] 0 2026-03-10T07:51:08.906 INFO:tasks.workunit.client.0.vm05.stdout:0/825: dwrite d8/dd/d37/d67/f9f [0,4194304] 0 2026-03-10T07:51:08.913 INFO:tasks.workunit.client.0.vm05.stdout:1/811: truncate da/dd/d2a/d55/d64/f7a 3562418 0 2026-03-10T07:51:08.916 INFO:tasks.workunit.client.0.vm05.stdout:1/812: dwrite da/d26/d2b/daf/fb6 [0,4194304] 0 2026-03-10T07:51:08.917 INFO:tasks.workunit.client.0.vm05.stdout:6/846: write d0/f29 [5235656,20958] 0 2026-03-10T07:51:08.923 INFO:tasks.workunit.client.0.vm05.stdout:6/847: fdatasync d0/d11/d57/fb9 0 2026-03-10T07:51:08.926 INFO:tasks.workunit.client.0.vm05.stdout:3/782: fdatasync d8/d1f/d24/d8a/f57 0 2026-03-10T07:51:08.928 
INFO:tasks.workunit.client.0.vm05.stdout:1/813: read da/dd/d42/d80/f87 [3785193,114708] 0 2026-03-10T07:51:08.928 INFO:tasks.workunit.client.0.vm05.stdout:5/837: creat d2/d20/d4c/db6/f116 x:0 0 0 2026-03-10T07:51:08.929 INFO:tasks.workunit.client.0.vm05.stdout:4/883: dwrite d0/d6/d9/d12/d65/f8a [4194304,4194304] 0 2026-03-10T07:51:08.930 INFO:tasks.workunit.client.0.vm05.stdout:4/884: chown d0/d6/d95/f40 13147 1 2026-03-10T07:51:08.973 INFO:tasks.workunit.client.0.vm05.stdout:0/826: fsync d8/dd/d10/fd9 0 2026-03-10T07:51:09.000 INFO:tasks.workunit.client.0.vm05.stdout:4/885: creat d0/d6/d60/f122 x:0 0 0 2026-03-10T07:51:09.000 INFO:tasks.workunit.client.0.vm05.stdout:5/838: creat d2/d12/d4d/f117 x:0 0 0 2026-03-10T07:51:09.007 INFO:tasks.workunit.client.0.vm05.stdout:3/783: read d8/d1f/d24/d8a/f91 [1495244,81886] 0 2026-03-10T07:51:09.007 INFO:tasks.workunit.client.0.vm05.stdout:0/827: mkdir d8/dd/d37/d67/d96/d11b 0 2026-03-10T07:51:09.007 INFO:tasks.workunit.client.0.vm05.stdout:9/755: link d8/d86/d28/d79/d57/de1/d1c/d20/d59/fae d8/d86/d28/d79/d57/de1/d1c/d75/ffc 0 2026-03-10T07:51:09.009 INFO:tasks.workunit.client.0.vm05.stdout:9/756: chown d8/d86/d28/d79/d57/de1/la8 56833 1 2026-03-10T07:51:09.037 INFO:tasks.workunit.client.0.vm05.stdout:8/732: write d1/d45/fba [721276,105338] 0 2026-03-10T07:51:09.037 INFO:tasks.workunit.client.0.vm05.stdout:7/843: dread d1/d6/f32 [0,4194304] 0 2026-03-10T07:51:09.039 INFO:tasks.workunit.client.0.vm05.stdout:2/858: link d0/d8/d43/df/cee d0/d8/d43/d38/c116 0 2026-03-10T07:51:09.047 INFO:tasks.workunit.client.0.vm05.stdout:0/828: fsync d8/dd/d37/d67/fb6 0 2026-03-10T07:51:09.060 INFO:tasks.workunit.client.0.vm05.stdout:5/839: rename d2/d20/d5b/f6e to d2/d20/d33/d86/dac/f118 0 2026-03-10T07:51:09.064 INFO:tasks.workunit.client.0.vm05.stdout:5/840: chown d2/d12/d2d/d4a/faf 2118472076 1 2026-03-10T07:51:09.071 INFO:tasks.workunit.client.0.vm05.stdout:8/733: fsync d1/dd/d4d/d64/d6a/f83 0 2026-03-10T07:51:09.081 
INFO:tasks.workunit.client.0.vm05.stdout:8/734: dwrite d1/d6f/fa7 [0,4194304] 0 2026-03-10T07:51:09.092 INFO:tasks.workunit.client.0.vm05.stdout:6/848: write d0/fa [317616,7845] 0 2026-03-10T07:51:09.104 INFO:tasks.workunit.client.0.vm05.stdout:6/849: chown d0/d11/d57/faf 6157807 1 2026-03-10T07:51:09.104 INFO:tasks.workunit.client.0.vm05.stdout:1/814: truncate da/dd/d12/d34/ddb/fc5 522434 0 2026-03-10T07:51:09.122 INFO:tasks.workunit.client.0.vm05.stdout:0/829: sync 2026-03-10T07:51:09.125 INFO:tasks.workunit.client.0.vm05.stdout:0/830: write d8/dd/d10/d26/d3a/d5e/f71 [3421368,7750] 0 2026-03-10T07:51:09.135 INFO:tasks.workunit.client.0.vm05.stdout:9/757: write d8/d86/f92 [3945240,43401] 0 2026-03-10T07:51:09.142 INFO:tasks.workunit.client.0.vm05.stdout:2/859: write d0/f4 [5225809,124530] 0 2026-03-10T07:51:09.145 INFO:tasks.workunit.client.0.vm05.stdout:3/784: truncate d8/d1f/d24/d76/dc5/de1/d19/d6b/fe7 1082164 0 2026-03-10T07:51:09.232 INFO:tasks.workunit.client.0.vm05.stdout:0/831: symlink d8/dd/d10/d26/d2a/l11c 0 2026-03-10T07:51:09.237 INFO:tasks.workunit.client.0.vm05.stdout:2/860: creat d0/d8/d43/df/df8/f117 x:0 0 0 2026-03-10T07:51:09.244 INFO:tasks.workunit.client.0.vm05.stdout:7/844: symlink d1/d34/d59/d60/d8c/dff/l105 0 2026-03-10T07:51:09.245 INFO:tasks.workunit.client.0.vm05.stdout:7/845: chown d1/d6/d3b/cf3 730 1 2026-03-10T07:51:09.246 INFO:tasks.workunit.client.0.vm05.stdout:8/735: rename d1/cb to d1/d45/d90/ce9 0 2026-03-10T07:51:09.247 INFO:tasks.workunit.client.0.vm05.stdout:4/886: link d0/le d0/d6/d9/d12/d45/d55/d4e/dd2/l123 0 2026-03-10T07:51:09.255 INFO:tasks.workunit.client.0.vm05.stdout:5/841: link d2/d5/led d2/d12/da8/ddd/l119 0 2026-03-10T07:51:09.262 INFO:tasks.workunit.client.0.vm05.stdout:0/832: mkdir d8/d9c/dc8/d11d 0 2026-03-10T07:51:09.266 INFO:tasks.workunit.client.0.vm05.stdout:9/758: mknod d8/d86/d28/d79/d57/de1/d1c/d20/dee/cfd 0 2026-03-10T07:51:09.273 INFO:tasks.workunit.client.0.vm05.stdout:9/759: dwrite 
d8/d86/d28/d79/d57/de1/d6b/f97 [0,4194304] 0 2026-03-10T07:51:09.283 INFO:tasks.workunit.client.0.vm05.stdout:3/785: creat d8/d1c/d109/f10e x:0 0 0 2026-03-10T07:51:09.291 INFO:tasks.workunit.client.0.vm05.stdout:3/786: dwrite d8/d22/d60/d6e/dca/dda/f10d [0,4194304] 0 2026-03-10T07:51:09.292 INFO:tasks.workunit.client.0.vm05.stdout:7/846: rmdir d1/d6/d3b 39 2026-03-10T07:51:09.311 INFO:tasks.workunit.client.0.vm05.stdout:5/842: symlink d2/d20/d33/d53/l11a 0 2026-03-10T07:51:09.343 INFO:tasks.workunit.client.0.vm05.stdout:6/850: link d0/d11/d4f/d7d/db7/cfb d0/d11/d4f/d56/d96/db6/c116 0 2026-03-10T07:51:09.344 INFO:tasks.workunit.client.0.vm05.stdout:1/815: creat da/d26/d2b/daf/dbe/dc0/ffb x:0 0 0 2026-03-10T07:51:09.344 INFO:tasks.workunit.client.0.vm05.stdout:6/851: chown d0/d11/d22/d6c/d84/l9b 0 1 2026-03-10T07:51:09.344 INFO:tasks.workunit.client.0.vm05.stdout:9/760: mkdir d8/d86/dfe 0 2026-03-10T07:51:09.344 INFO:tasks.workunit.client.0.vm05.stdout:6/852: dwrite d0/d35/d36/d43/d9c/dc7/f114 [0,4194304] 0 2026-03-10T07:51:09.344 INFO:tasks.workunit.client.0.vm05.stdout:6/853: stat d0/d11/d22/d6c/d84/cb1 0 2026-03-10T07:51:09.344 INFO:tasks.workunit.client.0.vm05.stdout:6/854: readlink d0/d11/d2e/d81/d92/dc2/lce 0 2026-03-10T07:51:09.344 INFO:tasks.workunit.client.0.vm05.stdout:3/787: dread d8/f5d [0,4194304] 0 2026-03-10T07:51:09.344 INFO:tasks.workunit.client.0.vm05.stdout:7/847: rename d1/c69 to d1/d6/dc3/c106 0 2026-03-10T07:51:09.344 INFO:tasks.workunit.client.0.vm05.stdout:8/736: mknod d1/dd/d4d/d64/d6a/de5/d2a/d34/cea 0 2026-03-10T07:51:09.344 INFO:tasks.workunit.client.0.vm05.stdout:5/843: creat d2/d12/da8/ddd/de9/f11b x:0 0 0 2026-03-10T07:51:09.344 INFO:tasks.workunit.client.0.vm05.stdout:8/737: dwrite d1/dd/d4d/d64/d6a/de5/f5b [0,4194304] 0 2026-03-10T07:51:09.350 INFO:tasks.workunit.client.0.vm05.stdout:4/887: sync 2026-03-10T07:51:09.356 INFO:tasks.workunit.client.0.vm05.stdout:0/833: symlink d8/dd/d10/d26/d8b/l11e 0 2026-03-10T07:51:09.368 
INFO:tasks.workunit.client.0.vm05.stdout:6/855: fdatasync d0/d11/d4f/da0/fc3 0 2026-03-10T07:51:09.370 INFO:tasks.workunit.client.0.vm05.stdout:1/816: dread da/f5c [0,4194304] 0 2026-03-10T07:51:09.374 INFO:tasks.workunit.client.0.vm05.stdout:3/788: chown d8/d22/fe2 17668255 1 2026-03-10T07:51:09.379 INFO:tasks.workunit.client.0.vm05.stdout:5/844: dread - d2/d20/d4c/d64/fde zero size 2026-03-10T07:51:09.381 INFO:tasks.workunit.client.0.vm05.stdout:8/738: chown d1/c3d 355814 1 2026-03-10T07:51:09.383 INFO:tasks.workunit.client.0.vm05.stdout:8/739: write d1/dd/d4d/d64/d6a/de5/f5b [10549424,102707] 0 2026-03-10T07:51:09.396 INFO:tasks.workunit.client.0.vm05.stdout:9/761: rename d8/d86/d28/d79/d57/de1/d22/d33/d62/d6d/de7/fed to d8/d86/d28/d79/d57/de1/d38/fff 0 2026-03-10T07:51:09.407 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:51:09 vm05.local ceph-mon[50387]: pgmap v24: 65 pgs: 65 active+clean; 1.8 GiB data, 6.1 GiB used, 114 GiB / 120 GiB avail; 52 MiB/s rd, 110 MiB/s wr, 316 op/s 2026-03-10T07:51:09.415 INFO:tasks.workunit.client.0.vm05.stdout:6/856: mkdir d0/d11/d57/d60/d117 0 2026-03-10T07:51:09.416 INFO:tasks.workunit.client.0.vm05.stdout:5/845: mkdir d2/d12/da8/ddd/de9/d11c 0 2026-03-10T07:51:09.416 INFO:tasks.workunit.client.0.vm05.stdout:5/846: fsync d2/f9 0 2026-03-10T07:51:09.417 INFO:tasks.workunit.client.0.vm05.stdout:8/740: mkdir d1/d52/dd3/ddb/deb 0 2026-03-10T07:51:09.420 INFO:tasks.workunit.client.0.vm05.stdout:2/861: dwrite d0/d8/d66/dd1/d49/db3/fdf [4194304,4194304] 0 2026-03-10T07:51:09.420 INFO:tasks.workunit.client.0.vm05.stdout:9/762: creat d8/d86/d28/d79/d57/d96/f100 x:0 0 0 2026-03-10T07:51:09.422 INFO:tasks.workunit.client.0.vm05.stdout:9/763: chown d8/d86/d28/d79/d57/de1/d22/d33/d47 0 1 2026-03-10T07:51:09.422 INFO:tasks.workunit.client.0.vm05.stdout:0/834: read d8/dd/d37/d56/f62 [6195711,122222] 0 2026-03-10T07:51:09.435 INFO:tasks.workunit.client.0.vm05.stdout:1/817: mkdir da/dd/d2a/d55/dfc 0 2026-03-10T07:51:09.438 
INFO:tasks.workunit.client.0.vm05.stdout:3/789: creat d8/d47/f10f x:0 0 0 2026-03-10T07:51:09.438 INFO:tasks.workunit.client.0.vm05.stdout:3/790: chown d8/d1f/d2a/d96/da9 0 1 2026-03-10T07:51:09.452 INFO:tasks.workunit.client.0.vm05.stdout:7/848: dread d1/d34/d59/d60/d8c/ff4 [0,4194304] 0 2026-03-10T07:51:09.454 INFO:tasks.workunit.client.0.vm05.stdout:6/857: dread d0/d6/f1d [0,4194304] 0 2026-03-10T07:51:09.455 INFO:tasks.workunit.client.0.vm05.stdout:1/818: mknod da/d26/d2b/daf/dbe/cfd 0 2026-03-10T07:51:09.460 INFO:tasks.workunit.client.0.vm05.stdout:4/888: link d0/d28/l115 d0/d6/d9/d5a/d6e/l124 0 2026-03-10T07:51:09.464 INFO:tasks.workunit.client.0.vm05.stdout:9/764: creat d8/d86/d28/de9/f101 x:0 0 0 2026-03-10T07:51:09.474 INFO:tasks.workunit.client.0.vm05.stdout:6/858: dwrite d0/d11/d57/d60/f74 [0,4194304] 0 2026-03-10T07:51:09.485 INFO:tasks.workunit.client.0.vm05.stdout:8/741: creat d1/dd/d4d/d64/d6a/de5/d2a/fec x:0 0 0 2026-03-10T07:51:09.485 INFO:tasks.workunit.client.0.vm05.stdout:4/889: creat d0/d6/d9/d12/d9c/db7/da7/f125 x:0 0 0 2026-03-10T07:51:09.485 INFO:tasks.workunit.client.0.vm05.stdout:4/890: fsync d0/d6/d95/f3a 0 2026-03-10T07:51:09.488 INFO:tasks.workunit.client.0.vm05.stdout:7/849: symlink d1/d3c/d71/d79/d8a/dac/l107 0 2026-03-10T07:51:09.494 INFO:tasks.workunit.client.0.vm05.stdout:0/835: getdents d8/dd/d10/db7 0 2026-03-10T07:51:09.494 INFO:tasks.workunit.client.0.vm05.stdout:1/819: mknod da/d26/d2b/cfe 0 2026-03-10T07:51:09.498 INFO:tasks.workunit.client.0.vm05.stdout:8/742: symlink d1/dd/d18/led 0 2026-03-10T07:51:09.499 INFO:tasks.workunit.client.0.vm05.stdout:5/847: link d2/d12/dda/da1/faa d2/d12/dda/da1/dc0/f11d 0 2026-03-10T07:51:09.500 INFO:tasks.workunit.client.0.vm05.stdout:2/862: sync 2026-03-10T07:51:09.510 INFO:tasks.workunit.client.0.vm05.stdout:6/859: dread d0/d11/f13 [0,4194304] 0 2026-03-10T07:51:09.516 INFO:tasks.workunit.client.0.vm05.stdout:3/791: getdents d8/d22/d60 0 2026-03-10T07:51:09.517 
INFO:tasks.workunit.client.0.vm05.stdout:5/848: creat d2/d4b/f11e x:0 0 0 2026-03-10T07:51:09.520 INFO:tasks.workunit.client.0.vm05.stdout:9/765: rename d8/d86/d28/d79/d57/d96/ce6 to d8/c102 0 2026-03-10T07:51:09.521 INFO:tasks.workunit.client.0.vm05.stdout:7/850: mkdir d1/d108 0 2026-03-10T07:51:09.536 INFO:tasks.workunit.client.0.vm05.stdout:1/820: symlink da/dd/d12/lff 0 2026-03-10T07:51:09.539 INFO:tasks.workunit.client.0.vm05.stdout:5/849: creat d2/d12/dda/f11f x:0 0 0 2026-03-10T07:51:09.540 INFO:tasks.workunit.client.0.vm05.stdout:0/836: dread d8/dd/d10/d26/d2a/fc7 [0,4194304] 0 2026-03-10T07:51:09.540 INFO:tasks.workunit.client.0.vm05.stdout:3/792: stat d8/d1f/d2a/d4a/d7d/c103 0 2026-03-10T07:51:09.550 INFO:tasks.workunit.client.0.vm05.stdout:9/766: dwrite d8/d86/d28/d79/d57/de1/d38/fff [0,4194304] 0 2026-03-10T07:51:09.551 INFO:tasks.workunit.client.0.vm05.stdout:4/891: write d0/d6/d9/d12/d45/d55/fe3 [736497,71308] 0 2026-03-10T07:51:09.552 INFO:tasks.workunit.client.0.vm05.stdout:8/743: creat d1/dd/d5e/fee x:0 0 0 2026-03-10T07:51:09.552 INFO:tasks.workunit.client.0.vm05.stdout:1/821: mknod da/d26/d2b/dcb/c100 0 2026-03-10T07:51:09.553 INFO:tasks.workunit.client.0.vm05.stdout:4/892: readlink d0/d6/d37/l93 0 2026-03-10T07:51:09.556 INFO:tasks.workunit.client.0.vm05.stdout:4/893: chown d0/d6/d9/d12/d4f/cdf 4302 1 2026-03-10T07:51:09.576 INFO:tasks.workunit.client.0.vm05.stdout:6/860: link d0/d11/d57/da4/fdf d0/d11/d86/f118 0 2026-03-10T07:51:09.579 INFO:tasks.workunit.client.0.vm05.stdout:2/863: dwrite d0/d8/d66/dd1/d49/df9/db2/dd7/f10f [0,4194304] 0 2026-03-10T07:51:09.580 INFO:tasks.workunit.client.0.vm05.stdout:7/851: write d1/d5b/fd5 [438879,112950] 0 2026-03-10T07:51:09.592 INFO:tasks.workunit.client.0.vm05.stdout:8/744: symlink d1/dd/d4d/d64/d6a/de5/d2a/d34/d49/d5d/lef 0 2026-03-10T07:51:09.592 INFO:tasks.workunit.client.0.vm05.stdout:1/822: mkdir da/dd/d2a/d70/d101 0 2026-03-10T07:51:09.593 INFO:tasks.workunit.client.0.vm05.stdout:8/745: fsync 
d1/dd/d4d/d64/d6a/de5/d2a/fec 0 2026-03-10T07:51:09.595 INFO:tasks.workunit.client.0.vm05.stdout:4/894: creat d0/d6/d9/d12/d9c/db7/da7/d96/f126 x:0 0 0 2026-03-10T07:51:09.605 INFO:tasks.workunit.client.0.vm05.stdout:5/850: link d2/d12/d2d/f7a d2/d12/da8/ddd/de9/f120 0 2026-03-10T07:51:09.618 INFO:tasks.workunit.client.0.vm05.stdout:3/793: dread d8/d1f/f6c [0,4194304] 0 2026-03-10T07:51:09.620 INFO:tasks.workunit.client.0.vm05.stdout:0/837: write d8/dd/d10/d26/d48/ff0 [253472,52457] 0 2026-03-10T07:51:09.623 INFO:tasks.workunit.client.0.vm05.stdout:9/767: dwrite d8/d86/d28/d79/d57/de1/d22/f9b [0,4194304] 0 2026-03-10T07:51:09.626 INFO:tasks.workunit.client.0.vm05.stdout:9/768: read - d8/d86/d28/d79/d57/de1/d22/d33/d62/d6d/d9e/fcc zero size 2026-03-10T07:51:09.626 INFO:tasks.workunit.client.0.vm05.stdout:2/864: truncate d0/d8/d43/df/feb 946700 0 2026-03-10T07:51:09.650 INFO:tasks.workunit.client.0.vm05.stdout:4/895: fsync d0/d6/d95/f40 0 2026-03-10T07:51:09.656 INFO:tasks.workunit.client.0.vm05.stdout:4/896: chown d0/d6/d9/d12/d45/d55/d44/d85/f9e 41 1 2026-03-10T07:51:09.660 INFO:tasks.workunit.client.0.vm05.stdout:6/861: truncate d0/d11/d4f/f7e 477634 0 2026-03-10T07:51:09.664 INFO:tasks.workunit.client.0.vm05.stdout:0/838: read d8/d9c/fec [423773,121452] 0 2026-03-10T07:51:09.665 INFO:tasks.workunit.client.0.vm05.stdout:0/839: stat d8/dd/d10/d26/d2a/c7c 0 2026-03-10T07:51:09.668 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:51:09 vm08.local ceph-mon[59917]: pgmap v24: 65 pgs: 65 active+clean; 1.8 GiB data, 6.1 GiB used, 114 GiB / 120 GiB avail; 52 MiB/s rd, 110 MiB/s wr, 316 op/s 2026-03-10T07:51:09.669 INFO:tasks.workunit.client.0.vm05.stdout:7/852: dread d1/d3c/d71/d79/fe1 [0,4194304] 0 2026-03-10T07:51:09.669 INFO:tasks.workunit.client.0.vm05.stdout:1/823: dwrite da/d26/d2b/daf/dbe/dc0/f79 [4194304,4194304] 0 2026-03-10T07:51:09.700 INFO:tasks.workunit.client.0.vm05.stdout:6/862: symlink d0/d11/d86/l119 0 2026-03-10T07:51:09.708 
INFO:tasks.workunit.client.0.vm05.stdout:9/769: write d8/d86/d28/d79/d57/de1/d1c/d20/fbd [5177788,86741] 0 2026-03-10T07:51:09.715 INFO:tasks.workunit.client.0.vm05.stdout:2/865: symlink d0/d8/d3d/l118 0 2026-03-10T07:51:09.716 INFO:tasks.workunit.client.0.vm05.stdout:5/851: creat d2/d20/d7b/f121 x:0 0 0 2026-03-10T07:51:09.716 INFO:tasks.workunit.client.0.vm05.stdout:4/897: mkdir d0/d6/d9/d5a/d6e/db6/d127 0 2026-03-10T07:51:09.717 INFO:tasks.workunit.client.0.vm05.stdout:0/840: mkdir d8/dd/d37/d11f 0 2026-03-10T07:51:09.720 INFO:tasks.workunit.client.0.vm05.stdout:8/746: dwrite d1/dd/d4d/dcc/dbd/fcf [0,4194304] 0 2026-03-10T07:51:09.726 INFO:tasks.workunit.client.0.vm05.stdout:3/794: rename d8/d1c/d64 to d8/d110 0 2026-03-10T07:51:09.732 INFO:tasks.workunit.client.0.vm05.stdout:0/841: dwrite d8/dd/d10/d26/d8b/da4/de7/f109 [0,4194304] 0 2026-03-10T07:51:09.745 INFO:tasks.workunit.client.0.vm05.stdout:6/863: creat d0/d6/d3b/f11a x:0 0 0 2026-03-10T07:51:09.748 INFO:tasks.workunit.client.0.vm05.stdout:1/824: mkdir da/dd/d2a/d55/d102 0 2026-03-10T07:51:09.752 INFO:tasks.workunit.client.0.vm05.stdout:5/852: truncate d2/f8 4125690 0 2026-03-10T07:51:09.757 INFO:tasks.workunit.client.0.vm05.stdout:4/898: symlink d0/d6/d9/d12/d4f/l128 0 2026-03-10T07:51:09.759 INFO:tasks.workunit.client.0.vm05.stdout:4/899: dread - d0/d6/d9/d12/d4f/f113 zero size 2026-03-10T07:51:09.762 INFO:tasks.workunit.client.0.vm05.stdout:2/866: creat d0/d8/d43/da4/dea/f119 x:0 0 0 2026-03-10T07:51:09.762 INFO:tasks.workunit.client.0.vm05.stdout:5/853: dread d2/d20/d33/d53/f68 [0,4194304] 0 2026-03-10T07:51:09.777 INFO:tasks.workunit.client.0.vm05.stdout:8/747: mknod d1/dd/d4d/cf0 0 2026-03-10T07:51:09.777 INFO:tasks.workunit.client.0.vm05.stdout:0/842: unlink d8/dd/d10/d26/d8b/ce0 0 2026-03-10T07:51:09.778 INFO:tasks.workunit.client.0.vm05.stdout:8/748: dwrite d1/dd/d4d/d64/d6a/fb5 [0,4194304] 0 2026-03-10T07:51:09.779 INFO:tasks.workunit.client.0.vm05.stdout:6/864: rename d0/d35/c39 to 
d0/d11/d4f/d56/d96/db6/c11b 0 2026-03-10T07:51:09.780 INFO:tasks.workunit.client.0.vm05.stdout:6/865: readlink d0/l76 0 2026-03-10T07:51:09.791 INFO:tasks.workunit.client.0.vm05.stdout:0/843: dread d8/dd/d37/d67/f87 [0,4194304] 0 2026-03-10T07:51:09.805 INFO:tasks.workunit.client.0.vm05.stdout:5/854: mkdir d2/d20/d122 0 2026-03-10T07:51:09.805 INFO:tasks.workunit.client.0.vm05.stdout:2/867: mknod d0/d113/c11a 0 2026-03-10T07:51:09.806 INFO:tasks.workunit.client.0.vm05.stdout:0/844: chown d8/dd/d37/d56/d4d/fd7 1 1 2026-03-10T07:51:09.813 INFO:tasks.workunit.client.0.vm05.stdout:4/900: getdents d0/d6/d9/d12/d9c/db7/db1 0 2026-03-10T07:51:09.815 INFO:tasks.workunit.client.0.vm05.stdout:2/868: dwrite d0/d8/d66/dd1/d49/df9/da5/f109 [0,4194304] 0 2026-03-10T07:51:09.823 INFO:tasks.workunit.client.0.vm05.stdout:0/845: creat d8/dd/d37/d56/d4d/df8/f120 x:0 0 0 2026-03-10T07:51:09.824 INFO:tasks.workunit.client.0.vm05.stdout:0/846: stat d8/dd/d10/d26/d2a/c7c 0 2026-03-10T07:51:09.825 INFO:tasks.workunit.client.0.vm05.stdout:6/866: getdents d0/d11/d4f/d7d 0 2026-03-10T07:51:09.829 INFO:tasks.workunit.client.0.vm05.stdout:2/869: rename d0/d8/d43/df/ff2 to d0/d8/d43/df/d53/f11b 0 2026-03-10T07:51:09.834 INFO:tasks.workunit.client.0.vm05.stdout:2/870: symlink d0/d8/d43/dc9/l11c 0 2026-03-10T07:51:09.837 INFO:tasks.workunit.client.0.vm05.stdout:0/847: rename d8/f65 to d8/dd/d10/db7/f121 0 2026-03-10T07:51:09.837 INFO:tasks.workunit.client.0.vm05.stdout:0/848: stat d8/d9c/dc8/d11d 0 2026-03-10T07:51:09.845 INFO:tasks.workunit.client.0.vm05.stdout:6/867: getdents d0/d11/d57/da4 0 2026-03-10T07:51:09.850 INFO:tasks.workunit.client.0.vm05.stdout:6/868: mkdir d0/d11/d4f/d11c 0 2026-03-10T07:51:09.855 INFO:tasks.workunit.client.0.vm05.stdout:6/869: readlink d0/l34 0 2026-03-10T07:51:09.869 INFO:tasks.workunit.client.0.vm05.stdout:9/770: sync 2026-03-10T07:51:09.871 INFO:tasks.workunit.client.0.vm05.stdout:8/749: sync 2026-03-10T07:51:09.883 
INFO:tasks.workunit.client.0.vm05.stdout:9/771: mkdir d8/d86/d28/d79/d57/de1/d1c/d75/dc6/d103 0 2026-03-10T07:51:09.894 INFO:tasks.workunit.client.0.vm05.stdout:7/853: write d1/d34/d59/d60/d8c/ff4 [2678542,102472] 0 2026-03-10T07:51:09.895 INFO:tasks.workunit.client.0.vm05.stdout:6/870: dread d0/d11/d4f/da0/da6/fd9 [0,4194304] 0 2026-03-10T07:51:09.903 INFO:tasks.workunit.client.0.vm05.stdout:3/795: write d8/d22/d60/f61 [57596,14949] 0 2026-03-10T07:51:09.903 INFO:tasks.workunit.client.0.vm05.stdout:1/825: write da/fe5 [2686147,118274] 0 2026-03-10T07:51:09.903 INFO:tasks.workunit.client.0.vm05.stdout:3/796: chown d8/d1c/lec 66066633 1 2026-03-10T07:51:09.906 INFO:tasks.workunit.client.0.vm05.stdout:5/855: write d2/d5/f1e [91797,115285] 0 2026-03-10T07:51:09.906 INFO:tasks.workunit.client.0.vm05.stdout:3/797: stat d8/d1c/d48/c73 0 2026-03-10T07:51:09.912 INFO:tasks.workunit.client.0.vm05.stdout:4/901: write d0/d6/d95/fad [1226012,32562] 0 2026-03-10T07:51:09.920 INFO:tasks.workunit.client.0.vm05.stdout:0/849: write d8/dd/fde [157302,51156] 0 2026-03-10T07:51:09.921 INFO:tasks.workunit.client.0.vm05.stdout:2/871: write d0/d8/d43/df/feb [61043,29915] 0 2026-03-10T07:51:09.933 INFO:tasks.workunit.client.0.vm05.stdout:8/750: link d1/dd/d4d/d64/d6a/de5/d2a/d34/d49/d5d/f74 d1/dd/d4d/d64/d6a/de5/d2a/d48/d7c/d9c/da4/ff1 0 2026-03-10T07:51:09.934 INFO:tasks.workunit.client.0.vm05.stdout:9/772: symlink d8/d86/d28/d79/d57/de1/d1c/d20/dd3/d63/l104 0 2026-03-10T07:51:09.935 INFO:tasks.workunit.client.0.vm05.stdout:9/773: fsync d8/d86/d28/d79/d57/de1/f51 0 2026-03-10T07:51:09.936 INFO:tasks.workunit.client.0.vm05.stdout:8/751: write d1/dd/d4d/d64/fe8 [40388,15387] 0 2026-03-10T07:51:09.937 INFO:tasks.workunit.client.0.vm05.stdout:7/854: dread - d1/d3c/d71/d79/d8a/dac/fc2 zero size 2026-03-10T07:51:09.943 INFO:tasks.workunit.client.0.vm05.stdout:6/871: creat d0/d11/d4f/da0/f11d x:0 0 0 2026-03-10T07:51:09.946 INFO:tasks.workunit.client.0.vm05.stdout:3/798: creat 
d8/d1f/d24/d45/f111 x:0 0 0 2026-03-10T07:51:09.950 INFO:tasks.workunit.client.0.vm05.stdout:7/855: stat d1/f3a 0 2026-03-10T07:51:09.953 INFO:tasks.workunit.client.0.vm05.stdout:3/799: chown d8/d8f/fb4 0 1 2026-03-10T07:51:09.957 INFO:tasks.workunit.client.0.vm05.stdout:1/826: creat da/dd/d2a/d70/d101/f103 x:0 0 0 2026-03-10T07:51:09.959 INFO:tasks.workunit.client.0.vm05.stdout:4/902: mknod d0/d6/d6f/dc2/c129 0 2026-03-10T07:51:09.966 INFO:tasks.workunit.client.0.vm05.stdout:3/800: chown d8/d22/d60/d6e/dca/dda/f10d 115 1 2026-03-10T07:51:09.966 INFO:tasks.workunit.client.0.vm05.stdout:7/856: dwrite d1/d34/fed [0,4194304] 0 2026-03-10T07:51:09.966 INFO:tasks.workunit.client.0.vm05.stdout:1/827: dwrite da/dd/d2a/d55/d64/dd1/f4e [0,4194304] 0 2026-03-10T07:51:09.970 INFO:tasks.workunit.client.0.vm05.stdout:6/872: unlink d0/d11/cad 0 2026-03-10T07:51:09.971 INFO:tasks.workunit.client.0.vm05.stdout:2/872: sync 2026-03-10T07:51:09.971 INFO:tasks.workunit.client.0.vm05.stdout:5/856: rename d2/d20/d4c/fa5 to d2/d20/d33/f123 0 2026-03-10T07:51:09.974 INFO:tasks.workunit.client.0.vm05.stdout:3/801: dread d8/d1f/d2a/d96/f7f [0,4194304] 0 2026-03-10T07:51:09.974 INFO:tasks.workunit.client.0.vm05.stdout:3/802: readlink l1 0 2026-03-10T07:51:09.975 INFO:tasks.workunit.client.0.vm05.stdout:1/828: sync 2026-03-10T07:51:09.987 INFO:tasks.workunit.client.0.vm05.stdout:6/873: dwrite d0/d11/d4f/d7d/db7/fe5 [0,4194304] 0 2026-03-10T07:51:09.989 INFO:tasks.workunit.client.0.vm05.stdout:7/857: dread - d1/d3c/d71/d79/d8a/dac/fcf zero size 2026-03-10T07:51:09.990 INFO:tasks.workunit.client.0.vm05.stdout:5/857: creat d2/d12/da8/f124 x:0 0 0 2026-03-10T07:51:09.990 INFO:tasks.workunit.client.0.vm05.stdout:3/803: truncate d8/d8f/fb4 1242930 0 2026-03-10T07:51:09.997 INFO:tasks.workunit.client.0.vm05.stdout:2/873: symlink d0/d8/d66/dd1/d49/l11d 0 2026-03-10T07:51:10.003 INFO:tasks.workunit.client.0.vm05.stdout:8/752: rename d1/dd/d18/f22 to d1/ff2 0 2026-03-10T07:51:10.019 
INFO:tasks.workunit.client.0.vm05.stdout:7/858: truncate d1/fa4 8942046 0 2026-03-10T07:51:10.020 INFO:tasks.workunit.client.0.vm05.stdout:2/874: mknod d0/d8/d66/dd1/d49/df9/da5/da8/c11e 0 2026-03-10T07:51:10.020 INFO:tasks.workunit.client.0.vm05.stdout:3/804: dwrite d8/d47/f10f [0,4194304] 0 2026-03-10T07:51:10.020 INFO:tasks.workunit.client.0.vm05.stdout:8/753: stat d1/dd/d4d/d64/d6a/de5/d2a/d34/l62 0 2026-03-10T07:51:10.020 INFO:tasks.workunit.client.0.vm05.stdout:5/858: rename d2/d5/d61/f66 to d2/d20/d7b/dbc/f125 0 2026-03-10T07:51:10.020 INFO:tasks.workunit.client.0.vm05.stdout:7/859: chown d1/d6/f58 327449 1 2026-03-10T07:51:10.020 INFO:tasks.workunit.client.0.vm05.stdout:3/805: fsync d8/d22/fe2 0 2026-03-10T07:51:10.021 INFO:tasks.workunit.client.0.vm05.stdout:3/806: chown d8/d1f/d24/d76/dc5/de1/f9c 264749448 1 2026-03-10T07:51:10.022 INFO:tasks.workunit.client.0.vm05.stdout:3/807: dread - d8/dd5/dfb/f100 zero size 2026-03-10T07:51:10.027 INFO:tasks.workunit.client.0.vm05.stdout:8/754: read d1/d45/f53 [3639735,84842] 0 2026-03-10T07:51:10.027 INFO:tasks.workunit.client.0.vm05.stdout:7/860: creat d1/d6/d80/f109 x:0 0 0 2026-03-10T07:51:10.028 INFO:tasks.workunit.client.0.vm05.stdout:2/875: link d0/d8/dc6/c10d d0/d2a/c11f 0 2026-03-10T07:51:10.028 INFO:tasks.workunit.client.0.vm05.stdout:5/859: link d2/d5/c7 d2/d20/d33/d86/dac/dc1/d109/c126 0 2026-03-10T07:51:10.033 INFO:tasks.workunit.client.0.vm05.stdout:8/755: mkdir d1/dd/d4d/d64/d6a/de5/d2a/d34/d49/d5d/df3 0 2026-03-10T07:51:10.033 INFO:tasks.workunit.client.0.vm05.stdout:3/808: unlink d8/c15 0 2026-03-10T07:51:10.033 INFO:tasks.workunit.client.0.vm05.stdout:7/861: rename d1/d6/c29 to d1/d6/d80/dcd/c10a 0 2026-03-10T07:51:10.033 INFO:tasks.workunit.client.0.vm05.stdout:2/876: truncate d0/f1e 1104525 0 2026-03-10T07:51:10.034 INFO:tasks.workunit.client.0.vm05.stdout:3/809: fsync d8/d1f/d2a/d96/da9/fe6 0 2026-03-10T07:51:10.039 INFO:tasks.workunit.client.0.vm05.stdout:5/860: link d2/d12/f3a 
d2/d20/d33/d86/dac/f127 0 2026-03-10T07:51:10.039 INFO:tasks.workunit.client.0.vm05.stdout:7/862: dread d1/d6/f4e [0,4194304] 0 2026-03-10T07:51:10.055 INFO:tasks.workunit.client.0.vm05.stdout:7/863: dwrite d1/d6/d80/d82/ffd [0,4194304] 0 2026-03-10T07:51:10.057 INFO:tasks.workunit.client.0.vm05.stdout:7/864: chown d1/d3c/d71 43768 1 2026-03-10T07:51:10.067 INFO:tasks.workunit.client.0.vm05.stdout:2/877: dread d0/d8/d43/da4/ff1 [0,4194304] 0 2026-03-10T07:51:10.067 INFO:tasks.workunit.client.0.vm05.stdout:3/810: dread d8/f25 [0,4194304] 0 2026-03-10T07:51:10.074 INFO:tasks.workunit.client.0.vm05.stdout:2/878: creat d0/d8/d43/df/d4e/d10a/f120 x:0 0 0 2026-03-10T07:51:10.078 INFO:tasks.workunit.client.0.vm05.stdout:3/811: write d8/d1f/d2a/d96/da9/fe6 [352979,54628] 0 2026-03-10T07:51:10.101 INFO:tasks.workunit.client.0.vm05.stdout:9/774: write d8/d86/d28/d79/d57/de1/d22/d33/d62/d6d/faf [887823,123109] 0 2026-03-10T07:51:10.101 INFO:tasks.workunit.client.0.vm05.stdout:0/850: write d8/dd/d37/d56/fe9 [747160,49856] 0 2026-03-10T07:51:10.115 INFO:tasks.workunit.client.0.vm05.stdout:4/903: dwrite d0/d6/d95/f40 [0,4194304] 0 2026-03-10T07:51:10.115 INFO:tasks.workunit.client.0.vm05.stdout:1/829: write da/dd/d2a/d55/d64/dd1/fa9 [348180,36747] 0 2026-03-10T07:51:10.119 INFO:tasks.workunit.client.0.vm05.stdout:4/904: write d0/d6/d95/fad [2086973,109738] 0 2026-03-10T07:51:10.125 INFO:tasks.workunit.client.0.vm05.stdout:6/874: dwrite d0/d11/f13 [0,4194304] 0 2026-03-10T07:51:10.137 INFO:tasks.workunit.client.0.vm05.stdout:5/861: truncate d2/d5/f71 398643 0 2026-03-10T07:51:10.138 INFO:tasks.workunit.client.0.vm05.stdout:7/865: write d1/d3c/d71/d79/fe1 [4893540,95544] 0 2026-03-10T07:51:10.138 INFO:tasks.workunit.client.0.vm05.stdout:2/879: symlink d0/d52/l121 0 2026-03-10T07:51:10.138 INFO:tasks.workunit.client.0.vm05.stdout:3/812: creat d8/d1f/d24/d76/dc5/de1/dac/f112 x:0 0 0 2026-03-10T07:51:10.144 INFO:tasks.workunit.client.0.vm05.stdout:6/875: dread d0/d11/d4f/da0/da6/fd9 
[0,4194304] 0 2026-03-10T07:51:10.153 INFO:tasks.workunit.client.0.vm05.stdout:8/756: dwrite d1/dd/d4d/d64/d6a/de5/d2a/d48/f57 [0,4194304] 0 2026-03-10T07:51:10.157 INFO:tasks.workunit.client.0.vm05.stdout:1/830: rename da/dd/d2a/d55/d68 to da/dd/d2a/d55/d64/d104 0 2026-03-10T07:51:10.158 INFO:tasks.workunit.client.0.vm05.stdout:5/862: creat d2/d12/d4d/f128 x:0 0 0 2026-03-10T07:51:10.158 INFO:tasks.workunit.client.0.vm05.stdout:7/866: dwrite d1/d3c/d71/d79/d8a/fce [0,4194304] 0 2026-03-10T07:51:10.163 INFO:tasks.workunit.client.0.vm05.stdout:2/880: mkdir d0/d8/d43/df/d4e/d122 0 2026-03-10T07:51:10.165 INFO:tasks.workunit.client.0.vm05.stdout:2/881: stat d0/d8/d43/dc9/l11c 0 2026-03-10T07:51:10.167 INFO:tasks.workunit.client.0.vm05.stdout:3/813: read d8/f18 [1491147,4447] 0 2026-03-10T07:51:10.168 INFO:tasks.workunit.client.0.vm05.stdout:0/851: getdents d8/d9c/dc8/d100 0 2026-03-10T07:51:10.168 INFO:tasks.workunit.client.0.vm05.stdout:0/852: chown d8/dd/d37/d67/d96/d11b 77 1 2026-03-10T07:51:10.169 INFO:tasks.workunit.client.0.vm05.stdout:6/876: creat d0/d35/d36/d43/d9c/dc7/f11e x:0 0 0 2026-03-10T07:51:10.172 INFO:tasks.workunit.client.0.vm05.stdout:5/863: rmdir d2/d5/d61 39 2026-03-10T07:51:10.172 INFO:tasks.workunit.client.0.vm05.stdout:1/831: unlink da/dd/d2a/d55/fbf 0 2026-03-10T07:51:10.175 INFO:tasks.workunit.client.0.vm05.stdout:2/882: truncate d0/d8/d43/df/d4e/d10a/f120 415248 0 2026-03-10T07:51:10.180 INFO:tasks.workunit.client.0.vm05.stdout:8/757: truncate d1/dd/d4d/d64/d6a/de5/d2a/f3a 377251 0 2026-03-10T07:51:10.184 INFO:tasks.workunit.client.0.vm05.stdout:0/853: symlink d8/d9c/l122 0 2026-03-10T07:51:10.187 INFO:tasks.workunit.client.0.vm05.stdout:3/814: mkdir d8/d1f/d24/d76/dc5/de1/d113 0 2026-03-10T07:51:10.190 INFO:tasks.workunit.client.0.vm05.stdout:1/832: mkdir da/d26/d9e/dcc/d105 0 2026-03-10T07:51:10.190 INFO:tasks.workunit.client.0.vm05.stdout:3/815: write d8/d1f/d24/d76/fc1 [4591062,9478] 0 2026-03-10T07:51:10.194 
INFO:tasks.workunit.client.0.vm05.stdout:5/864: rename d2/d12/dda/da1/dc0/ld1 to d2/d20/d33/d53/l129 0 2026-03-10T07:51:10.198 INFO:tasks.workunit.client.0.vm05.stdout:5/865: truncate d2/d12/dda/f11f 591588 0 2026-03-10T07:51:10.210 INFO:tasks.workunit.client.0.vm05.stdout:0/854: truncate d8/dd/f40 213443 0 2026-03-10T07:51:10.214 INFO:tasks.workunit.client.0.vm05.stdout:6/877: creat d0/d11/d57/f11f x:0 0 0 2026-03-10T07:51:10.224 INFO:tasks.workunit.client.0.vm05.stdout:6/878: rmdir d0/d11/d4f 39 2026-03-10T07:51:10.239 INFO:tasks.workunit.client.0.vm05.stdout:8/758: link d1/dc9/f31 d1/dd/d4d/d64/d6a/de5/d2a/d34/d49/d5d/df3/ff4 0 2026-03-10T07:51:10.239 INFO:tasks.workunit.client.0.vm05.stdout:0/855: creat d8/dd/d10/d26/d8b/d7d/f123 x:0 0 0 2026-03-10T07:51:10.239 INFO:tasks.workunit.client.0.vm05.stdout:2/883: rename d0/d8/d43/df/d8b/la3 to d0/d8/d43/df/l123 0 2026-03-10T07:51:10.239 INFO:tasks.workunit.client.0.vm05.stdout:6/879: mknod d0/c120 0 2026-03-10T07:51:10.239 INFO:tasks.workunit.client.0.vm05.stdout:2/884: mknod d0/d8/d66/dd1/d49/df9/da5/c124 0 2026-03-10T07:51:10.239 INFO:tasks.workunit.client.0.vm05.stdout:8/759: symlink d1/dd/d4d/d64/d6a/de5/d2a/d48/lf5 0 2026-03-10T07:51:10.239 INFO:tasks.workunit.client.0.vm05.stdout:5/866: sync 2026-03-10T07:51:10.240 INFO:tasks.workunit.client.0.vm05.stdout:8/760: fdatasync d1/dd/d4d/d64/d6a/de5/d2a/fd8 0 2026-03-10T07:51:10.267 INFO:tasks.workunit.client.0.vm05.stdout:4/905: dwrite d0/d28/f33 [0,4194304] 0 2026-03-10T07:51:10.267 INFO:tasks.workunit.client.0.vm05.stdout:9/775: dwrite d8/d86/d28/d79/d57/de1/d22/d33/d62/fd2 [0,4194304] 0 2026-03-10T07:51:10.280 INFO:tasks.workunit.client.0.vm05.stdout:8/761: dread - d1/dd/d4d/d64/d6a/de5/d2a/d48/f7b zero size 2026-03-10T07:51:10.281 INFO:tasks.workunit.client.0.vm05.stdout:7/867: dwrite d1/d6/f77 [0,4194304] 0 2026-03-10T07:51:10.283 INFO:tasks.workunit.client.0.vm05.stdout:8/762: dread - d1/dd/d4d/d64/d6a/de5/fe4 zero size 2026-03-10T07:51:10.284 
INFO:tasks.workunit.client.0.vm05.stdout:8/763: readlink d1/dc9/lcb 0 2026-03-10T07:51:10.299 INFO:tasks.workunit.client.0.vm05.stdout:1/833: rename da/dd/d2a/cea to da/dd/c106 0 2026-03-10T07:51:10.301 INFO:tasks.workunit.client.0.vm05.stdout:4/906: write d0/d6/d9/d5a/f58 [637234,116073] 0 2026-03-10T07:51:10.306 INFO:tasks.workunit.client.0.vm05.stdout:3/816: dwrite d8/d1f/d2a/d4a/d7d/ff3 [0,4194304] 0 2026-03-10T07:51:10.308 INFO:tasks.workunit.client.0.vm05.stdout:3/817: fsync d8/dd5/dfb/f100 0 2026-03-10T07:51:10.312 INFO:tasks.workunit.client.0.vm05.stdout:0/856: write d8/dd/d10/d26/d2a/f94 [156989,107936] 0 2026-03-10T07:51:10.312 INFO:tasks.workunit.client.0.vm05.stdout:2/885: truncate d0/d8/f2d 1320617 0 2026-03-10T07:51:10.324 INFO:tasks.workunit.client.0.vm05.stdout:5/867: mknod d2/d12/dda/da1/dc0/c12a 0 2026-03-10T07:51:10.324 INFO:tasks.workunit.client.0.vm05.stdout:6/880: dwrite d0/d11/d57/f9e [8388608,4194304] 0 2026-03-10T07:51:10.331 INFO:tasks.workunit.client.0.vm05.stdout:7/868: fsync d1/d6/d3b/fe9 0 2026-03-10T07:51:10.344 INFO:tasks.workunit.client.0.vm05.stdout:4/907: rename d0/d6/d9/d5a/c4b to d0/d6/d9/d5a/c12a 0 2026-03-10T07:51:10.346 INFO:tasks.workunit.client.0.vm05.stdout:3/818: rename d8/d22/d60/d6e/ffe to d8/d1f/d24/d76/dc5/de1/f114 0 2026-03-10T07:51:10.346 INFO:tasks.workunit.client.0.vm05.stdout:0/857: symlink d8/dd/d37/d67/d96/l124 0 2026-03-10T07:51:10.348 INFO:tasks.workunit.client.0.vm05.stdout:0/858: write d8/dd/d10/d26/d8b/d86/f101 [4499381,75650] 0 2026-03-10T07:51:10.350 INFO:tasks.workunit.client.0.vm05.stdout:2/886: mknod d0/d8/d43/df/df8/c125 0 2026-03-10T07:51:10.353 INFO:tasks.workunit.client.0.vm05.stdout:6/881: creat d0/d11/d57/da4/f121 x:0 0 0 2026-03-10T07:51:10.368 INFO:tasks.workunit.client.0.vm05.stdout:7/869: mkdir d1/d6/dc3/d10b 0 2026-03-10T07:51:10.369 INFO:tasks.workunit.client.0.vm05.stdout:9/776: link d8/fa d8/d86/d28/d79/d57/de1/d22/d33/df9/f105 0 2026-03-10T07:51:10.369 
INFO:tasks.workunit.client.0.vm05.stdout:4/908: rmdir d0/d6/d60/dde 39 2026-03-10T07:51:10.386 INFO:tasks.workunit.client.0.vm05.stdout:1/834: write da/dd/d12/d86/d9a/fd0 [6324223,10106] 0 2026-03-10T07:51:10.390 INFO:tasks.workunit.client.0.vm05.stdout:8/764: dwrite d1/dd/d4d/d64/d6a/de5/d2a/d48/f50 [0,4194304] 0 2026-03-10T07:51:10.413 INFO:tasks.workunit.client.0.vm05.stdout:9/777: chown d8/d86/d95/fcb 931106 1 2026-03-10T07:51:10.414 INFO:tasks.workunit.client.0.vm05.stdout:0/859: write d8/f1c [475212,42817] 0 2026-03-10T07:51:10.422 INFO:tasks.workunit.client.0.vm05.stdout:7/870: fdatasync d1/d6/f22 0 2026-03-10T07:51:10.426 INFO:tasks.workunit.client.0.vm05.stdout:3/819: truncate d8/d1f/d2a/d96/da9/fb5 100287 0 2026-03-10T07:51:10.428 INFO:tasks.workunit.client.0.vm05.stdout:2/887: dwrite d0/d8/d66/dd1/d49/df9/da5/da8/fac [0,4194304] 0 2026-03-10T07:51:10.449 INFO:tasks.workunit.client.0.vm05.stdout:6/882: mkdir d0/d11/d57/d122 0 2026-03-10T07:51:10.449 INFO:tasks.workunit.client.0.vm05.stdout:6/883: chown d0/d35/d36/db8/l102 2277 1 2026-03-10T07:51:10.449 INFO:tasks.workunit.client.0.vm05.stdout:6/884: chown d0/d11/d22/d6c/d84/fa8 0 1 2026-03-10T07:51:10.449 INFO:tasks.workunit.client.0.vm05.stdout:9/778: unlink d8/d86/d28/d79/d57/de1/d22/d33/d62/d6d/faf 0 2026-03-10T07:51:10.451 INFO:tasks.workunit.client.0.vm05.stdout:7/871: creat d1/d34/d59/d60/d8c/f10c x:0 0 0 2026-03-10T07:51:10.452 INFO:tasks.workunit.client.0.vm05.stdout:9/779: chown d8/d86/d28/d79/d57/de1/d1c/d20/cc8 1731717507 1 2026-03-10T07:51:10.453 INFO:tasks.workunit.client.0.vm05.stdout:4/909: symlink d0/d6/d9/d8c/dbe/d11e/l12b 0 2026-03-10T07:51:10.453 INFO:tasks.workunit.client.0.vm05.stdout:9/780: read d8/f8a [314542,28035] 0 2026-03-10T07:51:10.457 INFO:tasks.workunit.client.0.vm05.stdout:9/781: dread - d8/d86/d28/d79/d57/de1/d1c/d20/d59/d8b/fa1 zero size 2026-03-10T07:51:10.460 INFO:tasks.workunit.client.0.vm05.stdout:9/782: write d8/d86/d28/d79/d57/de1/d22/d33/d62/fd2 [3651621,39907] 0 
2026-03-10T07:51:10.466 INFO:tasks.workunit.client.0.vm05.stdout:3/820: write d8/d1f/f2f [4168434,28116] 0 2026-03-10T07:51:10.469 INFO:tasks.workunit.client.0.vm05.stdout:2/888: creat d0/d8/d43/df/d53/f126 x:0 0 0 2026-03-10T07:51:10.469 INFO:tasks.workunit.client.0.vm05.stdout:1/835: dwrite da/d26/d2b/d89/fb1 [0,4194304] 0 2026-03-10T07:51:10.472 INFO:tasks.workunit.client.0.vm05.stdout:1/836: write da/d26/d9e/fa1 [1203933,44959] 0 2026-03-10T07:51:10.565 INFO:tasks.workunit.client.0.vm05.stdout:8/765: truncate d1/dd/d4d/f63 3541743 0 2026-03-10T07:51:10.586 INFO:tasks.workunit.client.0.vm05.stdout:5/868: link d2/d12/dda/da1/faa d2/d20/d7b/dbc/f12b 0 2026-03-10T07:51:10.620 INFO:tasks.workunit.client.0.vm05.stdout:6/885: symlink d0/d11/d57/d66/l123 0 2026-03-10T07:51:10.648 INFO:tasks.workunit.client.0.vm05.stdout:9/783: creat d8/d86/d28/d79/d57/de1/d1c/d75/dc6/f106 x:0 0 0 2026-03-10T07:51:10.673 INFO:tasks.workunit.client.0.vm05.stdout:3/821: mknod d8/d1f/d24/d76/c115 0 2026-03-10T07:51:10.702 INFO:tasks.workunit.client.0.vm05.stdout:5/869: creat d2/d20/d33/d86/dac/dc1/f12c x:0 0 0 2026-03-10T07:51:10.703 INFO:tasks.workunit.client.0.vm05.stdout:5/870: chown d2/dd7/le2 72687276 1 2026-03-10T07:51:10.719 INFO:tasks.workunit.client.0.vm05.stdout:3/822: sync 2026-03-10T07:51:10.719 INFO:tasks.workunit.client.0.vm05.stdout:3/823: chown d8/d1f/d2a/d96/da9/fe6 2021073 1 2026-03-10T07:51:10.755 INFO:tasks.workunit.client.0.vm05.stdout:3/824: unlink d8/d1f/d24/d76/dc5/de1/d52/f9f 0 2026-03-10T07:51:10.755 INFO:tasks.workunit.client.0.vm05.stdout:8/766: creat d1/ff6 x:0 0 0 2026-03-10T07:51:10.758 INFO:tasks.workunit.client.0.vm05.stdout:8/767: chown d1/dd/d4d/d64/d6a/de5/d2a/d48/f79 589 1 2026-03-10T07:51:10.774 INFO:tasks.workunit.client.0.vm05.stdout:5/871: symlink d2/d12/dda/da1/dc0/l12d 0 2026-03-10T07:51:10.779 INFO:tasks.workunit.client.0.vm05.stdout:0/860: rename d8/dd/d37/f38 to d8/dd/d37/d67/f125 0 2026-03-10T07:51:10.784 
INFO:tasks.workunit.client.0.vm05.stdout:3/825: mkdir d8/d1f/d2a/d34/dbd/d116 0 2026-03-10T07:51:10.786 INFO:tasks.workunit.client.0.vm05.stdout:8/768: mknod d1/dd/d4d/dcc/dbd/cf7 0 2026-03-10T07:51:10.787 INFO:tasks.workunit.client.0.vm05.stdout:5/872: readlink d2/d5/led 0 2026-03-10T07:51:10.791 INFO:tasks.workunit.client.0.vm05.stdout:6/886: link d0/d35/d36/db8/c109 d0/d11/d22/d6c/d84/dc4/c124 0 2026-03-10T07:51:10.809 INFO:tasks.workunit.client.0.vm05.stdout:3/826: creat d8/dd5/dfb/f117 x:0 0 0 2026-03-10T07:51:10.830 INFO:tasks.workunit.client.0.vm05.stdout:0/861: mkdir d8/d9c/dc8/d11d/d126 0 2026-03-10T07:51:10.858 INFO:tasks.workunit.client.0.vm05.stdout:8/769: unlink d1/dd/d4d/d64/d6a/de5/d2a/d48/f57 0 2026-03-10T07:51:10.858 INFO:tasks.workunit.client.0.vm05.stdout:1/837: rename da/d26/d2b/daf to da/dd/d12/d34/d107 0 2026-03-10T07:51:10.858 INFO:tasks.workunit.client.0.vm05.stdout:4/910: truncate d0/d6/d9/d12/d65/f8a 177715 0 2026-03-10T07:51:10.860 INFO:tasks.workunit.client.0.vm05.stdout:8/770: fsync d1/dd/d4d/d64/d6a/de5/f5b 0 2026-03-10T07:51:10.865 INFO:tasks.workunit.client.0.vm05.stdout:7/872: dwrite d1/d3c/d71/d79/f93 [0,4194304] 0 2026-03-10T07:51:10.866 INFO:tasks.workunit.client.0.vm05.stdout:9/784: write d8/d86/d28/d79/d57/de1/d1c/d20/d59/fae [2744480,113703] 0 2026-03-10T07:51:10.867 INFO:tasks.workunit.client.0.vm05.stdout:2/889: dwrite d0/d52/f88 [0,4194304] 0 2026-03-10T07:51:10.883 INFO:tasks.workunit.client.0.vm05.stdout:8/771: dwrite d1/dd/d5e/fee [0,4194304] 0 2026-03-10T07:51:10.897 INFO:tasks.workunit.client.0.vm05.stdout:4/911: creat d0/d6/d9/d12/d69/dc7/ded/f12c x:0 0 0 2026-03-10T07:51:10.903 INFO:tasks.workunit.client.0.vm05.stdout:4/912: dwrite d0/d6/f39 [0,4194304] 0 2026-03-10T07:51:10.914 INFO:tasks.workunit.client.0.vm05.stdout:5/873: dwrite d2/d12/d2d/f7a [0,4194304] 0 2026-03-10T07:51:10.922 INFO:tasks.workunit.client.0.vm05.stdout:5/874: read - d2/d12/da8/f124 zero size 2026-03-10T07:51:10.922 
INFO:tasks.workunit.client.0.vm05.stdout:9/785: truncate d8/f9c 512100 0 2026-03-10T07:51:10.923 INFO:tasks.workunit.client.0.vm05.stdout:5/875: write d2/d20/d33/d86/ff1 [117340,24400] 0 2026-03-10T07:51:10.929 INFO:tasks.workunit.client.0.vm05.stdout:0/862: mkdir d8/d9c/dc8/d10f/d127 0 2026-03-10T07:51:10.931 INFO:tasks.workunit.client.0.vm05.stdout:8/772: dread - d1/dd/d4d/d64/d6a/f83 zero size 2026-03-10T07:51:10.954 INFO:tasks.workunit.client.0.vm05.stdout:4/913: truncate d0/d6/d9/d12/d65/fb5 427877 0 2026-03-10T07:51:10.954 INFO:tasks.workunit.client.0.vm05.stdout:5/876: unlink d2/d20/d33/d53/fd0 0 2026-03-10T07:51:10.954 INFO:tasks.workunit.client.0.vm05.stdout:2/890: mkdir d0/d8/d66/dd1/d49/db1/d107/d127 0 2026-03-10T07:51:10.956 INFO:tasks.workunit.client.0.vm05.stdout:4/914: mkdir d0/d6/d60/dde/d12d 0 2026-03-10T07:51:10.957 INFO:tasks.workunit.client.0.vm05.stdout:2/891: mknod d0/c128 0 2026-03-10T07:51:10.969 INFO:tasks.workunit.client.0.vm05.stdout:2/892: truncate d0/d2a/f45 3561121 0 2026-03-10T07:51:10.970 INFO:tasks.workunit.client.0.vm05.stdout:2/893: write d0/d8/d66/dd1/d49/fd3 [3264860,15004] 0 2026-03-10T07:51:10.988 INFO:tasks.workunit.client.0.vm05.stdout:9/786: sync 2026-03-10T07:51:10.988 INFO:tasks.workunit.client.0.vm05.stdout:0/863: sync 2026-03-10T07:51:10.988 INFO:tasks.workunit.client.0.vm05.stdout:4/915: sync 2026-03-10T07:51:10.993 INFO:tasks.workunit.client.0.vm05.stdout:4/916: unlink d0/d6/d95/f40 0 2026-03-10T07:51:10.994 INFO:tasks.workunit.client.0.vm05.stdout:0/864: truncate d8/dd/d10/f6c 3764372 0 2026-03-10T07:51:10.996 INFO:tasks.workunit.client.0.vm05.stdout:9/787: rename d8/f8a to d8/d86/d28/d79/d57/de1/d22/d33/d85/f107 0 2026-03-10T07:51:10.998 INFO:tasks.workunit.client.0.vm05.stdout:0/865: sync 2026-03-10T07:51:11.005 INFO:tasks.workunit.client.0.vm05.stdout:6/887: dwrite d0/d11/d2e/fbc [0,4194304] 0 2026-03-10T07:51:11.008 INFO:tasks.workunit.client.0.vm05.stdout:4/917: dwrite d0/d6/da6/fba [0,4194304] 0 
2026-03-10T07:51:11.016 INFO:tasks.workunit.client.0.vm05.stdout:9/788: rmdir d8/d86/d28/d79/d57/d96 39 2026-03-10T07:51:11.028 INFO:tasks.workunit.client.0.vm05.stdout:9/789: mknod d8/d86/d28/d79/d57/de1/d22/d33/d62/d6d/de7/c108 0 2026-03-10T07:51:11.033 INFO:tasks.workunit.client.0.vm05.stdout:9/790: read d8/d86/d28/d79/d57/de1/d1c/d20/d59/fae [2340815,42113] 0 2026-03-10T07:51:11.034 INFO:tasks.workunit.client.0.vm05.stdout:9/791: unlink d8/d86/d95/la4 0 2026-03-10T07:51:11.036 INFO:tasks.workunit.client.0.vm05.stdout:9/792: mkdir d8/d86/d28/d79/d57/de1/d1c/d75/dc6/d103/d109 0 2026-03-10T07:51:11.037 INFO:tasks.workunit.client.0.vm05.stdout:9/793: fsync d8/d86/d28/d79/d57/de1/d1c/d20/d59/fca 0 2026-03-10T07:51:11.039 INFO:tasks.workunit.client.0.vm05.stdout:9/794: mknod d8/d86/d28/d79/d57/d96/c10a 0 2026-03-10T07:51:11.049 INFO:tasks.workunit.client.0.vm05.stdout:9/795: dread d8/d86/d28/d79/d57/de1/d1c/d75/ffc [0,4194304] 0 2026-03-10T07:51:11.067 INFO:tasks.workunit.client.0.vm05.stdout:3/827: dwrite d8/d8f/dbc/fe0 [0,4194304] 0 2026-03-10T07:51:11.090 INFO:tasks.workunit.client.0.vm05.stdout:3/828: sync 2026-03-10T07:51:11.151 INFO:tasks.workunit.client.0.vm05.stdout:1/838: write da/fc [2481239,43726] 0 2026-03-10T07:51:11.157 INFO:tasks.workunit.client.0.vm05.stdout:7/873: dwrite d1/d3c/f63 [4194304,4194304] 0 2026-03-10T07:51:11.168 INFO:tasks.workunit.client.0.vm05.stdout:8/773: write d1/dd/d18/f58 [464530,20113] 0 2026-03-10T07:51:11.170 INFO:tasks.workunit.client.0.vm05.stdout:1/839: read da/dd/f7b [320560,46835] 0 2026-03-10T07:51:11.174 INFO:tasks.workunit.client.0.vm05.stdout:5/877: dwrite d2/d12/dda/da1/dc0/dc2/ff8 [0,4194304] 0 2026-03-10T07:51:11.186 INFO:tasks.workunit.client.0.vm05.stdout:2/894: dwrite d0/f6 [0,4194304] 0 2026-03-10T07:51:11.196 INFO:tasks.workunit.client.0.vm05.stdout:8/774: rename d1/dd/d4d/f60 to d1/dd/d4d/d64/d6a/de5/d2a/d34/d49/d5d/ff8 0 2026-03-10T07:51:11.198 INFO:tasks.workunit.client.0.vm05.stdout:0/866: dwrite 
d8/dd/d10/d26/d8b/da4/f3e [0,4194304] 0 2026-03-10T07:51:11.202 INFO:tasks.workunit.client.0.vm05.stdout:1/840: truncate da/dd/d12/f31 2733880 0 2026-03-10T07:51:11.213 INFO:tasks.workunit.client.0.vm05.stdout:6/888: dwrite d0/d11/d57/ff4 [0,4194304] 0 2026-03-10T07:51:11.217 INFO:tasks.workunit.client.0.vm05.stdout:5/878: chown d2/d12/f3a 3677 1 2026-03-10T07:51:11.217 INFO:tasks.workunit.client.0.vm05.stdout:4/918: write d0/d6/da6/ffd [2366575,94704] 0 2026-03-10T07:51:11.233 INFO:tasks.workunit.client.0.vm05.stdout:9/796: unlink d8/d86/d28/d79/d57/de1/d22/d33/d85/f107 0 2026-03-10T07:51:11.245 INFO:tasks.workunit.client.0.vm05.stdout:6/889: creat d0/d11/d2e/d81/f125 x:0 0 0 2026-03-10T07:51:11.245 INFO:tasks.workunit.client.0.vm05.stdout:6/890: chown d0/d11/d31/c1f 197 1 2026-03-10T07:51:11.253 INFO:tasks.workunit.client.0.vm05.stdout:7/874: creat d1/d6/d80/f10d x:0 0 0 2026-03-10T07:51:11.256 INFO:tasks.workunit.client.0.vm05.stdout:4/919: fdatasync d0/d6/d9/d5a/d6e/fa8 0 2026-03-10T07:51:11.261 INFO:tasks.workunit.client.0.vm05.stdout:3/829: mkdir d8/d22/d60/d6e/d105/d118 0 2026-03-10T07:51:11.264 INFO:tasks.workunit.client.0.vm05.stdout:0/867: dread d8/dd/d10/d26/d3a/d5e/fa3 [0,4194304] 0 2026-03-10T07:51:11.268 INFO:tasks.workunit.client.0.vm05.stdout:2/895: mkdir d0/d8/d66/dd1/d49/db1/d107/d127/d129 0 2026-03-10T07:51:11.281 INFO:tasks.workunit.client.0.vm05.stdout:9/797: symlink d8/d86/d28/d79/d57/de1/d22/dab/l10b 0 2026-03-10T07:51:11.286 INFO:tasks.workunit.client.0.vm05.stdout:8/775: mkdir d1/d6f/df9 0 2026-03-10T07:51:11.291 INFO:tasks.workunit.client.0.vm05.stdout:6/891: chown d0/d35/f108 0 1 2026-03-10T07:51:11.295 INFO:tasks.workunit.client.0.vm05.stdout:7/875: fdatasync d1/d3c/d71/d79/fee 0 2026-03-10T07:51:11.299 INFO:tasks.workunit.client.0.vm05.stdout:4/920: mkdir d0/d6/d9/d12/d4f/d12e 0 2026-03-10T07:51:11.306 INFO:tasks.workunit.client.0.vm05.stdout:5/879: write d2/d5/f10 [3550921,13180] 0 2026-03-10T07:51:11.309 
INFO:tasks.workunit.client.0.vm05.stdout:2/896: truncate d0/d8/d66/dd1/d49/df9/db2/f2b 3347863 0 2026-03-10T07:51:11.313 INFO:tasks.workunit.client.0.vm05.stdout:2/897: dwrite d0/d8/d43/df/d53/f126 [0,4194304] 0 2026-03-10T07:51:11.313 INFO:tasks.workunit.client.0.vm05.stdout:2/898: readlink d0/d8/d43/df/d4d/l79 0 2026-03-10T07:51:11.330 INFO:tasks.workunit.client.0.vm05.stdout:6/892: symlink d0/d11/d22/d6c/l126 0 2026-03-10T07:51:11.337 INFO:tasks.workunit.client.0.vm05.stdout:7/876: truncate d1/d6/f84 1162367 0 2026-03-10T07:51:11.339 INFO:tasks.workunit.client.0.vm05.stdout:4/921: creat d0/d6/d9/d12/d45/d55/d4e/f12f x:0 0 0 2026-03-10T07:51:11.340 INFO:tasks.workunit.client.0.vm05.stdout:4/922: readlink d0/d6/d9/d5a/d6e/db6/db9/l114 0 2026-03-10T07:51:11.344 INFO:tasks.workunit.client.0.vm05.stdout:0/868: write d8/dd/d10/d26/d3a/d5e/f7b [1483411,72844] 0 2026-03-10T07:51:11.354 INFO:tasks.workunit.client.0.vm05.stdout:8/776: write d1/d45/d90/faa [309224,49709] 0 2026-03-10T07:51:11.369 INFO:tasks.workunit.client.0.vm05.stdout:2/899: dread d0/fe3 [0,4194304] 0 2026-03-10T07:51:11.370 INFO:tasks.workunit.client.0.vm05.stdout:2/900: chown d0/d2a/d8c/f104 959 1 2026-03-10T07:51:11.373 INFO:tasks.workunit.client.0.vm05.stdout:2/901: dwrite d0/d8/d66/dd1/ff7 [0,4194304] 0 2026-03-10T07:51:11.384 INFO:tasks.workunit.client.0.vm05.stdout:1/841: symlink da/dd/d2a/d55/d102/l108 0 2026-03-10T07:51:11.386 INFO:tasks.workunit.client.0.vm05.stdout:6/893: fsync d0/d11/d57/ff8 0 2026-03-10T07:51:11.387 INFO:tasks.workunit.client.0.vm05.stdout:6/894: fdatasync d0/d11/d57/f11f 0 2026-03-10T07:51:11.393 INFO:tasks.workunit.client.0.vm05.stdout:1/842: dread da/dd/d42/d80/f87 [0,4194304] 0 2026-03-10T07:51:11.398 INFO:tasks.workunit.client.0.vm05.stdout:5/880: dwrite d2/d20/f2a [4194304,4194304] 0 2026-03-10T07:51:11.398 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:51:11 vm05.local ceph-mon[50387]: pgmap v25: 65 pgs: 65 active+clean; 1.8 GiB data, 6.1 GiB used, 114 GiB / 120 
GiB avail; 32 MiB/s rd, 69 MiB/s wr, 193 op/s 2026-03-10T07:51:11.398 INFO:tasks.workunit.client.0.vm05.stdout:5/881: dread - d2/d20/d7b/f121 zero size 2026-03-10T07:51:11.413 INFO:tasks.workunit.client.0.vm05.stdout:0/869: chown d8/dd/d10/d26/d8b/d86/ff5 7404058 1 2026-03-10T07:51:11.418 INFO:tasks.workunit.client.0.vm05.stdout:8/777: read d1/dd/d4d/f63 [2365431,15788] 0 2026-03-10T07:51:11.421 INFO:tasks.workunit.client.0.vm05.stdout:9/798: rename d8/d86/d28/d79/d57/de1/d22/d33/d47/ff7 to d8/d86/d28/d79/d57/de1/d22/d33/d62/d6d/de7/f10c 0 2026-03-10T07:51:11.423 INFO:tasks.workunit.client.0.vm05.stdout:2/902: mkdir d0/d8/d43/df/d8b/d12a 0 2026-03-10T07:51:11.427 INFO:tasks.workunit.client.0.vm05.stdout:6/895: mknod d0/d11/d22/d6c/d84/c127 0 2026-03-10T07:51:11.430 INFO:tasks.workunit.client.0.vm05.stdout:7/877: fsync d1/d6/d47/f7b 0 2026-03-10T07:51:11.458 INFO:tasks.workunit.client.0.vm05.stdout:1/843: dwrite da/dd/d12/d86/fe0 [0,4194304] 0 2026-03-10T07:51:11.458 INFO:tasks.workunit.client.0.vm05.stdout:9/799: dwrite d8/d86/d28/d79/d57/d96/fa9 [0,4194304] 0 2026-03-10T07:51:11.464 INFO:tasks.workunit.client.0.vm05.stdout:9/800: dwrite d8/d86/d28/d79/d57/d96/fa9 [0,4194304] 0 2026-03-10T07:51:11.472 INFO:tasks.workunit.client.0.vm05.stdout:6/896: creat d0/d6/f128 x:0 0 0 2026-03-10T07:51:11.472 INFO:tasks.workunit.client.0.vm05.stdout:6/897: write d0/d35/d36/d43/d9c/dc7/f114 [2314361,71333] 0 2026-03-10T07:51:11.483 INFO:tasks.workunit.client.0.vm05.stdout:4/923: link d0/d6/d6f/dc2/c129 d0/d6/d9/d12/d69/d120/c130 0 2026-03-10T07:51:11.485 INFO:tasks.workunit.client.0.vm05.stdout:8/778: mknod d1/dd/d4d/d64/d6a/de5/d2a/de3/cfa 0 2026-03-10T07:51:11.485 INFO:tasks.workunit.client.0.vm05.stdout:8/779: chown d1 124838733 1 2026-03-10T07:51:11.488 INFO:tasks.workunit.client.0.vm05.stdout:2/903: rename d0/d8/d43/df/d53/f11b to d0/d8/d66/dd1/d49/db1/f12b 0 2026-03-10T07:51:11.495 INFO:tasks.workunit.client.0.vm05.stdout:1/844: dread da/d26/f92 [4194304,4194304] 0 
2026-03-10T07:51:11.508 INFO:tasks.workunit.client.0.vm05.stdout:6/898: dread d0/d11/d57/f5c [4194304,4194304] 0 2026-03-10T07:51:11.510 INFO:tasks.workunit.client.0.vm05.stdout:0/870: creat d8/dd/d37/d67/f128 x:0 0 0 2026-03-10T07:51:11.519 INFO:tasks.workunit.client.0.vm05.stdout:3/830: rmdir d8/d1f/d2a/d98 0 2026-03-10T07:51:11.521 INFO:tasks.workunit.client.0.vm05.stdout:8/780: rmdir d1/d52 39 2026-03-10T07:51:11.521 INFO:tasks.workunit.client.0.vm05.stdout:8/781: chown d1/d6f/f7d 3639 1 2026-03-10T07:51:11.523 INFO:tasks.workunit.client.0.vm05.stdout:2/904: rmdir d0/d8/d43/df 39 2026-03-10T07:51:11.525 INFO:tasks.workunit.client.0.vm05.stdout:2/905: read d0/d8/d66/dd1/d49/d81/dd5/fe2 [2581307,84387] 0 2026-03-10T07:51:11.530 INFO:tasks.workunit.client.0.vm05.stdout:1/845: creat da/d26/d9e/dcc/f109 x:0 0 0 2026-03-10T07:51:11.534 INFO:tasks.workunit.client.0.vm05.stdout:7/878: dwrite d1/d3c/d71/f95 [0,4194304] 0 2026-03-10T07:51:11.536 INFO:tasks.workunit.client.0.vm05.stdout:7/879: write d1/d3c/d71/fd2 [5974985,82392] 0 2026-03-10T07:51:11.543 INFO:tasks.workunit.client.0.vm05.stdout:7/880: dwrite d1/d34/d59/d60/d8c/f10c [0,4194304] 0 2026-03-10T07:51:11.543 INFO:tasks.workunit.client.0.vm05.stdout:9/801: dwrite d8/d86/d28/d79/d57/d96/dd8/fec [0,4194304] 0 2026-03-10T07:51:11.562 INFO:tasks.workunit.client.0.vm05.stdout:4/924: dwrite d0/d6/d9/d8c/dbe/ff4 [0,4194304] 0 2026-03-10T07:51:11.564 INFO:tasks.workunit.client.0.vm05.stdout:4/925: truncate d0/d6/d9/d12/d9c/db7/da7/f125 429239 0 2026-03-10T07:51:11.568 INFO:tasks.workunit.client.0.vm05.stdout:6/899: mkdir d0/d35/d36/db8/d129 0 2026-03-10T07:51:11.579 INFO:tasks.workunit.client.0.vm05.stdout:5/882: link d2/d12/f40 d2/d5/d61/f12e 0 2026-03-10T07:51:11.579 INFO:tasks.workunit.client.0.vm05.stdout:0/871: read d8/dd/d10/d26/d3a/d5e/fa6 [163652,70148] 0 2026-03-10T07:51:11.579 INFO:tasks.workunit.client.0.vm05.stdout:0/872: chown d8/d9c/lae 23 1 2026-03-10T07:51:11.579 
INFO:tasks.workunit.client.0.vm05.stdout:8/782: mkdir d1/d45/dfb 0 2026-03-10T07:51:11.582 INFO:tasks.workunit.client.0.vm05.stdout:1/846: fsync da/d26/d2b/f45 0 2026-03-10T07:51:11.598 INFO:tasks.workunit.client.0.vm05.stdout:9/802: mkdir d8/d86/d28/d79/d57/de1/d22/d10d 0 2026-03-10T07:51:11.600 INFO:tasks.workunit.client.0.vm05.stdout:9/803: write d8/d86/d28/d79/d57/de1/d22/d33/d62/fd2 [1586458,52460] 0 2026-03-10T07:51:11.620 INFO:tasks.workunit.client.0.vm05.stdout:4/926: rename d0/d6/d9/d5a/f58 to d0/d11c/f131 0 2026-03-10T07:51:11.620 INFO:tasks.workunit.client.0.vm05.stdout:6/900: symlink d0/d35/d36/d43/d9c/dc7/l12a 0 2026-03-10T07:51:11.620 INFO:tasks.workunit.client.0.vm05.stdout:4/927: write d0/d6/d9/d12/d9c/db7/feb [843897,14170] 0 2026-03-10T07:51:11.628 INFO:tasks.workunit.client.0.vm05.stdout:0/873: creat d8/dd/d37/d67/d96/f129 x:0 0 0 2026-03-10T07:51:11.632 INFO:tasks.workunit.client.0.vm05.stdout:2/906: mkdir d0/d8/d43/df/d8b/dbf/d12c 0 2026-03-10T07:51:11.632 INFO:tasks.workunit.client.0.vm05.stdout:9/804: symlink d8/d86/d28/d79/d57/de1/d6b/l10e 0 2026-03-10T07:51:11.632 INFO:tasks.workunit.client.0.vm05.stdout:8/783: write d1/d45/f53 [4922527,126461] 0 2026-03-10T07:51:11.632 INFO:tasks.workunit.client.0.vm05.stdout:3/831: write d8/d1f/d2a/d96/fd4 [5063126,41144] 0 2026-03-10T07:51:11.633 INFO:tasks.workunit.client.0.vm05.stdout:8/784: write d1/dd/d4d/d64/fe8 [708748,98201] 0 2026-03-10T07:51:11.637 INFO:tasks.workunit.client.0.vm05.stdout:8/785: stat d1/dd/d4d/d64/d6a/de5/d2a/d48/d7c/d9c/lc0 0 2026-03-10T07:51:11.637 INFO:tasks.workunit.client.0.vm05.stdout:8/786: stat d1/dd/d18/led 0 2026-03-10T07:51:11.638 INFO:tasks.workunit.client.0.vm05.stdout:8/787: stat d1/dd/d4d 0 2026-03-10T07:51:11.638 INFO:tasks.workunit.client.0.vm05.stdout:8/788: fdatasync d1/d6f/fa7 0 2026-03-10T07:51:11.648 INFO:tasks.workunit.client.0.vm05.stdout:7/881: write d1/d6/d80/d82/fbd [382732,41458] 0 2026-03-10T07:51:11.667 
INFO:tasks.workunit.client.0.vm05.stdout:0/874: truncate d8/fb 3371103 0 2026-03-10T07:51:11.673 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:51:11 vm08.local ceph-mon[59917]: pgmap v25: 65 pgs: 65 active+clean; 1.8 GiB data, 6.1 GiB used, 114 GiB / 120 GiB avail; 32 MiB/s rd, 69 MiB/s wr, 193 op/s 2026-03-10T07:51:11.673 INFO:tasks.workunit.client.0.vm05.stdout:9/805: mkdir d8/d86/d28/d79/d57/de1/d38/d71/d81/d10f 0 2026-03-10T07:51:11.673 INFO:tasks.workunit.client.0.vm05.stdout:3/832: dwrite d8/d1f/d24/d76/dc5/de1/f114 [0,4194304] 0 2026-03-10T07:51:11.676 INFO:tasks.workunit.client.0.vm05.stdout:2/907: sync 2026-03-10T07:51:11.677 INFO:tasks.workunit.client.0.vm05.stdout:2/908: chown d0/d7e 1150581879 1 2026-03-10T07:51:11.677 INFO:tasks.workunit.client.0.vm05.stdout:8/789: rmdir d1/dd/d4d/d64/d6a 39 2026-03-10T07:51:11.680 INFO:tasks.workunit.client.0.vm05.stdout:2/909: sync 2026-03-10T07:51:11.694 INFO:tasks.workunit.client.0.vm05.stdout:7/882: creat d1/d3c/d71/d79/d8a/dac/f10e x:0 0 0 2026-03-10T07:51:11.704 INFO:tasks.workunit.client.0.vm05.stdout:4/928: dwrite d0/d6/d60/faf [0,4194304] 0 2026-03-10T07:51:11.706 INFO:tasks.workunit.client.0.vm05.stdout:1/847: dwrite da/d26/d2b/d89/fa7 [0,4194304] 0 2026-03-10T07:51:11.707 INFO:tasks.workunit.client.0.vm05.stdout:1/848: fsync da/dd/d2a/d55/d64/dd1/f4e 0 2026-03-10T07:51:11.719 INFO:tasks.workunit.client.0.vm05.stdout:9/806: creat d8/d86/d28/d79/d57/de1/d1c/d20/d54/f110 x:0 0 0 2026-03-10T07:51:11.728 INFO:tasks.workunit.client.0.vm05.stdout:8/790: readlink d1/dd/d4d/d64/d6a/de5/d2a/lc1 0 2026-03-10T07:51:11.736 INFO:tasks.workunit.client.0.vm05.stdout:2/910: rename d0/d8/d66/dd1/d49/df9/db2/l4f to d0/d8/d66/dd1/d49/d81/dd5/d115/l12d 0 2026-03-10T07:51:11.738 INFO:tasks.workunit.client.0.vm05.stdout:7/883: fdatasync d1/d6/d47/d8d/faf 0 2026-03-10T07:51:11.765 INFO:tasks.workunit.client.0.vm05.stdout:5/883: dwrite d2/d4b/fcf [0,4194304] 0 2026-03-10T07:51:11.776 
INFO:tasks.workunit.client.0.vm05.stdout:4/929: mknod d0/d6/d9/d12/d9c/db7/da7/d5c/c132 0 2026-03-10T07:51:11.781 INFO:tasks.workunit.client.0.vm05.stdout:9/807: symlink d8/d86/d28/d79/d57/de1/d22/d33/d62/de0/l111 0 2026-03-10T07:51:11.785 INFO:tasks.workunit.client.0.vm05.stdout:8/791: mkdir d1/dd/d4d/dcc/dbd/dfc 0 2026-03-10T07:51:11.789 INFO:tasks.workunit.client.0.vm05.stdout:3/833: rename d8/d1f/d24/d76/dc5/de1/d52/dbb to d8/d22/dad/d119 0 2026-03-10T07:51:11.813 INFO:tasks.workunit.client.0.vm05.stdout:6/901: dwrite d0/d11/d4f/f7e [0,4194304] 0 2026-03-10T07:51:11.814 INFO:tasks.workunit.client.0.vm05.stdout:6/902: chown d0/d11/d22/d69/l90 200 1 2026-03-10T07:51:11.827 INFO:tasks.workunit.client.0.vm05.stdout:4/930: mkdir d0/d28/d133 0 2026-03-10T07:51:11.832 INFO:tasks.workunit.client.0.vm05.stdout:1/849: mkdir da/dd/d12/d34/d58/d10a 0 2026-03-10T07:51:11.840 INFO:tasks.workunit.client.0.vm05.stdout:2/911: dwrite d0/d8/d66/dd1/d49/df9/da5/fb0 [4194304,4194304] 0 2026-03-10T07:51:11.843 INFO:tasks.workunit.client.0.vm05.stdout:0/875: dwrite d8/fb [0,4194304] 0 2026-03-10T07:51:11.860 INFO:tasks.workunit.client.0.vm05.stdout:8/792: fdatasync d1/dd/d4d/d64/d6a/de5/d2a/d34/d49/f66 0 2026-03-10T07:51:11.886 INFO:tasks.workunit.client.0.vm05.stdout:1/850: mknod da/d26/d2b/d89/c10b 0 2026-03-10T07:51:11.887 INFO:tasks.workunit.client.0.vm05.stdout:0/876: creat d8/dd/d10/d26/d8b/da4/de7/f12a x:0 0 0 2026-03-10T07:51:11.887 INFO:tasks.workunit.client.0.vm05.stdout:9/808: mkdir d8/d86/d28/d79/d57/de1/d22/d33/d62/d6d/d9e/d112 0 2026-03-10T07:51:11.892 INFO:tasks.workunit.client.0.vm05.stdout:3/834: fdatasync d8/d1f/d24/d76/dc5/de1/d52/fc0 0 2026-03-10T07:51:11.892 INFO:tasks.workunit.client.0.vm05.stdout:0/877: stat d8/dd/d37/d56/d4d/df8/c114 0 2026-03-10T07:51:11.892 INFO:tasks.workunit.client.0.vm05.stdout:8/793: dread - d1/dd/d4d/d64/d6a/de5/d2a/fb6 zero size 2026-03-10T07:51:11.892 INFO:tasks.workunit.client.0.vm05.stdout:6/903: symlink d0/d11/d57/d122/l12b 0 
2026-03-10T07:51:11.893 INFO:tasks.workunit.client.0.vm05.stdout:4/931: fsync d0/f2 0 2026-03-10T07:51:11.898 INFO:tasks.workunit.client.0.vm05.stdout:6/904: chown d0/d11/d4f/da0/f11d 41455 1 2026-03-10T07:51:11.900 INFO:tasks.workunit.client.0.vm05.stdout:8/794: write d1/dd/d4d/d64/d6a/de5/d2a/fec [1026573,109938] 0 2026-03-10T07:51:11.901 INFO:tasks.workunit.client.0.vm05.stdout:8/795: fsync d1/dd/d4d/d64/d6a/de5/d2a/d48/f50 0 2026-03-10T07:51:11.907 INFO:tasks.workunit.client.0.vm05.stdout:0/878: dread d8/dd/d37/d56/f62 [4194304,4194304] 0 2026-03-10T07:51:11.911 INFO:tasks.workunit.client.0.vm05.stdout:9/809: mkdir d8/d86/d28/d79/d57/de1/d1c/d20/d59/d8b/d113 0 2026-03-10T07:51:11.917 INFO:tasks.workunit.client.0.vm05.stdout:7/884: write d1/d5b/fba [755971,68068] 0 2026-03-10T07:51:11.917 INFO:tasks.workunit.client.0.vm05.stdout:7/885: stat d1/d6/d3b 0 2026-03-10T07:51:11.918 INFO:tasks.workunit.client.0.vm05.stdout:4/932: fdatasync d0/f2 0 2026-03-10T07:51:11.919 INFO:tasks.workunit.client.0.vm05.stdout:4/933: stat d0/d6/d9/d12/c6b 0 2026-03-10T07:51:11.923 INFO:tasks.workunit.client.0.vm05.stdout:1/851: dwrite da/dd/d2a/f93 [0,4194304] 0 2026-03-10T07:51:11.928 INFO:tasks.workunit.client.0.vm05.stdout:3/835: symlink d8/d1f/d24/d76/dc5/de1/d19/daf/l11a 0 2026-03-10T07:51:11.933 INFO:tasks.workunit.client.0.vm05.stdout:8/796: mkdir d1/dd/d4d/d64/d6a/de5/d2a/d48/dfd 0 2026-03-10T07:51:11.933 INFO:tasks.workunit.client.0.vm05.stdout:6/905: creat d0/d35/d36/dd2/d101/f12c x:0 0 0 2026-03-10T07:51:11.941 INFO:tasks.workunit.client.0.vm05.stdout:3/836: truncate d8/d1f/d24/d76/dc5/fc8 979013 0 2026-03-10T07:51:11.942 INFO:tasks.workunit.client.0.vm05.stdout:7/886: creat d1/d5b/de6/f10f x:0 0 0 2026-03-10T07:51:11.952 INFO:tasks.workunit.client.0.vm05.stdout:1/852: link da/d26/d2b/d71/c8a da/dd/d2a/d70/d101/c10c 0 2026-03-10T07:51:11.953 INFO:tasks.workunit.client.0.vm05.stdout:8/797: dread d1/dd/d4d/d64/d6a/de5/d2a/d48/f59 [0,4194304] 0 2026-03-10T07:51:11.960 
INFO:tasks.workunit.client.0.vm05.stdout:7/887: sync 2026-03-10T07:51:11.967 INFO:tasks.workunit.client.0.vm05.stdout:1/853: rename da/dd/d12/d34/d107/lf2 to da/dd/d2a/d55/d64/dac/l10d 0 2026-03-10T07:51:11.971 INFO:tasks.workunit.client.0.vm05.stdout:8/798: dread d1/d45/f55 [0,4194304] 0 2026-03-10T07:51:11.973 INFO:tasks.workunit.client.0.vm05.stdout:8/799: truncate d1/dd/d4d/d64/d6a/de5/d2a/fd8 931778 0 2026-03-10T07:51:11.982 INFO:tasks.workunit.client.0.vm05.stdout:7/888: dread d1/f49 [4194304,4194304] 0 2026-03-10T07:51:11.983 INFO:tasks.workunit.client.0.vm05.stdout:5/884: write d2/d20/d33/d86/dac/dc1/fc9 [3063949,67973] 0 2026-03-10T07:51:11.987 INFO:tasks.workunit.client.0.vm05.stdout:7/889: mknod d1/d6/d47/d8d/c110 0 2026-03-10T07:51:11.996 INFO:tasks.workunit.client.0.vm05.stdout:7/890: mknod d1/d3c/d71/d79/c111 0 2026-03-10T07:51:11.998 INFO:tasks.workunit.client.0.vm05.stdout:7/891: mkdir d1/d6/d47/d8d/d112 0 2026-03-10T07:51:11.998 INFO:tasks.workunit.client.0.vm05.stdout:7/892: fdatasync d1/d6/d3b/fe9 0 2026-03-10T07:51:12.004 INFO:tasks.workunit.client.0.vm05.stdout:7/893: symlink d1/d108/l113 0 2026-03-10T07:51:12.004 INFO:tasks.workunit.client.0.vm05.stdout:7/894: mknod d1/d6/d3b/c114 0 2026-03-10T07:51:12.007 INFO:tasks.workunit.client.0.vm05.stdout:5/885: read d2/d20/d33/d86/dac/dc1/fc9 [1765553,77977] 0 2026-03-10T07:51:12.009 INFO:tasks.workunit.client.0.vm05.stdout:7/895: rename d1/d6/d3b/fe9 to d1/d3c/d71/d79/d8a/dac/f115 0 2026-03-10T07:51:12.021 INFO:tasks.workunit.client.0.vm05.stdout:5/886: dread d2/d12/dda/da1/dc0/fce [0,4194304] 0 2026-03-10T07:51:12.025 INFO:tasks.workunit.client.0.vm05.stdout:7/896: read d1/d34/d59/d60/fbe [585576,14362] 0 2026-03-10T07:51:12.032 INFO:tasks.workunit.client.0.vm05.stdout:5/887: dread d2/d12/dda/da1/dc0/f11d [0,4194304] 0 2026-03-10T07:51:12.032 INFO:tasks.workunit.client.0.vm05.stdout:7/897: creat d1/d6/d47/f116 x:0 0 0 2026-03-10T07:51:12.034 INFO:tasks.workunit.client.0.vm05.stdout:7/898: mkdir 
d1/d3c/db8/d117 0 2026-03-10T07:51:12.035 INFO:tasks.workunit.client.0.vm05.stdout:5/888: creat d2/d20/d33/d53/f12f x:0 0 0 2026-03-10T07:51:12.040 INFO:tasks.workunit.client.0.vm05.stdout:2/912: write d0/d8/d43/da4/fc0 [466953,14758] 0 2026-03-10T07:51:12.040 INFO:tasks.workunit.client.0.vm05.stdout:2/913: chown d0/d8/d66/dd1/d49/d81/dd5/d115 76008 1 2026-03-10T07:51:12.044 INFO:tasks.workunit.client.0.vm05.stdout:5/889: rename d2/d20/d33/d53/l11a to d2/d12/da8/l130 0 2026-03-10T07:51:12.050 INFO:tasks.workunit.client.0.vm05.stdout:5/890: rmdir d2/d20/d7b 39 2026-03-10T07:51:12.055 INFO:tasks.workunit.client.0.vm05.stdout:5/891: creat d2/d5/f131 x:0 0 0 2026-03-10T07:51:12.057 INFO:tasks.workunit.client.0.vm05.stdout:5/892: symlink d2/d20/d33/d86/dac/l132 0 2026-03-10T07:51:12.059 INFO:tasks.workunit.client.0.vm05.stdout:0/879: write d8/dd/d10/f6c [2403219,42146] 0 2026-03-10T07:51:12.062 INFO:tasks.workunit.client.0.vm05.stdout:5/893: fdatasync d2/d12/f3a 0 2026-03-10T07:51:12.064 INFO:tasks.workunit.client.0.vm05.stdout:9/810: write d8/d86/ff6 [325096,84769] 0 2026-03-10T07:51:12.065 INFO:tasks.workunit.client.0.vm05.stdout:0/880: fsync d8/f20 0 2026-03-10T07:51:12.065 INFO:tasks.workunit.client.0.vm05.stdout:0/881: truncate d8/dd/f116 190747 0 2026-03-10T07:51:12.066 INFO:tasks.workunit.client.0.vm05.stdout:0/882: chown d8/dd/d37/d81 47 1 2026-03-10T07:51:12.079 INFO:tasks.workunit.client.0.vm05.stdout:4/934: truncate d0/d6/d95/fad 748583 0 2026-03-10T07:51:12.079 INFO:tasks.workunit.client.0.vm05.stdout:3/837: write d8/d1f/d2a/d96/da9/fd6 [1447664,96554] 0 2026-03-10T07:51:12.081 INFO:tasks.workunit.client.0.vm05.stdout:4/935: dread d0/d6/d9/d12/d9c/db7/da7/f125 [0,4194304] 0 2026-03-10T07:51:12.086 INFO:tasks.workunit.client.0.vm05.stdout:5/894: creat d2/d20/d4c/d64/f133 x:0 0 0 2026-03-10T07:51:12.089 INFO:tasks.workunit.client.0.vm05.stdout:6/906: dwrite d0/d11/d86/f118 [0,4194304] 0 2026-03-10T07:51:12.096 INFO:tasks.workunit.client.0.vm05.stdout:9/811: 
readlink d8/d86/d28/d79/d57/de1/d22/d33/d62/d6d/lce 0 2026-03-10T07:51:12.100 INFO:tasks.workunit.client.0.vm05.stdout:0/883: mknod d8/dd/d10/db7/dc3/c12b 0 2026-03-10T07:51:12.103 INFO:tasks.workunit.client.0.vm05.stdout:8/800: write d1/dd/d4d/d64/d6a/de5/f30 [1853114,94442] 0 2026-03-10T07:51:12.109 INFO:tasks.workunit.client.0.vm05.stdout:1/854: write da/dd/d12/d86/d9a/fbc [1665010,31419] 0 2026-03-10T07:51:12.112 INFO:tasks.workunit.client.0.vm05.stdout:1/855: fdatasync da/dd/d12/d86/fe0 0 2026-03-10T07:51:12.120 INFO:tasks.workunit.client.0.vm05.stdout:7/899: write d1/d34/d59/f99 [342733,56211] 0 2026-03-10T07:51:12.125 INFO:tasks.workunit.client.0.vm05.stdout:2/914: write d0/d8/d66/dd1/d49/df9/db2/fba [4458172,16889] 0 2026-03-10T07:51:12.126 INFO:tasks.workunit.client.0.vm05.stdout:2/915: chown d0/d8/c64 20963 1 2026-03-10T07:51:12.161 INFO:tasks.workunit.client.0.vm05.stdout:4/936: dread d0/d6/d37/f3d [0,4194304] 0 2026-03-10T07:51:12.161 INFO:tasks.workunit.client.0.vm05.stdout:6/907: mknod d0/d11/d22/c12d 0 2026-03-10T07:51:12.162 INFO:tasks.workunit.client.0.vm05.stdout:9/812: creat d8/d86/d28/d79/d57/de1/d22/f114 x:0 0 0 2026-03-10T07:51:12.163 INFO:tasks.workunit.client.0.vm05.stdout:0/884: creat d8/dd/d37/dfd/f12c x:0 0 0 2026-03-10T07:51:12.171 INFO:tasks.workunit.client.0.vm05.stdout:3/838: creat d8/d1f/d2a/d4a/d8c/f11b x:0 0 0 2026-03-10T07:51:12.171 INFO:tasks.workunit.client.0.vm05.stdout:3/839: chown d8/d1f/d24/d76/dc5/de1/d52/da4 3475572 1 2026-03-10T07:51:12.173 INFO:tasks.workunit.client.0.vm05.stdout:5/895: mkdir d2/d20/d134 0 2026-03-10T07:51:12.176 INFO:tasks.workunit.client.0.vm05.stdout:5/896: dwrite d2/d20/d4c/db6/f110 [0,4194304] 0 2026-03-10T07:51:12.177 INFO:tasks.workunit.client.0.vm05.stdout:9/813: symlink d8/d86/d28/d79/d57/de1/d22/dab/l115 0 2026-03-10T07:51:12.178 INFO:tasks.workunit.client.0.vm05.stdout:9/814: fdatasync d8/d86/d28/d79/d57/de1/f51 0 2026-03-10T07:51:12.185 INFO:tasks.workunit.client.0.vm05.stdout:4/937: sync 
2026-03-10T07:51:12.186 INFO:tasks.workunit.client.0.vm05.stdout:4/938: write d0/d6/da6/f11a [501058,113764] 0 2026-03-10T07:51:12.188 INFO:tasks.workunit.client.0.vm05.stdout:7/900: creat d1/d6/dc3/d10b/f118 x:0 0 0 2026-03-10T07:51:12.190 INFO:tasks.workunit.client.0.vm05.stdout:7/901: dwrite d1/d3c/d71/d79/fe1 [0,4194304] 0 2026-03-10T07:51:12.206 INFO:tasks.workunit.client.0.vm05.stdout:5/897: truncate d2/d20/d33/d53/d7d/f82 2044088 0 2026-03-10T07:51:12.206 INFO:tasks.workunit.client.0.vm05.stdout:4/939: fsync d0/d6/d9/d12/d45/d55/f2c 0 2026-03-10T07:51:12.207 INFO:tasks.workunit.client.0.vm05.stdout:4/940: chown d0/d6/d9/d12/d45 875193579 1 2026-03-10T07:51:12.215 INFO:tasks.workunit.client.0.vm05.stdout:9/815: dread d8/d86/fc1 [0,4194304] 0 2026-03-10T07:51:12.220 INFO:tasks.workunit.client.0.vm05.stdout:8/801: getdents d1/dd/d4d/d64/d6a/de5/d2a/d34/d49/d5d/df3 0 2026-03-10T07:51:12.222 INFO:tasks.workunit.client.0.vm05.stdout:8/802: fdatasync d1/dd/d4d/f8a 0 2026-03-10T07:51:12.230 INFO:tasks.workunit.client.0.vm05.stdout:3/840: symlink d8/dd5/l11c 0 2026-03-10T07:51:12.248 INFO:tasks.workunit.client.0.vm05.stdout:5/898: dread d2/d20/d33/f88 [0,4194304] 0 2026-03-10T07:51:12.248 INFO:tasks.workunit.client.0.vm05.stdout:5/899: stat d2/d20/d5b/f5f 0 2026-03-10T07:51:12.255 INFO:tasks.workunit.client.0.vm05.stdout:7/902: creat d1/d3c/d104/f119 x:0 0 0 2026-03-10T07:51:12.261 INFO:tasks.workunit.client.0.vm05.stdout:8/803: rename d1/dc9/fa3 to d1/dd/d4d/d64/d6a/de5/d2a/d34/d49/d5d/df3/ffe 0 2026-03-10T07:51:12.266 INFO:tasks.workunit.client.0.vm05.stdout:0/885: getdents d8/dd/d10/d26/d8b 0 2026-03-10T07:51:12.267 INFO:tasks.workunit.client.0.vm05.stdout:0/886: dread - d8/dd/d10/f115 zero size 2026-03-10T07:51:12.268 INFO:tasks.workunit.client.0.vm05.stdout:0/887: write d8/dd/d10/d26/d3a/d5e/f7b [863392,95833] 0 2026-03-10T07:51:12.271 INFO:tasks.workunit.client.0.vm05.stdout:1/856: write da/d26/d2b/f45 [2204166,39737] 0 2026-03-10T07:51:12.274 
INFO:tasks.workunit.client.0.vm05.stdout:2/916: dwrite d0/fe3 [0,4194304] 0 2026-03-10T07:51:12.278 INFO:tasks.workunit.client.0.vm05.stdout:4/941: fsync d0/d6/d9/d12/d65/fb5 0 2026-03-10T07:51:12.285 INFO:tasks.workunit.client.0.vm05.stdout:6/908: dwrite d0/d11/d4f/d56/d96/db6/fef [0,4194304] 0 2026-03-10T07:51:12.300 INFO:tasks.workunit.client.0.vm05.stdout:5/900: write d2/d20/d4c/db6/f110 [4690861,26009] 0 2026-03-10T07:51:12.300 INFO:tasks.workunit.client.0.vm05.stdout:5/901: read d2/d12/dda/f11f [551409,97306] 0 2026-03-10T07:51:12.307 INFO:tasks.workunit.client.0.vm05.stdout:8/804: creat d1/dd/d5e/d9e/fff x:0 0 0 2026-03-10T07:51:12.312 INFO:tasks.workunit.client.0.vm05.stdout:9/816: dwrite d8/d86/d28/d79/d57/de1/d22/d33/d85/f8f [0,4194304] 0 2026-03-10T07:51:12.315 INFO:tasks.workunit.client.0.vm05.stdout:3/841: creat d8/d1c/d48/d69/f11d x:0 0 0 2026-03-10T07:51:12.322 INFO:tasks.workunit.client.0.vm05.stdout:1/857: fdatasync da/dd/d42/d80/f94 0 2026-03-10T07:51:12.323 INFO:tasks.workunit.client.0.vm05.stdout:1/858: dread - da/d26/d2b/d71/fa2 zero size 2026-03-10T07:51:12.328 INFO:tasks.workunit.client.0.vm05.stdout:0/888: rename d8/dd/d10/d26/d8b/da4/de7/ced to d8/dd/d10/d26/d48/c12d 0 2026-03-10T07:51:12.331 INFO:tasks.workunit.client.0.vm05.stdout:4/942: mknod d0/d6/da6/c134 0 2026-03-10T07:51:12.340 INFO:tasks.workunit.client.0.vm05.stdout:9/817: sync 2026-03-10T07:51:12.341 INFO:tasks.workunit.client.0.vm05.stdout:9/818: chown d8/d86/d28/d79/d57/de1/d22/f114 2332471 1 2026-03-10T07:51:12.341 INFO:tasks.workunit.client.0.vm05.stdout:9/819: write d8/d86/d28/d79/d57/de1/d6b/f97 [3634500,50947] 0 2026-03-10T07:51:12.344 INFO:tasks.workunit.client.0.vm05.stdout:5/902: dread d2/d20/d77/f7f [0,4194304] 0 2026-03-10T07:51:12.348 INFO:tasks.workunit.client.0.vm05.stdout:8/805: truncate d1/dd/d4d/d64/d6a/de5/d2a/d48/f4a 1752790 0 2026-03-10T07:51:12.352 INFO:tasks.workunit.client.0.vm05.stdout:3/842: chown d8/fe 122628707 1 2026-03-10T07:51:12.355 
INFO:tasks.workunit.client.0.vm05.stdout:6/909: rename d0/d35/d36 to d0/d11/d86/d12e 0 2026-03-10T07:51:12.381 INFO:tasks.workunit.client.0.vm05.stdout:9/820: symlink d8/d86/d28/d79/d57/de1/d22/d33/d62/d6d/d9e/l116 0 2026-03-10T07:51:12.381 INFO:tasks.workunit.client.0.vm05.stdout:9/821: readlink d8/d86/d28/d79/d57/de1/d22/dab/l115 0 2026-03-10T07:51:12.381 INFO:tasks.workunit.client.0.vm05.stdout:8/806: truncate d1/dd/d4d/d64/f86 168586 0 2026-03-10T07:51:12.393 INFO:tasks.workunit.client.0.vm05.stdout:3/843: rename d8/d1f/d24/d76/dc5/de1/d19/daf to d8/d1c/d48/d69/d11e 0 2026-03-10T07:51:12.395 INFO:tasks.workunit.client.0.vm05.stdout:7/903: write d1/d6/d80/fb9 [1133010,47134] 0 2026-03-10T07:51:12.395 INFO:tasks.workunit.client.0.vm05.stdout:3/844: truncate d8/dd5/dfb/f100 464741 0 2026-03-10T07:51:12.396 INFO:tasks.workunit.client.0.vm05.stdout:7/904: write d1/d34/d59/d60/d8c/f10c [3719077,33923] 0 2026-03-10T07:51:12.408 INFO:tasks.workunit.client.0.vm05.stdout:0/889: mknod d8/dd/d10/d26/d3a/d5e/c12e 0 2026-03-10T07:51:12.426 INFO:tasks.workunit.client.0.vm05.stdout:9/822: creat d8/d86/d28/d79/d57/de1/d22/dab/db4/f117 x:0 0 0 2026-03-10T07:51:12.444 INFO:tasks.workunit.client.0.vm05.stdout:7/905: rename d1/d6/d80/dcd/f103 to d1/d3c/d104/f11a 0 2026-03-10T07:51:12.461 INFO:tasks.workunit.client.0.vm05.stdout:2/917: dwrite d0/d8/f42 [0,4194304] 0 2026-03-10T07:51:12.471 INFO:tasks.workunit.client.0.vm05.stdout:0/890: rename d8/dd/d10/db7 to d8/d9c/dc8/d10f/d127/d12f 0 2026-03-10T07:51:12.476 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:51:12 vm05.local ceph-mon[50387]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T07:51:12.476 INFO:tasks.workunit.client.0.vm05.stdout:4/943: getdents d0/d6/d9/d5a/d6e/db6/db9 0 2026-03-10T07:51:12.479 INFO:tasks.workunit.client.0.vm05.stdout:5/903: dwrite d2/d20/d33/d86/dac/f127 [0,4194304] 0 2026-03-10T07:51:12.488 
INFO:tasks.workunit.client.0.vm05.stdout:1/859: write da/dd/d12/d34/ddb/fc5 [1043617,127058] 0 2026-03-10T07:51:12.491 INFO:tasks.workunit.client.0.vm05.stdout:6/910: write d0/d11/d57/d66/f75 [4624070,77045] 0 2026-03-10T07:51:12.494 INFO:tasks.workunit.client.0.vm05.stdout:2/918: dread d0/d8/d43/df/d4d/f93 [0,4194304] 0 2026-03-10T07:51:12.494 INFO:tasks.workunit.client.0.vm05.stdout:2/919: readlink d0/l4a 0 2026-03-10T07:51:12.495 INFO:tasks.workunit.client.0.vm05.stdout:2/920: readlink d0/d8/d3d/l77 0 2026-03-10T07:51:12.503 INFO:tasks.workunit.client.0.vm05.stdout:0/891: creat d8/dd/d10/d26/d8b/da4/ddf/f130 x:0 0 0 2026-03-10T07:51:12.518 INFO:tasks.workunit.client.0.vm05.stdout:0/892: dread d8/dd/d10/d26/d8b/da4/de7/f109 [0,4194304] 0 2026-03-10T07:51:12.518 INFO:tasks.workunit.client.0.vm05.stdout:9/823: write d8/d86/d28/d79/d57/de1/f48 [227540,71717] 0 2026-03-10T07:51:12.519 INFO:tasks.workunit.client.0.vm05.stdout:0/893: truncate d8/dd/d37/d67/d96/ff3 1083959 0 2026-03-10T07:51:12.520 INFO:tasks.workunit.client.0.vm05.stdout:0/894: dread - d8/dd/d37/dfd/f12c zero size 2026-03-10T07:51:12.522 INFO:tasks.workunit.client.0.vm05.stdout:8/807: getdents d1/dd/d4d/d64/d6a/de5/d2a/d34/d49/d5d/df3 0 2026-03-10T07:51:12.527 INFO:tasks.workunit.client.0.vm05.stdout:6/911: dread - d0/d11/d86/d12e/d43/d9c/fc5 zero size 2026-03-10T07:51:12.527 INFO:tasks.workunit.client.0.vm05.stdout:3/845: getdents d8/d1c/d48/d69/d11e 0 2026-03-10T07:51:12.528 INFO:tasks.workunit.client.0.vm05.stdout:3/846: truncate d8/d1f/d24/d8a/f91 4790786 0 2026-03-10T07:51:12.538 INFO:tasks.workunit.client.0.vm05.stdout:2/921: read d0/d8/d66/dd1/d49/d81/dd5/fe2 [2823248,108062] 0 2026-03-10T07:51:12.539 INFO:tasks.workunit.client.0.vm05.stdout:8/808: dread d1/dd/d4d/d64/d6a/de5/d2a/d34/d49/d5d/f84 [0,4194304] 0 2026-03-10T07:51:12.539 INFO:tasks.workunit.client.0.vm05.stdout:2/922: chown d0/d8/d43/df/d8b/dbf/d12c 387093 1 2026-03-10T07:51:12.540 INFO:tasks.workunit.client.0.vm05.stdout:8/809: 
truncate d1/dd/d4d/f8a 1673328 0 2026-03-10T07:51:12.549 INFO:tasks.workunit.client.0.vm05.stdout:4/944: write d0/d6/d9/d12/d9c/db7/da7/f100 [388862,18863] 0 2026-03-10T07:51:12.550 INFO:tasks.workunit.client.0.vm05.stdout:1/860: dread da/dd/d2a/f54 [4194304,4194304] 0 2026-03-10T07:51:12.551 INFO:tasks.workunit.client.0.vm05.stdout:1/861: readlink da/dd/d2a/d55/l66 0 2026-03-10T07:51:12.552 INFO:tasks.workunit.client.0.vm05.stdout:1/862: write da/dd/d12/d86/fe0 [1976086,27093] 0 2026-03-10T07:51:12.552 INFO:tasks.workunit.client.0.vm05.stdout:5/904: write d2/d20/d33/fe0 [729791,86887] 0 2026-03-10T07:51:12.553 INFO:tasks.workunit.client.0.vm05.stdout:5/905: truncate d2/d20/d33/d86/fbb 581151 0 2026-03-10T07:51:12.557 INFO:tasks.workunit.client.0.vm05.stdout:7/906: link d1/d3c/d71/f95 d1/d6/d3b/f11b 0 2026-03-10T07:51:12.562 INFO:tasks.workunit.client.0.vm05.stdout:9/824: rmdir d8 39 2026-03-10T07:51:12.569 INFO:tasks.workunit.client.0.vm05.stdout:0/895: creat d8/dd/d10/d26/d8b/da4/ddf/f131 x:0 0 0 2026-03-10T07:51:12.571 INFO:tasks.workunit.client.0.vm05.stdout:3/847: fdatasync d8/d1f/d2a/f42 0 2026-03-10T07:51:12.583 INFO:tasks.workunit.client.0.vm05.stdout:6/912: dread d0/d11/f58 [0,4194304] 0 2026-03-10T07:51:12.583 INFO:tasks.workunit.client.0.vm05.stdout:6/913: chown d0/d35 15 1 2026-03-10T07:51:12.586 INFO:tasks.workunit.client.0.vm05.stdout:6/914: write d0/d11/d86/d12e/d43/d9c/dc7/f114 [4533943,45647] 0 2026-03-10T07:51:12.591 INFO:tasks.workunit.client.0.vm05.stdout:1/863: creat da/dd/d2a/d70/f10e x:0 0 0 2026-03-10T07:51:12.592 INFO:tasks.workunit.client.0.vm05.stdout:1/864: fdatasync da/fc 0 2026-03-10T07:51:12.599 INFO:tasks.workunit.client.0.vm05.stdout:7/907: creat d1/d3c/d104/f11c x:0 0 0 2026-03-10T07:51:12.599 INFO:tasks.workunit.client.0.vm05.stdout:4/945: write d0/d6/d9/d12/d4f/f5b [2476429,110865] 0 2026-03-10T07:51:12.606 INFO:tasks.workunit.client.0.vm05.stdout:8/810: creat d1/dd/d4d/d64/d6a/de5/d2a/d48/dfd/f100 x:0 0 0 2026-03-10T07:51:12.608 
INFO:tasks.workunit.client.0.vm05.stdout:9/825: dwrite d8/d86/d28/d79/d57/de1/d1c/d75/dc6/f106 [0,4194304] 0 2026-03-10T07:51:12.610 INFO:tasks.workunit.client.0.vm05.stdout:9/826: write d8/d86/d28/d79/d57/de1/d22/d33/d47/f5f [5004748,5638] 0 2026-03-10T07:51:12.617 INFO:tasks.workunit.client.0.vm05.stdout:0/896: dread d8/dd/f59 [0,4194304] 0 2026-03-10T07:51:12.629 INFO:tasks.workunit.client.0.vm05.stdout:5/906: mkdir d2/d20/d135 0 2026-03-10T07:51:12.631 INFO:tasks.workunit.client.0.vm05.stdout:1/865: rename da/d26/d2b/d71/f98 to da/dd/d2a/d55/d64/dac/f10f 0 2026-03-10T07:51:12.631 INFO:tasks.workunit.client.0.vm05.stdout:7/908: readlink d1/l2b 0 2026-03-10T07:51:12.631 INFO:tasks.workunit.client.0.vm05.stdout:4/946: creat d0/d6/d9/d12/d9c/db7/db1/f135 x:0 0 0 2026-03-10T07:51:12.636 INFO:tasks.workunit.client.0.vm05.stdout:3/848: creat d8/d8f/dbc/dc7/f11f x:0 0 0 2026-03-10T07:51:12.637 INFO:tasks.workunit.client.0.vm05.stdout:8/811: chown d1/dc9/f31 11144 1 2026-03-10T07:51:12.640 INFO:tasks.workunit.client.0.vm05.stdout:9/827: rename d8/d86/d28/d79/d57/de1/d22/c6e to d8/d86/d28/d79/d57/de1/d22/dab/db4/c118 0 2026-03-10T07:51:12.646 INFO:tasks.workunit.client.0.vm05.stdout:5/907: creat d2/d4b/f136 x:0 0 0 2026-03-10T07:51:12.646 INFO:tasks.workunit.client.0.vm05.stdout:2/923: getdents d0/d8/d66/dd1/d49/dab 0 2026-03-10T07:51:12.649 INFO:tasks.workunit.client.0.vm05.stdout:7/909: fsync d1/d34/d59/f78 0 2026-03-10T07:51:12.660 INFO:tasks.workunit.client.0.vm05.stdout:4/947: mkdir d0/d6/d9/d5a/d6e/d136 0 2026-03-10T07:51:12.662 INFO:tasks.workunit.client.0.vm05.stdout:0/897: mknod d8/d9c/dc8/d100/c132 0 2026-03-10T07:51:12.662 INFO:tasks.workunit.client.0.vm05.stdout:8/812: symlink d1/dd/d4d/d64/d6a/de5/d2a/d34/d49/d5d/df3/l101 0 2026-03-10T07:51:12.662 INFO:tasks.workunit.client.0.vm05.stdout:1/866: rename da/dd/d12/d86/d9a/fbc to da/d26/d9e/dcc/f110 0 2026-03-10T07:51:12.663 INFO:tasks.workunit.client.0.vm05.stdout:9/828: dread - d8/d86/d28/d79/d57/de1/d6b/fb0 
zero size 2026-03-10T07:51:12.664 INFO:tasks.workunit.client.0.vm05.stdout:0/898: readlink d8/dd/d10/d26/d3a/d5e/d63/l78 0 2026-03-10T07:51:12.668 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:51:12 vm08.local ceph-mon[59917]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T07:51:12.672 INFO:tasks.workunit.client.0.vm05.stdout:7/910: dread d1/d34/fed [0,4194304] 0 2026-03-10T07:51:12.684 INFO:tasks.workunit.client.0.vm05.stdout:2/924: write d0/d8/d66/dd1/d49/df9/db2/f29 [1085489,55718] 0 2026-03-10T07:51:12.691 INFO:tasks.workunit.client.0.vm05.stdout:6/915: getdents d0/d11/d86/d12e 0 2026-03-10T07:51:12.695 INFO:tasks.workunit.client.0.vm05.stdout:4/948: mknod d0/d6/d9/d12/d4f/c137 0 2026-03-10T07:51:12.697 INFO:tasks.workunit.client.0.vm05.stdout:1/867: symlink da/dd/d12/d86/d9a/l111 0 2026-03-10T07:51:12.698 INFO:tasks.workunit.client.0.vm05.stdout:8/813: truncate d1/fad 723565 0 2026-03-10T07:51:12.699 INFO:tasks.workunit.client.0.vm05.stdout:0/899: mkdir d8/d9c/dc8/d133 0 2026-03-10T07:51:12.700 INFO:tasks.workunit.client.0.vm05.stdout:5/908: creat d2/d20/d7b/dca/f137 x:0 0 0 2026-03-10T07:51:12.702 INFO:tasks.workunit.client.0.vm05.stdout:7/911: mknod d1/d6/d80/d82/c11d 0 2026-03-10T07:51:12.704 INFO:tasks.workunit.client.0.vm05.stdout:4/949: mknod d0/d6/d9/d8c/dbe/d11e/c138 0 2026-03-10T07:51:12.707 INFO:tasks.workunit.client.0.vm05.stdout:9/829: creat d8/d86/d28/d79/d57/de1/d38/d71/d81/dcf/f119 x:0 0 0 2026-03-10T07:51:12.712 INFO:tasks.workunit.client.0.vm05.stdout:0/900: mkdir d8/d9c/dc8/d134 0 2026-03-10T07:51:12.712 INFO:tasks.workunit.client.0.vm05.stdout:3/849: getdents d8/d1f/d24/d76/dc5/de1/d19 0 2026-03-10T07:51:12.713 INFO:tasks.workunit.client.0.vm05.stdout:6/916: mknod d0/d11/d22/d112/c12f 0 2026-03-10T07:51:12.714 INFO:tasks.workunit.client.0.vm05.stdout:2/925: truncate d0/d8/f65 2762114 0 2026-03-10T07:51:12.715 
INFO:tasks.workunit.client.0.vm05.stdout:2/926: dread - d0/d8/d43/df/df8/f117 zero size 2026-03-10T07:51:12.717 INFO:tasks.workunit.client.0.vm05.stdout:2/927: chown d0/d8/d43/df/d4d/c5a 3353973 1 2026-03-10T07:51:12.718 INFO:tasks.workunit.client.0.vm05.stdout:4/950: unlink d0/d6/d37/f46 0 2026-03-10T07:51:12.719 INFO:tasks.workunit.client.0.vm05.stdout:4/951: write d0/d6/da6/f11a [1122320,37017] 0 2026-03-10T07:51:12.734 INFO:tasks.workunit.client.0.vm05.stdout:0/901: symlink d8/dd/d37/d81/l135 0 2026-03-10T07:51:12.737 INFO:tasks.workunit.client.0.vm05.stdout:3/850: rename d8/d1f/d24/d45/f111 to d8/d8f/f120 0 2026-03-10T07:51:12.737 INFO:tasks.workunit.client.0.vm05.stdout:1/868: write da/dd/d12/f22 [3779681,82374] 0 2026-03-10T07:51:12.737 INFO:tasks.workunit.client.0.vm05.stdout:2/928: symlink d0/d8/d43/da4/dea/l12e 0 2026-03-10T07:51:12.738 INFO:tasks.workunit.client.0.vm05.stdout:2/929: fdatasync d0/d8/d66/dd1/d49/df9/f36 0 2026-03-10T07:51:12.741 INFO:tasks.workunit.client.0.vm05.stdout:4/952: creat d0/d6/d9/d8c/f139 x:0 0 0 2026-03-10T07:51:12.742 INFO:tasks.workunit.client.0.vm05.stdout:2/930: read d0/d8/d66/f68 [979071,55582] 0 2026-03-10T07:51:12.757 INFO:tasks.workunit.client.0.vm05.stdout:8/814: write d1/dd/d4d/d64/d6a/de5/d2a/d34/d49/d5d/ff8 [978036,109159] 0 2026-03-10T07:51:12.766 INFO:tasks.workunit.client.0.vm05.stdout:9/830: write d8/d86/d28/d79/d57/de1/f1f [2301508,20571] 0 2026-03-10T07:51:12.774 INFO:tasks.workunit.client.0.vm05.stdout:0/902: truncate d8/dd/f29 160135 0 2026-03-10T07:51:12.775 INFO:tasks.workunit.client.0.vm05.stdout:0/903: chown d8/dd/d10/d26/d8b/d7d/f123 463037 1 2026-03-10T07:51:12.792 INFO:tasks.workunit.client.0.vm05.stdout:2/931: write d0/d8/d66/dd1/fe8 [713494,97424] 0 2026-03-10T07:51:12.794 INFO:tasks.workunit.client.0.vm05.stdout:6/917: creat d0/d11/f130 x:0 0 0 2026-03-10T07:51:12.795 INFO:tasks.workunit.client.0.vm05.stdout:7/912: getdents d1/d6/d80/dcd 0 2026-03-10T07:51:12.800 
INFO:tasks.workunit.client.0.vm05.stdout:3/851: getdents d8/d1f/d108 0 2026-03-10T07:51:12.806 INFO:tasks.workunit.client.0.vm05.stdout:0/904: mkdir d8/d9c/dc8/d100/d136 0 2026-03-10T07:51:12.806 INFO:tasks.workunit.client.0.vm05.stdout:4/953: symlink d0/d6/d60/dde/d12d/l13a 0 2026-03-10T07:51:12.808 INFO:tasks.workunit.client.0.vm05.stdout:3/852: dwrite d8/d1f/d2a/d4a/d8c/f11b [0,4194304] 0 2026-03-10T07:51:12.819 INFO:tasks.workunit.client.0.vm05.stdout:8/815: truncate d1/dd/d4d/d64/d6a/de5/d2a/d48/f4a 181184 0 2026-03-10T07:51:12.826 INFO:tasks.workunit.client.0.vm05.stdout:2/932: rename d0/d8/d66/dd1/d49/fde to d0/d8/d43/d38/f12f 0 2026-03-10T07:51:12.829 INFO:tasks.workunit.client.0.vm05.stdout:5/909: rename d2/d12/da8/ddd/le6 to d2/d12/da8/ddd/l138 0 2026-03-10T07:51:12.832 INFO:tasks.workunit.client.0.vm05.stdout:6/918: creat d0/d11/d86/d12e/dc8/f131 x:0 0 0 2026-03-10T07:51:12.838 INFO:tasks.workunit.client.0.vm05.stdout:4/954: symlink d0/d6/d9/d12/d9c/db7/da7/d96/l13b 0 2026-03-10T07:51:12.843 INFO:tasks.workunit.client.0.vm05.stdout:5/910: chown d2/d20/d33/f123 1464607 1 2026-03-10T07:51:12.852 INFO:tasks.workunit.client.0.vm05.stdout:1/869: getdents da/dd/d2a 0 2026-03-10T07:51:12.859 INFO:tasks.workunit.client.0.vm05.stdout:8/816: write d1/d45/d90/fb3 [614040,54298] 0 2026-03-10T07:51:12.861 INFO:tasks.workunit.client.0.vm05.stdout:9/831: dwrite f6 [0,4194304] 0 2026-03-10T07:51:12.863 INFO:tasks.workunit.client.0.vm05.stdout:7/913: sync 2026-03-10T07:51:12.863 INFO:tasks.workunit.client.0.vm05.stdout:8/817: write d1/dd/d4d/d64/fe8 [745517,20927] 0 2026-03-10T07:51:12.865 INFO:tasks.workunit.client.0.vm05.stdout:0/905: fsync d8/dd/d10/d26/d2a/d6f/f85 0 2026-03-10T07:51:12.866 INFO:tasks.workunit.client.0.vm05.stdout:7/914: dread - d1/d6/d47/f116 zero size 2026-03-10T07:51:12.874 INFO:tasks.workunit.client.0.vm05.stdout:6/919: dwrite d0/d11/d4f/d56/f83 [0,4194304] 0 2026-03-10T07:51:12.894 INFO:tasks.workunit.client.0.vm05.stdout:3/853: mkdir 
d8/d1f/d2a/d34/dbd/d116/d121 0 2026-03-10T07:51:12.901 INFO:tasks.workunit.client.0.vm05.stdout:5/911: unlink d2/d20/d33/d86/dac/cd5 0 2026-03-10T07:51:12.904 INFO:tasks.workunit.client.0.vm05.stdout:1/870: symlink da/dd/d2a/d55/d64/dac/l112 0 2026-03-10T07:51:12.905 INFO:tasks.workunit.client.0.vm05.stdout:7/915: readlink d1/l2f 0 2026-03-10T07:51:12.905 INFO:tasks.workunit.client.0.vm05.stdout:0/906: dread d8/dd/d37/d56/d4d/fd7 [0,4194304] 0 2026-03-10T07:51:12.906 INFO:tasks.workunit.client.0.vm05.stdout:9/832: dread d8/d86/d95/fcb [0,4194304] 0 2026-03-10T07:51:12.906 INFO:tasks.workunit.client.0.vm05.stdout:8/818: rename d1/d45/d90 to d1/d6f/df9/d102 0 2026-03-10T07:51:12.907 INFO:tasks.workunit.client.0.vm05.stdout:1/871: write da/dd/d12/f22 [7186272,86531] 0 2026-03-10T07:51:12.916 INFO:tasks.workunit.client.0.vm05.stdout:4/955: write d0/d6/d9/d12/d45/d55/d4e/f97 [1012511,46618] 0 2026-03-10T07:51:12.924 INFO:tasks.workunit.client.0.vm05.stdout:6/920: creat d0/d11/d57/d66/f132 x:0 0 0 2026-03-10T07:51:12.936 INFO:tasks.workunit.client.0.vm05.stdout:5/912: rmdir d2/d4b 39 2026-03-10T07:51:12.940 INFO:tasks.workunit.client.0.vm05.stdout:7/916: creat d1/d5b/de6/f11e x:0 0 0 2026-03-10T07:51:12.945 INFO:tasks.workunit.client.0.vm05.stdout:0/907: symlink d8/dd/d10/d26/d8b/da4/ddf/l137 0 2026-03-10T07:51:12.950 INFO:tasks.workunit.client.0.vm05.stdout:1/872: unlink da/dd/d12/d34/d107/fb6 0 2026-03-10T07:51:12.951 INFO:tasks.workunit.client.0.vm05.stdout:0/908: dwrite d8/dd/f118 [4194304,4194304] 0 2026-03-10T07:51:12.971 INFO:tasks.workunit.client.0.vm05.stdout:2/933: getdents d0/d8/d43/df/d4d 0 2026-03-10T07:51:12.975 INFO:tasks.workunit.client.0.vm05.stdout:3/854: getdents d8/d1f/d24/d76/df0 0 2026-03-10T07:51:12.975 INFO:tasks.workunit.client.0.vm05.stdout:7/917: truncate d1/d6/f22 305091 0 2026-03-10T07:51:12.976 INFO:tasks.workunit.client.0.vm05.stdout:8/819: truncate d1/dd/d4d/d64/f86 568924 0 2026-03-10T07:51:12.976 
INFO:tasks.workunit.client.0.vm05.stdout:3/855: stat d8/d1f/d24/d76/dc5/de1/dac/f75 0 2026-03-10T07:51:12.983 INFO:tasks.workunit.client.0.vm05.stdout:2/934: dwrite d0/d8/d43/df/fc2 [0,4194304] 0 2026-03-10T07:51:12.997 INFO:tasks.workunit.client.0.vm05.stdout:4/956: rename d0/d6/d37/c92 to d0/d6/d9/d12/d45/d55/d44/c13c 0 2026-03-10T07:51:13.002 INFO:tasks.workunit.client.0.vm05.stdout:2/935: unlink d0/d8/dc6/fdc 0 2026-03-10T07:51:13.006 INFO:tasks.workunit.client.0.vm05.stdout:9/833: dwrite d8/d86/d28/d79/d57/de1/d22/d33/d62/fef [0,4194304] 0 2026-03-10T07:51:13.012 INFO:tasks.workunit.client.0.vm05.stdout:2/936: read d0/d8/d66/dd1/fda [2158164,77152] 0 2026-03-10T07:51:13.012 INFO:tasks.workunit.client.0.vm05.stdout:2/937: write d0/d8/d43/df/df8/f117 [272639,20350] 0 2026-03-10T07:51:13.012 INFO:tasks.workunit.client.0.vm05.stdout:9/834: write d8/d86/d28/d79/d57/de1/d38/fff [3537499,103815] 0 2026-03-10T07:51:13.014 INFO:tasks.workunit.client.0.vm05.stdout:6/921: dwrite d0/d11/d22/f52 [0,4194304] 0 2026-03-10T07:51:13.024 INFO:tasks.workunit.client.0.vm05.stdout:5/913: getdents d2/d20/d7b/dbc 0 2026-03-10T07:51:13.024 INFO:tasks.workunit.client.0.vm05.stdout:0/909: getdents d8/dd/d37/d56/d4d/df8 0 2026-03-10T07:51:13.024 INFO:tasks.workunit.client.0.vm05.stdout:5/914: chown d2/d12/dda/da1 29687 1 2026-03-10T07:51:13.034 INFO:tasks.workunit.client.0.vm05.stdout:9/835: mkdir d8/d86/d28/d79/d57/dbc/d11a 0 2026-03-10T07:51:13.034 INFO:tasks.workunit.client.0.vm05.stdout:6/922: mkdir d0/d11/d4f/d7d/db7/d133 0 2026-03-10T07:51:13.035 INFO:tasks.workunit.client.0.vm05.stdout:4/957: sync 2026-03-10T07:51:13.042 INFO:tasks.workunit.client.0.vm05.stdout:5/915: rmdir d2/d12/dda/da1/dc0/dc2 39 2026-03-10T07:51:13.062 INFO:tasks.workunit.client.0.vm05.stdout:2/938: mknod d0/d8/d66/dd1/d49/db1/d111/c130 0 2026-03-10T07:51:13.063 INFO:tasks.workunit.client.0.vm05.stdout:9/836: unlink d8/d86/d28/d79/d57/de1/d22/d33/d62/d6d/de7/c108 0 2026-03-10T07:51:13.063 
INFO:tasks.workunit.client.0.vm05.stdout:2/939: symlink d0/d113/l131 0 2026-03-10T07:51:13.063 INFO:tasks.workunit.client.0.vm05.stdout:9/837: unlink d8/d86/d28/f43 0 2026-03-10T07:51:13.063 INFO:tasks.workunit.client.0.vm05.stdout:2/940: dread - d0/d8/d66/dd1/d49/db1/f12b zero size 2026-03-10T07:51:13.063 INFO:tasks.workunit.client.0.vm05.stdout:2/941: stat d0/d8/d43/d38/f9a 0 2026-03-10T07:51:13.063 INFO:tasks.workunit.client.0.vm05.stdout:6/923: link d0/d11/c10c d0/d11/d4f/c134 0 2026-03-10T07:51:13.070 INFO:tasks.workunit.client.0.vm05.stdout:2/942: mkdir d0/d8/d43/da4/dea/d132 0 2026-03-10T07:51:13.073 INFO:tasks.workunit.client.0.vm05.stdout:5/916: link d2/d12/da8/c76 d2/d20/d33/d86/dac/c139 0 2026-03-10T07:51:13.074 INFO:tasks.workunit.client.0.vm05.stdout:6/924: truncate d0/d11/d2e/f30 1124124 0 2026-03-10T07:51:13.074 INFO:tasks.workunit.client.0.vm05.stdout:2/943: readlink d0/d8/d43/df/df8/lff 0 2026-03-10T07:51:13.074 INFO:tasks.workunit.client.0.vm05.stdout:5/917: chown d2/d20/d33/d86/dac/dc1 3687779 1 2026-03-10T07:51:13.075 INFO:tasks.workunit.client.0.vm05.stdout:2/944: fsync d0/d8/d43/df/d4e/d10a/f120 0 2026-03-10T07:51:13.078 INFO:tasks.workunit.client.0.vm05.stdout:4/958: sync 2026-03-10T07:51:13.090 INFO:tasks.workunit.client.0.vm05.stdout:6/925: rename d0/d11/d2e/d81/f125 to d0/d11/d4f/d56/f135 0 2026-03-10T07:51:13.096 INFO:tasks.workunit.client.0.vm05.stdout:7/918: write d1/d6/f84 [1850379,116089] 0 2026-03-10T07:51:13.097 INFO:tasks.workunit.client.0.vm05.stdout:3/856: write d8/d22/fb9 [1018007,80603] 0 2026-03-10T07:51:13.098 INFO:tasks.workunit.client.0.vm05.stdout:8/820: truncate d1/dd/d4d/d64/d6a/de5/d2a/f95 9192463 0 2026-03-10T07:51:13.109 INFO:tasks.workunit.client.0.vm05.stdout:0/910: write d8/dd/d37/d56/d4d/f69 [119817,48780] 0 2026-03-10T07:51:13.113 INFO:tasks.workunit.client.0.vm05.stdout:5/918: write d2/d20/d5b/feb [362964,19163] 0 2026-03-10T07:51:13.114 INFO:tasks.workunit.client.0.vm05.stdout:5/919: truncate d2/d20/d7b/f121 
805320 0 2026-03-10T07:51:13.117 INFO:tasks.workunit.client.0.vm05.stdout:9/838: dwrite d8/d86/d28/d79/d57/de1/d22/d33/d62/f6c [0,4194304] 0 2026-03-10T07:51:13.135 INFO:tasks.workunit.client.0.vm05.stdout:1/873: dread da/dd/d12/d34/d107/dbe/dc0/fa3 [0,4194304] 0 2026-03-10T07:51:13.178 INFO:tasks.workunit.client.0.vm05.stdout:2/945: rmdir d0/d8/d66/dd1/d49/df9/da5/da8 39 2026-03-10T07:51:13.181 INFO:tasks.workunit.client.0.vm05.stdout:4/959: creat d0/d6/d9/d5a/d6e/dd1/f13d x:0 0 0 2026-03-10T07:51:13.184 INFO:tasks.workunit.client.0.vm05.stdout:6/926: creat d0/d6/d3b/f136 x:0 0 0 2026-03-10T07:51:13.190 INFO:tasks.workunit.client.0.vm05.stdout:3/857: dread d8/f5d [0,4194304] 0 2026-03-10T07:51:13.191 INFO:tasks.workunit.client.0.vm05.stdout:7/919: fsync d1/d6/d47/d8d/fb4 0 2026-03-10T07:51:13.194 INFO:tasks.workunit.client.0.vm05.stdout:8/821: mkdir d1/dd/d4d/d64/d6a/de5/d2a/d34/da5/d103 0 2026-03-10T07:51:13.201 INFO:tasks.workunit.client.0.vm05.stdout:0/911: write d8/dd/d10/d26/d2a/f2e [2382996,36679] 0 2026-03-10T07:51:13.213 INFO:tasks.workunit.client.0.vm05.stdout:5/920: read d2/d20/d33/f115 [254984,52067] 0 2026-03-10T07:51:13.215 INFO:tasks.workunit.client.0.vm05.stdout:9/839: readlink d8/l19 0 2026-03-10T07:51:13.219 INFO:tasks.workunit.client.0.vm05.stdout:1/874: dread da/dd/d2a/d55/d64/d104/f4d [0,4194304] 0 2026-03-10T07:51:13.220 INFO:tasks.workunit.client.0.vm05.stdout:1/875: write da/fc [1997678,6221] 0 2026-03-10T07:51:13.220 INFO:tasks.workunit.client.0.vm05.stdout:1/876: chown da/d26/d2b/dcb 173 1 2026-03-10T07:51:13.222 INFO:tasks.workunit.client.0.vm05.stdout:1/877: dread da/dd/d2a/f90 [0,4194304] 0 2026-03-10T07:51:13.223 INFO:tasks.workunit.client.0.vm05.stdout:1/878: read da/dd/d12/d34/f5f [1979610,126305] 0 2026-03-10T07:51:13.231 INFO:tasks.workunit.client.0.vm05.stdout:6/927: truncate d0/d6/f44 3100870 0 2026-03-10T07:51:13.232 INFO:tasks.workunit.client.0.vm05.stdout:7/920: creat d1/d6/dc3/f11f x:0 0 0 2026-03-10T07:51:13.234 
INFO:tasks.workunit.client.0.vm05.stdout:8/822: unlink d1/dd/d4d/f8a 0 2026-03-10T07:51:13.235 INFO:tasks.workunit.client.0.vm05.stdout:8/823: chown d1/dd/d18/f29 5 1 2026-03-10T07:51:13.235 INFO:tasks.workunit.client.0.vm05.stdout:8/824: chown d1/dd/d4d/dcc/dbd/dfc 55044 1 2026-03-10T07:51:13.236 INFO:tasks.workunit.client.0.vm05.stdout:8/825: chown d1/dd/d4d/d64/d6a/de5/fe4 1354 1 2026-03-10T07:51:13.238 INFO:tasks.workunit.client.0.vm05.stdout:0/912: creat d8/dd/d10/d26/d8b/da4/de7/f138 x:0 0 0 2026-03-10T07:51:13.238 INFO:tasks.workunit.client.0.vm05.stdout:0/913: chown d8/f20 24838 1 2026-03-10T07:51:13.239 INFO:tasks.workunit.client.0.vm05.stdout:0/914: stat d8/dd/d10/d26/d2a/f94 0 2026-03-10T07:51:13.241 INFO:tasks.workunit.client.0.vm05.stdout:5/921: mkdir d2/d20/d33/d53/d7d/d13a 0 2026-03-10T07:51:13.242 INFO:tasks.workunit.client.0.vm05.stdout:5/922: fsync d2/d5/f131 0 2026-03-10T07:51:13.242 INFO:tasks.workunit.client.0.vm05.stdout:0/915: dwrite d8/dd/d10/d26/d8b/d86/f11a [0,4194304] 0 2026-03-10T07:51:13.249 INFO:tasks.workunit.client.0.vm05.stdout:2/946: dwrite d0/d8/f3b [4194304,4194304] 0 2026-03-10T07:51:13.251 INFO:tasks.workunit.client.0.vm05.stdout:2/947: stat d0/d8/d43/d38/f12f 0 2026-03-10T07:51:13.265 INFO:tasks.workunit.client.0.vm05.stdout:4/960: symlink d0/d6/d9/d12/d65/def/l13e 0 2026-03-10T07:51:13.275 INFO:tasks.workunit.client.0.vm05.stdout:3/858: creat d8/d22/d60/df7/f122 x:0 0 0 2026-03-10T07:51:13.275 INFO:tasks.workunit.client.0.vm05.stdout:7/921: fdatasync d1/d6/f31 0 2026-03-10T07:51:13.275 INFO:tasks.workunit.client.0.vm05.stdout:8/826: fsync d1/dd/d4d/d64/d6a/de5/d2a/d48/f7b 0 2026-03-10T07:51:13.276 INFO:tasks.workunit.client.0.vm05.stdout:1/879: dread da/d26/d2b/d71/f7d [0,4194304] 0 2026-03-10T07:51:13.278 INFO:tasks.workunit.client.0.vm05.stdout:0/916: write d8/dd/d10/d26/d2a/fc7 [2759117,718] 0 2026-03-10T07:51:13.280 INFO:tasks.workunit.client.0.vm05.stdout:2/948: creat d0/d8/d43/df/d8b/f133 x:0 0 0 2026-03-10T07:51:13.284 
INFO:tasks.workunit.client.0.vm05.stdout:3/859: creat d8/d1f/f123 x:0 0 0 2026-03-10T07:51:13.285 INFO:tasks.workunit.client.0.vm05.stdout:3/860: write d8/d22/d60/d6e/dca/dda/f10d [4409745,12384] 0 2026-03-10T07:51:13.286 INFO:tasks.workunit.client.0.vm05.stdout:6/928: creat d0/d11/d57/d60/d117/f137 x:0 0 0 2026-03-10T07:51:13.292 INFO:tasks.workunit.client.0.vm05.stdout:7/922: mkdir d1/d108/d120 0 2026-03-10T07:51:13.304 INFO:tasks.workunit.client.0.vm05.stdout:9/840: dwrite d8/d86/d28/d79/d57/de1/d22/f4f [0,4194304] 0 2026-03-10T07:51:13.311 INFO:tasks.workunit.client.0.vm05.stdout:9/841: write d8/d86/d28/d79/d57/de1/d38/fff [3060823,54821] 0 2026-03-10T07:51:13.315 INFO:tasks.workunit.client.0.vm05.stdout:9/842: chown d8/d86/d28/d79/d57/de1/d38/fd0 260 1 2026-03-10T07:51:13.315 INFO:tasks.workunit.client.0.vm05.stdout:3/861: creat d8/d1c/f124 x:0 0 0 2026-03-10T07:51:13.318 INFO:tasks.workunit.client.0.vm05.stdout:6/929: creat d0/d11/d22/f138 x:0 0 0 2026-03-10T07:51:13.318 INFO:tasks.workunit.client.0.vm05.stdout:3/862: chown d8/d1f/d24/d8a/f57 660019 1 2026-03-10T07:51:13.328 INFO:tasks.workunit.client.0.vm05.stdout:8/827: mkdir d1/dd/d4d/d64/d104 0 2026-03-10T07:51:13.330 INFO:tasks.workunit.client.0.vm05.stdout:5/923: creat d2/d20/f13b x:0 0 0 2026-03-10T07:51:13.333 INFO:tasks.workunit.client.0.vm05.stdout:1/880: unlink da/d26/d2b/d89/dbd/ce9 0 2026-03-10T07:51:13.346 INFO:tasks.workunit.client.0.vm05.stdout:3/863: chown d8/d1f/d2a/d34/fae 16457426 1 2026-03-10T07:51:13.346 INFO:tasks.workunit.client.0.vm05.stdout:8/828: chown d1/dd/d4d/d64/d6a/de5/c28 3942 1 2026-03-10T07:51:13.348 INFO:tasks.workunit.client.0.vm05.stdout:9/843: sync 2026-03-10T07:51:13.348 INFO:tasks.workunit.client.0.vm05.stdout:5/924: creat d2/d20/d7b/dca/f13c x:0 0 0 2026-03-10T07:51:13.350 INFO:tasks.workunit.client.0.vm05.stdout:1/881: dread - da/dd/d42/fe3 zero size 2026-03-10T07:51:13.351 INFO:tasks.workunit.client.0.vm05.stdout:9/844: readlink 
d8/d86/d28/d79/d57/de1/d1c/d20/dd3/d63/l104 0 2026-03-10T07:51:13.352 INFO:tasks.workunit.client.0.vm05.stdout:9/845: dread - d8/d86/d28/de9/f101 zero size 2026-03-10T07:51:13.353 INFO:tasks.workunit.client.0.vm05.stdout:0/917: rename d8/dd/d10/d26/d48/fb0 to d8/dd/d37/f139 0 2026-03-10T07:51:13.354 INFO:tasks.workunit.client.0.vm05.stdout:0/918: readlink d8/dd/d10/d26/d8b/d70/l8a 0 2026-03-10T07:51:13.361 INFO:tasks.workunit.client.0.vm05.stdout:4/961: getdents d0/d6/d9/d12/d9c/db7/db1 0 2026-03-10T07:51:13.379 INFO:tasks.workunit.client.0.vm05.stdout:8/829: rmdir d1/dd/d5e 39 2026-03-10T07:51:13.382 INFO:tasks.workunit.client.0.vm05.stdout:6/930: dwrite d0/d11/d31/f63 [0,4194304] 0 2026-03-10T07:51:13.389 INFO:tasks.workunit.client.0.vm05.stdout:5/925: mknod d2/d20/d33/d53/c13d 0 2026-03-10T07:51:13.394 INFO:tasks.workunit.client.0.vm05.stdout:9/846: fdatasync d8/d86/d28/d79/d57/de1/d38/fd0 0 2026-03-10T07:51:13.399 INFO:tasks.workunit.client.0.vm05.stdout:4/962: fdatasync d0/d6/fa0 0 2026-03-10T07:51:13.401 INFO:tasks.workunit.client.0.vm05.stdout:8/830: chown d1/l44 3567 1 2026-03-10T07:51:13.420 INFO:tasks.workunit.client.0.vm05.stdout:1/882: dwrite da/dd/d2a/d55/d64/d104/f4d [0,4194304] 0 2026-03-10T07:51:13.422 INFO:tasks.workunit.client.0.vm05.stdout:0/919: dwrite d8/f75 [0,4194304] 0 2026-03-10T07:51:13.423 INFO:tasks.workunit.client.0.vm05.stdout:0/920: readlink d8/dd/d10/d26/d2a/l43 0 2026-03-10T07:51:13.425 INFO:tasks.workunit.client.0.vm05.stdout:7/923: rename d1/d6/d47/c75 to d1/d6/d47/d8d/d112/c121 0 2026-03-10T07:51:13.454 INFO:tasks.workunit.client.0.vm05.stdout:9/847: mknod d8/d86/d28/d79/d57/de1/d1c/d20/d54/c11b 0 2026-03-10T07:51:13.464 INFO:tasks.workunit.client.0.vm05.stdout:6/931: mkdir d0/d11/d57/d139 0 2026-03-10T07:51:13.467 INFO:tasks.workunit.client.0.vm05.stdout:6/932: truncate d0/d11/d22/f138 522175 0 2026-03-10T07:51:13.474 INFO:tasks.workunit.client.0.vm05.stdout:0/921: rename d8/dd/d10/d26/d2a/d6f to d8/dd/d10/d26/d2a/d13a 0 
2026-03-10T07:51:13.475 INFO:tasks.workunit.client.0.vm05.stdout:2/949: rename d0/d8/d66/dd1/d49/df9/db2/dd7/c110 to d0/d8/d66/dd1/d49/df9/da5/da8/c134 0 2026-03-10T07:51:13.475 INFO:tasks.workunit.client.0.vm05.stdout:0/922: fsync d8/dd/d37/dfd/f119 0 2026-03-10T07:51:13.480 INFO:tasks.workunit.client.0.vm05.stdout:7/924: symlink d1/d3c/d4b/l122 0 2026-03-10T07:51:13.484 INFO:tasks.workunit.client.0.vm05.stdout:1/883: mkdir da/dd/d12/d113 0 2026-03-10T07:51:13.484 INFO:tasks.workunit.client.0.vm05.stdout:3/864: rename d8/d1c/d48/d69/fb2 to d8/d1f/f125 0 2026-03-10T07:51:13.487 INFO:tasks.workunit.client.0.vm05.stdout:8/831: write d1/dd/d4d/dcc/fbe [518093,105424] 0 2026-03-10T07:51:13.491 INFO:tasks.workunit.client.0.vm05.stdout:8/832: chown d1/dc9/lcb 0 1 2026-03-10T07:51:13.497 INFO:tasks.workunit.client.0.vm05.stdout:5/926: dwrite d2/d20/f51 [0,4194304] 0 2026-03-10T07:51:13.517 INFO:tasks.workunit.client.0.vm05.stdout:8/833: mknod d1/dc9/c105 0 2026-03-10T07:51:13.525 INFO:tasks.workunit.client.0.vm05.stdout:6/933: dread d0/d11/f13 [0,4194304] 0 2026-03-10T07:51:13.540 INFO:tasks.workunit.client.0.vm05.stdout:3/865: write d8/d1f/d2a/d4a/f89 [981015,46014] 0 2026-03-10T07:51:13.540 INFO:tasks.workunit.client.0.vm05.stdout:9/848: write d8/d86/d28/d79/d57/de1/d22/dab/fd7 [382590,5192] 0 2026-03-10T07:51:13.549 INFO:tasks.workunit.client.0.vm05.stdout:7/925: dwrite d1/d6/d3b/fda [0,4194304] 0 2026-03-10T07:51:13.551 INFO:tasks.workunit.client.0.vm05.stdout:4/963: getdents d0/de4 0 2026-03-10T07:51:13.552 INFO:tasks.workunit.client.0.vm05.stdout:4/964: chown d0/d6/d9/d12/d45 11224726 1 2026-03-10T07:51:13.559 INFO:tasks.workunit.client.0.vm05.stdout:8/834: creat d1/dd/d4d/d64/d6a/de5/d2a/d48/d7c/d9c/da4/f106 x:0 0 0 2026-03-10T07:51:13.561 INFO:tasks.workunit.client.0.vm05.stdout:2/950: getdents d0/d8/d66/dd1/d49 0 2026-03-10T07:51:13.565 INFO:tasks.workunit.client.0.vm05.stdout:1/884: creat da/d26/d2b/f114 x:0 0 0 2026-03-10T07:51:13.570 
INFO:tasks.workunit.client.0.vm05.stdout:1/885: read da/f5d [3664952,37773] 0 2026-03-10T07:51:13.570 INFO:tasks.workunit.client.0.vm05.stdout:9/849: mkdir d8/d86/d28/d79/d57/de1/d38/d71/d81/d11c 0 2026-03-10T07:51:13.573 INFO:tasks.workunit.client.0.vm05.stdout:6/934: dread d0/f26 [0,4194304] 0 2026-03-10T07:51:13.575 INFO:tasks.workunit.client.0.vm05.stdout:4/965: rename d0/d6/d9/d12/d9c/db7/da7/d96/l13b to d0/d6/d6f/dc2/l13f 0 2026-03-10T07:51:13.575 INFO:tasks.workunit.client.0.vm05.stdout:2/951: creat d0/d8/d43/da4/dea/f135 x:0 0 0 2026-03-10T07:51:13.576 INFO:tasks.workunit.client.0.vm05.stdout:0/923: link d8/dd/d10/d26/d2a/d13a/cc1 d8/dd/d37/c13b 0 2026-03-10T07:51:13.580 INFO:tasks.workunit.client.0.vm05.stdout:1/886: mknod da/d26/d9e/c115 0 2026-03-10T07:51:13.585 INFO:tasks.workunit.client.0.vm05.stdout:7/926: symlink d1/d3c/l123 0 2026-03-10T07:51:13.585 INFO:tasks.workunit.client.0.vm05.stdout:5/927: creat d2/d5/f13e x:0 0 0 2026-03-10T07:51:13.585 INFO:tasks.workunit.client.0.vm05.stdout:5/928: readlink d2/dd7/le2 0 2026-03-10T07:51:13.585 INFO:tasks.workunit.client.0.vm05.stdout:6/935: creat d0/d11/d22/d6c/f13a x:0 0 0 2026-03-10T07:51:13.599 INFO:tasks.workunit.client.0.vm05.stdout:8/835: sync 2026-03-10T07:51:13.600 INFO:tasks.workunit.client.0.vm05.stdout:0/924: creat d8/d9c/dc8/d100/d136/f13c x:0 0 0 2026-03-10T07:51:13.600 INFO:tasks.workunit.client.0.vm05.stdout:4/966: mkdir d0/d6/d9/d5a/d6e/db6/d140 0 2026-03-10T07:51:13.611 INFO:tasks.workunit.client.0.vm05.stdout:2/952: dread d0/d8/f2d [0,4194304] 0 2026-03-10T07:51:13.618 INFO:tasks.workunit.client.0.vm05.stdout:6/936: write d0/d6/f1d [1476992,70149] 0 2026-03-10T07:51:13.620 INFO:tasks.workunit.client.0.vm05.stdout:8/836: dread d1/fe [0,4194304] 0 2026-03-10T07:51:13.626 INFO:tasks.workunit.client.0.vm05.stdout:6/937: sync 2026-03-10T07:51:13.628 INFO:tasks.workunit.client.0.vm05.stdout:9/850: link d8/d86/d28/d79/d57/ff5 d8/d86/d28/d79/d57/de1/d22/d33/d62/f11d 0 2026-03-10T07:51:13.628 
INFO:tasks.workunit.client.0.vm05.stdout:7/927: dwrite d1/d34/d59/d60/d8c/de4/ffe [0,4194304] 0 2026-03-10T07:51:13.635 INFO:tasks.workunit.client.0.vm05.stdout:1/887: dwrite da/dd/d12/d34/f5f [0,4194304] 0 2026-03-10T07:51:13.636 INFO:tasks.workunit.client.0.vm05.stdout:4/967: creat d0/d6/d9/d5a/d6e/db6/db9/f141 x:0 0 0 2026-03-10T07:51:13.636 INFO:tasks.workunit.client.0.vm05.stdout:5/929: symlink d2/d12/da8/ddd/de9/d11c/l13f 0 2026-03-10T07:51:13.636 INFO:tasks.workunit.client.0.vm05.stdout:2/953: mknod d0/d2a/c136 0 2026-03-10T07:51:13.636 INFO:tasks.workunit.client.0.vm05.stdout:9/851: rmdir d8/d86/d28/d79/d57/de1/d22/d33/df9 39 2026-03-10T07:51:13.641 INFO:tasks.workunit.client.0.vm05.stdout:3/866: dwrite d8/d1f/d24/d8a/f57 [4194304,4194304] 0 2026-03-10T07:51:13.641 INFO:tasks.workunit.client.0.vm05.stdout:3/867: read - d8/d1c/d109/f10e zero size 2026-03-10T07:51:13.641 INFO:tasks.workunit.client.0.vm05.stdout:6/938: dread - d0/d11/d86/d12e/dc8/f131 zero size 2026-03-10T07:51:13.644 INFO:tasks.workunit.client.0.vm05.stdout:8/837: rmdir d1/dd/d4d/d64/d6a/de5/d2a/d9a 39 2026-03-10T07:51:13.649 INFO:tasks.workunit.client.0.vm05.stdout:8/838: chown d1/dd/d4d/d64/d6a/de5/l3c 203475 1 2026-03-10T07:51:13.652 INFO:tasks.workunit.client.0.vm05.stdout:5/930: mknod d2/d20/d4c/db6/c140 0 2026-03-10T07:51:13.656 INFO:tasks.workunit.client.0.vm05.stdout:0/925: dwrite d8/d9c/dc8/f106 [0,4194304] 0 2026-03-10T07:51:13.659 INFO:tasks.workunit.client.0.vm05.stdout:0/926: stat d8/dd/d10/d26/d2a/d13a/faa 0 2026-03-10T07:51:13.670 INFO:tasks.workunit.client.0.vm05.stdout:2/954: truncate d0/d8/fe 4745595 0 2026-03-10T07:51:13.680 INFO:tasks.workunit.client.0.vm05.stdout:5/931: stat d2/d12/da8/c76 0 2026-03-10T07:51:13.685 INFO:tasks.workunit.client.0.vm05.stdout:9/852: mknod d8/d86/d28/d79/d57/de1/d38/d71/d81/d11c/c11e 0 2026-03-10T07:51:13.685 INFO:tasks.workunit.client.0.vm05.stdout:0/927: dread d8/dd/d10/d26/d2a/d13a/faa [0,4194304] 0 2026-03-10T07:51:13.691 
INFO:tasks.workunit.client.0.vm05.stdout:4/968: creat d0/d6/d9/d12/d65/f142 x:0 0 0 2026-03-10T07:51:13.692 INFO:tasks.workunit.client.0.vm05.stdout:6/939: rmdir d0/d11/d2e/d81/dd6 0 2026-03-10T07:51:13.696 INFO:tasks.workunit.client.0.vm05.stdout:7/928: rename d1/d6/d47/l4a to d1/d6/d3b/l124 0 2026-03-10T07:51:13.701 INFO:tasks.workunit.client.0.vm05.stdout:6/940: dread - d0/d11/d4f/d7d/db7/fc9 zero size 2026-03-10T07:51:13.702 INFO:tasks.workunit.client.0.vm05.stdout:6/941: write d0/f29 [5253680,36364] 0 2026-03-10T07:51:13.702 INFO:tasks.workunit.client.0.vm05.stdout:6/942: readlink d0/d11/l25 0 2026-03-10T07:51:13.706 INFO:tasks.workunit.client.0.vm05.stdout:5/932: mknod d2/d20/d7b/c141 0 2026-03-10T07:51:13.706 INFO:tasks.workunit.client.0.vm05.stdout:5/933: write d2/d20/d33/fe0 [5152266,63728] 0 2026-03-10T07:51:13.708 INFO:tasks.workunit.client.0.vm05.stdout:5/934: stat d2/d20/d33/d86/fb5 0 2026-03-10T07:51:13.708 INFO:tasks.workunit.client.0.vm05.stdout:0/928: sync 2026-03-10T07:51:13.710 INFO:tasks.workunit.client.0.vm05.stdout:5/935: write d2/d20/d33/d53/f97 [6970696,93378] 0 2026-03-10T07:51:13.713 INFO:tasks.workunit.client.0.vm05.stdout:0/929: dread d8/f75 [0,4194304] 0 2026-03-10T07:51:13.713 INFO:tasks.workunit.client.0.vm05.stdout:0/930: readlink d8/l90 0 2026-03-10T07:51:13.722 INFO:tasks.workunit.client.0.vm05.stdout:3/868: rename d8/d1f/d24/d8a/f91 to d8/d1f/d2a/d96/f126 0 2026-03-10T07:51:13.728 INFO:tasks.workunit.client.0.vm05.stdout:7/929: fdatasync d1/d34/d59/fca 0 2026-03-10T07:51:13.729 INFO:tasks.workunit.client.0.vm05.stdout:7/930: read d1/d6/f4e [198855,81813] 0 2026-03-10T07:51:13.732 INFO:tasks.workunit.client.0.vm05.stdout:7/931: dwrite d1/d6/d80/fb9 [0,4194304] 0 2026-03-10T07:51:13.741 INFO:tasks.workunit.client.0.vm05.stdout:9/853: symlink d8/d86/d28/d79/d57/de1/d6b/dde/l11f 0 2026-03-10T07:51:13.749 INFO:tasks.workunit.client.0.vm05.stdout:1/888: write da/dd/d2a/d55/d64/f9f [981830,102517] 0 2026-03-10T07:51:13.753 
INFO:tasks.workunit.client.0.vm05.stdout:6/943: mkdir d0/d11/d2e/d81/d92/d13b 0 2026-03-10T07:51:13.763 INFO:tasks.workunit.client.0.vm05.stdout:8/839: dwrite d1/dd/d4d/d64/d8f/faf [0,4194304] 0 2026-03-10T07:51:13.770 INFO:tasks.workunit.client.0.vm05.stdout:5/936: mknod d2/d20/d33/d86/dac/dc1/c142 0 2026-03-10T07:51:13.770 INFO:tasks.workunit.client.0.vm05.stdout:5/937: stat d2/d20/d33/d53/fb0 0 2026-03-10T07:51:13.774 INFO:tasks.workunit.client.0.vm05.stdout:0/931: rename d8/dd/d10/d26/d8b/da4/fdc to d8/d9c/dc8/d10f/d127/f13d 0 2026-03-10T07:51:13.781 INFO:tasks.workunit.client.0.vm05.stdout:2/955: truncate d0/d8/d43/df/d53/f126 1289100 0 2026-03-10T07:51:13.783 INFO:tasks.workunit.client.0.vm05.stdout:8/840: fsync d1/fa 0 2026-03-10T07:51:13.788 INFO:tasks.workunit.client.0.vm05.stdout:3/869: truncate d8/f3b 301954 0 2026-03-10T07:51:13.790 INFO:tasks.workunit.client.0.vm05.stdout:7/932: rename d1/d6/d47/d8d/le2 to d1/d34/d59/d60/d8c/l125 0 2026-03-10T07:51:13.790 INFO:tasks.workunit.client.0.vm05.stdout:5/938: dwrite d2/d4b/fcf [0,4194304] 0 2026-03-10T07:51:13.802 INFO:tasks.workunit.client.0.vm05.stdout:9/854: mkdir d8/d86/d28/d79/d57/de1/d1c/d120 0 2026-03-10T07:51:13.804 INFO:tasks.workunit.client.0.vm05.stdout:4/969: getdents d0/d6/d9/d8c/dbe 0 2026-03-10T07:51:13.805 INFO:tasks.workunit.client.0.vm05.stdout:6/944: mkdir d0/d11/d13c 0 2026-03-10T07:51:13.807 INFO:tasks.workunit.client.0.vm05.stdout:8/841: symlink d1/dd/d4d/d64/d8f/l107 0 2026-03-10T07:51:13.828 INFO:tasks.workunit.client.0.vm05.stdout:3/870: creat d8/dd5/f127 x:0 0 0 2026-03-10T07:51:13.828 INFO:tasks.workunit.client.0.vm05.stdout:2/956: rename d0/d8/d43/df/d4d/c5a to d0/d8/d66/dd1/d49/df9/db2/c137 0 2026-03-10T07:51:13.828 INFO:tasks.workunit.client.0.vm05.stdout:2/957: mkdir d0/d8/d43/d38/d138 0 2026-03-10T07:51:13.828 INFO:tasks.workunit.client.0.vm05.stdout:0/932: creat d8/dd/d10/f13e x:0 0 0 2026-03-10T07:51:13.837 INFO:tasks.workunit.client.0.vm05.stdout:0/933: creat d8/d9c/dc8/f13f 
x:0 0 0 2026-03-10T07:51:13.839 INFO:tasks.workunit.client.0.vm05.stdout:5/939: creat d2/d12/da8/ddd/f143 x:0 0 0 2026-03-10T07:51:13.841 INFO:tasks.workunit.client.0.vm05.stdout:0/934: dwrite d8/dd/d37/fd6 [0,4194304] 0 2026-03-10T07:51:13.843 INFO:tasks.workunit.client.0.vm05.stdout:0/935: chown d8/dd/d10/d26/d8b/d86/ff5 198002 1 2026-03-10T07:51:13.846 INFO:tasks.workunit.client.0.vm05.stdout:8/842: sync 2026-03-10T07:51:13.854 INFO:tasks.workunit.client.0.vm05.stdout:4/970: creat d0/d6/d9/d12/d45/d55/d44/f143 x:0 0 0 2026-03-10T07:51:13.855 INFO:tasks.workunit.client.0.vm05.stdout:4/971: chown d0/d6/d9/d12/d9c/db7/da7/f125 24390 1 2026-03-10T07:51:13.858 INFO:tasks.workunit.client.0.vm05.stdout:3/871: link d8/d1f/d24/d76/dc5/de1/d19/lb8 d8/d1c/db3/l128 0 2026-03-10T07:51:13.867 INFO:tasks.workunit.client.0.vm05.stdout:5/940: creat d2/d20/d33/d53/d7d/f144 x:0 0 0 2026-03-10T07:51:13.867 INFO:tasks.workunit.client.0.vm05.stdout:0/936: truncate d8/dd/d10/fd9 900015 0 2026-03-10T07:51:13.868 INFO:tasks.workunit.client.0.vm05.stdout:5/941: rmdir d2/d12/dda 39 2026-03-10T07:51:13.874 INFO:tasks.workunit.client.0.vm05.stdout:4/972: mkdir d0/d6/d9/d12/d45/d55/d144 0 2026-03-10T07:51:13.876 INFO:tasks.workunit.client.0.vm05.stdout:3/872: dread d8/d1f/d24/d76/dc5/de1/d52/fde [0,4194304] 0 2026-03-10T07:51:13.878 INFO:tasks.workunit.client.0.vm05.stdout:0/937: truncate d8/d9c/fd8 1064630 0 2026-03-10T07:51:13.882 INFO:tasks.workunit.client.0.vm05.stdout:8/843: creat d1/d6f/f108 x:0 0 0 2026-03-10T07:51:13.887 INFO:tasks.workunit.client.0.vm05.stdout:3/873: mknod d8/d1f/d108/c129 0 2026-03-10T07:51:13.895 INFO:tasks.workunit.client.0.vm05.stdout:3/874: sync 2026-03-10T07:51:13.902 INFO:tasks.workunit.client.0.vm05.stdout:2/958: dread d0/d8/d43/f1f [0,4194304] 0 2026-03-10T07:51:13.907 INFO:tasks.workunit.client.0.vm05.stdout:1/889: dwrite da/d26/d2b/d71/f7d [0,4194304] 0 2026-03-10T07:51:13.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:51:13 vm05.local 
ceph-mon[50387]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' 2026-03-10T07:51:13.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:51:13 vm05.local ceph-mon[50387]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' 2026-03-10T07:51:13.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:51:13 vm05.local ceph-mon[50387]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T07:51:13.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:51:13 vm05.local ceph-mon[50387]: pgmap v26: 65 pgs: 65 active+clean; 1.8 GiB data, 6.1 GiB used, 114 GiB / 120 GiB avail; 32 MiB/s rd, 69 MiB/s wr, 193 op/s 2026-03-10T07:51:13.918 INFO:tasks.workunit.client.0.vm05.stdout:7/933: write d1/d34/d59/fd1 [1850490,128917] 0 2026-03-10T07:51:13.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:51:13 vm08.local ceph-mon[59917]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' 2026-03-10T07:51:13.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:51:13 vm08.local ceph-mon[59917]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' 2026-03-10T07:51:13.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:51:13 vm08.local ceph-mon[59917]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T07:51:13.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:51:13 vm08.local ceph-mon[59917]: pgmap v26: 65 pgs: 65 active+clean; 1.8 GiB data, 6.1 GiB used, 114 GiB / 120 GiB avail; 32 MiB/s rd, 69 MiB/s wr, 193 op/s 2026-03-10T07:51:13.920 INFO:tasks.workunit.client.0.vm05.stdout:9/855: write d8/d86/d28/d79/d57/de1/d22/d33/fac [316402,110153] 0 2026-03-10T07:51:13.924 INFO:tasks.workunit.client.0.vm05.stdout:6/945: dwrite d0/d11/d2e/f7f [0,4194304] 0 2026-03-10T07:51:13.926 
INFO:tasks.workunit.client.0.vm05.stdout:5/942: link d2/f9 d2/d20/d135/f145 0 2026-03-10T07:51:13.930 INFO:tasks.workunit.client.0.vm05.stdout:8/844: truncate d1/dd/d4d/d64/d6a/de5/d2a/d48/d5a/f6d 550629 0 2026-03-10T07:51:13.932 INFO:tasks.workunit.client.0.vm05.stdout:4/973: symlink d0/d6/d9/d12/d45/l145 0 2026-03-10T07:51:13.934 INFO:tasks.workunit.client.0.vm05.stdout:3/875: mkdir d8/d1f/d24/d76/dc5/de1/d12a 0 2026-03-10T07:51:13.946 INFO:tasks.workunit.client.0.vm05.stdout:0/938: dwrite d8/dd/d10/d26/d8b/da4/fc4 [0,4194304] 0 2026-03-10T07:51:13.948 INFO:tasks.workunit.client.0.vm05.stdout:6/946: sync 2026-03-10T07:51:13.955 INFO:tasks.workunit.client.0.vm05.stdout:0/939: write d8/dd/d37/dfd/f119 [958962,106980] 0 2026-03-10T07:51:13.958 INFO:tasks.workunit.client.0.vm05.stdout:7/934: mknod d1/d3c/d71/d79/d8a/dac/c126 0 2026-03-10T07:51:13.969 INFO:tasks.workunit.client.0.vm05.stdout:5/943: symlink d2/d12/da8/ddd/de9/l146 0 2026-03-10T07:51:13.969 INFO:tasks.workunit.client.0.vm05.stdout:8/845: creat d1/dd/d4d/d64/d6a/de5/f109 x:0 0 0 2026-03-10T07:51:13.969 INFO:tasks.workunit.client.0.vm05.stdout:4/974: dread - d0/d6/d9/d12/d4f/f116 zero size 2026-03-10T07:51:13.971 INFO:tasks.workunit.client.0.vm05.stdout:4/975: dread - d0/d6/d9/d12/d45/d55/d44/f143 zero size 2026-03-10T07:51:13.975 INFO:tasks.workunit.client.0.vm05.stdout:2/959: creat d0/d8/d43/df/d8b/d12a/f139 x:0 0 0 2026-03-10T07:51:13.976 INFO:tasks.workunit.client.0.vm05.stdout:2/960: chown d0/d8/d66/dd1/d49/df9/l3f 85 1 2026-03-10T07:51:13.986 INFO:tasks.workunit.client.0.vm05.stdout:6/947: creat d0/d11/d22/d6c/d84/dc4/f13d x:0 0 0 2026-03-10T07:51:13.990 INFO:tasks.workunit.client.0.vm05.stdout:1/890: creat da/dd/d12/d34/ddb/def/f116 x:0 0 0 2026-03-10T07:51:13.990 INFO:tasks.workunit.client.0.vm05.stdout:6/948: chown d0/d11/d86/d12e/dd2/d101/f12c 2205 1 2026-03-10T07:51:13.997 INFO:tasks.workunit.client.0.vm05.stdout:9/856: symlink d8/d86/dfe/l121 0 2026-03-10T07:51:14.002 
INFO:tasks.workunit.client.0.vm05.stdout:8/846: rename d1/dd/d4d/d64/d6a/de5/d2a/d34/l62 to d1/dd/d4d/dcc/dbd/l10a 0 2026-03-10T07:51:14.003 INFO:tasks.workunit.client.0.vm05.stdout:8/847: dread - d1/dd/d4d/d64/d6a/de5/f109 zero size 2026-03-10T07:51:14.003 INFO:tasks.workunit.client.0.vm05.stdout:9/857: write d8/d86/d28/d79/d57/de1/d1c/d20/fbd [4153919,9701] 0 2026-03-10T07:51:14.013 INFO:tasks.workunit.client.0.vm05.stdout:2/961: truncate d0/d8/d3d/fdd 2124074 0 2026-03-10T07:51:14.019 INFO:tasks.workunit.client.0.vm05.stdout:8/848: dread d1/dd/d4d/d64/d6a/de5/d2a/f54 [0,4194304] 0 2026-03-10T07:51:14.025 INFO:tasks.workunit.client.0.vm05.stdout:6/949: creat d0/d11/d31/f13e x:0 0 0 2026-03-10T07:51:14.045 INFO:tasks.workunit.client.0.vm05.stdout:9/858: creat d8/d86/d95/f122 x:0 0 0 2026-03-10T07:51:14.048 INFO:tasks.workunit.client.0.vm05.stdout:3/876: dwrite d8/d1f/d2a/d96/da9/fb5 [0,4194304] 0 2026-03-10T07:51:14.051 INFO:tasks.workunit.client.0.vm05.stdout:2/962: unlink d0/d8/d66/dd1/d49/df9/da5/da8/c11e 0 2026-03-10T07:51:14.063 INFO:tasks.workunit.client.0.vm05.stdout:8/849: unlink d1/dc9/f31 0 2026-03-10T07:51:14.064 INFO:tasks.workunit.client.0.vm05.stdout:0/940: creat d8/dd/f140 x:0 0 0 2026-03-10T07:51:14.065 INFO:tasks.workunit.client.0.vm05.stdout:0/941: chown d8/dd/d10/c44 7971908 1 2026-03-10T07:51:14.073 INFO:tasks.workunit.client.0.vm05.stdout:7/935: rmdir d1/d108/d120 0 2026-03-10T07:51:14.075 INFO:tasks.workunit.client.0.vm05.stdout:5/944: creat d2/d20/f147 x:0 0 0 2026-03-10T07:51:14.079 INFO:tasks.workunit.client.0.vm05.stdout:4/976: rename d0/d6/d9/d12/d9c/db7/da7/d5c/l81 to d0/d6/d9/d12/d69/ddc/l146 0 2026-03-10T07:51:14.087 INFO:tasks.workunit.client.0.vm05.stdout:2/963: dread - d0/d8/fcd zero size 2026-03-10T07:51:14.095 INFO:tasks.workunit.client.0.vm05.stdout:8/850: truncate d1/dd/d4d/d64/d6a/de5/d2a/d34/fc6 1623719 0 2026-03-10T07:51:14.100 INFO:tasks.workunit.client.0.vm05.stdout:0/942: unlink d8/dd/d10/d26/d48/ff0 0 
2026-03-10T07:51:14.105 INFO:tasks.workunit.client.0.vm05.stdout:0/943: readlink d8/dd/d10/d26/d3a/d5e/d63/l78 0 2026-03-10T07:51:14.119 INFO:tasks.workunit.client.0.vm05.stdout:6/950: rename d0/d11/d22/d6c/d84/c127 to d0/d11/d57/da4/c13f 0 2026-03-10T07:51:14.124 INFO:tasks.workunit.client.0.vm05.stdout:7/936: dwrite d1/d34/f7d [4194304,4194304] 0 2026-03-10T07:51:14.178 INFO:tasks.workunit.client.0.vm05.stdout:4/977: stat d0/d6/d9/d12/c11f 0 2026-03-10T07:51:14.180 INFO:tasks.workunit.client.0.vm05.stdout:3/877: symlink d8/d1f/d2a/d34/dbd/d116/d121/l12b 0 2026-03-10T07:51:14.183 INFO:tasks.workunit.client.0.vm05.stdout:1/891: getdents da/d26/d2b/d89 0 2026-03-10T07:51:14.187 INFO:tasks.workunit.client.0.vm05.stdout:0/944: dread - d8/dd/d37/d81/fce zero size 2026-03-10T07:51:14.187 INFO:tasks.workunit.client.0.vm05.stdout:8/851: stat d1/dd/d4d/d64/d6a/de5/d2a/f3a 0 2026-03-10T07:51:14.196 INFO:tasks.workunit.client.0.vm05.stdout:4/978: dwrite d0/d6/d9/d5a/d6e/dd1/f13d [0,4194304] 0 2026-03-10T07:51:14.200 INFO:tasks.workunit.client.0.vm05.stdout:0/945: dwrite d8/dd/d10/d26/d2a/f94 [0,4194304] 0 2026-03-10T07:51:14.227 INFO:tasks.workunit.client.0.vm05.stdout:6/951: rename d0/d11/d4f/da0/da6/lc1 to d0/d11/d2e/l140 0 2026-03-10T07:51:14.228 INFO:tasks.workunit.client.0.vm05.stdout:9/859: link d8/d86/d28/d79/d57/de1/d6b/cf3 d8/d86/d28/d79/d57/de1/d22/d33/d70/c123 0 2026-03-10T07:51:14.229 INFO:tasks.workunit.client.0.vm05.stdout:3/878: creat d8/d1c/d109/f12c x:0 0 0 2026-03-10T07:51:14.233 INFO:tasks.workunit.client.0.vm05.stdout:1/892: creat da/dd/d2a/d55/d64/f117 x:0 0 0 2026-03-10T07:51:14.235 INFO:tasks.workunit.client.0.vm05.stdout:1/893: readlink da/dd/d12/lff 0 2026-03-10T07:51:14.243 INFO:tasks.workunit.client.0.vm05.stdout:5/945: truncate d2/d4b/fcf 1082385 0 2026-03-10T07:51:14.249 INFO:tasks.workunit.client.0.vm05.stdout:0/946: symlink d8/dd/d10/d26/d3a/d5e/d63/l141 0 2026-03-10T07:51:14.255 INFO:tasks.workunit.client.0.vm05.stdout:6/952: dread 
d0/d11/d2e/fbc [0,4194304] 0 2026-03-10T07:51:14.257 INFO:tasks.workunit.client.0.vm05.stdout:2/964: write d0/d8/d3d/fdd [2294045,44217] 0 2026-03-10T07:51:14.260 INFO:tasks.workunit.client.0.vm05.stdout:7/937: dwrite d1/d6/d3b/ffa [0,4194304] 0 2026-03-10T07:51:14.271 INFO:tasks.workunit.client.0.vm05.stdout:2/965: readlink d0/d52/l121 0 2026-03-10T07:51:14.271 INFO:tasks.workunit.client.0.vm05.stdout:4/979: dread d0/d6/d9/d12/d9c/fff [0,4194304] 0 2026-03-10T07:51:14.272 INFO:tasks.workunit.client.0.vm05.stdout:9/860: unlink d8/d86/d28/d79/d57/de1/d22/f9b 0 2026-03-10T07:51:14.272 INFO:tasks.workunit.client.0.vm05.stdout:1/894: unlink da/dd/c85 0 2026-03-10T07:51:14.272 INFO:tasks.workunit.client.0.vm05.stdout:2/966: dwrite d0/d8/d43/da4/dea/f119 [0,4194304] 0 2026-03-10T07:51:14.281 INFO:tasks.workunit.client.0.vm05.stdout:5/946: sync 2026-03-10T07:51:14.301 INFO:tasks.workunit.client.0.vm05.stdout:8/852: creat d1/d6f/f10b x:0 0 0 2026-03-10T07:51:14.304 INFO:tasks.workunit.client.0.vm05.stdout:9/861: unlink d8/d86/d28/d79/d57/lb9 0 2026-03-10T07:51:14.314 INFO:tasks.workunit.client.0.vm05.stdout:1/895: rename da/dd/d2a/d55/d64/f117 to da/dd/d12/d34/d58/d10a/f118 0 2026-03-10T07:51:14.314 INFO:tasks.workunit.client.0.vm05.stdout:2/967: dread d0/d52/f88 [0,4194304] 0 2026-03-10T07:51:14.314 INFO:tasks.workunit.client.0.vm05.stdout:1/896: stat da/dd/d12/d86/lba 0 2026-03-10T07:51:14.314 INFO:tasks.workunit.client.0.vm05.stdout:0/947: symlink d8/dd/d10/d26/l142 0 2026-03-10T07:51:14.314 INFO:tasks.workunit.client.0.vm05.stdout:6/953: truncate d0/f15 4930649 0 2026-03-10T07:51:14.314 INFO:tasks.workunit.client.0.vm05.stdout:6/954: write d0/d11/d57/d60/f74 [1637300,119439] 0 2026-03-10T07:51:14.318 INFO:tasks.workunit.client.0.vm05.stdout:1/897: dwrite da/dd/d2a/d55/d64/d104/f4d [4194304,4194304] 0 2026-03-10T07:51:14.324 INFO:tasks.workunit.client.0.vm05.stdout:9/862: creat d8/d86/d28/d79/d57/de1/d1c/d20/f124 x:0 0 0 2026-03-10T07:51:14.329 
INFO:tasks.workunit.client.0.vm05.stdout:2/968: dwrite d0/d8/d3d/fdd [0,4194304] 0 2026-03-10T07:51:14.334 INFO:tasks.workunit.client.0.vm05.stdout:3/879: dwrite d8/d22/fe2 [0,4194304] 0 2026-03-10T07:51:14.355 INFO:tasks.workunit.client.0.vm05.stdout:5/947: rename d2/d5/f10 to d2/d12/dda/f148 0 2026-03-10T07:51:14.355 INFO:tasks.workunit.client.0.vm05.stdout:1/898: symlink da/dd/d12/d34/d107/dbe/l119 0 2026-03-10T07:51:14.358 INFO:tasks.workunit.client.0.vm05.stdout:1/899: truncate da/d26/d2b/f114 938475 0 2026-03-10T07:51:14.365 INFO:tasks.workunit.client.0.vm05.stdout:2/969: mkdir d0/d8/d43/df/d53/d13a 0 2026-03-10T07:51:14.388 INFO:tasks.workunit.client.0.vm05.stdout:0/948: symlink d8/l143 0 2026-03-10T07:51:14.391 INFO:tasks.workunit.client.0.vm05.stdout:2/970: sync 2026-03-10T07:51:14.393 INFO:tasks.workunit.client.0.vm05.stdout:7/938: link d1/l6b d1/d3c/db8/l127 0 2026-03-10T07:51:14.394 INFO:tasks.workunit.client.0.vm05.stdout:7/939: stat d1/d108/l113 0 2026-03-10T07:51:14.401 INFO:tasks.workunit.client.0.vm05.stdout:6/955: rename d0/d11/d86/d12e to d0/d11/d57/da4/db3/de7/d141 0 2026-03-10T07:51:14.403 INFO:tasks.workunit.client.0.vm05.stdout:2/971: write d0/d8/d43/da4/dea/f119 [4296998,77712] 0 2026-03-10T07:51:14.404 INFO:tasks.workunit.client.0.vm05.stdout:9/863: creat d8/d86/d28/d79/d57/dbc/d11a/f125 x:0 0 0 2026-03-10T07:51:14.406 INFO:tasks.workunit.client.0.vm05.stdout:9/864: write d8/d86/d28/d79/d57/de1/d6b/f97 [3407931,3428] 0 2026-03-10T07:51:14.408 INFO:tasks.workunit.client.0.vm05.stdout:5/948: readlink d2/d12/d4d/lb8 0 2026-03-10T07:51:14.415 INFO:tasks.workunit.client.0.vm05.stdout:0/949: fsync d8/dd/d10/d26/d8b/da4/f3d 0 2026-03-10T07:51:14.418 INFO:tasks.workunit.client.0.vm05.stdout:4/980: write d0/d6/d9/d12/d9c/db7/da7/f53 [2208840,58906] 0 2026-03-10T07:51:14.419 INFO:tasks.workunit.client.0.vm05.stdout:4/981: write d0/d6/d9/d12/d9c/db7/feb [1072849,100034] 0 2026-03-10T07:51:14.428 INFO:tasks.workunit.client.0.vm05.stdout:7/940: rename 
d1/d3c/d71/d79/ff5 to d1/d3c/d4b/da6/f128 0 2026-03-10T07:51:14.432 INFO:tasks.workunit.client.0.vm05.stdout:8/853: write d1/dd/d4d/d64/d6a/de5/d2a/d48/d7c/d9c/fab [224773,105896] 0 2026-03-10T07:51:14.445 INFO:tasks.workunit.client.0.vm05.stdout:3/880: dwrite d8/fe [0,4194304] 0 2026-03-10T07:51:14.455 INFO:tasks.workunit.client.0.vm05.stdout:5/949: dread - d2/d4b/ff9 zero size 2026-03-10T07:51:14.455 INFO:tasks.workunit.client.0.vm05.stdout:0/950: dread - d8/dd/d10/d26/d48/fdd zero size 2026-03-10T07:51:14.462 INFO:tasks.workunit.client.0.vm05.stdout:4/982: dread d0/d6/f94 [0,4194304] 0 2026-03-10T07:51:14.463 INFO:tasks.workunit.client.0.vm05.stdout:1/900: dwrite da/dd/d2a/f75 [0,4194304] 0 2026-03-10T07:51:14.468 INFO:tasks.workunit.client.0.vm05.stdout:2/972: symlink d0/d8/def/l13b 0 2026-03-10T07:51:14.477 INFO:tasks.workunit.client.0.vm05.stdout:9/865: symlink d8/d86/d28/d79/d57/de1/d1c/l126 0 2026-03-10T07:51:14.477 INFO:tasks.workunit.client.0.vm05.stdout:4/983: write d0/d6/d60/faf [887429,87028] 0 2026-03-10T07:51:14.478 INFO:tasks.workunit.client.0.vm05.stdout:3/881: rename d8/d8f/fb4 to d8/d1f/d24/d76/dc5/de1/d12a/f12d 0 2026-03-10T07:51:14.478 INFO:tasks.workunit.client.0.vm05.stdout:9/866: readlink d8/d86/d28/d79/d57/de1/d1c/d20/d59/d8b/ld9 0 2026-03-10T07:51:14.478 INFO:tasks.workunit.client.0.vm05.stdout:0/951: truncate d8/dd/d10/d26/d3a/d5e/ff9 1060788 0 2026-03-10T07:51:14.483 INFO:tasks.workunit.client.0.vm05.stdout:8/854: mkdir d1/dd/d4d/d64/d6a/de5/d2a/d34/d49/d10c 0 2026-03-10T07:51:14.483 INFO:tasks.workunit.client.0.vm05.stdout:3/882: rmdir d8 39 2026-03-10T07:51:14.485 INFO:tasks.workunit.client.0.vm05.stdout:1/901: creat da/dd/d2a/d55/d64/dd1/f11a x:0 0 0 2026-03-10T07:51:14.486 INFO:tasks.workunit.client.0.vm05.stdout:9/867: unlink d8/d86/d28/d79/f44 0 2026-03-10T07:51:14.486 INFO:tasks.workunit.client.0.vm05.stdout:4/984: sync 2026-03-10T07:51:14.487 INFO:tasks.workunit.client.0.vm05.stdout:1/902: chown da/d26/d2b/d89/fb1 20206 1 
2026-03-10T07:51:14.489 INFO:tasks.workunit.client.0.vm05.stdout:0/952: rename d8/dd/d10/d26/d3a/l97 to d8/dd/d37/d81/l144 0 2026-03-10T07:51:14.489 INFO:tasks.workunit.client.0.vm05.stdout:4/985: readlink d0/d6/d9/l27 0 2026-03-10T07:51:14.490 INFO:tasks.workunit.client.0.vm05.stdout:6/956: getdents d0/d11/d57/d60 0 2026-03-10T07:51:14.493 INFO:tasks.workunit.client.0.vm05.stdout:2/973: mknod d0/d8/d66/dd1/d49/db1/d107/d127/d129/c13c 0 2026-03-10T07:51:14.494 INFO:tasks.workunit.client.0.vm05.stdout:8/855: mknod d1/dd/d4d/c10d 0 2026-03-10T07:51:14.497 INFO:tasks.workunit.client.0.vm05.stdout:5/950: dread d2/d20/d4c/f9a [0,4194304] 0 2026-03-10T07:51:14.499 INFO:tasks.workunit.client.0.vm05.stdout:9/868: dread - d8/d86/d95/ff2 zero size 2026-03-10T07:51:14.499 INFO:tasks.workunit.client.0.vm05.stdout:1/903: symlink da/d26/d2b/d89/l11b 0 2026-03-10T07:51:14.499 INFO:tasks.workunit.client.0.vm05.stdout:6/957: symlink d0/d11/d2e/d81/d92/dc2/l142 0 2026-03-10T07:51:14.502 INFO:tasks.workunit.client.0.vm05.stdout:3/883: creat d8/d1f/d2a/d34/dbd/d116/d121/f12e x:0 0 0 2026-03-10T07:51:14.505 INFO:tasks.workunit.client.0.vm05.stdout:0/953: mknod d8/dd/d37/d11f/c145 0 2026-03-10T07:51:14.506 INFO:tasks.workunit.client.0.vm05.stdout:6/958: mknod d0/d6/c143 0 2026-03-10T07:51:14.506 INFO:tasks.workunit.client.0.vm05.stdout:2/974: creat d0/d8/d43/da4/dea/d132/f13d x:0 0 0 2026-03-10T07:51:14.506 INFO:tasks.workunit.client.0.vm05.stdout:8/856: rename d1/d52/dd3 to d1/dd/d4d/dcc/dbd/d10e 0 2026-03-10T07:51:14.508 INFO:tasks.workunit.client.0.vm05.stdout:3/884: read d8/d1f/d24/d76/dc5/de1/d19/f77 [63103,29390] 0 2026-03-10T07:51:14.511 INFO:tasks.workunit.client.0.vm05.stdout:1/904: getdents da/dd/d12/d113 0 2026-03-10T07:51:14.515 INFO:tasks.workunit.client.0.vm05.stdout:0/954: creat d8/dd/d10/d26/d8b/d86/f146 x:0 0 0 2026-03-10T07:51:14.515 INFO:tasks.workunit.client.0.vm05.stdout:9/869: rename d8/d86/d28/d79/d57/de1/d1c/d75/ffc to d8/d86/d28/d79/d57/de1/d38/d71/d81/d11c/f127 
0 2026-03-10T07:51:14.517 INFO:tasks.workunit.client.0.vm05.stdout:2/975: mknod d0/d8/d66/dd1/d49/df9/db2/c13e 0 2026-03-10T07:51:14.517 INFO:tasks.workunit.client.0.vm05.stdout:9/870: readlink d8/d86/d28/d79/d57/de1/d22/d33/l7f 0 2026-03-10T07:51:14.521 INFO:tasks.workunit.client.0.vm05.stdout:6/959: mkdir d0/d11/d57/da4/db3/d144 0 2026-03-10T07:51:14.522 INFO:tasks.workunit.client.0.vm05.stdout:9/871: chown d8/d86/d28/d79/d57/de1/d38/d71/d81/f83 932893 1 2026-03-10T07:51:14.527 INFO:tasks.workunit.client.0.vm05.stdout:3/885: symlink d8/d1f/d24/d76/dc5/de1/d19/d37/l12f 0 2026-03-10T07:51:14.534 INFO:tasks.workunit.client.0.vm05.stdout:5/951: getdents d2/d12/da8 0 2026-03-10T07:51:14.534 INFO:tasks.workunit.client.0.vm05.stdout:1/905: dread da/dd/d12/d86/d9a/fd0 [0,4194304] 0 2026-03-10T07:51:14.535 INFO:tasks.workunit.client.0.vm05.stdout:0/955: dread - d8/d9c/fd0 zero size 2026-03-10T07:51:14.539 INFO:tasks.workunit.client.0.vm05.stdout:2/976: unlink d0/d8/d66/dd1/d49/df9/da5/f109 0 2026-03-10T07:51:14.542 INFO:tasks.workunit.client.0.vm05.stdout:9/872: rmdir d8/d86/d28/d79/d57/de1/d1c/d20/d59/d8b 39 2026-03-10T07:51:14.544 INFO:tasks.workunit.client.0.vm05.stdout:1/906: sync 2026-03-10T07:51:14.555 INFO:tasks.workunit.client.0.vm05.stdout:7/941: dwrite d1/d3c/d4b/fc8 [0,4194304] 0 2026-03-10T07:51:14.555 INFO:tasks.workunit.client.0.vm05.stdout:6/960: fsync d0/d6/f24 0 2026-03-10T07:51:14.557 INFO:tasks.workunit.client.0.vm05.stdout:2/977: mknod d0/d113/c13f 0 2026-03-10T07:51:14.562 INFO:tasks.workunit.client.0.vm05.stdout:7/942: sync 2026-03-10T07:51:14.562 INFO:tasks.workunit.client.0.vm05.stdout:7/943: write d1/d6/d80/d82/ffd [4000931,100377] 0 2026-03-10T07:51:14.569 INFO:tasks.workunit.client.0.vm05.stdout:5/952: truncate d2/d4b/fcf 1486307 0 2026-03-10T07:51:14.569 INFO:tasks.workunit.client.0.vm05.stdout:9/873: creat d8/d86/d28/d79/d57/de1/d6b/dde/f128 x:0 0 0 2026-03-10T07:51:14.570 INFO:tasks.workunit.client.0.vm05.stdout:4/986: write 
d0/d6/d9/d12/d9c/db7/da7/d96/f118 [1018894,59656] 0 2026-03-10T07:51:14.572 INFO:tasks.workunit.client.0.vm05.stdout:0/956: link d8/dd/f140 d8/d9c/dc8/d10f/d127/d12f/dc3/d10b/f147 0 2026-03-10T07:51:14.581 INFO:tasks.workunit.client.0.vm05.stdout:1/907: dwrite da/d26/d2b/f114 [0,4194304] 0 2026-03-10T07:51:14.584 INFO:tasks.workunit.client.0.vm05.stdout:2/978: mkdir d0/d8/d43/df/d8b/d140 0 2026-03-10T07:51:14.588 INFO:tasks.workunit.client.0.vm05.stdout:1/908: truncate da/d26/d9e/fa1 4594012 0 2026-03-10T07:51:14.588 INFO:tasks.workunit.client.0.vm05.stdout:1/909: fdatasync da/fc 0 2026-03-10T07:51:14.591 INFO:tasks.workunit.client.0.vm05.stdout:1/910: chown da/d26/d2b/fb0 107 1 2026-03-10T07:51:14.591 INFO:tasks.workunit.client.0.vm05.stdout:3/886: dread d8/d1f/d2a/d34/fae [0,4194304] 0 2026-03-10T07:51:14.592 INFO:tasks.workunit.client.0.vm05.stdout:3/887: stat d8/d22/d60/df7 0 2026-03-10T07:51:14.593 INFO:tasks.workunit.client.0.vm05.stdout:5/953: rmdir d2/d5 39 2026-03-10T07:51:14.601 INFO:tasks.workunit.client.0.vm05.stdout:8/857: dwrite d1/dd/d4d/dcc/dbd/d10e/fd7 [0,4194304] 0 2026-03-10T07:51:14.606 INFO:tasks.workunit.client.0.vm05.stdout:4/987: rename d0/d6/d95/f3a to d0/d6/d9/d12/d45/d55/d4e/dd2/f147 0 2026-03-10T07:51:14.608 INFO:tasks.workunit.client.0.vm05.stdout:2/979: mknod d0/d8/d43/df/d4e/d10a/c141 0 2026-03-10T07:51:14.609 INFO:tasks.workunit.client.0.vm05.stdout:5/954: sync 2026-03-10T07:51:14.609 INFO:tasks.workunit.client.0.vm05.stdout:2/980: write d0/d8/d43/df/feb [1168904,89841] 0 2026-03-10T07:51:14.617 INFO:tasks.workunit.client.0.vm05.stdout:9/874: symlink d8/d86/l129 0 2026-03-10T07:51:14.617 INFO:tasks.workunit.client.0.vm05.stdout:3/888: mkdir d8/d1f/d24/d76/dc5/de1/d52/d7b/d130 0 2026-03-10T07:51:14.633 INFO:tasks.workunit.client.0.vm05.stdout:2/981: rename d0/d8/d66/dd1/d49/db1/d107/d127 to d0/d8/d43/df/d53/d13a/d142 0 2026-03-10T07:51:14.636 INFO:tasks.workunit.client.0.vm05.stdout:2/982: write d0/f4 [3723742,106461] 0 
2026-03-10T07:51:14.639 INFO:tasks.workunit.client.0.vm05.stdout:9/875: creat d8/d86/d28/d79/d57/de1/d22/d33/d62/de0/f12a x:0 0 0 2026-03-10T07:51:14.643 INFO:tasks.workunit.client.0.vm05.stdout:3/889: read d8/f5c [2327253,29104] 0 2026-03-10T07:51:14.644 INFO:tasks.workunit.client.0.vm05.stdout:3/890: chown d8/d1f 3435076 1 2026-03-10T07:51:14.647 INFO:tasks.workunit.client.0.vm05.stdout:2/983: truncate d0/d8/d43/da4/ff1 1293303 0 2026-03-10T07:51:14.657 INFO:tasks.workunit.client.0.vm05.stdout:3/891: read - d8/d1f/d2a/d4a/d7d/fee zero size 2026-03-10T07:51:14.661 INFO:tasks.workunit.client.0.vm05.stdout:8/858: dread d1/dd/d4d/d64/d6a/de5/d2a/d34/d49/d5d/ff8 [0,4194304] 0 2026-03-10T07:51:14.661 INFO:tasks.workunit.client.0.vm05.stdout:5/955: dwrite d2/d20/d33/d53/f97 [4194304,4194304] 0 2026-03-10T07:51:14.664 INFO:tasks.workunit.client.0.vm05.stdout:2/984: rmdir d0/d8/d43/df 39 2026-03-10T07:51:14.664 INFO:tasks.workunit.client.0.vm05.stdout:1/911: dwrite da/f5c [4194304,4194304] 0 2026-03-10T07:51:14.664 INFO:tasks.workunit.client.0.vm05.stdout:2/985: stat d0/d8/d66/dd1/fa6 0 2026-03-10T07:51:14.667 INFO:tasks.workunit.client.0.vm05.stdout:3/892: stat d8/d1c/d48/d69/cf5 0 2026-03-10T07:51:14.667 INFO:tasks.workunit.client.0.vm05.stdout:0/957: dwrite d8/dd/d10/d26/d8b/d70/fbe [0,4194304] 0 2026-03-10T07:51:14.672 INFO:tasks.workunit.client.0.vm05.stdout:3/893: chown d8/dd5/l11c 5 1 2026-03-10T07:51:14.672 INFO:tasks.workunit.client.0.vm05.stdout:6/961: dwrite d0/d11/d86/f113 [0,4194304] 0 2026-03-10T07:51:14.672 INFO:tasks.workunit.client.0.vm05.stdout:2/986: stat d0/d8/d43/dc9 0 2026-03-10T07:51:14.677 INFO:tasks.workunit.client.0.vm05.stdout:7/944: dwrite d1/d3c/d71/d79/d8a/dac/fcf [0,4194304] 0 2026-03-10T07:51:14.688 INFO:tasks.workunit.client.0.vm05.stdout:4/988: link d0/d6/l112 d0/d6/d6f/l148 0 2026-03-10T07:51:14.697 INFO:tasks.workunit.client.0.vm05.stdout:5/956: symlink d2/d20/d77/l149 0 2026-03-10T07:51:14.697 
INFO:tasks.workunit.client.0.vm05.stdout:3/894: mkdir d8/d22/d60/d6e/dca/dda/d131 0 2026-03-10T07:51:14.698 INFO:tasks.workunit.client.0.vm05.stdout:6/962: mknod d0/d11/d22/d6c/d84/dc4/c145 0 2026-03-10T07:51:14.698 INFO:tasks.workunit.client.0.vm05.stdout:0/958: readlink d8/dd/d10/d26/d2a/d13a/lef 0 2026-03-10T07:51:14.707 INFO:tasks.workunit.client.0.vm05.stdout:7/945: rename d1/d34/lea to d1/d3c/d4b/da6/l129 0 2026-03-10T07:51:14.707 INFO:tasks.workunit.client.0.vm05.stdout:4/989: unlink d0/d6/d60/dde/l103 0 2026-03-10T07:51:14.707 INFO:tasks.workunit.client.0.vm05.stdout:9/876: link d8/d86/d28/d79/d57/de1/d1c/d20/d54/c56 d8/d86/d28/d79/d57/de1/d1c/d120/c12b 0 2026-03-10T07:51:14.708 INFO:tasks.workunit.client.0.vm05.stdout:1/912: unlink da/dd/d2a/d55/ce4 0 2026-03-10T07:51:14.708 INFO:tasks.workunit.client.0.vm05.stdout:3/895: symlink d8/d1f/d2a/d4a/d8c/l132 0 2026-03-10T07:51:14.710 INFO:tasks.workunit.client.0.vm05.stdout:6/963: creat d0/d11/d2e/d81/d92/dc2/f146 x:0 0 0 2026-03-10T07:51:14.711 INFO:tasks.workunit.client.0.vm05.stdout:7/946: readlink d1/d34/l39 0 2026-03-10T07:51:14.714 INFO:tasks.workunit.client.0.vm05.stdout:6/964: readlink d0/d11/d22/d6c/d84/dc4/lcb 0 2026-03-10T07:51:14.715 INFO:tasks.workunit.client.0.vm05.stdout:6/965: write d0/d11/d57/d60/f74 [2767783,5863] 0 2026-03-10T07:51:14.728 INFO:tasks.workunit.client.0.vm05.stdout:7/947: write d1/d6/d80/f10d [455362,27278] 0 2026-03-10T07:51:14.728 INFO:tasks.workunit.client.0.vm05.stdout:2/987: mkdir d0/d8/d43/df/d4e/d122/d143 0 2026-03-10T07:51:14.728 INFO:tasks.workunit.client.0.vm05.stdout:8/859: rename d1/dd/d5e/fee to d1/dd/d4d/d64/d8f/f10f 0 2026-03-10T07:51:14.728 INFO:tasks.workunit.client.0.vm05.stdout:4/990: mkdir d0/d11c/d149 0 2026-03-10T07:51:14.728 INFO:tasks.workunit.client.0.vm05.stdout:2/988: readlink d0/d8/d43/dc9/lcc 0 2026-03-10T07:51:14.728 INFO:tasks.workunit.client.0.vm05.stdout:5/957: symlink d2/d12/dda/da1/dc0/l14a 0 2026-03-10T07:51:14.728 
INFO:tasks.workunit.client.0.vm05.stdout:2/989: symlink d0/d2a/l144 0 2026-03-10T07:51:14.728 INFO:tasks.workunit.client.0.vm05.stdout:7/948: creat d1/d6/d80/d82/f12a x:0 0 0 2026-03-10T07:51:14.728 INFO:tasks.workunit.client.0.vm05.stdout:2/990: read d0/d8/f2d [185399,101916] 0 2026-03-10T07:51:14.728 INFO:tasks.workunit.client.0.vm05.stdout:2/991: stat d0/d8/d66/dd1/d49/df9/db2 0 2026-03-10T07:51:14.729 INFO:tasks.workunit.client.0.vm05.stdout:9/877: dwrite d8/d86/d28/d79/d57/de1/d1c/d20/f124 [0,4194304] 0 2026-03-10T07:51:14.729 INFO:tasks.workunit.client.0.vm05.stdout:6/966: dwrite d0/d11/d57/f11f [0,4194304] 0 2026-03-10T07:51:14.757 INFO:tasks.workunit.client.0.vm05.stdout:9/878: creat d8/d86/d28/d79/d57/de1/d1c/d20/d54/f12c x:0 0 0 2026-03-10T07:51:14.765 INFO:tasks.workunit.client.0.vm05.stdout:0/959: rename d8/dd/d10/d26/d8b/da4/ddf/cfe to d8/c148 0 2026-03-10T07:51:14.766 INFO:tasks.workunit.client.0.vm05.stdout:0/960: read d8/dd/f59 [788133,28397] 0 2026-03-10T07:51:14.777 INFO:tasks.workunit.client.0.vm05.stdout:4/991: creat d0/d6/d9/f14a x:0 0 0 2026-03-10T07:51:14.777 INFO:tasks.workunit.client.0.vm05.stdout:2/992: mkdir d0/d8/d43/d38/d138/d145 0 2026-03-10T07:51:14.777 INFO:tasks.workunit.client.0.vm05.stdout:4/992: rmdir d0/d6/d9/d12/d69 39 2026-03-10T07:51:14.777 INFO:tasks.workunit.client.0.vm05.stdout:0/961: creat d8/dd/d10/d26/d8b/d86/f149 x:0 0 0 2026-03-10T07:51:14.778 INFO:tasks.workunit.client.0.vm05.stdout:8/860: link d1/d45/f81 d1/d45/dfb/f110 0 2026-03-10T07:51:14.780 INFO:tasks.workunit.client.0.vm05.stdout:1/913: rename da/caa to da/d26/d2b/d71/c11c 0 2026-03-10T07:51:14.783 INFO:tasks.workunit.client.0.vm05.stdout:4/993: creat d0/d28/f14b x:0 0 0 2026-03-10T07:51:14.785 INFO:tasks.workunit.client.0.vm05.stdout:8/861: creat d1/dd/d4d/dcc/f111 x:0 0 0 2026-03-10T07:51:14.787 INFO:tasks.workunit.client.0.vm05.stdout:9/879: creat d8/d86/d28/d79/d57/de1/d1c/d20/d59/f12d x:0 0 0 2026-03-10T07:51:14.789 
INFO:tasks.workunit.client.0.vm05.stdout:3/896: rename d8/d1f/d24/d8a/l10b to d8/d1f/d2a/d34/dbd/d116/l133 0 2026-03-10T07:51:14.793 INFO:tasks.workunit.client.0.vm05.stdout:4/994: rename d0/d6/d9/d5a/l30 to d0/d6/d9/d12/d4f/d12e/l14c 0 2026-03-10T07:51:14.796 INFO:tasks.workunit.client.0.vm05.stdout:1/914: mknod da/dd/d2a/d55/d64/dac/dfa/c11d 0 2026-03-10T07:51:14.796 INFO:tasks.workunit.client.0.vm05.stdout:8/862: dread d1/dd/d4d/d64/fe2 [0,4194304] 0 2026-03-10T07:51:14.797 INFO:tasks.workunit.client.0.vm05.stdout:1/915: write da/dd/d12/d34/ddb/fc5 [1407233,12155] 0 2026-03-10T07:51:14.797 INFO:tasks.workunit.client.0.vm05.stdout:8/863: chown d1/dd/d4d/d64/d6a/de5/d2a/d34/cea 835 1 2026-03-10T07:51:14.798 INFO:tasks.workunit.client.0.vm05.stdout:1/916: readlink da/dd/d2a/d55/d102/l108 0 2026-03-10T07:51:14.802 INFO:tasks.workunit.client.0.vm05.stdout:9/880: mkdir d8/d86/d28/d79/d57/de1/d38/d71/d12e 0 2026-03-10T07:51:14.805 INFO:tasks.workunit.client.0.vm05.stdout:1/917: creat da/d26/f11e x:0 0 0 2026-03-10T07:51:14.806 INFO:tasks.workunit.client.0.vm05.stdout:4/995: mkdir d0/d6/d60/d14d 0 2026-03-10T07:51:14.807 INFO:tasks.workunit.client.0.vm05.stdout:9/881: unlink d8/d86/d28/d79/d57/de1/d22/d33/d47/f5f 0 2026-03-10T07:51:14.808 INFO:tasks.workunit.client.0.vm05.stdout:8/864: rename d1/dd/d4d/d64/d6a/de5/d2a/c75 to d1/d6f/df9/d102/dbf/c112 0 2026-03-10T07:51:14.818 INFO:tasks.workunit.client.0.vm05.stdout:8/865: dwrite d1/dd/d4d/d64/d6a/de5/d2a/d48/dfd/f100 [0,4194304] 0 2026-03-10T07:51:14.823 INFO:tasks.workunit.client.0.vm05.stdout:5/958: dwrite d2/d12/d4d/f84 [0,4194304] 0 2026-03-10T07:51:14.824 INFO:tasks.workunit.client.0.vm05.stdout:2/993: sync 2026-03-10T07:51:14.825 INFO:tasks.workunit.client.0.vm05.stdout:1/918: sync 2026-03-10T07:51:14.844 INFO:tasks.workunit.client.0.vm05.stdout:7/949: write d1/d34/d59/f6f [990708,109268] 0 2026-03-10T07:51:14.844 INFO:tasks.workunit.client.0.vm05.stdout:9/882: dread d8/d86/d28/d79/d57/de1/d1c/d20/f32 [0,4194304] 
0 2026-03-10T07:51:14.848 INFO:tasks.workunit.client.0.vm05.stdout:6/967: dwrite d0/d11/d4f/da0/da6/fd9 [0,4194304] 0 2026-03-10T07:51:14.854 INFO:tasks.workunit.client.0.vm05.stdout:8/866: link d1/dd/d4d/d64/d6a/de5/f109 d1/dd/d4d/f113 0 2026-03-10T07:51:14.859 INFO:tasks.workunit.client.0.vm05.stdout:2/994: creat d0/d8/d43/df/d8b/dbf/de5/d10c/f146 x:0 0 0 2026-03-10T07:51:14.860 INFO:tasks.workunit.client.0.vm05.stdout:0/962: dwrite d8/dd/d10/d26/fad [0,4194304] 0 2026-03-10T07:51:14.865 INFO:tasks.workunit.client.0.vm05.stdout:3/897: dwrite d8/d1f/d24/d76/dc5/de1/f1a [0,4194304] 0 2026-03-10T07:51:14.868 INFO:tasks.workunit.client.0.vm05.stdout:7/950: fsync d1/d6/d3b/f11b 0 2026-03-10T07:51:14.873 INFO:tasks.workunit.client.0.vm05.stdout:0/963: dwrite d8/dd/d10/d26/d2a/f94 [0,4194304] 0 2026-03-10T07:51:14.884 INFO:tasks.workunit.client.0.vm05.stdout:1/919: link da/dd/d2a/d55/d102/l108 da/dd/d2a/l11f 0 2026-03-10T07:51:14.894 INFO:tasks.workunit.client.0.vm05.stdout:7/951: readlink d1/d34/d59/d60/d8c/l125 0 2026-03-10T07:51:14.901 INFO:tasks.workunit.client.0.vm05.stdout:2/995: symlink d0/d8/d3d/l147 0 2026-03-10T07:51:14.901 INFO:tasks.workunit.client.0.vm05.stdout:2/996: chown d0/d8/d43/df/feb 0 1 2026-03-10T07:51:14.911 INFO:tasks.workunit.client.0.vm05.stdout:1/920: write da/dd/d12/d34/d58/fce [3616025,48475] 0 2026-03-10T07:51:14.919 INFO:tasks.workunit.client.0.vm05.stdout:1/921: fsync da/d26/f11e 0 2026-03-10T07:51:14.924 INFO:tasks.workunit.client.0.vm05.stdout:2/997: rmdir d0/d8/d43/da4/dea 39 2026-03-10T07:51:14.927 INFO:tasks.workunit.client.0.vm05.stdout:9/883: link d8/d86/d28/d79/d57/dbc/fc5 d8/d86/d28/d79/d57/de1/d38/d71/d81/d11c/f12f 0 2026-03-10T07:51:14.931 INFO:tasks.workunit.client.0.vm05.stdout:4/996: dwrite d0/d6/d9/d5a/d91/fc9 [0,4194304] 0 2026-03-10T07:51:14.940 INFO:tasks.workunit.client.0.vm05.stdout:1/922: unlink da/dd/d12/d86/d9a/fd0 0 2026-03-10T07:51:14.945 INFO:tasks.workunit.client.0.vm05.stdout:8/867: creat 
d1/dd/d4d/d64/d6a/de5/d2a/d48/f114 x:0 0 0 2026-03-10T07:51:14.947 INFO:tasks.workunit.client.0.vm05.stdout:4/997: dwrite d0/d28/f33 [0,4194304] 0 2026-03-10T07:51:14.969 INFO:tasks.workunit.client.0.vm05.stdout:4/998: dread d0/d6/d9/d12/d9c/db7/da7/f4c [0,4194304] 0 2026-03-10T07:51:14.971 INFO:tasks.workunit.client.0.vm05.stdout:7/952: mkdir d1/d34/d59/d60/d12b 0 2026-03-10T07:51:14.977 INFO:tasks.workunit.client.0.vm05.stdout:9/884: mknod d8/d86/d28/d79/d57/de1/d22/d33/d62/dc0/c130 0 2026-03-10T07:51:14.990 INFO:tasks.workunit.client.0.vm05.stdout:4/999: write d0/d6/d9/d12/d69/dc7/ded/ff0 [3107336,111101] 0 2026-03-10T07:51:14.991 INFO:tasks.workunit.client.0.vm05.stdout:8/868: dwrite d1/dd/d4d/d64/d8f/f10f [0,4194304] 0 2026-03-10T07:51:14.998 INFO:tasks.workunit.client.0.vm05.stdout:7/953: chown d1/l53 84769 1 2026-03-10T07:51:15.006 INFO:tasks.workunit.client.0.vm05.stdout:5/959: dwrite d2/d20/d4c/f103 [0,4194304] 0 2026-03-10T07:51:15.015 INFO:tasks.workunit.client.0.vm05.stdout:6/968: dwrite d0/d11/d4f/d56/f6f [0,4194304] 0 2026-03-10T07:51:15.017 INFO:tasks.workunit.client.0.vm05.stdout:3/898: dwrite d8/d1f/d24/d76/dc5/de1/f2d [0,4194304] 0 2026-03-10T07:51:15.031 INFO:tasks.workunit.client.0.vm05.stdout:9/885: dread - d8/d86/d28/f7c zero size 2026-03-10T07:51:15.031 INFO:tasks.workunit.client.0.vm05.stdout:3/899: dwrite d8/d22/fe2 [0,4194304] 0 2026-03-10T07:51:15.035 INFO:tasks.workunit.client.0.vm05.stdout:1/923: write da/dd/d2a/f54 [1746639,90336] 0 2026-03-10T07:51:15.041 INFO:tasks.workunit.client.0.vm05.stdout:0/964: dwrite d8/dd/d37/f4f [0,4194304] 0 2026-03-10T07:51:15.070 INFO:tasks.workunit.client.0.vm05.stdout:7/954: unlink d1/d6/f31 0 2026-03-10T07:51:15.070 INFO:tasks.workunit.client.0.vm05.stdout:7/955: readlink d1/d3c/l123 0 2026-03-10T07:51:15.073 INFO:tasks.workunit.client.0.vm05.stdout:2/998: write d0/d8/d66/dd1/d49/df9/da5/fd2 [560223,20420] 0 2026-03-10T07:51:15.085 INFO:tasks.workunit.client.0.vm05.stdout:6/969: truncate 
d0/d11/d57/faf 714013 0 2026-03-10T07:51:15.098 INFO:tasks.workunit.client.0.vm05.stdout:3/900: rmdir d8/d1f/d24/d76/dc5/de1/dac 39 2026-03-10T07:51:15.101 INFO:tasks.workunit.client.0.vm05.stdout:9/886: symlink d8/d86/d28/d79/d57/de1/d1c/d20/dee/l131 0 2026-03-10T07:51:15.108 INFO:tasks.workunit.client.0.vm05.stdout:1/924: dread - da/dd/d2a/d55/d64/dd1/f11a zero size 2026-03-10T07:51:15.108 INFO:tasks.workunit.client.0.vm05.stdout:0/965: mkdir d8/d9c/dc8/d10f/d127/d14a 0 2026-03-10T07:51:15.109 INFO:tasks.workunit.client.0.vm05.stdout:9/887: dread d8/d86/d28/f29 [0,4194304] 0 2026-03-10T07:51:15.115 INFO:tasks.workunit.client.0.vm05.stdout:3/901: dwrite d8/d1c/f124 [0,4194304] 0 2026-03-10T07:51:15.126 INFO:tasks.workunit.client.0.vm05.stdout:3/902: dwrite d8/dd5/dfb/f100 [0,4194304] 0 2026-03-10T07:51:15.139 INFO:tasks.workunit.client.0.vm05.stdout:3/903: fsync d8/d1f/d24/d76/fc1 0 2026-03-10T07:51:15.147 INFO:tasks.workunit.client.0.vm05.stdout:3/904: truncate d8/d1f/d2a/d96/da9/fd6 1658625 0 2026-03-10T07:51:15.148 INFO:tasks.workunit.client.0.vm05.stdout:5/960: mknod d2/d12/dda/da1/dc0/c14b 0 2026-03-10T07:51:15.151 INFO:tasks.workunit.client.0.vm05.stdout:3/905: dread - d8/d1c/d109/f12c zero size 2026-03-10T07:51:15.154 INFO:tasks.workunit.client.0.vm05.stdout:8/869: dwrite d1/dd/f87 [0,4194304] 0 2026-03-10T07:51:15.187 INFO:tasks.workunit.client.0.vm05.stdout:2/999: write d0/d8/d43/df/f58 [2699333,63375] 0 2026-03-10T07:51:15.197 INFO:tasks.workunit.client.0.vm05.stdout:6/970: creat d0/d11/d31/dbf/f147 x:0 0 0 2026-03-10T07:51:15.205 INFO:tasks.workunit.client.1.vm08.stderr:+ pushd ltp-full-20091231/testcases/kernel/fs/fsstress 2026-03-10T07:51:15.214 INFO:tasks.workunit.client.1.vm08.stdout:~/cephtest/mnt.1/client.1/tmp/fsstress/ltp-full-20091231/testcases/kernel/fs/fsstress ~/cephtest/mnt.1/client.1/tmp/fsstress ~/cephtest/mnt.1/client.1/tmp 2026-03-10T07:51:15.215 INFO:tasks.workunit.client.1.vm08.stderr:+ make 2026-03-10T07:51:15.223 
INFO:tasks.workunit.client.0.vm05.stdout:3/906: mkdir d8/d1f/d134 0 2026-03-10T07:51:15.223 INFO:tasks.workunit.client.0.vm05.stdout:1/925: dread da/f5d [0,4194304] 0 2026-03-10T07:51:15.226 INFO:tasks.workunit.client.0.vm05.stdout:1/926: write da/d26/d9e/fa1 [2840993,12298] 0 2026-03-10T07:51:15.228 INFO:tasks.workunit.client.0.vm05.stdout:1/927: fdatasync da/dd/d12/d34/ddb/def/f116 0 2026-03-10T07:51:15.237 INFO:tasks.workunit.client.0.vm05.stdout:9/888: truncate d8/d86/d28/d79/d57/dbc/fc5 74592 0 2026-03-10T07:51:15.237 INFO:tasks.workunit.client.0.vm05.stdout:7/956: creat d1/d6/f12c x:0 0 0 2026-03-10T07:51:15.245 INFO:tasks.workunit.client.0.vm05.stdout:6/971: mknod d0/d11/d22/d10b/c148 0 2026-03-10T07:51:15.247 INFO:tasks.workunit.client.0.vm05.stdout:0/966: creat d8/dd/d10/f14b x:0 0 0 2026-03-10T07:51:15.247 INFO:tasks.workunit.client.0.vm05.stdout:0/967: fsync d8/dd/d37/d56/fe9 0 2026-03-10T07:51:15.248 INFO:tasks.workunit.client.0.vm05.stdout:0/968: write d8/dd/d10/d26/d3a/d5e/f7b [637447,62668] 0 2026-03-10T07:51:15.254 INFO:tasks.workunit.client.0.vm05.stdout:9/889: creat d8/d86/d28/d79/d57/de1/d6b/f132 x:0 0 0 2026-03-10T07:51:15.261 INFO:tasks.workunit.client.0.vm05.stdout:3/907: dwrite d8/d22/f29 [0,4194304] 0 2026-03-10T07:51:15.263 INFO:tasks.workunit.client.0.vm05.stdout:1/928: sync 2026-03-10T07:51:15.265 INFO:tasks.workunit.client.0.vm05.stdout:3/908: dwrite d8/fe [0,4194304] 0 2026-03-10T07:51:15.282 INFO:tasks.workunit.client.0.vm05.stdout:7/957: dwrite d1/d6/d3b/d7f/fe8 [0,4194304] 0 2026-03-10T07:51:15.284 INFO:tasks.workunit.client.0.vm05.stdout:6/972: mknod d0/d11/d57/d60/d117/c149 0 2026-03-10T07:51:15.303 INFO:tasks.workunit.client.0.vm05.stdout:9/890: rmdir d8/d86/d28/d79/d57/de1/d22/d33/df9 39 2026-03-10T07:51:15.307 INFO:tasks.workunit.client.0.vm05.stdout:8/870: rename d1/dd/d4d/d64/d6a/de5/d2a/d34/c4c to d1/d45/c115 0 2026-03-10T07:51:15.308 INFO:tasks.workunit.client.0.vm05.stdout:8/871: chown d1/fe 13 1 2026-03-10T07:51:15.310 
INFO:tasks.workunit.client.0.vm05.stdout:8/872: dwrite d1/dd/d5e/d9e/fff [0,4194304] 0 2026-03-10T07:51:15.312 INFO:tasks.workunit.client.0.vm05.stdout:8/873: chown d1/fa 9 1 2026-03-10T07:51:15.321 INFO:tasks.workunit.client.0.vm05.stdout:3/909: symlink d8/d1f/d2a/d96/l135 0 2026-03-10T07:51:15.324 INFO:tasks.workunit.client.0.vm05.stdout:5/961: getdents d2/d20/d33/d86/dac/dc1 0 2026-03-10T07:51:15.347 INFO:tasks.workunit.client.0.vm05.stdout:1/929: link da/dd/d12/d34/d58/fce da/dd/d12/d34/ddb/def/f120 0 2026-03-10T07:51:15.349 INFO:tasks.workunit.client.0.vm05.stdout:3/910: chown d8/d1f/d24/d76/dc5/fc8 14745 1 2026-03-10T07:51:15.350 INFO:tasks.workunit.client.0.vm05.stdout:5/962: creat d2/d12/dda/f14c x:0 0 0 2026-03-10T07:51:15.351 INFO:tasks.workunit.client.0.vm05.stdout:1/930: read da/dd/d2a/d55/d64/dd1/f4e [137226,20515] 0 2026-03-10T07:51:15.357 INFO:tasks.workunit.client.0.vm05.stdout:6/973: dwrite d0/d11/d57/da4/db3/de7/d141/db8/fcf [0,4194304] 0 2026-03-10T07:51:15.359 INFO:tasks.workunit.client.0.vm05.stdout:9/891: write d8/d86/d28/d79/d57/de1/d38/ff0 [803553,89663] 0 2026-03-10T07:51:15.369 INFO:tasks.workunit.client.0.vm05.stdout:8/874: mkdir d1/dd/d4d/d64/d6a/de5/d2a/d34/d49/d10c/d116 0 2026-03-10T07:51:15.375 INFO:tasks.workunit.client.0.vm05.stdout:5/963: creat d2/d12/dda/f14d x:0 0 0 2026-03-10T07:51:15.378 INFO:tasks.workunit.client.0.vm05.stdout:5/964: dread - d2/d20/d33/d53/d7d/f144 zero size 2026-03-10T07:51:15.379 INFO:tasks.workunit.client.0.vm05.stdout:5/965: dwrite d2/d20/d7b/f121 [0,4194304] 0 2026-03-10T07:51:15.393 INFO:tasks.workunit.client.0.vm05.stdout:9/892: chown d8/d86/d28/d79/d57/de1/d22/d33/df9/f105 48 1 2026-03-10T07:51:15.397 INFO:tasks.workunit.client.0.vm05.stdout:0/969: getdents d8/dd 0 2026-03-10T07:51:15.401 INFO:tasks.workunit.client.0.vm05.stdout:8/875: creat d1/dd/d4d/d64/d6a/de5/d2a/d48/d7c/d9c/f117 x:0 0 0 2026-03-10T07:51:15.402 INFO:tasks.workunit.client.0.vm05.stdout:3/911: symlink d8/d1f/d24/d76/dc5/de1/dac/l136 
0 2026-03-10T07:51:15.404 INFO:tasks.workunit.client.0.vm05.stdout:1/931: rename da/dd/d2a/d55/d64/f7a to da/d26/d9e/f121 0 2026-03-10T07:51:15.413 INFO:tasks.workunit.client.0.vm05.stdout:7/958: getdents d1/d6/d3b 0 2026-03-10T07:51:15.414 INFO:tasks.workunit.client.0.vm05.stdout:5/966: dread - d2/d5/d61/ffe zero size 2026-03-10T07:51:15.418 INFO:tasks.workunit.client.0.vm05.stdout:6/974: creat d0/d11/d2e/d81/d92/d13b/f14a x:0 0 0 2026-03-10T07:51:15.418 INFO:tasks.workunit.client.0.vm05.stdout:0/970: fdatasync d8/dd/f140 0 2026-03-10T07:51:15.419 INFO:tasks.workunit.client.0.vm05.stdout:0/971: write d8/dd/d10/f115 [218541,83603] 0 2026-03-10T07:51:15.423 INFO:tasks.workunit.client.1.vm08.stdout:cc -DNO_XFS -I/home/ubuntu/cephtest/mnt.1/client.1/tmp/fsstress/ltp-full-20091231/testcases/kernel/fs/fsstress -D_LARGEFILE64_SOURCE -D_GNU_SOURCE -I../../../../include -I../../../../include -L../../../../lib fsstress.c -o fsstress 2026-03-10T07:51:15.434 INFO:tasks.workunit.client.0.vm05.stdout:0/972: rmdir d8 39 2026-03-10T07:51:15.436 INFO:tasks.workunit.client.0.vm05.stdout:9/893: dwrite f6 [0,4194304] 0 2026-03-10T07:51:15.441 INFO:tasks.workunit.client.0.vm05.stdout:9/894: dwrite d8/d86/d28/d79/d57/de1/f1f [0,4194304] 0 2026-03-10T07:51:15.442 INFO:tasks.workunit.client.0.vm05.stdout:9/895: chown d8/d86/d28/d79/d57/de1/d38/d71/d81/d10f 369357 1 2026-03-10T07:51:15.443 INFO:tasks.workunit.client.0.vm05.stdout:9/896: chown d8/d86/d28/de9 1 1 2026-03-10T07:51:15.462 INFO:tasks.workunit.client.0.vm05.stdout:8/876: symlink d1/dd/d5e/l118 0 2026-03-10T07:51:15.463 INFO:tasks.workunit.client.0.vm05.stdout:5/967: mknod d2/d20/d122/c14e 0 2026-03-10T07:51:15.463 INFO:tasks.workunit.client.0.vm05.stdout:8/877: chown d1/d6f/df9/d102 15124095 1 2026-03-10T07:51:15.471 INFO:tasks.workunit.client.0.vm05.stdout:9/897: creat d8/d86/d28/d79/d57/de1/d22/f133 x:0 0 0 2026-03-10T07:51:15.483 INFO:tasks.workunit.client.0.vm05.stdout:1/932: creat da/dd/d2a/d55/f122 x:0 0 0 
2026-03-10T07:51:15.484 INFO:tasks.workunit.client.0.vm05.stdout:7/959: unlink d1/d34/d59/d60/d8c/laa 0 2026-03-10T07:51:15.484 INFO:tasks.workunit.client.0.vm05.stdout:5/968: creat d2/dd7/f14f x:0 0 0 2026-03-10T07:51:15.484 INFO:tasks.workunit.client.0.vm05.stdout:0/973: fsync d8/d9c/fec 0 2026-03-10T07:51:15.484 INFO:tasks.workunit.client.0.vm05.stdout:3/912: getdents d8/d8f 0 2026-03-10T07:51:15.533 INFO:tasks.workunit.client.0.vm05.stdout:1/933: fdatasync da/dd/ff9 0 2026-03-10T07:51:15.548 INFO:tasks.workunit.client.0.vm05.stdout:9/898: fdatasync d8/d86/d28/d79/d57/de1/d22/d33/d62/fba 0 2026-03-10T07:51:15.551 INFO:tasks.workunit.client.0.vm05.stdout:6/975: getdents d0/d11/d57/da4/db3/de7/d141/dd2 0 2026-03-10T07:51:15.552 INFO:tasks.workunit.client.0.vm05.stdout:5/969: creat d2/d20/d33/d86/dac/dc1/f150 x:0 0 0 2026-03-10T07:51:15.552 INFO:tasks.workunit.client.0.vm05.stdout:6/976: readlink d0/d11/d57/l111 0 2026-03-10T07:51:15.564 INFO:tasks.workunit.client.0.vm05.stdout:0/974: getdents d8/dd/d10/d26/d8b/da4/de7/d111 0 2026-03-10T07:51:15.567 INFO:tasks.workunit.client.0.vm05.stdout:9/899: unlink d8/d86/d28/d79/d57/de1/d22/d33/d47/fc2 0 2026-03-10T07:51:15.569 INFO:tasks.workunit.client.0.vm05.stdout:9/900: write d8/d86/d28/d79/d57/de1/f51 [2068754,41672] 0 2026-03-10T07:51:15.571 INFO:tasks.workunit.client.0.vm05.stdout:0/975: dread d8/dd/d10/d26/d3a/d5e/fa6 [0,4194304] 0 2026-03-10T07:51:15.575 INFO:tasks.workunit.client.0.vm05.stdout:0/976: fsync d8/dd/d10/f115 0 2026-03-10T07:51:15.575 INFO:tasks.workunit.client.0.vm05.stdout:8/878: link d1/dd/d18/c8d d1/dd/d4d/d64/d6a/de5/d2a/d48/c119 0 2026-03-10T07:51:15.575 INFO:tasks.workunit.client.0.vm05.stdout:9/901: chown d8/d86/d28/d79/d57/de1/d1c/f52 6388343 1 2026-03-10T07:51:15.578 INFO:tasks.workunit.client.0.vm05.stdout:6/977: dwrite d0/d11/d4f/d56/f135 [0,4194304] 0 2026-03-10T07:51:15.594 INFO:tasks.workunit.client.0.vm05.stdout:0/977: dread d8/f1c [0,4194304] 0 2026-03-10T07:51:15.603 
INFO:tasks.workunit.client.0.vm05.stdout:6/978: creat d0/d11/d31/dbf/f14b x:0 0 0 2026-03-10T07:51:15.610 INFO:tasks.workunit.client.0.vm05.stdout:8/879: mknod d1/dd/d4d/dcc/dbd/dfc/c11a 0 2026-03-10T07:51:15.611 INFO:tasks.workunit.client.0.vm05.stdout:0/978: creat d8/dd/d10/d26/d48/f14c x:0 0 0 2026-03-10T07:51:15.616 INFO:tasks.workunit.client.0.vm05.stdout:8/880: dwrite d1/d6f/f108 [0,4194304] 0 2026-03-10T07:51:15.623 INFO:tasks.workunit.client.0.vm05.stdout:0/979: symlink d8/dd/d10/d26/d48/l14d 0 2026-03-10T07:51:15.630 INFO:tasks.workunit.client.0.vm05.stdout:8/881: symlink d1/dd/d4d/d64/d6a/de5/d2a/d48/d7c/d9c/da4/l11b 0 2026-03-10T07:51:15.641 INFO:tasks.workunit.client.0.vm05.stdout:8/882: fsync d1/dd/d4d/d64/d6a/de5/d2a/d48/dfd/f100 0 2026-03-10T07:51:15.641 INFO:tasks.workunit.client.0.vm05.stdout:8/883: rmdir d1/dd/d4d/dcc/dbd/d10e/ddb/deb 0 2026-03-10T07:51:15.643 INFO:tasks.workunit.client.0.vm05.stdout:8/884: truncate d1/dd/d4d/d64/d6a/de5/d2a/d48/d7c/d9c/fab 1248318 0 2026-03-10T07:51:15.646 INFO:tasks.workunit.client.0.vm05.stdout:8/885: dread d1/dd/d18/f58 [0,4194304] 0 2026-03-10T07:51:15.650 INFO:tasks.workunit.client.0.vm05.stdout:8/886: readlink d1/dd/d4d/d64/d6a/de5/d2a/d9a/lcd 0 2026-03-10T07:51:15.654 INFO:tasks.workunit.client.0.vm05.stdout:8/887: symlink d1/dd/d4d/d64/d8f/l11c 0 2026-03-10T07:51:15.659 INFO:tasks.workunit.client.0.vm05.stdout:1/934: write da/dd/d2a/d55/d64/f81 [4488648,1347] 0 2026-03-10T07:51:15.685 INFO:tasks.workunit.client.0.vm05.stdout:3/913: write d8/d1f/d2a/d34/f39 [4651611,18099] 0 2026-03-10T07:51:15.686 INFO:tasks.workunit.client.0.vm05.stdout:3/914: fsync d8/d1f/d24/d76/dc5/de1/f2d 0 2026-03-10T07:51:15.687 INFO:tasks.workunit.client.0.vm05.stdout:3/915: chown d8/d1f/cdc 102903 1 2026-03-10T07:51:15.701 INFO:tasks.workunit.client.0.vm05.stdout:7/960: dwrite d1/d34/ff2 [0,4194304] 0 2026-03-10T07:51:15.704 INFO:tasks.workunit.client.0.vm05.stdout:7/961: creat d1/d3c/d4b/da6/f12d x:0 0 0 2026-03-10T07:51:15.710 
INFO:tasks.workunit.client.0.vm05.stdout:7/962: dread d1/d6/d3b/d7f/fe8 [0,4194304] 0 2026-03-10T07:51:15.715 INFO:tasks.workunit.client.0.vm05.stdout:7/963: truncate d1/d34/d59/fdf 1843535 0 2026-03-10T07:51:15.715 INFO:tasks.workunit.client.0.vm05.stdout:7/964: stat d1/d6/c81 0 2026-03-10T07:51:15.725 INFO:tasks.workunit.client.0.vm05.stdout:7/965: dread d1/f11 [0,4194304] 0 2026-03-10T07:51:15.727 INFO:tasks.workunit.client.0.vm05.stdout:7/966: mknod d1/d3c/d71/c12e 0 2026-03-10T07:51:15.752 INFO:tasks.workunit.client.0.vm05.stdout:3/916: sync 2026-03-10T07:51:15.755 INFO:tasks.workunit.client.0.vm05.stdout:3/917: write d8/d8f/f120 [489406,125751] 0 2026-03-10T07:51:15.759 INFO:tasks.workunit.client.0.vm05.stdout:3/918: truncate d8/d1f/d24/d76/dc5/de1/d19/f77 882563 0 2026-03-10T07:51:15.760 INFO:tasks.workunit.client.0.vm05.stdout:3/919: write d8/d1c/d109/f10e [569581,35529] 0 2026-03-10T07:51:15.780 INFO:tasks.workunit.client.0.vm05.stdout:3/920: truncate d8/d1f/d24/d76/fc1 4384238 0 2026-03-10T07:51:15.785 INFO:tasks.workunit.client.1.vm08.stderr:++ readlink -f fsstress 2026-03-10T07:51:15.787 INFO:tasks.workunit.client.1.vm08.stderr:+ BIN=/home/ubuntu/cephtest/mnt.1/client.1/tmp/fsstress/ltp-full-20091231/testcases/kernel/fs/fsstress/fsstress 2026-03-10T07:51:15.787 INFO:tasks.workunit.client.1.vm08.stderr:+ popd 2026-03-10T07:51:15.788 INFO:tasks.workunit.client.1.vm08.stdout:~/cephtest/mnt.1/client.1/tmp/fsstress ~/cephtest/mnt.1/client.1/tmp 2026-03-10T07:51:15.788 INFO:tasks.workunit.client.1.vm08.stderr:+ popd 2026-03-10T07:51:15.789 INFO:tasks.workunit.client.0.vm05.stdout:3/921: dread d8/d1c/d109/f10e [0,4194304] 0 2026-03-10T07:51:15.789 INFO:tasks.workunit.client.1.vm08.stdout:~/cephtest/mnt.1/client.1/tmp 2026-03-10T07:51:15.789 INFO:tasks.workunit.client.1.vm08.stderr:++ mktemp -d -p . 
2026-03-10T07:51:15.789 INFO:tasks.workunit.client.0.vm05.stdout:3/922: write d8/d1f/f123 [876453,128723] 0 2026-03-10T07:51:15.791 INFO:tasks.workunit.client.0.vm05.stdout:3/923: write d8/d1f/d2a/d4a/f89 [968364,104423] 0 2026-03-10T07:51:15.797 INFO:tasks.workunit.client.1.vm08.stderr:+ T=./tmp.npN7WPNL4y 2026-03-10T07:51:15.797 INFO:tasks.workunit.client.1.vm08.stderr:+ /home/ubuntu/cephtest/mnt.1/client.1/tmp/fsstress/ltp-full-20091231/testcases/kernel/fs/fsstress/fsstress -d ./tmp.npN7WPNL4y -l 1 -n 1000 -p 10 -v 2026-03-10T07:51:15.803 INFO:tasks.workunit.client.0.vm05.stdout:3/924: symlink d8/d1f/d2a/l137 0 2026-03-10T07:51:15.809 INFO:tasks.workunit.client.0.vm05.stdout:3/925: mknod d8/d1f/d24/d76/dc5/c138 0 2026-03-10T07:51:15.811 INFO:tasks.workunit.client.0.vm05.stdout:7/967: creat d1/d3c/d71/d79/f12f x:0 0 0 2026-03-10T07:51:15.814 INFO:tasks.workunit.client.1.vm08.stdout:seed = 1772331139 2026-03-10T07:51:15.819 INFO:tasks.workunit.client.1.vm08.stdout:0/0: dread - no filename 2026-03-10T07:51:15.821 INFO:tasks.workunit.client.0.vm05.stdout:9/902: write d8/d86/d28/d79/d57/de1/d38/d71/d81/f83 [3346291,88653] 0 2026-03-10T07:51:15.823 INFO:tasks.workunit.client.1.vm08.stdout:0/1: creat f0 x:0 0 0 2026-03-10T07:51:15.824 INFO:tasks.workunit.client.1.vm08.stdout:0/2: write f0 [593199,23306] 0 2026-03-10T07:51:15.827 INFO:tasks.workunit.client.0.vm05.stdout:7/968: dwrite d1/d3c/d4b/da6/f12d [0,4194304] 0 2026-03-10T07:51:15.837 INFO:tasks.workunit.client.1.vm08.stdout:7/0: readlink - no filename 2026-03-10T07:51:15.837 INFO:tasks.workunit.client.1.vm08.stdout:7/1: dwrite - no filename 2026-03-10T07:51:15.837 INFO:tasks.workunit.client.1.vm08.stdout:7/2: chown . 
304 1 2026-03-10T07:51:15.837 INFO:tasks.workunit.client.0.vm05.stdout:9/903: unlink d8/l19 0 2026-03-10T07:51:15.838 INFO:tasks.workunit.client.0.vm05.stdout:7/969: mkdir d1/d6/d47/d8d/d112/d130 0 2026-03-10T07:51:15.838 INFO:tasks.workunit.client.1.vm08.stdout:0/3: creat f1 x:0 0 0 2026-03-10T07:51:15.838 INFO:tasks.workunit.client.1.vm08.stdout:8/0: mkdir d0 0 2026-03-10T07:51:15.838 INFO:tasks.workunit.client.1.vm08.stdout:8/1: fdatasync - no filename 2026-03-10T07:51:15.839 INFO:tasks.workunit.client.0.vm05.stdout:5/970: rename d2/d20/d7b/dbc/f12b to d2/d20/d33/f151 0 2026-03-10T07:51:15.840 INFO:tasks.workunit.client.1.vm08.stdout:9/0: creat f0 x:0 0 0 2026-03-10T07:51:15.840 INFO:tasks.workunit.client.1.vm08.stdout:9/1: chown f0 4120982 1 2026-03-10T07:51:15.842 INFO:tasks.workunit.client.1.vm08.stdout:0/4: creat f2 x:0 0 0 2026-03-10T07:51:15.843 INFO:tasks.workunit.client.1.vm08.stdout:0/5: write f1 [53512,119583] 0 2026-03-10T07:51:15.843 INFO:tasks.workunit.client.0.vm05.stdout:3/926: dwrite d8/d1c/d109/f10e [0,4194304] 0 2026-03-10T07:51:15.844 INFO:tasks.workunit.client.1.vm08.stdout:0/6: chown f2 990866169 1 2026-03-10T07:51:15.844 INFO:tasks.workunit.client.1.vm08.stdout:9/2: mkdir d1 0 2026-03-10T07:51:15.845 INFO:tasks.workunit.client.1.vm08.stdout:9/3: truncate f0 584837 0 2026-03-10T07:51:15.846 INFO:tasks.workunit.client.1.vm08.stdout:9/4: write f0 [1378118,99332] 0 2026-03-10T07:51:15.846 INFO:tasks.workunit.client.1.vm08.stdout:9/5: write f0 [503889,3636] 0 2026-03-10T07:51:15.849 INFO:tasks.workunit.client.0.vm05.stdout:9/904: write d8/d86/d28/d79/d57/de1/d22/dab/fd7 [221529,92121] 0 2026-03-10T07:51:15.851 INFO:tasks.workunit.client.0.vm05.stdout:9/905: write d8/d86/d28/d79/d57/d96/fa9 [1676366,48802] 0 2026-03-10T07:51:15.856 INFO:tasks.workunit.client.0.vm05.stdout:6/979: dwrite d0/d11/d57/f7a [0,4194304] 0 2026-03-10T07:51:15.860 INFO:tasks.workunit.client.1.vm08.stdout:5/0: dwrite - no filename 2026-03-10T07:51:15.860 
INFO:tasks.workunit.client.1.vm08.stdout:5/1: write - no filename 2026-03-10T07:51:15.860 INFO:tasks.workunit.client.1.vm08.stdout:5/2: truncate - no filename 2026-03-10T07:51:15.876 INFO:tasks.workunit.client.1.vm08.stdout:8/2: mknod d0/c1 0 2026-03-10T07:51:15.876 INFO:tasks.workunit.client.1.vm08.stdout:7/3: creat f0 x:0 0 0 2026-03-10T07:51:15.876 INFO:tasks.workunit.client.1.vm08.stdout:7/4: read - f0 zero size 2026-03-10T07:51:15.876 INFO:tasks.workunit.client.1.vm08.stdout:8/3: rename d0 to d0/d2 22 2026-03-10T07:51:15.877 INFO:tasks.workunit.client.1.vm08.stdout:7/5: write f0 [985051,33232] 0 2026-03-10T07:51:15.881 INFO:tasks.workunit.client.0.vm05.stdout:0/980: dwrite d8/faf [8388608,4194304] 0 2026-03-10T07:51:15.888 INFO:tasks.workunit.client.1.vm08.stdout:4/0: link - no file 2026-03-10T07:51:15.898 INFO:tasks.workunit.client.0.vm05.stdout:8/888: write d1/dd/d4d/d64/d6a/f83 [587134,99105] 0 2026-03-10T07:51:15.898 INFO:tasks.workunit.client.0.vm05.stdout:1/935: write da/d26/d9e/fd5 [120083,72007] 0 2026-03-10T07:51:15.902 INFO:tasks.workunit.client.1.vm08.stdout:6/0: symlink l0 0 2026-03-10T07:51:15.902 INFO:tasks.workunit.client.1.vm08.stdout:6/1: rmdir - no directory 2026-03-10T07:51:15.902 INFO:tasks.workunit.client.1.vm08.stdout:6/2: dwrite - no filename 2026-03-10T07:51:15.911 INFO:tasks.workunit.client.1.vm08.stdout:8/4: mknod d0/c3 0 2026-03-10T07:51:15.911 INFO:tasks.workunit.client.1.vm08.stdout:8/5: dwrite - no filename 2026-03-10T07:51:15.919 INFO:tasks.workunit.client.0.vm05.stdout:8/889: rename d1/dd/d4d/d64/d6a/de5/f5b to d1/dd/d4d/d64/d6a/de5/d2a/d34/d49/d5d/f11d 0 2026-03-10T07:51:15.961 INFO:tasks.workunit.client.0.vm05.stdout:6/980: link d0/d35/c8c d0/d11/d2e/d81/d92/c14c 0 2026-03-10T07:51:15.961 INFO:tasks.workunit.client.0.vm05.stdout:9/906: link d8/d86/d28/d79/d57/de1/d1c/l126 d8/d86/d28/d79/d57/de1/d22/d33/d70/l134 0 2026-03-10T07:51:15.961 INFO:tasks.workunit.client.0.vm05.stdout:9/907: chown 
d8/d86/d28/d79/d57/de1/d1c/d20/d59/fca 0 1 2026-03-10T07:51:15.962 INFO:tasks.workunit.client.0.vm05.stdout:0/981: creat d8/d9c/dc8/d10f/d127/d12f/dc3/dea/f14e x:0 0 0 2026-03-10T07:51:15.962 INFO:tasks.workunit.client.0.vm05.stdout:5/971: rmdir d2/d12 39 2026-03-10T07:51:15.962 INFO:tasks.workunit.client.0.vm05.stdout:5/972: truncate d2/d20/d5b/feb 1231545 0 2026-03-10T07:51:15.962 INFO:tasks.workunit.client.0.vm05.stdout:5/973: chown d2/d20/d5b 534117 1 2026-03-10T07:51:15.962 INFO:tasks.workunit.client.0.vm05.stdout:1/936: mknod da/d26/d9e/dcc/d105/c123 0 2026-03-10T07:51:15.962 INFO:tasks.workunit.client.0.vm05.stdout:9/908: dwrite d8/d86/d28/d79/d57/de1/d38/d71/d81/dcf/f119 [0,4194304] 0 2026-03-10T07:51:15.962 INFO:tasks.workunit.client.0.vm05.stdout:1/937: chown da/dd/d12/d34/d58/d10a 521408 1 2026-03-10T07:51:15.962 INFO:tasks.workunit.client.0.vm05.stdout:8/890: rmdir d1/d6f/df9/d102 39 2026-03-10T07:51:15.962 INFO:tasks.workunit.client.0.vm05.stdout:3/927: link d8/d1f/d24/d76/dc5/de1/d52/d7b/fc3 d8/d1f/d2a/f139 0 2026-03-10T07:51:15.962 INFO:tasks.workunit.client.0.vm05.stdout:6/981: dread - d0/d35/ffe zero size 2026-03-10T07:51:15.962 INFO:tasks.workunit.client.0.vm05.stdout:5/974: chown d2/d12/da8/ddd/de9/cff 17170511 1 2026-03-10T07:51:15.962 INFO:tasks.workunit.client.1.vm08.stdout:3/0: rename - no filename 2026-03-10T07:51:15.962 INFO:tasks.workunit.client.1.vm08.stdout:3/1: dwrite - no filename 2026-03-10T07:51:15.962 INFO:tasks.workunit.client.1.vm08.stdout:3/2: dwrite - no filename 2026-03-10T07:51:15.962 INFO:tasks.workunit.client.1.vm08.stdout:6/3: mkdir d1 0 2026-03-10T07:51:15.962 INFO:tasks.workunit.client.1.vm08.stdout:6/4: readlink l0 0 2026-03-10T07:51:15.962 INFO:tasks.workunit.client.1.vm08.stdout:6/5: stat d1 0 2026-03-10T07:51:15.962 INFO:tasks.workunit.client.1.vm08.stdout:6/6: write - no filename 2026-03-10T07:51:15.962 INFO:tasks.workunit.client.1.vm08.stdout:6/7: write - no filename 2026-03-10T07:51:15.962 
INFO:tasks.workunit.client.1.vm08.stdout:8/6: mknod d0/c4 0 2026-03-10T07:51:15.962 INFO:tasks.workunit.client.1.vm08.stdout:8/7: chown d0/c1 98250119 1 2026-03-10T07:51:15.962 INFO:tasks.workunit.client.1.vm08.stdout:9/6: rmdir d1 0 2026-03-10T07:51:15.962 INFO:tasks.workunit.client.1.vm08.stdout:9/7: write f0 [2176229,129312] 0 2026-03-10T07:51:15.962 INFO:tasks.workunit.client.1.vm08.stdout:4/1: creat f0 x:0 0 0 2026-03-10T07:51:15.962 INFO:tasks.workunit.client.1.vm08.stdout:8/8: mknod d0/c5 0 2026-03-10T07:51:15.962 INFO:tasks.workunit.client.1.vm08.stdout:4/2: dwrite f0 [0,4194304] 0 2026-03-10T07:51:15.962 INFO:tasks.workunit.client.1.vm08.stdout:3/3: mkdir d0 0 2026-03-10T07:51:15.962 INFO:tasks.workunit.client.1.vm08.stdout:3/4: chown d0 6335 1 2026-03-10T07:51:15.962 INFO:tasks.workunit.client.1.vm08.stdout:4/3: write f0 [2421532,96318] 0 2026-03-10T07:51:15.962 INFO:tasks.workunit.client.1.vm08.stdout:3/5: rename d0 to d0/d1 22 2026-03-10T07:51:15.962 INFO:tasks.workunit.client.1.vm08.stdout:3/6: write - no filename 2026-03-10T07:51:15.962 INFO:tasks.workunit.client.1.vm08.stdout:6/8: creat d1/f2 x:0 0 0 2026-03-10T07:51:15.963 INFO:tasks.workunit.client.0.vm05.stdout:9/909: rename f6 to d8/d86/d28/d79/d57/de1/d22/d33/d62/d6d/d9e/d112/f135 0 2026-03-10T07:51:15.963 INFO:tasks.workunit.client.0.vm05.stdout:0/982: mknod d8/dd/d10/c14f 0 2026-03-10T07:51:15.964 INFO:tasks.workunit.client.1.vm08.stdout:6/9: dwrite d1/f2 [0,4194304] 0 2026-03-10T07:51:15.967 INFO:tasks.workunit.client.0.vm05.stdout:8/891: truncate d1/dd/d5e/d9e/fc3 4200246 0 2026-03-10T07:51:15.971 INFO:tasks.workunit.client.0.vm05.stdout:5/975: creat d2/d20/d33/d86/dac/f152 x:0 0 0 2026-03-10T07:51:15.976 INFO:tasks.workunit.client.0.vm05.stdout:8/892: dread d1/dd/d18/f58 [0,4194304] 0 2026-03-10T07:51:15.976 INFO:tasks.workunit.client.1.vm08.stdout:8/9: creat d0/f6 x:0 0 0 2026-03-10T07:51:15.976 INFO:tasks.workunit.client.1.vm08.stdout:8/10: read - d0/f6 zero size 2026-03-10T07:51:15.976 
INFO:tasks.workunit.client.1.vm08.stdout:4/4: creat f1 x:0 0 0 2026-03-10T07:51:15.976 INFO:tasks.workunit.client.1.vm08.stdout:2/0: mkdir d0 0 2026-03-10T07:51:15.976 INFO:tasks.workunit.client.1.vm08.stdout:8/11: read - d0/f6 zero size 2026-03-10T07:51:15.976 INFO:tasks.workunit.client.1.vm08.stdout:4/5: chown f1 78127 1 2026-03-10T07:51:15.976 INFO:tasks.workunit.client.1.vm08.stdout:8/12: read - d0/f6 zero size 2026-03-10T07:51:15.976 INFO:tasks.workunit.client.1.vm08.stdout:6/10: dwrite d1/f2 [0,4194304] 0 2026-03-10T07:51:15.978 INFO:tasks.workunit.client.0.vm05.stdout:1/938: creat da/dd/d12/d86/f124 x:0 0 0 2026-03-10T07:51:15.980 INFO:tasks.workunit.client.0.vm05.stdout:9/910: fdatasync d8/d86/d28/f29 0 2026-03-10T07:51:15.980 INFO:tasks.workunit.client.0.vm05.stdout:3/928: symlink d8/d1f/d24/d76/dc5/de1/d19/d37/l13a 0 2026-03-10T07:51:15.980 INFO:tasks.workunit.client.0.vm05.stdout:0/983: symlink d8/dd/d37/dfd/l150 0 2026-03-10T07:51:15.980 INFO:tasks.workunit.client.0.vm05.stdout:5/976: mkdir d2/d12/da8/ddd/de9/d11c/d153 0 2026-03-10T07:51:15.982 INFO:tasks.workunit.client.0.vm05.stdout:8/893: symlink d1/dd/d4d/dcc/l11e 0 2026-03-10T07:51:15.983 INFO:tasks.workunit.client.1.vm08.stdout:3/7: getdents d0 0 2026-03-10T07:51:15.985 INFO:tasks.workunit.client.0.vm05.stdout:0/984: dread d8/dd/d10/d26/d2a/d13a/faa [0,4194304] 0 2026-03-10T07:51:15.987 INFO:tasks.workunit.client.0.vm05.stdout:1/939: mknod da/dd/d12/d34/d58/d10a/c125 0 2026-03-10T07:51:16.012 INFO:tasks.workunit.client.0.vm05.stdout:0/985: fsync d8/dd/d10/d26/d8b/d86/f149 0 2026-03-10T07:51:16.012 INFO:tasks.workunit.client.0.vm05.stdout:0/986: stat d8/dd/d10 0 2026-03-10T07:51:16.012 INFO:tasks.workunit.client.0.vm05.stdout:1/940: mknod da/d26/d9e/c126 0 2026-03-10T07:51:16.012 INFO:tasks.workunit.client.0.vm05.stdout:9/911: dread d8/fa [0,4194304] 0 2026-03-10T07:51:16.012 INFO:tasks.workunit.client.0.vm05.stdout:8/894: mkdir d1/d6f/df9/d102/dbf/d11f 0 2026-03-10T07:51:16.012 
INFO:tasks.workunit.client.0.vm05.stdout:9/912: write d8/d86/d28/d79/d57/d96/dd8/fec [4026306,87484] 0 2026-03-10T07:51:16.012 INFO:tasks.workunit.client.0.vm05.stdout:1/941: creat da/dd/d12/d34/ddb/def/f127 x:0 0 0 2026-03-10T07:51:16.012 INFO:tasks.workunit.client.0.vm05.stdout:0/987: symlink d8/dd/d37/d67/l151 0 2026-03-10T07:51:16.012 INFO:tasks.workunit.client.0.vm05.stdout:5/977: getdents d2/d20/d33/d86 0 2026-03-10T07:51:16.012 INFO:tasks.workunit.client.0.vm05.stdout:1/942: chown da/d26/d2b/d71/c11c 5 1 2026-03-10T07:51:16.012 INFO:tasks.workunit.client.0.vm05.stdout:9/913: creat d8/d86/d28/d79/d57/de1/d1c/d20/f136 x:0 0 0 2026-03-10T07:51:16.013 INFO:tasks.workunit.client.1.vm08.stdout:1/0: creat f0 x:0 0 0 2026-03-10T07:51:16.013 INFO:tasks.workunit.client.1.vm08.stdout:6/11: dwrite d1/f2 [0,4194304] 0 2026-03-10T07:51:16.013 INFO:tasks.workunit.client.1.vm08.stdout:2/1: mkdir d0/d1 0 2026-03-10T07:51:16.013 INFO:tasks.workunit.client.1.vm08.stdout:2/2: truncate - no filename 2026-03-10T07:51:16.013 INFO:tasks.workunit.client.1.vm08.stdout:2/3: readlink - no filename 2026-03-10T07:51:16.013 INFO:tasks.workunit.client.1.vm08.stdout:2/4: readlink - no filename 2026-03-10T07:51:16.013 INFO:tasks.workunit.client.1.vm08.stdout:2/5: write - no filename 2026-03-10T07:51:16.013 INFO:tasks.workunit.client.1.vm08.stdout:2/6: readlink - no filename 2026-03-10T07:51:16.013 INFO:tasks.workunit.client.1.vm08.stdout:2/7: dwrite - no filename 2026-03-10T07:51:16.013 INFO:tasks.workunit.client.1.vm08.stdout:1/1: symlink l1 0 2026-03-10T07:51:16.013 INFO:tasks.workunit.client.1.vm08.stdout:2/8: mknod d0/c2 0 2026-03-10T07:51:16.013 INFO:tasks.workunit.client.1.vm08.stdout:8/13: link d0/c4 d0/c7 0 2026-03-10T07:51:16.013 INFO:tasks.workunit.client.1.vm08.stdout:1/2: mkdir d2 0 2026-03-10T07:51:16.013 INFO:tasks.workunit.client.1.vm08.stdout:1/3: dread - f0 zero size 2026-03-10T07:51:16.013 INFO:tasks.workunit.client.1.vm08.stdout:8/14: creat d0/f8 x:0 0 0 
2026-03-10T07:51:16.013 INFO:tasks.workunit.client.1.vm08.stdout:2/9: mkdir d0/d1/d3 0 2026-03-10T07:51:16.013 INFO:tasks.workunit.client.1.vm08.stdout:2/10: chown d0 1591669 1 2026-03-10T07:51:16.013 INFO:tasks.workunit.client.1.vm08.stdout:2/11: dwrite - no filename 2026-03-10T07:51:16.013 INFO:tasks.workunit.client.1.vm08.stdout:8/15: chown d0/c1 296042 1 2026-03-10T07:51:16.013 INFO:tasks.workunit.client.1.vm08.stdout:8/16: truncate d0/f8 190127 0 2026-03-10T07:51:16.013 INFO:tasks.workunit.client.1.vm08.stdout:1/4: dwrite f0 [0,4194304] 0 2026-03-10T07:51:16.013 INFO:tasks.workunit.client.1.vm08.stdout:8/17: truncate d0/f8 897648 0 2026-03-10T07:51:16.013 INFO:tasks.workunit.client.1.vm08.stdout:8/18: write d0/f6 [603536,85116] 0 2026-03-10T07:51:16.013 INFO:tasks.workunit.client.1.vm08.stdout:1/5: rename d2 to d2/d3 22 2026-03-10T07:51:16.014 INFO:tasks.workunit.client.1.vm08.stdout:8/19: mkdir d0/d9 0 2026-03-10T07:51:16.015 INFO:tasks.workunit.client.0.vm05.stdout:8/895: dread d1/dd/d4d/d64/d6a/de5/d2a/d34/da5/fa8 [0,4194304] 0 2026-03-10T07:51:16.015 INFO:tasks.workunit.client.1.vm08.stdout:2/12: link d0/c2 d0/d1/d3/c4 0 2026-03-10T07:51:16.016 INFO:tasks.workunit.client.1.vm08.stdout:2/13: fsync - no filename 2026-03-10T07:51:16.016 INFO:tasks.workunit.client.0.vm05.stdout:1/943: unlink da/d26/d2b/dcb/lf8 0 2026-03-10T07:51:16.017 INFO:tasks.workunit.client.1.vm08.stdout:8/20: creat d0/fa x:0 0 0 2026-03-10T07:51:16.018 INFO:tasks.workunit.client.1.vm08.stdout:1/6: creat d2/f4 x:0 0 0 2026-03-10T07:51:16.018 INFO:tasks.workunit.client.0.vm05.stdout:1/944: readlink da/dd/d12/d34/d107/dbe/l119 0 2026-03-10T07:51:16.018 INFO:tasks.workunit.client.0.vm05.stdout:9/914: mkdir d8/d86/d28/d79/d57/de1/d1c/d20/dee/d137 0 2026-03-10T07:51:16.019 INFO:tasks.workunit.client.0.vm05.stdout:8/896: chown d1/dd/d4d/d64/d6a/f83 2315690 1 2026-03-10T07:51:16.020 INFO:tasks.workunit.client.1.vm08.stdout:2/14: mkdir d0/d1/d3/d5 0 2026-03-10T07:51:16.022 
INFO:tasks.workunit.client.1.vm08.stdout:8/21: rmdir d0/d9 0 2026-03-10T07:51:16.036 INFO:tasks.workunit.client.0.vm05.stdout:9/915: truncate d8/d86/d28/d79/d57/de1/d1c/d20/f124 4196827 0 2026-03-10T07:51:16.036 INFO:tasks.workunit.client.1.vm08.stdout:2/15: mknod d0/c6 0 2026-03-10T07:51:16.036 INFO:tasks.workunit.client.1.vm08.stdout:2/16: dread - no filename 2026-03-10T07:51:16.036 INFO:tasks.workunit.client.1.vm08.stdout:2/17: truncate - no filename 2026-03-10T07:51:16.036 INFO:tasks.workunit.client.1.vm08.stdout:2/18: mknod d0/d1/d3/c7 0 2026-03-10T07:51:16.036 INFO:tasks.workunit.client.1.vm08.stdout:2/19: creat d0/d1/d3/f8 x:0 0 0 2026-03-10T07:51:16.036 INFO:tasks.workunit.client.1.vm08.stdout:2/20: dread - d0/d1/d3/f8 zero size 2026-03-10T07:51:16.036 INFO:tasks.workunit.client.1.vm08.stdout:2/21: chown d0/d1 17775292 1 2026-03-10T07:51:16.036 INFO:tasks.workunit.client.1.vm08.stdout:2/22: truncate d0/d1/d3/f8 295400 0 2026-03-10T07:51:16.036 INFO:tasks.workunit.client.1.vm08.stdout:2/23: dread d0/d1/d3/f8 [0,4194304] 0 2026-03-10T07:51:16.036 INFO:tasks.workunit.client.1.vm08.stdout:2/24: mkdir d0/d1/d9 0 2026-03-10T07:51:16.038 INFO:tasks.workunit.client.0.vm05.stdout:9/916: truncate d8/d86/d28/d79/d57/de1/d22/f98 834415 0 2026-03-10T07:51:16.038 INFO:tasks.workunit.client.0.vm05.stdout:1/945: getdents da 0 2026-03-10T07:51:16.039 INFO:tasks.workunit.client.0.vm05.stdout:1/946: fdatasync da/d26/d9e/dcc/f109 0 2026-03-10T07:51:16.041 INFO:tasks.workunit.client.1.vm08.stdout:2/25: chown d0/d1/d3/c4 0 1 2026-03-10T07:51:16.043 INFO:tasks.workunit.client.0.vm05.stdout:8/897: truncate d1/dd/d4d/d64/d6a/de5/d2a/d48/f50 1577627 0 2026-03-10T07:51:16.043 INFO:tasks.workunit.client.1.vm08.stdout:2/26: creat d0/d1/d3/fa x:0 0 0 2026-03-10T07:51:16.044 INFO:tasks.workunit.client.1.vm08.stdout:2/27: creat d0/d1/fb x:0 0 0 2026-03-10T07:51:16.046 INFO:tasks.workunit.client.0.vm05.stdout:9/917: link d8/d86/d28/d79/d57/de1/d1c/d20/lf4 
d8/d86/d28/d79/d57/de1/d6b/dde/l138 0 2026-03-10T07:51:16.060 INFO:tasks.workunit.client.1.vm08.stdout:2/28: dwrite d0/d1/fb [0,4194304] 0 2026-03-10T07:51:16.060 INFO:tasks.workunit.client.1.vm08.stdout:2/29: chown d0/d1/d3/f8 1333 1 2026-03-10T07:51:16.060 INFO:tasks.workunit.client.0.vm05.stdout:9/918: fsync d8/d86/d28/d79/d57/de1/d6b/dde/f128 0 2026-03-10T07:51:16.060 INFO:tasks.workunit.client.0.vm05.stdout:9/919: fdatasync d8/d86/d28/d79/d57/de1/d38/fff 0 2026-03-10T07:51:16.060 INFO:tasks.workunit.client.1.vm08.stdout:5/3: sync 2026-03-10T07:51:16.061 INFO:tasks.workunit.client.1.vm08.stdout:2/30: mknod d0/d1/d3/d5/cc 0 2026-03-10T07:51:16.062 INFO:tasks.workunit.client.1.vm08.stdout:2/31: dread - d0/d1/d3/fa zero size 2026-03-10T07:51:16.066 INFO:tasks.workunit.client.1.vm08.stdout:5/4: mkdir d0 0 2026-03-10T07:51:16.066 INFO:tasks.workunit.client.1.vm08.stdout:5/5: read - no filename 2026-03-10T07:51:16.066 INFO:tasks.workunit.client.1.vm08.stdout:5/6: chown d0 12789551 1 2026-03-10T07:51:16.066 INFO:tasks.workunit.client.1.vm08.stdout:5/7: dwrite - no filename 2026-03-10T07:51:16.066 INFO:tasks.workunit.client.1.vm08.stdout:5/8: dread - no filename 2026-03-10T07:51:16.067 INFO:tasks.workunit.client.1.vm08.stdout:7/6: dread f0 [0,4194304] 0 2026-03-10T07:51:16.070 INFO:tasks.workunit.client.1.vm08.stdout:3/8: sync 2026-03-10T07:51:16.070 INFO:tasks.workunit.client.1.vm08.stdout:3/9: truncate - no filename 2026-03-10T07:51:16.070 INFO:tasks.workunit.client.1.vm08.stdout:7/7: write f0 [523126,126361] 0 2026-03-10T07:51:16.071 INFO:tasks.workunit.client.1.vm08.stdout:3/10: chown d0 33090328 1 2026-03-10T07:51:16.072 INFO:tasks.workunit.client.1.vm08.stdout:7/8: dread f0 [0,4194304] 0 2026-03-10T07:51:16.073 INFO:tasks.workunit.client.1.vm08.stdout:5/9: mknod d0/c1 0 2026-03-10T07:51:16.073 INFO:tasks.workunit.client.1.vm08.stdout:5/10: dwrite - no filename 2026-03-10T07:51:16.076 INFO:tasks.workunit.client.0.vm05.stdout:9/920: dread d8/d86/d28/d79/d57/de1/f5d 
[0,4194304] 0 2026-03-10T07:51:16.077 INFO:tasks.workunit.client.0.vm05.stdout:9/921: write d8/d86/d28/d79/d57/de1/d22/f133 [818817,28830] 0 2026-03-10T07:51:16.077 INFO:tasks.workunit.client.1.vm08.stdout:7/9: dwrite f0 [0,4194304] 0 2026-03-10T07:51:16.078 INFO:tasks.workunit.client.0.vm05.stdout:9/922: chown d8/d86/d28/d79/d57/de1/d22/d33/d47/lfb 21178292 1 2026-03-10T07:51:16.079 INFO:tasks.workunit.client.1.vm08.stdout:7/10: dread f0 [0,4194304] 0 2026-03-10T07:51:16.080 INFO:tasks.workunit.client.1.vm08.stdout:7/11: read f0 [1931058,70131] 0 2026-03-10T07:51:16.080 INFO:tasks.workunit.client.1.vm08.stdout:7/12: readlink - no filename 2026-03-10T07:51:16.083 INFO:tasks.workunit.client.1.vm08.stdout:5/11: symlink d0/l2 0 2026-03-10T07:51:16.083 INFO:tasks.workunit.client.1.vm08.stdout:5/12: chown d0 444490598 1 2026-03-10T07:51:16.087 INFO:tasks.workunit.client.1.vm08.stdout:7/13: symlink l1 0 2026-03-10T07:51:16.087 INFO:tasks.workunit.client.1.vm08.stdout:5/13: symlink d0/l3 0 2026-03-10T07:51:16.089 INFO:tasks.workunit.client.1.vm08.stdout:7/14: mknod c2 0 2026-03-10T07:51:16.089 INFO:tasks.workunit.client.1.vm08.stdout:7/15: stat f0 0 2026-03-10T07:51:16.089 INFO:tasks.workunit.client.1.vm08.stdout:5/14: mkdir d0/d4 0 2026-03-10T07:51:16.099 INFO:tasks.workunit.client.1.vm08.stdout:5/15: creat d0/d4/f5 x:0 0 0 2026-03-10T07:51:16.103 INFO:tasks.workunit.client.1.vm08.stdout:5/16: mknod d0/d4/c6 0 2026-03-10T07:51:16.103 INFO:tasks.workunit.client.1.vm08.stdout:5/17: mknod d0/d4/c7 0 2026-03-10T07:51:16.103 INFO:tasks.workunit.client.1.vm08.stdout:5/18: mkdir d0/d8 0 2026-03-10T07:51:16.103 INFO:tasks.workunit.client.1.vm08.stdout:5/19: readlink d0/l3 0 2026-03-10T07:51:16.104 INFO:tasks.workunit.client.1.vm08.stdout:3/11: sync 2026-03-10T07:51:16.104 INFO:tasks.workunit.client.1.vm08.stdout:3/12: dwrite - no filename 2026-03-10T07:51:16.104 INFO:tasks.workunit.client.1.vm08.stdout:3/13: dwrite - no filename 2026-03-10T07:51:16.106 
INFO:tasks.workunit.client.1.vm08.stdout:3/14: creat d0/f2 x:0 0 0 2026-03-10T07:51:16.111 INFO:tasks.workunit.client.1.vm08.stdout:3/15: dwrite d0/f2 [0,4194304] 0 2026-03-10T07:51:16.116 INFO:tasks.workunit.client.1.vm08.stdout:3/16: mknod d0/c3 0 2026-03-10T07:51:16.117 INFO:tasks.workunit.client.0.vm05.stdout:8/898: sync 2026-03-10T07:51:16.123 INFO:tasks.workunit.client.0.vm05.stdout:8/899: rename d1/dd/d4d/d64/d6a/de5/d2a/d48/d5a/cb0 to d1/dd/d4d/d64/d104/c120 0 2026-03-10T07:51:16.127 INFO:tasks.workunit.client.0.vm05.stdout:8/900: fdatasync d1/dd/d4d/d64/d6a/de5/f43 0 2026-03-10T07:51:16.140 INFO:tasks.workunit.client.0.vm05.stdout:8/901: readlink d1/dd/d4d/d64/d6a/de5/d2a/d34/da5/ld1 0 2026-03-10T07:51:16.140 INFO:tasks.workunit.client.0.vm05.stdout:8/902: truncate d1/d6f/fa7 2155524 0 2026-03-10T07:51:16.140 INFO:tasks.workunit.client.0.vm05.stdout:8/903: mkdir d1/dd/d121 0 2026-03-10T07:51:16.140 INFO:tasks.workunit.client.0.vm05.stdout:8/904: fsync d1/dd/d4d/d64/d6a/de5/fd2 0 2026-03-10T07:51:16.140 INFO:tasks.workunit.client.0.vm05.stdout:8/905: mkdir d1/dd/d4d/dcc/dbd/d10e/ddb/d122 0 2026-03-10T07:51:16.142 INFO:tasks.workunit.client.0.vm05.stdout:8/906: dwrite d1/dd/d4d/dcc/dbd/fcf [0,4194304] 0 2026-03-10T07:51:16.155 INFO:tasks.workunit.client.1.vm08.stdout:9/8: fdatasync f0 0 2026-03-10T07:51:16.155 INFO:tasks.workunit.client.1.vm08.stdout:9/9: write f0 [652005,62465] 0 2026-03-10T07:51:16.255 INFO:tasks.workunit.client.1.vm08.stdout:9/10: dread f0 [0,4194304] 0 2026-03-10T07:51:16.256 INFO:tasks.workunit.client.1.vm08.stdout:9/11: readlink - no filename 2026-03-10T07:51:16.273 INFO:tasks.workunit.client.1.vm08.stdout:0/7: getdents . 
0 2026-03-10T07:51:16.273 INFO:tasks.workunit.client.1.vm08.stdout:7/16: fsync f0 0 2026-03-10T07:51:16.273 INFO:tasks.workunit.client.1.vm08.stdout:7/17: fdatasync f0 0 2026-03-10T07:51:16.274 INFO:tasks.workunit.client.0.vm05.stdout:7/970: write d1/d34/d59/fca [282231,118781] 0 2026-03-10T07:51:16.278 INFO:tasks.workunit.client.1.vm08.stdout:0/8: mknod c3 0 2026-03-10T07:51:16.278 INFO:tasks.workunit.client.1.vm08.stdout:0/9: chown f2 344587 1 2026-03-10T07:51:16.278 INFO:tasks.workunit.client.1.vm08.stdout:0/10: truncate f1 195243 0 2026-03-10T07:51:16.279 INFO:tasks.workunit.client.1.vm08.stdout:0/11: fsync f0 0 2026-03-10T07:51:16.295 INFO:tasks.workunit.client.0.vm05.stdout:7/971: mkdir d1/d34/d59/d131 0 2026-03-10T07:51:16.308 INFO:tasks.workunit.client.1.vm08.stdout:4/6: fsync f0 0 2026-03-10T07:51:16.308 INFO:tasks.workunit.client.1.vm08.stdout:4/7: chown f0 310 1 2026-03-10T07:51:16.309 INFO:tasks.workunit.client.0.vm05.stdout:7/972: mknod d1/d3c/d71/c132 0 2026-03-10T07:51:16.309 INFO:tasks.workunit.client.0.vm05.stdout:7/973: readlink d1/d3c/db8/lc6 0 2026-03-10T07:51:16.310 INFO:tasks.workunit.client.0.vm05.stdout:7/974: read - d1/d3c/d104/f11c zero size 2026-03-10T07:51:16.321 INFO:tasks.workunit.client.0.vm05.stdout:7/975: creat d1/d6/d3b/f133 x:0 0 0 2026-03-10T07:51:16.321 INFO:tasks.workunit.client.1.vm08.stdout:0/12: creat f4 x:0 0 0 2026-03-10T07:51:16.321 INFO:tasks.workunit.client.1.vm08.stdout:0/13: chown f4 30827 1 2026-03-10T07:51:16.324 INFO:tasks.workunit.client.1.vm08.stdout:4/8: creat f2 x:0 0 0 2026-03-10T07:51:16.325 INFO:tasks.workunit.client.1.vm08.stdout:4/9: write f2 [92540,73777] 0 2026-03-10T07:51:16.325 INFO:tasks.workunit.client.1.vm08.stdout:4/10: truncate f1 441125 0 2026-03-10T07:51:16.328 INFO:tasks.workunit.client.1.vm08.stdout:0/14: creat f5 x:0 0 0 2026-03-10T07:51:16.328 INFO:tasks.workunit.client.1.vm08.stdout:0/15: readlink - no filename 2026-03-10T07:51:16.328 INFO:tasks.workunit.client.0.vm05.stdout:6/982: write 
d0/d11/d57/da4/db3/de7/d141/d43/fe1 [573928,93561] 0 2026-03-10T07:51:16.329 INFO:tasks.workunit.client.0.vm05.stdout:6/983: fdatasync d0/d11/d4f/da0/da6/fd9 0 2026-03-10T07:51:16.335 INFO:tasks.workunit.client.1.vm08.stdout:4/11: symlink l3 0 2026-03-10T07:51:16.335 INFO:tasks.workunit.client.1.vm08.stdout:4/12: stat f2 0 2026-03-10T07:51:16.337 INFO:tasks.workunit.client.1.vm08.stdout:6/12: write d1/f2 [4798787,15958] 0 2026-03-10T07:51:16.339 INFO:tasks.workunit.client.1.vm08.stdout:0/16: creat f6 x:0 0 0 2026-03-10T07:51:16.339 INFO:tasks.workunit.client.1.vm08.stdout:0/17: write f2 [225801,75127] 0 2026-03-10T07:51:16.347 INFO:tasks.workunit.client.0.vm05.stdout:7/976: link d1/d6/c5a d1/d3c/d71/d79/c134 0 2026-03-10T07:51:16.347 INFO:tasks.workunit.client.0.vm05.stdout:7/977: stat d1/d3c/d104/f11c 0 2026-03-10T07:51:16.348 INFO:tasks.workunit.client.1.vm08.stdout:6/13: mkdir d1/d3 0 2026-03-10T07:51:16.348 INFO:tasks.workunit.client.1.vm08.stdout:0/18: creat f7 x:0 0 0 2026-03-10T07:51:16.349 INFO:tasks.workunit.client.0.vm05.stdout:3/929: write d8/d1f/d2a/d34/f3f [4125063,61821] 0 2026-03-10T07:51:16.349 INFO:tasks.workunit.client.1.vm08.stdout:0/19: read f2 [109626,84056] 0 2026-03-10T07:51:16.349 INFO:tasks.workunit.client.1.vm08.stdout:0/20: rmdir - no directory 2026-03-10T07:51:16.358 INFO:tasks.workunit.client.1.vm08.stdout:6/14: creat d1/f4 x:0 0 0 2026-03-10T07:51:16.358 INFO:tasks.workunit.client.1.vm08.stdout:6/15: dread - d1/f4 zero size 2026-03-10T07:51:16.358 INFO:tasks.workunit.client.1.vm08.stdout:6/16: truncate d1/f4 904547 0 2026-03-10T07:51:16.359 INFO:tasks.workunit.client.0.vm05.stdout:7/978: mkdir d1/d34/d135 0 2026-03-10T07:51:16.359 INFO:tasks.workunit.client.0.vm05.stdout:7/979: truncate d1/d6/f12c 448132 0 2026-03-10T07:51:16.363 INFO:tasks.workunit.client.1.vm08.stdout:0/21: creat f8 x:0 0 0 2026-03-10T07:51:16.363 INFO:tasks.workunit.client.1.vm08.stdout:4/13: link l3 l4 0 2026-03-10T07:51:16.364 
INFO:tasks.workunit.client.1.vm08.stdout:0/22: write f7 [359561,19163] 0 2026-03-10T07:51:16.364 INFO:tasks.workunit.client.1.vm08.stdout:0/23: dread - f8 zero size 2026-03-10T07:51:16.364 INFO:tasks.workunit.client.1.vm08.stdout:0/24: dread - f4 zero size 2026-03-10T07:51:16.365 INFO:tasks.workunit.client.1.vm08.stdout:4/14: read f2 [7816,98223] 0 2026-03-10T07:51:16.366 INFO:tasks.workunit.client.1.vm08.stdout:6/17: rename d1/f4 to d1/f5 0 2026-03-10T07:51:16.368 INFO:tasks.workunit.client.0.vm05.stdout:0/988: write d8/dd/d37/d81/f91 [588177,16870] 0 2026-03-10T07:51:16.368 INFO:tasks.workunit.client.1.vm08.stdout:6/18: dread d1/f2 [0,4194304] 0 2026-03-10T07:51:16.372 INFO:tasks.workunit.client.1.vm08.stdout:0/25: unlink f1 0 2026-03-10T07:51:16.373 INFO:tasks.workunit.client.0.vm05.stdout:3/930: getdents d8/d1f/d24/d45/ddf 0 2026-03-10T07:51:16.373 INFO:tasks.workunit.client.0.vm05.stdout:5/978: write d2/d20/d7b/dbc/f125 [2919713,15758] 0 2026-03-10T07:51:16.374 INFO:tasks.workunit.client.0.vm05.stdout:5/979: stat d2/d12/dda/da1/fbf 0 2026-03-10T07:51:16.374 INFO:tasks.workunit.client.0.vm05.stdout:5/980: read - d2/d20/d33/d53/d7d/f144 zero size 2026-03-10T07:51:16.378 INFO:tasks.workunit.client.1.vm08.stdout:4/15: mkdir d5 0 2026-03-10T07:51:16.380 INFO:tasks.workunit.client.1.vm08.stdout:1/7: rmdir d2 39 2026-03-10T07:51:16.380 INFO:tasks.workunit.client.1.vm08.stdout:8/22: rmdir d0 39 2026-03-10T07:51:16.381 INFO:tasks.workunit.client.0.vm05.stdout:7/980: sync 2026-03-10T07:51:16.382 INFO:tasks.workunit.client.0.vm05.stdout:0/989: sync 2026-03-10T07:51:16.390 INFO:tasks.workunit.client.1.vm08.stdout:6/19: unlink l0 0 2026-03-10T07:51:16.393 INFO:tasks.workunit.client.0.vm05.stdout:7/981: unlink d1/d6/l26 0 2026-03-10T07:51:16.393 INFO:tasks.workunit.client.0.vm05.stdout:7/982: fdatasync d1/d3c/d104/f11c 0 2026-03-10T07:51:16.394 INFO:tasks.workunit.client.0.vm05.stdout:1/947: write da/fb2 [1800517,32554] 0 2026-03-10T07:51:16.403 
INFO:tasks.workunit.client.0.vm05.stdout:1/948: dread da/f5c [0,4194304] 0 2026-03-10T07:51:16.404 INFO:tasks.workunit.client.0.vm05.stdout:1/949: read da/d26/d9e/dcc/f110 [1363833,115235] 0 2026-03-10T07:51:16.405 INFO:tasks.workunit.client.0.vm05.stdout:0/990: symlink d8/d9c/dc8/d100/l152 0 2026-03-10T07:51:16.407 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:51:16 vm05.local ceph-mon[50387]: pgmap v27: 65 pgs: 65 active+clean; 2.0 GiB data, 6.6 GiB used, 113 GiB / 120 GiB avail; 48 MiB/s rd, 100 MiB/s wr, 296 op/s 2026-03-10T07:51:16.408 INFO:tasks.workunit.client.1.vm08.stdout:2/32: rmdir d0/d1 39 2026-03-10T07:51:16.409 INFO:tasks.workunit.client.1.vm08.stdout:0/26: mknod c9 0 2026-03-10T07:51:16.410 INFO:tasks.workunit.client.0.vm05.stdout:9/923: write d8/d86/d28/d79/d57/de1/d38/fd0 [55666,12844] 0 2026-03-10T07:51:16.413 INFO:tasks.workunit.client.1.vm08.stdout:0/27: dwrite f8 [0,4194304] 0 2026-03-10T07:51:16.418 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:51:16 vm08.local ceph-mon[59917]: pgmap v27: 65 pgs: 65 active+clean; 2.0 GiB data, 6.6 GiB used, 113 GiB / 120 GiB avail; 48 MiB/s rd, 100 MiB/s wr, 296 op/s 2026-03-10T07:51:16.419 INFO:tasks.workunit.client.1.vm08.stdout:7/18: truncate f0 2974027 0 2026-03-10T07:51:16.424 INFO:tasks.workunit.client.1.vm08.stdout:8/23: stat d0/fa 0 2026-03-10T07:51:16.429 INFO:tasks.workunit.client.1.vm08.stdout:1/8: dwrite d2/f4 [0,4194304] 0 2026-03-10T07:51:16.430 INFO:tasks.workunit.client.1.vm08.stdout:8/24: dwrite d0/f8 [0,4194304] 0 2026-03-10T07:51:16.442 INFO:tasks.workunit.client.1.vm08.stdout:6/20: creat d1/f6 x:0 0 0 2026-03-10T07:51:16.442 INFO:tasks.workunit.client.1.vm08.stdout:6/21: chown d1/f6 4825 1 2026-03-10T07:51:16.447 INFO:tasks.workunit.client.0.vm05.stdout:1/950: fsync da/dd/d42/fe3 0 2026-03-10T07:51:16.448 INFO:tasks.workunit.client.0.vm05.stdout:1/951: fsync da/dd/d12/d34/ddb/def/f127 0 2026-03-10T07:51:16.449 INFO:tasks.workunit.client.0.vm05.stdout:1/952: write da/d26/f11e 
[366262,128927] 0 2026-03-10T07:51:16.450 INFO:tasks.workunit.client.1.vm08.stdout:2/33: dwrite d0/d1/fb [0,4194304] 0 2026-03-10T07:51:16.457 INFO:tasks.workunit.client.0.vm05.stdout:0/991: symlink d8/d9c/dc8/d10f/d127/d12f/dc3/l153 0 2026-03-10T07:51:16.463 INFO:tasks.workunit.client.1.vm08.stdout:5/20: getdents d0 0 2026-03-10T07:51:16.479 INFO:tasks.workunit.client.0.vm05.stdout:9/924: rename d8/d86/d28/d79/d57/de1/d1c/f8c to d8/d86/d28/d79/d57/de1/d22/d33/d47/f139 0 2026-03-10T07:51:16.479 INFO:tasks.workunit.client.1.vm08.stdout:5/21: dread - d0/d4/f5 zero size 2026-03-10T07:51:16.480 INFO:tasks.workunit.client.1.vm08.stdout:5/22: chown d0/d4/f5 0 1 2026-03-10T07:51:16.480 INFO:tasks.workunit.client.1.vm08.stdout:0/28: rename c3 to ca 0 2026-03-10T07:51:16.480 INFO:tasks.workunit.client.1.vm08.stdout:0/29: write f7 [961583,9492] 0 2026-03-10T07:51:16.480 INFO:tasks.workunit.client.1.vm08.stdout:0/30: dwrite f5 [0,4194304] 0 2026-03-10T07:51:16.480 INFO:tasks.workunit.client.1.vm08.stdout:0/31: fdatasync f4 0 2026-03-10T07:51:16.480 INFO:tasks.workunit.client.1.vm08.stdout:0/32: dwrite f5 [0,4194304] 0 2026-03-10T07:51:16.488 INFO:tasks.workunit.client.1.vm08.stdout:3/17: fsync d0/f2 0 2026-03-10T07:51:16.491 INFO:tasks.workunit.client.1.vm08.stdout:4/16: chown l3 13 1 2026-03-10T07:51:16.512 INFO:tasks.workunit.client.0.vm05.stdout:1/953: creat da/d26/d2b/dcb/f128 x:0 0 0 2026-03-10T07:51:16.518 INFO:tasks.workunit.client.1.vm08.stdout:6/22: creat d1/f7 x:0 0 0 2026-03-10T07:51:16.518 INFO:tasks.workunit.client.1.vm08.stdout:6/23: readlink - no filename 2026-03-10T07:51:16.522 INFO:tasks.workunit.client.1.vm08.stdout:6/24: dwrite d1/f7 [0,4194304] 0 2026-03-10T07:51:16.522 INFO:tasks.workunit.client.1.vm08.stdout:6/25: dread - d1/f6 zero size 2026-03-10T07:51:16.523 INFO:tasks.workunit.client.0.vm05.stdout:8/907: dwrite d1/d52/f94 [0,4194304] 0 2026-03-10T07:51:16.533 INFO:tasks.workunit.client.0.vm05.stdout:0/992: dread d8/dd/d37/d56/feb [0,4194304] 0 
2026-03-10T07:51:16.533 INFO:tasks.workunit.client.1.vm08.stdout:9/12: truncate f0 1107253 0 2026-03-10T07:51:16.546 INFO:tasks.workunit.client.0.vm05.stdout:1/954: fdatasync da/d26/d2b/f65 0 2026-03-10T07:51:16.550 INFO:tasks.workunit.client.1.vm08.stdout:5/23: creat d0/f9 x:0 0 0 2026-03-10T07:51:16.556 INFO:tasks.workunit.client.1.vm08.stdout:5/24: chown d0/c1 21079281 1 2026-03-10T07:51:16.556 INFO:tasks.workunit.client.1.vm08.stdout:5/25: dread - d0/d4/f5 zero size 2026-03-10T07:51:16.565 INFO:tasks.workunit.client.0.vm05.stdout:9/925: truncate d8/d86/d28/d79/d57/de1/d38/d71/d81/d11c/f12f 453812 0 2026-03-10T07:51:16.567 INFO:tasks.workunit.client.1.vm08.stdout:9/13: sync 2026-03-10T07:51:16.569 INFO:tasks.workunit.client.1.vm08.stdout:9/14: sync 2026-03-10T07:51:16.570 INFO:tasks.workunit.client.0.vm05.stdout:8/908: creat d1/dd/d18/f123 x:0 0 0 2026-03-10T07:51:16.575 INFO:tasks.workunit.client.1.vm08.stdout:0/33: creat fb x:0 0 0 2026-03-10T07:51:16.576 INFO:tasks.workunit.client.0.vm05.stdout:8/909: dread - d1/dd/d4d/d64/d6a/de5/d2a/d48/d7c/d9c/f117 zero size 2026-03-10T07:51:16.576 INFO:tasks.workunit.client.1.vm08.stdout:0/34: truncate f4 389550 0 2026-03-10T07:51:16.576 INFO:tasks.workunit.client.1.vm08.stdout:0/35: rmdir - no directory 2026-03-10T07:51:16.576 INFO:tasks.workunit.client.1.vm08.stdout:0/36: chown f8 1220591 1 2026-03-10T07:51:16.576 INFO:tasks.workunit.client.1.vm08.stdout:0/37: write f0 [451726,107774] 0 2026-03-10T07:51:16.579 INFO:tasks.workunit.client.1.vm08.stdout:0/38: dread f7 [0,4194304] 0 2026-03-10T07:51:16.579 INFO:tasks.workunit.client.1.vm08.stdout:0/39: write f8 [3230227,80949] 0 2026-03-10T07:51:16.580 INFO:tasks.workunit.client.1.vm08.stdout:3/18: symlink d0/l4 0 2026-03-10T07:51:16.581 INFO:tasks.workunit.client.1.vm08.stdout:3/19: write d0/f2 [3344333,60868] 0 2026-03-10T07:51:16.581 INFO:tasks.workunit.client.0.vm05.stdout:6/984: write d0/fa [1247808,96737] 0 2026-03-10T07:51:16.581 
INFO:tasks.workunit.client.1.vm08.stdout:3/20: write d0/f2 [4520494,99922] 0 2026-03-10T07:51:16.582 INFO:tasks.workunit.client.1.vm08.stdout:3/21: readlink d0/l4 0 2026-03-10T07:51:16.582 INFO:tasks.workunit.client.0.vm05.stdout:6/985: chown d0/d11/d57/d60/f74 22 1 2026-03-10T07:51:16.583 INFO:tasks.workunit.client.1.vm08.stdout:4/17: write f2 [351843,32615] 0 2026-03-10T07:51:16.585 INFO:tasks.workunit.client.1.vm08.stdout:4/18: dread f1 [0,4194304] 0 2026-03-10T07:51:16.586 INFO:tasks.workunit.client.1.vm08.stdout:4/19: write f2 [1306354,121435] 0 2026-03-10T07:51:16.590 INFO:tasks.workunit.client.0.vm05.stdout:9/926: mkdir d8/d86/d28/d79/d57/de1/d22/d33/d62/d6d/de7/d13a 0 2026-03-10T07:51:16.591 INFO:tasks.workunit.client.1.vm08.stdout:3/22: dwrite d0/f2 [0,4194304] 0 2026-03-10T07:51:16.600 INFO:tasks.workunit.client.0.vm05.stdout:0/993: mkdir d8/dd/d37/d154 0 2026-03-10T07:51:16.606 INFO:tasks.workunit.client.0.vm05.stdout:1/955: creat da/dd/d2a/d70/f129 x:0 0 0 2026-03-10T07:51:16.608 INFO:tasks.workunit.client.1.vm08.stdout:6/26: write d1/f5 [1452391,67055] 0 2026-03-10T07:51:16.613 INFO:tasks.workunit.client.1.vm08.stdout:2/34: mknod d0/d1/d9/cd 0 2026-03-10T07:51:16.613 INFO:tasks.workunit.client.0.vm05.stdout:8/910: symlink d1/dd/d4d/d64/d6a/l124 0 2026-03-10T07:51:16.617 INFO:tasks.workunit.client.1.vm08.stdout:5/26: unlink d0/l2 0 2026-03-10T07:51:16.626 INFO:tasks.workunit.client.1.vm08.stdout:5/27: write d0/f9 [921267,76558] 0 2026-03-10T07:51:16.626 INFO:tasks.workunit.client.1.vm08.stdout:5/28: truncate d0/d4/f5 555285 0 2026-03-10T07:51:16.626 INFO:tasks.workunit.client.0.vm05.stdout:3/931: truncate d8/d22/f29 154436 0 2026-03-10T07:51:16.626 INFO:tasks.workunit.client.0.vm05.stdout:9/927: creat d8/d86/d28/d79/d57/de1/d22/dab/f13b x:0 0 0 2026-03-10T07:51:16.626 INFO:tasks.workunit.client.0.vm05.stdout:5/981: write d2/d20/d33/f151 [383433,54900] 0 2026-03-10T07:51:16.628 INFO:tasks.workunit.client.1.vm08.stdout:9/15: mkdir d2 0 
2026-03-10T07:51:16.629 INFO:tasks.workunit.client.0.vm05.stdout:0/994: rmdir d8/d9c/dc8/d10f/d127/d12f/dc3/d10b 39 2026-03-10T07:51:16.640 INFO:tasks.workunit.client.0.vm05.stdout:7/983: write d1/d3c/f89 [3002021,17742] 0 2026-03-10T07:51:16.645 INFO:tasks.workunit.client.0.vm05.stdout:1/956: dread f4 [8388608,4194304] 0 2026-03-10T07:51:16.652 INFO:tasks.workunit.client.0.vm05.stdout:8/911: unlink d1/dd/d4d/d64/fac 0 2026-03-10T07:51:16.653 INFO:tasks.workunit.client.1.vm08.stdout:0/40: symlink lc 0 2026-03-10T07:51:16.662 INFO:tasks.workunit.client.0.vm05.stdout:5/982: creat d2/d12/d2d/d4a/de7/f154 x:0 0 0 2026-03-10T07:51:16.666 INFO:tasks.workunit.client.0.vm05.stdout:0/995: creat d8/dd/d10/d26/d8b/d86/f155 x:0 0 0 2026-03-10T07:51:16.673 INFO:tasks.workunit.client.0.vm05.stdout:1/957: fsync da/dd/d2a/d55/d64/d104/f36 0 2026-03-10T07:51:16.675 INFO:tasks.workunit.client.0.vm05.stdout:6/986: dwrite d0/d11/d57/d60/dcc/f110 [0,4194304] 0 2026-03-10T07:51:16.688 INFO:tasks.workunit.client.0.vm05.stdout:8/912: creat d1/dd/d5e/d9e/f125 x:0 0 0 2026-03-10T07:51:16.693 INFO:tasks.workunit.client.1.vm08.stdout:6/27: dread d1/f2 [0,4194304] 0 2026-03-10T07:51:16.693 INFO:tasks.workunit.client.1.vm08.stdout:6/28: chown d1/f2 25931 1 2026-03-10T07:51:16.693 INFO:tasks.workunit.client.1.vm08.stdout:6/29: stat d1/f6 0 2026-03-10T07:51:16.693 INFO:tasks.workunit.client.1.vm08.stdout:7/19: write f0 [2594647,7556] 0 2026-03-10T07:51:16.693 INFO:tasks.workunit.client.1.vm08.stdout:7/20: chown f0 99 1 2026-03-10T07:51:16.695 INFO:tasks.workunit.client.1.vm08.stdout:2/35: rmdir d0/d1/d3/d5 39 2026-03-10T07:51:16.702 INFO:tasks.workunit.client.0.vm05.stdout:0/996: fdatasync d8/dd/d10/d26/d8b/d7d/f123 0 2026-03-10T07:51:16.703 INFO:tasks.workunit.client.0.vm05.stdout:0/997: chown d8/dd/d10/c44 4172 1 2026-03-10T07:51:16.708 INFO:tasks.workunit.client.1.vm08.stdout:0/41: fsync f7 0 2026-03-10T07:51:16.708 INFO:tasks.workunit.client.0.vm05.stdout:9/928: dwrite d8/f9 [4194304,4194304] 
0 2026-03-10T07:51:16.708 INFO:tasks.workunit.client.1.vm08.stdout:0/42: truncate f6 72235 0 2026-03-10T07:51:16.713 INFO:tasks.workunit.client.1.vm08.stdout:0/43: dwrite fb [0,4194304] 0 2026-03-10T07:51:16.713 INFO:tasks.workunit.client.1.vm08.stdout:0/44: stat f7 0 2026-03-10T07:51:16.713 INFO:tasks.workunit.client.1.vm08.stdout:0/45: stat f2 0 2026-03-10T07:51:16.714 INFO:tasks.workunit.client.1.vm08.stdout:0/46: write f6 [422384,128452] 0 2026-03-10T07:51:16.719 INFO:tasks.workunit.client.0.vm05.stdout:7/984: truncate d1/d6/d3b/ffa 4836075 0 2026-03-10T07:51:16.719 INFO:tasks.workunit.client.0.vm05.stdout:7/985: readlink d1/d6/d80/l91 0 2026-03-10T07:51:16.722 INFO:tasks.workunit.client.1.vm08.stdout:4/20: mkdir d5/d6 0 2026-03-10T07:51:16.722 INFO:tasks.workunit.client.0.vm05.stdout:3/932: truncate d8/d22/fb9 578942 0 2026-03-10T07:51:16.724 INFO:tasks.workunit.client.0.vm05.stdout:1/958: creat da/dd/d2a/d55/d64/dd1/f12a x:0 0 0 2026-03-10T07:51:16.727 INFO:tasks.workunit.client.1.vm08.stdout:1/9: getdents d2 0 2026-03-10T07:51:16.732 INFO:tasks.workunit.client.0.vm05.stdout:6/987: symlink d0/d11/d57/d60/l14d 0 2026-03-10T07:51:16.733 INFO:tasks.workunit.client.1.vm08.stdout:8/25: getdents d0 0 2026-03-10T07:51:16.733 INFO:tasks.workunit.client.1.vm08.stdout:6/30: rmdir d1 39 2026-03-10T07:51:16.733 INFO:tasks.workunit.client.1.vm08.stdout:8/26: dwrite d0/f8 [0,4194304] 0 2026-03-10T07:51:16.736 INFO:tasks.workunit.client.0.vm05.stdout:6/988: dwrite d0/d11/d57/d60/f74 [0,4194304] 0 2026-03-10T07:51:16.737 INFO:tasks.workunit.client.1.vm08.stdout:2/36: truncate d0/d1/d3/f8 781646 0 2026-03-10T07:51:16.739 INFO:tasks.workunit.client.1.vm08.stdout:8/27: dwrite d0/f6 [0,4194304] 0 2026-03-10T07:51:16.744 INFO:tasks.workunit.client.1.vm08.stdout:9/16: read f0 [946969,92156] 0 2026-03-10T07:51:16.746 INFO:tasks.workunit.client.1.vm08.stdout:8/28: dwrite d0/f6 [0,4194304] 0 2026-03-10T07:51:16.746 INFO:tasks.workunit.client.1.vm08.stdout:9/17: dread f0 [0,4194304] 0 
2026-03-10T07:51:16.748 INFO:tasks.workunit.client.0.vm05.stdout:6/989: dwrite d0/fa [0,4194304] 0 2026-03-10T07:51:16.753 INFO:tasks.workunit.client.0.vm05.stdout:7/986: rmdir d1/d34 39 2026-03-10T07:51:16.754 INFO:tasks.workunit.client.0.vm05.stdout:7/987: fsync d1/d3c/d71/fd2 0 2026-03-10T07:51:16.783 INFO:tasks.workunit.client.1.vm08.stdout:0/47: mkdir dd 0 2026-03-10T07:51:16.795 INFO:tasks.workunit.client.1.vm08.stdout:1/10: symlink d2/l5 0 2026-03-10T07:51:16.795 INFO:tasks.workunit.client.0.vm05.stdout:8/913: unlink d1/d45/cbb 0 2026-03-10T07:51:16.796 INFO:tasks.workunit.client.1.vm08.stdout:6/31: stat d1/f7 0 2026-03-10T07:51:16.797 INFO:tasks.workunit.client.0.vm05.stdout:9/929: symlink d8/d86/d28/d79/d57/de1/d38/d71/l13c 0 2026-03-10T07:51:16.800 INFO:tasks.workunit.client.1.vm08.stdout:2/37: stat d0/d1/d3/d5 0 2026-03-10T07:51:16.814 INFO:tasks.workunit.client.0.vm05.stdout:3/933: truncate d8/d1c/f56 2744230 0 2026-03-10T07:51:16.814 INFO:tasks.workunit.client.0.vm05.stdout:5/983: getdents d2/d12/d2d 0 2026-03-10T07:51:16.814 INFO:tasks.workunit.client.0.vm05.stdout:0/998: creat d8/dd/d10/d26/d3a/f156 x:0 0 0 2026-03-10T07:51:16.814 INFO:tasks.workunit.client.1.vm08.stdout:8/29: mknod d0/cb 0 2026-03-10T07:51:16.814 INFO:tasks.workunit.client.1.vm08.stdout:8/30: dread d0/f8 [0,4194304] 0 2026-03-10T07:51:16.814 INFO:tasks.workunit.client.1.vm08.stdout:8/31: dread d0/f6 [0,4194304] 0 2026-03-10T07:51:16.814 INFO:tasks.workunit.client.1.vm08.stdout:4/21: creat d5/d6/f7 x:0 0 0 2026-03-10T07:51:16.814 INFO:tasks.workunit.client.1.vm08.stdout:3/23: getdents d0 0 2026-03-10T07:51:16.814 INFO:tasks.workunit.client.1.vm08.stdout:1/11: unlink l1 0 2026-03-10T07:51:16.814 INFO:tasks.workunit.client.1.vm08.stdout:3/24: dread d0/f2 [0,4194304] 0 2026-03-10T07:51:16.814 INFO:tasks.workunit.client.1.vm08.stdout:6/32: rename d1/f5 to d1/f8 0 2026-03-10T07:51:16.814 INFO:tasks.workunit.client.1.vm08.stdout:6/33: dread - d1/f6 zero size 2026-03-10T07:51:16.814 
INFO:tasks.workunit.client.1.vm08.stdout:3/25: chown d0/l4 2016 1 2026-03-10T07:51:16.814 INFO:tasks.workunit.client.1.vm08.stdout:6/34: fsync d1/f6 0 2026-03-10T07:51:16.815 INFO:tasks.workunit.client.1.vm08.stdout:2/38: rmdir d0 39 2026-03-10T07:51:16.815 INFO:tasks.workunit.client.0.vm05.stdout:5/984: mkdir d2/d5/d61/d155 0 2026-03-10T07:51:16.819 INFO:tasks.workunit.client.1.vm08.stdout:8/32: sync 2026-03-10T07:51:16.821 INFO:tasks.workunit.client.1.vm08.stdout:8/33: dread - d0/fa zero size 2026-03-10T07:51:16.831 INFO:tasks.workunit.client.1.vm08.stdout:9/18: mkdir d2/d3 0 2026-03-10T07:51:16.838 INFO:tasks.workunit.client.1.vm08.stdout:0/48: creat dd/fe x:0 0 0 2026-03-10T07:51:16.838 INFO:tasks.workunit.client.0.vm05.stdout:6/990: link d0/c38 d0/d11/d57/da4/db3/de7/d141/dc8/c14e 0 2026-03-10T07:51:16.839 INFO:tasks.workunit.client.0.vm05.stdout:6/991: chown d0/d11/d57/da4/db3/de7/d141/d43/l67 51 1 2026-03-10T07:51:16.843 INFO:tasks.workunit.client.1.vm08.stdout:4/22: rmdir d5 39 2026-03-10T07:51:16.850 INFO:tasks.workunit.client.1.vm08.stdout:1/12: rmdir d2 39 2026-03-10T07:51:16.854 INFO:tasks.workunit.client.1.vm08.stdout:7/21: getdents . 
0 2026-03-10T07:51:16.854 INFO:tasks.workunit.client.1.vm08.stdout:7/22: fdatasync f0 0 2026-03-10T07:51:16.856 INFO:tasks.workunit.client.1.vm08.stdout:7/23: write f0 [2677535,38685] 0 2026-03-10T07:51:16.857 INFO:tasks.workunit.client.0.vm05.stdout:0/999: dread d8/dd/d37/d67/f125 [0,4194304] 0 2026-03-10T07:51:16.866 INFO:tasks.workunit.client.1.vm08.stdout:3/26: mknod d0/c5 0 2026-03-10T07:51:16.891 INFO:tasks.workunit.client.0.vm05.stdout:1/959: link da/dd/d2a/f6f da/dd/d12/d34/d58/f12b 0 2026-03-10T07:51:16.891 INFO:tasks.workunit.client.1.vm08.stdout:5/29: truncate d0/f9 832437 0 2026-03-10T07:51:16.891 INFO:tasks.workunit.client.1.vm08.stdout:5/30: stat d0/d8 0 2026-03-10T07:51:16.915 INFO:tasks.workunit.client.0.vm05.stdout:8/914: link d1/dd/d4d/d64/d6a/l124 d1/l126 0 2026-03-10T07:51:16.919 INFO:tasks.workunit.client.0.vm05.stdout:7/988: dwrite d1/d6/d47/f65 [0,4194304] 0 2026-03-10T07:51:16.919 INFO:tasks.workunit.client.0.vm05.stdout:6/992: mknod d0/d11/d4f/d11c/c14f 0 2026-03-10T07:51:16.942 INFO:tasks.workunit.client.0.vm05.stdout:6/993: dread d0/d11/d57/da4/db3/de7/d141/f71 [0,4194304] 0 2026-03-10T07:51:16.947 INFO:tasks.workunit.client.1.vm08.stdout:8/34: rmdir d0 39 2026-03-10T07:51:16.957 INFO:tasks.workunit.client.0.vm05.stdout:3/934: fsync d8/d1f/d24/d76/dc5/de1/d52/fde 0 2026-03-10T07:51:16.957 INFO:tasks.workunit.client.0.vm05.stdout:9/930: write d8/d86/d28/d79/d57/de1/d1c/d75/f88 [1484573,65997] 0 2026-03-10T07:51:16.968 INFO:tasks.workunit.client.0.vm05.stdout:3/935: dread d8/d1f/d24/d76/dc5/de1/d19/d6b/fe7 [0,4194304] 0 2026-03-10T07:51:16.972 INFO:tasks.workunit.client.0.vm05.stdout:8/915: creat d1/d6f/df9/d102/dbf/f127 x:0 0 0 2026-03-10T07:51:16.976 INFO:tasks.workunit.client.1.vm08.stdout:4/23: dread - d5/d6/f7 zero size 2026-03-10T07:51:16.980 INFO:tasks.workunit.client.1.vm08.stdout:4/24: dwrite f2 [0,4194304] 0 2026-03-10T07:51:16.980 INFO:tasks.workunit.client.1.vm08.stdout:4/25: read f0 [528202,45589] 0 2026-03-10T07:51:17.005 
INFO:tasks.workunit.client.1.vm08.stdout:5/31: rmdir d0/d4 39 2026-03-10T07:51:17.006 INFO:tasks.workunit.client.1.vm08.stdout:6/35: symlink d1/d3/l9 0 2026-03-10T07:51:17.006 INFO:tasks.workunit.client.1.vm08.stdout:8/35: chown d0 23661240 1 2026-03-10T07:51:17.006 INFO:tasks.workunit.client.0.vm05.stdout:5/985: truncate d2/d5/f3d 2298967 0 2026-03-10T07:51:17.007 INFO:tasks.workunit.client.0.vm05.stdout:6/994: mknod d0/d11/d57/da4/db3/de7/d141/db8/d129/c150 0 2026-03-10T07:51:17.010 INFO:tasks.workunit.client.0.vm05.stdout:8/916: chown d1/d45/f55 26 1 2026-03-10T07:51:17.011 INFO:tasks.workunit.client.0.vm05.stdout:5/986: write d2/d12/da8/ddd/f143 [717395,7203] 0 2026-03-10T07:51:17.014 INFO:tasks.workunit.client.0.vm05.stdout:1/960: dwrite da/dd/d12/d34/d107/dbe/dc0/ffb [0,4194304] 0 2026-03-10T07:51:17.014 INFO:tasks.workunit.client.0.vm05.stdout:8/917: chown d1/dd/d4d/d64/d6a/de5/d2a/d48/d5a/f98 59394962 1 2026-03-10T07:51:17.020 INFO:tasks.workunit.client.0.vm05.stdout:8/918: chown d1/dd/d4d/d64/d6a/de5/d2a/d48/d7c/f80 718144231 1 2026-03-10T07:51:17.023 INFO:tasks.workunit.client.0.vm05.stdout:1/961: dwrite da/dd/d2a/d55/f122 [0,4194304] 0 2026-03-10T07:51:17.028 INFO:tasks.workunit.client.0.vm05.stdout:9/931: dread d8/d86/d28/d79/d57/de1/d22/d33/d62/fbf [4194304,4194304] 0 2026-03-10T07:51:17.030 INFO:tasks.workunit.client.1.vm08.stdout:3/27: truncate d0/f2 3128308 0 2026-03-10T07:51:17.038 INFO:tasks.workunit.client.1.vm08.stdout:1/13: rmdir d2 39 2026-03-10T07:51:17.038 INFO:tasks.workunit.client.0.vm05.stdout:3/936: mkdir d8/d22/d60/d6e/d105/d118/d13b 0 2026-03-10T07:51:17.041 INFO:tasks.workunit.client.1.vm08.stdout:4/26: rename d5/d6 to d5/d8 0 2026-03-10T07:51:17.054 INFO:tasks.workunit.client.1.vm08.stdout:9/19: write f0 [1477540,117195] 0 2026-03-10T07:51:17.064 INFO:tasks.workunit.client.0.vm05.stdout:6/995: dread - d0/d11/d57/fb9 zero size 2026-03-10T07:51:17.066 INFO:tasks.workunit.client.1.vm08.stdout:6/36: write d1/f8 [1154230,69948] 0 
2026-03-10T07:51:17.067 INFO:tasks.workunit.client.1.vm08.stdout:2/39: rename d0/d1/d3/c4 to d0/d1/ce 0 2026-03-10T07:51:17.067 INFO:tasks.workunit.client.1.vm08.stdout:3/28: rename d0 to d0/d6 22 2026-03-10T07:51:17.067 INFO:tasks.workunit.client.0.vm05.stdout:6/996: write d0/fa [2310879,127182] 0 2026-03-10T07:51:17.067 INFO:tasks.workunit.client.1.vm08.stdout:2/40: chown d0/d1/d9/cd 59 1 2026-03-10T07:51:17.069 INFO:tasks.workunit.client.1.vm08.stdout:6/37: dread d1/f7 [0,4194304] 0 2026-03-10T07:51:17.069 INFO:tasks.workunit.client.1.vm08.stdout:8/36: symlink d0/lc 0 2026-03-10T07:51:17.069 INFO:tasks.workunit.client.1.vm08.stdout:8/37: readlink d0/lc 0 2026-03-10T07:51:17.071 INFO:tasks.workunit.client.1.vm08.stdout:0/49: link lc dd/lf 0 2026-03-10T07:51:17.073 INFO:tasks.workunit.client.1.vm08.stdout:4/27: readlink l3 0 2026-03-10T07:51:17.074 INFO:tasks.workunit.client.0.vm05.stdout:5/987: rename l1 to d2/d5/l156 0 2026-03-10T07:51:17.076 INFO:tasks.workunit.client.1.vm08.stdout:8/38: dread d0/f8 [0,4194304] 0 2026-03-10T07:51:17.077 INFO:tasks.workunit.client.0.vm05.stdout:8/919: link d1/dc9/l42 d1/dd/d4d/d64/d6a/de5/d2a/d34/d49/d5d/l128 0 2026-03-10T07:51:17.077 INFO:tasks.workunit.client.1.vm08.stdout:5/32: symlink d0/la 0 2026-03-10T07:51:17.077 INFO:tasks.workunit.client.1.vm08.stdout:1/14: dread d2/f4 [0,4194304] 0 2026-03-10T07:51:17.078 INFO:tasks.workunit.client.0.vm05.stdout:9/932: rename d8/d86/d28/d79/d57/de1/d22/d33/ff8 to d8/d86/d28/d79/d57/de1/d22/d33/d70/f13d 0 2026-03-10T07:51:17.078 INFO:tasks.workunit.client.1.vm08.stdout:6/38: dwrite d1/f6 [0,4194304] 0 2026-03-10T07:51:17.078 INFO:tasks.workunit.client.0.vm05.stdout:9/933: chown d8/d86/d28/d79/d57/de1/f48 5557559 1 2026-03-10T07:51:17.080 INFO:tasks.workunit.client.0.vm05.stdout:9/934: write d8/d86/d28/d79/d57/de1/d38/d71/d81/f83 [1102154,129570] 0 2026-03-10T07:51:17.080 INFO:tasks.workunit.client.0.vm05.stdout:8/920: fdatasync d1/d6f/fa7 0 2026-03-10T07:51:17.085 
INFO:tasks.workunit.client.1.vm08.stdout:6/39: dwrite d1/f8 [0,4194304] 0 2026-03-10T07:51:17.085 INFO:tasks.workunit.client.1.vm08.stdout:0/50: mkdir dd/d10 0 2026-03-10T07:51:17.085 INFO:tasks.workunit.client.1.vm08.stdout:9/20: creat d2/d3/f4 x:0 0 0 2026-03-10T07:51:17.085 INFO:tasks.workunit.client.1.vm08.stdout:0/51: stat f6 0 2026-03-10T07:51:17.088 INFO:tasks.workunit.client.1.vm08.stdout:2/41: dread d0/d1/d3/f8 [0,4194304] 0 2026-03-10T07:51:17.089 INFO:tasks.workunit.client.1.vm08.stdout:2/42: dread - d0/d1/d3/fa zero size 2026-03-10T07:51:17.093 INFO:tasks.workunit.client.1.vm08.stdout:2/43: dread - d0/d1/d3/fa zero size 2026-03-10T07:51:17.094 INFO:tasks.workunit.client.1.vm08.stdout:0/52: dwrite f4 [0,4194304] 0 2026-03-10T07:51:17.095 INFO:tasks.workunit.client.1.vm08.stdout:0/53: write f4 [1426405,20823] 0 2026-03-10T07:51:17.099 INFO:tasks.workunit.client.0.vm05.stdout:1/962: dread da/dd/f77 [0,4194304] 0 2026-03-10T07:51:17.106 INFO:tasks.workunit.client.1.vm08.stdout:8/39: fdatasync d0/f6 0 2026-03-10T07:51:17.107 INFO:tasks.workunit.client.1.vm08.stdout:1/15: mkdir d2/d6 0 2026-03-10T07:51:17.107 INFO:tasks.workunit.client.0.vm05.stdout:1/963: write da/d26/f11e [1380104,31610] 0 2026-03-10T07:51:17.107 INFO:tasks.workunit.client.0.vm05.stdout:9/935: rename d8/d86/d28/d79/d57/de1/d22/d33/d62/fef to d8/d86/d28/d79/d57/dbc/d11a/f13e 0 2026-03-10T07:51:17.107 INFO:tasks.workunit.client.1.vm08.stdout:8/40: dread d0/f8 [0,4194304] 0 2026-03-10T07:51:17.111 INFO:tasks.workunit.client.1.vm08.stdout:8/41: dread d0/f6 [0,4194304] 0 2026-03-10T07:51:17.116 INFO:tasks.workunit.client.1.vm08.stdout:9/21: rename d2/d3/f4 to d2/f5 0 2026-03-10T07:51:17.116 INFO:tasks.workunit.client.1.vm08.stdout:9/22: write f0 [1951167,90549] 0 2026-03-10T07:51:17.116 INFO:tasks.workunit.client.1.vm08.stdout:8/42: write d0/f6 [1953847,13479] 0 2026-03-10T07:51:17.116 INFO:tasks.workunit.client.1.vm08.stdout:9/23: chown d2/f5 9778209 1 2026-03-10T07:51:17.116 
INFO:tasks.workunit.client.1.vm08.stdout:2/44: mknod d0/d1/d3/d5/cf 0 2026-03-10T07:51:17.116 INFO:tasks.workunit.client.1.vm08.stdout:5/33: symlink d0/d4/lb 0 2026-03-10T07:51:17.116 INFO:tasks.workunit.client.0.vm05.stdout:3/937: sync 2026-03-10T07:51:17.118 INFO:tasks.workunit.client.1.vm08.stdout:8/43: dread d0/f8 [0,4194304] 0 2026-03-10T07:51:17.119 INFO:tasks.workunit.client.1.vm08.stdout:8/44: write d0/f6 [4580620,4909] 0 2026-03-10T07:51:17.119 INFO:tasks.workunit.client.1.vm08.stdout:1/16: dread d2/f4 [0,4194304] 0 2026-03-10T07:51:17.121 INFO:tasks.workunit.client.0.vm05.stdout:5/988: sync 2026-03-10T07:51:17.122 INFO:tasks.workunit.client.1.vm08.stdout:1/17: rename d2 to d2/d7 22 2026-03-10T07:51:17.124 INFO:tasks.workunit.client.1.vm08.stdout:1/18: stat f0 0 2026-03-10T07:51:17.126 INFO:tasks.workunit.client.0.vm05.stdout:7/989: write d1/d34/f7a [6015293,16024] 0 2026-03-10T07:51:17.127 INFO:tasks.workunit.client.1.vm08.stdout:1/19: dwrite f0 [0,4194304] 0 2026-03-10T07:51:17.131 INFO:tasks.workunit.client.1.vm08.stdout:7/24: truncate f0 188903 0 2026-03-10T07:51:17.131 INFO:tasks.workunit.client.1.vm08.stdout:0/54: symlink dd/d10/l11 0 2026-03-10T07:51:17.134 INFO:tasks.workunit.client.1.vm08.stdout:0/55: dread f6 [0,4194304] 0 2026-03-10T07:51:17.147 INFO:tasks.workunit.client.1.vm08.stdout:5/34: mkdir d0/d4/dc 0 2026-03-10T07:51:17.147 INFO:tasks.workunit.client.1.vm08.stdout:8/45: creat d0/fd x:0 0 0 2026-03-10T07:51:17.147 INFO:tasks.workunit.client.1.vm08.stdout:0/56: dread fb [0,4194304] 0 2026-03-10T07:51:17.147 INFO:tasks.workunit.client.1.vm08.stdout:1/20: rename f0 to d2/d6/f8 0 2026-03-10T07:51:17.147 INFO:tasks.workunit.client.1.vm08.stdout:2/45: mkdir d0/d1/d3/d10 0 2026-03-10T07:51:17.147 INFO:tasks.workunit.client.1.vm08.stdout:8/46: dwrite d0/fd [0,4194304] 0 2026-03-10T07:51:17.147 INFO:tasks.workunit.client.1.vm08.stdout:5/35: mknod d0/d4/cd 0 2026-03-10T07:51:17.147 INFO:tasks.workunit.client.1.vm08.stdout:1/21: mknod d2/d6/c9 0 
2026-03-10T07:51:17.147 INFO:tasks.workunit.client.1.vm08.stdout:5/36: fsync d0/d4/f5 0 2026-03-10T07:51:17.149 INFO:tasks.workunit.client.0.vm05.stdout:1/964: creat da/dd/d12/d34/d107/f12c x:0 0 0 2026-03-10T07:51:17.152 INFO:tasks.workunit.client.0.vm05.stdout:9/936: creat d8/d86/d28/de9/f13f x:0 0 0 2026-03-10T07:51:17.155 INFO:tasks.workunit.client.0.vm05.stdout:3/938: mkdir d8/d1f/d24/d76/dc5/de1/dac/d13c 0 2026-03-10T07:51:17.156 INFO:tasks.workunit.client.1.vm08.stdout:1/22: symlink d2/la 0 2026-03-10T07:51:17.156 INFO:tasks.workunit.client.1.vm08.stdout:1/23: stat d2/f4 0 2026-03-10T07:51:17.156 INFO:tasks.workunit.client.1.vm08.stdout:1/24: stat d2/d6/f8 0 2026-03-10T07:51:17.160 INFO:tasks.workunit.client.1.vm08.stdout:1/25: rename d2/l5 to d2/lb 0 2026-03-10T07:51:17.162 INFO:tasks.workunit.client.1.vm08.stdout:8/47: link d0/c5 d0/ce 0 2026-03-10T07:51:17.163 INFO:tasks.workunit.client.0.vm05.stdout:1/965: dwrite da/d26/f92 [0,4194304] 0 2026-03-10T07:51:17.166 INFO:tasks.workunit.client.1.vm08.stdout:1/26: creat d2/d6/fc x:0 0 0 2026-03-10T07:51:17.166 INFO:tasks.workunit.client.0.vm05.stdout:9/937: creat d8/d86/d28/d79/d57/de1/d22/d33/d85/f140 x:0 0 0 2026-03-10T07:51:17.166 INFO:tasks.workunit.client.0.vm05.stdout:9/938: dread - d8/d86/d28/d79/d57/de1/d1c/d20/d54/f12c zero size 2026-03-10T07:51:17.166 INFO:tasks.workunit.client.0.vm05.stdout:3/939: rmdir d8/d1c/d48/d69/d11e 39 2026-03-10T07:51:17.166 INFO:tasks.workunit.client.1.vm08.stdout:2/46: sync 2026-03-10T07:51:17.166 INFO:tasks.workunit.client.1.vm08.stdout:5/37: sync 2026-03-10T07:51:17.178 INFO:tasks.workunit.client.1.vm08.stdout:5/38: dwrite d0/d4/f5 [0,4194304] 0 2026-03-10T07:51:17.179 INFO:tasks.workunit.client.1.vm08.stdout:5/39: write d0/d4/f5 [2819051,111102] 0 2026-03-10T07:51:17.180 INFO:tasks.workunit.client.1.vm08.stdout:1/27: creat d2/d6/fd x:0 0 0 2026-03-10T07:51:17.180 INFO:tasks.workunit.client.1.vm08.stdout:8/48: dwrite d0/fd [0,4194304] 0 2026-03-10T07:51:17.180 
INFO:tasks.workunit.client.1.vm08.stdout:2/47: write d0/d1/fb [481986,4632] 0 2026-03-10T07:51:17.181 INFO:tasks.workunit.client.0.vm05.stdout:8/921: getdents d1/dd/d4d/d64/d6a/de5 0 2026-03-10T07:51:17.182 INFO:tasks.workunit.client.0.vm05.stdout:8/922: chown d1/dd/d4d/d64/d6a/de5/fd2 180786 1 2026-03-10T07:51:17.183 INFO:tasks.workunit.client.0.vm05.stdout:8/923: chown d1/dd/d4d/d64/d6a/de5/d2a/d34/l82 38491 1 2026-03-10T07:51:17.183 INFO:tasks.workunit.client.1.vm08.stdout:2/48: read - d0/d1/d3/fa zero size 2026-03-10T07:51:17.184 INFO:tasks.workunit.client.1.vm08.stdout:2/49: stat d0/d1/d3/fa 0 2026-03-10T07:51:17.191 INFO:tasks.workunit.client.0.vm05.stdout:9/939: read d8/d86/d28/d79/d57/de1/d38/d71/d81/f83 [726400,40994] 0 2026-03-10T07:51:17.192 INFO:tasks.workunit.client.0.vm05.stdout:9/940: truncate d8/d86/d28/d79/d57/de1/d22/f114 930484 0 2026-03-10T07:51:17.194 INFO:tasks.workunit.client.1.vm08.stdout:8/49: dwrite d0/fa [0,4194304] 0 2026-03-10T07:51:17.206 INFO:tasks.workunit.client.1.vm08.stdout:1/28: mkdir d2/d6/de 0 2026-03-10T07:51:17.206 INFO:tasks.workunit.client.1.vm08.stdout:1/29: read d2/d6/f8 [4146213,86618] 0 2026-03-10T07:51:17.206 INFO:tasks.workunit.client.0.vm05.stdout:3/940: symlink d8/d1f/d2a/d96/da9/l13d 0 2026-03-10T07:51:17.206 INFO:tasks.workunit.client.0.vm05.stdout:8/924: symlink d1/d6f/df9/d102/dbf/l129 0 2026-03-10T07:51:17.206 INFO:tasks.workunit.client.0.vm05.stdout:3/941: dread d8/d1c/f124 [0,4194304] 0 2026-03-10T07:51:17.210 INFO:tasks.workunit.client.1.vm08.stdout:1/30: dwrite d2/d6/fc [0,4194304] 0 2026-03-10T07:51:17.210 INFO:tasks.workunit.client.1.vm08.stdout:1/31: write d2/d6/fc [3768491,48979] 0 2026-03-10T07:51:17.210 INFO:tasks.workunit.client.1.vm08.stdout:8/50: mkdir d0/df 0 2026-03-10T07:51:17.212 INFO:tasks.workunit.client.0.vm05.stdout:9/941: chown d8/d86/d28/d79/d57/de1/d22/d33/c53 5616 1 2026-03-10T07:51:17.213 INFO:tasks.workunit.client.1.vm08.stdout:1/32: dread d2/d6/f8 [0,4194304] 0 
2026-03-10T07:51:17.225 INFO:tasks.workunit.client.0.vm05.stdout:7/990: getdents d1/d6/d47/d8d/d112 0 2026-03-10T07:51:17.227 INFO:tasks.workunit.client.0.vm05.stdout:8/925: fsync d1/d45/dfb/f110 0 2026-03-10T07:51:17.234 INFO:tasks.workunit.client.0.vm05.stdout:3/942: mkdir d8/d1f/d24/d8a/d13e 0 2026-03-10T07:51:17.234 INFO:tasks.workunit.client.0.vm05.stdout:1/966: creat da/dd/d12/d86/f12d x:0 0 0 2026-03-10T07:51:17.235 INFO:tasks.workunit.client.0.vm05.stdout:6/997: write d0/d11/d4f/d7d/db7/fc9 [7975,71287] 0 2026-03-10T07:51:17.236 INFO:tasks.workunit.client.0.vm05.stdout:1/967: write da/d26/d2b/f45 [2279969,85495] 0 2026-03-10T07:51:17.236 INFO:tasks.workunit.client.1.vm08.stdout:9/24: dread f0 [0,4194304] 0 2026-03-10T07:51:17.236 INFO:tasks.workunit.client.1.vm08.stdout:8/51: write d0/f8 [1054332,50986] 0 2026-03-10T07:51:17.248 INFO:tasks.workunit.client.0.vm05.stdout:8/926: readlink d1/l126 0 2026-03-10T07:51:17.255 INFO:tasks.workunit.client.0.vm05.stdout:3/943: creat d8/d1c/d48/f13f x:0 0 0 2026-03-10T07:51:17.255 INFO:tasks.workunit.client.0.vm05.stdout:7/991: getdents d1/d34/d59/d60/d12b 0 2026-03-10T07:51:17.256 INFO:tasks.workunit.client.1.vm08.stdout:8/52: unlink d0/cb 0 2026-03-10T07:51:17.256 INFO:tasks.workunit.client.0.vm05.stdout:6/998: unlink d0/d11/d57/d60/c95 0 2026-03-10T07:51:17.257 INFO:tasks.workunit.client.0.vm05.stdout:9/942: truncate d8/d86/d28/d79/d57/de1/d22/d33/d47/f5a 443586 0 2026-03-10T07:51:17.257 INFO:tasks.workunit.client.0.vm05.stdout:1/968: read - da/dd/d12/d34/d58/ff7 zero size 2026-03-10T07:51:17.262 INFO:tasks.workunit.client.0.vm05.stdout:8/927: fsync d1/dd/d4d/f63 0 2026-03-10T07:51:17.263 INFO:tasks.workunit.client.1.vm08.stdout:9/25: link f0 d2/f6 0 2026-03-10T07:51:17.263 INFO:tasks.workunit.client.1.vm08.stdout:9/26: read d2/f6 [825359,122386] 0 2026-03-10T07:51:17.266 INFO:tasks.workunit.client.0.vm05.stdout:6/999: symlink d0/d11/d57/d60/d117/l151 0 2026-03-10T07:51:17.271 
INFO:tasks.workunit.client.1.vm08.stdout:8/53: stat d0/c5 0 2026-03-10T07:51:17.281 INFO:tasks.workunit.client.1.vm08.stdout:9/27: dwrite d2/f5 [0,4194304] 0 2026-03-10T07:51:17.282 INFO:tasks.workunit.client.0.vm05.stdout:3/944: creat d8/d1f/d24/d76/dc5/de1/d52/da4/f140 x:0 0 0 2026-03-10T07:51:17.282 INFO:tasks.workunit.client.0.vm05.stdout:3/945: write d8/d1f/d24/d8a/f57 [3741074,33784] 0 2026-03-10T07:51:17.282 INFO:tasks.workunit.client.0.vm05.stdout:8/928: creat d1/dd/d4d/dcc/dbd/f12a x:0 0 0 2026-03-10T07:51:17.282 INFO:tasks.workunit.client.0.vm05.stdout:8/929: dread d1/d6f/f108 [0,4194304] 0 2026-03-10T07:51:17.285 INFO:tasks.workunit.client.0.vm05.stdout:8/930: dread d1/dd/d4d/d64/d8f/f10f [0,4194304] 0 2026-03-10T07:51:17.297 INFO:tasks.workunit.client.0.vm05.stdout:1/969: dread da/dd/d12/f31 [0,4194304] 0 2026-03-10T07:51:17.298 INFO:tasks.workunit.client.0.vm05.stdout:7/992: dread d1/d3c/d71/d79/d8a/fce [0,4194304] 0 2026-03-10T07:51:17.298 INFO:tasks.workunit.client.0.vm05.stdout:9/943: creat d8/d86/db8/f141 x:0 0 0 2026-03-10T07:51:17.300 INFO:tasks.workunit.client.0.vm05.stdout:7/993: chown d1/d6/dc3/f101 345 1 2026-03-10T07:51:17.300 INFO:tasks.workunit.client.0.vm05.stdout:9/944: write d8/d86/d28/d79/d57/de1/d1c/d20/f124 [2854141,47241] 0 2026-03-10T07:51:17.300 INFO:tasks.workunit.client.1.vm08.stdout:9/28: rename f0 to d2/f7 0 2026-03-10T07:51:17.302 INFO:tasks.workunit.client.0.vm05.stdout:3/946: mkdir d8/d1f/d2a/d34/dbd/d141 0 2026-03-10T07:51:17.307 INFO:tasks.workunit.client.0.vm05.stdout:8/931: unlink d1/dd/d4d/d64/d6a/de5/d2a/f54 0 2026-03-10T07:51:17.310 INFO:tasks.workunit.client.0.vm05.stdout:1/970: creat da/d26/d9e/dcc/d105/f12e x:0 0 0 2026-03-10T07:51:17.310 INFO:tasks.workunit.client.0.vm05.stdout:3/947: chown d8/d110 10773 1 2026-03-10T07:51:17.333 INFO:tasks.workunit.client.0.vm05.stdout:1/971: dread f4 [4194304,4194304] 0 2026-03-10T07:51:17.338 INFO:tasks.workunit.client.1.vm08.stdout:8/54: sync 2026-03-10T07:51:17.364 
INFO:tasks.workunit.client.1.vm08.stdout:8/55: unlink d0/f8 0 2026-03-10T07:51:17.368 INFO:tasks.workunit.client.0.vm05.stdout:9/945: rename d8/d86/d28/d79/d57/de1/d22/d33/d70 to d8/d86/d28/d79/d57/dbc/d142 0 2026-03-10T07:51:17.369 INFO:tasks.workunit.client.0.vm05.stdout:8/932: rename d1/dd/d121 to d1/dd/d121/d12b 22 2026-03-10T07:51:17.371 INFO:tasks.workunit.client.1.vm08.stdout:8/56: unlink d0/c3 0 2026-03-10T07:51:17.374 INFO:tasks.workunit.client.1.vm08.stdout:4/28: truncate f2 1102574 0 2026-03-10T07:51:17.378 INFO:tasks.workunit.client.1.vm08.stdout:6/40: dwrite d1/f2 [0,4194304] 0 2026-03-10T07:51:17.381 INFO:tasks.workunit.client.0.vm05.stdout:1/972: rename da/dd/d12/d34/d107/dbe/dc0/ffb to da/dd/d2a/d55/d64/f12f 0 2026-03-10T07:51:17.382 INFO:tasks.workunit.client.0.vm05.stdout:1/973: stat da/dd/d12/d34/ddb/def 0 2026-03-10T07:51:17.383 INFO:tasks.workunit.client.1.vm08.stdout:3/29: truncate d0/f2 3424247 0 2026-03-10T07:51:17.384 INFO:tasks.workunit.client.1.vm08.stdout:6/41: dread d1/f8 [0,4194304] 0 2026-03-10T07:51:17.384 INFO:tasks.workunit.client.1.vm08.stdout:6/42: chown d1/d3 2021 1 2026-03-10T07:51:17.390 INFO:tasks.workunit.client.0.vm05.stdout:1/974: dwrite da/dd/d2a/d55/d64/f81 [0,4194304] 0 2026-03-10T07:51:17.399 INFO:tasks.workunit.client.1.vm08.stdout:4/29: mkdir d5/d8/d9 0 2026-03-10T07:51:17.404 INFO:tasks.workunit.client.1.vm08.stdout:0/57: getdents dd/d10 0 2026-03-10T07:51:17.409 INFO:tasks.workunit.client.1.vm08.stdout:0/58: write f0 [1616297,18129] 0 2026-03-10T07:51:17.409 INFO:tasks.workunit.client.1.vm08.stdout:9/29: getdents d2/d3 0 2026-03-10T07:51:17.411 INFO:tasks.workunit.client.1.vm08.stdout:9/30: dread d2/f7 [0,4194304] 0 2026-03-10T07:51:17.415 INFO:tasks.workunit.client.0.vm05.stdout:9/946: symlink d8/d86/d28/d79/d57/l143 0 2026-03-10T07:51:17.416 INFO:tasks.workunit.client.0.vm05.stdout:9/947: fsync d8/d86/d28/d79/d57/d96/dd8/fec 0 2026-03-10T07:51:17.420 INFO:tasks.workunit.client.1.vm08.stdout:8/57: getdents d0/df 0 
2026-03-10T07:51:17.421 INFO:tasks.workunit.client.1.vm08.stdout:8/58: stat d0/fa 0 2026-03-10T07:51:17.424 INFO:tasks.workunit.client.1.vm08.stdout:1/33: rename d2/d6/f8 to d2/d6/ff 0 2026-03-10T07:51:17.424 INFO:tasks.workunit.client.1.vm08.stdout:1/34: chown d2/d6/fd 502055 1 2026-03-10T07:51:17.425 INFO:tasks.workunit.client.1.vm08.stdout:8/59: dwrite d0/f6 [0,4194304] 0 2026-03-10T07:51:17.429 INFO:tasks.workunit.client.1.vm08.stdout:3/30: rename d0/c3 to d0/c7 0 2026-03-10T07:51:17.432 INFO:tasks.workunit.client.1.vm08.stdout:6/43: symlink d1/d3/la 0 2026-03-10T07:51:17.433 INFO:tasks.workunit.client.0.vm05.stdout:8/933: mknod d1/d45/dfb/c12c 0 2026-03-10T07:51:17.450 INFO:tasks.workunit.client.0.vm05.stdout:9/948: creat d8/d86/d28/d79/d57/de1/d1c/d20/dd3/f144 x:0 0 0 2026-03-10T07:51:17.457 INFO:tasks.workunit.client.0.vm05.stdout:1/975: truncate da/dd/d2a/d55/d64/dac/f10f 540943 0 2026-03-10T07:51:17.458 INFO:tasks.workunit.client.0.vm05.stdout:1/976: dread - da/dd/d2a/d70/f10e zero size 2026-03-10T07:51:17.458 INFO:tasks.workunit.client.1.vm08.stdout:0/59: unlink f4 0 2026-03-10T07:51:17.459 INFO:tasks.workunit.client.1.vm08.stdout:0/60: write f7 [226617,25741] 0 2026-03-10T07:51:17.461 INFO:tasks.workunit.client.0.vm05.stdout:8/934: creat d1/dd/d4d/f12d x:0 0 0 2026-03-10T07:51:17.465 INFO:tasks.workunit.client.0.vm05.stdout:1/977: truncate da/d26/d2b/d71/fec 367801 0 2026-03-10T07:51:17.466 INFO:tasks.workunit.client.0.vm05.stdout:1/978: readlink da/dd/d12/l39 0 2026-03-10T07:51:17.470 INFO:tasks.workunit.client.1.vm08.stdout:9/31: fdatasync d2/f7 0 2026-03-10T07:51:17.474 INFO:tasks.workunit.client.1.vm08.stdout:1/35: mkdir d2/d10 0 2026-03-10T07:51:17.475 INFO:tasks.workunit.client.0.vm05.stdout:1/979: creat da/dd/d2a/d55/d64/dc2/f130 x:0 0 0 2026-03-10T07:51:17.482 INFO:tasks.workunit.client.0.vm05.stdout:1/980: dwrite da/dd/d2a/d55/d64/f81 [8388608,4194304] 0 2026-03-10T07:51:17.484 INFO:tasks.workunit.client.1.vm08.stdout:6/44: mkdir d1/db 0 
2026-03-10T07:51:17.489 INFO:tasks.workunit.client.0.vm05.stdout:8/935: creat d1/dd/f12e x:0 0 0 2026-03-10T07:51:17.491 INFO:tasks.workunit.client.0.vm05.stdout:9/949: getdents d8/d86/d28/d79/d57/de1/d22/d33/d62/d6d 0 2026-03-10T07:51:17.495 INFO:tasks.workunit.client.1.vm08.stdout:0/61: fdatasync f2 0 2026-03-10T07:51:17.507 INFO:tasks.workunit.client.0.vm05.stdout:1/981: creat da/dd/d12/d113/f131 x:0 0 0 2026-03-10T07:51:17.507 INFO:tasks.workunit.client.0.vm05.stdout:9/950: mknod d8/d86/d28/d79/d57/de1/d1c/d20/d54/c145 0 2026-03-10T07:51:17.507 INFO:tasks.workunit.client.0.vm05.stdout:1/982: fdatasync da/dd/d42/d80/f94 0 2026-03-10T07:51:17.507 INFO:tasks.workunit.client.0.vm05.stdout:1/983: stat da/fb2 0 2026-03-10T07:51:17.507 INFO:tasks.workunit.client.1.vm08.stdout:9/32: mknod d2/c8 0 2026-03-10T07:51:17.507 INFO:tasks.workunit.client.1.vm08.stdout:1/36: mkdir d2/d6/d11 0 2026-03-10T07:51:17.507 INFO:tasks.workunit.client.1.vm08.stdout:8/60: symlink d0/df/l10 0 2026-03-10T07:51:17.507 INFO:tasks.workunit.client.1.vm08.stdout:6/45: rename d1/f8 to d1/fc 0 2026-03-10T07:51:17.507 INFO:tasks.workunit.client.1.vm08.stdout:2/50: truncate d0/d1/fb 3331603 0 2026-03-10T07:51:17.507 INFO:tasks.workunit.client.1.vm08.stdout:2/51: dread - d0/d1/d3/fa zero size 2026-03-10T07:51:17.507 INFO:tasks.workunit.client.1.vm08.stdout:2/52: dread d0/d1/d3/f8 [0,4194304] 0 2026-03-10T07:51:17.511 INFO:tasks.workunit.client.1.vm08.stdout:0/62: creat dd/d10/f12 x:0 0 0 2026-03-10T07:51:17.512 INFO:tasks.workunit.client.1.vm08.stdout:9/33: symlink d2/l9 0 2026-03-10T07:51:17.513 INFO:tasks.workunit.client.0.vm05.stdout:1/984: creat da/dd/d12/d34/f132 x:0 0 0 2026-03-10T07:51:17.514 INFO:tasks.workunit.client.0.vm05.stdout:1/985: write da/dd/d12/d34/d58/fd4 [2019952,15410] 0 2026-03-10T07:51:17.515 INFO:tasks.workunit.client.1.vm08.stdout:1/37: creat d2/f12 x:0 0 0 2026-03-10T07:51:17.522 INFO:tasks.workunit.client.1.vm08.stdout:8/61: mknod d0/df/c11 0 2026-03-10T07:51:17.530 
INFO:tasks.workunit.client.1.vm08.stdout:5/40: dwrite d0/f9 [0,4194304] 0 2026-03-10T07:51:17.531 INFO:tasks.workunit.client.1.vm08.stdout:6/46: fsync d1/f7 0 2026-03-10T07:51:17.553 INFO:tasks.workunit.client.1.vm08.stdout:2/53: mknod d0/d1/c11 0 2026-03-10T07:51:17.553 INFO:tasks.workunit.client.1.vm08.stdout:9/34: creat d2/d3/fa x:0 0 0 2026-03-10T07:51:17.553 INFO:tasks.workunit.client.1.vm08.stdout:0/63: creat dd/f13 x:0 0 0 2026-03-10T07:51:17.553 INFO:tasks.workunit.client.1.vm08.stdout:9/35: dread - d2/d3/fa zero size 2026-03-10T07:51:17.559 INFO:tasks.workunit.client.0.vm05.stdout:7/994: dwrite d1/d6/d3b/d7f/fa2 [0,4194304] 0 2026-03-10T07:51:17.560 INFO:tasks.workunit.client.1.vm08.stdout:8/62: creat d0/df/f12 x:0 0 0 2026-03-10T07:51:17.561 INFO:tasks.workunit.client.1.vm08.stdout:1/38: creat d2/f13 x:0 0 0 2026-03-10T07:51:17.561 INFO:tasks.workunit.client.1.vm08.stdout:9/36: dwrite d2/d3/fa [0,4194304] 0 2026-03-10T07:51:17.562 INFO:tasks.workunit.client.1.vm08.stdout:1/39: stat d2/d6/de 0 2026-03-10T07:51:17.566 INFO:tasks.workunit.client.0.vm05.stdout:3/948: dwrite d8/dd5/dfb/f117 [0,4194304] 0 2026-03-10T07:51:17.571 INFO:tasks.workunit.client.0.vm05.stdout:8/936: sync 2026-03-10T07:51:17.571 INFO:tasks.workunit.client.0.vm05.stdout:3/949: chown d8/d1f/d24/d76/dc5/de1/d19/f81 89 1 2026-03-10T07:51:17.572 INFO:tasks.workunit.client.0.vm05.stdout:8/937: chown d1/dc9/c105 592 1 2026-03-10T07:51:17.576 INFO:tasks.workunit.client.1.vm08.stdout:6/47: sync 2026-03-10T07:51:17.587 INFO:tasks.workunit.client.1.vm08.stdout:0/64: mkdir dd/d10/d14 0 2026-03-10T07:51:17.595 INFO:tasks.workunit.client.0.vm05.stdout:7/995: creat d1/d6/d47/d8d/d112/f136 x:0 0 0 2026-03-10T07:51:17.598 INFO:tasks.workunit.client.0.vm05.stdout:1/986: dread da/d26/d9e/fa1 [0,4194304] 0 2026-03-10T07:51:17.599 INFO:tasks.workunit.client.0.vm05.stdout:3/950: symlink d8/d1f/d2a/d34/dbd/l142 0 2026-03-10T07:51:17.603 INFO:tasks.workunit.client.1.vm08.stdout:8/63: creat d0/df/f13 x:0 0 0 
2026-03-10T07:51:17.603 INFO:tasks.workunit.client.1.vm08.stdout:8/64: truncate d0/df/f13 970744 0 2026-03-10T07:51:17.603 INFO:tasks.workunit.client.1.vm08.stdout:8/65: chown d0/df/f13 408226 1 2026-03-10T07:51:17.604 INFO:tasks.workunit.client.0.vm05.stdout:8/938: mkdir d1/dd/d4d/d64/d6a/de5/d2a/de3/d12f 0 2026-03-10T07:51:17.604 INFO:tasks.workunit.client.1.vm08.stdout:8/66: fsync d0/fd 0 2026-03-10T07:51:17.612 INFO:tasks.workunit.client.1.vm08.stdout:6/48: mknod d1/cd 0 2026-03-10T07:51:17.613 INFO:tasks.workunit.client.1.vm08.stdout:6/49: write d1/f6 [603428,23072] 0 2026-03-10T07:51:17.616 INFO:tasks.workunit.client.0.vm05.stdout:1/987: creat da/dd/d2a/d55/d64/dac/dfa/f133 x:0 0 0 2026-03-10T07:51:17.619 INFO:tasks.workunit.client.1.vm08.stdout:4/30: dread f2 [0,4194304] 0 2026-03-10T07:51:17.623 INFO:tasks.workunit.client.0.vm05.stdout:3/951: rename d8/d1f/d24/d76/dc5/de1/d19/d6b/la0 to d8/d22/d60/d6e/dca/dda/d131/l143 0 2026-03-10T07:51:17.624 INFO:tasks.workunit.client.1.vm08.stdout:2/54: creat d0/f12 x:0 0 0 2026-03-10T07:51:17.625 INFO:tasks.workunit.client.1.vm08.stdout:2/55: write d0/f12 [836982,130537] 0 2026-03-10T07:51:17.628 INFO:tasks.workunit.client.0.vm05.stdout:1/988: dread da/dd/d12/f31 [0,4194304] 0 2026-03-10T07:51:17.628 INFO:tasks.workunit.client.1.vm08.stdout:3/31: dwrite d0/f2 [0,4194304] 0 2026-03-10T07:51:17.641 INFO:tasks.workunit.client.1.vm08.stdout:9/37: link d2/f7 d2/fb 0 2026-03-10T07:51:17.645 INFO:tasks.workunit.client.1.vm08.stdout:7/25: dwrite f0 [0,4194304] 0 2026-03-10T07:51:17.657 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:51:17 vm05.local ceph-mon[50387]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' 2026-03-10T07:51:17.663 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:51:17 vm05.local ceph-mon[50387]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' 2026-03-10T07:51:17.663 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:51:17 vm05.local ceph-mon[50387]: 
pgmap v28: 65 pgs: 65 active+clean; 2.0 GiB data, 6.6 GiB used, 113 GiB / 120 GiB avail; 34 MiB/s rd, 68 MiB/s wr, 211 op/s 2026-03-10T07:51:17.663 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:51:17 vm05.local ceph-mon[50387]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' 2026-03-10T07:51:17.663 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:51:17 vm05.local ceph-mon[50387]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' 2026-03-10T07:51:17.668 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:51:17 vm08.local ceph-mon[59917]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' 2026-03-10T07:51:17.668 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:51:17 vm08.local ceph-mon[59917]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' 2026-03-10T07:51:17.668 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:51:17 vm08.local ceph-mon[59917]: pgmap v28: 65 pgs: 65 active+clean; 2.0 GiB data, 6.6 GiB used, 113 GiB / 120 GiB avail; 34 MiB/s rd, 68 MiB/s wr, 211 op/s 2026-03-10T07:51:17.668 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:51:17 vm08.local ceph-mon[59917]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' 2026-03-10T07:51:17.668 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:51:17 vm08.local ceph-mon[59917]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' 2026-03-10T07:51:17.673 INFO:tasks.workunit.client.0.vm05.stdout:8/939: dread d1/dd/d4d/d64/d6a/de5/d2a/d48/d7c/f80 [0,4194304] 0 2026-03-10T07:51:17.681 INFO:tasks.workunit.client.1.vm08.stdout:6/50: mknod d1/d3/ce 0 2026-03-10T07:51:17.686 INFO:tasks.workunit.client.0.vm05.stdout:8/940: mknod d1/d6f/df9/d102/dbf/d11f/c130 0 2026-03-10T07:51:17.687 INFO:tasks.workunit.client.1.vm08.stdout:2/56: unlink d0/d1/d3/fa 0 2026-03-10T07:51:17.689 INFO:tasks.workunit.client.1.vm08.stdout:3/32: unlink d0/f2 0 2026-03-10T07:51:17.691 
INFO:tasks.workunit.client.0.vm05.stdout:8/941: dread d1/dd/d4d/d64/d6a/f83 [0,4194304] 0 2026-03-10T07:51:17.699 INFO:tasks.workunit.client.1.vm08.stdout:1/40: link d2/la d2/d6/de/l14 0 2026-03-10T07:51:17.700 INFO:tasks.workunit.client.0.vm05.stdout:8/942: truncate d1/d45/dfb/f110 1038092 0 2026-03-10T07:51:17.700 INFO:tasks.workunit.client.0.vm05.stdout:5/989: dwrite d2/d12/dda/f148 [0,4194304] 0 2026-03-10T07:51:17.707 INFO:tasks.workunit.client.1.vm08.stdout:7/26: fsync f0 0 2026-03-10T07:51:17.707 INFO:tasks.workunit.client.1.vm08.stdout:7/27: write f0 [213356,118718] 0 2026-03-10T07:51:17.708 INFO:tasks.workunit.client.1.vm08.stdout:7/28: write f0 [571096,84308] 0 2026-03-10T07:51:17.722 INFO:tasks.workunit.client.0.vm05.stdout:8/943: mknod d1/dd/d5e/d9e/c131 0 2026-03-10T07:51:17.725 INFO:tasks.workunit.client.0.vm05.stdout:8/944: truncate d1/dd/d5e/d9e/f125 835040 0 2026-03-10T07:51:17.738 INFO:tasks.workunit.client.1.vm08.stdout:2/57: creat d0/d1/d3/d5/f13 x:0 0 0 2026-03-10T07:51:17.738 INFO:tasks.workunit.client.0.vm05.stdout:5/990: read d2/d20/d4c/fb1 [3881411,3567] 0 2026-03-10T07:51:17.739 INFO:tasks.workunit.client.1.vm08.stdout:2/58: dread d0/d1/d3/f8 [0,4194304] 0 2026-03-10T07:51:17.744 INFO:tasks.workunit.client.1.vm08.stdout:3/33: creat d0/f8 x:0 0 0 2026-03-10T07:51:17.745 INFO:tasks.workunit.client.1.vm08.stdout:3/34: truncate d0/f8 957589 0 2026-03-10T07:51:17.747 INFO:tasks.workunit.client.1.vm08.stdout:3/35: dread d0/f8 [0,4194304] 0 2026-03-10T07:51:17.747 INFO:tasks.workunit.client.1.vm08.stdout:3/36: chown d0 113838545 1 2026-03-10T07:51:17.754 INFO:tasks.workunit.client.1.vm08.stdout:1/41: unlink d2/d6/fc 0 2026-03-10T07:51:17.763 INFO:tasks.workunit.client.0.vm05.stdout:9/951: write d8/d86/d28/d79/d57/de1/d22/d33/d47/f5a [1215734,42073] 0 2026-03-10T07:51:17.763 INFO:tasks.workunit.client.0.vm05.stdout:9/952: dwrite d8/d86/d28/d79/d57/de1/d38/fff [0,4194304] 0 2026-03-10T07:51:17.763 INFO:tasks.workunit.client.1.vm08.stdout:6/51: 
dwrite d1/f7 [0,4194304] 0 2026-03-10T07:51:17.763 INFO:tasks.workunit.client.1.vm08.stdout:6/52: chown d1/f2 90258017 1 2026-03-10T07:51:17.770 INFO:tasks.workunit.client.0.vm05.stdout:5/991: link d2/dd7/le5 d2/d20/d7b/dca/l157 0 2026-03-10T07:51:17.783 INFO:tasks.workunit.client.0.vm05.stdout:9/953: dread d8/d86/d28/d79/d57/de1/d1c/d20/d59/d8b/f50 [0,4194304] 0 2026-03-10T07:51:17.784 INFO:tasks.workunit.client.0.vm05.stdout:5/992: rename d2/d20/d5b/ccc to d2/dd7/c158 0 2026-03-10T07:51:17.788 INFO:tasks.workunit.client.0.vm05.stdout:9/954: mkdir d8/d86/d28/d79/d57/de1/d22/d33/d62/d6d/d146 0 2026-03-10T07:51:17.789 INFO:tasks.workunit.client.1.vm08.stdout:5/41: dwrite d0/f9 [4194304,4194304] 0 2026-03-10T07:51:17.789 INFO:tasks.workunit.client.0.vm05.stdout:9/955: chown d8/d86/d28/d79/d57/de1/d22/f6a 433 1 2026-03-10T07:51:17.790 INFO:tasks.workunit.client.0.vm05.stdout:9/956: stat d8/d86/d28/d79/d57/de1/d1c/d20/d54/fd6 0 2026-03-10T07:51:17.791 INFO:tasks.workunit.client.1.vm08.stdout:0/65: getdents dd/d10 0 2026-03-10T07:51:17.792 INFO:tasks.workunit.client.0.vm05.stdout:5/993: fdatasync d2/d20/d33/d86/fb3 0 2026-03-10T07:51:17.793 INFO:tasks.workunit.client.1.vm08.stdout:8/67: fsync d0/df/f13 0 2026-03-10T07:51:17.794 INFO:tasks.workunit.client.1.vm08.stdout:3/37: symlink d0/l9 0 2026-03-10T07:51:17.795 INFO:tasks.workunit.client.1.vm08.stdout:8/68: chown d0/fd 443303 1 2026-03-10T07:51:17.798 INFO:tasks.workunit.client.0.vm05.stdout:7/996: write d1/d34/d59/f78 [3928483,66054] 0 2026-03-10T07:51:17.799 INFO:tasks.workunit.client.0.vm05.stdout:7/997: write d1/d3c/d71/fd2 [3360714,118978] 0 2026-03-10T07:51:17.803 INFO:tasks.workunit.client.1.vm08.stdout:4/31: getdents d5/d8 0 2026-03-10T07:51:17.804 INFO:tasks.workunit.client.1.vm08.stdout:6/53: rmdir d1 39 2026-03-10T07:51:17.805 INFO:tasks.workunit.client.0.vm05.stdout:3/952: write d8/d1c/f94 [825877,1092] 0 2026-03-10T07:51:17.806 INFO:tasks.workunit.client.0.vm05.stdout:1/989: write da/dd/d12/d86/d9a/ff6 
[888217,80837] 0 2026-03-10T07:51:17.810 INFO:tasks.workunit.client.0.vm05.stdout:3/953: creat d8/d1f/d108/f144 x:0 0 0 2026-03-10T07:51:17.840 INFO:tasks.workunit.client.0.vm05.stdout:3/954: chown d8/d8f/dbc/f107 0 1 2026-03-10T07:51:17.841 INFO:tasks.workunit.client.0.vm05.stdout:7/998: creat d1/d34/d59/f137 x:0 0 0 2026-03-10T07:51:17.841 INFO:tasks.workunit.client.0.vm05.stdout:3/955: symlink d8/d1f/d2a/d4a/l145 0 2026-03-10T07:51:17.841 INFO:tasks.workunit.client.0.vm05.stdout:7/999: getdents d1/d3c/d71/d79 0 2026-03-10T07:51:17.841 INFO:tasks.workunit.client.0.vm05.stdout:5/994: dread d2/f9 [0,4194304] 0 2026-03-10T07:51:17.841 INFO:tasks.workunit.client.1.vm08.stdout:8/69: symlink d0/l14 0 2026-03-10T07:51:17.841 INFO:tasks.workunit.client.1.vm08.stdout:0/66: dread f6 [0,4194304] 0 2026-03-10T07:51:17.841 INFO:tasks.workunit.client.1.vm08.stdout:1/42: creat d2/d6/de/f15 x:0 0 0 2026-03-10T07:51:17.841 INFO:tasks.workunit.client.1.vm08.stdout:8/70: dread d0/df/f13 [0,4194304] 0 2026-03-10T07:51:17.841 INFO:tasks.workunit.client.1.vm08.stdout:4/32: truncate f1 858806 0 2026-03-10T07:51:17.841 INFO:tasks.workunit.client.1.vm08.stdout:5/42: link d0/f9 d0/d8/fe 0 2026-03-10T07:51:17.841 INFO:tasks.workunit.client.1.vm08.stdout:5/43: write d0/d4/f5 [1385578,74370] 0 2026-03-10T07:51:17.841 INFO:tasks.workunit.client.1.vm08.stdout:1/43: mknod d2/d6/de/c16 0 2026-03-10T07:51:17.841 INFO:tasks.workunit.client.1.vm08.stdout:8/71: mkdir d0/df/d15 0 2026-03-10T07:51:17.841 INFO:tasks.workunit.client.1.vm08.stdout:4/33: mkdir d5/da 0 2026-03-10T07:51:17.841 INFO:tasks.workunit.client.1.vm08.stdout:4/34: fdatasync d5/d8/f7 0 2026-03-10T07:51:17.841 INFO:tasks.workunit.client.1.vm08.stdout:6/54: mkdir d1/d3/df 0 2026-03-10T07:51:17.841 INFO:tasks.workunit.client.1.vm08.stdout:8/72: dwrite d0/df/f12 [0,4194304] 0 2026-03-10T07:51:17.841 INFO:tasks.workunit.client.1.vm08.stdout:5/44: mkdir d0/d4/df 0 2026-03-10T07:51:17.841 INFO:tasks.workunit.client.1.vm08.stdout:6/55: 
symlink d1/d3/df/l10 0 2026-03-10T07:51:17.841 INFO:tasks.workunit.client.1.vm08.stdout:4/35: rmdir d5/da 0 2026-03-10T07:51:17.841 INFO:tasks.workunit.client.1.vm08.stdout:1/44: mknod d2/d10/c17 0 2026-03-10T07:51:17.841 INFO:tasks.workunit.client.1.vm08.stdout:5/45: mknod d0/c10 0 2026-03-10T07:51:17.841 INFO:tasks.workunit.client.1.vm08.stdout:1/45: write d2/f12 [1003796,2319] 0 2026-03-10T07:51:17.842 INFO:tasks.workunit.client.1.vm08.stdout:1/46: chown d2/d6/de 83748970 1 2026-03-10T07:51:17.842 INFO:tasks.workunit.client.1.vm08.stdout:6/56: dwrite d1/f2 [0,4194304] 0 2026-03-10T07:51:17.843 INFO:tasks.workunit.client.1.vm08.stdout:1/47: fdatasync d2/f13 0 2026-03-10T07:51:17.843 INFO:tasks.workunit.client.1.vm08.stdout:6/57: chown d1/db 31 1 2026-03-10T07:51:17.844 INFO:tasks.workunit.client.1.vm08.stdout:4/36: mknod d5/d8/d9/cb 0 2026-03-10T07:51:17.849 INFO:tasks.workunit.client.1.vm08.stdout:5/46: creat d0/d4/dc/f11 x:0 0 0 2026-03-10T07:51:17.849 INFO:tasks.workunit.client.1.vm08.stdout:1/48: write d2/f12 [566591,110518] 0 2026-03-10T07:51:17.851 INFO:tasks.workunit.client.1.vm08.stdout:1/49: chown d2 142209742 1 2026-03-10T07:51:17.852 INFO:tasks.workunit.client.1.vm08.stdout:5/47: write d0/d8/fe [4066241,43154] 0 2026-03-10T07:51:17.854 INFO:tasks.workunit.client.1.vm08.stdout:1/50: creat d2/d6/f18 x:0 0 0 2026-03-10T07:51:17.854 INFO:tasks.workunit.client.1.vm08.stdout:5/48: write d0/d8/fe [6685396,19593] 0 2026-03-10T07:51:17.856 INFO:tasks.workunit.client.1.vm08.stdout:6/58: dread d1/fc [0,4194304] 0 2026-03-10T07:51:17.856 INFO:tasks.workunit.client.1.vm08.stdout:4/37: dwrite d5/d8/f7 [0,4194304] 0 2026-03-10T07:51:17.858 INFO:tasks.workunit.client.1.vm08.stdout:1/51: readlink d2/lb 0 2026-03-10T07:51:17.858 INFO:tasks.workunit.client.1.vm08.stdout:6/59: stat d1/db 0 2026-03-10T07:51:17.868 INFO:tasks.workunit.client.1.vm08.stdout:6/60: chown d1/d3/df/l10 1135 1 2026-03-10T07:51:17.868 INFO:tasks.workunit.client.1.vm08.stdout:5/49: dwrite d0/d8/fe 
[0,4194304] 0 2026-03-10T07:51:17.869 INFO:tasks.workunit.client.1.vm08.stdout:5/50: write d0/f9 [8639582,26896] 0 2026-03-10T07:51:17.869 INFO:tasks.workunit.client.1.vm08.stdout:5/51: chown d0/d4/c6 0 1 2026-03-10T07:51:17.873 INFO:tasks.workunit.client.1.vm08.stdout:1/52: mkdir d2/d19 0 2026-03-10T07:51:17.874 INFO:tasks.workunit.client.1.vm08.stdout:6/61: dwrite d1/f6 [0,4194304] 0 2026-03-10T07:51:17.878 INFO:tasks.workunit.client.1.vm08.stdout:1/53: dwrite d2/d6/fd [0,4194304] 0 2026-03-10T07:51:17.883 INFO:tasks.workunit.client.1.vm08.stdout:6/62: unlink d1/fc 0 2026-03-10T07:51:17.887 INFO:tasks.workunit.client.1.vm08.stdout:7/29: fsync f0 0 2026-03-10T07:51:17.897 INFO:tasks.workunit.client.1.vm08.stdout:7/30: read f0 [1590330,622] 0 2026-03-10T07:51:17.897 INFO:tasks.workunit.client.1.vm08.stdout:1/54: link d2/d6/c9 d2/d6/c1a 0 2026-03-10T07:51:17.897 INFO:tasks.workunit.client.1.vm08.stdout:1/55: creat d2/d6/de/f1b x:0 0 0 2026-03-10T07:51:17.897 INFO:tasks.workunit.client.1.vm08.stdout:7/31: dread f0 [0,4194304] 0 2026-03-10T07:51:17.897 INFO:tasks.workunit.client.1.vm08.stdout:7/32: rmdir - no directory 2026-03-10T07:51:17.897 INFO:tasks.workunit.client.1.vm08.stdout:7/33: write f0 [2817793,60910] 0 2026-03-10T07:51:17.898 INFO:tasks.workunit.client.1.vm08.stdout:1/56: dwrite d2/d6/f18 [0,4194304] 0 2026-03-10T07:51:17.898 INFO:tasks.workunit.client.1.vm08.stdout:3/38: sync 2026-03-10T07:51:17.899 INFO:tasks.workunit.client.1.vm08.stdout:0/67: sync 2026-03-10T07:51:17.903 INFO:tasks.workunit.client.1.vm08.stdout:7/34: dwrite f0 [0,4194304] 0 2026-03-10T07:51:17.903 INFO:tasks.workunit.client.1.vm08.stdout:7/35: chown l1 7 1 2026-03-10T07:51:17.906 INFO:tasks.workunit.client.1.vm08.stdout:7/36: dwrite f0 [0,4194304] 0 2026-03-10T07:51:17.911 INFO:tasks.workunit.client.1.vm08.stdout:0/68: mkdir dd/d10/d14/d15 0 2026-03-10T07:51:17.911 INFO:tasks.workunit.client.1.vm08.stdout:7/37: getdents . 
0 2026-03-10T07:51:17.911 INFO:tasks.workunit.client.1.vm08.stdout:7/38: rmdir - no directory 2026-03-10T07:51:17.911 INFO:tasks.workunit.client.1.vm08.stdout:0/69: chown dd/d10 3245 1 2026-03-10T07:51:17.912 INFO:tasks.workunit.client.1.vm08.stdout:7/39: mkdir d3 0 2026-03-10T07:51:17.912 INFO:tasks.workunit.client.1.vm08.stdout:7/40: truncate f0 5150039 0 2026-03-10T07:51:17.913 INFO:tasks.workunit.client.1.vm08.stdout:0/70: creat dd/f16 x:0 0 0 2026-03-10T07:51:17.914 INFO:tasks.workunit.client.1.vm08.stdout:7/41: creat d3/f4 x:0 0 0 2026-03-10T07:51:17.915 INFO:tasks.workunit.client.1.vm08.stdout:0/71: symlink dd/d10/d14/d15/l17 0 2026-03-10T07:51:17.928 INFO:tasks.workunit.client.1.vm08.stdout:7/42: dwrite f0 [0,4194304] 0 2026-03-10T07:51:17.934 INFO:tasks.workunit.client.1.vm08.stdout:0/72: mkdir dd/d18 0 2026-03-10T07:51:17.943 INFO:tasks.workunit.client.1.vm08.stdout:0/73: symlink dd/d10/d14/d15/l19 0 2026-03-10T07:51:17.943 INFO:tasks.workunit.client.1.vm08.stdout:7/43: dwrite f0 [4194304,4194304] 0 2026-03-10T07:51:17.943 INFO:tasks.workunit.client.1.vm08.stdout:0/74: mknod dd/c1a 0 2026-03-10T07:51:17.943 INFO:tasks.workunit.client.1.vm08.stdout:7/44: dwrite f0 [4194304,4194304] 0 2026-03-10T07:51:17.946 INFO:tasks.workunit.client.1.vm08.stdout:0/75: dwrite dd/fe [0,4194304] 0 2026-03-10T07:51:17.953 INFO:tasks.workunit.client.1.vm08.stdout:7/45: mknod d3/c5 0 2026-03-10T07:51:17.955 INFO:tasks.workunit.client.1.vm08.stdout:0/76: mkdir dd/d10/d14/d1b 0 2026-03-10T07:51:17.957 INFO:tasks.workunit.client.1.vm08.stdout:7/46: dread f0 [0,4194304] 0 2026-03-10T07:51:17.965 INFO:tasks.workunit.client.1.vm08.stdout:0/77: dwrite f5 [0,4194304] 0 2026-03-10T07:51:17.973 INFO:tasks.workunit.client.1.vm08.stdout:7/47: creat d3/f6 x:0 0 0 2026-03-10T07:51:17.975 INFO:tasks.workunit.client.1.vm08.stdout:0/78: mkdir dd/d10/d14/d1c 0 2026-03-10T07:51:17.975 INFO:tasks.workunit.client.1.vm08.stdout:0/79: truncate fb 4380400 0 2026-03-10T07:51:17.975 
INFO:tasks.workunit.client.1.vm08.stdout:0/80: mknod dd/d18/c1d 0 2026-03-10T07:51:17.976 INFO:tasks.workunit.client.1.vm08.stdout:7/48: dread f0 [4194304,4194304] 0 2026-03-10T07:51:17.976 INFO:tasks.workunit.client.1.vm08.stdout:0/81: read - dd/f16 zero size 2026-03-10T07:51:17.976 INFO:tasks.workunit.client.1.vm08.stdout:0/82: fsync f8 0 2026-03-10T07:51:17.977 INFO:tasks.workunit.client.1.vm08.stdout:7/49: write f0 [5085719,40125] 0 2026-03-10T07:51:17.978 INFO:tasks.workunit.client.1.vm08.stdout:7/50: chown c2 55518 1 2026-03-10T07:51:17.981 INFO:tasks.workunit.client.1.vm08.stdout:0/83: mknod dd/d10/d14/c1e 0 2026-03-10T07:51:17.982 INFO:tasks.workunit.client.1.vm08.stdout:0/84: write f8 [4853781,85275] 0 2026-03-10T07:51:17.982 INFO:tasks.workunit.client.1.vm08.stdout:0/85: dread - dd/f16 zero size 2026-03-10T07:51:17.982 INFO:tasks.workunit.client.1.vm08.stdout:7/51: mknod d3/c7 0 2026-03-10T07:51:17.983 INFO:tasks.workunit.client.1.vm08.stdout:7/52: read f0 [3878406,41029] 0 2026-03-10T07:51:17.985 INFO:tasks.workunit.client.1.vm08.stdout:0/86: dread dd/fe [0,4194304] 0 2026-03-10T07:51:17.991 INFO:tasks.workunit.client.1.vm08.stdout:0/87: mknod dd/d10/d14/d15/c1f 0 2026-03-10T07:51:17.995 INFO:tasks.workunit.client.1.vm08.stdout:0/88: chown dd/c1a 426694935 1 2026-03-10T07:51:18.029 INFO:tasks.workunit.client.1.vm08.stdout:6/63: dwrite d1/f7 [0,4194304] 0 2026-03-10T07:51:18.031 INFO:tasks.workunit.client.1.vm08.stdout:6/64: write d1/f7 [1875546,88532] 0 2026-03-10T07:51:18.032 INFO:tasks.workunit.client.1.vm08.stdout:6/65: mknod d1/d3/c11 0 2026-03-10T07:51:18.033 INFO:tasks.workunit.client.1.vm08.stdout:6/66: creat d1/d3/df/f12 x:0 0 0 2026-03-10T07:51:18.036 INFO:tasks.workunit.client.1.vm08.stdout:6/67: dread d1/f7 [0,4194304] 0 2026-03-10T07:51:18.037 INFO:tasks.workunit.client.1.vm08.stdout:6/68: write d1/f2 [4962185,42582] 0 2026-03-10T07:51:18.039 INFO:tasks.workunit.client.1.vm08.stdout:6/69: creat d1/d3/f13 x:0 0 0 2026-03-10T07:51:18.043 
INFO:tasks.workunit.client.1.vm08.stdout:6/70: dread d1/f7 [0,4194304] 0 2026-03-10T07:51:18.054 INFO:tasks.workunit.client.1.vm08.stdout:6/71: symlink d1/db/l14 0 2026-03-10T07:51:18.055 INFO:tasks.workunit.client.1.vm08.stdout:6/72: rename d1 to d1/d15 22 2026-03-10T07:51:18.055 INFO:tasks.workunit.client.1.vm08.stdout:6/73: chown d1/db 3361846 1 2026-03-10T07:51:18.056 INFO:tasks.workunit.client.1.vm08.stdout:6/74: write d1/f2 [644071,38405] 0 2026-03-10T07:51:18.057 INFO:tasks.workunit.client.1.vm08.stdout:6/75: dread - d1/d3/f13 zero size 2026-03-10T07:51:18.060 INFO:tasks.workunit.client.1.vm08.stdout:6/76: creat d1/db/f16 x:0 0 0 2026-03-10T07:51:18.061 INFO:tasks.workunit.client.1.vm08.stdout:6/77: chown d1/d3/df/l10 514446 1 2026-03-10T07:51:18.128 INFO:tasks.workunit.client.1.vm08.stdout:4/38: fsync d5/d8/f7 0 2026-03-10T07:51:18.130 INFO:tasks.workunit.client.1.vm08.stdout:4/39: link f2 d5/d8/fc 0 2026-03-10T07:51:18.135 INFO:tasks.workunit.client.1.vm08.stdout:4/40: rmdir d5/d8 39 2026-03-10T07:51:18.140 INFO:tasks.workunit.client.1.vm08.stdout:4/41: fdatasync f2 0 2026-03-10T07:51:18.141 INFO:tasks.workunit.client.1.vm08.stdout:4/42: dread f2 [0,4194304] 0 2026-03-10T07:51:18.144 INFO:tasks.workunit.client.1.vm08.stdout:4/43: creat d5/d8/fd x:0 0 0 2026-03-10T07:51:18.149 INFO:tasks.workunit.client.1.vm08.stdout:4/44: dwrite d5/d8/f7 [0,4194304] 0 2026-03-10T07:51:18.149 INFO:tasks.workunit.client.1.vm08.stdout:4/45: write f0 [1784207,92376] 0 2026-03-10T07:51:18.158 INFO:tasks.workunit.client.1.vm08.stdout:4/46: rename l4 to d5/d8/d9/le 0 2026-03-10T07:51:18.173 INFO:tasks.workunit.client.1.vm08.stdout:4/47: link f2 d5/d8/ff 0 2026-03-10T07:51:18.177 INFO:tasks.workunit.client.1.vm08.stdout:4/48: creat d5/f10 x:0 0 0 2026-03-10T07:51:18.180 INFO:tasks.workunit.client.1.vm08.stdout:9/38: truncate d2/f5 2523687 0 2026-03-10T07:51:18.181 INFO:tasks.workunit.client.0.vm05.stdout:8/945: write d1/dd/d4d/d64/d6a/de5/d2a/d48/f7b [771188,40113] 0 
2026-03-10T07:51:18.182 INFO:tasks.workunit.client.1.vm08.stdout:2/59: rmdir d0 39 2026-03-10T07:51:18.184 INFO:tasks.workunit.client.1.vm08.stdout:9/39: creat d2/d3/fc x:0 0 0 2026-03-10T07:51:18.184 INFO:tasks.workunit.client.0.vm05.stdout:3/956: dread d8/f5d [0,4194304] 0 2026-03-10T07:51:18.188 INFO:tasks.workunit.client.1.vm08.stdout:9/40: dwrite d2/d3/fa [4194304,4194304] 0 2026-03-10T07:51:18.190 INFO:tasks.workunit.client.1.vm08.stdout:9/41: write d2/d3/fc [390965,77015] 0 2026-03-10T07:51:18.191 INFO:tasks.workunit.client.1.vm08.stdout:9/42: stat d2/d3/fc 0 2026-03-10T07:51:18.192 INFO:tasks.workunit.client.1.vm08.stdout:9/43: dread d2/f7 [0,4194304] 0 2026-03-10T07:51:18.201 INFO:tasks.workunit.client.1.vm08.stdout:2/60: write d0/d1/fb [3289213,64499] 0 2026-03-10T07:51:18.201 INFO:tasks.workunit.client.1.vm08.stdout:2/61: readlink - no filename 2026-03-10T07:51:18.201 INFO:tasks.workunit.client.0.vm05.stdout:3/957: readlink d8/d22/d60/d6e/dca/dda/d131/l143 0 2026-03-10T07:51:18.210 INFO:tasks.workunit.client.0.vm05.stdout:3/958: dread d8/d1f/ffc [0,4194304] 0 2026-03-10T07:51:18.217 INFO:tasks.workunit.client.1.vm08.stdout:2/62: creat d0/d1/d3/f14 x:0 0 0 2026-03-10T07:51:18.220 INFO:tasks.workunit.client.0.vm05.stdout:8/946: creat d1/d6f/f132 x:0 0 0 2026-03-10T07:51:18.220 INFO:tasks.workunit.client.1.vm08.stdout:2/63: mkdir d0/d1/d15 0 2026-03-10T07:51:18.222 INFO:tasks.workunit.client.1.vm08.stdout:6/78: dread d1/f2 [4194304,4194304] 0 2026-03-10T07:51:18.222 INFO:tasks.workunit.client.1.vm08.stdout:6/79: fdatasync d1/db/f16 0 2026-03-10T07:51:18.227 INFO:tasks.workunit.client.1.vm08.stdout:2/64: symlink d0/d1/d3/d5/l16 0 2026-03-10T07:51:18.230 INFO:tasks.workunit.client.0.vm05.stdout:3/959: link d8/d1c/f124 d8/f146 0 2026-03-10T07:51:18.244 INFO:tasks.workunit.client.0.vm05.stdout:3/960: getdents d8/d1f 0 2026-03-10T07:51:18.248 INFO:tasks.workunit.client.1.vm08.stdout:7/53: fdatasync f0 0 2026-03-10T07:51:18.252 
INFO:tasks.workunit.client.0.vm05.stdout:3/961: dread d8/d1f/d2a/d34/fae [0,4194304] 0 2026-03-10T07:51:18.256 INFO:tasks.workunit.client.1.vm08.stdout:7/54: mkdir d3/d8 0 2026-03-10T07:51:18.260 INFO:tasks.workunit.client.0.vm05.stdout:3/962: mknod d8/d22/d60/d6e/dca/c147 0 2026-03-10T07:51:18.260 INFO:tasks.workunit.client.1.vm08.stdout:7/55: dread - d3/f4 zero size 2026-03-10T07:51:18.264 INFO:tasks.workunit.client.1.vm08.stdout:7/56: mkdir d3/d8/d9 0 2026-03-10T07:51:18.265 INFO:tasks.workunit.client.0.vm05.stdout:3/963: dwrite d8/d8f/dbc/dc7/f11f [0,4194304] 0 2026-03-10T07:51:18.271 INFO:tasks.workunit.client.0.vm05.stdout:3/964: write d8/d1f/d2a/d96/da9/fb5 [1962847,26501] 0 2026-03-10T07:51:18.274 INFO:tasks.workunit.client.0.vm05.stdout:3/965: chown d8/d1c/f94 20927920 1 2026-03-10T07:51:18.275 INFO:tasks.workunit.client.0.vm05.stdout:3/966: write d8/d8f/f120 [505224,107287] 0 2026-03-10T07:51:18.280 INFO:tasks.workunit.client.1.vm08.stdout:7/57: mkdir d3/da 0 2026-03-10T07:51:18.315 INFO:tasks.workunit.client.1.vm08.stdout:3/39: rmdir d0 39 2026-03-10T07:51:18.319 INFO:tasks.workunit.client.0.vm05.stdout:9/957: write d8/d86/d28/d79/d57/de1/d1c/d20/d59/d8b/f39 [4816172,16279] 0 2026-03-10T07:51:18.320 INFO:tasks.workunit.client.0.vm05.stdout:9/958: chown d8/d86/d28/d79/d57/de1/d38/fd0 9212 1 2026-03-10T07:51:18.321 INFO:tasks.workunit.client.0.vm05.stdout:9/959: dread - d8/d86/d28/d79/d57/de1/d22/d33/d62/de0/f12a zero size 2026-03-10T07:51:18.325 INFO:tasks.workunit.client.0.vm05.stdout:1/990: write da/f3a [4701889,57112] 0 2026-03-10T07:51:18.326 INFO:tasks.workunit.client.1.vm08.stdout:4/49: truncate f1 1503127 0 2026-03-10T07:51:18.326 INFO:tasks.workunit.client.0.vm05.stdout:1/991: dread - da/dd/d12/d34/f132 zero size 2026-03-10T07:51:18.326 INFO:tasks.workunit.client.0.vm05.stdout:1/992: fdatasync da/fc 0 2026-03-10T07:51:18.337 INFO:tasks.workunit.client.1.vm08.stdout:8/73: truncate d0/f6 1643230 0 2026-03-10T07:51:18.338 
INFO:tasks.workunit.client.1.vm08.stdout:8/74: readlink d0/lc 0 2026-03-10T07:51:18.339 INFO:tasks.workunit.client.0.vm05.stdout:1/993: unlink da/dd/ce 0 2026-03-10T07:51:18.355 INFO:tasks.workunit.client.0.vm05.stdout:1/994: dread da/dd/ff9 [4194304,4194304] 0 2026-03-10T07:51:18.356 INFO:tasks.workunit.client.0.vm05.stdout:5/995: write d2/d20/d33/f88 [1237419,40750] 0 2026-03-10T07:51:18.360 INFO:tasks.workunit.client.0.vm05.stdout:5/996: dwrite d2/d5/f3d [0,4194304] 0 2026-03-10T07:51:18.361 INFO:tasks.workunit.client.0.vm05.stdout:5/997: chown d2/d20/f13b 74 1 2026-03-10T07:51:18.362 INFO:tasks.workunit.client.0.vm05.stdout:5/998: dread - d2/d12/d2d/d4a/de7/f154 zero size 2026-03-10T07:51:18.382 INFO:tasks.workunit.client.1.vm08.stdout:5/52: truncate d0/d8/fe 1997481 0 2026-03-10T07:51:18.387 INFO:tasks.workunit.client.1.vm08.stdout:1/57: dwrite d2/f4 [0,4194304] 0 2026-03-10T07:51:18.392 INFO:tasks.workunit.client.1.vm08.stdout:1/58: dread d2/f12 [0,4194304] 0 2026-03-10T07:51:18.398 INFO:tasks.workunit.client.0.vm05.stdout:1/995: read - da/dd/d12/d86/fda zero size 2026-03-10T07:51:18.411 INFO:tasks.workunit.client.1.vm08.stdout:0/89: write f6 [705095,64076] 0 2026-03-10T07:51:18.417 INFO:tasks.workunit.client.0.vm05.stdout:5/999: link d2/d20/d33/d53/d7d/f82 d2/d20/d7b/dca/f159 0 2026-03-10T07:51:18.419 INFO:tasks.workunit.client.1.vm08.stdout:6/80: truncate d1/f7 3939204 0 2026-03-10T07:51:18.437 INFO:tasks.workunit.client.1.vm08.stdout:4/50: rmdir d5/d8/d9 39 2026-03-10T07:51:18.438 INFO:tasks.workunit.client.1.vm08.stdout:4/51: read - d5/d8/fd zero size 2026-03-10T07:51:18.438 INFO:tasks.workunit.client.1.vm08.stdout:8/75: symlink d0/df/l16 0 2026-03-10T07:51:18.442 INFO:tasks.workunit.client.1.vm08.stdout:8/76: dwrite d0/df/f13 [0,4194304] 0 2026-03-10T07:51:18.442 INFO:tasks.workunit.client.1.vm08.stdout:5/53: rename d0/d4/dc to d0/d4/df/d12 0 2026-03-10T07:51:18.443 INFO:tasks.workunit.client.1.vm08.stdout:5/54: readlink d0/la 0 2026-03-10T07:51:18.451 
INFO:tasks.workunit.client.1.vm08.stdout:0/90: chown ca 106080900 1 2026-03-10T07:51:18.453 INFO:tasks.workunit.client.1.vm08.stdout:6/81: mkdir d1/d17 0 2026-03-10T07:51:18.453 INFO:tasks.workunit.client.1.vm08.stdout:6/82: write d1/d3/df/f12 [398742,116351] 0 2026-03-10T07:51:18.454 INFO:tasks.workunit.client.1.vm08.stdout:6/83: write d1/d3/df/f12 [1297130,26827] 0 2026-03-10T07:51:18.464 INFO:tasks.workunit.client.1.vm08.stdout:7/58: symlink d3/da/lb 0 2026-03-10T07:51:18.469 INFO:tasks.workunit.client.1.vm08.stdout:8/77: mkdir d0/df/d17 0 2026-03-10T07:51:18.484 INFO:tasks.workunit.client.0.vm05.stdout:8/947: write d1/dd/d4d/d64/d6a/de5/d2a/d34/d49/d5d/f68 [887330,71010] 0 2026-03-10T07:51:18.484 INFO:tasks.workunit.client.1.vm08.stdout:5/55: creat d0/d4/df/d12/f13 x:0 0 0 2026-03-10T07:51:18.484 INFO:tasks.workunit.client.1.vm08.stdout:0/91: fdatasync dd/fe 0 2026-03-10T07:51:18.484 INFO:tasks.workunit.client.1.vm08.stdout:2/65: dwrite d0/d1/d3/f8 [0,4194304] 0 2026-03-10T07:51:18.489 INFO:tasks.workunit.client.1.vm08.stdout:5/56: mknod d0/d4/df/c14 0 2026-03-10T07:51:18.490 INFO:tasks.workunit.client.0.vm05.stdout:3/967: write d8/d1f/d24/d76/dc5/de1/d19/f81 [3841676,45101] 0 2026-03-10T07:51:18.501 INFO:tasks.workunit.client.0.vm05.stdout:8/948: creat d1/dd/d4d/dcc/f133 x:0 0 0 2026-03-10T07:51:18.504 INFO:tasks.workunit.client.1.vm08.stdout:4/52: rename d5/d8/f7 to d5/d8/f11 0 2026-03-10T07:51:18.505 INFO:tasks.workunit.client.0.vm05.stdout:3/968: creat d8/d8f/dbc/dc7/f148 x:0 0 0 2026-03-10T07:51:18.508 INFO:tasks.workunit.client.0.vm05.stdout:8/949: mknod d1/dd/d4d/d64/d6a/c134 0 2026-03-10T07:51:18.510 INFO:tasks.workunit.client.1.vm08.stdout:0/92: rename dd/d10/d14/d1c to dd/d10/d14/d15/d20 0 2026-03-10T07:51:18.511 INFO:tasks.workunit.client.0.vm05.stdout:3/969: readlink d8/d1f/d24/d76/dc5/de1/d19/d37/l9a 0 2026-03-10T07:51:18.518 INFO:tasks.workunit.client.0.vm05.stdout:8/950: mknod d1/dd/d4d/d64/d6a/de5/d2a/d34/d49/d10c/c135 0 2026-03-10T07:51:18.519 
INFO:tasks.workunit.client.0.vm05.stdout:8/951: dread - d1/dd/d4d/d64/d6a/de5/f109 zero size 2026-03-10T07:51:18.520 INFO:tasks.workunit.client.1.vm08.stdout:2/66: rename d0/d1/d9 to d0/d1/d17 0 2026-03-10T07:51:18.526 INFO:tasks.workunit.client.0.vm05.stdout:3/970: fdatasync d8/d22/d60/f61 0 2026-03-10T07:51:18.530 INFO:tasks.workunit.client.1.vm08.stdout:7/59: link d3/c7 d3/cc 0 2026-03-10T07:51:18.533 INFO:tasks.workunit.client.1.vm08.stdout:7/60: dwrite f0 [0,4194304] 0 2026-03-10T07:51:18.535 INFO:tasks.workunit.client.1.vm08.stdout:7/61: dread - d3/f6 zero size 2026-03-10T07:51:18.539 INFO:tasks.workunit.client.1.vm08.stdout:7/62: dwrite d3/f4 [0,4194304] 0 2026-03-10T07:51:18.549 INFO:tasks.workunit.client.1.vm08.stdout:8/78: read d0/f6 [1005323,26254] 0 2026-03-10T07:51:18.551 INFO:tasks.workunit.client.0.vm05.stdout:8/952: fdatasync d1/f15 0 2026-03-10T07:51:18.559 INFO:tasks.workunit.client.0.vm05.stdout:9/960: dwrite d8/d86/d28/d79/d57/ff5 [0,4194304] 0 2026-03-10T07:51:18.562 INFO:tasks.workunit.client.0.vm05.stdout:9/961: dwrite d8/d86/d28/d79/d57/de1/d1c/d20/fbd [0,4194304] 0 2026-03-10T07:51:18.563 INFO:tasks.workunit.client.0.vm05.stdout:9/962: dread - d8/d86/d28/d79/d57/de1/d22/dab/f13b zero size 2026-03-10T07:51:18.571 INFO:tasks.workunit.client.1.vm08.stdout:5/57: write d0/d8/fe [1161951,83974] 0 2026-03-10T07:51:18.571 INFO:tasks.workunit.client.1.vm08.stdout:5/58: stat d0/d4 0 2026-03-10T07:51:18.575 INFO:tasks.workunit.client.1.vm08.stdout:0/93: creat dd/d18/f21 x:0 0 0 2026-03-10T07:51:18.591 INFO:tasks.workunit.client.0.vm05.stdout:1/996: write da/dd/d2a/d70/f9c [397368,127775] 0 2026-03-10T07:51:18.605 INFO:tasks.workunit.client.0.vm05.stdout:3/971: mkdir d8/d1f/d24/d76/dc5/de1/d52/d149 0 2026-03-10T07:51:18.614 INFO:tasks.workunit.client.0.vm05.stdout:8/953: dread d1/dd/d4d/d64/d6a/de5/d2a/d48/f4a [0,4194304] 0 2026-03-10T07:51:18.625 INFO:tasks.workunit.client.1.vm08.stdout:8/79: rename d0/fd to d0/df/f18 0 2026-03-10T07:51:18.625 
INFO:tasks.workunit.client.1.vm08.stdout:8/80: write d0/df/f12 [1545673,108181] 0 2026-03-10T07:51:18.633 INFO:tasks.workunit.client.1.vm08.stdout:4/53: mkdir d5/d8/d9/d12 0 2026-03-10T07:51:18.637 INFO:tasks.workunit.client.0.vm05.stdout:1/997: write da/dd/d12/d34/d107/dbe/dc0/fb7 [7316942,71799] 0 2026-03-10T07:51:18.637 INFO:tasks.workunit.client.1.vm08.stdout:5/59: unlink d0/c10 0 2026-03-10T07:51:18.638 INFO:tasks.workunit.client.1.vm08.stdout:5/60: write d0/f9 [1486607,104510] 0 2026-03-10T07:51:18.644 INFO:tasks.workunit.client.0.vm05.stdout:8/954: creat d1/dd/d4d/d64/d6a/de5/d2a/d48/dfd/f136 x:0 0 0 2026-03-10T07:51:18.645 INFO:tasks.workunit.client.0.vm05.stdout:8/955: write d1/dd/d4d/d64/d6a/de5/d2a/d48/dfd/f100 [445431,128132] 0 2026-03-10T07:51:18.656 INFO:tasks.workunit.client.0.vm05.stdout:1/998: creat da/d26/d2b/d71/f134 x:0 0 0 2026-03-10T07:51:18.660 INFO:tasks.workunit.client.1.vm08.stdout:5/61: sync 2026-03-10T07:51:18.663 INFO:tasks.workunit.client.0.vm05.stdout:1/999: rename da/dd/d12/f16 to da/dd/d42/d80/f135 0 2026-03-10T07:51:18.664 INFO:tasks.workunit.client.1.vm08.stdout:5/62: dwrite d0/d8/fe [0,4194304] 0 2026-03-10T07:51:18.667 INFO:tasks.workunit.client.1.vm08.stdout:3/40: write d0/f8 [730116,64477] 0 2026-03-10T07:51:18.668 INFO:tasks.workunit.client.1.vm08.stdout:9/44: write d2/f5 [2542683,14326] 0 2026-03-10T07:51:18.668 INFO:tasks.workunit.client.1.vm08.stdout:1/59: truncate d2/d6/fd 3044179 0 2026-03-10T07:51:18.669 INFO:tasks.workunit.client.1.vm08.stdout:1/60: dread - d2/f13 zero size 2026-03-10T07:51:18.669 INFO:tasks.workunit.client.1.vm08.stdout:1/61: write d2/f4 [1988584,61550] 0 2026-03-10T07:51:18.679 INFO:tasks.workunit.client.1.vm08.stdout:5/63: unlink d0/d4/lb 0 2026-03-10T07:51:18.683 INFO:tasks.workunit.client.1.vm08.stdout:4/54: symlink d5/d8/d9/d12/l13 0 2026-03-10T07:51:18.692 INFO:tasks.workunit.client.1.vm08.stdout:2/67: link d0/c2 d0/d1/d3/d10/c18 0 2026-03-10T07:51:18.692 
INFO:tasks.workunit.client.1.vm08.stdout:4/55: dread f0 [0,4194304] 0 2026-03-10T07:51:18.692 INFO:tasks.workunit.client.1.vm08.stdout:2/68: fsync d0/d1/d3/f8 0 2026-03-10T07:51:18.692 INFO:tasks.workunit.client.1.vm08.stdout:4/56: stat f2 0 2026-03-10T07:51:18.692 INFO:tasks.workunit.client.1.vm08.stdout:2/69: fdatasync d0/d1/fb 0 2026-03-10T07:51:18.692 INFO:tasks.workunit.client.1.vm08.stdout:2/70: dread d0/d1/d3/f8 [0,4194304] 0 2026-03-10T07:51:18.694 INFO:tasks.workunit.client.1.vm08.stdout:3/41: mknod d0/ca 0 2026-03-10T07:51:18.695 INFO:tasks.workunit.client.1.vm08.stdout:9/45: creat d2/fd x:0 0 0 2026-03-10T07:51:18.695 INFO:tasks.workunit.client.0.vm05.stdout:8/956: getdents d1/dd/d4d/d64/d6a/de5/d2a/d34/d49/d10c 0 2026-03-10T07:51:18.695 INFO:tasks.workunit.client.0.vm05.stdout:9/963: write d8/d86/d28/d79/d57/de1/d22/d33/f5b [3685157,43136] 0 2026-03-10T07:51:18.695 INFO:tasks.workunit.client.1.vm08.stdout:9/46: write d2/fd [336475,55448] 0 2026-03-10T07:51:18.696 INFO:tasks.workunit.client.1.vm08.stdout:5/64: mknod d0/d4/c15 0 2026-03-10T07:51:18.696 INFO:tasks.workunit.client.1.vm08.stdout:1/62: dread d2/f12 [0,4194304] 0 2026-03-10T07:51:18.698 INFO:tasks.workunit.client.0.vm05.stdout:3/972: dwrite d8/d1f/d24/d76/dc5/de1/d52/fde [0,4194304] 0 2026-03-10T07:51:18.699 INFO:tasks.workunit.client.0.vm05.stdout:9/964: dread - d8/d86/d28/d79/d57/de1/d6b/dde/f128 zero size 2026-03-10T07:51:18.700 INFO:tasks.workunit.client.0.vm05.stdout:9/965: truncate d8/d86/d28/d79/d57/dbc/d11a/f125 158525 0 2026-03-10T07:51:18.700 INFO:tasks.workunit.client.1.vm08.stdout:8/81: dwrite d0/f6 [0,4194304] 0 2026-03-10T07:51:18.702 INFO:tasks.workunit.client.1.vm08.stdout:3/42: symlink d0/lb 0 2026-03-10T07:51:18.703 INFO:tasks.workunit.client.1.vm08.stdout:9/47: mkdir d2/de 0 2026-03-10T07:51:18.703 INFO:tasks.workunit.client.1.vm08.stdout:7/63: dwrite d3/f4 [4194304,4194304] 0 2026-03-10T07:51:18.704 INFO:tasks.workunit.client.0.vm05.stdout:8/957: rename 
d1/dd/d4d/d64/d6a/de5/d2a/d9a/lcd to d1/dd/d4d/d64/d6a/de5/d2a/d34/d49/d10c/l137 0 2026-03-10T07:51:18.706 INFO:tasks.workunit.client.0.vm05.stdout:8/958: dread d1/dd/d18/f58 [0,4194304] 0 2026-03-10T07:51:18.709 INFO:tasks.workunit.client.1.vm08.stdout:3/43: truncate d0/f8 1979322 0 2026-03-10T07:51:18.711 INFO:tasks.workunit.client.0.vm05.stdout:9/966: dread d8/d86/d28/d79/d57/de1/d1c/d20/d59/d8b/f50 [0,4194304] 0 2026-03-10T07:51:18.732 INFO:tasks.workunit.client.1.vm08.stdout:0/94: dwrite f2 [0,4194304] 0 2026-03-10T07:51:18.733 INFO:tasks.workunit.client.1.vm08.stdout:0/95: fdatasync f8 0 2026-03-10T07:51:18.765 INFO:tasks.workunit.client.1.vm08.stdout:9/48: unlink d2/c8 0 2026-03-10T07:51:18.765 INFO:tasks.workunit.client.1.vm08.stdout:3/44: rename d0/f8 to d0/fc 0 2026-03-10T07:51:18.766 INFO:tasks.workunit.client.1.vm08.stdout:0/96: mkdir dd/d10/d14/d15/d20/d22 0 2026-03-10T07:51:18.768 INFO:tasks.workunit.client.1.vm08.stdout:2/71: creat d0/f19 x:0 0 0 2026-03-10T07:51:18.770 INFO:tasks.workunit.client.1.vm08.stdout:7/64: creat d3/d8/d9/fd x:0 0 0 2026-03-10T07:51:18.771 INFO:tasks.workunit.client.1.vm08.stdout:7/65: read - d3/f6 zero size 2026-03-10T07:51:18.772 INFO:tasks.workunit.client.1.vm08.stdout:2/72: dread d0/d1/fb [0,4194304] 0 2026-03-10T07:51:18.772 INFO:tasks.workunit.client.1.vm08.stdout:9/49: symlink d2/d3/lf 0 2026-03-10T07:51:18.773 INFO:tasks.workunit.client.1.vm08.stdout:2/73: chown d0/c6 28754 1 2026-03-10T07:51:18.773 INFO:tasks.workunit.client.1.vm08.stdout:0/97: creat dd/d10/d14/f23 x:0 0 0 2026-03-10T07:51:18.773 INFO:tasks.workunit.client.1.vm08.stdout:7/66: dread - d3/d8/d9/fd zero size 2026-03-10T07:51:18.775 INFO:tasks.workunit.client.1.vm08.stdout:5/65: link d0/d4/df/c14 d0/d4/df/c16 0 2026-03-10T07:51:18.777 INFO:tasks.workunit.client.1.vm08.stdout:7/67: write d3/f4 [7311715,84113] 0 2026-03-10T07:51:18.778 INFO:tasks.workunit.client.1.vm08.stdout:0/98: dread f6 [0,4194304] 0 2026-03-10T07:51:18.779 
INFO:tasks.workunit.client.1.vm08.stdout:5/66: truncate d0/d4/df/d12/f11 87442 0 2026-03-10T07:51:18.779 INFO:tasks.workunit.client.1.vm08.stdout:3/45: dread d0/fc [0,4194304] 0 2026-03-10T07:51:18.781 INFO:tasks.workunit.client.1.vm08.stdout:2/74: dread d0/d1/d3/f8 [0,4194304] 0 2026-03-10T07:51:18.781 INFO:tasks.workunit.client.1.vm08.stdout:2/75: chown d0/c6 0 1 2026-03-10T07:51:18.782 INFO:tasks.workunit.client.1.vm08.stdout:9/50: getdents d2/de 0 2026-03-10T07:51:18.783 INFO:tasks.workunit.client.1.vm08.stdout:0/99: symlink dd/d18/l24 0 2026-03-10T07:51:18.783 INFO:tasks.workunit.client.1.vm08.stdout:5/67: rmdir d0/d4/df 39 2026-03-10T07:51:18.785 INFO:tasks.workunit.client.1.vm08.stdout:7/68: dwrite d3/f6 [0,4194304] 0 2026-03-10T07:51:18.785 INFO:tasks.workunit.client.1.vm08.stdout:3/46: mkdir d0/dd 0 2026-03-10T07:51:18.787 INFO:tasks.workunit.client.1.vm08.stdout:0/100: dread f8 [0,4194304] 0 2026-03-10T07:51:18.821 INFO:tasks.workunit.client.1.vm08.stdout:9/51: truncate d2/fb 833976 0 2026-03-10T07:51:18.821 INFO:tasks.workunit.client.1.vm08.stdout:7/69: symlink d3/da/le 0 2026-03-10T07:51:18.821 INFO:tasks.workunit.client.1.vm08.stdout:0/101: unlink dd/d10/d14/f23 0 2026-03-10T07:51:18.821 INFO:tasks.workunit.client.1.vm08.stdout:7/70: rename d3/da/lb to d3/da/lf 0 2026-03-10T07:51:18.821 INFO:tasks.workunit.client.1.vm08.stdout:5/68: creat d0/f17 x:0 0 0 2026-03-10T07:51:18.821 INFO:tasks.workunit.client.1.vm08.stdout:7/71: creat d3/d8/f10 x:0 0 0 2026-03-10T07:51:18.821 INFO:tasks.workunit.client.1.vm08.stdout:0/102: link dd/d10/f12 dd/d18/f25 0 2026-03-10T07:51:18.821 INFO:tasks.workunit.client.1.vm08.stdout:5/69: truncate d0/d4/df/d12/f13 948357 0 2026-03-10T07:51:18.821 INFO:tasks.workunit.client.1.vm08.stdout:7/72: creat d3/da/f11 x:0 0 0 2026-03-10T07:51:18.821 INFO:tasks.workunit.client.1.vm08.stdout:5/70: write d0/d4/df/d12/f13 [635614,122520] 0 2026-03-10T07:51:18.821 INFO:tasks.workunit.client.1.vm08.stdout:0/103: dwrite f5 [4194304,4194304] 0 
2026-03-10T07:51:18.821 INFO:tasks.workunit.client.1.vm08.stdout:7/73: write d3/f6 [495678,62653] 0 2026-03-10T07:51:18.828 INFO:tasks.workunit.client.1.vm08.stdout:7/74: rename d3/da/le to d3/d8/l12 0 2026-03-10T07:51:18.842 INFO:tasks.workunit.client.1.vm08.stdout:5/71: creat d0/d8/f18 x:0 0 0 2026-03-10T07:51:18.842 INFO:tasks.workunit.client.1.vm08.stdout:5/72: write d0/d4/f5 [4853337,66198] 0 2026-03-10T07:51:18.842 INFO:tasks.workunit.client.1.vm08.stdout:7/75: creat d3/f13 x:0 0 0 2026-03-10T07:51:18.842 INFO:tasks.workunit.client.1.vm08.stdout:7/76: chown d3/c5 49 1 2026-03-10T07:51:18.842 INFO:tasks.workunit.client.1.vm08.stdout:7/77: rename d3/f13 to d3/d8/f14 0 2026-03-10T07:51:18.842 INFO:tasks.workunit.client.1.vm08.stdout:0/104: link lc dd/d10/l26 0 2026-03-10T07:51:18.842 INFO:tasks.workunit.client.1.vm08.stdout:5/73: mkdir d0/d4/d19 0 2026-03-10T07:51:18.842 INFO:tasks.workunit.client.1.vm08.stdout:7/78: dwrite d3/f4 [8388608,4194304] 0 2026-03-10T07:51:18.843 INFO:tasks.workunit.client.1.vm08.stdout:5/74: dwrite d0/d4/df/d12/f11 [0,4194304] 0 2026-03-10T07:51:18.843 INFO:tasks.workunit.client.1.vm08.stdout:5/75: chown d0/f9 3 1 2026-03-10T07:51:18.843 INFO:tasks.workunit.client.0.vm05.stdout:9/967: sync 2026-03-10T07:51:18.846 INFO:tasks.workunit.client.1.vm08.stdout:7/79: dread d3/f6 [0,4194304] 0 2026-03-10T07:51:18.851 INFO:tasks.workunit.client.1.vm08.stdout:5/76: dread d0/d8/fe [0,4194304] 0 2026-03-10T07:51:18.864 INFO:tasks.workunit.client.1.vm08.stdout:7/80: rename d3/da/f11 to d3/d8/d9/f15 0 2026-03-10T07:51:18.882 INFO:tasks.workunit.client.0.vm05.stdout:9/968: rmdir d8/d86/d28/d79/d57/de1/d22/d33/d62/d6d/d9e/d112 39 2026-03-10T07:51:18.882 INFO:tasks.workunit.client.0.vm05.stdout:9/969: rename d8/l61 to d8/d86/d28/d79/d57/de1/d38/d71/d12e/l147 0 2026-03-10T07:51:18.882 INFO:tasks.workunit.client.0.vm05.stdout:9/970: dwrite d8/d86/d28/d79/d57/de1/d1c/d20/d59/d8b/f39 [4194304,4194304] 0 2026-03-10T07:51:18.882 
INFO:tasks.workunit.client.1.vm08.stdout:0/105: symlink dd/d10/d14/d1b/l27 0 2026-03-10T07:51:18.882 INFO:tasks.workunit.client.1.vm08.stdout:0/106: write dd/f16 [961855,19706] 0 2026-03-10T07:51:18.882 INFO:tasks.workunit.client.1.vm08.stdout:7/81: rename d3/d8/f10 to d3/f16 0 2026-03-10T07:51:18.882 INFO:tasks.workunit.client.1.vm08.stdout:0/107: rmdir dd/d10 39 2026-03-10T07:51:18.882 INFO:tasks.workunit.client.1.vm08.stdout:7/82: dread f0 [4194304,4194304] 0 2026-03-10T07:51:18.882 INFO:tasks.workunit.client.1.vm08.stdout:7/83: read - d3/f16 zero size 2026-03-10T07:51:18.882 INFO:tasks.workunit.client.1.vm08.stdout:5/77: creat d0/f1a x:0 0 0 2026-03-10T07:51:18.882 INFO:tasks.workunit.client.1.vm08.stdout:0/108: chown dd/d10/d14/d15/l17 6103 1 2026-03-10T07:51:18.882 INFO:tasks.workunit.client.1.vm08.stdout:7/84: dwrite d3/d8/d9/fd [0,4194304] 0 2026-03-10T07:51:18.882 INFO:tasks.workunit.client.1.vm08.stdout:5/78: creat d0/d8/f1b x:0 0 0 2026-03-10T07:51:18.900 INFO:tasks.workunit.client.1.vm08.stdout:0/109: rename dd/d10/d14/d15/c1f to dd/d10/c28 0 2026-03-10T07:51:18.907 INFO:tasks.workunit.client.1.vm08.stdout:7/85: creat d3/da/f17 x:0 0 0 2026-03-10T07:51:18.908 INFO:tasks.workunit.client.1.vm08.stdout:5/79: mkdir d0/d4/df/d12/d1c 0 2026-03-10T07:51:18.910 INFO:tasks.workunit.client.1.vm08.stdout:7/86: dread d3/d8/d9/fd [0,4194304] 0 2026-03-10T07:51:18.914 INFO:tasks.workunit.client.1.vm08.stdout:0/110: mkdir dd/d29 0 2026-03-10T07:51:18.934 INFO:tasks.workunit.client.1.vm08.stdout:7/87: creat d3/d8/f18 x:0 0 0 2026-03-10T07:51:18.934 INFO:tasks.workunit.client.1.vm08.stdout:7/88: read f0 [1621992,128544] 0 2026-03-10T07:51:18.934 INFO:tasks.workunit.client.1.vm08.stdout:0/111: creat dd/d29/f2a x:0 0 0 2026-03-10T07:51:18.934 INFO:tasks.workunit.client.1.vm08.stdout:7/89: symlink d3/d8/d9/l19 0 2026-03-10T07:51:18.934 INFO:tasks.workunit.client.1.vm08.stdout:7/90: write d3/d8/d9/fd [1024317,70520] 0 2026-03-10T07:51:18.934 
INFO:tasks.workunit.client.1.vm08.stdout:6/84: truncate d1/f7 1634196 0 2026-03-10T07:51:18.934 INFO:tasks.workunit.client.1.vm08.stdout:2/76: truncate d0/d1/fb 1020161 0 2026-03-10T07:51:18.934 INFO:tasks.workunit.client.1.vm08.stdout:0/112: dwrite dd/d10/f12 [0,4194304] 0 2026-03-10T07:51:18.934 INFO:tasks.workunit.client.1.vm08.stdout:6/85: dwrite d1/db/f16 [0,4194304] 0 2026-03-10T07:51:18.934 INFO:tasks.workunit.client.1.vm08.stdout:7/91: symlink d3/d8/l1a 0 2026-03-10T07:51:18.937 INFO:tasks.workunit.client.0.vm05.stdout:9/971: read d8/d86/f92 [2883194,46705] 0 2026-03-10T07:51:18.938 INFO:tasks.workunit.client.0.vm05.stdout:9/972: write d8/d86/d28/d79/d57/de1/d1c/d20/dd3/f144 [65677,13421] 0 2026-03-10T07:51:18.940 INFO:tasks.workunit.client.1.vm08.stdout:0/113: dwrite f8 [0,4194304] 0 2026-03-10T07:51:18.949 INFO:tasks.workunit.client.1.vm08.stdout:2/77: creat d0/d1/d17/f1a x:0 0 0 2026-03-10T07:51:18.965 INFO:tasks.workunit.client.0.vm05.stdout:8/959: write d1/dd/d4d/d64/d6a/de5/d2a/d48/f59 [3867055,99632] 0 2026-03-10T07:51:18.965 INFO:tasks.workunit.client.0.vm05.stdout:3/973: write d8/f18 [1350459,83745] 0 2026-03-10T07:51:18.966 INFO:tasks.workunit.client.0.vm05.stdout:9/973: mknod d8/d86/d28/d79/d57/dbc/c148 0 2026-03-10T07:51:18.966 INFO:tasks.workunit.client.1.vm08.stdout:4/57: write f1 [28219,86737] 0 2026-03-10T07:51:18.966 INFO:tasks.workunit.client.1.vm08.stdout:4/58: dread - d5/d8/fd zero size 2026-03-10T07:51:18.966 INFO:tasks.workunit.client.1.vm08.stdout:8/82: truncate d0/f6 2627061 0 2026-03-10T07:51:18.966 INFO:tasks.workunit.client.1.vm08.stdout:6/86: creat d1/f18 x:0 0 0 2026-03-10T07:51:18.966 INFO:tasks.workunit.client.1.vm08.stdout:8/83: dread d0/df/f12 [0,4194304] 0 2026-03-10T07:51:18.966 INFO:tasks.workunit.client.1.vm08.stdout:4/59: dwrite f1 [0,4194304] 0 2026-03-10T07:51:18.966 INFO:tasks.workunit.client.1.vm08.stdout:7/92: creat d3/d8/f1b x:0 0 0 2026-03-10T07:51:18.967 INFO:tasks.workunit.client.1.vm08.stdout:3/47: truncate 
d0/fc 315649 0 2026-03-10T07:51:18.967 INFO:tasks.workunit.client.1.vm08.stdout:4/60: write d5/f10 [893108,50132] 0 2026-03-10T07:51:18.967 INFO:tasks.workunit.client.1.vm08.stdout:8/84: truncate d0/df/f18 4354541 0 2026-03-10T07:51:18.967 INFO:tasks.workunit.client.1.vm08.stdout:0/114: truncate f6 1232668 0 2026-03-10T07:51:18.967 INFO:tasks.workunit.client.1.vm08.stdout:5/80: sync 2026-03-10T07:51:18.967 INFO:tasks.workunit.client.1.vm08.stdout:5/81: readlink d0/la 0 2026-03-10T07:51:18.967 INFO:tasks.workunit.client.1.vm08.stdout:5/82: stat d0/f1a 0 2026-03-10T07:51:18.967 INFO:tasks.workunit.client.0.vm05.stdout:3/974: creat d8/d22/d60/d6e/dca/f14a x:0 0 0 2026-03-10T07:51:18.968 INFO:tasks.workunit.client.0.vm05.stdout:3/975: write d8/d1f/d24/d8a/f57 [8049869,64457] 0 2026-03-10T07:51:18.970 INFO:tasks.workunit.client.1.vm08.stdout:8/85: dwrite d0/fa [0,4194304] 0 2026-03-10T07:51:18.973 INFO:tasks.workunit.client.1.vm08.stdout:2/78: mkdir d0/d1/d3/d5/d1b 0 2026-03-10T07:51:18.977 INFO:tasks.workunit.client.1.vm08.stdout:5/83: dwrite d0/f17 [0,4194304] 0 2026-03-10T07:51:18.977 INFO:tasks.workunit.client.1.vm08.stdout:5/84: readlink d0/la 0 2026-03-10T07:51:18.988 INFO:tasks.workunit.client.1.vm08.stdout:6/87: creat d1/d3/f19 x:0 0 0 2026-03-10T07:51:18.988 INFO:tasks.workunit.client.0.vm05.stdout:9/974: getdents d8/d86/d28/d79/d57/de1/d38/d71/d81/d10f 0 2026-03-10T07:51:18.989 INFO:tasks.workunit.client.1.vm08.stdout:3/48: rename d0/c5 to d0/dd/ce 0 2026-03-10T07:51:18.993 INFO:tasks.workunit.client.1.vm08.stdout:7/93: dwrite d3/d8/f14 [0,4194304] 0 2026-03-10T07:51:18.995 INFO:tasks.workunit.client.0.vm05.stdout:9/975: creat d8/d86/d28/d79/d57/de1/d22/d33/f149 x:0 0 0 2026-03-10T07:51:19.000 INFO:tasks.workunit.client.0.vm05.stdout:9/976: truncate d8/d86/d28/d79/d57/de1/d22/fb1 24075 0 2026-03-10T07:51:19.000 INFO:tasks.workunit.client.1.vm08.stdout:5/85: rmdir d0/d4/df 39 2026-03-10T07:51:19.001 INFO:tasks.workunit.client.1.vm08.stdout:5/86: dread - 
d0/d8/f1b zero size 2026-03-10T07:51:19.002 INFO:tasks.workunit.client.1.vm08.stdout:5/87: stat d0/d4/d19 0 2026-03-10T07:51:19.002 INFO:tasks.workunit.client.1.vm08.stdout:7/94: dread d3/f4 [0,4194304] 0 2026-03-10T07:51:19.003 INFO:tasks.workunit.client.1.vm08.stdout:6/88: rename d1/d3/df/l10 to d1/l1a 0 2026-03-10T07:51:19.010 INFO:tasks.workunit.client.1.vm08.stdout:6/89: dread d1/d3/df/f12 [0,4194304] 0 2026-03-10T07:51:19.010 INFO:tasks.workunit.client.1.vm08.stdout:6/90: truncate d1/f18 636325 0 2026-03-10T07:51:19.015 INFO:tasks.workunit.client.1.vm08.stdout:2/79: getdents d0/d1/d3/d5/d1b 0 2026-03-10T07:51:19.016 INFO:tasks.workunit.client.1.vm08.stdout:6/91: dwrite d1/db/f16 [0,4194304] 0 2026-03-10T07:51:19.019 INFO:tasks.workunit.client.1.vm08.stdout:7/95: symlink d3/d8/d9/l1c 0 2026-03-10T07:51:19.028 INFO:tasks.workunit.client.1.vm08.stdout:7/96: dwrite d3/d8/f18 [0,4194304] 0 2026-03-10T07:51:19.033 INFO:tasks.workunit.client.1.vm08.stdout:5/88: rmdir d0/d4/df 39 2026-03-10T07:51:19.034 INFO:tasks.workunit.client.1.vm08.stdout:5/89: read - d0/d8/f1b zero size 2026-03-10T07:51:19.034 INFO:tasks.workunit.client.1.vm08.stdout:6/92: rename d1/d3/c11 to d1/db/c1b 0 2026-03-10T07:51:19.039 INFO:tasks.workunit.client.1.vm08.stdout:2/80: rmdir d0/d1/d15 0 2026-03-10T07:51:19.046 INFO:tasks.workunit.client.1.vm08.stdout:2/81: dwrite d0/d1/d3/f8 [0,4194304] 0 2026-03-10T07:51:19.046 INFO:tasks.workunit.client.1.vm08.stdout:2/82: chown d0/d1/d3/d5/f13 1610 1 2026-03-10T07:51:19.060 INFO:tasks.workunit.client.1.vm08.stdout:7/97: sync 2026-03-10T07:51:19.065 INFO:tasks.workunit.client.1.vm08.stdout:6/93: symlink d1/d17/l1c 0 2026-03-10T07:51:19.065 INFO:tasks.workunit.client.1.vm08.stdout:6/94: stat d1/db/f16 0 2026-03-10T07:51:19.065 INFO:tasks.workunit.client.1.vm08.stdout:2/83: creat d0/d1/d17/f1c x:0 0 0 2026-03-10T07:51:19.066 INFO:tasks.workunit.client.1.vm08.stdout:5/90: truncate d0/d8/fe 961067 0 2026-03-10T07:51:19.066 
INFO:tasks.workunit.client.1.vm08.stdout:2/84: read - d0/f19 zero size 2026-03-10T07:51:19.073 INFO:tasks.workunit.client.1.vm08.stdout:2/85: dread d0/d1/d3/f8 [0,4194304] 0 2026-03-10T07:51:19.081 INFO:tasks.workunit.client.1.vm08.stdout:7/98: chown d3/c7 3623 1 2026-03-10T07:51:19.081 INFO:tasks.workunit.client.1.vm08.stdout:2/86: stat d0/d1/d3/d5/l16 0 2026-03-10T07:51:19.081 INFO:tasks.workunit.client.1.vm08.stdout:2/87: read d0/d1/d3/f8 [3439867,23762] 0 2026-03-10T07:51:19.081 INFO:tasks.workunit.client.1.vm08.stdout:5/91: chown d0/d4/df/d12/d1c 720383485 1 2026-03-10T07:51:19.081 INFO:tasks.workunit.client.1.vm08.stdout:5/92: fsync d0/d4/df/d12/f11 0 2026-03-10T07:51:19.081 INFO:tasks.workunit.client.1.vm08.stdout:1/63: dwrite d2/d6/fd [0,4194304] 0 2026-03-10T07:51:19.085 INFO:tasks.workunit.client.1.vm08.stdout:2/88: unlink d0/c6 0 2026-03-10T07:51:19.090 INFO:tasks.workunit.client.1.vm08.stdout:7/99: sync 2026-03-10T07:51:19.094 INFO:tasks.workunit.client.1.vm08.stdout:8/86: write d0/f6 [2574796,45972] 0 2026-03-10T07:51:19.096 INFO:tasks.workunit.client.1.vm08.stdout:8/87: truncate d0/df/f18 4935831 0 2026-03-10T07:51:19.096 INFO:tasks.workunit.client.1.vm08.stdout:8/88: fdatasync d0/df/f13 0 2026-03-10T07:51:19.099 INFO:tasks.workunit.client.1.vm08.stdout:9/52: dwrite d2/fb [0,4194304] 0 2026-03-10T07:51:19.103 INFO:tasks.workunit.client.1.vm08.stdout:9/53: chown d2/fd 1007005798 1 2026-03-10T07:51:19.114 INFO:tasks.workunit.client.1.vm08.stdout:1/64: link d2/d6/de/f15 d2/d6/de/f1c 0 2026-03-10T07:51:19.115 INFO:tasks.workunit.client.0.vm05.stdout:8/960: write d1/ff2 [269940,103115] 0 2026-03-10T07:51:19.116 INFO:tasks.workunit.client.0.vm05.stdout:8/961: chown d1/dd/d4d/d64/d6a/de5/d2a/de3/cfa 32393 1 2026-03-10T07:51:19.116 INFO:tasks.workunit.client.0.vm05.stdout:8/962: chown d1/d45/dfb/c12c 29656 1 2026-03-10T07:51:19.120 INFO:tasks.workunit.client.0.vm05.stdout:3/976: dwrite d8/d1c/f9d [0,4194304] 0 2026-03-10T07:51:19.137 
INFO:tasks.workunit.client.1.vm08.stdout:9/54: creat d2/f10 x:0 0 0 2026-03-10T07:51:19.142 INFO:tasks.workunit.client.1.vm08.stdout:9/55: dread d2/fd [0,4194304] 0 2026-03-10T07:51:19.146 INFO:tasks.workunit.client.1.vm08.stdout:9/56: creat d2/f11 x:0 0 0 2026-03-10T07:51:19.149 INFO:tasks.workunit.client.1.vm08.stdout:9/57: creat d2/d3/f12 x:0 0 0 2026-03-10T07:51:19.154 INFO:tasks.workunit.client.1.vm08.stdout:1/65: getdents d2 0 2026-03-10T07:51:19.166 INFO:tasks.workunit.client.1.vm08.stdout:9/58: creat d2/de/f13 x:0 0 0 2026-03-10T07:51:19.166 INFO:tasks.workunit.client.1.vm08.stdout:1/66: mknod d2/d6/de/c1d 0 2026-03-10T07:51:19.166 INFO:tasks.workunit.client.1.vm08.stdout:9/59: creat d2/de/f14 x:0 0 0 2026-03-10T07:51:19.166 INFO:tasks.workunit.client.1.vm08.stdout:1/67: rename d2/d6/de/f1b to d2/d6/d11/f1e 0 2026-03-10T07:51:19.166 INFO:tasks.workunit.client.1.vm08.stdout:1/68: dread - d2/d6/de/f15 zero size 2026-03-10T07:51:19.211 INFO:tasks.workunit.client.1.vm08.stdout:4/61: dread d5/f10 [0,4194304] 0 2026-03-10T07:51:19.215 INFO:tasks.workunit.client.0.vm05.stdout:3/977: symlink d8/d1f/d24/d76/dc5/l14b 0 2026-03-10T07:51:19.219 INFO:tasks.workunit.client.0.vm05.stdout:8/963: rename d1/dd/d4d/dcc/dbd/d10e/fd7 to d1/dd/d4d/d64/d6a/de5/d2a/de3/f138 0 2026-03-10T07:51:19.262 INFO:tasks.workunit.client.1.vm08.stdout:7/100: fdatasync d3/d8/f18 0 2026-03-10T07:51:19.263 INFO:tasks.workunit.client.1.vm08.stdout:7/101: chown d3/da 8581 1 2026-03-10T07:51:19.265 INFO:tasks.workunit.client.1.vm08.stdout:7/102: unlink d3/d8/d9/f15 0 2026-03-10T07:51:19.269 INFO:tasks.workunit.client.1.vm08.stdout:7/103: write d3/d8/d9/fd [335720,96095] 0 2026-03-10T07:51:19.269 INFO:tasks.workunit.client.1.vm08.stdout:6/95: fsync d1/d3/f19 0 2026-03-10T07:51:19.269 INFO:tasks.workunit.client.0.vm05.stdout:8/964: sync 2026-03-10T07:51:19.269 INFO:tasks.workunit.client.0.vm05.stdout:9/977: write d8/d86/d28/d79/d57/de1/d1c/d20/dd3/d63/fd5 [1095658,89055] 0 2026-03-10T07:51:19.270 
INFO:tasks.workunit.client.1.vm08.stdout:8/89: truncate d0/df/f12 3309080 0 2026-03-10T07:51:19.271 INFO:tasks.workunit.client.0.vm05.stdout:8/965: write d1/dd/d4d/d64/d6a/de5/d2a/d48/f59 [4941514,81700] 0 2026-03-10T07:51:19.289 INFO:tasks.workunit.client.1.vm08.stdout:2/89: getdents d0/d1 0 2026-03-10T07:51:19.289 INFO:tasks.workunit.client.1.vm08.stdout:5/93: read d0/d8/fe [519638,28056] 0 2026-03-10T07:51:19.291 INFO:tasks.workunit.client.1.vm08.stdout:7/104: creat d3/da/f1d x:0 0 0 2026-03-10T07:51:19.296 INFO:tasks.workunit.client.1.vm08.stdout:3/49: dwrite d0/fc [0,4194304] 0 2026-03-10T07:51:19.301 INFO:tasks.workunit.client.1.vm08.stdout:2/90: dwrite d0/d1/d3/f14 [0,4194304] 0 2026-03-10T07:51:19.303 INFO:tasks.workunit.client.1.vm08.stdout:2/91: dread - d0/d1/d17/f1a zero size 2026-03-10T07:51:19.303 INFO:tasks.workunit.client.1.vm08.stdout:2/92: fsync d0/f19 0 2026-03-10T07:51:19.307 INFO:tasks.workunit.client.1.vm08.stdout:6/96: mkdir d1/d3/df/d1d 0 2026-03-10T07:51:19.311 INFO:tasks.workunit.client.1.vm08.stdout:0/115: dwrite f6 [0,4194304] 0 2026-03-10T07:51:19.319 INFO:tasks.workunit.client.1.vm08.stdout:0/116: write dd/f16 [1896082,129668] 0 2026-03-10T07:51:19.327 INFO:tasks.workunit.client.1.vm08.stdout:8/90: creat d0/df/f19 x:0 0 0 2026-03-10T07:51:19.343 INFO:tasks.workunit.client.0.vm05.stdout:9/978: fsync d8/f9c 0 2026-03-10T07:51:19.344 INFO:tasks.workunit.client.1.vm08.stdout:8/91: chown d0/df/l10 205951653 1 2026-03-10T07:51:19.344 INFO:tasks.workunit.client.1.vm08.stdout:8/92: dwrite d0/fa [0,4194304] 0 2026-03-10T07:51:19.344 INFO:tasks.workunit.client.1.vm08.stdout:8/93: dwrite d0/df/f13 [0,4194304] 0 2026-03-10T07:51:19.344 INFO:tasks.workunit.client.1.vm08.stdout:0/117: sync 2026-03-10T07:51:19.351 INFO:tasks.workunit.client.0.vm05.stdout:9/979: symlink d8/d86/dfe/l14a 0 2026-03-10T07:51:19.352 INFO:tasks.workunit.client.0.vm05.stdout:9/980: write d8/d86/d28/d79/d57/de1/d22/d33/d85/f140 [238426,123590] 0 2026-03-10T07:51:19.353 
INFO:tasks.workunit.client.0.vm05.stdout:9/981: dread - d8/d86/d28/d79/d57/de1/d22/d33/d62/de0/f12a zero size 2026-03-10T07:51:19.371 INFO:tasks.workunit.client.1.vm08.stdout:9/60: truncate d2/f6 1359766 0 2026-03-10T07:51:19.375 INFO:tasks.workunit.client.1.vm08.stdout:1/69: truncate d2/d6/ff 1870119 0 2026-03-10T07:51:19.387 INFO:tasks.workunit.client.1.vm08.stdout:4/62: write d5/d8/ff [673870,57379] 0 2026-03-10T07:51:19.388 INFO:tasks.workunit.client.0.vm05.stdout:3/978: dwrite d8/d1f/fcd [0,4194304] 0 2026-03-10T07:51:19.398 INFO:tasks.workunit.client.0.vm05.stdout:3/979: mknod d8/d1f/d2a/d34/dbd/d116/c14c 0 2026-03-10T07:51:19.421 INFO:tasks.workunit.client.1.vm08.stdout:7/105: creat d3/d8/f1e x:0 0 0 2026-03-10T07:51:19.425 INFO:tasks.workunit.client.1.vm08.stdout:7/106: dwrite d3/d8/f18 [0,4194304] 0 2026-03-10T07:51:19.426 INFO:tasks.workunit.client.1.vm08.stdout:3/50: rename d0/lb to d0/lf 0 2026-03-10T07:51:19.427 INFO:tasks.workunit.client.1.vm08.stdout:7/107: write d3/da/f1d [936326,13009] 0 2026-03-10T07:51:19.445 INFO:tasks.workunit.client.1.vm08.stdout:2/93: mknod d0/d1/c1d 0 2026-03-10T07:51:19.451 INFO:tasks.workunit.client.0.vm05.stdout:8/966: write d1/dd/d4d/f113 [490960,6101] 0 2026-03-10T07:51:19.456 INFO:tasks.workunit.client.0.vm05.stdout:8/967: symlink d1/d6f/df9/l139 0 2026-03-10T07:51:19.461 INFO:tasks.workunit.client.1.vm08.stdout:0/118: readlink dd/d10/l26 0 2026-03-10T07:51:19.464 INFO:tasks.workunit.client.0.vm05.stdout:8/968: rmdir d1/dd/d4d/d64/d6a/de5/d2a 39 2026-03-10T07:51:19.467 INFO:tasks.workunit.client.0.vm05.stdout:9/982: truncate d8/d86/d28/d79/d57/de1/d22/d33/d62/f11d 3340296 0 2026-03-10T07:51:19.471 INFO:tasks.workunit.client.0.vm05.stdout:9/983: dwrite d8/d86/d28/d79/d57/de1/d1c/d20/d59/d8b/f39 [0,4194304] 0 2026-03-10T07:51:19.476 INFO:tasks.workunit.client.1.vm08.stdout:9/61: rename d2/f7 to d2/f15 0 2026-03-10T07:51:19.478 INFO:tasks.workunit.client.1.vm08.stdout:9/62: write d2/de/f13 [211274,29616] 0 
2026-03-10T07:51:19.491 INFO:tasks.workunit.client.1.vm08.stdout:1/70: mkdir d2/d6/de/d1f 0 2026-03-10T07:51:19.492 INFO:tasks.workunit.client.0.vm05.stdout:8/969: write d1/dd/d18/f58 [1379690,1395] 0 2026-03-10T07:51:19.492 INFO:tasks.workunit.client.0.vm05.stdout:8/970: chown d1/dd/d18/f21 6627190 1 2026-03-10T07:51:19.496 INFO:tasks.workunit.client.1.vm08.stdout:5/94: symlink d0/d4/l1d 0 2026-03-10T07:51:19.497 INFO:tasks.workunit.client.0.vm05.stdout:3/980: write d8/d22/d60/d6e/dca/dda/f10d [3709813,28190] 0 2026-03-10T07:51:19.497 INFO:tasks.workunit.client.1.vm08.stdout:5/95: dread - d0/d8/f18 zero size 2026-03-10T07:51:19.499 INFO:tasks.workunit.client.1.vm08.stdout:3/51: unlink d0/l4 0 2026-03-10T07:51:19.500 INFO:tasks.workunit.client.0.vm05.stdout:8/971: mkdir d1/d45/dfb/d13a 0 2026-03-10T07:51:19.500 INFO:tasks.workunit.client.0.vm05.stdout:8/972: dread - d1/dd/f12e zero size 2026-03-10T07:51:19.503 INFO:tasks.workunit.client.1.vm08.stdout:7/108: fdatasync d3/f6 0 2026-03-10T07:51:19.510 INFO:tasks.workunit.client.1.vm08.stdout:8/94: creat d0/df/d17/f1a x:0 0 0 2026-03-10T07:51:19.511 INFO:tasks.workunit.client.1.vm08.stdout:0/119: mknod dd/d10/c2b 0 2026-03-10T07:51:19.512 INFO:tasks.workunit.client.1.vm08.stdout:0/120: chown dd/d10/f12 1 1 2026-03-10T07:51:19.517 INFO:tasks.workunit.client.1.vm08.stdout:0/121: dwrite dd/d18/f21 [0,4194304] 0 2026-03-10T07:51:19.525 INFO:tasks.workunit.client.0.vm05.stdout:9/984: getdents d8/d86/db8 0 2026-03-10T07:51:19.525 INFO:tasks.workunit.client.0.vm05.stdout:9/985: readlink d8/d86/d28/d79/d57/de1/d1c/d20/d59/d8b/l58 0 2026-03-10T07:51:19.528 INFO:tasks.workunit.client.1.vm08.stdout:1/71: rename d2/f12 to d2/d6/de/d1f/f20 0 2026-03-10T07:51:19.535 INFO:tasks.workunit.client.0.vm05.stdout:3/981: dwrite d8/d1f/d24/d76/dc5/de1/d12a/f12d [0,4194304] 0 2026-03-10T07:51:19.536 INFO:tasks.workunit.client.1.vm08.stdout:1/72: write d2/f13 [899310,89575] 0 2026-03-10T07:51:19.536 
INFO:tasks.workunit.client.1.vm08.stdout:5/96: mkdir d0/d4/df/d1e 0 2026-03-10T07:51:19.536 INFO:tasks.workunit.client.1.vm08.stdout:3/52: creat d0/f10 x:0 0 0 2026-03-10T07:51:19.537 INFO:tasks.workunit.client.1.vm08.stdout:3/53: write d0/fc [5065173,114340] 0 2026-03-10T07:51:19.544 INFO:tasks.workunit.client.1.vm08.stdout:7/109: unlink f0 0 2026-03-10T07:51:19.544 INFO:tasks.workunit.client.0.vm05.stdout:9/986: creat d8/d86/d28/f14b x:0 0 0 2026-03-10T07:51:19.544 INFO:tasks.workunit.client.0.vm05.stdout:3/982: creat d8/d1c/db3/f14d x:0 0 0 2026-03-10T07:51:19.544 INFO:tasks.workunit.client.1.vm08.stdout:8/95: creat d0/df/f1b x:0 0 0 2026-03-10T07:51:19.545 INFO:tasks.workunit.client.1.vm08.stdout:7/110: truncate d3/f16 346750 0 2026-03-10T07:51:19.545 INFO:tasks.workunit.client.1.vm08.stdout:8/96: read d0/fa [1122318,36703] 0 2026-03-10T07:51:19.545 INFO:tasks.workunit.client.1.vm08.stdout:8/97: dread - d0/df/f1b zero size 2026-03-10T07:51:19.546 INFO:tasks.workunit.client.1.vm08.stdout:9/63: fsync d2/f15 0 2026-03-10T07:51:19.546 INFO:tasks.workunit.client.1.vm08.stdout:1/73: mknod d2/d6/de/c21 0 2026-03-10T07:51:19.547 INFO:tasks.workunit.client.1.vm08.stdout:1/74: read - d2/d6/de/f1c zero size 2026-03-10T07:51:19.556 INFO:tasks.workunit.client.1.vm08.stdout:5/97: mknod d0/d4/df/c1f 0 2026-03-10T07:51:19.557 INFO:tasks.workunit.client.1.vm08.stdout:5/98: truncate d0/d8/f1b 316311 0 2026-03-10T07:51:19.557 INFO:tasks.workunit.client.1.vm08.stdout:5/99: readlink d0/la 0 2026-03-10T07:51:19.558 INFO:tasks.workunit.client.1.vm08.stdout:5/100: write d0/d8/f1b [1286308,95589] 0 2026-03-10T07:51:19.562 INFO:tasks.workunit.client.1.vm08.stdout:5/101: dwrite d0/d4/f5 [4194304,4194304] 0 2026-03-10T07:51:19.564 INFO:tasks.workunit.client.1.vm08.stdout:5/102: write d0/d4/f5 [3583601,57095] 0 2026-03-10T07:51:19.564 INFO:tasks.workunit.client.1.vm08.stdout:5/103: write d0/d4/f5 [6874075,50646] 0 2026-03-10T07:51:19.573 INFO:tasks.workunit.client.1.vm08.stdout:3/54: 
symlink d0/l11 0 2026-03-10T07:51:19.574 INFO:tasks.workunit.client.0.vm05.stdout:9/987: read d8/d86/d28/d79/d57/de1/d22/d33/d62/d6d/d9e/d112/f135 [1816720,94261] 0 2026-03-10T07:51:19.579 INFO:tasks.workunit.client.1.vm08.stdout:2/94: creat d0/f1e x:0 0 0 2026-03-10T07:51:19.579 INFO:tasks.workunit.client.1.vm08.stdout:6/97: getdents d1/d17 0 2026-03-10T07:51:19.583 INFO:tasks.workunit.client.1.vm08.stdout:7/111: mknod d3/da/c1f 0 2026-03-10T07:51:19.588 INFO:tasks.workunit.client.1.vm08.stdout:7/112: dread d3/f16 [0,4194304] 0 2026-03-10T07:51:19.599 INFO:tasks.workunit.client.0.vm05.stdout:3/983: rmdir d8/d1f/d24/d76/dc5/de1/d52/d149 0 2026-03-10T07:51:19.599 INFO:tasks.workunit.client.0.vm05.stdout:8/973: write d1/dd/d4d/d64/d6a/de5/d2a/d34/d49/fc7 [2498838,15742] 0 2026-03-10T07:51:19.605 INFO:tasks.workunit.client.1.vm08.stdout:8/98: rename d0/df/c11 to d0/df/d15/c1c 0 2026-03-10T07:51:19.607 INFO:tasks.workunit.client.1.vm08.stdout:9/64: rename d2/de/f13 to d2/de/f16 0 2026-03-10T07:51:19.611 INFO:tasks.workunit.client.0.vm05.stdout:3/984: dread d8/d1f/d2a/d4a/f89 [0,4194304] 0 2026-03-10T07:51:19.612 INFO:tasks.workunit.client.1.vm08.stdout:1/75: stat d2/d6/c9 0 2026-03-10T07:51:19.613 INFO:tasks.workunit.client.1.vm08.stdout:1/76: truncate d2/d6/d11/f1e 680538 0 2026-03-10T07:51:19.613 INFO:tasks.workunit.client.1.vm08.stdout:4/63: getdents d5/d8/d9/d12 0 2026-03-10T07:51:19.619 INFO:tasks.workunit.client.0.vm05.stdout:3/985: creat d8/d1c/db3/f14e x:0 0 0 2026-03-10T07:51:19.626 INFO:tasks.workunit.client.0.vm05.stdout:8/974: link d1/dd/d4d/d64/d6a/de5/d2a/d48/d7c/d9c/da4/ff1 d1/dd/d121/f13b 0 2026-03-10T07:51:19.629 INFO:tasks.workunit.client.0.vm05.stdout:3/986: truncate d8/d1c/f56 3417846 0 2026-03-10T07:51:19.630 INFO:tasks.workunit.client.1.vm08.stdout:3/55: symlink d0/l12 0 2026-03-10T07:51:19.631 INFO:tasks.workunit.client.0.vm05.stdout:3/987: dread d8/d1f/d2a/d4a/f89 [0,4194304] 0 2026-03-10T07:51:19.633 
INFO:tasks.workunit.client.0.vm05.stdout:3/988: stat d8/d1f/f49 0 2026-03-10T07:51:19.634 INFO:tasks.workunit.client.0.vm05.stdout:3/989: readlink d8/d1f/d24/d45/l71 0 2026-03-10T07:51:19.635 INFO:tasks.workunit.client.1.vm08.stdout:6/98: creat d1/d3/df/f1e x:0 0 0 2026-03-10T07:51:19.637 INFO:tasks.workunit.client.1.vm08.stdout:0/122: truncate dd/d18/f25 2392514 0 2026-03-10T07:51:19.639 INFO:tasks.workunit.client.0.vm05.stdout:3/990: mkdir d8/d110/d14f 0 2026-03-10T07:51:19.639 INFO:tasks.workunit.client.0.vm05.stdout:9/988: dwrite d8/d86/d28/de9/f101 [0,4194304] 0 2026-03-10T07:51:19.640 INFO:tasks.workunit.client.1.vm08.stdout:0/123: dread f6 [0,4194304] 0 2026-03-10T07:51:19.641 INFO:tasks.workunit.client.0.vm05.stdout:3/991: stat d8/d1f/d24/d76/dc5/de1/f114 0 2026-03-10T07:51:19.642 INFO:tasks.workunit.client.0.vm05.stdout:8/975: link d1/dd/d4d/d64/d6a/de5/d2a/d48/f79 d1/dd/d121/f13c 0 2026-03-10T07:51:19.652 INFO:tasks.workunit.client.0.vm05.stdout:3/992: dwrite d8/d1f/d2a/d34/f39 [0,4194304] 0 2026-03-10T07:51:19.667 INFO:tasks.workunit.client.1.vm08.stdout:8/99: mknod d0/df/d17/c1d 0 2026-03-10T07:51:19.678 INFO:tasks.workunit.client.1.vm08.stdout:7/113: rename d3/d8/f1b to d3/d8/d9/f20 0 2026-03-10T07:51:19.683 INFO:tasks.workunit.client.1.vm08.stdout:9/65: mkdir d2/de/d17 0 2026-03-10T07:51:19.687 INFO:tasks.workunit.client.1.vm08.stdout:9/66: write d2/f6 [50112,84356] 0 2026-03-10T07:51:19.687 INFO:tasks.workunit.client.1.vm08.stdout:9/67: dread d2/de/f16 [0,4194304] 0 2026-03-10T07:51:19.687 INFO:tasks.workunit.client.1.vm08.stdout:9/68: readlink d2/l9 0 2026-03-10T07:51:19.687 INFO:tasks.workunit.client.1.vm08.stdout:9/69: write d2/d3/fa [6010935,96195] 0 2026-03-10T07:51:19.691 INFO:tasks.workunit.client.0.vm05.stdout:3/993: creat d8/d22/dad/d119/f150 x:0 0 0 2026-03-10T07:51:19.692 INFO:tasks.workunit.client.0.vm05.stdout:3/994: dread - d8/d1f/d24/d76/dc5/de1/d52/fc0 zero size 2026-03-10T07:51:19.697 INFO:tasks.workunit.client.1.vm08.stdout:5/104: 
getdents d0/d4/d19 0 2026-03-10T07:51:19.697 INFO:tasks.workunit.client.1.vm08.stdout:5/105: chown d0/d4/df/c1f 7221753 1 2026-03-10T07:51:19.699 INFO:tasks.workunit.client.1.vm08.stdout:3/56: creat d0/f13 x:0 0 0 2026-03-10T07:51:19.701 INFO:tasks.workunit.client.1.vm08.stdout:5/106: dwrite d0/d4/f5 [0,4194304] 0 2026-03-10T07:51:19.710 INFO:tasks.workunit.client.1.vm08.stdout:3/57: sync 2026-03-10T07:51:19.712 INFO:tasks.workunit.client.0.vm05.stdout:3/995: symlink d8/d1f/d24/d76/dc5/de1/d19/d6b/l151 0 2026-03-10T07:51:19.715 INFO:tasks.workunit.client.0.vm05.stdout:9/989: write d8/d86/d28/d79/d57/de1/d38/d71/d81/f83 [818246,25596] 0 2026-03-10T07:51:19.719 INFO:tasks.workunit.client.0.vm05.stdout:9/990: truncate d8/d86/d28/d79/d57/dbc/d11a/f125 218864 0 2026-03-10T07:51:19.722 INFO:tasks.workunit.client.0.vm05.stdout:8/976: creat d1/dd/d4d/d64/d6a/de5/d2a/f13d x:0 0 0 2026-03-10T07:51:19.724 INFO:tasks.workunit.client.1.vm08.stdout:6/99: rmdir d1/d17 39 2026-03-10T07:51:19.733 INFO:tasks.workunit.client.0.vm05.stdout:9/991: dread d8/d86/d28/d79/d57/de1/d22/f4a [0,4194304] 0 2026-03-10T07:51:19.739 INFO:tasks.workunit.client.1.vm08.stdout:8/100: stat d0/ce 0 2026-03-10T07:51:19.743 INFO:tasks.workunit.client.0.vm05.stdout:3/996: write d8/d1f/d2a/f42 [1480348,119088] 0 2026-03-10T07:51:19.748 INFO:tasks.workunit.client.0.vm05.stdout:9/992: symlink d8/d86/d28/d79/d57/de1/d1c/d20/dee/d137/l14c 0 2026-03-10T07:51:19.749 INFO:tasks.workunit.client.0.vm05.stdout:9/993: fdatasync d8/d86/d28/d79/d57/de1/d38/f87 0 2026-03-10T07:51:19.751 INFO:tasks.workunit.client.0.vm05.stdout:8/977: link d1/d6f/df9/d102/dbf/l129 d1/d6f/df9/d102/dbf/d11f/l13e 0 2026-03-10T07:51:19.760 INFO:tasks.workunit.client.0.vm05.stdout:3/997: write d8/d1f/d24/d45/ddf/fe4 [1630784,73587] 0 2026-03-10T07:51:19.763 INFO:tasks.workunit.client.1.vm08.stdout:9/70: creat d2/d3/f18 x:0 0 0 2026-03-10T07:51:19.763 INFO:tasks.workunit.client.0.vm05.stdout:9/994: read - d8/d86/d95/f122 zero size 
2026-03-10T07:51:19.769 INFO:tasks.workunit.client.0.vm05.stdout:3/998: creat d8/d8f/dbc/f152 x:0 0 0 2026-03-10T07:51:19.771 INFO:tasks.workunit.client.1.vm08.stdout:3/58: creat d0/f14 x:0 0 0 2026-03-10T07:51:19.774 INFO:tasks.workunit.client.0.vm05.stdout:8/978: creat d1/dd/d4d/d64/d6a/de5/d2a/de3/d12f/f13f x:0 0 0 2026-03-10T07:51:19.775 INFO:tasks.workunit.client.1.vm08.stdout:3/59: dwrite d0/f14 [0,4194304] 0 2026-03-10T07:51:19.779 INFO:tasks.workunit.client.0.vm05.stdout:9/995: mknod d8/d86/d28/d79/d57/de1/d22/d33/d62/d6d/c14d 0 2026-03-10T07:51:19.795 INFO:tasks.workunit.client.0.vm05.stdout:3/999: mknod d8/d22/c153 0 2026-03-10T07:51:19.797 INFO:tasks.workunit.client.1.vm08.stdout:7/114: link d3/da/f1d d3/da/f21 0 2026-03-10T07:51:19.797 INFO:tasks.workunit.client.0.vm05.stdout:9/996: dwrite d8/d86/d28/d79/d57/de1/d22/d33/d62/de0/f12a [0,4194304] 0 2026-03-10T07:51:19.807 INFO:tasks.workunit.client.1.vm08.stdout:1/77: rmdir d2/d19 0 2026-03-10T07:51:19.814 INFO:tasks.workunit.client.1.vm08.stdout:5/107: symlink d0/d4/d19/l20 0 2026-03-10T07:51:19.816 INFO:tasks.workunit.client.1.vm08.stdout:2/95: link d0/d1/d3/d5/cf d0/d1/d3/c1f 0 2026-03-10T07:51:19.817 INFO:tasks.workunit.client.1.vm08.stdout:2/96: dread - d0/d1/d3/d5/f13 zero size 2026-03-10T07:51:19.819 INFO:tasks.workunit.client.0.vm05.stdout:9/997: link d8/d86/d28/d79/d57/de1/d1c/d20/d59/d8b/l58 d8/d86/d28/d79/d57/de1/d22/d33/d62/de0/l14e 0 2026-03-10T07:51:19.821 INFO:tasks.workunit.client.1.vm08.stdout:3/60: mknod d0/dd/c15 0 2026-03-10T07:51:19.823 INFO:tasks.workunit.client.1.vm08.stdout:6/100: creat d1/d3/df/d1d/f1f x:0 0 0 2026-03-10T07:51:19.829 INFO:tasks.workunit.client.1.vm08.stdout:7/115: symlink d3/l22 0 2026-03-10T07:51:19.830 INFO:tasks.workunit.client.1.vm08.stdout:7/116: truncate d3/f16 1087596 0 2026-03-10T07:51:19.846 INFO:tasks.workunit.client.1.vm08.stdout:0/124: dwrite fb [0,4194304] 0 2026-03-10T07:51:19.846 INFO:tasks.workunit.client.1.vm08.stdout:1/78: unlink d2/lb 0 
2026-03-10T07:51:19.847 INFO:tasks.workunit.client.0.vm05.stdout:9/998: fdatasync d8/d86/d95/f122 0
2026-03-10T07:51:19.847 INFO:tasks.workunit.client.1.vm08.stdout:4/64: getdents d5/d8/d9 0
2026-03-10T07:51:19.847 INFO:tasks.workunit.client.1.vm08.stdout:1/79: stat d2/d10/c17 0
2026-03-10T07:51:19.848 INFO:tasks.workunit.client.1.vm08.stdout:9/71: dwrite d2/fd [0,4194304] 0
2026-03-10T07:51:19.849 INFO:tasks.workunit.client.1.vm08.stdout:2/97: creat d0/d1/d3/d5/f20 x:0 0 0
2026-03-10T07:51:19.850 INFO:tasks.workunit.client.1.vm08.stdout:3/61: creat d0/f16 x:0 0 0
2026-03-10T07:51:19.850 INFO:tasks.workunit.client.0.vm05.stdout:8/979: dwrite d1/dd/fd5 [0,4194304] 0
2026-03-10T07:51:19.860 INFO:tasks.workunit.client.1.vm08.stdout:2/98: dwrite d0/d1/d3/f14 [0,4194304] 0
2026-03-10T07:51:19.864 INFO:tasks.workunit.client.0.vm05.stdout:9/999: rmdir d8/d86/d28/d79/d57/dbc 39
2026-03-10T07:51:19.872 INFO:tasks.workunit.client.1.vm08.stdout:8/101: link d0/lc d0/df/d17/l1e 0
2026-03-10T07:51:19.876 INFO:tasks.workunit.client.1.vm08.stdout:7/117: creat d3/d8/d9/f23 x:0 0 0
2026-03-10T07:51:19.877 INFO:tasks.workunit.client.0.vm05.stdout:8/980: creat d1/dd/d4d/d64/d6a/de5/d2a/d34/d49/d10c/f140 x:0 0 0
2026-03-10T07:51:19.877 INFO:tasks.workunit.client.1.vm08.stdout:0/125: creat dd/d10/d14/d1b/f2c x:0 0 0
2026-03-10T07:51:19.877 INFO:tasks.workunit.client.1.vm08.stdout:1/80: mkdir d2/d6/de/d1f/d22 0
2026-03-10T07:51:19.877 INFO:tasks.workunit.client.1.vm08.stdout:9/72: mknod d2/c19 0
2026-03-10T07:51:19.877 INFO:tasks.workunit.client.1.vm08.stdout:3/62: creat d0/f17 x:0 0 0
2026-03-10T07:51:19.877 INFO:tasks.workunit.client.1.vm08.stdout:2/99: mknod d0/d1/c21 0
2026-03-10T07:51:19.877 INFO:tasks.workunit.client.1.vm08.stdout:9/73: fsync d2/de/f14 0
2026-03-10T07:51:19.877 INFO:tasks.workunit.client.1.vm08.stdout:2/100: stat d0 0
2026-03-10T07:51:19.892 INFO:tasks.workunit.client.1.vm08.stdout:7/118: symlink d3/da/l24 0
2026-03-10T07:51:19.897 INFO:tasks.workunit.client.1.vm08.stdout:8/102: symlink d0/df/d15/l1f 0
2026-03-10T07:51:19.901 INFO:tasks.workunit.client.1.vm08.stdout:0/126: creat dd/d10/d14/d15/d20/f2d x:0 0 0
2026-03-10T07:51:19.901 INFO:tasks.workunit.client.1.vm08.stdout:1/81: readlink d2/la 0
2026-03-10T07:51:19.901 INFO:tasks.workunit.client.1.vm08.stdout:3/63: mkdir d0/dd/d18 0
2026-03-10T07:51:19.902 INFO:tasks.workunit.client.1.vm08.stdout:9/74: creat d2/f1a x:0 0 0
2026-03-10T07:51:19.903 INFO:tasks.workunit.client.1.vm08.stdout:2/101: mknod d0/d1/c22 0
2026-03-10T07:51:19.903 INFO:tasks.workunit.client.1.vm08.stdout:9/75: truncate d2/f11 34344 0
2026-03-10T07:51:19.904 INFO:tasks.workunit.client.1.vm08.stdout:2/102: dread - d0/d1/d3/d5/f13 zero size
2026-03-10T07:51:19.904 INFO:tasks.workunit.client.0.vm05.stdout:8/981: link d1/dd/d4d/dcc/dbd/cf7 d1/dd/d4d/d64/d6a/de5/d2a/d34/d49/d10c/c141 0
2026-03-10T07:51:19.905 INFO:tasks.workunit.client.1.vm08.stdout:2/103: chown d0/d1/c11 40838206 1
2026-03-10T07:51:19.905 INFO:tasks.workunit.client.0.vm05.stdout:8/982: chown d1/dd/d4d/d64/d6a/de5/d2a/d34/d49/c7e 1065105 1
2026-03-10T07:51:19.906 INFO:tasks.workunit.client.1.vm08.stdout:3/64: unlink d0/l12 0
2026-03-10T07:51:19.907 INFO:tasks.workunit.client.1.vm08.stdout:3/65: chown d0/ca 4 1
2026-03-10T07:51:19.907 INFO:tasks.workunit.client.0.vm05.stdout:8/983: symlink d1/dd/d4d/l142 0
2026-03-10T07:51:19.908 INFO:tasks.workunit.client.0.vm05.stdout:8/984: chown d1/dd/d4d/d64/d6a/de5/l3c 28327071 1
2026-03-10T07:51:19.909 INFO:tasks.workunit.client.1.vm08.stdout:9/76: dwrite d2/de/f14 [0,4194304] 0
2026-03-10T07:51:19.909 INFO:tasks.workunit.client.1.vm08.stdout:9/77: readlink d2/l9 0
2026-03-10T07:51:19.910 INFO:tasks.workunit.client.1.vm08.stdout:9/78: read d2/fb [573361,85235] 0
2026-03-10T07:51:19.916 INFO:tasks.workunit.client.1.vm08.stdout:4/65: link d5/d8/d9/le d5/l14 0
2026-03-10T07:51:19.916 INFO:tasks.workunit.client.1.vm08.stdout:1/82: link d2/la d2/d6/d11/l23 0
2026-03-10T07:51:19.916 INFO:tasks.workunit.client.1.vm08.stdout:0/127: creat dd/d10/d14/d15/d20/d22/f2e x:0 0 0
2026-03-10T07:51:19.916 INFO:tasks.workunit.client.1.vm08.stdout:9/79: chown d2/d3 0 1
2026-03-10T07:51:19.918 INFO:tasks.workunit.client.1.vm08.stdout:2/104: creat d0/d1/d3/d5/d1b/f23 x:0 0 0
2026-03-10T07:51:19.920 INFO:tasks.workunit.client.1.vm08.stdout:3/66: rename d0/c7 to d0/c19 0
2026-03-10T07:51:19.920 INFO:tasks.workunit.client.1.vm08.stdout:4/66: mknod d5/d8/d9/c15 0
2026-03-10T07:51:19.920 INFO:tasks.workunit.client.1.vm08.stdout:2/105: write d0/d1/d3/f14 [2852143,80281] 0
2026-03-10T07:51:19.922 INFO:tasks.workunit.client.1.vm08.stdout:1/83: creat d2/d6/de/d1f/d22/f24 x:0 0 0
2026-03-10T07:51:19.922 INFO:tasks.workunit.client.1.vm08.stdout:3/67: creat d0/f1a x:0 0 0
2026-03-10T07:51:19.923 INFO:tasks.workunit.client.1.vm08.stdout:2/106: truncate d0/d1/d17/f1a 152247 0
2026-03-10T07:51:19.923 INFO:tasks.workunit.client.1.vm08.stdout:3/68: dread - d0/f17 zero size
2026-03-10T07:51:19.924 INFO:tasks.workunit.client.1.vm08.stdout:9/80: creat d2/de/d17/f1b x:0 0 0
2026-03-10T07:51:19.925 INFO:tasks.workunit.client.1.vm08.stdout:1/84: symlink d2/d6/d11/l25 0
2026-03-10T07:51:19.925 INFO:tasks.workunit.client.1.vm08.stdout:2/107: creat d0/d1/f24 x:0 0 0
2026-03-10T07:51:19.925 INFO:tasks.workunit.client.1.vm08.stdout:1/85: chown d2/d10 4628886 1
2026-03-10T07:51:19.926 INFO:tasks.workunit.client.1.vm08.stdout:3/69: symlink d0/dd/l1b 0
2026-03-10T07:51:19.929 INFO:tasks.workunit.client.1.vm08.stdout:2/108: mknod d0/d1/d3/d5/d1b/c25 0
2026-03-10T07:51:19.930 INFO:tasks.workunit.client.1.vm08.stdout:4/67: dread d5/d8/f11 [0,4194304] 0
2026-03-10T07:51:19.930 INFO:tasks.workunit.client.1.vm08.stdout:9/81: link d2/f10 d2/d3/f1c 0
2026-03-10T07:51:19.930 INFO:tasks.workunit.client.1.vm08.stdout:3/70: truncate d0/f16 213066 0
2026-03-10T07:51:19.930 INFO:tasks.workunit.client.1.vm08.stdout:1/86: mkdir d2/d6/de/d1f/d26 0
2026-03-10T07:51:19.931 INFO:tasks.workunit.client.1.vm08.stdout:9/82: stat d2/f1a 0
2026-03-10T07:51:19.931 INFO:tasks.workunit.client.1.vm08.stdout:1/87: dread - d2/d6/de/f15 zero size
2026-03-10T07:51:19.934 INFO:tasks.workunit.client.1.vm08.stdout:4/68: readlink d5/l14 0
2026-03-10T07:51:19.936 INFO:tasks.workunit.client.1.vm08.stdout:8/103: sync
2026-03-10T07:51:19.936 INFO:tasks.workunit.client.1.vm08.stdout:3/71: symlink d0/dd/l1c 0
2026-03-10T07:51:19.936 INFO:tasks.workunit.client.1.vm08.stdout:9/83: mknod d2/c1d 0
2026-03-10T07:51:19.936 INFO:tasks.workunit.client.1.vm08.stdout:3/72: chown d0/dd 9533 1
2026-03-10T07:51:19.937 INFO:tasks.workunit.client.1.vm08.stdout:9/84: truncate d2/f11 525576 0
2026-03-10T07:51:19.938 INFO:tasks.workunit.client.1.vm08.stdout:9/85: chown d2/de/f14 3392 1
2026-03-10T07:51:19.943 INFO:tasks.workunit.client.1.vm08.stdout:8/104: creat d0/f20 x:0 0 0
2026-03-10T07:51:19.943 INFO:tasks.workunit.client.1.vm08.stdout:3/73: unlink d0/f17 0
2026-03-10T07:51:19.944 INFO:tasks.workunit.client.1.vm08.stdout:9/86: dwrite d2/f11 [0,4194304] 0
2026-03-10T07:51:19.948 INFO:tasks.workunit.client.1.vm08.stdout:9/87: dread d2/fd [0,4194304] 0
2026-03-10T07:51:19.950 INFO:tasks.workunit.client.1.vm08.stdout:1/88: sync
2026-03-10T07:51:19.954 INFO:tasks.workunit.client.1.vm08.stdout:3/74: dwrite d0/f16 [0,4194304] 0
2026-03-10T07:51:19.955 INFO:tasks.workunit.client.1.vm08.stdout:8/105: unlink d0/df/f18 0
2026-03-10T07:51:19.956 INFO:tasks.workunit.client.1.vm08.stdout:8/106: write d0/df/f19 [976991,97663] 0
2026-03-10T07:51:19.960 INFO:tasks.workunit.client.1.vm08.stdout:9/88: creat d2/de/f1e x:0 0 0
2026-03-10T07:51:19.960 INFO:tasks.workunit.client.1.vm08.stdout:1/89: symlink d2/d6/de/l27 0
2026-03-10T07:51:19.961 INFO:tasks.workunit.client.1.vm08.stdout:3/75: creat d0/dd/f1d x:0 0 0
2026-03-10T07:51:19.965 INFO:tasks.workunit.client.1.vm08.stdout:8/107: dwrite d0/df/f13 [0,4194304] 0
2026-03-10T07:51:19.971 INFO:tasks.workunit.client.1.vm08.stdout:9/89: dwrite d2/de/d17/f1b [0,4194304] 0
2026-03-10T07:51:19.976 INFO:tasks.workunit.client.1.vm08.stdout:9/90: readlink d2/l9 0
2026-03-10T07:51:19.988 INFO:tasks.workunit.client.1.vm08.stdout:3/76: creat d0/dd/d18/f1e x:0 0 0
2026-03-10T07:51:20.004 INFO:tasks.workunit.client.1.vm08.stdout:3/77: dread - d0/f10 zero size
2026-03-10T07:51:20.004 INFO:tasks.workunit.client.1.vm08.stdout:3/78: readlink d0/dd/l1b 0
2026-03-10T07:51:20.004 INFO:tasks.workunit.client.1.vm08.stdout:3/79: fsync d0/f10 0
2026-03-10T07:51:20.004 INFO:tasks.workunit.client.1.vm08.stdout:9/91: symlink d2/de/l1f 0
2026-03-10T07:51:20.004 INFO:tasks.workunit.client.1.vm08.stdout:9/92: dread - d2/d3/f1c zero size
2026-03-10T07:51:20.004 INFO:tasks.workunit.client.1.vm08.stdout:3/80: chown d0/f13 2294450 1
2026-03-10T07:51:20.004 INFO:tasks.workunit.client.1.vm08.stdout:9/93: dread - d2/f10 zero size
2026-03-10T07:51:20.004 INFO:tasks.workunit.client.1.vm08.stdout:8/108: link d0/df/f19 d0/df/d17/f21 0
2026-03-10T07:51:20.004 INFO:tasks.workunit.client.1.vm08.stdout:9/94: creat d2/de/d17/f20 x:0 0 0
2026-03-10T07:51:20.004 INFO:tasks.workunit.client.1.vm08.stdout:8/109: creat d0/f22 x:0 0 0
2026-03-10T07:51:20.004 INFO:tasks.workunit.client.1.vm08.stdout:8/110: fsync d0/df/f1b 0
2026-03-10T07:51:20.004 INFO:tasks.workunit.client.1.vm08.stdout:8/111: write d0/df/f1b [551168,56003] 0
2026-03-10T07:51:20.004 INFO:tasks.workunit.client.1.vm08.stdout:9/95: link d2/de/d17/f1b d2/de/d17/f21 0
2026-03-10T07:51:20.004 INFO:tasks.workunit.client.1.vm08.stdout:8/112: mkdir d0/df/d15/d23 0
2026-03-10T07:51:20.004 INFO:tasks.workunit.client.1.vm08.stdout:9/96: mknod d2/c22 0
2026-03-10T07:51:20.004 INFO:tasks.workunit.client.1.vm08.stdout:8/113: mknod d0/df/d15/d23/c24 0
2026-03-10T07:51:20.010 INFO:tasks.workunit.client.1.vm08.stdout:9/97: mknod d2/de/d17/c23 0
2026-03-10T07:51:20.014 INFO:tasks.workunit.client.1.vm08.stdout:8/114: dread d0/fa [0,4194304] 0
2026-03-10T07:51:20.017 INFO:tasks.workunit.client.1.vm08.stdout:9/98: dwrite d2/f6 [0,4194304] 0
2026-03-10T07:51:20.031 INFO:tasks.workunit.client.1.vm08.stdout:8/115: mkdir d0/df/d17/d25 0
2026-03-10T07:51:20.033 INFO:tasks.workunit.client.1.vm08.stdout:6/101: truncate d1/f2 4910932 0
2026-03-10T07:51:20.033 INFO:tasks.workunit.client.1.vm08.stdout:7/119: rename d3/d8 to d3/da/d25 0
2026-03-10T07:51:20.036 INFO:tasks.workunit.client.1.vm08.stdout:7/120: write d3/f4 [4020431,55241] 0
2026-03-10T07:51:20.051 INFO:tasks.workunit.client.1.vm08.stdout:6/102: creat d1/d17/f20 x:0 0 0
2026-03-10T07:51:20.054 INFO:tasks.workunit.client.1.vm08.stdout:6/103: stat d1/db/l14 0
2026-03-10T07:51:20.054 INFO:tasks.workunit.client.1.vm08.stdout:6/104: stat d1/d3/df 0
2026-03-10T07:51:20.054 INFO:tasks.workunit.client.1.vm08.stdout:5/108: dwrite d0/f9 [0,4194304] 0
2026-03-10T07:51:20.054 INFO:tasks.workunit.client.1.vm08.stdout:5/109: dread d0/f17 [0,4194304] 0
2026-03-10T07:51:20.054 INFO:tasks.workunit.client.1.vm08.stdout:5/110: stat d0/d8/f18 0
2026-03-10T07:51:20.054 INFO:tasks.workunit.client.1.vm08.stdout:5/111: mknod d0/d4/c21 0
2026-03-10T07:51:20.054 INFO:tasks.workunit.client.1.vm08.stdout:5/112: chown d0/d8 129887487 1
2026-03-10T07:51:20.054 INFO:tasks.workunit.client.1.vm08.stdout:5/113: write d0/d4/f5 [20946,92071] 0
2026-03-10T07:51:20.054 INFO:tasks.workunit.client.1.vm08.stdout:5/114: mkdir d0/d4/df/d12/d22 0
2026-03-10T07:51:20.055 INFO:tasks.workunit.client.1.vm08.stdout:5/115: chown d0/f1a 4105 1
2026-03-10T07:51:20.055 INFO:tasks.workunit.client.1.vm08.stdout:5/116: unlink d0/d4/c7 0
2026-03-10T07:51:20.057 INFO:tasks.workunit.client.1.vm08.stdout:5/117: creat d0/d4/df/d12/d22/f23 x:0 0 0
2026-03-10T07:51:20.059 INFO:tasks.workunit.client.1.vm08.stdout:7/121: sync
2026-03-10T07:51:20.078 INFO:tasks.workunit.client.1.vm08.stdout:7/122: link d3/da/d25/l1a d3/da/l26 0
2026-03-10T07:51:20.080 INFO:tasks.workunit.client.1.vm08.stdout:7/123: creat d3/da/d25/f27 x:0 0 0
2026-03-10T07:51:20.087 INFO:tasks.workunit.client.1.vm08.stdout:7/124: readlink d3/da/lf 0
2026-03-10T07:51:20.087 INFO:tasks.workunit.client.1.vm08.stdout:7/125: mknod d3/da/d25/c28 0
2026-03-10T07:51:20.088 INFO:tasks.workunit.client.1.vm08.stdout:7/126: creat d3/da/d25/f29 x:0 0 0
2026-03-10T07:51:20.098 INFO:tasks.workunit.client.1.vm08.stdout:7/127: rename d3/da/c1f to d3/da/c2a 0
2026-03-10T07:51:20.109 INFO:tasks.workunit.client.1.vm08.stdout:4/69: rmdir d5 39
2026-03-10T07:51:20.109 INFO:tasks.workunit.client.1.vm08.stdout:4/70: chown f1 734119 1
2026-03-10T07:51:20.112 INFO:tasks.workunit.client.0.vm05.stdout:8/985: dwrite d1/d6f/df9/d102/fb3 [0,4194304] 0
2026-03-10T07:51:20.134 INFO:tasks.workunit.client.1.vm08.stdout:0/128: truncate f2 2363363 0
2026-03-10T07:51:20.134 INFO:tasks.workunit.client.1.vm08.stdout:0/129: truncate f6 4466929 0
2026-03-10T07:51:20.134 INFO:tasks.workunit.client.1.vm08.stdout:0/130: readlink dd/lf 0
2026-03-10T07:51:20.137 INFO:tasks.workunit.client.1.vm08.stdout:7/128: creat d3/f2b x:0 0 0
2026-03-10T07:51:20.142 INFO:tasks.workunit.client.1.vm08.stdout:0/131: mkdir dd/d10/d2f 0
2026-03-10T07:51:20.142 INFO:tasks.workunit.client.1.vm08.stdout:2/109: rmdir d0/d1/d3/d5 39
2026-03-10T07:51:20.142 INFO:tasks.workunit.client.1.vm08.stdout:8/116: fsync d0/f20 0
2026-03-10T07:51:20.143 INFO:tasks.workunit.client.1.vm08.stdout:8/117: truncate d0/df/f19 1500662 0
2026-03-10T07:51:20.143 INFO:tasks.workunit.client.1.vm08.stdout:8/118: chown d0 20 1
2026-03-10T07:51:20.145 INFO:tasks.workunit.client.1.vm08.stdout:0/132: dwrite dd/d29/f2a [0,4194304] 0
2026-03-10T07:51:20.146 INFO:tasks.workunit.client.1.vm08.stdout:1/90: getdents d2/d6/de 0
2026-03-10T07:51:20.147 INFO:tasks.workunit.client.1.vm08.stdout:4/71: mknod d5/d8/d9/c16 0
2026-03-10T07:51:20.147 INFO:tasks.workunit.client.1.vm08.stdout:1/91: dread - d2/d6/de/d1f/d22/f24 zero size
2026-03-10T07:51:20.149 INFO:tasks.workunit.client.1.vm08.stdout:1/92: truncate d2/d6/de/d1f/d22/f24 454071 0
2026-03-10T07:51:20.150 INFO:tasks.workunit.client.1.vm08.stdout:8/119: dread d0/fa [0,4194304] 0
2026-03-10T07:51:20.158 INFO:tasks.workunit.client.1.vm08.stdout:3/81: rmdir d0/dd 39
2026-03-10T07:51:20.159 INFO:tasks.workunit.client.1.vm08.stdout:4/72: dwrite f2 [0,4194304] 0
2026-03-10T07:51:20.161 INFO:tasks.workunit.client.1.vm08.stdout:2/110: dwrite d0/d1/d3/d5/f20 [0,4194304] 0
2026-03-10T07:51:20.162 INFO:tasks.workunit.client.1.vm08.stdout:2/111: read - d0/d1/d3/d5/f13 zero size
2026-03-10T07:51:20.163 INFO:tasks.workunit.client.1.vm08.stdout:9/99: truncate d2/de/d17/f21 1037444 0
2026-03-10T07:51:20.169 INFO:tasks.workunit.client.1.vm08.stdout:2/112: truncate d0/d1/d3/d5/f13 873452 0
2026-03-10T07:51:20.169 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:51:20 vm08.local ceph-mon[59917]: pgmap v29: 65 pgs: 65 active+clean; 2.2 GiB data, 7.5 GiB used, 113 GiB / 120 GiB avail; 47 MiB/s rd, 116 MiB/s wr, 317 op/s
2026-03-10T07:51:20.169 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:51:20 vm08.local ceph-mon[59917]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke'
2026-03-10T07:51:20.170 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:51:20 vm08.local ceph-mon[59917]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke'
2026-03-10T07:51:20.170 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:51:20 vm08.local ceph-mon[59917]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-10T07:51:20.170 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:51:20 vm08.local ceph-mon[59917]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
2026-03-10T07:51:20.170 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:51:20 vm08.local ceph-mon[59917]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke'
2026-03-10T07:51:20.170 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:51:20 vm08.local ceph-mon[59917]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' cmd=[{"prefix": "dashboard get-prometheus-api-host"}]: dispatch
2026-03-10T07:51:20.170 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:51:20 vm08.local ceph-mon[59917]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch
2026-03-10T07:51:20.170 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:51:20 vm08.local ceph-mon[59917]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' cmd=[{"prefix": "versions"}]: dispatch
2026-03-10T07:51:20.170 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:51:20 vm08.local ceph-mon[59917]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' cmd=[{"prefix": "versions"}]: dispatch
2026-03-10T07:51:20.170 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:51:20 vm08.local ceph-mon[59917]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' cmd=[{"prefix": "versions"}]: dispatch
2026-03-10T07:51:20.170 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:51:20 vm08.local ceph-mon[59917]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' cmd=[{"prefix": "versions"}]: dispatch
2026-03-10T07:51:20.170 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:51:20 vm08.local ceph-mon[59917]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' cmd=[{"prefix": "versions"}]: dispatch
2026-03-10T07:51:20.170 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:51:20 vm08.local ceph-mon[59917]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' cmd=[{"prefix": "versions"}]: dispatch
2026-03-10T07:51:20.170 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:51:20 vm08.local ceph-mon[59917]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' cmd=[{"prefix": "versions"}]: dispatch
2026-03-10T07:51:20.170 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:51:20 vm08.local ceph-mon[59917]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' cmd=[{"prefix": "versions"}]: dispatch
2026-03-10T07:51:20.173 INFO:tasks.workunit.client.1.vm08.stdout:2/113: chown d0/d1/d3/f14 0 1
2026-03-10T07:51:20.178 INFO:tasks.workunit.client.1.vm08.stdout:2/114: dwrite d0/d1/d3/f14 [0,4194304] 0
2026-03-10T07:51:20.178 INFO:tasks.workunit.client.1.vm08.stdout:7/129: chown d3/da/l26 1119722285 1
2026-03-10T07:51:20.191 INFO:tasks.workunit.client.1.vm08.stdout:0/133: mkdir dd/d10/d14/d1b/d30 0
2026-03-10T07:51:20.202 INFO:tasks.workunit.client.1.vm08.stdout:0/134: chown dd/fe 0 1
2026-03-10T07:51:20.203 INFO:tasks.workunit.client.1.vm08.stdout:0/135: fdatasync dd/fe 0
2026-03-10T07:51:20.203 INFO:tasks.workunit.client.1.vm08.stdout:1/93: fdatasync d2/d6/d11/f1e 0
2026-03-10T07:51:20.203 INFO:tasks.workunit.client.1.vm08.stdout:8/120: creat d0/df/f26 x:0 0 0
2026-03-10T07:51:20.203 INFO:tasks.workunit.client.1.vm08.stdout:8/121: readlink d0/l14 0
2026-03-10T07:51:20.203 INFO:tasks.workunit.client.1.vm08.stdout:3/82: stat d0/dd/l1b 0
2026-03-10T07:51:20.203 INFO:tasks.workunit.client.1.vm08.stdout:9/100: creat d2/de/d17/f24 x:0 0 0
2026-03-10T07:51:20.212 INFO:tasks.workunit.client.1.vm08.stdout:1/94: readlink d2/d6/de/l14 0
2026-03-10T07:51:20.212 INFO:tasks.workunit.client.1.vm08.stdout:4/73: sync
2026-03-10T07:51:20.212 INFO:tasks.workunit.client.1.vm08.stdout:2/115: sync
2026-03-10T07:51:20.212 INFO:tasks.workunit.client.1.vm08.stdout:9/101: sync
2026-03-10T07:51:20.213 INFO:tasks.workunit.client.1.vm08.stdout:4/74: write f2 [3943764,101083] 0
2026-03-10T07:51:20.213 INFO:tasks.workunit.client.1.vm08.stdout:2/116: write d0/f1e [696388,76444] 0
2026-03-10T07:51:20.213 INFO:tasks.workunit.client.1.vm08.stdout:9/102: truncate d2/f6 4533089 0
2026-03-10T07:51:20.218 INFO:tasks.workunit.client.1.vm08.stdout:3/83: mkdir d0/dd/d1f 0
2026-03-10T07:51:20.219 INFO:tasks.workunit.client.1.vm08.stdout:0/136: fsync dd/d10/f12 0
2026-03-10T07:51:20.219 INFO:tasks.workunit.client.1.vm08.stdout:8/122: dwrite d0/df/f19 [0,4194304] 0
2026-03-10T07:51:20.220 INFO:tasks.workunit.client.1.vm08.stdout:8/123: write d0/df/d17/f21 [2254905,122219] 0
2026-03-10T07:51:20.220 INFO:tasks.workunit.client.1.vm08.stdout:8/124: dread - d0/df/d17/f1a zero size
2026-03-10T07:51:20.230 INFO:tasks.workunit.client.1.vm08.stdout:1/95: symlink d2/d6/d11/l28 0
2026-03-10T07:51:20.231 INFO:tasks.workunit.client.1.vm08.stdout:2/117: write d0/d1/d3/f8 [4196740,58106] 0
2026-03-10T07:51:20.231 INFO:tasks.workunit.client.1.vm08.stdout:1/96: dread d2/d6/de/d1f/d22/f24 [0,4194304] 0
2026-03-10T07:51:20.231 INFO:tasks.workunit.client.1.vm08.stdout:5/118: rmdir d0 39
2026-03-10T07:51:20.237 INFO:tasks.workunit.client.1.vm08.stdout:1/97: dwrite d2/d6/fd [0,4194304] 0
2026-03-10T07:51:20.243 INFO:tasks.workunit.client.1.vm08.stdout:8/125: sync
2026-03-10T07:51:20.243 INFO:tasks.workunit.client.1.vm08.stdout:3/84: creat d0/dd/f20 x:0 0 0
2026-03-10T07:51:20.246 INFO:tasks.workunit.client.1.vm08.stdout:0/137: write f6 [508505,99760] 0
2026-03-10T07:51:20.247 INFO:tasks.workunit.client.1.vm08.stdout:0/138: dread - dd/f13 zero size
2026-03-10T07:51:20.251 INFO:tasks.workunit.client.1.vm08.stdout:4/75: mkdir d5/d17 0
2026-03-10T07:51:20.255 INFO:tasks.workunit.client.1.vm08.stdout:7/130: fsync d3/da/d25/f27 0
2026-03-10T07:51:20.259 INFO:tasks.workunit.client.1.vm08.stdout:6/105: write d1/f2 [4192473,5722] 0
2026-03-10T07:51:20.261 INFO:tasks.workunit.client.1.vm08.stdout:9/103: rename d2/de/d17 to d2/d3/d25 0
2026-03-10T07:51:20.263 INFO:tasks.workunit.client.1.vm08.stdout:2/118: creat d0/d1/d3/d5/d1b/f26 x:0 0 0
2026-03-10T07:51:20.265 INFO:tasks.workunit.client.1.vm08.stdout:5/119: dread - d0/d4/df/d12/d22/f23 zero size
2026-03-10T07:51:20.266 INFO:tasks.workunit.client.1.vm08.stdout:5/120: chown d0/f9 87612 1
2026-03-10T07:51:20.272 INFO:tasks.workunit.client.1.vm08.stdout:4/76: creat d5/d8/d9/f18 x:0 0 0
2026-03-10T07:51:20.272 INFO:tasks.workunit.client.1.vm08.stdout:4/77: fdatasync d5/d8/d9/f18 0
2026-03-10T07:51:20.276 INFO:tasks.workunit.client.1.vm08.stdout:7/131: symlink d3/l2c 0
2026-03-10T07:51:20.277 INFO:tasks.workunit.client.1.vm08.stdout:7/132: fdatasync d3/da/d25/d9/fd 0
2026-03-10T07:51:20.284 INFO:tasks.workunit.client.1.vm08.stdout:9/104: unlink d2/d3/f18 0
2026-03-10T07:51:20.288 INFO:tasks.workunit.client.1.vm08.stdout:2/119: mkdir d0/d1/d17/d27 0
2026-03-10T07:51:20.289 INFO:tasks.workunit.client.1.vm08.stdout:2/120: write d0/f1e [380203,3753] 0
2026-03-10T07:51:20.291 INFO:tasks.workunit.client.1.vm08.stdout:5/121: mkdir d0/d8/d24 0
2026-03-10T07:51:20.292 INFO:tasks.workunit.client.1.vm08.stdout:5/122: readlink d0/l3 0
2026-03-10T07:51:20.294 INFO:tasks.workunit.client.1.vm08.stdout:1/98: fdatasync d2/d6/ff 0
2026-03-10T07:51:20.297 INFO:tasks.workunit.client.1.vm08.stdout:3/85: symlink d0/dd/d1f/l21 0
2026-03-10T07:51:20.298 INFO:tasks.workunit.client.0.vm05.stdout:8/986: creat d1/dd/d4d/d64/d6a/de5/d2a/d48/d7c/d9c/f143 x:0 0 0
2026-03-10T07:51:20.298 INFO:tasks.workunit.client.1.vm08.stdout:8/126: symlink d0/df/d17/d25/l27 0
2026-03-10T07:51:20.299 INFO:tasks.workunit.client.1.vm08.stdout:0/139: getdents dd/d10/d2f 0
2026-03-10T07:51:20.299 INFO:tasks.workunit.client.1.vm08.stdout:0/140: chown dd/d10/d14/d1b/f2c 669426043 1
2026-03-10T07:51:20.302 INFO:tasks.workunit.client.0.vm05.stdout:8/987: unlink d1/dd/d18/ce1 0
2026-03-10T07:51:20.302 INFO:tasks.workunit.client.0.vm05.stdout:8/988: readlink d1/dd/d5e/l118 0
2026-03-10T07:51:20.304 INFO:tasks.workunit.client.0.vm05.stdout:8/989: chown d1/dd/d4d/d64/d6a/de5/d2a/d48/d5a/c9b 3 1
2026-03-10T07:51:20.306 INFO:tasks.workunit.client.1.vm08.stdout:7/133: mknod d3/c2d 0
2026-03-10T07:51:20.309 INFO:tasks.workunit.client.0.vm05.stdout:8/990: mknod d1/dd/d4d/d64/d8f/c144 0
2026-03-10T07:51:20.314 INFO:tasks.workunit.client.0.vm05.stdout:8/991: truncate d1/dd/d4d/d64/d6a/de5/d2a/d34/d49/d5d/fd6 1644908 0
2026-03-10T07:51:20.320 INFO:tasks.workunit.client.0.vm05.stdout:8/992: chown d1/dd/d4d/d64/d6a/de5/d2a/d34/f39 177 1
2026-03-10T07:51:20.334 INFO:tasks.workunit.client.0.vm05.stdout:8/993: creat d1/dd/d4d/d64/d6a/de5/d2a/d34/d49/f145 x:0 0 0
2026-03-10T07:51:20.339 INFO:tasks.workunit.client.0.vm05.stdout:8/994: mkdir d1/dd/d121/d146 0
2026-03-10T07:51:20.351 INFO:tasks.workunit.client.0.vm05.stdout:8/995: dread d1/dd/d5e/d9e/fff [0,4194304] 0
2026-03-10T07:51:20.362 INFO:tasks.workunit.client.0.vm05.stdout:8/996: mkdir d1/dd/d4d/d64/d6a/de5/d2a/d34/d49/d10c/d116/d147 0
2026-03-10T07:51:20.378 INFO:tasks.workunit.client.1.vm08.stdout:6/106: fdatasync d1/f7 0
2026-03-10T07:51:20.380 INFO:tasks.workunit.client.1.vm08.stdout:9/105: mkdir d2/d26 0
2026-03-10T07:51:20.380 INFO:tasks.workunit.client.0.vm05.stdout:8/997: rmdir d1/dd 39
2026-03-10T07:51:20.380 INFO:tasks.workunit.client.1.vm08.stdout:9/106: write d2/de/f1e [274378,95323] 0
2026-03-10T07:51:20.384 INFO:tasks.workunit.client.1.vm08.stdout:9/107: dwrite d2/f15 [0,4194304] 0
2026-03-10T07:51:20.387 INFO:tasks.workunit.client.1.vm08.stdout:3/86: creat d0/dd/d18/f22 x:0 0 0
2026-03-10T07:51:20.389 INFO:tasks.workunit.client.1.vm08.stdout:8/127: mkdir d0/df/d17/d25/d28 0
2026-03-10T07:51:20.390 INFO:tasks.workunit.client.1.vm08.stdout:9/108: sync
2026-03-10T07:51:20.392 INFO:tasks.workunit.client.1.vm08.stdout:4/78: symlink d5/d8/l19 0
2026-03-10T07:51:20.392 INFO:tasks.workunit.client.1.vm08.stdout:7/134: rename d3/da/d25/f14 to d3/f2e 0
2026-03-10T07:51:20.393 INFO:tasks.workunit.client.0.vm05.stdout:8/998: fdatasync d1/dd/d18/f47 0
2026-03-10T07:51:20.395 INFO:tasks.workunit.client.1.vm08.stdout:5/123: creat d0/d4/df/d1e/f25 x:0 0 0
2026-03-10T07:51:20.408 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:51:20 vm05.local ceph-mon[50387]: pgmap v29: 65 pgs: 65 active+clean; 2.2 GiB data, 7.5 GiB used, 113 GiB / 120 GiB avail; 47 MiB/s rd, 116 MiB/s wr, 317 op/s
2026-03-10T07:51:20.408 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:51:20 vm05.local ceph-mon[50387]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke'
2026-03-10T07:51:20.408 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:51:20 vm05.local ceph-mon[50387]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke'
2026-03-10T07:51:20.408 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:51:20 vm05.local ceph-mon[50387]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-10T07:51:20.408 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:51:20 vm05.local ceph-mon[50387]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
2026-03-10T07:51:20.408 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:51:20 vm05.local ceph-mon[50387]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke'
2026-03-10T07:51:20.408 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:51:20 vm05.local ceph-mon[50387]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' cmd=[{"prefix": "dashboard get-prometheus-api-host"}]: dispatch
2026-03-10T07:51:20.408 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:51:20 vm05.local ceph-mon[50387]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch
2026-03-10T07:51:20.408 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:51:20 vm05.local ceph-mon[50387]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' cmd=[{"prefix": "versions"}]: dispatch
2026-03-10T07:51:20.408 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:51:20 vm05.local ceph-mon[50387]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' cmd=[{"prefix": "versions"}]: dispatch
2026-03-10T07:51:20.408 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:51:20 vm05.local ceph-mon[50387]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' cmd=[{"prefix": "versions"}]: dispatch
2026-03-10T07:51:20.408 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:51:20 vm05.local ceph-mon[50387]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' cmd=[{"prefix": "versions"}]: dispatch
2026-03-10T07:51:20.408 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:51:20 vm05.local ceph-mon[50387]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' cmd=[{"prefix": "versions"}]: dispatch
2026-03-10T07:51:20.408 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:51:20 vm05.local ceph-mon[50387]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' cmd=[{"prefix": "versions"}]: dispatch
2026-03-10T07:51:20.408 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:51:20 vm05.local ceph-mon[50387]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' cmd=[{"prefix": "versions"}]: dispatch
2026-03-10T07:51:20.408 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:51:20 vm05.local ceph-mon[50387]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' cmd=[{"prefix": "versions"}]: dispatch
2026-03-10T07:51:20.408 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:51:20 vm05.local ceph-mon[50387]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' cmd=[{"prefix": "versions"}]: dispatch
2026-03-10T07:51:20.408 INFO:tasks.workunit.client.1.vm08.stdout:1/99: creat d2/d6/de/d1f/d26/f29 x:0 0 0
2026-03-10T07:51:20.409 INFO:tasks.workunit.client.1.vm08.stdout:8/128: readlink d0/df/d17/l1e 0
2026-03-10T07:51:20.409 INFO:tasks.workunit.client.1.vm08.stdout:9/109: creat d2/de/f27 x:0 0 0
2026-03-10T07:51:20.409 INFO:tasks.workunit.client.1.vm08.stdout:0/141: mkdir dd/d10/d14/d1b/d30/d31 0
2026-03-10T07:51:20.410 INFO:tasks.workunit.client.1.vm08.stdout:7/135: mkdir d3/da/d25/d9/d2f 0
2026-03-10T07:51:20.414 INFO:tasks.workunit.client.1.vm08.stdout:5/124: rename d0/la to d0/d4/df/l26 0
2026-03-10T07:51:20.415 INFO:tasks.workunit.client.1.vm08.stdout:9/110: dread d2/de/f14 [0,4194304] 0
2026-03-10T07:51:20.415 INFO:tasks.workunit.client.1.vm08.stdout:1/100: dwrite d2/d6/de/d1f/d26/f29 [0,4194304] 0
2026-03-10T07:51:20.416 INFO:tasks.workunit.client.1.vm08.stdout:7/136: dwrite d3/f2b [0,4194304] 0
2026-03-10T07:51:20.416 INFO:tasks.workunit.client.1.vm08.stdout:5/125: chown d0/l3 0 1
2026-03-10T07:51:20.418 INFO:tasks.workunit.client.1.vm08.stdout:7/137: chown d3 0 1
2026-03-10T07:51:20.418 INFO:tasks.workunit.client.1.vm08.stdout:9/111: write d2/f15 [270554,41729] 0
2026-03-10T07:51:20.419 INFO:tasks.workunit.client.1.vm08.stdout:7/138: chown d3/c2d 95048323 1
2026-03-10T07:51:20.419 INFO:tasks.workunit.client.1.vm08.stdout:9/112: stat d2/d3/d25/c23 0
2026-03-10T07:51:20.429 INFO:tasks.workunit.client.1.vm08.stdout:5/126: dwrite d0/d4/df/d12/f11 [0,4194304] 0
2026-03-10T07:51:20.435 INFO:tasks.workunit.client.1.vm08.stdout:4/79: creat d5/d17/f1a x:0 0 0
2026-03-10T07:51:20.440 INFO:tasks.workunit.client.1.vm08.stdout:9/113: dwrite d2/f1a [0,4194304] 0
2026-03-10T07:51:20.447 INFO:tasks.workunit.client.1.vm08.stdout:7/139: creat d3/da/d25/d9/f30 x:0 0 0
2026-03-10T07:51:20.448 INFO:tasks.workunit.client.1.vm08.stdout:8/129: symlink d0/df/d17/d25/d28/l29 0
2026-03-10T07:51:20.448 INFO:tasks.workunit.client.1.vm08.stdout:4/80: chown d5/d8/fc 1012475160 1
2026-03-10T07:51:20.448 INFO:tasks.workunit.client.1.vm08.stdout:1/101: dwrite d2/d6/ff [0,4194304] 0
2026-03-10T07:51:20.449 INFO:tasks.workunit.client.1.vm08.stdout:8/130: write d0/df/d17/f1a [84045,112063] 0
2026-03-10T07:51:20.449 INFO:tasks.workunit.client.1.vm08.stdout:0/142: mkdir dd/d10/d2f/d32 0
2026-03-10T07:51:20.451 INFO:tasks.workunit.client.1.vm08.stdout:8/131: write d0/f22 [725862,94906] 0
2026-03-10T07:51:20.451 INFO:tasks.workunit.client.1.vm08.stdout:9/114: mkdir d2/de/d28 0
2026-03-10T07:51:20.461 INFO:tasks.workunit.client.1.vm08.stdout:5/127: sync
2026-03-10T07:51:20.463 INFO:tasks.workunit.client.1.vm08.stdout:7/140: rename c2 to d3/da/d25/d9/c31 0
2026-03-10T07:51:20.467 INFO:tasks.workunit.client.1.vm08.stdout:4/81: creat d5/d8/d9/f1b x:0 0 0
2026-03-10T07:51:20.472 INFO:tasks.workunit.client.1.vm08.stdout:7/141: truncate d3/da/f17 934903 0
2026-03-10T07:51:20.472 INFO:tasks.workunit.client.1.vm08.stdout:7/142: write d3/f4 [7689463,14430] 0
2026-03-10T07:51:20.472 INFO:tasks.workunit.client.1.vm08.stdout:1/102: unlink d2/d6/de/c1d 0
2026-03-10T07:51:20.474 INFO:tasks.workunit.client.1.vm08.stdout:7/143: creat d3/da/d25/f32 x:0 0 0
2026-03-10T07:51:20.474 INFO:tasks.workunit.client.1.vm08.stdout:5/128: sync
2026-03-10T07:51:20.480 INFO:tasks.workunit.client.1.vm08.stdout:7/144: fsync d3/da/d25/f1e 0
2026-03-10T07:51:20.480 INFO:tasks.workunit.client.1.vm08.stdout:4/82: symlink d5/l1c 0
2026-03-10T07:51:20.484 INFO:tasks.workunit.client.1.vm08.stdout:5/129: creat d0/d4/df/f27 x:0 0 0
2026-03-10T07:51:20.485 INFO:tasks.workunit.client.1.vm08.stdout:5/130: write d0/d4/df/d12/d22/f23 [610952,97826] 0
2026-03-10T07:51:20.485 INFO:tasks.workunit.client.1.vm08.stdout:9/115: dread d2/f5 [0,4194304] 0
2026-03-10T07:51:20.487 INFO:tasks.workunit.client.1.vm08.stdout:7/145: symlink d3/l33 0
2026-03-10T07:51:20.492 INFO:tasks.workunit.client.1.vm08.stdout:5/131: dwrite d0/d4/df/d12/f11 [0,4194304] 0
2026-03-10T07:51:20.495 INFO:tasks.workunit.client.1.vm08.stdout:5/132: dread d0/d4/f5 [4194304,4194304] 0
2026-03-10T07:51:20.498 INFO:tasks.workunit.client.1.vm08.stdout:5/133: dread d0/d4/f5 [4194304,4194304] 0
2026-03-10T07:51:20.503 INFO:tasks.workunit.client.1.vm08.stdout:5/134: truncate d0/d8/f18 535421 0
2026-03-10T07:51:20.507 INFO:tasks.workunit.client.1.vm08.stdout:5/135: dwrite d0/d4/df/f27 [0,4194304] 0
2026-03-10T07:51:20.507 INFO:tasks.workunit.client.1.vm08.stdout:9/116: sync
2026-03-10T07:51:20.507 INFO:tasks.workunit.client.1.vm08.stdout:5/136: readlink d0/d4/l1d 0
2026-03-10T07:51:20.508 INFO:tasks.workunit.client.1.vm08.stdout:9/117: truncate d2/d3/d25/f20 191596 0
2026-03-10T07:51:20.508 INFO:tasks.workunit.client.1.vm08.stdout:5/137: dread d0/d8/f18 [0,4194304] 0
2026-03-10T07:51:20.520 INFO:tasks.workunit.client.1.vm08.stdout:5/138: dwrite d0/f1a [0,4194304] 0
2026-03-10T07:51:20.541 INFO:tasks.workunit.client.1.vm08.stdout:7/146: creat d3/f34 x:0 0 0
2026-03-10T07:51:20.543 INFO:tasks.workunit.client.1.vm08.stdout:9/118: rename d2/f15 to d2/d26/f29 0
2026-03-10T07:51:20.544 INFO:tasks.workunit.client.1.vm08.stdout:2/121: truncate d0/d1/d3/d5/f20 3780887 0
2026-03-10T07:51:20.545 INFO:tasks.workunit.client.1.vm08.stdout:2/122: dread - d0/f19 zero size
2026-03-10T07:51:20.545 INFO:tasks.workunit.client.1.vm08.stdout:7/147: creat d3/da/d25/f35 x:0 0 0
2026-03-10T07:51:20.546 INFO:tasks.workunit.client.1.vm08.stdout:7/148: read d3/da/d25/f18 [3595930,102827] 0
2026-03-10T07:51:20.553 INFO:tasks.workunit.client.1.vm08.stdout:6/107: dwrite d1/d3/df/f12 [0,4194304] 0
2026-03-10T07:51:20.553 INFO:tasks.workunit.client.1.vm08.stdout:5/139: rename d0/f1a to d0/d4/df/d1e/f28 0
2026-03-10T07:51:20.553 INFO:tasks.workunit.client.1.vm08.stdout:7/149: chown d3/da/f17 243 1
2026-03-10T07:51:20.553 INFO:tasks.workunit.client.1.vm08.stdout:7/150: readlink d3/da/d25/d9/l1c 0
2026-03-10T07:51:20.553 INFO:tasks.workunit.client.1.vm08.stdout:9/119: mkdir d2/d3/d25/d2a 0
2026-03-10T07:51:20.554 INFO:tasks.workunit.client.1.vm08.stdout:9/120: chown d2/l9 55446 1
2026-03-10T07:51:20.555 INFO:tasks.workunit.client.1.vm08.stdout:9/121: stat d2/d3/f12 0
2026-03-10T07:51:20.556 INFO:tasks.workunit.client.1.vm08.stdout:7/151: dread d3/da/d25/f18 [0,4194304] 0
2026-03-10T07:51:20.556 INFO:tasks.workunit.client.1.vm08.stdout:7/152: dread - d3/da/d25/f35 zero size
2026-03-10T07:51:20.559 INFO:tasks.workunit.client.1.vm08.stdout:4/83: creat d5/d8/f1d x:0 0 0
2026-03-10T07:51:20.561 INFO:tasks.workunit.client.1.vm08.stdout:5/140: dwrite d0/d4/df/d12/f13 [0,4194304] 0
2026-03-10T07:51:20.562 INFO:tasks.workunit.client.1.vm08.stdout:5/141: chown d0/d8/f1b 98900545 1
2026-03-10T07:51:20.562 INFO:tasks.workunit.client.1.vm08.stdout:8/132: dread d0/df/d17/f1a [0,4194304] 0
2026-03-10T07:51:20.563 INFO:tasks.workunit.client.1.vm08.stdout:2/123: unlink d0/d1/ce 0
2026-03-10T07:51:20.568 INFO:tasks.workunit.client.1.vm08.stdout:2/124: mknod d0/d1/d17/c28 0
2026-03-10T07:51:20.569 INFO:tasks.workunit.client.1.vm08.stdout:2/125: write d0/d1/d3/f14 [3094526,103746] 0
2026-03-10T07:51:20.570 INFO:tasks.workunit.client.1.vm08.stdout:8/133: dread d0/df/f13 [0,4194304] 0
2026-03-10T07:51:20.571 INFO:tasks.workunit.client.1.vm08.stdout:8/134: readlink d0/df/d17/l1e 0
2026-03-10T07:51:20.571 INFO:tasks.workunit.client.1.vm08.stdout:6/108: sync
2026-03-10T07:51:20.571 INFO:tasks.workunit.client.1.vm08.stdout:5/142: creat d0/d4/df/d12/d1c/f29 x:0 0 0
2026-03-10T07:51:20.572 INFO:tasks.workunit.client.1.vm08.stdout:8/135: write d0/df/f1b [767083,4118] 0
2026-03-10T07:51:20.574 INFO:tasks.workunit.client.1.vm08.stdout:5/143: dread d0/f9 [0,4194304] 0
2026-03-10T07:51:20.582 INFO:tasks.workunit.client.1.vm08.stdout:5/144: readlink d0/d4/l1d 0
2026-03-10T07:51:20.582 INFO:tasks.workunit.client.1.vm08.stdout:2/126: unlink d0/f19 0
2026-03-10T07:51:20.582 INFO:tasks.workunit.client.1.vm08.stdout:6/109: creat d1/d3/f21 x:0 0 0
2026-03-10T07:51:20.582 INFO:tasks.workunit.client.1.vm08.stdout:6/110: fdatasync d1/d3/f13 0
2026-03-10T07:51:20.582 INFO:tasks.workunit.client.1.vm08.stdout:6/111: write d1/d3/df/f12 [2300457,90098] 0
2026-03-10T07:51:20.582 INFO:tasks.workunit.client.1.vm08.stdout:6/112: truncate d1/d3/f21 163862 0 2026-03-10T07:51:20.582 INFO:tasks.workunit.client.1.vm08.stdout:2/127: creat d0/d1/d17/d27/f29 x:0 0 0 2026-03-10T07:51:20.582 INFO:tasks.workunit.client.1.vm08.stdout:6/113: mknod d1/d3/c22 0 2026-03-10T07:51:20.583 INFO:tasks.workunit.client.1.vm08.stdout:2/128: mkdir d0/d1/d3/d5/d2a 0 2026-03-10T07:51:20.585 INFO:tasks.workunit.client.1.vm08.stdout:6/114: creat d1/db/f23 x:0 0 0 2026-03-10T07:51:20.585 INFO:tasks.workunit.client.1.vm08.stdout:6/115: dread - d1/d3/df/d1d/f1f zero size 2026-03-10T07:51:20.587 INFO:tasks.workunit.client.1.vm08.stdout:2/129: rename d0/d1/d3/c7 to d0/d1/d3/c2b 0 2026-03-10T07:51:20.588 INFO:tasks.workunit.client.1.vm08.stdout:9/122: sync 2026-03-10T07:51:20.589 INFO:tasks.workunit.client.1.vm08.stdout:6/116: mkdir d1/db/d24 0 2026-03-10T07:51:20.591 INFO:tasks.workunit.client.1.vm08.stdout:9/123: dwrite d2/d3/fc [0,4194304] 0 2026-03-10T07:51:20.598 INFO:tasks.workunit.client.1.vm08.stdout:9/124: mkdir d2/d3/d25/d2b 0 2026-03-10T07:51:20.599 INFO:tasks.workunit.client.1.vm08.stdout:9/125: stat d2/fd 0 2026-03-10T07:51:20.599 INFO:tasks.workunit.client.1.vm08.stdout:6/117: creat d1/db/d24/f25 x:0 0 0 2026-03-10T07:51:20.599 INFO:tasks.workunit.client.1.vm08.stdout:6/118: stat d1/d3/df/d1d/f1f 0 2026-03-10T07:51:20.599 INFO:tasks.workunit.client.1.vm08.stdout:9/126: dread - d2/d3/d25/f24 zero size 2026-03-10T07:51:20.600 INFO:tasks.workunit.client.1.vm08.stdout:6/119: creat d1/db/d24/f26 x:0 0 0 2026-03-10T07:51:20.601 INFO:tasks.workunit.client.1.vm08.stdout:9/127: link d2/d3/d25/f1b d2/d3/d25/d2b/f2c 0 2026-03-10T07:51:20.602 INFO:tasks.workunit.client.1.vm08.stdout:9/128: chown d2/f5 12661604 1 2026-03-10T07:51:20.602 INFO:tasks.workunit.client.1.vm08.stdout:6/120: symlink d1/d3/l27 0 2026-03-10T07:51:20.616 INFO:tasks.workunit.client.1.vm08.stdout:9/129: symlink d2/d26/l2d 0 2026-03-10T07:51:20.616 
INFO:tasks.workunit.client.1.vm08.stdout:6/121: readlink d1/l1a 0 2026-03-10T07:51:20.616 INFO:tasks.workunit.client.1.vm08.stdout:9/130: dread d2/f11 [0,4194304] 0 2026-03-10T07:51:20.616 INFO:tasks.workunit.client.1.vm08.stdout:6/122: dread d1/f7 [0,4194304] 0 2026-03-10T07:51:20.616 INFO:tasks.workunit.client.1.vm08.stdout:9/131: symlink d2/de/l2e 0 2026-03-10T07:51:20.616 INFO:tasks.workunit.client.1.vm08.stdout:6/123: stat d1/d3/df/d1d 0 2026-03-10T07:51:20.616 INFO:tasks.workunit.client.1.vm08.stdout:9/132: mknod d2/d3/d25/d2a/c2f 0 2026-03-10T07:51:20.616 INFO:tasks.workunit.client.1.vm08.stdout:6/124: dwrite d1/d3/df/d1d/f1f [0,4194304] 0 2026-03-10T07:51:20.616 INFO:tasks.workunit.client.1.vm08.stdout:9/133: mkdir d2/d3/d25/d30 0 2026-03-10T07:51:20.616 INFO:tasks.workunit.client.1.vm08.stdout:9/134: dread d2/d3/d25/f1b [0,4194304] 0 2026-03-10T07:51:20.617 INFO:tasks.workunit.client.1.vm08.stdout:9/135: rename d2/f11 to d2/d26/f31 0 2026-03-10T07:51:20.618 INFO:tasks.workunit.client.1.vm08.stdout:3/87: write d0/dd/f1d [2087,102799] 0 2026-03-10T07:51:20.618 INFO:tasks.workunit.client.1.vm08.stdout:9/136: write d2/f1a [2812096,39093] 0 2026-03-10T07:51:20.621 INFO:tasks.workunit.client.1.vm08.stdout:9/137: write d2/de/f14 [1940797,121001] 0 2026-03-10T07:51:20.622 INFO:tasks.workunit.client.1.vm08.stdout:3/88: rename d0/dd/f1d to d0/dd/d18/f23 0 2026-03-10T07:51:20.624 INFO:tasks.workunit.client.1.vm08.stdout:3/89: unlink d0/f14 0 2026-03-10T07:51:20.628 INFO:tasks.workunit.client.1.vm08.stdout:3/90: creat d0/dd/d1f/f24 x:0 0 0 2026-03-10T07:51:20.628 INFO:tasks.workunit.client.1.vm08.stdout:3/91: readlink d0/dd/d1f/l21 0 2026-03-10T07:51:20.629 INFO:tasks.workunit.client.1.vm08.stdout:3/92: truncate d0/dd/f20 669063 0 2026-03-10T07:51:20.630 INFO:tasks.workunit.client.1.vm08.stdout:3/93: dread - d0/f10 zero size 2026-03-10T07:51:20.638 INFO:tasks.workunit.client.1.vm08.stdout:3/94: unlink d0/f1a 0 2026-03-10T07:51:20.638 
INFO:tasks.workunit.client.1.vm08.stdout:3/95: write d0/fc [4004329,21540] 0 2026-03-10T07:51:20.639 INFO:tasks.workunit.client.1.vm08.stdout:3/96: mkdir d0/dd/d1f/d25 0 2026-03-10T07:51:20.646 INFO:tasks.workunit.client.1.vm08.stdout:6/125: sync 2026-03-10T07:51:20.646 INFO:tasks.workunit.client.1.vm08.stdout:8/136: sync 2026-03-10T07:51:20.647 INFO:tasks.workunit.client.1.vm08.stdout:5/145: sync 2026-03-10T07:51:20.649 INFO:tasks.workunit.client.1.vm08.stdout:8/137: creat d0/f2a x:0 0 0 2026-03-10T07:51:20.650 INFO:tasks.workunit.client.1.vm08.stdout:8/138: readlink d0/df/d17/d25/l27 0 2026-03-10T07:51:20.662 INFO:tasks.workunit.client.1.vm08.stdout:8/139: write d0/df/d17/f21 [4091114,122585] 0 2026-03-10T07:51:20.662 INFO:tasks.workunit.client.1.vm08.stdout:5/146: link d0/f17 d0/d4/df/f2a 0 2026-03-10T07:51:20.662 INFO:tasks.workunit.client.1.vm08.stdout:6/126: creat d1/f28 x:0 0 0 2026-03-10T07:51:20.662 INFO:tasks.workunit.client.1.vm08.stdout:8/140: dwrite d0/df/d17/f21 [0,4194304] 0 2026-03-10T07:51:20.662 INFO:tasks.workunit.client.1.vm08.stdout:5/147: symlink d0/d4/df/d1e/l2b 0 2026-03-10T07:51:20.662 INFO:tasks.workunit.client.1.vm08.stdout:6/127: rename d1/db/f16 to d1/d3/df/d1d/f29 0 2026-03-10T07:51:20.662 INFO:tasks.workunit.client.1.vm08.stdout:6/128: fsync d1/f18 0 2026-03-10T07:51:20.669 INFO:tasks.workunit.client.1.vm08.stdout:8/141: dwrite d0/df/f19 [4194304,4194304] 0 2026-03-10T07:51:20.674 INFO:tasks.workunit.client.1.vm08.stdout:8/142: stat d0/df/f19 0 2026-03-10T07:51:20.675 INFO:tasks.workunit.client.1.vm08.stdout:8/143: chown d0/df 4836 1 2026-03-10T07:51:20.675 INFO:tasks.workunit.client.1.vm08.stdout:8/144: chown d0/df/d17 291036623 1 2026-03-10T07:51:20.687 INFO:tasks.workunit.client.1.vm08.stdout:5/148: creat d0/d4/df/d12/d1c/f2c x:0 0 0 2026-03-10T07:51:20.687 INFO:tasks.workunit.client.1.vm08.stdout:5/149: fdatasync d0/d4/df/d12/d1c/f29 0 2026-03-10T07:51:20.688 INFO:tasks.workunit.client.1.vm08.stdout:5/150: write d0/d4/df/d1e/f25 
[435640,86106] 0 2026-03-10T07:51:20.694 INFO:tasks.workunit.client.1.vm08.stdout:5/151: mkdir d0/d4/df/d2d 0 2026-03-10T07:51:20.703 INFO:tasks.workunit.client.1.vm08.stdout:5/152: creat d0/d4/f2e x:0 0 0 2026-03-10T07:51:20.705 INFO:tasks.workunit.client.1.vm08.stdout:5/153: symlink d0/d4/df/d12/d22/l2f 0 2026-03-10T07:51:20.709 INFO:tasks.workunit.client.1.vm08.stdout:5/154: write d0/f9 [4170762,47928] 0 2026-03-10T07:51:20.712 INFO:tasks.workunit.client.1.vm08.stdout:5/155: fsync d0/d4/f5 0 2026-03-10T07:51:20.727 INFO:tasks.workunit.client.1.vm08.stdout:5/156: creat d0/f30 x:0 0 0 2026-03-10T07:51:20.731 INFO:tasks.workunit.client.0.vm05.stdout:8/999: write d1/dd/d4d/d64/d6a/de5/d2a/d34/d49/d5d/df3/ffe [104209,95526] 0 2026-03-10T07:51:20.745 INFO:tasks.workunit.client.0.vm05.stderr:+ rm -rf -- ./tmp.9XAR9Oh0ly 2026-03-10T07:51:20.749 INFO:tasks.workunit.client.1.vm08.stdout:1/103: dwrite d2/d6/de/d1f/d22/f24 [0,4194304] 0 2026-03-10T07:51:20.754 INFO:tasks.workunit.client.1.vm08.stdout:1/104: dread d2/d6/fd [0,4194304] 0 2026-03-10T07:51:20.757 INFO:tasks.workunit.client.1.vm08.stdout:1/105: dread d2/d6/de/d1f/d22/f24 [0,4194304] 0 2026-03-10T07:51:20.767 INFO:tasks.workunit.client.1.vm08.stdout:5/157: getdents d0/d4/df/d12 0 2026-03-10T07:51:20.768 INFO:tasks.workunit.client.1.vm08.stdout:5/158: chown d0/d4/df/f2a 316 1 2026-03-10T07:51:20.797 INFO:tasks.workunit.client.1.vm08.stdout:0/143: dread dd/f16 [0,4194304] 0 2026-03-10T07:51:20.797 INFO:tasks.workunit.client.1.vm08.stdout:0/144: chown dd/d10/d14/d15/d20 1 1 2026-03-10T07:51:20.859 INFO:tasks.workunit.client.1.vm08.stdout:5/159: creat d0/d4/f31 x:0 0 0 2026-03-10T07:51:20.867 INFO:tasks.workunit.client.1.vm08.stdout:0/145: creat dd/d10/d14/d1b/d30/d31/f33 x:0 0 0 2026-03-10T07:51:20.867 INFO:tasks.workunit.client.1.vm08.stdout:0/146: readlink dd/d10/d14/d1b/l27 0 2026-03-10T07:51:20.903 INFO:tasks.workunit.client.1.vm08.stdout:5/160: unlink d0/d4/df/l26 0 2026-03-10T07:51:20.904 
INFO:tasks.workunit.client.1.vm08.stdout:5/161: chown d0/d4/df/d1e/l2b 112452 1 2026-03-10T07:51:20.904 INFO:tasks.workunit.client.1.vm08.stdout:0/147: unlink dd/d10/c28 0 2026-03-10T07:51:20.905 INFO:tasks.workunit.client.1.vm08.stdout:5/162: symlink d0/d4/df/d12/l32 0 2026-03-10T07:51:20.906 INFO:tasks.workunit.client.1.vm08.stdout:5/163: read d0/d4/f5 [4055428,93657] 0 2026-03-10T07:51:20.907 INFO:tasks.workunit.client.1.vm08.stdout:0/148: rename dd/d10/l11 to dd/d10/d14/d15/l34 0 2026-03-10T07:51:20.907 INFO:tasks.workunit.client.1.vm08.stdout:0/149: write f6 [2657858,8876] 0 2026-03-10T07:51:20.908 INFO:tasks.workunit.client.1.vm08.stdout:0/150: write dd/f13 [1013295,47025] 0 2026-03-10T07:51:20.916 INFO:tasks.workunit.client.1.vm08.stdout:5/164: mkdir d0/d33 0 2026-03-10T07:51:20.917 INFO:tasks.workunit.client.1.vm08.stdout:5/165: read d0/d8/f1b [879023,85453] 0 2026-03-10T07:51:20.917 INFO:tasks.workunit.client.1.vm08.stdout:5/166: chown d0/d4/df/d12/d1c/f2c 240704 1 2026-03-10T07:51:20.920 INFO:tasks.workunit.client.1.vm08.stdout:0/151: creat dd/d10/d14/d1b/f35 x:0 0 0 2026-03-10T07:51:20.982 INFO:tasks.workunit.client.1.vm08.stdout:7/153: fsync d3/f34 0 2026-03-10T07:51:20.986 INFO:tasks.workunit.client.1.vm08.stdout:7/154: dwrite d3/da/d25/d9/f30 [0,4194304] 0 2026-03-10T07:51:20.990 INFO:tasks.workunit.client.1.vm08.stdout:7/155: symlink d3/da/d25/d9/l36 0 2026-03-10T07:51:20.992 INFO:tasks.workunit.client.1.vm08.stdout:7/156: read d3/f4 [7859173,97526] 0 2026-03-10T07:51:20.993 INFO:tasks.workunit.client.1.vm08.stdout:7/157: truncate d3/da/d25/f27 393771 0 2026-03-10T07:51:21.007 INFO:tasks.workunit.client.1.vm08.stdout:7/158: dwrite d3/da/d25/d9/f20 [0,4194304] 0 2026-03-10T07:51:21.012 INFO:tasks.workunit.client.1.vm08.stdout:7/159: read d3/da/f21 [398454,52501] 0 2026-03-10T07:51:21.032 INFO:tasks.workunit.client.1.vm08.stdout:7/160: unlink d3/da/d25/f18 0 2026-03-10T07:51:21.039 INFO:tasks.workunit.client.1.vm08.stdout:7/161: symlink d3/da/d25/l37 0 
2026-03-10T07:51:21.041 INFO:tasks.workunit.client.1.vm08.stdout:7/162: creat d3/f38 x:0 0 0 2026-03-10T07:51:21.041 INFO:tasks.workunit.client.1.vm08.stdout:7/163: chown d3/da/d25/f29 113 1 2026-03-10T07:51:21.041 INFO:tasks.workunit.client.1.vm08.stdout:7/164: stat d3/l33 0 2026-03-10T07:51:21.044 INFO:tasks.workunit.client.1.vm08.stdout:7/165: mkdir d3/da/d25/d9/d2f/d39 0 2026-03-10T07:51:21.045 INFO:tasks.workunit.client.1.vm08.stdout:7/166: write d3/da/d25/f1e [116263,83628] 0 2026-03-10T07:51:21.045 INFO:tasks.workunit.client.1.vm08.stdout:4/84: truncate d5/d8/ff 490084 0 2026-03-10T07:51:21.048 INFO:tasks.workunit.client.1.vm08.stdout:7/167: mkdir d3/da/d25/d9/d2f/d3a 0 2026-03-10T07:51:21.051 INFO:tasks.workunit.client.1.vm08.stdout:7/168: symlink d3/da/l3b 0 2026-03-10T07:51:21.053 INFO:tasks.workunit.client.1.vm08.stdout:4/85: link d5/d8/fd d5/d8/f1e 0 2026-03-10T07:51:21.055 INFO:tasks.workunit.client.1.vm08.stdout:4/86: dread d5/d8/f11 [0,4194304] 0 2026-03-10T07:51:21.056 INFO:tasks.workunit.client.1.vm08.stdout:4/87: chown d5/d8/f1e 223079196 1 2026-03-10T07:51:21.058 INFO:tasks.workunit.client.1.vm08.stdout:2/130: truncate d0/d1/d3/f8 200900 0 2026-03-10T07:51:21.059 INFO:tasks.workunit.client.1.vm08.stdout:2/131: read - d0/d1/d3/d5/d1b/f23 zero size 2026-03-10T07:51:21.061 INFO:tasks.workunit.client.1.vm08.stdout:2/132: symlink d0/d1/d17/l2c 0 2026-03-10T07:51:21.065 INFO:tasks.workunit.client.1.vm08.stdout:2/133: mknod d0/c2d 0 2026-03-10T07:51:21.066 INFO:tasks.workunit.client.1.vm08.stdout:2/134: stat d0/d1/d17/d27 0 2026-03-10T07:51:21.069 INFO:tasks.workunit.client.1.vm08.stdout:9/138: write d2/f10 [712690,95073] 0 2026-03-10T07:51:21.070 INFO:tasks.workunit.client.1.vm08.stdout:2/135: dwrite d0/d1/f24 [0,4194304] 0 2026-03-10T07:51:21.075 INFO:tasks.workunit.client.1.vm08.stdout:4/88: sync 2026-03-10T07:51:21.075 INFO:tasks.workunit.client.1.vm08.stdout:9/139: write d2/d3/d25/f24 [679390,25863] 0 2026-03-10T07:51:21.076 
INFO:tasks.workunit.client.1.vm08.stdout:4/89: read - d5/d8/d9/f1b zero size 2026-03-10T07:51:21.079 INFO:tasks.workunit.client.1.vm08.stdout:2/136: symlink d0/d1/d3/d5/l2e 0 2026-03-10T07:51:21.080 INFO:tasks.workunit.client.1.vm08.stdout:9/140: mknod d2/c32 0 2026-03-10T07:51:21.080 INFO:tasks.workunit.client.1.vm08.stdout:2/137: write d0/d1/d3/d5/d1b/f23 [705058,16388] 0 2026-03-10T07:51:21.080 INFO:tasks.workunit.client.1.vm08.stdout:9/141: chown d2/d3/d25/d30 281528 1 2026-03-10T07:51:21.081 INFO:tasks.workunit.client.1.vm08.stdout:9/142: write d2/d3/fc [3156171,6699] 0 2026-03-10T07:51:21.087 INFO:tasks.workunit.client.1.vm08.stdout:9/143: dwrite d2/de/f1e [0,4194304] 0 2026-03-10T07:51:21.093 INFO:tasks.workunit.client.1.vm08.stdout:9/144: mknod d2/d3/d25/d2a/c33 0 2026-03-10T07:51:21.094 INFO:tasks.workunit.client.1.vm08.stdout:3/97: truncate d0/fc 3867207 0 2026-03-10T07:51:21.094 INFO:tasks.workunit.client.1.vm08.stdout:9/145: truncate d2/fd 3466800 0 2026-03-10T07:51:21.095 INFO:tasks.workunit.client.1.vm08.stdout:9/146: chown d2/d3/d25/f24 7325218 1 2026-03-10T07:51:21.095 INFO:tasks.workunit.client.1.vm08.stdout:3/98: creat d0/dd/f26 x:0 0 0 2026-03-10T07:51:21.095 INFO:tasks.workunit.client.1.vm08.stdout:3/99: chown d0/dd/f20 100400659 1 2026-03-10T07:51:21.097 INFO:tasks.workunit.client.1.vm08.stdout:6/129: getdents d1 0 2026-03-10T07:51:21.098 INFO:tasks.workunit.client.1.vm08.stdout:6/130: write d1/f6 [2637884,49652] 0 2026-03-10T07:51:21.101 INFO:tasks.workunit.client.1.vm08.stdout:3/100: dwrite d0/f16 [0,4194304] 0 2026-03-10T07:51:21.102 INFO:tasks.workunit.client.1.vm08.stdout:3/101: write d0/dd/d18/f1e [889447,20450] 0 2026-03-10T07:51:21.102 INFO:tasks.workunit.client.1.vm08.stdout:3/102: dread - d0/f10 zero size 2026-03-10T07:51:21.110 INFO:tasks.workunit.client.1.vm08.stdout:3/103: dwrite d0/f13 [0,4194304] 0 2026-03-10T07:51:21.111 INFO:tasks.workunit.client.1.vm08.stdout:3/104: truncate d0/dd/d1f/f24 625036 0 2026-03-10T07:51:21.127 
INFO:tasks.workunit.client.1.vm08.stdout:9/147: symlink d2/d3/d25/d2b/l34 0 2026-03-10T07:51:21.136 INFO:tasks.workunit.client.1.vm08.stdout:6/131: creat d1/d3/df/d1d/f2a x:0 0 0 2026-03-10T07:51:21.145 INFO:tasks.workunit.client.1.vm08.stdout:6/132: truncate d1/db/d24/f26 222706 0 2026-03-10T07:51:21.145 INFO:tasks.workunit.client.1.vm08.stdout:6/133: dread - d1/d17/f20 zero size 2026-03-10T07:51:21.147 INFO:tasks.workunit.client.1.vm08.stdout:8/145: dwrite d0/df/f12 [0,4194304] 0 2026-03-10T07:51:21.148 INFO:tasks.workunit.client.1.vm08.stdout:8/146: chown d0/df/f19 0 1 2026-03-10T07:51:21.167 INFO:tasks.workunit.client.1.vm08.stdout:8/147: creat d0/df/d17/d25/f2b x:0 0 0 2026-03-10T07:51:21.170 INFO:tasks.workunit.client.1.vm08.stdout:8/148: dwrite d0/df/f19 [4194304,4194304] 0 2026-03-10T07:51:21.170 INFO:tasks.workunit.client.1.vm08.stdout:3/105: link d0/dd/ce d0/dd/d1f/c27 0 2026-03-10T07:51:21.174 INFO:tasks.workunit.client.1.vm08.stdout:1/106: write d2/d6/de/f1c [616530,48287] 0 2026-03-10T07:51:21.175 INFO:tasks.workunit.client.1.vm08.stdout:1/107: readlink d2/d6/d11/l28 0 2026-03-10T07:51:21.179 INFO:tasks.workunit.client.1.vm08.stdout:1/108: dread d2/f13 [0,4194304] 0 2026-03-10T07:51:21.184 INFO:tasks.workunit.client.1.vm08.stdout:8/149: write d0/df/f13 [2790790,45209] 0 2026-03-10T07:51:21.184 INFO:tasks.workunit.client.1.vm08.stdout:8/150: fsync d0/f20 0 2026-03-10T07:51:21.186 INFO:tasks.workunit.client.1.vm08.stdout:1/109: creat d2/d6/de/d1f/f2a x:0 0 0 2026-03-10T07:51:21.189 INFO:tasks.workunit.client.1.vm08.stdout:3/106: creat d0/f28 x:0 0 0 2026-03-10T07:51:21.190 INFO:tasks.workunit.client.1.vm08.stdout:3/107: chown d0/dd/d1f/d25 2368080 1 2026-03-10T07:51:21.190 INFO:tasks.workunit.client.1.vm08.stdout:3/108: stat d0/f13 0 2026-03-10T07:51:21.191 INFO:tasks.workunit.client.1.vm08.stdout:3/109: read d0/f16 [1992705,127297] 0 2026-03-10T07:51:21.192 INFO:tasks.workunit.client.1.vm08.stdout:5/167: getdents d0/d4/df/d12 0 2026-03-10T07:51:21.192 
INFO:tasks.workunit.client.1.vm08.stdout:5/168: stat d0/d8/f1b 0 2026-03-10T07:51:21.195 INFO:tasks.workunit.client.1.vm08.stdout:8/151: symlink d0/df/d15/d23/l2c 0 2026-03-10T07:51:21.207 INFO:tasks.workunit.client.1.vm08.stdout:1/110: unlink d2/d6/ff 0 2026-03-10T07:51:21.207 INFO:tasks.workunit.client.1.vm08.stdout:0/152: truncate dd/d18/f21 1674353 0 2026-03-10T07:51:21.207 INFO:tasks.workunit.client.1.vm08.stdout:1/111: write d2/d6/f18 [1490142,39607] 0 2026-03-10T07:51:21.207 INFO:tasks.workunit.client.1.vm08.stdout:3/110: symlink d0/dd/d1f/l29 0 2026-03-10T07:51:21.212 INFO:tasks.workunit.client.1.vm08.stdout:3/111: dread d0/dd/d18/f1e [0,4194304] 0 2026-03-10T07:51:21.213 INFO:tasks.workunit.client.1.vm08.stdout:8/152: fsync d0/df/d17/f21 0 2026-03-10T07:51:21.216 INFO:tasks.workunit.client.1.vm08.stdout:8/153: dread d0/f22 [0,4194304] 0 2026-03-10T07:51:21.220 INFO:tasks.workunit.client.1.vm08.stdout:8/154: dwrite d0/df/f1b [0,4194304] 0 2026-03-10T07:51:21.240 INFO:tasks.workunit.client.1.vm08.stdout:7/169: rmdir d3/da/d25 39 2026-03-10T07:51:21.256 INFO:tasks.workunit.client.1.vm08.stdout:2/138: rmdir d0 39 2026-03-10T07:51:21.259 INFO:tasks.workunit.client.1.vm08.stdout:4/90: write f0 [3971359,36421] 0 2026-03-10T07:51:21.261 INFO:tasks.workunit.client.1.vm08.stdout:5/169: truncate d0/d4/f5 1578969 0 2026-03-10T07:51:21.264 INFO:tasks.workunit.client.1.vm08.stdout:9/148: rename d2/d3/d25/d2a to d2/d3/d25/d30/d35 0 2026-03-10T07:51:21.265 INFO:tasks.workunit.client.1.vm08.stdout:8/155: mknod d0/c2d 0 2026-03-10T07:51:21.265 INFO:tasks.workunit.client.1.vm08.stdout:8/156: read - d0/f20 zero size 2026-03-10T07:51:21.275 INFO:tasks.workunit.client.1.vm08.stdout:7/170: truncate d3/da/d25/f32 355807 0 2026-03-10T07:51:21.276 INFO:tasks.workunit.client.1.vm08.stdout:7/171: write d3/da/d25/f27 [914020,72929] 0 2026-03-10T07:51:21.279 INFO:tasks.workunit.client.1.vm08.stdout:3/112: symlink d0/dd/d1f/d25/l2a 0 2026-03-10T07:51:21.280 
INFO:tasks.workunit.client.1.vm08.stdout:6/134: getdents d1/d3/df/d1d 0 2026-03-10T07:51:21.292 INFO:tasks.workunit.client.1.vm08.stdout:5/170: dread d0/d8/f1b [0,4194304] 0 2026-03-10T07:51:21.293 INFO:tasks.workunit.client.1.vm08.stdout:5/171: chown d0/d4/df/d1e/f28 14834746 1 2026-03-10T07:51:21.296 INFO:tasks.workunit.client.1.vm08.stdout:5/172: dwrite d0/f30 [0,4194304] 0 2026-03-10T07:51:21.297 INFO:tasks.workunit.client.1.vm08.stdout:0/153: rename dd/d10/d14/d1b/d30/d31/f33 to dd/d10/d14/f36 0 2026-03-10T07:51:21.308 INFO:tasks.workunit.client.1.vm08.stdout:9/149: truncate d2/fb 3656751 0 2026-03-10T07:51:21.318 INFO:tasks.workunit.client.1.vm08.stdout:8/157: mkdir d0/df/d2e 0 2026-03-10T07:51:21.318 INFO:tasks.workunit.client.1.vm08.stdout:6/135: mkdir d1/d17/d2b 0 2026-03-10T07:51:21.318 INFO:tasks.workunit.client.1.vm08.stdout:6/136: write d1/f2 [2551619,88900] 0 2026-03-10T07:51:21.318 INFO:tasks.workunit.client.1.vm08.stdout:4/91: mkdir d5/d1f 0 2026-03-10T07:51:21.318 INFO:tasks.workunit.client.1.vm08.stdout:0/154: mkdir dd/d10/d2f/d37 0 2026-03-10T07:51:21.318 INFO:tasks.workunit.client.1.vm08.stdout:9/150: rename d2/d3/d25/f20 to d2/d26/f36 0 2026-03-10T07:51:21.321 INFO:tasks.workunit.client.1.vm08.stdout:1/112: dwrite d2/d6/fd [0,4194304] 0 2026-03-10T07:51:21.321 INFO:tasks.workunit.client.1.vm08.stdout:8/158: symlink d0/df/d17/d25/d28/l2f 0 2026-03-10T07:51:21.322 INFO:tasks.workunit.client.1.vm08.stdout:4/92: dread d5/d8/f11 [0,4194304] 0 2026-03-10T07:51:21.323 INFO:tasks.workunit.client.1.vm08.stdout:9/151: dwrite d2/d3/d25/f24 [0,4194304] 0 2026-03-10T07:51:21.325 INFO:tasks.workunit.client.1.vm08.stdout:5/173: sync 2026-03-10T07:51:21.331 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:51:21 vm05.local ceph-mon[50387]: from='mon.0 -' entity='mon.' 
cmd=[{"prefix": "dashboard get-prometheus-api-host"}]: dispatch 2026-03-10T07:51:21.331 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:51:21 vm05.local ceph-mon[50387]: Upgrade: Updating alertmanager.vm05 2026-03-10T07:51:21.331 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:51:21 vm05.local ceph-mon[50387]: Deploying daemon alertmanager.vm05 on vm05 2026-03-10T07:51:21.331 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:51:21 vm05.local ceph-mon[50387]: pgmap v30: 65 pgs: 65 active+clean; 2.2 GiB data, 7.5 GiB used, 113 GiB / 120 GiB avail; 29 MiB/s rd, 79 MiB/s wr, 208 op/s 2026-03-10T07:51:21.333 INFO:tasks.workunit.client.1.vm08.stdout:2/139: mknod d0/c2f 0 2026-03-10T07:51:21.347 INFO:tasks.workunit.client.1.vm08.stdout:0/155: write dd/d10/d14/f36 [968260,39025] 0 2026-03-10T07:51:21.347 INFO:tasks.workunit.client.1.vm08.stdout:0/156: fsync dd/d10/d14/d15/d20/f2d 0 2026-03-10T07:51:21.348 INFO:tasks.workunit.client.1.vm08.stdout:0/157: write dd/fe [1200036,55658] 0 2026-03-10T07:51:21.350 INFO:tasks.workunit.client.1.vm08.stdout:0/158: dread fb [0,4194304] 0 2026-03-10T07:51:21.364 INFO:tasks.workunit.client.1.vm08.stdout:9/152: write d2/de/f16 [688790,77458] 0 2026-03-10T07:51:21.367 INFO:tasks.workunit.client.1.vm08.stdout:2/140: mknod d0/d1/d3/d5/c30 0 2026-03-10T07:51:21.370 INFO:tasks.workunit.client.1.vm08.stdout:7/172: link d3/da/d25/l1a d3/da/d25/d9/d2f/d3a/l3c 0 2026-03-10T07:51:21.371 INFO:tasks.workunit.client.1.vm08.stdout:5/174: creat d0/d33/f34 x:0 0 0 2026-03-10T07:51:21.371 INFO:tasks.workunit.client.1.vm08.stdout:9/153: rename d2/d3/d25/f1b to d2/d3/d25/d2b/f37 0 2026-03-10T07:51:21.374 INFO:tasks.workunit.client.1.vm08.stdout:6/137: link d1/db/c1b d1/db/d24/c2c 0 2026-03-10T07:51:21.374 INFO:tasks.workunit.client.1.vm08.stdout:2/141: rename d0/c2f to d0/d1/d3/d5/d1b/c31 0 2026-03-10T07:51:21.374 INFO:tasks.workunit.client.1.vm08.stdout:1/113: getdents d2/d10 0 2026-03-10T07:51:21.376 
INFO:tasks.workunit.client.1.vm08.stdout:9/154: truncate d2/d3/f12 64994 0 2026-03-10T07:51:21.376 INFO:tasks.workunit.client.1.vm08.stdout:2/142: write d0/d1/f24 [3830262,121371] 0 2026-03-10T07:51:21.377 INFO:tasks.workunit.client.1.vm08.stdout:1/114: unlink d2/d6/de/c16 0 2026-03-10T07:51:21.379 INFO:tasks.workunit.client.1.vm08.stdout:5/175: write d0/d4/df/d12/f11 [3654009,80028] 0 2026-03-10T07:51:21.384 INFO:tasks.workunit.client.1.vm08.stdout:2/143: dread - d0/d1/d17/f1c zero size 2026-03-10T07:51:21.384 INFO:tasks.workunit.client.1.vm08.stdout:7/173: dwrite d3/da/d25/d9/f23 [0,4194304] 0 2026-03-10T07:51:21.384 INFO:tasks.workunit.client.1.vm08.stdout:7/174: fdatasync d3/da/d25/d9/fd 0 2026-03-10T07:51:21.391 INFO:tasks.workunit.client.1.vm08.stdout:1/115: link d2/f13 d2/d10/f2b 0 2026-03-10T07:51:21.392 INFO:tasks.workunit.client.1.vm08.stdout:5/176: rename d0/d4/f5 to d0/d4/df/d2d/f35 0 2026-03-10T07:51:21.393 INFO:tasks.workunit.client.1.vm08.stdout:5/177: read d0/d4/df/d12/d22/f23 [284336,76699] 0 2026-03-10T07:51:21.394 INFO:tasks.workunit.client.1.vm08.stdout:1/116: unlink d2/d6/d11/l28 0 2026-03-10T07:51:21.395 INFO:tasks.workunit.client.1.vm08.stdout:9/155: link d2/d3/lf d2/d3/d25/l38 0 2026-03-10T07:51:21.395 INFO:tasks.workunit.client.1.vm08.stdout:1/117: symlink d2/l2c 0 2026-03-10T07:51:21.399 INFO:tasks.workunit.client.1.vm08.stdout:6/138: sync 2026-03-10T07:51:21.400 INFO:tasks.workunit.client.1.vm08.stdout:9/156: mknod d2/d3/d25/d30/c39 0 2026-03-10T07:51:21.401 INFO:tasks.workunit.client.1.vm08.stdout:6/139: write d1/d3/f19 [426299,96176] 0 2026-03-10T07:51:21.401 INFO:tasks.workunit.client.1.vm08.stdout:9/157: fsync d2/d3/fa 0 2026-03-10T07:51:21.403 INFO:tasks.workunit.client.1.vm08.stdout:1/118: write d2/d6/de/d1f/d22/f24 [2889650,6919] 0 2026-03-10T07:51:21.403 INFO:tasks.workunit.client.1.vm08.stdout:5/178: rename d0/l3 to d0/l36 0 2026-03-10T07:51:21.404 INFO:tasks.workunit.client.1.vm08.stdout:5/179: fsync d0/d8/fe 0 
2026-03-10T07:51:21.405 INFO:tasks.workunit.client.1.vm08.stdout:5/180: stat d0/d4/df/d1e/f28 0 2026-03-10T07:51:21.406 INFO:tasks.workunit.client.1.vm08.stdout:5/181: read - d0/d33/f34 zero size 2026-03-10T07:51:21.407 INFO:tasks.workunit.client.1.vm08.stdout:5/182: write d0/f9 [520163,11791] 0 2026-03-10T07:51:21.409 INFO:tasks.workunit.client.1.vm08.stdout:6/140: sync 2026-03-10T07:51:21.412 INFO:tasks.workunit.client.1.vm08.stdout:6/141: chown d1/db/d24/f26 0 1 2026-03-10T07:51:21.414 INFO:tasks.workunit.client.1.vm08.stdout:9/158: link d2/d3/d25/l38 d2/d3/l3a 0 2026-03-10T07:51:21.415 INFO:tasks.workunit.client.1.vm08.stdout:9/159: read d2/d3/fc [1459320,126366] 0 2026-03-10T07:51:21.417 INFO:tasks.workunit.client.1.vm08.stdout:1/119: dread d2/f4 [0,4194304] 0 2026-03-10T07:51:21.418 INFO:tasks.workunit.client.1.vm08.stdout:6/142: dread d1/d3/df/d1d/f29 [0,4194304] 0 2026-03-10T07:51:21.427 INFO:tasks.workunit.client.1.vm08.stdout:1/120: unlink d2/d6/fd 0 2026-03-10T07:51:21.431 INFO:tasks.workunit.client.1.vm08.stdout:3/113: dwrite d0/f16 [0,4194304] 0 2026-03-10T07:51:21.432 INFO:tasks.workunit.client.1.vm08.stdout:8/159: getdents d0/df/d17/d25/d28 0 2026-03-10T07:51:21.437 INFO:tasks.workunit.client.1.vm08.stdout:6/143: mknod d1/d3/c2d 0 2026-03-10T07:51:21.437 INFO:tasks.workunit.client.1.vm08.stdout:9/160: mknod d2/de/d28/c3b 0 2026-03-10T07:51:21.441 INFO:tasks.workunit.client.1.vm08.stdout:3/114: creat d0/dd/d1f/d25/f2b x:0 0 0 2026-03-10T07:51:21.448 INFO:tasks.workunit.client.1.vm08.stdout:7/175: truncate d3/da/d25/d9/f23 1919825 0 2026-03-10T07:51:21.449 INFO:tasks.workunit.client.1.vm08.stdout:2/144: write d0/d1/d3/d5/f20 [104740,98694] 0 2026-03-10T07:51:21.449 INFO:tasks.workunit.client.1.vm08.stdout:7/176: rename d3/da/d25 to d3/da/d25/d9/d2f/d3a/d3d 22 2026-03-10T07:51:21.450 INFO:tasks.workunit.client.1.vm08.stdout:9/161: unlink d2/d3/f1c 0 2026-03-10T07:51:21.458 INFO:tasks.workunit.client.1.vm08.stdout:5/183: truncate d0/f9 244279 0 
2026-03-10T07:51:21.463 INFO:tasks.workunit.client.1.vm08.stdout:8/160: mkdir d0/df/d2e/d30 0 2026-03-10T07:51:21.477 INFO:tasks.workunit.client.1.vm08.stdout:9/162: dread d2/fd [0,4194304] 0 2026-03-10T07:51:21.479 INFO:tasks.workunit.client.1.vm08.stdout:4/93: dwrite d5/d8/ff [0,4194304] 0 2026-03-10T07:51:21.486 INFO:tasks.workunit.client.1.vm08.stdout:1/121: truncate d2/d6/de/d1f/d26/f29 561123 0 2026-03-10T07:51:21.488 INFO:tasks.workunit.client.1.vm08.stdout:6/144: dwrite d1/f7 [0,4194304] 0 2026-03-10T07:51:21.489 INFO:tasks.workunit.client.1.vm08.stdout:1/122: chown d2/d6/f18 175 1 2026-03-10T07:51:21.504 INFO:tasks.workunit.client.1.vm08.stdout:8/161: rename d0/df/d15/d23/c24 to d0/df/d17/d25/c31 0 2026-03-10T07:51:21.510 INFO:tasks.workunit.client.1.vm08.stdout:2/145: mkdir d0/d1/d3/d10/d32 0 2026-03-10T07:51:21.511 INFO:tasks.workunit.client.1.vm08.stdout:2/146: write d0/d1/d3/d5/d1b/f23 [1521780,115332] 0 2026-03-10T07:51:21.518 INFO:tasks.workunit.client.1.vm08.stdout:4/94: symlink d5/d8/d9/d12/l20 0 2026-03-10T07:51:21.518 INFO:tasks.workunit.client.1.vm08.stdout:6/145: unlink d1/d3/ce 0 2026-03-10T07:51:21.519 INFO:tasks.workunit.client.1.vm08.stdout:8/162: creat d0/df/d17/f32 x:0 0 0 2026-03-10T07:51:21.521 INFO:tasks.workunit.client.1.vm08.stdout:6/146: write d1/db/d24/f26 [78659,104961] 0 2026-03-10T07:51:21.529 INFO:tasks.workunit.client.1.vm08.stdout:3/115: getdents d0/dd 0 2026-03-10T07:51:21.530 INFO:tasks.workunit.client.1.vm08.stdout:2/147: mknod d0/c33 0 2026-03-10T07:51:21.530 INFO:tasks.workunit.client.1.vm08.stdout:2/148: fsync d0/d1/f24 0 2026-03-10T07:51:21.531 INFO:tasks.workunit.client.1.vm08.stdout:6/147: rename d1/d3/df/d1d/f29 to d1/d3/f2e 0 2026-03-10T07:51:21.531 INFO:tasks.workunit.client.1.vm08.stdout:2/149: fsync d0/d1/d17/d27/f29 0 2026-03-10T07:51:21.532 INFO:tasks.workunit.client.1.vm08.stdout:2/150: write d0/d1/f24 [5053930,95706] 0 2026-03-10T07:51:21.541 INFO:tasks.workunit.client.1.vm08.stdout:8/163: creat 
d0/df/d2e/d30/f33 x:0 0 0
2026-03-10T07:51:21.541 INFO:tasks.workunit.client.1.vm08.stdout:3/116: creat d0/dd/d1f/d25/f2c x:0 0 0
2026-03-10T07:51:21.543 INFO:tasks.workunit.client.1.vm08.stdout:1/123: getdents d2/d6/de 0
2026-03-10T07:51:21.545 INFO:tasks.workunit.client.1.vm08.stdout:8/164: dwrite d0/df/d17/f32 [0,4194304] 0
2026-03-10T07:51:21.567 INFO:tasks.workunit.client.1.vm08.stdout:6/148: unlink d1/f18 0
2026-03-10T07:51:21.570 INFO:tasks.workunit.client.1.vm08.stdout:2/151: dread d0/d1/d3/f8 [0,4194304] 0
2026-03-10T07:51:21.574 INFO:tasks.workunit.client.1.vm08.stdout:3/117: mkdir d0/dd/d1f/d25/d2d 0
2026-03-10T07:51:21.586 INFO:tasks.workunit.client.1.vm08.stdout:0/159: dwrite dd/d18/f21 [0,4194304] 0
2026-03-10T07:51:21.588 INFO:tasks.workunit.client.1.vm08.stdout:5/184: dwrite d0/d4/df/d12/d22/f23 [0,4194304] 0
2026-03-10T07:51:21.589 INFO:tasks.workunit.client.1.vm08.stdout:2/152: sync
2026-03-10T07:51:21.590 INFO:tasks.workunit.client.1.vm08.stdout:5/185: readlink d0/d4/d19/l20 0
2026-03-10T07:51:21.590 INFO:tasks.workunit.client.1.vm08.stdout:5/186: dread - d0/d4/f2e zero size
2026-03-10T07:51:21.595 INFO:tasks.workunit.client.1.vm08.stdout:8/165: rename d0/c1 to d0/df/d17/d25/d28/c34 0
2026-03-10T07:51:21.595 INFO:tasks.workunit.client.1.vm08.stdout:7/177: fsync d3/f16 0
2026-03-10T07:51:21.601 INFO:tasks.workunit.client.1.vm08.stdout:3/118: creat d0/dd/d18/f2e x:0 0 0
2026-03-10T07:51:21.605 INFO:tasks.workunit.client.1.vm08.stdout:9/163: dwrite d2/fd [0,4194304] 0
2026-03-10T07:51:21.605 INFO:tasks.workunit.client.1.vm08.stdout:9/164: dread - d2/de/f27 zero size
2026-03-10T07:51:21.623 INFO:tasks.workunit.client.1.vm08.stdout:0/160: fdatasync dd/f16 0
2026-03-10T07:51:21.633 INFO:tasks.workunit.client.1.vm08.stdout:5/187: fsync d0/d8/f18 0
2026-03-10T07:51:21.634 INFO:tasks.workunit.client.1.vm08.stdout:5/188: chown d0/d4/f2e 10435 1
2026-03-10T07:51:21.636 INFO:tasks.workunit.client.1.vm08.stdout:5/189: dread d0/d4/df/d12/d22/f23 [0,4194304] 0
2026-03-10T07:51:21.637 INFO:tasks.workunit.client.1.vm08.stdout:5/190: read d0/d8/fe [30081,55272] 0
2026-03-10T07:51:21.638 INFO:tasks.workunit.client.1.vm08.stdout:7/178: symlink d3/da/d25/d9/d2f/l3e 0
2026-03-10T07:51:21.639 INFO:tasks.workunit.client.1.vm08.stdout:5/191: dread d0/d4/df/d1e/f25 [0,4194304] 0
2026-03-10T07:51:21.640 INFO:tasks.workunit.client.1.vm08.stdout:5/192: chown d0/d4/c6 6149062 1
2026-03-10T07:51:21.640 INFO:tasks.workunit.client.1.vm08.stdout:5/193: chown d0/d8/f1b 112099939 1
2026-03-10T07:51:21.640 INFO:tasks.workunit.client.1.vm08.stdout:8/166: rename d0/df/d17/f21 to d0/df/d17/f35 0
2026-03-10T07:51:21.641 INFO:tasks.workunit.client.1.vm08.stdout:5/194: chown d0/f17 372247757 1
2026-03-10T07:51:21.641 INFO:tasks.workunit.client.1.vm08.stdout:8/167: write d0/f2a [428189,119308] 0
2026-03-10T07:51:21.641 INFO:tasks.workunit.client.1.vm08.stdout:5/195: chown d0/d4/df/d12/d1c 46624 1
2026-03-10T07:51:21.647 INFO:tasks.workunit.client.1.vm08.stdout:3/119: creat d0/dd/d1f/f2f x:0 0 0
2026-03-10T07:51:21.657 INFO:tasks.workunit.client.1.vm08.stdout:9/165: mknod d2/d3/d25/d2b/c3c 0
2026-03-10T07:51:21.662 INFO:tasks.workunit.client.1.vm08.stdout:0/161: symlink dd/d29/l38 0
2026-03-10T07:51:21.668 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:51:21 vm08.local ceph-mon[59917]: from='mon.0 -' entity='mon.' cmd=[{"prefix": "dashboard get-prometheus-api-host"}]: dispatch
2026-03-10T07:51:21.668 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:51:21 vm08.local ceph-mon[59917]: Upgrade: Updating alertmanager.vm05
2026-03-10T07:51:21.668 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:51:21 vm08.local ceph-mon[59917]: Deploying daemon alertmanager.vm05 on vm05
2026-03-10T07:51:21.668 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:51:21 vm08.local ceph-mon[59917]: pgmap v30: 65 pgs: 65 active+clean; 2.2 GiB data, 7.5 GiB used, 113 GiB / 120 GiB avail; 29 MiB/s rd, 79 MiB/s wr, 208 op/s
2026-03-10T07:51:21.673 INFO:tasks.workunit.client.1.vm08.stdout:2/153: creat d0/d1/d3/d5/d2a/f34 x:0 0 0
2026-03-10T07:51:21.675 INFO:tasks.workunit.client.1.vm08.stdout:4/95: truncate d5/f10 194166 0
2026-03-10T07:51:21.676 INFO:tasks.workunit.client.1.vm08.stdout:4/96: truncate d5/d8/f1d 211847 0
2026-03-10T07:51:21.679 INFO:tasks.workunit.client.1.vm08.stdout:2/154: dwrite d0/f1e [0,4194304] 0
2026-03-10T07:51:21.681 INFO:tasks.workunit.client.1.vm08.stdout:5/196: rename d0/f9 to d0/d4/df/d1e/f37 0
2026-03-10T07:51:21.682 INFO:tasks.workunit.client.1.vm08.stdout:5/197: write d0/d4/df/d12/d1c/f2c [335091,27432] 0
2026-03-10T07:51:21.683 INFO:tasks.workunit.client.1.vm08.stdout:5/198: write d0/d4/f31 [207246,94855] 0
2026-03-10T07:51:21.687 INFO:tasks.workunit.client.1.vm08.stdout:1/124: write d2/d10/f2b [114637,97423] 0
2026-03-10T07:51:21.691 INFO:tasks.workunit.client.1.vm08.stdout:9/166: creat d2/de/f3d x:0 0 0
2026-03-10T07:51:21.691 INFO:tasks.workunit.client.1.vm08.stdout:1/125: chown d2/d6/d11/l25 104529254 1
2026-03-10T07:51:21.691 INFO:tasks.workunit.client.1.vm08.stdout:0/162: mkdir dd/d10/d14/d1b/d39 0
2026-03-10T07:51:21.693 INFO:tasks.workunit.client.1.vm08.stdout:5/199: dwrite d0/d33/f34 [0,4194304] 0
2026-03-10T07:51:21.698 INFO:tasks.workunit.client.1.vm08.stdout:3/120: dwrite d0/fc [4194304,4194304] 0
2026-03-10T07:51:21.701 INFO:tasks.workunit.client.1.vm08.stdout:3/121: stat d0/l11 0
2026-03-10T07:51:21.701 INFO:tasks.workunit.client.1.vm08.stdout:3/122: fdatasync d0/dd/d18/f22 0
2026-03-10T07:51:21.703 INFO:tasks.workunit.client.1.vm08.stdout:1/126: creat d2/d10/f2d x:0 0 0
2026-03-10T07:51:21.710 INFO:tasks.workunit.client.1.vm08.stdout:0/163: dread fb [0,4194304] 0
2026-03-10T07:51:21.712 INFO:tasks.workunit.client.1.vm08.stdout:0/164: chown f6 102771172 1
2026-03-10T07:51:21.712 INFO:tasks.workunit.client.1.vm08.stdout:5/200: mknod d0/d4/df/d12/c38 0
2026-03-10T07:51:21.715 INFO:tasks.workunit.client.1.vm08.stdout:1/127: dwrite d2/d6/de/d1f/f2a [0,4194304] 0
2026-03-10T07:51:21.728 INFO:tasks.workunit.client.1.vm08.stdout:3/123: dwrite d0/dd/d18/f23 [0,4194304] 0
2026-03-10T07:51:21.734 INFO:tasks.workunit.client.1.vm08.stdout:2/155: rename d0/d1/d17/f1c to d0/f35 0
2026-03-10T07:51:21.759 INFO:tasks.workunit.client.1.vm08.stdout:0/165: symlink dd/d18/l3a 0
2026-03-10T07:51:21.759 INFO:tasks.workunit.client.1.vm08.stdout:1/128: mknod d2/d6/d11/c2e 0
2026-03-10T07:51:21.759 INFO:tasks.workunit.client.1.vm08.stdout:3/124: creat d0/dd/d1f/d25/f30 x:0 0 0
2026-03-10T07:51:21.759 INFO:tasks.workunit.client.1.vm08.stdout:2/156: symlink d0/d1/d3/d10/l36 0
2026-03-10T07:51:21.762 INFO:tasks.workunit.client.1.vm08.stdout:3/125: write d0/dd/d1f/d25/f2b [31161,69058] 0
2026-03-10T07:51:21.765 INFO:tasks.workunit.client.1.vm08.stdout:2/157: dwrite d0/f12 [0,4194304] 0
2026-03-10T07:51:21.770 INFO:tasks.workunit.client.1.vm08.stdout:3/126: dwrite d0/dd/d1f/d25/f30 [0,4194304] 0
2026-03-10T07:51:21.772 INFO:tasks.workunit.client.1.vm08.stdout:3/127: fsync d0/dd/f26 0
2026-03-10T07:51:21.772 INFO:tasks.workunit.client.1.vm08.stdout:1/129: dwrite d2/d10/f2d [0,4194304] 0
2026-03-10T07:51:21.773 INFO:tasks.workunit.client.1.vm08.stdout:3/128: readlink d0/dd/l1c 0
2026-03-10T07:51:21.783 INFO:tasks.workunit.client.1.vm08.stdout:9/167: link d2/d26/f29 d2/d3/d25/d30/f3e 0
2026-03-10T07:51:21.783 INFO:tasks.workunit.client.1.vm08.stdout:5/201: link d0/d4/df/d1e/l2b d0/d33/l39 0
2026-03-10T07:51:21.783 INFO:tasks.workunit.client.1.vm08.stdout:0/166: getdents dd/d10/d2f/d32 0
2026-03-10T07:51:21.784 INFO:tasks.workunit.client.1.vm08.stdout:2/158: fsync d0/d1/d3/f8 0
2026-03-10T07:51:21.784 INFO:tasks.workunit.client.1.vm08.stdout:9/168: write d2/fd [4904563,128847] 0
2026-03-10T07:51:21.786 INFO:tasks.workunit.client.1.vm08.stdout:0/167: truncate dd/d10/d14/d15/d20/d22/f2e 508689 0
2026-03-10T07:51:21.786 INFO:tasks.workunit.client.1.vm08.stdout:2/159: read - d0/d1/d3/d5/d2a/f34 zero size
2026-03-10T07:51:21.790 INFO:tasks.workunit.client.1.vm08.stdout:2/160: chown d0/d1/d3/d5/l16 0 1
2026-03-10T07:51:21.793 INFO:tasks.workunit.client.1.vm08.stdout:5/202: mkdir d0/d4/d19/d3a 0
2026-03-10T07:51:21.796 INFO:tasks.workunit.client.1.vm08.stdout:7/179: dwrite d3/f4 [8388608,4194304] 0
2026-03-10T07:51:21.798 INFO:tasks.workunit.client.1.vm08.stdout:1/130: dwrite d2/f4 [0,4194304] 0
2026-03-10T07:51:21.799 INFO:tasks.workunit.client.1.vm08.stdout:7/180: fsync d3/da/d25/f35 0
2026-03-10T07:51:21.800 INFO:tasks.workunit.client.1.vm08.stdout:1/131: stat d2/d6/de 0
2026-03-10T07:51:21.801 INFO:tasks.workunit.client.1.vm08.stdout:5/203: dread d0/d4/df/d12/d1c/f2c [0,4194304] 0
2026-03-10T07:51:21.808 INFO:tasks.workunit.client.1.vm08.stdout:9/169: mknod d2/d3/d25/c3f 0
2026-03-10T07:51:21.815 INFO:tasks.workunit.client.1.vm08.stdout:9/170: write d2/de/f16 [207282,97711] 0
2026-03-10T07:51:21.815 INFO:tasks.workunit.client.1.vm08.stdout:0/168: mknod dd/d10/d14/c3b 0
2026-03-10T07:51:21.815 INFO:tasks.workunit.client.1.vm08.stdout:6/149: truncate d1/f7 192852 0
2026-03-10T07:51:21.815 INFO:tasks.workunit.client.1.vm08.stdout:2/161: mknod d0/d1/d17/d27/c37 0
2026-03-10T07:51:21.815 INFO:tasks.workunit.client.1.vm08.stdout:2/162: dwrite d0/f12 [0,4194304] 0
2026-03-10T07:51:21.824 INFO:tasks.workunit.client.1.vm08.stdout:2/163: dwrite d0/d1/d17/f1a [0,4194304] 0
2026-03-10T07:51:21.830 INFO:tasks.workunit.client.1.vm08.stdout:8/168: truncate d0/df/f1b 3707035 0
2026-03-10T07:51:21.830 INFO:tasks.workunit.client.1.vm08.stdout:8/169: chown d0/df/f26 13 1
2026-03-10T07:51:21.831 INFO:tasks.workunit.client.1.vm08.stdout:5/204: truncate d0/d8/f1b 1441399 0
2026-03-10T07:51:21.831 INFO:tasks.workunit.client.1.vm08.stdout:5/205: chown d0/d4/df/f2a 166483955 1
2026-03-10T07:51:21.832 INFO:tasks.workunit.client.1.vm08.stdout:9/171: symlink d2/d3/d25/d2b/l40 0
2026-03-10T07:51:21.833 INFO:tasks.workunit.client.1.vm08.stdout:9/172: read d2/d3/f12 [27437,28366] 0
2026-03-10T07:51:21.834 INFO:tasks.workunit.client.1.vm08.stdout:5/206: dread d0/d33/f34 [0,4194304] 0
2026-03-10T07:51:21.849 INFO:tasks.workunit.client.1.vm08.stdout:9/173: creat d2/f41 x:0 0 0
2026-03-10T07:51:21.849 INFO:tasks.workunit.client.1.vm08.stdout:7/181: read d3/da/d25/f27 [482097,32225] 0
2026-03-10T07:51:21.850 INFO:tasks.workunit.client.1.vm08.stdout:6/150: symlink d1/l2f 0
2026-03-10T07:51:21.855 INFO:tasks.workunit.client.1.vm08.stdout:7/182: dwrite d3/da/f17 [0,4194304] 0
2026-03-10T07:51:21.863 INFO:tasks.workunit.client.1.vm08.stdout:6/151: write d1/d3/df/f12 [1735264,123534] 0
2026-03-10T07:51:21.863 INFO:tasks.workunit.client.1.vm08.stdout:1/132: getdents d2/d6/de/d1f/d22 0
2026-03-10T07:51:21.863 INFO:tasks.workunit.client.1.vm08.stdout:1/133: write d2/f13 [1039025,96072] 0
2026-03-10T07:51:21.863 INFO:tasks.workunit.client.1.vm08.stdout:1/134: stat d2/d6/de/c21 0
2026-03-10T07:51:21.863 INFO:tasks.workunit.client.1.vm08.stdout:0/169: getdents dd/d10/d14/d1b/d30 0
2026-03-10T07:51:21.867 INFO:tasks.workunit.client.1.vm08.stdout:0/170: read dd/d10/d14/d15/d20/d22/f2e [464411,78281] 0
2026-03-10T07:51:21.871 INFO:tasks.workunit.client.1.vm08.stdout:7/183: truncate d3/da/f1d 1771581 0
2026-03-10T07:51:21.876 INFO:tasks.workunit.client.1.vm08.stdout:6/152: truncate d1/d17/f20 525976 0
2026-03-10T07:51:21.876 INFO:tasks.workunit.client.1.vm08.stdout:7/184: dwrite d3/da/d25/f1e [0,4194304] 0
2026-03-10T07:51:21.876 INFO:tasks.workunit.client.1.vm08.stdout:5/207: link d0/d4/df/d1e/f37 d0/f3b 0
2026-03-10T07:51:21.882 INFO:tasks.workunit.client.1.vm08.stdout:0/171: rename dd/d10/f12 to dd/d18/f3c 0
2026-03-10T07:51:21.882 INFO:tasks.workunit.client.1.vm08.stdout:1/135: creat d2/d6/de/d1f/d26/f2f x:0 0 0
2026-03-10T07:51:21.883 INFO:tasks.workunit.client.1.vm08.stdout:6/153: mknod d1/d3/df/c30 0
2026-03-10T07:51:21.888 INFO:tasks.workunit.client.1.vm08.stdout:5/208: dwrite d0/d33/f34 [0,4194304] 0
2026-03-10T07:51:21.890 INFO:tasks.workunit.client.1.vm08.stdout:0/172: mkdir dd/d10/d2f/d32/d3d 0
2026-03-10T07:51:21.900 INFO:tasks.workunit.client.1.vm08.stdout:1/136: dwrite d2/d6/d11/f1e [0,4194304] 0
2026-03-10T07:51:21.900 INFO:tasks.workunit.client.1.vm08.stdout:5/209: truncate d0/d4/f2e 210883 0
2026-03-10T07:51:21.907 INFO:tasks.workunit.client.1.vm08.stdout:0/173: symlink dd/d10/d14/d15/d20/l3e 0
2026-03-10T07:51:21.907 INFO:tasks.workunit.client.1.vm08.stdout:0/174: chown dd/d10/d14/d1b/d30/d31 11957549 1
2026-03-10T07:51:21.908 INFO:tasks.workunit.client.1.vm08.stdout:0/175: chown dd/d29/f2a 4 1
2026-03-10T07:51:21.914 INFO:tasks.workunit.client.1.vm08.stdout:7/185: sync
2026-03-10T07:51:21.929 INFO:tasks.workunit.client.1.vm08.stdout:1/137: creat d2/d6/de/d1f/d22/f30 x:0 0 0
2026-03-10T07:51:21.930 INFO:tasks.workunit.client.1.vm08.stdout:5/210: creat d0/d4/df/d12/d22/f3c x:0 0 0
2026-03-10T07:51:21.930 INFO:tasks.workunit.client.1.vm08.stdout:7/186: mknod d3/da/d25/c3f 0
2026-03-10T07:51:21.931 INFO:tasks.workunit.client.1.vm08.stdout:7/187: chown d3/da/d25/c28 2201 1
2026-03-10T07:51:21.934 INFO:tasks.workunit.client.1.vm08.stdout:5/211: dwrite d0/d4/df/d12/f11 [0,4194304] 0
2026-03-10T07:51:21.936 INFO:tasks.workunit.client.1.vm08.stdout:0/176: link dd/d18/f21 dd/d10/d14/d1b/d39/f3f 0
2026-03-10T07:51:21.941 INFO:tasks.workunit.client.1.vm08.stdout:7/188: sync
2026-03-10T07:51:21.941 INFO:tasks.workunit.client.1.vm08.stdout:5/212: rename d0/d4/df/d12/d22/f3c to d0/d4/df/d2d/f3d 0
2026-03-10T07:51:21.945 INFO:tasks.workunit.client.1.vm08.stdout:7/189: dwrite d3/f16 [0,4194304] 0
2026-03-10T07:51:21.960 INFO:tasks.workunit.client.1.vm08.stdout:7/190: sync
2026-03-10T07:51:21.988 INFO:tasks.workunit.client.1.vm08.stdout:7/191: sync
2026-03-10T07:51:21.994 INFO:tasks.workunit.client.1.vm08.stdout:7/192: dwrite d3/da/d25/d9/fd [0,4194304] 0
2026-03-10T07:51:21.997 INFO:tasks.workunit.client.1.vm08.stdout:7/193: fsync d3/da/d25/f1e 0
2026-03-10T07:51:22.003 INFO:tasks.workunit.client.1.vm08.stdout:7/194: write d3/f2e [164355,27424] 0
2026-03-10T07:51:22.003 INFO:tasks.workunit.client.1.vm08.stdout:1/138: dread d2/d6/f18 [0,4194304] 0
2026-03-10T07:51:22.004 INFO:tasks.workunit.client.1.vm08.stdout:1/139: mknod d2/d6/c31 0
2026-03-10T07:51:22.006 INFO:tasks.workunit.client.1.vm08.stdout:1/140: creat d2/d6/de/f32 x:0 0 0
2026-03-10T07:51:22.007 INFO:tasks.workunit.client.1.vm08.stdout:1/141: creat d2/d6/de/d1f/f33 x:0 0 0
2026-03-10T07:51:22.007 INFO:tasks.workunit.client.1.vm08.stdout:1/142: mkdir d2/d34 0
2026-03-10T07:51:22.059 INFO:tasks.workunit.client.1.vm08.stdout:3/129: rmdir d0/dd/d1f/d25 39
2026-03-10T07:51:22.061 INFO:tasks.workunit.client.1.vm08.stdout:2/164: write d0/d1/d3/f8 [462666,7564] 0
2026-03-10T07:51:22.062 INFO:tasks.workunit.client.1.vm08.stdout:2/165: write d0/d1/f24 [1159104,11826] 0
2026-03-10T07:51:22.063 INFO:tasks.workunit.client.1.vm08.stdout:2/166: fdatasync d0/d1/d3/d5/d1b/f23 0
2026-03-10T07:51:22.071 INFO:tasks.workunit.client.1.vm08.stdout:0/177: getdents dd/d10/d14 0
2026-03-10T07:51:22.085 INFO:tasks.workunit.client.1.vm08.stdout:2/167: fsync d0/d1/d3/f8 0
2026-03-10T07:51:22.094 INFO:tasks.workunit.client.1.vm08.stdout:3/130: link d0/dd/d1f/d25/l2a d0/l31 0
2026-03-10T07:51:22.099 INFO:tasks.workunit.client.1.vm08.stdout:2/168: mkdir d0/d1/d3/d10/d38 0
2026-03-10T07:51:22.114 INFO:tasks.workunit.client.1.vm08.stdout:8/170: dwrite d0/f22 [0,4194304] 0
2026-03-10T07:51:22.116 INFO:tasks.workunit.client.1.vm08.stdout:8/171: truncate d0/df/f26 740098 0
2026-03-10T07:51:22.123 INFO:tasks.workunit.client.1.vm08.stdout:3/131: unlink d0/dd/d1f/d25/f30 0
2026-03-10T07:51:22.126 INFO:tasks.workunit.client.1.vm08.stdout:2/169: write d0/f35 [419139,69603] 0
2026-03-10T07:51:22.129 INFO:tasks.workunit.client.1.vm08.stdout:4/97: truncate d5/f10 120100 0
2026-03-10T07:51:22.129 INFO:tasks.workunit.client.1.vm08.stdout:4/98: fsync d5/d8/ff 0
2026-03-10T07:51:22.130 INFO:tasks.workunit.client.1.vm08.stdout:9/174: truncate d2/f5 1167936 0
2026-03-10T07:51:22.133 INFO:tasks.workunit.client.1.vm08.stdout:3/132: mkdir d0/dd/d18/d32 0
2026-03-10T07:51:22.133 INFO:tasks.workunit.client.1.vm08.stdout:3/133: dread - d0/dd/d18/f2e zero size
2026-03-10T07:51:22.137 INFO:tasks.workunit.client.1.vm08.stdout:3/134: dwrite d0/dd/d18/f23 [4194304,4194304] 0
2026-03-10T07:51:22.148 INFO:tasks.workunit.client.1.vm08.stdout:4/99: rmdir d5/d8 39
2026-03-10T07:51:22.151 INFO:tasks.workunit.client.1.vm08.stdout:4/100: dwrite f0 [0,4194304] 0
2026-03-10T07:51:22.156 INFO:tasks.workunit.client.1.vm08.stdout:9/175: symlink d2/d26/l42 0
2026-03-10T07:51:22.165 INFO:tasks.workunit.client.1.vm08.stdout:2/170: mkdir d0/d1/d3/d39 0
2026-03-10T07:51:22.172 INFO:tasks.workunit.client.1.vm08.stdout:3/135: mknod d0/dd/d18/c33 0
2026-03-10T07:51:22.172 INFO:tasks.workunit.client.1.vm08.stdout:5/213: getdents d0/d4/df/d12/d22 0
2026-03-10T07:51:22.174 INFO:tasks.workunit.client.1.vm08.stdout:7/195: truncate d3/da/d25/f1e 4077401 0
2026-03-10T07:51:22.175 INFO:tasks.workunit.client.1.vm08.stdout:7/196: readlink d3/da/d25/d9/l1c 0
2026-03-10T07:51:22.175 INFO:tasks.workunit.client.1.vm08.stdout:7/197: chown d3/l22 181334 1
2026-03-10T07:51:22.180 INFO:tasks.workunit.client.1.vm08.stdout:4/101: dread - d5/d8/d9/f1b zero size
2026-03-10T07:51:22.180 INFO:tasks.workunit.client.1.vm08.stdout:1/143: truncate d2/f13 33861 0
2026-03-10T07:51:22.181 INFO:tasks.workunit.client.1.vm08.stdout:5/214: sync
2026-03-10T07:51:22.188 INFO:tasks.workunit.client.1.vm08.stdout:2/171: mkdir d0/d1/d17/d27/d3a 0
2026-03-10T07:51:22.189 INFO:tasks.workunit.client.1.vm08.stdout:2/172: chown d0/d1/d3/d5/cc 36213 1
2026-03-10T07:51:22.190 INFO:tasks.workunit.client.1.vm08.stdout:3/136: mkdir d0/dd/d1f/d25/d34 0
2026-03-10T07:51:22.194 INFO:tasks.workunit.client.1.vm08.stdout:6/154: dwrite d1/f7 [0,4194304] 0
2026-03-10T07:51:22.213 INFO:tasks.workunit.client.1.vm08.stdout:0/178: truncate f6 3419412 0
2026-03-10T07:51:22.214 INFO:tasks.workunit.client.1.vm08.stdout:8/172: getdents d0/df/d17 0
2026-03-10T07:51:22.218 INFO:tasks.workunit.client.1.vm08.stdout:8/173: dwrite d0/df/d17/d25/f2b [0,4194304] 0
2026-03-10T07:51:22.221 INFO:tasks.workunit.client.1.vm08.stdout:0/179: sync
2026-03-10T07:51:22.223 INFO:tasks.workunit.client.1.vm08.stdout:8/174: chown d0/f20 158003 1
2026-03-10T07:51:22.229 INFO:tasks.workunit.client.1.vm08.stdout:2/173: unlink d0/d1/c11 0
2026-03-10T07:51:22.230 INFO:tasks.workunit.client.1.vm08.stdout:7/198: mkdir d3/da/d25/d9/d2f/d3a/d40 0
2026-03-10T07:51:22.230 INFO:tasks.workunit.client.1.vm08.stdout:3/137: chown d0/dd/d1f/d25/l2a 2042753 1
2026-03-10T07:51:22.230 INFO:tasks.workunit.client.1.vm08.stdout:2/174: chown d0/d1/d3/d10/l36 101384204 1
2026-03-10T07:51:22.251 INFO:tasks.workunit.client.1.vm08.stdout:0/180: rmdir dd/d10/d2f 39
2026-03-10T07:51:22.256 INFO:tasks.workunit.client.1.vm08.stdout:5/215: creat d0/d8/d24/f3e x:0 0 0
2026-03-10T07:51:22.258 INFO:tasks.workunit.client.1.vm08.stdout:5/216: sync
2026-03-10T07:51:22.272 INFO:tasks.workunit.client.1.vm08.stdout:8/175: creat d0/df/d15/f36 x:0 0 0
2026-03-10T07:51:22.273 INFO:tasks.workunit.client.1.vm08.stdout:8/176: dread d0/df/f26 [0,4194304] 0
2026-03-10T07:51:22.279 INFO:tasks.workunit.client.1.vm08.stdout:3/138: unlink d0/dd/d1f/f2f 0
2026-03-10T07:51:22.285 INFO:tasks.workunit.client.1.vm08.stdout:7/199: creat d3/da/d25/d9/f41 x:0 0 0
2026-03-10T07:51:22.286 INFO:tasks.workunit.client.1.vm08.stdout:2/175: chown d0/d1/d3/c2b 113982001 1
2026-03-10T07:51:22.286 INFO:tasks.workunit.client.1.vm08.stdout:2/176: stat d0/d1/d17/c28 0
2026-03-10T07:51:22.286 INFO:tasks.workunit.client.1.vm08.stdout:7/200: dread d3/da/d25/f32 [0,4194304] 0
2026-03-10T07:51:22.287 INFO:tasks.workunit.client.1.vm08.stdout:7/201: read d3/da/d25/d9/f30 [2210808,101292] 0
2026-03-10T07:51:22.290 INFO:tasks.workunit.client.1.vm08.stdout:6/155: mknod d1/db/c31 0
2026-03-10T07:51:22.295 INFO:tasks.workunit.client.1.vm08.stdout:4/102: creat d5/f21 x:0 0 0
2026-03-10T07:51:22.296 INFO:tasks.workunit.client.1.vm08.stdout:4/103: truncate d5/d8/f11 4752121 0
2026-03-10T07:51:22.300 INFO:tasks.workunit.client.1.vm08.stdout:7/202: dread d3/f2b [0,4194304] 0
2026-03-10T07:51:22.300 INFO:tasks.workunit.client.1.vm08.stdout:7/203: fdatasync d3/da/d25/d9/fd 0
2026-03-10T07:51:22.301 INFO:tasks.workunit.client.1.vm08.stdout:9/176: write d2/d3/d25/d30/f3e [1294585,92618] 0
2026-03-10T07:51:22.302 INFO:tasks.workunit.client.1.vm08.stdout:9/177: stat d2/de/f1e 0
2026-03-10T07:51:22.305 INFO:tasks.workunit.client.1.vm08.stdout:8/177: mkdir d0/d37 0
2026-03-10T07:51:22.306 INFO:tasks.workunit.client.1.vm08.stdout:8/178: write d0/df/d17/d25/f2b [2102015,88873] 0
2026-03-10T07:51:22.308 INFO:tasks.workunit.client.1.vm08.stdout:8/179: dread - d0/f20 zero size
2026-03-10T07:51:22.309 INFO:tasks.workunit.client.1.vm08.stdout:9/178: dwrite d2/de/f16 [0,4194304] 0
2026-03-10T07:51:22.318 INFO:tasks.workunit.client.1.vm08.stdout:4/104: rename d5/d8/fd to d5/d17/f22 0
2026-03-10T07:51:22.321 INFO:tasks.workunit.client.1.vm08.stdout:6/156: dwrite d1/d3/f2e [4194304,4194304] 0
2026-03-10T07:51:22.322 INFO:tasks.workunit.client.1.vm08.stdout:6/157: write d1/f7 [2929339,10535] 0
2026-03-10T07:51:22.328 INFO:tasks.workunit.client.1.vm08.stdout:5/217: creat d0/d4/d19/d3a/f3f x:0 0 0
2026-03-10T07:51:22.330 INFO:tasks.workunit.client.1.vm08.stdout:1/144: link d2/d10/f2b d2/d6/de/d1f/d22/f35 0
2026-03-10T07:51:22.333 INFO:tasks.workunit.client.1.vm08.stdout:1/145: dread d2/d6/de/f15 [0,4194304] 0
2026-03-10T07:51:22.339 INFO:tasks.workunit.client.1.vm08.stdout:3/139: creat d0/dd/d1f/d25/d2d/f35 x:0 0 0
2026-03-10T07:51:22.341 INFO:tasks.workunit.client.1.vm08.stdout:3/140: dread d0/fc [4194304,4194304] 0
2026-03-10T07:51:22.342 INFO:tasks.workunit.client.1.vm08.stdout:3/141: chown d0/dd/d1f/l29 7773 1
2026-03-10T07:51:22.342 INFO:tasks.workunit.client.1.vm08.stdout:8/180: mknod d0/df/d2e/c38 0
2026-03-10T07:51:22.343 INFO:tasks.workunit.client.1.vm08.stdout:8/181: write d0/df/d2e/d30/f33 [735941,59406] 0
2026-03-10T07:51:22.350 INFO:tasks.workunit.client.1.vm08.stdout:2/177: creat d0/d1/d3/d39/f3b x:0 0 0
2026-03-10T07:51:22.360 INFO:tasks.workunit.client.1.vm08.stdout:7/204: rename d3/f38 to d3/da/d25/d9/d2f/f42 0
2026-03-10T07:51:22.361 INFO:tasks.workunit.client.1.vm08.stdout:7/205: truncate d3/da/d25/d9/f41 190557 0
2026-03-10T07:51:22.362 INFO:tasks.workunit.client.1.vm08.stdout:4/105: symlink d5/d8/d9/d12/l23 0
2026-03-10T07:51:22.366 INFO:tasks.workunit.client.1.vm08.stdout:4/106: dwrite d5/d8/f1d [0,4194304] 0
2026-03-10T07:51:22.370 INFO:tasks.workunit.client.1.vm08.stdout:4/107: dwrite d5/d8/d9/f1b [0,4194304] 0
2026-03-10T07:51:22.371 INFO:tasks.workunit.client.1.vm08.stdout:4/108: stat f1 0
2026-03-10T07:51:22.390 INFO:tasks.workunit.client.1.vm08.stdout:5/218: mkdir d0/d4/df/d12/d1c/d40 0
2026-03-10T07:51:22.391 INFO:tasks.workunit.client.1.vm08.stdout:5/219: chown d0/d4/df/d12/d22/l2f 301 1
2026-03-10T07:51:22.391 INFO:tasks.workunit.client.1.vm08.stdout:5/220: write d0/d4/f31 [1015750,83202] 0
2026-03-10T07:51:22.411 INFO:tasks.workunit.client.1.vm08.stdout:2/178: creat d0/d1/d3/d5/d1b/f3c x:0 0 0
2026-03-10T07:51:22.436 INFO:tasks.workunit.client.1.vm08.stdout:0/181: getdents dd/d10/d14/d15 0
2026-03-10T07:51:22.446 INFO:tasks.workunit.client.1.vm08.stdout:3/142: creat d0/dd/d1f/d25/d34/f36 x:0 0 0
2026-03-10T07:51:22.450 INFO:tasks.workunit.client.1.vm08.stdout:5/221: dwrite d0/d4/df/d12/d1c/f2c [0,4194304] 0
2026-03-10T07:51:22.451 INFO:tasks.workunit.client.1.vm08.stdout:5/222: chown d0/d4/df/d12/d22/f23 117518 1
2026-03-10T07:51:22.452 INFO:tasks.workunit.client.1.vm08.stdout:5/223: chown d0/d4/c21 1 1
2026-03-10T07:51:22.460 INFO:tasks.workunit.client.1.vm08.stdout:8/182: truncate d0/df/f12 16192 0
2026-03-10T07:51:22.467 INFO:tasks.workunit.client.1.vm08.stdout:6/158: rename d1/cd to d1/d3/df/d1d/c32 0
2026-03-10T07:51:22.471 INFO:tasks.workunit.client.1.vm08.stdout:7/206: truncate d3/da/f1d 1150696 0
2026-03-10T07:51:22.471 INFO:tasks.workunit.client.1.vm08.stdout:6/159: read d1/db/d24/f26 [219841,65052] 0
2026-03-10T07:51:22.473 INFO:tasks.workunit.client.1.vm08.stdout:3/143: unlink d0/dd/d1f/d25/f2c 0
2026-03-10T07:51:22.480 INFO:tasks.workunit.client.1.vm08.stdout:3/144: dwrite d0/f13 [0,4194304] 0
2026-03-10T07:51:22.486 INFO:tasks.workunit.client.1.vm08.stdout:5/224: mkdir d0/d4/df/d1e/d41 0
2026-03-10T07:51:22.494 INFO:tasks.workunit.client.1.vm08.stdout:9/179: dwrite d2/f5 [0,4194304] 0
2026-03-10T07:51:22.497 INFO:tasks.workunit.client.1.vm08.stdout:8/183: mkdir d0/df/d15/d23/d39 0
2026-03-10T07:51:22.516 INFO:tasks.workunit.client.1.vm08.stdout:4/109: creat d5/f24 x:0 0 0
2026-03-10T07:51:22.522 INFO:tasks.workunit.client.1.vm08.stdout:9/180: unlink d2/d3/d25/d30/f3e 0
2026-03-10T07:51:22.523 INFO:tasks.workunit.client.1.vm08.stdout:9/181: write d2/de/f3d [620343,85513] 0
2026-03-10T07:51:22.524 INFO:tasks.workunit.client.1.vm08.stdout:8/184: fsync d0/df/f19 0
2026-03-10T07:51:22.525 INFO:tasks.workunit.client.1.vm08.stdout:8/185: stat d0/c5 0
2026-03-10T07:51:22.526 INFO:tasks.workunit.client.1.vm08.stdout:7/207: mkdir d3/da/d25/d9/d2f/d39/d43 0
2026-03-10T07:51:22.526 INFO:tasks.workunit.client.1.vm08.stdout:1/146: link d2/f13 d2/f36 0
2026-03-10T07:51:22.527 INFO:tasks.workunit.client.1.vm08.stdout:6/160: getdents d1/d17/d2b 0
2026-03-10T07:51:22.531 INFO:tasks.workunit.client.1.vm08.stdout:8/186: dwrite d0/f20 [0,4194304] 0
2026-03-10T07:51:22.536 INFO:tasks.workunit.client.1.vm08.stdout:8/187: dwrite d0/df/f13 [4194304,4194304] 0
2026-03-10T07:51:22.541 INFO:tasks.workunit.client.1.vm08.stdout:8/188: dwrite d0/df/d17/d25/f2b [0,4194304] 0
2026-03-10T07:51:22.543 INFO:tasks.workunit.client.1.vm08.stdout:0/182: getdents dd/d10/d2f/d37 0
2026-03-10T07:51:22.543 INFO:tasks.workunit.client.1.vm08.stdout:3/145: mknod d0/dd/d18/d32/c37 0
2026-03-10T07:51:22.543 INFO:tasks.workunit.client.1.vm08.stdout:5/225: link d0/d33/f34 d0/d4/df/d1e/f42 0
2026-03-10T07:51:22.544 INFO:tasks.workunit.client.1.vm08.stdout:9/182: symlink d2/d3/d25/d30/l43 0
2026-03-10T07:51:22.544 INFO:tasks.workunit.client.1.vm08.stdout:1/147: unlink d2/d6/c31 0
2026-03-10T07:51:22.545 INFO:tasks.workunit.client.1.vm08.stdout:1/148: chown d2/d6/d11/l23 137 1
2026-03-10T07:51:22.546 INFO:tasks.workunit.client.1.vm08.stdout:1/149: chown d2/d6/de/c21 73057 1
2026-03-10T07:51:22.547 INFO:tasks.workunit.client.1.vm08.stdout:4/110: link d5/f24 d5/d1f/f25 0
2026-03-10T07:51:22.548 INFO:tasks.workunit.client.1.vm08.stdout:5/226: dwrite d0/d4/df/d12/f11 [0,4194304] 0
2026-03-10T07:51:22.560 INFO:tasks.workunit.client.1.vm08.stdout:3/146: creat d0/dd/d18/f38 x:0 0 0
2026-03-10T07:51:22.560 INFO:tasks.workunit.client.1.vm08.stdout:9/183: creat d2/f44 x:0 0 0
2026-03-10T07:51:22.560 INFO:tasks.workunit.client.1.vm08.stdout:7/208: rename d3/da/l26 to d3/da/d25/d9/d2f/l44 0
2026-03-10T07:51:22.560 INFO:tasks.workunit.client.1.vm08.stdout:3/147: dread - d0/dd/d18/f2e zero size
2026-03-10T07:51:22.569 INFO:tasks.workunit.client.1.vm08.stdout:8/189: symlink d0/df/d15/d23/d39/l3a 0
2026-03-10T07:51:22.598 INFO:tasks.workunit.client.1.vm08.stdout:4/111: fsync d5/d8/f1e 0
2026-03-10T07:51:22.604 INFO:tasks.workunit.client.1.vm08.stdout:4/112: stat d5/f24 0
2026-03-10T07:51:22.604 INFO:tasks.workunit.client.1.vm08.stdout:6/161: rename d1/d3/df/f1e to d1/db/f33 0
2026-03-10T07:51:22.604 INFO:tasks.workunit.client.1.vm08.stdout:6/162: dread d1/d3/f19 [0,4194304] 0
2026-03-10T07:51:22.604 INFO:tasks.workunit.client.1.vm08.stdout:6/163: readlink d1/d3/l27 0
2026-03-10T07:51:22.604 INFO:tasks.workunit.client.1.vm08.stdout:9/184: symlink d2/d3/d25/d30/d35/l45 0
2026-03-10T07:51:22.604 INFO:tasks.workunit.client.1.vm08.stdout:8/190: mkdir d0/d3b 0
2026-03-10T07:51:22.604 INFO:tasks.workunit.client.1.vm08.stdout:8/191: fsync d0/f6 0
2026-03-10T07:51:22.604 INFO:tasks.workunit.client.1.vm08.stdout:9/185: rename d2/f41 to d2/d3/d25/d30/f46 0
2026-03-10T07:51:22.604 INFO:tasks.workunit.client.1.vm08.stdout:6/164: truncate d1/f2 5126317 0
2026-03-10T07:51:22.604 INFO:tasks.workunit.client.1.vm08.stdout:8/192: creat d0/df/d2e/f3c x:0 0 0
2026-03-10T07:51:22.604 INFO:tasks.workunit.client.1.vm08.stdout:3/148: creat d0/f39 x:0 0 0
2026-03-10T07:51:22.604 INFO:tasks.workunit.client.1.vm08.stdout:4/113: link d5/d1f/f25 d5/f26 0
2026-03-10T07:51:22.604 INFO:tasks.workunit.client.1.vm08.stdout:3/149: creat d0/dd/d1f/d25/d2d/f3a x:0 0 0
2026-03-10T07:51:22.604 INFO:tasks.workunit.client.1.vm08.stdout:8/193: dwrite d0/f22 [0,4194304] 0
2026-03-10T07:51:22.604 INFO:tasks.workunit.client.1.vm08.stdout:3/150: write d0/dd/d1f/d25/f2b [31662,122126] 0
2026-03-10T07:51:22.604 INFO:tasks.workunit.client.1.vm08.stdout:6/165: dwrite d1/db/f33 [0,4194304] 0
2026-03-10T07:51:22.604 INFO:tasks.workunit.client.1.vm08.stdout:4/114: creat d5/f27 x:0 0 0
2026-03-10T07:51:22.605 INFO:tasks.workunit.client.1.vm08.stdout:9/186: link d2/d3/d25/f21 d2/d3/d25/d30/f47 0
2026-03-10T07:51:22.605 INFO:tasks.workunit.client.1.vm08.stdout:3/151: mknod d0/c3b 0
2026-03-10T07:51:22.605 INFO:tasks.workunit.client.1.vm08.stdout:9/187: mknod d2/d3/d25/d2b/c48 0
2026-03-10T07:51:22.605 INFO:tasks.workunit.client.1.vm08.stdout:4/115: symlink d5/l28 0
2026-03-10T07:51:22.605 INFO:tasks.workunit.client.1.vm08.stdout:4/116: chown d5/d8/d9/f1b 0 1
2026-03-10T07:51:22.605 INFO:tasks.workunit.client.1.vm08.stdout:8/194: getdents d0/df/d17 0
2026-03-10T07:51:22.605 INFO:tasks.workunit.client.1.vm08.stdout:4/117: chown d5 222248 1
2026-03-10T07:51:22.605 INFO:tasks.workunit.client.1.vm08.stdout:6/166: creat d1/f34 x:0 0 0
2026-03-10T07:51:22.608 INFO:tasks.workunit.client.1.vm08.stdout:8/195: creat d0/df/d15/d23/f3d x:0 0 0
2026-03-10T07:51:22.609 INFO:tasks.workunit.client.1.vm08.stdout:4/118: dread d5/d8/fc [0,4194304] 0
2026-03-10T07:51:22.616 INFO:tasks.workunit.client.1.vm08.stdout:4/119: fsync d5/d8/f1e 0
2026-03-10T07:51:22.620 INFO:tasks.workunit.client.1.vm08.stdout:4/120: write d5/d8/d9/f18 [129567,61658] 0
2026-03-10T07:51:22.620 INFO:tasks.workunit.client.1.vm08.stdout:6/167: creat d1/f35 x:0 0 0
2026-03-10T07:51:22.628 INFO:tasks.workunit.client.1.vm08.stdout:6/168: read d1/db/d24/f26 [90935,118628] 0
2026-03-10T07:51:22.642 INFO:tasks.workunit.client.1.vm08.stdout:6/169: dwrite d1/d3/f21 [0,4194304] 0
2026-03-10T07:51:22.642 INFO:tasks.workunit.client.1.vm08.stdout:6/170: stat d1/db/c1b 0
2026-03-10T07:51:22.643 INFO:tasks.workunit.client.1.vm08.stdout:4/121: dread d5/f10 [0,4194304] 0
2026-03-10T07:51:22.650 INFO:tasks.workunit.client.1.vm08.stdout:4/122: mknod d5/d17/c29 0
2026-03-10T07:51:22.660 INFO:tasks.workunit.client.1.vm08.stdout:4/123: write f2 [3227352,43656] 0
2026-03-10T07:51:22.660 INFO:tasks.workunit.client.1.vm08.stdout:6/171: dread d1/d3/df/d1d/f1f [0,4194304] 0
2026-03-10T07:51:22.660 INFO:tasks.workunit.client.1.vm08.stdout:6/172: write d1/d3/df/d1d/f2a [339879,10371] 0
2026-03-10T07:51:22.660 INFO:tasks.workunit.client.1.vm08.stdout:4/124: fsync d5/f10 0
2026-03-10T07:51:22.660 INFO:tasks.workunit.client.1.vm08.stdout:4/125: symlink d5/d8/l2a 0
2026-03-10T07:51:22.660 INFO:tasks.workunit.client.1.vm08.stdout:8/196: dread d0/fa [0,4194304] 0
2026-03-10T07:51:22.660 INFO:tasks.workunit.client.1.vm08.stdout:4/126: creat d5/d8/d9/f2b x:0 0 0
2026-03-10T07:51:22.662 INFO:tasks.workunit.client.1.vm08.stdout:4/127: mknod d5/d1f/c2c 0
2026-03-10T07:51:22.663 INFO:tasks.workunit.client.1.vm08.stdout:6/173: dread d1/f2 [0,4194304] 0
2026-03-10T07:51:22.665 INFO:tasks.workunit.client.1.vm08.stdout:4/128: unlink d5/f24 0
2026-03-10T07:51:22.665 INFO:tasks.workunit.client.1.vm08.stdout:6/174: dread - d1/f28 zero size
2026-03-10T07:51:22.670 INFO:tasks.workunit.client.1.vm08.stdout:4/129: creat d5/f2d x:0 0 0
2026-03-10T07:51:22.671 INFO:tasks.workunit.client.1.vm08.stdout:6/175: dwrite d1/f35 [0,4194304] 0
2026-03-10T07:51:22.672 INFO:tasks.workunit.client.1.vm08.stdout:4/130: mknod d5/d8/c2e 0
2026-03-10T07:51:22.673 INFO:tasks.workunit.client.1.vm08.stdout:4/131: readlink d5/l14 0
2026-03-10T07:51:22.673 INFO:tasks.workunit.client.1.vm08.stdout:6/176: mknod d1/d17/d2b/c36 0
2026-03-10T07:51:22.675 INFO:tasks.workunit.client.1.vm08.stdout:6/177: mkdir d1/d17/d2b/d37 0
2026-03-10T07:51:22.676 INFO:tasks.workunit.client.1.vm08.stdout:6/178: write d1/f34 [332150,107848] 0
2026-03-10T07:51:22.677 INFO:tasks.workunit.client.1.vm08.stdout:6/179: stat d1/d3/f13 0
2026-03-10T07:51:22.680 INFO:tasks.workunit.client.1.vm08.stdout:6/180: dwrite d1/f28 [0,4194304] 0
2026-03-10T07:51:22.690 INFO:tasks.workunit.client.1.vm08.stdout:2/179: truncate d0/f35 39450 0
2026-03-10T07:51:22.702 INFO:tasks.workunit.client.1.vm08.stdout:2/180: dread d0/d1/d3/d5/f20 [0,4194304] 0
2026-03-10T07:51:22.705 INFO:tasks.workunit.client.1.vm08.stdout:6/181: dread d1/d3/df/f12 [0,4194304] 0
2026-03-10T07:51:22.705 INFO:tasks.workunit.client.1.vm08.stdout:2/181: creat d0/d1/d17/d27/d3a/f3d x:0 0 0
2026-03-10T07:51:22.706 INFO:tasks.workunit.client.1.vm08.stdout:6/182: chown d1/d3/c22 916 1
2026-03-10T07:51:22.708 INFO:tasks.workunit.client.1.vm08.stdout:6/183: mkdir d1/d3/df/d38 0
2026-03-10T07:51:22.708 INFO:tasks.workunit.client.1.vm08.stdout:2/182: mkdir d0/d1/d3/d3e 0
2026-03-10T07:51:22.711 INFO:tasks.workunit.client.1.vm08.stdout:2/183: dread d0/f12 [0,4194304] 0
2026-03-10T07:51:22.713 INFO:tasks.workunit.client.1.vm08.stdout:6/184: symlink d1/db/l39 0
2026-03-10T07:51:22.716 INFO:tasks.workunit.client.1.vm08.stdout:2/184: symlink d0/d1/d3/d5/d2a/l3f 0
2026-03-10T07:51:22.718 INFO:tasks.workunit.client.1.vm08.stdout:6/185: write d1/db/d24/f26 [176395,94174] 0
2026-03-10T07:51:22.718 INFO:tasks.workunit.client.1.vm08.stdout:6/186: rename d1/d17/d2b to d1/d17/d2b/d3a 22
2026-03-10T07:51:22.739 INFO:tasks.workunit.client.1.vm08.stdout:6/187: fdatasync d1/f34 0
2026-03-10T07:51:22.756 INFO:tasks.workunit.client.1.vm08.stdout:0/183: truncate dd/fe 2953268 0
2026-03-10T07:51:22.756 INFO:tasks.workunit.client.1.vm08.stdout:1/150: write d2/f36 [878832,112040] 0
2026-03-10T07:51:22.759 INFO:tasks.workunit.client.1.vm08.stdout:2/185: symlink d0/d1/d3/d10/d32/l40 0
2026-03-10T07:51:22.761 INFO:tasks.workunit.client.1.vm08.stdout:5/227: dwrite d0/d4/df/d2d/f35 [0,4194304] 0
2026-03-10T07:51:22.764 INFO:tasks.workunit.client.1.vm08.stdout:3/152: rename d0/dd to d0/d3c 0
2026-03-10T07:51:22.764 INFO:tasks.workunit.client.1.vm08.stdout:3/153: dread - d0/d3c/f26 zero size
2026-03-10T07:51:22.765 INFO:tasks.workunit.client.1.vm08.stdout:9/188: write d2/f10 [1789013,29877] 0
2026-03-10T07:51:22.771 INFO:tasks.workunit.client.1.vm08.stdout:9/189: dwrite d2/de/f1e [0,4194304] 0
2026-03-10T07:51:22.775 INFO:tasks.workunit.client.1.vm08.stdout:1/151: mknod d2/d6/de/d1f/d22/c37 0
2026-03-10T07:51:22.775 INFO:tasks.workunit.client.1.vm08.stdout:1/152: chown d2/f13 152410 1
2026-03-10T07:51:22.779 INFO:tasks.workunit.client.1.vm08.stdout:4/132: dwrite d5/f10 [0,4194304] 0
2026-03-10T07:51:22.780 INFO:tasks.workunit.client.1.vm08.stdout:4/133: chown f1 0 1
2026-03-10T07:51:22.780 INFO:tasks.workunit.client.1.vm08.stdout:0/184: dread dd/f16 [0,4194304] 0
2026-03-10T07:51:22.781 INFO:tasks.workunit.client.1.vm08.stdout:4/134: chown d5/d8/f1e 10278989 1
2026-03-10T07:51:22.782 INFO:tasks.workunit.client.1.vm08.stdout:8/197: write d0/df/f26 [84296,49011] 0
2026-03-10T07:51:22.782 INFO:tasks.workunit.client.1.vm08.stdout:1/153: dread d2/f4 [0,4194304] 0
2026-03-10T07:51:22.783 INFO:tasks.workunit.client.1.vm08.stdout:2/186: mknod d0/d1/d3/d5/d2a/c41 0
2026-03-10T07:51:22.791 INFO:tasks.workunit.client.1.vm08.stdout:2/187: write d0/d1/d3/d5/d1b/f3c [700873,39156] 0
2026-03-10T07:51:22.794 INFO:tasks.workunit.client.1.vm08.stdout:5/228: rename d0/d4/df/d2d to d0/d4/d19/d43 0
2026-03-10T07:51:22.798 INFO:tasks.workunit.client.1.vm08.stdout:1/154: dread d2/d6/d11/f1e [0,4194304] 0
2026-03-10T07:51:22.800 INFO:tasks.workunit.client.1.vm08.stdout:3/154: unlink d0/d3c/c15 0
2026-03-10T07:51:22.801 INFO:tasks.workunit.client.1.vm08.stdout:1/155: dread d2/d6/d11/f1e [0,4194304] 0
2026-03-10T07:51:22.807 INFO:tasks.workunit.client.1.vm08.stdout:6/188: unlink d1/db/f33 0
2026-03-10T07:51:22.819 INFO:tasks.workunit.client.1.vm08.stdout:8/198: creat d0/df/d15/d23/d39/f3e x:0 0 0
2026-03-10T07:51:22.819 INFO:tasks.workunit.client.1.vm08.stdout:8/199: read d0/df/d17/d25/f2b [188718,65791] 0
2026-03-10T07:51:22.823 INFO:tasks.workunit.client.1.vm08.stdout:9/190: dread d2/d26/f29 [0,4194304] 0
2026-03-10T07:51:22.823 INFO:tasks.workunit.client.1.vm08.stdout:8/200: chown d0/df/d17/d25/l27 190 1
2026-03-10T07:51:22.828 INFO:tasks.workunit.client.1.vm08.stdout:8/201: dwrite d0/df/d17/f32 [0,4194304] 0
2026-03-10T07:51:22.837 INFO:tasks.workunit.client.1.vm08.stdout:3/155: creat d0/d3c/d1f/d25/d34/f3d x:0 0 0
2026-03-10T07:51:22.838 INFO:tasks.workunit.client.1.vm08.stdout:3/156: write d0/d3c/d1f/d25/d2d/f35 [949930,81040] 0
2026-03-10T07:51:22.846 INFO:tasks.workunit.client.1.vm08.stdout:7/209: write d3/da/f21 [4850,26131] 0
2026-03-10T07:51:22.850 INFO:tasks.workunit.client.1.vm08.stdout:7/210: dwrite d3/da/d25/d9/f41 [0,4194304] 0
2026-03-10T07:51:22.850 INFO:tasks.workunit.client.1.vm08.stdout:7/211: write d3/f16 [2561224,97298] 0
2026-03-10T07:51:22.856 INFO:tasks.workunit.client.1.vm08.stdout:7/212: dwrite d3/f16 [0,4194304] 0 2026-03-10T07:51:22.864 INFO:tasks.workunit.client.1.vm08.stdout:6/189: mkdir d1/db/d24/d3b 0 2026-03-10T07:51:22.869 INFO:tasks.workunit.client.1.vm08.stdout:0/185: creat dd/d10/d2f/d32/d3d/f40 x:0 0 0 2026-03-10T07:51:22.881 INFO:tasks.workunit.client.1.vm08.stdout:9/191: rmdir d2/d3/d25 39 2026-03-10T07:51:22.899 INFO:tasks.workunit.client.1.vm08.stdout:8/202: rmdir d0/df/d15 39 2026-03-10T07:51:22.902 INFO:tasks.workunit.client.1.vm08.stdout:2/188: dwrite d0/d1/fb [0,4194304] 0 2026-03-10T07:51:22.912 INFO:tasks.workunit.client.1.vm08.stdout:2/189: dread d0/d1/d3/f8 [0,4194304] 0 2026-03-10T07:51:22.913 INFO:tasks.workunit.client.1.vm08.stdout:0/186: creat dd/d10/d14/d15/f41 x:0 0 0 2026-03-10T07:51:22.913 INFO:tasks.workunit.client.1.vm08.stdout:0/187: chown dd/d18 196508 1 2026-03-10T07:51:22.914 INFO:tasks.workunit.client.1.vm08.stdout:5/229: write d0/d33/f34 [4889477,111229] 0 2026-03-10T07:51:22.915 INFO:tasks.workunit.client.1.vm08.stdout:1/156: write d2/d6/de/f1c [1130206,97003] 0 2026-03-10T07:51:22.916 INFO:tasks.workunit.client.1.vm08.stdout:4/135: creat d5/f2f x:0 0 0 2026-03-10T07:51:22.919 INFO:tasks.workunit.client.1.vm08.stdout:2/190: dwrite d0/d1/d3/d5/d2a/f34 [0,4194304] 0 2026-03-10T07:51:22.923 INFO:tasks.workunit.client.1.vm08.stdout:3/157: mknod d0/c3e 0 2026-03-10T07:51:22.923 INFO:tasks.workunit.client.1.vm08.stdout:6/190: link d1/f28 d1/d17/d2b/f3c 0 2026-03-10T07:51:22.938 INFO:tasks.workunit.client.1.vm08.stdout:0/188: mkdir dd/d10/d2f/d32/d3d/d42 0 2026-03-10T07:51:22.953 INFO:tasks.workunit.client.1.vm08.stdout:5/230: rename d0/f17 to d0/d4/df/d12/d22/f44 0 2026-03-10T07:51:22.957 INFO:tasks.workunit.client.1.vm08.stdout:3/158: rmdir d0/d3c/d1f/d25/d2d 39 2026-03-10T07:51:22.971 INFO:tasks.workunit.client.1.vm08.stdout:8/203: rmdir d0/df/d15/d23/d39 39 2026-03-10T07:51:22.971 INFO:tasks.workunit.client.1.vm08.stdout:3/159: chown 
d0/d3c/l1b 59378 1 2026-03-10T07:51:22.972 INFO:tasks.workunit.client.1.vm08.stdout:1/157: creat d2/d34/f38 x:0 0 0 2026-03-10T07:51:22.972 INFO:tasks.workunit.client.1.vm08.stdout:8/204: chown d0/df/d2e/c38 446528980 1 2026-03-10T07:51:22.972 INFO:tasks.workunit.client.1.vm08.stdout:1/158: stat d2/f36 0 2026-03-10T07:51:22.972 INFO:tasks.workunit.client.1.vm08.stdout:8/205: dread - d0/df/d15/d23/f3d zero size 2026-03-10T07:51:22.972 INFO:tasks.workunit.client.1.vm08.stdout:2/191: rename d0/d1/d3/d5/d2a to d0/d1/d17/d27/d42 0 2026-03-10T07:51:22.972 INFO:tasks.workunit.client.1.vm08.stdout:2/192: fsync d0/d1/d3/f14 0 2026-03-10T07:51:22.972 INFO:tasks.workunit.client.1.vm08.stdout:9/192: link d2/de/f27 d2/d3/f49 0 2026-03-10T07:51:22.972 INFO:tasks.workunit.client.1.vm08.stdout:7/213: link d3/da/d25/d9/d2f/d3a/l3c d3/da/d25/l45 0 2026-03-10T07:51:22.972 INFO:tasks.workunit.client.1.vm08.stdout:5/231: mknod d0/d4/c45 0 2026-03-10T07:51:22.972 INFO:tasks.workunit.client.1.vm08.stdout:7/214: fdatasync d3/da/f17 0 2026-03-10T07:51:22.972 INFO:tasks.workunit.client.1.vm08.stdout:3/160: rename d0/c3e to d0/d3c/d18/c3f 0 2026-03-10T07:51:22.972 INFO:tasks.workunit.client.1.vm08.stdout:1/159: symlink d2/d6/de/d1f/l39 0 2026-03-10T07:51:22.972 INFO:tasks.workunit.client.1.vm08.stdout:2/193: mknod d0/d1/d3/d10/c43 0 2026-03-10T07:51:22.972 INFO:tasks.workunit.client.1.vm08.stdout:2/194: chown d0/d1/d17/d27/d42/l3f 1 1 2026-03-10T07:51:22.972 INFO:tasks.workunit.client.1.vm08.stdout:8/206: mkdir d0/d3b/d3f 0 2026-03-10T07:51:22.972 INFO:tasks.workunit.client.1.vm08.stdout:3/161: mknod d0/d3c/d1f/d25/d34/c40 0 2026-03-10T07:51:22.972 INFO:tasks.workunit.client.1.vm08.stdout:1/160: mkdir d2/d6/d3a 0 2026-03-10T07:51:22.972 INFO:tasks.workunit.client.1.vm08.stdout:6/191: getdents d1/d3/df 0 2026-03-10T07:51:22.972 INFO:tasks.workunit.client.1.vm08.stdout:7/215: mkdir d3/da/d46 0 2026-03-10T07:51:22.972 INFO:tasks.workunit.client.1.vm08.stdout:3/162: fdatasync 
d0/d3c/d1f/d25/d2d/f35 0 2026-03-10T07:51:22.972 INFO:tasks.workunit.client.1.vm08.stdout:1/161: truncate d2/d6/de/d1f/d26/f2f 682287 0 2026-03-10T07:51:22.972 INFO:tasks.workunit.client.1.vm08.stdout:3/163: dread d0/d3c/f20 [0,4194304] 0 2026-03-10T07:51:22.972 INFO:tasks.workunit.client.1.vm08.stdout:2/195: creat d0/f44 x:0 0 0 2026-03-10T07:51:22.972 INFO:tasks.workunit.client.1.vm08.stdout:8/207: rename d0/df/d17/d25/f2b to d0/df/d15/d23/d39/f40 0 2026-03-10T07:51:22.972 INFO:tasks.workunit.client.1.vm08.stdout:2/196: dread d0/d1/d3/f8 [0,4194304] 0 2026-03-10T07:51:22.972 INFO:tasks.workunit.client.1.vm08.stdout:1/162: creat d2/d6/f3b x:0 0 0 2026-03-10T07:51:22.973 INFO:tasks.workunit.client.1.vm08.stdout:8/208: mknod d0/df/d2e/d30/c41 0 2026-03-10T07:51:22.974 INFO:tasks.workunit.client.1.vm08.stdout:3/164: mkdir d0/d41 0 2026-03-10T07:51:22.974 INFO:tasks.workunit.client.1.vm08.stdout:2/197: rename d0/d1/d3/f14 to d0/d1/d3/d10/d32/f45 0 2026-03-10T07:51:22.975 INFO:tasks.workunit.client.1.vm08.stdout:0/189: sync 2026-03-10T07:51:22.975 INFO:tasks.workunit.client.1.vm08.stdout:7/216: sync 2026-03-10T07:51:22.977 INFO:tasks.workunit.client.1.vm08.stdout:3/165: fsync d0/fc 0 2026-03-10T07:51:22.978 INFO:tasks.workunit.client.1.vm08.stdout:7/217: creat d3/da/d25/d9/f47 x:0 0 0 2026-03-10T07:51:22.979 INFO:tasks.workunit.client.1.vm08.stdout:8/209: dread d0/f20 [0,4194304] 0 2026-03-10T07:51:22.979 INFO:tasks.workunit.client.1.vm08.stdout:3/166: chown d0/d3c/d1f/d25/f2b 24 1 2026-03-10T07:51:22.980 INFO:tasks.workunit.client.1.vm08.stdout:1/163: dread d2/f4 [0,4194304] 0 2026-03-10T07:51:22.980 INFO:tasks.workunit.client.1.vm08.stdout:3/167: chown d0/d3c/d18/f2e 4 1 2026-03-10T07:51:22.980 INFO:tasks.workunit.client.1.vm08.stdout:8/210: write d0/f2a [578628,115436] 0 2026-03-10T07:51:22.981 INFO:tasks.workunit.client.1.vm08.stdout:1/164: chown d2/d6/f18 451991 1 2026-03-10T07:51:22.981 INFO:tasks.workunit.client.1.vm08.stdout:8/211: fsync d0/df/d2e/f3c 0 
2026-03-10T07:51:22.982 INFO:tasks.workunit.client.1.vm08.stdout:1/165: readlink d2/d6/de/l27 0 2026-03-10T07:51:22.985 INFO:tasks.workunit.client.1.vm08.stdout:8/212: sync 2026-03-10T07:51:22.989 INFO:tasks.workunit.client.1.vm08.stdout:1/166: dwrite d2/d6/de/d1f/d26/f2f [0,4194304] 0 2026-03-10T07:51:22.997 INFO:tasks.workunit.client.1.vm08.stdout:2/198: mknod d0/d1/d3/d10/d38/c46 0 2026-03-10T07:51:22.998 INFO:tasks.workunit.client.1.vm08.stdout:3/168: mknod d0/d3c/d1f/d25/d2d/c42 0 2026-03-10T07:51:22.998 INFO:tasks.workunit.client.1.vm08.stdout:8/213: mkdir d0/d3b/d42 0 2026-03-10T07:51:22.999 INFO:tasks.workunit.client.1.vm08.stdout:1/167: creat d2/d34/f3c x:0 0 0 2026-03-10T07:51:23.000 INFO:tasks.workunit.client.1.vm08.stdout:3/169: fdatasync d0/f28 0 2026-03-10T07:51:23.001 INFO:tasks.workunit.client.1.vm08.stdout:2/199: write d0/d1/d3/d5/f13 [1669094,9511] 0 2026-03-10T07:51:23.001 INFO:tasks.workunit.client.1.vm08.stdout:1/168: fdatasync d2/d10/f2d 0 2026-03-10T07:51:23.003 INFO:tasks.workunit.client.1.vm08.stdout:1/169: write d2/d10/f2d [4020917,52167] 0 2026-03-10T07:51:23.007 INFO:tasks.workunit.client.1.vm08.stdout:2/200: dwrite d0/d1/fb [0,4194304] 0 2026-03-10T07:51:23.037 INFO:tasks.workunit.client.1.vm08.stdout:7/218: truncate d3/da/f17 2300238 0 2026-03-10T07:51:23.037 INFO:tasks.workunit.client.1.vm08.stdout:9/193: write d2/f6 [4100491,37439] 0 2026-03-10T07:51:23.041 INFO:tasks.workunit.client.1.vm08.stdout:6/192: write d1/f2 [816622,104250] 0 2026-03-10T07:51:23.046 INFO:tasks.workunit.client.1.vm08.stdout:1/170: fdatasync d2/f4 0 2026-03-10T07:51:23.046 INFO:tasks.workunit.client.1.vm08.stdout:2/201: symlink d0/d1/d3/d10/d38/l47 0 2026-03-10T07:51:23.049 INFO:tasks.workunit.client.1.vm08.stdout:5/232: dwrite d0/d4/df/d1e/f25 [0,4194304] 0 2026-03-10T07:51:23.054 INFO:tasks.workunit.client.1.vm08.stdout:4/136: dread - d5/d8/f1e zero size 2026-03-10T07:51:23.057 INFO:tasks.workunit.client.1.vm08.stdout:0/190: dwrite dd/d10/d14/d15/d20/d22/f2e 
[0,4194304] 0 2026-03-10T07:51:23.065 INFO:tasks.workunit.client.1.vm08.stdout:6/193: dwrite d1/d3/f13 [0,4194304] 0 2026-03-10T07:51:23.066 INFO:tasks.workunit.client.1.vm08.stdout:9/194: mknod d2/d3/c4a 0 2026-03-10T07:51:23.067 INFO:tasks.workunit.client.1.vm08.stdout:9/195: write d2/de/f1e [2943982,56709] 0 2026-03-10T07:51:23.083 INFO:tasks.workunit.client.1.vm08.stdout:2/202: dwrite d0/d1/d3/d5/d1b/f3c [0,4194304] 0 2026-03-10T07:51:23.083 INFO:tasks.workunit.client.1.vm08.stdout:7/219: symlink d3/da/d25/d9/d2f/d39/d43/l48 0 2026-03-10T07:51:23.083 INFO:tasks.workunit.client.1.vm08.stdout:8/214: getdents d0/d3b 0 2026-03-10T07:51:23.083 INFO:tasks.workunit.client.1.vm08.stdout:5/233: creat d0/d4/df/d12/f46 x:0 0 0 2026-03-10T07:51:23.083 INFO:tasks.workunit.client.1.vm08.stdout:2/203: chown d0/d1/c22 31202 1 2026-03-10T07:51:23.088 INFO:tasks.workunit.client.1.vm08.stdout:3/170: dwrite d0/d3c/d18/f1e [0,4194304] 0 2026-03-10T07:51:23.090 INFO:tasks.workunit.client.1.vm08.stdout:3/171: chown d0/d3c/d18/f38 27 1 2026-03-10T07:51:23.101 INFO:tasks.workunit.client.1.vm08.stdout:5/234: dwrite d0/f30 [0,4194304] 0 2026-03-10T07:51:23.105 INFO:tasks.workunit.client.1.vm08.stdout:7/220: write d3/da/d25/d9/d2f/f42 [599950,13834] 0 2026-03-10T07:51:23.105 INFO:tasks.workunit.client.1.vm08.stdout:5/235: write d0/d4/df/d12/f46 [444530,82328] 0 2026-03-10T07:51:23.105 INFO:tasks.workunit.client.1.vm08.stdout:7/221: dwrite d3/da/d25/f29 [0,4194304] 0 2026-03-10T07:51:23.105 INFO:tasks.workunit.client.1.vm08.stdout:7/222: write d3/f6 [1429265,89712] 0 2026-03-10T07:51:23.111 INFO:tasks.workunit.client.1.vm08.stdout:9/196: dread d2/f10 [0,4194304] 0 2026-03-10T07:51:23.118 INFO:tasks.workunit.client.1.vm08.stdout:8/215: dwrite d0/f20 [0,4194304] 0 2026-03-10T07:51:23.126 INFO:tasks.workunit.client.1.vm08.stdout:3/172: creat d0/d3c/d1f/d25/d2d/f43 x:0 0 0 2026-03-10T07:51:23.139 INFO:tasks.workunit.client.1.vm08.stdout:3/173: truncate d0/d3c/d1f/d25/d34/f3d 786335 0 
2026-03-10T07:51:23.141 INFO:tasks.workunit.client.1.vm08.stdout:3/174: write d0/f16 [3806553,7511] 0 2026-03-10T07:51:23.141 INFO:tasks.workunit.client.1.vm08.stdout:0/191: link dd/d18/l3a dd/d10/d2f/l43 0 2026-03-10T07:51:23.141 INFO:tasks.workunit.client.1.vm08.stdout:0/192: readlink dd/d10/d14/d15/l19 0 2026-03-10T07:51:23.141 INFO:tasks.workunit.client.1.vm08.stdout:0/193: write dd/f13 [1249222,57928] 0 2026-03-10T07:51:23.141 INFO:tasks.workunit.client.1.vm08.stdout:4/137: creat d5/d8/f30 x:0 0 0 2026-03-10T07:51:23.141 INFO:tasks.workunit.client.1.vm08.stdout:0/194: fsync f0 0 2026-03-10T07:51:23.141 INFO:tasks.workunit.client.1.vm08.stdout:4/138: chown d5/d8/d9/d12/l23 660550 1 2026-03-10T07:51:23.141 INFO:tasks.workunit.client.1.vm08.stdout:8/216: creat d0/df/d2e/d30/f43 x:0 0 0 2026-03-10T07:51:23.141 INFO:tasks.workunit.client.1.vm08.stdout:2/204: rename d0/d1/d3/c1f to d0/d1/d3/d3e/c48 0 2026-03-10T07:51:23.141 INFO:tasks.workunit.client.1.vm08.stdout:3/175: unlink d0/d3c/d1f/f24 0 2026-03-10T07:51:23.141 INFO:tasks.workunit.client.1.vm08.stdout:2/205: chown d0/d1/d3/d10/l36 15 1 2026-03-10T07:51:23.141 INFO:tasks.workunit.client.1.vm08.stdout:3/176: read - d0/f10 zero size 2026-03-10T07:51:23.141 INFO:tasks.workunit.client.1.vm08.stdout:9/197: dwrite d2/f10 [0,4194304] 0 2026-03-10T07:51:23.147 INFO:tasks.workunit.client.1.vm08.stdout:3/177: dwrite d0/d3c/d18/f38 [0,4194304] 0 2026-03-10T07:51:23.147 INFO:tasks.workunit.client.1.vm08.stdout:3/178: chown d0/l11 515238 1 2026-03-10T07:51:23.153 INFO:tasks.workunit.client.1.vm08.stdout:5/236: rename d0/d4/df/d12/d22/f23 to d0/d8/d24/f47 0 2026-03-10T07:51:23.158 INFO:tasks.workunit.client.1.vm08.stdout:8/217: creat d0/df/d2e/f44 x:0 0 0 2026-03-10T07:51:23.165 INFO:tasks.workunit.client.1.vm08.stdout:8/218: write d0/f2a [528154,26615] 0 2026-03-10T07:51:23.170 INFO:tasks.workunit.client.1.vm08.stdout:0/195: rename dd/d10/d14/d1b/f35 to dd/f44 0 2026-03-10T07:51:23.173 
INFO:tasks.workunit.client.1.vm08.stdout:0/196: chown dd/d10/d2f/d32/d3d 17 1 2026-03-10T07:51:23.174 INFO:tasks.workunit.client.1.vm08.stdout:3/179: sync 2026-03-10T07:51:23.174 INFO:tasks.workunit.client.1.vm08.stdout:0/197: stat dd/d10/d14/d1b 0 2026-03-10T07:51:23.182 INFO:tasks.workunit.client.1.vm08.stdout:4/139: dread d5/d8/f11 [0,4194304] 0 2026-03-10T07:51:23.205 INFO:tasks.workunit.client.1.vm08.stdout:0/198: rmdir dd/d10/d2f/d32/d3d/d42 0 2026-03-10T07:51:23.205 INFO:tasks.workunit.client.1.vm08.stdout:3/180: dwrite d0/d3c/d1f/d25/d2d/f43 [0,4194304] 0 2026-03-10T07:51:23.205 INFO:tasks.workunit.client.1.vm08.stdout:0/199: creat dd/d10/d2f/d32/f45 x:0 0 0 2026-03-10T07:51:23.205 INFO:tasks.workunit.client.1.vm08.stdout:3/181: unlink d0/d3c/d1f/d25/d34/c40 0 2026-03-10T07:51:23.205 INFO:tasks.workunit.client.1.vm08.stdout:4/140: mkdir d5/d1f/d31 0 2026-03-10T07:51:23.205 INFO:tasks.workunit.client.1.vm08.stdout:0/200: creat dd/d10/d14/f46 x:0 0 0 2026-03-10T07:51:23.205 INFO:tasks.workunit.client.1.vm08.stdout:4/141: mkdir d5/d8/d9/d32 0 2026-03-10T07:51:23.205 INFO:tasks.workunit.client.1.vm08.stdout:4/142: read d5/d8/ff [3281571,98927] 0 2026-03-10T07:51:23.205 INFO:tasks.workunit.client.1.vm08.stdout:4/143: write d5/f2f [820004,60838] 0 2026-03-10T07:51:23.205 INFO:tasks.workunit.client.1.vm08.stdout:4/144: write d5/d8/d9/f18 [101005,116128] 0 2026-03-10T07:51:23.205 INFO:tasks.workunit.client.1.vm08.stdout:3/182: mkdir d0/d3c/d1f/d44 0 2026-03-10T07:51:23.205 INFO:tasks.workunit.client.1.vm08.stdout:0/201: symlink dd/d10/d2f/l47 0 2026-03-10T07:51:23.205 INFO:tasks.workunit.client.1.vm08.stdout:0/202: stat dd/d10/d14/d15/d20/l3e 0 2026-03-10T07:51:23.205 INFO:tasks.workunit.client.1.vm08.stdout:4/145: dwrite d5/d8/d9/f1b [0,4194304] 0 2026-03-10T07:51:23.205 INFO:tasks.workunit.client.1.vm08.stdout:0/203: rename f0 to dd/d29/f48 0 2026-03-10T07:51:23.205 INFO:tasks.workunit.client.1.vm08.stdout:3/183: creat d0/f45 x:0 0 0 2026-03-10T07:51:23.205 
INFO:tasks.workunit.client.1.vm08.stdout:3/184: chown d0/d3c/l1b 221951 1 2026-03-10T07:51:23.210 INFO:tasks.workunit.client.1.vm08.stdout:9/198: read d2/de/f3d [585282,17602] 0 2026-03-10T07:51:23.211 INFO:tasks.workunit.client.1.vm08.stdout:3/185: dwrite d0/d3c/d18/f38 [0,4194304] 0 2026-03-10T07:51:23.212 INFO:tasks.workunit.client.1.vm08.stdout:3/186: fsync d0/f39 0 2026-03-10T07:51:23.213 INFO:tasks.workunit.client.1.vm08.stdout:3/187: chown d0/d3c/d18/f22 2 1 2026-03-10T07:51:23.218 INFO:tasks.workunit.client.1.vm08.stdout:0/204: symlink dd/d10/d14/d1b/d30/l49 0 2026-03-10T07:51:23.222 INFO:tasks.workunit.client.1.vm08.stdout:9/199: creat d2/d26/f4b x:0 0 0 2026-03-10T07:51:23.222 INFO:tasks.workunit.client.1.vm08.stdout:0/205: readlink dd/d10/d14/d1b/d30/l49 0 2026-03-10T07:51:23.222 INFO:tasks.workunit.client.1.vm08.stdout:9/200: write d2/f1a [1559235,21001] 0 2026-03-10T07:51:23.223 INFO:tasks.workunit.client.1.vm08.stdout:9/201: dwrite d2/d26/f29 [4194304,4194304] 0 2026-03-10T07:51:23.229 INFO:tasks.workunit.client.1.vm08.stdout:0/206: dread dd/d10/d14/f36 [0,4194304] 0 2026-03-10T07:51:23.229 INFO:tasks.workunit.client.1.vm08.stdout:0/207: write dd/d10/d14/f46 [67469,80656] 0 2026-03-10T07:51:23.229 INFO:tasks.workunit.client.1.vm08.stdout:0/208: truncate dd/d18/f21 4244986 0 2026-03-10T07:51:23.233 INFO:tasks.workunit.client.1.vm08.stdout:9/202: mkdir d2/d26/d4c 0 2026-03-10T07:51:23.246 INFO:tasks.workunit.client.1.vm08.stdout:0/209: fsync dd/d18/f25 0 2026-03-10T07:51:23.246 INFO:tasks.workunit.client.1.vm08.stdout:0/210: mknod dd/d10/d14/d1b/d30/c4a 0 2026-03-10T07:51:23.246 INFO:tasks.workunit.client.1.vm08.stdout:9/203: rename d2/de/f16 to d2/de/f4d 0 2026-03-10T07:51:23.246 INFO:tasks.workunit.client.1.vm08.stdout:0/211: unlink dd/d10/d14/d15/d20/f2d 0 2026-03-10T07:51:23.246 INFO:tasks.workunit.client.1.vm08.stdout:9/204: creat d2/f4e x:0 0 0 2026-03-10T07:51:23.261 INFO:tasks.workunit.client.1.vm08.stdout:0/212: rmdir dd/d10/d14/d1b/d30/d31 0 
2026-03-10T07:51:23.271 INFO:tasks.workunit.client.1.vm08.stdout:0/213: creat dd/d10/d2f/f4b x:0 0 0 2026-03-10T07:51:23.272 INFO:tasks.workunit.client.1.vm08.stdout:0/214: write dd/d10/d14/d15/d20/d22/f2e [83917,6023] 0 2026-03-10T07:51:23.274 INFO:tasks.workunit.client.1.vm08.stdout:0/215: creat dd/d10/d2f/f4c x:0 0 0 2026-03-10T07:51:23.274 INFO:tasks.workunit.client.1.vm08.stdout:0/216: readlink dd/d10/d14/d1b/l27 0 2026-03-10T07:51:23.275 INFO:tasks.workunit.client.1.vm08.stdout:0/217: truncate dd/d10/d2f/f4c 25306 0 2026-03-10T07:51:23.281 INFO:tasks.workunit.client.1.vm08.stdout:0/218: getdents dd/d10/d2f/d37 0 2026-03-10T07:51:23.284 INFO:tasks.workunit.client.1.vm08.stdout:0/219: creat dd/d10/d14/d1b/d30/f4d x:0 0 0 2026-03-10T07:51:23.288 INFO:tasks.workunit.client.1.vm08.stdout:0/220: mknod dd/d10/d2f/d32/c4e 0 2026-03-10T07:51:23.288 INFO:tasks.workunit.client.1.vm08.stdout:0/221: readlink dd/d29/l38 0 2026-03-10T07:51:23.353 INFO:tasks.workunit.client.1.vm08.stdout:1/171: truncate d2/d10/f2d 3851441 0 2026-03-10T07:51:23.353 INFO:tasks.workunit.client.1.vm08.stdout:1/172: fdatasync d2/d6/f3b 0 2026-03-10T07:51:23.355 INFO:tasks.workunit.client.1.vm08.stdout:1/173: creat d2/d6/de/d1f/f3d x:0 0 0 2026-03-10T07:51:23.356 INFO:tasks.workunit.client.1.vm08.stdout:1/174: write d2/d6/de/f15 [870411,53061] 0 2026-03-10T07:51:23.393 INFO:tasks.workunit.client.1.vm08.stdout:2/206: rmdir d0/d1 39 2026-03-10T07:51:23.396 INFO:tasks.workunit.client.1.vm08.stdout:6/194: truncate d1/f6 1268748 0 2026-03-10T07:51:23.398 INFO:tasks.workunit.client.1.vm08.stdout:2/207: dwrite d0/f44 [0,4194304] 0 2026-03-10T07:51:23.400 INFO:tasks.workunit.client.1.vm08.stdout:6/195: mkdir d1/db/d24/d3d 0 2026-03-10T07:51:23.401 INFO:tasks.workunit.client.1.vm08.stdout:7/223: dwrite d3/da/d25/f1e [4194304,4194304] 0 2026-03-10T07:51:23.401 INFO:tasks.workunit.client.1.vm08.stdout:6/196: read - d1/db/f23 zero size 2026-03-10T07:51:23.402 INFO:tasks.workunit.client.1.vm08.stdout:6/197: 
chown d1/db/d24/d3b 1 1 2026-03-10T07:51:23.409 INFO:tasks.workunit.client.1.vm08.stdout:6/198: dwrite d1/d3/f13 [0,4194304] 0 2026-03-10T07:51:23.410 INFO:tasks.workunit.client.1.vm08.stdout:2/208: readlink d0/d1/d17/d27/d42/l3f 0 2026-03-10T07:51:23.416 INFO:tasks.workunit.client.1.vm08.stdout:2/209: read d0/f1e [2732773,112511] 0 2026-03-10T07:51:23.417 INFO:tasks.workunit.client.1.vm08.stdout:2/210: write d0/d1/d3/d5/d1b/f26 [272736,66401] 0 2026-03-10T07:51:23.418 INFO:tasks.workunit.client.1.vm08.stdout:2/211: truncate d0/d1/d3/d5/d1b/f23 2526475 0 2026-03-10T07:51:23.419 INFO:tasks.workunit.client.1.vm08.stdout:2/212: chown d0/d1/d3/f8 26873 1 2026-03-10T07:51:23.419 INFO:tasks.workunit.client.1.vm08.stdout:2/213: chown d0/d1/d17/d27/f29 264584735 1 2026-03-10T07:51:23.420 INFO:tasks.workunit.client.1.vm08.stdout:2/214: write d0/d1/d17/d27/f29 [477807,126648] 0 2026-03-10T07:51:23.421 INFO:tasks.workunit.client.1.vm08.stdout:2/215: rename d0/d1/d3 to d0/d1/d3/d39/d49 22 2026-03-10T07:51:23.421 INFO:tasks.workunit.client.1.vm08.stdout:2/216: fdatasync d0/d1/d17/f1a 0 2026-03-10T07:51:23.424 INFO:tasks.workunit.client.1.vm08.stdout:2/217: dwrite d0/d1/fb [0,4194304] 0 2026-03-10T07:51:23.437 INFO:tasks.workunit.client.1.vm08.stdout:2/218: read d0/d1/fb [2740385,14356] 0 2026-03-10T07:51:23.437 INFO:tasks.workunit.client.1.vm08.stdout:2/219: chown d0/d1/d3/d39/f3b 353431 1 2026-03-10T07:51:23.437 INFO:tasks.workunit.client.1.vm08.stdout:7/224: dread d3/f2b [0,4194304] 0 2026-03-10T07:51:23.437 INFO:tasks.workunit.client.1.vm08.stdout:7/225: dwrite d3/da/f1d [0,4194304] 0 2026-03-10T07:51:23.441 INFO:tasks.workunit.client.1.vm08.stdout:5/237: getdents d0/d4/df/d12/d22 0 2026-03-10T07:51:23.442 INFO:tasks.workunit.client.1.vm08.stdout:5/238: write d0/d4/df/d12/f13 [4005147,30885] 0 2026-03-10T07:51:23.443 INFO:tasks.workunit.client.1.vm08.stdout:6/199: unlink d1/f2 0 2026-03-10T07:51:23.447 INFO:tasks.workunit.client.1.vm08.stdout:6/200: dwrite d1/d3/f2e 
[0,4194304] 0 2026-03-10T07:51:23.462 INFO:tasks.workunit.client.1.vm08.stdout:8/219: getdents d0/df/d2e 0 2026-03-10T07:51:23.469 INFO:tasks.workunit.client.1.vm08.stdout:6/201: mkdir d1/d3/d3e 0 2026-03-10T07:51:23.469 INFO:tasks.workunit.client.1.vm08.stdout:6/202: readlink d1/d17/l1c 0 2026-03-10T07:51:23.477 INFO:tasks.workunit.client.1.vm08.stdout:8/220: creat d0/f45 x:0 0 0 2026-03-10T07:51:23.478 INFO:tasks.workunit.client.1.vm08.stdout:8/221: stat d0/df/d15/d23/f3d 0 2026-03-10T07:51:23.479 INFO:tasks.workunit.client.1.vm08.stdout:7/226: link d3/da/d25/d9/d2f/f42 d3/da/d46/f49 0 2026-03-10T07:51:23.479 INFO:tasks.workunit.client.1.vm08.stdout:5/239: mknod d0/d4/df/d1e/d41/c48 0 2026-03-10T07:51:23.480 INFO:tasks.workunit.client.1.vm08.stdout:5/240: chown d0/d4/df/d1e/d41 62442 1 2026-03-10T07:51:23.481 INFO:tasks.workunit.client.1.vm08.stdout:2/220: creat d0/f4a x:0 0 0 2026-03-10T07:51:23.483 INFO:tasks.workunit.client.1.vm08.stdout:8/222: creat d0/d3b/f46 x:0 0 0 2026-03-10T07:51:23.483 INFO:tasks.workunit.client.1.vm08.stdout:6/203: mkdir d1/d3f 0 2026-03-10T07:51:23.484 INFO:tasks.workunit.client.1.vm08.stdout:4/146: rename d5/f27 to d5/d1f/d31/f33 0 2026-03-10T07:51:23.485 INFO:tasks.workunit.client.1.vm08.stdout:5/241: dwrite d0/d4/df/d12/f46 [0,4194304] 0 2026-03-10T07:51:23.487 INFO:tasks.workunit.client.1.vm08.stdout:5/242: read d0/d4/df/d12/d1c/f2c [3216420,42057] 0 2026-03-10T07:51:23.488 INFO:tasks.workunit.client.1.vm08.stdout:3/188: write d0/d3c/f20 [207865,47422] 0 2026-03-10T07:51:23.490 INFO:tasks.workunit.client.1.vm08.stdout:6/204: rename d1/db/d24/d3b to d1/d3/df/d1d/d40 0 2026-03-10T07:51:23.492 INFO:tasks.workunit.client.1.vm08.stdout:8/223: creat d0/d37/f47 x:0 0 0 2026-03-10T07:51:23.493 INFO:tasks.workunit.client.1.vm08.stdout:3/189: dwrite d0/d3c/d18/f22 [0,4194304] 0 2026-03-10T07:51:23.497 INFO:tasks.workunit.client.1.vm08.stdout:8/224: dwrite d0/df/f19 [0,4194304] 0 2026-03-10T07:51:23.503 
INFO:tasks.workunit.client.1.vm08.stdout:7/227: sync 2026-03-10T07:51:23.513 INFO:tasks.workunit.client.1.vm08.stdout:3/190: symlink d0/d3c/d1f/d25/d34/l46 0 2026-03-10T07:51:23.524 INFO:tasks.workunit.client.1.vm08.stdout:9/205: dwrite d2/d3/d25/d2b/f37 [0,4194304] 0 2026-03-10T07:51:23.524 INFO:tasks.workunit.client.1.vm08.stdout:1/175: rename d2/d10/f2d to d2/d10/f3e 0 2026-03-10T07:51:23.524 INFO:tasks.workunit.client.1.vm08.stdout:0/222: write dd/fe [1985728,64804] 0 2026-03-10T07:51:23.524 INFO:tasks.workunit.client.1.vm08.stdout:1/176: dwrite d2/d10/f2b [0,4194304] 0 2026-03-10T07:51:23.530 INFO:tasks.workunit.client.1.vm08.stdout:6/205: creat d1/f41 x:0 0 0 2026-03-10T07:51:23.545 INFO:tasks.workunit.client.1.vm08.stdout:3/191: creat d0/d3c/d1f/d25/d34/f47 x:0 0 0 2026-03-10T07:51:23.546 INFO:tasks.workunit.client.1.vm08.stdout:7/228: mknod d3/da/d46/c4a 0 2026-03-10T07:51:23.548 INFO:tasks.workunit.client.1.vm08.stdout:7/229: dread d3/f16 [0,4194304] 0 2026-03-10T07:51:23.555 INFO:tasks.workunit.client.1.vm08.stdout:1/177: creat d2/d10/f3f x:0 0 0 2026-03-10T07:51:23.556 INFO:tasks.workunit.client.1.vm08.stdout:1/178: chown d2/d6/de/d1f/d22 3767413 1 2026-03-10T07:51:23.556 INFO:tasks.workunit.client.1.vm08.stdout:6/206: creat d1/d17/f42 x:0 0 0 2026-03-10T07:51:23.557 INFO:tasks.workunit.client.1.vm08.stdout:3/192: stat d0/c19 0 2026-03-10T07:51:23.560 INFO:tasks.workunit.client.1.vm08.stdout:6/207: dwrite d1/d3/f2e [8388608,4194304] 0 2026-03-10T07:51:23.561 INFO:tasks.workunit.client.1.vm08.stdout:1/179: mkdir d2/d6/de/d1f/d40 0 2026-03-10T07:51:23.561 INFO:tasks.workunit.client.1.vm08.stdout:1/180: rename d2/d6/de/d1f/d40 to d2/d6/de/d1f/d40/d41 22 2026-03-10T07:51:23.564 INFO:tasks.workunit.client.1.vm08.stdout:2/221: fsync d0/f44 0 2026-03-10T07:51:23.573 INFO:tasks.workunit.client.1.vm08.stdout:3/193: mkdir d0/d3c/d18/d48 0 2026-03-10T07:51:23.575 INFO:tasks.workunit.client.1.vm08.stdout:6/208: creat d1/d3/d3e/f43 x:0 0 0 2026-03-10T07:51:23.576 
INFO:tasks.workunit.client.1.vm08.stdout:1/181: mknod d2/d6/d3a/c42 0 2026-03-10T07:51:23.578 INFO:tasks.workunit.client.1.vm08.stdout:6/209: mkdir d1/d3/df/d44 0 2026-03-10T07:51:23.579 INFO:tasks.workunit.client.1.vm08.stdout:3/194: symlink d0/d3c/d1f/l49 0 2026-03-10T07:51:23.579 INFO:tasks.workunit.client.1.vm08.stdout:3/195: truncate d0/f45 256010 0 2026-03-10T07:51:23.583 INFO:tasks.workunit.client.1.vm08.stdout:3/196: dwrite d0/d3c/d18/f1e [0,4194304] 0 2026-03-10T07:51:23.592 INFO:tasks.workunit.client.1.vm08.stdout:3/197: fdatasync d0/d3c/d18/f38 0 2026-03-10T07:51:23.593 INFO:tasks.workunit.client.1.vm08.stdout:1/182: sync 2026-03-10T07:51:23.594 INFO:tasks.workunit.client.1.vm08.stdout:1/183: fdatasync d2/d6/de/d1f/f2a 0 2026-03-10T07:51:23.648 INFO:tasks.workunit.client.1.vm08.stdout:7/230: dread d3/da/d25/d9/f23 [0,4194304] 0 2026-03-10T07:51:23.649 INFO:tasks.workunit.client.1.vm08.stdout:7/231: dread d3/da/d25/f32 [0,4194304] 0 2026-03-10T07:51:23.652 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:51:23 vm05.local ceph-mon[50387]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' 2026-03-10T07:51:23.652 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:51:23 vm05.local ceph-mon[50387]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' 2026-03-10T07:51:23.652 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:51:23 vm05.local ceph-mon[50387]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T07:51:23.652 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:51:23 vm05.local ceph-mon[50387]: pgmap v31: 65 pgs: 65 active+clean; 2.2 GiB data, 7.5 GiB used, 113 GiB / 120 GiB avail; 29 MiB/s rd, 79 MiB/s wr, 208 op/s 2026-03-10T07:51:23.653 INFO:tasks.workunit.client.1.vm08.stdout:7/232: mkdir d3/da/d25/d9/d2f/d3a/d4b 0 2026-03-10T07:51:23.661 INFO:tasks.workunit.client.1.vm08.stdout:2/222: fsync d0/f4a 0 
2026-03-10T07:51:23.665 INFO:tasks.workunit.client.1.vm08.stdout:7/233: link l1 d3/da/d46/l4c 0 2026-03-10T07:51:23.667 INFO:tasks.workunit.client.1.vm08.stdout:8/225: getdents d0/d37 0 2026-03-10T07:51:23.667 INFO:tasks.workunit.client.1.vm08.stdout:5/243: truncate d0/d4/df/d12/f46 3369572 0 2026-03-10T07:51:23.668 INFO:tasks.workunit.client.1.vm08.stdout:8/226: write d0/d37/f47 [568552,13756] 0 2026-03-10T07:51:23.668 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:51:23 vm08.local ceph-mon[59917]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' 2026-03-10T07:51:23.668 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:51:23 vm08.local ceph-mon[59917]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' 2026-03-10T07:51:23.668 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:51:23 vm08.local ceph-mon[59917]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T07:51:23.668 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:51:23 vm08.local ceph-mon[59917]: pgmap v31: 65 pgs: 65 active+clean; 2.2 GiB data, 7.5 GiB used, 113 GiB / 120 GiB avail; 29 MiB/s rd, 79 MiB/s wr, 208 op/s 2026-03-10T07:51:23.670 INFO:tasks.workunit.client.1.vm08.stdout:6/210: dread d1/d3/df/d1d/f2a [0,4194304] 0 2026-03-10T07:51:23.671 INFO:tasks.workunit.client.1.vm08.stdout:0/223: write dd/d18/f25 [3243939,58134] 0 2026-03-10T07:51:23.673 INFO:tasks.workunit.client.1.vm08.stdout:6/211: chown d1/d3/df/d1d 325 1 2026-03-10T07:51:23.675 INFO:tasks.workunit.client.1.vm08.stdout:9/206: dwrite d2/d3/f49 [0,4194304] 0 2026-03-10T07:51:23.677 INFO:tasks.workunit.client.1.vm08.stdout:2/223: creat d0/d1/d3/d10/d38/f4b x:0 0 0 2026-03-10T07:51:23.677 INFO:tasks.workunit.client.1.vm08.stdout:5/244: dread d0/f30 [0,4194304] 0 2026-03-10T07:51:23.678 INFO:tasks.workunit.client.1.vm08.stdout:2/224: fdatasync d0/d1/d17/f1a 0 2026-03-10T07:51:23.678 
INFO:tasks.workunit.client.1.vm08.stdout:4/147: chown d5/d17 129854454 1
2026-03-10T07:51:23.683 INFO:tasks.workunit.client.1.vm08.stdout:3/198: truncate d0/f45 737 0
2026-03-10T07:51:23.684 INFO:tasks.workunit.client.1.vm08.stdout:7/234: mkdir d3/da/d25/d9/d2f/d4d 0
2026-03-10T07:51:23.686 INFO:tasks.workunit.client.1.vm08.stdout:8/227: dread d0/df/f12 [0,4194304] 0
2026-03-10T07:51:23.687 INFO:tasks.workunit.client.1.vm08.stdout:8/228: write d0/f22 [1394541,14313] 0
2026-03-10T07:51:23.690 INFO:tasks.workunit.client.1.vm08.stdout:1/184: dwrite d2/d6/de/d1f/d22/f35 [4194304,4194304] 0
2026-03-10T07:51:23.701 INFO:tasks.workunit.client.1.vm08.stdout:1/185: dwrite d2/d6/de/d1f/d22/f35 [4194304,4194304] 0
2026-03-10T07:51:23.701 INFO:tasks.workunit.client.1.vm08.stdout:1/186: fdatasync d2/d6/de/d1f/d22/f30 0
2026-03-10T07:51:23.703 INFO:tasks.workunit.client.1.vm08.stdout:0/224: mknod dd/d10/d14/d1b/d30/c4f 0
2026-03-10T07:51:23.703 INFO:tasks.workunit.client.1.vm08.stdout:9/207: rename d2/fb to d2/de/f4f 0
2026-03-10T07:51:23.705 INFO:tasks.workunit.client.1.vm08.stdout:2/225: symlink d0/d1/d3/d39/l4c 0
2026-03-10T07:51:23.712 INFO:tasks.workunit.client.1.vm08.stdout:1/187: unlink d2/d6/d11/f1e 0
2026-03-10T07:51:23.712 INFO:tasks.workunit.client.1.vm08.stdout:1/188: stat d2/d6/f3b 0
2026-03-10T07:51:23.713 INFO:tasks.workunit.client.1.vm08.stdout:4/148: symlink d5/d1f/d31/l34 0
2026-03-10T07:51:23.716 INFO:tasks.workunit.client.1.vm08.stdout:8/229: symlink d0/d3b/d3f/l48 0
2026-03-10T07:51:23.716 INFO:tasks.workunit.client.1.vm08.stdout:0/225: mknod dd/c50 0
2026-03-10T07:51:23.723 INFO:tasks.workunit.client.1.vm08.stdout:4/149: dwrite d5/d8/f1d [0,4194304] 0
2026-03-10T07:51:23.724 INFO:tasks.workunit.client.1.vm08.stdout:0/226: dwrite f5 [0,4194304] 0
2026-03-10T07:51:23.728 INFO:tasks.workunit.client.1.vm08.stdout:1/189: dread d2/d6/de/f1c [0,4194304] 0
2026-03-10T07:51:23.729 INFO:tasks.workunit.client.1.vm08.stdout:2/226: read d0/f1e [2582860,121851] 0
2026-03-10T07:51:23.731 INFO:tasks.workunit.client.1.vm08.stdout:4/150: dwrite d5/d8/d9/f1b [0,4194304] 0
2026-03-10T07:51:23.737 INFO:tasks.workunit.client.1.vm08.stdout:4/151: chown d5/d17/c29 79 1
2026-03-10T07:51:23.737 INFO:tasks.workunit.client.1.vm08.stdout:4/152: dread - d5/d17/f22 zero size
2026-03-10T07:51:23.750 INFO:tasks.workunit.client.1.vm08.stdout:8/230: mkdir d0/df/d2e/d49 0
2026-03-10T07:51:23.753 INFO:tasks.workunit.client.1.vm08.stdout:9/208: rename d2/d26/f31 to d2/d3/f50 0
2026-03-10T07:51:23.765 INFO:tasks.workunit.client.1.vm08.stdout:5/245: dwrite d0/d4/df/d12/f46 [0,4194304] 0
2026-03-10T07:51:23.765 INFO:tasks.workunit.client.1.vm08.stdout:7/235: rmdir d3/da 39
2026-03-10T07:51:23.766 INFO:tasks.workunit.client.1.vm08.stdout:7/236: write d3/f6 [4948572,84760] 0
2026-03-10T07:51:23.767 INFO:tasks.workunit.client.1.vm08.stdout:3/199: dread d0/f45 [0,4194304] 0
2026-03-10T07:51:23.767 INFO:tasks.workunit.client.1.vm08.stdout:7/237: fsync d3/f34 0
2026-03-10T07:51:23.767 INFO:tasks.workunit.client.1.vm08.stdout:3/200: write d0/f13 [3570727,127870] 0
2026-03-10T07:51:23.769 INFO:tasks.workunit.client.1.vm08.stdout:3/201: chown d0/d3c/f20 1859 1
2026-03-10T07:51:23.769 INFO:tasks.workunit.client.1.vm08.stdout:3/202: fdatasync d0/d3c/f20 0
2026-03-10T07:51:23.770 INFO:tasks.workunit.client.1.vm08.stdout:0/227: creat dd/d10/d14/d15/d20/d22/f51 x:0 0 0
2026-03-10T07:51:23.772 INFO:tasks.workunit.client.1.vm08.stdout:0/228: chown dd/f13 51065 1
2026-03-10T07:51:23.775 INFO:tasks.workunit.client.1.vm08.stdout:2/227: chown d0/d1/d3/d5/cf 5996301 1
2026-03-10T07:51:23.775 INFO:tasks.workunit.client.1.vm08.stdout:3/203: dwrite d0/f16 [0,4194304] 0
2026-03-10T07:51:23.776 INFO:tasks.workunit.client.1.vm08.stdout:8/231: mkdir d0/df/d17/d25/d28/d4a 0
2026-03-10T07:51:23.776 INFO:tasks.workunit.client.1.vm08.stdout:3/204: write d0/f39 [179563,32551] 0
2026-03-10T07:51:23.779 INFO:tasks.workunit.client.1.vm08.stdout:5/246: sync
2026-03-10T07:51:23.781 INFO:tasks.workunit.client.1.vm08.stdout:9/209: rename d2/d26/f36 to d2/f51 0
2026-03-10T07:51:23.786 INFO:tasks.workunit.client.1.vm08.stdout:9/210: write d2/d3/d25/d2b/f2c [2126412,115865] 0
2026-03-10T07:51:23.787 INFO:tasks.workunit.client.1.vm08.stdout:9/211: chown d2/de 1117850 1
2026-03-10T07:51:23.787 INFO:tasks.workunit.client.1.vm08.stdout:2/228: dwrite d0/d1/d17/d27/d3a/f3d [0,4194304] 0
2026-03-10T07:51:23.798 INFO:tasks.workunit.client.1.vm08.stdout:4/153: mknod d5/d8/d9/d32/c35 0
2026-03-10T07:51:23.798 INFO:tasks.workunit.client.1.vm08.stdout:8/232: mknod d0/df/d17/d25/d28/c4b 0
2026-03-10T07:51:23.799 INFO:tasks.workunit.client.1.vm08.stdout:3/205: rmdir d0/d3c/d18/d32 39
2026-03-10T07:51:23.799 INFO:tasks.workunit.client.1.vm08.stdout:5/247: creat d0/d4/df/d1e/f49 x:0 0 0
2026-03-10T07:51:23.800 INFO:tasks.workunit.client.1.vm08.stdout:3/206: write d0/d3c/f26 [444526,119841] 0
2026-03-10T07:51:23.800 INFO:tasks.workunit.client.1.vm08.stdout:9/212: mknod d2/de/c52 0
2026-03-10T07:51:23.816 INFO:tasks.workunit.client.1.vm08.stdout:0/229: mkdir dd/d10/d14/d1b/d39/d52 0
2026-03-10T07:51:23.817 INFO:tasks.workunit.client.1.vm08.stdout:2/229: sync
2026-03-10T07:51:23.819 INFO:tasks.workunit.client.1.vm08.stdout:8/233: unlink d0/df/d15/f36 0
2026-03-10T07:51:23.823 INFO:tasks.workunit.client.1.vm08.stdout:1/190: dread d2/d6/de/d1f/d22/f24 [0,4194304] 0
2026-03-10T07:51:23.823 INFO:tasks.workunit.client.1.vm08.stdout:5/248: mkdir d0/d4/df/d4a 0
2026-03-10T07:51:23.823 INFO:tasks.workunit.client.1.vm08.stdout:4/154: dread - d5/d1f/f25 zero size
2026-03-10T07:51:23.823 INFO:tasks.workunit.client.1.vm08.stdout:3/207: rmdir d0/d3c/d1f/d25 39
2026-03-10T07:51:23.823 INFO:tasks.workunit.client.1.vm08.stdout:5/249: write d0/d4/df/d1e/f25 [3821716,126129] 0
2026-03-10T07:51:23.823 INFO:tasks.workunit.client.1.vm08.stdout:4/155: dread - d5/d1f/f25 zero size
2026-03-10T07:51:23.823 INFO:tasks.workunit.client.1.vm08.stdout:7/238: symlink d3/da/l4e 0
2026-03-10T07:51:23.823 INFO:tasks.workunit.client.1.vm08.stdout:9/213: write d2/f6 [3424355,117866] 0
2026-03-10T07:51:23.827 INFO:tasks.workunit.client.1.vm08.stdout:7/239: read d3/f6 [672756,100833] 0
2026-03-10T07:51:23.827 INFO:tasks.workunit.client.1.vm08.stdout:8/234: creat d0/d3b/f4c x:0 0 0
2026-03-10T07:51:23.828 INFO:tasks.workunit.client.1.vm08.stdout:7/240: readlink d3/da/d25/d9/l1c 0
2026-03-10T07:51:23.828 INFO:tasks.workunit.client.1.vm08.stdout:4/156: dread - d5/f2d zero size
2026-03-10T07:51:23.829 INFO:tasks.workunit.client.1.vm08.stdout:2/230: symlink d0/d1/d3/l4d 0
2026-03-10T07:51:23.829 INFO:tasks.workunit.client.1.vm08.stdout:2/231: readlink d0/d1/d3/d5/l16 0
2026-03-10T07:51:23.835 INFO:tasks.workunit.client.1.vm08.stdout:8/235: symlink d0/df/d17/l4d 0
2026-03-10T07:51:23.835 INFO:tasks.workunit.client.1.vm08.stdout:9/214: mknod d2/d26/c53 0
2026-03-10T07:51:23.836 INFO:tasks.workunit.client.1.vm08.stdout:4/157: truncate d5/d8/ff 1868494 0
2026-03-10T07:51:23.837 INFO:tasks.workunit.client.1.vm08.stdout:7/241: dwrite d3/da/d25/f1e [4194304,4194304] 0
2026-03-10T07:51:23.840 INFO:tasks.workunit.client.1.vm08.stdout:9/215: dwrite d2/f5 [0,4194304] 0
2026-03-10T07:51:23.841 INFO:tasks.workunit.client.1.vm08.stdout:1/191: dread d2/d6/de/d1f/f20 [0,4194304] 0
2026-03-10T07:51:23.848 INFO:tasks.workunit.client.1.vm08.stdout:7/242: mkdir d3/da/d25/d9/d2f/d39/d43/d4f 0
2026-03-10T07:51:23.872 INFO:tasks.workunit.client.1.vm08.stdout:1/192: creat d2/d6/de/d1f/d26/f43 x:0 0 0
2026-03-10T07:51:23.875 INFO:tasks.workunit.client.1.vm08.stdout:2/232: creat d0/d1/d3/f4e x:0 0 0
2026-03-10T07:51:23.877 INFO:tasks.workunit.client.1.vm08.stdout:9/216: symlink d2/d26/d4c/l54 0
2026-03-10T07:51:23.886 INFO:tasks.workunit.client.1.vm08.stdout:7/243: symlink d3/da/d25/d9/d2f/d3a/l50 0
2026-03-10T07:51:23.889 INFO:tasks.workunit.client.1.vm08.stdout:1/193: creat d2/d6/d11/f44 x:0 0 0
2026-03-10T07:51:23.889 INFO:tasks.workunit.client.1.vm08.stdout:1/194: chown d2/d6/de/l27 41 1
2026-03-10T07:51:23.892 INFO:tasks.workunit.client.1.vm08.stdout:9/217: creat d2/d3/d25/f55 x:0 0 0
2026-03-10T07:51:23.893 INFO:tasks.workunit.client.1.vm08.stdout:7/244: write d3/da/d25/d9/f23 [74292,44259] 0
2026-03-10T07:51:23.894 INFO:tasks.workunit.client.1.vm08.stdout:1/195: creat d2/d34/f45 x:0 0 0
2026-03-10T07:51:23.894 INFO:tasks.workunit.client.1.vm08.stdout:9/218: chown d2/d3/lf 3 1
2026-03-10T07:51:23.899 INFO:tasks.workunit.client.1.vm08.stdout:1/196: unlink d2/d6/d11/l25 0
2026-03-10T07:51:23.899 INFO:tasks.workunit.client.1.vm08.stdout:1/197: fsync d2/d34/f38 0
2026-03-10T07:51:23.901 INFO:tasks.workunit.client.1.vm08.stdout:1/198: creat d2/d6/d11/f46 x:0 0 0
2026-03-10T07:51:23.902 INFO:tasks.workunit.client.1.vm08.stdout:1/199: write d2/d6/d11/f44 [613133,74056] 0
2026-03-10T07:51:23.904 INFO:tasks.workunit.client.1.vm08.stdout:1/200: chown d2/d6/de 4310008 1
2026-03-10T07:51:23.905 INFO:tasks.workunit.client.1.vm08.stdout:1/201: truncate d2/d6/d11/f46 128663 0
2026-03-10T07:51:23.908 INFO:tasks.workunit.client.1.vm08.stdout:7/245: dread d3/da/d25/d9/f30 [0,4194304] 0
2026-03-10T07:51:23.918 INFO:tasks.workunit.client.1.vm08.stdout:7/246: creat d3/f51 x:0 0 0
2026-03-10T07:51:23.919 INFO:tasks.workunit.client.1.vm08.stdout:7/247: read - d3/da/d25/d9/f47 zero size
2026-03-10T07:51:23.932 INFO:tasks.workunit.client.1.vm08.stdout:7/248: read d3/da/f17 [1089906,48806] 0
2026-03-10T07:51:23.950 INFO:tasks.workunit.client.1.vm08.stdout:3/208: dread d0/f13 [0,4194304] 0
2026-03-10T07:51:23.953 INFO:tasks.workunit.client.1.vm08.stdout:3/209: dread d0/d3c/d18/f1e [0,4194304] 0
2026-03-10T07:51:23.953 INFO:tasks.workunit.client.1.vm08.stdout:3/210: readlink d0/l9 0
2026-03-10T07:51:23.955 INFO:tasks.workunit.client.1.vm08.stdout:6/212: write d1/f6 [900616,66886] 0
2026-03-10T07:51:23.959 INFO:tasks.workunit.client.1.vm08.stdout:3/211: dread - d0/d3c/d1f/d25/d2d/f3a zero size
2026-03-10T07:51:23.960 INFO:tasks.workunit.client.1.vm08.stdout:3/212: stat d0/c19 0
2026-03-10T07:51:23.960 INFO:tasks.workunit.client.1.vm08.stdout:3/213: truncate d0/d3c/d1f/d25/f2b 761159 0
2026-03-10T07:51:23.962 INFO:tasks.workunit.client.1.vm08.stdout:3/214: dread d0/d3c/d18/f38 [0,4194304] 0
2026-03-10T07:51:23.966 INFO:tasks.workunit.client.1.vm08.stdout:6/213: rename d1/d17/d2b/d37 to d1/d3/df/d1d/d40/d45 0
2026-03-10T07:51:23.967 INFO:tasks.workunit.client.1.vm08.stdout:9/219: getdents d2/de 0
2026-03-10T07:51:23.969 INFO:tasks.workunit.client.1.vm08.stdout:8/236: rmdir d0/df/d15 39
2026-03-10T07:51:23.969 INFO:tasks.workunit.client.1.vm08.stdout:8/237: readlink d0/df/d17/d25/d28/l29 0
2026-03-10T07:51:23.969 INFO:tasks.workunit.client.1.vm08.stdout:3/215: sync
2026-03-10T07:51:23.970 INFO:tasks.workunit.client.1.vm08.stdout:8/238: chown d0/df/d2e/d30/f33 0 1
2026-03-10T07:51:23.970 INFO:tasks.workunit.client.1.vm08.stdout:3/216: fsync d0/f39 0
2026-03-10T07:51:23.971 INFO:tasks.workunit.client.1.vm08.stdout:9/220: creat d2/d26/d4c/f56 x:0 0 0
2026-03-10T07:51:23.972 INFO:tasks.workunit.client.1.vm08.stdout:9/221: chown d2/d3/d25/d30/d35/l45 66795137 1
2026-03-10T07:51:23.988 INFO:tasks.workunit.client.1.vm08.stdout:6/214: mkdir d1/d46 0
2026-03-10T07:51:23.989 INFO:tasks.workunit.client.1.vm08.stdout:0/230: dwrite dd/d18/f21 [0,4194304] 0
2026-03-10T07:51:24.014 INFO:tasks.workunit.client.1.vm08.stdout:8/239: creat d0/d3b/d3f/f4e x:0 0 0
2026-03-10T07:51:24.018 INFO:tasks.workunit.client.1.vm08.stdout:8/240: sync
2026-03-10T07:51:24.018 INFO:tasks.workunit.client.1.vm08.stdout:8/241: chown d0/f22 26606770 1
2026-03-10T07:51:24.018 INFO:tasks.workunit.client.1.vm08.stdout:6/215: unlink d1/d3/df/c30 0
2026-03-10T07:51:24.019 INFO:tasks.workunit.client.1.vm08.stdout:6/216: write d1/db/d24/f26 [118452,114891] 0
2026-03-10T07:51:24.020 INFO:tasks.workunit.client.1.vm08.stdout:6/217: chown d1/f7 2819 1
2026-03-10T07:51:24.020 INFO:tasks.workunit.client.1.vm08.stdout:0/231: mknod dd/d10/d14/d15/d20/c53 0
2026-03-10T07:51:24.022 INFO:tasks.workunit.client.1.vm08.stdout:5/250: dwrite d0/d4/df/d12/d22/f44 [0,4194304] 0
2026-03-10T07:51:24.025 INFO:tasks.workunit.client.1.vm08.stdout:3/217: mkdir d0/d3c/d18/d4a 0
2026-03-10T07:51:24.025 INFO:tasks.workunit.client.1.vm08.stdout:3/218: read - d0/d3c/d1f/d25/d34/f36 zero size
2026-03-10T07:51:24.027 INFO:tasks.workunit.client.1.vm08.stdout:3/219: write d0/d3c/d18/f38 [2041435,61834] 0
2026-03-10T07:51:24.032 INFO:tasks.workunit.client.1.vm08.stdout:5/251: dwrite d0/d4/df/f2a [0,4194304] 0
2026-03-10T07:51:24.033 INFO:tasks.workunit.client.1.vm08.stdout:4/158: write d5/d8/f1e [492124,92105] 0
2026-03-10T07:51:24.044 INFO:tasks.workunit.client.1.vm08.stdout:0/232: truncate fb 1194765 0
2026-03-10T07:51:24.047 INFO:tasks.workunit.client.1.vm08.stdout:7/249: getdents d3/da/d25/d9/d2f/d3a 0
2026-03-10T07:51:24.047 INFO:tasks.workunit.client.1.vm08.stdout:7/250: chown d3/da/d25/d9/f47 3815173 1
2026-03-10T07:51:24.056 INFO:tasks.workunit.client.1.vm08.stdout:5/252: readlink d0/d4/df/d1e/l2b 0
2026-03-10T07:51:24.068 INFO:tasks.workunit.client.1.vm08.stdout:2/233: truncate d0/d1/d3/d5/f20 3695831 0
2026-03-10T07:51:24.073 INFO:tasks.workunit.client.1.vm08.stdout:8/242: mknod d0/df/d17/d25/d28/d4a/c4f 0
2026-03-10T07:51:24.073 INFO:tasks.workunit.client.1.vm08.stdout:6/218: mknod d1/c47 0
2026-03-10T07:51:24.074 INFO:tasks.workunit.client.1.vm08.stdout:8/243: stat d0/df/d2e/f3c 0
2026-03-10T07:51:24.074 INFO:tasks.workunit.client.1.vm08.stdout:8/244: stat d0/f22 0
2026-03-10T07:51:24.074 INFO:tasks.workunit.client.1.vm08.stdout:6/219: write d1/db/f23 [990686,29874] 0
2026-03-10T07:51:24.075 INFO:tasks.workunit.client.1.vm08.stdout:6/220: chown d1/d17/d2b 6688444 1
2026-03-10T07:51:24.078 INFO:tasks.workunit.client.1.vm08.stdout:1/202: dwrite d2/f4 [0,4194304] 0
2026-03-10T07:51:24.078 INFO:tasks.workunit.client.1.vm08.stdout:6/221: sync
2026-03-10T07:51:24.081 INFO:tasks.workunit.client.1.vm08.stdout:1/203: chown d2/d34/f3c 47342 1
2026-03-10T07:51:24.081 INFO:tasks.workunit.client.1.vm08.stdout:8/245: dwrite d0/df/f26 [0,4194304] 0
2026-03-10T07:51:24.082 INFO:tasks.workunit.client.1.vm08.stdout:0/233: mknod dd/d10/d2f/c54 0
2026-03-10T07:51:24.083 INFO:tasks.workunit.client.1.vm08.stdout:0/234: write dd/d18/f21 [600569,109072] 0
2026-03-10T07:51:24.084 INFO:tasks.workunit.client.1.vm08.stdout:3/220: creat d0/d3c/d1f/d44/f4b x:0 0 0
2026-03-10T07:51:24.102 INFO:tasks.workunit.client.1.vm08.stdout:5/253: mknod d0/d4/df/d12/c4b 0
2026-03-10T07:51:24.110 INFO:tasks.workunit.client.1.vm08.stdout:9/222: dwrite d2/d3/fc [0,4194304] 0
2026-03-10T07:51:24.122 INFO:tasks.workunit.client.1.vm08.stdout:9/223: dread d2/d3/f49 [0,4194304] 0
2026-03-10T07:51:24.122 INFO:tasks.workunit.client.1.vm08.stdout:6/222: symlink d1/db/l48 0
2026-03-10T07:51:24.122 INFO:tasks.workunit.client.1.vm08.stdout:8/246: mknod d0/c50 0
2026-03-10T07:51:24.122 INFO:tasks.workunit.client.1.vm08.stdout:1/204: rename d2/d34 to d2/d6/de/d47 0
2026-03-10T07:51:24.122 INFO:tasks.workunit.client.1.vm08.stdout:9/224: dwrite d2/d3/d25/f21 [0,4194304] 0
2026-03-10T07:51:24.122 INFO:tasks.workunit.client.1.vm08.stdout:0/235: mknod dd/d18/c55 0
2026-03-10T07:51:24.123 INFO:tasks.workunit.client.1.vm08.stdout:3/221: unlink d0/d3c/d1f/l29 0
2026-03-10T07:51:24.123 INFO:tasks.workunit.client.1.vm08.stdout:3/222: readlink d0/d3c/l1c 0
2026-03-10T07:51:24.127 INFO:tasks.workunit.client.1.vm08.stdout:3/223: dread d0/d3c/d1f/d25/d2d/f43 [0,4194304] 0
2026-03-10T07:51:24.131 INFO:tasks.workunit.client.1.vm08.stdout:3/224: read d0/d3c/d1f/d25/d2d/f43 [1864569,63668] 0
2026-03-10T07:51:24.131 INFO:tasks.workunit.client.1.vm08.stdout:4/159: dwrite d5/d1f/f25 [0,4194304] 0
2026-03-10T07:51:24.134 INFO:tasks.workunit.client.1.vm08.stdout:6/223: dread d1/d3/f2e [8388608,4194304] 0
2026-03-10T07:51:24.136 INFO:tasks.workunit.client.1.vm08.stdout:1/205: dread d2/d6/de/f1c [0,4194304] 0
2026-03-10T07:51:24.137 INFO:tasks.workunit.client.1.vm08.stdout:6/224: write d1/d17/f42 [438033,32585] 0
2026-03-10T07:51:24.138 INFO:tasks.workunit.client.1.vm08.stdout:1/206: dread - d2/d6/de/d1f/f3d zero size
2026-03-10T07:51:24.141 INFO:tasks.workunit.client.1.vm08.stdout:3/225: dwrite d0/d3c/d1f/d25/d2d/f43 [0,4194304] 0
2026-03-10T07:51:24.143 INFO:tasks.workunit.client.1.vm08.stdout:1/207: dwrite d2/d6/de/d1f/f33 [0,4194304] 0
2026-03-10T07:51:24.151 INFO:tasks.workunit.client.1.vm08.stdout:5/254: dread d0/d8/f1b [0,4194304] 0
2026-03-10T07:51:24.178 INFO:tasks.workunit.client.1.vm08.stdout:9/225: unlink d2/d3/d25/d30/d35/l45 0
2026-03-10T07:51:24.183 INFO:tasks.workunit.client.1.vm08.stdout:4/160: unlink d5/d8/f11 0
2026-03-10T07:51:24.202 INFO:tasks.workunit.client.1.vm08.stdout:5/255: creat d0/d4/df/f4c x:0 0 0
2026-03-10T07:51:24.203 INFO:tasks.workunit.client.1.vm08.stdout:5/256: write d0/d4/df/d1e/f42 [2962241,70442] 0
2026-03-10T07:51:24.203 INFO:tasks.workunit.client.1.vm08.stdout:5/257: readlink d0/d4/df/d12/d22/l2f 0
2026-03-10T07:51:24.209 INFO:tasks.workunit.client.1.vm08.stdout:8/247: unlink d0/df/d15/l1f 0
2026-03-10T07:51:24.212 INFO:tasks.workunit.client.1.vm08.stdout:7/251: getdents d3/da/d25 0
2026-03-10T07:51:24.213 INFO:tasks.workunit.client.1.vm08.stdout:2/234: getdents d0/d1/d17 0
2026-03-10T07:51:24.213 INFO:tasks.workunit.client.1.vm08.stdout:2/235: write d0/d1/d17/f1a [5032544,87277] 0
2026-03-10T07:51:24.214 INFO:tasks.workunit.client.1.vm08.stdout:2/236: fdatasync d0/d1/f24 0
2026-03-10T07:51:24.214 INFO:tasks.workunit.client.1.vm08.stdout:3/226: symlink d0/d41/l4c 0
2026-03-10T07:51:24.214 INFO:tasks.workunit.client.1.vm08.stdout:2/237: readlink d0/d1/d3/d5/l16 0
2026-03-10T07:51:24.217 INFO:tasks.workunit.client.1.vm08.stdout:2/238: stat d0/d1/d17/d27/d42/l3f 0
2026-03-10T07:51:24.218 INFO:tasks.workunit.client.1.vm08.stdout:6/225: creat d1/f49 x:0 0 0
2026-03-10T07:51:24.220 INFO:tasks.workunit.client.1.vm08.stdout:0/236: creat dd/f56 x:0 0 0
2026-03-10T07:51:24.220 INFO:tasks.workunit.client.1.vm08.stdout:4/161: mknod d5/d8/c36 0
2026-03-10T07:51:24.220 INFO:tasks.workunit.client.1.vm08.stdout:0/237: chown dd/d10/d14/f46 7182 1
2026-03-10T07:51:24.220 INFO:tasks.workunit.client.1.vm08.stdout:5/258: sync
2026-03-10T07:51:24.221 INFO:tasks.workunit.client.1.vm08.stdout:9/226: creat d2/d3/f57 x:0 0 0
2026-03-10T07:51:24.222 INFO:tasks.workunit.client.1.vm08.stdout:5/259: readlink d0/d4/df/d12/d22/l2f 0
2026-03-10T07:51:24.224 INFO:tasks.workunit.client.1.vm08.stdout:2/239: creat d0/d1/d3/d10/f4f x:0 0 0
2026-03-10T07:51:24.224 INFO:tasks.workunit.client.1.vm08.stdout:6/226: rename d1/f7 to d1/d3/d3e/f4a 0
2026-03-10T07:51:24.225 INFO:tasks.workunit.client.1.vm08.stdout:5/260: write d0/d4/df/d12/f46 [3133010,26132] 0
2026-03-10T07:51:24.226 INFO:tasks.workunit.client.1.vm08.stdout:0/238: dread dd/d10/d14/f46 [0,4194304] 0
2026-03-10T07:51:24.232 INFO:tasks.workunit.client.1.vm08.stdout:4/162: dwrite d5/d8/d9/f2b [0,4194304] 0
2026-03-10T07:51:24.239 INFO:tasks.workunit.client.1.vm08.stdout:7/252: dread d3/f4 [0,4194304] 0
2026-03-10T07:51:24.240 INFO:tasks.workunit.client.1.vm08.stdout:7/253: truncate d3/da/d25/d9/f47 514998 0
2026-03-10T07:51:24.241 INFO:tasks.workunit.client.1.vm08.stdout:3/227: dwrite d0/d3c/d1f/d25/d34/f3d [0,4194304] 0
2026-03-10T07:51:24.244 INFO:tasks.workunit.client.1.vm08.stdout:3/228: dread - d0/d3c/d1f/d25/d2d/f3a zero size
2026-03-10T07:51:24.251 INFO:tasks.workunit.client.1.vm08.stdout:2/240: truncate d0/f12 4033526 0
2026-03-10T07:51:24.251 INFO:tasks.workunit.client.1.vm08.stdout:2/241: chown d0/d1/d17/d27/d42/c41 15 1
2026-03-10T07:51:24.255 INFO:tasks.workunit.client.1.vm08.stdout:2/242: dread d0/d1/fb [0,4194304] 0
2026-03-10T07:51:24.263 INFO:tasks.workunit.client.1.vm08.stdout:2/243: chown d0/d1/d3/d5/d1b/f26 587111 1
2026-03-10T07:51:24.264 INFO:tasks.workunit.client.1.vm08.stdout:6/227: dread d1/d3/df/d1d/f1f [0,4194304] 0
2026-03-10T07:51:24.264 INFO:tasks.workunit.client.1.vm08.stdout:7/254: rename d3/da/d25/d9/f20 to d3/da/d25/d9/d2f/d3a/d40/f52 0
2026-03-10T07:51:24.264 INFO:tasks.workunit.client.1.vm08.stdout:7/255: creat d3/da/d25/d9/f53 x:0 0 0
2026-03-10T07:51:24.265 INFO:tasks.workunit.client.1.vm08.stdout:6/228: symlink d1/d3/df/d1d/l4b 0
2026-03-10T07:51:24.265 INFO:tasks.workunit.client.1.vm08.stdout:6/229: chown d1/d17/f42 25825 1
2026-03-10T07:51:24.266 INFO:tasks.workunit.client.1.vm08.stdout:6/230: write d1/f49 [1022225,67978] 0
2026-03-10T07:51:24.268 INFO:tasks.workunit.client.1.vm08.stdout:4/163: creat d5/d1f/f37 x:0 0 0
2026-03-10T07:51:24.269 INFO:tasks.workunit.client.1.vm08.stdout:1/208: truncate d2/d6/f18 1483263 0
2026-03-10T07:51:24.270 INFO:tasks.workunit.client.1.vm08.stdout:1/209: dread - d2/d6/f3b zero size
2026-03-10T07:51:24.275 INFO:tasks.workunit.client.1.vm08.stdout:8/248: dwrite d0/fa [0,4194304] 0
2026-03-10T07:51:24.276 INFO:tasks.workunit.client.1.vm08.stdout:8/249: chown d0/df/f13 2048816 1
2026-03-10T07:51:24.278 INFO:tasks.workunit.client.1.vm08.stdout:4/164: sync
2026-03-10T07:51:24.293 INFO:tasks.workunit.client.1.vm08.stdout:8/250: dwrite d0/df/d17/f35 [4194304,4194304] 0
2026-03-10T07:51:24.301 INFO:tasks.workunit.client.1.vm08.stdout:8/251: chown d0/df/d17/d25/l27 0 1
2026-03-10T07:51:24.302 INFO:tasks.workunit.client.1.vm08.stdout:0/239: truncate dd/d10/d14/f46 112064 0
2026-03-10T07:51:24.305 INFO:tasks.workunit.client.1.vm08.stdout:9/227: getdents d2/d3/d25/d2b 0
2026-03-10T07:51:24.306 INFO:tasks.workunit.client.1.vm08.stdout:3/229: creat d0/d3c/d1f/d25/f4d x:0 0 0
2026-03-10T07:51:24.307 INFO:tasks.workunit.client.1.vm08.stdout:6/231: symlink d1/d17/d2b/l4c 0
2026-03-10T07:51:24.308 INFO:tasks.workunit.client.1.vm08.stdout:2/244: creat d0/f50 x:0 0 0
2026-03-10T07:51:24.313 INFO:tasks.workunit.client.1.vm08.stdout:5/261: getdents d0/d4/d19/d3a 0
2026-03-10T07:51:24.318 INFO:tasks.workunit.client.1.vm08.stdout:5/262: dwrite d0/d4/df/f2a [0,4194304] 0
2026-03-10T07:51:24.324 INFO:tasks.workunit.client.1.vm08.stdout:5/263: chown d0/d4/cd 1865565 1
2026-03-10T07:51:24.324 INFO:tasks.workunit.client.1.vm08.stdout:2/245: sync
2026-03-10T07:51:24.329 INFO:tasks.workunit.client.1.vm08.stdout:4/165: dwrite d5/d1f/d31/f33 [0,4194304] 0
2026-03-10T07:51:24.332 INFO:tasks.workunit.client.1.vm08.stdout:1/210: dread d2/d10/f3e [0,4194304] 0
2026-03-10T07:51:24.346 INFO:tasks.workunit.client.1.vm08.stdout:9/228: rename d2/d26/d4c to d2/d58 0
2026-03-10T07:51:24.351 INFO:tasks.workunit.client.1.vm08.stdout:9/229: readlink d2/d3/lf 0
2026-03-10T07:51:24.355 INFO:tasks.workunit.client.1.vm08.stdout:6/232: dread d1/d3/f2e [8388608,4194304] 0
2026-03-10T07:51:24.357 INFO:tasks.workunit.client.1.vm08.stdout:0/240: dread f2 [0,4194304] 0
2026-03-10T07:51:24.358 INFO:tasks.workunit.client.1.vm08.stdout:0/241: chown dd/d10/d2f/d32/d3d 1438 1
2026-03-10T07:51:24.358 INFO:tasks.workunit.client.1.vm08.stdout:0/242: write dd/f16 [2432077,51183] 0
2026-03-10T07:51:24.359 INFO:tasks.workunit.client.1.vm08.stdout:0/243: readlink dd/d10/d14/d15/l17 0
2026-03-10T07:51:24.361 INFO:tasks.workunit.client.1.vm08.stdout:2/246: mknod d0/d1/d3/d10/d38/c51 0
2026-03-10T07:51:24.364 INFO:tasks.workunit.client.1.vm08.stdout:7/256: dwrite d3/da/d46/f49 [0,4194304] 0
2026-03-10T07:51:24.373 INFO:tasks.workunit.client.1.vm08.stdout:3/230: rmdir d0 39
2026-03-10T07:51:24.375 INFO:tasks.workunit.client.1.vm08.stdout:4/166: creat d5/d1f/d31/f38 x:0 0 0
2026-03-10T07:51:24.376 INFO:tasks.workunit.client.1.vm08.stdout:4/167: fsync d5/d17/f22 0
2026-03-10T07:51:24.376 INFO:tasks.workunit.client.1.vm08.stdout:8/252: symlink d0/df/l51 0
2026-03-10T07:51:24.376 INFO:tasks.workunit.client.1.vm08.stdout:4/168: write d5/d17/f22 [133700,58764] 0
2026-03-10T07:51:24.377 INFO:tasks.workunit.client.1.vm08.stdout:8/253: write d0/d3b/f4c [948610,108074] 0
2026-03-10T07:51:24.381 INFO:tasks.workunit.client.1.vm08.stdout:4/169: dread d5/d8/d9/f1b [0,4194304] 0
2026-03-10T07:51:24.381 INFO:tasks.workunit.client.1.vm08.stdout:1/211: rename d2/d6/de/f15 to d2/d6/de/d1f/d26/f48 0
2026-03-10T07:51:24.382 INFO:tasks.workunit.client.1.vm08.stdout:1/212: readlink d2/l2c 0
2026-03-10T07:51:24.382 INFO:tasks.workunit.client.1.vm08.stdout:1/213: write d2/d6/de/f32 [1000342,90320] 0
2026-03-10T07:51:24.384 INFO:tasks.workunit.client.1.vm08.stdout:1/214: truncate d2/d6/de/d1f/d26/f43 455229 0
2026-03-10T07:51:24.387 INFO:tasks.workunit.client.1.vm08.stdout:1/215: dread d2/d6/d11/f44 [0,4194304] 0
2026-03-10T07:51:24.392 INFO:tasks.workunit.client.1.vm08.stdout:6/233: rmdir d1/db 39
2026-03-10T07:51:24.393 INFO:tasks.workunit.client.1.vm08.stdout:9/230: dwrite d2/de/f4d [0,4194304] 0
2026-03-10T07:51:24.397 INFO:tasks.workunit.client.1.vm08.stdout:9/231: dwrite d2/f4e [0,4194304] 0
2026-03-10T07:51:24.427 INFO:tasks.workunit.client.1.vm08.stdout:0/244: symlink dd/d10/d14/d15/l57 0
2026-03-10T07:51:24.433 INFO:tasks.workunit.client.1.vm08.stdout:2/247: creat d0/d1/d3/d10/d38/f52 x:0 0 0
2026-03-10T07:51:24.433 INFO:tasks.workunit.client.1.vm08.stdout:2/248: stat d0/f1e 0
2026-03-10T07:51:24.434 INFO:tasks.workunit.client.1.vm08.stdout:7/257: fsync d3/da/d25/f27 0
2026-03-10T07:51:24.435 INFO:tasks.workunit.client.1.vm08.stdout:7/258: write d3/da/f1d [2382551,68670] 0
2026-03-10T07:51:24.436 INFO:tasks.workunit.client.1.vm08.stdout:5/264: mknod d0/d4/df/d12/d1c/d40/c4d 0
2026-03-10T07:51:24.436 INFO:tasks.workunit.client.1.vm08.stdout:7/259: stat d3/da/d25/d9/d2f/d3a/l50 0
2026-03-10T07:51:24.438 INFO:tasks.workunit.client.1.vm08.stdout:8/254: mknod d0/df/d15/d23/d39/c52 0
2026-03-10T07:51:24.442 INFO:tasks.workunit.client.1.vm08.stdout:1/216: rename d2/f13 to d2/d6/de/d1f/d22/f49 0
2026-03-10T07:51:24.443 INFO:tasks.workunit.client.1.vm08.stdout:9/232: mknod d2/de/c59 0
2026-03-10T07:51:24.445 INFO:tasks.workunit.client.1.vm08.stdout:7/260: mkdir d3/da/d25/d9/d2f/d3a/d40/d54 0
2026-03-10T07:51:24.447 INFO:tasks.workunit.client.1.vm08.stdout:6/234: dwrite d1/d3/d3e/f4a [0,4194304] 0
2026-03-10T07:51:24.453 INFO:tasks.workunit.client.1.vm08.stdout:0/245: rename dd/d10/d2f/d32 to dd/d29/d58 0
2026-03-10T07:51:24.453 INFO:tasks.workunit.client.1.vm08.stdout:5/265: read d0/d4/d19/d43/f35 [2650513,99808] 0
2026-03-10T07:51:24.454 INFO:tasks.workunit.client.1.vm08.stdout:0/246: truncate dd/f44 679225 0
2026-03-10T07:51:24.458 INFO:tasks.workunit.client.1.vm08.stdout:6/235: sync
2026-03-10T07:51:24.458 INFO:tasks.workunit.client.1.vm08.stdout:8/255: sync
2026-03-10T07:51:24.459 INFO:tasks.workunit.client.1.vm08.stdout:6/236: write d1/d3/d3e/f4a [2139143,33895] 0
2026-03-10T07:51:24.463 INFO:tasks.workunit.client.1.vm08.stdout:9/233: unlink d2/d26/c53 0
2026-03-10T07:51:24.463 INFO:tasks.workunit.client.1.vm08.stdout:1/217: creat d2/d6/de/d1f/d26/f4a x:0 0 0
2026-03-10T07:51:24.464 INFO:tasks.workunit.client.1.vm08.stdout:1/218: chown d2/d6/de/d1f/d26 58 1
2026-03-10T07:51:24.468 INFO:tasks.workunit.client.1.vm08.stdout:3/231: rename d0/f13 to d0/d3c/d1f/d25/d34/f4e 0
2026-03-10T07:51:24.473 INFO:tasks.workunit.client.1.vm08.stdout:5/266: symlink d0/d4/df/d1e/d41/l4e 0
2026-03-10T07:51:24.473 INFO:tasks.workunit.client.1.vm08.stdout:8/256: mkdir d0/df/d15/d53 0
2026-03-10T07:51:24.473 INFO:tasks.workunit.client.1.vm08.stdout:6/237: unlink d1/f41 0
2026-03-10T07:51:24.473 INFO:tasks.workunit.client.1.vm08.stdout:9/234: mknod d2/de/d28/c5a 0
2026-03-10T07:51:24.490 INFO:tasks.workunit.client.1.vm08.stdout:6/238: dwrite d1/db/f23 [0,4194304] 0
2026-03-10T07:51:24.498 INFO:tasks.workunit.client.1.vm08.stdout:4/170: truncate d5/f2f 22303 0
2026-03-10T07:51:24.501 INFO:tasks.workunit.client.1.vm08.stdout:4/171: truncate d5/d1f/f37 87250 0
2026-03-10T07:51:24.508 INFO:tasks.workunit.client.1.vm08.stdout:2/249: write d0/d1/d3/d5/f20 [3327692,4075] 0
2026-03-10T07:51:24.509 INFO:tasks.workunit.client.1.vm08.stdout:5/267: mknod d0/d4/d19/d3a/c4f 0
2026-03-10T07:51:24.512 INFO:tasks.workunit.client.1.vm08.stdout:2/250: dwrite d0/d1/d17/f1a [0,4194304] 0
2026-03-10T07:51:24.517 INFO:tasks.workunit.client.1.vm08.stdout:2/251: dread d0/d1/d3/d5/d1b/f26 [0,4194304] 0
2026-03-10T07:51:24.526 INFO:tasks.workunit.client.1.vm08.stdout:3/232: symlink d0/d3c/d18/d4a/l4f 0
2026-03-10T07:51:24.529 INFO:tasks.workunit.client.1.vm08.stdout:9/235: dread d2/d3/f50 [0,4194304] 0
2026-03-10T07:51:24.531 INFO:tasks.workunit.client.1.vm08.stdout:3/233: dread d0/d3c/f26 [0,4194304] 0
2026-03-10T07:51:24.531 INFO:tasks.workunit.client.1.vm08.stdout:0/247: link dd/d10/d14/d1b/d30/c4f dd/d10/d14/c59 0
2026-03-10T07:51:24.532 INFO:tasks.workunit.client.1.vm08.stdout:8/257: mkdir d0/df/d15/d23/d54 0
2026-03-10T07:51:24.533 INFO:tasks.workunit.client.1.vm08.stdout:8/258: fsync d0/df/d2e/d30/f43 0
2026-03-10T07:51:24.533 INFO:tasks.workunit.client.1.vm08.stdout:6/239: mkdir d1/d3/df/d1d/d40/d4d 0
2026-03-10T07:51:24.534 INFO:tasks.workunit.client.1.vm08.stdout:4/172: write d5/f21 [452984,24805] 0
2026-03-10T07:51:24.537 INFO:tasks.workunit.client.1.vm08.stdout:8/259: chown d0/df/d17/d25/d28/l29 0 1
2026-03-10T07:51:24.549 INFO:tasks.workunit.client.1.vm08.stdout:2/252: creat d0/d1/d3/d10/d38/f53 x:0 0 0
2026-03-10T07:51:24.550 INFO:tasks.workunit.client.1.vm08.stdout:7/261: getdents d3/da/d25/d9/d2f/d39/d43 0
2026-03-10T07:51:24.559 INFO:tasks.workunit.client.1.vm08.stdout:9/236: symlink d2/d3/d25/d2b/l5b 0
2026-03-10T07:51:24.559 INFO:tasks.workunit.client.1.vm08.stdout:1/219: truncate d2/d6/de/d1f/d22/f35 7491903 0
2026-03-10T07:51:24.560 INFO:tasks.workunit.client.1.vm08.stdout:3/234: mkdir d0/d3c/d1f/d25/d34/d50 0
2026-03-10T07:51:24.561 INFO:tasks.workunit.client.1.vm08.stdout:3/235: write d0/d3c/f20 [621894,24483] 0
2026-03-10T07:51:24.564 INFO:tasks.workunit.client.1.vm08.stdout:0/248: symlink dd/d10/d14/d15/d20/d22/l5a 0
2026-03-10T07:51:24.568 INFO:tasks.workunit.client.1.vm08.stdout:6/240: creat d1/db/f4e x:0 0 0
2026-03-10T07:51:24.569 INFO:tasks.workunit.client.1.vm08.stdout:8/260: mknod d0/d3b/c55 0
2026-03-10T07:51:24.570 INFO:tasks.workunit.client.1.vm08.stdout:9/237: dread d2/d3/d25/d2b/f37 [0,4194304] 0
2026-03-10T07:51:24.572 INFO:tasks.workunit.client.1.vm08.stdout:9/238: fdatasync d2/f4e 0
2026-03-10T07:51:24.574 INFO:tasks.workunit.client.1.vm08.stdout:0/249: dwrite dd/d29/f2a [0,4194304] 0
2026-03-10T07:51:24.575 INFO:tasks.workunit.client.1.vm08.stdout:7/262: creat d3/da/d25/d9/d2f/d3a/d40/f55 x:0 0 0
2026-03-10T07:51:24.575 INFO:tasks.workunit.client.1.vm08.stdout:5/268: write d0/f30 [1853592,8407] 0
2026-03-10T07:51:24.579 INFO:tasks.workunit.client.1.vm08.stdout:4/173: dwrite d5/f26 [0,4194304] 0
2026-03-10T07:51:24.579 INFO:tasks.workunit.client.1.vm08.stdout:2/253: creat d0/d1/d3/d10/d38/f54 x:0 0 0
2026-03-10T07:51:24.580 INFO:tasks.workunit.client.1.vm08.stdout:3/236: dwrite d0/d3c/d1f/d25/d2d/f3a [0,4194304] 0
2026-03-10T07:51:24.583 INFO:tasks.workunit.client.1.vm08.stdout:5/269: write d0/d4/df/d1e/f42 [3973076,99047] 0
2026-03-10T07:51:24.583 INFO:tasks.workunit.client.1.vm08.stdout:3/237: chown d0/d3c/d1f/d25/d2d/f43 13128391 1
2026-03-10T07:51:24.587 INFO:tasks.workunit.client.1.vm08.stdout:6/241: symlink d1/db/l4f 0
2026-03-10T07:51:24.592 INFO:tasks.workunit.client.1.vm08.stdout:9/239: mknod d2/d3/d25/d30/c5c 0
2026-03-10T07:51:24.593 INFO:tasks.workunit.client.1.vm08.stdout:5/270: dread d0/d4/df/d1e/f37 [0,4194304] 0
2026-03-10T07:51:24.594 INFO:tasks.workunit.client.1.vm08.stdout:5/271: readlink d0/d33/l39 0
2026-03-10T07:51:24.594 INFO:tasks.workunit.client.1.vm08.stdout:7/263: dwrite d3/da/f1d [0,4194304] 0
2026-03-10T07:51:24.598 INFO:tasks.workunit.client.1.vm08.stdout:5/272: chown d0/d4/df/d12/d1c/f2c 3652 1
2026-03-10T07:51:24.598 INFO:tasks.workunit.client.1.vm08.stdout:7/264: write d3/da/f21 [1076693,129290] 0
2026-03-10T07:51:24.622 INFO:tasks.workunit.client.1.vm08.stdout:2/254: rename d0/d1/d3/d5 to d0/d1/d17/d27/d42/d55 0
2026-03-10T07:51:24.623 INFO:tasks.workunit.client.1.vm08.stdout:2/255: readlink d0/d1/d3/l4d 0
2026-03-10T07:51:24.625 INFO:tasks.workunit.client.1.vm08.stdout:6/242: creat d1/db/d24/f50 x:0 0 0
2026-03-10T07:51:24.625 INFO:tasks.workunit.client.1.vm08.stdout:6/243: read - d1/db/d24/f50 zero size
2026-03-10T07:51:24.626 INFO:tasks.workunit.client.1.vm08.stdout:6/244: truncate d1/d3/df/d1d/f2a 1038652 0
2026-03-10T07:51:24.640 INFO:tasks.workunit.client.1.vm08.stdout:6/245: chown d1/d3/df/d1d/d40 2669191 1
2026-03-10T07:51:24.640 INFO:tasks.workunit.client.1.vm08.stdout:8/261: creat d0/df/d15/d23/d54/f56 x:0 0 0
2026-03-10T07:51:24.640 INFO:tasks.workunit.client.1.vm08.stdout:9/240: creat d2/d3/d25/d30/f5d x:0 0 0
2026-03-10T07:51:24.645 INFO:tasks.workunit.client.1.vm08.stdout:5/273: dwrite d0/d4/d19/d43/f35 [0,4194304] 0
2026-03-10T07:51:24.645 INFO:tasks.workunit.client.1.vm08.stdout:7/265: creat d3/da/d25/d9/d2f/d39/f56 x:0 0 0
2026-03-10T07:51:24.647 INFO:tasks.workunit.client.1.vm08.stdout:6/246: mkdir d1/db/d24/d51 0
2026-03-10T07:51:24.650 INFO:tasks.workunit.client.1.vm08.stdout:2/256: mkdir d0/d1/d3/d56 0
2026-03-10T07:51:24.655 INFO:tasks.workunit.client.1.vm08.stdout:5/274: dwrite d0/d4/d19/d3a/f3f [0,4194304] 0
2026-03-10T07:51:24.655 INFO:tasks.workunit.client.1.vm08.stdout:0/250: getdents dd/d10/d14 0
2026-03-10T07:51:24.661 INFO:tasks.workunit.client.1.vm08.stdout:0/251: chown dd/d29/f2a 176581280 1
2026-03-10T07:51:24.662 INFO:tasks.workunit.client.1.vm08.stdout:5/275: mkdir d0/d4/d19/d50 0
2026-03-10T07:51:24.666 INFO:tasks.workunit.client.1.vm08.stdout:2/257: dread d0/d1/d3/f8 [0,4194304] 0
2026-03-10T07:51:24.666 INFO:tasks.workunit.client.1.vm08.stdout:2/258: write d0/f4a [179736,55649] 0
2026-03-10T07:51:24.667 INFO:tasks.workunit.client.1.vm08.stdout:2/259: stat d0/d1/d17/d27/d42/d55/cc 0
2026-03-10T07:51:24.670 INFO:tasks.workunit.client.1.vm08.stdout:9/241: getdents d2/d3/d25/d30/d35 0
2026-03-10T07:51:24.671 INFO:tasks.workunit.client.1.vm08.stdout:0/252: unlink dd/d10/d14/d15/l34 0
2026-03-10T07:51:24.671 INFO:tasks.workunit.client.1.vm08.stdout:2/260: fdatasync d0/d1/d17/d27/d42/d55/f13 0
2026-03-10T07:51:24.672 INFO:tasks.workunit.client.1.vm08.stdout:0/253: write dd/f16 [548067,57405] 0
2026-03-10T07:51:24.673 INFO:tasks.workunit.client.1.vm08.stdout:1/220: rmdir d2/d6/de/d1f/d22 39
2026-03-10T07:51:24.674 INFO:tasks.workunit.client.1.vm08.stdout:7/266: getdents d3/da/d25/d9/d2f 0
2026-03-10T07:51:24.684 INFO:tasks.workunit.client.1.vm08.stdout:3/238: rename d0/d3c/d1f/d25 to d0/d3c/d1f/d44/d51 0
2026-03-10T07:51:24.689 INFO:tasks.workunit.client.1.vm08.stdout:3/239: write d0/d3c/d18/f2e [861906,8174] 0
2026-03-10T07:51:24.697 INFO:tasks.workunit.client.1.vm08.stdout:9/242: sync
2026-03-10T07:51:24.701 INFO:tasks.workunit.client.1.vm08.stdout:5/276: creat d0/f51 x:0 0 0
2026-03-10T07:51:24.701 INFO:tasks.workunit.client.1.vm08.stdout:9/243: dwrite d2/f44 [0,4194304] 0
2026-03-10T07:51:24.701 INFO:tasks.workunit.client.1.vm08.stdout:3/240: mkdir d0/d41/d52 0
2026-03-10T07:51:24.703 INFO:tasks.workunit.client.1.vm08.stdout:9/244: chown d2/d3/d25 11848923 1
2026-03-10T07:51:24.715 INFO:tasks.workunit.client.1.vm08.stdout:3/241: creat d0/d3c/d1f/d44/d51/d34/f53 x:0 0 0
2026-03-10T07:51:24.716 INFO:tasks.workunit.client.1.vm08.stdout:5/277: dwrite d0/d4/d19/d3a/f3f [0,4194304] 0
2026-03-10T07:51:24.724 INFO:tasks.workunit.client.1.vm08.stdout:8/262: rename d0/c7 to d0/df/d15/d23/d54/c57 0
2026-03-10T07:51:24.724 INFO:tasks.workunit.client.1.vm08.stdout:1/221: creat d2/d6/de/d1f/f4b x:0 0 0
2026-03-10T07:51:24.724 INFO:tasks.workunit.client.1.vm08.stdout:8/263: readlink d0/df/d17/l4d 0
2026-03-10T07:51:24.725 INFO:tasks.workunit.client.1.vm08.stdout:7/267: creat d3/f57 x:0 0 0
2026-03-10T07:51:24.725 INFO:tasks.workunit.client.1.vm08.stdout:1/222: fsync d2/d6/de/d47/f3c 0
2026-03-10T07:51:24.725 INFO:tasks.workunit.client.1.vm08.stdout:1/223: chown d2 55940 1
2026-03-10T07:51:24.727 INFO:tasks.workunit.client.1.vm08.stdout:1/224: read - d2/d6/de/d1f/d26/f4a zero size
2026-03-10T07:51:24.727 INFO:tasks.workunit.client.1.vm08.stdout:5/278: symlink d0/d4/df/d1e/d41/l52 0
2026-03-10T07:51:24.728 INFO:tasks.workunit.client.1.vm08.stdout:3/242: symlink d0/d3c/d1f/l54 0
2026-03-10T07:51:24.728 INFO:tasks.workunit.client.1.vm08.stdout:1/225: write d2/d6/de/d47/f38 [850826,18343] 0
2026-03-10T07:51:24.733 INFO:tasks.workunit.client.1.vm08.stdout:7/268: write d3/da/d25/d9/d2f/d3a/d40/f52 [1068900,118283] 0
2026-03-10T07:51:24.736 INFO:tasks.workunit.client.1.vm08.stdout:7/269: chown d3/c7 5808 1
2026-03-10T07:51:24.736 INFO:tasks.workunit.client.1.vm08.stdout:5/279: dwrite d0/d4/f31 [0,4194304] 0
2026-03-10T07:51:24.744 INFO:tasks.workunit.client.1.vm08.stdout:3/243: dwrite d0/fc [4194304,4194304] 0
2026-03-10T07:51:24.751 INFO:tasks.workunit.client.1.vm08.stdout:3/244: readlink d0/d3c/l1b 0
2026-03-10T07:51:24.751 INFO:tasks.workunit.client.1.vm08.stdout:9/245: dread d2/d3/d25/d30/f47 [0,4194304] 0
2026-03-10T07:51:24.751 INFO:tasks.workunit.client.1.vm08.stdout:5/280: dwrite d0/d4/df/d12/d1c/f29 [0,4194304] 0
2026-03-10T07:51:24.751 INFO:tasks.workunit.client.1.vm08.stdout:5/281: dread d0/d4/d19/d43/f35 [0,4194304] 0
2026-03-10T07:51:24.751 INFO:tasks.workunit.client.1.vm08.stdout:5/282: read d0/d8/f18 [81430,48097] 0
2026-03-10T07:51:24.757 INFO:tasks.workunit.client.1.vm08.stdout:5/283: dread d0/d4/df/d1e/f37 [0,4194304] 0
2026-03-10T07:51:24.757 INFO:tasks.workunit.client.1.vm08.stdout:5/284: fsync d0/d4/df/d1e/f42 0
2026-03-10T07:51:24.774 INFO:tasks.workunit.client.1.vm08.stdout:9/246: rename d2/d3/l3a to d2/l5e 0
2026-03-10T07:51:24.774 INFO:tasks.workunit.client.1.vm08.stdout:9/247: stat d2/c1d 0
2026-03-10T07:51:24.774 INFO:tasks.workunit.client.1.vm08.stdout:3/245: mkdir d0/d3c/d18/d48/d55 0
2026-03-10T07:51:24.776 INFO:tasks.workunit.client.1.vm08.stdout:7/270: dread d3/f2e [0,4194304] 0
2026-03-10T07:51:24.777 INFO:tasks.workunit.client.1.vm08.stdout:7/271: write d3/da/d25/d9/f23 [2288046,3937] 0
2026-03-10T07:51:24.784 INFO:tasks.workunit.client.1.vm08.stdout:9/248: dwrite d2/de/f1e [0,4194304] 0
2026-03-10T07:51:24.791 INFO:tasks.workunit.client.1.vm08.stdout:5/285: creat d0/d4/d19/d3a/f53 x:0 0 0
2026-03-10T07:51:24.807 INFO:tasks.workunit.client.1.vm08.stdout:3/246: rename d0/d3c/d1f/d44/d51/d34/d50 to d0/d3c/d18/d48/d55/d56 0
2026-03-10T07:51:24.808 INFO:tasks.workunit.client.1.vm08.stdout:3/247: stat d0/d3c/d1f/d44/d51/f4d 0
2026-03-10T07:51:24.809 INFO:tasks.workunit.client.1.vm08.stdout:8/264: fsync d0/df/d15/d23/d54/f56 0
2026-03-10T07:51:24.810 INFO:tasks.workunit.client.1.vm08.stdout:8/265: truncate d0/df/d2e/f3c 143453 0
2026-03-10T07:51:24.810 INFO:tasks.workunit.client.1.vm08.stdout:8/266: dread - d0/df/d15/d23/f3d zero size
2026-03-10T07:51:24.817 INFO:tasks.workunit.client.1.vm08.stdout:4/174: truncate d5/d1f/f25 647989 0
2026-03-10T07:51:24.821 INFO:tasks.workunit.client.1.vm08.stdout:6/247: dwrite d1/d3/f19 [0,4194304] 0
2026-03-10T07:51:24.825 INFO:tasks.workunit.client.1.vm08.stdout:7/272: creat d3/da/d25/d9/d2f/d39/f58 x:0 0 0
2026-03-10T07:51:24.827 INFO:tasks.workunit.client.1.vm08.stdout:0/254: fsync dd/f44 0
2026-03-10T07:51:24.835 INFO:tasks.workunit.client.1.vm08.stdout:2/261: dwrite d0/f12 [0,4194304] 0
2026-03-10T07:51:24.843 INFO:tasks.workunit.client.1.vm08.stdout:2/262: truncate d0/d1/d17/f1a 5835299 0
2026-03-10T07:51:24.848 INFO:tasks.workunit.client.1.vm08.stdout:6/248: dread d1/d3/f13 [0,4194304] 0
2026-03-10T07:51:24.848 INFO:tasks.workunit.client.1.vm08.stdout:2/263: truncate d0/d1/d3/d10/d38/f52 596878 0
2026-03-10T07:51:24.848 INFO:tasks.workunit.client.1.vm08.stdout:5/286: truncate d0/d8/f18 785304 0
2026-03-10T07:51:24.848 INFO:tasks.workunit.client.1.vm08.stdout:3/248: creat d0/d41/f57 x:0 0 0
2026-03-10T07:51:24.848 INFO:tasks.workunit.client.1.vm08.stdout:8/267: mknod d0/df/d2e/d30/c58 0
2026-03-10T07:51:24.848 INFO:tasks.workunit.client.1.vm08.stdout:8/268: write d0/df/d17/f35 [703934,48827] 0
2026-03-10T07:51:24.880 INFO:tasks.workunit.client.1.vm08.stdout:5/287: dread d0/d8/d24/f47 [0,4194304] 0
2026-03-10T07:51:24.881 INFO:tasks.workunit.client.1.vm08.stdout:5/288: write d0/d8/d24/f3e [935424,78805] 0
2026-03-10T07:51:24.894 INFO:tasks.workunit.client.1.vm08.stdout:7/273: unlink d3/f16 0
2026-03-10T07:51:24.896 INFO:tasks.workunit.client.1.vm08.stdout:0/255: fsync dd/d10/d14/f46 0
2026-03-10T07:51:24.897 INFO:tasks.workunit.client.1.vm08.stdout:1/226: dwrite d2/d6/de/d1f/d26/f29 [0,4194304] 0
2026-03-10T07:51:24.899 INFO:tasks.workunit.client.1.vm08.stdout:5/289: sync
2026-03-10T07:51:24.912 INFO:tasks.workunit.client.1.vm08.stdout:3/249: chown d0/lf 65528753 1
2026-03-10T07:51:24.912 INFO:tasks.workunit.client.1.vm08.stdout:8/269: symlink d0/df/d2e/l59 0
2026-03-10T07:51:24.912 INFO:tasks.workunit.client.1.vm08.stdout:3/250: chown d0/l11 13 1
2026-03-10T07:51:24.913 INFO:tasks.workunit.client.1.vm08.stdout:2/264: dread d0/d1/d17/d27/d42/d55/d1b/f26 [0,4194304] 0
2026-03-10T07:51:24.921 INFO:tasks.workunit.client.1.vm08.stdout:9/249: creat d2/d3/f5f x:0 0 0
2026-03-10T07:51:24.921 INFO:tasks.workunit.client.1.vm08.stdout:9/250: readlink d2/d3/d25/d2b/l5b 0
2026-03-10T07:51:24.924 INFO:tasks.workunit.client.1.vm08.stdout:7/274: creat d3/da/d25/d9/f59 x:0 0 0
2026-03-10T07:51:24.925 INFO:tasks.workunit.client.1.vm08.stdout:7/275: chown d3/da/l24 87 1
2026-03-10T07:51:24.929 INFO:tasks.workunit.client.1.vm08.stdout:0/256: creat dd/d29/d58/d3d/f5b x:0 0 0
2026-03-10T07:51:24.930 INFO:tasks.workunit.client.1.vm08.stdout:0/257: dread - dd/d29/d58/d3d/f5b zero size
2026-03-10T07:51:24.934 INFO:tasks.workunit.client.1.vm08.stdout:4/175: creat d5/d8/f39 x:0 0 0
2026-03-10T07:51:24.940 INFO:tasks.workunit.client.1.vm08.stdout:3/251: rename d0/d3c/d1f/d44/d51/d34/f36 to d0/d3c/d1f/d44/d51/d2d/f58 0
2026-03-10T07:51:24.943 INFO:tasks.workunit.client.1.vm08.stdout:9/251: unlink
d2/d3/d25/d2b/f2c 0 2026-03-10T07:51:24.943 INFO:tasks.workunit.client.1.vm08.stdout:9/252: dread - d2/d3/f5f zero size 2026-03-10T07:51:24.947 INFO:tasks.workunit.client.1.vm08.stdout:0/258: mkdir dd/d29/d5c 0 2026-03-10T07:51:24.949 INFO:tasks.workunit.client.1.vm08.stdout:4/176: symlink d5/d8/d9/d12/l3a 0 2026-03-10T07:51:24.962 INFO:tasks.workunit.client.1.vm08.stdout:2/265: mkdir d0/d1/d3/d56/d57 0 2026-03-10T07:51:24.972 INFO:tasks.workunit.client.1.vm08.stdout:7/276: symlink d3/da/d25/d9/d2f/d39/d43/d4f/l5a 0 2026-03-10T07:51:24.972 INFO:tasks.workunit.client.1.vm08.stdout:7/277: dread - d3/da/d25/d9/d2f/d39/f58 zero size 2026-03-10T07:51:24.975 INFO:tasks.workunit.client.1.vm08.stdout:9/253: mknod d2/d3/d25/d30/c60 0 2026-03-10T07:51:24.976 INFO:tasks.workunit.client.1.vm08.stdout:0/259: creat dd/d10/d14/d15/d20/d22/f5d x:0 0 0 2026-03-10T07:51:24.985 INFO:tasks.workunit.client.1.vm08.stdout:4/177: dread d5/d1f/f25 [0,4194304] 0 2026-03-10T07:51:24.987 INFO:tasks.workunit.client.1.vm08.stdout:8/270: link d0/d3b/c55 d0/df/d17/d25/d28/c5a 0 2026-03-10T07:51:24.987 INFO:tasks.workunit.client.1.vm08.stdout:2/266: creat d0/d1/d3/d10/f58 x:0 0 0 2026-03-10T07:51:24.987 INFO:tasks.workunit.client.1.vm08.stdout:8/271: fsync d0/df/d17/f32 0 2026-03-10T07:51:24.995 INFO:tasks.workunit.client.1.vm08.stdout:7/278: mkdir d3/da/d25/d9/d2f/d39/d43/d4f/d5b 0 2026-03-10T07:51:24.997 INFO:tasks.workunit.client.1.vm08.stdout:8/272: fdatasync d0/d3b/d3f/f4e 0 2026-03-10T07:51:24.997 INFO:tasks.workunit.client.1.vm08.stdout:9/254: creat d2/d26/f61 x:0 0 0 2026-03-10T07:51:24.998 INFO:tasks.workunit.client.1.vm08.stdout:9/255: chown d2/f5 541069 1 2026-03-10T07:51:24.999 INFO:tasks.workunit.client.1.vm08.stdout:6/249: truncate d1/db/f23 2150636 0 2026-03-10T07:51:25.002 INFO:tasks.workunit.client.1.vm08.stdout:5/290: getdents d0/d4/df 0 2026-03-10T07:51:25.003 INFO:tasks.workunit.client.1.vm08.stdout:6/250: sync 2026-03-10T07:51:25.007 
INFO:tasks.workunit.client.1.vm08.stdout:2/267: unlink d0/d1/d17/d27/d42/d55/d1b/f3c 0 2026-03-10T07:51:25.010 INFO:tasks.workunit.client.1.vm08.stdout:8/273: dwrite d0/f20 [0,4194304] 0 2026-03-10T07:51:25.016 INFO:tasks.workunit.client.1.vm08.stdout:2/268: dwrite d0/d1/d3/f4e [0,4194304] 0 2026-03-10T07:51:25.016 INFO:tasks.workunit.client.1.vm08.stdout:7/279: creat d3/da/d25/f5c x:0 0 0 2026-03-10T07:51:25.019 INFO:tasks.workunit.client.1.vm08.stdout:8/274: dwrite d0/f22 [0,4194304] 0 2026-03-10T07:51:25.019 INFO:tasks.workunit.client.1.vm08.stdout:9/256: mkdir d2/d3/d25/d30/d35/d62 0 2026-03-10T07:51:25.021 INFO:tasks.workunit.client.1.vm08.stdout:9/257: dread - d2/d26/f61 zero size 2026-03-10T07:51:25.031 INFO:tasks.workunit.client.1.vm08.stdout:4/178: rename d5/l28 to d5/d17/l3b 0 2026-03-10T07:51:25.038 INFO:tasks.workunit.client.1.vm08.stdout:2/269: dread d0/d1/d17/f1a [4194304,4194304] 0 2026-03-10T07:51:25.040 INFO:tasks.workunit.client.1.vm08.stdout:5/291: mknod d0/d4/df/d1e/d41/c54 0 2026-03-10T07:51:25.062 INFO:tasks.workunit.client.1.vm08.stdout:3/252: getdents d0/d3c/d1f/d44/d51/d34 0 2026-03-10T07:51:25.065 INFO:tasks.workunit.client.1.vm08.stdout:0/260: creat dd/d10/f5e x:0 0 0 2026-03-10T07:51:25.066 INFO:tasks.workunit.client.1.vm08.stdout:0/261: chown f7 243 1 2026-03-10T07:51:25.066 INFO:tasks.workunit.client.1.vm08.stdout:0/262: chown dd/d18/f25 8 1 2026-03-10T07:51:25.066 INFO:tasks.workunit.client.1.vm08.stdout:8/275: rmdir d0/d3b/d3f 39 2026-03-10T07:51:25.068 INFO:tasks.workunit.client.1.vm08.stdout:9/258: creat d2/d3/d25/f63 x:0 0 0 2026-03-10T07:51:25.069 INFO:tasks.workunit.client.1.vm08.stdout:8/276: dread - d0/df/d2e/d30/f43 zero size 2026-03-10T07:51:25.071 INFO:tasks.workunit.client.1.vm08.stdout:8/277: write d0/df/d2e/f44 [117279,94531] 0 2026-03-10T07:51:25.073 INFO:tasks.workunit.client.1.vm08.stdout:0/263: dwrite dd/f16 [0,4194304] 0 2026-03-10T07:51:25.074 INFO:tasks.workunit.client.1.vm08.stdout:9/259: dwrite d2/d3/fa 
[0,4194304] 0 2026-03-10T07:51:25.080 INFO:tasks.workunit.client.1.vm08.stdout:7/280: truncate d3/da/d25/d9/fd 1339498 0 2026-03-10T07:51:25.080 INFO:tasks.workunit.client.1.vm08.stdout:7/281: chown d3/f2e 9713719 1 2026-03-10T07:51:25.080 INFO:tasks.workunit.client.1.vm08.stdout:9/260: chown d2/d3/d25/d30/f46 1075614338 1 2026-03-10T07:51:25.080 INFO:tasks.workunit.client.1.vm08.stdout:2/270: unlink d0/d1/d17/d27/d42/d55/d1b/c31 0 2026-03-10T07:51:25.084 INFO:tasks.workunit.client.1.vm08.stdout:0/264: dwrite dd/fe [0,4194304] 0 2026-03-10T07:51:25.085 INFO:tasks.workunit.client.1.vm08.stdout:0/265: write dd/d10/d14/d15/d20/d22/f2e [1666605,84155] 0 2026-03-10T07:51:25.090 INFO:tasks.workunit.client.1.vm08.stdout:8/278: rename d0/df/d17/d25/d28 to d0/df/d15/d23/d39/d5b 0 2026-03-10T07:51:25.096 INFO:tasks.workunit.client.1.vm08.stdout:8/279: dwrite d0/df/d2e/d30/f33 [0,4194304] 0 2026-03-10T07:51:25.105 INFO:tasks.workunit.client.1.vm08.stdout:5/292: mknod d0/c55 0 2026-03-10T07:51:25.105 INFO:tasks.workunit.client.1.vm08.stdout:7/282: rmdir d3/da 39 2026-03-10T07:51:25.105 INFO:tasks.workunit.client.1.vm08.stdout:0/266: mkdir dd/d10/d14/d15/d20/d5f 0 2026-03-10T07:51:25.105 INFO:tasks.workunit.client.1.vm08.stdout:5/293: fdatasync d0/d4/d19/d43/f35 0 2026-03-10T07:51:25.105 INFO:tasks.workunit.client.1.vm08.stdout:5/294: stat d0/d4/df/d12/f46 0 2026-03-10T07:51:25.105 INFO:tasks.workunit.client.1.vm08.stdout:9/261: dwrite d2/de/f4d [0,4194304] 0 2026-03-10T07:51:25.108 INFO:tasks.workunit.client.1.vm08.stdout:9/262: fdatasync d2/d3/f57 0 2026-03-10T07:51:25.108 INFO:tasks.workunit.client.1.vm08.stdout:9/263: chown d2/de/d28/c5a 1 1 2026-03-10T07:51:25.109 INFO:tasks.workunit.client.1.vm08.stdout:9/264: write d2/de/f4d [2483465,125280] 0 2026-03-10T07:51:25.121 INFO:tasks.workunit.client.1.vm08.stdout:3/253: getdents d0/d3c/d18/d32 0 2026-03-10T07:51:25.122 INFO:tasks.workunit.client.1.vm08.stdout:3/254: chown d0/d3c/d18/f38 134369351 1 2026-03-10T07:51:25.123 
INFO:tasks.workunit.client.1.vm08.stdout:3/255: stat d0/d3c/f20 0 2026-03-10T07:51:25.135 INFO:tasks.workunit.client.1.vm08.stdout:0/267: symlink dd/d29/d5c/l60 0 2026-03-10T07:51:25.144 INFO:tasks.workunit.client.1.vm08.stdout:7/283: rename d3/da/d25/c28 to d3/da/d25/d9/d2f/d3a/d40/d54/c5d 0 2026-03-10T07:51:25.154 INFO:tasks.workunit.client.1.vm08.stdout:9/265: read d2/d26/f29 [4735559,3060] 0 2026-03-10T07:51:25.156 INFO:tasks.workunit.client.1.vm08.stdout:3/256: creat d0/d3c/d1f/d44/f59 x:0 0 0 2026-03-10T07:51:25.156 INFO:tasks.workunit.client.1.vm08.stdout:3/257: dread - d0/d3c/d1f/d44/d51/d34/f47 zero size 2026-03-10T07:51:25.157 INFO:tasks.workunit.client.1.vm08.stdout:3/258: chown d0/f16 1565 1 2026-03-10T07:51:25.158 INFO:tasks.workunit.client.1.vm08.stdout:8/280: rmdir d0/d3b/d42 0 2026-03-10T07:51:25.163 INFO:tasks.workunit.client.1.vm08.stdout:5/295: link d0/d4/df/d12/d1c/f29 d0/d8/d24/f56 0 2026-03-10T07:51:25.163 INFO:tasks.workunit.client.1.vm08.stdout:1/227: dwrite d2/d6/de/d1f/d22/f35 [4194304,4194304] 0 2026-03-10T07:51:25.164 INFO:tasks.workunit.client.1.vm08.stdout:7/284: mknod d3/da/d25/d9/d2f/d39/d43/c5e 0 2026-03-10T07:51:25.164 INFO:tasks.workunit.client.1.vm08.stdout:9/266: fsync d2/d3/d25/d30/f46 0 2026-03-10T07:51:25.173 INFO:tasks.workunit.client.1.vm08.stdout:5/296: dwrite d0/d4/df/d1e/f28 [0,4194304] 0 2026-03-10T07:51:25.176 INFO:tasks.workunit.client.1.vm08.stdout:5/297: read d0/d8/fe [146594,7871] 0 2026-03-10T07:51:25.178 INFO:tasks.workunit.client.1.vm08.stdout:1/228: symlink d2/d6/de/d1f/l4c 0 2026-03-10T07:51:25.181 INFO:tasks.workunit.client.1.vm08.stdout:7/285: mknod d3/da/d25/d9/d2f/d3a/d4b/c5f 0 2026-03-10T07:51:25.182 INFO:tasks.workunit.client.1.vm08.stdout:7/286: dread d3/da/d25/f32 [0,4194304] 0 2026-03-10T07:51:25.183 INFO:tasks.workunit.client.1.vm08.stdout:1/229: dwrite d2/d10/f3f [0,4194304] 0 2026-03-10T07:51:25.184 INFO:tasks.workunit.client.1.vm08.stdout:9/267: link d2/f4e d2/d3/d25/d30/d35/f64 0 
2026-03-10T07:51:25.188 INFO:tasks.workunit.client.1.vm08.stdout:9/268: stat d2/d3/d25/d30/l43 0 2026-03-10T07:51:25.191 INFO:tasks.workunit.client.1.vm08.stdout:5/298: creat d0/d4/df/f57 x:0 0 0 2026-03-10T07:51:25.192 INFO:tasks.workunit.client.1.vm08.stdout:5/299: write d0/d8/d24/f3e [251802,124465] 0 2026-03-10T07:51:25.193 INFO:tasks.workunit.client.1.vm08.stdout:5/300: dread d0/d8/fe [0,4194304] 0 2026-03-10T07:51:25.193 INFO:tasks.workunit.client.1.vm08.stdout:5/301: readlink d0/d4/df/d1e/l2b 0 2026-03-10T07:51:25.195 INFO:tasks.workunit.client.1.vm08.stdout:8/281: getdents d0/df/d17 0 2026-03-10T07:51:25.206 INFO:tasks.workunit.client.1.vm08.stdout:1/230: truncate d2/d10/f3e 2502048 0 2026-03-10T07:51:25.206 INFO:tasks.workunit.client.1.vm08.stdout:9/269: symlink d2/de/d28/l65 0 2026-03-10T07:51:25.211 INFO:tasks.workunit.client.1.vm08.stdout:8/282: mknod d0/df/d15/d23/d39/d5b/d4a/c5c 0 2026-03-10T07:51:25.212 INFO:tasks.workunit.client.1.vm08.stdout:8/283: write d0/df/d17/f32 [1551265,34317] 0 2026-03-10T07:51:25.212 INFO:tasks.workunit.client.1.vm08.stdout:7/287: symlink d3/da/d25/d9/d2f/d4d/l60 0 2026-03-10T07:51:25.212 INFO:tasks.workunit.client.1.vm08.stdout:7/288: stat d3/f57 0 2026-03-10T07:51:25.212 INFO:tasks.workunit.client.1.vm08.stdout:8/284: chown d0/df/d15/d23/d39/d5b 5 1 2026-03-10T07:51:25.216 INFO:tasks.workunit.client.1.vm08.stdout:5/302: symlink d0/d4/l58 0 2026-03-10T07:51:25.217 INFO:tasks.workunit.client.1.vm08.stdout:7/289: mknod d3/da/d25/d9/d2f/c61 0 2026-03-10T07:51:25.218 INFO:tasks.workunit.client.1.vm08.stdout:1/231: creat d2/d6/de/d1f/d40/f4d x:0 0 0 2026-03-10T07:51:25.218 INFO:tasks.workunit.client.1.vm08.stdout:8/285: mkdir d0/df/d5d 0 2026-03-10T07:51:25.218 INFO:tasks.workunit.client.1.vm08.stdout:7/290: dread d3/da/d25/f32 [0,4194304] 0 2026-03-10T07:51:25.219 INFO:tasks.workunit.client.1.vm08.stdout:5/303: creat d0/d4/d19/d43/f59 x:0 0 0 2026-03-10T07:51:25.224 INFO:tasks.workunit.client.1.vm08.stdout:7/291: dwrite 
d3/f57 [0,4194304] 0 2026-03-10T07:51:25.229 INFO:tasks.workunit.client.1.vm08.stdout:5/304: fdatasync d0/d4/df/d1e/f37 0 2026-03-10T07:51:25.229 INFO:tasks.workunit.client.1.vm08.stdout:7/292: write d3/f4 [7036301,9141] 0 2026-03-10T07:51:25.231 INFO:tasks.workunit.client.1.vm08.stdout:5/305: stat d0/d4/df/d12/d22/f44 0 2026-03-10T07:51:25.236 INFO:tasks.workunit.client.1.vm08.stdout:5/306: dread d0/d8/d24/f47 [0,4194304] 0 2026-03-10T07:51:25.240 INFO:tasks.workunit.client.1.vm08.stdout:1/232: link d2/d6/d11/l23 d2/d6/de/d1f/l4e 0 2026-03-10T07:51:25.250 INFO:tasks.workunit.client.1.vm08.stdout:7/293: link d3/da/f21 d3/da/d25/d9/d2f/f62 0 2026-03-10T07:51:25.257 INFO:tasks.workunit.client.1.vm08.stdout:5/307: truncate d0/d4/d19/d43/f35 2997276 0 2026-03-10T07:51:25.258 INFO:tasks.workunit.client.1.vm08.stdout:5/308: write d0/d4/df/d12/d22/f44 [1823983,46752] 0 2026-03-10T07:51:25.265 INFO:tasks.workunit.client.1.vm08.stdout:7/294: creat d3/da/d25/d9/d2f/d3a/d40/f63 x:0 0 0 2026-03-10T07:51:25.267 INFO:tasks.workunit.client.1.vm08.stdout:7/295: dread d3/f2e [0,4194304] 0 2026-03-10T07:51:25.275 INFO:tasks.workunit.client.1.vm08.stdout:6/251: write d1/d17/d2b/f3c [4730356,9619] 0 2026-03-10T07:51:25.275 INFO:tasks.workunit.client.1.vm08.stdout:6/252: read d1/d17/f42 [152119,7621] 0 2026-03-10T07:51:25.275 INFO:tasks.workunit.client.1.vm08.stdout:7/296: mknod d3/da/d25/d9/d2f/d39/d43/d4f/c64 0 2026-03-10T07:51:25.275 INFO:tasks.workunit.client.1.vm08.stdout:7/297: readlink d3/da/d25/d9/d2f/d4d/l60 0 2026-03-10T07:51:25.277 INFO:tasks.workunit.client.1.vm08.stdout:7/298: chown d3/da/d25/d9/d2f/f62 1493441227 1 2026-03-10T07:51:25.277 INFO:tasks.workunit.client.1.vm08.stdout:7/299: write d3/da/d25/d9/d2f/d3a/d40/f55 [47220,6479] 0 2026-03-10T07:51:25.280 INFO:tasks.workunit.client.1.vm08.stdout:5/309: link d0/f51 d0/d4/df/f5a 0 2026-03-10T07:51:25.280 INFO:tasks.workunit.client.1.vm08.stdout:1/233: rename d2/d6/de/d1f/l39 to d2/d6/de/d1f/l4f 0 2026-03-10T07:51:25.281 
INFO:tasks.workunit.client.1.vm08.stdout:5/310: dread d0/d8/f1b [0,4194304] 0 2026-03-10T07:51:25.282 INFO:tasks.workunit.client.1.vm08.stdout:4/179: dread - d5/d17/f1a zero size 2026-03-10T07:51:25.282 INFO:tasks.workunit.client.1.vm08.stdout:5/311: chown d0/d4/d19/d3a/c4f 3362154 1 2026-03-10T07:51:25.283 INFO:tasks.workunit.client.1.vm08.stdout:5/312: write d0/d4/df/d12/f46 [290647,22919] 0 2026-03-10T07:51:25.284 INFO:tasks.workunit.client.1.vm08.stdout:6/253: rename d1/d3/df/d1d/d40/d4d to d1/d3/df/d52 0 2026-03-10T07:51:25.294 INFO:tasks.workunit.client.1.vm08.stdout:7/300: creat d3/da/d46/f65 x:0 0 0 2026-03-10T07:51:25.296 INFO:tasks.workunit.client.1.vm08.stdout:1/234: mkdir d2/d6/d50 0 2026-03-10T07:51:25.296 INFO:tasks.workunit.client.1.vm08.stdout:1/235: stat d2/d6/de/f1c 0 2026-03-10T07:51:25.299 INFO:tasks.workunit.client.1.vm08.stdout:2/271: truncate d0/d1/d17/d27/d3a/f3d 2078143 0 2026-03-10T07:51:25.300 INFO:tasks.workunit.client.1.vm08.stdout:2/272: write d0/d1/d3/d10/d38/f53 [873859,16112] 0 2026-03-10T07:51:25.307 INFO:tasks.workunit.client.1.vm08.stdout:2/273: dwrite d0/d1/d3/f8 [0,4194304] 0 2026-03-10T07:51:25.307 INFO:tasks.workunit.client.1.vm08.stdout:2/274: fdatasync d0/f50 0 2026-03-10T07:51:25.308 INFO:tasks.workunit.client.1.vm08.stdout:2/275: chown d0/f50 134328 1 2026-03-10T07:51:25.312 INFO:tasks.workunit.client.1.vm08.stdout:5/313: readlink d0/l36 0 2026-03-10T07:51:25.315 INFO:tasks.workunit.client.1.vm08.stdout:6/254: creat d1/d17/d2b/f53 x:0 0 0 2026-03-10T07:51:25.319 INFO:tasks.workunit.client.1.vm08.stdout:7/301: rename d3/da/d25/d9/l19 to d3/da/d25/d9/l66 0 2026-03-10T07:51:25.319 INFO:tasks.workunit.client.1.vm08.stdout:6/255: dwrite d1/db/d24/f25 [0,4194304] 0 2026-03-10T07:51:25.322 INFO:tasks.workunit.client.1.vm08.stdout:1/236: dread d2/d6/de/d1f/d22/f24 [0,4194304] 0 2026-03-10T07:51:25.326 INFO:tasks.workunit.client.1.vm08.stdout:4/180: symlink d5/d8/d9/d12/l3c 0 2026-03-10T07:51:25.328 
INFO:tasks.workunit.client.1.vm08.stdout:1/237: write d2/d6/de/d1f/f2a [4128584,116924] 0 2026-03-10T07:51:25.329 INFO:tasks.workunit.client.1.vm08.stdout:2/276: creat d0/d1/d17/d27/d3a/f59 x:0 0 0 2026-03-10T07:51:25.331 INFO:tasks.workunit.client.1.vm08.stdout:9/270: truncate d2/d26/f29 5757464 0 2026-03-10T07:51:25.337 INFO:tasks.workunit.client.1.vm08.stdout:3/259: dwrite d0/d3c/f26 [0,4194304] 0 2026-03-10T07:51:25.337 INFO:tasks.workunit.client.1.vm08.stdout:0/268: truncate dd/f16 2469140 0 2026-03-10T07:51:25.337 INFO:tasks.workunit.client.1.vm08.stdout:0/269: dread - dd/d10/d14/d1b/d30/f4d zero size 2026-03-10T07:51:25.337 INFO:tasks.workunit.client.1.vm08.stdout:0/270: chown dd/d10/d14/d15/d20/d22/l5a 1 1 2026-03-10T07:51:25.339 INFO:tasks.workunit.client.1.vm08.stdout:3/260: dwrite d0/d3c/d1f/d44/d51/d34/f53 [0,4194304] 0 2026-03-10T07:51:25.366 INFO:tasks.workunit.client.1.vm08.stdout:2/277: symlink d0/d1/d17/d27/d3a/l5a 0 2026-03-10T07:51:25.366 INFO:tasks.workunit.client.1.vm08.stdout:2/278: dread - d0/d1/d17/d27/d3a/f59 zero size 2026-03-10T07:51:25.368 INFO:tasks.workunit.client.1.vm08.stdout:2/279: truncate d0/d1/d3/d10/d38/f54 836795 0 2026-03-10T07:51:25.381 INFO:tasks.workunit.client.1.vm08.stdout:9/271: rename d2/d3/f50 to d2/de/d28/f66 0 2026-03-10T07:51:25.392 INFO:tasks.workunit.client.1.vm08.stdout:3/261: dread d0/d3c/d18/f38 [0,4194304] 0 2026-03-10T07:51:25.392 INFO:tasks.workunit.client.1.vm08.stdout:7/302: unlink d3/da/d25/d9/d2f/l44 0 2026-03-10T07:51:25.398 INFO:tasks.workunit.client.1.vm08.stdout:7/303: dwrite d3/da/d25/f1e [4194304,4194304] 0 2026-03-10T07:51:25.416 INFO:tasks.workunit.client.1.vm08.stdout:6/256: mknod d1/d3/df/d44/c54 0 2026-03-10T07:51:25.423 INFO:tasks.workunit.client.1.vm08.stdout:3/262: write d0/d3c/d18/f1e [3541180,72211] 0 2026-03-10T07:51:25.429 INFO:tasks.workunit.client.1.vm08.stdout:0/271: creat dd/d10/d14/d15/d20/d5f/f61 x:0 0 0 2026-03-10T07:51:25.429 INFO:tasks.workunit.client.1.vm08.stdout:7/304: stat 
d3/da/c2a 0 2026-03-10T07:51:25.429 INFO:tasks.workunit.client.1.vm08.stdout:6/257: write d1/d3/df/d1d/f2a [1781206,22845] 0 2026-03-10T07:51:25.429 INFO:tasks.workunit.client.1.vm08.stdout:8/286: dwrite d0/df/f12 [0,4194304] 0 2026-03-10T07:51:25.431 INFO:tasks.workunit.client.1.vm08.stdout:8/287: stat d0/df/l16 0 2026-03-10T07:51:25.433 INFO:tasks.workunit.client.1.vm08.stdout:8/288: write d0/f45 [167981,99648] 0 2026-03-10T07:51:25.433 INFO:tasks.workunit.client.1.vm08.stdout:8/289: dread - d0/d3b/f46 zero size 2026-03-10T07:51:25.439 INFO:tasks.workunit.client.1.vm08.stdout:2/280: creat d0/d1/d3/d56/d57/f5b x:0 0 0 2026-03-10T07:51:25.443 INFO:tasks.workunit.client.1.vm08.stdout:8/290: dread d0/df/f12 [0,4194304] 0 2026-03-10T07:51:25.445 INFO:tasks.workunit.client.1.vm08.stdout:3/263: write d0/d3c/d1f/d44/d51/d34/f4e [988534,4456] 0 2026-03-10T07:51:25.445 INFO:tasks.workunit.client.1.vm08.stdout:3/264: read - d0/d41/f57 zero size 2026-03-10T07:51:25.446 INFO:tasks.workunit.client.1.vm08.stdout:3/265: write d0/d3c/d1f/d44/d51/f4d [94744,1868] 0 2026-03-10T07:51:25.447 INFO:tasks.workunit.client.1.vm08.stdout:7/305: chown d3/da/d25/d9/d2f/d3a/d40/d54/c5d 0 1 2026-03-10T07:51:25.451 INFO:tasks.workunit.client.1.vm08.stdout:7/306: read d3/da/d25/d9/f23 [1006153,31532] 0 2026-03-10T07:51:25.456 INFO:tasks.workunit.client.1.vm08.stdout:7/307: dwrite d3/f4 [8388608,4194304] 0 2026-03-10T07:51:25.460 INFO:tasks.workunit.client.1.vm08.stdout:1/238: getdents d2 0 2026-03-10T07:51:25.461 INFO:tasks.workunit.client.1.vm08.stdout:1/239: chown d2/d6/c9 2002403726 1 2026-03-10T07:51:25.469 INFO:tasks.workunit.client.1.vm08.stdout:2/281: rename d0/d1/d17/d27/f29 to d0/d1/d17/d27/d42/d55/d1b/f5c 0 2026-03-10T07:51:25.474 INFO:tasks.workunit.client.1.vm08.stdout:9/272: rmdir d2/d3/d25/d30/d35/d62 0 2026-03-10T07:51:25.480 INFO:tasks.workunit.client.1.vm08.stdout:4/181: dwrite d5/d8/d9/f1b [0,4194304] 0 2026-03-10T07:51:25.481 INFO:tasks.workunit.client.1.vm08.stdout:8/291: 
unlink d0/df/d15/d23/d39/d5b/c4b 0 2026-03-10T07:51:25.482 INFO:tasks.workunit.client.1.vm08.stdout:5/314: truncate d0/d4/d19/d3a/f3f 377741 0 2026-03-10T07:51:25.614 INFO:tasks.workunit.client.1.vm08.stdout:3/266: symlink d0/d3c/d18/d48/d55/l5a 0 2026-03-10T07:51:25.614 INFO:tasks.workunit.client.1.vm08.stdout:3/267: write d0/f39 [618079,53889] 0 2026-03-10T07:51:25.615 INFO:tasks.workunit.client.1.vm08.stdout:3/268: fsync d0/d3c/d1f/d44/d51/d34/f3d 0 2026-03-10T07:51:25.616 INFO:tasks.workunit.client.1.vm08.stdout:3/269: dread - d0/d3c/d1f/d44/d51/d2d/f58 zero size 2026-03-10T07:51:25.630 INFO:tasks.workunit.client.1.vm08.stdout:7/308: mkdir d3/da/d25/d9/d2f/d3a/d4b/d67 0 2026-03-10T07:51:25.643 INFO:tasks.workunit.client.1.vm08.stdout:9/273: symlink d2/d3/d25/d30/d35/l67 0 2026-03-10T07:51:25.644 INFO:tasks.workunit.client.1.vm08.stdout:4/182: write d5/d1f/d31/f33 [5149564,120341] 0 2026-03-10T07:51:25.647 INFO:tasks.workunit.client.1.vm08.stdout:9/274: dwrite d2/d3/f5f [0,4194304] 0 2026-03-10T07:51:25.673 INFO:tasks.workunit.client.1.vm08.stdout:7/309: fdatasync d3/f2b 0 2026-03-10T07:51:25.680 INFO:tasks.workunit.client.1.vm08.stdout:5/315: link d0/d4/df/d1e/d41/c54 d0/d4/d19/d50/c5b 0 2026-03-10T07:51:25.681 INFO:tasks.workunit.client.1.vm08.stdout:3/270: creat d0/d3c/d1f/d44/d51/f5b x:0 0 0 2026-03-10T07:51:25.682 INFO:tasks.workunit.client.1.vm08.stdout:5/316: creat d0/d4/df/d1e/d41/f5c x:0 0 0 2026-03-10T07:51:25.688 INFO:tasks.workunit.client.1.vm08.stdout:5/317: unlink d0/d4/df/d12/d1c/f2c 0 2026-03-10T07:51:25.690 INFO:tasks.workunit.client.1.vm08.stdout:5/318: mknod d0/d8/d24/c5d 0 2026-03-10T07:51:25.690 INFO:tasks.workunit.client.1.vm08.stdout:5/319: chown d0/d8/d24 1677 1 2026-03-10T07:51:25.694 INFO:tasks.workunit.client.1.vm08.stdout:5/320: mkdir d0/d8/d5e 0 2026-03-10T07:51:25.695 INFO:tasks.workunit.client.1.vm08.stdout:7/310: sync 2026-03-10T07:51:25.695 INFO:tasks.workunit.client.1.vm08.stdout:5/321: dwrite d0/d4/df/d1e/f28 [0,4194304] 0 
2026-03-10T07:51:25.696 INFO:tasks.workunit.client.1.vm08.stdout:7/311: dread - d3/da/d25/d9/d2f/d39/f58 zero size 2026-03-10T07:51:25.700 INFO:tasks.workunit.client.1.vm08.stdout:5/322: creat d0/d4/df/d12/d1c/d40/f5f x:0 0 0 2026-03-10T07:51:25.703 INFO:tasks.workunit.client.1.vm08.stdout:5/323: mkdir d0/d4/d19/d60 0 2026-03-10T07:51:25.705 INFO:tasks.workunit.client.1.vm08.stdout:7/312: dwrite d3/da/d46/f65 [0,4194304] 0 2026-03-10T07:51:25.705 INFO:tasks.workunit.client.1.vm08.stdout:7/313: chown d3/f2e 316795 1 2026-03-10T07:51:25.705 INFO:tasks.workunit.client.1.vm08.stdout:7/314: fdatasync d3/da/d25/d9/d2f/d39/f58 0 2026-03-10T07:51:25.712 INFO:tasks.workunit.client.1.vm08.stdout:7/315: dwrite d3/da/d25/f5c [0,4194304] 0 2026-03-10T07:51:25.716 INFO:tasks.workunit.client.1.vm08.stdout:7/316: write d3/da/d46/f49 [1878886,46880] 0 2026-03-10T07:51:25.717 INFO:tasks.workunit.client.1.vm08.stdout:7/317: creat d3/da/d25/d9/d2f/d39/d43/d4f/f68 x:0 0 0 2026-03-10T07:51:25.719 INFO:tasks.workunit.client.1.vm08.stdout:7/318: truncate d3/da/d25/d9/d2f/d3a/d40/f63 950500 0 2026-03-10T07:51:25.827 INFO:tasks.workunit.client.1.vm08.stdout:4/183: mknod d5/d8/d9/c3d 0 2026-03-10T07:51:25.839 INFO:tasks.workunit.client.1.vm08.stdout:1/240: rename d2/d6/de/d1f/l4f to d2/d6/de/d1f/l51 0 2026-03-10T07:51:25.872 INFO:tasks.workunit.client.1.vm08.stdout:6/258: truncate d1/d3/df/f12 551854 0 2026-03-10T07:51:25.875 INFO:tasks.workunit.client.1.vm08.stdout:6/259: mknod d1/d3/df/d1d/d40/d45/c55 0 2026-03-10T07:51:25.878 INFO:tasks.workunit.client.1.vm08.stdout:6/260: dread d1/db/f23 [0,4194304] 0 2026-03-10T07:51:25.882 INFO:tasks.workunit.client.1.vm08.stdout:6/261: dwrite d1/db/d24/f26 [0,4194304] 0 2026-03-10T07:51:25.883 INFO:tasks.workunit.client.1.vm08.stdout:6/262: write d1/f34 [1464659,6074] 0 2026-03-10T07:51:25.892 INFO:tasks.workunit.client.1.vm08.stdout:2/282: write d0/d1/d17/d27/d42/d55/d1b/f26 [427121,89360] 0 2026-03-10T07:51:25.892 
INFO:tasks.workunit.client.1.vm08.stdout:6/263: creat d1/d3/d3e/f56 x:0 0 0 2026-03-10T07:51:25.894 INFO:tasks.workunit.client.1.vm08.stdout:2/283: symlink d0/d1/d17/d27/d42/d55/d1b/l5d 0 2026-03-10T07:51:25.898 INFO:tasks.workunit.client.1.vm08.stdout:6/264: dwrite d1/d3/d3e/f56 [0,4194304] 0 2026-03-10T07:51:25.900 INFO:tasks.workunit.client.1.vm08.stdout:6/265: truncate d1/db/f4e 322031 0 2026-03-10T07:51:25.900 INFO:tasks.workunit.client.1.vm08.stdout:2/284: dwrite d0/d1/d3/d10/f4f [0,4194304] 0 2026-03-10T07:51:25.917 INFO:tasks.workunit.client.1.vm08.stdout:2/285: mknod d0/d1/d3/d10/d32/c5e 0 2026-03-10T07:51:25.920 INFO:tasks.workunit.client.1.vm08.stdout:2/286: dwrite d0/d1/f24 [0,4194304] 0 2026-03-10T07:51:25.922 INFO:tasks.workunit.client.1.vm08.stdout:2/287: truncate d0/d1/d17/d27/d3a/f59 842635 0 2026-03-10T07:51:25.925 INFO:tasks.workunit.client.1.vm08.stdout:2/288: stat d0/d1/d17/d27/d3a/f3d 0 2026-03-10T07:51:25.926 INFO:tasks.workunit.client.1.vm08.stdout:2/289: write d0/f4a [523745,21564] 0 2026-03-10T07:51:25.927 INFO:tasks.workunit.client.1.vm08.stdout:2/290: symlink d0/d1/d3/d10/d38/l5f 0 2026-03-10T07:51:25.928 INFO:tasks.workunit.client.1.vm08.stdout:2/291: creat d0/d1/d3/d10/d38/f60 x:0 0 0 2026-03-10T07:51:25.929 INFO:tasks.workunit.client.1.vm08.stdout:2/292: truncate d0/f4a 1305249 0 2026-03-10T07:51:25.930 INFO:tasks.workunit.client.1.vm08.stdout:2/293: chown d0/d1/d17/d27/d42/d55/d1b/l5d 53 1 2026-03-10T07:51:26.053 INFO:tasks.workunit.client.1.vm08.stdout:0/272: symlink dd/l62 0 2026-03-10T07:51:26.055 INFO:tasks.workunit.client.1.vm08.stdout:1/241: mknod d2/d6/c52 0 2026-03-10T07:51:26.055 INFO:tasks.workunit.client.1.vm08.stdout:1/242: readlink d2/d6/de/l27 0 2026-03-10T07:51:26.056 INFO:tasks.workunit.client.1.vm08.stdout:1/243: fdatasync d2/d6/de/d1f/d22/f30 0 2026-03-10T07:51:26.058 INFO:tasks.workunit.client.1.vm08.stdout:0/273: dwrite dd/d10/d2f/f4c [0,4194304] 0 2026-03-10T07:51:26.059 
INFO:tasks.workunit.client.1.vm08.stdout:0/274: write dd/d10/d14/d1b/d30/f4d [739721,22471] 0 2026-03-10T07:51:26.073 INFO:tasks.workunit.client.1.vm08.stdout:1/244: dread d2/d6/de/d47/f38 [0,4194304] 0 2026-03-10T07:51:26.074 INFO:tasks.workunit.client.1.vm08.stdout:1/245: mknod d2/d6/c53 0 2026-03-10T07:51:26.076 INFO:tasks.workunit.client.1.vm08.stdout:1/246: creat d2/d6/d50/f54 x:0 0 0 2026-03-10T07:51:26.077 INFO:tasks.workunit.client.1.vm08.stdout:1/247: symlink d2/d6/d50/l55 0 2026-03-10T07:51:26.086 INFO:tasks.workunit.client.1.vm08.stdout:0/275: sync 2026-03-10T07:51:26.087 INFO:tasks.workunit.client.1.vm08.stdout:0/276: creat dd/d29/d5c/f63 x:0 0 0 2026-03-10T07:51:26.088 INFO:tasks.workunit.client.1.vm08.stdout:0/277: dread - dd/d29/d5c/f63 zero size 2026-03-10T07:51:26.088 INFO:tasks.workunit.client.1.vm08.stdout:0/278: truncate dd/f56 72535 0 2026-03-10T07:51:26.100 INFO:tasks.workunit.client.1.vm08.stdout:1/248: mknod d2/d6/de/c56 0 2026-03-10T07:51:26.101 INFO:tasks.workunit.client.1.vm08.stdout:1/249: chown d2/d6/de/d1f/d22/c37 366 1 2026-03-10T07:51:26.102 INFO:tasks.workunit.client.1.vm08.stdout:1/250: mknod d2/d6/de/d1f/d26/c57 0 2026-03-10T07:51:26.103 INFO:tasks.workunit.client.1.vm08.stdout:1/251: truncate d2/d6/de/d47/f3c 317968 0 2026-03-10T07:51:26.106 INFO:tasks.workunit.client.1.vm08.stdout:8/292: rename d0/d3b to d0/df/d5d/d5e 0 2026-03-10T07:51:26.106 INFO:tasks.workunit.client.1.vm08.stdout:8/293: readlink d0/df/d2e/l59 0 2026-03-10T07:51:26.108 INFO:tasks.workunit.client.1.vm08.stdout:1/252: dwrite d2/d6/de/d47/f38 [0,4194304] 0 2026-03-10T07:51:26.109 INFO:tasks.workunit.client.1.vm08.stdout:9/275: rename d2/d3/d25/c23 to d2/de/d28/c68 0 2026-03-10T07:51:26.110 INFO:tasks.workunit.client.1.vm08.stdout:8/294: symlink d0/df/d15/d23/d39/d5b/l5f 0 2026-03-10T07:51:26.110 INFO:tasks.workunit.client.1.vm08.stdout:9/276: write d2/d26/f4b [297769,124804] 0 2026-03-10T07:51:26.114 INFO:tasks.workunit.client.1.vm08.stdout:3/271: rename 
d0/d3c/d18/d4a/l4f to d0/d3c/d1f/d44/l5c 0 2026-03-10T07:51:26.115 INFO:tasks.workunit.client.1.vm08.stdout:3/272: write d0/d3c/d1f/d44/f59 [48841,96823] 0 2026-03-10T07:51:26.118 INFO:tasks.workunit.client.1.vm08.stdout:3/273: chown d0/d3c/d18 993004 1 2026-03-10T07:51:26.119 INFO:tasks.workunit.client.1.vm08.stdout:3/274: chown d0/d3c/l1c 2632324 1 2026-03-10T07:51:26.119 INFO:tasks.workunit.client.1.vm08.stdout:7/319: rename d3/da/d46 to d3/da/d25/d9/d2f/d3a/d4b/d67/d69 0 2026-03-10T07:51:26.122 INFO:tasks.workunit.client.1.vm08.stdout:4/184: rename d5/d8/d9/d12/l23 to d5/d17/l3e 0 2026-03-10T07:51:26.123 INFO:tasks.workunit.client.1.vm08.stdout:7/320: unlink d3/da/d25/d9/l36 0 2026-03-10T07:51:26.129 INFO:tasks.workunit.client.1.vm08.stdout:2/294: rename d0/d1/d17/d27/d3a to d0/d1/d3/d10/d32/d61 0 2026-03-10T07:51:26.129 INFO:tasks.workunit.client.1.vm08.stdout:2/295: chown d0/c33 1040560264 1 2026-03-10T07:51:26.129 INFO:tasks.workunit.client.1.vm08.stdout:8/295: dread d0/df/f13 [4194304,4194304] 0 2026-03-10T07:51:26.130 INFO:tasks.workunit.client.1.vm08.stdout:5/324: chown d0/d4/d19/d3a/f3f 196562858 1 2026-03-10T07:51:26.130 INFO:tasks.workunit.client.1.vm08.stdout:5/325: chown d0/d4/df/d12/c38 4522 1 2026-03-10T07:51:26.137 INFO:tasks.workunit.client.1.vm08.stdout:4/185: creat d5/d1f/d31/f3f x:0 0 0 2026-03-10T07:51:26.144 INFO:tasks.workunit.client.1.vm08.stdout:0/279: rename dd/d10/d14/d1b/d39 to dd/d10/d2f/d37/d64 0 2026-03-10T07:51:26.144 INFO:tasks.workunit.client.1.vm08.stdout:2/296: rmdir d0/d1/d17 39 2026-03-10T07:51:26.144 INFO:tasks.workunit.client.1.vm08.stdout:0/280: write dd/fe [3620767,65105] 0 2026-03-10T07:51:26.147 INFO:tasks.workunit.client.1.vm08.stdout:4/186: read d5/d1f/d31/f33 [2177637,125061] 0 2026-03-10T07:51:26.148 INFO:tasks.workunit.client.1.vm08.stdout:9/277: rename d2/f44 to d2/d26/f69 0 2026-03-10T07:51:26.149 INFO:tasks.workunit.client.1.vm08.stdout:0/281: creat dd/d10/d2f/d37/f65 x:0 0 0 2026-03-10T07:51:26.149 
INFO:tasks.workunit.client.1.vm08.stdout:2/297: dwrite d0/d1/d3/d10/f58 [0,4194304] 0 2026-03-10T07:51:26.153 INFO:tasks.workunit.client.1.vm08.stdout:8/296: link d0/f22 d0/df/f60 0 2026-03-10T07:51:26.160 INFO:tasks.workunit.client.1.vm08.stdout:9/278: creat d2/d3/d25/d2b/f6a x:0 0 0 2026-03-10T07:51:26.161 INFO:tasks.workunit.client.1.vm08.stdout:9/279: truncate d2/d3/f49 4445691 0 2026-03-10T07:51:26.161 INFO:tasks.workunit.client.1.vm08.stdout:8/297: write d0/df/f13 [3367330,3835] 0 2026-03-10T07:51:26.164 INFO:tasks.workunit.client.1.vm08.stdout:9/280: dwrite d2/f1a [0,4194304] 0 2026-03-10T07:51:26.165 INFO:tasks.workunit.client.1.vm08.stdout:9/281: fsync d2/d3/d25/f63 0 2026-03-10T07:51:26.166 INFO:tasks.workunit.client.1.vm08.stdout:9/282: dread - d2/d3/d25/f63 zero size 2026-03-10T07:51:26.173 INFO:tasks.workunit.client.1.vm08.stdout:9/283: dwrite d2/d26/f61 [0,4194304] 0 2026-03-10T07:51:26.178 INFO:tasks.workunit.client.1.vm08.stdout:4/187: link l3 d5/l40 0 2026-03-10T07:51:26.178 INFO:tasks.workunit.client.1.vm08.stdout:0/282: dread dd/d10/d14/d15/d20/d22/f2e [0,4194304] 0 2026-03-10T07:51:26.180 INFO:tasks.workunit.client.1.vm08.stdout:2/298: link d0/d1/d17/f1a d0/d1/d17/d27/f62 0 2026-03-10T07:51:26.181 INFO:tasks.workunit.client.1.vm08.stdout:9/284: dwrite d2/f5 [0,4194304] 0 2026-03-10T07:51:26.197 INFO:tasks.workunit.client.1.vm08.stdout:0/283: rename dd/d10/d14/d1b/l27 to dd/d10/d14/d15/d20/l66 0 2026-03-10T07:51:26.198 INFO:tasks.workunit.client.1.vm08.stdout:0/284: write dd/d10/d14/d15/d20/d22/f5d [954819,52853] 0 2026-03-10T07:51:26.202 INFO:tasks.workunit.client.1.vm08.stdout:9/285: rename d2/de/l1f to d2/d26/l6b 0 2026-03-10T07:51:26.202 INFO:tasks.workunit.client.1.vm08.stdout:4/188: mkdir d5/d1f/d41 0 2026-03-10T07:51:26.213 INFO:tasks.workunit.client.1.vm08.stdout:0/285: mknod dd/d10/d2f/c67 0 2026-03-10T07:51:26.213 INFO:tasks.workunit.client.1.vm08.stdout:2/299: link d0/f12 d0/d1/d3/f63 0 2026-03-10T07:51:26.213 
INFO:tasks.workunit.client.1.vm08.stdout:9/286: creat d2/d3/d25/d30/d35/f6c x:0 0 0 2026-03-10T07:51:26.215 INFO:tasks.workunit.client.1.vm08.stdout:2/300: truncate d0/d1/d3/d10/f4f 4478172 0 2026-03-10T07:51:26.217 INFO:tasks.workunit.client.1.vm08.stdout:0/286: dwrite dd/f13 [0,4194304] 0 2026-03-10T07:51:26.218 INFO:tasks.workunit.client.1.vm08.stdout:4/189: write d5/f2d [64518,100871] 0 2026-03-10T07:51:26.221 INFO:tasks.workunit.client.1.vm08.stdout:2/301: fdatasync d0/f12 0 2026-03-10T07:51:26.229 INFO:tasks.workunit.client.1.vm08.stdout:2/302: write d0/d1/d3/f4e [1685424,122607] 0 2026-03-10T07:51:26.229 INFO:tasks.workunit.client.1.vm08.stdout:0/287: rmdir dd 39 2026-03-10T07:51:26.231 INFO:tasks.workunit.client.1.vm08.stdout:2/303: dwrite d0/d1/d3/d39/f3b [0,4194304] 0 2026-03-10T07:51:26.232 INFO:tasks.workunit.client.1.vm08.stdout:0/288: write dd/d10/d14/d15/d20/d22/f2e [1296215,77522] 0 2026-03-10T07:51:26.233 INFO:tasks.workunit.client.1.vm08.stdout:4/190: link d5/d8/d9/f2b d5/d8/f42 0 2026-03-10T07:51:26.234 INFO:tasks.workunit.client.1.vm08.stdout:0/289: write dd/d10/d14/d15/d20/d5f/f61 [399624,7539] 0 2026-03-10T07:51:26.235 INFO:tasks.workunit.client.1.vm08.stdout:0/290: write dd/d18/f25 [1417409,83504] 0 2026-03-10T07:51:26.247 INFO:tasks.workunit.client.1.vm08.stdout:2/304: dwrite d0/d1/d3/d10/d38/f60 [0,4194304] 0 2026-03-10T07:51:26.248 INFO:tasks.workunit.client.1.vm08.stdout:0/291: dwrite dd/d10/d2f/d37/f65 [0,4194304] 0 2026-03-10T07:51:26.250 INFO:tasks.workunit.client.1.vm08.stdout:4/191: sync 2026-03-10T07:51:26.251 INFO:tasks.workunit.client.1.vm08.stdout:2/305: write d0/d1/d3/d10/f4f [628584,106057] 0 2026-03-10T07:51:26.251 INFO:tasks.workunit.client.1.vm08.stdout:2/306: fsync d0/f44 0 2026-03-10T07:51:26.252 INFO:tasks.workunit.client.1.vm08.stdout:2/307: fdatasync d0/d1/d3/d10/d38/f53 0 2026-03-10T07:51:26.260 INFO:tasks.workunit.client.1.vm08.stdout:9/287: dread d2/f6 [0,4194304] 0 2026-03-10T07:51:26.260 
INFO:tasks.workunit.client.1.vm08.stdout:9/288: fsync d2/d3/fc 0 2026-03-10T07:51:26.264 INFO:tasks.workunit.client.1.vm08.stdout:4/192: truncate d5/f21 714538 0 2026-03-10T07:51:26.265 INFO:tasks.workunit.client.1.vm08.stdout:4/193: write f0 [67632,22563] 0 2026-03-10T07:51:26.283 INFO:tasks.workunit.client.1.vm08.stdout:9/289: mknod d2/d3/d25/d30/c6d 0 2026-03-10T07:51:26.296 INFO:tasks.workunit.client.1.vm08.stdout:4/194: dwrite d5/d1f/d31/f33 [4194304,4194304] 0 2026-03-10T07:51:26.300 INFO:tasks.workunit.client.1.vm08.stdout:4/195: fdatasync d5/d1f/d31/f38 0 2026-03-10T07:51:26.307 INFO:tasks.workunit.client.1.vm08.stdout:2/308: link d0/d1/d17/d27/d42/d55/c30 d0/d1/d17/d27/d42/c64 0 2026-03-10T07:51:26.309 INFO:tasks.workunit.client.1.vm08.stdout:2/309: mkdir d0/d1/d3/d10/d65 0 2026-03-10T07:51:26.310 INFO:tasks.workunit.client.1.vm08.stdout:2/310: chown d0/d1/d3/c2b 208 1 2026-03-10T07:51:26.310 INFO:tasks.workunit.client.1.vm08.stdout:2/311: fdatasync d0/d1/d3/d10/d38/f52 0 2026-03-10T07:51:26.316 INFO:tasks.workunit.client.1.vm08.stdout:4/196: creat d5/d17/f43 x:0 0 0 2026-03-10T07:51:26.329 INFO:tasks.workunit.client.1.vm08.stdout:2/312: mknod d0/d1/d17/d27/d42/d55/d1b/c66 0 2026-03-10T07:51:26.333 INFO:tasks.workunit.client.1.vm08.stdout:2/313: read d0/d1/d3/d10/d32/f45 [1309903,122769] 0 2026-03-10T07:51:26.336 INFO:tasks.workunit.client.1.vm08.stdout:4/197: creat d5/d8/d9/d32/f44 x:0 0 0 2026-03-10T07:51:26.344 INFO:tasks.workunit.client.1.vm08.stdout:2/314: fsync d0/f1e 0 2026-03-10T07:51:26.346 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:51:26 vm05.local ceph-mon[50387]: pgmap v32: 65 pgs: 65 active+clean; 2.6 GiB data, 8.6 GiB used, 111 GiB / 120 GiB avail; 35 MiB/s rd, 135 MiB/s wr, 310 op/s 2026-03-10T07:51:26.346 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:51:26 vm05.local ceph-mon[50387]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' 2026-03-10T07:51:26.346 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 
07:51:26 vm05.local ceph-mon[50387]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' 2026-03-10T07:51:26.346 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:51:26 vm05.local ceph-mon[50387]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' 2026-03-10T07:51:26.346 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:51:26 vm05.local ceph-mon[50387]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' 2026-03-10T07:51:26.350 INFO:tasks.workunit.client.1.vm08.stdout:6/266: dwrite d1/db/f23 [0,4194304] 0 2026-03-10T07:51:26.355 INFO:tasks.workunit.client.1.vm08.stdout:6/267: sync 2026-03-10T07:51:26.375 INFO:tasks.workunit.client.1.vm08.stdout:1/253: truncate d2/d6/de/d1f/d26/f29 3523650 0 2026-03-10T07:51:26.375 INFO:tasks.workunit.client.1.vm08.stdout:4/198: symlink d5/d8/d9/l45 0 2026-03-10T07:51:26.376 INFO:tasks.workunit.client.1.vm08.stdout:4/199: write d5/d1f/d31/f3f [1000920,81139] 0 2026-03-10T07:51:26.380 INFO:tasks.workunit.client.1.vm08.stdout:3/275: dwrite d0/d3c/d18/f38 [0,4194304] 0 2026-03-10T07:51:26.382 INFO:tasks.workunit.client.1.vm08.stdout:3/276: truncate d0/d3c/d1f/d44/d51/d34/f53 4315143 0 2026-03-10T07:51:26.383 INFO:tasks.workunit.client.1.vm08.stdout:1/254: dread d2/d6/de/d1f/d26/f48 [0,4194304] 0 2026-03-10T07:51:26.384 INFO:tasks.workunit.client.1.vm08.stdout:1/255: write d2/d6/de/d1f/d22/f35 [1532008,52054] 0 2026-03-10T07:51:26.387 INFO:tasks.workunit.client.1.vm08.stdout:2/315: unlink d0/d1/d3/d10/d38/f4b 0 2026-03-10T07:51:26.387 INFO:tasks.workunit.client.1.vm08.stdout:7/321: truncate d3/f2e 1577387 0 2026-03-10T07:51:26.388 INFO:tasks.workunit.client.1.vm08.stdout:4/200: creat d5/d8/d9/f46 x:0 0 0 2026-03-10T07:51:26.388 INFO:tasks.workunit.client.1.vm08.stdout:6/268: creat d1/db/f57 x:0 0 0 2026-03-10T07:51:26.393 INFO:tasks.workunit.client.1.vm08.stdout:5/326: dwrite d0/d8/fe [0,4194304] 0 2026-03-10T07:51:26.395 INFO:tasks.workunit.client.1.vm08.stdout:3/277: rmdir 
d0/d3c/d18/d48/d55 39 2026-03-10T07:51:26.395 INFO:tasks.workunit.client.1.vm08.stdout:0/292: fsync dd/d10/d2f/d37/f65 0 2026-03-10T07:51:26.395 INFO:tasks.workunit.client.1.vm08.stdout:3/278: fdatasync d0/d3c/d18/f23 0 2026-03-10T07:51:26.396 INFO:tasks.workunit.client.1.vm08.stdout:3/279: write d0/d3c/f26 [5147303,49684] 0 2026-03-10T07:51:26.397 INFO:tasks.workunit.client.1.vm08.stdout:3/280: write d0/f28 [101676,68202] 0 2026-03-10T07:51:26.399 INFO:tasks.workunit.client.1.vm08.stdout:3/281: readlink d0/lf 0 2026-03-10T07:51:26.404 INFO:tasks.workunit.client.1.vm08.stdout:2/316: symlink d0/d1/d3/d39/l67 0 2026-03-10T07:51:26.408 INFO:tasks.workunit.client.1.vm08.stdout:4/201: symlink d5/d1f/d31/l47 0 2026-03-10T07:51:26.408 INFO:tasks.workunit.client.1.vm08.stdout:2/317: truncate d0/d1/d3/d56/d57/f5b 586457 0 2026-03-10T07:51:26.408 INFO:tasks.workunit.client.1.vm08.stdout:6/269: mkdir d1/d17/d2b/d58 0 2026-03-10T07:51:26.414 INFO:tasks.workunit.client.1.vm08.stdout:8/298: write d0/df/f1b [4332484,91379] 0 2026-03-10T07:51:26.418 INFO:tasks.workunit.client.1.vm08.stdout:8/299: dwrite d0/df/d15/d23/d39/f3e [0,4194304] 0 2026-03-10T07:51:26.419 INFO:tasks.workunit.client.1.vm08.stdout:8/300: write d0/df/d2e/f3c [590859,129456] 0 2026-03-10T07:51:26.421 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:51:26 vm08.local ceph-mon[59917]: pgmap v32: 65 pgs: 65 active+clean; 2.6 GiB data, 8.6 GiB used, 111 GiB / 120 GiB avail; 35 MiB/s rd, 135 MiB/s wr, 310 op/s 2026-03-10T07:51:26.421 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:51:26 vm08.local ceph-mon[59917]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' 2026-03-10T07:51:26.421 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:51:26 vm08.local ceph-mon[59917]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' 2026-03-10T07:51:26.421 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:51:26 vm08.local ceph-mon[59917]: from='mgr.14628 
192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' 2026-03-10T07:51:26.421 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:51:26 vm08.local ceph-mon[59917]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' 2026-03-10T07:51:26.430 INFO:tasks.workunit.client.1.vm08.stdout:7/322: dread d3/da/d25/d9/d2f/d3a/d4b/d67/d69/f49 [0,4194304] 0 2026-03-10T07:51:26.430 INFO:tasks.workunit.client.1.vm08.stdout:7/323: read d3/f57 [2119675,45155] 0 2026-03-10T07:51:26.437 INFO:tasks.workunit.client.1.vm08.stdout:0/293: creat dd/d10/d2f/d37/d64/f68 x:0 0 0 2026-03-10T07:51:26.438 INFO:tasks.workunit.client.1.vm08.stdout:0/294: chown dd/d10/d2f/l47 1804795 1 2026-03-10T07:51:26.441 INFO:tasks.workunit.client.1.vm08.stdout:5/327: rename d0/d4/l1d to d0/d8/d24/l61 0 2026-03-10T07:51:26.452 INFO:tasks.workunit.client.1.vm08.stdout:9/290: rmdir d2/d3/d25 39 2026-03-10T07:51:26.464 INFO:tasks.workunit.client.1.vm08.stdout:4/202: dread d5/f10 [0,4194304] 0 2026-03-10T07:51:26.469 INFO:tasks.workunit.client.1.vm08.stdout:4/203: dwrite d5/d17/f22 [0,4194304] 0 2026-03-10T07:51:26.471 INFO:tasks.workunit.client.1.vm08.stdout:4/204: chown d5/d1f/d31/l34 565 1 2026-03-10T07:51:26.471 INFO:tasks.workunit.client.1.vm08.stdout:4/205: chown d5/d1f/d31/f38 11314 1 2026-03-10T07:51:26.472 INFO:tasks.workunit.client.1.vm08.stdout:4/206: write d5/d17/f1a [392281,10943] 0 2026-03-10T07:51:26.473 INFO:tasks.workunit.client.1.vm08.stdout:4/207: stat d5/d1f/d31/f38 0 2026-03-10T07:51:26.478 INFO:tasks.workunit.client.1.vm08.stdout:4/208: sync 2026-03-10T07:51:26.488 INFO:tasks.workunit.client.1.vm08.stdout:7/324: rmdir d3 39 2026-03-10T07:51:26.496 INFO:tasks.workunit.client.1.vm08.stdout:5/328: rmdir d0/d8 39 2026-03-10T07:51:26.496 INFO:tasks.workunit.client.1.vm08.stdout:5/329: chown d0/d4/df/d1e/d41/f5c 138868 1 2026-03-10T07:51:26.508 INFO:tasks.workunit.client.1.vm08.stdout:8/301: mknod d0/df/d15/d53/c61 0 2026-03-10T07:51:26.509 
INFO:tasks.workunit.client.1.vm08.stdout:6/270: creat d1/db/d24/d51/f59 x:0 0 0 2026-03-10T07:51:26.510 INFO:tasks.workunit.client.1.vm08.stdout:7/325: truncate d3/f51 697463 0 2026-03-10T07:51:26.517 INFO:tasks.workunit.client.1.vm08.stdout:2/318: creat d0/f68 x:0 0 0 2026-03-10T07:51:26.523 INFO:tasks.workunit.client.1.vm08.stdout:9/291: rename d2/d3/d25/d30/c60 to d2/d3/d25/d2b/c6e 0 2026-03-10T07:51:26.523 INFO:tasks.workunit.client.1.vm08.stdout:9/292: readlink d2/d3/d25/d2b/l40 0 2026-03-10T07:51:26.524 INFO:tasks.workunit.client.1.vm08.stdout:4/209: truncate d5/d8/d9/f2b 1976637 0 2026-03-10T07:51:26.525 INFO:tasks.workunit.client.1.vm08.stdout:6/271: creat d1/d3/df/d44/f5a x:0 0 0 2026-03-10T07:51:26.527 INFO:tasks.workunit.client.1.vm08.stdout:1/256: truncate d2/d6/de/d1f/d22/f35 7272700 0 2026-03-10T07:51:26.533 INFO:tasks.workunit.client.1.vm08.stdout:7/326: rmdir d3/da/d25/d9 39 2026-03-10T07:51:26.533 INFO:tasks.workunit.client.1.vm08.stdout:7/327: fsync d3/da/d25/f35 0 2026-03-10T07:51:26.533 INFO:tasks.workunit.client.1.vm08.stdout:7/328: fsync d3/f34 0 2026-03-10T07:51:26.535 INFO:tasks.workunit.client.1.vm08.stdout:9/293: symlink d2/de/l6f 0 2026-03-10T07:51:26.536 INFO:tasks.workunit.client.1.vm08.stdout:3/282: truncate d0/d3c/d1f/d44/d51/d34/f4e 3021375 0 2026-03-10T07:51:26.537 INFO:tasks.workunit.client.1.vm08.stdout:4/210: mkdir d5/d17/d48 0 2026-03-10T07:51:26.538 INFO:tasks.workunit.client.1.vm08.stdout:6/272: unlink d1/d3/f13 0 2026-03-10T07:51:26.539 INFO:tasks.workunit.client.1.vm08.stdout:3/283: dwrite d0/d3c/d18/f22 [0,4194304] 0 2026-03-10T07:51:26.541 INFO:tasks.workunit.client.1.vm08.stdout:0/295: write fb [663498,129203] 0 2026-03-10T07:51:26.553 INFO:tasks.workunit.client.1.vm08.stdout:1/257: mkdir d2/d6/de/d1f/d26/d58 0 2026-03-10T07:51:26.555 INFO:tasks.workunit.client.1.vm08.stdout:1/258: dread d2/d6/de/d1f/d26/f48 [0,4194304] 0 2026-03-10T07:51:26.559 INFO:tasks.workunit.client.1.vm08.stdout:2/319: mknod d0/c69 0 
2026-03-10T07:51:26.564 INFO:tasks.workunit.client.1.vm08.stdout:6/273: creat d1/d3/d3e/f5b x:0 0 0 2026-03-10T07:51:26.568 INFO:tasks.workunit.client.1.vm08.stdout:6/274: stat d1/d3/l27 0 2026-03-10T07:51:26.568 INFO:tasks.workunit.client.1.vm08.stdout:2/320: mknod d0/d1/d3/d10/d32/d61/c6a 0 2026-03-10T07:51:26.568 INFO:tasks.workunit.client.1.vm08.stdout:9/294: rename d2/de/f27 to d2/d3/d25/d30/f70 0 2026-03-10T07:51:26.571 INFO:tasks.workunit.client.1.vm08.stdout:1/259: dwrite d2/d10/f2b [4194304,4194304] 0 2026-03-10T07:51:26.574 INFO:tasks.workunit.client.1.vm08.stdout:4/211: creat d5/d1f/d41/f49 x:0 0 0 2026-03-10T07:51:26.574 INFO:tasks.workunit.client.1.vm08.stdout:1/260: dread - d2/d6/de/d1f/d26/f4a zero size 2026-03-10T07:51:26.579 INFO:tasks.workunit.client.1.vm08.stdout:3/284: fsync d0/d3c/d18/f22 0 2026-03-10T07:51:26.586 INFO:tasks.workunit.client.1.vm08.stdout:5/330: dwrite d0/d8/f18 [0,4194304] 0 2026-03-10T07:51:26.588 INFO:tasks.workunit.client.1.vm08.stdout:8/302: link d0/df/d15/d23/d39/d5b/c5a d0/df/d2e/c62 0 2026-03-10T07:51:26.589 INFO:tasks.workunit.client.1.vm08.stdout:5/331: write d0/d4/df/d12/f13 [2434724,36117] 0 2026-03-10T07:51:26.593 INFO:tasks.workunit.client.1.vm08.stdout:6/275: mkdir d1/d3/df/d1d/d40/d45/d5c 0 2026-03-10T07:51:26.599 INFO:tasks.workunit.client.1.vm08.stdout:0/296: dwrite dd/d10/d14/f46 [0,4194304] 0 2026-03-10T07:51:26.599 INFO:tasks.workunit.client.1.vm08.stdout:2/321: mkdir d0/d1/d17/d6b 0 2026-03-10T07:51:26.599 INFO:tasks.workunit.client.1.vm08.stdout:7/329: rename d3/c2d to d3/da/d25/d9/d2f/d3a/d4b/d67/c6a 0 2026-03-10T07:51:26.599 INFO:tasks.workunit.client.1.vm08.stdout:9/295: mknod d2/d3/d25/d2b/c71 0 2026-03-10T07:51:26.599 INFO:tasks.workunit.client.1.vm08.stdout:4/212: symlink d5/d1f/d31/l4a 0 2026-03-10T07:51:26.599 INFO:tasks.workunit.client.1.vm08.stdout:5/332: dwrite d0/d4/df/d12/f11 [0,4194304] 0 2026-03-10T07:51:26.601 INFO:tasks.workunit.client.1.vm08.stdout:9/296: fsync d2/d3/d25/d30/f5d 0 
2026-03-10T07:51:26.601 INFO:tasks.workunit.client.1.vm08.stdout:9/297: read - d2/d3/d25/f55 zero size 2026-03-10T07:51:26.601 INFO:tasks.workunit.client.1.vm08.stdout:9/298: chown d2/de/d28 1943017 1 2026-03-10T07:51:26.601 INFO:tasks.workunit.client.1.vm08.stdout:9/299: write d2/d3/d25/d2b/f6a [50973,59786] 0 2026-03-10T07:51:26.601 INFO:tasks.workunit.client.1.vm08.stdout:9/300: chown d2/d3 129635 1 2026-03-10T07:51:26.601 INFO:tasks.workunit.client.1.vm08.stdout:8/303: sync 2026-03-10T07:51:26.601 INFO:tasks.workunit.client.1.vm08.stdout:9/301: chown d2/d3/d25/d30 37 1 2026-03-10T07:51:26.601 INFO:tasks.workunit.client.1.vm08.stdout:1/261: symlink d2/d6/d11/l59 0 2026-03-10T07:51:26.602 INFO:tasks.workunit.client.1.vm08.stdout:9/302: chown d2/d3/d25/f24 594 1 2026-03-10T07:51:26.623 INFO:tasks.workunit.client.1.vm08.stdout:2/322: mknod d0/d1/d3/d10/d32/c6c 0 2026-03-10T07:51:26.623 INFO:tasks.workunit.client.1.vm08.stdout:5/333: dread d0/d4/df/d12/d1c/f29 [0,4194304] 0 2026-03-10T07:51:26.624 INFO:tasks.workunit.client.1.vm08.stdout:5/334: dread - d0/f51 zero size 2026-03-10T07:51:26.626 INFO:tasks.workunit.client.1.vm08.stdout:7/330: dread d3/f2e [0,4194304] 0 2026-03-10T07:51:26.635 INFO:tasks.workunit.client.1.vm08.stdout:5/335: write d0/d4/d19/d3a/f53 [379836,85652] 0 2026-03-10T07:51:26.635 INFO:tasks.workunit.client.1.vm08.stdout:3/285: mknod d0/c5d 0 2026-03-10T07:51:26.635 INFO:tasks.workunit.client.1.vm08.stdout:1/262: rmdir d2/d6/d50 39 2026-03-10T07:51:26.635 INFO:tasks.workunit.client.1.vm08.stdout:5/336: fdatasync d0/d4/df/d12/f46 0 2026-03-10T07:51:26.635 INFO:tasks.workunit.client.1.vm08.stdout:0/297: link dd/d29/f2a dd/d10/d14/f69 0 2026-03-10T07:51:26.635 INFO:tasks.workunit.client.1.vm08.stdout:8/304: rename d0/df/d5d/d5e/d3f/f4e to d0/df/d17/d25/f63 0 2026-03-10T07:51:26.635 INFO:tasks.workunit.client.1.vm08.stdout:3/286: mknod d0/d3c/d1f/d44/d51/d34/c5e 0 2026-03-10T07:51:26.635 INFO:tasks.workunit.client.1.vm08.stdout:1/263: symlink 
d2/d6/de/d1f/d40/l5a 0 2026-03-10T07:51:26.636 INFO:tasks.workunit.client.1.vm08.stdout:2/323: mknod d0/d1/d3/d10/d65/c6d 0 2026-03-10T07:51:26.636 INFO:tasks.workunit.client.1.vm08.stdout:8/305: mkdir d0/df/d2e/d64 0 2026-03-10T07:51:26.637 INFO:tasks.workunit.client.1.vm08.stdout:5/337: creat d0/d4/df/d4a/f62 x:0 0 0 2026-03-10T07:51:26.638 INFO:tasks.workunit.client.1.vm08.stdout:4/213: link d5/d8/d9/c15 d5/d8/c4b 0 2026-03-10T07:51:26.638 INFO:tasks.workunit.client.1.vm08.stdout:8/306: truncate d0/df/d17/d25/f63 788475 0 2026-03-10T07:51:26.639 INFO:tasks.workunit.client.1.vm08.stdout:5/338: write d0/d4/df/d12/f46 [779514,73703] 0 2026-03-10T07:51:26.640 INFO:tasks.workunit.client.1.vm08.stdout:9/303: getdents d2/de/d28 0 2026-03-10T07:51:26.641 INFO:tasks.workunit.client.1.vm08.stdout:0/298: link dd/d29/d58/d3d/f5b dd/d10/d2f/d37/f6a 0 2026-03-10T07:51:26.642 INFO:tasks.workunit.client.1.vm08.stdout:3/287: creat d0/d3c/d18/d48/d55/f5f x:0 0 0 2026-03-10T07:51:26.642 INFO:tasks.workunit.client.1.vm08.stdout:5/339: mkdir d0/d4/df/d4a/d63 0 2026-03-10T07:51:26.645 INFO:tasks.workunit.client.1.vm08.stdout:5/340: stat d0/d4/df/d1e/f42 0 2026-03-10T07:51:26.646 INFO:tasks.workunit.client.1.vm08.stdout:8/307: rename d0/df/d15/d23/d39/c52 to d0/d37/c65 0 2026-03-10T07:51:26.647 INFO:tasks.workunit.client.1.vm08.stdout:1/264: link d2/d10/f2b d2/d6/de/f5b 0 2026-03-10T07:51:26.647 INFO:tasks.workunit.client.1.vm08.stdout:2/324: dread d0/d1/d3/d10/d38/f53 [0,4194304] 0 2026-03-10T07:51:26.649 INFO:tasks.workunit.client.1.vm08.stdout:3/288: dwrite d0/d3c/f26 [0,4194304] 0 2026-03-10T07:51:26.649 INFO:tasks.workunit.client.1.vm08.stdout:8/308: write d0/df/d2e/f44 [204859,8667] 0 2026-03-10T07:51:26.653 INFO:tasks.workunit.client.1.vm08.stdout:5/341: truncate d0/d4/df/d4a/f62 979459 0 2026-03-10T07:51:26.653 INFO:tasks.workunit.client.1.vm08.stdout:0/299: creat dd/d10/d2f/f6b x:0 0 0 2026-03-10T07:51:26.654 INFO:tasks.workunit.client.1.vm08.stdout:4/214: mknod d5/d8/c4c 0 
2026-03-10T07:51:26.655 INFO:tasks.workunit.client.1.vm08.stdout:4/215: fdatasync d5/d1f/d41/f49 0 2026-03-10T07:51:26.655 INFO:tasks.workunit.client.1.vm08.stdout:9/304: rename d2/d3/d25/d2b/l5b to d2/d3/d25/d30/l72 0 2026-03-10T07:51:26.659 INFO:tasks.workunit.client.1.vm08.stdout:4/216: fdatasync d5/d8/d9/f46 0 2026-03-10T07:51:26.659 INFO:tasks.workunit.client.1.vm08.stdout:4/217: chown d5/f2d 3 1 2026-03-10T07:51:26.661 INFO:tasks.workunit.client.1.vm08.stdout:4/218: truncate d5/d8/f30 1013308 0 2026-03-10T07:51:26.662 INFO:tasks.workunit.client.1.vm08.stdout:1/265: read d2/d6/d11/f44 [148286,41881] 0 2026-03-10T07:51:26.673 INFO:tasks.workunit.client.1.vm08.stdout:2/325: dread d0/d1/d3/f4e [0,4194304] 0 2026-03-10T07:51:26.676 INFO:tasks.workunit.client.1.vm08.stdout:6/276: dwrite d1/d17/f42 [0,4194304] 0 2026-03-10T07:51:26.679 INFO:tasks.workunit.client.1.vm08.stdout:6/277: readlink d1/db/l14 0 2026-03-10T07:51:26.680 INFO:tasks.workunit.client.1.vm08.stdout:6/278: chown d1/d3/df/d1d 1904430 1 2026-03-10T07:51:26.680 INFO:tasks.workunit.client.1.vm08.stdout:3/289: symlink d0/d3c/d18/d48/d55/l60 0 2026-03-10T07:51:26.680 INFO:tasks.workunit.client.1.vm08.stdout:7/331: dwrite d3/da/f17 [0,4194304] 0 2026-03-10T07:51:26.687 INFO:tasks.workunit.client.1.vm08.stdout:7/332: fsync d3/da/d25/d9/f47 0 2026-03-10T07:51:26.693 INFO:tasks.workunit.client.1.vm08.stdout:3/290: dwrite d0/d3c/d1f/d44/f59 [0,4194304] 0 2026-03-10T07:51:26.697 INFO:tasks.workunit.client.1.vm08.stdout:3/291: chown d0/d41/d52 46524893 1 2026-03-10T07:51:26.703 INFO:tasks.workunit.client.1.vm08.stdout:8/309: unlink d0/df/d17/d25/f63 0 2026-03-10T07:51:26.711 INFO:tasks.workunit.client.1.vm08.stdout:8/310: dread d0/df/f12 [0,4194304] 0 2026-03-10T07:51:26.711 INFO:tasks.workunit.client.1.vm08.stdout:8/311: chown d0/df/d5d/d5e/d3f 348 1 2026-03-10T07:51:26.711 INFO:tasks.workunit.client.1.vm08.stdout:0/300: chown dd/d10/d14/d15/d20/l66 30 1 2026-03-10T07:51:26.711 
INFO:tasks.workunit.client.1.vm08.stdout:8/312: fdatasync d0/df/f1b 0 2026-03-10T07:51:26.714 INFO:tasks.workunit.client.1.vm08.stdout:0/301: dread dd/d29/f48 [0,4194304] 0 2026-03-10T07:51:26.715 INFO:tasks.workunit.client.1.vm08.stdout:0/302: dread - dd/d10/d2f/f6b zero size 2026-03-10T07:51:26.716 INFO:tasks.workunit.client.1.vm08.stdout:9/305: unlink d2/c1d 0 2026-03-10T07:51:26.720 INFO:tasks.workunit.client.1.vm08.stdout:1/266: read d2/d10/f3e [1065085,19522] 0 2026-03-10T07:51:26.733 INFO:tasks.workunit.client.1.vm08.stdout:1/267: stat d2/d6/d11/f46 0 2026-03-10T07:51:26.735 INFO:tasks.workunit.client.1.vm08.stdout:1/268: stat d2/d6/c52 0 2026-03-10T07:51:26.735 INFO:tasks.workunit.client.1.vm08.stdout:0/303: dread dd/d10/d14/f36 [0,4194304] 0 2026-03-10T07:51:26.735 INFO:tasks.workunit.client.1.vm08.stdout:6/279: symlink d1/d17/l5d 0 2026-03-10T07:51:26.735 INFO:tasks.workunit.client.1.vm08.stdout:3/292: rmdir d0/d3c/d1f/d44/d51/d2d 39 2026-03-10T07:51:26.735 INFO:tasks.workunit.client.1.vm08.stdout:3/293: dread - d0/d3c/d1f/d44/f4b zero size 2026-03-10T07:51:26.735 INFO:tasks.workunit.client.1.vm08.stdout:8/313: mkdir d0/df/d17/d66 0 2026-03-10T07:51:26.735 INFO:tasks.workunit.client.1.vm08.stdout:3/294: dread d0/d3c/d1f/d44/d51/f4d [0,4194304] 0 2026-03-10T07:51:26.735 INFO:tasks.workunit.client.1.vm08.stdout:5/342: link d0/d4/df/d12/f46 d0/d4/df/d1e/f64 0 2026-03-10T07:51:26.735 INFO:tasks.workunit.client.1.vm08.stdout:0/304: dwrite dd/f56 [0,4194304] 0 2026-03-10T07:51:26.735 INFO:tasks.workunit.client.1.vm08.stdout:3/295: write d0/d41/f57 [973629,13180] 0 2026-03-10T07:51:26.735 INFO:tasks.workunit.client.1.vm08.stdout:2/326: creat d0/d1/d17/d6b/f6e x:0 0 0 2026-03-10T07:51:26.735 INFO:tasks.workunit.client.1.vm08.stdout:2/327: fsync d0/f12 0 2026-03-10T07:51:26.736 INFO:tasks.workunit.client.1.vm08.stdout:0/305: dread dd/d10/d14/d15/d20/d22/f2e [0,4194304] 0 2026-03-10T07:51:26.744 INFO:tasks.workunit.client.1.vm08.stdout:6/280: mkdir d1/d17/d2b/d5e 0 
2026-03-10T07:51:26.745 INFO:tasks.workunit.client.1.vm08.stdout:5/343: rename d0/f30 to d0/d4/df/d4a/d63/f65 0 2026-03-10T07:51:26.746 INFO:tasks.workunit.client.1.vm08.stdout:5/344: readlink d0/d4/df/d12/d22/l2f 0 2026-03-10T07:51:26.746 INFO:tasks.workunit.client.1.vm08.stdout:8/314: dread d0/df/f26 [0,4194304] 0 2026-03-10T07:51:26.747 INFO:tasks.workunit.client.1.vm08.stdout:8/315: write d0/df/d17/f32 [913507,54784] 0 2026-03-10T07:51:26.748 INFO:tasks.workunit.client.1.vm08.stdout:8/316: chown d0/df/d15/d23/d54 1919538 1 2026-03-10T07:51:26.748 INFO:tasks.workunit.client.1.vm08.stdout:1/269: mknod d2/d6/c5c 0 2026-03-10T07:51:26.748 INFO:tasks.workunit.client.1.vm08.stdout:2/328: creat d0/d1/d17/d6b/f6f x:0 0 0 2026-03-10T07:51:26.751 INFO:tasks.workunit.client.1.vm08.stdout:7/333: link d3/da/d25/d9/d2f/d3a/d40/f52 d3/da/f6b 0 2026-03-10T07:51:26.753 INFO:tasks.workunit.client.1.vm08.stdout:7/334: read d3/da/d25/d9/f41 [2640301,78216] 0 2026-03-10T07:51:26.759 INFO:tasks.workunit.client.1.vm08.stdout:6/281: unlink d1/d17/l1c 0 2026-03-10T07:51:26.760 INFO:tasks.workunit.client.1.vm08.stdout:9/306: dwrite d2/f4e [0,4194304] 0 2026-03-10T07:51:26.773 INFO:tasks.workunit.client.1.vm08.stdout:3/296: dread d0/d3c/d18/f23 [4194304,4194304] 0 2026-03-10T07:51:26.774 INFO:tasks.workunit.client.1.vm08.stdout:5/345: mkdir d0/d4/d19/d66 0 2026-03-10T07:51:26.774 INFO:tasks.workunit.client.1.vm08.stdout:0/306: write f7 [1725515,84137] 0 2026-03-10T07:51:26.775 INFO:tasks.workunit.client.1.vm08.stdout:4/219: dwrite d5/d8/d9/f2b [0,4194304] 0 2026-03-10T07:51:26.780 INFO:tasks.workunit.client.1.vm08.stdout:1/270: unlink d2/d6/de/d1f/d26/f43 0 2026-03-10T07:51:26.783 INFO:tasks.workunit.client.1.vm08.stdout:7/335: mkdir d3/da/d25/d9/d2f/d6c 0 2026-03-10T07:51:26.789 INFO:tasks.workunit.client.1.vm08.stdout:6/282: mknod d1/db/c5f 0 2026-03-10T07:51:26.789 INFO:tasks.workunit.client.1.vm08.stdout:2/329: dread d0/d1/d3/f4e [0,4194304] 0 2026-03-10T07:51:26.789 
INFO:tasks.workunit.client.1.vm08.stdout:2/330: readlink d0/d1/d3/d39/l4c 0 2026-03-10T07:51:26.791 INFO:tasks.workunit.client.1.vm08.stdout:9/307: mkdir d2/d58/d73 0 2026-03-10T07:51:26.794 INFO:tasks.workunit.client.1.vm08.stdout:4/220: chown f2 50017 1 2026-03-10T07:51:26.797 INFO:tasks.workunit.client.1.vm08.stdout:3/297: rename d0/d41 to d0/d3c/d18/d32/d61 0 2026-03-10T07:51:26.799 INFO:tasks.workunit.client.1.vm08.stdout:3/298: chown d0/d3c/f20 1 1 2026-03-10T07:51:26.800 INFO:tasks.workunit.client.1.vm08.stdout:5/346: unlink d0/d4/df/d12/c38 0 2026-03-10T07:51:26.801 INFO:tasks.workunit.client.1.vm08.stdout:1/271: readlink d2/d6/de/l14 0 2026-03-10T07:51:26.801 INFO:tasks.workunit.client.1.vm08.stdout:0/307: creat dd/d10/d14/d15/d20/d22/f6c x:0 0 0 2026-03-10T07:51:26.801 INFO:tasks.workunit.client.1.vm08.stdout:7/336: symlink d3/da/d25/d9/d2f/d3a/d4b/d67/d69/l6d 0 2026-03-10T07:51:26.802 INFO:tasks.workunit.client.1.vm08.stdout:0/308: readlink dd/d10/d14/d15/l57 0 2026-03-10T07:51:26.802 INFO:tasks.workunit.client.1.vm08.stdout:7/337: chown d3/da/d25/d9/d2f/d39/d43/d4f 1895667 1 2026-03-10T07:51:26.803 INFO:tasks.workunit.client.1.vm08.stdout:0/309: readlink dd/d18/l24 0 2026-03-10T07:51:26.804 INFO:tasks.workunit.client.1.vm08.stdout:2/331: creat d0/d1/d3/d56/f70 x:0 0 0 2026-03-10T07:51:26.806 INFO:tasks.workunit.client.1.vm08.stdout:8/317: creat d0/df/d5d/d5e/f67 x:0 0 0 2026-03-10T07:51:26.808 INFO:tasks.workunit.client.1.vm08.stdout:8/318: write d0/df/d17/f35 [444532,62727] 0 2026-03-10T07:51:26.810 INFO:tasks.workunit.client.1.vm08.stdout:8/319: write d0/df/f19 [8055767,35894] 0 2026-03-10T07:51:26.814 INFO:tasks.workunit.client.1.vm08.stdout:1/272: sync 2026-03-10T07:51:26.814 INFO:tasks.workunit.client.1.vm08.stdout:2/332: dwrite d0/d1/d17/d27/d42/f34 [0,4194304] 0 2026-03-10T07:51:26.823 INFO:tasks.workunit.client.1.vm08.stdout:3/299: creat d0/d3c/d18/d32/f62 x:0 0 0 2026-03-10T07:51:26.823 INFO:tasks.workunit.client.1.vm08.stdout:4/221: write f2 
[1934500,63291] 0 2026-03-10T07:51:26.823 INFO:tasks.workunit.client.1.vm08.stdout:5/347: truncate d0/d8/d24/f56 443629 0 2026-03-10T07:51:26.824 INFO:tasks.workunit.client.1.vm08.stdout:5/348: dread - d0/d4/df/d1e/f49 zero size 2026-03-10T07:51:26.826 INFO:tasks.workunit.client.1.vm08.stdout:6/283: creat d1/d3/df/d38/f60 x:0 0 0 2026-03-10T07:51:26.826 INFO:tasks.workunit.client.1.vm08.stdout:2/333: dwrite d0/f68 [0,4194304] 0 2026-03-10T07:51:26.826 INFO:tasks.workunit.client.1.vm08.stdout:6/284: fdatasync d1/d17/f20 0 2026-03-10T07:51:26.833 INFO:tasks.workunit.client.1.vm08.stdout:0/310: symlink dd/d29/d5c/l6d 0 2026-03-10T07:51:26.833 INFO:tasks.workunit.client.1.vm08.stdout:0/311: write dd/d18/f21 [1676878,52265] 0 2026-03-10T07:51:26.834 INFO:tasks.workunit.client.1.vm08.stdout:0/312: stat dd/d18/f21 0 2026-03-10T07:51:26.837 INFO:tasks.workunit.client.1.vm08.stdout:2/334: dread d0/d1/d3/d10/f58 [0,4194304] 0 2026-03-10T07:51:26.842 INFO:tasks.workunit.client.1.vm08.stdout:7/338: rename d3/da/lf to d3/da/d25/d9/d2f/d39/l6e 0 2026-03-10T07:51:26.842 INFO:tasks.workunit.client.1.vm08.stdout:2/335: dread d0/d1/d17/d27/d42/d55/f20 [0,4194304] 0 2026-03-10T07:51:26.844 INFO:tasks.workunit.client.1.vm08.stdout:8/320: rmdir d0/df/d2e 39 2026-03-10T07:51:26.844 INFO:tasks.workunit.client.1.vm08.stdout:1/273: creat d2/d6/de/d1f/d26/f5d x:0 0 0 2026-03-10T07:51:26.845 INFO:tasks.workunit.client.1.vm08.stdout:4/222: creat d5/d1f/d31/f4d x:0 0 0 2026-03-10T07:51:26.846 INFO:tasks.workunit.client.1.vm08.stdout:5/349: creat d0/d4/df/d12/d1c/f67 x:0 0 0 2026-03-10T07:51:26.850 INFO:tasks.workunit.client.1.vm08.stdout:5/350: write d0/d8/f18 [4559870,98985] 0 2026-03-10T07:51:26.857 INFO:tasks.workunit.client.1.vm08.stdout:0/313: mknod dd/d18/c6e 0 2026-03-10T07:51:26.858 INFO:tasks.workunit.client.1.vm08.stdout:0/314: chown dd/fe 243722063 1 2026-03-10T07:51:26.906 INFO:tasks.workunit.client.1.vm08.stdout:7/339: mkdir d3/da/d25/d9/d6f 0 2026-03-10T07:51:26.907 
INFO:tasks.workunit.client.1.vm08.stdout:8/321: mknod d0/df/d17/d25/c68 0 2026-03-10T07:51:26.907 INFO:tasks.workunit.client.1.vm08.stdout:8/322: stat d0/df/d15/d23/d39/d5b/d4a 0 2026-03-10T07:51:26.912 INFO:tasks.workunit.client.1.vm08.stdout:7/340: dwrite d3/da/d25/d9/d2f/d39/f56 [0,4194304] 0 2026-03-10T07:51:26.918 INFO:tasks.workunit.client.1.vm08.stdout:7/341: sync 2026-03-10T07:51:26.926 INFO:tasks.workunit.client.1.vm08.stdout:1/274: symlink d2/d6/de/d47/l5e 0 2026-03-10T07:51:26.926 INFO:tasks.workunit.client.1.vm08.stdout:5/351: mknod d0/d4/df/d1e/d41/c68 0 2026-03-10T07:51:26.926 INFO:tasks.workunit.client.1.vm08.stdout:0/315: mknod dd/d29/c6f 0 2026-03-10T07:51:26.928 INFO:tasks.workunit.client.1.vm08.stdout:9/308: link d2/d3/lf d2/d3/d25/d2b/l74 0 2026-03-10T07:51:26.931 INFO:tasks.workunit.client.1.vm08.stdout:8/323: dread d0/df/f60 [0,4194304] 0 2026-03-10T07:51:26.931 INFO:tasks.workunit.client.1.vm08.stdout:8/324: chown d0/df/f60 1 1 2026-03-10T07:51:26.934 INFO:tasks.workunit.client.1.vm08.stdout:2/336: mknod d0/d1/c71 0 2026-03-10T07:51:26.942 INFO:tasks.workunit.client.1.vm08.stdout:8/325: dread d0/f20 [0,4194304] 0 2026-03-10T07:51:26.945 INFO:tasks.workunit.client.1.vm08.stdout:7/342: creat d3/da/d25/d9/d2f/d3a/d4b/f70 x:0 0 0 2026-03-10T07:51:26.954 INFO:tasks.workunit.client.1.vm08.stdout:3/300: dwrite d0/d3c/d1f/d44/d51/d34/f4e [0,4194304] 0 2026-03-10T07:51:26.959 INFO:tasks.workunit.client.1.vm08.stdout:4/223: truncate d5/d8/f42 608697 0 2026-03-10T07:51:26.959 INFO:tasks.workunit.client.1.vm08.stdout:6/285: rmdir d1/d3f 0 2026-03-10T07:51:26.961 INFO:tasks.workunit.client.1.vm08.stdout:1/275: mkdir d2/d6/de/d5f 0 2026-03-10T07:51:26.962 INFO:tasks.workunit.client.1.vm08.stdout:0/316: creat dd/d10/d2f/d37/d64/f70 x:0 0 0 2026-03-10T07:51:26.962 INFO:tasks.workunit.client.1.vm08.stdout:0/317: fdatasync dd/d10/d14/f46 0 2026-03-10T07:51:26.963 INFO:tasks.workunit.client.1.vm08.stdout:0/318: write f5 [148901,116462] 0 2026-03-10T07:51:26.968 
INFO:tasks.workunit.client.1.vm08.stdout:6/286: dwrite d1/d17/d2b/f3c [0,4194304] 0
2026-03-10T07:51:26.972 INFO:tasks.workunit.client.1.vm08.stdout:0/319: dwrite dd/d10/d14/d15/d20/d5f/f61 [0,4194304] 0
2026-03-10T07:51:26.973 INFO:tasks.workunit.client.1.vm08.stdout:6/287: chown d1/d3/df/d1d/d40/d45/c55 169 1
2026-03-10T07:51:26.998 INFO:tasks.workunit.client.1.vm08.stdout:2/337: creat d0/d1/d17/d6b/f72 x:0 0 0
2026-03-10T07:51:26.998 INFO:tasks.workunit.client.1.vm08.stdout:8/326: rmdir d0/df/d5d 39
2026-03-10T07:51:27.000 INFO:tasks.workunit.client.1.vm08.stdout:7/343: mkdir d3/da/d25/d9/d2f/d3a/d71 0
2026-03-10T07:51:27.001 INFO:tasks.workunit.client.1.vm08.stdout:4/224: creat d5/d8/d9/f4e x:0 0 0
2026-03-10T07:51:27.021 INFO:tasks.workunit.client.1.vm08.stdout:0/320: fdatasync f2 0
2026-03-10T07:51:27.031 INFO:tasks.workunit.client.1.vm08.stdout:0/321: sync
2026-03-10T07:51:27.035 INFO:tasks.workunit.client.1.vm08.stdout:2/338: rename d0/c69 to d0/d1/d17/d27/d42/d55/d1b/c73 0
2026-03-10T07:51:27.035 INFO:tasks.workunit.client.1.vm08.stdout:2/339: chown d0/d1/d3/d3e 2795646 1
2026-03-10T07:51:27.036 INFO:tasks.workunit.client.1.vm08.stdout:2/340: sync
2026-03-10T07:51:27.042 INFO:tasks.workunit.client.1.vm08.stdout:9/309: dwrite d2/f6 [0,4194304] 0
2026-03-10T07:51:27.050 INFO:tasks.workunit.client.1.vm08.stdout:6/288: mknod d1/d3/df/d1d/d40/d45/d5c/c61 0
2026-03-10T07:51:27.050 INFO:tasks.workunit.client.1.vm08.stdout:6/289: fsync d1/d17/d2b/f3c 0
2026-03-10T07:51:27.051 INFO:tasks.workunit.client.1.vm08.stdout:0/322: mknod dd/d18/c71 0
2026-03-10T07:51:27.051 INFO:tasks.workunit.client.1.vm08.stdout:7/344: rename d3/da/d25/d9/d2f/d39/d43/l48 to d3/da/d25/d9/d2f/d39/d43/d4f/l72 0
2026-03-10T07:51:27.059 INFO:tasks.workunit.client.1.vm08.stdout:2/341: unlink d0/d1/c21 0
2026-03-10T07:51:27.062 INFO:tasks.workunit.client.1.vm08.stdout:9/310: creat d2/d3/d25/f75 x:0 0 0
2026-03-10T07:51:27.064 INFO:tasks.workunit.client.1.vm08.stdout:4/225: mkdir d5/d17/d48/d4f 0
2026-03-10T07:51:27.064 INFO:tasks.workunit.client.1.vm08.stdout:5/352: getdents d0/d4/df 0
2026-03-10T07:51:27.064 INFO:tasks.workunit.client.1.vm08.stdout:1/276: link d2/d6/de/d1f/d22/c37 d2/d6/d3a/c60 0
2026-03-10T07:51:27.065 INFO:tasks.workunit.client.1.vm08.stdout:5/353: chown d0/d4/df/f5a 1004 1
2026-03-10T07:51:27.066 INFO:tasks.workunit.client.1.vm08.stdout:1/277: truncate d2/d6/de/d1f/d26/f2f 4310959 0
2026-03-10T07:51:27.071 INFO:tasks.workunit.client.1.vm08.stdout:0/323: creat dd/d10/d14/d15/d20/f72 x:0 0 0
2026-03-10T07:51:27.071 INFO:tasks.workunit.client.1.vm08.stdout:0/324: chown dd/d10/d14/d15/d20/c53 1967175 1
2026-03-10T07:51:27.071 INFO:tasks.workunit.client.1.vm08.stdout:6/290: rename d1/d17/l5d to d1/d3/df/d1d/d40/d45/l62 0
2026-03-10T07:51:27.072 INFO:tasks.workunit.client.1.vm08.stdout:8/327: getdents d0/df 0
2026-03-10T07:51:27.073 INFO:tasks.workunit.client.1.vm08.stdout:5/354: mkdir d0/d4/d19/d3a/d69 0
2026-03-10T07:51:27.074 INFO:tasks.workunit.client.1.vm08.stdout:2/342: dread d0/f44 [0,4194304] 0
2026-03-10T07:51:27.074 INFO:tasks.workunit.client.1.vm08.stdout:0/325: dwrite dd/d18/f3c [0,4194304] 0
2026-03-10T07:51:27.075 INFO:tasks.workunit.client.1.vm08.stdout:1/278: mkdir d2/d6/d3a/d61 0
2026-03-10T07:51:27.077 INFO:tasks.workunit.client.1.vm08.stdout:7/345: creat d3/da/d25/d9/d6f/f73 x:0 0 0
2026-03-10T07:51:27.087 INFO:tasks.workunit.client.1.vm08.stdout:9/311: rename d2/d3/f57 to d2/d3/d25/f76 0
2026-03-10T07:51:27.093 INFO:tasks.workunit.client.1.vm08.stdout:3/301: truncate d0/d3c/d1f/d44/d51/d2d/f3a 1408521 0
2026-03-10T07:51:27.095 INFO:tasks.workunit.client.1.vm08.stdout:4/226: mkdir d5/d8/d50 0
2026-03-10T07:51:27.097 INFO:tasks.workunit.client.1.vm08.stdout:1/279: truncate d2/d6/de/d1f/f20 824918 0
2026-03-10T07:51:27.106 INFO:tasks.workunit.client.1.vm08.stdout:4/227: sync
2026-03-10T07:51:27.106 INFO:tasks.workunit.client.1.vm08.stdout:4/228: fsync f0 0
2026-03-10T07:51:27.107 INFO:tasks.workunit.client.1.vm08.stdout:4/229: read d5/d1f/f37 [32606,48985] 0
2026-03-10T07:51:27.118 INFO:tasks.workunit.client.1.vm08.stdout:8/328: rename d0/df/d5d/d5e to d0/d69 0
2026-03-10T07:51:27.121 INFO:tasks.workunit.client.1.vm08.stdout:8/329: dwrite d0/d37/f47 [0,4194304] 0
2026-03-10T07:51:27.124 INFO:tasks.workunit.client.1.vm08.stdout:3/302: fsync d0/f45 0
2026-03-10T07:51:27.124 INFO:tasks.workunit.client.1.vm08.stdout:3/303: write d0/f10 [197239,50445] 0
2026-03-10T07:51:27.133 INFO:tasks.workunit.client.1.vm08.stdout:3/304: fsync d0/d3c/d1f/d44/f59 0
2026-03-10T07:51:27.135 INFO:tasks.workunit.client.1.vm08.stdout:5/355: creat d0/d8/d5e/f6a x:0 0 0
2026-03-10T07:51:27.139 INFO:tasks.workunit.client.1.vm08.stdout:1/280: creat d2/d6/de/d1f/d26/f62 x:0 0 0
2026-03-10T07:51:27.152 INFO:tasks.workunit.client.1.vm08.stdout:2/343: symlink d0/l74 0
2026-03-10T07:51:27.152 INFO:tasks.workunit.client.1.vm08.stdout:9/312: rmdir d2 39
2026-03-10T07:51:27.152 INFO:tasks.workunit.client.1.vm08.stdout:2/344: dwrite d0/d1/d3/d10/d32/d61/f59 [0,4194304] 0
2026-03-10T07:51:27.167 INFO:tasks.workunit.client.1.vm08.stdout:0/326: write dd/d10/d2f/d37/f6a [326933,19682] 0
2026-03-10T07:51:27.178 INFO:tasks.workunit.client.1.vm08.stdout:6/291: getdents d1/d17 0
2026-03-10T07:51:27.190 INFO:tasks.workunit.client.1.vm08.stdout:8/330: creat d0/df/d17/d66/f6a x:0 0 0
2026-03-10T07:51:27.196 INFO:tasks.workunit.client.1.vm08.stdout:4/230: write d5/f10 [1716480,87927] 0
2026-03-10T07:51:27.196 INFO:tasks.workunit.client.1.vm08.stdout:7/346: link d3/da/d25/d9/d2f/d3a/d4b/d67/d69/l4c d3/da/d25/d9/d2f/d3a/d40/l74 0
2026-03-10T07:51:27.197 INFO:tasks.workunit.client.1.vm08.stdout:4/231: chown d5/d17/f43 2040607 1
2026-03-10T07:51:27.206 INFO:tasks.workunit.client.1.vm08.stdout:0/327: symlink dd/d10/d2f/d37/d64/d52/l73 0
2026-03-10T07:51:27.210 INFO:tasks.workunit.client.1.vm08.stdout:7/347: dread d3/da/d25/f29 [0,4194304] 0
2026-03-10T07:51:27.215 INFO:tasks.workunit.client.1.vm08.stdout:1/281: dwrite d2/f36 [0,4194304] 0
2026-03-10T07:51:27.222 INFO:tasks.workunit.client.1.vm08.stdout:6/292: link d1/db/d24/d51/f59 d1/d17/f63 0
2026-03-10T07:51:27.225 INFO:tasks.workunit.client.1.vm08.stdout:6/293: dwrite d1/d3/d3e/f56 [0,4194304] 0
2026-03-10T07:51:27.227 INFO:tasks.workunit.client.1.vm08.stdout:9/313: link d2/fd d2/d58/f77 0
2026-03-10T07:51:27.246 INFO:tasks.workunit.client.1.vm08.stdout:7/348: dread d3/da/d25/d9/f30 [0,4194304] 0
2026-03-10T07:51:27.248 INFO:tasks.workunit.client.1.vm08.stdout:7/349: truncate d3/da/d25/d9/d2f/d3a/d40/f63 1611215 0
2026-03-10T07:51:27.249 INFO:tasks.workunit.client.1.vm08.stdout:2/345: link d0/d1/d17/c28 d0/d1/d3/d10/d32/d61/c75 0
2026-03-10T07:51:27.250 INFO:tasks.workunit.client.1.vm08.stdout:2/346: chown d0/d1/d3/d56 156082481 1
2026-03-10T07:51:27.251 INFO:tasks.workunit.client.1.vm08.stdout:2/347: fdatasync d0/d1/d3/d10/d32/d61/f59 0
2026-03-10T07:51:27.253 INFO:tasks.workunit.client.1.vm08.stdout:2/348: truncate d0/d1/d3/d56/f70 343210 0
2026-03-10T07:51:27.253 INFO:tasks.workunit.client.1.vm08.stdout:2/349: read - d0/d1/d17/d6b/f6f zero size
2026-03-10T07:51:27.271 INFO:tasks.workunit.client.1.vm08.stdout:0/328: creat dd/d10/d2f/d37/d64/d52/f74 x:0 0 0
2026-03-10T07:51:27.272 INFO:tasks.workunit.client.1.vm08.stdout:3/305: getdents d0/d3c/d1f/d44 0
2026-03-10T07:51:27.274 INFO:tasks.workunit.client.1.vm08.stdout:8/331: mknod d0/df/d2e/d49/c6b 0
2026-03-10T07:51:27.274 INFO:tasks.workunit.client.1.vm08.stdout:5/356: link d0/d4/df/d12/d1c/f29 d0/d4/d19/d3a/d69/f6b 0
2026-03-10T07:51:27.274 INFO:tasks.workunit.client.1.vm08.stdout:5/357: stat d0/d4/c15 0
2026-03-10T07:51:27.275 INFO:tasks.workunit.client.1.vm08.stdout:3/306: dread d0/f28 [0,4194304] 0
2026-03-10T07:51:27.277 INFO:tasks.workunit.client.1.vm08.stdout:3/307: stat d0/d3c/d1f/d44/d51/d34/f4e 0
2026-03-10T07:51:27.278 INFO:tasks.workunit.client.1.vm08.stdout:1/282: creat d2/d6/de/d47/f63 x:0 0 0
2026-03-10T07:51:27.278 INFO:tasks.workunit.client.1.vm08.stdout:1/283: truncate d2/d6/de/f1c 1814516 0
2026-03-10T07:51:27.279 INFO:tasks.workunit.client.1.vm08.stdout:1/284: readlink d2/la 0
2026-03-10T07:51:27.280 INFO:tasks.workunit.client.1.vm08.stdout:6/294: rmdir d1/db/d24 39
2026-03-10T07:51:27.288 INFO:tasks.workunit.client.1.vm08.stdout:7/350: mknod d3/da/d25/d9/d2f/d3a/d4b/d67/c75 0
2026-03-10T07:51:27.298 INFO:tasks.workunit.client.1.vm08.stdout:4/232: mknod d5/d17/d48/d4f/c51 0
2026-03-10T07:51:27.298 INFO:tasks.workunit.client.1.vm08.stdout:0/329: symlink dd/d10/d14/d15/l75 0
2026-03-10T07:51:27.301 INFO:tasks.workunit.client.1.vm08.stdout:0/330: dwrite dd/d10/d2f/d37/d64/f70 [0,4194304] 0
2026-03-10T07:51:27.302 INFO:tasks.workunit.client.1.vm08.stdout:0/331: fsync dd/d10/d2f/f6b 0
2026-03-10T07:51:27.302 INFO:tasks.workunit.client.1.vm08.stdout:0/332: fsync dd/d10/d14/d15/d20/f72 0
2026-03-10T07:51:27.307 INFO:tasks.workunit.client.1.vm08.stdout:3/308: unlink d0/d3c/d18/f2e 0
2026-03-10T07:51:27.311 INFO:tasks.workunit.client.1.vm08.stdout:6/295: dread d1/d17/f20 [0,4194304] 0
2026-03-10T07:51:27.315 INFO:tasks.workunit.client.1.vm08.stdout:6/296: dread d1/f6 [0,4194304] 0
2026-03-10T07:51:27.315 INFO:tasks.workunit.client.1.vm08.stdout:9/314: symlink d2/d58/d73/l78 0
2026-03-10T07:51:27.318 INFO:tasks.workunit.client.1.vm08.stdout:5/358: dwrite d0/d4/d19/d3a/f3f [0,4194304] 0
2026-03-10T07:51:27.320 INFO:tasks.workunit.client.1.vm08.stdout:5/359: write d0/d4/df/d1e/f42 [5826852,75821] 0
2026-03-10T07:51:27.321 INFO:tasks.workunit.client.1.vm08.stdout:5/360: write d0/d33/f34 [1452609,123702] 0
2026-03-10T07:51:27.323 INFO:tasks.workunit.client.1.vm08.stdout:9/315: read d2/f10 [3601580,43784] 0
2026-03-10T07:51:27.328 INFO:tasks.workunit.client.1.vm08.stdout:9/316: readlink d2/d3/d25/d2b/l40 0
2026-03-10T07:51:27.329 INFO:tasks.workunit.client.1.vm08.stdout:0/333: creat dd/d10/d14/d1b/f76 x:0 0 0
2026-03-10T07:51:27.332 INFO:tasks.workunit.client.1.vm08.stdout:3/309: unlink d0/f16 0
2026-03-10T07:51:27.332 INFO:tasks.workunit.client.1.vm08.stdout:1/285: mkdir d2/d6/de/d1f/d26/d58/d64 0
2026-03-10T07:51:27.335 INFO:tasks.workunit.client.1.vm08.stdout:5/361: rename d0/d4/c45 to d0/d4/df/c6c 0
2026-03-10T07:51:27.335 INFO:tasks.workunit.client.1.vm08.stdout:7/351: link d3/da/d25/f35 d3/da/d25/d9/d2f/d39/f76 0
2026-03-10T07:51:27.336 INFO:tasks.workunit.client.1.vm08.stdout:2/350: creat d0/f76 x:0 0 0
2026-03-10T07:51:27.337 INFO:tasks.workunit.client.1.vm08.stdout:2/351: stat d0/d1/d3/d10/d32/f45 0
2026-03-10T07:51:27.338 INFO:tasks.workunit.client.1.vm08.stdout:8/332: rmdir d0/df/d2e/d64 0
2026-03-10T07:51:27.342 INFO:tasks.workunit.client.1.vm08.stdout:9/317: creat d2/d3/d25/d30/d35/f79 x:0 0 0
2026-03-10T07:51:27.344 INFO:tasks.workunit.client.1.vm08.stdout:7/352: dwrite d3/f4 [0,4194304] 0
2026-03-10T07:51:27.352 INFO:tasks.workunit.client.1.vm08.stdout:6/297: mknod d1/c64 0
2026-03-10T07:51:27.352 INFO:tasks.workunit.client.1.vm08.stdout:8/333: dwrite d0/df/d2e/d30/f33 [0,4194304] 0
2026-03-10T07:51:27.353 INFO:tasks.workunit.client.1.vm08.stdout:0/334: sync
2026-03-10T07:51:27.353 INFO:tasks.workunit.client.1.vm08.stdout:4/233: link d5/d8/d9/d12/l20 d5/d1f/d41/l52 0
2026-03-10T07:51:27.353 INFO:tasks.workunit.client.1.vm08.stdout:5/362: fdatasync d0/d4/df/d1e/f64 0
2026-03-10T07:51:27.357 INFO:tasks.workunit.client.1.vm08.stdout:3/310: mknod d0/c63 0
2026-03-10T07:51:27.360 INFO:tasks.workunit.client.1.vm08.stdout:5/363: stat d0/d4/df/d1e/d41/l4e 0
2026-03-10T07:51:27.362 INFO:tasks.workunit.client.1.vm08.stdout:9/318: dwrite d2/d3/d25/d30/f46 [0,4194304] 0
2026-03-10T07:51:27.364 INFO:tasks.workunit.client.1.vm08.stdout:9/319: read d2/f4e [1920150,17164] 0
2026-03-10T07:51:27.370 INFO:tasks.workunit.client.1.vm08.stdout:7/353: symlink d3/da/d25/d9/d2f/d3a/d4b/d67/d69/l77 0
2026-03-10T07:51:27.376 INFO:tasks.workunit.client.1.vm08.stdout:4/234: creat d5/d8/d9/d12/f53 x:0 0 0
2026-03-10T07:51:27.377 INFO:tasks.workunit.client.1.vm08.stdout:3/311: sync
2026-03-10T07:51:27.385 INFO:tasks.workunit.client.1.vm08.stdout:9/320: creat d2/d3/d25/d2b/f7a x:0 0 0
2026-03-10T07:51:27.385 INFO:tasks.workunit.client.1.vm08.stdout:9/321: fsync d2/de/f4f 0
2026-03-10T07:51:27.408 INFO:tasks.workunit.client.1.vm08.stdout:6/298: rename d1/c47 to d1/db/c65 0
2026-03-10T07:51:27.408 INFO:tasks.workunit.client.1.vm08.stdout:6/299: write d1/f34 [1398025,4160] 0
2026-03-10T07:51:27.412 INFO:tasks.workunit.client.1.vm08.stdout:0/335: creat dd/d10/f77 x:0 0 0
2026-03-10T07:51:27.413 INFO:tasks.workunit.client.1.vm08.stdout:2/352: getdents d0/d1/d3/d56/d57 0
2026-03-10T07:51:27.417 INFO:tasks.workunit.client.1.vm08.stdout:2/353: sync
2026-03-10T07:51:27.420 INFO:tasks.workunit.client.1.vm08.stdout:0/336: unlink f7 0
2026-03-10T07:51:27.420 INFO:tasks.workunit.client.1.vm08.stdout:9/322: fsync d2/d3/d25/d30/f46 0
2026-03-10T07:51:27.421 INFO:tasks.workunit.client.1.vm08.stdout:0/337: chown f2 2659225 1
2026-03-10T07:51:27.421 INFO:tasks.workunit.client.1.vm08.stdout:1/286: truncate d2/f4 3309946 0
2026-03-10T07:51:27.422 INFO:tasks.workunit.client.1.vm08.stdout:1/287: dread - d2/d6/f3b zero size
2026-03-10T07:51:27.422 INFO:tasks.workunit.client.1.vm08.stdout:2/354: dwrite d0/d1/d17/d6b/f6f [0,4194304] 0
2026-03-10T07:51:27.422 INFO:tasks.workunit.client.1.vm08.stdout:0/338: sync
2026-03-10T07:51:27.423 INFO:tasks.workunit.client.1.vm08.stdout:4/235: creat d5/f54 x:0 0 0
2026-03-10T07:51:27.423 INFO:tasks.workunit.client.1.vm08.stdout:2/355: dread - d0/f76 zero size
2026-03-10T07:51:27.429 INFO:tasks.workunit.client.1.vm08.stdout:2/356: write d0/d1/d3/d10/d38/f54 [19626,22007] 0
2026-03-10T07:51:27.430 INFO:tasks.workunit.client.1.vm08.stdout:2/357: chown d0/d1/d3/d56/d57/f5b 1 1
2026-03-10T07:51:27.430 INFO:tasks.workunit.client.1.vm08.stdout:2/358: fsync d0/f12 0
2026-03-10T07:51:27.431 INFO:tasks.workunit.client.1.vm08.stdout:8/334: rename d0/c4 to d0/df/d15/d23/d54/c6c 0
2026-03-10T07:51:27.431 INFO:tasks.workunit.client.1.vm08.stdout:8/335: readlink d0/lc 0
2026-03-10T07:51:27.432 INFO:tasks.workunit.client.1.vm08.stdout:8/336: dread - d0/d69/f46 zero size
2026-03-10T07:51:27.433 INFO:tasks.workunit.client.1.vm08.stdout:4/236: dwrite d5/f2d [0,4194304] 0
2026-03-10T07:51:27.444 INFO:tasks.workunit.client.1.vm08.stdout:0/339: unlink c9 0
2026-03-10T07:51:27.445 INFO:tasks.workunit.client.1.vm08.stdout:5/364: rename d0/d4/d19/d66 to d0/d4/d19/d60/d6d 0
2026-03-10T07:51:27.446 INFO:tasks.workunit.client.1.vm08.stdout:2/359: mknod d0/d1/d17/d27/c77 0
2026-03-10T07:51:27.446 INFO:tasks.workunit.client.1.vm08.stdout:8/337: symlink d0/df/d15/d53/l6d 0
2026-03-10T07:51:27.446 INFO:tasks.workunit.client.1.vm08.stdout:5/365: write d0/d4/df/f57 [635279,25591] 0
2026-03-10T07:51:27.447 INFO:tasks.workunit.client.1.vm08.stdout:7/354: link d3/da/d25/d9/c31 d3/c78 0
2026-03-10T07:51:27.447 INFO:tasks.workunit.client.1.vm08.stdout:8/338: write d0/df/d2e/f44 [1233206,47913] 0
2026-03-10T07:51:27.451 INFO:tasks.workunit.client.1.vm08.stdout:2/360: rmdir d0/d1/d3/d10/d32/d61 39
2026-03-10T07:51:27.451 INFO:tasks.workunit.client.1.vm08.stdout:5/366: creat d0/d4/d19/f6e x:0 0 0
2026-03-10T07:51:27.452 INFO:tasks.workunit.client.1.vm08.stdout:7/355: write d3/da/d25/f27 [588601,124818] 0
2026-03-10T07:51:27.452 INFO:tasks.workunit.client.1.vm08.stdout:9/323: truncate d2/d3/d25/d2b/f37 3092372 0
2026-03-10T07:51:27.454 INFO:tasks.workunit.client.1.vm08.stdout:5/367: chown d0/d4/df/d12/f11 6 1
2026-03-10T07:51:27.457 INFO:tasks.workunit.client.1.vm08.stdout:8/339: dread d0/df/f12 [0,4194304] 0
2026-03-10T07:51:27.459 INFO:tasks.workunit.client.1.vm08.stdout:9/324: dwrite d2/de/f4d [0,4194304] 0
2026-03-10T07:51:27.460 INFO:tasks.workunit.client.1.vm08.stdout:7/356: readlink d3/da/d25/d9/l66 0
2026-03-10T07:51:27.464 INFO:tasks.workunit.client.1.vm08.stdout:8/340: dwrite d0/d37/f47 [0,4194304] 0
2026-03-10T07:51:27.473 INFO:tasks.workunit.client.1.vm08.stdout:5/368: mkdir d0/d8/d24/d6f 0
2026-03-10T07:51:27.473 INFO:tasks.workunit.client.1.vm08.stdout:4/237: getdents d5/d8/d9/d12 0
2026-03-10T07:51:27.474 INFO:tasks.workunit.client.1.vm08.stdout:8/341: sync
2026-03-10T07:51:27.474 INFO:tasks.workunit.client.1.vm08.stdout:9/325: sync
2026-03-10T07:51:27.474 INFO:tasks.workunit.client.1.vm08.stdout:5/369: chown d0/d4/df/d12/d22/l2f 1865 1
2026-03-10T07:51:27.486 INFO:tasks.workunit.client.1.vm08.stdout:9/326: sync
2026-03-10T07:51:27.487 INFO:tasks.workunit.client.1.vm08.stdout:9/327: chown d2/d3/d25/d2b/l40 0 1
2026-03-10T07:51:27.487 INFO:tasks.workunit.client.1.vm08.stdout:3/312: rename d0/d3c/d18/c3f to d0/d3c/d1f/c64 0
2026-03-10T07:51:27.491 INFO:tasks.workunit.client.1.vm08.stdout:9/328: dread d2/f6 [0,4194304] 0
2026-03-10T07:51:27.491 INFO:tasks.workunit.client.1.vm08.stdout:9/329: readlink d2/de/l6f 0
2026-03-10T07:51:27.498 INFO:tasks.workunit.client.1.vm08.stdout:7/357: mkdir d3/da/d25/d9/d2f/d39/d43/d4f/d5b/d79 0
2026-03-10T07:51:27.501 INFO:tasks.workunit.client.1.vm08.stdout:7/358: write d3/da/d25/f27 [1564636,79215] 0
2026-03-10T07:51:27.501 INFO:tasks.workunit.client.1.vm08.stdout:4/238: chown d5/l1c 225852762 1
2026-03-10T07:51:27.506 INFO:tasks.workunit.client.1.vm08.stdout:9/330: symlink d2/de/l7b 0
2026-03-10T07:51:27.506 INFO:tasks.workunit.client.1.vm08.stdout:0/340: dwrite dd/d10/d14/f69 [0,4194304] 0
2026-03-10T07:51:27.510 INFO:tasks.workunit.client.1.vm08.stdout:0/341: dread dd/d10/d14/f46 [0,4194304] 0
2026-03-10T07:51:27.511 INFO:tasks.workunit.client.1.vm08.stdout:4/239: sync
2026-03-10T07:51:27.513 INFO:tasks.workunit.client.1.vm08.stdout:9/331: creat d2/d3/d25/f7c x:0 0 0
2026-03-10T07:51:27.514 INFO:tasks.workunit.client.1.vm08.stdout:7/359: truncate d3/da/d25/d9/fd 515614 0
2026-03-10T07:51:27.515 INFO:tasks.workunit.client.1.vm08.stdout:0/342: unlink dd/f13 0
2026-03-10T07:51:27.515 INFO:tasks.workunit.client.1.vm08.stdout:5/370: rmdir d0/d8/d24/d6f 0
2026-03-10T07:51:27.520 INFO:tasks.workunit.client.1.vm08.stdout:4/240: dwrite d5/d8/f30 [0,4194304] 0
2026-03-10T07:51:27.527 INFO:tasks.workunit.client.1.vm08.stdout:6/300: rename d1/d17/d2b/f53 to d1/d17/f66 0
2026-03-10T07:51:27.530 INFO:tasks.workunit.client.1.vm08.stdout:9/332: dwrite d2/de/f4f [4194304,4194304] 0
2026-03-10T07:51:27.532 INFO:tasks.workunit.client.1.vm08.stdout:5/371: dwrite d0/d4/df/d12/f13 [0,4194304] 0
2026-03-10T07:51:27.538 INFO:tasks.workunit.client.1.vm08.stdout:0/343: symlink dd/d29/l78 0
2026-03-10T07:51:27.547 INFO:tasks.workunit.client.1.vm08.stdout:2/361: rename d0/d1/d17/d27 to d0/d1/d3/d56/d78 0
2026-03-10T07:51:27.548 INFO:tasks.workunit.client.1.vm08.stdout:3/313: getdents d0/d3c/d1f/d44/d51/d2d 0
2026-03-10T07:51:27.548 INFO:tasks.workunit.client.1.vm08.stdout:3/314: readlink d0/d3c/d18/d32/d61/l4c 0
2026-03-10T07:51:27.548 INFO:tasks.workunit.client.1.vm08.stdout:9/333: dread d2/de/f4d [0,4194304] 0
2026-03-10T07:51:27.548 INFO:tasks.workunit.client.1.vm08.stdout:6/301: dwrite d1/d17/f42 [0,4194304] 0
2026-03-10T07:51:27.548 INFO:tasks.workunit.client.1.vm08.stdout:6/302: fdatasync d1/d3/d3e/f5b 0
2026-03-10T07:51:27.551 INFO:tasks.workunit.client.1.vm08.stdout:0/344: readlink dd/d10/d2f/l43 0
2026-03-10T07:51:27.554 INFO:tasks.workunit.client.1.vm08.stdout:9/334: sync
2026-03-10T07:51:27.556 INFO:tasks.workunit.client.1.vm08.stdout:1/288: dread d2/f4 [0,4194304] 0
2026-03-10T07:51:27.556 INFO:tasks.workunit.client.1.vm08.stdout:0/345: write dd/d10/d14/f46 [335048,77718] 0
2026-03-10T07:51:27.557 INFO:tasks.workunit.client.1.vm08.stdout:8/342: rename d0/ce to d0/df/d15/c6e 0
2026-03-10T07:51:27.561 INFO:tasks.workunit.client.1.vm08.stdout:2/362: creat d0/d1/d3/d56/d57/f79 x:0 0 0
2026-03-10T07:51:27.566 INFO:tasks.workunit.client.1.vm08.stdout:0/346: dread dd/d29/d58/d3d/f5b [0,4194304] 0
2026-03-10T07:51:27.568 INFO:tasks.workunit.client.1.vm08.stdout:1/289: dwrite d2/d6/de/d47/f3c [0,4194304] 0
2026-03-10T07:51:27.584 INFO:tasks.workunit.client.1.vm08.stdout:5/372: rename d0/d4/df/d12/d1c to d0/d4/d19/d60/d6d/d70 0
2026-03-10T07:51:27.584 INFO:tasks.workunit.client.1.vm08.stdout:2/363: mkdir d0/d1/d3/d56/d78/d42/d55/d7a 0
2026-03-10T07:51:27.584 INFO:tasks.workunit.client.1.vm08.stdout:0/347: mknod dd/d29/d5c/c79 0
2026-03-10T07:51:27.584 INFO:tasks.workunit.client.1.vm08.stdout:3/315: creat d0/d3c/d1f/d44/d51/f65 x:0 0 0
2026-03-10T07:51:27.584 INFO:tasks.workunit.client.1.vm08.stdout:6/303: rename d1/db/d24/c2c to d1/d3/df/d1d/d40/c67 0
2026-03-10T07:51:27.584 INFO:tasks.workunit.client.1.vm08.stdout:8/343: dread d0/df/f1b [0,4194304] 0
2026-03-10T07:51:27.584 INFO:tasks.workunit.client.1.vm08.stdout:8/344: chown d0/df/d17/f35 174 1
2026-03-10T07:51:27.585 INFO:tasks.workunit.client.1.vm08.stdout:1/290: symlink d2/d6/d50/l65 0
2026-03-10T07:51:27.586 INFO:tasks.workunit.client.1.vm08.stdout:2/364: dwrite d0/f68 [0,4194304] 0
2026-03-10T07:51:27.586 INFO:tasks.workunit.client.1.vm08.stdout:0/348: mkdir dd/d10/d14/d15/d20/d7a 0
2026-03-10T07:51:27.586 INFO:tasks.workunit.client.1.vm08.stdout:5/373: creat d0/d4/d19/d3a/d69/f71 x:0 0 0
2026-03-10T07:51:27.599 INFO:tasks.workunit.client.1.vm08.stdout:3/316: rename d0/d3c/f26 to d0/d3c/d18/d32/d61/d52/f66 0
2026-03-10T07:51:27.605 INFO:tasks.workunit.client.1.vm08.stdout:1/291: rmdir d2/d6/de 39
2026-03-10T07:51:27.605 INFO:tasks.workunit.client.1.vm08.stdout:2/365: chown d0/d1/d3/d10/d32/d61/f3d 53 1
2026-03-10T07:51:27.605 INFO:tasks.workunit.client.1.vm08.stdout:5/374: mknod d0/d4/df/d12/c72 0
2026-03-10T07:51:27.605 INFO:tasks.workunit.client.1.vm08.stdout:8/345: link d0/fa d0/df/d15/d23/d39/f6f 0
2026-03-10T07:51:27.605 INFO:tasks.workunit.client.1.vm08.stdout:5/375: creat d0/d4/df/d4a/d63/f73 x:0 0 0
2026-03-10T07:51:27.605 INFO:tasks.workunit.client.1.vm08.stdout:8/346: creat d0/df/d15/f70 x:0 0 0
2026-03-10T07:51:27.605 INFO:tasks.workunit.client.1.vm08.stdout:3/317: symlink d0/d3c/d1f/l67 0
2026-03-10T07:51:27.605 INFO:tasks.workunit.client.1.vm08.stdout:8/347: write d0/df/f1b [3856058,64290] 0
2026-03-10T07:51:27.606 INFO:tasks.workunit.client.1.vm08.stdout:2/366: dread d0/f4a [0,4194304] 0
2026-03-10T07:51:27.608 INFO:tasks.workunit.client.1.vm08.stdout:1/292: symlink d2/d6/de/d5f/l66 0
2026-03-10T07:51:27.609 INFO:tasks.workunit.client.1.vm08.stdout:3/318: creat d0/d3c/d18/d32/d61/d52/f68 x:0 0 0
2026-03-10T07:51:27.609 INFO:tasks.workunit.client.1.vm08.stdout:1/293: readlink d2/la 0
2026-03-10T07:51:27.615 INFO:tasks.workunit.client.1.vm08.stdout:8/348: unlink d0/df/d17/f35 0
2026-03-10T07:51:27.619 INFO:tasks.workunit.client.1.vm08.stdout:8/349: symlink d0/df/l71 0
2026-03-10T07:51:27.625 INFO:tasks.workunit.client.1.vm08.stdout:2/367: dread d0/d1/d3/d56/d78/d42/d55/d1b/f26 [0,4194304] 0
2026-03-10T07:51:27.625 INFO:tasks.workunit.client.1.vm08.stdout:7/360: dread d3/da/d25/f5c [0,4194304] 0
2026-03-10T07:51:27.626 INFO:tasks.workunit.client.1.vm08.stdout:7/361: chown d3/f51 10087 1
2026-03-10T07:51:27.628 INFO:tasks.workunit.client.1.vm08.stdout:2/368: unlink d0/f35 0
2026-03-10T07:51:27.630 INFO:tasks.workunit.client.1.vm08.stdout:7/362: mknod d3/da/d25/d9/d2f/d3a/d4b/c7a 0
2026-03-10T07:51:27.631 INFO:tasks.workunit.client.1.vm08.stdout:2/369: symlink d0/d1/d3/d10/d65/l7b 0
2026-03-10T07:51:27.632 INFO:tasks.workunit.client.1.vm08.stdout:7/363: chown d3/da/c2a 29 1
2026-03-10T07:51:27.634 INFO:tasks.workunit.client.1.vm08.stdout:2/370: creat d0/d1/d3/d10/d65/f7c x:0 0 0
2026-03-10T07:51:27.642 INFO:tasks.workunit.client.1.vm08.stdout:7/364: creat d3/da/d25/d9/d2f/d3a/d4b/f7b x:0 0 0
2026-03-10T07:51:27.642 INFO:tasks.workunit.client.1.vm08.stdout:1/294: dread d2/d10/f3f [0,4194304] 0
2026-03-10T07:51:27.650 INFO:tasks.workunit.client.1.vm08.stdout:1/295: read d2/d6/de/d1f/f20 [125626,47942] 0
2026-03-10T07:51:27.650 INFO:tasks.workunit.client.1.vm08.stdout:2/371: dwrite d0/f50 [0,4194304] 0
2026-03-10T07:51:27.651 INFO:tasks.workunit.client.1.vm08.stdout:1/296: rename d2/d6/de/d1f/f4b to d2/d6/de/d1f/f67 0
2026-03-10T07:51:27.651 INFO:tasks.workunit.client.1.vm08.stdout:2/372: truncate d0/d1/d17/d6b/f72 430981 0
2026-03-10T07:51:27.656 INFO:tasks.workunit.client.1.vm08.stdout:1/297: write d2/d10/f3f [3241283,48351] 0
2026-03-10T07:51:27.662 INFO:tasks.workunit.client.1.vm08.stdout:1/298: creat d2/d6/de/d1f/d26/d58/f68 x:0 0 0
2026-03-10T07:51:27.665 INFO:tasks.workunit.client.1.vm08.stdout:1/299: rmdir d2 39
2026-03-10T07:51:27.683 INFO:tasks.workunit.client.1.vm08.stdout:1/300: write d2/d10/f2b [8359322,52050] 0
2026-03-10T07:51:27.687 INFO:tasks.workunit.client.1.vm08.stdout:9/335: truncate d2/f1a 3674780 0
2026-03-10T07:51:27.699 INFO:tasks.workunit.client.1.vm08.stdout:3/319: fsync d0/d3c/d1f/d44/d51/f65 0
2026-03-10T07:51:27.700 INFO:tasks.workunit.client.1.vm08.stdout:3/320: dread - d0/d3c/d1f/d44/d51/d34/f47 zero size
2026-03-10T07:51:27.702 INFO:tasks.workunit.client.1.vm08.stdout:3/321: dread - d0/d3c/d1f/d44/d51/d2d/f58 zero size
2026-03-10T07:51:27.702 INFO:tasks.workunit.client.1.vm08.stdout:6/304: write d1/f6 [2068355,42181] 0
2026-03-10T07:51:27.705 INFO:tasks.workunit.client.1.vm08.stdout:3/322: unlink d0/l11 0
2026-03-10T07:51:27.705 INFO:tasks.workunit.client.1.vm08.stdout:3/323: readlink d0/d3c/d1f/d44/d51/l2a 0
2026-03-10T07:51:27.706 INFO:tasks.workunit.client.1.vm08.stdout:6/305: creat d1/d17/d2b/f68 x:0 0 0
2026-03-10T07:51:27.707 INFO:tasks.workunit.client.1.vm08.stdout:6/306: chown d1/d3/df/d44/c54 6328 1
2026-03-10T07:51:27.707 INFO:tasks.workunit.client.1.vm08.stdout:0/349: truncate dd/d29/f2a 774926 0
2026-03-10T07:51:27.711 INFO:tasks.workunit.client.1.vm08.stdout:5/376: dwrite d0/d8/f1b [0,4194304] 0
2026-03-10T07:51:27.717 INFO:tasks.workunit.client.1.vm08.stdout:6/307: fsync d1/d17/f63 0
2026-03-10T07:51:27.721 INFO:tasks.workunit.client.1.vm08.stdout:3/324: dwrite d0/d3c/d1f/d44/d51/d34/f3d [4194304,4194304] 0
2026-03-10T07:51:27.724 INFO:tasks.workunit.client.1.vm08.stdout:0/350: dwrite dd/f44 [0,4194304] 0
2026-03-10T07:51:27.726 INFO:tasks.workunit.client.1.vm08.stdout:0/351: chown dd/d10/f77 1035708935 1
2026-03-10T07:51:27.728 INFO:tasks.workunit.client.1.vm08.stdout:4/241: dread f1 [0,4194304] 0
2026-03-10T07:51:27.728 INFO:tasks.workunit.client.1.vm08.stdout:4/242: chown d5/d8/c36 58 1
2026-03-10T07:51:27.730 INFO:tasks.workunit.client.1.vm08.stdout:3/325: dread d0/d3c/d18/d32/d61/d52/f66 [0,4194304] 0
2026-03-10T07:51:27.731 INFO:tasks.workunit.client.1.vm08.stdout:3/326: read d0/d3c/d18/f38 [2990982,7024] 0
2026-03-10T07:51:27.735 INFO:tasks.workunit.client.1.vm08.stdout:9/336: dread d2/f10 [0,4194304] 0
2026-03-10T07:51:27.741 INFO:tasks.workunit.client.1.vm08.stdout:9/337: read d2/d3/d25/d2b/f6a [3300,97148] 0
2026-03-10T07:51:27.752 INFO:tasks.workunit.client.1.vm08.stdout:0/352: mknod dd/d10/d14/d1b/d30/c7b 0
2026-03-10T07:51:27.754 INFO:tasks.workunit.client.1.vm08.stdout:4/243: rmdir d5/d1f 39
2026-03-10T07:51:27.755 INFO:tasks.workunit.client.1.vm08.stdout:3/327: creat d0/d3c/d18/d48/d55/d56/f69 x:0 0 0
2026-03-10T07:51:27.758 INFO:tasks.workunit.client.1.vm08.stdout:6/308: rename d1/db/c1b to d1/d3/df/d1d/c69 0
2026-03-10T07:51:27.759 INFO:tasks.workunit.client.1.vm08.stdout:6/309: write d1/d3/df/d38/f60 [935224,94258] 0
2026-03-10T07:51:27.771 INFO:tasks.workunit.client.1.vm08.stdout:9/338: symlink d2/de/d28/l7d 0
2026-03-10T07:51:27.771 INFO:tasks.workunit.client.1.vm08.stdout:4/244: symlink d5/d8/d9/d32/l55 0
2026-03-10T07:51:27.771 INFO:tasks.workunit.client.1.vm08.stdout:0/353: fsync dd/f16 0
2026-03-10T07:51:27.771 INFO:tasks.workunit.client.1.vm08.stdout:3/328: unlink d0/d3c/d1f/d44/d51/d34/l46 0
2026-03-10T07:51:27.771 INFO:tasks.workunit.client.1.vm08.stdout:3/329: truncate d0/d3c/d1f/d44/d51/d2d/f35 1910246 0
2026-03-10T07:51:27.771 INFO:tasks.workunit.client.1.vm08.stdout:8/350: dwrite d0/df/f26 [0,4194304] 0
2026-03-10T07:51:27.771 INFO:tasks.workunit.client.1.vm08.stdout:6/310: symlink d1/d3/df/d1d/d40/l6a 0
2026-03-10T07:51:27.771 INFO:tasks.workunit.client.1.vm08.stdout:8/351: chown d0/d37 477052650 1
2026-03-10T07:51:27.771 INFO:tasks.workunit.client.1.vm08.stdout:4/245: rmdir d5/d1f/d41 39
2026-03-10T07:51:27.772 INFO:tasks.workunit.client.1.vm08.stdout:9/339: rename d2/d26/f69 to d2/d58/d73/f7e 0
2026-03-10T07:51:27.772 INFO:tasks.workunit.client.1.vm08.stdout:3/330: creat d0/d3c/d1f/d44/f6a x:0 0 0
2026-03-10T07:51:27.772 INFO:tasks.workunit.client.1.vm08.stdout:4/246: rmdir d5/d1f/d31 39
2026-03-10T07:51:27.772 INFO:tasks.workunit.client.1.vm08.stdout:3/331: readlink d0/d3c/d1f/l54 0
2026-03-10T07:51:27.772 INFO:tasks.workunit.client.1.vm08.stdout:9/340: mknod d2/c7f 0
2026-03-10T07:51:27.772 INFO:tasks.workunit.client.1.vm08.stdout:0/354: dwrite dd/d29/f48 [0,4194304] 0
2026-03-10T07:51:27.774 INFO:tasks.workunit.client.1.vm08.stdout:9/341: symlink d2/d58/l80 0
2026-03-10T07:51:27.774 INFO:tasks.workunit.client.1.vm08.stdout:4/247: creat d5/d17/d48/d4f/f56 x:0 0 0
2026-03-10T07:51:27.775 INFO:tasks.workunit.client.1.vm08.stdout:8/352: stat d0/df/d15/d23/l2c 0
2026-03-10T07:51:27.781 INFO:tasks.workunit.client.1.vm08.stdout:9/342: mknod d2/d26/c81 0
2026-03-10T07:51:27.783 INFO:tasks.workunit.client.1.vm08.stdout:6/311: dwrite d1/d17/f66 [0,4194304] 0
2026-03-10T07:51:27.783 INFO:tasks.workunit.client.1.vm08.stdout:9/343: rmdir d2/d3 39
2026-03-10T07:51:27.783 INFO:tasks.workunit.client.1.vm08.stdout:8/353: mkdir d0/df/d17/d72 0
2026-03-10T07:51:27.783 INFO:tasks.workunit.client.1.vm08.stdout:3/332: dwrite d0/d3c/d18/d32/f62 [0,4194304] 0
2026-03-10T07:51:27.787 INFO:tasks.workunit.client.1.vm08.stdout:6/312: dwrite d1/db/d24/f26 [0,4194304] 0
2026-03-10T07:51:27.790 INFO:tasks.workunit.client.1.vm08.stdout:6/313: write d1/d3/d3e/f43 [304417,32294] 0
2026-03-10T07:51:27.791 INFO:tasks.workunit.client.1.vm08.stdout:6/314: write d1/db/d24/f25 [853827,114132] 0
2026-03-10T07:51:27.801 INFO:tasks.workunit.client.1.vm08.stdout:9/344: fsync d2/de/f3d 0
2026-03-10T07:51:27.818 INFO:tasks.workunit.client.1.vm08.stdout:3/333: unlink d0/d3c/d1f/d44/d51/d2d/f58 0
2026-03-10T07:51:27.819 INFO:tasks.workunit.client.1.vm08.stdout:8/354: dread d0/df/f60 [0,4194304] 0
2026-03-10T07:51:27.819 INFO:tasks.workunit.client.1.vm08.stdout:6/315: creat d1/d3/df/d1d/f6b x:0 0 0
2026-03-10T07:51:27.819 INFO:tasks.workunit.client.1.vm08.stdout:6/316: fsync d1/f28 0
2026-03-10T07:51:27.819 INFO:tasks.workunit.client.1.vm08.stdout:9/345: dwrite d2/d3/d25/d30/d35/f6c [0,4194304] 0
2026-03-10T07:51:27.819 INFO:tasks.workunit.client.1.vm08.stdout:3/334: rename d0/d3c/d18/d48/d55/l60 to d0/d3c/d18/d48/d55/d56/l6b 0
2026-03-10T07:51:27.819 INFO:tasks.workunit.client.1.vm08.stdout:9/346: dwrite d2/d3/d25/d30/f5d [0,4194304] 0
2026-03-10T07:51:27.826 INFO:tasks.workunit.client.1.vm08.stdout:3/335: creat d0/d3c/d1f/d44/f6c x:0 0 0
2026-03-10T07:51:27.827 INFO:tasks.workunit.client.1.vm08.stdout:6/317: link d1/d17/d2b/c36 d1/d3/df/d52/c6c 0
2026-03-10T07:51:27.834 INFO:tasks.workunit.client.1.vm08.stdout:2/373: truncate d0/d1/d17/d6b/f72 96869 0
2026-03-10T07:51:27.845 INFO:tasks.workunit.client.1.vm08.stdout:7/365: write d3/da/d25/d9/fd [1094493,1273] 0
2026-03-10T07:51:27.845 INFO:tasks.workunit.client.1.vm08.stdout:7/366: write d3/da/d25/d9/d2f/d3a/d4b/f7b [727760,35122] 0
2026-03-10T07:51:27.845 INFO:tasks.workunit.client.1.vm08.stdout:1/301: dwrite d2/d6/de/d1f/f20 [0,4194304] 0
2026-03-10T07:51:27.845 INFO:tasks.workunit.client.1.vm08.stdout:6/318: mkdir d1/d17/d2b/d6d 0
2026-03-10T07:51:27.845 INFO:tasks.workunit.client.1.vm08.stdout:9/347: symlink d2/l82 0
2026-03-10T07:51:27.845 INFO:tasks.workunit.client.1.vm08.stdout:2/374: mkdir d0/d1/d3/d39/d7d 0
2026-03-10T07:51:27.851 INFO:tasks.workunit.client.1.vm08.stdout:1/302: creat d2/f69 x:0 0 0
2026-03-10T07:51:27.856 INFO:tasks.workunit.client.1.vm08.stdout:8/355: dread d0/df/d2e/f44 [0,4194304] 0
2026-03-10T07:51:27.857 INFO:tasks.workunit.client.1.vm08.stdout:8/356: chown d0/df/d17/d66/f6a 626 1
2026-03-10T07:51:27.865 INFO:tasks.workunit.client.1.vm08.stdout:9/348: creat d2/d3/d25/d2b/f83 x:0 0 0
2026-03-10T07:51:27.870 INFO:tasks.workunit.client.1.vm08.stdout:3/336: sync
2026-03-10T07:51:27.872 INFO:tasks.workunit.client.1.vm08.stdout:6/319: rename d1/d3/df/d1d/f2a to d1/d3/df/d1d/d40/f6e 0
2026-03-10T07:51:27.875 INFO:tasks.workunit.client.1.vm08.stdout:2/375: rmdir d0/d1/d17 39
2026-03-10T07:51:27.876 INFO:tasks.workunit.client.1.vm08.stdout:8/357: symlink d0/df/d2e/d49/l73 0
2026-03-10T07:51:27.886 INFO:tasks.workunit.client.1.vm08.stdout:3/337: mknod d0/d3c/d18/d32/d61/d52/c6d 0
2026-03-10T07:51:27.889 INFO:tasks.workunit.client.1.vm08.stdout:1/303: rename d2/d6/d50/l65 to d2/d6/d11/l6a 0
2026-03-10T07:51:27.891 INFO:tasks.workunit.client.1.vm08.stdout:1/304: read d2/d6/de/d47/f3c [133644,129351] 0
2026-03-10T07:51:27.892 INFO:tasks.workunit.client.1.vm08.stdout:2/376: dread d0/f12 [0,4194304] 0
2026-03-10T07:51:27.893 INFO:tasks.workunit.client.1.vm08.stdout:6/320: mkdir d1/d3/df/d1d/d6f 0
2026-03-10T07:51:27.897 INFO:tasks.workunit.client.1.vm08.stdout:8/358: write d0/df/f1b [483514,128076] 0
2026-03-10T07:51:27.898 INFO:tasks.workunit.client.1.vm08.stdout:9/349: mkdir d2/d3/d84 0
2026-03-10T07:51:27.901 INFO:tasks.workunit.client.1.vm08.stdout:3/338: mknod d0/d3c/d1f/d44/d51/d2d/c6e 0
2026-03-10T07:51:27.918 INFO:tasks.workunit.client.1.vm08.stdout:8/359: mknod d0/df/d15/d23/d39/d5b/d4a/c74 0
2026-03-10T07:51:27.919 INFO:tasks.workunit.client.1.vm08.stdout:9/350: mknod d2/d3/d25/d30/d35/c85 0
2026-03-10T07:51:27.930 INFO:tasks.workunit.client.1.vm08.stdout:3/339: unlink d0/d3c/d1f/d44/f6a 0
2026-03-10T07:51:27.933 INFO:tasks.workunit.client.1.vm08.stdout:6/321: symlink d1/d3/df/d1d/d6f/l70 0
2026-03-10T07:51:27.936 INFO:tasks.workunit.client.1.vm08.stdout:9/351: dwrite d2/d3/d25/d30/d35/f6c [0,4194304] 0
2026-03-10T07:51:27.944 INFO:tasks.workunit.client.1.vm08.stdout:3/340: dwrite d0/d3c/d1f/d44/d51/d34/f47 [0,4194304] 0
2026-03-10T07:51:27.945 INFO:tasks.workunit.client.1.vm08.stdout:6/322: dwrite d1/db/d24/f25 [0,4194304] 0
2026-03-10T07:51:27.945 INFO:tasks.workunit.client.1.vm08.stdout:3/341: dread - d0/d3c/d1f/d44/d51/f65 zero size
2026-03-10T07:51:27.948 INFO:tasks.workunit.client.1.vm08.stdout:9/352: rmdir d2/de/d28 39
2026-03-10T07:51:27.957 INFO:tasks.workunit.client.1.vm08.stdout:6/323: read d1/d3/df/d1d/f1f [2682092,64549] 0
2026-03-10T07:51:27.967 INFO:tasks.workunit.client.1.vm08.stdout:1/305: getdents d2/d6/de 0
2026-03-10T07:51:27.980 INFO:tasks.workunit.client.1.vm08.stdout:6/324: creat d1/d3/f71 x:0 0 0
2026-03-10T07:51:27.980 INFO:tasks.workunit.client.1.vm08.stdout:6/325: write d1/db/d24/f26 [1402588,67892] 0
2026-03-10T07:51:27.980 INFO:tasks.workunit.client.1.vm08.stdout:6/326: write d1/f6 [889606,30427] 0
2026-03-10T07:51:27.980 INFO:tasks.workunit.client.1.vm08.stdout:8/360: getdents d0/df/d5d 0
2026-03-10T07:51:27.980 INFO:tasks.workunit.client.1.vm08.stdout:1/306: mknod d2/d6/d3a/d61/c6b 0
2026-03-10T07:51:27.980 INFO:tasks.workunit.client.1.vm08.stdout:3/342: creat d0/d3c/d1f/f6f x:0 0 0
2026-03-10T07:51:27.980 INFO:tasks.workunit.client.1.vm08.stdout:6/327: dwrite d1/f28 [0,4194304] 0
2026-03-10T07:51:27.980 INFO:tasks.workunit.client.1.vm08.stdout:6/328: chown d1 1189905 1
2026-03-10T07:51:27.980 INFO:tasks.workunit.client.1.vm08.stdout:1/307: write d2/d6/de/d1f/f2a [3774466,114001] 0
2026-03-10T07:51:27.980 INFO:tasks.workunit.client.1.vm08.stdout:1/308: chown d2/d6/de/d1f/d26/f48 440 1
2026-03-10T07:51:27.980 INFO:tasks.workunit.client.1.vm08.stdout:1/309: chown d2/d10 19869 1
2026-03-10T07:51:27.980 INFO:tasks.workunit.client.1.vm08.stdout:3/343: read d0/d3c/d1f/d44/d51/f4d [30764,59143] 0
2026-03-10T07:51:27.980 INFO:tasks.workunit.client.1.vm08.stdout:1/310: readlink d2/d6/de/d1f/l4e 0
2026-03-10T07:51:27.980 INFO:tasks.workunit.client.1.vm08.stdout:3/344: chown d0/d3c/d18 73790063 1
2026-03-10T07:51:27.980 INFO:tasks.workunit.client.1.vm08.stdout:1/311: chown d2/d6/de 897984 1
2026-03-10T07:51:27.980 INFO:tasks.workunit.client.1.vm08.stdout:8/361: creat d0/df/d15/d23/f75 x:0 0 0
2026-03-10T07:51:27.981 INFO:tasks.workunit.client.1.vm08.stdout:1/312: symlink d2/d6/d3a/l6c 0
2026-03-10T07:51:27.981 INFO:tasks.workunit.client.1.vm08.stdout:3/345: creat d0/d3c/d18/d32/d61/d52/f70 x:0 0 0
2026-03-10T07:51:27.985 INFO:tasks.workunit.client.1.vm08.stdout:8/362: dwrite d0/f6 [0,4194304] 0
2026-03-10T07:51:27.987 INFO:tasks.workunit.client.1.vm08.stdout:1/313: creat d2/d6/d3a/f6d x:0 0 0
2026-03-10T07:51:27.991 INFO:tasks.workunit.client.1.vm08.stdout:8/363: dwrite d0/d69/f4c [0,4194304] 0
2026-03-10T07:51:27.996 INFO:tasks.workunit.client.1.vm08.stdout:8/364: creat d0/df/d2e/d30/f76 x:0 0 0
2026-03-10T07:51:27.997 INFO:tasks.workunit.client.1.vm08.stdout:8/365: fsync d0/f2a 0
2026-03-10T07:51:28.002 INFO:tasks.workunit.client.1.vm08.stdout:1/314: dwrite d2/d6/de/d1f/d26/f48 [0,4194304] 0
2026-03-10T07:51:28.013 INFO:tasks.workunit.client.1.vm08.stdout:5/377: dwrite d0/d4/df/f27 [4194304,4194304] 0
2026-03-10T07:51:28.013 INFO:tasks.workunit.client.1.vm08.stdout:8/366: chown d0/df/d15/d23/d39/d5b/c34 1579 1
2026-03-10T07:51:28.013 INFO:tasks.workunit.client.1.vm08.stdout:1/315: creat d2/d6/de/d1f/d26/f6e x:0 0 0
2026-03-10T07:51:28.013 INFO:tasks.workunit.client.1.vm08.stdout:1/316: stat d2/d10/c17 0
2026-03-10T07:51:28.013 INFO:tasks.workunit.client.1.vm08.stdout:1/317: readlink d2/d6/de/d1f/d40/l5a 0
2026-03-10T07:51:28.013 INFO:tasks.workunit.client.1.vm08.stdout:8/367: truncate d0/df/d15/d23/d39/f40 3383897 0
2026-03-10T07:51:28.013 INFO:tasks.workunit.client.1.vm08.stdout:1/318: mkdir d2/d6/d3a/d61/d6f 0
2026-03-10T07:51:28.013 INFO:tasks.workunit.client.1.vm08.stdout:8/368: fsync d0/df/d17/d66/f6a 0
2026-03-10T07:51:28.013 INFO:tasks.workunit.client.1.vm08.stdout:1/319: dread - d2/d6/d3a/f6d zero size
2026-03-10T07:51:28.013 INFO:tasks.workunit.client.1.vm08.stdout:8/369: stat d0/df/d2e 0
2026-03-10T07:51:28.023 INFO:tasks.workunit.client.1.vm08.stdout:1/320: mkdir d2/d6/de/d70 0
2026-03-10T07:51:28.025 INFO:tasks.workunit.client.1.vm08.stdout:8/370: mkdir d0/d69/d77 0
2026-03-10T07:51:28.026 INFO:tasks.workunit.client.1.vm08.stdout:1/321: mkdir d2/d6/de/d71 0
2026-03-10T07:51:28.032 INFO:tasks.workunit.client.1.vm08.stdout:8/371: mknod d0/df/d17/c78 0
2026-03-10T07:51:28.032 INFO:tasks.workunit.client.1.vm08.stdout:8/372: chown d0/df/d2e/d49/l73 10 1
2026-03-10T07:51:28.032 INFO:tasks.workunit.client.1.vm08.stdout:1/322: rename d2/d6/de/d47/f45 to d2/d6/de/d1f/d26/d58/d64/f72 0
2026-03-10T07:51:28.032 INFO:tasks.workunit.client.1.vm08.stdout:1/323: fdatasync d2/f4 0
2026-03-10T07:51:28.032 INFO:tasks.workunit.client.1.vm08.stdout:8/373: truncate d0/df/f26 4515119 0
2026-03-10T07:51:28.032 INFO:tasks.workunit.client.1.vm08.stdout:1/324: mknod d2/d6/de/d1f/d26/c73 0
2026-03-10T07:51:28.068 INFO:tasks.workunit.client.1.vm08.stdout:8/374: dread d0/df/d2e/f3c [0,4194304] 0
2026-03-10T07:51:28.103 INFO:tasks.workunit.client.1.vm08.stdout:8/375: sync
2026-03-10T07:51:28.104 INFO:tasks.workunit.client.1.vm08.stdout:8/376: dread - d0/df/d17/d66/f6a zero size
2026-03-10T07:51:28.114 INFO:tasks.workunit.client.1.vm08.stdout:8/377: rename d0/df/d15/d23/d54/c57 to d0/df/d15/d23/d39/d5b/c79 0
2026-03-10T07:51:28.114 INFO:tasks.workunit.client.1.vm08.stdout:8/378: truncate d0/f6 4990304 0
2026-03-10T07:51:28.153 INFO:tasks.workunit.client.1.vm08.stdout:6/329: chown d1/d3/df/d1d/c69 48 1
2026-03-10T07:51:28.154 INFO:tasks.workunit.client.1.vm08.stdout:6/330: creat d1/d46/f72 x:0 0 0
2026-03-10T07:51:28.155 INFO:tasks.workunit.client.1.vm08.stdout:6/331: fdatasync d1/d17/f63 0
2026-03-10T07:51:28.159 INFO:tasks.workunit.client.1.vm08.stdout:6/332: rename d1/d17/d2b/d6d to d1/db/d24/d73 0
2026-03-10T07:51:28.162 INFO:tasks.workunit.client.1.vm08.stdout:6/333: rmdir d1/d3/df 39
2026-03-10T07:51:28.163 INFO:tasks.workunit.client.1.vm08.stdout:6/334: creat d1/d46/f74 x:0 0 0
2026-03-10T07:51:28.169 INFO:tasks.workunit.client.1.vm08.stdout:6/335: creat d1/db/d24/f75 x:0 0 0
2026-03-10T07:51:28.175 INFO:tasks.workunit.client.1.vm08.stdout:0/355: write dd/d10/d14/d15/d20/d22/f2e [957422,80623] 0
2026-03-10T07:51:28.176 INFO:tasks.workunit.client.1.vm08.stdout:6/336: mkdir d1/d17/d2b/d58/d76 0
2026-03-10T07:51:28.177 INFO:tasks.workunit.client.1.vm08.stdout:0/356: symlink dd/d18/l7c 0
2026-03-10T07:51:28.179 INFO:tasks.workunit.client.1.vm08.stdout:6/337: sync
2026-03-10T07:51:28.181 INFO:tasks.workunit.client.1.vm08.stdout:0/357: symlink dd/d29/d58/d3d/l7d 0
2026-03-10T07:51:28.184 INFO:tasks.workunit.client.1.vm08.stdout:6/338: mkdir d1/d17/d2b/d58/d77 0
2026-03-10T07:51:28.184 INFO:tasks.workunit.client.1.vm08.stdout:0/358: creat dd/d10/d14/d15/d20/f7e x:0 0 0
2026-03-10T07:51:28.186 INFO:tasks.workunit.client.1.vm08.stdout:6/339: dread d1/d3/d3e/f56 [0,4194304] 0
2026-03-10T07:51:28.188 INFO:tasks.workunit.client.1.vm08.stdout:0/359: rename fb to dd/d10/d14/d15/d20/d5f/f7f 0
2026-03-10T07:51:28.197 INFO:tasks.workunit.client.1.vm08.stdout:7/367: dwrite d3/da/d25/d9/d2f/d39/f76 [0,4194304] 0
2026-03-10T07:51:28.212 INFO:tasks.workunit.client.1.vm08.stdout:6/340: creat d1/db/d24/d73/f78 x:0 0 0
2026-03-10T07:51:28.212 INFO:tasks.workunit.client.1.vm08.stdout:4/248: dread f0 [0,4194304] 0
2026-03-10T07:51:28.217 INFO:tasks.workunit.client.1.vm08.stdout:2/377: truncate d0/f68 1186343 0
2026-03-10T07:51:28.229 INFO:tasks.workunit.client.1.vm08.stdout:7/368: symlink d3/da/d25/d9/d2f/d3a/d40/d54/l7c 0
2026-03-10T07:51:28.239 INFO:tasks.workunit.client.1.vm08.stdout:6/341: mkdir d1/db/d24/d73/d79 0
2026-03-10T07:51:28.239 INFO:tasks.workunit.client.1.vm08.stdout:6/342: write d1/d17/f20 [1306071,93234] 0
2026-03-10T07:51:28.243 INFO:tasks.workunit.client.1.vm08.stdout:4/249: mknod
d5/d8/d9/d12/c57 0 2026-03-10T07:51:28.245 INFO:tasks.workunit.client.1.vm08.stdout:3/346: truncate d0/d3c/d1f/d44/d51/d34/f53 3240025 0 2026-03-10T07:51:28.253 INFO:tasks.workunit.client.1.vm08.stdout:5/378: dwrite d0/d4/d19/d3a/d69/f6b [0,4194304] 0 2026-03-10T07:51:28.254 INFO:tasks.workunit.client.1.vm08.stdout:5/379: write d0/d4/df/d1e/d41/f5c [394274,6986] 0 2026-03-10T07:51:28.258 INFO:tasks.workunit.client.1.vm08.stdout:5/380: dwrite d0/d4/df/d1e/d41/f5c [0,4194304] 0 2026-03-10T07:51:28.270 INFO:tasks.workunit.client.1.vm08.stdout:5/381: dread d0/d8/f18 [0,4194304] 0 2026-03-10T07:51:28.276 INFO:tasks.workunit.client.1.vm08.stdout:6/343: mknod d1/d17/d2b/c7a 0 2026-03-10T07:51:28.278 INFO:tasks.workunit.client.1.vm08.stdout:4/250: rmdir d5/d8 39 2026-03-10T07:51:28.282 INFO:tasks.workunit.client.1.vm08.stdout:1/325: getdents d2/d6/de/d1f/d26 0 2026-03-10T07:51:28.283 INFO:tasks.workunit.client.1.vm08.stdout:1/326: dread - d2/d6/de/d1f/d26/f4a zero size 2026-03-10T07:51:28.283 INFO:tasks.workunit.client.1.vm08.stdout:1/327: chown d2/d6/d11/f44 551 1 2026-03-10T07:51:28.284 INFO:tasks.workunit.client.1.vm08.stdout:1/328: chown d2/d6/c5c 116544747 1 2026-03-10T07:51:28.284 INFO:tasks.workunit.client.1.vm08.stdout:2/378: mkdir d0/d1/d3/d39/d7d/d7e 0 2026-03-10T07:51:28.285 INFO:tasks.workunit.client.1.vm08.stdout:8/379: write d0/f20 [571262,98231] 0 2026-03-10T07:51:28.287 INFO:tasks.workunit.client.1.vm08.stdout:5/382: creat d0/d4/df/d4a/d63/f74 x:0 0 0 2026-03-10T07:51:28.290 INFO:tasks.workunit.client.1.vm08.stdout:4/251: chown d5/d8/d9/f4e 52290332 1 2026-03-10T07:51:28.290 INFO:tasks.workunit.client.1.vm08.stdout:1/329: dwrite d2/d6/de/d1f/f33 [4194304,4194304] 0 2026-03-10T07:51:28.290 INFO:tasks.workunit.client.1.vm08.stdout:3/347: chown d0/d3c/d1f/d44/d51/d34/f53 7646890 1 2026-03-10T07:51:28.291 INFO:tasks.workunit.client.1.vm08.stdout:5/383: read d0/d4/df/d12/f13 [1159568,38497] 0 2026-03-10T07:51:28.292 
INFO:tasks.workunit.client.1.vm08.stdout:1/330: chown d2/d6/de/d47/l5e 34 1 2026-03-10T07:51:28.293 INFO:tasks.workunit.client.1.vm08.stdout:1/331: chown d2/d6/de/d1f 2145 1 2026-03-10T07:51:28.299 INFO:tasks.workunit.client.1.vm08.stdout:8/380: mkdir d0/df/d17/d7a 0 2026-03-10T07:51:28.300 INFO:tasks.workunit.client.1.vm08.stdout:8/381: truncate d0/df/d15/f70 553764 0 2026-03-10T07:51:28.301 INFO:tasks.workunit.client.1.vm08.stdout:8/382: read d0/f45 [239684,14625] 0 2026-03-10T07:51:28.302 INFO:tasks.workunit.client.1.vm08.stdout:2/379: sync 2026-03-10T07:51:28.302 INFO:tasks.workunit.client.1.vm08.stdout:8/383: truncate d0/df/d17/d66/f6a 904784 0 2026-03-10T07:51:28.303 INFO:tasks.workunit.client.1.vm08.stdout:2/380: read - d0/d1/d3/d56/d57/f79 zero size 2026-03-10T07:51:28.309 INFO:tasks.workunit.client.1.vm08.stdout:4/252: dread d5/f26 [0,4194304] 0 2026-03-10T07:51:28.310 INFO:tasks.workunit.client.1.vm08.stdout:2/381: dwrite d0/d1/d3/d56/d57/f79 [0,4194304] 0 2026-03-10T07:51:28.312 INFO:tasks.workunit.client.1.vm08.stdout:6/344: truncate d1/db/d24/d51/f59 445782 0 2026-03-10T07:51:28.316 INFO:tasks.workunit.client.1.vm08.stdout:6/345: stat d1/db/d24/d51 0 2026-03-10T07:51:28.317 INFO:tasks.workunit.client.1.vm08.stdout:6/346: truncate d1/db/f4e 1068001 0 2026-03-10T07:51:28.318 INFO:tasks.workunit.client.1.vm08.stdout:6/347: truncate d1/d17/d2b/f68 24467 0 2026-03-10T07:51:28.318 INFO:tasks.workunit.client.1.vm08.stdout:2/382: dwrite d0/f50 [0,4194304] 0 2026-03-10T07:51:28.327 INFO:tasks.workunit.client.1.vm08.stdout:1/332: rename d2/d6/de/d1f/f20 to d2/d6/de/f74 0 2026-03-10T07:51:28.329 INFO:tasks.workunit.client.1.vm08.stdout:1/333: read d2/d6/de/d1f/d22/f24 [3130483,120324] 0 2026-03-10T07:51:28.331 INFO:tasks.workunit.client.1.vm08.stdout:1/334: fdatasync d2/d6/d3a/f6d 0 2026-03-10T07:51:28.334 INFO:tasks.workunit.client.1.vm08.stdout:6/348: chown d1/d3/df 128032 1 2026-03-10T07:51:28.336 INFO:tasks.workunit.client.1.vm08.stdout:4/253: dwrite 
d5/d1f/d41/f49 [0,4194304] 0 2026-03-10T07:51:28.339 INFO:tasks.workunit.client.1.vm08.stdout:4/254: write d5/d17/f43 [157581,83087] 0 2026-03-10T07:51:28.344 INFO:tasks.workunit.client.1.vm08.stdout:0/360: dwrite f6 [0,4194304] 0 2026-03-10T07:51:28.346 INFO:tasks.workunit.client.1.vm08.stdout:2/383: creat d0/d1/d3/d56/d78/d42/f7f x:0 0 0 2026-03-10T07:51:28.357 INFO:tasks.workunit.client.1.vm08.stdout:5/384: creat d0/d4/f75 x:0 0 0 2026-03-10T07:51:28.357 INFO:tasks.workunit.client.1.vm08.stdout:7/369: truncate d3/da/d25/d9/d2f/f42 3588388 0 2026-03-10T07:51:28.365 INFO:tasks.workunit.client.1.vm08.stdout:6/349: mknod d1/d3/df/d1d/d6f/c7b 0 2026-03-10T07:51:28.365 INFO:tasks.workunit.client.1.vm08.stdout:6/350: write d1/d17/f20 [267617,18756] 0 2026-03-10T07:51:28.369 INFO:tasks.workunit.client.1.vm08.stdout:4/255: write d5/d8/d9/f18 [1167856,43499] 0 2026-03-10T07:51:28.370 INFO:tasks.workunit.client.1.vm08.stdout:4/256: read - d5/d8/d9/d32/f44 zero size 2026-03-10T07:51:28.370 INFO:tasks.workunit.client.1.vm08.stdout:8/384: link d0/df/d15/d53/c61 d0/df/d17/d66/c7b 0 2026-03-10T07:51:28.371 INFO:tasks.workunit.client.1.vm08.stdout:0/361: mknod dd/d10/d14/c80 0 2026-03-10T07:51:28.372 INFO:tasks.workunit.client.1.vm08.stdout:0/362: chown dd/d29 308774106 1 2026-03-10T07:51:28.372 INFO:tasks.workunit.client.1.vm08.stdout:2/384: creat d0/d1/d3/d39/d7d/f80 x:0 0 0 2026-03-10T07:51:28.373 INFO:tasks.workunit.client.1.vm08.stdout:3/348: getdents d0/d3c/d18/d32/d61 0 2026-03-10T07:51:28.375 INFO:tasks.workunit.client.1.vm08.stdout:5/385: creat d0/d4/df/d4a/d63/f76 x:0 0 0 2026-03-10T07:51:28.382 INFO:tasks.workunit.client.1.vm08.stdout:4/257: rmdir d5 39 2026-03-10T07:51:28.385 INFO:tasks.workunit.client.1.vm08.stdout:9/353: write d2/f1a [4216743,87907] 0 2026-03-10T07:51:28.404 INFO:tasks.workunit.client.1.vm08.stdout:1/335: link d2/d6/de/d1f/d26/f6e d2/d6/de/d1f/f75 0 2026-03-10T07:51:28.408 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:51:28 vm05.local 
ceph-mon[50387]: pgmap v33: 65 pgs: 65 active+clean; 2.6 GiB data, 8.6 GiB used, 111 GiB / 120 GiB avail; 19 MiB/s rd, 103 MiB/s wr, 207 op/s 2026-03-10T07:51:28.412 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:51:28 vm05.local ceph-mon[50387]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T07:51:28.412 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:51:28 vm05.local ceph-mon[50387]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' 2026-03-10T07:51:28.412 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:51:28 vm05.local ceph-mon[50387]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' 2026-03-10T07:51:28.412 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:51:28 vm05.local ceph-mon[50387]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T07:51:28.412 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:51:28 vm05.local ceph-mon[50387]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T07:51:28.412 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:51:28 vm05.local ceph-mon[50387]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' 2026-03-10T07:51:28.412 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:51:28 vm05.local ceph-mon[50387]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' cmd=[{"prefix": "dashboard get-alertmanager-api-host"}]: dispatch 2026-03-10T07:51:28.412 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:51:28 vm05.local ceph-mon[50387]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T07:51:28.412 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:51:28 vm05.local 
ceph-mon[50387]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T07:51:28.412 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:51:28 vm05.local ceph-mon[50387]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T07:51:28.412 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:51:28 vm05.local ceph-mon[50387]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T07:51:28.412 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:51:28 vm05.local ceph-mon[50387]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T07:51:28.412 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:51:28 vm05.local ceph-mon[50387]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T07:51:28.412 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:51:28 vm05.local ceph-mon[50387]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T07:51:28.412 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:51:28 vm05.local ceph-mon[50387]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T07:51:28.412 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:51:28 vm05.local ceph-mon[50387]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T07:51:28.412 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:51:28 vm05.local ceph-mon[50387]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T07:51:28.412 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:51:28 vm05.local ceph-mon[50387]: 
from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T07:51:28.415 INFO:tasks.workunit.client.1.vm08.stdout:6/351: mkdir d1/db/d24/d73/d79/d7c 0 2026-03-10T07:51:28.415 INFO:tasks.workunit.client.1.vm08.stdout:6/352: fdatasync d1/db/d24/f50 0 2026-03-10T07:51:28.416 INFO:tasks.workunit.client.1.vm08.stdout:4/258: write d5/d8/d9/d32/f44 [6082,17879] 0 2026-03-10T07:51:28.417 INFO:tasks.workunit.client.1.vm08.stdout:4/259: dread - d5/d8/d9/f4e zero size 2026-03-10T07:51:28.418 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:51:28 vm08.local ceph-mon[59917]: pgmap v33: 65 pgs: 65 active+clean; 2.6 GiB data, 8.6 GiB used, 111 GiB / 120 GiB avail; 19 MiB/s rd, 103 MiB/s wr, 207 op/s 2026-03-10T07:51:28.421 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:51:28 vm08.local ceph-mon[59917]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T07:51:28.421 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:51:28 vm08.local ceph-mon[59917]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' 2026-03-10T07:51:28.421 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:51:28 vm08.local ceph-mon[59917]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' 2026-03-10T07:51:28.421 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:51:28 vm08.local ceph-mon[59917]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T07:51:28.421 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:51:28 vm08.local ceph-mon[59917]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T07:51:28.421 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:51:28 vm08.local ceph-mon[59917]: from='mgr.14628 
192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' 2026-03-10T07:51:28.421 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:51:28 vm08.local ceph-mon[59917]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' cmd=[{"prefix": "dashboard get-alertmanager-api-host"}]: dispatch 2026-03-10T07:51:28.421 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:51:28 vm08.local ceph-mon[59917]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T07:51:28.421 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:51:28 vm08.local ceph-mon[59917]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T07:51:28.421 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:51:28 vm08.local ceph-mon[59917]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T07:51:28.421 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:51:28 vm08.local ceph-mon[59917]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T07:51:28.421 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:51:28 vm08.local ceph-mon[59917]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T07:51:28.421 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:51:28 vm08.local ceph-mon[59917]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T07:51:28.421 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:51:28 vm08.local ceph-mon[59917]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T07:51:28.421 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:51:28 vm08.local ceph-mon[59917]: from='mgr.14628 
192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T07:51:28.421 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:51:28 vm08.local ceph-mon[59917]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T07:51:28.421 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:51:28 vm08.local ceph-mon[59917]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T07:51:28.421 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:51:28 vm08.local ceph-mon[59917]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T07:51:28.421 INFO:tasks.workunit.client.1.vm08.stdout:9/354: unlink d2/f10 0 2026-03-10T07:51:28.422 INFO:tasks.workunit.client.1.vm08.stdout:1/336: sync 2026-03-10T07:51:28.423 INFO:tasks.workunit.client.1.vm08.stdout:1/337: dread - d2/d6/de/d1f/f75 zero size 2026-03-10T07:51:28.424 INFO:tasks.workunit.client.1.vm08.stdout:0/363: link dd/d10/d14/d15/d20/f72 dd/d10/d2f/f81 0 2026-03-10T07:51:28.424 INFO:tasks.workunit.client.1.vm08.stdout:7/370: dwrite d3/da/d25/d9/d2f/f62 [4194304,4194304] 0 2026-03-10T07:51:28.434 INFO:tasks.workunit.client.1.vm08.stdout:8/385: write d0/df/d17/f1a [799926,64491] 0 2026-03-10T07:51:28.435 INFO:tasks.workunit.client.1.vm08.stdout:2/385: truncate d0/d1/d3/d10/d38/f53 590057 0 2026-03-10T07:51:28.446 INFO:tasks.workunit.client.1.vm08.stdout:9/355: fsync d2/de/f4d 0 2026-03-10T07:51:28.471 INFO:tasks.workunit.client.1.vm08.stdout:8/386: symlink d0/df/d2e/d30/l7c 0 2026-03-10T07:51:28.471 INFO:tasks.workunit.client.1.vm08.stdout:8/387: stat d0/df/d5d 0 2026-03-10T07:51:28.473 INFO:tasks.workunit.client.1.vm08.stdout:3/349: rename d0/d3c/d1f/d44/d51/d2d/f43 to d0/d3c/d1f/d44/d51/f71 0 2026-03-10T07:51:28.477 INFO:tasks.workunit.client.1.vm08.stdout:4/260: creat d5/d1f/d31/f58 x:0 0 0 
2026-03-10T07:51:28.482 INFO:tasks.workunit.client.1.vm08.stdout:9/356: unlink d2/d3/d25/d30/c39 0 2026-03-10T07:51:28.490 INFO:tasks.workunit.client.1.vm08.stdout:7/371: creat d3/da/d25/d9/d2f/d39/d43/d4f/d5b/d79/f7d x:0 0 0 2026-03-10T07:51:28.493 INFO:tasks.workunit.client.1.vm08.stdout:7/372: dwrite d3/da/d25/f1e [0,4194304] 0 2026-03-10T07:51:28.499 INFO:tasks.workunit.client.1.vm08.stdout:8/388: mknod d0/df/d17/c7d 0 2026-03-10T07:51:28.500 INFO:tasks.workunit.client.1.vm08.stdout:3/350: mknod d0/d3c/d18/d4a/c72 0 2026-03-10T07:51:28.503 INFO:tasks.workunit.client.1.vm08.stdout:8/389: dread d0/df/d17/d66/f6a [0,4194304] 0 2026-03-10T07:51:28.504 INFO:tasks.workunit.client.1.vm08.stdout:8/390: truncate d0/d69/f4c 5146656 0 2026-03-10T07:51:28.506 INFO:tasks.workunit.client.1.vm08.stdout:4/261: dread d5/d8/f42 [0,4194304] 0 2026-03-10T07:51:28.507 INFO:tasks.workunit.client.1.vm08.stdout:4/262: write d5/d8/ff [1493564,49704] 0 2026-03-10T07:51:28.509 INFO:tasks.workunit.client.1.vm08.stdout:0/364: link ca dd/d10/d14/d15/d20/d5f/c82 0 2026-03-10T07:51:28.516 INFO:tasks.workunit.client.1.vm08.stdout:2/386: creat d0/f81 x:0 0 0 2026-03-10T07:51:28.522 INFO:tasks.workunit.client.1.vm08.stdout:7/373: creat d3/da/d25/d9/d2f/d39/d43/f7e x:0 0 0 2026-03-10T07:51:28.522 INFO:tasks.workunit.client.1.vm08.stdout:6/353: getdents d1/d3/df/d1d/d40/d45/d5c 0 2026-03-10T07:51:28.522 INFO:tasks.workunit.client.1.vm08.stdout:6/354: chown d1/d3/df 586778 1 2026-03-10T07:51:28.526 INFO:tasks.workunit.client.1.vm08.stdout:6/355: dwrite d1/db/f23 [0,4194304] 0 2026-03-10T07:51:28.526 INFO:tasks.workunit.client.1.vm08.stdout:6/356: dread - d1/d3/f71 zero size 2026-03-10T07:51:28.539 INFO:tasks.workunit.client.1.vm08.stdout:6/357: dwrite d1/d3/df/d44/f5a [0,4194304] 0 2026-03-10T07:51:28.540 INFO:tasks.workunit.client.1.vm08.stdout:6/358: write d1/d46/f74 [327210,99517] 0 2026-03-10T07:51:28.555 INFO:tasks.workunit.client.1.vm08.stdout:7/374: dread d3/da/d25/f29 [0,4194304] 0 
2026-03-10T07:51:28.565 INFO:tasks.workunit.client.1.vm08.stdout:8/391: unlink d0/df/d17/f32 0 2026-03-10T07:51:28.571 INFO:tasks.workunit.client.1.vm08.stdout:1/338: getdents d2/d10 0 2026-03-10T07:51:28.596 INFO:tasks.workunit.client.1.vm08.stdout:5/386: rename d0/d4/df/d4a to d0/d77 0 2026-03-10T07:51:28.599 INFO:tasks.workunit.client.1.vm08.stdout:9/357: truncate d2/de/f1e 3982258 0 2026-03-10T07:51:28.600 INFO:tasks.workunit.client.1.vm08.stdout:6/359: fsync d1/d3/f21 0 2026-03-10T07:51:28.604 INFO:tasks.workunit.client.1.vm08.stdout:3/351: link d0/d3c/d1f/d44/f6c d0/d3c/d18/d32/d61/d52/f73 0 2026-03-10T07:51:28.607 INFO:tasks.workunit.client.1.vm08.stdout:8/392: rmdir d0/df/d15 39 2026-03-10T07:51:28.607 INFO:tasks.workunit.client.1.vm08.stdout:1/339: write d2/d6/de/d1f/d26/d58/d64/f72 [200314,30737] 0 2026-03-10T07:51:28.608 INFO:tasks.workunit.client.1.vm08.stdout:4/263: mknod d5/d8/d50/c59 0 2026-03-10T07:51:28.610 INFO:tasks.workunit.client.1.vm08.stdout:4/264: write d5/d1f/d31/f3f [1486383,47431] 0 2026-03-10T07:51:28.612 INFO:tasks.workunit.client.1.vm08.stdout:4/265: readlink d5/d1f/d31/l47 0 2026-03-10T07:51:28.612 INFO:tasks.workunit.client.1.vm08.stdout:4/266: fdatasync d5/d8/f30 0 2026-03-10T07:51:28.613 INFO:tasks.workunit.client.1.vm08.stdout:0/365: rename dd/f56 to dd/d29/d58/d3d/f83 0 2026-03-10T07:51:28.622 INFO:tasks.workunit.client.1.vm08.stdout:9/358: dwrite d2/d3/d25/f24 [0,4194304] 0 2026-03-10T07:51:28.623 INFO:tasks.workunit.client.1.vm08.stdout:3/352: mknod d0/d3c/d18/d4a/c74 0 2026-03-10T07:51:28.623 INFO:tasks.workunit.client.1.vm08.stdout:9/359: write d2/d3/d25/d2b/f7a [369002,105908] 0 2026-03-10T07:51:28.623 INFO:tasks.workunit.client.1.vm08.stdout:1/340: mkdir d2/d6/de/d1f/d40/d76 0 2026-03-10T07:51:28.627 INFO:tasks.workunit.client.1.vm08.stdout:3/353: truncate d0/d3c/d1f/d44/f4b 38965 0 2026-03-10T07:51:28.629 INFO:tasks.workunit.client.1.vm08.stdout:7/375: dwrite d3/da/d25/d9/d2f/f42 [0,4194304] 0 2026-03-10T07:51:28.642 
INFO:tasks.workunit.client.1.vm08.stdout:4/267: creat d5/d8/d9/f5a x:0 0 0 2026-03-10T07:51:28.642 INFO:tasks.workunit.client.1.vm08.stdout:0/366: creat dd/d10/d14/d15/f84 x:0 0 0 2026-03-10T07:51:28.642 INFO:tasks.workunit.client.1.vm08.stdout:8/393: symlink d0/d69/d77/l7e 0 2026-03-10T07:51:28.643 INFO:tasks.workunit.client.1.vm08.stdout:6/360: mkdir d1/d7d 0 2026-03-10T07:51:28.643 INFO:tasks.workunit.client.1.vm08.stdout:2/387: getdents d0/d1/d3/d10/d32 0 2026-03-10T07:51:28.643 INFO:tasks.workunit.client.1.vm08.stdout:4/268: fsync d5/d17/f22 0 2026-03-10T07:51:28.644 INFO:tasks.workunit.client.1.vm08.stdout:4/269: stat d5/d8/l2a 0 2026-03-10T07:51:28.648 INFO:tasks.workunit.client.1.vm08.stdout:5/387: getdents d0/d4/d19/d60/d6d/d70/d40 0 2026-03-10T07:51:28.649 INFO:tasks.workunit.client.1.vm08.stdout:0/367: write dd/d29/f2a [267614,79823] 0 2026-03-10T07:51:28.650 INFO:tasks.workunit.client.1.vm08.stdout:7/376: mknod d3/da/c7f 0 2026-03-10T07:51:28.652 INFO:tasks.workunit.client.1.vm08.stdout:7/377: chown d3/da/d25/d9/f23 0 1 2026-03-10T07:51:28.652 INFO:tasks.workunit.client.1.vm08.stdout:0/368: mknod dd/d29/d5c/c85 0 2026-03-10T07:51:28.655 INFO:tasks.workunit.client.1.vm08.stdout:4/270: dwrite d5/d17/f1a [0,4194304] 0 2026-03-10T07:51:28.659 INFO:tasks.workunit.client.1.vm08.stdout:3/354: dwrite d0/d3c/d1f/d44/d51/f71 [4194304,4194304] 0 2026-03-10T07:51:28.666 INFO:tasks.workunit.client.1.vm08.stdout:7/378: symlink d3/da/d25/d9/d2f/d3a/l80 0 2026-03-10T07:51:28.666 INFO:tasks.workunit.client.1.vm08.stdout:4/271: unlink d5/f26 0 2026-03-10T07:51:28.668 INFO:tasks.workunit.client.1.vm08.stdout:9/360: dread d2/de/f4f [4194304,4194304] 0 2026-03-10T07:51:28.668 INFO:tasks.workunit.client.1.vm08.stdout:3/355: mknod d0/d3c/d18/d48/d55/c75 0 2026-03-10T07:51:28.668 INFO:tasks.workunit.client.1.vm08.stdout:6/361: rename d1/d3/df/d1d/d40/c67 to d1/c7e 0 2026-03-10T07:51:28.669 INFO:tasks.workunit.client.1.vm08.stdout:2/388: link d0/d1/c71 d0/d1/d17/c82 0 
2026-03-10T07:51:28.671 INFO:tasks.workunit.client.1.vm08.stdout:4/272: chown d5/d17/d48/d4f/f56 266520 1 2026-03-10T07:51:28.680 INFO:tasks.workunit.client.1.vm08.stdout:5/388: dread d0/d4/d19/d43/f35 [0,4194304] 0 2026-03-10T07:51:28.680 INFO:tasks.workunit.client.1.vm08.stdout:1/341: dwrite d2/d6/de/d47/f3c [0,4194304] 0 2026-03-10T07:51:28.680 INFO:tasks.workunit.client.1.vm08.stdout:1/342: chown d2/d6/de/f32 4040340 1 2026-03-10T07:51:28.685 INFO:tasks.workunit.client.1.vm08.stdout:3/356: creat d0/d3c/d18/d32/f76 x:0 0 0 2026-03-10T07:51:28.691 INFO:tasks.workunit.client.1.vm08.stdout:4/273: rmdir d5/d8/d9 39 2026-03-10T07:51:28.692 INFO:tasks.workunit.client.1.vm08.stdout:5/389: creat d0/d77/d63/f78 x:0 0 0 2026-03-10T07:51:28.695 INFO:tasks.workunit.client.1.vm08.stdout:3/357: mknod d0/d3c/d18/d48/d55/d56/c77 0 2026-03-10T07:51:28.697 INFO:tasks.workunit.client.1.vm08.stdout:4/274: creat d5/d17/d48/f5b x:0 0 0 2026-03-10T07:51:28.698 INFO:tasks.workunit.client.1.vm08.stdout:5/390: creat d0/d4/d19/f79 x:0 0 0 2026-03-10T07:51:28.705 INFO:tasks.workunit.client.1.vm08.stdout:1/343: symlink d2/d6/de/d71/l77 0 2026-03-10T07:51:28.705 INFO:tasks.workunit.client.1.vm08.stdout:3/358: dwrite d0/d3c/d18/f1e [0,4194304] 0 2026-03-10T07:51:28.705 INFO:tasks.workunit.client.1.vm08.stdout:7/379: getdents d3/da/d25/d9/d2f/d4d 0 2026-03-10T07:51:28.705 INFO:tasks.workunit.client.1.vm08.stdout:7/380: readlink d3/da/d25/d9/d2f/d3a/d4b/d67/d69/l77 0 2026-03-10T07:51:28.705 INFO:tasks.workunit.client.1.vm08.stdout:9/361: rename d2/d3/d25 to d2/d86 0 2026-03-10T07:51:28.706 INFO:tasks.workunit.client.1.vm08.stdout:4/275: write d5/d8/d9/d12/f53 [310947,84238] 0 2026-03-10T07:51:28.710 INFO:tasks.workunit.client.1.vm08.stdout:3/359: dwrite d0/d3c/d18/f22 [4194304,4194304] 0 2026-03-10T07:51:28.712 INFO:tasks.workunit.client.1.vm08.stdout:3/360: chown d0/f10 876408 1 2026-03-10T07:51:28.714 INFO:tasks.workunit.client.1.vm08.stdout:1/344: creat d2/d6/de/d1f/f78 x:0 0 0 
2026-03-10T07:51:28.720 INFO:tasks.workunit.client.1.vm08.stdout:1/345: fdatasync d2/d6/de/d1f/d22/f30 0 2026-03-10T07:51:28.722 INFO:tasks.workunit.client.1.vm08.stdout:3/361: mknod d0/d3c/d1f/d44/d51/d34/c78 0 2026-03-10T07:51:28.722 INFO:tasks.workunit.client.1.vm08.stdout:4/276: getdents d5/d8/d9/d12 0 2026-03-10T07:51:28.725 INFO:tasks.workunit.client.1.vm08.stdout:4/277: rmdir d5 39 2026-03-10T07:51:28.729 INFO:tasks.workunit.client.1.vm08.stdout:3/362: creat d0/f79 x:0 0 0 2026-03-10T07:51:28.731 INFO:tasks.workunit.client.1.vm08.stdout:4/278: rename d5/l14 to d5/d8/d9/l5c 0 2026-03-10T07:51:28.732 INFO:tasks.workunit.client.1.vm08.stdout:4/279: chown d5/d1f/d31/l4a 407 1 2026-03-10T07:51:28.733 INFO:tasks.workunit.client.1.vm08.stdout:4/280: chown d5/f21 1287 1 2026-03-10T07:51:28.734 INFO:tasks.workunit.client.1.vm08.stdout:5/391: dread d0/d77/d63/f65 [0,4194304] 0 2026-03-10T07:51:28.734 INFO:tasks.workunit.client.1.vm08.stdout:3/363: dread d0/d3c/f20 [0,4194304] 0 2026-03-10T07:51:28.734 INFO:tasks.workunit.client.1.vm08.stdout:5/392: fdatasync d0/d77/f62 0 2026-03-10T07:51:28.735 INFO:tasks.workunit.client.1.vm08.stdout:5/393: write d0/d4/df/f2a [321001,51884] 0 2026-03-10T07:51:28.740 INFO:tasks.workunit.client.1.vm08.stdout:5/394: rename d0/d8/fe to d0/d4/df/f7a 0 2026-03-10T07:51:28.743 INFO:tasks.workunit.client.1.vm08.stdout:3/364: dwrite d0/d3c/f20 [0,4194304] 0 2026-03-10T07:51:28.743 INFO:tasks.workunit.client.1.vm08.stdout:5/395: symlink d0/d33/l7b 0 2026-03-10T07:51:28.743 INFO:tasks.workunit.client.1.vm08.stdout:3/365: readlink d0/d3c/d18/d48/d55/l5a 0 2026-03-10T07:51:28.751 INFO:tasks.workunit.client.1.vm08.stdout:5/396: rename d0/d4/df/f5a to d0/d4/d19/d43/f7c 0 2026-03-10T07:51:28.760 INFO:tasks.workunit.client.1.vm08.stdout:3/366: rename d0/d3c/d1f/d44/f6c to d0/d3c/d18/d32/d61/f7a 0 2026-03-10T07:51:28.760 INFO:tasks.workunit.client.1.vm08.stdout:9/362: dread d2/d86/d30/f70 [0,4194304] 0 2026-03-10T07:51:28.772 
INFO:tasks.workunit.client.1.vm08.stdout:3/367: dread d0/fc [4194304,4194304] 0 2026-03-10T07:51:28.777 INFO:tasks.workunit.client.1.vm08.stdout:3/368: dwrite d0/d3c/d18/d32/d61/d52/f70 [0,4194304] 0 2026-03-10T07:51:28.786 INFO:tasks.workunit.client.1.vm08.stdout:2/389: rmdir d0 39 2026-03-10T07:51:28.786 INFO:tasks.workunit.client.1.vm08.stdout:8/394: write d0/df/d15/d23/d39/f40 [3442123,20033] 0 2026-03-10T07:51:28.786 INFO:tasks.workunit.client.1.vm08.stdout:4/281: sync 2026-03-10T07:51:28.787 INFO:tasks.workunit.client.1.vm08.stdout:9/363: sync 2026-03-10T07:51:28.791 INFO:tasks.workunit.client.1.vm08.stdout:0/369: truncate dd/d18/f3c 3738126 0 2026-03-10T07:51:28.791 INFO:tasks.workunit.client.1.vm08.stdout:2/390: write d0/d1/d3/d10/d38/f52 [208827,88660] 0 2026-03-10T07:51:28.793 INFO:tasks.workunit.client.1.vm08.stdout:8/395: write d0/df/d2e/d30/f33 [2269677,88697] 0 2026-03-10T07:51:28.794 INFO:tasks.workunit.client.1.vm08.stdout:3/369: rmdir d0/d3c/d18/d48/d55/d56 39 2026-03-10T07:51:28.794 INFO:tasks.workunit.client.1.vm08.stdout:6/362: truncate d1/d3/df/d44/f5a 27205 0 2026-03-10T07:51:28.795 INFO:tasks.workunit.client.1.vm08.stdout:3/370: readlink d0/d3c/d1f/l54 0 2026-03-10T07:51:28.795 INFO:tasks.workunit.client.1.vm08.stdout:1/346: getdents d2/d6/de/d71 0 2026-03-10T07:51:28.797 INFO:tasks.workunit.client.1.vm08.stdout:7/381: truncate d3/da/f17 3037625 0 2026-03-10T07:51:28.806 INFO:tasks.workunit.client.1.vm08.stdout:6/363: dread d1/d46/f74 [0,4194304] 0 2026-03-10T07:51:28.806 INFO:tasks.workunit.client.1.vm08.stdout:6/364: chown d1/d3/df/d52 97139 1 2026-03-10T07:51:28.807 INFO:tasks.workunit.client.1.vm08.stdout:4/282: unlink d5/d1f/d31/f3f 0 2026-03-10T07:51:28.814 INFO:tasks.workunit.client.1.vm08.stdout:9/364: sync 2026-03-10T07:51:28.814 INFO:tasks.workunit.client.1.vm08.stdout:5/397: unlink d0/d4/df/f7a 0 2026-03-10T07:51:28.814 INFO:tasks.workunit.client.1.vm08.stdout:5/398: chown d0/d4/f31 1 1 2026-03-10T07:51:28.818 
INFO:tasks.workunit.client.1.vm08.stdout:8/396: creat d0/df/d2e/d30/f7f x:0 0 0 2026-03-10T07:51:28.822 INFO:tasks.workunit.client.1.vm08.stdout:3/371: rename d0/c5d to d0/d3c/d18/d4a/c7b 0 2026-03-10T07:51:28.824 INFO:tasks.workunit.client.1.vm08.stdout:7/382: rmdir d3/da/d25/d9/d2f/d3a/d40 39 2026-03-10T07:51:28.835 INFO:tasks.workunit.client.1.vm08.stdout:0/370: dwrite dd/d29/d58/d3d/f40 [0,4194304] 0 2026-03-10T07:51:28.848 INFO:tasks.workunit.client.1.vm08.stdout:2/391: dwrite d0/f1e [0,4194304] 0 2026-03-10T07:51:28.852 INFO:tasks.workunit.client.1.vm08.stdout:0/371: dread dd/d10/d2f/d37/d64/f70 [0,4194304] 0 2026-03-10T07:51:28.857 INFO:tasks.workunit.client.1.vm08.stdout:1/347: creat d2/d6/de/d1f/d40/d76/f79 x:0 0 0 2026-03-10T07:51:28.859 INFO:tasks.workunit.client.1.vm08.stdout:0/372: dwrite dd/d29/f48 [0,4194304] 0 2026-03-10T07:51:28.864 INFO:tasks.workunit.client.1.vm08.stdout:8/397: dread d0/df/d15/d23/d39/f6f [0,4194304] 0 2026-03-10T07:51:28.864 INFO:tasks.workunit.client.1.vm08.stdout:3/372: symlink d0/d3c/d18/d4a/l7c 0 2026-03-10T07:51:28.870 INFO:tasks.workunit.client.1.vm08.stdout:0/373: read dd/d10/d14/d15/d20/d22/f2e [1757883,55490] 0 2026-03-10T07:51:28.876 INFO:tasks.workunit.client.1.vm08.stdout:3/373: dwrite d0/d3c/d18/d32/d61/f57 [0,4194304] 0 2026-03-10T07:51:28.877 INFO:tasks.workunit.client.1.vm08.stdout:8/398: dwrite d0/df/d2e/d30/f76 [0,4194304] 0 2026-03-10T07:51:28.886 INFO:tasks.workunit.client.1.vm08.stdout:7/383: rename d3/da/d25/d9/d2f/d39/d43/d4f/d5b/d79/f7d to d3/da/d25/d9/d2f/d3a/d4b/d67/d69/f81 0 2026-03-10T07:51:28.887 INFO:tasks.workunit.client.1.vm08.stdout:6/365: link d1/d17/d2b/f68 d1/db/d24/d73/d79/d7c/f7f 0 2026-03-10T07:51:28.889 INFO:tasks.workunit.client.1.vm08.stdout:2/392: mknod d0/d1/d3/d10/d32/c83 0 2026-03-10T07:51:28.897 INFO:tasks.workunit.client.1.vm08.stdout:8/399: mknod d0/df/d2e/d30/c80 0 2026-03-10T07:51:28.899 INFO:tasks.workunit.client.1.vm08.stdout:4/283: dwrite d5/d8/f42 [0,4194304] 0 
2026-03-10T07:51:28.907 INFO:tasks.workunit.client.1.vm08.stdout:1/348: rename d2/d6/d11 to d2/d6/d3a/d61/d7a 0 2026-03-10T07:51:28.908 INFO:tasks.workunit.client.1.vm08.stdout:9/365: dread d2/de/f4d [0,4194304] 0 2026-03-10T07:51:28.910 INFO:tasks.workunit.client.1.vm08.stdout:9/366: chown d2/d3/fa 13716 1 2026-03-10T07:51:28.921 INFO:tasks.workunit.client.1.vm08.stdout:1/349: write d2/d6/f3b [1033434,12583] 0 2026-03-10T07:51:28.926 INFO:tasks.workunit.client.1.vm08.stdout:1/350: dread - d2/d6/de/d1f/d26/d58/f68 zero size 2026-03-10T07:51:28.926 INFO:tasks.workunit.client.1.vm08.stdout:3/374: mknod d0/d3c/d18/c7d 0 2026-03-10T07:51:28.926 INFO:tasks.workunit.client.1.vm08.stdout:8/400: unlink d0/df/d15/d23/d54/f56 0 2026-03-10T07:51:28.926 INFO:tasks.workunit.client.1.vm08.stdout:4/284: creat d5/d17/d48/f5d x:0 0 0 2026-03-10T07:51:28.926 INFO:tasks.workunit.client.1.vm08.stdout:7/384: rename d3/da/d25/f1e to d3/da/d25/d9/d2f/d39/d43/f82 0 2026-03-10T07:51:28.926 INFO:tasks.workunit.client.1.vm08.stdout:6/366: getdents d1/d7d 0 2026-03-10T07:51:28.926 INFO:tasks.workunit.client.1.vm08.stdout:7/385: chown d3/da/d25/d9/d2f/d6c 903000999 1 2026-03-10T07:51:28.926 INFO:tasks.workunit.client.1.vm08.stdout:7/386: chown d3/da/d25/d9/d2f/d4d 20 1 2026-03-10T07:51:28.926 INFO:tasks.workunit.client.1.vm08.stdout:1/351: mknod d2/d6/d3a/c7b 0 2026-03-10T07:51:28.926 INFO:tasks.workunit.client.1.vm08.stdout:7/387: chown d3/l22 3968 1 2026-03-10T07:51:28.926 INFO:tasks.workunit.client.1.vm08.stdout:4/285: symlink d5/d8/d9/d32/l5e 0 2026-03-10T07:51:28.926 INFO:tasks.workunit.client.1.vm08.stdout:7/388: chown d3/cc 9562281 1 2026-03-10T07:51:28.926 INFO:tasks.workunit.client.1.vm08.stdout:8/401: creat d0/df/d5d/f81 x:0 0 0 2026-03-10T07:51:28.926 INFO:tasks.workunit.client.1.vm08.stdout:4/286: symlink d5/d17/d48/l5f 0 2026-03-10T07:51:28.926 INFO:tasks.workunit.client.1.vm08.stdout:5/399: dread d0/f3b [0,4194304] 0 2026-03-10T07:51:28.927 
INFO:tasks.workunit.client.1.vm08.stdout:3/375: creat d0/d3c/d1f/f7e x:0 0 0 2026-03-10T07:51:28.927 INFO:tasks.workunit.client.1.vm08.stdout:7/389: symlink d3/da/l83 0 2026-03-10T07:51:28.928 INFO:tasks.workunit.client.1.vm08.stdout:3/376: chown d0/f28 13149 1 2026-03-10T07:51:28.928 INFO:tasks.workunit.client.1.vm08.stdout:7/390: readlink d3/da/l24 0 2026-03-10T07:51:28.930 INFO:tasks.workunit.client.1.vm08.stdout:1/352: getdents d2 0 2026-03-10T07:51:28.930 INFO:tasks.workunit.client.1.vm08.stdout:4/287: dwrite d5/d8/ff [0,4194304] 0 2026-03-10T07:51:28.933 INFO:tasks.workunit.client.1.vm08.stdout:3/377: fdatasync d0/d3c/d18/d32/d61/f7a 0 2026-03-10T07:51:28.940 INFO:tasks.workunit.client.1.vm08.stdout:6/367: sync 2026-03-10T07:51:28.948 INFO:tasks.workunit.client.1.vm08.stdout:3/378: creat d0/d3c/d18/d32/d61/d52/f7f x:0 0 0 2026-03-10T07:51:28.948 INFO:tasks.workunit.client.1.vm08.stdout:1/353: rename d2/d6/de/d1f/f33 to d2/d6/de/f7c 0 2026-03-10T07:51:28.949 INFO:tasks.workunit.client.1.vm08.stdout:3/379: write d0/d3c/d1f/f6f [56911,106631] 0 2026-03-10T07:51:28.949 INFO:tasks.workunit.client.1.vm08.stdout:7/391: creat d3/da/f84 x:0 0 0 2026-03-10T07:51:28.951 INFO:tasks.workunit.client.1.vm08.stdout:7/392: chown d3/da/d25/d9/d2f/d3a/d4b/c5f 5 1 2026-03-10T07:51:28.956 INFO:tasks.workunit.client.1.vm08.stdout:6/368: rename d1/d3/df/d44/c54 to d1/db/d24/d3d/c80 0 2026-03-10T07:51:28.956 INFO:tasks.workunit.client.1.vm08.stdout:6/369: fsync d1/d3/d3e/f43 0 2026-03-10T07:51:28.956 INFO:tasks.workunit.client.1.vm08.stdout:2/393: mkdir d0/d1/d3/d10/d32/d61/d84 0 2026-03-10T07:51:28.956 INFO:tasks.workunit.client.1.vm08.stdout:6/370: chown d1/db/l39 313771 1 2026-03-10T07:51:28.956 INFO:tasks.workunit.client.1.vm08.stdout:3/380: fdatasync d0/d3c/d18/d32/d61/d52/f7f 0 2026-03-10T07:51:28.956 INFO:tasks.workunit.client.1.vm08.stdout:7/393: write d3/da/d25/f5c [760249,82576] 0 2026-03-10T07:51:28.956 INFO:tasks.workunit.client.1.vm08.stdout:7/394: write 
d3/da/d25/d9/d2f/d3a/d40/f63 [1645312,99893] 0 2026-03-10T07:51:28.957 INFO:tasks.workunit.client.1.vm08.stdout:7/395: dread - d3/da/d25/d9/d2f/d3a/d4b/f70 zero size 2026-03-10T07:51:28.957 INFO:tasks.workunit.client.1.vm08.stdout:7/396: dread - d3/da/f84 zero size 2026-03-10T07:51:28.957 INFO:tasks.workunit.client.1.vm08.stdout:7/397: stat d3/da/c2a 0 2026-03-10T07:51:28.958 INFO:tasks.workunit.client.1.vm08.stdout:6/371: creat d1/d3/d3e/f81 x:0 0 0 2026-03-10T07:51:28.959 INFO:tasks.workunit.client.1.vm08.stdout:3/381: mkdir d0/d3c/d18/d80 0 2026-03-10T07:51:28.960 INFO:tasks.workunit.client.1.vm08.stdout:2/394: rename d0/d1/d3/d56/d78/d42/d55/d1b/f5c to d0/d1/f85 0 2026-03-10T07:51:28.962 INFO:tasks.workunit.client.1.vm08.stdout:3/382: creat d0/d3c/d18/d48/d55/d56/f81 x:0 0 0 2026-03-10T07:51:28.964 INFO:tasks.workunit.client.1.vm08.stdout:7/398: link d3/f57 d3/f85 0 2026-03-10T07:51:28.965 INFO:tasks.workunit.client.1.vm08.stdout:6/372: dread d1/d3/df/f12 [0,4194304] 0 2026-03-10T07:51:28.968 INFO:tasks.workunit.client.1.vm08.stdout:7/399: readlink d3/da/d25/d9/d2f/d39/l6e 0 2026-03-10T07:51:28.969 INFO:tasks.workunit.client.1.vm08.stdout:5/400: dread d0/d4/d19/d3a/f3f [0,4194304] 0 2026-03-10T07:51:28.972 INFO:tasks.workunit.client.1.vm08.stdout:7/400: mknod d3/da/d25/d9/d2f/d6c/c86 0 2026-03-10T07:51:28.972 INFO:tasks.workunit.client.1.vm08.stdout:6/373: dwrite d1/f34 [0,4194304] 0 2026-03-10T07:51:28.977 INFO:tasks.workunit.client.1.vm08.stdout:3/383: dread d0/d3c/d1f/d44/d51/d34/f4e [0,4194304] 0 2026-03-10T07:51:28.977 INFO:tasks.workunit.client.1.vm08.stdout:1/354: sync 2026-03-10T07:51:28.981 INFO:tasks.workunit.client.1.vm08.stdout:3/384: truncate d0/d3c/d18/f23 8862936 0 2026-03-10T07:51:28.986 INFO:tasks.workunit.client.1.vm08.stdout:1/355: creat d2/d6/d3a/f7d x:0 0 0 2026-03-10T07:51:28.987 INFO:tasks.workunit.client.1.vm08.stdout:3/385: dread d0/d3c/d1f/d44/d51/f2b [0,4194304] 0 2026-03-10T07:51:28.988 INFO:tasks.workunit.client.1.vm08.stdout:1/356: 
chown d2/d6/de/d47/f63 1303025585 1 2026-03-10T07:51:28.988 INFO:tasks.workunit.client.1.vm08.stdout:3/386: truncate d0/f39 1440646 0 2026-03-10T07:51:28.991 INFO:tasks.workunit.client.1.vm08.stdout:6/374: creat d1/d3/df/d44/f82 x:0 0 0 2026-03-10T07:51:28.996 INFO:tasks.workunit.client.1.vm08.stdout:1/357: dwrite d2/d6/de/d1f/f67 [0,4194304] 0 2026-03-10T07:51:29.001 INFO:tasks.workunit.client.1.vm08.stdout:7/401: link d3/da/d25/d9/d2f/d3a/d4b/d67/d69/f65 d3/da/d25/d9/f87 0 2026-03-10T07:51:29.009 INFO:tasks.workunit.client.1.vm08.stdout:8/402: getdents d0/df/d5d 0 2026-03-10T07:51:29.009 INFO:tasks.workunit.client.1.vm08.stdout:7/402: read - d3/da/d25/d9/d2f/d3a/d4b/f70 zero size 2026-03-10T07:51:29.011 INFO:tasks.workunit.client.1.vm08.stdout:9/367: dwrite d2/fd [4194304,4194304] 0 2026-03-10T07:51:29.015 INFO:tasks.workunit.client.1.vm08.stdout:6/375: symlink d1/d3/df/d1d/d40/l83 0 2026-03-10T07:51:29.025 INFO:tasks.workunit.client.1.vm08.stdout:6/376: dread - d1/d3/f71 zero size 2026-03-10T07:51:29.025 INFO:tasks.workunit.client.1.vm08.stdout:1/358: mknod d2/d6/de/d1f/d22/c7e 0 2026-03-10T07:51:29.026 INFO:tasks.workunit.client.1.vm08.stdout:7/403: truncate d3/f2e 35489 0 2026-03-10T07:51:29.026 INFO:tasks.workunit.client.1.vm08.stdout:0/374: link dd/d18/f25 dd/d29/d58/f86 0 2026-03-10T07:51:29.026 INFO:tasks.workunit.client.1.vm08.stdout:6/377: mknod d1/d7d/c84 0 2026-03-10T07:51:29.026 INFO:tasks.workunit.client.1.vm08.stdout:3/387: getdents d0/d3c/d18/d32/d61 0 2026-03-10T07:51:29.026 INFO:tasks.workunit.client.1.vm08.stdout:0/375: rmdir dd/d10/d2f/d37/d64 39 2026-03-10T07:51:29.026 INFO:tasks.workunit.client.1.vm08.stdout:8/403: sync 2026-03-10T07:51:29.026 INFO:tasks.workunit.client.1.vm08.stdout:6/378: write d1/d3/d3e/f81 [314015,24333] 0 2026-03-10T07:51:29.027 INFO:tasks.workunit.client.1.vm08.stdout:1/359: link d2/d6/de/d1f/d26/f48 d2/d6/d50/f7f 0 2026-03-10T07:51:29.028 INFO:tasks.workunit.client.1.vm08.stdout:0/376: stat dd/d29/d58/d3d/f83 0 
2026-03-10T07:51:29.028 INFO:tasks.workunit.client.1.vm08.stdout:4/288: dwrite d5/d17/f1a [4194304,4194304] 0 2026-03-10T07:51:29.030 INFO:tasks.workunit.client.1.vm08.stdout:2/395: rename d0/d1/d3/d56/d78/d42 to d0/d1/d3/d39/d7d/d86 0 2026-03-10T07:51:29.035 INFO:tasks.workunit.client.1.vm08.stdout:9/368: getdents d2/d58/d73 0 2026-03-10T07:51:29.035 INFO:tasks.workunit.client.1.vm08.stdout:8/404: write d0/df/d2e/d30/f43 [112003,59109] 0 2026-03-10T07:51:29.035 INFO:tasks.workunit.client.1.vm08.stdout:6/379: read d1/d46/f74 [282528,81899] 0 2026-03-10T07:51:29.037 INFO:tasks.workunit.client.1.vm08.stdout:1/360: mkdir d2/d6/de/d70/d80 0 2026-03-10T07:51:29.037 INFO:tasks.workunit.client.1.vm08.stdout:0/377: truncate dd/f16 161964 0 2026-03-10T07:51:29.038 INFO:tasks.workunit.client.1.vm08.stdout:1/361: fsync d2/f36 0 2026-03-10T07:51:29.039 INFO:tasks.workunit.client.1.vm08.stdout:6/380: truncate d1/d3/d3e/f81 1084877 0 2026-03-10T07:51:29.039 INFO:tasks.workunit.client.1.vm08.stdout:0/378: chown dd/d10/d14/d15/d20/d22/f51 95 1 2026-03-10T07:51:29.039 INFO:tasks.workunit.client.1.vm08.stdout:3/388: dwrite d0/d3c/d1f/d44/d51/f71 [4194304,4194304] 0 2026-03-10T07:51:29.040 INFO:tasks.workunit.client.1.vm08.stdout:0/379: dread - dd/d10/d14/d15/f84 zero size 2026-03-10T07:51:29.041 INFO:tasks.workunit.client.1.vm08.stdout:5/401: rename d0/d4/l58 to d0/d4/d19/d60/l7d 0 2026-03-10T07:51:29.042 INFO:tasks.workunit.client.1.vm08.stdout:2/396: fsync d0/d1/d3/d39/d7d/d86/d55/f20 0 2026-03-10T07:51:29.043 INFO:tasks.workunit.client.1.vm08.stdout:4/289: dread f1 [0,4194304] 0 2026-03-10T07:51:29.043 INFO:tasks.workunit.client.1.vm08.stdout:5/402: chown d0/d4/d19/d3a/f3f 3155791 1 2026-03-10T07:51:29.052 INFO:tasks.workunit.client.1.vm08.stdout:6/381: stat d1/d17/d2b/c36 0 2026-03-10T07:51:29.052 INFO:tasks.workunit.client.1.vm08.stdout:1/362: rmdir d2/d6/de/d71 39 2026-03-10T07:51:29.055 INFO:tasks.workunit.client.1.vm08.stdout:0/380: write dd/d10/d14/d15/d20/d5f/f7f 
[1200243,64850] 0 2026-03-10T07:51:29.055 INFO:tasks.workunit.client.1.vm08.stdout:5/403: creat d0/d8/f7e x:0 0 0 2026-03-10T07:51:29.056 INFO:tasks.workunit.client.1.vm08.stdout:0/381: chown dd/d18/l3a 953 1 2026-03-10T07:51:29.056 INFO:tasks.workunit.client.1.vm08.stdout:0/382: chown dd/d10/d14/d15 2150894 1 2026-03-10T07:51:29.057 INFO:tasks.workunit.client.1.vm08.stdout:7/404: rename d3/da/d25/d9/c31 to d3/da/d25/d9/d2f/d3a/d4b/c88 0 2026-03-10T07:51:29.058 INFO:tasks.workunit.client.1.vm08.stdout:2/397: truncate d0/d1/d17/f1a 6137231 0 2026-03-10T07:51:29.058 INFO:tasks.workunit.client.1.vm08.stdout:7/405: readlink d3/da/d25/l12 0 2026-03-10T07:51:29.059 INFO:tasks.workunit.client.1.vm08.stdout:6/382: creat d1/d3/df/f85 x:0 0 0 2026-03-10T07:51:29.059 INFO:tasks.workunit.client.1.vm08.stdout:5/404: creat d0/d4/df/f7f x:0 0 0 2026-03-10T07:51:29.060 INFO:tasks.workunit.client.1.vm08.stdout:8/405: link d0/df/d2e/d30/l7c d0/df/d2e/l82 0 2026-03-10T07:51:29.060 INFO:tasks.workunit.client.1.vm08.stdout:3/389: creat d0/d3c/d1f/d44/d51/f82 x:0 0 0 2026-03-10T07:51:29.060 INFO:tasks.workunit.client.1.vm08.stdout:7/406: readlink d3/da/d25/d9/d2f/d3a/d4b/d67/d69/l77 0 2026-03-10T07:51:29.060 INFO:tasks.workunit.client.1.vm08.stdout:0/383: mknod dd/d29/d58/d3d/c87 0 2026-03-10T07:51:29.061 INFO:tasks.workunit.client.1.vm08.stdout:0/384: stat dd/d10/d14/d15/d20/f7e 0 2026-03-10T07:51:29.062 INFO:tasks.workunit.client.1.vm08.stdout:1/363: link d2/d6/de/d47/f3c d2/d6/de/d1f/d22/f81 0 2026-03-10T07:51:29.066 INFO:tasks.workunit.client.1.vm08.stdout:2/398: mknod d0/d1/d3/d39/c87 0 2026-03-10T07:51:29.066 INFO:tasks.workunit.client.1.vm08.stdout:3/390: mkdir d0/d3c/d18/d32/d61/d83 0 2026-03-10T07:51:29.067 INFO:tasks.workunit.client.1.vm08.stdout:8/406: mknod d0/d37/c83 0 2026-03-10T07:51:29.067 INFO:tasks.workunit.client.1.vm08.stdout:0/385: dwrite dd/d10/d14/d15/d20/d5f/f7f [0,4194304] 0 2026-03-10T07:51:29.067 INFO:tasks.workunit.client.1.vm08.stdout:1/364: creat 
d2/d6/de/d70/f82 x:0 0 0 2026-03-10T07:51:29.072 INFO:tasks.workunit.client.1.vm08.stdout:1/365: readlink d2/d6/de/l27 0 2026-03-10T07:51:29.072 INFO:tasks.workunit.client.1.vm08.stdout:1/366: read - d2/d6/de/d1f/f3d zero size 2026-03-10T07:51:29.073 INFO:tasks.workunit.client.1.vm08.stdout:7/407: symlink d3/da/d25/d9/d2f/d3a/l89 0 2026-03-10T07:51:29.076 INFO:tasks.workunit.client.1.vm08.stdout:0/386: unlink dd/d29/d5c/c79 0 2026-03-10T07:51:29.078 INFO:tasks.workunit.client.1.vm08.stdout:8/407: chown d0/df/l10 48846416 1 2026-03-10T07:51:29.079 INFO:tasks.workunit.client.1.vm08.stdout:8/408: write d0/df/d15/d23/f3d [666650,42741] 0 2026-03-10T07:51:29.081 INFO:tasks.workunit.client.1.vm08.stdout:7/408: truncate d3/da/d25/d9/d2f/d39/f58 705781 0 2026-03-10T07:51:29.096 INFO:tasks.workunit.client.1.vm08.stdout:2/399: write d0/d1/d3/d10/d32/f45 [4500089,19625] 0 2026-03-10T07:51:29.096 INFO:tasks.workunit.client.1.vm08.stdout:6/383: creat d1/d3/df/d44/f86 x:0 0 0 2026-03-10T07:51:29.098 INFO:tasks.workunit.client.1.vm08.stdout:1/367: rename d2/d6/de/d1f/d26/d58/d64 to d2/d6/de/d1f/d26/d58/d83 0 2026-03-10T07:51:29.099 INFO:tasks.workunit.client.1.vm08.stdout:1/368: read d2/d6/de/d1f/d26/f48 [247780,116579] 0 2026-03-10T07:51:29.100 INFO:tasks.workunit.client.1.vm08.stdout:6/384: dwrite d1/f34 [0,4194304] 0 2026-03-10T07:51:29.103 INFO:tasks.workunit.client.1.vm08.stdout:8/409: chown d0/df/d17/l4d 20207274 1 2026-03-10T07:51:29.104 INFO:tasks.workunit.client.1.vm08.stdout:3/391: creat d0/f84 x:0 0 0 2026-03-10T07:51:29.106 INFO:tasks.workunit.client.1.vm08.stdout:0/387: rename ca to dd/d10/d2f/c88 0 2026-03-10T07:51:29.106 INFO:tasks.workunit.client.1.vm08.stdout:8/410: mknod d0/c84 0 2026-03-10T07:51:29.106 INFO:tasks.workunit.client.1.vm08.stdout:7/409: mkdir d3/da/d8a 0 2026-03-10T07:51:29.106 INFO:tasks.workunit.client.1.vm08.stdout:6/385: mkdir d1/d3/df/d1d/d40/d87 0 2026-03-10T07:51:29.107 INFO:tasks.workunit.client.1.vm08.stdout:8/411: fdatasync 
d0/df/d15/d23/d39/f3e 0 2026-03-10T07:51:29.107 INFO:tasks.workunit.client.1.vm08.stdout:7/410: readlink d3/da/l24 0 2026-03-10T07:51:29.107 INFO:tasks.workunit.client.1.vm08.stdout:6/386: truncate d1/d46/f72 1032478 0 2026-03-10T07:51:29.108 INFO:tasks.workunit.client.1.vm08.stdout:3/392: chown d0/d3c/d18/d48/d55/d56/l6b 2096313 1 2026-03-10T07:51:29.108 INFO:tasks.workunit.client.1.vm08.stdout:6/387: dread - d1/d3/df/f85 zero size 2026-03-10T07:51:29.109 INFO:tasks.workunit.client.1.vm08.stdout:2/400: read d0/d1/d3/d10/f58 [3827695,72784] 0 2026-03-10T07:51:29.109 INFO:tasks.workunit.client.1.vm08.stdout:6/388: fsync d1/d17/f20 0 2026-03-10T07:51:29.109 INFO:tasks.workunit.client.1.vm08.stdout:1/369: read d2/d6/d3a/d61/d7a/f44 [357902,121538] 0 2026-03-10T07:51:29.114 INFO:tasks.workunit.client.1.vm08.stdout:6/389: fsync d1/db/d24/f25 0 2026-03-10T07:51:29.115 INFO:tasks.workunit.client.1.vm08.stdout:3/393: write d0/d3c/d1f/d44/d51/f71 [6183500,64448] 0 2026-03-10T07:51:29.123 INFO:tasks.workunit.client.1.vm08.stdout:2/401: sync 2026-03-10T07:51:29.123 INFO:tasks.workunit.client.1.vm08.stdout:1/370: dwrite d2/d10/f3f [0,4194304] 0 2026-03-10T07:51:29.123 INFO:tasks.workunit.client.1.vm08.stdout:6/390: dwrite d1/f6 [0,4194304] 0 2026-03-10T07:51:29.126 INFO:tasks.workunit.client.1.vm08.stdout:8/412: creat d0/df/d15/d23/d39/f85 x:0 0 0 2026-03-10T07:51:29.126 INFO:tasks.workunit.client.1.vm08.stdout:0/388: link dd/d10/d14/c3b dd/d29/d58/d3d/c89 0 2026-03-10T07:51:29.126 INFO:tasks.workunit.client.1.vm08.stdout:2/402: dwrite d0/d1/d3/d39/f3b [0,4194304] 0 2026-03-10T07:51:29.129 INFO:tasks.workunit.client.1.vm08.stdout:8/413: fdatasync d0/df/d2e/d30/f33 0 2026-03-10T07:51:29.129 INFO:tasks.workunit.client.1.vm08.stdout:0/389: dread - dd/d10/d14/d1b/f76 zero size 2026-03-10T07:51:29.135 INFO:tasks.workunit.client.1.vm08.stdout:0/390: mknod dd/d10/d14/d1b/c8a 0 2026-03-10T07:51:29.135 INFO:tasks.workunit.client.1.vm08.stdout:8/414: read d0/df/d15/d23/f3d [380959,8310] 
0 2026-03-10T07:51:29.135 INFO:tasks.workunit.client.1.vm08.stdout:8/415: fdatasync d0/d37/f47 0 2026-03-10T07:51:29.138 INFO:tasks.workunit.client.1.vm08.stdout:6/391: symlink d1/d3/df/d38/l88 0 2026-03-10T07:51:29.142 INFO:tasks.workunit.client.1.vm08.stdout:8/416: write d0/df/d15/d23/d39/f40 [1486554,50992] 0 2026-03-10T07:51:29.144 INFO:tasks.workunit.client.1.vm08.stdout:8/417: mkdir d0/d37/d86 0 2026-03-10T07:51:29.148 INFO:tasks.workunit.client.1.vm08.stdout:8/418: creat d0/d69/d77/f87 x:0 0 0 2026-03-10T07:51:29.149 INFO:tasks.workunit.client.1.vm08.stdout:6/392: dread d1/d17/d2b/f3c [0,4194304] 0 2026-03-10T07:51:29.149 INFO:tasks.workunit.client.1.vm08.stdout:8/419: read - d0/df/d5d/f81 zero size 2026-03-10T07:51:29.150 INFO:tasks.workunit.client.1.vm08.stdout:6/393: chown d1/d3/d3e/f81 472076952 1 2026-03-10T07:51:29.153 INFO:tasks.workunit.client.1.vm08.stdout:6/394: creat d1/d3/df/d1d/d40/d87/f89 x:0 0 0 2026-03-10T07:51:29.154 INFO:tasks.workunit.client.1.vm08.stdout:6/395: rmdir d1 39 2026-03-10T07:51:29.155 INFO:tasks.workunit.client.1.vm08.stdout:6/396: fdatasync d1/f49 0 2026-03-10T07:51:29.155 INFO:tasks.workunit.client.1.vm08.stdout:8/420: dread d0/df/d2e/f44 [0,4194304] 0 2026-03-10T07:51:29.156 INFO:tasks.workunit.client.1.vm08.stdout:6/397: write d1/d17/d2b/f3c [738278,117393] 0 2026-03-10T07:51:29.158 INFO:tasks.workunit.client.1.vm08.stdout:8/421: dwrite d0/d69/f46 [0,4194304] 0 2026-03-10T07:51:29.160 INFO:tasks.workunit.client.1.vm08.stdout:6/398: fdatasync d1/f35 0 2026-03-10T07:51:29.167 INFO:tasks.workunit.client.1.vm08.stdout:6/399: truncate d1/d3/df/f85 150336 0 2026-03-10T07:51:29.168 INFO:tasks.workunit.client.1.vm08.stdout:6/400: chown d1/d3/d3e 12498 1 2026-03-10T07:51:29.169 INFO:tasks.workunit.client.1.vm08.stdout:9/369: truncate d2/de/f4f 477527 0 2026-03-10T07:51:29.171 INFO:tasks.workunit.client.1.vm08.stdout:2/403: truncate d0/d1/d3/d56/d78/f62 6829288 0 2026-03-10T07:51:29.172 
INFO:tasks.workunit.client.1.vm08.stdout:4/290: truncate d5/f2d 1628279 0 2026-03-10T07:51:29.173 INFO:tasks.workunit.client.1.vm08.stdout:9/370: rename d2/d58/f56 to d2/d86/d30/f87 0 2026-03-10T07:51:29.173 INFO:tasks.workunit.client.1.vm08.stdout:8/422: dwrite d0/f20 [0,4194304] 0 2026-03-10T07:51:29.175 INFO:tasks.workunit.client.1.vm08.stdout:8/423: fdatasync d0/df/d5d/f81 0 2026-03-10T07:51:29.178 INFO:tasks.workunit.client.1.vm08.stdout:4/291: chown d5/f2f 7564 1 2026-03-10T07:51:29.183 INFO:tasks.workunit.client.1.vm08.stdout:6/401: dread d1/d3/df/d38/f60 [0,4194304] 0 2026-03-10T07:51:29.186 INFO:tasks.workunit.client.1.vm08.stdout:4/292: dwrite d5/d17/f1a [0,4194304] 0 2026-03-10T07:51:29.189 INFO:tasks.workunit.client.1.vm08.stdout:5/405: dwrite d0/d4/d19/d43/f35 [0,4194304] 0 2026-03-10T07:51:29.192 INFO:tasks.workunit.client.1.vm08.stdout:9/371: symlink d2/d86/d30/l88 0 2026-03-10T07:51:29.208 INFO:tasks.workunit.client.1.vm08.stdout:6/402: mknod d1/d3/df/d1d/d40/d45/c8a 0 2026-03-10T07:51:29.208 INFO:tasks.workunit.client.1.vm08.stdout:9/372: rmdir d2/de 39 2026-03-10T07:51:29.208 INFO:tasks.workunit.client.1.vm08.stdout:2/404: link d0/d1/d3/d10/d38/l5f d0/d1/d3/d39/d7d/d86/d55/d7a/l88 0 2026-03-10T07:51:29.208 INFO:tasks.workunit.client.1.vm08.stdout:9/373: readlink d2/d26/l2d 0 2026-03-10T07:51:29.208 INFO:tasks.workunit.client.1.vm08.stdout:5/406: creat d0/f80 x:0 0 0 2026-03-10T07:51:29.208 INFO:tasks.workunit.client.1.vm08.stdout:9/374: truncate d2/d86/f7c 337992 0 2026-03-10T07:51:29.208 INFO:tasks.workunit.client.1.vm08.stdout:4/293: dwrite d5/d8/d9/f18 [0,4194304] 0 2026-03-10T07:51:29.208 INFO:tasks.workunit.client.1.vm08.stdout:2/405: dread d0/d1/d3/d56/d57/f79 [0,4194304] 0 2026-03-10T07:51:29.208 INFO:tasks.workunit.client.1.vm08.stdout:6/403: link d1/d3/df/d38/l88 d1/db/d24/d51/l8b 0 2026-03-10T07:51:29.208 INFO:tasks.workunit.client.1.vm08.stdout:5/407: mkdir d0/d4/d19/d81 0 2026-03-10T07:51:29.208 
INFO:tasks.workunit.client.1.vm08.stdout:4/294: dread - d5/d1f/d31/f4d zero size 2026-03-10T07:51:29.208 INFO:tasks.workunit.client.1.vm08.stdout:9/375: mknod d2/d3/c89 0 2026-03-10T07:51:29.209 INFO:tasks.workunit.client.1.vm08.stdout:5/408: rmdir d0/d4/df/d1e/d41 39 2026-03-10T07:51:29.211 INFO:tasks.workunit.client.1.vm08.stdout:2/406: dwrite d0/d1/d3/d39/d7d/f80 [0,4194304] 0 2026-03-10T07:51:29.211 INFO:tasks.workunit.client.1.vm08.stdout:4/295: symlink d5/d8/d9/d12/l60 0 2026-03-10T07:51:29.220 INFO:tasks.workunit.client.1.vm08.stdout:9/376: mkdir d2/d8a 0 2026-03-10T07:51:29.222 INFO:tasks.workunit.client.1.vm08.stdout:2/407: dwrite d0/d1/d3/d10/d32/d61/f59 [0,4194304] 0 2026-03-10T07:51:29.224 INFO:tasks.workunit.client.1.vm08.stdout:4/296: rmdir d5/d1f/d41 39 2026-03-10T07:51:29.229 INFO:tasks.workunit.client.1.vm08.stdout:2/408: unlink d0/c2d 0 2026-03-10T07:51:29.230 INFO:tasks.workunit.client.1.vm08.stdout:4/297: mkdir d5/d1f/d31/d61 0 2026-03-10T07:51:29.230 INFO:tasks.workunit.client.1.vm08.stdout:2/409: chown d0/d1/d3/d39/d7d/d7e 164 1 2026-03-10T07:51:29.234 INFO:tasks.workunit.client.1.vm08.stdout:9/377: truncate d2/de/f3d 365221 0 2026-03-10T07:51:29.234 INFO:tasks.workunit.client.1.vm08.stdout:4/298: rename d5/d1f/d41/f49 to d5/d1f/d31/f62 0 2026-03-10T07:51:29.235 INFO:tasks.workunit.client.1.vm08.stdout:4/299: dread - d5/d8/d9/f46 zero size 2026-03-10T07:51:29.236 INFO:tasks.workunit.client.1.vm08.stdout:4/300: truncate d5/d8/ff 4310992 0 2026-03-10T07:51:29.243 INFO:tasks.workunit.client.1.vm08.stdout:2/410: getdents d0/d1/d3 0 2026-03-10T07:51:29.244 INFO:tasks.workunit.client.1.vm08.stdout:4/301: mknod d5/d1f/d31/d61/c63 0 2026-03-10T07:51:29.247 INFO:tasks.workunit.client.1.vm08.stdout:9/378: dwrite d2/f5 [0,4194304] 0 2026-03-10T07:51:29.257 INFO:tasks.workunit.client.1.vm08.stdout:2/411: link d0/c33 d0/d1/d3/d39/d7d/c89 0 2026-03-10T07:51:29.257 INFO:tasks.workunit.client.1.vm08.stdout:2/412: dread d0/f1e [0,4194304] 0 
2026-03-10T07:51:29.259 INFO:tasks.workunit.client.1.vm08.stdout:2/413: dwrite d0/d1/d3/d56/f70 [0,4194304] 0 2026-03-10T07:51:29.269 INFO:tasks.workunit.client.1.vm08.stdout:9/379: link d2/d86/d30/d35/c85 d2/de/d28/c8b 0 2026-03-10T07:51:29.275 INFO:tasks.workunit.client.1.vm08.stdout:2/414: symlink d0/d1/d3/d3e/l8a 0 2026-03-10T07:51:29.275 INFO:tasks.workunit.client.1.vm08.stdout:2/415: truncate d0/d1/d3/d39/d7d/d86/f7f 638186 0 2026-03-10T07:51:29.281 INFO:tasks.workunit.client.1.vm08.stdout:2/416: dread d0/f4a [0,4194304] 0 2026-03-10T07:51:29.282 INFO:tasks.workunit.client.1.vm08.stdout:9/380: rmdir d2/d8a 0 2026-03-10T07:51:29.283 INFO:tasks.workunit.client.1.vm08.stdout:9/381: write d2/d26/f61 [401349,122884] 0 2026-03-10T07:51:29.287 INFO:tasks.workunit.client.1.vm08.stdout:9/382: creat d2/d3/d84/f8c x:0 0 0 2026-03-10T07:51:29.288 INFO:tasks.workunit.client.1.vm08.stdout:9/383: creat d2/de/d28/f8d x:0 0 0 2026-03-10T07:51:29.289 INFO:tasks.workunit.client.1.vm08.stdout:9/384: truncate d2/d3/d84/f8c 529091 0 2026-03-10T07:51:29.289 INFO:tasks.workunit.client.1.vm08.stdout:9/385: write d2/d86/f55 [633927,66453] 0 2026-03-10T07:51:29.294 INFO:tasks.workunit.client.1.vm08.stdout:9/386: rename d2/de/f14 to d2/d3/f8e 0 2026-03-10T07:51:29.295 INFO:tasks.workunit.client.1.vm08.stdout:9/387: symlink d2/l8f 0 2026-03-10T07:51:29.314 INFO:tasks.workunit.client.1.vm08.stdout:1/371: dread d2/d10/f2b [0,4194304] 0 2026-03-10T07:51:29.316 INFO:tasks.workunit.client.1.vm08.stdout:1/372: symlink d2/d6/de/d1f/d22/l84 0 2026-03-10T07:51:29.317 INFO:tasks.workunit.client.1.vm08.stdout:1/373: truncate d2/d6/d50/f54 90429 0 2026-03-10T07:51:29.321 INFO:tasks.workunit.client.1.vm08.stdout:1/374: creat d2/d6/de/d70/d80/f85 x:0 0 0 2026-03-10T07:51:29.335 INFO:tasks.workunit.client.1.vm08.stdout:7/411: dwrite d3/da/d25/d9/d2f/d3a/d40/f52 [4194304,4194304] 0 2026-03-10T07:51:29.353 INFO:tasks.workunit.client.1.vm08.stdout:7/412: dread d3/da/f17 [0,4194304] 0 
2026-03-10T07:51:29.365 INFO:tasks.workunit.client.1.vm08.stdout:7/413: creat d3/da/d25/d9/d2f/d3a/d71/f8b x:0 0 0 2026-03-10T07:51:29.367 INFO:tasks.workunit.client.1.vm08.stdout:7/414: mkdir d3/da/d25/d9/d2f/d3a/d71/d8c 0 2026-03-10T07:51:29.368 INFO:tasks.workunit.client.1.vm08.stdout:7/415: write d3/da/f6b [3248649,35388] 0 2026-03-10T07:51:29.397 INFO:tasks.workunit.client.1.vm08.stdout:6/404: read d1/d17/f66 [3296676,126969] 0 2026-03-10T07:51:29.397 INFO:tasks.workunit.client.1.vm08.stdout:3/394: truncate d0/d3c/d18/f23 8655371 0 2026-03-10T07:51:29.398 INFO:tasks.workunit.client.1.vm08.stdout:2/417: dread d0/f44 [0,4194304] 0 2026-03-10T07:51:29.411 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:51:29 vm05.local ceph-mon[50387]: from='mon.0 -' entity='mon.' cmd=[{"prefix": "dashboard get-alertmanager-api-host"}]: dispatch 2026-03-10T07:51:29.418 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:51:29 vm05.local ceph-mon[50387]: Upgrade: Updating grafana.vm05 2026-03-10T07:51:29.419 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:51:29 vm05.local ceph-mon[50387]: Deploying daemon grafana.vm05 on vm05 2026-03-10T07:51:29.419 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:51:29 vm05.local ceph-mon[50387]: pgmap v34: 65 pgs: 65 active+clean; 2.8 GiB data, 9.3 GiB used, 111 GiB / 120 GiB avail; 34 MiB/s rd, 162 MiB/s wr, 320 op/s 2026-03-10T07:51:29.419 INFO:tasks.workunit.client.1.vm08.stdout:0/391: write dd/d18/f25 [3272640,14648] 0 2026-03-10T07:51:29.419 INFO:tasks.workunit.client.1.vm08.stdout:0/392: write dd/d18/f25 [4418902,73120] 0 2026-03-10T07:51:29.420 INFO:tasks.workunit.client.1.vm08.stdout:4/302: fdatasync d5/d8/d9/f18 0 2026-03-10T07:51:29.420 INFO:tasks.workunit.client.1.vm08.stdout:4/303: chown d5/d8/c2e 957432119 1 2026-03-10T07:51:29.420 INFO:tasks.workunit.client.1.vm08.stdout:4/304: chown d5/d8/d9/d12 28 1 2026-03-10T07:51:29.420 INFO:tasks.workunit.client.1.vm08.stdout:6/405: chown d1/d3/df/d1d/d40/d45/l62 178 1 
2026-03-10T07:51:29.420 INFO:tasks.workunit.client.1.vm08.stdout:4/305: write d5/d17/f1a [5886662,20649] 0 2026-03-10T07:51:29.420 INFO:tasks.workunit.client.1.vm08.stdout:3/395: mkdir d0/d3c/d1f/d44/d51/d2d/d85 0 2026-03-10T07:51:29.420 INFO:tasks.workunit.client.1.vm08.stdout:8/424: write d0/df/d15/d23/d39/f6f [659919,127396] 0 2026-03-10T07:51:29.420 INFO:tasks.workunit.client.1.vm08.stdout:8/425: fsync d0/d37/f47 0 2026-03-10T07:51:29.420 INFO:tasks.workunit.client.1.vm08.stdout:8/426: chown d0/df/d15 48665 1 2026-03-10T07:51:29.420 INFO:tasks.workunit.client.1.vm08.stdout:6/406: rename d1/db/f23 to d1/db/d24/d73/d79/f8c 0 2026-03-10T07:51:29.420 INFO:tasks.workunit.client.1.vm08.stdout:4/306: mkdir d5/d17/d48/d64 0 2026-03-10T07:51:29.420 INFO:tasks.workunit.client.1.vm08.stdout:3/396: creat d0/d3c/d18/d80/f86 x:0 0 0 2026-03-10T07:51:29.420 INFO:tasks.workunit.client.1.vm08.stdout:3/397: read - d0/d3c/d1f/f7e zero size 2026-03-10T07:51:29.420 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:51:29 vm08.local ceph-mon[59917]: from='mon.0 -' entity='mon.' 
cmd=[{"prefix": "dashboard get-alertmanager-api-host"}]: dispatch 2026-03-10T07:51:29.420 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:51:29 vm08.local ceph-mon[59917]: Upgrade: Updating grafana.vm05 2026-03-10T07:51:29.420 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:51:29 vm08.local ceph-mon[59917]: Deploying daemon grafana.vm05 on vm05 2026-03-10T07:51:29.420 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:51:29 vm08.local ceph-mon[59917]: pgmap v34: 65 pgs: 65 active+clean; 2.8 GiB data, 9.3 GiB used, 111 GiB / 120 GiB avail; 34 MiB/s rd, 162 MiB/s wr, 320 op/s 2026-03-10T07:51:29.420 INFO:tasks.workunit.client.1.vm08.stdout:4/307: dwrite d5/d8/d9/f18 [0,4194304] 0 2026-03-10T07:51:29.422 INFO:tasks.workunit.client.1.vm08.stdout:4/308: fsync d5/f10 0 2026-03-10T07:51:29.422 INFO:tasks.workunit.client.1.vm08.stdout:0/393: sync 2026-03-10T07:51:29.423 INFO:tasks.workunit.client.1.vm08.stdout:0/394: write dd/d10/d14/d1b/d30/f4d [945479,112456] 0 2026-03-10T07:51:29.424 INFO:tasks.workunit.client.1.vm08.stdout:0/395: write dd/d10/d14/d1b/d30/f4d [302004,92592] 0 2026-03-10T07:51:29.426 INFO:tasks.workunit.client.1.vm08.stdout:4/309: write d5/d8/d9/d12/f53 [1064020,45008] 0 2026-03-10T07:51:29.428 INFO:tasks.workunit.client.1.vm08.stdout:4/310: truncate d5/f21 730705 0 2026-03-10T07:51:29.429 INFO:tasks.workunit.client.1.vm08.stdout:6/407: truncate d1/f49 82205 0 2026-03-10T07:51:29.430 INFO:tasks.workunit.client.1.vm08.stdout:3/398: creat d0/d3c/f87 x:0 0 0 2026-03-10T07:51:29.430 INFO:tasks.workunit.client.1.vm08.stdout:3/399: readlink d0/d3c/d1f/d44/d51/l2a 0 2026-03-10T07:51:29.443 INFO:tasks.workunit.client.1.vm08.stdout:6/408: truncate d1/d3/f21 3794478 0 2026-03-10T07:51:29.447 INFO:tasks.workunit.client.1.vm08.stdout:0/396: rename dd/l62 to dd/d10/d2f/d37/d64/d52/l8b 0 2026-03-10T07:51:29.447 INFO:tasks.workunit.client.1.vm08.stdout:6/409: sync 2026-03-10T07:51:29.448 INFO:tasks.workunit.client.1.vm08.stdout:0/397: write dd/d29/d5c/f63 
[902503,88294] 0
2026-03-10T07:51:29.448 INFO:tasks.workunit.client.1.vm08.stdout:8/427: link d0/df/d15/d23/d54/c6c d0/df/d17/d25/c88 0
2026-03-10T07:51:29.450 INFO:tasks.workunit.client.1.vm08.stdout:0/398: fsync dd/d10/d14/d15/d20/d5f/f7f 0
2026-03-10T07:51:29.452 INFO:tasks.workunit.client.1.vm08.stdout:4/311: symlink d5/d17/d48/d64/l65 0
2026-03-10T07:51:29.452 INFO:tasks.workunit.client.1.vm08.stdout:6/410: dwrite d1/db/f4e [0,4194304] 0
2026-03-10T07:51:29.455 INFO:tasks.workunit.client.1.vm08.stdout:3/400: mknod d0/d3c/d1f/d44/d51/c88 0
2026-03-10T07:51:29.456 INFO:tasks.workunit.client.1.vm08.stdout:0/399: rmdir dd/d10/d14/d15/d20 39
2026-03-10T07:51:29.456 INFO:tasks.workunit.client.1.vm08.stdout:0/400: chown dd/d10/d14/d15 1 1
2026-03-10T07:51:29.456 INFO:tasks.workunit.client.1.vm08.stdout:0/401: chown dd/d18 289 1
2026-03-10T07:51:29.457 INFO:tasks.workunit.client.1.vm08.stdout:3/401: read d0/d3c/d18/f22 [6222519,35777] 0
2026-03-10T07:51:29.466 INFO:tasks.workunit.client.1.vm08.stdout:3/402: dwrite d0/f79 [0,4194304] 0
2026-03-10T07:51:29.468 INFO:tasks.workunit.client.1.vm08.stdout:8/428: getdents d0/d69 0
2026-03-10T07:51:29.480 INFO:tasks.workunit.client.1.vm08.stdout:3/403: unlink d0/d3c/d18/d48/d55/f5f 0
2026-03-10T07:51:29.488 INFO:tasks.workunit.client.1.vm08.stdout:5/409: dread d0/d4/df/d1e/f25 [0,4194304] 0
2026-03-10T07:51:29.488 INFO:tasks.workunit.client.1.vm08.stdout:5/410: mkdir d0/d4/df/d82 0
2026-03-10T07:51:29.488 INFO:tasks.workunit.client.1.vm08.stdout:5/411: mkdir d0/d77/d83 0
2026-03-10T07:51:29.488 INFO:tasks.workunit.client.1.vm08.stdout:5/412: creat d0/d4/df/d12/d22/f84 x:0 0 0
2026-03-10T07:51:29.490 INFO:tasks.workunit.client.1.vm08.stdout:3/404: sync
2026-03-10T07:51:29.492 INFO:tasks.workunit.client.1.vm08.stdout:5/413: dwrite d0/d4/d19/d3a/d69/f71 [0,4194304] 0
2026-03-10T07:51:29.495 INFO:tasks.workunit.client.1.vm08.stdout:3/405: fdatasync d0/d3c/d18/f38 0
2026-03-10T07:51:29.495 INFO:tasks.workunit.client.1.vm08.stdout:3/406: fsync d0/d3c/d18/d32/d61/d52/f7f 0
2026-03-10T07:51:29.500 INFO:tasks.workunit.client.1.vm08.stdout:5/414: creat d0/d8/f85 x:0 0 0
2026-03-10T07:51:29.502 INFO:tasks.workunit.client.1.vm08.stdout:3/407: dread d0/d3c/d1f/d44/d51/f2b [0,4194304] 0
2026-03-10T07:51:29.503 INFO:tasks.workunit.client.1.vm08.stdout:3/408: truncate d0/d3c/f87 445292 0
2026-03-10T07:51:29.503 INFO:tasks.workunit.client.1.vm08.stdout:5/415: dwrite d0/d4/d19/d3a/f53 [0,4194304] 0
2026-03-10T07:51:29.506 INFO:tasks.workunit.client.1.vm08.stdout:3/409: chown d0/d3c/d18/d4a/l7c 1313480 1
2026-03-10T07:51:29.507 INFO:tasks.workunit.client.1.vm08.stdout:5/416: dwrite d0/d4/f31 [0,4194304] 0
2026-03-10T07:51:29.510 INFO:tasks.workunit.client.1.vm08.stdout:5/417: creat d0/d8/d24/f86 x:0 0 0
2026-03-10T07:51:29.514 INFO:tasks.workunit.client.1.vm08.stdout:3/410: fdatasync d0/d3c/d18/f23 0
2026-03-10T07:51:29.517 INFO:tasks.workunit.client.1.vm08.stdout:5/418: rename d0/d4/df/d12/l32 to d0/d4/d19/l87 0
2026-03-10T07:51:29.518 INFO:tasks.workunit.client.1.vm08.stdout:3/411: mkdir d0/d3c/d1f/d89 0
2026-03-10T07:51:29.519 INFO:tasks.workunit.client.1.vm08.stdout:3/412: write d0/d3c/d1f/d44/d51/f82 [107359,89161] 0
2026-03-10T07:51:29.526 INFO:tasks.workunit.client.1.vm08.stdout:5/419: creat d0/d4/df/d12/f88 x:0 0 0
2026-03-10T07:51:29.536 INFO:tasks.workunit.client.1.vm08.stdout:3/413: creat d0/d3c/d18/d4a/f8a x:0 0 0
2026-03-10T07:51:29.536 INFO:tasks.workunit.client.1.vm08.stdout:5/420: mknod d0/d8/d5e/c89 0
2026-03-10T07:51:29.536 INFO:tasks.workunit.client.1.vm08.stdout:3/414: dwrite d0/f84 [0,4194304] 0
2026-03-10T07:51:29.536 INFO:tasks.workunit.client.1.vm08.stdout:3/415: write d0/d3c/d18/d32/f62 [4139367,87929] 0
2026-03-10T07:51:29.541 INFO:tasks.workunit.client.1.vm08.stdout:5/421: creat d0/d4/d19/d50/f8a x:0 0 0
2026-03-10T07:51:29.542 INFO:tasks.workunit.client.1.vm08.stdout:3/416: dwrite d0/d3c/d18/d32/d61/d52/f68 0
2026-03-10T07:51:29.542 INFO:tasks.workunit.client.1.vm08.stdout:5/422: chown d0/d8/f18 26441 1
2026-03-10T07:51:29.546 INFO:tasks.workunit.client.1.vm08.stdout:5/423: creat d0/d4/df/d12/d22/f8b x:0 0 0
2026-03-10T07:51:29.547 INFO:tasks.workunit.client.1.vm08.stdout:3/417: write d0/d3c/d1f/d44/d51/f65 [483272,106706] 0
2026-03-10T07:51:29.551 INFO:tasks.workunit.client.1.vm08.stdout:3/418: unlink d0/f28 0
2026-03-10T07:51:29.557 INFO:tasks.workunit.client.1.vm08.stdout:3/419: dread - d0/d3c/d18/d4a/f8a zero size
2026-03-10T07:51:29.557 INFO:tasks.workunit.client.1.vm08.stdout:3/420: write d0/f39 [1955140,59510] 0
2026-03-10T07:51:29.557 INFO:tasks.workunit.client.1.vm08.stdout:3/421: stat d0/d3c/d18/d48/d55/c75 0
2026-03-10T07:51:29.557 INFO:tasks.workunit.client.1.vm08.stdout:5/424: rename d0/d4/df/c14 to d0/d4/d19/d81/c8c 0
2026-03-10T07:51:29.562 INFO:tasks.workunit.client.1.vm08.stdout:3/422: dwrite d0/d3c/d1f/d44/d51/d34/f3d [4194304,4194304] 0
2026-03-10T07:51:29.564 INFO:tasks.workunit.client.1.vm08.stdout:9/388: dwrite d2/de/f1e [0,4194304] 0
2026-03-10T07:51:29.565 INFO:tasks.workunit.client.1.vm08.stdout:1/375: dwrite d2/d10/f3e [0,4194304] 0
2026-03-10T07:51:29.574 INFO:tasks.workunit.client.1.vm08.stdout:6/411: fsync d1/f49 0
2026-03-10T07:51:29.574 INFO:tasks.workunit.client.1.vm08.stdout:8/429: rmdir d0/df/d17 39
2026-03-10T07:51:29.576 INFO:tasks.workunit.client.1.vm08.stdout:2/418: dwrite d0/d1/d17/d6b/f72 [0,4194304] 0
2026-03-10T07:51:29.580 INFO:tasks.workunit.client.1.vm08.stdout:7/416: write d3/f2e [971113,107960] 0
2026-03-10T07:51:29.581 INFO:tasks.workunit.client.1.vm08.stdout:4/312: truncate d5/d8/fc 1776735 0
2026-03-10T07:51:29.582 INFO:tasks.workunit.client.1.vm08.stdout:0/402: truncate dd/d10/d14/f46 382450 0
2026-03-10T07:51:29.585 INFO:tasks.workunit.client.1.vm08.stdout:7/417: creat d3/da/d25/d9/d2f/d3a/d4b/f8d x:0 0 0
2026-03-10T07:51:29.588 INFO:tasks.workunit.client.1.vm08.stdout:8/430: mkdir d0/df/d17/d7a/d89 0
2026-03-10T07:51:29.589 INFO:tasks.workunit.client.1.vm08.stdout:1/376: dread d2/d6/de/d1f/d22/f35 [4194304,4194304] 0
2026-03-10T07:51:29.591 INFO:tasks.workunit.client.1.vm08.stdout:8/431: unlink d0/df/d2e/d30/f7f 0
2026-03-10T07:51:29.602 INFO:tasks.workunit.client.1.vm08.stdout:6/412: getdents d1/d3/df/d52 0
2026-03-10T07:51:29.602 INFO:tasks.workunit.client.1.vm08.stdout:8/432: unlink d0/d69/d3f/l48 0
2026-03-10T07:51:29.602 INFO:tasks.workunit.client.1.vm08.stdout:8/433: chown d0/df/d15/d23/d39/d5b/d4a 2 1
2026-03-10T07:51:29.602 INFO:tasks.workunit.client.1.vm08.stdout:0/403: dread dd/f44 [0,4194304] 0
2026-03-10T07:51:29.602 INFO:tasks.workunit.client.1.vm08.stdout:0/404: write dd/d29/d5c/f63 [1474290,8476] 0
2026-03-10T07:51:29.602 INFO:tasks.workunit.client.1.vm08.stdout:0/405: stat dd/d10/c2b 0
2026-03-10T07:51:29.608 INFO:tasks.workunit.client.1.vm08.stdout:2/419: getdents d0/d1/d3 0
2026-03-10T07:51:29.610 INFO:tasks.workunit.client.1.vm08.stdout:4/313: dread d5/f10 [0,4194304] 0
2026-03-10T07:51:29.610 INFO:tasks.workunit.client.1.vm08.stdout:6/413: mknod d1/db/c8d 0
2026-03-10T07:51:29.610 INFO:tasks.workunit.client.1.vm08.stdout:6/414: fdatasync d1/d46/f72 0
2026-03-10T07:51:29.612 INFO:tasks.workunit.client.1.vm08.stdout:2/420: chown d0/d1/d3/d10/d38/l5f 1540 1
2026-03-10T07:51:29.612 INFO:tasks.workunit.client.1.vm08.stdout:8/434: link d0/l14 d0/l8a 0
2026-03-10T07:51:29.613 INFO:tasks.workunit.client.1.vm08.stdout:8/435: fsync d0/d37/f47 0
2026-03-10T07:51:29.613 INFO:tasks.workunit.client.1.vm08.stdout:4/314: unlink d5/d8/d9/d12/f53 0
2026-03-10T07:51:29.614 INFO:tasks.workunit.client.1.vm08.stdout:2/421: symlink d0/d1/d3/d56/l8b 0
2026-03-10T07:51:29.615 INFO:tasks.workunit.client.1.vm08.stdout:8/436: creat d0/df/d17/d7a/d89/f8b x:0 0 0
2026-03-10T07:51:29.617 INFO:tasks.workunit.client.1.vm08.stdout:8/437: mkdir d0/df/d15/d23/d39/d5b/d8c 0
2026-03-10T07:51:29.618 INFO:tasks.workunit.client.1.vm08.stdout:8/438: chown d0/df/f19 13 1
2026-03-10T07:51:29.620 INFO:tasks.workunit.client.1.vm08.stdout:8/439: fsync d0/df/f12 0
2026-03-10T07:51:29.620 INFO:tasks.workunit.client.1.vm08.stdout:8/440: fsync d0/f2a 0
2026-03-10T07:51:29.621 INFO:tasks.workunit.client.1.vm08.stdout:2/422: dread d0/d1/f24 [0,4194304] 0
2026-03-10T07:51:29.623 INFO:tasks.workunit.client.1.vm08.stdout:8/441: symlink d0/df/d17/l8d 0
2026-03-10T07:51:29.624 INFO:tasks.workunit.client.1.vm08.stdout:2/423: rename d0/c33 to d0/d1/d3/d10/d38/c8c 0
2026-03-10T07:51:29.625 INFO:tasks.workunit.client.1.vm08.stdout:8/442: link d0/l14 d0/df/d15/d23/d39/d5b/d4a/l8e 0
2026-03-10T07:51:29.626 INFO:tasks.workunit.client.1.vm08.stdout:2/424: unlink d0/d1/d3/d10/c18 0
2026-03-10T07:51:29.634 INFO:tasks.workunit.client.1.vm08.stdout:8/443: creat d0/df/d15/d23/d54/f8f x:0 0 0
2026-03-10T07:51:29.634 INFO:tasks.workunit.client.1.vm08.stdout:2/425: mknod d0/d1/d3/d39/d7d/d7e/c8d 0
2026-03-10T07:51:29.635 INFO:tasks.workunit.client.1.vm08.stdout:2/426: chown d0/d1/d17/l2c 1536 1
2026-03-10T07:51:29.635 INFO:tasks.workunit.client.1.vm08.stdout:8/444: dwrite d0/f22 [0,4194304] 0
2026-03-10T07:51:29.635 INFO:tasks.workunit.client.1.vm08.stdout:8/445: write d0/d69/f4c [133933,52192] 0
2026-03-10T07:51:29.643 INFO:tasks.workunit.client.1.vm08.stdout:2/427: sync
2026-03-10T07:51:29.646 INFO:tasks.workunit.client.1.vm08.stdout:2/428: truncate d0/d1/fb 3373284 0
2026-03-10T07:51:29.648 INFO:tasks.workunit.client.1.vm08.stdout:2/429: getdents d0/d1/d3/d10/d32/d61/d84 0
2026-03-10T07:51:29.649 INFO:tasks.workunit.client.1.vm08.stdout:2/430: chown d0/d1/d3/d39/d7d/f80 7 1
2026-03-10T07:51:29.654 INFO:tasks.workunit.client.1.vm08.stdout:2/431: dread d0/d1/d3/f63 [0,4194304] 0
2026-03-10T07:51:29.680 INFO:tasks.workunit.client.1.vm08.stdout:2/432: dread d0/f68 [0,4194304] 0
2026-03-10T07:51:29.683 INFO:tasks.workunit.client.1.vm08.stdout:2/433: unlink d0/d1/d3/d39/d7d/d86/d55/c30 0
2026-03-10T07:51:29.684 INFO:tasks.workunit.client.1.vm08.stdout:2/434: write d0/d1/d3/d10/d38/f52 [246842,67255] 0
2026-03-10T07:51:29.685 INFO:tasks.workunit.client.1.vm08.stdout:0/406: dread dd/d10/d2f/d37/f65 [0,4194304] 0
2026-03-10T07:51:29.686 INFO:tasks.workunit.client.1.vm08.stdout:2/435: chown d0/d1/d3/d10/d38/f53 622 1
2026-03-10T07:51:29.687 INFO:tasks.workunit.client.1.vm08.stdout:0/407: symlink dd/d10/d14/d1b/d30/l8c 0
2026-03-10T07:51:29.689 INFO:tasks.workunit.client.1.vm08.stdout:0/408: mknod dd/d10/d14/d15/d20/c8d 0
2026-03-10T07:51:29.690 INFO:tasks.workunit.client.1.vm08.stdout:0/409: creat dd/d10/d2f/f8e x:0 0 0
2026-03-10T07:51:29.691 INFO:tasks.workunit.client.1.vm08.stdout:0/410: rmdir dd/d10/d14/d15/d20/d5f 39
2026-03-10T07:51:29.692 INFO:tasks.workunit.client.1.vm08.stdout:0/411: truncate dd/d10/f77 535064 0
2026-03-10T07:51:29.692 INFO:tasks.workunit.client.1.vm08.stdout:0/412: stat f8 0
2026-03-10T07:51:29.693 INFO:tasks.workunit.client.1.vm08.stdout:0/413: chown dd/d10/d14/d15/d20/c8d 53 1
2026-03-10T07:51:29.697 INFO:tasks.workunit.client.1.vm08.stdout:8/446: dread d0/df/d15/d23/d39/f3e [0,4194304] 0
2026-03-10T07:51:29.702 INFO:tasks.workunit.client.1.vm08.stdout:8/447: dwrite d0/df/f13 [4194304,4194304] 0
2026-03-10T07:51:29.712 INFO:tasks.workunit.client.1.vm08.stdout:0/414: sync
2026-03-10T07:51:29.712 INFO:tasks.workunit.client.1.vm08.stdout:8/448: sync
2026-03-10T07:51:29.716 INFO:tasks.workunit.client.1.vm08.stdout:8/449: creat d0/d37/d86/f90 x:0 0 0
2026-03-10T07:51:29.719 INFO:tasks.workunit.client.1.vm08.stdout:8/450: creat d0/df/d17/d72/f91 x:0 0 0
2026-03-10T07:51:29.727 INFO:tasks.workunit.client.1.vm08.stdout:0/415: dread dd/d18/f3c [0,4194304] 0
2026-03-10T07:51:29.727 INFO:tasks.workunit.client.1.vm08.stdout:8/451: symlink d0/df/d15/d23/l92 0
2026-03-10T07:51:29.727 INFO:tasks.workunit.client.1.vm08.stdout:8/452: mknod d0/d69/d3f/c93 0
2026-03-10T07:51:29.727 INFO:tasks.workunit.client.1.vm08.stdout:8/453: symlink d0/d37/l94 0
2026-03-10T07:51:29.729 INFO:tasks.workunit.client.1.vm08.stdout:0/416: dwrite dd/d10/d14/d1b/f76 [0,4194304] 0
2026-03-10T07:51:29.737 INFO:tasks.workunit.client.1.vm08.stdout:5/425: truncate d0/d4/df/f27 6112941 0
2026-03-10T07:51:29.739 INFO:tasks.workunit.client.1.vm08.stdout:3/423: write d0/d3c/d18/d32/d61/f7a [842088,61909] 0
2026-03-10T07:51:29.739 INFO:tasks.workunit.client.1.vm08.stdout:5/426: chown d0/d4/df/d12/d22 26777 1
2026-03-10T07:51:29.743 INFO:tasks.workunit.client.1.vm08.stdout:1/377: write d2/d6/d3a/d61/d7a/f44 [687211,80839] 0
2026-03-10T07:51:29.745 INFO:tasks.workunit.client.1.vm08.stdout:1/378: read d2/d10/f3f [826727,80880] 0
2026-03-10T07:51:29.745 INFO:tasks.workunit.client.1.vm08.stdout:7/418: dwrite d3/da/d25/d9/f53 [0,4194304] 0
2026-03-10T07:51:29.751 INFO:tasks.workunit.client.1.vm08.stdout:6/415: write d1/f35 [5120609,72980] 0
2026-03-10T07:51:29.752 INFO:tasks.workunit.client.1.vm08.stdout:8/454: rename d0/df/d15/d23/d39/d5b/d8c to d0/df/d15/d95 0
2026-03-10T07:51:29.753 INFO:tasks.workunit.client.1.vm08.stdout:8/455: chown d0/f45 522264803 1
2026-03-10T07:51:29.755 INFO:tasks.workunit.client.1.vm08.stdout:4/315: write d5/d8/f1d [4495514,105590] 0
2026-03-10T07:51:29.765 INFO:tasks.workunit.client.1.vm08.stdout:1/379: readlink d2/d6/d3a/d61/d7a/l6a 0
2026-03-10T07:51:29.765 INFO:tasks.workunit.client.1.vm08.stdout:8/456: dwrite d0/df/d15/d23/d39/f85 [0,4194304] 0
2026-03-10T07:51:29.774 INFO:tasks.workunit.client.1.vm08.stdout:9/389: dread d2/d26/f4b [0,4194304] 0
2026-03-10T07:51:29.775 INFO:tasks.workunit.client.1.vm08.stdout:9/390: truncate d2/d86/d30/d35/f79 925766 0
2026-03-10T07:51:29.779 INFO:tasks.workunit.client.1.vm08.stdout:9/391: dwrite d2/d3/fa [8388608,4194304] 0
2026-03-10T07:51:29.780 INFO:tasks.workunit.client.1.vm08.stdout:9/392: fsync d2/d86/f75 0
2026-03-10T07:51:29.787 INFO:tasks.workunit.client.1.vm08.stdout:5/427: rename d0/d4/df/d1e/f28 to d0/d4/df/d82/f8d 0
2026-03-10T07:51:29.789 INFO:tasks.workunit.client.1.vm08.stdout:6/416: creat d1/d3/df/d1d/d40/d87/f8e x:0 0 0
2026-03-10T07:51:29.790 INFO:tasks.workunit.client.1.vm08.stdout:7/419: dread d3/da/d25/d9/f87 [0,4194304] 0
2026-03-10T07:51:29.790 INFO:tasks.workunit.client.1.vm08.stdout:7/420: write d3/da/d25/d9/d6f/f73 [382804,110173] 0
2026-03-10T07:51:29.791 INFO:tasks.workunit.client.1.vm08.stdout:2/436: write d0/d1/d3/d10/d32/d61/f3d [56200,69174] 0
2026-03-10T07:51:29.794 INFO:tasks.workunit.client.1.vm08.stdout:6/417: write d1/d3/df/d1d/f6b [774720,35840] 0
2026-03-10T07:51:29.795 INFO:tasks.workunit.client.1.vm08.stdout:9/393: readlink d2/d26/l6b 0
2026-03-10T07:51:29.795 INFO:tasks.workunit.client.1.vm08.stdout:7/421: truncate d3/da/f84 15684 0
2026-03-10T07:51:29.799 INFO:tasks.workunit.client.1.vm08.stdout:9/394: fsync d2/f1a 0
2026-03-10T07:51:29.810 INFO:tasks.workunit.client.1.vm08.stdout:9/395: chown d2/d86/d2b/f6a 111 1
2026-03-10T07:51:29.810 INFO:tasks.workunit.client.1.vm08.stdout:9/396: stat d2/d86/f63 0
2026-03-10T07:51:29.810 INFO:tasks.workunit.client.1.vm08.stdout:5/428: mkdir d0/d8/d5e/d8e 0
2026-03-10T07:51:29.811 INFO:tasks.workunit.client.1.vm08.stdout:4/316: mknod d5/c66 0
2026-03-10T07:51:29.811 INFO:tasks.workunit.client.1.vm08.stdout:8/457: unlink d0/df/d15/d23/d54/c6c 0
2026-03-10T07:51:29.811 INFO:tasks.workunit.client.1.vm08.stdout:2/437: rmdir d0/d1/d3/d56 39
2026-03-10T07:51:29.811 INFO:tasks.workunit.client.1.vm08.stdout:9/397: creat d2/d86/f90 x:0 0 0
2026-03-10T07:51:29.811 INFO:tasks.workunit.client.1.vm08.stdout:9/398: write d2/d86/d30/f5d [3225918,95110] 0
2026-03-10T07:51:29.811 INFO:tasks.workunit.client.1.vm08.stdout:0/417: getdents dd/d29 0
2026-03-10T07:51:29.811 INFO:tasks.workunit.client.1.vm08.stdout:6/418: creat d1/d3/df/d52/f8f x:0 0 0
2026-03-10T07:51:29.811 INFO:tasks.workunit.client.1.vm08.stdout:5/429: rename d0/d4/df/f4c to d0/d8/d24/f8f 0
2026-03-10T07:51:29.811 INFO:tasks.workunit.client.1.vm08.stdout:1/380: creat d2/d6/f86 x:0 0 0
2026-03-10T07:51:29.811 INFO:tasks.workunit.client.1.vm08.stdout:4/317: symlink d5/d8/d50/l67 0
2026-03-10T07:51:29.811 INFO:tasks.workunit.client.1.vm08.stdout:9/399: mkdir d2/d3/d84/d91 0
2026-03-10T07:51:29.812 INFO:tasks.workunit.client.1.vm08.stdout:8/458: creat d0/df/d15/d95/f96 x:0 0 0
2026-03-10T07:51:29.813 INFO:tasks.workunit.client.1.vm08.stdout:2/438: mkdir d0/d1/d3/d10/d32/d61/d8e 0
2026-03-10T07:51:29.813 INFO:tasks.workunit.client.1.vm08.stdout:9/400: chown d2/d86/d2b/c48 51 1
2026-03-10T07:51:29.814 INFO:tasks.workunit.client.1.vm08.stdout:7/422: symlink d3/da/d25/d9/d2f/d39/l8e 0
2026-03-10T07:51:29.816 INFO:tasks.workunit.client.1.vm08.stdout:9/401: mknod d2/d86/c92 0
2026-03-10T07:51:29.817 INFO:tasks.workunit.client.1.vm08.stdout:4/318: chown d5/d17/l3b 27556 1
2026-03-10T07:51:29.817 INFO:tasks.workunit.client.1.vm08.stdout:8/459: dwrite d0/d37/f47 [0,4194304] 0
2026-03-10T07:51:29.818 INFO:tasks.workunit.client.1.vm08.stdout:8/460: chown d0/df/d15/d23 3007 1
2026-03-10T07:51:29.819 INFO:tasks.workunit.client.1.vm08.stdout:9/402: chown d2/de/l2e 15564 1
2026-03-10T07:51:29.820 INFO:tasks.workunit.client.1.vm08.stdout:0/418: dread dd/d29/f48 [0,4194304] 0
2026-03-10T07:51:29.822 INFO:tasks.workunit.client.1.vm08.stdout:1/381: rmdir d2/d6/d3a 39
2026-03-10T07:51:29.827 INFO:tasks.workunit.client.1.vm08.stdout:5/430: symlink d0/d4/df/d1e/d41/l90 0
2026-03-10T07:51:29.847 INFO:tasks.workunit.client.1.vm08.stdout:5/431: chown d0/d8/d24 815239 1
2026-03-10T07:51:29.847 INFO:tasks.workunit.client.1.vm08.stdout:5/432: dwrite d0/d4/df/d1e/d41/f5c [0,4194304] 0
2026-03-10T07:51:29.847 INFO:tasks.workunit.client.1.vm08.stdout:1/382: dwrite d2/d6/de/d1f/f78 [0,4194304] 0
2026-03-10T07:51:29.847 INFO:tasks.workunit.client.1.vm08.stdout:8/461: mkdir d0/d97 0
2026-03-10T07:51:29.847 INFO:tasks.workunit.client.1.vm08.stdout:9/403: creat d2/d58/f93 x:0 0 0
2026-03-10T07:51:29.847 INFO:tasks.workunit.client.1.vm08.stdout:8/462: read d0/d69/f46 [3229076,110011] 0
2026-03-10T07:51:29.847 INFO:tasks.workunit.client.1.vm08.stdout:0/419: truncate dd/d10/d14/d15/d20/d22/f2e 3578947 0
2026-03-10T07:51:29.847 INFO:tasks.workunit.client.1.vm08.stdout:0/420: readlink dd/d29/l38 0
2026-03-10T07:51:29.847 INFO:tasks.workunit.client.1.vm08.stdout:4/319: dread d5/d17/f43 [0,4194304] 0
2026-03-10T07:51:29.848 INFO:tasks.workunit.client.1.vm08.stdout:9/404: chown d2/d3/d84/f8c 31 1
2026-03-10T07:51:29.848 INFO:tasks.workunit.client.1.vm08.stdout:1/383: dwrite d2/d6/de/d1f/f67 [0,4194304] 0
2026-03-10T07:51:29.848 INFO:tasks.workunit.client.1.vm08.stdout:5/433: mknod d0/d4/d19/d81/c91 0
2026-03-10T07:51:29.848 INFO:tasks.workunit.client.1.vm08.stdout:7/423: getdents d3/da/d25 0
2026-03-10T07:51:29.850 INFO:tasks.workunit.client.1.vm08.stdout:5/434: fdatasync d0/d77/d63/f76 0
2026-03-10T07:51:29.850 INFO:tasks.workunit.client.1.vm08.stdout:8/463: unlink d0/df/d17/d25/c31 0
2026-03-10T07:51:29.851 INFO:tasks.workunit.client.1.vm08.stdout:4/320: creat d5/d8/f68 x:0 0 0
2026-03-10T07:51:29.851 INFO:tasks.workunit.client.1.vm08.stdout:1/384: creat d2/d6/d3a/d61/d7a/f87 x:0 0 0
2026-03-10T07:51:29.853 INFO:tasks.workunit.client.1.vm08.stdout:8/464: stat d0/d37/f47 0
2026-03-10T07:51:29.854 INFO:tasks.workunit.client.1.vm08.stdout:7/424: creat d3/da/d25/d9/d2f/d39/d43/d4f/f8f x:0 0 0
2026-03-10T07:51:29.855 INFO:tasks.workunit.client.1.vm08.stdout:1/385: write d2/d6/de/d1f/d22/f49 [7516712,96757] 0
2026-03-10T07:51:29.855 INFO:tasks.workunit.client.1.vm08.stdout:5/435: stat d0/d4/df/d12/c72 0
2026-03-10T07:51:29.855 INFO:tasks.workunit.client.1.vm08.stdout:8/465: read d0/d69/f46 [748908,104698] 0
2026-03-10T07:51:29.856 INFO:tasks.workunit.client.1.vm08.stdout:4/321: rename d5/d8/d9/f4e to d5/d8/d9/f69 0
2026-03-10T07:51:29.859 INFO:tasks.workunit.client.1.vm08.stdout:1/386: write d2/d10/f3e [1392367,53645] 0
2026-03-10T07:51:29.878 INFO:tasks.workunit.client.1.vm08.stdout:1/387: fsync d2/f4 0
2026-03-10T07:51:29.879 INFO:tasks.workunit.client.1.vm08.stdout:1/388: read d2/d6/de/d1f/f2a [961102,57951] 0
2026-03-10T07:51:29.879 INFO:tasks.workunit.client.1.vm08.stdout:4/322: mknod d5/d8/d9/c6a 0
2026-03-10T07:51:29.879 INFO:tasks.workunit.client.1.vm08.stdout:7/425: mknod d3/da/d25/d9/d2f/d39/c90 0
2026-03-10T07:51:29.879 INFO:tasks.workunit.client.1.vm08.stdout:8/466: rename d0/df/d15/d95/f96 to d0/df/d15/d23/d39/d5b/d4a/f98 0
2026-03-10T07:51:29.879 INFO:tasks.workunit.client.1.vm08.stdout:5/436: dwrite d0/d4/d19/d50/f8a [0,4194304] 0
2026-03-10T07:51:29.879 INFO:tasks.workunit.client.1.vm08.stdout:1/389: write d2/d6/d3a/f7d [891241,126986] 0
2026-03-10T07:51:29.879 INFO:tasks.workunit.client.1.vm08.stdout:5/437: dread - d0/d77/d63/f78 zero size
2026-03-10T07:51:29.879 INFO:tasks.workunit.client.1.vm08.stdout:1/390: chown d2/d6/de/d1f/d26/f5d 1 1
2026-03-10T07:51:29.879 INFO:tasks.workunit.client.1.vm08.stdout:8/467: mknod d0/df/d15/d23/c99 0
2026-03-10T07:51:29.879 INFO:tasks.workunit.client.1.vm08.stdout:7/426: rename d3/da/d25/d9/d6f/f73 to d3/da/d25/d9/d2f/d3a/d4b/d67/d69/f91 0
2026-03-10T07:51:29.879 INFO:tasks.workunit.client.1.vm08.stdout:8/468: write d0/df/f1b [5254657,27096] 0
2026-03-10T07:51:29.879 INFO:tasks.workunit.client.1.vm08.stdout:7/427: write d3/da/d25/d9/d2f/d3a/d40/f55 [159313,14887] 0
2026-03-10T07:51:29.879 INFO:tasks.workunit.client.1.vm08.stdout:1/391: dwrite d2/d6/d3a/d61/d7a/f44 [0,4194304] 0
2026-03-10T07:51:29.885 INFO:tasks.workunit.client.1.vm08.stdout:8/469: getdents d0/d37/d86 0
2026-03-10T07:51:30.007 INFO:tasks.workunit.client.1.vm08.stdout:3/424: dwrite d0/d3c/d1f/d44/d51/f4d [0,4194304] 0
2026-03-10T07:51:30.012 INFO:tasks.workunit.client.1.vm08.stdout:3/425: creat d0/d3c/d18/d32/d61/d83/f8b x:0 0 0
2026-03-10T07:51:30.014 INFO:tasks.workunit.client.1.vm08.stdout:3/426: dwrite d0/d3c/d1f/d44/f59 [0,4194304] 0
2026-03-10T07:51:30.018 INFO:tasks.workunit.client.1.vm08.stdout:3/427: link d0/d3c/d18/d32/f62 d0/d3c/d1f/d44/f8c 0
2026-03-10T07:51:30.018 INFO:tasks.workunit.client.1.vm08.stdout:3/428: write d0/d3c/d18/d32/f76 [458631,57198] 0
2026-03-10T07:51:30.019 INFO:tasks.workunit.client.1.vm08.stdout:3/429: stat d0/d3c/d18/d48/d55/l5a 0
2026-03-10T07:51:30.019 INFO:tasks.workunit.client.1.vm08.stdout:3/430: dread - d0/d3c/d18/d48/d55/d56/f69 zero size
2026-03-10T07:51:30.047 INFO:tasks.workunit.client.1.vm08.stdout:0/421: sync
2026-03-10T07:51:30.048 INFO:tasks.workunit.client.1.vm08.stdout:4/323: sync
2026-03-10T07:51:30.048 INFO:tasks.workunit.client.1.vm08.stdout:1/392: sync
2026-03-10T07:51:30.048 INFO:tasks.workunit.client.1.vm08.stdout:0/422: chown dd/d29/d5c/c85 422884 1
2026-03-10T07:51:30.049 INFO:tasks.workunit.client.1.vm08.stdout:4/324: write d5/d17/d48/f5b [67817,46341] 0
2026-03-10T07:51:30.054 INFO:tasks.workunit.client.1.vm08.stdout:4/325: mknod d5/d1f/c6b 0
2026-03-10T07:51:30.056 INFO:tasks.workunit.client.1.vm08.stdout:4/326: creat d5/d8/d9/d32/f6c x:0 0 0
2026-03-10T07:51:30.056 INFO:tasks.workunit.client.1.vm08.stdout:1/393: dwrite d2/d6/de/d1f/d40/d76/f79 [0,4194304] 0
2026-03-10T07:51:30.056 INFO:tasks.workunit.client.1.vm08.stdout:4/327: chown d5/d8/d9/cb 6804 1
2026-03-10T07:51:30.061 INFO:tasks.workunit.client.1.vm08.stdout:4/328: dwrite d5/f10 [0,4194304] 0
2026-03-10T07:51:30.076 INFO:tasks.workunit.client.1.vm08.stdout:4/329: mknod d5/d1f/d31/d61/c6d 0
2026-03-10T07:51:30.077 INFO:tasks.workunit.client.1.vm08.stdout:4/330: write d5/d8/f68 [867866,37901] 0
2026-03-10T07:51:30.078 INFO:tasks.workunit.client.1.vm08.stdout:4/331: write d5/d8/d9/d32/f44 [668589,31781] 0
2026-03-10T07:51:30.082 INFO:tasks.workunit.client.1.vm08.stdout:4/332: getdents d5/d17/d48/d64 0
2026-03-10T07:51:30.082 INFO:tasks.workunit.client.1.vm08.stdout:4/333: read d5/d17/f43 [33847,92765] 0
2026-03-10T07:51:30.084 INFO:tasks.workunit.client.1.vm08.stdout:4/334: read d5/d17/f1a [1201352,110252] 0
2026-03-10T07:51:30.136 INFO:tasks.workunit.client.1.vm08.stdout:1/394: dread d2/d6/de/d1f/d26/f48 [0,4194304] 0
2026-03-10T07:51:30.138 INFO:tasks.workunit.client.1.vm08.stdout:1/395: creat d2/d6/d3a/d61/f88 x:0 0 0
2026-03-10T07:51:30.139 INFO:tasks.workunit.client.1.vm08.stdout:1/396: mkdir d2/d6/de/d1f/d26/d89 0
2026-03-10T07:51:30.158 INFO:tasks.workunit.client.1.vm08.stdout:4/335: sync
2026-03-10T07:51:30.160 INFO:tasks.workunit.client.1.vm08.stdout:4/336: symlink d5/d1f/d31/d61/l6e 0
2026-03-10T07:51:30.161 INFO:tasks.workunit.client.1.vm08.stdout:4/337: fsync d5/d1f/d31/f58 0
2026-03-10T07:51:30.189 INFO:tasks.workunit.client.1.vm08.stdout:6/419: truncate d1/d17/d2b/f3c 4038174 0
2026-03-10T07:51:30.197 INFO:tasks.workunit.client.1.vm08.stdout:9/405: truncate d2/d3/f5f 2843907 0
2026-03-10T07:51:30.199 INFO:tasks.workunit.client.1.vm08.stdout:5/438: truncate d0/d4/d19/d43/f35 2187966 0
2026-03-10T07:51:30.199 INFO:tasks.workunit.client.1.vm08.stdout:7/428: truncate d3/da/f21 4704222 0
2026-03-10T07:51:30.200 INFO:tasks.workunit.client.1.vm08.stdout:9/406: creat d2/d3/d84/f94 x:0 0 0
2026-03-10T07:51:30.201 INFO:tasks.workunit.client.1.vm08.stdout:6/420: truncate d1/db/d24/d51/f59 1225257 0
2026-03-10T07:51:30.202 INFO:tasks.workunit.client.1.vm08.stdout:5/439: rename d0/d77/d63 to d0/d4/d19/d81/d92 0
2026-03-10T07:51:30.207 INFO:tasks.workunit.client.1.vm08.stdout:5/440: read d0/d8/f1b [2058533,90026] 0
2026-03-10T07:51:30.209 INFO:tasks.workunit.client.1.vm08.stdout:5/441: truncate d0/d4/df/d12/d22/f8b 591899 0
2026-03-10T07:51:30.210 INFO:tasks.workunit.client.1.vm08.stdout:2/439: getdents d0/d1/d3/d39/d7d/d86/d55/d7a 0
2026-03-10T07:51:30.210 INFO:tasks.workunit.client.1.vm08.stdout:0/423: dread dd/d29/f2a [0,4194304] 0
2026-03-10T07:51:30.220 INFO:tasks.workunit.client.1.vm08.stdout:0/424: dread - dd/d10/d14/d15/d20/f7e zero size
2026-03-10T07:51:30.220 INFO:tasks.workunit.client.1.vm08.stdout:8/470: truncate d0/df/d15/d23/f3d 480951 0
2026-03-10T07:51:30.220 INFO:tasks.workunit.client.1.vm08.stdout:9/407: fsync d2/d3/f12 0
2026-03-10T07:51:30.220 INFO:tasks.workunit.client.1.vm08.stdout:9/408: stat d2/l82 0
2026-03-10T07:51:30.220 INFO:tasks.workunit.client.1.vm08.stdout:6/421: mknod d1/d3/df/d1d/c90 0
2026-03-10T07:51:30.220 INFO:tasks.workunit.client.1.vm08.stdout:7/429: symlink d3/da/d8a/l92 0
2026-03-10T07:51:30.220 INFO:tasks.workunit.client.1.vm08.stdout:9/409: dwrite d2/d3/fc [0,4194304] 0
2026-03-10T07:51:30.220 INFO:tasks.workunit.client.1.vm08.stdout:2/440: dwrite d0/d1/d3/d39/d7d/f80 [0,4194304] 0
2026-03-10T07:51:30.223 INFO:tasks.workunit.client.1.vm08.stdout:0/425: link dd/d10/d2f/f8e dd/d29/d5c/f8f 0
2026-03-10T07:51:30.225 INFO:tasks.workunit.client.1.vm08.stdout:2/441: chown d0/d1/d3/d39/l4c 42518592 1
2026-03-10T07:51:30.225 INFO:tasks.workunit.client.1.vm08.stdout:7/430: truncate d3/da/d25/d9/d2f/d39/d43/d4f/f68 395292 0
2026-03-10T07:51:30.228 INFO:tasks.workunit.client.1.vm08.stdout:7/431: chown d3/da/d25/d9/d2f/d39/d43/d4f/l5a 1491486048 1
2026-03-10T07:51:30.231 INFO:tasks.workunit.client.1.vm08.stdout:3/431: truncate d0/d3c/d1f/d44/f59 1235350 0
2026-03-10T07:51:30.235 INFO:tasks.workunit.client.1.vm08.stdout:9/410: creat d2/d58/f95 x:0 0 0
2026-03-10T07:51:30.235 INFO:tasks.workunit.client.1.vm08.stdout:0/426: write dd/d10/d14/d15/d20/d5f/f61 [3815756,7476] 0
2026-03-10T07:51:30.238 INFO:tasks.workunit.client.1.vm08.stdout:0/427: dread dd/f44 [0,4194304] 0
2026-03-10T07:51:30.239 INFO:tasks.workunit.client.1.vm08.stdout:0/428: readlink dd/d29/l78 0
2026-03-10T07:51:30.240 INFO:tasks.workunit.client.1.vm08.stdout:2/442: mknod d0/d1/d3/d39/d7d/d7e/c8f 0
2026-03-10T07:51:30.243 INFO:tasks.workunit.client.1.vm08.stdout:0/429: dwrite f5 [4194304,4194304] 0
2026-03-10T07:51:30.243 INFO:tasks.workunit.client.1.vm08.stdout:3/432: mknod d0/d3c/d18/d80/c8d 0
2026-03-10T07:51:30.244 INFO:tasks.workunit.client.1.vm08.stdout:3/433: dread - d0/d3c/d18/d32/d61/d83/f8b zero size
2026-03-10T07:51:30.251 INFO:tasks.workunit.client.1.vm08.stdout:1/397: write d2/d6/de/d1f/d26/f6e [744212,17810] 0
2026-03-10T07:51:30.261 INFO:tasks.workunit.client.1.vm08.stdout:1/398: write d2/d6/d50/f54 [112122,125007] 0
2026-03-10T07:51:30.262 INFO:tasks.workunit.client.1.vm08.stdout:4/338: write f2 [2472918,95370] 0
2026-03-10T07:51:30.262 INFO:tasks.workunit.client.1.vm08.stdout:0/430: dread - dd/d10/d14/d15/d20/d22/f51 zero size
2026-03-10T07:51:30.262 INFO:tasks.workunit.client.1.vm08.stdout:1/399: truncate d2/d6/de/d1f/d26/f62 712118 0
2026-03-10T07:51:30.262 INFO:tasks.workunit.client.1.vm08.stdout:8/471: dread d0/df/f19 [4194304,4194304] 0
2026-03-10T07:51:30.262 INFO:tasks.workunit.client.1.vm08.stdout:8/472: stat d0/d37/c83 0
2026-03-10T07:51:30.262 INFO:tasks.workunit.client.1.vm08.stdout:7/432: creat d3/f93 x:0 0 0
2026-03-10T07:51:30.262 INFO:tasks.workunit.client.1.vm08.stdout:2/443: rename d0/d1/fb to d0/d1/d3/d10/d32/d61/d8e/f90 0
2026-03-10T07:51:30.262 INFO:tasks.workunit.client.1.vm08.stdout:1/400: symlink d2/d6/de/d70/l8a 0
2026-03-10T07:51:30.264 INFO:tasks.workunit.client.1.vm08.stdout:9/411: read d2/fd [4709870,2362] 0
2026-03-10T07:51:30.268 INFO:tasks.workunit.client.1.vm08.stdout:8/473: symlink d0/df/d2e/l9a 0
2026-03-10T07:51:30.269 INFO:tasks.workunit.client.1.vm08.stdout:7/433: write d3/f6 [4984453,104975] 0
2026-03-10T07:51:30.271 INFO:tasks.workunit.client.1.vm08.stdout:7/434: chown d3/da/d25/d9/d2f/d3a/d4b/d67/c75 2506387 1
2026-03-10T07:51:30.272 INFO:tasks.workunit.client.1.vm08.stdout:1/401: symlink d2/d6/de/d1f/d40/d76/l8b 0
2026-03-10T07:51:30.273 INFO:tasks.workunit.client.1.vm08.stdout:3/434: link d0/d3c/d1f/d44/d51/d2d/c42 d0/d3c/d1f/d44/d51/d2d/c8e 0
2026-03-10T07:51:30.273 INFO:tasks.workunit.client.1.vm08.stdout:2/444: rename d0/d1/d3/d39/d7d/d86 to d0/d1/d3/d39/d7d/d86/d91 22
2026-03-10T07:51:30.275 INFO:tasks.workunit.client.1.vm08.stdout:8/474: dwrite d0/df/d15/f70 [0,4194304] 0
2026-03-10T07:51:30.282 INFO:tasks.workunit.client.1.vm08.stdout:9/412: sync
2026-03-10T07:51:30.282 INFO:tasks.workunit.client.1.vm08.stdout:3/435: dwrite d0/d3c/d1f/d44/f4b [0,4194304] 0
2026-03-10T07:51:30.287 INFO:tasks.workunit.client.1.vm08.stdout:1/402: rmdir d2/d6/de/d1f/d26/d58 39
2026-03-10T07:51:30.289 INFO:tasks.workunit.client.1.vm08.stdout:5/442: dread d0/d4/df/d82/f8d [0,4194304] 0
2026-03-10T07:51:30.290 INFO:tasks.workunit.client.1.vm08.stdout:5/443: readlink d0/d4/df/d1e/l2b 0
2026-03-10T07:51:30.290 INFO:tasks.workunit.client.1.vm08.stdout:0/431: getdents dd/d10/d14/d1b 0
2026-03-10T07:51:30.291 INFO:tasks.workunit.client.1.vm08.stdout:1/403: dread d2/d6/de/d1f/d22/f35 [4194304,4194304] 0
2026-03-10T07:51:30.292 INFO:tasks.workunit.client.1.vm08.stdout:2/445: rmdir d0/d1/d3/d10/d32/d61 39
2026-03-10T07:51:30.294 INFO:tasks.workunit.client.1.vm08.stdout:1/404: read - d2/d6/d3a/d61/d7a/f87 zero size
2026-03-10T07:51:30.295 INFO:tasks.workunit.client.1.vm08.stdout:2/446: write d0/d1/d17/d6b/f6e [658089,62239] 0
2026-03-10T07:51:30.311 INFO:tasks.workunit.client.1.vm08.stdout:8/475: creat d0/d37/f9b x:0 0 0
2026-03-10T07:51:30.311 INFO:tasks.workunit.client.1.vm08.stdout:3/436: rmdir d0/d3c/d18/d48 39
2026-03-10T07:51:30.311 INFO:tasks.workunit.client.1.vm08.stdout:3/437: chown d0/ca 0 1
2026-03-10T07:51:30.311 INFO:tasks.workunit.client.1.vm08.stdout:0/432: creat dd/d10/d2f/f90 x:0 0 0
2026-03-10T07:51:30.311 INFO:tasks.workunit.client.1.vm08.stdout:8/476: mkdir d0/df/d15/d9c 0
2026-03-10T07:51:30.316 INFO:tasks.workunit.client.1.vm08.stdout:5/444: mknod d0/c93 0
2026-03-10T07:51:30.316 INFO:tasks.workunit.client.1.vm08.stdout:3/438: creat d0/d3c/d1f/d44/f8f x:0 0 0
2026-03-10T07:51:30.317 INFO:tasks.workunit.client.1.vm08.stdout:5/445: fsync d0/d4/df/d1e/d41/f5c 0
2026-03-10T07:51:30.325 INFO:tasks.workunit.client.1.vm08.stdout:0/433: rename dd/d10/d2f/d37/f6a to dd/d29/f91 0
2026-03-10T07:51:30.338 INFO:tasks.workunit.client.1.vm08.stdout:8/477: fdatasync d0/f45 0
2026-03-10T07:51:30.338 INFO:tasks.workunit.client.1.vm08.stdout:2/447: getdents d0/d1/d3/d39/d7d/d86/d55/d1b 0
2026-03-10T07:51:30.338 INFO:tasks.workunit.client.1.vm08.stdout:3/439: symlink d0/d3c/d18/d48/l90 0
2026-03-10T07:51:30.338 INFO:tasks.workunit.client.1.vm08.stdout:8/478: dwrite d0/df/d2e/f3c [0,4194304] 0
2026-03-10T07:51:30.338 INFO:tasks.workunit.client.1.vm08.stdout:2/448: symlink d0/d1/d17/d6b/l92 0
2026-03-10T07:51:30.339 INFO:tasks.workunit.client.1.vm08.stdout:8/479: write d0/fa [719939,129111] 0
2026-03-10T07:51:30.339 INFO:tasks.workunit.client.1.vm08.stdout:3/440: dread - d0/d3c/d18/d80/f86 zero size
2026-03-10T07:51:30.340 INFO:tasks.workunit.client.1.vm08.stdout:4/339: fsync d5/d8/fc 0
2026-03-10T07:51:30.348 INFO:tasks.workunit.client.1.vm08.stdout:3/441: mknod d0/d3c/d18/d32/d61/c91 0
2026-03-10T07:51:30.349 INFO:tasks.workunit.client.1.vm08.stdout:4/340: dwrite d5/d8/d9/f18 [0,4194304] 0
2026-03-10T07:51:30.352 INFO:tasks.workunit.client.1.vm08.stdout:2/449: write d0/d1/d3/d10/d32/d61/f59 [4957846,112304] 0
2026-03-10T07:51:30.355 INFO:tasks.workunit.client.1.vm08.stdout:4/341: mknod d5/d1f/d41/c6f 0
2026-03-10T07:51:30.360 INFO:tasks.workunit.client.1.vm08.stdout:6/422: write d1/d3/df/f12 [220064,23798] 0
2026-03-10T07:51:30.360 INFO:tasks.workunit.client.1.vm08.stdout:4/342: dwrite d5/d17/f43 [0,4194304] 0
2026-03-10T07:51:30.363 INFO:tasks.workunit.client.1.vm08.stdout:4/343: dread - d5/d8/d9/f5a zero size
2026-03-10T07:51:30.363 INFO:tasks.workunit.client.1.vm08.stdout:3/442: sync
2026-03-10T07:51:30.363 INFO:tasks.workunit.client.1.vm08.stdout:2/450: sync
2026-03-10T07:51:30.363 INFO:tasks.workunit.client.1.vm08.stdout:3/443: readlink d0/l9 0
2026-03-10T07:51:30.365 INFO:tasks.workunit.client.1.vm08.stdout:6/423: creat d1/d7d/f91 x:0 0 0
2026-03-10T07:51:30.366 INFO:tasks.workunit.client.1.vm08.stdout:3/444: fdatasync d0/d3c/d1f/d44/d51/f71 0
2026-03-10T07:51:30.369 INFO:tasks.workunit.client.1.vm08.stdout:8/480: dread d0/df/d15/d23/d39/f40 [0,4194304] 0
2026-03-10T07:51:30.370 INFO:tasks.workunit.client.1.vm08.stdout:6/424: dread - d1/d7d/f91 zero size
2026-03-10T07:51:30.372 INFO:tasks.workunit.client.1.vm08.stdout:2/451: dwrite d0/d1/d3/d10/d65/f7c [0,4194304] 0
2026-03-10T07:51:30.373 INFO:tasks.workunit.client.1.vm08.stdout:4/344: mkdir d5/d1f/d70 0
2026-03-10T07:51:30.373 INFO:tasks.workunit.client.1.vm08.stdout:8/481: write d0/df/d15/d23/f75 [185158,2728] 0
2026-03-10T07:51:30.379 INFO:tasks.workunit.client.1.vm08.stdout:9/413: read d2/d3/f5f [2500542,57603] 0
2026-03-10T07:51:30.380 INFO:tasks.workunit.client.1.vm08.stdout:3/445: creat d0/d3c/d18/d48/d55/d56/f92 x:0 0 0
2026-03-10T07:51:30.386 INFO:tasks.workunit.client.1.vm08.stdout:6/425: unlink d1/d3/df/f12 0
2026-03-10T07:51:30.386 INFO:tasks.workunit.client.1.vm08.stdout:8/482: creat d0/df/d5d/f9d x:0 0 0
2026-03-10T07:51:30.387 INFO:tasks.workunit.client.1.vm08.stdout:4/345: symlink d5/d17/d48/l71 0
2026-03-10T07:51:30.389 INFO:tasks.workunit.client.1.vm08.stdout:0/434: truncate dd/d10/d14/d15/d20/d22/f2e 2674965 0
2026-03-10T07:51:30.391 INFO:tasks.workunit.client.1.vm08.stdout:9/414: dread d2/d86/d2b/f6a [0,4194304] 0
2026-03-10T07:51:30.392 INFO:tasks.workunit.client.1.vm08.stdout:7/435: write d3/f57 [3130265,82449] 0
2026-03-10T07:51:30.398 INFO:tasks.workunit.client.1.vm08.stdout:8/483: dwrite d0/df/d17/f1a [0,4194304] 0
2026-03-10T07:51:30.399 INFO:tasks.workunit.client.1.vm08.stdout:1/405: write d2/d6/de/d1f/f2a [877035,52576] 0
2026-03-10T07:51:30.403 INFO:tasks.workunit.client.1.vm08.stdout:4/346: mknod d5/d1f/d70/c72 0
2026-03-10T07:51:30.403 INFO:tasks.workunit.client.1.vm08.stdout:3/446: creat d0/d3c/d1f/f93 x:0 0 0
2026-03-10T07:51:30.404 INFO:tasks.workunit.client.1.vm08.stdout:0/435: mkdir dd/d10/d14/d15/d20/d92 0
2026-03-10T07:51:30.404 INFO:tasks.workunit.client.1.vm08.stdout:4/347: stat d5/d1f/d31/f4d 0
2026-03-10T07:51:30.405 INFO:tasks.workunit.client.1.vm08.stdout:0/436: dread - dd/d29/d5c/f8f zero size
2026-03-10T07:51:30.406 INFO:tasks.workunit.client.1.vm08.stdout:4/348: truncate d5/d8/f1e 4231243 0
2026-03-10T07:51:30.407 INFO:tasks.workunit.client.1.vm08.stdout:6/426: link d1/d3/df/d1d/l4b d1/db/d24/d73/d79/l92 0
2026-03-10T07:51:30.408 INFO:tasks.workunit.client.1.vm08.stdout:8/484: unlink d0/f45 0
2026-03-10T07:51:30.409 INFO:tasks.workunit.client.1.vm08.stdout:4/349: mknod d5/d1f/d70/c73 0
2026-03-10T07:51:30.409 INFO:tasks.workunit.client.1.vm08.stdout:5/446: dwrite d0/d8/f18 [0,4194304] 0
2026-03-10T07:51:30.410 INFO:tasks.workunit.client.1.vm08.stdout:6/427: creat d1/db/d24/f93 x:0 0 0
2026-03-10T07:51:30.410 INFO:tasks.workunit.client.1.vm08.stdout:4/350: write d5/f10 [2839969,91166] 0
2026-03-10T07:51:30.411 INFO:tasks.workunit.client.1.vm08.stdout:1/406: rename d2/d6/d3a/d61/d7a to d2/d6/de/d1f/d26/d58/d8c 0
2026-03-10T07:51:30.414 INFO:tasks.workunit.client.1.vm08.stdout:4/351: fdatasync d5/d17/d48/f5d 0
2026-03-10T07:51:30.415 INFO:tasks.workunit.client.1.vm08.stdout:8/485: creat d0/df/d2e/f9e x:0 0 0
2026-03-10T07:51:30.419 INFO:tasks.workunit.client.1.vm08.stdout:6/428: creat d1/d3/df/d1d/f94 x:0 0 0
2026-03-10T07:51:30.419 INFO:tasks.workunit.client.1.vm08.stdout:5/447: dread d0/d4/f31 [0,4194304] 0
2026-03-10T07:51:30.419 INFO:tasks.workunit.client.1.vm08.stdout:7/436: getdents d3/da/d25/d9/d6f 0
2026-03-10T07:51:30.424 INFO:tasks.workunit.client.1.vm08.stdout:8/486: unlink d0/df/d17/c7d 0
2026-03-10T07:51:30.426 INFO:tasks.workunit.client.1.vm08.stdout:7/437: unlink d3/da/d25/c3f 0
2026-03-10T07:51:30.428 INFO:tasks.workunit.client.1.vm08.stdout:3/447: sync
2026-03-10T07:51:30.429 INFO:tasks.workunit.client.1.vm08.stdout:4/352: getdents d5/d8/d9 0
2026-03-10T07:51:30.429 INFO:tasks.workunit.client.1.vm08.stdout:6/429: dwrite d1/db/d24/f26 [0,4194304] 0
2026-03-10T07:51:30.431 INFO:tasks.workunit.client.1.vm08.stdout:8/487: creat d0/df/f9f x:0 0 0
2026-03-10T07:51:30.439 INFO:tasks.workunit.client.1.vm08.stdout:4/353: mknod d5/d8/d9/d12/c74 0
2026-03-10T07:51:30.441 INFO:tasks.workunit.client.1.vm08.stdout:1/407: dread d2/f36 [0,4194304] 0
2026-03-10T07:51:30.441 INFO:tasks.workunit.client.1.vm08.stdout:8/488: symlink d0/df/d17/la0 0
2026-03-10T07:51:30.441 INFO:tasks.workunit.client.1.vm08.stdout:3/448: mknod d0/d3c/d18/d48/c94 0
2026-03-10T07:51:30.441 INFO:tasks.workunit.client.1.vm08.stdout:3/449: stat d0/d3c/d18/d48 0
2026-03-10T07:51:30.441 INFO:tasks.workunit.client.1.vm08.stdout:3/450: chown d0/d3c/d18/f22 459408 1
2026-03-10T07:51:30.441 INFO:tasks.workunit.client.1.vm08.stdout:1/408: dread - d2/d6/de/d1f/d22/f30 zero size
2026-03-10T07:51:30.444 INFO:tasks.workunit.client.1.vm08.stdout:4/354: creat d5/d1f/d31/d61/f75 x:0 0 0
2026-03-10T07:51:30.444 INFO:tasks.workunit.client.1.vm08.stdout:8/489: rename d0/df/d15/d23/c99 to d0/df/d15/ca1 0
2026-03-10T07:51:30.445 INFO:tasks.workunit.client.1.vm08.stdout:3/451: unlink d0/d3c/d18/d32/d61/f7a 0
2026-03-10T07:51:30.445 INFO:tasks.workunit.client.1.vm08.stdout:3/452: chown d0/d3c/d18 522 1
2026-03-10T07:51:30.446 INFO:tasks.workunit.client.1.vm08.stdout:1/409: creat d2/d6/de/d70/f8d x:0 0 0
2026-03-10T07:51:30.447 INFO:tasks.workunit.client.1.vm08.stdout:4/355: unlink d5/d17/f22 0
2026-03-10T07:51:30.448 INFO:tasks.workunit.client.1.vm08.stdout:1/410: chown d2/d6/de/d1f/f2a 33006839 1
2026-03-10T07:51:30.448 INFO:tasks.workunit.client.1.vm08.stdout:8/490: truncate d0/d69/f46 3066052 0
2026-03-10T07:51:30.449 INFO:tasks.workunit.client.1.vm08.stdout:3/453: mkdir d0/d3c/d1f/d95 0
2026-03-10T07:51:30.450 INFO:tasks.workunit.client.1.vm08.stdout:8/491: rename d0/df/d2e/l9a to d0/df/d2e/la2 0
2026-03-10T07:51:30.451 INFO:tasks.workunit.client.1.vm08.stdout:3/454: unlink d0/fc 0
2026-03-10T07:51:30.472 INFO:tasks.workunit.client.1.vm08.stdout:3/455: read - d0/d3c/d1f/d44/d51/f5b zero size
2026-03-10T07:51:30.472 INFO:tasks.workunit.client.1.vm08.stdout:1/411: mkdir d2/d6/de/d1f/d26/d89/d8e 0
2026-03-10T07:51:30.472 INFO:tasks.workunit.client.1.vm08.stdout:8/492: creat d0/fa3 x:0 0 0
2026-03-10T07:51:30.472 INFO:tasks.workunit.client.1.vm08.stdout:3/456: unlink d0/d3c/d18/d32/d61/f57 0
2026-03-10T07:51:30.472 INFO:tasks.workunit.client.1.vm08.stdout:8/493: chown d0/df/d17/f1a 116 1
2026-03-10T07:51:30.472 INFO:tasks.workunit.client.1.vm08.stdout:8/494: dread - d0/d37/d86/f90 zero size
2026-03-10T07:51:30.472 INFO:tasks.workunit.client.1.vm08.stdout:1/412: mkdir d2/d6/de/d1f/d8f 0
2026-03-10T07:51:30.472 INFO:tasks.workunit.client.1.vm08.stdout:3/457: mknod d0/c96 0
2026-03-10T07:51:30.472 INFO:tasks.workunit.client.1.vm08.stdout:1/413: fsync d2/d6/de/d1f/d22/f35 0
2026-03-10T07:51:30.472 INFO:tasks.workunit.client.1.vm08.stdout:3/458: mknod d0/d3c/d1f/c97 0
2026-03-10T07:51:30.472 INFO:tasks.workunit.client.1.vm08.stdout:8/495: link d0/c84 d0/df/ca4 0
2026-03-10T07:51:30.472 INFO:tasks.workunit.client.1.vm08.stdout:1/414: creat d2/d6/de/d1f/d26/d89/d8e/f90 x:0 0 0
2026-03-10T07:51:30.472 INFO:tasks.workunit.client.1.vm08.stdout:3/459: getdents d0/d3c/d1f/d44/d51/d2d/d85 0
2026-03-10T07:51:30.472 INFO:tasks.workunit.client.1.vm08.stdout:1/415: dwrite d2/d6/de/d1f/d26/f6e [0,4194304] 0
2026-03-10T07:51:30.472 INFO:tasks.workunit.client.1.vm08.stdout:8/496: dwrite d0/df/d17/f1a [0,4194304] 0
2026-03-10T07:51:30.472 INFO:tasks.workunit.client.1.vm08.stdout:8/497: chown d0/df/d15/d53 101014187 1
2026-03-10T07:51:30.476 INFO:tasks.workunit.client.1.vm08.stdout:8/498: mknod d0/df/d2e/d49/ca5 0
2026-03-10T07:51:30.477 INFO:tasks.workunit.client.1.vm08.stdout:3/460: rename d0/d3c/d18/d48/c94 to d0/c98 0
2026-03-10T07:51:30.481 INFO:tasks.workunit.client.1.vm08.stdout:8/499: dwrite d0/d37/f9b [0,4194304] 0
2026-03-10T07:51:30.490 INFO:tasks.workunit.client.1.vm08.stdout:3/461: dread d0/d3c/d18/f23 [4194304,4194304] 0
2026-03-10T07:51:30.490 INFO:tasks.workunit.client.1.vm08.stdout:8/500: mknod d0/df/d15/d23/d39/d5b/d4a/ca6 0
2026-03-10T07:51:30.491 INFO:tasks.workunit.client.1.vm08.stdout:3/462: write d0/d3c/d18/f23 [9506223,108044] 0
2026-03-10T07:51:30.494
INFO:tasks.workunit.client.1.vm08.stdout:8/501: dwrite d0/df/d15/d23/d39/f85 [0,4194304] 0 2026-03-10T07:51:30.511 INFO:tasks.workunit.client.1.vm08.stdout:3/463: dread d0/d3c/f20 [0,4194304] 0 2026-03-10T07:51:30.516 INFO:tasks.workunit.client.1.vm08.stdout:8/502: link d0/df/f9f d0/df/d15/d23/d39/d5b/d4a/fa7 0 2026-03-10T07:51:30.523 INFO:tasks.workunit.client.1.vm08.stdout:3/464: dread d0/d3c/d18/f22 [4194304,4194304] 0 2026-03-10T07:51:30.604 INFO:tasks.workunit.client.1.vm08.stdout:2/452: dread d0/d1/d3/d10/d32/d61/d8e/f90 [0,4194304] 0 2026-03-10T07:51:30.604 INFO:tasks.workunit.client.1.vm08.stdout:2/453: chown d0/d1/d3/d10/d38/f60 470754 1 2026-03-10T07:51:30.606 INFO:tasks.workunit.client.1.vm08.stdout:2/454: mknod d0/d1/d3/d10/d32/d61/d8e/c93 0 2026-03-10T07:51:30.607 INFO:tasks.workunit.client.1.vm08.stdout:2/455: read d0/d1/d3/d39/f3b [908344,39703] 0 2026-03-10T07:51:30.615 INFO:tasks.workunit.client.1.vm08.stdout:5/448: dread d0/d4/df/f2a [0,4194304] 0 2026-03-10T07:51:30.615 INFO:tasks.workunit.client.1.vm08.stdout:2/456: link d0/d1/d3/d39/d7d/d86/d55/d1b/f23 d0/d1/d3/d39/d7d/d86/d55/d7a/f94 0 2026-03-10T07:51:30.618 INFO:tasks.workunit.client.1.vm08.stdout:5/449: mkdir d0/d4/df/d12/d94 0 2026-03-10T07:51:30.620 INFO:tasks.workunit.client.1.vm08.stdout:5/450: mknod d0/d33/c95 0 2026-03-10T07:51:30.623 INFO:tasks.workunit.client.1.vm08.stdout:5/451: link d0/d4/d19/d3a/d69/f71 d0/d8/d5e/d8e/f96 0 2026-03-10T07:51:30.623 INFO:tasks.workunit.client.1.vm08.stdout:5/452: stat d0/c55 0 2026-03-10T07:51:30.631 INFO:tasks.workunit.client.1.vm08.stdout:5/453: dread d0/d4/df/d12/f11 [0,4194304] 0 2026-03-10T07:51:30.637 INFO:tasks.workunit.client.1.vm08.stdout:5/454: creat d0/d4/df/d12/f97 x:0 0 0 2026-03-10T07:51:30.637 INFO:tasks.workunit.client.1.vm08.stdout:5/455: stat d0/d4/df/d1e/d41/c48 0 2026-03-10T07:51:30.637 INFO:tasks.workunit.client.1.vm08.stdout:5/456: unlink d0/d8/d24/f86 0 2026-03-10T07:51:30.645 INFO:tasks.workunit.client.1.vm08.stdout:5/457: 
unlink d0/d4/d19/d60/d6d/d70/d40/c4d 0 2026-03-10T07:51:30.645 INFO:tasks.workunit.client.1.vm08.stdout:5/458: dread - d0/d4/d19/d81/d92/f76 zero size 2026-03-10T07:51:30.646 INFO:tasks.workunit.client.1.vm08.stdout:5/459: write d0/d8/d24/f56 [1398924,29578] 0 2026-03-10T07:51:30.647 INFO:tasks.workunit.client.1.vm08.stdout:5/460: chown d0/d4/df/d1e/f25 20498942 1 2026-03-10T07:51:30.648 INFO:tasks.workunit.client.1.vm08.stdout:5/461: mkdir d0/d4/d19/d60/d6d/d98 0 2026-03-10T07:51:30.649 INFO:tasks.workunit.client.1.vm08.stdout:5/462: chown d0/d4/df/d1e 27575538 1 2026-03-10T07:51:30.653 INFO:tasks.workunit.client.1.vm08.stdout:5/463: symlink d0/d4/df/d12/d94/l99 0 2026-03-10T07:51:30.659 INFO:tasks.workunit.client.1.vm08.stdout:5/464: symlink d0/d4/l9a 0 2026-03-10T07:51:30.663 INFO:tasks.workunit.client.1.vm08.stdout:5/465: unlink d0/d4/d19/d3a/f53 0 2026-03-10T07:51:30.664 INFO:tasks.workunit.client.1.vm08.stdout:5/466: creat d0/d4/d19/d50/f9b x:0 0 0 2026-03-10T07:51:30.667 INFO:tasks.workunit.client.1.vm08.stdout:5/467: link d0/c55 d0/d8/d5e/c9c 0 2026-03-10T07:51:30.670 INFO:tasks.workunit.client.1.vm08.stdout:5/468: mknod d0/d4/d19/d50/c9d 0 2026-03-10T07:51:30.671 INFO:tasks.workunit.client.1.vm08.stdout:6/430: chown d1/d17/d2b/f3c 2 1 2026-03-10T07:51:30.671 INFO:tasks.workunit.client.1.vm08.stdout:6/431: chown d1/d3/df/d1d 9192 1 2026-03-10T07:51:30.672 INFO:tasks.workunit.client.1.vm08.stdout:5/469: fsync d0/d4/df/d12/f13 0 2026-03-10T07:51:30.676 INFO:tasks.workunit.client.1.vm08.stdout:6/432: mkdir d1/d3/df/d1d/d40/d87/d95 0 2026-03-10T07:51:30.680 INFO:tasks.workunit.client.1.vm08.stdout:6/433: creat d1/d17/d2b/d5e/f96 x:0 0 0 2026-03-10T07:51:30.681 INFO:tasks.workunit.client.1.vm08.stdout:4/356: dread f1 [0,4194304] 0 2026-03-10T07:51:30.682 INFO:tasks.workunit.client.1.vm08.stdout:4/357: write d5/d17/d48/d4f/f56 [1045530,70665] 0 2026-03-10T07:51:30.691 INFO:tasks.workunit.client.1.vm08.stdout:6/434: write d1/d3/df/d1d/d40/f6e [2703169,47074] 0 
2026-03-10T07:51:30.691 INFO:tasks.workunit.client.1.vm08.stdout:6/435: chown d1/d17/d2b/c7a 407 1 2026-03-10T07:51:30.693 INFO:tasks.workunit.client.1.vm08.stdout:9/415: dwrite d2/de/f4d [0,4194304] 0 2026-03-10T07:51:30.695 INFO:tasks.workunit.client.1.vm08.stdout:9/416: chown d2/l82 654966 1 2026-03-10T07:51:30.697 INFO:tasks.workunit.client.1.vm08.stdout:4/358: mknod d5/d1f/d31/c76 0 2026-03-10T07:51:30.700 INFO:tasks.workunit.client.1.vm08.stdout:9/417: dwrite d2/d86/d30/d35/f6c [0,4194304] 0 2026-03-10T07:51:30.702 INFO:tasks.workunit.client.1.vm08.stdout:9/418: chown d2/d58/f95 28010549 1 2026-03-10T07:51:30.707 INFO:tasks.workunit.client.1.vm08.stdout:6/436: fsync d1/d3/df/d44/f5a 0 2026-03-10T07:51:30.707 INFO:tasks.workunit.client.1.vm08.stdout:0/437: truncate dd/d29/d58/f86 1255248 0 2026-03-10T07:51:30.711 INFO:tasks.workunit.client.1.vm08.stdout:9/419: creat d2/de/d28/f96 x:0 0 0 2026-03-10T07:51:30.716 INFO:tasks.workunit.client.1.vm08.stdout:9/420: dread d2/d3/f49 [4194304,4194304] 0 2026-03-10T07:51:30.717 INFO:tasks.workunit.client.1.vm08.stdout:6/437: symlink d1/d3/df/d1d/d40/d87/l97 0 2026-03-10T07:51:30.718 INFO:tasks.workunit.client.1.vm08.stdout:0/438: mknod dd/d10/d2f/d37/d64/d52/c93 0 2026-03-10T07:51:30.722 INFO:tasks.workunit.client.1.vm08.stdout:7/438: write d3/da/d25/d9/d2f/d39/f56 [1650168,128934] 0 2026-03-10T07:51:30.725 INFO:tasks.workunit.client.1.vm08.stdout:4/359: creat d5/f77 x:0 0 0 2026-03-10T07:51:30.730 INFO:tasks.workunit.client.1.vm08.stdout:7/439: dread d3/da/d25/d9/d2f/d3a/d40/f52 [4194304,4194304] 0 2026-03-10T07:51:30.732 INFO:tasks.workunit.client.1.vm08.stdout:9/421: mkdir d2/d86/d30/d35/d97 0 2026-03-10T07:51:30.732 INFO:tasks.workunit.client.1.vm08.stdout:9/422: readlink d2/de/l7b 0 2026-03-10T07:51:30.736 INFO:tasks.workunit.client.1.vm08.stdout:6/438: creat d1/db/d24/d73/d79/f98 x:0 0 0 2026-03-10T07:51:30.755 INFO:tasks.workunit.client.1.vm08.stdout:1/416: dwrite d2/d6/de/d1f/d22/f35 [4194304,4194304] 0 
2026-03-10T07:51:30.792 INFO:tasks.workunit.client.1.vm08.stdout:4/360: creat d5/d1f/d70/f78 x:0 0 0 2026-03-10T07:51:30.795 INFO:tasks.workunit.client.1.vm08.stdout:9/423: mkdir d2/de/d28/d98 0 2026-03-10T07:51:30.803 INFO:tasks.workunit.client.1.vm08.stdout:6/439: fsync d1/d17/d2b/f3c 0 2026-03-10T07:51:30.805 INFO:tasks.workunit.client.1.vm08.stdout:6/440: dread d1/f49 [0,4194304] 0 2026-03-10T07:51:30.826 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:51:30.824+0000 7f77e1000700 1 -- 192.168.123.105:0/2939881662 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f77dc107f60 msgr2=0x7f77dc10a350 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:51:30.826 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:51:30.824+0000 7f77e1000700 1 --2- 192.168.123.105:0/2939881662 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f77dc107f60 0x7f77dc10a350 secure :-1 s=READY pgs=102 cs=0 l=1 rev1=1 crypto rx=0x7f77cc007670 tx=0x7f77cc007980 comp rx=0 tx=0).stop 2026-03-10T07:51:30.826 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:51:30.825+0000 7f77e1000700 1 -- 192.168.123.105:0/2939881662 shutdown_connections 2026-03-10T07:51:30.827 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:51:30.825+0000 7f77e1000700 1 --2- 192.168.123.105:0/2939881662 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f77dc10a890 0x7f77dc10cd00 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:51:30.827 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:51:30.825+0000 7f77e1000700 1 --2- 192.168.123.105:0/2939881662 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f77dc107f60 0x7f77dc10a350 unknown :-1 s=CLOSED pgs=102 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:51:30.827 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:51:30.825+0000 7f77e1000700 1 -- 192.168.123.105:0/2939881662 >> 
192.168.123.105:0/2939881662 conn(0x7f77dc06dae0 msgr2=0x7f77dc06ff40 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:51:30.827 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:51:30.825+0000 7f77e1000700 1 -- 192.168.123.105:0/2939881662 shutdown_connections 2026-03-10T07:51:30.827 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:51:30.825+0000 7f77e1000700 1 -- 192.168.123.105:0/2939881662 wait complete. 2026-03-10T07:51:30.827 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:51:30.826+0000 7f77e1000700 1 Processor -- start 2026-03-10T07:51:30.828 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:51:30.826+0000 7f77e1000700 1 -- start start 2026-03-10T07:51:30.828 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:51:30.826+0000 7f77e1000700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f77dc107f60 0x7f77dc19cba0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:51:30.828 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:51:30.826+0000 7f77e1000700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f77dc10a890 0x7f77dc19d0e0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:51:30.828 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:51:30.826+0000 7f77e1000700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f77dc19d620 con 0x7f77dc10a890 2026-03-10T07:51:30.828 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:51:30.826+0000 7f77e1000700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f77dc19d760 con 0x7f77dc107f60 2026-03-10T07:51:30.828 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:51:30.827+0000 7f77da59c700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f77dc107f60 0x7f77dc19cba0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto 
rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:51:30.828 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:51:30.827+0000 7f77da59c700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f77dc107f60 0x7f77dc19cba0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.108:3300/0 says I am v2:192.168.123.105:53778/0 (socket says 192.168.123.105:53778) 2026-03-10T07:51:30.828 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:51:30.827+0000 7f77da59c700 1 -- 192.168.123.105:0/1974759675 learned_addr learned my addr 192.168.123.105:0/1974759675 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T07:51:30.828 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:51:30.827+0000 7f77d9d9b700 1 --2- 192.168.123.105:0/1974759675 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f77dc10a890 0x7f77dc19d0e0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:51:30.829 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:51:30.827+0000 7f77da59c700 1 -- 192.168.123.105:0/1974759675 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f77dc10a890 msgr2=0x7f77dc19d0e0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:51:30.829 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:51:30.827+0000 7f77da59c700 1 --2- 192.168.123.105:0/1974759675 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f77dc10a890 0x7f77dc19d0e0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:51:30.829 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:51:30.827+0000 7f77da59c700 1 -- 192.168.123.105:0/1974759675 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f77cc007350 con 
0x7f77dc107f60 2026-03-10T07:51:30.829 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:51:30.828+0000 7f77da59c700 1 --2- 192.168.123.105:0/1974759675 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f77dc107f60 0x7f77dc19cba0 secure :-1 s=READY pgs=103 cs=0 l=1 rev1=1 crypto rx=0x7f77cc005910 tx=0x7f77cc0049b0 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:51:30.829 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:51:30.828+0000 7f77cb7fe700 1 -- 192.168.123.105:0/1974759675 <== mon.1 v2:192.168.123.108:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f77cc00f070 con 0x7f77dc107f60 2026-03-10T07:51:30.830 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:51:30.828+0000 7f77e1000700 1 -- 192.168.123.105:0/1974759675 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f77dc1b32a0 con 0x7f77dc107f60 2026-03-10T07:51:30.830 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:51:30.828+0000 7f77e1000700 1 -- 192.168.123.105:0/1974759675 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f77dc1b37f0 con 0x7f77dc107f60 2026-03-10T07:51:30.830 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:51:30.829+0000 7f77cb7fe700 1 -- 192.168.123.105:0/1974759675 <== mon.1 v2:192.168.123.108:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f77cc004500 con 0x7f77dc107f60 2026-03-10T07:51:30.830 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:51:30.829+0000 7f77cb7fe700 1 -- 192.168.123.105:0/1974759675 <== mon.1 v2:192.168.123.108:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f77cc00c600 con 0x7f77dc107f60 2026-03-10T07:51:30.831 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:51:30.829+0000 7f77e1000700 1 -- 192.168.123.105:0/1974759675 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": 
"get_command_descriptions"} v 0) v1 -- 0x7f77bc005320 con 0x7f77dc107f60 2026-03-10T07:51:30.832 INFO:tasks.workunit.client.1.vm08.stdout:4/361: dread d5/d1f/f25 [0,4194304] 0 2026-03-10T07:51:30.833 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:51:30.831+0000 7f77cb7fe700 1 -- 192.168.123.105:0/1974759675 <== mon.1 v2:192.168.123.108:3300/0 4 ==== mgrmap(e 30) v1 ==== 100066+0+0 (secure 0 0 0) 0x7f77cc003680 con 0x7f77dc107f60 2026-03-10T07:51:30.833 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:51:30.832+0000 7f77cb7fe700 1 --2- 192.168.123.105:0/1974759675 >> [v2:192.168.123.105:6800/413688438,v1:192.168.123.105:6801/413688438] conn(0x7f77c40776d0 0x7f77c4079b90 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:51:30.833 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:51:30.832+0000 7f77cb7fe700 1 -- 192.168.123.105:0/1974759675 <== mon.1 v2:192.168.123.108:3300/0 5 ==== osd_map(42..42 src has 1..42) v4 ==== 5878+0+0 (secure 0 0 0) 0x7f77cc09a570 con 0x7f77dc107f60 2026-03-10T07:51:30.833 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:51:30.832+0000 7f77d9d9b700 1 --2- 192.168.123.105:0/1974759675 >> [v2:192.168.123.105:6800/413688438,v1:192.168.123.105:6801/413688438] conn(0x7f77c40776d0 0x7f77c4079b90 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:51:30.834 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:51:30.833+0000 7f77d9d9b700 1 --2- 192.168.123.105:0/1974759675 >> [v2:192.168.123.105:6800/413688438,v1:192.168.123.105:6801/413688438] conn(0x7f77c40776d0 0x7f77c4079b90 secure :-1 s=READY pgs=28 cs=0 l=1 rev1=1 crypto rx=0x7f77dc072f90 tx=0x7f77d0009450 comp rx=0 tx=0).ready entity=mgr.14628 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:51:30.838 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:51:30.834+0000 7f77cb7fe700 1 -- 
192.168.123.105:0/1974759675 <== mon.1 v2:192.168.123.108:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+191549 (secure 0 0 0) 0x7f77cc063080 con 0x7f77dc107f60 2026-03-10T07:51:30.848 INFO:tasks.workunit.client.1.vm08.stdout:1/417: creat d2/d6/de/d1f/d8f/f91 x:0 0 0 2026-03-10T07:51:30.861 INFO:tasks.workunit.client.1.vm08.stdout:7/440: creat d3/da/d25/f94 x:0 0 0 2026-03-10T07:51:30.894 INFO:tasks.workunit.client.1.vm08.stdout:9/424: symlink d2/d3/d84/d91/l99 0 2026-03-10T07:51:30.897 INFO:tasks.workunit.client.1.vm08.stdout:6/441: link d1/d3/df/d44/f5a d1/d17/d2b/d58/d76/f99 0 2026-03-10T07:51:30.898 INFO:tasks.workunit.client.1.vm08.stdout:1/418: mknod d2/d6/de/d1f/d40/c92 0 2026-03-10T07:51:30.899 INFO:tasks.workunit.client.1.vm08.stdout:1/419: fdatasync d2/d6/de/d1f/f2a 0 2026-03-10T07:51:30.899 INFO:tasks.workunit.client.1.vm08.stdout:1/420: stat d2/d6/de/d1f/d26/c73 0 2026-03-10T07:51:30.903 INFO:tasks.workunit.client.1.vm08.stdout:1/421: dwrite d2/d6/d3a/f7d [0,4194304] 0 2026-03-10T07:51:30.905 INFO:tasks.workunit.client.1.vm08.stdout:1/422: write d2/d6/de/d1f/d26/f4a [1010497,3440] 0 2026-03-10T07:51:30.905 INFO:tasks.workunit.client.1.vm08.stdout:1/423: chown d2/d6/d3a/c7b 2269 1 2026-03-10T07:51:30.924 INFO:tasks.workunit.client.1.vm08.stdout:6/442: unlink d1/db/d24/f26 0 2026-03-10T07:51:30.933 INFO:tasks.workunit.client.1.vm08.stdout:3/465: dwrite d0/d3c/d18/d32/f62 [4194304,4194304] 0 2026-03-10T07:51:30.968 INFO:tasks.workunit.client.1.vm08.stdout:1/424: symlink d2/d6/de/d47/l93 0 2026-03-10T07:51:30.993 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:51:30.991+0000 7f77e1000700 1 -- 192.168.123.105:0/1974759675 --> [v2:192.168.123.105:6800/413688438,v1:192.168.123.105:6801/413688438] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7f77bc000bf0 con 0x7f77c40776d0 2026-03-10T07:51:30.994 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:51:30.993+0000 7f77cb7fe700 1 -- 192.168.123.105:0/1974759675 <== mgr.14628 v2:192.168.123.105:6800/413688438 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+364 (secure 0 0 0) 0x7f77bc000bf0 con 0x7f77c40776d0 2026-03-10T07:51:30.999 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:51:30.997+0000 7f77c97fa700 1 -- 192.168.123.105:0/1974759675 >> [v2:192.168.123.105:6800/413688438,v1:192.168.123.105:6801/413688438] conn(0x7f77c40776d0 msgr2=0x7f77c4079b90 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:51:30.999 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:51:30.997+0000 7f77c97fa700 1 --2- 192.168.123.105:0/1974759675 >> [v2:192.168.123.105:6800/413688438,v1:192.168.123.105:6801/413688438] conn(0x7f77c40776d0 0x7f77c4079b90 secure :-1 s=READY pgs=28 cs=0 l=1 rev1=1 crypto rx=0x7f77dc072f90 tx=0x7f77d0009450 comp rx=0 tx=0).stop 2026-03-10T07:51:30.999 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:51:30.997+0000 7f77c97fa700 1 -- 192.168.123.105:0/1974759675 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f77dc107f60 msgr2=0x7f77dc19cba0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:51:30.999 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:51:30.997+0000 7f77c97fa700 1 --2- 192.168.123.105:0/1974759675 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f77dc107f60 0x7f77dc19cba0 secure :-1 s=READY pgs=103 cs=0 l=1 rev1=1 crypto rx=0x7f77cc005910 tx=0x7f77cc0049b0 comp rx=0 tx=0).stop 2026-03-10T07:51:30.999 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:51:30.997+0000 7f77c97fa700 1 -- 192.168.123.105:0/1974759675 shutdown_connections 2026-03-10T07:51:30.999 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:51:30.997+0000 7f77c97fa700 1 --2- 192.168.123.105:0/1974759675 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f77dc107f60 0x7f77dc19cba0 unknown :-1 s=CLOSED 
pgs=103 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:51:30.999 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:51:30.997+0000 7f77c97fa700 1 --2- 192.168.123.105:0/1974759675 >> [v2:192.168.123.105:6800/413688438,v1:192.168.123.105:6801/413688438] conn(0x7f77c40776d0 0x7f77c4079b90 unknown :-1 s=CLOSED pgs=28 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:51:30.999 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:51:30.997+0000 7f77c97fa700 1 --2- 192.168.123.105:0/1974759675 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f77dc10a890 0x7f77dc19d0e0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:51:30.999 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:51:30.997+0000 7f77c97fa700 1 -- 192.168.123.105:0/1974759675 >> 192.168.123.105:0/1974759675 conn(0x7f77dc06dae0 msgr2=0x7f77dc06ff40 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:51:30.999 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:51:30.997+0000 7f77c97fa700 1 -- 192.168.123.105:0/1974759675 shutdown_connections 2026-03-10T07:51:30.999 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:51:30.998+0000 7f77c97fa700 1 -- 192.168.123.105:0/1974759675 wait complete. 
2026-03-10T07:51:31.001 INFO:tasks.workunit.client.1.vm08.stdout:1/425: unlink d2/d6/de/d1f/f67 0 2026-03-10T07:51:31.008 INFO:tasks.workunit.client.1.vm08.stdout:2/457: creat d0/d1/d17/f95 x:0 0 0 2026-03-10T07:51:31.012 INFO:tasks.workunit.client.1.vm08.stdout:9/425: dread d2/d86/f24 [0,4194304] 0 2026-03-10T07:51:31.014 INFO:teuthology.orchestra.run.vm05.stdout:true 2026-03-10T07:51:31.027 INFO:tasks.workunit.client.1.vm08.stdout:1/426: mknod d2/d6/de/d1f/d40/c94 0 2026-03-10T07:51:31.043 INFO:tasks.workunit.client.1.vm08.stdout:1/427: dread d2/d6/de/d1f/d22/f81 [0,4194304] 0 2026-03-10T07:51:31.045 INFO:tasks.workunit.client.1.vm08.stdout:9/426: sync 2026-03-10T07:51:31.052 INFO:tasks.workunit.client.1.vm08.stdout:0/439: dwrite dd/d10/d2f/d37/d64/f70 [0,4194304] 0 2026-03-10T07:51:31.058 INFO:tasks.workunit.client.1.vm08.stdout:0/440: creat dd/d10/d14/d15/f94 x:0 0 0 2026-03-10T07:51:31.061 INFO:tasks.workunit.client.1.vm08.stdout:4/362: link d5/d17/l3b d5/d8/d50/l79 0 2026-03-10T07:51:31.062 INFO:tasks.workunit.client.1.vm08.stdout:4/363: rmdir d5/d17/d48/d4f 39 2026-03-10T07:51:31.072 INFO:tasks.workunit.client.1.vm08.stdout:1/428: dread d2/d10/f3f [0,4194304] 0 2026-03-10T07:51:31.077 INFO:tasks.workunit.client.1.vm08.stdout:1/429: write d2/d6/de/f74 [373344,76955] 0 2026-03-10T07:51:31.077 INFO:tasks.workunit.client.1.vm08.stdout:4/364: readlink d5/d8/d9/l5c 0 2026-03-10T07:51:31.079 INFO:tasks.workunit.client.1.vm08.stdout:9/427: getdents d2/de/d28 0 2026-03-10T07:51:31.080 INFO:tasks.workunit.client.1.vm08.stdout:4/365: chown d5/d8/c36 11230 1 2026-03-10T07:51:31.080 INFO:tasks.workunit.client.1.vm08.stdout:9/428: dread - d2/de/d28/f96 zero size 2026-03-10T07:51:31.080 INFO:tasks.workunit.client.1.vm08.stdout:1/430: mknod d2/d6/c95 0 2026-03-10T07:51:31.081 INFO:tasks.workunit.client.1.vm08.stdout:9/429: write d2/d86/d2b/f7a [344820,110553] 0 2026-03-10T07:51:31.103 INFO:tasks.workunit.client.1.vm08.stdout:9/430: mknod d2/d26/c9a 0 2026-03-10T07:51:31.114 
INFO:tasks.workunit.client.1.vm08.stdout:9/431: mkdir d2/d86/d30/d35/d9b 0 2026-03-10T07:51:31.118 INFO:tasks.workunit.client.1.vm08.stdout:9/432: dwrite d2/d26/f61 [0,4194304] 0 2026-03-10T07:51:31.127 INFO:tasks.workunit.client.1.vm08.stdout:9/433: creat d2/d86/d30/d35/d97/f9c x:0 0 0 2026-03-10T07:51:31.138 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:51:31.136+0000 7f7f456df700 1 -- 192.168.123.105:0/671185599 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f7f40075a10 msgr2=0x7f7f40077ea0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:51:31.146 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:51:31.136+0000 7f7f456df700 1 --2- 192.168.123.105:0/671185599 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f7f40075a10 0x7f7f40077ea0 secure :-1 s=READY pgs=320 cs=0 l=1 rev1=1 crypto rx=0x7f7f3800d3f0 tx=0x7f7f3800d700 comp rx=0 tx=0).stop 2026-03-10T07:51:31.146 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:51:31.136+0000 7f7f456df700 1 -- 192.168.123.105:0/671185599 shutdown_connections 2026-03-10T07:51:31.146 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:51:31.136+0000 7f7f456df700 1 --2- 192.168.123.105:0/671185599 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f7f40075a10 0x7f7f40077ea0 unknown :-1 s=CLOSED pgs=320 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:51:31.146 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:51:31.136+0000 7f7f456df700 1 --2- 192.168.123.105:0/671185599 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f7f40072b20 0x7f7f40072f40 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:51:31.146 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:51:31.136+0000 7f7f456df700 1 -- 192.168.123.105:0/671185599 >> 192.168.123.105:0/671185599 conn(0x7f7f4006daa0 msgr2=0x7f7f4006ff00 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:51:31.147 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:51:31.136+0000 7f7f456df700 1 -- 192.168.123.105:0/671185599 shutdown_connections 2026-03-10T07:51:31.147 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:51:31.136+0000 7f7f456df700 1 -- 192.168.123.105:0/671185599 wait complete. 2026-03-10T07:51:31.147 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:51:31.137+0000 7f7f456df700 1 Processor -- start 2026-03-10T07:51:31.147 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:51:31.137+0000 7f7f456df700 1 -- start start 2026-03-10T07:51:31.147 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:51:31.137+0000 7f7f456df700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f7f40072b20 0x7f7f40082f50 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:51:31.147 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:51:31.137+0000 7f7f456df700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f7f40083490 0x7f7f40083910 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:51:31.147 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:51:31.137+0000 7f7f456df700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f7f4012e760 con 0x7f7f40083490 2026-03-10T07:51:31.147 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:51:31.137+0000 7f7f456df700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f7f4012e8d0 con 0x7f7f40072b20 2026-03-10T07:51:31.147 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:51:31.137+0000 7f7f3ffff700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f7f40072b20 0x7f7f40082f50 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:51:31.147 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:51:31.137+0000 7f7f3f7fe700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f7f40083490 0x7f7f40083910 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:51:31.147 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:51:31.137+0000 7f7f3f7fe700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f7f40083490 0x7f7f40083910 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.105:51976/0 (socket says 192.168.123.105:51976) 2026-03-10T07:51:31.147 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:51:31.137+0000 7f7f3f7fe700 1 -- 192.168.123.105:0/588089964 learned_addr learned my addr 192.168.123.105:0/588089964 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T07:51:31.147 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:51:31.138+0000 7f7f3f7fe700 1 -- 192.168.123.105:0/588089964 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f7f40072b20 msgr2=0x7f7f40082f50 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:51:31.147 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:51:31.138+0000 7f7f3f7fe700 1 --2- 192.168.123.105:0/588089964 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f7f40072b20 0x7f7f40082f50 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:51:31.147 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:51:31.138+0000 7f7f3f7fe700 1 -- 192.168.123.105:0/588089964 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f7f38007ed0 con 0x7f7f40083490 2026-03-10T07:51:31.147 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:51:31.138+0000 7f7f3f7fe700 1 --2- 
192.168.123.105:0/588089964 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f7f40083490 0x7f7f40083910 secure :-1 s=READY pgs=321 cs=0 l=1 rev1=1 crypto rx=0x7f7f38003c60 tx=0x7f7f38003d40 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:51:31.147 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:51:31.138+0000 7f7f3d7fa700 1 -- 192.168.123.105:0/588089964 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f7f3801c070 con 0x7f7f40083490 2026-03-10T07:51:31.147 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:51:31.138+0000 7f7f456df700 1 -- 192.168.123.105:0/588089964 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f7f4012eb50 con 0x7f7f40083490 2026-03-10T07:51:31.147 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:51:31.138+0000 7f7f456df700 1 -- 192.168.123.105:0/588089964 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f7f4012f040 con 0x7f7f40083490 2026-03-10T07:51:31.147 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:51:31.139+0000 7f7f3d7fa700 1 -- 192.168.123.105:0/588089964 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f7f3800fb40 con 0x7f7f40083490 2026-03-10T07:51:31.147 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:51:31.140+0000 7f7f3d7fa700 1 -- 192.168.123.105:0/588089964 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f7f38017bb0 con 0x7f7f40083490 2026-03-10T07:51:31.147 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:51:31.140+0000 7f7f3d7fa700 1 -- 192.168.123.105:0/588089964 <== mon.0 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 30) v1 ==== 100066+0+0 (secure 0 0 0) 0x7f7f3802a430 con 0x7f7f40083490 2026-03-10T07:51:31.147 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:51:31.141+0000 7f7f3d7fa700 
1 --2- 192.168.123.105:0/588089964 >> [v2:192.168.123.105:6800/413688438,v1:192.168.123.105:6801/413688438] conn(0x7f7f28077860 0x7f7f28079d20 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:51:31.147 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:51:31.141+0000 7f7f3ffff700 1 --2- 192.168.123.105:0/588089964 >> [v2:192.168.123.105:6800/413688438,v1:192.168.123.105:6801/413688438] conn(0x7f7f28077860 0x7f7f28079d20 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:51:31.147 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:51:31.142+0000 7f7f3d7fa700 1 -- 192.168.123.105:0/588089964 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(42..42 src has 1..42) v4 ==== 5878+0+0 (secure 0 0 0) 0x7f7f38013070 con 0x7f7f40083490 2026-03-10T07:51:31.147 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:51:31.142+0000 7f7f3ffff700 1 --2- 192.168.123.105:0/588089964 >> [v2:192.168.123.105:6800/413688438,v1:192.168.123.105:6801/413688438] conn(0x7f7f28077860 0x7f7f28079d20 secure :-1 s=READY pgs=29 cs=0 l=1 rev1=1 crypto rx=0x7f7f30005950 tx=0x7f7f300058e0 comp rx=0 tx=0).ready entity=mgr.14628 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:51:31.147 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:51:31.142+0000 7f7f456df700 1 -- 192.168.123.105:0/588089964 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f7f2c005320 con 0x7f7f40083490 2026-03-10T07:51:31.148 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:51:31.147+0000 7f7f3d7fa700 1 -- 192.168.123.105:0/588089964 <== mon.0 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+191549 (secure 0 0 0) 0x7f7f38064e50 con 0x7f7f40083490 2026-03-10T07:51:31.155 
INFO:tasks.workunit.client.1.vm08.stdout:7/441: symlink d3/da/d25/d9/d2f/d39/l95 0 2026-03-10T07:51:31.158 INFO:tasks.workunit.client.1.vm08.stdout:7/442: rmdir d3/da/d8a 39 2026-03-10T07:51:31.159 INFO:tasks.workunit.client.1.vm08.stdout:7/443: readlink d3/da/d25/d9/l1c 0 2026-03-10T07:51:31.163 INFO:tasks.workunit.client.1.vm08.stdout:7/444: getdents d3/da/d25/d9/d2f/d3a/d71/d8c 0 2026-03-10T07:51:31.165 INFO:tasks.workunit.client.1.vm08.stdout:7/445: symlink d3/l96 0 2026-03-10T07:51:31.172 INFO:tasks.workunit.client.1.vm08.stdout:6/443: dwrite d1/d3/f19 [0,4194304] 0 2026-03-10T07:51:31.176 INFO:tasks.workunit.client.1.vm08.stdout:3/466: dwrite d0/d3c/d1f/d44/d51/d2d/f3a [0,4194304] 0 2026-03-10T07:51:31.177 INFO:tasks.workunit.client.1.vm08.stdout:3/467: write d0/d3c/d18/d32/f76 [976341,54945] 0 2026-03-10T07:51:31.178 INFO:tasks.workunit.client.1.vm08.stdout:3/468: write d0/d3c/d1f/d44/d51/d34/f3d [5072100,80078] 0 2026-03-10T07:51:31.179 INFO:tasks.workunit.client.1.vm08.stdout:3/469: stat d0/lf 0 2026-03-10T07:51:31.208 INFO:tasks.workunit.client.1.vm08.stdout:2/458: rmdir d0 39 2026-03-10T07:51:31.222 INFO:tasks.workunit.client.1.vm08.stdout:2/459: stat d0/d1/d3/d10/d38/f53 0 2026-03-10T07:51:31.222 INFO:tasks.workunit.client.1.vm08.stdout:3/470: link d0/d3c/d1f/f6f d0/d3c/d1f/d44/d51/d34/f99 0 2026-03-10T07:51:31.223 INFO:tasks.workunit.client.1.vm08.stdout:4/366: mknod d5/d1f/c7a 0 2026-03-10T07:51:31.230 INFO:tasks.workunit.client.1.vm08.stdout:2/460: fsync d0/d1/d3/d39/d7d/d86/d55/f20 0 2026-03-10T07:51:31.231 INFO:tasks.workunit.client.1.vm08.stdout:8/503: rename d0/df/d17/d66 to d0/df/d15/d23/da8 0 2026-03-10T07:51:31.234 INFO:tasks.workunit.client.1.vm08.stdout:8/504: dwrite d0/fa [0,4194304] 0 2026-03-10T07:51:31.251 INFO:tasks.workunit.client.1.vm08.stdout:5/470: rename d0/d8/d24/c5d to d0/d4/d19/d3a/c9e 0 2026-03-10T07:51:31.252 INFO:tasks.workunit.client.1.vm08.stdout:3/471: creat d0/f9a x:0 0 0 2026-03-10T07:51:31.255 
INFO:tasks.workunit.client.1.vm08.stdout:0/441: rename dd/d29 to dd/d10/d2f/d37/d64/d95 0 2026-03-10T07:51:31.256 INFO:tasks.workunit.client.1.vm08.stdout:1/431: write d2/d6/de/f32 [1782048,3520] 0 2026-03-10T07:51:31.257 INFO:tasks.workunit.client.1.vm08.stdout:1/432: write d2/d6/de/d1f/d26/f6e [3426000,17012] 0 2026-03-10T07:51:31.261 INFO:tasks.workunit.client.1.vm08.stdout:1/433: dread d2/d6/de/d1f/d40/d76/f79 [0,4194304] 0 2026-03-10T07:51:31.265 INFO:tasks.workunit.client.1.vm08.stdout:3/472: symlink d0/d3c/d18/d32/l9b 0 2026-03-10T07:51:31.266 INFO:tasks.workunit.client.1.vm08.stdout:1/434: dwrite d2/d6/f86 [0,4194304] 0 2026-03-10T07:51:31.273 INFO:tasks.workunit.client.1.vm08.stdout:5/471: fdatasync d0/d4/df/f27 0 2026-03-10T07:51:31.274 INFO:tasks.workunit.client.1.vm08.stdout:9/434: truncate d2/d3/fa 2146892 0 2026-03-10T07:51:31.278 INFO:tasks.workunit.client.1.vm08.stdout:7/446: rename d3/da/d25/d9/d2f/d3a/d4b/d67/d69/f91 to d3/da/d25/d9/d2f/f97 0 2026-03-10T07:51:31.279 INFO:tasks.workunit.client.1.vm08.stdout:0/442: creat dd/d18/f96 x:0 0 0 2026-03-10T07:51:31.280 INFO:tasks.workunit.client.1.vm08.stdout:6/444: write d1/d3/d3e/f4a [4665473,81048] 0 2026-03-10T07:51:31.282 INFO:tasks.workunit.client.1.vm08.stdout:0/443: write dd/d18/f96 [332427,6888] 0 2026-03-10T07:51:31.285 INFO:tasks.workunit.client.1.vm08.stdout:6/445: dwrite d1/d17/f20 [0,4194304] 0 2026-03-10T07:51:31.290 INFO:tasks.workunit.client.1.vm08.stdout:6/446: dread d1/f49 [0,4194304] 0 2026-03-10T07:51:31.290 INFO:tasks.workunit.client.1.vm08.stdout:6/447: stat d1/db 0 2026-03-10T07:51:31.291 INFO:tasks.workunit.client.1.vm08.stdout:6/448: write d1/f35 [1623272,53105] 0 2026-03-10T07:51:31.307 INFO:tasks.workunit.client.1.vm08.stdout:2/461: getdents d0/d1/d3/d10/d32 0 2026-03-10T07:51:31.309 INFO:tasks.workunit.client.1.vm08.stdout:3/473: creat d0/d3c/d18/d48/d55/d56/f9c x:0 0 0 2026-03-10T07:51:31.313 INFO:tasks.workunit.client.1.vm08.stdout:2/462: dread d0/d1/d3/d39/d7d/d86/d55/f20 
[0,4194304] 0 2026-03-10T07:51:31.315 INFO:tasks.workunit.client.1.vm08.stdout:9/435: mkdir d2/d86/d30/d35/d97/d9d 0 2026-03-10T07:51:31.316 INFO:tasks.workunit.client.1.vm08.stdout:2/463: dwrite d0/f76 [0,4194304] 0 2026-03-10T07:51:31.338 INFO:tasks.workunit.client.1.vm08.stdout:4/367: rename d5/d17 to d5/d8/d9/d12/d7b 0 2026-03-10T07:51:31.339 INFO:tasks.workunit.client.1.vm08.stdout:4/368: read - d5/d1f/d31/d61/f75 zero size 2026-03-10T07:51:31.340 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:51:31.338+0000 7f7f456df700 1 -- 192.168.123.105:0/588089964 --> [v2:192.168.123.105:6800/413688438,v1:192.168.123.105:6801/413688438] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7f7f2c000bf0 con 0x7f7f28077860 2026-03-10T07:51:31.341 INFO:tasks.workunit.client.1.vm08.stdout:0/444: rmdir dd/d10/d14/d1b 39 2026-03-10T07:51:31.342 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:51:31.341+0000 7f7f3d7fa700 1 -- 192.168.123.105:0/588089964 <== mgr.14628 v2:192.168.123.105:6800/413688438 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+364 (secure 0 0 0) 0x7f7f2c000bf0 con 0x7f7f28077860 2026-03-10T07:51:31.351 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:51:31.349+0000 7f7f456df700 1 -- 192.168.123.105:0/588089964 >> [v2:192.168.123.105:6800/413688438,v1:192.168.123.105:6801/413688438] conn(0x7f7f28077860 msgr2=0x7f7f28079d20 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:51:31.351 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:51:31.349+0000 7f7f456df700 1 --2- 192.168.123.105:0/588089964 >> [v2:192.168.123.105:6800/413688438,v1:192.168.123.105:6801/413688438] conn(0x7f7f28077860 0x7f7f28079d20 secure :-1 s=READY pgs=29 cs=0 l=1 rev1=1 crypto rx=0x7f7f30005950 tx=0x7f7f300058e0 comp rx=0 tx=0).stop 2026-03-10T07:51:31.351 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:51:31.349+0000 7f7f456df700 1 -- 192.168.123.105:0/588089964 >> 
[v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f7f40083490 msgr2=0x7f7f40083910 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:51:31.351 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:51:31.349+0000 7f7f456df700 1 --2- 192.168.123.105:0/588089964 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f7f40083490 0x7f7f40083910 secure :-1 s=READY pgs=321 cs=0 l=1 rev1=1 crypto rx=0x7f7f38003c60 tx=0x7f7f38003d40 comp rx=0 tx=0).stop 2026-03-10T07:51:31.351 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:51:31.350+0000 7f7f456df700 1 -- 192.168.123.105:0/588089964 shutdown_connections 2026-03-10T07:51:31.351 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:51:31.350+0000 7f7f456df700 1 --2- 192.168.123.105:0/588089964 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f7f40072b20 0x7f7f40082f50 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:51:31.351 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:51:31.350+0000 7f7f456df700 1 --2- 192.168.123.105:0/588089964 >> [v2:192.168.123.105:6800/413688438,v1:192.168.123.105:6801/413688438] conn(0x7f7f28077860 0x7f7f28079d20 unknown :-1 s=CLOSED pgs=29 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:51:31.351 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:51:31.350+0000 7f7f456df700 1 --2- 192.168.123.105:0/588089964 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f7f40083490 0x7f7f40083910 unknown :-1 s=CLOSED pgs=321 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:51:31.351 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:51:31.350+0000 7f7f456df700 1 -- 192.168.123.105:0/588089964 >> 192.168.123.105:0/588089964 conn(0x7f7f4006daa0 msgr2=0x7f7f4006ff00 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:51:31.352 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:51:31.350+0000 7f7f456df700 1 -- 
192.168.123.105:0/588089964 shutdown_connections 2026-03-10T07:51:31.352 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:51:31.351+0000 7f7f456df700 1 -- 192.168.123.105:0/588089964 wait complete. 2026-03-10T07:51:31.371 INFO:tasks.workunit.client.1.vm08.stdout:9/436: stat d2/d26/f29 0 2026-03-10T07:51:31.374 INFO:tasks.workunit.client.1.vm08.stdout:5/472: rename d0/d4/c15 to d0/d77/d83/c9f 0 2026-03-10T07:51:31.375 INFO:tasks.workunit.client.1.vm08.stdout:5/473: write d0/d4/d19/d81/d92/f74 [425704,27022] 0 2026-03-10T07:51:31.377 INFO:tasks.workunit.client.1.vm08.stdout:5/474: truncate d0/d4/df/f2a 5153526 0 2026-03-10T07:51:31.386 INFO:tasks.workunit.client.1.vm08.stdout:4/369: truncate d5/d1f/f37 556897 0 2026-03-10T07:51:31.389 INFO:tasks.workunit.client.1.vm08.stdout:4/370: dwrite d5/d8/d9/d32/f6c [0,4194304] 0 2026-03-10T07:51:31.398 INFO:tasks.workunit.client.1.vm08.stdout:8/505: write d0/d69/f46 [281470,2890] 0 2026-03-10T07:51:31.427 INFO:tasks.workunit.client.1.vm08.stdout:3/474: write d0/f45 [302170,32187] 0 2026-03-10T07:51:31.431 INFO:tasks.workunit.client.1.vm08.stdout:3/475: dwrite d0/d3c/d18/d32/d61/d83/f8b [0,4194304] 0 2026-03-10T07:51:31.451 INFO:tasks.workunit.client.1.vm08.stdout:7/447: truncate d3/f6 4029545 0 2026-03-10T07:51:31.451 INFO:tasks.workunit.client.1.vm08.stdout:0/445: getdents dd/d10/d14/d15/d20/d92 0 2026-03-10T07:51:31.454 INFO:tasks.workunit.client.1.vm08.stdout:1/435: getdents d2/d6/de/d1f/d8f 0 2026-03-10T07:51:31.458 INFO:tasks.workunit.client.1.vm08.stdout:2/464: creat d0/d1/d3/f96 x:0 0 0 2026-03-10T07:51:31.461 INFO:tasks.workunit.client.1.vm08.stdout:9/437: write d2/de/f4f [753047,19392] 0 2026-03-10T07:51:31.475 INFO:tasks.workunit.client.1.vm08.stdout:5/475: write d0/d4/df/d12/f13 [3916334,40773] 0 2026-03-10T07:51:31.475 INFO:tasks.workunit.client.1.vm08.stdout:9/438: sync 2026-03-10T07:51:31.476 INFO:tasks.workunit.client.1.vm08.stdout:9/439: fdatasync d2/d86/f55 0 2026-03-10T07:51:31.481 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:51:31.480+0000 7fa5a2247700 1 -- 192.168.123.105:0/2468857365 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fa59c075a40 msgr2=0x7fa59c077ed0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:51:31.481 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:51:31.480+0000 7fa5a2247700 1 --2- 192.168.123.105:0/2468857365 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fa59c075a40 0x7fa59c077ed0 secure :-1 s=READY pgs=104 cs=0 l=1 rev1=1 crypto rx=0x7fa59400d3e0 tx=0x7fa59400d6f0 comp rx=0 tx=0).stop 2026-03-10T07:51:31.481 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:51:31.480+0000 7fa5a2247700 1 -- 192.168.123.105:0/2468857365 shutdown_connections 2026-03-10T07:51:31.481 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:51:31.480+0000 7fa5a2247700 1 --2- 192.168.123.105:0/2468857365 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fa59c075a40 0x7fa59c077ed0 unknown :-1 s=CLOSED pgs=104 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:51:31.481 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:51:31.480+0000 7fa5a2247700 1 --2- 192.168.123.105:0/2468857365 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fa59c072b50 0x7fa59c072f70 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:51:31.482 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:51:31.480+0000 7fa5a2247700 1 -- 192.168.123.105:0/2468857365 >> 192.168.123.105:0/2468857365 conn(0x7fa59c06dae0 msgr2=0x7fa59c06ff40 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:51:31.482 INFO:tasks.workunit.client.1.vm08.stdout:7/448: creat d3/da/d25/d9/d2f/d6c/f98 x:0 0 0 2026-03-10T07:51:31.482 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:51:31.480+0000 7fa5a2247700 1 -- 192.168.123.105:0/2468857365 shutdown_connections 2026-03-10T07:51:31.482 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:51:31.480+0000 7fa5a2247700 1 -- 192.168.123.105:0/2468857365 wait complete. 2026-03-10T07:51:31.482 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:51:31.480+0000 7fa5a2247700 1 Processor -- start 2026-03-10T07:51:31.482 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:51:31.480+0000 7fa5a2247700 1 -- start start 2026-03-10T07:51:31.483 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:51:31.481+0000 7fa5a2247700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fa59c072b50 0x7fa59c082fd0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:51:31.483 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:51:31.481+0000 7fa5a2247700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fa59c083510 0x7fa59c1b3080 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:51:31.483 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:51:31.481+0000 7fa5a2247700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fa59c083990 con 0x7fa59c072b50 2026-03-10T07:51:31.483 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:51:31.481+0000 7fa5a2247700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fa59c083b00 con 0x7fa59c083510 2026-03-10T07:51:31.483 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:51:31.481+0000 7fa59b7fe700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fa59c072b50 0x7fa59c082fd0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:51:31.483 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:51:31.481+0000 7fa59affd700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fa59c083510 0x7fa59c1b3080 unknown :-1 
s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:51:31.483 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:51:31.481+0000 7fa59affd700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fa59c083510 0x7fa59c1b3080 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.108:3300/0 says I am v2:192.168.123.105:53834/0 (socket says 192.168.123.105:53834) 2026-03-10T07:51:31.483 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:51:31.481+0000 7fa59affd700 1 -- 192.168.123.105:0/3532010592 learned_addr learned my addr 192.168.123.105:0/3532010592 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T07:51:31.484 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:51:31.482+0000 7fa59affd700 1 -- 192.168.123.105:0/3532010592 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fa59c072b50 msgr2=0x7fa59c082fd0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:51:31.484 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:51:31.482+0000 7fa59affd700 1 --2- 192.168.123.105:0/3532010592 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fa59c072b50 0x7fa59c082fd0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:51:31.484 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:51:31.482+0000 7fa59affd700 1 -- 192.168.123.105:0/3532010592 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fa59400d090 con 0x7fa59c083510 2026-03-10T07:51:31.485 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:51:31.483+0000 7fa59affd700 1 --2- 192.168.123.105:0/3532010592 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fa59c083510 0x7fa59c1b3080 secure :-1 s=READY pgs=105 cs=0 l=1 rev1=1 crypto rx=0x7fa59400dac0 
tx=0x7fa594009bd0 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:51:31.490 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:51:31.484+0000 7fa598ff9700 1 -- 192.168.123.105:0/3532010592 <== mon.1 v2:192.168.123.108:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fa594010040 con 0x7fa59c083510 2026-03-10T07:51:31.490 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:51:31.484+0000 7fa5a2247700 1 -- 192.168.123.105:0/3532010592 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fa59c1b35c0 con 0x7fa59c083510 2026-03-10T07:51:31.490 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:51:31.484+0000 7fa5a2247700 1 -- 192.168.123.105:0/3532010592 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fa59c1b3ac0 con 0x7fa59c083510 2026-03-10T07:51:31.490 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:51:31.485+0000 7fa598ff9700 1 -- 192.168.123.105:0/3532010592 <== mon.1 v2:192.168.123.108:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7fa594004250 con 0x7fa59c083510 2026-03-10T07:51:31.490 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:51:31.485+0000 7fa598ff9700 1 -- 192.168.123.105:0/3532010592 <== mon.1 v2:192.168.123.108:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fa59401d8f0 con 0x7fa59c083510 2026-03-10T07:51:31.490 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:51:31.488+0000 7fa598ff9700 1 -- 192.168.123.105:0/3532010592 <== mon.1 v2:192.168.123.108:3300/0 4 ==== mgrmap(e 30) v1 ==== 100066+0+0 (secure 0 0 0) 0x7fa59401da50 con 0x7fa59c083510 2026-03-10T07:51:31.490 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:51:31.489+0000 7fa598ff9700 1 --2- 192.168.123.105:0/3532010592 >> [v2:192.168.123.105:6800/413688438,v1:192.168.123.105:6801/413688438] conn(0x7fa58407fed0 0x7fa584082390 unknown :-1 s=NONE pgs=0 cs=0 l=1 
rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:51:31.490 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:51:31.489+0000 7fa598ff9700 1 -- 192.168.123.105:0/3532010592 <== mon.1 v2:192.168.123.108:3300/0 5 ==== osd_map(42..42 src has 1..42) v4 ==== 5878+0+0 (secure 0 0 0) 0x7fa59409c000 con 0x7fa59c083510 2026-03-10T07:51:31.492 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:51:31.489+0000 7fa59b7fe700 1 --2- 192.168.123.105:0/3532010592 >> [v2:192.168.123.105:6800/413688438,v1:192.168.123.105:6801/413688438] conn(0x7fa58407fed0 0x7fa584082390 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:51:31.492 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:51:31.489+0000 7fa5a2247700 1 -- 192.168.123.105:0/3532010592 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fa588005320 con 0x7fa59c083510 2026-03-10T07:51:31.497 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:51:31.494+0000 7fa598ff9700 1 -- 192.168.123.105:0/3532010592 <== mon.1 v2:192.168.123.108:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+191549 (secure 0 0 0) 0x7fa594064a90 con 0x7fa59c083510 2026-03-10T07:51:31.497 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:51:31.494+0000 7fa59b7fe700 1 --2- 192.168.123.105:0/3532010592 >> [v2:192.168.123.105:6800/413688438,v1:192.168.123.105:6801/413688438] conn(0x7fa58407fed0 0x7fa584082390 secure :-1 s=READY pgs=30 cs=0 l=1 rev1=1 crypto rx=0x7fa59c1aed00 tx=0x7fa58c00c040 comp rx=0 tx=0).ready entity=mgr.14628 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:51:31.506 INFO:tasks.workunit.client.1.vm08.stdout:4/371: mkdir d5/d8/d9/d12/d7b/d48/d4f/d7c 0 2026-03-10T07:51:31.507 INFO:tasks.workunit.client.1.vm08.stdout:6/449: getdents d1/db/d24/d73/d79/d7c 0 2026-03-10T07:51:31.520 
INFO:tasks.workunit.client.1.vm08.stdout:8/506: truncate d0/df/d15/d23/d39/f3e 1951268 0 2026-03-10T07:51:31.531 INFO:tasks.workunit.client.1.vm08.stdout:2/465: unlink d0/d1/d3/d10/d32/c5e 0 2026-03-10T07:51:31.539 INFO:tasks.workunit.client.1.vm08.stdout:3/476: dwrite d0/d3c/d18/d32/d61/d52/f68 [0,4194304] 0 2026-03-10T07:51:31.550 INFO:tasks.workunit.client.1.vm08.stdout:1/436: dwrite d2/d6/de/d1f/d26/f48 [0,4194304] 0 2026-03-10T07:51:31.551 INFO:tasks.workunit.client.1.vm08.stdout:5/476: dwrite d0/d4/d19/d3a/f3f [0,4194304] 0 2026-03-10T07:51:31.552 INFO:tasks.workunit.client.1.vm08.stdout:0/446: unlink dd/d10/d14/d1b/d30/l49 0 2026-03-10T07:51:31.556 INFO:tasks.workunit.client.1.vm08.stdout:9/440: creat d2/d58/d73/f9e x:0 0 0 2026-03-10T07:51:31.559 INFO:tasks.workunit.client.1.vm08.stdout:1/437: dread d2/d6/f86 [0,4194304] 0 2026-03-10T07:51:31.566 INFO:tasks.workunit.client.1.vm08.stdout:4/372: mknod d5/d8/d50/c7d 0 2026-03-10T07:51:31.579 INFO:tasks.workunit.client.1.vm08.stdout:6/450: mknod d1/db/d24/d51/c9a 0 2026-03-10T07:51:31.589 INFO:tasks.workunit.client.1.vm08.stdout:8/507: creat d0/df/d17/d7a/d89/fa9 x:0 0 0 2026-03-10T07:51:31.602 INFO:tasks.workunit.client.1.vm08.stdout:2/466: symlink d0/d1/d3/d10/l97 0 2026-03-10T07:51:31.620 INFO:tasks.workunit.client.1.vm08.stdout:3/477: unlink d0/d3c/d1f/d44/d51/c88 0 2026-03-10T07:51:31.621 INFO:tasks.workunit.client.1.vm08.stdout:3/478: truncate d0/d3c/d1f/d44/f8f 190641 0 2026-03-10T07:51:31.651 INFO:tasks.workunit.client.1.vm08.stdout:0/447: read dd/d10/d2f/f4c [1357490,24767] 0 2026-03-10T07:51:31.652 INFO:tasks.workunit.client.1.vm08.stdout:9/441: fsync d2/d86/d2b/f6a 0 2026-03-10T07:51:31.653 INFO:tasks.workunit.client.1.vm08.stdout:1/438: unlink d2/d6/de/d1f/d26/f5d 0 2026-03-10T07:51:31.656 INFO:tasks.workunit.client.1.vm08.stdout:9/442: dwrite d2/d86/f55 [0,4194304] 0 2026-03-10T07:51:31.661 INFO:tasks.workunit.client.1.vm08.stdout:1/439: sync 2026-03-10T07:51:31.670 
INFO:tasks.workunit.client.1.vm08.stdout:4/373: symlink d5/d8/d9/d12/d7b/d48/d64/l7e 0 2026-03-10T07:51:31.670 INFO:tasks.workunit.client.1.vm08.stdout:6/451: fdatasync d1/f28 0 2026-03-10T07:51:31.674 INFO:tasks.workunit.client.1.vm08.stdout:6/452: dwrite d1/d3/df/d44/f82 [0,4194304] 0 2026-03-10T07:51:31.676 INFO:tasks.workunit.client.1.vm08.stdout:6/453: write d1/d17/f42 [2489155,98734] 0 2026-03-10T07:51:31.676 INFO:tasks.workunit.client.1.vm08.stdout:6/454: chown d1/db/d24/d73 17731 1 2026-03-10T07:51:31.679 INFO:tasks.workunit.client.1.vm08.stdout:2/467: creat d0/d1/d3/d39/d7d/f98 x:0 0 0 2026-03-10T07:51:31.687 INFO:tasks.workunit.client.1.vm08.stdout:5/477: write d0/d8/d24/f3e [376351,121708] 0 2026-03-10T07:51:31.699 INFO:tasks.workunit.client.1.vm08.stdout:0/448: mknod dd/d10/d2f/d37/d64/d52/c97 0 2026-03-10T07:51:31.711 INFO:tasks.workunit.client.1.vm08.stdout:7/449: link d3/da/d25/d9/d2f/d3a/l50 d3/da/d25/d9/d6f/l99 0 2026-03-10T07:51:31.711 INFO:tasks.workunit.client.1.vm08.stdout:9/443: rename d2/d86/f76 to d2/d58/f9f 0 2026-03-10T07:51:31.719 INFO:tasks.workunit.client.1.vm08.stdout:6/455: creat d1/d3/df/d1d/f9b x:0 0 0 2026-03-10T07:51:31.719 INFO:tasks.workunit.client.1.vm08.stdout:0/449: mknod dd/d10/d2f/d37/d64/c98 0 2026-03-10T07:51:31.719 INFO:tasks.workunit.client.1.vm08.stdout:6/456: fdatasync d1/d3/d3e/f43 0 2026-03-10T07:51:31.721 INFO:tasks.workunit.client.1.vm08.stdout:6/457: truncate d1/d3/d3e/f81 1899115 0 2026-03-10T07:51:31.727 INFO:tasks.workunit.client.1.vm08.stdout:1/440: rename d2/d10/f2b to d2/d6/de/d1f/d26/d58/d8c/f96 0 2026-03-10T07:51:31.727 INFO:tasks.workunit.client.1.vm08.stdout:1/441: stat d2/d6/de/c56 0 2026-03-10T07:51:31.727 INFO:tasks.workunit.client.1.vm08.stdout:1/442: write d2/f4 [1181423,8011] 0 2026-03-10T07:51:31.729 INFO:tasks.workunit.client.1.vm08.stdout:1/443: read d2/d6/de/d1f/d26/d58/d8c/f44 [69014,11715] 0 2026-03-10T07:51:31.735 INFO:tasks.workunit.client.1.vm08.stdout:3/479: creat d0/d3c/d1f/d44/d51/f9d 
x:0 0 0 2026-03-10T07:51:31.736 INFO:tasks.workunit.client.1.vm08.stdout:3/480: read - d0/d3c/d1f/d44/d51/f5b zero size 2026-03-10T07:51:31.739 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:51:31.737+0000 7fa5a2247700 1 -- 192.168.123.105:0/3532010592 --> [v2:192.168.123.105:6800/413688438,v1:192.168.123.105:6801/413688438] -- mgr_command(tid 0: {"prefix": "orch ps", "target": ["mon-mgr", ""]}) v1 -- 0x7fa588000bf0 con 0x7fa58407fed0 2026-03-10T07:51:31.742 INFO:tasks.workunit.client.1.vm08.stdout:3/481: dread d0/d3c/d1f/d44/d51/d34/f47 [0,4194304] 0 2026-03-10T07:51:31.748 INFO:tasks.workunit.client.1.vm08.stdout:5/478: unlink d0/d4/df/c6c 0 2026-03-10T07:51:31.749 INFO:tasks.workunit.client.1.vm08.stdout:6/458: creat d1/d3/df/d52/f9c x:0 0 0 2026-03-10T07:51:31.749 INFO:tasks.workunit.client.1.vm08.stdout:9/444: link d2/d86/d30/f5d d2/d86/d30/d35/fa0 0 2026-03-10T07:51:31.750 INFO:tasks.workunit.client.1.vm08.stdout:9/445: chown d2/de/d28/l7d 2127 1 2026-03-10T07:51:31.752 INFO:tasks.workunit.client.1.vm08.stdout:1/444: creat d2/d6/de/d1f/d26/d58/d8c/f97 x:0 0 0 2026-03-10T07:51:31.753 INFO:tasks.workunit.client.1.vm08.stdout:1/445: dread - d2/d6/de/d47/f63 zero size 2026-03-10T07:51:31.767 INFO:tasks.workunit.client.1.vm08.stdout:3/482: unlink d0/d3c/d18/d32/f76 0 2026-03-10T07:51:31.777 INFO:teuthology.orchestra.run.vm05.stdout:NAME HOST PORTS STATUS REFRESHED AGE MEM USE MEM LIM VERSION IMAGE ID CONTAINER ID 2026-03-10T07:51:31.777 INFO:teuthology.orchestra.run.vm05.stdout:alertmanager.vm05 vm05 *:9093,9094 running (9s) 6s ago 5m 16.4M - 0.25.0 c8568f914cd2 ac15d5f35994 2026-03-10T07:51:31.777 INFO:teuthology.orchestra.run.vm05.stdout:ceph-exporter.vm05 vm05 running (5m) 6s ago 5m 8506k - 18.2.1 5be31c24972a 26c4db858175 2026-03-10T07:51:31.777 INFO:teuthology.orchestra.run.vm05.stdout:ceph-exporter.vm08 vm08 running (4m) 37s ago 4m 8409k - 18.2.1 5be31c24972a 209e2398a09c 2026-03-10T07:51:31.777 INFO:teuthology.orchestra.run.vm05.stdout:crash.vm05 
vm05 running (5m) 6s ago 5m 7419k - 18.2.1 5be31c24972a d3d7b92c8ac3 2026-03-10T07:51:31.777 INFO:teuthology.orchestra.run.vm05.stdout:crash.vm08 vm08 running (4m) 37s ago 4m 7415k - 18.2.1 5be31c24972a 96136e0195f7 2026-03-10T07:51:31.777 INFO:teuthology.orchestra.run.vm05.stdout:grafana.vm05 vm05 *:3000 running (4m) 6s ago 4m 98.7M - 9.4.7 954c08fa6188 35089be30fc6 2026-03-10T07:51:31.777 INFO:teuthology.orchestra.run.vm05.stdout:mds.cephfs.vm05.omfhnh vm05 running (3m) 6s ago 3m 255M - 18.2.1 5be31c24972a e23de179e09c 2026-03-10T07:51:31.777 INFO:teuthology.orchestra.run.vm05.stdout:mds.cephfs.vm05.pavqil vm05 running (3m) 6s ago 3m 15.6M - 18.2.1 5be31c24972a 5b9e5afa214c 2026-03-10T07:51:31.777 INFO:teuthology.orchestra.run.vm05.stdout:mds.cephfs.vm08.dgsaon vm08 running (3m) 37s ago 3m 16.4M - 18.2.1 5be31c24972a 1696aee522b5 2026-03-10T07:51:31.777 INFO:teuthology.orchestra.run.vm05.stdout:mds.cephfs.vm08.ybmbgd vm08 running (3m) 37s ago 3m 14.7M - 18.2.1 5be31c24972a 30b0e51cd2ed 2026-03-10T07:51:31.777 INFO:teuthology.orchestra.run.vm05.stdout:mgr.vm05.blexke vm05 *:8443,9283,8765 running (72s) 6s ago 5m 576M - 19.2.3-678-ge911bdeb 654f31e6858e 5467b6a3bd38 2026-03-10T07:51:31.777 INFO:teuthology.orchestra.run.vm05.stdout:mgr.vm08.orfpog vm08 *:8443,9283,8765 running (54s) 37s ago 4m 487M - 19.2.3-678-ge911bdeb 654f31e6858e 7a15d7811d5b 2026-03-10T07:51:31.777 INFO:teuthology.orchestra.run.vm05.stdout:mon.vm05 vm05 running (5m) 6s ago 5m 54.3M 2048M 18.2.1 5be31c24972a 2a459bf05146 2026-03-10T07:51:31.777 INFO:teuthology.orchestra.run.vm05.stdout:mon.vm08 vm08 running (4m) 37s ago 4m 42.1M 2048M 18.2.1 5be31c24972a e01dfb712474 2026-03-10T07:51:31.777 INFO:teuthology.orchestra.run.vm05.stdout:node-exporter.vm05 vm05 *:9100 running (43s) 6s ago 5m 9214k - 1.7.0 72c9c2088986 7cd0b23b4118 2026-03-10T07:51:31.777 INFO:teuthology.orchestra.run.vm05.stdout:node-exporter.vm08 vm08 *:9100 running (39s) 37s ago 4m 5368k - 1.7.0 72c9c2088986 3dd4d91d5881 
2026-03-10T07:51:31.778 INFO:teuthology.orchestra.run.vm05.stdout:osd.0 vm05 running (4m) 6s ago 4m 271M 4096M 18.2.1 5be31c24972a 9b7c5ea48cea 2026-03-10T07:51:31.778 INFO:teuthology.orchestra.run.vm05.stdout:osd.1 vm05 running (4m) 6s ago 4m 279M 4096M 18.2.1 5be31c24972a 88e0b65b2c93 2026-03-10T07:51:31.778 INFO:teuthology.orchestra.run.vm05.stdout:osd.2 vm05 running (3m) 6s ago 3m 238M 4096M 18.2.1 5be31c24972a b8d3cbeb49f1 2026-03-10T07:51:31.778 INFO:teuthology.orchestra.run.vm05.stdout:osd.3 vm08 running (3m) 37s ago 3m 257M 4096M 18.2.1 5be31c24972a 0a62c54a86c0 2026-03-10T07:51:31.778 INFO:teuthology.orchestra.run.vm05.stdout:osd.4 vm08 running (3m) 37s ago 3m 196M 4096M 18.2.1 5be31c24972a bd748b691ccd 2026-03-10T07:51:31.778 INFO:teuthology.orchestra.run.vm05.stdout:osd.5 vm08 running (3m) 37s ago 3m 220M 4096M 18.2.1 5be31c24972a 9f08820ae98b 2026-03-10T07:51:31.778 INFO:teuthology.orchestra.run.vm05.stdout:prometheus.vm05 vm05 *:9095 running (19s) 6s ago 4m 44.4M - 2.51.0 1d3b7f56885b c59a6be07563 2026-03-10T07:51:31.778 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:51:31.767+0000 7fa598ff9700 1 -- 192.168.123.105:0/3532010592 <== mgr.14628 v2:192.168.123.105:6800/413688438 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+3552 (secure 0 0 0) 0x7fa588000bf0 con 0x7fa58407fed0 2026-03-10T07:51:31.778 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:51:31.769+0000 7fa5827fc700 1 -- 192.168.123.105:0/3532010592 >> [v2:192.168.123.105:6800/413688438,v1:192.168.123.105:6801/413688438] conn(0x7fa58407fed0 msgr2=0x7fa584082390 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:51:31.778 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:51:31.769+0000 7fa5827fc700 1 --2- 192.168.123.105:0/3532010592 >> [v2:192.168.123.105:6800/413688438,v1:192.168.123.105:6801/413688438] conn(0x7fa58407fed0 0x7fa584082390 secure :-1 s=READY pgs=30 cs=0 l=1 rev1=1 crypto rx=0x7fa59c1aed00 tx=0x7fa58c00c040 comp rx=0 tx=0).stop 
2026-03-10T07:51:31.778 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:51:31.769+0000 7fa5827fc700 1 -- 192.168.123.105:0/3532010592 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fa59c083510 msgr2=0x7fa59c1b3080 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:51:31.778 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:51:31.769+0000 7fa5827fc700 1 --2- 192.168.123.105:0/3532010592 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fa59c083510 0x7fa59c1b3080 secure :-1 s=READY pgs=105 cs=0 l=1 rev1=1 crypto rx=0x7fa59400dac0 tx=0x7fa594009bd0 comp rx=0 tx=0).stop 2026-03-10T07:51:31.778 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:51:31.771+0000 7fa5827fc700 1 -- 192.168.123.105:0/3532010592 shutdown_connections 2026-03-10T07:51:31.778 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:51:31.771+0000 7fa5827fc700 1 --2- 192.168.123.105:0/3532010592 >> [v2:192.168.123.105:6800/413688438,v1:192.168.123.105:6801/413688438] conn(0x7fa58407fed0 0x7fa584082390 unknown :-1 s=CLOSED pgs=30 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:51:31.778 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:51:31.771+0000 7fa5827fc700 1 --2- 192.168.123.105:0/3532010592 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fa59c072b50 0x7fa59c082fd0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:51:31.778 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:51:31.771+0000 7fa5827fc700 1 --2- 192.168.123.105:0/3532010592 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fa59c083510 0x7fa59c1b3080 unknown :-1 s=CLOSED pgs=105 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:51:31.778 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:51:31.771+0000 7fa5827fc700 1 -- 192.168.123.105:0/3532010592 >> 192.168.123.105:0/3532010592 conn(0x7fa59c06dae0 msgr2=0x7fa59c06ff40 unknown :-1 
s=STATE_NONE l=0).mark_down 2026-03-10T07:51:31.778 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:51:31.772+0000 7fa5827fc700 1 -- 192.168.123.105:0/3532010592 shutdown_connections 2026-03-10T07:51:31.778 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:51:31.772+0000 7fa5827fc700 1 -- 192.168.123.105:0/3532010592 wait complete. 2026-03-10T07:51:31.781 INFO:tasks.workunit.client.1.vm08.stdout:4/374: getdents d5/d8/d9/d12/d7b/d48/d4f 0 2026-03-10T07:51:31.789 INFO:tasks.workunit.client.1.vm08.stdout:6/459: rename d1/d3/f19 to d1/d3/df/d1d/f9d 0 2026-03-10T07:51:31.790 INFO:tasks.workunit.client.1.vm08.stdout:6/460: write d1/d3/df/d1d/f94 [260717,7474] 0 2026-03-10T07:51:31.790 INFO:tasks.workunit.client.1.vm08.stdout:8/508: truncate d0/f22 1147298 0 2026-03-10T07:51:31.800 INFO:tasks.workunit.client.1.vm08.stdout:2/468: getdents d0/d1/d3/d10/d32/d61/d84 0 2026-03-10T07:51:31.808 INFO:tasks.workunit.client.1.vm08.stdout:0/450: write dd/d10/d14/d15/f41 [747476,17480] 0 2026-03-10T07:51:31.809 INFO:tasks.workunit.client.1.vm08.stdout:0/451: readlink dd/d10/d14/d15/l17 0 2026-03-10T07:51:31.810 INFO:tasks.workunit.client.1.vm08.stdout:7/450: getdents d3/da/d25 0 2026-03-10T07:51:31.810 INFO:tasks.workunit.client.1.vm08.stdout:0/452: dread - dd/d10/d2f/f8e zero size 2026-03-10T07:51:31.812 INFO:tasks.workunit.client.1.vm08.stdout:3/483: rename d0/d3c/d18/d48/d55/d56/f69 to d0/d3c/d18/d48/d55/d56/f9e 0 2026-03-10T07:51:31.812 INFO:tasks.workunit.client.1.vm08.stdout:4/375: creat d5/d1f/d41/f7f x:0 0 0 2026-03-10T07:51:31.814 INFO:tasks.workunit.client.1.vm08.stdout:5/479: dwrite d0/d4/d19/d43/f59 [0,4194304] 0 2026-03-10T07:51:31.822 INFO:tasks.workunit.client.1.vm08.stdout:7/451: dread d3/da/d25/d9/d2f/d39/d43/d4f/f68 [0,4194304] 0 2026-03-10T07:51:31.823 INFO:tasks.workunit.client.1.vm08.stdout:7/452: chown d3/da/c7f 29545 1 2026-03-10T07:51:31.825 INFO:tasks.workunit.client.1.vm08.stdout:6/461: unlink d1/d3/df/d1d/f94 0 2026-03-10T07:51:31.826 
INFO:tasks.workunit.client.1.vm08.stdout:6/462: chown d1/d3/df/d1d/d40/d45/c55 1800397 1 2026-03-10T07:51:31.826 INFO:tasks.workunit.client.1.vm08.stdout:9/446: creat d2/d86/d30/d35/d9b/fa1 x:0 0 0 2026-03-10T07:51:31.827 INFO:tasks.workunit.client.1.vm08.stdout:9/447: write d2/de/f1e [4587690,41908] 0 2026-03-10T07:51:31.831 INFO:tasks.workunit.client.1.vm08.stdout:9/448: write d2/d86/f90 [27606,8429] 0 2026-03-10T07:51:31.832 INFO:tasks.workunit.client.1.vm08.stdout:9/449: chown d2/d86/d30/f5d 216795185 1 2026-03-10T07:51:31.832 INFO:tasks.workunit.client.1.vm08.stdout:7/453: dwrite d3/da/d25/d9/d2f/d39/f56 [4194304,4194304] 0 2026-03-10T07:51:31.844 INFO:tasks.workunit.client.1.vm08.stdout:1/446: mkdir d2/d6/de/d1f/d26/d98 0 2026-03-10T07:51:31.848 INFO:tasks.workunit.client.1.vm08.stdout:2/469: truncate d0/f4a 1967542 0 2026-03-10T07:51:31.857 INFO:tasks.workunit.client.1.vm08.stdout:9/450: dread d2/fd [4194304,4194304] 0 2026-03-10T07:51:31.857 INFO:tasks.workunit.client.1.vm08.stdout:9/451: write d2/d86/f55 [1265671,102202] 0 2026-03-10T07:51:31.858 INFO:tasks.workunit.client.1.vm08.stdout:9/452: chown d2/d26/l2d 1720 1 2026-03-10T07:51:31.859 INFO:tasks.workunit.client.1.vm08.stdout:9/453: fsync d2/f5 0 2026-03-10T07:51:31.890 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:51:31.887+0000 7f1294db1700 1 -- 192.168.123.105:0/3115760878 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f1290075a10 msgr2=0x7f1290077ea0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:51:31.890 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:51:31.887+0000 7f1294db1700 1 --2- 192.168.123.105:0/3115760878 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f1290075a10 0x7f1290077ea0 secure :-1 s=READY pgs=322 cs=0 l=1 rev1=1 crypto rx=0x7f1280007780 tx=0x7f1280007a90 comp rx=0 tx=0).stop 2026-03-10T07:51:31.890 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:51:31.887+0000 7f1294db1700 1 -- 
192.168.123.105:0/3115760878 shutdown_connections 2026-03-10T07:51:31.890 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:51:31.887+0000 7f1294db1700 1 --2- 192.168.123.105:0/3115760878 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f1290075a10 0x7f1290077ea0 unknown :-1 s=CLOSED pgs=322 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:51:31.890 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:51:31.887+0000 7f1294db1700 1 --2- 192.168.123.105:0/3115760878 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f1290072b20 0x7f1290072f40 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:51:31.890 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:51:31.887+0000 7f1294db1700 1 -- 192.168.123.105:0/3115760878 >> 192.168.123.105:0/3115760878 conn(0x7f129006daa0 msgr2=0x7f129006ff00 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:51:31.890 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:51:31.887+0000 7f1294db1700 1 -- 192.168.123.105:0/3115760878 shutdown_connections 2026-03-10T07:51:31.890 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:51:31.887+0000 7f1294db1700 1 -- 192.168.123.105:0/3115760878 wait complete. 
2026-03-10T07:51:31.890 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:51:31.887+0000 7f1294db1700 1 Processor -- start 2026-03-10T07:51:31.890 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:51:31.887+0000 7f1294db1700 1 -- start start 2026-03-10T07:51:31.890 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:51:31.887+0000 7f1294db1700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f1290072b20 0x7f12900830b0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:51:31.890 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:51:31.887+0000 7f1294db1700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f12900835f0 0x7f129012e490 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:51:31.890 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:51:31.887+0000 7f1294db1700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f1290083a70 con 0x7f12900835f0 2026-03-10T07:51:31.890 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:51:31.887+0000 7f1294db1700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f1290083be0 con 0x7f1290072b20 2026-03-10T07:51:31.891 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:51:31.890+0000 7f128f7fe700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f1290072b20 0x7f12900830b0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:51:31.891 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:51:31.890+0000 7f128f7fe700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f1290072b20 0x7f12900830b0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.108:3300/0 says I am 
v2:192.168.123.105:53856/0 (socket says 192.168.123.105:53856) 2026-03-10T07:51:31.891 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:51:31.890+0000 7f128f7fe700 1 -- 192.168.123.105:0/2462813532 learned_addr learned my addr 192.168.123.105:0/2462813532 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T07:51:31.891 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:51:31.890+0000 7f128effd700 1 --2- 192.168.123.105:0/2462813532 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f12900835f0 0x7f129012e490 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:51:31.893 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:51:31.891+0000 7f128f7fe700 1 -- 192.168.123.105:0/2462813532 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f12900835f0 msgr2=0x7f129012e490 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:51:31.893 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:51:31.891+0000 7f128f7fe700 1 --2- 192.168.123.105:0/2462813532 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f12900835f0 0x7f129012e490 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:51:31.893 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:51:31.891+0000 7f128f7fe700 1 -- 192.168.123.105:0/2462813532 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f1280007430 con 0x7f1290072b20 2026-03-10T07:51:31.893 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:51:31.892+0000 7f128f7fe700 1 --2- 192.168.123.105:0/2462813532 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f1290072b20 0x7f12900830b0 secure :-1 s=READY pgs=106 cs=0 l=1 rev1=1 crypto rx=0x7f128800c390 tx=0x7f128800c6a0 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 
2026-03-10T07:51:31.895 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:51:31.893+0000 7f128cff9700 1 -- 192.168.123.105:0/2462813532 <== mon.1 v2:192.168.123.108:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f128800e030 con 0x7f1290072b20 2026-03-10T07:51:31.895 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:51:31.893+0000 7f1294db1700 1 -- 192.168.123.105:0/2462813532 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f129012e9d0 con 0x7f1290072b20 2026-03-10T07:51:31.895 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:51:31.893+0000 7f1294db1700 1 -- 192.168.123.105:0/2462813532 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f129012eed0 con 0x7f1290072b20 2026-03-10T07:51:31.895 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:51:31.893+0000 7f128cff9700 1 -- 192.168.123.105:0/2462813532 <== mon.1 v2:192.168.123.108:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f128800f040 con 0x7f1290072b20 2026-03-10T07:51:31.895 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:51:31.893+0000 7f128cff9700 1 -- 192.168.123.105:0/2462813532 <== mon.1 v2:192.168.123.108:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f1288014650 con 0x7f1290072b20 2026-03-10T07:51:31.895 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:51:31.894+0000 7f12767fc700 1 -- 192.168.123.105:0/2462813532 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f127c005320 con 0x7f1290072b20 2026-03-10T07:51:31.898 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:51:31.896+0000 7f128cff9700 1 -- 192.168.123.105:0/2462813532 <== mon.1 v2:192.168.123.108:3300/0 4 ==== mgrmap(e 30) v1 ==== 100066+0+0 (secure 0 0 0) 0x7f1288009110 con 0x7f1290072b20 2026-03-10T07:51:31.898 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:51:31.896+0000 
7f128cff9700 1 --2- 192.168.123.105:0/2462813532 >> [v2:192.168.123.105:6800/413688438,v1:192.168.123.105:6801/413688438] conn(0x7f1278077790 0x7f1278079c50 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:51:31.898 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:51:31.896+0000 7f128cff9700 1 -- 192.168.123.105:0/2462813532 <== mon.1 v2:192.168.123.108:3300/0 5 ==== osd_map(42..42 src has 1..42) v4 ==== 5878+0+0 (secure 0 0 0) 0x7f1288099a90 con 0x7f1290072b20 2026-03-10T07:51:31.898 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:51:31.896+0000 7f128effd700 1 --2- 192.168.123.105:0/2462813532 >> [v2:192.168.123.105:6800/413688438,v1:192.168.123.105:6801/413688438] conn(0x7f1278077790 0x7f1278079c50 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:51:31.899 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:51:31.897+0000 7f128cff9700 1 -- 192.168.123.105:0/2462813532 <== mon.1 v2:192.168.123.108:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+191549 (secure 0 0 0) 0x7f1288061ed0 con 0x7f1290072b20 2026-03-10T07:51:31.900 INFO:tasks.workunit.client.1.vm08.stdout:5/480: creat d0/d4/d19/fa0 x:0 0 0 2026-03-10T07:51:31.912 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:51:31.908+0000 7f128effd700 1 --2- 192.168.123.105:0/2462813532 >> [v2:192.168.123.105:6800/413688438,v1:192.168.123.105:6801/413688438] conn(0x7f1278077790 0x7f1278079c50 secure :-1 s=READY pgs=31 cs=0 l=1 rev1=1 crypto rx=0x7f1280007750 tx=0x7f1280007de0 comp rx=0 tx=0).ready entity=mgr.14628 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:51:31.936 INFO:tasks.workunit.client.1.vm08.stdout:3/484: unlink d0/d3c/d18/d48/d55/d56/f92 0 2026-03-10T07:51:31.973 INFO:tasks.workunit.client.1.vm08.stdout:1/447: creat d2/d10/f99 x:0 0 0 2026-03-10T07:51:31.973 
INFO:tasks.workunit.client.1.vm08.stdout:2/470: write d0/d1/d3/d39/d7d/d86/f7f [584533,3079] 0 2026-03-10T07:51:31.988 INFO:tasks.workunit.client.1.vm08.stdout:5/481: creat d0/d4/d19/d3a/fa1 x:0 0 0 2026-03-10T07:51:32.001 INFO:tasks.workunit.client.1.vm08.stdout:3/485: unlink d0/d3c/d1f/d44/d51/f71 0 2026-03-10T07:51:32.023 INFO:tasks.workunit.client.1.vm08.stdout:4/376: link d5/d8/d9/l5c d5/d8/d9/d32/l80 0 2026-03-10T07:51:32.060 INFO:tasks.workunit.client.1.vm08.stdout:7/454: rename d3/da/c7f to d3/da/d25/d9/d2f/d3a/c9a 0 2026-03-10T07:51:32.061 INFO:tasks.workunit.client.1.vm08.stdout:7/455: chown d3/da/d25/d9/d2f/d39/l95 44501 1 2026-03-10T07:51:32.064 INFO:tasks.workunit.client.1.vm08.stdout:7/456: dwrite d3/f4 [4194304,4194304] 0 2026-03-10T07:51:32.069 INFO:tasks.workunit.client.1.vm08.stdout:9/454: creat d2/fa2 x:0 0 0 2026-03-10T07:51:32.084 INFO:tasks.workunit.client.1.vm08.stdout:9/455: sync 2026-03-10T07:51:32.110 INFO:tasks.workunit.client.1.vm08.stdout:4/377: fsync d5/d8/d9/f1b 0 2026-03-10T07:51:32.127 INFO:tasks.workunit.client.1.vm08.stdout:7/457: mknod d3/da/d25/d9/d2f/d39/d43/d4f/c9b 0 2026-03-10T07:51:32.130 INFO:tasks.workunit.client.1.vm08.stdout:9/456: fdatasync d2/d58/f9f 0 2026-03-10T07:51:32.131 INFO:tasks.workunit.client.1.vm08.stdout:6/463: write d1/db/d24/d73/d79/d7c/f7f [90790,40204] 0 2026-03-10T07:51:32.134 INFO:tasks.workunit.client.1.vm08.stdout:6/464: dread d1/d3/d3e/f43 [0,4194304] 0 2026-03-10T07:51:32.134 INFO:tasks.workunit.client.1.vm08.stdout:6/465: stat d1/d3 0 2026-03-10T07:51:32.135 INFO:tasks.workunit.client.1.vm08.stdout:6/466: truncate d1/d3/d3e/f43 1025260 0 2026-03-10T07:51:32.139 INFO:tasks.workunit.client.1.vm08.stdout:8/509: dwrite d0/df/d15/d23/f3d [0,4194304] 0 2026-03-10T07:51:32.143 INFO:tasks.workunit.client.1.vm08.stdout:0/453: truncate dd/f44 1446431 0 2026-03-10T07:51:32.147 INFO:tasks.workunit.client.1.vm08.stdout:0/454: dwrite dd/d10/d2f/d37/d64/d95/d5c/f63 [0,4194304] 0 2026-03-10T07:51:32.162 
INFO:tasks.workunit.client.1.vm08.stdout:5/482: write d0/d4/df/f27 [387913,130097] 0 2026-03-10T07:51:32.163 INFO:tasks.workunit.client.1.vm08.stdout:5/483: write d0/d4/df/f7f [155469,74163] 0 2026-03-10T07:51:32.164 INFO:tasks.workunit.client.1.vm08.stdout:5/484: dread - d0/d4/d19/d60/d6d/d70/f67 zero size 2026-03-10T07:51:32.167 INFO:tasks.workunit.client.1.vm08.stdout:3/486: write d0/d3c/d1f/d44/d51/d34/f47 [4561743,43889] 0 2026-03-10T07:51:32.170 INFO:tasks.workunit.client.1.vm08.stdout:2/471: dwrite d0/d1/d17/f1a [0,4194304] 0 2026-03-10T07:51:32.190 INFO:tasks.workunit.client.1.vm08.stdout:4/378: symlink d5/d8/d9/l81 0 2026-03-10T07:51:32.192 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:51:32.190+0000 7f12767fc700 1 -- 192.168.123.105:0/2462813532 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "versions"} v 0) v1 -- 0x7f127c005cc0 con 0x7f1290072b20 2026-03-10T07:51:32.199 INFO:teuthology.orchestra.run.vm05.stdout:{ 2026-03-10T07:51:32.199 INFO:teuthology.orchestra.run.vm05.stdout: "mon": { 2026-03-10T07:51:32.199 INFO:teuthology.orchestra.run.vm05.stdout: "ceph version 18.2.1 (7fe91d5d5842e04be3b4f514d6dd990c54b29c76) reef (stable)": 2 2026-03-10T07:51:32.199 INFO:teuthology.orchestra.run.vm05.stdout: }, 2026-03-10T07:51:32.200 INFO:teuthology.orchestra.run.vm05.stdout: "mgr": { 2026-03-10T07:51:32.200 INFO:teuthology.orchestra.run.vm05.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 2 2026-03-10T07:51:32.200 INFO:teuthology.orchestra.run.vm05.stdout: }, 2026-03-10T07:51:32.200 INFO:teuthology.orchestra.run.vm05.stdout: "osd": { 2026-03-10T07:51:32.200 INFO:teuthology.orchestra.run.vm05.stdout: "ceph version 18.2.1 (7fe91d5d5842e04be3b4f514d6dd990c54b29c76) reef (stable)": 6 2026-03-10T07:51:32.200 INFO:teuthology.orchestra.run.vm05.stdout: }, 2026-03-10T07:51:32.200 INFO:teuthology.orchestra.run.vm05.stdout: "mds": { 2026-03-10T07:51:32.200 
INFO:teuthology.orchestra.run.vm05.stdout: "ceph version 18.2.1 (7fe91d5d5842e04be3b4f514d6dd990c54b29c76) reef (stable)": 4 2026-03-10T07:51:32.200 INFO:teuthology.orchestra.run.vm05.stdout: }, 2026-03-10T07:51:32.200 INFO:teuthology.orchestra.run.vm05.stdout: "overall": { 2026-03-10T07:51:32.200 INFO:teuthology.orchestra.run.vm05.stdout: "ceph version 18.2.1 (7fe91d5d5842e04be3b4f514d6dd990c54b29c76) reef (stable)": 12, 2026-03-10T07:51:32.200 INFO:teuthology.orchestra.run.vm05.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 2 2026-03-10T07:51:32.200 INFO:teuthology.orchestra.run.vm05.stdout: } 2026-03-10T07:51:32.200 INFO:teuthology.orchestra.run.vm05.stdout:} 2026-03-10T07:51:32.200 INFO:tasks.workunit.client.1.vm08.stdout:6/467: unlink d1/d3/df/d52/f9c 0 2026-03-10T07:51:32.200 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:51:32.194+0000 7f128cff9700 1 -- 192.168.123.105:0/2462813532 <== mon.1 v2:192.168.123.108:3300/0 7 ==== mon_command_ack([{"prefix": "versions"}]=0 v0) v1 ==== 56+0+679 (secure 0 0 0) 0x7f1288061cf0 con 0x7f1290072b20 2026-03-10T07:51:32.202 INFO:tasks.workunit.client.1.vm08.stdout:6/468: write d1/f35 [2994481,86403] 0 2026-03-10T07:51:32.203 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:51:32.199+0000 7f1294db1700 1 -- 192.168.123.105:0/2462813532 >> [v2:192.168.123.105:6800/413688438,v1:192.168.123.105:6801/413688438] conn(0x7f1278077790 msgr2=0x7f1278079c50 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:51:32.203 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:51:32.199+0000 7f1294db1700 1 --2- 192.168.123.105:0/2462813532 >> [v2:192.168.123.105:6800/413688438,v1:192.168.123.105:6801/413688438] conn(0x7f1278077790 0x7f1278079c50 secure :-1 s=READY pgs=31 cs=0 l=1 rev1=1 crypto rx=0x7f1280007750 tx=0x7f1280007de0 comp rx=0 tx=0).stop 2026-03-10T07:51:32.203 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:51:32.199+0000 
7f1294db1700 1 -- 192.168.123.105:0/2462813532 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f1290072b20 msgr2=0x7f12900830b0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:51:32.203 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:51:32.199+0000 7f1294db1700 1 --2- 192.168.123.105:0/2462813532 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f1290072b20 0x7f12900830b0 secure :-1 s=READY pgs=106 cs=0 l=1 rev1=1 crypto rx=0x7f128800c390 tx=0x7f128800c6a0 comp rx=0 tx=0).stop 2026-03-10T07:51:32.203 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:51:32.199+0000 7f1294db1700 1 -- 192.168.123.105:0/2462813532 shutdown_connections 2026-03-10T07:51:32.203 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:51:32.199+0000 7f1294db1700 1 --2- 192.168.123.105:0/2462813532 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f1290072b20 0x7f12900830b0 unknown :-1 s=CLOSED pgs=106 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:51:32.203 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:51:32.199+0000 7f1294db1700 1 --2- 192.168.123.105:0/2462813532 >> [v2:192.168.123.105:6800/413688438,v1:192.168.123.105:6801/413688438] conn(0x7f1278077790 0x7f1278079c50 unknown :-1 s=CLOSED pgs=31 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:51:32.203 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:51:32.199+0000 7f1294db1700 1 --2- 192.168.123.105:0/2462813532 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f12900835f0 0x7f129012e490 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:51:32.203 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:51:32.199+0000 7f1294db1700 1 -- 192.168.123.105:0/2462813532 >> 192.168.123.105:0/2462813532 conn(0x7f129006daa0 msgr2=0x7f129006ff00 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:51:32.203 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:51:32.199+0000 7f1294db1700 1 -- 192.168.123.105:0/2462813532 shutdown_connections 2026-03-10T07:51:32.203 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:51:32.199+0000 7f1294db1700 1 -- 192.168.123.105:0/2462813532 wait complete. 2026-03-10T07:51:32.208 INFO:tasks.workunit.client.1.vm08.stdout:1/448: getdents d2/d6 0 2026-03-10T07:51:32.214 INFO:tasks.workunit.client.1.vm08.stdout:9/457: dread d2/d3/f8e [0,4194304] 0 2026-03-10T07:51:32.214 INFO:tasks.workunit.client.1.vm08.stdout:9/458: read - d2/fa2 zero size 2026-03-10T07:51:32.215 INFO:tasks.workunit.client.1.vm08.stdout:9/459: write d2/d86/f90 [889650,111781] 0 2026-03-10T07:51:32.225 INFO:tasks.workunit.client.1.vm08.stdout:8/510: symlink d0/df/d17/d7a/d89/laa 0 2026-03-10T07:51:32.226 INFO:tasks.workunit.client.1.vm08.stdout:0/455: write dd/d10/d2f/d37/d64/d95/f2a [1424579,103686] 0 2026-03-10T07:51:32.227 INFO:tasks.workunit.client.1.vm08.stdout:3/487: mknod d0/d3c/d18/d32/d61/c9f 0 2026-03-10T07:51:32.228 INFO:tasks.workunit.client.1.vm08.stdout:3/488: write d0/d3c/d18/d32/d61/d83/f8b [96735,5710] 0 2026-03-10T07:51:32.231 INFO:tasks.workunit.client.1.vm08.stdout:5/485: dwrite d0/d4/d19/d81/d92/f65 [0,4194304] 0 2026-03-10T07:51:32.232 INFO:tasks.workunit.client.1.vm08.stdout:5/486: chown d0/d4/df/d1e/f25 27618571 1 2026-03-10T07:51:32.232 INFO:tasks.workunit.client.1.vm08.stdout:5/487: fdatasync d0/d4/d19/d50/f8a 0 2026-03-10T07:51:32.240 INFO:tasks.workunit.client.1.vm08.stdout:2/472: unlink d0/d1/d3/d39/d7d/d7e/c8d 0 2026-03-10T07:51:32.240 INFO:tasks.workunit.client.1.vm08.stdout:2/473: dread - d0/f81 zero size 2026-03-10T07:51:32.269 INFO:tasks.workunit.client.1.vm08.stdout:7/458: mknod d3/da/d25/d9/d2f/d3a/c9c 0 2026-03-10T07:51:32.269 INFO:tasks.workunit.client.1.vm08.stdout:7/459: chown d3/da/d25/d9/d2f/d6c 315251 1 2026-03-10T07:51:32.271 INFO:tasks.workunit.client.1.vm08.stdout:7/460: truncate d3/da/d25/d9/d2f/d3a/d4b/f8d 551699 0 
2026-03-10T07:51:32.304 INFO:tasks.workunit.client.1.vm08.stdout:1/449: rename d2/d6/de/d1f/d26/d58/d8c/l23 to d2/d6/de/d1f/d26/d89/l9a 0 2026-03-10T07:51:32.307 INFO:tasks.workunit.client.1.vm08.stdout:1/450: dwrite d2/d6/de/d1f/d26/d58/d8c/f87 [0,4194304] 0 2026-03-10T07:51:32.310 INFO:tasks.workunit.client.1.vm08.stdout:1/451: truncate d2/d6/de/d1f/f78 5203686 0 2026-03-10T07:51:32.322 INFO:tasks.workunit.client.1.vm08.stdout:4/379: dread d5/d1f/d31/f33 [4194304,4194304] 0 2026-03-10T07:51:32.322 INFO:tasks.workunit.client.1.vm08.stdout:9/460: rmdir d2/d3/d84/d91 39 2026-03-10T07:51:32.327 INFO:tasks.workunit.client.1.vm08.stdout:9/461: dread d2/d86/d30/f70 [0,4194304] 0 2026-03-10T07:51:32.330 INFO:tasks.workunit.client.1.vm08.stdout:4/380: dread d5/d8/d9/d12/d7b/d48/f5b [0,4194304] 0 2026-03-10T07:51:32.344 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:51:32.340+0000 7f83187c0700 1 -- 192.168.123.105:0/1066836833 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f8310075a40 msgr2=0x7f8310077ed0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:51:32.344 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:51:32.340+0000 7f83187c0700 1 --2- 192.168.123.105:0/1066836833 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f8310075a40 0x7f8310077ed0 secure :-1 s=READY pgs=107 cs=0 l=1 rev1=1 crypto rx=0x7f830800cd40 tx=0x7f830800a320 comp rx=0 tx=0).stop 2026-03-10T07:51:32.344 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:51:32.340+0000 7f83187c0700 1 -- 192.168.123.105:0/1066836833 shutdown_connections 2026-03-10T07:51:32.344 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:51:32.340+0000 7f83187c0700 1 --2- 192.168.123.105:0/1066836833 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f8310075a40 0x7f8310077ed0 unknown :-1 s=CLOSED pgs=107 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:51:32.344 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:51:32.340+0000 7f83187c0700 1 --2- 192.168.123.105:0/1066836833 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f8310072b50 0x7f8310072f70 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:51:32.344 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:51:32.340+0000 7f83187c0700 1 -- 192.168.123.105:0/1066836833 >> 192.168.123.105:0/1066836833 conn(0x7f831006dae0 msgr2=0x7f831006ff40 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:51:32.346 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:51:32 vm05.local ceph-mon[50387]: from='client.24473 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T07:51:32.346 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:51:32 vm05.local ceph-mon[50387]: pgmap v35: 65 pgs: 65 active+clean; 2.8 GiB data, 9.3 GiB used, 111 GiB / 120 GiB avail; 21 MiB/s rd, 115 MiB/s wr, 215 op/s 2026-03-10T07:51:32.346 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:51:32.340+0000 7f83187c0700 1 -- 192.168.123.105:0/1066836833 shutdown_connections 2026-03-10T07:51:32.346 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:51:32.340+0000 7f83187c0700 1 -- 192.168.123.105:0/1066836833 wait complete. 
2026-03-10T07:51:32.346 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:51:32.341+0000 7f83187c0700 1 Processor -- start 2026-03-10T07:51:32.346 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:51:32.341+0000 7f83187c0700 1 -- start start 2026-03-10T07:51:32.346 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:51:32.341+0000 7f83187c0700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f8310072b50 0x7f8310082f70 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:51:32.346 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:51:32.341+0000 7f83187c0700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f8310075a40 0x7f83100834b0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:51:32.346 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:51:32.341+0000 7f83187c0700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f8310083ad0 con 0x7f8310075a40 2026-03-10T07:51:32.346 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:51:32.341+0000 7f83187c0700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f8310083c10 con 0x7f8310072b50 2026-03-10T07:51:32.346 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:51:32.341+0000 7f831655c700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f8310072b50 0x7f8310082f70 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:51:32.346 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:51:32.341+0000 7f831655c700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f8310072b50 0x7f8310082f70 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.108:3300/0 says I am 
v2:192.168.123.105:53864/0 (socket says 192.168.123.105:53864) 2026-03-10T07:51:32.346 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:51:32.341+0000 7f831655c700 1 -- 192.168.123.105:0/3520276755 learned_addr learned my addr 192.168.123.105:0/3520276755 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T07:51:32.346 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:51:32.341+0000 7f831655c700 1 -- 192.168.123.105:0/3520276755 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f8310075a40 msgr2=0x7f83100834b0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:51:32.346 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:51:32.341+0000 7f831655c700 1 --2- 192.168.123.105:0/3520276755 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f8310075a40 0x7f83100834b0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:51:32.346 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:51:32.341+0000 7f831655c700 1 -- 192.168.123.105:0/3520276755 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f830800c9f0 con 0x7f8310072b50 2026-03-10T07:51:32.346 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:51:32.341+0000 7f831655c700 1 --2- 192.168.123.105:0/3520276755 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f8310072b50 0x7f8310082f70 secure :-1 s=READY pgs=108 cs=0 l=1 rev1=1 crypto rx=0x7f830c00eb10 tx=0x7f830c00eed0 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:51:32.346 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:51:32.341+0000 7f83077fe700 1 -- 192.168.123.105:0/3520276755 <== mon.1 v2:192.168.123.108:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f830c00cca0 con 0x7f8310072b50 2026-03-10T07:51:32.347 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:51:32.341+0000 7f83187c0700 1 -- 
192.168.123.105:0/3520276755 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f831012e490 con 0x7f8310072b50 2026-03-10T07:51:32.347 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:51:32.341+0000 7f83187c0700 1 -- 192.168.123.105:0/3520276755 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f831012e9e0 con 0x7f8310072b50 2026-03-10T07:51:32.347 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:51:32.342+0000 7f83077fe700 1 -- 192.168.123.105:0/3520276755 <== mon.1 v2:192.168.123.108:3300/0 2 ==== config(28 keys) v1 ==== 1139+0+0 (secure 0 0 0) 0x7f830c00ce00 con 0x7f8310072b50 2026-03-10T07:51:32.347 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:51:32.342+0000 7f83077fe700 1 -- 192.168.123.105:0/3520276755 <== mon.1 v2:192.168.123.108:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f830c018910 con 0x7f8310072b50 2026-03-10T07:51:32.347 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:51:32.342+0000 7f83187c0700 1 -- 192.168.123.105:0/3520276755 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f82f4005320 con 0x7f8310072b50 2026-03-10T07:51:32.347 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:51:32.343+0000 7f83077fe700 1 -- 192.168.123.105:0/3520276755 <== mon.1 v2:192.168.123.108:3300/0 4 ==== mgrmap(e 30) v1 ==== 100066+0+0 (secure 0 0 0) 0x7f830c010c80 con 0x7f8310072b50 2026-03-10T07:51:32.347 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:51:32.344+0000 7f83077fe700 1 --2- 192.168.123.105:0/3520276755 >> [v2:192.168.123.105:6800/413688438,v1:192.168.123.105:6801/413688438] conn(0x7f82fc0776d0 0x7f82fc079b90 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:51:32.347 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:51:32.344+0000 7f83077fe700 1 -- 
192.168.123.105:0/3520276755 <== mon.1 v2:192.168.123.108:3300/0 5 ==== osd_map(42..42 src has 1..42) v4 ==== 5878+0+0 (secure 0 0 0) 0x7f830c014070 con 0x7f8310072b50 2026-03-10T07:51:32.347 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:51:32.344+0000 7f8315d5b700 1 --2- 192.168.123.105:0/3520276755 >> [v2:192.168.123.105:6800/413688438,v1:192.168.123.105:6801/413688438] conn(0x7f82fc0776d0 0x7f82fc079b90 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:51:32.347 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:51:32.344+0000 7f8315d5b700 1 --2- 192.168.123.105:0/3520276755 >> [v2:192.168.123.105:6800/413688438,v1:192.168.123.105:6801/413688438] conn(0x7f82fc0776d0 0x7f82fc079b90 secure :-1 s=READY pgs=32 cs=0 l=1 rev1=1 crypto rx=0x7f8308009fd0 tx=0x7f8308006000 comp rx=0 tx=0).ready entity=mgr.14628 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:51:32.354 INFO:tasks.workunit.client.1.vm08.stdout:3/489: write d0/d3c/d18/f38 [2029109,47801] 0 2026-03-10T07:51:32.355 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:51:32.346+0000 7f83077fe700 1 -- 192.168.123.105:0/3520276755 <== mon.1 v2:192.168.123.108:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+191549 (secure 0 0 0) 0x7f830c0632e0 con 0x7f8310072b50 2026-03-10T07:51:32.359 INFO:tasks.workunit.client.1.vm08.stdout:3/490: dread d0/d3c/d1f/d44/f59 [0,4194304] 0 2026-03-10T07:51:32.359 INFO:tasks.workunit.client.1.vm08.stdout:3/491: dread - d0/d3c/d18/d80/f86 zero size 2026-03-10T07:51:32.360 INFO:tasks.workunit.client.1.vm08.stdout:3/492: read - d0/d3c/d18/d4a/f8a zero size 2026-03-10T07:51:32.364 INFO:tasks.workunit.client.1.vm08.stdout:5/488: mknod d0/d4/df/d12/d22/ca2 0 2026-03-10T07:51:32.367 INFO:tasks.workunit.client.1.vm08.stdout:5/489: dwrite d0/d4/d19/d43/f59 [0,4194304] 0 2026-03-10T07:51:32.368 
INFO:tasks.workunit.client.1.vm08.stdout:5/490: chown d0/d4/d19/d60 96 1 2026-03-10T07:51:32.369 INFO:tasks.workunit.client.1.vm08.stdout:5/491: fsync d0/d4/d19/d81/d92/f74 0 2026-03-10T07:51:32.401 INFO:tasks.workunit.client.1.vm08.stdout:0/456: rename dd/c50 to dd/d10/d14/d15/d20/d22/c99 0 2026-03-10T07:51:32.409 INFO:tasks.workunit.client.1.vm08.stdout:0/457: dread dd/d10/d14/d15/d20/d5f/f7f [0,4194304] 0 2026-03-10T07:51:32.417 INFO:tasks.workunit.client.1.vm08.stdout:0/458: dread dd/d10/d2f/d37/d64/d95/f48 [0,4194304] 0 2026-03-10T07:51:32.418 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:51:32 vm08.local ceph-mon[59917]: from='client.24473 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T07:51:32.418 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:51:32 vm08.local ceph-mon[59917]: pgmap v35: 65 pgs: 65 active+clean; 2.8 GiB data, 9.3 GiB used, 111 GiB / 120 GiB avail; 21 MiB/s rd, 115 MiB/s wr, 215 op/s 2026-03-10T07:51:32.421 INFO:tasks.workunit.client.1.vm08.stdout:8/511: creat d0/df/d15/d9c/fab x:0 0 0 2026-03-10T07:51:32.425 INFO:tasks.workunit.client.1.vm08.stdout:8/512: dwrite d0/df/d5d/f81 [0,4194304] 0 2026-03-10T07:51:32.437 INFO:tasks.workunit.client.1.vm08.stdout:2/474: getdents d0/d1/d3/d10/d32/d61/d84 0 2026-03-10T07:51:32.440 INFO:tasks.workunit.client.1.vm08.stdout:2/475: dwrite d0/d1/d17/d6b/f6e [0,4194304] 0 2026-03-10T07:51:32.453 INFO:tasks.workunit.client.1.vm08.stdout:5/492: unlink d0/d4/df/d12/d22/f84 0 2026-03-10T07:51:32.461 INFO:tasks.workunit.client.1.vm08.stdout:1/452: mkdir d2/d6/de/d1f/d26/d98/d9b 0 2026-03-10T07:51:32.473 INFO:tasks.workunit.client.1.vm08.stdout:8/513: creat d0/df/d17/d7a/d89/fac x:0 0 0 2026-03-10T07:51:32.477 INFO:tasks.workunit.client.1.vm08.stdout:3/493: unlink d0/d3c/ce 0 2026-03-10T07:51:32.477 INFO:tasks.workunit.client.1.vm08.stdout:3/494: fsync d0/f45 0 2026-03-10T07:51:32.480 INFO:tasks.workunit.client.1.vm08.stdout:7/461: 
truncate d3/da/d25/d9/d2f/d39/f76 1229758 0 2026-03-10T07:51:32.482 INFO:tasks.workunit.client.1.vm08.stdout:3/495: dwrite d0/d3c/d18/d4a/f8a [0,4194304] 0 2026-03-10T07:51:32.486 INFO:tasks.workunit.client.1.vm08.stdout:2/476: symlink d0/d1/d3/d39/d7d/d86/d55/d7a/l99 0 2026-03-10T07:51:32.486 INFO:tasks.workunit.client.1.vm08.stdout:3/496: truncate d0/f45 1291354 0 2026-03-10T07:51:32.487 INFO:tasks.workunit.client.1.vm08.stdout:8/514: dread d0/df/f19 [0,4194304] 0 2026-03-10T07:51:32.487 INFO:tasks.workunit.client.1.vm08.stdout:8/515: chown d0/df/d2e/f9e 30622496 1 2026-03-10T07:51:32.488 INFO:tasks.workunit.client.1.vm08.stdout:8/516: write d0/fa [677602,127092] 0 2026-03-10T07:51:32.489 INFO:tasks.workunit.client.1.vm08.stdout:6/469: getdents d1/db/d24/d73 0 2026-03-10T07:51:32.493 INFO:tasks.workunit.client.1.vm08.stdout:5/493: creat d0/d4/d19/d3a/d69/fa3 x:0 0 0 2026-03-10T07:51:32.493 INFO:tasks.workunit.client.1.vm08.stdout:5/494: stat d0/d4/d19/d3a/d69 0 2026-03-10T07:51:32.494 INFO:tasks.workunit.client.1.vm08.stdout:5/495: chown d0/d8/d5e/c89 15 1 2026-03-10T07:51:32.494 INFO:tasks.workunit.client.1.vm08.stdout:5/496: stat d0/d4/df/d12/f88 0 2026-03-10T07:51:32.533 INFO:tasks.workunit.client.1.vm08.stdout:0/459: dwrite dd/f44 [0,4194304] 0 2026-03-10T07:51:32.548 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:51:32.545+0000 7f83187c0700 1 -- 192.168.123.105:0/3520276755 --> [v2:192.168.123.105:6800/413688438,v1:192.168.123.105:6801/413688438] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7f82f4000bf0 con 0x7f82fc0776d0 2026-03-10T07:51:32.549 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:51:32.547+0000 7f83077fe700 1 -- 192.168.123.105:0/3520276755 <== mgr.14628 v2:192.168.123.105:6800/413688438 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+364 (secure 0 0 0) 0x7f82f4000bf0 con 0x7f82fc0776d0 2026-03-10T07:51:32.549 INFO:teuthology.orchestra.run.vm05.stdout:{ 2026-03-10T07:51:32.549 
INFO:teuthology.orchestra.run.vm05.stdout: "target_image": "quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc", 2026-03-10T07:51:32.549 INFO:teuthology.orchestra.run.vm05.stdout: "in_progress": true, 2026-03-10T07:51:32.549 INFO:teuthology.orchestra.run.vm05.stdout: "which": "Upgrading daemons of type(s) mgr", 2026-03-10T07:51:32.549 INFO:teuthology.orchestra.run.vm05.stdout: "services_complete": [ 2026-03-10T07:51:32.549 INFO:teuthology.orchestra.run.vm05.stdout: "mgr" 2026-03-10T07:51:32.549 INFO:teuthology.orchestra.run.vm05.stdout: ], 2026-03-10T07:51:32.549 INFO:teuthology.orchestra.run.vm05.stdout: "progress": "2/2 daemons upgraded", 2026-03-10T07:51:32.549 INFO:teuthology.orchestra.run.vm05.stdout: "message": "Currently upgrading grafana daemons", 2026-03-10T07:51:32.549 INFO:teuthology.orchestra.run.vm05.stdout: "is_paused": false 2026-03-10T07:51:32.549 INFO:teuthology.orchestra.run.vm05.stdout:} 2026-03-10T07:51:32.552 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:51:32.551+0000 7f83057fa700 1 -- 192.168.123.105:0/3520276755 >> [v2:192.168.123.105:6800/413688438,v1:192.168.123.105:6801/413688438] conn(0x7f82fc0776d0 msgr2=0x7f82fc079b90 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:51:32.556 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:51:32.552+0000 7f83057fa700 1 --2- 192.168.123.105:0/3520276755 >> [v2:192.168.123.105:6800/413688438,v1:192.168.123.105:6801/413688438] conn(0x7f82fc0776d0 0x7f82fc079b90 secure :-1 s=READY pgs=32 cs=0 l=1 rev1=1 crypto rx=0x7f8308009fd0 tx=0x7f8308006000 comp rx=0 tx=0).stop 2026-03-10T07:51:32.556 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:51:32.552+0000 7f83057fa700 1 -- 192.168.123.105:0/3520276755 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f8310072b50 msgr2=0x7f8310082f70 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:51:32.556 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:51:32.552+0000 7f83057fa700 1 --2- 192.168.123.105:0/3520276755 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f8310072b50 0x7f8310082f70 secure :-1 s=READY pgs=108 cs=0 l=1 rev1=1 crypto rx=0x7f830c00eb10 tx=0x7f830c00eed0 comp rx=0 tx=0).stop 2026-03-10T07:51:32.556 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:51:32.552+0000 7f83057fa700 1 -- 192.168.123.105:0/3520276755 shutdown_connections 2026-03-10T07:51:32.556 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:51:32.552+0000 7f83057fa700 1 --2- 192.168.123.105:0/3520276755 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f8310072b50 0x7f8310082f70 unknown :-1 s=CLOSED pgs=108 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:51:32.556 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:51:32.552+0000 7f83057fa700 1 --2- 192.168.123.105:0/3520276755 >> [v2:192.168.123.105:6800/413688438,v1:192.168.123.105:6801/413688438] conn(0x7f82fc0776d0 0x7f82fc079b90 unknown :-1 s=CLOSED pgs=32 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:51:32.556 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:51:32.552+0000 7f83057fa700 1 --2- 192.168.123.105:0/3520276755 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f8310075a40 0x7f83100834b0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:51:32.556 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:51:32.552+0000 7f83057fa700 1 -- 192.168.123.105:0/3520276755 >> 192.168.123.105:0/3520276755 conn(0x7f831006dae0 msgr2=0x7f831006ff40 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:51:32.556 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:51:32.553+0000 7f83057fa700 1 -- 192.168.123.105:0/3520276755 shutdown_connections 2026-03-10T07:51:32.556 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:51:32.553+0000 7f83057fa700 1 -- 
192.168.123.105:0/3520276755 wait complete. 2026-03-10T07:51:32.575 INFO:tasks.workunit.client.1.vm08.stdout:7/462: creat d3/da/d25/d9/d2f/d39/d43/f9d x:0 0 0 2026-03-10T07:51:32.577 INFO:tasks.workunit.client.1.vm08.stdout:7/463: read d3/da/d25/f29 [1992247,1865] 0 2026-03-10T07:51:32.589 INFO:tasks.workunit.client.1.vm08.stdout:8/517: creat d0/df/d15/d23/d54/fad x:0 0 0 2026-03-10T07:51:32.590 INFO:tasks.workunit.client.1.vm08.stdout:8/518: truncate d0/df/f12 5136518 0 2026-03-10T07:51:32.590 INFO:tasks.workunit.client.1.vm08.stdout:8/519: write d0/d69/f4c [1285301,99807] 0 2026-03-10T07:51:32.605 INFO:tasks.workunit.client.1.vm08.stdout:2/477: dread d0/d1/d3/d39/d7d/d86/d55/d1b/f23 [0,4194304] 0 2026-03-10T07:51:32.605 INFO:tasks.workunit.client.1.vm08.stdout:2/478: readlink d0/d1/d3/d39/l4c 0 2026-03-10T07:51:32.606 INFO:tasks.workunit.client.1.vm08.stdout:9/462: rename d2/d86/d30/l43 to d2/la3 0 2026-03-10T07:51:32.616 INFO:tasks.workunit.client.1.vm08.stdout:5/497: write d0/d4/df/d12/f46 [1046367,10098] 0 2026-03-10T07:51:32.617 INFO:tasks.workunit.client.1.vm08.stdout:5/498: chown d0/d4/df/f7f 204413913 1 2026-03-10T07:51:32.617 INFO:tasks.workunit.client.1.vm08.stdout:1/453: mkdir d2/d6/de/d9c 0 2026-03-10T07:51:32.635 INFO:tasks.workunit.client.1.vm08.stdout:4/381: getdents d5 0 2026-03-10T07:51:32.654 INFO:tasks.workunit.client.1.vm08.stdout:6/470: getdents d1/db/d24/d73/d79/d7c 0 2026-03-10T07:51:32.654 INFO:tasks.workunit.client.1.vm08.stdout:6/471: chown d1/db/d24/d73/d79 676133 1 2026-03-10T07:51:32.659 INFO:tasks.workunit.client.1.vm08.stdout:8/520: mknod d0/df/d17/d7a/d89/cae 0 2026-03-10T07:51:32.659 INFO:tasks.workunit.client.1.vm08.stdout:8/521: read d0/df/d15/d23/d39/f6f [3386725,40703] 0 2026-03-10T07:51:32.663 INFO:tasks.workunit.client.1.vm08.stdout:2/479: creat d0/d1/d17/d6b/f9a x:0 0 0 2026-03-10T07:51:32.664 INFO:tasks.workunit.client.1.vm08.stdout:9/463: readlink d2/d3/lf 0 2026-03-10T07:51:32.665 
INFO:tasks.workunit.client.1.vm08.stdout:9/464: write d2/de/d28/f96 [51947,130762] 0 2026-03-10T07:51:32.670 INFO:tasks.workunit.client.1.vm08.stdout:2/480: dread d0/d1/d3/d10/d38/f54 [0,4194304] 0 2026-03-10T07:51:32.671 INFO:tasks.workunit.client.1.vm08.stdout:9/465: dwrite d2/d86/f7c [0,4194304] 0 2026-03-10T07:51:32.672 INFO:tasks.workunit.client.1.vm08.stdout:2/481: stat d0/d1/d3/d39/d7d/d86/d55/d1b/l5d 0 2026-03-10T07:51:32.673 INFO:tasks.workunit.client.1.vm08.stdout:9/466: truncate d2/d86/d30/d35/d9b/fa1 379609 0 2026-03-10T07:51:32.673 INFO:tasks.workunit.client.1.vm08.stdout:2/482: write d0/d1/d17/d6b/f9a [591252,104965] 0 2026-03-10T07:51:32.684 INFO:tasks.workunit.client.1.vm08.stdout:3/497: rename d0/d3c/d18/f1e to d0/d3c/d1f/d44/d51/d2d/d85/fa0 0 2026-03-10T07:51:32.695 INFO:tasks.workunit.client.1.vm08.stdout:1/454: rmdir d2/d6/d3a/d61 39 2026-03-10T07:51:32.703 INFO:tasks.workunit.client.1.vm08.stdout:5/499: dread d0/d4/d19/d43/f35 [0,4194304] 0 2026-03-10T07:51:32.709 INFO:tasks.workunit.client.1.vm08.stdout:0/460: fdatasync dd/f16 0 2026-03-10T07:51:32.709 INFO:tasks.workunit.client.1.vm08.stdout:0/461: readlink dd/d18/l7c 0 2026-03-10T07:51:32.710 INFO:tasks.workunit.client.1.vm08.stdout:0/462: stat dd/d10/d2f/c67 0 2026-03-10T07:51:32.710 INFO:tasks.workunit.client.1.vm08.stdout:0/463: stat dd/d10/d14/d15/d20/d5f 0 2026-03-10T07:51:32.713 INFO:tasks.workunit.client.1.vm08.stdout:4/382: creat d5/d1f/d31/f82 x:0 0 0 2026-03-10T07:51:32.715 INFO:tasks.workunit.client.1.vm08.stdout:7/464: creat d3/da/d8a/f9e x:0 0 0 2026-03-10T07:51:32.719 INFO:tasks.workunit.client.1.vm08.stdout:6/472: mknod d1/d3/d3e/c9e 0 2026-03-10T07:51:32.720 INFO:tasks.workunit.client.1.vm08.stdout:6/473: chown d1/d3/df/d1d/f1f 508191296 1 2026-03-10T07:51:32.723 INFO:tasks.workunit.client.1.vm08.stdout:8/522: symlink d0/laf 0 2026-03-10T07:51:32.732 INFO:tasks.workunit.client.1.vm08.stdout:9/467: mkdir d2/d26/da4 0 2026-03-10T07:51:32.733 
INFO:tasks.workunit.client.1.vm08.stdout:9/468: chown d2/d86/d2b/l34 202999229 1 2026-03-10T07:51:32.745 INFO:tasks.workunit.client.1.vm08.stdout:5/500: mkdir d0/d4/d19/d81/da4 0 2026-03-10T07:51:32.748 INFO:tasks.workunit.client.1.vm08.stdout:0/464: creat dd/d18/f9a x:0 0 0 2026-03-10T07:51:32.748 INFO:tasks.workunit.client.1.vm08.stdout:0/465: write dd/d18/f96 [526358,106503] 0 2026-03-10T07:51:32.751 INFO:tasks.workunit.client.1.vm08.stdout:0/466: chown dd/d10/d2f/d37/f65 559822 1 2026-03-10T07:51:32.761 INFO:tasks.workunit.client.1.vm08.stdout:9/469: fsync d2/d86/d30/d35/f64 0 2026-03-10T07:51:32.761 INFO:tasks.workunit.client.1.vm08.stdout:9/470: read - d2/d86/d2b/f83 zero size 2026-03-10T07:51:32.765 INFO:tasks.workunit.client.1.vm08.stdout:9/471: dwrite d2/d86/f90 [0,4194304] 0 2026-03-10T07:51:32.765 INFO:tasks.workunit.client.1.vm08.stdout:9/472: chown d2/d86/d30/d35/c33 1 1 2026-03-10T07:51:32.767 INFO:tasks.workunit.client.1.vm08.stdout:9/473: write d2/de/d28/f96 [557373,41798] 0 2026-03-10T07:51:32.771 INFO:tasks.workunit.client.1.vm08.stdout:2/483: mkdir d0/d1/d3/d56/d9b 0 2026-03-10T07:51:32.786 INFO:tasks.workunit.client.1.vm08.stdout:3/498: dwrite d0/d3c/d1f/d44/d51/f9d [0,4194304] 0 2026-03-10T07:51:32.797 INFO:tasks.workunit.client.1.vm08.stdout:5/501: creat d0/d4/df/d12/d22/fa5 x:0 0 0 2026-03-10T07:51:32.799 INFO:tasks.workunit.client.1.vm08.stdout:5/502: truncate d0/d4/df/d12/f11 4612173 0 2026-03-10T07:51:32.805 INFO:tasks.workunit.client.1.vm08.stdout:0/467: truncate dd/d18/f21 3638240 0 2026-03-10T07:51:32.807 INFO:tasks.workunit.client.1.vm08.stdout:7/465: fsync d3/da/d25/d9/d2f/d39/f76 0 2026-03-10T07:51:32.820 INFO:tasks.workunit.client.1.vm08.stdout:0/468: dwrite dd/d10/d2f/d37/d64/d95/d5c/f63 [0,4194304] 0 2026-03-10T07:51:32.820 INFO:tasks.workunit.client.1.vm08.stdout:7/466: dwrite d3/da/d25/d9/d2f/d39/d43/f9d [0,4194304] 0 2026-03-10T07:51:32.849 INFO:tasks.workunit.client.1.vm08.stdout:1/455: creat d2/d6/d3a/d61/d6f/f9d x:0 0 0 
2026-03-10T07:51:32.851 INFO:tasks.workunit.client.1.vm08.stdout:3/499: mkdir d0/d3c/da1 0 2026-03-10T07:51:32.853 INFO:tasks.workunit.client.1.vm08.stdout:4/383: dwrite d5/d8/f1e [4194304,4194304] 0 2026-03-10T07:51:32.853 INFO:tasks.workunit.client.1.vm08.stdout:3/500: chown d0/d3c/d18/d48/d55/d56/f9e 1742 1 2026-03-10T07:51:32.884 INFO:tasks.workunit.client.1.vm08.stdout:4/384: dread d5/d8/d9/d12/d7b/d48/d4f/f56 [0,4194304] 0 2026-03-10T07:51:32.906 INFO:tasks.workunit.client.1.vm08.stdout:2/484: mkdir d0/d1/d3/d56/d9b/d9c 0 2026-03-10T07:51:32.906 INFO:tasks.workunit.client.1.vm08.stdout:2/485: read - d0/d1/d3/f96 zero size 2026-03-10T07:51:32.907 INFO:tasks.workunit.client.1.vm08.stdout:2/486: readlink d0/d1/d3/d10/d38/l47 0 2026-03-10T07:51:32.910 INFO:tasks.workunit.client.1.vm08.stdout:1/456: mkdir d2/d6/de/d1f/d26/d58/d9e 0 2026-03-10T07:51:32.936 INFO:tasks.workunit.client.1.vm08.stdout:6/474: link d1/db/c65 d1/db/d24/d73/d79/c9f 0 2026-03-10T07:51:32.937 INFO:tasks.workunit.client.1.vm08.stdout:6/475: truncate d1/d3/df/d1d/f9d 5163721 0 2026-03-10T07:51:32.942 INFO:tasks.workunit.client.1.vm08.stdout:6/476: dread d1/d17/d2b/f68 [0,4194304] 0 2026-03-10T07:51:32.947 INFO:tasks.workunit.client.1.vm08.stdout:8/523: getdents d0/df/d15/d23/d39 0 2026-03-10T07:51:32.954 INFO:tasks.workunit.client.1.vm08.stdout:9/474: creat d2/de/fa5 x:0 0 0 2026-03-10T07:51:32.954 INFO:tasks.workunit.client.1.vm08.stdout:9/475: read d2/d26/f29 [751326,34834] 0 2026-03-10T07:51:32.955 INFO:tasks.workunit.client.1.vm08.stdout:2/487: symlink d0/d1/d3/d39/d7d/d7e/l9d 0 2026-03-10T07:51:32.956 INFO:tasks.workunit.client.1.vm08.stdout:2/488: fsync d0/f81 0 2026-03-10T07:51:32.957 INFO:tasks.workunit.client.1.vm08.stdout:1/457: mkdir d2/d6/d9f 0 2026-03-10T07:51:32.961 INFO:tasks.workunit.client.1.vm08.stdout:5/503: creat d0/d4/df/d1e/fa6 x:0 0 0 2026-03-10T07:51:32.989 INFO:tasks.workunit.client.1.vm08.stdout:7/467: creat d3/da/f9f x:0 0 0 2026-03-10T07:51:32.990 
INFO:tasks.workunit.client.1.vm08.stdout:7/468: chown d3/da/d25/d9/f47 312523 1 2026-03-10T07:51:32.990 INFO:tasks.workunit.client.1.vm08.stdout:7/469: dread - d3/da/d25/d9/d2f/d3a/d4b/f70 zero size 2026-03-10T07:51:32.996 INFO:tasks.workunit.client.1.vm08.stdout:5/504: symlink d0/d8/d24/la7 0 2026-03-10T07:51:33.009 INFO:tasks.workunit.client.1.vm08.stdout:0/469: getdents dd/d18 0 2026-03-10T07:51:33.009 INFO:tasks.workunit.client.1.vm08.stdout:6/477: symlink d1/la0 0 2026-03-10T07:51:33.009 INFO:tasks.workunit.client.1.vm08.stdout:0/470: chown dd/d10/d2f/f8e 346930 1 2026-03-10T07:51:33.010 INFO:tasks.workunit.client.1.vm08.stdout:6/478: dread - d1/d3/df/d44/f86 zero size 2026-03-10T07:51:33.010 INFO:tasks.workunit.client.1.vm08.stdout:0/471: dread - dd/d10/d14/d15/d20/f7e zero size 2026-03-10T07:51:33.018 INFO:tasks.workunit.client.1.vm08.stdout:9/476: creat d2/d3/d84/d91/fa6 x:0 0 0 2026-03-10T07:51:33.018 INFO:tasks.workunit.client.1.vm08.stdout:9/477: fsync d2/d26/f61 0 2026-03-10T07:51:33.019 INFO:tasks.workunit.client.1.vm08.stdout:5/505: sync 2026-03-10T07:51:33.019 INFO:tasks.workunit.client.1.vm08.stdout:0/472: sync 2026-03-10T07:51:33.021 INFO:tasks.workunit.client.1.vm08.stdout:8/524: write d0/df/f60 [1013585,48954] 0 2026-03-10T07:51:33.026 INFO:tasks.workunit.client.1.vm08.stdout:2/489: unlink d0/d1/c71 0 2026-03-10T07:51:33.027 INFO:tasks.workunit.client.1.vm08.stdout:2/490: chown d0/d1/d3/d10 73509893 1 2026-03-10T07:51:33.030 INFO:tasks.workunit.client.1.vm08.stdout:8/525: dread d0/df/f12 [0,4194304] 0 2026-03-10T07:51:33.031 INFO:tasks.workunit.client.1.vm08.stdout:3/501: link d0/d3c/d1f/c27 d0/d3c/d1f/d44/d51/d34/ca2 0 2026-03-10T07:51:33.032 INFO:tasks.workunit.client.1.vm08.stdout:4/385: getdents d5/d1f/d70 0 2026-03-10T07:51:33.033 INFO:tasks.workunit.client.1.vm08.stdout:6/479: symlink d1/d3/df/la1 0 2026-03-10T07:51:33.034 INFO:tasks.workunit.client.1.vm08.stdout:4/386: sync 2026-03-10T07:51:33.035 
INFO:tasks.workunit.client.1.vm08.stdout:4/387: read f0 [2169779,127034] 0 2026-03-10T07:51:33.036 INFO:tasks.workunit.client.1.vm08.stdout:4/388: dread - d5/d8/d9/d12/d7b/d48/f5d zero size 2026-03-10T07:51:33.037 INFO:tasks.workunit.client.1.vm08.stdout:6/480: dread d1/d3/df/d1d/f9d [0,4194304] 0 2026-03-10T07:51:33.037 INFO:tasks.workunit.client.1.vm08.stdout:6/481: chown d1/d17/f42 7275 1 2026-03-10T07:51:33.037 INFO:tasks.workunit.client.1.vm08.stdout:9/478: unlink d2/d86/d30/f70 0 2026-03-10T07:51:33.038 INFO:tasks.workunit.client.1.vm08.stdout:9/479: dread - d2/d86/d2b/f83 zero size 2026-03-10T07:51:33.038 INFO:tasks.workunit.client.1.vm08.stdout:6/482: dread - d1/d3/df/d44/f86 zero size 2026-03-10T07:51:33.039 INFO:tasks.workunit.client.1.vm08.stdout:9/480: dread d2/d3/f12 [0,4194304] 0 2026-03-10T07:51:33.039 INFO:tasks.workunit.client.1.vm08.stdout:6/483: fsync d1/d17/d2b/d5e/f96 0 2026-03-10T07:51:33.040 INFO:tasks.workunit.client.1.vm08.stdout:0/473: mkdir dd/d10/d2f/d37/d64/d95/d58/d3d/d9b 0 2026-03-10T07:51:33.043 INFO:tasks.workunit.client.1.vm08.stdout:0/474: dread dd/d10/d14/d15/f41 [0,4194304] 0 2026-03-10T07:51:33.044 INFO:tasks.workunit.client.1.vm08.stdout:6/484: dwrite d1/d17/f20 [0,4194304] 0 2026-03-10T07:51:33.046 INFO:tasks.workunit.client.1.vm08.stdout:5/506: rename d0/d4/d19/d43/f3d to d0/d4/d19/d60/d6d/d70/d40/fa8 0 2026-03-10T07:51:33.056 INFO:tasks.workunit.client.1.vm08.stdout:2/491: mknod d0/d1/d3/d3e/c9e 0 2026-03-10T07:51:33.057 INFO:tasks.workunit.client.1.vm08.stdout:5/507: dread d0/d4/df/f2a [4194304,4194304] 0 2026-03-10T07:51:33.058 INFO:tasks.workunit.client.1.vm08.stdout:5/508: chown d0/d33 777 1 2026-03-10T07:51:33.058 INFO:tasks.workunit.client.1.vm08.stdout:5/509: write d0/d4/df/d12/d22/fa5 [1031615,110614] 0 2026-03-10T07:51:33.059 INFO:tasks.workunit.client.1.vm08.stdout:7/470: write d3/da/d25/d9/d2f/f62 [3278798,127875] 0 2026-03-10T07:51:33.062 INFO:tasks.workunit.client.1.vm08.stdout:8/526: symlink d0/df/d15/d9c/lb0 
0 2026-03-10T07:51:33.072 INFO:tasks.workunit.client.1.vm08.stdout:4/389: dread d5/d8/ff [0,4194304] 0 2026-03-10T07:51:33.072 INFO:tasks.workunit.client.1.vm08.stdout:4/390: chown d5/d1f/d31/f62 4 1 2026-03-10T07:51:33.074 INFO:tasks.workunit.client.1.vm08.stdout:9/481: fdatasync d2/d86/d30/d35/f64 0 2026-03-10T07:51:33.078 INFO:tasks.workunit.client.1.vm08.stdout:9/482: dwrite d2/de/d28/f96 [0,4194304] 0 2026-03-10T07:51:33.086 INFO:tasks.workunit.client.1.vm08.stdout:9/483: write d2/f5 [1531718,40245] 0 2026-03-10T07:51:33.086 INFO:tasks.workunit.client.1.vm08.stdout:9/484: dread - d2/d58/f9f zero size 2026-03-10T07:51:33.096 INFO:tasks.workunit.client.1.vm08.stdout:0/475: chown dd/d10/d2f/d37/d64/d95/d58/d3d/c89 602930 1 2026-03-10T07:51:33.104 INFO:tasks.workunit.client.1.vm08.stdout:2/492: rmdir d0/d1/d3/d39/d7d/d86 39 2026-03-10T07:51:33.106 INFO:tasks.workunit.client.1.vm08.stdout:5/510: mknod d0/d4/df/d82/ca9 0 2026-03-10T07:51:33.107 INFO:tasks.workunit.client.1.vm08.stdout:7/471: symlink d3/da/d25/d9/d2f/d3a/d71/la0 0 2026-03-10T07:51:33.109 INFO:tasks.workunit.client.1.vm08.stdout:5/511: dread d0/d4/df/d12/d22/fa5 [0,4194304] 0 2026-03-10T07:51:33.117 INFO:tasks.workunit.client.1.vm08.stdout:8/527: symlink d0/d69/lb1 0 2026-03-10T07:51:33.130 INFO:tasks.workunit.client.1.vm08.stdout:8/528: dread d0/df/d2e/f3c [0,4194304] 0 2026-03-10T07:51:33.133 INFO:tasks.workunit.client.1.vm08.stdout:4/391: fdatasync d5/d1f/d31/f62 0 2026-03-10T07:51:33.137 INFO:tasks.workunit.client.1.vm08.stdout:8/529: dread d0/d69/f46 [0,4194304] 0 2026-03-10T07:51:33.137 INFO:tasks.workunit.client.1.vm08.stdout:8/530: readlink d0/df/d15/d9c/lb0 0 2026-03-10T07:51:33.142 INFO:tasks.workunit.client.1.vm08.stdout:6/485: link d1/f35 d1/d3/df/d44/fa2 0 2026-03-10T07:51:33.147 INFO:tasks.workunit.client.1.vm08.stdout:6/486: dwrite d1/d3/df/d52/f8f [0,4194304] 0 2026-03-10T07:51:33.156 INFO:tasks.workunit.client.1.vm08.stdout:6/487: dread d1/d3/df/d44/fa2 [0,4194304] 0 
2026-03-10T07:51:33.182 INFO:tasks.workunit.client.1.vm08.stdout:4/392: rename d5/d8/d9/f2b to d5/d1f/d41/f83 0 2026-03-10T07:51:33.183 INFO:tasks.workunit.client.1.vm08.stdout:8/531: symlink d0/df/d17/lb2 0 2026-03-10T07:51:33.189 INFO:tasks.workunit.client.1.vm08.stdout:1/458: getdents d2/d6/de/d1f/d40/d76 0 2026-03-10T07:51:33.197 INFO:tasks.workunit.client.1.vm08.stdout:6/488: unlink d1/db/c5f 0 2026-03-10T07:51:33.197 INFO:tasks.workunit.client.1.vm08.stdout:6/489: dread - d1/db/d24/f93 zero size 2026-03-10T07:51:33.198 INFO:tasks.workunit.client.1.vm08.stdout:3/502: getdents d0/d3c/d18/d48/d55 0 2026-03-10T07:51:33.198 INFO:tasks.workunit.client.1.vm08.stdout:3/503: chown d0/d3c/d18/d4a/f8a 0 1 2026-03-10T07:51:33.209 INFO:tasks.workunit.client.1.vm08.stdout:5/512: truncate d0/d4/f31 571001 0 2026-03-10T07:51:33.213 INFO:tasks.workunit.client.1.vm08.stdout:4/393: rmdir d5/d1f/d41 39 2026-03-10T07:51:33.214 INFO:tasks.workunit.client.1.vm08.stdout:4/394: fdatasync d5/d8/d9/d32/f44 0 2026-03-10T07:51:33.214 INFO:tasks.workunit.client.1.vm08.stdout:4/395: chown f2 15168725 1 2026-03-10T07:51:33.216 INFO:tasks.workunit.client.1.vm08.stdout:8/532: rename d0/d37/f9b to d0/d69/d3f/fb3 0 2026-03-10T07:51:33.221 INFO:tasks.workunit.client.1.vm08.stdout:0/476: creat dd/d10/d14/d15/f9c x:0 0 0 2026-03-10T07:51:33.222 INFO:tasks.workunit.client.1.vm08.stdout:1/459: write d2/d6/d3a/d61/d6f/f9d [981485,71591] 0 2026-03-10T07:51:33.226 INFO:tasks.workunit.client.1.vm08.stdout:6/490: read d1/db/d24/d51/f59 [1082511,127391] 0 2026-03-10T07:51:33.226 INFO:tasks.workunit.client.1.vm08.stdout:0/477: dread dd/d10/d2f/d37/d64/d95/f48 [0,4194304] 0 2026-03-10T07:51:33.227 INFO:tasks.workunit.client.1.vm08.stdout:3/504: write d0/d3c/d1f/d44/f59 [1580615,61022] 0 2026-03-10T07:51:33.231 INFO:tasks.workunit.client.1.vm08.stdout:1/460: sync 2026-03-10T07:51:33.232 INFO:tasks.workunit.client.1.vm08.stdout:2/493: creat d0/d1/f9f x:0 0 0 2026-03-10T07:51:33.232 
INFO:tasks.workunit.client.1.vm08.stdout:7/472: link d3/da/d25/d9/d2f/c61 d3/da/d25/d9/d2f/d3a/d4b/ca1 0 2026-03-10T07:51:33.233 INFO:tasks.workunit.client.1.vm08.stdout:1/461: sync 2026-03-10T07:51:33.236 INFO:tasks.workunit.client.1.vm08.stdout:5/513: unlink d0/d4/df/d1e/d41/l4e 0 2026-03-10T07:51:33.236 INFO:tasks.workunit.client.1.vm08.stdout:5/514: chown d0/c93 3777 1 2026-03-10T07:51:33.244 INFO:tasks.workunit.client.1.vm08.stdout:9/485: truncate d2/d3/fc 2613884 0 2026-03-10T07:51:33.250 INFO:tasks.workunit.client.1.vm08.stdout:8/533: mkdir d0/df/d15/d23/da8/db4 0 2026-03-10T07:51:33.253 INFO:tasks.workunit.client.1.vm08.stdout:8/534: write d0/df/d2e/d30/f76 [1760692,34381] 0 2026-03-10T07:51:33.257 INFO:tasks.workunit.client.1.vm08.stdout:8/535: write d0/f2a [451647,79419] 0 2026-03-10T07:51:33.257 INFO:tasks.workunit.client.1.vm08.stdout:6/491: rename d1/f34 to d1/db/d24/d73/d79/d7c/fa3 0 2026-03-10T07:51:33.259 INFO:tasks.workunit.client.1.vm08.stdout:8/536: write d0/df/d17/d72/f91 [741082,100954] 0 2026-03-10T07:51:33.261 INFO:tasks.workunit.client.1.vm08.stdout:8/537: sync 2026-03-10T07:51:33.262 INFO:tasks.workunit.client.1.vm08.stdout:4/396: dread d5/d8/d9/d12/d7b/f1a [0,4194304] 0 2026-03-10T07:51:33.264 INFO:tasks.workunit.client.1.vm08.stdout:7/473: rmdir d3/da/d25/d9/d2f/d39/d43/d4f/d5b 39 2026-03-10T07:51:33.267 INFO:tasks.workunit.client.1.vm08.stdout:0/478: dread dd/d10/d2f/d37/d64/d95/d58/f86 [0,4194304] 0 2026-03-10T07:51:33.268 INFO:tasks.workunit.client.1.vm08.stdout:4/397: dwrite d5/d1f/d70/f78 [0,4194304] 0 2026-03-10T07:51:33.269 INFO:tasks.workunit.client.1.vm08.stdout:2/494: dwrite d0/d1/d3/d10/d32/d61/d8e/f90 [0,4194304] 0 2026-03-10T07:51:33.270 INFO:tasks.workunit.client.1.vm08.stdout:1/462: mkdir d2/d6/de/d47/da0 0 2026-03-10T07:51:33.276 INFO:tasks.workunit.client.1.vm08.stdout:9/486: unlink d2/d86/d30/l88 0 2026-03-10T07:51:33.298 INFO:tasks.workunit.client.1.vm08.stdout:6/492: unlink d1/d17/f42 0 2026-03-10T07:51:33.299 
INFO:tasks.workunit.client.1.vm08.stdout:6/493: readlink d1/d3/l9 0 2026-03-10T07:51:33.301 INFO:tasks.workunit.client.1.vm08.stdout:8/538: rename d0/df/d15/d23/d39/d5b/d4a/c5c to d0/df/d15/d95/cb5 0 2026-03-10T07:51:33.302 INFO:tasks.workunit.client.1.vm08.stdout:8/539: write d0/df/d15/d23/d39/f85 [2569576,107060] 0 2026-03-10T07:51:33.306 INFO:tasks.workunit.client.1.vm08.stdout:7/474: symlink d3/da/d25/la2 0 2026-03-10T07:51:33.318 INFO:tasks.workunit.client.1.vm08.stdout:0/479: creat dd/d10/d14/d15/d20/d5f/f9d x:0 0 0 2026-03-10T07:51:33.320 INFO:tasks.workunit.client.1.vm08.stdout:0/480: sync 2026-03-10T07:51:33.322 INFO:tasks.workunit.client.1.vm08.stdout:2/495: dread d0/d1/d3/d39/d7d/d86/d55/d1b/f23 [0,4194304] 0 2026-03-10T07:51:33.324 INFO:tasks.workunit.client.1.vm08.stdout:0/481: dread dd/d10/d14/d15/d20/d5f/f7f [0,4194304] 0 2026-03-10T07:51:33.327 INFO:tasks.workunit.client.1.vm08.stdout:6/494: rename d1/db/l4f to d1/d3/df/d52/la4 0 2026-03-10T07:51:33.330 INFO:tasks.workunit.client.1.vm08.stdout:3/505: truncate d0/d3c/d1f/d44/d51/d2d/d85/fa0 2587079 0 2026-03-10T07:51:33.331 INFO:tasks.workunit.client.1.vm08.stdout:3/506: sync 2026-03-10T07:51:33.336 INFO:tasks.workunit.client.1.vm08.stdout:4/398: dwrite d5/d1f/d31/f38 [0,4194304] 0 2026-03-10T07:51:33.344 INFO:tasks.workunit.client.1.vm08.stdout:3/507: dread d0/f84 [0,4194304] 0 2026-03-10T07:51:33.355 INFO:tasks.workunit.client.1.vm08.stdout:8/540: creat d0/df/d17/d7a/d89/fb6 x:0 0 0 2026-03-10T07:51:33.364 INFO:tasks.workunit.client.1.vm08.stdout:1/463: mknod d2/d6/de/d47/da0/ca1 0 2026-03-10T07:51:33.392 INFO:tasks.workunit.client.1.vm08.stdout:9/487: write d2/d86/f21 [2584965,111101] 0 2026-03-10T07:51:33.395 INFO:tasks.workunit.client.1.vm08.stdout:0/482: rename dd/d10/d14/d15/l19 to dd/d10/d14/d15/d20/d22/l9e 0 2026-03-10T07:51:33.396 INFO:tasks.workunit.client.1.vm08.stdout:0/483: write dd/d10/d14/d15/f84 [897291,101138] 0 2026-03-10T07:51:33.402 INFO:tasks.workunit.client.1.vm08.stdout:6/495: 
fsync d1/d3/f2e 0 2026-03-10T07:51:33.409 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:51:33 vm05.local ceph-mon[50387]: from='client.14692 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T07:51:33.409 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:51:33 vm05.local ceph-mon[50387]: from='client.24481 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T07:51:33.409 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:51:33 vm05.local ceph-mon[50387]: from='client.? 192.168.123.105:0/2462813532' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T07:51:33.418 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:51:33 vm08.local ceph-mon[59917]: from='client.14692 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T07:51:33.418 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:51:33 vm08.local ceph-mon[59917]: from='client.24481 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T07:51:33.418 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:51:33 vm08.local ceph-mon[59917]: from='client.? 
192.168.123.105:0/2462813532' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T07:51:33.423 INFO:tasks.workunit.client.1.vm08.stdout:8/541: write d0/fa [4741380,64483] 0 2026-03-10T07:51:33.429 INFO:tasks.workunit.client.1.vm08.stdout:5/515: getdents d0/d4/d19/d3a 0 2026-03-10T07:51:33.436 INFO:tasks.workunit.client.1.vm08.stdout:2/496: read d0/d1/d17/f1a [5863771,127411] 0 2026-03-10T07:51:33.438 INFO:tasks.workunit.client.1.vm08.stdout:9/488: chown d2/la3 816386 1 2026-03-10T07:51:33.446 INFO:tasks.workunit.client.1.vm08.stdout:6/496: mknod d1/d3/d3e/ca5 0 2026-03-10T07:51:33.455 INFO:tasks.workunit.client.1.vm08.stdout:4/399: truncate d5/d1f/d70/f78 1052795 0 2026-03-10T07:51:33.456 INFO:tasks.workunit.client.1.vm08.stdout:4/400: write d5/d1f/d31/f62 [3172397,92352] 0 2026-03-10T07:51:33.464 INFO:tasks.workunit.client.1.vm08.stdout:3/508: dwrite d0/d3c/d1f/f6f [0,4194304] 0 2026-03-10T07:51:33.464 INFO:tasks.workunit.client.1.vm08.stdout:3/509: stat d0/d3c/l1b 0 2026-03-10T07:51:33.473 INFO:tasks.workunit.client.1.vm08.stdout:5/516: mkdir d0/d77/daa 0 2026-03-10T07:51:33.473 INFO:tasks.workunit.client.1.vm08.stdout:3/510: write d0/d3c/d18/d32/d61/d52/f68 [901121,60980] 0 2026-03-10T07:51:33.473 INFO:tasks.workunit.client.1.vm08.stdout:7/475: creat d3/da/d25/d9/d2f/d3a/d4b/fa3 x:0 0 0 2026-03-10T07:51:33.475 INFO:tasks.workunit.client.1.vm08.stdout:2/497: truncate d0/d1/d3/f4e 405384 0 2026-03-10T07:51:33.476 INFO:tasks.workunit.client.1.vm08.stdout:2/498: chown d0/d1/d3/d39/d7d/d86/d55/d7a/l99 103979845 1 2026-03-10T07:51:33.479 INFO:tasks.workunit.client.1.vm08.stdout:8/542: dwrite d0/df/f9f [0,4194304] 0 2026-03-10T07:51:33.481 INFO:tasks.workunit.client.1.vm08.stdout:0/484: fdatasync dd/d10/d14/d15/d20/d22/f2e 0 2026-03-10T07:51:33.487 INFO:tasks.workunit.client.1.vm08.stdout:8/543: dwrite d0/df/d17/d72/f91 [0,4194304] 0 2026-03-10T07:51:33.508 INFO:tasks.workunit.client.1.vm08.stdout:3/511: creat d0/d3c/d18/d32/d61/d83/fa3 x:0 0 0 
2026-03-10T07:51:33.508 INFO:tasks.workunit.client.1.vm08.stdout:7/476: rmdir d3/da/d25/d9/d2f/d4d 39 2026-03-10T07:51:33.509 INFO:tasks.workunit.client.1.vm08.stdout:9/489: symlink d2/d26/da4/la7 0 2026-03-10T07:51:33.509 INFO:tasks.workunit.client.1.vm08.stdout:2/499: read d0/d1/d3/d56/d78/f62 [2324546,46569] 0 2026-03-10T07:51:33.510 INFO:tasks.workunit.client.1.vm08.stdout:9/490: write d2/de/f1e [5169661,77180] 0 2026-03-10T07:51:33.512 INFO:tasks.workunit.client.1.vm08.stdout:4/401: unlink d5/d8/f1d 0 2026-03-10T07:51:33.513 INFO:tasks.workunit.client.1.vm08.stdout:4/402: rename d5/d1f/d31/d61 to d5/d1f/d31/d61/d84 22 2026-03-10T07:51:33.514 INFO:tasks.workunit.client.1.vm08.stdout:1/464: getdents d2/d6/de/d47 0 2026-03-10T07:51:33.515 INFO:tasks.workunit.client.1.vm08.stdout:3/512: unlink d0/d3c/d18/c7d 0 2026-03-10T07:51:33.517 INFO:tasks.workunit.client.1.vm08.stdout:0/485: getdents dd/d18 0 2026-03-10T07:51:33.517 INFO:tasks.workunit.client.1.vm08.stdout:8/544: getdents d0/df/d15/d23/da8/db4 0 2026-03-10T07:51:33.518 INFO:tasks.workunit.client.1.vm08.stdout:6/497: creat d1/d17/fa6 x:0 0 0 2026-03-10T07:51:33.519 INFO:tasks.workunit.client.1.vm08.stdout:5/517: creat d0/d4/fab x:0 0 0 2026-03-10T07:51:33.520 INFO:tasks.workunit.client.1.vm08.stdout:1/465: rename d2/d6/de/d1f/d22/f49 to d2/d6/de/d1f/d26/d58/d83/fa2 0 2026-03-10T07:51:33.520 INFO:tasks.workunit.client.1.vm08.stdout:2/500: mkdir d0/d1/d17/d6b/da0 0 2026-03-10T07:51:33.521 INFO:tasks.workunit.client.1.vm08.stdout:0/486: mkdir dd/d10/d14/d15/d20/d5f/d9f 0 2026-03-10T07:51:33.521 INFO:tasks.workunit.client.1.vm08.stdout:6/498: write d1/db/d24/f93 [546855,113749] 0 2026-03-10T07:51:33.522 INFO:tasks.workunit.client.1.vm08.stdout:9/491: dwrite d2/d26/f4b [0,4194304] 0 2026-03-10T07:51:33.528 INFO:tasks.workunit.client.1.vm08.stdout:3/513: dwrite d0/d3c/f20 [0,4194304] 0 2026-03-10T07:51:33.539 INFO:tasks.workunit.client.1.vm08.stdout:7/477: creat d3/fa4 x:0 0 0 2026-03-10T07:51:33.539 
INFO:tasks.workunit.client.1.vm08.stdout:0/487: symlink dd/d10/d2f/la0 0 2026-03-10T07:51:33.539 INFO:tasks.workunit.client.1.vm08.stdout:9/492: creat d2/de/d28/fa8 x:0 0 0 2026-03-10T07:51:33.544 INFO:tasks.workunit.client.1.vm08.stdout:0/488: write dd/d10/d2f/d37/d64/d95/d58/d3d/f40 [718243,15273] 0 2026-03-10T07:51:33.547 INFO:tasks.workunit.client.1.vm08.stdout:9/493: dread d2/d86/f7c [0,4194304] 0 2026-03-10T07:51:33.548 INFO:tasks.workunit.client.1.vm08.stdout:8/545: dwrite d0/df/d15/d23/d39/d5b/d4a/f98 [0,4194304] 0 2026-03-10T07:51:33.549 INFO:tasks.workunit.client.1.vm08.stdout:8/546: chown d0/df/d15/d23/d39/d5b/d4a/fa7 321793406 1 2026-03-10T07:51:33.556 INFO:tasks.workunit.client.1.vm08.stdout:9/494: sync 2026-03-10T07:51:33.560 INFO:tasks.workunit.client.1.vm08.stdout:8/547: dwrite d0/df/f60 [0,4194304] 0 2026-03-10T07:51:33.564 INFO:tasks.workunit.client.1.vm08.stdout:1/466: symlink d2/d6/de/d47/la3 0 2026-03-10T07:51:33.581 INFO:tasks.workunit.client.1.vm08.stdout:2/501: getdents d0/d1/d3/d39/d7d/d86/d55/d1b 0 2026-03-10T07:51:33.583 INFO:tasks.workunit.client.1.vm08.stdout:8/548: sync 2026-03-10T07:51:33.584 INFO:tasks.workunit.client.1.vm08.stdout:4/403: dwrite d5/d8/f39 [0,4194304] 0 2026-03-10T07:51:33.585 INFO:tasks.workunit.client.1.vm08.stdout:4/404: truncate d5/d1f/f25 876539 0 2026-03-10T07:51:33.586 INFO:tasks.workunit.client.1.vm08.stdout:4/405: write d5/d8/d9/d32/f44 [1080465,81734] 0 2026-03-10T07:51:33.587 INFO:tasks.workunit.client.1.vm08.stdout:4/406: write d5/f21 [1750497,129043] 0 2026-03-10T07:51:33.587 INFO:tasks.workunit.client.1.vm08.stdout:8/549: dread d0/df/f19 [0,4194304] 0 2026-03-10T07:51:33.588 INFO:tasks.workunit.client.1.vm08.stdout:8/550: write d0/d69/d77/f87 [677288,63631] 0 2026-03-10T07:51:33.593 INFO:tasks.workunit.client.1.vm08.stdout:5/518: mkdir d0/d8/d5e/d8e/dac 0 2026-03-10T07:51:33.598 INFO:tasks.workunit.client.1.vm08.stdout:4/407: read d5/d8/d9/d12/d7b/f43 [1628033,101704] 0 2026-03-10T07:51:33.621 
INFO:tasks.workunit.client.1.vm08.stdout:6/499: fsync d1/d3/f21 0 2026-03-10T07:51:33.625 INFO:tasks.workunit.client.1.vm08.stdout:9/495: creat d2/d86/d30/d35/d9b/fa9 x:0 0 0 2026-03-10T07:51:33.625 INFO:tasks.workunit.client.1.vm08.stdout:9/496: write d2/d58/f95 [570984,24735] 0 2026-03-10T07:51:33.629 INFO:tasks.workunit.client.1.vm08.stdout:3/514: rename d0/d3c/f20 to d0/d3c/d1f/d89/fa4 0 2026-03-10T07:51:33.634 INFO:tasks.workunit.client.1.vm08.stdout:2/502: creat d0/d1/d3/d10/d32/d61/d8e/fa1 x:0 0 0 2026-03-10T07:51:33.640 INFO:tasks.workunit.client.1.vm08.stdout:8/551: rmdir d0/d69/d3f 39 2026-03-10T07:51:33.641 INFO:tasks.workunit.client.1.vm08.stdout:7/478: symlink d3/da/d25/d9/d2f/d39/d43/d4f/la5 0 2026-03-10T07:51:33.659 INFO:tasks.workunit.client.1.vm08.stdout:5/519: mkdir d0/d4/df/d1e/d41/dad 0 2026-03-10T07:51:33.664 INFO:tasks.workunit.client.1.vm08.stdout:0/489: dwrite dd/d10/d14/d15/d20/d22/f6c [0,4194304] 0 2026-03-10T07:51:33.676 INFO:tasks.workunit.client.1.vm08.stdout:6/500: creat d1/d17/d2b/fa7 x:0 0 0 2026-03-10T07:51:33.678 INFO:tasks.workunit.client.1.vm08.stdout:3/515: dread - d0/d3c/d18/d32/d61/d52/f7f zero size 2026-03-10T07:51:33.678 INFO:tasks.workunit.client.1.vm08.stdout:3/516: readlink d0/d3c/d18/d4a/l7c 0 2026-03-10T07:51:33.682 INFO:tasks.workunit.client.1.vm08.stdout:3/517: fsync d0/f45 0 2026-03-10T07:51:33.685 INFO:tasks.workunit.client.1.vm08.stdout:3/518: dread d0/d3c/d1f/d44/d51/d34/f47 [0,4194304] 0 2026-03-10T07:51:33.688 INFO:tasks.workunit.client.1.vm08.stdout:5/520: mkdir d0/d8/d5e/d8e/dae 0 2026-03-10T07:51:33.694 INFO:tasks.workunit.client.1.vm08.stdout:5/521: dwrite d0/d4/d19/d60/d6d/d70/f67 [0,4194304] 0 2026-03-10T07:51:33.704 INFO:tasks.workunit.client.1.vm08.stdout:4/408: symlink d5/d8/l85 0 2026-03-10T07:51:33.711 INFO:tasks.workunit.client.1.vm08.stdout:0/490: fdatasync dd/d10/d2f/d37/d64/d52/f74 0 2026-03-10T07:51:33.714 INFO:tasks.workunit.client.1.vm08.stdout:6/501: unlink d1/d3/df/d52/c6c 0 
2026-03-10T07:51:33.715 INFO:tasks.workunit.client.1.vm08.stdout:6/502: write d1/d3/d3e/f43 [463215,102630] 0 2026-03-10T07:51:33.720 INFO:tasks.workunit.client.1.vm08.stdout:9/497: dread d2/d86/d30/d35/f64 [0,4194304] 0 2026-03-10T07:51:33.724 INFO:tasks.workunit.client.1.vm08.stdout:9/498: dwrite d2/d26/f4b [0,4194304] 0 2026-03-10T07:51:33.739 INFO:tasks.workunit.client.1.vm08.stdout:7/479: mkdir d3/da/da6 0 2026-03-10T07:51:33.751 INFO:tasks.workunit.client.1.vm08.stdout:1/467: link d2/d6/c52 d2/d6/de/d1f/d22/ca4 0 2026-03-10T07:51:33.752 INFO:tasks.workunit.client.1.vm08.stdout:1/468: write d2/d6/de/d70/f8d [835212,45451] 0 2026-03-10T07:51:33.755 INFO:tasks.workunit.client.1.vm08.stdout:0/491: creat dd/d10/d2f/d37/d64/d95/d58/d3d/fa1 x:0 0 0 2026-03-10T07:51:33.761 INFO:tasks.workunit.client.1.vm08.stdout:5/522: write d0/d4/d19/d60/d6d/d70/d40/f5f [40149,103120] 0 2026-03-10T07:51:33.763 INFO:tasks.workunit.client.1.vm08.stdout:2/503: creat d0/d1/fa2 x:0 0 0 2026-03-10T07:51:33.764 INFO:tasks.workunit.client.1.vm08.stdout:4/409: dread d5/d8/f30 [0,4194304] 0 2026-03-10T07:51:33.765 INFO:tasks.workunit.client.1.vm08.stdout:4/410: write d5/d8/d9/f5a [656918,121950] 0 2026-03-10T07:51:33.775 INFO:tasks.workunit.client.1.vm08.stdout:9/499: creat d2/d86/faa x:0 0 0 2026-03-10T07:51:33.775 INFO:tasks.workunit.client.1.vm08.stdout:9/500: read d2/d86/f90 [97556,79507] 0 2026-03-10T07:51:33.787 INFO:tasks.workunit.client.1.vm08.stdout:7/480: dwrite d3/da/d25/d9/f30 [0,4194304] 0 2026-03-10T07:51:33.798 INFO:tasks.workunit.client.1.vm08.stdout:0/492: write dd/d10/f5e [830035,85657] 0 2026-03-10T07:51:33.806 INFO:tasks.workunit.client.1.vm08.stdout:4/411: rmdir d5 39 2026-03-10T07:51:33.824 INFO:tasks.workunit.client.1.vm08.stdout:9/501: mknod d2/d86/cab 0 2026-03-10T07:51:33.826 INFO:tasks.workunit.client.1.vm08.stdout:3/519: creat d0/d3c/d18/fa5 x:0 0 0 2026-03-10T07:51:33.828 INFO:tasks.workunit.client.1.vm08.stdout:8/552: getdents d0/d69 0 2026-03-10T07:51:33.829 
INFO:tasks.workunit.client.1.vm08.stdout:8/553: read d0/df/d2e/f3c [2394216,44375] 0 2026-03-10T07:51:33.829 INFO:tasks.workunit.client.1.vm08.stdout:8/554: chown d0/df/d17 46865283 1 2026-03-10T07:51:33.830 INFO:tasks.workunit.client.1.vm08.stdout:8/555: stat d0/d69/f46 0 2026-03-10T07:51:33.832 INFO:tasks.workunit.client.1.vm08.stdout:7/481: dread - d3/da/d25/f94 zero size 2026-03-10T07:51:33.858 INFO:tasks.workunit.client.1.vm08.stdout:5/523: mkdir d0/d4/df/d1e/daf 0 2026-03-10T07:51:33.858 INFO:tasks.workunit.client.1.vm08.stdout:0/493: dwrite dd/d10/d14/d15/d20/d22/f2e [0,4194304] 0 2026-03-10T07:51:33.859 INFO:tasks.workunit.client.1.vm08.stdout:2/504: creat d0/d1/d17/d6b/da0/fa3 x:0 0 0 2026-03-10T07:51:33.861 INFO:tasks.workunit.client.1.vm08.stdout:0/494: chown dd/d10/d14/f69 102554303 1 2026-03-10T07:51:33.862 INFO:tasks.workunit.client.1.vm08.stdout:0/495: write dd/d10/d2f/d37/f65 [5026043,92492] 0 2026-03-10T07:51:33.863 INFO:tasks.workunit.client.1.vm08.stdout:0/496: truncate dd/d10/d2f/d37/d64/d95/f2a 2331672 0 2026-03-10T07:51:33.876 INFO:tasks.workunit.client.1.vm08.stdout:9/502: truncate d2/d3/f5f 2928358 0 2026-03-10T07:51:33.877 INFO:tasks.workunit.client.1.vm08.stdout:9/503: chown d2/l9 88 1 2026-03-10T07:51:33.879 INFO:tasks.workunit.client.1.vm08.stdout:1/469: creat d2/d6/de/fa5 x:0 0 0 2026-03-10T07:51:33.885 INFO:tasks.workunit.client.1.vm08.stdout:8/556: creat d0/d37/d86/fb7 x:0 0 0 2026-03-10T07:51:33.885 INFO:tasks.workunit.client.1.vm08.stdout:8/557: chown d0/fa3 16358902 1 2026-03-10T07:51:33.888 INFO:tasks.workunit.client.1.vm08.stdout:7/482: fsync d3/da/d25/d9/f59 0 2026-03-10T07:51:33.892 INFO:tasks.workunit.client.1.vm08.stdout:8/558: dwrite d0/f2a [0,4194304] 0 2026-03-10T07:51:33.892 INFO:tasks.workunit.client.1.vm08.stdout:8/559: dread - d0/d37/d86/f90 zero size 2026-03-10T07:51:33.894 INFO:tasks.workunit.client.1.vm08.stdout:8/560: read - d0/d37/d86/fb7 zero size 2026-03-10T07:51:33.900 
INFO:tasks.workunit.client.1.vm08.stdout:6/503: getdents d1/d17/d2b/d5e 0 2026-03-10T07:51:33.900 INFO:tasks.workunit.client.1.vm08.stdout:8/561: truncate d0/d37/d86/f90 921964 0 2026-03-10T07:51:33.900 INFO:tasks.workunit.client.1.vm08.stdout:5/524: rename d0/d4/d19/fa0 to d0/d4/d19/d3a/d69/fb0 0 2026-03-10T07:51:33.901 INFO:tasks.workunit.client.1.vm08.stdout:2/505: rmdir d0/d1/d3/d39/d7d/d86 39 2026-03-10T07:51:33.912 INFO:tasks.workunit.client.1.vm08.stdout:9/504: creat d2/d86/d30/fac x:0 0 0 2026-03-10T07:51:33.912 INFO:tasks.workunit.client.1.vm08.stdout:9/505: readlink d2/l9 0 2026-03-10T07:51:33.915 INFO:tasks.workunit.client.1.vm08.stdout:3/520: fsync d0/d3c/d18/d32/d61/d52/f73 0 2026-03-10T07:51:33.926 INFO:tasks.workunit.client.1.vm08.stdout:8/562: unlink d0/d37/l94 0 2026-03-10T07:51:33.928 INFO:tasks.workunit.client.1.vm08.stdout:6/504: dwrite d1/f35 [0,4194304] 0 2026-03-10T07:51:33.939 INFO:tasks.workunit.client.1.vm08.stdout:3/521: fsync d0/d3c/d1f/f93 0 2026-03-10T07:51:33.939 INFO:tasks.workunit.client.1.vm08.stdout:6/505: dwrite d1/d7d/f91 [0,4194304] 0 2026-03-10T07:51:33.941 INFO:tasks.workunit.client.1.vm08.stdout:3/522: readlink d0/d3c/d1f/d44/d51/l2a 0 2026-03-10T07:51:33.941 INFO:tasks.workunit.client.1.vm08.stdout:7/483: symlink d3/da/d25/d9/d2f/d39/d43/d4f/la7 0 2026-03-10T07:51:33.941 INFO:tasks.workunit.client.1.vm08.stdout:7/484: dread - d3/fa4 zero size 2026-03-10T07:51:33.942 INFO:tasks.workunit.client.1.vm08.stdout:8/563: symlink d0/df/d15/d23/d39/d5b/d4a/lb8 0 2026-03-10T07:51:33.945 INFO:tasks.workunit.client.1.vm08.stdout:7/485: write d3/da/d25/d9/d2f/d39/f58 [960019,38664] 0 2026-03-10T07:51:33.951 INFO:tasks.workunit.client.1.vm08.stdout:8/564: dread d0/df/f60 [0,4194304] 0 2026-03-10T07:51:33.951 INFO:tasks.workunit.client.1.vm08.stdout:8/565: write d0/d69/d77/f87 [544454,77586] 0 2026-03-10T07:51:33.951 INFO:tasks.workunit.client.1.vm08.stdout:8/566: fsync d0/df/d15/f70 0 2026-03-10T07:51:33.951 
INFO:tasks.workunit.client.1.vm08.stdout:8/567: stat d0/df/d15/d23 0 2026-03-10T07:51:33.951 INFO:tasks.workunit.client.1.vm08.stdout:8/568: dread - d0/fa3 zero size 2026-03-10T07:51:33.957 INFO:tasks.workunit.client.1.vm08.stdout:2/506: creat d0/d1/d3/d56/d9b/d9c/fa4 x:0 0 0 2026-03-10T07:51:33.960 INFO:tasks.workunit.client.1.vm08.stdout:4/412: creat d5/d8/f86 x:0 0 0 2026-03-10T07:51:33.964 INFO:tasks.workunit.client.1.vm08.stdout:4/413: chown d5/d8/f1e 14165519 1 2026-03-10T07:51:33.974 INFO:tasks.workunit.client.1.vm08.stdout:4/414: dwrite d5/d8/d9/d12/d7b/d48/f5d [0,4194304] 0 2026-03-10T07:51:33.974 INFO:tasks.workunit.client.1.vm08.stdout:4/415: chown d5/d8/d9/l81 93 1 2026-03-10T07:51:33.974 INFO:tasks.workunit.client.1.vm08.stdout:4/416: chown d5/d1f/d70 254209480 1 2026-03-10T07:51:33.974 INFO:tasks.workunit.client.1.vm08.stdout:8/569: dread d0/d37/f47 [0,4194304] 0 2026-03-10T07:51:33.974 INFO:tasks.workunit.client.1.vm08.stdout:8/570: fdatasync d0/df/d17/d7a/d89/fac 0 2026-03-10T07:51:33.992 INFO:tasks.workunit.client.1.vm08.stdout:6/506: mkdir d1/d17/d2b/d5e/da8 0 2026-03-10T07:51:33.992 INFO:tasks.workunit.client.1.vm08.stdout:6/507: chown d1/d3/df/d44 5066479 1 2026-03-10T07:51:34.001 INFO:tasks.workunit.client.1.vm08.stdout:5/525: rename d0/d4/d19/d3a/d69/fb0 to d0/d4/d19/d81/d92/fb1 0 2026-03-10T07:51:34.001 INFO:tasks.workunit.client.1.vm08.stdout:0/497: write dd/d10/d2f/f81 [189200,18676] 0 2026-03-10T07:51:34.002 INFO:tasks.workunit.client.1.vm08.stdout:5/526: chown d0/d4/d19/d60/d6d/d70/f67 202113 1 2026-03-10T07:51:34.090 INFO:tasks.workunit.client.1.vm08.stdout:2/507: write d0/d1/d3/d39/d7d/d86/f7f [737224,66194] 0 2026-03-10T07:51:34.095 INFO:tasks.workunit.client.1.vm08.stdout:1/470: getdents d2/d6/de/d1f 0 2026-03-10T07:51:34.108 INFO:tasks.workunit.client.1.vm08.stdout:8/571: dwrite d0/d69/d3f/fb3 [4194304,4194304] 0 2026-03-10T07:51:34.108 INFO:tasks.workunit.client.1.vm08.stdout:8/572: write d0/df/d2e/d30/f43 [1203274,36829] 0 
2026-03-10T07:51:34.115 INFO:tasks.workunit.client.1.vm08.stdout:6/508: symlink d1/d3/df/d1d/d40/la9 0 2026-03-10T07:51:34.116 INFO:tasks.workunit.client.1.vm08.stdout:6/509: chown d1/d3/d3e/f81 19480490 1 2026-03-10T07:51:34.137 INFO:tasks.workunit.client.1.vm08.stdout:0/498: rename f6 to dd/d10/d14/d15/d20/d5f/fa2 0 2026-03-10T07:51:34.144 INFO:tasks.workunit.client.1.vm08.stdout:5/527: chown d0/d8/d5e/c9c 1274 1 2026-03-10T07:51:34.161 INFO:tasks.workunit.client.1.vm08.stdout:9/506: getdents d2/d86/d30/d35 0 2026-03-10T07:51:34.174 INFO:tasks.workunit.client.1.vm08.stdout:4/417: symlink d5/d8/d9/d12/d7b/d48/l87 0 2026-03-10T07:51:34.175 INFO:tasks.workunit.client.1.vm08.stdout:1/471: rmdir d2/d6/de/d1f/d26/d98 39 2026-03-10T07:51:34.176 INFO:tasks.workunit.client.1.vm08.stdout:1/472: dread - d2/d6/de/d70/d80/f85 zero size 2026-03-10T07:51:34.204 INFO:tasks.workunit.client.1.vm08.stdout:8/573: truncate d0/df/d15/d23/d39/f3e 275541 0 2026-03-10T07:51:34.204 INFO:tasks.workunit.client.1.vm08.stdout:8/574: stat d0/df/d15/d23/d39/f85 0 2026-03-10T07:51:34.206 INFO:tasks.workunit.client.1.vm08.stdout:6/510: truncate d1/d17/f63 1726824 0 2026-03-10T07:51:34.224 INFO:tasks.workunit.client.1.vm08.stdout:8/575: dread d0/df/f1b [0,4194304] 0 2026-03-10T07:51:34.226 INFO:tasks.workunit.client.1.vm08.stdout:0/499: dwrite f2 [0,4194304] 0 2026-03-10T07:51:34.228 INFO:tasks.workunit.client.1.vm08.stdout:5/528: rmdir d0/d4/d19/d50 39 2026-03-10T07:51:34.229 INFO:tasks.workunit.client.1.vm08.stdout:3/523: link d0/d3c/d1f/d44/d51/d2d/f3a d0/d3c/d18/fa6 0 2026-03-10T07:51:34.233 INFO:tasks.workunit.client.1.vm08.stdout:9/507: creat d2/d86/d30/d35/fad x:0 0 0 2026-03-10T07:51:34.243 INFO:tasks.workunit.client.1.vm08.stdout:2/508: truncate d0/f12 897960 0 2026-03-10T07:51:34.245 INFO:tasks.workunit.client.1.vm08.stdout:4/418: symlink d5/l88 0 2026-03-10T07:51:34.250 INFO:tasks.workunit.client.1.vm08.stdout:6/511: creat d1/d3/df/d1d/d40/d45/faa x:0 0 0 2026-03-10T07:51:34.252 
INFO:tasks.workunit.client.1.vm08.stdout:6/512: dread d1/d3/df/d44/f5a [0,4194304] 0 2026-03-10T07:51:34.252 INFO:tasks.workunit.client.1.vm08.stdout:6/513: readlink d1/db/l14 0 2026-03-10T07:51:34.258 INFO:tasks.workunit.client.1.vm08.stdout:5/529: symlink d0/d8/d5e/d8e/lb2 0 2026-03-10T07:51:34.261 INFO:tasks.workunit.client.1.vm08.stdout:0/500: symlink dd/d10/d14/d1b/d30/la3 0 2026-03-10T07:51:34.264 INFO:tasks.workunit.client.1.vm08.stdout:7/486: link d3/da/d25/d9/d2f/d39/l95 d3/da/d25/la8 0 2026-03-10T07:51:34.268 INFO:tasks.workunit.client.1.vm08.stdout:2/509: creat d0/d1/d3/d39/d7d/d7e/fa5 x:0 0 0 2026-03-10T07:51:34.275 INFO:tasks.workunit.client.1.vm08.stdout:1/473: mknod d2/d6/de/d1f/d26/ca6 0 2026-03-10T07:51:34.300 INFO:tasks.workunit.client.1.vm08.stdout:0/501: creat dd/d10/d2f/d37/d64/d95/d58/d3d/fa4 x:0 0 0 2026-03-10T07:51:34.302 INFO:tasks.workunit.client.1.vm08.stdout:9/508: symlink d2/d86/d30/d35/d97/d9d/lae 0 2026-03-10T07:51:34.309 INFO:tasks.workunit.client.1.vm08.stdout:2/510: dread d0/d1/d3/d39/d7d/d86/d55/f13 [0,4194304] 0 2026-03-10T07:51:34.311 INFO:tasks.workunit.client.1.vm08.stdout:4/419: write d5/d8/d9/d12/d7b/d48/f5d [2359217,75865] 0 2026-03-10T07:51:34.318 INFO:tasks.workunit.client.1.vm08.stdout:5/530: creat d0/d77/daa/fb3 x:0 0 0 2026-03-10T07:51:34.319 INFO:tasks.workunit.client.1.vm08.stdout:5/531: chown d0/d4/df/d12/f13 61084 1 2026-03-10T07:51:34.324 INFO:tasks.workunit.client.1.vm08.stdout:0/502: mkdir dd/d10/d14/d1b/da5 0 2026-03-10T07:51:34.326 INFO:tasks.workunit.client.1.vm08.stdout:7/487: creat d3/da/d25/d9/d2f/d4d/fa9 x:0 0 0 2026-03-10T07:51:34.328 INFO:tasks.workunit.client.1.vm08.stdout:1/474: creat d2/d6/d9f/fa7 x:0 0 0 2026-03-10T07:51:34.403 INFO:tasks.workunit.client.1.vm08.stdout:2/511: mkdir d0/d1/d3/d56/d78/da6 0 2026-03-10T07:51:34.407 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:51:34 vm05.local ceph-mon[50387]: from='client.24489 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", 
"target": ["mon-mgr", ""]}]: dispatch 2026-03-10T07:51:34.407 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:51:34 vm05.local ceph-mon[50387]: pgmap v36: 65 pgs: 65 active+clean; 2.8 GiB data, 9.3 GiB used, 111 GiB / 120 GiB avail; 21 MiB/s rd, 115 MiB/s wr, 215 op/s 2026-03-10T07:51:34.417 INFO:tasks.workunit.client.1.vm08.stdout:8/576: getdents d0/df/d15/d95 0 2026-03-10T07:51:34.418 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:51:34 vm08.local ceph-mon[59917]: from='client.24489 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T07:51:34.418 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:51:34 vm08.local ceph-mon[59917]: pgmap v36: 65 pgs: 65 active+clean; 2.8 GiB data, 9.3 GiB used, 111 GiB / 120 GiB avail; 21 MiB/s rd, 115 MiB/s wr, 215 op/s 2026-03-10T07:51:34.420 INFO:tasks.workunit.client.1.vm08.stdout:8/577: dread d0/d69/d77/f87 [0,4194304] 0 2026-03-10T07:51:34.420 INFO:tasks.workunit.client.1.vm08.stdout:8/578: chown d0/df/l10 1172962 1 2026-03-10T07:51:34.425 INFO:tasks.workunit.client.1.vm08.stdout:5/532: write d0/d4/f75 [708296,44672] 0 2026-03-10T07:51:34.425 INFO:tasks.workunit.client.1.vm08.stdout:3/524: getdents d0/d3c/d1f/d44/d51 0 2026-03-10T07:51:34.428 INFO:tasks.workunit.client.1.vm08.stdout:5/533: dread d0/d4/d19/d81/d92/f74 [0,4194304] 0 2026-03-10T07:51:34.431 INFO:tasks.workunit.client.1.vm08.stdout:7/488: mknod d3/da/d25/d9/d2f/d39/caa 0 2026-03-10T07:51:34.432 INFO:tasks.workunit.client.1.vm08.stdout:9/509: getdents d2/de/d28/d98 0 2026-03-10T07:51:34.439 INFO:tasks.workunit.client.1.vm08.stdout:1/475: fsync d2/d6/de/d1f/f3d 0 2026-03-10T07:51:34.439 INFO:tasks.workunit.client.1.vm08.stdout:1/476: read - d2/d6/d3a/d61/f88 zero size 2026-03-10T07:51:34.440 INFO:tasks.workunit.client.1.vm08.stdout:1/477: fsync d2/d6/d3a/f7d 0 2026-03-10T07:51:34.443 INFO:tasks.workunit.client.1.vm08.stdout:2/512: fdatasync d0/d1/d3/d39/d7d/d86/d55/d7a/f94 0 
2026-03-10T07:51:34.446 INFO:tasks.workunit.client.1.vm08.stdout:6/514: getdents d1/d3 0 2026-03-10T07:51:34.448 INFO:tasks.workunit.client.1.vm08.stdout:4/420: mkdir d5/d8/d89 0 2026-03-10T07:51:34.452 INFO:tasks.workunit.client.1.vm08.stdout:8/579: creat d0/df/d2e/fb9 x:0 0 0 2026-03-10T07:51:34.452 INFO:tasks.workunit.client.1.vm08.stdout:3/525: rmdir d0/d3c/d1f/d44 39 2026-03-10T07:51:34.454 INFO:tasks.workunit.client.1.vm08.stdout:5/534: rmdir d0/d8 39 2026-03-10T07:51:34.455 INFO:tasks.workunit.client.1.vm08.stdout:0/503: mkdir dd/d10/d14/da6 0 2026-03-10T07:51:34.457 INFO:tasks.workunit.client.1.vm08.stdout:9/510: mkdir d2/d86/daf 0 2026-03-10T07:51:34.468 INFO:tasks.workunit.client.1.vm08.stdout:6/515: mknod d1/d3/df/d52/cab 0 2026-03-10T07:51:34.468 INFO:tasks.workunit.client.1.vm08.stdout:6/516: readlink d1/l2f 0 2026-03-10T07:51:34.468 INFO:tasks.workunit.client.1.vm08.stdout:6/517: stat d1/d17/d2b/f68 0 2026-03-10T07:51:34.469 INFO:tasks.workunit.client.1.vm08.stdout:6/518: write d1/d3/d3e/f43 [562112,19105] 0 2026-03-10T07:51:34.469 INFO:tasks.workunit.client.1.vm08.stdout:4/421: mknod d5/d8/d9/d12/d7b/d48/d4f/c8a 0 2026-03-10T07:51:34.471 INFO:tasks.workunit.client.1.vm08.stdout:6/519: readlink d1/d3/la 0 2026-03-10T07:51:34.472 INFO:tasks.workunit.client.1.vm08.stdout:8/580: dread d0/df/d15/d23/da8/f6a [0,4194304] 0 2026-03-10T07:51:34.473 INFO:tasks.workunit.client.1.vm08.stdout:6/520: read d1/f28 [905360,72116] 0 2026-03-10T07:51:34.474 INFO:tasks.workunit.client.1.vm08.stdout:6/521: chown d1/d3/d3e 1101853 1 2026-03-10T07:51:34.475 INFO:tasks.workunit.client.1.vm08.stdout:8/581: read d0/df/d5d/f81 [2822452,49486] 0 2026-03-10T07:51:34.475 INFO:tasks.workunit.client.1.vm08.stdout:8/582: chown d0/c2d 406907891 1 2026-03-10T07:51:34.475 INFO:tasks.workunit.client.1.vm08.stdout:8/583: chown d0/d37 220220 1 2026-03-10T07:51:34.478 INFO:tasks.workunit.client.1.vm08.stdout:3/526: write d0/d3c/d18/d48/d55/d56/f81 [342546,47241] 0 2026-03-10T07:51:34.479 
INFO:tasks.workunit.client.1.vm08.stdout:3/527: stat d0/d3c/d18/d48/d55/d56/f9c 0 2026-03-10T07:51:34.487 INFO:tasks.workunit.client.1.vm08.stdout:2/513: dread d0/f1e [0,4194304] 0 2026-03-10T07:51:34.488 INFO:tasks.workunit.client.1.vm08.stdout:2/514: fdatasync d0/d1/d3/d56/d9b/d9c/fa4 0 2026-03-10T07:51:34.493 INFO:tasks.workunit.client.1.vm08.stdout:7/489: rename d3/da/d25/f5c to d3/da/d25/d9/d6f/fab 0 2026-03-10T07:51:34.494 INFO:tasks.workunit.client.1.vm08.stdout:0/504: dwrite dd/d10/d2f/d37/d64/d95/d58/f86 [0,4194304] 0 2026-03-10T07:51:34.495 INFO:tasks.workunit.client.1.vm08.stdout:0/505: dread - dd/d10/d14/d1b/f2c zero size 2026-03-10T07:51:34.502 INFO:tasks.workunit.client.1.vm08.stdout:9/511: dwrite d2/d86/f24 [4194304,4194304] 0 2026-03-10T07:51:34.510 INFO:tasks.workunit.client.1.vm08.stdout:1/478: link d2/d6/de/f32 d2/d6/de/d1f/d26/d89/fa8 0 2026-03-10T07:51:34.512 INFO:tasks.workunit.client.1.vm08.stdout:4/422: mkdir d5/d8/d9/d12/d7b/d8b 0 2026-03-10T07:51:34.514 INFO:tasks.workunit.client.1.vm08.stdout:1/479: dwrite d2/d6/de/d1f/d26/f6e [4194304,4194304] 0 2026-03-10T07:51:34.520 INFO:tasks.workunit.client.1.vm08.stdout:6/522: unlink d1/c64 0 2026-03-10T07:51:34.531 INFO:tasks.workunit.client.1.vm08.stdout:5/535: rmdir d0/d4/d19/d50 39 2026-03-10T07:51:34.532 INFO:tasks.workunit.client.1.vm08.stdout:5/536: write d0/d77/daa/fb3 [633839,87328] 0 2026-03-10T07:51:34.534 INFO:tasks.workunit.client.1.vm08.stdout:2/515: read - d0/d1/d17/f95 zero size 2026-03-10T07:51:34.555 INFO:tasks.workunit.client.1.vm08.stdout:3/528: read d0/d3c/d1f/d44/d51/f9d [453075,7382] 0 2026-03-10T07:51:34.555 INFO:tasks.workunit.client.1.vm08.stdout:9/512: dread d2/f6 [0,4194304] 0 2026-03-10T07:51:34.556 INFO:tasks.workunit.client.1.vm08.stdout:3/529: chown d0/d3c/d1f/d44/d51/f65 342485697 1 2026-03-10T07:51:34.561 INFO:tasks.workunit.client.1.vm08.stdout:5/537: creat d0/d4/d19/d60/d6d/d70/fb4 x:0 0 0 2026-03-10T07:51:34.564 INFO:tasks.workunit.client.1.vm08.stdout:5/538: 
dread d0/d4/df/f2a [4194304,4194304] 0 2026-03-10T07:51:34.564 INFO:tasks.workunit.client.1.vm08.stdout:2/516: truncate d0/d1/d3/d39/f3b 2601552 0 2026-03-10T07:51:34.564 INFO:tasks.workunit.client.1.vm08.stdout:2/517: write d0/d1/d3/d56/d9b/d9c/fa4 [595138,96087] 0 2026-03-10T07:51:34.576 INFO:tasks.workunit.client.1.vm08.stdout:1/480: sync 2026-03-10T07:51:34.577 INFO:tasks.workunit.client.1.vm08.stdout:1/481: stat d2/d6/de 0 2026-03-10T07:51:34.582 INFO:tasks.workunit.client.1.vm08.stdout:9/513: dwrite d2/d58/f77 [0,4194304] 0 2026-03-10T07:51:34.584 INFO:tasks.workunit.client.1.vm08.stdout:1/482: dwrite d2/d6/de/d1f/d26/d58/d8c/f87 [4194304,4194304] 0 2026-03-10T07:51:34.600 INFO:tasks.workunit.client.1.vm08.stdout:8/584: write d0/df/f26 [5353606,41213] 0 2026-03-10T07:51:34.604 INFO:tasks.workunit.client.1.vm08.stdout:7/490: write d3/da/d25/f35 [1178272,93783] 0 2026-03-10T07:51:34.604 INFO:tasks.workunit.client.1.vm08.stdout:6/523: dwrite d1/d3/d3e/f56 [0,4194304] 0 2026-03-10T07:51:34.606 INFO:tasks.workunit.client.1.vm08.stdout:7/491: fdatasync d3/da/d25/d9/d2f/d39/f58 0 2026-03-10T07:51:34.614 INFO:tasks.workunit.client.1.vm08.stdout:7/492: dread d3/da/d25/d9/d2f/f97 [0,4194304] 0 2026-03-10T07:51:34.615 INFO:tasks.workunit.client.1.vm08.stdout:5/539: dwrite d0/d4/df/f2a [0,4194304] 0 2026-03-10T07:51:34.615 INFO:tasks.workunit.client.1.vm08.stdout:7/493: dread - d3/da/d25/d9/d2f/d39/d43/d4f/f8f zero size 2026-03-10T07:51:34.618 INFO:tasks.workunit.client.1.vm08.stdout:0/506: rename lc to dd/d10/d14/d15/la7 0 2026-03-10T07:51:34.619 INFO:tasks.workunit.client.1.vm08.stdout:0/507: chown dd/d10/d14/d1b/d30/f4d 59 1 2026-03-10T07:51:34.633 INFO:tasks.workunit.client.1.vm08.stdout:9/514: write d2/f51 [1128022,4657] 0 2026-03-10T07:51:34.653 INFO:tasks.workunit.client.1.vm08.stdout:6/524: mkdir d1/db/d24/dac 0 2026-03-10T07:51:34.662 INFO:tasks.workunit.client.1.vm08.stdout:2/518: mkdir d0/d1/d3/d10/d32/d61/d84/da7 0 2026-03-10T07:51:34.664 
INFO:tasks.workunit.client.1.vm08.stdout:5/540: read d0/d4/df/d1e/f37 [2702467,90530] 0 2026-03-10T07:51:34.678 INFO:tasks.workunit.client.1.vm08.stdout:2/519: dread d0/d1/d17/f1a [4194304,4194304] 0 2026-03-10T07:51:34.678 INFO:tasks.workunit.client.1.vm08.stdout:2/520: stat d0/d1/d3/d10/d65/f7c 0 2026-03-10T07:51:34.681 INFO:tasks.workunit.client.1.vm08.stdout:4/423: getdents d5/d8/d9/d12/d7b/d48 0 2026-03-10T07:51:34.691 INFO:tasks.workunit.client.1.vm08.stdout:6/525: unlink d1/d3/df/d1d/d40/d45/l62 0 2026-03-10T07:51:34.701 INFO:tasks.workunit.client.1.vm08.stdout:5/541: dwrite d0/d4/df/d82/f8d [0,4194304] 0 2026-03-10T07:51:34.710 INFO:tasks.workunit.client.1.vm08.stdout:5/542: dread d0/d77/daa/fb3 [0,4194304] 0 2026-03-10T07:51:34.710 INFO:tasks.workunit.client.1.vm08.stdout:5/543: write d0/d4/df/d82/f8d [4444307,117912] 0 2026-03-10T07:51:34.714 INFO:tasks.workunit.client.1.vm08.stdout:7/494: dwrite d3/da/d25/d9/d2f/d3a/d4b/d67/d69/f65 [0,4194304] 0 2026-03-10T07:51:34.714 INFO:tasks.workunit.client.1.vm08.stdout:2/521: creat d0/d1/d3/d10/d32/d61/d84/fa8 x:0 0 0 2026-03-10T07:51:34.726 INFO:tasks.workunit.client.1.vm08.stdout:3/530: getdents d0/d3c/d1f/d44/d51/d2d/d85 0 2026-03-10T07:51:34.726 INFO:tasks.workunit.client.1.vm08.stdout:3/531: dread - d0/d3c/d18/d80/f86 zero size 2026-03-10T07:51:34.730 INFO:tasks.workunit.client.1.vm08.stdout:8/585: rmdir d0/df/d15/d23/da8/db4 0 2026-03-10T07:51:34.732 INFO:tasks.workunit.client.1.vm08.stdout:6/526: rename d1/db/d24/d51 to d1/db/d24/dac/dad 0 2026-03-10T07:51:34.744 INFO:tasks.workunit.client.1.vm08.stdout:3/532: sync 2026-03-10T07:51:34.748 INFO:tasks.workunit.client.1.vm08.stdout:0/508: truncate dd/d10/f5e 868078 0 2026-03-10T07:51:34.753 INFO:tasks.workunit.client.1.vm08.stdout:0/509: dread dd/d10/d2f/d37/f65 [0,4194304] 0 2026-03-10T07:51:34.755 INFO:tasks.workunit.client.1.vm08.stdout:7/495: mkdir d3/da/d25/d9/d2f/d3a/d4b/d67/d69/dac 0 2026-03-10T07:51:34.756 
INFO:tasks.workunit.client.1.vm08.stdout:7/496: chown d3/da/d25/d9/d2f/d39/d43/d4f/c9b 9 1 2026-03-10T07:51:34.756 INFO:tasks.workunit.client.1.vm08.stdout:1/483: getdents d2/d10 0 2026-03-10T07:51:34.768 INFO:tasks.workunit.client.1.vm08.stdout:8/586: rename d0/df/d17/d7a to d0/df/d15/d23/d54/dba 0 2026-03-10T07:51:34.772 INFO:tasks.workunit.client.1.vm08.stdout:6/527: symlink d1/db/d24/lae 0 2026-03-10T07:51:34.776 INFO:tasks.workunit.client.1.vm08.stdout:4/424: rmdir d5/d8/d9/d12/d7b/d8b 0 2026-03-10T07:51:34.777 INFO:tasks.workunit.client.1.vm08.stdout:9/515: getdents d2/d86/d30/d35/d9b 0 2026-03-10T07:51:34.779 INFO:tasks.workunit.client.1.vm08.stdout:3/533: unlink d0/d3c/d1f/d44/d51/l2a 0 2026-03-10T07:51:34.782 INFO:tasks.workunit.client.1.vm08.stdout:3/534: dwrite d0/d3c/d1f/d44/f8c [4194304,4194304] 0 2026-03-10T07:51:34.787 INFO:tasks.workunit.client.1.vm08.stdout:0/510: chown dd/d10/d14/d15/la7 3 1 2026-03-10T07:51:34.788 INFO:tasks.workunit.client.1.vm08.stdout:0/511: dread - dd/d10/d14/d15/f9c zero size 2026-03-10T07:51:34.788 INFO:tasks.workunit.client.1.vm08.stdout:0/512: chown dd/d10/d14/d15/d20/l66 9 1 2026-03-10T07:51:34.789 INFO:tasks.workunit.client.1.vm08.stdout:7/497: rmdir d3/da/d25/d9/d2f/d3a/d4b 39 2026-03-10T07:51:34.793 INFO:tasks.workunit.client.1.vm08.stdout:8/587: rmdir d0/df/d17/d72 39 2026-03-10T07:51:34.834 INFO:tasks.workunit.client.1.vm08.stdout:8/588: dwrite d0/df/f9f [0,4194304] 0 2026-03-10T07:51:34.834 INFO:tasks.workunit.client.1.vm08.stdout:8/589: dread - d0/fa3 zero size 2026-03-10T07:51:34.834 INFO:tasks.workunit.client.1.vm08.stdout:5/544: getdents d0/d8/d5e/d8e/dac 0 2026-03-10T07:51:34.834 INFO:tasks.workunit.client.1.vm08.stdout:9/516: rename d2/d86/d2b/l74 to d2/d86/d30/d35/d9b/lb0 0 2026-03-10T07:51:34.834 INFO:tasks.workunit.client.1.vm08.stdout:8/590: symlink d0/d37/lbb 0 2026-03-10T07:51:34.834 INFO:tasks.workunit.client.1.vm08.stdout:2/522: creat d0/d1/d3/d39/d7d/d86/fa9 x:0 0 0 2026-03-10T07:51:34.834 
INFO:tasks.workunit.client.1.vm08.stdout:6/528: mkdir d1/d17/d2b/d58/d77/daf 0 2026-03-10T07:51:34.834 INFO:tasks.workunit.client.1.vm08.stdout:6/529: read d1/f35 [1189447,27019] 0 2026-03-10T07:51:34.834 INFO:tasks.workunit.client.1.vm08.stdout:5/545: write d0/d4/d19/d60/d6d/d70/d40/fa8 [222963,10337] 0 2026-03-10T07:51:34.834 INFO:tasks.workunit.client.1.vm08.stdout:9/517: unlink d2/d86/d30/d35/c33 0 2026-03-10T07:51:34.834 INFO:tasks.workunit.client.1.vm08.stdout:4/425: link d5/d8/d9/d12/d7b/d48/d64/l7e d5/d8/d9/d12/d7b/d48/d64/l8c 0 2026-03-10T07:51:34.834 INFO:tasks.workunit.client.1.vm08.stdout:6/530: symlink d1/d3/df/lb0 0 2026-03-10T07:51:34.834 INFO:tasks.workunit.client.1.vm08.stdout:2/523: creat d0/d1/d3/d39/d7d/d86/d55/d1b/faa x:0 0 0 2026-03-10T07:51:34.834 INFO:tasks.workunit.client.1.vm08.stdout:9/518: symlink d2/d86/d30/lb1 0 2026-03-10T07:51:34.834 INFO:tasks.workunit.client.1.vm08.stdout:4/426: mkdir d5/d8/d9/d12/d7b/d48/d4f/d8d 0 2026-03-10T07:51:34.835 INFO:tasks.workunit.client.1.vm08.stdout:8/591: mkdir d0/df/d15/d23/d39/d5b/dbc 0 2026-03-10T07:51:34.835 INFO:tasks.workunit.client.1.vm08.stdout:2/524: chown d0/d1/d3/l4d 1739441 1 2026-03-10T07:51:34.835 INFO:tasks.workunit.client.1.vm08.stdout:9/519: chown d2/d3/c4a 3742 1 2026-03-10T07:51:34.835 INFO:tasks.workunit.client.1.vm08.stdout:5/546: rename d0/f80 to d0/fb5 0 2026-03-10T07:51:34.835 INFO:tasks.workunit.client.1.vm08.stdout:6/531: mkdir d1/d17/d2b/d58/d77/daf/db1 0 2026-03-10T07:51:34.835 INFO:tasks.workunit.client.1.vm08.stdout:4/427: symlink d5/d8/d9/d32/l8e 0 2026-03-10T07:51:34.835 INFO:tasks.workunit.client.1.vm08.stdout:5/547: read - d0/d4/d19/d81/d92/f76 zero size 2026-03-10T07:51:34.835 INFO:tasks.workunit.client.1.vm08.stdout:8/592: creat d0/d69/d3f/fbd x:0 0 0 2026-03-10T07:51:34.835 INFO:tasks.workunit.client.1.vm08.stdout:9/520: rename d2/d86/c3f to d2/d86/d30/d35/d97/cb2 0 2026-03-10T07:51:34.835 INFO:tasks.workunit.client.1.vm08.stdout:5/548: truncate d0/d4/d19/d3a/fa1 
735648 0 2026-03-10T07:51:34.835 INFO:tasks.workunit.client.1.vm08.stdout:9/521: write d2/d58/f93 [81456,23915] 0 2026-03-10T07:51:34.835 INFO:tasks.workunit.client.1.vm08.stdout:8/593: getdents d0/df/d15/d23/d39/d5b/d4a 0 2026-03-10T07:51:34.835 INFO:tasks.workunit.client.1.vm08.stdout:4/428: dwrite d5/d1f/d31/f82 [0,4194304] 0 2026-03-10T07:51:34.835 INFO:tasks.workunit.client.1.vm08.stdout:5/549: rmdir d0/d4/d19/d60/d6d/d98 0 2026-03-10T07:51:34.835 INFO:tasks.workunit.client.1.vm08.stdout:5/550: write d0/d4/d19/d3a/d69/f6b [4041836,126787] 0 2026-03-10T07:51:34.836 INFO:tasks.workunit.client.1.vm08.stdout:5/551: dwrite d0/d4/d19/d43/f59 [0,4194304] 0 2026-03-10T07:51:34.839 INFO:tasks.workunit.client.1.vm08.stdout:5/552: truncate d0/d4/df/d1e/f25 4816487 0 2026-03-10T07:51:34.848 INFO:tasks.workunit.client.1.vm08.stdout:0/513: dread dd/d10/d14/d15/f41 [0,4194304] 0 2026-03-10T07:51:34.848 INFO:tasks.workunit.client.1.vm08.stdout:4/429: mknod d5/d8/d9/d12/d7b/d48/d4f/c8f 0 2026-03-10T07:51:34.848 INFO:tasks.workunit.client.1.vm08.stdout:5/553: creat d0/d4/d19/d60/fb6 x:0 0 0 2026-03-10T07:51:34.848 INFO:tasks.workunit.client.1.vm08.stdout:8/594: mknod d0/df/d15/d23/d39/d5b/dbc/cbe 0 2026-03-10T07:51:34.848 INFO:tasks.workunit.client.1.vm08.stdout:8/595: truncate d0/df/d2e/d30/f43 2018080 0 2026-03-10T07:51:34.848 INFO:tasks.workunit.client.1.vm08.stdout:5/554: dwrite d0/d4/d19/d81/d92/f65 [0,4194304] 0 2026-03-10T07:51:34.851 INFO:tasks.workunit.client.1.vm08.stdout:8/596: mkdir d0/df/d15/d23/d54/dba/d89/dbf 0 2026-03-10T07:51:34.861 INFO:tasks.workunit.client.1.vm08.stdout:4/430: creat d5/d8/f90 x:0 0 0 2026-03-10T07:51:34.861 INFO:tasks.workunit.client.1.vm08.stdout:4/431: chown d5/d8/d9/d12/d7b/d48/f5d 0 1 2026-03-10T07:51:34.861 INFO:tasks.workunit.client.1.vm08.stdout:4/432: stat d5/d8/f39 0 2026-03-10T07:51:34.861 INFO:tasks.workunit.client.1.vm08.stdout:8/597: mkdir d0/df/d2e/d30/dc0 0 2026-03-10T07:51:34.861 
INFO:tasks.workunit.client.1.vm08.stdout:4/433: dread d5/d1f/d70/f78 [0,4194304] 0 2026-03-10T07:51:34.861 INFO:tasks.workunit.client.1.vm08.stdout:8/598: dread d0/df/d2e/f3c [0,4194304] 0 2026-03-10T07:51:34.861 INFO:tasks.workunit.client.1.vm08.stdout:4/434: mkdir d5/d8/d9/d12/d7b/d48/d4f/d8d/d91 0 2026-03-10T07:51:34.861 INFO:tasks.workunit.client.1.vm08.stdout:8/599: rmdir d0/df/d15/d23/d54/dba/d89 39 2026-03-10T07:51:34.861 INFO:tasks.workunit.client.1.vm08.stdout:8/600: chown d0/df/d15/f70 3127 1 2026-03-10T07:51:34.863 INFO:tasks.workunit.client.1.vm08.stdout:7/498: sync 2026-03-10T07:51:34.865 INFO:tasks.workunit.client.1.vm08.stdout:8/601: dwrite d0/df/d15/d23/f75 [0,4194304] 0 2026-03-10T07:51:34.865 INFO:tasks.workunit.client.1.vm08.stdout:7/499: truncate d3/da/d25/d9/d6f/fab 3746759 0 2026-03-10T07:51:34.866 INFO:tasks.workunit.client.1.vm08.stdout:8/602: readlink d0/df/d15/d23/d39/l3a 0 2026-03-10T07:51:34.867 INFO:tasks.workunit.client.1.vm08.stdout:8/603: write d0/df/f26 [5306396,25216] 0 2026-03-10T07:51:34.869 INFO:tasks.workunit.client.1.vm08.stdout:7/500: truncate d3/da/d25/d9/d2f/f42 318780 0 2026-03-10T07:51:34.873 INFO:tasks.workunit.client.1.vm08.stdout:7/501: dwrite d3/da/d25/d9/d2f/f62 [0,4194304] 0 2026-03-10T07:51:34.887 INFO:tasks.workunit.client.1.vm08.stdout:8/604: unlink d0/df/d15/d23/d54/dba/d89/cae 0 2026-03-10T07:51:34.898 INFO:tasks.workunit.client.1.vm08.stdout:7/502: fdatasync d3/f51 0 2026-03-10T07:51:34.898 INFO:tasks.workunit.client.1.vm08.stdout:8/605: rename d0/df/d2e/d49/ca5 to d0/df/d15/d23/d54/dba/d89/cc1 0 2026-03-10T07:51:34.900 INFO:tasks.workunit.client.1.vm08.stdout:8/606: dread d0/df/d2e/f3c [0,4194304] 0 2026-03-10T07:51:34.903 INFO:tasks.workunit.client.1.vm08.stdout:8/607: creat d0/df/d15/d23/da8/fc2 x:0 0 0 2026-03-10T07:51:34.903 INFO:tasks.workunit.client.1.vm08.stdout:8/608: dread - d0/d69/d3f/fbd zero size 2026-03-10T07:51:34.904 INFO:tasks.workunit.client.1.vm08.stdout:8/609: read d0/d37/f47 [480835,77841] 
0 2026-03-10T07:51:34.905 INFO:tasks.workunit.client.1.vm08.stdout:8/610: write d0/df/d2e/f9e [888972,108605] 0 2026-03-10T07:51:34.907 INFO:tasks.workunit.client.1.vm08.stdout:8/611: getdents d0/df/d17/d72 0 2026-03-10T07:51:34.929 INFO:tasks.workunit.client.1.vm08.stdout:8/612: write d0/df/d15/d23/d54/fad [18348,113166] 0 2026-03-10T07:51:34.929 INFO:tasks.workunit.client.1.vm08.stdout:8/613: creat d0/df/d15/d23/da8/fc3 x:0 0 0 2026-03-10T07:51:35.027 INFO:tasks.workunit.client.1.vm08.stdout:1/484: write d2/d6/de/d1f/d40/f4d [404429,68107] 0 2026-03-10T07:51:35.033 INFO:tasks.workunit.client.1.vm08.stdout:1/485: write d2/d6/f86 [2487424,96232] 0 2026-03-10T07:51:35.039 INFO:tasks.workunit.client.1.vm08.stdout:1/486: dwrite d2/d6/de/d47/f63 [0,4194304] 0 2026-03-10T07:51:35.049 INFO:tasks.workunit.client.1.vm08.stdout:3/535: dread d0/d3c/d1f/d44/d51/f4d [0,4194304] 0 2026-03-10T07:51:35.053 INFO:tasks.workunit.client.1.vm08.stdout:1/487: rmdir d2/d6/de/d1f/d26/d58/d9e 0 2026-03-10T07:51:35.053 INFO:tasks.workunit.client.1.vm08.stdout:1/488: mkdir d2/d6/de/d1f/da9 0 2026-03-10T07:51:35.054 INFO:tasks.workunit.client.1.vm08.stdout:1/489: fdatasync d2/d6/de/d70/d80/f85 0 2026-03-10T07:51:35.138 INFO:tasks.workunit.client.1.vm08.stdout:2/525: write d0/d1/d3/d10/d38/f54 [227535,93662] 0 2026-03-10T07:51:35.138 INFO:tasks.workunit.client.1.vm08.stdout:9/522: rmdir d2/d86/d30 39 2026-03-10T07:51:35.138 INFO:tasks.workunit.client.1.vm08.stdout:6/532: write d1/d3/f2e [4682667,78714] 0 2026-03-10T07:51:35.140 INFO:tasks.workunit.client.1.vm08.stdout:9/523: dread - d2/d3/d84/d91/fa6 zero size 2026-03-10T07:51:35.143 INFO:tasks.workunit.client.1.vm08.stdout:6/533: truncate d1/db/d24/d73/f78 221266 0 2026-03-10T07:51:35.144 INFO:tasks.workunit.client.1.vm08.stdout:2/526: unlink d0/f4a 0 2026-03-10T07:51:35.145 INFO:tasks.workunit.client.1.vm08.stdout:9/524: creat d2/d58/fb3 x:0 0 0 2026-03-10T07:51:35.160 INFO:tasks.workunit.client.1.vm08.stdout:0/514: write 
dd/d10/d2f/d37/d64/d95/f91 [844862,8192] 0 2026-03-10T07:51:35.195 INFO:tasks.workunit.client.1.vm08.stdout:0/515: readlink dd/d10/l26 0 2026-03-10T07:51:35.196 INFO:tasks.workunit.client.1.vm08.stdout:7/503: write d3/f34 [271140,86658] 0 2026-03-10T07:51:35.196 INFO:tasks.workunit.client.1.vm08.stdout:7/504: dread d3/da/f21 [0,4194304] 0 2026-03-10T07:51:35.196 INFO:tasks.workunit.client.1.vm08.stdout:5/555: dwrite d0/d8/d5e/f6a [0,4194304] 0 2026-03-10T07:51:35.196 INFO:tasks.workunit.client.1.vm08.stdout:8/614: dwrite d0/f20 [0,4194304] 0 2026-03-10T07:51:35.196 INFO:tasks.workunit.client.1.vm08.stdout:5/556: write d0/d4/d19/d3a/f3f [2793736,42782] 0 2026-03-10T07:51:35.196 INFO:tasks.workunit.client.1.vm08.stdout:4/435: dwrite d5/f2f [0,4194304] 0 2026-03-10T07:51:35.196 INFO:tasks.workunit.client.1.vm08.stdout:2/527: mknod d0/d1/d3/cab 0 2026-03-10T07:51:35.196 INFO:tasks.workunit.client.1.vm08.stdout:9/525: symlink d2/de/d28/d98/lb4 0 2026-03-10T07:51:35.196 INFO:tasks.workunit.client.1.vm08.stdout:1/490: chown d2/d6/de/d1f/d26/d58/d8c/l6a 44473308 1 2026-03-10T07:51:35.196 INFO:tasks.workunit.client.1.vm08.stdout:8/615: readlink d0/df/d15/d23/d39/d5b/l29 0 2026-03-10T07:51:35.196 INFO:tasks.workunit.client.1.vm08.stdout:5/557: rename d0/d4/d19/f6e to d0/d4/df/d12/d94/fb7 0 2026-03-10T07:51:35.196 INFO:tasks.workunit.client.1.vm08.stdout:4/436: creat d5/d8/d9/f92 x:0 0 0 2026-03-10T07:51:35.196 INFO:tasks.workunit.client.1.vm08.stdout:3/536: symlink d0/la7 0 2026-03-10T07:51:35.196 INFO:tasks.workunit.client.1.vm08.stdout:8/616: dwrite d0/df/d15/d23/f3d [0,4194304] 0 2026-03-10T07:51:35.196 INFO:tasks.workunit.client.1.vm08.stdout:7/505: mknod d3/da/d25/d9/d2f/d39/d43/d4f/d5b/d79/cad 0 2026-03-10T07:51:35.196 INFO:tasks.workunit.client.1.vm08.stdout:9/526: write d2/d3/f5f [68738,73862] 0 2026-03-10T07:51:35.196 INFO:tasks.workunit.client.1.vm08.stdout:6/534: sync 2026-03-10T07:51:35.205 INFO:tasks.workunit.client.1.vm08.stdout:1/491: rename d2/d6/de/d5f/l66 
to d2/d6/de/d1f/da9/laa 0 2026-03-10T07:51:35.205 INFO:tasks.workunit.client.1.vm08.stdout:5/558: mknod d0/d8/d24/cb8 0 2026-03-10T07:51:35.205 INFO:tasks.workunit.client.1.vm08.stdout:8/617: creat d0/df/d17/d25/fc4 x:0 0 0 2026-03-10T07:51:35.207 INFO:tasks.workunit.client.1.vm08.stdout:4/437: symlink d5/d1f/d41/l93 0 2026-03-10T07:51:35.209 INFO:tasks.workunit.client.1.vm08.stdout:5/559: stat d0/d4/d19/l87 0 2026-03-10T07:51:35.209 INFO:tasks.workunit.client.1.vm08.stdout:4/438: write d5/d8/d9/d32/f6c [131193,74771] 0 2026-03-10T07:51:35.209 INFO:tasks.workunit.client.1.vm08.stdout:6/535: write d1/d17/f66 [4398794,94977] 0 2026-03-10T07:51:35.209 INFO:tasks.workunit.client.1.vm08.stdout:7/506: symlink d3/da/d25/d9/d2f/d39/d43/d4f/lae 0 2026-03-10T07:51:35.213 INFO:tasks.workunit.client.1.vm08.stdout:8/618: mkdir d0/df/d15/d23/d54/dba/d89/dc5 0 2026-03-10T07:51:35.213 INFO:tasks.workunit.client.1.vm08.stdout:1/492: dwrite d2/d6/d3a/f7d [0,4194304] 0 2026-03-10T07:51:35.217 INFO:tasks.workunit.client.1.vm08.stdout:6/536: read - d1/db/d24/f75 zero size 2026-03-10T07:51:35.219 INFO:tasks.workunit.client.1.vm08.stdout:5/560: creat d0/d4/df/d1e/d41/dad/fb9 x:0 0 0 2026-03-10T07:51:35.219 INFO:tasks.workunit.client.1.vm08.stdout:4/439: symlink d5/d8/l94 0 2026-03-10T07:51:35.222 INFO:tasks.workunit.client.1.vm08.stdout:5/561: chown d0/c93 28 1 2026-03-10T07:51:35.225 INFO:tasks.workunit.client.1.vm08.stdout:6/537: mkdir d1/d3/d3e/db2 0 2026-03-10T07:51:35.234 INFO:tasks.workunit.client.1.vm08.stdout:9/527: link d2/d86/d30/d35/fad d2/fb5 0 2026-03-10T07:51:35.234 INFO:tasks.workunit.client.1.vm08.stdout:1/493: fsync d2/d6/de/d1f/d26/d58/d8c/f96 0 2026-03-10T07:51:35.234 INFO:tasks.workunit.client.1.vm08.stdout:4/440: mkdir d5/d8/d9/d95 0 2026-03-10T07:51:35.234 INFO:tasks.workunit.client.1.vm08.stdout:8/619: dread d0/d69/f46 [0,4194304] 0 2026-03-10T07:51:35.234 INFO:tasks.workunit.client.1.vm08.stdout:7/507: mknod d3/da/d25/d9/d2f/d3a/d4b/d67/d69/dac/caf 0 
2026-03-10T07:51:35.234 INFO:tasks.workunit.client.1.vm08.stdout:9/528: creat d2/d86/d30/d35/d9b/fb6 x:0 0 0 2026-03-10T07:51:35.234 INFO:tasks.workunit.client.1.vm08.stdout:7/508: readlink d3/da/d25/d9/d2f/d39/d43/d4f/l72 0 2026-03-10T07:51:35.234 INFO:tasks.workunit.client.1.vm08.stdout:8/620: chown d0/df/d2e/d30/l7c 14425 1 2026-03-10T07:51:35.234 INFO:tasks.workunit.client.1.vm08.stdout:4/441: creat d5/d8/d9/d95/f96 x:0 0 0 2026-03-10T07:51:35.238 INFO:tasks.workunit.client.1.vm08.stdout:1/494: creat d2/d6/de/d1f/d40/d76/fab x:0 0 0 2026-03-10T07:51:35.238 INFO:tasks.workunit.client.1.vm08.stdout:9/529: readlink d2/d86/d30/d35/d9b/lb0 0 2026-03-10T07:51:35.240 INFO:tasks.workunit.client.1.vm08.stdout:4/442: creat d5/d1f/d41/f97 x:0 0 0 2026-03-10T07:51:35.240 INFO:tasks.workunit.client.1.vm08.stdout:1/495: creat d2/d6/de/d47/fac x:0 0 0 2026-03-10T07:51:35.241 INFO:tasks.workunit.client.1.vm08.stdout:9/530: truncate d2/d3/d84/f8c 1269110 0 2026-03-10T07:51:35.241 INFO:tasks.workunit.client.1.vm08.stdout:8/621: dread d0/df/f60 [0,4194304] 0 2026-03-10T07:51:35.242 INFO:tasks.workunit.client.1.vm08.stdout:1/496: dread - d2/d6/de/d1f/f3d zero size 2026-03-10T07:51:35.244 INFO:tasks.workunit.client.1.vm08.stdout:9/531: mknod d2/d86/daf/cb7 0 2026-03-10T07:51:35.247 INFO:tasks.workunit.client.1.vm08.stdout:1/497: dwrite d2/d6/de/f5b [0,4194304] 0 2026-03-10T07:51:35.259 INFO:tasks.workunit.client.1.vm08.stdout:4/443: link d5/d1f/c2c d5/d8/d50/c98 0 2026-03-10T07:51:35.259 INFO:tasks.workunit.client.1.vm08.stdout:9/532: mkdir d2/d3/d84/d91/db8 0 2026-03-10T07:51:35.261 INFO:tasks.workunit.client.1.vm08.stdout:4/444: rmdir d5 39 2026-03-10T07:51:35.264 INFO:tasks.workunit.client.1.vm08.stdout:9/533: dwrite d2/de/d28/f96 [0,4194304] 0 2026-03-10T07:51:35.265 INFO:tasks.workunit.client.1.vm08.stdout:1/498: unlink d2/d6/de/d1f/d22/c7e 0 2026-03-10T07:51:35.268 INFO:tasks.workunit.client.1.vm08.stdout:1/499: mkdir d2/d6/d3a/d61/d6f/dad 0 2026-03-10T07:51:35.272 
INFO:tasks.workunit.client.1.vm08.stdout:0/516: dread dd/d10/d2f/d37/d64/d95/f2a [0,4194304] 0 2026-03-10T07:51:35.278 INFO:tasks.workunit.client.1.vm08.stdout:1/500: link d2/d6/de/d1f/d22/ca4 d2/d6/de/d9c/cae 0 2026-03-10T07:51:35.314 INFO:tasks.workunit.client.1.vm08.stdout:1/501: dwrite d2/d6/de/d70/f8d [0,4194304] 0 2026-03-10T07:51:35.315 INFO:tasks.workunit.client.1.vm08.stdout:1/502: truncate d2/d6/de/d1f/d26/d89/d8e/f90 512705 0 2026-03-10T07:51:35.315 INFO:tasks.workunit.client.1.vm08.stdout:1/503: creat d2/d6/de/d1f/da9/faf x:0 0 0 2026-03-10T07:51:35.315 INFO:tasks.workunit.client.1.vm08.stdout:1/504: mknod d2/d6/de/d1f/d26/d98/d9b/cb0 0 2026-03-10T07:51:35.315 INFO:tasks.workunit.client.1.vm08.stdout:1/505: creat d2/d6/de/d5f/fb1 x:0 0 0 2026-03-10T07:51:35.315 INFO:tasks.workunit.client.1.vm08.stdout:1/506: dread d2/d6/de/d1f/d26/d89/d8e/f90 [0,4194304] 0 2026-03-10T07:51:35.316 INFO:tasks.workunit.client.1.vm08.stdout:1/507: dread d2/f4 [0,4194304] 0 2026-03-10T07:51:35.350 INFO:tasks.workunit.client.1.vm08.stdout:0/517: sync 2026-03-10T07:51:35.370 INFO:tasks.workunit.client.1.vm08.stdout:5/562: rename d0/d4/df/d12/d94 to d0/d4/d19/d60/d6d/d70/d40/dba 0 2026-03-10T07:51:35.371 INFO:tasks.workunit.client.1.vm08.stdout:5/563: write d0/d4/df/d12/f97 [35306,43515] 0 2026-03-10T07:51:35.374 INFO:tasks.workunit.client.1.vm08.stdout:6/538: rename d1/db/d24/d73/d79/d7c/f7f to d1/d3/df/d1d/d40/d87/fb3 0 2026-03-10T07:51:35.379 INFO:tasks.workunit.client.1.vm08.stdout:6/539: chown d1/d3/df/d52/la4 0 1 2026-03-10T07:51:35.380 INFO:tasks.workunit.client.1.vm08.stdout:3/537: write d0/d3c/d1f/d89/fa4 [4037520,100335] 0 2026-03-10T07:51:35.380 INFO:tasks.workunit.client.1.vm08.stdout:3/538: stat d0/d3c/d1f/d44/d51/f4d 0 2026-03-10T07:51:35.382 INFO:tasks.workunit.client.1.vm08.stdout:6/540: dread - d1/db/f57 zero size 2026-03-10T07:51:35.392 INFO:tasks.workunit.client.1.vm08.stdout:6/541: chown d1/d3/df/d44/fa2 121 1 2026-03-10T07:51:35.392 
INFO:tasks.workunit.client.1.vm08.stdout:5/564: getdents d0/d4/df/d82 0 2026-03-10T07:51:35.392 INFO:tasks.workunit.client.1.vm08.stdout:6/542: write d1/d17/f66 [5381290,66516] 0 2026-03-10T07:51:35.392 INFO:tasks.workunit.client.1.vm08.stdout:6/543: chown d1/d3/df/d1d/d40/d87/f89 1056463698 1 2026-03-10T07:51:35.392 INFO:tasks.workunit.client.1.vm08.stdout:3/539: mknod d0/d3c/d18/d48/d55/d56/ca8 0 2026-03-10T07:51:35.392 INFO:tasks.workunit.client.1.vm08.stdout:5/565: dread - d0/d4/df/d1e/f49 zero size 2026-03-10T07:51:35.392 INFO:tasks.workunit.client.1.vm08.stdout:3/540: chown d0/d3c/d18/d32/d61/d83/f8b 1878403 1 2026-03-10T07:51:35.392 INFO:tasks.workunit.client.1.vm08.stdout:6/544: truncate d1/d3/d3e/f5b 850457 0 2026-03-10T07:51:35.392 INFO:tasks.workunit.client.1.vm08.stdout:3/541: write d0/d3c/d18/d48/d55/d56/f9c [172337,46848] 0 2026-03-10T07:51:35.392 INFO:tasks.workunit.client.1.vm08.stdout:5/566: fdatasync d0/d4/df/d1e/f37 0 2026-03-10T07:51:35.392 INFO:tasks.workunit.client.1.vm08.stdout:6/545: mknod d1/d3/df/d1d/d40/d45/d5c/cb4 0 2026-03-10T07:51:35.392 INFO:tasks.workunit.client.1.vm08.stdout:3/542: write d0/d3c/d1f/d44/d51/d34/f99 [2943908,106914] 0 2026-03-10T07:51:35.393 INFO:tasks.workunit.client.1.vm08.stdout:6/546: symlink d1/d17/d2b/d5e/lb5 0 2026-03-10T07:51:35.394 INFO:tasks.workunit.client.1.vm08.stdout:5/567: dwrite d0/d4/df/d1e/f64 [0,4194304] 0 2026-03-10T07:51:35.396 INFO:tasks.workunit.client.1.vm08.stdout:5/568: chown d0/d4/f75 1 1 2026-03-10T07:51:35.403 INFO:tasks.workunit.client.1.vm08.stdout:5/569: truncate d0/d4/d19/d60/d6d/d70/f67 4378440 0 2026-03-10T07:51:35.404 INFO:tasks.workunit.client.1.vm08.stdout:6/547: creat d1/d17/d2b/d58/d76/fb6 x:0 0 0 2026-03-10T07:51:35.416 INFO:tasks.workunit.client.1.vm08.stdout:6/548: dread d1/db/f4e [0,4194304] 0 2026-03-10T07:51:35.431 INFO:tasks.workunit.client.1.vm08.stdout:6/549: dread - d1/d17/d2b/d5e/f96 zero size 2026-03-10T07:51:35.431 INFO:tasks.workunit.client.1.vm08.stdout:6/550: 
getdents d1/d3/df/d1d/d40/d87/d95 0 2026-03-10T07:51:35.431 INFO:tasks.workunit.client.1.vm08.stdout:2/528: dwrite d0/d1/d3/f63 [0,4194304] 0 2026-03-10T07:51:35.432 INFO:tasks.workunit.client.1.vm08.stdout:7/509: rename d3/da/d25/d9/d2f/d3a/d4b/d67/d69 to d3/da/d25/d9/d2f/d3a/d4b/db0 0 2026-03-10T07:51:35.432 INFO:tasks.workunit.client.1.vm08.stdout:6/551: dwrite d1/d3/d3e/f56 [4194304,4194304] 0 2026-03-10T07:51:35.432 INFO:tasks.workunit.client.1.vm08.stdout:2/529: write d0/d1/d3/d10/d38/f52 [1132000,82495] 0 2026-03-10T07:51:35.434 INFO:tasks.workunit.client.1.vm08.stdout:2/530: chown d0/d1/d3/l4d 112818691 1 2026-03-10T07:51:35.435 INFO:tasks.workunit.client.1.vm08.stdout:2/531: dread - d0/d1/d3/d10/d32/d61/d84/fa8 zero size 2026-03-10T07:51:35.435 INFO:tasks.workunit.client.1.vm08.stdout:2/532: chown d0/f12 12030 1 2026-03-10T07:51:35.436 INFO:tasks.workunit.client.1.vm08.stdout:7/510: readlink d3/da/d25/l1a 0 2026-03-10T07:51:35.437 INFO:tasks.workunit.client.1.vm08.stdout:7/511: write d3/f57 [486302,102284] 0 2026-03-10T07:51:35.437 INFO:tasks.workunit.client.1.vm08.stdout:3/543: read d0/d3c/d18/d32/d61/d52/f68 [155791,19823] 0 2026-03-10T07:51:35.443 INFO:tasks.workunit.client.1.vm08.stdout:6/552: creat d1/d17/d2b/d58/d77/fb7 x:0 0 0 2026-03-10T07:51:35.456 INFO:tasks.workunit.client.1.vm08.stdout:8/622: truncate d0/df/d15/d23/d39/d5b/d4a/f98 1070265 0 2026-03-10T07:51:35.456 INFO:tasks.workunit.client.1.vm08.stdout:2/533: mknod d0/d1/d3/d39/cac 0 2026-03-10T07:51:35.456 INFO:tasks.workunit.client.1.vm08.stdout:9/534: rename d2/de/f3d to d2/d86/d30/d35/d97/d9d/fb9 0 2026-03-10T07:51:35.457 INFO:tasks.workunit.client.1.vm08.stdout:4/445: dwrite d5/d8/d9/d12/d7b/d48/d4f/f56 [0,4194304] 0 2026-03-10T07:51:35.457 INFO:tasks.workunit.client.1.vm08.stdout:8/623: mknod d0/df/d17/d25/cc6 0 2026-03-10T07:51:35.457 INFO:tasks.workunit.client.1.vm08.stdout:3/544: mkdir d0/d3c/d18/da9 0 2026-03-10T07:51:35.457 INFO:tasks.workunit.client.1.vm08.stdout:3/545: dread - 
d0/d3c/d1f/f93 zero size 2026-03-10T07:51:35.457 INFO:tasks.workunit.client.1.vm08.stdout:6/553: unlink d1/c7e 0 2026-03-10T07:51:35.461 INFO:tasks.workunit.client.1.vm08.stdout:4/446: chown d5/d8/d9/d12/d7b/l3e 9 1 2026-03-10T07:51:35.466 INFO:tasks.workunit.client.1.vm08.stdout:8/624: creat d0/df/d15/d23/da8/fc7 x:0 0 0 2026-03-10T07:51:35.466 INFO:tasks.workunit.client.1.vm08.stdout:3/546: dread d0/d3c/d18/d32/f62 [4194304,4194304] 0 2026-03-10T07:51:35.466 INFO:tasks.workunit.client.1.vm08.stdout:3/547: dread - d0/d3c/d18/d32/d61/d83/fa3 zero size 2026-03-10T07:51:35.466 INFO:tasks.workunit.client.1.vm08.stdout:4/447: dread d5/d8/d9/d12/d7b/d48/d4f/f56 [0,4194304] 0 2026-03-10T07:51:35.466 INFO:tasks.workunit.client.1.vm08.stdout:8/625: dwrite d0/d69/d3f/fb3 [4194304,4194304] 0 2026-03-10T07:51:35.472 INFO:tasks.workunit.client.1.vm08.stdout:9/535: creat d2/fba x:0 0 0 2026-03-10T07:51:35.472 INFO:tasks.workunit.client.1.vm08.stdout:6/554: rmdir d1/d3/df/d38 39 2026-03-10T07:51:35.473 INFO:tasks.workunit.client.1.vm08.stdout:9/536: readlink d2/d3/d84/d91/l99 0 2026-03-10T07:51:35.474 INFO:tasks.workunit.client.1.vm08.stdout:9/537: write d2/de/fa5 [958645,8624] 0 2026-03-10T07:51:35.476 INFO:tasks.workunit.client.1.vm08.stdout:8/626: rename d0/df/d15/d53/l6d to d0/df/d15/d23/da8/lc8 0 2026-03-10T07:51:35.484 INFO:tasks.workunit.client.1.vm08.stdout:9/538: mkdir d2/de/d28/d98/dbb 0 2026-03-10T07:51:35.485 INFO:tasks.workunit.client.1.vm08.stdout:8/627: dread d0/df/d15/f70 [0,4194304] 0 2026-03-10T07:51:35.486 INFO:tasks.workunit.client.1.vm08.stdout:6/555: dwrite d1/d17/f20 [0,4194304] 0 2026-03-10T07:51:35.495 INFO:tasks.workunit.client.1.vm08.stdout:6/556: stat d1/db/f57 0 2026-03-10T07:51:35.499 INFO:tasks.workunit.client.1.vm08.stdout:9/539: mknod d2/d26/cbc 0 2026-03-10T07:51:35.502 INFO:tasks.workunit.client.1.vm08.stdout:9/540: dread - d2/d86/d30/d35/d9b/fb6 zero size 2026-03-10T07:51:35.530 INFO:tasks.workunit.client.1.vm08.stdout:7/512: sync 
2026-03-10T07:51:35.530 INFO:tasks.workunit.client.1.vm08.stdout:2/534: sync 2026-03-10T07:51:35.531 INFO:tasks.workunit.client.1.vm08.stdout:7/513: dread - d3/da/d25/d9/d2f/d3a/d4b/db0/f81 zero size 2026-03-10T07:51:35.538 INFO:tasks.workunit.client.1.vm08.stdout:2/535: dwrite d0/d1/d3/d10/d65/f7c [4194304,4194304] 0 2026-03-10T07:51:35.540 INFO:tasks.workunit.client.1.vm08.stdout:2/536: readlink d0/d1/d3/d39/d7d/d86/d55/d1b/l5d 0 2026-03-10T07:51:35.540 INFO:tasks.workunit.client.1.vm08.stdout:2/537: write d0/f81 [769354,37066] 0 2026-03-10T07:51:35.549 INFO:tasks.workunit.client.1.vm08.stdout:2/538: mkdir d0/d1/d3/d56/d78/dad 0 2026-03-10T07:51:35.574 INFO:tasks.workunit.client.1.vm08.stdout:2/539: chown d0/d1/d3/d10/d38/c46 348563116 1 2026-03-10T07:51:35.574 INFO:tasks.workunit.client.1.vm08.stdout:2/540: chown d0/d1/d3/d10/d32/d61/d84/da7 220 1 2026-03-10T07:51:35.574 INFO:tasks.workunit.client.1.vm08.stdout:2/541: chown d0/d1/d3/d56/d78/c77 23 1 2026-03-10T07:51:35.581 INFO:tasks.workunit.client.1.vm08.stdout:0/518: write dd/d18/f21 [2260887,25647] 0 2026-03-10T07:51:35.584 INFO:tasks.workunit.client.1.vm08.stdout:1/508: dwrite d2/d6/de/d1f/d26/d58/f68 [0,4194304] 0 2026-03-10T07:51:35.588 INFO:tasks.workunit.client.1.vm08.stdout:9/541: sync 2026-03-10T07:51:35.596 INFO:tasks.workunit.client.1.vm08.stdout:9/542: link d2/d3/d84/d91/fa6 d2/d86/d30/d35/d97/d9d/fbd 0 2026-03-10T07:51:35.598 INFO:tasks.workunit.client.1.vm08.stdout:0/519: getdents dd/d10/d2f/d37 0 2026-03-10T07:51:35.600 INFO:tasks.workunit.client.1.vm08.stdout:9/543: rename d2/de/d28/c3b to d2/d3/d84/cbe 0 2026-03-10T07:51:35.601 INFO:tasks.workunit.client.1.vm08.stdout:9/544: write d2/d86/d30/d35/d97/f9c [566441,90738] 0 2026-03-10T07:51:35.602 INFO:tasks.workunit.client.1.vm08.stdout:9/545: chown d2/d86/d30/f87 1335769 1 2026-03-10T07:51:35.624 INFO:tasks.workunit.client.1.vm08.stdout:0/520: dread dd/fe [0,4194304] 0 2026-03-10T07:51:35.624 INFO:tasks.workunit.client.1.vm08.stdout:5/570: 
dwrite d0/d4/df/d1e/f42 [0,4194304] 0 2026-03-10T07:51:35.631 INFO:tasks.workunit.client.1.vm08.stdout:9/546: dread d2/de/f4d [0,4194304] 0 2026-03-10T07:51:35.632 INFO:tasks.workunit.client.1.vm08.stdout:9/547: truncate d2/fba 389153 0 2026-03-10T07:51:35.692 INFO:tasks.workunit.client.1.vm08.stdout:5/571: sync 2026-03-10T07:51:35.703 INFO:tasks.workunit.client.1.vm08.stdout:3/548: dread d0/d3c/d18/d32/d61/d52/f66 [0,4194304] 0 2026-03-10T07:51:35.707 INFO:tasks.workunit.client.1.vm08.stdout:5/572: link d0/d4/d19/d81/c8c d0/d8/d5e/cbb 0 2026-03-10T07:51:35.711 INFO:tasks.workunit.client.1.vm08.stdout:5/573: symlink d0/d8/lbc 0 2026-03-10T07:51:35.712 INFO:tasks.workunit.client.1.vm08.stdout:5/574: write d0/d4/df/d82/f8d [5465669,38331] 0 2026-03-10T07:51:35.727 INFO:tasks.workunit.client.1.vm08.stdout:5/575: sync 2026-03-10T07:51:35.728 INFO:tasks.workunit.client.1.vm08.stdout:5/576: write d0/d4/d19/d60/d6d/d70/f67 [727882,8926] 0 2026-03-10T07:51:35.739 INFO:tasks.workunit.client.1.vm08.stdout:4/448: truncate d5/f2f 117113 0 2026-03-10T07:51:35.740 INFO:tasks.workunit.client.1.vm08.stdout:4/449: dread - d5/d8/f86 zero size 2026-03-10T07:51:35.742 INFO:tasks.workunit.client.1.vm08.stdout:4/450: creat d5/d1f/d41/f99 x:0 0 0 2026-03-10T07:51:35.750 INFO:tasks.workunit.client.1.vm08.stdout:6/557: write d1/db/d24/f75 [472320,33714] 0 2026-03-10T07:51:35.758 INFO:tasks.workunit.client.1.vm08.stdout:6/558: symlink d1/db/d24/dac/dad/lb8 0 2026-03-10T07:51:35.758 INFO:tasks.workunit.client.1.vm08.stdout:6/559: creat d1/d3/df/d1d/d40/d45/d5c/fb9 x:0 0 0 2026-03-10T07:51:35.759 INFO:tasks.workunit.client.1.vm08.stdout:6/560: creat d1/db/d24/dac/fba x:0 0 0 2026-03-10T07:51:35.759 INFO:tasks.workunit.client.1.vm08.stdout:6/561: dread - d1/d3/df/d1d/d40/d45/faa zero size 2026-03-10T07:51:35.760 INFO:tasks.workunit.client.1.vm08.stdout:6/562: read d1/d3/d3e/f56 [287722,116416] 0 2026-03-10T07:51:35.761 INFO:tasks.workunit.client.1.vm08.stdout:6/563: chown 
d1/d3/df/d1d/d40/d45/d5c/c61 506156 1 2026-03-10T07:51:35.777 INFO:tasks.workunit.client.1.vm08.stdout:8/628: truncate d0/df/d2e/f3c 1718640 0 2026-03-10T07:51:35.793 INFO:tasks.workunit.client.1.vm08.stdout:8/629: write d0/df/d17/d25/fc4 [985780,89700] 0 2026-03-10T07:51:35.794 INFO:tasks.workunit.client.1.vm08.stdout:8/630: stat d0/d37/d86/f90 0 2026-03-10T07:51:35.794 INFO:tasks.workunit.client.1.vm08.stdout:8/631: symlink d0/df/d15/d23/lc9 0 2026-03-10T07:51:35.794 INFO:tasks.workunit.client.1.vm08.stdout:8/632: chown d0/d37/d86/fb7 10350586 1 2026-03-10T07:51:35.794 INFO:tasks.workunit.client.1.vm08.stdout:8/633: dwrite d0/df/d15/d23/da8/fc7 [0,4194304] 0 2026-03-10T07:51:35.794 INFO:tasks.workunit.client.1.vm08.stdout:2/542: dwrite d0/d1/d3/d56/d57/f5b [0,4194304] 0 2026-03-10T07:51:35.794 INFO:tasks.workunit.client.1.vm08.stdout:8/634: write d0/df/d15/d23/d39/f6f [1144810,123857] 0 2026-03-10T07:51:35.804 INFO:tasks.workunit.client.1.vm08.stdout:2/543: creat d0/d1/d3/d10/d65/fae x:0 0 0 2026-03-10T07:51:35.807 INFO:tasks.workunit.client.1.vm08.stdout:2/544: mkdir d0/d1/d3/d10/d38/daf 0 2026-03-10T07:51:35.811 INFO:tasks.workunit.client.1.vm08.stdout:0/521: symlink dd/d10/d14/d15/d20/la8 0 2026-03-10T07:51:35.820 INFO:tasks.workunit.client.1.vm08.stdout:2/545: getdents d0/d1/d3/d56/d9b/d9c 0 2026-03-10T07:51:35.830 INFO:tasks.workunit.client.1.vm08.stdout:0/522: dread dd/d10/f5e [0,4194304] 0 2026-03-10T07:51:35.830 INFO:tasks.workunit.client.1.vm08.stdout:2/546: creat d0/d1/d3/d39/d7d/d86/d55/fb0 x:0 0 0 2026-03-10T07:51:35.830 INFO:tasks.workunit.client.1.vm08.stdout:0/523: mkdir dd/d10/d2f/d37/d64/d52/da9 0 2026-03-10T07:51:35.830 INFO:tasks.workunit.client.1.vm08.stdout:0/524: chown dd/d10/d2f/d37/d64/d52/f74 1950 1 2026-03-10T07:51:35.836 INFO:tasks.workunit.client.1.vm08.stdout:9/548: truncate d2/d86/d2b/f37 1855417 0 2026-03-10T07:51:35.891 INFO:tasks.workunit.client.1.vm08.stdout:1/509: creat d2/d6/de/fb2 x:0 0 0 2026-03-10T07:51:35.891 
INFO:tasks.workunit.client.1.vm08.stdout:1/510: read d2/d6/d3a/f7d [3092234,115890] 0 2026-03-10T07:51:35.892 INFO:tasks.workunit.client.1.vm08.stdout:1/511: read - d2/d6/de/d1f/d22/f30 zero size 2026-03-10T07:51:35.898 INFO:tasks.workunit.client.1.vm08.stdout:8/635: write d0/df/d2e/f44 [657126,36744] 0 2026-03-10T07:51:35.899 INFO:tasks.workunit.client.1.vm08.stdout:7/514: symlink d3/da/lb1 0 2026-03-10T07:51:35.900 INFO:tasks.workunit.client.1.vm08.stdout:7/515: write d3/da/d25/d9/d2f/d3a/d4b/fa3 [794400,8733] 0 2026-03-10T07:51:35.903 INFO:tasks.workunit.client.1.vm08.stdout:8/636: link d0/df/d17/l1e d0/df/d2e/d30/lca 0 2026-03-10T07:51:35.933 INFO:tasks.workunit.client.1.vm08.stdout:6/564: rmdir d1/db/d24/d73/d79 39 2026-03-10T07:51:35.935 INFO:tasks.workunit.client.1.vm08.stdout:2/547: write d0/d1/d17/d6b/f6f [3230643,61026] 0 2026-03-10T07:51:35.937 INFO:tasks.workunit.client.1.vm08.stdout:0/525: dwrite dd/d10/d14/d15/d20/d5f/f61 [0,4194304] 0 2026-03-10T07:51:35.938 INFO:tasks.workunit.client.1.vm08.stdout:0/526: readlink dd/d10/d14/d1b/d30/l8c 0 2026-03-10T07:51:35.939 INFO:tasks.workunit.client.1.vm08.stdout:0/527: write dd/d10/d2f/d37/d64/d95/d58/d3d/fa1 [299169,66939] 0 2026-03-10T07:51:35.945 INFO:tasks.workunit.client.1.vm08.stdout:0/528: dwrite dd/d10/d2f/d37/d64/d95/d58/d3d/f5b [0,4194304] 0 2026-03-10T07:51:35.945 INFO:tasks.workunit.client.1.vm08.stdout:6/565: creat d1/d3/df/d1d/d40/d45/fbb x:0 0 0 2026-03-10T07:51:35.948 INFO:tasks.workunit.client.1.vm08.stdout:2/548: fdatasync d0/d1/d3/d10/f4f 0 2026-03-10T07:51:35.952 INFO:tasks.workunit.client.1.vm08.stdout:0/529: dread dd/d10/d2f/f81 [0,4194304] 0 2026-03-10T07:51:35.956 INFO:tasks.workunit.client.1.vm08.stdout:0/530: creat dd/d10/d2f/d37/d64/d95/d58/d3d/faa x:0 0 0 2026-03-10T07:51:35.957 INFO:tasks.workunit.client.1.vm08.stdout:4/451: mknod d5/d8/d9/d12/d7b/c9a 0 2026-03-10T07:51:35.957 INFO:tasks.workunit.client.1.vm08.stdout:0/531: chown dd/f44 14439584 1 2026-03-10T07:51:35.958 
INFO:tasks.workunit.client.1.vm08.stdout:6/566: dread d1/d3/d3e/f4a [0,4194304] 0 2026-03-10T07:51:35.960 INFO:tasks.workunit.client.1.vm08.stdout:6/567: read d1/d3/d3e/f81 [1737919,98055] 0 2026-03-10T07:51:35.973 INFO:tasks.workunit.client.1.vm08.stdout:6/568: creat d1/d3/df/d1d/d40/d45/d5c/fbc x:0 0 0 2026-03-10T07:51:35.974 INFO:tasks.workunit.client.1.vm08.stdout:3/549: rename d0/d3c/da1 to d0/d3c/d18/d32/daa 0 2026-03-10T07:51:35.975 INFO:tasks.workunit.client.1.vm08.stdout:6/569: fdatasync d1/d46/f72 0 2026-03-10T07:51:35.977 INFO:tasks.workunit.client.1.vm08.stdout:0/532: getdents dd/d10 0 2026-03-10T07:51:35.978 INFO:tasks.workunit.client.1.vm08.stdout:6/570: mknod d1/d17/d2b/d58/d77/cbd 0 2026-03-10T07:51:35.980 INFO:tasks.workunit.client.1.vm08.stdout:0/533: dwrite dd/d10/d14/d15/f84 [0,4194304] 0 2026-03-10T07:51:35.986 INFO:tasks.workunit.client.1.vm08.stdout:5/577: rename d0/d33/c95 to d0/d4/df/d1e/cbd 0 2026-03-10T07:51:35.992 INFO:tasks.workunit.client.1.vm08.stdout:6/571: mknod d1/d3/df/d1d/d6f/cbe 0 2026-03-10T07:51:35.993 INFO:tasks.workunit.client.1.vm08.stdout:1/512: dwrite d2/d6/de/d1f/d26/d58/d8c/f46 [0,4194304] 0 2026-03-10T07:51:35.995 INFO:tasks.workunit.client.1.vm08.stdout:3/550: link d0/d3c/d18/d48/d55/d56/f81 d0/d3c/d1f/d95/fab 0 2026-03-10T07:51:36.011 INFO:tasks.workunit.client.1.vm08.stdout:1/513: dwrite d2/d6/de/fb2 [0,4194304] 0 2026-03-10T07:51:36.011 INFO:tasks.workunit.client.1.vm08.stdout:9/549: rename d2/d86 to d2/d58/dbf 0 2026-03-10T07:51:36.011 INFO:tasks.workunit.client.1.vm08.stdout:9/550: truncate d2/d58/dbf/d30/d35/d9b/fb6 88913 0 2026-03-10T07:51:36.012 INFO:tasks.workunit.client.1.vm08.stdout:4/452: sync 2026-03-10T07:51:36.021 INFO:tasks.workunit.client.1.vm08.stdout:0/534: dread dd/d10/d14/d15/d20/d5f/f7f [0,4194304] 0 2026-03-10T07:51:36.021 INFO:tasks.workunit.client.1.vm08.stdout:7/516: write d3/da/d25/d9/f53 [511995,115532] 0 2026-03-10T07:51:36.023 INFO:tasks.workunit.client.1.vm08.stdout:1/514: symlink 
d2/d6/de/d1f/d22/lb3 0 2026-03-10T07:51:36.024 INFO:tasks.workunit.client.1.vm08.stdout:9/551: rmdir d2/d58/dbf/d30 39 2026-03-10T07:51:36.026 INFO:tasks.workunit.client.1.vm08.stdout:0/535: dwrite dd/d10/d2f/d37/d64/d95/d58/d3d/f40 [0,4194304] 0 2026-03-10T07:51:36.027 INFO:tasks.workunit.client.1.vm08.stdout:0/536: fdatasync dd/d10/d14/d15/d20/d5f/f7f 0 2026-03-10T07:51:36.030 INFO:tasks.workunit.client.1.vm08.stdout:3/551: fsync d0/d3c/d1f/f7e 0 2026-03-10T07:51:36.030 INFO:tasks.workunit.client.1.vm08.stdout:1/515: dread d2/d6/de/d70/f8d [0,4194304] 0 2026-03-10T07:51:36.032 INFO:tasks.workunit.client.1.vm08.stdout:7/517: truncate d3/f2b 237558 0 2026-03-10T07:51:36.039 INFO:tasks.workunit.client.1.vm08.stdout:8/637: rename d0/df/d15/d23/d39/d5b/c5a to d0/d69/d77/ccb 0 2026-03-10T07:51:36.050 INFO:tasks.workunit.client.1.vm08.stdout:4/453: mkdir d5/d1f/d9b 0 2026-03-10T07:51:36.054 INFO:tasks.workunit.client.1.vm08.stdout:0/537: mknod dd/d10/d14/d1b/cab 0 2026-03-10T07:51:36.054 INFO:tasks.workunit.client.1.vm08.stdout:1/516: creat d2/d6/de/d70/fb4 x:0 0 0 2026-03-10T07:51:36.054 INFO:tasks.workunit.client.1.vm08.stdout:7/518: mkdir d3/da/d25/d9/d2f/d39/db2 0 2026-03-10T07:51:36.054 INFO:tasks.workunit.client.1.vm08.stdout:2/549: rename d0/d1/d3/d10/d32 to d0/d1/d3/d56/d78/dad/db1 0 2026-03-10T07:51:36.057 INFO:tasks.workunit.client.1.vm08.stdout:7/519: write d3/f34 [397419,86468] 0 2026-03-10T07:51:36.059 INFO:tasks.workunit.client.1.vm08.stdout:0/538: dwrite dd/d10/d2f/d37/d64/d95/d5c/f63 [4194304,4194304] 0 2026-03-10T07:51:36.063 INFO:tasks.workunit.client.1.vm08.stdout:3/552: rename d0/d3c/d1f/d44/d51/d34/c78 to d0/d3c/d18/d32/cac 0 2026-03-10T07:51:36.068 INFO:tasks.workunit.client.1.vm08.stdout:2/550: fsync d0/d1/d3/d56/d57/f79 0 2026-03-10T07:51:36.068 INFO:tasks.workunit.client.1.vm08.stdout:1/517: mkdir d2/d6/de/d1f/d8f/db5 0 2026-03-10T07:51:36.073 INFO:tasks.workunit.client.1.vm08.stdout:2/551: dwrite d0/d1/f9f [0,4194304] 0 2026-03-10T07:51:36.089 
INFO:tasks.workunit.client.1.vm08.stdout:4/454: rename d5/d8/d50/l79 to d5/d8/d9/d12/d7b/d48/d4f/l9c 0 2026-03-10T07:51:36.105 INFO:tasks.workunit.client.1.vm08.stdout:4/455: stat d5/d8/d9/d12/d7b/l3b 0 2026-03-10T07:51:36.105 INFO:tasks.workunit.client.1.vm08.stdout:1/518: getdents d2/d6/d3a/d61/d6f 0 2026-03-10T07:51:36.107 INFO:tasks.workunit.client.1.vm08.stdout:2/552: getdents d0/d1/d3/d56/d78/dad/db1/d61/d84 0 2026-03-10T07:51:36.108 INFO:tasks.workunit.client.1.vm08.stdout:4/456: mknod d5/d1f/d41/c9d 0 2026-03-10T07:51:36.108 INFO:tasks.workunit.client.1.vm08.stdout:1/519: creat d2/d6/de/d5f/fb6 x:0 0 0 2026-03-10T07:51:36.112 INFO:tasks.workunit.client.1.vm08.stdout:2/553: rename d0/d1/d3/d56/d9b to d0/d1/d17/db2 0 2026-03-10T07:51:36.116 INFO:tasks.workunit.client.1.vm08.stdout:2/554: symlink d0/d1/d3/d56/d78/dad/db1/lb3 0 2026-03-10T07:51:36.116 INFO:tasks.workunit.client.1.vm08.stdout:2/555: chown d0/d1/d3/d56/d78/dad/db1/d61/d84 87 1 2026-03-10T07:51:36.118 INFO:tasks.workunit.client.1.vm08.stdout:2/556: dwrite d0/d1/d3/d39/d7d/d86/f7f [0,4194304] 0 2026-03-10T07:51:36.125 INFO:tasks.workunit.client.1.vm08.stdout:2/557: unlink d0/d1/d3/d10/d38/c51 0 2026-03-10T07:51:36.128 INFO:tasks.workunit.client.1.vm08.stdout:2/558: creat d0/d1/d3/d10/fb4 x:0 0 0 2026-03-10T07:51:36.128 INFO:tasks.workunit.client.1.vm08.stdout:2/559: truncate d0/d1/d3/d39/d7d/d7e/fa5 499540 0 2026-03-10T07:51:36.129 INFO:tasks.workunit.client.1.vm08.stdout:2/560: mknod d0/d1/d3/d56/d78/dad/db1/d61/cb5 0 2026-03-10T07:51:36.132 INFO:tasks.workunit.client.1.vm08.stdout:2/561: getdents d0/d1/d3/d56/d78 0 2026-03-10T07:51:36.136 INFO:tasks.workunit.client.1.vm08.stdout:2/562: dwrite d0/d1/d17/db2/d9c/fa4 [0,4194304] 0 2026-03-10T07:51:36.141 INFO:tasks.workunit.client.1.vm08.stdout:4/457: sync 2026-03-10T07:51:36.143 INFO:tasks.workunit.client.1.vm08.stdout:7/520: read d3/da/d25/d9/f53 [706823,109222] 0 2026-03-10T07:51:36.150 INFO:tasks.workunit.client.1.vm08.stdout:7/521: write 
d3/da/d25/d9/f87 [1777900,50462] 0 2026-03-10T07:51:36.150 INFO:tasks.workunit.client.1.vm08.stdout:7/522: dwrite d3/da/d25/d9/f30 [0,4194304] 0 2026-03-10T07:51:36.159 INFO:tasks.workunit.client.1.vm08.stdout:4/458: dread f1 [0,4194304] 0 2026-03-10T07:51:36.163 INFO:tasks.workunit.client.1.vm08.stdout:7/523: dread d3/f57 [0,4194304] 0 2026-03-10T07:51:36.176 INFO:tasks.workunit.client.1.vm08.stdout:4/459: unlink d5/d8/d9/f92 0 2026-03-10T07:51:36.177 INFO:tasks.workunit.client.1.vm08.stdout:4/460: chown d5/d8/f39 6801 1 2026-03-10T07:51:36.203 INFO:tasks.workunit.client.1.vm08.stdout:4/461: symlink d5/d8/d9/d12/d7b/d48/l9e 0 2026-03-10T07:51:36.213 INFO:tasks.workunit.client.1.vm08.stdout:4/462: read - d5/d8/d9/f46 zero size 2026-03-10T07:51:36.213 INFO:tasks.workunit.client.1.vm08.stdout:4/463: read - d5/d8/d9/f46 zero size 2026-03-10T07:51:36.217 INFO:tasks.workunit.client.1.vm08.stdout:4/464: truncate d5/f77 596398 0 2026-03-10T07:51:36.223 INFO:tasks.workunit.client.1.vm08.stdout:2/563: dread d0/f76 [0,4194304] 0 2026-03-10T07:51:36.228 INFO:tasks.workunit.client.1.vm08.stdout:2/564: creat d0/d1/d3/d39/d7d/d86/d55/d1b/fb6 x:0 0 0 2026-03-10T07:51:36.235 INFO:tasks.workunit.client.1.vm08.stdout:5/578: write d0/d4/d19/d81/d92/f78 [1003127,103585] 0 2026-03-10T07:51:36.237 INFO:tasks.workunit.client.1.vm08.stdout:5/579: chown d0/d33/l7b 5723 1 2026-03-10T07:51:36.242 INFO:tasks.workunit.client.1.vm08.stdout:6/572: dwrite d1/d3/df/d44/fa2 [0,4194304] 0 2026-03-10T07:51:36.252 INFO:tasks.workunit.client.1.vm08.stdout:8/638: getdents d0/df/d15/d23/d39/d5b 0 2026-03-10T07:51:36.253 INFO:tasks.workunit.client.1.vm08.stdout:5/580: creat d0/d4/d19/d81/da4/fbe x:0 0 0 2026-03-10T07:51:36.253 INFO:tasks.workunit.client.1.vm08.stdout:5/581: chown d0/d8/d5e 521468 1 2026-03-10T07:51:36.254 INFO:tasks.workunit.client.1.vm08.stdout:6/573: dwrite d1/d3/df/d1d/d40/d45/faa [0,4194304] 0 2026-03-10T07:51:36.256 INFO:tasks.workunit.client.1.vm08.stdout:5/582: truncate 
d0/d4/df/d1e/d41/dad/fb9 581121 0 2026-03-10T07:51:36.260 INFO:tasks.workunit.client.1.vm08.stdout:6/574: dread - d1/db/d24/d73/d79/f98 zero size 2026-03-10T07:51:36.260 INFO:tasks.workunit.client.1.vm08.stdout:5/583: unlink d0/d77/daa/fb3 0 2026-03-10T07:51:36.260 INFO:tasks.workunit.client.1.vm08.stdout:2/565: dread d0/d1/d3/d10/f58 [0,4194304] 0 2026-03-10T07:51:36.262 INFO:tasks.workunit.client.1.vm08.stdout:8/639: creat d0/df/d15/d23/d54/dba/d89/dc5/fcc x:0 0 0 2026-03-10T07:51:36.275 INFO:tasks.workunit.client.1.vm08.stdout:5/584: fsync d0/d4/d19/d43/f35 0 2026-03-10T07:51:36.275 INFO:tasks.workunit.client.1.vm08.stdout:5/585: fsync d0/d4/d19/d60/fb6 0 2026-03-10T07:51:36.281 INFO:tasks.workunit.client.1.vm08.stdout:8/640: creat d0/d69/d3f/fcd x:0 0 0 2026-03-10T07:51:36.281 INFO:tasks.workunit.client.1.vm08.stdout:9/552: link d2/d58/dbf/f21 d2/d3/fc0 0 2026-03-10T07:51:36.282 INFO:tasks.workunit.client.1.vm08.stdout:0/539: write dd/f16 [768836,107610] 0 2026-03-10T07:51:36.283 INFO:tasks.workunit.client.1.vm08.stdout:0/540: write dd/d10/d2f/d37/d64/f3f [2593265,24285] 0 2026-03-10T07:51:36.286 INFO:tasks.workunit.client.1.vm08.stdout:5/586: rename d0/d4/df/d1e to d0/d4/df/dbf 0 2026-03-10T07:51:36.290 INFO:tasks.workunit.client.1.vm08.stdout:2/566: link d0/d1/d3/d39/d7d/d86/f7f d0/d1/d3/d39/fb7 0 2026-03-10T07:51:36.292 INFO:tasks.workunit.client.1.vm08.stdout:8/641: chown d0/df/d2e/d30/lca 9 1 2026-03-10T07:51:36.295 INFO:tasks.workunit.client.1.vm08.stdout:8/642: dread d0/df/d17/d25/fc4 [0,4194304] 0 2026-03-10T07:51:36.298 INFO:tasks.workunit.client.1.vm08.stdout:3/553: dwrite d0/f39 [0,4194304] 0 2026-03-10T07:51:36.302 INFO:tasks.workunit.client.1.vm08.stdout:0/541: rmdir dd/d10/d14/d1b/d30 39 2026-03-10T07:51:36.306 INFO:tasks.workunit.client.1.vm08.stdout:3/554: dread d0/d3c/d1f/d44/f59 [0,4194304] 0 2026-03-10T07:51:36.306 INFO:tasks.workunit.client.1.vm08.stdout:3/555: stat d0/d3c/d1f/d89 0 2026-03-10T07:51:36.314 
INFO:tasks.workunit.client.1.vm08.stdout:2/567: rename d0/d1/d3/d56/d78/dad/db1/d61/d8e/f90 to d0/d1/d3/d10/d65/fb8 0 2026-03-10T07:51:36.315 INFO:tasks.workunit.client.1.vm08.stdout:5/587: chown d0/d4/d19/d43/f7c 57389 1 2026-03-10T07:51:36.326 INFO:tasks.workunit.client.1.vm08.stdout:8/643: fsync d0/df/d15/d23/d54/dba/d89/f8b 0 2026-03-10T07:51:36.326 INFO:tasks.workunit.client.1.vm08.stdout:5/588: dwrite d0/d4/df/d82/f8d [0,4194304] 0 2026-03-10T07:51:36.327 INFO:tasks.workunit.client.1.vm08.stdout:3/556: sync 2026-03-10T07:51:36.328 INFO:tasks.workunit.client.1.vm08.stdout:1/520: dwrite d2/d6/de/d1f/d26/f2f [4194304,4194304] 0 2026-03-10T07:51:36.340 INFO:tasks.workunit.client.1.vm08.stdout:0/542: truncate dd/d10/d2f/f4c 1760104 0 2026-03-10T07:51:36.364 INFO:tasks.workunit.client.1.vm08.stdout:7/524: dwrite d3/da/f6b [0,4194304] 0 2026-03-10T07:51:36.406 INFO:tasks.workunit.client.1.vm08.stdout:6/575: dwrite d1/f49 [0,4194304] 0 2026-03-10T07:51:36.408 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:51:36 vm05.local ceph-mon[50387]: pgmap v37: 65 pgs: 65 active+clean; 2.9 GiB data, 9.7 GiB used, 110 GiB / 120 GiB avail; 42 MiB/s rd, 160 MiB/s wr, 329 op/s 2026-03-10T07:51:36.409 INFO:tasks.workunit.client.1.vm08.stdout:2/568: mkdir d0/d1/d3/d39/d7d/d86/d55/db9 0 2026-03-10T07:51:36.418 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:51:36 vm08.local ceph-mon[59917]: pgmap v37: 65 pgs: 65 active+clean; 2.9 GiB data, 9.7 GiB used, 110 GiB / 120 GiB avail; 42 MiB/s rd, 160 MiB/s wr, 329 op/s 2026-03-10T07:51:36.435 INFO:tasks.workunit.client.1.vm08.stdout:6/576: dread d1/d17/f66 [4194304,4194304] 0 2026-03-10T07:51:36.448 INFO:tasks.workunit.client.1.vm08.stdout:0/543: mknod dd/d10/cac 0 2026-03-10T07:51:36.454 INFO:tasks.workunit.client.1.vm08.stdout:5/589: write d0/d8/f7e [754126,121941] 0 2026-03-10T07:51:36.455 INFO:tasks.workunit.client.1.vm08.stdout:1/521: write d2/d6/de/d1f/d26/d89/d8e/f90 [729355,84355] 0 2026-03-10T07:51:36.462 
INFO:tasks.workunit.client.1.vm08.stdout:8/644: mkdir d0/df/d2e/d30/dc0/dce 0 2026-03-10T07:51:36.464 INFO:tasks.workunit.client.1.vm08.stdout:6/577: unlink d1/d17/f20 0 2026-03-10T07:51:36.467 INFO:tasks.workunit.client.1.vm08.stdout:9/553: getdents d2 0 2026-03-10T07:51:36.471 INFO:tasks.workunit.client.1.vm08.stdout:1/522: mknod d2/d6/de/d1f/d26/d98/cb7 0 2026-03-10T07:51:36.472 INFO:tasks.workunit.client.1.vm08.stdout:1/523: fdatasync d2/d6/de/d1f/d22/f35 0 2026-03-10T07:51:36.475 INFO:tasks.workunit.client.1.vm08.stdout:1/524: dread d2/d6/de/d1f/d26/d89/d8e/f90 [0,4194304] 0 2026-03-10T07:51:36.477 INFO:tasks.workunit.client.1.vm08.stdout:3/557: truncate d0/d3c/d1f/d89/fa4 1845728 0 2026-03-10T07:51:36.478 INFO:tasks.workunit.client.1.vm08.stdout:7/525: write d3/da/d25/f32 [9876,83160] 0 2026-03-10T07:51:36.478 INFO:tasks.workunit.client.1.vm08.stdout:2/569: dwrite d0/d1/d3/d39/d7d/d86/d55/d7a/f94 [0,4194304] 0 2026-03-10T07:51:36.488 INFO:tasks.workunit.client.1.vm08.stdout:3/558: dwrite d0/d3c/d18/d32/d61/d52/f7f [0,4194304] 0 2026-03-10T07:51:36.495 INFO:tasks.workunit.client.1.vm08.stdout:8/645: creat d0/df/d5d/fcf x:0 0 0 2026-03-10T07:51:36.500 INFO:tasks.workunit.client.1.vm08.stdout:6/578: truncate d1/db/d24/f25 1750644 0 2026-03-10T07:51:36.500 INFO:tasks.workunit.client.1.vm08.stdout:8/646: write d0/df/d15/d23/d39/f6f [4237346,63353] 0 2026-03-10T07:51:36.500 INFO:tasks.workunit.client.1.vm08.stdout:6/579: write d1/f35 [5441997,41992] 0 2026-03-10T07:51:36.501 INFO:tasks.workunit.client.1.vm08.stdout:5/590: creat d0/d4/df/dbf/daf/fc0 x:0 0 0 2026-03-10T07:51:36.503 INFO:tasks.workunit.client.1.vm08.stdout:8/647: dwrite d0/df/d5d/fcf [0,4194304] 0 2026-03-10T07:51:36.503 INFO:tasks.workunit.client.1.vm08.stdout:5/591: chown d0/d4/df/dbf/d41/dad/fb9 7 1 2026-03-10T07:51:36.506 INFO:tasks.workunit.client.1.vm08.stdout:8/648: dwrite d0/df/d15/d23/d39/f6f [4194304,4194304] 0 2026-03-10T07:51:36.515 INFO:tasks.workunit.client.1.vm08.stdout:5/592: dwrite 
d0/d4/df/dbf/fa6 [0,4194304] 0 2026-03-10T07:51:36.517 INFO:tasks.workunit.client.1.vm08.stdout:9/554: rmdir d2/d58/dbf/d30 39 2026-03-10T07:51:36.521 INFO:tasks.workunit.client.1.vm08.stdout:4/465: link d5/f2f d5/d8/d9/d12/d7b/d48/d4f/d7c/f9f 0 2026-03-10T07:51:36.522 INFO:tasks.workunit.client.1.vm08.stdout:4/466: dread - d5/d1f/d41/f7f zero size 2026-03-10T07:51:36.522 INFO:tasks.workunit.client.1.vm08.stdout:5/593: dread d0/d4/df/d12/f46 [0,4194304] 0 2026-03-10T07:51:36.545 INFO:tasks.workunit.client.1.vm08.stdout:5/594: unlink d0/d77/f62 0 2026-03-10T07:51:36.549 INFO:tasks.workunit.client.1.vm08.stdout:1/525: rename d2/d6/de/d1f/d26/c57 to d2/d6/cb8 0 2026-03-10T07:51:36.556 INFO:tasks.workunit.client.1.vm08.stdout:5/595: creat d0/d4/d19/d60/d6d/d70/fc1 x:0 0 0 2026-03-10T07:51:36.559 INFO:tasks.workunit.client.1.vm08.stdout:5/596: dwrite d0/d4/d19/d60/d6d/d70/f67 [4194304,4194304] 0 2026-03-10T07:51:36.572 INFO:tasks.workunit.client.1.vm08.stdout:2/570: rename d0/d1/d17/c28 to d0/d1/d3/d56/d78/dad/db1/d61/d84/cba 0 2026-03-10T07:51:36.572 INFO:tasks.workunit.client.1.vm08.stdout:2/571: stat d0/d1/d3/d56/d78/dad/db1/c83 0 2026-03-10T07:51:36.577 INFO:tasks.workunit.client.1.vm08.stdout:9/555: link d2/d58/dbf/d30/d35/f79 d2/d26/fc1 0 2026-03-10T07:51:36.585 INFO:tasks.workunit.client.1.vm08.stdout:7/526: rename d3/da/d25/d9/d2f/d39/d43/d4f/f8f to d3/da/d25/d9/fb3 0 2026-03-10T07:51:36.585 INFO:tasks.workunit.client.1.vm08.stdout:9/556: mknod d2/d26/da4/cc2 0 2026-03-10T07:51:36.585 INFO:tasks.workunit.client.1.vm08.stdout:9/557: dread - d2/d58/fb3 zero size 2026-03-10T07:51:36.585 INFO:tasks.workunit.client.1.vm08.stdout:8/649: rename d0/d37/f47 to d0/df/d15/d23/d54/dba/d89/fd0 0 2026-03-10T07:51:36.585 INFO:tasks.workunit.client.1.vm08.stdout:2/572: creat d0/fbb x:0 0 0 2026-03-10T07:51:36.585 INFO:tasks.workunit.client.1.vm08.stdout:7/527: dwrite d3/da/d8a/f9e [0,4194304] 0 2026-03-10T07:51:36.585 INFO:tasks.workunit.client.1.vm08.stdout:5/597: getdents 
d0/d4/df 0 2026-03-10T07:51:36.585 INFO:tasks.workunit.client.1.vm08.stdout:9/558: creat d2/d58/dbf/d30/d35/fc3 x:0 0 0 2026-03-10T07:51:36.585 INFO:tasks.workunit.client.1.vm08.stdout:5/598: stat d0/d4/df/d12/c4b 0 2026-03-10T07:51:36.586 INFO:tasks.workunit.client.1.vm08.stdout:9/559: chown d2/d26/l6b 9842 1 2026-03-10T07:51:36.586 INFO:tasks.workunit.client.1.vm08.stdout:7/528: truncate d3/da/d25/f29 4739280 0 2026-03-10T07:51:36.587 INFO:tasks.workunit.client.1.vm08.stdout:9/560: dread - d2/fb5 zero size 2026-03-10T07:51:36.594 INFO:tasks.workunit.client.1.vm08.stdout:4/467: rename d5/d8/d9 to d5/da0 0 2026-03-10T07:51:36.594 INFO:tasks.workunit.client.1.vm08.stdout:2/573: truncate d0/d1/d3/d56/d78/dad/db1/d61/f3d 1358480 0 2026-03-10T07:51:36.595 INFO:tasks.workunit.client.1.vm08.stdout:5/599: creat d0/d4/d19/d81/da4/fc2 x:0 0 0 2026-03-10T07:51:36.595 INFO:tasks.workunit.client.1.vm08.stdout:9/561: rmdir d2/d26 39 2026-03-10T07:51:36.596 INFO:tasks.workunit.client.1.vm08.stdout:4/468: write d5/da0/f5a [68301,55618] 0 2026-03-10T07:51:36.596 INFO:tasks.workunit.client.1.vm08.stdout:1/526: rename d2/d6/de/d47/f63 to d2/d6/de/d1f/d26/d58/fb9 0 2026-03-10T07:51:36.600 INFO:tasks.workunit.client.1.vm08.stdout:7/529: symlink d3/da/d25/d9/d2f/d3a/lb4 0 2026-03-10T07:51:36.604 INFO:tasks.workunit.client.1.vm08.stdout:4/469: symlink d5/da0/d12/d7b/d48/d64/la1 0 2026-03-10T07:51:36.607 INFO:tasks.workunit.client.1.vm08.stdout:4/470: truncate f1 3498298 0 2026-03-10T07:51:36.608 INFO:tasks.workunit.client.1.vm08.stdout:8/650: rename d0/d69/c55 to d0/df/d17/cd1 0 2026-03-10T07:51:36.608 INFO:tasks.workunit.client.1.vm08.stdout:9/562: unlink d2/d3/f49 0 2026-03-10T07:51:36.614 INFO:tasks.workunit.client.1.vm08.stdout:8/651: chown d0/df/d15/d23/d39/d5b/d4a/f98 326287 1 2026-03-10T07:51:36.615 INFO:tasks.workunit.client.1.vm08.stdout:0/544: dwrite dd/d10/f77 [0,4194304] 0 2026-03-10T07:51:36.617 INFO:tasks.workunit.client.1.vm08.stdout:3/559: dwrite d0/f10 [0,4194304] 0 
2026-03-10T07:51:36.626 INFO:tasks.workunit.client.1.vm08.stdout:4/471: fsync d5/f2d 0 2026-03-10T07:51:36.637 INFO:tasks.workunit.client.1.vm08.stdout:5/600: dread d0/f3b [0,4194304] 0 2026-03-10T07:51:36.643 INFO:tasks.workunit.client.1.vm08.stdout:6/580: dwrite d1/d3/d3e/f5b [0,4194304] 0 2026-03-10T07:51:36.651 INFO:tasks.workunit.client.1.vm08.stdout:6/581: chown d1/db/d24/d73/d79/c9f 81212304 1 2026-03-10T07:51:36.651 INFO:tasks.workunit.client.1.vm08.stdout:5/601: dread d0/d8/d24/f56 [0,4194304] 0 2026-03-10T07:51:36.654 INFO:tasks.workunit.client.1.vm08.stdout:9/563: unlink d2/de/d28/d98/lb4 0 2026-03-10T07:51:36.655 INFO:tasks.workunit.client.1.vm08.stdout:9/564: chown d2/de/d28/fa8 9852 1 2026-03-10T07:51:36.656 INFO:tasks.workunit.client.1.vm08.stdout:5/602: dwrite d0/d4/df/d82/f8d [4194304,4194304] 0 2026-03-10T07:51:36.658 INFO:tasks.workunit.client.1.vm08.stdout:3/560: symlink d0/d3c/d18/d32/d61/d83/lad 0 2026-03-10T07:51:36.667 INFO:tasks.workunit.client.1.vm08.stdout:6/582: symlink d1/d46/lbf 0 2026-03-10T07:51:36.671 INFO:tasks.workunit.client.1.vm08.stdout:5/603: unlink d0/c1 0 2026-03-10T07:51:36.674 INFO:tasks.workunit.client.1.vm08.stdout:6/583: dwrite d1/d3/df/d1d/d40/d45/fbb [0,4194304] 0 2026-03-10T07:51:36.684 INFO:tasks.workunit.client.1.vm08.stdout:4/472: mkdir d5/da0/d12/d7b/d48/da2 0 2026-03-10T07:51:36.689 INFO:tasks.workunit.client.1.vm08.stdout:0/545: mkdir dd/d10/d14/d15/dad 0 2026-03-10T07:51:36.692 INFO:tasks.workunit.client.1.vm08.stdout:9/565: fsync d2/d58/dbf/d2b/f6a 0 2026-03-10T07:51:36.697 INFO:tasks.workunit.client.1.vm08.stdout:5/604: symlink d0/d4/d19/d81/da4/lc3 0 2026-03-10T07:51:36.702 INFO:tasks.workunit.client.1.vm08.stdout:5/605: truncate d0/d4/d19/d60/fb6 807329 0 2026-03-10T07:51:36.706 INFO:tasks.workunit.client.1.vm08.stdout:3/561: symlink d0/d3c/d18/da9/lae 0 2026-03-10T07:51:36.706 INFO:tasks.workunit.client.1.vm08.stdout:1/527: write d2/d6/de/d47/f3c [1730798,86148] 0 2026-03-10T07:51:36.706 
INFO:tasks.workunit.client.1.vm08.stdout:4/473: symlink d5/da0/d12/d7b/la3 0 2026-03-10T07:51:36.706 INFO:tasks.workunit.client.1.vm08.stdout:2/574: dwrite d0/d1/d3/d56/d78/dad/db1/f45 [0,4194304] 0 2026-03-10T07:51:36.706 INFO:tasks.workunit.client.1.vm08.stdout:7/530: write d3/da/d25/d9/d2f/d3a/d40/f55 [529098,91648] 0 2026-03-10T07:51:36.715 INFO:tasks.workunit.client.1.vm08.stdout:1/528: dread d2/f36 [0,4194304] 0 2026-03-10T07:51:36.716 INFO:tasks.workunit.client.1.vm08.stdout:1/529: write d2/d6/de/d1f/d26/f48 [626584,119054] 0 2026-03-10T07:51:36.717 INFO:tasks.workunit.client.1.vm08.stdout:5/606: sync 2026-03-10T07:51:36.724 INFO:tasks.workunit.client.1.vm08.stdout:3/562: mknod d0/d3c/d1f/d44/d51/d34/caf 0 2026-03-10T07:51:36.727 INFO:tasks.workunit.client.1.vm08.stdout:2/575: creat d0/d1/d17/db2/d9c/fbc x:0 0 0 2026-03-10T07:51:36.727 INFO:tasks.workunit.client.1.vm08.stdout:2/576: chown d0/fbb 254328 1 2026-03-10T07:51:36.727 INFO:tasks.workunit.client.1.vm08.stdout:5/607: dwrite d0/d4/df/dbf/fa6 [0,4194304] 0 2026-03-10T07:51:36.745 INFO:tasks.workunit.client.1.vm08.stdout:1/530: symlink d2/d6/de/d5f/lba 0 2026-03-10T07:51:36.755 INFO:tasks.workunit.client.1.vm08.stdout:3/563: symlink d0/d3c/d18/da9/lb0 0 2026-03-10T07:51:36.756 INFO:tasks.workunit.client.1.vm08.stdout:2/577: creat d0/d1/d3/d56/d78/dad/db1/d61/fbd x:0 0 0 2026-03-10T07:51:36.756 INFO:tasks.workunit.client.1.vm08.stdout:1/531: truncate d2/f69 522327 0 2026-03-10T07:51:36.757 INFO:tasks.workunit.client.1.vm08.stdout:4/474: mknod d5/da0/d12/d7b/d48/d4f/d7c/ca4 0 2026-03-10T07:51:36.765 INFO:tasks.workunit.client.1.vm08.stdout:0/546: rename dd/d10/d2f/d37/d64/d95/d5c/c85 to dd/d10/d14/d1b/cae 0 2026-03-10T07:51:36.782 INFO:tasks.workunit.client.1.vm08.stdout:4/475: symlink d5/da0/d95/la5 0 2026-03-10T07:51:36.783 INFO:tasks.workunit.client.1.vm08.stdout:4/476: truncate d5/d1f/d31/f62 4495906 0 2026-03-10T07:51:36.785 INFO:tasks.workunit.client.1.vm08.stdout:5/608: rename 
d0/d4/d19/d60/d6d/d70/fc1 to d0/d4/d19/d81/da4/fc4 0 2026-03-10T07:51:36.788 INFO:tasks.workunit.client.1.vm08.stdout:0/547: dread - dd/d10/d14/d15/f94 zero size 2026-03-10T07:51:36.790 INFO:tasks.workunit.client.1.vm08.stdout:0/548: read dd/d10/d14/f36 [484067,14285] 0 2026-03-10T07:51:36.803 INFO:tasks.workunit.client.1.vm08.stdout:0/549: mkdir dd/d10/d2f/d37/daf 0 2026-03-10T07:51:36.803 INFO:tasks.workunit.client.1.vm08.stdout:4/477: truncate d5/d8/f68 1007505 0 2026-03-10T07:51:36.807 INFO:tasks.workunit.client.1.vm08.stdout:5/609: fdatasync d0/d4/d19/d60/d6d/d70/d40/dba/fb7 0 2026-03-10T07:51:36.808 INFO:tasks.workunit.client.1.vm08.stdout:8/652: dwrite d0/df/d15/d23/d54/dba/d89/fd0 [0,4194304] 0 2026-03-10T07:51:36.811 INFO:tasks.workunit.client.1.vm08.stdout:6/584: dwrite d1/d17/f66 [0,4194304] 0 2026-03-10T07:51:36.811 INFO:tasks.workunit.client.1.vm08.stdout:6/585: fdatasync d1/d7d/f91 0 2026-03-10T07:51:36.819 INFO:tasks.workunit.client.1.vm08.stdout:9/566: dwrite d2/d58/dbf/d30/d35/fa0 [0,4194304] 0 2026-03-10T07:51:36.836 INFO:tasks.workunit.client.1.vm08.stdout:4/478: rename d5/da0/f1b to d5/da0/d12/d7b/d48/d4f/d8d/d91/fa6 0 2026-03-10T07:51:36.840 INFO:tasks.workunit.client.1.vm08.stdout:7/531: dwrite d3/da/d25/d9/fd [0,4194304] 0 2026-03-10T07:51:36.841 INFO:tasks.workunit.client.1.vm08.stdout:5/610: mkdir d0/d4/d19/d60/d6d/d70/dc5 0 2026-03-10T07:51:36.885 INFO:tasks.workunit.client.1.vm08.stdout:3/564: write d0/d3c/d18/f23 [6529964,74422] 0 2026-03-10T07:51:36.886 INFO:tasks.workunit.client.1.vm08.stdout:2/578: write d0/d1/d3/d56/d78/dad/db1/d61/f59 [5066337,123683] 0 2026-03-10T07:51:36.887 INFO:tasks.workunit.client.1.vm08.stdout:3/565: truncate d0/d3c/d18/d48/d55/d56/f81 441291 0 2026-03-10T07:51:36.894 INFO:tasks.workunit.client.1.vm08.stdout:6/586: rename d1/d3/df/d1d/d40/la9 to d1/d3/df/d1d/d40/lc0 0 2026-03-10T07:51:36.895 INFO:tasks.workunit.client.1.vm08.stdout:9/567: creat d2/de/d28/d98/fc4 x:0 0 0 2026-03-10T07:51:36.896 
INFO:tasks.workunit.client.1.vm08.stdout:1/532: truncate d2/d6/d3a/d61/d6f/f9d 828516 0 2026-03-10T07:51:36.897 INFO:tasks.workunit.client.1.vm08.stdout:1/533: write d2/d6/de/d1f/da9/faf [99941,54920] 0 2026-03-10T07:51:36.923 INFO:tasks.workunit.client.1.vm08.stdout:5/611: symlink d0/d77/daa/lc6 0 2026-03-10T07:51:36.929 INFO:tasks.workunit.client.1.vm08.stdout:5/612: sync 2026-03-10T07:51:36.944 INFO:tasks.workunit.client.1.vm08.stdout:8/653: creat d0/df/d2e/d30/dc0/dce/fd2 x:0 0 0 2026-03-10T07:51:36.951 INFO:tasks.workunit.client.1.vm08.stdout:2/579: truncate d0/d1/d3/d56/d57/f79 4712984 0 2026-03-10T07:51:36.954 INFO:tasks.workunit.client.1.vm08.stdout:2/580: dwrite d0/d1/fa2 [0,4194304] 0 2026-03-10T07:51:36.977 INFO:tasks.workunit.client.1.vm08.stdout:6/587: dwrite d1/d17/d2b/d58/d76/f99 [0,4194304] 0 2026-03-10T07:51:37.002 INFO:tasks.workunit.client.1.vm08.stdout:5/613: symlink d0/d4/d19/d3a/d69/lc7 0 2026-03-10T07:51:37.002 INFO:tasks.workunit.client.1.vm08.stdout:8/654: rename d0/df/d15/d23/d39/d5b/l2f to d0/df/d15/d23/d54/ld3 0 2026-03-10T07:51:37.002 INFO:tasks.workunit.client.1.vm08.stdout:5/614: readlink d0/d4/df/dbf/l2b 0 2026-03-10T07:51:37.002 INFO:tasks.workunit.client.1.vm08.stdout:8/655: fdatasync d0/df/d15/d23/da8/fc2 0 2026-03-10T07:51:37.007 INFO:tasks.workunit.client.1.vm08.stdout:5/615: dwrite d0/d8/d5e/f6a [0,4194304] 0 2026-03-10T07:51:37.008 INFO:tasks.workunit.client.1.vm08.stdout:8/656: dwrite d0/d69/d3f/fb3 [0,4194304] 0 2026-03-10T07:51:37.023 INFO:tasks.workunit.client.1.vm08.stdout:3/566: mknod d0/d3c/d1f/d44/d51/d2d/d85/cb1 0 2026-03-10T07:51:37.048 INFO:tasks.workunit.client.1.vm08.stdout:0/550: creat dd/d10/d2f/d37/d64/d95/d5c/fb0 x:0 0 0 2026-03-10T07:51:37.048 INFO:tasks.workunit.client.1.vm08.stdout:0/551: chown dd/d18/c71 1067045 1 2026-03-10T07:51:37.050 INFO:tasks.workunit.client.1.vm08.stdout:2/581: mknod d0/d1/d3/d56/d78/dad/db1/cbe 0 2026-03-10T07:51:37.051 INFO:tasks.workunit.client.1.vm08.stdout:5/616: write 
d0/d4/d19/d50/f8a [1169118,117255] 0 2026-03-10T07:51:37.053 INFO:tasks.workunit.client.1.vm08.stdout:8/657: mknod d0/d69/d77/cd4 0 2026-03-10T07:51:37.059 INFO:tasks.workunit.client.1.vm08.stdout:3/567: mknod d0/d3c/d1f/d44/d51/d2d/cb2 0 2026-03-10T07:51:37.060 INFO:tasks.workunit.client.1.vm08.stdout:3/568: readlink d0/d3c/d18/d48/d55/d56/l6b 0 2026-03-10T07:51:37.061 INFO:tasks.workunit.client.1.vm08.stdout:1/534: link d2/d10/f3e d2/d6/de/d70/d80/fbb 0 2026-03-10T07:51:37.065 INFO:tasks.workunit.client.1.vm08.stdout:0/552: fdatasync dd/d10/d2f/d37/d64/f68 0 2026-03-10T07:51:37.065 INFO:tasks.workunit.client.1.vm08.stdout:0/553: fsync dd/d18/f21 0 2026-03-10T07:51:37.068 INFO:tasks.workunit.client.1.vm08.stdout:2/582: unlink d0/d1/d17/d6b/da0/fa3 0 2026-03-10T07:51:37.071 INFO:tasks.workunit.client.1.vm08.stdout:5/617: mkdir d0/d4/df/dbf/d41/dc8 0 2026-03-10T07:51:37.079 INFO:tasks.workunit.client.1.vm08.stdout:8/658: symlink d0/df/d17/d25/ld5 0 2026-03-10T07:51:37.080 INFO:tasks.workunit.client.1.vm08.stdout:3/569: mknod d0/d3c/d18/da9/cb3 0 2026-03-10T07:51:37.082 INFO:tasks.workunit.client.1.vm08.stdout:1/535: truncate d2/d10/f3f 4657726 0 2026-03-10T07:51:37.086 INFO:tasks.workunit.client.1.vm08.stdout:3/570: sync 2026-03-10T07:51:37.086 INFO:tasks.workunit.client.1.vm08.stdout:3/571: chown d0/d3c/l1b 4736028 1 2026-03-10T07:51:37.087 INFO:tasks.workunit.client.1.vm08.stdout:3/572: chown d0/d3c/d18/fa5 51 1 2026-03-10T07:51:37.094 INFO:tasks.workunit.client.1.vm08.stdout:8/659: rmdir d0/df/d2e/d30/dc0 39 2026-03-10T07:51:37.096 INFO:tasks.workunit.client.1.vm08.stdout:1/536: unlink d2/d6/de/d1f/d40/c94 0 2026-03-10T07:51:37.096 INFO:tasks.workunit.client.1.vm08.stdout:1/537: readlink d2/l2c 0 2026-03-10T07:51:37.097 INFO:tasks.workunit.client.1.vm08.stdout:3/573: creat d0/d3c/d1f/d44/fb4 x:0 0 0 2026-03-10T07:51:37.099 INFO:tasks.workunit.client.1.vm08.stdout:7/532: dwrite d3/da/d25/d9/f41 [0,4194304] 0 2026-03-10T07:51:37.100 
INFO:tasks.workunit.client.1.vm08.stdout:0/554: creat dd/d10/d2f/d37/d64/d95/d58/d3d/d9b/fb1 x:0 0 0 2026-03-10T07:51:37.101 INFO:tasks.workunit.client.1.vm08.stdout:2/583: symlink d0/d1/d3/d39/d7d/d86/d55/db9/lbf 0 2026-03-10T07:51:37.106 INFO:tasks.workunit.client.1.vm08.stdout:0/555: sync 2026-03-10T07:51:37.110 INFO:tasks.workunit.client.1.vm08.stdout:1/538: mknod d2/d6/de/d47/cbc 0 2026-03-10T07:51:37.112 INFO:tasks.workunit.client.1.vm08.stdout:1/539: stat d2/d6/de/d1f/d26/d58/d8c/f87 0 2026-03-10T07:51:37.113 INFO:tasks.workunit.client.1.vm08.stdout:7/533: dwrite d3/da/d25/d9/d2f/d3a/d40/f55 [0,4194304] 0 2026-03-10T07:51:37.114 INFO:tasks.workunit.client.1.vm08.stdout:4/479: dwrite d5/da0/d12/d7b/d48/d4f/d8d/d91/fa6 [0,4194304] 0 2026-03-10T07:51:37.116 INFO:tasks.workunit.client.1.vm08.stdout:4/480: read d5/d1f/d70/f78 [841586,106815] 0 2026-03-10T07:51:37.119 INFO:tasks.workunit.client.1.vm08.stdout:6/588: dwrite d1/d3/df/d1d/d40/d87/f8e [0,4194304] 0 2026-03-10T07:51:37.124 INFO:tasks.workunit.client.1.vm08.stdout:6/589: write d1/d17/d2b/d5e/f96 [98418,73914] 0 2026-03-10T07:51:37.129 INFO:tasks.workunit.client.1.vm08.stdout:9/568: dwrite d2/f1a [0,4194304] 0 2026-03-10T07:51:37.147 INFO:tasks.workunit.client.1.vm08.stdout:5/618: link d0/d4/df/dbf/d41/l52 d0/d4/d19/d60/d6d/d70/d40/dba/lc9 0 2026-03-10T07:51:37.160 INFO:tasks.workunit.client.1.vm08.stdout:8/660: mknod d0/df/cd6 0 2026-03-10T07:51:37.163 INFO:tasks.workunit.client.1.vm08.stdout:7/534: mkdir d3/da/d25/d9/d2f/d3a/d40/d54/db5 0 2026-03-10T07:51:37.163 INFO:tasks.workunit.client.1.vm08.stdout:2/584: getdents d0/d1/d3/d10/d38/daf 0 2026-03-10T07:51:37.163 INFO:tasks.workunit.client.1.vm08.stdout:1/540: dread d2/f69 [0,4194304] 0 2026-03-10T07:51:37.163 INFO:tasks.workunit.client.1.vm08.stdout:6/590: chown d1/d17/f63 1 1 2026-03-10T07:51:37.163 INFO:tasks.workunit.client.1.vm08.stdout:6/591: write d1/d3/df/d1d/f9b [128284,104521] 0 2026-03-10T07:51:37.163 
INFO:tasks.workunit.client.1.vm08.stdout:9/569: mknod d2/de/d28/cc5 0 2026-03-10T07:51:37.163 INFO:tasks.workunit.client.1.vm08.stdout:7/535: chown d3/da/d25/d9/d2f/d3a/d4b/d67/c75 1 1 2026-03-10T07:51:37.163 INFO:tasks.workunit.client.1.vm08.stdout:0/556: truncate dd/d10/d14/d1b/d30/f4d 1110227 0 2026-03-10T07:51:37.164 INFO:tasks.workunit.client.1.vm08.stdout:6/592: write d1/d3/df/d44/f82 [2840388,25060] 0 2026-03-10T07:51:37.165 INFO:tasks.workunit.client.1.vm08.stdout:6/593: chown d1/d3/df/d1d/d40/l6a 128835751 1 2026-03-10T07:51:37.165 INFO:tasks.workunit.client.1.vm08.stdout:0/557: write dd/d10/d2f/d37/d64/d95/d5c/f63 [9126963,98676] 0 2026-03-10T07:51:37.171 INFO:tasks.workunit.client.1.vm08.stdout:0/558: write dd/d10/d2f/d37/d64/d95/d58/d3d/f40 [711078,21553] 0 2026-03-10T07:51:37.172 INFO:tasks.workunit.client.1.vm08.stdout:7/536: dwrite d3/fa4 [0,4194304] 0 2026-03-10T07:51:37.193 INFO:tasks.workunit.client.1.vm08.stdout:5/619: rename d0/d4/df/d12/d22/fa5 to d0/d4/df/dbf/daf/fca 0 2026-03-10T07:51:37.193 INFO:tasks.workunit.client.1.vm08.stdout:5/620: truncate d0/d4/d19/d60/d6d/d70/fb4 520678 0 2026-03-10T07:51:37.194 INFO:tasks.workunit.client.1.vm08.stdout:5/621: chown d0/d4/df/dbf/d41/dc8 2863 1 2026-03-10T07:51:37.195 INFO:tasks.workunit.client.1.vm08.stdout:5/622: write d0/d4/df/dbf/fa6 [3145657,108352] 0 2026-03-10T07:51:37.216 INFO:tasks.workunit.client.1.vm08.stdout:8/661: dread d0/f6 [0,4194304] 0 2026-03-10T07:51:37.247 INFO:tasks.workunit.client.1.vm08.stdout:1/541: mkdir d2/d6/de/d47/dbd 0 2026-03-10T07:51:37.279 INFO:tasks.workunit.client.1.vm08.stdout:7/537: mkdir d3/da/d25/d9/d2f/d4d/db6 0 2026-03-10T07:51:37.280 INFO:tasks.workunit.client.1.vm08.stdout:7/538: readlink d3/da/lb1 0 2026-03-10T07:51:37.282 INFO:tasks.workunit.client.1.vm08.stdout:5/623: creat d0/d4/d19/d60/d6d/fcb x:0 0 0 2026-03-10T07:51:37.290 INFO:tasks.workunit.client.1.vm08.stdout:8/662: rmdir d0/df/d2e/d30 39 2026-03-10T07:51:37.296 
INFO:tasks.workunit.client.1.vm08.stdout:1/542: unlink d2/d6/de/d70/f8d 0 2026-03-10T07:51:37.300 INFO:tasks.workunit.client.1.vm08.stdout:6/594: unlink d1/db/d24/d73/d79/l92 0 2026-03-10T07:51:37.300 INFO:tasks.workunit.client.1.vm08.stdout:3/574: write d0/d3c/d18/d32/d61/d52/f73 [1090465,5563] 0 2026-03-10T07:51:37.303 INFO:tasks.workunit.client.1.vm08.stdout:6/595: dread d1/d3/d3e/f5b [0,4194304] 0 2026-03-10T07:51:37.304 INFO:tasks.workunit.client.1.vm08.stdout:6/596: chown d1/d17/f63 31 1 2026-03-10T07:51:37.305 INFO:tasks.workunit.client.1.vm08.stdout:7/539: rename d3/da/d25/d9/d2f/d3a/d4b/db0 to d3/da/d25/d9/d2f/d39/d43/d4f/d5b/db7 0 2026-03-10T07:51:37.305 INFO:tasks.workunit.client.1.vm08.stdout:5/624: dread - d0/d8/f85 zero size 2026-03-10T07:51:37.306 INFO:tasks.workunit.client.1.vm08.stdout:3/575: read d0/d3c/d1f/d95/fab [158392,89731] 0 2026-03-10T07:51:37.307 INFO:tasks.workunit.client.1.vm08.stdout:4/481: getdents d5/da0/d12/d7b/d48/d4f/d8d 0 2026-03-10T07:51:37.312 INFO:tasks.workunit.client.1.vm08.stdout:8/663: dread d0/df/d2e/d30/f43 [0,4194304] 0 2026-03-10T07:51:37.312 INFO:tasks.workunit.client.1.vm08.stdout:7/540: read - d3/da/d25/d9/d2f/d39/d43/d4f/d5b/db7/f81 zero size 2026-03-10T07:51:37.313 INFO:tasks.workunit.client.1.vm08.stdout:1/543: creat d2/d6/de/d1f/d26/d98/fbe x:0 0 0 2026-03-10T07:51:37.318 INFO:tasks.workunit.client.1.vm08.stdout:7/541: stat d3/da/d25/d9/d2f/d39/d43/d4f/d5b/db7/l6d 0 2026-03-10T07:51:37.318 INFO:tasks.workunit.client.1.vm08.stdout:8/664: truncate d0/df/d15/d23/da8/fc2 147638 0 2026-03-10T07:51:37.320 INFO:tasks.workunit.client.1.vm08.stdout:0/559: link dd/d10/d2f/d37/d64/f68 dd/d10/d2f/d37/d64/d95/d58/fb2 0 2026-03-10T07:51:37.322 INFO:tasks.workunit.client.1.vm08.stdout:6/597: unlink d1/db/d24/d73/d79/c9f 0 2026-03-10T07:51:37.323 INFO:tasks.workunit.client.1.vm08.stdout:2/585: dwrite d0/d1/d3/d39/d7d/d86/d55/d1b/f26 [0,4194304] 0 2026-03-10T07:51:37.327 INFO:tasks.workunit.client.1.vm08.stdout:6/598: dread 
d1/f35 [0,4194304] 0 2026-03-10T07:51:37.328 INFO:tasks.workunit.client.1.vm08.stdout:6/599: truncate d1/d3/df/d44/f82 4906865 0 2026-03-10T07:51:37.339 INFO:tasks.workunit.client.1.vm08.stdout:5/625: symlink d0/d8/d5e/lcc 0 2026-03-10T07:51:37.342 INFO:tasks.workunit.client.1.vm08.stdout:9/570: getdents d2/d3/d84 0 2026-03-10T07:51:37.374 INFO:tasks.workunit.client.1.vm08.stdout:1/544: rename d2/d6/de/f32 to d2/d6/de/d1f/d26/d89/d8e/fbf 0 2026-03-10T07:51:37.378 INFO:tasks.workunit.client.1.vm08.stdout:7/542: unlink d3/da/d25/d9/d2f/d3a/d40/d54/l7c 0 2026-03-10T07:51:37.378 INFO:tasks.workunit.client.1.vm08.stdout:7/543: chown d3/da/d25/d9/d2f/d39/d43 3 1 2026-03-10T07:51:37.389 INFO:tasks.workunit.client.1.vm08.stdout:0/560: dwrite dd/d10/d14/d15/f94 [0,4194304] 0 2026-03-10T07:51:37.396 INFO:tasks.workunit.client.1.vm08.stdout:0/561: write dd/d10/d2f/d37/d64/d95/d5c/f63 [7149602,42620] 0 2026-03-10T07:51:37.396 INFO:tasks.workunit.client.1.vm08.stdout:5/626: chown d0/d4/df/dbf/daf/fca 1806154 1 2026-03-10T07:51:37.396 INFO:tasks.workunit.client.1.vm08.stdout:2/586: symlink d0/d1/d3/d56/d57/lc0 0 2026-03-10T07:51:37.402 INFO:tasks.workunit.client.1.vm08.stdout:9/571: rmdir d2/d3/d84 39 2026-03-10T07:51:37.404 INFO:tasks.workunit.client.1.vm08.stdout:9/572: fsync d2/de/d28/fa8 0 2026-03-10T07:51:37.405 INFO:tasks.workunit.client.1.vm08.stdout:6/600: write d1/db/d24/dac/dad/f59 [2453852,53621] 0 2026-03-10T07:51:37.430 INFO:tasks.workunit.client.1.vm08.stdout:2/587: rmdir d0 39 2026-03-10T07:51:37.432 INFO:tasks.workunit.client.1.vm08.stdout:3/576: link d0/d3c/d1f/d44/f59 d0/d3c/d1f/d44/d51/d34/fb5 0 2026-03-10T07:51:37.434 INFO:tasks.workunit.client.1.vm08.stdout:4/482: truncate d5/f21 1584128 0 2026-03-10T07:51:37.438 INFO:tasks.workunit.client.1.vm08.stdout:5/627: dread d0/d8/d24/f56 [0,4194304] 0 2026-03-10T07:51:37.442 INFO:tasks.workunit.client.1.vm08.stdout:8/665: unlink d0/df/d15/d23/d39/d5b/c79 0 2026-03-10T07:51:37.446 
INFO:tasks.workunit.client.1.vm08.stdout:1/545: dwrite d2/d6/d50/f54 [0,4194304] 0
2026-03-10T07:51:37.457 INFO:tasks.workunit.client.1.vm08.stdout:0/562: creat dd/d10/d2f/d37/daf/fb3 x:0 0 0
2026-03-10T07:51:37.459 INFO:tasks.workunit.client.1.vm08.stdout:3/577: stat d0/d3c/d1f/d44/d51/f9d 0
2026-03-10T07:51:37.461 INFO:tasks.workunit.client.1.vm08.stdout:8/666: dread d0/df/d17/d72/f91 [0,4194304] 0
2026-03-10T07:51:37.462 INFO:tasks.workunit.client.1.vm08.stdout:0/563: read dd/d10/d14/d15/d20/d5f/f7f [723262,28200] 0
2026-03-10T07:51:37.469 INFO:tasks.workunit.client.1.vm08.stdout:0/564: dread dd/fe [0,4194304] 0
2026-03-10T07:51:37.478 INFO:tasks.workunit.client.1.vm08.stdout:6/601: truncate d1/d3/d3e/f5b 554620 0
2026-03-10T07:51:37.486 INFO:tasks.workunit.client.1.vm08.stdout:9/573: dwrite d2/d26/f29 [0,4194304] 0
2026-03-10T07:51:37.491 INFO:tasks.workunit.client.1.vm08.stdout:4/483: rename d5/da0/d12/d7b/d48/d64 to d5/da0/d12/d7b/da7 0
2026-03-10T07:51:37.492 INFO:tasks.workunit.client.1.vm08.stdout:7/544: dwrite d3/da/d25/f27 [0,4194304] 0
2026-03-10T07:51:37.492 INFO:tasks.workunit.client.1.vm08.stdout:9/574: fdatasync d2/d58/dbf/d2b/f83 0
2026-03-10T07:51:37.493 INFO:tasks.workunit.client.1.vm08.stdout:8/667: creat d0/df/d15/d23/d54/dba/d89/dc5/fd7 x:0 0 0
2026-03-10T07:51:37.493 INFO:tasks.workunit.client.1.vm08.stdout:5/628: dwrite d0/d4/d19/d81/d92/f73 [0,4194304] 0
2026-03-10T07:51:37.494 INFO:tasks.workunit.client.1.vm08.stdout:7/545: truncate d3/f34 1365525 0
2026-03-10T07:51:37.494 INFO:tasks.workunit.client.1.vm08.stdout:5/629: write d0/d4/d19/d3a/fa1 [824424,64567] 0
2026-03-10T07:51:37.496 INFO:tasks.workunit.client.1.vm08.stdout:7/546: chown d3/da/d25/d9 75962 1
2026-03-10T07:51:37.496 INFO:tasks.workunit.client.1.vm08.stdout:0/565: readlink dd/d10/d14/d15/d20/d22/l9e 0
2026-03-10T07:51:37.503 INFO:tasks.workunit.client.1.vm08.stdout:8/668: dwrite d0/df/d15/d23/f3d [0,4194304] 0
2026-03-10T07:51:37.512 INFO:tasks.workunit.client.1.vm08.stdout:6/602: chown d1/d3/df/d1d/c69 7 1
2026-03-10T07:51:37.525 INFO:tasks.workunit.client.1.vm08.stdout:7/547: unlink d3/da/d25/d9/d2f/d39/c90 0
2026-03-10T07:51:37.528 INFO:tasks.workunit.client.1.vm08.stdout:7/548: stat d3/da/d25/d9/d2f/d39/caa 0
2026-03-10T07:51:37.528 INFO:tasks.workunit.client.1.vm08.stdout:4/484: sync
2026-03-10T07:51:37.529 INFO:tasks.workunit.client.1.vm08.stdout:4/485: fdatasync d5/d8/ff 0
2026-03-10T07:51:37.529 INFO:tasks.workunit.client.1.vm08.stdout:7/549: dread d3/da/d25/d9/d2f/d3a/d4b/fa3 [0,4194304] 0
2026-03-10T07:51:37.530 INFO:tasks.workunit.client.1.vm08.stdout:0/566: creat dd/d10/d2f/d37/d64/d95/d58/d3d/d9b/fb4 x:0 0 0
2026-03-10T07:51:37.534 INFO:tasks.workunit.client.1.vm08.stdout:8/669: readlink d0/df/d15/d23/d39/d5b/d4a/lb8 0
2026-03-10T07:51:37.538 INFO:tasks.workunit.client.1.vm08.stdout:8/670: dwrite d0/df/d15/d23/d54/fad [0,4194304] 0
2026-03-10T07:51:37.541 INFO:tasks.workunit.client.1.vm08.stdout:8/671: readlink d0/d37/lbb 0
2026-03-10T07:51:37.575 INFO:tasks.workunit.client.1.vm08.stdout:4/486: fsync d5/da0/f18 0
2026-03-10T07:51:37.586 INFO:tasks.workunit.client.1.vm08.stdout:5/630: write d0/d4/d19/d43/f35 [257602,2273] 0
2026-03-10T07:51:37.594 INFO:tasks.workunit.client.1.vm08.stdout:2/588: getdents d0/d1/d3/d39/d7d/d86/d55 0
2026-03-10T07:51:37.598 INFO:tasks.workunit.client.1.vm08.stdout:3/578: getdents d0/d3c/d18/d32/d61/d52 0
2026-03-10T07:51:37.602 INFO:tasks.workunit.client.1.vm08.stdout:6/603: write d1/f28 [2694985,84659] 0
2026-03-10T07:51:37.602 INFO:tasks.workunit.client.1.vm08.stdout:9/575: write d2/d26/fc1 [105848,55184] 0
2026-03-10T07:51:37.602 INFO:tasks.workunit.client.1.vm08.stdout:6/604: readlink d1/d3/l9 0
2026-03-10T07:51:37.603 INFO:tasks.workunit.client.1.vm08.stdout:9/576: dread - d2/d58/dbf/d30/fac zero size
2026-03-10T07:51:37.604 INFO:tasks.workunit.client.1.vm08.stdout:6/605: write d1/d17/d2b/d58/d77/fb7 [511592,15710] 0
2026-03-10T07:51:37.607 INFO:tasks.workunit.client.1.vm08.stdout:1/546: rename d2/d6/de/d1f/d26/d89/d8e/f90 to d2/d6/de/fc0 0
2026-03-10T07:51:37.616 INFO:tasks.workunit.client.1.vm08.stdout:7/550: mkdir d3/da/d25/db8 0
2026-03-10T07:51:37.616 INFO:tasks.workunit.client.1.vm08.stdout:4/487: dwrite d5/da0/d12/d7b/f43 [0,4194304] 0
2026-03-10T07:51:37.629 INFO:tasks.workunit.client.1.vm08.stdout:8/672: mknod d0/df/d15/d23/cd8 0
2026-03-10T07:51:37.634 INFO:tasks.workunit.client.1.vm08.stdout:2/589: mknod d0/d1/d3/d56/d78/dad/db1/d61/d84/cc1 0
2026-03-10T07:51:37.634 INFO:tasks.workunit.client.1.vm08.stdout:9/577: unlink d2/d58/dbf/d30/f87 0
2026-03-10T07:51:37.634 INFO:tasks.workunit.client.1.vm08.stdout:2/590: dwrite d0/d1/d17/d6b/f9a [0,4194304] 0
2026-03-10T07:51:37.634 INFO:tasks.workunit.client.1.vm08.stdout:2/591: chown d0/d1/d3/d39/d7d/d86/fa9 1 1
2026-03-10T07:51:37.640 INFO:tasks.workunit.client.1.vm08.stdout:6/606: rmdir d1/d3/df/d1d 39
2026-03-10T07:51:37.648 INFO:tasks.workunit.client.1.vm08.stdout:0/567: truncate dd/d10/d14/d15/d20/d5f/f61 743289 0
2026-03-10T07:51:37.651 INFO:tasks.workunit.client.1.vm08.stdout:4/488: readlink d5/l40 0
2026-03-10T07:51:37.651 INFO:tasks.workunit.client.1.vm08.stdout:7/551: creat d3/da/d25/d9/d2f/d4d/fb9 x:0 0 0
2026-03-10T07:51:37.652 INFO:tasks.workunit.client.1.vm08.stdout:5/631: creat d0/d8/d5e/d8e/dae/fcd x:0 0 0
2026-03-10T07:51:37.655 INFO:tasks.workunit.client.1.vm08.stdout:4/489: read d5/d1f/d31/f33 [2365701,106194] 0
2026-03-10T07:51:37.657 INFO:tasks.workunit.client.1.vm08.stdout:8/673: creat d0/d69/fd9 x:0 0 0
2026-03-10T07:51:37.657 INFO:tasks.workunit.client.1.vm08.stdout:5/632: dwrite d0/d4/d19/d81/d92/f65 [0,4194304] 0
2026-03-10T07:51:37.671 INFO:tasks.workunit.client.1.vm08.stdout:3/579: fdatasync d0/f9a 0
2026-03-10T07:51:37.673 INFO:tasks.workunit.client.1.vm08.stdout:9/578: creat d2/d58/fc6 x:0 0 0
2026-03-10T07:51:37.676 INFO:tasks.workunit.client.1.vm08.stdout:1/547: dread d2/d6/de/d1f/f75 [0,4194304] 0
2026-03-10T07:51:37.677 INFO:tasks.workunit.client.1.vm08.stdout:2/592: creat d0/d1/d3/d39/d7d/d86/d55/db9/fc2 x:0 0 0
2026-03-10T07:51:37.682 INFO:tasks.workunit.client.1.vm08.stdout:9/579: dread d2/d58/f95 [0,4194304] 0
2026-03-10T07:51:37.683 INFO:tasks.workunit.client.1.vm08.stdout:4/490: sync
2026-03-10T07:51:37.685 INFO:tasks.workunit.client.1.vm08.stdout:2/593: dwrite d0/d1/d17/f95 [0,4194304] 0
2026-03-10T07:51:37.692 INFO:tasks.workunit.client.1.vm08.stdout:7/552: rename d3/da/l83 to d3/da/d25/d9/d2f/d3a/d4b/d67/lba 0
2026-03-10T07:51:37.699 INFO:tasks.workunit.client.1.vm08.stdout:8/674: readlink d0/df/d15/d23/da8/lc8 0
2026-03-10T07:51:37.699 INFO:tasks.workunit.client.1.vm08.stdout:5/633: fsync d0/d4/d19/d3a/d69/f71 0
2026-03-10T07:51:37.707 INFO:tasks.workunit.client.1.vm08.stdout:3/580: rmdir d0/d3c/d1f/d44/d51/d34 39
2026-03-10T07:51:37.711 INFO:tasks.workunit.client.1.vm08.stdout:3/581: dwrite d0/d3c/d18/d4a/f8a [0,4194304] 0
2026-03-10T07:51:37.712 INFO:tasks.workunit.client.1.vm08.stdout:3/582: fdatasync d0/f10 0
2026-03-10T07:51:37.712 INFO:tasks.workunit.client.1.vm08.stdout:3/583: stat d0/d3c/d18/d32/d61/d83/lad 0
2026-03-10T07:51:37.729 INFO:tasks.workunit.client.1.vm08.stdout:9/580: truncate d2/d58/dbf/f63 840183 0
2026-03-10T07:51:37.730 INFO:tasks.workunit.client.1.vm08.stdout:9/581: write d2/d58/dbf/d30/d35/fc3 [384192,104424] 0
2026-03-10T07:51:37.734 INFO:tasks.workunit.client.1.vm08.stdout:2/594: mkdir d0/d1/d17/db2/dc3 0
2026-03-10T07:51:37.736 INFO:tasks.workunit.client.1.vm08.stdout:4/491: dwrite d5/d1f/d41/f83 [0,4194304] 0
2026-03-10T07:51:37.739 INFO:tasks.workunit.client.1.vm08.stdout:7/553: unlink d3/l33 0
2026-03-10T07:51:37.752 INFO:tasks.workunit.client.1.vm08.stdout:8/675: fsync d0/df/f13 0
2026-03-10T07:51:37.752 INFO:tasks.workunit.client.1.vm08.stdout:8/676: readlink d0/df/d17/lb2 0
2026-03-10T07:51:37.772 INFO:tasks.workunit.client.1.vm08.stdout:3/584: mknod d0/d3c/d18/d32/daa/cb6 0
2026-03-10T07:51:37.774 INFO:tasks.workunit.client.1.vm08.stdout:5/634: dwrite d0/d4/df/dbf/f49 [0,4194304] 0
2026-03-10T07:51:37.788 INFO:tasks.workunit.client.1.vm08.stdout:1/548: mkdir d2/d6/de/d71/dc1 0
2026-03-10T07:51:37.791 INFO:tasks.workunit.client.1.vm08.stdout:6/607: link d1/d17/f66 d1/db/fc1 0
2026-03-10T07:51:37.793 INFO:tasks.workunit.client.1.vm08.stdout:9/582: mknod d2/d58/dbf/d30/d35/d97/d9d/cc7 0
2026-03-10T07:51:37.795 INFO:tasks.workunit.client.1.vm08.stdout:2/595: creat d0/d1/d3/d39/d7d/d86/d55/d1b/fc4 x:0 0 0
2026-03-10T07:51:37.806 INFO:tasks.workunit.client.1.vm08.stdout:0/568: dwrite dd/d10/d14/d15/d20/d5f/f61 [0,4194304] 0
2026-03-10T07:51:37.808 INFO:tasks.workunit.client.1.vm08.stdout:8/677: symlink d0/d37/d86/lda 0
2026-03-10T07:51:37.823 INFO:tasks.workunit.client.1.vm08.stdout:5/635: rename d0/d8/d5e/d8e/dae to d0/d8/dce 0
2026-03-10T07:51:37.826 INFO:tasks.workunit.client.1.vm08.stdout:1/549: fdatasync d2/d6/de/d1f/d26/d89/d8e/fbf 0
2026-03-10T07:51:37.830 INFO:tasks.workunit.client.1.vm08.stdout:2/596: creat d0/d1/d3/d10/d65/fc5 x:0 0 0
2026-03-10T07:51:37.833 INFO:tasks.workunit.client.1.vm08.stdout:4/492: creat d5/d1f/d9b/fa8 x:0 0 0
2026-03-10T07:51:37.833 INFO:tasks.workunit.client.1.vm08.stdout:4/493: dread - d5/da0/f69 zero size
2026-03-10T07:51:37.835 INFO:tasks.workunit.client.1.vm08.stdout:5/636: sync
2026-03-10T07:51:37.841 INFO:tasks.workunit.client.1.vm08.stdout:0/569: unlink dd/d10/d2f/d37/d64/d95/l78 0
2026-03-10T07:51:37.842 INFO:tasks.workunit.client.1.vm08.stdout:0/570: fdatasync dd/d10/d14/d15/d20/d5f/f7f 0
2026-03-10T07:51:37.844 INFO:tasks.workunit.client.1.vm08.stdout:3/585: stat d0/d3c/d1f/d44/d51/d34/f4e 0
2026-03-10T07:51:37.847 INFO:tasks.workunit.client.1.vm08.stdout:1/550: mkdir d2/d6/de/d1f/d26/d58/d83/dc2 0
2026-03-10T07:51:37.848 INFO:tasks.workunit.client.1.vm08.stdout:0/571: dwrite dd/d10/d14/d15/f9c [0,4194304] 0
2026-03-10T07:51:37.850 INFO:tasks.workunit.client.1.vm08.stdout:9/583: unlink d2/d58/dbf/d30/f47 0
2026-03-10T07:51:37.858 INFO:tasks.workunit.client.1.vm08.stdout:3/586: dread d0/d3c/d1f/d44/f59 [0,4194304] 0
2026-03-10T07:51:37.868 INFO:tasks.workunit.client.1.vm08.stdout:0/572: dread dd/d10/d14/d15/f84 [0,4194304] 0
2026-03-10T07:51:37.872 INFO:tasks.workunit.client.1.vm08.stdout:4/494: creat d5/d1f/d70/fa9 x:0 0 0
2026-03-10T07:51:37.877 INFO:tasks.workunit.client.1.vm08.stdout:1/551: read d2/d6/de/f74 [1795881,99497] 0
2026-03-10T07:51:37.897 INFO:tasks.workunit.client.1.vm08.stdout:9/584: creat d2/d58/dbf/d30/d35/fc8 x:0 0 0
2026-03-10T07:51:37.898 INFO:tasks.workunit.client.1.vm08.stdout:9/585: chown d2/de 21634 1
2026-03-10T07:51:37.906 INFO:tasks.workunit.client.1.vm08.stdout:3/587: chown d0/d3c/d1f/d44/d51/f5b 969286537 1
2026-03-10T07:51:37.907 INFO:tasks.workunit.client.1.vm08.stdout:7/554: getdents d3/da/d25/d9/d2f/d39 0
2026-03-10T07:51:37.909 INFO:tasks.workunit.client.1.vm08.stdout:0/573: unlink dd/d10/d14/d15/d20/d5f/f7f 0
2026-03-10T07:51:37.923 INFO:tasks.workunit.client.1.vm08.stdout:6/608: link d1/d3/df/d1d/d40/f6e d1/d3/df/d1d/d40/d87/fc2 0
2026-03-10T07:51:37.927 INFO:tasks.workunit.client.1.vm08.stdout:9/586: unlink d2/de/l7b 0
2026-03-10T07:51:37.929 INFO:tasks.workunit.client.1.vm08.stdout:6/609: dread d1/db/d24/dac/dad/f59 [0,4194304] 0
2026-03-10T07:51:37.931 INFO:tasks.workunit.client.1.vm08.stdout:3/588: mknod d0/d3c/d1f/d44/cb7 0
2026-03-10T07:51:37.932 INFO:tasks.workunit.client.1.vm08.stdout:7/555: write d3/da/f1d [3441014,100754] 0
2026-03-10T07:51:37.936 INFO:tasks.workunit.client.1.vm08.stdout:5/637: link d0/d4/d19/d50/f9b d0/d4/d19/d81/d92/fcf 0
2026-03-10T07:51:37.937 INFO:tasks.workunit.client.1.vm08.stdout:7/556: dwrite d3/f34 [0,4194304] 0
2026-03-10T07:51:37.937 INFO:tasks.workunit.client.1.vm08.stdout:5/638: fsync d0/d4/d19/d60/d6d/d70/d40/f5f 0
2026-03-10T07:51:37.937 INFO:tasks.workunit.client.1.vm08.stdout:8/678: getdents d0/df/d15/d23/d54 0
2026-03-10T07:51:37.938 INFO:tasks.workunit.client.1.vm08.stdout:1/552: mkdir d2/d6/de/d47/dbd/dc3 0
2026-03-10T07:51:37.946 INFO:tasks.workunit.client.1.vm08.stdout:7/557: sync
2026-03-10T07:51:37.946 INFO:tasks.workunit.client.1.vm08.stdout:3/589: mknod d0/d3c/d1f/d89/cb8 0
2026-03-10T07:51:37.949 INFO:tasks.workunit.client.1.vm08.stdout:3/590: dwrite d0/f39 [4194304,4194304] 0
2026-03-10T07:51:37.951 INFO:tasks.workunit.client.1.vm08.stdout:9/587: symlink d2/lc9 0
2026-03-10T07:51:37.954 INFO:tasks.workunit.client.1.vm08.stdout:5/639: rename d0/d4/d19/d50 to d0/d8/d24/dd0 0
2026-03-10T07:51:37.958 INFO:tasks.workunit.client.1.vm08.stdout:4/495: dwrite d5/d1f/d70/f78 [0,4194304] 0
2026-03-10T07:51:37.964 INFO:tasks.workunit.client.1.vm08.stdout:4/496: dread d5/da0/d32/f44 [0,4194304] 0
2026-03-10T07:51:37.964 INFO:tasks.workunit.client.1.vm08.stdout:2/597: dwrite d0/d1/d3/d56/d57/f79 [0,4194304] 0
2026-03-10T07:51:37.973 INFO:tasks.workunit.client.1.vm08.stdout:7/558: truncate d3/f51 1580716 0
2026-03-10T07:51:37.974 INFO:tasks.workunit.client.1.vm08.stdout:8/679: creat d0/df/d15/d23/d54/dba/d89/dbf/fdb x:0 0 0
2026-03-10T07:51:37.986 INFO:tasks.workunit.client.1.vm08.stdout:2/598: rename d0/d1/d3/d39/d7d/d86/d55/d1b/faa to d0/d1/d3/d56/d78/dad/fc6 0
2026-03-10T07:51:37.987 INFO:tasks.workunit.client.1.vm08.stdout:2/599: write d0/d1/fa2 [1231147,85991] 0
2026-03-10T07:51:37.989 INFO:tasks.workunit.client.1.vm08.stdout:9/588: unlink d2/d58/dbf/d30/d35/d97/d9d/fb9 0
2026-03-10T07:51:37.989 INFO:tasks.workunit.client.1.vm08.stdout:9/589: fsync d2/d26/f29 0
2026-03-10T07:51:37.991 INFO:tasks.workunit.client.1.vm08.stdout:2/600: chown d0/d1/d3/d56/d78/dad/db1/d61/d84/cc1 17 1
2026-03-10T07:51:38.022 INFO:tasks.workunit.client.1.vm08.stdout:8/680: unlink d0/df/d17/d25/c68 0
2026-03-10T07:51:38.030 INFO:tasks.workunit.client.1.vm08.stdout:5/640: getdents d0/d4/d19/d60/d6d/d70 0
2026-03-10T07:51:38.035 INFO:tasks.workunit.client.1.vm08.stdout:6/610: write d1/d3/d3e/f81 [223046,98624] 0
2026-03-10T07:51:38.039 INFO:tasks.workunit.client.1.vm08.stdout:2/601: dread d0/d1/d3/d56/d78/dad/db1/d61/f59 [0,4194304] 0
2026-03-10T07:51:38.041 INFO:tasks.workunit.client.1.vm08.stdout:2/602: read d0/f68 [892905,60779] 0
2026-03-10T07:51:38.044 INFO:tasks.workunit.client.1.vm08.stdout:0/574: truncate dd/f44 887055 0
2026-03-10T07:51:38.044 INFO:tasks.workunit.client.1.vm08.stdout:4/497: link d5/da0/d12/d7b/l3b d5/d1f/d9b/laa 0
2026-03-10T07:51:38.045 INFO:tasks.workunit.client.1.vm08.stdout:8/681: symlink d0/df/d2e/d49/ldc 0
2026-03-10T07:51:38.046 INFO:tasks.workunit.client.1.vm08.stdout:5/641: readlink d0/d4/d19/d60/d6d/d70/d40/dba/lc9 0
2026-03-10T07:51:38.046 INFO:tasks.workunit.client.1.vm08.stdout:5/642: readlink d0/d4/l9a 0
2026-03-10T07:51:38.050 INFO:tasks.workunit.client.1.vm08.stdout:6/611: creat d1/db/d24/fc3 x:0 0 0
2026-03-10T07:51:38.069 INFO:tasks.workunit.client.1.vm08.stdout:1/553: dwrite d2/d6/de/d1f/d26/d58/d83/f72 [0,4194304] 0
2026-03-10T07:51:38.069 INFO:tasks.workunit.client.1.vm08.stdout:9/590: getdents d2/de/d28/d98 0
2026-03-10T07:51:38.069 INFO:tasks.workunit.client.1.vm08.stdout:1/554: dread d2/f36 [0,4194304] 0
2026-03-10T07:51:38.069 INFO:tasks.workunit.client.1.vm08.stdout:0/575: creat dd/d10/d14/d15/d20/d22/fb5 x:0 0 0
2026-03-10T07:51:38.069 INFO:tasks.workunit.client.1.vm08.stdout:5/643: mknod d0/d4/d19/d3a/cd1 0
2026-03-10T07:51:38.069 INFO:tasks.workunit.client.1.vm08.stdout:5/644: write d0/d4/d19/d81/da4/fc2 [831663,56714] 0
2026-03-10T07:51:38.069 INFO:tasks.workunit.client.1.vm08.stdout:9/591: read d2/d58/dbf/d30/d35/fc3 [132722,109406] 0
2026-03-10T07:51:38.069 INFO:tasks.workunit.client.1.vm08.stdout:6/612: dwrite d1/db/d24/d73/d79/d7c/fa3 [0,4194304] 0
2026-03-10T07:51:38.070 INFO:tasks.workunit.client.1.vm08.stdout:4/498: sync
2026-03-10T07:51:38.084 INFO:tasks.workunit.client.1.vm08.stdout:5/645: rename d0/d8/d5e/d8e/dac to d0/d8/dce/dd2 0
2026-03-10T07:51:38.085 INFO:tasks.workunit.client.1.vm08.stdout:1/555: getdents d2/d6/d3a/d61/d6f/dad 0
2026-03-10T07:51:38.086 INFO:tasks.workunit.client.1.vm08.stdout:1/556: read - d2/d6/de/fa5 zero size
2026-03-10T07:51:38.087 INFO:tasks.workunit.client.1.vm08.stdout:8/682: creat d0/df/fdd x:0 0 0
2026-03-10T07:51:38.092 INFO:tasks.workunit.client.1.vm08.stdout:6/613: mknod d1/db/d24/d73/d79/d7c/cc4 0
2026-03-10T07:51:38.093 INFO:tasks.workunit.client.1.vm08.stdout:4/499: stat d5/d8/c2e 0
2026-03-10T07:51:38.093 INFO:tasks.workunit.client.1.vm08.stdout:6/614: write d1/d3/df/d44/f5a [24226,34581] 0
2026-03-10T07:51:38.094 INFO:tasks.workunit.client.1.vm08.stdout:0/576: rename dd/d18/f96 to dd/d10/d14/d1b/da5/fb6 0
2026-03-10T07:51:38.095 INFO:tasks.workunit.client.1.vm08.stdout:9/592: mkdir d2/d3/d84/dca 0
2026-03-10T07:51:38.095 INFO:tasks.workunit.client.1.vm08.stdout:0/577: write dd/d10/d2f/d37/d64/d95/d5c/f63 [8823268,113521] 0
2026-03-10T07:51:38.098 INFO:tasks.workunit.client.1.vm08.stdout:4/500: dwrite d5/d8/f1e [4194304,4194304] 0
2026-03-10T07:51:38.105 INFO:tasks.workunit.client.1.vm08.stdout:6/615: dwrite d1/d3/df/d1d/d40/d45/d5c/fb9 [0,4194304] 0
2026-03-10T07:51:38.105 INFO:tasks.workunit.client.1.vm08.stdout:1/557: creat d2/d10/fc4 x:0 0 0
2026-03-10T07:51:38.105 INFO:tasks.workunit.client.1.vm08.stdout:1/558: write d2/d6/de/d1f/da9/faf [95220,38076] 0
2026-03-10T07:51:38.108 INFO:tasks.workunit.client.1.vm08.stdout:6/616: dread d1/d17/f66 [0,4194304] 0
2026-03-10T07:51:38.110 INFO:tasks.workunit.client.1.vm08.stdout:6/617: read - d1/db/f57 zero size
2026-03-10T07:51:38.112 INFO:tasks.workunit.client.1.vm08.stdout:7/559: dwrite d3/da/f17 [0,4194304] 0
2026-03-10T07:51:38.125 INFO:tasks.workunit.client.1.vm08.stdout:9/593: creat d2/de/d28/d98/dbb/fcb x:0 0 0
2026-03-10T07:51:38.125 INFO:tasks.workunit.client.1.vm08.stdout:5/646: getdents d0/d4/df/dbf/d41/dad 0
2026-03-10T07:51:38.125 INFO:tasks.workunit.client.1.vm08.stdout:8/683: getdents d0/d69 0
2026-03-10T07:51:38.135 INFO:tasks.workunit.client.1.vm08.stdout:0/578: rename f2 to dd/d10/d14/d1b/fb7 0
2026-03-10T07:51:38.139 INFO:tasks.workunit.client.1.vm08.stdout:4/501: rename d5/da0/d12/d7b/da7/l65 to d5/da0/d12/d7b/d48/d4f/d7c/lab 0
2026-03-10T07:51:38.139 INFO:tasks.workunit.client.1.vm08.stdout:7/560: dread d3/f2b [0,4194304] 0
2026-03-10T07:51:38.143 INFO:tasks.workunit.client.1.vm08.stdout:6/618: rename d1/db/d24/fc3 to d1/d17/d2b/d58/d76/fc5 0
2026-03-10T07:51:38.143 INFO:tasks.workunit.client.1.vm08.stdout:7/561: rename d3/da/d25/d9/d2f to d3/da/d25/d9/d2f/d3a/d71/dbb 22
2026-03-10T07:51:38.143 INFO:tasks.workunit.client.1.vm08.stdout:6/619: stat d1/db/d24/f50 0
2026-03-10T07:51:38.145 INFO:tasks.workunit.client.1.vm08.stdout:5/647: link d0/d4/f2e d0/d4/d19/d60/d6d/d70/dc5/fd3 0
2026-03-10T07:51:38.145 INFO:tasks.workunit.client.1.vm08.stdout:8/684: creat d0/df/d17/fde x:0 0 0
2026-03-10T07:51:38.146 INFO:tasks.workunit.client.1.vm08.stdout:6/620: mknod d1/d3/d3e/db2/cc6 0
2026-03-10T07:51:38.148 INFO:tasks.workunit.client.1.vm08.stdout:8/685: dread d0/df/d15/d23/da8/fc2 [0,4194304] 0
2026-03-10T07:51:38.161 INFO:tasks.workunit.client.1.vm08.stdout:4/502: link l3 d5/d1f/d31/lac 0
2026-03-10T07:51:38.168 INFO:tasks.workunit.client.1.vm08.stdout:6/621: write d1/d17/d2b/d58/d76/fc5 [1025848,45063] 0
2026-03-10T07:51:38.168 INFO:tasks.workunit.client.1.vm08.stdout:4/503: write d5/d1f/d31/f82 [4508688,65734] 0
2026-03-10T07:51:38.168 INFO:tasks.workunit.client.1.vm08.stdout:8/686: symlink d0/df/d2e/d49/ldf 0
2026-03-10T07:51:38.168 INFO:tasks.workunit.client.1.vm08.stdout:8/687: dread - d0/df/d15/d23/d54/dba/d89/fac zero size
2026-03-10T07:51:38.168 INFO:tasks.workunit.client.1.vm08.stdout:6/622: creat d1/d3/d3e/db2/fc7 x:0 0 0
2026-03-10T07:51:38.175 INFO:tasks.workunit.client.1.vm08.stdout:8/688: unlink d0/d69/f46 0
2026-03-10T07:51:38.176 INFO:tasks.workunit.client.1.vm08.stdout:6/623: mknod d1/d17/d2b/cc8 0
2026-03-10T07:51:38.177 INFO:tasks.workunit.client.1.vm08.stdout:6/624: write d1/d3/df/d1d/d40/d45/d5c/fb9 [3663869,70423] 0
2026-03-10T07:51:38.182 INFO:tasks.workunit.client.1.vm08.stdout:8/689: creat d0/df/d2e/d49/fe0 x:0 0 0
2026-03-10T07:51:38.192 INFO:tasks.workunit.client.1.vm08.stdout:8/690: truncate d0/df/d15/d23/d54/dba/d89/fa9 498220 0
2026-03-10T07:51:38.192 INFO:tasks.workunit.client.1.vm08.stdout:8/691: write d0/df/d15/d23/f75 [2035203,75670] 0
2026-03-10T07:51:38.192 INFO:tasks.workunit.client.1.vm08.stdout:0/579: sync
2026-03-10T07:51:38.202 INFO:tasks.workunit.client.1.vm08.stdout:0/580: rmdir dd 39
2026-03-10T07:51:38.215 INFO:tasks.workunit.client.1.vm08.stdout:6/625: getdents d1/db/d24/dac/dad 0
2026-03-10T07:51:38.216 INFO:tasks.workunit.client.1.vm08.stdout:6/626: chown d1/f49 1 1
2026-03-10T07:51:38.222 INFO:tasks.workunit.client.1.vm08.stdout:8/692: dread d0/fa [0,4194304] 0
2026-03-10T07:51:38.234 INFO:tasks.workunit.client.1.vm08.stdout:0/581: write dd/d18/f3c [4112524,52963] 0
2026-03-10T07:51:38.242 INFO:tasks.workunit.client.1.vm08.stdout:6/627: stat d1/db/c65 0
2026-03-10T07:51:38.245 INFO:tasks.workunit.client.1.vm08.stdout:3/591: dwrite d0/d3c/d1f/d89/fa4 [0,4194304] 0
2026-03-10T07:51:38.246 INFO:tasks.workunit.client.1.vm08.stdout:3/592: readlink d0/d3c/l1c 0
2026-03-10T07:51:38.254 INFO:tasks.workunit.client.1.vm08.stdout:2/603: dwrite d0/d1/d3/d39/fb7 [0,4194304] 0
2026-03-10T07:51:38.271 INFO:tasks.workunit.client.1.vm08.stdout:0/582: truncate dd/d10/d2f/d37/d64/d95/d5c/f8f 620307 0
2026-03-10T07:51:38.271 INFO:tasks.workunit.client.1.vm08.stdout:0/583: readlink dd/d18/l3a 0
2026-03-10T07:51:38.272 INFO:tasks.workunit.client.1.vm08.stdout:6/628: mknod d1/d17/d2b/d58/d76/cc9 0
2026-03-10T07:51:38.272 INFO:tasks.workunit.client.1.vm08.stdout:0/584: fsync dd/d10/d14/d15/d20/f7e 0
2026-03-10T07:51:38.277 INFO:tasks.workunit.client.1.vm08.stdout:8/693: creat d0/df/d15/d23/d39/d5b/dbc/fe1 x:0 0 0
2026-03-10T07:51:38.277 INFO:tasks.workunit.client.1.vm08.stdout:8/694: chown d0/c2d 2 1
2026-03-10T07:51:38.278 INFO:tasks.workunit.client.1.vm08.stdout:8/695: read - d0/df/fdd zero size
2026-03-10T07:51:38.279 INFO:tasks.workunit.client.1.vm08.stdout:1/559: write d2/d6/de/d1f/f3d [764232,25002] 0
2026-03-10T07:51:38.281 INFO:tasks.workunit.client.1.vm08.stdout:3/593: mknod d0/d3c/cb9 0
2026-03-10T07:51:38.289 INFO:tasks.workunit.client.1.vm08.stdout:9/594: write d2/d58/dbf/f75 [684350,18509] 0
2026-03-10T07:51:38.294 INFO:tasks.workunit.client.1.vm08.stdout:8/696: creat d0/d37/d86/fe2 x:0 0 0
2026-03-10T07:51:38.294 INFO:tasks.workunit.client.1.vm08.stdout:8/697: write d0/fa3 [642007,76347] 0
2026-03-10T07:51:38.301 INFO:tasks.workunit.client.1.vm08.stdout:0/585: dread dd/d10/d2f/d37/d64/d95/f2a [0,4194304] 0
2026-03-10T07:51:38.303 INFO:tasks.workunit.client.1.vm08.stdout:1/560: creat d2/d6/d3a/d61/fc5 x:0 0 0
2026-03-10T07:51:38.303 INFO:tasks.workunit.client.1.vm08.stdout:0/586: write dd/d10/d2f/d37/d64/d95/d58/f86 [275178,80222] 0
2026-03-10T07:51:38.309 INFO:tasks.workunit.client.1.vm08.stdout:1/561: sync
2026-03-10T07:51:38.313 INFO:tasks.workunit.client.1.vm08.stdout:5/648: write d0/f3b [1959795,2409] 0
2026-03-10T07:51:38.315 INFO:tasks.workunit.client.1.vm08.stdout:5/649: readlink d0/d4/d19/d81/da4/lc3 0
2026-03-10T07:51:38.315 INFO:tasks.workunit.client.1.vm08.stdout:6/629: symlink d1/d3/df/d38/lca 0
2026-03-10T07:51:38.316 INFO:tasks.workunit.client.1.vm08.stdout:5/650: write d0/d4/df/f2a [463914,61120] 0
2026-03-10T07:51:38.319 INFO:tasks.workunit.client.1.vm08.stdout:7/562: truncate d3/da/d25/d9/d2f/d39/d43/d4f/d5b/db7/f65 1177810 0
2026-03-10T07:51:38.321 INFO:tasks.workunit.client.1.vm08.stdout:9/595: read d2/d3/fc [954342,97961] 0
2026-03-10T07:51:38.324 INFO:tasks.workunit.client.1.vm08.stdout:4/504: write d5/d8/f68 [338512,3145] 0
2026-03-10T07:51:38.326 INFO:tasks.workunit.client.1.vm08.stdout:9/596: dwrite d2/d58/dbf/d30/d35/d97/f9c [0,4194304] 0
2026-03-10T07:51:38.341 INFO:tasks.workunit.client.1.vm08.stdout:6/630: mkdir d1/d17/d2b/d5e/dcb 0
2026-03-10T07:51:38.344 INFO:tasks.workunit.client.1.vm08.stdout:5/651: symlink d0/d8/ld4 0
2026-03-10T07:51:38.350 INFO:tasks.workunit.client.1.vm08.stdout:5/652: dwrite d0/d4/df/d12/f97 [0,4194304] 0
2026-03-10T07:51:38.366 INFO:tasks.workunit.client.1.vm08.stdout:2/604: truncate d0/d1/d3/d39/d7d/d86/d55/d7a/f94 617968 0
2026-03-10T07:51:38.377 INFO:tasks.workunit.client.1.vm08.stdout:3/594: write d0/d3c/d1f/d44/d51/d2d/d85/fa0 [901997,95221] 0
2026-03-10T07:51:38.406 INFO:tasks.workunit.client.1.vm08.stdout:9/597: readlink d2/l8f 0
2026-03-10T07:51:38.407 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:51:38 vm05.local ceph-mon[50387]: pgmap v38: 65 pgs: 65 active+clean; 2.9 GiB data, 9.7 GiB used, 110 GiB / 120 GiB avail; 36 MiB/s rd, 105 MiB/s wr, 227 op/s
2026-03-10T07:51:38.409 INFO:tasks.workunit.client.1.vm08.stdout:9/598: dwrite d2/de/d28/f96 [0,4194304] 0
2026-03-10T07:51:38.411 INFO:tasks.workunit.client.1.vm08.stdout:6/631: mknod d1/d3/ccc 0
2026-03-10T07:51:38.421 INFO:tasks.workunit.client.1.vm08.stdout:5/653: creat d0/d4/df/d12/d22/fd5 x:0 0 0
2026-03-10T07:51:38.422 INFO:tasks.workunit.client.1.vm08.stdout:2/605: mknod d0/d1/d3/d39/cc7 0
2026-03-10T07:51:38.422 INFO:tasks.workunit.client.1.vm08.stdout:3/595: creat d0/d3c/d1f/d89/fba x:0 0 0
2026-03-10T07:51:38.423 INFO:tasks.workunit.client.1.vm08.stdout:2/606: chown d0/d1/d3/d10/f58 55244 1
2026-03-10T07:51:38.423 INFO:tasks.workunit.client.1.vm08.stdout:1/562: dwrite d2/d10/f3f [0,4194304] 0
2026-03-10T07:51:38.424 INFO:tasks.workunit.client.1.vm08.stdout:1/563: readlink d2/d6/d50/l55 0
2026-03-10T07:51:38.434 INFO:tasks.workunit.client.1.vm08.stdout:8/698: rename d0/df/d15/d23/l92 to d0/df/le3 0
2026-03-10T07:51:38.435 INFO:tasks.workunit.client.1.vm08.stdout:0/587: creat dd/d10/fb8 x:0 0 0
2026-03-10T07:51:38.435 INFO:tasks.workunit.client.1.vm08.stdout:4/505: mkdir d5/d1f/dad 0
2026-03-10T07:51:38.437 INFO:tasks.workunit.client.1.vm08.stdout:9/599: creat d2/d58/dbf/d30/d35/d97/d9d/fcc x:0 0 0
2026-03-10T07:51:38.438 INFO:tasks.workunit.client.1.vm08.stdout:9/600: readlink d2/de/d28/l65 0
2026-03-10T07:51:38.438 INFO:tasks.workunit.client.1.vm08.stdout:5/654: creat d0/d8/dce/fd6 x:0 0 0
2026-03-10T07:51:38.439 INFO:tasks.workunit.client.1.vm08.stdout:5/655: write d0/d4/d19/d60/d6d/d70/fb4 [1513863,36315] 0
2026-03-10T07:51:38.440 INFO:tasks.workunit.client.1.vm08.stdout:7/563: rmdir d3/da/d25/db8 0
2026-03-10T07:51:38.448 INFO:tasks.workunit.client.1.vm08.stdout:7/564: dread d3/da/d25/d9/d2f/d39/f56 [4194304,4194304] 0
2026-03-10T07:51:38.454 INFO:tasks.workunit.client.1.vm08.stdout:3/596: dread d0/d3c/d1f/d44/d51/d34/f4e [0,4194304] 0
2026-03-10T07:51:38.455 INFO:tasks.workunit.client.1.vm08.stdout:2/607: unlink d0/d1/d3/d10/d38/f52 0
2026-03-10T07:51:38.455 INFO:tasks.workunit.client.1.vm08.stdout:2/608: chown d0/d1/d3/d39/d7d/d86/d55/l16 4 1
2026-03-10T07:51:38.458 INFO:tasks.workunit.client.1.vm08.stdout:1/564: mkdir d2/d10/dc6 0
2026-03-10T07:51:38.459 INFO:tasks.workunit.client.1.vm08.stdout:8/699: creat d0/df/d17/d72/fe4 x:0 0 0
2026-03-10T07:51:38.466 INFO:tasks.workunit.client.1.vm08.stdout:4/506: creat d5/da0/d32/fae x:0 0 0
2026-03-10T07:51:38.467 INFO:tasks.workunit.client.1.vm08.stdout:4/507: write d5/d8/f30 [5258,80537] 0
2026-03-10T07:51:38.469 INFO:tasks.workunit.client.1.vm08.stdout:9/601: write d2/f51 [735251,88286] 0
2026-03-10T07:51:38.474 INFO:tasks.workunit.client.1.vm08.stdout:5/656: rmdir d0/d4/df/d12 39
2026-03-10T07:51:38.478 INFO:tasks.workunit.client.1.vm08.stdout:2/609: symlink d0/d1/d3/d39/lc8 0
2026-03-10T07:51:38.479 INFO:tasks.workunit.client.1.vm08.stdout:2/610: write d0/d1/d3/d10/d65/f7c [3987587,93572] 0
2026-03-10T07:51:38.480 INFO:tasks.workunit.client.1.vm08.stdout:6/632: dread - d1/d3/d3e/db2/fc7 zero size
2026-03-10T07:51:38.485 INFO:tasks.workunit.client.1.vm08.stdout:0/588: creat dd/d10/d14/d15/d20/d5f/fb9 x:0 0 0
2026-03-10T07:51:38.489 INFO:tasks.workunit.client.1.vm08.stdout:9/602: creat d2/d58/dbf/d30/d35/d97/fcd x:0 0 0
2026-03-10T07:51:38.490 INFO:tasks.workunit.client.1.vm08.stdout:9/603: write d2/d58/dbf/d30/d35/d97/f9c [1733579,61846] 0
2026-03-10T07:51:38.492 INFO:tasks.workunit.client.1.vm08.stdout:5/657: chown d0/d4/df/d12 32682 1
2026-03-10T07:51:38.494 INFO:tasks.workunit.client.1.vm08.stdout:5/658: chown d0/d4/df/d12/f97 953 1
2026-03-10T07:51:38.496 INFO:tasks.workunit.client.1.vm08.stdout:9/604: dwrite d2/d58/fc6 [0,4194304] 0
2026-03-10T07:51:38.497 INFO:tasks.workunit.client.1.vm08.stdout:7/565: mkdir d3/da/dbc 0
2026-03-10T07:51:38.498 INFO:tasks.workunit.client.1.vm08.stdout:8/700: dwrite d0/df/f19 [4194304,4194304] 0
2026-03-10T07:51:38.499 INFO:tasks.workunit.client.1.vm08.stdout:3/597: mknod d0/cbb 0
2026-03-10T07:51:38.512 INFO:tasks.workunit.client.1.vm08.stdout:8/701: dread d0/df/d15/d23/da8/f6a [0,4194304] 0
2026-03-10T07:51:38.512 INFO:tasks.workunit.client.1.vm08.stdout:2/611: mkdir d0/d1/d3/d39/d7d/d86/d55/dc9 0
2026-03-10T07:51:38.513 INFO:tasks.workunit.client.1.vm08.stdout:8/702: dread - d0/df/d17/d72/fe4 zero size
2026-03-10T07:51:38.514 INFO:tasks.workunit.client.1.vm08.stdout:2/612: chown d0/d1/d3/d10/d65/c6d 6190 1
2026-03-10T07:51:38.516 INFO:tasks.workunit.client.1.vm08.stdout:8/703: dread - d0/d37/d86/fb7 zero size
2026-03-10T07:51:38.517 INFO:tasks.workunit.client.1.vm08.stdout:5/659: dwrite d0/d4/fab [0,4194304] 0
2026-03-10T07:51:38.520 INFO:tasks.workunit.client.1.vm08.stdout:6/633: dread d1/d3/d3e/f5b [0,4194304] 0
2026-03-10T07:51:38.524 INFO:tasks.workunit.client.1.vm08.stdout:8/704: dwrite d0/f2a [0,4194304] 0
2026-03-10T07:51:38.526 INFO:tasks.workunit.client.1.vm08.stdout:8/705: stat d0/df/d15/d23/d39/d5b/dbc/fe1 0
2026-03-10T07:51:38.540 INFO:tasks.workunit.client.1.vm08.stdout:7/566: symlink d3/da/d25/d9/d2f/d4d/lbd 0
2026-03-10T07:51:38.544 INFO:tasks.workunit.client.1.vm08.stdout:9/605: mknod d2/de/d28/d98/dbb/cce 0
2026-03-10T07:51:38.548 INFO:tasks.workunit.client.1.vm08.stdout:9/606: fdatasync d2/d58/dbf/d30/d35/d97/fcd 0
2026-03-10T07:51:38.550 INFO:tasks.workunit.client.1.vm08.stdout:0/589: symlink dd/d10/d14/d15/d20/d5f/d9f/lba 0
2026-03-10T07:51:38.552 INFO:tasks.workunit.client.1.vm08.stdout:6/634: creat d1/d3/d3e/fcd x:0 0 0
2026-03-10T07:51:38.552 INFO:tasks.workunit.client.1.vm08.stdout:6/635: chown d1/d17/d2b/d58/d77/daf/db1 22179976 1
2026-03-10T07:51:38.554 INFO:tasks.workunit.client.1.vm08.stdout:5/660: unlink d0/d8/d5e/c9c 0
2026-03-10T07:51:38.556 INFO:tasks.workunit.client.1.vm08.stdout:5/661: dread d0/d4/df/d12/f97 [0,4194304] 0
2026-03-10T07:51:38.557 INFO:tasks.workunit.client.1.vm08.stdout:8/706: chown d0/df/d15/c6e 1487 1
2026-03-10T07:51:38.558 INFO:tasks.workunit.client.1.vm08.stdout:8/707: stat d0/df/d15/d23/d39/d5b/d4a/c4f 0
2026-03-10T07:51:38.572 INFO:tasks.workunit.client.1.vm08.stdout:6/636: mknod d1/d7d/cce 0
2026-03-10T07:51:38.576 INFO:tasks.workunit.client.1.vm08.stdout:8/708: write d0/df/f60 [3513252,87608] 0
2026-03-10T07:51:38.586 INFO:tasks.workunit.client.1.vm08.stdout:8/709: stat d0/df/f13 0
2026-03-10T07:51:38.586 INFO:tasks.workunit.client.1.vm08.stdout:9/607: link d2/d26/f4b d2/d26/da4/fcf 0
2026-03-10T07:51:38.587 INFO:tasks.workunit.client.1.vm08.stdout:7/567: sync
2026-03-10T07:51:38.588 INFO:tasks.workunit.client.1.vm08.stdout:7/568: read - d3/da/d25/d9/d2f/d4d/fb9 zero size
2026-03-10T07:51:38.589 INFO:tasks.workunit.client.1.vm08.stdout:7/569: readlink d3/l22 0
2026-03-10T07:51:38.592 INFO:tasks.workunit.client.1.vm08.stdout:2/613: rename d0/fbb to d0/fca 0
2026-03-10T07:51:38.593 INFO:tasks.workunit.client.1.vm08.stdout:2/614: write d0/d1/d3/d56/d57/f5b [3441230,113658] 0
2026-03-10T07:51:38.597 INFO:tasks.workunit.client.1.vm08.stdout:1/565: truncate d2/d6/de/d1f/d26/f6e 44299 0
2026-03-10T07:51:38.600 INFO:tasks.workunit.client.1.vm08.stdout:3/598: write d0/d3c/d1f/d44/d51/d34/fb5 [996330,24258] 0
2026-03-10T07:51:38.602 INFO:tasks.workunit.client.1.vm08.stdout:3/599: chown d0/d3c/d18/d32/d61/d52/f70 0 1
2026-03-10T07:51:38.606 INFO:tasks.workunit.client.1.vm08.stdout:4/508: dwrite d5/f21 [0,4194304] 0
2026-03-10T07:51:38.612 INFO:tasks.workunit.client.1.vm08.stdout:8/710: creat d0/df/d15/d23/d54/dba/d89/dbf/fe5 x:0 0 0
2026-03-10T07:51:38.612 INFO:tasks.workunit.client.1.vm08.stdout:8/711: fsync d0/df/d15/d23/d54/dba/d89/fac 0
2026-03-10T07:51:38.613 INFO:tasks.workunit.client.1.vm08.stdout:9/608: rmdir d2/d26/da4 39
2026-03-10T07:51:38.613 INFO:tasks.workunit.client.1.vm08.stdout:2/615: dread d0/d1/fa2 [0,4194304] 0
2026-03-10T07:51:38.614 INFO:tasks.workunit.client.1.vm08.stdout:2/616: chown d0/d1/d3/d39/d7d/d86/d55/d1b/fb6 18 1
2026-03-10T07:51:38.631 INFO:tasks.workunit.client.1.vm08.stdout:3/600: creat d0/d3c/d18/d48/d55/d56/fbc x:0 0 0
2026-03-10T07:51:38.632 INFO:tasks.workunit.client.1.vm08.stdout:6/637: symlink d1/d3/df/d1d/d6f/lcf 0
2026-03-10T07:51:38.640 INFO:tasks.workunit.client.1.vm08.stdout:8/712: rmdir d0/df/d15/d23/d54/dba/d89 39
2026-03-10T07:51:38.641 INFO:tasks.workunit.client.1.vm08.stdout:8/713: readlink d0/df/d17/lb2 0
2026-03-10T07:51:38.642 INFO:tasks.workunit.client.1.vm08.stdout:8/714: write d0/df/d17/d72/fe4 [92245,57633] 0
2026-03-10T07:51:38.645 INFO:tasks.workunit.client.1.vm08.stdout:2/617: creat d0/d1/d17/db2/fcb x:0 0 0
2026-03-10T07:51:38.649 INFO:tasks.workunit.client.1.vm08.stdout:9/609: dwrite d2/de/d28/f8d [0,4194304] 0
2026-03-10T07:51:38.649 INFO:tasks.workunit.client.1.vm08.stdout:7/570: unlink d3/da/d25/d9/d2f/d3a/l50 0
2026-03-10T07:51:38.650 INFO:tasks.workunit.client.1.vm08.stdout:2/618: dread d0/f68 [0,4194304] 0
2026-03-10T07:51:38.651 INFO:tasks.workunit.client.1.vm08.stdout:2/619: chown d0/d1/d3/d39/d7d/d7e 349434143 1
2026-03-10T07:51:38.668 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:51:38 vm08.local ceph-mon[59917]: pgmap v38: 65 pgs: 65 active+clean; 2.9 GiB data, 9.7 GiB used, 110 GiB / 120 GiB avail; 36 MiB/s rd, 105 MiB/s wr, 227 op/s
2026-03-10T07:51:38.673 INFO:tasks.workunit.client.1.vm08.stdout:3/601: fdatasync d0/d3c/d18/d32/f62 0
2026-03-10T07:51:38.675 INFO:tasks.workunit.client.1.vm08.stdout:6/638: creat d1/d3/d3e/db2/fd0 x:0 0 0
2026-03-10T07:51:38.675 INFO:tasks.workunit.client.1.vm08.stdout:5/662: link d0/d4/df/c16 d0/d4/d19/d81/d92/cd7 0
2026-03-10T07:51:38.681 INFO:tasks.workunit.client.1.vm08.stdout:4/509: mkdir d5/d1f/daf 0
2026-03-10T07:51:38.681 INFO:tasks.workunit.client.1.vm08.stdout:4/510: chown d5/da0/cb 1094329 1
2026-03-10T07:51:38.682 INFO:tasks.workunit.client.1.vm08.stdout:8/715: creat d0/d69/fe6 x:0 0 0
2026-03-10T07:51:38.685 INFO:tasks.workunit.client.1.vm08.stdout:5/663: dread d0/d4/df/f2a [0,4194304] 0
2026-03-10T07:51:38.687 INFO:tasks.workunit.client.1.vm08.stdout:8/716: dwrite d0/df/d2e/fb9 [0,4194304] 0
2026-03-10T07:51:38.697 INFO:tasks.workunit.client.1.vm08.stdout:6/639: fsync d1/d3/d3e/db2/fd0 0
2026-03-10T07:51:38.698 INFO:tasks.workunit.client.1.vm08.stdout:7/571: mknod d3/da/d25/d9/d2f/d6c/cbe 0
2026-03-10T07:51:38.703 INFO:tasks.workunit.client.1.vm08.stdout:1/566: write d2/d6/de/d1f/d26/f6e [578559,101712] 0
2026-03-10T07:51:38.708 INFO:tasks.workunit.client.1.vm08.stdout:0/590: getdents dd/d10/d2f 0
2026-03-10T07:51:38.712 INFO:tasks.workunit.client.1.vm08.stdout:4/511: fdatasync d5/da0/d12/d7b/d48/d4f/f56 0
2026-03-10T07:51:38.720 INFO:tasks.workunit.client.1.vm08.stdout:5/664: creat d0/d4/d19/d60/d6d/d70/d40/dba/fd8 x:0 0 0
2026-03-10T07:51:38.726 INFO:tasks.workunit.client.1.vm08.stdout:8/717: unlink d0/df/d2e/f9e 0
2026-03-10T07:51:38.729 INFO:tasks.workunit.client.1.vm08.stdout:6/640: creat d1/db/d24/fd1 x:0 0 0
2026-03-10T07:51:38.730 INFO:tasks.workunit.client.1.vm08.stdout:6/641: chown d1/d3/df/d1d/d6f 3439735 1
2026-03-10T07:51:38.733 INFO:tasks.workunit.client.1.vm08.stdout:2/620: fsync d0/d1/d3/d39/d7d/d86/d55/d7a/f94 0
2026-03-10T07:51:38.733 INFO:tasks.workunit.client.1.vm08.stdout:1/567: write d2/d6/de/d1f/d26/d58/fb9 [940345,105415] 0
2026-03-10T07:51:38.739 INFO:tasks.workunit.client.1.vm08.stdout:3/602: creat d0/d3c/d1f/d95/fbd x:0 0 0
2026-03-10T07:51:38.747 INFO:tasks.workunit.client.1.vm08.stdout:8/718: fdatasync d0/df/d2e/d30/f33 0
2026-03-10T07:51:38.752 INFO:tasks.workunit.client.1.vm08.stdout:4/512: dwrite d5/da0/f18 [4194304,4194304] 0
2026-03-10T07:51:38.754 INFO:tasks.workunit.client.1.vm08.stdout:9/610: rename d2/d58/dbf/d30 to d2/d58/dbf/dd0 0
2026-03-10T07:51:38.754 INFO:tasks.workunit.client.1.vm08.stdout:4/513: chown d5/da0/d12/d7b/f43 1802 1
2026-03-10T07:51:38.763 INFO:tasks.workunit.client.1.vm08.stdout:5/665: dwrite d0/d8/d5e/d8e/f96 [0,4194304] 0
2026-03-10T07:51:38.764 INFO:tasks.workunit.client.1.vm08.stdout:9/611: dwrite d2/d26/fc1 [0,4194304] 0
2026-03-10T07:51:38.765 INFO:tasks.workunit.client.1.vm08.stdout:9/612: truncate d2/d58/dbf/dd0/d35/d97/f9c 4866072 0
2026-03-10T07:51:38.765 INFO:tasks.workunit.client.1.vm08.stdout:9/613: chown d2/d58/dbf/l38 12373 1
2026-03-10T07:51:38.792 INFO:tasks.workunit.client.1.vm08.stdout:1/568: dread d2/d6/de/d70/d80/fbb [0,4194304] 0
2026-03-10T07:51:38.792 INFO:tasks.workunit.client.1.vm08.stdout:1/569: fsync d2/d6/d3a/d61/fc5 0
2026-03-10T07:51:38.797 INFO:tasks.workunit.client.1.vm08.stdout:8/719: dread d0/df/d5d/f81 [0,4194304] 0
2026-03-10T07:51:38.799 INFO:tasks.workunit.client.1.vm08.stdout:4/514: write d5/da0/d12/d7b/d48/f5d [2754125,17163] 0
2026-03-10T07:51:38.800 INFO:tasks.workunit.client.1.vm08.stdout:7/572: link d3/da/d25/d9/f47 d3/da/d25/d9/fbf 0
2026-03-10T07:51:38.806 INFO:tasks.workunit.client.1.vm08.stdout:9/614: chown d2/d3/fc0 0 1
2026-03-10T07:51:38.815 INFO:tasks.workunit.client.1.vm08.stdout:0/591: write dd/d10/d2f/d37/d64/d95/d5c/f8f [548952,35017] 0
2026-03-10T07:51:38.816 INFO:tasks.workunit.client.1.vm08.stdout:2/621: write d0/d1/fa2 [3950674,37763] 0
2026-03-10T07:51:38.817 INFO:tasks.workunit.client.1.vm08.stdout:1/570: mknod d2/d6/de/d1f/da9/cc7 0
2026-03-10T07:51:38.817 INFO:tasks.workunit.client.1.vm08.stdout:8/720: chown d0/df/d15/d23/d54/dba/d89 6 1
2026-03-10T07:51:38.820 INFO:tasks.workunit.client.1.vm08.stdout:0/592: write dd/d10/d2f/d37/d64/d95/d58/d3d/d9b/fb4 [395543,15966] 0
2026-03-10T07:51:38.821 INFO:tasks.workunit.client.1.vm08.stdout:2/622: write d0/d1/d3/d39/d7d/d86/fa9 [363759,53384] 0
2026-03-10T07:51:38.823 INFO:tasks.workunit.client.1.vm08.stdout:3/603: dwrite d0/d3c/d18/d48/d55/d56/f81 [0,4194304] 0
2026-03-10T07:51:38.824 INFO:tasks.workunit.client.1.vm08.stdout:0/593: dread - dd/d10/d2f/d37/d64/d95/d58/d3d/d9b/fb1 zero size
2026-03-10T07:51:38.827 INFO:tasks.workunit.client.1.vm08.stdout:7/573: write d3/da/d25/d9/f53 [2020468,5856] 0
2026-03-10T07:51:38.829 INFO:tasks.workunit.client.1.vm08.stdout:7/574: write d3/da/d25/d9/f41 [3277033,111229] 0
2026-03-10T07:51:38.830 INFO:tasks.workunit.client.1.vm08.stdout:8/721: dwrite d0/d69/d3f/fb3 [4194304,4194304] 0
2026-03-10T07:51:38.831 INFO:tasks.workunit.client.1.vm08.stdout:8/722: fdatasync d0/df/d15/d23/f3d 0
2026-03-10T07:51:38.836 INFO:tasks.workunit.client.1.vm08.stdout:6/642: creat d1/db/fd2 x:0 0 0
2026-03-10T07:51:38.844 INFO:tasks.workunit.client.1.vm08.stdout:8/723: dwrite d0/f2a [0,4194304] 0
2026-03-10T07:51:38.869 INFO:tasks.workunit.client.1.vm08.stdout:1/571: creat d2/d6/de/d70/d80/fc8 x:0 0 0
2026-03-10T07:51:38.870 INFO:tasks.workunit.client.1.vm08.stdout:2/623: truncate d0/d1/d17/d6b/f72 2486713 0
2026-03-10T07:51:38.872 INFO:tasks.workunit.client.1.vm08.stdout:3/604: chown d0/d3c/d18/d32/cac 950228 1
2026-03-10T07:51:38.874 INFO:tasks.workunit.client.1.vm08.stdout:0/594: rename f5 to dd/d10/d14/d15/d20/d5f/d9f/fbb 0
2026-03-10T07:51:38.881 INFO:tasks.workunit.client.1.vm08.stdout:9/615: unlink d2/d26/da4/fcf 0
2026-03-10T07:51:38.883 INFO:tasks.workunit.client.1.vm08.stdout:8/724: symlink d0/d37/d86/le7 0
2026-03-10T07:51:38.886 INFO:tasks.workunit.client.1.vm08.stdout:3/605: symlink d0/d3c/d18/d80/lbe 0
2026-03-10T07:51:38.890 INFO:tasks.workunit.client.1.vm08.stdout:7/575: mkdir d3/da/d25/d9/d2f/d3a/dc0 0
2026-03-10T07:51:38.894 INFO:tasks.workunit.client.1.vm08.stdout:9/616: unlink d2/d58/dbf/dd0/fac 0
2026-03-10T07:51:38.896 INFO:tasks.workunit.client.1.vm08.stdout:8/725: mknod d0/df/d15/d23/d39/d5b/d4a/ce8 0
2026-03-10T07:51:38.897 INFO:tasks.workunit.client.1.vm08.stdout:0/595: dread dd/d10/d2f/d37/d64/d95/d58/d3d/d9b/fb4 [0,4194304] 0
2026-03-10T07:51:38.898 INFO:tasks.workunit.client.1.vm08.stdout:4/515: truncate d5/d1f/d41/f83 2678447 0
2026-03-10T07:51:38.900 INFO:tasks.workunit.client.1.vm08.stdout:6/643: write d1/f6 [1374162,78161] 0
2026-03-10T07:51:38.901 INFO:tasks.workunit.client.1.vm08.stdout:5/666: dwrite d0/d8/d24/dd0/f9b [0,4194304] 0
2026-03-10T07:51:38.922 INFO:tasks.workunit.client.1.vm08.stdout:1/572: write d2/d6/de/fc0 [746128,101561] 0
2026-03-10T07:51:38.923 INFO:tasks.workunit.client.1.vm08.stdout:1/573: write d2/d6/de/d1f/d26/f6e [239982,15260] 0
2026-03-10T07:51:38.923 INFO:tasks.workunit.client.1.vm08.stdout:1/574: chown d2/l2c 0 1
2026-03-10T07:51:38.930 INFO:tasks.workunit.client.1.vm08.stdout:8/726: symlink d0/d37/le9 0
2026-03-10T07:51:38.931 INFO:tasks.workunit.client.1.vm08.stdout:4/516: mkdir d5/d8/d50/db0 0
2026-03-10T07:51:38.932 INFO:tasks.workunit.client.1.vm08.stdout:6/644: rename d1/d3/df/d1d/d40/d45/faa to d1/d46/fd3 0
2026-03-10T07:51:38.934 INFO:tasks.workunit.client.1.vm08.stdout:0/596: dwrite dd/d10/d2f/d37/d64/d95/d58/d3d/d9b/fb4 [0,4194304] 0
2026-03-10T07:51:38.938 INFO:tasks.workunit.client.1.vm08.stdout:0/597: dread - dd/d10/d2f/d37/d64/d95/d58/d3d/faa zero size
2026-03-10T07:51:38.941 INFO:tasks.workunit.client.1.vm08.stdout:5/667: rmdir d0/d4/d19/d81/da4 39
2026-03-10T07:51:38.941 INFO:tasks.workunit.client.1.vm08.stdout:1/575: truncate d2/d6/de/f74 2699891 0
2026-03-10T07:51:38.942 INFO:tasks.workunit.client.1.vm08.stdout:7/576: mkdir d3/da/d25/d9/d2f/d4d/db6/dc1 0
2026-03-10T07:51:38.943 INFO:tasks.workunit.client.1.vm08.stdout:7/577: fsync d3/da/d8a/f9e 0
2026-03-10T07:51:38.949
INFO:tasks.workunit.client.1.vm08.stdout:6/645: symlink d1/d17/d2b/d58/d77/ld4 0 2026-03-10T07:51:38.950 INFO:tasks.workunit.client.1.vm08.stdout:2/624: getdents d0/d1/d17 0 2026-03-10T07:51:38.950 INFO:tasks.workunit.client.1.vm08.stdout:1/576: creat d2/d6/de/d71/fc9 x:0 0 0 2026-03-10T07:51:38.972 INFO:tasks.workunit.client.1.vm08.stdout:9/617: symlink d2/d3/d84/d91/ld1 0 2026-03-10T07:51:38.972 INFO:tasks.workunit.client.1.vm08.stdout:7/578: creat d3/da/d25/d9/d2f/d39/d43/fc2 x:0 0 0 2026-03-10T07:51:38.972 INFO:tasks.workunit.client.1.vm08.stdout:6/646: symlink d1/db/d24/dac/ld5 0 2026-03-10T07:51:38.972 INFO:tasks.workunit.client.1.vm08.stdout:5/668: symlink d0/d4/df/dbf/ld9 0 2026-03-10T07:51:38.972 INFO:tasks.workunit.client.1.vm08.stdout:3/606: getdents d0/d3c/d18/d32 0 2026-03-10T07:51:38.972 INFO:tasks.workunit.client.1.vm08.stdout:9/618: dread - d2/d3/d84/f94 zero size 2026-03-10T07:51:38.972 INFO:tasks.workunit.client.1.vm08.stdout:0/598: link dd/d10/d2f/la0 dd/d10/d2f/d37/d64/d52/lbc 0 2026-03-10T07:51:38.972 INFO:tasks.workunit.client.1.vm08.stdout:7/579: mkdir d3/da/d25/d9/d2f/d39/d43/d4f/d5b/d79/dc3 0 2026-03-10T07:51:38.972 INFO:tasks.workunit.client.1.vm08.stdout:5/669: mknod d0/d4/df/d82/cda 0 2026-03-10T07:51:38.972 INFO:tasks.workunit.client.1.vm08.stdout:3/607: write d0/f39 [8373598,81524] 0 2026-03-10T07:51:38.974 INFO:tasks.workunit.client.1.vm08.stdout:5/670: creat d0/d77/d83/fdb x:0 0 0 2026-03-10T07:51:38.980 INFO:tasks.workunit.client.1.vm08.stdout:2/625: dread d0/f50 [0,4194304] 0 2026-03-10T07:51:38.988 INFO:tasks.workunit.client.1.vm08.stdout:0/599: mkdir dd/d10/dbd 0 2026-03-10T07:51:38.990 INFO:tasks.workunit.client.1.vm08.stdout:0/600: write dd/d10/d14/d15/d20/d22/f2e [3218992,100302] 0 2026-03-10T07:51:38.990 INFO:tasks.workunit.client.1.vm08.stdout:3/608: symlink d0/d3c/d18/d48/d55/lbf 0 2026-03-10T07:51:38.991 INFO:tasks.workunit.client.1.vm08.stdout:1/577: sync 2026-03-10T07:51:39.028 
INFO:tasks.workunit.client.1.vm08.stdout:4/517: write d5/da0/d32/f44 [1734346,45733] 0 2026-03-10T07:51:39.039 INFO:tasks.workunit.client.1.vm08.stdout:8/727: write d0/df/d15/d23/d54/dba/d89/fa9 [943066,2298] 0 2026-03-10T07:51:39.040 INFO:tasks.workunit.client.1.vm08.stdout:8/728: stat d0/df/d2e 0 2026-03-10T07:51:39.047 INFO:tasks.workunit.client.1.vm08.stdout:9/619: creat d2/d3/d84/fd2 x:0 0 0 2026-03-10T07:51:39.047 INFO:tasks.workunit.client.1.vm08.stdout:6/647: dwrite d1/f35 [0,4194304] 0 2026-03-10T07:51:39.054 INFO:tasks.workunit.client.1.vm08.stdout:7/580: dwrite d3/da/d25/d9/d2f/d39/d43/d4f/f68 [0,4194304] 0 2026-03-10T07:51:39.057 INFO:tasks.workunit.client.1.vm08.stdout:3/609: mkdir d0/d3c/d18/d32/d61/d52/dc0 0 2026-03-10T07:51:39.059 INFO:tasks.workunit.client.1.vm08.stdout:1/578: rmdir d2/d6/de/d1f/da9 39 2026-03-10T07:51:39.068 INFO:tasks.workunit.client.1.vm08.stdout:2/626: getdents d0/d1/d3/d10/d38/daf 0 2026-03-10T07:51:39.078 INFO:tasks.workunit.client.1.vm08.stdout:6/648: mkdir d1/d17/d2b/d5e/dd6 0 2026-03-10T07:51:39.084 INFO:tasks.workunit.client.1.vm08.stdout:0/601: creat dd/d10/dbd/fbe x:0 0 0 2026-03-10T07:51:39.089 INFO:tasks.workunit.client.1.vm08.stdout:3/610: mkdir d0/d3c/d18/d80/dc1 0 2026-03-10T07:51:39.095 INFO:tasks.workunit.client.1.vm08.stdout:4/518: rename d5/da0/le to d5/da0/d95/lb1 0 2026-03-10T07:51:39.097 INFO:tasks.workunit.client.1.vm08.stdout:2/627: truncate d0/f44 1035996 0 2026-03-10T07:51:39.102 INFO:tasks.workunit.client.1.vm08.stdout:0/602: read dd/f44 [805106,4298] 0 2026-03-10T07:51:39.102 INFO:tasks.workunit.client.1.vm08.stdout:0/603: chown dd/d10/d14/d1b/d30/l8c 15 1 2026-03-10T07:51:39.107 INFO:tasks.workunit.client.1.vm08.stdout:4/519: sync 2026-03-10T07:51:39.108 INFO:tasks.workunit.client.1.vm08.stdout:2/628: truncate d0/d1/d3/d10/f4f 630961 0 2026-03-10T07:51:39.109 INFO:tasks.workunit.client.1.vm08.stdout:2/629: dread - d0/d1/d17/db2/d9c/fbc zero size 2026-03-10T07:51:39.110 
INFO:tasks.workunit.client.1.vm08.stdout:2/630: write d0/f81 [934717,21690] 0 2026-03-10T07:51:39.117 INFO:tasks.workunit.client.1.vm08.stdout:5/671: write d0/fb5 [190716,114258] 0 2026-03-10T07:51:39.119 INFO:tasks.workunit.client.1.vm08.stdout:7/581: dwrite d3/f4 [8388608,4194304] 0 2026-03-10T07:51:39.135 INFO:tasks.workunit.client.1.vm08.stdout:9/620: getdents d2 0 2026-03-10T07:51:39.138 INFO:tasks.workunit.client.1.vm08.stdout:4/520: dread d5/d1f/f37 [0,4194304] 0 2026-03-10T07:51:39.138 INFO:tasks.workunit.client.1.vm08.stdout:6/649: creat d1/d3/df/fd7 x:0 0 0 2026-03-10T07:51:39.143 INFO:tasks.workunit.client.1.vm08.stdout:3/611: dwrite d0/d3c/d18/d48/d55/d56/f9c [0,4194304] 0 2026-03-10T07:51:39.147 INFO:tasks.workunit.client.1.vm08.stdout:0/604: dread dd/d10/d14/d1b/f76 [0,4194304] 0 2026-03-10T07:51:39.148 INFO:tasks.workunit.client.1.vm08.stdout:0/605: fsync dd/d18/f25 0 2026-03-10T07:51:39.151 INFO:tasks.workunit.client.1.vm08.stdout:3/612: dwrite d0/d3c/d18/d4a/f8a [0,4194304] 0 2026-03-10T07:51:39.162 INFO:tasks.workunit.client.1.vm08.stdout:7/582: symlink d3/da/d25/d9/lc4 0 2026-03-10T07:51:39.163 INFO:tasks.workunit.client.1.vm08.stdout:9/621: creat d2/d58/d73/fd3 x:0 0 0 2026-03-10T07:51:39.166 INFO:tasks.workunit.client.1.vm08.stdout:6/650: creat d1/d3/df/d38/fd8 x:0 0 0 2026-03-10T07:51:39.168 INFO:tasks.workunit.client.1.vm08.stdout:2/631: mknod d0/d1/d3/d39/d7d/d86/ccc 0 2026-03-10T07:51:39.171 INFO:tasks.workunit.client.1.vm08.stdout:0/606: mknod dd/d10/d14/d15/d20/d5f/cbf 0 2026-03-10T07:51:39.173 INFO:tasks.workunit.client.1.vm08.stdout:3/613: read d0/f79 [2691036,34467] 0 2026-03-10T07:51:39.173 INFO:tasks.workunit.client.1.vm08.stdout:7/583: creat d3/da/d25/d9/fc5 x:0 0 0 2026-03-10T07:51:39.175 INFO:tasks.workunit.client.1.vm08.stdout:8/729: rename d0/df/d2e/d30/dc0 to d0/df/d15/d23/d39/d5b/dea 0 2026-03-10T07:51:39.177 INFO:tasks.workunit.client.1.vm08.stdout:9/622: truncate d2/f4e 2586458 0 2026-03-10T07:51:39.179 
INFO:tasks.workunit.client.1.vm08.stdout:6/651: mknod d1/d3/df/d52/cd9 0 2026-03-10T07:51:39.179 INFO:tasks.workunit.client.1.vm08.stdout:6/652: chown d1/d17/f63 30684202 1 2026-03-10T07:51:39.181 INFO:tasks.workunit.client.1.vm08.stdout:0/607: creat dd/d10/d2f/d37/d64/d52/fc0 x:0 0 0 2026-03-10T07:51:39.184 INFO:tasks.workunit.client.1.vm08.stdout:1/579: rename d2/d6/d3a/d61/fc5 to d2/d6/de/d70/fca 0 2026-03-10T07:51:39.186 INFO:tasks.workunit.client.1.vm08.stdout:8/730: rmdir d0/df/d5d 39 2026-03-10T07:51:39.188 INFO:tasks.workunit.client.1.vm08.stdout:9/623: creat d2/de/d28/d98/fd4 x:0 0 0 2026-03-10T07:51:39.191 INFO:tasks.workunit.client.1.vm08.stdout:9/624: dread d2/d58/dbf/dd0/d35/f79 [0,4194304] 0 2026-03-10T07:51:39.191 INFO:tasks.workunit.client.1.vm08.stdout:3/614: mknod d0/cc2 0 2026-03-10T07:51:39.193 INFO:tasks.workunit.client.1.vm08.stdout:0/608: unlink dd/d10/d14/d15/d20/d22/fb5 0 2026-03-10T07:51:39.196 INFO:tasks.workunit.client.1.vm08.stdout:8/731: fsync d0/df/d15/d23/d54/f8f 0 2026-03-10T07:51:39.198 INFO:tasks.workunit.client.1.vm08.stdout:6/653: getdents d1/d17/d2b/d58/d77/daf/db1 0 2026-03-10T07:51:39.200 INFO:tasks.workunit.client.1.vm08.stdout:9/625: mkdir d2/d58/dbf/dd0/d35/d97/dd5 0 2026-03-10T07:51:39.205 INFO:tasks.workunit.client.1.vm08.stdout:8/732: stat d0/df/d5d/f9d 0 2026-03-10T07:51:39.207 INFO:tasks.workunit.client.1.vm08.stdout:6/654: fsync d1/db/fc1 0 2026-03-10T07:51:39.211 INFO:tasks.workunit.client.1.vm08.stdout:3/615: mkdir d0/d3c/d18/dc3 0 2026-03-10T07:51:39.235 INFO:tasks.workunit.client.1.vm08.stdout:0/609: mkdir dd/d10/d14/d15/d20/d92/dc1 0 2026-03-10T07:51:39.235 INFO:tasks.workunit.client.1.vm08.stdout:8/733: mknod d0/df/d15/d23/d54/dba/d89/ceb 0 2026-03-10T07:51:39.235 INFO:tasks.workunit.client.1.vm08.stdout:6/655: mknod d1/db/d24/d3d/cda 0 2026-03-10T07:51:39.235 INFO:tasks.workunit.client.1.vm08.stdout:0/610: symlink dd/d10/lc2 0 2026-03-10T07:51:39.235 INFO:tasks.workunit.client.1.vm08.stdout:3/616: rmdir 
d0/d3c/d18/dc3 0 2026-03-10T07:51:39.235 INFO:tasks.workunit.client.1.vm08.stdout:3/617: dread - d0/d3c/d1f/d89/fba zero size 2026-03-10T07:51:39.235 INFO:tasks.workunit.client.1.vm08.stdout:3/618: write d0/f39 [7284010,80297] 0 2026-03-10T07:51:39.235 INFO:tasks.workunit.client.1.vm08.stdout:0/611: chown dd/d10/f77 2 1 2026-03-10T07:51:39.235 INFO:tasks.workunit.client.1.vm08.stdout:3/619: write d0/d3c/d1f/d89/fa4 [1659026,21938] 0 2026-03-10T07:51:39.235 INFO:tasks.workunit.client.1.vm08.stdout:8/734: creat d0/df/d2e/d30/fec x:0 0 0 2026-03-10T07:51:39.235 INFO:tasks.workunit.client.1.vm08.stdout:0/612: creat dd/d18/fc3 x:0 0 0 2026-03-10T07:51:39.235 INFO:tasks.workunit.client.1.vm08.stdout:0/613: chown dd/d10/d2f/d37/d64/d52/c97 13884686 1 2026-03-10T07:51:39.235 INFO:tasks.workunit.client.1.vm08.stdout:0/614: fdatasync dd/d10/f77 0 2026-03-10T07:51:39.235 INFO:tasks.workunit.client.1.vm08.stdout:8/735: dwrite d0/df/d15/d23/d39/d5b/dea/dce/fd2 [0,4194304] 0 2026-03-10T07:51:39.240 INFO:tasks.workunit.client.1.vm08.stdout:6/656: getdents d1/d46 0 2026-03-10T07:51:39.252 INFO:tasks.workunit.client.1.vm08.stdout:6/657: mknod d1/d3/df/d1d/d40/d45/d5c/cdb 0 2026-03-10T07:51:39.252 INFO:tasks.workunit.client.1.vm08.stdout:0/615: mknod dd/d10/d14/d15/cc4 0 2026-03-10T07:51:39.252 INFO:tasks.workunit.client.1.vm08.stdout:0/616: chown dd/d10/d14/d15/cc4 0 1 2026-03-10T07:51:39.252 INFO:tasks.workunit.client.1.vm08.stdout:0/617: fdatasync dd/d10/d14/d15/d20/d5f/fb9 0 2026-03-10T07:51:39.252 INFO:tasks.workunit.client.1.vm08.stdout:6/658: truncate d1/d3/df/d44/f86 469386 0 2026-03-10T07:51:39.252 INFO:tasks.workunit.client.1.vm08.stdout:6/659: chown d1/d3/d3e 1997142 1 2026-03-10T07:51:39.253 INFO:tasks.workunit.client.1.vm08.stdout:0/618: rmdir dd/d10/d14/d1b/d30 39 2026-03-10T07:51:39.255 INFO:tasks.workunit.client.1.vm08.stdout:6/660: dread d1/db/d24/f75 [0,4194304] 0 2026-03-10T07:51:39.259 INFO:tasks.workunit.client.1.vm08.stdout:6/661: symlink d1/d3/df/d1d/d6f/ldc 0 
2026-03-10T07:51:39.263 INFO:tasks.workunit.client.1.vm08.stdout:0/619: link dd/d10/d14/d15/d20/d5f/fa2 dd/d10/d14/d15/d20/fc5 0 2026-03-10T07:51:39.266 INFO:tasks.workunit.client.1.vm08.stdout:0/620: mkdir dd/d10/d14/d15/d20/d22/dc6 0 2026-03-10T07:51:39.267 INFO:tasks.workunit.client.1.vm08.stdout:6/662: link d1/d3/df/d1d/c69 d1/d17/d2b/d5e/cdd 0 2026-03-10T07:51:39.269 INFO:tasks.workunit.client.1.vm08.stdout:0/621: creat dd/d10/d14/d15/d20/d92/fc7 x:0 0 0 2026-03-10T07:51:39.274 INFO:tasks.workunit.client.1.vm08.stdout:0/622: dread dd/d10/d2f/d37/d64/d95/d58/d3d/fa1 [0,4194304] 0 2026-03-10T07:51:39.279 INFO:tasks.workunit.client.1.vm08.stdout:6/663: dwrite d1/d3/df/d38/fd8 [0,4194304] 0 2026-03-10T07:51:39.281 INFO:tasks.workunit.client.1.vm08.stdout:0/623: creat dd/d10/d14/d15/d20/d5f/fc8 x:0 0 0 2026-03-10T07:51:39.284 INFO:tasks.workunit.client.1.vm08.stdout:0/624: fsync dd/d10/d2f/f8e 0 2026-03-10T07:51:39.286 INFO:tasks.workunit.client.1.vm08.stdout:0/625: truncate dd/d10/d2f/d37/d64/d95/d58/d3d/fa1 893839 0 2026-03-10T07:51:39.286 INFO:tasks.workunit.client.1.vm08.stdout:6/664: dwrite d1/d3/d3e/f81 [0,4194304] 0 2026-03-10T07:51:39.291 INFO:tasks.workunit.client.1.vm08.stdout:0/626: symlink dd/d10/d2f/d37/d64/d95/d58/d3d/lc9 0 2026-03-10T07:51:39.299 INFO:tasks.workunit.client.1.vm08.stdout:9/626: sync 2026-03-10T07:51:39.301 INFO:tasks.workunit.client.1.vm08.stdout:0/627: mkdir dd/d10/d2f/d37/d64/d95/d5c/dca 0 2026-03-10T07:51:39.304 INFO:tasks.workunit.client.1.vm08.stdout:9/627: mkdir d2/de/dd6 0 2026-03-10T07:51:39.304 INFO:tasks.workunit.client.1.vm08.stdout:0/628: mknod dd/d10/d14/d15/d20/d92/dc1/ccb 0 2026-03-10T07:51:39.305 INFO:tasks.workunit.client.1.vm08.stdout:9/628: fsync d2/d58/dbf/dd0/d35/f6c 0 2026-03-10T07:51:39.308 INFO:tasks.workunit.client.1.vm08.stdout:9/629: mkdir d2/d3/d84/dca/dd7 0 2026-03-10T07:51:39.346 INFO:tasks.workunit.client.1.vm08.stdout:9/630: write d2/d3/d84/fd2 [200761,66748] 0 2026-03-10T07:51:39.381 
INFO:tasks.workunit.client.1.vm08.stdout:5/672: dwrite d0/d4/df/dbf/d41/f5c [0,4194304] 0 2026-03-10T07:51:39.389 INFO:tasks.workunit.client.1.vm08.stdout:4/521: truncate d5/d8/f39 2163518 0 2026-03-10T07:51:39.393 INFO:tasks.workunit.client.1.vm08.stdout:7/584: write d3/da/d25/d9/d2f/d3a/d71/f8b [516593,2159] 0 2026-03-10T07:51:39.395 INFO:tasks.workunit.client.1.vm08.stdout:1/580: write d2/d6/d3a/f7d [1093017,115065] 0 2026-03-10T07:51:39.396 INFO:tasks.workunit.client.1.vm08.stdout:4/522: creat d5/da0/d12/d7b/d48/d4f/d7c/fb2 x:0 0 0 2026-03-10T07:51:39.396 INFO:tasks.workunit.client.1.vm08.stdout:4/523: chown f0 8797 1 2026-03-10T07:51:39.397 INFO:tasks.workunit.client.1.vm08.stdout:2/632: dwrite d0/d1/d17/d6b/f72 [0,4194304] 0 2026-03-10T07:51:39.397 INFO:tasks.workunit.client.1.vm08.stdout:7/585: rmdir d3/da/d25/d9/d2f/d39/d43/d4f 39 2026-03-10T07:51:39.399 INFO:tasks.workunit.client.1.vm08.stdout:2/633: dread - d0/d1/d3/d56/d78/dad/fc6 zero size 2026-03-10T07:51:39.407 INFO:tasks.workunit.client.1.vm08.stdout:3/620: write d0/d3c/d18/d32/d61/d52/f70 [5235657,115467] 0 2026-03-10T07:51:39.430 INFO:tasks.workunit.client.1.vm08.stdout:7/586: creat d3/da/d25/d9/d2f/d4d/db6/fc6 x:0 0 0 2026-03-10T07:51:39.437 INFO:tasks.workunit.client.1.vm08.stdout:3/621: readlink d0/l31 0 2026-03-10T07:51:39.437 INFO:tasks.workunit.client.1.vm08.stdout:3/622: chown d0/d3c/d18/d48/d55/lbf 4217780 1 2026-03-10T07:51:39.437 INFO:tasks.workunit.client.1.vm08.stdout:4/524: symlink d5/da0/d12/d7b/d48/da2/lb3 0 2026-03-10T07:51:39.437 INFO:tasks.workunit.client.1.vm08.stdout:7/587: truncate d3/da/d25/d9/d2f/d3a/d4b/fa3 56449 0 2026-03-10T07:51:39.437 INFO:tasks.workunit.client.1.vm08.stdout:5/673: getdents d0/d4/d19/d60 0 2026-03-10T07:51:39.437 INFO:tasks.workunit.client.1.vm08.stdout:1/581: creat d2/d6/de/fcb x:0 0 0 2026-03-10T07:51:39.437 INFO:tasks.workunit.client.1.vm08.stdout:3/623: creat d0/d3c/d18/d32/d61/fc4 x:0 0 0 2026-03-10T07:51:39.437 
INFO:tasks.workunit.client.1.vm08.stdout:1/582: write d2/d6/d9f/fa7 [12209,5423] 0 2026-03-10T07:51:39.438 INFO:tasks.workunit.client.1.vm08.stdout:7/588: dwrite d3/fa4 [4194304,4194304] 0 2026-03-10T07:51:39.438 INFO:tasks.workunit.client.1.vm08.stdout:4/525: symlink d5/da0/d12/d7b/d48/d4f/d8d/d91/lb4 0 2026-03-10T07:51:39.438 INFO:tasks.workunit.client.1.vm08.stdout:7/589: rename d3/da/d25/d9/d2f/d39/d43/f82 to d3/da/d25/d9/d2f/d39/fc7 0 2026-03-10T07:51:39.438 INFO:tasks.workunit.client.1.vm08.stdout:7/590: write d3/da/d25/d9/d2f/d39/f76 [651555,115003] 0 2026-03-10T07:51:39.438 INFO:tasks.workunit.client.1.vm08.stdout:7/591: chown d3/da/d25/d9/d2f/d3a/d4b/c7a 10877 1 2026-03-10T07:51:39.438 INFO:tasks.workunit.client.1.vm08.stdout:4/526: rename d5/da0/d12/d7b/d48/f5d to d5/d8/d89/fb5 0 2026-03-10T07:51:39.438 INFO:tasks.workunit.client.1.vm08.stdout:1/583: fsync d2/d6/de/d1f/d8f/f91 0 2026-03-10T07:51:39.438 INFO:tasks.workunit.client.1.vm08.stdout:1/584: creat d2/d6/de/d47/da0/fcc x:0 0 0 2026-03-10T07:51:39.438 INFO:tasks.workunit.client.1.vm08.stdout:7/592: rename d3/da/d25/d9/d2f/d39/d43/d4f/l5a to d3/da/d25/d9/d2f/d39/db2/lc8 0 2026-03-10T07:51:39.439 INFO:tasks.workunit.client.1.vm08.stdout:4/527: truncate f2 1231291 0 2026-03-10T07:51:39.440 INFO:tasks.workunit.client.1.vm08.stdout:1/585: chown d2/d6/de/d1f/d22/l84 705 1 2026-03-10T07:51:39.441 INFO:tasks.workunit.client.1.vm08.stdout:4/528: truncate d5/da0/d32/f44 2618153 0 2026-03-10T07:51:39.441 INFO:tasks.workunit.client.1.vm08.stdout:7/593: mknod d3/da/d25/d9/d2f/d39/d43/d4f/d5b/db7/dac/cc9 0 2026-03-10T07:51:39.441 INFO:tasks.workunit.client.1.vm08.stdout:7/594: chown d3/da/f17 25 1 2026-03-10T07:51:39.442 INFO:tasks.workunit.client.1.vm08.stdout:4/529: write d5/d1f/d31/f62 [2159450,122741] 0 2026-03-10T07:51:39.442 INFO:tasks.workunit.client.1.vm08.stdout:1/586: creat d2/d6/de/d1f/da9/fcd x:0 0 0 2026-03-10T07:51:39.443 INFO:tasks.workunit.client.1.vm08.stdout:7/595: unlink d3/f2e 0 
2026-03-10T07:51:39.444 INFO:tasks.workunit.client.1.vm08.stdout:4/530: rmdir d5 39 2026-03-10T07:51:39.445 INFO:tasks.workunit.client.1.vm08.stdout:1/587: stat d2/d6/de/d1f/d26/f2f 0 2026-03-10T07:51:39.445 INFO:tasks.workunit.client.1.vm08.stdout:7/596: mkdir d3/da/d25/d9/d2f/d3a/d71/dca 0 2026-03-10T07:51:39.446 INFO:tasks.workunit.client.1.vm08.stdout:4/531: mknod d5/d1f/daf/cb6 0 2026-03-10T07:51:39.447 INFO:tasks.workunit.client.1.vm08.stdout:7/597: unlink d3/da/d25/d9/d2f/d3a/d4b/c7a 0 2026-03-10T07:51:39.447 INFO:tasks.workunit.client.1.vm08.stdout:4/532: mkdir d5/da0/db7 0 2026-03-10T07:51:39.448 INFO:tasks.workunit.client.1.vm08.stdout:4/533: stat d5/d8/d50/c7d 0 2026-03-10T07:51:39.449 INFO:tasks.workunit.client.1.vm08.stdout:4/534: mkdir d5/d1f/dad/db8 0 2026-03-10T07:51:39.449 INFO:tasks.workunit.client.1.vm08.stdout:4/535: chown l3 317876 1 2026-03-10T07:51:39.451 INFO:tasks.workunit.client.1.vm08.stdout:7/598: dwrite d3/da/d25/d9/d2f/f62 [4194304,4194304] 0 2026-03-10T07:51:39.456 INFO:tasks.workunit.client.1.vm08.stdout:4/536: creat d5/da0/d12/d7b/d48/fb9 x:0 0 0 2026-03-10T07:51:39.456 INFO:tasks.workunit.client.1.vm08.stdout:7/599: stat d3/da/d25/d9/d2f/d4d/fb9 0 2026-03-10T07:51:39.463 INFO:tasks.workunit.client.1.vm08.stdout:4/537: creat d5/d1f/d31/fba x:0 0 0 2026-03-10T07:51:39.465 INFO:tasks.workunit.client.1.vm08.stdout:7/600: rename d3/da/d25/d9/d2f/d39/d43/d4f/d5b/db7/l77 to d3/da/d25/d9/d2f/d4d/db6/dc1/lcb 0 2026-03-10T07:51:39.465 INFO:tasks.workunit.client.1.vm08.stdout:4/538: dread d5/da0/f18 [4194304,4194304] 0 2026-03-10T07:51:39.466 INFO:tasks.workunit.client.1.vm08.stdout:7/601: symlink d3/da/d25/d9/lcc 0 2026-03-10T07:51:39.523 INFO:tasks.workunit.client.1.vm08.stdout:5/674: sync 2026-03-10T07:51:39.527 INFO:tasks.workunit.client.1.vm08.stdout:1/588: sync 2026-03-10T07:51:39.528 INFO:tasks.workunit.client.1.vm08.stdout:5/675: mkdir d0/d4/d19/d43/ddc 0 2026-03-10T07:51:39.531 INFO:tasks.workunit.client.1.vm08.stdout:8/736: truncate 
d0/df/d2e/f44 1058972 0 2026-03-10T07:51:39.539 INFO:tasks.workunit.client.1.vm08.stdout:1/589: rmdir d2/d6/de/d1f/d8f/db5 0 2026-03-10T07:51:39.543 INFO:tasks.workunit.client.1.vm08.stdout:6/665: write d1/d46/f74 [398359,50315] 0 2026-03-10T07:51:39.554 INFO:tasks.workunit.client.1.vm08.stdout:9/631: write d2/d58/dbf/dd0/d35/f6c [1157383,98760] 0 2026-03-10T07:51:39.554 INFO:tasks.workunit.client.1.vm08.stdout:8/737: link d0/df/d15/d23/d39/d5b/d4a/ce8 d0/df/d15/d95/ced 0 2026-03-10T07:51:39.554 INFO:tasks.workunit.client.1.vm08.stdout:0/629: write dd/d10/d2f/d37/d64/f68 [410719,105260] 0 2026-03-10T07:51:39.554 INFO:tasks.workunit.client.1.vm08.stdout:9/632: write d2/d58/d73/f7e [371275,87712] 0 2026-03-10T07:51:39.554 INFO:tasks.workunit.client.1.vm08.stdout:6/666: mkdir d1/d17/d2b/d58/d76/dde 0 2026-03-10T07:51:39.554 INFO:tasks.workunit.client.1.vm08.stdout:8/738: rmdir d0/df/d15/d23/d39 39 2026-03-10T07:51:39.557 INFO:tasks.workunit.client.1.vm08.stdout:9/633: mknod d2/de/cd8 0 2026-03-10T07:51:39.568 INFO:tasks.workunit.client.1.vm08.stdout:8/739: creat d0/df/d15/d53/fee x:0 0 0 2026-03-10T07:51:39.569 INFO:tasks.workunit.client.1.vm08.stdout:9/634: readlink d2/d58/dbf/dd0/d35/d9b/lb0 0 2026-03-10T07:51:39.569 INFO:tasks.workunit.client.1.vm08.stdout:6/667: dwrite d1/db/fd2 [0,4194304] 0 2026-03-10T07:51:39.569 INFO:tasks.workunit.client.1.vm08.stdout:0/630: dread dd/d10/d2f/d37/d64/d95/d58/d3d/d9b/fb4 [0,4194304] 0 2026-03-10T07:51:39.569 INFO:tasks.workunit.client.1.vm08.stdout:8/740: symlink d0/df/d2e/d49/lef 0 2026-03-10T07:51:39.569 INFO:tasks.workunit.client.1.vm08.stdout:9/635: mkdir d2/de/d28/d98/dbb/dd9 0 2026-03-10T07:51:39.570 INFO:tasks.workunit.client.1.vm08.stdout:6/668: creat d1/d3/df/d1d/d40/d45/fdf x:0 0 0 2026-03-10T07:51:39.583 INFO:tasks.workunit.client.1.vm08.stdout:8/741: creat d0/df/d15/d23/d54/dba/d89/dbf/ff0 x:0 0 0 2026-03-10T07:51:39.583 INFO:tasks.workunit.client.1.vm08.stdout:9/636: dread d2/d58/dbf/d2b/f37 [0,4194304] 0 
2026-03-10T07:51:39.589 INFO:tasks.workunit.client.1.vm08.stdout:0/631: dread dd/d10/d2f/d37/d64/f70 [0,4194304] 0 2026-03-10T07:51:39.592 INFO:tasks.workunit.client.1.vm08.stdout:0/632: dread dd/d10/d2f/d37/d64/d95/d5c/f8f [0,4194304] 0 2026-03-10T07:51:39.601 INFO:tasks.workunit.client.1.vm08.stdout:5/676: sync 2026-03-10T07:51:39.602 INFO:tasks.workunit.client.1.vm08.stdout:5/677: mkdir d0/d33/ddd 0 2026-03-10T07:51:39.606 INFO:tasks.workunit.client.1.vm08.stdout:5/678: rename d0/d4/d19/f79 to d0/d4/d19/d3a/d69/fde 0 2026-03-10T07:51:39.613 INFO:tasks.workunit.client.1.vm08.stdout:5/679: rename d0/d4/df/d12/d22 to d0/ddf 0 2026-03-10T07:51:39.617 INFO:tasks.workunit.client.1.vm08.stdout:5/680: mkdir d0/d77/d83/de0 0 2026-03-10T07:51:39.617 INFO:tasks.workunit.client.1.vm08.stdout:0/633: dread dd/d10/d2f/d37/d64/f68 [0,4194304] 0 2026-03-10T07:51:39.618 INFO:tasks.workunit.client.1.vm08.stdout:0/634: chown dd/d10/d2f/f4c 153962271 1 2026-03-10T07:51:39.619 INFO:tasks.workunit.client.1.vm08.stdout:0/635: chown dd/d10/d14/d15/l17 438760748 1 2026-03-10T07:51:39.630 INFO:tasks.workunit.client.1.vm08.stdout:0/636: unlink dd/d10/d14/d15/l57 0 2026-03-10T07:51:39.643 INFO:tasks.workunit.client.1.vm08.stdout:0/637: stat dd/d10/d14/d15/d20/d5f/d9f/fbb 0 2026-03-10T07:51:39.643 INFO:tasks.workunit.client.1.vm08.stdout:2/634: dwrite d0/d1/d3/f4e [0,4194304] 0 2026-03-10T07:51:39.645 INFO:tasks.workunit.client.1.vm08.stdout:0/638: truncate dd/d10/d2f/f81 936692 0 2026-03-10T07:51:39.646 INFO:tasks.workunit.client.1.vm08.stdout:2/635: write d0/d1/d17/d6b/f6f [151741,58212] 0 2026-03-10T07:51:39.646 INFO:tasks.workunit.client.1.vm08.stdout:0/639: chown dd/d10/d2f/d37/d64/d95/d58/c4e 122 1 2026-03-10T07:51:39.647 INFO:tasks.workunit.client.1.vm08.stdout:2/636: dread - d0/d1/d3/d56/d78/dad/db1/d61/d84/fa8 zero size 2026-03-10T07:51:39.648 INFO:tasks.workunit.client.1.vm08.stdout:0/640: readlink dd/d10/d2f/d37/d64/d95/d5c/l60 0 2026-03-10T07:51:39.654 
INFO:tasks.workunit.client.1.vm08.stdout:2/637: symlink d0/d1/d17/db2/d9c/lcd 0 2026-03-10T07:51:39.654 INFO:tasks.workunit.client.1.vm08.stdout:2/638: chown d0/d1/d3/d10/d65/f7c 172 1 2026-03-10T07:51:39.657 INFO:tasks.workunit.client.1.vm08.stdout:2/639: chown d0/d1/d3/d56/d78/dad/db1/d61/cb5 13487811 1 2026-03-10T07:51:39.659 INFO:tasks.workunit.client.1.vm08.stdout:3/624: write d0/d3c/d1f/d44/f8f [969622,16091] 0 2026-03-10T07:51:39.661 INFO:tasks.workunit.client.1.vm08.stdout:2/640: mknod d0/d1/d3/d56/cce 0 2026-03-10T07:51:39.672 INFO:tasks.workunit.client.1.vm08.stdout:2/641: fsync d0/d1/d3/f8 0 2026-03-10T07:51:39.676 INFO:tasks.workunit.client.1.vm08.stdout:3/625: creat d0/d3c/d18/d48/fc5 x:0 0 0 2026-03-10T07:51:39.677 INFO:tasks.workunit.client.1.vm08.stdout:2/642: chown d0/d1/d3/d10/d38/c8c 15 1 2026-03-10T07:51:39.689 INFO:tasks.workunit.client.1.vm08.stdout:2/643: mknod d0/d1/d3/d39/d7d/d86/d55/ccf 0 2026-03-10T07:51:39.694 INFO:tasks.workunit.client.1.vm08.stdout:2/644: rmdir d0/d1/d3/d39/d7d/d86/d55/d7a 39 2026-03-10T07:51:39.696 INFO:tasks.workunit.client.1.vm08.stdout:3/626: getdents d0/d3c/d18/d80 0 2026-03-10T07:51:39.704 INFO:tasks.workunit.client.1.vm08.stdout:4/539: dwrite d5/f10 [0,4194304] 0 2026-03-10T07:51:39.707 INFO:tasks.workunit.client.1.vm08.stdout:7/602: dwrite d3/da/d25/d9/d2f/d39/d43/f7e [0,4194304] 0 2026-03-10T07:51:39.713 INFO:tasks.workunit.client.1.vm08.stdout:7/603: dread - d3/da/d25/d9/f59 zero size 2026-03-10T07:51:39.715 INFO:tasks.workunit.client.1.vm08.stdout:2/645: dread d0/f1e [0,4194304] 0 2026-03-10T07:51:39.723 INFO:tasks.workunit.client.1.vm08.stdout:3/627: rmdir d0/d3c/d18/d32/d61/d52/dc0 0 2026-03-10T07:51:39.731 INFO:tasks.workunit.client.1.vm08.stdout:1/590: dwrite d2/d6/de/d1f/f2a [0,4194304] 0 2026-03-10T07:51:39.731 INFO:tasks.workunit.client.1.vm08.stdout:1/591: stat d2/d6/de/d1f/d26/d58/d8c/f97 0 2026-03-10T07:51:39.734 INFO:tasks.workunit.client.1.vm08.stdout:4/540: creat d5/da0/d32/fbb x:0 0 0 
2026-03-10T07:51:39.736 INFO:tasks.workunit.client.1.vm08.stdout:7/604: creat d3/da/d8a/fcd x:0 0 0 2026-03-10T07:51:39.739 INFO:tasks.workunit.client.1.vm08.stdout:9/637: write d2/d58/f95 [562287,62674] 0 2026-03-10T07:51:39.749 INFO:tasks.workunit.client.1.vm08.stdout:8/742: dwrite d0/df/d15/d23/da8/fc2 [0,4194304] 0 2026-03-10T07:51:39.749 INFO:tasks.workunit.client.1.vm08.stdout:6/669: dwrite d1/db/d24/f25 [0,4194304] 0 2026-03-10T07:51:39.749 INFO:tasks.workunit.client.1.vm08.stdout:2/646: rename d0/d1/c1d to d0/d1/d3/d10/cd0 0 2026-03-10T07:51:39.749 INFO:tasks.workunit.client.1.vm08.stdout:6/670: rename d1/db to d1/db/d24/d3d/de0 22 2026-03-10T07:51:39.762 INFO:tasks.workunit.client.1.vm08.stdout:5/681: dwrite d0/d4/d19/d3a/d69/fa3 [0,4194304] 0 2026-03-10T07:51:39.780 INFO:tasks.workunit.client.1.vm08.stdout:4/541: creat d5/da0/d12/d7b/fbc x:0 0 0 2026-03-10T07:51:39.788 INFO:tasks.workunit.client.1.vm08.stdout:0/641: write dd/d10/d14/d15/d20/fc5 [3250953,118164] 0 2026-03-10T07:51:39.789 INFO:tasks.workunit.client.1.vm08.stdout:6/671: unlink d1/db/d24/dac/dad/c9a 0 2026-03-10T07:51:39.797 INFO:tasks.workunit.client.1.vm08.stdout:7/605: dread d3/f6 [0,4194304] 0 2026-03-10T07:51:39.798 INFO:tasks.workunit.client.1.vm08.stdout:1/592: symlink d2/d6/de/d1f/d40/lce 0 2026-03-10T07:51:39.805 INFO:tasks.workunit.client.1.vm08.stdout:5/682: rename d0/d8/dce/fd6 to d0/d4/d19/d60/d6d/d70/fe1 0 2026-03-10T07:51:39.815 INFO:tasks.workunit.client.1.vm08.stdout:4/542: creat d5/d1f/d70/fbd x:0 0 0 2026-03-10T07:51:39.819 INFO:tasks.workunit.client.1.vm08.stdout:4/543: dwrite d5/d8/f90 [0,4194304] 0 2026-03-10T07:51:39.820 INFO:tasks.workunit.client.1.vm08.stdout:6/672: rmdir d1/d3 39 2026-03-10T07:51:39.830 INFO:tasks.workunit.client.1.vm08.stdout:2/647: creat d0/d1/d3/d39/d7d/d86/d55/dc9/fd1 x:0 0 0 2026-03-10T07:51:39.837 INFO:tasks.workunit.client.1.vm08.stdout:2/648: dwrite d0/d1/d3/d39/d7d/d7e/fa5 [0,4194304] 0 2026-03-10T07:51:39.837 
INFO:tasks.workunit.client.1.vm08.stdout:3/628: creat d0/d3c/d18/d48/d55/fc6 x:0 0 0
2026-03-10T07:51:39.846 INFO:tasks.workunit.client.1.vm08.stdout:7/606: creat d3/da/d25/d9/d2f/d39/fce x:0 0 0
2026-03-10T07:51:39.850 INFO:tasks.workunit.client.1.vm08.stdout:1/593: rmdir d2/d6/de/d1f/da9 39
2026-03-10T07:51:39.859 INFO:tasks.workunit.client.1.vm08.stdout:5/683: stat d0/d77/d83/c9f 0
2026-03-10T07:51:39.863 INFO:tasks.workunit.client.1.vm08.stdout:5/684: dwrite d0/d4/d19/d3a/d69/f71 [0,4194304] 0
2026-03-10T07:51:39.888 INFO:tasks.workunit.client.1.vm08.stdout:8/743: getdents d0/df/d15/d23/d54 0
2026-03-10T07:51:39.905 INFO:tasks.workunit.client.1.vm08.stdout:6/673: chown d1/d3/d3e/f4a 4311118 1
2026-03-10T07:51:39.910 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:51:39 vm05.local ceph-mon[50387]: pgmap v39: 65 pgs: 65 active+clean; 3.0 GiB data, 9.8 GiB used, 110 GiB / 120 GiB avail; 50 MiB/s rd, 149 MiB/s wr, 331 op/s
2026-03-10T07:51:39.911 INFO:tasks.workunit.client.1.vm08.stdout:4/544: creat d5/d1f/d31/fbe x:0 0 0
2026-03-10T07:51:39.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:51:39 vm08.local ceph-mon[59917]: pgmap v39: 65 pgs: 65 active+clean; 3.0 GiB data, 9.8 GiB used, 110 GiB / 120 GiB avail; 50 MiB/s rd, 149 MiB/s wr, 331 op/s
2026-03-10T07:51:39.924 INFO:tasks.workunit.client.1.vm08.stdout:3/629: creat d0/d3c/d18/d32/fc7 x:0 0 0
2026-03-10T07:51:39.966 INFO:tasks.workunit.client.1.vm08.stdout:3/630: write d0/d3c/d1f/f6f [5230650,25865] 0
2026-03-10T07:51:39.966 INFO:tasks.workunit.client.1.vm08.stdout:3/631: write d0/d3c/d18/d32/d61/d52/f70 [2425925,1626] 0
2026-03-10T07:51:39.966 INFO:tasks.workunit.client.1.vm08.stdout:9/638: getdents d2/d58 0
2026-03-10T07:51:39.966 INFO:tasks.workunit.client.1.vm08.stdout:0/642: link dd/d10/d2f/d37/d64/d95/f48 dd/d10/d2f/d37/daf/fcc 0
2026-03-10T07:51:39.966 INFO:tasks.workunit.client.1.vm08.stdout:8/744: write d0/df/d15/d23/d39/d5b/dea/dce/fd2 [196848,112303] 0
2026-03-10T07:51:39.967 INFO:tasks.workunit.client.1.vm08.stdout:1/594: mknod d2/d6/de/d1f/ccf 0
2026-03-10T07:51:39.967 INFO:tasks.workunit.client.1.vm08.stdout:5/685: getdents d0/d4/df/dbf/d41/dad 0
2026-03-10T07:51:39.967 INFO:tasks.workunit.client.1.vm08.stdout:5/686: readlink d0/d4/d19/l87 0
2026-03-10T07:51:39.967 INFO:tasks.workunit.client.1.vm08.stdout:8/745: creat d0/df/d15/d9c/ff1 x:0 0 0
2026-03-10T07:51:39.967 INFO:tasks.workunit.client.1.vm08.stdout:7/607: symlink d3/da/d25/d9/d6f/lcf 0
2026-03-10T07:51:39.970 INFO:tasks.workunit.client.1.vm08.stdout:5/687: fdatasync d0/d8/f1b 0
2026-03-10T07:51:39.975 INFO:tasks.workunit.client.1.vm08.stdout:8/746: unlink d0/df/d2e/fb9 0
2026-03-10T07:51:39.978 INFO:tasks.workunit.client.1.vm08.stdout:6/674: getdents d1/d3/df/d1d/d40/d87 0
2026-03-10T07:51:39.978 INFO:tasks.workunit.client.1.vm08.stdout:1/595: creat d2/d6/de/d47/dbd/dc3/fd0 x:0 0 0
2026-03-10T07:51:39.980 INFO:tasks.workunit.client.1.vm08.stdout:7/608: creat d3/da/d25/d9/d2f/d39/db2/fd0 x:0 0 0
2026-03-10T07:51:39.981 INFO:tasks.workunit.client.1.vm08.stdout:7/609: write d3/da/d25/d9/fd [3745016,119999] 0
2026-03-10T07:51:39.982 INFO:tasks.workunit.client.1.vm08.stdout:5/688: mkdir d0/d8/d24/de2 0
2026-03-10T07:51:39.993 INFO:tasks.workunit.client.1.vm08.stdout:6/675: unlink d1/d3/df/d38/lca 0
2026-03-10T07:51:40.012 INFO:tasks.workunit.client.1.vm08.stdout:6/676: read d1/d3/d3e/f43 [95445,5893] 0
2026-03-10T07:51:40.012 INFO:tasks.workunit.client.1.vm08.stdout:6/677: write d1/d17/d2b/d58/d76/fb6 [4812,126604] 0
2026-03-10T07:51:40.012 INFO:tasks.workunit.client.1.vm08.stdout:1/596: rename d2/d6/de/d1f/d26/d89/fa8 to d2/d6/de/d1f/d40/d76/fd1 0
2026-03-10T07:51:40.012 INFO:tasks.workunit.client.1.vm08.stdout:7/610: truncate d3/f2b 401890 0
2026-03-10T07:51:40.012 INFO:tasks.workunit.client.1.vm08.stdout:5/689: mknod d0/d77/daa/ce3 0
2026-03-10T07:51:40.012 INFO:tasks.workunit.client.1.vm08.stdout:1/597: fdatasync d2/d6/de/f7c 0
2026-03-10T07:51:40.012 INFO:tasks.workunit.client.1.vm08.stdout:7/611: chown d3/da/d25/d9/d2f/d39/d43/d4f/d5b/db7/l4c 0 1
2026-03-10T07:51:40.012 INFO:tasks.workunit.client.1.vm08.stdout:7/612: mkdir d3/da/d8a/dd1 0
2026-03-10T07:51:40.012 INFO:tasks.workunit.client.1.vm08.stdout:7/613: chown d3/da/d25/f94 5 1
2026-03-10T07:51:40.012 INFO:tasks.workunit.client.1.vm08.stdout:7/614: chown d3/da/d25/d9/f59 877248 1
2026-03-10T07:51:40.012 INFO:tasks.workunit.client.1.vm08.stdout:1/598: symlink d2/d6/de/d1f/d26/d58/ld2 0
2026-03-10T07:51:40.012 INFO:tasks.workunit.client.1.vm08.stdout:1/599: mkdir d2/d6/de/d1f/d22/dd3 0
2026-03-10T07:51:40.012 INFO:tasks.workunit.client.1.vm08.stdout:1/600: chown d2/d6/de/d1f/d26/f2f 1 1
2026-03-10T07:51:40.013 INFO:tasks.workunit.client.1.vm08.stdout:1/601: truncate d2/d6/d9f/fa7 336264 0
2026-03-10T07:51:40.014 INFO:tasks.workunit.client.1.vm08.stdout:1/602: readlink d2/d6/d50/l55 0
2026-03-10T07:51:40.028 INFO:tasks.workunit.client.1.vm08.stdout:1/603: symlink d2/d6/de/d1f/d40/d76/ld4 0
2026-03-10T07:51:40.029 INFO:tasks.workunit.client.1.vm08.stdout:1/604: readlink d2/d6/de/d5f/lba 0
2026-03-10T07:51:40.029 INFO:tasks.workunit.client.1.vm08.stdout:1/605: rmdir d2/d6 39
2026-03-10T07:51:40.037 INFO:tasks.workunit.client.1.vm08.stdout:1/606: link d2/d6/de/d1f/d26/f4a d2/d6/de/d5f/fd5 0
2026-03-10T07:51:40.040 INFO:tasks.workunit.client.1.vm08.stdout:9/639: sync
2026-03-10T07:51:40.040 INFO:tasks.workunit.client.1.vm08.stdout:6/678: sync
2026-03-10T07:51:40.041 INFO:tasks.workunit.client.1.vm08.stdout:1/607: creat d2/d6/de/d47/dbd/fd6 x:0 0 0
2026-03-10T07:51:40.043 INFO:tasks.workunit.client.1.vm08.stdout:6/679: creat d1/d3/d3e/fe1 x:0 0 0
2026-03-10T07:51:40.043 INFO:tasks.workunit.client.1.vm08.stdout:9/640: fsync d2/d58/dbf/d2b/f6a 0
2026-03-10T07:51:40.050 INFO:tasks.workunit.client.1.vm08.stdout:2/649: truncate d0/f12 2016284 0
2026-03-10T07:51:40.051 INFO:tasks.workunit.client.1.vm08.stdout:2/650: readlink d0/d1/d3/d56/d78/dad/db1/lb3 0
2026-03-10T07:51:40.052 INFO:tasks.workunit.client.1.vm08.stdout:0/643: write dd/d18/f9a [777943,58257] 0
2026-03-10T07:51:40.053 INFO:tasks.workunit.client.1.vm08.stdout:3/632: truncate d0/d3c/d1f/d44/d51/d34/fb5 1399472 0
2026-03-10T07:51:40.054 INFO:tasks.workunit.client.1.vm08.stdout:0/644: dread - dd/d10/fb8 zero size
2026-03-10T07:51:40.056 INFO:tasks.workunit.client.1.vm08.stdout:4/545: dwrite d5/f2f [0,4194304] 0
2026-03-10T07:51:40.057 INFO:tasks.workunit.client.1.vm08.stdout:0/645: stat dd/d10/d2f/d37/d64/d52/c97 0
2026-03-10T07:51:40.064 INFO:tasks.workunit.client.1.vm08.stdout:1/608: mkdir d2/d10/dd7 0
2026-03-10T07:51:40.097 INFO:tasks.workunit.client.1.vm08.stdout:0/646: fdatasync dd/d10/d2f/d37/d64/d95/d5c/f63 0
2026-03-10T07:51:40.098 INFO:tasks.workunit.client.1.vm08.stdout:0/647: read - dd/d10/d2f/f4b zero size
2026-03-10T07:51:40.098 INFO:tasks.workunit.client.1.vm08.stdout:8/747: dwrite d0/df/d15/d23/da8/f6a [0,4194304] 0
2026-03-10T07:51:40.098 INFO:tasks.workunit.client.1.vm08.stdout:5/690: dwrite d0/d4/d19/d43/f7c [0,4194304] 0
2026-03-10T07:51:40.098 INFO:tasks.workunit.client.1.vm08.stdout:0/648: rename dd/d10/d14/d15/d20/l3e to dd/d10/d14/d15/d20/d5f/d9f/lcd 0
2026-03-10T07:51:40.101 INFO:tasks.workunit.client.1.vm08.stdout:9/641: getdents d2/de/d28 0
2026-03-10T07:51:40.106 INFO:tasks.workunit.client.1.vm08.stdout:0/649: dread dd/d10/d2f/d37/d64/d95/f48 [0,4194304] 0
2026-03-10T07:51:40.111 INFO:tasks.workunit.client.1.vm08.stdout:0/650: dwrite dd/d10/d2f/d37/d64/d95/d58/d3d/faa [0,4194304] 0
2026-03-10T07:51:40.123 INFO:tasks.workunit.client.1.vm08.stdout:0/651: creat dd/d10/d2f/d37/d64/d95/d5c/dca/fce x:0 0 0
2026-03-10T07:51:40.123 INFO:tasks.workunit.client.1.vm08.stdout:2/651: dread d0/d1/d3/d10/d65/fb8 [0,4194304] 0
2026-03-10T07:51:40.124 INFO:tasks.workunit.client.1.vm08.stdout:0/652: rename dd/d10/d14/d15 to dd/d10/d14/d15/dad/dcf 22
2026-03-10T07:51:40.127 INFO:tasks.workunit.client.1.vm08.stdout:3/633: dread d0/d3c/d18/d32/d61/d52/f7f [0,4194304] 0
2026-03-10T07:51:40.132 INFO:tasks.workunit.client.1.vm08.stdout:2/652: dread d0/d1/fa2 [0,4194304] 0
2026-03-10T07:51:40.132 INFO:tasks.workunit.client.1.vm08.stdout:2/653: truncate d0/f81 1179220 0
2026-03-10T07:51:40.132 INFO:tasks.workunit.client.1.vm08.stdout:2/654: fdatasync d0/d1/d3/d56/d57/f79 0
2026-03-10T07:51:40.133 INFO:tasks.workunit.client.1.vm08.stdout:3/634: mknod d0/d3c/d18/d32/daa/cc8 0
2026-03-10T07:51:40.135 INFO:tasks.workunit.client.1.vm08.stdout:2/655: mknod d0/d1/d3/d56/d78/dad/db1/d61/d84/cd2 0
2026-03-10T07:51:40.138 INFO:tasks.workunit.client.1.vm08.stdout:2/656: fdatasync d0/d1/d3/d56/d78/dad/db1/d61/d8e/fa1 0
2026-03-10T07:51:40.139 INFO:tasks.workunit.client.1.vm08.stdout:2/657: rmdir d0/d1/d3/d56/d57 39
2026-03-10T07:51:40.146 INFO:tasks.workunit.client.1.vm08.stdout:6/680: sync
2026-03-10T07:51:40.153 INFO:tasks.workunit.client.1.vm08.stdout:7/615: dwrite d3/da/d25/d9/f59 [0,4194304] 0
2026-03-10T07:51:40.154 INFO:tasks.workunit.client.1.vm08.stdout:7/616: stat d3/da/d25/d9/d2f/d39/d43/d4f/la7 0
2026-03-10T07:51:40.158 INFO:tasks.workunit.client.1.vm08.stdout:7/617: write d3/da/f17 [2035129,22742] 0
2026-03-10T07:51:40.161 INFO:tasks.workunit.client.1.vm08.stdout:7/618: mkdir d3/da/d25/d9/d2f/d3a/d71/dca/dd2 0
2026-03-10T07:51:40.165 INFO:tasks.workunit.client.1.vm08.stdout:7/619: creat d3/da/d25/d9/d2f/d39/d43/d4f/d5b/db7/dac/fd3 x:0 0 0
2026-03-10T07:51:40.165 INFO:tasks.workunit.client.1.vm08.stdout:4/546: sync
2026-03-10T07:51:40.166 INFO:tasks.workunit.client.1.vm08.stdout:4/547: dread - d5/d8/f86 zero size
2026-03-10T07:51:40.180 INFO:tasks.workunit.client.1.vm08.stdout:7/620: dread d3/da/d25/d9/d2f/f97 [0,4194304] 0
2026-03-10T07:51:40.180 INFO:tasks.workunit.client.1.vm08.stdout:4/548: dwrite d5/d1f/d31/fbe [0,4194304] 0
2026-03-10T07:51:40.180 INFO:tasks.workunit.client.1.vm08.stdout:4/549: mknod d5/da0/d12/d7b/d48/da2/cbf 0
2026-03-10T07:51:40.180 INFO:tasks.workunit.client.1.vm08.stdout:4/550: rmdir d5/da0/d95 39
2026-03-10T07:51:40.180 INFO:tasks.workunit.client.1.vm08.stdout:4/551: chown d5/da0/d95/lb1 85258429 1
2026-03-10T07:51:40.180 INFO:tasks.workunit.client.1.vm08.stdout:2/658: sync
2026-03-10T07:51:40.187 INFO:tasks.workunit.client.1.vm08.stdout:4/552: dread d5/d8/f1e [4194304,4194304] 0
2026-03-10T07:51:40.187 INFO:tasks.workunit.client.1.vm08.stdout:4/553: readlink d5/da0/d95/lb1 0
2026-03-10T07:51:40.188 INFO:tasks.workunit.client.1.vm08.stdout:4/554: truncate d5/da0/f46 564953 0
2026-03-10T07:51:40.189 INFO:tasks.workunit.client.1.vm08.stdout:4/555: symlink d5/d1f/lc0 0
2026-03-10T07:51:40.193 INFO:tasks.workunit.client.1.vm08.stdout:4/556: dwrite d5/d1f/d70/fa9 [0,4194304] 0
2026-03-10T07:51:40.222 INFO:tasks.workunit.client.1.vm08.stdout:4/557: chown d5/d1f/d31/lac 200905303 1
2026-03-10T07:51:40.222 INFO:tasks.workunit.client.1.vm08.stdout:4/558: dwrite d5/d1f/d41/f97 [0,4194304] 0
2026-03-10T07:51:40.337 INFO:tasks.workunit.client.1.vm08.stdout:1/609: dwrite d2/d6/de/d1f/d26/f29 [0,4194304] 0
2026-03-10T07:51:40.343 INFO:tasks.workunit.client.1.vm08.stdout:5/691: write d0/d4/df/d12/f46 [1039588,97273] 0
2026-03-10T07:51:40.343 INFO:tasks.workunit.client.1.vm08.stdout:8/748: write d0/df/d17/d25/fc4 [1493982,12683] 0
2026-03-10T07:51:40.344 INFO:tasks.workunit.client.1.vm08.stdout:9/642: dwrite d2/d58/dbf/d2b/f6a [0,4194304] 0
2026-03-10T07:51:40.349 INFO:tasks.workunit.client.1.vm08.stdout:8/749: write d0/df/d15/d23/d54/dba/d89/fd0 [1476641,107593] 0
2026-03-10T07:51:40.350 INFO:tasks.workunit.client.1.vm08.stdout:8/750: write d0/f2a [4299990,37247] 0
2026-03-10T07:51:40.369 INFO:tasks.workunit.client.1.vm08.stdout:6/681: write d1/d3/d3e/f81 [5136617,24855] 0
2026-03-10T07:51:40.370 INFO:tasks.workunit.client.1.vm08.stdout:3/635: dwrite d0/d3c/d1f/d44/f8c [0,4194304] 0
2026-03-10T07:51:40.371 INFO:tasks.workunit.client.1.vm08.stdout:0/653: dwrite dd/d10/d2f/f6b [0,4194304] 0
2026-03-10T07:51:40.373 INFO:tasks.workunit.client.1.vm08.stdout:9/643: mkdir d2/dda 0
2026-03-10T07:51:40.396 INFO:tasks.workunit.client.1.vm08.stdout:7/621: dwrite d3/f57 [0,4194304] 0
2026-03-10T07:51:40.403 INFO:tasks.workunit.client.1.vm08.stdout:8/751: rename d0/df/d15/d23/d39/f40 to d0/df/d15/d23/d54/dba/d89/dbf/ff2 0
2026-03-10T07:51:40.405 INFO:tasks.workunit.client.1.vm08.stdout:8/752: write d0/df/d15/d23/da8/fc2 [187057,94184] 0
2026-03-10T07:51:40.405 INFO:tasks.workunit.client.1.vm08.stdout:1/610: fsync d2/d6/de/d1f/d40/d76/fd1 0
2026-03-10T07:51:40.407 INFO:tasks.workunit.client.1.vm08.stdout:2/659: dwrite d0/d1/d3/d56/d78/dad/db1/d61/f3d [0,4194304] 0
2026-03-10T07:51:40.413 INFO:tasks.workunit.client.1.vm08.stdout:8/753: dread d0/df/d5d/f81 [0,4194304] 0
2026-03-10T07:51:40.413 INFO:tasks.workunit.client.1.vm08.stdout:6/682: fdatasync d1/db/d24/dac/dad/f59 0
2026-03-10T07:51:40.417 INFO:tasks.workunit.client.1.vm08.stdout:0/654: write dd/d10/d2f/f81 [173300,828] 0
2026-03-10T07:51:40.425 INFO:tasks.workunit.client.1.vm08.stdout:1/611: dread d2/d6/d50/f54 [0,4194304] 0
2026-03-10T07:51:40.435 INFO:tasks.workunit.client.1.vm08.stdout:5/692: creat d0/d4/d19/d81/fe4 x:0 0 0
2026-03-10T07:51:40.435 INFO:tasks.workunit.client.1.vm08.stdout:3/636: symlink d0/d3c/d1f/d44/d51/lc9 0
2026-03-10T07:51:40.435 INFO:tasks.workunit.client.1.vm08.stdout:6/683: rename d1/d3/df/d1d/d40/lc0 to d1/d17/d2b/d58/d76/le2 0
2026-03-10T07:51:40.435 INFO:tasks.workunit.client.1.vm08.stdout:0/655: mknod dd/d10/d14/d15/d20/d7a/cd0 0
2026-03-10T07:51:40.435 INFO:tasks.workunit.client.1.vm08.stdout:9/644: rmdir d2/de/dd6 0
2026-03-10T07:51:40.435 INFO:tasks.workunit.client.1.vm08.stdout:5/693: creat d0/d4/d19/d60/d6d/d70/d40/dba/fe5 x:0 0 0
2026-03-10T07:51:40.435 INFO:tasks.workunit.client.1.vm08.stdout:1/612: rename d2/d6/de/d1f/d26/d98/fbe to d2/d10/fd8 0
2026-03-10T07:51:40.435 INFO:tasks.workunit.client.1.vm08.stdout:7/622: getdents d3/da/d8a 0
2026-03-10T07:51:40.435 INFO:tasks.workunit.client.1.vm08.stdout:0/656: creat dd/d10/d14/d15/d20/d5f/d9f/fd1 x:0 0 0
2026-03-10T07:51:40.436 INFO:tasks.workunit.client.1.vm08.stdout:5/694: chown d0/d4/df 1 1
2026-03-10T07:51:40.436 INFO:tasks.workunit.client.1.vm08.stdout:1/613: write d2/d10/f3f [325026,89995] 0
2026-03-10T07:51:40.436 INFO:tasks.workunit.client.1.vm08.stdout:8/754: dread d0/df/f19 [0,4194304] 0
2026-03-10T07:51:40.443 INFO:tasks.workunit.client.1.vm08.stdout:9/645: rename d2/l5e to d2/de/d28/d98/dbb/dd9/ldb 0
2026-03-10T07:51:40.445 INFO:tasks.workunit.client.1.vm08.stdout:7/623: dwrite d3/da/d25/d9/d2f/f62 [0,4194304] 0
2026-03-10T07:51:40.447 INFO:tasks.workunit.client.1.vm08.stdout:7/624: write d3/da/d25/d9/d2f/d39/d43/d4f/f68 [4972910,105046] 0
2026-03-10T07:51:40.449 INFO:tasks.workunit.client.1.vm08.stdout:0/657: mkdir dd/d10/d14/d15/d20/d7a/dd2 0
2026-03-10T07:51:40.449 INFO:tasks.workunit.client.1.vm08.stdout:8/755: creat d0/df/d2e/d49/ff3 x:0 0 0
2026-03-10T07:51:40.450 INFO:tasks.workunit.client.1.vm08.stdout:3/637: read d0/d3c/d1f/d44/d51/d2d/d85/fa0 [615618,64378] 0
2026-03-10T07:51:40.452 INFO:tasks.workunit.client.1.vm08.stdout:0/658: dread - dd/d10/d2f/f4b zero size
2026-03-10T07:51:40.460 INFO:tasks.workunit.client.1.vm08.stdout:8/756: dread d0/df/f19 [0,4194304] 0
2026-03-10T07:51:40.460 INFO:tasks.workunit.client.1.vm08.stdout:5/695: creat d0/d8/d24/de2/fe6 x:0 0 0
2026-03-10T07:51:40.467 INFO:tasks.workunit.client.1.vm08.stdout:0/659: truncate dd/f44 1454190 0
2026-03-10T07:51:40.469 INFO:tasks.workunit.client.1.vm08.stdout:3/638: mkdir d0/d3c/d18/d32/d61/d52/dca 0
2026-03-10T07:51:40.478 INFO:tasks.workunit.client.1.vm08.stdout:1/614: link d2/d6/c52 d2/d6/de/d47/dbd/dc3/cd9 0
2026-03-10T07:51:40.479 INFO:tasks.workunit.client.1.vm08.stdout:8/757: link d0/df/cd6 d0/df/d15/d23/d54/dba/d89/cf4 0
2026-03-10T07:51:40.480 INFO:tasks.workunit.client.1.vm08.stdout:0/660: symlink dd/d10/d2f/d37/ld3 0
2026-03-10T07:51:40.480 INFO:tasks.workunit.client.1.vm08.stdout:9/646: getdents d2/de/d28/d98 0
2026-03-10T07:51:40.481 INFO:tasks.workunit.client.1.vm08.stdout:7/625: link d3/c78 d3/da/d25/d9/d2f/d3a/d4b/cd4 0
2026-03-10T07:51:40.481 INFO:tasks.workunit.client.1.vm08.stdout:0/661: write dd/d10/d2f/f81 [1563080,9503] 0
2026-03-10T07:51:40.481 INFO:tasks.workunit.client.1.vm08.stdout:8/758: rename d0/fa3 to d0/df/d15/d23/da8/ff5 0
2026-03-10T07:51:40.482 INFO:tasks.workunit.client.1.vm08.stdout:5/696: link d0/d77/daa/ce3 d0/d4/d19/d81/ce7 0
2026-03-10T07:51:40.482 INFO:tasks.workunit.client.1.vm08.stdout:8/759: chown d0/df/d15/d23/d54/fad 623218 1
2026-03-10T07:51:40.483 INFO:tasks.workunit.client.1.vm08.stdout:1/615: mknod d2/d6/d3a/d61/d6f/dad/cda 0
2026-03-10T07:51:40.483 INFO:tasks.workunit.client.1.vm08.stdout:5/697: truncate d0/ddf/f44 1934441 0
2026-03-10T07:51:40.485 INFO:tasks.workunit.client.1.vm08.stdout:9/647: creat d2/de/fdc x:0 0 0
2026-03-10T07:51:40.486 INFO:tasks.workunit.client.1.vm08.stdout:9/648: chown d2/d58/d73 61041124 1
2026-03-10T07:51:40.488 INFO:tasks.workunit.client.1.vm08.stdout:9/649: creat d2/d58/dbf/dd0/d35/fdd x:0 0 0
2026-03-10T07:51:40.488 INFO:tasks.workunit.client.1.vm08.stdout:9/650: symlink d2/d58/d73/lde 0
2026-03-10T07:51:40.493 INFO:tasks.workunit.client.1.vm08.stdout:0/662: read dd/d18/f21 [1857115,39361] 0
2026-03-10T07:51:40.495 INFO:tasks.workunit.client.1.vm08.stdout:7/626: dread d3/da/d25/d9/d2f/d39/d43/d4f/d5b/db7/f49 [0,4194304] 0
2026-03-10T07:51:40.495 INFO:tasks.workunit.client.1.vm08.stdout:0/663: dread - dd/d10/d2f/d37/d64/d95/d5c/fb0 zero size
2026-03-10T07:51:40.499 INFO:tasks.workunit.client.1.vm08.stdout:7/627: dwrite d3/da/d25/d9/d2f/d4d/db6/fc6 [0,4194304] 0
2026-03-10T07:51:40.519 INFO:tasks.workunit.client.1.vm08.stdout:4/559: dwrite f2 [0,4194304] 0
2026-03-10T07:51:40.520 INFO:tasks.workunit.client.1.vm08.stdout:7/628: dread d3/da/d25/d9/d2f/d4d/db6/fc6 [0,4194304] 0
2026-03-10T07:51:40.523 INFO:tasks.workunit.client.1.vm08.stdout:4/560: write d5/f21 [1118397,73172] 0
2026-03-10T07:51:40.523 INFO:tasks.workunit.client.1.vm08.stdout:0/664: creat dd/d10/d14/d15/d20/d7a/dd2/fd4 x:0 0 0
2026-03-10T07:51:40.528 INFO:tasks.workunit.client.1.vm08.stdout:9/651: dread d2/d3/f5f [0,4194304] 0
2026-03-10T07:51:40.532 INFO:tasks.workunit.client.1.vm08.stdout:3/639: sync
2026-03-10T07:51:40.533 INFO:tasks.workunit.client.1.vm08.stdout:0/665: mknod dd/d10/d14/d15/d20/d22/cd5 0
2026-03-10T07:51:40.534 INFO:tasks.workunit.client.1.vm08.stdout:3/640: creat d0/d3c/d1f/d44/d51/d2d/d85/fcb x:0 0 0
2026-03-10T07:51:40.535 INFO:tasks.workunit.client.1.vm08.stdout:4/561: creat d5/d8/fc1 x:0 0 0
2026-03-10T07:51:40.538 INFO:tasks.workunit.client.1.vm08.stdout:9/652: getdents d2/d26/da4 0
2026-03-10T07:51:40.541 INFO:tasks.workunit.client.1.vm08.stdout:4/562: mkdir d5/da0/d95/dc2 0
2026-03-10T07:51:40.541 INFO:tasks.workunit.client.1.vm08.stdout:3/641: mkdir d0/d3c/d18/da9/dcc 0
2026-03-10T07:51:40.543 INFO:tasks.workunit.client.1.vm08.stdout:3/642: write d0/d3c/d1f/d89/fba [907958,101323] 0
2026-03-10T07:51:40.543 INFO:tasks.workunit.client.1.vm08.stdout:0/666: dwrite dd/d10/d2f/d37/d64/d95/d58/fb2 [0,4194304] 0
2026-03-10T07:51:40.544 INFO:tasks.workunit.client.1.vm08.stdout:4/563: creat d5/d1f/dad/fc3 x:0 0 0
2026-03-10T07:51:40.545 INFO:tasks.workunit.client.1.vm08.stdout:3/643: chown d0/d3c/d18/d32 45245917 1
2026-03-10T07:51:40.548 INFO:tasks.workunit.client.1.vm08.stdout:3/644: readlink d0/d3c/d18/da9/lae 0
2026-03-10T07:51:40.551 INFO:tasks.workunit.client.1.vm08.stdout:3/645: dread - d0/d3c/d1f/d44/fb4 zero size
2026-03-10T07:51:40.551 INFO:tasks.workunit.client.1.vm08.stdout:4/564: unlink d5/da0/f69 0
2026-03-10T07:51:40.551 INFO:tasks.workunit.client.1.vm08.stdout:0/667: symlink dd/d10/d14/d15/dad/ld6 0
2026-03-10T07:51:40.551 INFO:tasks.workunit.client.1.vm08.stdout:3/646: write d0/d3c/d1f/d95/fbd [589322,102864] 0
2026-03-10T07:51:40.554 INFO:tasks.workunit.client.1.vm08.stdout:3/647: dread - d0/d3c/d18/fa5 zero size
2026-03-10T07:51:40.555 INFO:tasks.workunit.client.1.vm08.stdout:4/565: dwrite d5/da0/d12/d7b/fbc [0,4194304] 0
2026-03-10T07:51:40.557 INFO:tasks.workunit.client.1.vm08.stdout:4/566: stat d5/da0/d32/l5e 0
2026-03-10T07:51:40.563 INFO:tasks.workunit.client.1.vm08.stdout:4/567: rename d5/d8/d50/c98 to d5/d1f/dad/cc4 0
2026-03-10T07:51:40.564 INFO:tasks.workunit.client.1.vm08.stdout:4/568: unlink d5/da0/d32/l8e 0
2026-03-10T07:51:40.565 INFO:tasks.workunit.client.1.vm08.stdout:4/569: creat d5/da0/d95/dc2/fc5 x:0 0 0
2026-03-10T07:51:40.565 INFO:tasks.workunit.client.1.vm08.stdout:4/570: chown d5/da0/d32/c35 935706429 1
2026-03-10T07:51:40.566 INFO:tasks.workunit.client.1.vm08.stdout:4/571: write d5/da0/d32/fbb [134753,61679] 0
2026-03-10T07:51:40.568 INFO:tasks.workunit.client.1.vm08.stdout:7/629: sync
2026-03-10T07:51:40.574 INFO:tasks.workunit.client.1.vm08.stdout:7/630: dread d3/da/d25/f35 [0,4194304] 0
2026-03-10T07:51:40.620 INFO:tasks.workunit.client.1.vm08.stdout:9/653: dread d2/d58/dbf/f7c [0,4194304] 0
2026-03-10T07:51:40.621 INFO:tasks.workunit.client.1.vm08.stdout:9/654: readlink d2/d3/lf 0
2026-03-10T07:51:40.623 INFO:tasks.workunit.client.1.vm08.stdout:6/684: dread d1/d3/df/d1d/f6b [0,4194304] 0
2026-03-10T07:51:40.627 INFO:tasks.workunit.client.1.vm08.stdout:6/685: truncate d1/d3/d3e/f43 28178 0
2026-03-10T07:51:40.628 INFO:tasks.workunit.client.1.vm08.stdout:6/686: creat d1/d17/d2b/d58/d76/dde/fe3 x:0 0 0
2026-03-10T07:51:40.628 INFO:tasks.workunit.client.1.vm08.stdout:6/687: chown d1/d3/df/d1d/d40/l6a 14 1
2026-03-10T07:51:40.630 INFO:tasks.workunit.client.1.vm08.stdout:6/688: truncate d1/d17/d2b/d58/d76/dde/fe3 594493 0
2026-03-10T07:51:40.633 INFO:tasks.workunit.client.1.vm08.stdout:6/689: dwrite d1/db/fd2 [0,4194304] 0
2026-03-10T07:51:40.647 INFO:tasks.workunit.client.1.vm08.stdout:6/690: truncate d1/d3/df/d38/f60 1568283 0
2026-03-10T07:51:40.666 INFO:tasks.workunit.client.1.vm08.stdout:6/691: sync
2026-03-10T07:51:40.667 INFO:tasks.workunit.client.1.vm08.stdout:6/692: write d1/d46/fd3 [2221511,55279] 0
2026-03-10T07:51:40.676 INFO:tasks.workunit.client.1.vm08.stdout:2/660: write d0/d1/d3/f8 [5193884,101762] 0
2026-03-10T07:51:40.677 INFO:tasks.workunit.client.1.vm08.stdout:2/661: write d0/d1/d3/d10/d65/f7c [5965102,52757] 0
2026-03-10T07:51:40.688 INFO:tasks.workunit.client.1.vm08.stdout:2/662: chown d0/d1/d3/d10/cd0 797961 1
2026-03-10T07:51:40.689 INFO:tasks.workunit.client.1.vm08.stdout:2/663: write d0/d1/d17/d6b/f9a [774546,127619] 0
2026-03-10T07:51:40.699 INFO:tasks.workunit.client.1.vm08.stdout:2/664: unlink d0/d1/fa2 0
2026-03-10T07:51:40.722 INFO:tasks.workunit.client.1.vm08.stdout:2/665: creat d0/d1/d3/d56/d78/dad/db1/fd3 x:0 0 0
2026-03-10T07:51:40.744 INFO:tasks.workunit.client.1.vm08.stdout:2/666: dread d0/d1/d3/d56/d78/f62 [4194304,4194304] 0
2026-03-10T07:51:40.766 INFO:tasks.workunit.client.1.vm08.stdout:5/698: truncate d0/d4/df/dbf/f37 375871 0
2026-03-10T07:51:40.767 INFO:tasks.workunit.client.1.vm08.stdout:1/616: write d2/d6/de/d5f/fd5 [1136672,80271] 0
2026-03-10T07:51:40.769 INFO:tasks.workunit.client.1.vm08.stdout:1/617: write d2/d6/de/fc0 [844947,49229] 0
2026-03-10T07:51:40.769 INFO:tasks.workunit.client.1.vm08.stdout:8/760: dwrite d0/df/d2e/f44 [0,4194304] 0
2026-03-10T07:51:40.774 INFO:tasks.workunit.client.1.vm08.stdout:5/699: mkdir d0/d4/df/dbf/d41/de8 0
2026-03-10T07:51:40.775 INFO:tasks.workunit.client.1.vm08.stdout:5/700: chown d0/d4/d19/d3a 38972535 1
2026-03-10T07:51:40.778 INFO:tasks.workunit.client.1.vm08.stdout:8/761: rmdir d0/df/d15/d9c 39
2026-03-10T07:51:40.781 INFO:tasks.workunit.client.1.vm08.stdout:5/701: rmdir d0/d4/df/d12 39
2026-03-10T07:51:40.782 INFO:tasks.workunit.client.1.vm08.stdout:5/702: readlink d0/ddf/l2f 0
2026-03-10T07:51:40.792 INFO:tasks.workunit.client.1.vm08.stdout:8/762: creat d0/df/d17/ff6 x:0 0 0
2026-03-10T07:51:40.793 INFO:tasks.workunit.client.1.vm08.stdout:8/763: write d0/df/d17/ff6 [424858,64585] 0
2026-03-10T07:51:40.818 INFO:tasks.workunit.client.1.vm08.stdout:8/764: link d0/df/l71 d0/df/d17/d72/lf7 0
2026-03-10T07:51:40.819 INFO:tasks.workunit.client.1.vm08.stdout:8/765: dread - d0/d69/fd9 zero size
2026-03-10T07:51:40.826 INFO:tasks.workunit.client.1.vm08.stdout:8/766: symlink d0/df/d17/d25/lf8 0
2026-03-10T07:51:40.839 INFO:tasks.workunit.client.1.vm08.stdout:8/767: mkdir d0/df/d15/d23/d54/dba/d89/dbf/df9 0
2026-03-10T07:51:40.844 INFO:tasks.workunit.client.1.vm08.stdout:8/768: fsync d0/df/d15/d23/da8/fc3 0
2026-03-10T07:51:40.855 INFO:tasks.workunit.client.1.vm08.stdout:8/769: symlink d0/df/d15/d23/d54/dba/d89/dc5/lfa 0
2026-03-10T07:51:40.859 INFO:tasks.workunit.client.1.vm08.stdout:8/770: symlink d0/df/d15/d23/d39/d5b/d4a/lfb 0
2026-03-10T07:51:40.865 INFO:tasks.workunit.client.1.vm08.stdout:0/668: dwrite dd/d10/d2f/d37/d64/d95/f2a [0,4194304] 0
2026-03-10T07:51:40.868 INFO:tasks.workunit.client.1.vm08.stdout:9/655: truncate d2/de/f4d 3654820 0
2026-03-10T07:51:40.878 INFO:tasks.workunit.client.1.vm08.stdout:0/669: fsync dd/f44 0
2026-03-10T07:51:40.878 INFO:tasks.workunit.client.1.vm08.stdout:9/656: getdents d2/d3/d84/dca/dd7 0
2026-03-10T07:51:40.889 INFO:tasks.workunit.client.1.vm08.stdout:0/670: truncate dd/d10/d2f/d37/d64/f3f 2830024 0
2026-03-10T07:51:40.893 INFO:tasks.workunit.client.1.vm08.stdout:0/671: unlink dd/d10/d14/f36 0
2026-03-10T07:51:40.893 INFO:tasks.workunit.client.1.vm08.stdout:0/672: write dd/d10/d2f/d37/d64/d95/f91 [1408764,67845] 0
2026-03-10T07:51:40.894 INFO:tasks.workunit.client.1.vm08.stdout:0/673: symlink dd/d10/d14/d15/d20/d92/ld7 0
2026-03-10T07:51:40.894 INFO:tasks.workunit.client.1.vm08.stdout:2/667: write d0/d1/d3/d39/d7d/f80 [3550944,39527] 0
2026-03-10T07:51:40.897 INFO:tasks.workunit.client.1.vm08.stdout:2/668: write d0/d1/d3/d39/d7d/d86/d55/db9/fc2 [1017111,56993] 0
2026-03-10T07:51:40.899 INFO:tasks.workunit.client.1.vm08.stdout:0/674: dwrite dd/d10/d2f/d37/d64/d95/d5c/f63 [8388608,4194304] 0
2026-03-10T07:51:40.901 INFO:tasks.workunit.client.1.vm08.stdout:0/675: write dd/d10/d2f/f6b [3325757,109155] 0
2026-03-10T07:51:40.926 INFO:tasks.workunit.client.1.vm08.stdout:2/669: link d0/d1/d3/d39/l67 d0/d1/d3/d39/ld4 0
2026-03-10T07:51:40.926 INFO:tasks.workunit.client.1.vm08.stdout:4/572: rename d5/d8/f1e to d5/da0/d12/fc6 0
2026-03-10T07:51:40.926 INFO:tasks.workunit.client.1.vm08.stdout:1/618: write d2/d6/f86 [908925,50192] 0
2026-03-10T07:51:40.929 INFO:tasks.workunit.client.1.vm08.stdout:4/573: dread d5/d8/f90 [0,4194304] 0
2026-03-10T07:51:40.938 INFO:tasks.workunit.client.1.vm08.stdout:1/619: dread d2/f36 [4194304,4194304] 0
2026-03-10T07:51:40.943 INFO:tasks.workunit.client.1.vm08.stdout:2/670: mkdir d0/d1/d3/d39/d7d/d7e/dd5 0
2026-03-10T07:51:40.943 INFO:tasks.workunit.client.1.vm08.stdout:4/574: fdatasync d5/d1f/f37 0
2026-03-10T07:51:40.961 INFO:tasks.workunit.client.1.vm08.stdout:1/620: stat d2/d6/de/d1f/da9 0
2026-03-10T07:51:40.966 INFO:tasks.workunit.client.1.vm08.stdout:8/771: rmdir d0/df/d15/d23/d54/dba/d89/dc5 39
2026-03-10T07:51:40.968 INFO:tasks.workunit.client.1.vm08.stdout:4/575: mknod d5/da0/d12/d7b/d48/d4f/d7c/cc7 0
2026-03-10T07:51:40.972 INFO:tasks.workunit.client.1.vm08.stdout:0/676: getdents dd/d10/d2f/d37/d64/d52 0
2026-03-10T07:51:40.976 INFO:tasks.workunit.client.1.vm08.stdout:7/631: rename d3/da/d25/d9/d2f/d3a/d40/d54/c5d to d3/da/d25/d9/d6f/cd5 0
2026-03-10T07:51:40.982 INFO:tasks.workunit.client.1.vm08.stdout:8/772: unlink d0/df/f26 0
2026-03-10T07:51:40.983 INFO:tasks.workunit.client.1.vm08.stdout:8/773: chown d0/df/d15/d23/d54/dba/d89/dbf/fdb 20384 1
2026-03-10T07:51:40.986 INFO:tasks.workunit.client.1.vm08.stdout:0/677: symlink dd/d10/d14/d15/d20/d5f/d9f/ld8 0
2026-03-10T07:51:40.988 INFO:tasks.workunit.client.1.vm08.stdout:9/657: dwrite d2/d58/dbf/faa [0,4194304] 0
2026-03-10T07:51:40.989 INFO:tasks.workunit.client.1.vm08.stdout:7/632: dread - d3/da/d25/d9/d2f/d4d/fa9 zero size
2026-03-10T07:51:40.990 INFO:tasks.workunit.client.1.vm08.stdout:7/633: chown d3/da/d25/f29 5485708 1
2026-03-10T07:51:41.007 INFO:tasks.workunit.client.1.vm08.stdout:2/671: creat d0/d1/d3/d39/d7d/d86/fd6 x:0 0 0
2026-03-10T07:51:41.012 INFO:tasks.workunit.client.1.vm08.stdout:8/774: dwrite d0/df/d15/d23/d54/dba/d89/dc5/fcc [0,4194304] 0
2026-03-10T07:51:41.038 INFO:tasks.workunit.client.1.vm08.stdout:3/648: mkdir d0/d3c/d1f/d44/d51/dcd 0
2026-03-10T07:51:41.067 INFO:tasks.workunit.client.1.vm08.stdout:6/693: rename d1/db/d24/d73/f78 to d1/d17/d2b/d58/d77/daf/db1/fe4 0
2026-03-10T07:51:41.069 INFO:tasks.workunit.client.1.vm08.stdout:2/672: mkdir d0/d1/d17/d6b/da0/dd7 0
2026-03-10T07:51:41.075 INFO:tasks.workunit.client.1.vm08.stdout:3/649: symlink d0/d3c/d18/d80/lce 0
2026-03-10T07:51:41.079 INFO:tasks.workunit.client.1.vm08.stdout:3/650: dwrite d0/d3c/d1f/d44/d51/d2d/d85/fcb [0,4194304] 0
2026-03-10T07:51:41.080 INFO:tasks.workunit.client.1.vm08.stdout:5/703: rename d0/d8/d24/la7 to d0/d77/le9 0
2026-03-10T07:51:41.081 INFO:tasks.workunit.client.1.vm08.stdout:3/651: chown d0/d3c/d1f/d89/cb8 84812615 1
2026-03-10T07:51:41.081 INFO:tasks.workunit.client.1.vm08.stdout:6/694: creat d1/d3/df/d52/fe5 x:0 0 0
2026-03-10T07:51:41.082 INFO:tasks.workunit.client.1.vm08.stdout:5/704: truncate d0/d4/d19/d60/d6d/d70/fe1 512439 0
2026-03-10T07:51:41.094 INFO:tasks.workunit.client.1.vm08.stdout:9/658: mkdir d2/d58/dbf/ddf 0
2026-03-10T07:51:41.097 INFO:tasks.workunit.client.1.vm08.stdout:4/576: getdents d5/d1f/dad 0
2026-03-10T07:51:41.097 INFO:tasks.workunit.client.1.vm08.stdout:5/705: dread d0/d4/fab [0,4194304] 0
2026-03-10T07:51:41.098 INFO:tasks.workunit.client.1.vm08.stdout:5/706: readlink d0/d33/l39 0
2026-03-10T07:51:41.100 INFO:tasks.workunit.client.1.vm08.stdout:7/634: rename d3/da/d25/d9/d2f/d3a/d71/f8b to d3/da/d25/d9/fd6 0
2026-03-10T07:51:41.101 INFO:tasks.workunit.client.1.vm08.stdout:1/621: truncate d2/d6/d3a/f7d 3981206 0
2026-03-10T07:51:41.103 INFO:tasks.workunit.client.1.vm08.stdout:4/577: dwrite d5/d1f/d31/f82 [0,4194304] 0
2026-03-10T07:51:41.107 INFO:tasks.workunit.client.1.vm08.stdout:4/578: truncate d5/da0/d12/d7b/d48/d4f/d7c/fb2 131905 0
2026-03-10T07:51:41.116 INFO:tasks.workunit.client.1.vm08.stdout:0/678: write dd/d10/d2f/d37/d64/d52/f74 [218300,18005] 0
2026-03-10T07:51:41.141 INFO:tasks.workunit.client.1.vm08.stdout:6/695: symlink d1/d3/df/d52/le6 0
2026-03-10T07:51:41.141 INFO:tasks.workunit.client.1.vm08.stdout:3/652: rename d0/d3c/d18/d32/cac to d0/d3c/d18/d80/ccf 0
2026-03-10T07:51:41.141 INFO:tasks.workunit.client.1.vm08.stdout:1/622: symlink d2/d6/de/d47/ldb 0
2026-03-10T07:51:41.141 INFO:tasks.workunit.client.1.vm08.stdout:4/579: mknod d5/da0/d12/d7b/d48/d4f/d8d/d91/cc8 0
2026-03-10T07:51:41.141 INFO:tasks.workunit.client.1.vm08.stdout:6/696: mkdir d1/d17/d2b/d58/de7 0
2026-03-10T07:51:41.141 INFO:tasks.workunit.client.1.vm08.stdout:0/679: creat dd/d10/d2f/d37/d64/d95/d58/fd9 x:0 0 0
2026-03-10T07:51:41.141 INFO:tasks.workunit.client.1.vm08.stdout:6/697: chown d1/d3/df/d1d/d40/d45/fdf 40197765 1
2026-03-10T07:51:41.141 INFO:tasks.workunit.client.1.vm08.stdout:0/680: truncate dd/d10/d14/d15/d20/f7e 329273 0
2026-03-10T07:51:41.141 INFO:tasks.workunit.client.1.vm08.stdout:9/659: getdents d2/d58/dbf/ddf 0
2026-03-10T07:51:41.141 INFO:tasks.workunit.client.1.vm08.stdout:5/707: sync
2026-03-10T07:51:41.142 INFO:tasks.workunit.client.1.vm08.stdout:1/623: dwrite d2/d6/d50/f7f [0,4194304] 0
2026-03-10T07:51:41.143 INFO:tasks.workunit.client.1.vm08.stdout:3/653: creat d0/d3c/d18/d32/d61/d52/dca/fd0 x:0 0 0
2026-03-10T07:51:41.144 INFO:tasks.workunit.client.1.vm08.stdout:5/708: chown d0/d77/d83/fdb 3 1
2026-03-10T07:51:41.144 INFO:tasks.workunit.client.1.vm08.stdout:3/654: chown d0/d3c/f87 9149 1
2026-03-10T07:51:41.152 INFO:tasks.workunit.client.1.vm08.stdout:3/655: rmdir d0/d3c/d18/d32/d61/d52 39
2026-03-10T07:51:41.152 INFO:tasks.workunit.client.1.vm08.stdout:1/624: symlink d2/d6/de/d1f/d26/d58/d83/dc2/ldc 0
2026-03-10T07:51:41.153 INFO:tasks.workunit.client.1.vm08.stdout:5/709: dread d0/d4/d19/d60/fb6 [0,4194304] 0
2026-03-10T07:51:41.157 INFO:tasks.workunit.client.1.vm08.stdout:9/660: getdents d2/d26 0
2026-03-10T07:51:41.161 INFO:tasks.workunit.client.1.vm08.stdout:9/661: write d2/d58/fc6 [2150038,77571] 0
2026-03-10T07:51:41.161 INFO:tasks.workunit.client.1.vm08.stdout:0/681: dwrite dd/d10/d2f/d37/d64/d95/d58/d3d/f83 [0,4194304] 0
2026-03-10T07:51:41.163 INFO:tasks.workunit.client.1.vm08.stdout:0/682: readlink dd/d10/d14/d15/d20/d92/ld7 0
2026-03-10T07:51:41.168 INFO:tasks.workunit.client.1.vm08.stdout:6/698: dread d1/f49 [0,4194304] 0
2026-03-10T07:51:41.169 INFO:tasks.workunit.client.1.vm08.stdout:5/710: sync
2026-03-10T07:51:41.172 INFO:tasks.workunit.client.1.vm08.stdout:9/662: dwrite d2/d58/dbf/dd0/d35/d97/d9d/fcc [0,4194304] 0
2026-03-10T07:51:41.173 INFO:tasks.workunit.client.1.vm08.stdout:0/683: read dd/d10/d2f/d37/d64/d95/d58/d3d/faa [209778,101755] 0
2026-03-10T07:51:41.174 INFO:tasks.workunit.client.1.vm08.stdout:4/580: dread d5/da0/d32/f44 [0,4194304] 0
2026-03-10T07:51:41.174 INFO:tasks.workunit.client.1.vm08.stdout:0/684: write dd/d10/d14/d15/d20/fc5 [1100177,85372] 0
2026-03-10T07:51:41.181 INFO:tasks.workunit.client.1.vm08.stdout:6/699: symlink d1/d3/d3e/le8 0
2026-03-10T07:51:41.183 INFO:tasks.workunit.client.1.vm08.stdout:6/700: write d1/d3/d3e/f81 [1615689,20600] 0
2026-03-10T07:51:41.186 INFO:tasks.workunit.client.1.vm08.stdout:8/775: write d0/df/f9f [587516,129537] 0
2026-03-10T07:51:41.200 INFO:tasks.workunit.client.1.vm08.stdout:2/673: write d0/d1/d3/d10/d38/f60 [4151097,9309] 0
2026-03-10T07:51:41.201 INFO:tasks.workunit.client.1.vm08.stdout:2/674: chown d0/d1/d3/d56/d78/dad 51709764 1
2026-03-10T07:51:41.205 INFO:tasks.workunit.client.1.vm08.stdout:0/685: dwrite dd/d10/d2f/d37/d64/d95/d58/d3d/d9b/fb4 [0,4194304] 0
2026-03-10T07:51:41.206 INFO:tasks.workunit.client.1.vm08.stdout:1/625: link d2/d6/de/f74 d2/d6/de/d47/dbd/dc3/fdd 0
2026-03-10T07:51:41.219 INFO:tasks.workunit.client.1.vm08.stdout:9/663: dread d2/de/f4d [0,4194304] 0
2026-03-10T07:51:41.221 INFO:tasks.workunit.client.1.vm08.stdout:6/701: getdents d1/d17/d2b/d5e/da8 0
2026-03-10T07:51:41.223 INFO:tasks.workunit.client.1.vm08.stdout:1/626: mknod d2/d6/de/d1f/d26/d89/d8e/cde 0
2026-03-10T07:51:41.224 INFO:tasks.workunit.client.1.vm08.stdout:9/664: mknod d2/d58/dbf/dd0/ce0 0
2026-03-10T07:51:41.226 INFO:tasks.workunit.client.1.vm08.stdout:9/665: chown d2/d58/dbf/f75 158 1
2026-03-10T07:51:41.227 INFO:tasks.workunit.client.1.vm08.stdout:2/675: creat d0/d1/d3/d10/d38/daf/fd8 x:0 0 0
2026-03-10T07:51:41.228 INFO:tasks.workunit.client.1.vm08.stdout:6/702: dwrite d1/d3/df/d1d/f9b [0,4194304] 0
2026-03-10T07:51:41.230 INFO:tasks.workunit.client.1.vm08.stdout:6/703: readlink d1/d3/df/d1d/d6f/lcf 0
2026-03-10T07:51:41.230 INFO:tasks.workunit.client.1.vm08.stdout:7/635: dwrite d3/f34 [4194304,4194304] 0
2026-03-10T07:51:41.231 INFO:tasks.workunit.client.1.vm08.stdout:6/704: chown d1/d7d/f91 21161 1
2026-03-10T07:51:41.232 INFO:tasks.workunit.client.1.vm08.stdout:0/686: rename dd/d10/c2b to dd/d10/d2f/d37/d64/d95/d58/d3d/cda 0
2026-03-10T07:51:41.234 INFO:tasks.workunit.client.1.vm08.stdout:1/627: symlink d2/d6/d9f/ldf 0
2026-03-10T07:51:41.237 INFO:tasks.workunit.client.1.vm08.stdout:9/666: unlink d2/d26/f61 0
2026-03-10T07:51:41.240 INFO:tasks.workunit.client.1.vm08.stdout:9/667: truncate d2/d3/f5f 3442592 0
2026-03-10T07:51:41.242 INFO:tasks.workunit.client.1.vm08.stdout:6/705: dread d1/d3/df/d1d/d40/d87/fb3 [0,4194304] 0
2026-03-10T07:51:41.242 INFO:tasks.workunit.client.1.vm08.stdout:9/668: read d2/d58/d73/f7e [4136342,35453] 0
2026-03-10T07:51:41.243 INFO:tasks.workunit.client.1.vm08.stdout:2/676: rename d0/d1/d3/d56/f70 to d0/d1/d3/d10/d38/daf/fd9 0
2026-03-10T07:51:41.244 INFO:tasks.workunit.client.1.vm08.stdout:1/628: creat d2/d6/de/d47/da0/fe0 x:0 0 0
2026-03-10T07:51:41.247 INFO:tasks.workunit.client.1.vm08.stdout:7/636: creat d3/da/d25/d9/d2f/d39/d43/d4f/d5b/db7/dac/fd7 x:0 0 0
2026-03-10T07:51:41.259 INFO:tasks.workunit.client.1.vm08.stdout:9/669: creat d2/d58/dbf/dd0/d35/d97/fe1 x:0 0 0
2026-03-10T07:51:41.263 INFO:tasks.workunit.client.1.vm08.stdout:6/706: symlink d1/d3/df/d44/le9 0
2026-03-10T07:51:41.263 INFO:tasks.workunit.client.1.vm08.stdout:2/677: unlink d0/d1/d3/d39/cac 0
2026-03-10T07:51:41.264 INFO:tasks.workunit.client.1.vm08.stdout:2/678: stat d0/d1/d17/cd 0
2026-03-10T07:51:41.264 INFO:tasks.workunit.client.1.vm08.stdout:6/707: chown d1/db/d24/dac/ld5 27 1
2026-03-10T07:51:41.269 INFO:tasks.workunit.client.1.vm08.stdout:3/656: write d0/f45 [2142801,109244] 0
2026-03-10T07:51:41.279 INFO:tasks.workunit.client.1.vm08.stdout:2/679: symlink d0/d1/d3/d3e/lda 0
2026-03-10T07:51:41.279 INFO:tasks.workunit.client.1.vm08.stdout:5/711: dwrite d0/d8/f85 [0,4194304] 0
2026-03-10T07:51:41.279 INFO:tasks.workunit.client.1.vm08.stdout:5/712: creat d0/d4/d19/d60/d6d/fea x:0 0 0
2026-03-10T07:51:41.280 INFO:tasks.workunit.client.1.vm08.stdout:5/713: symlink d0/d77/leb 0
2026-03-10T07:51:41.282 INFO:tasks.workunit.client.1.vm08.stdout:2/680: link d0/d1/d3/d56/d57/f79 d0/d1/d3/d56/fdb 0
2026-03-10T07:51:41.287 INFO:tasks.workunit.client.1.vm08.stdout:2/681: creat d0/d1/d3/d56/d78/dad/fdc x:0 0 0
2026-03-10T07:51:41.290 INFO:tasks.workunit.client.1.vm08.stdout:2/682: mknod d0/d1/d3/d56/d78/dad/db1/d61/d84/cdd 0
2026-03-10T07:51:41.299 INFO:tasks.workunit.client.1.vm08.stdout:2/683: mkdir d0/d1/d17/db2/dde 0
2026-03-10T07:51:41.299 INFO:tasks.workunit.client.1.vm08.stdout:2/684: symlink d0/d1/d3/d10/d38/daf/ldf 0
2026-03-10T07:51:41.303 INFO:tasks.workunit.client.1.vm08.stdout:4/581: write d5/d1f/d31/f38 [2308789,19370] 0
2026-03-10T07:51:41.307 INFO:tasks.workunit.client.1.vm08.stdout:4/582: link d5/da0/d32/c35 d5/d1f/cc9 0
2026-03-10T07:51:41.310 INFO:tasks.workunit.client.1.vm08.stdout:4/583: creat d5/da0/d12/d7b/d48/d4f/d8d/d91/fca x:0 0 0
2026-03-10T07:51:41.310 INFO:tasks.workunit.client.1.vm08.stdout:4/584: chown d5/d8/d50/c7d 30697 1
2026-03-10T07:51:41.320 INFO:tasks.workunit.client.1.vm08.stdout:4/585: dwrite d5/d8/fc1 [0,4194304] 0
2026-03-10T07:51:41.320 INFO:tasks.workunit.client.1.vm08.stdout:4/586: rmdir d5/da0/d12 39
2026-03-10T07:51:41.320 INFO:tasks.workunit.client.1.vm08.stdout:4/587: readlink d5/l88 0
2026-03-10T07:51:41.321 INFO:tasks.workunit.client.1.vm08.stdout:4/588: unlink d5/d1f/d31/c76 0
2026-03-10T07:51:41.346 INFO:tasks.workunit.client.1.vm08.stdout:8/776: dwrite d0/df/f12 [0,4194304] 0
2026-03-10T07:51:41.354 INFO:tasks.workunit.client.1.vm08.stdout:8/777: rmdir d0/df/d15/d23 39
2026-03-10T07:51:41.354 INFO:tasks.workunit.client.1.vm08.stdout:9/670: write d2/d26/f29 [4978977,120323] 0
2026-03-10T07:51:41.354 INFO:tasks.workunit.client.1.vm08.stdout:8/778: readlink d0/df/d2e/d30/lca 0
2026-03-10T07:51:41.358 INFO:tasks.workunit.client.1.vm08.stdout:8/779: creat d0/df/d15/d23/d39/d5b/dbc/ffc x:0 0 0
2026-03-10T07:51:41.358 INFO:tasks.workunit.client.1.vm08.stdout:9/671: dwrite d2/d26/f29 [4194304,4194304] 0
2026-03-10T07:51:41.362 INFO:tasks.workunit.client.1.vm08.stdout:9/672: link d2/de/c59 d2/de/d28/d98/dbb/dd9/ce2 0
2026-03-10T07:51:41.363 INFO:tasks.workunit.client.1.vm08.stdout:8/780: rename d0/df/d15/d23/d39/d5b/d4a/f98 to d0/df/d15/d23/d39/d5b/ffd 0
2026-03-10T07:51:41.365 INFO:tasks.workunit.client.1.vm08.stdout:8/781: rmdir d0/df/d15/d23/d39/d5b/dea 39
2026-03-10T07:51:41.366 INFO:tasks.workunit.client.1.vm08.stdout:9/673: dwrite d2/d26/f29 [8388608,4194304] 0
2026-03-10T07:51:41.366 INFO:tasks.workunit.client.1.vm08.stdout:9/674: fdatasync d2/f51 0
2026-03-10T07:51:41.367 INFO:tasks.workunit.client.1.vm08.stdout:9/675: write d2/de/d28/d98/fc4 [786172,55737] 0
2026-03-10T07:51:41.374 INFO:tasks.workunit.client.1.vm08.stdout:9/676: symlink d2/d3/d84/le3 0
2026-03-10T07:51:41.375 INFO:tasks.workunit.client.1.vm08.stdout:8/782: rename d0/d37/d86/lda to d0/df/d17/d25/lfe 0
2026-03-10T07:51:41.376 INFO:tasks.workunit.client.1.vm08.stdout:8/783: chown d0/d37/d86 166275554 1
2026-03-10T07:51:41.376 INFO:tasks.workunit.client.1.vm08.stdout:8/784: write d0/df/d5d/fcf [5049265,75797] 0
2026-03-10T07:51:41.380 INFO:tasks.workunit.client.1.vm08.stdout:9/677: dread d2/d58/dbf/f75 [0,4194304] 0
2026-03-10T07:51:41.380 INFO:tasks.workunit.client.1.vm08.stdout:9/678: dread - d2/de/d28/d98/fd4 zero size
2026-03-10T07:51:41.383 INFO:tasks.workunit.client.1.vm08.stdout:9/679: creat d2/d58/d73/fe4 x:0 0 0
2026-03-10T07:51:41.385 INFO:tasks.workunit.client.1.vm08.stdout:8/785: creat d0/df/d15/d23/d39/d5b/fff x:0 0 0
2026-03-10T07:51:41.387 INFO:tasks.workunit.client.1.vm08.stdout:8/786: chown d0/df/d15/d23/d39/f3e 47945903 1
2026-03-10T07:51:41.387 INFO:tasks.workunit.client.1.vm08.stdout:8/787: chown d0/df/f1b 1723 1
2026-03-10T07:51:41.391 INFO:tasks.workunit.client.1.vm08.stdout:9/680: getdents d2 0
2026-03-10T07:51:41.396 INFO:tasks.workunit.client.1.vm08.stdout:8/788: getdents d0/df/d17 0
2026-03-10T07:51:41.409 INFO:tasks.workunit.client.1.vm08.stdout:8/789: rename d0/d69/d3f/c93 to d0/df/d15/d95/c100 0
2026-03-10T07:51:41.409 INFO:tasks.workunit.client.1.vm08.stdout:8/790: dread - d0/df/d15/d23/d54/dba/d89/fac zero size
2026-03-10T07:51:41.409 INFO:tasks.workunit.client.1.vm08.stdout:8/791: dwrite d0/df/fdd [0,4194304] 0
2026-03-10T07:51:41.411 INFO:tasks.workunit.client.1.vm08.stdout:8/792: creat d0/df/d2e/f101 x:0 0 0
2026-03-10T07:51:41.416 INFO:tasks.workunit.client.1.vm08.stdout:8/793: creat d0/df/d2e/f102 x:0 0 0
2026-03-10T07:51:41.416 INFO:tasks.workunit.client.1.vm08.stdout:8/794: readlink d0/df/d15/d23/da8/lc8 0
2026-03-10T07:51:41.419 INFO:tasks.workunit.client.1.vm08.stdout:8/795: rmdir d0/df/d15/d9c 39
2026-03-10T07:51:41.452 INFO:tasks.workunit.client.1.vm08.stdout:0/687: truncate
dd/d10/d14/d15/d20/fc5 3747476 0 2026-03-10T07:51:41.457 INFO:tasks.workunit.client.1.vm08.stdout:1/629: dwrite d2/d6/de/d1f/d26/d89/d8e/fbf [0,4194304] 0 2026-03-10T07:51:41.458 INFO:tasks.workunit.client.1.vm08.stdout:7/637: dwrite d3/da/d25/d9/d2f/d39/d43/d4f/d5b/db7/f49 [0,4194304] 0 2026-03-10T07:51:41.464 INFO:tasks.workunit.client.1.vm08.stdout:3/657: truncate d0/d3c/d1f/d44/d51/d2d/d85/fa0 1669206 0 2026-03-10T07:51:41.465 INFO:tasks.workunit.client.1.vm08.stdout:0/688: rmdir dd/d10/d2f/d37 39 2026-03-10T07:51:41.466 INFO:tasks.workunit.client.1.vm08.stdout:0/689: chown dd/d10/d2f/c67 111 1 2026-03-10T07:51:41.466 INFO:tasks.workunit.client.1.vm08.stdout:6/708: dwrite d1/d3/d3e/f5b [0,4194304] 0 2026-03-10T07:51:41.470 INFO:tasks.workunit.client.1.vm08.stdout:5/714: truncate d0/d4/df/d82/f8d 4159664 0 2026-03-10T07:51:41.471 INFO:tasks.workunit.client.1.vm08.stdout:6/709: chown d1/d17/d2b/d5e/lb5 109294 1 2026-03-10T07:51:41.476 INFO:tasks.workunit.client.1.vm08.stdout:7/638: creat d3/da/d25/d9/d2f/d3a/d71/fd8 x:0 0 0 2026-03-10T07:51:41.480 INFO:tasks.workunit.client.1.vm08.stdout:2/685: write d0/d1/d3/d39/d7d/d86/d55/d1b/f23 [700980,24711] 0 2026-03-10T07:51:41.480 INFO:tasks.workunit.client.1.vm08.stdout:7/639: readlink d3/da/d25/d9/d2f/d39/d43/d4f/lae 0 2026-03-10T07:51:41.480 INFO:tasks.workunit.client.1.vm08.stdout:2/686: chown d0/d1/d3/d39/d7d/d86/d55/dc9/fd1 2012676949 1 2026-03-10T07:51:41.486 INFO:tasks.workunit.client.1.vm08.stdout:0/690: fsync dd/d10/d2f/f4b 0 2026-03-10T07:51:41.488 INFO:tasks.workunit.client.1.vm08.stdout:6/710: mkdir d1/db/d24/d3d/dea 0 2026-03-10T07:51:41.491 INFO:tasks.workunit.client.1.vm08.stdout:7/640: mknod d3/da/d25/d9/d2f/d3a/d71/dca/cd9 0 2026-03-10T07:51:41.492 INFO:tasks.workunit.client.1.vm08.stdout:4/589: write d5/d8/f39 [1309341,80639] 0 2026-03-10T07:51:41.497 INFO:tasks.workunit.client.1.vm08.stdout:4/590: dwrite d5/d1f/d31/f38 [0,4194304] 0 2026-03-10T07:51:41.498 
INFO:tasks.workunit.client.1.vm08.stdout:4/591: fdatasync d5/d1f/d9b/fa8 0 2026-03-10T07:51:41.503 INFO:tasks.workunit.client.1.vm08.stdout:3/658: getdents d0/d3c/d18/da9/dcc 0 2026-03-10T07:51:41.504 INFO:tasks.workunit.client.1.vm08.stdout:3/659: readlink d0/d3c/d18/da9/lae 0 2026-03-10T07:51:41.504 INFO:tasks.workunit.client.1.vm08.stdout:5/715: mkdir d0/d4/d19/d43/ddc/dec 0 2026-03-10T07:51:41.506 INFO:tasks.workunit.client.1.vm08.stdout:4/592: readlink d5/d8/l2a 0 2026-03-10T07:51:41.508 INFO:tasks.workunit.client.1.vm08.stdout:0/691: mkdir dd/d10/d2f/d37/d64/d95/d5c/dca/ddb 0 2026-03-10T07:51:41.509 INFO:tasks.workunit.client.1.vm08.stdout:3/660: rename d0/d3c/d1f to d0/d3c/d1f/dd1 22 2026-03-10T07:51:41.509 INFO:tasks.workunit.client.1.vm08.stdout:0/692: chown dd/d10/d14/d15/d20/d22/dc6 3502608 1 2026-03-10T07:51:41.509 INFO:tasks.workunit.client.1.vm08.stdout:7/641: mkdir d3/da/d25/d9/d2f/d3a/dc0/dda 0 2026-03-10T07:51:41.511 INFO:tasks.workunit.client.1.vm08.stdout:0/693: write dd/d10/d2f/d37/d64/d95/d5c/dca/fce [871792,45628] 0 2026-03-10T07:51:41.513 INFO:tasks.workunit.client.1.vm08.stdout:7/642: read - d3/da/d25/f94 zero size 2026-03-10T07:51:41.517 INFO:tasks.workunit.client.1.vm08.stdout:0/694: dread dd/d10/d2f/d37/d64/d95/d58/d3d/faa [0,4194304] 0 2026-03-10T07:51:41.521 INFO:tasks.workunit.client.1.vm08.stdout:4/593: rename d5/da0/d12/d7b/d48/da2/lb3 to d5/lcb 0 2026-03-10T07:51:41.521 INFO:tasks.workunit.client.1.vm08.stdout:7/643: creat d3/da/d8a/dd1/fdb x:0 0 0 2026-03-10T07:51:41.521 INFO:tasks.workunit.client.1.vm08.stdout:4/594: chown d5/da0/cb 7 1 2026-03-10T07:51:41.522 INFO:tasks.workunit.client.1.vm08.stdout:9/681: write d2/d3/fa [1149324,32674] 0 2026-03-10T07:51:41.524 INFO:tasks.workunit.client.1.vm08.stdout:4/595: dread d5/da0/d12/d7b/d48/d4f/d7c/fb2 [0,4194304] 0 2026-03-10T07:51:41.527 INFO:tasks.workunit.client.1.vm08.stdout:3/661: getdents d0/d3c/d1f/d44/d51/d2d/d85 0 2026-03-10T07:51:41.529 
INFO:tasks.workunit.client.1.vm08.stdout:0/695: link dd/d10/d2f/d37/d64/d95/d58/fb2 dd/d10/d14/da6/fdc 0 2026-03-10T07:51:41.531 INFO:tasks.workunit.client.1.vm08.stdout:5/716: dread d0/d4/df/d12/f97 [0,4194304] 0 2026-03-10T07:51:41.531 INFO:tasks.workunit.client.1.vm08.stdout:7/644: write d3/da/d25/d9/d2f/d39/fc7 [2242358,51261] 0 2026-03-10T07:51:41.535 INFO:tasks.workunit.client.1.vm08.stdout:3/662: fsync d0/d3c/d1f/d44/d51/d2d/f3a 0 2026-03-10T07:51:41.536 INFO:tasks.workunit.client.1.vm08.stdout:0/696: dwrite dd/d10/fb8 [0,4194304] 0 2026-03-10T07:51:41.546 INFO:tasks.workunit.client.1.vm08.stdout:0/697: read dd/d10/d2f/f8e [174200,86605] 0 2026-03-10T07:51:41.547 INFO:tasks.workunit.client.1.vm08.stdout:9/682: dread d2/d58/dbf/f90 [0,4194304] 0 2026-03-10T07:51:41.551 INFO:tasks.workunit.client.1.vm08.stdout:3/663: mkdir d0/d3c/d18/d32/d61/d52/dca/dd2 0 2026-03-10T07:51:41.552 INFO:tasks.workunit.client.1.vm08.stdout:9/683: fsync d2/d58/dbf/dd0/d35/d9b/fa1 0 2026-03-10T07:51:41.554 INFO:tasks.workunit.client.1.vm08.stdout:1/630: dwrite d2/d6/de/d5f/fb1 [0,4194304] 0 2026-03-10T07:51:41.557 INFO:tasks.workunit.client.1.vm08.stdout:9/684: readlink d2/d3/lf 0 2026-03-10T07:51:41.558 INFO:tasks.workunit.client.1.vm08.stdout:5/717: rename d0/d4/df/dbf/cbd to d0/d4/ced 0 2026-03-10T07:51:41.572 INFO:tasks.workunit.client.1.vm08.stdout:3/664: truncate d0/d3c/d18/d80/f86 544639 0 2026-03-10T07:51:41.572 INFO:tasks.workunit.client.1.vm08.stdout:5/718: mknod d0/d4/d19/d60/cee 0 2026-03-10T07:51:41.572 INFO:tasks.workunit.client.1.vm08.stdout:3/665: truncate d0/d3c/d18/f38 3523972 0 2026-03-10T07:51:41.572 INFO:tasks.workunit.client.1.vm08.stdout:9/685: mknod d2/d58/dbf/ddf/ce5 0 2026-03-10T07:51:41.573 INFO:tasks.workunit.client.1.vm08.stdout:5/719: rmdir d0/d4/d19/d81/da4 39 2026-03-10T07:51:41.573 INFO:tasks.workunit.client.1.vm08.stdout:3/666: creat d0/d3c/d18/d48/d55/d56/fd3 x:0 0 0 2026-03-10T07:51:41.573 INFO:tasks.workunit.client.1.vm08.stdout:5/720: creat 
d0/d8/d24/fef x:0 0 0 2026-03-10T07:51:41.573 INFO:tasks.workunit.client.1.vm08.stdout:5/721: creat d0/d8/d24/ff0 x:0 0 0 2026-03-10T07:51:41.575 INFO:tasks.workunit.client.1.vm08.stdout:5/722: dread d0/d4/f75 [0,4194304] 0 2026-03-10T07:51:41.581 INFO:tasks.workunit.client.1.vm08.stdout:4/596: dread d5/da0/d12/d7b/d48/d4f/d7c/f9f [0,4194304] 0 2026-03-10T07:51:41.581 INFO:tasks.workunit.client.1.vm08.stdout:5/723: truncate d0/d8/d24/de2/fe6 1001834 0 2026-03-10T07:51:41.581 INFO:tasks.workunit.client.1.vm08.stdout:4/597: mknod d5/d8/ccc 0 2026-03-10T07:51:41.581 INFO:tasks.workunit.client.1.vm08.stdout:4/598: write d5/da0/d32/fbb [165773,1001] 0 2026-03-10T07:51:41.581 INFO:tasks.workunit.client.1.vm08.stdout:4/599: dread - d5/da0/d32/fae zero size 2026-03-10T07:51:41.581 INFO:tasks.workunit.client.1.vm08.stdout:4/600: dread - d5/d1f/d70/fbd zero size 2026-03-10T07:51:41.582 INFO:tasks.workunit.client.1.vm08.stdout:0/698: sync 2026-03-10T07:51:41.584 INFO:tasks.workunit.client.1.vm08.stdout:4/601: creat d5/da0/d12/d7b/d48/d4f/d8d/fcd x:0 0 0 2026-03-10T07:51:41.588 INFO:tasks.workunit.client.1.vm08.stdout:5/724: dread d0/d4/df/f27 [0,4194304] 0 2026-03-10T07:51:41.589 INFO:tasks.workunit.client.1.vm08.stdout:4/602: mkdir d5/d1f/d31/dce 0 2026-03-10T07:51:41.590 INFO:tasks.workunit.client.1.vm08.stdout:5/725: mkdir d0/d4/d19/d3a/df1 0 2026-03-10T07:51:41.591 INFO:tasks.workunit.client.1.vm08.stdout:4/603: truncate d5/da0/d12/d7b/d48/f5b 775509 0 2026-03-10T07:51:41.592 INFO:tasks.workunit.client.1.vm08.stdout:5/726: creat d0/d4/d19/d81/d92/ff2 x:0 0 0 2026-03-10T07:51:41.597 INFO:tasks.workunit.client.1.vm08.stdout:5/727: write d0/d4/d19/d43/f35 [1010687,106838] 0 2026-03-10T07:51:41.602 INFO:tasks.workunit.client.1.vm08.stdout:4/604: dwrite d5/da0/d32/fbb [0,4194304] 0 2026-03-10T07:51:41.608 INFO:tasks.workunit.client.1.vm08.stdout:5/728: mkdir d0/d4/df/d82/df3 0 2026-03-10T07:51:41.611 INFO:tasks.workunit.client.1.vm08.stdout:4/605: mkdir d5/d1f/dad/dcf 0 
2026-03-10T07:51:41.612 INFO:tasks.workunit.client.1.vm08.stdout:4/606: write d5/d8/fc1 [1825271,101618] 0 2026-03-10T07:51:41.615 INFO:tasks.workunit.client.1.vm08.stdout:4/607: read d5/d8/fc1 [744977,28689] 0 2026-03-10T07:51:41.619 INFO:tasks.workunit.client.1.vm08.stdout:5/729: getdents d0/d4/d19/d60/d6d/d70 0 2026-03-10T07:51:41.619 INFO:tasks.workunit.client.1.vm08.stdout:5/730: write d0/d4/df/dbf/d41/f5c [3634042,86203] 0 2026-03-10T07:51:41.619 INFO:tasks.workunit.client.1.vm08.stdout:4/608: getdents d5/da0/d95/dc2 0 2026-03-10T07:51:41.620 INFO:tasks.workunit.client.1.vm08.stdout:4/609: mknod d5/d1f/dad/db8/cd0 0 2026-03-10T07:51:41.622 INFO:tasks.workunit.client.1.vm08.stdout:7/645: dread d3/da/d25/d9/d2f/d39/f56 [4194304,4194304] 0 2026-03-10T07:51:41.623 INFO:tasks.workunit.client.1.vm08.stdout:5/731: truncate d0/d4/d19/d81/da4/fbe 435776 0 2026-03-10T07:51:41.625 INFO:tasks.workunit.client.1.vm08.stdout:4/610: dread d5/da0/d32/fbb [0,4194304] 0 2026-03-10T07:51:41.629 INFO:tasks.workunit.client.1.vm08.stdout:2/687: write d0/d1/d3/d56/d78/dad/db1/d61/d8e/fa1 [449562,28368] 0 2026-03-10T07:51:41.629 INFO:tasks.workunit.client.1.vm08.stdout:6/711: write d1/db/f4e [2883618,83141] 0 2026-03-10T07:51:41.630 INFO:tasks.workunit.client.1.vm08.stdout:2/688: read d0/d1/d17/d6b/f9a [787562,15463] 0 2026-03-10T07:51:41.634 INFO:tasks.workunit.client.1.vm08.stdout:8/796: dwrite d0/df/d15/d23/d39/d5b/ffd [0,4194304] 0 2026-03-10T07:51:41.635 INFO:tasks.workunit.client.1.vm08.stdout:8/797: fdatasync d0/df/d17/d25/fc4 0 2026-03-10T07:51:41.647 INFO:tasks.workunit.client.1.vm08.stdout:7/646: creat d3/da/d25/d9/d2f/d3a/d4b/fdc x:0 0 0 2026-03-10T07:51:41.652 INFO:tasks.workunit.client.1.vm08.stdout:5/732: fsync d0/d4/d19/d81/d92/fb1 0 2026-03-10T07:51:41.656 INFO:tasks.workunit.client.1.vm08.stdout:1/631: write d2/d6/de/d70/fb4 [882169,20293] 0 2026-03-10T07:51:41.658 INFO:tasks.workunit.client.1.vm08.stdout:5/733: dwrite d0/d4/d19/d43/f7c [0,4194304] 0 
2026-03-10T07:51:41.667 INFO:tasks.workunit.client.1.vm08.stdout:1/632: sync 2026-03-10T07:51:41.669 INFO:tasks.workunit.client.1.vm08.stdout:2/689: chown d0/c2 0 1 2026-03-10T07:51:41.669 INFO:tasks.workunit.client.1.vm08.stdout:2/690: readlink d0/d1/d3/d10/d38/l47 0 2026-03-10T07:51:41.676 INFO:tasks.workunit.client.1.vm08.stdout:9/686: dwrite d2/fd [4194304,4194304] 0 2026-03-10T07:51:41.688 INFO:tasks.workunit.client.1.vm08.stdout:6/712: fdatasync d1/db/d24/dac/dad/f59 0 2026-03-10T07:51:41.688 INFO:tasks.workunit.client.1.vm08.stdout:8/798: creat d0/df/d15/d23/d39/d5b/d4a/f103 x:0 0 0 2026-03-10T07:51:41.689 INFO:tasks.workunit.client.1.vm08.stdout:8/799: chown d0/d37/d86/le7 588214 1 2026-03-10T07:51:41.691 INFO:tasks.workunit.client.1.vm08.stdout:3/667: dwrite d0/f9a [0,4194304] 0 2026-03-10T07:51:41.696 INFO:tasks.workunit.client.1.vm08.stdout:8/800: write d0/df/d15/d53/fee [192447,129320] 0 2026-03-10T07:51:41.696 INFO:tasks.workunit.client.1.vm08.stdout:7/647: mkdir d3/da/d25/d9/d2f/d39/d43/d4f/d5b/d79/ddd 0 2026-03-10T07:51:41.700 INFO:tasks.workunit.client.1.vm08.stdout:1/633: rmdir d2/d6/de/d1f/da9 39 2026-03-10T07:51:41.701 INFO:tasks.workunit.client.1.vm08.stdout:0/699: dwrite dd/d10/d2f/f90 [0,4194304] 0 2026-03-10T07:51:41.706 INFO:tasks.workunit.client.1.vm08.stdout:3/668: unlink d0/d3c/d1f/d44/cb7 0 2026-03-10T07:51:41.706 INFO:tasks.workunit.client.1.vm08.stdout:3/669: write d0/d3c/d18/d32/d61/d52/dca/fd0 [314903,120593] 0 2026-03-10T07:51:41.710 INFO:tasks.workunit.client.1.vm08.stdout:9/687: dread d2/d58/dbf/f21 [0,4194304] 0 2026-03-10T07:51:41.727 INFO:tasks.workunit.client.1.vm08.stdout:4/611: dwrite d5/d1f/d31/f33 [4194304,4194304] 0 2026-03-10T07:51:41.730 INFO:tasks.workunit.client.1.vm08.stdout:0/700: creat dd/d10/d2f/d37/d64/d95/d58/fdd x:0 0 0 2026-03-10T07:51:41.734 INFO:tasks.workunit.client.1.vm08.stdout:0/701: dwrite dd/d10/d14/f69 [0,4194304] 0 2026-03-10T07:51:41.744 INFO:tasks.workunit.client.1.vm08.stdout:2/691: dread 
d0/d1/d17/d6b/f6e [0,4194304] 0 2026-03-10T07:51:41.768 INFO:tasks.workunit.client.1.vm08.stdout:4/612: unlink d5/d1f/f25 0 2026-03-10T07:51:41.778 INFO:tasks.workunit.client.1.vm08.stdout:3/670: truncate d0/d3c/d1f/d44/f59 1272436 0 2026-03-10T07:51:41.790 INFO:tasks.workunit.client.1.vm08.stdout:5/734: write d0/d4/d19/d81/d92/f76 [560991,122408] 0 2026-03-10T07:51:41.791 INFO:tasks.workunit.client.1.vm08.stdout:8/801: write d0/df/f13 [7722627,119395] 0 2026-03-10T07:51:41.796 INFO:tasks.workunit.client.1.vm08.stdout:6/713: dwrite d1/db/f57 [0,4194304] 0 2026-03-10T07:51:41.798 INFO:tasks.workunit.client.1.vm08.stdout:7/648: dwrite d3/da/d25/d9/f30 [4194304,4194304] 0 2026-03-10T07:51:41.800 INFO:tasks.workunit.client.1.vm08.stdout:7/649: stat d3 0 2026-03-10T07:51:41.801 INFO:tasks.workunit.client.1.vm08.stdout:1/634: dwrite d2/d6/de/d1f/d8f/f91 [0,4194304] 0 2026-03-10T07:51:41.801 INFO:tasks.workunit.client.1.vm08.stdout:1/635: chown d2/d6/de/d5f/fb1 7964 1 2026-03-10T07:51:41.803 INFO:tasks.workunit.client.1.vm08.stdout:7/650: truncate d3/da/d25/d9/d2f/d39/db2/fd0 206635 0 2026-03-10T07:51:41.808 INFO:tasks.workunit.client.1.vm08.stdout:7/651: write d3/da/d25/f27 [900963,122174] 0 2026-03-10T07:51:41.808 INFO:tasks.workunit.client.1.vm08.stdout:0/702: read - dd/d10/d2f/d37/d64/d95/d58/f45 zero size 2026-03-10T07:51:41.813 INFO:tasks.workunit.client.1.vm08.stdout:7/652: stat d3/da/d25/d9/d2f/d39/d43/d4f/la5 0 2026-03-10T07:51:41.814 INFO:tasks.workunit.client.1.vm08.stdout:1/636: dread d2/d6/de/d1f/d22/f35 [4194304,4194304] 0 2026-03-10T07:51:41.815 INFO:tasks.workunit.client.1.vm08.stdout:7/653: chown d3/da/d25/d9/d2f/d3a 5177 1 2026-03-10T07:51:41.817 INFO:tasks.workunit.client.1.vm08.stdout:7/654: write d3/da/d8a/fcd [700164,2107] 0 2026-03-10T07:51:41.821 INFO:tasks.workunit.client.1.vm08.stdout:0/703: dwrite dd/d10/d2f/d37/d64/d95/f91 [0,4194304] 0 2026-03-10T07:51:41.832 INFO:tasks.workunit.client.1.vm08.stdout:6/714: dwrite d1/f49 [4194304,4194304] 0 
2026-03-10T07:51:41.836 INFO:tasks.workunit.client.1.vm08.stdout:2/692: getdents d0/d1/d17/db2/dc3 0 2026-03-10T07:51:41.836 INFO:tasks.workunit.client.1.vm08.stdout:3/671: mkdir d0/d3c/d18/dd4 0 2026-03-10T07:51:41.840 INFO:tasks.workunit.client.1.vm08.stdout:5/735: dread d0/d8/d5e/f6a [0,4194304] 0 2026-03-10T07:51:41.847 INFO:tasks.workunit.client.1.vm08.stdout:1/637: creat d2/d6/d9f/fe1 x:0 0 0 2026-03-10T07:51:41.848 INFO:tasks.workunit.client.1.vm08.stdout:1/638: stat d2/d6/de/d1f/d40/f4d 0 2026-03-10T07:51:41.848 INFO:tasks.workunit.client.1.vm08.stdout:0/704: read dd/d10/d14/d15/d20/d22/f6c [2424829,126155] 0 2026-03-10T07:51:41.849 INFO:tasks.workunit.client.1.vm08.stdout:7/655: truncate d3/da/d25/d9/d2f/d39/f56 8929941 0 2026-03-10T07:51:41.857 INFO:tasks.workunit.client.1.vm08.stdout:6/715: unlink d1/d3/df/d1d/d40/d45/c8a 0 2026-03-10T07:51:41.864 INFO:tasks.workunit.client.1.vm08.stdout:2/693: fdatasync d0/d1/d3/d56/d78/dad/db1/d61/f59 0 2026-03-10T07:51:41.864 INFO:tasks.workunit.client.1.vm08.stdout:3/672: creat d0/d3c/d18/da9/fd5 x:0 0 0 2026-03-10T07:51:41.865 INFO:tasks.workunit.client.1.vm08.stdout:5/736: rename d0/d4/d19/d3a/c4f to d0/d4/df/dbf/daf/cf4 0 2026-03-10T07:51:41.866 INFO:tasks.workunit.client.1.vm08.stdout:1/639: mknod d2/d6/d50/ce2 0 2026-03-10T07:51:41.869 INFO:tasks.workunit.client.1.vm08.stdout:7/656: creat d3/da/d25/d9/d2f/d3a/d4b/fde x:0 0 0 2026-03-10T07:51:41.870 INFO:tasks.workunit.client.1.vm08.stdout:6/716: readlink d1/d3/la 0 2026-03-10T07:51:41.871 INFO:tasks.workunit.client.1.vm08.stdout:2/694: dread d0/f1e [0,4194304] 0 2026-03-10T07:51:41.871 INFO:tasks.workunit.client.1.vm08.stdout:3/673: mknod d0/d3c/d1f/d44/d51/d2d/cd6 0 2026-03-10T07:51:41.872 INFO:tasks.workunit.client.1.vm08.stdout:8/802: link d0/df/l16 d0/df/d15/d23/l104 0 2026-03-10T07:51:41.879 INFO:tasks.workunit.client.1.vm08.stdout:5/737: creat d0/d4/df/ff5 x:0 0 0 2026-03-10T07:51:41.879 INFO:tasks.workunit.client.1.vm08.stdout:1/640: dread 
d2/d6/d3a/d61/d6f/f9d [0,4194304] 0 2026-03-10T07:51:41.881 INFO:tasks.workunit.client.1.vm08.stdout:6/717: rename d1/d3/ccc to d1/db/d24/d3d/dea/ceb 0 2026-03-10T07:51:41.885 INFO:tasks.workunit.client.1.vm08.stdout:1/641: read - d2/d6/d3a/f6d zero size 2026-03-10T07:51:41.892 INFO:tasks.workunit.client.1.vm08.stdout:1/642: read d2/d10/f3e [735648,118758] 0 2026-03-10T07:51:41.894 INFO:tasks.workunit.client.1.vm08.stdout:7/657: rename d3/da/d25/d9/d2f/d4d/db6/dc1 to d3/da/d25/d9/d2f/d3a/d71/d8c/ddf 0 2026-03-10T07:51:41.894 INFO:tasks.workunit.client.1.vm08.stdout:7/658: fdatasync d3/da/f21 0 2026-03-10T07:51:41.895 INFO:tasks.workunit.client.1.vm08.stdout:8/803: read d0/df/d15/d23/da8/f6a [2682279,116702] 0 2026-03-10T07:51:41.900 INFO:tasks.workunit.client.1.vm08.stdout:9/688: truncate d2/d58/dbf/dd0/d35/f6c 3223523 0 2026-03-10T07:51:41.900 INFO:tasks.workunit.client.1.vm08.stdout:9/689: chown d2/l8f 62029449 1 2026-03-10T07:51:41.902 INFO:tasks.workunit.client.1.vm08.stdout:9/690: write d2/de/d28/fa8 [897195,89494] 0 2026-03-10T07:51:41.909 INFO:tasks.workunit.client.1.vm08.stdout:4/613: dwrite d5/d1f/d31/f4d [0,4194304] 0 2026-03-10T07:51:41.910 INFO:tasks.workunit.client.1.vm08.stdout:8/804: sync 2026-03-10T07:51:41.931 INFO:tasks.workunit.client.1.vm08.stdout:5/738: mkdir d0/d4/df/df6 0 2026-03-10T07:51:41.932 INFO:tasks.workunit.client.1.vm08.stdout:7/659: dread d3/da/f17 [0,4194304] 0 2026-03-10T07:51:41.933 INFO:tasks.workunit.client.1.vm08.stdout:6/718: mkdir d1/dec 0 2026-03-10T07:51:41.934 INFO:tasks.workunit.client.1.vm08.stdout:6/719: write d1/d3/df/d1d/d40/d45/fbb [2120064,25487] 0 2026-03-10T07:51:41.941 INFO:tasks.workunit.client.1.vm08.stdout:0/705: dwrite dd/f44 [0,4194304] 0 2026-03-10T07:51:41.944 INFO:tasks.workunit.client.1.vm08.stdout:0/706: chown dd/d10/d14/d1b/da5 1 1 2026-03-10T07:51:41.944 INFO:tasks.workunit.client.1.vm08.stdout:3/674: dwrite d0/d3c/d18/fa5 [0,4194304] 0 2026-03-10T07:51:41.954 
INFO:tasks.workunit.client.1.vm08.stdout:1/643: creat d2/d6/de/d1f/d26/d89/d8e/fe3 x:0 0 0 2026-03-10T07:51:41.955 INFO:tasks.workunit.client.1.vm08.stdout:2/695: rename d0/d1/d3/d39/d7d/d86/d55/d1b/c66 to d0/d1/d3/d39/d7d/d86/d55/dc9/ce0 0 2026-03-10T07:51:41.955 INFO:tasks.workunit.client.1.vm08.stdout:0/707: dwrite dd/d10/d14/d15/d20/d5f/f61 [0,4194304] 0 2026-03-10T07:51:41.962 INFO:tasks.workunit.client.1.vm08.stdout:0/708: write dd/d10/d2f/d37/d64/d95/d5c/f63 [2004846,50328] 0 2026-03-10T07:51:41.963 INFO:tasks.workunit.client.1.vm08.stdout:0/709: chown dd/d10/d14/c1e 5579048 1 2026-03-10T07:51:41.978 INFO:tasks.workunit.client.1.vm08.stdout:6/720: symlink d1/d17/d2b/d58/d77/led 0 2026-03-10T07:51:41.988 INFO:tasks.workunit.client.1.vm08.stdout:6/721: read d1/db/d24/f25 [3379431,117881] 0 2026-03-10T07:51:41.999 INFO:tasks.workunit.client.1.vm08.stdout:1/644: truncate d2/d10/f3e 2179874 0 2026-03-10T07:51:42.001 INFO:tasks.workunit.client.1.vm08.stdout:1/645: dread d2/d6/de/d70/fb4 [0,4194304] 0 2026-03-10T07:51:42.005 INFO:tasks.workunit.client.1.vm08.stdout:1/646: dwrite d2/d6/de/f7c [4194304,4194304] 0 2026-03-10T07:51:42.007 INFO:tasks.workunit.client.1.vm08.stdout:2/696: mknod d0/d1/d3/d56/d78/dad/ce1 0 2026-03-10T07:51:42.022 INFO:tasks.workunit.client.1.vm08.stdout:8/805: mknod d0/d97/c105 0 2026-03-10T07:51:42.025 INFO:tasks.workunit.client.1.vm08.stdout:5/739: write d0/d4/d19/d3a/f3f [4007324,2132] 0 2026-03-10T07:51:42.027 INFO:tasks.workunit.client.1.vm08.stdout:9/691: dwrite d2/d58/dbf/d2b/f7a [0,4194304] 0 2026-03-10T07:51:42.027 INFO:tasks.workunit.client.1.vm08.stdout:5/740: truncate d0/d4/d19/d81/da4/fc2 1777067 0 2026-03-10T07:51:42.028 INFO:tasks.workunit.client.1.vm08.stdout:9/692: chown d2/de/d28/l65 287318895 1 2026-03-10T07:51:42.030 INFO:tasks.workunit.client.1.vm08.stdout:9/693: chown d2/de/d28/c5a 6 1 2026-03-10T07:51:42.031 INFO:tasks.workunit.client.1.vm08.stdout:0/710: creat dd/d10/d14/d15/d20/d7a/fde x:0 0 0 
2026-03-10T07:51:42.031 INFO:tasks.workunit.client.1.vm08.stdout:9/694: chown d2/d58/dbf/f75 0 1 2026-03-10T07:51:42.033 INFO:tasks.workunit.client.1.vm08.stdout:7/660: dwrite d3/da/f9f [0,4194304] 0 2026-03-10T07:51:42.033 INFO:tasks.workunit.client.1.vm08.stdout:9/695: chown d2/de/d28/d98/dbb/dd9 12 1 2026-03-10T07:51:42.033 INFO:tasks.workunit.client.1.vm08.stdout:7/661: write d3/da/d25/d9/d2f/d4d/fb9 [856662,47991] 0 2026-03-10T07:51:42.038 INFO:tasks.workunit.client.1.vm08.stdout:8/806: mknod d0/df/d15/d23/d54/dba/d89/dc5/c106 0 2026-03-10T07:51:42.038 INFO:tasks.workunit.client.1.vm08.stdout:7/662: write d3/da/d25/d9/d2f/d3a/d4b/fdc [454413,39149] 0 2026-03-10T07:51:42.041 INFO:tasks.workunit.client.1.vm08.stdout:8/807: write d0/df/d2e/d49/fe0 [698850,120328] 0 2026-03-10T07:51:42.043 INFO:tasks.workunit.client.1.vm08.stdout:0/711: creat dd/d18/fdf x:0 0 0 2026-03-10T07:51:42.047 INFO:tasks.workunit.client.1.vm08.stdout:6/722: mknod d1/d17/d2b/d58/de7/cee 0 2026-03-10T07:51:42.048 INFO:tasks.workunit.client.1.vm08.stdout:7/663: dwrite d3/da/d25/d9/d2f/f42 [0,4194304] 0 2026-03-10T07:51:42.055 INFO:tasks.workunit.client.1.vm08.stdout:6/723: sync 2026-03-10T07:51:42.072 INFO:tasks.workunit.client.1.vm08.stdout:0/712: creat dd/d10/d14/fe0 x:0 0 0 2026-03-10T07:51:42.073 INFO:tasks.workunit.client.1.vm08.stdout:0/713: stat dd/d10/d2f/d37/d64/d95/d5c/dca/ddb 0 2026-03-10T07:51:42.073 INFO:tasks.workunit.client.1.vm08.stdout:9/696: dread d2/de/d28/f96 [0,4194304] 0 2026-03-10T07:51:42.085 INFO:tasks.workunit.client.1.vm08.stdout:0/714: dread dd/d10/d2f/d37/d64/d95/d58/d3d/f40 [0,4194304] 0 2026-03-10T07:51:42.087 INFO:tasks.workunit.client.1.vm08.stdout:9/697: rename d2/d58/f77 to d2/d58/dbf/dd0/d35/d97/dd5/fe6 0 2026-03-10T07:51:42.089 INFO:tasks.workunit.client.1.vm08.stdout:0/715: dwrite dd/d10/d2f/d37/d64/d95/d58/fd9 [0,4194304] 0 2026-03-10T07:51:42.099 INFO:tasks.workunit.client.1.vm08.stdout:0/716: symlink dd/d18/le1 0 2026-03-10T07:51:42.100 
INFO:tasks.workunit.client.1.vm08.stdout:0/717: chown dd/d10/d2f/d37/d64/d95/d58/d3d/c89 249949706 1 2026-03-10T07:51:42.105 INFO:tasks.workunit.client.1.vm08.stdout:6/724: dread d1/d3/f2e [8388608,4194304] 0 2026-03-10T07:51:42.114 INFO:tasks.workunit.client.1.vm08.stdout:0/718: creat dd/fe2 x:0 0 0 2026-03-10T07:51:42.120 INFO:tasks.workunit.client.1.vm08.stdout:9/698: dread d2/f5 [0,4194304] 0 2026-03-10T07:51:42.121 INFO:tasks.workunit.client.1.vm08.stdout:4/614: truncate d5/da0/d12/d7b/f43 4106139 0 2026-03-10T07:51:42.121 INFO:tasks.workunit.client.1.vm08.stdout:0/719: write dd/d10/d14/d1b/fb7 [5059775,47480] 0 2026-03-10T07:51:42.121 INFO:tasks.workunit.client.1.vm08.stdout:9/699: rename d2/de/l6f to d2/d3/d84/d91/db8/le7 0 2026-03-10T07:51:42.122 INFO:tasks.workunit.client.1.vm08.stdout:9/700: read - d2/d58/d73/fd3 zero size 2026-03-10T07:51:42.123 INFO:tasks.workunit.client.1.vm08.stdout:4/615: rmdir d5/d1f/daf 39 2026-03-10T07:51:42.129 INFO:tasks.workunit.client.1.vm08.stdout:4/616: fsync d5/d8/f68 0 2026-03-10T07:51:42.129 INFO:tasks.workunit.client.1.vm08.stdout:9/701: symlink d2/d58/le8 0 2026-03-10T07:51:42.132 INFO:tasks.workunit.client.1.vm08.stdout:9/702: dwrite d2/f6 [0,4194304] 0 2026-03-10T07:51:42.136 INFO:tasks.workunit.client.1.vm08.stdout:0/720: link dd/d10/d2f/f8e dd/d10/d14/d15/d20/d92/dc1/fe3 0 2026-03-10T07:51:42.139 INFO:tasks.workunit.client.1.vm08.stdout:0/721: readlink dd/d10/d14/d15/d20/d22/l9e 0 2026-03-10T07:51:42.139 INFO:tasks.workunit.client.1.vm08.stdout:4/617: creat d5/d1f/d31/dce/fd1 x:0 0 0 2026-03-10T07:51:42.140 INFO:tasks.workunit.client.1.vm08.stdout:0/722: readlink dd/d10/d2f/d37/d64/d95/d5c/l60 0 2026-03-10T07:51:42.142 INFO:tasks.workunit.client.1.vm08.stdout:9/703: mknod d2/d58/dbf/d2b/ce9 0 2026-03-10T07:51:42.147 INFO:tasks.workunit.client.1.vm08.stdout:0/723: mknod dd/d10/d2f/d37/d64/d52/ce4 0 2026-03-10T07:51:42.148 INFO:tasks.workunit.client.1.vm08.stdout:3/675: write d0/d3c/d1f/d44/d51/f9d [5241580,115730] 0 
2026-03-10T07:51:42.149 INFO:tasks.workunit.client.1.vm08.stdout:4/618: readlink d5/da0/d12/d7b/da7/l7e 0 2026-03-10T07:51:42.151 INFO:tasks.workunit.client.1.vm08.stdout:1/647: dwrite d2/d6/de/d1f/d22/f24 [0,4194304] 0 2026-03-10T07:51:42.167 INFO:tasks.workunit.client.1.vm08.stdout:9/704: symlink d2/d3/d84/lea 0 2026-03-10T07:51:42.173 INFO:tasks.workunit.client.1.vm08.stdout:0/724: dread dd/d10/d14/d15/d20/d5f/d9f/fbb [0,4194304] 0 2026-03-10T07:51:42.174 INFO:tasks.workunit.client.1.vm08.stdout:9/705: chown d2/d26/cbc 101933 1 2026-03-10T07:51:42.180 INFO:tasks.workunit.client.1.vm08.stdout:0/725: mkdir dd/d10/d14/d15/d20/d92/dc1/de5 0 2026-03-10T07:51:42.189 INFO:tasks.workunit.client.1.vm08.stdout:4/619: dread d5/da0/d12/d7b/d48/d4f/f56 [0,4194304] 0 2026-03-10T07:51:42.192 INFO:tasks.workunit.client.1.vm08.stdout:0/726: creat dd/d10/d2f/d37/d64/d52/fe6 x:0 0 0 2026-03-10T07:51:42.195 INFO:tasks.workunit.client.1.vm08.stdout:5/741: dwrite d0/d4/df/f27 [0,4194304] 0 2026-03-10T07:51:42.195 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:51:42 vm05.local ceph-mon[50387]: pgmap v40: 65 pgs: 65 active+clean; 3.0 GiB data, 9.8 GiB used, 110 GiB / 120 GiB avail; 36 MiB/s rd, 89 MiB/s wr, 217 op/s 2026-03-10T07:51:42.203 INFO:tasks.workunit.client.1.vm08.stdout:5/742: sync 2026-03-10T07:51:42.211 INFO:tasks.workunit.client.1.vm08.stdout:5/743: read d0/d4/df/dbf/f64 [752396,88396] 0 2026-03-10T07:51:42.215 INFO:tasks.workunit.client.1.vm08.stdout:2/697: truncate d0/d1/d17/f95 23588 0 2026-03-10T07:51:42.226 INFO:tasks.workunit.client.1.vm08.stdout:9/706: link d2/d58/d73/f9e d2/d3/d84/d91/db8/feb 0 2026-03-10T07:51:42.227 INFO:tasks.workunit.client.1.vm08.stdout:1/648: link d2/d6/de/d1f/da9/laa d2/d6/de/d1f/d40/le4 0 2026-03-10T07:51:42.227 INFO:tasks.workunit.client.1.vm08.stdout:7/664: write d3/da/d25/d9/d2f/d39/d43/f9d [4233961,76666] 0 2026-03-10T07:51:42.227 INFO:tasks.workunit.client.1.vm08.stdout:5/744: mkdir d0/d8/d24/de2/df7 0 2026-03-10T07:51:42.227 
INFO:tasks.workunit.client.1.vm08.stdout:8/808: dwrite d0/d69/d3f/fcd [0,4194304] 0 2026-03-10T07:51:42.227 INFO:tasks.workunit.client.1.vm08.stdout:5/745: chown d0/d4/df/d82/cda 4096 1 2026-03-10T07:51:42.232 INFO:tasks.workunit.client.1.vm08.stdout:6/725: dwrite d1/d3/df/d1d/f9d [4194304,4194304] 0 2026-03-10T07:51:42.256 INFO:tasks.workunit.client.1.vm08.stdout:2/698: read - d0/d1/d3/d39/d7d/f98 zero size 2026-03-10T07:51:42.256 INFO:tasks.workunit.client.1.vm08.stdout:3/676: write d0/d3c/d1f/d44/d51/f82 [88092,39707] 0 2026-03-10T07:51:42.256 INFO:tasks.workunit.client.1.vm08.stdout:1/649: symlink d2/d6/de/d1f/d8f/le5 0 2026-03-10T07:51:42.257 INFO:tasks.workunit.client.1.vm08.stdout:7/665: mknod d3/da/d8a/ce0 0 2026-03-10T07:51:42.257 INFO:tasks.workunit.client.1.vm08.stdout:0/727: link dd/d10/d14/d15/d20/d22/f6c dd/d10/d2f/d37/d64/d95/d5c/dca/ddb/fe7 0 2026-03-10T07:51:42.259 INFO:tasks.workunit.client.1.vm08.stdout:5/746: dread d0/ddf/f8b [0,4194304] 0 2026-03-10T07:51:42.266 INFO:tasks.workunit.client.1.vm08.stdout:1/650: dread - d2/d6/de/d71/fc9 zero size 2026-03-10T07:51:42.266 INFO:tasks.workunit.client.1.vm08.stdout:7/666: chown d3/da/d25/d9/f30 3 1 2026-03-10T07:51:42.266 INFO:tasks.workunit.client.1.vm08.stdout:1/651: readlink d2/d6/de/d1f/d22/l84 0 2026-03-10T07:51:42.266 INFO:tasks.workunit.client.1.vm08.stdout:6/726: dread d1/d3/df/d44/f82 [0,4194304] 0 2026-03-10T07:51:42.266 INFO:tasks.workunit.client.1.vm08.stdout:1/652: dwrite d2/d6/de/d1f/d26/f6e [0,4194304] 0 2026-03-10T07:51:42.267 INFO:tasks.workunit.client.1.vm08.stdout:8/809: symlink d0/df/d2e/l107 0 2026-03-10T07:51:42.279 INFO:tasks.workunit.client.1.vm08.stdout:2/699: dread d0/d1/d3/d39/d7d/d7e/fa5 [0,4194304] 0 2026-03-10T07:51:42.291 INFO:tasks.workunit.client.1.vm08.stdout:4/620: write d5/da0/d32/f6c [136181,86660] 0 2026-03-10T07:51:42.299 INFO:tasks.workunit.client.1.vm08.stdout:3/677: dread d0/d3c/d18/d48/d55/d56/f9c [0,4194304] 0 2026-03-10T07:51:42.306 
INFO:tasks.workunit.client.1.vm08.stdout:9/707: truncate d2/f6 6258438 0 2026-03-10T07:51:42.307 INFO:tasks.workunit.client.1.vm08.stdout:9/708: truncate d2/de/d28/d98/fd4 748489 0 2026-03-10T07:51:42.324 INFO:tasks.workunit.client.1.vm08.stdout:0/728: rename dd/d10/d14/d15/cc4 to dd/d10/d2f/d37/d64/d95/d58/d3d/ce8 0 2026-03-10T07:51:42.325 INFO:tasks.workunit.client.1.vm08.stdout:6/727: unlink d1/d3/d3e/le8 0 2026-03-10T07:51:42.325 INFO:tasks.workunit.client.1.vm08.stdout:6/728: chown d1/d3/d3e/db2 12 1 2026-03-10T07:51:42.334 INFO:tasks.workunit.client.1.vm08.stdout:8/810: mknod d0/df/d15/d53/c108 0 2026-03-10T07:51:42.334 INFO:tasks.workunit.client.1.vm08.stdout:5/747: write d0/d4/df/d12/f88 [611680,109721] 0 2026-03-10T07:51:42.336 INFO:tasks.workunit.client.1.vm08.stdout:5/748: chown d0/d4/d19/d60/l7d 1201676 1 2026-03-10T07:51:42.347 INFO:tasks.workunit.client.1.vm08.stdout:2/700: read d0/d1/d3/d39/d7d/d86/d55/d7a/f94 [518829,92234] 0 2026-03-10T07:51:42.359 INFO:tasks.workunit.client.1.vm08.stdout:3/678: read d0/d3c/d1f/d44/f4b [3013677,21987] 0 2026-03-10T07:51:42.359 INFO:tasks.workunit.client.1.vm08.stdout:3/679: dread - d0/d3c/d1f/d44/d51/f5b zero size 2026-03-10T07:51:42.369 INFO:tasks.workunit.client.1.vm08.stdout:7/667: rename d3/da/d25/d9/d2f/d4d/l60 to d3/da/d25/d9/d2f/d39/d43/d4f/d5b/le1 0 2026-03-10T07:51:42.370 INFO:tasks.workunit.client.1.vm08.stdout:6/729: rmdir d1/db/d24/d73/d79 39 2026-03-10T07:51:42.371 INFO:tasks.workunit.client.1.vm08.stdout:6/730: write d1/db/f57 [2266463,12690] 0 2026-03-10T07:51:42.401 INFO:tasks.workunit.client.1.vm08.stdout:8/811: symlink d0/df/d15/d23/d54/l109 0 2026-03-10T07:51:42.402 INFO:tasks.workunit.client.1.vm08.stdout:4/621: unlink d5/da0/d12/d7b/f43 0 2026-03-10T07:51:42.403 INFO:tasks.workunit.client.1.vm08.stdout:8/812: chown d0/df/d15/d23/d54/dba/d89/laa 714289670 1 2026-03-10T07:51:42.408 INFO:tasks.workunit.client.1.vm08.stdout:0/729: mknod dd/d10/d14/d1b/ce9 0 2026-03-10T07:51:42.409 
INFO:tasks.workunit.client.1.vm08.stdout:0/730: dread - dd/d18/fdf zero size 2026-03-10T07:51:42.409 INFO:tasks.workunit.client.1.vm08.stdout:6/731: chown d1/db/d24/d73/d79/d7c 3268526 1 2026-03-10T07:51:42.410 INFO:tasks.workunit.client.1.vm08.stdout:1/653: creat d2/fe6 x:0 0 0 2026-03-10T07:51:42.414 INFO:tasks.workunit.client.1.vm08.stdout:1/654: dwrite d2/d6/de/d47/da0/fe0 [0,4194304] 0 2026-03-10T07:51:42.418 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:51:42 vm08.local ceph-mon[59917]: pgmap v40: 65 pgs: 65 active+clean; 3.0 GiB data, 9.8 GiB used, 110 GiB / 120 GiB avail; 36 MiB/s rd, 89 MiB/s wr, 217 op/s 2026-03-10T07:51:42.444 INFO:tasks.workunit.client.1.vm08.stdout:4/622: creat d5/d1f/dad/db8/fd2 x:0 0 0 2026-03-10T07:51:42.447 INFO:tasks.workunit.client.1.vm08.stdout:4/623: dwrite d5/d1f/d9b/fa8 [0,4194304] 0 2026-03-10T07:51:42.452 INFO:tasks.workunit.client.1.vm08.stdout:3/680: symlink d0/d3c/d18/d80/dc1/ld7 0 2026-03-10T07:51:42.454 INFO:tasks.workunit.client.1.vm08.stdout:2/701: truncate d0/d1/d3/d39/d7d/d86/f34 3898836 0 2026-03-10T07:51:42.462 INFO:tasks.workunit.client.1.vm08.stdout:2/702: fdatasync d0/d1/d3/d39/d7d/d86/d55/dc9/fd1 0 2026-03-10T07:51:42.462 INFO:tasks.workunit.client.1.vm08.stdout:9/709: dwrite d2/f4e [0,4194304] 0 2026-03-10T07:51:42.469 INFO:tasks.workunit.client.1.vm08.stdout:7/668: creat d3/da/dbc/fe2 x:0 0 0 2026-03-10T07:51:42.470 INFO:tasks.workunit.client.1.vm08.stdout:6/732: mkdir d1/d3/df/d38/def 0 2026-03-10T07:51:42.470 INFO:tasks.workunit.client.1.vm08.stdout:6/733: chown d1/d17/d2b/c7a 10 1 2026-03-10T07:51:42.479 INFO:tasks.workunit.client.1.vm08.stdout:1/655: rename d2/d6/de/d1f/d26/ca6 to d2/d6/d50/ce7 0 2026-03-10T07:51:42.487 INFO:tasks.workunit.client.1.vm08.stdout:8/813: mknod d0/df/d15/c10a 0 2026-03-10T07:51:42.487 INFO:tasks.workunit.client.1.vm08.stdout:3/681: symlink d0/d3c/d18/d80/ld8 0 2026-03-10T07:51:42.487 INFO:tasks.workunit.client.1.vm08.stdout:1/656: read d2/d6/de/d1f/d22/f24 [928347,176] 
0 2026-03-10T07:51:42.487 INFO:tasks.workunit.client.1.vm08.stdout:2/703: mkdir d0/d1/d3/d39/de2 0 2026-03-10T07:51:42.487 INFO:tasks.workunit.client.1.vm08.stdout:7/669: chown d3/da/d25/d9/d2f/d3a/d4b/d67/lba 30721 1 2026-03-10T07:51:42.487 INFO:tasks.workunit.client.1.vm08.stdout:5/749: getdents d0/d77 0 2026-03-10T07:51:42.487 INFO:tasks.workunit.client.1.vm08.stdout:6/734: dread d1/f35 [4194304,4194304] 0 2026-03-10T07:51:42.489 INFO:tasks.workunit.client.1.vm08.stdout:7/670: sync 2026-03-10T07:51:42.490 INFO:tasks.workunit.client.1.vm08.stdout:9/710: sync 2026-03-10T07:51:42.491 INFO:tasks.workunit.client.1.vm08.stdout:8/814: symlink d0/df/d15/d23/da8/l10b 0 2026-03-10T07:51:42.493 INFO:tasks.workunit.client.1.vm08.stdout:1/657: unlink d2/d6/de/d1f/d22/l84 0 2026-03-10T07:51:42.500 INFO:tasks.workunit.client.1.vm08.stdout:6/735: dread d1/d3/df/d1d/d40/d45/fbb [0,4194304] 0 2026-03-10T07:51:42.500 INFO:tasks.workunit.client.1.vm08.stdout:2/704: rename d0/d1/d3/d10/d38/f60 to d0/d1/d17/d6b/fe3 0 2026-03-10T07:51:42.512 INFO:tasks.workunit.client.1.vm08.stdout:5/750: creat d0/d4/d19/d60/d6d/d70/d40/dba/ff8 x:0 0 0 2026-03-10T07:51:42.525 INFO:tasks.workunit.client.1.vm08.stdout:9/711: creat d2/de/d28/fec x:0 0 0 2026-03-10T07:51:42.526 INFO:tasks.workunit.client.1.vm08.stdout:3/682: symlink d0/d3c/d1f/d44/d51/ld9 0 2026-03-10T07:51:42.527 INFO:tasks.workunit.client.1.vm08.stdout:7/671: symlink d3/da/d25/d9/d2f/d3a/d71/d8c/le3 0 2026-03-10T07:51:42.533 INFO:tasks.workunit.client.1.vm08.stdout:1/658: chown d2/d6/de/d47/dbd/dc3/cd9 29628 1 2026-03-10T07:51:42.533 INFO:tasks.workunit.client.1.vm08.stdout:1/659: readlink d2/d6/d9f/ldf 0 2026-03-10T07:51:42.533 INFO:tasks.workunit.client.1.vm08.stdout:6/736: rmdir d1/db 39 2026-03-10T07:51:42.535 INFO:tasks.workunit.client.1.vm08.stdout:2/705: mkdir d0/d1/d3/d56/d78/de4 0 2026-03-10T07:51:42.535 INFO:tasks.workunit.client.1.vm08.stdout:0/731: getdents dd/d10/d14/d1b/da5 0 2026-03-10T07:51:42.537 
INFO:tasks.workunit.client.1.vm08.stdout:2/706: sync 2026-03-10T07:51:42.539 INFO:tasks.workunit.client.1.vm08.stdout:5/751: rename d0/d4/d19/d43/ddc to d0/d4/df/df6/df9 0 2026-03-10T07:51:42.540 INFO:tasks.workunit.client.1.vm08.stdout:5/752: read - d0/d4/d19/d60/d6d/d70/d40/dba/fd8 zero size 2026-03-10T07:51:42.541 INFO:tasks.workunit.client.1.vm08.stdout:4/624: getdents d5/d8/d89 0 2026-03-10T07:51:42.543 INFO:tasks.workunit.client.1.vm08.stdout:5/753: dread d0/d4/d19/d43/f7c [0,4194304] 0 2026-03-10T07:51:42.543 INFO:tasks.workunit.client.1.vm08.stdout:5/754: readlink d0/d4/d19/d60/d6d/d70/d40/dba/lc9 0 2026-03-10T07:51:42.544 INFO:tasks.workunit.client.1.vm08.stdout:5/755: chown d0/d4/d19/d81/d92/f76 11742 1 2026-03-10T07:51:42.548 INFO:tasks.workunit.client.1.vm08.stdout:8/815: dwrite d0/df/d15/d23/d54/f8f [0,4194304] 0 2026-03-10T07:51:42.557 INFO:tasks.workunit.client.1.vm08.stdout:9/712: creat d2/d58/fed x:0 0 0 2026-03-10T07:51:42.557 INFO:tasks.workunit.client.1.vm08.stdout:9/713: dread - d2/d58/dbf/dd0/d35/fad zero size 2026-03-10T07:51:42.560 INFO:tasks.workunit.client.1.vm08.stdout:7/672: creat d3/da/d25/d9/d2f/d39/db2/fe4 x:0 0 0 2026-03-10T07:51:42.561 INFO:tasks.workunit.client.1.vm08.stdout:7/673: stat d3/da/d25/d9/d2f/d39/caa 0 2026-03-10T07:51:42.571 INFO:tasks.workunit.client.1.vm08.stdout:2/707: unlink d0/d1/d3/d39/d7d/d86/d55/d1b/fb6 0 2026-03-10T07:51:42.584 INFO:tasks.workunit.client.1.vm08.stdout:3/683: rename d0/d3c/d18/d48/d55/d56/fd3 to d0/d3c/d18/d32/d61/d83/fda 0 2026-03-10T07:51:42.585 INFO:tasks.workunit.client.1.vm08.stdout:3/684: dread - d0/d3c/d18/d48/fc5 zero size 2026-03-10T07:51:42.589 INFO:tasks.workunit.client.1.vm08.stdout:3/685: dwrite d0/f9a [0,4194304] 0 2026-03-10T07:51:42.595 INFO:tasks.workunit.client.1.vm08.stdout:4/625: dread - d5/d1f/d41/f99 zero size 2026-03-10T07:51:42.602 INFO:tasks.workunit.client.1.vm08.stdout:0/732: dwrite dd/d10/d2f/d37/d64/d95/d58/d3d/fa4 [0,4194304] 0 2026-03-10T07:51:42.602 
INFO:tasks.workunit.client.1.vm08.stdout:0/733: sync 2026-03-10T07:51:42.606 INFO:tasks.workunit.client.1.vm08.stdout:3/686: dwrite d0/d3c/d18/d48/d55/fc6 [0,4194304] 0 2026-03-10T07:51:42.609 INFO:tasks.workunit.client.1.vm08.stdout:5/756: dread - d0/d8/d24/f8f zero size 2026-03-10T07:51:42.624 INFO:tasks.workunit.client.1.vm08.stdout:9/714: rmdir d2/d58/dbf/dd0/d35 39 2026-03-10T07:51:42.628 INFO:tasks.workunit.client.1.vm08.stdout:9/715: dwrite d2/d58/dbf/faa [0,4194304] 0 2026-03-10T07:51:42.628 INFO:tasks.workunit.client.1.vm08.stdout:7/674: mkdir d3/da/d25/d9/d2f/d6c/de5 0 2026-03-10T07:51:42.633 INFO:tasks.workunit.client.1.vm08.stdout:6/737: mknod d1/db/d24/d73/d79/d7c/cf0 0 2026-03-10T07:51:42.634 INFO:tasks.workunit.client.1.vm08.stdout:9/716: fsync d2/d58/fc6 0 2026-03-10T07:51:42.638 INFO:tasks.workunit.client.1.vm08.stdout:7/675: truncate d3/da/d25/d9/d2f/d4d/fb9 1236878 0 2026-03-10T07:51:42.641 INFO:tasks.workunit.client.1.vm08.stdout:1/660: rename d2/d6/de/f5b to d2/d10/fe8 0 2026-03-10T07:51:42.662 INFO:tasks.workunit.client.1.vm08.stdout:0/734: unlink dd/d10/d14/d15/d20/d5f/d9f/fd1 0 2026-03-10T07:51:42.668 INFO:tasks.workunit.client.1.vm08.stdout:8/816: write d0/df/d15/d23/d54/dba/d89/fb6 [888786,45864] 0 2026-03-10T07:51:42.671 INFO:tasks.workunit.client.1.vm08.stdout:4/626: dwrite d5/d1f/d41/f7f [0,4194304] 0 2026-03-10T07:51:42.674 INFO:tasks.workunit.client.1.vm08.stdout:3/687: mkdir d0/d3c/d1f/d89/ddb 0 2026-03-10T07:51:42.685 INFO:tasks.workunit.client.1.vm08.stdout:5/757: dread - d0/d4/d19/d3a/d69/fde zero size 2026-03-10T07:51:42.701 INFO:tasks.workunit.client.1.vm08.stdout:0/735: rename dd/d10/d2f/d37/d64/d95/d58/d3d/l7d to dd/d10/d14/d15/d20/d7a/dd2/lea 0 2026-03-10T07:51:42.702 INFO:tasks.workunit.client.1.vm08.stdout:0/736: fdatasync dd/d10/d2f/d37/d64/d52/f74 0 2026-03-10T07:51:42.703 INFO:tasks.workunit.client.1.vm08.stdout:0/737: chown dd/d10/d14/d15/d20/f7e 3795 1 2026-03-10T07:51:42.710 
INFO:tasks.workunit.client.1.vm08.stdout:8/817: unlink d0/df/d2e/f102 0 2026-03-10T07:51:42.729 INFO:tasks.workunit.client.1.vm08.stdout:2/708: write d0/d1/d17/f95 [833001,76932] 0 2026-03-10T07:51:42.732 INFO:tasks.workunit.client.1.vm08.stdout:1/661: write d2/d6/d3a/f6d [871080,58751] 0 2026-03-10T07:51:42.732 INFO:tasks.workunit.client.1.vm08.stdout:1/662: stat d2/d6/de/d47/da0/ca1 0 2026-03-10T07:51:42.733 INFO:tasks.workunit.client.1.vm08.stdout:2/709: dwrite d0/d1/d17/db2/fcb [0,4194304] 0 2026-03-10T07:51:42.744 INFO:tasks.workunit.client.1.vm08.stdout:2/710: dread d0/d1/d3/d56/d78/dad/db1/d61/f3d [0,4194304] 0 2026-03-10T07:51:42.748 INFO:tasks.workunit.client.1.vm08.stdout:5/758: write d0/d4/d19/d60/d6d/d70/d40/fa8 [774184,50706] 0 2026-03-10T07:51:42.766 INFO:tasks.workunit.client.1.vm08.stdout:9/717: rmdir d2/d58/dbf/dd0/d35/d97/dd5 39 2026-03-10T07:51:42.766 INFO:tasks.workunit.client.1.vm08.stdout:9/718: stat d2/d58/l54 0 2026-03-10T07:51:42.766 INFO:tasks.workunit.client.1.vm08.stdout:6/738: truncate d1/f28 198709 0 2026-03-10T07:51:42.766 INFO:tasks.workunit.client.1.vm08.stdout:9/719: dread - d2/d58/dbf/dd0/d35/d97/fe1 zero size 2026-03-10T07:51:42.767 INFO:tasks.workunit.client.1.vm08.stdout:9/720: dread - d2/d58/dbf/dd0/d35/fc8 zero size 2026-03-10T07:51:42.767 INFO:tasks.workunit.client.1.vm08.stdout:7/676: mknod d3/da/ce6 0 2026-03-10T07:51:42.768 INFO:tasks.workunit.client.1.vm08.stdout:7/677: read - d3/da/d25/d9/fc5 zero size 2026-03-10T07:51:42.784 INFO:tasks.workunit.client.1.vm08.stdout:0/738: mknod dd/d10/d14/d15/d20/d92/ceb 0 2026-03-10T07:51:42.798 INFO:tasks.workunit.client.1.vm08.stdout:4/627: dwrite d5/da0/d12/d7b/d48/f5b [0,4194304] 0 2026-03-10T07:51:42.830 INFO:tasks.workunit.client.1.vm08.stdout:5/759: unlink d0/fb5 0 2026-03-10T07:51:42.837 INFO:tasks.workunit.client.1.vm08.stdout:2/711: write d0/d1/d3/d39/d7d/d86/d55/d1b/f23 [751117,98064] 0 2026-03-10T07:51:42.845 INFO:tasks.workunit.client.1.vm08.stdout:2/712: sync 
2026-03-10T07:51:42.845 INFO:tasks.workunit.client.1.vm08.stdout:3/688: rename d0/f10 to d0/d3c/d1f/d89/ddb/fdc 0 2026-03-10T07:51:42.846 INFO:tasks.workunit.client.1.vm08.stdout:3/689: write d0/d3c/d1f/d44/f8c [2322960,54640] 0 2026-03-10T07:51:42.848 INFO:tasks.workunit.client.1.vm08.stdout:0/739: rmdir dd/d10/d14/d1b 39 2026-03-10T07:51:42.873 INFO:tasks.workunit.client.1.vm08.stdout:4/628: creat d5/d8/d89/fd3 x:0 0 0 2026-03-10T07:51:42.884 INFO:tasks.workunit.client.1.vm08.stdout:6/739: mknod d1/d17/d2b/d5e/dcb/cf1 0 2026-03-10T07:51:42.884 INFO:tasks.workunit.client.1.vm08.stdout:6/740: chown d1/d17/d2b/l4c 2706974 1 2026-03-10T07:51:42.888 INFO:tasks.workunit.client.1.vm08.stdout:5/760: chown d0/d4/df/f2a 346168189 1 2026-03-10T07:51:42.907 INFO:tasks.workunit.client.1.vm08.stdout:3/690: creat d0/d3c/d1f/d44/d51/d2d/fdd x:0 0 0 2026-03-10T07:51:42.913 INFO:tasks.workunit.client.1.vm08.stdout:4/629: rmdir d5/d1f/d31 39 2026-03-10T07:51:42.928 INFO:tasks.workunit.client.1.vm08.stdout:5/761: creat d0/d8/d5e/d8e/ffa x:0 0 0 2026-03-10T07:51:42.932 INFO:tasks.workunit.client.1.vm08.stdout:0/740: symlink dd/d10/d2f/d37/d64/d95/d5c/dca/ddb/lec 0 2026-03-10T07:51:42.935 INFO:tasks.workunit.client.1.vm08.stdout:0/741: dwrite dd/d18/fdf [0,4194304] 0 2026-03-10T07:51:42.955 INFO:tasks.workunit.client.1.vm08.stdout:5/762: symlink d0/d4/d19/d60/d6d/lfb 0 2026-03-10T07:51:42.955 INFO:tasks.workunit.client.1.vm08.stdout:5/763: chown d0/d8/d24/f56 1419414 1 2026-03-10T07:51:42.964 INFO:tasks.workunit.client.1.vm08.stdout:8/818: rename d0/d37/d86/fe2 to d0/df/d15/f10c 0 2026-03-10T07:51:42.988 INFO:tasks.workunit.client.1.vm08.stdout:3/691: symlink d0/d3c/d18/d48/d55/lde 0 2026-03-10T07:51:42.999 INFO:tasks.workunit.client.1.vm08.stdout:4/630: truncate d5/d1f/d41/f83 1586557 0 2026-03-10T07:51:42.999 INFO:tasks.workunit.client.1.vm08.stdout:6/741: link d1/d3/df/d1d/d40/d45/d5c/c61 d1/d3/df/d52/cf2 0 2026-03-10T07:51:42.999 INFO:tasks.workunit.client.1.vm08.stdout:5/764: 
rmdir d0/d4/d19 39 2026-03-10T07:51:43.000 INFO:tasks.workunit.client.1.vm08.stdout:4/631: read - d5/da0/d12/d7b/d48/d4f/d8d/d91/fca zero size 2026-03-10T07:51:43.006 INFO:tasks.workunit.client.1.vm08.stdout:6/742: dread d1/db/d24/f75 [0,4194304] 0 2026-03-10T07:51:43.006 INFO:tasks.workunit.client.1.vm08.stdout:5/765: fdatasync d0/d4/df/dbf/d41/dad/fb9 0 2026-03-10T07:51:43.008 INFO:tasks.workunit.client.1.vm08.stdout:3/692: dread d0/d3c/d1f/d44/f59 [0,4194304] 0 2026-03-10T07:51:43.011 INFO:tasks.workunit.client.1.vm08.stdout:8/819: dread d0/d69/f4c [0,4194304] 0 2026-03-10T07:51:43.012 INFO:tasks.workunit.client.1.vm08.stdout:1/663: rename d2/d6/de/d1f/f75 to d2/d6/de/d47/da0/fe9 0 2026-03-10T07:51:43.012 INFO:tasks.workunit.client.1.vm08.stdout:8/820: stat d0/df/d17/d72/fe4 0 2026-03-10T07:51:43.012 INFO:tasks.workunit.client.1.vm08.stdout:1/664: stat d2/f36 0 2026-03-10T07:51:43.014 INFO:tasks.workunit.client.1.vm08.stdout:0/742: stat dd/d10/d14/d1b/d30/c4f 0 2026-03-10T07:51:43.016 INFO:tasks.workunit.client.1.vm08.stdout:1/665: dwrite d2/d6/de/fa5 [0,4194304] 0 2026-03-10T07:51:43.033 INFO:tasks.workunit.client.1.vm08.stdout:1/666: chown d2/d6/de/l14 2447 1 2026-03-10T07:51:43.058 INFO:tasks.workunit.client.1.vm08.stdout:4/632: creat d5/fd4 x:0 0 0 2026-03-10T07:51:43.058 INFO:tasks.workunit.client.1.vm08.stdout:5/766: symlink d0/d4/d19/d60/d6d/d70/d40/lfc 0 2026-03-10T07:51:43.058 INFO:tasks.workunit.client.1.vm08.stdout:3/693: truncate d0/d3c/d1f/d44/d51/f65 570560 0 2026-03-10T07:51:43.058 INFO:tasks.workunit.client.1.vm08.stdout:4/633: fsync d5/d1f/d41/f99 0 2026-03-10T07:51:43.058 INFO:tasks.workunit.client.1.vm08.stdout:4/634: dread - d5/fd4 zero size 2026-03-10T07:51:43.058 INFO:tasks.workunit.client.1.vm08.stdout:9/721: rename d2/d3 to d2/de/dee 0 2026-03-10T07:51:43.058 INFO:tasks.workunit.client.1.vm08.stdout:4/635: mkdir d5/da0/d12/d7b/d48/d4f/d8d/d91/dd5 0 2026-03-10T07:51:43.058 INFO:tasks.workunit.client.1.vm08.stdout:8/821: getdents 
d0/df/d2e/d30 0 2026-03-10T07:51:43.058 INFO:tasks.workunit.client.1.vm08.stdout:9/722: creat d2/de/dee/d84/dca/dd7/fef x:0 0 0 2026-03-10T07:51:43.059 INFO:tasks.workunit.client.1.vm08.stdout:8/822: creat d0/d97/f10d x:0 0 0 2026-03-10T07:51:43.059 INFO:tasks.workunit.client.1.vm08.stdout:4/636: unlink d5/d1f/d31/f58 0 2026-03-10T07:51:43.059 INFO:tasks.workunit.client.1.vm08.stdout:7/678: rename d3/da/d25/d9/d6f/l99 to d3/le7 0 2026-03-10T07:51:43.059 INFO:tasks.workunit.client.1.vm08.stdout:9/723: mkdir d2/d58/dbf/dd0/df0 0 2026-03-10T07:51:43.059 INFO:tasks.workunit.client.1.vm08.stdout:4/637: creat d5/da0/d12/d7b/da7/fd6 x:0 0 0 2026-03-10T07:51:43.059 INFO:tasks.workunit.client.1.vm08.stdout:8/823: creat d0/df/f10e x:0 0 0 2026-03-10T07:51:43.060 INFO:tasks.workunit.client.1.vm08.stdout:2/713: rename d0/d1/d3/d39/d7d/c89 to d0/d1/d3/d39/d7d/d86/d55/ce5 0 2026-03-10T07:51:43.061 INFO:tasks.workunit.client.1.vm08.stdout:7/679: dwrite d3/da/d25/d9/d2f/f42 [0,4194304] 0 2026-03-10T07:51:43.061 INFO:tasks.workunit.client.1.vm08.stdout:1/667: dread d2/d6/de/fc0 [0,4194304] 0 2026-03-10T07:51:43.062 INFO:tasks.workunit.client.1.vm08.stdout:8/824: unlink d0/df/d2e/l82 0 2026-03-10T07:51:43.062 INFO:tasks.workunit.client.1.vm08.stdout:1/668: write d2/d6/de/d1f/d26/d89/d8e/fe3 [377051,65292] 0 2026-03-10T07:51:43.071 INFO:tasks.workunit.client.1.vm08.stdout:0/743: rename dd/d10/d14/d1b/d30 to dd/d10/d14/d15/d20/d92/dc1/ded 0 2026-03-10T07:51:43.071 INFO:tasks.workunit.client.1.vm08.stdout:0/744: write dd/fe2 [387206,117994] 0 2026-03-10T07:51:43.076 INFO:tasks.workunit.client.1.vm08.stdout:3/694: dread d0/d3c/d1f/d44/d51/d34/f53 [0,4194304] 0 2026-03-10T07:51:43.080 INFO:tasks.workunit.client.1.vm08.stdout:2/714: unlink d0/d1/d3/d56/d78/dad/db1/fd3 0 2026-03-10T07:51:43.088 INFO:tasks.workunit.client.1.vm08.stdout:8/825: unlink d0/d37/d86/f90 0 2026-03-10T07:51:43.088 INFO:tasks.workunit.client.1.vm08.stdout:5/767: rename d0/d4/d19/d60/d6d/d70/f29 to d0/d8/d24/ffd 0 
2026-03-10T07:51:43.088 INFO:tasks.workunit.client.1.vm08.stdout:3/695: rename d0/l31 to d0/d3c/d1f/d95/ldf 0 2026-03-10T07:51:43.088 INFO:tasks.workunit.client.1.vm08.stdout:1/669: link d2/d6/de/d47/da0/fcc d2/d6/de/d1f/d22/fea 0 2026-03-10T07:51:43.088 INFO:tasks.workunit.client.1.vm08.stdout:1/670: stat d2/d6/de/d1f/d8f/le5 0 2026-03-10T07:51:43.088 INFO:tasks.workunit.client.1.vm08.stdout:1/671: dread - d2/d6/de/d47/dbd/dc3/fd0 zero size 2026-03-10T07:51:43.088 INFO:tasks.workunit.client.1.vm08.stdout:0/745: dwrite dd/d10/d14/d15/d20/fc5 [0,4194304] 0 2026-03-10T07:51:43.089 INFO:tasks.workunit.client.1.vm08.stdout:5/768: dwrite d0/d4/d19/d81/d92/f76 [0,4194304] 0 2026-03-10T07:51:43.089 INFO:tasks.workunit.client.1.vm08.stdout:7/680: dread d3/fa4 [4194304,4194304] 0 2026-03-10T07:51:43.092 INFO:tasks.workunit.client.1.vm08.stdout:0/746: chown dd/d10/d2f/d37/d64/d95/d5c/f63 372 1 2026-03-10T07:51:43.097 INFO:tasks.workunit.client.1.vm08.stdout:1/672: mkdir d2/d6/de/d1f/d22/deb 0 2026-03-10T07:51:43.102 INFO:tasks.workunit.client.1.vm08.stdout:8/826: dwrite d0/df/d15/d53/fee [0,4194304] 0 2026-03-10T07:51:43.102 INFO:tasks.workunit.client.1.vm08.stdout:8/827: chown d0/df/d15/f70 53303142 1 2026-03-10T07:51:43.113 INFO:tasks.workunit.client.1.vm08.stdout:0/747: mknod dd/d10/d14/d1b/da5/cee 0 2026-03-10T07:51:43.116 INFO:tasks.workunit.client.1.vm08.stdout:1/673: unlink d2/d6/de/d1f/d26/c73 0 2026-03-10T07:51:43.116 INFO:tasks.workunit.client.1.vm08.stdout:2/715: getdents d0/d1/d17 0 2026-03-10T07:51:43.116 INFO:tasks.workunit.client.1.vm08.stdout:7/681: mkdir d3/da/d25/d9/d2f/d39/d43/d4f/d5b/d79/ddd/de8 0 2026-03-10T07:51:43.116 INFO:tasks.workunit.client.1.vm08.stdout:2/716: write d0/d1/d3/d56/d78/dad/fdc [826219,94086] 0 2026-03-10T07:51:43.117 INFO:tasks.workunit.client.1.vm08.stdout:0/748: mknod dd/d10/dbd/cef 0 2026-03-10T07:51:43.119 INFO:tasks.workunit.client.1.vm08.stdout:5/769: getdents d0/d8/d5e 0 2026-03-10T07:51:43.121 
INFO:tasks.workunit.client.1.vm08.stdout:7/682: rename d3/da/d25/d9/d2f/d3a/d4b/f70 to d3/da/d25/d9/d2f/d3a/d71/d8c/fe9 0 2026-03-10T07:51:43.121 INFO:tasks.workunit.client.1.vm08.stdout:7/683: chown d3/da/d25/f35 299 1 2026-03-10T07:51:43.122 INFO:tasks.workunit.client.1.vm08.stdout:0/749: creat dd/d10/d2f/d37/daf/ff0 x:0 0 0 2026-03-10T07:51:43.122 INFO:tasks.workunit.client.1.vm08.stdout:1/674: dread d2/d6/d3a/f6d [0,4194304] 0 2026-03-10T07:51:43.135 INFO:tasks.workunit.client.1.vm08.stdout:5/770: mkdir d0/d77/d83/de0/dfe 0 2026-03-10T07:51:43.137 INFO:tasks.workunit.client.1.vm08.stdout:2/717: symlink d0/d1/d17/d6b/da0/dd7/le6 0 2026-03-10T07:51:43.141 INFO:tasks.workunit.client.1.vm08.stdout:8/828: dread d0/df/d15/d23/da8/fc7 [0,4194304] 0 2026-03-10T07:51:43.142 INFO:tasks.workunit.client.1.vm08.stdout:7/684: mkdir d3/da/d25/d9/d2f/d3a/d4b/d67/dea 0 2026-03-10T07:51:43.142 INFO:tasks.workunit.client.1.vm08.stdout:7/685: readlink d3/da/d25/d9/d2f/d39/l8e 0 2026-03-10T07:51:43.145 INFO:tasks.workunit.client.1.vm08.stdout:5/771: mkdir d0/d4/d19/d60/d6d/d70/dff 0 2026-03-10T07:51:43.155 INFO:tasks.workunit.client.1.vm08.stdout:8/829: mknod d0/df/d15/d23/c10f 0 2026-03-10T07:51:43.162 INFO:tasks.workunit.client.1.vm08.stdout:6/743: write d1/d17/fa6 [500466,127955] 0 2026-03-10T07:51:43.168 INFO:tasks.workunit.client.1.vm08.stdout:9/724: dwrite d2/de/dee/fc0 [0,4194304] 0 2026-03-10T07:51:43.170 INFO:tasks.workunit.client.1.vm08.stdout:9/725: fsync d2/d58/fc6 0 2026-03-10T07:51:43.178 INFO:tasks.workunit.client.1.vm08.stdout:4/638: write d5/da0/f46 [686115,95156] 0 2026-03-10T07:51:43.187 INFO:tasks.workunit.client.1.vm08.stdout:0/750: getdents dd 0 2026-03-10T07:51:43.189 INFO:tasks.workunit.client.1.vm08.stdout:0/751: dread - dd/d10/d14/d15/d20/d5f/fc8 zero size 2026-03-10T07:51:43.189 INFO:tasks.workunit.client.1.vm08.stdout:0/752: chown dd/d10/d14/d15/d20/d5f/d9f/lba 218 1 2026-03-10T07:51:43.190 INFO:tasks.workunit.client.1.vm08.stdout:0/753: readlink 
dd/d10/d14/d15/d20/d92/ld7 0 2026-03-10T07:51:43.198 INFO:tasks.workunit.client.1.vm08.stdout:7/686: creat d3/da/d25/d9/d2f/d3a/feb x:0 0 0 2026-03-10T07:51:43.199 INFO:tasks.workunit.client.1.vm08.stdout:7/687: write d3/da/d25/d9/d2f/d39/db2/fd0 [569032,69300] 0 2026-03-10T07:51:43.216 INFO:tasks.workunit.client.1.vm08.stdout:3/696: write d0/d3c/d18/d32/d61/d52/f68 [2460096,50296] 0 2026-03-10T07:51:43.223 INFO:tasks.workunit.client.1.vm08.stdout:6/744: link d1/d17/f63 d1/d17/d2b/d5e/ff3 0 2026-03-10T07:51:43.228 INFO:tasks.workunit.client.1.vm08.stdout:7/688: mknod d3/da/d25/d9/d2f/d4d/db6/cec 0 2026-03-10T07:51:43.232 INFO:tasks.workunit.client.1.vm08.stdout:0/754: getdents dd/d10/d14/d15/d20/d22/dc6 0 2026-03-10T07:51:43.246 INFO:tasks.workunit.client.1.vm08.stdout:1/675: write d2/d6/de/d70/d80/f85 [624613,2725] 0 2026-03-10T07:51:43.252 INFO:tasks.workunit.client.1.vm08.stdout:7/689: chown d3/da/d25/d9/d2f/d3a/c9a 20 1 2026-03-10T07:51:43.256 INFO:tasks.workunit.client.1.vm08.stdout:4/639: creat d5/d1f/fd7 x:0 0 0 2026-03-10T07:51:43.257 INFO:tasks.workunit.client.1.vm08.stdout:4/640: chown d5/l40 113133892 1 2026-03-10T07:51:43.260 INFO:tasks.workunit.client.1.vm08.stdout:2/718: write d0/d1/d3/d10/d65/fae [297220,26214] 0 2026-03-10T07:51:43.266 INFO:tasks.workunit.client.1.vm08.stdout:0/755: rename dd/d10/d2f/d37/d64/d95/d58/d3d/f83 to dd/d10/d14/d15/d20/d92/dc1/ff1 0 2026-03-10T07:51:43.277 INFO:tasks.workunit.client.1.vm08.stdout:5/772: dwrite d0/d4/df/dbf/daf/fca [0,4194304] 0 2026-03-10T07:51:43.287 INFO:tasks.workunit.client.1.vm08.stdout:8/830: write d0/df/d5d/f9d [1007835,114655] 0 2026-03-10T07:51:43.291 INFO:tasks.workunit.client.1.vm08.stdout:9/726: dwrite d2/d58/dbf/f55 [0,4194304] 0 2026-03-10T07:51:43.298 INFO:tasks.workunit.client.1.vm08.stdout:7/690: write d3/f93 [742593,22061] 0 2026-03-10T07:51:43.313 INFO:tasks.workunit.client.1.vm08.stdout:1/676: rename d2/d6/de/d47/f3c to d2/d6/d9f/fec 0 2026-03-10T07:51:43.313 
INFO:tasks.workunit.client.1.vm08.stdout:1/677: symlink d2/d6/de/d1f/d26/d98/led 2 2026-03-10T07:51:43.314 INFO:tasks.workunit.client.1.vm08.stdout:1/678: stat d2/d6/d9f/fe1 0 2026-03-10T07:51:43.319 INFO:tasks.workunit.client.1.vm08.stdout:8/831: sync 2026-03-10T07:51:43.323 INFO:tasks.workunit.client.1.vm08.stdout:0/756: creat dd/d10/d2f/d37/d64/d95/d5c/dca/ff2 x:0 0 0 2026-03-10T07:51:43.331 INFO:tasks.workunit.client.1.vm08.stdout:3/697: creat d0/d3c/d18/d48/d55/fe0 x:0 0 0 2026-03-10T07:51:43.341 INFO:tasks.workunit.client.1.vm08.stdout:1/679: dread d2/d6/de/d1f/d26/f29 [0,4194304] 0 2026-03-10T07:51:43.348 INFO:tasks.workunit.client.1.vm08.stdout:6/745: link d1/d3/df/d1d/d40/d87/fb3 d1/d3/d3e/ff4 0 2026-03-10T07:51:43.350 INFO:tasks.workunit.client.1.vm08.stdout:6/746: stat d1/d3/df/d1d/d40/d87/f8e 0 2026-03-10T07:51:43.351 INFO:tasks.workunit.client.1.vm08.stdout:5/773: mknod d0/d4/d19/d60/d6d/c100 0 2026-03-10T07:51:43.352 INFO:tasks.workunit.client.1.vm08.stdout:5/774: stat d0/d4/d19/d60/d6d/d70/d40/lfc 0 2026-03-10T07:51:43.364 INFO:tasks.workunit.client.1.vm08.stdout:4/641: truncate d5/d1f/d41/f83 1897022 0 2026-03-10T07:51:43.364 INFO:tasks.workunit.client.1.vm08.stdout:4/642: dread - d5/d1f/dad/fc3 zero size 2026-03-10T07:51:43.366 INFO:tasks.workunit.client.1.vm08.stdout:4/643: read d5/d8/fc1 [3489086,40945] 0 2026-03-10T07:51:43.366 INFO:tasks.workunit.client.1.vm08.stdout:4/644: write d5/da0/d95/dc2/fc5 [219804,48260] 0 2026-03-10T07:51:43.367 INFO:tasks.workunit.client.1.vm08.stdout:4/645: chown f0 26 1 2026-03-10T07:51:43.377 INFO:tasks.workunit.client.1.vm08.stdout:7/691: truncate d3/da/d25/f29 5512152 0 2026-03-10T07:51:43.378 INFO:tasks.workunit.client.1.vm08.stdout:2/719: mknod d0/d1/d3/d56/d78/da6/ce7 0 2026-03-10T07:51:43.383 INFO:tasks.workunit.client.1.vm08.stdout:8/832: mknod d0/df/d5d/c110 0 2026-03-10T07:51:43.384 INFO:tasks.workunit.client.1.vm08.stdout:8/833: chown d0/c50 4000464 1 2026-03-10T07:51:43.384 
INFO:tasks.workunit.client.1.vm08.stdout:8/834: chown d0/df/d2e/l59 120399 1 2026-03-10T07:51:43.384 INFO:tasks.workunit.client.1.vm08.stdout:0/757: truncate dd/d10/d14/d15/d20/d22/f51 601532 0 2026-03-10T07:51:43.385 INFO:tasks.workunit.client.1.vm08.stdout:0/758: fsync dd/d10/d14/d15/d20/d22/f2e 0 2026-03-10T07:51:43.390 INFO:tasks.workunit.client.1.vm08.stdout:8/835: read d0/df/f60 [1937525,74075] 0 2026-03-10T07:51:43.390 INFO:tasks.workunit.client.1.vm08.stdout:3/698: fsync d0/d3c/d18/f22 0 2026-03-10T07:51:43.390 INFO:tasks.workunit.client.1.vm08.stdout:2/720: dread d0/d1/d3/d39/d7d/d86/d55/d1b/f23 [0,4194304] 0 2026-03-10T07:51:43.394 INFO:tasks.workunit.client.1.vm08.stdout:9/727: dread d2/d58/dbf/dd0/d35/d97/dd5/fe6 [4194304,4194304] 0 2026-03-10T07:51:43.407 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:51:43 vm05.local ceph-mon[50387]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T07:51:43.407 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:51:43 vm05.local ceph-mon[50387]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' 2026-03-10T07:51:43.407 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:51:43 vm05.local ceph-mon[50387]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' 2026-03-10T07:51:43.407 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:51:43 vm05.local ceph-mon[50387]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T07:51:43.418 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:51:43 vm08.local ceph-mon[59917]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T07:51:43.418 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:51:43 vm08.local ceph-mon[59917]: from='mgr.14628 
192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' 2026-03-10T07:51:43.418 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:51:43 vm08.local ceph-mon[59917]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' 2026-03-10T07:51:43.418 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:51:43 vm08.local ceph-mon[59917]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T07:51:43.473 INFO:tasks.workunit.client.1.vm08.stdout:6/747: dread - d1/db/d24/dac/fba zero size 2026-03-10T07:51:43.473 INFO:tasks.workunit.client.1.vm08.stdout:1/680: dwrite d2/d6/d3a/f6d [0,4194304] 0 2026-03-10T07:51:43.478 INFO:tasks.workunit.client.1.vm08.stdout:7/692: creat d3/da/d25/d9/d2f/d39/d43/d4f/d5b/db7/dac/fed x:0 0 0 2026-03-10T07:51:43.478 INFO:tasks.workunit.client.1.vm08.stdout:0/759: creat dd/d10/d2f/d37/d64/d95/d58/d3d/ff3 x:0 0 0 2026-03-10T07:51:43.478 INFO:tasks.workunit.client.1.vm08.stdout:8/836: stat d0/df/d15/d23/l104 0 2026-03-10T07:51:43.483 INFO:tasks.workunit.client.1.vm08.stdout:0/760: dwrite dd/d10/d2f/d37/d64/d95/d5c/dca/ff2 [0,4194304] 0 2026-03-10T07:51:43.504 INFO:tasks.workunit.client.1.vm08.stdout:2/721: creat d0/d1/d3/d56/d78/dad/db1/d61/d8e/fe8 x:0 0 0 2026-03-10T07:51:43.505 INFO:tasks.workunit.client.1.vm08.stdout:2/722: chown d0/d1/d3/d10/d38/f53 120163399 1 2026-03-10T07:51:43.505 INFO:tasks.workunit.client.1.vm08.stdout:2/723: fsync d0/d1/d3/d39/d7d/d86/fa9 0 2026-03-10T07:51:43.506 INFO:tasks.workunit.client.1.vm08.stdout:2/724: readlink d0/d1/d17/d6b/da0/dd7/le6 0 2026-03-10T07:51:43.512 INFO:tasks.workunit.client.1.vm08.stdout:9/728: symlink d2/de/dee/d84/dca/lf1 0 2026-03-10T07:51:43.512 INFO:tasks.workunit.client.1.vm08.stdout:4/646: mknod d5/d8/d50/db0/cd8 0 2026-03-10T07:51:43.526 INFO:tasks.workunit.client.1.vm08.stdout:1/681: fdatasync d2/d6/de/d1f/d22/fea 0 2026-03-10T07:51:43.538 INFO:tasks.workunit.client.1.vm08.stdout:6/748: 
write d1/d3/df/d1d/f1f [329661,130299] 0 2026-03-10T07:51:43.538 INFO:tasks.workunit.client.1.vm08.stdout:5/775: write d0/d8/d24/f56 [3933920,106918] 0 2026-03-10T07:51:43.539 INFO:tasks.workunit.client.1.vm08.stdout:6/749: chown d1/db/d24/fd1 31569 1 2026-03-10T07:51:43.544 INFO:tasks.workunit.client.1.vm08.stdout:6/750: read d1/f49 [3889692,45724] 0 2026-03-10T07:51:43.547 INFO:tasks.workunit.client.1.vm08.stdout:7/693: mkdir d3/da/d25/d9/d2f/d39/d43/d4f/d5b/d79/ddd/dee 0 2026-03-10T07:51:43.549 INFO:tasks.workunit.client.1.vm08.stdout:8/837: truncate d0/d69/d3f/fbd 599401 0 2026-03-10T07:51:43.557 INFO:tasks.workunit.client.1.vm08.stdout:0/761: dread dd/d10/d14/d15/d20/d22/f51 [0,4194304] 0 2026-03-10T07:51:43.564 INFO:tasks.workunit.client.1.vm08.stdout:3/699: creat d0/d3c/d18/d32/d61/d52/dca/dd2/fe1 x:0 0 0 2026-03-10T07:51:43.569 INFO:tasks.workunit.client.1.vm08.stdout:3/700: dread d0/f45 [0,4194304] 0 2026-03-10T07:51:43.570 INFO:tasks.workunit.client.1.vm08.stdout:3/701: read d0/d3c/d18/d32/d61/d52/dca/fd0 [415465,80074] 0 2026-03-10T07:51:43.575 INFO:tasks.workunit.client.1.vm08.stdout:4/647: mkdir d5/d1f/d31/dce/dd9 0 2026-03-10T07:51:43.580 INFO:tasks.workunit.client.1.vm08.stdout:9/729: rename d2/d58/dbf/f55 to d2/d58/dbf/daf/ff2 0 2026-03-10T07:51:43.589 INFO:tasks.workunit.client.1.vm08.stdout:1/682: symlink d2/d6/d50/lee 0 2026-03-10T07:51:43.592 INFO:tasks.workunit.client.1.vm08.stdout:1/683: dwrite d2/d6/de/f1c [0,4194304] 0 2026-03-10T07:51:43.604 INFO:tasks.workunit.client.1.vm08.stdout:1/684: dwrite d2/d6/de/d47/da0/fe0 [0,4194304] 0 2026-03-10T07:51:43.609 INFO:tasks.workunit.client.1.vm08.stdout:7/694: rmdir d3/da/d25/d9/d6f 39 2026-03-10T07:51:43.614 INFO:tasks.workunit.client.1.vm08.stdout:6/751: chown d1/d3/df/d1d/c32 19 1 2026-03-10T07:51:43.614 INFO:tasks.workunit.client.1.vm08.stdout:8/838: stat d0/l14 0 2026-03-10T07:51:43.614 INFO:tasks.workunit.client.1.vm08.stdout:7/695: dwrite d3/da/d25/f27 [0,4194304] 0 2026-03-10T07:51:43.616 
INFO:tasks.workunit.client.1.vm08.stdout:7/696: write d3/f34 [4822099,45461] 0 2026-03-10T07:51:43.620 INFO:tasks.workunit.client.1.vm08.stdout:0/762: symlink dd/d10/d14/d15/d20/d5f/lf4 0 2026-03-10T07:51:43.622 INFO:tasks.workunit.client.1.vm08.stdout:0/763: fsync dd/d10/d2f/d37/d64/d95/d58/d3d/f5b 0 2026-03-10T07:51:43.628 INFO:tasks.workunit.client.1.vm08.stdout:0/764: dread - dd/d10/dbd/fbe zero size 2026-03-10T07:51:43.632 INFO:tasks.workunit.client.1.vm08.stdout:8/839: sync 2026-03-10T07:51:43.636 INFO:tasks.workunit.client.1.vm08.stdout:2/725: link d0/d1/f85 d0/d1/d17/db2/dde/fe9 0 2026-03-10T07:51:43.646 INFO:tasks.workunit.client.1.vm08.stdout:4/648: rename d5/da0/d12/d7b/d48/d4f/c8f to d5/d1f/d70/cda 0 2026-03-10T07:51:43.646 INFO:tasks.workunit.client.1.vm08.stdout:4/649: fsync d5/d1f/d31/f62 0 2026-03-10T07:51:43.654 INFO:tasks.workunit.client.1.vm08.stdout:9/730: symlink d2/de/d28/d98/dbb/dd9/lf3 0 2026-03-10T07:51:43.654 INFO:tasks.workunit.client.1.vm08.stdout:9/731: stat d2/d58/dbf/dd0/d35/d9b/fa9 0 2026-03-10T07:51:43.656 INFO:tasks.workunit.client.1.vm08.stdout:7/697: dread d3/da/d25/d9/f30 [0,4194304] 0 2026-03-10T07:51:43.660 INFO:tasks.workunit.client.1.vm08.stdout:7/698: dwrite d3/da/d25/d9/fd [0,4194304] 0 2026-03-10T07:51:43.663 INFO:tasks.workunit.client.1.vm08.stdout:7/699: dwrite d3/da/d25/d9/fc5 [0,4194304] 0 2026-03-10T07:51:43.663 INFO:tasks.workunit.client.1.vm08.stdout:7/700: fsync d3/da/f9f 0 2026-03-10T07:51:43.688 INFO:tasks.workunit.client.1.vm08.stdout:1/685: dwrite d2/d6/de/d1f/d22/f35 [0,4194304] 0 2026-03-10T07:51:43.697 INFO:tasks.workunit.client.1.vm08.stdout:3/702: unlink d0/d3c/d18/d80/f86 0 2026-03-10T07:51:43.704 INFO:tasks.workunit.client.1.vm08.stdout:8/840: creat d0/df/d15/d23/d54/dba/d89/dc5/f111 x:0 0 0 2026-03-10T07:51:43.704 INFO:tasks.workunit.client.1.vm08.stdout:2/726: rename d0/d1/d3/d56/d78/dad/db1/d61/f3d to d0/d1/d3/d39/d7d/d86/d55/d7a/fea 0 2026-03-10T07:51:43.704 
INFO:tasks.workunit.client.1.vm08.stdout:8/841: chown d0/df/d2e/d30 233722738 1 2026-03-10T07:51:43.704 INFO:tasks.workunit.client.1.vm08.stdout:4/650: mknod d5/da0/d32/cdb 0 2026-03-10T07:51:43.704 INFO:tasks.workunit.client.1.vm08.stdout:4/651: chown d5/fd4 688955765 1 2026-03-10T07:51:43.704 INFO:tasks.workunit.client.1.vm08.stdout:7/701: rmdir d3/da/d25/d9/d2f/d3a/d4b 39 2026-03-10T07:51:43.704 INFO:tasks.workunit.client.1.vm08.stdout:4/652: write d5/d8/f39 [1810162,64471] 0 2026-03-10T07:51:43.704 INFO:tasks.workunit.client.1.vm08.stdout:1/686: fdatasync d2/d6/de/d1f/d22/f30 0 2026-03-10T07:51:43.704 INFO:tasks.workunit.client.1.vm08.stdout:3/703: chown d0/d3c/d1f/d44/d51/d2d/d85/fa0 1451154234 1 2026-03-10T07:51:43.704 INFO:tasks.workunit.client.1.vm08.stdout:8/842: mknod d0/df/d15/d23/d39/d5b/dbc/c112 0 2026-03-10T07:51:43.704 INFO:tasks.workunit.client.1.vm08.stdout:7/702: symlink d3/da/d8a/dd1/lef 0 2026-03-10T07:51:43.707 INFO:tasks.workunit.client.1.vm08.stdout:2/727: creat d0/d1/d3/d56/d78/de4/feb x:0 0 0 2026-03-10T07:51:43.709 INFO:tasks.workunit.client.1.vm08.stdout:3/704: rename d0/d3c/d18/d48/l90 to d0/d3c/d18/d80/dc1/le2 0 2026-03-10T07:51:43.710 INFO:tasks.workunit.client.1.vm08.stdout:4/653: mknod d5/da0/d12/d7b/d48/d4f/d8d/d91/dd5/cdc 0 2026-03-10T07:51:43.711 INFO:tasks.workunit.client.1.vm08.stdout:3/705: dread d0/d3c/d1f/d44/f59 [0,4194304] 0 2026-03-10T07:51:43.712 INFO:tasks.workunit.client.1.vm08.stdout:8/843: rename d0/d37/lbb to d0/df/d15/d23/d39/d5b/dea/dce/l113 0 2026-03-10T07:51:43.713 INFO:tasks.workunit.client.1.vm08.stdout:4/654: unlink d5/d1f/d31/f62 0 2026-03-10T07:51:43.714 INFO:tasks.workunit.client.1.vm08.stdout:7/703: creat d3/ff0 x:0 0 0 2026-03-10T07:51:43.714 INFO:tasks.workunit.client.1.vm08.stdout:4/655: chown d5/da0/d12/d7b/da7/l8c 9244149 1 2026-03-10T07:51:43.715 INFO:tasks.workunit.client.1.vm08.stdout:4/656: write d5/f10 [4448219,96183] 0 2026-03-10T07:51:43.716 INFO:tasks.workunit.client.1.vm08.stdout:7/704: chown 
d3/da/d25/d9/d2f/d39/d43/d4f/d5b/db7 1567804 1 2026-03-10T07:51:43.718 INFO:tasks.workunit.client.1.vm08.stdout:8/844: dwrite d0/f2a [0,4194304] 0 2026-03-10T07:51:43.718 INFO:tasks.workunit.client.1.vm08.stdout:8/845: fdatasync d0/df/d2e/d49/fe0 0 2026-03-10T07:51:43.718 INFO:tasks.workunit.client.1.vm08.stdout:7/705: write d3/da/d25/d9/d2f/d39/d43/fc2 [118848,68831] 0 2026-03-10T07:51:43.722 INFO:tasks.workunit.client.1.vm08.stdout:2/728: getdents d0/d1/d3/d39/d7d 0 2026-03-10T07:51:43.722 INFO:tasks.workunit.client.1.vm08.stdout:7/706: stat d3/da/d25/d9/d2f/d3a/d40/f52 0 2026-03-10T07:51:43.725 INFO:tasks.workunit.client.1.vm08.stdout:8/846: dwrite d0/d69/d3f/fcd [0,4194304] 0 2026-03-10T07:51:43.761 INFO:tasks.workunit.client.1.vm08.stdout:8/847: stat d0/df/d17/d25/ld5 0 2026-03-10T07:51:43.765 INFO:tasks.workunit.client.1.vm08.stdout:4/657: dread d5/d8/f90 [0,4194304] 0 2026-03-10T07:51:43.771 INFO:tasks.workunit.client.1.vm08.stdout:5/776: truncate d0/d4/d19/d3a/d69/f6b 4107514 0 2026-03-10T07:51:43.771 INFO:tasks.workunit.client.1.vm08.stdout:8/848: truncate d0/f20 601254 0 2026-03-10T07:51:43.773 INFO:tasks.workunit.client.1.vm08.stdout:3/706: link d0/d3c/d1f/d44/d51/f4d d0/d3c/d18/d80/fe3 0 2026-03-10T07:51:43.774 INFO:tasks.workunit.client.1.vm08.stdout:3/707: readlink d0/d3c/d18/d80/lce 0 2026-03-10T07:51:43.784 INFO:tasks.workunit.client.1.vm08.stdout:6/752: write d1/d3/df/d1d/d40/d45/d5c/fb9 [388153,74648] 0 2026-03-10T07:51:43.784 INFO:tasks.workunit.client.1.vm08.stdout:5/777: dwrite d0/d4/d19/d3a/d69/f71 [0,4194304] 0 2026-03-10T07:51:43.784 INFO:tasks.workunit.client.1.vm08.stdout:5/778: stat d0/d4/df/dbf/d41/l90 0 2026-03-10T07:51:43.784 INFO:tasks.workunit.client.1.vm08.stdout:0/765: dwrite dd/d10/d2f/d37/d64/d95/d58/d3d/faa [0,4194304] 0 2026-03-10T07:51:43.785 INFO:tasks.workunit.client.1.vm08.stdout:3/708: dwrite d0/d3c/d1f/d89/fba [0,4194304] 0 2026-03-10T07:51:43.791 INFO:tasks.workunit.client.1.vm08.stdout:9/732: truncate d2/de/d28/f96 
3181704 0 2026-03-10T07:51:43.792 INFO:tasks.workunit.client.1.vm08.stdout:8/849: creat d0/d37/f114 x:0 0 0 2026-03-10T07:51:43.792 INFO:tasks.workunit.client.1.vm08.stdout:8/850: chown d0/df/d15/c6e 0 1 2026-03-10T07:51:43.800 INFO:tasks.workunit.client.1.vm08.stdout:9/733: read d2/d58/dbf/d2b/f6a [2762363,125450] 0 2026-03-10T07:51:43.801 INFO:tasks.workunit.client.1.vm08.stdout:9/734: fdatasync d2/d58/f95 0 2026-03-10T07:51:43.801 INFO:tasks.workunit.client.1.vm08.stdout:9/735: stat d2/de/d28/l65 0 2026-03-10T07:51:43.804 INFO:tasks.workunit.client.1.vm08.stdout:6/753: creat d1/d17/d2b/d58/d76/ff5 x:0 0 0 2026-03-10T07:51:43.806 INFO:tasks.workunit.client.1.vm08.stdout:7/707: getdents d3/da/d25/d9 0 2026-03-10T07:51:43.809 INFO:tasks.workunit.client.1.vm08.stdout:7/708: dwrite d3/da/f9f [0,4194304] 0 2026-03-10T07:51:43.811 INFO:tasks.workunit.client.1.vm08.stdout:3/709: creat d0/d3c/d1f/d44/d51/d34/fe4 x:0 0 0 2026-03-10T07:51:43.815 INFO:tasks.workunit.client.1.vm08.stdout:7/709: write d3/da/d25/d9/d2f/d39/d43/d4f/d5b/db7/f49 [216772,6641] 0 2026-03-10T07:51:43.815 INFO:tasks.workunit.client.1.vm08.stdout:7/710: write d3/f57 [5036759,3556] 0 2026-03-10T07:51:43.818 INFO:tasks.workunit.client.1.vm08.stdout:7/711: dread d3/da/d25/d9/d2f/f42 [0,4194304] 0 2026-03-10T07:51:43.832 INFO:tasks.workunit.client.1.vm08.stdout:4/658: getdents d5/d1f/d9b 0 2026-03-10T07:51:43.834 INFO:tasks.workunit.client.1.vm08.stdout:6/754: creat d1/d3/ff6 x:0 0 0 2026-03-10T07:51:43.836 INFO:tasks.workunit.client.1.vm08.stdout:4/659: dwrite d5/d8/d89/fd3 [0,4194304] 0 2026-03-10T07:51:43.844 INFO:tasks.workunit.client.1.vm08.stdout:6/755: dwrite d1/d3/ff6 [0,4194304] 0 2026-03-10T07:51:43.848 INFO:tasks.workunit.client.1.vm08.stdout:1/687: write d2/d6/de/d70/d80/fbb [590469,1144] 0 2026-03-10T07:51:43.851 INFO:tasks.workunit.client.1.vm08.stdout:2/729: dwrite d0/d1/d3/d39/d7d/d86/d55/d1b/f23 [0,4194304] 0 2026-03-10T07:51:43.854 INFO:tasks.workunit.client.1.vm08.stdout:9/736: rename 
d2/de to d2/d58/dbf/dd0/d35/d97/d9d/df4 0 2026-03-10T07:51:43.866 INFO:tasks.workunit.client.1.vm08.stdout:5/779: write d0/d4/df/f7f [997136,22364] 0 2026-03-10T07:51:43.870 INFO:tasks.workunit.client.1.vm08.stdout:0/766: dwrite dd/f16 [0,4194304] 0 2026-03-10T07:51:43.889 INFO:tasks.workunit.client.1.vm08.stdout:6/756: creat d1/d3/df/d1d/d40/d45/ff7 x:0 0 0 2026-03-10T07:51:43.890 INFO:tasks.workunit.client.1.vm08.stdout:6/757: chown d1/db/d24/d73 441775 1 2026-03-10T07:51:43.892 INFO:tasks.workunit.client.1.vm08.stdout:3/710: dwrite d0/d3c/d1f/f93 [0,4194304] 0 2026-03-10T07:51:43.892 INFO:tasks.workunit.client.1.vm08.stdout:2/730: rmdir d0/d1/d3/d39/d7d/d86/d55/db9 39 2026-03-10T07:51:43.904 INFO:tasks.workunit.client.1.vm08.stdout:9/737: dread - d2/d58/dbf/dd0/d35/d97/d9d/df4/dee/d84/d91/db8/feb zero size 2026-03-10T07:51:43.904 INFO:tasks.workunit.client.1.vm08.stdout:7/712: truncate d3/da/d25/d9/d2f/d39/f76 857354 0 2026-03-10T07:51:43.905 INFO:tasks.workunit.client.1.vm08.stdout:7/713: chown d3/f93 1 1 2026-03-10T07:51:43.905 INFO:tasks.workunit.client.1.vm08.stdout:4/660: creat d5/da0/db7/fdd x:0 0 0 2026-03-10T07:51:43.906 INFO:tasks.workunit.client.1.vm08.stdout:2/731: creat d0/d1/d3/d56/d78/dad/fec x:0 0 0 2026-03-10T07:51:43.911 INFO:tasks.workunit.client.1.vm08.stdout:5/780: mkdir d0/d4/df/dbf/d41/dc8/d101 0 2026-03-10T07:51:43.912 INFO:tasks.workunit.client.1.vm08.stdout:5/781: chown d0/d77/d83/c9f 32 1 2026-03-10T07:51:43.912 INFO:tasks.workunit.client.1.vm08.stdout:8/851: write d0/f20 [497035,19495] 0 2026-03-10T07:51:43.913 INFO:tasks.workunit.client.1.vm08.stdout:8/852: chown d0/df/d15/d23/d54/dba/d89/fb6 6114 1 2026-03-10T07:51:43.914 INFO:tasks.workunit.client.1.vm08.stdout:2/732: rmdir d0/d1/d3 39 2026-03-10T07:51:43.920 INFO:tasks.workunit.client.1.vm08.stdout:4/661: mkdir d5/da0/d12/d7b/d48/d4f/d7c/dde 0 2026-03-10T07:51:43.923 INFO:tasks.workunit.client.1.vm08.stdout:9/738: symlink d2/d58/dbf/dd0/d35/d97/d9d/df4/dee/d84/dca/dd7/lf5 0 
2026-03-10T07:51:43.923 INFO:tasks.workunit.client.1.vm08.stdout:9/739: write d2/f51 [1116784,22865] 0 2026-03-10T07:51:43.927 INFO:tasks.workunit.client.1.vm08.stdout:5/782: unlink d0/d4/d19/d60/d6d/d70/fb4 0 2026-03-10T07:51:43.928 INFO:tasks.workunit.client.1.vm08.stdout:8/853: rename d0/df/d2e/d30/fec to d0/df/d15/d23/d39/d5b/d4a/f115 0 2026-03-10T07:51:43.929 INFO:tasks.workunit.client.1.vm08.stdout:8/854: chown d0/df/d15/d23/da8 11871333 1 2026-03-10T07:51:43.929 INFO:tasks.workunit.client.1.vm08.stdout:5/783: write d0/d4/d19/d60/d6d/d70/d40/dba/fe5 [597726,25506] 0 2026-03-10T07:51:43.937 INFO:tasks.workunit.client.1.vm08.stdout:9/740: mknod d2/d58/dbf/dd0/d35/d97/d9d/cf6 0 2026-03-10T07:51:43.939 INFO:tasks.workunit.client.1.vm08.stdout:8/855: rmdir d0/d69/d3f 39 2026-03-10T07:51:43.942 INFO:tasks.workunit.client.1.vm08.stdout:5/784: truncate d0/d4/df/f57 372108 0 2026-03-10T07:51:43.946 INFO:tasks.workunit.client.1.vm08.stdout:4/662: getdents d5/d1f/dad/dcf 0 2026-03-10T07:51:43.946 INFO:tasks.workunit.client.1.vm08.stdout:1/688: dread d2/d6/de/d1f/d26/d58/d83/f72 [0,4194304] 0 2026-03-10T07:51:43.948 INFO:tasks.workunit.client.1.vm08.stdout:9/741: creat d2/d58/dbf/daf/ff7 x:0 0 0 2026-03-10T07:51:43.951 INFO:tasks.workunit.client.1.vm08.stdout:8/856: mkdir d0/df/d15/d23/d54/dba/d89/dbf/d116 0 2026-03-10T07:51:43.951 INFO:tasks.workunit.client.1.vm08.stdout:1/689: dwrite d2/d6/de/d1f/d40/d76/fd1 [0,4194304] 0 2026-03-10T07:51:43.960 INFO:tasks.workunit.client.1.vm08.stdout:5/785: dread d0/d8/d24/f3e [0,4194304] 0 2026-03-10T07:51:43.980 INFO:tasks.workunit.client.1.vm08.stdout:0/767: write f8 [3219543,45407] 0 2026-03-10T07:51:43.981 INFO:tasks.workunit.client.1.vm08.stdout:6/758: write d1/d17/d2b/f68 [1051567,14195] 0 2026-03-10T07:51:43.986 INFO:tasks.workunit.client.1.vm08.stdout:7/714: write d3/da/d25/d9/f23 [2062404,59580] 0 2026-03-10T07:51:43.986 INFO:tasks.workunit.client.1.vm08.stdout:3/711: dwrite d0/d3c/d18/fa6 [0,4194304] 0 
2026-03-10T07:51:43.996 INFO:tasks.workunit.client.1.vm08.stdout:1/690: fsync d2/d6/de/d1f/d40/d76/f79 0 2026-03-10T07:51:43.997 INFO:tasks.workunit.client.1.vm08.stdout:2/733: write d0/d1/d3/d10/d38/f53 [1041631,119542] 0 2026-03-10T07:51:43.999 INFO:tasks.workunit.client.1.vm08.stdout:4/663: fdatasync d5/d1f/d41/f83 0 2026-03-10T07:51:44.003 INFO:tasks.workunit.client.1.vm08.stdout:4/664: dwrite d5/da0/db7/fdd [0,4194304] 0 2026-03-10T07:51:44.005 INFO:tasks.workunit.client.1.vm08.stdout:0/768: stat dd/d10/d14/f46 0 2026-03-10T07:51:44.005 INFO:tasks.workunit.client.1.vm08.stdout:6/759: symlink d1/d3/df/d52/lf8 0 2026-03-10T07:51:44.005 INFO:tasks.workunit.client.1.vm08.stdout:0/769: write dd/d10/d14/fe0 [107913,118230] 0 2026-03-10T07:51:44.058 INFO:tasks.workunit.client.1.vm08.stdout:3/712: dwrite d0/d3c/d1f/d44/d51/d34/f3d [0,4194304] 0 2026-03-10T07:51:44.060 INFO:tasks.workunit.client.1.vm08.stdout:3/713: chown d0/d3c/d1f/d44/d51/f5b 433333 1 2026-03-10T07:51:44.077 INFO:tasks.workunit.client.1.vm08.stdout:6/760: creat d1/d17/d2b/d58/d77/ff9 x:0 0 0 2026-03-10T07:51:44.078 INFO:tasks.workunit.client.1.vm08.stdout:6/761: write d1/db/f57 [5056100,82562] 0 2026-03-10T07:51:44.078 INFO:tasks.workunit.client.1.vm08.stdout:6/762: write d1/f6 [1947001,74623] 0 2026-03-10T07:51:44.088 INFO:tasks.workunit.client.1.vm08.stdout:0/770: rmdir dd/d10/d2f/d37/daf 39 2026-03-10T07:51:44.090 INFO:tasks.workunit.client.1.vm08.stdout:5/786: dwrite d0/d4/d19/d81/da4/fbe [0,4194304] 0 2026-03-10T07:51:44.113 INFO:tasks.workunit.client.1.vm08.stdout:9/742: rename d2/d58/dbf/d2b/ce9 to d2/d58/dbf/dd0/cf8 0 2026-03-10T07:51:44.121 INFO:tasks.workunit.client.1.vm08.stdout:4/665: mkdir d5/da0/d12/d7b/d48/d4f/d7c/dde/ddf 0 2026-03-10T07:51:44.122 INFO:tasks.workunit.client.1.vm08.stdout:3/714: symlink d0/d3c/d18/d32/daa/le5 0 2026-03-10T07:51:44.125 INFO:tasks.workunit.client.1.vm08.stdout:4/666: dwrite d5/d8/d89/fd3 [0,4194304] 0 2026-03-10T07:51:44.126 
INFO:tasks.workunit.client.1.vm08.stdout:4/667: chown d5/da0/d12/d7b/d48/d4f/d7c 19978 1 2026-03-10T07:51:44.127 INFO:tasks.workunit.client.1.vm08.stdout:6/763: mknod d1/d17/d2b/d58/d77/daf/cfa 0 2026-03-10T07:51:44.136 INFO:tasks.workunit.client.1.vm08.stdout:5/787: fdatasync d0/d4/d19/d81/d92/f74 0 2026-03-10T07:51:44.138 INFO:tasks.workunit.client.1.vm08.stdout:7/715: rename d3/da/d25/d9/d2f/d3a/d71/la0 to d3/da/d8a/dd1/lf1 0 2026-03-10T07:51:44.140 INFO:tasks.workunit.client.1.vm08.stdout:8/857: creat d0/df/d2e/d30/f117 x:0 0 0 2026-03-10T07:51:44.140 INFO:tasks.workunit.client.1.vm08.stdout:1/691: link d2/d6/de/d1f/d26/d58/d8c/l6a d2/d10/dc6/lef 0 2026-03-10T07:51:44.141 INFO:tasks.workunit.client.1.vm08.stdout:9/743: dwrite d2/d58/dbf/dd0/d35/d97/dd5/fe6 [0,4194304] 0 2026-03-10T07:51:44.154 INFO:tasks.workunit.client.1.vm08.stdout:3/715: creat d0/d3c/d18/d80/fe6 x:0 0 0 2026-03-10T07:51:44.154 INFO:tasks.workunit.client.1.vm08.stdout:4/668: creat d5/da0/d12/d7b/d48/d4f/fe0 x:0 0 0 2026-03-10T07:51:44.155 INFO:tasks.workunit.client.1.vm08.stdout:4/669: readlink d5/d1f/d31/l4a 0 2026-03-10T07:51:44.155 INFO:tasks.workunit.client.1.vm08.stdout:4/670: readlink d5/da0/l5c 0 2026-03-10T07:51:44.163 INFO:tasks.workunit.client.1.vm08.stdout:4/671: dread d5/da0/d95/dc2/fc5 [0,4194304] 0 2026-03-10T07:51:44.165 INFO:tasks.workunit.client.1.vm08.stdout:3/716: dread d0/d3c/d18/fa5 [0,4194304] 0 2026-03-10T07:51:44.165 INFO:tasks.workunit.client.1.vm08.stdout:3/717: chown d0/d3c/d18/d4a/f8a 899 1 2026-03-10T07:51:44.166 INFO:tasks.workunit.client.1.vm08.stdout:3/718: readlink d0/d3c/d18/da9/lae 0 2026-03-10T07:51:44.170 INFO:tasks.workunit.client.1.vm08.stdout:6/764: rename d1/d3/df/f85 to d1/d3/df/d1d/d40/d87/d95/ffb 0 2026-03-10T07:51:44.171 INFO:tasks.workunit.client.1.vm08.stdout:8/858: rmdir d0/df/d17/d25 39 2026-03-10T07:51:44.172 INFO:tasks.workunit.client.1.vm08.stdout:8/859: write d0/df/d15/d23/d54/f8f [277534,124683] 0 2026-03-10T07:51:44.174 
INFO:tasks.workunit.client.1.vm08.stdout:9/744: fdatasync d2/d58/dbf/dd0/f46 0 2026-03-10T07:51:44.177 INFO:tasks.workunit.client.1.vm08.stdout:2/734: getdents d0/d1/d3/d56 0 2026-03-10T07:51:44.178 INFO:tasks.workunit.client.1.vm08.stdout:1/692: dwrite d2/d6/de/d1f/d26/f6e [4194304,4194304] 0 2026-03-10T07:51:44.189 INFO:tasks.workunit.client.1.vm08.stdout:3/719: mknod d0/d3c/d1f/d89/ce7 0 2026-03-10T07:51:44.191 INFO:tasks.workunit.client.1.vm08.stdout:5/788: symlink d0/d4/d19/d60/d6d/d70/dff/l102 0 2026-03-10T07:51:44.200 INFO:tasks.workunit.client.1.vm08.stdout:6/765: symlink d1/d3/df/d1d/d40/d87/lfc 0 2026-03-10T07:51:44.201 INFO:tasks.workunit.client.1.vm08.stdout:6/766: chown d1/db/d24/d73/d79/f98 1760026541 1 2026-03-10T07:51:44.201 INFO:tasks.workunit.client.1.vm08.stdout:7/716: mknod d3/da/d25/d9/d6f/cf2 0 2026-03-10T07:51:44.210 INFO:tasks.workunit.client.1.vm08.stdout:9/745: symlink d2/d58/dbf/dd0/d35/d97/d9d/df4/d28/d98/dbb/lf9 0 2026-03-10T07:51:44.212 INFO:tasks.workunit.client.1.vm08.stdout:2/735: mkdir d0/d1/d3/d39/d7d/d86/d55/dc9/ded 0 2026-03-10T07:51:44.212 INFO:tasks.workunit.client.1.vm08.stdout:4/672: dwrite d5/da0/d95/f96 [0,4194304] 0 2026-03-10T07:51:44.212 INFO:tasks.workunit.client.1.vm08.stdout:2/736: fsync d0/d1/d3/f4e 0 2026-03-10T07:51:44.212 INFO:tasks.workunit.client.1.vm08.stdout:0/771: link dd/d10/d2f/d37/daf/fb3 dd/d10/d14/d15/d20/d5f/ff5 0 2026-03-10T07:51:44.213 INFO:tasks.workunit.client.1.vm08.stdout:3/720: truncate d0/d3c/d1f/d44/d51/d34/f47 4888713 0 2026-03-10T07:51:44.219 INFO:tasks.workunit.client.1.vm08.stdout:2/737: dwrite d0/d1/d3/d10/d38/f53 [0,4194304] 0 2026-03-10T07:51:44.219 INFO:tasks.workunit.client.1.vm08.stdout:4/673: write d5/d1f/d31/f82 [4073842,127062] 0 2026-03-10T07:51:44.225 INFO:tasks.workunit.client.1.vm08.stdout:4/674: chown d5/da0/d95/f96 11099934 1 2026-03-10T07:51:44.225 INFO:tasks.workunit.client.1.vm08.stdout:8/860: fdatasync d0/df/d2e/d30/f76 0 2026-03-10T07:51:44.228 
INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:51:44 vm05.local ceph-mon[50387]: pgmap v41: 65 pgs: 65 active+clean; 3.0 GiB data, 9.8 GiB used, 110 GiB / 120 GiB avail; 36 MiB/s rd, 89 MiB/s wr, 217 op/s 2026-03-10T07:51:44.231 INFO:tasks.workunit.client.1.vm08.stdout:7/717: sync 2026-03-10T07:51:44.231 INFO:tasks.workunit.client.1.vm08.stdout:9/746: sync 2026-03-10T07:51:44.232 INFO:tasks.workunit.client.1.vm08.stdout:2/738: mkdir d0/d1/d3/d56/d78/dad/db1/d61/dee 0 2026-03-10T07:51:44.233 INFO:tasks.workunit.client.1.vm08.stdout:2/739: write d0/d1/d3/d56/d78/dad/fec [802996,3680] 0 2026-03-10T07:51:44.234 INFO:tasks.workunit.client.1.vm08.stdout:1/693: creat d2/d6/ff0 x:0 0 0 2026-03-10T07:51:44.235 INFO:tasks.workunit.client.1.vm08.stdout:2/740: stat d0/d1/d3/d56/d78/dad/db1/d61/d84/da7 0 2026-03-10T07:51:44.240 INFO:tasks.workunit.client.1.vm08.stdout:3/721: dread d0/f39 [0,4194304] 0 2026-03-10T07:51:44.244 INFO:tasks.workunit.client.1.vm08.stdout:8/861: rmdir d0/df/d2e 39 2026-03-10T07:51:44.244 INFO:tasks.workunit.client.1.vm08.stdout:3/722: dread d0/d3c/d1f/d44/d51/d2d/f3a [0,4194304] 0 2026-03-10T07:51:44.244 INFO:tasks.workunit.client.1.vm08.stdout:4/675: creat d5/da0/d12/d7b/d48/d4f/fe1 x:0 0 0 2026-03-10T07:51:44.245 INFO:tasks.workunit.client.1.vm08.stdout:9/747: stat d2/d26/da4/la7 0 2026-03-10T07:51:44.245 INFO:tasks.workunit.client.1.vm08.stdout:9/748: dread - d2/d58/dbf/dd0/d35/fdd zero size 2026-03-10T07:51:44.246 INFO:tasks.workunit.client.1.vm08.stdout:7/718: unlink d3/da/d25/d9/d2f/d3a/d40/f63 0 2026-03-10T07:51:44.253 INFO:tasks.workunit.client.1.vm08.stdout:8/862: dwrite d0/df/d2e/d49/ff3 [0,4194304] 0 2026-03-10T07:51:44.257 INFO:tasks.workunit.client.1.vm08.stdout:1/694: symlink d2/d6/de/lf1 0 2026-03-10T07:51:44.261 INFO:tasks.workunit.client.1.vm08.stdout:8/863: dread - d0/df/d2e/d30/f117 zero size 2026-03-10T07:51:44.261 INFO:tasks.workunit.client.1.vm08.stdout:5/789: dread d0/d8/f7e [0,4194304] 0 2026-03-10T07:51:44.262 
INFO:tasks.workunit.client.1.vm08.stdout:2/741: dread d0/d1/d3/d56/d78/dad/db1/d61/f59 [0,4194304] 0 2026-03-10T07:51:44.269 INFO:tasks.workunit.client.1.vm08.stdout:7/719: mkdir d3/da/d25/df3 0 2026-03-10T07:51:44.275 INFO:tasks.workunit.client.1.vm08.stdout:1/695: dwrite d2/d6/de/d1f/d26/d58/fb9 [0,4194304] 0 2026-03-10T07:51:44.284 INFO:tasks.workunit.client.1.vm08.stdout:5/790: truncate d0/d4/df/dbf/fa6 5088165 0 2026-03-10T07:51:44.286 INFO:tasks.workunit.client.1.vm08.stdout:5/791: readlink d0/d77/le9 0 2026-03-10T07:51:44.291 INFO:tasks.workunit.client.1.vm08.stdout:9/749: dread d2/d58/dbf/f7c [0,4194304] 0 2026-03-10T07:51:44.293 INFO:tasks.workunit.client.1.vm08.stdout:3/723: dread d0/d3c/d18/f38 [0,4194304] 0 2026-03-10T07:51:44.294 INFO:tasks.workunit.client.1.vm08.stdout:8/864: dread d0/df/f1b [0,4194304] 0 2026-03-10T07:51:44.302 INFO:tasks.workunit.client.1.vm08.stdout:8/865: readlink d0/d37/d86/le7 0 2026-03-10T07:51:44.302 INFO:tasks.workunit.client.1.vm08.stdout:3/724: dread d0/d3c/d1f/d89/fba [0,4194304] 0 2026-03-10T07:51:44.302 INFO:tasks.workunit.client.1.vm08.stdout:7/720: creat d3/da/d25/d9/d2f/d3a/d71/d8c/ddf/ff4 x:0 0 0 2026-03-10T07:51:44.302 INFO:tasks.workunit.client.1.vm08.stdout:1/696: write d2/d6/de/d1f/d26/f29 [1699586,26711] 0 2026-03-10T07:51:44.302 INFO:tasks.workunit.client.1.vm08.stdout:1/697: stat d2/d6/de/c56 0 2026-03-10T07:51:44.302 INFO:tasks.workunit.client.1.vm08.stdout:8/866: dwrite d0/df/d15/d23/d39/d5b/d4a/fa7 [0,4194304] 0 2026-03-10T07:51:44.302 INFO:tasks.workunit.client.1.vm08.stdout:5/792: creat d0/d4/d19/d3a/d69/f103 x:0 0 0 2026-03-10T07:51:44.303 INFO:tasks.workunit.client.1.vm08.stdout:1/698: stat d2/d6/d50/lee 0 2026-03-10T07:51:44.304 INFO:tasks.workunit.client.1.vm08.stdout:5/793: chown d0/d4/l9a 13980996 1 2026-03-10T07:51:44.306 INFO:tasks.workunit.client.1.vm08.stdout:7/721: write d3/da/d25/d9/d2f/d39/d43/d4f/d5b/db7/dac/fd3 [1038818,35439] 0 2026-03-10T07:51:44.307 
INFO:tasks.workunit.client.1.vm08.stdout:9/750: mknod d2/d58/dbf/daf/cfa 0 2026-03-10T07:51:44.307 INFO:tasks.workunit.client.1.vm08.stdout:0/772: write dd/d10/d2f/d37/d64/f68 [1780981,115909] 0 2026-03-10T07:51:44.316 INFO:tasks.workunit.client.1.vm08.stdout:8/867: rmdir d0/df/d15/d23/d39 39 2026-03-10T07:51:44.317 INFO:tasks.workunit.client.1.vm08.stdout:6/767: truncate d1/d3/df/d1d/d40/d45/d5c/fb9 2571248 0 2026-03-10T07:51:44.317 INFO:tasks.workunit.client.1.vm08.stdout:1/699: mknod d2/d6/de/d1f/d26/d58/d83/cf2 0 2026-03-10T07:51:44.325 INFO:tasks.workunit.client.1.vm08.stdout:2/742: dread d0/d1/d3/d56/d57/f5b [0,4194304] 0 2026-03-10T07:51:44.325 INFO:tasks.workunit.client.1.vm08.stdout:5/794: creat d0/d33/ddd/f104 x:0 0 0 2026-03-10T07:51:44.327 INFO:tasks.workunit.client.1.vm08.stdout:6/768: creat d1/d46/ffd x:0 0 0 2026-03-10T07:51:44.329 INFO:tasks.workunit.client.1.vm08.stdout:4/676: write d5/d1f/f37 [1555280,75093] 0 2026-03-10T07:51:44.329 INFO:tasks.workunit.client.1.vm08.stdout:4/677: chown d5/da0/l5c 7405 1 2026-03-10T07:51:44.335 INFO:tasks.workunit.client.1.vm08.stdout:0/773: symlink dd/d10/d2f/d37/d64/d95/d5c/dca/ddb/lf6 0 2026-03-10T07:51:44.338 INFO:tasks.workunit.client.1.vm08.stdout:9/751: write d2/d58/dbf/dd0/f5d [4421926,120223] 0 2026-03-10T07:51:44.343 INFO:tasks.workunit.client.1.vm08.stdout:2/743: truncate d0/d1/d3/d10/d65/fc5 360188 0 2026-03-10T07:51:44.344 INFO:tasks.workunit.client.1.vm08.stdout:2/744: chown d0/d1/d3/d39/d7d/d86/d55/l16 2 1 2026-03-10T07:51:44.344 INFO:tasks.workunit.client.1.vm08.stdout:2/745: chown d0/d1/d3/f96 788969122 1 2026-03-10T07:51:44.345 INFO:tasks.workunit.client.1.vm08.stdout:7/722: dwrite d3/da/f6b [0,4194304] 0 2026-03-10T07:51:44.345 INFO:tasks.workunit.client.1.vm08.stdout:7/723: chown d3/da/d25/d9/d2f/d6c/f98 76652149 1 2026-03-10T07:51:44.353 INFO:tasks.workunit.client.1.vm08.stdout:5/795: creat d0/d4/d19/d60/d6d/d70/d40/f105 x:0 0 0 2026-03-10T07:51:44.355 
INFO:tasks.workunit.client.1.vm08.stdout:5/796: chown d0/d4/df/d82 9 1 2026-03-10T07:51:44.355 INFO:tasks.workunit.client.1.vm08.stdout:3/725: dread d0/d3c/d1f/d44/d51/d34/f47 [0,4194304] 0 2026-03-10T07:51:44.356 INFO:tasks.workunit.client.1.vm08.stdout:6/769: creat d1/d3/df/d1d/d40/d87/d95/ffe x:0 0 0 2026-03-10T07:51:44.370 INFO:tasks.workunit.client.1.vm08.stdout:1/700: dread d2/d6/de/d1f/d26/d58/d8c/f44 [0,4194304] 0 2026-03-10T07:51:44.371 INFO:tasks.workunit.client.1.vm08.stdout:8/868: truncate d0/df/f19 2318157 0 2026-03-10T07:51:44.373 INFO:tasks.workunit.client.1.vm08.stdout:1/701: dwrite d2/d6/de/d1f/da9/fcd [0,4194304] 0 2026-03-10T07:51:44.405 INFO:tasks.workunit.client.1.vm08.stdout:2/746: write d0/d1/d17/d6b/fe3 [1136274,115617] 0 2026-03-10T07:51:44.406 INFO:tasks.workunit.client.1.vm08.stdout:5/797: mkdir d0/d4/d19/d3a/d106 0 2026-03-10T07:51:44.406 INFO:tasks.workunit.client.1.vm08.stdout:4/678: mkdir d5/da0/de2 0 2026-03-10T07:51:44.407 INFO:tasks.workunit.client.1.vm08.stdout:6/770: mkdir d1/d3/d3e/dff 0 2026-03-10T07:51:44.408 INFO:tasks.workunit.client.1.vm08.stdout:9/752: mkdir d2/d58/dbf/dd0/d35/d97/dfb 0 2026-03-10T07:51:44.410 INFO:tasks.workunit.client.1.vm08.stdout:2/747: fdatasync d0/d1/f9f 0 2026-03-10T07:51:44.411 INFO:tasks.workunit.client.1.vm08.stdout:5/798: symlink d0/d8/dce/l107 0 2026-03-10T07:51:44.411 INFO:tasks.workunit.client.1.vm08.stdout:4/679: mknod d5/da0/d32/ce3 0 2026-03-10T07:51:44.415 INFO:tasks.workunit.client.1.vm08.stdout:6/771: creat d1/d46/f100 x:0 0 0 2026-03-10T07:51:44.415 INFO:tasks.workunit.client.1.vm08.stdout:4/680: stat d5/da0/d12/d7b/d48/d4f/d8d 0 2026-03-10T07:51:44.417 INFO:tasks.workunit.client.1.vm08.stdout:0/774: dread dd/d10/f5e [0,4194304] 0 2026-03-10T07:51:44.417 INFO:tasks.workunit.client.1.vm08.stdout:8/869: dwrite d0/df/d15/d23/d39/d5b/dbc/ffc [0,4194304] 0 2026-03-10T07:51:44.418 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:51:44 vm08.local ceph-mon[59917]: pgmap v41: 65 pgs: 65 
active+clean; 3.0 GiB data, 9.8 GiB used, 110 GiB / 120 GiB avail; 36 MiB/s rd, 89 MiB/s wr, 217 op/s 2026-03-10T07:51:44.423 INFO:tasks.workunit.client.1.vm08.stdout:5/799: stat d0/d77/daa/ce3 0 2026-03-10T07:51:44.423 INFO:tasks.workunit.client.1.vm08.stdout:3/726: link d0/d3c/d18/f38 d0/d3c/d1f/d44/d51/d2d/fe8 0 2026-03-10T07:51:44.424 INFO:tasks.workunit.client.1.vm08.stdout:5/800: readlink d0/d4/d19/d3a/d69/lc7 0 2026-03-10T07:51:44.425 INFO:tasks.workunit.client.1.vm08.stdout:6/772: fdatasync d1/d3/d3e/f4a 0 2026-03-10T07:51:44.425 INFO:tasks.workunit.client.1.vm08.stdout:4/681: rename d5/da0/d32/f44 to d5/da0/d12/d7b/d48/d4f/d7c/dde/ddf/fe4 0 2026-03-10T07:51:44.425 INFO:tasks.workunit.client.1.vm08.stdout:2/748: symlink d0/d1/d17/db2/dde/lef 0 2026-03-10T07:51:44.426 INFO:tasks.workunit.client.1.vm08.stdout:0/775: dwrite dd/d10/d2f/d37/d64/d95/d58/d3d/f5b [4194304,4194304] 0 2026-03-10T07:51:44.452 INFO:tasks.workunit.client.1.vm08.stdout:1/702: getdents d2/d6/de/d1f 0 2026-03-10T07:51:44.454 INFO:tasks.workunit.client.1.vm08.stdout:8/870: creat d0/df/d15/d23/d54/dba/d89/dbf/d116/f118 x:0 0 0 2026-03-10T07:51:44.466 INFO:tasks.workunit.client.1.vm08.stdout:5/801: creat d0/d77/daa/f108 x:0 0 0 2026-03-10T07:51:44.468 INFO:tasks.workunit.client.1.vm08.stdout:6/773: dwrite d1/d3/df/d38/fd8 [0,4194304] 0 2026-03-10T07:51:44.469 INFO:tasks.workunit.client.1.vm08.stdout:0/776: mkdir dd/d10/d14/d15/d20/d92/dc1/ded/df7 0 2026-03-10T07:51:44.471 INFO:tasks.workunit.client.1.vm08.stdout:3/727: fdatasync d0/d3c/d1f/d44/d51/f65 0 2026-03-10T07:51:44.474 INFO:tasks.workunit.client.1.vm08.stdout:0/777: dwrite dd/d10/d2f/d37/d64/d95/d5c/dca/ff2 [0,4194304] 0 2026-03-10T07:51:44.474 INFO:tasks.workunit.client.1.vm08.stdout:8/871: symlink d0/d37/d86/l119 0 2026-03-10T07:51:44.474 INFO:tasks.workunit.client.1.vm08.stdout:0/778: chown dd/d10/d14/d15 102055567 1 2026-03-10T07:51:44.479 INFO:tasks.workunit.client.1.vm08.stdout:2/749: mkdir 
d0/d1/d3/d56/d78/dad/db1/d61/d84/da7/df0 0 2026-03-10T07:51:44.479 INFO:tasks.workunit.client.1.vm08.stdout:6/774: creat d1/d17/d2b/d58/d77/f101 x:0 0 0 2026-03-10T07:51:44.479 INFO:tasks.workunit.client.1.vm08.stdout:7/724: truncate d3/f57 1837728 0 2026-03-10T07:51:44.480 INFO:tasks.workunit.client.1.vm08.stdout:5/802: sync 2026-03-10T07:51:44.481 INFO:tasks.workunit.client.1.vm08.stdout:5/803: chown d0/d4/f75 203118853 1 2026-03-10T07:51:44.483 INFO:tasks.workunit.client.1.vm08.stdout:6/775: mknod d1/db/d24/d73/d79/c102 0 2026-03-10T07:51:44.484 INFO:tasks.workunit.client.1.vm08.stdout:0/779: symlink dd/d10/d14/d15/d20/d92/dc1/de5/lf8 0 2026-03-10T07:51:44.484 INFO:tasks.workunit.client.1.vm08.stdout:4/682: getdents d5/d1f/d31 0 2026-03-10T07:51:44.484 INFO:tasks.workunit.client.1.vm08.stdout:5/804: dread - d0/ddf/fd5 zero size 2026-03-10T07:51:44.485 INFO:tasks.workunit.client.1.vm08.stdout:1/703: getdents d2/d6/de/d1f/d40 0 2026-03-10T07:51:44.485 INFO:tasks.workunit.client.1.vm08.stdout:4/683: readlink d5/d1f/d31/l4a 0 2026-03-10T07:51:44.486 INFO:tasks.workunit.client.1.vm08.stdout:3/728: rename d0/d3c/d18/d80/lce to d0/le9 0 2026-03-10T07:51:44.487 INFO:tasks.workunit.client.1.vm08.stdout:7/725: getdents d3/da/d25/d9/d2f/d3a/d40/d54/db5 0 2026-03-10T07:51:44.490 INFO:tasks.workunit.client.1.vm08.stdout:6/776: chown d1/db/d24/d3d/c80 0 1 2026-03-10T07:51:44.493 INFO:tasks.workunit.client.1.vm08.stdout:2/750: truncate d0/d1/f9f 3878189 0 2026-03-10T07:51:44.494 INFO:tasks.workunit.client.1.vm08.stdout:9/753: write d2/d58/dbf/dd0/d35/d97/d9d/df4/dee/f5f [2902755,68767] 0 2026-03-10T07:51:44.497 INFO:tasks.workunit.client.1.vm08.stdout:5/805: dread d0/d4/d19/d3a/d69/f71 [0,4194304] 0 2026-03-10T07:51:44.498 INFO:tasks.workunit.client.1.vm08.stdout:4/684: dwrite d5/da0/d12/d7b/d48/d4f/f56 [0,4194304] 0 2026-03-10T07:51:44.503 INFO:tasks.workunit.client.1.vm08.stdout:7/726: dwrite d3/da/d25/d9/d2f/d39/d43/fc2 [0,4194304] 0 2026-03-10T07:51:44.507 
INFO:tasks.workunit.client.1.vm08.stdout:3/729: creat d0/d3c/d1f/d89/fea x:0 0 0 2026-03-10T07:51:44.508 INFO:tasks.workunit.client.1.vm08.stdout:6/777: creat d1/d3/df/d38/f103 x:0 0 0 2026-03-10T07:51:44.509 INFO:tasks.workunit.client.1.vm08.stdout:9/754: mkdir d2/d58/dbf/dd0/d35/d97/d9d/df4/d28/d98/dbb/dd9/dfc 0 2026-03-10T07:51:44.512 INFO:tasks.workunit.client.1.vm08.stdout:9/755: chown d2/d58/dbf/dd0/d35/d97/d9d/df4/d28/d98/dbb/lf9 795078917 1 2026-03-10T07:51:44.512 INFO:tasks.workunit.client.1.vm08.stdout:5/806: rmdir d0/d8/dce 39 2026-03-10T07:51:44.515 INFO:tasks.workunit.client.1.vm08.stdout:3/730: dread - d0/d3c/d18/d48/d55/d56/fbc zero size 2026-03-10T07:51:44.516 INFO:tasks.workunit.client.1.vm08.stdout:3/731: readlink d0/d3c/d18/d32/l9b 0 2026-03-10T07:51:44.516 INFO:tasks.workunit.client.1.vm08.stdout:3/732: dread - d0/d3c/d1f/f7e zero size 2026-03-10T07:51:44.521 INFO:tasks.workunit.client.1.vm08.stdout:2/751: rename d0/d1/d3/d56/d78/dad/fec to d0/d1/d3/d56/d78/dad/db1/d61/ff1 0 2026-03-10T07:51:44.530 INFO:tasks.workunit.client.1.vm08.stdout:9/756: creat d2/d58/ffd x:0 0 0 2026-03-10T07:51:44.536 INFO:tasks.workunit.client.1.vm08.stdout:4/685: dread d5/da0/d12/d7b/d48/f5b [0,4194304] 0 2026-03-10T07:51:44.536 INFO:tasks.workunit.client.1.vm08.stdout:2/752: mknod d0/d1/cf2 0 2026-03-10T07:51:44.536 INFO:tasks.workunit.client.1.vm08.stdout:5/807: creat d0/d4/d19/d3a/d106/f109 x:0 0 0 2026-03-10T07:51:44.537 INFO:tasks.workunit.client.1.vm08.stdout:4/686: creat d5/da0/d12/d7b/d48/d4f/fe5 x:0 0 0 2026-03-10T07:51:44.539 INFO:tasks.workunit.client.1.vm08.stdout:5/808: write d0/d4/d19/d3a/d69/f103 [706522,55304] 0 2026-03-10T07:51:44.539 INFO:tasks.workunit.client.1.vm08.stdout:6/778: getdents d1/d3/df/d1d/d40/d45/d5c 0 2026-03-10T07:51:44.540 INFO:tasks.workunit.client.1.vm08.stdout:5/809: write d0/d4/df/ff5 [951591,96616] 0 2026-03-10T07:51:44.550 INFO:tasks.workunit.client.1.vm08.stdout:9/757: creat d2/d26/ffe x:0 0 0 2026-03-10T07:51:44.550 
INFO:tasks.workunit.client.1.vm08.stdout:4/687: rmdir d5/d8/d50/db0 39 2026-03-10T07:51:44.550 INFO:tasks.workunit.client.1.vm08.stdout:4/688: write d5/da0/d12/d7b/d48/d4f/d8d/fcd [955028,3956] 0 2026-03-10T07:51:44.550 INFO:tasks.workunit.client.1.vm08.stdout:3/733: getdents d0/d3c/d18/d32/d61/d52 0 2026-03-10T07:51:44.550 INFO:tasks.workunit.client.1.vm08.stdout:9/758: sync 2026-03-10T07:51:44.550 INFO:tasks.workunit.client.1.vm08.stdout:9/759: mkdir d2/d58/dbf/dd0/d35/dff 0 2026-03-10T07:51:44.550 INFO:tasks.workunit.client.1.vm08.stdout:6/779: creat d1/d3/df/d1d/d40/d87/d95/f104 x:0 0 0 2026-03-10T07:51:44.553 INFO:tasks.workunit.client.1.vm08.stdout:5/810: chown d0/d4/d19/d43/f35 30828850 1 2026-03-10T07:51:44.559 INFO:tasks.workunit.client.1.vm08.stdout:6/780: readlink d1/d3/l27 0 2026-03-10T07:51:44.564 INFO:tasks.workunit.client.1.vm08.stdout:3/734: dread d0/d3c/d1f/d95/fab [0,4194304] 0 2026-03-10T07:51:44.567 INFO:tasks.workunit.client.1.vm08.stdout:2/753: dread d0/d1/d3/d10/d65/fae [0,4194304] 0 2026-03-10T07:51:44.571 INFO:tasks.workunit.client.1.vm08.stdout:3/735: dread d0/f45 [0,4194304] 0 2026-03-10T07:51:44.572 INFO:tasks.workunit.client.1.vm08.stdout:3/736: fsync d0/d3c/d18/da9/fd5 0 2026-03-10T07:51:44.578 INFO:tasks.workunit.client.1.vm08.stdout:8/872: dwrite d0/df/d15/d23/d54/dba/d89/f8b [0,4194304] 0 2026-03-10T07:51:44.580 INFO:tasks.workunit.client.1.vm08.stdout:8/873: write d0/df/d5d/fcf [1350241,64408] 0 2026-03-10T07:51:44.583 INFO:tasks.workunit.client.1.vm08.stdout:1/704: dwrite d2/d6/de/d1f/d40/f4d [0,4194304] 0 2026-03-10T07:51:44.592 INFO:tasks.workunit.client.1.vm08.stdout:2/754: creat d0/d1/d17/ff3 x:0 0 0 2026-03-10T07:51:44.613 INFO:tasks.workunit.client.1.vm08.stdout:3/737: dread d0/d3c/d18/d32/d61/d52/f66 [0,4194304] 0 2026-03-10T07:51:44.613 INFO:tasks.workunit.client.1.vm08.stdout:1/705: mknod d2/d6/de/d5f/cf3 0 2026-03-10T07:51:44.613 INFO:tasks.workunit.client.1.vm08.stdout:0/780: dwrite dd/d10/d2f/d37/d64/d95/d58/d3d/fa1 
[0,4194304] 0 2026-03-10T07:51:44.613 INFO:tasks.workunit.client.1.vm08.stdout:6/781: dread d1/d17/d2b/d58/d76/f99 [0,4194304] 0 2026-03-10T07:51:44.613 INFO:tasks.workunit.client.1.vm08.stdout:2/755: creat d0/d1/d17/db2/d9c/ff4 x:0 0 0 2026-03-10T07:51:44.613 INFO:tasks.workunit.client.1.vm08.stdout:3/738: mknod d0/d3c/d18/d32/d61/d52/dca/ceb 0 2026-03-10T07:51:44.613 INFO:tasks.workunit.client.1.vm08.stdout:1/706: dread - d2/d6/de/d47/fac zero size 2026-03-10T07:51:44.613 INFO:tasks.workunit.client.1.vm08.stdout:6/782: mkdir d1/d3/d3e/dff/d105 0 2026-03-10T07:51:44.613 INFO:tasks.workunit.client.1.vm08.stdout:6/783: dread d1/d3/df/d44/f5a [0,4194304] 0 2026-03-10T07:51:44.613 INFO:tasks.workunit.client.1.vm08.stdout:6/784: readlink d1/d46/lbf 0 2026-03-10T07:51:44.615 INFO:tasks.workunit.client.1.vm08.stdout:1/707: link d2/d6/de/d1f/d8f/f91 d2/d6/de/d1f/d8f/ff4 0 2026-03-10T07:51:44.617 INFO:tasks.workunit.client.1.vm08.stdout:3/739: rename d0/d3c/d1f/d44/d51 to d0/d3c/d18/dec 0 2026-03-10T07:51:44.618 INFO:tasks.workunit.client.1.vm08.stdout:6/785: rmdir d1/d3/df/d1d/d40 39 2026-03-10T07:51:44.619 INFO:tasks.workunit.client.1.vm08.stdout:0/781: getdents dd/d10/d14/d15/d20 0 2026-03-10T07:51:44.629 INFO:tasks.workunit.client.1.vm08.stdout:2/756: getdents d0/d1/d3/d56/d78/de4 0 2026-03-10T07:51:44.629 INFO:tasks.workunit.client.1.vm08.stdout:6/786: getdents d1/db 0 2026-03-10T07:51:44.629 INFO:tasks.workunit.client.1.vm08.stdout:3/740: dwrite d0/d3c/d1f/f6f [0,4194304] 0 2026-03-10T07:51:44.629 INFO:tasks.workunit.client.1.vm08.stdout:0/782: dwrite dd/d10/d2f/d37/d64/d95/d5c/f63 [0,4194304] 0 2026-03-10T07:51:44.635 INFO:tasks.workunit.client.1.vm08.stdout:3/741: mkdir d0/d3c/d18/d32/daa/ded 0 2026-03-10T07:51:44.636 INFO:tasks.workunit.client.1.vm08.stdout:3/742: read d0/d3c/d18/fa5 [3206042,25781] 0 2026-03-10T07:51:44.643 INFO:tasks.workunit.client.1.vm08.stdout:0/783: getdents dd/d10/d14/d15/d20/d5f 0 2026-03-10T07:51:44.648 
INFO:tasks.workunit.client.1.vm08.stdout:3/743: dwrite d0/d3c/d18/d48/d55/fe0 [0,4194304] 0 2026-03-10T07:51:44.648 INFO:tasks.workunit.client.1.vm08.stdout:0/784: dread dd/f16 [0,4194304] 0 2026-03-10T07:51:44.652 INFO:tasks.workunit.client.1.vm08.stdout:0/785: truncate dd/d10/d2f/f4b 947546 0 2026-03-10T07:51:44.653 INFO:tasks.workunit.client.1.vm08.stdout:3/744: unlink d0/d3c/d18/d80/ccf 0 2026-03-10T07:51:44.655 INFO:tasks.workunit.client.1.vm08.stdout:0/786: chown dd/d10/d2f/d37/f65 162 1 2026-03-10T07:51:44.655 INFO:tasks.workunit.client.1.vm08.stdout:3/745: unlink d0/f45 0 2026-03-10T07:51:44.655 INFO:tasks.workunit.client.1.vm08.stdout:0/787: chown dd/d10/d2f/d37/d64/d95/f48 132201912 1 2026-03-10T07:51:44.658 INFO:tasks.workunit.client.1.vm08.stdout:3/746: fsync d0/d3c/d18/d32/d61/d83/f8b 0 2026-03-10T07:51:44.660 INFO:tasks.workunit.client.1.vm08.stdout:0/788: unlink dd/d10/d14/d15/l75 0 2026-03-10T07:51:44.666 INFO:tasks.workunit.client.1.vm08.stdout:2/757: dread d0/d1/d3/d56/d78/f62 [0,4194304] 0 2026-03-10T07:51:44.666 INFO:tasks.workunit.client.1.vm08.stdout:0/789: unlink dd/d10/d14/d15/d20/d5f/ff5 0 2026-03-10T07:51:44.666 INFO:tasks.workunit.client.1.vm08.stdout:2/758: fsync d0/d1/d3/d39/d7d/f98 0 2026-03-10T07:51:44.679 INFO:tasks.workunit.client.1.vm08.stdout:3/747: sync 2026-03-10T07:51:44.679 INFO:tasks.workunit.client.1.vm08.stdout:7/727: write d3/da/d25/d9/d6f/fab [2443447,104521] 0 2026-03-10T07:51:44.681 INFO:tasks.workunit.client.1.vm08.stdout:3/748: symlink d0/d3c/d18/d4a/lee 0 2026-03-10T07:51:44.683 INFO:tasks.workunit.client.1.vm08.stdout:3/749: creat d0/d3c/d18/d32/daa/fef x:0 0 0 2026-03-10T07:51:44.691 INFO:tasks.workunit.client.1.vm08.stdout:7/728: link d3/da/dbc/fe2 d3/da/d25/d9/d2f/d39/d43/d4f/ff5 0 2026-03-10T07:51:44.691 INFO:tasks.workunit.client.1.vm08.stdout:7/729: write d3/da/d25/d9/fc5 [3563948,102033] 0 2026-03-10T07:51:44.691 INFO:tasks.workunit.client.1.vm08.stdout:3/750: rmdir d0/d3c/d18/dd4 0 2026-03-10T07:51:44.691 
INFO:tasks.workunit.client.1.vm08.stdout:7/730: dread d3/da/d25/d9/d2f/d3a/d4b/fa3 [0,4194304] 0 2026-03-10T07:51:44.691 INFO:tasks.workunit.client.1.vm08.stdout:3/751: creat d0/d3c/d18/d48/d55/ff0 x:0 0 0 2026-03-10T07:51:44.691 INFO:tasks.workunit.client.1.vm08.stdout:7/731: creat d3/da/d25/d9/d2f/d3a/d71/d8c/ddf/ff6 x:0 0 0 2026-03-10T07:51:44.691 INFO:tasks.workunit.client.1.vm08.stdout:7/732: creat d3/da/d8a/ff7 x:0 0 0 2026-03-10T07:51:44.691 INFO:tasks.workunit.client.1.vm08.stdout:3/752: dread d0/d3c/d18/f38 [0,4194304] 0 2026-03-10T07:51:44.693 INFO:tasks.workunit.client.1.vm08.stdout:7/733: mknod d3/da/d25/d9/d2f/d39/d43/d4f/d5b/cf8 0 2026-03-10T07:51:44.693 INFO:tasks.workunit.client.1.vm08.stdout:7/734: chown d3/da/d25/d9/d2f/d4d/db6/fc6 39 1 2026-03-10T07:51:44.710 INFO:tasks.workunit.client.1.vm08.stdout:4/689: truncate d5/f10 1209538 0 2026-03-10T07:51:44.710 INFO:tasks.workunit.client.1.vm08.stdout:9/760: write d2/d58/dbf/dd0/d35/d97/d9d/df4/dee/d84/f8c [899243,93176] 0 2026-03-10T07:51:44.714 INFO:tasks.workunit.client.1.vm08.stdout:8/874: dread d0/df/f19 [0,4194304] 0 2026-03-10T07:51:44.716 INFO:tasks.workunit.client.1.vm08.stdout:9/761: creat d2/d58/dbf/dd0/d35/d97/d9d/df4/dee/d84/f100 x:0 0 0 2026-03-10T07:51:44.717 INFO:tasks.workunit.client.1.vm08.stdout:9/762: stat d2/f6 0 2026-03-10T07:51:44.718 INFO:tasks.workunit.client.1.vm08.stdout:9/763: fdatasync d2/d58/f9f 0 2026-03-10T07:51:44.719 INFO:tasks.workunit.client.1.vm08.stdout:8/875: dread d0/df/d15/d23/d54/dba/d89/fb6 [0,4194304] 0 2026-03-10T07:51:44.719 INFO:tasks.workunit.client.1.vm08.stdout:2/759: dread d0/f12 [0,4194304] 0 2026-03-10T07:51:44.722 INFO:tasks.workunit.client.1.vm08.stdout:2/760: write d0/d1/d3/d39/d7d/d86/d55/db9/fc2 [884271,3678] 0 2026-03-10T07:51:44.722 INFO:tasks.workunit.client.1.vm08.stdout:8/876: read d0/df/d17/f1a [1550096,3959] 0 2026-03-10T07:51:44.723 INFO:tasks.workunit.client.1.vm08.stdout:9/764: getdents d2/d58/dbf/dd0/d35/d97/d9d/df4/dee/d84/dca 0 
2026-03-10T07:51:44.723 INFO:tasks.workunit.client.1.vm08.stdout:9/765: dread - d2/fb5 zero size 2026-03-10T07:51:44.724 INFO:tasks.workunit.client.1.vm08.stdout:8/877: unlink d0/df/d17/f1a 0 2026-03-10T07:51:44.771 INFO:tasks.workunit.client.1.vm08.stdout:5/811: truncate d0/d8/d24/de2/fe6 791437 0 2026-03-10T07:51:44.776 INFO:tasks.workunit.client.1.vm08.stdout:7/735: rename d3/f85 to d3/da/d25/d9/d2f/d3a/dc0/ff9 0 2026-03-10T07:51:44.779 INFO:tasks.workunit.client.1.vm08.stdout:7/736: mknod d3/da/d25/d9/d2f/d3a/d71/d8c/ddf/cfa 0 2026-03-10T07:51:44.779 INFO:tasks.workunit.client.1.vm08.stdout:3/753: rmdir d0/d3c/d18/d32/d61/d52/dca 39 2026-03-10T07:51:44.781 INFO:tasks.workunit.client.1.vm08.stdout:1/708: write d2/f4 [2407907,85320] 0 2026-03-10T07:51:44.784 INFO:tasks.workunit.client.1.vm08.stdout:3/754: fsync d0/d3c/d18/d32/d61/d83/f8b 0 2026-03-10T07:51:44.786 INFO:tasks.workunit.client.1.vm08.stdout:6/787: write d1/d3/df/d1d/f6b [163360,24000] 0 2026-03-10T07:51:44.786 INFO:tasks.workunit.client.1.vm08.stdout:7/737: mkdir d3/da/d25/d9/d2f/dfb 0 2026-03-10T07:51:44.794 INFO:tasks.workunit.client.1.vm08.stdout:1/709: rmdir d2/d6/d3a/d61/d6f/dad 39 2026-03-10T07:51:44.794 INFO:tasks.workunit.client.1.vm08.stdout:3/755: dread d0/d3c/d18/d48/d55/fe0 [0,4194304] 0 2026-03-10T07:51:44.794 INFO:tasks.workunit.client.1.vm08.stdout:7/738: mknod d3/da/dbc/cfc 0 2026-03-10T07:51:44.794 INFO:tasks.workunit.client.1.vm08.stdout:7/739: fdatasync d3/da/d25/d9/d6f/fab 0 2026-03-10T07:51:44.795 INFO:tasks.workunit.client.1.vm08.stdout:0/790: dwrite dd/d10/d14/d15/f94 [0,4194304] 0 2026-03-10T07:51:44.796 INFO:tasks.workunit.client.1.vm08.stdout:3/756: creat d0/d3c/d1f/d95/ff1 x:0 0 0 2026-03-10T07:51:44.805 INFO:tasks.workunit.client.1.vm08.stdout:4/690: truncate d5/da0/d12/d7b/d48/d4f/d7c/f9f 3531865 0 2026-03-10T07:51:44.805 INFO:tasks.workunit.client.1.vm08.stdout:2/761: truncate d0/d1/d3/d56/d57/f5b 1252385 0 2026-03-10T07:51:44.805 
INFO:tasks.workunit.client.1.vm08.stdout:8/878: write d0/df/d15/d23/d39/f3e [783325,24076] 0 2026-03-10T07:51:44.809 INFO:tasks.workunit.client.1.vm08.stdout:7/740: rename d3/da/da6 to d3/da/d25/d9/d2f/d39/d43/d4f/d5b/d79/ddd/dfd 0 2026-03-10T07:51:44.810 INFO:tasks.workunit.client.1.vm08.stdout:9/766: dwrite d2/f5 [0,4194304] 0 2026-03-10T07:51:44.811 INFO:tasks.workunit.client.1.vm08.stdout:7/741: write d3/ff0 [534592,113043] 0 2026-03-10T07:51:44.813 INFO:tasks.workunit.client.1.vm08.stdout:9/767: fsync d2/d58/dbf/d2b/f37 0 2026-03-10T07:51:44.823 INFO:tasks.workunit.client.1.vm08.stdout:0/791: symlink dd/d10/d14/d15/d20/d92/dc1/lf9 0 2026-03-10T07:51:44.826 INFO:tasks.workunit.client.1.vm08.stdout:4/691: rename d5/da0/d12/d7b to d5/da0/d95/de6 0 2026-03-10T07:51:44.826 INFO:tasks.workunit.client.1.vm08.stdout:0/792: write dd/d10/d2f/d37/d64/d95/d58/d3d/f5b [7013119,102543] 0 2026-03-10T07:51:44.826 INFO:tasks.workunit.client.1.vm08.stdout:0/793: readlink dd/d18/l7c 0 2026-03-10T07:51:44.829 INFO:tasks.workunit.client.1.vm08.stdout:1/710: dread d2/d10/f3f [0,4194304] 0 2026-03-10T07:51:44.829 INFO:tasks.workunit.client.1.vm08.stdout:0/794: dwrite dd/d10/d14/f69 [4194304,4194304] 0 2026-03-10T07:51:44.835 INFO:tasks.workunit.client.1.vm08.stdout:3/757: rmdir d0/d3c/d18/dec/dcd 0 2026-03-10T07:51:44.843 INFO:tasks.workunit.client.1.vm08.stdout:4/692: rmdir d5/d1f/d31/d61 39 2026-03-10T07:51:44.846 INFO:tasks.workunit.client.1.vm08.stdout:1/711: rename d2/d10/c17 to d2/d6/de/d71/cf5 0 2026-03-10T07:51:44.847 INFO:tasks.workunit.client.1.vm08.stdout:5/812: dwrite d0/d4/d19/d60/d6d/d70/d40/f5f [0,4194304] 0 2026-03-10T07:51:44.851 INFO:tasks.workunit.client.1.vm08.stdout:5/813: chown d0/d4/d19/d81/d92/ff2 10033408 1 2026-03-10T07:51:44.851 INFO:tasks.workunit.client.1.vm08.stdout:5/814: dread - d0/d4/d19/d81/fe4 zero size 2026-03-10T07:51:44.853 INFO:tasks.workunit.client.1.vm08.stdout:5/815: fdatasync d0/d4/d19/d60/d6d/d70/d40/f105 0 2026-03-10T07:51:44.854 
INFO:tasks.workunit.client.1.vm08.stdout:2/762: link d0/d1/d3/d56/d78/dad/db1/d61/d8e/fa1 d0/d1/d3/d56/d78/dad/db1/ff5 0 2026-03-10T07:51:44.854 INFO:tasks.workunit.client.1.vm08.stdout:9/768: truncate d2/d58/dbf/dd0/d35/d97/d9d/df4/dee/f8e 1765565 0 2026-03-10T07:51:44.858 INFO:tasks.workunit.client.1.vm08.stdout:0/795: creat dd/d10/d14/da6/ffa x:0 0 0 2026-03-10T07:51:44.861 INFO:tasks.workunit.client.1.vm08.stdout:3/758: symlink d0/d3c/d18/d48/lf2 0 2026-03-10T07:51:44.861 INFO:tasks.workunit.client.1.vm08.stdout:5/816: chown d0/d8/dce/dd2 18 1 2026-03-10T07:51:44.861 INFO:tasks.workunit.client.1.vm08.stdout:4/693: getdents d5/d1f/dad/dcf 0 2026-03-10T07:51:44.861 INFO:tasks.workunit.client.1.vm08.stdout:5/817: stat d0/d4/d19/d81/fe4 0 2026-03-10T07:51:44.865 INFO:tasks.workunit.client.1.vm08.stdout:4/694: stat d5/da0/d95/de6/d48/d4f/d8d/d91/dd5/cdc 0 2026-03-10T07:51:44.866 INFO:tasks.workunit.client.1.vm08.stdout:0/796: dread dd/d10/d14/d15/f94 [0,4194304] 0 2026-03-10T07:51:44.867 INFO:tasks.workunit.client.1.vm08.stdout:2/763: creat d0/d1/d3/d39/d7d/d86/d55/dc9/ded/ff6 x:0 0 0 2026-03-10T07:51:44.867 INFO:tasks.workunit.client.1.vm08.stdout:2/764: readlink d0/d1/d3/d39/d7d/d86/d55/d7a/l88 0 2026-03-10T07:51:44.868 INFO:tasks.workunit.client.1.vm08.stdout:9/769: mkdir d2/d58/dbf/dd0/d35/dff/d101 0 2026-03-10T07:51:44.875 INFO:tasks.workunit.client.1.vm08.stdout:1/712: creat d2/d6/de/d1f/d26/ff6 x:0 0 0 2026-03-10T07:51:44.887 INFO:tasks.workunit.client.1.vm08.stdout:1/713: write d2/d6/de/d1f/d26/d58/d83/fa2 [5026202,7078] 0 2026-03-10T07:51:44.887 INFO:tasks.workunit.client.1.vm08.stdout:9/770: dread d2/d58/dbf/dd0/d35/d97/d9d/df4/d28/f66 [0,4194304] 0 2026-03-10T07:51:44.887 INFO:tasks.workunit.client.1.vm08.stdout:5/818: dread d0/d4/df/d12/f46 [0,4194304] 0 2026-03-10T07:51:44.890 INFO:tasks.workunit.client.1.vm08.stdout:2/765: getdents d0/d1/d3 0 2026-03-10T07:51:44.892 INFO:tasks.workunit.client.1.vm08.stdout:5/819: dread d0/d4/f31 [0,4194304] 0 
2026-03-10T07:51:44.894 INFO:tasks.workunit.client.1.vm08.stdout:5/820: getdents d0/d4/df/dbf/d41 0 2026-03-10T07:51:44.924 INFO:tasks.workunit.client.1.vm08.stdout:4/695: sync 2026-03-10T07:51:44.924 INFO:tasks.workunit.client.1.vm08.stdout:3/759: sync 2026-03-10T07:51:44.925 INFO:tasks.workunit.client.1.vm08.stdout:4/696: fdatasync d5/da0/d95/de6/d48/d4f/fe5 0 2026-03-10T07:51:44.927 INFO:tasks.workunit.client.1.vm08.stdout:3/760: dwrite d0/d3c/d18/da9/fd5 [0,4194304] 0 2026-03-10T07:51:44.927 INFO:tasks.workunit.client.1.vm08.stdout:3/761: chown d0/d3c/d18/d48/lf2 66605956 1 2026-03-10T07:51:44.928 INFO:tasks.workunit.client.1.vm08.stdout:3/762: readlink d0/d3c/d18/da9/lae 0 2026-03-10T07:51:44.931 INFO:tasks.workunit.client.1.vm08.stdout:4/697: write d5/da0/d95/de6/fbc [1552455,55057] 0 2026-03-10T07:51:44.932 INFO:tasks.workunit.client.1.vm08.stdout:4/698: creat d5/da0/d12/fe7 x:0 0 0 2026-03-10T07:51:44.936 INFO:tasks.workunit.client.1.vm08.stdout:4/699: dwrite d5/d1f/d31/f33 [0,4194304] 0 2026-03-10T07:51:44.938 INFO:tasks.workunit.client.1.vm08.stdout:4/700: dread - d5/d1f/fd7 zero size 2026-03-10T07:51:44.938 INFO:tasks.workunit.client.1.vm08.stdout:4/701: dread - d5/d1f/d70/fbd zero size 2026-03-10T07:51:44.945 INFO:tasks.workunit.client.1.vm08.stdout:4/702: creat d5/da0/d95/de6/d48/d4f/d7c/dde/ddf/fe8 x:0 0 0 2026-03-10T07:51:44.946 INFO:tasks.workunit.client.1.vm08.stdout:6/788: dwrite d1/db/d24/d73/d79/f98 [0,4194304] 0 2026-03-10T07:51:44.950 INFO:tasks.workunit.client.1.vm08.stdout:8/879: truncate d0/df/d15/d23/d39/d5b/dea/dce/fd2 2229235 0 2026-03-10T07:51:44.957 INFO:tasks.workunit.client.1.vm08.stdout:7/742: truncate d3/da/d25/d9/d2f/d39/d43/fc2 3252996 0 2026-03-10T07:51:44.957 INFO:tasks.workunit.client.1.vm08.stdout:6/789: symlink d1/d17/d2b/d58/d77/l106 0 2026-03-10T07:51:44.964 INFO:tasks.workunit.client.1.vm08.stdout:4/703: creat d5/da0/d95/de6/d48/fe9 x:0 0 0 2026-03-10T07:51:44.965 INFO:tasks.workunit.client.1.vm08.stdout:3/763: dread 
d0/d3c/d18/d48/d55/d56/f9c [0,4194304] 0 2026-03-10T07:51:44.965 INFO:tasks.workunit.client.1.vm08.stdout:6/790: symlink d1/d46/l107 0 2026-03-10T07:51:44.966 INFO:tasks.workunit.client.1.vm08.stdout:7/743: mkdir d3/da/d25/d9/d2f/d3a/d71/d8c/ddf/dfe 0 2026-03-10T07:51:44.967 INFO:tasks.workunit.client.1.vm08.stdout:8/880: rename d0/df/d15/ca1 to d0/df/d15/d9c/c11a 0 2026-03-10T07:51:44.968 INFO:tasks.workunit.client.1.vm08.stdout:8/881: write d0/df/f12 [2841570,99054] 0 2026-03-10T07:51:44.970 INFO:tasks.workunit.client.1.vm08.stdout:4/704: rmdir d5/da0/d95/de6/d48/d4f/d8d/d91 39 2026-03-10T07:51:44.974 INFO:tasks.workunit.client.1.vm08.stdout:6/791: symlink d1/db/d24/d3d/l108 0 2026-03-10T07:51:44.975 INFO:tasks.workunit.client.1.vm08.stdout:1/714: truncate d2/d6/de/d1f/d26/d89/d8e/fbf 1135816 0 2026-03-10T07:51:44.977 INFO:tasks.workunit.client.1.vm08.stdout:2/766: write d0/d1/d3/d10/d65/fc5 [261951,40043] 0 2026-03-10T07:51:44.980 INFO:tasks.workunit.client.1.vm08.stdout:0/797: dwrite dd/d10/d14/d15/f9c [4194304,4194304] 0 2026-03-10T07:51:44.982 INFO:tasks.workunit.client.1.vm08.stdout:6/792: creat d1/d3/df/d44/f109 x:0 0 0 2026-03-10T07:51:44.983 INFO:tasks.workunit.client.1.vm08.stdout:1/715: creat d2/d10/ff7 x:0 0 0 2026-03-10T07:51:44.985 INFO:tasks.workunit.client.1.vm08.stdout:2/767: unlink d0/d1/d3/d10/d38/daf/fd8 0 2026-03-10T07:51:44.987 INFO:tasks.workunit.client.1.vm08.stdout:9/771: dwrite d2/d58/dbf/d2b/f6a [0,4194304] 0 2026-03-10T07:51:44.987 INFO:tasks.workunit.client.1.vm08.stdout:7/744: creat d3/fff x:0 0 0 2026-03-10T07:51:44.988 INFO:tasks.workunit.client.1.vm08.stdout:3/764: dread d0/d3c/d18/d32/d61/d52/f66 [0,4194304] 0 2026-03-10T07:51:44.988 INFO:tasks.workunit.client.1.vm08.stdout:6/793: symlink d1/d46/l10a 0 2026-03-10T07:51:44.989 INFO:tasks.workunit.client.1.vm08.stdout:0/798: symlink dd/d10/d14/d15/d20/d5f/lfb 0 2026-03-10T07:51:44.995 INFO:tasks.workunit.client.1.vm08.stdout:2/768: symlink d0/d1/d17/db2/d9c/lf7 0 
2026-03-10T07:51:44.995 INFO:tasks.workunit.client.1.vm08.stdout:1/716: rename d2/d6/de/d1f/d8f/le5 to d2/d6/de/d47/da0/lf8 0 2026-03-10T07:51:44.998 INFO:tasks.workunit.client.1.vm08.stdout:9/772: truncate d2/d58/dbf/dd0/d35/d97/d9d/df4/f4f 511838 0 2026-03-10T07:51:44.998 INFO:tasks.workunit.client.1.vm08.stdout:0/799: creat dd/d10/d14/d15/dad/ffc x:0 0 0 2026-03-10T07:51:44.999 INFO:tasks.workunit.client.1.vm08.stdout:8/882: getdents d0/d69 0 2026-03-10T07:51:45.000 INFO:tasks.workunit.client.1.vm08.stdout:3/765: creat d0/d3c/d18/d32/ff3 x:0 0 0 2026-03-10T07:51:45.001 INFO:tasks.workunit.client.1.vm08.stdout:2/769: rmdir d0/d1/d3/d39/d7d/d86/d55/d7a 39 2026-03-10T07:51:45.003 INFO:tasks.workunit.client.1.vm08.stdout:2/770: write d0/d1/d17/d6b/f72 [1879717,59662] 0 2026-03-10T07:51:45.005 INFO:tasks.workunit.client.1.vm08.stdout:3/766: chown d0/d3c/d18/dec/f4d 7220 1 2026-03-10T07:51:45.006 INFO:tasks.workunit.client.1.vm08.stdout:4/705: sync 2026-03-10T07:51:45.007 INFO:tasks.workunit.client.1.vm08.stdout:7/745: getdents d3/da/d25/d9/d6f 0 2026-03-10T07:51:45.010 INFO:tasks.workunit.client.1.vm08.stdout:1/717: read d2/d6/de/d1f/d26/d58/d8c/f46 [492511,79372] 0 2026-03-10T07:51:45.010 INFO:tasks.workunit.client.1.vm08.stdout:0/800: rename dd/d10/d2f/c88 to dd/d10/d2f/d37/d64/d95/d5c/cfd 0 2026-03-10T07:51:45.012 INFO:tasks.workunit.client.1.vm08.stdout:9/773: mknod d2/d58/dbf/dd0/d35/d97/d9d/df4/d28/d98/dbb/dd9/dfc/c102 0 2026-03-10T07:51:45.015 INFO:tasks.workunit.client.1.vm08.stdout:2/771: rename d0/d1/d17/f95 to d0/d1/d3/d56/d78/ff8 0 2026-03-10T07:51:45.015 INFO:tasks.workunit.client.1.vm08.stdout:3/767: write d0/d3c/d18/d32/d61/d52/f66 [4789205,82519] 0 2026-03-10T07:51:45.015 INFO:tasks.workunit.client.1.vm08.stdout:9/774: symlink d2/l103 0 2026-03-10T07:51:45.015 INFO:tasks.workunit.client.1.vm08.stdout:4/706: creat d5/da0/d95/de6/d48/d4f/d8d/d91/dd5/fea x:0 0 0 2026-03-10T07:51:45.015 INFO:tasks.workunit.client.1.vm08.stdout:1/718: dread - 
d2/d6/de/d1f/d26/d58/d8c/f97 zero size 2026-03-10T07:51:45.016 INFO:tasks.workunit.client.1.vm08.stdout:3/768: fdatasync d0/d3c/d1f/d95/fbd 0 2026-03-10T07:51:45.017 INFO:tasks.workunit.client.1.vm08.stdout:3/769: chown d0/d3c/d18/d48/d55/d56 320378554 1 2026-03-10T07:51:45.026 INFO:tasks.workunit.client.1.vm08.stdout:3/770: dwrite d0/d3c/d1f/f6f [0,4194304] 0 2026-03-10T07:51:45.029 INFO:tasks.workunit.client.1.vm08.stdout:0/801: rename dd/d10/d14/d15/d20/d5f/fb9 to dd/d10/ffe 0 2026-03-10T07:51:45.029 INFO:tasks.workunit.client.1.vm08.stdout:5/821: write d0/d8/d24/ffd [302873,125427] 0 2026-03-10T07:51:45.030 INFO:tasks.workunit.client.1.vm08.stdout:1/719: dwrite d2/d6/de/d1f/d26/d58/d83/fa2 [0,4194304] 0 2026-03-10T07:51:45.039 INFO:tasks.workunit.client.1.vm08.stdout:1/720: mkdir d2/d6/de/d5f/df9 0 2026-03-10T07:51:45.040 INFO:tasks.workunit.client.1.vm08.stdout:3/771: creat d0/ff4 x:0 0 0 2026-03-10T07:51:45.042 INFO:tasks.workunit.client.1.vm08.stdout:1/721: creat d2/d6/de/d1f/d26/d89/ffa x:0 0 0 2026-03-10T07:51:45.042 INFO:tasks.workunit.client.1.vm08.stdout:0/802: dread - dd/d10/d2f/d37/d64/d95/d58/d3d/d9b/fb1 zero size 2026-03-10T07:51:45.043 INFO:tasks.workunit.client.1.vm08.stdout:5/822: dwrite d0/d4/df/dbf/d41/f5c [4194304,4194304] 0 2026-03-10T07:51:45.043 INFO:tasks.workunit.client.1.vm08.stdout:1/722: dread - d2/d10/fc4 zero size 2026-03-10T07:51:45.043 INFO:tasks.workunit.client.1.vm08.stdout:3/772: write d0/d3c/d1f/f93 [4224499,67219] 0 2026-03-10T07:51:45.051 INFO:tasks.workunit.client.1.vm08.stdout:3/773: creat d0/d3c/d1f/ff5 x:0 0 0 2026-03-10T07:51:45.051 INFO:tasks.workunit.client.1.vm08.stdout:5/823: dwrite d0/d4/d19/d60/d6d/d70/d40/dba/fd8 [0,4194304] 0 2026-03-10T07:51:45.054 INFO:tasks.workunit.client.1.vm08.stdout:1/723: dwrite d2/d6/de/d70/d80/fbb [0,4194304] 0 2026-03-10T07:51:45.056 INFO:tasks.workunit.client.1.vm08.stdout:3/774: creat d0/d3c/d18/d32/daa/ff6 x:0 0 0 2026-03-10T07:51:45.057 
INFO:tasks.workunit.client.1.vm08.stdout:5/824: creat d0/d77/d83/de0/dfe/f10a x:0 0 0 2026-03-10T07:51:45.058 INFO:tasks.workunit.client.1.vm08.stdout:5/825: write d0/d8/d24/f56 [1347127,79800] 0 2026-03-10T07:51:45.107 INFO:tasks.workunit.client.1.vm08.stdout:0/803: dread dd/d10/d2f/d37/d64/d95/d5c/dca/ddb/fe7 [0,4194304] 0 2026-03-10T07:51:45.109 INFO:tasks.workunit.client.1.vm08.stdout:0/804: unlink dd/d10/d2f/d37/d64/d95/f2a 0 2026-03-10T07:51:45.110 INFO:tasks.workunit.client.1.vm08.stdout:0/805: dread - dd/d10/d14/d15/dad/ffc zero size 2026-03-10T07:51:45.111 INFO:tasks.workunit.client.1.vm08.stdout:0/806: creat dd/d10/d14/d15/d20/d92/dc1/ded/df7/fff x:0 0 0 2026-03-10T07:51:45.111 INFO:tasks.workunit.client.1.vm08.stdout:0/807: chown dd/d10/dbd/cef 60586343 1 2026-03-10T07:51:45.114 INFO:tasks.workunit.client.1.vm08.stdout:0/808: dread dd/d10/d14/d15/d20/d22/f6c [0,4194304] 0 2026-03-10T07:51:45.115 INFO:tasks.workunit.client.1.vm08.stdout:0/809: truncate dd/d10/d2f/f8e 300372 0 2026-03-10T07:51:45.116 INFO:tasks.workunit.client.1.vm08.stdout:0/810: mkdir dd/d18/d100 0 2026-03-10T07:51:45.117 INFO:tasks.workunit.client.1.vm08.stdout:0/811: unlink dd/d10/lc2 0 2026-03-10T07:51:45.118 INFO:tasks.workunit.client.1.vm08.stdout:0/812: creat dd/d10/dbd/f101 x:0 0 0 2026-03-10T07:51:45.143 INFO:tasks.workunit.client.1.vm08.stdout:5/826: dread d0/d4/d19/d3a/d69/f6b [0,4194304] 0 2026-03-10T07:51:45.183 INFO:tasks.workunit.client.1.vm08.stdout:5/827: read d0/d4/df/d12/f88 [546415,75064] 0 2026-03-10T07:51:45.185 INFO:tasks.workunit.client.1.vm08.stdout:5/828: creat d0/d4/df/d12/f10b x:0 0 0 2026-03-10T07:51:45.186 INFO:tasks.workunit.client.1.vm08.stdout:5/829: creat d0/d4/d19/d81/d92/f10c x:0 0 0 2026-03-10T07:51:45.222 INFO:tasks.workunit.client.1.vm08.stdout:5/830: dread d0/f3b [0,4194304] 0 2026-03-10T07:51:45.224 INFO:tasks.workunit.client.1.vm08.stdout:5/831: write d0/d4/d19/d60/d6d/d70/d40/dba/ff8 [454870,93553] 0 2026-03-10T07:51:45.225 
INFO:tasks.workunit.client.1.vm08.stdout:5/832: readlink d0/d4/d19/d60/d6d/d70/d40/lfc 0 2026-03-10T07:51:45.239 INFO:tasks.workunit.client.1.vm08.stdout:5/833: symlink d0/d33/ddd/l10d 0 2026-03-10T07:51:45.256 INFO:tasks.workunit.client.1.vm08.stdout:9/775: truncate d2/d26/f29 739170 0 2026-03-10T07:51:45.256 INFO:tasks.workunit.client.1.vm08.stdout:7/746: write d3/da/d25/d9/d2f/d39/d43/d4f/d5b/db7/f81 [295718,87426] 0 2026-03-10T07:51:45.260 INFO:tasks.workunit.client.1.vm08.stdout:6/794: dwrite d1/db/fc1 [4194304,4194304] 0 2026-03-10T07:51:45.261 INFO:tasks.workunit.client.1.vm08.stdout:8/883: dwrite d0/df/d17/d72/fe4 [0,4194304] 0 2026-03-10T07:51:45.262 INFO:tasks.workunit.client.1.vm08.stdout:4/707: write f0 [2278329,11372] 0 2026-03-10T07:51:45.268 INFO:tasks.workunit.client.1.vm08.stdout:2/772: dread d0/d1/d3/d39/fb7 [0,4194304] 0 2026-03-10T07:51:45.275 INFO:tasks.workunit.client.1.vm08.stdout:5/834: mkdir d0/d10e 0 2026-03-10T07:51:45.275 INFO:tasks.workunit.client.1.vm08.stdout:9/776: fdatasync d2/fba 0 2026-03-10T07:51:45.278 INFO:tasks.workunit.client.1.vm08.stdout:7/747: unlink d3/da/d8a/ff7 0 2026-03-10T07:51:45.283 INFO:tasks.workunit.client.1.vm08.stdout:8/884: fdatasync d0/df/d15/d23/d54/dba/d89/fb6 0 2026-03-10T07:51:45.290 INFO:tasks.workunit.client.1.vm08.stdout:2/773: truncate d0/d1/d17/db2/d9c/fbc 609371 0 2026-03-10T07:51:45.291 INFO:tasks.workunit.client.1.vm08.stdout:2/774: dread - d0/d1/d3/d56/d78/dad/db1/d61/d84/fa8 zero size 2026-03-10T07:51:45.298 INFO:tasks.workunit.client.1.vm08.stdout:9/777: truncate d2/d58/dbf/dd0/d35/d97/d9d/df4/dee/d84/fd2 670178 0 2026-03-10T07:51:45.300 INFO:tasks.workunit.client.1.vm08.stdout:7/748: rename d3/da/d25/d9/fb3 to d3/da/d25/d9/d2f/d3a/d40/d54/db5/f100 0 2026-03-10T07:51:45.303 INFO:tasks.workunit.client.1.vm08.stdout:6/795: creat d1/d3/df/d1d/d40/f10b x:0 0 0 2026-03-10T07:51:45.308 INFO:tasks.workunit.client.1.vm08.stdout:5/835: creat d0/d4/df/df6/df9/dec/f10f x:0 0 0 2026-03-10T07:51:45.314 
INFO:tasks.workunit.client.1.vm08.stdout:9/778: truncate d2/d58/dbf/dd0/d35/d97/d9d/df4/d28/d98/dbb/fcb 523727 0 2026-03-10T07:51:45.316 INFO:tasks.workunit.client.1.vm08.stdout:3/775: getdents d0/d3c/d1f 0 2026-03-10T07:51:45.317 INFO:tasks.workunit.client.1.vm08.stdout:1/724: write d2/d6/de/d1f/d26/f2f [2808326,107461] 0 2026-03-10T07:51:45.325 INFO:tasks.workunit.client.1.vm08.stdout:7/749: creat d3/da/d25/d9/d2f/d3a/d71/d8c/ddf/f101 x:0 0 0 2026-03-10T07:51:45.327 INFO:tasks.workunit.client.1.vm08.stdout:8/885: readlink d0/df/d17/d25/lfe 0 2026-03-10T07:51:45.328 INFO:tasks.workunit.client.1.vm08.stdout:5/836: mknod d0/d4/d19/d43/c110 0 2026-03-10T07:51:45.328 INFO:tasks.workunit.client.1.vm08.stdout:9/779: mknod d2/d58/dbf/dd0/d35/d97/d9d/c104 0 2026-03-10T07:51:45.331 INFO:tasks.workunit.client.1.vm08.stdout:7/750: dwrite d3/da/d25/d9/d2f/d39/d43/d4f/d5b/db7/dac/fd7 [0,4194304] 0 2026-03-10T07:51:45.334 INFO:tasks.workunit.client.1.vm08.stdout:1/725: truncate d2/d6/de/d1f/d26/d58/f68 3824437 0 2026-03-10T07:51:45.335 INFO:tasks.workunit.client.1.vm08.stdout:1/726: chown d2/d6/d50/lee 1171221 1 2026-03-10T07:51:45.341 INFO:tasks.workunit.client.1.vm08.stdout:6/796: dread d1/d3/df/d44/f82 [0,4194304] 0 2026-03-10T07:51:45.342 INFO:tasks.workunit.client.1.vm08.stdout:6/797: chown d1/d3/df/d44/le9 251686 1 2026-03-10T07:51:45.344 INFO:tasks.workunit.client.1.vm08.stdout:6/798: write d1/d17/fa6 [932579,82392] 0 2026-03-10T07:51:45.344 INFO:tasks.workunit.client.1.vm08.stdout:6/799: chown d1/d17/d2b/d58/d76 2742 1 2026-03-10T07:51:45.345 INFO:tasks.workunit.client.1.vm08.stdout:6/800: dread - d1/d3/df/d44/f109 zero size 2026-03-10T07:51:45.353 INFO:tasks.workunit.client.1.vm08.stdout:7/751: mkdir d3/da/d25/d9/d2f/d3a/d4b/d102 0 2026-03-10T07:51:45.353 INFO:tasks.workunit.client.1.vm08.stdout:1/727: mkdir d2/d6/d3a/d61/dfb 0 2026-03-10T07:51:45.354 INFO:tasks.workunit.client.1.vm08.stdout:7/752: dread - d3/da/d25/f94 zero size 2026-03-10T07:51:45.355 
INFO:tasks.workunit.client.1.vm08.stdout:7/753: write d3/da/d8a/fcd [108589,130272] 0 2026-03-10T07:51:45.357 INFO:tasks.workunit.client.1.vm08.stdout:1/728: dread d2/d10/f3e [0,4194304] 0 2026-03-10T07:51:45.360 INFO:tasks.workunit.client.1.vm08.stdout:6/801: dread d1/d3/df/d1d/f9b [0,4194304] 0 2026-03-10T07:51:45.361 INFO:tasks.workunit.client.1.vm08.stdout:6/802: chown d1/d46/l107 120559664 1 2026-03-10T07:51:45.363 INFO:tasks.workunit.client.1.vm08.stdout:4/708: truncate d5/d1f/d31/fbe 1038177 0 2026-03-10T07:51:45.363 INFO:tasks.workunit.client.1.vm08.stdout:4/709: chown d5/da0/d95/de6/d48/d4f/d8d 13103151 1 2026-03-10T07:51:45.372 INFO:tasks.workunit.client.1.vm08.stdout:1/729: dread d2/d6/de/fa5 [0,4194304] 0 2026-03-10T07:51:45.380 INFO:tasks.workunit.client.1.vm08.stdout:0/813: write dd/d10/d14/f69 [7430427,106527] 0 2026-03-10T07:51:45.382 INFO:tasks.workunit.client.1.vm08.stdout:0/814: chown dd/d10/d2f/d37/d64/d95/f91 3616255 1 2026-03-10T07:51:45.389 INFO:tasks.workunit.client.1.vm08.stdout:5/837: write d0/d8/d24/f47 [3266413,129165] 0 2026-03-10T07:51:45.390 INFO:tasks.workunit.client.1.vm08.stdout:3/776: write d0/d3c/d18/d32/d61/d52/dca/fd0 [744492,116301] 0 2026-03-10T07:51:45.392 INFO:tasks.workunit.client.1.vm08.stdout:8/886: dwrite d0/df/d15/d23/d39/d5b/dbc/fe1 [0,4194304] 0 2026-03-10T07:51:45.397 INFO:tasks.workunit.client.1.vm08.stdout:0/815: sync 2026-03-10T07:51:45.421 INFO:tasks.workunit.client.1.vm08.stdout:2/775: getdents d0/d1/d3/d56/d78/dad/db1/d61/d84 0 2026-03-10T07:51:45.425 INFO:tasks.workunit.client.1.vm08.stdout:9/780: creat d2/d26/da4/f105 x:0 0 0 2026-03-10T07:51:45.434 INFO:tasks.workunit.client.1.vm08.stdout:7/754: rename d3/da/d25/d9/d2f/d39/d43/d4f to d3/da/d25/d9/d2f/d39/d43/d4f/d103 22 2026-03-10T07:51:45.435 INFO:tasks.workunit.client.1.vm08.stdout:6/803: mkdir d1/d3/df/d1d/d40/d45/d10c 0 2026-03-10T07:51:45.436 INFO:tasks.workunit.client.1.vm08.stdout:7/755: sync 2026-03-10T07:51:45.442 
INFO:tasks.workunit.client.1.vm08.stdout:8/887: chown d0/df/d15/d23/d39/d5b/c34 288 1 2026-03-10T07:51:45.445 INFO:tasks.workunit.client.1.vm08.stdout:5/838: write d0/d4/df/f2a [2735660,32472] 0 2026-03-10T07:51:45.445 INFO:tasks.workunit.client.1.vm08.stdout:1/730: write d2/d6/de/d1f/d40/d76/f79 [1718274,107900] 0 2026-03-10T07:51:45.447 INFO:tasks.workunit.client.1.vm08.stdout:8/888: dwrite d0/df/d17/d72/fe4 [0,4194304] 0 2026-03-10T07:51:45.450 INFO:tasks.workunit.client.1.vm08.stdout:2/776: chown d0/d1/d3/d56/d78/dad/db1/c6c 251086967 1 2026-03-10T07:51:45.451 INFO:tasks.workunit.client.1.vm08.stdout:9/781: mkdir d2/d58/dbf/dd0/d35/d97/dd5/d106 0 2026-03-10T07:51:45.460 INFO:tasks.workunit.client.1.vm08.stdout:7/756: unlink d3/da/d25/d9/f53 0 2026-03-10T07:51:45.460 INFO:tasks.workunit.client.1.vm08.stdout:3/777: mknod d0/d3c/d18/d32/daa/ded/cf7 0 2026-03-10T07:51:45.460 INFO:tasks.workunit.client.1.vm08.stdout:7/757: truncate d3/da/f9f 4994480 0 2026-03-10T07:51:45.463 INFO:tasks.workunit.client.1.vm08.stdout:7/758: write d3/da/d25/d9/d2f/d3a/d40/f52 [3782365,67512] 0 2026-03-10T07:51:45.465 INFO:tasks.workunit.client.1.vm08.stdout:9/782: creat d2/d58/dbf/dd0/d35/d97/d9d/df4/d28/f107 x:0 0 0 2026-03-10T07:51:45.466 INFO:tasks.workunit.client.1.vm08.stdout:5/839: creat d0/d4/df/dbf/d41/de8/f111 x:0 0 0 2026-03-10T07:51:45.467 INFO:tasks.workunit.client.1.vm08.stdout:3/778: dwrite d0/d3c/d1f/f93 [4194304,4194304] 0 2026-03-10T07:51:45.473 INFO:tasks.workunit.client.1.vm08.stdout:6/804: creat d1/f10d x:0 0 0 2026-03-10T07:51:45.473 INFO:tasks.workunit.client.1.vm08.stdout:6/805: fsync d1/d3/df/d1d/f6b 0 2026-03-10T07:51:45.475 INFO:tasks.workunit.client.1.vm08.stdout:0/816: rename dd/d10/d14/d15/d20/d92 to dd/d18/d100/d102 0 2026-03-10T07:51:45.475 INFO:tasks.workunit.client.1.vm08.stdout:0/817: readlink dd/d10/d2f/d37/d64/d95/d58/d3d/lc9 0 2026-03-10T07:51:45.477 INFO:tasks.workunit.client.1.vm08.stdout:4/710: link d5/da0/d95/de6/d48/d4f/d8d/d91/fa6 
d5/d1f/daf/feb 0 2026-03-10T07:51:45.478 INFO:tasks.workunit.client.1.vm08.stdout:9/783: creat d2/d58/dbf/dd0/d35/d97/d9d/df4/d28/d98/dbb/f108 x:0 0 0 2026-03-10T07:51:45.480 INFO:tasks.workunit.client.1.vm08.stdout:6/806: rename d1/db/d24/d3d/dea to d1/d3/df/d1d/d6f/d10e 0 2026-03-10T07:51:45.482 INFO:tasks.workunit.client.1.vm08.stdout:5/840: mknod d0/d4/df/c112 0 2026-03-10T07:51:45.485 INFO:tasks.workunit.client.1.vm08.stdout:6/807: truncate d1/d17/d2b/d58/d76/fb6 486683 0 2026-03-10T07:51:45.486 INFO:tasks.workunit.client.1.vm08.stdout:5/841: rmdir d0/d8/dce 39 2026-03-10T07:51:45.488 INFO:tasks.workunit.client.1.vm08.stdout:9/784: mknod d2/dda/c109 0 2026-03-10T07:51:45.489 INFO:tasks.workunit.client.1.vm08.stdout:0/818: link dd/d10/d14/d15/d20/d5f/cbf dd/d10/d14/d15/d20/d5f/d9f/c103 0 2026-03-10T07:51:45.490 INFO:tasks.workunit.client.1.vm08.stdout:7/759: link d3/da/d25/d9/d2f/d3a/d4b/fa3 d3/f104 0 2026-03-10T07:51:45.497 INFO:tasks.workunit.client.1.vm08.stdout:1/731: dread d2/d6/de/d1f/d26/d58/d8c/f96 [4194304,4194304] 0 2026-03-10T07:51:45.497 INFO:tasks.workunit.client.1.vm08.stdout:0/819: creat dd/d10/d2f/d37/d64/f104 x:0 0 0 2026-03-10T07:51:45.497 INFO:tasks.workunit.client.1.vm08.stdout:1/732: chown d2/d6/de/d70/fca 54180 1 2026-03-10T07:51:45.500 INFO:tasks.workunit.client.1.vm08.stdout:9/785: sync 2026-03-10T07:51:45.505 INFO:tasks.workunit.client.1.vm08.stdout:6/808: dread d1/db/f57 [0,4194304] 0 2026-03-10T07:51:45.561 INFO:tasks.workunit.client.1.vm08.stdout:8/889: dwrite d0/f22 [4194304,4194304] 0 2026-03-10T07:51:45.564 INFO:tasks.workunit.client.1.vm08.stdout:4/711: truncate d5/da0/d95/de6/d48/d4f/d8d/d91/fa6 3474216 0 2026-03-10T07:51:45.564 INFO:tasks.workunit.client.1.vm08.stdout:2/777: write d0/d1/f9f [938299,65115] 0 2026-03-10T07:51:45.572 INFO:tasks.workunit.client.1.vm08.stdout:4/712: dread d5/d1f/d41/f83 [0,4194304] 0 2026-03-10T07:51:45.577 INFO:tasks.workunit.client.1.vm08.stdout:0/820: truncate dd/d10/d14/da6/ffa 733494 0 
2026-03-10T07:51:45.583 INFO:tasks.workunit.client.1.vm08.stdout:1/733: rename d2/d6/de/d47/cbc to d2/d6/de/d1f/d22/cfc 0 2026-03-10T07:51:45.583 INFO:tasks.workunit.client.1.vm08.stdout:1/734: write d2/d6/de/d1f/d26/d89/ffa [119432,50268] 0 2026-03-10T07:51:45.584 INFO:tasks.workunit.client.1.vm08.stdout:1/735: readlink d2/d6/de/d1f/d40/d76/ld4 0 2026-03-10T07:51:45.585 INFO:tasks.workunit.client.1.vm08.stdout:9/786: creat d2/d58/dbf/dd0/d35/d97/d9d/df4/d28/d98/dbb/dd9/dfc/f10a x:0 0 0 2026-03-10T07:51:45.586 INFO:tasks.workunit.client.1.vm08.stdout:3/779: write d0/d3c/d18/f22 [8140267,15794] 0 2026-03-10T07:51:45.587 INFO:tasks.workunit.client.1.vm08.stdout:3/780: readlink d0/d3c/d18/d80/dc1/ld7 0 2026-03-10T07:51:45.593 INFO:tasks.workunit.client.1.vm08.stdout:6/809: rmdir d1/d17/d2b/d5e/dcb 39 2026-03-10T07:51:45.597 INFO:tasks.workunit.client.1.vm08.stdout:6/810: dwrite d1/d3/df/d1d/d40/d87/d95/ffe [0,4194304] 0 2026-03-10T07:51:45.598 INFO:tasks.workunit.client.1.vm08.stdout:8/890: creat d0/df/d15/d53/f11b x:0 0 0 2026-03-10T07:51:45.603 INFO:tasks.workunit.client.1.vm08.stdout:2/778: dread - d0/d1/d3/d10/fb4 zero size 2026-03-10T07:51:45.605 INFO:tasks.workunit.client.1.vm08.stdout:4/713: readlink d5/d1f/d9b/laa 0 2026-03-10T07:51:45.612 INFO:tasks.workunit.client.1.vm08.stdout:5/842: dwrite d0/d4/d19/d43/f59 [4194304,4194304] 0 2026-03-10T07:51:45.616 INFO:tasks.workunit.client.1.vm08.stdout:2/779: creat d0/d1/d3/d56/ff9 x:0 0 0 2026-03-10T07:51:45.618 INFO:tasks.workunit.client.1.vm08.stdout:4/714: truncate d5/d8/fc 4829119 0 2026-03-10T07:51:45.619 INFO:tasks.workunit.client.1.vm08.stdout:1/736: truncate d2/d6/de/d1f/d22/f81 4566252 0 2026-03-10T07:51:45.625 INFO:tasks.workunit.client.1.vm08.stdout:4/715: creat d5/d1f/d70/fec x:0 0 0 2026-03-10T07:51:45.625 INFO:tasks.workunit.client.1.vm08.stdout:4/716: chown d5/da0/d32/cdb 0 1 2026-03-10T07:51:45.625 INFO:tasks.workunit.client.1.vm08.stdout:2/780: truncate d0/d1/d3/d39/d7d/d86/d55/d1b/fc4 747955 0 
2026-03-10T07:51:45.626 INFO:tasks.workunit.client.1.vm08.stdout:1/737: write d2/d6/de/d70/d80/fbb [1090856,74425] 0
2026-03-10T07:51:45.637 INFO:tasks.workunit.client.1.vm08.stdout:2/781: creat d0/d1/d3/d56/d78/dad/db1/d61/dee/ffa x:0 0 0
2026-03-10T07:51:45.647 INFO:tasks.workunit.client.1.vm08.stdout:2/782: mkdir d0/d1/d17/dfb 0
2026-03-10T07:51:45.657 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:51:45 vm05.local ceph-mon[50387]: pgmap v42: 65 pgs: 65 active+clean; 3.2 GiB data, 11 GiB used, 109 GiB / 120 GiB avail; 54 MiB/s rd, 136 MiB/s wr, 323 op/s
2026-03-10T07:51:45.666 INFO:tasks.workunit.client.1.vm08.stdout:0/821: unlink dd/d10/d14/d1b/cae 0
2026-03-10T07:51:45.668 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:51:45 vm08.local ceph-mon[59917]: pgmap v42: 65 pgs: 65 active+clean; 3.2 GiB data, 11 GiB used, 109 GiB / 120 GiB avail; 54 MiB/s rd, 136 MiB/s wr, 323 op/s
2026-03-10T07:51:45.668 INFO:tasks.workunit.client.1.vm08.stdout:6/811: rename d1/d3/df/d1d/d40/d87/f89 to d1/db/d24/d73/d79/f10f 0
2026-03-10T07:51:45.672 INFO:tasks.workunit.client.1.vm08.stdout:6/812: chown d1/d3/df/d38/l88 255708824 1
2026-03-10T07:51:45.686 INFO:tasks.workunit.client.1.vm08.stdout:0/822: rename dd/d10/d2f/f90 to dd/d10/d2f/d37/d64/d95/d58/d3d/f105 0
2026-03-10T07:51:45.689 INFO:tasks.workunit.client.1.vm08.stdout:0/823: getdents dd/d10/d2f/d37/d64/d52 0
2026-03-10T07:51:45.699 INFO:tasks.workunit.client.1.vm08.stdout:7/760: write d3/da/d25/d9/d2f/d39/f76 [856994,99464] 0
2026-03-10T07:51:45.711 INFO:tasks.workunit.client.1.vm08.stdout:7/761: creat d3/da/d25/d9/d2f/d3a/dc0/f105 x:0 0 0
2026-03-10T07:51:45.711 INFO:tasks.workunit.client.1.vm08.stdout:7/762: write d3/da/f1d [4892129,51949] 0
2026-03-10T07:51:45.711 INFO:tasks.workunit.client.1.vm08.stdout:7/763: readlink d3/da/d25/d9/l66 0
2026-03-10T07:51:45.711 INFO:tasks.workunit.client.1.vm08.stdout:7/764: read d3/f34 [5434841,48997] 0
2026-03-10T07:51:45.711 INFO:tasks.workunit.client.1.vm08.stdout:7/765: symlink d3/da/d8a/dd1/l106 0
2026-03-10T07:51:45.711 INFO:tasks.workunit.client.1.vm08.stdout:7/766: write d3/da/d25/d9/d2f/d39/fc7 [3227561,110828] 0
2026-03-10T07:51:45.736 INFO:tasks.workunit.client.1.vm08.stdout:9/787: write d2/d58/dbf/dd0/d35/d97/d9d/df4/dee/d84/f94 [691109,25500] 0
2026-03-10T07:51:45.736 INFO:tasks.workunit.client.1.vm08.stdout:3/781: write d0/d3c/d1f/d44/f4b [5219793,117361] 0
2026-03-10T07:51:45.737 INFO:tasks.workunit.client.1.vm08.stdout:9/788: chown d2/d58/dbf/dd0/d35/dff 4072 1
2026-03-10T07:51:45.737 INFO:tasks.workunit.client.1.vm08.stdout:9/789: chown d2/d58/f95 157108 1
2026-03-10T07:51:45.738 INFO:tasks.workunit.client.1.vm08.stdout:9/790: chown d2/d58/dbf/dd0/d35/d97/d9d/df4/dee/d84/d91 1442798409 1
2026-03-10T07:51:45.741 INFO:tasks.workunit.client.1.vm08.stdout:9/791: rmdir d2/d58/dbf/dd0/d35/d97/d9d/df4/dee/d84/d91 39
2026-03-10T07:51:45.742 INFO:tasks.workunit.client.1.vm08.stdout:1/738: truncate d2/d6/d9f/fec 2728436 0
2026-03-10T07:51:45.743 INFO:tasks.workunit.client.1.vm08.stdout:1/739: chown d2/d6/de/d1f/d40/f4d 3310242 1
2026-03-10T07:51:45.747 INFO:tasks.workunit.client.1.vm08.stdout:9/792: dwrite d2/d58/dbf/dd0/d35/d97/d9d/df4/d28/d98/fc4 [0,4194304] 0
2026-03-10T07:51:45.781 INFO:tasks.workunit.client.1.vm08.stdout:9/793: rename d2/d26/fc1 to d2/d58/dbf/dd0/d35/d97/d9d/df4/d28/d98/dbb/dd9/f10b 0
2026-03-10T07:51:45.781 INFO:tasks.workunit.client.1.vm08.stdout:9/794: creat d2/d58/dbf/dd0/d35/d97/dd5/f10c x:0 0 0
2026-03-10T07:51:45.781 INFO:tasks.workunit.client.1.vm08.stdout:5/843: write d0/d4/df/dbf/d41/dad/fb9 [1488097,24130] 0
2026-03-10T07:51:45.781 INFO:tasks.workunit.client.1.vm08.stdout:9/795: dwrite d2/d58/dbf/f75 [0,4194304] 0
2026-03-10T07:51:45.781 INFO:tasks.workunit.client.1.vm08.stdout:4/717: unlink l3 0
2026-03-10T07:51:45.781 INFO:tasks.workunit.client.1.vm08.stdout:9/796: chown d2/d58/dbf/dd0/d35/d97/d9d/df4/d28/d98/dbb/cce 1956 1
2026-03-10T07:51:45.781 INFO:tasks.workunit.client.1.vm08.stdout:5/844: mknod d0/d77/d83/de0/c113 0
2026-03-10T07:51:45.781 INFO:tasks.workunit.client.1.vm08.stdout:4/718: symlink d5/da0/d95/de6/d48/d4f/led 0
2026-03-10T07:51:45.781 INFO:tasks.workunit.client.1.vm08.stdout:5/845: mknod d0/d4/d19/d81/da4/c114 0
2026-03-10T07:51:45.781 INFO:tasks.workunit.client.1.vm08.stdout:5/846: fdatasync d0/d8/d24/dd0/f8a 0
2026-03-10T07:51:45.781 INFO:tasks.workunit.client.1.vm08.stdout:8/891: mkdir d0/df/d15/d23/d54/dba/d11c 0
2026-03-10T07:51:45.781 INFO:tasks.workunit.client.1.vm08.stdout:5/847: dwrite d0/d4/d19/d3a/f3f [0,4194304] 0
2026-03-10T07:51:45.783 INFO:tasks.workunit.client.1.vm08.stdout:2/783: dwrite d0/f81 [0,4194304] 0
2026-03-10T07:51:45.799 INFO:tasks.workunit.client.1.vm08.stdout:2/784: creat d0/d1/d3/d10/d38/daf/ffc x:0 0 0
2026-03-10T07:51:45.802 INFO:tasks.workunit.client.1.vm08.stdout:2/785: creat d0/d1/d3/d39/d7d/d86/d55/dc9/ded/ffd x:0 0 0
2026-03-10T07:51:45.805 INFO:tasks.workunit.client.1.vm08.stdout:2/786: dread d0/d1/d17/db2/d9c/fbc [0,4194304] 0
2026-03-10T07:51:45.824 INFO:tasks.workunit.client.1.vm08.stdout:5/848: sync
2026-03-10T07:51:45.826 INFO:tasks.workunit.client.1.vm08.stdout:5/849: mknod d0/d77/daa/c115 0
2026-03-10T07:51:45.826 INFO:tasks.workunit.client.1.vm08.stdout:5/850: chown d0/d77 25532874 1
2026-03-10T07:51:45.828 INFO:tasks.workunit.client.1.vm08.stdout:5/851: creat d0/d4/d19/d81/d92/f116 x:0 0 0
2026-03-10T07:51:45.830 INFO:tasks.workunit.client.1.vm08.stdout:5/852: fsync d0/d4/d19/d60/d6d/fcb 0
2026-03-10T07:51:45.837 INFO:tasks.workunit.client.1.vm08.stdout:5/853: dread d0/d4/df/d12/f97 [0,4194304] 0
2026-03-10T07:51:45.840 INFO:tasks.workunit.client.1.vm08.stdout:5/854: dwrite d0/d4/d19/d43/f59 [0,4194304] 0
2026-03-10T07:51:45.842 INFO:tasks.workunit.client.1.vm08.stdout:5/855: chown d0/d4/d19/d3a/d69 1448201 1
2026-03-10T07:51:45.844 INFO:tasks.workunit.client.1.vm08.stdout:5/856: creat d0/d4/d19/d81/d92/f117 x:0 0 0
2026-03-10T07:51:45.844 INFO:tasks.workunit.client.1.vm08.stdout:5/857: chown d0/ddf/ca2 31 1
2026-03-10T07:51:45.845 INFO:tasks.workunit.client.1.vm08.stdout:5/858: stat d0/d4/d19/d3a/d69/fde 0
2026-03-10T07:51:45.845 INFO:tasks.workunit.client.1.vm08.stdout:5/859: chown d0/d4/d19/d60/d6d/d70/f67 7 1
2026-03-10T07:51:45.846 INFO:tasks.workunit.client.1.vm08.stdout:5/860: truncate d0/d33/ddd/f104 207010 0
2026-03-10T07:51:45.848 INFO:tasks.workunit.client.1.vm08.stdout:5/861: chown d0/d4/d19/d3a/f3f 33865 1
2026-03-10T07:51:45.851 INFO:tasks.workunit.client.1.vm08.stdout:5/862: getdents d0/d8/d24/de2/df7 0
2026-03-10T07:51:45.853 INFO:tasks.workunit.client.1.vm08.stdout:6/813: write d1/d3/df/fd7 [28240,13498] 0
2026-03-10T07:51:45.855 INFO:tasks.workunit.client.1.vm08.stdout:6/814: symlink d1/db/d24/dac/dad/l110 0
2026-03-10T07:51:45.869 INFO:tasks.workunit.client.1.vm08.stdout:0/824: write dd/f16 [1582563,10135] 0
2026-03-10T07:51:45.869 INFO:tasks.workunit.client.1.vm08.stdout:3/782: mknod d0/cf8 0
2026-03-10T07:51:45.869 INFO:tasks.workunit.client.1.vm08.stdout:0/825: link dd/d10/d14/d15/d20/d5f/d9f/c103 dd/d18/d100/c106 0
2026-03-10T07:51:45.869 INFO:tasks.workunit.client.1.vm08.stdout:0/826: dread - dd/d10/d14/d15/dad/ffc zero size
2026-03-10T07:51:45.869 INFO:tasks.workunit.client.1.vm08.stdout:3/783: dread d0/d3c/d18/d32/d61/d52/dca/fd0 [0,4194304] 0
2026-03-10T07:51:45.869 INFO:tasks.workunit.client.1.vm08.stdout:3/784: fsync d0/d3c/d18/d48/d55/fe0 0
2026-03-10T07:51:45.869 INFO:tasks.workunit.client.1.vm08.stdout:3/785: read d0/d3c/d18/d32/d61/d52/f70 [1922047,88355] 0
2026-03-10T07:51:45.873 INFO:tasks.workunit.client.1.vm08.stdout:3/786: creat d0/d3c/d18/d32/d61/d52/dca/ff9 x:0 0 0
2026-03-10T07:51:45.877 INFO:tasks.workunit.client.1.vm08.stdout:0/827: dread dd/d10/d2f/d37/d64/f68 [0,4194304] 0
2026-03-10T07:51:45.878 INFO:tasks.workunit.client.1.vm08.stdout:3/787: symlink d0/d3c/d18/d48/d55/d56/lfa 0
2026-03-10T07:51:45.881 INFO:tasks.workunit.client.1.vm08.stdout:0/828: dwrite dd/d10/d2f/d37/daf/ff0 [0,4194304] 0
2026-03-10T07:51:45.913 INFO:tasks.workunit.client.1.vm08.stdout:0/829: chown dd/d18/fc3 8163 1
2026-03-10T07:51:45.913 INFO:tasks.workunit.client.1.vm08.stdout:3/788: rmdir d0/d3c/d18/dec/d34 39
2026-03-10T07:51:45.913 INFO:tasks.workunit.client.1.vm08.stdout:3/789: stat d0/c19 0
2026-03-10T07:51:45.913 INFO:tasks.workunit.client.1.vm08.stdout:0/830: mkdir dd/d18/d100/d102/d107 0
2026-03-10T07:51:45.913 INFO:tasks.workunit.client.1.vm08.stdout:0/831: fsync dd/d10/d14/d15/d20/d5f/d9f/fbb 0
2026-03-10T07:51:45.913 INFO:tasks.workunit.client.1.vm08.stdout:3/790: dwrite d0/d3c/d18/dec/f82 [0,4194304] 0
2026-03-10T07:51:45.913 INFO:tasks.workunit.client.1.vm08.stdout:3/791: chown d0/d3c/d1f/ff5 1226279451 1
2026-03-10T07:51:45.913 INFO:tasks.workunit.client.1.vm08.stdout:0/832: mkdir dd/d10/d14/d1b/da5/d108 0
2026-03-10T07:51:45.913 INFO:tasks.workunit.client.1.vm08.stdout:0/833: dread dd/d10/d14/da6/ffa [0,4194304] 0
2026-03-10T07:51:45.913 INFO:tasks.workunit.client.1.vm08.stdout:3/792: mknod d0/d3c/d18/d80/cfb 0
2026-03-10T07:51:45.915 INFO:tasks.workunit.client.1.vm08.stdout:3/793: dwrite d0/d3c/d1f/d89/fea [0,4194304] 0
2026-03-10T07:51:45.924 INFO:tasks.workunit.client.1.vm08.stdout:3/794: unlink d0/d3c/d18/d4a/c7b 0
2026-03-10T07:51:45.965 INFO:tasks.workunit.client.1.vm08.stdout:7/767: truncate d3/da/d25/d9/d2f/d4d/db6/fc6 491505 0
2026-03-10T07:51:45.966 INFO:tasks.workunit.client.1.vm08.stdout:7/768: write d3/da/d25/d9/d2f/d3a/d71/d8c/ddf/ff4 [700341,19997] 0
2026-03-10T07:51:45.971 INFO:tasks.workunit.client.1.vm08.stdout:1/740: dwrite d2/d6/de/d1f/d26/d58/d8c/f87 [0,4194304] 0
2026-03-10T07:51:45.972 INFO:tasks.workunit.client.1.vm08.stdout:7/769: stat d3/da/d25/d9/d2f/d39/f56 0
2026-03-10T07:51:45.979 INFO:tasks.workunit.client.1.vm08.stdout:7/770: mkdir d3/da/d25/d9/d2f/d39/d43/d4f/d5b/db7/d107 0
2026-03-10T07:51:45.982 INFO:tasks.workunit.client.1.vm08.stdout:7/771: chown d3/da/d25/d9/d2f/d3a/d4b/fa3 105506010 1
2026-03-10T07:51:45.985 INFO:tasks.workunit.client.1.vm08.stdout:7/772: dwrite d3/da/d25/d9/fc5 [0,4194304] 0
2026-03-10T07:51:45.987 INFO:tasks.workunit.client.1.vm08.stdout:8/892: write d0/d69/f4c [4302452,39291] 0
2026-03-10T07:51:45.992 INFO:tasks.workunit.client.1.vm08.stdout:9/797: dwrite d2/d58/dbf/dd0/d35/d9b/fb6 [0,4194304] 0
2026-03-10T07:51:45.993 INFO:tasks.workunit.client.1.vm08.stdout:1/741: dread d2/d6/d50/f54 [0,4194304] 0
2026-03-10T07:51:45.997 INFO:tasks.workunit.client.1.vm08.stdout:4/719: dwrite d5/d8/f42 [0,4194304] 0
2026-03-10T07:51:46.002 INFO:tasks.workunit.client.1.vm08.stdout:2/787: dwrite d0/d1/d17/d6b/f9a [4194304,4194304] 0
2026-03-10T07:51:46.007 INFO:tasks.workunit.client.1.vm08.stdout:1/742: creat d2/d6/de/d70/ffd x:0 0 0
2026-03-10T07:51:46.007 INFO:tasks.workunit.client.1.vm08.stdout:9/798: mkdir d2/d58/dbf/dd0/d35/d97/d9d/df4/dee/d84/dca/dd7/d10d 0
2026-03-10T07:51:46.008 INFO:tasks.workunit.client.1.vm08.stdout:7/773: getdents d3/da/d25/d9/d2f/d3a/dc0/dda 0
2026-03-10T07:51:46.009 INFO:tasks.workunit.client.1.vm08.stdout:8/893: mknod d0/df/d15/d23/c11d 0
2026-03-10T07:51:46.012 INFO:tasks.workunit.client.1.vm08.stdout:2/788: mkdir d0/d1/d17/dfe 0
2026-03-10T07:51:46.012 INFO:tasks.workunit.client.1.vm08.stdout:3/795: dread d0/d3c/d18/dec/d2d/f3a [0,4194304] 0
2026-03-10T07:51:46.013 INFO:tasks.workunit.client.1.vm08.stdout:8/894: chown d0/df/d17/l4d 6806186 1
2026-03-10T07:51:46.015 INFO:tasks.workunit.client.1.vm08.stdout:7/774: rename d3/da/d25/d9/d2f/d39/d43/d4f/d5b/d79/dc3 to d3/da/d25/d9/d2f/d3a/dc0/d108 0
2026-03-10T07:51:46.019 INFO:tasks.workunit.client.1.vm08.stdout:9/799: dread d2/d58/dbf/dd0/d35/d97/d9d/df4/dee/f12 [0,4194304] 0
2026-03-10T07:51:46.019 INFO:tasks.workunit.client.1.vm08.stdout:7/775: chown d3/da/d25/d9/d2f/d39/d43/d4f/d5b/db7/dac 17 1
2026-03-10T07:51:46.024 INFO:tasks.workunit.client.1.vm08.stdout:1/743: mkdir d2/d6/dfe 0
2026-03-10T07:51:46.026 INFO:tasks.workunit.client.1.vm08.stdout:1/744: stat d2/d6/de/d1f/d26/d89/ffa 0
2026-03-10T07:51:46.026 INFO:tasks.workunit.client.1.vm08.stdout:3/796: fdatasync d0/d3c/d18/dec/d2d/f3a 0
2026-03-10T07:51:46.026 INFO:tasks.workunit.client.1.vm08.stdout:8/895: truncate d0/d37/d86/fb7 839120 0
2026-03-10T07:51:46.028 INFO:tasks.workunit.client.1.vm08.stdout:8/896: dread - d0/df/d15/d23/d39/d5b/d4a/f115 zero size
2026-03-10T07:51:46.029 INFO:tasks.workunit.client.1.vm08.stdout:2/789: truncate d0/d1/d3/d39/d7d/d86/f34 3695765 0
2026-03-10T07:51:46.031 INFO:tasks.workunit.client.1.vm08.stdout:7/776: dread d3/f2b [0,4194304] 0
2026-03-10T07:51:46.036 INFO:tasks.workunit.client.1.vm08.stdout:1/745: creat d2/d6/de/d5f/df9/fff x:0 0 0
2026-03-10T07:51:46.046 INFO:tasks.workunit.client.1.vm08.stdout:2/790: mknod d0/cff 0
2026-03-10T07:51:46.047 INFO:tasks.workunit.client.1.vm08.stdout:3/797: getdents d0/d3c/d1f/d89 0
2026-03-10T07:51:46.049 INFO:tasks.workunit.client.1.vm08.stdout:2/791: dread d0/d1/d3/d10/d65/fc5 [0,4194304] 0
2026-03-10T07:51:46.050 INFO:tasks.workunit.client.1.vm08.stdout:7/777: dread d3/da/d25/d9/d2f/d39/f58 [0,4194304] 0
2026-03-10T07:51:46.052 INFO:tasks.workunit.client.1.vm08.stdout:2/792: mkdir d0/d1/d3/d10/d38/d100 0
2026-03-10T07:51:46.053 INFO:tasks.workunit.client.1.vm08.stdout:2/793: readlink d0/d1/d17/d6b/da0/dd7/le6 0
2026-03-10T07:51:46.055 INFO:tasks.workunit.client.1.vm08.stdout:8/897: sync
2026-03-10T07:51:46.057 INFO:tasks.workunit.client.1.vm08.stdout:8/898: mknod d0/d37/d86/c11e 0
2026-03-10T07:51:46.058 INFO:tasks.workunit.client.1.vm08.stdout:2/794: mkdir d0/d1/d3/d10/d38/d100/d101 0
2026-03-10T07:51:46.059 INFO:tasks.workunit.client.1.vm08.stdout:3/798: dwrite d0/d3c/d18/d32/d61/d83/fda [0,4194304] 0
2026-03-10T07:51:46.064 INFO:tasks.workunit.client.1.vm08.stdout:8/899: dread d0/df/d15/d23/d39/f3e [0,4194304] 0
2026-03-10T07:51:46.064 INFO:tasks.workunit.client.1.vm08.stdout:2/795: creat d0/d1/d3/d10/d38/d100/f102 x:0 0 0
2026-03-10T07:51:46.065 INFO:tasks.workunit.client.1.vm08.stdout:3/799: creat d0/d3c/d18/d80/ffc x:0 0 0
2026-03-10T07:51:46.067 INFO:tasks.workunit.client.1.vm08.stdout:5/863: write d0/d8/d5e/d8e/f96 [430229,7004] 0
2026-03-10T07:51:46.072 INFO:tasks.workunit.client.1.vm08.stdout:6/815: dwrite d1/db/d24/d73/d79/f10f [0,4194304] 0
2026-03-10T07:51:46.074 INFO:tasks.workunit.client.1.vm08.stdout:6/816: readlink d1/d3/df/d38/l88 0
2026-03-10T07:51:46.088 INFO:tasks.workunit.client.1.vm08.stdout:3/800: mkdir d0/d3c/d18/d32/dfd 0
2026-03-10T07:51:46.089 INFO:tasks.workunit.client.1.vm08.stdout:3/801: fdatasync d0/d3c/d1f/d95/ff1 0
2026-03-10T07:51:46.096 INFO:tasks.workunit.client.1.vm08.stdout:5/864: fdatasync d0/d4/d19/d81/da4/fc4 0
2026-03-10T07:51:46.098 INFO:tasks.workunit.client.1.vm08.stdout:5/865: dread d0/d4/df/d12/f46 [0,4194304] 0
2026-03-10T07:51:46.122 INFO:tasks.workunit.client.1.vm08.stdout:5/866: unlink d0/d4/df/f27 0
2026-03-10T07:51:46.123 INFO:tasks.workunit.client.1.vm08.stdout:0/834: write dd/d10/d2f/d37/f65 [1400967,70055] 0
2026-03-10T07:51:46.126 INFO:tasks.workunit.client.1.vm08.stdout:2/796: creat d0/d1/d3/d56/d78/dad/db1/f103 x:0 0 0
2026-03-10T07:51:46.136 INFO:tasks.workunit.client.1.vm08.stdout:0/835: unlink dd/d10/d2f/l47 0
2026-03-10T07:51:46.142 INFO:tasks.workunit.client.1.vm08.stdout:9/800: dwrite d2/d58/dbf/dd0/d35/d9b/fa9 [0,4194304] 0
2026-03-10T07:51:46.144 INFO:tasks.workunit.client.1.vm08.stdout:9/801: write d2/d58/dbf/dd0/d35/d97/d9d/df4/d28/d98/fd4 [466520,127095] 0
2026-03-10T07:51:46.144 INFO:tasks.workunit.client.1.vm08.stdout:9/802: stat d2/d58/dbf/dd0/d35/c2f 0
2026-03-10T07:51:46.145 INFO:tasks.workunit.client.1.vm08.stdout:1/746: truncate d2/d6/de/d5f/fb1 2789346 0
2026-03-10T07:51:46.145 INFO:tasks.workunit.client.1.vm08.stdout:1/747: chown d2/d6/de/fc0 268 1
2026-03-10T07:51:46.146 INFO:tasks.workunit.client.1.vm08.stdout:1/748: chown d2/d6/de/d1f/d26/d58/d83 373615888 1
2026-03-10T07:51:46.146 INFO:tasks.workunit.client.1.vm08.stdout:3/802: creat d0/d3c/d18/d32/d61/d52/ffe x:0 0 0
2026-03-10T07:51:46.155 INFO:tasks.workunit.client.1.vm08.stdout:7/778: dwrite d3/da/d25/f29 [0,4194304] 0
2026-03-10T07:51:46.156 INFO:tasks.workunit.client.1.vm08.stdout:7/779: chown d3 951325972 1
2026-03-10T07:51:46.157 INFO:tasks.workunit.client.1.vm08.stdout:7/780: chown d3/da/d25/d9/d2f/dfb 297 1
2026-03-10T07:51:46.160 INFO:tasks.workunit.client.1.vm08.stdout:8/900: dwrite d0/df/d15/d9c/fab [0,4194304] 0
2026-03-10T07:51:46.166 INFO:tasks.workunit.client.1.vm08.stdout:6/817: write d1/db/d24/f93 [197484,98560] 0
2026-03-10T07:51:46.169 INFO:tasks.workunit.client.1.vm08.stdout:8/901: sync
2026-03-10T07:51:46.175 INFO:tasks.workunit.client.1.vm08.stdout:0/836: dwrite dd/d10/d14/d1b/da5/fb6 [0,4194304] 0
2026-03-10T07:51:46.179 INFO:tasks.workunit.client.1.vm08.stdout:1/749: chown d2/d6/de/f74 1 1
2026-03-10T07:51:46.184 INFO:tasks.workunit.client.1.vm08.stdout:4/720: creat d5/d1f/d31/dce/fee x:0 0 0
2026-03-10T07:51:46.216 INFO:tasks.workunit.client.1.vm08.stdout:3/803: dwrite d0/f84 [4194304,4194304] 0
2026-03-10T07:51:46.218 INFO:tasks.workunit.client.1.vm08.stdout:5/867: getdents d0/d4/d19/d3a/d106 0
2026-03-10T07:51:46.230 INFO:tasks.workunit.client.1.vm08.stdout:7/781: creat d3/da/d25/df3/f109 x:0 0 0
2026-03-10T07:51:46.231 INFO:tasks.workunit.client.1.vm08.stdout:6/818: mknod d1/d17/c111 0
2026-03-10T07:51:46.236 INFO:tasks.workunit.client.1.vm08.stdout:2/797: getdents d0/d1/d3/d56/d78/dad/db1/d61/d84/da7 0
2026-03-10T07:51:46.238 INFO:tasks.workunit.client.1.vm08.stdout:7/782: sync
2026-03-10T07:51:46.246 INFO:tasks.workunit.client.1.vm08.stdout:9/803: dwrite d2/d58/dbf/dd0/d35/fc3 [0,4194304] 0
2026-03-10T07:51:46.248 INFO:tasks.workunit.client.1.vm08.stdout:1/750: truncate d2/d6/d3a/f7d 744832 0
2026-03-10T07:51:46.251 INFO:tasks.workunit.client.1.vm08.stdout:8/902: dwrite d0/df/d15/d23/d39/d5b/dea/dce/fd2 [0,4194304] 0
2026-03-10T07:51:46.260 INFO:tasks.workunit.client.1.vm08.stdout:8/903: dwrite d0/df/d2e/d49/ff3 [4194304,4194304] 0
2026-03-10T07:51:46.261 INFO:tasks.workunit.client.1.vm08.stdout:8/904: dread - d0/df/d2e/f101 zero size
2026-03-10T07:51:46.269 INFO:tasks.workunit.client.1.vm08.stdout:6/819: unlink d1/d7d/f91 0
2026-03-10T07:51:46.274 INFO:tasks.workunit.client.1.vm08.stdout:2/798: symlink d0/d1/d3/d39/d7d/d86/d55/dc9/ded/l104 0
2026-03-10T07:51:46.274 INFO:tasks.workunit.client.1.vm08.stdout:2/799: chown d0/d1/d3/d10/d38/daf 1728 1
2026-03-10T07:51:46.281 INFO:tasks.workunit.client.1.vm08.stdout:9/804: creat d2/d58/dbf/dd0/d35/d97/d9d/f10e x:0 0 0
2026-03-10T07:51:46.281 INFO:tasks.workunit.client.1.vm08.stdout:9/805: truncate d2/d26/ffe 633792 0
2026-03-10T07:51:46.283 INFO:tasks.workunit.client.1.vm08.stdout:1/751: mknod d2/d6/de/d1f/d40/d76/c100 0
2026-03-10T07:51:46.284 INFO:tasks.workunit.client.1.vm08.stdout:1/752: read - d2/d6/de/d1f/d26/ff6 zero size
2026-03-10T07:51:46.285 INFO:tasks.workunit.client.1.vm08.stdout:1/753: truncate d2/d6/de/d1f/d26/d89/d8e/fe3 992941 0
2026-03-10T07:51:46.285 INFO:tasks.workunit.client.1.vm08.stdout:1/754: stat d2/d6/de/d47/dbd 0
2026-03-10T07:51:46.286 INFO:tasks.workunit.client.1.vm08.stdout:4/721: mkdir d5/da0/d12/def 0
2026-03-10T07:51:46.293 INFO:tasks.workunit.client.1.vm08.stdout:5/868: mknod d0/d4/d19/c118 0
2026-03-10T07:51:46.298 INFO:tasks.workunit.client.1.vm08.stdout:5/869: chown d0/d4/df/dbf/daf/fc0 324481094 1
2026-03-10T07:51:46.303 INFO:tasks.workunit.client.1.vm08.stdout:2/800: rename d0/f68 to d0/d1/d17/dfb/f105 0
2026-03-10T07:51:46.305 INFO:tasks.workunit.client.1.vm08.stdout:0/837: getdents dd/d10/d2f/d37/d64/d95/d58/d3d 0
2026-03-10T07:51:46.307 INFO:tasks.workunit.client.1.vm08.stdout:9/806: symlink d2/l10f 0
2026-03-10T07:51:46.308 INFO:tasks.workunit.client.1.vm08.stdout:9/807: write d2/d58/dbf/dd0/d35/d97/d9d/df4/dee/d84/f100 [1006411,32299] 0
2026-03-10T07:51:46.309 INFO:tasks.workunit.client.1.vm08.stdout:3/804: link d0/d3c/d18/d80/fe3 d0/d3c/d1f/d89/fff 0
2026-03-10T07:51:46.316 INFO:tasks.workunit.client.1.vm08.stdout:1/755: creat d2/d6/de/d47/da0/f101 x:0 0 0
2026-03-10T07:51:46.316 INFO:tasks.workunit.client.1.vm08.stdout:2/801: dread d0/d1/d3/d39/d7d/d7e/fa5 [0,4194304] 0
2026-03-10T07:51:46.321 INFO:tasks.workunit.client.1.vm08.stdout:8/905: write d0/df/d15/d23/d54/dba/d89/dc5/fd7 [918533,109413] 0
2026-03-10T07:51:46.326 INFO:tasks.workunit.client.1.vm08.stdout:7/783: dwrite d3/da/d25/d9/d2f/d4d/db6/fc6 [0,4194304] 0
2026-03-10T07:51:46.327 INFO:tasks.workunit.client.1.vm08.stdout:6/820: creat d1/d3/df/d38/def/f112 x:0 0 0
2026-03-10T07:51:46.328 INFO:tasks.workunit.client.1.vm08.stdout:8/906: dread d0/df/d15/d23/d39/d5b/dbc/fe1 [0,4194304] 0
2026-03-10T07:51:46.330 INFO:tasks.workunit.client.1.vm08.stdout:5/870: rename d0/d4/d19/l87 to d0/d4/df/dbf/d41/dc8/d101/l119 0
2026-03-10T07:51:46.330 INFO:tasks.workunit.client.1.vm08.stdout:6/821: write d1/d3/df/d44/f109 [223032,12270] 0
2026-03-10T07:51:46.338 INFO:tasks.workunit.client.1.vm08.stdout:0/838: truncate dd/d10/d2f/d37/d64/d95/f48 1867005 0
2026-03-10T07:51:46.344 INFO:tasks.workunit.client.1.vm08.stdout:1/756: mknod d2/d6/de/d47/da0/c102 0
2026-03-10T07:51:46.350 INFO:tasks.workunit.client.1.vm08.stdout:1/757: chown d2/d6/de/d1f/f3d 16085706 1
2026-03-10T07:51:46.370 INFO:tasks.workunit.client.1.vm08.stdout:7/784: rmdir d3/da/d25/d9/d2f/d3a/d40 39
2026-03-10T07:51:46.380 INFO:tasks.workunit.client.1.vm08.stdout:8/907: chown d0/df/cd6 786798 1
2026-03-10T07:51:46.382 INFO:tasks.workunit.client.1.vm08.stdout:9/808: mkdir d2/d58/dbf/dd0/d35/d97/d110 0
2026-03-10T07:51:46.385 INFO:tasks.workunit.client.1.vm08.stdout:6/822: write d1/d17/d2b/d5e/f96 [436951,106282] 0
2026-03-10T07:51:46.387 INFO:tasks.workunit.client.1.vm08.stdout:6/823: readlink d1/d3/df/d1d/d6f/lcf 0
2026-03-10T07:51:46.400 INFO:tasks.workunit.client.1.vm08.stdout:6/824: dread d1/d3/f2e [0,4194304] 0
2026-03-10T07:51:46.423 INFO:tasks.workunit.client.1.vm08.stdout:1/758: creat d2/d6/de/d47/da0/f103 x:0 0 0
2026-03-10T07:51:46.427 INFO:tasks.workunit.client.1.vm08.stdout:7/785: rmdir d3/da/d25/d9/d2f/d3a 39
2026-03-10T07:51:46.429 INFO:tasks.workunit.client.1.vm08.stdout:5/871: symlink d0/d4/df/d82/df3/l11a 0
2026-03-10T07:51:46.433 INFO:tasks.workunit.client.1.vm08.stdout:3/805: creat d0/f100 x:0 0 0
2026-03-10T07:51:46.433 INFO:tasks.workunit.client.1.vm08.stdout:3/806: write d0/d3c/d1f/d95/fbd [330040,62044] 0
2026-03-10T07:51:46.434 INFO:tasks.workunit.client.1.vm08.stdout:0/839: link dd/d10/d14/d15/d20/f7e dd/d10/d2f/d37/d64/d95/d58/d3d/d9b/f109 0
2026-03-10T07:51:46.434 INFO:tasks.workunit.client.1.vm08.stdout:3/807: chown d0/d3c/d1f/c64 385569 1
2026-03-10T07:51:46.437 INFO:tasks.workunit.client.1.vm08.stdout:3/808: write d0/d3c/d18/d80/fe6 [140147,70532] 0
2026-03-10T07:51:46.437 INFO:tasks.workunit.client.1.vm08.stdout:9/809: creat d2/d58/dbf/dd0/d35/dff/f111 x:0 0 0
2026-03-10T07:51:46.441 INFO:tasks.workunit.client.1.vm08.stdout:4/722: creat d5/d8/ff0 x:0 0 0
2026-03-10T07:51:46.443 INFO:tasks.workunit.client.1.vm08.stdout:1/759: chown d2/d6/de/d1f/d26/d58/d8c/l6a 99 1
2026-03-10T07:51:46.445 INFO:tasks.workunit.client.1.vm08.stdout:7/786: rename d3/da/d25/d9/fd6 to d3/da/d25/d9/d2f/d39/d43/d4f/d5b/d79/ddd/dfd/f10a 0
2026-03-10T07:51:46.448 INFO:tasks.workunit.client.1.vm08.stdout:5/872: fdatasync d0/d8/f1b 0
2026-03-10T07:51:46.450 INFO:tasks.workunit.client.1.vm08.stdout:4/723: sync
2026-03-10T07:51:46.456 INFO:tasks.workunit.client.1.vm08.stdout:0/840: creat dd/d10/d2f/d37/d64/d95/d58/f10a x:0 0 0
2026-03-10T07:51:46.456 INFO:tasks.workunit.client.1.vm08.stdout:0/841: truncate dd/d10/dbd/fbe 1045053 0
2026-03-10T07:51:46.459 INFO:tasks.workunit.client.1.vm08.stdout:7/787: dread d3/da/d25/d9/d2f/d39/fc7 [4194304,4194304] 0
2026-03-10T07:51:46.461 INFO:tasks.workunit.client.1.vm08.stdout:9/810: unlink d2/d58/dbf/dd0/d35/d97/d9d/c104 0
2026-03-10T07:51:46.464 INFO:tasks.workunit.client.1.vm08.stdout:7/788: dwrite d3/da/f1d [0,4194304] 0
2026-03-10T07:51:46.466 INFO:tasks.workunit.client.1.vm08.stdout:1/760: fdatasync d2/f69 0
2026-03-10T07:51:46.473 INFO:tasks.workunit.client.1.vm08.stdout:5/873: mknod d0/d4/df/dbf/daf/c11b 0
2026-03-10T07:51:46.474 INFO:tasks.workunit.client.1.vm08.stdout:4/724: stat d5/da0/l5c 0
2026-03-10T07:51:46.475 INFO:tasks.workunit.client.1.vm08.stdout:0/842: mknod dd/d10/d14/d15/d20/d7a/c10b 0
2026-03-10T07:51:46.476 INFO:tasks.workunit.client.1.vm08.stdout:2/802: truncate d0/d1/d17/d6b/f9a 2826491 0
2026-03-10T07:51:46.477 INFO:tasks.workunit.client.1.vm08.stdout:3/809: mknod d0/d3c/d18/dec/c101 0
2026-03-10T07:51:46.484 INFO:tasks.workunit.client.1.vm08.stdout:6/825: write d1/db/d24/d73/d79/d7c/fa3 [3897366,77296] 0
2026-03-10T07:51:46.489 INFO:tasks.workunit.client.1.vm08.stdout:1/761: mkdir d2/d6/de/d1f/d26/d58/d83/d104 0
2026-03-10T07:51:46.495 INFO:tasks.workunit.client.1.vm08.stdout:2/803: dread d0/d1/d3/d56/d78/dad/fdc [0,4194304] 0
2026-03-10T07:51:46.497 INFO:tasks.workunit.client.1.vm08.stdout:8/908: dwrite d0/df/d2e/f3c [0,4194304] 0
2026-03-10T07:51:46.506 INFO:tasks.workunit.client.1.vm08.stdout:8/909: sync
2026-03-10T07:51:46.506 INFO:tasks.workunit.client.1.vm08.stdout:5/874: creat d0/d4/df/d82/df3/f11c x:0 0 0
2026-03-10T07:51:46.507 INFO:tasks.workunit.client.1.vm08.stdout:8/910: chown d0/df/d15/d23/d54/dba/d89/fac 36415775 1
2026-03-10T07:51:46.508 INFO:tasks.workunit.client.1.vm08.stdout:8/911: readlink d0/df/d2e/l107 0
2026-03-10T07:51:46.511 INFO:tasks.workunit.client.1.vm08.stdout:4/725: rename d5/da0/d95/de6/d48/d4f/f56 to d5/da0/d95/de6/d48/d4f/d8d/d91/dd5/ff1 0
2026-03-10T07:51:46.515 INFO:tasks.workunit.client.1.vm08.stdout:4/726: dwrite d5/d1f/d70/fbd [0,4194304] 0
2026-03-10T07:51:46.536 INFO:tasks.workunit.client.1.vm08.stdout:9/811: mkdir d2/d58/dbf/dd0/d35/d97/d9d/df4/dee/d84/dca/dd7/d10d/d112 0
2026-03-10T07:51:46.536 INFO:tasks.workunit.client.1.vm08.stdout:5/875: dread d0/d4/df/dbf/d41/f5c [0,4194304] 0
2026-03-10T07:51:46.537 INFO:tasks.workunit.client.1.vm08.stdout:9/812: readlink d2/d58/dbf/d2b/l34 0
2026-03-10T07:51:46.541 INFO:tasks.workunit.client.1.vm08.stdout:6/826: symlink d1/d3/df/d1d/d40/d87/l113 0
2026-03-10T07:51:46.542 INFO:tasks.workunit.client.1.vm08.stdout:6/827: chown d1/d3/df/la1 43 1
2026-03-10T07:51:46.543 INFO:tasks.workunit.client.1.vm08.stdout:6/828: write d1/db/d24/d73/d79/f10f [1499497,77972] 0
2026-03-10T07:51:46.552 INFO:tasks.workunit.client.1.vm08.stdout:1/762: dwrite d2/d6/de/d1f/d26/d58/d8c/f96 [4194304,4194304] 0
2026-03-10T07:51:46.571 INFO:tasks.workunit.client.1.vm08.stdout:5/876: creat d0/d8/d5e/d8e/f11d x:0 0 0
2026-03-10T07:51:46.573 INFO:tasks.workunit.client.1.vm08.stdout:1/763: mknod d2/d6/d9f/c105 0
2026-03-10T07:51:46.576 INFO:tasks.workunit.client.1.vm08.stdout:8/912: mkdir d0/df/d15/d23/d54/dba/d11c/d11f 0
2026-03-10T07:51:46.577 INFO:tasks.workunit.client.1.vm08.stdout:8/913: write d0/df/d15/d23/d54/dba/d89/dc5/f111 [466832,46627] 0
2026-03-10T07:51:46.578 INFO:tasks.workunit.client.1.vm08.stdout:8/914: fsync d0/df/d17/d25/fc4 0
2026-03-10T07:51:46.579 INFO:tasks.workunit.client.1.vm08.stdout:8/915: dread - d0/df/d15/d23/da8/fc3 zero size
2026-03-10T07:51:46.579 INFO:tasks.workunit.client.1.vm08.stdout:0/843: creat dd/d10/d14/d1b/f10c x:0 0 0
2026-03-10T07:51:46.584 INFO:tasks.workunit.client.1.vm08.stdout:4/727: symlink d5/d8/d50/db0/lf2 0
2026-03-10T07:51:46.588 INFO:tasks.workunit.client.1.vm08.stdout:3/810: link d0/d3c/d18/d32/d61/d52/f68 d0/d3c/d18/da9/f102 0
2026-03-10T07:51:46.598 INFO:tasks.workunit.client.1.vm08.stdout:7/789: rmdir d3/da/d25/d9/d2f/d3a/dc0/d108 0
2026-03-10T07:51:46.603 INFO:tasks.workunit.client.1.vm08.stdout:3/811: dread d0/d3c/d18/f38 [0,4194304] 0
2026-03-10T07:51:46.604 INFO:tasks.workunit.client.1.vm08.stdout:0/844: mkdir dd/d10/dbd/d10d 0
2026-03-10T07:51:46.604 INFO:tasks.workunit.client.1.vm08.stdout:8/916: rmdir d0/df/d15/d23/d39 39
2026-03-10T07:51:46.613 INFO:tasks.workunit.client.1.vm08.stdout:7/790: dread d3/da/f21 [4194304,4194304] 0
2026-03-10T07:51:46.614 INFO:tasks.workunit.client.1.vm08.stdout:4/728: dwrite d5/d8/fc1 [0,4194304] 0
2026-03-10T07:51:46.627 INFO:tasks.workunit.client.1.vm08.stdout:2/804: rename d0/d1/d3/d39/d7d/d86/c41 to d0/c106 0
2026-03-10T07:51:46.628 INFO:tasks.workunit.client.1.vm08.stdout:9/813: rename d2/d58/dbf/dd0 to d2/d58/dbf/dd0/d35/d97/dfb/d113 22
2026-03-10T07:51:46.631 INFO:tasks.workunit.client.1.vm08.stdout:0/845: mknod dd/c10e 0
2026-03-10T07:51:46.632 INFO:tasks.workunit.client.1.vm08.stdout:5/877: dwrite d0/d4/df/dbf/f64 [0,4194304] 0
2026-03-10T07:51:46.642 INFO:tasks.workunit.client.1.vm08.stdout:4/729: creat d5/d1f/d9b/ff3 x:0 0 0
2026-03-10T07:51:46.645 INFO:tasks.workunit.client.1.vm08.stdout:6/829: rename d1/db/d24/d73 to d1/d17/d2b/d58/d76/d114 0
2026-03-10T07:51:46.646 INFO:tasks.workunit.client.1.vm08.stdout:4/730: dwrite d5/d8/f42 [0,4194304] 0
2026-03-10T07:51:46.652 INFO:tasks.workunit.client.1.vm08.stdout:9/814: creat d2/d58/dbf/dd0/d35/d9b/f114 x:0 0 0
2026-03-10T07:51:46.652 INFO:tasks.workunit.client.1.vm08.stdout:5/878: unlink d0/d4/d19/d3a/d69/f6b 0
2026-03-10T07:51:46.653 INFO:tasks.workunit.client.1.vm08.stdout:4/731: chown d5/d1f/dad/dcf 7501806 1
2026-03-10T07:51:46.654 INFO:tasks.workunit.client.1.vm08.stdout:3/812: mknod d0/d3c/d18/d32/d61/c103 0
2026-03-10T07:51:46.655 INFO:tasks.workunit.client.1.vm08.stdout:1/764: getdents d2/d6 0
2026-03-10T07:51:46.656 INFO:tasks.workunit.client.1.vm08.stdout:2/805: mkdir d0/d1/d3/d39/de2/d107 0
2026-03-10T07:51:46.656 INFO:tasks.workunit.client.1.vm08.stdout:9/815: dread - d2/d58/dbf/dd0/d35/d97/fe1 zero size
2026-03-10T07:51:46.657 INFO:tasks.workunit.client.1.vm08.stdout:1/765: stat d2/d6/de/d1f/d40/lce 0
2026-03-10T07:51:46.657 INFO:tasks.workunit.client.1.vm08.stdout:8/917: dwrite d0/df/d15/d23/d54/dba/d89/dbf/fe5 [0,4194304] 0
2026-03-10T07:51:46.659 INFO:tasks.workunit.client.1.vm08.stdout:9/816: write d2/d58/dbf/dd0/d35/d97/d9d/df4/d28/d98/fd4 [1788199,53158] 0
2026-03-10T07:51:46.661 INFO:tasks.workunit.client.1.vm08.stdout:5/879: mknod d0/d4/d19/d60/c11e 0
2026-03-10T07:51:46.667 INFO:tasks.workunit.client.1.vm08.stdout:4/732: truncate d5/d1f/d31/fba 39587 0
2026-03-10T07:51:46.668 INFO:tasks.workunit.client.1.vm08.stdout:4/733: chown d5/d8/l94 46405629 1
2026-03-10T07:51:46.669 INFO:tasks.workunit.client.1.vm08.stdout:2/806: creat d0/d1/d3/d39/d7d/d86/d55/dc9/ded/f108 x:0 0 0
2026-03-10T07:51:46.673 INFO:tasks.workunit.client.1.vm08.stdout:1/766: creat d2/d6/de/d1f/d26/d58/d8c/f106 x:0 0 0
2026-03-10T07:51:46.673 INFO:tasks.workunit.client.1.vm08.stdout:5/880: mkdir d0/d4/d19/d60/d6d/d70/d40/dba/d11f 0
2026-03-10T07:51:46.673 INFO:tasks.workunit.client.1.vm08.stdout:1/767: readlink d2/d6/de/lf1 0
2026-03-10T07:51:46.674 INFO:tasks.workunit.client.1.vm08.stdout:8/918: dread d0/df/d15/d9c/fab [0,4194304] 0
2026-03-10T07:51:46.676 INFO:tasks.workunit.client.1.vm08.stdout:4/734: creat d5/da0/d32/ff4 x:0 0 0
2026-03-10T07:51:46.677 INFO:tasks.workunit.client.1.vm08.stdout:1/768: mknod d2/d6/d3a/d61/c107 0
2026-03-10T07:51:46.682 INFO:tasks.workunit.client.1.vm08.stdout:5/881: dwrite d0/d4/d19/d60/d6d/d70/d40/f5f [0,4194304] 0
2026-03-10T07:51:46.684 INFO:tasks.workunit.client.1.vm08.stdout:8/919: creat d0/df/d17/d25/f120 x:0 0 0
2026-03-10T07:51:46.684 INFO:tasks.workunit.client.1.vm08.stdout:0/846: link dd/d18/d100/d102/dc1/fe3 dd/d10/d2f/d37/daf/f10f 0
2026-03-10T07:51:46.684 INFO:tasks.workunit.client.1.vm08.stdout:1/769: creat d2/d6/d3a/f108 x:0 0 0
2026-03-10T07:51:46.685 INFO:tasks.workunit.client.1.vm08.stdout:5/882: mknod d0/d4/d19/d81/d92/c120 0
2026-03-10T07:51:46.685 INFO:tasks.workunit.client.1.vm08.stdout:1/770: readlink d2/d6/d3a/l6c 0
2026-03-10T07:51:46.686 INFO:tasks.workunit.client.1.vm08.stdout:0/847: mkdir dd/d18/d100/d102/d110 0
2026-03-10T07:51:46.691 INFO:tasks.workunit.client.1.vm08.stdout:8/920: link d0/df/d15/d23/lc9 d0/df/d15/d53/l121 0
2026-03-10T07:51:46.693 INFO:tasks.workunit.client.1.vm08.stdout:1/771: rmdir d2/d6/de/d70/d80 39
2026-03-10T07:51:46.693 INFO:tasks.workunit.client.1.vm08.stdout:0/848: truncate dd/d10/d2f/d37/d64/f104 964682 0
2026-03-10T07:51:46.695 INFO:tasks.workunit.client.1.vm08.stdout:8/921: mknod d0/df/d15/d23/d54/dba/d89/dc5/c122 0
2026-03-10T07:51:46.697 INFO:tasks.workunit.client.1.vm08.stdout:0/849: stat dd/d18/d100/d102/dc1/ded/df7/fff 0
2026-03-10T07:51:46.702 INFO:tasks.workunit.client.1.vm08.stdout:7/791: dread d3/da/d25/d9/d2f/d3a/dc0/ff9 [0,4194304] 0
2026-03-10T07:51:46.704 INFO:tasks.workunit.client.1.vm08.stdout:7/792: dread d3/da/d25/f29 [4194304,4194304] 0
2026-03-10T07:51:46.705 INFO:tasks.workunit.client.1.vm08.stdout:2/807: dread d0/d1/d17/dfb/f105 [0,4194304] 0
2026-03-10T07:51:46.706 INFO:tasks.workunit.client.1.vm08.stdout:7/793: mknod d3/da/d25/d9/d2f/d3a/d40/d54/db5/c10b 0
2026-03-10T07:51:46.711 INFO:tasks.workunit.client.1.vm08.stdout:2/808: chown d0/d1/d3/d56/d78/dad/db1/d61/d84/da7/df0 1967 1
2026-03-10T07:51:46.711 INFO:tasks.workunit.client.1.vm08.stdout:2/809: dread - d0/d1/d3/d39/d7d/f98 zero size
2026-03-10T07:51:46.711 INFO:tasks.workunit.client.1.vm08.stdout:2/810: readlink d0/d1/d3/d39/l4c 0
2026-03-10T07:51:46.711 INFO:tasks.workunit.client.1.vm08.stdout:7/794: mknod d3/da/d25/d9/d2f/d39/d43/d4f/d5b/db7/d107/c10c 0
2026-03-10T07:51:46.711 INFO:tasks.workunit.client.1.vm08.stdout:2/811: stat d0/d1/d3/d56/d57/f5b 0
2026-03-10T07:51:46.711 INFO:tasks.workunit.client.1.vm08.stdout:4/735: sync
2026-03-10T07:51:46.711 INFO:tasks.workunit.client.1.vm08.stdout:4/736: dread - d5/d1f/dad/db8/fd2 zero size
2026-03-10T07:51:46.711 INFO:tasks.workunit.client.1.vm08.stdout:1/772: sync
2026-03-10T07:51:46.713 INFO:tasks.workunit.client.1.vm08.stdout:7/795: creat d3/da/d25/d9/d2f/d39/d43/d4f/d5b/d79/ddd/f10d x:0 0 0
2026-03-10T07:51:46.714 INFO:tasks.workunit.client.1.vm08.stdout:4/737: mknod d5/da0/d95/de6/d48/da2/cf5 0
2026-03-10T07:51:46.717 INFO:tasks.workunit.client.1.vm08.stdout:4/738: symlink d5/da0/db7/lf6 0
2026-03-10T07:51:46.722 INFO:tasks.workunit.client.1.vm08.stdout:7/796: dread d3/da/d25/d9/d2f/d4d/fb9 [0,4194304] 0
2026-03-10T07:51:46.724 INFO:tasks.workunit.client.1.vm08.stdout:4/739: link d5/d1f/d31/f33 d5/da0/de2/ff7 0
2026-03-10T07:51:46.728 INFO:tasks.workunit.client.1.vm08.stdout:4/740: symlink d5/da0/d95/de6/lf8 0
2026-03-10T07:51:46.729 INFO:tasks.workunit.client.1.vm08.stdout:4/741: write d5/da0/d95/de6/d48/d4f/d8d/fcd [1950866,45063] 0
2026-03-10T07:51:46.729 INFO:tasks.workunit.client.1.vm08.stdout:4/742: dread - d5/da0/d95/de6/d48/fe9 zero size
2026-03-10T07:51:46.731 INFO:tasks.workunit.client.1.vm08.stdout:4/743: chown d5/d1f/d31/lac 114 1
2026-03-10T07:51:46.731 INFO:tasks.workunit.client.1.vm08.stdout:4/744: write d5/da0/d95/de6/d48/d4f/d8d/fcd [2650821,115280] 0
2026-03-10T07:51:46.744 INFO:tasks.workunit.client.1.vm08.stdout:1/773: dread d2/d6/de/f1c [0,4194304] 0
2026-03-10T07:51:46.745 INFO:tasks.workunit.client.1.vm08.stdout:6/830: write d1/d3/df/d44/f5a [2287611,68179] 0
2026-03-10T07:51:46.745 INFO:tasks.workunit.client.1.vm08.stdout:9/817: write d2/d58/dbf/d2b/f83 [462892,46128] 0
2026-03-10T07:51:46.748 INFO:tasks.workunit.client.1.vm08.stdout:3/813: dwrite d0/d3c/d18/dec/d2d/fe8 [0,4194304] 0
2026-03-10T07:51:46.752 INFO:tasks.workunit.client.1.vm08.stdout:4/745: read d5/f2d [224330,124234] 0
2026-03-10T07:51:46.752 INFO:tasks.workunit.client.1.vm08.stdout:0/850: rename dd/d10/d2f/d37/daf to dd/d10/dbd/d10d/d111 0
2026-03-10T07:51:46.754 INFO:tasks.workunit.client.1.vm08.stdout:0/851: truncate dd/d10/d14/d15/d20/d7a/fde 624881 0
2026-03-10T07:51:46.757 INFO:tasks.workunit.client.1.vm08.stdout:5/883: dwrite d0/d4/d19/d60/fb6 [0,4194304] 0
2026-03-10T07:51:46.757 INFO:tasks.workunit.client.1.vm08.stdout:0/852: write dd/d10/d14/d1b/fb7 [3807367,583] 0
2026-03-10T07:51:46.765 INFO:tasks.workunit.client.1.vm08.stdout:5/884: truncate d0/d4/d19/d3a/d69/f103 1204890 0
2026-03-10T07:51:46.765 INFO:tasks.workunit.client.1.vm08.stdout:1/774: creat d2/d6/de/d1f/d26/d58/d83/dc2/f109 x:0 0 0
2026-03-10T07:51:46.769 INFO:tasks.workunit.client.1.vm08.stdout:8/922: dwrite d0/df/d15/d23/d54/fad [0,4194304] 0
2026-03-10T07:51:46.773 INFO:tasks.workunit.client.1.vm08.stdout:0/853: mknod dd/d10/d2f/d37/d64/c112 0
2026-03-10T07:51:46.779 INFO:tasks.workunit.client.1.vm08.stdout:0/854: write f8 [2345823,80824] 0
2026-03-10T07:51:46.780 INFO:tasks.workunit.client.1.vm08.stdout:3/814: dread d0/d3c/d1f/f6f [0,4194304] 0
2026-03-10T07:51:46.781 INFO:tasks.workunit.client.1.vm08.stdout:7/797: dread d3/da/d8a/f9e [0,4194304] 0
2026-03-10T07:51:46.781 INFO:tasks.workunit.client.1.vm08.stdout:3/815: fsync d0/d3c/d18/f22 0
2026-03-10T07:51:46.786 INFO:tasks.workunit.client.1.vm08.stdout:1/775: rmdir d2/d6 39
2026-03-10T07:51:46.787 INFO:tasks.workunit.client.1.vm08.stdout:4/746: symlink d5/da0/lf9 0
2026-03-10T07:51:46.787 INFO:tasks.workunit.client.1.vm08.stdout:7/798: dread - d3/da/d25/d9/d2f/d3a/d40/d54/db5/f100 zero size
2026-03-10T07:51:46.787 INFO:tasks.workunit.client.1.vm08.stdout:4/747: stat d5/da0/d95/de6/d48/d4f/d7c/fb2 0
2026-03-10T07:51:46.788 INFO:tasks.workunit.client.1.vm08.stdout:4/748: readlink d5/d1f/lc0 0
2026-03-10T07:51:46.794 INFO:tasks.workunit.client.1.vm08.stdout:8/923: fdatasync d0/df/d15/d23/da8/f6a 0
2026-03-10T07:51:46.796 INFO:tasks.workunit.client.1.vm08.stdout:8/924: write d0/df/d15/d23/d54/dba/d89/dc5/fd7 [547704,74200] 0
2026-03-10T07:51:46.796 INFO:tasks.workunit.client.1.vm08.stdout:6/831: dread d1/f49 [4194304,4194304] 0
2026-03-10T07:51:46.798 INFO:tasks.workunit.client.1.vm08.stdout:2/812: rename d0/d1/d3/d10/d65/fc5 to d0/d1/d3/d39/d7d/d86/f109 0
2026-03-10T07:51:46.813 INFO:tasks.workunit.client.1.vm08.stdout:6/832: mkdir d1/d3/df/d1d/d6f/d10e/d115 0
2026-03-10T07:51:46.814 INFO:tasks.workunit.client.1.vm08.stdout:2/813: sync
2026-03-10T07:51:46.815 INFO:tasks.workunit.client.1.vm08.stdout:2/814: stat d0/d1/d3/d56/d78/dad/db1/d61/d84/cdd 0
2026-03-10T07:51:46.818 INFO:tasks.workunit.client.1.vm08.stdout:4/749: rename d5/da0/d95/de6/d48/d4f/d8d/d91/dd5/ff1 to d5/da0/d95/de6/d48/d4f/d7c/dde/ddf/ffa 0
2026-03-10T07:51:46.818 INFO:tasks.workunit.client.1.vm08.stdout:4/750: readlink d5/d8/l85 0
2026-03-10T07:51:46.826 INFO:tasks.workunit.client.1.vm08.stdout:8/925: dread d0/df/d15/d23/f75 [0,4194304] 0
2026-03-10T07:51:46.828 INFO:tasks.workunit.client.1.vm08.stdout:7/799: link d3/da/d25/d9/d2f/d39/d43/d4f/d5b/db7/dac/fed d3/da/d25/d9/d2f/d3a/d71/dca/f10e 0
2026-03-10T07:51:46.829 INFO:tasks.workunit.client.1.vm08.stdout:7/800: fdatasync d3/da/d8a/dd1/fdb 0
2026-03-10T07:51:46.829 INFO:tasks.workunit.client.1.vm08.stdout:5/885: write d0/d4/df/d12/f13 [2822526,114809] 0
2026-03-10T07:51:46.830 INFO:tasks.workunit.client.1.vm08.stdout:5/886: readlink d0/d4/df/dbf/l2b 0
2026-03-10T07:51:46.833 INFO:tasks.workunit.client.1.vm08.stdout:9/818: dwrite d2/f6 [0,4194304] 0
2026-03-10T07:51:46.835 INFO:tasks.workunit.client.1.vm08.stdout:6/833: mknod d1/db/d24/dac/c116 0
2026-03-10T07:51:46.839 INFO:tasks.workunit.client.1.vm08.stdout:0/855: dwrite dd/d10/d2f/f8e [0,4194304] 0
2026-03-10T07:51:46.840 INFO:tasks.workunit.client.1.vm08.stdout:2/815: rmdir d0/d1/d3/d39/d7d/d86/d55/d1b 39
2026-03-10T07:51:46.840 INFO:tasks.workunit.client.1.vm08.stdout:0/856: dread dd/d10/d14/d15/d20/d7a/fde [0,4194304] 0
2026-03-10T07:51:46.841 INFO:tasks.workunit.client.1.vm08.stdout:2/816: write d0/d1/d17/d6b/f72 [2844056,4096] 0
2026-03-10T07:51:46.841 INFO:tasks.workunit.client.1.vm08.stdout:3/816: creat d0/d3c/d18/f104 x:0 0 0
2026-03-10T07:51:46.843 INFO:tasks.workunit.client.1.vm08.stdout:2/817: truncate d0/d1/d3/d56/d78/dad/db1/d61/d8e/fe8 808505 0
2026-03-10T07:51:46.843 INFO:tasks.workunit.client.1.vm08.stdout:4/751: symlink d5/da0/d95/lfb 0
2026-03-10T07:51:46.848 INFO:tasks.workunit.client.1.vm08.stdout:4/752: write d5/d8/f68 [180360,23018] 0
2026-03-10T07:51:46.848 INFO:tasks.workunit.client.1.vm08.stdout:0/857: stat dd/d10/d2f/d37/d64/d95/d58/fd9 0
2026-03-10T07:51:46.849 INFO:tasks.workunit.client.1.vm08.stdout:3/817: chown d0/d3c/d18/d32/d61 4338 1
2026-03-10T07:51:46.862 INFO:tasks.workunit.client.1.vm08.stdout:7/801: unlink d3/da/d25/d9/d2f/d39/d43/d4f/c9b 0
2026-03-10T07:51:46.868 INFO:tasks.workunit.client.1.vm08.stdout:6/834: truncate d1/f49 8270183 0
2026-03-10T07:51:46.871 INFO:tasks.workunit.client.1.vm08.stdout:6/835: chown d1/d3/df/fd7 1115633930 1
2026-03-10T07:51:46.872 INFO:tasks.workunit.client.1.vm08.stdout:2/818: read d0/d1/d3/d39/f3b [1872959,76399] 0
2026-03-10T07:51:46.874 INFO:tasks.workunit.client.1.vm08.stdout:1/776: link d2/d6/de/d1f/d40/c92 d2/d6/de/d1f/d26/d58/c10a 0
2026-03-10T07:51:46.874 INFO:tasks.workunit.client.1.vm08.stdout:2/819: truncate d0/d1/d3/d56/d78/dad/db1/d61/dee/ffa 843483 0
2026-03-10T07:51:46.875 INFO:tasks.workunit.client.1.vm08.stdout:2/820: chown d0/d1/d3/d39/d7d/d86/ccc 40768616 1
2026-03-10T07:51:46.876 INFO:tasks.workunit.client.1.vm08.stdout:2/821: chown d0/d1/d3/d39/d7d/d7e/fa5 11594 1
2026-03-10T07:51:46.879 INFO:tasks.workunit.client.1.vm08.stdout:8/926: truncate d0/df/d2e/d49/fe0 554981 0
2026-03-10T07:51:46.885 INFO:tasks.workunit.client.1.vm08.stdout:0/858: dread dd/f16 [0,4194304] 0
2026-03-10T07:51:46.887 INFO:tasks.workunit.client.1.vm08.stdout:5/887: symlink d0/d4/d19/d81/l121 0
2026-03-10T07:51:46.888 INFO:tasks.workunit.client.1.vm08.stdout:5/888: readlink d0/d4/d19/d60/d6d/d70/d40/dba/lc9 0
2026-03-10T07:51:46.889 INFO:tasks.workunit.client.1.vm08.stdout:9/819: mknod d2/d58/dbf/dd0/d35/d97/d9d/df4/dee/d84/dca/dd7/d10d/d112/c115 0
2026-03-10T07:51:46.891 INFO:tasks.workunit.client.1.vm08.stdout:7/802: dread d3/da/d25/d9/d2f/d39/d43/d4f/d5b/db7/f65 [0,4194304] 0
2026-03-10T07:51:46.893 INFO:tasks.workunit.client.1.vm08.stdout:0/859: dwrite dd/d18/fdf [0,4194304] 0
2026-03-10T07:51:46.897 INFO:tasks.workunit.client.1.vm08.stdout:3/818: mkdir
d0/d3c/d18/da9/dcc/d105 0 2026-03-10T07:51:46.901 INFO:tasks.workunit.client.1.vm08.stdout:5/889: fsync d0/d8/f1b 0 2026-03-10T07:51:46.904 INFO:tasks.workunit.client.1.vm08.stdout:1/777: symlink d2/d6/dfe/l10b 0 2026-03-10T07:51:46.906 INFO:tasks.workunit.client.1.vm08.stdout:7/803: dread d3/da/d25/d9/d2f/d39/db2/fd0 [0,4194304] 0 2026-03-10T07:51:46.908 INFO:tasks.workunit.client.1.vm08.stdout:4/753: rename d5/d1f/c7a to d5/da0/d95/de6/d48/d4f/d8d/cfc 0 2026-03-10T07:51:46.910 INFO:tasks.workunit.client.1.vm08.stdout:2/822: symlink d0/d1/d3/d39/d7d/d86/l10a 0 2026-03-10T07:51:46.912 INFO:tasks.workunit.client.1.vm08.stdout:0/860: stat dd/c1a 0 2026-03-10T07:51:46.925 INFO:tasks.workunit.client.1.vm08.stdout:6/836: rename d1/d3/df/d44/f82 to d1/d17/d2b/d5e/da8/f117 0 2026-03-10T07:51:46.925 INFO:tasks.workunit.client.1.vm08.stdout:6/837: readlink d1/db/d24/lae 0 2026-03-10T07:51:46.930 INFO:tasks.workunit.client.1.vm08.stdout:6/838: dwrite d1/d17/d2b/f68 [0,4194304] 0 2026-03-10T07:51:46.933 INFO:tasks.workunit.client.1.vm08.stdout:0/861: rmdir dd/d10/d14/d15/d20/d7a/dd2 39 2026-03-10T07:51:46.944 INFO:tasks.workunit.client.1.vm08.stdout:3/819: dread d0/d3c/d18/dec/d34/f53 [0,4194304] 0 2026-03-10T07:51:46.950 INFO:tasks.workunit.client.1.vm08.stdout:6/839: dread d1/db/d24/f75 [0,4194304] 0 2026-03-10T07:51:46.959 INFO:tasks.workunit.client.1.vm08.stdout:5/890: creat d0/d4/d19/d3a/df1/f122 x:0 0 0 2026-03-10T07:51:46.959 INFO:tasks.workunit.client.1.vm08.stdout:5/891: chown d0/d4/d19/d3a/d106 1569 1 2026-03-10T07:51:46.959 INFO:tasks.workunit.client.1.vm08.stdout:1/778: mkdir d2/d10c 0 2026-03-10T07:51:46.959 INFO:tasks.workunit.client.1.vm08.stdout:7/804: mkdir d3/da/d10f 0 2026-03-10T07:51:46.963 INFO:tasks.workunit.client.1.vm08.stdout:7/805: truncate d3/da/d25/d9/d2f/d39/d43/d4f/d5b/db7/f81 888165 0 2026-03-10T07:51:46.967 INFO:tasks.workunit.client.1.vm08.stdout:1/779: dread d2/d6/de/d1f/d26/d58/d83/f72 [0,4194304] 0 2026-03-10T07:51:46.982 
INFO:tasks.workunit.client.1.vm08.stdout:0/862: creat dd/d10/d2f/d37/d64/d52/f113 x:0 0 0 2026-03-10T07:51:46.985 INFO:tasks.workunit.client.1.vm08.stdout:8/927: getdents d0/df/d15/d9c 0 2026-03-10T07:51:46.990 INFO:tasks.workunit.client.1.vm08.stdout:9/820: link d2/d58/dbf/ddf/ce5 d2/d58/dbf/dd0/d35/d97/c116 0 2026-03-10T07:51:46.991 INFO:tasks.workunit.client.1.vm08.stdout:9/821: write d2/d58/d73/fe4 [415364,30989] 0 2026-03-10T07:51:46.993 INFO:tasks.workunit.client.1.vm08.stdout:6/840: mkdir d1/d3/df/d1d/d40/d87/d118 0 2026-03-10T07:51:46.994 INFO:tasks.workunit.client.1.vm08.stdout:6/841: chown d1/d3/df/d1d/d40/d45/d10c 14 1 2026-03-10T07:51:46.998 INFO:tasks.workunit.client.1.vm08.stdout:7/806: rmdir d3/da/d25/d9/d2f/d39/d43/d4f/d5b/db7/dac 39 2026-03-10T07:51:46.999 INFO:tasks.workunit.client.1.vm08.stdout:7/807: write d3/da/d25/d9/d2f/d4d/db6/fc6 [371609,55157] 0 2026-03-10T07:51:47.003 INFO:tasks.workunit.client.1.vm08.stdout:4/754: link d5/da0/d95/f96 d5/da0/ffd 0 2026-03-10T07:51:47.005 INFO:tasks.workunit.client.1.vm08.stdout:4/755: chown d5/da0/d12/c74 104640506 1 2026-03-10T07:51:47.025 INFO:tasks.workunit.client.1.vm08.stdout:0/863: creat dd/d18/d100/d102/dc1/de5/f114 x:0 0 0 2026-03-10T07:51:47.045 INFO:tasks.workunit.client.1.vm08.stdout:3/820: mknod d0/d3c/d18/d48/c106 0 2026-03-10T07:51:47.050 INFO:tasks.workunit.client.1.vm08.stdout:6/842: rename d1/d17/fa6 to d1/d3/df/d1d/d40/d45/d10c/f119 0 2026-03-10T07:51:47.057 INFO:tasks.workunit.client.1.vm08.stdout:4/756: mknod d5/d1f/d9b/cfe 0 2026-03-10T07:51:47.057 INFO:tasks.workunit.client.1.vm08.stdout:2/823: getdents d0/d1/d17/d6b/da0 0 2026-03-10T07:51:47.059 INFO:tasks.workunit.client.1.vm08.stdout:8/928: symlink d0/df/d15/d23/d54/dba/l123 0 2026-03-10T07:51:47.060 INFO:tasks.workunit.client.1.vm08.stdout:8/929: write d0/df/d15/d23/d54/dba/d89/fac [405349,11099] 0 2026-03-10T07:51:47.064 INFO:tasks.workunit.client.1.vm08.stdout:7/808: dread d3/da/d25/d9/d2f/d3a/d40/f55 [0,4194304] 0 
2026-03-10T07:51:47.065 INFO:tasks.workunit.client.1.vm08.stdout:5/892: creat d0/d4/d19/d81/f123 x:0 0 0 2026-03-10T07:51:47.067 INFO:tasks.workunit.client.1.vm08.stdout:6/843: creat d1/db/d24/d3d/f11a x:0 0 0 2026-03-10T07:51:47.068 INFO:tasks.workunit.client.1.vm08.stdout:1/780: link d2/d6/d3a/d61/c6b d2/d6/de/d1f/d26/d58/d83/dc2/c10d 0 2026-03-10T07:51:47.068 INFO:tasks.workunit.client.1.vm08.stdout:6/844: chown d1/d46/f100 54684039 1 2026-03-10T07:51:47.070 INFO:tasks.workunit.client.1.vm08.stdout:2/824: sync 2026-03-10T07:51:47.070 INFO:tasks.workunit.client.1.vm08.stdout:1/781: sync 2026-03-10T07:51:47.071 INFO:tasks.workunit.client.1.vm08.stdout:2/825: sync 2026-03-10T07:51:47.074 INFO:tasks.workunit.client.1.vm08.stdout:8/930: symlink d0/df/d15/d9c/l124 0 2026-03-10T07:51:47.075 INFO:tasks.workunit.client.1.vm08.stdout:8/931: chown d0/df/d15/d23/l2c 3 1 2026-03-10T07:51:47.076 INFO:tasks.workunit.client.1.vm08.stdout:4/757: write d5/da0/d95/dc2/fc5 [723264,94795] 0 2026-03-10T07:51:47.080 INFO:tasks.workunit.client.1.vm08.stdout:7/809: creat d3/da/d25/d9/d2f/d3a/d71/d8c/ddf/f110 x:0 0 0 2026-03-10T07:51:47.080 INFO:tasks.workunit.client.1.vm08.stdout:5/893: chown d0/d8/dce/l107 0 1 2026-03-10T07:51:47.085 INFO:tasks.workunit.client.1.vm08.stdout:3/821: creat d0/d3c/d18/d32/d61/f107 x:0 0 0 2026-03-10T07:51:47.087 INFO:tasks.workunit.client.1.vm08.stdout:1/782: symlink d2/d6/de/d47/dbd/dc3/l10e 0 2026-03-10T07:51:47.088 INFO:tasks.workunit.client.1.vm08.stdout:1/783: dread - d2/d6/de/d1f/d26/d58/d83/dc2/f109 zero size 2026-03-10T07:51:47.089 INFO:tasks.workunit.client.1.vm08.stdout:2/826: creat d0/d1/d3/d10/d38/daf/f10b x:0 0 0 2026-03-10T07:51:47.091 INFO:tasks.workunit.client.1.vm08.stdout:9/822: getdents d2/d58/dbf/dd0/d35/d97 0 2026-03-10T07:51:47.096 INFO:tasks.workunit.client.1.vm08.stdout:4/758: dread - d5/f54 zero size 2026-03-10T07:51:47.098 INFO:tasks.workunit.client.1.vm08.stdout:7/810: creat d3/da/d25/d9/d2f/d3a/d4b/f111 x:0 0 0 
2026-03-10T07:51:47.100 INFO:tasks.workunit.client.1.vm08.stdout:8/932: write d0/df/d15/f70 [1401378,5574] 0 2026-03-10T07:51:47.102 INFO:tasks.workunit.client.1.vm08.stdout:0/864: link dd/d18/d100/d102/dc1/ded/l8c dd/l115 0 2026-03-10T07:51:47.104 INFO:tasks.workunit.client.1.vm08.stdout:1/784: creat d2/d6/de/d1f/d8f/f10f x:0 0 0 2026-03-10T07:51:47.105 INFO:tasks.workunit.client.1.vm08.stdout:1/785: read d2/d6/f86 [2004269,105861] 0 2026-03-10T07:51:47.106 INFO:tasks.workunit.client.1.vm08.stdout:1/786: write d2/d6/de/d1f/d26/d58/d83/fa2 [2363518,66028] 0 2026-03-10T07:51:47.112 INFO:tasks.workunit.client.1.vm08.stdout:4/759: mknod d5/d8/d50/db0/cff 0 2026-03-10T07:51:47.114 INFO:tasks.workunit.client.1.vm08.stdout:8/933: symlink d0/d69/l125 0 2026-03-10T07:51:47.125 INFO:tasks.workunit.client.1.vm08.stdout:7/811: mknod d3/da/d25/d9/d2f/d39/d43/c112 0 2026-03-10T07:51:47.127 INFO:tasks.workunit.client.1.vm08.stdout:5/894: rename d0/d8/d24/f8f to d0/d4/f124 0 2026-03-10T07:51:47.127 INFO:tasks.workunit.client.1.vm08.stdout:6/845: getdents d1/d17/d2b/d58/d76/d114/d79/d7c 0 2026-03-10T07:51:47.128 INFO:tasks.workunit.client.1.vm08.stdout:5/895: read d0/d4/d19/d3a/f3f [242139,95529] 0 2026-03-10T07:51:47.129 INFO:tasks.workunit.client.1.vm08.stdout:1/787: symlink d2/d6/de/d1f/d22/deb/l110 0 2026-03-10T07:51:47.129 INFO:tasks.workunit.client.1.vm08.stdout:5/896: chown d0/d4/d19/d3a/d69/f71 1296524 1 2026-03-10T07:51:47.130 INFO:tasks.workunit.client.1.vm08.stdout:4/760: link d5/d8/ff0 d5/da0/d95/de6/d48/d4f/d8d/d91/dd5/f100 0 2026-03-10T07:51:47.132 INFO:tasks.workunit.client.1.vm08.stdout:8/934: mkdir d0/df/d15/d23/d39/d5b/dea/d126 0 2026-03-10T07:51:47.132 INFO:tasks.workunit.client.1.vm08.stdout:8/935: readlink d0/df/l16 0 2026-03-10T07:51:47.133 INFO:tasks.workunit.client.1.vm08.stdout:4/761: sync 2026-03-10T07:51:47.133 INFO:tasks.workunit.client.1.vm08.stdout:6/846: creat d1/d3/d3e/f11b x:0 0 0 2026-03-10T07:51:47.134 
INFO:tasks.workunit.client.1.vm08.stdout:5/897: dwrite d0/d33/ddd/f104 [0,4194304] 0 2026-03-10T07:51:47.134 INFO:tasks.workunit.client.1.vm08.stdout:3/822: getdents d0/d3c/d18/d32/daa/ded 0 2026-03-10T07:51:47.135 INFO:tasks.workunit.client.1.vm08.stdout:5/898: sync 2026-03-10T07:51:47.135 INFO:tasks.workunit.client.1.vm08.stdout:5/899: chown d0/d77/le9 0 1 2026-03-10T07:51:47.136 INFO:tasks.workunit.client.1.vm08.stdout:5/900: stat d0/d4/df/f2a 0 2026-03-10T07:51:47.137 INFO:tasks.workunit.client.1.vm08.stdout:2/827: getdents d0/d1/d3/d56/d57 0 2026-03-10T07:51:47.148 INFO:tasks.workunit.client.1.vm08.stdout:0/865: symlink dd/d10/l116 0 2026-03-10T07:51:47.152 INFO:tasks.workunit.client.1.vm08.stdout:7/812: mkdir d3/da/d25/d9/d2f/d39/d43/d4f/d5b/db7/d113 0 2026-03-10T07:51:47.156 INFO:tasks.workunit.client.1.vm08.stdout:4/762: fdatasync d5/da0/d32/fbb 0 2026-03-10T07:51:47.156 INFO:tasks.workunit.client.1.vm08.stdout:3/823: fdatasync d0/d3c/d18/d48/d55/fe0 0 2026-03-10T07:51:47.156 INFO:tasks.workunit.client.1.vm08.stdout:5/901: symlink d0/d33/ddd/l125 0 2026-03-10T07:51:47.156 INFO:tasks.workunit.client.1.vm08.stdout:1/788: dwrite d2/d6/de/fc0 [0,4194304] 0 2026-03-10T07:51:47.158 INFO:tasks.workunit.client.1.vm08.stdout:1/789: chown d2/d6/c1a 8606 1 2026-03-10T07:51:47.159 INFO:tasks.workunit.client.1.vm08.stdout:2/828: chown d0/d1/d3/d39/d7d/d86/f34 57438 1 2026-03-10T07:51:47.160 INFO:tasks.workunit.client.1.vm08.stdout:9/823: rename d2/d58/dbf/f24 to d2/d58/dbf/dd0/d35/d97/d9d/df4/dee/d84/f117 0 2026-03-10T07:51:47.162 INFO:tasks.workunit.client.1.vm08.stdout:6/847: dread d1/d46/f74 [0,4194304] 0 2026-03-10T07:51:47.173 INFO:tasks.workunit.client.1.vm08.stdout:5/902: symlink d0/d33/ddd/l126 0 2026-03-10T07:51:47.174 INFO:tasks.workunit.client.1.vm08.stdout:7/813: write d3/da/d25/d9/d2f/d39/d43/d4f/d5b/db7/dac/fd3 [731973,77176] 0 2026-03-10T07:51:47.175 INFO:tasks.workunit.client.1.vm08.stdout:0/866: rename dd/d10/l116 to dd/d10/d2f/d37/d64/d95/d58/d3d/l117 
0 2026-03-10T07:51:47.187 INFO:tasks.workunit.client.1.vm08.stdout:6/848: read - d1/d3/df/d1d/d40/d45/fdf zero size 2026-03-10T07:51:47.202 INFO:tasks.workunit.client.1.vm08.stdout:4/763: mknod d5/d1f/d31/d61/c101 0 2026-03-10T07:51:47.207 INFO:tasks.workunit.client.1.vm08.stdout:8/936: dwrite d0/df/d17/fde [0,4194304] 0 2026-03-10T07:51:47.208 INFO:tasks.workunit.client.1.vm08.stdout:3/824: fsync d0/d3c/d18/d80/fe3 0 2026-03-10T07:51:47.210 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:51:46 vm05.local ceph-mon[50387]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' 2026-03-10T07:51:47.210 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:51:46 vm05.local ceph-mon[50387]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' 2026-03-10T07:51:47.210 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:51:46 vm05.local ceph-mon[50387]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' 2026-03-10T07:51:47.210 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:51:46 vm05.local ceph-mon[50387]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' 2026-03-10T07:51:47.217 INFO:tasks.workunit.client.1.vm08.stdout:1/790: truncate d2/d6/d50/f7f 885673 0 2026-03-10T07:51:47.217 INFO:tasks.workunit.client.1.vm08.stdout:6/849: rmdir d1/d17/d2b/d5e 39 2026-03-10T07:51:47.217 INFO:tasks.workunit.client.1.vm08.stdout:9/824: creat d2/d58/dbf/dd0/d35/d97/d9d/df4/dee/d84/d91/db8/f118 x:0 0 0 2026-03-10T07:51:47.220 INFO:tasks.workunit.client.1.vm08.stdout:8/937: dwrite d0/df/d15/d23/d54/dba/d89/dbf/fe5 [0,4194304] 0 2026-03-10T07:51:47.227 INFO:tasks.workunit.client.1.vm08.stdout:8/938: write d0/df/d15/d23/d39/d5b/fff [81087,116436] 0 2026-03-10T07:51:47.229 INFO:tasks.workunit.client.1.vm08.stdout:7/814: mknod d3/da/d10f/c114 0 2026-03-10T07:51:47.236 INFO:tasks.workunit.client.1.vm08.stdout:3/825: mkdir d0/d3c/d18/d80/dc1/d108 0 2026-03-10T07:51:47.239 
INFO:tasks.workunit.client.1.vm08.stdout:5/903: creat d0/d4/df/dbf/f127 x:0 0 0 2026-03-10T07:51:47.240 INFO:tasks.workunit.client.1.vm08.stdout:6/850: mknod d1/d3/df/d1d/d40/d45/d5c/c11c 0 2026-03-10T07:51:47.243 INFO:tasks.workunit.client.1.vm08.stdout:4/764: truncate d5/da0/d95/de6/d48/d4f/d8d/d91/fa6 1370208 0 2026-03-10T07:51:47.243 INFO:tasks.workunit.client.1.vm08.stdout:2/829: rename d0/d1/d17/cd to d0/c10c 0 2026-03-10T07:51:47.244 INFO:tasks.workunit.client.1.vm08.stdout:5/904: mkdir d0/d8/d24/de2/d128 0 2026-03-10T07:51:47.244 INFO:tasks.workunit.client.1.vm08.stdout:3/826: write d0/d3c/d18/d80/fe3 [4693053,49257] 0 2026-03-10T07:51:47.244 INFO:tasks.workunit.client.1.vm08.stdout:9/825: symlink d2/d58/dbf/dd0/d35/d97/d110/l119 0 2026-03-10T07:51:47.246 INFO:tasks.workunit.client.1.vm08.stdout:5/905: write d0/d4/df/d12/f97 [2858411,24417] 0 2026-03-10T07:51:47.247 INFO:tasks.workunit.client.1.vm08.stdout:9/826: mkdir d2/d58/d73/d11a 0 2026-03-10T07:51:47.248 INFO:tasks.workunit.client.1.vm08.stdout:8/939: rmdir d0/df/d15/d23/d39/d5b/dea/d126 0 2026-03-10T07:51:47.248 INFO:tasks.workunit.client.1.vm08.stdout:6/851: link d1/db/d24/d3d/f11a d1/d3/d3e/db2/f11d 0 2026-03-10T07:51:47.248 INFO:tasks.workunit.client.1.vm08.stdout:4/765: creat d5/da0/d12/def/f102 x:0 0 0 2026-03-10T07:51:47.251 INFO:tasks.workunit.client.1.vm08.stdout:4/766: write d5/d1f/d41/f7f [2686515,46645] 0 2026-03-10T07:51:47.251 INFO:tasks.workunit.client.1.vm08.stdout:2/830: truncate d0/f44 1024463 0 2026-03-10T07:51:47.252 INFO:tasks.workunit.client.1.vm08.stdout:5/906: sync 2026-03-10T07:51:47.258 INFO:tasks.workunit.client.1.vm08.stdout:8/940: creat d0/df/d15/d23/d39/d5b/dbc/f127 x:0 0 0 2026-03-10T07:51:47.261 INFO:tasks.workunit.client.1.vm08.stdout:4/767: chown d5/c66 426434 1 2026-03-10T07:51:47.266 INFO:tasks.workunit.client.1.vm08.stdout:3/827: dread d0/d3c/d1f/d89/fa4 [0,4194304] 0 2026-03-10T07:51:47.269 INFO:tasks.workunit.client.1.vm08.stdout:2/831: dread d0/d1/d17/d6b/f72 
[0,4194304] 0 2026-03-10T07:51:47.270 INFO:tasks.workunit.client.1.vm08.stdout:1/791: dread d2/d6/de/d1f/d22/f81 [0,4194304] 0 2026-03-10T07:51:47.270 INFO:tasks.workunit.client.1.vm08.stdout:5/907: unlink d0/d4/d19/d60/d6d/d70/f67 0 2026-03-10T07:51:47.280 INFO:tasks.workunit.client.1.vm08.stdout:9/827: creat d2/d58/dbf/dd0/d35/d97/dfb/f11b x:0 0 0 2026-03-10T07:51:47.283 INFO:tasks.workunit.client.1.vm08.stdout:0/867: dwrite dd/d10/d14/d15/d20/d7a/dd2/fd4 [0,4194304] 0 2026-03-10T07:51:47.290 INFO:tasks.workunit.client.1.vm08.stdout:6/852: mkdir d1/d3/df/d11e 0 2026-03-10T07:51:47.290 INFO:tasks.workunit.client.1.vm08.stdout:8/941: creat d0/df/d15/d53/f128 x:0 0 0 2026-03-10T07:51:47.291 INFO:tasks.workunit.client.1.vm08.stdout:8/942: write d0/df/d15/d23/d39/d5b/fff [736914,49441] 0 2026-03-10T07:51:47.292 INFO:tasks.workunit.client.1.vm08.stdout:4/768: unlink d5/da0/c6a 0 2026-03-10T07:51:47.294 INFO:tasks.workunit.client.1.vm08.stdout:4/769: stat d5/da0/d95/de6/d48/da2/cbf 0 2026-03-10T07:51:47.295 INFO:tasks.workunit.client.1.vm08.stdout:7/815: dwrite d3/da/d25/d9/d2f/d39/f58 [0,4194304] 0 2026-03-10T07:51:47.329 INFO:tasks.workunit.client.1.vm08.stdout:1/792: mknod d2/d6/de/d71/dc1/c111 0 2026-03-10T07:51:47.329 INFO:tasks.workunit.client.1.vm08.stdout:2/832: creat d0/d1/d17/db2/dc3/f10d x:0 0 0 2026-03-10T07:51:47.330 INFO:tasks.workunit.client.1.vm08.stdout:6/853: getdents d1/d3/d3e/dff/d105 0 2026-03-10T07:51:47.332 INFO:tasks.workunit.client.1.vm08.stdout:8/943: mknod d0/df/d2e/c129 0 2026-03-10T07:51:47.332 INFO:tasks.workunit.client.1.vm08.stdout:2/833: dread d0/d1/d3/d56/d78/dad/db1/d61/d8e/fe8 [0,4194304] 0 2026-03-10T07:51:47.337 INFO:tasks.workunit.client.1.vm08.stdout:7/816: mkdir d3/da/d25/d9/d2f/d3a/d71/d8c/ddf/dfe/d115 0 2026-03-10T07:51:47.338 INFO:tasks.workunit.client.1.vm08.stdout:3/828: creat d0/d3c/d18/dec/f109 x:0 0 0 2026-03-10T07:51:47.339 INFO:tasks.workunit.client.1.vm08.stdout:3/829: truncate d0/d3c/d1f/d89/fea 5094103 0 
2026-03-10T07:51:47.340 INFO:tasks.workunit.client.1.vm08.stdout:3/830: chown d0/d3c/f87 5197866 1 2026-03-10T07:51:47.341 INFO:tasks.workunit.client.1.vm08.stdout:8/944: dwrite d0/df/f60 [0,4194304] 0 2026-03-10T07:51:47.349 INFO:tasks.workunit.client.1.vm08.stdout:6/854: creat d1/db/d24/f11f x:0 0 0 2026-03-10T07:51:47.364 INFO:tasks.workunit.client.1.vm08.stdout:1/793: creat d2/d10c/f112 x:0 0 0 2026-03-10T07:51:47.364 INFO:tasks.workunit.client.1.vm08.stdout:5/908: getdents d0/d4/d19/d81 0 2026-03-10T07:51:47.365 INFO:tasks.workunit.client.1.vm08.stdout:1/794: chown d2/d6/de/d1f/d22/ca4 10 1 2026-03-10T07:51:47.367 INFO:tasks.workunit.client.1.vm08.stdout:6/855: sync 2026-03-10T07:51:47.370 INFO:tasks.workunit.client.1.vm08.stdout:3/831: dread d0/d3c/d18/d48/d55/fc6 [0,4194304] 0 2026-03-10T07:51:47.375 INFO:tasks.workunit.client.1.vm08.stdout:3/832: dwrite d0/d3c/d18/dec/d34/f3d [8388608,4194304] 0 2026-03-10T07:51:47.390 INFO:tasks.workunit.client.1.vm08.stdout:0/868: write dd/f16 [2062528,54795] 0 2026-03-10T07:51:47.393 INFO:tasks.workunit.client.1.vm08.stdout:4/770: truncate d5/d1f/d41/f7f 180384 0 2026-03-10T07:51:47.401 INFO:tasks.workunit.client.1.vm08.stdout:2/834: write d0/d1/d3/d56/d78/f62 [2429821,114118] 0 2026-03-10T07:51:47.418 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:51:46 vm08.local ceph-mon[59917]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' 2026-03-10T07:51:47.418 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:51:46 vm08.local ceph-mon[59917]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' 2026-03-10T07:51:47.418 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:51:46 vm08.local ceph-mon[59917]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' 2026-03-10T07:51:47.418 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:51:46 vm08.local ceph-mon[59917]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' 2026-03-10T07:51:47.441 
INFO:tasks.workunit.client.1.vm08.stdout:7/817: symlink d3/da/d25/d9/d2f/d39/d43/d4f/d5b/db7/d113/l116 0 2026-03-10T07:51:47.442 INFO:tasks.workunit.client.1.vm08.stdout:5/909: mknod d0/d4/df/dbf/d41/dc8/c129 0 2026-03-10T07:51:47.443 INFO:tasks.workunit.client.1.vm08.stdout:5/910: chown d0/d4/df/d82 1883 1 2026-03-10T07:51:47.443 INFO:tasks.workunit.client.1.vm08.stdout:6/856: fdatasync d1/d3/df/d1d/d40/d45/d5c/fbc 0 2026-03-10T07:51:47.444 INFO:tasks.workunit.client.1.vm08.stdout:6/857: chown d1/d3/df/d1d/f1f 3517331 1 2026-03-10T07:51:47.448 INFO:tasks.workunit.client.1.vm08.stdout:2/835: creat d0/d1/d3/d10/d65/f10e x:0 0 0 2026-03-10T07:51:47.448 INFO:tasks.workunit.client.1.vm08.stdout:8/945: link d0/f2a d0/df/d15/d23/d39/d5b/dea/dce/f12a 0 2026-03-10T07:51:47.449 INFO:tasks.workunit.client.1.vm08.stdout:9/828: getdents d2/d58/dbf/dd0/d35/d97 0 2026-03-10T07:51:47.450 INFO:tasks.workunit.client.1.vm08.stdout:5/911: rename d0/d4/df/dbf/d41/c48 to d0/d4/df/dbf/daf/c12a 0 2026-03-10T07:51:47.452 INFO:tasks.workunit.client.1.vm08.stdout:6/858: unlink d1/d17/d2b/cc8 0 2026-03-10T07:51:47.453 INFO:tasks.workunit.client.1.vm08.stdout:2/836: mkdir d0/d1/d3/d56/d10f 0 2026-03-10T07:51:47.454 INFO:tasks.workunit.client.1.vm08.stdout:1/795: creat d2/d6/de/f113 x:0 0 0 2026-03-10T07:51:47.455 INFO:tasks.workunit.client.1.vm08.stdout:8/946: truncate d0/df/d2e/d30/f76 948500 0 2026-03-10T07:51:47.469 INFO:tasks.workunit.client.1.vm08.stdout:4/771: creat d5/d1f/f103 x:0 0 0 2026-03-10T07:51:47.474 INFO:tasks.workunit.client.1.vm08.stdout:0/869: fsync dd/d18/d100/d102/dc1/ded/f4d 0 2026-03-10T07:51:47.476 INFO:tasks.workunit.client.1.vm08.stdout:2/837: creat d0/d1/d17/dfb/f110 x:0 0 0 2026-03-10T07:51:47.477 INFO:tasks.workunit.client.1.vm08.stdout:2/838: read - d0/d1/d3/d10/d38/daf/f10b zero size 2026-03-10T07:51:47.478 INFO:tasks.workunit.client.1.vm08.stdout:1/796: unlink d2/d6/de/d47/da0/ca1 0 2026-03-10T07:51:47.479 INFO:tasks.workunit.client.1.vm08.stdout:1/797: chown 
d2/d6/de/d70/l8a 0 1 2026-03-10T07:51:47.481 INFO:tasks.workunit.client.1.vm08.stdout:9/829: rename d2/d26/c81 to d2/d58/dbf/dd0/df0/c11c 0 2026-03-10T07:51:47.482 INFO:tasks.workunit.client.1.vm08.stdout:3/833: link d0/d3c/d18/dec/d2d/cb2 d0/d3c/d18/dec/c10a 0 2026-03-10T07:51:47.483 INFO:tasks.workunit.client.1.vm08.stdout:1/798: sync 2026-03-10T07:51:47.488 INFO:tasks.workunit.client.1.vm08.stdout:6/859: creat d1/d3/d3e/dff/d105/f120 x:0 0 0 2026-03-10T07:51:47.500 INFO:tasks.workunit.client.1.vm08.stdout:5/912: write d0/d4/d19/d60/d6d/d70/dc5/fd3 [427459,68400] 0 2026-03-10T07:51:47.502 INFO:tasks.workunit.client.1.vm08.stdout:5/913: dread - d0/d4/d19/d60/d6d/d70/d40/f105 zero size 2026-03-10T07:51:47.503 INFO:tasks.workunit.client.1.vm08.stdout:0/870: dwrite dd/d10/d14/da6/ffa [0,4194304] 0 2026-03-10T07:51:47.505 INFO:tasks.workunit.client.1.vm08.stdout:9/830: mkdir d2/dda/d11d 0 2026-03-10T07:51:47.505 INFO:tasks.workunit.client.1.vm08.stdout:7/818: link d3/da/d25/l12 d3/da/d25/d9/d2f/d39/d43/d4f/d5b/d79/l117 0 2026-03-10T07:51:47.506 INFO:tasks.workunit.client.1.vm08.stdout:3/834: rmdir d0/d3c/d1f/d44 39 2026-03-10T07:51:47.507 INFO:tasks.workunit.client.1.vm08.stdout:9/831: chown d2/d58/f93 124 1 2026-03-10T07:51:47.508 INFO:tasks.workunit.client.1.vm08.stdout:3/835: sync 2026-03-10T07:51:47.508 INFO:tasks.workunit.client.1.vm08.stdout:4/772: mknod d5/d1f/dad/dcf/c104 0 2026-03-10T07:51:47.510 INFO:tasks.workunit.client.1.vm08.stdout:6/860: creat d1/d3/df/d1d/d6f/d10e/f121 x:0 0 0 2026-03-10T07:51:47.515 INFO:tasks.workunit.client.1.vm08.stdout:5/914: creat d0/d4/df/dbf/d41/f12b x:0 0 0 2026-03-10T07:51:47.515 INFO:tasks.workunit.client.1.vm08.stdout:4/773: truncate d5/da0/d95/de6/d48/d4f/d8d/d91/dd5/fea 709326 0 2026-03-10T07:51:47.523 INFO:tasks.workunit.client.1.vm08.stdout:3/836: dwrite d0/d3c/d18/da9/fd5 [0,4194304] 0 2026-03-10T07:51:47.527 INFO:tasks.workunit.client.1.vm08.stdout:9/832: dwrite d2/d26/ffe [0,4194304] 0 2026-03-10T07:51:47.527 
INFO:tasks.workunit.client.1.vm08.stdout:7/819: dread d3/da/d25/f29 [0,4194304] 0 2026-03-10T07:51:47.528 INFO:tasks.workunit.client.1.vm08.stdout:9/833: dread d2/d58/dbf/dd0/d35/d9b/fa1 [0,4194304] 0 2026-03-10T07:51:47.528 INFO:tasks.workunit.client.1.vm08.stdout:0/871: rename dd/d10/d14/c3b to dd/d10/dbd/d10d/c118 0 2026-03-10T07:51:47.533 INFO:tasks.workunit.client.1.vm08.stdout:9/834: chown d2/d58/dbf/dd0/d35/d97/d9d 26683911 1 2026-03-10T07:51:47.534 INFO:tasks.workunit.client.1.vm08.stdout:8/947: truncate d0/d69/d3f/fbd 1495176 0 2026-03-10T07:51:47.534 INFO:tasks.workunit.client.1.vm08.stdout:4/774: symlink d5/d8/d50/l105 0 2026-03-10T07:51:47.535 INFO:tasks.workunit.client.1.vm08.stdout:5/915: creat d0/d8/d24/de2/df7/f12c x:0 0 0 2026-03-10T07:51:47.538 INFO:tasks.workunit.client.1.vm08.stdout:6/861: dread d1/d3/df/d1d/f1f [0,4194304] 0 2026-03-10T07:51:47.538 INFO:tasks.workunit.client.1.vm08.stdout:7/820: creat d3/da/d8a/f118 x:0 0 0 2026-03-10T07:51:47.539 INFO:tasks.workunit.client.1.vm08.stdout:0/872: mkdir dd/d10/d2f/d37/d64/d95/d5c/d119 0 2026-03-10T07:51:47.540 INFO:tasks.workunit.client.1.vm08.stdout:9/835: chown d2/d58/dbf/dd0/d35/d97/d9d/df4/d28/c8b 26 1 2026-03-10T07:51:47.540 INFO:tasks.workunit.client.1.vm08.stdout:4/775: mkdir d5/da0/d95/de6/d48/da2/d106 0 2026-03-10T07:51:47.542 INFO:tasks.workunit.client.1.vm08.stdout:8/948: dwrite d0/df/f10e [0,4194304] 0 2026-03-10T07:51:47.545 INFO:tasks.workunit.client.1.vm08.stdout:0/873: mkdir dd/d18/d100/d102/dc1/d11a 0 2026-03-10T07:51:47.547 INFO:tasks.workunit.client.1.vm08.stdout:7/821: mknod d3/c119 0 2026-03-10T07:51:47.554 INFO:tasks.workunit.client.1.vm08.stdout:7/822: stat d3/da/d25/d9/d2f/d4d/db6 0 2026-03-10T07:51:47.558 INFO:tasks.workunit.client.1.vm08.stdout:8/949: dwrite d0/df/d15/d23/d39/d5b/d4a/f115 [0,4194304] 0 2026-03-10T07:51:47.560 INFO:tasks.workunit.client.1.vm08.stdout:8/950: chown d0/df/d2e/d49/ldc 44 1 2026-03-10T07:51:47.563 INFO:tasks.workunit.client.1.vm08.stdout:7/823: 
creat d3/da/d25/d9/d2f/d39/f11a x:0 0 0 2026-03-10T07:51:47.569 INFO:tasks.workunit.client.1.vm08.stdout:5/916: dread d0/d8/d5e/d8e/f96 [0,4194304] 0 2026-03-10T07:51:47.570 INFO:tasks.workunit.client.1.vm08.stdout:5/917: write d0/d4/d19/d60/fb6 [1216787,56822] 0 2026-03-10T07:51:47.572 INFO:tasks.workunit.client.1.vm08.stdout:9/836: rename d2/d58/dbf/dd0/d35/d97/d9d/df4/dee/d84/f94 to d2/d58/dbf/dd0/d35/d97/d9d/df4/dee/d84/f11e 0 2026-03-10T07:51:47.574 INFO:tasks.workunit.client.1.vm08.stdout:9/837: write d2/d58/dbf/dd0/d35/d97/dd5/f10c [142439,42597] 0 2026-03-10T07:51:47.574 INFO:tasks.workunit.client.1.vm08.stdout:8/951: dread d0/df/d15/d23/d39/d5b/fff [0,4194304] 0 2026-03-10T07:51:47.581 INFO:tasks.workunit.client.1.vm08.stdout:2/839: dwrite d0/d1/d3/f63 [0,4194304] 0 2026-03-10T07:51:47.583 INFO:tasks.workunit.client.1.vm08.stdout:5/918: creat d0/d4/df/d82/df3/f12d x:0 0 0 2026-03-10T07:51:47.583 INFO:tasks.workunit.client.1.vm08.stdout:9/838: symlink d2/d58/dbf/d2b/l11f 0 2026-03-10T07:51:47.586 INFO:tasks.workunit.client.1.vm08.stdout:8/952: symlink d0/df/d17/l12b 0 2026-03-10T07:51:47.586 INFO:tasks.workunit.client.1.vm08.stdout:9/839: rmdir d2/d58/dbf/dd0/d35 39 2026-03-10T07:51:47.589 INFO:tasks.workunit.client.1.vm08.stdout:8/953: chown d0/d69/fe6 7870216 1 2026-03-10T07:51:47.595 INFO:tasks.workunit.client.1.vm08.stdout:5/919: sync 2026-03-10T07:51:47.601 INFO:tasks.workunit.client.1.vm08.stdout:1/799: dwrite d2/d6/de/d47/f38 [0,4194304] 0 2026-03-10T07:51:47.605 INFO:tasks.workunit.client.1.vm08.stdout:9/840: creat d2/d58/dbf/dd0/d35/d97/d9d/df4/dee/d84/d91/f120 x:0 0 0 2026-03-10T07:51:47.607 INFO:tasks.workunit.client.1.vm08.stdout:1/800: dwrite d2/d6/d3a/f6d [0,4194304] 0 2026-03-10T07:51:47.608 INFO:tasks.workunit.client.1.vm08.stdout:4/776: dread d5/da0/d12/fc6 [0,4194304] 0 2026-03-10T07:51:47.613 INFO:tasks.workunit.client.1.vm08.stdout:6/862: dread d1/d3/ff6 [0,4194304] 0 2026-03-10T07:51:47.618 
INFO:tasks.workunit.client.1.vm08.stdout:5/920: unlink d0/d4/df/d12/c72 0 2026-03-10T07:51:47.643 INFO:tasks.workunit.client.1.vm08.stdout:1/801: symlink d2/d6/de/d1f/d26/d58/l114 0 2026-03-10T07:51:47.644 INFO:tasks.workunit.client.1.vm08.stdout:1/802: chown d2/d6/de/d1f/d8f/f91 458 1 2026-03-10T07:51:47.645 INFO:tasks.workunit.client.1.vm08.stdout:6/863: rename d1/d3/df/d1d/d6f/d10e/d115 to d1/d3/df/d1d/d40/d45/d122 0 2026-03-10T07:51:47.646 INFO:tasks.workunit.client.1.vm08.stdout:6/864: truncate d1/d3/df/d1d/d40/f10b 656188 0 2026-03-10T07:51:47.648 INFO:tasks.workunit.client.1.vm08.stdout:5/921: unlink d0/d4/df/dbf/d41/de8/f111 0 2026-03-10T07:51:47.648 INFO:tasks.workunit.client.1.vm08.stdout:8/954: link d0/c5 d0/df/d2e/d49/c12c 0 2026-03-10T07:51:47.656 INFO:tasks.workunit.client.1.vm08.stdout:0/874: write dd/d10/d2f/d37/d64/f70 [3185870,119133] 0 2026-03-10T07:51:47.661 INFO:tasks.workunit.client.1.vm08.stdout:3/837: dwrite d0/d3c/f87 [0,4194304] 0 2026-03-10T07:51:47.661 INFO:tasks.workunit.client.1.vm08.stdout:7/824: dwrite d3/da/d25/d9/d2f/d3a/d4b/f8d [0,4194304] 0 2026-03-10T07:51:47.670 INFO:tasks.workunit.client.1.vm08.stdout:8/955: creat d0/df/d15/d23/d54/dba/d89/dbf/f12d x:0 0 0 2026-03-10T07:51:47.671 INFO:tasks.workunit.client.1.vm08.stdout:1/803: creat d2/d10/dd7/f115 x:0 0 0 2026-03-10T07:51:47.671 INFO:tasks.workunit.client.1.vm08.stdout:1/804: chown d2/d10/dc6/lef 405 1 2026-03-10T07:51:47.675 INFO:tasks.workunit.client.1.vm08.stdout:0/875: mknod dd/d10/dbd/d10d/c11b 0 2026-03-10T07:51:47.676 INFO:tasks.workunit.client.1.vm08.stdout:0/876: chown dd/d10/d14/d15/d20/d22/l9e 46670 1 2026-03-10T07:51:47.678 INFO:tasks.workunit.client.1.vm08.stdout:8/956: dread d0/df/d17/d25/fc4 [0,4194304] 0 2026-03-10T07:51:47.690 INFO:tasks.workunit.client.1.vm08.stdout:7/825: truncate d3/da/d25/d9/d2f/d3a/d4b/f7b 593460 0 2026-03-10T07:51:47.691 INFO:tasks.workunit.client.1.vm08.stdout:6/865: rename d1/d3/f71 to d1/db/d24/d3d/f123 0 2026-03-10T07:51:47.691 
INFO:tasks.workunit.client.1.vm08.stdout:1/805: creat d2/d6/de/d1f/d26/d58/f116 x:0 0 0 2026-03-10T07:51:47.691 INFO:tasks.workunit.client.1.vm08.stdout:5/922: rename d0/d4/df/c16 to d0/d4/df/dbf/d41/dad/c12e 0 2026-03-10T07:51:47.693 INFO:tasks.workunit.client.1.vm08.stdout:6/866: mkdir d1/d3/df/d1d/d40/d45/d124 0 2026-03-10T07:51:47.695 INFO:tasks.workunit.client.1.vm08.stdout:1/806: mkdir d2/d6/de/d1f/d26/d89/d117 0 2026-03-10T07:51:47.696 INFO:tasks.workunit.client.1.vm08.stdout:3/838: truncate d0/d3c/d1f/d44/f59 1022531 0 2026-03-10T07:51:47.698 INFO:tasks.workunit.client.1.vm08.stdout:0/877: rename dd/d10/d14/d1b/da5/fb6 to dd/d18/d100/d102/dc1/ded/df7/f11c 0 2026-03-10T07:51:47.699 INFO:tasks.workunit.client.1.vm08.stdout:5/923: creat d0/d77/daa/f12f x:0 0 0 2026-03-10T07:51:47.702 INFO:tasks.workunit.client.1.vm08.stdout:1/807: truncate d2/d6/de/d1f/d40/d76/fab 639460 0 2026-03-10T07:51:47.703 INFO:tasks.workunit.client.1.vm08.stdout:5/924: dwrite d0/d4/d19/d81/d92/f117 [0,4194304] 0 2026-03-10T07:51:47.705 INFO:tasks.workunit.client.1.vm08.stdout:6/867: sync 2026-03-10T07:51:47.713 INFO:tasks.workunit.client.1.vm08.stdout:2/840: write d0/d1/d3/d56/d78/dad/db1/d61/fbd [485083,9494] 0 2026-03-10T07:51:47.715 INFO:tasks.workunit.client.1.vm08.stdout:9/841: truncate d2/d58/dbf/dd0/d35/d97/d9d/df4/dee/f5f 2105737 0 2026-03-10T07:51:47.716 INFO:tasks.workunit.client.1.vm08.stdout:4/777: write d5/d1f/daf/feb [2185963,2295] 0 2026-03-10T07:51:47.728 INFO:tasks.workunit.client.1.vm08.stdout:7/826: link d3/da/d25/d9/d2f/d6c/cbe d3/da/d25/d9/d2f/d39/c11b 0 2026-03-10T07:51:47.731 INFO:tasks.workunit.client.1.vm08.stdout:7/827: dread d3/f93 [0,4194304] 0 2026-03-10T07:51:47.732 INFO:tasks.workunit.client.1.vm08.stdout:8/957: getdents d0/df/d15/d23/d54/dba/d11c 0 2026-03-10T07:51:47.736 INFO:tasks.workunit.client.1.vm08.stdout:5/925: rename d0/f3b to d0/d77/f130 0 2026-03-10T07:51:47.740 INFO:tasks.workunit.client.1.vm08.stdout:1/808: truncate 
d2/d6/de/d1f/d26/d58/d8c/f97 600168 0 2026-03-10T07:51:47.746 INFO:tasks.workunit.client.1.vm08.stdout:3/839: mknod d0/d3c/d18/dec/c10b 0 2026-03-10T07:51:47.755 INFO:tasks.workunit.client.1.vm08.stdout:9/842: mknod d2/d58/dbf/dd0/d35/dff/c121 0 2026-03-10T07:51:47.756 INFO:tasks.workunit.client.1.vm08.stdout:6/868: write d1/db/d24/f25 [1445703,110111] 0 2026-03-10T07:51:47.756 INFO:tasks.workunit.client.1.vm08.stdout:4/778: symlink d5/da0/d95/de6/d48/d4f/l107 0 2026-03-10T07:51:47.758 INFO:tasks.workunit.client.1.vm08.stdout:6/869: stat d1/d3/df/d1d/d40/d87/l113 0 2026-03-10T07:51:47.758 INFO:tasks.workunit.client.1.vm08.stdout:4/779: read - d5/d1f/d41/f99 zero size 2026-03-10T07:51:47.771 INFO:tasks.workunit.client.1.vm08.stdout:0/878: dwrite dd/d10/d2f/f4b [0,4194304] 0 2026-03-10T07:51:47.776 INFO:tasks.workunit.client.1.vm08.stdout:8/958: dwrite d0/fa [0,4194304] 0 2026-03-10T07:51:47.777 INFO:tasks.workunit.client.1.vm08.stdout:8/959: chown d0/df/d15/d23/d39/f3e 22 1 2026-03-10T07:51:47.780 INFO:tasks.workunit.client.1.vm08.stdout:8/960: dread d0/df/d15/d23/f75 [0,4194304] 0 2026-03-10T07:51:47.794 INFO:tasks.workunit.client.1.vm08.stdout:5/926: rename d0/d33/ddd/f104 to d0/d4/df/dbf/daf/f131 0 2026-03-10T07:51:47.802 INFO:tasks.workunit.client.1.vm08.stdout:5/927: fsync d0/d4/df/ff5 0 2026-03-10T07:51:47.802 INFO:tasks.workunit.client.1.vm08.stdout:7/828: write d3/da/d25/d9/d2f/d4d/fb9 [1920005,103419] 0 2026-03-10T07:51:47.803 INFO:tasks.workunit.client.1.vm08.stdout:7/829: write d3/da/d25/d9/d6f/fab [805354,51927] 0 2026-03-10T07:51:47.824 INFO:tasks.workunit.client.1.vm08.stdout:3/840: creat d0/d3c/d18/dec/d2d/f10c x:0 0 0 2026-03-10T07:51:47.839 INFO:tasks.workunit.client.1.vm08.stdout:9/843: rmdir d2/d58/dbf/dd0/d35/d97/d9d 39 2026-03-10T07:51:47.842 INFO:tasks.workunit.client.1.vm08.stdout:4/780: symlink d5/da0/d95/l108 0 2026-03-10T07:51:47.851 INFO:tasks.workunit.client.1.vm08.stdout:2/841: dwrite d0/d1/d3/d39/d7d/d86/f109 [0,4194304] 0 
2026-03-10T07:51:47.851 INFO:tasks.workunit.client.1.vm08.stdout:2/842: chown d0/d1/d17/ff3 1 1 2026-03-10T07:51:47.882 INFO:tasks.workunit.client.1.vm08.stdout:5/928: rmdir d0/d4/df 39 2026-03-10T07:51:47.892 INFO:tasks.workunit.client.1.vm08.stdout:1/809: mknod d2/d6/de/d70/c118 0 2026-03-10T07:51:47.896 INFO:tasks.workunit.client.1.vm08.stdout:1/810: dwrite d2/d10/f3e [4194304,4194304] 0 2026-03-10T07:51:47.900 INFO:tasks.workunit.client.1.vm08.stdout:6/870: symlink d1/d3/df/d1d/d40/d45/d124/l125 0 2026-03-10T07:51:47.908 INFO:tasks.workunit.client.1.vm08.stdout:4/781: unlink d5/d8/f39 0 2026-03-10T07:51:47.908 INFO:tasks.workunit.client.1.vm08.stdout:7/830: dwrite d3/f93 [0,4194304] 0 2026-03-10T07:51:47.919 INFO:tasks.workunit.client.1.vm08.stdout:0/879: symlink dd/d18/d100/d102/d107/l11d 0 2026-03-10T07:51:47.920 INFO:tasks.workunit.client.1.vm08.stdout:8/961: unlink d0/df/d17/c1d 0 2026-03-10T07:51:47.920 INFO:tasks.workunit.client.1.vm08.stdout:5/929: symlink d0/d77/d83/de0/l132 0 2026-03-10T07:51:47.920 INFO:tasks.workunit.client.1.vm08.stdout:3/841: symlink d0/d3c/d18/d48/d55/l10d 0 2026-03-10T07:51:47.921 INFO:tasks.workunit.client.1.vm08.stdout:5/930: write d0/ddf/f44 [2789691,9518] 0 2026-03-10T07:51:47.923 INFO:tasks.workunit.client.1.vm08.stdout:9/844: symlink d2/d58/dbf/dd0/d35/d97/d9d/df4/dee/d84/dca/dd7/l122 0 2026-03-10T07:51:47.924 INFO:tasks.workunit.client.1.vm08.stdout:6/871: creat d1/d17/d2b/d58/d76/d114/d79/d7c/f126 x:0 0 0 2026-03-10T07:51:47.927 INFO:tasks.workunit.client.1.vm08.stdout:9/845: dwrite d2/d58/dbf/dd0/d35/d97/d9d/df4/d28/d98/fd4 [0,4194304] 0 2026-03-10T07:51:47.933 INFO:tasks.workunit.client.1.vm08.stdout:9/846: chown d2/d58/dbf/dd0/d35/d97/d9d/lae 175566 1 2026-03-10T07:51:47.937 INFO:tasks.workunit.client.1.vm08.stdout:7/831: mkdir d3/da/d25/d9/d6f/d11c 0 2026-03-10T07:51:47.945 INFO:tasks.workunit.client.1.vm08.stdout:8/962: mkdir d0/df/d15/d53/d12e 0 2026-03-10T07:51:47.954 INFO:tasks.workunit.client.1.vm08.stdout:9/847: 
dwrite d2/d58/dbf/dd0/d35/d97/d9d/df4/d28/f66 [0,4194304] 0 2026-03-10T07:51:47.956 INFO:tasks.workunit.client.1.vm08.stdout:7/832: dread d3/da/d25/d9/fbf [0,4194304] 0 2026-03-10T07:51:47.956 INFO:tasks.workunit.client.1.vm08.stdout:7/833: chown d3/da/d25/d9/lcc 208 1 2026-03-10T07:51:47.968 INFO:tasks.workunit.client.1.vm08.stdout:3/842: mknod d0/d3c/d18/d32/d61/d52/c10e 0 2026-03-10T07:51:47.968 INFO:tasks.workunit.client.1.vm08.stdout:3/843: chown d0/d3c/l1b 7832649 1 2026-03-10T07:51:47.972 INFO:tasks.workunit.client.1.vm08.stdout:1/811: write d2/f69 [245606,9069] 0 2026-03-10T07:51:47.977 INFO:tasks.workunit.client.1.vm08.stdout:1/812: dwrite d2/d6/de/d47/da0/f101 [0,4194304] 0 2026-03-10T07:51:47.987 INFO:tasks.workunit.client.1.vm08.stdout:1/813: dwrite d2/d10/fe8 [4194304,4194304] 0 2026-03-10T07:51:47.997 INFO:tasks.workunit.client.1.vm08.stdout:9/848: creat d2/d26/da4/f123 x:0 0 0 2026-03-10T07:51:48.003 INFO:tasks.workunit.client.1.vm08.stdout:4/782: write d5/d8/f86 [534299,41090] 0 2026-03-10T07:51:48.003 INFO:tasks.workunit.client.1.vm08.stdout:2/843: truncate d0/d1/d3/d39/d7d/d86/f109 3592194 0 2026-03-10T07:51:48.003 INFO:tasks.workunit.client.1.vm08.stdout:0/880: write dd/d10/f5e [358979,20623] 0 2026-03-10T07:51:48.003 INFO:tasks.workunit.client.1.vm08.stdout:5/931: write d0/d4/d19/d81/da4/fc2 [1191761,62059] 0 2026-03-10T07:51:48.004 INFO:tasks.workunit.client.1.vm08.stdout:5/932: chown d0/d8/dce/fcd 18595 1 2026-03-10T07:51:48.004 INFO:tasks.workunit.client.1.vm08.stdout:4/783: sync 2026-03-10T07:51:48.004 INFO:tasks.workunit.client.1.vm08.stdout:1/814: sync 2026-03-10T07:51:48.005 INFO:tasks.workunit.client.1.vm08.stdout:0/881: write dd/d10/d2f/d37/d64/d95/d58/f10a [891183,119781] 0 2026-03-10T07:51:48.009 INFO:tasks.workunit.client.1.vm08.stdout:1/815: read d2/d6/de/d1f/d26/d89/d8e/fe3 [596074,91583] 0 2026-03-10T07:51:48.010 INFO:tasks.workunit.client.1.vm08.stdout:1/816: read - d2/d10/fc4 zero size 2026-03-10T07:51:48.015 
INFO:tasks.workunit.client.1.vm08.stdout:8/963: mknod d0/df/c12f 0 2026-03-10T07:51:48.021 INFO:tasks.workunit.client.1.vm08.stdout:7/834: truncate d3/da/d25/d9/d2f/d39/f56 9289384 0 2026-03-10T07:51:48.021 INFO:tasks.workunit.client.1.vm08.stdout:7/835: fsync d3/da/d25/d9/fd 0 2026-03-10T07:51:48.034 INFO:tasks.workunit.client.1.vm08.stdout:9/849: dread d2/d58/dbf/dd0/d35/f6c [0,4194304] 0 2026-03-10T07:51:48.034 INFO:tasks.workunit.client.1.vm08.stdout:9/850: chown d2/d58/dbf/dd0/d35/d97/dfb 3593390 1 2026-03-10T07:51:48.036 INFO:tasks.workunit.client.1.vm08.stdout:4/784: mknod d5/d8/d50/db0/c109 0 2026-03-10T07:51:48.078 INFO:tasks.workunit.client.1.vm08.stdout:8/964: fdatasync d0/df/fdd 0 2026-03-10T07:51:48.082 INFO:tasks.workunit.client.1.vm08.stdout:6/872: getdents d1/d3/d3e 0 2026-03-10T07:51:48.083 INFO:tasks.workunit.client.1.vm08.stdout:2/844: rename d0/d1/d3/d39/d7d/d86/d55/d1b to d0/d1/d3/d39/de2/d107/d111 0 2026-03-10T07:51:48.085 INFO:tasks.workunit.client.1.vm08.stdout:5/933: mknod d0/d4/df/d82/c133 0 2026-03-10T07:51:48.086 INFO:tasks.workunit.client.1.vm08.stdout:9/851: mkdir d2/d58/dbf/dd0/d35/d9b/d124 0 2026-03-10T07:51:48.087 INFO:tasks.workunit.client.1.vm08.stdout:4/785: unlink d5/d8/d50/db0/cff 0 2026-03-10T07:51:48.090 INFO:tasks.workunit.client.1.vm08.stdout:4/786: sync 2026-03-10T07:51:48.099 INFO:tasks.workunit.client.1.vm08.stdout:7/836: mknod d3/da/d25/d9/d2f/d3a/d4b/d102/c11d 0 2026-03-10T07:51:48.101 INFO:tasks.workunit.client.1.vm08.stdout:0/882: dwrite dd/d10/f77 [0,4194304] 0 2026-03-10T07:51:48.101 INFO:tasks.workunit.client.1.vm08.stdout:1/817: dwrite d2/d6/d3a/f7d [0,4194304] 0 2026-03-10T07:51:48.102 INFO:tasks.workunit.client.1.vm08.stdout:7/837: stat d3/da/d25/d9/d2f/d39/d43/c112 0 2026-03-10T07:51:48.105 INFO:tasks.workunit.client.1.vm08.stdout:1/818: dwrite d2/f36 [4194304,4194304] 0 2026-03-10T07:51:48.105 INFO:tasks.workunit.client.1.vm08.stdout:1/819: chown d2/d6/de/d1f/d22/ca4 429105 1 2026-03-10T07:51:48.106 
INFO:tasks.workunit.client.1.vm08.stdout:2/845: mkdir d0/d1/d3/d56/d78/dad/d112 0 2026-03-10T07:51:48.109 INFO:tasks.workunit.client.1.vm08.stdout:5/934: rename d0/d4/df/d82/df3 to d0/d4/d19/d81/d92/d134 0 2026-03-10T07:51:48.109 INFO:tasks.workunit.client.1.vm08.stdout:9/852: truncate d2/d58/dbf/dd0/d35/d97/d9d/df4/f1e 6054277 0 2026-03-10T07:51:48.112 INFO:tasks.workunit.client.1.vm08.stdout:4/787: dread - d5/fd4 zero size 2026-03-10T07:51:48.112 INFO:tasks.workunit.client.1.vm08.stdout:5/935: chown d0/d4/df/dbf/d41/dad/fb9 2273 1 2026-03-10T07:51:48.118 INFO:tasks.workunit.client.1.vm08.stdout:5/936: dread - d0/d4/d19/d81/f123 zero size 2026-03-10T07:51:48.119 INFO:tasks.workunit.client.1.vm08.stdout:0/883: mkdir dd/d10/d2f/d37/d64/d11e 0 2026-03-10T07:51:48.129 INFO:tasks.workunit.client.1.vm08.stdout:6/873: fsync d1/d3/df/d38/f60 0 2026-03-10T07:51:48.139 INFO:tasks.workunit.client.1.vm08.stdout:3/844: getdents d0 0 2026-03-10T07:51:48.146 INFO:tasks.workunit.client.1.vm08.stdout:2/846: unlink d0/d1/d3/d10/d65/f10e 0 2026-03-10T07:51:48.146 INFO:tasks.workunit.client.1.vm08.stdout:2/847: chown d0/d1/d3/d39/d7d/d7e/c8f 37511 1 2026-03-10T07:51:48.146 INFO:tasks.workunit.client.1.vm08.stdout:1/820: mkdir d2/d6/de/d1f/d40/d119 0 2026-03-10T07:51:48.146 INFO:tasks.workunit.client.1.vm08.stdout:9/853: mknod d2/d58/dbf/dd0/d35/d97/dd5/c125 0 2026-03-10T07:51:48.146 INFO:tasks.workunit.client.1.vm08.stdout:9/854: write d2/d58/fc6 [3464864,47487] 0 2026-03-10T07:51:48.146 INFO:tasks.workunit.client.1.vm08.stdout:4/788: chown d5/d1f/c2c 746549636 1 2026-03-10T07:51:48.146 INFO:tasks.workunit.client.1.vm08.stdout:9/855: truncate d2/d58/dbf/dd0/d35/d97/dd5/fe6 8535974 0 2026-03-10T07:51:48.148 INFO:tasks.workunit.client.1.vm08.stdout:7/838: dread d3/da/d25/f32 [0,4194304] 0 2026-03-10T07:51:48.156 INFO:tasks.workunit.client.1.vm08.stdout:3/845: dread d0/d3c/d18/d32/d61/d83/fda [0,4194304] 0 2026-03-10T07:51:48.158 INFO:tasks.workunit.client.1.vm08.stdout:1/821: mkdir 
d2/d6/dfe/d11a 0 2026-03-10T07:51:48.158 INFO:tasks.workunit.client.1.vm08.stdout:8/965: truncate d0/df/fdd 427527 0 2026-03-10T07:51:48.159 INFO:tasks.workunit.client.1.vm08.stdout:1/822: chown d2/d6/de/d1f/d26/d58/d83/dc2 92 1 2026-03-10T07:51:48.164 INFO:tasks.workunit.client.1.vm08.stdout:4/789: symlink d5/d1f/d70/l10a 0 2026-03-10T07:51:48.164 INFO:tasks.workunit.client.1.vm08.stdout:4/790: read - d5/da0/d95/de6/d48/d4f/fe0 zero size 2026-03-10T07:51:48.167 INFO:tasks.workunit.client.1.vm08.stdout:4/791: dwrite d5/d1f/dad/db8/fd2 [0,4194304] 0 2026-03-10T07:51:48.169 INFO:tasks.workunit.client.1.vm08.stdout:2/848: write d0/d1/d3/d10/f58 [1673509,23634] 0 2026-03-10T07:51:48.172 INFO:tasks.workunit.client.1.vm08.stdout:0/884: symlink dd/d10/d14/d1b/l11f 0 2026-03-10T07:51:48.187 INFO:tasks.workunit.client.1.vm08.stdout:7/839: rmdir d3/da/d25/d9/d2f/d39/d43/d4f/d5b/db7 39 2026-03-10T07:51:48.194 INFO:tasks.workunit.client.1.vm08.stdout:8/966: dread - d0/d69/fd9 zero size 2026-03-10T07:51:48.213 INFO:tasks.workunit.client.1.vm08.stdout:5/937: rmdir d0/d10e 0 2026-03-10T07:51:48.214 INFO:tasks.workunit.client.1.vm08.stdout:5/938: truncate d0/d4/d19/d81/d92/f116 236234 0 2026-03-10T07:51:48.218 INFO:tasks.workunit.client.1.vm08.stdout:6/874: dwrite d1/d3/f21 [0,4194304] 0 2026-03-10T07:51:48.228 INFO:tasks.workunit.client.1.vm08.stdout:4/792: rename d5/d1f to d5/da0/d95/de6/d48/d4f/d7c/dde/ddf/d10b 0 2026-03-10T07:51:48.229 INFO:tasks.workunit.client.1.vm08.stdout:9/856: write d2/d58/fb3 [528856,79170] 0 2026-03-10T07:51:48.230 INFO:tasks.workunit.client.1.vm08.stdout:9/857: write d2/d58/dbf/dd0/f5d [5227636,51686] 0 2026-03-10T07:51:48.230 INFO:tasks.workunit.client.1.vm08.stdout:9/858: write d2/d58/dbf/f21 [3695473,240] 0 2026-03-10T07:51:48.232 INFO:tasks.workunit.client.1.vm08.stdout:2/849: write d0/d1/d3/d39/f3b [729261,96059] 0 2026-03-10T07:51:48.233 INFO:tasks.workunit.client.1.vm08.stdout:2/850: fsync d0/d1/d17/f1a 0 2026-03-10T07:51:48.235 
INFO:tasks.workunit.client.1.vm08.stdout:0/885: creat dd/d10/d2f/d37/d64/d95/d58/f120 x:0 0 0 2026-03-10T07:51:48.235 INFO:tasks.workunit.client.1.vm08.stdout:2/851: write d0/d1/d3/d56/d78/dad/db1/d61/dee/ffa [1082225,37853] 0 2026-03-10T07:51:48.239 INFO:tasks.workunit.client.1.vm08.stdout:0/886: dwrite dd/d10/d14/d15/f9c [0,4194304] 0 2026-03-10T07:51:48.243 INFO:tasks.workunit.client.1.vm08.stdout:0/887: dread - dd/d10/dbd/f101 zero size 2026-03-10T07:51:48.254 INFO:tasks.workunit.client.1.vm08.stdout:1/823: link d2/d10/f3f d2/d6/de/d1f/d22/deb/f11b 0 2026-03-10T07:51:48.255 INFO:tasks.workunit.client.1.vm08.stdout:1/824: truncate d2/d6/de/d47/da0/f103 585888 0 2026-03-10T07:51:48.264 INFO:tasks.workunit.client.1.vm08.stdout:1/825: read d2/d6/de/f74 [2182718,38581] 0 2026-03-10T07:51:48.268 INFO:tasks.workunit.client.1.vm08.stdout:9/859: mkdir d2/d58/dbf/dd0/d35/dff/d126 0 2026-03-10T07:51:48.269 INFO:tasks.workunit.client.1.vm08.stdout:1/826: dwrite d2/d6/de/d1f/d26/d58/d83/fa2 [4194304,4194304] 0 2026-03-10T07:51:48.277 INFO:tasks.workunit.client.1.vm08.stdout:7/840: mknod d3/c11e 0 2026-03-10T07:51:48.280 INFO:tasks.workunit.client.1.vm08.stdout:2/852: symlink d0/d1/d3/d56/l113 0 2026-03-10T07:51:48.281 INFO:tasks.workunit.client.1.vm08.stdout:6/875: dwrite d1/d3/df/d52/f8f [0,4194304] 0 2026-03-10T07:51:48.282 INFO:tasks.workunit.client.1.vm08.stdout:4/793: dwrite d5/da0/d95/de6/d48/d4f/d7c/fb2 [0,4194304] 0 2026-03-10T07:51:48.283 INFO:tasks.workunit.client.1.vm08.stdout:6/876: dread - d1/d3/d3e/dff/d105/f120 zero size 2026-03-10T07:51:48.284 INFO:tasks.workunit.client.1.vm08.stdout:3/846: creat d0/d3c/d18/f10f x:0 0 0 2026-03-10T07:51:48.287 INFO:tasks.workunit.client.1.vm08.stdout:0/888: truncate dd/d10/d14/d15/d20/d7a/fde 71436 0 2026-03-10T07:51:48.287 INFO:tasks.workunit.client.1.vm08.stdout:8/967: mkdir d0/df/d130 0 2026-03-10T07:51:48.312 INFO:tasks.workunit.client.1.vm08.stdout:1/827: rmdir d2/d6/de/d70 39 2026-03-10T07:51:48.323 
INFO:tasks.workunit.client.1.vm08.stdout:2/853: fdatasync d0/f1e 0 2026-03-10T07:51:48.324 INFO:tasks.workunit.client.1.vm08.stdout:2/854: read d0/d1/d3/d10/f58 [779713,49913] 0 2026-03-10T07:51:48.328 INFO:tasks.workunit.client.1.vm08.stdout:6/877: mkdir d1/d3/df/d1d/d40/d87/d127 0 2026-03-10T07:51:48.336 INFO:tasks.workunit.client.1.vm08.stdout:7/841: write d3/da/d25/d9/d2f/d39/fc7 [7195609,25748] 0 2026-03-10T07:51:48.339 INFO:tasks.workunit.client.1.vm08.stdout:7/842: dwrite d3/da/d8a/fcd [0,4194304] 0 2026-03-10T07:51:48.339 INFO:tasks.workunit.client.1.vm08.stdout:0/889: rmdir dd/d10/d2f/d37/d64/d95/d58/d3d/d9b 39 2026-03-10T07:51:48.339 INFO:tasks.workunit.client.1.vm08.stdout:0/890: stat dd/d10/d14/d15/d20/d22/f51 0 2026-03-10T07:51:48.347 INFO:tasks.workunit.client.1.vm08.stdout:5/939: creat d0/f135 x:0 0 0 2026-03-10T07:51:48.348 INFO:tasks.workunit.client.1.vm08.stdout:7/843: sync 2026-03-10T07:51:48.352 INFO:tasks.workunit.client.1.vm08.stdout:1/828: mknod d2/d6/de/d1f/d40/c11c 0 2026-03-10T07:51:48.353 INFO:tasks.workunit.client.1.vm08.stdout:8/968: dread d0/df/d2e/d30/f43 [0,4194304] 0 2026-03-10T07:51:48.360 INFO:tasks.workunit.client.1.vm08.stdout:5/940: sync 2026-03-10T07:51:48.367 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:51:47 vm05.local ceph-mon[50387]: pgmap v43: 65 pgs: 65 active+clean; 3.2 GiB data, 11 GiB used, 109 GiB / 120 GiB avail; 32 MiB/s rd, 91 MiB/s wr, 209 op/s 2026-03-10T07:51:48.370 INFO:tasks.workunit.client.1.vm08.stdout:4/794: fdatasync d5/f77 0 2026-03-10T07:51:48.374 INFO:tasks.workunit.client.1.vm08.stdout:3/847: creat d0/d3c/d18/d80/dc1/d108/f110 x:0 0 0 2026-03-10T07:51:48.418 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:51:48 vm08.local ceph-mon[59917]: pgmap v43: 65 pgs: 65 active+clean; 3.2 GiB data, 11 GiB used, 109 GiB / 120 GiB avail; 32 MiB/s rd, 91 MiB/s wr, 209 op/s 2026-03-10T07:51:48.442 INFO:tasks.workunit.client.1.vm08.stdout:7/844: truncate d3/da/d25/d9/d2f/d3a/dc0/ff9 2027690 0 
2026-03-10T07:51:48.442 INFO:tasks.workunit.client.1.vm08.stdout:1/829: creat d2/d6/de/d5f/df9/f11d x:0 0 0 2026-03-10T07:51:48.442 INFO:tasks.workunit.client.1.vm08.stdout:8/969: write d0/df/d15/d23/d39/f3e [1607674,6914] 0 2026-03-10T07:51:48.444 INFO:tasks.workunit.client.1.vm08.stdout:5/941: rename d0/d4/df/dbf/d41/dc8/c129 to d0/d4/d19/d81/c136 0 2026-03-10T07:51:48.447 INFO:tasks.workunit.client.1.vm08.stdout:8/970: truncate d0/df/d15/f10c 127802 0 2026-03-10T07:51:48.448 INFO:tasks.workunit.client.1.vm08.stdout:8/971: readlink d0/df/d15/d23/l104 0 2026-03-10T07:51:48.451 INFO:tasks.workunit.client.1.vm08.stdout:0/891: symlink dd/d10/d14/d15/d20/d22/dc6/l121 0 2026-03-10T07:51:48.451 INFO:tasks.workunit.client.1.vm08.stdout:2/855: dwrite d0/d1/d3/d39/d7d/d86/d55/d7a/fea [0,4194304] 0 2026-03-10T07:51:48.452 INFO:tasks.workunit.client.1.vm08.stdout:2/856: stat d0/d1/d17/l2c 0 2026-03-10T07:51:48.453 INFO:tasks.workunit.client.1.vm08.stdout:2/857: chown d0/d1/d3/d56/d57/f79 30720920 1 2026-03-10T07:51:48.462 INFO:tasks.workunit.client.1.vm08.stdout:7/845: rmdir d3/da/d25/d9/d2f/d39/d43/d4f/d5b 39 2026-03-10T07:51:48.462 INFO:tasks.workunit.client.1.vm08.stdout:9/860: getdents d2/d58/dbf/dd0/d35/d97/d9d/df4/dee/d84/d91 0 2026-03-10T07:51:48.462 INFO:tasks.workunit.client.1.vm08.stdout:9/861: fsync d2/d58/dbf/daf/ff7 0 2026-03-10T07:51:48.467 INFO:tasks.workunit.client.1.vm08.stdout:7/846: dwrite d3/da/d25/d9/d2f/d3a/d71/d8c/ddf/f101 [0,4194304] 0 2026-03-10T07:51:48.480 INFO:tasks.workunit.client.1.vm08.stdout:3/848: dwrite d0/d3c/d18/d32/d61/d83/f8b [0,4194304] 0 2026-03-10T07:51:48.493 INFO:tasks.workunit.client.1.vm08.stdout:6/878: creat d1/d17/d2b/f128 x:0 0 0 2026-03-10T07:51:48.495 INFO:tasks.workunit.client.1.vm08.stdout:2/858: rename d0/d1/d17/d6b/da0/dd7 to d0/d1/d3/d39/de2/d114 0 2026-03-10T07:51:48.495 INFO:tasks.workunit.client.1.vm08.stdout:9/862: creat d2/d58/dbf/dd0/d35/f127 x:0 0 0 2026-03-10T07:51:48.496 
INFO:tasks.workunit.client.1.vm08.stdout:9/863: readlink d2/l9 0 2026-03-10T07:51:48.497 INFO:tasks.workunit.client.1.vm08.stdout:0/892: dread dd/d10/d2f/d37/d64/d95/d5c/dca/fce [0,4194304] 0 2026-03-10T07:51:48.498 INFO:tasks.workunit.client.1.vm08.stdout:8/972: dwrite d0/df/d15/d23/d39/d5b/fff [0,4194304] 0 2026-03-10T07:51:48.504 INFO:tasks.workunit.client.1.vm08.stdout:7/847: rmdir d3/da/d25/d9/d2f/d39/d43 39 2026-03-10T07:51:48.507 INFO:tasks.workunit.client.1.vm08.stdout:1/830: truncate d2/d6/de/f1c 1014259 0 2026-03-10T07:51:48.507 INFO:tasks.workunit.client.1.vm08.stdout:3/849: write d0/d3c/d18/d32/d61/d83/fda [2582488,87202] 0 2026-03-10T07:51:48.507 INFO:tasks.workunit.client.1.vm08.stdout:2/859: creat d0/d1/d3/d39/d7d/f115 x:0 0 0 2026-03-10T07:51:48.507 INFO:tasks.workunit.client.1.vm08.stdout:0/893: symlink dd/d18/l122 0 2026-03-10T07:51:48.513 INFO:tasks.workunit.client.1.vm08.stdout:7/848: creat d3/da/d8a/dd1/f11f x:0 0 0 2026-03-10T07:51:48.513 INFO:tasks.workunit.client.1.vm08.stdout:7/849: chown d3/c5 2379624 1 2026-03-10T07:51:48.514 INFO:tasks.workunit.client.1.vm08.stdout:2/860: rename d0/d1/d3/d10/d65 to d0/d1/d17/db2/dde/d116 0 2026-03-10T07:51:48.516 INFO:tasks.workunit.client.1.vm08.stdout:8/973: getdents d0/df/d15/d53/d12e 0 2026-03-10T07:51:48.519 INFO:tasks.workunit.client.1.vm08.stdout:9/864: creat d2/d58/dbf/f128 x:0 0 0 2026-03-10T07:51:48.523 INFO:tasks.workunit.client.1.vm08.stdout:3/850: dwrite d0/d3c/d18/f38 [0,4194304] 0 2026-03-10T07:51:48.534 INFO:tasks.workunit.client.1.vm08.stdout:8/974: fdatasync d0/df/d15/d23/da8/fc2 0 2026-03-10T07:51:48.534 INFO:tasks.workunit.client.1.vm08.stdout:0/894: dread dd/d18/fdf [0,4194304] 0 2026-03-10T07:51:48.537 INFO:tasks.workunit.client.1.vm08.stdout:7/850: chown d3/da/d25/d9/d2f/d39/d43/d4f/d5b/d79/l117 4254591 1 2026-03-10T07:51:48.537 INFO:tasks.workunit.client.1.vm08.stdout:8/975: write d0/df/d17/d72/fe4 [1188831,12095] 0 2026-03-10T07:51:48.537 
INFO:tasks.workunit.client.1.vm08.stdout:9/865: mknod d2/d26/c129 0 2026-03-10T07:51:48.538 INFO:tasks.workunit.client.1.vm08.stdout:4/795: dwrite d5/da0/d95/de6/d48/d4f/d7c/dde/ddf/d10b/d31/fbe [0,4194304] 0 2026-03-10T07:51:48.540 INFO:tasks.workunit.client.1.vm08.stdout:7/851: mkdir d3/da/d25/d9/d2f/d4d/d120 0 2026-03-10T07:51:48.540 INFO:tasks.workunit.client.1.vm08.stdout:8/976: mkdir d0/df/d17/d72/d131 0 2026-03-10T07:51:48.540 INFO:tasks.workunit.client.1.vm08.stdout:7/852: chown d3/da/dbc 5212 1 2026-03-10T07:51:48.546 INFO:tasks.workunit.client.1.vm08.stdout:5/942: dread d0/d4/d19/d81/d92/f73 [0,4194304] 0 2026-03-10T07:51:48.548 INFO:tasks.workunit.client.1.vm08.stdout:9/866: link d2/d58/dbf/dd0/d35/f127 d2/d58/dbf/ddf/f12a 0 2026-03-10T07:51:48.550 INFO:tasks.workunit.client.1.vm08.stdout:0/895: mknod dd/d18/d100/d102/c123 0 2026-03-10T07:51:48.555 INFO:tasks.workunit.client.1.vm08.stdout:1/831: dread d2/d6/de/d1f/d26/d58/d8c/f44 [0,4194304] 0 2026-03-10T07:51:48.556 INFO:tasks.workunit.client.1.vm08.stdout:1/832: chown d2/d10/fd8 3810 1 2026-03-10T07:51:48.556 INFO:tasks.workunit.client.1.vm08.stdout:1/833: fdatasync d2/d6/de/d47/da0/f101 0 2026-03-10T07:51:48.557 INFO:tasks.workunit.client.1.vm08.stdout:1/834: fdatasync d2/d6/de/d1f/d40/d76/f79 0 2026-03-10T07:51:48.558 INFO:tasks.workunit.client.1.vm08.stdout:1/835: readlink d2/d6/de/d1f/d40/d76/ld4 0 2026-03-10T07:51:48.559 INFO:tasks.workunit.client.1.vm08.stdout:1/836: dread - d2/d10c/f112 zero size 2026-03-10T07:51:48.562 INFO:tasks.workunit.client.1.vm08.stdout:8/977: mknod d0/c132 0 2026-03-10T07:51:48.562 INFO:tasks.workunit.client.1.vm08.stdout:3/851: getdents d0/d3c/d18/d80/dc1/d108 0 2026-03-10T07:51:48.563 INFO:tasks.workunit.client.1.vm08.stdout:7/853: mkdir d3/da/d25/d9/d2f/d3a/d71/d8c/ddf/d121 0 2026-03-10T07:51:48.564 INFO:tasks.workunit.client.1.vm08.stdout:5/943: write d0/d8/d5e/d8e/f96 [5141380,6276] 0 2026-03-10T07:51:48.565 INFO:tasks.workunit.client.1.vm08.stdout:9/867: rename 
d2/d58/dbf/dd0/c5c to d2/d58/c12b 0 2026-03-10T07:51:48.567 INFO:tasks.workunit.client.1.vm08.stdout:0/896: truncate dd/d10/d14/d15/d20/d22/f6c 735669 0 2026-03-10T07:51:48.570 INFO:tasks.workunit.client.1.vm08.stdout:2/861: readlink d0/d1/d3/d39/d7d/d86/d55/l16 0 2026-03-10T07:51:48.571 INFO:tasks.workunit.client.1.vm08.stdout:6/879: dread d1/d3/df/d44/fa2 [0,4194304] 0 2026-03-10T07:51:48.573 INFO:tasks.workunit.client.1.vm08.stdout:2/862: truncate d0/d1/d3/d10/d38/daf/ffc 209173 0 2026-03-10T07:51:48.574 INFO:tasks.workunit.client.1.vm08.stdout:5/944: dwrite d0/d4/d19/d3a/df1/f122 [0,4194304] 0 2026-03-10T07:51:48.579 INFO:tasks.workunit.client.1.vm08.stdout:5/945: dwrite d0/d4/df/dbf/f64 [0,4194304] 0 2026-03-10T07:51:48.580 INFO:tasks.workunit.client.1.vm08.stdout:8/978: readlink d0/d69/lb1 0 2026-03-10T07:51:48.581 INFO:tasks.workunit.client.1.vm08.stdout:3/852: creat d0/d3c/d18/dec/d34/f111 x:0 0 0 2026-03-10T07:51:48.589 INFO:tasks.workunit.client.1.vm08.stdout:7/854: mknod d3/da/d25/d9/d2f/d3a/d71/c122 0 2026-03-10T07:51:48.589 INFO:tasks.workunit.client.1.vm08.stdout:9/868: creat d2/dda/f12c x:0 0 0 2026-03-10T07:51:48.591 INFO:tasks.workunit.client.1.vm08.stdout:0/897: symlink dd/d10/d14/d15/d20/d7a/dd2/l124 0 2026-03-10T07:51:48.596 INFO:tasks.workunit.client.1.vm08.stdout:6/880: fdatasync d1/d3/f2e 0 2026-03-10T07:51:48.597 INFO:tasks.workunit.client.1.vm08.stdout:7/855: dwrite d3/da/d25/d9/d2f/d3a/d71/d8c/ddf/ff6 [0,4194304] 0 2026-03-10T07:51:48.602 INFO:tasks.workunit.client.1.vm08.stdout:2/863: rename d0/d1/d3/d10/d38/d100/d101 to d0/d1/d3/d56/d78/de4/d117 0 2026-03-10T07:51:48.606 INFO:tasks.workunit.client.1.vm08.stdout:7/856: write d3/da/d25/d9/d2f/d3a/d4b/f8d [3967020,69825] 0 2026-03-10T07:51:48.607 INFO:tasks.workunit.client.1.vm08.stdout:7/857: write d3/da/d25/d9/d2f/d3a/d71/d8c/ddf/ff6 [3768849,63816] 0 2026-03-10T07:51:48.607 INFO:tasks.workunit.client.1.vm08.stdout:7/858: chown d3/l96 104740 1 2026-03-10T07:51:48.611 
INFO:tasks.workunit.client.1.vm08.stdout:4/796: write d5/da0/d95/de6/d48/fb9 [424850,120008] 0 2026-03-10T07:51:48.612 INFO:tasks.workunit.client.1.vm08.stdout:5/946: symlink d0/d8/d5e/d8e/l137 0 2026-03-10T07:51:48.614 INFO:tasks.workunit.client.1.vm08.stdout:4/797: chown d5/da0/d12/def 1 1 2026-03-10T07:51:48.617 INFO:tasks.workunit.client.1.vm08.stdout:9/869: creat d2/d58/dbf/dd0/d35/d97/d9d/df4/dee/d84/dca/dd7/d10d/f12d x:0 0 0 2026-03-10T07:51:48.618 INFO:tasks.workunit.client.1.vm08.stdout:2/864: dread d0/d1/d3/d56/d78/f62 [4194304,4194304] 0 2026-03-10T07:51:48.623 INFO:tasks.workunit.client.1.vm08.stdout:9/870: write d2/d58/dbf/dd0/d35/d97/d9d/df4/d28/f107 [147035,46148] 0 2026-03-10T07:51:48.623 INFO:tasks.workunit.client.1.vm08.stdout:6/881: mkdir d1/d3/df/d1d/d40/d45/d10c/d129 0 2026-03-10T07:51:48.627 INFO:tasks.workunit.client.1.vm08.stdout:7/859: readlink d3/da/d25/d9/d2f/d39/d43/d4f/d5b/db7/l4c 0 2026-03-10T07:51:48.628 INFO:tasks.workunit.client.1.vm08.stdout:3/853: rmdir d0/d3c/d1f/d89 39 2026-03-10T07:51:48.633 INFO:tasks.workunit.client.1.vm08.stdout:3/854: creat d0/d3c/d18/dec/d34/f112 x:0 0 0 2026-03-10T07:51:48.639 INFO:tasks.workunit.client.1.vm08.stdout:7/860: mknod d3/da/d25/d9/d2f/d3a/d4b/d67/dea/c123 0 2026-03-10T07:51:48.639 INFO:tasks.workunit.client.1.vm08.stdout:5/947: sync 2026-03-10T07:51:48.639 INFO:tasks.workunit.client.1.vm08.stdout:5/948: dread d0/d8/d24/de2/fe6 [0,4194304] 0 2026-03-10T07:51:48.639 INFO:tasks.workunit.client.1.vm08.stdout:5/949: symlink d0/d77/l138 0 2026-03-10T07:51:48.640 INFO:tasks.workunit.client.1.vm08.stdout:1/837: write d2/d6/de/d1f/d26/f62 [1304937,42197] 0 2026-03-10T07:51:48.640 INFO:tasks.workunit.client.1.vm08.stdout:3/855: fdatasync d0/d3c/d18/dec/d34/f111 0 2026-03-10T07:51:48.647 INFO:tasks.workunit.client.1.vm08.stdout:7/861: dwrite d3/da/d25/d9/d2f/d3a/d71/d8c/ddf/f110 [0,4194304] 0 2026-03-10T07:51:48.651 INFO:tasks.workunit.client.1.vm08.stdout:5/950: unlink d0/d4/df/d12/f97 0 
2026-03-10T07:51:48.651 INFO:tasks.workunit.client.1.vm08.stdout:7/862: stat d3/da/d25/d9/d2f/d4d/db6/fc6 0 2026-03-10T07:51:48.651 INFO:tasks.workunit.client.1.vm08.stdout:7/863: stat d3/da/d8a/f118 0 2026-03-10T07:51:48.662 INFO:tasks.workunit.client.1.vm08.stdout:3/856: dwrite d0/d3c/d18/d32/d61/d83/fda [0,4194304] 0 2026-03-10T07:51:48.667 INFO:tasks.workunit.client.1.vm08.stdout:5/951: mkdir d0/d8/dce/dd2/d139 0 2026-03-10T07:51:48.667 INFO:tasks.workunit.client.1.vm08.stdout:0/898: dread dd/d10/fb8 [0,4194304] 0 2026-03-10T07:51:48.668 INFO:tasks.workunit.client.1.vm08.stdout:0/899: readlink dd/d10/d14/d15/l17 0 2026-03-10T07:51:48.672 INFO:tasks.workunit.client.1.vm08.stdout:7/864: rename d3/da/d25/d9/d2f/d3a/d71/d8c/ddf/d121 to d3/da/d25/d9/d2f/d39/d43/d4f/d5b/db7/d107/d124 0 2026-03-10T07:51:48.672 INFO:tasks.workunit.client.1.vm08.stdout:5/952: read d0/d4/d19/d60/d6d/d70/dc5/fd3 [128794,31053] 0 2026-03-10T07:51:48.674 INFO:tasks.workunit.client.1.vm08.stdout:7/865: chown d3/da/d25/d9/d2f/f97 0 1 2026-03-10T07:51:48.676 INFO:tasks.workunit.client.1.vm08.stdout:0/900: mkdir dd/d18/d100/d102/dc1/d125 0 2026-03-10T07:51:48.677 INFO:tasks.workunit.client.1.vm08.stdout:6/882: dread d1/d3/df/d1d/d40/d45/fbb [0,4194304] 0 2026-03-10T07:51:48.678 INFO:tasks.workunit.client.1.vm08.stdout:7/866: chown d3/da/d25/d9/d2f/d4d/fb9 3238269 1 2026-03-10T07:51:48.678 INFO:tasks.workunit.client.1.vm08.stdout:7/867: chown d3/f104 20482686 1 2026-03-10T07:51:48.681 INFO:tasks.workunit.client.1.vm08.stdout:5/953: sync 2026-03-10T07:51:48.682 INFO:tasks.workunit.client.1.vm08.stdout:5/954: chown d0/d8/d24 880283993 1 2026-03-10T07:51:48.683 INFO:tasks.workunit.client.1.vm08.stdout:7/868: dwrite d3/da/d25/d9/d2f/d39/d43/d4f/d5b/db7/dac/fd3 [0,4194304] 0 2026-03-10T07:51:48.685 INFO:tasks.workunit.client.1.vm08.stdout:8/979: write d0/d69/d77/f87 [259886,130731] 0 2026-03-10T07:51:48.685 INFO:tasks.workunit.client.1.vm08.stdout:4/798: write d5/da0/f5a [1824076,36956] 0 
2026-03-10T07:51:48.685 INFO:tasks.workunit.client.1.vm08.stdout:9/871: write d2/d58/dbf/dd0/d35/d97/d9d/df4/f4d [3349318,89408] 0 2026-03-10T07:51:48.696 INFO:tasks.workunit.client.1.vm08.stdout:2/865: dwrite d0/d1/d3/d10/fb4 [0,4194304] 0 2026-03-10T07:51:48.697 INFO:tasks.workunit.client.1.vm08.stdout:6/883: unlink d1/d17/d2b/f128 0 2026-03-10T07:51:48.699 INFO:tasks.workunit.client.1.vm08.stdout:5/955: mkdir d0/d8/d5e/d8e/d13a 0 2026-03-10T07:51:48.701 INFO:tasks.workunit.client.1.vm08.stdout:9/872: truncate d2/fba 985832 0 2026-03-10T07:51:48.705 INFO:tasks.workunit.client.1.vm08.stdout:7/869: creat d3/da/d25/d9/d2f/d39/d43/d4f/d5b/db7/dac/f125 x:0 0 0 2026-03-10T07:51:48.708 INFO:tasks.workunit.client.1.vm08.stdout:8/980: dwrite d0/df/d15/d23/da8/ff5 [0,4194304] 0 2026-03-10T07:51:48.729 INFO:tasks.workunit.client.1.vm08.stdout:1/838: write d2/d6/de/d1f/d26/f48 [1399430,87748] 0 2026-03-10T07:51:48.729 INFO:tasks.workunit.client.1.vm08.stdout:1/839: write d2/d6/d3a/f108 [307687,27345] 0 2026-03-10T07:51:48.731 INFO:tasks.workunit.client.1.vm08.stdout:4/799: mkdir d5/da0/d95/de6/d48/d4f/d7c/dde/ddf/d10b/dad/db8/d10c 0 2026-03-10T07:51:48.731 INFO:tasks.workunit.client.1.vm08.stdout:3/857: creat d0/d3c/d18/d48/d55/f113 x:0 0 0 2026-03-10T07:51:48.732 INFO:tasks.workunit.client.1.vm08.stdout:3/858: fdatasync d0/d3c/d18/dec/f109 0 2026-03-10T07:51:48.735 INFO:tasks.workunit.client.1.vm08.stdout:6/884: rmdir d1/d17/d2b/d58/d76 39 2026-03-10T07:51:48.739 INFO:tasks.workunit.client.1.vm08.stdout:4/800: sync 2026-03-10T07:51:48.746 INFO:tasks.workunit.client.1.vm08.stdout:2/866: dwrite d0/d1/d3/d56/d78/ff8 [0,4194304] 0 2026-03-10T07:51:48.748 INFO:tasks.workunit.client.1.vm08.stdout:2/867: chown d0/d1/d3/d39/d7d/d86/d55/cf 101 1 2026-03-10T07:51:48.758 INFO:tasks.workunit.client.1.vm08.stdout:2/868: truncate d0/d1/d3/d56/d78/dad/db1/f103 144446 0 2026-03-10T07:51:48.767 INFO:tasks.workunit.client.1.vm08.stdout:2/869: chown d0/d1/d3/d56/d10f 1 1 
2026-03-10T07:51:48.767 INFO:tasks.workunit.client.1.vm08.stdout:0/901: dread dd/d10/d2f/d37/d64/d95/d5c/dca/ddb/fe7 [0,4194304] 0
2026-03-10T07:51:48.767 INFO:tasks.workunit.client.1.vm08.stdout:5/956: dwrite d0/d4/d19/d60/d6d/d70/d40/dba/fb7 [0,4194304] 0
2026-03-10T07:51:48.775 INFO:tasks.workunit.client.1.vm08.stdout:5/957: dwrite d0/d8/d24/f47 [0,4194304] 0
2026-03-10T07:51:48.777 INFO:tasks.workunit.client.1.vm08.stdout:6/885: creat d1/d3/d3e/dff/d105/f12a x:0 0 0
2026-03-10T07:51:48.779 INFO:tasks.workunit.client.1.vm08.stdout:2/870: dwrite d0/d1/d17/d6b/fe3 [0,4194304] 0
2026-03-10T07:51:48.785 INFO:tasks.workunit.client.1.vm08.stdout:9/873: write d2/d58/dbf/dd0/d35/d97/d9d/df4/dee/d84/dca/dd7/fef [282289,13127] 0
2026-03-10T07:51:48.807 INFO:tasks.workunit.client.1.vm08.stdout:8/981: symlink d0/df/d15/d53/d12e/l133 0
2026-03-10T07:51:48.808 INFO:tasks.workunit.client.1.vm08.stdout:8/982: truncate d0/f2a 5291525 0
2026-03-10T07:51:48.813 INFO:tasks.workunit.client.1.vm08.stdout:1/840: symlink d2/d6/d3a/l11e 0
2026-03-10T07:51:48.820 INFO:tasks.workunit.client.1.vm08.stdout:3/859: creat d0/d3c/d18/da9/dcc/d105/f114 x:0 0 0
2026-03-10T07:51:48.820 INFO:tasks.workunit.client.1.vm08.stdout:3/860: chown d0/d3c/d18/dec/d34/c5e 61 1
2026-03-10T07:51:48.826 INFO:tasks.workunit.client.1.vm08.stdout:5/958: fsync d0/d4/d19/d81/d92/f78 0
2026-03-10T07:51:48.830 INFO:tasks.workunit.client.1.vm08.stdout:8/983: dread d0/df/d15/d23/d54/dba/d89/dbf/fe5 [0,4194304] 0
2026-03-10T07:51:48.834 INFO:tasks.workunit.client.1.vm08.stdout:6/886: dwrite d1/d3/d3e/f56 [4194304,4194304] 0
2026-03-10T07:51:48.835 INFO:tasks.workunit.client.1.vm08.stdout:6/887: chown d1/d46/l10a 315 1
2026-03-10T07:51:48.835 INFO:tasks.workunit.client.1.vm08.stdout:6/888: chown d1/d46/l10a 30147564 1
2026-03-10T07:51:48.843 INFO:tasks.workunit.client.1.vm08.stdout:0/902: mkdir dd/d10/d2f/d37/d64/d11e/d126 0
2026-03-10T07:51:48.844 INFO:tasks.workunit.client.1.vm08.stdout:1/841: mkdir d2/d6/de/d1f/d40/d11f 0
2026-03-10T07:51:48.845 INFO:tasks.workunit.client.1.vm08.stdout:0/903: dread dd/d10/dbd/fbe [0,4194304] 0
2026-03-10T07:51:48.850 INFO:tasks.workunit.client.1.vm08.stdout:2/871: fdatasync d0/d1/d17/d6b/f9a 0
2026-03-10T07:51:48.850 INFO:tasks.workunit.client.1.vm08.stdout:2/872: chown d0/d1/d3/d10/d38/f53 1884763793 1
2026-03-10T07:51:48.851 INFO:tasks.workunit.client.1.vm08.stdout:8/984: creat d0/df/d15/d23/d39/d5b/d4a/f134 x:0 0 0
2026-03-10T07:51:48.855 INFO:tasks.workunit.client.1.vm08.stdout:7/870: link d3/da/d25/d9/d2f/d6c/cbe d3/da/d25/d9/d2f/dfb/c126 0
2026-03-10T07:51:48.858 INFO:tasks.workunit.client.1.vm08.stdout:1/842: rename d2/d6/de/d47/da0/c102 to d2/d6/de/d1f/d26/d58/c120 0
2026-03-10T07:51:48.864 INFO:tasks.workunit.client.1.vm08.stdout:4/801: write f1 [116835,59482] 0
2026-03-10T07:51:48.865 INFO:tasks.workunit.client.1.vm08.stdout:9/874: write d2/d58/dbf/dd0/d35/d97/d9d/df4/d28/d98/fd4 [5240527,45607] 0
2026-03-10T07:51:48.869 INFO:tasks.workunit.client.1.vm08.stdout:3/861: write d0/d3c/d18/f23 [8926529,18741] 0
2026-03-10T07:51:48.870 INFO:tasks.workunit.client.1.vm08.stdout:0/904: symlink dd/d10/dbd/d10d/l127 0
2026-03-10T07:51:48.871 INFO:tasks.workunit.client.1.vm08.stdout:9/875: dwrite d2/d58/dbf/dd0/d35/d97/dd5/fe6 [4194304,4194304] 0
2026-03-10T07:51:48.875 INFO:tasks.workunit.client.1.vm08.stdout:0/905: read f8 [3846087,122940] 0
2026-03-10T07:51:48.881 INFO:tasks.workunit.client.1.vm08.stdout:3/862: dread d0/d3c/d18/dec/f65 [0,4194304] 0
2026-03-10T07:51:48.886 INFO:tasks.workunit.client.1.vm08.stdout:3/863: dwrite d0/d3c/d18/dec/f4d [0,4194304] 0
2026-03-10T07:51:48.893 INFO:tasks.workunit.client.1.vm08.stdout:8/985: rmdir d0/df/d15/d23/d39/d5b/dbc 39
2026-03-10T07:51:48.894 INFO:tasks.workunit.client.1.vm08.stdout:5/959: write d0/d4/f124 [237195,29177] 0
2026-03-10T07:51:48.899 INFO:tasks.workunit.client.1.vm08.stdout:1/843: symlink d2/d6/d50/l121 0
2026-03-10T07:51:48.902 INFO:tasks.workunit.client.1.vm08.stdout:1/844: stat d2/d6/de/d47/da0/f103 0
2026-03-10T07:51:48.902 INFO:tasks.workunit.client.1.vm08.stdout:4/802: creat d5/da0/d95/de6/d48/d4f/d7c/dde/ddf/d10b/dad/dcf/f10d x:0 0 0
2026-03-10T07:51:48.903 INFO:tasks.workunit.client.1.vm08.stdout:0/906: symlink dd/d10/d14/d15/d20/d5f/l128 0
2026-03-10T07:51:48.904 INFO:tasks.workunit.client.1.vm08.stdout:0/907: chown dd/d10/d14/d15/d20/d5f/fa2 1 1
2026-03-10T07:51:48.908 INFO:tasks.workunit.client.1.vm08.stdout:7/871: dread d3/da/f1d [4194304,4194304] 0
2026-03-10T07:51:48.908 INFO:tasks.workunit.client.1.vm08.stdout:2/873: unlink d0/d1/d3/d39/d7d/d7e/c8f 0
2026-03-10T07:51:48.911 INFO:tasks.workunit.client.1.vm08.stdout:2/874: truncate d0/d1/d17/f1a 7049041 0
2026-03-10T07:51:48.911 INFO:tasks.workunit.client.1.vm08.stdout:7/872: dread - d3/da/d25/d9/d2f/d3a/d40/d54/db5/f100 zero size
2026-03-10T07:51:48.912 INFO:tasks.workunit.client.1.vm08.stdout:3/864: truncate d0/d3c/d18/d32/d61/d52/f70 3936265 0
2026-03-10T07:51:48.913 INFO:tasks.workunit.client.1.vm08.stdout:3/865: truncate d0/d3c/d18/fa5 4207819 0
2026-03-10T07:51:48.913 INFO:tasks.workunit.client.1.vm08.stdout:8/986: truncate d0/df/d15/d23/f3d 39146 0
2026-03-10T07:51:48.915 INFO:tasks.workunit.client.1.vm08.stdout:8/987: chown d0/df/d15/d23/d39/d5b/d4a/lb8 490503315 1
2026-03-10T07:51:48.916 INFO:tasks.workunit.client.1.vm08.stdout:6/889: rename d1/d3/df/d52/lf8 to d1/d17/d2b/d5e/dcb/l12b 0
2026-03-10T07:51:48.916 INFO:tasks.workunit.client.1.vm08.stdout:1/845: read - d2/d10/f99 zero size
2026-03-10T07:51:48.926 INFO:tasks.workunit.client.1.vm08.stdout:2/875: fdatasync d0/d1/d3/d39/d7d/d86/d55/dc9/fd1 0
2026-03-10T07:51:48.932 INFO:tasks.workunit.client.1.vm08.stdout:3/866: mkdir d0/d3c/d18/d80/d115 0
2026-03-10T07:51:48.938 INFO:tasks.workunit.client.1.vm08.stdout:4/803: rename d5/d8/fc to d5/da0/de2/f10e 0
2026-03-10T07:51:48.944 INFO:tasks.workunit.client.1.vm08.stdout:6/890: symlink d1/d3/df/d1d/d40/d87/l12c 0
2026-03-10T07:51:48.945 INFO:tasks.workunit.client.1.vm08.stdout:1/846: symlink d2/d10/dd7/l122 0
2026-03-10T07:51:48.968 INFO:tasks.workunit.client.1.vm08.stdout:2/876: creat d0/d1/d3/d39/d7d/d86/d55/db9/f118 x:0 0 0
2026-03-10T07:51:48.980 INFO:tasks.workunit.client.1.vm08.stdout:9/876: truncate d2/fd 309077 0
2026-03-10T07:51:48.980 INFO:tasks.workunit.client.1.vm08.stdout:9/877: write d2/d58/dbf/d2b/f37 [1172908,72999] 0
2026-03-10T07:51:48.981 INFO:tasks.workunit.client.1.vm08.stdout:0/908: write dd/d18/f25 [1517474,52532] 0
2026-03-10T07:51:48.985 INFO:tasks.workunit.client.1.vm08.stdout:0/909: read dd/d10/d14/d15/f84 [2965882,39696] 0
2026-03-10T07:51:48.985 INFO:tasks.workunit.client.1.vm08.stdout:7/873: dwrite d3/da/d25/d9/d2f/f62 [4194304,4194304] 0
2026-03-10T07:51:48.987 INFO:tasks.workunit.client.1.vm08.stdout:0/910: stat dd/d10/d2f/d37/d64/d95/d58/fb2 0
2026-03-10T07:51:49.001 INFO:tasks.workunit.client.1.vm08.stdout:5/960: truncate d0/d4/df/dbf/f64 1637618 0
2026-03-10T07:51:49.011 INFO:tasks.workunit.client.1.vm08.stdout:2/877: fsync d0/d1/d3/d39/de2/d107/d111/f26 0
2026-03-10T07:51:49.016 INFO:tasks.workunit.client.1.vm08.stdout:2/878: dwrite d0/d1/d3/d10/d38/d100/f102 [0,4194304] 0
2026-03-10T07:51:49.021 INFO:tasks.workunit.client.1.vm08.stdout:2/879: dread d0/d1/d3/d56/d57/f5b [0,4194304] 0
2026-03-10T07:51:49.062 INFO:tasks.workunit.client.1.vm08.stdout:0/911: rmdir dd/d18/d100/d102/dc1/ded 39
2026-03-10T07:51:49.064 INFO:tasks.workunit.client.1.vm08.stdout:7/874: mknod d3/da/d25/d9/d2f/d3a/d71/d8c/ddf/c127 0
2026-03-10T07:51:49.064 INFO:tasks.workunit.client.1.vm08.stdout:3/867: write d0/d3c/d18/d48/fc5 [205822,107551] 0
2026-03-10T07:51:49.065 INFO:tasks.workunit.client.1.vm08.stdout:5/961: creat d0/d77/d83/de0/f13b x:0 0 0
2026-03-10T07:51:49.086 INFO:tasks.workunit.client.1.vm08.stdout:4/804: dwrite d5/f54 [0,4194304] 0
2026-03-10T07:51:49.091 INFO:tasks.workunit.client.1.vm08.stdout:8/988: getdents d0 0
2026-03-10T07:51:49.094 INFO:tasks.workunit.client.1.vm08.stdout:6/891: dwrite d1/d17/d2b/fa7 [0,4194304] 0
2026-03-10T07:51:49.099 INFO:tasks.workunit.client.1.vm08.stdout:1/847: write d2/d6/d9f/fa7 [569476,42066] 0
2026-03-10T07:51:49.100 INFO:tasks.workunit.client.1.vm08.stdout:9/878: truncate d2/d58/dbf/dd0/d35/d97/d9d/df4/dee/f5f 551934 0
2026-03-10T07:51:49.104 INFO:tasks.workunit.client.1.vm08.stdout:7/875: creat d3/da/d25/d9/d2f/d4d/db6/f128 x:0 0 0
2026-03-10T07:51:49.135 INFO:tasks.workunit.client.1.vm08.stdout:3/868: dread d0/d3c/d18/d32/d61/d52/f7f [0,4194304] 0
2026-03-10T07:51:49.136 INFO:tasks.workunit.client.1.vm08.stdout:3/869: write d0/d3c/d18/f23 [5925998,27135] 0
2026-03-10T07:51:49.138 INFO:tasks.workunit.client.1.vm08.stdout:4/805: truncate d5/da0/d95/de6/d48/d4f/d7c/dde/ddf/d10b/d31/f4d 2436420 0
2026-03-10T07:51:49.142 INFO:tasks.workunit.client.1.vm08.stdout:2/880: fdatasync d0/d1/d3/d39/d7d/d86/f109 0
2026-03-10T07:51:49.146 INFO:tasks.workunit.client.1.vm08.stdout:8/989: mkdir d0/df/d15/d23/d54/dba/d11c/d135 0
2026-03-10T07:51:49.148 INFO:tasks.workunit.client.1.vm08.stdout:6/892: creat d1/d3/d3e/dff/d105/f12d x:0 0 0
2026-03-10T07:51:49.148 INFO:tasks.workunit.client.1.vm08.stdout:1/848: rmdir d2/d6/d3a 39
2026-03-10T07:51:49.155 INFO:tasks.workunit.client.1.vm08.stdout:0/912: creat dd/d10/d2f/d37/d64/d95/d58/d3d/d9b/f129 x:0 0 0
2026-03-10T07:51:49.158 INFO:tasks.workunit.client.1.vm08.stdout:7/876: rmdir d3/da/d25/d9/d2f/d3a/d4b/d67 39
2026-03-10T07:51:49.167 INFO:tasks.workunit.client.1.vm08.stdout:3/870: truncate d0/d3c/d18/d48/d55/d56/fbc 381405 0
2026-03-10T07:51:49.188 INFO:tasks.workunit.client.1.vm08.stdout:9/879: mknod d2/d58/dbf/dd0/d35/d97/d9d/df4/dee/c12e 0
2026-03-10T07:51:49.188 INFO:tasks.workunit.client.1.vm08.stdout:0/913: rmdir dd/d10/d2f/d37/d64/d95/d5c/dca/ddb 39
2026-03-10T07:51:49.190 INFO:tasks.workunit.client.1.vm08.stdout:5/962: rmdir d0/d4/df/dbf/d41/de8 0
2026-03-10T07:51:49.191 INFO:tasks.workunit.client.1.vm08.stdout:7/877: creat d3/da/d25/d9/d2f/d39/d43/d4f/d5b/d79/ddd/f129 x:0 0 0
2026-03-10T07:51:49.194 INFO:tasks.workunit.client.1.vm08.stdout:4/806: link d5/da0/d95/de6/d48/d4f/d7c/dde/ddf/d10b/f37 d5/da0/d12/def/f10f 0
2026-03-10T07:51:49.198 INFO:tasks.workunit.client.1.vm08.stdout:8/990: write d0/d69/fd9 [702435,118964] 0
2026-03-10T07:51:49.202 INFO:tasks.workunit.client.1.vm08.stdout:3/871: dread d0/d3c/d1f/d44/f8c [4194304,4194304] 0
2026-03-10T07:51:49.203 INFO:tasks.workunit.client.1.vm08.stdout:3/872: readlink d0/d3c/d18/d48/d55/lde 0
2026-03-10T07:51:49.206 INFO:tasks.workunit.client.1.vm08.stdout:0/914: mknod dd/d10/d2f/d37/d64/d52/c12a 0
2026-03-10T07:51:49.207 INFO:tasks.workunit.client.1.vm08.stdout:0/915: fsync dd/d10/d2f/d37/d64/d95/d58/d3d/fa1 0
2026-03-10T07:51:49.211 INFO:tasks.workunit.client.1.vm08.stdout:5/963: chown d0/d4/df/dbf/daf/cf4 15951 1
2026-03-10T07:51:49.211 INFO:tasks.workunit.client.1.vm08.stdout:6/893: truncate d1/d3/ff6 3294586 0
2026-03-10T07:51:49.212 INFO:tasks.workunit.client.1.vm08.stdout:6/894: truncate d1/d3/df/d1d/d40/d45/ff7 116863 0
2026-03-10T07:51:49.213 INFO:tasks.workunit.client.1.vm08.stdout:7/878: rename d3/da/d25/d9/d2f/l3e to d3/da/d25/d9/d2f/d6c/l12a 0
2026-03-10T07:51:49.213 INFO:tasks.workunit.client.1.vm08.stdout:6/895: chown d1/d17/d2b/d58/d77/f101 0 1
2026-03-10T07:51:49.214 INFO:tasks.workunit.client.1.vm08.stdout:2/881: link d0/d1/d17/db2/d9c/lf7 d0/d1/d3/d39/d7d/d86/d55/db9/l119 0
2026-03-10T07:51:49.215 INFO:tasks.workunit.client.1.vm08.stdout:4/807: creat d5/da0/d95/de6/d48/d4f/d7c/dde/ddf/d10b/dad/f110 x:0 0 0
2026-03-10T07:51:49.216 INFO:tasks.workunit.client.1.vm08.stdout:2/882: dread d0/d1/d17/db2/d9c/fbc [0,4194304] 0
2026-03-10T07:51:49.224 INFO:tasks.workunit.client.1.vm08.stdout:7/879: dread d3/da/d25/d9/d2f/f42 [0,4194304] 0
2026-03-10T07:51:49.231 INFO:tasks.workunit.client.1.vm08.stdout:3/873: creat d0/d3c/d18/d32/daa/ded/f116 x:0 0 0
2026-03-10T07:51:49.232 INFO:tasks.workunit.client.1.vm08.stdout:1/849: write d2/d6/de/d71/fc9 [637826,112194] 0
2026-03-10T07:51:49.233 INFO:tasks.workunit.client.1.vm08.stdout:1/850: readlink d2/d6/de/d47/dbd/dc3/l10e 0
2026-03-10T07:51:49.242 INFO:tasks.workunit.client.1.vm08.stdout:4/808: mknod d5/d8/d89/c111 0
2026-03-10T07:51:49.244 INFO:tasks.workunit.client.1.vm08.stdout:6/896: write d1/d3/df/d1d/f9d [4701294,129410] 0
2026-03-10T07:51:49.246 INFO:tasks.workunit.client.1.vm08.stdout:2/883: creat d0/d1/d3/d10/f11a x:0 0 0
2026-03-10T07:51:49.247 INFO:tasks.workunit.client.1.vm08.stdout:8/991: unlink d0/df/d15/c1c 0
2026-03-10T07:51:49.250 INFO:tasks.workunit.client.1.vm08.stdout:9/880: creat d2/d58/dbf/dd0/d35/d97/d9d/df4/dee/f12f x:0 0 0
2026-03-10T07:51:49.251 INFO:tasks.workunit.client.1.vm08.stdout:5/964: mknod d0/d4/d19/d81/c13c 0
2026-03-10T07:51:49.252 INFO:tasks.workunit.client.1.vm08.stdout:0/916: rename dd/d10/ffe to dd/d10/d2f/d37/d64/d95/f12b 0
2026-03-10T07:51:49.255 INFO:tasks.workunit.client.1.vm08.stdout:3/874: dwrite d0/d3c/d1f/d89/fff [0,4194304] 0
2026-03-10T07:51:49.260 INFO:tasks.workunit.client.1.vm08.stdout:4/809: fsync d5/d8/ff 0
2026-03-10T07:51:49.261 INFO:tasks.workunit.client.1.vm08.stdout:6/897: dread d1/db/fc1 [4194304,4194304] 0
2026-03-10T07:51:49.262 INFO:tasks.workunit.client.1.vm08.stdout:3/875: read - d0/d3c/d18/d32/ff3 zero size
2026-03-10T07:51:49.265 INFO:tasks.workunit.client.1.vm08.stdout:8/992: mknod d0/df/d15/d9c/c136 0
2026-03-10T07:51:49.265 INFO:tasks.workunit.client.1.vm08.stdout:4/810: rename d5/l88 to d5/da0/d95/de6/d48/d4f/d7c/dde/ddf/d10b/d70/l112 0
2026-03-10T07:51:49.267 INFO:tasks.workunit.client.1.vm08.stdout:0/917: dread dd/d10/d2f/f4c [0,4194304] 0
2026-03-10T07:51:49.268 INFO:tasks.workunit.client.1.vm08.stdout:7/880: creat d3/da/d25/d9/d2f/d39/d43/d4f/d5b/f12b x:0 0 0
2026-03-10T07:51:49.269 INFO:tasks.workunit.client.1.vm08.stdout:5/965: dwrite d0/d8/d5e/d8e/f96 [0,4194304] 0
2026-03-10T07:51:49.270 INFO:tasks.workunit.client.1.vm08.stdout:3/876: write d0/d3c/d18/d32/d61/d52/dca/ff9 [890249,6968] 0
2026-03-10T07:51:49.270 INFO:tasks.workunit.client.1.vm08.stdout:1/851: creat d2/d6/de/f123 x:0 0 0
2026-03-10T07:51:49.278 INFO:tasks.workunit.client.1.vm08.stdout:6/898: rename d1/db/f4e to d1/d3/df/d44/f12e 0
2026-03-10T07:51:49.278 INFO:tasks.workunit.client.1.vm08.stdout:8/993: dwrite d0/df/d17/d25/f120 [0,4194304] 0
2026-03-10T07:51:49.282 INFO:tasks.workunit.client.1.vm08.stdout:4/811: dwrite d5/da0/d95/de6/d48/fe9 [0,4194304] 0
2026-03-10T07:51:49.282 INFO:tasks.workunit.client.1.vm08.stdout:8/994: stat d0/df/d15/d23/d54/dba/d89/dbf 0
2026-03-10T07:51:49.282 INFO:tasks.workunit.client.1.vm08.stdout:9/881: sync
2026-03-10T07:51:49.287 INFO:tasks.workunit.client.1.vm08.stdout:4/812: dwrite d5/d8/f68 [0,4194304] 0
2026-03-10T07:51:49.288 INFO:tasks.workunit.client.1.vm08.stdout:4/813: write d5/da0/d12/def/f102 [629642,16963] 0
2026-03-10T07:51:49.288 INFO:tasks.workunit.client.1.vm08.stdout:5/966: fdatasync d0/d4/df/dbf/f25 0
2026-03-10T07:51:49.288 INFO:tasks.workunit.client.1.vm08.stdout:1/852: mknod d2/d6/d9f/c124 0
2026-03-10T07:51:49.289 INFO:tasks.workunit.client.1.vm08.stdout:3/877: mkdir d0/d3c/d18/d32/d61/d52/dca/dd2/d117 0
2026-03-10T07:51:49.295 INFO:tasks.workunit.client.1.vm08.stdout:2/884: dread d0/d1/d17/db2/dde/d116/f7c [0,4194304] 0
2026-03-10T07:51:49.301 INFO:tasks.workunit.client.1.vm08.stdout:0/918: mknod dd/d10/d2f/d37/d64/d52/da9/c12c 0
2026-03-10T07:51:49.301 INFO:tasks.workunit.client.1.vm08.stdout:8/995: chown d0/df/ca4 214 1
2026-03-10T07:51:49.310 INFO:tasks.workunit.client.1.vm08.stdout:4/814: read d5/d8/ff [4219823,35268] 0
2026-03-10T07:51:49.310 INFO:tasks.workunit.client.1.vm08.stdout:5/967: fsync d0/d4/d19/d43/f7c 0
2026-03-10T07:51:49.313 INFO:tasks.workunit.client.1.vm08.stdout:4/815: fsync d5/da0/d95/de6/d48/d4f/d7c/dde/ddf/d10b/dad/fc3 0
2026-03-10T07:51:49.314 INFO:tasks.workunit.client.1.vm08.stdout:9/882: dwrite d2/d58/dbf/dd0/f5d [0,4194304] 0
2026-03-10T07:51:49.314 INFO:tasks.workunit.client.1.vm08.stdout:0/919: unlink dd/d10/d2f/d37/d64/d95/d58/fd9 0
2026-03-10T07:51:49.314 INFO:tasks.workunit.client.1.vm08.stdout:8/996: dwrite d0/df/f10e [0,4194304] 0
2026-03-10T07:51:49.318 INFO:tasks.workunit.client.1.vm08.stdout:7/881: link d3/cc d3/da/c12c 0
2026-03-10T07:51:49.318 INFO:tasks.workunit.client.1.vm08.stdout:1/853: mkdir d2/d6/de/d1f/d26/d58/d83/d104/d125 0
2026-03-10T07:51:49.318 INFO:tasks.workunit.client.1.vm08.stdout:4/816: symlink d5/da0/d95/de6/da7/l113 0
2026-03-10T07:51:49.325 INFO:tasks.workunit.client.1.vm08.stdout:2/885: symlink d0/d1/d3/d56/d78/dad/db1/l11b 0
2026-03-10T07:51:49.325 INFO:tasks.workunit.client.1.vm08.stdout:4/817: read - d5/da0/d95/de6/d48/d4f/d8d/d91/dd5/f100 zero size
2026-03-10T07:51:49.325 INFO:tasks.workunit.client.1.vm08.stdout:0/920: dwrite dd/d10/d2f/d37/d64/f104 [0,4194304] 0
2026-03-10T07:51:49.325 INFO:tasks.workunit.client.1.vm08.stdout:9/883: dread - d2/d58/dbf/dd0/d35/d97/d9d/df4/dee/d84/dca/dd7/d10d/f12d zero size
2026-03-10T07:51:49.330 INFO:tasks.workunit.client.1.vm08.stdout:8/997: mkdir d0/df/d15/d23/da8/d137 0
2026-03-10T07:51:49.331 INFO:tasks.workunit.client.1.vm08.stdout:5/968: mknod d0/d4/df/dbf/c13d 0
2026-03-10T07:51:49.335 INFO:tasks.workunit.client.1.vm08.stdout:4/818: write d5/da0/d95/de6/d48/d4f/fe5 [628242,74976] 0
2026-03-10T07:51:49.338 INFO:tasks.workunit.client.1.vm08.stdout:7/882: rename d3/da/f17 to d3/da/d25/d9/d2f/d3a/d71/d8c/f12d 0
2026-03-10T07:51:49.347 INFO:tasks.workunit.client.1.vm08.stdout:0/921: creat dd/d10/d14/d15/dad/f12d x:0 0 0
2026-03-10T07:51:49.352 INFO:tasks.workunit.client.1.vm08.stdout:2/886: rename d0/fca to d0/d1/d3/d39/de2/d107/d111/f11c 0
2026-03-10T07:51:49.356 INFO:tasks.workunit.client.1.vm08.stdout:0/922: unlink dd/d10/d2f/d37/d64/d95/d58/f10a 0
2026-03-10T07:51:49.357 INFO:tasks.workunit.client.1.vm08.stdout:8/998: sync
2026-03-10T07:51:49.362 INFO:tasks.workunit.client.1.vm08.stdout:8/999: dread d0/df/d15/d23/da8/ff5 [0,4194304] 0
2026-03-10T07:51:49.381 INFO:tasks.workunit.client.1.vm08.stdout:0/923: write dd/d10/d2f/d37/d64/f68 [4929809,65083] 0
2026-03-10T07:51:49.384 INFO:tasks.workunit.client.1.vm08.stdout:3/878: dread d0/d3c/d18/dec/d2d/d85/fcb [0,4194304] 0
2026-03-10T07:51:49.392 INFO:tasks.workunit.client.1.vm08.stdout:1/854: getdents d2/d6/de/d1f/d26/d89/d8e 0
2026-03-10T07:51:49.392 INFO:tasks.workunit.client.1.vm08.stdout:0/924: symlink dd/d10/d14/d15/d20/l12e 0
2026-03-10T07:51:49.392 INFO:tasks.workunit.client.1.vm08.stdout:2/887: link d0/d1/d17/db2/d9c/ff4 d0/d1/d3/d56/d78/dad/db1/f11d 0
2026-03-10T07:51:49.392 INFO:tasks.workunit.client.1.vm08.stdout:5/969: rename d0/d4/df/dbf/d41/c54 to d0/c13e 0
2026-03-10T07:51:49.392 INFO:tasks.workunit.client.1.vm08.stdout:1/855: truncate d2/d6/de/d1f/d26/d58/f68 2939254 0
2026-03-10T07:51:49.392 INFO:tasks.workunit.client.1.vm08.stdout:5/970: rmdir d0/d8/d24/dd0 39
2026-03-10T07:51:49.392 INFO:tasks.workunit.client.1.vm08.stdout:0/925: creat dd/d10/d2f/d37/d64/d95/d58/d3d/f12f x:0 0 0
2026-03-10T07:51:49.392 INFO:tasks.workunit.client.1.vm08.stdout:0/926: readlink dd/d18/l24 0
2026-03-10T07:51:49.393 INFO:tasks.workunit.client.1.vm08.stdout:2/888: getdents d0/d1/d17/dfb 0
2026-03-10T07:51:49.393 INFO:tasks.workunit.client.1.vm08.stdout:0/927: rmdir dd/d10/d14/d15/d20 39
2026-03-10T07:51:49.396 INFO:tasks.workunit.client.1.vm08.stdout:3/879: sync
2026-03-10T07:51:49.397 INFO:tasks.workunit.client.1.vm08.stdout:1/856: fsync d2/d6/de/d1f/d26/d89/d8e/fbf 0
2026-03-10T07:51:49.399 INFO:tasks.workunit.client.1.vm08.stdout:0/928: creat dd/d10/d2f/d37/d64/f130 x:0 0 0
2026-03-10T07:51:49.401 INFO:tasks.workunit.client.1.vm08.stdout:2/889: dwrite d0/d1/d3/d56/d78/ff8 [0,4194304] 0
2026-03-10T07:51:49.401 INFO:tasks.workunit.client.1.vm08.stdout:5/971: getdents d0/d4/d19/d81 0
2026-03-10T07:51:49.405 INFO:tasks.workunit.client.1.vm08.stdout:5/972: readlink d0/d8/dce/l107 0
2026-03-10T07:51:49.409 INFO:tasks.workunit.client.1.vm08.stdout:1/857: fsync d2/d6/d3a/d61/f88 0
2026-03-10T07:51:49.409 INFO:tasks.workunit.client.1.vm08.stdout:0/929: chown dd/d18/d100/d102/dc1/ded/df7/fff 1776154 1
2026-03-10T07:51:49.413 INFO:tasks.workunit.client.1.vm08.stdout:2/890: rmdir d0/d1/d3/d10/d38/daf 39
2026-03-10T07:51:49.413 INFO:tasks.workunit.client.1.vm08.stdout:1/858: dread d2/d6/de/d71/fc9 [0,4194304] 0
2026-03-10T07:51:49.415 INFO:tasks.workunit.client.1.vm08.stdout:5/973: creat d0/d8/d24/dd0/f13f x:0 0 0
2026-03-10T07:51:49.417 INFO:tasks.workunit.client.1.vm08.stdout:0/930: symlink dd/d10/d2f/d37/d64/l131 0
2026-03-10T07:51:49.418 INFO:tasks.workunit.client.1.vm08.stdout:9/884: dwrite d2/d58/dbf/dd0/d35/d97/d9d/fbd [0,4194304] 0
2026-03-10T07:51:49.423 INFO:tasks.workunit.client.1.vm08.stdout:7/883: write d3/f2b [145447,82339] 0
2026-03-10T07:51:49.424 INFO:tasks.workunit.client.1.vm08.stdout:9/885: fsync d2/d58/dbf/f128 0
2026-03-10T07:51:49.431 INFO:tasks.workunit.client.1.vm08.stdout:4/819: dwrite d5/da0/d12/def/f10f [0,4194304] 0
2026-03-10T07:51:49.433 INFO:tasks.workunit.client.1.vm08.stdout:5/974: creat d0/d77/d83/de0/dfe/f140 x:0 0 0
2026-03-10T07:51:49.441 INFO:tasks.workunit.client.1.vm08.stdout:2/891: rename d0/d1/d3/d56/d78/dad/db1/l11b to d0/d1/d3/d56/d78/dad/db1/d61/l11e 0
2026-03-10T07:51:49.446 INFO:tasks.workunit.client.1.vm08.stdout:9/886: mkdir d2/d58/dbf/dd0/d35/d97/d9d/d130 0
2026-03-10T07:51:49.446 INFO:tasks.workunit.client.1.vm08.stdout:4/820: mknod d5/da0/d12/def/c114 0
2026-03-10T07:51:49.447 INFO:tasks.workunit.client.1.vm08.stdout:9/887: fsync d2/d26/f29 0
2026-03-10T07:51:49.447 INFO:tasks.workunit.client.1.vm08.stdout:6/899: write d1/d3/ff6 [2989719,16086] 0
2026-03-10T07:51:49.451 INFO:tasks.workunit.client.1.vm08.stdout:7/884: mkdir d3/da/d25/d9/d2f/d4d/d120/d12e 0
2026-03-10T07:51:49.451 INFO:tasks.workunit.client.1.vm08.stdout:4/821: read - d5/da0/d95/de6/d48/d4f/d8d/d91/dd5/f100 zero size
2026-03-10T07:51:49.453 INFO:tasks.workunit.client.1.vm08.stdout:3/880: truncate d0/d3c/d18/f23 8871127 0
2026-03-10T07:51:49.453 INFO:tasks.workunit.client.1.vm08.stdout:4/822: write d5/da0/d12/fe7 [382001,6460] 0
2026-03-10T07:51:49.453 INFO:tasks.workunit.client.1.vm08.stdout:1/859: write d2/d6/de/d1f/da9/faf [413386,31602] 0
2026-03-10T07:51:49.455 INFO:tasks.workunit.client.1.vm08.stdout:0/931: creat dd/d10/d14/d15/f132 x:0 0 0
2026-03-10T07:51:49.456 INFO:tasks.workunit.client.1.vm08.stdout:0/932: dread - dd/d10/d2f/d37/d64/d95/d58/f120 zero size
2026-03-10T07:51:49.459 INFO:tasks.workunit.client.1.vm08.stdout:4/823: mknod d5/da0/d12/def/c115 0
2026-03-10T07:51:49.462 INFO:tasks.workunit.client.1.vm08.stdout:9/888: mkdir d2/dda/d11d/d131 0
2026-03-10T07:51:49.463 INFO:tasks.workunit.client.1.vm08.stdout:7/885: dread d3/da/d25/d9/d2f/d39/d43/d4f/d5b/db7/f49 [0,4194304] 0
2026-03-10T07:51:49.464 INFO:tasks.workunit.client.1.vm08.stdout:7/886: write d3/da/d25/d9/d2f/d39/d43/d4f/d5b/db7/f81 [535675,41693] 0
2026-03-10T07:51:49.468 INFO:tasks.workunit.client.1.vm08.stdout:5/975: getdents d0/d4/d19/d60/d6d/d70/dc5 0
2026-03-10T07:51:49.472 INFO:tasks.workunit.client.1.vm08.stdout:6/900: dread d1/d17/d2b/d5e/ff3 [0,4194304] 0
2026-03-10T07:51:49.474 INFO:tasks.workunit.client.1.vm08.stdout:2/892: dwrite d0/d1/d3/d39/d7d/d86/d55/fb0 [0,4194304] 0
2026-03-10T07:51:49.474 INFO:tasks.workunit.client.1.vm08.stdout:4/824: fdatasync d5/f2f 0
2026-03-10T07:51:49.476 INFO:tasks.workunit.client.1.vm08.stdout:4/825: write d5/da0/d95/de6/d48/d4f/d7c/dde/ddf/d10b/d9b/ff3 [19197,120463] 0
2026-03-10T07:51:49.482 INFO:tasks.workunit.client.1.vm08.stdout:1/860: dwrite d2/d6/ff0 [0,4194304] 0
2026-03-10T07:51:49.482 INFO:tasks.workunit.client.1.vm08.stdout:9/889: dwrite d2/d58/dbf/dd0/d35/fdd [0,4194304] 0
2026-03-10T07:51:49.491 INFO:tasks.workunit.client.1.vm08.stdout:1/861: truncate d2/d10/dd7/f115 860491 0
2026-03-10T07:51:49.497 INFO:tasks.workunit.client.1.vm08.stdout:3/881: rename d0/d3c/d18/da9/dcc/d105/f114 to d0/d3c/d18/d32/d61/f118 0
2026-03-10T07:51:49.498 INFO:tasks.workunit.client.1.vm08.stdout:0/933: rmdir dd/d18/d100/d102/d110 0
2026-03-10T07:51:49.503 INFO:tasks.workunit.client.1.vm08.stdout:7/887: dwrite d3/fa4 [0,4194304] 0
2026-03-10T07:51:49.506 INFO:tasks.workunit.client.1.vm08.stdout:7/888: write d3/da/d25/d9/fc5 [1313457,2170] 0
2026-03-10T07:51:49.513 INFO:tasks.workunit.client.1.vm08.stdout:3/882: dread d0/d3c/d18/d32/d61/d52/f66 [0,4194304] 0
2026-03-10T07:51:49.513 INFO:tasks.workunit.client.1.vm08.stdout:2/893: mkdir d0/d1/d3/d39/d7d/d86/d55/d7a/d11f 0
2026-03-10T07:51:49.514 INFO:tasks.workunit.client.1.vm08.stdout:4/826: unlink d5/da0/d95/de6/d48/d4f/d8d/d91/dd5/cdc 0
2026-03-10T07:51:49.520 INFO:tasks.workunit.client.1.vm08.stdout:0/934: read - dd/d10/d2f/d37/d64/d52/fe6 zero size
2026-03-10T07:51:49.521 INFO:tasks.workunit.client.1.vm08.stdout:0/935: dread - dd/d10/d2f/d37/d64/d52/f113 zero size
2026-03-10T07:51:49.530 INFO:tasks.workunit.client.1.vm08.stdout:3/883: stat d0/d3c/d18/dec/f2b 0
2026-03-10T07:51:49.530 INFO:tasks.workunit.client.1.vm08.stdout:2/894: rename d0/d1/d3/d56/d78/dad/db1/d61/fbd to d0/d1/d3/d56/d78/dad/f120 0
2026-03-10T07:51:49.533 INFO:tasks.workunit.client.1.vm08.stdout:4/827: creat d5/d8/d50/f116 x:0 0 0
2026-03-10T07:51:49.534 INFO:tasks.workunit.client.1.vm08.stdout:3/884: creat d0/d3c/d1f/f119 x:0 0 0
2026-03-10T07:51:49.535 INFO:tasks.workunit.client.1.vm08.stdout:2/895: getdents d0/d1/d3/d39/d7d 0
2026-03-10T07:51:49.538 INFO:tasks.workunit.client.1.vm08.stdout:2/896: mkdir d0/d1/d3/d56/d57/d121 0
2026-03-10T07:51:49.541 INFO:tasks.workunit.client.1.vm08.stdout:4/828: dwrite d5/da0/d95/de6/d48/d4f/d7c/dde/ddf/d10b/fd7 [0,4194304] 0
2026-03-10T07:51:49.541 INFO:tasks.workunit.client.1.vm08.stdout:2/897: creat d0/d1/d17/db2/dde/d116/f122 x:0 0 0
2026-03-10T07:51:49.549 INFO:tasks.workunit.client.1.vm08.stdout:0/936: sync
2026-03-10T07:51:49.549 INFO:tasks.workunit.client.1.vm08.stdout:4/829: creat d5/da0/d95/de6/d48/d4f/d7c/dde/ddf/d10b/d31/d61/f117 x:0 0 0
2026-03-10T07:51:49.553 INFO:tasks.workunit.client.1.vm08.stdout:0/937: creat dd/d10/d2f/d37/d64/d52/da9/f133 x:0 0 0
2026-03-10T07:51:49.557 INFO:tasks.workunit.client.1.vm08.stdout:0/938: fdatasync dd/d10/d2f/d37/d64/d52/fc0 0
2026-03-10T07:51:49.558 INFO:tasks.workunit.client.1.vm08.stdout:0/939: fsync dd/d10/f77 0
2026-03-10T07:51:49.561 INFO:tasks.workunit.client.1.vm08.stdout:5/976: truncate d0/d4/df/dbf/d41/dad/fb9 823300 0
2026-03-10T07:51:49.561 INFO:tasks.workunit.client.1.vm08.stdout:0/940: fdatasync dd/d10/d14/d15/dad/ffc 0
2026-03-10T07:51:49.562 INFO:tasks.workunit.client.1.vm08.stdout:4/830: dread d5/da0/d95/de6/d48/d4f/d7c/fb2 [0,4194304] 0
2026-03-10T07:51:49.564 INFO:tasks.workunit.client.1.vm08.stdout:1/862: dwrite d2/d6/de/d47/da0/fe9 [0,4194304] 0
2026-03-10T07:51:49.573 INFO:tasks.workunit.client.1.vm08.stdout:0/941: fdatasync dd/d10/d2f/d37/d64/d95/d5c/dca/fce 0
2026-03-10T07:51:49.574 INFO:tasks.workunit.client.1.vm08.stdout:6/901: write d1/d17/d2b/d58/d76/dde/fe3 [1013127,25903] 0
2026-03-10T07:51:49.576 INFO:tasks.workunit.client.1.vm08.stdout:7/889: write d3/da/d25/d9/d2f/d3a/d4b/fa3 [761442,122196] 0
2026-03-10T07:51:49.576 INFO:tasks.workunit.client.1.vm08.stdout:9/890: write d2/d58/dbf/dd0/d35/f79 [4148812,1579] 0
2026-03-10T07:51:49.579 INFO:tasks.workunit.client.1.vm08.stdout:3/885: write d0/d3c/d18/d32/fc7 [368759,76174] 0
2026-03-10T07:51:49.579 INFO:tasks.workunit.client.1.vm08.stdout:3/886: chown d0/d3c/d18/dec/d2d/c6e 1145160 1
2026-03-10T07:51:49.580 INFO:tasks.workunit.client.1.vm08.stdout:4/831: fdatasync d5/da0/d95/de6/d48/d4f/d7c/dde/ddf/d10b/d31/f38 0
2026-03-10T07:51:49.582 INFO:tasks.workunit.client.1.vm08.stdout:1/863: rmdir d2/d10/dd7 39
2026-03-10T07:51:49.584 INFO:tasks.workunit.client.1.vm08.stdout:2/898: dwrite d0/d1/d17/d6b/f9a [0,4194304] 0
2026-03-10T07:51:49.591 INFO:tasks.workunit.client.1.vm08.stdout:2/899: dread - d0/d1/d3/d39/d7d/d86/d55/dc9/ded/ff6 zero size
2026-03-10T07:51:49.593 INFO:tasks.workunit.client.1.vm08.stdout:3/887: rmdir d0/d3c/d1f/d89/ddb 39
2026-03-10T07:51:49.597 INFO:tasks.workunit.client.1.vm08.stdout:0/942: rename dd/d10/d14/d1b/da5/d108 to dd/d10/d14/d15/d20/d134 0
2026-03-10T07:51:49.600 INFO:tasks.workunit.client.1.vm08.stdout:3/888: creat d0/d3c/d18/d32/daa/ded/f11a x:0 0 0
2026-03-10T07:51:49.601 INFO:tasks.workunit.client.1.vm08.stdout:0/943: mknod dd/d18/d100/d102/dc1/de5/c135 0
2026-03-10T07:51:49.602 INFO:tasks.workunit.client.1.vm08.stdout:6/902: rmdir d1/d3/df/d1d/d40/d87/d118 0
2026-03-10T07:51:49.603 INFO:tasks.workunit.client.1.vm08.stdout:6/903: fdatasync d1/d3/df/d52/f8f 0
2026-03-10T07:51:49.603 INFO:tasks.workunit.client.1.vm08.stdout:9/891: link d2/d58/dbf/l38 d2/d58/dbf/dd0/d35/d97/dd5/d106/l132 0
2026-03-10T07:51:49.610 INFO:tasks.workunit.client.1.vm08.stdout:2/900: dread d0/d1/d3/d39/f3b [0,4194304] 0
2026-03-10T07:51:49.613 INFO:tasks.workunit.client.1.vm08.stdout:5/977: truncate d0/d4/f2e 256727 0
2026-03-10T07:51:49.613 INFO:tasks.workunit.client.1.vm08.stdout:7/890: dread d3/f4 [0,4194304] 0
2026-03-10T07:51:49.614 INFO:tasks.workunit.client.1.vm08.stdout:4/832: write d5/da0/d95/de6/d48/d4f/d7c/dde/ddf/d10b/d70/f78 [4697086,62065] 0
2026-03-10T07:51:49.616 INFO:tasks.workunit.client.1.vm08.stdout:2/901: sync
2026-03-10T07:51:49.616 INFO:tasks.workunit.client.1.vm08.stdout:0/944: dread - dd/d10/d2f/d37/d64/d95/d58/fdd zero size
2026-03-10T07:51:49.617 INFO:tasks.workunit.client.1.vm08.stdout:9/892: mknod d2/d58/dbf/d2b/c133 0
2026-03-10T07:51:49.620 INFO:tasks.workunit.client.1.vm08.stdout:2/902: readlink d0/d1/d3/d39/de2/d114/le6 0
2026-03-10T07:51:49.621 INFO:tasks.workunit.client.1.vm08.stdout:2/903: stat d0/d1/d3/d3e 0
2026-03-10T07:51:49.625 INFO:tasks.workunit.client.1.vm08.stdout:9/893: dwrite d2/d26/f29 [4194304,4194304] 0
2026-03-10T07:51:49.627 INFO:tasks.workunit.client.1.vm08.stdout:1/864: link d2/d6/c9 d2/d10/c126 0
2026-03-10T07:51:49.628 INFO:tasks.workunit.client.1.vm08.stdout:6/904: mkdir d1/d17/d2b/d58/d76/d114/d12f 0
2026-03-10T07:51:49.629 INFO:tasks.workunit.client.1.vm08.stdout:5/978: dwrite d0/d4/df/dbf/daf/f131 [0,4194304] 0
2026-03-10T07:51:49.642 INFO:tasks.workunit.client.1.vm08.stdout:0/945: symlink dd/d10/d2f/d37/d64/d95/d5c/dca/l136 0
2026-03-10T07:51:49.644 INFO:tasks.workunit.client.1.vm08.stdout:9/894: symlink d2/d58/dbf/dd0/d35/d97/d110/l134 0
2026-03-10T07:51:49.644 INFO:tasks.workunit.client.1.vm08.stdout:4/833: dwrite d5/da0/d32/fae [0,4194304] 0
2026-03-10T07:51:49.646 INFO:tasks.workunit.client.1.vm08.stdout:4/834: chown d5/da0/d95/de6/d48/d4f/d7c/dde/ddf/d10b/c2c 245254489 1
2026-03-10T07:51:49.649 INFO:tasks.workunit.client.1.vm08.stdout:9/895: dread d2/d58/d73/fe4 [0,4194304] 0
2026-03-10T07:51:49.650 INFO:tasks.workunit.client.1.vm08.stdout:9/896: write d2/d58/dbf/daf/ff7 [472436,93074] 0
2026-03-10T07:51:49.656 INFO:tasks.workunit.client.1.vm08.stdout:7/891: link d3/da/f84 d3/da/d25/d9/f12f 0
2026-03-10T07:51:49.659 INFO:tasks.workunit.client.1.vm08.stdout:3/889: getdents d0/d3c/d18/d32/d61/d52 0
2026-03-10T07:51:49.659 INFO:tasks.workunit.client.1.vm08.stdout:3/890: dread - d0/d3c/d18/d32/daa/fef zero size
2026-03-10T07:51:49.659 INFO:tasks.workunit.client.1.vm08.stdout:1/865: unlink d2/d6/de/d70/l8a 0
2026-03-10T07:51:49.660 INFO:tasks.workunit.client.1.vm08.stdout:5/979: creat d0/d8/d24/de2/d128/f141 x:0 0 0
2026-03-10T07:51:49.661 INFO:tasks.workunit.client.1.vm08.stdout:3/891: dwrite d0/ff4 [0,4194304] 0
2026-03-10T07:51:49.663 INFO:tasks.workunit.client.1.vm08.stdout:0/946: fdatasync dd/f44 0
2026-03-10T07:51:49.664 INFO:tasks.workunit.client.1.vm08.stdout:0/947: write dd/d10/d14/d15/dad/ffc [447854,58867] 0
2026-03-10T07:51:49.665 INFO:tasks.workunit.client.1.vm08.stdout:6/905: readlink d1/db/d24/dac/dad/l8b 0
2026-03-10T07:51:49.673 INFO:tasks.workunit.client.1.vm08.stdout:4/835: creat d5/da0/d95/de6/d48/d4f/d7c/dde/ddf/d10b/d70/f118 x:0 0 0
2026-03-10T07:51:49.676 INFO:tasks.workunit.client.1.vm08.stdout:3/892: symlink d0/d3c/d18/dec/d2d/l11b 0
2026-03-10T07:51:49.680 INFO:tasks.workunit.client.1.vm08.stdout:7/892: mkdir d3/da/d25/d9/d2f/d3a/d4b/d67/dea/d130 0
2026-03-10T07:51:49.680 INFO:tasks.workunit.client.1.vm08.stdout:0/948: creat dd/d10/dbd/d10d/f137 x:0 0 0
2026-03-10T07:51:49.682 INFO:tasks.workunit.client.1.vm08.stdout:9/897: rmdir d2/d58/dbf/dd0/d35/d9b/d124 0
2026-03-10T07:51:49.686 INFO:tasks.workunit.client.1.vm08.stdout:3/893: chown d0/d3c/d1f/c27 66744928 1
2026-03-10T07:51:49.687 INFO:tasks.workunit.client.1.vm08.stdout:4/836: dread f2 [0,4194304] 0
2026-03-10T07:51:49.689 INFO:tasks.workunit.client.1.vm08.stdout:9/898: creat d2/d58/dbf/dd0/d35/dff/d126/f135 x:0 0 0
2026-03-10T07:51:49.690 INFO:tasks.workunit.client.1.vm08.stdout:0/949: symlink dd/l138 0
2026-03-10T07:51:49.693 INFO:tasks.workunit.client.1.vm08.stdout:9/899: fdatasync d2/d58/dbf/dd0/d35/d97/d9d/df4/fa5 0
2026-03-10T07:51:49.694 INFO:tasks.workunit.client.1.vm08.stdout:3/894: rmdir d0/d3c/d18/d32 39
2026-03-10T07:51:49.696 INFO:tasks.workunit.client.1.vm08.stdout:3/895: write d0/d3c/d1f/d89/fff [4118099,64665] 0
2026-03-10T07:51:49.699 INFO:tasks.workunit.client.1.vm08.stdout:2/904: write d0/d1/d3/d56/d78/dad/db1/d61/f59 [4700896,29008] 0
2026-03-10T07:51:49.702 INFO:tasks.workunit.client.1.vm08.stdout:1/866: write d2/d6/d3a/d61/d6f/f9d [558246,83740] 0
2026-03-10T07:51:49.706 INFO:tasks.workunit.client.1.vm08.stdout:1/867: chown d2/d6/de/d1f/l4c 42 1
2026-03-10T07:51:49.708 INFO:tasks.workunit.client.1.vm08.stdout:9/900: sync
2026-03-10T07:51:49.708 INFO:tasks.workunit.client.1.vm08.stdout:1/868: sync
2026-03-10T07:51:49.709 INFO:tasks.workunit.client.1.vm08.stdout:3/896: mknod d0/d3c/d18/d32/d61/d83/c11c 0
2026-03-10T07:51:49.710 INFO:tasks.workunit.client.1.vm08.stdout:1/869: write d2/d6/de/d1f/d26/d58/d8c/f106 [678825,8062] 0
2026-03-10T07:51:49.712 INFO:tasks.workunit.client.1.vm08.stdout:9/901: read d2/d58/dbf/dd0/d35/d97/d9d/df4/d28/d98/fd4 [2914296,82333] 0
2026-03-10T07:51:49.718 INFO:tasks.workunit.client.1.vm08.stdout:1/870: mknod d2/d6/d50/c127 0
2026-03-10T07:51:49.719 INFO:tasks.workunit.client.1.vm08.stdout:5/980: write d0/d4/d19/d81/d92/f74 [1422468,12453] 0
2026-03-10T07:51:49.720 INFO:tasks.workunit.client.1.vm08.stdout:6/906: write d1/d3/df/d1d/d40/d87/d95/ffb [1190795,88961] 0
2026-03-10T07:51:49.722 INFO:tasks.workunit.client.1.vm08.stdout:7/893: truncate d3/da/d25/d9/d2f/d39/d43/d4f/d5b/db7/dac/fd3 1437351 0
2026-03-10T07:51:49.722 INFO:tasks.workunit.client.1.vm08.stdout:4/837: write d5/da0/ffd [4245472,57030] 0
2026-03-10T07:51:49.727 INFO:tasks.workunit.client.1.vm08.stdout:5/981: symlink d0/d77/d83/de0/dfe/l142 0
2026-03-10T07:51:49.727 INFO:tasks.workunit.client.1.vm08.stdout:0/950: dwrite dd/d10/d14/d1b/f2c [0,4194304] 0
2026-03-10T07:51:49.728 INFO:tasks.workunit.client.1.vm08.stdout:1/871: dread d2/d6/d3a/f108 [0,4194304] 0
2026-03-10T07:51:49.728 INFO:tasks.workunit.client.1.vm08.stdout:5/982: fsync d0/d4/d19/d3a/d69/f71 0
2026-03-10T07:51:49.729 INFO:tasks.workunit.client.1.vm08.stdout:1/872: chown d2/d6/de/d1f/d26/d89/d8e 0 1
2026-03-10T07:51:49.729 INFO:tasks.workunit.client.1.vm08.stdout:2/905: write d0/d1/d3/d39/de2/d107/d111/fc4 [20887,41441] 0
2026-03-10T07:51:49.730 INFO:tasks.workunit.client.1.vm08.stdout:0/951: readlink dd/d10/d14/d15/d20/d5f/lf4 0
2026-03-10T07:51:49.730 INFO:tasks.workunit.client.1.vm08.stdout:5/983: dread - d0/d77/d83/de0/f13b zero size
2026-03-10T07:51:49.731 INFO:tasks.workunit.client.1.vm08.stdout:7/894: rmdir d3/da/d25/d9/d2f/d39/d43/d4f/d5b/d79 39
2026-03-10T07:51:49.737 INFO:tasks.workunit.client.1.vm08.stdout:0/952: write dd/d10/d2f/d37/d64/d95/d5c/f8f [3497545,85359] 0
2026-03-10T07:51:49.738 INFO:tasks.workunit.client.1.vm08.stdout:4/838: mknod d5/da0/d95/dc2/c119 0
2026-03-10T07:51:49.739 INFO:tasks.workunit.client.1.vm08.stdout:5/984: mknod d0/d4/df/df6/c143 0
2026-03-10T07:51:49.746 INFO:tasks.workunit.client.1.vm08.stdout:9/902: dwrite d2/d58/dbf/faa [0,4194304] 0
2026-03-10T07:51:49.748 INFO:tasks.workunit.client.1.vm08.stdout:3/897: dwrite d0/d3c/d18/dec/d34/f4e [0,4194304] 0
2026-03-10T07:51:49.752 INFO:tasks.workunit.client.1.vm08.stdout:7/895: symlink d3/da/d25/d9/d2f/d3a/dc0/dda/l131 0
2026-03-10T07:51:49.762 INFO:tasks.workunit.client.1.vm08.stdout:4/839: unlink d5/da0/d95/de6/da7/l8c 0
2026-03-10T07:51:49.762 INFO:tasks.workunit.client.1.vm08.stdout:9/903: mkdir d2/d58/dbf/daf/d136 0
2026-03-10T07:51:49.767 INFO:tasks.workunit.client.1.vm08.stdout:3/898: write d0/d3c/d18/d32/d61/f118 [793226,52763] 0
2026-03-10T07:51:49.770 INFO:tasks.workunit.client.1.vm08.stdout:2/906: rename d0/d1/d3/d39 to d0/d1/d3/d56/d78/dad/db1/d61/d84/da7/d123 0
2026-03-10T07:51:49.773 INFO:tasks.workunit.client.1.vm08.stdout:4/840: symlink d5/da0/d95/de6/d48/d4f/d7c/dde/ddf/d10b/dad/db8/l11a 0
2026-03-10T07:51:49.774 INFO:tasks.workunit.client.1.vm08.stdout:6/907: dread d1/d17/d2b/d58/d76/d114/d79/d7c/fa3 [0,4194304] 0
2026-03-10T07:51:49.778 INFO:tasks.workunit.client.1.vm08.stdout:9/904: creat d2/d58/dbf/dd0/d35/d97/d9d/df4/dee/d84/d91/db8/f137 x:0 0 0
2026-03-10T07:51:49.778 INFO:tasks.workunit.client.1.vm08.stdout:0/953: dread dd/d10/d2f/d37/d64/f3f [0,4194304] 0
2026-03-10T07:51:49.778 INFO:tasks.workunit.client.1.vm08.stdout:3/899: sync
2026-03-10T07:51:49.780 INFO:tasks.workunit.client.1.vm08.stdout:3/900: write d0/d3c/d18/dec/d34/f112 [535773,60083] 0
2026-03-10T07:51:49.783 INFO:tasks.workunit.client.1.vm08.stdout:9/905: mknod d2/d58/dbf/dd0/d35/d9b/c138 0
2026-03-10T07:51:49.787 INFO:tasks.workunit.client.1.vm08.stdout:7/896: link d3/da/l4e d3/da/d25/d9/d2f/d3a/d71/l132 0
2026-03-10T07:51:49.789 INFO:tasks.workunit.client.1.vm08.stdout:3/901: rename d0/c19 to d0/d3c/d18/da9/c11d 0
2026-03-10T07:51:49.790 INFO:tasks.workunit.client.1.vm08.stdout:9/906: mknod d2/d58/c139 0
2026-03-10T07:51:49.792 INFO:tasks.workunit.client.1.vm08.stdout:4/841: creat d5/da0/d95/de6/d48/d4f/d7c/dde/ddf/d10b/d31/f11b x:0 0 0
2026-03-10T07:51:49.792 INFO:tasks.workunit.client.1.vm08.stdout:3/902: sync
2026-03-10T07:51:49.793 INFO:tasks.workunit.client.1.vm08.stdout:7/897: dwrite d3/da/d25/d9/d2f/d3a/d40/f52 [0,4194304] 0
2026-03-10T07:51:49.794 INFO:tasks.workunit.client.1.vm08.stdout:4/842: read - d5/da0/d95/de6/d48/d4f/d7c/dde/ddf/d10b/dad/dcf/f10d zero size
2026-03-10T07:51:49.796 INFO:tasks.workunit.client.1.vm08.stdout:1/873: truncate d2/d6/de/fc0 1287059 0
2026-03-10T07:51:49.801 INFO:tasks.workunit.client.1.vm08.stdout:6/908: write d1/d17/f66 [8712732,6903] 0
2026-03-10T07:51:49.801 INFO:tasks.workunit.client.1.vm08.stdout:5/985: dwrite d0/d4/df/d12/f11 [0,4194304] 0
2026-03-10T07:51:49.807 INFO:tasks.workunit.client.1.vm08.stdout:0/954: write dd/d10/d14/d15/d20/d22/f5d [1559852,49533] 0
2026-03-10T07:51:49.807 INFO:tasks.workunit.client.1.vm08.stdout:4/843: symlink d5/d8/d50/db0/l11c 0
2026-03-10T07:51:49.814 INFO:tasks.workunit.client.1.vm08.stdout:0/955: readlink dd/d18/d100/d102/d107/l11d 0
2026-03-10T07:51:49.814 INFO:tasks.workunit.client.1.vm08.stdout:7/898: creat d3/da/d25/d9/d2f/d3a/d71/d8c/ddf/f133 x:0 0 0
2026-03-10T07:51:49.815 INFO:tasks.workunit.client.1.vm08.stdout:1/874: read d2/d6/f86 [34265,43040] 0
2026-03-10T07:51:49.815 INFO:tasks.workunit.client.1.vm08.stdout:9/907: symlink d2/d58/dbf/dd0/d35/dff/d101/l13a 0
2026-03-10T07:51:49.817 INFO:tasks.workunit.client.1.vm08.stdout:0/956: stat dd/d10/d2f/d37/d64/f130 0
2026-03-10T07:51:49.819 INFO:tasks.workunit.client.1.vm08.stdout:5/986: symlink d0/d4/d19/d60/d6d/d70/dff/l144 0
2026-03-10T07:51:49.821 INFO:tasks.workunit.client.1.vm08.stdout:9/908: symlink d2/d58/dbf/dd0/d35/dff/l13b 0
2026-03-10T07:51:49.821 INFO:tasks.workunit.client.1.vm08.stdout:4/844: dread d5/da0/d95/de6/d48/d4f/d7c/fb2 [0,4194304] 0
2026-03-10T07:51:49.824 INFO:tasks.workunit.client.1.vm08.stdout:0/957: fdatasync dd/d18/d100/d102/dc1/ded/f4d 0
2026-03-10T07:51:49.830 INFO:tasks.workunit.client.1.vm08.stdout:5/987: creat d0/d4/df/df6/df9/dec/f145 x:0 0 0
2026-03-10T07:51:49.831 INFO:tasks.workunit.client.1.vm08.stdout:9/909: fdatasync d2/d58/dbf/dd0/d35/d97/d9d/fcc 0
2026-03-10T07:51:49.832 INFO:tasks.workunit.client.1.vm08.stdout:9/910: write d2/d58/dbf/dd0/d35/d97/d9d/df4/dee/d84/f100 [249527,88505] 0
2026-03-10T07:51:49.839 INFO:tasks.workunit.client.1.vm08.stdout:5/988: creat d0/d4/d19/d60/f146 x:0 0 0
2026-03-10T07:51:49.839 INFO:tasks.workunit.client.1.vm08.stdout:5/989: read - d0/d4/df/dbf/d41/f12b zero size
2026-03-10T07:51:49.840 INFO:tasks.workunit.client.1.vm08.stdout:5/990: write d0/d4/d19/d81/d92/d134/f11c [766506,112713] 0
2026-03-10T07:51:49.845 INFO:tasks.workunit.client.1.vm08.stdout:6/909: truncate d1/d3/df/d1d/d40/f10b 209682 0
2026-03-10T07:51:49.845 INFO:tasks.workunit.client.1.vm08.stdout:7/899: write d3/da/d25/d9/d2f/d39/d43/d4f/d5b/db7/dac/fed [456179,31666] 0
2026-03-10T07:51:49.851 INFO:tasks.workunit.client.1.vm08.stdout:1/875: link d2/d6/de/d1f/d26/d58/fb9 d2/d6/de/d1f/f128 0
2026-03-10T07:51:49.851 INFO:tasks.workunit.client.1.vm08.stdout:4/845: dwrite d5/da0/d95/de6/d48/d4f/d7c/dde/ddf/d10b/d41/f99 [0,4194304] 0
2026-03-10T07:51:49.853 INFO:tasks.workunit.client.1.vm08.stdout:9/911: creat d2/d58/dbf/dd0/d35/d97/d110/f13c x:0 0 0
2026-03-10T07:51:49.854 INFO:tasks.workunit.client.1.vm08.stdout:2/907: rename d0/d1/d3/d56/d78/dad/db1/c6c to d0/d1/d3/d56/d78/dad/db1/d61/d84/da7/d123/d7d/d86/c124 0
2026-03-10T07:51:49.857 INFO:tasks.workunit.client.1.vm08.stdout:3/903: readlink d0/d3c/d1f/d44/l5c 0
2026-03-10T07:51:49.862 INFO:tasks.workunit.client.1.vm08.stdout:6/910: chown d1/db/c31 466 1
2026-03-10T07:51:49.868 INFO:tasks.workunit.client.1.vm08.stdout:1/876: creat d2/d6/de/d1f/d26/d58/f129 x:0 0 0
2026-03-10T07:51:49.873 INFO:tasks.workunit.client.1.vm08.stdout:0/958: creat dd/d10/d2f/d37/f139 x:0 0 0
2026-03-10T07:51:49.873 INFO:tasks.workunit.client.1.vm08.stdout:9/912: unlink d2/d58/f9f 0
2026-03-10T07:51:49.873 INFO:tasks.workunit.client.1.vm08.stdout:9/913: write d2/d58/dbf/dd0/d35/d97/d9d/df4/d28/f107 [411299,11571] 0 2026-03-10T07:51:49.875 INFO:tasks.workunit.client.1.vm08.stdout:3/904: creat d0/d3c/d1f/d44/f11e x:0 0 0 2026-03-10T07:51:49.876 INFO:tasks.workunit.client.1.vm08.stdout:6/911: sync 2026-03-10T07:51:49.879 INFO:tasks.workunit.client.1.vm08.stdout:1/877: mkdir d2/d6/de/d1f/d26/d58/d83/dc2/d12a 0 2026-03-10T07:51:49.880 INFO:tasks.workunit.client.1.vm08.stdout:0/959: read dd/d10/d14/d15/d20/d5f/f61 [3665681,76029] 0 2026-03-10T07:51:49.880 INFO:tasks.workunit.client.1.vm08.stdout:2/908: creat d0/d1/d3/d3e/f125 x:0 0 0 2026-03-10T07:51:49.885 INFO:tasks.workunit.client.1.vm08.stdout:9/914: rmdir d2/d58/dbf/dd0/d35/d97/dd5 39 2026-03-10T07:51:49.886 INFO:tasks.workunit.client.1.vm08.stdout:9/915: fsync d2/d58/dbf/dd0/d35/d97/d9d/df4/d28/d98/dbb/dd9/f10b 0 2026-03-10T07:51:49.886 INFO:tasks.workunit.client.1.vm08.stdout:9/916: chown d2/d58/dbf/dd0/d35/d97/d9d/df4/d28/d98/fc4 68155 1 2026-03-10T07:51:49.888 INFO:tasks.workunit.client.1.vm08.stdout:6/912: creat d1/d3/d3e/dff/d105/f130 x:0 0 0 2026-03-10T07:51:49.888 INFO:tasks.workunit.client.1.vm08.stdout:1/878: truncate d2/d10/fc4 617545 0 2026-03-10T07:51:49.892 INFO:tasks.workunit.client.1.vm08.stdout:5/991: dwrite d0/d4/d19/d3a/fa1 [0,4194304] 0 2026-03-10T07:51:49.897 INFO:tasks.workunit.client.1.vm08.stdout:4/846: dwrite d5/da0/d95/de6/d48/d4f/d7c/dde/ddf/d10b/d70/fa9 [4194304,4194304] 0 2026-03-10T07:51:49.901 INFO:tasks.workunit.client.1.vm08.stdout:4/847: chown d5/f2f 10002 1 2026-03-10T07:51:49.902 INFO:tasks.workunit.client.1.vm08.stdout:9/917: mkdir d2/d58/dbf/dd0/d35/d97/d9d/df4/dee/d84/dca/dd7/d10d/d112/d13d 0 2026-03-10T07:51:49.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:51:49 vm05.local ceph-mon[50387]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' 2026-03-10T07:51:49.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:51:49 
vm05.local ceph-mon[50387]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' 2026-03-10T07:51:49.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:51:49 vm05.local ceph-mon[50387]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T07:51:49.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:51:49 vm05.local ceph-mon[50387]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T07:51:49.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:51:49 vm05.local ceph-mon[50387]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' 2026-03-10T07:51:49.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:51:49 vm05.local ceph-mon[50387]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' cmd=[{"prefix": "dashboard get-grafana-api-url"}]: dispatch 2026-03-10T07:51:49.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:51:49 vm05.local ceph-mon[50387]: from='mon.0 -' entity='mon.' 
cmd=[{"prefix": "dashboard get-grafana-api-url"}]: dispatch 2026-03-10T07:51:49.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:51:49 vm05.local ceph-mon[50387]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T07:51:49.908 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:51:49 vm05.local ceph-mon[50387]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T07:51:49.908 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:51:49 vm05.local ceph-mon[50387]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T07:51:49.908 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:51:49 vm05.local ceph-mon[50387]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T07:51:49.908 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:51:49 vm05.local ceph-mon[50387]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T07:51:49.908 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:51:49 vm05.local ceph-mon[50387]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T07:51:49.908 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:51:49 vm05.local ceph-mon[50387]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T07:51:49.908 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:51:49 vm05.local ceph-mon[50387]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T07:51:49.908 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:51:49 vm05.local ceph-mon[50387]: from='mgr.14628 192.168.123.105:0/1139394548' 
entity='mgr.vm05.blexke' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T07:51:49.908 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:51:49 vm05.local ceph-mon[50387]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T07:51:49.908 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:51:49 vm05.local ceph-mon[50387]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T07:51:49.908 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:51:49 vm05.local ceph-mon[50387]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T07:51:49.908 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:51:49 vm05.local ceph-mon[50387]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T07:51:49.908 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:51:49 vm05.local ceph-mon[50387]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T07:51:49.908 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:51:49 vm05.local ceph-mon[50387]: Upgrade: Finalizing container_image settings 2026-03-10T07:51:49.908 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:51:49 vm05.local ceph-mon[50387]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' 2026-03-10T07:51:49.908 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:51:49 vm05.local ceph-mon[50387]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mgr"}]: dispatch 2026-03-10T07:51:49.908 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:51:49 vm05.local ceph-mon[50387]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' cmd='[{"prefix": "config rm", "name": "container_image", 
"who": "mgr"}]': finished 2026-03-10T07:51:49.908 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:51:49 vm05.local ceph-mon[50387]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mon"}]: dispatch 2026-03-10T07:51:49.908 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:51:49 vm05.local ceph-mon[50387]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' cmd='[{"prefix": "config rm", "name": "container_image", "who": "mon"}]': finished 2026-03-10T07:51:49.908 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:51:49 vm05.local ceph-mon[50387]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' cmd=[{"prefix": "config rm", "name": "container_image", "who": "client.crash"}]: dispatch 2026-03-10T07:51:49.908 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:51:49 vm05.local ceph-mon[50387]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' cmd=[{"prefix": "config rm", "name": "container_image", "who": "osd"}]: dispatch 2026-03-10T07:51:49.908 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:51:49 vm05.local ceph-mon[50387]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mds"}]: dispatch 2026-03-10T07:51:49.908 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:51:49 vm05.local ceph-mon[50387]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' cmd=[{"prefix": "config rm", "name": "container_image", "who": "client.rgw"}]: dispatch 2026-03-10T07:51:49.908 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:51:49 vm05.local ceph-mon[50387]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' cmd='[{"prefix": "config rm", "name": "container_image", "who": "client.rgw"}]': finished 2026-03-10T07:51:49.908 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:51:49 vm05.local ceph-mon[50387]: 
from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' cmd=[{"prefix": "config rm", "name": "container_image", "who": "client.rbd-mirror"}]: dispatch 2026-03-10T07:51:49.908 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:51:49 vm05.local ceph-mon[50387]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' cmd='[{"prefix": "config rm", "name": "container_image", "who": "client.rbd-mirror"}]': finished 2026-03-10T07:51:49.908 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:51:49 vm05.local ceph-mon[50387]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mon"}]: dispatch 2026-03-10T07:51:49.908 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:51:49 vm05.local ceph-mon[50387]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' cmd=[{"prefix": "config rm", "name": "container_image", "who": "client.ceph-exporter"}]: dispatch 2026-03-10T07:51:49.908 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:51:49 vm05.local ceph-mon[50387]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' cmd=[{"prefix": "config rm", "name": "container_image", "who": "client.iscsi"}]: dispatch 2026-03-10T07:51:49.908 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:51:49 vm05.local ceph-mon[50387]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' cmd='[{"prefix": "config rm", "name": "container_image", "who": "client.iscsi"}]': finished 2026-03-10T07:51:49.908 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:51:49 vm05.local ceph-mon[50387]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' cmd=[{"prefix": "config rm", "name": "container_image", "who": "client.nfs"}]: dispatch 2026-03-10T07:51:49.908 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:51:49 vm05.local ceph-mon[50387]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' cmd='[{"prefix": 
"config rm", "name": "container_image", "who": "client.nfs"}]': finished 2026-03-10T07:51:49.908 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:51:49 vm05.local ceph-mon[50387]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' cmd=[{"prefix": "config rm", "name": "container_image", "who": "client.nvmeof"}]: dispatch 2026-03-10T07:51:49.908 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:51:49 vm05.local ceph-mon[50387]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' cmd='[{"prefix": "config rm", "name": "container_image", "who": "client.nvmeof"}]': finished 2026-03-10T07:51:49.908 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:51:49 vm05.local ceph-mon[50387]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mon"}]: dispatch 2026-03-10T07:51:49.908 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:51:49 vm05.local ceph-mon[50387]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mon"}]: dispatch 2026-03-10T07:51:49.908 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:51:49 vm05.local ceph-mon[50387]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mon"}]: dispatch 2026-03-10T07:51:49.908 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:51:49 vm05.local ceph-mon[50387]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mon"}]: dispatch 2026-03-10T07:51:49.908 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:51:49 vm05.local ceph-mon[50387]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mon"}]: dispatch 2026-03-10T07:51:49.908 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 
10 07:51:49 vm05.local ceph-mon[50387]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mon"}]: dispatch 2026-03-10T07:51:49.908 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:51:49 vm05.local ceph-mon[50387]: Upgrade: Complete! 2026-03-10T07:51:49.908 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:51:49 vm05.local ceph-mon[50387]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' cmd=[{"prefix":"config-key del","key":"mgr/cephadm/upgrade_state"}]: dispatch 2026-03-10T07:51:49.908 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:51:49 vm05.local ceph-mon[50387]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' cmd='[{"prefix":"config-key del","key":"mgr/cephadm/upgrade_state"}]': finished 2026-03-10T07:51:49.908 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:51:49 vm05.local ceph-mon[50387]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T07:51:49.908 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:51:49 vm05.local ceph-mon[50387]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T07:51:49.908 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:51:49 vm05.local ceph-mon[50387]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T07:51:49.908 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:51:49 vm05.local ceph-mon[50387]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' 2026-03-10T07:51:49.908 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:51:49 vm05.local ceph-mon[50387]: pgmap v44: 65 pgs: 65 active+clean; 3.4 GiB data, 11 GiB used, 109 GiB / 120 GiB avail; 57 MiB/s rd, 140 MiB/s wr, 336 op/s 
2026-03-10T07:51:49.908 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:51:49 vm05.local ceph-mon[50387]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T07:51:49.908 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:51:49 vm05.local ceph-mon[50387]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T07:51:49.908 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:51:49 vm05.local ceph-mon[50387]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T07:51:49.908 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:51:49 vm05.local ceph-mon[50387]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' 2026-03-10T07:51:49.909 INFO:tasks.workunit.client.1.vm08.stdout:7/900: dwrite d3/da/d25/d9/d2f/d3a/dc0/ff9 [0,4194304] 0 2026-03-10T07:51:49.909 INFO:tasks.workunit.client.1.vm08.stdout:7/901: stat d3/da/d25/d9/f12f 0 2026-03-10T07:51:49.913 INFO:tasks.workunit.client.1.vm08.stdout:1/879: mknod d2/d6/de/d1f/c12b 0 2026-03-10T07:51:49.919 INFO:tasks.workunit.client.1.vm08.stdout:2/909: rename d0/d1/d3/d56/d78/dad/db1/d61/d84/da7/d123/l67 to d0/d1/d3/d56/d78/dad/db1/d61/d84/da7/d123/d7d/d86/d55/dc9/l126 0 2026-03-10T07:51:49.919 INFO:tasks.workunit.client.1.vm08.stdout:2/910: chown d0/d1/d17/d6b/fe3 303646 1 2026-03-10T07:51:49.919 INFO:tasks.workunit.client.1.vm08.stdout:3/905: link d0/d3c/d18/d32/d61/d83/f8b d0/d3c/d18/d48/d55/f11f 0 2026-03-10T07:51:49.919 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:51:49 vm08.local ceph-mon[59917]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' 2026-03-10T07:51:49.919 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:51:49 vm08.local ceph-mon[59917]: from='mgr.14628 192.168.123.105:0/1139394548' 
entity='mgr.vm05.blexke' 2026-03-10T07:51:49.919 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:51:49 vm08.local ceph-mon[59917]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T07:51:49.919 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:51:49 vm08.local ceph-mon[59917]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T07:51:49.919 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:51:49 vm08.local ceph-mon[59917]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' 2026-03-10T07:51:49.919 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:51:49 vm08.local ceph-mon[59917]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' cmd=[{"prefix": "dashboard get-grafana-api-url"}]: dispatch 2026-03-10T07:51:49.919 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:51:49 vm08.local ceph-mon[59917]: from='mon.0 -' entity='mon.' 
cmd=[{"prefix": "dashboard get-grafana-api-url"}]: dispatch 2026-03-10T07:51:49.919 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:51:49 vm08.local ceph-mon[59917]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T07:51:49.919 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:51:49 vm08.local ceph-mon[59917]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T07:51:49.919 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:51:49 vm08.local ceph-mon[59917]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T07:51:49.919 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:51:49 vm08.local ceph-mon[59917]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T07:51:49.919 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:51:49 vm08.local ceph-mon[59917]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T07:51:49.919 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:51:49 vm08.local ceph-mon[59917]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T07:51:49.919 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:51:49 vm08.local ceph-mon[59917]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T07:51:49.919 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:51:49 vm08.local ceph-mon[59917]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T07:51:49.919 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:51:49 vm08.local ceph-mon[59917]: from='mgr.14628 192.168.123.105:0/1139394548' 
entity='mgr.vm05.blexke' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T07:51:49.919 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:51:49 vm08.local ceph-mon[59917]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T07:51:49.919 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:51:49 vm08.local ceph-mon[59917]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T07:51:49.919 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:51:49 vm08.local ceph-mon[59917]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T07:51:49.919 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:51:49 vm08.local ceph-mon[59917]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T07:51:49.919 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:51:49 vm08.local ceph-mon[59917]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T07:51:49.919 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:51:49 vm08.local ceph-mon[59917]: Upgrade: Finalizing container_image settings 2026-03-10T07:51:49.919 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:51:49 vm08.local ceph-mon[59917]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' 2026-03-10T07:51:49.919 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:51:49 vm08.local ceph-mon[59917]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mgr"}]: dispatch 2026-03-10T07:51:49.919 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:51:49 vm08.local ceph-mon[59917]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' cmd='[{"prefix": "config rm", "name": "container_image", 
"who": "mgr"}]': finished 2026-03-10T07:51:49.919 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:51:49 vm08.local ceph-mon[59917]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mon"}]: dispatch 2026-03-10T07:51:49.919 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:51:49 vm08.local ceph-mon[59917]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' cmd='[{"prefix": "config rm", "name": "container_image", "who": "mon"}]': finished 2026-03-10T07:51:49.919 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:51:49 vm08.local ceph-mon[59917]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' cmd=[{"prefix": "config rm", "name": "container_image", "who": "client.crash"}]: dispatch 2026-03-10T07:51:49.920 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:51:49 vm08.local ceph-mon[59917]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' cmd=[{"prefix": "config rm", "name": "container_image", "who": "osd"}]: dispatch 2026-03-10T07:51:49.920 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:51:49 vm08.local ceph-mon[59917]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mds"}]: dispatch 2026-03-10T07:51:49.920 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:51:49 vm08.local ceph-mon[59917]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' cmd=[{"prefix": "config rm", "name": "container_image", "who": "client.rgw"}]: dispatch 2026-03-10T07:51:49.920 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:51:49 vm08.local ceph-mon[59917]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' cmd='[{"prefix": "config rm", "name": "container_image", "who": "client.rgw"}]': finished 2026-03-10T07:51:49.920 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:51:49 vm08.local ceph-mon[59917]: 
from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' cmd=[{"prefix": "config rm", "name": "container_image", "who": "client.rbd-mirror"}]: dispatch 2026-03-10T07:51:49.920 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:51:49 vm08.local ceph-mon[59917]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' cmd='[{"prefix": "config rm", "name": "container_image", "who": "client.rbd-mirror"}]': finished 2026-03-10T07:51:49.920 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:51:49 vm08.local ceph-mon[59917]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mon"}]: dispatch 2026-03-10T07:51:49.920 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:51:49 vm08.local ceph-mon[59917]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' cmd=[{"prefix": "config rm", "name": "container_image", "who": "client.ceph-exporter"}]: dispatch 2026-03-10T07:51:49.920 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:51:49 vm08.local ceph-mon[59917]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' cmd=[{"prefix": "config rm", "name": "container_image", "who": "client.iscsi"}]: dispatch 2026-03-10T07:51:49.920 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:51:49 vm08.local ceph-mon[59917]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' cmd='[{"prefix": "config rm", "name": "container_image", "who": "client.iscsi"}]': finished 2026-03-10T07:51:49.920 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:51:49 vm08.local ceph-mon[59917]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' cmd=[{"prefix": "config rm", "name": "container_image", "who": "client.nfs"}]: dispatch 2026-03-10T07:51:49.920 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:51:49 vm08.local ceph-mon[59917]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' cmd='[{"prefix": 
"config rm", "name": "container_image", "who": "client.nfs"}]': finished 2026-03-10T07:51:49.920 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:51:49 vm08.local ceph-mon[59917]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' cmd=[{"prefix": "config rm", "name": "container_image", "who": "client.nvmeof"}]: dispatch 2026-03-10T07:51:49.920 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:51:49 vm08.local ceph-mon[59917]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' cmd='[{"prefix": "config rm", "name": "container_image", "who": "client.nvmeof"}]': finished 2026-03-10T07:51:49.920 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:51:49 vm08.local ceph-mon[59917]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mon"}]: dispatch 2026-03-10T07:51:49.920 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:51:49 vm08.local ceph-mon[59917]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mon"}]: dispatch 2026-03-10T07:51:49.920 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:51:49 vm08.local ceph-mon[59917]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mon"}]: dispatch 2026-03-10T07:51:49.920 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:51:49 vm08.local ceph-mon[59917]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mon"}]: dispatch 2026-03-10T07:51:49.920 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:51:49 vm08.local ceph-mon[59917]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mon"}]: dispatch 2026-03-10T07:51:49.920 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 
10 07:51:49 vm08.local ceph-mon[59917]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mon"}]: dispatch 2026-03-10T07:51:49.920 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:51:49 vm08.local ceph-mon[59917]: Upgrade: Complete! 2026-03-10T07:51:49.920 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:51:49 vm08.local ceph-mon[59917]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' cmd=[{"prefix":"config-key del","key":"mgr/cephadm/upgrade_state"}]: dispatch 2026-03-10T07:51:49.920 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:51:49 vm08.local ceph-mon[59917]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' cmd='[{"prefix":"config-key del","key":"mgr/cephadm/upgrade_state"}]': finished 2026-03-10T07:51:49.920 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:51:49 vm08.local ceph-mon[59917]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T07:51:49.920 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:51:49 vm08.local ceph-mon[59917]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T07:51:49.920 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:51:49 vm08.local ceph-mon[59917]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T07:51:49.920 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:51:49 vm08.local ceph-mon[59917]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' 2026-03-10T07:51:49.920 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:51:49 vm08.local ceph-mon[59917]: pgmap v44: 65 pgs: 65 active+clean; 3.4 GiB data, 11 GiB used, 109 GiB / 120 GiB avail; 57 MiB/s rd, 140 MiB/s wr, 336 op/s 
2026-03-10T07:51:49.920 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:51:49 vm08.local ceph-mon[59917]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T07:51:49.920 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:51:49 vm08.local ceph-mon[59917]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T07:51:49.920 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:51:49 vm08.local ceph-mon[59917]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T07:51:49.920 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:51:49 vm08.local ceph-mon[59917]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' 2026-03-10T07:51:49.921 INFO:tasks.workunit.client.1.vm08.stdout:6/913: rename d1/d17/d2b/d58/d76/d114/d79 to d1/d3/d3e/db2/d131 0 2026-03-10T07:51:49.926 INFO:tasks.workunit.client.1.vm08.stdout:0/960: getdents dd/d10/d2f/d37/d64/d95/d58 0 2026-03-10T07:51:49.928 INFO:tasks.workunit.client.1.vm08.stdout:7/902: mkdir d3/da/d25/d9/d2f/d3a/d134 0 2026-03-10T07:51:49.928 INFO:tasks.workunit.client.1.vm08.stdout:3/906: chown d0/d3c/d1f/d89/ddb 3555018 1 2026-03-10T07:51:49.930 INFO:tasks.workunit.client.1.vm08.stdout:1/880: rename d2/d6/de/d1f/d26/d98/cb7 to d2/d6/de/d1f/d40/d119/c12c 0 2026-03-10T07:51:49.931 INFO:tasks.workunit.client.1.vm08.stdout:1/881: write d2/d6/de/d1f/d40/f4d [4673549,14838] 0 2026-03-10T07:51:49.934 INFO:tasks.workunit.client.1.vm08.stdout:6/914: creat d1/d17/d2b/d5e/dcb/f132 x:0 0 0 2026-03-10T07:51:49.935 INFO:tasks.workunit.client.1.vm08.stdout:6/915: stat d1/d17/d2b/d58/d77/daf/cfa 0 2026-03-10T07:51:49.936 INFO:tasks.workunit.client.1.vm08.stdout:0/961: mkdir dd/d10/d2f/d37/d64/d52/da9/d13a 0 2026-03-10T07:51:49.937 
INFO:tasks.workunit.client.1.vm08.stdout:3/907: mknod d0/d3c/d18/d32/d61/d52/c120 0 2026-03-10T07:51:49.946 INFO:tasks.workunit.client.1.vm08.stdout:1/882: dread d2/d6/de/d1f/d26/f6e [4194304,4194304] 0 2026-03-10T07:51:49.948 INFO:tasks.workunit.client.1.vm08.stdout:2/911: write d0/d1/d3/d56/d78/dad/db1/d61/d84/da7/d123/d7d/d86/f7f [3422062,49635] 0 2026-03-10T07:51:49.953 INFO:tasks.workunit.client.1.vm08.stdout:5/992: getdents d0/d8/d24/de2/df7 0 2026-03-10T07:51:49.954 INFO:tasks.workunit.client.1.vm08.stdout:4/848: getdents d5/da0/d95/de6/d48/d4f/d8d 0 2026-03-10T07:51:49.955 INFO:tasks.workunit.client.1.vm08.stdout:6/916: mkdir d1/d17/d2b/d58/d76/d133 0 2026-03-10T07:51:49.955 INFO:tasks.workunit.client.1.vm08.stdout:9/918: getdents d2/d58/dbf/dd0/d35/d97/d9d/df4/dee 0 2026-03-10T07:51:49.955 INFO:tasks.workunit.client.1.vm08.stdout:6/917: readlink d1/l2f 0 2026-03-10T07:51:49.956 INFO:tasks.workunit.client.1.vm08.stdout:1/883: dwrite d2/d6/de/d1f/d26/d58/d8c/f87 [0,4194304] 0 2026-03-10T07:51:49.957 INFO:tasks.workunit.client.1.vm08.stdout:1/884: chown d2/d6/de/d70/fca 132 1 2026-03-10T07:51:49.965 INFO:tasks.workunit.client.1.vm08.stdout:7/903: rename d3/da/d25/d9/d2f/d39/d43/d4f/f68 to d3/da/d25/d9/d2f/d39/d43/d4f/d5b/d79/ddd/f135 0 2026-03-10T07:51:49.965 INFO:tasks.workunit.client.1.vm08.stdout:3/908: write d0/d3c/d18/dec/d2d/d85/fcb [3559407,124647] 0 2026-03-10T07:51:49.966 INFO:tasks.workunit.client.1.vm08.stdout:4/849: mkdir d5/da0/d12/d11d 0 2026-03-10T07:51:49.969 INFO:tasks.workunit.client.1.vm08.stdout:3/909: chown d0/d3c/d18/dec/d34/c5e 106 1 2026-03-10T07:51:49.970 INFO:tasks.workunit.client.1.vm08.stdout:6/918: creat d1/db/d24/dac/dad/f134 x:0 0 0 2026-03-10T07:51:49.971 INFO:tasks.workunit.client.1.vm08.stdout:3/910: fsync d0/d3c/d18/d32/fc7 0 2026-03-10T07:51:49.976 INFO:tasks.workunit.client.1.vm08.stdout:4/850: chown d5/da0/d95/de6/d48/d4f/d7c/dde/ddf/d10b/d70/cda 376675672 1 2026-03-10T07:51:49.980 
INFO:tasks.workunit.client.1.vm08.stdout:6/919: mknod d1/d17/d2b/d58/d77/daf/c135 0 2026-03-10T07:51:49.980 INFO:tasks.workunit.client.1.vm08.stdout:9/919: mknod d2/d58/dbf/dd0/d35/d97/d9d/df4/dee/d84/c13e 0 2026-03-10T07:51:49.982 INFO:tasks.workunit.client.1.vm08.stdout:1/885: mknod d2/d6/de/d1f/d26/d89/d117/c12d 0 2026-03-10T07:51:49.983 INFO:tasks.workunit.client.1.vm08.stdout:7/904: mkdir d3/da/d25/d9/d2f/d39/d43/d4f/d5b/d136 0 2026-03-10T07:51:49.984 INFO:tasks.workunit.client.1.vm08.stdout:1/886: dread - d2/d6/de/d70/fca zero size 2026-03-10T07:51:49.985 INFO:tasks.workunit.client.1.vm08.stdout:0/962: link dd/d10/d2f/d37/d64/d95/d58/d3d/faa dd/d10/d2f/d37/f13b 0 2026-03-10T07:51:49.986 INFO:tasks.workunit.client.1.vm08.stdout:4/851: mknod d5/da0/d95/de6/d48/d4f/d7c/dde/ddf/d10b/d9b/c11e 0 2026-03-10T07:51:49.987 INFO:tasks.workunit.client.1.vm08.stdout:9/920: rename d2/d58/dbf/dd0/d35/d97/d9d/df4/dee/d84/le3 to d2/dda/l13f 0 2026-03-10T07:51:49.987 INFO:tasks.workunit.client.1.vm08.stdout:5/993: getdents d0/d8/d24/de2/d128 0 2026-03-10T07:51:49.989 INFO:tasks.workunit.client.1.vm08.stdout:9/921: read d2/d58/dbf/dd0/d35/fc3 [2558798,36641] 0 2026-03-10T07:51:49.991 INFO:tasks.workunit.client.1.vm08.stdout:1/887: mknod d2/d6/de/d1f/c12e 0 2026-03-10T07:51:49.992 INFO:tasks.workunit.client.1.vm08.stdout:0/963: symlink dd/d10/d2f/d37/d64/d95/d58/l13c 0 2026-03-10T07:51:49.992 INFO:tasks.workunit.client.1.vm08.stdout:6/920: mkdir d1/d17/d2b/d58/d76/d133/d136 0 2026-03-10T07:51:49.993 INFO:tasks.workunit.client.1.vm08.stdout:6/921: readlink d1/d46/l107 0 2026-03-10T07:51:49.993 INFO:tasks.workunit.client.1.vm08.stdout:3/911: creat d0/d3c/d18/dec/f121 x:0 0 0 2026-03-10T07:51:49.993 INFO:tasks.workunit.client.1.vm08.stdout:0/964: chown dd/d10/d14/d15/d20/d7a/dd2/fd4 58 1 2026-03-10T07:51:49.994 INFO:tasks.workunit.client.1.vm08.stdout:4/852: mknod d5/da0/d95/dc2/c11f 0 2026-03-10T07:51:49.998 INFO:tasks.workunit.client.1.vm08.stdout:5/994: creat 
d0/d4/d19/d81/d92/f147 x:0 0 0 2026-03-10T07:51:50.001 INFO:tasks.workunit.client.1.vm08.stdout:1/888: symlink d2/d6/de/d1f/d26/d58/d83/d104/l12f 0 2026-03-10T07:51:50.006 INFO:tasks.workunit.client.1.vm08.stdout:4/853: dread d5/da0/ffd [0,4194304] 0 2026-03-10T07:51:50.007 INFO:tasks.workunit.client.1.vm08.stdout:2/912: dwrite d0/d1/d3/d10/d38/daf/fd9 [0,4194304] 0 2026-03-10T07:51:50.010 INFO:tasks.workunit.client.1.vm08.stdout:5/995: dread d0/d4/d19/d81/da4/fc2 [0,4194304] 0 2026-03-10T07:51:50.016 INFO:tasks.workunit.client.1.vm08.stdout:2/913: sync 2026-03-10T07:51:50.023 INFO:tasks.workunit.client.1.vm08.stdout:9/922: mkdir d2/d58/dbf/dd0/d35/d97/d9d/df4/dee/d84/dca/dd7/d10d/d112/d13d/d140 0 2026-03-10T07:51:50.026 INFO:tasks.workunit.client.1.vm08.stdout:1/889: creat d2/d6/de/d1f/d26/d58/d83/dc2/f130 x:0 0 0 2026-03-10T07:51:50.029 INFO:tasks.workunit.client.1.vm08.stdout:7/905: getdents d3/da/d25/d9/d2f/d39 0 2026-03-10T07:51:50.031 INFO:tasks.workunit.client.1.vm08.stdout:2/914: creat d0/d1/d3/d56/d78/dad/f127 x:0 0 0 2026-03-10T07:51:50.033 INFO:tasks.workunit.client.1.vm08.stdout:4/854: dread f0 [0,4194304] 0 2026-03-10T07:51:50.033 INFO:tasks.workunit.client.1.vm08.stdout:7/906: write d3/da/d25/d9/d2f/d39/d43/d4f/d5b/db7/f81 [855816,23945] 0 2026-03-10T07:51:50.035 INFO:tasks.workunit.client.1.vm08.stdout:2/915: write d0/d1/d3/d10/d38/daf/f10b [742688,130358] 0 2026-03-10T07:51:50.036 INFO:tasks.workunit.client.1.vm08.stdout:2/916: fdatasync d0/d1/d3/d10/d38/daf/ffc 0 2026-03-10T07:51:50.038 INFO:tasks.workunit.client.1.vm08.stdout:2/917: sync 2026-03-10T07:51:50.041 INFO:tasks.workunit.client.1.vm08.stdout:2/918: truncate d0/d1/d17/f1a 7323477 0 2026-03-10T07:51:50.044 INFO:tasks.workunit.client.1.vm08.stdout:9/923: dread d2/d58/fc6 [0,4194304] 0 2026-03-10T07:51:50.050 INFO:tasks.workunit.client.1.vm08.stdout:5/996: dread d0/d4/d19/d81/d92/fcf [0,4194304] 0 2026-03-10T07:51:50.057 INFO:tasks.workunit.client.1.vm08.stdout:1/890: mknod d2/d6/de/d1f/c131 
0 2026-03-10T07:51:50.061 INFO:tasks.workunit.client.1.vm08.stdout:4/855: fdatasync d5/da0/d95/de6/d48/d4f/fe1 0 2026-03-10T07:51:50.064 INFO:tasks.workunit.client.1.vm08.stdout:0/965: dwrite dd/fe [0,4194304] 0 2026-03-10T07:51:50.073 INFO:tasks.workunit.client.1.vm08.stdout:0/966: sync 2026-03-10T07:51:50.074 INFO:tasks.workunit.client.1.vm08.stdout:6/922: getdents d1/d3/df/d38/def 0 2026-03-10T07:51:50.074 INFO:tasks.workunit.client.1.vm08.stdout:0/967: chown dd/d10/d14/da6 1526 1 2026-03-10T07:51:50.077 INFO:tasks.workunit.client.1.vm08.stdout:0/968: chown dd/d10/f77 3524 1 2026-03-10T07:51:50.084 INFO:tasks.workunit.client.1.vm08.stdout:9/924: mkdir d2/d58/dbf/dd0/d35/d97/d9d/d141 0 2026-03-10T07:51:50.085 INFO:tasks.workunit.client.1.vm08.stdout:5/997: creat d0/d4/d19/d60/d6d/d70/f148 x:0 0 0 2026-03-10T07:51:50.086 INFO:tasks.workunit.client.1.vm08.stdout:5/998: write d0/d8/d24/de2/d128/f141 [244556,61494] 0 2026-03-10T07:51:50.088 INFO:tasks.workunit.client.1.vm08.stdout:3/912: rename d0/d3c/d1f/f7e to d0/d3c/d18/d48/f122 0 2026-03-10T07:51:50.090 INFO:tasks.workunit.client.1.vm08.stdout:1/891: creat d2/d6/de/d1f/d26/d58/d83/f132 x:0 0 0 2026-03-10T07:51:50.094 INFO:tasks.workunit.client.1.vm08.stdout:7/907: creat d3/da/d25/d9/d2f/d39/d43/d4f/d5b/d136/f137 x:0 0 0 2026-03-10T07:51:50.094 INFO:tasks.workunit.client.1.vm08.stdout:2/919: dread d0/d1/d3/d10/f58 [0,4194304] 0 2026-03-10T07:51:50.096 INFO:tasks.workunit.client.1.vm08.stdout:6/923: creat d1/d3/df/d1d/d40/d45/d10c/f137 x:0 0 0 2026-03-10T07:51:50.098 INFO:tasks.workunit.client.1.vm08.stdout:0/969: unlink dd/d10/d2f/d37/d64/d95/d58/d3d/d9b/fb4 0 2026-03-10T07:51:50.131 INFO:tasks.workunit.client.1.vm08.stdout:5/999: symlink d0/d4/d19/d60/d6d/d70/l149 0 2026-03-10T07:51:50.132 INFO:tasks.workunit.client.1.vm08.stdout:3/913: creat d0/d3c/d18/dec/d2d/f123 x:0 0 0 2026-03-10T07:51:50.133 INFO:tasks.workunit.client.1.vm08.stdout:1/892: chown d2/d6/de/d1f/d40/d119/c12c 0 1 2026-03-10T07:51:50.134 
INFO:tasks.workunit.client.1.vm08.stdout:1/893: chown d2/d6/ff0 250 1 2026-03-10T07:51:50.136 INFO:tasks.workunit.client.1.vm08.stdout:7/908: readlink d3/da/d25/d9/d2f/d3a/d71/d8c/ddf/lcb 0 2026-03-10T07:51:50.139 INFO:tasks.workunit.client.1.vm08.stdout:3/914: read d0/f9a [2069705,5919] 0 2026-03-10T07:51:50.141 INFO:tasks.workunit.client.1.vm08.stdout:6/924: fdatasync d1/db/fd2 0 2026-03-10T07:51:50.143 INFO:tasks.workunit.client.1.vm08.stdout:4/856: getdents d5/da0/d95/de6/d48/d4f/d7c/dde/ddf/d10b/d41 0 2026-03-10T07:51:50.143 INFO:tasks.workunit.client.1.vm08.stdout:1/894: symlink d2/d6/de/l133 0 2026-03-10T07:51:50.144 INFO:tasks.workunit.client.1.vm08.stdout:0/970: creat dd/d10/d14/d1b/f13d x:0 0 0 2026-03-10T07:51:50.145 INFO:tasks.workunit.client.1.vm08.stdout:1/895: fsync d2/d6/de/d1f/f2a 0 2026-03-10T07:51:50.150 INFO:tasks.workunit.client.1.vm08.stdout:0/971: creat dd/d10/d2f/d37/d64/d95/d5c/f13e x:0 0 0 2026-03-10T07:51:50.153 INFO:tasks.workunit.client.1.vm08.stdout:2/920: dread d0/d1/d17/db2/dde/d116/fb8 [0,4194304] 0 2026-03-10T07:51:50.155 INFO:tasks.workunit.client.1.vm08.stdout:2/921: chown d0/d1/d3/d56/d78/dad/db1/d61/d84/da7/d123/lc8 5 1 2026-03-10T07:51:50.155 INFO:tasks.workunit.client.1.vm08.stdout:4/857: dread d5/da0/d95/de6/d48/d4f/d7c/dde/ddf/d10b/d70/fa9 [0,4194304] 0 2026-03-10T07:51:50.157 INFO:tasks.workunit.client.1.vm08.stdout:9/925: truncate d2/f5 3465294 0 2026-03-10T07:51:50.159 INFO:tasks.workunit.client.1.vm08.stdout:4/858: dread d5/da0/d95/de6/d48/d4f/d8d/d91/dd5/fea [0,4194304] 0 2026-03-10T07:51:50.164 INFO:tasks.workunit.client.1.vm08.stdout:7/909: dwrite d3/da/d25/d9/f87 [0,4194304] 0 2026-03-10T07:51:50.166 INFO:tasks.workunit.client.1.vm08.stdout:2/922: rmdir d0/d1/d3/d56/d78/dad/db1/d61/d84/da7/d123 39 2026-03-10T07:51:50.167 INFO:tasks.workunit.client.1.vm08.stdout:6/925: truncate d1/d17/f66 8401449 0 2026-03-10T07:51:50.167 INFO:tasks.workunit.client.1.vm08.stdout:0/972: dread dd/d10/d2f/d37/f13b [0,4194304] 0 
2026-03-10T07:51:50.177 INFO:tasks.workunit.client.1.vm08.stdout:1/896: dwrite d2/d6/de/d1f/d26/d58/fb9 [0,4194304] 0 2026-03-10T07:51:50.177 INFO:tasks.workunit.client.1.vm08.stdout:4/859: creat d5/da0/d95/de6/d48/d4f/d7c/dde/ddf/d10b/d31/dce/f120 x:0 0 0 2026-03-10T07:51:50.177 INFO:tasks.workunit.client.1.vm08.stdout:9/926: creat d2/d58/dbf/dd0/d35/d97/d9d/df4/dee/d84/dca/f142 x:0 0 0 2026-03-10T07:51:50.184 INFO:tasks.workunit.client.1.vm08.stdout:3/915: link d0/c98 d0/d3c/d18/d32/d61/d52/dca/dd2/d117/c124 0 2026-03-10T07:51:50.184 INFO:tasks.workunit.client.1.vm08.stdout:6/926: symlink d1/d17/d2b/d58/l138 0 2026-03-10T07:51:50.191 INFO:tasks.workunit.client.1.vm08.stdout:7/910: dread d3/da/d25/d9/d2f/d39/d43/d4f/d5b/db7/dac/fed [0,4194304] 0 2026-03-10T07:51:50.191 INFO:tasks.workunit.client.1.vm08.stdout:0/973: dread dd/d10/d14/d15/d20/f7e [0,4194304] 0 2026-03-10T07:51:50.191 INFO:tasks.workunit.client.1.vm08.stdout:3/916: dread - d0/d3c/d18/d32/d61/f107 zero size 2026-03-10T07:51:50.191 INFO:tasks.workunit.client.1.vm08.stdout:7/911: dread - d3/da/d8a/dd1/f11f zero size 2026-03-10T07:51:50.196 INFO:tasks.workunit.client.1.vm08.stdout:4/860: dwrite d5/da0/d95/de6/d48/d4f/d8d/fcd [0,4194304] 0 2026-03-10T07:51:50.198 INFO:tasks.workunit.client.1.vm08.stdout:0/974: rename dd/d10/d14/d15/d20 to dd/d10/d2f/d37/d64/d11e/d13f 0 2026-03-10T07:51:50.199 INFO:tasks.workunit.client.1.vm08.stdout:7/912: sync 2026-03-10T07:51:50.199 INFO:tasks.workunit.client.1.vm08.stdout:0/975: truncate dd/d10/d2f/d37/d64/d52/da9/f133 208489 0 2026-03-10T07:51:50.200 INFO:tasks.workunit.client.1.vm08.stdout:4/861: chown d5/d8/d50/f116 3 1 2026-03-10T07:51:50.201 INFO:tasks.workunit.client.1.vm08.stdout:2/923: mknod d0/d1/d3/d56/d78/dad/db1/d61/d84/da7/d123/d7d/d7e/dd5/c128 0 2026-03-10T07:51:50.205 INFO:tasks.workunit.client.1.vm08.stdout:9/927: mkdir d2/d58/dbf/dd0/d143 0 2026-03-10T07:51:50.207 INFO:tasks.workunit.client.1.vm08.stdout:3/917: rename d0/d3c/d1f/d89/fff to 
d0/d3c/d18/dec/d34/f125 0 2026-03-10T07:51:50.210 INFO:tasks.workunit.client.1.vm08.stdout:0/976: creat dd/d10/d2f/d37/d64/d11e/d13f/f140 x:0 0 0 2026-03-10T07:51:50.212 INFO:tasks.workunit.client.1.vm08.stdout:2/924: fdatasync d0/d1/d3/d56/d78/dad/db1/d61/d84/da7/d123/de2/d107/d111/f26 0 2026-03-10T07:51:50.214 INFO:tasks.workunit.client.1.vm08.stdout:6/927: rename d1/d17/d2b/d5e/dcb to d1/d3/df/d11e/d139 0 2026-03-10T07:51:50.216 INFO:tasks.workunit.client.1.vm08.stdout:1/897: dread d2/d6/de/d70/d80/f85 [0,4194304] 0 2026-03-10T07:51:50.216 INFO:tasks.workunit.client.1.vm08.stdout:0/977: mkdir dd/d18/d100/d102/dc1/ded/df7/d141 0 2026-03-10T07:51:50.221 INFO:tasks.workunit.client.1.vm08.stdout:9/928: mknod d2/d58/dbf/dd0/d35/d97/dd5/c144 0 2026-03-10T07:51:50.223 INFO:tasks.workunit.client.1.vm08.stdout:6/928: dread d1/d17/d2b/d5e/f96 [0,4194304] 0 2026-03-10T07:51:50.223 INFO:tasks.workunit.client.1.vm08.stdout:1/898: fdatasync d2/d10/fd8 0 2026-03-10T07:51:50.223 INFO:tasks.workunit.client.1.vm08.stdout:0/978: creat dd/d10/d2f/d37/d64/d11e/f142 x:0 0 0 2026-03-10T07:51:50.225 INFO:tasks.workunit.client.1.vm08.stdout:9/929: truncate d2/d58/dbf/dd0/d35/d97/d9d/df4/d28/f107 1354606 0 2026-03-10T07:51:50.225 INFO:tasks.workunit.client.1.vm08.stdout:2/925: link d0/d1/d17/db2/d9c/ff4 d0/d1/d17/dfe/f129 0 2026-03-10T07:51:50.227 INFO:tasks.workunit.client.1.vm08.stdout:1/899: mknod d2/d6/dfe/c134 0 2026-03-10T07:51:50.228 INFO:tasks.workunit.client.1.vm08.stdout:2/926: truncate d0/d1/d3/d3e/f125 612827 0 2026-03-10T07:51:50.229 INFO:tasks.workunit.client.1.vm08.stdout:6/929: creat d1/d17/d2b/d58/d77/daf/f13a x:0 0 0 2026-03-10T07:51:50.231 INFO:tasks.workunit.client.1.vm08.stdout:6/930: readlink d1/d17/d2b/d58/d77/ld4 0 2026-03-10T07:51:50.234 INFO:tasks.workunit.client.1.vm08.stdout:0/979: dread dd/d10/d14/d15/f84 [0,4194304] 0 2026-03-10T07:51:50.235 INFO:tasks.workunit.client.1.vm08.stdout:1/900: link d2/l2c d2/d6/de/d47/dbd/l135 0 2026-03-10T07:51:50.235 
INFO:tasks.workunit.client.1.vm08.stdout:2/927: creat d0/d1/d3/d56/d78/dad/db1/d61/d84/da7/d123/d7d/d86/d55/d7a/f12a x:0 0 0 2026-03-10T07:51:50.238 INFO:tasks.workunit.client.1.vm08.stdout:9/930: dwrite d2/d58/dbf/dd0/d35/d97/d9d/df4/f4d [0,4194304] 0 2026-03-10T07:51:50.240 INFO:tasks.workunit.client.1.vm08.stdout:2/928: chown d0/d1/d3/d56/d78/dad/db1/d61/d84/da7/d123/d7d 3072043 1 2026-03-10T07:51:50.247 INFO:tasks.workunit.client.1.vm08.stdout:2/929: rmdir d0/d1/d3/d10/d38 39 2026-03-10T07:51:50.248 INFO:tasks.workunit.client.1.vm08.stdout:9/931: symlink d2/d58/dbf/dd0/d35/d97/dd5/d106/l145 0 2026-03-10T07:51:50.249 INFO:tasks.workunit.client.1.vm08.stdout:1/901: link d2/d6/de/d1f/c131 d2/d6/de/d1f/d26/d58/d83/d104/d125/c136 0 2026-03-10T07:51:50.249 INFO:tasks.workunit.client.1.vm08.stdout:0/980: getdents dd/d10/d2f/d37/d64/d11e/d13f/d5f/d9f 0 2026-03-10T07:51:50.250 INFO:tasks.workunit.client.1.vm08.stdout:1/902: fdatasync d2/d6/de/d5f/df9/f11d 0 2026-03-10T07:51:50.250 INFO:tasks.workunit.client.1.vm08.stdout:2/930: creat d0/d1/d3/d56/d78/dad/db1/d61/d84/da7/d123/d7d/d86/d55/dc9/f12b x:0 0 0 2026-03-10T07:51:50.256 INFO:tasks.workunit.client.1.vm08.stdout:0/981: creat dd/d18/d100/d102/dc1/d125/f143 x:0 0 0 2026-03-10T07:51:50.257 INFO:tasks.workunit.client.1.vm08.stdout:2/931: chown d0/d1/d3/d10/d38/daf/ldf 1787356 1 2026-03-10T07:51:50.263 INFO:tasks.workunit.client.1.vm08.stdout:2/932: mknod d0/d1/c12c 0 2026-03-10T07:51:50.264 INFO:tasks.workunit.client.1.vm08.stdout:2/933: dread - d0/d1/d3/d10/f11a zero size 2026-03-10T07:51:50.264 INFO:tasks.workunit.client.1.vm08.stdout:2/934: dread - d0/d1/d17/dfb/f110 zero size 2026-03-10T07:51:50.264 INFO:tasks.workunit.client.1.vm08.stdout:1/903: sync 2026-03-10T07:51:50.267 INFO:tasks.workunit.client.1.vm08.stdout:1/904: creat d2/d6/dfe/f137 x:0 0 0 2026-03-10T07:51:50.270 INFO:tasks.workunit.client.1.vm08.stdout:1/905: dwrite d2/d6/dfe/f137 [0,4194304] 0 2026-03-10T07:51:50.270 
INFO:tasks.workunit.client.1.vm08.stdout:1/906: dread - d2/d10/f99 zero size 2026-03-10T07:51:50.278 INFO:tasks.workunit.client.1.vm08.stdout:2/935: mknod d0/d1/d3/d56/d78/dad/db1/d61/d84/da7/df0/c12d 0 2026-03-10T07:51:50.279 INFO:tasks.workunit.client.1.vm08.stdout:1/907: mkdir d2/d6/de/d1f/da9/d138 0 2026-03-10T07:51:50.281 INFO:tasks.workunit.client.1.vm08.stdout:1/908: rmdir d2/d10 39 2026-03-10T07:51:50.282 INFO:tasks.workunit.client.1.vm08.stdout:2/936: truncate d0/d1/d3/d10/d38/daf/ffc 26824 0 2026-03-10T07:51:50.285 INFO:tasks.workunit.client.1.vm08.stdout:1/909: link d2/d6/de/d47/da0/fe9 d2/d10/f139 0 2026-03-10T07:51:50.286 INFO:tasks.workunit.client.1.vm08.stdout:2/937: link d0/d1/d3/d56/d78/dad/db1/d61/d84/cba d0/d1/d3/c12e 0 2026-03-10T07:51:50.288 INFO:tasks.workunit.client.1.vm08.stdout:2/938: read d0/d1/d3/d56/d78/dad/db1/d61/d84/da7/d123/de2/d107/d111/f23 [1901108,32944] 0 2026-03-10T07:51:50.289 INFO:tasks.workunit.client.1.vm08.stdout:2/939: chown d0/d1/d17/db2/dde/d116 6 1 2026-03-10T07:51:50.290 INFO:tasks.workunit.client.1.vm08.stdout:2/940: dread - d0/d1/d3/d56/d78/de4/feb zero size 2026-03-10T07:51:50.302 INFO:tasks.workunit.client.1.vm08.stdout:7/913: write d3/da/d25/d9/d2f/d39/fce [511960,8970] 0 2026-03-10T07:51:50.303 INFO:tasks.workunit.client.1.vm08.stdout:4/862: write d5/da0/f18 [8626675,57626] 0 2026-03-10T07:51:50.309 INFO:tasks.workunit.client.1.vm08.stdout:7/914: mknod d3/da/d25/d9/d2f/d39/d43/d4f/d5b/db7/d107/c138 0 2026-03-10T07:51:50.312 INFO:tasks.workunit.client.1.vm08.stdout:9/932: write d2/d58/d73/f7e [749770,14636] 0 2026-03-10T07:51:50.314 INFO:tasks.workunit.client.1.vm08.stdout:0/982: truncate dd/d10/d2f/d37/d64/d95/d5c/f63 1009680 0 2026-03-10T07:51:50.316 INFO:tasks.workunit.client.1.vm08.stdout:3/918: dwrite d0/d3c/d1f/d95/fab [0,4194304] 0 2026-03-10T07:51:50.317 INFO:tasks.workunit.client.1.vm08.stdout:6/931: dwrite d1/d46/f74 [0,4194304] 0 2026-03-10T07:51:50.318 INFO:tasks.workunit.client.1.vm08.stdout:1/910: 
write d2/d10/f139 [3001961,36030] 0 2026-03-10T07:51:50.325 INFO:tasks.workunit.client.1.vm08.stdout:2/941: dwrite d0/d1/d17/db2/dde/d116/f7c [4194304,4194304] 0 2026-03-10T07:51:50.338 INFO:tasks.workunit.client.1.vm08.stdout:9/933: fdatasync d2/d58/d73/fd3 0 2026-03-10T07:51:50.344 INFO:tasks.workunit.client.1.vm08.stdout:7/915: symlink d3/l139 0 2026-03-10T07:51:50.344 INFO:tasks.workunit.client.1.vm08.stdout:9/934: write d2/d58/dbf/dd0/d35/d97/d9d/df4/d28/f107 [1249671,103624] 0 2026-03-10T07:51:50.345 INFO:tasks.workunit.client.1.vm08.stdout:3/919: chown d0/cf8 0 1 2026-03-10T07:51:50.347 INFO:tasks.workunit.client.1.vm08.stdout:9/935: rename d2/d58/dbf/dd0/d35/dff/f111 to d2/d58/d73/f146 0 2026-03-10T07:51:50.347 INFO:tasks.workunit.client.1.vm08.stdout:6/932: getdents d1/d3/df/d1d/d40/d45/d10c/d129 0 2026-03-10T07:51:50.348 INFO:tasks.workunit.client.1.vm08.stdout:9/936: write d2/d58/dbf/dd0/d35/d97/d110/f13c [1000435,87415] 0 2026-03-10T07:51:50.349 INFO:tasks.workunit.client.1.vm08.stdout:1/911: symlink d2/d6/de/d1f/d26/l13a 0 2026-03-10T07:51:50.351 INFO:tasks.workunit.client.1.vm08.stdout:2/942: symlink d0/d1/d3/d10/d38/daf/l12f 0 2026-03-10T07:51:50.353 INFO:tasks.workunit.client.1.vm08.stdout:6/933: unlink d1/d3/df/d1d/f6b 0 2026-03-10T07:51:50.361 INFO:tasks.workunit.client.1.vm08.stdout:0/983: getdents dd/d18/d100/d102/dc1 0 2026-03-10T07:51:50.364 INFO:tasks.workunit.client.1.vm08.stdout:2/943: symlink d0/d1/d3/d56/d78/l130 0 2026-03-10T07:51:50.368 INFO:tasks.workunit.client.1.vm08.stdout:6/934: sync 2026-03-10T07:51:50.375 INFO:tasks.workunit.client.1.vm08.stdout:4/863: dread d5/f21 [0,4194304] 0 2026-03-10T07:51:50.400 INFO:tasks.workunit.client.1.vm08.stdout:7/916: write d3/da/d25/d9/d2f/d3a/d40/f55 [669886,70146] 0 2026-03-10T07:51:50.402 INFO:tasks.workunit.client.1.vm08.stdout:1/912: write d2/d6/de/d47/dbd/dc3/fd0 [746162,25061] 0 2026-03-10T07:51:50.408 INFO:tasks.workunit.client.1.vm08.stdout:3/920: link d0/d3c/d1f/d44/fb4 
d0/d3c/d18/d32/d61/d83/f126 0 2026-03-10T07:51:50.413 INFO:tasks.workunit.client.1.vm08.stdout:6/935: symlink d1/d3/df/d38/def/l13b 0 2026-03-10T07:51:50.415 INFO:tasks.workunit.client.1.vm08.stdout:7/917: mkdir d3/da/d25/d9/d2f/d3a/d4b/d67/dea/d13a 0 2026-03-10T07:51:50.416 INFO:tasks.workunit.client.1.vm08.stdout:3/921: mkdir d0/d3c/d127 0 2026-03-10T07:51:50.417 INFO:tasks.workunit.client.1.vm08.stdout:3/922: stat d0/d3c/d18/dec/d34/caf 0 2026-03-10T07:51:50.419 INFO:tasks.workunit.client.1.vm08.stdout:9/937: link d2/d58/dbf/dd0/d35/dff/d101/l13a d2/d58/dbf/dd0/d35/d97/d9d/df4/dee/d84/d91/l147 0 2026-03-10T07:51:50.421 INFO:tasks.workunit.client.1.vm08.stdout:3/923: dwrite d0/d3c/d18/d80/ffc [0,4194304] 0 2026-03-10T07:51:50.431 INFO:tasks.workunit.client.1.vm08.stdout:6/936: symlink d1/d17/d2b/d5e/da8/l13c 0 2026-03-10T07:51:50.432 INFO:tasks.workunit.client.1.vm08.stdout:0/984: write dd/d10/d2f/d37/d64/d11e/d13f/f7e [680835,96028] 0 2026-03-10T07:51:50.434 INFO:tasks.workunit.client.1.vm08.stdout:4/864: unlink d5/d8/c4b 0 2026-03-10T07:51:50.442 INFO:tasks.workunit.client.1.vm08.stdout:7/918: mknod d3/da/d25/d9/d2f/d39/d43/d4f/d5b/d136/c13b 0 2026-03-10T07:51:50.442 INFO:tasks.workunit.client.1.vm08.stdout:9/938: mkdir d2/d58/dbf/dd0/d35/d97/d9d/df4/dee/d84/dca/d148 0 2026-03-10T07:51:50.442 INFO:tasks.workunit.client.1.vm08.stdout:2/944: dwrite d0/d1/d3/d10/d38/f53 [0,4194304] 0 2026-03-10T07:51:50.447 INFO:tasks.workunit.client.1.vm08.stdout:6/937: rename d1/d3/df/d1d/d6f/d10e to d1/d17/d2b/d58/d76/d133/d13d 0 2026-03-10T07:51:50.447 INFO:tasks.workunit.client.1.vm08.stdout:4/865: symlink d5/da0/d95/l121 0 2026-03-10T07:51:50.450 INFO:tasks.workunit.client.1.vm08.stdout:7/919: dread d3/da/d25/d9/f87 [0,4194304] 0 2026-03-10T07:51:50.457 INFO:tasks.workunit.client.1.vm08.stdout:0/985: dread dd/d10/d14/d15/f9c [4194304,4194304] 0 2026-03-10T07:51:50.458 INFO:tasks.workunit.client.1.vm08.stdout:1/913: getdents d2/d6/de/d1f/d26/d58/d83/d104/d125 0 
2026-03-10T07:51:50.460 INFO:tasks.workunit.client.1.vm08.stdout:1/914: write d2/d6/de/d1f/d26/d89/ffa [1171781,71422] 0 2026-03-10T07:51:50.460 INFO:tasks.workunit.client.1.vm08.stdout:2/945: chown d0/d1/d3/d56/d78/dad/db1/d61/l11e 14 1 2026-03-10T07:51:50.462 INFO:tasks.workunit.client.1.vm08.stdout:7/920: rmdir d3/da/d25/d9/d2f/d39/d43/d4f/d5b/d79/ddd 39 2026-03-10T07:51:50.464 INFO:tasks.workunit.client.1.vm08.stdout:1/915: mkdir d2/d6/de/d1f/d26/d98/d13b 0 2026-03-10T07:51:50.469 INFO:tasks.workunit.client.1.vm08.stdout:0/986: rename dd/d10/d2f/d37/d64/d95/d58/d3d/d9b/fb1 to dd/d10/d2f/d37/d64/d95/d5c/dca/ddb/f144 0 2026-03-10T07:51:50.469 INFO:tasks.workunit.client.1.vm08.stdout:7/921: truncate d3/da/d25/d9/d2f/d3a/d71/d8c/ddf/f101 4949954 0 2026-03-10T07:51:50.469 INFO:tasks.workunit.client.1.vm08.stdout:7/922: chown d3/da/d25/d9/d2f/d4d/d120 30100 1 2026-03-10T07:51:50.469 INFO:tasks.workunit.client.1.vm08.stdout:7/923: write d3/f93 [4301410,40289] 0 2026-03-10T07:51:50.469 INFO:tasks.workunit.client.1.vm08.stdout:7/924: write d3/da/d25/d9/d2f/d3a/d71/d8c/ddf/ff6 [4317784,111346] 0 2026-03-10T07:51:50.470 INFO:tasks.workunit.client.1.vm08.stdout:0/987: write dd/d10/d2f/d37/d64/d95/d5c/f13e [109861,107269] 0 2026-03-10T07:51:50.473 INFO:tasks.workunit.client.1.vm08.stdout:7/925: fsync d3/da/d25/d9/d2f/d3a/d71/dca/f10e 0 2026-03-10T07:51:50.475 INFO:tasks.workunit.client.1.vm08.stdout:7/926: truncate d3/da/d25/d9/d2f/d39/d43/d4f/d5b/f12b 767106 0 2026-03-10T07:51:50.477 INFO:tasks.workunit.client.1.vm08.stdout:0/988: creat dd/d10/d2f/d37/d64/d11e/d13f/d22/dc6/f145 x:0 0 0 2026-03-10T07:51:50.480 INFO:tasks.workunit.client.1.vm08.stdout:1/916: dread d2/d6/de/d47/da0/f101 [0,4194304] 0 2026-03-10T07:51:50.482 INFO:tasks.workunit.client.1.vm08.stdout:7/927: sync 2026-03-10T07:51:50.482 INFO:tasks.workunit.client.1.vm08.stdout:0/989: sync 2026-03-10T07:51:50.483 INFO:tasks.workunit.client.1.vm08.stdout:1/917: creat d2/d6/de/d1f/d26/d58/d8c/f13c x:0 0 0 
2026-03-10T07:51:50.484 INFO:tasks.workunit.client.1.vm08.stdout:3/924: write d0/d3c/d1f/d44/f59 [625023,102521] 0 2026-03-10T07:51:50.486 INFO:tasks.workunit.client.1.vm08.stdout:6/938: write d1/db/fd2 [3579669,102562] 0 2026-03-10T07:51:50.486 INFO:tasks.workunit.client.1.vm08.stdout:4/866: write d5/da0/d95/de6/d48/d4f/d7c/dde/ddf/d10b/d31/f38 [2480711,44878] 0 2026-03-10T07:51:50.486 INFO:tasks.workunit.client.1.vm08.stdout:0/990: read dd/d10/d2f/d37/d64/d95/d5c/f8f [2694766,9471] 0 2026-03-10T07:51:50.487 INFO:tasks.workunit.client.1.vm08.stdout:6/939: readlink d1/db/d24/dac/ld5 0 2026-03-10T07:51:50.487 INFO:tasks.workunit.client.1.vm08.stdout:1/918: truncate d2/d6/de/d70/ffd 229948 0 2026-03-10T07:51:50.491 INFO:tasks.workunit.client.1.vm08.stdout:9/939: truncate d2/d58/dbf/dd0/d35/d97/d9d/df4/d28/d98/fd4 2220407 0 2026-03-10T07:51:50.492 INFO:tasks.workunit.client.1.vm08.stdout:7/928: link d3/f34 d3/da/d25/d9/d2f/d4d/f13c 0 2026-03-10T07:51:50.495 INFO:tasks.workunit.client.1.vm08.stdout:2/946: write d0/d1/d17/dfe/f129 [1039976,68788] 0 2026-03-10T07:51:50.497 INFO:tasks.workunit.client.1.vm08.stdout:4/867: read d5/f2f [2407485,42696] 0 2026-03-10T07:51:50.497 INFO:tasks.workunit.client.1.vm08.stdout:9/940: dwrite d2/d58/dbf/dd0/d35/d97/d9d/df4/dee/d84/dca/f142 [0,4194304] 0 2026-03-10T07:51:50.499 INFO:tasks.workunit.client.1.vm08.stdout:4/868: chown d5/da0/d95/f96 38776005 1 2026-03-10T07:51:50.546 INFO:tasks.workunit.client.1.vm08.stdout:1/919: symlink d2/d6/de/d1f/d26/d58/l13d 0 2026-03-10T07:51:50.550 INFO:tasks.workunit.client.1.vm08.stdout:7/929: creat d3/da/d25/d9/d2f/d3a/d40/d54/f13d x:0 0 0 2026-03-10T07:51:50.554 INFO:tasks.workunit.client.1.vm08.stdout:2/947: rmdir d0/d1/d3 39 2026-03-10T07:51:50.558 INFO:tasks.workunit.client.1.vm08.stdout:1/920: dread d2/d6/de/d1f/d26/d58/d8c/f46 [0,4194304] 0 2026-03-10T07:51:50.569 INFO:tasks.workunit.client.1.vm08.stdout:4/869: creat d5/da0/d95/dc2/f122 x:0 0 0 2026-03-10T07:51:50.570 
INFO:tasks.workunit.client.1.vm08.stdout:9/941: mkdir d2/d58/dbf/dd0/d35/d97/d9d/df4/dee/d84/d91/d149 0 2026-03-10T07:51:50.571 INFO:tasks.workunit.client.1.vm08.stdout:9/942: stat d2/d58/dbf/dd0/d35/d97/d9d/df4/dee/d84/d91/f120 0 2026-03-10T07:51:50.575 INFO:tasks.workunit.client.1.vm08.stdout:9/943: dwrite d2/d58/dbf/dd0/d35/d97/d9d/df4/d28/d98/fc4 [0,4194304] 0 2026-03-10T07:51:50.576 INFO:tasks.workunit.client.1.vm08.stdout:9/944: chown d2/d58/dbf/dd0/d35/d97/d9d/df4/d28/d98/dbb/dd9/dfc 59 1 2026-03-10T07:51:50.577 INFO:tasks.workunit.client.1.vm08.stdout:2/948: mkdir d0/d1/d17/db2/dde/d116/d131 0 2026-03-10T07:51:50.586 INFO:tasks.workunit.client.1.vm08.stdout:9/945: rmdir d2/d58/dbf/dd0/d35/dff/d101 39 2026-03-10T07:51:50.587 INFO:tasks.workunit.client.1.vm08.stdout:9/946: read d2/d58/dbf/dd0/d35/f6c [441247,35008] 0 2026-03-10T07:51:50.588 INFO:tasks.workunit.client.1.vm08.stdout:1/921: getdents d2/d6/de/d1f/d22/dd3 0 2026-03-10T07:51:50.591 INFO:tasks.workunit.client.1.vm08.stdout:9/947: symlink d2/l14a 0 2026-03-10T07:51:50.593 INFO:tasks.workunit.client.1.vm08.stdout:9/948: read d2/d58/dbf/dd0/d35/d97/d9d/df4/d28/d98/fc4 [2547545,19902] 0 2026-03-10T07:51:50.595 INFO:tasks.workunit.client.1.vm08.stdout:2/949: dread d0/d1/d3/d3e/f125 [0,4194304] 0 2026-03-10T07:51:50.605 INFO:tasks.workunit.client.1.vm08.stdout:9/949: dread d2/d58/dbf/dd0/d35/d97/dd5/fe6 [0,4194304] 0 2026-03-10T07:51:50.606 INFO:tasks.workunit.client.1.vm08.stdout:1/922: dread d2/d6/de/d1f/da9/fcd [0,4194304] 0 2026-03-10T07:51:50.658 INFO:tasks.workunit.client.1.vm08.stdout:3/925: rename d0/d3c/d18/d48/f122 to d0/d3c/d18/d32/daa/f128 0 2026-03-10T07:51:50.666 INFO:tasks.workunit.client.1.vm08.stdout:3/926: creat d0/d3c/d18/d48/d55/f129 x:0 0 0 2026-03-10T07:51:50.671 INFO:tasks.workunit.client.1.vm08.stdout:3/927: dwrite d0/d3c/f87 [0,4194304] 0 2026-03-10T07:51:50.674 INFO:tasks.workunit.client.1.vm08.stdout:3/928: creat d0/d3c/d1f/d89/ddb/f12a x:0 0 0 2026-03-10T07:51:50.678 
INFO:tasks.workunit.client.1.vm08.stdout:3/929: mknod d0/d3c/d18/d32/d61/d52/dca/c12b 0 2026-03-10T07:51:50.679 INFO:tasks.workunit.client.1.vm08.stdout:3/930: fdatasync d0/d3c/d18/d32/daa/ded/f116 0 2026-03-10T07:51:50.687 INFO:tasks.workunit.client.1.vm08.stdout:3/931: dread d0/d3c/d18/dec/d2d/d85/fa0 [0,4194304] 0 2026-03-10T07:51:50.689 INFO:tasks.workunit.client.1.vm08.stdout:3/932: getdents d0/d3c/d18/d80/dc1/d108 0 2026-03-10T07:51:50.690 INFO:tasks.workunit.client.1.vm08.stdout:3/933: chown d0/f84 1483183 1 2026-03-10T07:51:50.691 INFO:tasks.workunit.client.1.vm08.stdout:3/934: chown d0/d3c/d1f/d95/fbd 16249 1 2026-03-10T07:51:50.691 INFO:tasks.workunit.client.1.vm08.stdout:3/935: chown d0/d3c/d18/d48/d55/f11f 27293687 1 2026-03-10T07:51:50.701 INFO:tasks.workunit.client.1.vm08.stdout:0/991: truncate dd/d10/d2f/d37/d64/f70 3201740 0 2026-03-10T07:51:50.716 INFO:tasks.workunit.client.1.vm08.stdout:0/992: dread dd/d10/dbd/d10d/d111/ff0 [0,4194304] 0 2026-03-10T07:51:50.725 INFO:tasks.workunit.client.1.vm08.stdout:0/993: dread dd/d10/d14/f46 [0,4194304] 0 2026-03-10T07:51:50.727 INFO:tasks.workunit.client.1.vm08.stdout:0/994: creat dd/d10/dbd/d10d/f146 x:0 0 0 2026-03-10T07:51:50.728 INFO:tasks.workunit.client.1.vm08.stdout:0/995: creat dd/d18/d100/d102/dc1/d11a/f147 x:0 0 0 2026-03-10T07:51:50.732 INFO:tasks.workunit.client.1.vm08.stdout:0/996: dwrite dd/d10/d2f/d37/d64/d95/d58/d3d/d9b/f129 [0,4194304] 0 2026-03-10T07:51:50.740 INFO:tasks.workunit.client.1.vm08.stdout:0/997: getdents dd/d10/d2f/d37/d64/d95/d5c 0 2026-03-10T07:51:50.748 INFO:tasks.workunit.client.1.vm08.stdout:4/870: truncate d5/d8/f68 4118072 0 2026-03-10T07:51:50.765 INFO:tasks.workunit.client.1.vm08.stdout:4/871: dwrite d5/fd4 [0,4194304] 0 2026-03-10T07:51:50.765 INFO:tasks.workunit.client.1.vm08.stdout:4/872: mknod d5/da0/d95/de6/d48/d4f/d7c/dde/ddf/d10b/dad/c123 0 2026-03-10T07:51:50.765 INFO:tasks.workunit.client.1.vm08.stdout:1/923: write d2/d6/de/d1f/d26/f2f [6342822,107275] 0 
2026-03-10T07:51:50.765 INFO:tasks.workunit.client.1.vm08.stdout:2/950: creat d0/d1/d17/f132 x:0 0 0 2026-03-10T07:51:50.767 INFO:tasks.workunit.client.1.vm08.stdout:9/950: dwrite d2/f1a [0,4194304] 0 2026-03-10T07:51:50.771 INFO:tasks.workunit.client.1.vm08.stdout:2/951: creat d0/d1/d3/d10/d38/daf/f133 x:0 0 0 2026-03-10T07:51:50.772 INFO:tasks.workunit.client.1.vm08.stdout:9/951: truncate d2/d58/dbf/ddf/f12a 434640 0 2026-03-10T07:51:50.773 INFO:tasks.workunit.client.1.vm08.stdout:9/952: write d2/d58/d73/f7e [352283,34792] 0 2026-03-10T07:51:50.773 INFO:tasks.workunit.client.1.vm08.stdout:2/952: symlink d0/d1/d3/d56/d78/dad/db1/d61/d84/da7/d123/d7d/d7e/l134 0 2026-03-10T07:51:50.777 INFO:tasks.workunit.client.1.vm08.stdout:2/953: creat d0/d1/d17/db2/dc3/f135 x:0 0 0 2026-03-10T07:51:50.778 INFO:tasks.workunit.client.1.vm08.stdout:9/953: mkdir d2/d58/dbf/dd0/d35/d97/d9d/df4/dee/d84/d91/d149/d14b 0 2026-03-10T07:51:50.780 INFO:tasks.workunit.client.1.vm08.stdout:9/954: mkdir d2/d58/dbf/dd0/d35/d97/d9d/df4/d28/d98/dbb/dd9/d14c 0 2026-03-10T07:51:50.781 INFO:tasks.workunit.client.1.vm08.stdout:2/954: read d0/d1/d3/d56/d78/dad/db1/d61/d84/da7/d123/d7d/d86/d55/fb0 [2717886,31216] 0 2026-03-10T07:51:50.783 INFO:tasks.workunit.client.1.vm08.stdout:2/955: dread - d0/d1/d3/d56/d78/dad/db1/d61/d84/da7/d123/d7d/d86/d55/dc9/ded/ffd zero size 2026-03-10T07:51:50.783 INFO:tasks.workunit.client.1.vm08.stdout:9/955: dread d2/d58/fc6 [0,4194304] 0 2026-03-10T07:51:50.785 INFO:tasks.workunit.client.1.vm08.stdout:2/956: chown d0/d1/d3/d56/d78/dad/db1/cbe 20912 1 2026-03-10T07:51:50.787 INFO:tasks.workunit.client.1.vm08.stdout:9/956: dwrite d2/d58/dbf/dd0/d35/d9b/fa9 [0,4194304] 0 2026-03-10T07:51:50.797 INFO:tasks.workunit.client.1.vm08.stdout:9/957: dread - d2/d58/ffd zero size 2026-03-10T07:51:50.797 INFO:tasks.workunit.client.1.vm08.stdout:9/958: chown d2/d58/dbf/dd0/d35/d97/d9d/fbd 1 1 2026-03-10T07:51:50.801 INFO:tasks.workunit.client.1.vm08.stdout:9/959: fdatasync d2/d58/f95 0 
2026-03-10T07:51:50.809 INFO:tasks.workunit.client.1.vm08.stdout:9/960: mknod d2/d58/dbf/dd0/d35/d97/d9d/d141/c14d 0 2026-03-10T07:51:50.820 INFO:tasks.workunit.client.1.vm08.stdout:9/961: creat d2/d58/dbf/dd0/d35/dff/f14e x:0 0 0 2026-03-10T07:51:50.820 INFO:tasks.workunit.client.1.vm08.stdout:9/962: rmdir d2/d58/dbf/dd0/d35/d97/d9d/df4/dee/d84/d91 39 2026-03-10T07:51:50.821 INFO:tasks.workunit.client.1.vm08.stdout:9/963: truncate d2/d58/dbf/d2b/f7a 482117 0 2026-03-10T07:51:50.830 INFO:tasks.workunit.client.1.vm08.stdout:4/873: sync 2026-03-10T07:51:50.830 INFO:tasks.workunit.client.1.vm08.stdout:2/957: sync 2026-03-10T07:51:50.831 INFO:tasks.workunit.client.1.vm08.stdout:4/874: chown d5/da0/d95/de6/lf8 532697935 1 2026-03-10T07:51:50.836 INFO:tasks.workunit.client.1.vm08.stdout:2/958: dread d0/d1/d3/d56/d78/dad/db1/d61/d84/da7/d123/d7d/d86/fa9 [0,4194304] 0 2026-03-10T07:51:50.836 INFO:tasks.workunit.client.1.vm08.stdout:4/875: creat d5/da0/d95/de6/d48/d4f/d7c/dde/ddf/d10b/d9b/f124 x:0 0 0 2026-03-10T07:51:50.841 INFO:tasks.workunit.client.1.vm08.stdout:2/959: chown d0/d1/d3/d10/d38/daf/ldf 4659638 1 2026-03-10T07:51:50.841 INFO:tasks.workunit.client.1.vm08.stdout:6/940: rename d1/d3/df/d11e/d139 to d1/d17/d2b/d58/d13e 0 2026-03-10T07:51:50.841 INFO:tasks.workunit.client.1.vm08.stdout:6/941: dread - d1/f10d zero size 2026-03-10T07:51:50.841 INFO:tasks.workunit.client.1.vm08.stdout:4/876: chown d5/da0/d95/de6/d48/d4f/d7c/dde/ddf/d10b/cc9 13950786 1 2026-03-10T07:51:50.841 INFO:tasks.workunit.client.1.vm08.stdout:7/930: symlink d3/da/d25/d9/d2f/l13e 0 2026-03-10T07:51:50.841 INFO:tasks.workunit.client.1.vm08.stdout:6/942: write d1/d3/d3e/f56 [245531,70343] 0 2026-03-10T07:51:50.842 INFO:tasks.workunit.client.1.vm08.stdout:3/936: rename d0/d3c/d1f/d89/fa4 to d0/d3c/d18/d48/d55/d56/f12c 0 2026-03-10T07:51:50.844 INFO:tasks.workunit.client.1.vm08.stdout:3/937: chown d0/d3c/d1f/d89/cb8 34129 1 2026-03-10T07:51:50.848 INFO:tasks.workunit.client.1.vm08.stdout:0/998: 
dread dd/d10/d2f/d37/d64/f70 [0,4194304] 0 2026-03-10T07:51:50.849 INFO:tasks.workunit.client.1.vm08.stdout:4/877: symlink d5/da0/d95/de6/l125 0 2026-03-10T07:51:50.852 INFO:tasks.workunit.client.1.vm08.stdout:0/999: sync 2026-03-10T07:51:50.853 INFO:tasks.workunit.client.1.vm08.stdout:2/960: creat d0/d1/d3/d56/d78/dad/db1/d61/d84/da7/d123/d7d/d86/d55/d7a/f136 x:0 0 0 2026-03-10T07:51:50.858 INFO:tasks.workunit.client.1.vm08.stdout:4/878: truncate d5/da0/d95/de6/d48/d4f/d7c/dde/ddf/d10b/d31/f82 788312 0 2026-03-10T07:51:50.859 INFO:tasks.workunit.client.1.vm08.stdout:1/924: write d2/d6/de/d1f/d22/deb/f11b [700799,118804] 0 2026-03-10T07:51:50.859 INFO:tasks.workunit.client.1.vm08.stdout:2/961: mknod d0/d1/d17/db2/d9c/c137 0 2026-03-10T07:51:50.864 INFO:tasks.workunit.client.1.vm08.stdout:7/931: creat d3/da/d25/d9/d2f/d39/d43/d4f/d5b/d79/ddd/de8/f13f x:0 0 0 2026-03-10T07:51:50.865 INFO:tasks.workunit.client.1.vm08.stdout:1/925: creat d2/d6/de/d1f/d26/d58/d83/d104/d125/f13e x:0 0 0 2026-03-10T07:51:50.866 INFO:tasks.workunit.client.1.vm08.stdout:6/943: rename d1/d17/d2b/d5e/dd6 to d1/d17/d2b/d58/d13f 0 2026-03-10T07:51:50.869 INFO:tasks.workunit.client.1.vm08.stdout:2/962: dwrite d0/d1/f24 [0,4194304] 0 2026-03-10T07:51:50.869 INFO:tasks.workunit.client.1.vm08.stdout:3/938: creat d0/d3c/d18/d48/d55/f12d x:0 0 0 2026-03-10T07:51:50.878 INFO:tasks.workunit.client.1.vm08.stdout:9/964: write d2/d58/dbf/dd0/d35/d97/d9d/df4/d28/fec [503407,68466] 0 2026-03-10T07:51:50.880 INFO:tasks.workunit.client.1.vm08.stdout:4/879: dwrite d5/d8/d89/fb5 [0,4194304] 0 2026-03-10T07:51:50.882 INFO:tasks.workunit.client.1.vm08.stdout:6/944: mkdir d1/d17/d2b/d58/d76/d133/d140 0 2026-03-10T07:51:50.883 INFO:tasks.workunit.client.1.vm08.stdout:6/945: dread - d1/d3/d3e/db2/f11d zero size 2026-03-10T07:51:50.883 INFO:tasks.workunit.client.1.vm08.stdout:2/963: dread - d0/d1/d3/d56/d78/dad/db1/d61/d84/da7/d123/d7d/d86/d55/d7a/f136 zero size 2026-03-10T07:51:50.885 
INFO:tasks.workunit.client.1.vm08.stdout:4/880: write d5/da0/d95/de6/d48/d4f/d7c/dde/ddf/d10b/d70/f78 [4882421,44723] 0 2026-03-10T07:51:50.886 INFO:tasks.workunit.client.1.vm08.stdout:1/926: dwrite d2/d6/d9f/fa7 [0,4194304] 0 2026-03-10T07:51:50.889 INFO:tasks.workunit.client.1.vm08.stdout:4/881: chown d5/da0/d95/de6/d48/d4f/d7c/dde/ddf/d10b/dad/cc4 0 1 2026-03-10T07:51:50.893 INFO:tasks.workunit.client.1.vm08.stdout:3/939: mkdir d0/d3c/d18/d32/d61/d52/dca/d12e 0 2026-03-10T07:51:50.898 INFO:tasks.workunit.client.1.vm08.stdout:7/932: rename d3/da/d25/d9/d2f/d39/d43/d4f/d5b/db7/f49 to d3/da/f140 0 2026-03-10T07:51:50.899 INFO:tasks.workunit.client.1.vm08.stdout:1/927: fsync d2/d6/de/d1f/d22/f24 0 2026-03-10T07:51:50.908 INFO:tasks.workunit.client.1.vm08.stdout:7/933: mknod d3/da/d25/d9/d2f/d6c/de5/c141 0 2026-03-10T07:51:50.918 INFO:tasks.workunit.client.1.vm08.stdout:1/928: rmdir d2/d6/de/d1f/d26/d98/d9b 39 2026-03-10T07:51:50.918 INFO:tasks.workunit.client.1.vm08.stdout:1/929: dread - d2/d6/de/d70/fca zero size 2026-03-10T07:51:50.918 INFO:tasks.workunit.client.1.vm08.stdout:7/934: creat d3/da/d25/d9/d2f/d39/d43/d4f/d5b/d79/f142 x:0 0 0 2026-03-10T07:51:50.918 INFO:tasks.workunit.client.1.vm08.stdout:1/930: creat d2/d6/de/f13f x:0 0 0 2026-03-10T07:51:50.918 INFO:tasks.workunit.client.1.vm08.stdout:1/931: unlink d2/d6/de/d71/cf5 0 2026-03-10T07:51:50.918 INFO:tasks.workunit.client.1.vm08.stdout:7/935: truncate d3/da/d25/d9/d2f/d39/d43/d4f/d5b/d79/ddd/de8/f13f 387239 0 2026-03-10T07:51:50.918 INFO:tasks.workunit.client.1.vm08.stdout:7/936: truncate d3/da/d25/d9/d2f/d4d/f13c 7179587 0 2026-03-10T07:51:50.918 INFO:tasks.workunit.client.1.vm08.stdout:7/937: chown d3/f2b 1012 1 2026-03-10T07:51:50.920 INFO:tasks.workunit.client.1.vm08.stdout:7/938: rename d3/da/d25/d9/d2f/d39/d43/d4f/d5b/db7/dac/f125 to d3/da/d25/d9/d2f/d3a/d40/d54/db5/f143 0 2026-03-10T07:51:50.924 INFO:tasks.workunit.client.1.vm08.stdout:7/939: dwrite d3/da/d25/d9/d2f/d39/fc7 [4194304,4194304] 0 
2026-03-10T07:51:50.934 INFO:tasks.workunit.client.1.vm08.stdout:7/940: creat d3/da/d25/d9/d2f/d39/d43/d4f/d5b/db7/d113/f144 x:0 0 0 2026-03-10T07:51:50.935 INFO:tasks.workunit.client.1.vm08.stdout:7/941: write d3/da/d25/d9/d2f/d3a/d71/d8c/ddf/ff6 [1185187,99615] 0 2026-03-10T07:51:50.936 INFO:tasks.workunit.client.1.vm08.stdout:7/942: chown d3/da/d25/d9/d2f/d39/d43/f7e 1881933 1 2026-03-10T07:51:50.937 INFO:tasks.workunit.client.1.vm08.stdout:7/943: write d3/da/d25/d9/d2f/d4d/db6/f128 [279516,1738] 0 2026-03-10T07:51:50.941 INFO:tasks.workunit.client.1.vm08.stdout:7/944: mkdir d3/da/d25/d9/d2f/d39/d43/d4f/d5b/d79/d145 0 2026-03-10T07:51:50.944 INFO:tasks.workunit.client.1.vm08.stdout:7/945: write d3/da/d25/d9/d2f/d39/d43/d4f/d5b/f12b [967252,53300] 0 2026-03-10T07:51:50.949 INFO:tasks.workunit.client.1.vm08.stdout:4/882: dread d5/da0/de2/ff7 [4194304,4194304] 0 2026-03-10T07:51:50.952 INFO:tasks.workunit.client.1.vm08.stdout:4/883: dwrite d5/da0/d95/dc2/fc5 [0,4194304] 0 2026-03-10T07:51:50.955 INFO:tasks.workunit.client.1.vm08.stdout:4/884: read d5/da0/de2/f10e [3662106,11972] 0 2026-03-10T07:51:50.961 INFO:tasks.workunit.client.1.vm08.stdout:4/885: creat d5/da0/d95/de6/d48/d4f/d7c/dde/ddf/f126 x:0 0 0 2026-03-10T07:51:50.964 INFO:tasks.workunit.client.1.vm08.stdout:9/965: write d2/d58/dbf/dd0/d35/d97/d9d/df4/dee/fc [209152,79381] 0 2026-03-10T07:51:50.965 INFO:tasks.workunit.client.1.vm08.stdout:4/886: dwrite d5/d8/d50/f116 [0,4194304] 0 2026-03-10T07:51:50.966 INFO:tasks.workunit.client.1.vm08.stdout:9/966: stat d2/d58/dbf/dd0/d35/d97/d9d/d141 0 2026-03-10T07:51:50.969 INFO:tasks.workunit.client.1.vm08.stdout:9/967: chown d2/d58/dbf/dd0/d35/d97/d9d/df4/dee/d84/dca/d148 3210 1 2026-03-10T07:51:50.972 INFO:tasks.workunit.client.1.vm08.stdout:6/946: truncate d1/d3/f21 4007768 0 2026-03-10T07:51:50.978 INFO:tasks.workunit.client.1.vm08.stdout:1/932: write d2/fe6 [1015899,76395] 0 2026-03-10T07:51:50.978 INFO:tasks.workunit.client.1.vm08.stdout:2/964: dwrite 
d0/d1/d3/d56/d78/dad/fc6 [0,4194304] 0 2026-03-10T07:51:50.980 INFO:tasks.workunit.client.1.vm08.stdout:3/940: dwrite d0/d3c/d1f/ff5 [0,4194304] 0 2026-03-10T07:51:50.983 INFO:tasks.workunit.client.1.vm08.stdout:2/965: readlink d0/d1/d3/d10/d38/l47 0 2026-03-10T07:51:50.984 INFO:tasks.workunit.client.1.vm08.stdout:1/933: dread d2/d6/d9f/fa7 [0,4194304] 0 2026-03-10T07:51:50.987 INFO:tasks.workunit.client.1.vm08.stdout:4/887: dwrite d5/da0/d95/de6/d48/d4f/d7c/dde/ddf/d10b/d70/fa9 [4194304,4194304] 0 2026-03-10T07:51:50.998 INFO:tasks.workunit.client.1.vm08.stdout:9/968: rename d2/d58/dbf/dd0/d35/d97/d9d/df4/dee/d84/dca/dd7/d10d/d112 to d2/d58/dbf/ddf/d14f 0 2026-03-10T07:51:51.001 INFO:tasks.workunit.client.1.vm08.stdout:3/941: creat d0/d3c/d18/d32/d61/d83/f12f x:0 0 0 2026-03-10T07:51:51.001 INFO:tasks.workunit.client.1.vm08.stdout:9/969: chown d2/d58/dbf/dd0/d35/d97/d9d/df4/dee/d84/dca/dd7/lf5 0 1 2026-03-10T07:51:51.009 INFO:tasks.workunit.client.1.vm08.stdout:1/934: mkdir d2/d6/d9f/d140 0 2026-03-10T07:51:51.022 INFO:tasks.workunit.client.1.vm08.stdout:4/888: mknod d5/da0/d95/de6/d48/d4f/d7c/dde/ddf/d10b/d31/c127 0 2026-03-10T07:51:51.022 INFO:tasks.workunit.client.1.vm08.stdout:4/889: dread - d5/da0/d95/de6/d48/d4f/d7c/dde/ddf/d10b/d9b/f124 zero size 2026-03-10T07:51:51.023 INFO:tasks.workunit.client.1.vm08.stdout:6/947: rmdir d1/d17/d2b/d58/d76/d133/d140 0 2026-03-10T07:51:51.024 INFO:tasks.workunit.client.1.vm08.stdout:4/890: creat d5/da0/d95/dc2/f128 x:0 0 0 2026-03-10T07:51:51.025 INFO:tasks.workunit.client.1.vm08.stdout:3/942: rename d0/d3c/d1f/d95/fab to d0/f130 0 2026-03-10T07:51:51.026 INFO:tasks.workunit.client.1.vm08.stdout:6/948: truncate d1/db/d24/dac/dad/f59 2980611 0 2026-03-10T07:51:51.027 INFO:tasks.workunit.client.1.vm08.stdout:6/949: readlink d1/d17/d2b/d58/d76/le2 0 2026-03-10T07:51:51.028 INFO:tasks.workunit.client.1.vm08.stdout:3/943: mknod d0/d3c/d18/d48/d55/d56/c131 0 2026-03-10T07:51:51.028 INFO:tasks.workunit.client.1.vm08.stdout:6/950: 
fdatasync d1/d17/d2b/fa7 0 2026-03-10T07:51:51.029 INFO:tasks.workunit.client.1.vm08.stdout:6/951: rmdir d1/d3/df/d1d/d6f 39 2026-03-10T07:51:51.034 INFO:tasks.workunit.client.1.vm08.stdout:9/970: sync 2026-03-10T07:51:51.036 INFO:tasks.workunit.client.1.vm08.stdout:6/952: read d1/d3/df/fd7 [28570,39196] 0 2026-03-10T07:51:51.039 INFO:tasks.workunit.client.1.vm08.stdout:9/971: dread d2/d58/fb3 [0,4194304] 0 2026-03-10T07:51:51.044 INFO:tasks.workunit.client.1.vm08.stdout:9/972: truncate d2/d58/dbf/d2b/f6a 4050455 0 2026-03-10T07:51:51.045 INFO:tasks.workunit.client.1.vm08.stdout:3/944: dread d0/d3c/d1f/d44/f8c [0,4194304] 0 2026-03-10T07:51:51.045 INFO:tasks.workunit.client.1.vm08.stdout:9/973: fsync d2/d58/dbf/dd0/d35/f79 0 2026-03-10T07:51:51.045 INFO:tasks.workunit.client.1.vm08.stdout:9/974: chown d2/la3 1209 1 2026-03-10T07:51:51.047 INFO:tasks.workunit.client.1.vm08.stdout:3/945: write d0/d3c/d18/d80/fe3 [1272640,120147] 0 2026-03-10T07:51:51.049 INFO:tasks.workunit.client.1.vm08.stdout:9/975: dread - d2/d58/dbf/dd0/d35/d97/d9d/df4/dee/d84/d91/db8/feb zero size 2026-03-10T07:51:51.049 INFO:tasks.workunit.client.1.vm08.stdout:9/976: write d2/d58/dbf/dd0/d35/d97/d9d/fbd [542146,26916] 0 2026-03-10T07:51:51.058 INFO:tasks.workunit.client.1.vm08.stdout:6/953: read d1/d3/d3e/f56 [6911630,18473] 0 2026-03-10T07:51:51.061 INFO:tasks.workunit.client.1.vm08.stdout:9/977: write d2/d58/d73/f146 [913935,20169] 0 2026-03-10T07:51:51.063 INFO:tasks.workunit.client.1.vm08.stdout:7/946: dwrite d3/da/d25/d9/f47 [0,4194304] 0 2026-03-10T07:51:51.071 INFO:tasks.workunit.client.1.vm08.stdout:6/954: mknod d1/d3/df/c141 0 2026-03-10T07:51:51.073 INFO:tasks.workunit.client.1.vm08.stdout:6/955: symlink d1/d3/df/d1d/d40/d45/d124/l142 0 2026-03-10T07:51:51.074 INFO:tasks.workunit.client.1.vm08.stdout:7/947: symlink d3/da/d25/d9/d2f/d39/d43/d4f/d5b/d79/l146 0 2026-03-10T07:51:51.077 INFO:tasks.workunit.client.1.vm08.stdout:1/935: write d2/d6/d50/f54 [1023289,125798] 0 
2026-03-10T07:51:51.081 INFO:tasks.workunit.client.1.vm08.stdout:4/891: write d5/da0/d95/de6/f1a [7971338,47647] 0 2026-03-10T07:51:51.081 INFO:tasks.workunit.client.1.vm08.stdout:2/966: dwrite d0/d1/d3/f8 [0,4194304] 0 2026-03-10T07:51:51.088 INFO:tasks.workunit.client.1.vm08.stdout:4/892: dread - d5/da0/d95/de6/d48/d4f/d7c/dde/ddf/d10b/d9b/f124 zero size 2026-03-10T07:51:51.091 INFO:tasks.workunit.client.1.vm08.stdout:9/978: rename d2/d58/dbf/d2b/f83 to d2/d58/dbf/dd0/d35/d97/d9d/df4/f150 0 2026-03-10T07:51:51.096 INFO:tasks.workunit.client.1.vm08.stdout:9/979: write d2/d58/dbf/dd0/d35/d97/d9d/df4/dee/f12f [363768,94359] 0 2026-03-10T07:51:51.099 INFO:tasks.workunit.client.1.vm08.stdout:4/893: dread d5/da0/d12/fe7 [0,4194304] 0 2026-03-10T07:51:51.100 INFO:tasks.workunit.client.1.vm08.stdout:4/894: chown d5/da0/d95/de6/d48/d4f/d7c/dde/ddf/d10b/d31/d61/c63 159 1 2026-03-10T07:51:51.101 INFO:tasks.workunit.client.1.vm08.stdout:7/948: rename d3/da/d8a/l92 to d3/da/d8a/l147 0 2026-03-10T07:51:51.102 INFO:tasks.workunit.client.1.vm08.stdout:3/946: link d0/d3c/d18/d32/d61/d52/f7f d0/d3c/d18/d32/d61/d52/dca/d12e/f132 0 2026-03-10T07:51:51.106 INFO:tasks.workunit.client.1.vm08.stdout:7/949: dwrite d3/da/d25/d9/d2f/d3a/d71/d8c/ddf/f101 [0,4194304] 0 2026-03-10T07:51:51.106 INFO:tasks.workunit.client.1.vm08.stdout:2/967: write d0/d1/d3/d56/d78/dad/db1/d61/d84/da7/d123/d7d/d86/d55/fb0 [583901,6467] 0 2026-03-10T07:51:51.110 INFO:tasks.workunit.client.1.vm08.stdout:4/895: symlink d5/d8/d89/l129 0 2026-03-10T07:51:51.111 INFO:tasks.workunit.client.1.vm08.stdout:2/968: dread - d0/d1/d3/d56/d78/dad/db1/d61/d84/da7/d123/d7d/d86/d55/dc9/ded/ffd zero size 2026-03-10T07:51:51.115 INFO:tasks.workunit.client.1.vm08.stdout:1/936: mknod d2/d6/d3a/d61/d6f/c141 0 2026-03-10T07:51:51.117 INFO:tasks.workunit.client.1.vm08.stdout:3/947: unlink d0/d3c/d18/d80/lbe 0 2026-03-10T07:51:51.117 INFO:tasks.workunit.client.1.vm08.stdout:7/950: readlink d3/da/d25/d9/d2f/d3a/d71/l132 0 
2026-03-10T07:51:51.117 INFO:tasks.workunit.client.1.vm08.stdout:2/969: symlink d0/d1/d17/db2/l138 0 2026-03-10T07:51:51.117 INFO:tasks.workunit.client.1.vm08.stdout:3/948: dread - d0/d3c/d18/dec/f109 zero size 2026-03-10T07:51:51.125 INFO:tasks.workunit.client.1.vm08.stdout:7/951: dwrite d3/da/d25/d9/d2f/d3a/d71/d8c/ddf/f133 [0,4194304] 0 2026-03-10T07:51:51.127 INFO:tasks.workunit.client.1.vm08.stdout:2/970: dwrite d0/d1/d3/d56/d78/dad/db1/d61/d84/da7/d123/d7d/d86/d55/dc9/f12b [0,4194304] 0 2026-03-10T07:51:51.137 INFO:tasks.workunit.client.1.vm08.stdout:2/971: dread d0/d1/d3/d56/d78/dad/db1/f103 [0,4194304] 0 2026-03-10T07:51:51.141 INFO:tasks.workunit.client.1.vm08.stdout:2/972: rmdir d0/d1/d3/d56/d78/dad/db1/d61/d84/da7/d123/de2/d107 39 2026-03-10T07:51:51.142 INFO:tasks.workunit.client.1.vm08.stdout:1/937: dread d2/f4 [0,4194304] 0 2026-03-10T07:51:51.147 INFO:tasks.workunit.client.1.vm08.stdout:1/938: chown d2/d6/de/d1f/d26/d58/d8c/f96 1568 1 2026-03-10T07:51:51.147 INFO:tasks.workunit.client.1.vm08.stdout:3/949: getdents d0/d3c/d18/d32/d61/d52 0 2026-03-10T07:51:51.147 INFO:tasks.workunit.client.1.vm08.stdout:1/939: stat d2/d6/d3a/d61/d6f/f9d 0 2026-03-10T07:51:51.147 INFO:tasks.workunit.client.1.vm08.stdout:2/973: symlink d0/d1/d3/l139 0 2026-03-10T07:51:51.156 INFO:tasks.workunit.client.1.vm08.stdout:3/950: creat d0/d3c/d1f/d44/f133 x:0 0 0 2026-03-10T07:51:51.157 INFO:tasks.workunit.client.1.vm08.stdout:3/951: chown d0/d3c/d18/da9 25820 1 2026-03-10T07:51:51.157 INFO:tasks.workunit.client.1.vm08.stdout:3/952: write d0/d3c/d1f/d44/f133 [550613,91185] 0 2026-03-10T07:51:51.159 INFO:tasks.workunit.client.1.vm08.stdout:1/940: dread d2/d6/de/d1f/d26/d89/d8e/fbf [0,4194304] 0 2026-03-10T07:51:51.165 INFO:tasks.workunit.client.1.vm08.stdout:3/953: creat d0/d3c/d18/d32/d61/d83/f134 x:0 0 0 2026-03-10T07:51:51.166 INFO:tasks.workunit.client.1.vm08.stdout:3/954: truncate d0/d3c/d18/dec/f121 825113 0 2026-03-10T07:51:51.169 
INFO:tasks.workunit.client.1.vm08.stdout:6/956: dwrite d1/d3/f2e [8388608,4194304] 0 2026-03-10T07:51:51.171 INFO:tasks.workunit.client.1.vm08.stdout:3/955: creat d0/d3c/d18/d4a/f135 x:0 0 0 2026-03-10T07:51:51.177 INFO:tasks.workunit.client.1.vm08.stdout:2/974: link d0/d1/d3/f63 d0/d1/d3/d56/d78/dad/db1/d61/d84/da7/d123/d7d/f13a 0 2026-03-10T07:51:51.177 INFO:tasks.workunit.client.1.vm08.stdout:9/980: write d2/d26/f4b [1936551,111916] 0 2026-03-10T07:51:51.184 INFO:tasks.workunit.client.1.vm08.stdout:4/896: truncate d5/da0/d95/de6/d48/d4f/d7c/dde/ddf/d10b/d31/fbe 900366 0 2026-03-10T07:51:51.195 INFO:tasks.workunit.client.1.vm08.stdout:6/957: dread - d1/d17/d2b/d58/d76/ff5 zero size 2026-03-10T07:51:51.195 INFO:tasks.workunit.client.1.vm08.stdout:3/956: rename d0/d3c/d1f/d44/f8c to d0/d3c/f136 0 2026-03-10T07:51:51.195 INFO:tasks.workunit.client.1.vm08.stdout:3/957: truncate d0/d3c/d18/d32/d61/d83/f134 124021 0 2026-03-10T07:51:51.195 INFO:tasks.workunit.client.1.vm08.stdout:3/958: stat d0/d3c/d1f/d89/ddb/f12a 0 2026-03-10T07:51:51.195 INFO:tasks.workunit.client.1.vm08.stdout:4/897: link d5/da0/d95/de6/d48/fe9 d5/da0/d95/de6/d48/d4f/d7c/dde/ddf/d10b/dad/db8/f12a 0 2026-03-10T07:51:51.196 INFO:tasks.workunit.client.1.vm08.stdout:9/981: sync 2026-03-10T07:51:51.197 INFO:tasks.workunit.client.1.vm08.stdout:3/959: read d0/d3c/d18/d32/d61/d52/f70 [2962058,119700] 0 2026-03-10T07:51:51.198 INFO:tasks.workunit.client.1.vm08.stdout:9/982: sync 2026-03-10T07:51:51.199 INFO:tasks.workunit.client.1.vm08.stdout:6/958: rename d1/d17/d2b/d58/d77/daf/c135 to d1/d17/d2b/d58/d76/d114/c143 0 2026-03-10T07:51:51.201 INFO:tasks.workunit.client.1.vm08.stdout:6/959: chown d1/d3/df/d1d/d40/d87/fb3 4 1 2026-03-10T07:51:51.202 INFO:tasks.workunit.client.1.vm08.stdout:9/983: stat d2/d58/dbf/dd0/d35/d97/d9d/df4/dee/d84/d91/f120 0 2026-03-10T07:51:51.204 INFO:tasks.workunit.client.1.vm08.stdout:3/960: dwrite d0/d3c/d18/d48/d55/f113 [0,4194304] 0 2026-03-10T07:51:51.207 
INFO:tasks.workunit.client.1.vm08.stdout:3/961: chown d0/d3c/d18/dec/f5b 26982 1 2026-03-10T07:51:51.209 INFO:tasks.workunit.client.1.vm08.stdout:9/984: rename d2/d58/fb3 to d2/d58/dbf/dd0/df0/f151 0 2026-03-10T07:51:51.209 INFO:tasks.workunit.client.1.vm08.stdout:3/962: mknod d0/d3c/d18/d32/d61/d83/c137 0 2026-03-10T07:51:51.210 INFO:tasks.workunit.client.1.vm08.stdout:3/963: dread - d0/d3c/d18/dec/d2d/f10c zero size 2026-03-10T07:51:51.211 INFO:tasks.workunit.client.1.vm08.stdout:9/985: write d2/d58/dbf/dd0/d35/d97/d9d/fbd [688742,25807] 0 2026-03-10T07:51:51.213 INFO:tasks.workunit.client.1.vm08.stdout:3/964: chown d0/d3c/d18/dec/d34/f3d 0 1 2026-03-10T07:51:51.220 INFO:tasks.workunit.client.1.vm08.stdout:9/986: creat d2/d58/dbf/dd0/d35/d97/d9d/df4/d28/d98/f152 x:0 0 0 2026-03-10T07:51:51.220 INFO:tasks.workunit.client.1.vm08.stdout:6/960: link d1/d17/d2b/d5e/ff3 d1/d17/f144 0 2026-03-10T07:51:51.222 INFO:tasks.workunit.client.1.vm08.stdout:9/987: chown d2/d58/dbf/l38 308 1 2026-03-10T07:51:51.224 INFO:tasks.workunit.client.1.vm08.stdout:9/988: mknod d2/d58/dbf/dd0/d35/d97/d9d/c153 0 2026-03-10T07:51:51.224 INFO:tasks.workunit.client.1.vm08.stdout:6/961: creat d1/d3/df/d1d/f145 x:0 0 0 2026-03-10T07:51:51.225 INFO:tasks.workunit.client.1.vm08.stdout:3/965: getdents d0/d3c/d18/da9/dcc 0 2026-03-10T07:51:51.227 INFO:tasks.workunit.client.1.vm08.stdout:6/962: creat d1/f146 x:0 0 0 2026-03-10T07:51:51.228 INFO:tasks.workunit.client.1.vm08.stdout:6/963: chown d1/d3/d3e/db2/d131/c102 2617 1 2026-03-10T07:51:51.235 INFO:tasks.workunit.client.1.vm08.stdout:7/952: dwrite d3/f6 [0,4194304] 0 2026-03-10T07:51:51.239 INFO:tasks.workunit.client.1.vm08.stdout:1/941: dwrite d2/d6/d9f/fe1 [0,4194304] 0 2026-03-10T07:51:51.240 INFO:tasks.workunit.client.1.vm08.stdout:1/942: truncate d2/d6/de/d1f/d26/d58/d83/dc2/f130 593117 0 2026-03-10T07:51:51.252 INFO:tasks.workunit.client.1.vm08.stdout:9/989: dread d2/d26/ffe [0,4194304] 0 2026-03-10T07:51:51.253 
INFO:tasks.workunit.client.1.vm08.stdout:7/953: getdents d3/da/d25/d9/d2f/d3a 0 2026-03-10T07:51:51.254 INFO:tasks.workunit.client.1.vm08.stdout:7/954: stat d3/da/d25/d9/d2f/d3a/d71/d8c/ddf/f110 0 2026-03-10T07:51:51.259 INFO:tasks.workunit.client.1.vm08.stdout:9/990: creat d2/d58/dbf/dd0/d35/d97/d9d/df4/dee/f154 x:0 0 0 2026-03-10T07:51:51.259 INFO:tasks.workunit.client.1.vm08.stdout:9/991: getdents d2/d58/dbf/ddf/d14f 0 2026-03-10T07:51:51.259 INFO:tasks.workunit.client.1.vm08.stdout:9/992: readlink d2/d58/dbf/dd0/lb1 0 2026-03-10T07:51:51.259 INFO:tasks.workunit.client.1.vm08.stdout:7/955: dwrite d3/da/d25/d9/d2f/d39/f11a [0,4194304] 0 2026-03-10T07:51:51.259 INFO:tasks.workunit.client.1.vm08.stdout:9/993: fdatasync d2/d58/dbf/dd0/d35/f64 0 2026-03-10T07:51:51.269 INFO:tasks.workunit.client.1.vm08.stdout:7/956: rename d3/da/d25/d9/d2f/d39/d43/d4f/d5b/db7/d107/c10c to d3/da/d25/d9/d2f/d39/c148 0 2026-03-10T07:51:51.271 INFO:tasks.workunit.client.1.vm08.stdout:7/957: fdatasync d3/da/d25/f29 0 2026-03-10T07:51:51.283 INFO:tasks.workunit.client.1.vm08.stdout:7/958: chown d3/da/d8a/fcd 11 1 2026-03-10T07:51:51.284 INFO:tasks.workunit.client.1.vm08.stdout:7/959: read d3/da/d25/d9/d2f/d3a/d71/d8c/ddf/ff6 [2141758,8543] 0 2026-03-10T07:51:51.306 INFO:tasks.workunit.client.1.vm08.stdout:2/975: write d0/d1/d17/dfb/f105 [104784,14111] 0 2026-03-10T07:51:51.313 INFO:tasks.workunit.client.1.vm08.stdout:2/976: mknod d0/d1/d3/d56/d78/dad/d112/c13b 0 2026-03-10T07:51:51.313 INFO:tasks.workunit.client.1.vm08.stdout:2/977: readlink d0/d1/d17/db2/dde/d116/l7b 0 2026-03-10T07:51:51.313 INFO:tasks.workunit.client.1.vm08.stdout:2/978: symlink d0/d1/d3/d10/l13c 0 2026-03-10T07:51:51.314 INFO:tasks.workunit.client.1.vm08.stdout:2/979: link d0/d1/d3/d56/d78/dad/db1/lb3 d0/d1/d17/db2/dde/d116/l13d 0 2026-03-10T07:51:51.321 INFO:tasks.workunit.client.1.vm08.stdout:2/980: dread d0/f50 [0,4194304] 0 2026-03-10T07:51:51.327 INFO:tasks.workunit.client.1.vm08.stdout:4/898: dwrite d5/da0/f46 
[0,4194304] 0 2026-03-10T07:51:51.327 INFO:tasks.workunit.client.1.vm08.stdout:2/981: sync 2026-03-10T07:51:51.328 INFO:tasks.workunit.client.1.vm08.stdout:4/899: write d5/da0/d95/de6/d48/d4f/d7c/dde/ddf/f126 [1022295,64063] 0 2026-03-10T07:51:51.329 INFO:tasks.workunit.client.1.vm08.stdout:4/900: fdatasync f1 0 2026-03-10T07:51:51.329 INFO:tasks.workunit.client.1.vm08.stdout:4/901: stat d5/d8/d50/f116 0 2026-03-10T07:51:51.332 INFO:tasks.workunit.client.1.vm08.stdout:6/964: write d1/d17/d2b/d58/d76/fb6 [603698,126553] 0 2026-03-10T07:51:51.334 INFO:tasks.workunit.client.1.vm08.stdout:3/966: truncate d0/d3c/d1f/d44/f4b 3814075 0 2026-03-10T07:51:51.334 INFO:tasks.workunit.client.1.vm08.stdout:1/943: truncate d2/d6/de/d1f/d22/f35 256284 0 2026-03-10T07:51:51.341 INFO:tasks.workunit.client.1.vm08.stdout:4/902: dwrite d5/f54 [4194304,4194304] 0 2026-03-10T07:51:51.347 INFO:tasks.workunit.client.1.vm08.stdout:9/994: fsync d2/d58/dbf/dd0/d35/d97/d9d/df4/dee/f5f 0 2026-03-10T07:51:51.352 INFO:tasks.workunit.client.1.vm08.stdout:7/960: mkdir d3/da/d25/d9/d2f/d39/d43/d4f/d5b/db7/dac/d149 0 2026-03-10T07:51:51.352 INFO:tasks.workunit.client.1.vm08.stdout:6/965: truncate d1/d3/df/d1d/d40/f10b 292294 0 2026-03-10T07:51:51.352 INFO:tasks.workunit.client.1.vm08.stdout:4/903: mknod d5/da0/d95/de6/d48/d4f/d7c/dde/ddf/d10b/d41/c12b 0 2026-03-10T07:51:51.353 INFO:tasks.workunit.client.1.vm08.stdout:4/904: rmdir d5/da0/d95/dc2 39 2026-03-10T07:51:51.354 INFO:tasks.workunit.client.1.vm08.stdout:3/967: link d0/d3c/d18/dec/d2d/cb2 d0/d3c/d1f/c138 0 2026-03-10T07:51:51.355 INFO:tasks.workunit.client.1.vm08.stdout:9/995: rename d2/d26/l2d to d2/d58/dbf/dd0/d35/d97/d9d/df4/d28/l155 0 2026-03-10T07:51:51.363 INFO:tasks.workunit.client.1.vm08.stdout:6/966: dread d1/d3/d3e/f56 [0,4194304] 0 2026-03-10T07:51:51.366 INFO:tasks.workunit.client.1.vm08.stdout:6/967: write d1/d3/df/d52/f8f [3523656,9671] 0 2026-03-10T07:51:51.368 INFO:tasks.workunit.client.1.vm08.stdout:3/968: sync 
2026-03-10T07:51:51.370 INFO:tasks.workunit.client.1.vm08.stdout:3/969: chown d0/d3c/d18/d4a/f135 211759 1 2026-03-10T07:51:51.374 INFO:tasks.workunit.client.1.vm08.stdout:2/982: write d0/d1/d17/ff3 [888496,17900] 0 2026-03-10T07:51:51.387 INFO:tasks.workunit.client.1.vm08.stdout:7/961: dwrite d3/da/d25/d9/d2f/f97 [0,4194304] 0 2026-03-10T07:51:51.392 INFO:tasks.workunit.client.1.vm08.stdout:2/983: mknod d0/d1/d17/db2/d9c/c13e 0 2026-03-10T07:51:51.392 INFO:tasks.workunit.client.1.vm08.stdout:2/984: readlink d0/d1/d17/db2/l138 0 2026-03-10T07:51:51.393 INFO:tasks.workunit.client.1.vm08.stdout:6/968: mkdir d1/d3/df/d1d/d40/d87/d127/d147 0 2026-03-10T07:51:51.394 INFO:tasks.workunit.client.1.vm08.stdout:7/962: symlink d3/da/d25/d9/d2f/d39/d43/d4f/d5b/db7/dac/l14a 0 2026-03-10T07:51:51.396 INFO:tasks.workunit.client.1.vm08.stdout:3/970: mkdir d0/d3c/d18/d32/dfd/d139 0 2026-03-10T07:51:51.401 INFO:tasks.workunit.client.1.vm08.stdout:7/963: dread d3/da/d25/d9/d2f/d39/f76 [0,4194304] 0 2026-03-10T07:51:51.404 INFO:tasks.workunit.client.1.vm08.stdout:2/985: truncate d0/d1/d3/d56/d78/dad/db1/d61/d84/da7/d123/d7d/d86/d55/d7a/f136 371982 0 2026-03-10T07:51:51.405 INFO:tasks.workunit.client.1.vm08.stdout:6/969: creat d1/f148 x:0 0 0 2026-03-10T07:51:51.406 INFO:tasks.workunit.client.1.vm08.stdout:3/971: dread d0/d3c/d18/d48/d55/fe0 [0,4194304] 0 2026-03-10T07:51:51.407 INFO:tasks.workunit.client.1.vm08.stdout:2/986: symlink d0/d1/d3/d56/d78/dad/db1/d61/d84/l13f 0 2026-03-10T07:51:51.408 INFO:tasks.workunit.client.1.vm08.stdout:6/970: symlink d1/d17/d2b/d58/d76/d133/d13d/l149 0 2026-03-10T07:51:51.408 INFO:tasks.workunit.client.1.vm08.stdout:2/987: fsync d0/d1/d3/d56/d78/dad/db1/d61/d84/da7/d123/d7d/d86/d55/dc9/ded/f108 0 2026-03-10T07:51:51.410 INFO:tasks.workunit.client.1.vm08.stdout:3/972: symlink d0/d3c/d18/d48/d55/l13a 0 2026-03-10T07:51:51.414 INFO:tasks.workunit.client.1.vm08.stdout:7/964: link d3/da/d25/d9/d2f/d3a/d4b/f7b d3/da/d25/d9/d2f/d3a/f14b 0 
2026-03-10T07:51:51.414 INFO:tasks.workunit.client.1.vm08.stdout:3/973: write d0/d3c/d18/fa5 [4930108,84390] 0 2026-03-10T07:51:51.415 INFO:tasks.workunit.client.1.vm08.stdout:2/988: rename d0/d1/d3/d56/d78/dad/db1/d61/d84/da7/d123/d7d/d86 to d0/d1/d17/d6b/d140 0 2026-03-10T07:51:51.415 INFO:tasks.workunit.client.1.vm08.stdout:3/974: chown d0/d3c/d18/d32/daa/ded 2 1 2026-03-10T07:51:51.417 INFO:tasks.workunit.client.1.vm08.stdout:3/975: write d0/d3c/d18/d48/d55/f12d [1025003,13659] 0 2026-03-10T07:51:51.418 INFO:tasks.workunit.client.1.vm08.stdout:2/989: truncate d0/d1/d3/d56/d78/dad/f127 691078 0 2026-03-10T07:51:51.418 INFO:tasks.workunit.client.1.vm08.stdout:1/944: dwrite d2/d6/de/d1f/d26/d58/d8c/f96 [0,4194304] 0 2026-03-10T07:51:51.418 INFO:tasks.workunit.client.1.vm08.stdout:3/976: chown d0/d3c/d18/d48/d55/d56/lfa 630006 1 2026-03-10T07:51:51.427 INFO:tasks.workunit.client.1.vm08.stdout:3/977: unlink d0/le9 0 2026-03-10T07:51:51.427 INFO:tasks.workunit.client.1.vm08.stdout:1/945: getdents d2/d6/de/d1f/da9/d138 0 2026-03-10T07:51:51.429 INFO:tasks.workunit.client.1.vm08.stdout:1/946: fsync d2/d6/de/d1f/f2a 0 2026-03-10T07:51:51.429 INFO:tasks.workunit.client.1.vm08.stdout:3/978: symlink d0/l13b 0 2026-03-10T07:51:51.430 INFO:tasks.workunit.client.1.vm08.stdout:1/947: mknod d2/d6/c142 0 2026-03-10T07:51:51.433 INFO:tasks.workunit.client.1.vm08.stdout:1/948: mkdir d2/d6/de/d1f/d26/d98/d13b/d143 0 2026-03-10T07:51:51.434 INFO:tasks.workunit.client.1.vm08.stdout:6/971: dread d1/d3/f21 [0,4194304] 0 2026-03-10T07:51:51.437 INFO:tasks.workunit.client.1.vm08.stdout:2/990: dread d0/d1/d3/d10/d38/daf/f10b [0,4194304] 0 2026-03-10T07:51:51.439 INFO:tasks.workunit.client.1.vm08.stdout:1/949: mknod d2/d6/de/d1f/d26/d89/d117/c144 0 2026-03-10T07:51:51.440 INFO:tasks.workunit.client.1.vm08.stdout:4/905: truncate d5/da0/d95/de6/d48/d4f/d7c/dde/ddf/d10b/fd7 725314 0 2026-03-10T07:51:51.442 INFO:tasks.workunit.client.1.vm08.stdout:6/972: truncate d1/d17/d2b/f3c 336045 0 
2026-03-10T07:51:51.444 INFO:tasks.workunit.client.1.vm08.stdout:6/973: creat d1/d3/d3e/f14a x:0 0 0 2026-03-10T07:51:51.444 INFO:tasks.workunit.client.1.vm08.stdout:1/950: symlink d2/d6/d9f/d140/l145 0 2026-03-10T07:51:51.445 INFO:tasks.workunit.client.1.vm08.stdout:9/996: dwrite d2/d58/dbf/dd0/d35/d97/d9d/df4/dee/fa [0,4194304] 0 2026-03-10T07:51:51.445 INFO:tasks.workunit.client.1.vm08.stdout:6/974: chown d1/db/d24/d3d/l108 0 1 2026-03-10T07:51:51.450 INFO:tasks.workunit.client.1.vm08.stdout:6/975: chown d1/db/d24/f11f 15075019 1 2026-03-10T07:51:51.455 INFO:tasks.workunit.client.1.vm08.stdout:4/906: rename d5/d8/d50/db0/l11c to d5/da0/d95/de6/d48/l12c 0 2026-03-10T07:51:51.456 INFO:tasks.workunit.client.1.vm08.stdout:4/907: chown d5/da0/l45 88701 1 2026-03-10T07:51:51.456 INFO:tasks.workunit.client.1.vm08.stdout:1/951: creat d2/d6/de/d1f/d26/d89/f146 x:0 0 0 2026-03-10T07:51:51.456 INFO:tasks.workunit.client.1.vm08.stdout:6/976: rmdir d1/d17/d2b/d58/de7 39 2026-03-10T07:51:51.457 INFO:tasks.workunit.client.1.vm08.stdout:9/997: symlink d2/d58/dbf/dd0/d35/d97/dd5/l156 0 2026-03-10T07:51:51.457 INFO:tasks.workunit.client.1.vm08.stdout:4/908: truncate d5/f2d 2010695 0 2026-03-10T07:51:51.459 INFO:tasks.workunit.client.1.vm08.stdout:1/952: mknod d2/d6/de/d1f/d26/d58/d83/dc2/c147 0 2026-03-10T07:51:51.459 INFO:tasks.workunit.client.1.vm08.stdout:7/965: symlink d3/da/d25/d9/d2f/d3a/d40/d54/l14c 0 2026-03-10T07:51:51.459 INFO:tasks.workunit.client.1.vm08.stdout:4/909: fsync d5/da0/d95/de6/da7/fd6 0 2026-03-10T07:51:51.461 INFO:tasks.workunit.client.1.vm08.stdout:1/953: rmdir d2/d6/de/d70/d80 39 2026-03-10T07:51:51.464 INFO:tasks.workunit.client.1.vm08.stdout:4/910: mkdir d5/da0/d95/de6/d48/d4f/d7c/dde/ddf/d10b/d31/dce/dd9/d12d 0 2026-03-10T07:51:51.465 INFO:tasks.workunit.client.1.vm08.stdout:1/954: dread d2/d6/de/d1f/d40/d76/fab [0,4194304] 0 2026-03-10T07:51:51.469 INFO:tasks.workunit.client.1.vm08.stdout:4/911: link d5/da0/d12/def/c114 d5/da0/d95/dc2/c12e 0 
2026-03-10T07:51:51.471 INFO:tasks.workunit.client.1.vm08.stdout:1/955: dread d2/d6/de/d1f/d22/f35 [0,4194304] 0 2026-03-10T07:51:51.472 INFO:tasks.workunit.client.1.vm08.stdout:4/912: truncate d5/d8/f68 1407733 0 2026-03-10T07:51:51.474 INFO:tasks.workunit.client.1.vm08.stdout:3/979: write d0/d3c/d18/dec/d2d/f3a [2079347,41532] 0 2026-03-10T07:51:51.479 INFO:tasks.workunit.client.1.vm08.stdout:3/980: creat d0/d3c/f13c x:0 0 0 2026-03-10T07:51:51.480 INFO:tasks.workunit.client.1.vm08.stdout:2/991: dwrite d0/d1/d17/db2/dde/d116/fae [0,4194304] 0 2026-03-10T07:51:51.481 INFO:tasks.workunit.client.1.vm08.stdout:4/913: link d5/da0/d95/de6/d48/l71 d5/da0/d95/de6/d48/d4f/d7c/l12f 0 2026-03-10T07:51:51.481 INFO:tasks.workunit.client.1.vm08.stdout:1/956: dwrite d2/d6/de/d1f/d26/d58/d83/fa2 [0,4194304] 0 2026-03-10T07:51:51.486 INFO:tasks.workunit.client.1.vm08.stdout:4/914: creat d5/da0/d95/dc2/f130 x:0 0 0 2026-03-10T07:51:51.493 INFO:tasks.workunit.client.1.vm08.stdout:4/915: symlink d5/da0/d95/de6/d48/d4f/d7c/dde/ddf/l131 0 2026-03-10T07:51:51.494 INFO:tasks.workunit.client.1.vm08.stdout:4/916: write d5/da0/f5a [166637,764] 0 2026-03-10T07:51:51.498 INFO:tasks.workunit.client.1.vm08.stdout:3/981: symlink d0/d3c/d18/d32/d61/d52/dca/dd2/d117/l13d 0 2026-03-10T07:51:51.500 INFO:tasks.workunit.client.1.vm08.stdout:1/957: dread d2/d6/de/d47/f38 [0,4194304] 0 2026-03-10T07:51:51.510 INFO:tasks.workunit.client.1.vm08.stdout:2/992: symlink d0/d1/d3/d10/l141 0 2026-03-10T07:51:51.510 INFO:tasks.workunit.client.1.vm08.stdout:6/977: write d1/db/d24/d3d/f11a [244672,96816] 0 2026-03-10T07:51:51.510 INFO:tasks.workunit.client.1.vm08.stdout:9/998: write d2/d58/dbf/dd0/d35/d97/d9d/df4/d28/d98/fd4 [1996646,63362] 0 2026-03-10T07:51:51.510 INFO:tasks.workunit.client.1.vm08.stdout:4/917: dwrite d5/fd4 [0,4194304] 0 2026-03-10T07:51:51.510 INFO:tasks.workunit.client.1.vm08.stdout:2/993: creat d0/d1/d17/dfb/f142 x:0 0 0 2026-03-10T07:51:51.510 
INFO:tasks.workunit.client.1.vm08.stdout:2/994: readlink d0/d1/d17/d6b/d140/d55/d7a/l88 0 2026-03-10T07:51:51.511 INFO:tasks.workunit.client.1.vm08.stdout:2/995: read d0/d1/d3/d56/d78/f62 [3976138,105251] 0 2026-03-10T07:51:51.513 INFO:tasks.workunit.client.1.vm08.stdout:3/982: dwrite d0/d3c/d18/d48/d55/d56/f81 [0,4194304] 0 2026-03-10T07:51:51.518 INFO:tasks.workunit.client.1.vm08.stdout:1/958: unlink d2/d6/de/d71/fc9 0 2026-03-10T07:51:51.518 INFO:tasks.workunit.client.1.vm08.stdout:9/999: mkdir d2/d58/dbf/ddf/d14f/d157 0 2026-03-10T07:51:51.520 INFO:tasks.workunit.client.1.vm08.stdout:3/983: chown d0/d3c/d18/d48/d55/d56/ca8 13363 1 2026-03-10T07:51:51.523 INFO:tasks.workunit.client.1.vm08.stdout:1/959: unlink d2/d6/de/d1f/d40/d76/fab 0 2026-03-10T07:51:51.527 INFO:tasks.workunit.client.1.vm08.stdout:3/984: dwrite d0/d3c/d1f/d95/fbd [0,4194304] 0 2026-03-10T07:51:51.549 INFO:tasks.workunit.client.1.vm08.stdout:3/985: rename d0/d3c/d18/f10f to d0/d3c/d1f/f13e 0 2026-03-10T07:51:51.549 INFO:tasks.workunit.client.1.vm08.stdout:3/986: creat d0/d3c/d18/dec/d2d/d85/f13f x:0 0 0 2026-03-10T07:51:51.549 INFO:tasks.workunit.client.1.vm08.stdout:4/918: dread f1 [0,4194304] 0 2026-03-10T07:51:51.549 INFO:tasks.workunit.client.1.vm08.stdout:7/966: dwrite d3/da/d25/d9/d2f/d3a/dc0/f105 [0,4194304] 0 2026-03-10T07:51:51.551 INFO:tasks.workunit.client.1.vm08.stdout:2/996: sync 2026-03-10T07:51:51.552 INFO:tasks.workunit.client.1.vm08.stdout:7/967: mkdir d3/da/d25/d14d 0 2026-03-10T07:51:51.552 INFO:tasks.workunit.client.1.vm08.stdout:3/987: getdents d0/d3c/d18/d48 0 2026-03-10T07:51:51.554 INFO:tasks.workunit.client.1.vm08.stdout:7/968: chown d3/f51 43420 1 2026-03-10T07:51:51.555 INFO:tasks.workunit.client.1.vm08.stdout:2/997: dwrite d0/d1/d17/db2/dc3/f135 [0,4194304] 0 2026-03-10T07:51:51.562 INFO:tasks.workunit.client.1.vm08.stdout:7/969: chown d3/da/d25/d9/d2f/d3a/f14b 0 1 2026-03-10T07:51:51.564 INFO:tasks.workunit.client.1.vm08.stdout:2/998: creat d0/d1/d17/d6b/da0/f143 
x:0 0 0 2026-03-10T07:51:51.565 INFO:tasks.workunit.client.1.vm08.stdout:2/999: write d0/d1/d3/f8 [2475072,23297] 0 2026-03-10T07:51:51.565 INFO:tasks.workunit.client.1.vm08.stdout:3/988: mknod d0/d3c/d1f/c140 0 2026-03-10T07:51:51.568 INFO:tasks.workunit.client.1.vm08.stdout:3/989: mkdir d0/d3c/d18/d32/d61/d52/dca/dd2/d141 0 2026-03-10T07:51:51.571 INFO:tasks.workunit.client.1.vm08.stdout:7/970: creat d3/da/d25/d9/d2f/d3a/d4b/d67/dea/d13a/f14e x:0 0 0 2026-03-10T07:51:51.575 INFO:tasks.workunit.client.1.vm08.stdout:7/971: truncate d3/da/d25/d9/f23 137784 0 2026-03-10T07:51:51.615 INFO:tasks.workunit.client.1.vm08.stdout:6/978: write d1/d17/d2b/d5e/da8/f117 [3220184,34660] 0 2026-03-10T07:51:51.616 INFO:tasks.workunit.client.1.vm08.stdout:6/979: fsync d1/d3/df/d1d/d40/d45/d10c/f137 0 2026-03-10T07:51:51.617 INFO:tasks.workunit.client.1.vm08.stdout:6/980: dread - d1/d3/df/d52/fe5 zero size 2026-03-10T07:51:51.617 INFO:tasks.workunit.client.1.vm08.stdout:6/981: dread - d1/f10d zero size 2026-03-10T07:51:51.618 INFO:tasks.workunit.client.1.vm08.stdout:6/982: mknod d1/d17/d2b/d58/d77/c14b 0 2026-03-10T07:51:51.621 INFO:tasks.workunit.client.1.vm08.stdout:6/983: creat d1/d17/d2b/d58/d77/f14c x:0 0 0 2026-03-10T07:51:51.630 INFO:tasks.workunit.client.1.vm08.stdout:1/960: dwrite d2/d6/de/d5f/df9/fff [0,4194304] 0 2026-03-10T07:51:51.630 INFO:tasks.workunit.client.1.vm08.stdout:4/919: write d5/d8/ff0 [778734,122855] 0 2026-03-10T07:51:51.641 INFO:tasks.workunit.client.1.vm08.stdout:1/961: rename d2/d6/de/c21 to d2/d6/de/d5f/df9/c148 0 2026-03-10T07:51:51.641 INFO:tasks.workunit.client.1.vm08.stdout:1/962: write d2/f36 [4377260,13290] 0 2026-03-10T07:51:51.642 INFO:tasks.workunit.client.1.vm08.stdout:4/920: mknod d5/d8/d89/c132 0 2026-03-10T07:51:51.642 INFO:tasks.workunit.client.1.vm08.stdout:3/990: truncate d0/d3c/d1f/ff5 838468 0 2026-03-10T07:51:51.643 INFO:tasks.workunit.client.1.vm08.stdout:3/991: dread - d0/d3c/d1f/f13e zero size 2026-03-10T07:51:51.646 
INFO:tasks.workunit.client.1.vm08.stdout:1/963: write d2/d10/dd7/f115 [436214,59047] 0 2026-03-10T07:51:51.647 INFO:tasks.workunit.client.1.vm08.stdout:3/992: rmdir d0/d3c/d18/d48/d55/d56 39 2026-03-10T07:51:51.649 INFO:tasks.workunit.client.1.vm08.stdout:7/972: dwrite d3/f51 [0,4194304] 0 2026-03-10T07:51:51.652 INFO:tasks.workunit.client.1.vm08.stdout:4/921: link d5/da0/d95/lfb d5/da0/d95/de6/d48/d4f/d7c/dde/ddf/d10b/dad/l133 0 2026-03-10T07:51:51.652 INFO:tasks.workunit.client.1.vm08.stdout:6/984: dwrite d1/d17/f63 [0,4194304] 0 2026-03-10T07:51:51.653 INFO:tasks.workunit.client.1.vm08.stdout:6/985: chown d1/db/d24/dac/dad 225 1 2026-03-10T07:51:51.653 INFO:tasks.workunit.client.1.vm08.stdout:3/993: write d0/d3c/d1f/f6f [1615134,34106] 0 2026-03-10T07:51:51.660 INFO:tasks.workunit.client.1.vm08.stdout:6/986: creat d1/d17/d2b/d58/d76/dde/f14d x:0 0 0 2026-03-10T07:51:51.661 INFO:tasks.workunit.client.1.vm08.stdout:1/964: sync 2026-03-10T07:51:51.663 INFO:tasks.workunit.client.1.vm08.stdout:6/987: symlink d1/d3/df/d38/def/l14e 0 2026-03-10T07:51:51.663 INFO:tasks.workunit.client.1.vm08.stdout:3/994: link d0/d3c/d18/d32/daa/cc8 d0/d3c/d18/d32/d61/d52/dca/dd2/d117/c142 0 2026-03-10T07:51:51.702 INFO:tasks.workunit.client.1.vm08.stdout:7/973: truncate d3/da/d25/d9/fc5 3228937 0 2026-03-10T07:51:51.702 INFO:tasks.workunit.client.1.vm08.stdout:7/974: readlink d3/da/l24 0 2026-03-10T07:51:51.703 INFO:tasks.workunit.client.1.vm08.stdout:6/988: rename d1/d17/d2b/d58/d76/dde to d1/d3/d3e/db2/d14f 0 2026-03-10T07:51:51.704 INFO:tasks.workunit.client.1.vm08.stdout:1/965: truncate d2/d6/de/d1f/d26/f48 897848 0 2026-03-10T07:51:51.706 INFO:tasks.workunit.client.1.vm08.stdout:4/922: dwrite d5/da0/de2/ff7 [0,4194304] 0 2026-03-10T07:51:51.706 INFO:tasks.workunit.client.1.vm08.stdout:7/975: sync 2026-03-10T07:51:51.707 INFO:tasks.workunit.client.1.vm08.stdout:7/976: fdatasync d3/da/f21 0 2026-03-10T07:51:51.707 INFO:tasks.workunit.client.1.vm08.stdout:3/995: dwrite 
d0/d3c/d18/d48/d55/fe0 [0,4194304] 0 2026-03-10T07:51:51.711 INFO:tasks.workunit.client.1.vm08.stdout:7/977: chown d3/da/d25/d9/d2f/d39/d43/d4f/d5b 240803 1 2026-03-10T07:51:51.712 INFO:tasks.workunit.client.1.vm08.stdout:7/978: stat d3/da/d25/d9/d2f 0 2026-03-10T07:51:51.712 INFO:tasks.workunit.client.1.vm08.stdout:3/996: truncate d0/d3c/d18/d80/fe6 354433 0 2026-03-10T07:51:51.714 INFO:tasks.workunit.client.1.vm08.stdout:4/923: dwrite d5/d8/fc1 [0,4194304] 0 2026-03-10T07:51:51.724 INFO:tasks.workunit.client.1.vm08.stdout:4/924: rename d5/d8/d50/db0/c109 to d5/da0/d95/de6/d48/d4f/d7c/dde/ddf/d10b/d9b/c134 0 2026-03-10T07:51:51.725 INFO:tasks.workunit.client.1.vm08.stdout:4/925: dread - d5/da0/d95/de6/d48/d4f/d7c/dde/ddf/d10b/dad/f110 zero size 2026-03-10T07:51:51.734 INFO:tasks.workunit.client.1.vm08.stdout:3/997: unlink d0/f100 0 2026-03-10T07:51:51.736 INFO:tasks.workunit.client.1.vm08.stdout:1/966: link d2/d6/de/d47/da0/fcc d2/d6/d3a/f149 0 2026-03-10T07:51:51.736 INFO:tasks.workunit.client.1.vm08.stdout:3/998: rmdir d0/d3c/d18/d32/daa 39 2026-03-10T07:51:51.737 INFO:tasks.workunit.client.1.vm08.stdout:3/999: chown d0/d3c/d1f/d89/fba 7659410 1 2026-03-10T07:51:51.764 INFO:tasks.workunit.client.1.vm08.stdout:6/989: write d1/d3/df/d1d/d40/d87/f8e [1704859,44513] 0 2026-03-10T07:51:51.765 INFO:tasks.workunit.client.1.vm08.stdout:7/979: truncate d3/da/f21 4825506 0 2026-03-10T07:51:51.766 INFO:tasks.workunit.client.1.vm08.stdout:6/990: creat d1/d3/d3e/dff/f150 x:0 0 0 2026-03-10T07:51:51.768 INFO:tasks.workunit.client.1.vm08.stdout:6/991: chown d1/d3/df/d1d/l4b 0 1 2026-03-10T07:51:51.768 INFO:tasks.workunit.client.1.vm08.stdout:4/926: write d5/da0/d95/dc2/fc5 [4777229,67242] 0 2026-03-10T07:51:51.770 INFO:tasks.workunit.client.1.vm08.stdout:6/992: symlink d1/db/d24/l151 0 2026-03-10T07:51:51.770 INFO:tasks.workunit.client.1.vm08.stdout:4/927: read - d5/da0/d95/de6/d48/d4f/d7c/dde/ddf/d10b/d31/dce/fee zero size 2026-03-10T07:51:51.771 
INFO:tasks.workunit.client.1.vm08.stdout:6/993: creat d1/d3/d3e/dff/f152 x:0 0 0 2026-03-10T07:51:51.771 INFO:tasks.workunit.client.1.vm08.stdout:7/980: read d3/da/d25/d9/d2f/d39/d43/f7e [910841,79004] 0 2026-03-10T07:51:51.773 INFO:tasks.workunit.client.1.vm08.stdout:4/928: mknod d5/c135 0 2026-03-10T07:51:51.773 INFO:tasks.workunit.client.1.vm08.stdout:6/994: rmdir d1/d3 39 2026-03-10T07:51:51.775 INFO:tasks.workunit.client.1.vm08.stdout:1/967: dwrite d2/d6/de/d1f/d26/f29 [4194304,4194304] 0 2026-03-10T07:51:51.786 INFO:tasks.workunit.client.1.vm08.stdout:7/981: dwrite d3/da/d25/d9/d2f/d39/d43/d4f/d5b/d79/f142 [0,4194304] 0 2026-03-10T07:51:51.786 INFO:tasks.workunit.client.1.vm08.stdout:1/968: dwrite d2/d6/d3a/f7d [0,4194304] 0 2026-03-10T07:51:51.786 INFO:tasks.workunit.client.1.vm08.stdout:4/929: unlink d5/da0/d95/de6/d48/d4f/d7c/dde/ddf/d10b/d41/f7f 0 2026-03-10T07:51:51.790 INFO:tasks.workunit.client.1.vm08.stdout:7/982: creat d3/da/d25/d9/d2f/d39/d43/d4f/d5b/db7/d113/f14f x:0 0 0 2026-03-10T07:51:51.798 INFO:tasks.workunit.client.1.vm08.stdout:4/930: dread d5/da0/d95/de6/d48/d4f/d7c/dde/ddf/d10b/d31/f82 [0,4194304] 0 2026-03-10T07:51:51.798 INFO:tasks.workunit.client.1.vm08.stdout:6/995: dwrite d1/d3/df/d1d/d40/d45/d10c/f119 [0,4194304] 0 2026-03-10T07:51:51.804 INFO:tasks.workunit.client.1.vm08.stdout:4/931: truncate d5/da0/d95/de6/d48/d4f/d7c/dde/ddf/d10b/f103 623126 0 2026-03-10T07:51:51.808 INFO:tasks.workunit.client.1.vm08.stdout:7/983: rmdir d3/da/d25/d9/d2f/d3a/d71/d8c/ddf/dfe/d115 0 2026-03-10T07:51:51.811 INFO:tasks.workunit.client.1.vm08.stdout:7/984: getdents d3/da/d25/d9/d2f/d39/d43/d4f/d5b/d79/ddd 0 2026-03-10T07:51:51.813 INFO:tasks.workunit.client.1.vm08.stdout:7/985: link d3/da/l24 d3/da/d25/d9/d6f/l150 0 2026-03-10T07:51:51.828 INFO:tasks.workunit.client.1.vm08.stdout:4/932: dread d5/da0/d95/f96 [0,4194304] 0 2026-03-10T07:51:51.829 INFO:tasks.workunit.client.1.vm08.stdout:4/933: read d5/f2f [3112449,128796] 0 2026-03-10T07:51:51.831 
INFO:tasks.workunit.client.1.vm08.stdout:4/934: dread - d5/da0/d95/de6/d48/d4f/d7c/dde/ddf/d10b/d31/dce/fd1 zero size 2026-03-10T07:51:51.833 INFO:tasks.workunit.client.1.vm08.stdout:4/935: mknod d5/da0/d95/de6/d48/d4f/d7c/dde/ddf/d10b/d31/dce/c136 0 2026-03-10T07:51:51.833 INFO:tasks.workunit.client.1.vm08.stdout:4/936: fsync f1 0 2026-03-10T07:51:51.833 INFO:tasks.workunit.client.1.vm08.stdout:4/937: write d5/da0/d95/dc2/f128 [501928,39556] 0 2026-03-10T07:51:51.837 INFO:tasks.workunit.client.1.vm08.stdout:4/938: link d5/da0/d95/de6/d48/d4f/d7c/dde/ddf/d10b/d31/d61/c63 d5/da0/d95/de6/d48/d4f/d7c/dde/ddf/d10b/dad/db8/c137 0 2026-03-10T07:51:51.840 INFO:tasks.workunit.client.1.vm08.stdout:4/939: rename d5/da0/d12/c57 to d5/da0/d95/de6/d48/d4f/d7c/dde/ddf/d10b/d31/dce/dd9/c138 0 2026-03-10T07:51:51.846 INFO:tasks.workunit.client.1.vm08.stdout:4/940: mkdir d5/da0/d95/de6/d48/d4f/d7c/dde/ddf/d10b/d31/d139 0 2026-03-10T07:51:51.848 INFO:tasks.workunit.client.1.vm08.stdout:4/941: unlink d5/da0/d95/de6/d48/d4f/d7c/dde/ddf/d10b/daf/cb6 0 2026-03-10T07:51:51.849 INFO:tasks.workunit.client.1.vm08.stdout:4/942: write d5/da0/d95/de6/d48/d4f/d8d/d91/dd5/f100 [1868094,113156] 0 2026-03-10T07:51:51.852 INFO:tasks.workunit.client.1.vm08.stdout:4/943: creat d5/da0/d95/de6/d48/d4f/d7c/dde/ddf/d10b/d31/f13a x:0 0 0 2026-03-10T07:51:51.853 INFO:tasks.workunit.client.1.vm08.stdout:4/944: write d5/da0/d95/de6/d48/d4f/d7c/dde/ddf/f126 [996615,69375] 0 2026-03-10T07:51:51.854 INFO:tasks.workunit.client.1.vm08.stdout:4/945: write d5/da0/d95/dc2/f128 [1566344,70712] 0 2026-03-10T07:51:51.861 INFO:tasks.workunit.client.1.vm08.stdout:4/946: dread d5/da0/d95/de6/d48/d4f/d7c/dde/ddf/fe4 [0,4194304] 0 2026-03-10T07:51:51.865 INFO:tasks.workunit.client.1.vm08.stdout:4/947: mkdir d5/da0/d95/dc2/d13b 0 2026-03-10T07:51:51.866 INFO:tasks.workunit.client.1.vm08.stdout:4/948: fsync d5/da0/d95/de6/d48/d4f/d8d/d91/fca 0 2026-03-10T07:51:51.868 INFO:tasks.workunit.client.1.vm08.stdout:4/949: unlink 
d5/d8/f30 0 2026-03-10T07:51:51.868 INFO:tasks.workunit.client.1.vm08.stdout:4/950: readlink d5/da0/d12/l13 0 2026-03-10T07:51:51.869 INFO:tasks.workunit.client.1.vm08.stdout:6/996: write d1/d17/f66 [8454401,93402] 0 2026-03-10T07:51:51.871 INFO:tasks.workunit.client.1.vm08.stdout:4/951: readlink d5/da0/d95/de6/d48/d4f/d8d/d91/lb4 0 2026-03-10T07:51:51.872 INFO:tasks.workunit.client.1.vm08.stdout:1/969: dwrite d2/d6/de/d1f/d26/d58/d8c/f97 [0,4194304] 0 2026-03-10T07:51:51.874 INFO:tasks.workunit.client.1.vm08.stdout:7/986: truncate d3/da/f6b 4419590 0 2026-03-10T07:51:51.874 INFO:tasks.workunit.client.1.vm08.stdout:1/970: dread - d2/d6/de/d1f/d26/d58/d83/dc2/f109 zero size 2026-03-10T07:51:51.876 INFO:tasks.workunit.client.1.vm08.stdout:6/997: dwrite d1/d3/d3e/f14a [0,4194304] 0 2026-03-10T07:51:51.878 INFO:tasks.workunit.client.1.vm08.stdout:4/952: truncate d5/da0/d12/def/f10f 4216664 0 2026-03-10T07:51:51.883 INFO:tasks.workunit.client.1.vm08.stdout:4/953: write d5/da0/d95/de6/d48/d4f/d7c/dde/ddf/d10b/d31/f38 [4809495,41607] 0 2026-03-10T07:51:51.883 INFO:tasks.workunit.client.1.vm08.stdout:7/987: symlink d3/da/d25/d9/d2f/d39/d43/d4f/d5b/d136/l151 0 2026-03-10T07:51:51.886 INFO:tasks.workunit.client.1.vm08.stdout:7/988: write d3/da/d25/d9/d2f/d3a/d4b/f111 [926123,39241] 0 2026-03-10T07:51:51.890 INFO:tasks.workunit.client.1.vm08.stdout:6/998: creat d1/db/d24/dac/f153 x:0 0 0 2026-03-10T07:51:51.897 INFO:tasks.workunit.client.1.vm08.stdout:7/989: symlink d3/da/d25/d9/d2f/d3a/d71/d8c/ddf/l152 0 2026-03-10T07:51:51.897 INFO:tasks.workunit.client.1.vm08.stdout:4/954: truncate d5/da0/d95/de6/d48/d4f/d7c/dde/ddf/d10b/d9b/fa8 76258 0 2026-03-10T07:51:51.902 INFO:tasks.workunit.client.1.vm08.stdout:4/955: mkdir d5/da0/d95/de6/d48/d4f/d7c/dde/ddf/d10b/d41/d13c 0 2026-03-10T07:51:51.914 INFO:tasks.workunit.client.1.vm08.stdout:4/956: dread d5/da0/d95/de6/d48/d4f/d7c/fb2 [0,4194304] 0 2026-03-10T07:51:51.926 INFO:tasks.workunit.client.1.vm08.stdout:7/990: getdents 
d3/da/d25/d9/d2f/d3a/d71/d8c/ddf 0 2026-03-10T07:51:51.927 INFO:tasks.workunit.client.1.vm08.stdout:1/971: dwrite d2/d10/ff7 [0,4194304] 0 2026-03-10T07:51:51.930 INFO:tasks.workunit.client.1.vm08.stdout:1/972: chown d2/d6/de/d1f/ccf 5024 1 2026-03-10T07:51:51.936 INFO:tasks.workunit.client.1.vm08.stdout:6/999: dwrite d1/d3/d3e/fe1 [0,4194304] 0 2026-03-10T07:51:51.937 INFO:tasks.workunit.client.1.vm08.stdout:7/991: unlink d3/da/d25/d9/d2f/d3a/d71/d8c/f12d 0 2026-03-10T07:51:51.942 INFO:tasks.workunit.client.1.vm08.stdout:7/992: truncate d3/da/d25/d9/fc5 4079868 0 2026-03-10T07:51:51.946 INFO:tasks.workunit.client.1.vm08.stdout:4/957: write d5/da0/d95/de6/fbc [1007448,33972] 0 2026-03-10T07:51:51.953 INFO:tasks.workunit.client.1.vm08.stdout:4/958: dread d5/d8/f68 [0,4194304] 0 2026-03-10T07:51:51.954 INFO:tasks.workunit.client.1.vm08.stdout:4/959: chown d5/d8/d89/c132 13 1 2026-03-10T07:51:51.955 INFO:tasks.workunit.client.1.vm08.stdout:4/960: creat d5/d8/d50/db0/f13d x:0 0 0 2026-03-10T07:51:51.969 INFO:tasks.workunit.client.1.vm08.stdout:1/973: dwrite d2/d6/de/d1f/f78 [0,4194304] 0 2026-03-10T07:51:51.980 INFO:tasks.workunit.client.1.vm08.stdout:1/974: rename d2/d6/de/d1f/d26/d89/d117 to d2/d6/de/d1f/d8f/d14a 0 2026-03-10T07:51:51.985 INFO:tasks.workunit.client.1.vm08.stdout:7/993: dwrite d3/da/d25/f35 [0,4194304] 0 2026-03-10T07:51:51.985 INFO:tasks.workunit.client.1.vm08.stdout:4/961: write d5/da0/d12/fc6 [2325748,73225] 0 2026-03-10T07:51:51.995 INFO:tasks.workunit.client.1.vm08.stdout:7/994: creat d3/da/d25/d9/d2f/d4d/f153 x:0 0 0 2026-03-10T07:51:51.998 INFO:tasks.workunit.client.1.vm08.stdout:1/975: getdents d2/d6/de/d1f/d26/d58/d83/dc2 0 2026-03-10T07:51:51.999 INFO:tasks.workunit.client.1.vm08.stdout:7/995: creat d3/da/d25/d9/d2f/dfb/f154 x:0 0 0 2026-03-10T07:51:52.015 INFO:tasks.workunit.client.1.vm08.stdout:4/962: dwrite d5/f21 [4194304,4194304] 0 2026-03-10T07:51:52.018 INFO:tasks.workunit.client.1.vm08.stdout:1/976: dwrite d2/d6/de/f7c 
[4194304,4194304] 0 2026-03-10T07:51:52.022 INFO:tasks.workunit.client.1.vm08.stdout:7/996: dwrite d3/da/d25/d9/d2f/d39/d43/d4f/ff5 [0,4194304] 0 2026-03-10T07:51:52.031 INFO:tasks.workunit.client.1.vm08.stdout:4/963: readlink d5/da0/d95/de6/d48/d4f/d7c/dde/ddf/d10b/d70/l112 0 2026-03-10T07:51:52.036 INFO:tasks.workunit.client.1.vm08.stdout:7/997: dwrite d3/da/d25/d9/d2f/d3a/d71/d8c/ddf/f133 [4194304,4194304] 0 2026-03-10T07:51:52.040 INFO:tasks.workunit.client.1.vm08.stdout:4/964: mkdir d5/da0/d95/de6/d48/d4f/d7c/dde/ddf/d10b/d31/dce/dd9/d13e 0 2026-03-10T07:51:52.043 INFO:tasks.workunit.client.1.vm08.stdout:4/965: creat d5/da0/d95/de6/d48/d4f/d7c/dde/ddf/d10b/d31/d61/f13f x:0 0 0 2026-03-10T07:51:52.043 INFO:tasks.workunit.client.1.vm08.stdout:7/998: getdents d3/da/d25/d9/d2f/d3a/d71/d8c/ddf 0 2026-03-10T07:51:52.046 INFO:tasks.workunit.client.1.vm08.stdout:7/999: symlink d3/da/d25/d9/d2f/d39/d43/d4f/d5b/l155 0 2026-03-10T07:51:52.046 INFO:tasks.workunit.client.1.vm08.stdout:4/966: getdents d5/da0/de2 0 2026-03-10T07:51:52.050 INFO:tasks.workunit.client.1.vm08.stdout:4/967: creat d5/d8/d50/f140 x:0 0 0 2026-03-10T07:51:52.054 INFO:tasks.workunit.client.1.vm08.stdout:4/968: dwrite d5/da0/d95/de6/d48/d4f/d7c/dde/ddf/d10b/d31/dce/f120 [0,4194304] 0 2026-03-10T07:51:52.060 INFO:tasks.workunit.client.1.vm08.stdout:4/969: fdatasync d5/d8/f68 0 2026-03-10T07:51:52.113 INFO:tasks.workunit.client.1.vm08.stdout:4/970: sync 2026-03-10T07:51:52.114 INFO:tasks.workunit.client.1.vm08.stdout:4/971: dread - d5/da0/d32/ff4 zero size 2026-03-10T07:51:52.116 INFO:tasks.workunit.client.1.vm08.stdout:4/972: symlink d5/da0/d95/de6/d48/d4f/d7c/dde/ddf/l141 0 2026-03-10T07:51:52.116 INFO:tasks.workunit.client.1.vm08.stdout:4/973: write d5/fd4 [3504199,4205] 0 2026-03-10T07:51:52.118 INFO:tasks.workunit.client.1.vm08.stdout:4/974: mknod d5/da0/d95/de6/d48/d4f/d7c/dde/ddf/d10b/d9b/c142 0 2026-03-10T07:51:52.119 INFO:tasks.workunit.client.1.vm08.stdout:4/975: chown d5/da0/d95/de6/d48 522 1 
2026-03-10T07:51:52.124 INFO:tasks.workunit.client.1.vm08.stdout:4/976: dwrite d5/da0/d95/de6/d48/d4f/d7c/dde/ddf/d10b/d9b/ff3 [0,4194304] 0 2026-03-10T07:51:52.130 INFO:tasks.workunit.client.1.vm08.stdout:4/977: dwrite d5/da0/f46 [0,4194304] 0 2026-03-10T07:51:52.132 INFO:tasks.workunit.client.1.vm08.stdout:4/978: readlink d5/da0/d95/de6/d48/d4f/d7c/dde/ddf/l131 0 2026-03-10T07:51:52.136 INFO:tasks.workunit.client.1.vm08.stdout:4/979: dwrite d5/f21 [0,4194304] 0 2026-03-10T07:51:52.138 INFO:tasks.workunit.client.1.vm08.stdout:4/980: fsync d5/d8/f86 0 2026-03-10T07:51:52.153 INFO:tasks.workunit.client.1.vm08.stdout:1/977: write d2/d6/de/f74 [1367336,19077] 0 2026-03-10T07:51:52.157 INFO:tasks.workunit.client.1.vm08.stdout:1/978: mkdir d2/d6/de/d47/d14b 0 2026-03-10T07:51:52.166 INFO:tasks.workunit.client.1.vm08.stdout:1/979: link d2/d6/de/d5f/df9/fff d2/d10/dc6/f14c 0 2026-03-10T07:51:52.171 INFO:tasks.workunit.client.1.vm08.stdout:1/980: rmdir d2/d6/de/d70/d80 39 2026-03-10T07:51:52.200 INFO:tasks.workunit.client.1.vm08.stdout:4/981: dread d5/da0/d95/de6/d48/d4f/d7c/dde/ddf/d10b/d41/f83 [0,4194304] 0 2026-03-10T07:51:52.211 INFO:tasks.workunit.client.1.vm08.stdout:4/982: creat d5/da0/d95/de6/d48/d4f/d7c/dde/ddf/d10b/d31/dce/dd9/d13e/f143 x:0 0 0 2026-03-10T07:51:52.223 INFO:tasks.workunit.client.1.vm08.stdout:4/983: link d5/da0/d95/de6/d48/d4f/d7c/dde/ddf/d10b/d31/f13a d5/da0/d95/dc2/f144 0 2026-03-10T07:51:52.231 INFO:tasks.workunit.client.1.vm08.stdout:4/984: rename d5/da0/d95/de6/la3 to d5/da0/d95/de6/d48/d4f/d7c/dde/ddf/d10b/d31/dce/dd9/l145 0 2026-03-10T07:51:52.234 INFO:tasks.workunit.client.1.vm08.stdout:4/985: rename d5/da0/d95/de6/d48/d4f/fe1 to d5/da0/d95/de6/d48/d4f/d7c/dde/ddf/d10b/d9b/f146 0 2026-03-10T07:51:52.237 INFO:tasks.workunit.client.1.vm08.stdout:1/981: truncate d2/d10/fe8 3741234 0 2026-03-10T07:51:52.240 INFO:tasks.workunit.client.1.vm08.stdout:1/982: rename d2/d6/d3a/l6c to d2/d6/de/d71/dc1/l14d 0 2026-03-10T07:51:52.242 
INFO:tasks.workunit.client.1.vm08.stdout:1/983: symlink d2/l14e 0 2026-03-10T07:51:52.244 INFO:tasks.workunit.client.1.vm08.stdout:4/986: truncate d5/da0/d32/fae 3623102 0 2026-03-10T07:51:52.245 INFO:tasks.workunit.client.1.vm08.stdout:1/984: symlink d2/d6/de/d1f/d26/d98/d13b/l14f 0 2026-03-10T07:51:52.246 INFO:tasks.workunit.client.1.vm08.stdout:4/987: creat d5/da0/de2/f147 x:0 0 0 2026-03-10T07:51:52.248 INFO:tasks.workunit.client.1.vm08.stdout:1/985: rename d2/d6/de/d1f/d26/d89/l9a to d2/d6/de/d1f/d26/d58/d83/d104/l150 0 2026-03-10T07:51:52.248 INFO:tasks.workunit.client.1.vm08.stdout:1/986: write d2/d6/de/d47/dbd/dc3/fd0 [414535,118423] 0 2026-03-10T07:51:52.249 INFO:tasks.workunit.client.1.vm08.stdout:4/988: creat d5/da0/d95/de6/d48/d4f/d7c/dde/f148 x:0 0 0 2026-03-10T07:51:52.252 INFO:tasks.workunit.client.1.vm08.stdout:4/989: truncate d5/d8/d50/db0/f13d 80568 0 2026-03-10T07:51:52.254 INFO:tasks.workunit.client.1.vm08.stdout:4/990: mknod d5/da0/d95/de6/d48/d4f/d7c/dde/ddf/d10b/d31/dce/dd9/c149 0 2026-03-10T07:51:52.256 INFO:tasks.workunit.client.1.vm08.stdout:1/987: dread d2/d6/d9f/fec [0,4194304] 0 2026-03-10T07:51:52.256 INFO:tasks.workunit.client.1.vm08.stdout:1/988: readlink d2/d6/de/d1f/l51 0 2026-03-10T07:51:52.258 INFO:tasks.workunit.client.1.vm08.stdout:1/989: write d2/d6/de/f7c [4000487,42216] 0 2026-03-10T07:51:52.259 INFO:tasks.workunit.client.1.vm08.stdout:1/990: readlink d2/d6/de/d1f/d40/d76/l8b 0 2026-03-10T07:51:52.262 INFO:tasks.workunit.client.1.vm08.stdout:1/991: chown d2/d6/de/d1f/d26/d58/d8c/f13c 2383 1 2026-03-10T07:51:52.264 INFO:tasks.workunit.client.1.vm08.stdout:4/991: dwrite d5/da0/d95/de6/d48/d4f/d7c/dde/ddf/d10b/d9b/f124 [0,4194304] 0 2026-03-10T07:51:52.266 INFO:tasks.workunit.client.1.vm08.stdout:4/992: truncate d5/da0/de2/f147 592337 0 2026-03-10T07:51:52.268 INFO:tasks.workunit.client.1.vm08.stdout:1/992: symlink d2/d6/de/d1f/d22/deb/l151 0 2026-03-10T07:51:52.275 INFO:tasks.workunit.client.1.vm08.stdout:4/993: dwrite 
d5/da0/f18 [0,4194304] 0 2026-03-10T07:51:52.280 INFO:tasks.workunit.client.1.vm08.stdout:4/994: fdatasync d5/da0/d95/de6/d48/d4f/d7c/dde/ddf/d10b/d9b/ff3 0 2026-03-10T07:51:52.283 INFO:tasks.workunit.client.1.vm08.stdout:1/993: dread d2/d10/f139 [0,4194304] 0 2026-03-10T07:51:52.319 INFO:tasks.workunit.client.1.vm08.stdout:4/995: write d5/da0/d95/de6/d48/d4f/d7c/dde/ddf/d10b/d31/f82 [395172,44845] 0 2026-03-10T07:51:52.321 INFO:tasks.workunit.client.1.vm08.stdout:4/996: mknod d5/da0/d95/dc2/c14a 0 2026-03-10T07:51:52.321 INFO:tasks.workunit.client.1.vm08.stdout:1/994: dwrite d2/d10/dc6/f14c [0,4194304] 0 2026-03-10T07:51:52.322 INFO:tasks.workunit.client.1.vm08.stdout:1/995: stat d2/d10/f99 0 2026-03-10T07:51:52.328 INFO:tasks.workunit.client.1.vm08.stdout:1/996: symlink d2/d6/de/d1f/d40/d76/l152 0 2026-03-10T07:51:52.329 INFO:tasks.workunit.client.1.vm08.stdout:1/997: fdatasync d2/d6/de/d1f/d40/f4d 0 2026-03-10T07:51:52.333 INFO:tasks.workunit.client.1.vm08.stdout:4/997: rename d5/f2f to d5/da0/d95/de6/d48/d4f/d7c/dde/ddf/d10b/f14b 0 2026-03-10T07:51:52.339 INFO:tasks.workunit.client.1.vm08.stdout:1/998: getdents d2/d6 0 2026-03-10T07:51:52.354 INFO:tasks.workunit.client.1.vm08.stdout:1/999: sync 2026-03-10T07:51:52.356 INFO:tasks.workunit.client.1.vm08.stdout:4/998: rmdir d5/da0 39 2026-03-10T07:51:52.406 INFO:tasks.workunit.client.1.vm08.stdout:4/999: creat d5/da0/d95/f14c x:0 0 0 2026-03-10T07:51:52.407 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:51:52 vm05.local ceph-mon[50387]: pgmap v45: 65 pgs: 65 active+clean; 3.4 GiB data, 11 GiB used, 109 GiB / 120 GiB avail; 42 MiB/s rd, 97 MiB/s wr, 233 op/s 2026-03-10T07:51:52.409 INFO:tasks.workunit.client.1.vm08.stderr:+ rm -rf -- ./tmp.npN7WPNL4y 2026-03-10T07:51:52.418 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:51:52 vm08.local ceph-mon[59917]: pgmap v45: 65 pgs: 65 active+clean; 3.4 GiB data, 11 GiB used, 109 GiB / 120 GiB avail; 42 MiB/s rd, 97 MiB/s wr, 233 op/s 2026-03-10T07:51:53.657 
INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:51:53 vm05.local ceph-mon[50387]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' 2026-03-10T07:51:53.657 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:51:53 vm05.local ceph-mon[50387]: pgmap v46: 65 pgs: 65 active+clean; 3.4 GiB data, 11 GiB used, 109 GiB / 120 GiB avail; 42 MiB/s rd, 97 MiB/s wr, 233 op/s 2026-03-10T07:51:53.668 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:51:53 vm08.local ceph-mon[59917]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' 2026-03-10T07:51:53.668 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:51:53 vm08.local ceph-mon[59917]: pgmap v46: 65 pgs: 65 active+clean; 3.4 GiB data, 11 GiB used, 109 GiB / 120 GiB avail; 42 MiB/s rd, 97 MiB/s wr, 233 op/s 2026-03-10T07:51:55.657 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:51:55 vm05.local ceph-mon[50387]: pgmap v47: 65 pgs: 65 active+clean; 3.4 GiB data, 11 GiB used, 109 GiB / 120 GiB avail; 63 MiB/s rd, 144 MiB/s wr, 367 op/s 2026-03-10T07:51:55.668 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:51:55 vm08.local ceph-mon[59917]: pgmap v47: 65 pgs: 65 active+clean; 3.4 GiB data, 11 GiB used, 109 GiB / 120 GiB avail; 63 MiB/s rd, 144 MiB/s wr, 367 op/s 2026-03-10T07:51:57.947 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:51:57 vm05.local ceph-mon[50387]: pgmap v48: 65 pgs: 65 active+clean; 3.4 GiB data, 11 GiB used, 109 GiB / 120 GiB avail; 45 MiB/s rd, 97 MiB/s wr, 261 op/s 2026-03-10T07:51:57.947 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:51:57 vm05.local ceph-mon[50387]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T07:51:58.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:51:57 vm08.local ceph-mon[59917]: pgmap v48: 65 pgs: 65 active+clean; 3.4 GiB data, 11 GiB used, 109 GiB / 120 GiB avail; 45 MiB/s rd, 97 MiB/s wr, 261 op/s 
2026-03-10T07:51:58.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:51:57 vm08.local ceph-mon[59917]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T07:52:00.407 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:00 vm05.local ceph-mon[50387]: pgmap v49: 65 pgs: 65 active+clean; 2.3 GiB data, 8.4 GiB used, 112 GiB / 120 GiB avail; 45 MiB/s rd, 98 MiB/s wr, 341 op/s 2026-03-10T07:52:00.407 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:00 vm05.local ceph-mon[50387]: mgrmap e31: vm05.blexke(active, since 92s), standbys: vm08.orfpog 2026-03-10T07:52:00.418 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:00 vm08.local ceph-mon[59917]: pgmap v49: 65 pgs: 65 active+clean; 2.3 GiB data, 8.4 GiB used, 112 GiB / 120 GiB avail; 45 MiB/s rd, 98 MiB/s wr, 341 op/s 2026-03-10T07:52:00.418 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:00 vm08.local ceph-mon[59917]: mgrmap e31: vm05.blexke(active, since 92s), standbys: vm08.orfpog 2026-03-10T07:52:01.434 INFO:teuthology.orchestra.run:Running command with timeout 3600 2026-03-10T07:52:01.434 DEBUG:teuthology.orchestra.run.vm05:> sudo rm -rf -- /home/ubuntu/cephtest/mnt.0/client.0/tmp 2026-03-10T07:52:02.475 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:02 vm05.local ceph-mon[50387]: pgmap v50: 65 pgs: 65 active+clean; 2.3 GiB data, 8.4 GiB used, 112 GiB / 120 GiB avail; 21 MiB/s rd, 48 MiB/s wr, 214 op/s 2026-03-10T07:52:02.668 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:02 vm08.local ceph-mon[59917]: pgmap v50: 65 pgs: 65 active+clean; 2.3 GiB data, 8.4 GiB used, 112 GiB / 120 GiB avail; 21 MiB/s rd, 48 MiB/s wr, 214 op/s 2026-03-10T07:52:02.669 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:02.667+0000 7f6176b56700 1 -- 192.168.123.105:0/922146271 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f61700ff370 msgr2=0x7f6170106380 secure :-1 
s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:52:02.669 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:02.667+0000 7f6176b56700 1 --2- 192.168.123.105:0/922146271 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f61700ff370 0x7f6170106380 secure :-1 s=READY pgs=109 cs=0 l=1 rev1=1 crypto rx=0x7f6160009b00 tx=0x7f6160009e10 comp rx=0 tx=0).stop 2026-03-10T07:52:02.670 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:02.668+0000 7f6176b56700 1 -- 192.168.123.105:0/922146271 shutdown_connections 2026-03-10T07:52:02.670 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:02.668+0000 7f6176b56700 1 --2- 192.168.123.105:0/922146271 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f61700ff370 0x7f6170106380 unknown :-1 s=CLOSED pgs=109 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:52:02.670 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:02.668+0000 7f6176b56700 1 --2- 192.168.123.105:0/922146271 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f61700fea10 0x7f61700fee30 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:52:02.670 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:02.668+0000 7f6176b56700 1 -- 192.168.123.105:0/922146271 >> 192.168.123.105:0/922146271 conn(0x7f61700fa450 msgr2=0x7f61700fc8b0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:52:02.671 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:02.670+0000 7f6176b56700 1 -- 192.168.123.105:0/922146271 shutdown_connections 2026-03-10T07:52:02.671 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:02.670+0000 7f6176b56700 1 -- 192.168.123.105:0/922146271 wait complete. 
2026-03-10T07:52:02.671 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:02.670+0000 7f6176b56700 1 Processor -- start 2026-03-10T07:52:02.671 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:02.670+0000 7f6176b56700 1 -- start start 2026-03-10T07:52:02.672 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:02.670+0000 7f6176b56700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f61700fea10 0x7f61701987b0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:52:02.672 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:02.670+0000 7f6176b56700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f61700ff370 0x7f6170198cf0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:52:02.672 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:02.670+0000 7f6176b56700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f6170199310 con 0x7f61700ff370 2026-03-10T07:52:02.672 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:02.670+0000 7f6176b56700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f6170199450 con 0x7f61700fea10 2026-03-10T07:52:02.672 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:02.671+0000 7f6175b54700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f61700fea10 0x7f61701987b0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:52:02.672 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:02.671+0000 7f6175b54700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f61700fea10 0x7f61701987b0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.108:3300/0 says I am 
v2:192.168.123.105:57868/0 (socket says 192.168.123.105:57868) 2026-03-10T07:52:02.672 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:02.671+0000 7f6175b54700 1 -- 192.168.123.105:0/1486706854 learned_addr learned my addr 192.168.123.105:0/1486706854 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T07:52:02.672 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:02.671+0000 7f6175353700 1 --2- 192.168.123.105:0/1486706854 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f61700ff370 0x7f6170198cf0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:52:02.673 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:02.671+0000 7f6175b54700 1 -- 192.168.123.105:0/1486706854 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f61700ff370 msgr2=0x7f6170198cf0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:52:02.673 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:02.671+0000 7f6175b54700 1 --2- 192.168.123.105:0/1486706854 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f61700ff370 0x7f6170198cf0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:52:02.673 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:02.671+0000 7f6175b54700 1 -- 192.168.123.105:0/1486706854 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f61600097e0 con 0x7f61700fea10 2026-03-10T07:52:02.673 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:02.671+0000 7f6175b54700 1 --2- 192.168.123.105:0/1486706854 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f61700fea10 0x7f61701987b0 secure :-1 s=READY pgs=110 cs=0 l=1 rev1=1 crypto rx=0x7f616c00eb10 tx=0x7f616c00ee20 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 
2026-03-10T07:52:02.673 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:02.672+0000 7f6166ffd700 1 -- 192.168.123.105:0/1486706854 <== mon.1 v2:192.168.123.108:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f616c00cc40 con 0x7f61700fea10
2026-03-10T07:52:02.673 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:02.672+0000 7f6176b56700 1 -- 192.168.123.105:0/1486706854 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f617019df00 con 0x7f61700fea10
2026-03-10T07:52:02.673 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:02.672+0000 7f6176b56700 1 -- 192.168.123.105:0/1486706854 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f617019e3c0 con 0x7f61700fea10
2026-03-10T07:52:02.674 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:02.673+0000 7f6166ffd700 1 -- 192.168.123.105:0/1486706854 <== mon.1 v2:192.168.123.108:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7f616c00cda0 con 0x7f61700fea10
2026-03-10T07:52:02.674 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:02.673+0000 7f6166ffd700 1 -- 192.168.123.105:0/1486706854 <== mon.1 v2:192.168.123.108:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f616c018810 con 0x7f61700fea10
2026-03-10T07:52:02.675 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:02.674+0000 7f6166ffd700 1 -- 192.168.123.105:0/1486706854 <== mon.1 v2:192.168.123.108:3300/0 4 ==== mgrmap(e 31) v1 ==== 99964+0+0 (secure 0 0 0) 0x7f616c018a50 con 0x7f61700fea10
2026-03-10T07:52:02.675 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:02.674+0000 7f6166ffd700 1 --2- 192.168.123.105:0/1486706854 >> [v2:192.168.123.105:6800/413688438,v1:192.168.123.105:6801/413688438] conn(0x7f615c077610 0x7f615c079ad0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T07:52:02.675 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:02.674+0000 7f6166ffd700 1 -- 192.168.123.105:0/1486706854 <== mon.1 v2:192.168.123.108:3300/0 5 ==== osd_map(42..42 src has 1..42) v4 ==== 5878+0+0 (secure 0 0 0) 0x7f616c014070 con 0x7f61700fea10
2026-03-10T07:52:02.675 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:02.674+0000 7f6164ff9700 1 -- 192.168.123.105:0/1486706854 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f6154005320 con 0x7f61700fea10
2026-03-10T07:52:02.676 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:02.675+0000 7f6175353700 1 --2- 192.168.123.105:0/1486706854 >> [v2:192.168.123.105:6800/413688438,v1:192.168.123.105:6801/413688438] conn(0x7f615c077610 0x7f615c079ad0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-10T07:52:02.678 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:02.676+0000 7f6175353700 1 --2- 192.168.123.105:0/1486706854 >> [v2:192.168.123.105:6800/413688438,v1:192.168.123.105:6801/413688438] conn(0x7f615c077610 0x7f615c079ad0 secure :-1 s=READY pgs=33 cs=0 l=1 rev1=1 crypto rx=0x7f61600052d0 tx=0x7f616001a040 comp rx=0 tx=0).ready entity=mgr.14628 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-10T07:52:02.679 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:02.678+0000 7f6166ffd700 1 -- 192.168.123.105:0/1486706854 <== mon.1 v2:192.168.123.108:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+191549 (secure 0 0 0) 0x7f616c063bc0 con 0x7f61700fea10
2026-03-10T07:52:02.823 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:02.821+0000 7f6164ff9700 1 -- 192.168.123.105:0/1486706854 --> [v2:192.168.123.105:6800/413688438,v1:192.168.123.105:6801/413688438] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7f6154000bf0 con 0x7f615c077610
2026-03-10T07:52:02.826 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:02.825+0000 7f6166ffd700 1 -- 192.168.123.105:0/1486706854 <== mgr.14628 v2:192.168.123.105:6800/413688438 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+175 (secure 0 0 0) 0x7f6154000bf0 con 0x7f615c077610
2026-03-10T07:52:02.829 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:02.828+0000 7f6176b56700 1 -- 192.168.123.105:0/1486706854 >> [v2:192.168.123.105:6800/413688438,v1:192.168.123.105:6801/413688438] conn(0x7f615c077610 msgr2=0x7f615c079ad0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T07:52:02.830 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:02.828+0000 7f6176b56700 1 --2- 192.168.123.105:0/1486706854 >> [v2:192.168.123.105:6800/413688438,v1:192.168.123.105:6801/413688438] conn(0x7f615c077610 0x7f615c079ad0 secure :-1 s=READY pgs=33 cs=0 l=1 rev1=1 crypto rx=0x7f61600052d0 tx=0x7f616001a040 comp rx=0 tx=0).stop
2026-03-10T07:52:02.830 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:02.828+0000 7f6176b56700 1 -- 192.168.123.105:0/1486706854 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f61700fea10 msgr2=0x7f61701987b0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T07:52:02.830 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:02.828+0000 7f6176b56700 1 --2- 192.168.123.105:0/1486706854 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f61700fea10 0x7f61701987b0 secure :-1 s=READY pgs=110 cs=0 l=1 rev1=1 crypto rx=0x7f616c00eb10 tx=0x7f616c00ee20 comp rx=0 tx=0).stop
2026-03-10T07:52:02.830 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:02.828+0000 7f6176b56700 1 -- 192.168.123.105:0/1486706854 shutdown_connections
2026-03-10T07:52:02.830 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:02.828+0000 7f6176b56700 1 --2- 192.168.123.105:0/1486706854 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f61700fea10 0x7f61701987b0 unknown :-1 s=CLOSED pgs=110 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T07:52:02.830 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:02.828+0000 7f6176b56700 1 --2- 192.168.123.105:0/1486706854 >> [v2:192.168.123.105:6800/413688438,v1:192.168.123.105:6801/413688438] conn(0x7f615c077610 0x7f615c079ad0 unknown :-1 s=CLOSED pgs=33 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T07:52:02.830 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:02.828+0000 7f6176b56700 1 --2- 192.168.123.105:0/1486706854 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f61700ff370 0x7f6170198cf0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T07:52:02.830 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:02.828+0000 7f6176b56700 1 -- 192.168.123.105:0/1486706854 >> 192.168.123.105:0/1486706854 conn(0x7f61700fa450 msgr2=0x7f61701009e0 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-10T07:52:02.830 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:02.828+0000 7f6176b56700 1 -- 192.168.123.105:0/1486706854 shutdown_connections
2026-03-10T07:52:02.830 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:02.829+0000 7f6176b56700 1 -- 192.168.123.105:0/1486706854 wait complete.
2026-03-10T07:52:02.890 DEBUG:teuthology.orchestra.run.vm05:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 12e9780e-1c55-11f1-8896-79f7c2e9b508 -e sha1=e911bdebe5c8faa3800735d1568fcdca65db60df -- bash -c 'ceph versions | jq -e '"'"'.mgr | length == 1'"'"''
2026-03-10T07:52:03.093 INFO:teuthology.orchestra.run.vm05.stderr:Inferring config /var/lib/ceph/12e9780e-1c55-11f1-8896-79f7c2e9b508/mon.vm05/config
2026-03-10T07:52:03.631 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:03.629+0000 7f67dfeca700 1 -- 192.168.123.105:0/20490246 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f67d8107d90 msgr2=0x7f67d8108210 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T07:52:03.631 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:03.629+0000 7f67dfeca700 1 --2- 192.168.123.105:0/20490246 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f67d8107d90 0x7f67d8108210 secure :-1 s=READY pgs=111 cs=0 l=1 rev1=1 crypto rx=0x7f67d000c500 tx=0x7f67d000c810 comp rx=0 tx=0).stop
2026-03-10T07:52:03.631 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:03.630+0000 7f67dfeca700 1 -- 192.168.123.105:0/20490246 shutdown_connections
2026-03-10T07:52:03.631 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:03.630+0000 7f67dfeca700 1 --2- 192.168.123.105:0/20490246 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f67d8107d90 0x7f67d8108210 unknown :-1 s=CLOSED pgs=111 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T07:52:03.631 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:03.630+0000 7f67dfeca700 1 --2- 192.168.123.105:0/20490246 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f67d810d680 0x7f67d810da60 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T07:52:03.631 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:03.630+0000 7f67dfeca700 1 -- 192.168.123.105:0/20490246 >> 192.168.123.105:0/20490246 conn(0x7f67d806d1b0 msgr2=0x7f67d806d5c0 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-10T07:52:03.631 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:03.630+0000 7f67dfeca700 1 -- 192.168.123.105:0/20490246 shutdown_connections
2026-03-10T07:52:03.631 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:03.630+0000 7f67dfeca700 1 -- 192.168.123.105:0/20490246 wait complete.
2026-03-10T07:52:03.632 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:03.631+0000 7f67dfeca700 1 Processor -- start
2026-03-10T07:52:03.632 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:03.631+0000 7f67dfeca700 1 -- start start
2026-03-10T07:52:03.632 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:03.631+0000 7f67dfeca700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f67d810d680 0x7f67d819cee0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T07:52:03.632 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:03.631+0000 7f67dfeca700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f67d819ebb0 0x7f67d819d420 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T07:52:03.632 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:03.631+0000 7f67dfeca700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f67d819d9f0 con 0x7f67d810d680
2026-03-10T07:52:03.632 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:03.631+0000 7f67dfeca700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f67d819db30 con 0x7f67d819ebb0
2026-03-10T07:52:03.633 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:03.632+0000 7f67dd465700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f67d819ebb0 0x7f67d819d420 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-10T07:52:03.634 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:03.632+0000 7f67dd465700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f67d819ebb0 0x7f67d819d420 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.108:3300/0 says I am v2:192.168.123.105:57876/0 (socket says 192.168.123.105:57876)
2026-03-10T07:52:03.634 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:03.632+0000 7f67dd465700 1 -- 192.168.123.105:0/2952000326 learned_addr learned my addr 192.168.123.105:0/2952000326 (peer_addr_for_me v2:192.168.123.105:0/0)
2026-03-10T07:52:03.634 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:03.632+0000 7f67ddc66700 1 --2- 192.168.123.105:0/2952000326 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f67d810d680 0x7f67d819cee0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-10T07:52:03.634 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:03.633+0000 7f67dd465700 1 -- 192.168.123.105:0/2952000326 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f67d810d680 msgr2=0x7f67d819cee0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T07:52:03.634 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:03.633+0000 7f67dd465700 1 --2- 192.168.123.105:0/2952000326 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f67d810d680 0x7f67d819cee0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T07:52:03.634 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:03.633+0000 7f67dd465700 1 -- 192.168.123.105:0/2952000326 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f67d000c1b0 con 0x7f67d819ebb0
2026-03-10T07:52:03.634 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:03.633+0000 7f67dd465700 1 --2- 192.168.123.105:0/2952000326 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f67d819ebb0 0x7f67d819d420 secure :-1 s=READY pgs=112 cs=0 l=1 rev1=1 crypto rx=0x7f67d000c4d0 tx=0x7f67d0012a40 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-10T07:52:03.634 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:03.633+0000 7f67ceffd700 1 -- 192.168.123.105:0/2952000326 <== mon.1 v2:192.168.123.108:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f67d0003bb0 con 0x7f67d819ebb0
2026-03-10T07:52:03.635 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:03.633+0000 7f67dfeca700 1 -- 192.168.123.105:0/2952000326 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f67d819dd30 con 0x7f67d819ebb0
2026-03-10T07:52:03.635 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:03.633+0000 7f67dfeca700 1 -- 192.168.123.105:0/2952000326 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f67d81a37d0 con 0x7f67d819ebb0
2026-03-10T07:52:03.635 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:03.634+0000 7f67ceffd700 1 -- 192.168.123.105:0/2952000326 <== mon.1 v2:192.168.123.108:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7f67d0003d10 con 0x7f67d819ebb0
2026-03-10T07:52:03.636 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:03.634+0000 7f67ceffd700 1 -- 192.168.123.105:0/2952000326 <== mon.1 v2:192.168.123.108:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f67d001b640 con 0x7f67d819ebb0
2026-03-10T07:52:03.637 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:03.635+0000 7f67ceffd700 1 -- 192.168.123.105:0/2952000326 <== mon.1 v2:192.168.123.108:3300/0 4 ==== mgrmap(e 31) v1 ==== 99964+0+0 (secure 0 0 0) 0x7f67d0007d50 con 0x7f67d819ebb0
2026-03-10T07:52:03.637 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:03.636+0000 7f67ceffd700 1 --2- 192.168.123.105:0/2952000326 >> [v2:192.168.123.105:6800/413688438,v1:192.168.123.105:6801/413688438] conn(0x7f67c4077660 0x7f67c4079b20 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T07:52:03.638 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:03.636+0000 7f67ddc66700 1 --2- 192.168.123.105:0/2952000326 >> [v2:192.168.123.105:6800/413688438,v1:192.168.123.105:6801/413688438] conn(0x7f67c4077660 0x7f67c4079b20 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-10T07:52:03.638 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:03.637+0000 7f67ceffd700 1 -- 192.168.123.105:0/2952000326 <== mon.1 v2:192.168.123.108:3300/0 5 ==== osd_map(42..42 src has 1..42) v4 ==== 5878+0+0 (secure 0 0 0) 0x7f67d000f070 con 0x7f67d819ebb0
2026-03-10T07:52:03.638 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:03.637+0000 7f67dfeca700 1 -- 192.168.123.105:0/2952000326 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f67bc005320 con 0x7f67d819ebb0
2026-03-10T07:52:03.640 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:03.638+0000 7f67ddc66700 1 --2- 192.168.123.105:0/2952000326 >> [v2:192.168.123.105:6800/413688438,v1:192.168.123.105:6801/413688438] conn(0x7f67c4077660 0x7f67c4079b20 secure :-1 s=READY pgs=34 cs=0 l=1 rev1=1 crypto rx=0x7f67d4009de0 tx=0x7f67d4009450 comp rx=0 tx=0).ready entity=mgr.14628 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-10T07:52:03.646 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:03.642+0000 7f67ceffd700 1 -- 192.168.123.105:0/2952000326 <== mon.1 v2:192.168.123.108:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+191549 (secure 0 0 0) 0x7f67d0062f70 con 0x7f67d819ebb0
2026-03-10T07:52:03.872 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:03.869+0000 7f67dfeca700 1 -- 192.168.123.105:0/2952000326 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "versions"} v 0) v1 -- 0x7f67bc006200 con 0x7f67d819ebb0
2026-03-10T07:52:03.873 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:03.872+0000 7f67ceffd700 1 -- 192.168.123.105:0/2952000326 <== mon.1 v2:192.168.123.108:3300/0 7 ==== mon_command_ack([{"prefix": "versions"}]=0 v0) v1 ==== 56+0+679 (secure 0 0 0) 0x7f67d0004a10 con 0x7f67d819ebb0
2026-03-10T07:52:03.877 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:03.876+0000 7f67ccff9700 1 -- 192.168.123.105:0/2952000326 >> [v2:192.168.123.105:6800/413688438,v1:192.168.123.105:6801/413688438] conn(0x7f67c4077660 msgr2=0x7f67c4079b20 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T07:52:03.878 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:03.876+0000 7f67ccff9700 1 --2- 192.168.123.105:0/2952000326 >> [v2:192.168.123.105:6800/413688438,v1:192.168.123.105:6801/413688438] conn(0x7f67c4077660 0x7f67c4079b20 secure :-1 s=READY pgs=34 cs=0 l=1 rev1=1 crypto rx=0x7f67d4009de0 tx=0x7f67d4009450 comp rx=0 tx=0).stop
2026-03-10T07:52:03.878 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:03.876+0000 7f67ccff9700 1 -- 192.168.123.105:0/2952000326 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f67d819ebb0 msgr2=0x7f67d819d420 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T07:52:03.878 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:03.876+0000 7f67ccff9700 1 --2- 192.168.123.105:0/2952000326 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f67d819ebb0 0x7f67d819d420 secure :-1 s=READY pgs=112 cs=0 l=1 rev1=1 crypto rx=0x7f67d000c4d0 tx=0x7f67d0012a40 comp rx=0 tx=0).stop
2026-03-10T07:52:03.878 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:03.876+0000 7f67ccff9700 1 -- 192.168.123.105:0/2952000326 shutdown_connections
2026-03-10T07:52:03.878 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:03.876+0000 7f67ccff9700 1 --2- 192.168.123.105:0/2952000326 >> [v2:192.168.123.105:6800/413688438,v1:192.168.123.105:6801/413688438] conn(0x7f67c4077660 0x7f67c4079b20 unknown :-1 s=CLOSED pgs=34 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T07:52:03.878 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:03.876+0000 7f67ccff9700 1 --2- 192.168.123.105:0/2952000326 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f67d810d680 0x7f67d819cee0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T07:52:03.878 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:03.876+0000 7f67ccff9700 1 --2- 192.168.123.105:0/2952000326 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f67d819ebb0 0x7f67d819d420 unknown :-1 s=CLOSED pgs=112 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T07:52:03.878 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:03.876+0000 7f67ccff9700 1 -- 192.168.123.105:0/2952000326 >> 192.168.123.105:0/2952000326 conn(0x7f67d806d1b0 msgr2=0x7f67d80710b0 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-10T07:52:03.879 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:03.877+0000 7f67ccff9700 1 -- 192.168.123.105:0/2952000326 shutdown_connections
2026-03-10T07:52:03.879 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:03.878+0000 7f67ccff9700 1 -- 192.168.123.105:0/2952000326 wait complete.
2026-03-10T07:52:03.890 INFO:teuthology.orchestra.run.vm05.stdout:true
2026-03-10T07:52:04.051 DEBUG:teuthology.orchestra.run.vm05:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 12e9780e-1c55-11f1-8896-79f7c2e9b508 -e sha1=e911bdebe5c8faa3800735d1568fcdca65db60df -- bash -c 'ceph versions | jq -e '"'"'.mgr | keys'"'"' | grep $sha1'
2026-03-10T07:52:04.149 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:03 vm05.local ceph-mon[50387]: from='client.24493 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch
2026-03-10T07:52:04.149 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:03 vm05.local ceph-mon[50387]: pgmap v51: 65 pgs: 65 active+clean; 2.3 GiB data, 8.4 GiB used, 112 GiB / 120 GiB avail; 21 MiB/s rd, 48 MiB/s wr, 214 op/s
2026-03-10T07:52:04.149 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:03 vm05.local ceph-mon[50387]: from='client.? 192.168.123.105:0/2952000326' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch
2026-03-10T07:52:04.245 INFO:teuthology.orchestra.run.vm05.stderr:Inferring config /var/lib/ceph/12e9780e-1c55-11f1-8896-79f7c2e9b508/mon.vm05/config
2026-03-10T07:52:04.418 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:03 vm08.local ceph-mon[59917]: from='client.24493 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch
2026-03-10T07:52:04.418 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:03 vm08.local ceph-mon[59917]: pgmap v51: 65 pgs: 65 active+clean; 2.3 GiB data, 8.4 GiB used, 112 GiB / 120 GiB avail; 21 MiB/s rd, 48 MiB/s wr, 214 op/s
2026-03-10T07:52:04.418 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:03 vm08.local ceph-mon[59917]: from='client.? 192.168.123.105:0/2952000326' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch
2026-03-10T07:52:04.594 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:04.592+0000 7f3acf59e700 1 -- 192.168.123.105:0/2793130951 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f3ac80a4ca0 msgr2=0x7f3ac80a5120 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T07:52:04.594 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:04.592+0000 7f3acf59e700 1 --2- 192.168.123.105:0/2793130951 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f3ac80a4ca0 0x7f3ac80a5120 secure :-1 s=READY pgs=113 cs=0 l=1 rev1=1 crypto rx=0x7f3ad0066a30 tx=0x7f3ad0067220 comp rx=0 tx=0).stop
2026-03-10T07:52:04.594 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:04.593+0000 7f3acf59e700 1 -- 192.168.123.105:0/2793130951 shutdown_connections
2026-03-10T07:52:04.594 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:04.593+0000 7f3acf59e700 1 --2- 192.168.123.105:0/2793130951 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f3ac80a4ca0 0x7f3ac80a5120 unknown :-1 s=CLOSED pgs=113 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T07:52:04.594 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:04.593+0000 7f3acf59e700 1 --2- 192.168.123.105:0/2793130951 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3ac80aabf0 0x7f3ac80aafd0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T07:52:04.594 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:04.593+0000 7f3acf59e700 1 -- 192.168.123.105:0/2793130951 >> 192.168.123.105:0/2793130951 conn(0x7f3ac801a6f0 msgr2=0x7f3ac801ab00 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-10T07:52:04.595 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:04.593+0000 7f3acf59e700 1 -- 192.168.123.105:0/2793130951 shutdown_connections
2026-03-10T07:52:04.595 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:04.593+0000 7f3acf59e700 1 -- 192.168.123.105:0/2793130951 wait complete.
2026-03-10T07:52:04.595 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:04.593+0000 7f3acf59e700 1 Processor -- start
2026-03-10T07:52:04.595 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:04.593+0000 7f3acf59e700 1 -- start start
2026-03-10T07:52:04.595 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:04.594+0000 7f3acf59e700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f3ac80aabf0 0x7f3ac800f820 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T07:52:04.595 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:04.594+0000 7f3acf59e700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3ac800fd60 0x7f3ac80101e0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T07:52:04.595 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:04.594+0000 7f3acf59e700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f3ac80143a0 con 0x7f3ac800fd60
2026-03-10T07:52:04.596 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:04.594+0000 7f3acf59e700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f3ac8014510 con 0x7f3ac80aabf0
2026-03-10T07:52:04.596 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:04.594+0000 7f3ace59c700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f3ac80aabf0 0x7f3ac800f820 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-10T07:52:04.596 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:04.594+0000 7f3ace59c700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f3ac80aabf0 0x7f3ac800f820 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.108:3300/0 says I am v2:192.168.123.105:57894/0 (socket says 192.168.123.105:57894)
2026-03-10T07:52:04.596 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:04.594+0000 7f3ace59c700 1 -- 192.168.123.105:0/3791005137 learned_addr learned my addr 192.168.123.105:0/3791005137 (peer_addr_for_me v2:192.168.123.105:0/0)
2026-03-10T07:52:04.596 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:04.594+0000 7f3ace59c700 1 -- 192.168.123.105:0/3791005137 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3ac800fd60 msgr2=0x7f3ac80101e0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T07:52:04.596 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:04.594+0000 7f3ace59c700 1 --2- 192.168.123.105:0/3791005137 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3ac800fd60 0x7f3ac80101e0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T07:52:04.596 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:04.594+0000 7f3ace59c700 1 -- 192.168.123.105:0/3791005137 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f3ad0067090 con 0x7f3ac80aabf0
2026-03-10T07:52:04.596 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:04.595+0000 7f3ace59c700 1 --2- 192.168.123.105:0/3791005137 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f3ac80aabf0 0x7f3ac800f820 secure :-1 s=READY pgs=114 cs=0 l=1 rev1=1 crypto rx=0x7f3ac000d8d0 tx=0x7f3ac000dc90 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-10T07:52:04.599 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:04.595+0000 7f3abf7fe700 1 -- 192.168.123.105:0/3791005137 <== mon.1 v2:192.168.123.108:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f3ac0009940 con 0x7f3ac80aabf0
2026-03-10T07:52:04.599 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:04.595+0000 7f3abf7fe700 1 -- 192.168.123.105:0/3791005137 <== mon.1 v2:192.168.123.108:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7f3ac0010460 con 0x7f3ac80aabf0
2026-03-10T07:52:04.599 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:04.595+0000 7f3abf7fe700 1 -- 192.168.123.105:0/3791005137 <== mon.1 v2:192.168.123.108:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f3ac000f5d0 con 0x7f3ac80aabf0
2026-03-10T07:52:04.599 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:04.597+0000 7f3acf59e700 1 -- 192.168.123.105:0/3791005137 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f3ac80147f0 con 0x7f3ac80aabf0
2026-03-10T07:52:04.599 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:04.597+0000 7f3acf59e700 1 -- 192.168.123.105:0/3791005137 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f3ac8014d10 con 0x7f3ac80aabf0
2026-03-10T07:52:04.599 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:04.598+0000 7f3acf59e700 1 -- 192.168.123.105:0/3791005137 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f3ac80044c0 con 0x7f3ac80aabf0
2026-03-10T07:52:04.601 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:04.600+0000 7f3abf7fe700 1 -- 192.168.123.105:0/3791005137 <== mon.1 v2:192.168.123.108:3300/0 4 ==== mgrmap(e 31) v1 ==== 99964+0+0 (secure 0 0 0) 0x7f3ac00105d0 con 0x7f3ac80aabf0
2026-03-10T07:52:04.602 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:04.600+0000 7f3abf7fe700 1 --2- 192.168.123.105:0/3791005137 >> [v2:192.168.123.105:6800/413688438,v1:192.168.123.105:6801/413688438] conn(0x7f3ab8077590 0x7f3ab8079a50 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T07:52:04.602 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:04.600+0000 7f3abf7fe700 1 -- 192.168.123.105:0/3791005137 <== mon.1 v2:192.168.123.108:3300/0 5 ==== osd_map(42..42 src has 1..42) v4 ==== 5878+0+0 (secure 0 0 0) 0x7f3ac00998e0 con 0x7f3ac80aabf0
2026-03-10T07:52:04.602 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:04.601+0000 7f3acdd9b700 1 --2- 192.168.123.105:0/3791005137 >> [v2:192.168.123.105:6800/413688438,v1:192.168.123.105:6801/413688438] conn(0x7f3ab8077590 0x7f3ab8079a50 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-10T07:52:04.602 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:04.601+0000 7f3acdd9b700 1 --2- 192.168.123.105:0/3791005137 >> [v2:192.168.123.105:6800/413688438,v1:192.168.123.105:6801/413688438] conn(0x7f3ab8077590 0x7f3ab8079a50 secure :-1 s=READY pgs=35 cs=0 l=1 rev1=1 crypto rx=0x7f3ad00668a0 tx=0x7f3ad00666f0 comp rx=0 tx=0).ready entity=mgr.14628 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-10T07:52:04.605 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:04.601+0000 7f3abf7fe700 1 -- 192.168.123.105:0/3791005137 <== mon.1 v2:192.168.123.108:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+191549 (secure 0 0 0) 0x7f3ac0062000 con 0x7f3ac80aabf0
2026-03-10T07:52:04.808 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:04.806+0000 7f3acf59e700 1 -- 192.168.123.105:0/3791005137 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "versions"} v 0) v1 -- 0x7f3ac8010e70 con 0x7f3ac80aabf0
2026-03-10T07:52:04.809 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:04.808+0000 7f3abf7fe700 1 -- 192.168.123.105:0/3791005137 <== mon.1 v2:192.168.123.108:3300/0 7 ==== mon_command_ack([{"prefix": "versions"}]=0 v0) v1 ==== 56+0+679 (secure 0 0 0) 0x7f3ac0061e20 con 0x7f3ac80aabf0
2026-03-10T07:52:04.813 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:04.812+0000 7f3abd7fa700 1 -- 192.168.123.105:0/3791005137 >> [v2:192.168.123.105:6800/413688438,v1:192.168.123.105:6801/413688438] conn(0x7f3ab8077590 msgr2=0x7f3ab8079a50 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:52:04.813 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:04.812+0000 7f3abd7fa700 1 --2- 192.168.123.105:0/3791005137 >> [v2:192.168.123.105:6800/413688438,v1:192.168.123.105:6801/413688438] conn(0x7f3ab8077590 0x7f3ab8079a50 secure :-1 s=READY pgs=35 cs=0 l=1 rev1=1 crypto rx=0x7f3ad00668a0 tx=0x7f3ad00666f0 comp rx=0 tx=0).stop 2026-03-10T07:52:04.813 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:04.812+0000 7f3abd7fa700 1 -- 192.168.123.105:0/3791005137 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f3ac80aabf0 msgr2=0x7f3ac800f820 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:52:04.813 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:04.812+0000 7f3abd7fa700 1 --2- 192.168.123.105:0/3791005137 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f3ac80aabf0 0x7f3ac800f820 secure :-1 s=READY pgs=114 cs=0 l=1 rev1=1 crypto rx=0x7f3ac000d8d0 tx=0x7f3ac000dc90 comp rx=0 tx=0).stop 2026-03-10T07:52:04.813 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:04.812+0000 7f3abd7fa700 1 -- 192.168.123.105:0/3791005137 shutdown_connections 2026-03-10T07:52:04.814 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:04.812+0000 7f3abd7fa700 1 --2- 192.168.123.105:0/3791005137 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f3ac80aabf0 0x7f3ac800f820 unknown :-1 s=CLOSED pgs=114 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:52:04.814 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:04.812+0000 7f3abd7fa700 1 --2- 192.168.123.105:0/3791005137 >> 
[v2:192.168.123.105:6800/413688438,v1:192.168.123.105:6801/413688438] conn(0x7f3ab8077590 0x7f3ab8079a50 unknown :-1 s=CLOSED pgs=35 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:52:04.814 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:04.812+0000 7f3abd7fa700 1 --2- 192.168.123.105:0/3791005137 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3ac800fd60 0x7f3ac80101e0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:52:04.814 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:04.812+0000 7f3abd7fa700 1 -- 192.168.123.105:0/3791005137 >> 192.168.123.105:0/3791005137 conn(0x7f3ac801a6f0 msgr2=0x7f3ac80a4520 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:52:04.815 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:04.813+0000 7f3abd7fa700 1 -- 192.168.123.105:0/3791005137 shutdown_connections 2026-03-10T07:52:04.815 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:04.814+0000 7f3abd7fa700 1 -- 192.168.123.105:0/3791005137 wait complete. 2026-03-10T07:52:04.829 INFO:teuthology.orchestra.run.vm05.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)" 2026-03-10T07:52:04.891 DEBUG:teuthology.orchestra.run.vm05:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 12e9780e-1c55-11f1-8896-79f7c2e9b508 -e sha1=e911bdebe5c8faa3800735d1568fcdca65db60df -- bash -c 'ceph versions | jq -e '"'"'.overall | length == 2'"'"'' 2026-03-10T07:52:05.146 INFO:teuthology.orchestra.run.vm05.stderr:Inferring config /var/lib/ceph/12e9780e-1c55-11f1-8896-79f7c2e9b508/mon.vm05/config 2026-03-10T07:52:05.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:04 vm05.local ceph-mon[50387]: from='client.? 
192.168.123.105:0/3791005137' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T07:52:05.351 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:04 vm08.local ceph-mon[59917]: from='client.? 192.168.123.105:0/3791005137' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T07:52:05.532 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:05.530+0000 7f0c7b1c1700 1 -- 192.168.123.105:0/2122573699 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f0c741030a0 msgr2=0x7f0c74103480 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:52:05.532 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:05.530+0000 7f0c7b1c1700 1 --2- 192.168.123.105:0/2122573699 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f0c741030a0 0x7f0c74103480 secure :-1 s=READY pgs=323 cs=0 l=1 rev1=1 crypto rx=0x7f0c70009b00 tx=0x7f0c70009e10 comp rx=0 tx=0).stop 2026-03-10T07:52:05.532 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:05.531+0000 7f0c7b1c1700 1 -- 192.168.123.105:0/2122573699 shutdown_connections 2026-03-10T07:52:05.532 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:05.531+0000 7f0c7b1c1700 1 --2- 192.168.123.105:0/2122573699 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f0c74103a50 0x7f0c74107aa0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:52:05.532 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:05.531+0000 7f0c7b1c1700 1 --2- 192.168.123.105:0/2122573699 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f0c741030a0 0x7f0c74103480 unknown :-1 s=CLOSED pgs=323 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:52:05.532 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:05.531+0000 7f0c7b1c1700 1 -- 192.168.123.105:0/2122573699 >> 192.168.123.105:0/2122573699 conn(0x7f0c740fe930 msgr2=0x7f0c74100d50 unknown :-1 s=STATE_NONE 
l=0).mark_down 2026-03-10T07:52:05.533 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:05.531+0000 7f0c7b1c1700 1 -- 192.168.123.105:0/2122573699 shutdown_connections 2026-03-10T07:52:05.533 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:05.532+0000 7f0c7b1c1700 1 -- 192.168.123.105:0/2122573699 wait complete. 2026-03-10T07:52:05.533 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:05.532+0000 7f0c7b1c1700 1 Processor -- start 2026-03-10T07:52:05.533 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:05.532+0000 7f0c7b1c1700 1 -- start start 2026-03-10T07:52:05.534 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:05.532+0000 7f0c7b1c1700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f0c741030a0 0x7f0c7419e9c0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:52:05.534 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:05.532+0000 7f0c7b1c1700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f0c74103a50 0x7f0c7419ef00 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:52:05.534 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:05.532+0000 7f0c7b1c1700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f0c7419f590 con 0x7f0c741030a0 2026-03-10T07:52:05.534 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:05.532+0000 7f0c7b1c1700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f0c74198a40 con 0x7f0c74103a50 2026-03-10T07:52:05.534 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:05.532+0000 7f0c7a1bf700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f0c741030a0 0x7f0c7419e9c0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 
2026-03-10T07:52:05.534 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:05.533+0000 7f0c799be700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f0c74103a50 0x7f0c7419ef00 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:52:05.534 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:05.533+0000 7f0c799be700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f0c74103a50 0x7f0c7419ef00 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.108:3300/0 says I am v2:192.168.123.105:57918/0 (socket says 192.168.123.105:57918) 2026-03-10T07:52:05.534 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:05.533+0000 7f0c799be700 1 -- 192.168.123.105:0/2490104482 learned_addr learned my addr 192.168.123.105:0/2490104482 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T07:52:05.534 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:05.533+0000 7f0c799be700 1 -- 192.168.123.105:0/2490104482 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f0c741030a0 msgr2=0x7f0c7419e9c0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:52:05.534 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:05.533+0000 7f0c799be700 1 --2- 192.168.123.105:0/2490104482 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f0c741030a0 0x7f0c7419e9c0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:52:05.534 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:05.533+0000 7f0c799be700 1 -- 192.168.123.105:0/2490104482 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f0c700097e0 con 0x7f0c74103a50 2026-03-10T07:52:05.535 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:05.533+0000 
7f0c799be700 1 --2- 192.168.123.105:0/2490104482 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f0c74103a50 0x7f0c7419ef00 secure :-1 s=READY pgs=115 cs=0 l=1 rev1=1 crypto rx=0x7f0c6400d900 tx=0x7f0c6400dc10 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:52:05.535 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:05.533+0000 7f0c6b7fe700 1 -- 192.168.123.105:0/2490104482 <== mon.1 v2:192.168.123.108:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f0c640098e0 con 0x7f0c74103a50 2026-03-10T07:52:05.535 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:05.534+0000 7f0c7b1c1700 1 -- 192.168.123.105:0/2490104482 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f0c74198d20 con 0x7f0c74103a50 2026-03-10T07:52:05.535 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:05.534+0000 7f0c7b1c1700 1 -- 192.168.123.105:0/2490104482 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f0c74199270 con 0x7f0c74103a50 2026-03-10T07:52:05.536 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:05.534+0000 7f0c6b7fe700 1 -- 192.168.123.105:0/2490104482 <== mon.1 v2:192.168.123.108:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7f0c64010460 con 0x7f0c74103a50 2026-03-10T07:52:05.536 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:05.534+0000 7f0c7b1c1700 1 -- 192.168.123.105:0/2490104482 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f0c7410b3f0 con 0x7f0c74103a50 2026-03-10T07:52:05.536 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:05.535+0000 7f0c6b7fe700 1 -- 192.168.123.105:0/2490104482 <== mon.1 v2:192.168.123.108:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f0c6400f5d0 con 0x7f0c74103a50 2026-03-10T07:52:05.537 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:05.536+0000 7f0c6b7fe700 1 -- 192.168.123.105:0/2490104482 <== mon.1 v2:192.168.123.108:3300/0 4 ==== mgrmap(e 31) v1 ==== 99964+0+0 (secure 0 0 0) 0x7f0c64009a40 con 0x7f0c74103a50 2026-03-10T07:52:05.538 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:05.536+0000 7f0c6b7fe700 1 --2- 192.168.123.105:0/2490104482 >> [v2:192.168.123.105:6800/413688438,v1:192.168.123.105:6801/413688438] conn(0x7f0c60077380 0x7f0c60079840 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:52:05.538 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:05.536+0000 7f0c6b7fe700 1 -- 192.168.123.105:0/2490104482 <== mon.1 v2:192.168.123.108:3300/0 5 ==== osd_map(42..42 src has 1..42) v4 ==== 5878+0+0 (secure 0 0 0) 0x7f0c64099360 con 0x7f0c74103a50 2026-03-10T07:52:05.538 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:05.537+0000 7f0c7a1bf700 1 --2- 192.168.123.105:0/2490104482 >> [v2:192.168.123.105:6800/413688438,v1:192.168.123.105:6801/413688438] conn(0x7f0c60077380 0x7f0c60079840 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:52:05.539 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:05.538+0000 7f0c7a1bf700 1 --2- 192.168.123.105:0/2490104482 >> [v2:192.168.123.105:6800/413688438,v1:192.168.123.105:6801/413688438] conn(0x7f0c60077380 0x7f0c60079840 secure :-1 s=READY pgs=36 cs=0 l=1 rev1=1 crypto rx=0x7f0c7000b5c0 tx=0x7f0c70005fb0 comp rx=0 tx=0).ready entity=mgr.14628 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:52:05.539 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:05.538+0000 7f0c6b7fe700 1 -- 192.168.123.105:0/2490104482 <== mon.1 v2:192.168.123.108:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+191549 (secure 0 0 0) 0x7f0c640619c0 con 0x7f0c74103a50 
2026-03-10T07:52:05.711 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:05.708+0000 7f0c7b1c1700 1 -- 192.168.123.105:0/2490104482 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "versions"} v 0) v1 -- 0x7f0c74198eb0 con 0x7f0c74103a50 2026-03-10T07:52:05.712 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:05.710+0000 7f0c6b7fe700 1 -- 192.168.123.105:0/2490104482 <== mon.1 v2:192.168.123.108:3300/0 7 ==== mon_command_ack([{"prefix": "versions"}]=0 v0) v1 ==== 56+0+679 (secure 0 0 0) 0x7f0c74198eb0 con 0x7f0c74103a50 2026-03-10T07:52:05.715 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:05.714+0000 7f0c697fa700 1 -- 192.168.123.105:0/2490104482 >> [v2:192.168.123.105:6800/413688438,v1:192.168.123.105:6801/413688438] conn(0x7f0c60077380 msgr2=0x7f0c60079840 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:52:05.715 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:05.714+0000 7f0c697fa700 1 --2- 192.168.123.105:0/2490104482 >> [v2:192.168.123.105:6800/413688438,v1:192.168.123.105:6801/413688438] conn(0x7f0c60077380 0x7f0c60079840 secure :-1 s=READY pgs=36 cs=0 l=1 rev1=1 crypto rx=0x7f0c7000b5c0 tx=0x7f0c70005fb0 comp rx=0 tx=0).stop 2026-03-10T07:52:05.716 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:05.714+0000 7f0c697fa700 1 -- 192.168.123.105:0/2490104482 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f0c74103a50 msgr2=0x7f0c7419ef00 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:52:05.716 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:05.714+0000 7f0c697fa700 1 --2- 192.168.123.105:0/2490104482 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f0c74103a50 0x7f0c7419ef00 secure :-1 s=READY pgs=115 cs=0 l=1 rev1=1 crypto rx=0x7f0c6400d900 tx=0x7f0c6400dc10 comp rx=0 tx=0).stop 2026-03-10T07:52:05.717 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:05.715+0000 7f0c697fa700 
1 -- 192.168.123.105:0/2490104482 shutdown_connections 2026-03-10T07:52:05.717 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:05.715+0000 7f0c697fa700 1 --2- 192.168.123.105:0/2490104482 >> [v2:192.168.123.105:6800/413688438,v1:192.168.123.105:6801/413688438] conn(0x7f0c60077380 0x7f0c60079840 unknown :-1 s=CLOSED pgs=36 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:52:05.717 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:05.715+0000 7f0c697fa700 1 --2- 192.168.123.105:0/2490104482 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f0c741030a0 0x7f0c7419e9c0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:52:05.717 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:05.715+0000 7f0c697fa700 1 --2- 192.168.123.105:0/2490104482 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f0c74103a50 0x7f0c7419ef00 unknown :-1 s=CLOSED pgs=115 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:52:05.717 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:05.716+0000 7f0c697fa700 1 -- 192.168.123.105:0/2490104482 >> 192.168.123.105:0/2490104482 conn(0x7f0c740fe930 msgr2=0x7f0c740fff50 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:52:05.718 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:05.716+0000 7f0c697fa700 1 -- 192.168.123.105:0/2490104482 shutdown_connections 2026-03-10T07:52:05.718 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:05.717+0000 7f0c697fa700 1 -- 192.168.123.105:0/2490104482 wait complete. 
2026-03-10T07:52:05.744 INFO:teuthology.orchestra.run.vm05.stdout:true 2026-03-10T07:52:05.794 DEBUG:teuthology.orchestra.run.vm05:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 12e9780e-1c55-11f1-8896-79f7c2e9b508 -e sha1=e911bdebe5c8faa3800735d1568fcdca65db60df -- bash -c 'ceph orch upgrade check quay.ceph.io/ceph-ci/ceph:$sha1 | jq -e '"'"'.up_to_date | length == 2'"'"'' 2026-03-10T07:52:05.989 INFO:teuthology.orchestra.run.vm05.stderr:Inferring config /var/lib/ceph/12e9780e-1c55-11f1-8896-79f7c2e9b508/mon.vm05/config 2026-03-10T07:52:06.011 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:06 vm05.local ceph-mon[50387]: pgmap v52: 65 pgs: 65 active+clean; 1.5 GiB data, 6.2 GiB used, 114 GiB / 120 GiB avail; 21 MiB/s rd, 49 MiB/s wr, 282 op/s 2026-03-10T07:52:06.011 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:06 vm05.local ceph-mon[50387]: from='client.? 192.168.123.105:0/2490104482' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T07:52:06.418 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:06 vm08.local ceph-mon[59917]: pgmap v52: 65 pgs: 65 active+clean; 1.5 GiB data, 6.2 GiB used, 114 GiB / 120 GiB avail; 21 MiB/s rd, 49 MiB/s wr, 282 op/s 2026-03-10T07:52:06.418 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:06 vm08.local ceph-mon[59917]: from='client.? 
192.168.123.105:0/2490104482' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T07:52:06.453 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:06.451+0000 7fa4c72a6700 1 -- 192.168.123.105:0/1730354446 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fa4c0107d90 msgr2=0x7fa4c0108210 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:52:06.453 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:06.451+0000 7fa4c72a6700 1 --2- 192.168.123.105:0/1730354446 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fa4c0107d90 0x7fa4c0108210 secure :-1 s=READY pgs=116 cs=0 l=1 rev1=1 crypto rx=0x7fa4b800b210 tx=0x7fa4b800b520 comp rx=0 tx=0).stop 2026-03-10T07:52:06.454 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:06.452+0000 7fa4c72a6700 1 -- 192.168.123.105:0/1730354446 shutdown_connections 2026-03-10T07:52:06.454 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:06.452+0000 7fa4c72a6700 1 --2- 192.168.123.105:0/1730354446 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fa4c0107d90 0x7fa4c0108210 unknown :-1 s=CLOSED pgs=116 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:52:06.454 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:06.452+0000 7fa4c72a6700 1 --2- 192.168.123.105:0/1730354446 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fa4c010f420 0x7fa4c010f800 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:52:06.454 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:06.452+0000 7fa4c72a6700 1 -- 192.168.123.105:0/1730354446 >> 192.168.123.105:0/1730354446 conn(0x7fa4c006ce20 msgr2=0x7fa4c006d230 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:52:06.455 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:06.453+0000 7fa4c72a6700 1 -- 192.168.123.105:0/1730354446 shutdown_connections 2026-03-10T07:52:06.455 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:06.454+0000 7fa4c72a6700 1 -- 192.168.123.105:0/1730354446 wait complete. 2026-03-10T07:52:06.455 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:06.454+0000 7fa4c72a6700 1 Processor -- start 2026-03-10T07:52:06.456 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:06.454+0000 7fa4c72a6700 1 -- start start 2026-03-10T07:52:06.456 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:06.454+0000 7fa4c72a6700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fa4c0107d90 0x7fa4c0119f00 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:52:06.456 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:06.454+0000 7fa4c72a6700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fa4c010f420 0x7fa4c0114f00 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:52:06.456 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:06.454+0000 7fa4c72a6700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fa4c01154d0 con 0x7fa4c010f420 2026-03-10T07:52:06.456 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:06.454+0000 7fa4c72a6700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fa4c0115640 con 0x7fa4c0107d90 2026-03-10T07:52:06.456 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:06.455+0000 7fa4c5042700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fa4c0107d90 0x7fa4c0119f00 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:52:06.457 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:06.455+0000 7fa4c5042700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fa4c0107d90 0x7fa4c0119f00 unknown :-1 
s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.108:3300/0 says I am v2:192.168.123.105:57924/0 (socket says 192.168.123.105:57924) 2026-03-10T07:52:06.457 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:06.455+0000 7fa4c5042700 1 -- 192.168.123.105:0/2513044592 learned_addr learned my addr 192.168.123.105:0/2513044592 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T07:52:06.457 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:06.456+0000 7fa4c5042700 1 -- 192.168.123.105:0/2513044592 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fa4c010f420 msgr2=0x7fa4c0114f00 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:52:06.457 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:06.456+0000 7fa4c5042700 1 --2- 192.168.123.105:0/2513044592 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fa4c010f420 0x7fa4c0114f00 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:52:06.457 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:06.456+0000 7fa4c5042700 1 -- 192.168.123.105:0/2513044592 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fa4b8009e30 con 0x7fa4c0107d90 2026-03-10T07:52:06.458 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:06.456+0000 7fa4c5042700 1 --2- 192.168.123.105:0/2513044592 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fa4c0107d90 0x7fa4c0119f00 secure :-1 s=READY pgs=117 cs=0 l=1 rev1=1 crypto rx=0x7fa4bc00ea30 tx=0x7fa4bc00edf0 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:52:06.458 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:06.457+0000 7fa4b67fc700 1 -- 192.168.123.105:0/2513044592 <== mon.1 v2:192.168.123.108:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fa4bc00cc40 con 
0x7fa4c0107d90 2026-03-10T07:52:06.458 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:06.457+0000 7fa4c72a6700 1 -- 192.168.123.105:0/2513044592 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fa4c0115920 con 0x7fa4c0107d90 2026-03-10T07:52:06.458 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:06.457+0000 7fa4c72a6700 1 -- 192.168.123.105:0/2513044592 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fa4c01b8450 con 0x7fa4c0107d90 2026-03-10T07:52:06.459 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:06.458+0000 7fa4b67fc700 1 -- 192.168.123.105:0/2513044592 <== mon.1 v2:192.168.123.108:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7fa4bc00cda0 con 0x7fa4c0107d90 2026-03-10T07:52:06.460 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:06.459+0000 7fa4b67fc700 1 -- 192.168.123.105:0/2513044592 <== mon.1 v2:192.168.123.108:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fa4bc010430 con 0x7fa4c0107d90 2026-03-10T07:52:06.460 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:06.459+0000 7fa4b67fc700 1 -- 192.168.123.105:0/2513044592 <== mon.1 v2:192.168.123.108:3300/0 4 ==== mgrmap(e 31) v1 ==== 99964+0+0 (secure 0 0 0) 0x7fa4bc0106b0 con 0x7fa4c0107d90 2026-03-10T07:52:06.461 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:06.459+0000 7fa4b67fc700 1 --2- 192.168.123.105:0/2513044592 >> [v2:192.168.123.105:6800/413688438,v1:192.168.123.105:6801/413688438] conn(0x7fa4ac0776e0 0x7fa4ac079ba0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:52:06.461 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:06.460+0000 7fa4c4841700 1 --2- 192.168.123.105:0/2513044592 >> [v2:192.168.123.105:6800/413688438,v1:192.168.123.105:6801/413688438] conn(0x7fa4ac0776e0 0x7fa4ac079ba0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto 
rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:52:06.461 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:06.460+0000 7fa4b67fc700 1 -- 192.168.123.105:0/2513044592 <== mon.1 v2:192.168.123.108:3300/0 5 ==== osd_map(42..42 src has 1..42) v4 ==== 5878+0+0 (secure 0 0 0) 0x7fa4bc014070 con 0x7fa4c0107d90 2026-03-10T07:52:06.461 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:06.460+0000 7fa4c4841700 1 --2- 192.168.123.105:0/2513044592 >> [v2:192.168.123.105:6800/413688438,v1:192.168.123.105:6801/413688438] conn(0x7fa4ac0776e0 0x7fa4ac079ba0 secure :-1 s=READY pgs=37 cs=0 l=1 rev1=1 crypto rx=0x7fa4b8015040 tx=0x7fa4b8000f40 comp rx=0 tx=0).ready entity=mgr.14628 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:52:06.463 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:06.461+0000 7fa4c72a6700 1 -- 192.168.123.105:0/2513044592 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fa4a4005320 con 0x7fa4c0107d90 2026-03-10T07:52:06.466 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:06.465+0000 7fa4b67fc700 1 -- 192.168.123.105:0/2513044592 <== mon.1 v2:192.168.123.108:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+191549 (secure 0 0 0) 0x7fa4bc0637f0 con 0x7fa4c0107d90 2026-03-10T07:52:06.611 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:06.609+0000 7fa4c72a6700 1 -- 192.168.123.105:0/2513044592 --> [v2:192.168.123.105:6800/413688438,v1:192.168.123.105:6801/413688438] -- mgr_command(tid 0: {"prefix": "orch upgrade check", "image": "quay.ceph.io/ceph-ci/ceph:e911bdebe5c8faa3800735d1568fcdca65db60df", "target": ["mon-mgr", ""]}) v1 -- 0x7fa4a4000c90 con 0x7fa4ac0776e0 2026-03-10T07:52:07.657 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:07 vm05.local ceph-mon[50387]: from='client.24509 -' entity='client.admin' cmd=[{"prefix": "orch 
upgrade check", "image": "quay.ceph.io/ceph-ci/ceph:e911bdebe5c8faa3800735d1568fcdca65db60df", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T07:52:07.657 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:07 vm05.local ceph-mon[50387]: pgmap v53: 65 pgs: 65 active+clean; 1.5 GiB data, 6.2 GiB used, 114 GiB / 120 GiB avail; 23 KiB/s rd, 1.2 MiB/s wr, 148 op/s 2026-03-10T07:52:07.668 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:07 vm08.local ceph-mon[59917]: from='client.24509 -' entity='client.admin' cmd=[{"prefix": "orch upgrade check", "image": "quay.ceph.io/ceph-ci/ceph:e911bdebe5c8faa3800735d1568fcdca65db60df", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T07:52:07.668 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:07 vm08.local ceph-mon[59917]: pgmap v53: 65 pgs: 65 active+clean; 1.5 GiB data, 6.2 GiB used, 114 GiB / 120 GiB avail; 23 KiB/s rd, 1.2 MiB/s wr, 148 op/s 2026-03-10T07:52:09.313 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:09.310+0000 7fa4b67fc700 1 -- 192.168.123.105:0/2513044592 <== mgr.14628 v2:192.168.123.105:6800/413688438 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+5308 (secure 0 0 0) 0x7fa4a4000c90 con 0x7fa4ac0776e0 2026-03-10T07:52:09.313 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:09.312+0000 7fa4abfff700 1 -- 192.168.123.105:0/2513044592 >> [v2:192.168.123.105:6800/413688438,v1:192.168.123.105:6801/413688438] conn(0x7fa4ac0776e0 msgr2=0x7fa4ac079ba0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:52:09.314 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:09.312+0000 7fa4abfff700 1 --2- 192.168.123.105:0/2513044592 >> [v2:192.168.123.105:6800/413688438,v1:192.168.123.105:6801/413688438] conn(0x7fa4ac0776e0 0x7fa4ac079ba0 secure :-1 s=READY pgs=37 cs=0 l=1 rev1=1 crypto rx=0x7fa4b8015040 tx=0x7fa4b8000f40 comp rx=0 tx=0).stop 2026-03-10T07:52:09.314 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:09.312+0000 7fa4abfff700 1 -- 
192.168.123.105:0/2513044592 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fa4c0107d90 msgr2=0x7fa4c0119f00 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:52:09.314 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:09.312+0000 7fa4abfff700 1 --2- 192.168.123.105:0/2513044592 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fa4c0107d90 0x7fa4c0119f00 secure :-1 s=READY pgs=117 cs=0 l=1 rev1=1 crypto rx=0x7fa4bc00ea30 tx=0x7fa4bc00edf0 comp rx=0 tx=0).stop 2026-03-10T07:52:09.314 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:09.312+0000 7fa4abfff700 1 -- 192.168.123.105:0/2513044592 shutdown_connections 2026-03-10T07:52:09.314 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:09.312+0000 7fa4abfff700 1 --2- 192.168.123.105:0/2513044592 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fa4c0107d90 0x7fa4c0119f00 unknown :-1 s=CLOSED pgs=117 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:52:09.314 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:09.312+0000 7fa4abfff700 1 --2- 192.168.123.105:0/2513044592 >> [v2:192.168.123.105:6800/413688438,v1:192.168.123.105:6801/413688438] conn(0x7fa4ac0776e0 0x7fa4ac079ba0 unknown :-1 s=CLOSED pgs=37 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:52:09.314 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:09.312+0000 7fa4abfff700 1 --2- 192.168.123.105:0/2513044592 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fa4c010f420 0x7fa4c0114f00 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:52:09.314 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:09.312+0000 7fa4abfff700 1 -- 192.168.123.105:0/2513044592 >> 192.168.123.105:0/2513044592 conn(0x7fa4c006ce20 msgr2=0x7fa4c00700c0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:52:09.314 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:09.313+0000 7fa4abfff700 1 -- 192.168.123.105:0/2513044592 shutdown_connections 2026-03-10T07:52:09.314 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:09.313+0000 7fa4abfff700 1 -- 192.168.123.105:0/2513044592 wait complete. 2026-03-10T07:52:09.324 INFO:teuthology.orchestra.run.vm05.stdout:true 2026-03-10T07:52:09.375 DEBUG:teuthology.orchestra.run.vm05:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 12e9780e-1c55-11f1-8896-79f7c2e9b508 -e sha1=e911bdebe5c8faa3800735d1568fcdca65db60df -- bash -c 'ceph orch ps' 2026-03-10T07:52:09.625 INFO:teuthology.orchestra.run.vm05.stderr:Inferring config /var/lib/ceph/12e9780e-1c55-11f1-8896-79f7c2e9b508/mon.vm05/config 2026-03-10T07:52:09.657 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:09 vm05.local ceph-mon[50387]: pgmap v54: 65 pgs: 65 active+clean; 845 MiB data, 4.2 GiB used, 116 GiB / 120 GiB avail; 31 KiB/s rd, 1.7 MiB/s wr, 206 op/s 2026-03-10T07:52:09.668 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:09 vm08.local ceph-mon[59917]: pgmap v54: 65 pgs: 65 active+clean; 845 MiB data, 4.2 GiB used, 116 GiB / 120 GiB avail; 31 KiB/s rd, 1.7 MiB/s wr, 206 op/s 2026-03-10T07:52:09.991 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:09.990+0000 7fec37038700 1 -- 192.168.123.105:0/700103157 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fec300ffe70 msgr2=0x7fec30103d30 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:52:09.992 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:09.990+0000 7fec37038700 1 --2- 192.168.123.105:0/700103157 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fec300ffe70 0x7fec30103d30 secure :-1 s=READY pgs=324 cs=0 l=1 rev1=1 crypto rx=0x7fec2800b3a0 tx=0x7fec2800b6b0 comp rx=0 tx=0).stop 2026-03-10T07:52:09.992 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:09.990+0000 7fec37038700 1 -- 192.168.123.105:0/700103157 shutdown_connections 2026-03-10T07:52:09.992 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:09.990+0000 7fec37038700 1 --2- 192.168.123.105:0/700103157 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fec300ffe70 0x7fec30103d30 unknown :-1 s=CLOSED pgs=324 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:52:09.993 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:09.990+0000 7fec37038700 1 --2- 192.168.123.105:0/700103157 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fec300ff4c0 0x7fec300ff8a0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:52:09.993 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:09.990+0000 7fec37038700 1 -- 192.168.123.105:0/700103157 >> 192.168.123.105:0/700103157 conn(0x7fec30074bd0 msgr2=0x7fec30074fe0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:52:09.993 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:09.990+0000 7fec37038700 1 -- 192.168.123.105:0/700103157 shutdown_connections 2026-03-10T07:52:09.993 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:09.990+0000 7fec37038700 1 -- 192.168.123.105:0/700103157 wait complete. 
2026-03-10T07:52:09.993 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:09.991+0000 7fec37038700 1 Processor -- start 2026-03-10T07:52:09.993 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:09.991+0000 7fec37038700 1 -- start start 2026-03-10T07:52:09.993 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:09.991+0000 7fec37038700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fec300ff4c0 0x7fec3006bb30 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:52:09.993 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:09.991+0000 7fec37038700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fec3006d850 0x7fec3006c070 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:52:09.993 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:09.991+0000 7fec37038700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fec3006c640 con 0x7fec3006d850 2026-03-10T07:52:09.993 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:09.991+0000 7fec37038700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fec3006c7b0 con 0x7fec300ff4c0 2026-03-10T07:52:09.993 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:09.992+0000 7fec2ffff700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fec3006d850 0x7fec3006c070 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:52:09.993 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:09.992+0000 7fec2ffff700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fec3006d850 0x7fec3006c070 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am 
v2:192.168.123.105:44624/0 (socket says 192.168.123.105:44624) 2026-03-10T07:52:09.993 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:09.992+0000 7fec2ffff700 1 -- 192.168.123.105:0/4158109081 learned_addr learned my addr 192.168.123.105:0/4158109081 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T07:52:09.993 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:09.992+0000 7fec34dd4700 1 --2- 192.168.123.105:0/4158109081 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fec300ff4c0 0x7fec3006bb30 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:52:09.994 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:09.993+0000 7fec2ffff700 1 -- 192.168.123.105:0/4158109081 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fec300ff4c0 msgr2=0x7fec3006bb30 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:52:09.994 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:09.993+0000 7fec2ffff700 1 --2- 192.168.123.105:0/4158109081 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fec300ff4c0 0x7fec3006bb30 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:52:09.994 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:09.993+0000 7fec2ffff700 1 -- 192.168.123.105:0/4158109081 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fec2800b050 con 0x7fec3006d850 2026-03-10T07:52:09.998 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:09.994+0000 7fec2ffff700 1 --2- 192.168.123.105:0/4158109081 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fec3006d850 0x7fec3006c070 secure :-1 s=READY pgs=325 cs=0 l=1 rev1=1 crypto rx=0x7fec28003fd0 tx=0x7fec28007d30 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 
2026-03-10T07:52:09.998 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:09.994+0000 7fec2dffb700 1 -- 192.168.123.105:0/4158109081 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fec28007980 con 0x7fec3006d850 2026-03-10T07:52:09.998 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:09.994+0000 7fec37038700 1 -- 192.168.123.105:0/4158109081 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fec30192460 con 0x7fec3006d850 2026-03-10T07:52:09.998 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:09.994+0000 7fec37038700 1 -- 192.168.123.105:0/4158109081 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fec301929b0 con 0x7fec3006d850 2026-03-10T07:52:09.998 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:09.995+0000 7fec2dffb700 1 -- 192.168.123.105:0/4158109081 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7fec28007ae0 con 0x7fec3006d850 2026-03-10T07:52:09.998 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:09.995+0000 7fec2dffb700 1 -- 192.168.123.105:0/4158109081 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fec2801ab50 con 0x7fec3006d850 2026-03-10T07:52:09.998 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:09.996+0000 7fec37038700 1 -- 192.168.123.105:0/4158109081 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fec1c005320 con 0x7fec3006d850 2026-03-10T07:52:10.002 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:09.999+0000 7fec2dffb700 1 -- 192.168.123.105:0/4158109081 <== mon.0 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 31) v1 ==== 99964+0+0 (secure 0 0 0) 0x7fec28004210 con 0x7fec3006d850 2026-03-10T07:52:10.002 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:09.999+0000 
7fec2dffb700 1 --2- 192.168.123.105:0/4158109081 >> [v2:192.168.123.105:6800/413688438,v1:192.168.123.105:6801/413688438] conn(0x7fec18077660 0x7fec18079b20 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:52:10.002 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:09.999+0000 7fec34dd4700 1 --2- 192.168.123.105:0/4158109081 >> [v2:192.168.123.105:6800/413688438,v1:192.168.123.105:6801/413688438] conn(0x7fec18077660 0x7fec18079b20 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:52:10.002 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:09.999+0000 7fec2dffb700 1 -- 192.168.123.105:0/4158109081 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(42..42 src has 1..42) v4 ==== 5878+0+0 (secure 0 0 0) 0x7fec2809af00 con 0x7fec3006d850 2026-03-10T07:52:10.002 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:10.000+0000 7fec34dd4700 1 --2- 192.168.123.105:0/4158109081 >> [v2:192.168.123.105:6800/413688438,v1:192.168.123.105:6801/413688438] conn(0x7fec18077660 0x7fec18079b20 secure :-1 s=READY pgs=38 cs=0 l=1 rev1=1 crypto rx=0x7fec2001ae60 tx=0x7fec2001a460 comp rx=0 tx=0).ready entity=mgr.14628 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:52:10.005 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:10.003+0000 7fec2dffb700 1 -- 192.168.123.105:0/4158109081 <== mon.0 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+191549 (secure 0 0 0) 0x7fec28063c30 con 0x7fec3006d850 2026-03-10T07:52:10.160 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:10.158+0000 7fec37038700 1 -- 192.168.123.105:0/4158109081 --> [v2:192.168.123.105:6800/413688438,v1:192.168.123.105:6801/413688438] -- mgr_command(tid 0: {"prefix": "orch ps", "target": ["mon-mgr", ""]}) v1 -- 0x7fec1c000bf0 con 0x7fec18077660 
2026-03-10T07:52:10.168 INFO:teuthology.orchestra.run.vm05.stdout:NAME HOST PORTS STATUS REFRESHED AGE MEM USE MEM LIM VERSION IMAGE ID CONTAINER ID 2026-03-10T07:52:10.168 INFO:teuthology.orchestra.run.vm05.stdout:alertmanager.vm05 vm05 *:9093,9094 running (48s) 24s ago 5m 16.9M - 0.25.0 c8568f914cd2 ac15d5f35994 2026-03-10T07:52:10.168 INFO:teuthology.orchestra.run.vm05.stdout:ceph-exporter.vm05 vm05 running (5m) 24s ago 5m 8530k - 18.2.1 5be31c24972a 26c4db858175 2026-03-10T07:52:10.168 INFO:teuthology.orchestra.run.vm05.stdout:ceph-exporter.vm08 vm08 running (5m) 75s ago 5m 8409k - 18.2.1 5be31c24972a 209e2398a09c 2026-03-10T07:52:10.168 INFO:teuthology.orchestra.run.vm05.stdout:crash.vm05 vm05 running (5m) 24s ago 5m 7419k - 18.2.1 5be31c24972a d3d7b92c8ac3 2026-03-10T07:52:10.168 INFO:teuthology.orchestra.run.vm05.stdout:crash.vm08 vm08 running (5m) 75s ago 5m 7415k - 18.2.1 5be31c24972a 96136e0195f7 2026-03-10T07:52:10.168 INFO:teuthology.orchestra.run.vm05.stdout:grafana.vm05 vm05 *:3000 running (27s) 24s ago 5m 41.5M - 10.4.0 c8b91775d855 6acb529ad951 2026-03-10T07:52:10.168 INFO:teuthology.orchestra.run.vm05.stdout:mds.cephfs.vm05.omfhnh vm05 running (3m) 24s ago 3m 262M - 18.2.1 5be31c24972a e23de179e09c 2026-03-10T07:52:10.168 INFO:teuthology.orchestra.run.vm05.stdout:mds.cephfs.vm05.pavqil vm05 running (3m) 24s ago 3m 15.7M - 18.2.1 5be31c24972a 5b9e5afa214c 2026-03-10T07:52:10.168 INFO:teuthology.orchestra.run.vm05.stdout:mds.cephfs.vm08.dgsaon vm08 running (3m) 75s ago 3m 16.4M - 18.2.1 5be31c24972a 1696aee522b5 2026-03-10T07:52:10.168 INFO:teuthology.orchestra.run.vm05.stdout:mds.cephfs.vm08.ybmbgd vm08 running (3m) 75s ago 3m 14.7M - 18.2.1 5be31c24972a 30b0e51cd2ed 2026-03-10T07:52:10.168 INFO:teuthology.orchestra.run.vm05.stdout:mgr.vm05.blexke vm05 *:8443,9283,8765 running (111s) 24s ago 6m 585M - 19.2.3-678-ge911bdeb 654f31e6858e 5467b6a3bd38 2026-03-10T07:52:10.168 INFO:teuthology.orchestra.run.vm05.stdout:mgr.vm08.orfpog vm08 *:8443,9283,8765 
running (92s) 75s ago 5m 487M - 19.2.3-678-ge911bdeb 654f31e6858e 7a15d7811d5b 2026-03-10T07:52:10.168 INFO:teuthology.orchestra.run.vm05.stdout:mon.vm05 vm05 running (6m) 24s ago 6m 55.0M 2048M 18.2.1 5be31c24972a 2a459bf05146 2026-03-10T07:52:10.168 INFO:teuthology.orchestra.run.vm05.stdout:mon.vm08 vm08 running (5m) 75s ago 5m 42.1M 2048M 18.2.1 5be31c24972a e01dfb712474 2026-03-10T07:52:10.168 INFO:teuthology.orchestra.run.vm05.stdout:node-exporter.vm05 vm05 *:9100 running (82s) 24s ago 5m 9512k - 1.7.0 72c9c2088986 7cd0b23b4118 2026-03-10T07:52:10.168 INFO:teuthology.orchestra.run.vm05.stdout:node-exporter.vm08 vm08 *:9100 running (77s) 75s ago 5m 5368k - 1.7.0 72c9c2088986 3dd4d91d5881 2026-03-10T07:52:10.168 INFO:teuthology.orchestra.run.vm05.stdout:osd.0 vm05 running (4m) 24s ago 4m 308M 4096M 18.2.1 5be31c24972a 9b7c5ea48cea 2026-03-10T07:52:10.168 INFO:teuthology.orchestra.run.vm05.stdout:osd.1 vm05 running (4m) 24s ago 4m 332M 4096M 18.2.1 5be31c24972a 88e0b65b2c93 2026-03-10T07:52:10.168 INFO:teuthology.orchestra.run.vm05.stdout:osd.2 vm05 running (4m) 24s ago 4m 262M 4096M 18.2.1 5be31c24972a b8d3cbeb49f1 2026-03-10T07:52:10.168 INFO:teuthology.orchestra.run.vm05.stdout:osd.3 vm08 running (4m) 75s ago 4m 257M 4096M 18.2.1 5be31c24972a 0a62c54a86c0 2026-03-10T07:52:10.168 INFO:teuthology.orchestra.run.vm05.stdout:osd.4 vm08 running (4m) 75s ago 4m 196M 4096M 18.2.1 5be31c24972a bd748b691ccd 2026-03-10T07:52:10.168 INFO:teuthology.orchestra.run.vm05.stdout:osd.5 vm08 running (4m) 75s ago 4m 220M 4096M 18.2.1 5be31c24972a 9f08820ae98b 2026-03-10T07:52:10.168 INFO:teuthology.orchestra.run.vm05.stdout:prometheus.vm05 vm05 *:9095 running (57s) 24s ago 5m 47.1M - 2.51.0 1d3b7f56885b c59a6be07563 2026-03-10T07:52:10.168 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:10.164+0000 7fec2dffb700 1 -- 192.168.123.105:0/4158109081 <== mgr.14628 v2:192.168.123.105:6800/413688438 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+3576 (secure 0 0 0) 
0x7fec1c000bf0 con 0x7fec18077660 2026-03-10T07:52:10.171 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:10.169+0000 7fec177fe700 1 -- 192.168.123.105:0/4158109081 >> [v2:192.168.123.105:6800/413688438,v1:192.168.123.105:6801/413688438] conn(0x7fec18077660 msgr2=0x7fec18079b20 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:52:10.171 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:10.169+0000 7fec177fe700 1 --2- 192.168.123.105:0/4158109081 >> [v2:192.168.123.105:6800/413688438,v1:192.168.123.105:6801/413688438] conn(0x7fec18077660 0x7fec18079b20 secure :-1 s=READY pgs=38 cs=0 l=1 rev1=1 crypto rx=0x7fec2001ae60 tx=0x7fec2001a460 comp rx=0 tx=0).stop 2026-03-10T07:52:10.171 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:10.169+0000 7fec177fe700 1 -- 192.168.123.105:0/4158109081 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fec3006d850 msgr2=0x7fec3006c070 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:52:10.171 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:10.169+0000 7fec177fe700 1 --2- 192.168.123.105:0/4158109081 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fec3006d850 0x7fec3006c070 secure :-1 s=READY pgs=325 cs=0 l=1 rev1=1 crypto rx=0x7fec28003fd0 tx=0x7fec28007d30 comp rx=0 tx=0).stop 2026-03-10T07:52:10.171 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:10.169+0000 7fec177fe700 1 -- 192.168.123.105:0/4158109081 shutdown_connections 2026-03-10T07:52:10.171 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:10.169+0000 7fec177fe700 1 --2- 192.168.123.105:0/4158109081 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fec300ff4c0 0x7fec3006bb30 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:52:10.171 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:10.169+0000 7fec177fe700 1 --2- 192.168.123.105:0/4158109081 >> 
[v2:192.168.123.105:6800/413688438,v1:192.168.123.105:6801/413688438] conn(0x7fec18077660 0x7fec18079b20 secure :-1 s=CLOSED pgs=38 cs=0 l=1 rev1=1 crypto rx=0x7fec2001ae60 tx=0x7fec2001a460 comp rx=0 tx=0).stop 2026-03-10T07:52:10.171 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:10.169+0000 7fec177fe700 1 --2- 192.168.123.105:0/4158109081 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fec3006d850 0x7fec3006c070 unknown :-1 s=CLOSED pgs=325 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:52:10.171 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:10.169+0000 7fec177fe700 1 -- 192.168.123.105:0/4158109081 >> 192.168.123.105:0/4158109081 conn(0x7fec30074bd0 msgr2=0x7fec300fdc60 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:52:10.171 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:10.169+0000 7fec177fe700 1 -- 192.168.123.105:0/4158109081 shutdown_connections 2026-03-10T07:52:10.171 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:10.169+0000 7fec177fe700 1 -- 192.168.123.105:0/4158109081 wait complete. 2026-03-10T07:52:10.228 INFO:teuthology.task.sequential:In sequential, running task cephadm.shell... 
2026-03-10T07:52:10.228 INFO:tasks.cephadm:Running commands on role host.a host ubuntu@vm05.local 2026-03-10T07:52:10.228 DEBUG:teuthology.orchestra.run.vm05:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 12e9780e-1c55-11f1-8896-79f7c2e9b508 -e sha1=e911bdebe5c8faa3800735d1568fcdca65db60df -- bash -c 'ceph config set mgr mgr/orchestrator/fail_fs true' 2026-03-10T07:52:10.457 INFO:teuthology.orchestra.run.vm05.stderr:Inferring config /var/lib/ceph/12e9780e-1c55-11f1-8896-79f7c2e9b508/mon.vm05/config 2026-03-10T07:52:10.514 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:10 vm05.local ceph-mon[50387]: from='client.14722 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T07:52:10.668 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:10 vm08.local ceph-mon[59917]: from='client.14722 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T07:52:10.873 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:10.872+0000 7ff2d9f53700 1 -- 192.168.123.105:0/1723700245 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7ff2d4107d90 msgr2=0x7ff2d4108210 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:52:10.874 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:10.872+0000 7ff2d9f53700 1 --2- 192.168.123.105:0/1723700245 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7ff2d4107d90 0x7ff2d4108210 secure :-1 s=READY pgs=118 cs=0 l=1 rev1=1 crypto rx=0x7ff2cc00b3a0 tx=0x7ff2cc00b6b0 comp rx=0 tx=0).stop 2026-03-10T07:52:10.874 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:10.872+0000 7ff2d9f53700 1 -- 192.168.123.105:0/1723700245 shutdown_connections 2026-03-10T07:52:10.874 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:10.872+0000 7ff2d9f53700 1 --2- 192.168.123.105:0/1723700245 >> 
[v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7ff2d4107d90 0x7ff2d4108210 unknown :-1 s=CLOSED pgs=118 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:52:10.874 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:10.872+0000 7ff2d9f53700 1 --2- 192.168.123.105:0/1723700245 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff2d410f420 0x7ff2d410f800 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:52:10.874 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:10.872+0000 7ff2d9f53700 1 -- 192.168.123.105:0/1723700245 >> 192.168.123.105:0/1723700245 conn(0x7ff2d406ce20 msgr2=0x7ff2d406d230 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:52:10.874 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:10.872+0000 7ff2d9f53700 1 -- 192.168.123.105:0/1723700245 shutdown_connections 2026-03-10T07:52:10.874 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:10.872+0000 7ff2d9f53700 1 -- 192.168.123.105:0/1723700245 wait complete. 
2026-03-10T07:52:10.874 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:10.873+0000 7ff2d9f53700 1 Processor -- start 2026-03-10T07:52:10.874 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:10.873+0000 7ff2d9f53700 1 -- start start 2026-03-10T07:52:10.874 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:10.873+0000 7ff2d9f53700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff2d4107d90 0x7ff2d4119f10 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:52:10.874 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:10.873+0000 7ff2d9f53700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7ff2d410f420 0x7ff2d4114f10 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:52:10.874 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:10.873+0000 7ff2d9f53700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7ff2d41154e0 con 0x7ff2d4107d90 2026-03-10T07:52:10.874 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:10.873+0000 7ff2d9f53700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7ff2d4115650 con 0x7ff2d410f420 2026-03-10T07:52:10.875 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:10.874+0000 7ff2d2ffd700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7ff2d410f420 0x7ff2d4114f10 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:52:10.875 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:10.874+0000 7ff2d2ffd700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7ff2d410f420 0x7ff2d4114f10 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.108:3300/0 says I am 
v2:192.168.123.105:42856/0 (socket says 192.168.123.105:42856) 2026-03-10T07:52:10.875 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:10.874+0000 7ff2d2ffd700 1 -- 192.168.123.105:0/1410473444 learned_addr learned my addr 192.168.123.105:0/1410473444 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T07:52:10.875 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:10.874+0000 7ff2d2ffd700 1 -- 192.168.123.105:0/1410473444 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff2d4107d90 msgr2=0x7ff2d4119f10 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:52:10.875 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:10.874+0000 7ff2d2ffd700 1 --2- 192.168.123.105:0/1410473444 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff2d4107d90 0x7ff2d4119f10 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:52:10.875 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:10.874+0000 7ff2d2ffd700 1 -- 192.168.123.105:0/1410473444 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7ff2cc00b050 con 0x7ff2d410f420 2026-03-10T07:52:10.875 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:10.874+0000 7ff2d2ffd700 1 --2- 192.168.123.105:0/1410473444 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7ff2d410f420 0x7ff2d4114f10 secure :-1 s=READY pgs=119 cs=0 l=1 rev1=1 crypto rx=0x7ff2cc012640 tx=0x7ff2cc012720 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:52:10.875 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:10.874+0000 7ff2d0ff9700 1 -- 192.168.123.105:0/1410473444 <== mon.1 v2:192.168.123.108:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7ff2cc00e040 con 0x7ff2d410f420 2026-03-10T07:52:10.876 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:10.874+0000 7ff2d9f53700 1 -- 
192.168.123.105:0/1410473444 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7ff2d41158d0 con 0x7ff2d410f420 2026-03-10T07:52:10.876 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:10.874+0000 7ff2d9f53700 1 -- 192.168.123.105:0/1410473444 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7ff2d41b84f0 con 0x7ff2d410f420 2026-03-10T07:52:10.876 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:10.875+0000 7ff2d0ff9700 1 -- 192.168.123.105:0/1410473444 <== mon.1 v2:192.168.123.108:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7ff2cc012e90 con 0x7ff2d410f420 2026-03-10T07:52:10.876 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:10.875+0000 7ff2d0ff9700 1 -- 192.168.123.105:0/1410473444 <== mon.1 v2:192.168.123.108:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7ff2cc0046e0 con 0x7ff2d410f420 2026-03-10T07:52:10.877 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:10.876+0000 7ff2d9f53700 1 -- 192.168.123.105:0/1410473444 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7ff2c0005320 con 0x7ff2d410f420 2026-03-10T07:52:10.878 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:10.876+0000 7ff2d0ff9700 1 -- 192.168.123.105:0/1410473444 <== mon.1 v2:192.168.123.108:3300/0 4 ==== mgrmap(e 31) v1 ==== 99964+0+0 (secure 0 0 0) 0x7ff2cc003c10 con 0x7ff2d410f420 2026-03-10T07:52:10.878 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:10.877+0000 7ff2d0ff9700 1 --2- 192.168.123.105:0/1410473444 >> [v2:192.168.123.105:6800/413688438,v1:192.168.123.105:6801/413688438] conn(0x7ff2bc0775a0 0x7ff2bc079a60 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:52:10.878 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:10.877+0000 7ff2d0ff9700 1 -- 192.168.123.105:0/1410473444 
<== mon.1 v2:192.168.123.108:3300/0 5 ==== osd_map(42..42 src has 1..42) v4 ==== 5878+0+0 (secure 0 0 0) 0x7ff2cc09b600 con 0x7ff2d410f420 2026-03-10T07:52:10.878 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:10.877+0000 7ff2d37fe700 1 --2- 192.168.123.105:0/1410473444 >> [v2:192.168.123.105:6800/413688438,v1:192.168.123.105:6801/413688438] conn(0x7ff2bc0775a0 0x7ff2bc079a60 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:52:10.879 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:10.878+0000 7ff2d37fe700 1 --2- 192.168.123.105:0/1410473444 >> [v2:192.168.123.105:6800/413688438,v1:192.168.123.105:6801/413688438] conn(0x7ff2bc0775a0 0x7ff2bc079a60 secure :-1 s=READY pgs=39 cs=0 l=1 rev1=1 crypto rx=0x7ff2c8009770 tx=0x7ff2c8006cd0 comp rx=0 tx=0).ready entity=mgr.14628 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:52:10.880 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:10.879+0000 7ff2d0ff9700 1 -- 192.168.123.105:0/1410473444 <== mon.1 v2:192.168.123.108:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+191549 (secure 0 0 0) 0x7ff2cc064330 con 0x7ff2d410f420 2026-03-10T07:52:11.036 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:11.035+0000 7ff2d9f53700 1 -- 192.168.123.105:0/1410473444 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command([{prefix=config set, name=mgr/orchestrator/fail_fs}] v 0) v1 -- 0x7ff2c0005cc0 con 0x7ff2d410f420 2026-03-10T07:52:11.053 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:11.050+0000 7ff2d0ff9700 1 -- 192.168.123.105:0/1410473444 <== mon.1 v2:192.168.123.108:3300/0 7 ==== mon_command_ack([{prefix=config set, name=mgr/orchestrator/fail_fs}]=0 v37) v1 ==== 125+0+0 (secure 0 0 0) 0x7ff2cc004050 con 0x7ff2d410f420 2026-03-10T07:52:11.054
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:11.053+0000 7ff2ba7fc700 1 -- 192.168.123.105:0/1410473444 >> [v2:192.168.123.105:6800/413688438,v1:192.168.123.105:6801/413688438] conn(0x7ff2bc0775a0 msgr2=0x7ff2bc079a60 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:52:11.054 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:11.053+0000 7ff2ba7fc700 1 --2- 192.168.123.105:0/1410473444 >> [v2:192.168.123.105:6800/413688438,v1:192.168.123.105:6801/413688438] conn(0x7ff2bc0775a0 0x7ff2bc079a60 secure :-1 s=READY pgs=39 cs=0 l=1 rev1=1 crypto rx=0x7ff2c8009770 tx=0x7ff2c8006cd0 comp rx=0 tx=0).stop 2026-03-10T07:52:11.054 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:11.053+0000 7ff2ba7fc700 1 -- 192.168.123.105:0/1410473444 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7ff2d410f420 msgr2=0x7ff2d4114f10 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:52:11.054 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:11.053+0000 7ff2ba7fc700 1 --2- 192.168.123.105:0/1410473444 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7ff2d410f420 0x7ff2d4114f10 secure :-1 s=READY pgs=119 cs=0 l=1 rev1=1 crypto rx=0x7ff2cc012640 tx=0x7ff2cc012720 comp rx=0 tx=0).stop 2026-03-10T07:52:11.054 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:11.053+0000 7ff2ba7fc700 1 -- 192.168.123.105:0/1410473444 shutdown_connections 2026-03-10T07:52:11.054 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:11.053+0000 7ff2ba7fc700 1 --2- 192.168.123.105:0/1410473444 >> [v2:192.168.123.105:6800/413688438,v1:192.168.123.105:6801/413688438] conn(0x7ff2bc0775a0 0x7ff2bc079a60 unknown :-1 s=CLOSED pgs=39 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:52:11.054 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:11.053+0000 7ff2ba7fc700 1 --2- 192.168.123.105:0/1410473444 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] 
conn(0x7ff2d4107d90 0x7ff2d4119f10 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:52:11.054 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:11.053+0000 7ff2ba7fc700 1 --2- 192.168.123.105:0/1410473444 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7ff2d410f420 0x7ff2d4114f10 unknown :-1 s=CLOSED pgs=119 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:52:11.054 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:11.053+0000 7ff2ba7fc700 1 -- 192.168.123.105:0/1410473444 >> 192.168.123.105:0/1410473444 conn(0x7ff2d406ce20 msgr2=0x7ff2d40700a0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:52:11.054 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:11.053+0000 7ff2ba7fc700 1 -- 192.168.123.105:0/1410473444 shutdown_connections 2026-03-10T07:52:11.054 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:11.053+0000 7ff2ba7fc700 1 -- 192.168.123.105:0/1410473444 wait complete. 2026-03-10T07:52:11.130 INFO:teuthology.task.sequential:In sequential, running task cephadm.shell... 
2026-03-10T07:52:11.130 INFO:tasks.cephadm:Running commands on role host.a host ubuntu@vm05.local 2026-03-10T07:52:11.130 DEBUG:teuthology.orchestra.run.vm05:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 12e9780e-1c55-11f1-8896-79f7c2e9b508 -e sha1=e911bdebe5c8faa3800735d1568fcdca65db60df -- bash -c 'ceph config set mon mon_warn_on_insecure_global_id_reclaim false --force' 2026-03-10T07:52:11.357 INFO:teuthology.orchestra.run.vm05.stderr:Inferring config /var/lib/ceph/12e9780e-1c55-11f1-8896-79f7c2e9b508/mon.vm05/config 2026-03-10T07:52:11.728 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:11.727+0000 7f5acf09b700 1 -- 192.168.123.105:0/4111365090 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f5ac810f340 msgr2=0x7f5ac810f720 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:52:11.729 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:11.727+0000 7f5acf09b700 1 --2- 192.168.123.105:0/4111365090 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f5ac810f340 0x7f5ac810f720 secure :-1 s=READY pgs=326 cs=0 l=1 rev1=1 crypto rx=0x7f5ac4009b00 tx=0x7f5ac4009e10 comp rx=0 tx=0).stop 2026-03-10T07:52:11.729 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:11.728+0000 7f5acf09b700 1 -- 192.168.123.105:0/4111365090 shutdown_connections 2026-03-10T07:52:11.729 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:11.728+0000 7f5acf09b700 1 --2- 192.168.123.105:0/4111365090 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f5ac810d0f0 0x7f5ac810d570 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:52:11.729 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:11.728+0000 7f5acf09b700 1 --2- 192.168.123.105:0/4111365090 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f5ac810f340 0x7f5ac810f720 unknown :-1 
s=CLOSED pgs=326 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:52:11.729 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:11.728+0000 7f5acf09b700 1 -- 192.168.123.105:0/4111365090 >> 192.168.123.105:0/4111365090 conn(0x7f5ac806ce20 msgr2=0x7f5ac806d230 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:52:11.729 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:11.728+0000 7f5acf09b700 1 -- 192.168.123.105:0/4111365090 shutdown_connections 2026-03-10T07:52:11.730 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:11.728+0000 7f5acf09b700 1 -- 192.168.123.105:0/4111365090 wait complete. 2026-03-10T07:52:11.730 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:11.729+0000 7f5acf09b700 1 Processor -- start 2026-03-10T07:52:11.731 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:11.729+0000 7f5acf09b700 1 -- start start 2026-03-10T07:52:11.731 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:11.729+0000 7f5acf09b700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f5ac810d0f0 0x7f5ac811beb0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:52:11.731 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:11.729+0000 7f5acf09b700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f5ac8116eb0 0x7f5ac8117330 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:52:11.731 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:11.729+0000 7f5acf09b700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f5ac8117870 con 0x7f5ac8116eb0 2026-03-10T07:52:11.732 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:11.729+0000 7f5acf09b700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f5ac81179e0 con 0x7f5ac810d0f0 2026-03-10T07:52:11.732 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:11.730+0000 7f5acd898700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f5ac8116eb0 0x7f5ac8117330 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:52:11.732 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:11.730+0000 7f5acd898700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f5ac8116eb0 0x7f5ac8117330 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.105:44656/0 (socket says 192.168.123.105:44656) 2026-03-10T07:52:11.732 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:11.730+0000 7f5acd898700 1 -- 192.168.123.105:0/3057153257 learned_addr learned my addr 192.168.123.105:0/3057153257 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T07:52:11.733 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:11.731+0000 7f5acd898700 1 -- 192.168.123.105:0/3057153257 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f5ac810d0f0 msgr2=0x7f5ac811beb0 unknown :-1 s=STATE_CONNECTING_RE l=1).mark_down 2026-03-10T07:52:11.733 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:11.731+0000 7f5acd898700 1 --2- 192.168.123.105:0/3057153257 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f5ac810d0f0 0x7f5ac811beb0 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:52:11.733 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:11.731+0000 7f5acd898700 1 -- 192.168.123.105:0/3057153257 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f5ac40097e0 con 0x7f5ac8116eb0 2026-03-10T07:52:11.733 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:11.732+0000 7f5acd898700 1 --2- 
192.168.123.105:0/3057153257 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f5ac8116eb0 0x7f5ac8117330 secure :-1 s=READY pgs=327 cs=0 l=1 rev1=1 crypto rx=0x7f5ac000c390 tx=0x7f5ac000c750 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:52:11.733 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:11.732+0000 7f5abf7fe700 1 -- 192.168.123.105:0/3057153257 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f5ac000e030 con 0x7f5ac8116eb0 2026-03-10T07:52:11.733 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:11.732+0000 7f5acf09b700 1 -- 192.168.123.105:0/3057153257 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f5ac8117cc0 con 0x7f5ac8116eb0 2026-03-10T07:52:11.734 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:11.732+0000 7f5acf09b700 1 -- 192.168.123.105:0/3057153257 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f5ac81b8680 con 0x7f5ac8116eb0 2026-03-10T07:52:11.734 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:11.733+0000 7f5abf7fe700 1 -- 192.168.123.105:0/3057153257 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7f5ac000f040 con 0x7f5ac8116eb0 2026-03-10T07:52:11.734 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:11.733+0000 7f5abf7fe700 1 -- 192.168.123.105:0/3057153257 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f5ac00146c0 con 0x7f5ac8116eb0 2026-03-10T07:52:11.736 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:11.735+0000 7f5abf7fe700 1 -- 192.168.123.105:0/3057153257 <== mon.0 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 31) v1 ==== 99964+0+0 (secure 0 0 0) 0x7f5ac0014900 con 0x7f5ac8116eb0 2026-03-10T07:52:11.737 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:11.735+0000 
7f5abf7fe700 1 --2- 192.168.123.105:0/3057153257 >> [v2:192.168.123.105:6800/413688438,v1:192.168.123.105:6801/413688438] conn(0x7f5ab4077660 0x7f5ab4079b20 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:52:11.737 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:11.736+0000 7f5ace099700 1 --2- 192.168.123.105:0/3057153257 >> [v2:192.168.123.105:6800/413688438,v1:192.168.123.105:6801/413688438] conn(0x7f5ab4077660 0x7f5ab4079b20 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:52:11.737 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:11.736+0000 7f5abf7fe700 1 -- 192.168.123.105:0/3057153257 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(42..42 src has 1..42) v4 ==== 5878+0+0 (secure 0 0 0) 0x7f5ac009a620 con 0x7f5ac8116eb0 2026-03-10T07:52:11.738 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:11.736+0000 7f5ace099700 1 --2- 192.168.123.105:0/3057153257 >> [v2:192.168.123.105:6800/413688438,v1:192.168.123.105:6801/413688438] conn(0x7f5ab4077660 0x7f5ab4079b20 secure :-1 s=READY pgs=40 cs=0 l=1 rev1=1 crypto rx=0x7f5ac4009ad0 tx=0x7f5ac4000bc0 comp rx=0 tx=0).ready entity=mgr.14628 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:52:11.738 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:11.737+0000 7f5acf09b700 1 -- 192.168.123.105:0/3057153257 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f5aac005320 con 0x7f5ac8116eb0 2026-03-10T07:52:11.745 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:11.743+0000 7f5abf7fe700 1 -- 192.168.123.105:0/3057153257 <== mon.0 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+191549 (secure 0 0 0) 0x7f5ac0063350 con 0x7f5ac8116eb0 2026-03-10T07:52:11.906 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:11.903+0000 7f5acf09b700 1 -- 192.168.123.105:0/3057153257 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command([{prefix=config set, name=mon_warn_on_insecure_global_id_reclaim}] v 0) v1 -- 0x7f5aac0059f0 con 0x7f5ac8116eb0 2026-03-10T07:52:11.908 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:11.907+0000 7f5abf7fe700 1 -- 192.168.123.105:0/3057153257 <== mon.0 v2:192.168.123.105:3300/0 7 ==== mon_command_ack([{prefix=config set, name=mon_warn_on_insecure_global_id_reclaim}]=0 v37) v1 ==== 155+0+0 (secure 0 0 0) 0x7f5ac0062aa0 con 0x7f5ac8116eb0 2026-03-10T07:52:11.914 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:11.912+0000 7f5abd77a700 1 -- 192.168.123.105:0/3057153257 >> [v2:192.168.123.105:6800/413688438,v1:192.168.123.105:6801/413688438] conn(0x7f5ab4077660 msgr2=0x7f5ab4079b20 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:52:11.914 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:11.912+0000 7f5abd77a700 1 --2- 192.168.123.105:0/3057153257 >> [v2:192.168.123.105:6800/413688438,v1:192.168.123.105:6801/413688438] conn(0x7f5ab4077660 0x7f5ab4079b20 secure :-1 s=READY pgs=40 cs=0 l=1 rev1=1 crypto rx=0x7f5ac4009ad0 tx=0x7f5ac4000bc0 comp rx=0 tx=0).stop 2026-03-10T07:52:11.914 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:11.912+0000 7f5abd77a700 1 -- 192.168.123.105:0/3057153257 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f5ac8116eb0 msgr2=0x7f5ac8117330 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:52:11.914 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:11.912+0000 7f5abd77a700 1 --2- 192.168.123.105:0/3057153257 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f5ac8116eb0 0x7f5ac8117330 secure :-1 s=READY pgs=327 cs=0 l=1 rev1=1 crypto rx=0x7f5ac000c390 tx=0x7f5ac000c750 comp rx=0 tx=0).stop 2026-03-10T07:52:11.914 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:11.913+0000 7f5abd77a700 1 -- 192.168.123.105:0/3057153257 shutdown_connections 2026-03-10T07:52:11.914 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:11.913+0000 7f5abd77a700 1 --2- 192.168.123.105:0/3057153257 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f5ac810d0f0 0x7f5ac811beb0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:52:11.914 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:11.913+0000 7f5abd77a700 1 --2- 192.168.123.105:0/3057153257 >> [v2:192.168.123.105:6800/413688438,v1:192.168.123.105:6801/413688438] conn(0x7f5ab4077660 0x7f5ab4079b20 unknown :-1 s=CLOSED pgs=40 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:52:11.915 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:11.913+0000 7f5abd77a700 1 --2- 192.168.123.105:0/3057153257 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f5ac8116eb0 0x7f5ac8117330 secure :-1 s=CLOSED pgs=327 cs=0 l=1 rev1=1 crypto rx=0x7f5ac000c390 tx=0x7f5ac000c750 comp rx=0 tx=0).stop 2026-03-10T07:52:11.915 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:11.913+0000 7f5abd77a700 1 -- 192.168.123.105:0/3057153257 >> 192.168.123.105:0/3057153257 conn(0x7f5ac806ce20 msgr2=0x7f5ac810ae50 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:52:11.915 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:11.914+0000 7f5abd77a700 1 -- 192.168.123.105:0/3057153257 shutdown_connections 2026-03-10T07:52:11.915 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:11.914+0000 7f5abd77a700 1 -- 192.168.123.105:0/3057153257 wait complete. 
2026-03-10T07:52:12.003 DEBUG:teuthology.orchestra.run.vm05:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 12e9780e-1c55-11f1-8896-79f7c2e9b508 -e sha1=e911bdebe5c8faa3800735d1568fcdca65db60df -- bash -c 'ceph config set mon mon_warn_on_insecure_global_id_reclaim_allowed false --force' 2026-03-10T07:52:12.160 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:12 vm05.local ceph-mon[50387]: pgmap v55: 65 pgs: 65 active+clean; 845 MiB data, 4.2 GiB used, 116 GiB / 120 GiB avail; 21 KiB/s rd, 1.1 MiB/s wr, 126 op/s 2026-03-10T07:52:12.160 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:12 vm05.local ceph-mon[50387]: from='client.? ' entity='client.admin' 2026-03-10T07:52:12.160 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:12 vm05.local ceph-mon[50387]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T07:52:12.160 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:12 vm05.local ceph-mon[50387]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T07:52:12.160 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:12 vm05.local ceph-mon[50387]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T07:52:12.160 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:12 vm05.local ceph-mon[50387]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' 2026-03-10T07:52:12.190 INFO:teuthology.orchestra.run.vm05.stderr:Inferring config /var/lib/ceph/12e9780e-1c55-11f1-8896-79f7c2e9b508/mon.vm05/config 2026-03-10T07:52:12.418 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:12 vm08.local ceph-mon[59917]: pgmap v55: 65 pgs: 65 active+clean; 845 MiB data, 4.2 GiB 
used, 116 GiB / 120 GiB avail; 21 KiB/s rd, 1.1 MiB/s wr, 126 op/s 2026-03-10T07:52:12.418 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:12 vm08.local ceph-mon[59917]: from='client.? ' entity='client.admin' 2026-03-10T07:52:12.418 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:12 vm08.local ceph-mon[59917]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T07:52:12.418 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:12 vm08.local ceph-mon[59917]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T07:52:12.418 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:12 vm08.local ceph-mon[59917]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T07:52:12.418 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:12 vm08.local ceph-mon[59917]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' 2026-03-10T07:52:12.597 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:12.595+0000 7fa66521c700 1 -- 192.168.123.105:0/3097209538 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fa660107ff0 msgr2=0x7fa6601083d0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:52:12.597 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:12.595+0000 7fa66521c700 1 --2- 192.168.123.105:0/3097209538 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fa660107ff0 0x7fa6601083d0 secure :-1 s=READY pgs=328 cs=0 l=1 rev1=1 crypto rx=0x7fa650007780 tx=0x7fa650007a90 comp rx=0 tx=0).stop 2026-03-10T07:52:12.597 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:12.595+0000 7fa66521c700 1 -- 192.168.123.105:0/3097209538 shutdown_connections 2026-03-10T07:52:12.597 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:12.595+0000 7fa66521c700 1 --2- 192.168.123.105:0/3097209538 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fa6601089a0 0x7fa66010be70 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:52:12.597 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:12.595+0000 7fa66521c700 1 --2- 192.168.123.105:0/3097209538 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fa660107ff0 0x7fa6601083d0 unknown :-1 s=CLOSED pgs=328 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:52:12.597 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:12.595+0000 7fa66521c700 1 -- 192.168.123.105:0/3097209538 >> 192.168.123.105:0/3097209538 conn(0x7fa66006ce20 msgr2=0x7fa66006d230 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:52:12.597 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:12.595+0000 7fa66521c700 1 -- 192.168.123.105:0/3097209538 shutdown_connections 2026-03-10T07:52:12.597 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:12.595+0000 7fa66521c700 1 -- 192.168.123.105:0/3097209538 wait complete. 
2026-03-10T07:52:12.597 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:12.595+0000 7fa66521c700 1 Processor -- start 2026-03-10T07:52:12.597 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:12.595+0000 7fa66521c700 1 -- start start 2026-03-10T07:52:12.597 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:12.596+0000 7fa66521c700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fa6601089a0 0x7fa66007cd70 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:52:12.597 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:12.596+0000 7fa66521c700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fa66007d2b0 0x7fa66007d730 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:52:12.597 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:12.596+0000 7fa66521c700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fa660083de0 con 0x7fa6601089a0 2026-03-10T07:52:12.597 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:12.596+0000 7fa66521c700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fa6600818f0 con 0x7fa66007d2b0 2026-03-10T07:52:12.598 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:12.597+0000 7fa65e59c700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fa66007d2b0 0x7fa66007d730 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:52:12.598 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:12.597+0000 7fa65e59c700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fa66007d2b0 0x7fa66007d730 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.108:3300/0 says I am 
v2:192.168.123.105:42890/0 (socket says 192.168.123.105:42890) 2026-03-10T07:52:12.598 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:12.597+0000 7fa65e59c700 1 -- 192.168.123.105:0/713797968 learned_addr learned my addr 192.168.123.105:0/713797968 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T07:52:12.598 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:12.597+0000 7fa65ed9d700 1 --2- 192.168.123.105:0/713797968 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fa6601089a0 0x7fa66007cd70 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:52:12.598 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:12.597+0000 7fa65e59c700 1 -- 192.168.123.105:0/713797968 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fa6601089a0 msgr2=0x7fa66007cd70 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:52:12.598 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:12.597+0000 7fa65e59c700 1 --2- 192.168.123.105:0/713797968 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fa6601089a0 0x7fa66007cd70 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:52:12.598 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:12.597+0000 7fa65e59c700 1 -- 192.168.123.105:0/713797968 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fa650007430 con 0x7fa66007d2b0 2026-03-10T07:52:12.599 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:12.597+0000 7fa65e59c700 1 --2- 192.168.123.105:0/713797968 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fa66007d2b0 0x7fa66007d730 secure :-1 s=READY pgs=120 cs=0 l=1 rev1=1 crypto rx=0x7fa65800c370 tx=0x7fa65800c730 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 
2026-03-10T07:52:12.599 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:12.597+0000 7fa647fff700 1 -- 192.168.123.105:0/713797968 <== mon.1 v2:192.168.123.108:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fa65800e050 con 0x7fa66007d2b0 2026-03-10T07:52:12.600 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:12.598+0000 7fa66521c700 1 -- 192.168.123.105:0/713797968 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fa660081b70 con 0x7fa66007d2b0 2026-03-10T07:52:12.600 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:12.598+0000 7fa66521c700 1 -- 192.168.123.105:0/713797968 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fa6600820c0 con 0x7fa66007d2b0 2026-03-10T07:52:12.600 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:12.598+0000 7fa647fff700 1 -- 192.168.123.105:0/713797968 <== mon.1 v2:192.168.123.108:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7fa65800f040 con 0x7fa66007d2b0 2026-03-10T07:52:12.600 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:12.598+0000 7fa647fff700 1 -- 192.168.123.105:0/713797968 <== mon.1 v2:192.168.123.108:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fa658013610 con 0x7fa66007d2b0 2026-03-10T07:52:12.600 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:12.599+0000 7fa66521c700 1 -- 192.168.123.105:0/713797968 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fa64c005320 con 0x7fa66007d2b0 2026-03-10T07:52:12.602 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:12.600+0000 7fa647fff700 1 -- 192.168.123.105:0/713797968 <== mon.1 v2:192.168.123.108:3300/0 4 ==== mgrmap(e 31) v1 ==== 99964+0+0 (secure 0 0 0) 0x7fa6580090d0 con 0x7fa66007d2b0 2026-03-10T07:52:12.602 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:12.601+0000 
7fa647fff700 1 --2- 192.168.123.105:0/713797968 >> [v2:192.168.123.105:6800/413688438,v1:192.168.123.105:6801/413688438] conn(0x7fa648077590 0x7fa648079a50 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:52:12.602 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:12.601+0000 7fa65ed9d700 1 --2- 192.168.123.105:0/713797968 >> [v2:192.168.123.105:6800/413688438,v1:192.168.123.105:6801/413688438] conn(0x7fa648077590 0x7fa648079a50 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:52:12.603 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:12.601+0000 7fa647fff700 1 -- 192.168.123.105:0/713797968 <== mon.1 v2:192.168.123.108:3300/0 5 ==== osd_map(42..42 src has 1..42) v4 ==== 5878+0+0 (secure 0 0 0) 0x7fa658099760 con 0x7fa66007d2b0 2026-03-10T07:52:12.604 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:12.602+0000 7fa65ed9d700 1 --2- 192.168.123.105:0/713797968 >> [v2:192.168.123.105:6800/413688438,v1:192.168.123.105:6801/413688438] conn(0x7fa648077590 0x7fa648079a50 secure :-1 s=READY pgs=41 cs=0 l=1 rev1=1 crypto rx=0x7fa650007e60 tx=0x7fa650005a90 comp rx=0 tx=0).ready entity=mgr.14628 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:52:12.609 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:12.605+0000 7fa647fff700 1 -- 192.168.123.105:0/713797968 <== mon.1 v2:192.168.123.108:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+191549 (secure 0 0 0) 0x7fa658062490 con 0x7fa66007d2b0 2026-03-10T07:52:12.749 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:12.747+0000 7fa66521c700 1 -- 192.168.123.105:0/713797968 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command([{prefix=config set, name=mon_warn_on_insecure_global_id_reclaim_allowed}] v 0) v1 -- 0x7fa64c005190 con 0x7fa66007d2b0 
2026-03-10T07:52:12.750 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:12.749+0000 7fa647fff700 1 -- 192.168.123.105:0/713797968 <== mon.1 v2:192.168.123.108:3300/0 7 ==== mon_command_ack([{prefix=config set, name=mon_warn_on_insecure_global_id_reclaim_allowed}]=0 v37) v1 ==== 163+0+0 (secure 0 0 0) 0x7fa658061be0 con 0x7fa66007d2b0 2026-03-10T07:52:12.753 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:12.751+0000 7fa645ffb700 1 -- 192.168.123.105:0/713797968 >> [v2:192.168.123.105:6800/413688438,v1:192.168.123.105:6801/413688438] conn(0x7fa648077590 msgr2=0x7fa648079a50 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:52:12.753 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:12.751+0000 7fa645ffb700 1 --2- 192.168.123.105:0/713797968 >> [v2:192.168.123.105:6800/413688438,v1:192.168.123.105:6801/413688438] conn(0x7fa648077590 0x7fa648079a50 secure :-1 s=READY pgs=41 cs=0 l=1 rev1=1 crypto rx=0x7fa650007e60 tx=0x7fa650005a90 comp rx=0 tx=0).stop 2026-03-10T07:52:12.753 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:12.751+0000 7fa645ffb700 1 -- 192.168.123.105:0/713797968 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fa66007d2b0 msgr2=0x7fa66007d730 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:52:12.753 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:12.751+0000 7fa645ffb700 1 --2- 192.168.123.105:0/713797968 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fa66007d2b0 0x7fa66007d730 secure :-1 s=READY pgs=120 cs=0 l=1 rev1=1 crypto rx=0x7fa65800c370 tx=0x7fa65800c730 comp rx=0 tx=0).stop 2026-03-10T07:52:12.753 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:12.752+0000 7fa645ffb700 1 -- 192.168.123.105:0/713797968 shutdown_connections 2026-03-10T07:52:12.753 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:12.752+0000 7fa645ffb700 1 --2- 192.168.123.105:0/713797968 >> 
[v2:192.168.123.105:6800/413688438,v1:192.168.123.105:6801/413688438] conn(0x7fa648077590 0x7fa648079a50 unknown :-1 s=CLOSED pgs=41 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:52:12.753 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:12.752+0000 7fa645ffb700 1 --2- 192.168.123.105:0/713797968 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fa6601089a0 0x7fa66007cd70 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:52:12.753 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:12.752+0000 7fa645ffb700 1 --2- 192.168.123.105:0/713797968 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fa66007d2b0 0x7fa66007d730 unknown :-1 s=CLOSED pgs=120 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:52:12.753 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:12.752+0000 7fa645ffb700 1 -- 192.168.123.105:0/713797968 >> 192.168.123.105:0/713797968 conn(0x7fa66006ce20 msgr2=0x7fa660071590 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:52:12.753 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:12.752+0000 7fa645ffb700 1 -- 192.168.123.105:0/713797968 shutdown_connections 2026-03-10T07:52:12.753 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:12.752+0000 7fa645ffb700 1 -- 192.168.123.105:0/713797968 wait complete. 
2026-03-10T07:52:12.820 DEBUG:teuthology.orchestra.run.vm05:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 12e9780e-1c55-11f1-8896-79f7c2e9b508 -e sha1=e911bdebe5c8faa3800735d1568fcdca65db60df -- bash -c 'ceph config set global log_to_journald false --force' 2026-03-10T07:52:13.016 INFO:teuthology.orchestra.run.vm05.stderr:Inferring config /var/lib/ceph/12e9780e-1c55-11f1-8896-79f7c2e9b508/mon.vm05/config 2026-03-10T07:52:13.393 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:13 vm05.local ceph-mon[50387]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T07:52:13.393 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:13.391+0000 7f34f7712700 1 -- 192.168.123.105:0/2238139289 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f34f01089a0 msgr2=0x7f34f010be70 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:52:13.393 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:13.391+0000 7f34f7712700 1 --2- 192.168.123.105:0/2238139289 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f34f01089a0 0x7f34f010be70 secure :-1 s=READY pgs=121 cs=0 l=1 rev1=1 crypto rx=0x7f34e800b210 tx=0x7f34e800b520 comp rx=0 tx=0).stop 2026-03-10T07:52:13.393 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:13.391+0000 7f34f7712700 1 -- 192.168.123.105:0/2238139289 shutdown_connections 2026-03-10T07:52:13.393 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:13.391+0000 7f34f7712700 1 --2- 192.168.123.105:0/2238139289 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f34f01089a0 0x7f34f010be70 unknown :-1 s=CLOSED pgs=121 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:52:13.393 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:13.391+0000 7f34f7712700 1 --2- 
192.168.123.105:0/2238139289 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f34f0107ff0 0x7f34f01083d0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:52:13.393 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:13.391+0000 7f34f7712700 1 -- 192.168.123.105:0/2238139289 >> 192.168.123.105:0/2238139289 conn(0x7f34f006ce20 msgr2=0x7f34f006d230 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:52:13.393 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:13.391+0000 7f34f7712700 1 -- 192.168.123.105:0/2238139289 shutdown_connections 2026-03-10T07:52:13.393 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:13.391+0000 7f34f7712700 1 -- 192.168.123.105:0/2238139289 wait complete. 2026-03-10T07:52:13.393 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:13.392+0000 7f34f7712700 1 Processor -- start 2026-03-10T07:52:13.393 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:13.392+0000 7f34f7712700 1 -- start start 2026-03-10T07:52:13.393 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:13.392+0000 7f34f7712700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f34f0107ff0 0x7f34f007cf80 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:52:13.393 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:13.392+0000 7f34f7712700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f34f007d4c0 0x7f34f007d940 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:52:13.393 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:13.392+0000 7f34f7712700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f34f0081a60 con 0x7f34f007d4c0 2026-03-10T07:52:13.393 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:13.392+0000 7f34f7712700 1 -- --> 
[v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f34f0081bd0 con 0x7f34f0107ff0 2026-03-10T07:52:13.393 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:13.392+0000 7f34f54ae700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f34f0107ff0 0x7f34f007cf80 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:52:13.393 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:13.392+0000 7f34f54ae700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f34f0107ff0 0x7f34f007cf80 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.108:3300/0 says I am v2:192.168.123.105:42898/0 (socket says 192.168.123.105:42898) 2026-03-10T07:52:13.393 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:13.392+0000 7f34f54ae700 1 -- 192.168.123.105:0/2776632656 learned_addr learned my addr 192.168.123.105:0/2776632656 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T07:52:13.394 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:13.392+0000 7f34f4cad700 1 --2- 192.168.123.105:0/2776632656 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f34f007d4c0 0x7f34f007d940 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:52:13.394 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:13.392+0000 7f34f54ae700 1 -- 192.168.123.105:0/2776632656 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f34f007d4c0 msgr2=0x7f34f007d940 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:52:13.394 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:13.392+0000 7f34f54ae700 1 --2- 192.168.123.105:0/2776632656 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] 
conn(0x7f34f007d4c0 0x7f34f007d940 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:52:13.394 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:13.392+0000 7f34f54ae700 1 -- 192.168.123.105:0/2776632656 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f34e8009e30 con 0x7f34f0107ff0 2026-03-10T07:52:13.394 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:13.392+0000 7f34f54ae700 1 --2- 192.168.123.105:0/2776632656 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f34f0107ff0 0x7f34f007cf80 secure :-1 s=READY pgs=122 cs=0 l=1 rev1=1 crypto rx=0x7f34ec00ab70 tx=0x7f34ec00e3b0 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:52:13.394 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:13.393+0000 7f34e67fc700 1 -- 192.168.123.105:0/2776632656 <== mon.1 v2:192.168.123.108:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f34ec00a4f0 con 0x7f34f0107ff0 2026-03-10T07:52:13.394 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:13.393+0000 7f34f7712700 1 -- 192.168.123.105:0/2776632656 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f34f0081eb0 con 0x7f34f0107ff0 2026-03-10T07:52:13.395 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:13.393+0000 7f34f7712700 1 -- 192.168.123.105:0/2776632656 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f34f0082400 con 0x7f34f0107ff0 2026-03-10T07:52:13.395 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:13.393+0000 7f34e67fc700 1 -- 192.168.123.105:0/2776632656 <== mon.1 v2:192.168.123.108:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7f34ec00ed00 con 0x7f34f0107ff0 2026-03-10T07:52:13.395 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:13.393+0000 7f34e67fc700 1 -- 
192.168.123.105:0/2776632656 <== mon.1 v2:192.168.123.108:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f34ec01b9e0 con 0x7f34f0107ff0 2026-03-10T07:52:13.396 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:13.395+0000 7f34e67fc700 1 -- 192.168.123.105:0/2776632656 <== mon.1 v2:192.168.123.108:3300/0 4 ==== mgrmap(e 31) v1 ==== 99964+0+0 (secure 0 0 0) 0x7f34ec01bb40 con 0x7f34f0107ff0 2026-03-10T07:52:13.396 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:13.395+0000 7f34e67fc700 1 --2- 192.168.123.105:0/2776632656 >> [v2:192.168.123.105:6800/413688438,v1:192.168.123.105:6801/413688438] conn(0x7f34dc077660 0x7f34dc079b20 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:52:13.397 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:13.396+0000 7f34e67fc700 1 -- 192.168.123.105:0/2776632656 <== mon.1 v2:192.168.123.108:3300/0 5 ==== osd_map(42..42 src has 1..42) v4 ==== 5878+0+0 (secure 0 0 0) 0x7f34ec09e2e0 con 0x7f34f0107ff0 2026-03-10T07:52:13.398 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:13.396+0000 7f34f7712700 1 -- 192.168.123.105:0/2776632656 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f34d4005320 con 0x7f34f0107ff0 2026-03-10T07:52:13.399 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:13.397+0000 7f34f4cad700 1 --2- 192.168.123.105:0/2776632656 >> [v2:192.168.123.105:6800/413688438,v1:192.168.123.105:6801/413688438] conn(0x7f34dc077660 0x7f34dc079b20 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:52:13.400 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:13.398+0000 7f34f4cad700 1 --2- 192.168.123.105:0/2776632656 >> [v2:192.168.123.105:6800/413688438,v1:192.168.123.105:6801/413688438] conn(0x7f34dc077660 0x7f34dc079b20 secure :-1 s=READY 
pgs=42 cs=0 l=1 rev1=1 crypto rx=0x7f34e8000f80 tx=0x7f34e8014040 comp rx=0 tx=0).ready entity=mgr.14628 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:52:13.401 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:13.399+0000 7f34e67fc700 1 -- 192.168.123.105:0/2776632656 <== mon.1 v2:192.168.123.108:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+191549 (secure 0 0 0) 0x7f34ec0670c0 con 0x7f34f0107ff0 2026-03-10T07:52:13.418 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:13 vm08.local ceph-mon[59917]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T07:52:13.563 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:13.561+0000 7f34f7712700 1 -- 192.168.123.105:0/2776632656 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command([{prefix=config set, name=log_to_journald}] v 0) v1 -- 0x7f34d4005f70 con 0x7f34f0107ff0 2026-03-10T07:52:13.564 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:13.562+0000 7f34e67fc700 1 -- 192.168.123.105:0/2776632656 <== mon.1 v2:192.168.123.108:3300/0 7 ==== mon_command_ack([{prefix=config set, name=log_to_journald}]=0 v37) v1 ==== 135+0+0 (secure 0 0 0) 0x7f34ec066810 con 0x7f34f0107ff0 2026-03-10T07:52:13.566 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:13.565+0000 7f34f7712700 1 -- 192.168.123.105:0/2776632656 >> [v2:192.168.123.105:6800/413688438,v1:192.168.123.105:6801/413688438] conn(0x7f34dc077660 msgr2=0x7f34dc079b20 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:52:13.566 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:13.565+0000 7f34f7712700 1 --2- 192.168.123.105:0/2776632656 >> [v2:192.168.123.105:6800/413688438,v1:192.168.123.105:6801/413688438] conn(0x7f34dc077660 0x7f34dc079b20 secure :-1 s=READY pgs=42 cs=0 l=1 rev1=1 crypto rx=0x7f34e8000f80 tx=0x7f34e8014040 
comp rx=0 tx=0).stop 2026-03-10T07:52:13.567 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:13.565+0000 7f34f7712700 1 -- 192.168.123.105:0/2776632656 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f34f0107ff0 msgr2=0x7f34f007cf80 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:52:13.567 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:13.565+0000 7f34f7712700 1 --2- 192.168.123.105:0/2776632656 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f34f0107ff0 0x7f34f007cf80 secure :-1 s=READY pgs=122 cs=0 l=1 rev1=1 crypto rx=0x7f34ec00ab70 tx=0x7f34ec00e3b0 comp rx=0 tx=0).stop 2026-03-10T07:52:13.567 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:13.565+0000 7f34f7712700 1 -- 192.168.123.105:0/2776632656 shutdown_connections 2026-03-10T07:52:13.567 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:13.565+0000 7f34f7712700 1 --2- 192.168.123.105:0/2776632656 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f34f0107ff0 0x7f34f007cf80 unknown :-1 s=CLOSED pgs=122 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:52:13.567 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:13.565+0000 7f34f7712700 1 --2- 192.168.123.105:0/2776632656 >> [v2:192.168.123.105:6800/413688438,v1:192.168.123.105:6801/413688438] conn(0x7f34dc077660 0x7f34dc079b20 unknown :-1 s=CLOSED pgs=42 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:52:13.567 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:13.566+0000 7f34f7712700 1 --2- 192.168.123.105:0/2776632656 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f34f007d4c0 0x7f34f007d940 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:52:13.567 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:13.566+0000 7f34f7712700 1 -- 192.168.123.105:0/2776632656 >> 192.168.123.105:0/2776632656 conn(0x7f34f006ce20 
msgr2=0x7f34f00705f0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:52:13.567 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:13.566+0000 7f34f7712700 1 -- 192.168.123.105:0/2776632656 shutdown_connections 2026-03-10T07:52:13.567 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:13.566+0000 7f34f7712700 1 -- 192.168.123.105:0/2776632656 wait complete. 2026-03-10T07:52:13.601 INFO:teuthology.orchestra.run:Running command with timeout 3600 2026-03-10T07:52:13.601 DEBUG:teuthology.orchestra.run.vm08:> sudo rm -rf -- /home/ubuntu/cephtest/mnt.1/client.1/tmp 2026-03-10T07:52:13.624 DEBUG:teuthology.orchestra.run.vm05:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 12e9780e-1c55-11f1-8896-79f7c2e9b508 -e sha1=e911bdebe5c8faa3800735d1568fcdca65db60df -- bash -c 'ceph orch upgrade start --image quay.ceph.io/ceph-ci/ceph:$sha1' 2026-03-10T07:52:13.838 INFO:teuthology.orchestra.run.vm05.stderr:Inferring config /var/lib/ceph/12e9780e-1c55-11f1-8896-79f7c2e9b508/mon.vm05/config 2026-03-10T07:52:14.252 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:14.248+0000 7f61c2c7f700 1 -- 192.168.123.105:0/3315711196 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f61bc107ff0 msgr2=0x7f61bc1083d0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:52:14.252 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:14 vm05.local ceph-mon[50387]: pgmap v56: 65 pgs: 65 active+clean; 845 MiB data, 4.2 GiB used, 116 GiB / 120 GiB avail; 21 KiB/s rd, 1.1 MiB/s wr, 126 op/s 2026-03-10T07:52:14.253 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:14.248+0000 7f61c2c7f700 1 --2- 192.168.123.105:0/3315711196 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f61bc107ff0 0x7f61bc1083d0 secure :-1 s=READY pgs=329 cs=0 l=1 rev1=1 crypto rx=0x7f61ac00bc70 tx=0x7f61ac00bf80 comp rx=0 tx=0).stop 
2026-03-10T07:52:14.253 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:14.250+0000 7f61c2c7f700 1 -- 192.168.123.105:0/3315711196 shutdown_connections 2026-03-10T07:52:14.254 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:14.250+0000 7f61c2c7f700 1 --2- 192.168.123.105:0/3315711196 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f61bc1089a0 0x7f61bc10be70 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:52:14.254 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:14.250+0000 7f61c2c7f700 1 --2- 192.168.123.105:0/3315711196 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f61bc107ff0 0x7f61bc1083d0 unknown :-1 s=CLOSED pgs=329 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:52:14.254 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:14.250+0000 7f61c2c7f700 1 -- 192.168.123.105:0/3315711196 >> 192.168.123.105:0/3315711196 conn(0x7f61bc06ce20 msgr2=0x7f61bc06d230 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:52:14.254 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:14.250+0000 7f61c2c7f700 1 -- 192.168.123.105:0/3315711196 shutdown_connections 2026-03-10T07:52:14.254 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:14.250+0000 7f61c2c7f700 1 -- 192.168.123.105:0/3315711196 wait complete. 
2026-03-10T07:52:14.254 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:14.250+0000 7f61c2c7f700 1 Processor -- start 2026-03-10T07:52:14.254 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:14.250+0000 7f61c2c7f700 1 -- start start 2026-03-10T07:52:14.254 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:14.250+0000 7f61c2c7f700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f61bc1089a0 0x7f61bc07ceb0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:52:14.254 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:14.250+0000 7f61c2c7f700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f61bc07d3f0 0x7f61bc07d870 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:52:14.254 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:14.250+0000 7f61c2c7f700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f61bc081a30 con 0x7f61bc07d3f0 2026-03-10T07:52:14.254 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:14.250+0000 7f61c2c7f700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f61bc081ba0 con 0x7f61bc1089a0 2026-03-10T07:52:14.254 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:14.251+0000 7f61c0a1b700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f61bc1089a0 0x7f61bc07ceb0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:52:14.254 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:14.252+0000 7f61bbfff700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f61bc07d3f0 0x7f61bc07d870 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 
2026-03-10T07:52:14.254 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:14.252+0000 7f61bbfff700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f61bc07d3f0 0x7f61bc07d870 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.105:44712/0 (socket says 192.168.123.105:44712) 2026-03-10T07:52:14.254 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:14.252+0000 7f61bbfff700 1 -- 192.168.123.105:0/3252184446 learned_addr learned my addr 192.168.123.105:0/3252184446 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T07:52:14.255 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:14.253+0000 7f61c0a1b700 1 -- 192.168.123.105:0/3252184446 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f61bc07d3f0 msgr2=0x7f61bc07d870 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:52:14.255 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:14.253+0000 7f61c0a1b700 1 --2- 192.168.123.105:0/3252184446 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f61bc07d3f0 0x7f61bc07d870 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:52:14.255 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:14.253+0000 7f61c0a1b700 1 -- 192.168.123.105:0/3252184446 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f61ac00b920 con 0x7f61bc1089a0 2026-03-10T07:52:14.255 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:14.253+0000 7f61c0a1b700 1 --2- 192.168.123.105:0/3252184446 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f61bc1089a0 0x7f61bc07ceb0 secure :-1 s=READY pgs=123 cs=0 l=1 rev1=1 crypto rx=0x7f61ac003a40 tx=0x7f61ac00df50 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:52:14.255 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:14.253+0000 7f61b9ffb700 1 -- 192.168.123.105:0/3252184446 <== mon.1 v2:192.168.123.108:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f61ac010040 con 0x7f61bc1089a0 2026-03-10T07:52:14.255 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:14.254+0000 7f61c2c7f700 1 -- 192.168.123.105:0/3252184446 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f61bc081e20 con 0x7f61bc1089a0 2026-03-10T07:52:14.256 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:14.254+0000 7f61c2c7f700 1 -- 192.168.123.105:0/3252184446 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f61bc082310 con 0x7f61bc1089a0 2026-03-10T07:52:14.257 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:14.255+0000 7f61b9ffb700 1 -- 192.168.123.105:0/3252184446 <== mon.1 v2:192.168.123.108:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7f61ac01e410 con 0x7f61bc1089a0 2026-03-10T07:52:14.259 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:14.255+0000 7f61b9ffb700 1 -- 192.168.123.105:0/3252184446 <== mon.1 v2:192.168.123.108:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f61ac014ca0 con 0x7f61bc1089a0 2026-03-10T07:52:14.259 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:14.255+0000 7f61b9ffb700 1 -- 192.168.123.105:0/3252184446 <== mon.1 v2:192.168.123.108:3300/0 4 ==== mgrmap(e 31) v1 ==== 99964+0+0 (secure 0 0 0) 0x7f61ac01d430 con 0x7f61bc1089a0 2026-03-10T07:52:14.259 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:14.256+0000 7f61b9ffb700 1 --2- 192.168.123.105:0/3252184446 >> [v2:192.168.123.105:6800/413688438,v1:192.168.123.105:6801/413688438] conn(0x7f61a4077730 0x7f61a4079bf0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:52:14.259 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:14.256+0000 7f61b9ffb700 1 -- 192.168.123.105:0/3252184446 <== mon.1 v2:192.168.123.108:3300/0 5 ==== osd_map(42..42 src has 1..42) v4 ==== 5878+0+0 (secure 0 0 0) 0x7f61ac0a4380 con 0x7f61bc1089a0 2026-03-10T07:52:14.259 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:14.256+0000 7f61bbfff700 1 --2- 192.168.123.105:0/3252184446 >> [v2:192.168.123.105:6800/413688438,v1:192.168.123.105:6801/413688438] conn(0x7f61a4077730 0x7f61a4079bf0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:52:14.260 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:14.259+0000 7f61bbfff700 1 --2- 192.168.123.105:0/3252184446 >> [v2:192.168.123.105:6800/413688438,v1:192.168.123.105:6801/413688438] conn(0x7f61a4077730 0x7f61a4079bf0 secure :-1 s=READY pgs=43 cs=0 l=1 rev1=1 crypto rx=0x7f61b4003eb0 tx=0x7f61b4008be0 comp rx=0 tx=0).ready entity=mgr.14628 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:52:14.260 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:14.259+0000 7f61c2c7f700 1 -- 192.168.123.105:0/3252184446 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f61a8005320 con 0x7f61bc1089a0 2026-03-10T07:52:14.267 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:14.262+0000 7f61b9ffb700 1 -- 192.168.123.105:0/3252184446 <== mon.1 v2:192.168.123.108:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+191549 (secure 0 0 0) 0x7f61ac06d0b0 con 0x7f61bc1089a0 2026-03-10T07:52:14.413 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:14.411+0000 7f61c2c7f700 1 -- 192.168.123.105:0/3252184446 --> [v2:192.168.123.105:6800/413688438,v1:192.168.123.105:6801/413688438] -- mgr_command(tid 0: {"prefix": "orch upgrade start", "image": 
"quay.ceph.io/ceph-ci/ceph:e911bdebe5c8faa3800735d1568fcdca65db60df", "target": ["mon-mgr", ""]}) v1 -- 0x7f61a8000c90 con 0x7f61a4077730 2026-03-10T07:52:14.418 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:14 vm08.local ceph-mon[59917]: pgmap v56: 65 pgs: 65 active+clean; 845 MiB data, 4.2 GiB used, 116 GiB / 120 GiB avail; 21 KiB/s rd, 1.1 MiB/s wr, 126 op/s 2026-03-10T07:52:14.497 INFO:teuthology.orchestra.run.vm05.stdout:Initiating upgrade to quay.ceph.io/ceph-ci/ceph:e911bdebe5c8faa3800735d1568fcdca65db60df 2026-03-10T07:52:14.497 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:14.493+0000 7f61b9ffb700 1 -- 192.168.123.105:0/3252184446 <== mgr.14628 v2:192.168.123.105:6800/413688438 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+89 (secure 0 0 0) 0x7f61a8000c90 con 0x7f61a4077730 2026-03-10T07:52:14.498 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:14.496+0000 7f61c2c7f700 1 -- 192.168.123.105:0/3252184446 >> [v2:192.168.123.105:6800/413688438,v1:192.168.123.105:6801/413688438] conn(0x7f61a4077730 msgr2=0x7f61a4079bf0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:52:14.498 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:14.496+0000 7f61c2c7f700 1 --2- 192.168.123.105:0/3252184446 >> [v2:192.168.123.105:6800/413688438,v1:192.168.123.105:6801/413688438] conn(0x7f61a4077730 0x7f61a4079bf0 secure :-1 s=READY pgs=43 cs=0 l=1 rev1=1 crypto rx=0x7f61b4003eb0 tx=0x7f61b4008be0 comp rx=0 tx=0).stop 2026-03-10T07:52:14.498 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:14.496+0000 7f61c2c7f700 1 -- 192.168.123.105:0/3252184446 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f61bc1089a0 msgr2=0x7f61bc07ceb0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:52:14.498 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:14.496+0000 7f61c2c7f700 1 --2- 192.168.123.105:0/3252184446 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] 
conn(0x7f61bc1089a0 0x7f61bc07ceb0 secure :-1 s=READY pgs=123 cs=0 l=1 rev1=1 crypto rx=0x7f61ac003a40 tx=0x7f61ac00df50 comp rx=0 tx=0).stop 2026-03-10T07:52:14.498 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:14.496+0000 7f61c2c7f700 1 -- 192.168.123.105:0/3252184446 shutdown_connections 2026-03-10T07:52:14.498 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:14.496+0000 7f61c2c7f700 1 --2- 192.168.123.105:0/3252184446 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f61bc1089a0 0x7f61bc07ceb0 unknown :-1 s=CLOSED pgs=123 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:52:14.498 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:14.496+0000 7f61c2c7f700 1 --2- 192.168.123.105:0/3252184446 >> [v2:192.168.123.105:6800/413688438,v1:192.168.123.105:6801/413688438] conn(0x7f61a4077730 0x7f61a4079bf0 unknown :-1 s=CLOSED pgs=43 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:52:14.498 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:14.496+0000 7f61c2c7f700 1 --2- 192.168.123.105:0/3252184446 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f61bc07d3f0 0x7f61bc07d870 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:52:14.498 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:14.496+0000 7f61c2c7f700 1 -- 192.168.123.105:0/3252184446 >> 192.168.123.105:0/3252184446 conn(0x7f61bc06ce20 msgr2=0x7f61bc0705d0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:52:14.498 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:14.497+0000 7f61c2c7f700 1 -- 192.168.123.105:0/3252184446 shutdown_connections 2026-03-10T07:52:14.498 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:14.497+0000 7f61c2c7f700 1 -- 192.168.123.105:0/3252184446 wait complete. 2026-03-10T07:52:14.568 INFO:teuthology.task.sequential:In sequential, running task cephadm.shell... 
2026-03-10T07:52:14.568 INFO:tasks.cephadm:Running commands on role host.a host ubuntu@vm05.local 2026-03-10T07:52:14.568 DEBUG:teuthology.orchestra.run.vm05:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 12e9780e-1c55-11f1-8896-79f7c2e9b508 -e sha1=e911bdebe5c8faa3800735d1568fcdca65db60df -- bash -c 'while ceph orch upgrade status | jq '"'"'.in_progress'"'"' | grep true && ! ceph orch upgrade status | jq '"'"'.message'"'"' | grep Error ; do ceph orch ps ; ceph versions ; ceph fs dump; ceph orch upgrade status ; ceph health detail ; sleep 30 ; done' 2026-03-10T07:52:14.839 INFO:teuthology.orchestra.run.vm05.stderr:Inferring config /var/lib/ceph/12e9780e-1c55-11f1-8896-79f7c2e9b508/mon.vm05/config 2026-03-10T07:52:15.386 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:15.383+0000 7fbc87fff700 1 -- 192.168.123.105:0/1717470222 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fbc8810f340 msgr2=0x7fbc8810f720 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:52:15.386 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:15.383+0000 7fbc87fff700 1 --2- 192.168.123.105:0/1717470222 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fbc8810f340 0x7fbc8810f720 secure :-1 s=READY pgs=124 cs=0 l=1 rev1=1 crypto rx=0x7fbc78009a60 tx=0x7fbc78009d70 comp rx=0 tx=0).stop 2026-03-10T07:52:15.386 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:15.383+0000 7fbc87fff700 1 -- 192.168.123.105:0/1717470222 shutdown_connections 2026-03-10T07:52:15.386 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:15.383+0000 7fbc87fff700 1 --2- 192.168.123.105:0/1717470222 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fbc8810d0f0 0x7fbc8810d570 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:52:15.386 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:15.383+0000 7fbc87fff700 1 --2- 192.168.123.105:0/1717470222 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fbc8810f340 0x7fbc8810f720 unknown :-1 s=CLOSED pgs=124 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:52:15.386 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:15.383+0000 7fbc87fff700 1 -- 192.168.123.105:0/1717470222 >> 192.168.123.105:0/1717470222 conn(0x7fbc8806ce20 msgr2=0x7fbc8806d230 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:52:15.386 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:15.383+0000 7fbc87fff700 1 -- 192.168.123.105:0/1717470222 shutdown_connections 2026-03-10T07:52:15.386 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:15.383+0000 7fbc87fff700 1 -- 192.168.123.105:0/1717470222 wait complete. 2026-03-10T07:52:15.386 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:15.384+0000 7fbc87fff700 1 Processor -- start 2026-03-10T07:52:15.386 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:15.384+0000 7fbc87fff700 1 -- start start 2026-03-10T07:52:15.386 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:15.384+0000 7fbc87fff700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fbc8810d0f0 0x7fbc8811bee0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:52:15.386 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:15.384+0000 7fbc87fff700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fbc88116ee0 0x7fbc88117360 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:52:15.386 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:15.384+0000 7fbc87fff700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fbc881178a0 con 0x7fbc8810d0f0 2026-03-10T07:52:15.386 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:15.384+0000 7fbc87fff700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fbc88117a10 con 0x7fbc88116ee0 2026-03-10T07:52:15.386 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:15.384+0000 7fbc867fc700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fbc88116ee0 0x7fbc88117360 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:52:15.386 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:15.384+0000 7fbc867fc700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fbc88116ee0 0x7fbc88117360 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.108:3300/0 says I am v2:192.168.123.105:42934/0 (socket says 192.168.123.105:42934) 2026-03-10T07:52:15.386 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:15.384+0000 7fbc867fc700 1 -- 192.168.123.105:0/2550929862 learned_addr learned my addr 192.168.123.105:0/2550929862 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T07:52:15.387 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:15.385+0000 7fbc86ffd700 1 --2- 192.168.123.105:0/2550929862 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fbc8810d0f0 0x7fbc8811bee0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:52:15.387 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:15.385+0000 7fbc867fc700 1 -- 192.168.123.105:0/2550929862 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fbc8810d0f0 msgr2=0x7fbc8811bee0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:52:15.387 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:15.385+0000 7fbc867fc700 1 --2- 
192.168.123.105:0/2550929862 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fbc8810d0f0 0x7fbc8811bee0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:52:15.387 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:15.385+0000 7fbc867fc700 1 -- 192.168.123.105:0/2550929862 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fbc78009710 con 0x7fbc88116ee0 2026-03-10T07:52:15.387 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:15.385+0000 7fbc867fc700 1 --2- 192.168.123.105:0/2550929862 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fbc88116ee0 0x7fbc88117360 secure :-1 s=READY pgs=125 cs=0 l=1 rev1=1 crypto rx=0x7fbc80007f70 tx=0x7fbc8000d3b0 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:52:15.387 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:15.385+0000 7fbc6ffff700 1 -- 192.168.123.105:0/2550929862 <== mon.1 v2:192.168.123.108:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fbc8000de40 con 0x7fbc88116ee0 2026-03-10T07:52:15.387 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:15.385+0000 7fbc87fff700 1 -- 192.168.123.105:0/2550929862 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fbc88117cf0 con 0x7fbc88116ee0 2026-03-10T07:52:15.387 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:15.385+0000 7fbc87fff700 1 -- 192.168.123.105:0/2550929862 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fbc881b86e0 con 0x7fbc88116ee0 2026-03-10T07:52:15.387 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:15.385+0000 7fbc6ffff700 1 -- 192.168.123.105:0/2550929862 <== mon.1 v2:192.168.123.108:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7fbc8000f040 con 0x7fbc88116ee0 2026-03-10T07:52:15.387 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:15.386+0000 7fbc6ffff700 1 -- 192.168.123.105:0/2550929862 <== mon.1 v2:192.168.123.108:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fbc800149b0 con 0x7fbc88116ee0 2026-03-10T07:52:15.388 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:15.386+0000 7fbc87fff700 1 -- 192.168.123.105:0/2550929862 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fbc74005320 con 0x7fbc88116ee0 2026-03-10T07:52:15.389 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:15.387+0000 7fbc6ffff700 1 -- 192.168.123.105:0/2550929862 <== mon.1 v2:192.168.123.108:3300/0 4 ==== mgrmap(e 31) v1 ==== 99964+0+0 (secure 0 0 0) 0x7fbc80014b10 con 0x7fbc88116ee0 2026-03-10T07:52:15.389 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:15.388+0000 7fbc6ffff700 1 --2- 192.168.123.105:0/2550929862 >> [v2:192.168.123.105:6800/413688438,v1:192.168.123.105:6801/413688438] conn(0x7fbc70077590 0x7fbc70079a50 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:52:15.389 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:15.388+0000 7fbc86ffd700 1 --2- 192.168.123.105:0/2550929862 >> [v2:192.168.123.105:6800/413688438,v1:192.168.123.105:6801/413688438] conn(0x7fbc70077590 0x7fbc70079a50 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:52:15.389 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:15.388+0000 7fbc6ffff700 1 -- 192.168.123.105:0/2550929862 <== mon.1 v2:192.168.123.108:3300/0 5 ==== osd_map(42..42 src has 1..42) v4 ==== 5878+0+0 (secure 0 0 0) 0x7fbc80098e70 con 0x7fbc88116ee0 2026-03-10T07:52:15.390 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:15.389+0000 7fbc86ffd700 1 --2- 192.168.123.105:0/2550929862 >> 
[v2:192.168.123.105:6800/413688438,v1:192.168.123.105:6801/413688438] conn(0x7fbc70077590 0x7fbc70079a50 secure :-1 s=READY pgs=44 cs=0 l=1 rev1=1 crypto rx=0x7fbc7800b5c0 tx=0x7fbc78019040 comp rx=0 tx=0).ready entity=mgr.14628 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:52:15.392 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:15.391+0000 7fbc6ffff700 1 -- 192.168.123.105:0/2550929862 <== mon.1 v2:192.168.123.108:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+191549 (secure 0 0 0) 0x7fbc800615b0 con 0x7fbc88116ee0 2026-03-10T07:52:15.565 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:15.561+0000 7fbc87fff700 1 -- 192.168.123.105:0/2550929862 --> [v2:192.168.123.105:6800/413688438,v1:192.168.123.105:6801/413688438] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7fbc74000bf0 con 0x7fbc70077590 2026-03-10T07:52:15.565 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:15.563+0000 7fbc6ffff700 1 -- 192.168.123.105:0/2550929862 <== mgr.14628 v2:192.168.123.105:6800/413688438 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+358 (secure 0 0 0) 0x7fbc74000bf0 con 0x7fbc70077590 2026-03-10T07:52:15.569 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:15.567+0000 7fbc6dffb700 1 -- 192.168.123.105:0/2550929862 >> [v2:192.168.123.105:6800/413688438,v1:192.168.123.105:6801/413688438] conn(0x7fbc70077590 msgr2=0x7fbc70079a50 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:52:15.569 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:15.567+0000 7fbc6dffb700 1 --2- 192.168.123.105:0/2550929862 >> [v2:192.168.123.105:6800/413688438,v1:192.168.123.105:6801/413688438] conn(0x7fbc70077590 0x7fbc70079a50 secure :-1 s=READY pgs=44 cs=0 l=1 rev1=1 crypto rx=0x7fbc7800b5c0 tx=0x7fbc78019040 comp rx=0 tx=0).stop 2026-03-10T07:52:15.569 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:15.567+0000 
7fbc6dffb700 1 -- 192.168.123.105:0/2550929862 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fbc88116ee0 msgr2=0x7fbc88117360 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:52:15.569 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:15.567+0000 7fbc6dffb700 1 --2- 192.168.123.105:0/2550929862 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fbc88116ee0 0x7fbc88117360 secure :-1 s=READY pgs=125 cs=0 l=1 rev1=1 crypto rx=0x7fbc80007f70 tx=0x7fbc8000d3b0 comp rx=0 tx=0).stop 2026-03-10T07:52:15.569 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:15.568+0000 7fbc6dffb700 1 -- 192.168.123.105:0/2550929862 shutdown_connections 2026-03-10T07:52:15.569 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:15.568+0000 7fbc6dffb700 1 --2- 192.168.123.105:0/2550929862 >> [v2:192.168.123.105:6800/413688438,v1:192.168.123.105:6801/413688438] conn(0x7fbc70077590 0x7fbc70079a50 unknown :-1 s=CLOSED pgs=44 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:52:15.569 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:15.568+0000 7fbc6dffb700 1 --2- 192.168.123.105:0/2550929862 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fbc8810d0f0 0x7fbc8811bee0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:52:15.569 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:15.568+0000 7fbc6dffb700 1 --2- 192.168.123.105:0/2550929862 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fbc88116ee0 0x7fbc88117360 unknown :-1 s=CLOSED pgs=125 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:52:15.569 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:15.568+0000 7fbc6dffb700 1 -- 192.168.123.105:0/2550929862 >> 192.168.123.105:0/2550929862 conn(0x7fbc8806ce20 msgr2=0x7fbc88070090 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:52:15.570 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:15.568+0000 7fbc6dffb700 1 -- 192.168.123.105:0/2550929862 shutdown_connections 2026-03-10T07:52:15.570 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:15.568+0000 7fbc6dffb700 1 -- 192.168.123.105:0/2550929862 wait complete. 2026-03-10T07:52:15.583 INFO:teuthology.orchestra.run.vm05.stdout:true 2026-03-10T07:52:15.682 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:15.680+0000 7fed46169700 1 -- 192.168.123.105:0/998931637 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fed401089a0 msgr2=0x7fed4010be70 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:52:15.682 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:15.680+0000 7fed46169700 1 --2- 192.168.123.105:0/998931637 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fed401089a0 0x7fed4010be70 secure :-1 s=READY pgs=330 cs=0 l=1 rev1=1 crypto rx=0x7fed3800d3f0 tx=0x7fed3800d700 comp rx=0 tx=0).stop 2026-03-10T07:52:15.682 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:15.680+0000 7fed46169700 1 -- 192.168.123.105:0/998931637 shutdown_connections 2026-03-10T07:52:15.682 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:15.680+0000 7fed46169700 1 --2- 192.168.123.105:0/998931637 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fed401089a0 0x7fed4010be70 unknown :-1 s=CLOSED pgs=330 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:52:15.682 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:15.680+0000 7fed46169700 1 --2- 192.168.123.105:0/998931637 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fed40107ff0 0x7fed401083d0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:52:15.682 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:15.680+0000 7fed46169700 1 -- 192.168.123.105:0/998931637 >> 192.168.123.105:0/998931637 conn(0x7fed4006ce20 
msgr2=0x7fed4006d230 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:52:15.683 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:15.680+0000 7fed46169700 1 -- 192.168.123.105:0/998931637 shutdown_connections 2026-03-10T07:52:15.683 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:15.680+0000 7fed46169700 1 -- 192.168.123.105:0/998931637 wait complete. 2026-03-10T07:52:15.683 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:15.681+0000 7fed46169700 1 Processor -- start 2026-03-10T07:52:15.683 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:15.681+0000 7fed46169700 1 -- start start 2026-03-10T07:52:15.683 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:15.681+0000 7fed46169700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fed40107ff0 0x7fed40133210 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:52:15.683 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:15.681+0000 7fed46169700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fed40133750 0x7fed40133bd0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:52:15.683 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:15.681+0000 7fed46169700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fed4007ef10 con 0x7fed40107ff0 2026-03-10T07:52:15.683 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:15.681+0000 7fed46169700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fed4007f080 con 0x7fed40133750 2026-03-10T07:52:15.683 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:15.681+0000 7fed3effd700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fed40133750 0x7fed40133bd0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 
tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:52:15.683 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:15.681+0000 7fed3effd700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fed40133750 0x7fed40133bd0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.108:3300/0 says I am v2:192.168.123.105:42944/0 (socket says 192.168.123.105:42944) 2026-03-10T07:52:15.683 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:15.681+0000 7fed3effd700 1 -- 192.168.123.105:0/1440084614 learned_addr learned my addr 192.168.123.105:0/1440084614 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T07:52:15.683 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:15.681+0000 7fed3f7fe700 1 --2- 192.168.123.105:0/1440084614 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fed40107ff0 0x7fed40133210 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:52:15.683 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:15.682+0000 7fed3effd700 1 -- 192.168.123.105:0/1440084614 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fed40107ff0 msgr2=0x7fed40133210 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:52:15.683 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:15.682+0000 7fed3effd700 1 --2- 192.168.123.105:0/1440084614 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fed40107ff0 0x7fed40133210 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:52:15.683 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:15.682+0000 7fed3effd700 1 -- 192.168.123.105:0/1440084614 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fed38007ed0 con 0x7fed40133750 
2026-03-10T07:52:15.683 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:15.682+0000 7fed3effd700 1 --2- 192.168.123.105:0/1440084614 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fed40133750 0x7fed40133bd0 secure :-1 s=READY pgs=126 cs=0 l=1 rev1=1 crypto rx=0x7fed38003c30 tx=0x7fed38004b40 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:52:15.684 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:15.683+0000 7fed3cff9700 1 -- 192.168.123.105:0/1440084614 <== mon.1 v2:192.168.123.108:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fed3801c070 con 0x7fed40133750 2026-03-10T07:52:15.685 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:15.683+0000 7fed46169700 1 -- 192.168.123.105:0/1440084614 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fed4007f2b0 con 0x7fed40133750 2026-03-10T07:52:15.685 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:15.683+0000 7fed46169700 1 -- 192.168.123.105:0/1440084614 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fed4007f7a0 con 0x7fed40133750 2026-03-10T07:52:15.685 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:15.683+0000 7fed3cff9700 1 -- 192.168.123.105:0/1440084614 <== mon.1 v2:192.168.123.108:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7fed3800deb0 con 0x7fed40133750 2026-03-10T07:52:15.685 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:15.683+0000 7fed46169700 1 -- 192.168.123.105:0/1440084614 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fed4004f2e0 con 0x7fed40133750 2026-03-10T07:52:15.685 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:15.684+0000 7fed3cff9700 1 -- 192.168.123.105:0/1440084614 <== mon.1 v2:192.168.123.108:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 
(secure 0 0 0) 0x7fed38017b10 con 0x7fed40133750 2026-03-10T07:52:15.686 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:15.685+0000 7fed3cff9700 1 -- 192.168.123.105:0/1440084614 <== mon.1 v2:192.168.123.108:3300/0 4 ==== mgrmap(e 31) v1 ==== 99964+0+0 (secure 0 0 0) 0x7fed38017c70 con 0x7fed40133750 2026-03-10T07:52:15.687 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:15.686+0000 7fed3cff9700 1 --2- 192.168.123.105:0/1440084614 >> [v2:192.168.123.105:6800/413688438,v1:192.168.123.105:6801/413688438] conn(0x7fed28077660 0x7fed28079b20 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:52:15.687 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:15.686+0000 7fed3f7fe700 1 --2- 192.168.123.105:0/1440084614 >> [v2:192.168.123.105:6800/413688438,v1:192.168.123.105:6801/413688438] conn(0x7fed28077660 0x7fed28079b20 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:52:15.687 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:15.686+0000 7fed3f7fe700 1 --2- 192.168.123.105:0/1440084614 >> [v2:192.168.123.105:6800/413688438,v1:192.168.123.105:6801/413688438] conn(0x7fed28077660 0x7fed28079b20 secure :-1 s=READY pgs=45 cs=0 l=1 rev1=1 crypto rx=0x7fed30005950 tx=0x7fed3000b410 comp rx=0 tx=0).ready entity=mgr.14628 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:52:15.687 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:15.686+0000 7fed3cff9700 1 -- 192.168.123.105:0/1440084614 <== mon.1 v2:192.168.123.108:3300/0 5 ==== osd_map(42..42 src has 1..42) v4 ==== 5878+0+0 (secure 0 0 0) 0x7fed38013070 con 0x7fed40133750 2026-03-10T07:52:15.690 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:15.688+0000 7fed3cff9700 1 -- 192.168.123.105:0/1440084614 <== mon.1 v2:192.168.123.108:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 
==== 72+0+191549 (secure 0 0 0) 0x7fed38064170 con 0x7fed40133750 2026-03-10T07:52:15.827 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:15 vm05.local ceph-mon[50387]: from='client.24531 -' entity='client.admin' cmd=[{"prefix": "orch upgrade start", "image": "quay.ceph.io/ceph-ci/ceph:e911bdebe5c8faa3800735d1568fcdca65db60df", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T07:52:15.827 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:15 vm05.local ceph-mon[50387]: Upgrade: Started with target quay.ceph.io/ceph-ci/ceph:e911bdebe5c8faa3800735d1568fcdca65db60df 2026-03-10T07:52:15.827 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:15 vm05.local ceph-mon[50387]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' 2026-03-10T07:52:15.827 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:15 vm05.local ceph-mon[50387]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T07:52:15.827 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:15 vm05.local ceph-mon[50387]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T07:52:15.827 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:15 vm05.local ceph-mon[50387]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T07:52:15.827 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:15 vm05.local ceph-mon[50387]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' 2026-03-10T07:52:15.827 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:15 vm05.local ceph-mon[50387]: Upgrade: First pull of quay.ceph.io/ceph-ci/ceph:e911bdebe5c8faa3800735d1568fcdca65db60df 2026-03-10T07:52:15.827 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:15 vm05.local ceph-mon[50387]: pgmap v57: 65 pgs: 
65 active+clean; 450 MiB data, 3.2 GiB used, 117 GiB / 120 GiB avail; 32 KiB/s rd, 1.5 MiB/s wr, 160 op/s 2026-03-10T07:52:15.827 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:15.824+0000 7fed46169700 1 -- 192.168.123.105:0/1440084614 --> [v2:192.168.123.105:6800/413688438,v1:192.168.123.105:6801/413688438] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7fed40134880 con 0x7fed28077660 2026-03-10T07:52:15.828 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:15.827+0000 7fed3cff9700 1 -- 192.168.123.105:0/1440084614 <== mgr.14628 v2:192.168.123.105:6800/413688438 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+358 (secure 0 0 0) 0x7fed40134880 con 0x7fed28077660 2026-03-10T07:52:15.832 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:15.831+0000 7fed267fc700 1 -- 192.168.123.105:0/1440084614 >> [v2:192.168.123.105:6800/413688438,v1:192.168.123.105:6801/413688438] conn(0x7fed28077660 msgr2=0x7fed28079b20 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:52:15.832 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:15.831+0000 7fed267fc700 1 --2- 192.168.123.105:0/1440084614 >> [v2:192.168.123.105:6800/413688438,v1:192.168.123.105:6801/413688438] conn(0x7fed28077660 0x7fed28079b20 secure :-1 s=READY pgs=45 cs=0 l=1 rev1=1 crypto rx=0x7fed30005950 tx=0x7fed3000b410 comp rx=0 tx=0).stop 2026-03-10T07:52:15.832 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:15.831+0000 7fed267fc700 1 -- 192.168.123.105:0/1440084614 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fed40133750 msgr2=0x7fed40133bd0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:52:15.832 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:15.831+0000 7fed267fc700 1 --2- 192.168.123.105:0/1440084614 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fed40133750 0x7fed40133bd0 secure :-1 s=READY pgs=126 cs=0 l=1 rev1=1 crypto 
rx=0x7fed38003c30 tx=0x7fed38004b40 comp rx=0 tx=0).stop 2026-03-10T07:52:15.832 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:15.831+0000 7fed267fc700 1 -- 192.168.123.105:0/1440084614 shutdown_connections 2026-03-10T07:52:15.832 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:15.831+0000 7fed267fc700 1 --2- 192.168.123.105:0/1440084614 >> [v2:192.168.123.105:6800/413688438,v1:192.168.123.105:6801/413688438] conn(0x7fed28077660 0x7fed28079b20 unknown :-1 s=CLOSED pgs=45 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:52:15.832 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:15.831+0000 7fed267fc700 1 --2- 192.168.123.105:0/1440084614 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fed40107ff0 0x7fed40133210 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:52:15.832 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:15.831+0000 7fed267fc700 1 --2- 192.168.123.105:0/1440084614 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fed40133750 0x7fed40133bd0 unknown :-1 s=CLOSED pgs=126 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:52:15.832 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:15.831+0000 7fed267fc700 1 -- 192.168.123.105:0/1440084614 >> 192.168.123.105:0/1440084614 conn(0x7fed4006ce20 msgr2=0x7fed40070670 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:52:15.833 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:15.831+0000 7fed267fc700 1 -- 192.168.123.105:0/1440084614 shutdown_connections 2026-03-10T07:52:15.833 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:15.831+0000 7fed267fc700 1 -- 192.168.123.105:0/1440084614 wait complete. 
2026-03-10T07:52:15.907 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:15.905+0000 7fd8dcfb1700 1 -- 192.168.123.105:0/426666050 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fd8d81089a0 msgr2=0x7fd8d810be70 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:52:15.907 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:15.905+0000 7fd8dcfb1700 1 --2- 192.168.123.105:0/426666050 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fd8d81089a0 0x7fd8d810be70 secure :-1 s=READY pgs=331 cs=0 l=1 rev1=1 crypto rx=0x7fd8d000d3f0 tx=0x7fd8d000d700 comp rx=0 tx=0).stop 2026-03-10T07:52:15.907 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:15.905+0000 7fd8dcfb1700 1 -- 192.168.123.105:0/426666050 shutdown_connections 2026-03-10T07:52:15.907 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:15.905+0000 7fd8dcfb1700 1 --2- 192.168.123.105:0/426666050 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fd8d81089a0 0x7fd8d810be70 unknown :-1 s=CLOSED pgs=331 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:52:15.907 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:15.905+0000 7fd8dcfb1700 1 --2- 192.168.123.105:0/426666050 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fd8d8107ff0 0x7fd8d81083d0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:52:15.907 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:15.905+0000 7fd8dcfb1700 1 -- 192.168.123.105:0/426666050 >> 192.168.123.105:0/426666050 conn(0x7fd8d806ce20 msgr2=0x7fd8d806d230 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:52:15.908 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:15.905+0000 7fd8dcfb1700 1 -- 192.168.123.105:0/426666050 shutdown_connections 2026-03-10T07:52:15.908 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:15.906+0000 7fd8dcfb1700 1 -- 192.168.123.105:0/426666050 wait 
complete. 2026-03-10T07:52:15.908 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:15.906+0000 7fd8dcfb1700 1 Processor -- start 2026-03-10T07:52:15.908 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:15.906+0000 7fd8dcfb1700 1 -- start start 2026-03-10T07:52:15.908 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:15.906+0000 7fd8dcfb1700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fd8d8107ff0 0x7fd8d8133210 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:52:15.908 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:15.906+0000 7fd8dcfb1700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fd8d8133750 0x7fd8d8133bd0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:52:15.908 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:15.906+0000 7fd8dcfb1700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fd8d807ef10 con 0x7fd8d8133750 2026-03-10T07:52:15.908 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:15.906+0000 7fd8dcfb1700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fd8d807f080 con 0x7fd8d8107ff0 2026-03-10T07:52:15.908 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:15.906+0000 7fd8d5d9b700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fd8d8133750 0x7fd8d8133bd0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:52:15.908 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:15.906+0000 7fd8d5d9b700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fd8d8133750 0x7fd8d8133bd0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I 
am v2:192.168.123.105:44762/0 (socket says 192.168.123.105:44762) 2026-03-10T07:52:15.908 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:15.907+0000 7fd8d5d9b700 1 -- 192.168.123.105:0/1436068521 learned_addr learned my addr 192.168.123.105:0/1436068521 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T07:52:15.908 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:15.907+0000 7fd8d659c700 1 --2- 192.168.123.105:0/1436068521 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fd8d8107ff0 0x7fd8d8133210 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:52:15.909 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:15.907+0000 7fd8d5d9b700 1 -- 192.168.123.105:0/1436068521 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fd8d8107ff0 msgr2=0x7fd8d8133210 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:52:15.909 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:15.907+0000 7fd8d5d9b700 1 --2- 192.168.123.105:0/1436068521 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fd8d8107ff0 0x7fd8d8133210 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:52:15.909 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:15.907+0000 7fd8d5d9b700 1 -- 192.168.123.105:0/1436068521 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fd8d0007ed0 con 0x7fd8d8133750 2026-03-10T07:52:15.909 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:15.907+0000 7fd8d5d9b700 1 --2- 192.168.123.105:0/1436068521 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fd8d8133750 0x7fd8d8133bd0 secure :-1 s=READY pgs=332 cs=0 l=1 rev1=1 crypto rx=0x7fd8d00060b0 tx=0x7fd8d0003ce0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 
2026-03-10T07:52:15.909 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:15.908+0000 7fd8c77fe700 1 -- 192.168.123.105:0/1436068521 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fd8d001c070 con 0x7fd8d8133750 2026-03-10T07:52:15.910 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:15.908+0000 7fd8dcfb1700 1 -- 192.168.123.105:0/1436068521 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fd8d807f2b0 con 0x7fd8d8133750 2026-03-10T07:52:15.910 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:15.908+0000 7fd8dcfb1700 1 -- 192.168.123.105:0/1436068521 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fd8d807f7a0 con 0x7fd8d8133750 2026-03-10T07:52:15.910 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:15.908+0000 7fd8c77fe700 1 -- 192.168.123.105:0/1436068521 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7fd8d000fcf0 con 0x7fd8d8133750 2026-03-10T07:52:15.910 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:15.908+0000 7fd8c77fe700 1 -- 192.168.123.105:0/1436068521 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fd8d0017c80 con 0x7fd8d8133750 2026-03-10T07:52:15.910 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:15.909+0000 7fd8dcfb1700 1 -- 192.168.123.105:0/1436068521 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fd8b8005320 con 0x7fd8d8133750 2026-03-10T07:52:15.911 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:15.910+0000 7fd8c77fe700 1 -- 192.168.123.105:0/1436068521 <== mon.0 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 31) v1 ==== 99964+0+0 (secure 0 0 0) 0x7fd8d0017420 con 0x7fd8d8133750 2026-03-10T07:52:15.912 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:15.910+0000 
7fd8c77fe700 1 --2- 192.168.123.105:0/1436068521 >> [v2:192.168.123.105:6800/413688438,v1:192.168.123.105:6801/413688438] conn(0x7fd8c0077660 0x7fd8c0079b20 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:52:15.912 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:15.911+0000 7fd8c77fe700 1 -- 192.168.123.105:0/1436068521 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(42..42 src has 1..42) v4 ==== 5878+0+0 (secure 0 0 0) 0x7fd8d0013070 con 0x7fd8d8133750 2026-03-10T07:52:15.912 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:15.911+0000 7fd8d659c700 1 --2- 192.168.123.105:0/1436068521 >> [v2:192.168.123.105:6800/413688438,v1:192.168.123.105:6801/413688438] conn(0x7fd8c0077660 0x7fd8c0079b20 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:52:15.914 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:15.912+0000 7fd8d659c700 1 --2- 192.168.123.105:0/1436068521 >> [v2:192.168.123.105:6800/413688438,v1:192.168.123.105:6801/413688438] conn(0x7fd8c0077660 0x7fd8c0079b20 secure :-1 s=READY pgs=46 cs=0 l=1 rev1=1 crypto rx=0x7fd8c8005950 tx=0x7fd8c8005cf0 comp rx=0 tx=0).ready entity=mgr.14628 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:52:15.914 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:15.913+0000 7fd8c77fe700 1 -- 192.168.123.105:0/1436068521 <== mon.0 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+191549 (secure 0 0 0) 0x7fd8d0064310 con 0x7fd8d8133750 2026-03-10T07:52:15.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:15 vm08.local ceph-mon[59917]: from='client.24531 -' entity='client.admin' cmd=[{"prefix": "orch upgrade start", "image": "quay.ceph.io/ceph-ci/ceph:e911bdebe5c8faa3800735d1568fcdca65db60df", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T07:52:15.918 
INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:15 vm08.local ceph-mon[59917]: Upgrade: Started with target quay.ceph.io/ceph-ci/ceph:e911bdebe5c8faa3800735d1568fcdca65db60df 2026-03-10T07:52:15.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:15 vm08.local ceph-mon[59917]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' 2026-03-10T07:52:15.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:15 vm08.local ceph-mon[59917]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T07:52:15.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:15 vm08.local ceph-mon[59917]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T07:52:15.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:15 vm08.local ceph-mon[59917]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T07:52:15.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:15 vm08.local ceph-mon[59917]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' 2026-03-10T07:52:15.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:15 vm08.local ceph-mon[59917]: Upgrade: First pull of quay.ceph.io/ceph-ci/ceph:e911bdebe5c8faa3800735d1568fcdca65db60df 2026-03-10T07:52:15.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:15 vm08.local ceph-mon[59917]: pgmap v57: 65 pgs: 65 active+clean; 450 MiB data, 3.2 GiB used, 117 GiB / 120 GiB avail; 32 KiB/s rd, 1.5 MiB/s wr, 160 op/s 2026-03-10T07:52:16.048 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:16.047+0000 7fd8dcfb1700 1 -- 192.168.123.105:0/1436068521 --> [v2:192.168.123.105:6800/413688438,v1:192.168.123.105:6801/413688438] -- mgr_command(tid 0: {"prefix": "orch ps", "target": ["mon-mgr", ""]}) v1 -- 
0x7fd8b8000bf0 con 0x7fd8c0077660 2026-03-10T07:52:16.057 INFO:teuthology.orchestra.run.vm05.stdout:NAME HOST PORTS STATUS REFRESHED AGE MEM USE MEM LIM VERSION IMAGE ID CONTAINER ID 2026-03-10T07:52:16.057 INFO:teuthology.orchestra.run.vm05.stdout:alertmanager.vm05 vm05 *:9093,9094 running (54s) 30s ago 5m 16.9M - 0.25.0 c8568f914cd2 ac15d5f35994 2026-03-10T07:52:16.057 INFO:teuthology.orchestra.run.vm05.stdout:ceph-exporter.vm05 vm05 running (6m) 30s ago 6m 8530k - 18.2.1 5be31c24972a 26c4db858175 2026-03-10T07:52:16.057 INFO:teuthology.orchestra.run.vm05.stdout:ceph-exporter.vm08 vm08 running (5m) 81s ago 5m 8409k - 18.2.1 5be31c24972a 209e2398a09c 2026-03-10T07:52:16.057 INFO:teuthology.orchestra.run.vm05.stdout:crash.vm05 vm05 running (6m) 30s ago 6m 7419k - 18.2.1 5be31c24972a d3d7b92c8ac3 2026-03-10T07:52:16.057 INFO:teuthology.orchestra.run.vm05.stdout:crash.vm08 vm08 running (5m) 81s ago 5m 7415k - 18.2.1 5be31c24972a 96136e0195f7 2026-03-10T07:52:16.057 INFO:teuthology.orchestra.run.vm05.stdout:grafana.vm05 vm05 *:3000 running (33s) 30s ago 5m 41.5M - 10.4.0 c8b91775d855 6acb529ad951 2026-03-10T07:52:16.057 INFO:teuthology.orchestra.run.vm05.stdout:mds.cephfs.vm05.omfhnh vm05 running (3m) 30s ago 3m 262M - 18.2.1 5be31c24972a e23de179e09c 2026-03-10T07:52:16.057 INFO:teuthology.orchestra.run.vm05.stdout:mds.cephfs.vm05.pavqil vm05 running (3m) 30s ago 3m 15.7M - 18.2.1 5be31c24972a 5b9e5afa214c 2026-03-10T07:52:16.057 INFO:teuthology.orchestra.run.vm05.stdout:mds.cephfs.vm08.dgsaon vm08 running (3m) 81s ago 3m 16.4M - 18.2.1 5be31c24972a 1696aee522b5 2026-03-10T07:52:16.057 INFO:teuthology.orchestra.run.vm05.stdout:mds.cephfs.vm08.ybmbgd vm08 running (3m) 81s ago 3m 14.7M - 18.2.1 5be31c24972a 30b0e51cd2ed 2026-03-10T07:52:16.057 INFO:teuthology.orchestra.run.vm05.stdout:mgr.vm05.blexke vm05 *:8443,9283,8765 running (117s) 30s ago 6m 585M - 19.2.3-678-ge911bdeb 654f31e6858e 5467b6a3bd38 2026-03-10T07:52:16.057 
INFO:teuthology.orchestra.run.vm05.stdout:mgr.vm08.orfpog vm08 *:8443,9283,8765 running (98s) 81s ago 5m 487M - 19.2.3-678-ge911bdeb 654f31e6858e 7a15d7811d5b 2026-03-10T07:52:16.057 INFO:teuthology.orchestra.run.vm05.stdout:mon.vm05 vm05 running (6m) 30s ago 6m 55.0M 2048M 18.2.1 5be31c24972a 2a459bf05146 2026-03-10T07:52:16.057 INFO:teuthology.orchestra.run.vm05.stdout:mon.vm08 vm08 running (5m) 81s ago 5m 42.1M 2048M 18.2.1 5be31c24972a e01dfb712474 2026-03-10T07:52:16.057 INFO:teuthology.orchestra.run.vm05.stdout:node-exporter.vm05 vm05 *:9100 running (88s) 30s ago 5m 9512k - 1.7.0 72c9c2088986 7cd0b23b4118 2026-03-10T07:52:16.057 INFO:teuthology.orchestra.run.vm05.stdout:node-exporter.vm08 vm08 *:9100 running (83s) 81s ago 5m 5368k - 1.7.0 72c9c2088986 3dd4d91d5881 2026-03-10T07:52:16.057 INFO:teuthology.orchestra.run.vm05.stdout:osd.0 vm05 running (5m) 30s ago 5m 308M 4096M 18.2.1 5be31c24972a 9b7c5ea48cea 2026-03-10T07:52:16.057 INFO:teuthology.orchestra.run.vm05.stdout:osd.1 vm05 running (4m) 30s ago 4m 332M 4096M 18.2.1 5be31c24972a 88e0b65b2c93 2026-03-10T07:52:16.057 INFO:teuthology.orchestra.run.vm05.stdout:osd.2 vm05 running (4m) 30s ago 4m 262M 4096M 18.2.1 5be31c24972a b8d3cbeb49f1 2026-03-10T07:52:16.057 INFO:teuthology.orchestra.run.vm05.stdout:osd.3 vm08 running (4m) 81s ago 4m 257M 4096M 18.2.1 5be31c24972a 0a62c54a86c0 2026-03-10T07:52:16.057 INFO:teuthology.orchestra.run.vm05.stdout:osd.4 vm08 running (4m) 81s ago 4m 196M 4096M 18.2.1 5be31c24972a bd748b691ccd 2026-03-10T07:52:16.057 INFO:teuthology.orchestra.run.vm05.stdout:osd.5 vm08 running (4m) 81s ago 4m 220M 4096M 18.2.1 5be31c24972a 9f08820ae98b 2026-03-10T07:52:16.057 INFO:teuthology.orchestra.run.vm05.stdout:prometheus.vm05 vm05 *:9095 running (63s) 30s ago 5m 47.1M - 2.51.0 1d3b7f56885b c59a6be07563 2026-03-10T07:52:16.058 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:16.053+0000 7fd8c77fe700 1 -- 192.168.123.105:0/1436068521 <== mgr.14628 
v2:192.168.123.105:6800/413688438 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+3576 (secure 0 0 0) 0x7fd8b8000bf0 con 0x7fd8c0077660 2026-03-10T07:52:16.058 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:16.055+0000 7fd8c57fa700 1 -- 192.168.123.105:0/1436068521 >> [v2:192.168.123.105:6800/413688438,v1:192.168.123.105:6801/413688438] conn(0x7fd8c0077660 msgr2=0x7fd8c0079b20 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:52:16.058 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:16.055+0000 7fd8c57fa700 1 --2- 192.168.123.105:0/1436068521 >> [v2:192.168.123.105:6800/413688438,v1:192.168.123.105:6801/413688438] conn(0x7fd8c0077660 0x7fd8c0079b20 secure :-1 s=READY pgs=46 cs=0 l=1 rev1=1 crypto rx=0x7fd8c8005950 tx=0x7fd8c8005cf0 comp rx=0 tx=0).stop 2026-03-10T07:52:16.058 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:16.055+0000 7fd8c57fa700 1 -- 192.168.123.105:0/1436068521 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fd8d8133750 msgr2=0x7fd8d8133bd0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:52:16.058 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:16.055+0000 7fd8c57fa700 1 --2- 192.168.123.105:0/1436068521 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fd8d8133750 0x7fd8d8133bd0 secure :-1 s=READY pgs=332 cs=0 l=1 rev1=1 crypto rx=0x7fd8d00060b0 tx=0x7fd8d0003ce0 comp rx=0 tx=0).stop 2026-03-10T07:52:16.058 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:16.056+0000 7fd8c57fa700 1 -- 192.168.123.105:0/1436068521 shutdown_connections 2026-03-10T07:52:16.058 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:16.056+0000 7fd8c57fa700 1 --2- 192.168.123.105:0/1436068521 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fd8d8107ff0 0x7fd8d8133210 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:52:16.058 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:16.056+0000 7fd8c57fa700 1 --2- 192.168.123.105:0/1436068521 >> [v2:192.168.123.105:6800/413688438,v1:192.168.123.105:6801/413688438] conn(0x7fd8c0077660 0x7fd8c0079b20 unknown :-1 s=CLOSED pgs=46 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:52:16.058 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:16.056+0000 7fd8c57fa700 1 --2- 192.168.123.105:0/1436068521 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fd8d8133750 0x7fd8d8133bd0 unknown :-1 s=CLOSED pgs=332 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:52:16.058 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:16.056+0000 7fd8c57fa700 1 -- 192.168.123.105:0/1436068521 >> 192.168.123.105:0/1436068521 conn(0x7fd8d806ce20 msgr2=0x7fd8d8070670 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:52:16.058 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:16.056+0000 7fd8c57fa700 1 -- 192.168.123.105:0/1436068521 shutdown_connections 2026-03-10T07:52:16.058 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:16.056+0000 7fd8c57fa700 1 -- 192.168.123.105:0/1436068521 wait complete. 
2026-03-10T07:52:16.139 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:16.136+0000 7faf06114700 1 -- 192.168.123.105:0/1534860625 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7faf001089a0 msgr2=0x7faf0010be70 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:52:16.139 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:16.136+0000 7faf06114700 1 --2- 192.168.123.105:0/1534860625 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7faf001089a0 0x7faf0010be70 secure :-1 s=READY pgs=127 cs=0 l=1 rev1=1 crypto rx=0x7faef800d3e0 tx=0x7faef800d6f0 comp rx=0 tx=0).stop 2026-03-10T07:52:16.142 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:16.136+0000 7faf06114700 1 -- 192.168.123.105:0/1534860625 shutdown_connections 2026-03-10T07:52:16.142 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:16.136+0000 7faf06114700 1 --2- 192.168.123.105:0/1534860625 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7faf001089a0 0x7faf0010be70 unknown :-1 s=CLOSED pgs=127 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:52:16.142 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:16.136+0000 7faf06114700 1 --2- 192.168.123.105:0/1534860625 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7faf00107ff0 0x7faf001083d0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:52:16.142 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:16.136+0000 7faf06114700 1 -- 192.168.123.105:0/1534860625 >> 192.168.123.105:0/1534860625 conn(0x7faf0006ce20 msgr2=0x7faf0006d230 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:52:16.143 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:16.141+0000 7faf06114700 1 -- 192.168.123.105:0/1534860625 shutdown_connections 2026-03-10T07:52:16.143 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:16.141+0000 7faf06114700 1 -- 192.168.123.105:0/1534860625 
wait complete. 2026-03-10T07:52:16.143 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:16.141+0000 7faf06114700 1 Processor -- start 2026-03-10T07:52:16.143 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:16.141+0000 7faf06114700 1 -- start start 2026-03-10T07:52:16.143 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:16.141+0000 7faf06114700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7faf00107ff0 0x7faf0007ce50 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:52:16.143 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:16.141+0000 7faf06114700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7faf0007d390 0x7faf0007d810 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:52:16.143 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:16.141+0000 7faf06114700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7faf000819d0 con 0x7faf00107ff0 2026-03-10T07:52:16.143 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:16.141+0000 7faf06114700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7faf00081b10 con 0x7faf0007d390 2026-03-10T07:52:16.143 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:16.142+0000 7faefeffd700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7faf0007d390 0x7faf0007d810 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:52:16.143 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:16.142+0000 7faefeffd700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7faf0007d390 0x7faf0007d810 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.108:3300/0 
says I am v2:192.168.123.105:42976/0 (socket says 192.168.123.105:42976) 2026-03-10T07:52:16.143 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:16.142+0000 7faefeffd700 1 -- 192.168.123.105:0/3840469171 learned_addr learned my addr 192.168.123.105:0/3840469171 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T07:52:16.144 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:16.142+0000 7faeff7fe700 1 --2- 192.168.123.105:0/3840469171 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7faf00107ff0 0x7faf0007ce50 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:52:16.144 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:16.142+0000 7faefeffd700 1 -- 192.168.123.105:0/3840469171 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7faf00107ff0 msgr2=0x7faf0007ce50 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:52:16.144 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:16.142+0000 7faefeffd700 1 --2- 192.168.123.105:0/3840469171 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7faf00107ff0 0x7faf0007ce50 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:52:16.144 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:16.142+0000 7faefeffd700 1 -- 192.168.123.105:0/3840469171 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7faef800d090 con 0x7faf0007d390 2026-03-10T07:52:16.144 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:16.143+0000 7faefeffd700 1 --2- 192.168.123.105:0/3840469171 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7faf0007d390 0x7faf0007d810 secure :-1 s=READY pgs=128 cs=0 l=1 rev1=1 crypto rx=0x7faef800b590 tx=0x7faef800b5c0 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 
out_seq=0 2026-03-10T07:52:16.144 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:16.143+0000 7faefcff9700 1 -- 192.168.123.105:0/3840469171 <== mon.1 v2:192.168.123.108:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7faef8010040 con 0x7faf0007d390 2026-03-10T07:52:16.145 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:16.143+0000 7faf06114700 1 -- 192.168.123.105:0/3840469171 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7faf00081d40 con 0x7faf0007d390 2026-03-10T07:52:16.145 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:16.143+0000 7faf06114700 1 -- 192.168.123.105:0/3840469171 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7faf00082290 con 0x7faf0007d390 2026-03-10T07:52:16.145 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:16.144+0000 7faefcff9700 1 -- 192.168.123.105:0/3840469171 <== mon.1 v2:192.168.123.108:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7faef8004250 con 0x7faf0007d390 2026-03-10T07:52:16.146 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:16.144+0000 7faefcff9700 1 -- 192.168.123.105:0/3840469171 <== mon.1 v2:192.168.123.108:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7faef80087a0 con 0x7faf0007d390 2026-03-10T07:52:16.147 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:16.145+0000 7faefcff9700 1 -- 192.168.123.105:0/3840469171 <== mon.1 v2:192.168.123.108:3300/0 4 ==== mgrmap(e 31) v1 ==== 99964+0+0 (secure 0 0 0) 0x7faef8008900 con 0x7faf0007d390 2026-03-10T07:52:16.147 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:16.146+0000 7faefcff9700 1 --2- 192.168.123.105:0/3840469171 >> [v2:192.168.123.105:6800/413688438,v1:192.168.123.105:6801/413688438] conn(0x7faee8077590 0x7faee8079a50 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:52:16.147 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:16.146+0000 7faefcff9700 1 -- 192.168.123.105:0/3840469171 <== mon.1 v2:192.168.123.108:3300/0 5 ==== osd_map(42..42 src has 1..42) v4 ==== 5878+0+0 (secure 0 0 0) 0x7faef809b820 con 0x7faf0007d390 2026-03-10T07:52:16.148 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:16.147+0000 7faeff7fe700 1 --2- 192.168.123.105:0/3840469171 >> [v2:192.168.123.105:6800/413688438,v1:192.168.123.105:6801/413688438] conn(0x7faee8077590 0x7faee8079a50 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:52:16.149 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:16.148+0000 7faeff7fe700 1 --2- 192.168.123.105:0/3840469171 >> [v2:192.168.123.105:6800/413688438,v1:192.168.123.105:6801/413688438] conn(0x7faee8077590 0x7faee8079a50 secure :-1 s=READY pgs=47 cs=0 l=1 rev1=1 crypto rx=0x7faef000ad00 tx=0x7faef000c040 comp rx=0 tx=0).ready entity=mgr.14628 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:52:16.149 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:16.148+0000 7faf06114700 1 -- 192.168.123.105:0/3840469171 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7faeec005320 con 0x7faf0007d390 2026-03-10T07:52:16.153 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:16.151+0000 7faefcff9700 1 -- 192.168.123.105:0/3840469171 <== mon.1 v2:192.168.123.108:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+191549 (secure 0 0 0) 0x7faef8064550 con 0x7faf0007d390 2026-03-10T07:52:16.355 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:16.353+0000 7faf06114700 1 -- 192.168.123.105:0/3840469171 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "versions"} v 0) v1 -- 0x7faeec006200 con 0x7faf0007d390 2026-03-10T07:52:16.355 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:16.354+0000 7faefcff9700 1 -- 192.168.123.105:0/3840469171 <== mon.1 v2:192.168.123.108:3300/0 7 ==== mon_command_ack([{"prefix": "versions"}]=0 v0) v1 ==== 56+0+679 (secure 0 0 0) 0x7faef8063ca0 con 0x7faf0007d390 2026-03-10T07:52:16.355 INFO:teuthology.orchestra.run.vm05.stdout:{ 2026-03-10T07:52:16.355 INFO:teuthology.orchestra.run.vm05.stdout: "mon": { 2026-03-10T07:52:16.356 INFO:teuthology.orchestra.run.vm05.stdout: "ceph version 18.2.1 (7fe91d5d5842e04be3b4f514d6dd990c54b29c76) reef (stable)": 2 2026-03-10T07:52:16.356 INFO:teuthology.orchestra.run.vm05.stdout: }, 2026-03-10T07:52:16.356 INFO:teuthology.orchestra.run.vm05.stdout: "mgr": { 2026-03-10T07:52:16.356 INFO:teuthology.orchestra.run.vm05.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 2 2026-03-10T07:52:16.356 INFO:teuthology.orchestra.run.vm05.stdout: }, 2026-03-10T07:52:16.356 INFO:teuthology.orchestra.run.vm05.stdout: "osd": { 2026-03-10T07:52:16.356 INFO:teuthology.orchestra.run.vm05.stdout: "ceph version 18.2.1 (7fe91d5d5842e04be3b4f514d6dd990c54b29c76) reef (stable)": 6 2026-03-10T07:52:16.356 INFO:teuthology.orchestra.run.vm05.stdout: }, 2026-03-10T07:52:16.356 INFO:teuthology.orchestra.run.vm05.stdout: "mds": { 2026-03-10T07:52:16.356 INFO:teuthology.orchestra.run.vm05.stdout: "ceph version 18.2.1 (7fe91d5d5842e04be3b4f514d6dd990c54b29c76) reef (stable)": 4 2026-03-10T07:52:16.356 INFO:teuthology.orchestra.run.vm05.stdout: }, 2026-03-10T07:52:16.356 INFO:teuthology.orchestra.run.vm05.stdout: "overall": { 2026-03-10T07:52:16.356 INFO:teuthology.orchestra.run.vm05.stdout: "ceph version 18.2.1 (7fe91d5d5842e04be3b4f514d6dd990c54b29c76) reef (stable)": 12, 2026-03-10T07:52:16.356 INFO:teuthology.orchestra.run.vm05.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 2 2026-03-10T07:52:16.356 INFO:teuthology.orchestra.run.vm05.stdout: 
} 2026-03-10T07:52:16.356 INFO:teuthology.orchestra.run.vm05.stdout:} 2026-03-10T07:52:16.360 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:16.359+0000 7faf06114700 1 -- 192.168.123.105:0/3840469171 >> [v2:192.168.123.105:6800/413688438,v1:192.168.123.105:6801/413688438] conn(0x7faee8077590 msgr2=0x7faee8079a50 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:52:16.360 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:16.359+0000 7faf06114700 1 --2- 192.168.123.105:0/3840469171 >> [v2:192.168.123.105:6800/413688438,v1:192.168.123.105:6801/413688438] conn(0x7faee8077590 0x7faee8079a50 secure :-1 s=READY pgs=47 cs=0 l=1 rev1=1 crypto rx=0x7faef000ad00 tx=0x7faef000c040 comp rx=0 tx=0).stop 2026-03-10T07:52:16.360 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:16.359+0000 7faf06114700 1 -- 192.168.123.105:0/3840469171 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7faf0007d390 msgr2=0x7faf0007d810 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:52:16.361 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:16.359+0000 7faf06114700 1 --2- 192.168.123.105:0/3840469171 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7faf0007d390 0x7faf0007d810 secure :-1 s=READY pgs=128 cs=0 l=1 rev1=1 crypto rx=0x7faef800b590 tx=0x7faef800b5c0 comp rx=0 tx=0).stop 2026-03-10T07:52:16.361 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:16.359+0000 7faf06114700 1 -- 192.168.123.105:0/3840469171 shutdown_connections 2026-03-10T07:52:16.361 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:16.359+0000 7faf06114700 1 --2- 192.168.123.105:0/3840469171 >> [v2:192.168.123.105:6800/413688438,v1:192.168.123.105:6801/413688438] conn(0x7faee8077590 0x7faee8079a50 unknown :-1 s=CLOSED pgs=47 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:52:16.361 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:16.359+0000 7faf06114700 1 --2- 
192.168.123.105:0/3840469171 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7faf00107ff0 0x7faf0007ce50 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:52:16.361 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:16.359+0000 7faf06114700 1 --2- 192.168.123.105:0/3840469171 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7faf0007d390 0x7faf0007d810 unknown :-1 s=CLOSED pgs=128 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:52:16.361 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:16.359+0000 7faf06114700 1 -- 192.168.123.105:0/3840469171 >> 192.168.123.105:0/3840469171 conn(0x7faf0006ce20 msgr2=0x7faf00071590 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:52:16.361 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:16.360+0000 7faf06114700 1 -- 192.168.123.105:0/3840469171 shutdown_connections 2026-03-10T07:52:16.361 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:16.360+0000 7faf06114700 1 -- 192.168.123.105:0/3840469171 wait complete. 
2026-03-10T07:52:16.460 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:16.458+0000 7f3492d7e700 1 -- 192.168.123.105:0/1432165674 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f348c107ff0 msgr2=0x7f348c1083d0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:52:16.460 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:16.458+0000 7f3492d7e700 1 --2- 192.168.123.105:0/1432165674 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f348c107ff0 0x7f348c1083d0 secure :-1 s=READY pgs=129 cs=0 l=1 rev1=1 crypto rx=0x7f347c00bc70 tx=0x7f347c00bf80 comp rx=0 tx=0).stop 2026-03-10T07:52:16.460 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:16.458+0000 7f3492d7e700 1 -- 192.168.123.105:0/1432165674 shutdown_connections 2026-03-10T07:52:16.460 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:16.458+0000 7f3492d7e700 1 --2- 192.168.123.105:0/1432165674 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f348c1089a0 0x7f348c10be70 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:52:16.460 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:16.458+0000 7f3492d7e700 1 --2- 192.168.123.105:0/1432165674 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f348c107ff0 0x7f348c1083d0 unknown :-1 s=CLOSED pgs=129 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:52:16.460 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:16.458+0000 7f3492d7e700 1 -- 192.168.123.105:0/1432165674 >> 192.168.123.105:0/1432165674 conn(0x7f348c06ce20 msgr2=0x7f348c06d230 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:52:16.460 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:16.458+0000 7f3492d7e700 1 -- 192.168.123.105:0/1432165674 shutdown_connections 2026-03-10T07:52:16.460 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:16.458+0000 7f3492d7e700 1 -- 192.168.123.105:0/1432165674 
wait complete. 2026-03-10T07:52:16.460 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:16.458+0000 7f3492d7e700 1 Processor -- start 2026-03-10T07:52:16.460 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:16.458+0000 7f3492d7e700 1 -- start start 2026-03-10T07:52:16.461 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:16.458+0000 7f3492d7e700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f348c1089a0 0x7f348c07cea0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:52:16.461 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:16.458+0000 7f3492d7e700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f348c07d3e0 0x7f348c07d860 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:52:16.461 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:16.458+0000 7f3492d7e700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f348c081a20 con 0x7f348c1089a0 2026-03-10T07:52:16.461 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:16.458+0000 7f3492d7e700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f348c081b90 con 0x7f348c07d3e0 2026-03-10T07:52:16.461 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:16.459+0000 7f348bfff700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f348c07d3e0 0x7f348c07d860 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:52:16.461 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:16.459+0000 7f348bfff700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f348c07d3e0 0x7f348c07d860 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.108:3300/0 
says I am v2:192.168.123.105:42980/0 (socket says 192.168.123.105:42980) 2026-03-10T07:52:16.461 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:16.459+0000 7f348bfff700 1 -- 192.168.123.105:0/60615473 learned_addr learned my addr 192.168.123.105:0/60615473 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T07:52:16.461 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:16.460+0000 7f348bfff700 1 -- 192.168.123.105:0/60615473 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f348c1089a0 msgr2=0x7f348c07cea0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:52:16.461 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:16.460+0000 7f348bfff700 1 --2- 192.168.123.105:0/60615473 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f348c1089a0 0x7f348c07cea0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:52:16.461 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:16.460+0000 7f348bfff700 1 -- 192.168.123.105:0/60615473 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f347c00b920 con 0x7f348c07d3e0 2026-03-10T07:52:16.462 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:16.460+0000 7f348bfff700 1 --2- 192.168.123.105:0/60615473 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f348c07d3e0 0x7f348c07d860 secure :-1 s=READY pgs=130 cs=0 l=1 rev1=1 crypto rx=0x7f348400ea00 tx=0x7f348400edc0 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:52:16.464 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:16.461+0000 7f3489ffb700 1 -- 192.168.123.105:0/60615473 <== mon.1 v2:192.168.123.108:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f3484004d60 con 0x7f348c07d3e0 2026-03-10T07:52:16.464 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:16.461+0000 7f3489ffb700 1 -- 
192.168.123.105:0/60615473 <== mon.1 v2:192.168.123.108:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7f3484013070 con 0x7f348c07d3e0 2026-03-10T07:52:16.464 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:16.461+0000 7f3489ffb700 1 -- 192.168.123.105:0/60615473 <== mon.1 v2:192.168.123.108:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f348400b6c0 con 0x7f348c07d3e0 2026-03-10T07:52:16.464 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:16.462+0000 7f3492d7e700 1 -- 192.168.123.105:0/60615473 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f348c081e70 con 0x7f348c07d3e0 2026-03-10T07:52:16.464 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:16.462+0000 7f3492d7e700 1 -- 192.168.123.105:0/60615473 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f348c082390 con 0x7f348c07d3e0 2026-03-10T07:52:16.464 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:16.463+0000 7f34737fe700 1 -- 192.168.123.105:0/60615473 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f348c04f2e0 con 0x7f348c07d3e0 2026-03-10T07:52:16.475 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:16.466+0000 7f3489ffb700 1 -- 192.168.123.105:0/60615473 <== mon.1 v2:192.168.123.108:3300/0 4 ==== mgrmap(e 31) v1 ==== 99964+0+0 (secure 0 0 0) 0x7f34840078f0 con 0x7f348c07d3e0 2026-03-10T07:52:16.475 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:16.466+0000 7f3489ffb700 1 --2- 192.168.123.105:0/60615473 >> [v2:192.168.123.105:6800/413688438,v1:192.168.123.105:6801/413688438] conn(0x7f3474077540 0x7f3474079a00 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:52:16.475 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:16.467+0000 7f3489ffb700 1 -- 192.168.123.105:0/60615473 <== mon.1 
v2:192.168.123.108:3300/0 5 ==== osd_map(42..42 src has 1..42) v4 ==== 5878+0+0 (secure 0 0 0) 0x7f3484015070 con 0x7f348c07d3e0 2026-03-10T07:52:16.477 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:16.474+0000 7f3489ffb700 1 -- 192.168.123.105:0/60615473 <== mon.1 v2:192.168.123.108:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+191549 (secure 0 0 0) 0x7f3484055c50 con 0x7f348c07d3e0 2026-03-10T07:52:16.477 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:16.474+0000 7f3490b1a700 1 --2- 192.168.123.105:0/60615473 >> [v2:192.168.123.105:6800/413688438,v1:192.168.123.105:6801/413688438] conn(0x7f3474077540 0x7f3474079a00 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:52:16.483 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:16.481+0000 7f3490b1a700 1 --2- 192.168.123.105:0/60615473 >> [v2:192.168.123.105:6800/413688438,v1:192.168.123.105:6801/413688438] conn(0x7f3474077540 0x7f3474079a00 secure :-1 s=READY pgs=48 cs=0 l=1 rev1=1 crypto rx=0x7f347c00bc70 tx=0x7f347c003680 comp rx=0 tx=0).ready entity=mgr.14628 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:52:16.693 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:16 vm05.local ceph-mon[50387]: from='client.24535 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T07:52:16.694 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:16 vm05.local ceph-mon[50387]: from='client.24539 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T07:52:16.694 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:16 vm05.local ceph-mon[50387]: from='client.14750 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T07:52:16.694 
INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:16 vm05.local ceph-mon[50387]: from='client.? 192.168.123.105:0/3840469171' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T07:52:16.726 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:16.725+0000 7f34737fe700 1 -- 192.168.123.105:0/60615473 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "fs dump"} v 0) v1 -- 0x7f348c04ea90 con 0x7f348c07d3e0 2026-03-10T07:52:16.731 INFO:teuthology.orchestra.run.vm05.stdout:e12 2026-03-10T07:52:16.731 INFO:teuthology.orchestra.run.vm05.stdout:enable_multiple, ever_enabled_multiple: 1,1 2026-03-10T07:52:16.731 INFO:teuthology.orchestra.run.vm05.stdout:default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2} 2026-03-10T07:52:16.731 INFO:teuthology.orchestra.run.vm05.stdout:legacy client fscid: 1 2026-03-10T07:52:16.731 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-10T07:52:16.731 INFO:teuthology.orchestra.run.vm05.stdout:Filesystem 'cephfs' (1) 2026-03-10T07:52:16.731 INFO:teuthology.orchestra.run.vm05.stdout:fs_name cephfs 2026-03-10T07:52:16.731 INFO:teuthology.orchestra.run.vm05.stdout:epoch 12 2026-03-10T07:52:16.731 INFO:teuthology.orchestra.run.vm05.stdout:flags 12 joinable allow_snaps allow_multimds_snaps 2026-03-10T07:52:16.731 INFO:teuthology.orchestra.run.vm05.stdout:created 2026-03-10T07:48:24.309293+0000 2026-03-10T07:52:16.731 INFO:teuthology.orchestra.run.vm05.stdout:modified 2026-03-10T07:48:33.425187+0000 2026-03-10T07:52:16.731 INFO:teuthology.orchestra.run.vm05.stdout:tableserver 0 2026-03-10T07:52:16.731 INFO:teuthology.orchestra.run.vm05.stdout:root 0 2026-03-10T07:52:16.731 INFO:teuthology.orchestra.run.vm05.stdout:session_timeout 60 2026-03-10T07:52:16.731 
INFO:teuthology.orchestra.run.vm05.stdout:session_autoclose 300 2026-03-10T07:52:16.731 INFO:teuthology.orchestra.run.vm05.stdout:max_file_size 1099511627776 2026-03-10T07:52:16.731 INFO:teuthology.orchestra.run.vm05.stdout:required_client_features {} 2026-03-10T07:52:16.731 INFO:teuthology.orchestra.run.vm05.stdout:last_failure 0 2026-03-10T07:52:16.731 INFO:teuthology.orchestra.run.vm05.stdout:last_failure_osd_epoch 39 2026-03-10T07:52:16.731 INFO:teuthology.orchestra.run.vm05.stdout:compat compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2} 2026-03-10T07:52:16.731 INFO:teuthology.orchestra.run.vm05.stdout:max_mds 1 2026-03-10T07:52:16.731 INFO:teuthology.orchestra.run.vm05.stdout:in 0 2026-03-10T07:52:16.731 INFO:teuthology.orchestra.run.vm05.stdout:up {0=24297} 2026-03-10T07:52:16.731 INFO:teuthology.orchestra.run.vm05.stdout:failed 2026-03-10T07:52:16.731 INFO:teuthology.orchestra.run.vm05.stdout:damaged 2026-03-10T07:52:16.731 INFO:teuthology.orchestra.run.vm05.stdout:stopped 2026-03-10T07:52:16.731 INFO:teuthology.orchestra.run.vm05.stdout:data_pools [3] 2026-03-10T07:52:16.731 INFO:teuthology.orchestra.run.vm05.stdout:metadata_pool 2 2026-03-10T07:52:16.731 INFO:teuthology.orchestra.run.vm05.stdout:inline_data enabled 2026-03-10T07:52:16.731 INFO:teuthology.orchestra.run.vm05.stdout:balancer 2026-03-10T07:52:16.731 INFO:teuthology.orchestra.run.vm05.stdout:bal_rank_mask -1 2026-03-10T07:52:16.731 INFO:teuthology.orchestra.run.vm05.stdout:standby_count_wanted 1 2026-03-10T07:52:16.731 INFO:teuthology.orchestra.run.vm05.stdout:[mds.cephfs.vm05.omfhnh{0:24297} state up:active seq 5 join_fscid=1 addr [v2:192.168.123.105:6826/723078808,v1:192.168.123.105:6827/723078808] compat {c=[1],r=[1],i=[7ff]}] 2026-03-10T07:52:16.732 
INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-10T07:52:16.732 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-10T07:52:16.732 INFO:teuthology.orchestra.run.vm05.stdout:Standby daemons: 2026-03-10T07:52:16.732 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-10T07:52:16.732 INFO:teuthology.orchestra.run.vm05.stdout:[mds.cephfs.vm05.pavqil{-1:14512} state up:standby seq 2 join_fscid=1 addr [v2:192.168.123.105:6828/427555544,v1:192.168.123.105:6829/427555544] compat {c=[1],r=[1],i=[7ff]}] 2026-03-10T07:52:16.732 INFO:teuthology.orchestra.run.vm05.stdout:[mds.cephfs.vm08.ybmbgd{-1:14524} state up:standby seq 1 join_fscid=1 addr [v2:192.168.123.108:6824/3815585988,v1:192.168.123.108:6825/3815585988] compat {c=[1],r=[1],i=[7ff]}] 2026-03-10T07:52:16.732 INFO:teuthology.orchestra.run.vm05.stdout:[mds.cephfs.vm08.dgsaon{-1:24313} state up:standby seq 2 join_fscid=1 addr [v2:192.168.123.108:6826/2963085185,v1:192.168.123.108:6827/2963085185] compat {c=[1],r=[1],i=[7ff]}] 2026-03-10T07:52:16.732 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:16.729+0000 7f3489ffb700 1 -- 192.168.123.105:0/60615473 <== mon.1 v2:192.168.123.108:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump"}]=0 dumped fsmap epoch 12 v12) v1 ==== 76+0+1824 (secure 0 0 0) 0x7f34840629d0 con 0x7f348c07d3e0 2026-03-10T07:52:16.748 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:16.742+0000 7f3492d7e700 1 -- 192.168.123.105:0/60615473 >> [v2:192.168.123.105:6800/413688438,v1:192.168.123.105:6801/413688438] conn(0x7f3474077540 msgr2=0x7f3474079a00 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:52:16.748 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:16.742+0000 7f3492d7e700 1 --2- 192.168.123.105:0/60615473 >> [v2:192.168.123.105:6800/413688438,v1:192.168.123.105:6801/413688438] conn(0x7f3474077540 0x7f3474079a00 secure :-1 s=READY pgs=48 cs=0 l=1 rev1=1 crypto rx=0x7f347c00bc70 tx=0x7f347c003680 comp rx=0 tx=0).stop 2026-03-10T07:52:16.748 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:16.742+0000 7f3492d7e700 1 -- 192.168.123.105:0/60615473 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f348c07d3e0 msgr2=0x7f348c07d860 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:52:16.748 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:16.742+0000 7f3492d7e700 1 --2- 192.168.123.105:0/60615473 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f348c07d3e0 0x7f348c07d860 secure :-1 s=READY pgs=130 cs=0 l=1 rev1=1 crypto rx=0x7f348400ea00 tx=0x7f348400edc0 comp rx=0 tx=0).stop 2026-03-10T07:52:16.748 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:16.743+0000 7f3492d7e700 1 -- 192.168.123.105:0/60615473 shutdown_connections 2026-03-10T07:52:16.748 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:16.743+0000 7f3492d7e700 1 --2- 192.168.123.105:0/60615473 >> [v2:192.168.123.105:6800/413688438,v1:192.168.123.105:6801/413688438] conn(0x7f3474077540 0x7f3474079a00 unknown :-1 s=CLOSED pgs=48 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:52:16.748 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:16.743+0000 7f3492d7e700 1 --2- 192.168.123.105:0/60615473 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f348c1089a0 0x7f348c07cea0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:52:16.748 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:16.743+0000 7f3492d7e700 1 --2- 192.168.123.105:0/60615473 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f348c07d3e0 0x7f348c07d860 unknown :-1 s=CLOSED pgs=130 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:52:16.748 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:16.743+0000 7f3492d7e700 1 -- 192.168.123.105:0/60615473 >> 192.168.123.105:0/60615473 conn(0x7f348c06ce20 msgr2=0x7f348c071590 unknown :-1 s=STATE_NONE l=0).mark_down 
2026-03-10T07:52:16.754 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:16.746+0000 7f3492d7e700 1 -- 192.168.123.105:0/60615473 shutdown_connections 2026-03-10T07:52:16.754 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:16.752+0000 7f3492d7e700 1 -- 192.168.123.105:0/60615473 wait complete. 2026-03-10T07:52:16.763 INFO:teuthology.orchestra.run.vm05.stderr:dumped fsmap epoch 12 2026-03-10T07:52:16.866 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:16.864+0000 7fa82cd91700 1 -- 192.168.123.105:0/2998765980 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fa828107ff0 msgr2=0x7fa8281083d0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:52:16.866 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:16.864+0000 7fa82cd91700 1 --2- 192.168.123.105:0/2998765980 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fa828107ff0 0x7fa8281083d0 secure :-1 s=READY pgs=333 cs=0 l=1 rev1=1 crypto rx=0x7fa818009b00 tx=0x7fa818009e10 comp rx=0 tx=0).stop 2026-03-10T07:52:16.866 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:16.865+0000 7fa82cd91700 1 -- 192.168.123.105:0/2998765980 shutdown_connections 2026-03-10T07:52:16.866 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:16.865+0000 7fa82cd91700 1 --2- 192.168.123.105:0/2998765980 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fa8281089a0 0x7fa82810be70 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:52:16.866 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:16.865+0000 7fa82cd91700 1 --2- 192.168.123.105:0/2998765980 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fa828107ff0 0x7fa8281083d0 unknown :-1 s=CLOSED pgs=333 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:52:16.866 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:16.865+0000 7fa82cd91700 1 -- 192.168.123.105:0/2998765980 >> 
192.168.123.105:0/2998765980 conn(0x7fa82806ce20 msgr2=0x7fa82806d230 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:52:16.867 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:16.865+0000 7fa82cd91700 1 -- 192.168.123.105:0/2998765980 shutdown_connections 2026-03-10T07:52:16.867 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:16.865+0000 7fa82cd91700 1 -- 192.168.123.105:0/2998765980 wait complete. 2026-03-10T07:52:16.867 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:16.865+0000 7fa82cd91700 1 Processor -- start 2026-03-10T07:52:16.867 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:16.865+0000 7fa82cd91700 1 -- start start 2026-03-10T07:52:16.867 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:16.865+0000 7fa82cd91700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fa8281089a0 0x7fa82807ceb0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:52:16.867 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:16.865+0000 7fa82cd91700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fa82807d3f0 0x7fa82807d870 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:52:16.867 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:16.865+0000 7fa82cd91700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fa828081a30 con 0x7fa8281089a0 2026-03-10T07:52:16.867 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:16.865+0000 7fa82cd91700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fa828081ba0 con 0x7fa82807d3f0 2026-03-10T07:52:16.867 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:16.866+0000 7fa825d9b700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fa82807d3f0 0x7fa82807d870 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto 
rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:52:16.867 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:16.866+0000 7fa825d9b700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fa82807d3f0 0x7fa82807d870 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.108:3300/0 says I am v2:192.168.123.105:43002/0 (socket says 192.168.123.105:43002) 2026-03-10T07:52:16.867 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:16.866+0000 7fa825d9b700 1 -- 192.168.123.105:0/2086865845 learned_addr learned my addr 192.168.123.105:0/2086865845 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T07:52:16.867 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:16.866+0000 7fa825d9b700 1 -- 192.168.123.105:0/2086865845 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fa8281089a0 msgr2=0x7fa82807ceb0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:52:16.867 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:16.866+0000 7fa825d9b700 1 --2- 192.168.123.105:0/2086865845 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fa8281089a0 0x7fa82807ceb0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:52:16.867 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:16.866+0000 7fa825d9b700 1 -- 192.168.123.105:0/2086865845 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fa8180097e0 con 0x7fa82807d3f0 2026-03-10T07:52:16.867 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:16.866+0000 7fa825d9b700 1 --2- 192.168.123.105:0/2086865845 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fa82807d3f0 0x7fa82807d870 secure :-1 s=READY pgs=131 cs=0 l=1 rev1=1 crypto rx=0x7fa82000c6e0 tx=0x7fa82000caa0 comp rx=0 tx=0).ready entity=mon.1 
client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:52:16.870 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:16.868+0000 7fa8177fe700 1 -- 192.168.123.105:0/2086865845 <== mon.1 v2:192.168.123.108:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fa820012070 con 0x7fa82807d3f0 2026-03-10T07:52:16.871 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:16.868+0000 7fa82cd91700 1 -- 192.168.123.105:0/2086865845 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fa828081e80 con 0x7fa82807d3f0 2026-03-10T07:52:16.871 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:16.868+0000 7fa82cd91700 1 -- 192.168.123.105:0/2086865845 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fa8280823d0 con 0x7fa82807d3f0 2026-03-10T07:52:16.871 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:16.869+0000 7fa8177fe700 1 -- 192.168.123.105:0/2086865845 <== mon.1 v2:192.168.123.108:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7fa8200121d0 con 0x7fa82807d3f0 2026-03-10T07:52:16.871 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:16.869+0000 7fa8177fe700 1 -- 192.168.123.105:0/2086865845 <== mon.1 v2:192.168.123.108:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fa820007910 con 0x7fa82807d3f0 2026-03-10T07:52:16.873 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:16.872+0000 7fa8177fe700 1 -- 192.168.123.105:0/2086865845 <== mon.1 v2:192.168.123.108:3300/0 4 ==== mgrmap(e 31) v1 ==== 99964+0+0 (secure 0 0 0) 0x7fa820011540 con 0x7fa82807d3f0 2026-03-10T07:52:16.874 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:16.872+0000 7fa8177fe700 1 --2- 192.168.123.105:0/2086865845 >> [v2:192.168.123.105:6800/413688438,v1:192.168.123.105:6801/413688438] conn(0x7fa810077660 0x7fa810079b20 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 
2026-03-10T07:52:16.877 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:16.875+0000 7fa82659c700 1 --2- 192.168.123.105:0/2086865845 >> [v2:192.168.123.105:6800/413688438,v1:192.168.123.105:6801/413688438] conn(0x7fa810077660 0x7fa810079b20 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:52:16.877 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:16.875+0000 7fa82659c700 1 --2- 192.168.123.105:0/2086865845 >> [v2:192.168.123.105:6800/413688438,v1:192.168.123.105:6801/413688438] conn(0x7fa810077660 0x7fa810079b20 secure :-1 s=READY pgs=49 cs=0 l=1 rev1=1 crypto rx=0x7fa818009ad0 tx=0x7fa818009700 comp rx=0 tx=0).ready entity=mgr.14628 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:52:16.877 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:16.875+0000 7fa8177fe700 1 -- 192.168.123.105:0/2086865845 <== mon.1 v2:192.168.123.108:3300/0 5 ==== osd_map(42..42 src has 1..42) v4 ==== 5878+0+0 (secure 0 0 0) 0x7fa82009d710 con 0x7fa82807d3f0 2026-03-10T07:52:16.877 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:16.876+0000 7fa82cd91700 1 -- 192.168.123.105:0/2086865845 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fa808005320 con 0x7fa82807d3f0 2026-03-10T07:52:16.884 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:16.883+0000 7fa8177fe700 1 -- 192.168.123.105:0/2086865845 <== mon.1 v2:192.168.123.108:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+191549 (secure 0 0 0) 0x7fa820066440 con 0x7fa82807d3f0 2026-03-10T07:52:16.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:16 vm08.local ceph-mon[59917]: from='client.24535 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T07:52:16.918 
INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:16 vm08.local ceph-mon[59917]: from='client.24539 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T07:52:16.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:16 vm08.local ceph-mon[59917]: from='client.14750 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T07:52:16.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:16 vm08.local ceph-mon[59917]: from='client.? 192.168.123.105:0/3840469171' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T07:52:17.083 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:17.081+0000 7fa82cd91700 1 -- 192.168.123.105:0/2086865845 --> [v2:192.168.123.105:6800/413688438,v1:192.168.123.105:6801/413688438] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7fa808000bf0 con 0x7fa810077660 2026-03-10T07:52:17.088 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:17.086+0000 7fa8177fe700 1 -- 192.168.123.105:0/2086865845 <== mgr.14628 v2:192.168.123.105:6800/413688438 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+368 (secure 0 0 0) 0x7fa808000bf0 con 0x7fa810077660 2026-03-10T07:52:17.091 INFO:teuthology.orchestra.run.vm05.stdout:{ 2026-03-10T07:52:17.091 INFO:teuthology.orchestra.run.vm05.stdout: "target_image": "quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc", 2026-03-10T07:52:17.091 INFO:teuthology.orchestra.run.vm05.stdout: "in_progress": true, 2026-03-10T07:52:17.091 INFO:teuthology.orchestra.run.vm05.stdout: "which": "Upgrading all daemon types on all hosts", 2026-03-10T07:52:17.091 INFO:teuthology.orchestra.run.vm05.stdout: "services_complete": [ 2026-03-10T07:52:17.091 INFO:teuthology.orchestra.run.vm05.stdout: "mgr" 2026-03-10T07:52:17.091 INFO:teuthology.orchestra.run.vm05.stdout: ], 2026-03-10T07:52:17.091 
INFO:teuthology.orchestra.run.vm05.stdout: "progress": "2/23 daemons upgraded", 2026-03-10T07:52:17.091 INFO:teuthology.orchestra.run.vm05.stdout: "message": "Currently upgrading mon daemons", 2026-03-10T07:52:17.091 INFO:teuthology.orchestra.run.vm05.stdout: "is_paused": false 2026-03-10T07:52:17.091 INFO:teuthology.orchestra.run.vm05.stdout:} 2026-03-10T07:52:17.091 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:17.090+0000 7fa8157fa700 1 -- 192.168.123.105:0/2086865845 >> [v2:192.168.123.105:6800/413688438,v1:192.168.123.105:6801/413688438] conn(0x7fa810077660 msgr2=0x7fa810079b20 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:52:17.091 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:17.090+0000 7fa8157fa700 1 --2- 192.168.123.105:0/2086865845 >> [v2:192.168.123.105:6800/413688438,v1:192.168.123.105:6801/413688438] conn(0x7fa810077660 0x7fa810079b20 secure :-1 s=READY pgs=49 cs=0 l=1 rev1=1 crypto rx=0x7fa818009ad0 tx=0x7fa818009700 comp rx=0 tx=0).stop 2026-03-10T07:52:17.091 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:17.090+0000 7fa8157fa700 1 -- 192.168.123.105:0/2086865845 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fa82807d3f0 msgr2=0x7fa82807d870 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:52:17.091 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:17.090+0000 7fa8157fa700 1 --2- 192.168.123.105:0/2086865845 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fa82807d3f0 0x7fa82807d870 secure :-1 s=READY pgs=131 cs=0 l=1 rev1=1 crypto rx=0x7fa82000c6e0 tx=0x7fa82000caa0 comp rx=0 tx=0).stop 2026-03-10T07:52:17.093 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:17.092+0000 7fa8157fa700 1 -- 192.168.123.105:0/2086865845 shutdown_connections 2026-03-10T07:52:17.093 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:17.092+0000 7fa8157fa700 1 --2- 192.168.123.105:0/2086865845 >> 
[v2:192.168.123.105:6800/413688438,v1:192.168.123.105:6801/413688438] conn(0x7fa810077660 0x7fa810079b20 unknown :-1 s=CLOSED pgs=49 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:52:17.093 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:17.092+0000 7fa8157fa700 1 --2- 192.168.123.105:0/2086865845 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fa8281089a0 0x7fa82807ceb0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:52:17.093 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:17.092+0000 7fa8157fa700 1 --2- 192.168.123.105:0/2086865845 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fa82807d3f0 0x7fa82807d870 unknown :-1 s=CLOSED pgs=131 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:52:17.093 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:17.092+0000 7fa8157fa700 1 -- 192.168.123.105:0/2086865845 >> 192.168.123.105:0/2086865845 conn(0x7fa82806ce20 msgr2=0x7fa8280705d0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:52:17.093 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:17.092+0000 7fa8157fa700 1 -- 192.168.123.105:0/2086865845 shutdown_connections 2026-03-10T07:52:17.094 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:17.092+0000 7fa8157fa700 1 -- 192.168.123.105:0/2086865845 wait complete. 
2026-03-10T07:52:17.221 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:17.219+0000 7f94eab09700 1 -- 192.168.123.105:0/2544051749 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f94e41030c0 msgr2=0x7f94e41034a0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:52:17.221 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:17.219+0000 7f94eab09700 1 --2- 192.168.123.105:0/2544051749 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f94e41030c0 0x7f94e41034a0 secure :-1 s=READY pgs=334 cs=0 l=1 rev1=1 crypto rx=0x7f94e0009b00 tx=0x7f94e0009e10 comp rx=0 tx=0).stop 2026-03-10T07:52:17.221 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:17.220+0000 7f94eab09700 1 -- 192.168.123.105:0/2544051749 shutdown_connections 2026-03-10T07:52:17.221 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:17.220+0000 7f94eab09700 1 --2- 192.168.123.105:0/2544051749 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f94e4103a70 0x7f94e4107bd0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:52:17.221 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:17.220+0000 7f94eab09700 1 --2- 192.168.123.105:0/2544051749 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f94e41030c0 0x7f94e41034a0 unknown :-1 s=CLOSED pgs=334 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:52:17.221 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:17.220+0000 7f94eab09700 1 -- 192.168.123.105:0/2544051749 >> 192.168.123.105:0/2544051749 conn(0x7f94e40fe930 msgr2=0x7f94e4100d50 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:52:17.225 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:17.221+0000 7f94eab09700 1 -- 192.168.123.105:0/2544051749 shutdown_connections 2026-03-10T07:52:17.225 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:17.224+0000 7f94eab09700 1 -- 192.168.123.105:0/2544051749 
wait complete. 2026-03-10T07:52:17.225 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:17.224+0000 7f94eab09700 1 Processor -- start 2026-03-10T07:52:17.226 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:17.224+0000 7f94eab09700 1 -- start start 2026-03-10T07:52:17.226 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:17.225+0000 7f94eab09700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f94e41030c0 0x7f94e419ea60 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:52:17.226 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:17.225+0000 7f94eab09700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f94e4103a70 0x7f94e419efa0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:52:17.227 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:17.225+0000 7f94eab09700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f94e419f630 con 0x7f94e41030c0 2026-03-10T07:52:17.227 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:17.225+0000 7f94eab09700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f94e4198ae0 con 0x7f94e4103a70 2026-03-10T07:52:17.227 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:17.225+0000 7f94e9306700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f94e4103a70 0x7f94e419efa0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:52:17.227 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:17.225+0000 7f94e9306700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f94e4103a70 0x7f94e419efa0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.108:3300/0 
says I am v2:192.168.123.105:43026/0 (socket says 192.168.123.105:43026) 2026-03-10T07:52:17.227 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:17.225+0000 7f94e9306700 1 -- 192.168.123.105:0/321914181 learned_addr learned my addr 192.168.123.105:0/321914181 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T07:52:17.227 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:17.225+0000 7f94e9b07700 1 --2- 192.168.123.105:0/321914181 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f94e41030c0 0x7f94e419ea60 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:52:17.227 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:17.225+0000 7f94e9306700 1 -- 192.168.123.105:0/321914181 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f94e41030c0 msgr2=0x7f94e419ea60 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:52:17.227 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:17.225+0000 7f94e9306700 1 --2- 192.168.123.105:0/321914181 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f94e41030c0 0x7f94e419ea60 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:52:17.227 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:17.225+0000 7f94e9306700 1 -- 192.168.123.105:0/321914181 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f94e00097e0 con 0x7f94e4103a70 2026-03-10T07:52:17.227 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:17.225+0000 7f94e9306700 1 --2- 192.168.123.105:0/321914181 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f94e4103a70 0x7f94e419efa0 secure :-1 s=READY pgs=132 cs=0 l=1 rev1=1 crypto rx=0x7f94d400b700 tx=0x7f94d400bac0 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 
2026-03-10T07:52:17.227 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:17.226+0000 7f94daffd700 1 -- 192.168.123.105:0/321914181 <== mon.1 v2:192.168.123.108:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f94d4010820 con 0x7f94e4103a70 2026-03-10T07:52:17.227 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:17.226+0000 7f94eab09700 1 -- 192.168.123.105:0/321914181 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f94e4198dc0 con 0x7f94e4103a70 2026-03-10T07:52:17.230 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:17.226+0000 7f94eab09700 1 -- 192.168.123.105:0/321914181 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f94e4199310 con 0x7f94e4103a70 2026-03-10T07:52:17.230 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:17.226+0000 7f94daffd700 1 -- 192.168.123.105:0/321914181 <== mon.1 v2:192.168.123.108:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7f94d4010e60 con 0x7f94e4103a70 2026-03-10T07:52:17.230 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:17.226+0000 7f94daffd700 1 -- 192.168.123.105:0/321914181 <== mon.1 v2:192.168.123.108:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f94d4017570 con 0x7f94e4103a70 2026-03-10T07:52:17.230 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:17.228+0000 7f94eab09700 1 -- 192.168.123.105:0/321914181 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f94c8005320 con 0x7f94e4103a70 2026-03-10T07:52:17.230 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:17.229+0000 7f94daffd700 1 -- 192.168.123.105:0/321914181 <== mon.1 v2:192.168.123.108:3300/0 4 ==== mgrmap(e 31) v1 ==== 99964+0+0 (secure 0 0 0) 0x7f94d4010980 con 0x7f94e4103a70 2026-03-10T07:52:17.230 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:17.229+0000 
7f94daffd700 1 --2- 192.168.123.105:0/321914181 >> [v2:192.168.123.105:6800/413688438,v1:192.168.123.105:6801/413688438] conn(0x7f94d0077610 0x7f94d0079ad0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:52:17.230 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:17.229+0000 7f94daffd700 1 -- 192.168.123.105:0/321914181 <== mon.1 v2:192.168.123.108:3300/0 5 ==== osd_map(42..42 src has 1..42) v4 ==== 5878+0+0 (secure 0 0 0) 0x7f94d4098d30 con 0x7f94e4103a70 2026-03-10T07:52:17.233 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:17.229+0000 7f94e9b07700 1 --2- 192.168.123.105:0/321914181 >> [v2:192.168.123.105:6800/413688438,v1:192.168.123.105:6801/413688438] conn(0x7f94d0077610 0x7f94d0079ad0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:52:17.233 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:17.230+0000 7f94e9b07700 1 --2- 192.168.123.105:0/321914181 >> [v2:192.168.123.105:6800/413688438,v1:192.168.123.105:6801/413688438] conn(0x7f94d0077610 0x7f94d0079ad0 secure :-1 s=READY pgs=50 cs=0 l=1 rev1=1 crypto rx=0x7f94e0006010 tx=0x7f94e000b580 comp rx=0 tx=0).ready entity=mgr.14628 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:52:17.236 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:17.232+0000 7f94daffd700 1 -- 192.168.123.105:0/321914181 <== mon.1 v2:192.168.123.108:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+191549 (secure 0 0 0) 0x7f94d4061290 con 0x7f94e4103a70 2026-03-10T07:52:17.530 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:17.528+0000 7f94eab09700 1 -- 192.168.123.105:0/321914181 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "health", "detail": "detail"} v 0) v1 -- 0x7f94c8005190 con 0x7f94e4103a70 2026-03-10T07:52:17.532 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:17.531+0000 7f94daffd700 1 -- 192.168.123.105:0/321914181 <== mon.1 v2:192.168.123.108:3300/0 7 ==== mon_command_ack([{"prefix": "health", "detail": "detail"}]=0 v0) v1 ==== 74+0+201 (secure 0 0 0) 0x7f94d40610b0 con 0x7f94e4103a70 2026-03-10T07:52:17.532 INFO:teuthology.orchestra.run.vm05.stdout:HEALTH_WARN 1 filesystem with deprecated feature inline_data 2026-03-10T07:52:17.532 INFO:teuthology.orchestra.run.vm05.stdout:[WRN] FS_INLINE_DATA_DEPRECATED: 1 filesystem with deprecated feature inline_data 2026-03-10T07:52:17.532 INFO:teuthology.orchestra.run.vm05.stdout: fs cephfs has deprecated feature inline_data enabled. 2026-03-10T07:52:17.537 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:17.535+0000 7f94d8ff9700 1 -- 192.168.123.105:0/321914181 >> [v2:192.168.123.105:6800/413688438,v1:192.168.123.105:6801/413688438] conn(0x7f94d0077610 msgr2=0x7f94d0079ad0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:52:17.537 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:17.535+0000 7f94d8ff9700 1 --2- 192.168.123.105:0/321914181 >> [v2:192.168.123.105:6800/413688438,v1:192.168.123.105:6801/413688438] conn(0x7f94d0077610 0x7f94d0079ad0 secure :-1 s=READY pgs=50 cs=0 l=1 rev1=1 crypto rx=0x7f94e0006010 tx=0x7f94e000b580 comp rx=0 tx=0).stop 2026-03-10T07:52:17.537 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:17.535+0000 7f94d8ff9700 1 -- 192.168.123.105:0/321914181 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f94e4103a70 msgr2=0x7f94e419efa0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:52:17.537 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:17.535+0000 7f94d8ff9700 1 --2- 192.168.123.105:0/321914181 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f94e4103a70 0x7f94e419efa0 secure :-1 s=READY pgs=132 cs=0 l=1 rev1=1 crypto rx=0x7f94d400b700 tx=0x7f94d400bac0 comp rx=0 tx=0).stop 
2026-03-10T07:52:17.537 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:17.536+0000 7f94d8ff9700 1 -- 192.168.123.105:0/321914181 shutdown_connections 2026-03-10T07:52:17.537 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:17.536+0000 7f94d8ff9700 1 --2- 192.168.123.105:0/321914181 >> [v2:192.168.123.105:6800/413688438,v1:192.168.123.105:6801/413688438] conn(0x7f94d0077610 0x7f94d0079ad0 unknown :-1 s=CLOSED pgs=50 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:52:17.537 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:17.536+0000 7f94d8ff9700 1 --2- 192.168.123.105:0/321914181 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f94e41030c0 0x7f94e419ea60 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:52:17.537 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:17.536+0000 7f94d8ff9700 1 --2- 192.168.123.105:0/321914181 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f94e4103a70 0x7f94e419efa0 unknown :-1 s=CLOSED pgs=132 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:52:17.537 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:17.536+0000 7f94d8ff9700 1 -- 192.168.123.105:0/321914181 >> 192.168.123.105:0/321914181 conn(0x7f94e40fe930 msgr2=0x7f94e4100cf0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:52:17.539 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:17.537+0000 7f94d8ff9700 1 -- 192.168.123.105:0/321914181 shutdown_connections 2026-03-10T07:52:17.539 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:17.537+0000 7f94d8ff9700 1 -- 192.168.123.105:0/321914181 wait complete. 2026-03-10T07:52:17.782 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:17 vm05.local ceph-mon[50387]: from='client.? 
192.168.123.105:0/60615473' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch 2026-03-10T07:52:17.782 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:17 vm05.local ceph-mon[50387]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' 2026-03-10T07:52:17.782 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:17 vm05.local ceph-mon[50387]: Upgrade: Target is version 19.2.3-678-ge911bdeb (squid) 2026-03-10T07:52:17.782 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:17 vm05.local ceph-mon[50387]: Upgrade: Target container is quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, digests ['quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc'] 2026-03-10T07:52:17.782 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:17 vm05.local ceph-mon[50387]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T07:52:17.782 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:17 vm05.local ceph-mon[50387]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T07:52:17.782 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:17 vm05.local ceph-mon[50387]: Upgrade: Setting container_image for all mgr 2026-03-10T07:52:17.782 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:17 vm05.local ceph-mon[50387]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' 2026-03-10T07:52:17.782 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:17 vm05.local ceph-mon[50387]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' cmd=[{"prefix": "quorum_status"}]: dispatch 2026-03-10T07:52:17.782 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:17 vm05.local ceph-mon[50387]: pgmap v58: 65 pgs: 65 active+clean; 450 MiB data, 3.2 GiB used, 117 GiB / 120 GiB 
avail; 20 KiB/s rd, 914 KiB/s wr, 91 op/s 2026-03-10T07:52:17.782 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:17 vm05.local ceph-mon[50387]: from='client.24551 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T07:52:17.782 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:17 vm05.local ceph-mon[50387]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' 2026-03-10T07:52:17.782 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:17 vm05.local ceph-mon[50387]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' cmd=[{"prefix": "auth get", "entity": "mon."}]: dispatch 2026-03-10T07:52:17.782 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:17 vm05.local ceph-mon[50387]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' cmd=[{"prefix": "config get", "who": "mon", "key": "public_network"}]: dispatch 2026-03-10T07:52:17.782 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:17 vm05.local ceph-mon[50387]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T07:52:17.782 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:17 vm05.local ceph-mon[50387]: from='client.? 192.168.123.105:0/321914181' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch 2026-03-10T07:52:17.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:17 vm08.local ceph-mon[59917]: from='client.? 
192.168.123.105:0/60615473' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch 2026-03-10T07:52:17.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:17 vm08.local ceph-mon[59917]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' 2026-03-10T07:52:17.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:17 vm08.local ceph-mon[59917]: Upgrade: Target is version 19.2.3-678-ge911bdeb (squid) 2026-03-10T07:52:17.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:17 vm08.local ceph-mon[59917]: Upgrade: Target container is quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, digests ['quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc'] 2026-03-10T07:52:17.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:17 vm08.local ceph-mon[59917]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T07:52:17.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:17 vm08.local ceph-mon[59917]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T07:52:17.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:17 vm08.local ceph-mon[59917]: Upgrade: Setting container_image for all mgr 2026-03-10T07:52:17.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:17 vm08.local ceph-mon[59917]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' 2026-03-10T07:52:17.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:17 vm08.local ceph-mon[59917]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' cmd=[{"prefix": "quorum_status"}]: dispatch 2026-03-10T07:52:17.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:17 vm08.local ceph-mon[59917]: pgmap v58: 65 pgs: 65 active+clean; 450 MiB data, 3.2 GiB used, 117 GiB / 120 GiB 
avail; 20 KiB/s rd, 914 KiB/s wr, 91 op/s 2026-03-10T07:52:17.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:17 vm08.local ceph-mon[59917]: from='client.24551 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T07:52:17.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:17 vm08.local ceph-mon[59917]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' 2026-03-10T07:52:17.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:17 vm08.local ceph-mon[59917]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' cmd=[{"prefix": "auth get", "entity": "mon."}]: dispatch 2026-03-10T07:52:17.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:17 vm08.local ceph-mon[59917]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' cmd=[{"prefix": "config get", "who": "mon", "key": "public_network"}]: dispatch 2026-03-10T07:52:17.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:17 vm08.local ceph-mon[59917]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T07:52:17.919 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:17 vm08.local ceph-mon[59917]: from='client.? 192.168.123.105:0/321914181' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch 2026-03-10T07:52:18.105 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:17 vm05.local systemd[1]: Stopping Ceph mon.vm05 for 12e9780e-1c55-11f1-8896-79f7c2e9b508... 
2026-03-10T07:52:18.408 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:18 vm05.local ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508-mon-vm05[50383]: 2026-03-10T07:52:18.102+0000 7f1d864de700 -1 received signal: Terminated from /run/podman-init -- /usr/bin/ceph-mon -n mon.vm05 -f --setuser ceph --setgroup ceph --default-log-to-file=false --default-log-to-journald=true --default-log-to-stderr=false --default-mon-cluster-log-to-file=false --default-mon-cluster-log-to-journald=true --default-mon-cluster-log-to-stderr=false (PID: 1) UID: 0 2026-03-10T07:52:18.408 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:18 vm05.local ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508-mon-vm05[50383]: 2026-03-10T07:52:18.102+0000 7f1d864de700 -1 mon.vm05@0(leader) e2 *** Got Signal Terminated *** 2026-03-10T07:52:18.408 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:18 vm05.local podman[129981]: 2026-03-10 07:52:18.149658934 +0000 UTC m=+0.069791747 container died 2a459bf05146cbba941e46798fca428be9fb365229bdc64db44d9ab8b736af7f (image=quay.io/ceph/ceph:v18.2.1, name=ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508-mon-vm05, io.buildah.version=1.29.1, GIT_REPO=https://github.com/ceph/ceph-container.git, maintainer=Guillaume Abrioux , org.label-schema.name=CentOS Stream 8 Base Image, CEPH_POINT_RELEASE=-18.2.1, GIT_CLEAN=True, GIT_COMMIT=1617517b9622744995c4661989e3c30d036a7cfd, RELEASE=HEAD, org.label-schema.license=GPLv2, org.label-schema.build-date=20240222, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, GIT_BRANCH=HEAD, ceph=True) 2026-03-10T07:52:18.408 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:18 vm05.local podman[129981]: 2026-03-10 07:52:18.175197242 +0000 UTC m=+0.095330045 container remove 2a459bf05146cbba941e46798fca428be9fb365229bdc64db44d9ab8b736af7f (image=quay.io/ceph/ceph:v18.2.1, name=ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508-mon-vm05, org.label-schema.build-date=20240222, org.label-schema.vendor=CentOS, 
CEPH_POINT_RELEASE=-18.2.1, GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git, ceph=True, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 8 Base Image, org.label-schema.schema-version=1.0, GIT_COMMIT=1617517b9622744995c4661989e3c30d036a7cfd, RELEASE=HEAD, io.buildah.version=1.29.1, GIT_BRANCH=HEAD, maintainer=Guillaume Abrioux ) 2026-03-10T07:52:18.408 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:18 vm05.local bash[129981]: ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508-mon-vm05 2026-03-10T07:52:18.408 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:18 vm05.local systemd[1]: ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508@mon.vm05.service: Deactivated successfully. 2026-03-10T07:52:18.408 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:18 vm05.local systemd[1]: Stopped Ceph mon.vm05 for 12e9780e-1c55-11f1-8896-79f7c2e9b508. 2026-03-10T07:52:18.408 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:18 vm05.local systemd[1]: ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508@mon.vm05.service: Consumed 7.102s CPU time. 2026-03-10T07:52:18.726 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:18 vm05.local systemd[1]: Starting Ceph mon.vm05 for 12e9780e-1c55-11f1-8896-79f7c2e9b508... 
2026-03-10T07:52:19.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:18 vm05.local podman[130103]: 2026-03-10 07:52:18.725694006 +0000 UTC m=+0.034213181 container create f02f076bb820289223dcf007e14aa01f5c8a7787ea73b684f54775da9d486e35 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508-mon-vm05, io.buildah.version=1.41.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.license=GPLv2, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260223, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team ) 2026-03-10T07:52:19.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:18 vm05.local podman[130103]: 2026-03-10 07:52:18.772163174 +0000 UTC m=+0.080682359 container init f02f076bb820289223dcf007e14aa01f5c8a7787ea73b684f54775da9d486e35 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508-mon-vm05, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20260223, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team , 
org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.41.3) 2026-03-10T07:52:19.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:18 vm05.local podman[130103]: 2026-03-10 07:52:18.775970653 +0000 UTC m=+0.084489828 container start f02f076bb820289223dcf007e14aa01f5c8a7787ea73b684f54775da9d486e35 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508-mon-vm05, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.build-date=20260223, io.buildah.version=1.41.3, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df) 2026-03-10T07:52:19.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:18 vm05.local bash[130103]: f02f076bb820289223dcf007e14aa01f5c8a7787ea73b684f54775da9d486e35 2026-03-10T07:52:19.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:18 vm05.local podman[130103]: 2026-03-10 07:52:18.705719206 +0000 UTC m=+0.014238381 image pull 654f31e6858eb235bbece362255b685a945f2b6a367e2b88c4930c984fbb214c quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc 2026-03-10T07:52:19.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:18 vm05.local systemd[1]: Started Ceph mon.vm05 for 12e9780e-1c55-11f1-8896-79f7c2e9b508. 
2026-03-10T07:52:19.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:18 vm05.local ceph-mon[130117]: set uid:gid to 167:167 (ceph:ceph) 2026-03-10T07:52:19.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:18 vm05.local ceph-mon[130117]: ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable), process ceph-mon, pid 2 2026-03-10T07:52:19.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:18 vm05.local ceph-mon[130117]: pidfile_write: ignore empty --pid-file 2026-03-10T07:52:19.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:18 vm05.local ceph-mon[130117]: load: jerasure load: lrc 2026-03-10T07:52:19.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:18 vm05.local ceph-mon[130117]: rocksdb: RocksDB version: 7.9.2 2026-03-10T07:52:19.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:18 vm05.local ceph-mon[130117]: rocksdb: Git sha 0 2026-03-10T07:52:19.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:18 vm05.local ceph-mon[130117]: rocksdb: Compile date 2026-02-25 18:11:04 2026-03-10T07:52:19.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:18 vm05.local ceph-mon[130117]: rocksdb: DB SUMMARY 2026-03-10T07:52:19.158 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:18 vm05.local ceph-mon[130117]: rocksdb: DB Session ID: QBLEWJTWGKYWNSD88NSY 2026-03-10T07:52:19.158 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:18 vm05.local ceph-mon[130117]: rocksdb: CURRENT file: CURRENT 2026-03-10T07:52:19.158 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:18 vm05.local ceph-mon[130117]: rocksdb: IDENTITY file: IDENTITY 2026-03-10T07:52:19.158 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:18 vm05.local ceph-mon[130117]: rocksdb: MANIFEST file: MANIFEST-000015 size: 1020 Bytes 2026-03-10T07:52:19.158 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:18 vm05.local ceph-mon[130117]: rocksdb: SST files in /var/lib/ceph/mon/ceph-vm05/store.db 
dir, Total Num: 1, files: 000026.sst 2026-03-10T07:52:19.158 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:18 vm05.local ceph-mon[130117]: rocksdb: Write Ahead Log file in /var/lib/ceph/mon/ceph-vm05/store.db: 000024.log size: 47259 ; 2026-03-10T07:52:19.158 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:18 vm05.local ceph-mon[130117]: rocksdb: Options.error_if_exists: 0 2026-03-10T07:52:19.158 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:18 vm05.local ceph-mon[130117]: rocksdb: Options.create_if_missing: 0 2026-03-10T07:52:19.158 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:18 vm05.local ceph-mon[130117]: rocksdb: Options.paranoid_checks: 1 2026-03-10T07:52:19.158 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:18 vm05.local ceph-mon[130117]: rocksdb: Options.flush_verify_memtable_count: 1 2026-03-10T07:52:19.158 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:18 vm05.local ceph-mon[130117]: rocksdb: Options.track_and_verify_wals_in_manifest: 0 2026-03-10T07:52:19.158 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:18 vm05.local ceph-mon[130117]: rocksdb: Options.verify_sst_unique_id_in_manifest: 1 2026-03-10T07:52:19.158 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:18 vm05.local ceph-mon[130117]: rocksdb: Options.env: 0x55abc9e14dc0 2026-03-10T07:52:19.158 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:18 vm05.local ceph-mon[130117]: rocksdb: Options.fs: PosixFileSystem 2026-03-10T07:52:19.158 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:18 vm05.local ceph-mon[130117]: rocksdb: Options.info_log: 0x55abcbed9900 2026-03-10T07:52:19.158 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:18 vm05.local ceph-mon[130117]: rocksdb: Options.max_file_opening_threads: 16 2026-03-10T07:52:19.158 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:18 vm05.local ceph-mon[130117]: rocksdb: Options.statistics: (nil) 2026-03-10T07:52:19.158 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 
07:52:18 vm05.local ceph-mon[130117]: rocksdb: Options.use_fsync: 0 2026-03-10T07:52:19.158 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:18 vm05.local ceph-mon[130117]: rocksdb: Options.max_log_file_size: 0 2026-03-10T07:52:19.158 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:18 vm05.local ceph-mon[130117]: rocksdb: Options.max_manifest_file_size: 1073741824 2026-03-10T07:52:19.158 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:18 vm05.local ceph-mon[130117]: rocksdb: Options.log_file_time_to_roll: 0 2026-03-10T07:52:19.158 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:18 vm05.local ceph-mon[130117]: rocksdb: Options.keep_log_file_num: 1000 2026-03-10T07:52:19.158 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:18 vm05.local ceph-mon[130117]: rocksdb: Options.recycle_log_file_num: 0 2026-03-10T07:52:19.158 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:18 vm05.local ceph-mon[130117]: rocksdb: Options.allow_fallocate: 1 2026-03-10T07:52:19.158 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:18 vm05.local ceph-mon[130117]: rocksdb: Options.allow_mmap_reads: 0 2026-03-10T07:52:19.158 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:18 vm05.local ceph-mon[130117]: rocksdb: Options.allow_mmap_writes: 0 2026-03-10T07:52:19.158 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:18 vm05.local ceph-mon[130117]: rocksdb: Options.use_direct_reads: 0 2026-03-10T07:52:19.158 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:18 vm05.local ceph-mon[130117]: rocksdb: Options.use_direct_io_for_flush_and_compaction: 0 2026-03-10T07:52:19.158 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:18 vm05.local ceph-mon[130117]: rocksdb: Options.create_missing_column_families: 0 2026-03-10T07:52:19.158 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:18 vm05.local ceph-mon[130117]: rocksdb: Options.db_log_dir: 2026-03-10T07:52:19.158 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:18 vm05.local 
ceph-mon[130117]: rocksdb: Options.wal_dir: 2026-03-10T07:52:19.158 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:18 vm05.local ceph-mon[130117]: rocksdb: Options.table_cache_numshardbits: 6 2026-03-10T07:52:19.158 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:18 vm05.local ceph-mon[130117]: rocksdb: Options.WAL_ttl_seconds: 0 2026-03-10T07:52:19.158 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:18 vm05.local ceph-mon[130117]: rocksdb: Options.WAL_size_limit_MB: 0 2026-03-10T07:52:19.158 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:18 vm05.local ceph-mon[130117]: rocksdb: Options.max_write_batch_group_size_bytes: 1048576 2026-03-10T07:52:19.158 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:18 vm05.local ceph-mon[130117]: rocksdb: Options.manifest_preallocation_size: 4194304 2026-03-10T07:52:19.158 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:18 vm05.local ceph-mon[130117]: rocksdb: Options.is_fd_close_on_exec: 1 2026-03-10T07:52:19.158 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:18 vm05.local ceph-mon[130117]: rocksdb: Options.advise_random_on_open: 1 2026-03-10T07:52:19.158 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:18 vm05.local ceph-mon[130117]: rocksdb: Options.db_write_buffer_size: 0 2026-03-10T07:52:19.158 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:18 vm05.local ceph-mon[130117]: rocksdb: Options.write_buffer_manager: 0x55abcbedd900 2026-03-10T07:52:19.158 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:18 vm05.local ceph-mon[130117]: rocksdb: Options.access_hint_on_compaction_start: 1 2026-03-10T07:52:19.158 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:18 vm05.local ceph-mon[130117]: rocksdb: Options.random_access_max_buffer_size: 1048576 2026-03-10T07:52:19.158 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:18 vm05.local ceph-mon[130117]: rocksdb: Options.use_adaptive_mutex: 0 2026-03-10T07:52:19.158 
INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:18 vm05.local ceph-mon[130117]: rocksdb: Options.rate_limiter: (nil) 2026-03-10T07:52:19.158 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:18 vm05.local ceph-mon[130117]: rocksdb: Options.sst_file_manager.rate_bytes_per_sec: 0 2026-03-10T07:52:19.159 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:18 vm05.local ceph-mon[130117]: rocksdb: Options.wal_recovery_mode: 2 2026-03-10T07:52:19.159 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:18 vm05.local ceph-mon[130117]: rocksdb: Options.enable_thread_tracking: 0 2026-03-10T07:52:19.159 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:18 vm05.local ceph-mon[130117]: rocksdb: Options.enable_pipelined_write: 0 2026-03-10T07:52:19.159 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:18 vm05.local ceph-mon[130117]: rocksdb: Options.unordered_write: 0 2026-03-10T07:52:19.159 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:18 vm05.local ceph-mon[130117]: rocksdb: Options.allow_concurrent_memtable_write: 1 2026-03-10T07:52:19.159 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:18 vm05.local ceph-mon[130117]: rocksdb: Options.enable_write_thread_adaptive_yield: 1 2026-03-10T07:52:19.159 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:18 vm05.local ceph-mon[130117]: rocksdb: Options.write_thread_max_yield_usec: 100 2026-03-10T07:52:19.159 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:18 vm05.local ceph-mon[130117]: rocksdb: Options.write_thread_slow_yield_usec: 3 2026-03-10T07:52:19.159 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:18 vm05.local ceph-mon[130117]: rocksdb: Options.row_cache: None 2026-03-10T07:52:19.159 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:18 vm05.local ceph-mon[130117]: rocksdb: Options.wal_filter: None 2026-03-10T07:52:19.159 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:18 vm05.local ceph-mon[130117]: rocksdb: Options.avoid_flush_during_recovery: 0 
2026-03-10T07:52:19.159 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:18 vm05.local ceph-mon[130117]: rocksdb: Options.allow_ingest_behind: 0 2026-03-10T07:52:19.159 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:18 vm05.local ceph-mon[130117]: rocksdb: Options.two_write_queues: 0 2026-03-10T07:52:19.159 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:18 vm05.local ceph-mon[130117]: rocksdb: Options.manual_wal_flush: 0 2026-03-10T07:52:19.159 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:18 vm05.local ceph-mon[130117]: rocksdb: Options.wal_compression: 0 2026-03-10T07:52:19.159 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:18 vm05.local ceph-mon[130117]: rocksdb: Options.atomic_flush: 0 2026-03-10T07:52:19.159 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:18 vm05.local ceph-mon[130117]: rocksdb: Options.avoid_unnecessary_blocking_io: 0 2026-03-10T07:52:19.159 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:18 vm05.local ceph-mon[130117]: rocksdb: Options.persist_stats_to_disk: 0 2026-03-10T07:52:19.159 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:18 vm05.local ceph-mon[130117]: rocksdb: Options.write_dbid_to_manifest: 0 2026-03-10T07:52:19.159 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:18 vm05.local ceph-mon[130117]: rocksdb: Options.log_readahead_size: 0 2026-03-10T07:52:19.159 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:18 vm05.local ceph-mon[130117]: rocksdb: Options.file_checksum_gen_factory: Unknown 2026-03-10T07:52:19.159 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:18 vm05.local ceph-mon[130117]: rocksdb: Options.best_efforts_recovery: 0 2026-03-10T07:52:19.159 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:18 vm05.local ceph-mon[130117]: rocksdb: Options.max_bgerror_resume_count: 2147483647 2026-03-10T07:52:19.159 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:18 vm05.local ceph-mon[130117]: rocksdb: Options.bgerror_resume_retry_interval: 
1000000 2026-03-10T07:52:19.159 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:18 vm05.local ceph-mon[130117]: rocksdb: Options.allow_data_in_errors: 0 2026-03-10T07:52:19.159 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:18 vm05.local ceph-mon[130117]: rocksdb: Options.db_host_id: __hostname__ 2026-03-10T07:52:19.159 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:18 vm05.local ceph-mon[130117]: rocksdb: Options.enforce_single_del_contracts: true 2026-03-10T07:52:19.159 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:18 vm05.local ceph-mon[130117]: rocksdb: Options.max_background_jobs: 2 2026-03-10T07:52:19.159 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:18 vm05.local ceph-mon[130117]: rocksdb: Options.max_background_compactions: -1 2026-03-10T07:52:19.159 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:18 vm05.local ceph-mon[130117]: rocksdb: Options.max_subcompactions: 1 2026-03-10T07:52:19.159 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:18 vm05.local ceph-mon[130117]: rocksdb: Options.avoid_flush_during_shutdown: 0 2026-03-10T07:52:19.159 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:18 vm05.local ceph-mon[130117]: rocksdb: Options.writable_file_max_buffer_size: 1048576 2026-03-10T07:52:19.159 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:18 vm05.local ceph-mon[130117]: rocksdb: Options.delayed_write_rate : 16777216 2026-03-10T07:52:19.159 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:18 vm05.local ceph-mon[130117]: rocksdb: Options.max_total_wal_size: 0 2026-03-10T07:52:19.159 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:18 vm05.local ceph-mon[130117]: rocksdb: Options.delete_obsolete_files_period_micros: 21600000000 2026-03-10T07:52:19.159 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:18 vm05.local ceph-mon[130117]: rocksdb: Options.stats_dump_period_sec: 600 2026-03-10T07:52:19.159 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:18 vm05.local 
ceph-mon[130117]: rocksdb: Options.stats_persist_period_sec: 600 2026-03-10T07:52:19.159 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:18 vm05.local ceph-mon[130117]: rocksdb: Options.stats_history_buffer_size: 1048576 2026-03-10T07:52:19.159 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:18 vm05.local ceph-mon[130117]: rocksdb: Options.max_open_files: -1 2026-03-10T07:52:19.159 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:18 vm05.local ceph-mon[130117]: rocksdb: Options.bytes_per_sync: 0 2026-03-10T07:52:19.159 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:18 vm05.local ceph-mon[130117]: rocksdb: Options.wal_bytes_per_sync: 0 2026-03-10T07:52:19.159 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:18 vm05.local ceph-mon[130117]: rocksdb: Options.strict_bytes_per_sync: 0 2026-03-10T07:52:19.159 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:18 vm05.local ceph-mon[130117]: rocksdb: Options.compaction_readahead_size: 0 2026-03-10T07:52:19.159 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:18 vm05.local ceph-mon[130117]: rocksdb: Options.max_background_flushes: -1 2026-03-10T07:52:19.159 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:18 vm05.local ceph-mon[130117]: rocksdb: Compression algorithms supported: 2026-03-10T07:52:19.159 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:18 vm05.local ceph-mon[130117]: rocksdb: kZSTD supported: 0 2026-03-10T07:52:19.159 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:18 vm05.local ceph-mon[130117]: rocksdb: kXpressCompression supported: 0 2026-03-10T07:52:19.159 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:18 vm05.local ceph-mon[130117]: rocksdb: kBZip2Compression supported: 0 2026-03-10T07:52:19.159 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:18 vm05.local ceph-mon[130117]: rocksdb: kZSTDNotFinalCompression supported: 0 2026-03-10T07:52:19.159 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:18 vm05.local 
ceph-mon[130117]: rocksdb: kLZ4Compression supported: 1 2026-03-10T07:52:19.159 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:18 vm05.local ceph-mon[130117]: rocksdb: kZlibCompression supported: 1 2026-03-10T07:52:19.159 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:18 vm05.local ceph-mon[130117]: rocksdb: kLZ4HCCompression supported: 1 2026-03-10T07:52:19.159 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:18 vm05.local ceph-mon[130117]: rocksdb: kSnappyCompression supported: 1 2026-03-10T07:52:19.159 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:18 vm05.local ceph-mon[130117]: rocksdb: Fast CRC32 supported: Supported on x86 2026-03-10T07:52:19.159 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:18 vm05.local ceph-mon[130117]: rocksdb: DMutex implementation: pthread_mutex_t 2026-03-10T07:52:19.159 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:18 vm05.local ceph-mon[130117]: rocksdb: [db/version_set.cc:5527] Recovering from manifest file: /var/lib/ceph/mon/ceph-vm05/store.db/MANIFEST-000015 2026-03-10T07:52:19.159 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:18 vm05.local ceph-mon[130117]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [default]: 2026-03-10T07:52:19.159 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:18 vm05.local ceph-mon[130117]: rocksdb: Options.comparator: leveldb.BytewiseComparator 2026-03-10T07:52:19.159 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:18 vm05.local ceph-mon[130117]: rocksdb: Options.merge_operator: 2026-03-10T07:52:19.159 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:18 vm05.local ceph-mon[130117]: rocksdb: Options.compaction_filter: None 2026-03-10T07:52:19.159 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:18 vm05.local ceph-mon[130117]: rocksdb: Options.compaction_filter_factory: None 2026-03-10T07:52:19.159 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:18 vm05.local ceph-mon[130117]: rocksdb: 
Options.sst_partitioner_factory: None 2026-03-10T07:52:19.159 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:18 vm05.local ceph-mon[130117]: rocksdb: Options.memtable_factory: SkipListFactory 2026-03-10T07:52:19.159 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:18 vm05.local ceph-mon[130117]: rocksdb: Options.table_factory: BlockBasedTable 2026-03-10T07:52:19.159 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:18 vm05.local ceph-mon[130117]: rocksdb: table_factory options: flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55abcbed8500) 2026-03-10T07:52:19.159 INFO:journalctl@ceph.mon.vm05.vm05.stdout: cache_index_and_filter_blocks: 1 2026-03-10T07:52:19.159 INFO:journalctl@ceph.mon.vm05.vm05.stdout: cache_index_and_filter_blocks_with_high_priority: 0 2026-03-10T07:52:19.160 INFO:journalctl@ceph.mon.vm05.vm05.stdout: pin_l0_filter_and_index_blocks_in_cache: 0 2026-03-10T07:52:19.160 INFO:journalctl@ceph.mon.vm05.vm05.stdout: pin_top_level_index_and_filter: 1 2026-03-10T07:52:19.160 INFO:journalctl@ceph.mon.vm05.vm05.stdout: index_type: 0 2026-03-10T07:52:19.160 INFO:journalctl@ceph.mon.vm05.vm05.stdout: data_block_index_type: 0 2026-03-10T07:52:19.160 INFO:journalctl@ceph.mon.vm05.vm05.stdout: index_shortening: 1 2026-03-10T07:52:19.160 INFO:journalctl@ceph.mon.vm05.vm05.stdout: data_block_hash_table_util_ratio: 0.750000 2026-03-10T07:52:19.160 INFO:journalctl@ceph.mon.vm05.vm05.stdout: checksum: 4 2026-03-10T07:52:19.160 INFO:journalctl@ceph.mon.vm05.vm05.stdout: no_block_cache: 0 2026-03-10T07:52:19.160 INFO:journalctl@ceph.mon.vm05.vm05.stdout: block_cache: 0x55abcbefd350 2026-03-10T07:52:19.160 INFO:journalctl@ceph.mon.vm05.vm05.stdout: block_cache_name: BinnedLRUCache 2026-03-10T07:52:19.160 INFO:journalctl@ceph.mon.vm05.vm05.stdout: block_cache_options: 2026-03-10T07:52:19.160 INFO:journalctl@ceph.mon.vm05.vm05.stdout: capacity : 536870912 2026-03-10T07:52:19.160 INFO:journalctl@ceph.mon.vm05.vm05.stdout: num_shard_bits : 4 
2026-03-10T07:52:19.160 INFO:journalctl@ceph.mon.vm05.vm05.stdout: strict_capacity_limit : 0 2026-03-10T07:52:19.160 INFO:journalctl@ceph.mon.vm05.vm05.stdout: high_pri_pool_ratio: 0.000 2026-03-10T07:52:19.160 INFO:journalctl@ceph.mon.vm05.vm05.stdout: block_cache_compressed: (nil) 2026-03-10T07:52:19.160 INFO:journalctl@ceph.mon.vm05.vm05.stdout: persistent_cache: (nil) 2026-03-10T07:52:19.160 INFO:journalctl@ceph.mon.vm05.vm05.stdout: block_size: 4096 2026-03-10T07:52:19.160 INFO:journalctl@ceph.mon.vm05.vm05.stdout: block_size_deviation: 10 2026-03-10T07:52:19.160 INFO:journalctl@ceph.mon.vm05.vm05.stdout: block_restart_interval: 16 2026-03-10T07:52:19.160 INFO:journalctl@ceph.mon.vm05.vm05.stdout: index_block_restart_interval: 1 2026-03-10T07:52:19.160 INFO:journalctl@ceph.mon.vm05.vm05.stdout: metadata_block_size: 4096 2026-03-10T07:52:19.160 INFO:journalctl@ceph.mon.vm05.vm05.stdout: partition_filters: 0 2026-03-10T07:52:19.160 INFO:journalctl@ceph.mon.vm05.vm05.stdout: use_delta_encoding: 1 2026-03-10T07:52:19.160 INFO:journalctl@ceph.mon.vm05.vm05.stdout: filter_policy: bloomfilter 2026-03-10T07:52:19.160 INFO:journalctl@ceph.mon.vm05.vm05.stdout: whole_key_filtering: 1 2026-03-10T07:52:19.160 INFO:journalctl@ceph.mon.vm05.vm05.stdout: verify_compression: 0 2026-03-10T07:52:19.160 INFO:journalctl@ceph.mon.vm05.vm05.stdout: read_amp_bytes_per_bit: 0 2026-03-10T07:52:19.160 INFO:journalctl@ceph.mon.vm05.vm05.stdout: format_version: 5 2026-03-10T07:52:19.160 INFO:journalctl@ceph.mon.vm05.vm05.stdout: enable_index_compression: 1 2026-03-10T07:52:19.160 INFO:journalctl@ceph.mon.vm05.vm05.stdout: block_align: 0 2026-03-10T07:52:19.160 INFO:journalctl@ceph.mon.vm05.vm05.stdout: max_auto_readahead_size: 262144 2026-03-10T07:52:19.160 INFO:journalctl@ceph.mon.vm05.vm05.stdout: prepopulate_block_cache: 0 2026-03-10T07:52:19.160 INFO:journalctl@ceph.mon.vm05.vm05.stdout: initial_auto_readahead_size: 8192 2026-03-10T07:52:19.160 
INFO:journalctl@ceph.mon.vm05.vm05.stdout: num_file_reads_for_auto_readahead: 2 2026-03-10T07:52:19.160 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:18 vm05.local ceph-mon[130117]: rocksdb: Options.write_buffer_size: 33554432 2026-03-10T07:52:19.160 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:18 vm05.local ceph-mon[130117]: rocksdb: Options.max_write_buffer_number: 2 2026-03-10T07:52:19.160 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:18 vm05.local ceph-mon[130117]: rocksdb: Options.compression: NoCompression 2026-03-10T07:52:19.160 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:18 vm05.local ceph-mon[130117]: rocksdb: Options.bottommost_compression: Disabled 2026-03-10T07:52:19.160 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:18 vm05.local ceph-mon[130117]: rocksdb: Options.prefix_extractor: nullptr 2026-03-10T07:52:19.160 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:18 vm05.local ceph-mon[130117]: rocksdb: Options.memtable_insert_with_hint_prefix_extractor: nullptr 2026-03-10T07:52:19.160 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:18 vm05.local ceph-mon[130117]: rocksdb: Options.num_levels: 7 2026-03-10T07:52:19.160 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:18 vm05.local ceph-mon[130117]: rocksdb: Options.min_write_buffer_number_to_merge: 1 2026-03-10T07:52:19.160 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:18 vm05.local ceph-mon[130117]: rocksdb: Options.max_write_buffer_number_to_maintain: 0 2026-03-10T07:52:19.160 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:18 vm05.local ceph-mon[130117]: rocksdb: Options.max_write_buffer_size_to_maintain: 0 2026-03-10T07:52:19.160 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:18 vm05.local ceph-mon[130117]: rocksdb: Options.bottommost_compression_opts.window_bits: -14 2026-03-10T07:52:19.160 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:18 vm05.local ceph-mon[130117]: rocksdb: 
Options.bottommost_compression_opts.level: 32767 2026-03-10T07:52:19.160 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:18 vm05.local ceph-mon[130117]: rocksdb: Options.bottommost_compression_opts.strategy: 0 2026-03-10T07:52:19.160 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:18 vm05.local ceph-mon[130117]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0 2026-03-10T07:52:19.160 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:18 vm05.local ceph-mon[130117]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0 2026-03-10T07:52:19.160 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:18 vm05.local ceph-mon[130117]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1 2026-03-10T07:52:19.160 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:18 vm05.local ceph-mon[130117]: rocksdb: Options.bottommost_compression_opts.enabled: false 2026-03-10T07:52:19.160 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:18 vm05.local ceph-mon[130117]: rocksdb: Options.bottommost_compression_opts.max_dict_buffer_bytes: 0 2026-03-10T07:52:19.160 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:18 vm05.local ceph-mon[130117]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true 2026-03-10T07:52:19.160 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:18 vm05.local ceph-mon[130117]: rocksdb: Options.compression_opts.window_bits: -14 2026-03-10T07:52:19.160 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:18 vm05.local ceph-mon[130117]: rocksdb: Options.compression_opts.level: 32767 2026-03-10T07:52:19.160 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:18 vm05.local ceph-mon[130117]: rocksdb: Options.compression_opts.strategy: 0 2026-03-10T07:52:19.160 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:18 vm05.local ceph-mon[130117]: rocksdb: Options.compression_opts.max_dict_bytes: 0 2026-03-10T07:52:19.160 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 
07:52:18 vm05.local ceph-mon[130117]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0 2026-03-10T07:52:19.160 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:18 vm05.local ceph-mon[130117]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true 2026-03-10T07:52:19.160 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:18 vm05.local ceph-mon[130117]: rocksdb: Options.compression_opts.parallel_threads: 1 2026-03-10T07:52:19.160 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:18 vm05.local ceph-mon[130117]: rocksdb: Options.compression_opts.enabled: false 2026-03-10T07:52:19.160 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:18 vm05.local ceph-mon[130117]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0 2026-03-10T07:52:19.160 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:18 vm05.local ceph-mon[130117]: rocksdb: Options.level0_file_num_compaction_trigger: 4 2026-03-10T07:52:19.160 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:18 vm05.local ceph-mon[130117]: rocksdb: Options.level0_slowdown_writes_trigger: 20 2026-03-10T07:52:19.160 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:18 vm05.local ceph-mon[130117]: rocksdb: Options.level0_stop_writes_trigger: 36 2026-03-10T07:52:19.160 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:18 vm05.local ceph-mon[130117]: rocksdb: Options.target_file_size_base: 67108864 2026-03-10T07:52:19.161 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:18 vm05.local ceph-mon[130117]: rocksdb: Options.target_file_size_multiplier: 1 2026-03-10T07:52:19.161 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:18 vm05.local ceph-mon[130117]: rocksdb: Options.max_bytes_for_level_base: 268435456 2026-03-10T07:52:19.161 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:18 vm05.local ceph-mon[130117]: rocksdb: Options.level_compaction_dynamic_level_bytes: 1 2026-03-10T07:52:19.161 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:18 vm05.local 
ceph-mon[130117]: rocksdb: Options.max_bytes_for_level_multiplier: 10.000000 2026-03-10T07:52:19.161 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:18 vm05.local ceph-mon[130117]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1 2026-03-10T07:52:19.161 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:18 vm05.local ceph-mon[130117]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1 2026-03-10T07:52:19.161 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:18 vm05.local ceph-mon[130117]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1 2026-03-10T07:52:19.161 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:18 vm05.local ceph-mon[130117]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1 2026-03-10T07:52:19.161 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:18 vm05.local ceph-mon[130117]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1 2026-03-10T07:52:19.161 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:18 vm05.local ceph-mon[130117]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1 2026-03-10T07:52:19.161 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:18 vm05.local ceph-mon[130117]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1 2026-03-10T07:52:19.161 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:18 vm05.local ceph-mon[130117]: rocksdb: Options.max_sequential_skip_in_iterations: 8 2026-03-10T07:52:19.161 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:18 vm05.local ceph-mon[130117]: rocksdb: Options.max_compaction_bytes: 1677721600 2026-03-10T07:52:19.161 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:18 vm05.local ceph-mon[130117]: rocksdb: Options.ignore_max_compaction_bytes_for_input: true 2026-03-10T07:52:19.161 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:18 vm05.local ceph-mon[130117]: rocksdb: Options.arena_block_size: 1048576 2026-03-10T07:52:19.161 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 
07:52:18 vm05.local ceph-mon[130117]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736 2026-03-10T07:52:19.161 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:18 vm05.local ceph-mon[130117]: rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944 2026-03-10T07:52:19.161 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:18 vm05.local ceph-mon[130117]: rocksdb: Options.disable_auto_compactions: 0 2026-03-10T07:52:19.161 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:18 vm05.local ceph-mon[130117]: rocksdb: Options.compaction_style: kCompactionStyleLevel 2026-03-10T07:52:19.161 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:18 vm05.local ceph-mon[130117]: rocksdb: Options.compaction_pri: kMinOverlappingRatio 2026-03-10T07:52:19.161 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:18 vm05.local ceph-mon[130117]: rocksdb: Options.compaction_options_universal.size_ratio: 1 2026-03-10T07:52:19.161 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:18 vm05.local ceph-mon[130117]: rocksdb: Options.compaction_options_universal.min_merge_width: 2 2026-03-10T07:52:19.161 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:18 vm05.local ceph-mon[130117]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295 2026-03-10T07:52:19.161 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:18 vm05.local ceph-mon[130117]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200 2026-03-10T07:52:19.161 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:18 vm05.local ceph-mon[130117]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1 2026-03-10T07:52:19.161 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:18 vm05.local ceph-mon[130117]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize 2026-03-10T07:52:19.161 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:18 vm05.local ceph-mon[130117]: 
rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824 2026-03-10T07:52:19.161 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:18 vm05.local ceph-mon[130117]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0 2026-03-10T07:52:19.161 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:18 vm05.local ceph-mon[130117]: rocksdb: Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0); 2026-03-10T07:52:19.161 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:18 vm05.local ceph-mon[130117]: rocksdb: Options.inplace_update_support: 0 2026-03-10T07:52:19.161 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:18 vm05.local ceph-mon[130117]: rocksdb: Options.inplace_update_num_locks: 10000 2026-03-10T07:52:19.161 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:18 vm05.local ceph-mon[130117]: rocksdb: Options.memtable_prefix_bloom_size_ratio: 0.000000 2026-03-10T07:52:19.161 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:18 vm05.local ceph-mon[130117]: rocksdb: Options.memtable_whole_key_filtering: 0 2026-03-10T07:52:19.161 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:18 vm05.local ceph-mon[130117]: rocksdb: Options.memtable_huge_page_size: 0 2026-03-10T07:52:19.161 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:18 vm05.local ceph-mon[130117]: rocksdb: Options.bloom_locality: 0 2026-03-10T07:52:19.161 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:18 vm05.local ceph-mon[130117]: rocksdb: Options.max_successive_merges: 0 2026-03-10T07:52:19.161 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:18 vm05.local ceph-mon[130117]: rocksdb: Options.optimize_filters_for_hits: 0 2026-03-10T07:52:19.161 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:18 vm05.local ceph-mon[130117]: rocksdb: Options.paranoid_file_checks: 0 2026-03-10T07:52:19.161 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:18 
vm05.local ceph-mon[130117]: rocksdb: Options.force_consistency_checks: 1 2026-03-10T07:52:19.161 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:18 vm05.local ceph-mon[130117]: rocksdb: Options.report_bg_io_stats: 0 2026-03-10T07:52:19.161 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:18 vm05.local ceph-mon[130117]: rocksdb: Options.ttl: 2592000 2026-03-10T07:52:19.161 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:18 vm05.local ceph-mon[130117]: rocksdb: Options.periodic_compaction_seconds: 0 2026-03-10T07:52:19.161 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:18 vm05.local ceph-mon[130117]: rocksdb: Options.preclude_last_level_data_seconds: 0 2026-03-10T07:52:19.161 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:18 vm05.local ceph-mon[130117]: rocksdb: Options.preserve_internal_time_seconds: 0 2026-03-10T07:52:19.161 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:18 vm05.local ceph-mon[130117]: rocksdb: Options.enable_blob_files: false 2026-03-10T07:52:19.161 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:18 vm05.local ceph-mon[130117]: rocksdb: Options.min_blob_size: 0 2026-03-10T07:52:19.161 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:18 vm05.local ceph-mon[130117]: rocksdb: Options.blob_file_size: 268435456 2026-03-10T07:52:19.161 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:18 vm05.local ceph-mon[130117]: rocksdb: Options.blob_compression_type: NoCompression 2026-03-10T07:52:19.161 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:18 vm05.local ceph-mon[130117]: rocksdb: Options.enable_blob_garbage_collection: false 2026-03-10T07:52:19.161 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:18 vm05.local ceph-mon[130117]: rocksdb: Options.blob_garbage_collection_age_cutoff: 0.250000 2026-03-10T07:52:19.161 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:18 vm05.local ceph-mon[130117]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000 
2026-03-10T07:52:19.161 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:18 vm05.local ceph-mon[130117]: rocksdb: Options.blob_compaction_readahead_size: 0 2026-03-10T07:52:19.161 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:18 vm05.local ceph-mon[130117]: rocksdb: Options.blob_file_starting_level: 0 2026-03-10T07:52:19.161 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:18 vm05.local ceph-mon[130117]: rocksdb: Options.experimental_mempurge_threshold: 0.000000 2026-03-10T07:52:19.161 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:18 vm05.local ceph-mon[130117]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed. 2026-03-10T07:52:19.161 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:18 vm05.local ceph-mon[130117]: rocksdb: [db/version_set.cc:5566] Recovered from manifest file:/var/lib/ceph/mon/ceph-vm05/store.db/MANIFEST-000015 succeeded,manifest_file_number is 15, next_file_number is 28, last_sequence is 9962, log_number is 24,prev_log_number is 0,max_column_family is 0,min_log_number_to_keep is 24 2026-03-10T07:52:19.161 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:18 vm05.local ceph-mon[130117]: rocksdb: [db/version_set.cc:5581] Column family [default] (ID 0), log number is 24 2026-03-10T07:52:19.161 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:18 vm05.local ceph-mon[130117]: rocksdb: [db/db_impl/db_impl_open.cc:539] DB ID: 07859557-12af-4cf8-b0fd-7862ae4579b0 2026-03-10T07:52:19.161 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:18 vm05.local ceph-mon[130117]: rocksdb: EVENT_LOG_v1 {"time_micros": 1773129138836632, "job": 1, "event": "recovery_started", "wal_files": [24]} 2026-03-10T07:52:19.161 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:18 vm05.local ceph-mon[130117]: rocksdb: [db/db_impl/db_impl_open.cc:1043] Recovering log #24 mode 2 2026-03-10T07:52:19.161 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 
07:52:18 vm05.local ceph-mon[130117]: rocksdb: EVENT_LOG_v1 {"time_micros": 1773129138838279, "cf_name": "default", "job": 1, "event": "table_file_creation", "file_number": 29, "file_size": 47573, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 9963, "largest_seqno": 10044, "table_properties": {"data_size": 46152, "index_size": 190, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 197, "raw_key_size": 1484, "raw_average_key_size": 26, "raw_value_size": 44937, "raw_average_value_size": 817, "num_data_blocks": 9, "num_entries": 55, "num_filter_entries": 55, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1773129138, "oldest_key_time": 0, "file_creation_time": 0, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "07859557-12af-4cf8-b0fd-7862ae4579b0", "db_session_id": "QBLEWJTWGKYWNSD88NSY", "orig_file_number": 29, "seqno_to_time_mapping": "N/A"}} 2026-03-10T07:52:19.161 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:18 vm05.local ceph-mon[130117]: rocksdb: EVENT_LOG_v1 {"time_micros": 1773129138838406, "job": 1, "event": "recovery_finished"} 2026-03-10T07:52:19.161 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:18 vm05.local ceph-mon[130117]: rocksdb: [db/version_set.cc:5047] Creating manifest 31 2026-03-10T07:52:19.161 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:18 
vm05.local ceph-mon[130117]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed. 2026-03-10T07:52:19.161 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:18 vm05.local ceph-mon[130117]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-vm05/store.db/000024.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 2026-03-10T07:52:19.161 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:18 vm05.local ceph-mon[130117]: rocksdb: [db/db_impl/db_impl_open.cc:1987] SstFileManager instance 0x55abcbefee00 2026-03-10T07:52:19.162 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:18 vm05.local ceph-mon[130117]: rocksdb: DB pointer 0x55abcc00a000 2026-03-10T07:52:19.162 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:18 vm05.local ceph-mon[130117]: starting mon.vm05 rank 0 at public addrs [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] at bind addrs [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] mon_data /var/lib/ceph/mon/ceph-vm05 fsid 12e9780e-1c55-11f1-8896-79f7c2e9b508 2026-03-10T07:52:19.162 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:18 vm05.local ceph-mon[130117]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS ------- 2026-03-10T07:52:19.162 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:18 vm05.local ceph-mon[130117]: rocksdb: [db/db_impl/db_impl.cc:1111] 2026-03-10T07:52:19.162 INFO:journalctl@ceph.mon.vm05.vm05.stdout: ** DB Stats ** 2026-03-10T07:52:19.162 INFO:journalctl@ceph.mon.vm05.vm05.stdout: Uptime(secs): 0.0 total, 0.0 interval 2026-03-10T07:52:19.162 INFO:journalctl@ceph.mon.vm05.vm05.stdout: Cumulative writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 GB, 0.00 MB/s 2026-03-10T07:52:19.162 INFO:journalctl@ceph.mon.vm05.vm05.stdout: Cumulative WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s 
2026-03-10T07:52:19.162 INFO:journalctl@ceph.mon.vm05.vm05.stdout: Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent 2026-03-10T07:52:19.162 INFO:journalctl@ceph.mon.vm05.vm05.stdout: Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s 2026-03-10T07:52:19.162 INFO:journalctl@ceph.mon.vm05.vm05.stdout: Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s 2026-03-10T07:52:19.162 INFO:journalctl@ceph.mon.vm05.vm05.stdout: Interval stall: 00:00:0.000 H:M:S, 0.0 percent 2026-03-10T07:52:19.162 INFO:journalctl@ceph.mon.vm05.vm05.stdout: 2026-03-10T07:52:19.162 INFO:journalctl@ceph.mon.vm05.vm05.stdout: ** Compaction Stats [default] ** 2026-03-10T07:52:19.162 INFO:journalctl@ceph.mon.vm05.vm05.stdout: Level Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB) 2026-03-10T07:52:19.162 INFO:journalctl@ceph.mon.vm05.vm05.stdout: ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------ 2026-03-10T07:52:19.162 INFO:journalctl@ceph.mon.vm05.vm05.stdout: L0 1/0 46.46 KB 0.2 0.0 0.0 0.0 0.0 0.0 0.0 1.0 0.0 41.7 0.00 0.00 1 0.001 0 0 0.0 0.0 2026-03-10T07:52:19.162 INFO:journalctl@ceph.mon.vm05.vm05.stdout: L6 1/0 8.14 MB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.00 0.00 0 0.000 0 0 0.0 0.0 2026-03-10T07:52:19.162 INFO:journalctl@ceph.mon.vm05.vm05.stdout: Sum 2/0 8.19 MB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 1.0 0.0 41.7 0.00 0.00 1 0.001 0 0 0.0 0.0 2026-03-10T07:52:19.162 INFO:journalctl@ceph.mon.vm05.vm05.stdout: Int 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 1.0 0.0 41.7 0.00 0.00 1 0.001 0 0 0.0 0.0 2026-03-10T07:52:19.162 INFO:journalctl@ceph.mon.vm05.vm05.stdout: 2026-03-10T07:52:19.162 INFO:journalctl@ceph.mon.vm05.vm05.stdout: ** Compaction Stats 
[default] ** 2026-03-10T07:52:19.162 INFO:journalctl@ceph.mon.vm05.vm05.stdout: Priority Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB) 2026-03-10T07:52:19.162 INFO:journalctl@ceph.mon.vm05.vm05.stdout: --------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------- 2026-03-10T07:52:19.162 INFO:journalctl@ceph.mon.vm05.vm05.stdout: User 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 41.7 0.00 0.00 1 0.001 0 0 0.0 0.0 2026-03-10T07:52:19.162 INFO:journalctl@ceph.mon.vm05.vm05.stdout: 2026-03-10T07:52:19.162 INFO:journalctl@ceph.mon.vm05.vm05.stdout: Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0 2026-03-10T07:52:19.162 INFO:journalctl@ceph.mon.vm05.vm05.stdout: 2026-03-10T07:52:19.162 INFO:journalctl@ceph.mon.vm05.vm05.stdout: Uptime(secs): 0.0 total, 0.0 interval 2026-03-10T07:52:19.162 INFO:journalctl@ceph.mon.vm05.vm05.stdout: Flush(GB): cumulative 0.000, interval 0.000 2026-03-10T07:52:19.162 INFO:journalctl@ceph.mon.vm05.vm05.stdout: AddFile(GB): cumulative 0.000, interval 0.000 2026-03-10T07:52:19.162 INFO:journalctl@ceph.mon.vm05.vm05.stdout: AddFile(Total Files): cumulative 0, interval 0 2026-03-10T07:52:19.162 INFO:journalctl@ceph.mon.vm05.vm05.stdout: AddFile(L0 Files): cumulative 0, interval 0 2026-03-10T07:52:19.162 INFO:journalctl@ceph.mon.vm05.vm05.stdout: AddFile(Keys): cumulative 0, interval 0 2026-03-10T07:52:19.162 INFO:journalctl@ceph.mon.vm05.vm05.stdout: Cumulative compaction: 0.00 GB write, 5.05 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds 2026-03-10T07:52:19.162 INFO:journalctl@ceph.mon.vm05.vm05.stdout: Interval compaction: 0.00 GB write, 5.05 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds 2026-03-10T07:52:19.162 
INFO:journalctl@ceph.mon.vm05.vm05.stdout: Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count 2026-03-10T07:52:19.162 INFO:journalctl@ceph.mon.vm05.vm05.stdout: Block cache BinnedLRUCache@0x55abcbefd350#2 capacity: 512.00 MB usage: 41.27 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 0 last_secs: 9e-06 secs_since: 0 2026-03-10T07:52:19.162 INFO:journalctl@ceph.mon.vm05.vm05.stdout: Block cache entry stats(count,size,portion): DataBlock(2,12.66 KB,0.00241399%) FilterBlock(2,8.27 KB,0.00157654%) IndexBlock(2,20.34 KB,0.00388026%) Misc(1,0.00 KB,0%) 2026-03-10T07:52:19.162 INFO:journalctl@ceph.mon.vm05.vm05.stdout: 2026-03-10T07:52:19.162 INFO:journalctl@ceph.mon.vm05.vm05.stdout: ** File Read Latency Histogram By Level [default] ** 2026-03-10T07:52:19.162 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:18 vm05.local ceph-mon[130117]: mon.vm05@-1(???) 
e2 preinit fsid 12e9780e-1c55-11f1-8896-79f7c2e9b508 2026-03-10T07:52:19.162 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:18 vm05.local ceph-mon[130117]: mon.vm05@-1(???).mds e12 new map 2026-03-10T07:52:19.162 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:18 vm05.local ceph-mon[130117]: mon.vm05@-1(???).mds e12 print_map 2026-03-10T07:52:19.162 INFO:journalctl@ceph.mon.vm05.vm05.stdout: e12 2026-03-10T07:52:19.162 INFO:journalctl@ceph.mon.vm05.vm05.stdout: btime 1970-01-01T00:00:00:000000+0000 2026-03-10T07:52:19.162 INFO:journalctl@ceph.mon.vm05.vm05.stdout: enable_multiple, ever_enabled_multiple: 1,1 2026-03-10T07:52:19.162 INFO:journalctl@ceph.mon.vm05.vm05.stdout: default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2} 2026-03-10T07:52:19.162 INFO:journalctl@ceph.mon.vm05.vm05.stdout: legacy client fscid: 1 2026-03-10T07:52:19.162 INFO:journalctl@ceph.mon.vm05.vm05.stdout: 2026-03-10T07:52:19.162 INFO:journalctl@ceph.mon.vm05.vm05.stdout: Filesystem 'cephfs' (1) 2026-03-10T07:52:19.162 INFO:journalctl@ceph.mon.vm05.vm05.stdout: fs_name cephfs 2026-03-10T07:52:19.162 INFO:journalctl@ceph.mon.vm05.vm05.stdout: epoch 12 2026-03-10T07:52:19.162 INFO:journalctl@ceph.mon.vm05.vm05.stdout: flags 12 joinable allow_snaps allow_multimds_snaps 2026-03-10T07:52:19.162 INFO:journalctl@ceph.mon.vm05.vm05.stdout: created 2026-03-10T07:48:24.309293+0000 2026-03-10T07:52:19.162 INFO:journalctl@ceph.mon.vm05.vm05.stdout: modified 2026-03-10T07:48:33.425187+0000 2026-03-10T07:52:19.162 INFO:journalctl@ceph.mon.vm05.vm05.stdout: tableserver 0 2026-03-10T07:52:19.162 INFO:journalctl@ceph.mon.vm05.vm05.stdout: root 0 2026-03-10T07:52:19.162 INFO:journalctl@ceph.mon.vm05.vm05.stdout: session_timeout 60 2026-03-10T07:52:19.162 
INFO:journalctl@ceph.mon.vm05.vm05.stdout: session_autoclose 300 2026-03-10T07:52:19.162 INFO:journalctl@ceph.mon.vm05.vm05.stdout: max_file_size 1099511627776 2026-03-10T07:52:19.162 INFO:journalctl@ceph.mon.vm05.vm05.stdout: max_xattr_size 65536 2026-03-10T07:52:19.162 INFO:journalctl@ceph.mon.vm05.vm05.stdout: required_client_features {} 2026-03-10T07:52:19.162 INFO:journalctl@ceph.mon.vm05.vm05.stdout: last_failure 0 2026-03-10T07:52:19.162 INFO:journalctl@ceph.mon.vm05.vm05.stdout: last_failure_osd_epoch 39 2026-03-10T07:52:19.163 INFO:journalctl@ceph.mon.vm05.vm05.stdout: compat compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2} 2026-03-10T07:52:19.163 INFO:journalctl@ceph.mon.vm05.vm05.stdout: max_mds 1 2026-03-10T07:52:19.163 INFO:journalctl@ceph.mon.vm05.vm05.stdout: in 0 2026-03-10T07:52:19.163 INFO:journalctl@ceph.mon.vm05.vm05.stdout: up {0=24297} 2026-03-10T07:52:19.163 INFO:journalctl@ceph.mon.vm05.vm05.stdout: failed 2026-03-10T07:52:19.163 INFO:journalctl@ceph.mon.vm05.vm05.stdout: damaged 2026-03-10T07:52:19.163 INFO:journalctl@ceph.mon.vm05.vm05.stdout: stopped 2026-03-10T07:52:19.163 INFO:journalctl@ceph.mon.vm05.vm05.stdout: data_pools [3] 2026-03-10T07:52:19.163 INFO:journalctl@ceph.mon.vm05.vm05.stdout: metadata_pool 2 2026-03-10T07:52:19.163 INFO:journalctl@ceph.mon.vm05.vm05.stdout: inline_data enabled 2026-03-10T07:52:19.163 INFO:journalctl@ceph.mon.vm05.vm05.stdout: balancer 2026-03-10T07:52:19.163 INFO:journalctl@ceph.mon.vm05.vm05.stdout: bal_rank_mask -1 2026-03-10T07:52:19.163 INFO:journalctl@ceph.mon.vm05.vm05.stdout: standby_count_wanted 1 2026-03-10T07:52:19.163 INFO:journalctl@ceph.mon.vm05.vm05.stdout: qdb_cluster leader: 0 members: 2026-03-10T07:52:19.163 INFO:journalctl@ceph.mon.vm05.vm05.stdout: 
[mds.cephfs.vm05.omfhnh{0:24297} state up:active seq 5 join_fscid=1 addr [v2:192.168.123.105:6826/723078808,v1:192.168.123.105:6827/723078808] compat {c=[1],r=[1],i=[7ff]}] 2026-03-10T07:52:19.163 INFO:journalctl@ceph.mon.vm05.vm05.stdout: 2026-03-10T07:52:19.163 INFO:journalctl@ceph.mon.vm05.vm05.stdout: 2026-03-10T07:52:19.163 INFO:journalctl@ceph.mon.vm05.vm05.stdout: Standby daemons: 2026-03-10T07:52:19.163 INFO:journalctl@ceph.mon.vm05.vm05.stdout: 2026-03-10T07:52:19.163 INFO:journalctl@ceph.mon.vm05.vm05.stdout: [mds.cephfs.vm05.pavqil{-1:14512} state up:standby seq 2 join_fscid=1 addr [v2:192.168.123.105:6828/427555544,v1:192.168.123.105:6829/427555544] compat {c=[1],r=[1],i=[7ff]}] 2026-03-10T07:52:19.163 INFO:journalctl@ceph.mon.vm05.vm05.stdout: [mds.cephfs.vm08.ybmbgd{-1:14524} state up:standby seq 1 join_fscid=1 addr [v2:192.168.123.108:6824/3815585988,v1:192.168.123.108:6825/3815585988] compat {c=[1],r=[1],i=[7ff]}] 2026-03-10T07:52:19.163 INFO:journalctl@ceph.mon.vm05.vm05.stdout: [mds.cephfs.vm08.dgsaon{-1:24313} state up:standby seq 2 join_fscid=1 addr [v2:192.168.123.108:6826/2963085185,v1:192.168.123.108:6827/2963085185] compat {c=[1],r=[1],i=[7ff]}] 2026-03-10T07:52:19.163 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:18 vm05.local ceph-mon[130117]: mon.vm05@-1(???).osd e42 crush map has features 3314933000852226048, adjusting msgr requires 2026-03-10T07:52:19.163 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:18 vm05.local ceph-mon[130117]: mon.vm05@-1(???).osd e42 crush map has features 288514051259236352, adjusting msgr requires 2026-03-10T07:52:19.163 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:18 vm05.local ceph-mon[130117]: mon.vm05@-1(???).osd e42 crush map has features 288514051259236352, adjusting msgr requires 2026-03-10T07:52:19.163 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:18 vm05.local ceph-mon[130117]: mon.vm05@-1(???).osd e42 crush map has features 288514051259236352, adjusting msgr 
requires 2026-03-10T07:52:19.163 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:18 vm05.local ceph-mon[130117]: mon.vm05@-1(???).paxosservice(auth 1..21) refresh upgraded, format 0 -> 3 2026-03-10T07:52:20.725 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:20 vm08.local ceph-mon[59917]: Deploying daemon mon.vm05 on vm05 2026-03-10T07:52:20.725 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:20 vm08.local ceph-mon[59917]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' cmd=[{"prefix": "mon metadata", "id": "vm05"}]: dispatch 2026-03-10T07:52:20.725 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:20 vm08.local ceph-mon[59917]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' cmd=[{"prefix": "mon metadata", "id": "vm08"}]: dispatch 2026-03-10T07:52:20.725 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:20 vm08.local ceph-mon[59917]: pgmap v59: 65 pgs: 65 active+clean; 289 MiB data, 2.7 GiB used, 117 GiB / 120 GiB avail; 28 KiB/s rd, 1.3 MiB/s wr, 123 op/s 2026-03-10T07:52:20.725 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:20 vm08.local ceph-mon[59917]: mon.vm05 calling monitor election 2026-03-10T07:52:20.725 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:20 vm08.local ceph-mon[59917]: mon.vm05 is new leader, mons vm05,vm08 in quorum (ranks 0,1) 2026-03-10T07:52:20.725 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:20 vm08.local ceph-mon[59917]: monmap epoch 2 2026-03-10T07:52:20.725 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:20 vm08.local ceph-mon[59917]: fsid 12e9780e-1c55-11f1-8896-79f7c2e9b508 2026-03-10T07:52:20.725 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:20 vm08.local ceph-mon[59917]: last_changed 2026-03-10T07:46:54.511496+0000 2026-03-10T07:52:20.725 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:20 vm08.local ceph-mon[59917]: created 2026-03-10T07:45:39.881211+0000 2026-03-10T07:52:20.725 
INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:20 vm08.local ceph-mon[59917]: min_mon_release 18 (reef) 2026-03-10T07:52:20.725 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:20 vm08.local ceph-mon[59917]: election_strategy: 1 2026-03-10T07:52:20.725 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:20 vm08.local ceph-mon[59917]: 0: [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] mon.vm05 2026-03-10T07:52:20.725 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:20 vm08.local ceph-mon[59917]: 1: [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] mon.vm08 2026-03-10T07:52:20.725 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:20 vm08.local ceph-mon[59917]: fsmap cephfs:1 {0=cephfs.vm05.omfhnh=up:active} 3 up:standby 2026-03-10T07:52:20.725 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:20 vm08.local ceph-mon[59917]: osdmap e42: 6 total, 6 up, 6 in 2026-03-10T07:52:20.725 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:20 vm08.local ceph-mon[59917]: mgrmap e31: vm05.blexke(active, since 113s), standbys: vm08.orfpog 2026-03-10T07:52:20.725 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:20 vm08.local ceph-mon[59917]: Health detail: HEALTH_WARN 1 filesystem with deprecated feature inline_data 2026-03-10T07:52:20.725 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:20 vm08.local ceph-mon[59917]: [WRN] FS_INLINE_DATA_DEPRECATED: 1 filesystem with deprecated feature inline_data 2026-03-10T07:52:20.725 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:20 vm08.local ceph-mon[59917]: fs cephfs has deprecated feature inline_data enabled. 
2026-03-10T07:52:20.725 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:20 vm08.local ceph-mon[59917]: from='mgr.14628 ' entity='' 2026-03-10T07:52:20.725 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:20 vm08.local ceph-mon[59917]: mgrmap e32: vm05.blexke(active, since 113s), standbys: vm08.orfpog 2026-03-10T07:52:20.821 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:20 vm05.local ceph-mon[130117]: Deploying daemon mon.vm05 on vm05 2026-03-10T07:52:20.821 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:20 vm05.local ceph-mon[130117]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' cmd=[{"prefix": "mon metadata", "id": "vm05"}]: dispatch 2026-03-10T07:52:20.821 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:20 vm05.local ceph-mon[130117]: from='mgr.14628 192.168.123.105:0/1139394548' entity='mgr.vm05.blexke' cmd=[{"prefix": "mon metadata", "id": "vm08"}]: dispatch 2026-03-10T07:52:20.821 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:20 vm05.local ceph-mon[130117]: pgmap v59: 65 pgs: 65 active+clean; 289 MiB data, 2.7 GiB used, 117 GiB / 120 GiB avail; 28 KiB/s rd, 1.3 MiB/s wr, 123 op/s 2026-03-10T07:52:20.821 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:20 vm05.local ceph-mon[130117]: mon.vm05 calling monitor election 2026-03-10T07:52:20.821 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:20 vm05.local ceph-mon[130117]: mon.vm05 is new leader, mons vm05,vm08 in quorum (ranks 0,1) 2026-03-10T07:52:20.821 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:20 vm05.local ceph-mon[130117]: monmap epoch 2 2026-03-10T07:52:20.821 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:20 vm05.local ceph-mon[130117]: fsid 12e9780e-1c55-11f1-8896-79f7c2e9b508 2026-03-10T07:52:20.821 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:20 vm05.local ceph-mon[130117]: last_changed 2026-03-10T07:46:54.511496+0000 2026-03-10T07:52:20.821 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 
07:52:20 vm05.local ceph-mon[130117]: created 2026-03-10T07:45:39.881211+0000 2026-03-10T07:52:20.821 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:20 vm05.local ceph-mon[130117]: min_mon_release 18 (reef) 2026-03-10T07:52:20.821 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:20 vm05.local ceph-mon[130117]: election_strategy: 1 2026-03-10T07:52:20.821 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:20 vm05.local ceph-mon[130117]: 0: [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] mon.vm05 2026-03-10T07:52:20.821 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:20 vm05.local ceph-mon[130117]: 1: [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] mon.vm08 2026-03-10T07:52:20.821 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:20 vm05.local ceph-mon[130117]: fsmap cephfs:1 {0=cephfs.vm05.omfhnh=up:active} 3 up:standby 2026-03-10T07:52:20.821 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:20 vm05.local ceph-mon[130117]: osdmap e42: 6 total, 6 up, 6 in 2026-03-10T07:52:20.821 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:20 vm05.local ceph-mon[130117]: mgrmap e31: vm05.blexke(active, since 113s), standbys: vm08.orfpog 2026-03-10T07:52:20.821 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:20 vm05.local ceph-mon[130117]: Health detail: HEALTH_WARN 1 filesystem with deprecated feature inline_data 2026-03-10T07:52:20.821 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:20 vm05.local ceph-mon[130117]: [WRN] FS_INLINE_DATA_DEPRECATED: 1 filesystem with deprecated feature inline_data 2026-03-10T07:52:20.821 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:20 vm05.local ceph-mon[130117]: fs cephfs has deprecated feature inline_data enabled. 
2026-03-10T07:52:20.821 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:20 vm05.local ceph-mon[130117]: from='mgr.14628 ' entity='' 2026-03-10T07:52:20.821 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:20 vm05.local ceph-mon[130117]: mgrmap e32: vm05.blexke(active, since 113s), standbys: vm08.orfpog 2026-03-10T07:52:26.023 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:25 vm05.local ceph-mon[130117]: Standby manager daemon vm08.orfpog restarted 2026-03-10T07:52:26.023 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:25 vm05.local ceph-mon[130117]: Standby manager daemon vm08.orfpog started 2026-03-10T07:52:26.023 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:25 vm05.local ceph-mon[130117]: from='mgr.? 192.168.123.108:0/4276167841' entity='mgr.vm08.orfpog' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/vm08.orfpog/crt"}]: dispatch 2026-03-10T07:52:26.023 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:25 vm05.local ceph-mon[130117]: from='mgr.? 192.168.123.108:0/4276167841' entity='mgr.vm08.orfpog' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/crt"}]: dispatch 2026-03-10T07:52:26.023 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:25 vm05.local ceph-mon[130117]: from='mgr.? 192.168.123.108:0/4276167841' entity='mgr.vm08.orfpog' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/vm08.orfpog/key"}]: dispatch 2026-03-10T07:52:26.023 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:25 vm05.local ceph-mon[130117]: from='mgr.? 
192.168.123.108:0/4276167841' entity='mgr.vm08.orfpog' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/key"}]: dispatch 2026-03-10T07:52:26.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:25 vm08.local ceph-mon[59917]: Standby manager daemon vm08.orfpog restarted 2026-03-10T07:52:26.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:25 vm08.local ceph-mon[59917]: Standby manager daemon vm08.orfpog started 2026-03-10T07:52:26.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:25 vm08.local ceph-mon[59917]: from='mgr.? 192.168.123.108:0/4276167841' entity='mgr.vm08.orfpog' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/vm08.orfpog/crt"}]: dispatch 2026-03-10T07:52:26.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:25 vm08.local ceph-mon[59917]: from='mgr.? 192.168.123.108:0/4276167841' entity='mgr.vm08.orfpog' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/crt"}]: dispatch 2026-03-10T07:52:26.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:25 vm08.local ceph-mon[59917]: from='mgr.? 192.168.123.108:0/4276167841' entity='mgr.vm08.orfpog' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/vm08.orfpog/key"}]: dispatch 2026-03-10T07:52:26.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:25 vm08.local ceph-mon[59917]: from='mgr.? 
192.168.123.108:0/4276167841' entity='mgr.vm08.orfpog' cmd=[{"prefix": "config-key get", "key": "mgr/dashboard/key"}]: dispatch 2026-03-10T07:52:26.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:26 vm05.local ceph-mon[130117]: mgrmap e33: vm05.blexke(active, since 118s), standbys: vm08.orfpog 2026-03-10T07:52:26.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:26 vm05.local ceph-mon[130117]: Active manager daemon vm05.blexke restarted 2026-03-10T07:52:26.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:26 vm05.local ceph-mon[130117]: Activating manager daemon vm05.blexke 2026-03-10T07:52:26.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:26 vm05.local ceph-mon[130117]: osdmap e43: 6 total, 6 up, 6 in 2026-03-10T07:52:26.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:26 vm05.local ceph-mon[130117]: mgrmap e34: vm05.blexke(active, starting, since 0.0993942s), standbys: vm08.orfpog 2026-03-10T07:52:26.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:26 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "mon metadata", "id": "vm05"}]: dispatch 2026-03-10T07:52:26.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:26 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "mon metadata", "id": "vm08"}]: dispatch 2026-03-10T07:52:26.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:26 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm05.pavqil"}]: dispatch 2026-03-10T07:52:26.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:26 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm08.ybmbgd"}]: dispatch 2026-03-10T07:52:26.907 
INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:26 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm05.omfhnh"}]: dispatch 2026-03-10T07:52:26.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:26 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm08.dgsaon"}]: dispatch 2026-03-10T07:52:27.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:26 vm08.local ceph-mon[59917]: mgrmap e33: vm05.blexke(active, since 118s), standbys: vm08.orfpog 2026-03-10T07:52:27.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:26 vm08.local ceph-mon[59917]: Active manager daemon vm05.blexke restarted 2026-03-10T07:52:27.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:26 vm08.local ceph-mon[59917]: Activating manager daemon vm05.blexke 2026-03-10T07:52:27.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:26 vm08.local ceph-mon[59917]: osdmap e43: 6 total, 6 up, 6 in 2026-03-10T07:52:27.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:26 vm08.local ceph-mon[59917]: mgrmap e34: vm05.blexke(active, starting, since 0.0993942s), standbys: vm08.orfpog 2026-03-10T07:52:27.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:26 vm08.local ceph-mon[59917]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "mon metadata", "id": "vm05"}]: dispatch 2026-03-10T07:52:27.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:26 vm08.local ceph-mon[59917]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "mon metadata", "id": "vm08"}]: dispatch 2026-03-10T07:52:27.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:26 vm08.local ceph-mon[59917]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "mds metadata", "who": 
"cephfs.vm05.pavqil"}]: dispatch 2026-03-10T07:52:27.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:26 vm08.local ceph-mon[59917]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm08.ybmbgd"}]: dispatch 2026-03-10T07:52:27.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:26 vm08.local ceph-mon[59917]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm05.omfhnh"}]: dispatch 2026-03-10T07:52:27.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:26 vm08.local ceph-mon[59917]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm08.dgsaon"}]: dispatch 2026-03-10T07:52:27.941 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:27 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "mgr metadata", "who": "vm05.blexke", "id": "vm05.blexke"}]: dispatch 2026-03-10T07:52:27.941 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:27 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "mgr metadata", "who": "vm08.orfpog", "id": "vm08.orfpog"}]: dispatch 2026-03-10T07:52:27.941 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:27 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "osd metadata", "id": 0}]: dispatch 2026-03-10T07:52:27.941 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:27 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "osd metadata", "id": 1}]: dispatch 2026-03-10T07:52:27.941 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:27 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "osd 
metadata", "id": 2}]: dispatch 2026-03-10T07:52:27.941 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:27 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "osd metadata", "id": 3}]: dispatch 2026-03-10T07:52:27.941 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:27 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "osd metadata", "id": 4}]: dispatch 2026-03-10T07:52:27.941 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:27 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "osd metadata", "id": 5}]: dispatch 2026-03-10T07:52:27.941 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:27 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "mds metadata"}]: dispatch 2026-03-10T07:52:27.941 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:27 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "osd metadata"}]: dispatch 2026-03-10T07:52:27.941 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:27 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "mon metadata"}]: dispatch 2026-03-10T07:52:27.941 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:27 vm05.local ceph-mon[130117]: Manager daemon vm05.blexke is now available 2026-03-10T07:52:27.941 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:27 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T07:52:27.941 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:27 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 
cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T07:52:27.941 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:27 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/vm05.blexke/mirror_snapshot_schedule"}]: dispatch 2026-03-10T07:52:27.941 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:27 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/vm05.blexke/trash_purge_schedule"}]: dispatch 2026-03-10T07:52:28.099 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:27 vm08.local ceph-mon[59917]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "mgr metadata", "who": "vm05.blexke", "id": "vm05.blexke"}]: dispatch 2026-03-10T07:52:28.099 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:27 vm08.local ceph-mon[59917]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "mgr metadata", "who": "vm08.orfpog", "id": "vm08.orfpog"}]: dispatch 2026-03-10T07:52:28.099 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:27 vm08.local ceph-mon[59917]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "osd metadata", "id": 0}]: dispatch 2026-03-10T07:52:28.099 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:27 vm08.local ceph-mon[59917]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "osd metadata", "id": 1}]: dispatch 2026-03-10T07:52:28.099 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:27 vm08.local ceph-mon[59917]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch 2026-03-10T07:52:28.099 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:27 vm08.local ceph-mon[59917]: 
from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "osd metadata", "id": 3}]: dispatch 2026-03-10T07:52:28.099 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:27 vm08.local ceph-mon[59917]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "osd metadata", "id": 4}]: dispatch 2026-03-10T07:52:28.099 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:27 vm08.local ceph-mon[59917]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "osd metadata", "id": 5}]: dispatch 2026-03-10T07:52:28.099 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:27 vm08.local ceph-mon[59917]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "mds metadata"}]: dispatch 2026-03-10T07:52:28.099 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:27 vm08.local ceph-mon[59917]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "osd metadata"}]: dispatch 2026-03-10T07:52:28.099 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:27 vm08.local ceph-mon[59917]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "mon metadata"}]: dispatch 2026-03-10T07:52:28.099 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:27 vm08.local ceph-mon[59917]: Manager daemon vm05.blexke is now available 2026-03-10T07:52:28.099 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:27 vm08.local ceph-mon[59917]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T07:52:28.099 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:27 vm08.local ceph-mon[59917]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T07:52:28.099 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:27 vm08.local 
ceph-mon[59917]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/vm05.blexke/mirror_snapshot_schedule"}]: dispatch 2026-03-10T07:52:28.099 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:27 vm08.local ceph-mon[59917]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/vm05.blexke/trash_purge_schedule"}]: dispatch 2026-03-10T07:52:29.037 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:28 vm05.local ceph-mon[130117]: mgrmap e35: vm05.blexke(active, since 1.15618s), standbys: vm08.orfpog 2026-03-10T07:52:29.037 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:28 vm05.local ceph-mon[130117]: pgmap v3: 65 pgs: 65 active+clean; 300 MiB data, 2.8 GiB used, 117 GiB / 120 GiB avail 2026-03-10T07:52:29.110 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:28 vm08.local ceph-mon[59917]: mgrmap e35: vm05.blexke(active, since 1.15618s), standbys: vm08.orfpog 2026-03-10T07:52:29.110 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:28 vm08.local ceph-mon[59917]: pgmap v3: 65 pgs: 65 active+clean; 300 MiB data, 2.8 GiB used, 117 GiB / 120 GiB avail 2026-03-10T07:52:30.065 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:29 vm08.local ceph-mon[59917]: mgrmap e36: vm05.blexke(active, since 2s), standbys: vm08.orfpog 2026-03-10T07:52:30.066 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:29 vm08.local ceph-mon[59917]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:52:30.066 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:29 vm08.local ceph-mon[59917]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:52:30.066 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:29 vm08.local ceph-mon[59917]: [10/Mar/2026:07:52:29] ENGINE Bus STARTING 2026-03-10T07:52:30.066 
INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:29 vm08.local ceph-mon[59917]: [10/Mar/2026:07:52:29] ENGINE Serving on http://192.168.123.105:8765 2026-03-10T07:52:30.066 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:29 vm08.local ceph-mon[59917]: [10/Mar/2026:07:52:29] ENGINE Serving on https://192.168.123.105:7150 2026-03-10T07:52:30.066 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:29 vm08.local ceph-mon[59917]: [10/Mar/2026:07:52:29] ENGINE Bus STARTED 2026-03-10T07:52:30.066 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:29 vm08.local ceph-mon[59917]: [10/Mar/2026:07:52:29] ENGINE Client ('192.168.123.105', 35134) lost — peer dropped the TLS connection suddenly, during handshake: (6, 'TLS/SSL connection has been closed (EOF) (_ssl.c:1147)') 2026-03-10T07:52:30.066 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:29 vm08.local ceph-mon[59917]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:52:30.066 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:29 vm08.local ceph-mon[59917]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:52:30.068 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:29 vm05.local ceph-mon[130117]: mgrmap e36: vm05.blexke(active, since 2s), standbys: vm08.orfpog 2026-03-10T07:52:30.068 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:29 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:52:30.068 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:29 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:52:30.068 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:29 vm05.local ceph-mon[130117]: [10/Mar/2026:07:52:29] ENGINE Bus STARTING 2026-03-10T07:52:30.068 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:29 vm05.local ceph-mon[130117]: [10/Mar/2026:07:52:29] ENGINE Serving on 
http://192.168.123.105:8765 2026-03-10T07:52:30.068 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:29 vm05.local ceph-mon[130117]: [10/Mar/2026:07:52:29] ENGINE Serving on https://192.168.123.105:7150 2026-03-10T07:52:30.068 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:29 vm05.local ceph-mon[130117]: [10/Mar/2026:07:52:29] ENGINE Bus STARTED 2026-03-10T07:52:30.068 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:29 vm05.local ceph-mon[130117]: [10/Mar/2026:07:52:29] ENGINE Client ('192.168.123.105', 35134) lost — peer dropped the TLS connection suddenly, during handshake: (6, 'TLS/SSL connection has been closed (EOF) (_ssl.c:1147)') 2026-03-10T07:52:30.068 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:29 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:52:30.068 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:29 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:52:31.659 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:31 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:52:31.659 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:31 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:52:31.659 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:31 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:52:31.659 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:31 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:52:31.659 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:31 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "config rm", "who": "osd/host:vm08", 
"name": "osd_memory_target"}]: dispatch 2026-03-10T07:52:31.659 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:31 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:52:31.659 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:31 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:52:31.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:31 vm08.local ceph-mon[59917]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:52:31.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:31 vm08.local ceph-mon[59917]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:52:31.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:31 vm08.local ceph-mon[59917]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:52:31.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:31 vm08.local ceph-mon[59917]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:52:31.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:31 vm08.local ceph-mon[59917]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "config rm", "who": "osd/host:vm08", "name": "osd_memory_target"}]: dispatch 2026-03-10T07:52:31.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:31 vm08.local ceph-mon[59917]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:52:31.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:31 vm08.local ceph-mon[59917]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:52:32.669 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:32 vm08.local ceph-mon[59917]: pgmap v5: 65 pgs: 65 active+clean; 300 MiB data, 2.8 GiB used, 117 GiB / 120 GiB avail 2026-03-10T07:52:32.669 
INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:32 vm08.local ceph-mon[59917]: Detected new or changed devices on vm08 2026-03-10T07:52:32.669 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:32 vm08.local ceph-mon[59917]: mgrmap e37: vm05.blexke(active, since 4s), standbys: vm08.orfpog 2026-03-10T07:52:32.669 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:32 vm08.local ceph-mon[59917]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:52:32.670 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:32 vm08.local ceph-mon[59917]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:52:32.670 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:32 vm08.local ceph-mon[59917]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "config rm", "who": "osd/host:vm05", "name": "osd_memory_target"}]: dispatch 2026-03-10T07:52:32.670 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:32 vm08.local ceph-mon[59917]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T07:52:32.670 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:32 vm08.local ceph-mon[59917]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T07:52:32.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:32 vm05.local ceph-mon[130117]: pgmap v5: 65 pgs: 65 active+clean; 300 MiB data, 2.8 GiB used, 117 GiB / 120 GiB avail 2026-03-10T07:52:32.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:32 vm05.local ceph-mon[130117]: Detected new or changed devices on vm08 2026-03-10T07:52:32.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:32 vm05.local ceph-mon[130117]: mgrmap e37: vm05.blexke(active, since 4s), standbys: vm08.orfpog 2026-03-10T07:52:32.907 
INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:32 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:52:32.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:32 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:52:32.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:32 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "config rm", "who": "osd/host:vm05", "name": "osd_memory_target"}]: dispatch 2026-03-10T07:52:32.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:32 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T07:52:32.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:32 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T07:52:33.780 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:33 vm08.local ceph-mon[59917]: Detected new or changed devices on vm05 2026-03-10T07:52:33.780 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:33 vm08.local ceph-mon[59917]: Updating vm05:/etc/ceph/ceph.conf 2026-03-10T07:52:33.780 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:33 vm08.local ceph-mon[59917]: Updating vm08:/etc/ceph/ceph.conf 2026-03-10T07:52:33.780 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:33 vm08.local ceph-mon[59917]: Updating vm08:/var/lib/ceph/12e9780e-1c55-11f1-8896-79f7c2e9b508/config/ceph.conf 2026-03-10T07:52:33.780 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:33 vm08.local ceph-mon[59917]: Updating vm05:/var/lib/ceph/12e9780e-1c55-11f1-8896-79f7c2e9b508/config/ceph.conf 2026-03-10T07:52:33.780 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 
07:52:33 vm08.local ceph-mon[59917]: pgmap v6: 65 pgs: 65 active+clean; 300 MiB data, 2.8 GiB used, 117 GiB / 120 GiB avail 2026-03-10T07:52:33.780 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:33 vm08.local ceph-mon[59917]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:52:33.780 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:33 vm08.local ceph-mon[59917]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:52:33.780 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:33 vm08.local ceph-mon[59917]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:52:33.780 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:33 vm08.local ceph-mon[59917]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:52:33.780 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:33 vm08.local ceph-mon[59917]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:52:33.780 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:33 vm08.local ceph-mon[59917]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T07:52:33.780 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:33 vm08.local ceph-mon[59917]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T07:52:33.780 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:33 vm08.local ceph-mon[59917]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "quorum_status"}]: dispatch 2026-03-10T07:52:33.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:33 vm05.local ceph-mon[130117]: Detected new or changed devices on vm05 2026-03-10T07:52:33.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:33 vm05.local ceph-mon[130117]: Updating 
vm05:/etc/ceph/ceph.conf 2026-03-10T07:52:33.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:33 vm05.local ceph-mon[130117]: Updating vm08:/etc/ceph/ceph.conf 2026-03-10T07:52:33.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:33 vm05.local ceph-mon[130117]: Updating vm08:/var/lib/ceph/12e9780e-1c55-11f1-8896-79f7c2e9b508/config/ceph.conf 2026-03-10T07:52:33.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:33 vm05.local ceph-mon[130117]: Updating vm05:/var/lib/ceph/12e9780e-1c55-11f1-8896-79f7c2e9b508/config/ceph.conf 2026-03-10T07:52:33.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:33 vm05.local ceph-mon[130117]: pgmap v6: 65 pgs: 65 active+clean; 300 MiB data, 2.8 GiB used, 117 GiB / 120 GiB avail 2026-03-10T07:52:33.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:33 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:52:33.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:33 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:52:33.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:33 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:52:33.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:33 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:52:33.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:33 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:52:33.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:33 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T07:52:33.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:33 vm05.local 
ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T07:52:33.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:33 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "quorum_status"}]: dispatch 2026-03-10T07:52:34.662 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:34 vm08.local systemd[1]: Stopping Ceph mon.vm08 for 12e9780e-1c55-11f1-8896-79f7c2e9b508... 2026-03-10T07:52:34.662 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:34 vm08.local ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508-mon-vm08[59912]: 2026-03-10T07:52:34.546+0000 7f61500ed700 -1 received signal: Terminated from /run/podman-init -- /usr/bin/ceph-mon -n mon.vm08 -f --setuser ceph --setgroup ceph --default-log-to-file=false --default-log-to-journald=true --default-log-to-stderr=false --default-mon-cluster-log-to-file=false --default-mon-cluster-log-to-journald=true --default-mon-cluster-log-to-stderr=false (PID: 1) UID: 0 2026-03-10T07:52:34.662 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:34 vm08.local ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508-mon-vm08[59912]: 2026-03-10T07:52:34.546+0000 7f61500ed700 -1 mon.vm08@1(peon) e2 *** Got Signal Terminated *** 2026-03-10T07:52:34.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:34 vm08.local podman[107772]: 2026-03-10 07:52:34.661587619 +0000 UTC m=+0.256973682 container died e01dfb7124745f12fe3e5b23cd8c968a6154fcd7cdc84628f1ecdd10927244fa (image=quay.io/ceph/ceph@sha256:9f35728f6070a596500c0804814a12ab6b98e05067316dc64876fb4b28d04af3, name=ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508-mon-vm08, GIT_COMMIT=1617517b9622744995c4661989e3c30d036a7cfd, org.label-schema.name=CentOS Stream 8 Base Image, CEPH_POINT_RELEASE=-18.2.1, GIT_REPO=https://github.com/ceph/ceph-container.git, maintainer=Guillaume Abrioux , org.label-schema.license=GPLv2, GIT_CLEAN=True, RELEASE=HEAD, 
ceph=True, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.29.1, org.label-schema.build-date=20240222, GIT_BRANCH=HEAD) 2026-03-10T07:52:34.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:34 vm08.local podman[107772]: 2026-03-10 07:52:34.727781904 +0000 UTC m=+0.323167967 container remove e01dfb7124745f12fe3e5b23cd8c968a6154fcd7cdc84628f1ecdd10927244fa (image=quay.io/ceph/ceph@sha256:9f35728f6070a596500c0804814a12ab6b98e05067316dc64876fb4b28d04af3, name=ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508-mon-vm08, ceph=True, maintainer=Guillaume Abrioux , GIT_COMMIT=1617517b9622744995c4661989e3c30d036a7cfd, GIT_REPO=https://github.com/ceph/ceph-container.git, org.label-schema.name=CentOS Stream 8 Base Image, GIT_CLEAN=True, org.label-schema.build-date=20240222, org.label-schema.schema-version=1.0, GIT_BRANCH=HEAD, io.buildah.version=1.29.1, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, CEPH_POINT_RELEASE=-18.2.1, RELEASE=HEAD) 2026-03-10T07:52:34.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:34 vm08.local bash[107772]: ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508-mon-vm08 2026-03-10T07:52:34.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:34 vm08.local systemd[1]: ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508@mon.vm08.service: Deactivated successfully. 2026-03-10T07:52:34.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:34 vm08.local systemd[1]: Stopped Ceph mon.vm08 for 12e9780e-1c55-11f1-8896-79f7c2e9b508. 2026-03-10T07:52:34.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:34 vm08.local systemd[1]: ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508@mon.vm08.service: Consumed 4.000s CPU time. 2026-03-10T07:52:35.200 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:35 vm08.local systemd[1]: Starting Ceph mon.vm08 for 12e9780e-1c55-11f1-8896-79f7c2e9b508... 
2026-03-10T07:52:35.200 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:35 vm08.local podman[107884]: 2026-03-10 07:52:35.173821775 +0000 UTC m=+0.021302702 container create 73d9a504f360f90037d6612292f4143c3913ca51a88877820f797568bb828da0 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508-mon-vm08, ceph=True, org.label-schema.build-date=20260223, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, io.buildah.version=1.41.3, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image) 2026-03-10T07:52:35.602 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:35 vm08.local podman[107884]: 2026-03-10 07:52:35.207999109 +0000 UTC m=+0.055480047 container init 73d9a504f360f90037d6612292f4143c3913ca51a88877820f797568bb828da0 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508-mon-vm08, OSD_FLAVOR=default, io.buildah.version=1.41.3, org.label-schema.build-date=20260223, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team , 
ceph=True, CEPH_REF=squid, org.opencontainers.image.documentation=https://docs.ceph.com/) 2026-03-10T07:52:35.602 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:35 vm08.local podman[107884]: 2026-03-10 07:52:35.214351713 +0000 UTC m=+0.061832640 container start 73d9a504f360f90037d6612292f4143c3913ca51a88877820f797568bb828da0 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508-mon-vm08, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, ceph=True, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.build-date=20260223, org.opencontainers.image.authors=Ceph Release Team , org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, OSD_FLAVOR=default) 2026-03-10T07:52:35.602 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:35 vm08.local bash[107884]: 73d9a504f360f90037d6612292f4143c3913ca51a88877820f797568bb828da0 2026-03-10T07:52:35.602 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:35 vm08.local podman[107884]: 2026-03-10 07:52:35.164024052 +0000 UTC m=+0.011504989 image pull 654f31e6858eb235bbece362255b685a945f2b6a367e2b88c4930c984fbb214c quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc 2026-03-10T07:52:35.602 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:35 vm08.local systemd[1]: Started Ceph mon.vm08 for 12e9780e-1c55-11f1-8896-79f7c2e9b508. 
2026-03-10T07:52:35.602 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:35 vm08.local ceph-mon[107898]: set uid:gid to 167:167 (ceph:ceph) 2026-03-10T07:52:35.602 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:35 vm08.local ceph-mon[107898]: ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable), process ceph-mon, pid 2 2026-03-10T07:52:35.602 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:35 vm08.local ceph-mon[107898]: pidfile_write: ignore empty --pid-file 2026-03-10T07:52:35.602 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:35 vm08.local ceph-mon[107898]: load: jerasure load: lrc 2026-03-10T07:52:35.602 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:35 vm08.local ceph-mon[107898]: rocksdb: RocksDB version: 7.9.2 2026-03-10T07:52:35.602 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:35 vm08.local ceph-mon[107898]: rocksdb: Git sha 0 2026-03-10T07:52:35.602 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:35 vm08.local ceph-mon[107898]: rocksdb: Compile date 2026-02-25 18:11:04 2026-03-10T07:52:35.603 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:35 vm08.local ceph-mon[107898]: rocksdb: DB SUMMARY 2026-03-10T07:52:35.603 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:35 vm08.local ceph-mon[107898]: rocksdb: DB Session ID: Y7D1XZ2LWNGG506ORJPK 2026-03-10T07:52:35.603 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:35 vm08.local ceph-mon[107898]: rocksdb: CURRENT file: CURRENT 2026-03-10T07:52:35.603 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:35 vm08.local ceph-mon[107898]: rocksdb: IDENTITY file: IDENTITY 2026-03-10T07:52:35.603 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:35 vm08.local ceph-mon[107898]: rocksdb: MANIFEST file: MANIFEST-000010 size: 913 Bytes 2026-03-10T07:52:35.603 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:35 vm08.local ceph-mon[107898]: rocksdb: SST files in /var/lib/ceph/mon/ceph-vm08/store.db 
dir, Total Num: 1, files: 000021.sst 2026-03-10T07:52:35.603 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:35 vm08.local ceph-mon[107898]: rocksdb: Write Ahead Log file in /var/lib/ceph/mon/ceph-vm08/store.db: 000019.log size: 3314385 ; 2026-03-10T07:52:35.603 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:35 vm08.local ceph-mon[107898]: rocksdb: Options.error_if_exists: 0 2026-03-10T07:52:35.603 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:35 vm08.local ceph-mon[107898]: rocksdb: Options.create_if_missing: 0 2026-03-10T07:52:35.603 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:35 vm08.local ceph-mon[107898]: rocksdb: Options.paranoid_checks: 1 2026-03-10T07:52:35.603 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:35 vm08.local ceph-mon[107898]: rocksdb: Options.flush_verify_memtable_count: 1 2026-03-10T07:52:35.603 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:35 vm08.local ceph-mon[107898]: rocksdb: Options.track_and_verify_wals_in_manifest: 0 2026-03-10T07:52:35.603 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:35 vm08.local ceph-mon[107898]: rocksdb: Options.verify_sst_unique_id_in_manifest: 1 2026-03-10T07:52:35.603 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:35 vm08.local ceph-mon[107898]: rocksdb: Options.env: 0x565541594dc0 2026-03-10T07:52:35.603 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:35 vm08.local ceph-mon[107898]: rocksdb: Options.fs: PosixFileSystem 2026-03-10T07:52:35.603 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:35 vm08.local ceph-mon[107898]: rocksdb: Options.info_log: 0x5655437ed900 2026-03-10T07:52:35.603 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:35 vm08.local ceph-mon[107898]: rocksdb: Options.max_file_opening_threads: 16 2026-03-10T07:52:35.603 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:35 vm08.local ceph-mon[107898]: rocksdb: Options.statistics: (nil) 2026-03-10T07:52:35.603 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 
10 07:52:35 vm08.local ceph-mon[107898]: rocksdb: Options.use_fsync: 0 2026-03-10T07:52:35.603 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:35 vm08.local ceph-mon[107898]: rocksdb: Options.max_log_file_size: 0 2026-03-10T07:52:35.603 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:35 vm08.local ceph-mon[107898]: rocksdb: Options.max_manifest_file_size: 1073741824 2026-03-10T07:52:35.603 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:35 vm08.local ceph-mon[107898]: rocksdb: Options.log_file_time_to_roll: 0 2026-03-10T07:52:35.603 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:35 vm08.local ceph-mon[107898]: rocksdb: Options.keep_log_file_num: 1000 2026-03-10T07:52:35.603 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:35 vm08.local ceph-mon[107898]: rocksdb: Options.recycle_log_file_num: 0 2026-03-10T07:52:35.603 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:35 vm08.local ceph-mon[107898]: rocksdb: Options.allow_fallocate: 1 2026-03-10T07:52:35.603 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:35 vm08.local ceph-mon[107898]: rocksdb: Options.allow_mmap_reads: 0 2026-03-10T07:52:35.603 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:35 vm08.local ceph-mon[107898]: rocksdb: Options.allow_mmap_writes: 0 2026-03-10T07:52:35.603 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:35 vm08.local ceph-mon[107898]: rocksdb: Options.use_direct_reads: 0 2026-03-10T07:52:35.603 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:35 vm08.local ceph-mon[107898]: rocksdb: Options.use_direct_io_for_flush_and_compaction: 0 2026-03-10T07:52:35.603 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:35 vm08.local ceph-mon[107898]: rocksdb: Options.create_missing_column_families: 0 2026-03-10T07:52:35.603 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:35 vm08.local ceph-mon[107898]: rocksdb: Options.db_log_dir: 2026-03-10T07:52:35.603 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:35 vm08.local 
ceph-mon[107898]: rocksdb: Options.wal_dir: 2026-03-10T07:52:35.603 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:35 vm08.local ceph-mon[107898]: rocksdb: Options.table_cache_numshardbits: 6 2026-03-10T07:52:35.603 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:35 vm08.local ceph-mon[107898]: rocksdb: Options.WAL_ttl_seconds: 0 2026-03-10T07:52:35.603 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:35 vm08.local ceph-mon[107898]: rocksdb: Options.WAL_size_limit_MB: 0 2026-03-10T07:52:35.603 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:35 vm08.local ceph-mon[107898]: rocksdb: Options.max_write_batch_group_size_bytes: 1048576 2026-03-10T07:52:35.603 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:35 vm08.local ceph-mon[107898]: rocksdb: Options.manifest_preallocation_size: 4194304 2026-03-10T07:52:35.603 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:35 vm08.local ceph-mon[107898]: rocksdb: Options.is_fd_close_on_exec: 1 2026-03-10T07:52:35.603 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:35 vm08.local ceph-mon[107898]: rocksdb: Options.advise_random_on_open: 1 2026-03-10T07:52:35.603 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:35 vm08.local ceph-mon[107898]: rocksdb: Options.db_write_buffer_size: 0 2026-03-10T07:52:35.603 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:35 vm08.local ceph-mon[107898]: rocksdb: Options.write_buffer_manager: 0x5655437f1900 2026-03-10T07:52:35.603 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:35 vm08.local ceph-mon[107898]: rocksdb: Options.access_hint_on_compaction_start: 1 2026-03-10T07:52:35.603 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:35 vm08.local ceph-mon[107898]: rocksdb: Options.random_access_max_buffer_size: 1048576 2026-03-10T07:52:35.603 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:35 vm08.local ceph-mon[107898]: rocksdb: Options.use_adaptive_mutex: 0 2026-03-10T07:52:35.603 
INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:35 vm08.local ceph-mon[107898]: rocksdb: Options.rate_limiter: (nil) 2026-03-10T07:52:35.603 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:35 vm08.local ceph-mon[107898]: rocksdb: Options.sst_file_manager.rate_bytes_per_sec: 0 2026-03-10T07:52:35.603 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:35 vm08.local ceph-mon[107898]: rocksdb: Options.wal_recovery_mode: 2 2026-03-10T07:52:35.603 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:35 vm08.local ceph-mon[107898]: rocksdb: Options.enable_thread_tracking: 0 2026-03-10T07:52:35.603 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:35 vm08.local ceph-mon[107898]: rocksdb: Options.enable_pipelined_write: 0 2026-03-10T07:52:35.603 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:35 vm08.local ceph-mon[107898]: rocksdb: Options.unordered_write: 0 2026-03-10T07:52:35.604 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:35 vm08.local ceph-mon[107898]: rocksdb: Options.allow_concurrent_memtable_write: 1 2026-03-10T07:52:35.604 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:35 vm08.local ceph-mon[107898]: rocksdb: Options.enable_write_thread_adaptive_yield: 1 2026-03-10T07:52:35.604 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:35 vm08.local ceph-mon[107898]: rocksdb: Options.write_thread_max_yield_usec: 100 2026-03-10T07:52:35.604 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:35 vm08.local ceph-mon[107898]: rocksdb: Options.write_thread_slow_yield_usec: 3 2026-03-10T07:52:35.604 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:35 vm08.local ceph-mon[107898]: rocksdb: Options.row_cache: None 2026-03-10T07:52:35.604 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:35 vm08.local ceph-mon[107898]: rocksdb: Options.wal_filter: None 2026-03-10T07:52:35.604 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:35 vm08.local ceph-mon[107898]: rocksdb: Options.avoid_flush_during_recovery: 0 
2026-03-10T07:52:35.604 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:35 vm08.local ceph-mon[107898]: rocksdb: Options.allow_ingest_behind: 0 2026-03-10T07:52:35.604 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:35 vm08.local ceph-mon[107898]: rocksdb: Options.two_write_queues: 0 2026-03-10T07:52:35.604 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:35 vm08.local ceph-mon[107898]: rocksdb: Options.manual_wal_flush: 0 2026-03-10T07:52:35.604 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:35 vm08.local ceph-mon[107898]: rocksdb: Options.wal_compression: 0 2026-03-10T07:52:35.604 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:35 vm08.local ceph-mon[107898]: rocksdb: Options.atomic_flush: 0 2026-03-10T07:52:35.604 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:35 vm08.local ceph-mon[107898]: rocksdb: Options.avoid_unnecessary_blocking_io: 0 2026-03-10T07:52:35.604 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:35 vm08.local ceph-mon[107898]: rocksdb: Options.persist_stats_to_disk: 0 2026-03-10T07:52:35.604 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:35 vm08.local ceph-mon[107898]: rocksdb: Options.write_dbid_to_manifest: 0 2026-03-10T07:52:35.604 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:35 vm08.local ceph-mon[107898]: rocksdb: Options.log_readahead_size: 0 2026-03-10T07:52:35.604 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:35 vm08.local ceph-mon[107898]: rocksdb: Options.file_checksum_gen_factory: Unknown 2026-03-10T07:52:35.604 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:35 vm08.local ceph-mon[107898]: rocksdb: Options.best_efforts_recovery: 0 2026-03-10T07:52:35.604 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:35 vm08.local ceph-mon[107898]: rocksdb: Options.max_bgerror_resume_count: 2147483647 2026-03-10T07:52:35.604 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:35 vm08.local ceph-mon[107898]: rocksdb: Options.bgerror_resume_retry_interval: 
1000000 2026-03-10T07:52:35.604 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:35 vm08.local ceph-mon[107898]: rocksdb: Options.allow_data_in_errors: 0 2026-03-10T07:52:35.604 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:35 vm08.local ceph-mon[107898]: rocksdb: Options.db_host_id: __hostname__ 2026-03-10T07:52:35.604 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:35 vm08.local ceph-mon[107898]: rocksdb: Options.enforce_single_del_contracts: true 2026-03-10T07:52:35.604 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:35 vm08.local ceph-mon[107898]: rocksdb: Options.max_background_jobs: 2 2026-03-10T07:52:35.604 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:35 vm08.local ceph-mon[107898]: rocksdb: Options.max_background_compactions: -1 2026-03-10T07:52:35.604 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:35 vm08.local ceph-mon[107898]: rocksdb: Options.max_subcompactions: 1 2026-03-10T07:52:35.604 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:35 vm08.local ceph-mon[107898]: rocksdb: Options.avoid_flush_during_shutdown: 0 2026-03-10T07:52:35.604 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:35 vm08.local ceph-mon[107898]: rocksdb: Options.writable_file_max_buffer_size: 1048576 2026-03-10T07:52:35.604 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:35 vm08.local ceph-mon[107898]: rocksdb: Options.delayed_write_rate : 16777216 2026-03-10T07:52:35.604 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:35 vm08.local ceph-mon[107898]: rocksdb: Options.max_total_wal_size: 0 2026-03-10T07:52:35.604 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:35 vm08.local ceph-mon[107898]: rocksdb: Options.delete_obsolete_files_period_micros: 21600000000 2026-03-10T07:52:35.604 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:35 vm08.local ceph-mon[107898]: rocksdb: Options.stats_dump_period_sec: 600 2026-03-10T07:52:35.604 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:35 vm08.local 
ceph-mon[107898]: rocksdb: Options.stats_persist_period_sec: 600 2026-03-10T07:52:35.604 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:35 vm08.local ceph-mon[107898]: rocksdb: Options.stats_history_buffer_size: 1048576 2026-03-10T07:52:35.604 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:35 vm08.local ceph-mon[107898]: rocksdb: Options.max_open_files: -1 2026-03-10T07:52:35.604 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:35 vm08.local ceph-mon[107898]: rocksdb: Options.bytes_per_sync: 0 2026-03-10T07:52:35.604 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:35 vm08.local ceph-mon[107898]: rocksdb: Options.wal_bytes_per_sync: 0 2026-03-10T07:52:35.604 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:35 vm08.local ceph-mon[107898]: rocksdb: Options.strict_bytes_per_sync: 0 2026-03-10T07:52:35.604 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:35 vm08.local ceph-mon[107898]: rocksdb: Options.compaction_readahead_size: 0 2026-03-10T07:52:35.604 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:35 vm08.local ceph-mon[107898]: rocksdb: Options.max_background_flushes: -1 2026-03-10T07:52:35.604 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:35 vm08.local ceph-mon[107898]: rocksdb: Compression algorithms supported: 2026-03-10T07:52:35.604 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:35 vm08.local ceph-mon[107898]: rocksdb: kZSTD supported: 0 2026-03-10T07:52:35.604 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:35 vm08.local ceph-mon[107898]: rocksdb: kXpressCompression supported: 0 2026-03-10T07:52:35.604 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:35 vm08.local ceph-mon[107898]: rocksdb: kBZip2Compression supported: 0 2026-03-10T07:52:35.604 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:35 vm08.local ceph-mon[107898]: rocksdb: kZSTDNotFinalCompression supported: 0 2026-03-10T07:52:35.604 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:35 vm08.local 
ceph-mon[107898]: rocksdb: kLZ4Compression supported: 1 2026-03-10T07:52:35.604 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:35 vm08.local ceph-mon[107898]: rocksdb: kZlibCompression supported: 1 2026-03-10T07:52:35.604 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:35 vm08.local ceph-mon[107898]: rocksdb: kLZ4HCCompression supported: 1 2026-03-10T07:52:35.604 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:35 vm08.local ceph-mon[107898]: rocksdb: kSnappyCompression supported: 1 2026-03-10T07:52:35.604 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:35 vm08.local ceph-mon[107898]: rocksdb: Fast CRC32 supported: Supported on x86 2026-03-10T07:52:35.604 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:35 vm08.local ceph-mon[107898]: rocksdb: DMutex implementation: pthread_mutex_t 2026-03-10T07:52:35.604 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:35 vm08.local ceph-mon[107898]: rocksdb: [db/version_set.cc:5527] Recovering from manifest file: /var/lib/ceph/mon/ceph-vm08/store.db/MANIFEST-000010 2026-03-10T07:52:35.604 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:35 vm08.local ceph-mon[107898]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [default]: 2026-03-10T07:52:35.604 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:35 vm08.local ceph-mon[107898]: rocksdb: Options.comparator: leveldb.BytewiseComparator 2026-03-10T07:52:35.604 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:35 vm08.local ceph-mon[107898]: rocksdb: Options.merge_operator: 2026-03-10T07:52:35.604 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:35 vm08.local ceph-mon[107898]: rocksdb: Options.compaction_filter: None 2026-03-10T07:52:35.604 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:35 vm08.local ceph-mon[107898]: rocksdb: Options.compaction_filter_factory: None 2026-03-10T07:52:35.604 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:35 vm08.local ceph-mon[107898]: rocksdb: 
Options.sst_partitioner_factory: None 2026-03-10T07:52:35.604 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:35 vm08.local ceph-mon[107898]: rocksdb: Options.memtable_factory: SkipListFactory 2026-03-10T07:52:35.604 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:35 vm08.local ceph-mon[107898]: rocksdb: Options.table_factory: BlockBasedTable 2026-03-10T07:52:35.604 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:35 vm08.local ceph-mon[107898]: rocksdb: table_factory options: flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x5655437ed580) 2026-03-10T07:52:35.604 INFO:journalctl@ceph.mon.vm08.vm08.stdout: cache_index_and_filter_blocks: 1 2026-03-10T07:52:35.604 INFO:journalctl@ceph.mon.vm08.vm08.stdout: cache_index_and_filter_blocks_with_high_priority: 0 2026-03-10T07:52:35.605 INFO:journalctl@ceph.mon.vm08.vm08.stdout: pin_l0_filter_and_index_blocks_in_cache: 0 2026-03-10T07:52:35.605 INFO:journalctl@ceph.mon.vm08.vm08.stdout: pin_top_level_index_and_filter: 1 2026-03-10T07:52:35.605 INFO:journalctl@ceph.mon.vm08.vm08.stdout: index_type: 0 2026-03-10T07:52:35.605 INFO:journalctl@ceph.mon.vm08.vm08.stdout: data_block_index_type: 0 2026-03-10T07:52:35.605 INFO:journalctl@ceph.mon.vm08.vm08.stdout: index_shortening: 1 2026-03-10T07:52:35.605 INFO:journalctl@ceph.mon.vm08.vm08.stdout: data_block_hash_table_util_ratio: 0.750000 2026-03-10T07:52:35.605 INFO:journalctl@ceph.mon.vm08.vm08.stdout: checksum: 4 2026-03-10T07:52:35.605 INFO:journalctl@ceph.mon.vm08.vm08.stdout: no_block_cache: 0 2026-03-10T07:52:35.605 INFO:journalctl@ceph.mon.vm08.vm08.stdout: block_cache: 0x5655438109b0 2026-03-10T07:52:35.605 INFO:journalctl@ceph.mon.vm08.vm08.stdout: block_cache_name: BinnedLRUCache 2026-03-10T07:52:35.605 INFO:journalctl@ceph.mon.vm08.vm08.stdout: block_cache_options: 2026-03-10T07:52:35.605 INFO:journalctl@ceph.mon.vm08.vm08.stdout: capacity : 536870912 2026-03-10T07:52:35.605 INFO:journalctl@ceph.mon.vm08.vm08.stdout: num_shard_bits : 4 
2026-03-10T07:52:35.605 INFO:journalctl@ceph.mon.vm08.vm08.stdout: strict_capacity_limit : 0 2026-03-10T07:52:35.605 INFO:journalctl@ceph.mon.vm08.vm08.stdout: high_pri_pool_ratio: 0.000 2026-03-10T07:52:35.605 INFO:journalctl@ceph.mon.vm08.vm08.stdout: block_cache_compressed: (nil) 2026-03-10T07:52:35.605 INFO:journalctl@ceph.mon.vm08.vm08.stdout: persistent_cache: (nil) 2026-03-10T07:52:35.605 INFO:journalctl@ceph.mon.vm08.vm08.stdout: block_size: 4096 2026-03-10T07:52:35.605 INFO:journalctl@ceph.mon.vm08.vm08.stdout: block_size_deviation: 10 2026-03-10T07:52:35.605 INFO:journalctl@ceph.mon.vm08.vm08.stdout: block_restart_interval: 16 2026-03-10T07:52:35.605 INFO:journalctl@ceph.mon.vm08.vm08.stdout: index_block_restart_interval: 1 2026-03-10T07:52:35.605 INFO:journalctl@ceph.mon.vm08.vm08.stdout: metadata_block_size: 4096 2026-03-10T07:52:35.605 INFO:journalctl@ceph.mon.vm08.vm08.stdout: partition_filters: 0 2026-03-10T07:52:35.605 INFO:journalctl@ceph.mon.vm08.vm08.stdout: use_delta_encoding: 1 2026-03-10T07:52:35.605 INFO:journalctl@ceph.mon.vm08.vm08.stdout: filter_policy: bloomfilter 2026-03-10T07:52:35.605 INFO:journalctl@ceph.mon.vm08.vm08.stdout: whole_key_filtering: 1 2026-03-10T07:52:35.605 INFO:journalctl@ceph.mon.vm08.vm08.stdout: verify_compression: 0 2026-03-10T07:52:35.605 INFO:journalctl@ceph.mon.vm08.vm08.stdout: read_amp_bytes_per_bit: 0 2026-03-10T07:52:35.605 INFO:journalctl@ceph.mon.vm08.vm08.stdout: format_version: 5 2026-03-10T07:52:35.605 INFO:journalctl@ceph.mon.vm08.vm08.stdout: enable_index_compression: 1 2026-03-10T07:52:35.605 INFO:journalctl@ceph.mon.vm08.vm08.stdout: block_align: 0 2026-03-10T07:52:35.605 INFO:journalctl@ceph.mon.vm08.vm08.stdout: max_auto_readahead_size: 262144 2026-03-10T07:52:35.605 INFO:journalctl@ceph.mon.vm08.vm08.stdout: prepopulate_block_cache: 0 2026-03-10T07:52:35.605 INFO:journalctl@ceph.mon.vm08.vm08.stdout: initial_auto_readahead_size: 8192 2026-03-10T07:52:35.605 
INFO:journalctl@ceph.mon.vm08.vm08.stdout: num_file_reads_for_auto_readahead: 2 2026-03-10T07:52:35.605 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:35 vm08.local ceph-mon[107898]: rocksdb: Options.write_buffer_size: 33554432 2026-03-10T07:52:35.605 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:35 vm08.local ceph-mon[107898]: rocksdb: Options.max_write_buffer_number: 2 2026-03-10T07:52:35.605 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:35 vm08.local ceph-mon[107898]: rocksdb: Options.compression: NoCompression 2026-03-10T07:52:35.605 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:35 vm08.local ceph-mon[107898]: rocksdb: Options.bottommost_compression: Disabled 2026-03-10T07:52:35.605 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:35 vm08.local ceph-mon[107898]: rocksdb: Options.prefix_extractor: nullptr 2026-03-10T07:52:35.605 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:35 vm08.local ceph-mon[107898]: rocksdb: Options.memtable_insert_with_hint_prefix_extractor: nullptr 2026-03-10T07:52:35.605 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:35 vm08.local ceph-mon[107898]: rocksdb: Options.num_levels: 7 2026-03-10T07:52:35.605 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:35 vm08.local ceph-mon[107898]: rocksdb: Options.min_write_buffer_number_to_merge: 1 2026-03-10T07:52:35.605 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:35 vm08.local ceph-mon[107898]: rocksdb: Options.max_write_buffer_number_to_maintain: 0 2026-03-10T07:52:35.605 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:35 vm08.local ceph-mon[107898]: rocksdb: Options.max_write_buffer_size_to_maintain: 0 2026-03-10T07:52:35.605 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:35 vm08.local ceph-mon[107898]: rocksdb: Options.bottommost_compression_opts.window_bits: -14 2026-03-10T07:52:35.605 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:35 vm08.local ceph-mon[107898]: rocksdb: 
Options.bottommost_compression_opts.level: 32767 2026-03-10T07:52:35.605 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:35 vm08.local ceph-mon[107898]: rocksdb: Options.bottommost_compression_opts.strategy: 0 2026-03-10T07:52:35.605 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:35 vm08.local ceph-mon[107898]: rocksdb: Options.bottommost_compression_opts.max_dict_bytes: 0 2026-03-10T07:52:35.605 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:35 vm08.local ceph-mon[107898]: rocksdb: Options.bottommost_compression_opts.zstd_max_train_bytes: 0 2026-03-10T07:52:35.605 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:35 vm08.local ceph-mon[107898]: rocksdb: Options.bottommost_compression_opts.parallel_threads: 1 2026-03-10T07:52:35.605 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:35 vm08.local ceph-mon[107898]: rocksdb: Options.bottommost_compression_opts.enabled: false 2026-03-10T07:52:35.605 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:35 vm08.local ceph-mon[107898]: rocksdb: Options.bottommost_compression_opts.max_dict_buffer_bytes: 0 2026-03-10T07:52:35.605 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:35 vm08.local ceph-mon[107898]: rocksdb: Options.bottommost_compression_opts.use_zstd_dict_trainer: true 2026-03-10T07:52:35.605 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:35 vm08.local ceph-mon[107898]: rocksdb: Options.compression_opts.window_bits: -14 2026-03-10T07:52:35.605 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:35 vm08.local ceph-mon[107898]: rocksdb: Options.compression_opts.level: 32767 2026-03-10T07:52:35.605 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:35 vm08.local ceph-mon[107898]: rocksdb: Options.compression_opts.strategy: 0 2026-03-10T07:52:35.605 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:35 vm08.local ceph-mon[107898]: rocksdb: Options.compression_opts.max_dict_bytes: 0 2026-03-10T07:52:35.605 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 
07:52:35 vm08.local ceph-mon[107898]: rocksdb: Options.compression_opts.zstd_max_train_bytes: 0 2026-03-10T07:52:35.605 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:35 vm08.local ceph-mon[107898]: rocksdb: Options.compression_opts.use_zstd_dict_trainer: true 2026-03-10T07:52:35.605 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:35 vm08.local ceph-mon[107898]: rocksdb: Options.compression_opts.parallel_threads: 1 2026-03-10T07:52:35.605 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:35 vm08.local ceph-mon[107898]: rocksdb: Options.compression_opts.enabled: false 2026-03-10T07:52:35.606 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:35 vm08.local ceph-mon[107898]: rocksdb: Options.compression_opts.max_dict_buffer_bytes: 0 2026-03-10T07:52:35.606 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:35 vm08.local ceph-mon[107898]: rocksdb: Options.level0_file_num_compaction_trigger: 4 2026-03-10T07:52:35.606 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:35 vm08.local ceph-mon[107898]: rocksdb: Options.level0_slowdown_writes_trigger: 20 2026-03-10T07:52:35.606 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:35 vm08.local ceph-mon[107898]: rocksdb: Options.level0_stop_writes_trigger: 36 2026-03-10T07:52:35.606 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:35 vm08.local ceph-mon[107898]: rocksdb: Options.target_file_size_base: 67108864 2026-03-10T07:52:35.606 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:35 vm08.local ceph-mon[107898]: rocksdb: Options.target_file_size_multiplier: 1 2026-03-10T07:52:35.606 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:35 vm08.local ceph-mon[107898]: rocksdb: Options.max_bytes_for_level_base: 268435456 2026-03-10T07:52:35.606 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:35 vm08.local ceph-mon[107898]: rocksdb: Options.level_compaction_dynamic_level_bytes: 1 2026-03-10T07:52:35.606 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:35 vm08.local 
ceph-mon[107898]: rocksdb: Options.max_bytes_for_level_multiplier: 10.000000 2026-03-10T07:52:35.606 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:35 vm08.local ceph-mon[107898]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1 2026-03-10T07:52:35.606 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:35 vm08.local ceph-mon[107898]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1 2026-03-10T07:52:35.606 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:35 vm08.local ceph-mon[107898]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1 2026-03-10T07:52:35.606 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:35 vm08.local ceph-mon[107898]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1 2026-03-10T07:52:35.606 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:35 vm08.local ceph-mon[107898]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1 2026-03-10T07:52:35.606 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:35 vm08.local ceph-mon[107898]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1 2026-03-10T07:52:35.606 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:35 vm08.local ceph-mon[107898]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1 2026-03-10T07:52:35.606 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:35 vm08.local ceph-mon[107898]: rocksdb: Options.max_sequential_skip_in_iterations: 8 2026-03-10T07:52:35.606 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:35 vm08.local ceph-mon[107898]: rocksdb: Options.max_compaction_bytes: 1677721600 2026-03-10T07:52:35.606 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:35 vm08.local ceph-mon[107898]: rocksdb: Options.ignore_max_compaction_bytes_for_input: true 2026-03-10T07:52:35.606 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:35 vm08.local ceph-mon[107898]: rocksdb: Options.arena_block_size: 1048576 2026-03-10T07:52:35.606 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 
07:52:35 vm08.local ceph-mon[107898]: rocksdb: Options.soft_pending_compaction_bytes_limit: 68719476736 2026-03-10T07:52:35.606 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:35 vm08.local ceph-mon[107898]: rocksdb: Options.hard_pending_compaction_bytes_limit: 274877906944 2026-03-10T07:52:35.606 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:35 vm08.local ceph-mon[107898]: rocksdb: Options.disable_auto_compactions: 0 2026-03-10T07:52:35.606 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:35 vm08.local ceph-mon[107898]: rocksdb: Options.compaction_style: kCompactionStyleLevel 2026-03-10T07:52:35.606 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:35 vm08.local ceph-mon[107898]: rocksdb: Options.compaction_pri: kMinOverlappingRatio 2026-03-10T07:52:35.606 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:35 vm08.local ceph-mon[107898]: rocksdb: Options.compaction_options_universal.size_ratio: 1 2026-03-10T07:52:35.606 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:35 vm08.local ceph-mon[107898]: rocksdb: Options.compaction_options_universal.min_merge_width: 2 2026-03-10T07:52:35.606 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:35 vm08.local ceph-mon[107898]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295 2026-03-10T07:52:35.606 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:35 vm08.local ceph-mon[107898]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200 2026-03-10T07:52:35.606 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:35 vm08.local ceph-mon[107898]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1 2026-03-10T07:52:35.606 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:35 vm08.local ceph-mon[107898]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize 2026-03-10T07:52:35.606 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:35 vm08.local ceph-mon[107898]: 
rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824 2026-03-10T07:52:35.606 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:35 vm08.local ceph-mon[107898]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0 2026-03-10T07:52:35.606 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:35 vm08.local ceph-mon[107898]: rocksdb: Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0); 2026-03-10T07:52:35.606 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:35 vm08.local ceph-mon[107898]: rocksdb: Options.inplace_update_support: 0 2026-03-10T07:52:35.606 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:35 vm08.local ceph-mon[107898]: rocksdb: Options.inplace_update_num_locks: 10000 2026-03-10T07:52:35.606 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:35 vm08.local ceph-mon[107898]: rocksdb: Options.memtable_prefix_bloom_size_ratio: 0.000000 2026-03-10T07:52:35.606 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:35 vm08.local ceph-mon[107898]: rocksdb: Options.memtable_whole_key_filtering: 0 2026-03-10T07:52:35.606 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:35 vm08.local ceph-mon[107898]: rocksdb: Options.memtable_huge_page_size: 0 2026-03-10T07:52:35.606 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:35 vm08.local ceph-mon[107898]: rocksdb: Options.bloom_locality: 0 2026-03-10T07:52:35.606 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:35 vm08.local ceph-mon[107898]: rocksdb: Options.max_successive_merges: 0 2026-03-10T07:52:35.606 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:35 vm08.local ceph-mon[107898]: rocksdb: Options.optimize_filters_for_hits: 0 2026-03-10T07:52:35.606 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:35 vm08.local ceph-mon[107898]: rocksdb: Options.paranoid_file_checks: 0 2026-03-10T07:52:35.606 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:35 
vm08.local ceph-mon[107898]: rocksdb: Options.force_consistency_checks: 1 2026-03-10T07:52:35.607 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:35 vm08.local ceph-mon[107898]: rocksdb: Options.report_bg_io_stats: 0 2026-03-10T07:52:35.607 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:35 vm08.local ceph-mon[107898]: rocksdb: Options.ttl: 2592000 2026-03-10T07:52:35.607 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:35 vm08.local ceph-mon[107898]: rocksdb: Options.periodic_compaction_seconds: 0 2026-03-10T07:52:35.607 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:35 vm08.local ceph-mon[107898]: rocksdb: Options.preclude_last_level_data_seconds: 0 2026-03-10T07:52:35.607 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:35 vm08.local ceph-mon[107898]: rocksdb: Options.preserve_internal_time_seconds: 0 2026-03-10T07:52:35.607 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:35 vm08.local ceph-mon[107898]: rocksdb: Options.enable_blob_files: false 2026-03-10T07:52:35.607 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:35 vm08.local ceph-mon[107898]: rocksdb: Options.min_blob_size: 0 2026-03-10T07:52:35.607 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:35 vm08.local ceph-mon[107898]: rocksdb: Options.blob_file_size: 268435456 2026-03-10T07:52:35.607 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:35 vm08.local ceph-mon[107898]: rocksdb: Options.blob_compression_type: NoCompression 2026-03-10T07:52:35.607 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:35 vm08.local ceph-mon[107898]: rocksdb: Options.enable_blob_garbage_collection: false 2026-03-10T07:52:35.607 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:35 vm08.local ceph-mon[107898]: rocksdb: Options.blob_garbage_collection_age_cutoff: 0.250000 2026-03-10T07:52:35.607 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:35 vm08.local ceph-mon[107898]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000 
2026-03-10T07:52:35.607 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:35 vm08.local ceph-mon[107898]: rocksdb: Options.blob_compaction_readahead_size: 0 2026-03-10T07:52:35.607 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:35 vm08.local ceph-mon[107898]: rocksdb: Options.blob_file_starting_level: 0 2026-03-10T07:52:35.607 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:35 vm08.local ceph-mon[107898]: rocksdb: Options.experimental_mempurge_threshold: 0.000000 2026-03-10T07:52:35.607 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:35 vm08.local ceph-mon[107898]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed. 2026-03-10T07:52:35.607 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:35 vm08.local ceph-mon[107898]: rocksdb: [db/version_set.cc:5566] Recovered from manifest file:/var/lib/ceph/mon/ceph-vm08/store.db/MANIFEST-000010 succeeded,manifest_file_number is 10, next_file_number is 23, last_sequence is 10097, log_number is 19,prev_log_number is 0,max_column_family is 0,min_log_number_to_keep is 19 2026-03-10T07:52:35.607 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:35 vm08.local ceph-mon[107898]: rocksdb: [db/version_set.cc:5581] Column family [default] (ID 0), log number is 19 2026-03-10T07:52:35.607 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:35 vm08.local ceph-mon[107898]: rocksdb: [db/db_impl/db_impl_open.cc:539] DB ID: d0e6f634-0d76-4fd8-b0e5-605ccc480124 2026-03-10T07:52:35.607 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:35 vm08.local ceph-mon[107898]: rocksdb: EVENT_LOG_v1 {"time_micros": 1773129155262808, "job": 1, "event": "recovery_started", "wal_files": [19]} 2026-03-10T07:52:35.607 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:35 vm08.local ceph-mon[107898]: rocksdb: [db/db_impl/db_impl_open.cc:1043] Recovering log #19 mode 2 2026-03-10T07:52:35.607 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 
07:52:35 vm08.local ceph-mon[107898]: rocksdb: EVENT_LOG_v1 {"time_micros": 1773129155277123, "cf_name": "default", "job": 1, "event": "table_file_creation", "file_number": 24, "file_size": 2096140, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 10102, "largest_seqno": 10605, "table_properties": {"data_size": 2092599, "index_size": 1860, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 645, "raw_key_size": 6377, "raw_average_key_size": 24, "raw_value_size": 2086627, "raw_average_value_size": 8150, "num_data_blocks": 85, "num_entries": 256, "num_filter_entries": 256, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1773129155, "oldest_key_time": 0, "file_creation_time": 0, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "d0e6f634-0d76-4fd8-b0e5-605ccc480124", "db_session_id": "Y7D1XZ2LWNGG506ORJPK", "orig_file_number": 24, "seqno_to_time_mapping": "N/A"}} 2026-03-10T07:52:35.607 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:35 vm08.local ceph-mon[107898]: rocksdb: EVENT_LOG_v1 {"time_micros": 1773129155277509, "job": 1, "event": "recovery_finished"} 2026-03-10T07:52:35.607 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:35 vm08.local ceph-mon[107898]: rocksdb: [db/version_set.cc:5047] Creating manifest 26 2026-03-10T07:52:35.607 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 
07:52:35 vm08.local ceph-mon[107898]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed. 2026-03-10T07:52:35.607 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:35 vm08.local ceph-mon[107898]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-vm08/store.db/000019.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000 2026-03-10T07:52:35.607 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:35 vm08.local ceph-mon[107898]: rocksdb: [db/db_impl/db_impl_open.cc:1987] SstFileManager instance 0x565543812e00 2026-03-10T07:52:35.607 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:35 vm08.local ceph-mon[107898]: rocksdb: DB pointer 0x565543822000 2026-03-10T07:52:35.607 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:35 vm08.local ceph-mon[107898]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS ------- 2026-03-10T07:52:35.607 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:35 vm08.local ceph-mon[107898]: rocksdb: [db/db_impl/db_impl.cc:1111] 2026-03-10T07:52:35.607 INFO:journalctl@ceph.mon.vm08.vm08.stdout: ** DB Stats ** 2026-03-10T07:52:35.607 INFO:journalctl@ceph.mon.vm08.vm08.stdout: Uptime(secs): 0.0 total, 0.0 interval 2026-03-10T07:52:35.607 INFO:journalctl@ceph.mon.vm08.vm08.stdout: Cumulative writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 GB, 0.00 MB/s 2026-03-10T07:52:35.607 INFO:journalctl@ceph.mon.vm08.vm08.stdout: Cumulative WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s 2026-03-10T07:52:35.607 INFO:journalctl@ceph.mon.vm08.vm08.stdout: Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent 2026-03-10T07:52:35.607 INFO:journalctl@ceph.mon.vm08.vm08.stdout: Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s 2026-03-10T07:52:35.607 INFO:journalctl@ceph.mon.vm08.vm08.stdout: 
Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s 2026-03-10T07:52:35.607 INFO:journalctl@ceph.mon.vm08.vm08.stdout: Interval stall: 00:00:0.000 H:M:S, 0.0 percent 2026-03-10T07:52:35.607 INFO:journalctl@ceph.mon.vm08.vm08.stdout: 2026-03-10T07:52:35.607 INFO:journalctl@ceph.mon.vm08.vm08.stdout: ** Compaction Stats [default] ** 2026-03-10T07:52:35.607 INFO:journalctl@ceph.mon.vm08.vm08.stdout: Level Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB) 2026-03-10T07:52:35.607 INFO:journalctl@ceph.mon.vm08.vm08.stdout: ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------ 2026-03-10T07:52:35.607 INFO:journalctl@ceph.mon.vm08.vm08.stdout: L0 1/0 2.00 MB 0.2 0.0 0.0 0.0 0.0 0.0 0.0 1.0 0.0 191.2 0.01 0.00 1 0.010 0 0 0.0 0.0 2026-03-10T07:52:35.607 INFO:journalctl@ceph.mon.vm08.vm08.stdout: L6 1/0 8.15 MB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.00 0.00 0 0.000 0 0 0.0 0.0 2026-03-10T07:52:35.607 INFO:journalctl@ceph.mon.vm08.vm08.stdout: Sum 2/0 10.14 MB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 1.0 0.0 191.2 0.01 0.00 1 0.010 0 0 0.0 0.0 2026-03-10T07:52:35.607 INFO:journalctl@ceph.mon.vm08.vm08.stdout: Int 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 1.0 0.0 191.2 0.01 0.00 1 0.010 0 0 0.0 0.0 2026-03-10T07:52:35.607 INFO:journalctl@ceph.mon.vm08.vm08.stdout: 2026-03-10T07:52:35.607 INFO:journalctl@ceph.mon.vm08.vm08.stdout: ** Compaction Stats [default] ** 2026-03-10T07:52:35.608 INFO:journalctl@ceph.mon.vm08.vm08.stdout: Priority Files Size Score Read(GB) Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB) 2026-03-10T07:52:35.608 INFO:journalctl@ceph.mon.vm08.vm08.stdout: 
--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------- 2026-03-10T07:52:35.608 INFO:journalctl@ceph.mon.vm08.vm08.stdout: User 0/0 0.00 KB 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 0.0 191.2 0.01 0.00 1 0.010 0 0 0.0 0.0 2026-03-10T07:52:35.608 INFO:journalctl@ceph.mon.vm08.vm08.stdout: 2026-03-10T07:52:35.608 INFO:journalctl@ceph.mon.vm08.vm08.stdout: Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0 2026-03-10T07:52:35.608 INFO:journalctl@ceph.mon.vm08.vm08.stdout: 2026-03-10T07:52:35.608 INFO:journalctl@ceph.mon.vm08.vm08.stdout: Uptime(secs): 0.0 total, 0.0 interval 2026-03-10T07:52:35.608 INFO:journalctl@ceph.mon.vm08.vm08.stdout: Flush(GB): cumulative 0.002, interval 0.002 2026-03-10T07:52:35.608 INFO:journalctl@ceph.mon.vm08.vm08.stdout: AddFile(GB): cumulative 0.000, interval 0.000 2026-03-10T07:52:35.608 INFO:journalctl@ceph.mon.vm08.vm08.stdout: AddFile(Total Files): cumulative 0, interval 0 2026-03-10T07:52:35.608 INFO:journalctl@ceph.mon.vm08.vm08.stdout: AddFile(L0 Files): cumulative 0, interval 0 2026-03-10T07:52:35.608 INFO:journalctl@ceph.mon.vm08.vm08.stdout: AddFile(Keys): cumulative 0, interval 0 2026-03-10T07:52:35.608 INFO:journalctl@ceph.mon.vm08.vm08.stdout: Cumulative compaction: 0.00 GB write, 100.92 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds 2026-03-10T07:52:35.608 INFO:journalctl@ceph.mon.vm08.vm08.stdout: Interval compaction: 0.00 GB write, 100.92 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds 2026-03-10T07:52:35.608 INFO:journalctl@ceph.mon.vm08.vm08.stdout: Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count 2026-03-10T07:52:35.608 
INFO:journalctl@ceph.mon.vm08.vm08.stdout: Block cache BinnedLRUCache@0x5655438109b0#2 capacity: 512.00 MB usage: 2.80 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 0 last_secs: 1.2e-05 secs_since: 0 2026-03-10T07:52:35.608 INFO:journalctl@ceph.mon.vm08.vm08.stdout: Block cache entry stats(count,size,portion): FilterBlock(1,0.72 KB,0.000137091%) IndexBlock(1,2.08 KB,0.000396371%) Misc(1,0.00 KB,0%) 2026-03-10T07:52:35.608 INFO:journalctl@ceph.mon.vm08.vm08.stdout: 2026-03-10T07:52:35.608 INFO:journalctl@ceph.mon.vm08.vm08.stdout: ** File Read Latency Histogram By Level [default] ** 2026-03-10T07:52:35.608 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:35 vm08.local ceph-mon[107898]: starting mon.vm08 rank 1 at public addrs [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] at bind addrs [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] mon_data /var/lib/ceph/mon/ceph-vm08 fsid 12e9780e-1c55-11f1-8896-79f7c2e9b508 2026-03-10T07:52:35.608 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:35 vm08.local ceph-mon[107898]: mon.vm08@-1(???) 
e2 preinit fsid 12e9780e-1c55-11f1-8896-79f7c2e9b508 2026-03-10T07:52:35.608 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:35 vm08.local ceph-mon[107898]: mon.vm08@-1(???).mds e12 new map 2026-03-10T07:52:35.608 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:35 vm08.local ceph-mon[107898]: mon.vm08@-1(???).mds e12 print_map 2026-03-10T07:52:35.608 INFO:journalctl@ceph.mon.vm08.vm08.stdout: e12 2026-03-10T07:52:35.608 INFO:journalctl@ceph.mon.vm08.vm08.stdout: btime 1970-01-01T00:00:00:000000+0000 2026-03-10T07:52:35.608 INFO:journalctl@ceph.mon.vm08.vm08.stdout: enable_multiple, ever_enabled_multiple: 1,1 2026-03-10T07:52:35.608 INFO:journalctl@ceph.mon.vm08.vm08.stdout: default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2} 2026-03-10T07:52:35.608 INFO:journalctl@ceph.mon.vm08.vm08.stdout: legacy client fscid: 1 2026-03-10T07:52:35.608 INFO:journalctl@ceph.mon.vm08.vm08.stdout: 2026-03-10T07:52:35.608 INFO:journalctl@ceph.mon.vm08.vm08.stdout: Filesystem 'cephfs' (1) 2026-03-10T07:52:35.608 INFO:journalctl@ceph.mon.vm08.vm08.stdout: fs_name cephfs 2026-03-10T07:52:35.608 INFO:journalctl@ceph.mon.vm08.vm08.stdout: epoch 12 2026-03-10T07:52:35.608 INFO:journalctl@ceph.mon.vm08.vm08.stdout: flags 12 joinable allow_snaps allow_multimds_snaps 2026-03-10T07:52:35.608 INFO:journalctl@ceph.mon.vm08.vm08.stdout: created 2026-03-10T07:48:24.309293+0000 2026-03-10T07:52:35.608 INFO:journalctl@ceph.mon.vm08.vm08.stdout: modified 2026-03-10T07:48:33.425187+0000 2026-03-10T07:52:35.608 INFO:journalctl@ceph.mon.vm08.vm08.stdout: tableserver 0 2026-03-10T07:52:35.608 INFO:journalctl@ceph.mon.vm08.vm08.stdout: root 0 2026-03-10T07:52:35.608 INFO:journalctl@ceph.mon.vm08.vm08.stdout: session_timeout 60 2026-03-10T07:52:35.608 
INFO:journalctl@ceph.mon.vm08.vm08.stdout: session_autoclose 300 2026-03-10T07:52:35.608 INFO:journalctl@ceph.mon.vm08.vm08.stdout: max_file_size 1099511627776 2026-03-10T07:52:35.608 INFO:journalctl@ceph.mon.vm08.vm08.stdout: max_xattr_size 65536 2026-03-10T07:52:35.608 INFO:journalctl@ceph.mon.vm08.vm08.stdout: required_client_features {} 2026-03-10T07:52:35.608 INFO:journalctl@ceph.mon.vm08.vm08.stdout: last_failure 0 2026-03-10T07:52:35.608 INFO:journalctl@ceph.mon.vm08.vm08.stdout: last_failure_osd_epoch 39 2026-03-10T07:52:35.608 INFO:journalctl@ceph.mon.vm08.vm08.stdout: compat compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2} 2026-03-10T07:52:35.608 INFO:journalctl@ceph.mon.vm08.vm08.stdout: max_mds 1 2026-03-10T07:52:35.608 INFO:journalctl@ceph.mon.vm08.vm08.stdout: in 0 2026-03-10T07:52:35.608 INFO:journalctl@ceph.mon.vm08.vm08.stdout: up {0=24297} 2026-03-10T07:52:35.608 INFO:journalctl@ceph.mon.vm08.vm08.stdout: failed 2026-03-10T07:52:35.608 INFO:journalctl@ceph.mon.vm08.vm08.stdout: damaged 2026-03-10T07:52:35.608 INFO:journalctl@ceph.mon.vm08.vm08.stdout: stopped 2026-03-10T07:52:35.608 INFO:journalctl@ceph.mon.vm08.vm08.stdout: data_pools [3] 2026-03-10T07:52:35.608 INFO:journalctl@ceph.mon.vm08.vm08.stdout: metadata_pool 2 2026-03-10T07:52:35.608 INFO:journalctl@ceph.mon.vm08.vm08.stdout: inline_data enabled 2026-03-10T07:52:35.608 INFO:journalctl@ceph.mon.vm08.vm08.stdout: balancer 2026-03-10T07:52:35.608 INFO:journalctl@ceph.mon.vm08.vm08.stdout: bal_rank_mask -1 2026-03-10T07:52:35.608 INFO:journalctl@ceph.mon.vm08.vm08.stdout: standby_count_wanted 1 2026-03-10T07:52:35.608 INFO:journalctl@ceph.mon.vm08.vm08.stdout: qdb_cluster leader: 0 members: 2026-03-10T07:52:35.608 INFO:journalctl@ceph.mon.vm08.vm08.stdout: 
[mds.cephfs.vm05.omfhnh{0:24297} state up:active seq 5 join_fscid=1 addr [v2:192.168.123.105:6826/723078808,v1:192.168.123.105:6827/723078808] compat {c=[1],r=[1],i=[7ff]}] 2026-03-10T07:52:35.608 INFO:journalctl@ceph.mon.vm08.vm08.stdout: 2026-03-10T07:52:35.609 INFO:journalctl@ceph.mon.vm08.vm08.stdout: 2026-03-10T07:52:35.609 INFO:journalctl@ceph.mon.vm08.vm08.stdout: Standby daemons: 2026-03-10T07:52:35.609 INFO:journalctl@ceph.mon.vm08.vm08.stdout: 2026-03-10T07:52:35.609 INFO:journalctl@ceph.mon.vm08.vm08.stdout: [mds.cephfs.vm05.pavqil{-1:14512} state up:standby seq 2 join_fscid=1 addr [v2:192.168.123.105:6828/427555544,v1:192.168.123.105:6829/427555544] compat {c=[1],r=[1],i=[7ff]}] 2026-03-10T07:52:35.609 INFO:journalctl@ceph.mon.vm08.vm08.stdout: [mds.cephfs.vm08.ybmbgd{-1:14524} state up:standby seq 1 join_fscid=1 addr [v2:192.168.123.108:6824/3815585988,v1:192.168.123.108:6825/3815585988] compat {c=[1],r=[1],i=[7ff]}] 2026-03-10T07:52:35.609 INFO:journalctl@ceph.mon.vm08.vm08.stdout: [mds.cephfs.vm08.dgsaon{-1:24313} state up:standby seq 2 join_fscid=1 addr [v2:192.168.123.108:6826/2963085185,v1:192.168.123.108:6827/2963085185] compat {c=[1],r=[1],i=[7ff]}] 2026-03-10T07:52:35.609 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:35 vm08.local ceph-mon[107898]: mon.vm08@-1(???).osd e43 crush map has features 3314933000852226048, adjusting msgr requires 2026-03-10T07:52:35.609 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:35 vm08.local ceph-mon[107898]: mon.vm08@-1(???).osd e43 crush map has features 288514051259236352, adjusting msgr requires 2026-03-10T07:52:35.609 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:35 vm08.local ceph-mon[107898]: mon.vm08@-1(???).osd e43 crush map has features 288514051259236352, adjusting msgr requires 2026-03-10T07:52:35.609 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:35 vm08.local ceph-mon[107898]: mon.vm08@-1(???).osd e43 crush map has features 288514051259236352, adjusting msgr 
requires 2026-03-10T07:52:35.609 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:35 vm08.local ceph-mon[107898]: mon.vm08@-1(???).paxosservice(auth 1..22) refresh upgraded, format 0 -> 3 2026-03-10T07:52:36.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:36 vm05.local ceph-mon[130117]: Updating vm08:/etc/ceph/ceph.client.admin.keyring 2026-03-10T07:52:36.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:36 vm05.local ceph-mon[130117]: Updating vm05:/etc/ceph/ceph.client.admin.keyring 2026-03-10T07:52:36.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:36 vm05.local ceph-mon[130117]: Updating vm08:/var/lib/ceph/12e9780e-1c55-11f1-8896-79f7c2e9b508/config/ceph.client.admin.keyring 2026-03-10T07:52:36.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:36 vm05.local ceph-mon[130117]: Updating vm05:/var/lib/ceph/12e9780e-1c55-11f1-8896-79f7c2e9b508/config/ceph.client.admin.keyring 2026-03-10T07:52:36.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:36 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:52:36.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:36 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "auth get", "entity": "mon."}]: dispatch 2026-03-10T07:52:36.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:36 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "config get", "who": "mon", "key": "public_network"}]: dispatch 2026-03-10T07:52:36.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:36 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T07:52:36.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:36 vm08.local ceph-mon[107898]: Updating 
vm08:/etc/ceph/ceph.client.admin.keyring 2026-03-10T07:52:36.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:36 vm08.local ceph-mon[107898]: Updating vm05:/etc/ceph/ceph.client.admin.keyring 2026-03-10T07:52:36.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:36 vm08.local ceph-mon[107898]: Updating vm08:/var/lib/ceph/12e9780e-1c55-11f1-8896-79f7c2e9b508/config/ceph.client.admin.keyring 2026-03-10T07:52:36.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:36 vm08.local ceph-mon[107898]: Updating vm05:/var/lib/ceph/12e9780e-1c55-11f1-8896-79f7c2e9b508/config/ceph.client.admin.keyring 2026-03-10T07:52:36.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:36 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:52:36.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:36 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "auth get", "entity": "mon."}]: dispatch 2026-03-10T07:52:36.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:36 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "config get", "who": "mon", "key": "public_network"}]: dispatch 2026-03-10T07:52:36.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:36 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T07:52:38.035 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:37 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "mon metadata", "id": "vm05"}]: dispatch 2026-03-10T07:52:38.035 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:37 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "mon metadata", 
"id": "vm08"}]: dispatch 2026-03-10T07:52:38.035 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:37 vm08.local ceph-mon[107898]: mon.vm05 calling monitor election 2026-03-10T07:52:38.035 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:37 vm08.local ceph-mon[107898]: mon.vm08 calling monitor election 2026-03-10T07:52:38.035 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:37 vm08.local ceph-mon[107898]: mon.vm05 is new leader, mons vm05,vm08 in quorum (ranks 0,1) 2026-03-10T07:52:38.035 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:37 vm08.local ceph-mon[107898]: monmap epoch 3 2026-03-10T07:52:38.035 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:37 vm08.local ceph-mon[107898]: fsid 12e9780e-1c55-11f1-8896-79f7c2e9b508 2026-03-10T07:52:38.035 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:37 vm08.local ceph-mon[107898]: last_changed 2026-03-10T07:52:36.896107+0000 2026-03-10T07:52:38.035 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:37 vm08.local ceph-mon[107898]: created 2026-03-10T07:45:39.881211+0000 2026-03-10T07:52:38.035 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:37 vm08.local ceph-mon[107898]: min_mon_release 19 (squid) 2026-03-10T07:52:38.035 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:37 vm08.local ceph-mon[107898]: election_strategy: 1 2026-03-10T07:52:38.035 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:37 vm08.local ceph-mon[107898]: 0: [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] mon.vm05 2026-03-10T07:52:38.035 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:37 vm08.local ceph-mon[107898]: 1: [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] mon.vm08 2026-03-10T07:52:38.035 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:37 vm08.local ceph-mon[107898]: fsmap cephfs:1 {0=cephfs.vm05.omfhnh=up:active} 3 up:standby 2026-03-10T07:52:38.035 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:37 vm08.local ceph-mon[107898]: osdmap e43: 6 
total, 6 up, 6 in 2026-03-10T07:52:38.035 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:37 vm08.local ceph-mon[107898]: mgrmap e37: vm05.blexke(active, since 10s), standbys: vm08.orfpog 2026-03-10T07:52:38.035 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:37 vm08.local ceph-mon[107898]: Health detail: HEALTH_WARN 1 filesystem with deprecated feature inline_data 2026-03-10T07:52:38.035 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:37 vm08.local ceph-mon[107898]: [WRN] FS_INLINE_DATA_DEPRECATED: 1 filesystem with deprecated feature inline_data 2026-03-10T07:52:38.035 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:37 vm08.local ceph-mon[107898]: fs cephfs has deprecated feature inline_data enabled. 2026-03-10T07:52:38.035 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:37 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:52:38.035 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:37 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:52:38.035 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:37 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T07:52:38.229 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:37 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "mon metadata", "id": "vm05"}]: dispatch 2026-03-10T07:52:38.230 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:37 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "mon metadata", "id": "vm08"}]: dispatch 2026-03-10T07:52:38.230 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:37 vm05.local ceph-mon[130117]: mon.vm05 calling monitor election 2026-03-10T07:52:38.230 
INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:37 vm05.local ceph-mon[130117]: mon.vm08 calling monitor election 2026-03-10T07:52:38.230 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:37 vm05.local ceph-mon[130117]: mon.vm05 is new leader, mons vm05,vm08 in quorum (ranks 0,1) 2026-03-10T07:52:38.230 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:37 vm05.local ceph-mon[130117]: monmap epoch 3 2026-03-10T07:52:38.230 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:37 vm05.local ceph-mon[130117]: fsid 12e9780e-1c55-11f1-8896-79f7c2e9b508 2026-03-10T07:52:38.230 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:37 vm05.local ceph-mon[130117]: last_changed 2026-03-10T07:52:36.896107+0000 2026-03-10T07:52:38.230 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:37 vm05.local ceph-mon[130117]: created 2026-03-10T07:45:39.881211+0000 2026-03-10T07:52:38.230 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:37 vm05.local ceph-mon[130117]: min_mon_release 19 (squid) 2026-03-10T07:52:38.230 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:37 vm05.local ceph-mon[130117]: election_strategy: 1 2026-03-10T07:52:38.230 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:37 vm05.local ceph-mon[130117]: 0: [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] mon.vm05 2026-03-10T07:52:38.230 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:37 vm05.local ceph-mon[130117]: 1: [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] mon.vm08 2026-03-10T07:52:38.230 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:37 vm05.local ceph-mon[130117]: fsmap cephfs:1 {0=cephfs.vm05.omfhnh=up:active} 3 up:standby 2026-03-10T07:52:38.230 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:37 vm05.local ceph-mon[130117]: osdmap e43: 6 total, 6 up, 6 in 2026-03-10T07:52:38.230 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:37 vm05.local ceph-mon[130117]: mgrmap e37: vm05.blexke(active, since 10s), standbys: vm08.orfpog 
2026-03-10T07:52:38.230 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:37 vm05.local ceph-mon[130117]: Health detail: HEALTH_WARN 1 filesystem with deprecated feature inline_data 2026-03-10T07:52:38.230 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:37 vm05.local ceph-mon[130117]: [WRN] FS_INLINE_DATA_DEPRECATED: 1 filesystem with deprecated feature inline_data 2026-03-10T07:52:38.230 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:37 vm05.local ceph-mon[130117]: fs cephfs has deprecated feature inline_data enabled. 2026-03-10T07:52:38.230 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:37 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:52:38.230 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:37 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:52:38.230 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:37 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T07:52:39.668 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:39 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:52:39.668 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:39 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:52:39.668 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:39 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:52:39.668 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:39 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:52:39.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:39 vm05.local ceph-mon[130117]: 
from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:52:39.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:39 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:52:39.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:39 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:52:39.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:39 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:52:40.604 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:40 vm05.local ceph-mon[130117]: pgmap v9: 65 pgs: 65 active+clean; 304 MiB data, 2.8 GiB used, 117 GiB / 120 GiB avail; 46 KiB/s rd, 1.5 MiB/s wr, 85 op/s 2026-03-10T07:52:40.604 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:40 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:52:40.604 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:40 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:52:40.605 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:40 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T07:52:40.605 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:40 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T07:52:40.605 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:40 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:52:40.605 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:40 vm05.local ceph-mon[130117]: 
from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "auth get", "entity": "mon."}]: dispatch 2026-03-10T07:52:40.605 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:40 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "config get", "who": "mon", "key": "public_network"}]: dispatch 2026-03-10T07:52:40.605 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:40 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T07:52:40.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:40 vm08.local ceph-mon[107898]: pgmap v9: 65 pgs: 65 active+clean; 304 MiB data, 2.8 GiB used, 117 GiB / 120 GiB avail; 46 KiB/s rd, 1.5 MiB/s wr, 85 op/s 2026-03-10T07:52:40.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:40 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:52:40.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:40 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:52:40.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:40 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T07:52:40.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:40 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T07:52:40.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:40 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:52:40.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:40 vm08.local 
ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "auth get", "entity": "mon."}]: dispatch 2026-03-10T07:52:40.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:40 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "config get", "who": "mon", "key": "public_network"}]: dispatch 2026-03-10T07:52:40.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:40 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T07:52:42.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:41 vm05.local ceph-mon[130117]: Reconfiguring mon.vm05 (monmap changed)... 2026-03-10T07:52:42.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:41 vm05.local ceph-mon[130117]: Reconfiguring daemon mon.vm05 on vm05 2026-03-10T07:52:42.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:41 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:52:42.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:41 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:52:42.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:41 vm05.local ceph-mon[130117]: Reconfiguring mgr.vm05.blexke (monmap changed)... 
2026-03-10T07:52:42.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:41 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "auth get-or-create", "entity": "mgr.vm05.blexke", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]: dispatch 2026-03-10T07:52:42.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:41 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "mgr services"}]: dispatch 2026-03-10T07:52:42.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:41 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T07:52:42.158 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:41 vm05.local ceph-mon[130117]: Reconfiguring daemon mgr.vm05.blexke on vm05 2026-03-10T07:52:42.158 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:41 vm05.local ceph-mon[130117]: pgmap v10: 65 pgs: 65 active+clean; 304 MiB data, 2.8 GiB used, 117 GiB / 120 GiB avail; 41 KiB/s rd, 1.3 MiB/s wr, 76 op/s 2026-03-10T07:52:42.158 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:41 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:52:42.158 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:41 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:52:42.158 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:41 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "auth get-or-create", "entity": "client.ceph-exporter.vm05", "caps": ["mon", "profile ceph-exporter", "mon", "allow r", "mgr", "allow r", "osd", "allow r"]}]: dispatch 2026-03-10T07:52:42.158 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:41 
vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "auth caps", "entity": "client.ceph-exporter.vm05", "caps": ["mon", "profile ceph-exporter", "mon", "allow r", "mgr", "allow r", "osd", "allow r"]}]: dispatch 2026-03-10T07:52:42.158 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:41 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd='[{"prefix": "auth caps", "entity": "client.ceph-exporter.vm05", "caps": ["mon", "profile ceph-exporter", "mon", "allow r", "mgr", "allow r", "osd", "allow r"]}]': finished 2026-03-10T07:52:42.158 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:41 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "auth get", "entity": "client.ceph-exporter.vm05"}]: dispatch 2026-03-10T07:52:42.158 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:41 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T07:52:42.158 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:41 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:52:42.158 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:41 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:52:42.158 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:41 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "auth get-or-create", "entity": "client.crash.vm05", "caps": ["mon", "profile crash", "mgr", "profile crash"]}]: dispatch 2026-03-10T07:52:42.158 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:41 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' 
entity='mgr.vm05.blexke' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T07:52:42.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:41 vm08.local ceph-mon[107898]: Reconfiguring mon.vm05 (monmap changed)... 2026-03-10T07:52:42.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:41 vm08.local ceph-mon[107898]: Reconfiguring daemon mon.vm05 on vm05 2026-03-10T07:52:42.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:41 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:52:42.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:41 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:52:42.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:41 vm08.local ceph-mon[107898]: Reconfiguring mgr.vm05.blexke (monmap changed)... 2026-03-10T07:52:42.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:41 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "auth get-or-create", "entity": "mgr.vm05.blexke", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]: dispatch 2026-03-10T07:52:42.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:41 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "mgr services"}]: dispatch 2026-03-10T07:52:42.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:41 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T07:52:42.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:41 vm08.local ceph-mon[107898]: Reconfiguring daemon mgr.vm05.blexke on vm05 2026-03-10T07:52:42.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:41 vm08.local ceph-mon[107898]: pgmap v10: 65 pgs: 65 
active+clean; 304 MiB data, 2.8 GiB used, 117 GiB / 120 GiB avail; 41 KiB/s rd, 1.3 MiB/s wr, 76 op/s 2026-03-10T07:52:42.169 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:41 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:52:42.169 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:41 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:52:42.169 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:41 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "auth get-or-create", "entity": "client.ceph-exporter.vm05", "caps": ["mon", "profile ceph-exporter", "mon", "allow r", "mgr", "allow r", "osd", "allow r"]}]: dispatch 2026-03-10T07:52:42.169 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:41 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "auth caps", "entity": "client.ceph-exporter.vm05", "caps": ["mon", "profile ceph-exporter", "mon", "allow r", "mgr", "allow r", "osd", "allow r"]}]: dispatch 2026-03-10T07:52:42.169 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:41 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd='[{"prefix": "auth caps", "entity": "client.ceph-exporter.vm05", "caps": ["mon", "profile ceph-exporter", "mon", "allow r", "mgr", "allow r", "osd", "allow r"]}]': finished 2026-03-10T07:52:42.169 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:41 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "auth get", "entity": "client.ceph-exporter.vm05"}]: dispatch 2026-03-10T07:52:42.169 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:41 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 
cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T07:52:42.169 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:41 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:52:42.169 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:41 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:52:42.169 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:41 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "auth get-or-create", "entity": "client.crash.vm05", "caps": ["mon", "profile crash", "mgr", "profile crash"]}]: dispatch 2026-03-10T07:52:42.169 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:41 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T07:52:43.112 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:42 vm05.local ceph-mon[130117]: Reconfiguring ceph-exporter.vm05 (monmap changed)... 2026-03-10T07:52:43.112 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:42 vm05.local ceph-mon[130117]: Unable to update caps for client.ceph-exporter.vm05 2026-03-10T07:52:43.112 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:42 vm05.local ceph-mon[130117]: Reconfiguring daemon ceph-exporter.vm05 on vm05 2026-03-10T07:52:43.112 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:42 vm05.local ceph-mon[130117]: Reconfiguring crash.vm05 (monmap changed)... 
2026-03-10T07:52:43.112 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:42 vm05.local ceph-mon[130117]: Reconfiguring daemon crash.vm05 on vm05 2026-03-10T07:52:43.112 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:42 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T07:52:43.112 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:42 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:52:43.112 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:42 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:52:43.112 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:42 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "auth get", "entity": "osd.0"}]: dispatch 2026-03-10T07:52:43.112 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:42 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T07:52:43.112 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:42 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:52:43.112 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:42 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:52:43.112 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:42 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "auth get", "entity": "osd.1"}]: dispatch 2026-03-10T07:52:43.112 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:42 vm05.local ceph-mon[130117]: from='mgr.34104 
192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T07:52:43.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:42 vm08.local ceph-mon[107898]: Reconfiguring ceph-exporter.vm05 (monmap changed)... 2026-03-10T07:52:43.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:42 vm08.local ceph-mon[107898]: Unable to update caps for client.ceph-exporter.vm05 2026-03-10T07:52:43.169 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:42 vm08.local ceph-mon[107898]: Reconfiguring daemon ceph-exporter.vm05 on vm05 2026-03-10T07:52:43.169 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:42 vm08.local ceph-mon[107898]: Reconfiguring crash.vm05 (monmap changed)... 2026-03-10T07:52:43.169 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:42 vm08.local ceph-mon[107898]: Reconfiguring daemon crash.vm05 on vm05 2026-03-10T07:52:43.169 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:42 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T07:52:43.169 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:42 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:52:43.169 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:42 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:52:43.169 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:42 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "auth get", "entity": "osd.0"}]: dispatch 2026-03-10T07:52:43.169 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:42 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 
2026-03-10T07:52:43.169 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:42 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:52:43.169 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:42 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:52:43.169 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:42 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "auth get", "entity": "osd.1"}]: dispatch 2026-03-10T07:52:43.169 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:42 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T07:52:43.891 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:43 vm05.local ceph-mon[130117]: Reconfiguring osd.0 (monmap changed)... 2026-03-10T07:52:43.891 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:43 vm05.local ceph-mon[130117]: Reconfiguring daemon osd.0 on vm05 2026-03-10T07:52:43.891 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:43 vm05.local ceph-mon[130117]: Reconfiguring osd.1 (monmap changed)... 
2026-03-10T07:52:43.891 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:43 vm05.local ceph-mon[130117]: Reconfiguring daemon osd.1 on vm05 2026-03-10T07:52:43.891 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:43 vm05.local ceph-mon[130117]: pgmap v11: 65 pgs: 65 active+clean; 304 MiB data, 2.8 GiB used, 117 GiB / 120 GiB avail; 41 KiB/s rd, 1.3 MiB/s wr, 76 op/s 2026-03-10T07:52:43.891 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:43 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:52:43.891 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:43 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:52:43.891 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:43 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "auth get", "entity": "osd.2"}]: dispatch 2026-03-10T07:52:43.891 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:43 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T07:52:43.891 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:43 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:52:43.891 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:43 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:52:43.891 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:43 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "auth get-or-create", "entity": "mds.cephfs.vm05.omfhnh", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]: dispatch 2026-03-10T07:52:43.891 
INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:43 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T07:52:43.891 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:43 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:52:43.891 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:43 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:52:43.891 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:43 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "auth get-or-create", "entity": "mds.cephfs.vm05.pavqil", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]: dispatch 2026-03-10T07:52:43.891 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:43 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T07:52:44.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:43 vm08.local ceph-mon[107898]: Reconfiguring osd.0 (monmap changed)... 2026-03-10T07:52:44.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:43 vm08.local ceph-mon[107898]: Reconfiguring daemon osd.0 on vm05 2026-03-10T07:52:44.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:43 vm08.local ceph-mon[107898]: Reconfiguring osd.1 (monmap changed)... 
2026-03-10T07:52:44.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:43 vm08.local ceph-mon[107898]: Reconfiguring daemon osd.1 on vm05 2026-03-10T07:52:44.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:43 vm08.local ceph-mon[107898]: pgmap v11: 65 pgs: 65 active+clean; 304 MiB data, 2.8 GiB used, 117 GiB / 120 GiB avail; 41 KiB/s rd, 1.3 MiB/s wr, 76 op/s 2026-03-10T07:52:44.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:43 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:52:44.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:43 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:52:44.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:43 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "auth get", "entity": "osd.2"}]: dispatch 2026-03-10T07:52:44.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:43 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T07:52:44.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:43 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:52:44.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:43 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:52:44.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:43 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "auth get-or-create", "entity": "mds.cephfs.vm05.omfhnh", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]: dispatch 2026-03-10T07:52:44.169 
INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:43 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T07:52:44.169 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:43 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:52:44.169 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:43 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:52:44.169 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:43 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "auth get-or-create", "entity": "mds.cephfs.vm05.pavqil", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]: dispatch 2026-03-10T07:52:44.169 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:43 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T07:52:45.156 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:44 vm08.local ceph-mon[107898]: Reconfiguring osd.2 (monmap changed)... 2026-03-10T07:52:45.156 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:44 vm08.local ceph-mon[107898]: Reconfiguring daemon osd.2 on vm05 2026-03-10T07:52:45.156 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:44 vm08.local ceph-mon[107898]: Reconfiguring mds.cephfs.vm05.omfhnh (monmap changed)... 2026-03-10T07:52:45.156 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:44 vm08.local ceph-mon[107898]: Reconfiguring daemon mds.cephfs.vm05.omfhnh on vm05 2026-03-10T07:52:45.156 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:44 vm08.local ceph-mon[107898]: Reconfiguring mds.cephfs.vm05.pavqil (monmap changed)... 
2026-03-10T07:52:45.156 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:44 vm08.local ceph-mon[107898]: Reconfiguring daemon mds.cephfs.vm05.pavqil on vm05 2026-03-10T07:52:45.156 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:44 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:52:45.156 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:44 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:52:45.156 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:44 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "auth get-or-create", "entity": "client.ceph-exporter.vm08", "caps": ["mon", "profile ceph-exporter", "mon", "allow r", "mgr", "allow r", "osd", "allow r"]}]: dispatch 2026-03-10T07:52:45.156 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:44 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "auth caps", "entity": "client.ceph-exporter.vm08", "caps": ["mon", "profile ceph-exporter", "mon", "allow r", "mgr", "allow r", "osd", "allow r"]}]: dispatch 2026-03-10T07:52:45.156 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:44 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd='[{"prefix": "auth caps", "entity": "client.ceph-exporter.vm08", "caps": ["mon", "profile ceph-exporter", "mon", "allow r", "mgr", "allow r", "osd", "allow r"]}]': finished 2026-03-10T07:52:45.156 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:44 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "auth get", "entity": "client.ceph-exporter.vm08"}]: dispatch 2026-03-10T07:52:45.156 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:44 vm08.local ceph-mon[107898]: from='mgr.34104 
192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T07:52:45.156 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:44 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:52:45.156 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:44 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:52:45.156 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:44 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "auth get-or-create", "entity": "client.crash.vm08", "caps": ["mon", "profile crash", "mgr", "profile crash"]}]: dispatch 2026-03-10T07:52:45.156 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:44 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T07:52:45.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:44 vm05.local ceph-mon[130117]: Reconfiguring osd.2 (monmap changed)... 2026-03-10T07:52:45.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:44 vm05.local ceph-mon[130117]: Reconfiguring daemon osd.2 on vm05 2026-03-10T07:52:45.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:44 vm05.local ceph-mon[130117]: Reconfiguring mds.cephfs.vm05.omfhnh (monmap changed)... 2026-03-10T07:52:45.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:44 vm05.local ceph-mon[130117]: Reconfiguring daemon mds.cephfs.vm05.omfhnh on vm05 2026-03-10T07:52:45.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:44 vm05.local ceph-mon[130117]: Reconfiguring mds.cephfs.vm05.pavqil (monmap changed)... 
2026-03-10T07:52:45.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:44 vm05.local ceph-mon[130117]: Reconfiguring daemon mds.cephfs.vm05.pavqil on vm05 2026-03-10T07:52:45.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:44 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:52:45.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:44 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:52:45.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:44 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "auth get-or-create", "entity": "client.ceph-exporter.vm08", "caps": ["mon", "profile ceph-exporter", "mon", "allow r", "mgr", "allow r", "osd", "allow r"]}]: dispatch 2026-03-10T07:52:45.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:44 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "auth caps", "entity": "client.ceph-exporter.vm08", "caps": ["mon", "profile ceph-exporter", "mon", "allow r", "mgr", "allow r", "osd", "allow r"]}]: dispatch 2026-03-10T07:52:45.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:44 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd='[{"prefix": "auth caps", "entity": "client.ceph-exporter.vm08", "caps": ["mon", "profile ceph-exporter", "mon", "allow r", "mgr", "allow r", "osd", "allow r"]}]': finished 2026-03-10T07:52:45.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:44 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "auth get", "entity": "client.ceph-exporter.vm08"}]: dispatch 2026-03-10T07:52:45.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:44 vm05.local ceph-mon[130117]: from='mgr.34104 
192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T07:52:45.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:44 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:52:45.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:44 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:52:45.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:44 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "auth get-or-create", "entity": "client.crash.vm08", "caps": ["mon", "profile crash", "mgr", "profile crash"]}]: dispatch 2026-03-10T07:52:45.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:44 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T07:52:46.093 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:45 vm08.local ceph-mon[107898]: Reconfiguring ceph-exporter.vm08 (monmap changed)... 2026-03-10T07:52:46.093 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:45 vm08.local ceph-mon[107898]: Unable to update caps for client.ceph-exporter.vm08 2026-03-10T07:52:46.093 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:45 vm08.local ceph-mon[107898]: Reconfiguring daemon ceph-exporter.vm08 on vm08 2026-03-10T07:52:46.093 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:45 vm08.local ceph-mon[107898]: Reconfiguring crash.vm08 (monmap changed)... 
2026-03-10T07:52:46.093 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:45 vm08.local ceph-mon[107898]: Reconfiguring daemon crash.vm08 on vm08 2026-03-10T07:52:46.093 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:45 vm08.local ceph-mon[107898]: pgmap v12: 65 pgs: 65 active+clean; 305 MiB data, 2.8 GiB used, 117 GiB / 120 GiB avail; 55 KiB/s rd, 1.9 MiB/s wr, 116 op/s 2026-03-10T07:52:46.093 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:45 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:52:46.094 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:45 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:52:46.094 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:45 vm08.local ceph-mon[107898]: Reconfiguring mgr.vm08.orfpog (monmap changed)... 2026-03-10T07:52:46.094 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:45 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "auth get-or-create", "entity": "mgr.vm08.orfpog", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]: dispatch 2026-03-10T07:52:46.094 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:45 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "mgr services"}]: dispatch 2026-03-10T07:52:46.094 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:45 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T07:52:46.094 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:45 vm08.local ceph-mon[107898]: Reconfiguring daemon mgr.vm08.orfpog on vm08 2026-03-10T07:52:46.094 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:45 vm08.local ceph-mon[107898]: from='mgr.34104 
192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:52:46.094 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:45 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:52:46.094 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:45 vm08.local ceph-mon[107898]: Reconfiguring mon.vm08 (monmap changed)... 2026-03-10T07:52:46.094 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:45 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "auth get", "entity": "mon."}]: dispatch 2026-03-10T07:52:46.094 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:45 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "config get", "who": "mon", "key": "public_network"}]: dispatch 2026-03-10T07:52:46.094 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:45 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T07:52:46.094 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:45 vm08.local ceph-mon[107898]: Reconfiguring daemon mon.vm08 on vm08 2026-03-10T07:52:46.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:45 vm05.local ceph-mon[130117]: Reconfiguring ceph-exporter.vm08 (monmap changed)... 2026-03-10T07:52:46.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:45 vm05.local ceph-mon[130117]: Unable to update caps for client.ceph-exporter.vm08 2026-03-10T07:52:46.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:45 vm05.local ceph-mon[130117]: Reconfiguring daemon ceph-exporter.vm08 on vm08 2026-03-10T07:52:46.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:45 vm05.local ceph-mon[130117]: Reconfiguring crash.vm08 (monmap changed)... 
2026-03-10T07:52:46.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:45 vm05.local ceph-mon[130117]: Reconfiguring daemon crash.vm08 on vm08 2026-03-10T07:52:46.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:45 vm05.local ceph-mon[130117]: pgmap v12: 65 pgs: 65 active+clean; 305 MiB data, 2.8 GiB used, 117 GiB / 120 GiB avail; 55 KiB/s rd, 1.9 MiB/s wr, 116 op/s 2026-03-10T07:52:46.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:45 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:52:46.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:45 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:52:46.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:45 vm05.local ceph-mon[130117]: Reconfiguring mgr.vm08.orfpog (monmap changed)... 2026-03-10T07:52:46.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:45 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "auth get-or-create", "entity": "mgr.vm08.orfpog", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]: dispatch 2026-03-10T07:52:46.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:45 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "mgr services"}]: dispatch 2026-03-10T07:52:46.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:45 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T07:52:46.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:45 vm05.local ceph-mon[130117]: Reconfiguring daemon mgr.vm08.orfpog on vm08 2026-03-10T07:52:46.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:45 vm05.local ceph-mon[130117]: from='mgr.34104 
192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:52:46.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:45 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:52:46.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:45 vm05.local ceph-mon[130117]: Reconfiguring mon.vm08 (monmap changed)... 2026-03-10T07:52:46.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:45 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "auth get", "entity": "mon."}]: dispatch 2026-03-10T07:52:46.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:45 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "config get", "who": "mon", "key": "public_network"}]: dispatch 2026-03-10T07:52:46.158 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:45 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T07:52:46.158 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:45 vm05.local ceph-mon[130117]: Reconfiguring daemon mon.vm08 on vm08 2026-03-10T07:52:46.846 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:46 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:52:46.846 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:46 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:52:46.846 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:46 vm08.local ceph-mon[107898]: Reconfiguring osd.3 (monmap changed)... 
2026-03-10T07:52:46.846 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:46 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "auth get", "entity": "osd.3"}]: dispatch 2026-03-10T07:52:46.846 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:46 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T07:52:46.846 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:46 vm08.local ceph-mon[107898]: Reconfiguring daemon osd.3 on vm08 2026-03-10T07:52:46.846 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:46 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:52:46.846 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:46 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:52:46.846 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:46 vm08.local ceph-mon[107898]: Reconfiguring osd.4 (monmap changed)... 
2026-03-10T07:52:46.846 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:46 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "auth get", "entity": "osd.4"}]: dispatch 2026-03-10T07:52:46.846 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:46 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T07:52:46.846 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:46 vm08.local ceph-mon[107898]: Reconfiguring daemon osd.4 on vm08 2026-03-10T07:52:46.846 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:46 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:52:46.846 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:46 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:52:46.846 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:46 vm08.local ceph-mon[107898]: Reconfiguring osd.5 (monmap changed)... 
2026-03-10T07:52:46.846 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:46 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "auth get", "entity": "osd.5"}]: dispatch 2026-03-10T07:52:46.846 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:46 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T07:52:46.846 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:46 vm08.local ceph-mon[107898]: Reconfiguring daemon osd.5 on vm08 2026-03-10T07:52:46.846 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:46 vm08.local ceph-mon[107898]: pgmap v13: 65 pgs: 65 active+clean; 305 MiB data, 2.8 GiB used, 117 GiB / 120 GiB avail; 31 KiB/s rd, 1.5 MiB/s wr, 83 op/s 2026-03-10T07:52:47.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:46 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:52:47.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:46 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:52:47.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:46 vm05.local ceph-mon[130117]: Reconfiguring osd.3 (monmap changed)... 
2026-03-10T07:52:47.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:46 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "auth get", "entity": "osd.3"}]: dispatch 2026-03-10T07:52:47.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:46 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T07:52:47.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:46 vm05.local ceph-mon[130117]: Reconfiguring daemon osd.3 on vm08 2026-03-10T07:52:47.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:46 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:52:47.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:46 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:52:47.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:46 vm05.local ceph-mon[130117]: Reconfiguring osd.4 (monmap changed)... 
2026-03-10T07:52:47.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:46 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "auth get", "entity": "osd.4"}]: dispatch
2026-03-10T07:52:47.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:46 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-10T07:52:47.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:46 vm05.local ceph-mon[130117]: Reconfiguring daemon osd.4 on vm08
2026-03-10T07:52:47.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:46 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke'
2026-03-10T07:52:47.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:46 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke'
2026-03-10T07:52:47.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:46 vm05.local ceph-mon[130117]: Reconfiguring osd.5 (monmap changed)...
2026-03-10T07:52:47.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:46 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "auth get", "entity": "osd.5"}]: dispatch
2026-03-10T07:52:47.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:46 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-10T07:52:47.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:46 vm05.local ceph-mon[130117]: Reconfiguring daemon osd.5 on vm08
2026-03-10T07:52:47.158 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:46 vm05.local ceph-mon[130117]: pgmap v13: 65 pgs: 65 active+clean; 305 MiB data, 2.8 GiB used, 117 GiB / 120 GiB avail; 31 KiB/s rd, 1.5 MiB/s wr, 83 op/s
2026-03-10T07:52:47.666 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:47.665+0000 7f864a0cb700 1 -- 192.168.123.105:0/1714634857 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f86441089a0 msgr2=0x7f864410be70 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T07:52:47.666 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:47.665+0000 7f864a0cb700 1 --2- 192.168.123.105:0/1714634857 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f86441089a0 0x7f864410be70 secure :-1 s=READY pgs=8 cs=0 l=1 rev1=1 crypto rx=0x7f863c00d3f0 tx=0x7f863c00d700 comp rx=0 tx=0).stop
2026-03-10T07:52:47.666 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:47.665+0000 7f864a0cb700 1 -- 192.168.123.105:0/1714634857 shutdown_connections
2026-03-10T07:52:47.666 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:47.665+0000 7f864a0cb700 1 --2- 192.168.123.105:0/1714634857 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f86441089a0 0x7f864410be70 unknown :-1 s=CLOSED pgs=8 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T07:52:47.666 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:47.665+0000 7f864a0cb700 1 --2- 192.168.123.105:0/1714634857 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f8644107ff0 0x7f86441083d0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T07:52:47.666 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:47.665+0000 7f864a0cb700 1 -- 192.168.123.105:0/1714634857 >> 192.168.123.105:0/1714634857 conn(0x7f864406ce20 msgr2=0x7f864406d230 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-10T07:52:47.667 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:47.665+0000 7f864a0cb700 1 -- 192.168.123.105:0/1714634857 shutdown_connections
2026-03-10T07:52:47.668 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:47.665+0000 7f864a0cb700 1 -- 192.168.123.105:0/1714634857 wait complete.
2026-03-10T07:52:47.669 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:47.666+0000 7f864a0cb700 1 Processor -- start
2026-03-10T07:52:47.669 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:47.666+0000 7f864a0cb700 1 -- start start
2026-03-10T07:52:47.669 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:47.666+0000 7f864a0cb700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f8644107ff0 0x7f864407cee0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T07:52:47.669 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:47.666+0000 7f864a0cb700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f864407d420 0x7f864407d8a0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T07:52:47.669 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:47.666+0000 7f864a0cb700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f8644081a60 con 0x7f864407d420
2026-03-10T07:52:47.669 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:47.666+0000 7f864a0cb700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f8644081bd0 con 0x7f8644107ff0
2026-03-10T07:52:47.669 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:47.666+0000 7f8642ffd700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f864407d420 0x7f864407d8a0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-10T07:52:47.669 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:47.668+0000 7f86437fe700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f8644107ff0 0x7f864407cee0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-10T07:52:47.669 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:47.668+0000 7f86437fe700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f8644107ff0 0x7f864407cee0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.108:3300/0 says I am v2:192.168.123.105:40856/0 (socket says 192.168.123.105:40856)
2026-03-10T07:52:47.669 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:47.668+0000 7f86437fe700 1 -- 192.168.123.105:0/2306526752 learned_addr learned my addr 192.168.123.105:0/2306526752 (peer_addr_for_me v2:192.168.123.105:0/0)
2026-03-10T07:52:47.669 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:47.668+0000 7f86437fe700 1 -- 192.168.123.105:0/2306526752 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f864407d420 msgr2=0x7f864407d8a0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T07:52:47.669 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:47.669+0000 7f86437fe700 1 --2- 192.168.123.105:0/2306526752 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f864407d420 0x7f864407d8a0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T07:52:47.670 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:47.669+0000 7f86437fe700 1 -- 192.168.123.105:0/2306526752 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f863c007ed0 con 0x7f8644107ff0
2026-03-10T07:52:47.670 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:47.669+0000 7f86437fe700 1 --2- 192.168.123.105:0/2306526752 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f8644107ff0 0x7f864407cee0 secure :-1 s=READY pgs=9 cs=0 l=1 rev1=1 crypto rx=0x7f863400bfb0 tx=0x7f863400cda0 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-10T07:52:47.670 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:47.669+0000 7f8640ff9700 1 -- 192.168.123.105:0/2306526752 <== mon.1 v2:192.168.123.108:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f86340085c0 con 0x7f8644107ff0
2026-03-10T07:52:47.671 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:47.670+0000 7f864a0cb700 1 -- 192.168.123.105:0/2306526752 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f8644081eb0 con 0x7f8644107ff0
2026-03-10T07:52:47.671 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:47.670+0000 7f864a0cb700 1 -- 192.168.123.105:0/2306526752 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f8644082400 con 0x7f8644107ff0
2026-03-10T07:52:47.671 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:47.670+0000 7f8640ff9700 1 -- 192.168.123.105:0/2306526752 <== mon.1 v2:192.168.123.108:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7f8634008c00 con 0x7f8644107ff0
2026-03-10T07:52:47.672 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:47.670+0000 7f8640ff9700 1 -- 192.168.123.105:0/2306526752 <== mon.1 v2:192.168.123.108:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f863401dd80 con 0x7f8644107ff0
2026-03-10T07:52:47.680 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:47.679+0000 7f8640ff9700 1 -- 192.168.123.105:0/2306526752 <== mon.1 v2:192.168.123.108:3300/0 4 ==== mgrmap(e 37) v1 ==== 100217+0+0 (secure 0 0 0) 0x7f8634027430 con 0x7f8644107ff0
2026-03-10T07:52:47.681 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:47.680+0000 7f8640ff9700 1 --2- 192.168.123.105:0/2306526752 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f862c077b10 0x7f862c079fd0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T07:52:47.682 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:47.682+0000 7f8640ff9700 1 -- 192.168.123.105:0/2306526752 <== mon.1 v2:192.168.123.108:3300/0 5 ==== osd_map(43..43 src has 1..43) v4 ==== 6136+0+0 (secure 0 0 0) 0x7f86340a2eb0 con 0x7f8644107ff0
2026-03-10T07:52:47.683 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:47.682+0000 7f864a0cb700 1 -- 192.168.123.105:0/2306526752 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f8630005320 con 0x7f8644107ff0
2026-03-10T07:52:47.684 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:47.683+0000 7f8642ffd700 1 --2- 192.168.123.105:0/2306526752 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f862c077b10 0x7f862c079fd0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-10T07:52:47.684 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:47.684+0000 7f8642ffd700 1 --2- 192.168.123.105:0/2306526752 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f862c077b10 0x7f862c079fd0 secure :-1 s=READY pgs=20 cs=0 l=1 rev1=1 crypto rx=0x7f864407e810 tx=0x7f863c00db00 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-10T07:52:47.686 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:47.686+0000 7f8640ff9700 1 -- 192.168.123.105:0/2306526752 <== mon.1 v2:192.168.123.108:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f863406b540 con 0x7f8644107ff0
2026-03-10T07:52:47.838 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:47.837+0000 7f864a0cb700 1 -- 192.168.123.105:0/2306526752 --> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7f8630000bf0 con 0x7f862c077b10
2026-03-10T07:52:47.839 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:47.839+0000 7f8640ff9700 1 -- 192.168.123.105:0/2306526752 <== mgr.34104 v2:192.168.123.105:6800/627686298 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+383 (secure 0 0 0) 0x7f8630000bf0 con 0x7f862c077b10
2026-03-10T07:52:47.841 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:47.841+0000 7f864a0cb700 1 -- 192.168.123.105:0/2306526752 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f862c077b10 msgr2=0x7f862c079fd0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T07:52:47.842 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:47.841+0000 7f864a0cb700 1 --2- 192.168.123.105:0/2306526752 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f862c077b10 0x7f862c079fd0 secure :-1 s=READY pgs=20 cs=0 l=1 rev1=1 crypto rx=0x7f864407e810 tx=0x7f863c00db00 comp rx=0 tx=0).stop
2026-03-10T07:52:47.842 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:47.841+0000 7f864a0cb700 1 -- 192.168.123.105:0/2306526752 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f8644107ff0 msgr2=0x7f864407cee0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T07:52:47.842 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:47.841+0000 7f864a0cb700 1 --2- 192.168.123.105:0/2306526752 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f8644107ff0 0x7f864407cee0 secure :-1 s=READY pgs=9 cs=0 l=1 rev1=1 crypto rx=0x7f863400bfb0 tx=0x7f863400cda0 comp rx=0 tx=0).stop
2026-03-10T07:52:47.842 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:47.841+0000 7f864a0cb700 1 -- 192.168.123.105:0/2306526752 shutdown_connections
2026-03-10T07:52:47.842 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:47.841+0000 7f864a0cb700 1 --2- 192.168.123.105:0/2306526752 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f8644107ff0 0x7f864407cee0 unknown :-1 s=CLOSED pgs=9 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T07:52:47.842 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:47.841+0000 7f864a0cb700 1 --2- 192.168.123.105:0/2306526752 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f862c077b10 0x7f862c079fd0 unknown :-1 s=CLOSED pgs=20 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T07:52:47.842 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:47.841+0000 7f864a0cb700 1 --2- 192.168.123.105:0/2306526752 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f864407d420 0x7f864407d8a0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T07:52:47.842 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:47.841+0000 7f864a0cb700 1 -- 192.168.123.105:0/2306526752 >> 192.168.123.105:0/2306526752 conn(0x7f864406ce20 msgr2=0x7f8644070410 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-10T07:52:47.843 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:47.842+0000 7f864a0cb700 1 -- 192.168.123.105:0/2306526752 shutdown_connections
2026-03-10T07:52:47.843 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:47.842+0000 7f864a0cb700 1 -- 192.168.123.105:0/2306526752 wait complete.
2026-03-10T07:52:47.853 INFO:teuthology.orchestra.run.vm05.stdout:true
2026-03-10T07:52:47.936 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:47.935+0000 7fb1baec7700 1 -- 192.168.123.105:0/4166515849 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb1b4107ff0 msgr2=0x7fb1b41083d0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T07:52:47.937 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:47.935+0000 7fb1baec7700 1 --2- 192.168.123.105:0/4166515849 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb1b4107ff0 0x7fb1b41083d0 secure :-1 s=READY pgs=34 cs=0 l=1 rev1=1 crypto rx=0x7fb1a4008790 tx=0x7fb1a400ae50 comp rx=0 tx=0).stop
2026-03-10T07:52:47.937 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:47.936+0000 7fb1baec7700 1 -- 192.168.123.105:0/4166515849 shutdown_connections
2026-03-10T07:52:47.937 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:47.936+0000 7fb1baec7700 1 --2- 192.168.123.105:0/4166515849 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fb1b41089a0 0x7fb1b410be70 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T07:52:47.937 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:47.936+0000 7fb1baec7700 1 --2- 192.168.123.105:0/4166515849 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb1b4107ff0 0x7fb1b41083d0 unknown :-1 s=CLOSED pgs=34 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T07:52:47.937 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:47.936+0000 7fb1baec7700 1 -- 192.168.123.105:0/4166515849 >> 192.168.123.105:0/4166515849 conn(0x7fb1b406ce20 msgr2=0x7fb1b406d230 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-10T07:52:47.937 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:47.936+0000 7fb1baec7700 1 -- 192.168.123.105:0/4166515849 shutdown_connections
2026-03-10T07:52:47.937 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:47.936+0000 7fb1baec7700 1 -- 192.168.123.105:0/4166515849 wait complete.
2026-03-10T07:52:47.937 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:47.937+0000 7fb1baec7700 1 Processor -- start
2026-03-10T07:52:47.938 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:47.937+0000 7fb1baec7700 1 -- start start
2026-03-10T07:52:47.938 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:47.937+0000 7fb1baec7700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb1b41089a0 0x7fb1b407cea0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T07:52:47.938 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:47.937+0000 7fb1baec7700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fb1b407d3e0 0x7fb1b407d860 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T07:52:47.938 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:47.937+0000 7fb1baec7700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fb1b4081a20 con 0x7fb1b41089a0
2026-03-10T07:52:47.938 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:47.937+0000 7fb1baec7700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fb1b4081b90 con 0x7fb1b407d3e0
2026-03-10T07:52:47.938 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:47.937+0000 7fb1b3fff700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fb1b407d3e0 0x7fb1b407d860 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-10T07:52:47.938 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:47.937+0000 7fb1b3fff700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fb1b407d3e0 0x7fb1b407d860 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.108:3300/0 says I am v2:192.168.123.105:40872/0 (socket says 192.168.123.105:40872)
2026-03-10T07:52:47.938 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:47.937+0000 7fb1b3fff700 1 -- 192.168.123.105:0/3059460385 learned_addr learned my addr 192.168.123.105:0/3059460385 (peer_addr_for_me v2:192.168.123.105:0/0)
2026-03-10T07:52:47.938 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:47.937+0000 7fb1b3fff700 1 -- 192.168.123.105:0/3059460385 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb1b41089a0 msgr2=0x7fb1b407cea0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T07:52:47.938 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:47.937+0000 7fb1b3fff700 1 --2- 192.168.123.105:0/3059460385 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb1b41089a0 0x7fb1b407cea0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T07:52:47.939 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:47.937+0000 7fb1b3fff700 1 -- 192.168.123.105:0/3059460385 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fb1a4008440 con 0x7fb1b407d3e0
2026-03-10T07:52:47.939 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:47.938+0000 7fb1b3fff700 1 --2- 192.168.123.105:0/3059460385 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fb1b407d3e0 0x7fb1b407d860 secure :-1 s=READY pgs=10 cs=0 l=1 rev1=1 crypto rx=0x7fb1ac009fc0 tx=0x7fb1ac007750 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-10T07:52:47.939 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:47.939+0000 7fb1b1ffb700 1 -- 192.168.123.105:0/3059460385 <== mon.1 v2:192.168.123.108:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fb1ac010040 con 0x7fb1b407d3e0
2026-03-10T07:52:47.940 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:47.939+0000 7fb1baec7700 1 -- 192.168.123.105:0/3059460385 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fb1b4081e70 con 0x7fb1b407d3e0
2026-03-10T07:52:47.940 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:47.939+0000 7fb1baec7700 1 -- 192.168.123.105:0/3059460385 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fb1b40823c0 con 0x7fb1b407d3e0
2026-03-10T07:52:47.940 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:47.940+0000 7fb1b1ffb700 1 -- 192.168.123.105:0/3059460385 <== mon.1 v2:192.168.123.108:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7fb1ac0093f0 con 0x7fb1b407d3e0
2026-03-10T07:52:47.940 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:47.940+0000 7fb1b1ffb700 1 -- 192.168.123.105:0/3059460385 <== mon.1 v2:192.168.123.108:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fb1ac016990 con 0x7fb1b407d3e0
2026-03-10T07:52:47.940 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:47.940+0000 7fb1baec7700 1 -- 192.168.123.105:0/3059460385 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fb1a0005320 con 0x7fb1b407d3e0
2026-03-10T07:52:47.945 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:47.944+0000 7fb1b1ffb700 1 -- 192.168.123.105:0/3059460385 <== mon.1 v2:192.168.123.108:3300/0 4 ==== mgrmap(e 37) v1 ==== 100217+0+0 (secure 0 0 0) 0x7fb1ac016af0 con 0x7fb1b407d3e0
2026-03-10T07:52:47.945 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:47.945+0000 7fb1b1ffb700 1 --2- 192.168.123.105:0/3059460385 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7fb19c077b00 0x7fb19c079fc0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T07:52:47.945 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:47.945+0000 7fb1b1ffb700 1 -- 192.168.123.105:0/3059460385 <== mon.1 v2:192.168.123.108:3300/0 5 ==== osd_map(43..43 src has 1..43) v4 ==== 6136+0+0 (secure 0 0 0) 0x7fb1ac099d80 con 0x7fb1b407d3e0
2026-03-10T07:52:47.946 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:47.946+0000 7fb1b8c63700 1 --2- 192.168.123.105:0/3059460385 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7fb19c077b00 0x7fb19c079fc0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-10T07:52:47.946 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:47.946+0000 7fb1b8c63700 1 --2- 192.168.123.105:0/3059460385 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7fb19c077b00 0x7fb19c079fc0 secure :-1 s=READY pgs=21 cs=0 l=1 rev1=1 crypto rx=0x7fb1a400b3a0 tx=0x7fb1a400b5b0 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-10T07:52:47.958 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:47.954+0000 7fb1b1ffb700 1 -- 192.168.123.105:0/3059460385 <== mon.1 v2:192.168.123.108:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7fb1ac0624c0 con 0x7fb1b407d3e0
2026-03-10T07:52:48.128 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:48.125+0000 7fb1baec7700 1 -- 192.168.123.105:0/3059460385 --> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7fb1a0000bf0 con 0x7fb19c077b00
2026-03-10T07:52:48.130 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:48.130+0000 7fb1b1ffb700 1 -- 192.168.123.105:0/3059460385 <== mgr.34104 v2:192.168.123.105:6800/627686298 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+383 (secure 0 0 0) 0x7fb1a0000bf0 con 0x7fb19c077b00
2026-03-10T07:52:48.132 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:48.132+0000 7fb1baec7700 1 -- 192.168.123.105:0/3059460385 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7fb19c077b00 msgr2=0x7fb19c079fc0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T07:52:48.132 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:48.132+0000 7fb1baec7700 1 --2- 192.168.123.105:0/3059460385 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7fb19c077b00 0x7fb19c079fc0 secure :-1 s=READY pgs=21 cs=0 l=1 rev1=1 crypto rx=0x7fb1a400b3a0 tx=0x7fb1a400b5b0 comp rx=0 tx=0).stop
2026-03-10T07:52:48.132 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:48.132+0000 7fb1baec7700 1 -- 192.168.123.105:0/3059460385 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fb1b407d3e0 msgr2=0x7fb1b407d860 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T07:52:48.132 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:48.132+0000 7fb1baec7700 1 --2- 192.168.123.105:0/3059460385 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fb1b407d3e0 0x7fb1b407d860 secure :-1 s=READY pgs=10 cs=0 l=1 rev1=1 crypto rx=0x7fb1ac009fc0 tx=0x7fb1ac007750 comp rx=0 tx=0).stop
2026-03-10T07:52:48.133 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:48.132+0000 7fb1baec7700 1 -- 192.168.123.105:0/3059460385 shutdown_connections
2026-03-10T07:52:48.134 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:48.132+0000 7fb1baec7700 1 --2- 192.168.123.105:0/3059460385 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7fb19c077b00 0x7fb19c079fc0 unknown :-1 s=CLOSED pgs=21 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T07:52:48.134 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:48.134+0000 7fb1baec7700 1 --2- 192.168.123.105:0/3059460385 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb1b41089a0 0x7fb1b407cea0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T07:52:48.134 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:48.134+0000 7fb1baec7700 1 --2- 192.168.123.105:0/3059460385 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fb1b407d3e0 0x7fb1b407d860 unknown :-1 s=CLOSED pgs=10 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T07:52:48.134 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:48.134+0000 7fb1baec7700 1 -- 192.168.123.105:0/3059460385 >> 192.168.123.105:0/3059460385 conn(0x7fb1b406ce20 msgr2=0x7fb1b4071590 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-10T07:52:48.134 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:48.134+0000 7fb1baec7700 1 -- 192.168.123.105:0/3059460385 shutdown_connections
2026-03-10T07:52:48.134 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:48.134+0000 7fb1baec7700 1 -- 192.168.123.105:0/3059460385 wait complete.
2026-03-10T07:52:48.248 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:48.247+0000 7f770d2ab700 1 -- 192.168.123.105:0/259660931 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f7708107d90 msgr2=0x7f77081081f0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T07:52:48.248 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:48.247+0000 7f770d2ab700 1 --2- 192.168.123.105:0/259660931 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f7708107d90 0x7f77081081f0 secure :-1 s=READY pgs=11 cs=0 l=1 rev1=1 crypto rx=0x7f770000d3f0 tx=0x7f770000d700 comp rx=0 tx=0).stop
2026-03-10T07:52:48.248 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:48.248+0000 7f770d2ab700 1 -- 192.168.123.105:0/259660931 shutdown_connections
2026-03-10T07:52:48.248 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:48.248+0000 7f770d2ab700 1 --2- 192.168.123.105:0/259660931 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f7708107d90 0x7f77081081f0 unknown :-1 s=CLOSED pgs=11 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T07:52:48.248 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:48.248+0000 7f770d2ab700 1 --2- 192.168.123.105:0/259660931 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f770810d310 0x7f770810d6f0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T07:52:48.248 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:48.248+0000 7f770d2ab700 1 -- 192.168.123.105:0/259660931 >> 192.168.123.105:0/259660931 conn(0x7f770806ce20 msgr2=0x7f770806d230 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-10T07:52:48.248 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:48.248+0000 7f770d2ab700 1 -- 192.168.123.105:0/259660931 shutdown_connections
2026-03-10T07:52:48.249 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:48.248+0000 7f770d2ab700 1 -- 192.168.123.105:0/259660931 wait complete.
2026-03-10T07:52:48.249 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:48.248+0000 7f770d2ab700 1 Processor -- start
2026-03-10T07:52:48.249 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:48.248+0000 7f770d2ab700 1 -- start start
2026-03-10T07:52:48.249 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:48.248+0000 7f770d2ab700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f770810d310 0x7f77081330e0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T07:52:48.249 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:48.248+0000 7f770d2ab700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f7708133620 0x7f7708133aa0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T07:52:48.249 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:48.248+0000 7f770d2ab700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f770807ef70 con 0x7f7708133620
2026-03-10T07:52:48.249 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:48.248+0000 7f770d2ab700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f770807f0e0 con 0x7f770810d310
2026-03-10T07:52:48.250 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:48.249+0000 7f77077fe700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f7708133620 0x7f7708133aa0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-10T07:52:48.250 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:48.249+0000 7f77077fe700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f7708133620 0x7f7708133aa0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.105:57638/0 (socket says 192.168.123.105:57638)
2026-03-10T07:52:48.250 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:48.249+0000 7f77077fe700 1 -- 192.168.123.105:0/448621588 learned_addr learned my addr 192.168.123.105:0/448621588 (peer_addr_for_me v2:192.168.123.105:0/0)
2026-03-10T07:52:48.250 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:48.249+0000 7f7707fff700 1 --2- 192.168.123.105:0/448621588 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f770810d310 0x7f77081330e0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-10T07:52:48.250 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:48.250+0000 7f77077fe700 1 -- 192.168.123.105:0/448621588 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f770810d310 msgr2=0x7f77081330e0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T07:52:48.250 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:48.250+0000 7f77077fe700 1 --2- 192.168.123.105:0/448621588 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f770810d310 0x7f77081330e0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T07:52:48.250 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:48.250+0000 7f77077fe700 1 -- 192.168.123.105:0/448621588 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f7700007ed0 con 0x7f7708133620
2026-03-10T07:52:48.250 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:48.250+0000 7f77077fe700 1 --2- 192.168.123.105:0/448621588 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f7708133620 0x7f7708133aa0 secure :-1 s=READY pgs=35 cs=0 l=1 rev1=1 crypto rx=0x7f7700003c60 tx=0x7f7700004b40 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-10T07:52:48.265 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:48.263+0000 7f77057fa700 1 -- 192.168.123.105:0/448621588 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f770001c070 con 0x7f7708133620 2026-03-10T07:52:48.265 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:48.263+0000 7f770d2ab700 1 -- 192.168.123.105:0/448621588 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f770807f310 con 0x7f7708133620 2026-03-10T07:52:48.266 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:48.264+0000 7f770d2ab700 1 -- 192.168.123.105:0/448621588 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f770807f800 con 0x7f7708133620 2026-03-10T07:52:48.266 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:48.264+0000 7f77057fa700 1 -- 192.168.123.105:0/448621588 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7f770000deb0 con 0x7f7708133620 2026-03-10T07:52:48.266 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:48.264+0000 7f77057fa700 1 -- 192.168.123.105:0/448621588 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f7700017c20 con 0x7f7708133620 2026-03-10T07:52:48.266 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:48.264+0000 7f770d2ab700 1 -- 192.168.123.105:0/448621588 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f770804f2e0 con 0x7f7708133620 2026-03-10T07:52:48.268 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:48.267+0000 7f77057fa700 1 -- 192.168.123.105:0/448621588 <== mon.0 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 37) v1 ==== 100217+0+0 (secure 0 0 0) 0x7f770000f660 con 0x7f7708133620 2026-03-10T07:52:48.268 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:48.268+0000 
7f77057fa700 1 --2- 192.168.123.105:0/448621588 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f76f0077a40 0x7f76f0079f00 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:52:48.268 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:48.268+0000 7f7707fff700 1 --2- 192.168.123.105:0/448621588 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f76f0077a40 0x7f76f0079f00 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:52:48.269 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:48.268+0000 7f77057fa700 1 -- 192.168.123.105:0/448621588 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(43..43 src has 1..43) v4 ==== 6136+0+0 (secure 0 0 0) 0x7f7700013070 con 0x7f7708133620 2026-03-10T07:52:48.269 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:48.269+0000 7f7707fff700 1 --2- 192.168.123.105:0/448621588 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f76f0077a40 0x7f76f0079f00 secure :-1 s=READY pgs=22 cs=0 l=1 rev1=1 crypto rx=0x7f76f8009c80 tx=0x7f76f8009400 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:52:48.271 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:48.271+0000 7f77057fa700 1 -- 192.168.123.105:0/448621588 <== mon.0 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f7700064490 con 0x7f7708133620 2026-03-10T07:52:48.434 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:48.433+0000 7f770d2ab700 1 -- 192.168.123.105:0/448621588 --> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] -- mgr_command(tid 0: {"prefix": "orch ps", "target": ["mon-mgr", ""]}) v1 -- 0x7f7708134790 con 0x7f76f0077a40 2026-03-10T07:52:48.443 
INFO:teuthology.orchestra.run.vm05.stdout:NAME HOST PORTS STATUS REFRESHED AGE MEM USE MEM LIM VERSION IMAGE ID CONTAINER ID 2026-03-10T07:52:48.443 INFO:teuthology.orchestra.run.vm05.stdout:alertmanager.vm05 vm05 *:9093,9094 running (86s) 17s ago 6m 16.9M - 0.25.0 c8568f914cd2 ac15d5f35994 2026-03-10T07:52:48.443 INFO:teuthology.orchestra.run.vm05.stdout:ceph-exporter.vm05 vm05 running (6m) 17s ago 6m 8590k - 18.2.1 5be31c24972a 26c4db858175 2026-03-10T07:52:48.443 INFO:teuthology.orchestra.run.vm05.stdout:ceph-exporter.vm08 vm08 running (6m) 9s ago 6m 10.4M - 18.2.1 5be31c24972a 209e2398a09c 2026-03-10T07:52:48.443 INFO:teuthology.orchestra.run.vm05.stdout:crash.vm05 vm05 running (6m) 17s ago 6m 7419k - 18.2.1 5be31c24972a d3d7b92c8ac3 2026-03-10T07:52:48.443 INFO:teuthology.orchestra.run.vm05.stdout:crash.vm08 vm08 running (5m) 9s ago 5m 7415k - 18.2.1 5be31c24972a 96136e0195f7 2026-03-10T07:52:48.443 INFO:teuthology.orchestra.run.vm05.stdout:grafana.vm05 vm05 *:3000 running (66s) 17s ago 6m 76.1M - 10.4.0 c8b91775d855 6acb529ad951 2026-03-10T07:52:48.443 INFO:teuthology.orchestra.run.vm05.stdout:mds.cephfs.vm05.omfhnh vm05 running (4m) 17s ago 4m 267M - 18.2.1 5be31c24972a e23de179e09c 2026-03-10T07:52:48.443 INFO:teuthology.orchestra.run.vm05.stdout:mds.cephfs.vm05.pavqil vm05 running (4m) 17s ago 4m 16.2M - 18.2.1 5be31c24972a 5b9e5afa214c 2026-03-10T07:52:48.443 INFO:teuthology.orchestra.run.vm05.stdout:mds.cephfs.vm08.dgsaon vm08 running (4m) 9s ago 4m 17.6M - 18.2.1 5be31c24972a 1696aee522b5 2026-03-10T07:52:48.443 INFO:teuthology.orchestra.run.vm05.stdout:mds.cephfs.vm08.ybmbgd vm08 running (4m) 9s ago 4m 15.9M - 18.2.1 5be31c24972a 30b0e51cd2ed 2026-03-10T07:52:48.443 INFO:teuthology.orchestra.run.vm05.stdout:mgr.vm05.blexke vm05 *:8443,9283,8765 running (2m) 17s ago 7m 560M - 19.2.3-678-ge911bdeb 654f31e6858e 5467b6a3bd38 2026-03-10T07:52:48.443 INFO:teuthology.orchestra.run.vm05.stdout:mgr.vm08.orfpog vm08 *:8443,9283,8765 running (2m) 9s ago 5m 533M - 
19.2.3-678-ge911bdeb 654f31e6858e 7a15d7811d5b 2026-03-10T07:52:48.443 INFO:teuthology.orchestra.run.vm05.stdout:mon.vm05 vm05 running (29s) 17s ago 7m 44.4M 2048M 19.2.3-678-ge911bdeb 654f31e6858e f02f076bb820 2026-03-10T07:52:48.443 INFO:teuthology.orchestra.run.vm05.stdout:mon.vm08 vm08 running (13s) 9s ago 5m 35.6M 2048M 19.2.3-678-ge911bdeb 654f31e6858e 73d9a504f360 2026-03-10T07:52:48.443 INFO:teuthology.orchestra.run.vm05.stdout:node-exporter.vm05 vm05 *:9100 running (2m) 17s ago 6m 10.2M - 1.7.0 72c9c2088986 7cd0b23b4118 2026-03-10T07:52:48.443 INFO:teuthology.orchestra.run.vm05.stdout:node-exporter.vm08 vm08 *:9100 running (115s) 9s ago 5m 9437k - 1.7.0 72c9c2088986 3dd4d91d5881 2026-03-10T07:52:48.443 INFO:teuthology.orchestra.run.vm05.stdout:osd.0 vm05 running (5m) 17s ago 5m 298M 4096M 18.2.1 5be31c24972a 9b7c5ea48cea 2026-03-10T07:52:48.443 INFO:teuthology.orchestra.run.vm05.stdout:osd.1 vm05 running (5m) 17s ago 5m 295M 4096M 18.2.1 5be31c24972a 88e0b65b2c93 2026-03-10T07:52:48.443 INFO:teuthology.orchestra.run.vm05.stdout:osd.2 vm05 running (5m) 17s ago 5m 247M 4096M 18.2.1 5be31c24972a b8d3cbeb49f1 2026-03-10T07:52:48.443 INFO:teuthology.orchestra.run.vm05.stdout:osd.3 vm08 running (5m) 9s ago 5m 388M 4096M 18.2.1 5be31c24972a 0a62c54a86c0 2026-03-10T07:52:48.443 INFO:teuthology.orchestra.run.vm05.stdout:osd.4 vm08 running (4m) 9s ago 4m 324M 4096M 18.2.1 5be31c24972a bd748b691ccd 2026-03-10T07:52:48.443 INFO:teuthology.orchestra.run.vm05.stdout:osd.5 vm08 running (4m) 9s ago 4m 277M 4096M 18.2.1 5be31c24972a 9f08820ae98b 2026-03-10T07:52:48.443 INFO:teuthology.orchestra.run.vm05.stdout:prometheus.vm05 vm05 *:9095 running (96s) 17s ago 6m 48.8M - 2.51.0 1d3b7f56885b c59a6be07563 2026-03-10T07:52:48.443 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:48.440+0000 7f77057fa700 1 -- 192.168.123.105:0/448621588 <== mgr.34104 v2:192.168.123.105:6800/627686298 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+3576 (secure 0 0 0) 0x7f7708134790 con 
0x7f76f0077a40 2026-03-10T07:52:48.443 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:48.442+0000 7f76eeffd700 1 -- 192.168.123.105:0/448621588 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f76f0077a40 msgr2=0x7f76f0079f00 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:52:48.443 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:48.442+0000 7f76eeffd700 1 --2- 192.168.123.105:0/448621588 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f76f0077a40 0x7f76f0079f00 secure :-1 s=READY pgs=22 cs=0 l=1 rev1=1 crypto rx=0x7f76f8009c80 tx=0x7f76f8009400 comp rx=0 tx=0).stop 2026-03-10T07:52:48.443 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:48.443+0000 7f76eeffd700 1 -- 192.168.123.105:0/448621588 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f7708133620 msgr2=0x7f7708133aa0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:52:48.444 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:48.443+0000 7f76eeffd700 1 --2- 192.168.123.105:0/448621588 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f7708133620 0x7f7708133aa0 secure :-1 s=READY pgs=35 cs=0 l=1 rev1=1 crypto rx=0x7f7700003c60 tx=0x7f7700004b40 comp rx=0 tx=0).stop 2026-03-10T07:52:48.444 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:48.443+0000 7f76eeffd700 1 -- 192.168.123.105:0/448621588 shutdown_connections 2026-03-10T07:52:48.444 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:48.443+0000 7f76eeffd700 1 --2- 192.168.123.105:0/448621588 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f770810d310 0x7f77081330e0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:52:48.444 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:48.443+0000 7f76eeffd700 1 --2- 192.168.123.105:0/448621588 >> 
[v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f76f0077a40 0x7f76f0079f00 unknown :-1 s=CLOSED pgs=22 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:52:48.444 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:48.443+0000 7f76eeffd700 1 --2- 192.168.123.105:0/448621588 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f7708133620 0x7f7708133aa0 unknown :-1 s=CLOSED pgs=35 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:52:48.444 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:48.443+0000 7f76eeffd700 1 -- 192.168.123.105:0/448621588 >> 192.168.123.105:0/448621588 conn(0x7f770806ce20 msgr2=0x7f7708070db0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:52:48.444 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:48.443+0000 7f76eeffd700 1 -- 192.168.123.105:0/448621588 shutdown_connections 2026-03-10T07:52:48.444 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:48.443+0000 7f76eeffd700 1 -- 192.168.123.105:0/448621588 wait complete. 
2026-03-10T07:52:48.537 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:48.536+0000 7fafccfc9700 1 -- 192.168.123.105:0/613903246 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fafc81089a0 msgr2=0x7fafc810be70 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:52:48.537 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:48.536+0000 7fafccfc9700 1 --2- 192.168.123.105:0/613903246 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fafc81089a0 0x7fafc810be70 secure :-1 s=READY pgs=36 cs=0 l=1 rev1=1 crypto rx=0x7fafc000d3f0 tx=0x7fafc000d700 comp rx=0 tx=0).stop 2026-03-10T07:52:48.537 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:48.537+0000 7fafccfc9700 1 -- 192.168.123.105:0/613903246 shutdown_connections 2026-03-10T07:52:48.537 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:48.537+0000 7fafccfc9700 1 --2- 192.168.123.105:0/613903246 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fafc81089a0 0x7fafc810be70 unknown :-1 s=CLOSED pgs=36 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:52:48.537 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:48.537+0000 7fafccfc9700 1 --2- 192.168.123.105:0/613903246 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fafc8107ff0 0x7fafc81083d0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:52:48.537 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:48.537+0000 7fafccfc9700 1 -- 192.168.123.105:0/613903246 >> 192.168.123.105:0/613903246 conn(0x7fafc806ce20 msgr2=0x7fafc806d230 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:52:48.538 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:48.537+0000 7fafccfc9700 1 -- 192.168.123.105:0/613903246 shutdown_connections 2026-03-10T07:52:48.538 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:48.537+0000 7fafccfc9700 1 -- 192.168.123.105:0/613903246 wait 
complete. 2026-03-10T07:52:48.538 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:48.537+0000 7fafccfc9700 1 Processor -- start 2026-03-10T07:52:48.538 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:48.537+0000 7fafccfc9700 1 -- start start 2026-03-10T07:52:48.538 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:48.537+0000 7fafccfc9700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fafc8107ff0 0x7fafc8133330 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:52:48.538 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:48.537+0000 7fafccfc9700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fafc8133870 0x7fafc8133cf0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:52:48.538 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:48.537+0000 7fafccfc9700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fafc807ef10 con 0x7fafc8133870 2026-03-10T07:52:48.538 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:48.537+0000 7fafccfc9700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fafc807f080 con 0x7fafc8107ff0 2026-03-10T07:52:48.538 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:48.537+0000 7fafc5d9b700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fafc8133870 0x7fafc8133cf0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:52:48.538 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:48.538+0000 7fafc5d9b700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fafc8133870 0x7fafc8133cf0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I 
am v2:192.168.123.105:57654/0 (socket says 192.168.123.105:57654) 2026-03-10T07:52:48.538 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:48.538+0000 7fafc5d9b700 1 -- 192.168.123.105:0/980916450 learned_addr learned my addr 192.168.123.105:0/980916450 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T07:52:48.538 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:48.538+0000 7fafc5d9b700 1 -- 192.168.123.105:0/980916450 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fafc8107ff0 msgr2=0x7fafc8133330 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:52:48.538 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:48.538+0000 7fafc5d9b700 1 --2- 192.168.123.105:0/980916450 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fafc8107ff0 0x7fafc8133330 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:52:48.538 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:48.538+0000 7fafc5d9b700 1 -- 192.168.123.105:0/980916450 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fafc0007ed0 con 0x7fafc8133870 2026-03-10T07:52:48.539 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:48.538+0000 7fafc5d9b700 1 --2- 192.168.123.105:0/980916450 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fafc8133870 0x7fafc8133cf0 secure :-1 s=READY pgs=37 cs=0 l=1 rev1=1 crypto rx=0x7fafc0004010 tx=0x7fafc00040f0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:52:48.539 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:48.539+0000 7fafb77fe700 1 -- 192.168.123.105:0/980916450 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fafc001c070 con 0x7fafc8133870 2026-03-10T07:52:48.539 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:48.539+0000 7fafccfc9700 1 -- 
192.168.123.105:0/980916450 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fafc807f2b0 con 0x7fafc8133870 2026-03-10T07:52:48.540 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:48.539+0000 7fafccfc9700 1 -- 192.168.123.105:0/980916450 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fafc807f750 con 0x7fafc8133870 2026-03-10T07:52:48.540 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:48.539+0000 7fafb77fe700 1 -- 192.168.123.105:0/980916450 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7fafc0004760 con 0x7fafc8133870 2026-03-10T07:52:48.540 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:48.539+0000 7fafb77fe700 1 -- 192.168.123.105:0/980916450 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fafc0017850 con 0x7fafc8133870 2026-03-10T07:52:48.541 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:48.540+0000 7fafccfc9700 1 -- 192.168.123.105:0/980916450 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fafc804f2e0 con 0x7fafc8133870 2026-03-10T07:52:48.541 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:48.541+0000 7fafb77fe700 1 -- 192.168.123.105:0/980916450 <== mon.0 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 37) v1 ==== 100217+0+0 (secure 0 0 0) 0x7fafc00048d0 con 0x7fafc8133870 2026-03-10T07:52:48.541 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:48.541+0000 7fafb77fe700 1 --2- 192.168.123.105:0/980916450 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7fafb0077a40 0x7fafb0079f00 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:52:48.542 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:48.541+0000 7fafb77fe700 1 -- 192.168.123.105:0/980916450 <== 
mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(43..43 src has 1..43) v4 ==== 6136+0+0 (secure 0 0 0) 0x7fafc0013070 con 0x7fafc8133870 2026-03-10T07:52:48.544 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:48.544+0000 7fafb77fe700 1 -- 192.168.123.105:0/980916450 <== mon.0 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7fafc0063bc0 con 0x7fafc8133870 2026-03-10T07:52:48.550 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:48.550+0000 7fafc659c700 1 --2- 192.168.123.105:0/980916450 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7fafb0077a40 0x7fafb0079f00 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:52:48.566 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:48.565+0000 7fafc659c700 1 --2- 192.168.123.105:0/980916450 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7fafb0077a40 0x7fafb0079f00 secure :-1 s=READY pgs=23 cs=0 l=1 rev1=1 crypto rx=0x7fafb80098a0 tx=0x7fafb8006d90 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:52:48.772 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:48.771+0000 7fafccfc9700 1 -- 192.168.123.105:0/980916450 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "versions"} v 0) v1 -- 0x7fafc8134900 con 0x7fafc8133870 2026-03-10T07:52:48.773 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:48.772+0000 7fafb77fe700 1 -- 192.168.123.105:0/980916450 <== mon.0 v2:192.168.123.105:3300/0 7 ==== mon_command_ack([{"prefix": "versions"}]=0 v0) v1 ==== 56+0+694 (secure 0 0 0) 0x7fafc0063310 con 0x7fafc8133870 2026-03-10T07:52:48.773 INFO:teuthology.orchestra.run.vm05.stdout:{ 2026-03-10T07:52:48.773 INFO:teuthology.orchestra.run.vm05.stdout: "mon": { 
2026-03-10T07:52:48.773 INFO:teuthology.orchestra.run.vm05.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 2 2026-03-10T07:52:48.773 INFO:teuthology.orchestra.run.vm05.stdout: }, 2026-03-10T07:52:48.773 INFO:teuthology.orchestra.run.vm05.stdout: "mgr": { 2026-03-10T07:52:48.773 INFO:teuthology.orchestra.run.vm05.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 2 2026-03-10T07:52:48.773 INFO:teuthology.orchestra.run.vm05.stdout: }, 2026-03-10T07:52:48.773 INFO:teuthology.orchestra.run.vm05.stdout: "osd": { 2026-03-10T07:52:48.773 INFO:teuthology.orchestra.run.vm05.stdout: "ceph version 18.2.1 (7fe91d5d5842e04be3b4f514d6dd990c54b29c76) reef (stable)": 6 2026-03-10T07:52:48.773 INFO:teuthology.orchestra.run.vm05.stdout: }, 2026-03-10T07:52:48.773 INFO:teuthology.orchestra.run.vm05.stdout: "mds": { 2026-03-10T07:52:48.773 INFO:teuthology.orchestra.run.vm05.stdout: "ceph version 18.2.1 (7fe91d5d5842e04be3b4f514d6dd990c54b29c76) reef (stable)": 4 2026-03-10T07:52:48.773 INFO:teuthology.orchestra.run.vm05.stdout: }, 2026-03-10T07:52:48.773 INFO:teuthology.orchestra.run.vm05.stdout: "overall": { 2026-03-10T07:52:48.773 INFO:teuthology.orchestra.run.vm05.stdout: "ceph version 18.2.1 (7fe91d5d5842e04be3b4f514d6dd990c54b29c76) reef (stable)": 10, 2026-03-10T07:52:48.773 INFO:teuthology.orchestra.run.vm05.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 4 2026-03-10T07:52:48.773 INFO:teuthology.orchestra.run.vm05.stdout: } 2026-03-10T07:52:48.773 INFO:teuthology.orchestra.run.vm05.stdout:} 2026-03-10T07:52:48.776 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:48.775+0000 7fafb57fa700 1 -- 192.168.123.105:0/980916450 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7fafb0077a40 msgr2=0x7fafb0079f00 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 
2026-03-10T07:52:48.776 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:48.775+0000 7fafb57fa700 1 --2- 192.168.123.105:0/980916450 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7fafb0077a40 0x7fafb0079f00 secure :-1 s=READY pgs=23 cs=0 l=1 rev1=1 crypto rx=0x7fafb80098a0 tx=0x7fafb8006d90 comp rx=0 tx=0).stop 2026-03-10T07:52:48.776 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:48.776+0000 7fafb57fa700 1 -- 192.168.123.105:0/980916450 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fafc8133870 msgr2=0x7fafc8133cf0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:52:48.776 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:48.776+0000 7fafb57fa700 1 --2- 192.168.123.105:0/980916450 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fafc8133870 0x7fafc8133cf0 secure :-1 s=READY pgs=37 cs=0 l=1 rev1=1 crypto rx=0x7fafc0004010 tx=0x7fafc00040f0 comp rx=0 tx=0).stop 2026-03-10T07:52:48.777 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:48.776+0000 7fafb57fa700 1 -- 192.168.123.105:0/980916450 shutdown_connections 2026-03-10T07:52:48.777 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:48.776+0000 7fafb57fa700 1 --2- 192.168.123.105:0/980916450 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fafc8107ff0 0x7fafc8133330 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:52:48.777 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:48.776+0000 7fafb57fa700 1 --2- 192.168.123.105:0/980916450 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7fafb0077a40 0x7fafb0079f00 unknown :-1 s=CLOSED pgs=23 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:52:48.777 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:48.776+0000 7fafb57fa700 1 --2- 192.168.123.105:0/980916450 >> 
[v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fafc8133870 0x7fafc8133cf0 unknown :-1 s=CLOSED pgs=37 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:52:48.777 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:48.776+0000 7fafb57fa700 1 -- 192.168.123.105:0/980916450 >> 192.168.123.105:0/980916450 conn(0x7fafc806ce20 msgr2=0x7fafc80706a0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:52:48.777 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:48.776+0000 7fafb57fa700 1 -- 192.168.123.105:0/980916450 shutdown_connections 2026-03-10T07:52:48.777 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:48.777+0000 7fafb57fa700 1 -- 192.168.123.105:0/980916450 wait complete. 2026-03-10T07:52:48.898 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:48 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:52:48.898 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:48 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:52:48.898 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:48 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "auth get-or-create", "entity": "mds.cephfs.vm08.ybmbgd", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]: dispatch 2026-03-10T07:52:48.898 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:48 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T07:52:48.898 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:48 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:52:48.898 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:48 vm05.local ceph-mon[130117]: 
from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:52:48.898 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:48 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "auth get-or-create", "entity": "mds.cephfs.vm08.dgsaon", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]: dispatch 2026-03-10T07:52:48.898 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:48 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T07:52:48.899 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:48.897+0000 7f908626c700 1 -- 192.168.123.105:0/3348320269 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f908010d310 msgr2=0x7f908010d6f0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:52:48.899 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:48.897+0000 7f908626c700 1 --2- 192.168.123.105:0/3348320269 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f908010d310 0x7f908010d6f0 secure :-1 s=READY pgs=38 cs=0 l=1 rev1=1 crypto rx=0x7f907c008790 tx=0x7f907c00ae50 comp rx=0 tx=0).stop 2026-03-10T07:52:48.899 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:48.898+0000 7f908626c700 1 -- 192.168.123.105:0/3348320269 shutdown_connections 2026-03-10T07:52:48.899 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:48.898+0000 7f908626c700 1 --2- 192.168.123.105:0/3348320269 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f9080107d90 0x7f90801081f0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:52:48.899 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:48.898+0000 7f908626c700 1 --2- 192.168.123.105:0/3348320269 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] 
conn(0x7f908010d310 0x7f908010d6f0 unknown :-1 s=CLOSED pgs=38 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:52:48.899 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:48.898+0000 7f908626c700 1 -- 192.168.123.105:0/3348320269 >> 192.168.123.105:0/3348320269 conn(0x7f908006ce20 msgr2=0x7f908006d230 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:52:48.899 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:48.898+0000 7f908626c700 1 -- 192.168.123.105:0/3348320269 shutdown_connections 2026-03-10T07:52:48.899 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:48.898+0000 7f908626c700 1 -- 192.168.123.105:0/3348320269 wait complete. 2026-03-10T07:52:48.899 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:48.899+0000 7f908626c700 1 Processor -- start 2026-03-10T07:52:48.899 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:48.899+0000 7f908626c700 1 -- start start 2026-03-10T07:52:48.899 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:48.899+0000 7f908626c700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f9080107d90 0x7f908007cec0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:52:48.899 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:48.899+0000 7f908626c700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f908007d400 0x7f908007d880 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:52:48.899 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:48.899+0000 7f908626c700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f9080083e90 con 0x7f908007d400 2026-03-10T07:52:48.899 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:48.899+0000 7f908626c700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f90800819a0 con 0x7f9080107d90 
2026-03-10T07:52:48.899 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:48.899+0000 7f908526a700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f9080107d90 0x7f908007cec0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:52:48.899 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:48.899+0000 7f908526a700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f9080107d90 0x7f908007cec0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.108:3300/0 says I am v2:192.168.123.105:40320/0 (socket says 192.168.123.105:40320) 2026-03-10T07:52:48.899 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:48.899+0000 7f908526a700 1 -- 192.168.123.105:0/292644891 learned_addr learned my addr 192.168.123.105:0/292644891 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T07:52:48.900 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:48.899+0000 7f908526a700 1 -- 192.168.123.105:0/292644891 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f908007d400 msgr2=0x7f908007d880 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:52:48.900 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:48.899+0000 7f908526a700 1 --2- 192.168.123.105:0/292644891 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f908007d400 0x7f908007d880 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:52:48.900 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:48.899+0000 7f908526a700 1 -- 192.168.123.105:0/292644891 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f907c008440 con 0x7f9080107d90 2026-03-10T07:52:48.900 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:48.900+0000 
7f908526a700 1 --2- 192.168.123.105:0/292644891 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f9080107d90 0x7f908007cec0 secure :-1 s=READY pgs=12 cs=0 l=1 rev1=1 crypto rx=0x7f907c00be10 tx=0x7f907c00bef0 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:52:48.901 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:48.900+0000 7f90767fc700 1 -- 192.168.123.105:0/292644891 <== mon.1 v2:192.168.123.108:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f907c00a490 con 0x7f9080107d90 2026-03-10T07:52:48.901 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:48.900+0000 7f908626c700 1 -- 192.168.123.105:0/292644891 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f9080081c20 con 0x7f9080107d90 2026-03-10T07:52:48.901 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:48.900+0000 7f908626c700 1 -- 192.168.123.105:0/292644891 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f9080082170 con 0x7f9080107d90 2026-03-10T07:52:48.901 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:48.901+0000 7f90767fc700 1 -- 192.168.123.105:0/292644891 <== mon.1 v2:192.168.123.108:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7f907c01c420 con 0x7f9080107d90 2026-03-10T07:52:48.901 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:48.901+0000 7f90767fc700 1 -- 192.168.123.105:0/292644891 <== mon.1 v2:192.168.123.108:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f907c012630 con 0x7f9080107d90 2026-03-10T07:52:48.903 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:48.902+0000 7f90767fc700 1 -- 192.168.123.105:0/292644891 <== mon.1 v2:192.168.123.108:3300/0 4 ==== mgrmap(e 37) v1 ==== 100217+0+0 (secure 0 0 0) 0x7f907c01ca90 con 0x7f9080107d90 2026-03-10T07:52:48.903 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:48.903+0000 7f90767fc700 1 --2- 192.168.123.105:0/292644891 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f906c077b00 0x7f906c079fc0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:52:48.903 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:48.903+0000 7f90767fc700 1 -- 192.168.123.105:0/292644891 <== mon.1 v2:192.168.123.108:3300/0 5 ==== osd_map(43..43 src has 1..43) v4 ==== 6136+0+0 (secure 0 0 0) 0x7f907c09a960 con 0x7f9080107d90 2026-03-10T07:52:48.903 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:48.903+0000 7f908626c700 1 -- 192.168.123.105:0/292644891 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f9064005320 con 0x7f9080107d90 2026-03-10T07:52:48.906 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:48.905+0000 7f9084a69700 1 --2- 192.168.123.105:0/292644891 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f906c077b00 0x7f906c079fc0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:52:48.906 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:48.906+0000 7f9084a69700 1 --2- 192.168.123.105:0/292644891 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f906c077b00 0x7f906c079fc0 secure :-1 s=READY pgs=24 cs=0 l=1 rev1=1 crypto rx=0x7f90780095a0 tx=0x7f9078006040 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:52:48.910 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:48.908+0000 7f90767fc700 1 -- 192.168.123.105:0/292644891 <== mon.1 v2:192.168.123.108:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f907c0630b0 
con 0x7f9080107d90 2026-03-10T07:52:48.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:48 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:52:48.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:48 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:52:48.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:48 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "auth get-or-create", "entity": "mds.cephfs.vm08.ybmbgd", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]: dispatch 2026-03-10T07:52:48.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:48 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T07:52:48.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:48 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:52:48.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:48 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:52:48.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:48 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "auth get-or-create", "entity": "mds.cephfs.vm08.dgsaon", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]: dispatch 2026-03-10T07:52:48.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:48 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T07:52:49.184 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:49.183+0000 7f908626c700 1 -- 192.168.123.105:0/292644891 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "fs dump"} v 0) v1 -- 0x7f9064005cc0 con 0x7f9080107d90 2026-03-10T07:52:49.193 INFO:teuthology.orchestra.run.vm05.stdout:e12 2026-03-10T07:52:49.193 INFO:teuthology.orchestra.run.vm05.stdout:btime 1970-01-01T00:00:00:000000+0000 2026-03-10T07:52:49.193 INFO:teuthology.orchestra.run.vm05.stdout:enable_multiple, ever_enabled_multiple: 1,1 2026-03-10T07:52:49.194 INFO:teuthology.orchestra.run.vm05.stdout:default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2} 2026-03-10T07:52:49.194 INFO:teuthology.orchestra.run.vm05.stdout:legacy client fscid: 1 2026-03-10T07:52:49.194 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-10T07:52:49.194 INFO:teuthology.orchestra.run.vm05.stdout:Filesystem 'cephfs' (1) 2026-03-10T07:52:49.194 INFO:teuthology.orchestra.run.vm05.stdout:fs_name cephfs 2026-03-10T07:52:49.194 INFO:teuthology.orchestra.run.vm05.stdout:epoch 12 2026-03-10T07:52:49.194 INFO:teuthology.orchestra.run.vm05.stdout:flags 12 joinable allow_snaps allow_multimds_snaps 2026-03-10T07:52:49.194 INFO:teuthology.orchestra.run.vm05.stdout:created 2026-03-10T07:48:24.309293+0000 2026-03-10T07:52:49.194 INFO:teuthology.orchestra.run.vm05.stdout:modified 2026-03-10T07:48:33.425187+0000 2026-03-10T07:52:49.194 INFO:teuthology.orchestra.run.vm05.stdout:tableserver 0 2026-03-10T07:52:49.194 INFO:teuthology.orchestra.run.vm05.stdout:root 0 2026-03-10T07:52:49.194 INFO:teuthology.orchestra.run.vm05.stdout:session_timeout 60 2026-03-10T07:52:49.194 INFO:teuthology.orchestra.run.vm05.stdout:session_autoclose 300 2026-03-10T07:52:49.194 
INFO:teuthology.orchestra.run.vm05.stdout:max_file_size 1099511627776 2026-03-10T07:52:49.194 INFO:teuthology.orchestra.run.vm05.stdout:max_xattr_size 65536 2026-03-10T07:52:49.194 INFO:teuthology.orchestra.run.vm05.stdout:required_client_features {} 2026-03-10T07:52:49.194 INFO:teuthology.orchestra.run.vm05.stdout:last_failure 0 2026-03-10T07:52:49.194 INFO:teuthology.orchestra.run.vm05.stdout:last_failure_osd_epoch 39 2026-03-10T07:52:49.194 INFO:teuthology.orchestra.run.vm05.stdout:compat compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2} 2026-03-10T07:52:49.194 INFO:teuthology.orchestra.run.vm05.stdout:max_mds 1 2026-03-10T07:52:49.194 INFO:teuthology.orchestra.run.vm05.stdout:in 0 2026-03-10T07:52:49.194 INFO:teuthology.orchestra.run.vm05.stdout:up {0=24297} 2026-03-10T07:52:49.194 INFO:teuthology.orchestra.run.vm05.stdout:failed 2026-03-10T07:52:49.194 INFO:teuthology.orchestra.run.vm05.stdout:damaged 2026-03-10T07:52:49.194 INFO:teuthology.orchestra.run.vm05.stdout:stopped 2026-03-10T07:52:49.194 INFO:teuthology.orchestra.run.vm05.stdout:data_pools [3] 2026-03-10T07:52:49.194 INFO:teuthology.orchestra.run.vm05.stdout:metadata_pool 2 2026-03-10T07:52:49.194 INFO:teuthology.orchestra.run.vm05.stdout:inline_data enabled 2026-03-10T07:52:49.194 INFO:teuthology.orchestra.run.vm05.stdout:balancer 2026-03-10T07:52:49.194 INFO:teuthology.orchestra.run.vm05.stdout:bal_rank_mask -1 2026-03-10T07:52:49.194 INFO:teuthology.orchestra.run.vm05.stdout:standby_count_wanted 1 2026-03-10T07:52:49.194 INFO:teuthology.orchestra.run.vm05.stdout:qdb_cluster leader: 0 members: 2026-03-10T07:52:49.194 INFO:teuthology.orchestra.run.vm05.stdout:[mds.cephfs.vm05.omfhnh{0:24297} state up:active seq 5 join_fscid=1 addr 
[v2:192.168.123.105:6826/723078808,v1:192.168.123.105:6827/723078808] compat {c=[1],r=[1],i=[7ff]}] 2026-03-10T07:52:49.194 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-10T07:52:49.194 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-10T07:52:49.194 INFO:teuthology.orchestra.run.vm05.stdout:Standby daemons: 2026-03-10T07:52:49.194 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-10T07:52:49.194 INFO:teuthology.orchestra.run.vm05.stdout:[mds.cephfs.vm05.pavqil{-1:14512} state up:standby seq 2 join_fscid=1 addr [v2:192.168.123.105:6828/427555544,v1:192.168.123.105:6829/427555544] compat {c=[1],r=[1],i=[7ff]}] 2026-03-10T07:52:49.194 INFO:teuthology.orchestra.run.vm05.stdout:[mds.cephfs.vm08.ybmbgd{-1:14524} state up:standby seq 1 join_fscid=1 addr [v2:192.168.123.108:6824/3815585988,v1:192.168.123.108:6825/3815585988] compat {c=[1],r=[1],i=[7ff]}] 2026-03-10T07:52:49.194 INFO:teuthology.orchestra.run.vm05.stdout:[mds.cephfs.vm08.dgsaon{-1:24313} state up:standby seq 2 join_fscid=1 addr [v2:192.168.123.108:6826/2963085185,v1:192.168.123.108:6827/2963085185] compat {c=[1],r=[1],i=[7ff]}] 2026-03-10T07:52:49.194 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:49.186+0000 7f90767fc700 1 -- 192.168.123.105:0/292644891 <== mon.1 v2:192.168.123.108:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump"}]=0 dumped fsmap epoch 12 v12) v1 ==== 76+0+1915 (secure 0 0 0) 0x7f907c02a790 con 0x7f9080107d90 2026-03-10T07:52:49.194 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:49.188+0000 7f906bfff700 1 -- 192.168.123.105:0/292644891 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f906c077b00 msgr2=0x7f906c079fc0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:52:49.194 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:49.188+0000 7f906bfff700 1 --2- 192.168.123.105:0/292644891 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f906c077b00 0x7f906c079fc0 
secure :-1 s=READY pgs=24 cs=0 l=1 rev1=1 crypto rx=0x7f90780095a0 tx=0x7f9078006040 comp rx=0 tx=0).stop 2026-03-10T07:52:49.194 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:49.188+0000 7f906bfff700 1 -- 192.168.123.105:0/292644891 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f9080107d90 msgr2=0x7f908007cec0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:52:49.194 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:49.188+0000 7f906bfff700 1 --2- 192.168.123.105:0/292644891 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f9080107d90 0x7f908007cec0 secure :-1 s=READY pgs=12 cs=0 l=1 rev1=1 crypto rx=0x7f907c00be10 tx=0x7f907c00bef0 comp rx=0 tx=0).stop 2026-03-10T07:52:49.194 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:49.188+0000 7f906bfff700 1 -- 192.168.123.105:0/292644891 shutdown_connections 2026-03-10T07:52:49.194 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:49.188+0000 7f906bfff700 1 --2- 192.168.123.105:0/292644891 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f9080107d90 0x7f908007cec0 unknown :-1 s=CLOSED pgs=12 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:52:49.194 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:49.188+0000 7f906bfff700 1 --2- 192.168.123.105:0/292644891 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f906c077b00 0x7f906c079fc0 unknown :-1 s=CLOSED pgs=24 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:52:49.194 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:49.188+0000 7f906bfff700 1 --2- 192.168.123.105:0/292644891 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f908007d400 0x7f908007d880 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:52:49.194 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:49.188+0000 7f906bfff700 1 -- 
192.168.123.105:0/292644891 >> 192.168.123.105:0/292644891 conn(0x7f908006ce20 msgr2=0x7f9080071e20 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:52:49.194 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:49.188+0000 7f906bfff700 1 -- 192.168.123.105:0/292644891 shutdown_connections 2026-03-10T07:52:49.194 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:49.188+0000 7f906bfff700 1 -- 192.168.123.105:0/292644891 wait complete. 2026-03-10T07:52:49.194 INFO:teuthology.orchestra.run.vm05.stderr:dumped fsmap epoch 12 2026-03-10T07:52:49.340 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:49.339+0000 7f3dda806700 1 -- 192.168.123.105:0/1127931357 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3dd41089a0 msgr2=0x7f3dd410be70 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:52:49.340 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:49.339+0000 7f3dda806700 1 --2- 192.168.123.105:0/1127931357 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3dd41089a0 0x7f3dd410be70 secure :-1 s=READY pgs=39 cs=0 l=1 rev1=1 crypto rx=0x7f3dcc00d3f0 tx=0x7f3dcc00d700 comp rx=0 tx=0).stop 2026-03-10T07:52:49.340 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:49.339+0000 7f3dda806700 1 -- 192.168.123.105:0/1127931357 shutdown_connections 2026-03-10T07:52:49.340 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:49.339+0000 7f3dda806700 1 --2- 192.168.123.105:0/1127931357 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3dd41089a0 0x7f3dd410be70 unknown :-1 s=CLOSED pgs=39 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:52:49.340 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:49.339+0000 7f3dda806700 1 --2- 192.168.123.105:0/1127931357 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f3dd4107ff0 0x7f3dd41083d0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 
2026-03-10T07:52:49.340 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:49.339+0000 7f3dda806700 1 -- 192.168.123.105:0/1127931357 >> 192.168.123.105:0/1127931357 conn(0x7f3dd406ce20 msgr2=0x7f3dd406d230 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:52:49.341 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:49.340+0000 7f3dda806700 1 -- 192.168.123.105:0/1127931357 shutdown_connections 2026-03-10T07:52:49.341 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:49.340+0000 7f3dda806700 1 -- 192.168.123.105:0/1127931357 wait complete. 2026-03-10T07:52:49.341 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:49.340+0000 7f3dda806700 1 Processor -- start 2026-03-10T07:52:49.341 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:49.340+0000 7f3dda806700 1 -- start start 2026-03-10T07:52:49.341 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:49.340+0000 7f3dda806700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f3dd4107ff0 0x7f3dd4133210 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:52:49.341 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:49.340+0000 7f3dda806700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3dd4133750 0x7f3dd4133bd0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:52:49.341 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:49.340+0000 7f3dda806700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f3dd407ef10 con 0x7f3dd4133750 2026-03-10T07:52:49.341 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:49.340+0000 7f3dda806700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f3dd407f080 con 0x7f3dd4107ff0 2026-03-10T07:52:49.341 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:49.341+0000 7f3dd37fe700 1 --2- >> 
[v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3dd4133750 0x7f3dd4133bd0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:52:49.341 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:49.341+0000 7f3dd3fff700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f3dd4107ff0 0x7f3dd4133210 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:52:49.341 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:49.341+0000 7f3dd3fff700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f3dd4107ff0 0x7f3dd4133210 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.108:3300/0 says I am v2:192.168.123.105:40334/0 (socket says 192.168.123.105:40334) 2026-03-10T07:52:49.341 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:49.341+0000 7f3dd3fff700 1 -- 192.168.123.105:0/3437493988 learned_addr learned my addr 192.168.123.105:0/3437493988 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T07:52:49.342 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:49.342+0000 7f3dd3fff700 1 -- 192.168.123.105:0/3437493988 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3dd4133750 msgr2=0x7f3dd4133bd0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:52:49.342 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:49.342+0000 7f3dd3fff700 1 --2- 192.168.123.105:0/3437493988 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3dd4133750 0x7f3dd4133bd0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:52:49.342 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:49.342+0000 7f3dd3fff700 1 -- 192.168.123.105:0/3437493988 --> 
[v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f3dcc007ed0 con 0x7f3dd4107ff0 2026-03-10T07:52:49.342 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:49.342+0000 7f3dd3fff700 1 --2- 192.168.123.105:0/3437493988 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f3dd4107ff0 0x7f3dd4133210 secure :-1 s=READY pgs=13 cs=0 l=1 rev1=1 crypto rx=0x7f3dc400d8d0 tx=0x7f3dc400dbe0 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:52:49.343 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:49.342+0000 7f3dd17fa700 1 -- 192.168.123.105:0/3437493988 <== mon.1 v2:192.168.123.108:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f3dc4009880 con 0x7f3dd4107ff0 2026-03-10T07:52:49.343 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:49.342+0000 7f3dda806700 1 -- 192.168.123.105:0/3437493988 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f3dd407f310 con 0x7f3dd4107ff0 2026-03-10T07:52:49.343 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:49.342+0000 7f3dda806700 1 -- 192.168.123.105:0/3437493988 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f3dd407f860 con 0x7f3dd4107ff0 2026-03-10T07:52:49.343 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:49.343+0000 7f3dd17fa700 1 -- 192.168.123.105:0/3437493988 <== mon.1 v2:192.168.123.108:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7f3dc4010460 con 0x7f3dd4107ff0 2026-03-10T07:52:49.343 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:49.343+0000 7f3dd17fa700 1 -- 192.168.123.105:0/3437493988 <== mon.1 v2:192.168.123.108:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f3dc400f5d0 con 0x7f3dd4107ff0 2026-03-10T07:52:49.345 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:49.344+0000 7f3dd17fa700 1 -- 
192.168.123.105:0/3437493988 <== mon.1 v2:192.168.123.108:3300/0 4 ==== mgrmap(e 37) v1 ==== 100217+0+0 (secure 0 0 0) 0x7f3dc40099e0 con 0x7f3dd4107ff0 2026-03-10T07:52:49.345 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:49.345+0000 7f3dd17fa700 1 --2- 192.168.123.105:0/3437493988 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f3dbc077a50 0x7f3dbc079f10 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:52:49.345 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:49.345+0000 7f3dd17fa700 1 -- 192.168.123.105:0/3437493988 <== mon.1 v2:192.168.123.108:3300/0 5 ==== osd_map(43..43 src has 1..43) v4 ==== 6136+0+0 (secure 0 0 0) 0x7f3dc4099960 con 0x7f3dd4107ff0 2026-03-10T07:52:49.345 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:49.345+0000 7f3dda806700 1 -- 192.168.123.105:0/3437493988 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f3dc0005320 con 0x7f3dd4107ff0 2026-03-10T07:52:49.346 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:49.345+0000 7f3dd37fe700 1 --2- 192.168.123.105:0/3437493988 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f3dbc077a50 0x7f3dbc079f10 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:52:49.346 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:49.346+0000 7f3dd37fe700 1 --2- 192.168.123.105:0/3437493988 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f3dbc077a50 0x7f3dbc079f10 secure :-1 s=READY pgs=25 cs=0 l=1 rev1=1 crypto rx=0x7f3dcc000f80 tx=0x7f3dcc00db00 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:52:49.352 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:49.351+0000 7f3dd17fa700 1 -- 
192.168.123.105:0/3437493988 <== mon.1 v2:192.168.123.108:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f3dc4061f70 con 0x7f3dd4107ff0 2026-03-10T07:52:49.608 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:49.607+0000 7f3dda806700 1 -- 192.168.123.105:0/3437493988 --> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7f3dc0000bf0 con 0x7f3dbc077a50 2026-03-10T07:52:49.611 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:49.610+0000 7f3dd17fa700 1 -- 192.168.123.105:0/3437493988 <== mgr.34104 v2:192.168.123.105:6800/627686298 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+385 (secure 0 0 0) 0x7f3dc0000bf0 con 0x7f3dbc077a50 2026-03-10T07:52:49.611 INFO:teuthology.orchestra.run.vm05.stdout:{ 2026-03-10T07:52:49.611 INFO:teuthology.orchestra.run.vm05.stdout: "target_image": "quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc", 2026-03-10T07:52:49.611 INFO:teuthology.orchestra.run.vm05.stdout: "in_progress": true, 2026-03-10T07:52:49.611 INFO:teuthology.orchestra.run.vm05.stdout: "which": "Upgrading all daemon types on all hosts", 2026-03-10T07:52:49.611 INFO:teuthology.orchestra.run.vm05.stdout: "services_complete": [ 2026-03-10T07:52:49.611 INFO:teuthology.orchestra.run.vm05.stdout: "mgr", 2026-03-10T07:52:49.611 INFO:teuthology.orchestra.run.vm05.stdout: "mon" 2026-03-10T07:52:49.611 INFO:teuthology.orchestra.run.vm05.stdout: ], 2026-03-10T07:52:49.611 INFO:teuthology.orchestra.run.vm05.stdout: "progress": "4/23 daemons upgraded", 2026-03-10T07:52:49.611 INFO:teuthology.orchestra.run.vm05.stdout: "message": "Currently upgrading crash daemons", 2026-03-10T07:52:49.611 INFO:teuthology.orchestra.run.vm05.stdout: "is_paused": false 2026-03-10T07:52:49.611 INFO:teuthology.orchestra.run.vm05.stdout:} 2026-03-10T07:52:49.613 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:49.613+0000 7f3dda806700 1 -- 192.168.123.105:0/3437493988 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f3dbc077a50 msgr2=0x7f3dbc079f10 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:52:49.613 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:49.613+0000 7f3dda806700 1 --2- 192.168.123.105:0/3437493988 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f3dbc077a50 0x7f3dbc079f10 secure :-1 s=READY pgs=25 cs=0 l=1 rev1=1 crypto rx=0x7f3dcc000f80 tx=0x7f3dcc00db00 comp rx=0 tx=0).stop 2026-03-10T07:52:49.613 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:49.613+0000 7f3dda806700 1 -- 192.168.123.105:0/3437493988 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f3dd4107ff0 msgr2=0x7f3dd4133210 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:52:49.613 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:49.613+0000 7f3dda806700 1 --2- 192.168.123.105:0/3437493988 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f3dd4107ff0 0x7f3dd4133210 secure :-1 s=READY pgs=13 cs=0 l=1 rev1=1 crypto rx=0x7f3dc400d8d0 tx=0x7f3dc400dbe0 comp rx=0 tx=0).stop 2026-03-10T07:52:49.614 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:49.613+0000 7f3dda806700 1 -- 192.168.123.105:0/3437493988 shutdown_connections 2026-03-10T07:52:49.614 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:49.613+0000 7f3dda806700 1 --2- 192.168.123.105:0/3437493988 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f3dd4107ff0 0x7f3dd4133210 unknown :-1 s=CLOSED pgs=13 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:52:49.614 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:49.613+0000 7f3dda806700 1 --2- 192.168.123.105:0/3437493988 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] 
conn(0x7f3dbc077a50 0x7f3dbc079f10 unknown :-1 s=CLOSED pgs=25 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:52:49.614 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:49.613+0000 7f3dda806700 1 --2- 192.168.123.105:0/3437493988 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3dd4133750 0x7f3dd4133bd0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:52:49.614 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:49.613+0000 7f3dda806700 1 -- 192.168.123.105:0/3437493988 >> 192.168.123.105:0/3437493988 conn(0x7f3dd406ce20 msgr2=0x7f3dd4070670 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:52:49.614 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:49.613+0000 7f3dda806700 1 -- 192.168.123.105:0/3437493988 shutdown_connections 2026-03-10T07:52:49.614 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:49.613+0000 7f3dda806700 1 -- 192.168.123.105:0/3437493988 wait complete. 
2026-03-10T07:52:49.680 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:49.679+0000 7f52c9b49700 1 -- 192.168.123.105:0/3403730124 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f52c40ffc50 msgr2=0x7f52c4100030 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:52:49.680 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:49.679+0000 7f52c9b49700 1 --2- 192.168.123.105:0/3403730124 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f52c40ffc50 0x7f52c4100030 secure :-1 s=READY pgs=40 cs=0 l=1 rev1=1 crypto rx=0x7f52ac009b00 tx=0x7f52ac009e10 comp rx=0 tx=0).stop 2026-03-10T07:52:49.680 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:49.679+0000 7f52c9b49700 1 -- 192.168.123.105:0/3403730124 shutdown_connections 2026-03-10T07:52:49.680 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:49.679+0000 7f52c9b49700 1 --2- 192.168.123.105:0/3403730124 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f52c4100600 0x7f52c410d2c0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:52:49.680 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:49.679+0000 7f52c9b49700 1 --2- 192.168.123.105:0/3403730124 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f52c40ffc50 0x7f52c4100030 secure :-1 s=CLOSED pgs=40 cs=0 l=1 rev1=1 crypto rx=0x7f52ac009b00 tx=0x7f52ac009e10 comp rx=0 tx=0).stop 2026-03-10T07:52:49.680 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:49.679+0000 7f52c9b49700 1 -- 192.168.123.105:0/3403730124 >> 192.168.123.105:0/3403730124 conn(0x7f52c40fb830 msgr2=0x7f52c40fdc50 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:52:49.680 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:49.680+0000 7f52c9b49700 1 -- 192.168.123.105:0/3403730124 shutdown_connections 2026-03-10T07:52:49.680 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:49.680+0000 7f52c9b49700 1 -- 
192.168.123.105:0/3403730124 wait complete. 2026-03-10T07:52:49.681 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:49.680+0000 7f52c9b49700 1 Processor -- start 2026-03-10T07:52:49.681 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:49.680+0000 7f52c9b49700 1 -- start start 2026-03-10T07:52:49.681 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:49.681+0000 7f52c9b49700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f52c4100600 0x7f52c4199120 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:52:49.681 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:49.681+0000 7f52c9b49700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f52c4199660 0x7f52c419dad0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:52:49.681 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:49.681+0000 7f52c9b49700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f52c4199c80 con 0x7f52c4100600 2026-03-10T07:52:49.682 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:49.681+0000 7f52c9b49700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f52c4199df0 con 0x7f52c4199660 2026-03-10T07:52:49.682 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:49.681+0000 7f52c2ffd700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f52c4199660 0x7f52c419dad0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:52:49.682 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:49.681+0000 7f52c2ffd700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f52c4199660 0x7f52c419dad0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer 
v2:192.168.123.108:3300/0 says I am v2:192.168.123.105:40352/0 (socket says 192.168.123.105:40352) 2026-03-10T07:52:49.682 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:49.681+0000 7f52c2ffd700 1 -- 192.168.123.105:0/1071056951 learned_addr learned my addr 192.168.123.105:0/1071056951 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T07:52:49.682 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:49.682+0000 7f52c37fe700 1 --2- 192.168.123.105:0/1071056951 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f52c4100600 0x7f52c4199120 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:52:49.682 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:49.682+0000 7f52c37fe700 1 -- 192.168.123.105:0/1071056951 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f52c4199660 msgr2=0x7f52c419dad0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:52:49.682 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:49.682+0000 7f52c37fe700 1 --2- 192.168.123.105:0/1071056951 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f52c4199660 0x7f52c419dad0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:52:49.683 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:49.682+0000 7f52c37fe700 1 -- 192.168.123.105:0/1071056951 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f52ac0097e0 con 0x7f52c4100600 2026-03-10T07:52:49.683 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:49.682+0000 7f52c37fe700 1 --2- 192.168.123.105:0/1071056951 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f52c4100600 0x7f52c4199120 secure :-1 s=READY pgs=41 cs=0 l=1 rev1=1 crypto rx=0x7f52ac008000 tx=0x7f52ac0038b0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 
server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:52:49.683 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:49.682+0000 7f52c0ff9700 1 -- 192.168.123.105:0/1071056951 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f52ac01d070 con 0x7f52c4100600 2026-03-10T07:52:49.684 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:49.683+0000 7f52c0ff9700 1 -- 192.168.123.105:0/1071056951 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7f52ac003c80 con 0x7f52c4100600 2026-03-10T07:52:49.684 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:49.683+0000 7f52c0ff9700 1 -- 192.168.123.105:0/1071056951 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f52ac01d070 con 0x7f52c4100600 2026-03-10T07:52:49.687 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:49.684+0000 7f52c9b49700 1 -- 192.168.123.105:0/1071056951 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f52c419e0d0 con 0x7f52c4100600 2026-03-10T07:52:49.687 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:49.685+0000 7f52c9b49700 1 -- 192.168.123.105:0/1071056951 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f52c419e5a0 con 0x7f52c4100600 2026-03-10T07:52:49.688 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:49.687+0000 7f52c9b49700 1 -- 192.168.123.105:0/1071056951 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f52c410aa30 con 0x7f52c4100600 2026-03-10T07:52:49.695 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:49.689+0000 7f52c0ff9700 1 -- 192.168.123.105:0/1071056951 <== mon.0 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 37) v1 ==== 100217+0+0 (secure 0 0 0) 0x7f52ac00f460 con 0x7f52c4100600 2026-03-10T07:52:49.695 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:49.689+0000 7f52c0ff9700 1 --2- 192.168.123.105:0/1071056951 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f52b0077ab0 0x7f52b0079f70 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:52:49.695 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:49.689+0000 7f52c0ff9700 1 -- 192.168.123.105:0/1071056951 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(43..43 src has 1..43) v4 ==== 6136+0+0 (secure 0 0 0) 0x7f52ac02f080 con 0x7f52c4100600 2026-03-10T07:52:49.695 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:49.691+0000 7f52c2ffd700 1 --2- 192.168.123.105:0/1071056951 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f52b0077ab0 0x7f52b0079f70 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:52:49.695 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:49.691+0000 7f52c2ffd700 1 --2- 192.168.123.105:0/1071056951 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f52b0077ab0 0x7f52b0079f70 secure :-1 s=READY pgs=26 cs=0 l=1 rev1=1 crypto rx=0x7f52b4009cc0 tx=0x7f52b4009480 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:52:49.695 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:49.693+0000 7f52c0ff9700 1 -- 192.168.123.105:0/1071056951 <== mon.0 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f52ac062dc0 con 0x7f52c4100600 2026-03-10T07:52:49.915 INFO:teuthology.orchestra.run.vm05.stdout:HEALTH_WARN 1 filesystem with deprecated feature inline_data 2026-03-10T07:52:49.915 INFO:teuthology.orchestra.run.vm05.stdout:[WRN] FS_INLINE_DATA_DEPRECATED: 1 filesystem with deprecated feature inline_data 
2026-03-10T07:52:49.915 INFO:teuthology.orchestra.run.vm05.stdout: fs cephfs has deprecated feature inline_data enabled. 2026-03-10T07:52:49.915 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:49.908+0000 7f52c9b49700 1 -- 192.168.123.105:0/1071056951 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "health", "detail": "detail"} v 0) v1 -- 0x7f52c404ea90 con 0x7f52c4100600 2026-03-10T07:52:49.915 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:49.909+0000 7f52c0ff9700 1 -- 192.168.123.105:0/1071056951 <== mon.0 v2:192.168.123.105:3300/0 7 ==== mon_command_ack([{"prefix": "health", "detail": "detail"}]=0 v0) v1 ==== 74+0+201 (secure 0 0 0) 0x7f52ac062dc0 con 0x7f52c4100600 2026-03-10T07:52:49.915 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:49.912+0000 7f52ba7fc700 1 -- 192.168.123.105:0/1071056951 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f52b0077ab0 msgr2=0x7f52b0079f70 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:52:49.915 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:49.912+0000 7f52ba7fc700 1 --2- 192.168.123.105:0/1071056951 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f52b0077ab0 0x7f52b0079f70 secure :-1 s=READY pgs=26 cs=0 l=1 rev1=1 crypto rx=0x7f52b4009cc0 tx=0x7f52b4009480 comp rx=0 tx=0).stop 2026-03-10T07:52:49.915 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:49.912+0000 7f52ba7fc700 1 -- 192.168.123.105:0/1071056951 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f52c4100600 msgr2=0x7f52c4199120 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:52:49.915 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:49.912+0000 7f52ba7fc700 1 --2- 192.168.123.105:0/1071056951 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f52c4100600 0x7f52c4199120 secure :-1 s=READY pgs=41 cs=0 l=1 rev1=1 crypto 
rx=0x7f52ac008000 tx=0x7f52ac0038b0 comp rx=0 tx=0).stop 2026-03-10T07:52:49.915 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:49.912+0000 7f52ba7fc700 1 -- 192.168.123.105:0/1071056951 shutdown_connections 2026-03-10T07:52:49.915 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:49.912+0000 7f52ba7fc700 1 --2- 192.168.123.105:0/1071056951 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f52b0077ab0 0x7f52b0079f70 unknown :-1 s=CLOSED pgs=26 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:52:49.915 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:49.912+0000 7f52ba7fc700 1 --2- 192.168.123.105:0/1071056951 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f52c4100600 0x7f52c4199120 unknown :-1 s=CLOSED pgs=41 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:52:49.915 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:49.912+0000 7f52ba7fc700 1 --2- 192.168.123.105:0/1071056951 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f52c4199660 0x7f52c419dad0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:52:49.915 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:49.912+0000 7f52ba7fc700 1 -- 192.168.123.105:0/1071056951 >> 192.168.123.105:0/1071056951 conn(0x7f52c40fb830 msgr2=0x7f52c40fcf00 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:52:49.915 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:49.912+0000 7f52ba7fc700 1 -- 192.168.123.105:0/1071056951 shutdown_connections 2026-03-10T07:52:49.915 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:52:49.912+0000 7f52ba7fc700 1 -- 192.168.123.105:0/1071056951 wait complete. 
2026-03-10T07:52:50.100 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:49 vm05.local ceph-mon[130117]: from='client.44101 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T07:52:50.100 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:49 vm05.local ceph-mon[130117]: Reconfiguring mds.cephfs.vm08.ybmbgd (monmap changed)... 2026-03-10T07:52:50.100 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:49 vm05.local ceph-mon[130117]: Reconfiguring daemon mds.cephfs.vm08.ybmbgd on vm08 2026-03-10T07:52:50.100 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:49 vm05.local ceph-mon[130117]: from='client.44105 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T07:52:50.100 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:49 vm05.local ceph-mon[130117]: Reconfiguring mds.cephfs.vm08.dgsaon (monmap changed)... 2026-03-10T07:52:50.100 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:49 vm05.local ceph-mon[130117]: Reconfiguring daemon mds.cephfs.vm08.dgsaon on vm08 2026-03-10T07:52:50.100 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:49 vm05.local ceph-mon[130117]: from='client.34126 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T07:52:50.100 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:49 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:52:50.100 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:49 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:52:50.100 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:49 vm05.local ceph-mon[130117]: from='client.? 
192.168.123.105:0/980916450' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T07:52:50.100 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:49 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T07:52:50.100 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:49 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T07:52:50.100 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:49 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T07:52:50.100 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:49 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:52:50.100 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:49 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mon.vm05"}]: dispatch 2026-03-10T07:52:50.100 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:49 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd='[{"prefix": "config rm", "name": "container_image", "who": "mon.vm05"}]': finished 2026-03-10T07:52:50.100 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:49 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mon.vm08"}]: dispatch 2026-03-10T07:52:50.100 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:49 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd='[{"prefix": "config rm", "name": 
"container_image", "who": "mon.vm08"}]': finished 2026-03-10T07:52:50.100 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:49 vm05.local ceph-mon[130117]: from='client.? 192.168.123.105:0/292644891' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch 2026-03-10T07:52:50.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:49 vm08.local ceph-mon[107898]: from='client.44101 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T07:52:50.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:49 vm08.local ceph-mon[107898]: Reconfiguring mds.cephfs.vm08.ybmbgd (monmap changed)... 2026-03-10T07:52:50.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:49 vm08.local ceph-mon[107898]: Reconfiguring daemon mds.cephfs.vm08.ybmbgd on vm08 2026-03-10T07:52:50.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:49 vm08.local ceph-mon[107898]: from='client.44105 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T07:52:50.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:49 vm08.local ceph-mon[107898]: Reconfiguring mds.cephfs.vm08.dgsaon (monmap changed)... 
2026-03-10T07:52:50.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:49 vm08.local ceph-mon[107898]: Reconfiguring daemon mds.cephfs.vm08.dgsaon on vm08 2026-03-10T07:52:50.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:49 vm08.local ceph-mon[107898]: from='client.34126 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T07:52:50.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:49 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:52:50.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:49 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:52:50.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:49 vm08.local ceph-mon[107898]: from='client.? 192.168.123.105:0/980916450' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T07:52:50.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:49 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T07:52:50.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:49 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T07:52:50.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:49 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T07:52:50.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:49 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:52:50.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:49 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' 
entity='mgr.vm05.blexke' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mon.vm05"}]: dispatch 2026-03-10T07:52:50.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:49 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd='[{"prefix": "config rm", "name": "container_image", "who": "mon.vm05"}]': finished 2026-03-10T07:52:50.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:49 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mon.vm08"}]: dispatch 2026-03-10T07:52:50.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:49 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd='[{"prefix": "config rm", "name": "container_image", "who": "mon.vm08"}]': finished 2026-03-10T07:52:50.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:49 vm08.local ceph-mon[107898]: from='client.? 
192.168.123.105:0/292644891' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch 2026-03-10T07:52:51.053 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:50 vm05.local ceph-mon[130117]: pgmap v14: 65 pgs: 65 active+clean; 311 MiB data, 2.9 GiB used, 117 GiB / 120 GiB avail; 41 KiB/s rd, 2.3 MiB/s wr, 121 op/s 2026-03-10T07:52:51.053 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:50 vm05.local ceph-mon[130117]: Upgrade: Setting container_image for all mon 2026-03-10T07:52:51.053 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:50 vm05.local ceph-mon[130117]: Upgrade: Updating crash.vm05 (1/2) 2026-03-10T07:52:51.053 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:50 vm05.local ceph-mon[130117]: from='client.44117 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T07:52:51.053 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:50 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:52:51.053 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:50 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "auth get-or-create", "entity": "client.crash.vm05", "caps": ["mon", "profile crash", "mgr", "profile crash"]}]: dispatch 2026-03-10T07:52:51.053 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:50 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T07:52:51.053 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:50 vm05.local ceph-mon[130117]: Deploying daemon crash.vm05 on vm05 2026-03-10T07:52:51.053 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:50 vm05.local ceph-mon[130117]: from='client.? 
192.168.123.105:0/1071056951' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch 2026-03-10T07:52:51.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:50 vm08.local ceph-mon[107898]: pgmap v14: 65 pgs: 65 active+clean; 311 MiB data, 2.9 GiB used, 117 GiB / 120 GiB avail; 41 KiB/s rd, 2.3 MiB/s wr, 121 op/s 2026-03-10T07:52:51.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:50 vm08.local ceph-mon[107898]: Upgrade: Setting container_image for all mon 2026-03-10T07:52:51.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:50 vm08.local ceph-mon[107898]: Upgrade: Updating crash.vm05 (1/2) 2026-03-10T07:52:51.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:50 vm08.local ceph-mon[107898]: from='client.44117 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T07:52:51.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:50 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:52:51.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:50 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "auth get-or-create", "entity": "client.crash.vm05", "caps": ["mon", "profile crash", "mgr", "profile crash"]}]: dispatch 2026-03-10T07:52:51.169 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:50 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T07:52:51.169 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:50 vm08.local ceph-mon[107898]: Deploying daemon crash.vm05 on vm05 2026-03-10T07:52:51.169 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:50 vm08.local ceph-mon[107898]: from='client.? 
192.168.123.105:0/1071056951' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch 2026-03-10T07:52:52.739 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:52 vm08.local ceph-mon[107898]: pgmap v15: 65 pgs: 65 active+clean; 311 MiB data, 2.9 GiB used, 117 GiB / 120 GiB avail; 24 KiB/s rd, 1.4 MiB/s wr, 78 op/s 2026-03-10T07:52:52.739 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:52 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:52:52.739 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:52 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:52:52.739 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:52 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:52:52.739 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:52 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "auth get-or-create", "entity": "client.crash.vm08", "caps": ["mon", "profile crash", "mgr", "profile crash"]}]: dispatch 2026-03-10T07:52:52.739 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:52 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T07:52:52.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:52 vm05.local ceph-mon[130117]: pgmap v15: 65 pgs: 65 active+clean; 311 MiB data, 2.9 GiB used, 117 GiB / 120 GiB avail; 24 KiB/s rd, 1.4 MiB/s wr, 78 op/s 2026-03-10T07:52:52.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:52 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:52:52.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:52 vm05.local ceph-mon[130117]: from='mgr.34104 
192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:52:52.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:52 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:52:52.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:52 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "auth get-or-create", "entity": "client.crash.vm08", "caps": ["mon", "profile crash", "mgr", "profile crash"]}]: dispatch 2026-03-10T07:52:52.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:52 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T07:52:53.834 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:53 vm08.local ceph-mon[107898]: Upgrade: Updating crash.vm08 (2/2) 2026-03-10T07:52:53.834 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:53 vm08.local ceph-mon[107898]: Deploying daemon crash.vm08 on vm08 2026-03-10T07:52:53.834 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:53 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:52:53.834 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:53 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:52:53.834 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:53 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T07:52:53.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:53 vm05.local ceph-mon[130117]: Upgrade: Updating crash.vm08 (2/2) 2026-03-10T07:52:53.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:53 vm05.local ceph-mon[130117]: Deploying daemon crash.vm08 on vm08 
2026-03-10T07:52:53.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:53 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:52:53.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:53 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:52:53.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:53 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T07:52:54.737 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:54 vm05.local ceph-mon[130117]: pgmap v16: 65 pgs: 65 active+clean; 311 MiB data, 2.9 GiB used, 117 GiB / 120 GiB avail; 24 KiB/s rd, 1.4 MiB/s wr, 78 op/s 2026-03-10T07:52:54.737 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:54 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:52:54.738 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:54 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:52:55.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:54 vm08.local ceph-mon[107898]: pgmap v16: 65 pgs: 65 active+clean; 311 MiB data, 2.9 GiB used, 117 GiB / 120 GiB avail; 24 KiB/s rd, 1.4 MiB/s wr, 78 op/s 2026-03-10T07:52:55.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:54 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:52:55.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:54 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:52:55.919 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:55 vm08.local ceph-mon[107898]: pgmap v17: 65 pgs: 65 active+clean; 309 MiB data, 2.9 GiB used, 117 GiB / 120 GiB 
avail; 33 KiB/s rd, 1.9 MiB/s wr, 277 op/s 2026-03-10T07:52:55.919 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:55 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:52:55.919 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:55 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:52:56.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:55 vm05.local ceph-mon[130117]: pgmap v17: 65 pgs: 65 active+clean; 309 MiB data, 2.9 GiB used, 117 GiB / 120 GiB avail; 33 KiB/s rd, 1.9 MiB/s wr, 277 op/s 2026-03-10T07:52:56.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:55 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:52:56.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:55 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:52:57.059 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:56 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:52:57.059 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:56 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:52:57.059 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:56 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:52:57.059 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:56 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:52:57.059 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:56 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:52:57.059 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:56 
vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:52:57.059 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:56 vm05.local ceph-mon[130117]: pgmap v18: 65 pgs: 65 active+clean; 309 MiB data, 2.9 GiB used, 117 GiB / 120 GiB avail; 19 KiB/s rd, 1.3 MiB/s wr, 237 op/s 2026-03-10T07:52:57.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:56 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:52:57.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:56 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:52:57.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:56 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:52:57.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:56 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:52:57.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:56 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:52:57.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:56 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:52:57.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:56 vm08.local ceph-mon[107898]: pgmap v18: 65 pgs: 65 active+clean; 309 MiB data, 2.9 GiB used, 117 GiB / 120 GiB avail; 19 KiB/s rd, 1.3 MiB/s wr, 237 op/s 2026-03-10T07:52:57.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:57 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T07:52:57.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:57 
vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:52:57.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:57 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:52:57.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:57 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T07:52:57.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:57 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T07:52:57.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:57 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:52:57.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:57 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T07:52:57.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:57 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T07:52:57.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:57 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T07:52:57.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:57 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T07:52:57.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:57 vm05.local ceph-mon[130117]: from='mgr.34104 
192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:52:57.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:57 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "config rm", "name": "container_image", "who": "client.crash.vm05"}]: dispatch 2026-03-10T07:52:57.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:57 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd='[{"prefix": "config rm", "name": "container_image", "who": "client.crash.vm05"}]': finished 2026-03-10T07:52:57.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:57 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "config rm", "name": "container_image", "who": "client.crash.vm08"}]: dispatch 2026-03-10T07:52:57.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:57 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd='[{"prefix": "config rm", "name": "container_image", "who": "client.crash.vm08"}]': finished 2026-03-10T07:52:57.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:57 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "osd ok-to-stop", "ids": ["0"], "max": 16}]: dispatch 2026-03-10T07:52:58.080 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:57 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T07:52:58.080 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:57 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:52:58.080 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:57 vm08.local ceph-mon[107898]: from='mgr.34104 
192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:52:58.080 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:57 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T07:52:58.080 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:57 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T07:52:58.080 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:57 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:52:58.080 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:57 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T07:52:58.080 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:57 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T07:52:58.080 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:57 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T07:52:58.080 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:57 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T07:52:58.080 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:57 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:52:58.080 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:57 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 
cmd=[{"prefix": "config rm", "name": "container_image", "who": "client.crash.vm05"}]: dispatch 2026-03-10T07:52:58.080 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:57 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd='[{"prefix": "config rm", "name": "container_image", "who": "client.crash.vm05"}]': finished 2026-03-10T07:52:58.080 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:57 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "config rm", "name": "container_image", "who": "client.crash.vm08"}]: dispatch 2026-03-10T07:52:58.080 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:57 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd='[{"prefix": "config rm", "name": "container_image", "who": "client.crash.vm08"}]': finished 2026-03-10T07:52:58.080 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:57 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "osd ok-to-stop", "ids": ["0"], "max": 16}]: dispatch 2026-03-10T07:52:59.805 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:59 vm05.local ceph-mon[130117]: Upgrade: Setting container_image for all crash 2026-03-10T07:52:59.806 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:59 vm05.local ceph-mon[130117]: from='mon.0 -' entity='mon.' 
cmd=[{"prefix": "osd ok-to-stop", "ids": ["0"], "max": 16}]: dispatch 2026-03-10T07:52:59.806 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:59 vm05.local ceph-mon[130117]: Upgrade: osd.0 is safe to restart 2026-03-10T07:52:59.806 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:59 vm05.local ceph-mon[130117]: Upgrade: Updating osd.0 2026-03-10T07:52:59.806 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:59 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:52:59.806 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:59 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "auth get", "entity": "osd.0"}]: dispatch 2026-03-10T07:52:59.806 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:59 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T07:52:59.806 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:59 vm05.local ceph-mon[130117]: Deploying daemon osd.0 on vm05 2026-03-10T07:52:59.806 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:52:59 vm05.local ceph-mon[130117]: pgmap v19: 65 pgs: 65 active+clean; 304 MiB data, 2.9 GiB used, 117 GiB / 120 GiB avail; 28 KiB/s rd, 1.8 MiB/s wr, 373 op/s 2026-03-10T07:52:59.806 INFO:journalctl@ceph.osd.0.vm05.stdout:Mar 10 07:52:59 vm05.local systemd[1]: Stopping Ceph osd.0 for 12e9780e-1c55-11f1-8896-79f7c2e9b508... 2026-03-10T07:52:59.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:59 vm08.local ceph-mon[107898]: Upgrade: Setting container_image for all crash 2026-03-10T07:52:59.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:59 vm08.local ceph-mon[107898]: from='mon.0 -' entity='mon.' 
cmd=[{"prefix": "osd ok-to-stop", "ids": ["0"], "max": 16}]: dispatch 2026-03-10T07:52:59.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:59 vm08.local ceph-mon[107898]: Upgrade: osd.0 is safe to restart 2026-03-10T07:52:59.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:59 vm08.local ceph-mon[107898]: Upgrade: Updating osd.0 2026-03-10T07:52:59.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:59 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:52:59.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:59 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "auth get", "entity": "osd.0"}]: dispatch 2026-03-10T07:52:59.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:59 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T07:52:59.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:59 vm08.local ceph-mon[107898]: Deploying daemon osd.0 on vm05 2026-03-10T07:52:59.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:52:59 vm08.local ceph-mon[107898]: pgmap v19: 65 pgs: 65 active+clean; 304 MiB data, 2.9 GiB used, 117 GiB / 120 GiB avail; 28 KiB/s rd, 1.8 MiB/s wr, 373 op/s 2026-03-10T07:53:00.157 INFO:journalctl@ceph.osd.0.vm05.stdout:Mar 10 07:52:59 vm05.local ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508-osd-0[68323]: 2026-03-10T07:52:59.804+0000 7fe3301c5700 -1 received signal: Terminated from /run/podman-init -- /usr/bin/ceph-osd -n osd.0 -f --setuser ceph --setgroup ceph --default-log-to-file=false --default-log-to-journald=true --default-log-to-stderr=false (PID: 1) UID: 0 2026-03-10T07:53:00.157 INFO:journalctl@ceph.osd.0.vm05.stdout:Mar 10 07:52:59 vm05.local ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508-osd-0[68323]: 2026-03-10T07:52:59.804+0000 7fe3301c5700 -1 osd.0 
43 *** Got signal Terminated *** 2026-03-10T07:53:00.157 INFO:journalctl@ceph.osd.0.vm05.stdout:Mar 10 07:52:59 vm05.local ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508-osd-0[68323]: 2026-03-10T07:52:59.804+0000 7fe3301c5700 -1 osd.0 43 *** Immediate shutdown (osd_fast_shutdown=true) *** 2026-03-10T07:53:00.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:53:00 vm05.local ceph-mon[130117]: osd.0 marked itself down and dead 2026-03-10T07:53:00.908 INFO:journalctl@ceph.osd.0.vm05.stdout:Mar 10 07:53:00 vm05.local podman[138663]: 2026-03-10 07:53:00.771679508 +0000 UTC m=+0.997869009 container died 9b7c5ea48cea7da650dc69460661e169249602058e9cd898bff19e1d061f14ee (image=quay.io/ceph/ceph@sha256:9f35728f6070a596500c0804814a12ab6b98e05067316dc64876fb4b28d04af3, name=ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508-osd-0, org.label-schema.build-date=20240222, org.label-schema.name=CentOS Stream 8 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, GIT_BRANCH=HEAD, GIT_COMMIT=1617517b9622744995c4661989e3c30d036a7cfd, GIT_REPO=https://github.com/ceph/ceph-container.git, io.buildah.version=1.29.1, maintainer=Guillaume Abrioux , CEPH_POINT_RELEASE=-18.2.1, ceph=True, org.label-schema.license=GPLv2, GIT_CLEAN=True, RELEASE=HEAD) 2026-03-10T07:53:00.908 INFO:journalctl@ceph.osd.0.vm05.stdout:Mar 10 07:53:00 vm05.local podman[138663]: 2026-03-10 07:53:00.809775643 +0000 UTC m=+1.035965144 container remove 9b7c5ea48cea7da650dc69460661e169249602058e9cd898bff19e1d061f14ee (image=quay.io/ceph/ceph@sha256:9f35728f6070a596500c0804814a12ab6b98e05067316dc64876fb4b28d04af3, name=ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508-osd-0, org.label-schema.vendor=CentOS, CEPH_POINT_RELEASE=-18.2.1, GIT_CLEAN=True, RELEASE=HEAD, org.label-schema.license=GPLv2, GIT_BRANCH=HEAD, ceph=True, io.buildah.version=1.29.1, org.label-schema.schema-version=1.0, maintainer=Guillaume Abrioux , org.label-schema.build-date=20240222, org.label-schema.name=CentOS Stream 8 Base Image, 
GIT_COMMIT=1617517b9622744995c4661989e3c30d036a7cfd, GIT_REPO=https://github.com/ceph/ceph-container.git) 2026-03-10T07:53:00.908 INFO:journalctl@ceph.osd.0.vm05.stdout:Mar 10 07:53:00 vm05.local bash[138663]: ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508-osd-0 2026-03-10T07:53:00.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:53:00 vm08.local ceph-mon[107898]: osd.0 marked itself down and dead 2026-03-10T07:53:01.319 INFO:journalctl@ceph.osd.0.vm05.stdout:Mar 10 07:53:01 vm05.local podman[138725]: 2026-03-10 07:53:01.11284047 +0000 UTC m=+0.016606002 container create 3548fcb0c1539cf44d07af6de6f915c452a9ca34c097856018493e1458e67709 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508-osd-0-deactivate, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, ceph=True, OSD_FLAVOR=default, org.label-schema.license=GPLv2, CEPH_REF=squid, org.label-schema.build-date=20260223, org.label-schema.vendor=CentOS, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.41.3, org.opencontainers.image.authors=Ceph Release Team , GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/) 2026-03-10T07:53:01.319 INFO:journalctl@ceph.osd.0.vm05.stdout:Mar 10 07:53:01 vm05.local podman[138725]: 2026-03-10 07:53:01.156087577 +0000 UTC m=+0.059853118 container init 3548fcb0c1539cf44d07af6de6f915c452a9ca34c097856018493e1458e67709 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508-osd-0-deactivate, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, CEPH_REF=squid, 
GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, ceph=True, org.opencontainers.image.authors=Ceph Release Team , org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260223) 2026-03-10T07:53:01.319 INFO:journalctl@ceph.osd.0.vm05.stdout:Mar 10 07:53:01 vm05.local podman[138725]: 2026-03-10 07:53:01.159241052 +0000 UTC m=+0.063006593 container start 3548fcb0c1539cf44d07af6de6f915c452a9ca34c097856018493e1458e67709 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508-osd-0-deactivate, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team , org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.build-date=20260223, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0) 2026-03-10T07:53:01.319 INFO:journalctl@ceph.osd.0.vm05.stdout:Mar 10 07:53:01 vm05.local podman[138725]: 2026-03-10 07:53:01.16033022 +0000 UTC m=+0.064095761 container attach 3548fcb0c1539cf44d07af6de6f915c452a9ca34c097856018493e1458e67709 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508-osd-0-deactivate, 
org.opencontainers.image.authors=Ceph Release Team , CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.build-date=20260223, CEPH_REF=squid, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0) 2026-03-10T07:53:01.319 INFO:journalctl@ceph.osd.0.vm05.stdout:Mar 10 07:53:01 vm05.local podman[138725]: 2026-03-10 07:53:01.105895237 +0000 UTC m=+0.009660778 image pull 654f31e6858eb235bbece362255b685a945f2b6a367e2b88c4930c984fbb214c quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc 2026-03-10T07:53:01.319 INFO:journalctl@ceph.osd.0.vm05.stdout:Mar 10 07:53:01 vm05.local conmon[138738]: conmon 3548fcb0c1539cf44d07 : Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-3548fcb0c1539cf44d07af6de6f915c452a9ca34c097856018493e1458e67709.scope/container/memory.events 2026-03-10T07:53:01.319 INFO:journalctl@ceph.osd.0.vm05.stdout:Mar 10 07:53:01 vm05.local podman[138725]: 2026-03-10 07:53:01.301870011 +0000 UTC m=+0.205635552 container died 3548fcb0c1539cf44d07af6de6f915c452a9ca34c097856018493e1458e67709 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508-osd-0-deactivate, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.build-date=20260223, CEPH_REF=squid, 
org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team , GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, ceph=True) 2026-03-10T07:53:01.618 INFO:journalctl@ceph.osd.0.vm05.stdout:Mar 10 07:53:01 vm05.local podman[138725]: 2026-03-10 07:53:01.334045239 +0000 UTC m=+0.237810780 container remove 3548fcb0c1539cf44d07af6de6f915c452a9ca34c097856018493e1458e67709 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508-osd-0-deactivate, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.vendor=CentOS, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team , org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, OSD_FLAVOR=default, org.label-schema.build-date=20260223) 2026-03-10T07:53:01.618 INFO:journalctl@ceph.osd.0.vm05.stdout:Mar 10 07:53:01 vm05.local systemd[1]: ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508@osd.0.service: Deactivated successfully. 2026-03-10T07:53:01.618 INFO:journalctl@ceph.osd.0.vm05.stdout:Mar 10 07:53:01 vm05.local systemd[1]: Stopped Ceph osd.0 for 12e9780e-1c55-11f1-8896-79f7c2e9b508. 2026-03-10T07:53:01.618 INFO:journalctl@ceph.osd.0.vm05.stdout:Mar 10 07:53:01 vm05.local systemd[1]: ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508@osd.0.service: Consumed 29.537s CPU time. 
2026-03-10T07:53:01.618 INFO:journalctl@ceph.osd.0.vm05.stdout:Mar 10 07:53:01 vm05.local systemd[1]: Starting Ceph osd.0 for 12e9780e-1c55-11f1-8896-79f7c2e9b508... 2026-03-10T07:53:01.872 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:53:01 vm05.local ceph-mon[130117]: Health check failed: 1 osds down (OSD_DOWN) 2026-03-10T07:53:01.872 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:53:01 vm05.local ceph-mon[130117]: osdmap e44: 6 total, 5 up, 6 in 2026-03-10T07:53:01.872 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:53:01 vm05.local ceph-mon[130117]: pgmap v21: 65 pgs: 9 stale+active+clean, 56 active+clean; 304 MiB data, 2.9 GiB used, 117 GiB / 120 GiB avail; 22 KiB/s rd, 1.3 MiB/s wr, 401 op/s 2026-03-10T07:53:01.872 INFO:journalctl@ceph.osd.0.vm05.stdout:Mar 10 07:53:01 vm05.local podman[138831]: 2026-03-10 07:53:01.703060258 +0000 UTC m=+0.073731471 container create 0d5e47e330fc4527abc7dfd2982209debea5ca438643b45aeab09b54f852569c (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508-osd-0-activate, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.41.3, ceph=True, org.opencontainers.image.authors=Ceph Release Team , OSD_FLAVOR=default, org.label-schema.build-date=20260223, CEPH_REF=squid, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git) 2026-03-10T07:53:01.872 INFO:journalctl@ceph.osd.0.vm05.stdout:Mar 10 07:53:01 vm05.local podman[138831]: 2026-03-10 07:53:01.67406813 +0000 UTC m=+0.044739343 image pull 
654f31e6858eb235bbece362255b685a945f2b6a367e2b88c4930c984fbb214c quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc 2026-03-10T07:53:01.872 INFO:journalctl@ceph.osd.0.vm05.stdout:Mar 10 07:53:01 vm05.local podman[138831]: 2026-03-10 07:53:01.767627541 +0000 UTC m=+0.138298764 container init 0d5e47e330fc4527abc7dfd2982209debea5ca438643b45aeab09b54f852569c (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508-osd-0-activate, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20260223, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team , org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.schema-version=1.0, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, CEPH_REF=squid, io.buildah.version=1.41.3) 2026-03-10T07:53:01.872 INFO:journalctl@ceph.osd.0.vm05.stdout:Mar 10 07:53:01 vm05.local podman[138831]: 2026-03-10 07:53:01.772416797 +0000 UTC m=+0.143088011 container start 0d5e47e330fc4527abc7dfd2982209debea5ca438643b45aeab09b54f852569c (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508-osd-0-activate, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.label-schema.build-date=20260223, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team , 
org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, CEPH_REF=squid, ceph=True, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df) 2026-03-10T07:53:01.872 INFO:journalctl@ceph.osd.0.vm05.stdout:Mar 10 07:53:01 vm05.local podman[138831]: 2026-03-10 07:53:01.779267673 +0000 UTC m=+0.149938886 container attach 0d5e47e330fc4527abc7dfd2982209debea5ca438643b45aeab09b54f852569c (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508-osd-0-activate, org.label-schema.build-date=20260223, io.buildah.version=1.41.3, ceph=True, org.label-schema.license=GPLv2, CEPH_REF=squid, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team , OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9) 2026-03-10T07:53:01.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:53:01 vm08.local ceph-mon[107898]: Health check failed: 1 osds down (OSD_DOWN) 2026-03-10T07:53:01.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:53:01 vm08.local ceph-mon[107898]: osdmap e44: 6 total, 5 up, 6 in 2026-03-10T07:53:01.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:53:01 vm08.local ceph-mon[107898]: pgmap v21: 65 pgs: 9 stale+active+clean, 56 active+clean; 304 MiB data, 2.9 GiB used, 117 GiB / 120 GiB avail; 22 KiB/s rd, 1.3 MiB/s wr, 401 op/s 2026-03-10T07:53:02.157 INFO:journalctl@ceph.osd.0.vm05.stdout:Mar 10 07:53:01 vm05.local 
ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508-osd-0-activate[138842]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-10T07:53:02.157 INFO:journalctl@ceph.osd.0.vm05.stdout:Mar 10 07:53:01 vm05.local bash[138831]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-10T07:53:02.157 INFO:journalctl@ceph.osd.0.vm05.stdout:Mar 10 07:53:01 vm05.local ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508-osd-0-activate[138842]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-10T07:53:02.157 INFO:journalctl@ceph.osd.0.vm05.stdout:Mar 10 07:53:01 vm05.local bash[138831]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-10T07:53:02.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:53:02 vm05.local ceph-mon[130117]: osdmap e45: 6 total, 5 up, 6 in 2026-03-10T07:53:02.907 INFO:journalctl@ceph.osd.0.vm05.stdout:Mar 10 07:53:02 vm05.local ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508-osd-0-activate[138842]: --> Failed to activate via raw: did not find any matching OSD to activate 2026-03-10T07:53:02.907 INFO:journalctl@ceph.osd.0.vm05.stdout:Mar 10 07:53:02 vm05.local bash[138831]: --> Failed to activate via raw: did not find any matching OSD to activate 2026-03-10T07:53:02.907 INFO:journalctl@ceph.osd.0.vm05.stdout:Mar 10 07:53:02 vm05.local bash[138831]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-10T07:53:02.907 INFO:journalctl@ceph.osd.0.vm05.stdout:Mar 10 07:53:02 vm05.local ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508-osd-0-activate[138842]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-10T07:53:02.907 INFO:journalctl@ceph.osd.0.vm05.stdout:Mar 10 07:53:02 vm05.local ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508-osd-0-activate[138842]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-10T07:53:02.907 INFO:journalctl@ceph.osd.0.vm05.stdout:Mar 10 07:53:02 vm05.local bash[138831]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-10T07:53:02.907 
INFO:journalctl@ceph.osd.0.vm05.stdout:Mar 10 07:53:02 vm05.local ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508-osd-0-activate[138842]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-0 2026-03-10T07:53:02.907 INFO:journalctl@ceph.osd.0.vm05.stdout:Mar 10 07:53:02 vm05.local bash[138831]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-0 2026-03-10T07:53:02.907 INFO:journalctl@ceph.osd.0.vm05.stdout:Mar 10 07:53:02 vm05.local ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508-osd-0-activate[138842]: Running command: /usr/bin/ceph-bluestore-tool --cluster=ceph prime-osd-dir --dev /dev/ceph-57bfe7fe-be4f-4553-bbf4-713c2a371fcd/osd-block-b7030a1b-b939-419d-85b0-9e818f756cd8 --path /var/lib/ceph/osd/ceph-0 --no-mon-config 2026-03-10T07:53:02.907 INFO:journalctl@ceph.osd.0.vm05.stdout:Mar 10 07:53:02 vm05.local bash[138831]: Running command: /usr/bin/ceph-bluestore-tool --cluster=ceph prime-osd-dir --dev /dev/ceph-57bfe7fe-be4f-4553-bbf4-713c2a371fcd/osd-block-b7030a1b-b939-419d-85b0-9e818f756cd8 --path /var/lib/ceph/osd/ceph-0 --no-mon-config 2026-03-10T07:53:03.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:53:02 vm08.local ceph-mon[107898]: osdmap e45: 6 total, 5 up, 6 in 2026-03-10T07:53:03.304 INFO:journalctl@ceph.osd.0.vm05.stdout:Mar 10 07:53:02 vm05.local ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508-osd-0-activate[138842]: Running command: /usr/bin/ln -snf /dev/ceph-57bfe7fe-be4f-4553-bbf4-713c2a371fcd/osd-block-b7030a1b-b939-419d-85b0-9e818f756cd8 /var/lib/ceph/osd/ceph-0/block 2026-03-10T07:53:03.304 INFO:journalctl@ceph.osd.0.vm05.stdout:Mar 10 07:53:02 vm05.local bash[138831]: Running command: /usr/bin/ln -snf /dev/ceph-57bfe7fe-be4f-4553-bbf4-713c2a371fcd/osd-block-b7030a1b-b939-419d-85b0-9e818f756cd8 /var/lib/ceph/osd/ceph-0/block 2026-03-10T07:53:03.304 INFO:journalctl@ceph.osd.0.vm05.stdout:Mar 10 07:53:02 vm05.local ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508-osd-0-activate[138842]: Running command: /usr/bin/chown -h 
ceph:ceph /var/lib/ceph/osd/ceph-0/block 2026-03-10T07:53:03.304 INFO:journalctl@ceph.osd.0.vm05.stdout:Mar 10 07:53:02 vm05.local bash[138831]: Running command: /usr/bin/chown -h ceph:ceph /var/lib/ceph/osd/ceph-0/block 2026-03-10T07:53:03.304 INFO:journalctl@ceph.osd.0.vm05.stdout:Mar 10 07:53:02 vm05.local ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508-osd-0-activate[138842]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-0 2026-03-10T07:53:03.304 INFO:journalctl@ceph.osd.0.vm05.stdout:Mar 10 07:53:02 vm05.local bash[138831]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-0 2026-03-10T07:53:03.304 INFO:journalctl@ceph.osd.0.vm05.stdout:Mar 10 07:53:02 vm05.local ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508-osd-0-activate[138842]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-0 2026-03-10T07:53:03.304 INFO:journalctl@ceph.osd.0.vm05.stdout:Mar 10 07:53:02 vm05.local bash[138831]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-0 2026-03-10T07:53:03.304 INFO:journalctl@ceph.osd.0.vm05.stdout:Mar 10 07:53:03 vm05.local ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508-osd-0-activate[138842]: --> ceph-volume lvm activate successful for osd ID: 0 2026-03-10T07:53:03.304 INFO:journalctl@ceph.osd.0.vm05.stdout:Mar 10 07:53:03 vm05.local bash[138831]: --> ceph-volume lvm activate successful for osd ID: 0 2026-03-10T07:53:03.304 INFO:journalctl@ceph.osd.0.vm05.stdout:Mar 10 07:53:03 vm05.local podman[138831]: 2026-03-10 07:53:03.03145699 +0000 UTC m=+1.402128203 container died 0d5e47e330fc4527abc7dfd2982209debea5ca438643b45aeab09b54f852569c (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508-osd-0-activate, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team , org.opencontainers.image.documentation=https://docs.ceph.com/, 
CEPH_REF=squid, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20260223, ceph=True, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, OSD_FLAVOR=default) 2026-03-10T07:53:03.304 INFO:journalctl@ceph.osd.0.vm05.stdout:Mar 10 07:53:03 vm05.local podman[138831]: 2026-03-10 07:53:03.076918833 +0000 UTC m=+1.447590046 container remove 0d5e47e330fc4527abc7dfd2982209debea5ca438643b45aeab09b54f852569c (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508-osd-0-activate, org.label-schema.vendor=CentOS, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20260223, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team , CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, ceph=True, io.buildah.version=1.41.3) 2026-03-10T07:53:03.659 INFO:journalctl@ceph.osd.0.vm05.stdout:Mar 10 07:53:03 vm05.local podman[139096]: 2026-03-10 07:53:03.303237958 +0000 UTC m=+0.032865532 container create b35fccc2a4d5cd00d06d04a0b2dbd1e01886793ec7ed9fd823a33bdc2c8ed084 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508-osd-0, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, 
io.buildah.version=1.41.3, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.build-date=20260223, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team , GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, org.label-schema.license=GPLv2) 2026-03-10T07:53:03.659 INFO:journalctl@ceph.osd.0.vm05.stdout:Mar 10 07:53:03 vm05.local podman[139096]: 2026-03-10 07:53:03.353467176 +0000 UTC m=+0.083094750 container init b35fccc2a4d5cd00d06d04a0b2dbd1e01886793ec7ed9fd823a33bdc2c8ed084 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508-osd-0, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260223, ceph=True, CEPH_REF=squid, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, io.buildah.version=1.41.3, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9) 2026-03-10T07:53:03.659 INFO:journalctl@ceph.osd.0.vm05.stdout:Mar 10 07:53:03 vm05.local podman[139096]: 2026-03-10 07:53:03.357527077 +0000 UTC m=+0.087154651 container start b35fccc2a4d5cd00d06d04a0b2dbd1e01886793ec7ed9fd823a33bdc2c8ed084 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508-osd-0, io.buildah.version=1.41.3, ceph=True, 
org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.build-date=20260223, org.opencontainers.image.authors=Ceph Release Team , CEPH_REF=squid) 2026-03-10T07:53:03.659 INFO:journalctl@ceph.osd.0.vm05.stdout:Mar 10 07:53:03 vm05.local bash[139096]: b35fccc2a4d5cd00d06d04a0b2dbd1e01886793ec7ed9fd823a33bdc2c8ed084 2026-03-10T07:53:03.659 INFO:journalctl@ceph.osd.0.vm05.stdout:Mar 10 07:53:03 vm05.local podman[139096]: 2026-03-10 07:53:03.292996793 +0000 UTC m=+0.022624378 image pull 654f31e6858eb235bbece362255b685a945f2b6a367e2b88c4930c984fbb214c quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc 2026-03-10T07:53:03.659 INFO:journalctl@ceph.osd.0.vm05.stdout:Mar 10 07:53:03 vm05.local systemd[1]: Started Ceph osd.0 for 12e9780e-1c55-11f1-8896-79f7c2e9b508. 
2026-03-10T07:53:04.851 INFO:journalctl@ceph.osd.0.vm05.stdout:Mar 10 07:53:04 vm05.local ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508-osd-0[139106]: 2026-03-10T07:53:04.792+0000 7fb1b2582740 -1 Falling back to public interface 2026-03-10T07:53:04.851 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:53:04 vm05.local ceph-mon[130117]: pgmap v23: 65 pgs: 9 stale+active+clean, 56 active+clean; 304 MiB data, 2.9 GiB used, 117 GiB / 120 GiB avail; 14 KiB/s rd, 808 KiB/s wr, 203 op/s 2026-03-10T07:53:04.851 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:53:04 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:53:04.851 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:53:04 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:53:04.852 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:53:04 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T07:53:04.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:53:04 vm08.local ceph-mon[107898]: pgmap v23: 65 pgs: 9 stale+active+clean, 56 active+clean; 304 MiB data, 2.9 GiB used, 117 GiB / 120 GiB avail; 14 KiB/s rd, 808 KiB/s wr, 203 op/s 2026-03-10T07:53:04.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:53:04 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:53:04.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:53:04 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:53:04.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:53:04 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T07:53:05.821 
INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:53:05 vm05.local ceph-mon[130117]: pgmap v24: 65 pgs: 34 active+undersized+degraded, 31 active+clean; 305 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 32 KiB/s rd, 1.8 MiB/s wr, 506 op/s; 2188/14223 objects degraded (15.384%); 0 B/s, 0 keys/s, 0 objects/s recovering 2026-03-10T07:53:05.877 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:53:05 vm08.local ceph-mon[107898]: pgmap v24: 65 pgs: 34 active+undersized+degraded, 31 active+clean; 305 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 32 KiB/s rd, 1.8 MiB/s wr, 506 op/s; 2188/14223 objects degraded (15.384%); 0 B/s, 0 keys/s, 0 objects/s recovering 2026-03-10T07:53:06.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:53:06 vm05.local ceph-mon[130117]: Health check failed: Degraded data redundancy: 2188/14223 objects degraded (15.384%), 34 pgs degraded (PG_DEGRADED) 2026-03-10T07:53:06.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:53:06 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:53:06.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:53:06 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:53:06.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:53:06 vm08.local ceph-mon[107898]: Health check failed: Degraded data redundancy: 2188/14223 objects degraded (15.384%), 34 pgs degraded (PG_DEGRADED) 2026-03-10T07:53:06.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:53:06 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:53:06.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:53:06 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:53:07.764 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:53:07 vm05.local ceph-mon[130117]: pgmap v25: 65 pgs: 34 
active+undersized+degraded, 31 active+clean; 305 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 18 KiB/s rd, 990 KiB/s wr, 302 op/s; 2188/14223 objects degraded (15.384%); 0 B/s, 0 keys/s, 0 objects/s recovering 2026-03-10T07:53:07.764 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:53:07 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:53:07.764 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:53:07 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:53:07.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:53:07 vm08.local ceph-mon[107898]: pgmap v25: 65 pgs: 34 active+undersized+degraded, 31 active+clean; 305 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 18 KiB/s rd, 990 KiB/s wr, 302 op/s; 2188/14223 objects degraded (15.384%); 0 B/s, 0 keys/s, 0 objects/s recovering 2026-03-10T07:53:07.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:53:07 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:53:07.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:53:07 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:53:09.407 INFO:journalctl@ceph.osd.0.vm05.stdout:Mar 10 07:53:08 vm05.local ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508-osd-0[139106]: 2026-03-10T07:53:08.997+0000 7fb1b2582740 -1 osd.0 0 read_superblock omap replica is missing. 
2026-03-10T07:53:10.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:53:09 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:53:10.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:53:09 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:53:10.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:53:09 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T07:53:10.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:53:09 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T07:53:10.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:53:09 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:53:10.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:53:09 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T07:53:10.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:53:09 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T07:53:10.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:53:09 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T07:53:10.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:53:09 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T07:53:10.157 
INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:53:09 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "osd ok-to-stop", "ids": ["1"], "max": 16}]: dispatch 2026-03-10T07:53:10.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:53:09 vm05.local ceph-mon[130117]: from='mon.0 -' entity='mon.' cmd=[{"prefix": "osd ok-to-stop", "ids": ["1"], "max": 16}]: dispatch 2026-03-10T07:53:10.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:53:09 vm05.local ceph-mon[130117]: Upgrade: unsafe to stop osd(s) at this time (15 PGs are or would become offline) 2026-03-10T07:53:10.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:53:09 vm05.local ceph-mon[130117]: pgmap v26: 65 pgs: 34 active+undersized+degraded, 31 active+clean; 305 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 31 KiB/s rd, 1.7 MiB/s wr, 432 op/s; 1919/12354 objects degraded (15.533%); 0 B/s, 0 keys/s, 0 objects/s recovering 2026-03-10T07:53:10.157 INFO:journalctl@ceph.osd.0.vm05.stdout:Mar 10 07:53:09 vm05.local ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508-osd-0[139106]: 2026-03-10T07:53:09.732+0000 7fb1b2582740 -1 osd.0 43 log_to_monitors true 2026-03-10T07:53:10.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:53:09 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:53:10.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:53:09 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:53:10.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:53:09 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T07:53:10.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:53:09 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 
cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T07:53:10.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:53:09 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:53:10.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:53:09 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T07:53:10.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:53:09 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T07:53:10.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:53:09 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T07:53:10.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:53:09 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T07:53:10.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:53:09 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "osd ok-to-stop", "ids": ["1"], "max": 16}]: dispatch 2026-03-10T07:53:10.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:53:09 vm08.local ceph-mon[107898]: from='mon.0 -' entity='mon.' 
cmd=[{"prefix": "osd ok-to-stop", "ids": ["1"], "max": 16}]: dispatch 2026-03-10T07:53:10.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:53:09 vm08.local ceph-mon[107898]: Upgrade: unsafe to stop osd(s) at this time (15 PGs are or would become offline) 2026-03-10T07:53:10.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:53:09 vm08.local ceph-mon[107898]: pgmap v26: 65 pgs: 34 active+undersized+degraded, 31 active+clean; 305 MiB data, 3.0 GiB used, 117 GiB / 120 GiB avail; 31 KiB/s rd, 1.7 MiB/s wr, 432 op/s; 1919/12354 objects degraded (15.533%); 0 B/s, 0 keys/s, 0 objects/s recovering 2026-03-10T07:53:11.060 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:53:10 vm05.local ceph-mon[130117]: from='osd.0 [v2:192.168.123.105:6802/879069133,v1:192.168.123.105:6803/879069133]' entity='osd.0' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["0"]}]: dispatch 2026-03-10T07:53:11.060 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:53:10 vm05.local ceph-mon[130117]: from='osd.0 [v2:192.168.123.105:6802/879069133,v1:192.168.123.105:6803/879069133]' entity='osd.0' cmd='[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["0"]}]': finished 2026-03-10T07:53:11.060 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:53:10 vm05.local ceph-mon[130117]: osdmap e46: 6 total, 5 up, 6 in 2026-03-10T07:53:11.060 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:53:10 vm05.local ceph-mon[130117]: from='osd.0 [v2:192.168.123.105:6802/879069133,v1:192.168.123.105:6803/879069133]' entity='osd.0' cmd=[{"prefix": "osd crush create-or-move", "id": 0, "weight":0.0195, "args": ["host=vm05", "root=default"]}]: dispatch 2026-03-10T07:53:11.060 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:53:10 vm05.local ceph-mon[130117]: pgmap v28: 65 pgs: 34 active+undersized+degraded, 31 active+clean; 305 MiB data, 2.7 GiB used, 117 GiB / 120 GiB avail; 27 KiB/s rd, 1.5 MiB/s wr, 386 op/s; 1919/12354 objects degraded (15.533%); 0 B/s, 0 keys/s, 
0 objects/s recovering 2026-03-10T07:53:11.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:53:10 vm08.local ceph-mon[107898]: from='osd.0 [v2:192.168.123.105:6802/879069133,v1:192.168.123.105:6803/879069133]' entity='osd.0' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["0"]}]: dispatch 2026-03-10T07:53:11.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:53:10 vm08.local ceph-mon[107898]: from='osd.0 [v2:192.168.123.105:6802/879069133,v1:192.168.123.105:6803/879069133]' entity='osd.0' cmd='[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["0"]}]': finished 2026-03-10T07:53:11.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:53:10 vm08.local ceph-mon[107898]: osdmap e46: 6 total, 5 up, 6 in 2026-03-10T07:53:11.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:53:10 vm08.local ceph-mon[107898]: from='osd.0 [v2:192.168.123.105:6802/879069133,v1:192.168.123.105:6803/879069133]' entity='osd.0' cmd=[{"prefix": "osd crush create-or-move", "id": 0, "weight":0.0195, "args": ["host=vm05", "root=default"]}]: dispatch 2026-03-10T07:53:11.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:53:10 vm08.local ceph-mon[107898]: pgmap v28: 65 pgs: 34 active+undersized+degraded, 31 active+clean; 305 MiB data, 2.7 GiB used, 117 GiB / 120 GiB avail; 27 KiB/s rd, 1.5 MiB/s wr, 386 op/s; 1919/12354 objects degraded (15.533%); 0 B/s, 0 keys/s, 0 objects/s recovering 2026-03-10T07:53:11.907 INFO:journalctl@ceph.osd.0.vm05.stdout:Mar 10 07:53:11 vm05.local ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508-osd-0[139106]: 2026-03-10T07:53:11.639+0000 7fb1a9b1b640 -1 osd.0 43 set_numa_affinity unable to identify public interface '' numa node: (2) No such file or directory 2026-03-10T07:53:13.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:53:12 vm05.local ceph-mon[130117]: from='osd.0 [v2:192.168.123.105:6802/879069133,v1:192.168.123.105:6803/879069133]' entity='osd.0' 2026-03-10T07:53:13.157 
INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:53:12 vm05.local ceph-mon[130117]: Health check cleared: OSD_DOWN (was: 1 osds down) 2026-03-10T07:53:13.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:53:12 vm05.local ceph-mon[130117]: osd.0 [v2:192.168.123.105:6802/879069133,v1:192.168.123.105:6803/879069133] boot 2026-03-10T07:53:13.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:53:12 vm05.local ceph-mon[130117]: osdmap e47: 6 total, 6 up, 6 in 2026-03-10T07:53:13.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:53:12 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "osd metadata", "id": 0}]: dispatch 2026-03-10T07:53:13.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:53:12 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:53:13.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:53:12 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T07:53:13.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:53:12 vm08.local ceph-mon[107898]: from='osd.0 [v2:192.168.123.105:6802/879069133,v1:192.168.123.105:6803/879069133]' entity='osd.0' 2026-03-10T07:53:13.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:53:12 vm08.local ceph-mon[107898]: Health check cleared: OSD_DOWN (was: 1 osds down) 2026-03-10T07:53:13.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:53:12 vm08.local ceph-mon[107898]: osd.0 [v2:192.168.123.105:6802/879069133,v1:192.168.123.105:6803/879069133] boot 2026-03-10T07:53:13.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:53:12 vm08.local ceph-mon[107898]: osdmap e47: 6 total, 6 up, 6 in 2026-03-10T07:53:13.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:53:12 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' 
entity='mgr.vm05.blexke' cmd=[{"prefix": "osd metadata", "id": 0}]: dispatch 2026-03-10T07:53:13.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:53:12 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:53:13.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:53:12 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T07:53:13.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:53:13 vm05.local ceph-mon[130117]: pgmap v30: 65 pgs: 34 active+undersized+degraded, 31 active+clean; 305 MiB data, 2.7 GiB used, 117 GiB / 120 GiB avail; 13 KiB/s rd, 776 KiB/s wr, 138 op/s; 1919/12354 objects degraded (15.533%) 2026-03-10T07:53:13.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:53:13 vm05.local ceph-mon[130117]: osdmap e48: 6 total, 6 up, 6 in 2026-03-10T07:53:14.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:53:13 vm08.local ceph-mon[107898]: pgmap v30: 65 pgs: 34 active+undersized+degraded, 31 active+clean; 305 MiB data, 2.7 GiB used, 117 GiB / 120 GiB avail; 13 KiB/s rd, 776 KiB/s wr, 138 op/s; 1919/12354 objects degraded (15.533%) 2026-03-10T07:53:14.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:53:13 vm08.local ceph-mon[107898]: osdmap e48: 6 total, 6 up, 6 in 2026-03-10T07:53:15.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:53:14 vm05.local ceph-mon[130117]: osdmap e49: 6 total, 6 up, 6 in 2026-03-10T07:53:15.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:53:14 vm05.local ceph-mon[130117]: Health check update: Degraded data redundancy: 1919/12354 objects degraded (15.533%), 34 pgs degraded (PG_DEGRADED) 2026-03-10T07:53:15.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:53:14 vm08.local ceph-mon[107898]: osdmap e49: 6 total, 6 up, 6 in 2026-03-10T07:53:15.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:53:14 
vm08.local ceph-mon[107898]: Health check update: Degraded data redundancy: 1919/12354 objects degraded (15.533%), 34 pgs degraded (PG_DEGRADED) 2026-03-10T07:53:16.407 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:53:15 vm05.local ceph-mon[130117]: pgmap v33: 65 pgs: 4 remapped+peering, 1 active+recovering+degraded, 6 activating+degraded, 13 active+recovery_wait+degraded, 9 active+undersized+degraded, 32 active+clean; 300 MiB data, 2.7 GiB used, 117 GiB / 120 GiB avail; 16 KiB/s rd, 914 KiB/s wr, 144 op/s; 1074/11100 objects degraded (9.676%) 2026-03-10T07:53:16.418 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:53:15 vm08.local ceph-mon[107898]: pgmap v33: 65 pgs: 4 remapped+peering, 1 active+recovering+degraded, 6 activating+degraded, 13 active+recovery_wait+degraded, 9 active+undersized+degraded, 32 active+clean; 300 MiB data, 2.7 GiB used, 117 GiB / 120 GiB avail; 16 KiB/s rd, 914 KiB/s wr, 144 op/s; 1074/11100 objects degraded (9.676%) 2026-03-10T07:53:18.407 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:53:18 vm05.local ceph-mon[130117]: pgmap v34: 65 pgs: 4 remapped+peering, 1 active+recovering+degraded, 6 activating+degraded, 13 active+recovery_wait+degraded, 41 active+clean; 296 MiB data, 3.1 GiB used, 117 GiB / 120 GiB avail; 13 KiB/s rd, 758 KiB/s wr, 142 op/s; 298/10689 objects degraded (2.788%); 0 B/s, 2 keys/s, 51 objects/s recovering 2026-03-10T07:53:18.418 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:53:18 vm08.local ceph-mon[107898]: pgmap v34: 65 pgs: 4 remapped+peering, 1 active+recovering+degraded, 6 activating+degraded, 13 active+recovery_wait+degraded, 41 active+clean; 296 MiB data, 3.1 GiB used, 117 GiB / 120 GiB avail; 13 KiB/s rd, 758 KiB/s wr, 142 op/s; 298/10689 objects degraded (2.788%); 0 B/s, 2 keys/s, 51 objects/s recovering 2026-03-10T07:53:20.011 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:20.010+0000 7f53cc97c700 1 -- 192.168.123.105:0/1997458796 >> 
[v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f53c0096aa0 msgr2=0x7f53c009a9b0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:53:20.011 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:20.010+0000 7f53cc97c700 1 --2- 192.168.123.105:0/1997458796 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f53c0096aa0 0x7f53c009a9b0 secure :-1 s=READY pgs=44 cs=0 l=1 rev1=1 crypto rx=0x7f53b4009b00 tx=0x7f53b4009e10 comp rx=0 tx=0).stop 2026-03-10T07:53:20.012 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:20.011+0000 7f53cc97c700 1 -- 192.168.123.105:0/1997458796 shutdown_connections 2026-03-10T07:53:20.012 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:20.011+0000 7f53cc97c700 1 --2- 192.168.123.105:0/1997458796 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f53c0096aa0 0x7f53c009a9b0 unknown :-1 s=CLOSED pgs=44 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:53:20.012 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:20.011+0000 7f53cc97c700 1 --2- 192.168.123.105:0/1997458796 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f53c00960f0 0x7f53c00964d0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:53:20.012 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:20.011+0000 7f53cc97c700 1 -- 192.168.123.105:0/1997458796 >> 192.168.123.105:0/1997458796 conn(0x7f53c000b910 msgr2=0x7f53c000bd20 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:53:20.014 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:20.012+0000 7f53cc97c700 1 -- 192.168.123.105:0/1997458796 shutdown_connections 2026-03-10T07:53:20.014 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:20.013+0000 7f53cc97c700 1 -- 192.168.123.105:0/1997458796 wait complete. 
2026-03-10T07:53:20.014 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:20.013+0000 7f53cc97c700 1 Processor -- start 2026-03-10T07:53:20.014 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:20.013+0000 7f53cc97c700 1 -- start start 2026-03-10T07:53:20.014 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:20.014+0000 7f53cc97c700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f53c00960f0 0x7f53c0132d80 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:53:20.014 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:20.014+0000 7f53cc97c700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f53c0096aa0 0x7f53c01332c0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:53:20.014 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:20.014+0000 7f53cc97c700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f53c0133890 con 0x7f53c0096aa0 2026-03-10T07:53:20.014 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:20.014+0000 7f53cc97c700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f53c01339d0 con 0x7f53c00960f0 2026-03-10T07:53:20.014 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:20.014+0000 7f53c77fe700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f53c00960f0 0x7f53c0132d80 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:53:20.014 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:20.014+0000 7f53c77fe700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f53c00960f0 0x7f53c0132d80 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.108:3300/0 says I am 
v2:192.168.123.105:44022/0 (socket says 192.168.123.105:44022) 2026-03-10T07:53:20.014 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:20.014+0000 7f53c77fe700 1 -- 192.168.123.105:0/562394437 learned_addr learned my addr 192.168.123.105:0/562394437 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T07:53:20.014 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:20.014+0000 7f53c6ffd700 1 --2- 192.168.123.105:0/562394437 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f53c0096aa0 0x7f53c01332c0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:53:20.014 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:20.014+0000 7f53c77fe700 1 -- 192.168.123.105:0/562394437 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f53c0096aa0 msgr2=0x7f53c01332c0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:53:20.014 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:20.014+0000 7f53c77fe700 1 --2- 192.168.123.105:0/562394437 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f53c0096aa0 0x7f53c01332c0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:53:20.015 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:20.014+0000 7f53c77fe700 1 -- 192.168.123.105:0/562394437 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f53b40097e0 con 0x7f53c00960f0 2026-03-10T07:53:20.015 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:20.014+0000 7f53c77fe700 1 --2- 192.168.123.105:0/562394437 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f53c00960f0 0x7f53c0132d80 secure :-1 s=READY pgs=14 cs=0 l=1 rev1=1 crypto rx=0x7f53bc00eb10 tx=0x7f53bc00eed0 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 
2026-03-10T07:53:20.015 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:20.015+0000 7f53c4ff9700 1 -- 192.168.123.105:0/562394437 <== mon.1 v2:192.168.123.108:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f53bc00cca0 con 0x7f53c00960f0 2026-03-10T07:53:20.015 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:20.015+0000 7f53c4ff9700 1 -- 192.168.123.105:0/562394437 <== mon.1 v2:192.168.123.108:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7f53bc00ce00 con 0x7f53c00960f0 2026-03-10T07:53:20.016 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:20.015+0000 7f53c4ff9700 1 -- 192.168.123.105:0/562394437 <== mon.1 v2:192.168.123.108:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f53bc0189c0 con 0x7f53c00960f0 2026-03-10T07:53:20.016 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:20.015+0000 7f53cc97c700 1 -- 192.168.123.105:0/562394437 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f53c013bdf0 con 0x7f53c00960f0 2026-03-10T07:53:20.016 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:20.015+0000 7f53cc97c700 1 -- 192.168.123.105:0/562394437 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f53c013c200 con 0x7f53c00960f0 2026-03-10T07:53:20.016 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:20.016+0000 7f53cc97c700 1 -- 192.168.123.105:0/562394437 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f53c00a3ff0 con 0x7f53c00960f0 2026-03-10T07:53:20.017 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:20.017+0000 7f53c4ff9700 1 -- 192.168.123.105:0/562394437 <== mon.1 v2:192.168.123.108:3300/0 4 ==== mgrmap(e 37) v1 ==== 100217+0+0 (secure 0 0 0) 0x7f53bc018b20 con 0x7f53c00960f0 2026-03-10T07:53:20.018 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:20.017+0000 
7f53c4ff9700 1 --2- 192.168.123.105:0/562394437 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f53b8077840 0x7f53b8079d00 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:53:20.018 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:20.017+0000 7f53c4ff9700 1 -- 192.168.123.105:0/562394437 <== mon.1 v2:192.168.123.108:3300/0 5 ==== osd_map(49..49 src has 1..49) v4 ==== 6252+0+0 (secure 0 0 0) 0x7f53bc014070 con 0x7f53c00960f0 2026-03-10T07:53:20.018 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:20.018+0000 7f53c6ffd700 1 --2- 192.168.123.105:0/562394437 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f53b8077840 0x7f53b8079d00 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:53:20.020 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:20.020+0000 7f53c4ff9700 1 -- 192.168.123.105:0/562394437 <== mon.1 v2:192.168.123.108:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f53bc062eb0 con 0x7f53c00960f0 2026-03-10T07:53:20.022 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:20.021+0000 7f53c6ffd700 1 --2- 192.168.123.105:0/562394437 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f53b8077840 0x7f53b8079d00 secure :-1 s=READY pgs=28 cs=0 l=1 rev1=1 crypto rx=0x7f53b4006010 tx=0x7f53b400b540 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:53:20.166 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:20.165+0000 7f53cc97c700 1 -- 192.168.123.105:0/562394437 --> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7f53c0004850 con 0x7f53b8077840 
2026-03-10T07:53:20.169 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:20.168+0000 7f53c4ff9700 1 -- 192.168.123.105:0/562394437 <== mgr.34104 v2:192.168.123.105:6800/627686298 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+400 (secure 0 0 0) 0x7f53c0004850 con 0x7f53b8077840 2026-03-10T07:53:20.174 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:20.174+0000 7f53b27fc700 1 -- 192.168.123.105:0/562394437 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f53b8077840 msgr2=0x7f53b8079d00 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:53:20.174 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:20.174+0000 7f53b27fc700 1 --2- 192.168.123.105:0/562394437 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f53b8077840 0x7f53b8079d00 secure :-1 s=READY pgs=28 cs=0 l=1 rev1=1 crypto rx=0x7f53b4006010 tx=0x7f53b400b540 comp rx=0 tx=0).stop 2026-03-10T07:53:20.175 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:20.174+0000 7f53b27fc700 1 -- 192.168.123.105:0/562394437 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f53c00960f0 msgr2=0x7f53c0132d80 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:53:20.175 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:20.174+0000 7f53b27fc700 1 --2- 192.168.123.105:0/562394437 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f53c00960f0 0x7f53c0132d80 secure :-1 s=READY pgs=14 cs=0 l=1 rev1=1 crypto rx=0x7f53bc00eb10 tx=0x7f53bc00eed0 comp rx=0 tx=0).stop 2026-03-10T07:53:20.175 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:20.174+0000 7f53b27fc700 1 -- 192.168.123.105:0/562394437 shutdown_connections 2026-03-10T07:53:20.175 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:20.174+0000 7f53b27fc700 1 --2- 192.168.123.105:0/562394437 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f53c00960f0 0x7f53c0132d80 
unknown :-1 s=CLOSED pgs=14 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:53:20.175 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:20.174+0000 7f53b27fc700 1 --2- 192.168.123.105:0/562394437 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f53b8077840 0x7f53b8079d00 unknown :-1 s=CLOSED pgs=28 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:53:20.175 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:20.174+0000 7f53b27fc700 1 --2- 192.168.123.105:0/562394437 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f53c0096aa0 0x7f53c01332c0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:53:20.175 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:20.174+0000 7f53b27fc700 1 -- 192.168.123.105:0/562394437 >> 192.168.123.105:0/562394437 conn(0x7f53c000b910 msgr2=0x7f53c009a220 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:53:20.175 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:20.174+0000 7f53b27fc700 1 -- 192.168.123.105:0/562394437 shutdown_connections 2026-03-10T07:53:20.175 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:20.174+0000 7f53b27fc700 1 -- 192.168.123.105:0/562394437 wait complete. 
2026-03-10T07:53:20.192 INFO:teuthology.orchestra.run.vm05.stdout:true 2026-03-10T07:53:20.281 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:20.281+0000 7fddafc73700 1 -- 192.168.123.105:0/208417653 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fdda810f420 msgr2=0x7fdda810f800 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:53:20.281 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:20.281+0000 7fddafc73700 1 --2- 192.168.123.105:0/208417653 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fdda810f420 0x7fdda810f800 secure :-1 s=READY pgs=15 cs=0 l=1 rev1=1 crypto rx=0x7fdda000b3a0 tx=0x7fdda000b6b0 comp rx=0 tx=0).stop 2026-03-10T07:53:20.281 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:20.281+0000 7fddafc73700 1 -- 192.168.123.105:0/208417653 shutdown_connections 2026-03-10T07:53:20.281 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:20.281+0000 7fddafc73700 1 --2- 192.168.123.105:0/208417653 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fdda8107d90 0x7fdda8108210 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:53:20.281 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:20.281+0000 7fddafc73700 1 --2- 192.168.123.105:0/208417653 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fdda810f420 0x7fdda810f800 unknown :-1 s=CLOSED pgs=15 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:53:20.282 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:20.281+0000 7fddafc73700 1 -- 192.168.123.105:0/208417653 >> 192.168.123.105:0/208417653 conn(0x7fdda806ce20 msgr2=0x7fdda806d230 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:53:20.282 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:20.281+0000 7fddafc73700 1 -- 192.168.123.105:0/208417653 shutdown_connections 2026-03-10T07:53:20.282 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:20.281+0000 7fddafc73700 1 -- 192.168.123.105:0/208417653 wait complete. 2026-03-10T07:53:20.282 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:20.281+0000 7fddafc73700 1 Processor -- start 2026-03-10T07:53:20.282 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:20.281+0000 7fddafc73700 1 -- start start 2026-03-10T07:53:20.282 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:20.282+0000 7fddafc73700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fdda8107d90 0x7fdda81a56f0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:53:20.282 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:20.282+0000 7fddafc73700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fdda810f420 0x7fdda81a5c30 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:53:20.282 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:20.282+0000 7fddafc73700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fdda81a62c0 con 0x7fdda8107d90 2026-03-10T07:53:20.282 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:20.282+0000 7fddafc73700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fdda81a8ff0 con 0x7fdda810f420 2026-03-10T07:53:20.282 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:20.282+0000 7fddada0f700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fdda8107d90 0x7fdda81a56f0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:53:20.282 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:20.282+0000 7fddada0f700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fdda8107d90 0x7fdda81a56f0 unknown :-1 
s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.105:54010/0 (socket says 192.168.123.105:54010) 2026-03-10T07:53:20.282 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:20.282+0000 7fddada0f700 1 -- 192.168.123.105:0/2818713380 learned_addr learned my addr 192.168.123.105:0/2818713380 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T07:53:20.282 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:20.282+0000 7fddad20e700 1 --2- 192.168.123.105:0/2818713380 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fdda810f420 0x7fdda81a5c30 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:53:20.282 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:20.282+0000 7fddad20e700 1 -- 192.168.123.105:0/2818713380 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fdda8107d90 msgr2=0x7fdda81a56f0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:53:20.282 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:20.282+0000 7fddad20e700 1 --2- 192.168.123.105:0/2818713380 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fdda8107d90 0x7fdda81a56f0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:53:20.283 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:20.282+0000 7fddad20e700 1 -- 192.168.123.105:0/2818713380 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fdda000b050 con 0x7fdda810f420 2026-03-10T07:53:20.283 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:20.282+0000 7fddad20e700 1 --2- 192.168.123.105:0/2818713380 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fdda810f420 0x7fdda81a5c30 secure :-1 s=READY pgs=16 cs=0 l=1 rev1=1 crypto 
rx=0x7fdd9800e9a0 tx=0x7fdd9800ecb0 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:53:20.283 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:20.283+0000 7fdd9effd700 1 -- 192.168.123.105:0/2818713380 <== mon.1 v2:192.168.123.108:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fdd9800cb20 con 0x7fdda810f420 2026-03-10T07:53:20.283 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:20.283+0000 7fdd9effd700 1 -- 192.168.123.105:0/2818713380 <== mon.1 v2:192.168.123.108:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7fdd98004d10 con 0x7fdda810f420 2026-03-10T07:53:20.283 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:20.283+0000 7fddafc73700 1 -- 192.168.123.105:0/2818713380 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fdda81a92d0 con 0x7fdda810f420 2026-03-10T07:53:20.283 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:20.283+0000 7fddafc73700 1 -- 192.168.123.105:0/2818713380 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fdda81a9820 con 0x7fdda810f420 2026-03-10T07:53:20.283 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:20.283+0000 7fdd9effd700 1 -- 192.168.123.105:0/2818713380 <== mon.1 v2:192.168.123.108:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fdd98005710 con 0x7fdda810f420 2026-03-10T07:53:20.285 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:20.284+0000 7fddafc73700 1 -- 192.168.123.105:0/2818713380 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fdda804f2e0 con 0x7fdda810f420 2026-03-10T07:53:20.285 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:20.284+0000 7fdd9effd700 1 -- 192.168.123.105:0/2818713380 <== mon.1 v2:192.168.123.108:3300/0 4 ==== mgrmap(e 37) v1 ==== 100217+0+0 (secure 0 0 0) 
0x7fdd98004e80 con 0x7fdda810f420 2026-03-10T07:53:20.285 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:20.284+0000 7fdd9effd700 1 --2- 192.168.123.105:0/2818713380 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7fdd94077a00 0x7fdd94079ec0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:53:20.285 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:20.285+0000 7fdd9effd700 1 -- 192.168.123.105:0/2818713380 <== mon.1 v2:192.168.123.108:3300/0 5 ==== osd_map(49..49 src has 1..49) v4 ==== 6252+0+0 (secure 0 0 0) 0x7fdd98099c30 con 0x7fdda810f420 2026-03-10T07:53:20.285 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:20.285+0000 7fddada0f700 1 --2- 192.168.123.105:0/2818713380 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7fdd94077a00 0x7fdd94079ec0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:53:20.285 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:20.285+0000 7fddada0f700 1 --2- 192.168.123.105:0/2818713380 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7fdd94077a00 0x7fdd94079ec0 secure :-1 s=READY pgs=29 cs=0 l=1 rev1=1 crypto rx=0x7fdda000ba80 tx=0x7fdda00090d0 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:53:20.287 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:20.286+0000 7fdd9effd700 1 -- 192.168.123.105:0/2818713380 <== mon.1 v2:192.168.123.108:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7fdd98062250 con 0x7fdda810f420 2026-03-10T07:53:20.455 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:20.454+0000 7fddafc73700 1 -- 192.168.123.105:0/2818713380 --> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] 
-- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7fdda81a9b00 con 0x7fdd94077a00 2026-03-10T07:53:20.456 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:20.456+0000 7fdd9effd700 1 -- 192.168.123.105:0/2818713380 <== mgr.34104 v2:192.168.123.105:6800/627686298 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+400 (secure 0 0 0) 0x7fdda81a9b00 con 0x7fdd94077a00 2026-03-10T07:53:20.460 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:20.459+0000 7fdd9cff9700 1 -- 192.168.123.105:0/2818713380 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7fdd94077a00 msgr2=0x7fdd94079ec0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:53:20.460 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:20.459+0000 7fdd9cff9700 1 --2- 192.168.123.105:0/2818713380 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7fdd94077a00 0x7fdd94079ec0 secure :-1 s=READY pgs=29 cs=0 l=1 rev1=1 crypto rx=0x7fdda000ba80 tx=0x7fdda00090d0 comp rx=0 tx=0).stop 2026-03-10T07:53:20.460 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:20.459+0000 7fdd9cff9700 1 -- 192.168.123.105:0/2818713380 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fdda810f420 msgr2=0x7fdda81a5c30 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:53:20.460 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:20.459+0000 7fdd9cff9700 1 --2- 192.168.123.105:0/2818713380 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fdda810f420 0x7fdda81a5c30 secure :-1 s=READY pgs=16 cs=0 l=1 rev1=1 crypto rx=0x7fdd9800e9a0 tx=0x7fdd9800ecb0 comp rx=0 tx=0).stop 2026-03-10T07:53:20.460 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:20.459+0000 7fdd9cff9700 1 -- 192.168.123.105:0/2818713380 shutdown_connections 2026-03-10T07:53:20.460 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:20.459+0000 7fdd9cff9700 1 
--2- 192.168.123.105:0/2818713380 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7fdd94077a00 0x7fdd94079ec0 unknown :-1 s=CLOSED pgs=29 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:53:20.460 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:20.459+0000 7fdd9cff9700 1 --2- 192.168.123.105:0/2818713380 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fdda8107d90 0x7fdda81a56f0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:53:20.460 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:20.459+0000 7fdd9cff9700 1 --2- 192.168.123.105:0/2818713380 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fdda810f420 0x7fdda81a5c30 unknown :-1 s=CLOSED pgs=16 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:53:20.460 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:20.459+0000 7fdd9cff9700 1 -- 192.168.123.105:0/2818713380 >> 192.168.123.105:0/2818713380 conn(0x7fdda806ce20 msgr2=0x7fdda810d150 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:53:20.460 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:20.460+0000 7fdd9cff9700 1 -- 192.168.123.105:0/2818713380 shutdown_connections 2026-03-10T07:53:20.460 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:20.460+0000 7fdd9cff9700 1 -- 192.168.123.105:0/2818713380 wait complete. 
2026-03-10T07:53:20.569 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:20.568+0000 7f5852f42700 1 -- 192.168.123.105:0/3021607164 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f58440a4ca0 msgr2=0x7f58440a5120 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:53:20.569 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:20.568+0000 7f5852f42700 1 --2- 192.168.123.105:0/3021607164 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f58440a4ca0 0x7f58440a5120 secure :-1 s=READY pgs=45 cs=0 l=1 rev1=1 crypto rx=0x7f584c066a30 tx=0x7f584c067220 comp rx=0 tx=0).stop 2026-03-10T07:53:20.569 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:20.568+0000 7f5852f42700 1 -- 192.168.123.105:0/3021607164 shutdown_connections 2026-03-10T07:53:20.569 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:20.568+0000 7f5852f42700 1 --2- 192.168.123.105:0/3021607164 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f58440a4ca0 0x7f58440a5120 unknown :-1 s=CLOSED pgs=45 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:53:20.569 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:20.568+0000 7f5852f42700 1 --2- 192.168.123.105:0/3021607164 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f58440aabf0 0x7f58440aafd0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:53:20.569 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:20.568+0000 7f5852f42700 1 -- 192.168.123.105:0/3021607164 >> 192.168.123.105:0/3021607164 conn(0x7f584401a6f0 msgr2=0x7f584401ab00 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:53:20.569 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:53:20 vm05.local ceph-mon[130117]: pgmap v35: 65 pgs: 4 active+recovery_wait+undersized+degraded+remapped, 14 active+recovery_wait+degraded, 1 active+recovering, 46 active+clean; 301 MiB data, 3.1 GiB used, 117 GiB / 120 GiB avail; 
26 KiB/s rd, 1.6 MiB/s wr, 235 op/s; 392/9429 objects degraded (4.157%); 180 KiB/s, 17 keys/s, 56 objects/s recovering 2026-03-10T07:53:20.569 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:53:20 vm05.local ceph-mon[130117]: Health check update: Degraded data redundancy: 392/9429 objects degraded (4.157%), 18 pgs degraded (PG_DEGRADED) 2026-03-10T07:53:20.569 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:20.568+0000 7f5852f42700 1 -- 192.168.123.105:0/3021607164 shutdown_connections 2026-03-10T07:53:20.569 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:20.568+0000 7f5852f42700 1 -- 192.168.123.105:0/3021607164 wait complete. 2026-03-10T07:53:20.569 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:20.569+0000 7f5852f42700 1 Processor -- start 2026-03-10T07:53:20.569 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:20.569+0000 7f5852f42700 1 -- start start 2026-03-10T07:53:20.570 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:20.569+0000 7f5852f42700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f58440aabf0 0x7f58440d0b90 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:53:20.570 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:20.569+0000 7f5852f42700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f58440d10d0 0x7f58440d1550 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:53:20.570 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:20.569+0000 7f5852f42700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f58440117a0 con 0x7f58440d10d0 2026-03-10T07:53:20.570 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:20.569+0000 7f5852f42700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f5844011910 con 0x7f58440aabf0 2026-03-10T07:53:20.570 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:20.569+0000 7f5851f40700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f58440aabf0 0x7f58440d0b90 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:53:20.570 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:20.569+0000 7f5851f40700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f58440aabf0 0x7f58440d0b90 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.108:3300/0 says I am v2:192.168.123.105:44042/0 (socket says 192.168.123.105:44042) 2026-03-10T07:53:20.570 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:20.569+0000 7f5851f40700 1 -- 192.168.123.105:0/2506991046 learned_addr learned my addr 192.168.123.105:0/2506991046 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T07:53:20.571 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:20.571+0000 7f585173f700 1 --2- 192.168.123.105:0/2506991046 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f58440d10d0 0x7f58440d1550 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:53:20.571 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:20.571+0000 7f5851f40700 1 -- 192.168.123.105:0/2506991046 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f58440d10d0 msgr2=0x7f58440d1550 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:53:20.571 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:20.571+0000 7f5851f40700 1 --2- 192.168.123.105:0/2506991046 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f58440d10d0 0x7f58440d1550 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:53:20.571 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:20.571+0000 7f5851f40700 1 -- 192.168.123.105:0/2506991046 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f584c067090 con 0x7f58440aabf0 2026-03-10T07:53:20.571 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:20.571+0000 7f5851f40700 1 --2- 192.168.123.105:0/2506991046 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f58440aabf0 0x7f58440d0b90 secure :-1 s=READY pgs=17 cs=0 l=1 rev1=1 crypto rx=0x7f584800c8a0 tx=0x7f584800cc60 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:53:20.571 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:20.571+0000 7f5842ffd700 1 -- 192.168.123.105:0/2506991046 <== mon.1 v2:192.168.123.108:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f584800cea0 con 0x7f58440aabf0 2026-03-10T07:53:20.572 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:20.571+0000 7f5852f42700 1 -- 192.168.123.105:0/2506991046 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f5844011ba0 con 0x7f58440aabf0 2026-03-10T07:53:20.572 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:20.571+0000 7f5852f42700 1 -- 192.168.123.105:0/2506991046 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f58440120f0 con 0x7f58440aabf0 2026-03-10T07:53:20.572 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:20.572+0000 7f5842ffd700 1 -- 192.168.123.105:0/2506991046 <== mon.1 v2:192.168.123.108:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7f5848004830 con 0x7f58440aabf0 2026-03-10T07:53:20.572 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:20.572+0000 7f5842ffd700 1 -- 192.168.123.105:0/2506991046 <== mon.1 v2:192.168.123.108:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f5848005630 con 
0x7f58440aabf0 2026-03-10T07:53:20.573 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:20.573+0000 7f5842ffd700 1 -- 192.168.123.105:0/2506991046 <== mon.1 v2:192.168.123.108:3300/0 4 ==== mgrmap(e 37) v1 ==== 100217+0+0 (secure 0 0 0) 0x7f5848004e10 con 0x7f58440aabf0 2026-03-10T07:53:20.574 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:20.574+0000 7f5842ffd700 1 --2- 192.168.123.105:0/2506991046 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f5838077a50 0x7f5838079f10 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:53:20.574 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:20.574+0000 7f5842ffd700 1 -- 192.168.123.105:0/2506991046 <== mon.1 v2:192.168.123.108:3300/0 5 ==== osd_map(49..49 src has 1..49) v4 ==== 6252+0+0 (secure 0 0 0) 0x7f5848099820 con 0x7f58440aabf0 2026-03-10T07:53:20.574 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:20.574+0000 7f5852f42700 1 -- 192.168.123.105:0/2506991046 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f5830005320 con 0x7f58440aabf0 2026-03-10T07:53:20.575 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:20.574+0000 7f585173f700 1 --2- 192.168.123.105:0/2506991046 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f5838077a50 0x7f5838079f10 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:53:20.575 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:20.575+0000 7f585173f700 1 --2- 192.168.123.105:0/2506991046 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f5838077a50 0x7f5838079f10 secure :-1 s=READY pgs=30 cs=0 l=1 rev1=1 crypto rx=0x7f584c04ed60 tx=0x7f584c070040 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 
in_seq=0 out_seq=0 2026-03-10T07:53:20.578 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:20.578+0000 7f5842ffd700 1 -- 192.168.123.105:0/2506991046 <== mon.1 v2:192.168.123.108:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f584801d080 con 0x7f58440aabf0 2026-03-10T07:53:20.668 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:53:20 vm08.local ceph-mon[107898]: pgmap v35: 65 pgs: 4 active+recovery_wait+undersized+degraded+remapped, 14 active+recovery_wait+degraded, 1 active+recovering, 46 active+clean; 301 MiB data, 3.1 GiB used, 117 GiB / 120 GiB avail; 26 KiB/s rd, 1.6 MiB/s wr, 235 op/s; 392/9429 objects degraded (4.157%); 180 KiB/s, 17 keys/s, 56 objects/s recovering 2026-03-10T07:53:20.668 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:53:20 vm08.local ceph-mon[107898]: Health check update: Degraded data redundancy: 392/9429 objects degraded (4.157%), 18 pgs degraded (PG_DEGRADED) 2026-03-10T07:53:20.732 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:20.731+0000 7f5852f42700 1 -- 192.168.123.105:0/2506991046 --> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] -- mgr_command(tid 0: {"prefix": "orch ps", "target": ["mon-mgr", ""]}) v1 -- 0x7f5830000bf0 con 0x7f5838077a50 2026-03-10T07:53:20.737 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:20.736+0000 7f5842ffd700 1 -- 192.168.123.105:0/2506991046 <== mgr.34104 v2:192.168.123.105:6800/627686298 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+3576 (secure 0 0 0) 0x7f5830000bf0 con 0x7f5838077a50 2026-03-10T07:53:20.737 INFO:teuthology.orchestra.run.vm05.stdout:NAME HOST PORTS STATUS REFRESHED AGE MEM USE MEM LIM VERSION IMAGE ID CONTAINER ID 2026-03-10T07:53:20.737 INFO:teuthology.orchestra.run.vm05.stdout:alertmanager.vm05 vm05 *:9093,9094 running (118s) 14s ago 6m 23.0M - 0.25.0 c8568f914cd2 ac15d5f35994 2026-03-10T07:53:20.737 
INFO:teuthology.orchestra.run.vm05.stdout:ceph-exporter.vm05 vm05 running (7m) 14s ago 7m 9680k - 18.2.1 5be31c24972a 26c4db858175 2026-03-10T07:53:20.737 INFO:teuthology.orchestra.run.vm05.stdout:ceph-exporter.vm08 vm08 running (6m) 26s ago 6m 10.4M - 18.2.1 5be31c24972a 209e2398a09c 2026-03-10T07:53:20.737 INFO:teuthology.orchestra.run.vm05.stdout:crash.vm05 vm05 running (29s) 14s ago 7m 7847k - 19.2.3-678-ge911bdeb 654f31e6858e daa831c74cf4 2026-03-10T07:53:20.737 INFO:teuthology.orchestra.run.vm05.stdout:crash.vm08 vm08 running (27s) 26s ago 6m 8262k - 19.2.3-678-ge911bdeb 654f31e6858e 668ac55c9722 2026-03-10T07:53:20.737 INFO:teuthology.orchestra.run.vm05.stdout:grafana.vm05 vm05 *:3000 running (98s) 14s ago 6m 81.3M - 10.4.0 c8b91775d855 6acb529ad951 2026-03-10T07:53:20.737 INFO:teuthology.orchestra.run.vm05.stdout:mds.cephfs.vm05.omfhnh vm05 running (4m) 14s ago 4m 246M - 18.2.1 5be31c24972a e23de179e09c 2026-03-10T07:53:20.737 INFO:teuthology.orchestra.run.vm05.stdout:mds.cephfs.vm05.pavqil vm05 running (4m) 14s ago 4m 16.8M - 18.2.1 5be31c24972a 5b9e5afa214c 2026-03-10T07:53:20.737 INFO:teuthology.orchestra.run.vm05.stdout:mds.cephfs.vm08.dgsaon vm08 running (4m) 26s ago 4m 17.7M - 18.2.1 5be31c24972a 1696aee522b5 2026-03-10T07:53:20.737 INFO:teuthology.orchestra.run.vm05.stdout:mds.cephfs.vm08.ybmbgd vm08 running (4m) 26s ago 4m 16.0M - 18.2.1 5be31c24972a 30b0e51cd2ed 2026-03-10T07:53:20.737 INFO:teuthology.orchestra.run.vm05.stdout:mgr.vm05.blexke vm05 *:8443,9283,8765 running (3m) 14s ago 7m 582M - 19.2.3-678-ge911bdeb 654f31e6858e 5467b6a3bd38 2026-03-10T07:53:20.737 INFO:teuthology.orchestra.run.vm05.stdout:mgr.vm08.orfpog vm08 *:8443,9283,8765 running (2m) 26s ago 6m 534M - 19.2.3-678-ge911bdeb 654f31e6858e 7a15d7811d5b 2026-03-10T07:53:20.737 INFO:teuthology.orchestra.run.vm05.stdout:mon.vm05 vm05 running (62s) 14s ago 7m 57.5M 2048M 19.2.3-678-ge911bdeb 654f31e6858e f02f076bb820 2026-03-10T07:53:20.737 
INFO:teuthology.orchestra.run.vm05.stdout:mon.vm08 vm08 running (45s) 26s ago 6m 48.8M 2048M 19.2.3-678-ge911bdeb 654f31e6858e 73d9a504f360 2026-03-10T07:53:20.737 INFO:teuthology.orchestra.run.vm05.stdout:node-exporter.vm05 vm05 *:9100 running (2m) 14s ago 7m 10.3M - 1.7.0 72c9c2088986 7cd0b23b4118 2026-03-10T07:53:20.737 INFO:teuthology.orchestra.run.vm05.stdout:node-exporter.vm08 vm08 *:9100 running (2m) 26s ago 6m 9.92M - 1.7.0 72c9c2088986 3dd4d91d5881 2026-03-10T07:53:20.737 INFO:teuthology.orchestra.run.vm05.stdout:osd.0 vm05 running (17s) 14s ago 6m 33.6M 4096M 19.2.3-678-ge911bdeb 654f31e6858e b35fccc2a4d5 2026-03-10T07:53:20.737 INFO:teuthology.orchestra.run.vm05.stdout:osd.1 vm05 running (5m) 14s ago 5m 333M 4096M 18.2.1 5be31c24972a 88e0b65b2c93 2026-03-10T07:53:20.737 INFO:teuthology.orchestra.run.vm05.stdout:osd.2 vm05 running (5m) 14s ago 5m 272M 4096M 18.2.1 5be31c24972a b8d3cbeb49f1 2026-03-10T07:53:20.737 INFO:teuthology.orchestra.run.vm05.stdout:osd.3 vm08 running (5m) 26s ago 5m 410M 4096M 18.2.1 5be31c24972a 0a62c54a86c0 2026-03-10T07:53:20.737 INFO:teuthology.orchestra.run.vm05.stdout:osd.4 vm08 running (5m) 26s ago 5m 343M 4096M 18.2.1 5be31c24972a bd748b691ccd 2026-03-10T07:53:20.737 INFO:teuthology.orchestra.run.vm05.stdout:osd.5 vm08 running (5m) 26s ago 5m 284M 4096M 18.2.1 5be31c24972a 9f08820ae98b 2026-03-10T07:53:20.737 INFO:teuthology.orchestra.run.vm05.stdout:prometheus.vm05 vm05 *:9095 running (2m) 14s ago 6m 50.6M - 2.51.0 1d3b7f56885b c59a6be07563 2026-03-10T07:53:20.741 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:20.741+0000 7f5840ff9700 1 -- 192.168.123.105:0/2506991046 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f5838077a50 msgr2=0x7f5838079f10 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:53:20.741 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:20.741+0000 7f5840ff9700 1 --2- 192.168.123.105:0/2506991046 >> 
[v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f5838077a50 0x7f5838079f10 secure :-1 s=READY pgs=30 cs=0 l=1 rev1=1 crypto rx=0x7f584c04ed60 tx=0x7f584c070040 comp rx=0 tx=0).stop 2026-03-10T07:53:20.741 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:20.741+0000 7f5840ff9700 1 -- 192.168.123.105:0/2506991046 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f58440aabf0 msgr2=0x7f58440d0b90 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:53:20.741 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:20.741+0000 7f5840ff9700 1 --2- 192.168.123.105:0/2506991046 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f58440aabf0 0x7f58440d0b90 secure :-1 s=READY pgs=17 cs=0 l=1 rev1=1 crypto rx=0x7f584800c8a0 tx=0x7f584800cc60 comp rx=0 tx=0).stop 2026-03-10T07:53:20.741 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:20.741+0000 7f5840ff9700 1 -- 192.168.123.105:0/2506991046 shutdown_connections 2026-03-10T07:53:20.741 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:20.741+0000 7f5840ff9700 1 --2- 192.168.123.105:0/2506991046 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f58440aabf0 0x7f58440d0b90 unknown :-1 s=CLOSED pgs=17 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:53:20.741 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:20.741+0000 7f5840ff9700 1 --2- 192.168.123.105:0/2506991046 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f5838077a50 0x7f5838079f10 unknown :-1 s=CLOSED pgs=30 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:53:20.741 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:20.741+0000 7f5840ff9700 1 --2- 192.168.123.105:0/2506991046 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f58440d10d0 0x7f58440d1550 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 
2026-03-10T07:53:20.741 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:20.741+0000 7f5840ff9700 1 -- 192.168.123.105:0/2506991046 >> 192.168.123.105:0/2506991046 conn(0x7f584401a6f0 msgr2=0x7f58440a3c00 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:53:20.742 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:20.741+0000 7f5840ff9700 1 -- 192.168.123.105:0/2506991046 shutdown_connections 2026-03-10T07:53:20.742 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:20.741+0000 7f5840ff9700 1 -- 192.168.123.105:0/2506991046 wait complete. 2026-03-10T07:53:20.839 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:20.838+0000 7fc8f1897700 1 -- 192.168.123.105:0/3302840000 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fc8ec1089a0 msgr2=0x7fc8ec10be70 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:53:20.839 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:20.838+0000 7fc8f1897700 1 --2- 192.168.123.105:0/3302840000 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fc8ec1089a0 0x7fc8ec10be70 secure :-1 s=READY pgs=18 cs=0 l=1 rev1=1 crypto rx=0x7fc8e400a390 tx=0x7fc8e400a6a0 comp rx=0 tx=0).stop 2026-03-10T07:53:20.839 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:20.838+0000 7fc8f1897700 1 -- 192.168.123.105:0/3302840000 shutdown_connections 2026-03-10T07:53:20.839 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:20.838+0000 7fc8f1897700 1 --2- 192.168.123.105:0/3302840000 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fc8ec1089a0 0x7fc8ec10be70 unknown :-1 s=CLOSED pgs=18 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:53:20.839 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:20.838+0000 7fc8f1897700 1 --2- 192.168.123.105:0/3302840000 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fc8ec107ff0 0x7fc8ec1083d0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp 
rx=0 tx=0).stop 2026-03-10T07:53:20.839 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:20.838+0000 7fc8f1897700 1 -- 192.168.123.105:0/3302840000 >> 192.168.123.105:0/3302840000 conn(0x7fc8ec06ce20 msgr2=0x7fc8ec06d230 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:53:20.840 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:20.838+0000 7fc8f1897700 1 -- 192.168.123.105:0/3302840000 shutdown_connections 2026-03-10T07:53:20.840 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:20.838+0000 7fc8f1897700 1 -- 192.168.123.105:0/3302840000 wait complete. 2026-03-10T07:53:20.840 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:20.839+0000 7fc8f1897700 1 Processor -- start 2026-03-10T07:53:20.840 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:20.839+0000 7fc8f1897700 1 -- start start 2026-03-10T07:53:20.840 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:20.839+0000 7fc8f1897700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fc8ec107ff0 0x7fc8ec1b7fa0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:53:20.840 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:20.839+0000 7fc8f1897700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fc8ec1b84e0 0x7fc8ec1b8960 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:53:20.840 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:20.839+0000 7fc8f1897700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fc8ec07ee90 con 0x7fc8ec107ff0 2026-03-10T07:53:20.840 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:20.839+0000 7fc8f1897700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fc8ec07f000 con 0x7fc8ec1b84e0 2026-03-10T07:53:20.841 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:20.839+0000 7fc8ea7fc700 
1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fc8ec1b84e0 0x7fc8ec1b8960 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:53:20.841 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:20.839+0000 7fc8ea7fc700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fc8ec1b84e0 0x7fc8ec1b8960 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.108:3300/0 says I am v2:192.168.123.105:44062/0 (socket says 192.168.123.105:44062) 2026-03-10T07:53:20.841 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:20.839+0000 7fc8ea7fc700 1 -- 192.168.123.105:0/3871033246 learned_addr learned my addr 192.168.123.105:0/3871033246 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T07:53:20.841 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:20.841+0000 7fc8eaffd700 1 --2- 192.168.123.105:0/3871033246 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fc8ec107ff0 0x7fc8ec1b7fa0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:53:20.841 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:20.841+0000 7fc8ea7fc700 1 -- 192.168.123.105:0/3871033246 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fc8ec107ff0 msgr2=0x7fc8ec1b7fa0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:53:20.841 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:20.841+0000 7fc8ea7fc700 1 --2- 192.168.123.105:0/3871033246 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fc8ec107ff0 0x7fc8ec1b7fa0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:53:20.841 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:20.841+0000 7fc8ea7fc700 1 -- 
192.168.123.105:0/3871033246 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fc8e400a040 con 0x7fc8ec1b84e0 2026-03-10T07:53:20.841 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:20.841+0000 7fc8ea7fc700 1 --2- 192.168.123.105:0/3871033246 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fc8ec1b84e0 0x7fc8ec1b8960 secure :-1 s=READY pgs=19 cs=0 l=1 rev1=1 crypto rx=0x7fc8e400b280 tx=0x7fc8e400b360 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:53:20.843 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:20.842+0000 7fc8f0895700 1 -- 192.168.123.105:0/3871033246 <== mon.1 v2:192.168.123.108:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fc8e400a710 con 0x7fc8ec1b84e0 2026-03-10T07:53:20.843 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:20.842+0000 7fc8f0895700 1 -- 192.168.123.105:0/3871033246 <== mon.1 v2:192.168.123.108:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7fc8e4018070 con 0x7fc8ec1b84e0 2026-03-10T07:53:20.843 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:20.843+0000 7fc8f0895700 1 -- 192.168.123.105:0/3871033246 <== mon.1 v2:192.168.123.108:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fc8e4014620 con 0x7fc8ec1b84e0 2026-03-10T07:53:20.844 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:20.843+0000 7fc8f1897700 1 -- 192.168.123.105:0/3871033246 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fc8ec07f230 con 0x7fc8ec1b84e0 2026-03-10T07:53:20.844 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:20.843+0000 7fc8f1897700 1 -- 192.168.123.105:0/3871033246 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fc8ec07f700 con 0x7fc8ec1b84e0 2026-03-10T07:53:20.844 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:20.843+0000 7fc8f1897700 1 -- 192.168.123.105:0/3871033246 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fc8ec04f2e0 con 0x7fc8ec1b84e0 2026-03-10T07:53:20.846 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:20.845+0000 7fc8f0895700 1 -- 192.168.123.105:0/3871033246 <== mon.1 v2:192.168.123.108:3300/0 4 ==== mgrmap(e 37) v1 ==== 100217+0+0 (secure 0 0 0) 0x7fc8e401a030 con 0x7fc8ec1b84e0 2026-03-10T07:53:20.846 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:20.845+0000 7fc8f0895700 1 --2- 192.168.123.105:0/3871033246 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7fc8d4077820 0x7fc8d4079ce0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:53:20.846 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:20.845+0000 7fc8f0895700 1 -- 192.168.123.105:0/3871033246 <== mon.1 v2:192.168.123.108:3300/0 5 ==== osd_map(49..49 src has 1..49) v4 ==== 6252+0+0 (secure 0 0 0) 0x7fc8e409a030 con 0x7fc8ec1b84e0 2026-03-10T07:53:20.846 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:20.846+0000 7fc8eaffd700 1 --2- 192.168.123.105:0/3871033246 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7fc8d4077820 0x7fc8d4079ce0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:53:20.847 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:20.847+0000 7fc8f0895700 1 -- 192.168.123.105:0/3871033246 <== mon.1 v2:192.168.123.108:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7fc8e4062760 con 0x7fc8ec1b84e0 2026-03-10T07:53:20.848 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:20.848+0000 7fc8eaffd700 1 --2- 
192.168.123.105:0/3871033246 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7fc8d4077820 0x7fc8d4079ce0 secure :-1 s=READY pgs=31 cs=0 l=1 rev1=1 crypto rx=0x7fc8dc009de0 tx=0x7fc8dc009450 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:53:21.038 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:21.036+0000 7fc8f1897700 1 -- 192.168.123.105:0/3871033246 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "versions"} v 0) v1 -- 0x7fc8ec04ea90 con 0x7fc8ec1b84e0 2026-03-10T07:53:21.038 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:21.038+0000 7fc8f0895700 1 -- 192.168.123.105:0/3871033246 <== mon.1 v2:192.168.123.108:3300/0 7 ==== mon_command_ack([{"prefix": "versions"}]=0 v0) v1 ==== 56+0+799 (secure 0 0 0) 0x7fc8e4061eb0 con 0x7fc8ec1b84e0 2026-03-10T07:53:21.038 INFO:teuthology.orchestra.run.vm05.stdout:{ 2026-03-10T07:53:21.038 INFO:teuthology.orchestra.run.vm05.stdout: "mon": { 2026-03-10T07:53:21.038 INFO:teuthology.orchestra.run.vm05.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 2 2026-03-10T07:53:21.038 INFO:teuthology.orchestra.run.vm05.stdout: }, 2026-03-10T07:53:21.038 INFO:teuthology.orchestra.run.vm05.stdout: "mgr": { 2026-03-10T07:53:21.038 INFO:teuthology.orchestra.run.vm05.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 2 2026-03-10T07:53:21.039 INFO:teuthology.orchestra.run.vm05.stdout: }, 2026-03-10T07:53:21.039 INFO:teuthology.orchestra.run.vm05.stdout: "osd": { 2026-03-10T07:53:21.039 INFO:teuthology.orchestra.run.vm05.stdout: "ceph version 18.2.1 (7fe91d5d5842e04be3b4f514d6dd990c54b29c76) reef (stable)": 5, 2026-03-10T07:53:21.039 INFO:teuthology.orchestra.run.vm05.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 1 2026-03-10T07:53:21.039 
INFO:teuthology.orchestra.run.vm05.stdout: }, 2026-03-10T07:53:21.039 INFO:teuthology.orchestra.run.vm05.stdout: "mds": { 2026-03-10T07:53:21.039 INFO:teuthology.orchestra.run.vm05.stdout: "ceph version 18.2.1 (7fe91d5d5842e04be3b4f514d6dd990c54b29c76) reef (stable)": 4 2026-03-10T07:53:21.039 INFO:teuthology.orchestra.run.vm05.stdout: }, 2026-03-10T07:53:21.039 INFO:teuthology.orchestra.run.vm05.stdout: "overall": { 2026-03-10T07:53:21.039 INFO:teuthology.orchestra.run.vm05.stdout: "ceph version 18.2.1 (7fe91d5d5842e04be3b4f514d6dd990c54b29c76) reef (stable)": 9, 2026-03-10T07:53:21.039 INFO:teuthology.orchestra.run.vm05.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 5 2026-03-10T07:53:21.039 INFO:teuthology.orchestra.run.vm05.stdout: } 2026-03-10T07:53:21.039 INFO:teuthology.orchestra.run.vm05.stdout:} 2026-03-10T07:53:21.043 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:21.043+0000 7fc8f1897700 1 -- 192.168.123.105:0/3871033246 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7fc8d4077820 msgr2=0x7fc8d4079ce0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:53:21.043 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:21.043+0000 7fc8f1897700 1 --2- 192.168.123.105:0/3871033246 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7fc8d4077820 0x7fc8d4079ce0 secure :-1 s=READY pgs=31 cs=0 l=1 rev1=1 crypto rx=0x7fc8dc009de0 tx=0x7fc8dc009450 comp rx=0 tx=0).stop 2026-03-10T07:53:21.044 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:21.043+0000 7fc8f1897700 1 -- 192.168.123.105:0/3871033246 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fc8ec1b84e0 msgr2=0x7fc8ec1b8960 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:53:21.044 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:21.043+0000 7fc8f1897700 1 --2- 192.168.123.105:0/3871033246 >> 
[v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fc8ec1b84e0 0x7fc8ec1b8960 secure :-1 s=READY pgs=19 cs=0 l=1 rev1=1 crypto rx=0x7fc8e400b280 tx=0x7fc8e400b360 comp rx=0 tx=0).stop 2026-03-10T07:53:21.044 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:21.043+0000 7fc8f1897700 1 -- 192.168.123.105:0/3871033246 shutdown_connections 2026-03-10T07:53:21.044 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:21.043+0000 7fc8f1897700 1 --2- 192.168.123.105:0/3871033246 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7fc8d4077820 0x7fc8d4079ce0 unknown :-1 s=CLOSED pgs=31 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:53:21.044 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:21.043+0000 7fc8f1897700 1 --2- 192.168.123.105:0/3871033246 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fc8ec107ff0 0x7fc8ec1b7fa0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:53:21.044 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:21.043+0000 7fc8f1897700 1 --2- 192.168.123.105:0/3871033246 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fc8ec1b84e0 0x7fc8ec1b8960 unknown :-1 s=CLOSED pgs=19 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:53:21.044 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:21.044+0000 7fc8f1897700 1 -- 192.168.123.105:0/3871033246 >> 192.168.123.105:0/3871033246 conn(0x7fc8ec06ce20 msgr2=0x7fc8ec0705f0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:53:21.045 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:21.044+0000 7fc8f1897700 1 -- 192.168.123.105:0/3871033246 shutdown_connections 2026-03-10T07:53:21.045 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:21.045+0000 7fc8f1897700 1 -- 192.168.123.105:0/3871033246 wait complete. 
2026-03-10T07:53:21.124 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:21.122+0000 7fb160026700 1 -- 192.168.123.105:0/3196989769 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb158107ff0 msgr2=0x7fb1581083d0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:53:21.125 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:21.122+0000 7fb160026700 1 --2- 192.168.123.105:0/3196989769 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb158107ff0 0x7fb1581083d0 secure :-1 s=READY pgs=46 cs=0 l=1 rev1=1 crypto rx=0x7fb154007780 tx=0x7fb15400c050 comp rx=0 tx=0).stop 2026-03-10T07:53:21.125 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:21.124+0000 7fb160026700 1 -- 192.168.123.105:0/3196989769 shutdown_connections 2026-03-10T07:53:21.125 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:21.125+0000 7fb160026700 1 --2- 192.168.123.105:0/3196989769 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fb1581089a0 0x7fb15810be70 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:53:21.125 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:21.125+0000 7fb160026700 1 --2- 192.168.123.105:0/3196989769 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb158107ff0 0x7fb1581083d0 unknown :-1 s=CLOSED pgs=46 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:53:21.125 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:21.125+0000 7fb160026700 1 -- 192.168.123.105:0/3196989769 >> 192.168.123.105:0/3196989769 conn(0x7fb15806ce20 msgr2=0x7fb15806d230 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:53:21.126 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:21.125+0000 7fb160026700 1 -- 192.168.123.105:0/3196989769 shutdown_connections 2026-03-10T07:53:21.126 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:21.125+0000 7fb160026700 1 -- 192.168.123.105:0/3196989769 
wait complete. 2026-03-10T07:53:21.126 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:21.125+0000 7fb160026700 1 Processor -- start 2026-03-10T07:53:21.126 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:21.125+0000 7fb160026700 1 -- start start 2026-03-10T07:53:21.127 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:21.125+0000 7fb160026700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fb1581089a0 0x7fb1581331a0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:53:21.127 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:21.126+0000 7fb160026700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb1581336e0 0x7fb158133b60 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:53:21.127 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:21.126+0000 7fb160026700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fb15807ef10 con 0x7fb1581336e0 2026-03-10T07:53:21.128 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:21.126+0000 7fb160026700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fb15807f080 con 0x7fb1581089a0 2026-03-10T07:53:21.128 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:21.126+0000 7fb15d5c1700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb1581336e0 0x7fb158133b60 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:53:21.128 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:21.126+0000 7fb15d5c1700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb1581336e0 0x7fb158133b60 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 
says I am v2:192.168.123.105:54086/0 (socket says 192.168.123.105:54086) 2026-03-10T07:53:21.128 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:21.126+0000 7fb15d5c1700 1 -- 192.168.123.105:0/3579419994 learned_addr learned my addr 192.168.123.105:0/3579419994 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T07:53:21.128 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:21.126+0000 7fb15ddc2700 1 --2- 192.168.123.105:0/3579419994 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fb1581089a0 0x7fb1581331a0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:53:21.128 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:21.126+0000 7fb15d5c1700 1 -- 192.168.123.105:0/3579419994 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fb1581089a0 msgr2=0x7fb1581331a0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:53:21.128 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:21.126+0000 7fb15d5c1700 1 --2- 192.168.123.105:0/3579419994 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fb1581089a0 0x7fb1581331a0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:53:21.128 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:21.126+0000 7fb15d5c1700 1 -- 192.168.123.105:0/3579419994 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fb154007430 con 0x7fb1581336e0 2026-03-10T07:53:21.128 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:21.126+0000 7fb15d5c1700 1 --2- 192.168.123.105:0/3579419994 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb1581336e0 0x7fb158133b60 secure :-1 s=READY pgs=47 cs=0 l=1 rev1=1 crypto rx=0x7fb150007f00 tx=0x7fb15000d3b0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 
2026-03-10T07:53:21.128 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:21.127+0000 7fb14effd700 1 -- 192.168.123.105:0/3579419994 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fb15000dcf0 con 0x7fb1581336e0 2026-03-10T07:53:21.128 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:21.127+0000 7fb14effd700 1 -- 192.168.123.105:0/3579419994 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7fb15000f040 con 0x7fb1581336e0 2026-03-10T07:53:21.128 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:21.127+0000 7fb14effd700 1 -- 192.168.123.105:0/3579419994 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fb1500127c0 con 0x7fb1581336e0 2026-03-10T07:53:21.129 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:21.127+0000 7fb160026700 1 -- 192.168.123.105:0/3579419994 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fb15807f310 con 0x7fb1581336e0 2026-03-10T07:53:21.129 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:21.127+0000 7fb160026700 1 -- 192.168.123.105:0/3579419994 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fb15807f860 con 0x7fb1581336e0 2026-03-10T07:53:21.129 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:21.127+0000 7fb160026700 1 -- 192.168.123.105:0/3579419994 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fb15804f2e0 con 0x7fb1581336e0 2026-03-10T07:53:21.133 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:21.133+0000 7fb14effd700 1 -- 192.168.123.105:0/3579419994 <== mon.0 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 37) v1 ==== 100217+0+0 (secure 0 0 0) 0x7fb150004ad0 con 0x7fb1581336e0 2026-03-10T07:53:21.133 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:21.133+0000 
7fb14effd700 1 --2- 192.168.123.105:0/3579419994 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7fb144077b00 0x7fb144079fc0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:53:21.133 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:21.133+0000 7fb14effd700 1 -- 192.168.123.105:0/3579419994 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(49..49 src has 1..49) v4 ==== 6252+0+0 (secure 0 0 0) 0x7fb1500996c0 con 0x7fb1581336e0 2026-03-10T07:53:21.134 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:21.134+0000 7fb15ddc2700 1 --2- 192.168.123.105:0/3579419994 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7fb144077b00 0x7fb144079fc0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:53:21.134 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:21.134+0000 7fb14effd700 1 -- 192.168.123.105:0/3579419994 <== mon.0 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7fb150099b40 con 0x7fb1581336e0 2026-03-10T07:53:21.135 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:21.134+0000 7fb15ddc2700 1 --2- 192.168.123.105:0/3579419994 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7fb144077b00 0x7fb144079fc0 secure :-1 s=READY pgs=32 cs=0 l=1 rev1=1 crypto rx=0x7fb154007400 tx=0x7fb15400c450 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:53:21.286 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:21.285+0000 7fb160026700 1 -- 192.168.123.105:0/3579419994 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "fs dump"} v 0) v1 -- 0x7fb15804ea90 con 0x7fb1581336e0 2026-03-10T07:53:21.459 
INFO:teuthology.orchestra.run.vm05.stdout:e12 2026-03-10T07:53:21.459 INFO:teuthology.orchestra.run.vm05.stdout:btime 1970-01-01T00:00:00:000000+0000 2026-03-10T07:53:21.459 INFO:teuthology.orchestra.run.vm05.stdout:enable_multiple, ever_enabled_multiple: 1,1 2026-03-10T07:53:21.459 INFO:teuthology.orchestra.run.vm05.stdout:default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2} 2026-03-10T07:53:21.459 INFO:teuthology.orchestra.run.vm05.stdout:legacy client fscid: 1 2026-03-10T07:53:21.459 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-10T07:53:21.459 INFO:teuthology.orchestra.run.vm05.stdout:Filesystem 'cephfs' (1) 2026-03-10T07:53:21.459 INFO:teuthology.orchestra.run.vm05.stdout:fs_name cephfs 2026-03-10T07:53:21.459 INFO:teuthology.orchestra.run.vm05.stdout:epoch 12 2026-03-10T07:53:21.459 INFO:teuthology.orchestra.run.vm05.stdout:flags 12 joinable allow_snaps allow_multimds_snaps 2026-03-10T07:53:21.459 INFO:teuthology.orchestra.run.vm05.stdout:created 2026-03-10T07:48:24.309293+0000 2026-03-10T07:53:21.459 INFO:teuthology.orchestra.run.vm05.stdout:modified 2026-03-10T07:48:33.425187+0000 2026-03-10T07:53:21.459 INFO:teuthology.orchestra.run.vm05.stdout:tableserver 0 2026-03-10T07:53:21.459 INFO:teuthology.orchestra.run.vm05.stdout:root 0 2026-03-10T07:53:21.459 INFO:teuthology.orchestra.run.vm05.stdout:session_timeout 60 2026-03-10T07:53:21.460 INFO:teuthology.orchestra.run.vm05.stdout:session_autoclose 300 2026-03-10T07:53:21.460 INFO:teuthology.orchestra.run.vm05.stdout:max_file_size 1099511627776 2026-03-10T07:53:21.460 INFO:teuthology.orchestra.run.vm05.stdout:max_xattr_size 65536 2026-03-10T07:53:21.460 INFO:teuthology.orchestra.run.vm05.stdout:required_client_features {} 2026-03-10T07:53:21.460 INFO:teuthology.orchestra.run.vm05.stdout:last_failure 0 
2026-03-10T07:53:21.460 INFO:teuthology.orchestra.run.vm05.stdout:last_failure_osd_epoch 39 2026-03-10T07:53:21.460 INFO:teuthology.orchestra.run.vm05.stdout:compat compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2} 2026-03-10T07:53:21.460 INFO:teuthology.orchestra.run.vm05.stdout:max_mds 1 2026-03-10T07:53:21.460 INFO:teuthology.orchestra.run.vm05.stdout:in 0 2026-03-10T07:53:21.460 INFO:teuthology.orchestra.run.vm05.stdout:up {0=24297} 2026-03-10T07:53:21.460 INFO:teuthology.orchestra.run.vm05.stdout:failed 2026-03-10T07:53:21.460 INFO:teuthology.orchestra.run.vm05.stdout:damaged 2026-03-10T07:53:21.460 INFO:teuthology.orchestra.run.vm05.stdout:stopped 2026-03-10T07:53:21.460 INFO:teuthology.orchestra.run.vm05.stdout:data_pools [3] 2026-03-10T07:53:21.460 INFO:teuthology.orchestra.run.vm05.stdout:metadata_pool 2 2026-03-10T07:53:21.460 INFO:teuthology.orchestra.run.vm05.stdout:inline_data enabled 2026-03-10T07:53:21.460 INFO:teuthology.orchestra.run.vm05.stdout:balancer 2026-03-10T07:53:21.460 INFO:teuthology.orchestra.run.vm05.stdout:bal_rank_mask -1 2026-03-10T07:53:21.460 INFO:teuthology.orchestra.run.vm05.stdout:standby_count_wanted 1 2026-03-10T07:53:21.460 INFO:teuthology.orchestra.run.vm05.stdout:qdb_cluster leader: 0 members: 2026-03-10T07:53:21.460 INFO:teuthology.orchestra.run.vm05.stdout:[mds.cephfs.vm05.omfhnh{0:24297} state up:active seq 5 join_fscid=1 addr [v2:192.168.123.105:6826/723078808,v1:192.168.123.105:6827/723078808] compat {c=[1],r=[1],i=[7ff]}] 2026-03-10T07:53:21.460 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-10T07:53:21.460 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-10T07:53:21.460 INFO:teuthology.orchestra.run.vm05.stdout:Standby daemons: 2026-03-10T07:53:21.460 INFO:teuthology.orchestra.run.vm05.stdout: 
2026-03-10T07:53:21.460 INFO:teuthology.orchestra.run.vm05.stdout:[mds.cephfs.vm05.pavqil{-1:14512} state up:standby seq 2 join_fscid=1 addr [v2:192.168.123.105:6828/427555544,v1:192.168.123.105:6829/427555544] compat {c=[1],r=[1],i=[7ff]}] 2026-03-10T07:53:21.460 INFO:teuthology.orchestra.run.vm05.stdout:[mds.cephfs.vm08.ybmbgd{-1:14524} state up:standby seq 1 join_fscid=1 addr [v2:192.168.123.108:6824/3815585988,v1:192.168.123.108:6825/3815585988] compat {c=[1],r=[1],i=[7ff]}] 2026-03-10T07:53:21.460 INFO:teuthology.orchestra.run.vm05.stdout:[mds.cephfs.vm08.dgsaon{-1:24313} state up:standby seq 2 join_fscid=1 addr [v2:192.168.123.108:6826/2963085185,v1:192.168.123.108:6827/2963085185] compat {c=[1],r=[1],i=[7ff]}] 2026-03-10T07:53:21.460 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:21.455+0000 7fb14effd700 1 -- 192.168.123.105:0/3579419994 <== mon.0 v2:192.168.123.105:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump"}]=0 dumped fsmap epoch 12 v12) v1 ==== 76+0+1915 (secure 0 0 0) 0x7fb150061ce0 con 0x7fb1581336e0 2026-03-10T07:53:21.460 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:21.459+0000 7fb14cff9700 1 -- 192.168.123.105:0/3579419994 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7fb144077b00 msgr2=0x7fb144079fc0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:53:21.460 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:21.459+0000 7fb14cff9700 1 --2- 192.168.123.105:0/3579419994 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7fb144077b00 0x7fb144079fc0 secure :-1 s=READY pgs=32 cs=0 l=1 rev1=1 crypto rx=0x7fb154007400 tx=0x7fb15400c450 comp rx=0 tx=0).stop 2026-03-10T07:53:21.465 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:21.459+0000 7fb14cff9700 1 -- 192.168.123.105:0/3579419994 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb1581336e0 msgr2=0x7fb158133b60 secure :-1 
s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:53:21.465 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:21.459+0000 7fb14cff9700 1 --2- 192.168.123.105:0/3579419994 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb1581336e0 0x7fb158133b60 secure :-1 s=READY pgs=47 cs=0 l=1 rev1=1 crypto rx=0x7fb150007f00 tx=0x7fb15000d3b0 comp rx=0 tx=0).stop 2026-03-10T07:53:21.465 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:21.459+0000 7fb14cff9700 1 -- 192.168.123.105:0/3579419994 shutdown_connections 2026-03-10T07:53:21.465 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:21.459+0000 7fb14cff9700 1 --2- 192.168.123.105:0/3579419994 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fb1581089a0 0x7fb1581331a0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:53:21.465 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:21.459+0000 7fb14cff9700 1 --2- 192.168.123.105:0/3579419994 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7fb144077b00 0x7fb144079fc0 unknown :-1 s=CLOSED pgs=32 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:53:21.465 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:21.460+0000 7fb14cff9700 1 --2- 192.168.123.105:0/3579419994 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb1581336e0 0x7fb158133b60 unknown :-1 s=CLOSED pgs=47 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:53:21.465 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:21.460+0000 7fb14cff9700 1 -- 192.168.123.105:0/3579419994 >> 192.168.123.105:0/3579419994 conn(0x7fb15806ce20 msgr2=0x7fb1580705d0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:53:21.465 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:21.460+0000 7fb14cff9700 1 -- 192.168.123.105:0/3579419994 shutdown_connections 2026-03-10T07:53:21.465 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:21.460+0000 7fb14cff9700 1 -- 192.168.123.105:0/3579419994 wait complete. 2026-03-10T07:53:21.469 INFO:teuthology.orchestra.run.vm05.stderr:dumped fsmap epoch 12 2026-03-10T07:53:21.570 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:21.570+0000 7feea9198700 1 -- 192.168.123.105:0/481978100 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fee9c0981f0 msgr2=0x7fee9c0985d0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:53:21.570 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:21.570+0000 7feea9198700 1 --2- 192.168.123.105:0/481978100 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fee9c0981f0 0x7fee9c0985d0 secure :-1 s=READY pgs=48 cs=0 l=1 rev1=1 crypto rx=0x7fee98009b00 tx=0x7fee98009e10 comp rx=0 tx=0).stop 2026-03-10T07:53:21.570 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:21.570+0000 7feea9198700 1 -- 192.168.123.105:0/481978100 shutdown_connections 2026-03-10T07:53:21.570 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:21.570+0000 7feea9198700 1 --2- 192.168.123.105:0/481978100 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fee9c098ba0 0x7fee9c09cbf0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:53:21.570 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:21.570+0000 7feea9198700 1 --2- 192.168.123.105:0/481978100 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fee9c0981f0 0x7fee9c0985d0 unknown :-1 s=CLOSED pgs=48 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:53:21.570 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:21.570+0000 7feea9198700 1 -- 192.168.123.105:0/481978100 >> 192.168.123.105:0/481978100 conn(0x7fee9c093a80 msgr2=0x7fee9c095ea0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:53:21.570 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:21.570+0000 7feea9198700 1 -- 192.168.123.105:0/481978100 shutdown_connections 2026-03-10T07:53:21.571 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:21.570+0000 7feea9198700 1 -- 192.168.123.105:0/481978100 wait complete. 2026-03-10T07:53:21.571 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:21.571+0000 7feea9198700 1 Processor -- start 2026-03-10T07:53:21.571 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:21.571+0000 7feea9198700 1 -- start start 2026-03-10T07:53:21.571 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:21.571+0000 7feea9198700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fee9c0981f0 0x7fee9c1339a0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:53:21.572 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:21.571+0000 7feea9198700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fee9c098ba0 0x7fee9c133ee0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:53:21.572 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:21.571+0000 7feea9198700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fee9c134570 con 0x7fee9c0981f0 2026-03-10T07:53:21.572 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:21.571+0000 7feea9198700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fee9c12da20 con 0x7fee9c098ba0 2026-03-10T07:53:21.572 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:21.571+0000 7feea3fff700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fee9c0981f0 0x7fee9c1339a0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:53:21.572 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:21.571+0000 7feea3fff700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fee9c0981f0 0x7fee9c1339a0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.105:54110/0 (socket says 192.168.123.105:54110) 2026-03-10T07:53:21.572 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:21.571+0000 7feea3fff700 1 -- 192.168.123.105:0/2067974006 learned_addr learned my addr 192.168.123.105:0/2067974006 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T07:53:21.572 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:21.571+0000 7feea37fe700 1 --2- 192.168.123.105:0/2067974006 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fee9c098ba0 0x7fee9c133ee0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:53:21.572 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:21.572+0000 7feea37fe700 1 -- 192.168.123.105:0/2067974006 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fee9c0981f0 msgr2=0x7fee9c1339a0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:53:21.572 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:21.572+0000 7feea37fe700 1 --2- 192.168.123.105:0/2067974006 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fee9c0981f0 0x7fee9c1339a0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:53:21.572 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:21.572+0000 7feea37fe700 1 -- 192.168.123.105:0/2067974006 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fee980097e0 con 0x7fee9c098ba0 2026-03-10T07:53:21.572 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:21.572+0000 7feea37fe700 1 --2- 192.168.123.105:0/2067974006 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fee9c098ba0 0x7fee9c133ee0 secure :-1 s=READY pgs=20 cs=0 l=1 rev1=1 crypto rx=0x7fee9000ea80 tx=0x7fee9000ed90 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:53:21.572 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:21.572+0000 7feea17fa700 1 -- 192.168.123.105:0/2067974006 <== mon.1 v2:192.168.123.108:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fee9000cb20 con 0x7fee9c098ba0 2026-03-10T07:53:21.573 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:21.572+0000 7feea17fa700 1 -- 192.168.123.105:0/2067974006 <== mon.1 v2:192.168.123.108:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7fee9000cc80 con 0x7fee9c098ba0 2026-03-10T07:53:21.573 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:21.572+0000 7feea17fa700 1 -- 192.168.123.105:0/2067974006 <== mon.1 v2:192.168.123.108:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fee90018890 con 0x7fee9c098ba0 2026-03-10T07:53:21.573 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:21.572+0000 7feea9198700 1 -- 192.168.123.105:0/2067974006 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fee9c12dd00 con 0x7fee9c098ba0 2026-03-10T07:53:21.573 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:21.572+0000 7feea9198700 1 -- 192.168.123.105:0/2067974006 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fee9c12e250 con 0x7fee9c098ba0 2026-03-10T07:53:21.573 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:21.573+0000 7feea9198700 1 -- 192.168.123.105:0/2067974006 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 
0x7fee9c004d90 con 0x7fee9c098ba0 2026-03-10T07:53:21.575 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:21.574+0000 7feea17fa700 1 -- 192.168.123.105:0/2067974006 <== mon.1 v2:192.168.123.108:3300/0 4 ==== mgrmap(e 37) v1 ==== 100217+0+0 (secure 0 0 0) 0x7fee900189f0 con 0x7fee9c098ba0 2026-03-10T07:53:21.575 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:21.574+0000 7feea17fa700 1 --2- 192.168.123.105:0/2067974006 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7fee94077aa0 0x7fee94079f60 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:53:21.575 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:21.574+0000 7feea17fa700 1 -- 192.168.123.105:0/2067974006 <== mon.1 v2:192.168.123.108:3300/0 5 ==== osd_map(49..49 src has 1..49) v4 ==== 6252+0+0 (secure 0 0 0) 0x7fee90014070 con 0x7fee9c098ba0 2026-03-10T07:53:21.575 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:21.575+0000 7feea3fff700 1 --2- 192.168.123.105:0/2067974006 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7fee94077aa0 0x7fee94079f60 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:53:21.575 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:21.575+0000 7feea3fff700 1 --2- 192.168.123.105:0/2067974006 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7fee94077aa0 0x7fee94079f60 secure :-1 s=READY pgs=33 cs=0 l=1 rev1=1 crypto rx=0x7fee98009ad0 tx=0x7fee9800b540 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:53:21.576 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:21.576+0000 7feea17fa700 1 -- 192.168.123.105:0/2067974006 <== mon.1 v2:192.168.123.108:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 
(secure 0 0 0) 0x7fee90062f20 con 0x7fee9c098ba0 2026-03-10T07:53:21.718 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:21.718+0000 7feea9198700 1 -- 192.168.123.105:0/2067974006 --> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7fee9c12e530 con 0x7fee94077aa0 2026-03-10T07:53:21.723 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:21.722+0000 7feea17fa700 1 -- 192.168.123.105:0/2067974006 <== mgr.34104 v2:192.168.123.105:6800/627686298 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+400 (secure 0 0 0) 0x7fee9c12e530 con 0x7fee94077aa0 2026-03-10T07:53:21.723 INFO:teuthology.orchestra.run.vm05.stdout:{ 2026-03-10T07:53:21.723 INFO:teuthology.orchestra.run.vm05.stdout: "target_image": "quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc", 2026-03-10T07:53:21.723 INFO:teuthology.orchestra.run.vm05.stdout: "in_progress": true, 2026-03-10T07:53:21.723 INFO:teuthology.orchestra.run.vm05.stdout: "which": "Upgrading all daemon types on all hosts", 2026-03-10T07:53:21.723 INFO:teuthology.orchestra.run.vm05.stdout: "services_complete": [ 2026-03-10T07:53:21.723 INFO:teuthology.orchestra.run.vm05.stdout: "mgr", 2026-03-10T07:53:21.723 INFO:teuthology.orchestra.run.vm05.stdout: "crash", 2026-03-10T07:53:21.723 INFO:teuthology.orchestra.run.vm05.stdout: "mon" 2026-03-10T07:53:21.723 INFO:teuthology.orchestra.run.vm05.stdout: ], 2026-03-10T07:53:21.723 INFO:teuthology.orchestra.run.vm05.stdout: "progress": "7/23 daemons upgraded", 2026-03-10T07:53:21.723 INFO:teuthology.orchestra.run.vm05.stdout: "message": "Currently upgrading osd daemons", 2026-03-10T07:53:21.723 INFO:teuthology.orchestra.run.vm05.stdout: "is_paused": false 2026-03-10T07:53:21.723 INFO:teuthology.orchestra.run.vm05.stdout:} 2026-03-10T07:53:21.727 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:21.727+0000 
7fee8effd700 1 -- 192.168.123.105:0/2067974006 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7fee94077aa0 msgr2=0x7fee94079f60 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:53:21.727 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:21.727+0000 7fee8effd700 1 --2- 192.168.123.105:0/2067974006 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7fee94077aa0 0x7fee94079f60 secure :-1 s=READY pgs=33 cs=0 l=1 rev1=1 crypto rx=0x7fee98009ad0 tx=0x7fee9800b540 comp rx=0 tx=0).stop 2026-03-10T07:53:21.727 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:21.727+0000 7fee8effd700 1 -- 192.168.123.105:0/2067974006 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fee9c098ba0 msgr2=0x7fee9c133ee0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:53:21.727 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:21.727+0000 7fee8effd700 1 --2- 192.168.123.105:0/2067974006 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fee9c098ba0 0x7fee9c133ee0 secure :-1 s=READY pgs=20 cs=0 l=1 rev1=1 crypto rx=0x7fee9000ea80 tx=0x7fee9000ed90 comp rx=0 tx=0).stop 2026-03-10T07:53:21.731 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:21.730+0000 7fee8effd700 1 -- 192.168.123.105:0/2067974006 shutdown_connections 2026-03-10T07:53:21.747 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:21.730+0000 7fee8effd700 1 --2- 192.168.123.105:0/2067974006 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7fee94077aa0 0x7fee94079f60 unknown :-1 s=CLOSED pgs=33 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:53:21.747 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:21.730+0000 7fee8effd700 1 --2- 192.168.123.105:0/2067974006 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fee9c0981f0 0x7fee9c1339a0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto 
rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:53:21.747 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:21.730+0000 7fee8effd700 1 --2- 192.168.123.105:0/2067974006 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fee9c098ba0 0x7fee9c133ee0 unknown :-1 s=CLOSED pgs=20 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:53:21.747 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:21.730+0000 7fee8effd700 1 -- 192.168.123.105:0/2067974006 >> 192.168.123.105:0/2067974006 conn(0x7fee9c093a80 msgr2=0x7fee9c095ea0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:53:21.747 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:21.731+0000 7fee8effd700 1 -- 192.168.123.105:0/2067974006 shutdown_connections 2026-03-10T07:53:21.747 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:21.731+0000 7fee8effd700 1 -- 192.168.123.105:0/2067974006 wait complete. 2026-03-10T07:53:21.808 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:21.808+0000 7fb7dcd2e700 1 -- 192.168.123.105:0/1627706044 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb7d8107ff0 msgr2=0x7fb7d81083d0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:53:21.808 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:21.808+0000 7fb7dcd2e700 1 --2- 192.168.123.105:0/1627706044 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb7d8107ff0 0x7fb7d81083d0 secure :-1 s=READY pgs=49 cs=0 l=1 rev1=1 crypto rx=0x7fb7c8007780 tx=0x7fb7c800c050 comp rx=0 tx=0).stop 2026-03-10T07:53:21.808 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:21.808+0000 7fb7dcd2e700 1 -- 192.168.123.105:0/1627706044 shutdown_connections 2026-03-10T07:53:21.808 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:21.808+0000 7fb7dcd2e700 1 --2- 192.168.123.105:0/1627706044 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fb7d81089a0 0x7fb7d810be70 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 
rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:53:21.809 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:21.808+0000 7fb7dcd2e700 1 --2- 192.168.123.105:0/1627706044 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb7d8107ff0 0x7fb7d81083d0 unknown :-1 s=CLOSED pgs=49 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:53:21.809 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:21.808+0000 7fb7dcd2e700 1 -- 192.168.123.105:0/1627706044 >> 192.168.123.105:0/1627706044 conn(0x7fb7d806ce20 msgr2=0x7fb7d806d230 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:53:21.809 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:21.808+0000 7fb7dcd2e700 1 -- 192.168.123.105:0/1627706044 shutdown_connections 2026-03-10T07:53:21.809 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:21.808+0000 7fb7dcd2e700 1 -- 192.168.123.105:0/1627706044 wait complete. 2026-03-10T07:53:21.810 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:21.809+0000 7fb7dcd2e700 1 Processor -- start 2026-03-10T07:53:21.810 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:21.809+0000 7fb7dcd2e700 1 -- start start 2026-03-10T07:53:21.810 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:21.809+0000 7fb7dcd2e700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb7d81089a0 0x7fb7d8133260 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:53:21.810 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:21.809+0000 7fb7dcd2e700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fb7d81337a0 0x7fb7d8133c20 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:53:21.810 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:21.809+0000 7fb7dcd2e700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fb7d807ef30 con 
0x7fb7d81089a0 2026-03-10T07:53:21.810 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:21.809+0000 7fb7dcd2e700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fb7d807f0a0 con 0x7fb7d81337a0 2026-03-10T07:53:21.810 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:21.809+0000 7fb7d659c700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb7d81089a0 0x7fb7d8133260 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:53:21.810 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:21.809+0000 7fb7d659c700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb7d81089a0 0x7fb7d8133260 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.105:54128/0 (socket says 192.168.123.105:54128) 2026-03-10T07:53:21.810 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:21.809+0000 7fb7d659c700 1 -- 192.168.123.105:0/3325608134 learned_addr learned my addr 192.168.123.105:0/3325608134 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T07:53:21.810 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:21.810+0000 7fb7d659c700 1 -- 192.168.123.105:0/3325608134 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fb7d81337a0 msgr2=0x7fb7d8133c20 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:53:21.811 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:21.810+0000 7fb7d659c700 1 --2- 192.168.123.105:0/3325608134 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fb7d81337a0 0x7fb7d8133c20 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:53:21.811 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:21.810+0000 7fb7d659c700 1 -- 
192.168.123.105:0/3325608134 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fb7c8007430 con 0x7fb7d81089a0 2026-03-10T07:53:21.811 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:21.810+0000 7fb7d659c700 1 --2- 192.168.123.105:0/3325608134 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb7d81089a0 0x7fb7d8133260 secure :-1 s=READY pgs=50 cs=0 l=1 rev1=1 crypto rx=0x7fb7c8007fd0 tx=0x7fb7c800ca60 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:53:21.811 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:21.811+0000 7fb7c77fe700 1 -- 192.168.123.105:0/3325608134 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fb7c800f050 con 0x7fb7d81089a0 2026-03-10T07:53:21.812 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:21.811+0000 7fb7dcd2e700 1 -- 192.168.123.105:0/3325608134 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fb7d807f2d0 con 0x7fb7d81089a0 2026-03-10T07:53:21.812 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:21.811+0000 7fb7dcd2e700 1 -- 192.168.123.105:0/3325608134 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fb7d807f820 con 0x7fb7d81089a0 2026-03-10T07:53:21.812 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:21.812+0000 7fb7c77fe700 1 -- 192.168.123.105:0/3325608134 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7fb7c800a760 con 0x7fb7d81089a0 2026-03-10T07:53:21.812 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:21.812+0000 7fb7c77fe700 1 -- 192.168.123.105:0/3325608134 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fb7c8008790 con 0x7fb7d81089a0 2026-03-10T07:53:21.813 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:21.813+0000 7fb7c77fe700 1 -- 192.168.123.105:0/3325608134 <== mon.0 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 37) v1 ==== 100217+0+0 (secure 0 0 0) 0x7fb7c801a040 con 0x7fb7d81089a0 2026-03-10T07:53:21.813 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:21.813+0000 7fb7c77fe700 1 --2- 192.168.123.105:0/3325608134 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7fb7c0077b00 0x7fb7c0079fc0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:53:21.814 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:21.813+0000 7fb7d5d9b700 1 --2- 192.168.123.105:0/3325608134 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7fb7c0077b00 0x7fb7c0079fc0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:53:21.814 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:21.814+0000 7fb7c77fe700 1 -- 192.168.123.105:0/3325608134 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(49..49 src has 1..49) v4 ==== 6252+0+0 (secure 0 0 0) 0x7fb7c809a7b0 con 0x7fb7d81089a0 2026-03-10T07:53:21.814 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:21.814+0000 7fb7dcd2e700 1 -- 192.168.123.105:0/3325608134 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fb7b8005320 con 0x7fb7d81089a0 2026-03-10T07:53:21.815 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:21.814+0000 7fb7d5d9b700 1 --2- 192.168.123.105:0/3325608134 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7fb7c0077b00 0x7fb7c0079fc0 secure :-1 s=READY pgs=34 cs=0 l=1 rev1=1 crypto rx=0x7fb7d806a180 tx=0x7fb7d0009250 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:53:21.817 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:21.817+0000 7fb7c77fe700 1 -- 192.168.123.105:0/3325608134 <== mon.0 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7fb7c8062dd0 con 0x7fb7d81089a0 2026-03-10T07:53:21.985 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:21.984+0000 7fb7dcd2e700 1 -- 192.168.123.105:0/3325608134 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "health", "detail": "detail"} v 0) v1 -- 0x7fb7b8005cc0 con 0x7fb7d81089a0 2026-03-10T07:53:22.079 INFO:teuthology.orchestra.run.vm05.stdout:HEALTH_WARN 1 filesystem with deprecated feature inline_data; Degraded data redundancy: 392/9057 objects degraded (4.328%), 18 pgs degraded 2026-03-10T07:53:22.079 INFO:teuthology.orchestra.run.vm05.stdout:[WRN] FS_INLINE_DATA_DEPRECATED: 1 filesystem with deprecated feature inline_data 2026-03-10T07:53:22.079 INFO:teuthology.orchestra.run.vm05.stdout: fs cephfs has deprecated feature inline_data enabled. 
2026-03-10T07:53:22.079 INFO:teuthology.orchestra.run.vm05.stdout:[WRN] PG_DEGRADED: Degraded data redundancy: 392/9057 objects degraded (4.328%), 18 pgs degraded 2026-03-10T07:53:22.079 INFO:teuthology.orchestra.run.vm05.stdout: pg 2.0 is active+recovery_wait+degraded, acting [3,1,0] 2026-03-10T07:53:22.079 INFO:teuthology.orchestra.run.vm05.stdout: pg 2.1 is active+recovery_wait+degraded, acting [2,1,0] 2026-03-10T07:53:22.079 INFO:teuthology.orchestra.run.vm05.stdout: pg 2.5 is active+recovery_wait+undersized+degraded+remapped, acting [3,4] 2026-03-10T07:53:22.079 INFO:teuthology.orchestra.run.vm05.stdout: pg 2.8 is active+recovery_wait+degraded, acting [3,5,0] 2026-03-10T07:53:22.079 INFO:teuthology.orchestra.run.vm05.stdout: pg 2.9 is active+recovery_wait+degraded, acting [1,4,0] 2026-03-10T07:53:22.079 INFO:teuthology.orchestra.run.vm05.stdout: pg 2.e is active+recovery_wait+degraded, acting [2,0,3] 2026-03-10T07:53:22.079 INFO:teuthology.orchestra.run.vm05.stdout: pg 2.10 is active+recovery_wait+degraded, acting [2,1,0] 2026-03-10T07:53:22.079 INFO:teuthology.orchestra.run.vm05.stdout: pg 2.15 is active+recovery_wait+degraded, acting [1,3,0] 2026-03-10T07:53:22.079 INFO:teuthology.orchestra.run.vm05.stdout: pg 2.1d is active+recovery_wait+degraded, acting [3,5,0] 2026-03-10T07:53:22.079 INFO:teuthology.orchestra.run.vm05.stdout: pg 2.1e is active+recovery_wait+degraded, acting [2,0,5] 2026-03-10T07:53:22.079 INFO:teuthology.orchestra.run.vm05.stdout: pg 3.3 is active+recovery_wait+undersized+degraded+remapped, acting [4,3] 2026-03-10T07:53:22.079 INFO:teuthology.orchestra.run.vm05.stdout: pg 3.b is active+recovery_wait+degraded, acting [1,0,4] 2026-03-10T07:53:22.079 INFO:teuthology.orchestra.run.vm05.stdout: pg 3.c is active+recovery_wait+undersized+degraded+remapped, acting [5,3] 2026-03-10T07:53:22.079 INFO:teuthology.orchestra.run.vm05.stdout: pg 3.f is active+recovery_wait+undersized+degraded+remapped, acting [5,3] 2026-03-10T07:53:22.079 
INFO:teuthology.orchestra.run.vm05.stdout: pg 3.10 is active+recovery_wait+degraded, acting [5,0,1]
2026-03-10T07:53:22.079 INFO:teuthology.orchestra.run.vm05.stdout: pg 3.11 is active+recovery_wait+degraded, acting [3,4,0]
2026-03-10T07:53:22.079 INFO:teuthology.orchestra.run.vm05.stdout: pg 3.15 is active+recovery_wait+degraded, acting [3,0,4]
2026-03-10T07:53:22.079 INFO:teuthology.orchestra.run.vm05.stdout: pg 3.18 is active+recovery_wait+degraded, acting [2,0,1]
2026-03-10T07:53:22.079 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:22.076+0000 7fb7c77fe700 1 -- 192.168.123.105:0/3325608134 <== mon.0 v2:192.168.123.105:3300/0 7 ==== mon_command_ack([{"prefix": "health", "detail": "detail"}]=0 v0) v1 ==== 74+0+1537 (secure 0 0 0) 0x7fb7c802a090 con 0x7fb7d81089a0
2026-03-10T07:53:22.080 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:22.079+0000 7fb7c57fa700 1 -- 192.168.123.105:0/3325608134 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7fb7c0077b00 msgr2=0x7fb7c0079fc0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T07:53:22.080 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:22.079+0000 7fb7c57fa700 1 --2- 192.168.123.105:0/3325608134 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7fb7c0077b00 0x7fb7c0079fc0 secure :-1 s=READY pgs=34 cs=0 l=1 rev1=1 crypto rx=0x7fb7d806a180 tx=0x7fb7d0009250 comp rx=0 tx=0).stop
2026-03-10T07:53:22.080 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:22.079+0000 7fb7c57fa700 1 -- 192.168.123.105:0/3325608134 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb7d81089a0 msgr2=0x7fb7d8133260 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T07:53:22.080 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:22.079+0000 7fb7c57fa700 1 --2- 192.168.123.105:0/3325608134 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb7d81089a0 0x7fb7d8133260 secure :-1 s=READY pgs=50 cs=0 l=1 rev1=1 crypto rx=0x7fb7c8007fd0 tx=0x7fb7c800ca60 comp rx=0 tx=0).stop
2026-03-10T07:53:22.080 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:22.079+0000 7fb7c57fa700 1 -- 192.168.123.105:0/3325608134 shutdown_connections
2026-03-10T07:53:22.080 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:22.079+0000 7fb7c57fa700 1 --2- 192.168.123.105:0/3325608134 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7fb7c0077b00 0x7fb7c0079fc0 unknown :-1 s=CLOSED pgs=34 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T07:53:22.080 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:22.080+0000 7fb7c57fa700 1 --2- 192.168.123.105:0/3325608134 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fb7d81089a0 0x7fb7d8133260 unknown :-1 s=CLOSED pgs=50 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T07:53:22.080 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:22.080+0000 7fb7c57fa700 1 --2- 192.168.123.105:0/3325608134 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fb7d81337a0 0x7fb7d8133c20 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T07:53:22.080 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:22.080+0000 7fb7c57fa700 1 -- 192.168.123.105:0/3325608134 >> 192.168.123.105:0/3325608134 conn(0x7fb7d806ce20 msgr2=0x7fb7d8070590 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-10T07:53:22.080 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:22.080+0000 7fb7c57fa700 1 -- 192.168.123.105:0/3325608134 shutdown_connections
2026-03-10T07:53:22.080 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:22.080+0000 7fb7c57fa700 1 -- 192.168.123.105:0/3325608134 wait complete.
2026-03-10T07:53:22.407 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:53:22 vm05.local ceph-mon[130117]: from='client.44127 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch
2026-03-10T07:53:22.407 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:53:22 vm05.local ceph-mon[130117]: from='client.44131 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch
2026-03-10T07:53:22.407 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:53:22 vm05.local ceph-mon[130117]: from='client.44133 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch
2026-03-10T07:53:22.407 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:53:22 vm05.local ceph-mon[130117]: from='client.? 192.168.123.105:0/3871033246' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch
2026-03-10T07:53:22.418 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:53:22 vm08.local ceph-mon[107898]: from='client.44127 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch
2026-03-10T07:53:22.418 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:53:22 vm08.local ceph-mon[107898]: from='client.44131 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch
2026-03-10T07:53:22.418 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:53:22 vm08.local ceph-mon[107898]: from='client.44133 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch
2026-03-10T07:53:22.418 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:53:22 vm08.local ceph-mon[107898]: from='client.? 192.168.123.105:0/3871033246' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch
2026-03-10T07:53:23.407 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:53:23 vm05.local ceph-mon[130117]: pgmap v36: 65 pgs: 4 active+recovery_wait+undersized+degraded+remapped, 14 active+recovery_wait+degraded, 1 active+recovering, 46 active+clean; 300 MiB data, 2.7 GiB used, 117 GiB / 120 GiB avail; 22 KiB/s rd, 1.4 MiB/s wr, 220 op/s; 392/9057 objects degraded (4.328%); 157 KiB/s, 15 keys/s, 49 objects/s recovering
2026-03-10T07:53:23.407 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:53:23 vm05.local ceph-mon[130117]: from='client.? 192.168.123.105:0/3579419994' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch
2026-03-10T07:53:23.407 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:53:23 vm05.local ceph-mon[130117]: from='client.44145 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch
2026-03-10T07:53:23.407 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:53:23 vm05.local ceph-mon[130117]: from='client.? 192.168.123.105:0/3325608134' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch
2026-03-10T07:53:23.418 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:53:23 vm08.local ceph-mon[107898]: pgmap v36: 65 pgs: 4 active+recovery_wait+undersized+degraded+remapped, 14 active+recovery_wait+degraded, 1 active+recovering, 46 active+clean; 300 MiB data, 2.7 GiB used, 117 GiB / 120 GiB avail; 22 KiB/s rd, 1.4 MiB/s wr, 220 op/s; 392/9057 objects degraded (4.328%); 157 KiB/s, 15 keys/s, 49 objects/s recovering
2026-03-10T07:53:23.418 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:53:23 vm08.local ceph-mon[107898]: from='client.? 192.168.123.105:0/3579419994' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch
2026-03-10T07:53:23.418 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:53:23 vm08.local ceph-mon[107898]: from='client.44145 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch
2026-03-10T07:53:23.418 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:53:23 vm08.local ceph-mon[107898]: from='client.? 192.168.123.105:0/3325608134' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch
2026-03-10T07:53:24.407 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:53:24 vm05.local ceph-mon[130117]: pgmap v37: 65 pgs: 4 active+recovery_wait+undersized+degraded+remapped, 14 active+recovery_wait+degraded, 1 active+recovering, 46 active+clean; 300 MiB data, 2.7 GiB used, 117 GiB / 120 GiB avail; 18 KiB/s rd, 1.1 MiB/s wr, 177 op/s; 392/9057 objects degraded (4.328%); 126 KiB/s, 12 keys/s, 39 objects/s recovering
2026-03-10T07:53:24.407 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:53:24 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "osd ok-to-stop", "ids": ["1"], "max": 16}]: dispatch
2026-03-10T07:53:24.407 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:53:24 vm05.local ceph-mon[130117]: from='mon.0 -' entity='mon.' cmd=[{"prefix": "osd ok-to-stop", "ids": ["1"], "max": 16}]: dispatch
2026-03-10T07:53:24.407 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:53:24 vm05.local ceph-mon[130117]: Upgrade: unsafe to stop osd(s) at this time (8 PGs are or would become offline)
2026-03-10T07:53:24.418 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:53:24 vm08.local ceph-mon[107898]: pgmap v37: 65 pgs: 4 active+recovery_wait+undersized+degraded+remapped, 14 active+recovery_wait+degraded, 1 active+recovering, 46 active+clean; 300 MiB data, 2.7 GiB used, 117 GiB / 120 GiB avail; 18 KiB/s rd, 1.1 MiB/s wr, 177 op/s; 392/9057 objects degraded (4.328%); 126 KiB/s, 12 keys/s, 39 objects/s recovering
2026-03-10T07:53:24.418 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:53:24 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "osd ok-to-stop", "ids": ["1"], "max": 16}]: dispatch
2026-03-10T07:53:24.418 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:53:24 vm08.local ceph-mon[107898]: from='mon.0 -' entity='mon.' cmd=[{"prefix": "osd ok-to-stop", "ids": ["1"], "max": 16}]: dispatch
2026-03-10T07:53:24.418 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:53:24 vm08.local ceph-mon[107898]: Upgrade: unsafe to stop osd(s) at this time (8 PGs are or would become offline)
2026-03-10T07:53:25.657 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:53:25 vm05.local ceph-mon[130117]: osdmap e50: 6 total, 6 up, 6 in
2026-03-10T07:53:25.657 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:53:25 vm05.local ceph-mon[130117]: Health check update: Degraded data redundancy: 392/9057 objects degraded (4.328%), 18 pgs degraded (PG_DEGRADED)
2026-03-10T07:53:25.668 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:53:25 vm08.local ceph-mon[107898]: osdmap e50: 6 total, 6 up, 6 in
2026-03-10T07:53:25.668 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:53:25 vm08.local ceph-mon[107898]: Health check update: Degraded data redundancy: 392/9057 objects degraded (4.328%), 18 pgs degraded (PG_DEGRADED)
2026-03-10T07:53:26.257 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:53:26 vm08.local ceph-mon[107898]: pgmap v39: 65 pgs: 4 active+recovery_wait+undersized+degraded+remapped, 5 active+recovery_wait+degraded, 1 active+recovering, 55 active+clean; 285 MiB data, 2.7 GiB used, 117 GiB / 120 GiB avail; 28 KiB/s rd, 1.2 MiB/s wr, 208 op/s; 356/6828 objects degraded (5.214%); 126 KiB/s, 13 keys/s, 43 objects/s recovering
2026-03-10T07:53:26.257 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:53:26 vm08.local ceph-mon[107898]: osdmap e51: 6 total, 6 up, 6 in
2026-03-10T07:53:26.657 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:53:26 vm05.local ceph-mon[130117]: pgmap v39: 65 pgs: 4 active+recovery_wait+undersized+degraded+remapped, 5 active+recovery_wait+degraded, 1 active+recovering, 55 active+clean; 285 MiB data, 2.7 GiB used, 117 GiB / 120 GiB avail; 28 KiB/s rd, 1.2 MiB/s wr, 208 op/s; 356/6828 objects degraded (5.214%); 126 KiB/s, 13 keys/s, 43 objects/s recovering
2026-03-10T07:53:26.657 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:53:26 vm05.local ceph-mon[130117]: osdmap e51: 6 total, 6 up, 6 in
2026-03-10T07:53:27.657 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:53:27 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
2026-03-10T07:53:27.668 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:53:27 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
2026-03-10T07:53:28.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:53:28 vm05.local ceph-mon[130117]: pgmap v41: 65 pgs: 4 active+recovery_wait+undersized+degraded+remapped, 5 active+recovery_wait+degraded, 1 active+recovering, 55 active+clean; 284 MiB data, 2.7 GiB used, 117 GiB / 120 GiB avail; 23 KiB/s rd, 650 KiB/s wr, 158 op/s; 356/6516 objects degraded (5.463%); 0 B/s, 1 keys/s, 4 objects/s recovering
2026-03-10T07:53:28.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:53:28 vm08.local ceph-mon[107898]: pgmap v41: 65 pgs: 4 active+recovery_wait+undersized+degraded+remapped, 5 active+recovery_wait+degraded, 1 active+recovering, 55 active+clean; 284 MiB data, 2.7 GiB used, 117 GiB / 120 GiB avail; 23 KiB/s rd, 650 KiB/s wr, 158 op/s; 356/6516 objects degraded (5.463%); 0 B/s, 1 keys/s, 4 objects/s recovering
2026-03-10T07:53:29.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:53:29 vm05.local ceph-mon[130117]: Health check update: Degraded data redundancy: 356/6516 objects degraded (5.463%), 9 pgs degraded (PG_DEGRADED)
2026-03-10T07:53:29.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:53:29 vm08.local ceph-mon[107898]: Health check update: Degraded data redundancy: 356/6516 objects degraded (5.463%), 9 pgs degraded (PG_DEGRADED)
2026-03-10T07:53:30.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:53:30 vm05.local ceph-mon[130117]: pgmap v42: 65 pgs: 3 active+recovery_wait+undersized+degraded+remapped, 4 active+recovery_wait+degraded, 1 active+recovering, 57 active+clean; 285 MiB data, 2.7 GiB used, 117 GiB / 120 GiB avail; 41 KiB/s rd, 1.6 MiB/s wr, 244 op/s; 310/5496 objects degraded (5.640%); 511 KiB/s, 1 keys/s, 9 objects/s recovering
2026-03-10T07:53:30.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:53:30 vm08.local ceph-mon[107898]: pgmap v42: 65 pgs: 3 active+recovery_wait+undersized+degraded+remapped, 4 active+recovery_wait+degraded, 1 active+recovering, 57 active+clean; 285 MiB data, 2.7 GiB used, 117 GiB / 120 GiB avail; 41 KiB/s rd, 1.6 MiB/s wr, 244 op/s; 310/5496 objects degraded (5.640%); 511 KiB/s, 1 keys/s, 9 objects/s recovering
2026-03-10T07:53:32.402 INFO:tasks.workunit:Stopping ['suites/fsstress.sh'] on client.0...
2026-03-10T07:53:32.402 DEBUG:teuthology.orchestra.run.vm05:> sudo rm -rf -- /home/ubuntu/cephtest/workunits.list.client.0 /home/ubuntu/cephtest/clone.client.0
2026-03-10T07:53:32.876 DEBUG:teuthology.parallel:result is None
2026-03-10T07:53:33.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:53:32 vm05.local ceph-mon[130117]: pgmap v43: 65 pgs: 3 active+recovery_wait+undersized+degraded+remapped, 4 active+recovery_wait+degraded, 1 active+recovering, 57 active+clean; 284 MiB data, 2.7 GiB used, 117 GiB / 120 GiB avail; 41 KiB/s rd, 1.6 MiB/s wr, 258 op/s; 310/5145 objects degraded (6.025%); 511 KiB/s, 1 keys/s, 9 objects/s recovering
2026-03-10T07:53:33.418 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:53:32 vm08.local ceph-mon[107898]: pgmap v43: 65 pgs: 3 active+recovery_wait+undersized+degraded+remapped, 4 active+recovery_wait+degraded, 1 active+recovering, 57 active+clean; 284 MiB data, 2.7 GiB used, 117 GiB / 120 GiB avail; 41 KiB/s rd, 1.6 MiB/s wr, 258 op/s; 310/5145 objects degraded (6.025%); 511 KiB/s, 1 keys/s, 9 objects/s recovering
2026-03-10T07:53:34.147 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:53:33 vm05.local ceph-mon[130117]: pgmap v44: 65 pgs: 3 active+recovery_wait+undersized+degraded+remapped, 4 active+recovery_wait+degraded, 1 active+recovering, 57 active+clean; 284 MiB data, 2.7 GiB used, 117 GiB / 120 GiB avail; 16 KiB/s rd, 950 KiB/s wr, 119 op/s; 310/5145 objects degraded (6.025%); 473 KiB/s, 4 objects/s recovering
2026-03-10T07:53:34.418 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:53:33 vm08.local ceph-mon[107898]: pgmap v44: 65 pgs: 3 active+recovery_wait+undersized+degraded+remapped, 4 active+recovery_wait+degraded, 1 active+recovering, 57 active+clean; 284 MiB data, 2.7 GiB used, 117 GiB / 120 GiB avail; 16 KiB/s rd, 950 KiB/s wr, 119 op/s; 310/5145 objects degraded (6.025%); 473 KiB/s, 4 objects/s recovering
2026-03-10T07:53:36.069 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:53:35 vm08.local ceph-mon[107898]: Health check update: Degraded data redundancy: 310/5145 objects degraded (6.025%), 7 pgs degraded (PG_DEGRADED)
2026-03-10T07:53:36.069 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:53:35 vm08.local ceph-mon[107898]: osdmap e52: 6 total, 6 up, 6 in
2026-03-10T07:53:36.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:53:35 vm05.local ceph-mon[130117]: Health check update: Degraded data redundancy: 310/5145 objects degraded (6.025%), 7 pgs degraded (PG_DEGRADED)
2026-03-10T07:53:36.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:53:35 vm05.local ceph-mon[130117]: osdmap e52: 6 total, 6 up, 6 in
2026-03-10T07:53:37.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:53:36 vm05.local ceph-mon[130117]: pgmap v46: 65 pgs: 2 active+recovery_wait+undersized+degraded+remapped, 1 active+recovering+undersized+degraded+remapped, 4 active+recovery_wait+degraded, 58 active+clean; 279 MiB data, 2.8 GiB used, 117 GiB / 120 GiB avail; 31 KiB/s rd, 1.5 MiB/s wr, 176 op/s; 267/3942 objects degraded (6.773%); 430 KiB/s, 8 objects/s recovering
2026-03-10T07:53:37.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:53:36 vm05.local ceph-mon[130117]: osdmap e53: 6 total, 6 up, 6 in
2026-03-10T07:53:37.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:53:36 vm08.local ceph-mon[107898]: pgmap v46: 65 pgs: 2 active+recovery_wait+undersized+degraded+remapped, 1 active+recovering+undersized+degraded+remapped, 4 active+recovery_wait+degraded, 58 active+clean; 279 MiB data, 2.8 GiB used, 117 GiB / 120 GiB avail; 31 KiB/s rd, 1.5 MiB/s wr, 176 op/s; 267/3942 objects degraded (6.773%); 430 KiB/s, 8 objects/s recovering
2026-03-10T07:53:37.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:53:36 vm08.local ceph-mon[107898]: osdmap e53: 6 total, 6 up, 6 in
2026-03-10T07:53:38.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:53:37 vm05.local ceph-mon[130117]: pgmap v48: 65 pgs: 2 active+recovery_wait+undersized+degraded+remapped, 1 active+recovering+undersized+degraded+remapped, 4 active+recovery_wait+degraded, 58 active+clean; 273 MiB data, 2.8 GiB used, 117 GiB / 120 GiB avail; 20 KiB/s rd, 808 KiB/s wr, 124 op/s; 267/3564 objects degraded (7.492%); 0 B/s, 5 objects/s recovering
2026-03-10T07:53:38.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:53:37 vm08.local ceph-mon[107898]: pgmap v48: 65 pgs: 2 active+recovery_wait+undersized+degraded+remapped, 1 active+recovering+undersized+degraded+remapped, 4 active+recovery_wait+degraded, 58 active+clean; 273 MiB data, 2.8 GiB used, 117 GiB / 120 GiB avail; 20 KiB/s rd, 808 KiB/s wr, 124 op/s; 267/3564 objects degraded (7.492%); 0 B/s, 5 objects/s recovering
2026-03-10T07:53:39.156 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:53:38 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "osd ok-to-stop", "ids": ["1"], "max": 16}]: dispatch
2026-03-10T07:53:39.156 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:53:38 vm08.local ceph-mon[107898]: from='mon.0 -' entity='mon.' cmd=[{"prefix": "osd ok-to-stop", "ids": ["1"], "max": 16}]: dispatch
2026-03-10T07:53:39.156 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:53:38 vm08.local ceph-mon[107898]: Upgrade: unsafe to stop osd(s) at this time (2 PGs are or would become offline)
2026-03-10T07:53:39.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:53:38 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "osd ok-to-stop", "ids": ["1"], "max": 16}]: dispatch
2026-03-10T07:53:39.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:53:38 vm05.local ceph-mon[130117]: from='mon.0 -' entity='mon.' cmd=[{"prefix": "osd ok-to-stop", "ids": ["1"], "max": 16}]: dispatch
2026-03-10T07:53:39.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:53:38 vm05.local ceph-mon[130117]: Upgrade: unsafe to stop osd(s) at this time (2 PGs are or would become offline)
2026-03-10T07:53:39.436 INFO:tasks.workunit:Stopping ['suites/fsstress.sh'] on client.1...
2026-03-10T07:53:39.436 DEBUG:teuthology.orchestra.run.vm08:> sudo rm -rf -- /home/ubuntu/cephtest/workunits.list.client.1 /home/ubuntu/cephtest/clone.client.1
2026-03-10T07:53:39.854 DEBUG:teuthology.parallel:result is None
2026-03-10T07:53:39.854 DEBUG:teuthology.orchestra.run.vm05:> sudo rm -rf -- /home/ubuntu/cephtest/mnt.0/client.0
2026-03-10T07:53:39.891 INFO:tasks.workunit:Deleted dir /home/ubuntu/cephtest/mnt.0/client.0
2026-03-10T07:53:39.891 DEBUG:teuthology.orchestra.run.vm08:> sudo rm -rf -- /home/ubuntu/cephtest/mnt.1/client.1
2026-03-10T07:53:39.940 INFO:tasks.workunit:Deleted dir /home/ubuntu/cephtest/mnt.1/client.1
2026-03-10T07:53:39.940 DEBUG:teuthology.parallel:result is None
2026-03-10T07:53:40.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:53:39 vm05.local ceph-mon[130117]: pgmap v49: 65 pgs: 2 active+recovery_wait+undersized+degraded+remapped, 3 active+recovery_wait+degraded, 1 active+recovering, 59 active+clean; 271 MiB data, 2.8 GiB used, 117 GiB / 120 GiB avail; 34 KiB/s rd, 1.4 MiB/s wr, 230 op/s; 217/2091 objects degraded (10.378%); 0 B/s, 10 objects/s recovering
2026-03-10T07:53:40.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:53:39 vm05.local ceph-mon[130117]: Health check update: Degraded data redundancy: 217/2091 objects degraded (10.378%), 5 pgs degraded (PG_DEGRADED)
2026-03-10T07:53:40.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:53:39 vm08.local ceph-mon[107898]: pgmap v49: 65 pgs: 2 active+recovery_wait+undersized+degraded+remapped, 3 active+recovery_wait+degraded, 1 active+recovering, 59 active+clean; 271 MiB data, 2.8 GiB used, 117 GiB / 120 GiB avail; 34 KiB/s rd, 1.4 MiB/s wr, 230 op/s; 217/2091 objects degraded (10.378%); 0 B/s, 10 objects/s recovering
2026-03-10T07:53:40.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:53:39 vm08.local ceph-mon[107898]: Health check update: Degraded data redundancy: 217/2091 objects degraded (10.378%), 5 pgs degraded (PG_DEGRADED)
2026-03-10T07:53:42.407 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:53:42 vm05.local ceph-mon[130117]: pgmap v50: 65 pgs: 2 active+recovery_wait+undersized+degraded+remapped, 3 active+recovery_wait+degraded, 1 active+recovering, 59 active+clean; 274 MiB data, 2.8 GiB used, 117 GiB / 120 GiB avail; 34 KiB/s rd, 1.8 MiB/s wr, 261 op/s; 217/1818 objects degraded (11.936%); 0 B/s, 10 objects/s recovering
2026-03-10T07:53:42.407 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:53:42 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
2026-03-10T07:53:42.418 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:53:42 vm08.local ceph-mon[107898]: pgmap v50: 65 pgs: 2 active+recovery_wait+undersized+degraded+remapped, 3 active+recovery_wait+degraded, 1 active+recovering, 59 active+clean; 274 MiB data, 2.8 GiB used, 117 GiB / 120 GiB avail; 34 KiB/s rd, 1.8 MiB/s wr, 261 op/s; 217/1818 objects degraded (11.936%); 0 B/s, 10 objects/s recovering
2026-03-10T07:53:42.418 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:53:42 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
2026-03-10T07:53:44.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:53:44 vm05.local ceph-mon[130117]: pgmap v51: 65 pgs: 2 active+recovery_wait+undersized+degraded+remapped, 3 active+recovery_wait+degraded, 1 active+recovering, 59 active+clean; 274 MiB data, 2.8 GiB used, 117 GiB / 120 GiB avail; 14 KiB/s rd, 967 KiB/s wr, 159 op/s; 217/1818 objects degraded (11.936%); 0 B/s, 5 objects/s recovering
2026-03-10T07:53:44.418 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:53:44 vm08.local ceph-mon[107898]: pgmap v51: 65 pgs: 2 active+recovery_wait+undersized+degraded+remapped, 3 active+recovery_wait+degraded, 1 active+recovering, 59 active+clean; 274 MiB data, 2.8 GiB used, 117 GiB / 120 GiB avail; 14 KiB/s rd, 967 KiB/s wr, 159 op/s; 217/1818 objects degraded (11.936%); 0 B/s, 5 objects/s recovering
2026-03-10T07:53:45.407 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:53:45 vm05.local ceph-mon[130117]: Health check update: Degraded data redundancy: 217/1818 objects degraded (11.936%), 5 pgs degraded (PG_DEGRADED)
2026-03-10T07:53:45.418 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:53:45 vm08.local ceph-mon[107898]: Health check update: Degraded data redundancy: 217/1818 objects degraded (11.936%), 5 pgs degraded (PG_DEGRADED)
2026-03-10T07:53:46.668 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:53:46 vm08.local ceph-mon[107898]: pgmap v52: 65 pgs: 2 active+recovery_wait+undersized+degraded+remapped, 1 active+recovery_wait+degraded, 1 active+recovering, 61 active+clean; 259 MiB data, 2.8 GiB used, 117 GiB / 120 GiB avail; 25 KiB/s rd, 831 KiB/s wr, 182 op/s; 154/474 objects degraded (32.489%); 0 B/s, 8 objects/s recovering
2026-03-10T07:53:46.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:53:46 vm05.local ceph-mon[130117]: pgmap v52: 65 pgs: 2 active+recovery_wait+undersized+degraded+remapped, 1 active+recovery_wait+degraded, 1 active+recovering, 61 active+clean; 259 MiB data, 2.8 GiB used, 117 GiB / 120 GiB avail; 25 KiB/s rd, 831 KiB/s wr, 182 op/s; 154/474 objects degraded (32.489%); 0 B/s, 8 objects/s recovering
2026-03-10T07:53:48.668 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:53:48 vm08.local ceph-mon[107898]: pgmap v53: 65 pgs: 2 active+recovery_wait+undersized+degraded+remapped, 1 active+recovery_wait+degraded, 1 active+recovering, 61 active+clean; 255 MiB data, 2.8 GiB used, 117 GiB / 120 GiB avail; 23 KiB/s rd, 786 KiB/s wr, 160 op/s; 154/264 objects degraded (58.333%); 0 B/s, 8 objects/s recovering
2026-03-10T07:53:48.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:53:48 vm05.local ceph-mon[130117]: pgmap v53: 65 pgs: 2 active+recovery_wait+undersized+degraded+remapped, 1 active+recovery_wait+degraded, 1 active+recovering, 61 active+clean; 255 MiB data, 2.8 GiB used, 117 GiB / 120 GiB avail; 23 KiB/s rd, 786 KiB/s wr, 160 op/s; 154/264 objects degraded (58.333%); 0 B/s, 8 objects/s recovering
2026-03-10T07:53:49.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:53:49 vm05.local ceph-mon[130117]: Health check update: Degraded data redundancy: 154/264 objects degraded (58.333%), 3 pgs degraded (PG_DEGRADED)
2026-03-10T07:53:49.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:53:49 vm08.local ceph-mon[107898]: Health check update: Degraded data redundancy: 154/264 objects degraded (58.333%), 3 pgs degraded (PG_DEGRADED)
2026-03-10T07:53:50.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:53:50 vm05.local ceph-mon[130117]: pgmap v54: 65 pgs: 1 active+recovery_wait+undersized+degraded+remapped, 1 active+recovery_wait+degraded, 1 active+recovering+undersized+degraded+remapped, 62 active+clean; 255 MiB data, 2.8 GiB used, 117 GiB / 120 GiB avail; 21 KiB/s rd, 724 KiB/s wr, 148 op/s; 99/261 objects degraded (37.931%); 0 B/s, 11 objects/s recovering
2026-03-10T07:53:50.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:53:50 vm08.local ceph-mon[107898]: pgmap v54: 65 pgs: 1 active+recovery_wait+undersized+degraded+remapped, 1 active+recovery_wait+degraded, 1 active+recovering+undersized+degraded+remapped, 62 active+clean; 255 MiB data, 2.8 GiB used, 117 GiB / 120 GiB avail; 21 KiB/s rd, 724 KiB/s wr, 148 op/s; 99/261 objects degraded (37.931%); 0 B/s, 11 objects/s recovering
2026-03-10T07:53:52.157 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:52.155+0000 7f916d3bd700 1 -- 192.168.123.105:0/579907257 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f9168100990 msgr2=0x7f91681049e0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T07:53:52.157 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:52.155+0000 7f916d3bd700 1 --2- 192.168.123.105:0/579907257 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f9168100990 0x7f91681049e0 secure :-1 s=READY pgs=51 cs=0 l=1 rev1=1 crypto rx=0x7f9158009b00 tx=0x7f9158009e10 comp rx=0 tx=0).stop
2026-03-10T07:53:52.157 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:52.157+0000 7f916d3bd700 1 -- 192.168.123.105:0/579907257 shutdown_connections
2026-03-10T07:53:52.157 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:52.157+0000 7f916d3bd700 1 --2- 192.168.123.105:0/579907257 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f9168100990 0x7f91681049e0 unknown :-1 s=CLOSED pgs=51 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T07:53:52.157 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:52.157+0000 7f916d3bd700 1 --2- 192.168.123.105:0/579907257 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f91680fffe0 0x7f91681003c0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T07:53:52.157 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:52.157+0000 7f916d3bd700 1 -- 192.168.123.105:0/579907257 >> 192.168.123.105:0/579907257 conn(0x7f91680fb830 msgr2=0x7f91680fdc50 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-10T07:53:52.157 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:52.157+0000 7f916d3bd700 1 -- 192.168.123.105:0/579907257 shutdown_connections
2026-03-10T07:53:52.157 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:52.157+0000 7f916d3bd700 1 -- 192.168.123.105:0/579907257 wait complete.
2026-03-10T07:53:52.158 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:52.158+0000 7f916d3bd700 1 Processor -- start
2026-03-10T07:53:52.158 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:52.158+0000 7f916d3bd700 1 -- start start
2026-03-10T07:53:52.158 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:52.158+0000 7f916d3bd700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f91680fffe0 0x7f916810ad50 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T07:53:52.159 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:52.158+0000 7f916d3bd700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f9168100990 0x7f916810b290 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T07:53:52.159 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:52.158+0000 7f916d3bd700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f916810cb00 con 0x7f9168100990
2026-03-10T07:53:52.159 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:52.158+0000 7f916d3bd700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f916810cc70 con 0x7f91680fffe0
2026-03-10T07:53:52.159 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:52.158+0000 7f9166ffd700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f91680fffe0 0x7f916810ad50 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-10T07:53:52.159 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:52.158+0000 7f9166ffd700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f91680fffe0 0x7f916810ad50 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.108:3300/0 says I am v2:192.168.123.105:33592/0 (socket says 192.168.123.105:33592)
2026-03-10T07:53:52.159 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:52.158+0000 7f9166ffd700 1 -- 192.168.123.105:0/1029687676 learned_addr learned my addr 192.168.123.105:0/1029687676 (peer_addr_for_me v2:192.168.123.105:0/0)
2026-03-10T07:53:52.159 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:52.159+0000 7f9166ffd700 1 -- 192.168.123.105:0/1029687676 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f9168100990 msgr2=0x7f916810b290 unknown :-1 s=STATE_CONNECTING_RE l=1).mark_down
2026-03-10T07:53:52.159 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:52.159+0000 7f91667fc700 1 --2- 192.168.123.105:0/1029687676 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f9168100990 0x7f916810b290 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-10T07:53:52.159 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:52.159+0000 7f9166ffd700 1 --2- 192.168.123.105:0/1029687676 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f9168100990 0x7f916810b290 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T07:53:52.159 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:52.159+0000 7f9166ffd700 1 -- 192.168.123.105:0/1029687676 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f91580097e0 con 0x7f91680fffe0
2026-03-10T07:53:52.160 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:52.159+0000 7f9166ffd700 1 --2- 192.168.123.105:0/1029687676 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f91680fffe0 0x7f916810ad50 secure :-1 s=READY pgs=21 cs=0 l=1 rev1=1 crypto rx=0x7f915000d8d0 tx=0x7f915000dc90 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-10T07:53:52.160 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:52.159+0000 7f91667fc700 1 --2- 192.168.123.105:0/1029687676 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f9168100990 0x7f916810b290 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request state changed! 2026-03-10T07:53:52.161 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:52.160+0000 7f915ffff700 1 -- 192.168.123.105:0/1029687676 <== mon.1 v2:192.168.123.108:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f9150009940 con 0x7f91680fffe0 2026-03-10T07:53:52.161 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:52.160+0000 7f916d3bd700 1 -- 192.168.123.105:0/1029687676 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f916810b890 con 0x7f91680fffe0 2026-03-10T07:53:52.161 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:52.160+0000 7f916d3bd700 1 -- 192.168.123.105:0/1029687676 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f91681a77b0 con 0x7f91680fffe0 2026-03-10T07:53:52.161 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:52.160+0000 7f915ffff700 1 -- 192.168.123.105:0/1029687676 <== mon.1 v2:192.168.123.108:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7f9150010460 con 0x7f91680fffe0 2026-03-10T07:53:52.161 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:52.160+0000 7f915ffff700 1 -- 192.168.123.105:0/1029687676 <== mon.1 v2:192.168.123.108:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f915000f5d0 con 0x7f91680fffe0 2026-03-10T07:53:52.162 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:52.161+0000 7f915ffff700 1 -- 192.168.123.105:0/1029687676 <== mon.1 v2:192.168.123.108:3300/0 4 ==== mgrmap(e 37) v1 ==== 100217+0+0 (secure 0 0 0) 0x7f91500105d0 con 0x7f91680fffe0 2026-03-10T07:53:52.162 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:52.161+0000 7f915ffff700 1 --2- 192.168.123.105:0/1029687676 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f9154077a40 0x7f9154079f00 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:53:52.162 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:52.161+0000 7f915ffff700 1 -- 192.168.123.105:0/1029687676 <== mon.1 v2:192.168.123.108:3300/0 5 ==== osd_map(53..53 src has 1..53) v4 ==== 6194+0+0 (secure 0 0 0) 0x7f91500998b0 con 0x7f91680fffe0 2026-03-10T07:53:52.162 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:52.162+0000 7f91667fc700 1 --2- 192.168.123.105:0/1029687676 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f9154077a40 0x7f9154079f00 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:53:52.163 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:52.163+0000 7f91667fc700 1 --2- 192.168.123.105:0/1029687676 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f9154077a40 0x7f9154079f00 secure :-1 s=READY pgs=35 cs=0 l=1 rev1=1 crypto rx=0x7f915800b5c0 tx=0x7f9158005c00 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:53:52.163 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:52.163+0000 7f916d3bd700 1 -- 192.168.123.105:0/1029687676 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f9148005320 con 0x7f91680fffe0 2026-03-10T07:53:52.167 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:52.166+0000 7f915ffff700 1 -- 192.168.123.105:0/1029687676 <== mon.1 v2:192.168.123.108:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 
0x7f91500617b0 con 0x7f91680fffe0 2026-03-10T07:53:52.303 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:52.302+0000 7f916d3bd700 1 -- 192.168.123.105:0/1029687676 --> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7f9148000bf0 con 0x7f9154077a40 2026-03-10T07:53:52.304 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:52.303+0000 7f915ffff700 1 -- 192.168.123.105:0/1029687676 <== mgr.34104 v2:192.168.123.105:6800/627686298 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+400 (secure 0 0 0) 0x7f9148000bf0 con 0x7f9154077a40 2026-03-10T07:53:52.306 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:52.305+0000 7f916d3bd700 1 -- 192.168.123.105:0/1029687676 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f9154077a40 msgr2=0x7f9154079f00 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:53:52.306 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:52.305+0000 7f916d3bd700 1 --2- 192.168.123.105:0/1029687676 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f9154077a40 0x7f9154079f00 secure :-1 s=READY pgs=35 cs=0 l=1 rev1=1 crypto rx=0x7f915800b5c0 tx=0x7f9158005c00 comp rx=0 tx=0).stop 2026-03-10T07:53:52.306 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:52.305+0000 7f916d3bd700 1 -- 192.168.123.105:0/1029687676 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f91680fffe0 msgr2=0x7f916810ad50 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:53:52.306 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:52.305+0000 7f916d3bd700 1 --2- 192.168.123.105:0/1029687676 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f91680fffe0 0x7f916810ad50 secure :-1 s=READY pgs=21 cs=0 l=1 rev1=1 crypto rx=0x7f915000d8d0 tx=0x7f915000dc90 comp rx=0 tx=0).stop 2026-03-10T07:53:52.306 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:52.306+0000 7f916d3bd700 1 -- 192.168.123.105:0/1029687676 shutdown_connections 2026-03-10T07:53:52.306 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:52.306+0000 7f916d3bd700 1 --2- 192.168.123.105:0/1029687676 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f91680fffe0 0x7f916810ad50 unknown :-1 s=CLOSED pgs=21 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:53:52.306 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:52.306+0000 7f916d3bd700 1 --2- 192.168.123.105:0/1029687676 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f9154077a40 0x7f9154079f00 unknown :-1 s=CLOSED pgs=35 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:53:52.307 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:52.306+0000 7f916d3bd700 1 --2- 192.168.123.105:0/1029687676 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f9168100990 0x7f916810b290 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:53:52.307 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:52.306+0000 7f916d3bd700 1 -- 192.168.123.105:0/1029687676 >> 192.168.123.105:0/1029687676 conn(0x7f91680fb830 msgr2=0x7f91680fdc10 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:53:52.307 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:52.306+0000 7f916d3bd700 1 -- 192.168.123.105:0/1029687676 shutdown_connections 2026-03-10T07:53:52.307 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:52.306+0000 7f916d3bd700 1 -- 192.168.123.105:0/1029687676 wait complete. 
2026-03-10T07:53:52.316 INFO:teuthology.orchestra.run.vm05.stdout:true 2026-03-10T07:53:52.380 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:52.380+0000 7f4239734700 1 -- 192.168.123.105:0/1133139919 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f4234073a50 msgr2=0x7f4234111940 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:53:52.380 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:52.380+0000 7f4239734700 1 --2- 192.168.123.105:0/1133139919 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f4234073a50 0x7f4234111940 secure :-1 s=READY pgs=52 cs=0 l=1 rev1=1 crypto rx=0x7f4224009b00 tx=0x7f4224009e10 comp rx=0 tx=0).stop 2026-03-10T07:53:52.380 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:52.380+0000 7f4239734700 1 -- 192.168.123.105:0/1133139919 shutdown_connections 2026-03-10T07:53:52.380 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:52.380+0000 7f4239734700 1 --2- 192.168.123.105:0/1133139919 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f4234073a50 0x7f4234111940 unknown :-1 s=CLOSED pgs=52 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:53:52.381 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:52.380+0000 7f4239734700 1 --2- 192.168.123.105:0/1133139919 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f4234073130 0x7f4234073510 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:53:52.381 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:52.380+0000 7f4239734700 1 -- 192.168.123.105:0/1133139919 >> 192.168.123.105:0/1133139919 conn(0x7f42340fc920 msgr2=0x7f42340fed40 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:53:52.381 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:52.381+0000 7f4239734700 1 -- 192.168.123.105:0/1133139919 shutdown_connections 2026-03-10T07:53:52.381 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:52.381+0000 7f4239734700 1 -- 192.168.123.105:0/1133139919 wait complete. 2026-03-10T07:53:52.381 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:52.381+0000 7f4239734700 1 Processor -- start 2026-03-10T07:53:52.382 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:52.381+0000 7f4239734700 1 -- start start 2026-03-10T07:53:52.382 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:52.382+0000 7f4239734700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f4234073130 0x7f423419d0e0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:53:52.382 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:52.382+0000 7f4239734700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f4234073a50 0x7f423419d620 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:53:52.382 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:52.382+0000 7f4239734700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f423419dd00 con 0x7f4234073130 2026-03-10T07:53:52.382 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:52.382+0000 7f4239734700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f42341a1a90 con 0x7f4234073a50 2026-03-10T07:53:52.382 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:52.382+0000 7f4232ffd700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f4234073130 0x7f423419d0e0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:53:52.383 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:52.382+0000 7f4232ffd700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f4234073130 0x7f423419d0e0 unknown :-1 
s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.105:48016/0 (socket says 192.168.123.105:48016) 2026-03-10T07:53:52.383 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:52.382+0000 7f4232ffd700 1 -- 192.168.123.105:0/20560757 learned_addr learned my addr 192.168.123.105:0/20560757 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T07:53:52.383 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:52.382+0000 7f42327fc700 1 --2- 192.168.123.105:0/20560757 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f4234073a50 0x7f423419d620 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:53:52.383 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:52.382+0000 7f4232ffd700 1 -- 192.168.123.105:0/20560757 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f4234073a50 msgr2=0x7f423419d620 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:53:52.383 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:52.382+0000 7f4232ffd700 1 --2- 192.168.123.105:0/20560757 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f4234073a50 0x7f423419d620 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:53:52.384 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:52.382+0000 7f4232ffd700 1 -- 192.168.123.105:0/20560757 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f42240097e0 con 0x7f4234073130 2026-03-10T07:53:52.384 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:52.382+0000 7f4232ffd700 1 --2- 192.168.123.105:0/20560757 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f4234073130 0x7f423419d0e0 secure :-1 s=READY pgs=53 cs=0 l=1 rev1=1 crypto 
rx=0x7f421c00cc60 tx=0x7f421c0074a0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:53:52.384 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:52.382+0000 7f422bfff700 1 -- 192.168.123.105:0/20560757 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f421c007af0 con 0x7f4234073130 2026-03-10T07:53:52.385 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:52.382+0000 7f422bfff700 1 -- 192.168.123.105:0/20560757 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7f421c004d10 con 0x7f4234073130 2026-03-10T07:53:52.385 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:52.382+0000 7f422bfff700 1 -- 192.168.123.105:0/20560757 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f421c005710 con 0x7f4234073130 2026-03-10T07:53:52.385 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:52.382+0000 7f4239734700 1 -- 192.168.123.105:0/20560757 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f42341a1d70 con 0x7f4234073130 2026-03-10T07:53:52.385 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:52.382+0000 7f4239734700 1 -- 192.168.123.105:0/20560757 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f42341a2270 con 0x7f4234073130 2026-03-10T07:53:52.385 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:52.384+0000 7f422bfff700 1 -- 192.168.123.105:0/20560757 <== mon.0 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 37) v1 ==== 100217+0+0 (secure 0 0 0) 0x7f421c00f490 con 0x7f4234073130 2026-03-10T07:53:52.385 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:52.384+0000 7f422bfff700 1 --2- 192.168.123.105:0/20560757 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f4220077b10 0x7f4220079fd0 unknown :-1 s=NONE pgs=0 cs=0 l=1 
rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:53:52.385 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:52.384+0000 7f422bfff700 1 -- 192.168.123.105:0/20560757 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(53..53 src has 1..53) v4 ==== 6194+0+0 (secure 0 0 0) 0x7f421c09a930 con 0x7f4234073130 2026-03-10T07:53:52.385 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:52.385+0000 7f42327fc700 1 --2- 192.168.123.105:0/20560757 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f4220077b10 0x7f4220079fd0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:53:52.386 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:52.385+0000 7f42327fc700 1 --2- 192.168.123.105:0/20560757 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f4220077b10 0x7f4220079fd0 secure :-1 s=READY pgs=36 cs=0 l=1 rev1=1 crypto rx=0x7f4224000c00 tx=0x7f4224005c00 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:53:52.386 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:52.385+0000 7f4239734700 1 -- 192.168.123.105:0/20560757 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f423404ea90 con 0x7f4234073130 2026-03-10T07:53:52.389 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:52.389+0000 7f422bfff700 1 -- 192.168.123.105:0/20560757 <== mon.0 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f421c01d080 con 0x7f4234073130 2026-03-10T07:53:52.534 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:52.533+0000 7f4239734700 1 -- 192.168.123.105:0/20560757 --> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] -- mgr_command(tid 0: {"prefix": 
"orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7f423419e490 con 0x7f4220077b10 2026-03-10T07:53:52.535 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:52.534+0000 7f422bfff700 1 -- 192.168.123.105:0/20560757 <== mgr.34104 v2:192.168.123.105:6800/627686298 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+400 (secure 0 0 0) 0x7f423419e490 con 0x7f4220077b10 2026-03-10T07:53:52.537 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:52.537+0000 7f4239734700 1 -- 192.168.123.105:0/20560757 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f4220077b10 msgr2=0x7f4220079fd0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:53:52.537 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:52.537+0000 7f4239734700 1 --2- 192.168.123.105:0/20560757 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f4220077b10 0x7f4220079fd0 secure :-1 s=READY pgs=36 cs=0 l=1 rev1=1 crypto rx=0x7f4224000c00 tx=0x7f4224005c00 comp rx=0 tx=0).stop 2026-03-10T07:53:52.538 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:52.537+0000 7f4239734700 1 -- 192.168.123.105:0/20560757 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f4234073130 msgr2=0x7f423419d0e0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:53:52.538 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:52.537+0000 7f4239734700 1 --2- 192.168.123.105:0/20560757 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f4234073130 0x7f423419d0e0 secure :-1 s=READY pgs=53 cs=0 l=1 rev1=1 crypto rx=0x7f421c00cc60 tx=0x7f421c0074a0 comp rx=0 tx=0).stop 2026-03-10T07:53:52.538 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:52.537+0000 7f4239734700 1 -- 192.168.123.105:0/20560757 shutdown_connections 2026-03-10T07:53:52.538 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:52.537+0000 7f4239734700 1 --2- 192.168.123.105:0/20560757 >> 
[v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f4220077b10 0x7f4220079fd0 unknown :-1 s=CLOSED pgs=36 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:53:52.538 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:52.537+0000 7f4239734700 1 --2- 192.168.123.105:0/20560757 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f4234073130 0x7f423419d0e0 unknown :-1 s=CLOSED pgs=53 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:53:52.538 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:52.537+0000 7f4239734700 1 --2- 192.168.123.105:0/20560757 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f4234073a50 0x7f423419d620 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:53:52.538 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:52.537+0000 7f4239734700 1 -- 192.168.123.105:0/20560757 >> 192.168.123.105:0/20560757 conn(0x7f42340fc920 msgr2=0x7f4234103450 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:53:52.538 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:52.538+0000 7f4239734700 1 -- 192.168.123.105:0/20560757 shutdown_connections 2026-03-10T07:53:52.538 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:52.538+0000 7f4239734700 1 -- 192.168.123.105:0/20560757 wait complete. 
2026-03-10T07:53:52.604 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:52.604+0000 7f8976ed6700 1 -- 192.168.123.105:0/791827210 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f8970103a50 msgr2=0x7f8970107aa0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:53:52.605 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:52.604+0000 7f8976ed6700 1 --2- 192.168.123.105:0/791827210 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f8970103a50 0x7f8970107aa0 secure :-1 s=READY pgs=54 cs=0 l=1 rev1=1 crypto rx=0x7f896c009b00 tx=0x7f896c009e10 comp rx=0 tx=0).stop 2026-03-10T07:53:52.605 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:52.605+0000 7f8976ed6700 1 -- 192.168.123.105:0/791827210 shutdown_connections 2026-03-10T07:53:52.605 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:52.605+0000 7f8976ed6700 1 --2- 192.168.123.105:0/791827210 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f8970103a50 0x7f8970107aa0 unknown :-1 s=CLOSED pgs=54 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:53:52.605 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:52.605+0000 7f8976ed6700 1 --2- 192.168.123.105:0/791827210 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f89701030a0 0x7f8970103480 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:53:52.605 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:52.605+0000 7f8976ed6700 1 -- 192.168.123.105:0/791827210 >> 192.168.123.105:0/791827210 conn(0x7f89700fe930 msgr2=0x7f8970100d50 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:53:52.605 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:52.605+0000 7f8976ed6700 1 -- 192.168.123.105:0/791827210 shutdown_connections 2026-03-10T07:53:52.605 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:52.605+0000 7f8976ed6700 1 -- 192.168.123.105:0/791827210 wait 
complete. 2026-03-10T07:53:52.606 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:52.605+0000 7f8976ed6700 1 Processor -- start 2026-03-10T07:53:52.606 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:52.605+0000 7f8976ed6700 1 -- start start 2026-03-10T07:53:52.606 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:52.606+0000 7f8976ed6700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f89701030a0 0x7f897019e900 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:53:52.606 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:52.606+0000 7f8976ed6700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f8970103a50 0x7f897019ee40 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:53:52.606 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:52.606+0000 7f8976ed6700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f897019f4d0 con 0x7f89701030a0 2026-03-10T07:53:52.606 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:52.606+0000 7f8976ed6700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f8970198980 con 0x7f8970103a50 2026-03-10T07:53:52.606 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:52.606+0000 7f8975ed4700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f89701030a0 0x7f897019e900 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:53:52.606 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:52.606+0000 7f8975ed4700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f89701030a0 0x7f897019e900 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I 
am v2:192.168.123.105:48034/0 (socket says 192.168.123.105:48034) 2026-03-10T07:53:52.607 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:52.606+0000 7f8975ed4700 1 -- 192.168.123.105:0/2913743664 learned_addr learned my addr 192.168.123.105:0/2913743664 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T07:53:52.607 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:52.606+0000 7f89756d3700 1 --2- 192.168.123.105:0/2913743664 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f8970103a50 0x7f897019ee40 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:53:52.607 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:52.606+0000 7f8975ed4700 1 -- 192.168.123.105:0/2913743664 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f8970103a50 msgr2=0x7f897019ee40 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:53:52.607 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:52.606+0000 7f8975ed4700 1 --2- 192.168.123.105:0/2913743664 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f8970103a50 0x7f897019ee40 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:53:52.607 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:52.606+0000 7f8975ed4700 1 -- 192.168.123.105:0/2913743664 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f896c0097e0 con 0x7f89701030a0 2026-03-10T07:53:52.607 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:52.606+0000 7f8975ed4700 1 --2- 192.168.123.105:0/2913743664 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f89701030a0 0x7f897019e900 secure :-1 s=READY pgs=55 cs=0 l=1 rev1=1 crypto rx=0x7f896400cc60 tx=0x7f89640074a0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 
2026-03-10T07:53:52.607 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:52.606+0000 7f8962ffd700 1 -- 192.168.123.105:0/2913743664 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f8964007af0 con 0x7f89701030a0 2026-03-10T07:53:52.608 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:52.606+0000 7f8976ed6700 1 -- 192.168.123.105:0/2913743664 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f8970198c60 con 0x7f89701030a0 2026-03-10T07:53:52.608 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:52.606+0000 7f8976ed6700 1 -- 192.168.123.105:0/2913743664 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f89701991b0 con 0x7f89701030a0 2026-03-10T07:53:52.608 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:52.608+0000 7f8962ffd700 1 -- 192.168.123.105:0/2913743664 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7f8964004d10 con 0x7f89701030a0 2026-03-10T07:53:52.608 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:52.608+0000 7f8962ffd700 1 -- 192.168.123.105:0/2913743664 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f89640056e0 con 0x7f89701030a0 2026-03-10T07:53:52.608 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:52.608+0000 7f8962ffd700 1 -- 192.168.123.105:0/2913743664 <== mon.0 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 37) v1 ==== 100217+0+0 (secure 0 0 0) 0x7f896400f4d0 con 0x7f89701030a0 2026-03-10T07:53:52.608 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:52.608+0000 7f8962ffd700 1 --2- 192.168.123.105:0/2913743664 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f895c077b90 0x7f895c07a050 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:53:52.608 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:52.608+0000 7f8962ffd700 1 -- 192.168.123.105:0/2913743664 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(54..54 src has 1..54) v4 ==== 6165+0+0 (secure 0 0 0) 0x7f8964013070 con 0x7f89701030a0 2026-03-10T07:53:52.609 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:52.608+0000 7f89756d3700 1 --2- 192.168.123.105:0/2913743664 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f895c077b90 0x7f895c07a050 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:53:52.609 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:52.608+0000 7f8976ed6700 1 -- 192.168.123.105:0/2913743664 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f8954005320 con 0x7f89701030a0 2026-03-10T07:53:52.611 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:52.611+0000 7f8962ffd700 1 -- 192.168.123.105:0/2913743664 <== mon.0 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f89640635e0 con 0x7f89701030a0 2026-03-10T07:53:52.612 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:52.612+0000 7f89756d3700 1 --2- 192.168.123.105:0/2913743664 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f895c077b90 0x7f895c07a050 secure :-1 s=READY pgs=37 cs=0 l=1 rev1=1 crypto rx=0x7f8970100180 tx=0x7f896c005fb0 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:53:52.736 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:53:52 vm05.local ceph-mon[130117]: pgmap v55: 65 pgs: 1 active+recovery_wait+undersized+degraded+remapped, 1 active+recovery_wait+degraded, 1 active+recovering+undersized+degraded+remapped, 62 active+clean; 255 MiB data, 2.8 GiB 
used, 117 GiB / 120 GiB avail; 11 KiB/s rd, 330 KiB/s wr, 68 op/s; 99/261 objects degraded (37.931%); 0 B/s, 7 objects/s recovering 2026-03-10T07:53:52.736 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:52.735+0000 7f8976ed6700 1 -- 192.168.123.105:0/2913743664 --> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] -- mgr_command(tid 0: {"prefix": "orch ps", "target": ["mon-mgr", ""]}) v1 -- 0x7f8954000bf0 con 0x7f895c077b90 2026-03-10T07:53:52.742 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:52.742+0000 7f8962ffd700 1 -- 192.168.123.105:0/2913743664 <== mgr.34104 v2:192.168.123.105:6800/627686298 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+3552 (secure 0 0 0) 0x7f8954000bf0 con 0x7f895c077b90 2026-03-10T07:53:52.743 INFO:teuthology.orchestra.run.vm05.stdout:NAME HOST PORTS STATUS REFRESHED AGE MEM USE MEM LIM VERSION IMAGE ID CONTAINER ID 2026-03-10T07:53:52.743 INFO:teuthology.orchestra.run.vm05.stdout:alertmanager.vm05 vm05 *:9093,9094 running (2m) 46s ago 7m 23.0M - 0.25.0 c8568f914cd2 ac15d5f35994 2026-03-10T07:53:52.743 INFO:teuthology.orchestra.run.vm05.stdout:ceph-exporter.vm05 vm05 running (7m) 46s ago 7m 9680k - 18.2.1 5be31c24972a 26c4db858175 2026-03-10T07:53:52.743 INFO:teuthology.orchestra.run.vm05.stdout:ceph-exporter.vm08 vm08 running (7m) 58s ago 7m 10.4M - 18.2.1 5be31c24972a 209e2398a09c 2026-03-10T07:53:52.743 INFO:teuthology.orchestra.run.vm05.stdout:crash.vm05 vm05 running (61s) 46s ago 7m 7847k - 19.2.3-678-ge911bdeb 654f31e6858e daa831c74cf4 2026-03-10T07:53:52.743 INFO:teuthology.orchestra.run.vm05.stdout:crash.vm08 vm08 running (59s) 58s ago 7m 8262k - 19.2.3-678-ge911bdeb 654f31e6858e 668ac55c9722 2026-03-10T07:53:52.743 INFO:teuthology.orchestra.run.vm05.stdout:grafana.vm05 vm05 *:3000 running (2m) 46s ago 7m 81.3M - 10.4.0 c8b91775d855 6acb529ad951 2026-03-10T07:53:52.743 INFO:teuthology.orchestra.run.vm05.stdout:mds.cephfs.vm05.omfhnh vm05 running (5m) 46s ago 5m 246M - 18.2.1 
5be31c24972a e23de179e09c 2026-03-10T07:53:52.743 INFO:teuthology.orchestra.run.vm05.stdout:mds.cephfs.vm05.pavqil vm05 running (5m) 46s ago 5m 16.8M - 18.2.1 5be31c24972a 5b9e5afa214c 2026-03-10T07:53:52.743 INFO:teuthology.orchestra.run.vm05.stdout:mds.cephfs.vm08.dgsaon vm08 running (5m) 58s ago 5m 17.7M - 18.2.1 5be31c24972a 1696aee522b5 2026-03-10T07:53:52.743 INFO:teuthology.orchestra.run.vm05.stdout:mds.cephfs.vm08.ybmbgd vm08 running (5m) 58s ago 5m 16.0M - 18.2.1 5be31c24972a 30b0e51cd2ed 2026-03-10T07:53:52.743 INFO:teuthology.orchestra.run.vm05.stdout:mgr.vm05.blexke vm05 *:8443,9283,8765 running (3m) 46s ago 8m 582M - 19.2.3-678-ge911bdeb 654f31e6858e 5467b6a3bd38 2026-03-10T07:53:52.743 INFO:teuthology.orchestra.run.vm05.stdout:mgr.vm08.orfpog vm08 *:8443,9283,8765 running (3m) 58s ago 6m 534M - 19.2.3-678-ge911bdeb 654f31e6858e 7a15d7811d5b 2026-03-10T07:53:52.743 INFO:teuthology.orchestra.run.vm05.stdout:mon.vm05 vm05 running (94s) 46s ago 8m 57.5M 2048M 19.2.3-678-ge911bdeb 654f31e6858e f02f076bb820 2026-03-10T07:53:52.743 INFO:teuthology.orchestra.run.vm05.stdout:mon.vm08 vm08 running (77s) 58s ago 6m 48.8M 2048M 19.2.3-678-ge911bdeb 654f31e6858e 73d9a504f360 2026-03-10T07:53:52.743 INFO:teuthology.orchestra.run.vm05.stdout:node-exporter.vm05 vm05 *:9100 running (3m) 46s ago 7m 10.3M - 1.7.0 72c9c2088986 7cd0b23b4118 2026-03-10T07:53:52.743 INFO:teuthology.orchestra.run.vm05.stdout:node-exporter.vm08 vm08 *:9100 running (3m) 58s ago 7m 9.92M - 1.7.0 72c9c2088986 3dd4d91d5881 2026-03-10T07:53:52.743 INFO:teuthology.orchestra.run.vm05.stdout:osd.0 vm05 running (49s) 46s ago 6m 33.6M 4096M 19.2.3-678-ge911bdeb 654f31e6858e b35fccc2a4d5 2026-03-10T07:53:52.743 INFO:teuthology.orchestra.run.vm05.stdout:osd.1 vm05 running (6m) 46s ago 6m 333M 4096M 18.2.1 5be31c24972a 88e0b65b2c93 2026-03-10T07:53:52.743 INFO:teuthology.orchestra.run.vm05.stdout:osd.2 vm05 running (6m) 46s ago 6m 272M 4096M 18.2.1 5be31c24972a b8d3cbeb49f1 2026-03-10T07:53:52.743 
INFO:teuthology.orchestra.run.vm05.stdout:osd.3 vm08 running (6m) 58s ago 6m 410M 4096M 18.2.1 5be31c24972a 0a62c54a86c0 2026-03-10T07:53:52.743 INFO:teuthology.orchestra.run.vm05.stdout:osd.4 vm08 running (6m) 58s ago 6m 343M 4096M 18.2.1 5be31c24972a bd748b691ccd 2026-03-10T07:53:52.743 INFO:teuthology.orchestra.run.vm05.stdout:osd.5 vm08 running (5m) 58s ago 5m 284M 4096M 18.2.1 5be31c24972a 9f08820ae98b 2026-03-10T07:53:52.743 INFO:teuthology.orchestra.run.vm05.stdout:prometheus.vm05 vm05 *:9095 running (2m) 46s ago 7m 50.6M - 2.51.0 1d3b7f56885b c59a6be07563 2026-03-10T07:53:52.745 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:52.745+0000 7f8976ed6700 1 -- 192.168.123.105:0/2913743664 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f895c077b90 msgr2=0x7f895c07a050 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:53:52.745 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:52.745+0000 7f8976ed6700 1 --2- 192.168.123.105:0/2913743664 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f895c077b90 0x7f895c07a050 secure :-1 s=READY pgs=37 cs=0 l=1 rev1=1 crypto rx=0x7f8970100180 tx=0x7f896c005fb0 comp rx=0 tx=0).stop 2026-03-10T07:53:52.745 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:52.745+0000 7f8976ed6700 1 -- 192.168.123.105:0/2913743664 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f89701030a0 msgr2=0x7f897019e900 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:53:52.745 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:52.745+0000 7f8976ed6700 1 --2- 192.168.123.105:0/2913743664 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f89701030a0 0x7f897019e900 secure :-1 s=READY pgs=55 cs=0 l=1 rev1=1 crypto rx=0x7f896400cc60 tx=0x7f89640074a0 comp rx=0 tx=0).stop 2026-03-10T07:53:52.746 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:52.745+0000 7f8976ed6700 1 -- 
192.168.123.105:0/2913743664 shutdown_connections 2026-03-10T07:53:52.746 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:52.745+0000 7f8976ed6700 1 --2- 192.168.123.105:0/2913743664 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f895c077b90 0x7f895c07a050 unknown :-1 s=CLOSED pgs=37 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:53:52.746 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:52.745+0000 7f8976ed6700 1 --2- 192.168.123.105:0/2913743664 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f89701030a0 0x7f897019e900 unknown :-1 s=CLOSED pgs=55 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:53:52.746 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:52.745+0000 7f8976ed6700 1 --2- 192.168.123.105:0/2913743664 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f8970103a50 0x7f897019ee40 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:53:52.746 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:52.745+0000 7f8976ed6700 1 -- 192.168.123.105:0/2913743664 >> 192.168.123.105:0/2913743664 conn(0x7f89700fe930 msgr2=0x7f8970107310 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:53:52.746 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:52.745+0000 7f8976ed6700 1 -- 192.168.123.105:0/2913743664 shutdown_connections 2026-03-10T07:53:52.746 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:52.745+0000 7f8976ed6700 1 -- 192.168.123.105:0/2913743664 wait complete. 
2026-03-10T07:53:52.812 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:52.811+0000 7f83f5f95700 1 -- 192.168.123.105:0/72219687 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f83f00690e0 msgr2=0x7f83f0105b50 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:53:52.812 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:52.811+0000 7f83f5f95700 1 --2- 192.168.123.105:0/72219687 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f83f00690e0 0x7f83f0105b50 secure :-1 s=READY pgs=56 cs=0 l=1 rev1=1 crypto rx=0x7f83d8009b00 tx=0x7f83d8009e10 comp rx=0 tx=0).stop 2026-03-10T07:53:52.812 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:52.811+0000 7f83f5f95700 1 -- 192.168.123.105:0/72219687 shutdown_connections 2026-03-10T07:53:52.812 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:52.812+0000 7f83f5f95700 1 --2- 192.168.123.105:0/72219687 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f83f00690e0 0x7f83f0105b50 unknown :-1 s=CLOSED pgs=56 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:53:52.812 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:52.812+0000 7f83f5f95700 1 --2- 192.168.123.105:0/72219687 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f83f0068730 0x7f83f0068b10 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:53:52.812 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:52.812+0000 7f83f5f95700 1 -- 192.168.123.105:0/72219687 >> 192.168.123.105:0/72219687 conn(0x7f83f0075960 msgr2=0x7f83f0075d70 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:53:52.812 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:52.812+0000 7f83f5f95700 1 -- 192.168.123.105:0/72219687 shutdown_connections 2026-03-10T07:53:52.812 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:52.812+0000 7f83f5f95700 1 -- 192.168.123.105:0/72219687 wait complete. 
2026-03-10T07:53:52.812 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:52.812+0000 7f83f5f95700 1 Processor -- start 2026-03-10T07:53:52.813 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:52.812+0000 7f83f5f95700 1 -- start start 2026-03-10T07:53:52.813 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:52.813+0000 7f83f5f95700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f83f0068730 0x7f83f019a7e0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:53:52.813 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:52.813+0000 7f83f5f95700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f83f00690e0 0x7f83f019ad20 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:53:52.813 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:52.813+0000 7f83f5f95700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f83f019b3b0 con 0x7f83f0068730 2026-03-10T07:53:52.813 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:52.813+0000 7f83f5f95700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f83f0194860 con 0x7f83f00690e0 2026-03-10T07:53:52.813 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:52.813+0000 7f83eeffd700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f83f00690e0 0x7f83f019ad20 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:53:52.813 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:52.813+0000 7f83eeffd700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f83f00690e0 0x7f83f019ad20 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.108:3300/0 says I am 
v2:192.168.123.105:33668/0 (socket says 192.168.123.105:33668) 2026-03-10T07:53:52.813 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:52.813+0000 7f83eeffd700 1 -- 192.168.123.105:0/2761631643 learned_addr learned my addr 192.168.123.105:0/2761631643 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T07:53:52.813 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:52.813+0000 7f83ef7fe700 1 --2- 192.168.123.105:0/2761631643 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f83f0068730 0x7f83f019a7e0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:53:52.813 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:52.813+0000 7f83eeffd700 1 -- 192.168.123.105:0/2761631643 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f83f0068730 msgr2=0x7f83f019a7e0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:53:52.814 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:52.813+0000 7f83eeffd700 1 --2- 192.168.123.105:0/2761631643 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f83f0068730 0x7f83f019a7e0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:53:52.814 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:52.813+0000 7f83eeffd700 1 -- 192.168.123.105:0/2761631643 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f83d80097e0 con 0x7f83f00690e0 2026-03-10T07:53:52.814 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:52.813+0000 7f83ef7fe700 1 --2- 192.168.123.105:0/2761631643 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f83f0068730 0x7f83f019a7e0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_auth_done state changed! 
2026-03-10T07:53:52.814 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:52.814+0000 7f83eeffd700 1 --2- 192.168.123.105:0/2761631643 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f83f00690e0 0x7f83f019ad20 secure :-1 s=READY pgs=22 cs=0 l=1 rev1=1 crypto rx=0x7f83d8009fd0 tx=0x7f83d8004930 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:53:52.814 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:52.814+0000 7f83ecff9700 1 -- 192.168.123.105:0/2761631643 <== mon.1 v2:192.168.123.108:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f83d801d070 con 0x7f83f00690e0 2026-03-10T07:53:52.814 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:52.814+0000 7f83f5f95700 1 -- 192.168.123.105:0/2761631643 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f83f0194ae0 con 0x7f83f00690e0 2026-03-10T07:53:52.815 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:52.814+0000 7f83f5f95700 1 -- 192.168.123.105:0/2761631643 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f83f0194fd0 con 0x7f83f00690e0 2026-03-10T07:53:52.815 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:52.815+0000 7f83ecff9700 1 -- 192.168.123.105:0/2761631643 <== mon.1 v2:192.168.123.108:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7f83d8004b90 con 0x7f83f00690e0 2026-03-10T07:53:52.815 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:52.815+0000 7f83f5f95700 1 -- 192.168.123.105:0/2761631643 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f83f0109450 con 0x7f83f00690e0 2026-03-10T07:53:52.816 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:52.816+0000 7f83ecff9700 1 -- 192.168.123.105:0/2761631643 <== mon.1 v2:192.168.123.108:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 
(secure 0 0 0) 0x7f83d800f670 con 0x7f83f00690e0 2026-03-10T07:53:52.816 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:52.816+0000 7f83ecff9700 1 -- 192.168.123.105:0/2761631643 <== mon.1 v2:192.168.123.108:3300/0 4 ==== mgrmap(e 37) v1 ==== 100217+0+0 (secure 0 0 0) 0x7f83d8048930 con 0x7f83f00690e0 2026-03-10T07:53:52.816 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:52.816+0000 7f83ecff9700 1 --2- 192.168.123.105:0/2761631643 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f83dc07bbe0 0x7f83dc07e0a0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:53:52.816 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:52.816+0000 7f83ef7fe700 1 --2- 192.168.123.105:0/2761631643 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f83dc07bbe0 0x7f83dc07e0a0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:53:52.817 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:52.816+0000 7f83ef7fe700 1 --2- 192.168.123.105:0/2761631643 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f83dc07bbe0 0x7f83dc07e0a0 secure :-1 s=READY pgs=38 cs=0 l=1 rev1=1 crypto rx=0x7f83e0005950 tx=0x7f83e000a300 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:53:52.817 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:52.816+0000 7f83ecff9700 1 -- 192.168.123.105:0/2761631643 <== mon.1 v2:192.168.123.108:3300/0 5 ==== osd_map(54..54 src has 1..54) v4 ==== 6165+0+0 (secure 0 0 0) 0x7f83d8022950 con 0x7f83f00690e0 2026-03-10T07:53:52.818 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:52.818+0000 7f83ecff9700 1 -- 192.168.123.105:0/2761631643 <== mon.1 v2:192.168.123.108:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 
==== 72+0+195034 (secure 0 0 0) 0x7f83d80647b0 con 0x7f83f00690e0 2026-03-10T07:53:52.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:53:52 vm08.local ceph-mon[107898]: pgmap v55: 65 pgs: 1 active+recovery_wait+undersized+degraded+remapped, 1 active+recovery_wait+degraded, 1 active+recovering+undersized+degraded+remapped, 62 active+clean; 255 MiB data, 2.8 GiB used, 117 GiB / 120 GiB avail; 11 KiB/s rd, 330 KiB/s wr, 68 op/s; 99/261 objects degraded (37.931%); 0 B/s, 7 objects/s recovering 2026-03-10T07:53:52.990 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:52.990+0000 7f83f5f95700 1 -- 192.168.123.105:0/2761631643 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "versions"} v 0) v1 -- 0x7f83f004ea90 con 0x7f83f00690e0 2026-03-10T07:53:52.992 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:52.992+0000 7f83ecff9700 1 -- 192.168.123.105:0/2761631643 <== mon.1 v2:192.168.123.108:3300/0 7 ==== mon_command_ack([{"prefix": "versions"}]=0 v0) v1 ==== 56+0+799 (secure 0 0 0) 0x7f83d8027070 con 0x7f83f00690e0 2026-03-10T07:53:52.992 INFO:teuthology.orchestra.run.vm05.stdout:{ 2026-03-10T07:53:52.992 INFO:teuthology.orchestra.run.vm05.stdout: "mon": { 2026-03-10T07:53:52.992 INFO:teuthology.orchestra.run.vm05.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 2 2026-03-10T07:53:52.992 INFO:teuthology.orchestra.run.vm05.stdout: }, 2026-03-10T07:53:52.992 INFO:teuthology.orchestra.run.vm05.stdout: "mgr": { 2026-03-10T07:53:52.992 INFO:teuthology.orchestra.run.vm05.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 2 2026-03-10T07:53:52.992 INFO:teuthology.orchestra.run.vm05.stdout: }, 2026-03-10T07:53:52.992 INFO:teuthology.orchestra.run.vm05.stdout: "osd": { 2026-03-10T07:53:52.992 INFO:teuthology.orchestra.run.vm05.stdout: "ceph version 18.2.1 (7fe91d5d5842e04be3b4f514d6dd990c54b29c76) reef (stable)": 5, 
2026-03-10T07:53:52.992 INFO:teuthology.orchestra.run.vm05.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 1 2026-03-10T07:53:52.992 INFO:teuthology.orchestra.run.vm05.stdout: }, 2026-03-10T07:53:52.992 INFO:teuthology.orchestra.run.vm05.stdout: "mds": { 2026-03-10T07:53:52.992 INFO:teuthology.orchestra.run.vm05.stdout: "ceph version 18.2.1 (7fe91d5d5842e04be3b4f514d6dd990c54b29c76) reef (stable)": 4 2026-03-10T07:53:52.992 INFO:teuthology.orchestra.run.vm05.stdout: }, 2026-03-10T07:53:52.992 INFO:teuthology.orchestra.run.vm05.stdout: "overall": { 2026-03-10T07:53:52.992 INFO:teuthology.orchestra.run.vm05.stdout: "ceph version 18.2.1 (7fe91d5d5842e04be3b4f514d6dd990c54b29c76) reef (stable)": 9, 2026-03-10T07:53:52.992 INFO:teuthology.orchestra.run.vm05.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 5 2026-03-10T07:53:52.992 INFO:teuthology.orchestra.run.vm05.stdout: } 2026-03-10T07:53:52.992 INFO:teuthology.orchestra.run.vm05.stdout:} 2026-03-10T07:53:52.994 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:52.994+0000 7f83f5f95700 1 -- 192.168.123.105:0/2761631643 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f83dc07bbe0 msgr2=0x7f83dc07e0a0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:53:52.995 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:52.994+0000 7f83f5f95700 1 --2- 192.168.123.105:0/2761631643 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f83dc07bbe0 0x7f83dc07e0a0 secure :-1 s=READY pgs=38 cs=0 l=1 rev1=1 crypto rx=0x7f83e0005950 tx=0x7f83e000a300 comp rx=0 tx=0).stop 2026-03-10T07:53:52.995 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:52.994+0000 7f83f5f95700 1 -- 192.168.123.105:0/2761631643 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f83f00690e0 msgr2=0x7f83f019ad20 secure :-1 
s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:53:52.995 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:52.994+0000 7f83f5f95700 1 --2- 192.168.123.105:0/2761631643 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f83f00690e0 0x7f83f019ad20 secure :-1 s=READY pgs=22 cs=0 l=1 rev1=1 crypto rx=0x7f83d8009fd0 tx=0x7f83d8004930 comp rx=0 tx=0).stop 2026-03-10T07:53:52.995 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:52.995+0000 7f83f5f95700 1 -- 192.168.123.105:0/2761631643 shutdown_connections 2026-03-10T07:53:52.995 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:52.995+0000 7f83f5f95700 1 --2- 192.168.123.105:0/2761631643 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f83dc07bbe0 0x7f83dc07e0a0 unknown :-1 s=CLOSED pgs=38 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:53:52.996 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:52.995+0000 7f83f5f95700 1 --2- 192.168.123.105:0/2761631643 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f83f0068730 0x7f83f019a7e0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:53:52.996 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:52.995+0000 7f83f5f95700 1 --2- 192.168.123.105:0/2761631643 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f83f00690e0 0x7f83f019ad20 unknown :-1 s=CLOSED pgs=22 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:53:52.996 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:52.995+0000 7f83f5f95700 1 -- 192.168.123.105:0/2761631643 >> 192.168.123.105:0/2761631643 conn(0x7f83f0075960 msgr2=0x7f83f00feac0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:53:52.996 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:52.995+0000 7f83f5f95700 1 -- 192.168.123.105:0/2761631643 shutdown_connections 2026-03-10T07:53:52.996 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:52.995+0000 7f83f5f95700 1 -- 192.168.123.105:0/2761631643 wait complete. 2026-03-10T07:53:53.061 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:53.060+0000 7f72e9808700 1 -- 192.168.123.105:0/3031778668 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f72e4073130 msgr2=0x7f72e4073510 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:53:53.061 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:53.060+0000 7f72e9808700 1 --2- 192.168.123.105:0/3031778668 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f72e4073130 0x7f72e4073510 secure :-1 s=READY pgs=57 cs=0 l=1 rev1=1 crypto rx=0x7f72cc009b00 tx=0x7f72cc009e10 comp rx=0 tx=0).stop 2026-03-10T07:53:53.061 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:53.060+0000 7f72e9808700 1 -- 192.168.123.105:0/3031778668 shutdown_connections 2026-03-10T07:53:53.061 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:53.060+0000 7f72e9808700 1 --2- 192.168.123.105:0/3031778668 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f72e4073a50 0x7f72e4111940 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:53:53.061 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:53.060+0000 7f72e9808700 1 --2- 192.168.123.105:0/3031778668 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f72e4073130 0x7f72e4073510 unknown :-1 s=CLOSED pgs=57 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:53:53.061 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:53.060+0000 7f72e9808700 1 -- 192.168.123.105:0/3031778668 >> 192.168.123.105:0/3031778668 conn(0x7f72e40fc920 msgr2=0x7f72e40fed40 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:53:53.061 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:53.061+0000 7f72e9808700 1 -- 192.168.123.105:0/3031778668 shutdown_connections 
2026-03-10T07:53:53.061 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:53.061+0000 7f72e9808700 1 -- 192.168.123.105:0/3031778668 wait complete. 2026-03-10T07:53:53.062 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:53.061+0000 7f72e9808700 1 Processor -- start 2026-03-10T07:53:53.062 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:53.062+0000 7f72e9808700 1 -- start start 2026-03-10T07:53:53.062 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:53.062+0000 7f72e9808700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f72e4073130 0x7f72e419d0f0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:53:53.062 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:53.062+0000 7f72e9808700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f72e4073a50 0x7f72e419d630 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:53:53.062 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:53.062+0000 7f72e9808700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f72e419dd10 con 0x7f72e4073a50 2026-03-10T07:53:53.063 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:53.062+0000 7f72e9808700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f72e41a1aa0 con 0x7f72e4073130 2026-03-10T07:53:53.063 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:53.062+0000 7f72e2ffd700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f72e4073130 0x7f72e419d0f0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:53:53.063 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:53.062+0000 7f72e2ffd700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f72e4073130 
0x7f72e419d0f0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.108:3300/0 says I am v2:192.168.123.105:33678/0 (socket says 192.168.123.105:33678) 2026-03-10T07:53:53.063 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:53.062+0000 7f72e2ffd700 1 -- 192.168.123.105:0/4276397381 learned_addr learned my addr 192.168.123.105:0/4276397381 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T07:53:53.063 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:53.063+0000 7f72e27fc700 1 --2- 192.168.123.105:0/4276397381 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f72e4073a50 0x7f72e419d630 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:53:53.063 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:53.063+0000 7f72e2ffd700 1 -- 192.168.123.105:0/4276397381 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f72e4073a50 msgr2=0x7f72e419d630 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:53:53.063 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:53.063+0000 7f72e2ffd700 1 --2- 192.168.123.105:0/4276397381 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f72e4073a50 0x7f72e419d630 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:53:53.063 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:53.063+0000 7f72e2ffd700 1 -- 192.168.123.105:0/4276397381 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f72cc0097e0 con 0x7f72e4073130 2026-03-10T07:53:53.063 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:53.063+0000 7f72e27fc700 1 --2- 192.168.123.105:0/4276397381 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f72e4073a50 0x7f72e419d630 unknown :-1 s=CLOSED 
pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_auth_reply_more state changed! 2026-03-10T07:53:53.063 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:53.063+0000 7f72e2ffd700 1 --2- 192.168.123.105:0/4276397381 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f72e4073130 0x7f72e419d0f0 secure :-1 s=READY pgs=23 cs=0 l=1 rev1=1 crypto rx=0x7f72cc0048c0 tx=0x7f72cc0049a0 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:53:53.064 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:53.063+0000 7f72e8806700 1 -- 192.168.123.105:0/4276397381 <== mon.1 v2:192.168.123.108:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f72cc01d070 con 0x7f72e4073130 2026-03-10T07:53:53.064 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:53.063+0000 7f72e9808700 1 -- 192.168.123.105:0/4276397381 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f72e41a1d80 con 0x7f72e4073130 2026-03-10T07:53:53.065 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:53.064+0000 7f72e9808700 1 -- 192.168.123.105:0/4276397381 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f72e41a22d0 con 0x7f72e4073130 2026-03-10T07:53:53.065 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:53.064+0000 7f72e8806700 1 -- 192.168.123.105:0/4276397381 <== mon.1 v2:192.168.123.108:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7f72cc00bc50 con 0x7f72e4073130 2026-03-10T07:53:53.065 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:53.064+0000 7f72e8806700 1 -- 192.168.123.105:0/4276397381 <== mon.1 v2:192.168.123.108:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f72cc00f670 con 0x7f72e4073130 2026-03-10T07:53:53.065 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:53.065+0000 7f72e8806700 1 -- 192.168.123.105:0/4276397381 <== mon.1 
v2:192.168.123.108:3300/0 4 ==== mgrmap(e 37) v1 ==== 100217+0+0 (secure 0 0 0) 0x7f72cc00f7d0 con 0x7f72e4073130 2026-03-10T07:53:53.065 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:53.065+0000 7f72e9808700 1 -- 192.168.123.105:0/4276397381 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f72e410f0c0 con 0x7f72e4073130 2026-03-10T07:53:53.066 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:53.066+0000 7f72e8806700 1 --2- 192.168.123.105:0/4276397381 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f72d00779a0 0x7f72d0079e60 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:53:53.066 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:53.066+0000 7f72e8806700 1 -- 192.168.123.105:0/4276397381 <== mon.1 v2:192.168.123.108:3300/0 5 ==== osd_map(54..54 src has 1..54) v4 ==== 6165+0+0 (secure 0 0 0) 0x7f72cc09b370 con 0x7f72e4073130 2026-03-10T07:53:53.066 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:53.066+0000 7f72e27fc700 1 --2- 192.168.123.105:0/4276397381 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f72d00779a0 0x7f72d0079e60 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:53:53.067 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:53.066+0000 7f72e27fc700 1 --2- 192.168.123.105:0/4276397381 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f72d00779a0 0x7f72d0079e60 secure :-1 s=READY pgs=39 cs=0 l=1 rev1=1 crypto rx=0x7f72d4005950 tx=0x7f72d40058e0 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:53:53.068 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:53.068+0000 7f72e8806700 1 -- 192.168.123.105:0/4276397381 <== mon.1 
v2:192.168.123.108:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f72cc0639e0 con 0x7f72e4073130 2026-03-10T07:53:53.210 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:53.210+0000 7f72e9808700 1 -- 192.168.123.105:0/4276397381 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "fs dump"} v 0) v1 -- 0x7f72e404ea90 con 0x7f72e4073130 2026-03-10T07:53:53.211 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:53.210+0000 7f72e8806700 1 -- 192.168.123.105:0/4276397381 <== mon.1 v2:192.168.123.108:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump"}]=0 dumped fsmap epoch 12 v12) v1 ==== 76+0+1915 (secure 0 0 0) 0x7f72cc027790 con 0x7f72e4073130 2026-03-10T07:53:53.211 INFO:teuthology.orchestra.run.vm05.stdout:e12 2026-03-10T07:53:53.211 INFO:teuthology.orchestra.run.vm05.stdout:btime 1970-01-01T00:00:00:000000+0000 2026-03-10T07:53:53.211 INFO:teuthology.orchestra.run.vm05.stdout:enable_multiple, ever_enabled_multiple: 1,1 2026-03-10T07:53:53.211 INFO:teuthology.orchestra.run.vm05.stdout:default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2} 2026-03-10T07:53:53.211 INFO:teuthology.orchestra.run.vm05.stdout:legacy client fscid: 1 2026-03-10T07:53:53.211 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-10T07:53:53.211 INFO:teuthology.orchestra.run.vm05.stdout:Filesystem 'cephfs' (1) 2026-03-10T07:53:53.211 INFO:teuthology.orchestra.run.vm05.stdout:fs_name cephfs 2026-03-10T07:53:53.211 INFO:teuthology.orchestra.run.vm05.stdout:epoch 12 2026-03-10T07:53:53.211 INFO:teuthology.orchestra.run.vm05.stdout:flags 12 joinable allow_snaps allow_multimds_snaps 2026-03-10T07:53:53.211 INFO:teuthology.orchestra.run.vm05.stdout:created 
2026-03-10T07:48:24.309293+0000 2026-03-10T07:53:53.211 INFO:teuthology.orchestra.run.vm05.stdout:modified 2026-03-10T07:48:33.425187+0000 2026-03-10T07:53:53.211 INFO:teuthology.orchestra.run.vm05.stdout:tableserver 0 2026-03-10T07:53:53.211 INFO:teuthology.orchestra.run.vm05.stdout:root 0 2026-03-10T07:53:53.211 INFO:teuthology.orchestra.run.vm05.stdout:session_timeout 60 2026-03-10T07:53:53.211 INFO:teuthology.orchestra.run.vm05.stdout:session_autoclose 300 2026-03-10T07:53:53.211 INFO:teuthology.orchestra.run.vm05.stdout:max_file_size 1099511627776 2026-03-10T07:53:53.211 INFO:teuthology.orchestra.run.vm05.stdout:max_xattr_size 65536 2026-03-10T07:53:53.211 INFO:teuthology.orchestra.run.vm05.stdout:required_client_features {} 2026-03-10T07:53:53.211 INFO:teuthology.orchestra.run.vm05.stdout:last_failure 0 2026-03-10T07:53:53.211 INFO:teuthology.orchestra.run.vm05.stdout:last_failure_osd_epoch 39 2026-03-10T07:53:53.212 INFO:teuthology.orchestra.run.vm05.stdout:compat compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2} 2026-03-10T07:53:53.212 INFO:teuthology.orchestra.run.vm05.stdout:max_mds 1 2026-03-10T07:53:53.212 INFO:teuthology.orchestra.run.vm05.stdout:in 0 2026-03-10T07:53:53.212 INFO:teuthology.orchestra.run.vm05.stdout:up {0=24297} 2026-03-10T07:53:53.212 INFO:teuthology.orchestra.run.vm05.stdout:failed 2026-03-10T07:53:53.212 INFO:teuthology.orchestra.run.vm05.stdout:damaged 2026-03-10T07:53:53.212 INFO:teuthology.orchestra.run.vm05.stdout:stopped 2026-03-10T07:53:53.212 INFO:teuthology.orchestra.run.vm05.stdout:data_pools [3] 2026-03-10T07:53:53.212 INFO:teuthology.orchestra.run.vm05.stdout:metadata_pool 2 2026-03-10T07:53:53.212 INFO:teuthology.orchestra.run.vm05.stdout:inline_data enabled 2026-03-10T07:53:53.212 
INFO:teuthology.orchestra.run.vm05.stdout:balancer 2026-03-10T07:53:53.212 INFO:teuthology.orchestra.run.vm05.stdout:bal_rank_mask -1 2026-03-10T07:53:53.212 INFO:teuthology.orchestra.run.vm05.stdout:standby_count_wanted 1 2026-03-10T07:53:53.212 INFO:teuthology.orchestra.run.vm05.stdout:qdb_cluster leader: 0 members: 2026-03-10T07:53:53.212 INFO:teuthology.orchestra.run.vm05.stdout:[mds.cephfs.vm05.omfhnh{0:24297} state up:active seq 5 join_fscid=1 addr [v2:192.168.123.105:6826/723078808,v1:192.168.123.105:6827/723078808] compat {c=[1],r=[1],i=[7ff]}] 2026-03-10T07:53:53.212 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-10T07:53:53.212 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-10T07:53:53.212 INFO:teuthology.orchestra.run.vm05.stdout:Standby daemons: 2026-03-10T07:53:53.212 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-10T07:53:53.212 INFO:teuthology.orchestra.run.vm05.stdout:[mds.cephfs.vm05.pavqil{-1:14512} state up:standby seq 2 join_fscid=1 addr [v2:192.168.123.105:6828/427555544,v1:192.168.123.105:6829/427555544] compat {c=[1],r=[1],i=[7ff]}] 2026-03-10T07:53:53.212 INFO:teuthology.orchestra.run.vm05.stdout:[mds.cephfs.vm08.ybmbgd{-1:14524} state up:standby seq 1 join_fscid=1 addr [v2:192.168.123.108:6824/3815585988,v1:192.168.123.108:6825/3815585988] compat {c=[1],r=[1],i=[7ff]}] 2026-03-10T07:53:53.212 INFO:teuthology.orchestra.run.vm05.stdout:[mds.cephfs.vm08.dgsaon{-1:24313} state up:standby seq 2 join_fscid=1 addr [v2:192.168.123.108:6826/2963085185,v1:192.168.123.108:6827/2963085185] compat {c=[1],r=[1],i=[7ff]}] 2026-03-10T07:53:53.214 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:53.213+0000 7f72e9808700 1 -- 192.168.123.105:0/4276397381 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f72d00779a0 msgr2=0x7f72d0079e60 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:53:53.214 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:53.214+0000 7f72e9808700 1 --2- 
192.168.123.105:0/4276397381 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f72d00779a0 0x7f72d0079e60 secure :-1 s=READY pgs=39 cs=0 l=1 rev1=1 crypto rx=0x7f72d4005950 tx=0x7f72d40058e0 comp rx=0 tx=0).stop 2026-03-10T07:53:53.214 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:53.214+0000 7f72e9808700 1 -- 192.168.123.105:0/4276397381 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f72e4073130 msgr2=0x7f72e419d0f0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:53:53.214 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:53.214+0000 7f72e9808700 1 --2- 192.168.123.105:0/4276397381 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f72e4073130 0x7f72e419d0f0 secure :-1 s=READY pgs=23 cs=0 l=1 rev1=1 crypto rx=0x7f72cc0048c0 tx=0x7f72cc0049a0 comp rx=0 tx=0).stop 2026-03-10T07:53:53.214 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:53.214+0000 7f72e9808700 1 -- 192.168.123.105:0/4276397381 shutdown_connections 2026-03-10T07:53:53.214 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:53.214+0000 7f72e9808700 1 --2- 192.168.123.105:0/4276397381 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f72e4073130 0x7f72e419d0f0 unknown :-1 s=CLOSED pgs=23 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:53:53.214 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:53.214+0000 7f72e9808700 1 --2- 192.168.123.105:0/4276397381 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f72d00779a0 0x7f72d0079e60 unknown :-1 s=CLOSED pgs=39 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:53:53.215 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:53.214+0000 7f72e9808700 1 --2- 192.168.123.105:0/4276397381 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f72e4073a50 0x7f72e419d630 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp 
rx=0 tx=0).stop 2026-03-10T07:53:53.215 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:53.214+0000 7f72e9808700 1 -- 192.168.123.105:0/4276397381 >> 192.168.123.105:0/4276397381 conn(0x7f72e40fc920 msgr2=0x7f72e4103450 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:53:53.215 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:53.214+0000 7f72e9808700 1 -- 192.168.123.105:0/4276397381 shutdown_connections 2026-03-10T07:53:53.215 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:53.214+0000 7f72e9808700 1 -- 192.168.123.105:0/4276397381 wait complete. 2026-03-10T07:53:53.215 INFO:teuthology.orchestra.run.vm05.stderr:dumped fsmap epoch 12 2026-03-10T07:53:53.283 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:53.283+0000 7fac43e70700 1 -- 192.168.123.105:0/1047570385 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fac3c0ffc10 msgr2=0x7fac3c10d260 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:53:53.284 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:53.283+0000 7fac43e70700 1 --2- 192.168.123.105:0/1047570385 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fac3c0ffc10 0x7fac3c10d260 secure :-1 s=READY pgs=58 cs=0 l=1 rev1=1 crypto rx=0x7fac38009b50 tx=0x7fac38009e60 comp rx=0 tx=0).stop 2026-03-10T07:53:53.285 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:53.285+0000 7fac43e70700 1 -- 192.168.123.105:0/1047570385 shutdown_connections 2026-03-10T07:53:53.285 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:53.285+0000 7fac43e70700 1 --2- 192.168.123.105:0/1047570385 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fac3c0ffc10 0x7fac3c10d260 unknown :-1 s=CLOSED pgs=58 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:53:53.285 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:53.285+0000 7fac43e70700 1 --2- 192.168.123.105:0/1047570385 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] 
conn(0x7fac3c0ff260 0x7fac3c0ff640 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:53:53.285 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:53.285+0000 7fac43e70700 1 -- 192.168.123.105:0/1047570385 >> 192.168.123.105:0/1047570385 conn(0x7fac3c074bd0 msgr2=0x7fac3c074fe0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:53:53.288 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:53.288+0000 7fac43e70700 1 -- 192.168.123.105:0/1047570385 shutdown_connections 2026-03-10T07:53:53.289 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:53.288+0000 7fac43e70700 1 -- 192.168.123.105:0/1047570385 wait complete. 2026-03-10T07:53:53.289 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:53.289+0000 7fac43e70700 1 Processor -- start 2026-03-10T07:53:53.289 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:53.289+0000 7fac43e70700 1 -- start start 2026-03-10T07:53:53.289 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:53.289+0000 7fac43e70700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fac3c0ff260 0x7fac3c10cd40 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:53:53.289 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:53.289+0000 7fac43e70700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fac3c0ffc10 0x7fac3c103c90 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:53:53.289 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:53.289+0000 7fac43e70700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fac3c10d440 con 0x7fac3c0ffc10 2026-03-10T07:53:53.289 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:53.289+0000 7fac43e70700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fac3c10d5b0 con 0x7fac3c0ff260 
2026-03-10T07:53:53.289 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:53.289+0000 7fac41c0c700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fac3c0ff260 0x7fac3c10cd40 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:53:53.290 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:53.289+0000 7fac41c0c700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fac3c0ff260 0x7fac3c10cd40 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.108:3300/0 says I am v2:192.168.123.105:33698/0 (socket says 192.168.123.105:33698) 2026-03-10T07:53:53.290 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:53.289+0000 7fac41c0c700 1 -- 192.168.123.105:0/30746031 learned_addr learned my addr 192.168.123.105:0/30746031 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T07:53:53.290 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:53.289+0000 7fac41c0c700 1 -- 192.168.123.105:0/30746031 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fac3c0ffc10 msgr2=0x7fac3c103c90 unknown :-1 s=STATE_CONNECTING l=1).mark_down 2026-03-10T07:53:53.290 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:53.290+0000 7fac4140b700 1 --2- 192.168.123.105:0/30746031 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fac3c0ffc10 0x7fac3c103c90 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:53:53.290 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:53.290+0000 7fac41c0c700 1 --2- 192.168.123.105:0/30746031 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fac3c0ffc10 0x7fac3c103c90 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:53:53.290 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:53.290+0000 7fac41c0c700 1 -- 192.168.123.105:0/30746031 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fac380097e0 con 0x7fac3c0ff260 2026-03-10T07:53:53.290 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:53.290+0000 7fac4140b700 1 --2- 192.168.123.105:0/30746031 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fac3c0ffc10 0x7fac3c103c90 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request state changed! 2026-03-10T07:53:53.290 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:53.290+0000 7fac41c0c700 1 --2- 192.168.123.105:0/30746031 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fac3c0ff260 0x7fac3c10cd40 secure :-1 s=READY pgs=24 cs=0 l=1 rev1=1 crypto rx=0x7fac30009fd0 tx=0x7fac3000eea0 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:53:53.290 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:53.290+0000 7fac2effd700 1 -- 192.168.123.105:0/30746031 <== mon.1 v2:192.168.123.108:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fac30009980 con 0x7fac3c0ff260 2026-03-10T07:53:53.291 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:53.290+0000 7fac43e70700 1 -- 192.168.123.105:0/30746031 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fac3c1042f0 con 0x7fac3c0ff260 2026-03-10T07:53:53.291 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:53.290+0000 7fac2effd700 1 -- 192.168.123.105:0/30746031 <== mon.1 v2:192.168.123.108:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7fac30004500 con 0x7fac3c0ff260 2026-03-10T07:53:53.291 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:53.290+0000 7fac2effd700 1 -- 192.168.123.105:0/30746031 <== mon.1 v2:192.168.123.108:3300/0 3 ==== mon_map magic: 0 
v1 ==== 327+0+0 (secure 0 0 0) 0x7fac30010450 con 0x7fac3c0ff260 2026-03-10T07:53:53.291 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:53.290+0000 7fac43e70700 1 -- 192.168.123.105:0/30746031 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fac3c104810 con 0x7fac3c0ff260 2026-03-10T07:53:53.292 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:53.292+0000 7fac2effd700 1 -- 192.168.123.105:0/30746031 <== mon.1 v2:192.168.123.108:3300/0 4 ==== mgrmap(e 37) v1 ==== 100217+0+0 (secure 0 0 0) 0x7fac3000cca0 con 0x7fac3c0ff260 2026-03-10T07:53:53.292 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:53.292+0000 7fac2effd700 1 --2- 192.168.123.105:0/30746031 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7fac28077a50 0x7fac28079f10 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:53:53.293 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:53.293+0000 7fac2effd700 1 -- 192.168.123.105:0/30746031 <== mon.1 v2:192.168.123.108:3300/0 5 ==== osd_map(54..54 src has 1..54) v4 ==== 6165+0+0 (secure 0 0 0) 0x7fac30014070 con 0x7fac3c0ff260 2026-03-10T07:53:53.293 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:53.293+0000 7fac4140b700 1 --2- 192.168.123.105:0/30746031 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7fac28077a50 0x7fac28079f10 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:53:53.293 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:53.293+0000 7fac43e70700 1 -- 192.168.123.105:0/30746031 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fac3c104af0 con 0x7fac3c0ff260 2026-03-10T07:53:53.293 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:53.293+0000 7fac4140b700 
1 --2- 192.168.123.105:0/30746031 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7fac28077a50 0x7fac28079f10 secure :-1 s=READY pgs=40 cs=0 l=1 rev1=1 crypto rx=0x7fac38006010 tx=0x7fac38005c20 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:53:53.296 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:53.296+0000 7fac2effd700 1 -- 192.168.123.105:0/30746031 <== mon.1 v2:192.168.123.108:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7fac30062860 con 0x7fac3c0ff260 2026-03-10T07:53:53.419 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:53.419+0000 7fac43e70700 1 -- 192.168.123.105:0/30746031 --> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7fac3c1051c0 con 0x7fac28077a50 2026-03-10T07:53:53.420 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:53.420+0000 7fac2effd700 1 -- 192.168.123.105:0/30746031 <== mgr.34104 v2:192.168.123.105:6800/627686298 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+400 (secure 0 0 0) 0x7fac3c1051c0 con 0x7fac28077a50 2026-03-10T07:53:53.420 INFO:teuthology.orchestra.run.vm05.stdout:{ 2026-03-10T07:53:53.420 INFO:teuthology.orchestra.run.vm05.stdout: "target_image": "quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc", 2026-03-10T07:53:53.420 INFO:teuthology.orchestra.run.vm05.stdout: "in_progress": true, 2026-03-10T07:53:53.420 INFO:teuthology.orchestra.run.vm05.stdout: "which": "Upgrading all daemon types on all hosts", 2026-03-10T07:53:53.420 INFO:teuthology.orchestra.run.vm05.stdout: "services_complete": [ 2026-03-10T07:53:53.420 INFO:teuthology.orchestra.run.vm05.stdout: "mgr", 2026-03-10T07:53:53.420 INFO:teuthology.orchestra.run.vm05.stdout: "crash", 2026-03-10T07:53:53.420 
INFO:teuthology.orchestra.run.vm05.stdout: "mon" 2026-03-10T07:53:53.420 INFO:teuthology.orchestra.run.vm05.stdout: ], 2026-03-10T07:53:53.420 INFO:teuthology.orchestra.run.vm05.stdout: "progress": "7/23 daemons upgraded", 2026-03-10T07:53:53.420 INFO:teuthology.orchestra.run.vm05.stdout: "message": "Currently upgrading osd daemons", 2026-03-10T07:53:53.420 INFO:teuthology.orchestra.run.vm05.stdout: "is_paused": false 2026-03-10T07:53:53.420 INFO:teuthology.orchestra.run.vm05.stdout:} 2026-03-10T07:53:53.422 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:53.422+0000 7fac43e70700 1 -- 192.168.123.105:0/30746031 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7fac28077a50 msgr2=0x7fac28079f10 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:53:53.423 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:53.422+0000 7fac43e70700 1 --2- 192.168.123.105:0/30746031 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7fac28077a50 0x7fac28079f10 secure :-1 s=READY pgs=40 cs=0 l=1 rev1=1 crypto rx=0x7fac38006010 tx=0x7fac38005c20 comp rx=0 tx=0).stop 2026-03-10T07:53:53.423 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:53.422+0000 7fac43e70700 1 -- 192.168.123.105:0/30746031 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fac3c0ff260 msgr2=0x7fac3c10cd40 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:53:53.423 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:53.422+0000 7fac43e70700 1 --2- 192.168.123.105:0/30746031 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fac3c0ff260 0x7fac3c10cd40 secure :-1 s=READY pgs=24 cs=0 l=1 rev1=1 crypto rx=0x7fac30009fd0 tx=0x7fac3000eea0 comp rx=0 tx=0).stop 2026-03-10T07:53:53.423 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:53.422+0000 7fac43e70700 1 -- 192.168.123.105:0/30746031 shutdown_connections 2026-03-10T07:53:53.423 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:53.422+0000 7fac43e70700 1 --2- 192.168.123.105:0/30746031 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fac3c0ff260 0x7fac3c10cd40 unknown :-1 s=CLOSED pgs=24 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:53:53.423 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:53.422+0000 7fac43e70700 1 --2- 192.168.123.105:0/30746031 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7fac28077a50 0x7fac28079f10 unknown :-1 s=CLOSED pgs=40 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:53:53.423 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:53.422+0000 7fac43e70700 1 --2- 192.168.123.105:0/30746031 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fac3c0ffc10 0x7fac3c103c90 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:53:53.423 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:53.422+0000 7fac43e70700 1 -- 192.168.123.105:0/30746031 >> 192.168.123.105:0/30746031 conn(0x7fac3c074bd0 msgr2=0x7fac3c0fcc70 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:53:53.423 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:53.423+0000 7fac43e70700 1 -- 192.168.123.105:0/30746031 shutdown_connections 2026-03-10T07:53:53.423 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:53.423+0000 7fac43e70700 1 -- 192.168.123.105:0/30746031 wait complete. 
2026-03-10T07:53:53.496 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:53.495+0000 7f9dc7fff700 1 -- 192.168.123.105:0/1130628785 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f9dc8103a50 msgr2=0x7f9dc8107aa0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:53:53.496 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:53.495+0000 7f9dc7fff700 1 --2- 192.168.123.105:0/1130628785 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f9dc8103a50 0x7f9dc8107aa0 secure :-1 s=READY pgs=59 cs=0 l=1 rev1=1 crypto rx=0x7f9db0009b00 tx=0x7f9db0009e10 comp rx=0 tx=0).stop 2026-03-10T07:53:53.496 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:53.496+0000 7f9dc7fff700 1 -- 192.168.123.105:0/1130628785 shutdown_connections 2026-03-10T07:53:53.496 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:53.496+0000 7f9dc7fff700 1 --2- 192.168.123.105:0/1130628785 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f9dc8103a50 0x7f9dc8107aa0 unknown :-1 s=CLOSED pgs=59 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:53:53.496 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:53.496+0000 7f9dc7fff700 1 --2- 192.168.123.105:0/1130628785 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f9dc81030a0 0x7f9dc8103480 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:53:53.496 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:53.496+0000 7f9dc7fff700 1 -- 192.168.123.105:0/1130628785 >> 192.168.123.105:0/1130628785 conn(0x7f9dc80fe930 msgr2=0x7f9dc8100d50 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:53:53.496 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:53.496+0000 7f9dc7fff700 1 -- 192.168.123.105:0/1130628785 shutdown_connections 2026-03-10T07:53:53.496 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:53.496+0000 7f9dc7fff700 1 -- 192.168.123.105:0/1130628785 
wait complete. 2026-03-10T07:53:53.497 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:53.497+0000 7f9dc7fff700 1 Processor -- start 2026-03-10T07:53:53.497 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:53.497+0000 7f9dc7fff700 1 -- start start 2026-03-10T07:53:53.497 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:53.497+0000 7f9dc7fff700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f9dc81030a0 0x7f9dc819e930 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:53:53.497 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:53.497+0000 7f9dc7fff700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f9dc8103a50 0x7f9dc819ee70 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:53:53.498 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:53.497+0000 7f9dc7fff700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f9dc819f500 con 0x7f9dc81030a0 2026-03-10T07:53:53.498 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:53.497+0000 7f9dc7fff700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f9dc81989b0 con 0x7f9dc8103a50 2026-03-10T07:53:53.498 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:53.498+0000 7f9dbffff700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f9dc8103a50 0x7f9dc819ee70 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:53:53.498 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:53.498+0000 7f9dbffff700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f9dc8103a50 0x7f9dc819ee70 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.108:3300/0 
says I am v2:192.168.123.105:33714/0 (socket says 192.168.123.105:33714) 2026-03-10T07:53:53.498 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:53.498+0000 7f9dbffff700 1 -- 192.168.123.105:0/4003738678 learned_addr learned my addr 192.168.123.105:0/4003738678 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T07:53:53.499 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:53.498+0000 7f9dc6ffd700 1 --2- 192.168.123.105:0/4003738678 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f9dc81030a0 0x7f9dc819e930 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:53:53.499 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:53.498+0000 7f9dbffff700 1 -- 192.168.123.105:0/4003738678 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f9dc81030a0 msgr2=0x7f9dc819e930 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:53:53.499 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:53.498+0000 7f9dbffff700 1 --2- 192.168.123.105:0/4003738678 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f9dc81030a0 0x7f9dc819e930 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:53:53.499 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:53.498+0000 7f9dbffff700 1 -- 192.168.123.105:0/4003738678 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f9db00097e0 con 0x7f9dc8103a50 2026-03-10T07:53:53.499 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:53.499+0000 7f9dc6ffd700 1 --2- 192.168.123.105:0/4003738678 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f9dc81030a0 0x7f9dc819e930 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request state changed! 
2026-03-10T07:53:53.500 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:53.499+0000 7f9dbffff700 1 --2- 192.168.123.105:0/4003738678 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f9dc8103a50 0x7f9dc819ee70 secure :-1 s=READY pgs=25 cs=0 l=1 rev1=1 crypto rx=0x7f9db0000c00 tx=0x7f9db0004970 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:53:53.500 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:53.500+0000 7f9dc4ff9700 1 -- 192.168.123.105:0/4003738678 <== mon.1 v2:192.168.123.108:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f9db001d070 con 0x7f9dc8103a50 2026-03-10T07:53:53.500 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:53.500+0000 7f9dc7fff700 1 -- 192.168.123.105:0/4003738678 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f9dc8198c30 con 0x7f9dc8103a50 2026-03-10T07:53:53.501 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:53.500+0000 7f9dc4ff9700 1 -- 192.168.123.105:0/4003738678 <== mon.1 v2:192.168.123.108:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7f9db000bc50 con 0x7f9dc8103a50 2026-03-10T07:53:53.501 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:53.500+0000 7f9dc4ff9700 1 -- 192.168.123.105:0/4003738678 <== mon.1 v2:192.168.123.108:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f9db000f700 con 0x7f9dc8103a50 2026-03-10T07:53:53.501 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:53.500+0000 7f9dc7fff700 1 -- 192.168.123.105:0/4003738678 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f9dc8199120 con 0x7f9dc8103a50 2026-03-10T07:53:53.501 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:53.501+0000 7f9dc7fff700 1 -- 192.168.123.105:0/4003738678 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": 
"get_command_descriptions"} v 0) v1 -- 0x7f9dc810b3b0 con 0x7f9dc8103a50 2026-03-10T07:53:53.501 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:53.501+0000 7f9dc4ff9700 1 -- 192.168.123.105:0/4003738678 <== mon.1 v2:192.168.123.108:3300/0 4 ==== mgrmap(e 37) v1 ==== 100217+0+0 (secure 0 0 0) 0x7f9db0022a50 con 0x7f9dc8103a50 2026-03-10T07:53:53.502 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:53.502+0000 7f9dc4ff9700 1 --2- 192.168.123.105:0/4003738678 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f9db40779a0 0x7f9db4079e60 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:53:53.502 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:53.502+0000 7f9dc6ffd700 1 --2- 192.168.123.105:0/4003738678 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f9db40779a0 0x7f9db4079e60 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:53:53.502 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:53.502+0000 7f9dc4ff9700 1 -- 192.168.123.105:0/4003738678 <== mon.1 v2:192.168.123.108:3300/0 5 ==== osd_map(55..55 src has 1..55) v4 ==== 6165+0+0 (secure 0 0 0) 0x7f9db009b320 con 0x7f9dc8103a50 2026-03-10T07:53:53.503 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:53.503+0000 7f9dc6ffd700 1 --2- 192.168.123.105:0/4003738678 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f9db40779a0 0x7f9db4079e60 secure :-1 s=READY pgs=41 cs=0 l=1 rev1=1 crypto rx=0x7f9db8005fd0 tx=0x7f9db8005ee0 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:53:53.505 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:53.505+0000 7f9dc4ff9700 1 -- 192.168.123.105:0/4003738678 <== mon.1 v2:192.168.123.108:3300/0 6 ==== mon_command_ack([{"prefix": 
"get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f9db00639c0 con 0x7f9dc8103a50 2026-03-10T07:53:53.657 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:53:53 vm05.local ceph-mon[130117]: from='client.44149 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T07:53:53.657 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:53:53 vm05.local ceph-mon[130117]: osdmap e54: 6 total, 6 up, 6 in 2026-03-10T07:53:53.657 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:53:53 vm05.local ceph-mon[130117]: from='client.34178 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T07:53:53.657 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:53:53 vm05.local ceph-mon[130117]: from='client.34182 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T07:53:53.657 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:53:53 vm05.local ceph-mon[130117]: from='client.? 192.168.123.105:0/2761631643' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T07:53:53.657 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:53:53 vm05.local ceph-mon[130117]: from='client.? 
192.168.123.105:0/4276397381' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch 2026-03-10T07:53:53.663 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:53.663+0000 7f9dc7fff700 1 -- 192.168.123.105:0/4003738678 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "health", "detail": "detail"} v 0) v1 -- 0x7f9dc804f2e0 con 0x7f9dc8103a50 2026-03-10T07:53:53.664 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:53.664+0000 7f9dc4ff9700 1 -- 192.168.123.105:0/4003738678 <== mon.1 v2:192.168.123.108:3300/0 7 ==== mon_command_ack([{"prefix": "health", "detail": "detail"}]=0 v0) v1 ==== 74+0+587 (secure 0 0 0) 0x7f9db0027030 con 0x7f9dc8103a50 2026-03-10T07:53:53.664 INFO:teuthology.orchestra.run.vm05.stdout:HEALTH_WARN 1 filesystem with deprecated feature inline_data; Degraded data redundancy: 99/261 objects degraded (37.931%), 3 pgs degraded 2026-03-10T07:53:53.664 INFO:teuthology.orchestra.run.vm05.stdout:[WRN] FS_INLINE_DATA_DEPRECATED: 1 filesystem with deprecated feature inline_data 2026-03-10T07:53:53.664 INFO:teuthology.orchestra.run.vm05.stdout: fs cephfs has deprecated feature inline_data enabled. 
2026-03-10T07:53:53.664 INFO:teuthology.orchestra.run.vm05.stdout:[WRN] PG_DEGRADED: Degraded data redundancy: 99/261 objects degraded (37.931%), 3 pgs degraded 2026-03-10T07:53:53.664 INFO:teuthology.orchestra.run.vm05.stdout: pg 3.c is active+recovery_wait+undersized+degraded+remapped, acting [5,3] 2026-03-10T07:53:53.664 INFO:teuthology.orchestra.run.vm05.stdout: pg 3.f is active+recovering+undersized+degraded+remapped, acting [5,3] 2026-03-10T07:53:53.664 INFO:teuthology.orchestra.run.vm05.stdout: pg 3.11 is active+recovery_wait+degraded, acting [3,4,0] 2026-03-10T07:53:53.667 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:53.667+0000 7f9dc7fff700 1 -- 192.168.123.105:0/4003738678 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f9db40779a0 msgr2=0x7f9db4079e60 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:53:53.667 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:53.667+0000 7f9dc7fff700 1 --2- 192.168.123.105:0/4003738678 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f9db40779a0 0x7f9db4079e60 secure :-1 s=READY pgs=41 cs=0 l=1 rev1=1 crypto rx=0x7f9db8005fd0 tx=0x7f9db8005ee0 comp rx=0 tx=0).stop 2026-03-10T07:53:53.667 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:53.667+0000 7f9dc7fff700 1 -- 192.168.123.105:0/4003738678 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f9dc8103a50 msgr2=0x7f9dc819ee70 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:53:53.667 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:53.667+0000 7f9dc7fff700 1 --2- 192.168.123.105:0/4003738678 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f9dc8103a50 0x7f9dc819ee70 secure :-1 s=READY pgs=25 cs=0 l=1 rev1=1 crypto rx=0x7f9db0000c00 tx=0x7f9db0004970 comp rx=0 tx=0).stop 2026-03-10T07:53:53.667 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:53.667+0000 7f9dc7fff700 1 -- 
192.168.123.105:0/4003738678 shutdown_connections 2026-03-10T07:53:53.667 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:53.667+0000 7f9dc7fff700 1 --2- 192.168.123.105:0/4003738678 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f9db40779a0 0x7f9db4079e60 unknown :-1 s=CLOSED pgs=41 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:53:53.667 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:53.667+0000 7f9dc7fff700 1 --2- 192.168.123.105:0/4003738678 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f9dc81030a0 0x7f9dc819e930 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:53:53.667 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:53.667+0000 7f9dc7fff700 1 --2- 192.168.123.105:0/4003738678 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f9dc8103a50 0x7f9dc819ee70 unknown :-1 s=CLOSED pgs=25 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:53:53.667 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:53.667+0000 7f9dc7fff700 1 -- 192.168.123.105:0/4003738678 >> 192.168.123.105:0/4003738678 conn(0x7f9dc80fe930 msgr2=0x7f9dc8107310 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:53:53.667 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:53.667+0000 7f9dc7fff700 1 -- 192.168.123.105:0/4003738678 shutdown_connections 2026-03-10T07:53:53.667 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:53:53.667+0000 7f9dc7fff700 1 -- 192.168.123.105:0/4003738678 wait complete. 
2026-03-10T07:53:53.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:53:53 vm08.local ceph-mon[107898]: from='client.44149 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T07:53:53.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:53:53 vm08.local ceph-mon[107898]: osdmap e54: 6 total, 6 up, 6 in 2026-03-10T07:53:53.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:53:53 vm08.local ceph-mon[107898]: from='client.34178 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T07:53:53.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:53:53 vm08.local ceph-mon[107898]: from='client.34182 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T07:53:53.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:53:53 vm08.local ceph-mon[107898]: from='client.? 192.168.123.105:0/2761631643' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T07:53:53.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:53:53 vm08.local ceph-mon[107898]: from='client.? 
192.168.123.105:0/4276397381' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch 2026-03-10T07:53:54.593 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:53:54 vm05.local ceph-mon[130117]: pgmap v57: 65 pgs: 1 active+recovery_wait+undersized+degraded+remapped, 1 active+recovery_wait+degraded, 1 active+recovering+undersized+degraded+remapped, 62 active+clean; 255 MiB data, 2.8 GiB used, 117 GiB / 120 GiB avail; 13 KiB/s rd, 78 KiB/s wr, 56 op/s; 99/261 objects degraded (37.931%); 0 B/s, 9 objects/s recovering 2026-03-10T07:53:54.593 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:53:54 vm05.local ceph-mon[130117]: from='client.44161 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T07:53:54.593 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:53:54 vm05.local ceph-mon[130117]: osdmap e55: 6 total, 6 up, 6 in 2026-03-10T07:53:54.593 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:53:54 vm05.local ceph-mon[130117]: from='client.? 
192.168.123.105:0/4003738678' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch 2026-03-10T07:53:54.593 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:53:54 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "osd ok-to-stop", "ids": ["1"], "max": 16}]: dispatch 2026-03-10T07:53:54.593 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:53:54 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:53:54.593 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:53:54 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "auth get", "entity": "osd.1"}]: dispatch 2026-03-10T07:53:54.593 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:53:54 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T07:53:54.593 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:53:54 vm05.local ceph-mon[130117]: Health check update: Degraded data redundancy: 99/261 objects degraded (37.931%), 3 pgs degraded (PG_DEGRADED) 2026-03-10T07:53:54.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:53:54 vm08.local ceph-mon[107898]: pgmap v57: 65 pgs: 1 active+recovery_wait+undersized+degraded+remapped, 1 active+recovery_wait+degraded, 1 active+recovering+undersized+degraded+remapped, 62 active+clean; 255 MiB data, 2.8 GiB used, 117 GiB / 120 GiB avail; 13 KiB/s rd, 78 KiB/s wr, 56 op/s; 99/261 objects degraded (37.931%); 0 B/s, 9 objects/s recovering 2026-03-10T07:53:54.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:53:54 vm08.local ceph-mon[107898]: from='client.44161 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T07:53:54.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:53:54 
vm08.local ceph-mon[107898]: osdmap e55: 6 total, 6 up, 6 in 2026-03-10T07:53:54.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:53:54 vm08.local ceph-mon[107898]: from='client.? 192.168.123.105:0/4003738678' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch 2026-03-10T07:53:54.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:53:54 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "osd ok-to-stop", "ids": ["1"], "max": 16}]: dispatch 2026-03-10T07:53:54.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:53:54 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:53:54.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:53:54 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "auth get", "entity": "osd.1"}]: dispatch 2026-03-10T07:53:54.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:53:54 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T07:53:54.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:53:54 vm08.local ceph-mon[107898]: Health check update: Degraded data redundancy: 99/261 objects degraded (37.931%), 3 pgs degraded (PG_DEGRADED) 2026-03-10T07:53:55.157 INFO:journalctl@ceph.osd.1.vm05.stdout:Mar 10 07:53:54 vm05.local systemd[1]: Stopping Ceph osd.1 for 12e9780e-1c55-11f1-8896-79f7c2e9b508... 
2026-03-10T07:53:55.157 INFO:journalctl@ceph.osd.1.vm05.stdout:Mar 10 07:53:54 vm05.local ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508-osd-1[75139]: 2026-03-10T07:53:54.967+0000 7fbe66397700 -1 received signal: Terminated from /run/podman-init -- /usr/bin/ceph-osd -n osd.1 -f --setuser ceph --setgroup ceph --default-log-to-file=false --default-log-to-journald=true --default-log-to-stderr=false (PID: 1) UID: 0 2026-03-10T07:53:55.157 INFO:journalctl@ceph.osd.1.vm05.stdout:Mar 10 07:53:54 vm05.local ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508-osd-1[75139]: 2026-03-10T07:53:54.967+0000 7fbe66397700 -1 osd.1 55 *** Got signal Terminated *** 2026-03-10T07:53:55.157 INFO:journalctl@ceph.osd.1.vm05.stdout:Mar 10 07:53:54 vm05.local ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508-osd-1[75139]: 2026-03-10T07:53:54.967+0000 7fbe66397700 -1 osd.1 55 *** Immediate shutdown (osd_fast_shutdown=true) *** 2026-03-10T07:53:55.855 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:53:55 vm05.local ceph-mon[130117]: from='mon.0 -' entity='mon.' 
cmd=[{"prefix": "osd ok-to-stop", "ids": ["1"], "max": 16}]: dispatch 2026-03-10T07:53:55.855 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:53:55 vm05.local ceph-mon[130117]: Upgrade: osd.1 is safe to restart 2026-03-10T07:53:55.855 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:53:55 vm05.local ceph-mon[130117]: Upgrade: Updating osd.1 2026-03-10T07:53:55.855 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:53:55 vm05.local ceph-mon[130117]: Deploying daemon osd.1 on vm05 2026-03-10T07:53:55.855 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:53:55 vm05.local ceph-mon[130117]: osd.1 marked itself down and dead 2026-03-10T07:53:55.855 INFO:journalctl@ceph.osd.1.vm05.stdout:Mar 10 07:53:55 vm05.local podman[143747]: 2026-03-10 07:53:55.561200649 +0000 UTC m=+0.608225320 container died 88e0b65b2c932114dfc705443f0d1a867cabc7981f50d58a28432c2ee3dfa07d (image=quay.io/ceph/ceph@sha256:9f35728f6070a596500c0804814a12ab6b98e05067316dc64876fb4b28d04af3, name=ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508-osd-1, CEPH_POINT_RELEASE=-18.2.1, GIT_CLEAN=True, GIT_COMMIT=1617517b9622744995c4661989e3c30d036a7cfd, io.buildah.version=1.29.1, maintainer=Guillaume Abrioux , ceph=True, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, GIT_BRANCH=HEAD, GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=HEAD, org.label-schema.build-date=20240222, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 8 Base Image) 2026-03-10T07:53:55.855 INFO:journalctl@ceph.osd.1.vm05.stdout:Mar 10 07:53:55 vm05.local podman[143747]: 2026-03-10 07:53:55.581801016 +0000 UTC m=+0.628825687 container remove 88e0b65b2c932114dfc705443f0d1a867cabc7981f50d58a28432c2ee3dfa07d (image=quay.io/ceph/ceph@sha256:9f35728f6070a596500c0804814a12ab6b98e05067316dc64876fb4b28d04af3, name=ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508-osd-1, maintainer=Guillaume Abrioux , org.label-schema.vendor=CentOS, CEPH_POINT_RELEASE=-18.2.1, RELEASE=HEAD, ceph=True, 
io.buildah.version=1.29.1, org.label-schema.build-date=20240222, GIT_BRANCH=HEAD, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 8 Base Image, GIT_REPO=https://github.com/ceph/ceph-container.git, org.label-schema.schema-version=1.0, GIT_CLEAN=True, GIT_COMMIT=1617517b9622744995c4661989e3c30d036a7cfd) 2026-03-10T07:53:55.855 INFO:journalctl@ceph.osd.1.vm05.stdout:Mar 10 07:53:55 vm05.local bash[143747]: ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508-osd-1 2026-03-10T07:53:55.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:53:55 vm08.local ceph-mon[107898]: from='mon.0 -' entity='mon.' cmd=[{"prefix": "osd ok-to-stop", "ids": ["1"], "max": 16}]: dispatch 2026-03-10T07:53:55.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:53:55 vm08.local ceph-mon[107898]: Upgrade: osd.1 is safe to restart 2026-03-10T07:53:55.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:53:55 vm08.local ceph-mon[107898]: Upgrade: Updating osd.1 2026-03-10T07:53:55.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:53:55 vm08.local ceph-mon[107898]: Deploying daemon osd.1 on vm05 2026-03-10T07:53:55.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:53:55 vm08.local ceph-mon[107898]: osd.1 marked itself down and dead 2026-03-10T07:53:56.109 INFO:journalctl@ceph.osd.1.vm05.stdout:Mar 10 07:53:55 vm05.local podman[143811]: 2026-03-10 07:53:55.756174006 +0000 UTC m=+0.011352625 image pull 654f31e6858eb235bbece362255b685a945f2b6a367e2b88c4930c984fbb214c quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc 2026-03-10T07:53:56.109 INFO:journalctl@ceph.osd.1.vm05.stdout:Mar 10 07:53:56 vm05.local podman[143811]: 2026-03-10 07:53:56.109464376 +0000 UTC m=+0.364642984 container create 59af0ccdf63efdc38b6f05a6d3ded979f361dc7d7c313a9a1273650288364241 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, 
name=ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508-osd-1-deactivate, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team , CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, ceph=True, org.label-schema.license=GPLv2, org.label-schema.build-date=20260223, org.label-schema.schema-version=1.0) 2026-03-10T07:53:56.360 INFO:journalctl@ceph.osd.1.vm05.stdout:Mar 10 07:53:56 vm05.local podman[143811]: 2026-03-10 07:53:56.155406259 +0000 UTC m=+0.410584877 container init 59af0ccdf63efdc38b6f05a6d3ded979f361dc7d7c313a9a1273650288364241 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508-osd-1-deactivate, ceph=True, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.41.3, org.label-schema.build-date=20260223, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team , CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git) 2026-03-10T07:53:56.360 INFO:journalctl@ceph.osd.1.vm05.stdout:Mar 10 07:53:56 vm05.local podman[143811]: 2026-03-10 07:53:56.158519688 +0000 UTC m=+0.413698296 container start 59af0ccdf63efdc38b6f05a6d3ded979f361dc7d7c313a9a1273650288364241 
(image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508-osd-1-deactivate, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.license=GPLv2, io.buildah.version=1.41.3, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, CEPH_REF=squid, org.label-schema.build-date=20260223, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git) 2026-03-10T07:53:56.360 INFO:journalctl@ceph.osd.1.vm05.stdout:Mar 10 07:53:56 vm05.local podman[143811]: 2026-03-10 07:53:56.162354627 +0000 UTC m=+0.417533245 container attach 59af0ccdf63efdc38b6f05a6d3ded979f361dc7d7c313a9a1273650288364241 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508-osd-1-deactivate, OSD_FLAVOR=default, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, org.label-schema.build-date=20260223, ceph=True) 2026-03-10T07:53:56.360 INFO:journalctl@ceph.osd.1.vm05.stdout:Mar 10 07:53:56 vm05.local conmon[143821]: conmon 59af0ccdf63efdc38b6f : Failed to 
open cgroups file: /sys/fs/cgroup/machine.slice/libpod-59af0ccdf63efdc38b6f05a6d3ded979f361dc7d7c313a9a1273650288364241.scope/container/memory.events 2026-03-10T07:53:56.360 INFO:journalctl@ceph.osd.1.vm05.stdout:Mar 10 07:53:56 vm05.local podman[143811]: 2026-03-10 07:53:56.310741738 +0000 UTC m=+0.565920346 container died 59af0ccdf63efdc38b6f05a6d3ded979f361dc7d7c313a9a1273650288364241 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508-osd-1-deactivate, org.opencontainers.image.authors=Ceph Release Team , io.buildah.version=1.41.3, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20260223, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/) 2026-03-10T07:53:56.360 INFO:journalctl@ceph.osd.1.vm05.stdout:Mar 10 07:53:56 vm05.local podman[143830]: 2026-03-10 07:53:56.346236571 +0000 UTC m=+0.034238562 container remove 59af0ccdf63efdc38b6f05a6d3ded979f361dc7d7c313a9a1273650288364241 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508-osd-1-deactivate, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, CEPH_REF=squid, OSD_FLAVOR=default, ceph=True, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20260223, org.opencontainers.image.authors=Ceph Release Team , FROM_IMAGE=quay.io/centos/centos:stream9, 
org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git) 2026-03-10T07:53:56.360 INFO:journalctl@ceph.osd.1.vm05.stdout:Mar 10 07:53:56 vm05.local systemd[1]: ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508@osd.1.service: Deactivated successfully. 2026-03-10T07:53:56.360 INFO:journalctl@ceph.osd.1.vm05.stdout:Mar 10 07:53:56 vm05.local systemd[1]: Stopped Ceph osd.1 for 12e9780e-1c55-11f1-8896-79f7c2e9b508. 2026-03-10T07:53:56.360 INFO:journalctl@ceph.osd.1.vm05.stdout:Mar 10 07:53:56 vm05.local systemd[1]: ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508@osd.1.service: Consumed 38.711s CPU time. 2026-03-10T07:53:56.657 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:53:56 vm05.local ceph-mon[130117]: pgmap v59: 65 pgs: 1 peering, 1 active+recovery_wait+undersized+degraded+remapped, 1 active+recovering, 62 active+clean; 255 MiB data, 2.8 GiB used, 117 GiB / 120 GiB avail; 127 B/s rd, 33 KiB/s wr, 1 op/s; 60/261 objects degraded (22.989%); 0 B/s, 11 objects/s recovering 2026-03-10T07:53:56.657 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:53:56 vm05.local ceph-mon[130117]: Health check failed: 1 osds down (OSD_DOWN) 2026-03-10T07:53:56.657 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:53:56 vm05.local ceph-mon[130117]: osdmap e56: 6 total, 5 up, 6 in 2026-03-10T07:53:56.657 INFO:journalctl@ceph.osd.1.vm05.stdout:Mar 10 07:53:56 vm05.local systemd[1]: Starting Ceph osd.1 for 12e9780e-1c55-11f1-8896-79f7c2e9b508... 
2026-03-10T07:53:56.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:53:56 vm08.local ceph-mon[107898]: pgmap v59: 65 pgs: 1 peering, 1 active+recovery_wait+undersized+degraded+remapped, 1 active+recovering, 62 active+clean; 255 MiB data, 2.8 GiB used, 117 GiB / 120 GiB avail; 127 B/s rd, 33 KiB/s wr, 1 op/s; 60/261 objects degraded (22.989%); 0 B/s, 11 objects/s recovering 2026-03-10T07:53:56.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:53:56 vm08.local ceph-mon[107898]: Health check failed: 1 osds down (OSD_DOWN) 2026-03-10T07:53:56.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:53:56 vm08.local ceph-mon[107898]: osdmap e56: 6 total, 5 up, 6 in 2026-03-10T07:53:57.116 INFO:journalctl@ceph.osd.1.vm05.stdout:Mar 10 07:53:56 vm05.local podman[143923]: 2026-03-10 07:53:56.717367812 +0000 UTC m=+0.025171547 container create a4b8ea45014f2bda873b41bfaef95260fc20137d476f6240a1d1e2eed891434e (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508-osd-1-activate, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, io.buildah.version=1.41.3, org.opencontainers.image.authors=Ceph Release Team , CEPH_REF=squid, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.build-date=20260223, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image) 2026-03-10T07:53:57.116 INFO:journalctl@ceph.osd.1.vm05.stdout:Mar 10 07:53:56 vm05.local podman[143923]: 2026-03-10 07:53:56.752301315 +0000 UTC m=+0.060105050 container init a4b8ea45014f2bda873b41bfaef95260fc20137d476f6240a1d1e2eed891434e 
(image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508-osd-1-activate, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.opencontainers.image.authors=Ceph Release Team , OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260223, CEPH_REF=squid, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0) 2026-03-10T07:53:57.116 INFO:journalctl@ceph.osd.1.vm05.stdout:Mar 10 07:53:56 vm05.local podman[143923]: 2026-03-10 07:53:56.762059223 +0000 UTC m=+0.069862958 container start a4b8ea45014f2bda873b41bfaef95260fc20137d476f6240a1d1e2eed891434e (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508-osd-1-activate, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.41.3, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team , OSD_FLAVOR=default, CEPH_REF=squid, org.label-schema.build-date=20260223, ceph=True) 2026-03-10T07:53:57.116 INFO:journalctl@ceph.osd.1.vm05.stdout:Mar 10 07:53:56 vm05.local podman[143923]: 2026-03-10 07:53:56.762946985 +0000 UTC 
m=+0.070750720 container attach a4b8ea45014f2bda873b41bfaef95260fc20137d476f6240a1d1e2eed891434e (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508-osd-1-activate, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20260223, org.label-schema.schema-version=1.0, CEPH_REF=squid, ceph=True, org.opencontainers.image.authors=Ceph Release Team , OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.opencontainers.image.documentation=https://docs.ceph.com/) 2026-03-10T07:53:57.116 INFO:journalctl@ceph.osd.1.vm05.stdout:Mar 10 07:53:56 vm05.local podman[143923]: 2026-03-10 07:53:56.704436341 +0000 UTC m=+0.012240076 image pull 654f31e6858eb235bbece362255b685a945f2b6a367e2b88c4930c984fbb214c quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc 2026-03-10T07:53:57.116 INFO:journalctl@ceph.osd.1.vm05.stdout:Mar 10 07:53:56 vm05.local ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508-osd-1-activate[143939]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-10T07:53:57.116 INFO:journalctl@ceph.osd.1.vm05.stdout:Mar 10 07:53:56 vm05.local bash[143923]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-10T07:53:57.116 INFO:journalctl@ceph.osd.1.vm05.stdout:Mar 10 07:53:56 vm05.local ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508-osd-1-activate[143939]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-10T07:53:57.116 INFO:journalctl@ceph.osd.1.vm05.stdout:Mar 10 07:53:56 vm05.local bash[143923]: Running command: /usr/bin/ceph-authtool --gen-print-key 
2026-03-10T07:53:57.657 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:53:57 vm05.local ceph-mon[130117]: osdmap e57: 6 total, 5 up, 6 in 2026-03-10T07:53:57.657 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:53:57 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:53:57.657 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:53:57 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T07:53:57.657 INFO:journalctl@ceph.osd.1.vm05.stdout:Mar 10 07:53:57 vm05.local ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508-osd-1-activate[143939]: --> Failed to activate via raw: did not find any matching OSD to activate 2026-03-10T07:53:57.658 INFO:journalctl@ceph.osd.1.vm05.stdout:Mar 10 07:53:57 vm05.local bash[143923]: --> Failed to activate via raw: did not find any matching OSD to activate 2026-03-10T07:53:57.658 INFO:journalctl@ceph.osd.1.vm05.stdout:Mar 10 07:53:57 vm05.local ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508-osd-1-activate[143939]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-10T07:53:57.658 INFO:journalctl@ceph.osd.1.vm05.stdout:Mar 10 07:53:57 vm05.local bash[143923]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-10T07:53:57.658 INFO:journalctl@ceph.osd.1.vm05.stdout:Mar 10 07:53:57 vm05.local ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508-osd-1-activate[143939]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-10T07:53:57.658 INFO:journalctl@ceph.osd.1.vm05.stdout:Mar 10 07:53:57 vm05.local bash[143923]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-10T07:53:57.658 INFO:journalctl@ceph.osd.1.vm05.stdout:Mar 10 07:53:57 vm05.local ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508-osd-1-activate[143939]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-1 2026-03-10T07:53:57.658 
INFO:journalctl@ceph.osd.1.vm05.stdout:Mar 10 07:53:57 vm05.local bash[143923]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-1 2026-03-10T07:53:57.658 INFO:journalctl@ceph.osd.1.vm05.stdout:Mar 10 07:53:57 vm05.local ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508-osd-1-activate[143939]: Running command: /usr/bin/ceph-bluestore-tool --cluster=ceph prime-osd-dir --dev /dev/ceph-fab81c80-66c1-4c26-9758-b1ba94f2e1d8/osd-block-725f8dca-94da-4c18-aefa-9e9f529cccd4 --path /var/lib/ceph/osd/ceph-1 --no-mon-config 2026-03-10T07:53:57.658 INFO:journalctl@ceph.osd.1.vm05.stdout:Mar 10 07:53:57 vm05.local bash[143923]: Running command: /usr/bin/ceph-bluestore-tool --cluster=ceph prime-osd-dir --dev /dev/ceph-fab81c80-66c1-4c26-9758-b1ba94f2e1d8/osd-block-725f8dca-94da-4c18-aefa-9e9f529cccd4 --path /var/lib/ceph/osd/ceph-1 --no-mon-config 2026-03-10T07:53:57.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:53:57 vm08.local ceph-mon[107898]: osdmap e57: 6 total, 5 up, 6 in 2026-03-10T07:53:57.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:53:57 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:53:57.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:53:57 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T07:53:58.158 INFO:journalctl@ceph.osd.1.vm05.stdout:Mar 10 07:53:57 vm05.local ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508-osd-1-activate[143939]: Running command: /usr/bin/ln -snf /dev/ceph-fab81c80-66c1-4c26-9758-b1ba94f2e1d8/osd-block-725f8dca-94da-4c18-aefa-9e9f529cccd4 /var/lib/ceph/osd/ceph-1/block 2026-03-10T07:53:58.158 INFO:journalctl@ceph.osd.1.vm05.stdout:Mar 10 07:53:57 vm05.local bash[143923]: Running command: /usr/bin/ln -snf /dev/ceph-fab81c80-66c1-4c26-9758-b1ba94f2e1d8/osd-block-725f8dca-94da-4c18-aefa-9e9f529cccd4 
/var/lib/ceph/osd/ceph-1/block 2026-03-10T07:53:58.158 INFO:journalctl@ceph.osd.1.vm05.stdout:Mar 10 07:53:57 vm05.local ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508-osd-1-activate[143939]: Running command: /usr/bin/chown -h ceph:ceph /var/lib/ceph/osd/ceph-1/block 2026-03-10T07:53:58.158 INFO:journalctl@ceph.osd.1.vm05.stdout:Mar 10 07:53:57 vm05.local bash[143923]: Running command: /usr/bin/chown -h ceph:ceph /var/lib/ceph/osd/ceph-1/block 2026-03-10T07:53:58.158 INFO:journalctl@ceph.osd.1.vm05.stdout:Mar 10 07:53:57 vm05.local ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508-osd-1-activate[143939]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-1 2026-03-10T07:53:58.158 INFO:journalctl@ceph.osd.1.vm05.stdout:Mar 10 07:53:57 vm05.local bash[143923]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-1 2026-03-10T07:53:58.158 INFO:journalctl@ceph.osd.1.vm05.stdout:Mar 10 07:53:57 vm05.local ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508-osd-1-activate[143939]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-1 2026-03-10T07:53:58.158 INFO:journalctl@ceph.osd.1.vm05.stdout:Mar 10 07:53:57 vm05.local bash[143923]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-1 2026-03-10T07:53:58.158 INFO:journalctl@ceph.osd.1.vm05.stdout:Mar 10 07:53:57 vm05.local ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508-osd-1-activate[143939]: --> ceph-volume lvm activate successful for osd ID: 1 2026-03-10T07:53:58.158 INFO:journalctl@ceph.osd.1.vm05.stdout:Mar 10 07:53:57 vm05.local bash[143923]: --> ceph-volume lvm activate successful for osd ID: 1 2026-03-10T07:53:58.158 INFO:journalctl@ceph.osd.1.vm05.stdout:Mar 10 07:53:57 vm05.local podman[143923]: 2026-03-10 07:53:57.74753551 +0000 UTC m=+1.055339245 container died a4b8ea45014f2bda873b41bfaef95260fc20137d476f6240a1d1e2eed891434e (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508-osd-1-activate, 
org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.41.3, org.opencontainers.image.authors=Ceph Release Team , GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, OSD_FLAVOR=default, CEPH_REF=squid, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260223, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git) 2026-03-10T07:53:58.158 INFO:journalctl@ceph.osd.1.vm05.stdout:Mar 10 07:53:57 vm05.local podman[143923]: 2026-03-10 07:53:57.769420393 +0000 UTC m=+1.077224128 container remove a4b8ea45014f2bda873b41bfaef95260fc20137d476f6240a1d1e2eed891434e (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508-osd-1-activate, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20260223, org.label-schema.license=GPLv2, ceph=True, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team , org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, CEPH_REF=squid) 2026-03-10T07:53:58.158 INFO:journalctl@ceph.osd.1.vm05.stdout:Mar 10 07:53:57 vm05.local podman[144184]: 2026-03-10 07:53:57.86820634 +0000 UTC m=+0.018458289 container create e3fe4ad5c6a3b0fbeba0080c9d5bed5b344f914552da3fbec9b9b3cc58f959c1 
(image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508-osd-1, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, org.label-schema.build-date=20260223, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team , CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.schema-version=1.0) 2026-03-10T07:53:58.158 INFO:journalctl@ceph.osd.1.vm05.stdout:Mar 10 07:53:57 vm05.local podman[144184]: 2026-03-10 07:53:57.902903269 +0000 UTC m=+0.053155218 container init e3fe4ad5c6a3b0fbeba0080c9d5bed5b344f914552da3fbec9b9b3cc58f959c1 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508-osd-1, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.build-date=20260223, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.41.3, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, org.label-schema.license=GPLv2) 2026-03-10T07:53:58.158 INFO:journalctl@ceph.osd.1.vm05.stdout:Mar 10 07:53:57 vm05.local podman[144184]: 2026-03-10 07:53:57.905863942 +0000 UTC m=+0.056115891 container 
start e3fe4ad5c6a3b0fbeba0080c9d5bed5b344f914552da3fbec9b9b3cc58f959c1 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508-osd-1, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team , CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, CEPH_REF=squid, ceph=True, org.label-schema.build-date=20260223) 2026-03-10T07:53:58.158 INFO:journalctl@ceph.osd.1.vm05.stdout:Mar 10 07:53:57 vm05.local bash[144184]: e3fe4ad5c6a3b0fbeba0080c9d5bed5b344f914552da3fbec9b9b3cc58f959c1 2026-03-10T07:53:58.158 INFO:journalctl@ceph.osd.1.vm05.stdout:Mar 10 07:53:57 vm05.local podman[144184]: 2026-03-10 07:53:57.861633785 +0000 UTC m=+0.011885734 image pull 654f31e6858eb235bbece362255b685a945f2b6a367e2b88c4930c984fbb214c quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc 2026-03-10T07:53:58.158 INFO:journalctl@ceph.osd.1.vm05.stdout:Mar 10 07:53:57 vm05.local systemd[1]: Started Ceph osd.1 for 12e9780e-1c55-11f1-8896-79f7c2e9b508. 
2026-03-10T07:53:58.518 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:53:58 vm05.local ceph-mon[130117]: pgmap v62: 65 pgs: 12 stale+active+clean, 1 peering, 1 active+recovery_wait+undersized+degraded+remapped, 1 active+recovering, 50 active+clean; 255 MiB data, 2.8 GiB used, 117 GiB / 120 GiB avail; 60/261 objects degraded (22.989%); 0 B/s, 11 objects/s recovering 2026-03-10T07:53:58.518 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:53:58 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:53:58.518 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:53:58 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:53:58.518 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:53:58 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T07:53:58.518 INFO:journalctl@ceph.osd.1.vm05.stdout:Mar 10 07:53:58 vm05.local ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508-osd-1[144194]: 2026-03-10T07:53:58.242+0000 7f267289f740 -1 Falling back to public interface 2026-03-10T07:53:58.820 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:53:58 vm08.local ceph-mon[107898]: pgmap v62: 65 pgs: 12 stale+active+clean, 1 peering, 1 active+recovery_wait+undersized+degraded+remapped, 1 active+recovering, 50 active+clean; 255 MiB data, 2.8 GiB used, 117 GiB / 120 GiB avail; 60/261 objects degraded (22.989%); 0 B/s, 11 objects/s recovering 2026-03-10T07:53:58.820 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:53:58 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:53:58.820 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:53:58 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:53:58.820 
INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:53:58 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T07:53:59.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:53:59 vm05.local ceph-mon[130117]: Health check update: Degraded data redundancy: 60/261 objects degraded (22.989%), 1 pg degraded (PG_DEGRADED) 2026-03-10T07:53:59.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:53:59 vm08.local ceph-mon[107898]: Health check update: Degraded data redundancy: 60/261 objects degraded (22.989%), 1 pg degraded (PG_DEGRADED) 2026-03-10T07:54:00.658 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:54:00 vm05.local ceph-mon[130117]: pgmap v63: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 18 active+undersized, 3 stale+active+clean, 11 active+undersized+degraded, 32 active+clean; 255 MiB data, 2.8 GiB used, 117 GiB / 120 GiB avail; 42/261 objects degraded (16.092%); 0 B/s, 15 objects/s recovering 2026-03-10T07:54:00.658 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:54:00 vm05.local ceph-mon[130117]: mgrmap e38: vm05.blexke(active, since 92s), standbys: vm08.orfpog 2026-03-10T07:54:00.658 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:54:00 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:54:00.658 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:54:00 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:54:00.658 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:54:00 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:54:00.658 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:54:00 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:54:00.918 
INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:54:00 vm08.local ceph-mon[107898]: pgmap v63: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 18 active+undersized, 3 stale+active+clean, 11 active+undersized+degraded, 32 active+clean; 255 MiB data, 2.8 GiB used, 117 GiB / 120 GiB avail; 42/261 objects degraded (16.092%); 0 B/s, 15 objects/s recovering 2026-03-10T07:54:00.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:54:00 vm08.local ceph-mon[107898]: mgrmap e38: vm05.blexke(active, since 92s), standbys: vm08.orfpog 2026-03-10T07:54:00.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:54:00 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:54:00.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:54:00 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:54:00.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:54:00 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:54:00.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:54:00 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:54:02.157 INFO:journalctl@ceph.osd.1.vm05.stdout:Mar 10 07:54:01 vm05.local ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508-osd-1[144194]: 2026-03-10T07:54:01.846+0000 7f267289f740 -1 osd.1 0 read_superblock omap replica is missing. 
2026-03-10T07:54:02.157 INFO:journalctl@ceph.osd.1.vm05.stdout:Mar 10 07:54:02 vm05.local ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508-osd-1[144194]: 2026-03-10T07:54:02.045+0000 7f267289f740 -1 osd.1 55 log_to_monitors true 2026-03-10T07:54:02.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:54:02 vm05.local ceph-mon[130117]: pgmap v64: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 21 active+undersized, 13 active+undersized+degraded, 30 active+clean; 255 MiB data, 2.8 GiB used, 117 GiB / 120 GiB avail; 280 B/s wr, 0 op/s; 44/261 objects degraded (16.858%); 0 B/s, 6 objects/s recovering 2026-03-10T07:54:02.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:54:02 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:54:02.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:54:02 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:54:02.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:54:02 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T07:54:02.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:54:02 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T07:54:02.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:54:02 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:54:02.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:54:02 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T07:54:02.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:54:02 vm05.local ceph-mon[130117]: 
from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T07:54:02.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:54:02 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T07:54:02.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:54:02 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T07:54:02.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:54:02 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "osd ok-to-stop", "ids": ["2"], "max": 16}]: dispatch 2026-03-10T07:54:02.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:54:02 vm05.local ceph-mon[130117]: from='mon.0 -' entity='mon.' cmd=[{"prefix": "osd ok-to-stop", "ids": ["2"], "max": 16}]: dispatch 2026-03-10T07:54:02.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:54:02 vm05.local ceph-mon[130117]: Upgrade: unsafe to stop osd(s) at this time (12 PGs are or would become offline) 2026-03-10T07:54:02.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:54:02 vm05.local ceph-mon[130117]: from='osd.1 [v2:192.168.123.105:6810/474627303,v1:192.168.123.105:6811/474627303]' entity='osd.1' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["1"]}]: dispatch 2026-03-10T07:54:02.418 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:54:02 vm08.local ceph-mon[107898]: pgmap v64: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 21 active+undersized, 13 active+undersized+degraded, 30 active+clean; 255 MiB data, 2.8 GiB used, 117 GiB / 120 GiB avail; 280 B/s wr, 0 op/s; 44/261 objects degraded (16.858%); 0 B/s, 6 objects/s recovering 2026-03-10T07:54:02.418 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:54:02 vm08.local 
ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:54:02.418 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:54:02 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:54:02.418 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:54:02 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T07:54:02.418 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:54:02 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T07:54:02.418 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:54:02 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:54:02.418 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:54:02 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T07:54:02.418 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:54:02 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T07:54:02.418 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:54:02 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T07:54:02.418 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:54:02 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T07:54:02.418 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:54:02 vm08.local ceph-mon[107898]: from='mgr.34104 
192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "osd ok-to-stop", "ids": ["2"], "max": 16}]: dispatch 2026-03-10T07:54:02.418 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:54:02 vm08.local ceph-mon[107898]: from='mon.0 -' entity='mon.' cmd=[{"prefix": "osd ok-to-stop", "ids": ["2"], "max": 16}]: dispatch 2026-03-10T07:54:02.418 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:54:02 vm08.local ceph-mon[107898]: Upgrade: unsafe to stop osd(s) at this time (12 PGs are or would become offline) 2026-03-10T07:54:02.418 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:54:02 vm08.local ceph-mon[107898]: from='osd.1 [v2:192.168.123.105:6810/474627303,v1:192.168.123.105:6811/474627303]' entity='osd.1' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["1"]}]: dispatch 2026-03-10T07:54:02.907 INFO:journalctl@ceph.osd.1.vm05.stdout:Mar 10 07:54:02 vm05.local ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508-osd-1[144194]: 2026-03-10T07:54:02.577+0000 7f266a639640 -1 osd.1 55 set_numa_affinity unable to identify public interface '' numa node: (2) No such file or directory 2026-03-10T07:54:03.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:54:03 vm05.local ceph-mon[130117]: from='osd.1 [v2:192.168.123.105:6810/474627303,v1:192.168.123.105:6811/474627303]' entity='osd.1' cmd='[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["1"]}]': finished 2026-03-10T07:54:03.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:54:03 vm05.local ceph-mon[130117]: osdmap e58: 6 total, 5 up, 6 in 2026-03-10T07:54:03.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:54:03 vm05.local ceph-mon[130117]: from='osd.1 [v2:192.168.123.105:6810/474627303,v1:192.168.123.105:6811/474627303]' entity='osd.1' cmd=[{"prefix": "osd crush create-or-move", "id": 1, "weight":0.0195, "args": ["host=vm05", "root=default"]}]: dispatch 2026-03-10T07:54:03.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:54:03 vm08.local 
ceph-mon[107898]: from='osd.1 [v2:192.168.123.105:6810/474627303,v1:192.168.123.105:6811/474627303]' entity='osd.1' cmd='[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["1"]}]': finished 2026-03-10T07:54:03.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:54:03 vm08.local ceph-mon[107898]: osdmap e58: 6 total, 5 up, 6 in 2026-03-10T07:54:03.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:54:03 vm08.local ceph-mon[107898]: from='osd.1 [v2:192.168.123.105:6810/474627303,v1:192.168.123.105:6811/474627303]' entity='osd.1' cmd=[{"prefix": "osd crush create-or-move", "id": 1, "weight":0.0195, "args": ["host=vm05", "root=default"]}]: dispatch 2026-03-10T07:54:04.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:54:04 vm05.local ceph-mon[130117]: pgmap v66: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 21 active+undersized, 13 active+undersized+degraded, 30 active+clean; 255 MiB data, 2.4 GiB used, 118 GiB / 120 GiB avail; 280 B/s wr, 0 op/s; 44/261 objects degraded (16.858%); 0 B/s, 6 objects/s recovering 2026-03-10T07:54:04.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:54:04 vm05.local ceph-mon[130117]: Health check cleared: OSD_DOWN (was: 1 osds down) 2026-03-10T07:54:04.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:54:04 vm05.local ceph-mon[130117]: osd.1 [v2:192.168.123.105:6810/474627303,v1:192.168.123.105:6811/474627303] boot 2026-03-10T07:54:04.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:54:04 vm05.local ceph-mon[130117]: osdmap e59: 6 total, 6 up, 6 in 2026-03-10T07:54:04.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:54:04 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "osd metadata", "id": 1}]: dispatch 2026-03-10T07:54:04.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:54:04 vm05.local ceph-mon[130117]: Health check update: Degraded data redundancy: 44/261 objects degraded (16.858%), 14 pgs 
degraded (PG_DEGRADED) 2026-03-10T07:54:04.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:54:04 vm08.local ceph-mon[107898]: pgmap v66: 65 pgs: 1 active+recovering+undersized+degraded+remapped, 21 active+undersized, 13 active+undersized+degraded, 30 active+clean; 255 MiB data, 2.4 GiB used, 118 GiB / 120 GiB avail; 280 B/s wr, 0 op/s; 44/261 objects degraded (16.858%); 0 B/s, 6 objects/s recovering 2026-03-10T07:54:04.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:54:04 vm08.local ceph-mon[107898]: Health check cleared: OSD_DOWN (was: 1 osds down) 2026-03-10T07:54:04.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:54:04 vm08.local ceph-mon[107898]: osd.1 [v2:192.168.123.105:6810/474627303,v1:192.168.123.105:6811/474627303] boot 2026-03-10T07:54:04.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:54:04 vm08.local ceph-mon[107898]: osdmap e59: 6 total, 6 up, 6 in 2026-03-10T07:54:04.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:54:04 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "osd metadata", "id": 1}]: dispatch 2026-03-10T07:54:04.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:54:04 vm08.local ceph-mon[107898]: Health check update: Degraded data redundancy: 44/261 objects degraded (16.858%), 14 pgs degraded (PG_DEGRADED) 2026-03-10T07:54:05.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:54:05 vm05.local ceph-mon[130117]: osdmap e60: 6 total, 6 up, 6 in 2026-03-10T07:54:05.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:54:05 vm08.local ceph-mon[107898]: osdmap e60: 6 total, 6 up, 6 in 2026-03-10T07:54:06.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:54:06 vm05.local ceph-mon[130117]: pgmap v69: 65 pgs: 1 peering, 21 active+undersized, 13 active+undersized+degraded, 30 active+clean; 255 MiB data, 2.4 GiB used, 118 GiB / 120 GiB avail; 341 B/s wr, 0 op/s; 34/261 objects degraded (13.027%); 0 B/s, 6 objects/s recovering 
2026-03-10T07:54:06.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:54:06 vm08.local ceph-mon[107898]: pgmap v69: 65 pgs: 1 peering, 21 active+undersized, 13 active+undersized+degraded, 30 active+clean; 255 MiB data, 2.4 GiB used, 118 GiB / 120 GiB avail; 341 B/s wr, 0 op/s; 34/261 objects degraded (13.027%); 0 B/s, 6 objects/s recovering 2026-03-10T07:54:08.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:54:08 vm05.local ceph-mon[130117]: pgmap v70: 65 pgs: 1 peering, 19 active+undersized, 13 active+undersized+degraded, 32 active+clean; 255 MiB data, 2.4 GiB used, 118 GiB / 120 GiB avail; 34/261 objects degraded (13.027%); 0 B/s, 6 objects/s recovering 2026-03-10T07:54:08.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:54:08 vm08.local ceph-mon[107898]: pgmap v70: 65 pgs: 1 peering, 19 active+undersized, 13 active+undersized+degraded, 32 active+clean; 255 MiB data, 2.4 GiB used, 118 GiB / 120 GiB avail; 34/261 objects degraded (13.027%); 0 B/s, 6 objects/s recovering 2026-03-10T07:54:09.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:54:09 vm05.local ceph-mon[130117]: Health check update: Degraded data redundancy: 34/261 objects degraded (13.027%), 13 pgs degraded (PG_DEGRADED) 2026-03-10T07:54:09.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:54:09 vm08.local ceph-mon[107898]: Health check update: Degraded data redundancy: 34/261 objects degraded (13.027%), 13 pgs degraded (PG_DEGRADED) 2026-03-10T07:54:10.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:54:10 vm05.local ceph-mon[130117]: pgmap v71: 65 pgs: 65 active+clean; 255 MiB data, 2.4 GiB used, 118 GiB / 120 GiB avail; 0 B/s, 6 objects/s recovering 2026-03-10T07:54:10.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:54:10 vm05.local ceph-mon[130117]: Health check cleared: PG_DEGRADED (was: Degraded data redundancy: 34/261 objects degraded (13.027%), 13 pgs degraded) 2026-03-10T07:54:10.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:54:10 
vm08.local ceph-mon[107898]: pgmap v71: 65 pgs: 65 active+clean; 255 MiB data, 2.4 GiB used, 118 GiB / 120 GiB avail; 0 B/s, 6 objects/s recovering 2026-03-10T07:54:10.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:54:10 vm08.local ceph-mon[107898]: Health check cleared: PG_DEGRADED (was: Degraded data redundancy: 34/261 objects degraded (13.027%), 13 pgs degraded) 2026-03-10T07:54:12.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:54:12 vm05.local ceph-mon[130117]: pgmap v72: 65 pgs: 65 active+clean; 255 MiB data, 2.4 GiB used, 118 GiB / 120 GiB avail; 0 B/s, 5 objects/s recovering 2026-03-10T07:54:12.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:54:12 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:54:12.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:54:12 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T07:54:12.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:54:12 vm08.local ceph-mon[107898]: pgmap v72: 65 pgs: 65 active+clean; 255 MiB data, 2.4 GiB used, 118 GiB / 120 GiB avail; 0 B/s, 5 objects/s recovering 2026-03-10T07:54:12.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:54:12 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:54:12.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:54:12 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T07:54:14.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:54:14 vm05.local ceph-mon[130117]: pgmap v73: 65 pgs: 65 active+clean; 255 MiB data, 2.4 GiB used, 118 GiB / 120 GiB avail 2026-03-10T07:54:14.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:54:14 vm08.local 
ceph-mon[107898]: pgmap v73: 65 pgs: 65 active+clean; 255 MiB data, 2.4 GiB used, 118 GiB / 120 GiB avail 2026-03-10T07:54:16.657 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:54:16 vm05.local ceph-mon[130117]: pgmap v74: 65 pgs: 65 active+clean; 255 MiB data, 2.4 GiB used, 118 GiB / 120 GiB avail 2026-03-10T07:54:16.657 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:54:16 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "osd ok-to-stop", "ids": ["2"], "max": 16}]: dispatch 2026-03-10T07:54:16.657 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:54:16 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:54:16.657 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:54:16 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "auth get", "entity": "osd.2"}]: dispatch 2026-03-10T07:54:16.657 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:54:16 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T07:54:16.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:54:16 vm08.local ceph-mon[107898]: pgmap v74: 65 pgs: 65 active+clean; 255 MiB data, 2.4 GiB used, 118 GiB / 120 GiB avail 2026-03-10T07:54:16.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:54:16 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "osd ok-to-stop", "ids": ["2"], "max": 16}]: dispatch 2026-03-10T07:54:16.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:54:16 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:54:16.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:54:16 vm08.local ceph-mon[107898]: from='mgr.34104 
192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "auth get", "entity": "osd.2"}]: dispatch 2026-03-10T07:54:16.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:54:16 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T07:54:17.655 INFO:journalctl@ceph.osd.2.vm05.stdout:Mar 10 07:54:17 vm05.local systemd[1]: Stopping Ceph osd.2 for 12e9780e-1c55-11f1-8896-79f7c2e9b508... 2026-03-10T07:54:17.655 INFO:journalctl@ceph.osd.2.vm05.stdout:Mar 10 07:54:17 vm05.local ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508-osd-2[82489]: 2026-03-10T07:54:17.360+0000 7f3ef6462700 -1 received signal: Terminated from /run/podman-init -- /usr/bin/ceph-osd -n osd.2 -f --setuser ceph --setgroup ceph --default-log-to-file=false --default-log-to-journald=true --default-log-to-stderr=false (PID: 1) UID: 0 2026-03-10T07:54:17.655 INFO:journalctl@ceph.osd.2.vm05.stdout:Mar 10 07:54:17 vm05.local ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508-osd-2[82489]: 2026-03-10T07:54:17.360+0000 7f3ef6462700 -1 osd.2 60 *** Got signal Terminated *** 2026-03-10T07:54:17.655 INFO:journalctl@ceph.osd.2.vm05.stdout:Mar 10 07:54:17 vm05.local ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508-osd-2[82489]: 2026-03-10T07:54:17.360+0000 7f3ef6462700 -1 osd.2 60 *** Immediate shutdown (osd_fast_shutdown=true) *** 2026-03-10T07:54:17.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:54:17 vm05.local ceph-mon[130117]: from='mon.0 -' entity='mon.' 
cmd=[{"prefix": "osd ok-to-stop", "ids": ["2"], "max": 16}]: dispatch 2026-03-10T07:54:17.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:54:17 vm05.local ceph-mon[130117]: Upgrade: osd.2 is safe to restart 2026-03-10T07:54:17.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:54:17 vm05.local ceph-mon[130117]: Upgrade: Updating osd.2 2026-03-10T07:54:17.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:54:17 vm05.local ceph-mon[130117]: Deploying daemon osd.2 on vm05 2026-03-10T07:54:17.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:54:17 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:54:17.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:54:17 vm05.local ceph-mon[130117]: osd.2 marked itself down and dead 2026-03-10T07:54:17.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:54:17 vm08.local ceph-mon[107898]: from='mon.0 -' entity='mon.' cmd=[{"prefix": "osd ok-to-stop", "ids": ["2"], "max": 16}]: dispatch 2026-03-10T07:54:17.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:54:17 vm08.local ceph-mon[107898]: Upgrade: osd.2 is safe to restart 2026-03-10T07:54:17.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:54:17 vm08.local ceph-mon[107898]: Upgrade: Updating osd.2 2026-03-10T07:54:17.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:54:17 vm08.local ceph-mon[107898]: Deploying daemon osd.2 on vm05 2026-03-10T07:54:17.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:54:17 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:54:17.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:54:17 vm08.local ceph-mon[107898]: osd.2 marked itself down and dead 2026-03-10T07:54:18.256 INFO:journalctl@ceph.osd.2.vm05.stdout:Mar 10 07:54:17 vm05.local podman[148549]: 2026-03-10 07:54:17.927690184 +0000 UTC m=+0.596166585 container died 
b8d3cbeb49f1b5f95ad53a913eaff2bc2b09916729602df2d3065f2a42f4915f (image=quay.io/ceph/ceph@sha256:9f35728f6070a596500c0804814a12ab6b98e05067316dc64876fb4b28d04af3, name=ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508-osd-2, RELEASE=HEAD, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 8 Base Image, maintainer=Guillaume Abrioux , org.label-schema.build-date=20240222, CEPH_POINT_RELEASE=-18.2.1, GIT_BRANCH=HEAD, GIT_CLEAN=True, ceph=True, GIT_COMMIT=1617517b9622744995c4661989e3c30d036a7cfd, io.buildah.version=1.29.1, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, GIT_REPO=https://github.com/ceph/ceph-container.git) 2026-03-10T07:54:18.256 INFO:journalctl@ceph.osd.2.vm05.stdout:Mar 10 07:54:17 vm05.local podman[148549]: 2026-03-10 07:54:17.953113267 +0000 UTC m=+0.621589679 container remove b8d3cbeb49f1b5f95ad53a913eaff2bc2b09916729602df2d3065f2a42f4915f (image=quay.io/ceph/ceph@sha256:9f35728f6070a596500c0804814a12ab6b98e05067316dc64876fb4b28d04af3, name=ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508-osd-2, org.label-schema.schema-version=1.0, GIT_COMMIT=1617517b9622744995c4661989e3c30d036a7cfd, ceph=True, RELEASE=HEAD, org.label-schema.name=CentOS Stream 8 Base Image, io.buildah.version=1.29.1, maintainer=Guillaume Abrioux , org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, CEPH_POINT_RELEASE=-18.2.1, GIT_BRANCH=HEAD, GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git, org.label-schema.build-date=20240222) 2026-03-10T07:54:18.256 INFO:journalctl@ceph.osd.2.vm05.stdout:Mar 10 07:54:17 vm05.local bash[148549]: ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508-osd-2 2026-03-10T07:54:18.256 INFO:journalctl@ceph.osd.2.vm05.stdout:Mar 10 07:54:18 vm05.local podman[148618]: 2026-03-10 07:54:18.084370148 +0000 UTC m=+0.016710633 container create 0620a95d7655465089961452555f83fef8c566913196ac2f12f8a9fd788aae30 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, 
name=ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508-osd-2-deactivate, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, CEPH_REF=squid, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20260223, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/) 2026-03-10T07:54:18.256 INFO:journalctl@ceph.osd.2.vm05.stdout:Mar 10 07:54:18 vm05.local podman[148618]: 2026-03-10 07:54:18.123632621 +0000 UTC m=+0.055973116 container init 0620a95d7655465089961452555f83fef8c566913196ac2f12f8a9fd788aae30 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508-osd-2-deactivate, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.build-date=20260223, ceph=True, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, CEPH_REF=squid) 2026-03-10T07:54:18.256 INFO:journalctl@ceph.osd.2.vm05.stdout:Mar 10 07:54:18 vm05.local podman[148618]: 2026-03-10 07:54:18.126369273 +0000 UTC m=+0.058709758 container start 0620a95d7655465089961452555f83fef8c566913196ac2f12f8a9fd788aae30 
(image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508-osd-2-deactivate, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.build-date=20260223, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team , io.buildah.version=1.41.3, OSD_FLAVOR=default) 2026-03-10T07:54:18.256 INFO:journalctl@ceph.osd.2.vm05.stdout:Mar 10 07:54:18 vm05.local podman[148618]: 2026-03-10 07:54:18.157540509 +0000 UTC m=+0.089881004 container attach 0620a95d7655465089961452555f83fef8c566913196ac2f12f8a9fd788aae30 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508-osd-2-deactivate, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team , CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20260223, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default) 2026-03-10T07:54:18.256 INFO:journalctl@ceph.osd.2.vm05.stdout:Mar 10 07:54:18 vm05.local podman[148618]: 2026-03-10 07:54:18.077491141 +0000 UTC 
m=+0.009831636 image pull 654f31e6858eb235bbece362255b685a945f2b6a367e2b88c4930c984fbb214c quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc 2026-03-10T07:54:18.551 INFO:journalctl@ceph.osd.2.vm05.stdout:Mar 10 07:54:18 vm05.local conmon[148630]: conmon 0620a95d765546508996 : Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-0620a95d7655465089961452555f83fef8c566913196ac2f12f8a9fd788aae30.scope/container/memory.events 2026-03-10T07:54:18.551 INFO:journalctl@ceph.osd.2.vm05.stdout:Mar 10 07:54:18 vm05.local podman[148618]: 2026-03-10 07:54:18.257156647 +0000 UTC m=+0.189497132 container died 0620a95d7655465089961452555f83fef8c566913196ac2f12f8a9fd788aae30 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508-osd-2-deactivate, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, OSD_FLAVOR=default, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, CEPH_REF=squid, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20260223, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/) 2026-03-10T07:54:18.551 INFO:journalctl@ceph.osd.2.vm05.stdout:Mar 10 07:54:18 vm05.local podman[148618]: 2026-03-10 07:54:18.277665961 +0000 UTC m=+0.210006446 container remove 0620a95d7655465089961452555f83fef8c566913196ac2f12f8a9fd788aae30 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508-osd-2-deactivate, 
org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team , CEPH_REF=squid, org.label-schema.schema-version=1.0, ceph=True, org.label-schema.build-date=20260223, org.label-schema.vendor=CentOS, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.name=CentOS Stream 9 Base Image) 2026-03-10T07:54:18.551 INFO:journalctl@ceph.osd.2.vm05.stdout:Mar 10 07:54:18 vm05.local systemd[1]: ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508@osd.2.service: Deactivated successfully. 2026-03-10T07:54:18.551 INFO:journalctl@ceph.osd.2.vm05.stdout:Mar 10 07:54:18 vm05.local systemd[1]: ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508@osd.2.service: Unit process 148630 (conmon) remains running after unit stopped. 2026-03-10T07:54:18.551 INFO:journalctl@ceph.osd.2.vm05.stdout:Mar 10 07:54:18 vm05.local systemd[1]: ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508@osd.2.service: Unit process 148638 (podman) remains running after unit stopped. 2026-03-10T07:54:18.551 INFO:journalctl@ceph.osd.2.vm05.stdout:Mar 10 07:54:18 vm05.local systemd[1]: Stopped Ceph osd.2 for 12e9780e-1c55-11f1-8896-79f7c2e9b508. 2026-03-10T07:54:18.551 INFO:journalctl@ceph.osd.2.vm05.stdout:Mar 10 07:54:18 vm05.local systemd[1]: ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508@osd.2.service: Consumed 31.771s CPU time, 464.3M memory peak. 2026-03-10T07:54:18.552 INFO:journalctl@ceph.osd.2.vm05.stdout:Mar 10 07:54:18 vm05.local systemd[1]: Starting Ceph osd.2 for 12e9780e-1c55-11f1-8896-79f7c2e9b508... 
2026-03-10T07:54:18.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:54:18 vm05.local ceph-mon[130117]: pgmap v75: 65 pgs: 65 active+clean; 255 MiB data, 2.4 GiB used, 118 GiB / 120 GiB avail 2026-03-10T07:54:18.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:54:18 vm05.local ceph-mon[130117]: Health check failed: 1 osds down (OSD_DOWN) 2026-03-10T07:54:18.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:54:18 vm05.local ceph-mon[130117]: osdmap e61: 6 total, 5 up, 6 in 2026-03-10T07:54:18.907 INFO:journalctl@ceph.osd.2.vm05.stdout:Mar 10 07:54:18 vm05.local podman[148720]: 2026-03-10 07:54:18.55088759 +0000 UTC m=+0.016806213 container create 38ae970d0dc684bdc92c70901eb67d1378cbdc2454d47a2ebb40a757be6d56e8 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508-osd-2-activate, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid, OSD_FLAVOR=default, io.buildah.version=1.41.3, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260223, org.label-schema.license=GPLv2, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.opencontainers.image.authors=Ceph Release Team ) 2026-03-10T07:54:18.908 INFO:journalctl@ceph.osd.2.vm05.stdout:Mar 10 07:54:18 vm05.local podman[148720]: 2026-03-10 07:54:18.592524197 +0000 UTC m=+0.058442820 container init 38ae970d0dc684bdc92c70901eb67d1378cbdc2454d47a2ebb40a757be6d56e8 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508-osd-2-activate, 
CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.build-date=20260223, io.buildah.version=1.41.3, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, org.label-schema.license=GPLv2) 2026-03-10T07:54:18.908 INFO:journalctl@ceph.osd.2.vm05.stdout:Mar 10 07:54:18 vm05.local podman[148720]: 2026-03-10 07:54:18.595449133 +0000 UTC m=+0.061367756 container start 38ae970d0dc684bdc92c70901eb67d1378cbdc2454d47a2ebb40a757be6d56e8 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508-osd-2-activate, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20260223, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team , ceph=True, CEPH_REF=squid, OSD_FLAVOR=default) 2026-03-10T07:54:18.908 INFO:journalctl@ceph.osd.2.vm05.stdout:Mar 10 07:54:18 vm05.local podman[148720]: 2026-03-10 07:54:18.602140108 +0000 UTC m=+0.068058732 container attach 38ae970d0dc684bdc92c70901eb67d1378cbdc2454d47a2ebb40a757be6d56e8 
(image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508-osd-2-activate, org.label-schema.schema-version=1.0, CEPH_REF=squid, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, io.buildah.version=1.41.3, org.label-schema.build-date=20260223, ceph=True) 2026-03-10T07:54:18.908 INFO:journalctl@ceph.osd.2.vm05.stdout:Mar 10 07:54:18 vm05.local podman[148720]: 2026-03-10 07:54:18.544117005 +0000 UTC m=+0.010035628 image pull 654f31e6858eb235bbece362255b685a945f2b6a367e2b88c4930c984fbb214c quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc 2026-03-10T07:54:18.908 INFO:journalctl@ceph.osd.2.vm05.stdout:Mar 10 07:54:18 vm05.local ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508-osd-2-activate[148730]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-10T07:54:18.908 INFO:journalctl@ceph.osd.2.vm05.stdout:Mar 10 07:54:18 vm05.local bash[148720]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-10T07:54:18.908 INFO:journalctl@ceph.osd.2.vm05.stdout:Mar 10 07:54:18 vm05.local ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508-osd-2-activate[148730]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-10T07:54:18.908 INFO:journalctl@ceph.osd.2.vm05.stdout:Mar 10 07:54:18 vm05.local bash[148720]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-10T07:54:18.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:54:18 vm08.local ceph-mon[107898]: pgmap 
v75: 65 pgs: 65 active+clean; 255 MiB data, 2.4 GiB used, 118 GiB / 120 GiB avail 2026-03-10T07:54:18.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:54:18 vm08.local ceph-mon[107898]: Health check failed: 1 osds down (OSD_DOWN) 2026-03-10T07:54:18.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:54:18 vm08.local ceph-mon[107898]: osdmap e61: 6 total, 5 up, 6 in 2026-03-10T07:54:19.407 INFO:journalctl@ceph.osd.2.vm05.stdout:Mar 10 07:54:19 vm05.local ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508-osd-2-activate[148730]: --> Failed to activate via raw: did not find any matching OSD to activate 2026-03-10T07:54:19.407 INFO:journalctl@ceph.osd.2.vm05.stdout:Mar 10 07:54:19 vm05.local bash[148720]: --> Failed to activate via raw: did not find any matching OSD to activate 2026-03-10T07:54:19.407 INFO:journalctl@ceph.osd.2.vm05.stdout:Mar 10 07:54:19 vm05.local ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508-osd-2-activate[148730]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-10T07:54:19.407 INFO:journalctl@ceph.osd.2.vm05.stdout:Mar 10 07:54:19 vm05.local bash[148720]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-10T07:54:19.407 INFO:journalctl@ceph.osd.2.vm05.stdout:Mar 10 07:54:19 vm05.local ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508-osd-2-activate[148730]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-10T07:54:19.407 INFO:journalctl@ceph.osd.2.vm05.stdout:Mar 10 07:54:19 vm05.local bash[148720]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-10T07:54:19.407 INFO:journalctl@ceph.osd.2.vm05.stdout:Mar 10 07:54:19 vm05.local ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508-osd-2-activate[148730]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-2 2026-03-10T07:54:19.407 INFO:journalctl@ceph.osd.2.vm05.stdout:Mar 10 07:54:19 vm05.local bash[148720]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-2 2026-03-10T07:54:19.407 INFO:journalctl@ceph.osd.2.vm05.stdout:Mar 
10 07:54:19 vm05.local ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508-osd-2-activate[148730]: Running command: /usr/bin/ceph-bluestore-tool --cluster=ceph prime-osd-dir --dev /dev/ceph-2c1135dd-c052-48f0-bbe8-75720f8022be/osd-block-ef60316b-3b61-4644-aa31-97cef548ba7e --path /var/lib/ceph/osd/ceph-2 --no-mon-config 2026-03-10T07:54:19.407 INFO:journalctl@ceph.osd.2.vm05.stdout:Mar 10 07:54:19 vm05.local bash[148720]: Running command: /usr/bin/ceph-bluestore-tool --cluster=ceph prime-osd-dir --dev /dev/ceph-2c1135dd-c052-48f0-bbe8-75720f8022be/osd-block-ef60316b-3b61-4644-aa31-97cef548ba7e --path /var/lib/ceph/osd/ceph-2 --no-mon-config 2026-03-10T07:54:19.872 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:54:19 vm05.local ceph-mon[130117]: pgmap v77: 65 pgs: 12 peering, 7 stale+active+clean, 46 active+clean; 255 MiB data, 2.4 GiB used, 118 GiB / 120 GiB avail 2026-03-10T07:54:19.872 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:54:19 vm05.local ceph-mon[130117]: osdmap e62: 6 total, 5 up, 6 in 2026-03-10T07:54:19.872 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:54:19 vm05.local ceph-mon[130117]: Health check failed: Reduced data availability: 1 pg peering (PG_AVAILABILITY) 2026-03-10T07:54:19.872 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:54:19 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:54:19.872 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:54:19 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:54:19.872 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:54:19 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T07:54:19.872 INFO:journalctl@ceph.osd.2.vm05.stdout:Mar 10 07:54:19 vm05.local ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508-osd-2-activate[148730]: Running command: /usr/bin/ln 
-snf /dev/ceph-2c1135dd-c052-48f0-bbe8-75720f8022be/osd-block-ef60316b-3b61-4644-aa31-97cef548ba7e /var/lib/ceph/osd/ceph-2/block 2026-03-10T07:54:19.872 INFO:journalctl@ceph.osd.2.vm05.stdout:Mar 10 07:54:19 vm05.local bash[148720]: Running command: /usr/bin/ln -snf /dev/ceph-2c1135dd-c052-48f0-bbe8-75720f8022be/osd-block-ef60316b-3b61-4644-aa31-97cef548ba7e /var/lib/ceph/osd/ceph-2/block 2026-03-10T07:54:19.872 INFO:journalctl@ceph.osd.2.vm05.stdout:Mar 10 07:54:19 vm05.local ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508-osd-2-activate[148730]: Running command: /usr/bin/chown -h ceph:ceph /var/lib/ceph/osd/ceph-2/block 2026-03-10T07:54:19.872 INFO:journalctl@ceph.osd.2.vm05.stdout:Mar 10 07:54:19 vm05.local bash[148720]: Running command: /usr/bin/chown -h ceph:ceph /var/lib/ceph/osd/ceph-2/block 2026-03-10T07:54:19.872 INFO:journalctl@ceph.osd.2.vm05.stdout:Mar 10 07:54:19 vm05.local ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508-osd-2-activate[148730]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-2 2026-03-10T07:54:19.872 INFO:journalctl@ceph.osd.2.vm05.stdout:Mar 10 07:54:19 vm05.local bash[148720]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-2 2026-03-10T07:54:19.872 INFO:journalctl@ceph.osd.2.vm05.stdout:Mar 10 07:54:19 vm05.local ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508-osd-2-activate[148730]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-2 2026-03-10T07:54:19.872 INFO:journalctl@ceph.osd.2.vm05.stdout:Mar 10 07:54:19 vm05.local bash[148720]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-2 2026-03-10T07:54:19.872 INFO:journalctl@ceph.osd.2.vm05.stdout:Mar 10 07:54:19 vm05.local ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508-osd-2-activate[148730]: --> ceph-volume lvm activate successful for osd ID: 2 2026-03-10T07:54:19.872 INFO:journalctl@ceph.osd.2.vm05.stdout:Mar 10 07:54:19 vm05.local bash[148720]: --> ceph-volume lvm activate successful for osd ID: 2 2026-03-10T07:54:19.872 
INFO:journalctl@ceph.osd.2.vm05.stdout:Mar 10 07:54:19 vm05.local conmon[148730]: conmon 38ae970d0dc684bdc92c : Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-38ae970d0dc684bdc92c70901eb67d1378cbdc2454d47a2ebb40a757be6d56e8.scope/container/memory.events 2026-03-10T07:54:19.872 INFO:journalctl@ceph.osd.2.vm05.stdout:Mar 10 07:54:19 vm05.local podman[148720]: 2026-03-10 07:54:19.497852291 +0000 UTC m=+0.963770904 container died 38ae970d0dc684bdc92c70901eb67d1378cbdc2454d47a2ebb40a757be6d56e8 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508-osd-2-activate, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid, org.label-schema.license=GPLv2, OSD_FLAVOR=default, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20260223, io.buildah.version=1.41.3, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, ceph=True, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/) 2026-03-10T07:54:19.872 INFO:journalctl@ceph.osd.2.vm05.stdout:Mar 10 07:54:19 vm05.local podman[148720]: 2026-03-10 07:54:19.51937922 +0000 UTC m=+0.985297843 container remove 38ae970d0dc684bdc92c70901eb67d1378cbdc2454d47a2ebb40a757be6d56e8 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508-osd-2-activate, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, 
GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, org.label-schema.license=GPLv2, OSD_FLAVOR=default, org.label-schema.build-date=20260223, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team , CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.vendor=CentOS, ceph=True) 2026-03-10T07:54:19.872 INFO:journalctl@ceph.osd.2.vm05.stdout:Mar 10 07:54:19 vm05.local podman[148976]: 2026-03-10 07:54:19.609705718 +0000 UTC m=+0.015876493 container create 108a77e324b80ab459e853640e5097ae7e1f46f7cb10342fee8da863d82fc639 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508-osd-2, org.opencontainers.image.authors=Ceph Release Team , CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260223, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, ceph=True, io.buildah.version=1.41.3, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, CEPH_REF=squid) 2026-03-10T07:54:19.872 INFO:journalctl@ceph.osd.2.vm05.stdout:Mar 10 07:54:19 vm05.local podman[148976]: 2026-03-10 07:54:19.643488973 +0000 UTC m=+0.049659758 container init 108a77e324b80ab459e853640e5097ae7e1f46f7cb10342fee8da863d82fc639 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508-osd-2, FROM_IMAGE=quay.io/centos/centos:stream9, 
org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20260223, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, ceph=True, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team , OSD_FLAVOR=default, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git) 2026-03-10T07:54:19.872 INFO:journalctl@ceph.osd.2.vm05.stdout:Mar 10 07:54:19 vm05.local podman[148976]: 2026-03-10 07:54:19.646958388 +0000 UTC m=+0.053129163 container start 108a77e324b80ab459e853640e5097ae7e1f46f7cb10342fee8da863d82fc639 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508-osd-2, org.label-schema.build-date=20260223, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, ceph=True, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team , OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/) 2026-03-10T07:54:19.872 INFO:journalctl@ceph.osd.2.vm05.stdout:Mar 10 07:54:19 vm05.local bash[148976]: 108a77e324b80ab459e853640e5097ae7e1f46f7cb10342fee8da863d82fc639 2026-03-10T07:54:19.872 INFO:journalctl@ceph.osd.2.vm05.stdout:Mar 10 07:54:19 vm05.local podman[148976]: 2026-03-10 07:54:19.603454455 +0000 UTC m=+0.009625230 image pull 
654f31e6858eb235bbece362255b685a945f2b6a367e2b88c4930c984fbb214c quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc 2026-03-10T07:54:19.873 INFO:journalctl@ceph.osd.2.vm05.stdout:Mar 10 07:54:19 vm05.local systemd[1]: Started Ceph osd.2 for 12e9780e-1c55-11f1-8896-79f7c2e9b508. 2026-03-10T07:54:20.157 INFO:journalctl@ceph.osd.2.vm05.stdout:Mar 10 07:54:19 vm05.local ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508-osd-2[148987]: 2026-03-10T07:54:19.989+0000 7f9763928740 -1 Falling back to public interface 2026-03-10T07:54:20.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:54:19 vm08.local ceph-mon[107898]: pgmap v77: 65 pgs: 12 peering, 7 stale+active+clean, 46 active+clean; 255 MiB data, 2.4 GiB used, 118 GiB / 120 GiB avail 2026-03-10T07:54:20.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:54:19 vm08.local ceph-mon[107898]: osdmap e62: 6 total, 5 up, 6 in 2026-03-10T07:54:20.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:54:19 vm08.local ceph-mon[107898]: Health check failed: Reduced data availability: 1 pg peering (PG_AVAILABILITY) 2026-03-10T07:54:20.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:54:19 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:54:20.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:54:19 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:54:20.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:54:19 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T07:54:21.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:54:20 vm05.local ceph-mon[130117]: Health check failed: Degraded data redundancy: 10/261 objects degraded (3.831%), 4 pgs degraded (PG_DEGRADED) 2026-03-10T07:54:21.319 
INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:54:20 vm08.local ceph-mon[107898]: Health check failed: Degraded data redundancy: 10/261 objects degraded (3.831%), 4 pgs degraded (PG_DEGRADED) 2026-03-10T07:54:22.297 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:54:22 vm05.local ceph-mon[130117]: pgmap v79: 65 pgs: 4 active+undersized, 12 peering, 4 stale+active+clean, 4 active+undersized+degraded, 41 active+clean; 255 MiB data, 2.4 GiB used, 118 GiB / 120 GiB avail; 10/261 objects degraded (3.831%) 2026-03-10T07:54:22.297 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:54:22 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:54:22.297 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:54:22 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:54:22.297 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:54:22 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:54:22.297 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:54:22 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:54:22.668 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:54:22 vm08.local ceph-mon[107898]: pgmap v79: 65 pgs: 4 active+undersized, 12 peering, 4 stale+active+clean, 4 active+undersized+degraded, 41 active+clean; 255 MiB data, 2.4 GiB used, 118 GiB / 120 GiB avail; 10/261 objects degraded (3.831%) 2026-03-10T07:54:22.668 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:54:22 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:54:22.668 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:54:22 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:54:22.668 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 
07:54:22 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:54:22.668 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:54:22 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:54:23.657 INFO:journalctl@ceph.osd.2.vm05.stdout:Mar 10 07:54:23 vm05.local ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508-osd-2[148987]: 2026-03-10T07:54:23.247+0000 7f9763928740 -1 osd.2 0 read_superblock omap replica is missing. 2026-03-10T07:54:23.657 INFO:journalctl@ceph.osd.2.vm05.stdout:Mar 10 07:54:23 vm05.local ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508-osd-2[148987]: 2026-03-10T07:54:23.423+0000 7f9763928740 -1 osd.2 60 log_to_monitors true 2026-03-10T07:54:23.747 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:23.745+0000 7f4c884c7700 1 -- 192.168.123.105:0/2656312592 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f4c80101280 msgr2=0x7f4c80101660 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:54:23.747 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:23.745+0000 7f4c884c7700 1 --2- 192.168.123.105:0/2656312592 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f4c80101280 0x7f4c80101660 secure :-1 s=READY pgs=64 cs=0 l=1 rev1=1 crypto rx=0x7f4c70009b50 tx=0x7f4c70009e60 comp rx=0 tx=0).stop 2026-03-10T07:54:23.747 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:23.746+0000 7f4c884c7700 1 -- 192.168.123.105:0/2656312592 shutdown_connections 2026-03-10T07:54:23.747 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:23.746+0000 7f4c884c7700 1 --2- 192.168.123.105:0/2656312592 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f4c80101c30 0x7f4c80105b70 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:54:23.747 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:23.746+0000 7f4c884c7700 1 --2- 
192.168.123.105:0/2656312592 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f4c80101280 0x7f4c80101660 unknown :-1 s=CLOSED pgs=64 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:54:23.747 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:23.746+0000 7f4c884c7700 1 -- 192.168.123.105:0/2656312592 >> 192.168.123.105:0/2656312592 conn(0x7f4c80078ea0 msgr2=0x7f4c800792b0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:54:23.747 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:23.746+0000 7f4c884c7700 1 -- 192.168.123.105:0/2656312592 shutdown_connections 2026-03-10T07:54:23.747 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:23.746+0000 7f4c884c7700 1 -- 192.168.123.105:0/2656312592 wait complete. 2026-03-10T07:54:23.747 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:23.747+0000 7f4c884c7700 1 Processor -- start 2026-03-10T07:54:23.747 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:23.747+0000 7f4c884c7700 1 -- start start 2026-03-10T07:54:23.748 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:23.747+0000 7f4c884c7700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f4c80101280 0x7f4c80198d80 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:54:23.748 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:23.747+0000 7f4c884c7700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f4c80101c30 0x7f4c801992c0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:54:23.748 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:23.747+0000 7f4c884c7700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f4c801999a0 con 0x7f4c80101280 2026-03-10T07:54:23.748 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:23.747+0000 7f4c884c7700 1 -- --> 
[v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f4c8019d730 con 0x7f4c80101c30 2026-03-10T07:54:23.748 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:23.748+0000 7f4c85a62700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f4c80101c30 0x7f4c801992c0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:54:23.748 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:23.748+0000 7f4c85a62700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f4c80101c30 0x7f4c801992c0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.108:3300/0 says I am v2:192.168.123.105:52180/0 (socket says 192.168.123.105:52180) 2026-03-10T07:54:23.748 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:23.748+0000 7f4c85a62700 1 -- 192.168.123.105:0/1036287528 learned_addr learned my addr 192.168.123.105:0/1036287528 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T07:54:23.748 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:23.748+0000 7f4c85a62700 1 -- 192.168.123.105:0/1036287528 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f4c80101280 msgr2=0x7f4c80198d80 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:54:23.748 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:23.748+0000 7f4c85a62700 1 --2- 192.168.123.105:0/1036287528 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f4c80101280 0x7f4c80198d80 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:54:23.748 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:23.748+0000 7f4c85a62700 1 -- 192.168.123.105:0/1036287528 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 
0x7f4c700097e0 con 0x7f4c80101c30 2026-03-10T07:54:23.748 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:23.748+0000 7f4c85a62700 1 --2- 192.168.123.105:0/1036287528 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f4c80101c30 0x7f4c801992c0 secure :-1 s=READY pgs=26 cs=0 l=1 rev1=1 crypto rx=0x7f4c7c00eb10 tx=0x7f4c7c00eed0 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:54:23.749 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:23.748+0000 7f4c777fe700 1 -- 192.168.123.105:0/1036287528 <== mon.1 v2:192.168.123.108:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f4c7c00cca0 con 0x7f4c80101c30 2026-03-10T07:54:23.749 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:23.748+0000 7f4c884c7700 1 -- 192.168.123.105:0/1036287528 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f4c8019da10 con 0x7f4c80101c30 2026-03-10T07:54:23.749 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:23.749+0000 7f4c884c7700 1 -- 192.168.123.105:0/1036287528 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f4c8019df60 con 0x7f4c80101c30 2026-03-10T07:54:23.750 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:23.749+0000 7f4c777fe700 1 -- 192.168.123.105:0/1036287528 <== mon.1 v2:192.168.123.108:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7f4c7c00ce00 con 0x7f4c80101c30 2026-03-10T07:54:23.750 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:23.749+0000 7f4c777fe700 1 -- 192.168.123.105:0/1036287528 <== mon.1 v2:192.168.123.108:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f4c7c0189c0 con 0x7f4c80101c30 2026-03-10T07:54:23.755 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:23.751+0000 7f4c777fe700 1 -- 192.168.123.105:0/1036287528 <== mon.1 v2:192.168.123.108:3300/0 4 ==== mgrmap(e 38) v1 ==== 100115+0+0 
(secure 0 0 0) 0x7f4c7c018b20 con 0x7f4c80101c30 2026-03-10T07:54:23.756 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:23.752+0000 7f4c777fe700 1 --2- 192.168.123.105:0/1036287528 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f4c6c0778c0 0x7f4c6c079d80 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:54:23.756 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:23.752+0000 7f4c86263700 1 --2- 192.168.123.105:0/1036287528 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f4c6c0778c0 0x7f4c6c079d80 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:54:23.756 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:23.753+0000 7f4c86263700 1 --2- 192.168.123.105:0/1036287528 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f4c6c0778c0 0x7f4c6c079d80 secure :-1 s=READY pgs=43 cs=0 l=1 rev1=1 crypto rx=0x7f4c7000b5c0 tx=0x7f4c700058e0 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:54:23.756 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:23.753+0000 7f4c777fe700 1 -- 192.168.123.105:0/1036287528 <== mon.1 v2:192.168.123.108:3300/0 5 ==== osd_map(62..62 src has 1..62) v4 ==== 6136+0+0 (secure 0 0 0) 0x7f4c7c014070 con 0x7f4c80101c30 2026-03-10T07:54:23.759 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:23.759+0000 7f4c884c7700 1 -- 192.168.123.105:0/1036287528 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f4c640052f0 con 0x7f4c80101c30 2026-03-10T07:54:23.763 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:23.763+0000 7f4c777fe700 1 -- 192.168.123.105:0/1036287528 <== mon.1 v2:192.168.123.108:3300/0 6 ==== mon_command_ack([{"prefix": 
"get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f4c7c0630f0 con 0x7f4c80101c30 2026-03-10T07:54:23.913 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:23.912+0000 7f4c884c7700 1 -- 192.168.123.105:0/1036287528 --> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7f4c64000bc0 con 0x7f4c6c0778c0 2026-03-10T07:54:23.914 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:23.913+0000 7f4c777fe700 1 -- 192.168.123.105:0/1036287528 <== mgr.34104 v2:192.168.123.105:6800/627686298 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+400 (secure 0 0 0) 0x7f4c64000bc0 con 0x7f4c6c0778c0 2026-03-10T07:54:23.916 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:23.916+0000 7f4c884c7700 1 -- 192.168.123.105:0/1036287528 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f4c6c0778c0 msgr2=0x7f4c6c079d80 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:54:23.916 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:23.916+0000 7f4c884c7700 1 --2- 192.168.123.105:0/1036287528 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f4c6c0778c0 0x7f4c6c079d80 secure :-1 s=READY pgs=43 cs=0 l=1 rev1=1 crypto rx=0x7f4c7000b5c0 tx=0x7f4c700058e0 comp rx=0 tx=0).stop 2026-03-10T07:54:23.916 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:23.916+0000 7f4c884c7700 1 -- 192.168.123.105:0/1036287528 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f4c80101c30 msgr2=0x7f4c801992c0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:54:23.917 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:23.916+0000 7f4c884c7700 1 --2- 192.168.123.105:0/1036287528 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f4c80101c30 0x7f4c801992c0 secure :-1 s=READY pgs=26 cs=0 l=1 rev1=1 crypto 
rx=0x7f4c7c00eb10 tx=0x7f4c7c00eed0 comp rx=0 tx=0).stop 2026-03-10T07:54:23.917 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:23.917+0000 7f4c884c7700 1 -- 192.168.123.105:0/1036287528 shutdown_connections 2026-03-10T07:54:23.917 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:23.917+0000 7f4c884c7700 1 --2- 192.168.123.105:0/1036287528 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f4c6c0778c0 0x7f4c6c079d80 unknown :-1 s=CLOSED pgs=43 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:54:23.917 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:23.917+0000 7f4c884c7700 1 --2- 192.168.123.105:0/1036287528 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f4c80101280 0x7f4c80198d80 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:54:23.917 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:23.917+0000 7f4c884c7700 1 --2- 192.168.123.105:0/1036287528 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f4c80101c30 0x7f4c801992c0 unknown :-1 s=CLOSED pgs=26 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:54:23.917 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:23.917+0000 7f4c884c7700 1 -- 192.168.123.105:0/1036287528 >> 192.168.123.105:0/1036287528 conn(0x7f4c80078ea0 msgr2=0x7f4c80100850 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:54:23.917 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:23.917+0000 7f4c884c7700 1 -- 192.168.123.105:0/1036287528 shutdown_connections 2026-03-10T07:54:23.917 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:23.917+0000 7f4c884c7700 1 -- 192.168.123.105:0/1036287528 wait complete. 
2026-03-10T07:54:23.927 INFO:teuthology.orchestra.run.vm05.stdout:true 2026-03-10T07:54:23.999 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:23.998+0000 7f6e5132b700 1 -- 192.168.123.105:0/4153310687 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f6e4c10f340 msgr2=0x7f6e4c10f720 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:54:23.999 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:23.998+0000 7f6e4affd700 1 -- 192.168.123.105:0/4153310687 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f6e3c00ba20 con 0x7f6e4c10f340 2026-03-10T07:54:23.999 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:23.998+0000 7f6e5132b700 1 --2- 192.168.123.105:0/4153310687 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f6e4c10f340 0x7f6e4c10f720 secure :-1 s=READY pgs=65 cs=0 l=1 rev1=1 crypto rx=0x7f6e3c009b50 tx=0x7f6e3c009e60 comp rx=0 tx=0).stop 2026-03-10T07:54:23.999 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:54:23 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:54:23.999 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:54:23 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:54:23.999 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:54:23 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T07:54:23.999 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:54:23 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T07:54:23.999 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:54:23 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' 
entity='mgr.vm05.blexke' 2026-03-10T07:54:23.999 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:54:23 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T07:54:23.999 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:54:23 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T07:54:23.999 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:54:23 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T07:54:23.999 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:54:23 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T07:54:23.999 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:54:23 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "osd ok-to-stop", "ids": ["3"], "max": 16}]: dispatch 2026-03-10T07:54:23.999 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:54:23 vm05.local ceph-mon[130117]: from='osd.2 [v2:192.168.123.105:6818/1473398242,v1:192.168.123.105:6819/1473398242]' entity='osd.2' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["2"]}]: dispatch 2026-03-10T07:54:24.000 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:23.999+0000 7f6e5132b700 1 -- 192.168.123.105:0/4153310687 shutdown_connections 2026-03-10T07:54:24.000 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:23.999+0000 7f6e5132b700 1 --2- 192.168.123.105:0/4153310687 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f6e4c10d0f0 0x7f6e4c10d570 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:54:24.000 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:23.999+0000 7f6e5132b700 1 --2- 192.168.123.105:0/4153310687 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f6e4c10f340 0x7f6e4c10f720 secure :-1 s=CLOSED pgs=65 cs=0 l=1 rev1=1 crypto rx=0x7f6e3c009b50 tx=0x7f6e3c009e60 comp rx=0 tx=0).stop 2026-03-10T07:54:24.000 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:23.999+0000 7f6e5132b700 1 -- 192.168.123.105:0/4153310687 >> 192.168.123.105:0/4153310687 conn(0x7f6e4c06ce20 msgr2=0x7f6e4c06d230 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:54:24.000 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:23.999+0000 7f6e5132b700 1 -- 192.168.123.105:0/4153310687 shutdown_connections 2026-03-10T07:54:24.000 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:23.999+0000 7f6e5132b700 1 -- 192.168.123.105:0/4153310687 wait complete. 2026-03-10T07:54:24.000 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:23.999+0000 7f6e5132b700 1 Processor -- start 2026-03-10T07:54:24.000 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:23.999+0000 7f6e5132b700 1 -- start start 2026-03-10T07:54:24.000 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:24.000+0000 7f6e5132b700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f6e4c10d0f0 0x7f6e4c1ab8b0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:54:24.000 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:24.000+0000 7f6e5132b700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f6e4c1abdf0 0x7f6e4c1a58a0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:54:24.000 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:24.000+0000 7f6e5132b700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f6e4c1ac3a0 con 0x7f6e4c1abdf0 2026-03-10T07:54:24.000 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:24.000+0000 7f6e5132b700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f6e4c1a5de0 con 0x7f6e4c10d0f0 2026-03-10T07:54:24.000 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:24.000+0000 7f6e4b7fe700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f6e4c1abdf0 0x7f6e4c1a58a0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:54:24.000 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:24.000+0000 7f6e4b7fe700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f6e4c1abdf0 0x7f6e4c1a58a0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.105:47110/0 (socket says 192.168.123.105:47110) 2026-03-10T07:54:24.000 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:24.000+0000 7f6e4b7fe700 1 -- 192.168.123.105:0/1106064164 learned_addr learned my addr 192.168.123.105:0/1106064164 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T07:54:24.000 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:24.000+0000 7f6e4bfff700 1 --2- 192.168.123.105:0/1106064164 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f6e4c10d0f0 0x7f6e4c1ab8b0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:54:24.001 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:24.000+0000 7f6e4b7fe700 1 -- 192.168.123.105:0/1106064164 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f6e4c10d0f0 msgr2=0x7f6e4c1ab8b0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:54:24.001 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:24.000+0000 7f6e4b7fe700 1 --2- 
192.168.123.105:0/1106064164 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f6e4c10d0f0 0x7f6e4c1ab8b0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:54:24.001 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:24.000+0000 7f6e4b7fe700 1 -- 192.168.123.105:0/1106064164 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f6e3c0097e0 con 0x7f6e4c1abdf0 2026-03-10T07:54:24.002 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:24.002+0000 7f6e4b7fe700 1 --2- 192.168.123.105:0/1106064164 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f6e4c1abdf0 0x7f6e4c1a58a0 secure :-1 s=READY pgs=66 cs=0 l=1 rev1=1 crypto rx=0x7f6e4000b700 tx=0x7f6e4000bac0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:54:24.006 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:24.002+0000 7f6e497fa700 1 -- 192.168.123.105:0/1106064164 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f6e40010820 con 0x7f6e4c1abdf0 2026-03-10T07:54:24.006 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:24.002+0000 7f6e5132b700 1 -- 192.168.123.105:0/1106064164 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f6e4c1a5f80 con 0x7f6e4c1abdf0 2026-03-10T07:54:24.006 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:24.002+0000 7f6e5132b700 1 -- 192.168.123.105:0/1106064164 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f6e4c1a64d0 con 0x7f6e4c1abdf0 2026-03-10T07:54:24.006 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:24.003+0000 7f6e497fa700 1 -- 192.168.123.105:0/1106064164 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7f6e40010e60 con 0x7f6e4c1abdf0 2026-03-10T07:54:24.006 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:24.003+0000 7f6e497fa700 1 -- 192.168.123.105:0/1106064164 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f6e40017570 con 0x7f6e4c1abdf0 2026-03-10T07:54:24.006 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:24.003+0000 7f6e497fa700 1 -- 192.168.123.105:0/1106064164 <== mon.0 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 38) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f6e40017750 con 0x7f6e4c1abdf0 2026-03-10T07:54:24.006 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:24.004+0000 7f6e497fa700 1 --2- 192.168.123.105:0/1106064164 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f6e34077ab0 0x7f6e34079f70 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:54:24.006 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:24.004+0000 7f6e497fa700 1 -- 192.168.123.105:0/1106064164 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(63..63 src has 1..63) v4 ==== 6136+0+0 (secure 0 0 0) 0x7f6e4009a4d0 con 0x7f6e4c1abdf0 2026-03-10T07:54:24.006 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:24.004+0000 7f6e32ffd700 1 -- 192.168.123.105:0/1106064164 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f6e38005320 con 0x7f6e4c1abdf0 2026-03-10T07:54:24.007 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:24.007+0000 7f6e4bfff700 1 --2- 192.168.123.105:0/1106064164 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f6e34077ab0 0x7f6e34079f70 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:54:24.008 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:24.007+0000 7f6e497fa700 1 -- 192.168.123.105:0/1106064164 <== mon.0 v2:192.168.123.105:3300/0 6 ==== 
mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f6e40062d80 con 0x7f6e4c1abdf0 2026-03-10T07:54:24.008 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:24.008+0000 7f6e4bfff700 1 --2- 192.168.123.105:0/1106064164 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f6e34077ab0 0x7f6e34079f70 secure :-1 s=READY pgs=45 cs=0 l=1 rev1=1 crypto rx=0x7f6e3c000c00 tx=0x7f6e3c00b560 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:54:24.140 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:24.139+0000 7f6e32ffd700 1 -- 192.168.123.105:0/1106064164 --> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7f6e38000bf0 con 0x7f6e34077ab0 2026-03-10T07:54:24.141 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:24.141+0000 7f6e497fa700 1 -- 192.168.123.105:0/1106064164 <== mgr.34104 v2:192.168.123.105:6800/627686298 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+400 (secure 0 0 0) 0x7f6e38000bf0 con 0x7f6e34077ab0 2026-03-10T07:54:24.145 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:24.144+0000 7f6e32ffd700 1 -- 192.168.123.105:0/1106064164 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f6e34077ab0 msgr2=0x7f6e34079f70 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:54:24.145 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:24.144+0000 7f6e32ffd700 1 --2- 192.168.123.105:0/1106064164 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f6e34077ab0 0x7f6e34079f70 secure :-1 s=READY pgs=45 cs=0 l=1 rev1=1 crypto rx=0x7f6e3c000c00 tx=0x7f6e3c00b560 comp rx=0 tx=0).stop 2026-03-10T07:54:24.145 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:24.144+0000 7f6e32ffd700 1 -- 
192.168.123.105:0/1106064164 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f6e4c1abdf0 msgr2=0x7f6e4c1a58a0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:54:24.145 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:24.144+0000 7f6e32ffd700 1 --2- 192.168.123.105:0/1106064164 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f6e4c1abdf0 0x7f6e4c1a58a0 secure :-1 s=READY pgs=66 cs=0 l=1 rev1=1 crypto rx=0x7f6e4000b700 tx=0x7f6e4000bac0 comp rx=0 tx=0).stop 2026-03-10T07:54:24.146 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:24.145+0000 7f6e32ffd700 1 -- 192.168.123.105:0/1106064164 shutdown_connections 2026-03-10T07:54:24.146 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:24.145+0000 7f6e32ffd700 1 --2- 192.168.123.105:0/1106064164 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f6e4c10d0f0 0x7f6e4c1ab8b0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:54:24.146 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:24.145+0000 7f6e32ffd700 1 --2- 192.168.123.105:0/1106064164 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f6e34077ab0 0x7f6e34079f70 unknown :-1 s=CLOSED pgs=45 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:54:24.146 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:24.145+0000 7f6e32ffd700 1 --2- 192.168.123.105:0/1106064164 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f6e4c1abdf0 0x7f6e4c1a58a0 unknown :-1 s=CLOSED pgs=66 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:54:24.146 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:24.145+0000 7f6e32ffd700 1 -- 192.168.123.105:0/1106064164 >> 192.168.123.105:0/1106064164 conn(0x7f6e4c06ce20 msgr2=0x7f6e4c109eb0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:54:24.146 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:24.146+0000 7f6e32ffd700 1 -- 192.168.123.105:0/1106064164 shutdown_connections 2026-03-10T07:54:24.146 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:24.146+0000 7f6e32ffd700 1 -- 192.168.123.105:0/1106064164 wait complete. 2026-03-10T07:54:24.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:54:23 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:54:24.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:54:23 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:54:24.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:54:23 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T07:54:24.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:54:23 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T07:54:24.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:54:23 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:54:24.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:54:23 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T07:54:24.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:54:23 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T07:54:24.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:54:23 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": 
"versions"}]: dispatch 2026-03-10T07:54:24.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:54:23 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T07:54:24.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:54:23 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "osd ok-to-stop", "ids": ["3"], "max": 16}]: dispatch 2026-03-10T07:54:24.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:54:23 vm08.local ceph-mon[107898]: from='osd.2 [v2:192.168.123.105:6818/1473398242,v1:192.168.123.105:6819/1473398242]' entity='osd.2' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["2"]}]: dispatch 2026-03-10T07:54:24.245 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:24.244+0000 7f2778968700 1 -- 192.168.123.105:0/2362556472 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f2770100ee0 msgr2=0x7f27701012c0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:54:24.245 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:24.244+0000 7f2778968700 1 --2- 192.168.123.105:0/2362556472 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f2770100ee0 0x7f27701012c0 secure :-1 s=READY pgs=27 cs=0 l=1 rev1=1 crypto rx=0x7f276400df10 tx=0x7f276400f330 comp rx=0 tx=0).stop 2026-03-10T07:54:24.245 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:24.244+0000 7f2778968700 1 -- 192.168.123.105:0/2362556472 shutdown_connections 2026-03-10T07:54:24.245 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:24.244+0000 7f2778968700 1 --2- 192.168.123.105:0/2362556472 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f2770101890 0x7f27701058e0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:54:24.245 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:24.244+0000 7f2778968700 1 --2- 192.168.123.105:0/2362556472 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f2770100ee0 0x7f27701012c0 unknown :-1 s=CLOSED pgs=27 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:54:24.245 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:24.244+0000 7f2778968700 1 -- 192.168.123.105:0/2362556472 >> 192.168.123.105:0/2362556472 conn(0x7f27700fc790 msgr2=0x7f27700febb0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:54:24.245 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:24.244+0000 7f2778968700 1 -- 192.168.123.105:0/2362556472 shutdown_connections 2026-03-10T07:54:24.245 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:24.245+0000 7f2778968700 1 -- 192.168.123.105:0/2362556472 wait complete. 2026-03-10T07:54:24.245 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:24.245+0000 7f2778968700 1 Processor -- start 2026-03-10T07:54:24.246 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:24.245+0000 7f2778968700 1 -- start start 2026-03-10T07:54:24.246 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:24.246+0000 7f2778968700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f2770100ee0 0x7f2770194830 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:54:24.246 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:24.246+0000 7f2778968700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f2770101890 0x7f2770194d70 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:54:24.246 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:24.246+0000 7f2778968700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f27701953c0 con 0x7f2770100ee0 2026-03-10T07:54:24.246 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:24.246+0000 7f2778968700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f2770195500 con 0x7f2770101890 2026-03-10T07:54:24.246 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:24.246+0000 7f2776704700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f2770100ee0 0x7f2770194830 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:54:24.246 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:24.246+0000 7f2776704700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f2770100ee0 0x7f2770194830 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.105:47132/0 (socket says 192.168.123.105:47132) 2026-03-10T07:54:24.246 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:24.246+0000 7f2776704700 1 -- 192.168.123.105:0/4118578266 learned_addr learned my addr 192.168.123.105:0/4118578266 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T07:54:24.247 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:24.246+0000 7f2775f03700 1 --2- 192.168.123.105:0/4118578266 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f2770101890 0x7f2770194d70 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:54:24.247 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:24.246+0000 7f2776704700 1 -- 192.168.123.105:0/4118578266 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f2770101890 msgr2=0x7f2770194d70 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:54:24.247 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:24.246+0000 7f2776704700 1 --2- 
192.168.123.105:0/4118578266 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f2770101890 0x7f2770194d70 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:54:24.247 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:24.246+0000 7f2776704700 1 -- 192.168.123.105:0/4118578266 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f276400dbf0 con 0x7f2770100ee0 2026-03-10T07:54:24.247 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:24.247+0000 7f2776704700 1 --2- 192.168.123.105:0/4118578266 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f2770100ee0 0x7f2770194830 secure :-1 s=READY pgs=67 cs=0 l=1 rev1=1 crypto rx=0x7f276400f7b0 tx=0x7f2764004bd0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:54:24.247 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:24.247+0000 7f27637fe700 1 -- 192.168.123.105:0/4118578266 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f2764012040 con 0x7f2770100ee0 2026-03-10T07:54:24.248 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:24.247+0000 7f27637fe700 1 -- 192.168.123.105:0/4118578266 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7f2764005710 con 0x7f2770100ee0 2026-03-10T07:54:24.248 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:24.247+0000 7f2778968700 1 -- 192.168.123.105:0/4118578266 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f27701990e0 con 0x7f2770100ee0 2026-03-10T07:54:24.250 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:24.248+0000 7f2778968700 1 -- 192.168.123.105:0/4118578266 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f27701995d0 con 0x7f2770100ee0 2026-03-10T07:54:24.250 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:24.250+0000 7f27637fe700 1 -- 192.168.123.105:0/4118578266 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f276400b7e0 con 0x7f2770100ee0 2026-03-10T07:54:24.252 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:24.252+0000 7f2778968700 1 -- 192.168.123.105:0/4118578266 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f2754005320 con 0x7f2770100ee0 2026-03-10T07:54:24.253 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:24.253+0000 7f27637fe700 1 -- 192.168.123.105:0/4118578266 <== mon.0 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 38) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f276400ba50 con 0x7f2770100ee0 2026-03-10T07:54:24.254 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:24.253+0000 7f27637fe700 1 --2- 192.168.123.105:0/4118578266 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f275c0776b0 0x7f275c079b70 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:54:24.254 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:24.253+0000 7f27637fe700 1 -- 192.168.123.105:0/4118578266 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(63..63 src has 1..63) v4 ==== 6136+0+0 (secure 0 0 0) 0x7f276409ef20 con 0x7f2770100ee0 2026-03-10T07:54:24.256 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:24.255+0000 7f2775f03700 1 --2- 192.168.123.105:0/4118578266 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f275c0776b0 0x7f275c079b70 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:54:24.256 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:24.255+0000 7f2775f03700 1 --2- 192.168.123.105:0/4118578266 >> 
[v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f275c0776b0 0x7f275c079b70 secure :-1 s=READY pgs=46 cs=0 l=1 rev1=1 crypto rx=0x7f2770195c40 tx=0x7f276800a380 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-10T07:54:24.257 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:24.256+0000 7f27637fe700 1 -- 192.168.123.105:0/4118578266 <== mon.0 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f2764067750 con 0x7f2770100ee0
2026-03-10T07:54:24.383 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:24.383+0000 7f2778968700 1 -- 192.168.123.105:0/4118578266 --> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] -- mgr_command(tid 0: {"prefix": "orch ps", "target": ["mon-mgr", ""]}) v1 -- 0x7f2754000bf0 con 0x7f275c0776b0
2026-03-10T07:54:24.389 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:24.388+0000 7f27637fe700 1 -- 192.168.123.105:0/4118578266 <== mgr.34104 v2:192.168.123.105:6800/627686298 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+3576 (secure 0 0 0) 0x7f2754000bf0 con 0x7f275c0776b0
2026-03-10T07:54:24.392 INFO:teuthology.orchestra.run.vm05.stdout:NAME HOST PORTS STATUS REFRESHED AGE MEM USE MEM LIM VERSION IMAGE ID CONTAINER ID
2026-03-10T07:54:24.392 INFO:teuthology.orchestra.run.vm05.stdout:alertmanager.vm05 vm05 *:9093,9094 running (3m) 3s ago 8m 23.3M - 0.25.0 c8568f914cd2 ac15d5f35994
2026-03-10T07:54:24.392 INFO:teuthology.orchestra.run.vm05.stdout:ceph-exporter.vm05 vm05 running (8m) 3s ago 8m 9.98M - 18.2.1 5be31c24972a 26c4db858175
2026-03-10T07:54:24.392 INFO:teuthology.orchestra.run.vm05.stdout:ceph-exporter.vm08 vm08 running (7m) 89s ago 7m 10.4M - 18.2.1 5be31c24972a 209e2398a09c
2026-03-10T07:54:24.392 INFO:teuthology.orchestra.run.vm05.stdout:crash.vm05 vm05 running (93s) 3s ago 8m 7847k - 19.2.3-678-ge911bdeb 654f31e6858e daa831c74cf4
2026-03-10T07:54:24.392 INFO:teuthology.orchestra.run.vm05.stdout:crash.vm08 vm08 running (91s) 89s ago 7m 8262k - 19.2.3-678-ge911bdeb 654f31e6858e 668ac55c9722
2026-03-10T07:54:24.392 INFO:teuthology.orchestra.run.vm05.stdout:grafana.vm05 vm05 *:3000 running (2m) 3s ago 7m 81.7M - 10.4.0 c8b91775d855 6acb529ad951
2026-03-10T07:54:24.392 INFO:teuthology.orchestra.run.vm05.stdout:mds.cephfs.vm05.omfhnh vm05 running (5m) 3s ago 5m 186M - 18.2.1 5be31c24972a e23de179e09c
2026-03-10T07:54:24.392 INFO:teuthology.orchestra.run.vm05.stdout:mds.cephfs.vm05.pavqil vm05 running (5m) 3s ago 5m 17.6M - 18.2.1 5be31c24972a 5b9e5afa214c
2026-03-10T07:54:24.392 INFO:teuthology.orchestra.run.vm05.stdout:mds.cephfs.vm08.dgsaon vm08 running (5m) 89s ago 5m 17.7M - 18.2.1 5be31c24972a 1696aee522b5
2026-03-10T07:54:24.392 INFO:teuthology.orchestra.run.vm05.stdout:mds.cephfs.vm08.ybmbgd vm08 running (5m) 89s ago 5m 16.0M - 18.2.1 5be31c24972a 30b0e51cd2ed
2026-03-10T07:54:24.392 INFO:teuthology.orchestra.run.vm05.stdout:mgr.vm05.blexke vm05 *:8443,9283,8765 running (4m) 3s ago 8m 586M - 19.2.3-678-ge911bdeb 654f31e6858e 5467b6a3bd38
2026-03-10T07:54:24.392 INFO:teuthology.orchestra.run.vm05.stdout:mgr.vm08.orfpog vm08 *:8443,9283,8765 running (3m) 89s ago 7m 534M - 19.2.3-678-ge911bdeb 654f31e6858e 7a15d7811d5b
2026-03-10T07:54:24.392 INFO:teuthology.orchestra.run.vm05.stdout:mon.vm05 vm05 running (2m) 3s ago 8m 58.0M 2048M 19.2.3-678-ge911bdeb 654f31e6858e f02f076bb820
2026-03-10T07:54:24.392 INFO:teuthology.orchestra.run.vm05.stdout:mon.vm08 vm08 running (109s) 89s ago 7m 48.8M 2048M 19.2.3-678-ge911bdeb 654f31e6858e 73d9a504f360
2026-03-10T07:54:24.392 INFO:teuthology.orchestra.run.vm05.stdout:node-exporter.vm05 vm05 *:9100 running (3m) 3s ago 8m 10.3M - 1.7.0 72c9c2088986 7cd0b23b4118
2026-03-10T07:54:24.392 INFO:teuthology.orchestra.run.vm05.stdout:node-exporter.vm08 vm08 *:9100 running (3m) 89s ago 7m 9.92M - 1.7.0 72c9c2088986 3dd4d91d5881
2026-03-10T07:54:24.392 INFO:teuthology.orchestra.run.vm05.stdout:osd.0 vm05 running (81s) 3s ago 7m 201M 4096M 19.2.3-678-ge911bdeb 654f31e6858e b35fccc2a4d5
2026-03-10T07:54:24.392 INFO:teuthology.orchestra.run.vm05.stdout:osd.1 vm05 running (26s) 3s ago 7m 95.0M 4096M 19.2.3-678-ge911bdeb 654f31e6858e e3fe4ad5c6a3
2026-03-10T07:54:24.393 INFO:teuthology.orchestra.run.vm05.stdout:osd.2 vm05 running (4s) 3s ago 6m 13.0M 4096M 19.2.3-678-ge911bdeb 654f31e6858e 108a77e324b8
2026-03-10T07:54:24.393 INFO:teuthology.orchestra.run.vm05.stdout:osd.3 vm08 running (6m) 89s ago 6m 410M 4096M 18.2.1 5be31c24972a 0a62c54a86c0
2026-03-10T07:54:24.393 INFO:teuthology.orchestra.run.vm05.stdout:osd.4 vm08 running (6m) 89s ago 6m 343M 4096M 18.2.1 5be31c24972a bd748b691ccd
2026-03-10T07:54:24.393 INFO:teuthology.orchestra.run.vm05.stdout:osd.5 vm08 running (6m) 89s ago 6m 284M 4096M 18.2.1 5be31c24972a 9f08820ae98b
2026-03-10T07:54:24.393 INFO:teuthology.orchestra.run.vm05.stdout:prometheus.vm05 vm05 *:9095 running (3m) 3s ago 7m 51.4M - 2.51.0 1d3b7f56885b c59a6be07563
2026-03-10T07:54:24.395 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:24.394+0000 7f2778968700 1 -- 192.168.123.105:0/4118578266 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f275c0776b0 msgr2=0x7f275c079b70 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T07:54:24.396 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:24.394+0000 7f2778968700 1 --2- 192.168.123.105:0/4118578266 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f275c0776b0 0x7f275c079b70 secure :-1 s=READY pgs=46 cs=0 l=1 rev1=1 crypto rx=0x7f2770195c40 tx=0x7f276800a380 comp rx=0 tx=0).stop
2026-03-10T07:54:24.396 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:24.394+0000 7f2778968700 1 -- 192.168.123.105:0/4118578266 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f2770100ee0 msgr2=0x7f2770194830 secure :-1 s=STATE_CONNECTION_ESTABLISHED
l=1).mark_down 2026-03-10T07:54:24.396 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:24.394+0000 7f2778968700 1 --2- 192.168.123.105:0/4118578266 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f2770100ee0 0x7f2770194830 secure :-1 s=READY pgs=67 cs=0 l=1 rev1=1 crypto rx=0x7f276400f7b0 tx=0x7f2764004bd0 comp rx=0 tx=0).stop 2026-03-10T07:54:24.396 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:24.395+0000 7f2778968700 1 -- 192.168.123.105:0/4118578266 shutdown_connections 2026-03-10T07:54:24.396 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:24.395+0000 7f2778968700 1 --2- 192.168.123.105:0/4118578266 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f275c0776b0 0x7f275c079b70 unknown :-1 s=CLOSED pgs=46 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:54:24.396 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:24.395+0000 7f2778968700 1 --2- 192.168.123.105:0/4118578266 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f2770100ee0 0x7f2770194830 unknown :-1 s=CLOSED pgs=67 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:54:24.396 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:24.395+0000 7f2778968700 1 --2- 192.168.123.105:0/4118578266 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f2770101890 0x7f2770194d70 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:54:24.396 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:24.395+0000 7f2778968700 1 -- 192.168.123.105:0/4118578266 >> 192.168.123.105:0/4118578266 conn(0x7f27700fc790 msgr2=0x7f27700fdb60 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:54:24.396 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:24.395+0000 7f2778968700 1 -- 192.168.123.105:0/4118578266 shutdown_connections 2026-03-10T07:54:24.396 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:24.395+0000 7f2778968700 1 -- 192.168.123.105:0/4118578266 wait complete. 2026-03-10T07:54:24.475 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:24.475+0000 7f2597fff700 1 -- 192.168.123.105:0/3784446499 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f259810f340 msgr2=0x7f259810f720 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:54:24.475 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:24.475+0000 7f2597fff700 1 --2- 192.168.123.105:0/3784446499 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f259810f340 0x7f259810f720 secure :-1 s=READY pgs=28 cs=0 l=1 rev1=1 crypto rx=0x7f2588009b00 tx=0x7f2588009e10 comp rx=0 tx=0).stop 2026-03-10T07:54:24.475 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:24.475+0000 7f2597fff700 1 -- 192.168.123.105:0/3784446499 shutdown_connections 2026-03-10T07:54:24.475 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:24.475+0000 7f2597fff700 1 --2- 192.168.123.105:0/3784446499 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f259810d0f0 0x7f259810d570 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:54:24.475 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:24.475+0000 7f2597fff700 1 --2- 192.168.123.105:0/3784446499 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f259810f340 0x7f259810f720 unknown :-1 s=CLOSED pgs=28 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:54:24.475 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:24.475+0000 7f2597fff700 1 -- 192.168.123.105:0/3784446499 >> 192.168.123.105:0/3784446499 conn(0x7f259806ce20 msgr2=0x7f259806d230 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:54:24.475 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:24.475+0000 7f2597fff700 1 -- 192.168.123.105:0/3784446499 shutdown_connections 
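[editor's note] The `orch ps` listing above captures the staggered upgrade mid-flight: mon/mgr/crash daemons and osd.0-2 already report 19.2.3-678-ge911bdeb, while osd.3-5 and all four MDS daemons still report 18.2.1. A minimal sketch of how a harness might flag daemons not yet on the target release, assuming JSON output shaped like `ceph orch ps --format json` with `daemon_name` and `version` fields (both field names are assumptions here; the embedded sample stands in for real output):

```python
# Sketch: list daemons still running the pre-upgrade release.
# The sample below is a hand-made stand-in; field names `daemon_name`
# and `version` are assumptions about the orchestrator's JSON shape.
import json

sample = json.dumps([
    {"daemon_name": "mgr.vm05.blexke", "version": "19.2.3-678-ge911bdeb"},
    {"daemon_name": "osd.3", "version": "18.2.1"},
    {"daemon_name": "mds.cephfs.vm08.dgsaon", "version": "18.2.1"},
])

def pending_upgrade(ps_json: str, target: str) -> list:
    """Daemon names whose reported version does not start with `target`."""
    return [d["daemon_name"] for d in json.loads(ps_json)
            if not d["version"].startswith(target)]

print(pending_upgrade(sample, "19.2.3"))
# ['osd.3', 'mds.cephfs.vm08.dgsaon']
```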
2026-03-10T07:54:24.475 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:24.475+0000 7f2597fff700 1 -- 192.168.123.105:0/3784446499 wait complete. 2026-03-10T07:54:24.476 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:24.476+0000 7f2597fff700 1 Processor -- start 2026-03-10T07:54:24.476 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:24.476+0000 7f2597fff700 1 -- start start 2026-03-10T07:54:24.477 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:24.476+0000 7f2597fff700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f259810d0f0 0x7f25981ab550 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:54:24.477 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:24.476+0000 7f2597fff700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f259810f340 0x7f25981aba90 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:54:24.477 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:24.476+0000 7f2597fff700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f25981ac120 con 0x7f259810d0f0 2026-03-10T07:54:24.477 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:24.476+0000 7f2597fff700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f25981a55d0 con 0x7f259810f340 2026-03-10T07:54:24.477 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:24.476+0000 7f25967fc700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f259810f340 0x7f25981aba90 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:54:24.477 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:24.476+0000 7f25967fc700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f259810f340 
0x7f25981aba90 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.108:3300/0 says I am v2:192.168.123.105:52240/0 (socket says 192.168.123.105:52240) 2026-03-10T07:54:24.477 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:24.476+0000 7f25967fc700 1 -- 192.168.123.105:0/1970366160 learned_addr learned my addr 192.168.123.105:0/1970366160 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T07:54:24.477 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:24.477+0000 7f25967fc700 1 -- 192.168.123.105:0/1970366160 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f259810d0f0 msgr2=0x7f25981ab550 unknown :-1 s=STATE_CONNECTING l=1).mark_down 2026-03-10T07:54:24.477 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:24.477+0000 7f25967fc700 1 --2- 192.168.123.105:0/1970366160 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f259810d0f0 0x7f25981ab550 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:54:24.477 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:24.477+0000 7f25967fc700 1 -- 192.168.123.105:0/1970366160 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f2590009e30 con 0x7f259810f340 2026-03-10T07:54:24.479 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:24.477+0000 7f25967fc700 1 --2- 192.168.123.105:0/1970366160 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f259810f340 0x7f25981aba90 secure :-1 s=READY pgs=29 cs=0 l=1 rev1=1 crypto rx=0x7f259000e3f0 tx=0x7f259000e7b0 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:54:24.479 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:24.477+0000 7f257ffff700 1 -- 192.168.123.105:0/1970366160 <== mon.1 v2:192.168.123.108:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 
0x7f2590019070 con 0x7f259810f340 2026-03-10T07:54:24.479 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:24.477+0000 7f2597fff700 1 -- 192.168.123.105:0/1970366160 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f25880097e0 con 0x7f259810f340 2026-03-10T07:54:24.479 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:24.477+0000 7f2597fff700 1 -- 192.168.123.105:0/1970366160 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f25981a5c70 con 0x7f259810f340 2026-03-10T07:54:24.479 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:24.478+0000 7f257ffff700 1 -- 192.168.123.105:0/1970366160 <== mon.1 v2:192.168.123.108:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7f259000f040 con 0x7f259810f340 2026-03-10T07:54:24.479 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:24.478+0000 7f257ffff700 1 -- 192.168.123.105:0/1970366160 <== mon.1 v2:192.168.123.108:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f25900156b0 con 0x7f259810f340 2026-03-10T07:54:24.479 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:24.478+0000 7f2597fff700 1 -- 192.168.123.105:0/1970366160 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f2584005320 con 0x7f259810f340 2026-03-10T07:54:24.481 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:24.480+0000 7f257ffff700 1 -- 192.168.123.105:0/1970366160 <== mon.1 v2:192.168.123.108:3300/0 4 ==== mgrmap(e 38) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f2590003bb0 con 0x7f259810f340 2026-03-10T07:54:24.481 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:24.480+0000 7f257ffff700 1 --2- 192.168.123.105:0/1970366160 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f25800776b0 0x7f2580079b70 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 
tx=0).connect 2026-03-10T07:54:24.481 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:24.480+0000 7f257ffff700 1 -- 192.168.123.105:0/1970366160 <== mon.1 v2:192.168.123.108:3300/0 5 ==== osd_map(63..63 src has 1..63) v4 ==== 6136+0+0 (secure 0 0 0) 0x7f259009a690 con 0x7f259810f340 2026-03-10T07:54:24.481 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:24.481+0000 7f2596ffd700 1 --2- 192.168.123.105:0/1970366160 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f25800776b0 0x7f2580079b70 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:54:24.490 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:24.481+0000 7f2596ffd700 1 --2- 192.168.123.105:0/1970366160 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f25800776b0 0x7f2580079b70 secure :-1 s=READY pgs=47 cs=0 l=1 rev1=1 crypto rx=0x7f258800b5c0 tx=0x7f2588005fb0 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:54:24.490 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:24.482+0000 7f257ffff700 1 -- 192.168.123.105:0/1970366160 <== mon.1 v2:192.168.123.108:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f2590062f40 con 0x7f259810f340 2026-03-10T07:54:24.650 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:24.649+0000 7f2597fff700 1 -- 192.168.123.105:0/1970366160 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "versions"} v 0) v1 -- 0x7f2584005cc0 con 0x7f259810f340 2026-03-10T07:54:24.651 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:24.650+0000 7f257ffff700 1 -- 192.168.123.105:0/1970366160 <== mon.1 v2:192.168.123.108:3300/0 7 ==== mon_command_ack([{"prefix": "versions"}]=0 v0) v1 ==== 56+0+799 (secure 0 0 0) 0x7f2590062690 con 
0x7f259810f340
2026-03-10T07:54:24.651 INFO:teuthology.orchestra.run.vm05.stdout:{
2026-03-10T07:54:24.651 INFO:teuthology.orchestra.run.vm05.stdout: "mon": {
2026-03-10T07:54:24.651 INFO:teuthology.orchestra.run.vm05.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 2
2026-03-10T07:54:24.651 INFO:teuthology.orchestra.run.vm05.stdout: },
2026-03-10T07:54:24.651 INFO:teuthology.orchestra.run.vm05.stdout: "mgr": {
2026-03-10T07:54:24.651 INFO:teuthology.orchestra.run.vm05.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 2
2026-03-10T07:54:24.651 INFO:teuthology.orchestra.run.vm05.stdout: },
2026-03-10T07:54:24.651 INFO:teuthology.orchestra.run.vm05.stdout: "osd": {
2026-03-10T07:54:24.651 INFO:teuthology.orchestra.run.vm05.stdout: "ceph version 18.2.1 (7fe91d5d5842e04be3b4f514d6dd990c54b29c76) reef (stable)": 3,
2026-03-10T07:54:24.651 INFO:teuthology.orchestra.run.vm05.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 2
2026-03-10T07:54:24.651 INFO:teuthology.orchestra.run.vm05.stdout: },
2026-03-10T07:54:24.651 INFO:teuthology.orchestra.run.vm05.stdout: "mds": {
2026-03-10T07:54:24.651 INFO:teuthology.orchestra.run.vm05.stdout: "ceph version 18.2.1 (7fe91d5d5842e04be3b4f514d6dd990c54b29c76) reef (stable)": 4
2026-03-10T07:54:24.651 INFO:teuthology.orchestra.run.vm05.stdout: },
2026-03-10T07:54:24.651 INFO:teuthology.orchestra.run.vm05.stdout: "overall": {
2026-03-10T07:54:24.651 INFO:teuthology.orchestra.run.vm05.stdout: "ceph version 18.2.1 (7fe91d5d5842e04be3b4f514d6dd990c54b29c76) reef (stable)": 7,
2026-03-10T07:54:24.651 INFO:teuthology.orchestra.run.vm05.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 6
2026-03-10T07:54:24.651 INFO:teuthology.orchestra.run.vm05.stdout: }
2026-03-10T07:54:24.651 INFO:teuthology.orchestra.run.vm05.stdout:}
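[editor's note] The `ceph versions` JSON above shows the expected staggered state: both mons and both mgrs are on squid (19.2.3), the OSDs are split 2 squid / 3 reef, and all four MDS daemons are still on reef (18.2.1). A minimal sketch of computing per-daemon-type upgrade progress from that JSON (helper name and abbreviated version strings are illustrative, not from teuthology; the match only needs the release-number substring):

```python
# Sketch: per-daemon-type upgrade progress from `ceph versions`-style JSON.
# Version strings are abbreviated stand-ins for the full strings in the log.
import json

sample = json.dumps({
    "mon": {"ceph version 19.2.3-678-ge911bdeb squid (stable)": 2},
    "mgr": {"ceph version 19.2.3-678-ge911bdeb squid (stable)": 2},
    "osd": {"ceph version 18.2.1 reef (stable)": 3,
            "ceph version 19.2.3-678-ge911bdeb squid (stable)": 2},
    "mds": {"ceph version 18.2.1 reef (stable)": 4},
})

def upgraded_fraction(versions: dict, daemon: str, target: str) -> float:
    """Fraction of `daemon` instances whose version string contains `target`."""
    counts = versions.get(daemon, {})
    total = sum(counts.values())
    done = sum(n for ver, n in counts.items() if target in ver)
    return done / total if total else 0.0

v = json.loads(sample)
print(upgraded_fraction(v, "mgr", "19.2.3"))  # 1.0: mgrs fully upgraded
print(upgraded_fraction(v, "osd", "19.2.3"))  # 0.4: OSD upgrade in flight
print(upgraded_fraction(v, "mds", "19.2.3"))  # 0.0: MDS daemons still on reef
```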
2026-03-10T07:54:24.653 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:24.653+0000 7f2597fff700 1 -- 192.168.123.105:0/1970366160 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f25800776b0 msgr2=0x7f2580079b70 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:54:24.654 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:24.653+0000 7f2597fff700 1 --2- 192.168.123.105:0/1970366160 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f25800776b0 0x7f2580079b70 secure :-1 s=READY pgs=47 cs=0 l=1 rev1=1 crypto rx=0x7f258800b5c0 tx=0x7f2588005fb0 comp rx=0 tx=0).stop 2026-03-10T07:54:24.654 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:24.653+0000 7f2597fff700 1 -- 192.168.123.105:0/1970366160 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f259810f340 msgr2=0x7f25981aba90 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:54:24.654 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:24.653+0000 7f2597fff700 1 --2- 192.168.123.105:0/1970366160 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f259810f340 0x7f25981aba90 secure :-1 s=READY pgs=29 cs=0 l=1 rev1=1 crypto rx=0x7f259000e3f0 tx=0x7f259000e7b0 comp rx=0 tx=0).stop 2026-03-10T07:54:24.654 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:24.654+0000 7f2597fff700 1 -- 192.168.123.105:0/1970366160 shutdown_connections 2026-03-10T07:54:24.654 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:24.654+0000 7f2597fff700 1 --2- 192.168.123.105:0/1970366160 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f25800776b0 0x7f2580079b70 unknown :-1 s=CLOSED pgs=47 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:54:24.654 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:24.654+0000 7f2597fff700 1 --2- 192.168.123.105:0/1970366160 >> 
[v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f259810d0f0 0x7f25981ab550 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:54:24.654 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:24.654+0000 7f2597fff700 1 --2- 192.168.123.105:0/1970366160 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f259810f340 0x7f25981aba90 unknown :-1 s=CLOSED pgs=29 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:54:24.654 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:24.654+0000 7f2597fff700 1 -- 192.168.123.105:0/1970366160 >> 192.168.123.105:0/1970366160 conn(0x7f259806ce20 msgr2=0x7f2598070480 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:54:24.654 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:24.654+0000 7f2597fff700 1 -- 192.168.123.105:0/1970366160 shutdown_connections 2026-03-10T07:54:24.654 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:24.654+0000 7f2597fff700 1 -- 192.168.123.105:0/1970366160 wait complete. 
2026-03-10T07:54:24.730 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:24.730+0000 7f7eeb996700 1 -- 192.168.123.105:0/3623609361 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f7ee4068df0 msgr2=0x7f7ee410d2f0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:54:24.730 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:24.730+0000 7f7eeb996700 1 --2- 192.168.123.105:0/3623609361 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f7ee4068df0 0x7f7ee410d2f0 secure :-1 s=READY pgs=68 cs=0 l=1 rev1=1 crypto rx=0x7f7ee0009b50 tx=0x7f7ee0009e60 comp rx=0 tx=0).stop 2026-03-10T07:54:24.730 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:24.730+0000 7f7eeb996700 1 -- 192.168.123.105:0/3623609361 shutdown_connections 2026-03-10T07:54:24.730 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:24.730+0000 7f7eeb996700 1 --2- 192.168.123.105:0/3623609361 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f7ee4068df0 0x7f7ee410d2f0 unknown :-1 s=CLOSED pgs=68 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:54:24.730 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:24.730+0000 7f7eeb996700 1 --2- 192.168.123.105:0/3623609361 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f7ee40684d0 0x7f7ee40688b0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:54:24.730 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:24.730+0000 7f7eeb996700 1 -- 192.168.123.105:0/3623609361 >> 192.168.123.105:0/3623609361 conn(0x7f7ee40756c0 msgr2=0x7f7ee4075ad0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:54:24.732 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:24.730+0000 7f7eeb996700 1 -- 192.168.123.105:0/3623609361 shutdown_connections 2026-03-10T07:54:24.733 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:24.731+0000 7f7eeb996700 1 -- 192.168.123.105:0/3623609361 
wait complete. 2026-03-10T07:54:24.733 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:24.733+0000 7f7eeb996700 1 Processor -- start 2026-03-10T07:54:24.735 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:24.735+0000 7f7eeb996700 1 -- start start 2026-03-10T07:54:24.735 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:24.735+0000 7f7eeb996700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f7ee40684d0 0x7f7ee419e8e0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:54:24.736 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:24.735+0000 7f7eeb996700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f7ee4068df0 0x7f7ee419ee20 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:54:24.736 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:24.735+0000 7f7eeb996700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f7ee419f4b0 con 0x7f7ee4068df0 2026-03-10T07:54:24.736 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:24.735+0000 7f7eeb996700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f7ee4198960 con 0x7f7ee40684d0 2026-03-10T07:54:24.736 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:24.736+0000 7f7ee9732700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f7ee40684d0 0x7f7ee419e8e0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:54:24.736 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:24.736+0000 7f7ee9732700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f7ee40684d0 0x7f7ee419e8e0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.108:3300/0 
says I am v2:192.168.123.105:52256/0 (socket says 192.168.123.105:52256) 2026-03-10T07:54:24.736 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:24.736+0000 7f7ee9732700 1 -- 192.168.123.105:0/3136458941 learned_addr learned my addr 192.168.123.105:0/3136458941 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T07:54:24.736 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:24.736+0000 7f7ee8f31700 1 --2- 192.168.123.105:0/3136458941 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f7ee4068df0 0x7f7ee419ee20 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:54:24.736 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:24.736+0000 7f7ee9732700 1 -- 192.168.123.105:0/3136458941 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f7ee4068df0 msgr2=0x7f7ee419ee20 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:54:24.737 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:24.737+0000 7f7ee9732700 1 --2- 192.168.123.105:0/3136458941 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f7ee4068df0 0x7f7ee419ee20 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:54:24.737 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:24.737+0000 7f7ee9732700 1 -- 192.168.123.105:0/3136458941 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f7ee00097e0 con 0x7f7ee40684d0 2026-03-10T07:54:24.737 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:24.737+0000 7f7ee8f31700 1 --2- 192.168.123.105:0/3136458941 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f7ee4068df0 0x7f7ee419ee20 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_auth_reply_more state changed! 
2026-03-10T07:54:24.737 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:24.737+0000 7f7ee9732700 1 --2- 192.168.123.105:0/3136458941 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f7ee40684d0 0x7f7ee419e8e0 secure :-1 s=READY pgs=30 cs=0 l=1 rev1=1 crypto rx=0x7f7ed800eb10 tx=0x7f7ed800eed0 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:54:24.738 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:24.737+0000 7f7ed67fc700 1 -- 192.168.123.105:0/3136458941 <== mon.1 v2:192.168.123.108:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f7ed800cca0 con 0x7f7ee40684d0 2026-03-10T07:54:24.739 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:24.737+0000 7f7ed67fc700 1 -- 192.168.123.105:0/3136458941 <== mon.1 v2:192.168.123.108:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7f7ed800ce00 con 0x7f7ee40684d0 2026-03-10T07:54:24.739 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:24.737+0000 7f7ed67fc700 1 -- 192.168.123.105:0/3136458941 <== mon.1 v2:192.168.123.108:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f7ed8018910 con 0x7f7ee40684d0 2026-03-10T07:54:24.739 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:24.738+0000 7f7eeb996700 1 -- 192.168.123.105:0/3136458941 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f7ee4198bf0 con 0x7f7ee40684d0 2026-03-10T07:54:24.739 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:24.738+0000 7f7eeb996700 1 -- 192.168.123.105:0/3136458941 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f7ee4199140 con 0x7f7ee40684d0 2026-03-10T07:54:24.739 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:24.739+0000 7f7eeb996700 1 -- 192.168.123.105:0/3136458941 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": 
"get_command_descriptions"} v 0) v1 -- 0x7f7ec8005320 con 0x7f7ee40684d0 2026-03-10T07:54:24.743 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:24.741+0000 7f7ed67fc700 1 -- 192.168.123.105:0/3136458941 <== mon.1 v2:192.168.123.108:3300/0 4 ==== mgrmap(e 38) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f7ed8018a70 con 0x7f7ee40684d0 2026-03-10T07:54:24.743 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:24.741+0000 7f7ed67fc700 1 --2- 192.168.123.105:0/3136458941 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f7ed00779e0 0x7f7ed0079ea0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:54:24.743 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:24.741+0000 7f7ed67fc700 1 -- 192.168.123.105:0/3136458941 <== mon.1 v2:192.168.123.108:3300/0 5 ==== osd_map(63..63 src has 1..63) v4 ==== 6136+0+0 (secure 0 0 0) 0x7f7ed8027080 con 0x7f7ee40684d0 2026-03-10T07:54:24.743 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:24.741+0000 7f7ee8f31700 1 --2- 192.168.123.105:0/3136458941 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f7ed00779e0 0x7f7ed0079ea0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:54:24.743 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:24.742+0000 7f7ee8f31700 1 --2- 192.168.123.105:0/3136458941 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f7ed00779e0 0x7f7ed0079ea0 secure :-1 s=READY pgs=48 cs=0 l=1 rev1=1 crypto rx=0x7f7ee0006010 tx=0x7f7ee00058e0 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:54:24.744 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:24.744+0000 7f7ed67fc700 1 -- 192.168.123.105:0/3136458941 <== mon.1 v2:192.168.123.108:3300/0 6 ==== mon_command_ack([{"prefix": 
"get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f7ed8062e70 con 0x7f7ee40684d0 2026-03-10T07:54:24.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:54:24 vm05.local ceph-mon[130117]: pgmap v80: 65 pgs: 8 active+undersized, 12 peering, 7 active+undersized+degraded, 38 active+clean; 255 MiB data, 2.4 GiB used, 118 GiB / 120 GiB avail; 24/261 objects degraded (9.195%) 2026-03-10T07:54:24.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:54:24 vm05.local ceph-mon[130117]: from='mon.0 -' entity='mon.' cmd=[{"prefix": "osd ok-to-stop", "ids": ["3"], "max": 16}]: dispatch 2026-03-10T07:54:24.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:54:24 vm05.local ceph-mon[130117]: Upgrade: unsafe to stop osd(s) at this time (11 PGs are or would become offline) 2026-03-10T07:54:24.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:54:24 vm05.local ceph-mon[130117]: from='osd.2 [v2:192.168.123.105:6818/1473398242,v1:192.168.123.105:6819/1473398242]' entity='osd.2' cmd='[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["2"]}]': finished 2026-03-10T07:54:24.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:54:24 vm05.local ceph-mon[130117]: osdmap e63: 6 total, 5 up, 6 in 2026-03-10T07:54:24.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:54:24 vm05.local ceph-mon[130117]: from='osd.2 [v2:192.168.123.105:6818/1473398242,v1:192.168.123.105:6819/1473398242]' entity='osd.2' cmd=[{"prefix": "osd crush create-or-move", "id": 2, "weight":0.0195, "args": ["host=vm05", "root=default"]}]: dispatch 2026-03-10T07:54:24.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:54:24 vm05.local ceph-mon[130117]: from='client.? 
192.168.123.105:0/1970366160' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T07:54:24.911 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:24.910+0000 7f7eeb996700 1 -- 192.168.123.105:0/3136458941 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "fs dump"} v 0) v1 -- 0x7f7ec8005cc0 con 0x7f7ee40684d0 2026-03-10T07:54:24.911 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:24.911+0000 7f7ed67fc700 1 -- 192.168.123.105:0/3136458941 <== mon.1 v2:192.168.123.108:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump"}]=0 dumped fsmap epoch 12 v12) v1 ==== 76+0+1915 (secure 0 0 0) 0x7f7ed80625c0 con 0x7f7ee40684d0 2026-03-10T07:54:24.912 INFO:teuthology.orchestra.run.vm05.stdout:e12 2026-03-10T07:54:24.912 INFO:teuthology.orchestra.run.vm05.stdout:btime 1970-01-01T00:00:00:000000+0000 2026-03-10T07:54:24.912 INFO:teuthology.orchestra.run.vm05.stdout:enable_multiple, ever_enabled_multiple: 1,1 2026-03-10T07:54:24.912 INFO:teuthology.orchestra.run.vm05.stdout:default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2} 2026-03-10T07:54:24.912 INFO:teuthology.orchestra.run.vm05.stdout:legacy client fscid: 1 2026-03-10T07:54:24.912 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-10T07:54:24.912 INFO:teuthology.orchestra.run.vm05.stdout:Filesystem 'cephfs' (1) 2026-03-10T07:54:24.912 INFO:teuthology.orchestra.run.vm05.stdout:fs_name cephfs 2026-03-10T07:54:24.912 INFO:teuthology.orchestra.run.vm05.stdout:epoch 12 2026-03-10T07:54:24.912 INFO:teuthology.orchestra.run.vm05.stdout:flags 12 joinable allow_snaps allow_multimds_snaps 2026-03-10T07:54:24.912 INFO:teuthology.orchestra.run.vm05.stdout:created 2026-03-10T07:48:24.309293+0000 2026-03-10T07:54:24.912 
INFO:teuthology.orchestra.run.vm05.stdout:modified 2026-03-10T07:48:33.425187+0000 2026-03-10T07:54:24.912 INFO:teuthology.orchestra.run.vm05.stdout:tableserver 0 2026-03-10T07:54:24.912 INFO:teuthology.orchestra.run.vm05.stdout:root 0 2026-03-10T07:54:24.912 INFO:teuthology.orchestra.run.vm05.stdout:session_timeout 60 2026-03-10T07:54:24.912 INFO:teuthology.orchestra.run.vm05.stdout:session_autoclose 300 2026-03-10T07:54:24.912 INFO:teuthology.orchestra.run.vm05.stdout:max_file_size 1099511627776 2026-03-10T07:54:24.912 INFO:teuthology.orchestra.run.vm05.stdout:max_xattr_size 65536 2026-03-10T07:54:24.912 INFO:teuthology.orchestra.run.vm05.stdout:required_client_features {} 2026-03-10T07:54:24.912 INFO:teuthology.orchestra.run.vm05.stdout:last_failure 0 2026-03-10T07:54:24.912 INFO:teuthology.orchestra.run.vm05.stdout:last_failure_osd_epoch 39 2026-03-10T07:54:24.912 INFO:teuthology.orchestra.run.vm05.stdout:compat compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2} 2026-03-10T07:54:24.912 INFO:teuthology.orchestra.run.vm05.stdout:max_mds 1 2026-03-10T07:54:24.912 INFO:teuthology.orchestra.run.vm05.stdout:in 0 2026-03-10T07:54:24.912 INFO:teuthology.orchestra.run.vm05.stdout:up {0=24297} 2026-03-10T07:54:24.912 INFO:teuthology.orchestra.run.vm05.stdout:failed 2026-03-10T07:54:24.912 INFO:teuthology.orchestra.run.vm05.stdout:damaged 2026-03-10T07:54:24.912 INFO:teuthology.orchestra.run.vm05.stdout:stopped 2026-03-10T07:54:24.913 INFO:teuthology.orchestra.run.vm05.stdout:data_pools [3] 2026-03-10T07:54:24.913 INFO:teuthology.orchestra.run.vm05.stdout:metadata_pool 2 2026-03-10T07:54:24.913 INFO:teuthology.orchestra.run.vm05.stdout:inline_data enabled 2026-03-10T07:54:24.913 INFO:teuthology.orchestra.run.vm05.stdout:balancer 2026-03-10T07:54:24.913 
INFO:teuthology.orchestra.run.vm05.stdout:bal_rank_mask -1 2026-03-10T07:54:24.913 INFO:teuthology.orchestra.run.vm05.stdout:standby_count_wanted 1 2026-03-10T07:54:24.913 INFO:teuthology.orchestra.run.vm05.stdout:qdb_cluster leader: 0 members: 2026-03-10T07:54:24.913 INFO:teuthology.orchestra.run.vm05.stdout:[mds.cephfs.vm05.omfhnh{0:24297} state up:active seq 5 join_fscid=1 addr [v2:192.168.123.105:6826/723078808,v1:192.168.123.105:6827/723078808] compat {c=[1],r=[1],i=[7ff]}] 2026-03-10T07:54:24.913 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-10T07:54:24.913 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-10T07:54:24.913 INFO:teuthology.orchestra.run.vm05.stdout:Standby daemons: 2026-03-10T07:54:24.913 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-10T07:54:24.913 INFO:teuthology.orchestra.run.vm05.stdout:[mds.cephfs.vm05.pavqil{-1:14512} state up:standby seq 2 join_fscid=1 addr [v2:192.168.123.105:6828/427555544,v1:192.168.123.105:6829/427555544] compat {c=[1],r=[1],i=[7ff]}] 2026-03-10T07:54:24.913 INFO:teuthology.orchestra.run.vm05.stdout:[mds.cephfs.vm08.ybmbgd{-1:14524} state up:standby seq 1 join_fscid=1 addr [v2:192.168.123.108:6824/3815585988,v1:192.168.123.108:6825/3815585988] compat {c=[1],r=[1],i=[7ff]}] 2026-03-10T07:54:24.913 INFO:teuthology.orchestra.run.vm05.stdout:[mds.cephfs.vm08.dgsaon{-1:24313} state up:standby seq 2 join_fscid=1 addr [v2:192.168.123.108:6826/2963085185,v1:192.168.123.108:6827/2963085185] compat {c=[1],r=[1],i=[7ff]}] 2026-03-10T07:54:24.916 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:24.915+0000 7f7ecffff700 1 -- 192.168.123.105:0/3136458941 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f7ed00779e0 msgr2=0x7f7ed0079ea0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:54:24.916 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:24.915+0000 7f7ecffff700 1 --2- 192.168.123.105:0/3136458941 >> 
[v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f7ed00779e0 0x7f7ed0079ea0 secure :-1 s=READY pgs=48 cs=0 l=1 rev1=1 crypto rx=0x7f7ee0006010 tx=0x7f7ee00058e0 comp rx=0 tx=0).stop 2026-03-10T07:54:24.916 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:24.915+0000 7f7ecffff700 1 -- 192.168.123.105:0/3136458941 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f7ee40684d0 msgr2=0x7f7ee419e8e0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:54:24.916 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:24.915+0000 7f7ecffff700 1 --2- 192.168.123.105:0/3136458941 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f7ee40684d0 0x7f7ee419e8e0 secure :-1 s=READY pgs=30 cs=0 l=1 rev1=1 crypto rx=0x7f7ed800eb10 tx=0x7f7ed800eed0 comp rx=0 tx=0).stop 2026-03-10T07:54:24.916 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:24.916+0000 7f7ecffff700 1 -- 192.168.123.105:0/3136458941 shutdown_connections 2026-03-10T07:54:24.916 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:24.916+0000 7f7ecffff700 1 --2- 192.168.123.105:0/3136458941 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f7ee40684d0 0x7f7ee419e8e0 unknown :-1 s=CLOSED pgs=30 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:54:24.916 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:24.916+0000 7f7ecffff700 1 --2- 192.168.123.105:0/3136458941 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f7ed00779e0 0x7f7ed0079ea0 unknown :-1 s=CLOSED pgs=48 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:54:24.916 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:24.916+0000 7f7ecffff700 1 --2- 192.168.123.105:0/3136458941 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f7ee4068df0 0x7f7ee419ee20 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 
2026-03-10T07:54:24.916 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:24.916+0000 7f7ecffff700 1 -- 192.168.123.105:0/3136458941 >> 192.168.123.105:0/3136458941 conn(0x7f7ee40756c0 msgr2=0x7f7ee40fe720 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:54:24.916 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:24.916+0000 7f7ecffff700 1 -- 192.168.123.105:0/3136458941 shutdown_connections 2026-03-10T07:54:24.916 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:24.916+0000 7f7ecffff700 1 -- 192.168.123.105:0/3136458941 wait complete. 2026-03-10T07:54:24.918 INFO:teuthology.orchestra.run.vm05.stderr:dumped fsmap epoch 12 2026-03-10T07:54:24.987 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:24.987+0000 7f07f24ef700 1 -- 192.168.123.105:0/3256108195 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f07ec107d90 msgr2=0x7f07ec108210 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:54:24.987 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:24.987+0000 7f07f24ef700 1 --2- 192.168.123.105:0/3256108195 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f07ec107d90 0x7f07ec108210 secure :-1 s=READY pgs=69 cs=0 l=1 rev1=1 crypto rx=0x7f07e400b3a0 tx=0x7f07e400b6b0 comp rx=0 tx=0).stop 2026-03-10T07:54:24.987 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:24.987+0000 7f07f24ef700 1 -- 192.168.123.105:0/3256108195 shutdown_connections 2026-03-10T07:54:24.987 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:24.987+0000 7f07f24ef700 1 --2- 192.168.123.105:0/3256108195 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f07ec107d90 0x7f07ec108210 unknown :-1 s=CLOSED pgs=69 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:54:24.987 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:24.987+0000 7f07f24ef700 1 --2- 192.168.123.105:0/3256108195 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] 
conn(0x7f07ec10f420 0x7f07ec10f800 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:54:24.987 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:24.987+0000 7f07f24ef700 1 -- 192.168.123.105:0/3256108195 >> 192.168.123.105:0/3256108195 conn(0x7f07ec06ce20 msgr2=0x7f07ec06d230 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:54:24.988 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:24.987+0000 7f07f24ef700 1 -- 192.168.123.105:0/3256108195 shutdown_connections 2026-03-10T07:54:24.988 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:24.987+0000 7f07f24ef700 1 -- 192.168.123.105:0/3256108195 wait complete. 2026-03-10T07:54:24.988 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:24.988+0000 7f07f24ef700 1 Processor -- start 2026-03-10T07:54:24.988 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:24.988+0000 7f07f24ef700 1 -- start start 2026-03-10T07:54:24.988 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:24.988+0000 7f07f24ef700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f07ec107d90 0x7f07ec113120 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:54:24.989 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:24.988+0000 7f07f24ef700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f07ec10f420 0x7f07ec113660 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:54:24.989 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:24.988+0000 7f07f24ef700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f07ec1182c0 con 0x7f07ec10f420 2026-03-10T07:54:24.989 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:24.988+0000 7f07f24ef700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f07ec113ba0 con 0x7f07ec107d90 
2026-03-10T07:54:24.989 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:24.988+0000 7f07ebfff700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f07ec107d90 0x7f07ec113120 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:54:24.989 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:24.989+0000 7f07ebfff700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f07ec107d90 0x7f07ec113120 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.108:3300/0 says I am v2:192.168.123.105:52276/0 (socket says 192.168.123.105:52276) 2026-03-10T07:54:24.989 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:24.989+0000 7f07ebfff700 1 -- 192.168.123.105:0/3688516225 learned_addr learned my addr 192.168.123.105:0/3688516225 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T07:54:24.989 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:24.989+0000 7f07ebfff700 1 -- 192.168.123.105:0/3688516225 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f07ec10f420 msgr2=0x7f07ec113660 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:54:24.989 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:24.989+0000 7f07ebfff700 1 --2- 192.168.123.105:0/3688516225 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f07ec10f420 0x7f07ec113660 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:54:24.989 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:24.989+0000 7f07ebfff700 1 -- 192.168.123.105:0/3688516225 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f07e400b050 con 0x7f07ec107d90 2026-03-10T07:54:24.990 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:24.990+0000 
7f07ebfff700 1 --2- 192.168.123.105:0/3688516225 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f07ec107d90 0x7f07ec113120 secure :-1 s=READY pgs=31 cs=0 l=1 rev1=1 crypto rx=0x7f07dc00d8d0 tx=0x7f07dc00dc90 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:54:24.990 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:24.990+0000 7f07e97fa700 1 -- 192.168.123.105:0/3688516225 <== mon.1 v2:192.168.123.108:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f07dc009940 con 0x7f07ec107d90 2026-03-10T07:54:24.991 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:24.990+0000 7f07f24ef700 1 -- 192.168.123.105:0/3688516225 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f07ec113e80 con 0x7f07ec107d90 2026-03-10T07:54:24.991 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:24.990+0000 7f07f24ef700 1 -- 192.168.123.105:0/3688516225 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f07ec1b8620 con 0x7f07ec107d90 2026-03-10T07:54:24.991 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:24.991+0000 7f07e97fa700 1 -- 192.168.123.105:0/3688516225 <== mon.1 v2:192.168.123.108:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7f07dc010460 con 0x7f07ec107d90 2026-03-10T07:54:24.991 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:24.991+0000 7f07e97fa700 1 -- 192.168.123.105:0/3688516225 <== mon.1 v2:192.168.123.108:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f07dc00f5d0 con 0x7f07ec107d90 2026-03-10T07:54:24.991 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:24.991+0000 7f07f24ef700 1 -- 192.168.123.105:0/3688516225 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f07d8005320 con 0x7f07ec107d90 2026-03-10T07:54:24.992 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:24.992+0000 7f07e97fa700 1 -- 192.168.123.105:0/3688516225 <== mon.1 v2:192.168.123.108:3300/0 4 ==== mgrmap(e 38) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f07dc009aa0 con 0x7f07ec107d90 2026-03-10T07:54:24.993 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:24.992+0000 7f07e97fa700 1 --2- 192.168.123.105:0/3688516225 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f07d4077910 0x7f07d4079dd0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:54:24.993 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:24.993+0000 7f07eb7fe700 1 --2- 192.168.123.105:0/3688516225 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f07d4077910 0x7f07d4079dd0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:54:24.993 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:24.993+0000 7f07e97fa700 1 -- 192.168.123.105:0/3688516225 <== mon.1 v2:192.168.123.108:3300/0 5 ==== osd_map(63..63 src has 1..63) v4 ==== 6136+0+0 (secure 0 0 0) 0x7f07dc0999b0 con 0x7f07ec107d90 2026-03-10T07:54:24.994 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:24.993+0000 7f07eb7fe700 1 --2- 192.168.123.105:0/3688516225 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f07d4077910 0x7f07d4079dd0 secure :-1 s=READY pgs=49 cs=0 l=1 rev1=1 crypto rx=0x7f07e400bb30 tx=0x7f07e400bf90 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:54:24.995 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:24.995+0000 7f07e97fa700 1 -- 192.168.123.105:0/3688516225 <== mon.1 v2:192.168.123.108:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f07dc061b90 con 0x7f07ec107d90 
2026-03-10T07:54:25.125 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:25.124+0000 7f07f24ef700 1 -- 192.168.123.105:0/3688516225 --> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7f07d8000bf0 con 0x7f07d4077910 2026-03-10T07:54:25.125 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:25.125+0000 7f07e97fa700 1 -- 192.168.123.105:0/3688516225 <== mgr.34104 v2:192.168.123.105:6800/627686298 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+400 (secure 0 0 0) 0x7f07d8000bf0 con 0x7f07d4077910 2026-03-10T07:54:25.126 INFO:teuthology.orchestra.run.vm05.stdout:{ 2026-03-10T07:54:25.126 INFO:teuthology.orchestra.run.vm05.stdout: "target_image": "quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc", 2026-03-10T07:54:25.126 INFO:teuthology.orchestra.run.vm05.stdout: "in_progress": true, 2026-03-10T07:54:25.126 INFO:teuthology.orchestra.run.vm05.stdout: "which": "Upgrading all daemon types on all hosts", 2026-03-10T07:54:25.126 INFO:teuthology.orchestra.run.vm05.stdout: "services_complete": [ 2026-03-10T07:54:25.126 INFO:teuthology.orchestra.run.vm05.stdout: "mgr", 2026-03-10T07:54:25.126 INFO:teuthology.orchestra.run.vm05.stdout: "crash", 2026-03-10T07:54:25.126 INFO:teuthology.orchestra.run.vm05.stdout: "mon" 2026-03-10T07:54:25.126 INFO:teuthology.orchestra.run.vm05.stdout: ], 2026-03-10T07:54:25.126 INFO:teuthology.orchestra.run.vm05.stdout: "progress": "9/23 daemons upgraded", 2026-03-10T07:54:25.126 INFO:teuthology.orchestra.run.vm05.stdout: "message": "Currently upgrading osd daemons", 2026-03-10T07:54:25.126 INFO:teuthology.orchestra.run.vm05.stdout: "is_paused": false 2026-03-10T07:54:25.126 INFO:teuthology.orchestra.run.vm05.stdout:} 2026-03-10T07:54:25.128 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:25.128+0000 7f07d2ffd700 1 -- 192.168.123.105:0/3688516225 >> 
[v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f07d4077910 msgr2=0x7f07d4079dd0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:54:25.128 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:25.128+0000 7f07d2ffd700 1 --2- 192.168.123.105:0/3688516225 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f07d4077910 0x7f07d4079dd0 secure :-1 s=READY pgs=49 cs=0 l=1 rev1=1 crypto rx=0x7f07e400bb30 tx=0x7f07e400bf90 comp rx=0 tx=0).stop 2026-03-10T07:54:25.129 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:25.128+0000 7f07d2ffd700 1 -- 192.168.123.105:0/3688516225 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f07ec107d90 msgr2=0x7f07ec113120 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:54:25.129 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:25.129+0000 7f07d2ffd700 1 --2- 192.168.123.105:0/3688516225 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f07ec107d90 0x7f07ec113120 secure :-1 s=READY pgs=31 cs=0 l=1 rev1=1 crypto rx=0x7f07dc00d8d0 tx=0x7f07dc00dc90 comp rx=0 tx=0).stop 2026-03-10T07:54:25.130 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:25.130+0000 7f07d2ffd700 1 -- 192.168.123.105:0/3688516225 shutdown_connections 2026-03-10T07:54:25.130 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:25.130+0000 7f07d2ffd700 1 --2- 192.168.123.105:0/3688516225 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f07ec107d90 0x7f07ec113120 unknown :-1 s=CLOSED pgs=31 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:54:25.130 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:25.130+0000 7f07d2ffd700 1 --2- 192.168.123.105:0/3688516225 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f07d4077910 0x7f07d4079dd0 unknown :-1 s=CLOSED pgs=49 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 
2026-03-10T07:54:25.130 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:25.130+0000 7f07d2ffd700 1 --2- 192.168.123.105:0/3688516225 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f07ec10f420 0x7f07ec113660 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:54:25.130 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:25.130+0000 7f07d2ffd700 1 -- 192.168.123.105:0/3688516225 >> 192.168.123.105:0/3688516225 conn(0x7f07ec06ce20 msgr2=0x7f07ec10d250 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:54:25.130 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:25.130+0000 7f07d2ffd700 1 -- 192.168.123.105:0/3688516225 shutdown_connections 2026-03-10T07:54:25.130 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:25.130+0000 7f07d2ffd700 1 -- 192.168.123.105:0/3688516225 wait complete. 2026-03-10T07:54:25.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:54:24 vm08.local ceph-mon[107898]: pgmap v80: 65 pgs: 8 active+undersized, 12 peering, 7 active+undersized+degraded, 38 active+clean; 255 MiB data, 2.4 GiB used, 118 GiB / 120 GiB avail; 24/261 objects degraded (9.195%) 2026-03-10T07:54:25.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:54:24 vm08.local ceph-mon[107898]: from='mon.0 -' entity='mon.' 
cmd=[{"prefix": "osd ok-to-stop", "ids": ["3"], "max": 16}]: dispatch 2026-03-10T07:54:25.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:54:24 vm08.local ceph-mon[107898]: Upgrade: unsafe to stop osd(s) at this time (11 PGs are or would become offline) 2026-03-10T07:54:25.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:54:24 vm08.local ceph-mon[107898]: from='osd.2 [v2:192.168.123.105:6818/1473398242,v1:192.168.123.105:6819/1473398242]' entity='osd.2' cmd='[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["2"]}]': finished 2026-03-10T07:54:25.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:54:24 vm08.local ceph-mon[107898]: osdmap e63: 6 total, 5 up, 6 in 2026-03-10T07:54:25.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:54:24 vm08.local ceph-mon[107898]: from='osd.2 [v2:192.168.123.105:6818/1473398242,v1:192.168.123.105:6819/1473398242]' entity='osd.2' cmd=[{"prefix": "osd crush create-or-move", "id": 2, "weight":0.0195, "args": ["host=vm05", "root=default"]}]: dispatch 2026-03-10T07:54:25.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:54:24 vm08.local ceph-mon[107898]: from='client.? 
192.168.123.105:0/1970366160' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T07:54:25.205 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:25.204+0000 7f8caefa9700 1 -- 192.168.123.105:0/2064473967 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f8ca80ffc80 msgr2=0x7f8ca8100100 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:54:25.205 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:25.204+0000 7f8caefa9700 1 --2- 192.168.123.105:0/2064473967 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f8ca80ffc80 0x7f8ca8100100 secure :-1 s=READY pgs=32 cs=0 l=1 rev1=1 crypto rx=0x7f8c98009a60 tx=0x7f8c98009d70 comp rx=0 tx=0).stop 2026-03-10T07:54:25.205 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:25.205+0000 7f8caefa9700 1 -- 192.168.123.105:0/2064473967 shutdown_connections 2026-03-10T07:54:25.205 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:25.205+0000 7f8caefa9700 1 --2- 192.168.123.105:0/2064473967 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f8ca80ffc80 0x7f8ca8100100 unknown :-1 s=CLOSED pgs=32 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:54:25.205 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:25.205+0000 7f8caefa9700 1 --2- 192.168.123.105:0/2064473967 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f8ca8104b00 0x7f8ca8104ee0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:54:25.205 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:25.205+0000 7f8caefa9700 1 -- 192.168.123.105:0/2064473967 >> 192.168.123.105:0/2064473967 conn(0x7f8ca80756b0 msgr2=0x7f8ca8075ac0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:54:25.205 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:25.205+0000 7f8caefa9700 1 -- 192.168.123.105:0/2064473967 shutdown_connections 2026-03-10T07:54:25.206 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:25.205+0000 7f8caefa9700 1 -- 192.168.123.105:0/2064473967 wait complete. 2026-03-10T07:54:25.206 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:25.205+0000 7f8caefa9700 1 Processor -- start 2026-03-10T07:54:25.206 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:25.206+0000 7f8caefa9700 1 -- start start 2026-03-10T07:54:25.206 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:25.206+0000 7f8caefa9700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f8ca8104b00 0x7f8ca81ab5d0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:54:25.206 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:25.206+0000 7f8caefa9700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f8ca81a55c0 0x7f8ca81a5a40 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:54:25.206 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:25.206+0000 7f8caefa9700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f8ca81abbd0 con 0x7f8ca8104b00 2026-03-10T07:54:25.206 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:25.206+0000 7f8caefa9700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f8ca81a5f80 con 0x7f8ca81a55c0 2026-03-10T07:54:25.206 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:25.206+0000 7f8cadfa7700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f8ca8104b00 0x7f8ca81ab5d0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:54:25.207 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:25.206+0000 7f8cadfa7700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f8ca8104b00 0x7f8ca81ab5d0 unknown :-1 
s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.105:47200/0 (socket says 192.168.123.105:47200) 2026-03-10T07:54:25.207 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:25.206+0000 7f8cadfa7700 1 -- 192.168.123.105:0/3622020771 learned_addr learned my addr 192.168.123.105:0/3622020771 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T07:54:25.207 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:25.207+0000 7f8cadfa7700 1 -- 192.168.123.105:0/3622020771 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f8ca81a55c0 msgr2=0x7f8ca81a5a40 unknown :-1 s=STATE_CONNECTING l=1).mark_down 2026-03-10T07:54:25.207 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:25.207+0000 7f8cadfa7700 1 --2- 192.168.123.105:0/3622020771 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f8ca81a55c0 0x7f8ca81a5a40 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:54:25.207 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:25.207+0000 7f8cadfa7700 1 -- 192.168.123.105:0/3622020771 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f8c98009710 con 0x7f8ca8104b00 2026-03-10T07:54:25.207 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:25.207+0000 7f8cadfa7700 1 --2- 192.168.123.105:0/3622020771 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f8ca8104b00 0x7f8ca81ab5d0 secure :-1 s=READY pgs=70 cs=0 l=1 rev1=1 crypto rx=0x7f8ca400cc60 tx=0x7f8ca40074a0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:54:25.212 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:25.212+0000 7f8c9effd700 1 -- 192.168.123.105:0/3622020771 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f8ca4007af0 con 0x7f8ca8104b00 
2026-03-10T07:54:25.213 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:25.213+0000 7f8c9effd700 1 -- 192.168.123.105:0/3622020771 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7f8ca4007c50 con 0x7f8ca8104b00 2026-03-10T07:54:25.213 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:25.213+0000 7f8c9effd700 1 -- 192.168.123.105:0/3622020771 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f8ca4018770 con 0x7f8ca8104b00 2026-03-10T07:54:25.213 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:25.213+0000 7f8caefa9700 1 -- 192.168.123.105:0/3622020771 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f8ca81a6260 con 0x7f8ca8104b00 2026-03-10T07:54:25.214 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:25.213+0000 7f8caefa9700 1 -- 192.168.123.105:0/3622020771 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f8ca806e9f0 con 0x7f8ca8104b00 2026-03-10T07:54:25.215 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:25.215+0000 7f8caefa9700 1 -- 192.168.123.105:0/3622020771 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f8ca804f2e0 con 0x7f8ca8104b00 2026-03-10T07:54:25.216 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:25.216+0000 7f8c9effd700 1 -- 192.168.123.105:0/3622020771 <== mon.0 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 38) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f8ca400f450 con 0x7f8ca8104b00 2026-03-10T07:54:25.216 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:25.216+0000 7f8c9effd700 1 --2- 192.168.123.105:0/3622020771 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f8c940779e0 0x7f8c94079ea0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:54:25.216 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:25.216+0000 7f8c9effd700 1 -- 192.168.123.105:0/3622020771 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(63..63 src has 1..63) v4 ==== 6136+0+0 (secure 0 0 0) 0x7f8ca4099f00 con 0x7f8ca8104b00 2026-03-10T07:54:25.219 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:25.219+0000 7f8cad7a6700 1 --2- 192.168.123.105:0/3622020771 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f8c940779e0 0x7f8c94079ea0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:54:25.219 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:25.219+0000 7f8cad7a6700 1 --2- 192.168.123.105:0/3622020771 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f8c940779e0 0x7f8c94079ea0 secure :-1 s=READY pgs=50 cs=0 l=1 rev1=1 crypto rx=0x7f8c98000c00 tx=0x7f8c980046f0 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:54:25.220 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:25.219+0000 7f8c9effd700 1 -- 192.168.123.105:0/3622020771 <== mon.0 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f8ca40627b0 con 0x7f8ca8104b00 2026-03-10T07:54:25.383 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:25.382+0000 7f8caefa9700 1 -- 192.168.123.105:0/3622020771 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "health", "detail": "detail"} v 0) v1 -- 0x7f8ca804ea90 con 0x7f8ca8104b00 2026-03-10T07:54:25.384 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:25.384+0000 7f8c9effd700 1 -- 192.168.123.105:0/3622020771 <== mon.0 v2:192.168.123.105:3300/0 7 ==== mon_command_ack([{"prefix": "health", "detail": "detail"}]=0 v0) v1 ==== 74+0+1026 (secure 0 0 0) 0x7f8ca4061f00 con 
0x7f8ca8104b00 2026-03-10T07:54:25.384 INFO:teuthology.orchestra.run.vm05.stdout:HEALTH_WARN 1 filesystem with deprecated feature inline_data; 1 osds down; Reduced data availability: 1 pg peering; Degraded data redundancy: 24/261 objects degraded (9.195%), 7 pgs degraded 2026-03-10T07:54:25.384 INFO:teuthology.orchestra.run.vm05.stdout:[WRN] FS_INLINE_DATA_DEPRECATED: 1 filesystem with deprecated feature inline_data 2026-03-10T07:54:25.384 INFO:teuthology.orchestra.run.vm05.stdout: fs cephfs has deprecated feature inline_data enabled. 2026-03-10T07:54:25.384 INFO:teuthology.orchestra.run.vm05.stdout:[WRN] OSD_DOWN: 1 osds down 2026-03-10T07:54:25.384 INFO:teuthology.orchestra.run.vm05.stdout: osd.2 (root=default,host=vm05) is down 2026-03-10T07:54:25.384 INFO:teuthology.orchestra.run.vm05.stdout:[WRN] PG_AVAILABILITY: Reduced data availability: 1 pg peering 2026-03-10T07:54:25.384 INFO:teuthology.orchestra.run.vm05.stdout: pg 2.c is stuck peering for 65s, current state peering, last acting [3,0] 2026-03-10T07:54:25.384 INFO:teuthology.orchestra.run.vm05.stdout:[WRN] PG_DEGRADED: Degraded data redundancy: 24/261 objects degraded (9.195%), 7 pgs degraded 2026-03-10T07:54:25.384 INFO:teuthology.orchestra.run.vm05.stdout: pg 2.1 is active+undersized+degraded, acting [1,0] 2026-03-10T07:54:25.384 INFO:teuthology.orchestra.run.vm05.stdout: pg 2.d is active+undersized+degraded, acting [1,3] 2026-03-10T07:54:25.384 INFO:teuthology.orchestra.run.vm05.stdout: pg 2.e is active+undersized+degraded, acting [0,3] 2026-03-10T07:54:25.384 INFO:teuthology.orchestra.run.vm05.stdout: pg 2.10 is active+undersized+degraded, acting [1,0] 2026-03-10T07:54:25.384 INFO:teuthology.orchestra.run.vm05.stdout: pg 2.13 is active+undersized+degraded, acting [0,4] 2026-03-10T07:54:25.384 INFO:teuthology.orchestra.run.vm05.stdout: pg 2.19 is active+undersized+degraded, acting [0,4] 2026-03-10T07:54:25.384 INFO:teuthology.orchestra.run.vm05.stdout: pg 2.1e is active+undersized+degraded, acting 
[0,5] 2026-03-10T07:54:25.387 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:25.386+0000 7f8c9cff9700 1 -- 192.168.123.105:0/3622020771 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f8c940779e0 msgr2=0x7f8c94079ea0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:54:25.387 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:25.386+0000 7f8c9cff9700 1 --2- 192.168.123.105:0/3622020771 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f8c940779e0 0x7f8c94079ea0 secure :-1 s=READY pgs=50 cs=0 l=1 rev1=1 crypto rx=0x7f8c98000c00 tx=0x7f8c980046f0 comp rx=0 tx=0).stop 2026-03-10T07:54:25.387 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:25.386+0000 7f8c9cff9700 1 -- 192.168.123.105:0/3622020771 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f8ca8104b00 msgr2=0x7f8ca81ab5d0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:54:25.387 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:25.386+0000 7f8c9cff9700 1 --2- 192.168.123.105:0/3622020771 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f8ca8104b00 0x7f8ca81ab5d0 secure :-1 s=READY pgs=70 cs=0 l=1 rev1=1 crypto rx=0x7f8ca400cc60 tx=0x7f8ca40074a0 comp rx=0 tx=0).stop 2026-03-10T07:54:25.387 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:25.387+0000 7f8c9cff9700 1 -- 192.168.123.105:0/3622020771 shutdown_connections 2026-03-10T07:54:25.387 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:25.387+0000 7f8c9cff9700 1 --2- 192.168.123.105:0/3622020771 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f8c940779e0 0x7f8c94079ea0 unknown :-1 s=CLOSED pgs=50 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:54:25.387 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:25.387+0000 7f8c9cff9700 1 --2- 192.168.123.105:0/3622020771 >> 
[v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f8ca8104b00 0x7f8ca81ab5d0 secure :-1 s=CLOSED pgs=70 cs=0 l=1 rev1=1 crypto rx=0x7f8ca400cc60 tx=0x7f8ca40074a0 comp rx=0 tx=0).stop 2026-03-10T07:54:25.387 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:25.387+0000 7f8c9cff9700 1 --2- 192.168.123.105:0/3622020771 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f8ca81a55c0 0x7f8ca81a5a40 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:54:25.387 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:25.387+0000 7f8c9cff9700 1 -- 192.168.123.105:0/3622020771 >> 192.168.123.105:0/3622020771 conn(0x7f8ca80756b0 msgr2=0x7f8ca80fec70 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:54:25.388 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:25.387+0000 7f8c9cff9700 1 -- 192.168.123.105:0/3622020771 shutdown_connections 2026-03-10T07:54:25.388 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:25.387+0000 7f8c9cff9700 1 -- 192.168.123.105:0/3622020771 wait complete. 
2026-03-10T07:54:25.776 INFO:journalctl@ceph.osd.2.vm05.stdout:Mar 10 07:54:25 vm05.local ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508-osd-2[148987]: 2026-03-10T07:54:25.476+0000 7f975aec1640 -1 osd.2 60 set_numa_affinity unable to identify public interface '' numa node: (2) No such file or directory 2026-03-10T07:54:26.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:54:25 vm05.local ceph-mon[130117]: from='client.44177 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T07:54:26.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:54:25 vm05.local ceph-mon[130117]: from='client.34208 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T07:54:26.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:54:25 vm05.local ceph-mon[130117]: from='client.34210 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T07:54:26.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:54:25 vm05.local ceph-mon[130117]: from='client.? 192.168.123.105:0/3136458941' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch 2026-03-10T07:54:26.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:54:25 vm05.local ceph-mon[130117]: from='client.? 
192.168.123.105:0/3622020771' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch 2026-03-10T07:54:26.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:54:25 vm05.local ceph-mon[130117]: from='osd.2 [v2:192.168.123.105:6818/1473398242,v1:192.168.123.105:6819/1473398242]' entity='osd.2' 2026-03-10T07:54:26.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:54:25 vm08.local ceph-mon[107898]: from='client.44177 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T07:54:26.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:54:25 vm08.local ceph-mon[107898]: from='client.34208 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T07:54:26.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:54:25 vm08.local ceph-mon[107898]: from='client.34210 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T07:54:26.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:54:25 vm08.local ceph-mon[107898]: from='client.? 192.168.123.105:0/3136458941' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch 2026-03-10T07:54:26.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:54:25 vm08.local ceph-mon[107898]: from='client.? 
192.168.123.105:0/3622020771' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch 2026-03-10T07:54:26.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:54:25 vm08.local ceph-mon[107898]: from='osd.2 [v2:192.168.123.105:6818/1473398242,v1:192.168.123.105:6819/1473398242]' entity='osd.2' 2026-03-10T07:54:27.059 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:54:26 vm05.local ceph-mon[130117]: pgmap v82: 65 pgs: 15 active+undersized, 12 active+undersized+degraded, 38 active+clean; 255 MiB data, 2.2 GiB used, 118 GiB / 120 GiB avail; 34/261 objects degraded (13.027%) 2026-03-10T07:54:27.059 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:54:26 vm05.local ceph-mon[130117]: from='client.44193 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T07:54:27.059 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:54:26 vm05.local ceph-mon[130117]: Health check cleared: PG_AVAILABILITY (was: Reduced data availability: 1 pg peering) 2026-03-10T07:54:27.059 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:54:26 vm05.local ceph-mon[130117]: Health check cleared: OSD_DOWN (was: 1 osds down) 2026-03-10T07:54:27.059 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:54:26 vm05.local ceph-mon[130117]: osd.2 [v2:192.168.123.105:6818/1473398242,v1:192.168.123.105:6819/1473398242] boot 2026-03-10T07:54:27.059 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:54:26 vm05.local ceph-mon[130117]: osdmap e64: 6 total, 6 up, 6 in 2026-03-10T07:54:27.059 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:54:26 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch 2026-03-10T07:54:27.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:54:26 vm08.local ceph-mon[107898]: pgmap v82: 65 pgs: 15 active+undersized, 12 active+undersized+degraded, 38 active+clean; 255 MiB data, 2.2 GiB used, 118 
GiB / 120 GiB avail; 34/261 objects degraded (13.027%) 2026-03-10T07:54:27.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:54:26 vm08.local ceph-mon[107898]: from='client.44193 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T07:54:27.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:54:26 vm08.local ceph-mon[107898]: Health check cleared: PG_AVAILABILITY (was: Reduced data availability: 1 pg peering) 2026-03-10T07:54:27.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:54:26 vm08.local ceph-mon[107898]: Health check cleared: OSD_DOWN (was: 1 osds down) 2026-03-10T07:54:27.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:54:26 vm08.local ceph-mon[107898]: osd.2 [v2:192.168.123.105:6818/1473398242,v1:192.168.123.105:6819/1473398242] boot 2026-03-10T07:54:27.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:54:26 vm08.local ceph-mon[107898]: osdmap e64: 6 total, 6 up, 6 in 2026-03-10T07:54:27.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:54:26 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "osd metadata", "id": 2}]: dispatch 2026-03-10T07:54:28.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:54:27 vm05.local ceph-mon[130117]: pgmap v84: 65 pgs: 15 active+undersized, 12 active+undersized+degraded, 38 active+clean; 255 MiB data, 2.2 GiB used, 118 GiB / 120 GiB avail; 34/261 objects degraded (13.027%) 2026-03-10T07:54:28.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:54:27 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:54:28.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:54:27 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T07:54:28.157 
INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:54:27 vm05.local ceph-mon[130117]: osdmap e65: 6 total, 6 up, 6 in 2026-03-10T07:54:28.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:54:27 vm08.local ceph-mon[107898]: pgmap v84: 65 pgs: 15 active+undersized, 12 active+undersized+degraded, 38 active+clean; 255 MiB data, 2.2 GiB used, 118 GiB / 120 GiB avail; 34/261 objects degraded (13.027%) 2026-03-10T07:54:28.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:54:27 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:54:28.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:54:27 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T07:54:28.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:54:27 vm08.local ceph-mon[107898]: osdmap e65: 6 total, 6 up, 6 in 2026-03-10T07:54:29.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:54:28 vm05.local ceph-mon[130117]: Health check update: Degraded data redundancy: 4/261 objects degraded (1.533%), 2 pgs degraded (PG_DEGRADED) 2026-03-10T07:54:29.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:54:28 vm08.local ceph-mon[107898]: Health check update: Degraded data redundancy: 4/261 objects degraded (1.533%), 2 pgs degraded (PG_DEGRADED) 2026-03-10T07:54:30.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:54:29 vm05.local ceph-mon[130117]: pgmap v86: 65 pgs: 3 active+undersized, 2 active+undersized+degraded, 60 active+clean; 255 MiB data, 2.2 GiB used, 118 GiB / 120 GiB avail; 4/261 objects degraded (1.533%) 2026-03-10T07:54:30.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:54:29 vm08.local ceph-mon[107898]: pgmap v86: 65 pgs: 3 active+undersized, 2 active+undersized+degraded, 60 active+clean; 255 MiB data, 2.2 GiB used, 118 GiB / 120 GiB avail; 4/261 objects degraded (1.533%) 
2026-03-10T07:54:31.369 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:54:30 vm08.local ceph-mon[107898]: Health check cleared: PG_DEGRADED (was: Degraded data redundancy: 4/261 objects degraded (1.533%), 2 pgs degraded) 2026-03-10T07:54:31.407 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:54:30 vm05.local ceph-mon[130117]: Health check cleared: PG_DEGRADED (was: Degraded data redundancy: 4/261 objects degraded (1.533%), 2 pgs degraded) 2026-03-10T07:54:32.407 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:54:31 vm05.local ceph-mon[130117]: pgmap v87: 65 pgs: 65 active+clean; 255 MiB data, 2.2 GiB used, 118 GiB / 120 GiB avail 2026-03-10T07:54:32.418 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:54:31 vm08.local ceph-mon[107898]: pgmap v87: 65 pgs: 65 active+clean; 255 MiB data, 2.2 GiB used, 118 GiB / 120 GiB avail 2026-03-10T07:54:34.277 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:54:33 vm05.local ceph-mon[130117]: pgmap v88: 65 pgs: 65 active+clean; 255 MiB data, 2.2 GiB used, 118 GiB / 120 GiB avail 2026-03-10T07:54:34.418 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:54:33 vm08.local ceph-mon[107898]: pgmap v88: 65 pgs: 65 active+clean; 255 MiB data, 2.2 GiB used, 118 GiB / 120 GiB avail 2026-03-10T07:54:36.395 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:54:35 vm08.local ceph-mon[107898]: pgmap v89: 65 pgs: 65 active+clean; 255 MiB data, 2.2 GiB used, 118 GiB / 120 GiB avail 2026-03-10T07:54:36.407 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:54:35 vm05.local ceph-mon[130117]: pgmap v89: 65 pgs: 65 active+clean; 255 MiB data, 2.2 GiB used, 118 GiB / 120 GiB avail 2026-03-10T07:54:38.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:54:37 vm05.local ceph-mon[130117]: pgmap v90: 65 pgs: 65 active+clean; 255 MiB data, 2.2 GiB used, 118 GiB / 120 GiB avail 2026-03-10T07:54:38.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:54:37 vm05.local ceph-mon[130117]: from='mgr.34104 
192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:54:38.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:54:37 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "osd ok-to-stop", "ids": ["3"], "max": 16}]: dispatch 2026-03-10T07:54:38.169 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:54:37 vm08.local ceph-mon[107898]: pgmap v90: 65 pgs: 65 active+clean; 255 MiB data, 2.2 GiB used, 118 GiB / 120 GiB avail 2026-03-10T07:54:38.169 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:54:37 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:54:38.169 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:54:37 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "osd ok-to-stop", "ids": ["3"], "max": 16}]: dispatch 2026-03-10T07:54:39.148 INFO:journalctl@ceph.osd.3.vm08.stdout:Mar 10 07:54:38 vm08.local systemd[1]: Stopping Ceph osd.3 for 12e9780e-1c55-11f1-8896-79f7c2e9b508... 
2026-03-10T07:54:39.148 INFO:journalctl@ceph.osd.3.vm08.stdout:Mar 10 07:54:38 vm08.local ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508-osd-3[65774]: 2026-03-10T07:54:38.946+0000 7fd7a0357700 -1 received signal: Terminated from /run/podman-init -- /usr/bin/ceph-osd -n osd.3 -f --setuser ceph --setgroup ceph --default-log-to-file=false --default-log-to-journald=true --default-log-to-stderr=false (PID: 1) UID: 0 2026-03-10T07:54:39.148 INFO:journalctl@ceph.osd.3.vm08.stdout:Mar 10 07:54:38 vm08.local ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508-osd-3[65774]: 2026-03-10T07:54:38.946+0000 7fd7a0357700 -1 osd.3 65 *** Got signal Terminated *** 2026-03-10T07:54:39.148 INFO:journalctl@ceph.osd.3.vm08.stdout:Mar 10 07:54:38 vm08.local ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508-osd-3[65774]: 2026-03-10T07:54:38.946+0000 7fd7a0357700 -1 osd.3 65 *** Immediate shutdown (osd_fast_shutdown=true) *** 2026-03-10T07:54:39.399 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:54:39 vm08.local ceph-mon[107898]: from='mon.0 -' entity='mon.' 
cmd=[{"prefix": "osd ok-to-stop", "ids": ["3"], "max": 16}]: dispatch 2026-03-10T07:54:39.399 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:54:39 vm08.local ceph-mon[107898]: Upgrade: osd.3 is safe to restart 2026-03-10T07:54:39.399 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:54:39 vm08.local ceph-mon[107898]: Upgrade: Updating osd.3 2026-03-10T07:54:39.399 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:54:39 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:54:39.399 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:54:39 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "auth get", "entity": "osd.3"}]: dispatch 2026-03-10T07:54:39.399 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:54:39 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T07:54:39.399 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:54:39 vm08.local ceph-mon[107898]: Deploying daemon osd.3 on vm08 2026-03-10T07:54:39.399 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:54:39 vm08.local ceph-mon[107898]: osd.3 marked itself down and dead 2026-03-10T07:54:39.400 INFO:journalctl@ceph.osd.3.vm08.stdout:Mar 10 07:54:39 vm08.local podman[115184]: 2026-03-10 07:54:39.235337072 +0000 UTC m=+0.304395428 container died 0a62c54a86c0cfee18dda9622bc4e760c44c4bf092848db8af5fe552870dbbc6 (image=quay.io/ceph/ceph@sha256:9f35728f6070a596500c0804814a12ab6b98e05067316dc64876fb4b28d04af3, name=ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508-osd-3, org.label-schema.build-date=20240222, org.label-schema.vendor=CentOS, GIT_COMMIT=1617517b9622744995c4661989e3c30d036a7cfd, GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=HEAD, maintainer=Guillaume Abrioux , org.label-schema.name=CentOS Stream 8 Base Image, 
org.label-schema.schema-version=1.0, CEPH_POINT_RELEASE=-18.2.1, GIT_BRANCH=HEAD, GIT_CLEAN=True, ceph=True, org.label-schema.license=GPLv2, io.buildah.version=1.29.1) 2026-03-10T07:54:39.400 INFO:journalctl@ceph.osd.3.vm08.stdout:Mar 10 07:54:39 vm08.local podman[115184]: 2026-03-10 07:54:39.249378846 +0000 UTC m=+0.318437192 container remove 0a62c54a86c0cfee18dda9622bc4e760c44c4bf092848db8af5fe552870dbbc6 (image=quay.io/ceph/ceph@sha256:9f35728f6070a596500c0804814a12ab6b98e05067316dc64876fb4b28d04af3, name=ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508-osd-3, GIT_BRANCH=HEAD, org.label-schema.build-date=20240222, CEPH_POINT_RELEASE=-18.2.1, GIT_CLEAN=True, GIT_COMMIT=1617517b9622744995c4661989e3c30d036a7cfd, RELEASE=HEAD, ceph=True, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 8 Base Image, GIT_REPO=https://github.com/ceph/ceph-container.git, maintainer=Guillaume Abrioux , org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.29.1) 2026-03-10T07:54:39.400 INFO:journalctl@ceph.osd.3.vm08.stdout:Mar 10 07:54:39 vm08.local bash[115184]: ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508-osd-3 2026-03-10T07:54:39.407 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:54:39 vm05.local ceph-mon[130117]: from='mon.0 -' entity='mon.' 
cmd=[{"prefix": "osd ok-to-stop", "ids": ["3"], "max": 16}]: dispatch 2026-03-10T07:54:39.407 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:54:39 vm05.local ceph-mon[130117]: Upgrade: osd.3 is safe to restart 2026-03-10T07:54:39.407 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:54:39 vm05.local ceph-mon[130117]: Upgrade: Updating osd.3 2026-03-10T07:54:39.407 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:54:39 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:54:39.407 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:54:39 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "auth get", "entity": "osd.3"}]: dispatch 2026-03-10T07:54:39.407 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:54:39 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T07:54:39.407 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:54:39 vm05.local ceph-mon[130117]: Deploying daemon osd.3 on vm08 2026-03-10T07:54:39.407 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:54:39 vm05.local ceph-mon[130117]: osd.3 marked itself down and dead 2026-03-10T07:54:39.668 INFO:journalctl@ceph.osd.3.vm08.stdout:Mar 10 07:54:39 vm08.local podman[115251]: 2026-03-10 07:54:39.399526537 +0000 UTC m=+0.016095780 container create 321ff0fcc204977869e9ad93febee78254aa0a168ce24a95b38e8be56a124ff6 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508-osd-3-deactivate, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260223, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph 
Release Team , org.label-schema.license=GPLv2, ceph=True, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, io.buildah.version=1.41.3, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0) 2026-03-10T07:54:39.668 INFO:journalctl@ceph.osd.3.vm08.stdout:Mar 10 07:54:39 vm08.local podman[115251]: 2026-03-10 07:54:39.430001952 +0000 UTC m=+0.046571195 container init 321ff0fcc204977869e9ad93febee78254aa0a168ce24a95b38e8be56a124ff6 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508-osd-3-deactivate, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260223, org.label-schema.license=GPLv2, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git) 2026-03-10T07:54:39.668 INFO:journalctl@ceph.osd.3.vm08.stdout:Mar 10 07:54:39 vm08.local podman[115251]: 2026-03-10 07:54:39.434986553 +0000 UTC m=+0.051555796 container start 321ff0fcc204977869e9ad93febee78254aa0a168ce24a95b38e8be56a124ff6 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508-osd-3-deactivate, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, 
org.opencontainers.image.authors=Ceph Release Team , io.buildah.version=1.41.3, org.label-schema.license=GPLv2, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260223, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/) 2026-03-10T07:54:39.668 INFO:journalctl@ceph.osd.3.vm08.stdout:Mar 10 07:54:39 vm08.local podman[115251]: 2026-03-10 07:54:39.436071613 +0000 UTC m=+0.052640856 container attach 321ff0fcc204977869e9ad93febee78254aa0a168ce24a95b38e8be56a124ff6 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508-osd-3-deactivate, org.label-schema.build-date=20260223, CEPH_REF=squid, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, org.opencontainers.image.authors=Ceph Release Team , CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/) 2026-03-10T07:54:39.668 INFO:journalctl@ceph.osd.3.vm08.stdout:Mar 10 07:54:39 vm08.local podman[115251]: 2026-03-10 07:54:39.393241493 +0000 UTC m=+0.009810745 image pull 654f31e6858eb235bbece362255b685a945f2b6a367e2b88c4930c984fbb214c quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc 2026-03-10T07:54:39.668 INFO:journalctl@ceph.osd.3.vm08.stdout:Mar 10 07:54:39 vm08.local podman[115251]: 2026-03-10 07:54:39.55938182 +0000 UTC 
m=+0.175951063 container died 321ff0fcc204977869e9ad93febee78254aa0a168ce24a95b38e8be56a124ff6 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508-osd-3-deactivate, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, io.buildah.version=1.41.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260223, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, OSD_FLAVOR=default) 2026-03-10T07:54:39.668 INFO:journalctl@ceph.osd.3.vm08.stdout:Mar 10 07:54:39 vm08.local podman[115251]: 2026-03-10 07:54:39.580118295 +0000 UTC m=+0.196687527 container remove 321ff0fcc204977869e9ad93febee78254aa0a168ce24a95b38e8be56a124ff6 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508-osd-3-deactivate, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, io.buildah.version=1.41.3, org.opencontainers.image.authors=Ceph Release Team , CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, ceph=True, org.label-schema.build-date=20260223, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9) 2026-03-10T07:54:39.668 
INFO:journalctl@ceph.osd.3.vm08.stdout:Mar 10 07:54:39 vm08.local systemd[1]: ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508@osd.3.service: Deactivated successfully. 2026-03-10T07:54:39.668 INFO:journalctl@ceph.osd.3.vm08.stdout:Mar 10 07:54:39 vm08.local systemd[1]: Stopped Ceph osd.3 for 12e9780e-1c55-11f1-8896-79f7c2e9b508. 2026-03-10T07:54:39.668 INFO:journalctl@ceph.osd.3.vm08.stdout:Mar 10 07:54:39 vm08.local systemd[1]: ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508@osd.3.service: Consumed 44.461s CPU time. 2026-03-10T07:54:40.149 INFO:journalctl@ceph.osd.3.vm08.stdout:Mar 10 07:54:39 vm08.local systemd[1]: Starting Ceph osd.3 for 12e9780e-1c55-11f1-8896-79f7c2e9b508... 2026-03-10T07:54:40.149 INFO:journalctl@ceph.osd.3.vm08.stdout:Mar 10 07:54:39 vm08.local podman[115352]: 2026-03-10 07:54:39.852477014 +0000 UTC m=+0.016691635 container create 7eb56a6a9eac18ebb12a1cba672e8db7fd62138575b9c853fad3d200bd627e6e (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508-osd-3-activate, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, CEPH_REF=squid, org.label-schema.build-date=20260223, io.buildah.version=1.41.3, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team , CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.opencontainers.image.documentation=https://docs.ceph.com/) 2026-03-10T07:54:40.149 INFO:journalctl@ceph.osd.3.vm08.stdout:Mar 10 07:54:39 vm08.local podman[115352]: 2026-03-10 07:54:39.890297818 +0000 UTC m=+0.054512439 container init 7eb56a6a9eac18ebb12a1cba672e8db7fd62138575b9c853fad3d200bd627e6e 
(image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508-osd-3-activate, io.buildah.version=1.41.3, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, CEPH_REF=squid, org.label-schema.build-date=20260223, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team , GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, OSD_FLAVOR=default, org.label-schema.license=GPLv2) 2026-03-10T07:54:40.149 INFO:journalctl@ceph.osd.3.vm08.stdout:Mar 10 07:54:39 vm08.local podman[115352]: 2026-03-10 07:54:39.893259823 +0000 UTC m=+0.057474434 container start 7eb56a6a9eac18ebb12a1cba672e8db7fd62138575b9c853fad3d200bd627e6e (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508-osd-3-activate, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.build-date=20260223, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, io.buildah.version=1.41.3, CEPH_REF=squid) 2026-03-10T07:54:40.149 INFO:journalctl@ceph.osd.3.vm08.stdout:Mar 10 07:54:39 vm08.local podman[115352]: 2026-03-10 07:54:39.895351158 +0000 UTC 
m=+0.059565779 container attach 7eb56a6a9eac18ebb12a1cba672e8db7fd62138575b9c853fad3d200bd627e6e (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508-osd-3-activate, org.label-schema.build-date=20260223, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team , GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git) 2026-03-10T07:54:40.149 INFO:journalctl@ceph.osd.3.vm08.stdout:Mar 10 07:54:39 vm08.local podman[115352]: 2026-03-10 07:54:39.845384999 +0000 UTC m=+0.009599610 image pull 654f31e6858eb235bbece362255b685a945f2b6a367e2b88c4930c984fbb214c quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc 2026-03-10T07:54:40.149 INFO:journalctl@ceph.osd.3.vm08.stdout:Mar 10 07:54:39 vm08.local ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508-osd-3-activate[115363]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-10T07:54:40.149 INFO:journalctl@ceph.osd.3.vm08.stdout:Mar 10 07:54:39 vm08.local bash[115352]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-10T07:54:40.149 INFO:journalctl@ceph.osd.3.vm08.stdout:Mar 10 07:54:39 vm08.local ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508-osd-3-activate[115363]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-10T07:54:40.149 INFO:journalctl@ceph.osd.3.vm08.stdout:Mar 10 07:54:39 vm08.local bash[115352]: Running command: /usr/bin/ceph-authtool --gen-print-key 
2026-03-10T07:54:40.150 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:54:40 vm08.local ceph-mon[107898]: pgmap v91: 65 pgs: 65 active+clean; 255 MiB data, 2.2 GiB used, 118 GiB / 120 GiB avail 2026-03-10T07:54:40.150 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:54:40 vm08.local ceph-mon[107898]: Health check failed: 1 osds down (OSD_DOWN) 2026-03-10T07:54:40.150 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:54:40 vm08.local ceph-mon[107898]: osdmap e66: 6 total, 5 up, 6 in 2026-03-10T07:54:40.407 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:54:40 vm05.local ceph-mon[130117]: pgmap v91: 65 pgs: 65 active+clean; 255 MiB data, 2.2 GiB used, 118 GiB / 120 GiB avail 2026-03-10T07:54:40.407 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:54:40 vm05.local ceph-mon[130117]: Health check failed: 1 osds down (OSD_DOWN) 2026-03-10T07:54:40.407 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:54:40 vm05.local ceph-mon[130117]: osdmap e66: 6 total, 5 up, 6 in 2026-03-10T07:54:40.777 INFO:journalctl@ceph.osd.3.vm08.stdout:Mar 10 07:54:40 vm08.local ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508-osd-3-activate[115363]: --> Failed to activate via raw: did not find any matching OSD to activate 2026-03-10T07:54:40.778 INFO:journalctl@ceph.osd.3.vm08.stdout:Mar 10 07:54:40 vm08.local ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508-osd-3-activate[115363]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-10T07:54:40.778 INFO:journalctl@ceph.osd.3.vm08.stdout:Mar 10 07:54:40 vm08.local bash[115352]: --> Failed to activate via raw: did not find any matching OSD to activate 2026-03-10T07:54:40.778 INFO:journalctl@ceph.osd.3.vm08.stdout:Mar 10 07:54:40 vm08.local bash[115352]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-10T07:54:40.778 INFO:journalctl@ceph.osd.3.vm08.stdout:Mar 10 07:54:40 vm08.local ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508-osd-3-activate[115363]: Running command: /usr/bin/ceph-authtool --gen-print-key 
2026-03-10T07:54:40.778 INFO:journalctl@ceph.osd.3.vm08.stdout:Mar 10 07:54:40 vm08.local bash[115352]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-10T07:54:40.778 INFO:journalctl@ceph.osd.3.vm08.stdout:Mar 10 07:54:40 vm08.local ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508-osd-3-activate[115363]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-3 2026-03-10T07:54:40.778 INFO:journalctl@ceph.osd.3.vm08.stdout:Mar 10 07:54:40 vm08.local bash[115352]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-3 2026-03-10T07:54:40.778 INFO:journalctl@ceph.osd.3.vm08.stdout:Mar 10 07:54:40 vm08.local ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508-osd-3-activate[115363]: Running command: /usr/bin/ceph-bluestore-tool --cluster=ceph prime-osd-dir --dev /dev/ceph-de35ef86-4925-4f07-8737-ffdd5b7332a9/osd-block-4cc7cd7e-7756-40b9-9bc8-029e26495239 --path /var/lib/ceph/osd/ceph-3 --no-mon-config 2026-03-10T07:54:40.778 INFO:journalctl@ceph.osd.3.vm08.stdout:Mar 10 07:54:40 vm08.local bash[115352]: Running command: /usr/bin/ceph-bluestore-tool --cluster=ceph prime-osd-dir --dev /dev/ceph-de35ef86-4925-4f07-8737-ffdd5b7332a9/osd-block-4cc7cd7e-7756-40b9-9bc8-029e26495239 --path /var/lib/ceph/osd/ceph-3 --no-mon-config 2026-03-10T07:54:40.778 INFO:journalctl@ceph.osd.3.vm08.stdout:Mar 10 07:54:40 vm08.local ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508-osd-3-activate[115363]: Running command: /usr/bin/ln -snf /dev/ceph-de35ef86-4925-4f07-8737-ffdd5b7332a9/osd-block-4cc7cd7e-7756-40b9-9bc8-029e26495239 /var/lib/ceph/osd/ceph-3/block 2026-03-10T07:54:41.161 INFO:journalctl@ceph.osd.3.vm08.stdout:Mar 10 07:54:40 vm08.local bash[115352]: Running command: /usr/bin/ln -snf /dev/ceph-de35ef86-4925-4f07-8737-ffdd5b7332a9/osd-block-4cc7cd7e-7756-40b9-9bc8-029e26495239 /var/lib/ceph/osd/ceph-3/block 2026-03-10T07:54:41.161 INFO:journalctl@ceph.osd.3.vm08.stdout:Mar 10 07:54:40 vm08.local ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508-osd-3-activate[115363]: 
Running command: /usr/bin/chown -h ceph:ceph /var/lib/ceph/osd/ceph-3/block 2026-03-10T07:54:41.161 INFO:journalctl@ceph.osd.3.vm08.stdout:Mar 10 07:54:40 vm08.local bash[115352]: Running command: /usr/bin/chown -h ceph:ceph /var/lib/ceph/osd/ceph-3/block 2026-03-10T07:54:41.161 INFO:journalctl@ceph.osd.3.vm08.stdout:Mar 10 07:54:40 vm08.local ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508-osd-3-activate[115363]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-0 2026-03-10T07:54:41.161 INFO:journalctl@ceph.osd.3.vm08.stdout:Mar 10 07:54:40 vm08.local bash[115352]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-0 2026-03-10T07:54:41.161 INFO:journalctl@ceph.osd.3.vm08.stdout:Mar 10 07:54:40 vm08.local ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508-osd-3-activate[115363]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-3 2026-03-10T07:54:41.161 INFO:journalctl@ceph.osd.3.vm08.stdout:Mar 10 07:54:40 vm08.local bash[115352]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-3 2026-03-10T07:54:41.162 INFO:journalctl@ceph.osd.3.vm08.stdout:Mar 10 07:54:40 vm08.local ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508-osd-3-activate[115363]: --> ceph-volume lvm activate successful for osd ID: 3 2026-03-10T07:54:41.162 INFO:journalctl@ceph.osd.3.vm08.stdout:Mar 10 07:54:40 vm08.local bash[115352]: --> ceph-volume lvm activate successful for osd ID: 3 2026-03-10T07:54:41.162 INFO:journalctl@ceph.osd.3.vm08.stdout:Mar 10 07:54:40 vm08.local podman[115352]: 2026-03-10 07:54:40.806626801 +0000 UTC m=+0.970841422 container died 7eb56a6a9eac18ebb12a1cba672e8db7fd62138575b9c853fad3d200bd627e6e (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508-osd-3-activate, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, ceph=True, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, 
org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260223, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.opencontainers.image.authors=Ceph Release Team , FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3) 2026-03-10T07:54:41.162 INFO:journalctl@ceph.osd.3.vm08.stdout:Mar 10 07:54:40 vm08.local podman[115352]: 2026-03-10 07:54:40.823838319 +0000 UTC m=+0.988052931 container remove 7eb56a6a9eac18ebb12a1cba672e8db7fd62138575b9c853fad3d200bd627e6e (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508-osd-3-activate, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.build-date=20260223, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team , OSD_FLAVOR=default) 2026-03-10T07:54:41.162 INFO:journalctl@ceph.osd.3.vm08.stdout:Mar 10 07:54:40 vm08.local podman[115603]: 2026-03-10 07:54:40.912278464 +0000 UTC m=+0.015741126 container create 3f280bcfe0f5b5098bed159e3c001472fdb361f091785107b6801e1875f117e0 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508-osd-3, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, 
org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.build-date=20260223, CEPH_REF=squid, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS) 2026-03-10T07:54:41.162 INFO:journalctl@ceph.osd.3.vm08.stdout:Mar 10 07:54:40 vm08.local podman[115603]: 2026-03-10 07:54:40.950039727 +0000 UTC m=+0.053502389 container init 3f280bcfe0f5b5098bed159e3c001472fdb361f091785107b6801e1875f117e0 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508-osd-3, org.label-schema.schema-version=1.0, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.opencontainers.image.authors=Ceph Release Team , io.buildah.version=1.41.3, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20260223, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid, org.label-schema.license=GPLv2) 2026-03-10T07:54:41.162 INFO:journalctl@ceph.osd.3.vm08.stdout:Mar 10 07:54:40 vm08.local podman[115603]: 2026-03-10 07:54:40.954031699 +0000 UTC m=+0.057494361 container start 3f280bcfe0f5b5098bed159e3c001472fdb361f091785107b6801e1875f117e0 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, 
name=ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508-osd-3, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260223, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team , ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default) 2026-03-10T07:54:41.162 INFO:journalctl@ceph.osd.3.vm08.stdout:Mar 10 07:54:40 vm08.local bash[115603]: 3f280bcfe0f5b5098bed159e3c001472fdb361f091785107b6801e1875f117e0 2026-03-10T07:54:41.162 INFO:journalctl@ceph.osd.3.vm08.stdout:Mar 10 07:54:40 vm08.local podman[115603]: 2026-03-10 07:54:40.906264106 +0000 UTC m=+0.009726779 image pull 654f31e6858eb235bbece362255b685a945f2b6a367e2b88c4930c984fbb214c quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc 2026-03-10T07:54:41.162 INFO:journalctl@ceph.osd.3.vm08.stdout:Mar 10 07:54:40 vm08.local systemd[1]: Started Ceph osd.3 for 12e9780e-1c55-11f1-8896-79f7c2e9b508. 
2026-03-10T07:54:41.162 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:54:41 vm08.local ceph-mon[107898]: osdmap e67: 6 total, 5 up, 6 in 2026-03-10T07:54:41.162 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:54:41 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:54:41.162 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:54:41 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:54:41.162 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:54:41 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T07:54:41.407 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:54:41 vm05.local ceph-mon[130117]: osdmap e67: 6 total, 5 up, 6 in 2026-03-10T07:54:41.407 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:54:41 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:54:41.407 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:54:41 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:54:41.407 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:54:41 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T07:54:41.698 INFO:journalctl@ceph.osd.3.vm08.stdout:Mar 10 07:54:41 vm08.local ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508-osd-3[115614]: 2026-03-10T07:54:41.559+0000 7f765a905740 -1 Falling back to public interface 2026-03-10T07:54:42.163 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:54:42 vm08.local ceph-mon[107898]: pgmap v94: 65 pgs: 7 peering, 13 stale+active+clean, 45 active+clean; 255 MiB data, 2.2 GiB used, 118 GiB / 120 GiB avail 2026-03-10T07:54:42.163 
INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:54:42 vm08.local ceph-mon[107898]: Health check failed: Reduced data availability: 1 pg peering (PG_AVAILABILITY) 2026-03-10T07:54:42.163 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:54:42 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:54:42.163 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:54:42 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T07:54:42.163 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:54:42 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:54:42.163 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:54:42 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:54:42.657 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:54:42 vm05.local ceph-mon[130117]: pgmap v94: 65 pgs: 7 peering, 13 stale+active+clean, 45 active+clean; 255 MiB data, 2.2 GiB used, 118 GiB / 120 GiB avail 2026-03-10T07:54:42.657 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:54:42 vm05.local ceph-mon[130117]: Health check failed: Reduced data availability: 1 pg peering (PG_AVAILABILITY) 2026-03-10T07:54:42.657 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:54:42 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:54:42.657 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:54:42 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T07:54:42.657 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:54:42 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 
2026-03-10T07:54:42.657 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:54:42 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:54:43.657 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:54:43 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:54:43.657 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:54:43 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:54:43.657 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:54:43 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:54:43.657 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:54:43 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:54:43.657 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:54:43 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T07:54:43.657 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:54:43 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T07:54:43.657 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:54:43 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:54:43.668 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:54:43 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:54:43.668 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:54:43 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:54:43.668 
INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:54:43 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:54:43.668 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:54:43 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:54:43.668 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:54:43 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T07:54:43.668 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:54:43 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T07:54:43.668 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:54:43 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:54:44.407 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:54:44 vm05.local ceph-mon[130117]: pgmap v95: 65 pgs: 4 active+undersized, 7 peering, 11 stale+active+clean, 5 active+undersized+degraded, 38 active+clean; 255 MiB data, 2.2 GiB used, 118 GiB / 120 GiB avail; 15/261 objects degraded (5.747%) 2026-03-10T07:54:44.407 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:54:44 vm05.local ceph-mon[130117]: Health check failed: Degraded data redundancy: 15/261 objects degraded (5.747%), 5 pgs degraded (PG_DEGRADED) 2026-03-10T07:54:44.407 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:54:44 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T07:54:44.407 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:54:44 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": 
"versions"}]: dispatch 2026-03-10T07:54:44.407 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:54:44 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T07:54:44.407 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:54:44 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T07:54:44.407 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:54:44 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "osd ok-to-stop", "ids": ["4"], "max": 16}]: dispatch 2026-03-10T07:54:44.407 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:54:44 vm05.local ceph-mon[130117]: from='mon.0 -' entity='mon.' cmd=[{"prefix": "osd ok-to-stop", "ids": ["4"], "max": 16}]: dispatch 2026-03-10T07:54:44.407 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:54:44 vm05.local ceph-mon[130117]: Upgrade: unsafe to stop osd(s) at this time (7 PGs are or would become offline) 2026-03-10T07:54:44.668 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:54:44 vm08.local ceph-mon[107898]: pgmap v95: 65 pgs: 4 active+undersized, 7 peering, 11 stale+active+clean, 5 active+undersized+degraded, 38 active+clean; 255 MiB data, 2.2 GiB used, 118 GiB / 120 GiB avail; 15/261 objects degraded (5.747%) 2026-03-10T07:54:44.668 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:54:44 vm08.local ceph-mon[107898]: Health check failed: Degraded data redundancy: 15/261 objects degraded (5.747%), 5 pgs degraded (PG_DEGRADED) 2026-03-10T07:54:44.668 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:54:44 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T07:54:44.668 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:54:44 vm08.local 
ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T07:54:44.668 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:54:44 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T07:54:44.668 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:54:44 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T07:54:44.668 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:54:44 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "osd ok-to-stop", "ids": ["4"], "max": 16}]: dispatch 2026-03-10T07:54:44.668 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:54:44 vm08.local ceph-mon[107898]: from='mon.0 -' entity='mon.' cmd=[{"prefix": "osd ok-to-stop", "ids": ["4"], "max": 16}]: dispatch 2026-03-10T07:54:44.668 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:54:44 vm08.local ceph-mon[107898]: Upgrade: unsafe to stop osd(s) at this time (7 PGs are or would become offline) 2026-03-10T07:54:45.918 INFO:journalctl@ceph.osd.3.vm08.stdout:Mar 10 07:54:45 vm08.local ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508-osd-3[115614]: 2026-03-10T07:54:45.607+0000 7f765a905740 -1 osd.3 0 read_superblock omap replica is missing. 
2026-03-10T07:54:45.918 INFO:journalctl@ceph.osd.3.vm08.stdout:Mar 10 07:54:45 vm08.local ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508-osd-3[115614]: 2026-03-10T07:54:45.829+0000 7f765a905740 -1 osd.3 65 log_to_monitors true 2026-03-10T07:54:46.657 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:54:46 vm05.local ceph-mon[130117]: pgmap v96: 65 pgs: 16 active+undersized, 7 peering, 15 active+undersized+degraded, 27 active+clean; 255 MiB data, 2.2 GiB used, 118 GiB / 120 GiB avail; 49/261 objects degraded (18.774%) 2026-03-10T07:54:46.657 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:54:46 vm05.local ceph-mon[130117]: from='osd.3 [v2:192.168.123.108:6800/94830866,v1:192.168.123.108:6801/94830866]' entity='osd.3' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["3"]}]: dispatch 2026-03-10T07:54:46.657 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:54:46 vm05.local ceph-mon[130117]: from='osd.3 ' entity='osd.3' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["3"]}]: dispatch 2026-03-10T07:54:46.668 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:54:46 vm08.local ceph-mon[107898]: pgmap v96: 65 pgs: 16 active+undersized, 7 peering, 15 active+undersized+degraded, 27 active+clean; 255 MiB data, 2.2 GiB used, 118 GiB / 120 GiB avail; 49/261 objects degraded (18.774%) 2026-03-10T07:54:46.668 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:54:46 vm08.local ceph-mon[107898]: from='osd.3 [v2:192.168.123.108:6800/94830866,v1:192.168.123.108:6801/94830866]' entity='osd.3' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["3"]}]: dispatch 2026-03-10T07:54:46.668 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:54:46 vm08.local ceph-mon[107898]: from='osd.3 ' entity='osd.3' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["3"]}]: dispatch 2026-03-10T07:54:46.668 INFO:journalctl@ceph.osd.3.vm08.stdout:Mar 10 07:54:46 vm08.local 
ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508-osd-3[115614]: 2026-03-10T07:54:46.385+0000 7f765269f640 -1 osd.3 65 set_numa_affinity unable to identify public interface '' numa node: (2) No such file or directory 2026-03-10T07:54:47.657 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:54:47 vm05.local ceph-mon[130117]: from='osd.3 ' entity='osd.3' cmd='[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["3"]}]': finished 2026-03-10T07:54:47.657 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:54:47 vm05.local ceph-mon[130117]: from='osd.3 [v2:192.168.123.108:6800/94830866,v1:192.168.123.108:6801/94830866]' entity='osd.3' cmd=[{"prefix": "osd crush create-or-move", "id": 3, "weight":0.0195, "args": ["host=vm08", "root=default"]}]: dispatch 2026-03-10T07:54:47.657 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:54:47 vm05.local ceph-mon[130117]: osdmap e68: 6 total, 5 up, 6 in 2026-03-10T07:54:47.657 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:54:47 vm05.local ceph-mon[130117]: from='osd.3 ' entity='osd.3' cmd=[{"prefix": "osd crush create-or-move", "id": 3, "weight":0.0195, "args": ["host=vm08", "root=default"]}]: dispatch 2026-03-10T07:54:47.668 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:54:47 vm08.local ceph-mon[107898]: from='osd.3 ' entity='osd.3' cmd='[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["3"]}]': finished 2026-03-10T07:54:47.668 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:54:47 vm08.local ceph-mon[107898]: from='osd.3 [v2:192.168.123.108:6800/94830866,v1:192.168.123.108:6801/94830866]' entity='osd.3' cmd=[{"prefix": "osd crush create-or-move", "id": 3, "weight":0.0195, "args": ["host=vm08", "root=default"]}]: dispatch 2026-03-10T07:54:47.668 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:54:47 vm08.local ceph-mon[107898]: osdmap e68: 6 total, 5 up, 6 in 2026-03-10T07:54:47.668 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:54:47 vm08.local ceph-mon[107898]: from='osd.3 ' 
entity='osd.3' cmd=[{"prefix": "osd crush create-or-move", "id": 3, "weight":0.0195, "args": ["host=vm08", "root=default"]}]: dispatch 2026-03-10T07:54:48.657 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:54:48 vm05.local ceph-mon[130117]: pgmap v98: 65 pgs: 20 active+undersized, 18 active+undersized+degraded, 27 active+clean; 255 MiB data, 1.7 GiB used, 118 GiB / 120 GiB avail; 58/261 objects degraded (22.222%) 2026-03-10T07:54:48.657 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:54:48 vm05.local ceph-mon[130117]: Health check cleared: PG_AVAILABILITY (was: Reduced data availability: 3 pgs peering) 2026-03-10T07:54:48.657 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:54:48 vm05.local ceph-mon[130117]: Health check cleared: OSD_DOWN (was: 1 osds down) 2026-03-10T07:54:48.657 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:54:48 vm05.local ceph-mon[130117]: osd.3 [v2:192.168.123.108:6800/94830866,v1:192.168.123.108:6801/94830866] boot 2026-03-10T07:54:48.657 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:54:48 vm05.local ceph-mon[130117]: osdmap e69: 6 total, 6 up, 6 in 2026-03-10T07:54:48.657 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:54:48 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "osd metadata", "id": 3}]: dispatch 2026-03-10T07:54:48.668 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:54:48 vm08.local ceph-mon[107898]: pgmap v98: 65 pgs: 20 active+undersized, 18 active+undersized+degraded, 27 active+clean; 255 MiB data, 1.7 GiB used, 118 GiB / 120 GiB avail; 58/261 objects degraded (22.222%) 2026-03-10T07:54:48.668 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:54:48 vm08.local ceph-mon[107898]: Health check cleared: PG_AVAILABILITY (was: Reduced data availability: 3 pgs peering) 2026-03-10T07:54:48.668 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:54:48 vm08.local ceph-mon[107898]: Health check cleared: OSD_DOWN (was: 1 osds down) 
2026-03-10T07:54:48.668 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:54:48 vm08.local ceph-mon[107898]: osd.3 [v2:192.168.123.108:6800/94830866,v1:192.168.123.108:6801/94830866] boot 2026-03-10T07:54:48.668 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:54:48 vm08.local ceph-mon[107898]: osdmap e69: 6 total, 6 up, 6 in 2026-03-10T07:54:48.668 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:54:48 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "osd metadata", "id": 3}]: dispatch 2026-03-10T07:54:49.657 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:54:49 vm05.local ceph-mon[130117]: osdmap e70: 6 total, 6 up, 6 in 2026-03-10T07:54:49.657 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:54:49 vm05.local ceph-mon[130117]: Health check update: Degraded data redundancy: 58/261 objects degraded (22.222%), 18 pgs degraded (PG_DEGRADED) 2026-03-10T07:54:49.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:54:49 vm08.local ceph-mon[107898]: osdmap e70: 6 total, 6 up, 6 in 2026-03-10T07:54:49.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:54:49 vm08.local ceph-mon[107898]: Health check update: Degraded data redundancy: 58/261 objects degraded (22.222%), 18 pgs degraded (PG_DEGRADED) 2026-03-10T07:54:50.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:54:50 vm05.local ceph-mon[130117]: pgmap v101: 65 pgs: 15 peering, 10 active+undersized, 10 active+undersized+degraded, 30 active+clean; 255 MiB data, 1.7 GiB used, 118 GiB / 120 GiB avail; 32/261 objects degraded (12.261%) 2026-03-10T07:54:50.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:54:50 vm08.local ceph-mon[107898]: pgmap v101: 65 pgs: 15 peering, 10 active+undersized, 10 active+undersized+degraded, 30 active+clean; 255 MiB data, 1.7 GiB used, 118 GiB / 120 GiB avail; 32/261 objects degraded (12.261%) 2026-03-10T07:54:52.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:54:52 vm05.local 
ceph-mon[130117]: pgmap v102: 65 pgs: 15 peering, 7 active+undersized, 9 active+undersized+degraded, 34 active+clean; 255 MiB data, 1.7 GiB used, 118 GiB / 120 GiB avail; 29/261 objects degraded (11.111%) 2026-03-10T07:54:52.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:54:52 vm08.local ceph-mon[107898]: pgmap v102: 65 pgs: 15 peering, 7 active+undersized, 9 active+undersized+degraded, 34 active+clean; 255 MiB data, 1.7 GiB used, 118 GiB / 120 GiB avail; 29/261 objects degraded (11.111%) 2026-03-10T07:54:53.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:54:53 vm05.local ceph-mon[130117]: Health check cleared: PG_DEGRADED (was: Degraded data redundancy: 29/261 objects degraded (11.111%), 9 pgs degraded) 2026-03-10T07:54:53.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:54:53 vm08.local ceph-mon[107898]: Health check cleared: PG_DEGRADED (was: Degraded data redundancy: 29/261 objects degraded (11.111%), 9 pgs degraded) 2026-03-10T07:54:54.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:54:54 vm05.local ceph-mon[130117]: pgmap v103: 65 pgs: 8 peering, 57 active+clean; 255 MiB data, 1.7 GiB used, 118 GiB / 120 GiB avail 2026-03-10T07:54:54.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:54:54 vm08.local ceph-mon[107898]: pgmap v103: 65 pgs: 8 peering, 57 active+clean; 255 MiB data, 1.7 GiB used, 118 GiB / 120 GiB avail 2026-03-10T07:54:55.457 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:55.456+0000 7f3da5359700 1 -- 192.168.123.105:0/551715177 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3da0103d70 msgr2=0x7f3da0107dc0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:54:55.457 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:55.456+0000 7f3da5359700 1 --2- 192.168.123.105:0/551715177 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3da0103d70 0x7f3da0107dc0 secure :-1 s=READY pgs=71 cs=0 l=1 rev1=1 crypto rx=0x7f3d90009b50 tx=0x7f3d90009e60 
comp rx=0 tx=0).stop 2026-03-10T07:54:55.457 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:55.456+0000 7f3da5359700 1 -- 192.168.123.105:0/551715177 shutdown_connections 2026-03-10T07:54:55.457 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:55.456+0000 7f3da5359700 1 --2- 192.168.123.105:0/551715177 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3da0103d70 0x7f3da0107dc0 unknown :-1 s=CLOSED pgs=71 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:54:55.457 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:55.456+0000 7f3da5359700 1 --2- 192.168.123.105:0/551715177 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f3da01033c0 0x7f3da01037a0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:54:55.457 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:55.456+0000 7f3da5359700 1 -- 192.168.123.105:0/551715177 >> 192.168.123.105:0/551715177 conn(0x7f3da00fec30 msgr2=0x7f3da0101050 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:54:55.457 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:55.457+0000 7f3da5359700 1 -- 192.168.123.105:0/551715177 shutdown_connections 2026-03-10T07:54:55.457 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:55.457+0000 7f3da5359700 1 -- 192.168.123.105:0/551715177 wait complete. 
2026-03-10T07:54:55.457 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:55.457+0000 7f3da5359700 1 Processor -- start 2026-03-10T07:54:55.457 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:55.457+0000 7f3da5359700 1 -- start start 2026-03-10T07:54:55.458 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:55.458+0000 7f3da5359700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3da01033c0 0x7f3da0198e30 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:54:55.458 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:55.458+0000 7f3da5359700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f3da0103d70 0x7f3da0199370 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:54:55.458 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:55.458+0000 7f3da5359700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f3da0199a50 con 0x7f3da01033c0 2026-03-10T07:54:55.458 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:55.458+0000 7f3da5359700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f3da019d7e0 con 0x7f3da0103d70 2026-03-10T07:54:55.458 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:55.458+0000 7f3d9effd700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3da01033c0 0x7f3da0198e30 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:54:55.458 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:55.458+0000 7f3d9e7fc700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f3da0103d70 0x7f3da0199370 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 
2026-03-10T07:54:55.458 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:55.458+0000 7f3d9effd700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3da01033c0 0x7f3da0198e30 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.105:51754/0 (socket says 192.168.123.105:51754) 2026-03-10T07:54:55.458 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:55.458+0000 7f3d9effd700 1 -- 192.168.123.105:0/1214969252 learned_addr learned my addr 192.168.123.105:0/1214969252 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T07:54:55.458 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:55.458+0000 7f3d9effd700 1 -- 192.168.123.105:0/1214969252 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f3da0103d70 msgr2=0x7f3da0199370 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:54:55.458 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:55.458+0000 7f3d9effd700 1 --2- 192.168.123.105:0/1214969252 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f3da0103d70 0x7f3da0199370 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:54:55.458 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:55.458+0000 7f3d9effd700 1 -- 192.168.123.105:0/1214969252 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f3d900097e0 con 0x7f3da01033c0 2026-03-10T07:54:55.459 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:55.458+0000 7f3d9effd700 1 --2- 192.168.123.105:0/1214969252 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3da01033c0 0x7f3da0198e30 secure :-1 s=READY pgs=72 cs=0 l=1 rev1=1 crypto rx=0x7f3d8800eb10 tx=0x7f3d8800eed0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:54:55.459 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:55.459+0000 7f3d97fff700 1 -- 192.168.123.105:0/1214969252 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f3d8800cca0 con 0x7f3da01033c0 2026-03-10T07:54:55.459 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:55.459+0000 7f3da5359700 1 -- 192.168.123.105:0/1214969252 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f3da019dac0 con 0x7f3da01033c0 2026-03-10T07:54:55.459 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:55.459+0000 7f3da5359700 1 -- 192.168.123.105:0/1214969252 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f3da019e010 con 0x7f3da01033c0 2026-03-10T07:54:55.460 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:55.459+0000 7f3d97fff700 1 -- 192.168.123.105:0/1214969252 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7f3d8800ce00 con 0x7f3da01033c0 2026-03-10T07:54:55.460 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:55.459+0000 7f3d97fff700 1 -- 192.168.123.105:0/1214969252 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f3d88018910 con 0x7f3da01033c0 2026-03-10T07:54:55.461 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:55.461+0000 7f3d97fff700 1 -- 192.168.123.105:0/1214969252 <== mon.0 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 38) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f3d88010c80 con 0x7f3da01033c0 2026-03-10T07:54:55.461 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:55.461+0000 7f3d97fff700 1 --2- 192.168.123.105:0/1214969252 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f3d8c0779e0 0x7f3d8c079ea0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:54:55.461 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:55.461+0000 7f3d9e7fc700 1 --2- 192.168.123.105:0/1214969252 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f3d8c0779e0 0x7f3d8c079ea0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:54:55.461 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:55.461+0000 7f3d97fff700 1 -- 192.168.123.105:0/1214969252 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(70..70 src has 1..70) v4 ==== 6136+0+0 (secure 0 0 0) 0x7f3d88014070 con 0x7f3da01033c0 2026-03-10T07:54:55.462 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:55.462+0000 7f3d9e7fc700 1 --2- 192.168.123.105:0/1214969252 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f3d8c0779e0 0x7f3d8c079ea0 secure :-1 s=READY pgs=52 cs=0 l=1 rev1=1 crypto rx=0x7f3d90006010 tx=0x7f3d900058e0 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:54:55.462 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:55.462+0000 7f3da5359700 1 -- 192.168.123.105:0/1214969252 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f3d80005320 con 0x7f3da01033c0 2026-03-10T07:54:55.465 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:55.465+0000 7f3d97fff700 1 -- 192.168.123.105:0/1214969252 <== mon.0 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f3d88062bb0 con 0x7f3da01033c0 2026-03-10T07:54:55.589 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:55.588+0000 7f3da5359700 1 -- 192.168.123.105:0/1214969252 --> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 
0x7f3d80000bf0 con 0x7f3d8c0779e0 2026-03-10T07:54:55.590 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:55.590+0000 7f3d97fff700 1 -- 192.168.123.105:0/1214969252 <== mgr.34104 v2:192.168.123.105:6800/627686298 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+401 (secure 0 0 0) 0x7f3d80000bf0 con 0x7f3d8c0779e0 2026-03-10T07:54:55.592 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:55.592+0000 7f3da5359700 1 -- 192.168.123.105:0/1214969252 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f3d8c0779e0 msgr2=0x7f3d8c079ea0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:54:55.592 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:55.592+0000 7f3da5359700 1 --2- 192.168.123.105:0/1214969252 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f3d8c0779e0 0x7f3d8c079ea0 secure :-1 s=READY pgs=52 cs=0 l=1 rev1=1 crypto rx=0x7f3d90006010 tx=0x7f3d900058e0 comp rx=0 tx=0).stop 2026-03-10T07:54:55.592 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:55.592+0000 7f3da5359700 1 -- 192.168.123.105:0/1214969252 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3da01033c0 msgr2=0x7f3da0198e30 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:54:55.592 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:55.592+0000 7f3da5359700 1 --2- 192.168.123.105:0/1214969252 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3da01033c0 0x7f3da0198e30 secure :-1 s=READY pgs=72 cs=0 l=1 rev1=1 crypto rx=0x7f3d8800eb10 tx=0x7f3d8800eed0 comp rx=0 tx=0).stop 2026-03-10T07:54:55.593 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:55.592+0000 7f3da5359700 1 -- 192.168.123.105:0/1214969252 shutdown_connections 2026-03-10T07:54:55.593 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:55.592+0000 7f3da5359700 1 --2- 192.168.123.105:0/1214969252 >> 
[v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f3d8c0779e0 0x7f3d8c079ea0 unknown :-1 s=CLOSED pgs=52 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:54:55.593 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:55.593+0000 7f3da5359700 1 --2- 192.168.123.105:0/1214969252 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3da01033c0 0x7f3da0198e30 unknown :-1 s=CLOSED pgs=72 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:54:55.593 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:55.593+0000 7f3da5359700 1 --2- 192.168.123.105:0/1214969252 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f3da0103d70 0x7f3da0199370 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:54:55.593 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:55.593+0000 7f3da5359700 1 -- 192.168.123.105:0/1214969252 >> 192.168.123.105:0/1214969252 conn(0x7f3da00fec30 msgr2=0x7f3da01001e0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:54:55.593 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:55.593+0000 7f3da5359700 1 -- 192.168.123.105:0/1214969252 shutdown_connections 2026-03-10T07:54:55.593 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:55.593+0000 7f3da5359700 1 -- 192.168.123.105:0/1214969252 wait complete. 
2026-03-10T07:54:55.602 INFO:teuthology.orchestra.run.vm05.stdout:true 2026-03-10T07:54:55.661 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:55.660+0000 7f4c59cbd700 1 -- 192.168.123.105:0/1095312065 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f4c54069180 msgr2=0x7f4c54069600 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:54:55.661 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:55.660+0000 7f4c59cbd700 1 --2- 192.168.123.105:0/1095312065 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f4c54069180 0x7f4c54069600 secure :-1 s=READY pgs=35 cs=0 l=1 rev1=1 crypto rx=0x7f4c44009a60 tx=0x7f4c44009d70 comp rx=0 tx=0).stop 2026-03-10T07:54:55.661 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:55.661+0000 7f4c59cbd700 1 -- 192.168.123.105:0/1095312065 shutdown_connections 2026-03-10T07:54:55.661 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:55.661+0000 7f4c59cbd700 1 --2- 192.168.123.105:0/1095312065 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f4c54069180 0x7f4c54069600 unknown :-1 s=CLOSED pgs=35 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:54:55.661 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:55.661+0000 7f4c59cbd700 1 --2- 192.168.123.105:0/1095312065 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f4c54102db0 0x7f4c54103190 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:54:55.661 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:55.661+0000 7f4c59cbd700 1 -- 192.168.123.105:0/1095312065 >> 192.168.123.105:0/1095312065 conn(0x7f4c54076b70 msgr2=0x7f4c54076f80 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:54:55.661 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:55.661+0000 7f4c59cbd700 1 -- 192.168.123.105:0/1095312065 shutdown_connections 2026-03-10T07:54:55.661 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:55.661+0000 7f4c59cbd700 1 -- 192.168.123.105:0/1095312065 wait complete. 2026-03-10T07:54:55.661 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:55.661+0000 7f4c59cbd700 1 Processor -- start 2026-03-10T07:54:55.662 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:55.661+0000 7f4c59cbd700 1 -- start start 2026-03-10T07:54:55.662 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:55.662+0000 7f4c59cbd700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f4c54069180 0x7f4c54198e60 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:54:55.662 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:55.662+0000 7f4c59cbd700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f4c54102db0 0x7f4c541993a0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:54:55.662 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:55.662+0000 7f4c59cbd700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f4c54199a80 con 0x7f4c54069180 2026-03-10T07:54:55.662 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:55.662+0000 7f4c59cbd700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f4c5419d810 con 0x7f4c54102db0 2026-03-10T07:54:55.662 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:55.662+0000 7f4c52ffd700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f4c54102db0 0x7f4c541993a0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:54:55.662 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:55.662+0000 7f4c52ffd700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f4c54102db0 0x7f4c541993a0 unknown :-1 
s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.108:3300/0 says I am v2:192.168.123.105:46622/0 (socket says 192.168.123.105:46622) 2026-03-10T07:54:55.662 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:55.662+0000 7f4c52ffd700 1 -- 192.168.123.105:0/477555182 learned_addr learned my addr 192.168.123.105:0/477555182 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T07:54:55.663 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:55.663+0000 7f4c52ffd700 1 -- 192.168.123.105:0/477555182 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f4c54069180 msgr2=0x7f4c54198e60 unknown :-1 s=STATE_CONNECTING l=1).mark_down 2026-03-10T07:54:55.663 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:55.663+0000 7f4c537fe700 1 --2- 192.168.123.105:0/477555182 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f4c54069180 0x7f4c54198e60 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:54:55.663 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:55.663+0000 7f4c52ffd700 1 --2- 192.168.123.105:0/477555182 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f4c54069180 0x7f4c54198e60 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:54:55.663 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:55.663+0000 7f4c52ffd700 1 -- 192.168.123.105:0/477555182 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f4c3c0097e0 con 0x7f4c54102db0 2026-03-10T07:54:55.663 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:55.663+0000 7f4c537fe700 1 --2- 192.168.123.105:0/477555182 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f4c54069180 0x7f4c54198e60 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 
tx=0).send_auth_request state changed! 2026-03-10T07:54:55.663 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:55.663+0000 7f4c52ffd700 1 --2- 192.168.123.105:0/477555182 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f4c54102db0 0x7f4c541993a0 secure :-1 s=READY pgs=36 cs=0 l=1 rev1=1 crypto rx=0x7f4c4400b5c0 tx=0x7f4c4400f740 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:54:55.663 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:55.663+0000 7f4c50ff9700 1 -- 192.168.123.105:0/477555182 <== mon.1 v2:192.168.123.108:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f4c4401d070 con 0x7f4c54102db0 2026-03-10T07:54:55.664 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:55.663+0000 7f4c50ff9700 1 -- 192.168.123.105:0/477555182 <== mon.1 v2:192.168.123.108:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7f4c4400fd20 con 0x7f4c54102db0 2026-03-10T07:54:55.664 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:55.663+0000 7f4c59cbd700 1 -- 192.168.123.105:0/477555182 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f4c44009710 con 0x7f4c54102db0 2026-03-10T07:54:55.664 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:55.663+0000 7f4c50ff9700 1 -- 192.168.123.105:0/477555182 <== mon.1 v2:192.168.123.108:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f4c44017700 con 0x7f4c54102db0 2026-03-10T07:54:55.664 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:55.664+0000 7f4c59cbd700 1 -- 192.168.123.105:0/477555182 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f4c5419ddf0 con 0x7f4c54102db0 2026-03-10T07:54:55.665 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:55.665+0000 7f4c59cbd700 1 -- 192.168.123.105:0/477555182 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- 
mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f4c5410b520 con 0x7f4c54102db0 2026-03-10T07:54:55.666 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:55.666+0000 7f4c50ff9700 1 -- 192.168.123.105:0/477555182 <== mon.1 v2:192.168.123.108:3300/0 4 ==== mgrmap(e 38) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f4c44021410 con 0x7f4c54102db0 2026-03-10T07:54:55.666 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:55.666+0000 7f4c50ff9700 1 --2- 192.168.123.105:0/477555182 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f4c400778c0 0x7f4c40079d80 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:54:55.666 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:55.666+0000 7f4c50ff9700 1 -- 192.168.123.105:0/477555182 <== mon.1 v2:192.168.123.108:3300/0 5 ==== osd_map(70..70 src has 1..70) v4 ==== 6136+0+0 (secure 0 0 0) 0x7f4c4409b840 con 0x7f4c54102db0 2026-03-10T07:54:55.667 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:55.666+0000 7f4c537fe700 1 --2- 192.168.123.105:0/477555182 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f4c400778c0 0x7f4c40079d80 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:54:55.667 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:55.667+0000 7f4c537fe700 1 --2- 192.168.123.105:0/477555182 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f4c400778c0 0x7f4c40079d80 secure :-1 s=READY pgs=53 cs=0 l=1 rev1=1 crypto rx=0x7f4c3c005fd0 tx=0x7f4c3c009500 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:54:55.668 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:55.668+0000 7f4c50ff9700 1 -- 192.168.123.105:0/477555182 <== mon.1 v2:192.168.123.108:3300/0 6 ==== 
mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f4c440640f0 con 0x7f4c54102db0 2026-03-10T07:54:55.791 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:55.790+0000 7f4c59cbd700 1 -- 192.168.123.105:0/477555182 --> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7f4c5419a1c0 con 0x7f4c400778c0 2026-03-10T07:54:55.795 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:55.794+0000 7f4c50ff9700 1 -- 192.168.123.105:0/477555182 <== mgr.34104 v2:192.168.123.105:6800/627686298 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+401 (secure 0 0 0) 0x7f4c5419a1c0 con 0x7f4c400778c0 2026-03-10T07:54:55.797 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:55.797+0000 7f4c59cbd700 1 -- 192.168.123.105:0/477555182 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f4c400778c0 msgr2=0x7f4c40079d80 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:54:55.797 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:55.797+0000 7f4c59cbd700 1 --2- 192.168.123.105:0/477555182 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f4c400778c0 0x7f4c40079d80 secure :-1 s=READY pgs=53 cs=0 l=1 rev1=1 crypto rx=0x7f4c3c005fd0 tx=0x7f4c3c009500 comp rx=0 tx=0).stop 2026-03-10T07:54:55.797 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:55.797+0000 7f4c59cbd700 1 -- 192.168.123.105:0/477555182 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f4c54102db0 msgr2=0x7f4c541993a0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:54:55.797 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:55.797+0000 7f4c59cbd700 1 --2- 192.168.123.105:0/477555182 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f4c54102db0 0x7f4c541993a0 secure :-1 s=READY pgs=36 cs=0 l=1 
rev1=1 crypto rx=0x7f4c4400b5c0 tx=0x7f4c4400f740 comp rx=0 tx=0).stop 2026-03-10T07:54:55.797 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:55.797+0000 7f4c59cbd700 1 -- 192.168.123.105:0/477555182 shutdown_connections 2026-03-10T07:54:55.798 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:55.797+0000 7f4c59cbd700 1 --2- 192.168.123.105:0/477555182 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f4c400778c0 0x7f4c40079d80 unknown :-1 s=CLOSED pgs=53 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:54:55.798 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:55.797+0000 7f4c59cbd700 1 --2- 192.168.123.105:0/477555182 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f4c54069180 0x7f4c54198e60 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:54:55.798 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:55.797+0000 7f4c59cbd700 1 --2- 192.168.123.105:0/477555182 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f4c54102db0 0x7f4c541993a0 unknown :-1 s=CLOSED pgs=36 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:54:55.798 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:55.797+0000 7f4c59cbd700 1 -- 192.168.123.105:0/477555182 >> 192.168.123.105:0/477555182 conn(0x7f4c54076b70 msgr2=0x7f4c54105170 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:54:55.798 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:55.797+0000 7f4c59cbd700 1 -- 192.168.123.105:0/477555182 shutdown_connections 2026-03-10T07:54:55.798 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:55.797+0000 7f4c59cbd700 1 -- 192.168.123.105:0/477555182 wait complete. 
2026-03-10T07:54:55.865 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:55.864+0000 7f10d51ca700 1 -- 192.168.123.105:0/2324304972 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f10d00ffc50 msgr2=0x7f10d0100030 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:54:55.865 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:55.864+0000 7f10d51ca700 1 --2- 192.168.123.105:0/2324304972 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f10d00ffc50 0x7f10d0100030 secure :-1 s=READY pgs=73 cs=0 l=1 rev1=1 crypto rx=0x7f10b8009b50 tx=0x7f10b8009e60 comp rx=0 tx=0).stop 2026-03-10T07:54:55.865 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:55.864+0000 7f10d51ca700 1 -- 192.168.123.105:0/2324304972 shutdown_connections 2026-03-10T07:54:55.865 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:55.864+0000 7f10d51ca700 1 --2- 192.168.123.105:0/2324304972 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f10d0100600 0x7f10d010d2c0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:54:55.865 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:55.864+0000 7f10d51ca700 1 --2- 192.168.123.105:0/2324304972 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f10d00ffc50 0x7f10d0100030 unknown :-1 s=CLOSED pgs=73 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:54:55.865 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:55.864+0000 7f10d51ca700 1 -- 192.168.123.105:0/2324304972 >> 192.168.123.105:0/2324304972 conn(0x7f10d00fb830 msgr2=0x7f10d00fdc50 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:54:55.865 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:55.865+0000 7f10d51ca700 1 -- 192.168.123.105:0/2324304972 shutdown_connections 2026-03-10T07:54:55.865 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:55.865+0000 7f10d51ca700 1 -- 192.168.123.105:0/2324304972 
wait complete. 2026-03-10T07:54:55.866 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:55.865+0000 7f10d51ca700 1 Processor -- start 2026-03-10T07:54:55.866 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:55.865+0000 7f10d51ca700 1 -- start start 2026-03-10T07:54:55.866 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:55.866+0000 7f10d51ca700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f10d00ffc50 0x7f10d0198d20 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:54:55.866 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:55.866+0000 7f10d51ca700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f10d0100600 0x7f10d0199260 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:54:55.866 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:55.866+0000 7f10d51ca700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f10d0199940 con 0x7f10d00ffc50 2026-03-10T07:54:55.866 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:55.866+0000 7f10d51ca700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f10d019d6d0 con 0x7f10d0100600 2026-03-10T07:54:55.866 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:55.866+0000 7f10ce59c700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f10d0100600 0x7f10d0199260 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:54:55.866 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:55.866+0000 7f10ce59c700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f10d0100600 0x7f10d0199260 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.108:3300/0 
says I am v2:192.168.123.105:46636/0 (socket says 192.168.123.105:46636) 2026-03-10T07:54:55.866 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:55.866+0000 7f10ce59c700 1 -- 192.168.123.105:0/4253653408 learned_addr learned my addr 192.168.123.105:0/4253653408 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T07:54:55.866 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:55.866+0000 7f10ce59c700 1 -- 192.168.123.105:0/4253653408 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f10d00ffc50 msgr2=0x7f10d0198d20 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:54:55.866 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:55.866+0000 7f10ced9d700 1 --2- 192.168.123.105:0/4253653408 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f10d00ffc50 0x7f10d0198d20 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:54:55.866 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:55.866+0000 7f10ce59c700 1 --2- 192.168.123.105:0/4253653408 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f10d00ffc50 0x7f10d0198d20 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:54:55.866 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:55.866+0000 7f10ce59c700 1 -- 192.168.123.105:0/4253653408 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f10b80097e0 con 0x7f10d0100600 2026-03-10T07:54:55.867 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:55.866+0000 7f10ced9d700 1 --2- 192.168.123.105:0/4253653408 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f10d00ffc50 0x7f10d0198d20 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request state changed! 
2026-03-10T07:54:55.867 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:55.867+0000 7f10ce59c700 1 --2- 192.168.123.105:0/4253653408 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f10d0100600 0x7f10d0199260 secure :-1 s=READY pgs=37 cs=0 l=1 rev1=1 crypto rx=0x7f10c000d900 tx=0x7f10c000dcc0 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:54:55.867 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:55.867+0000 7f10c7fff700 1 -- 192.168.123.105:0/4253653408 <== mon.1 v2:192.168.123.108:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f10c00041d0 con 0x7f10d0100600 2026-03-10T07:54:55.867 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:55.867+0000 7f10d51ca700 1 -- 192.168.123.105:0/4253653408 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f10d019d9b0 con 0x7f10d0100600 2026-03-10T07:54:55.867 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:55.867+0000 7f10d51ca700 1 -- 192.168.123.105:0/4253653408 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f10d019df00 con 0x7f10d0100600 2026-03-10T07:54:55.868 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:55.867+0000 7f10c7fff700 1 -- 192.168.123.105:0/4253653408 <== mon.1 v2:192.168.123.108:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7f10c0004d10 con 0x7f10d0100600 2026-03-10T07:54:55.868 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:55.867+0000 7f10c7fff700 1 -- 192.168.123.105:0/4253653408 <== mon.1 v2:192.168.123.108:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f10c000b750 con 0x7f10d0100600 2026-03-10T07:54:55.869 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:55.868+0000 7f10c7fff700 1 -- 192.168.123.105:0/4253653408 <== mon.1 v2:192.168.123.108:3300/0 4 ==== mgrmap(e 38) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f10c000b980 con 
0x7f10d0100600 2026-03-10T07:54:55.869 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:55.869+0000 7f10c7fff700 1 --2- 192.168.123.105:0/4253653408 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f10bc0778c0 0x7f10bc079d80 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:54:55.869 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:55.869+0000 7f10ced9d700 1 --2- 192.168.123.105:0/4253653408 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f10bc0778c0 0x7f10bc079d80 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:54:55.869 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:55.869+0000 7f10c7fff700 1 -- 192.168.123.105:0/4253653408 <== mon.1 v2:192.168.123.108:3300/0 5 ==== osd_map(70..70 src has 1..70) v4 ==== 6136+0+0 (secure 0 0 0) 0x7f10c00998a0 con 0x7f10d0100600 2026-03-10T07:54:55.870 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:55.869+0000 7f10d51ca700 1 -- 192.168.123.105:0/4253653408 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f10d010aa30 con 0x7f10d0100600 2026-03-10T07:54:55.870 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:55.870+0000 7f10ced9d700 1 --2- 192.168.123.105:0/4253653408 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f10bc0778c0 0x7f10bc079d80 secure :-1 s=READY pgs=54 cs=0 l=1 rev1=1 crypto rx=0x7f10b800b5c0 tx=0x7f10b80058e0 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:54:55.872 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:55.872+0000 7f10c7fff700 1 -- 192.168.123.105:0/4253653408 <== mon.1 v2:192.168.123.108:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 
==== 72+0+195034 (secure 0 0 0) 0x7f10c00620d0 con 0x7f10d0100600
2026-03-10T07:54:55.991 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:55.988+0000 7f10d51ca700 1 -- 192.168.123.105:0/4253653408 --> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] -- mgr_command(tid 0: {"prefix": "orch ps", "target": ["mon-mgr", ""]}) v1 -- 0x7f10d019a080 con 0x7f10bc0778c0
2026-03-10T07:54:55.993 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:55.993+0000 7f10c7fff700 1 -- 192.168.123.105:0/4253653408 <== mgr.34104 v2:192.168.123.105:6800/627686298 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+3576 (secure 0 0 0) 0x7f10d019a080 con 0x7f10bc0778c0
2026-03-10T07:54:55.994 INFO:teuthology.orchestra.run.vm05.stdout:NAME HOST PORTS STATUS REFRESHED AGE MEM USE MEM LIM VERSION IMAGE ID CONTAINER ID
2026-03-10T07:54:55.994 INFO:teuthology.orchestra.run.vm05.stdout:alertmanager.vm05 vm05 *:9093,9094 running (3m) 34s ago 8m 23.3M - 0.25.0 c8568f914cd2 ac15d5f35994
2026-03-10T07:54:55.994 INFO:teuthology.orchestra.run.vm05.stdout:ceph-exporter.vm05 vm05 running (8m) 34s ago 8m 9.98M - 18.2.1 5be31c24972a 26c4db858175
2026-03-10T07:54:55.994 INFO:teuthology.orchestra.run.vm05.stdout:ceph-exporter.vm08 vm08 running (8m) 14s ago 8m 10.5M - 18.2.1 5be31c24972a 209e2398a09c
2026-03-10T07:54:55.994 INFO:teuthology.orchestra.run.vm05.stdout:crash.vm05 vm05 running (2m) 34s ago 8m 7847k - 19.2.3-678-ge911bdeb 654f31e6858e daa831c74cf4
2026-03-10T07:54:55.994 INFO:teuthology.orchestra.run.vm05.stdout:crash.vm08 vm08 running (2m) 14s ago 8m 8262k - 19.2.3-678-ge911bdeb 654f31e6858e 668ac55c9722
2026-03-10T07:54:55.994 INFO:teuthology.orchestra.run.vm05.stdout:grafana.vm05 vm05 *:3000 running (3m) 34s ago 8m 81.7M - 10.4.0 c8b91775d855 6acb529ad951
2026-03-10T07:54:55.994 INFO:teuthology.orchestra.run.vm05.stdout:mds.cephfs.vm05.omfhnh vm05 running (6m) 34s ago 6m 186M - 18.2.1 5be31c24972a e23de179e09c
2026-03-10T07:54:55.994 INFO:teuthology.orchestra.run.vm05.stdout:mds.cephfs.vm05.pavqil vm05 running (6m) 34s ago 6m 17.6M - 18.2.1 5be31c24972a 5b9e5afa214c
2026-03-10T07:54:55.994 INFO:teuthology.orchestra.run.vm05.stdout:mds.cephfs.vm08.dgsaon vm08 running (6m) 14s ago 6m 18.5M - 18.2.1 5be31c24972a 1696aee522b5
2026-03-10T07:54:55.994 INFO:teuthology.orchestra.run.vm05.stdout:mds.cephfs.vm08.ybmbgd vm08 running (6m) 14s ago 6m 16.8M - 18.2.1 5be31c24972a 30b0e51cd2ed
2026-03-10T07:54:55.994 INFO:teuthology.orchestra.run.vm05.stdout:mgr.vm05.blexke vm05 *:8443,9283,8765 running (4m) 34s ago 9m 586M - 19.2.3-678-ge911bdeb 654f31e6858e 5467b6a3bd38
2026-03-10T07:54:55.994 INFO:teuthology.orchestra.run.vm05.stdout:mgr.vm08.orfpog vm08 *:8443,9283,8765 running (4m) 14s ago 8m 535M - 19.2.3-678-ge911bdeb 654f31e6858e 7a15d7811d5b
2026-03-10T07:54:55.994 INFO:teuthology.orchestra.run.vm05.stdout:mon.vm05 vm05 running (2m) 34s ago 9m 58.0M 2048M 19.2.3-678-ge911bdeb 654f31e6858e f02f076bb820
2026-03-10T07:54:55.994 INFO:teuthology.orchestra.run.vm05.stdout:mon.vm08 vm08 running (2m) 14s ago 8m 52.0M 2048M 19.2.3-678-ge911bdeb 654f31e6858e 73d9a504f360
2026-03-10T07:54:55.994 INFO:teuthology.orchestra.run.vm05.stdout:node-exporter.vm05 vm05 *:9100 running (4m) 34s ago 8m 10.3M - 1.7.0 72c9c2088986 7cd0b23b4118
2026-03-10T07:54:55.994 INFO:teuthology.orchestra.run.vm05.stdout:node-exporter.vm08 vm08 *:9100 running (4m) 14s ago 8m 9.95M - 1.7.0 72c9c2088986 3dd4d91d5881
2026-03-10T07:54:55.994 INFO:teuthology.orchestra.run.vm05.stdout:osd.0 vm05 running (112s) 34s ago 7m 201M 4096M 19.2.3-678-ge911bdeb 654f31e6858e b35fccc2a4d5
2026-03-10T07:54:55.994 INFO:teuthology.orchestra.run.vm05.stdout:osd.1 vm05 running (58s) 34s ago 7m 95.0M 4096M 19.2.3-678-ge911bdeb 654f31e6858e e3fe4ad5c6a3
2026-03-10T07:54:55.994 INFO:teuthology.orchestra.run.vm05.stdout:osd.2 vm05 running (36s) 34s ago 7m 13.0M 4096M 19.2.3-678-ge911bdeb 654f31e6858e 108a77e324b8
2026-03-10T07:54:55.994 INFO:teuthology.orchestra.run.vm05.stdout:osd.3 vm08 running (15s) 14s ago 7m 31.7M 4096M 19.2.3-678-ge911bdeb 654f31e6858e 3f280bcfe0f5
2026-03-10T07:54:55.994 INFO:teuthology.orchestra.run.vm05.stdout:osd.4 vm08 running (7m) 14s ago 7m 397M 4096M 18.2.1 5be31c24972a bd748b691ccd
2026-03-10T07:54:55.994 INFO:teuthology.orchestra.run.vm05.stdout:osd.5 vm08 running (6m) 14s ago 6m 342M 4096M 18.2.1 5be31c24972a 9f08820ae98b
2026-03-10T07:54:55.994 INFO:teuthology.orchestra.run.vm05.stdout:prometheus.vm05 vm05 *:9095 running (3m) 34s ago 8m 51.4M - 2.51.0 1d3b7f56885b c59a6be07563
2026-03-10T07:54:55.996 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:55.996+0000 7f10d51ca700 1 -- 192.168.123.105:0/4253653408 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f10bc0778c0 msgr2=0x7f10bc079d80 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T07:54:55.996 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:55.996+0000 7f10d51ca700 1 --2- 192.168.123.105:0/4253653408 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f10bc0778c0 0x7f10bc079d80 secure :-1 s=READY pgs=54 cs=0 l=1 rev1=1 crypto rx=0x7f10b800b5c0 tx=0x7f10b80058e0 comp rx=0 tx=0).stop
2026-03-10T07:54:55.997 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:55.996+0000 7f10d51ca700 1 -- 192.168.123.105:0/4253653408 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f10d0100600 msgr2=0x7f10d0199260 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T07:54:55.997 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:55.996+0000 7f10d51ca700 1 --2- 192.168.123.105:0/4253653408 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f10d0100600 0x7f10d0199260 secure :-1 s=READY pgs=37 cs=0 l=1 rev1=1 crypto rx=0x7f10c000d900 tx=0x7f10c000dcc0 comp rx=0 tx=0).stop
2026-03-10T07:54:55.997 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:55.996+0000
7f10d51ca700 1 -- 192.168.123.105:0/4253653408 shutdown_connections 2026-03-10T07:54:55.997 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:55.996+0000 7f10d51ca700 1 --2- 192.168.123.105:0/4253653408 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f10bc0778c0 0x7f10bc079d80 unknown :-1 s=CLOSED pgs=54 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:54:55.997 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:55.996+0000 7f10d51ca700 1 --2- 192.168.123.105:0/4253653408 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f10d00ffc50 0x7f10d0198d20 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:54:55.997 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:55.996+0000 7f10d51ca700 1 --2- 192.168.123.105:0/4253653408 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f10d0100600 0x7f10d0199260 unknown :-1 s=CLOSED pgs=37 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:54:55.997 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:55.996+0000 7f10d51ca700 1 -- 192.168.123.105:0/4253653408 >> 192.168.123.105:0/4253653408 conn(0x7f10d00fb830 msgr2=0x7f10d00fce10 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:54:55.997 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:55.996+0000 7f10d51ca700 1 -- 192.168.123.105:0/4253653408 shutdown_connections 2026-03-10T07:54:55.997 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:55.996+0000 7f10d51ca700 1 -- 192.168.123.105:0/4253653408 wait complete. 
2026-03-10T07:54:56.066 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:56.063+0000 7f9a2f152700 1 -- 192.168.123.105:0/3099950952 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f9a28102a00 msgr2=0x7f9a2810aef0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:54:56.066 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:56.063+0000 7f9a2f152700 1 --2- 192.168.123.105:0/3099950952 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f9a28102a00 0x7f9a2810aef0 secure :-1 s=READY pgs=74 cs=0 l=1 rev1=1 crypto rx=0x7f9a1800b3a0 tx=0x7f9a1800b6b0 comp rx=0 tx=0).stop 2026-03-10T07:54:56.068 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:56.065+0000 7f9a2f152700 1 -- 192.168.123.105:0/3099950952 shutdown_connections 2026-03-10T07:54:56.068 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:56.065+0000 7f9a2f152700 1 --2- 192.168.123.105:0/3099950952 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f9a28102a00 0x7f9a2810aef0 unknown :-1 s=CLOSED pgs=74 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:54:56.068 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:56.065+0000 7f9a2f152700 1 --2- 192.168.123.105:0/3099950952 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f9a281020e0 0x7f9a281024c0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:54:56.068 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:56.065+0000 7f9a2f152700 1 -- 192.168.123.105:0/3099950952 >> 192.168.123.105:0/3099950952 conn(0x7f9a280fb830 msgr2=0x7f9a280fdc50 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:54:56.068 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:56.067+0000 7f9a2f152700 1 -- 192.168.123.105:0/3099950952 shutdown_connections 2026-03-10T07:54:56.068 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:56.067+0000 7f9a2f152700 1 -- 192.168.123.105:0/3099950952 
wait complete. 2026-03-10T07:54:56.068 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:56.068+0000 7f9a2f152700 1 Processor -- start 2026-03-10T07:54:56.068 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:56.068+0000 7f9a2f152700 1 -- start start 2026-03-10T07:54:56.068 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:56.068+0000 7f9a2f152700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f9a281020e0 0x7f9a2819ca10 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:54:56.069 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:56.068+0000 7f9a2f152700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f9a28102a00 0x7f9a2819cf50 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:54:56.069 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:56.068+0000 7f9a2f152700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f9a2819d5e0 con 0x7f9a28102a00 2026-03-10T07:54:56.069 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:56.068+0000 7f9a2f152700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f9a28196a90 con 0x7f9a281020e0 2026-03-10T07:54:56.069 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:56.069+0000 7f9a27fff700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f9a28102a00 0x7f9a2819cf50 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:54:56.069 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:56.069+0000 7f9a27fff700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f9a28102a00 0x7f9a2819cf50 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 
says I am v2:192.168.123.105:51810/0 (socket says 192.168.123.105:51810) 2026-03-10T07:54:56.069 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:56.069+0000 7f9a27fff700 1 -- 192.168.123.105:0/3308585430 learned_addr learned my addr 192.168.123.105:0/3308585430 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T07:54:56.069 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:56.069+0000 7f9a27fff700 1 -- 192.168.123.105:0/3308585430 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f9a281020e0 msgr2=0x7f9a2819ca10 unknown :-1 s=STATE_CONNECTING l=1).mark_down 2026-03-10T07:54:56.069 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:56.069+0000 7f9a27fff700 1 --2- 192.168.123.105:0/3308585430 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f9a281020e0 0x7f9a2819ca10 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:54:56.069 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:56.069+0000 7f9a27fff700 1 -- 192.168.123.105:0/3308585430 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f9a1800b050 con 0x7f9a28102a00 2026-03-10T07:54:56.070 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:56.069+0000 7f9a27fff700 1 --2- 192.168.123.105:0/3308585430 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f9a28102a00 0x7f9a2819cf50 secure :-1 s=READY pgs=75 cs=0 l=1 rev1=1 crypto rx=0x7f9a18012a00 tx=0x7f9a18012ae0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:54:56.071 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:56.070+0000 7f9a25ffb700 1 -- 192.168.123.105:0/3308585430 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f9a1800e040 con 0x7f9a28102a00 2026-03-10T07:54:56.071 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:56.070+0000 7f9a25ffb700 1 -- 
192.168.123.105:0/3308585430 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7f9a180090d0 con 0x7f9a28102a00 2026-03-10T07:54:56.071 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:56.070+0000 7f9a25ffb700 1 -- 192.168.123.105:0/3308585430 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f9a180047b0 con 0x7f9a28102a00 2026-03-10T07:54:56.071 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:56.070+0000 7f9a2f152700 1 -- 192.168.123.105:0/3308585430 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f9a28196d10 con 0x7f9a28102a00 2026-03-10T07:54:56.071 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:56.070+0000 7f9a2f152700 1 -- 192.168.123.105:0/3308585430 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f9a28197200 con 0x7f9a28102a00 2026-03-10T07:54:56.075 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:56.071+0000 7f9a25ffb700 1 -- 192.168.123.105:0/3308585430 <== mon.0 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 38) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f9a18003a00 con 0x7f9a28102a00 2026-03-10T07:54:56.075 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:56.072+0000 7f9a2f152700 1 -- 192.168.123.105:0/3308585430 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f9a281085f0 con 0x7f9a28102a00 2026-03-10T07:54:56.075 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:56.072+0000 7f9a25ffb700 1 --2- 192.168.123.105:0/3308585430 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f9a14077870 0x7f9a14079d30 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:54:56.075 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:56.072+0000 7f9a25ffb700 1 -- 
192.168.123.105:0/3308585430 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(70..70 src has 1..70) v4 ==== 6136+0+0 (secure 0 0 0) 0x7f9a1809ae90 con 0x7f9a28102a00
2026-03-10T07:54:56.075 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:56.075+0000 7f9a2ceee700 1 --2- 192.168.123.105:0/3308585430 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f9a14077870 0x7f9a14079d30 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-10T07:54:56.075 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:56.075+0000 7f9a25ffb700 1 -- 192.168.123.105:0/3308585430 <== mon.0 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f9a180636c0 con 0x7f9a28102a00
2026-03-10T07:54:56.075 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:56.075+0000 7f9a2ceee700 1 --2- 192.168.123.105:0/3308585430 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f9a14077870 0x7f9a14079d30 secure :-1 s=READY pgs=55 cs=0 l=1 rev1=1 crypto rx=0x7f9a1000ba10 tx=0x7f9a1000b3f0 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-10T07:54:56.235 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:56.235+0000 7f9a2f152700 1 -- 192.168.123.105:0/3308585430 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "versions"} v 0) v1 -- 0x7f9a28197fa0 con 0x7f9a28102a00
2026-03-10T07:54:56.236 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:56.236+0000 7f9a25ffb700 1 -- 192.168.123.105:0/3308585430 <== mon.0 v2:192.168.123.105:3300/0 7 ==== mon_command_ack([{"prefix": "versions"}]=0 v0) v1 ==== 56+0+799 (secure 0 0 0) 0x7f9a18062e10 con 0x7f9a28102a00
2026-03-10T07:54:56.237 INFO:teuthology.orchestra.run.vm05.stdout:{
2026-03-10T07:54:56.237 INFO:teuthology.orchestra.run.vm05.stdout:    "mon": {
2026-03-10T07:54:56.237 INFO:teuthology.orchestra.run.vm05.stdout:        "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 2
2026-03-10T07:54:56.237 INFO:teuthology.orchestra.run.vm05.stdout:    },
2026-03-10T07:54:56.237 INFO:teuthology.orchestra.run.vm05.stdout:    "mgr": {
2026-03-10T07:54:56.237 INFO:teuthology.orchestra.run.vm05.stdout:        "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 2
2026-03-10T07:54:56.237 INFO:teuthology.orchestra.run.vm05.stdout:    },
2026-03-10T07:54:56.237 INFO:teuthology.orchestra.run.vm05.stdout:    "osd": {
2026-03-10T07:54:56.237 INFO:teuthology.orchestra.run.vm05.stdout:        "ceph version 18.2.1 (7fe91d5d5842e04be3b4f514d6dd990c54b29c76) reef (stable)": 2,
2026-03-10T07:54:56.237 INFO:teuthology.orchestra.run.vm05.stdout:        "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 4
2026-03-10T07:54:56.237 INFO:teuthology.orchestra.run.vm05.stdout:    },
2026-03-10T07:54:56.237 INFO:teuthology.orchestra.run.vm05.stdout:    "mds": {
2026-03-10T07:54:56.237 INFO:teuthology.orchestra.run.vm05.stdout:        "ceph version 18.2.1 (7fe91d5d5842e04be3b4f514d6dd990c54b29c76) reef (stable)": 4
2026-03-10T07:54:56.237 INFO:teuthology.orchestra.run.vm05.stdout:    },
2026-03-10T07:54:56.237 INFO:teuthology.orchestra.run.vm05.stdout:    "overall": {
2026-03-10T07:54:56.237 INFO:teuthology.orchestra.run.vm05.stdout:        "ceph version 18.2.1 (7fe91d5d5842e04be3b4f514d6dd990c54b29c76) reef (stable)": 6,
2026-03-10T07:54:56.237 INFO:teuthology.orchestra.run.vm05.stdout:        "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 8
2026-03-10T07:54:56.237 INFO:teuthology.orchestra.run.vm05.stdout:    }
2026-03-10T07:54:56.237 INFO:teuthology.orchestra.run.vm05.stdout:}
2026-03-10T07:54:56.239 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:56.238+0000 7f9a2f152700 1 --
192.168.123.105:0/3308585430 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f9a14077870 msgr2=0x7f9a14079d30 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:54:56.239 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:56.238+0000 7f9a2f152700 1 --2- 192.168.123.105:0/3308585430 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f9a14077870 0x7f9a14079d30 secure :-1 s=READY pgs=55 cs=0 l=1 rev1=1 crypto rx=0x7f9a1000ba10 tx=0x7f9a1000b3f0 comp rx=0 tx=0).stop 2026-03-10T07:54:56.239 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:56.238+0000 7f9a2f152700 1 -- 192.168.123.105:0/3308585430 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f9a28102a00 msgr2=0x7f9a2819cf50 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:54:56.239 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:56.238+0000 7f9a2f152700 1 --2- 192.168.123.105:0/3308585430 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f9a28102a00 0x7f9a2819cf50 secure :-1 s=READY pgs=75 cs=0 l=1 rev1=1 crypto rx=0x7f9a18012a00 tx=0x7f9a18012ae0 comp rx=0 tx=0).stop 2026-03-10T07:54:56.239 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:56.239+0000 7f9a2f152700 1 -- 192.168.123.105:0/3308585430 shutdown_connections 2026-03-10T07:54:56.239 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:56.239+0000 7f9a2f152700 1 --2- 192.168.123.105:0/3308585430 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f9a281020e0 0x7f9a2819ca10 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:54:56.239 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:56.239+0000 7f9a2f152700 1 --2- 192.168.123.105:0/3308585430 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f9a14077870 0x7f9a14079d30 unknown :-1 s=CLOSED pgs=55 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 
tx=0).stop 2026-03-10T07:54:56.239 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:56.239+0000 7f9a2f152700 1 --2- 192.168.123.105:0/3308585430 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f9a28102a00 0x7f9a2819cf50 unknown :-1 s=CLOSED pgs=75 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:54:56.239 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:56.239+0000 7f9a2f152700 1 -- 192.168.123.105:0/3308585430 >> 192.168.123.105:0/3308585430 conn(0x7f9a280fb830 msgr2=0x7f9a28105730 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:54:56.239 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:56.239+0000 7f9a2f152700 1 -- 192.168.123.105:0/3308585430 shutdown_connections 2026-03-10T07:54:56.239 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:56.239+0000 7f9a2f152700 1 -- 192.168.123.105:0/3308585430 wait complete. 2026-03-10T07:54:56.308 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:56.307+0000 7f8fcb0a4700 1 -- 192.168.123.105:0/4173400690 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f8fc41032e0 msgr2=0x7f8fc41036c0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:54:56.308 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:56.307+0000 7f8fcb0a4700 1 --2- 192.168.123.105:0/4173400690 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f8fc41032e0 0x7f8fc41036c0 secure :-1 s=READY pgs=76 cs=0 l=1 rev1=1 crypto rx=0x7f8fc0009b00 tx=0x7f8fc0009e10 comp rx=0 tx=0).stop 2026-03-10T07:54:56.309 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:56.308+0000 7f8fcb0a4700 1 -- 192.168.123.105:0/4173400690 shutdown_connections 2026-03-10T07:54:56.309 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:56.308+0000 7f8fcb0a4700 1 --2- 192.168.123.105:0/4173400690 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f8fc4103c90 0x7f8fc4107ce0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 
tx=0 comp rx=0 tx=0).stop 2026-03-10T07:54:56.309 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:56.308+0000 7f8fcb0a4700 1 --2- 192.168.123.105:0/4173400690 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f8fc41032e0 0x7f8fc41036c0 unknown :-1 s=CLOSED pgs=76 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:54:56.309 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:56.308+0000 7f8fcb0a4700 1 -- 192.168.123.105:0/4173400690 >> 192.168.123.105:0/4173400690 conn(0x7f8fc40feb50 msgr2=0x7f8fc4100f70 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:54:56.309 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:56.308+0000 7f8fcb0a4700 1 -- 192.168.123.105:0/4173400690 shutdown_connections 2026-03-10T07:54:56.309 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:56.308+0000 7f8fcb0a4700 1 -- 192.168.123.105:0/4173400690 wait complete. 2026-03-10T07:54:56.309 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:56.309+0000 7f8fcb0a4700 1 Processor -- start 2026-03-10T07:54:56.309 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:56.309+0000 7f8fcb0a4700 1 -- start start 2026-03-10T07:54:56.309 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:56.309+0000 7f8fcb0a4700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f8fc41032e0 0x7f8fc4198e80 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:54:56.310 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:56.309+0000 7f8fcb0a4700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f8fc4103c90 0x7f8fc41993c0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:54:56.310 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:56.309+0000 7f8fcb0a4700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f8fc4199aa0 con 0x7f8fc4103c90 
2026-03-10T07:54:56.310 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:56.309+0000 7f8fcb0a4700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f8fc419d830 con 0x7f8fc41032e0 2026-03-10T07:54:56.310 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:56.309+0000 7f8fc8e40700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f8fc41032e0 0x7f8fc4198e80 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:54:56.310 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:56.309+0000 7f8fc8e40700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f8fc41032e0 0x7f8fc4198e80 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.108:3300/0 says I am v2:192.168.123.105:46656/0 (socket says 192.168.123.105:46656) 2026-03-10T07:54:56.310 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:56.309+0000 7f8fc8e40700 1 -- 192.168.123.105:0/1990156488 learned_addr learned my addr 192.168.123.105:0/1990156488 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T07:54:56.310 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:56.310+0000 7f8fc8e40700 1 -- 192.168.123.105:0/1990156488 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f8fc4103c90 msgr2=0x7f8fc41993c0 unknown :-1 s=STATE_CONNECTING l=1).mark_down 2026-03-10T07:54:56.311 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:56.310+0000 7f8fbbfff700 1 --2- 192.168.123.105:0/1990156488 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f8fc4103c90 0x7f8fc41993c0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:54:56.311 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:56.311+0000 7f8fc8e40700 
1 --2- 192.168.123.105:0/1990156488 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f8fc4103c90 0x7f8fc41993c0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:54:56.311 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:56.311+0000 7f8fc8e40700 1 -- 192.168.123.105:0/1990156488 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f8fc00097e0 con 0x7f8fc41032e0 2026-03-10T07:54:56.311 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:56.311+0000 7f8fbbfff700 1 --2- 192.168.123.105:0/1990156488 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f8fc4103c90 0x7f8fc41993c0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request state changed! 2026-03-10T07:54:56.311 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:56.311+0000 7f8fc8e40700 1 --2- 192.168.123.105:0/1990156488 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f8fc41032e0 0x7f8fc4198e80 secure :-1 s=READY pgs=38 cs=0 l=1 rev1=1 crypto rx=0x7f8fc00048c0 tx=0x7f8fc00049a0 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:54:56.311 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:56.311+0000 7f8fb9ffb700 1 -- 192.168.123.105:0/1990156488 <== mon.1 v2:192.168.123.108:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f8fc001d070 con 0x7f8fc41032e0 2026-03-10T07:54:56.312 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:56.311+0000 7f8fcb0a4700 1 -- 192.168.123.105:0/1990156488 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f8fc419dab0 con 0x7f8fc41032e0 2026-03-10T07:54:56.312 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:56.311+0000 7f8fcb0a4700 1 -- 192.168.123.105:0/1990156488 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- 
mon_subscribe({osdmap=0}) v3 -- 0x7f8fc419dfa0 con 0x7f8fc41032e0 2026-03-10T07:54:56.312 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:56.312+0000 7f8fb9ffb700 1 -- 192.168.123.105:0/1990156488 <== mon.1 v2:192.168.123.108:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7f8fc000bc50 con 0x7f8fc41032e0 2026-03-10T07:54:56.312 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:56.312+0000 7f8fb9ffb700 1 -- 192.168.123.105:0/1990156488 <== mon.1 v2:192.168.123.108:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f8fc000f670 con 0x7f8fc41032e0 2026-03-10T07:54:56.314 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:56.313+0000 7f8fb9ffb700 1 -- 192.168.123.105:0/1990156488 <== mon.1 v2:192.168.123.108:3300/0 4 ==== mgrmap(e 38) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f8fc000f7d0 con 0x7f8fc41032e0 2026-03-10T07:54:56.314 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:56.313+0000 7f8fcb0a4700 1 -- 192.168.123.105:0/1990156488 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f8fb0005320 con 0x7f8fc41032e0 2026-03-10T07:54:56.314 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:56.313+0000 7f8fb9ffb700 1 --2- 192.168.123.105:0/1990156488 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f8fac077870 0x7f8fac079d30 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:54:56.314 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:56.313+0000 7f8fb9ffb700 1 -- 192.168.123.105:0/1990156488 <== mon.1 v2:192.168.123.108:3300/0 5 ==== osd_map(70..70 src has 1..70) v4 ==== 6136+0+0 (secure 0 0 0) 0x7f8fc009b0b0 con 0x7f8fc41032e0 2026-03-10T07:54:56.316 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:56.313+0000 7f8fbbfff700 1 --2- 192.168.123.105:0/1990156488 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] 
conn(0x7f8fac077870 0x7f8fac079d30 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:54:56.316 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:56.315+0000 7f8fbbfff700 1 --2- 192.168.123.105:0/1990156488 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f8fac077870 0x7f8fac079d30 secure :-1 s=READY pgs=56 cs=0 l=1 rev1=1 crypto rx=0x7f8fc419a4a0 tx=0x7f8fb4008040 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:54:56.319 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:56.319+0000 7f8fb9ffb700 1 -- 192.168.123.105:0/1990156488 <== mon.1 v2:192.168.123.108:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f8fc0063990 con 0x7f8fc41032e0 2026-03-10T07:54:56.471 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:56.470+0000 7f8fcb0a4700 1 -- 192.168.123.105:0/1990156488 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "fs dump"} v 0) v1 -- 0x7f8fb0006200 con 0x7f8fc41032e0 2026-03-10T07:54:56.471 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:56.471+0000 7f8fb9ffb700 1 -- 192.168.123.105:0/1990156488 <== mon.1 v2:192.168.123.108:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump"}]=0 dumped fsmap epoch 12 v12) v1 ==== 76+0+1915 (secure 0 0 0) 0x7f8fc00630e0 con 0x7f8fc41032e0 2026-03-10T07:54:56.473 INFO:teuthology.orchestra.run.vm05.stdout:e12 2026-03-10T07:54:56.473 INFO:teuthology.orchestra.run.vm05.stdout:btime 1970-01-01T00:00:00:000000+0000 2026-03-10T07:54:56.473 INFO:teuthology.orchestra.run.vm05.stdout:enable_multiple, ever_enabled_multiple: 1,1 2026-03-10T07:54:56.473 INFO:teuthology.orchestra.run.vm05.stdout:default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode 
in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2} 2026-03-10T07:54:56.473 INFO:teuthology.orchestra.run.vm05.stdout:legacy client fscid: 1 2026-03-10T07:54:56.473 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-10T07:54:56.473 INFO:teuthology.orchestra.run.vm05.stdout:Filesystem 'cephfs' (1) 2026-03-10T07:54:56.473 INFO:teuthology.orchestra.run.vm05.stdout:fs_name cephfs 2026-03-10T07:54:56.473 INFO:teuthology.orchestra.run.vm05.stdout:epoch 12 2026-03-10T07:54:56.473 INFO:teuthology.orchestra.run.vm05.stdout:flags 12 joinable allow_snaps allow_multimds_snaps 2026-03-10T07:54:56.473 INFO:teuthology.orchestra.run.vm05.stdout:created 2026-03-10T07:48:24.309293+0000 2026-03-10T07:54:56.473 INFO:teuthology.orchestra.run.vm05.stdout:modified 2026-03-10T07:48:33.425187+0000 2026-03-10T07:54:56.473 INFO:teuthology.orchestra.run.vm05.stdout:tableserver 0 2026-03-10T07:54:56.473 INFO:teuthology.orchestra.run.vm05.stdout:root 0 2026-03-10T07:54:56.473 INFO:teuthology.orchestra.run.vm05.stdout:session_timeout 60 2026-03-10T07:54:56.473 INFO:teuthology.orchestra.run.vm05.stdout:session_autoclose 300 2026-03-10T07:54:56.473 INFO:teuthology.orchestra.run.vm05.stdout:max_file_size 1099511627776 2026-03-10T07:54:56.473 INFO:teuthology.orchestra.run.vm05.stdout:max_xattr_size 65536 2026-03-10T07:54:56.473 INFO:teuthology.orchestra.run.vm05.stdout:required_client_features {} 2026-03-10T07:54:56.473 INFO:teuthology.orchestra.run.vm05.stdout:last_failure 0 2026-03-10T07:54:56.473 INFO:teuthology.orchestra.run.vm05.stdout:last_failure_osd_epoch 39 2026-03-10T07:54:56.473 INFO:teuthology.orchestra.run.vm05.stdout:compat compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2} 2026-03-10T07:54:56.473 
INFO:teuthology.orchestra.run.vm05.stdout:max_mds 1 2026-03-10T07:54:56.473 INFO:teuthology.orchestra.run.vm05.stdout:in 0 2026-03-10T07:54:56.473 INFO:teuthology.orchestra.run.vm05.stdout:up {0=24297} 2026-03-10T07:54:56.473 INFO:teuthology.orchestra.run.vm05.stdout:failed 2026-03-10T07:54:56.473 INFO:teuthology.orchestra.run.vm05.stdout:damaged 2026-03-10T07:54:56.473 INFO:teuthology.orchestra.run.vm05.stdout:stopped 2026-03-10T07:54:56.473 INFO:teuthology.orchestra.run.vm05.stdout:data_pools [3] 2026-03-10T07:54:56.473 INFO:teuthology.orchestra.run.vm05.stdout:metadata_pool 2 2026-03-10T07:54:56.473 INFO:teuthology.orchestra.run.vm05.stdout:inline_data enabled 2026-03-10T07:54:56.473 INFO:teuthology.orchestra.run.vm05.stdout:balancer 2026-03-10T07:54:56.473 INFO:teuthology.orchestra.run.vm05.stdout:bal_rank_mask -1 2026-03-10T07:54:56.473 INFO:teuthology.orchestra.run.vm05.stdout:standby_count_wanted 1 2026-03-10T07:54:56.473 INFO:teuthology.orchestra.run.vm05.stdout:qdb_cluster leader: 0 members: 2026-03-10T07:54:56.473 INFO:teuthology.orchestra.run.vm05.stdout:[mds.cephfs.vm05.omfhnh{0:24297} state up:active seq 5 join_fscid=1 addr [v2:192.168.123.105:6826/723078808,v1:192.168.123.105:6827/723078808] compat {c=[1],r=[1],i=[7ff]}] 2026-03-10T07:54:56.473 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-10T07:54:56.473 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-10T07:54:56.473 INFO:teuthology.orchestra.run.vm05.stdout:Standby daemons: 2026-03-10T07:54:56.473 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-10T07:54:56.473 INFO:teuthology.orchestra.run.vm05.stdout:[mds.cephfs.vm05.pavqil{-1:14512} state up:standby seq 2 join_fscid=1 addr [v2:192.168.123.105:6828/427555544,v1:192.168.123.105:6829/427555544] compat {c=[1],r=[1],i=[7ff]}] 2026-03-10T07:54:56.473 INFO:teuthology.orchestra.run.vm05.stdout:[mds.cephfs.vm08.ybmbgd{-1:14524} state up:standby seq 1 join_fscid=1 addr [v2:192.168.123.108:6824/3815585988,v1:192.168.123.108:6825/3815585988] compat 
{c=[1],r=[1],i=[7ff]}] 2026-03-10T07:54:56.473 INFO:teuthology.orchestra.run.vm05.stdout:[mds.cephfs.vm08.dgsaon{-1:24313} state up:standby seq 2 join_fscid=1 addr [v2:192.168.123.108:6826/2963085185,v1:192.168.123.108:6827/2963085185] compat {c=[1],r=[1],i=[7ff]}] 2026-03-10T07:54:56.475 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:56.475+0000 7f8fcb0a4700 1 -- 192.168.123.105:0/1990156488 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f8fac077870 msgr2=0x7f8fac079d30 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:54:56.475 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:56.475+0000 7f8fcb0a4700 1 --2- 192.168.123.105:0/1990156488 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f8fac077870 0x7f8fac079d30 secure :-1 s=READY pgs=56 cs=0 l=1 rev1=1 crypto rx=0x7f8fc419a4a0 tx=0x7f8fb4008040 comp rx=0 tx=0).stop 2026-03-10T07:54:56.475 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:56.475+0000 7f8fcb0a4700 1 -- 192.168.123.105:0/1990156488 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f8fc41032e0 msgr2=0x7f8fc4198e80 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:54:56.475 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:56.475+0000 7f8fcb0a4700 1 --2- 192.168.123.105:0/1990156488 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f8fc41032e0 0x7f8fc4198e80 secure :-1 s=READY pgs=38 cs=0 l=1 rev1=1 crypto rx=0x7f8fc00048c0 tx=0x7f8fc00049a0 comp rx=0 tx=0).stop 2026-03-10T07:54:56.475 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:56.475+0000 7f8fcb0a4700 1 -- 192.168.123.105:0/1990156488 shutdown_connections 2026-03-10T07:54:56.475 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:56.475+0000 7f8fcb0a4700 1 --2- 192.168.123.105:0/1990156488 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f8fc41032e0 0x7f8fc4198e80 unknown :-1 s=CLOSED 
pgs=38 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:54:56.475 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:56.475+0000 7f8fcb0a4700 1 --2- 192.168.123.105:0/1990156488 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f8fac077870 0x7f8fac079d30 unknown :-1 s=CLOSED pgs=56 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:54:56.476 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:56.475+0000 7f8fcb0a4700 1 --2- 192.168.123.105:0/1990156488 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f8fc4103c90 0x7f8fc41993c0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:54:56.476 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:56.475+0000 7f8fcb0a4700 1 -- 192.168.123.105:0/1990156488 >> 192.168.123.105:0/1990156488 conn(0x7f8fc40feb50 msgr2=0x7f8fc4100200 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:54:56.476 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:56.475+0000 7f8fcb0a4700 1 -- 192.168.123.105:0/1990156488 shutdown_connections 2026-03-10T07:54:56.476 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:56.475+0000 7f8fcb0a4700 1 -- 192.168.123.105:0/1990156488 wait complete. 
2026-03-10T07:54:56.476 INFO:teuthology.orchestra.run.vm05.stderr:dumped fsmap epoch 12 2026-03-10T07:54:56.547 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:56.546+0000 7f3b38d28700 1 -- 192.168.123.105:0/2414660311 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f3b34103cf0 msgr2=0x7f3b34107d40 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:54:56.547 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:56.546+0000 7f3b38d28700 1 --2- 192.168.123.105:0/2414660311 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f3b34103cf0 0x7f3b34107d40 secure :-1 s=READY pgs=39 cs=0 l=1 rev1=1 crypto rx=0x7f3b24009b00 tx=0x7f3b24009e10 comp rx=0 tx=0).stop 2026-03-10T07:54:56.547 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:56.546+0000 7f3b38d28700 1 -- 192.168.123.105:0/2414660311 shutdown_connections 2026-03-10T07:54:56.547 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:56.547+0000 7f3b38d28700 1 --2- 192.168.123.105:0/2414660311 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f3b34103cf0 0x7f3b34107d40 unknown :-1 s=CLOSED pgs=39 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:54:56.547 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:56.547+0000 7f3b38d28700 1 --2- 192.168.123.105:0/2414660311 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3b34103340 0x7f3b34103720 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:54:56.547 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:56.547+0000 7f3b38d28700 1 -- 192.168.123.105:0/2414660311 >> 192.168.123.105:0/2414660311 conn(0x7f3b340feb90 msgr2=0x7f3b34100fb0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:54:56.547 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:56.547+0000 7f3b38d28700 1 -- 192.168.123.105:0/2414660311 shutdown_connections 2026-03-10T07:54:56.547 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:56.547+0000 7f3b38d28700 1 -- 192.168.123.105:0/2414660311 wait complete. 2026-03-10T07:54:56.548 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:56.547+0000 7f3b38d28700 1 Processor -- start 2026-03-10T07:54:56.548 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:56.548+0000 7f3b38d28700 1 -- start start 2026-03-10T07:54:56.548 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:56.548+0000 7f3b38d28700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3b34103340 0x7f3b34198de0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:54:56.548 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:56.548+0000 7f3b38d28700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f3b34103cf0 0x7f3b34199320 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:54:56.548 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:56.548+0000 7f3b38d28700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f3b34199a00 con 0x7f3b34103340 2026-03-10T07:54:56.548 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:56.548+0000 7f3b38d28700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f3b3419d790 con 0x7f3b34103cf0 2026-03-10T07:54:56.548 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:56.548+0000 7f3b3259c700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3b34103340 0x7f3b34198de0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:54:56.548 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:56.548+0000 7f3b31d9b700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f3b34103cf0 0x7f3b34199320 unknown :-1 
s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:54:56.549 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:56.548+0000 7f3b3259c700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3b34103340 0x7f3b34198de0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.105:51836/0 (socket says 192.168.123.105:51836) 2026-03-10T07:54:56.549 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:56.548+0000 7f3b3259c700 1 -- 192.168.123.105:0/968319078 learned_addr learned my addr 192.168.123.105:0/968319078 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T07:54:56.549 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:56.548+0000 7f3b3259c700 1 -- 192.168.123.105:0/968319078 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f3b34103cf0 msgr2=0x7f3b34199320 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:54:56.549 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:56.548+0000 7f3b3259c700 1 --2- 192.168.123.105:0/968319078 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f3b34103cf0 0x7f3b34199320 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:54:56.549 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:56.548+0000 7f3b3259c700 1 -- 192.168.123.105:0/968319078 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f3b1c009710 con 0x7f3b34103340 2026-03-10T07:54:56.549 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:56.549+0000 7f3b31d9b700 1 --2- 192.168.123.105:0/968319078 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f3b34103cf0 0x7f3b34199320 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 
tx=0).handle_auth_reply_more state changed! 2026-03-10T07:54:56.549 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:56.549+0000 7f3b3259c700 1 --2- 192.168.123.105:0/968319078 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3b34103340 0x7f3b34198de0 secure :-1 s=READY pgs=77 cs=0 l=1 rev1=1 crypto rx=0x7f3b1c00eee0 tx=0x7f3b1c00c5b0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:54:56.554 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:56.549+0000 7f3b2b7fe700 1 -- 192.168.123.105:0/968319078 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f3b1c00ce10 con 0x7f3b34103340 2026-03-10T07:54:56.554 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:56.549+0000 7f3b2b7fe700 1 -- 192.168.123.105:0/968319078 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7f3b1c004500 con 0x7f3b34103340 2026-03-10T07:54:56.554 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:56.549+0000 7f3b2b7fe700 1 -- 192.168.123.105:0/968319078 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f3b1c005490 con 0x7f3b34103340 2026-03-10T07:54:56.554 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:56.549+0000 7f3b38d28700 1 -- 192.168.123.105:0/968319078 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f3b240097e0 con 0x7f3b34103340 2026-03-10T07:54:56.554 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:56.549+0000 7f3b38d28700 1 -- 192.168.123.105:0/968319078 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f3b3419dd50 con 0x7f3b34103340 2026-03-10T07:54:56.554 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:56.551+0000 7f3b38d28700 1 -- 192.168.123.105:0/968319078 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- 
mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f3b3410b740 con 0x7f3b34103340 2026-03-10T07:54:56.554 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:56.551+0000 7f3b2b7fe700 1 -- 192.168.123.105:0/968319078 <== mon.0 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 38) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f3b1c0055f0 con 0x7f3b34103340 2026-03-10T07:54:56.555 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:56.555+0000 7f3b2b7fe700 1 --2- 192.168.123.105:0/968319078 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f3b20077870 0x7f3b20079d30 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:54:56.555 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:56.555+0000 7f3b2b7fe700 1 -- 192.168.123.105:0/968319078 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(70..70 src has 1..70) v4 ==== 6136+0+0 (secure 0 0 0) 0x7f3b1c014070 con 0x7f3b34103340 2026-03-10T07:54:56.555 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:56.555+0000 7f3b31d9b700 1 --2- 192.168.123.105:0/968319078 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f3b20077870 0x7f3b20079d30 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:54:56.555 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:56.555+0000 7f3b2b7fe700 1 -- 192.168.123.105:0/968319078 <== mon.0 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f3b1c062880 con 0x7f3b34103340 2026-03-10T07:54:56.555 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:56.555+0000 7f3b31d9b700 1 --2- 192.168.123.105:0/968319078 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f3b20077870 0x7f3b20079d30 secure :-1 s=READY pgs=57 cs=0 l=1 rev1=1 crypto rx=0x7f3b3419a400 
tx=0x7f3b2400b560 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:54:56.586 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:54:56 vm05.local ceph-mon[130117]: pgmap v104: 65 pgs: 65 active+clean; 255 MiB data, 1.7 GiB used, 118 GiB / 120 GiB avail 2026-03-10T07:54:56.587 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:54:56 vm05.local ceph-mon[130117]: from='client.34226 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T07:54:56.587 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:54:56 vm05.local ceph-mon[130117]: from='client.? 192.168.123.105:0/3308585430' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T07:54:56.587 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:54:56 vm05.local ceph-mon[130117]: from='client.? 192.168.123.105:0/1990156488' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch 2026-03-10T07:54:56.681 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:56.680+0000 7f3b38d28700 1 -- 192.168.123.105:0/968319078 --> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7f3b3419e030 con 0x7f3b20077870 2026-03-10T07:54:56.681 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:56.681+0000 7f3b2b7fe700 1 -- 192.168.123.105:0/968319078 <== mgr.34104 v2:192.168.123.105:6800/627686298 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+401 (secure 0 0 0) 0x7f3b3419e030 con 0x7f3b20077870 2026-03-10T07:54:56.681 INFO:teuthology.orchestra.run.vm05.stdout:{ 2026-03-10T07:54:56.682 INFO:teuthology.orchestra.run.vm05.stdout: "target_image": "quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc", 2026-03-10T07:54:56.682 INFO:teuthology.orchestra.run.vm05.stdout: "in_progress": true, 2026-03-10T07:54:56.682 
INFO:teuthology.orchestra.run.vm05.stdout: "which": "Upgrading all daemon types on all hosts", 2026-03-10T07:54:56.682 INFO:teuthology.orchestra.run.vm05.stdout: "services_complete": [ 2026-03-10T07:54:56.682 INFO:teuthology.orchestra.run.vm05.stdout: "mgr", 2026-03-10T07:54:56.682 INFO:teuthology.orchestra.run.vm05.stdout: "crash", 2026-03-10T07:54:56.682 INFO:teuthology.orchestra.run.vm05.stdout: "mon" 2026-03-10T07:54:56.682 INFO:teuthology.orchestra.run.vm05.stdout: ], 2026-03-10T07:54:56.682 INFO:teuthology.orchestra.run.vm05.stdout: "progress": "10/23 daemons upgraded", 2026-03-10T07:54:56.682 INFO:teuthology.orchestra.run.vm05.stdout: "message": "Currently upgrading osd daemons", 2026-03-10T07:54:56.682 INFO:teuthology.orchestra.run.vm05.stdout: "is_paused": false 2026-03-10T07:54:56.682 INFO:teuthology.orchestra.run.vm05.stdout:} 2026-03-10T07:54:56.684 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:56.684+0000 7f3b38d28700 1 -- 192.168.123.105:0/968319078 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f3b20077870 msgr2=0x7f3b20079d30 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:54:56.684 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:56.684+0000 7f3b38d28700 1 --2- 192.168.123.105:0/968319078 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f3b20077870 0x7f3b20079d30 secure :-1 s=READY pgs=57 cs=0 l=1 rev1=1 crypto rx=0x7f3b3419a400 tx=0x7f3b2400b560 comp rx=0 tx=0).stop 2026-03-10T07:54:56.684 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:56.684+0000 7f3b38d28700 1 -- 192.168.123.105:0/968319078 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3b34103340 msgr2=0x7f3b34198de0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:54:56.684 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:56.684+0000 7f3b38d28700 1 --2- 192.168.123.105:0/968319078 >> 
[v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3b34103340 0x7f3b34198de0 secure :-1 s=READY pgs=77 cs=0 l=1 rev1=1 crypto rx=0x7f3b1c00eee0 tx=0x7f3b1c00c5b0 comp rx=0 tx=0).stop 2026-03-10T07:54:56.685 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:56.684+0000 7f3b38d28700 1 -- 192.168.123.105:0/968319078 shutdown_connections 2026-03-10T07:54:56.685 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:56.685+0000 7f3b38d28700 1 --2- 192.168.123.105:0/968319078 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f3b20077870 0x7f3b20079d30 unknown :-1 s=CLOSED pgs=57 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:54:56.685 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:56.685+0000 7f3b38d28700 1 --2- 192.168.123.105:0/968319078 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3b34103340 0x7f3b34198de0 unknown :-1 s=CLOSED pgs=77 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:54:56.685 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:56.685+0000 7f3b38d28700 1 --2- 192.168.123.105:0/968319078 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f3b34103cf0 0x7f3b34199320 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:54:56.685 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:56.685+0000 7f3b38d28700 1 -- 192.168.123.105:0/968319078 >> 192.168.123.105:0/968319078 conn(0x7f3b340feb90 msgr2=0x7f3b34100150 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:54:56.685 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:56.685+0000 7f3b38d28700 1 -- 192.168.123.105:0/968319078 shutdown_connections 2026-03-10T07:54:56.685 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:56.685+0000 7f3b38d28700 1 -- 192.168.123.105:0/968319078 wait complete. 
2026-03-10T07:54:56.757 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:56.756+0000 7fdccf79a700 1 -- 192.168.123.105:0/1345591560 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fdcc80731c0 msgr2=0x7fdcc80735a0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:54:56.757 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:56.756+0000 7fdccf79a700 1 --2- 192.168.123.105:0/1345591560 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fdcc80731c0 0x7fdcc80735a0 secure :-1 s=READY pgs=78 cs=0 l=1 rev1=1 crypto rx=0x7fdcbc009b50 tx=0x7fdcbc009e60 comp rx=0 tx=0).stop 2026-03-10T07:54:56.757 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:56.757+0000 7fdccf79a700 1 -- 192.168.123.105:0/1345591560 shutdown_connections 2026-03-10T07:54:56.757 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:56.757+0000 7fdccf79a700 1 --2- 192.168.123.105:0/1345591560 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fdcc8073ae0 0x7fdcc810d1c0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:54:56.757 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:56.757+0000 7fdccf79a700 1 --2- 192.168.123.105:0/1345591560 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fdcc80731c0 0x7fdcc80735a0 unknown :-1 s=CLOSED pgs=78 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:54:56.757 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:56.757+0000 7fdccf79a700 1 -- 192.168.123.105:0/1345591560 >> 192.168.123.105:0/1345591560 conn(0x7fdcc80fc9b0 msgr2=0x7fdcc80fedd0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:54:56.757 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:56.757+0000 7fdccf79a700 1 -- 192.168.123.105:0/1345591560 shutdown_connections 2026-03-10T07:54:56.757 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:56.757+0000 7fdccf79a700 1 -- 192.168.123.105:0/1345591560 
wait complete. 2026-03-10T07:54:56.758 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:56.757+0000 7fdccf79a700 1 Processor -- start 2026-03-10T07:54:56.758 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:56.758+0000 7fdccf79a700 1 -- start start 2026-03-10T07:54:56.758 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:56.758+0000 7fdccf79a700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fdcc80731c0 0x7fdcc8198dd0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:54:56.758 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:56.758+0000 7fdccf79a700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fdcc8073ae0 0x7fdcc8199310 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:54:56.758 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:56.758+0000 7fdccf79a700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fdcc81999f0 con 0x7fdcc8073ae0 2026-03-10T07:54:56.758 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:56.758+0000 7fdccf79a700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fdcc819d780 con 0x7fdcc80731c0 2026-03-10T07:54:56.758 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:56.758+0000 7fdcccd35700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fdcc8073ae0 0x7fdcc8199310 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:54:56.758 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:56.758+0000 7fdcccd35700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fdcc8073ae0 0x7fdcc8199310 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 
says I am v2:192.168.123.105:51856/0 (socket says 192.168.123.105:51856) 2026-03-10T07:54:56.758 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:56.758+0000 7fdccd536700 1 --2- 192.168.123.105:0/2257446671 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fdcc80731c0 0x7fdcc8198dd0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:54:56.758 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:56.758+0000 7fdcccd35700 1 -- 192.168.123.105:0/2257446671 learned_addr learned my addr 192.168.123.105:0/2257446671 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T07:54:56.759 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:56.758+0000 7fdcccd35700 1 -- 192.168.123.105:0/2257446671 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fdcc80731c0 msgr2=0x7fdcc8198dd0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:54:56.759 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:56.758+0000 7fdcccd35700 1 --2- 192.168.123.105:0/2257446671 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fdcc80731c0 0x7fdcc8198dd0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:54:56.759 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:56.758+0000 7fdcccd35700 1 -- 192.168.123.105:0/2257446671 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fdcbc0097e0 con 0x7fdcc8073ae0 2026-03-10T07:54:56.759 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:56.759+0000 7fdcccd35700 1 --2- 192.168.123.105:0/2257446671 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fdcc8073ae0 0x7fdcc8199310 secure :-1 s=READY pgs=79 cs=0 l=1 rev1=1 crypto rx=0x7fdcc400eb10 tx=0x7fdcc400eed0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 
2026-03-10T07:54:56.760 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:56.759+0000 7fdcba7fc700 1 -- 192.168.123.105:0/2257446671 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fdcc400cca0 con 0x7fdcc8073ae0 2026-03-10T07:54:56.760 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:56.759+0000 7fdcba7fc700 1 -- 192.168.123.105:0/2257446671 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7fdcc400ce00 con 0x7fdcc8073ae0 2026-03-10T07:54:56.760 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:56.759+0000 7fdcba7fc700 1 -- 192.168.123.105:0/2257446671 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fdcc4018910 con 0x7fdcc8073ae0 2026-03-10T07:54:56.760 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:56.759+0000 7fdccf79a700 1 -- 192.168.123.105:0/2257446671 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fdcc819da60 con 0x7fdcc8073ae0 2026-03-10T07:54:56.760 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:56.759+0000 7fdccf79a700 1 -- 192.168.123.105:0/2257446671 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fdcc819dfb0 con 0x7fdcc8073ae0 2026-03-10T07:54:56.761 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:56.761+0000 7fdccf79a700 1 -- 192.168.123.105:0/2257446671 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fdcc810a8c0 con 0x7fdcc8073ae0 2026-03-10T07:54:56.761 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:56.761+0000 7fdcba7fc700 1 -- 192.168.123.105:0/2257446671 <== mon.0 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 38) v1 ==== 100115+0+0 (secure 0 0 0) 0x7fdcc4018a70 con 0x7fdcc8073ae0 2026-03-10T07:54:56.764 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:56.764+0000 
7fdcba7fc700 1 --2- 192.168.123.105:0/2257446671 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7fdcb40778c0 0x7fdcb4079d80 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:54:56.765 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:56.764+0000 7fdcba7fc700 1 -- 192.168.123.105:0/2257446671 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(70..70 src has 1..70) v4 ==== 6136+0+0 (secure 0 0 0) 0x7fdcc4014070 con 0x7fdcc8073ae0 2026-03-10T07:54:56.765 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:56.764+0000 7fdccd536700 1 --2- 192.168.123.105:0/2257446671 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7fdcb40778c0 0x7fdcb4079d80 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:54:56.765 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:56.765+0000 7fdccd536700 1 --2- 192.168.123.105:0/2257446671 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7fdcb40778c0 0x7fdcb4079d80 secure :-1 s=READY pgs=58 cs=0 l=1 rev1=1 crypto rx=0x7fdcbc006010 tx=0x7fdcbc00b540 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:54:56.765 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:56.765+0000 7fdcba7fc700 1 -- 192.168.123.105:0/2257446671 <== mon.0 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7fdcc4062b50 con 0x7fdcc8073ae0 2026-03-10T07:54:56.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:54:56 vm08.local ceph-mon[107898]: pgmap v104: 65 pgs: 65 active+clean; 255 MiB data, 1.7 GiB used, 118 GiB / 120 GiB avail 2026-03-10T07:54:56.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:54:56 vm08.local ceph-mon[107898]: from='client.34226 -' 
entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T07:54:56.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:54:56 vm08.local ceph-mon[107898]: from='client.? 192.168.123.105:0/3308585430' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T07:54:56.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:54:56 vm08.local ceph-mon[107898]: from='client.? 192.168.123.105:0/1990156488' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch 2026-03-10T07:54:56.927 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:56.926+0000 7fdccf79a700 1 -- 192.168.123.105:0/2257446671 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "health", "detail": "detail"} v 0) v1 -- 0x7fdcc804ea90 con 0x7fdcc8073ae0 2026-03-10T07:54:56.927 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:56.927+0000 7fdcba7fc700 1 -- 192.168.123.105:0/2257446671 <== mon.0 v2:192.168.123.105:3300/0 7 ==== mon_command_ack([{"prefix": "health", "detail": "detail"}]=0 v0) v1 ==== 74+0+201 (secure 0 0 0) 0x7fdcc40622a0 con 0x7fdcc8073ae0 2026-03-10T07:54:56.927 INFO:teuthology.orchestra.run.vm05.stdout:HEALTH_WARN 1 filesystem with deprecated feature inline_data 2026-03-10T07:54:56.927 INFO:teuthology.orchestra.run.vm05.stdout:[WRN] FS_INLINE_DATA_DEPRECATED: 1 filesystem with deprecated feature inline_data 2026-03-10T07:54:56.927 INFO:teuthology.orchestra.run.vm05.stdout: fs cephfs has deprecated feature inline_data enabled. 
2026-03-10T07:54:56.930 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:56.929+0000 7fdccf79a700 1 -- 192.168.123.105:0/2257446671 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7fdcb40778c0 msgr2=0x7fdcb4079d80 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:54:56.930 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:56.929+0000 7fdccf79a700 1 --2- 192.168.123.105:0/2257446671 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7fdcb40778c0 0x7fdcb4079d80 secure :-1 s=READY pgs=58 cs=0 l=1 rev1=1 crypto rx=0x7fdcbc006010 tx=0x7fdcbc00b540 comp rx=0 tx=0).stop 2026-03-10T07:54:56.930 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:56.929+0000 7fdccf79a700 1 -- 192.168.123.105:0/2257446671 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fdcc8073ae0 msgr2=0x7fdcc8199310 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:54:56.930 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:56.930+0000 7fdccf79a700 1 --2- 192.168.123.105:0/2257446671 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fdcc8073ae0 0x7fdcc8199310 secure :-1 s=READY pgs=79 cs=0 l=1 rev1=1 crypto rx=0x7fdcc400eb10 tx=0x7fdcc400eed0 comp rx=0 tx=0).stop 2026-03-10T07:54:56.930 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:56.930+0000 7fdccf79a700 1 -- 192.168.123.105:0/2257446671 shutdown_connections 2026-03-10T07:54:56.930 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:56.930+0000 7fdccf79a700 1 --2- 192.168.123.105:0/2257446671 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fdcc80731c0 0x7fdcc8198dd0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:54:56.930 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:56.930+0000 7fdccf79a700 1 --2- 192.168.123.105:0/2257446671 >> 
[v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7fdcb40778c0 0x7fdcb4079d80 unknown :-1 s=CLOSED pgs=58 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:54:56.930 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:56.930+0000 7fdccf79a700 1 --2- 192.168.123.105:0/2257446671 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fdcc8073ae0 0x7fdcc8199310 unknown :-1 s=CLOSED pgs=79 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:54:56.930 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:56.930+0000 7fdccf79a700 1 -- 192.168.123.105:0/2257446671 >> 192.168.123.105:0/2257446671 conn(0x7fdcc80fc9b0 msgr2=0x7fdcc8107a00 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:54:56.930 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:56.930+0000 7fdccf79a700 1 -- 192.168.123.105:0/2257446671 shutdown_connections 2026-03-10T07:54:56.930 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:54:56.930+0000 7fdccf79a700 1 -- 192.168.123.105:0/2257446671 wait complete. 
2026-03-10T07:54:57.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:54:57 vm05.local ceph-mon[130117]: from='client.44207 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T07:54:57.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:54:57 vm05.local ceph-mon[130117]: from='client.44211 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T07:54:57.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:54:57 vm05.local ceph-mon[130117]: from='client.34240 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T07:54:57.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:54:57 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:54:57.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:54:57 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T07:54:57.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:54:57 vm05.local ceph-mon[130117]: from='client.? 
192.168.123.105:0/2257446671' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch 2026-03-10T07:54:57.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:54:57 vm08.local ceph-mon[107898]: from='client.44207 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T07:54:57.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:54:57 vm08.local ceph-mon[107898]: from='client.44211 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T07:54:57.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:54:57 vm08.local ceph-mon[107898]: from='client.34240 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T07:54:57.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:54:57 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:54:57.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:54:57 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T07:54:57.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:54:57 vm08.local ceph-mon[107898]: from='client.? 
192.168.123.105:0/2257446671' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch 2026-03-10T07:54:58.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:54:58 vm05.local ceph-mon[130117]: pgmap v105: 65 pgs: 65 active+clean; 255 MiB data, 1.7 GiB used, 118 GiB / 120 GiB avail 2026-03-10T07:54:58.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:54:58 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "osd ok-to-stop", "ids": ["4"], "max": 16}]: dispatch 2026-03-10T07:54:58.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:54:58 vm08.local ceph-mon[107898]: pgmap v105: 65 pgs: 65 active+clean; 255 MiB data, 1.7 GiB used, 118 GiB / 120 GiB avail 2026-03-10T07:54:58.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:54:58 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "osd ok-to-stop", "ids": ["4"], "max": 16}]: dispatch 2026-03-10T07:54:59.773 INFO:journalctl@ceph.osd.4.vm08.stdout:Mar 10 07:54:59 vm08.local systemd[1]: Stopping Ceph osd.4 for 12e9780e-1c55-11f1-8896-79f7c2e9b508... 
2026-03-10T07:54:59.773 INFO:journalctl@ceph.osd.4.vm08.stdout:Mar 10 07:54:59 vm08.local ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508-osd-4[72016]: 2026-03-10T07:54:59.611+0000 7f375d666700 -1 received signal: Terminated from /run/podman-init -- /usr/bin/ceph-osd -n osd.4 -f --setuser ceph --setgroup ceph --default-log-to-file=false --default-log-to-journald=true --default-log-to-stderr=false (PID: 1) UID: 0 2026-03-10T07:54:59.773 INFO:journalctl@ceph.osd.4.vm08.stdout:Mar 10 07:54:59 vm08.local ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508-osd-4[72016]: 2026-03-10T07:54:59.611+0000 7f375d666700 -1 osd.4 70 *** Got signal Terminated *** 2026-03-10T07:54:59.773 INFO:journalctl@ceph.osd.4.vm08.stdout:Mar 10 07:54:59 vm08.local ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508-osd-4[72016]: 2026-03-10T07:54:59.611+0000 7f375d666700 -1 osd.4 70 *** Immediate shutdown (osd_fast_shutdown=true) *** 2026-03-10T07:55:00.033 INFO:journalctl@ceph.osd.4.vm08.stdout:Mar 10 07:54:59 vm08.local podman[119722]: 2026-03-10 07:54:59.85943828 +0000 UTC m=+0.270425862 container died bd748b691ccd898bbe428c42301826f5a397556bbd787d226dbdd93adc9a276d (image=quay.io/ceph/ceph@sha256:9f35728f6070a596500c0804814a12ab6b98e05067316dc64876fb4b28d04af3, name=ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508-osd-4, CEPH_POINT_RELEASE=-18.2.1, GIT_CLEAN=True, GIT_COMMIT=1617517b9622744995c4661989e3c30d036a7cfd, RELEASE=HEAD, maintainer=Guillaume Abrioux , org.label-schema.build-date=20240222, org.label-schema.license=GPLv2, GIT_BRANCH=HEAD, io.buildah.version=1.29.1, org.label-schema.schema-version=1.0, GIT_REPO=https://github.com/ceph/ceph-container.git, ceph=True, org.label-schema.name=CentOS Stream 8 Base Image, org.label-schema.vendor=CentOS) 2026-03-10T07:55:00.033 INFO:journalctl@ceph.osd.4.vm08.stdout:Mar 10 07:54:59 vm08.local podman[119722]: 2026-03-10 07:54:59.880617203 +0000 UTC m=+0.291604785 container remove bd748b691ccd898bbe428c42301826f5a397556bbd787d226dbdd93adc9a276d 
(image=quay.io/ceph/ceph@sha256:9f35728f6070a596500c0804814a12ab6b98e05067316dc64876fb4b28d04af3, name=ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508-osd-4, RELEASE=HEAD, maintainer=Guillaume Abrioux , org.label-schema.name=CentOS Stream 8 Base Image, org.label-schema.vendor=CentOS, GIT_BRANCH=HEAD, GIT_CLEAN=True, GIT_COMMIT=1617517b9622744995c4661989e3c30d036a7cfd, GIT_REPO=https://github.com/ceph/ceph-container.git, ceph=True, io.buildah.version=1.29.1, org.label-schema.license=GPLv2, CEPH_POINT_RELEASE=-18.2.1, org.label-schema.build-date=20240222, org.label-schema.schema-version=1.0) 2026-03-10T07:55:00.033 INFO:journalctl@ceph.osd.4.vm08.stdout:Mar 10 07:54:59 vm08.local bash[119722]: ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508-osd-4 2026-03-10T07:55:00.033 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:54:59 vm08.local ceph-mon[107898]: from='mon.0 -' entity='mon.' cmd=[{"prefix": "osd ok-to-stop", "ids": ["4"], "max": 16}]: dispatch 2026-03-10T07:55:00.033 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:54:59 vm08.local ceph-mon[107898]: Upgrade: osd.4 is safe to restart 2026-03-10T07:55:00.033 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:54:59 vm08.local ceph-mon[107898]: Upgrade: Updating osd.4 2026-03-10T07:55:00.033 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:54:59 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:55:00.033 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:54:59 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "auth get", "entity": "osd.4"}]: dispatch 2026-03-10T07:55:00.033 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:54:59 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T07:55:00.033 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:54:59 vm08.local 
ceph-mon[107898]: Deploying daemon osd.4 on vm08 2026-03-10T07:55:00.033 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:54:59 vm08.local ceph-mon[107898]: osd.4 marked itself down and dead 2026-03-10T07:55:00.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:54:59 vm05.local ceph-mon[130117]: from='mon.0 -' entity='mon.' cmd=[{"prefix": "osd ok-to-stop", "ids": ["4"], "max": 16}]: dispatch 2026-03-10T07:55:00.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:54:59 vm05.local ceph-mon[130117]: Upgrade: osd.4 is safe to restart 2026-03-10T07:55:00.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:54:59 vm05.local ceph-mon[130117]: Upgrade: Updating osd.4 2026-03-10T07:55:00.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:54:59 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:55:00.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:54:59 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "auth get", "entity": "osd.4"}]: dispatch 2026-03-10T07:55:00.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:54:59 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T07:55:00.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:54:59 vm05.local ceph-mon[130117]: Deploying daemon osd.4 on vm08 2026-03-10T07:55:00.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:54:59 vm05.local ceph-mon[130117]: osd.4 marked itself down and dead 2026-03-10T07:55:00.343 INFO:journalctl@ceph.osd.4.vm08.stdout:Mar 10 07:55:00 vm08.local podman[119787]: 2026-03-10 07:55:00.033007532 +0000 UTC m=+0.019114609 container create 82fee6bfaa1e7b4168358d3d40b7582aa48d0c5805552dff65ba0b905809d51b (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, 
name=ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508-osd-4-deactivate, io.buildah.version=1.41.3, CEPH_REF=squid, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, ceph=True, org.label-schema.license=GPLv2, org.label-schema.build-date=20260223, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image) 2026-03-10T07:55:00.343 INFO:journalctl@ceph.osd.4.vm08.stdout:Mar 10 07:55:00 vm08.local podman[119787]: 2026-03-10 07:55:00.075737235 +0000 UTC m=+0.061844312 container init 82fee6bfaa1e7b4168358d3d40b7582aa48d0c5805552dff65ba0b905809d51b (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508-osd-4-deactivate, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team , GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, ceph=True, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260223, CEPH_REF=squid, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.opencontainers.image.documentation=https://docs.ceph.com/) 2026-03-10T07:55:00.343 INFO:journalctl@ceph.osd.4.vm08.stdout:Mar 10 07:55:00 vm08.local podman[119787]: 2026-03-10 07:55:00.082271466 +0000 UTC m=+0.068378533 container start 82fee6bfaa1e7b4168358d3d40b7582aa48d0c5805552dff65ba0b905809d51b 
(image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508-osd-4-deactivate, io.buildah.version=1.41.3, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.license=GPLv2, org.label-schema.build-date=20260223, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, OSD_FLAVOR=default, ceph=True, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image) 2026-03-10T07:55:00.343 INFO:journalctl@ceph.osd.4.vm08.stdout:Mar 10 07:55:00 vm08.local podman[119787]: 2026-03-10 07:55:00.090304252 +0000 UTC m=+0.076411329 container attach 82fee6bfaa1e7b4168358d3d40b7582aa48d0c5805552dff65ba0b905809d51b (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508-osd-4-deactivate, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.41.3, org.label-schema.build-date=20260223, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.vendor=CentOS, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, ceph=True, CEPH_REF=squid) 2026-03-10T07:55:00.343 INFO:journalctl@ceph.osd.4.vm08.stdout:Mar 10 07:55:00 vm08.local podman[119787]: 2026-03-10 07:55:00.026126893 +0000 UTC 
m=+0.012233980 image pull 654f31e6858eb235bbece362255b685a945f2b6a367e2b88c4930c984fbb214c quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc 2026-03-10T07:55:00.343 INFO:journalctl@ceph.osd.4.vm08.stdout:Mar 10 07:55:00 vm08.local podman[119787]: 2026-03-10 07:55:00.218403653 +0000 UTC m=+0.204510731 container died 82fee6bfaa1e7b4168358d3d40b7582aa48d0c5805552dff65ba0b905809d51b (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508-osd-4-deactivate, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.build-date=20260223, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.41.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, ceph=True) 2026-03-10T07:55:00.343 INFO:journalctl@ceph.osd.4.vm08.stdout:Mar 10 07:55:00 vm08.local podman[119787]: 2026-03-10 07:55:00.239524327 +0000 UTC m=+0.225631404 container remove 82fee6bfaa1e7b4168358d3d40b7582aa48d0c5805552dff65ba0b905809d51b (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508-osd-4-deactivate, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20260223, org.label-schema.vendor=CentOS, ceph=True, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team , 
FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, io.buildah.version=1.41.3, CEPH_REF=squid, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0) 2026-03-10T07:55:00.343 INFO:journalctl@ceph.osd.4.vm08.stdout:Mar 10 07:55:00 vm08.local systemd[1]: ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508@osd.4.service: Deactivated successfully. 2026-03-10T07:55:00.343 INFO:journalctl@ceph.osd.4.vm08.stdout:Mar 10 07:55:00 vm08.local systemd[1]: ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508@osd.4.service: Unit process 119799 (conmon) remains running after unit stopped. 2026-03-10T07:55:00.343 INFO:journalctl@ceph.osd.4.vm08.stdout:Mar 10 07:55:00 vm08.local systemd[1]: ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508@osd.4.service: Unit process 119809 (podman) remains running after unit stopped. 2026-03-10T07:55:00.343 INFO:journalctl@ceph.osd.4.vm08.stdout:Mar 10 07:55:00 vm08.local systemd[1]: Stopped Ceph osd.4 for 12e9780e-1c55-11f1-8896-79f7c2e9b508. 2026-03-10T07:55:00.343 INFO:journalctl@ceph.osd.4.vm08.stdout:Mar 10 07:55:00 vm08.local systemd[1]: ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508@osd.4.service: Consumed 36.150s CPU time, 800.1M memory peak. 2026-03-10T07:55:00.701 INFO:journalctl@ceph.osd.4.vm08.stdout:Mar 10 07:55:00 vm08.local systemd[1]: Starting Ceph osd.4 for 12e9780e-1c55-11f1-8896-79f7c2e9b508... 
2026-03-10T07:55:00.701 INFO:journalctl@ceph.osd.4.vm08.stdout:Mar 10 07:55:00 vm08.local podman[119897]: 2026-03-10 07:55:00.566274618 +0000 UTC m=+0.025675370 container create feb78610cd1a37d303a7fa771f9d89a0b7ea387713007f32b83a376435d07827 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508-osd-4-activate, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20260223, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, ceph=True, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git) 2026-03-10T07:55:00.701 INFO:journalctl@ceph.osd.4.vm08.stdout:Mar 10 07:55:00 vm08.local podman[119897]: 2026-03-10 07:55:00.603017194 +0000 UTC m=+0.062417966 container init feb78610cd1a37d303a7fa771f9d89a0b7ea387713007f32b83a376435d07827 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508-osd-4-activate, org.label-schema.license=GPLv2, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20260223, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.vendor=CentOS, CEPH_REF=squid, io.buildah.version=1.41.3, 
GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/) 2026-03-10T07:55:00.701 INFO:journalctl@ceph.osd.4.vm08.stdout:Mar 10 07:55:00 vm08.local podman[119897]: 2026-03-10 07:55:00.609355959 +0000 UTC m=+0.068756722 container start feb78610cd1a37d303a7fa771f9d89a0b7ea387713007f32b83a376435d07827 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508-osd-4-activate, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team , GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20260223, OSD_FLAVOR=default, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.license=GPLv2, ceph=True, CEPH_REF=squid, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9) 2026-03-10T07:55:00.701 INFO:journalctl@ceph.osd.4.vm08.stdout:Mar 10 07:55:00 vm08.local podman[119897]: 2026-03-10 07:55:00.621332629 +0000 UTC m=+0.080733370 container attach feb78610cd1a37d303a7fa771f9d89a0b7ea387713007f32b83a376435d07827 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508-osd-4-activate, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20260223, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team , 
org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/) 2026-03-10T07:55:00.701 INFO:journalctl@ceph.osd.4.vm08.stdout:Mar 10 07:55:00 vm08.local podman[119897]: 2026-03-10 07:55:00.554724349 +0000 UTC m=+0.014125101 image pull 654f31e6858eb235bbece362255b685a945f2b6a367e2b88c4930c984fbb214c quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc 2026-03-10T07:55:00.701 INFO:journalctl@ceph.osd.4.vm08.stdout:Mar 10 07:55:00 vm08.local ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508-osd-4-activate[119910]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-10T07:55:00.701 INFO:journalctl@ceph.osd.4.vm08.stdout:Mar 10 07:55:00 vm08.local bash[119897]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-10T07:55:00.701 INFO:journalctl@ceph.osd.4.vm08.stdout:Mar 10 07:55:00 vm08.local ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508-osd-4-activate[119910]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-10T07:55:00.701 INFO:journalctl@ceph.osd.4.vm08.stdout:Mar 10 07:55:00 vm08.local bash[119897]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-10T07:55:01.060 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:55:00 vm05.local ceph-mon[130117]: pgmap v106: 65 pgs: 65 active+clean; 255 MiB data, 1.7 GiB used, 118 GiB / 120 GiB avail 2026-03-10T07:55:01.060 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:55:00 vm05.local ceph-mon[130117]: Health check failed: 1 osds down (OSD_DOWN) 2026-03-10T07:55:01.060 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:55:00 vm05.local ceph-mon[130117]: osdmap e71: 6 total, 5 up, 6 in 2026-03-10T07:55:01.063 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:55:00 vm08.local ceph-mon[107898]: pgmap v106: 65 pgs: 65 active+clean; 255 MiB data, 1.7 GiB used, 118 GiB / 120 GiB avail 
2026-03-10T07:55:01.063 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:55:00 vm08.local ceph-mon[107898]: Health check failed: 1 osds down (OSD_DOWN) 2026-03-10T07:55:01.063 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:55:00 vm08.local ceph-mon[107898]: osdmap e71: 6 total, 5 up, 6 in 2026-03-10T07:55:01.418 INFO:journalctl@ceph.osd.4.vm08.stdout:Mar 10 07:55:01 vm08.local ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508-osd-4-activate[119910]: --> Failed to activate via raw: did not find any matching OSD to activate 2026-03-10T07:55:01.418 INFO:journalctl@ceph.osd.4.vm08.stdout:Mar 10 07:55:01 vm08.local ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508-osd-4-activate[119910]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-10T07:55:01.418 INFO:journalctl@ceph.osd.4.vm08.stdout:Mar 10 07:55:01 vm08.local bash[119897]: --> Failed to activate via raw: did not find any matching OSD to activate 2026-03-10T07:55:01.418 INFO:journalctl@ceph.osd.4.vm08.stdout:Mar 10 07:55:01 vm08.local bash[119897]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-10T07:55:01.418 INFO:journalctl@ceph.osd.4.vm08.stdout:Mar 10 07:55:01 vm08.local ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508-osd-4-activate[119910]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-10T07:55:01.418 INFO:journalctl@ceph.osd.4.vm08.stdout:Mar 10 07:55:01 vm08.local bash[119897]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-10T07:55:01.418 INFO:journalctl@ceph.osd.4.vm08.stdout:Mar 10 07:55:01 vm08.local ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508-osd-4-activate[119910]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-4 2026-03-10T07:55:01.418 INFO:journalctl@ceph.osd.4.vm08.stdout:Mar 10 07:55:01 vm08.local bash[119897]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-4 2026-03-10T07:55:01.418 INFO:journalctl@ceph.osd.4.vm08.stdout:Mar 10 07:55:01 vm08.local 
ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508-osd-4-activate[119910]: Running command: /usr/bin/ceph-bluestore-tool --cluster=ceph prime-osd-dir --dev /dev/ceph-3d87f515-41a4-4ffe-a5ec-2714c6141204/osd-block-efa282d4-a651-4903-bdb8-2e148038e567 --path /var/lib/ceph/osd/ceph-4 --no-mon-config 2026-03-10T07:55:01.418 INFO:journalctl@ceph.osd.4.vm08.stdout:Mar 10 07:55:01 vm08.local bash[119897]: Running command: /usr/bin/ceph-bluestore-tool --cluster=ceph prime-osd-dir --dev /dev/ceph-3d87f515-41a4-4ffe-a5ec-2714c6141204/osd-block-efa282d4-a651-4903-bdb8-2e148038e567 --path /var/lib/ceph/osd/ceph-4 --no-mon-config 2026-03-10T07:55:01.803 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:55:01 vm08.local ceph-mon[107898]: osdmap e72: 6 total, 5 up, 6 in 2026-03-10T07:55:01.804 INFO:journalctl@ceph.osd.4.vm08.stdout:Mar 10 07:55:01 vm08.local ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508-osd-4-activate[119910]: Running command: /usr/bin/ln -snf /dev/ceph-3d87f515-41a4-4ffe-a5ec-2714c6141204/osd-block-efa282d4-a651-4903-bdb8-2e148038e567 /var/lib/ceph/osd/ceph-4/block 2026-03-10T07:55:01.804 INFO:journalctl@ceph.osd.4.vm08.stdout:Mar 10 07:55:01 vm08.local bash[119897]: Running command: /usr/bin/ln -snf /dev/ceph-3d87f515-41a4-4ffe-a5ec-2714c6141204/osd-block-efa282d4-a651-4903-bdb8-2e148038e567 /var/lib/ceph/osd/ceph-4/block 2026-03-10T07:55:01.804 INFO:journalctl@ceph.osd.4.vm08.stdout:Mar 10 07:55:01 vm08.local ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508-osd-4-activate[119910]: Running command: /usr/bin/chown -h ceph:ceph /var/lib/ceph/osd/ceph-4/block 2026-03-10T07:55:01.804 INFO:journalctl@ceph.osd.4.vm08.stdout:Mar 10 07:55:01 vm08.local ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508-osd-4-activate[119910]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-1 2026-03-10T07:55:01.804 INFO:journalctl@ceph.osd.4.vm08.stdout:Mar 10 07:55:01 vm08.local bash[119897]: Running command: /usr/bin/chown -h ceph:ceph /var/lib/ceph/osd/ceph-4/block 2026-03-10T07:55:01.804 
INFO:journalctl@ceph.osd.4.vm08.stdout:Mar 10 07:55:01 vm08.local bash[119897]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-1 2026-03-10T07:55:01.804 INFO:journalctl@ceph.osd.4.vm08.stdout:Mar 10 07:55:01 vm08.local ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508-osd-4-activate[119910]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-4 2026-03-10T07:55:01.804 INFO:journalctl@ceph.osd.4.vm08.stdout:Mar 10 07:55:01 vm08.local bash[119897]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-4 2026-03-10T07:55:01.804 INFO:journalctl@ceph.osd.4.vm08.stdout:Mar 10 07:55:01 vm08.local ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508-osd-4-activate[119910]: --> ceph-volume lvm activate successful for osd ID: 4 2026-03-10T07:55:01.804 INFO:journalctl@ceph.osd.4.vm08.stdout:Mar 10 07:55:01 vm08.local bash[119897]: --> ceph-volume lvm activate successful for osd ID: 4 2026-03-10T07:55:01.804 INFO:journalctl@ceph.osd.4.vm08.stdout:Mar 10 07:55:01 vm08.local conmon[119910]: conmon feb78610cd1a37d303a7 : Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-feb78610cd1a37d303a7fa771f9d89a0b7ea387713007f32b83a376435d07827.scope/container/memory.events 2026-03-10T07:55:01.804 INFO:journalctl@ceph.osd.4.vm08.stdout:Mar 10 07:55:01 vm08.local podman[119897]: 2026-03-10 07:55:01.577259402 +0000 UTC m=+1.036660154 container died feb78610cd1a37d303a7fa771f9d89a0b7ea387713007f32b83a376435d07827 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508-osd-4-activate, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, CEPH_REF=squid, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20260223, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.opencontainers.image.authors=Ceph Release Team , 
GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git) 2026-03-10T07:55:01.804 INFO:journalctl@ceph.osd.4.vm08.stdout:Mar 10 07:55:01 vm08.local podman[119897]: 2026-03-10 07:55:01.596495338 +0000 UTC m=+1.055896090 container remove feb78610cd1a37d303a7fa771f9d89a0b7ea387713007f32b83a376435d07827 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508-osd-4-activate, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260223, OSD_FLAVOR=default, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team , CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True) 2026-03-10T07:55:01.804 INFO:journalctl@ceph.osd.4.vm08.stdout:Mar 10 07:55:01 vm08.local podman[120163]: 2026-03-10 07:55:01.689265989 +0000 UTC m=+0.015452846 container create 132c8d288b1e2635dc27692f5dbd57580b708e251146e04ffae18c1fa83d9fb1 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508-osd-4, org.opencontainers.image.authors=Ceph Release Team , CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, 
CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260223, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.41.3, ceph=True, org.label-schema.license=GPLv2, CEPH_REF=squid) 2026-03-10T07:55:01.804 INFO:journalctl@ceph.osd.4.vm08.stdout:Mar 10 07:55:01 vm08.local podman[120163]: 2026-03-10 07:55:01.718041652 +0000 UTC m=+0.044228509 container init 132c8d288b1e2635dc27692f5dbd57580b708e251146e04ffae18c1fa83d9fb1 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508-osd-4, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.label-schema.license=GPLv2, OSD_FLAVOR=default, org.label-schema.build-date=20260223, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, CEPH_REF=squid, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, io.buildah.version=1.41.3) 2026-03-10T07:55:01.804 INFO:journalctl@ceph.osd.4.vm08.stdout:Mar 10 07:55:01 vm08.local podman[120163]: 2026-03-10 07:55:01.723651614 +0000 UTC m=+0.049838471 container start 132c8d288b1e2635dc27692f5dbd57580b708e251146e04ffae18c1fa83d9fb1 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508-osd-4, org.label-schema.build-date=20260223, org.opencontainers.image.authors=Ceph Release Team , io.buildah.version=1.41.3, 
org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.vendor=CentOS, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, OSD_FLAVOR=default) 2026-03-10T07:55:01.804 INFO:journalctl@ceph.osd.4.vm08.stdout:Mar 10 07:55:01 vm08.local bash[120163]: 132c8d288b1e2635dc27692f5dbd57580b708e251146e04ffae18c1fa83d9fb1 2026-03-10T07:55:01.804 INFO:journalctl@ceph.osd.4.vm08.stdout:Mar 10 07:55:01 vm08.local podman[120163]: 2026-03-10 07:55:01.683100989 +0000 UTC m=+0.009287846 image pull 654f31e6858eb235bbece362255b685a945f2b6a367e2b88c4930c984fbb214c quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc 2026-03-10T07:55:01.804 INFO:journalctl@ceph.osd.4.vm08.stdout:Mar 10 07:55:01 vm08.local systemd[1]: Started Ceph osd.4 for 12e9780e-1c55-11f1-8896-79f7c2e9b508. 
2026-03-10T07:55:02.075 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:55:01 vm08.local ceph-mon[107898]: pgmap v109: 65 pgs: 8 peering, 4 stale+active+clean, 53 active+clean; 255 MiB data, 1.7 GiB used, 118 GiB / 120 GiB avail 2026-03-10T07:55:02.075 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:55:01 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:55:02.075 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:55:01 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:55:02.075 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:55:01 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T07:55:02.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:55:01 vm05.local ceph-mon[130117]: osdmap e72: 6 total, 5 up, 6 in 2026-03-10T07:55:02.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:55:01 vm05.local ceph-mon[130117]: pgmap v109: 65 pgs: 8 peering, 4 stale+active+clean, 53 active+clean; 255 MiB data, 1.7 GiB used, 118 GiB / 120 GiB avail 2026-03-10T07:55:02.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:55:01 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:55:02.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:55:01 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:55:02.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:55:01 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T07:55:02.668 INFO:journalctl@ceph.osd.4.vm08.stdout:Mar 10 07:55:02 vm08.local ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508-osd-4[120174]: 2026-03-10T07:55:02.562+0000 
7f2ce06e6740 -1 Falling back to public interface 2026-03-10T07:55:03.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:55:02 vm05.local ceph-mon[130117]: Health check failed: Reduced data availability: 2 pgs peering (PG_AVAILABILITY) 2026-03-10T07:55:03.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:55:02 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:55:03.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:55:02 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:55:03.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:55:02 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:55:03.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:55:02 vm08.local ceph-mon[107898]: Health check failed: Reduced data availability: 2 pgs peering (PG_AVAILABILITY) 2026-03-10T07:55:03.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:55:02 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:55:03.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:55:02 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:55:03.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:55:02 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:55:04.407 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:55:04 vm05.local ceph-mon[130117]: pgmap v110: 65 pgs: 10 active+undersized, 8 peering, 2 stale+active+clean, 9 active+undersized+degraded, 36 active+clean; 255 MiB data, 1.7 GiB used, 118 GiB / 120 GiB avail; 29/261 objects degraded (11.111%) 2026-03-10T07:55:04.407 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:55:04 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' 
entity='mgr.vm05.blexke' 2026-03-10T07:55:04.407 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:55:04 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:55:04.407 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:55:04 vm05.local ceph-mon[130117]: Health check failed: Degraded data redundancy: 29/261 objects degraded (11.111%), 9 pgs degraded (PG_DEGRADED) 2026-03-10T07:55:04.407 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:55:04 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:55:04.407 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:55:04 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:55:04.407 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:55:04 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T07:55:04.407 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:55:04 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T07:55:04.407 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:55:04 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:55:04.407 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:55:04 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T07:55:04.407 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:55:04 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T07:55:04.407 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 
07:55:04 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T07:55:04.407 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:55:04 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T07:55:04.418 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:55:04 vm08.local ceph-mon[107898]: pgmap v110: 65 pgs: 10 active+undersized, 8 peering, 2 stale+active+clean, 9 active+undersized+degraded, 36 active+clean; 255 MiB data, 1.7 GiB used, 118 GiB / 120 GiB avail; 29/261 objects degraded (11.111%) 2026-03-10T07:55:04.418 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:55:04 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:55:04.418 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:55:04 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:55:04.418 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:55:04 vm08.local ceph-mon[107898]: Health check failed: Degraded data redundancy: 29/261 objects degraded (11.111%), 9 pgs degraded (PG_DEGRADED) 2026-03-10T07:55:04.418 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:55:04 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:55:04.418 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:55:04 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:55:04.418 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:55:04 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T07:55:04.418 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:55:04 vm08.local ceph-mon[107898]: 
from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T07:55:04.418 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:55:04 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:55:04.418 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:55:04 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T07:55:04.418 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:55:04 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T07:55:04.418 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:55:04 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T07:55:04.418 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:55:04 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T07:55:05.407 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:55:05 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "osd ok-to-stop", "ids": ["5"], "max": 16}]: dispatch 2026-03-10T07:55:05.407 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:55:05 vm05.local ceph-mon[130117]: from='mon.0 -' entity='mon.' 
cmd=[{"prefix": "osd ok-to-stop", "ids": ["5"], "max": 16}]: dispatch 2026-03-10T07:55:05.407 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:55:05 vm05.local ceph-mon[130117]: Upgrade: unsafe to stop osd(s) at this time (5 PGs are or would become offline) 2026-03-10T07:55:05.418 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:55:05 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "osd ok-to-stop", "ids": ["5"], "max": 16}]: dispatch 2026-03-10T07:55:05.418 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:55:05 vm08.local ceph-mon[107898]: from='mon.0 -' entity='mon.' cmd=[{"prefix": "osd ok-to-stop", "ids": ["5"], "max": 16}]: dispatch 2026-03-10T07:55:05.418 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:55:05 vm08.local ceph-mon[107898]: Upgrade: unsafe to stop osd(s) at this time (5 PGs are or would become offline) 2026-03-10T07:55:06.137 INFO:journalctl@ceph.osd.4.vm08.stdout:Mar 10 07:55:05 vm08.local ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508-osd-4[120174]: 2026-03-10T07:55:05.765+0000 7f2ce06e6740 -1 osd.4 0 read_superblock omap replica is missing. 
2026-03-10T07:55:06.137 INFO:journalctl@ceph.osd.4.vm08.stdout:Mar 10 07:55:05 vm08.local ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508-osd-4[120174]: 2026-03-10T07:55:05.943+0000 7f2ce06e6740 -1 osd.4 70 log_to_monitors true 2026-03-10T07:55:06.407 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:55:06 vm05.local ceph-mon[130117]: pgmap v111: 65 pgs: 15 active+undersized, 8 peering, 11 active+undersized+degraded, 31 active+clean; 255 MiB data, 1.7 GiB used, 118 GiB / 120 GiB avail; 35/261 objects degraded (13.410%) 2026-03-10T07:55:06.407 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:55:06 vm05.local ceph-mon[130117]: from='osd.4 [v2:192.168.123.108:6808/2094993144,v1:192.168.123.108:6809/2094993144]' entity='osd.4' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["4"]}]: dispatch 2026-03-10T07:55:06.407 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:55:06 vm05.local ceph-mon[130117]: from='osd.4 ' entity='osd.4' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["4"]}]: dispatch 2026-03-10T07:55:06.418 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:55:06 vm08.local ceph-mon[107898]: pgmap v111: 65 pgs: 15 active+undersized, 8 peering, 11 active+undersized+degraded, 31 active+clean; 255 MiB data, 1.7 GiB used, 118 GiB / 120 GiB avail; 35/261 objects degraded (13.410%) 2026-03-10T07:55:06.418 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:55:06 vm08.local ceph-mon[107898]: from='osd.4 [v2:192.168.123.108:6808/2094993144,v1:192.168.123.108:6809/2094993144]' entity='osd.4' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["4"]}]: dispatch 2026-03-10T07:55:06.418 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:55:06 vm08.local ceph-mon[107898]: from='osd.4 ' entity='osd.4' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["4"]}]: dispatch 2026-03-10T07:55:07.407 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:55:07 vm05.local ceph-mon[130117]: from='osd.4 ' 
entity='osd.4' cmd='[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["4"]}]': finished 2026-03-10T07:55:07.407 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:55:07 vm05.local ceph-mon[130117]: osdmap e73: 6 total, 5 up, 6 in 2026-03-10T07:55:07.407 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:55:07 vm05.local ceph-mon[130117]: from='osd.4 [v2:192.168.123.108:6808/2094993144,v1:192.168.123.108:6809/2094993144]' entity='osd.4' cmd=[{"prefix": "osd crush create-or-move", "id": 4, "weight":0.0195, "args": ["host=vm08", "root=default"]}]: dispatch 2026-03-10T07:55:07.407 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:55:07 vm05.local ceph-mon[130117]: from='osd.4 ' entity='osd.4' cmd=[{"prefix": "osd crush create-or-move", "id": 4, "weight":0.0195, "args": ["host=vm08", "root=default"]}]: dispatch 2026-03-10T07:55:07.407 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:55:07 vm05.local ceph-mon[130117]: from='osd.4 ' entity='osd.4' 2026-03-10T07:55:07.418 INFO:journalctl@ceph.osd.4.vm08.stdout:Mar 10 07:55:07 vm08.local ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508-osd-4[120174]: 2026-03-10T07:55:07.023+0000 7f2cd7c7f640 -1 osd.4 70 set_numa_affinity unable to identify public interface '' numa node: (2) No such file or directory 2026-03-10T07:55:07.418 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:55:07 vm08.local ceph-mon[107898]: from='osd.4 ' entity='osd.4' cmd='[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["4"]}]': finished 2026-03-10T07:55:07.418 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:55:07 vm08.local ceph-mon[107898]: osdmap e73: 6 total, 5 up, 6 in 2026-03-10T07:55:07.418 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:55:07 vm08.local ceph-mon[107898]: from='osd.4 [v2:192.168.123.108:6808/2094993144,v1:192.168.123.108:6809/2094993144]' entity='osd.4' cmd=[{"prefix": "osd crush create-or-move", "id": 4, "weight":0.0195, "args": ["host=vm08", "root=default"]}]: dispatch 
2026-03-10T07:55:07.418 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:55:07 vm08.local ceph-mon[107898]: from='osd.4 ' entity='osd.4' cmd=[{"prefix": "osd crush create-or-move", "id": 4, "weight":0.0195, "args": ["host=vm08", "root=default"]}]: dispatch 2026-03-10T07:55:07.418 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:55:07 vm08.local ceph-mon[107898]: from='osd.4 ' entity='osd.4' 2026-03-10T07:55:08.407 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:55:08 vm05.local ceph-mon[130117]: pgmap v113: 65 pgs: 19 active+undersized, 15 active+undersized+degraded, 31 active+clean; 255 MiB data, 1.3 GiB used, 119 GiB / 120 GiB avail; 45/261 objects degraded (17.241%) 2026-03-10T07:55:08.407 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:55:08 vm05.local ceph-mon[130117]: Health check cleared: PG_AVAILABILITY (was: Reduced data availability: 2 pgs peering) 2026-03-10T07:55:08.407 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:55:08 vm05.local ceph-mon[130117]: Health check cleared: OSD_DOWN (was: 1 osds down) 2026-03-10T07:55:08.407 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:55:08 vm05.local ceph-mon[130117]: osd.4 [v2:192.168.123.108:6808/2094993144,v1:192.168.123.108:6809/2094993144] boot 2026-03-10T07:55:08.407 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:55:08 vm05.local ceph-mon[130117]: osdmap e74: 6 total, 6 up, 6 in 2026-03-10T07:55:08.407 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:55:08 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "osd metadata", "id": 4}]: dispatch 2026-03-10T07:55:08.418 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:55:08 vm08.local ceph-mon[107898]: pgmap v113: 65 pgs: 19 active+undersized, 15 active+undersized+degraded, 31 active+clean; 255 MiB data, 1.3 GiB used, 119 GiB / 120 GiB avail; 45/261 objects degraded (17.241%) 2026-03-10T07:55:08.418 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:55:08 vm08.local 
ceph-mon[107898]: Health check cleared: PG_AVAILABILITY (was: Reduced data availability: 2 pgs peering) 2026-03-10T07:55:08.418 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:55:08 vm08.local ceph-mon[107898]: Health check cleared: OSD_DOWN (was: 1 osds down) 2026-03-10T07:55:08.418 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:55:08 vm08.local ceph-mon[107898]: osd.4 [v2:192.168.123.108:6808/2094993144,v1:192.168.123.108:6809/2094993144] boot 2026-03-10T07:55:08.418 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:55:08 vm08.local ceph-mon[107898]: osdmap e74: 6 total, 6 up, 6 in 2026-03-10T07:55:08.418 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:55:08 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "osd metadata", "id": 4}]: dispatch 2026-03-10T07:55:10.407 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:55:10 vm05.local ceph-mon[130117]: pgmap v115: 65 pgs: 1 peering, 18 active+undersized, 15 active+undersized+degraded, 31 active+clean; 255 MiB data, 1.3 GiB used, 119 GiB / 120 GiB avail; 45/261 objects degraded (17.241%) 2026-03-10T07:55:10.407 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:55:10 vm05.local ceph-mon[130117]: osdmap e75: 6 total, 6 up, 6 in 2026-03-10T07:55:10.407 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:55:10 vm05.local ceph-mon[130117]: Health check update: Degraded data redundancy: 45/261 objects degraded (17.241%), 15 pgs degraded (PG_DEGRADED) 2026-03-10T07:55:10.418 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:55:10 vm08.local ceph-mon[107898]: pgmap v115: 65 pgs: 1 peering, 18 active+undersized, 15 active+undersized+degraded, 31 active+clean; 255 MiB data, 1.3 GiB used, 119 GiB / 120 GiB avail; 45/261 objects degraded (17.241%) 2026-03-10T07:55:10.418 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:55:10 vm08.local ceph-mon[107898]: osdmap e75: 6 total, 6 up, 6 in 2026-03-10T07:55:10.418 
INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:55:10 vm08.local ceph-mon[107898]: Health check update: Degraded data redundancy: 45/261 objects degraded (17.241%), 15 pgs degraded (PG_DEGRADED) 2026-03-10T07:55:12.407 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:55:12 vm05.local ceph-mon[130117]: pgmap v117: 65 pgs: 1 peering, 15 active+undersized, 12 active+undersized+degraded, 37 active+clean; 255 MiB data, 1.3 GiB used, 119 GiB / 120 GiB avail; 38/261 objects degraded (14.559%) 2026-03-10T07:55:12.407 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:55:12 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:55:12.407 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:55:12 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T07:55:12.418 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:55:12 vm08.local ceph-mon[107898]: pgmap v117: 65 pgs: 1 peering, 15 active+undersized, 12 active+undersized+degraded, 37 active+clean; 255 MiB data, 1.3 GiB used, 119 GiB / 120 GiB avail; 38/261 objects degraded (14.559%) 2026-03-10T07:55:12.418 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:55:12 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:55:12.418 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:55:12 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T07:55:14.407 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:55:14 vm05.local ceph-mon[130117]: pgmap v118: 65 pgs: 1 peering, 3 active+undersized, 1 active+undersized+degraded, 60 active+clean; 255 MiB data, 1.3 GiB used, 119 GiB / 120 GiB avail; 4/261 objects degraded (1.533%) 2026-03-10T07:55:14.418 
INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:55:14 vm08.local ceph-mon[107898]: pgmap v118: 65 pgs: 1 peering, 3 active+undersized, 1 active+undersized+degraded, 60 active+clean; 255 MiB data, 1.3 GiB used, 119 GiB / 120 GiB avail; 4/261 objects degraded (1.533%) 2026-03-10T07:55:15.407 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:55:15 vm05.local ceph-mon[130117]: Health check update: Degraded data redundancy: 4/261 objects degraded (1.533%), 1 pg degraded (PG_DEGRADED) 2026-03-10T07:55:15.418 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:55:15 vm08.local ceph-mon[107898]: Health check update: Degraded data redundancy: 4/261 objects degraded (1.533%), 1 pg degraded (PG_DEGRADED) 2026-03-10T07:55:16.407 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:55:16 vm05.local ceph-mon[130117]: pgmap v119: 65 pgs: 65 active+clean; 255 MiB data, 1.3 GiB used, 119 GiB / 120 GiB avail 2026-03-10T07:55:16.407 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:55:16 vm05.local ceph-mon[130117]: Health check cleared: PG_DEGRADED (was: Degraded data redundancy: 4/261 objects degraded (1.533%), 1 pg degraded) 2026-03-10T07:55:16.418 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:55:16 vm08.local ceph-mon[107898]: pgmap v119: 65 pgs: 65 active+clean; 255 MiB data, 1.3 GiB used, 119 GiB / 120 GiB avail 2026-03-10T07:55:16.418 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:55:16 vm08.local ceph-mon[107898]: Health check cleared: PG_DEGRADED (was: Degraded data redundancy: 4/261 objects degraded (1.533%), 1 pg degraded) 2026-03-10T07:55:18.407 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:55:18 vm05.local ceph-mon[130117]: pgmap v120: 65 pgs: 65 active+clean; 255 MiB data, 1.3 GiB used, 119 GiB / 120 GiB avail 2026-03-10T07:55:18.418 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:55:18 vm08.local ceph-mon[107898]: pgmap v120: 65 pgs: 65 active+clean; 255 MiB data, 1.3 GiB used, 119 GiB / 120 GiB avail 2026-03-10T07:55:20.146 
INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:55:20 vm08.local ceph-mon[107898]: pgmap v121: 65 pgs: 65 active+clean; 255 MiB data, 1.3 GiB used, 119 GiB / 120 GiB avail 2026-03-10T07:55:20.146 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:55:20 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "osd ok-to-stop", "ids": ["5"], "max": 16}]: dispatch 2026-03-10T07:55:20.146 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:55:20 vm08.local ceph-mon[107898]: from='mon.0 -' entity='mon.' cmd=[{"prefix": "osd ok-to-stop", "ids": ["5"], "max": 16}]: dispatch 2026-03-10T07:55:20.146 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:55:20 vm08.local ceph-mon[107898]: Upgrade: osd.5 is safe to restart 2026-03-10T07:55:20.146 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:55:20 vm08.local ceph-mon[107898]: Upgrade: Updating osd.5 2026-03-10T07:55:20.146 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:55:20 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:55:20.146 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:55:20 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "auth get", "entity": "osd.5"}]: dispatch 2026-03-10T07:55:20.146 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:55:20 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T07:55:20.146 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:55:20 vm08.local ceph-mon[107898]: Deploying daemon osd.5 on vm08 2026-03-10T07:55:20.407 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:55:20 vm05.local ceph-mon[130117]: pgmap v121: 65 pgs: 65 active+clean; 255 MiB data, 1.3 GiB used, 119 GiB / 120 GiB avail 2026-03-10T07:55:20.407 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 
07:55:20 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "osd ok-to-stop", "ids": ["5"], "max": 16}]: dispatch 2026-03-10T07:55:20.407 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:55:20 vm05.local ceph-mon[130117]: from='mon.0 -' entity='mon.' cmd=[{"prefix": "osd ok-to-stop", "ids": ["5"], "max": 16}]: dispatch 2026-03-10T07:55:20.407 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:55:20 vm05.local ceph-mon[130117]: Upgrade: osd.5 is safe to restart 2026-03-10T07:55:20.407 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:55:20 vm05.local ceph-mon[130117]: Upgrade: Updating osd.5 2026-03-10T07:55:20.407 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:55:20 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:55:20.407 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:55:20 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "auth get", "entity": "osd.5"}]: dispatch 2026-03-10T07:55:20.407 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:55:20 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T07:55:20.407 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:55:20 vm05.local ceph-mon[130117]: Deploying daemon osd.5 on vm08 2026-03-10T07:55:20.418 INFO:journalctl@ceph.osd.5.vm08.stdout:Mar 10 07:55:20 vm08.local systemd[1]: Stopping Ceph osd.5 for 12e9780e-1c55-11f1-8896-79f7c2e9b508... 
2026-03-10T07:55:20.418 INFO:journalctl@ceph.osd.5.vm08.stdout:Mar 10 07:55:20 vm08.local ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508-osd-5[78393]: 2026-03-10T07:55:20.214+0000 7fc861e4e700 -1 received signal: Terminated from /run/podman-init -- /usr/bin/ceph-osd -n osd.5 -f --setuser ceph --setgroup ceph --default-log-to-file=false --default-log-to-journald=true --default-log-to-stderr=false (PID: 1) UID: 0 2026-03-10T07:55:20.418 INFO:journalctl@ceph.osd.5.vm08.stdout:Mar 10 07:55:20 vm08.local ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508-osd-5[78393]: 2026-03-10T07:55:20.214+0000 7fc861e4e700 -1 osd.5 75 *** Got signal Terminated *** 2026-03-10T07:55:20.418 INFO:journalctl@ceph.osd.5.vm08.stdout:Mar 10 07:55:20 vm08.local ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508-osd-5[78393]: 2026-03-10T07:55:20.214+0000 7fc861e4e700 -1 osd.5 75 *** Immediate shutdown (osd_fast_shutdown=true) *** 2026-03-10T07:55:21.402 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:55:21 vm08.local ceph-mon[107898]: osd.5 marked itself down and dead 2026-03-10T07:55:21.402 INFO:journalctl@ceph.osd.5.vm08.stdout:Mar 10 07:55:21 vm08.local podman[124037]: 2026-03-10 07:55:21.192517786 +0000 UTC m=+0.990980905 container died 9f08820ae98bb02c2cde667ab777a062aad5ba962bb03c5fdfbe3ee207d77571 (image=quay.io/ceph/ceph@sha256:9f35728f6070a596500c0804814a12ab6b98e05067316dc64876fb4b28d04af3, name=ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508-osd-5, GIT_BRANCH=HEAD, ceph=True, maintainer=Guillaume Abrioux , org.label-schema.build-date=20240222, org.label-schema.license=GPLv2, GIT_CLEAN=True, GIT_COMMIT=1617517b9622744995c4661989e3c30d036a7cfd, CEPH_POINT_RELEASE=-18.2.1, RELEASE=HEAD, io.buildah.version=1.29.1, GIT_REPO=https://github.com/ceph/ceph-container.git, org.label-schema.name=CentOS Stream 8 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS) 2026-03-10T07:55:21.402 INFO:journalctl@ceph.osd.5.vm08.stdout:Mar 10 07:55:21 vm08.local podman[124037]: 2026-03-10 
07:55:21.240633772 +0000 UTC m=+1.039096901 container remove 9f08820ae98bb02c2cde667ab777a062aad5ba962bb03c5fdfbe3ee207d77571 (image=quay.io/ceph/ceph@sha256:9f35728f6070a596500c0804814a12ab6b98e05067316dc64876fb4b28d04af3, name=ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508-osd-5, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, CEPH_POINT_RELEASE=-18.2.1, GIT_BRANCH=HEAD, maintainer=Guillaume Abrioux , org.label-schema.build-date=20240222, ceph=True, org.label-schema.schema-version=1.0, GIT_COMMIT=1617517b9622744995c4661989e3c30d036a7cfd, GIT_REPO=https://github.com/ceph/ceph-container.git, io.buildah.version=1.29.1, org.label-schema.name=CentOS Stream 8 Base Image, GIT_CLEAN=True, RELEASE=HEAD) 2026-03-10T07:55:21.402 INFO:journalctl@ceph.osd.5.vm08.stdout:Mar 10 07:55:21 vm08.local bash[124037]: ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508-osd-5 2026-03-10T07:55:21.407 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:55:21 vm05.local ceph-mon[130117]: osd.5 marked itself down and dead 2026-03-10T07:55:21.655 INFO:journalctl@ceph.osd.5.vm08.stdout:Mar 10 07:55:21 vm08.local podman[124115]: 2026-03-10 07:55:21.43095618 +0000 UTC m=+0.022101191 container create c2c85af066cf98d26bcb3cc85e7c14f2003682a7d0dabd0e9e7468828a558a5c (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508-osd-5-deactivate, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.build-date=20260223, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, ceph=True, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid, org.opencontainers.image.documentation=https://docs.ceph.com/, 
org.label-schema.schema-version=1.0, io.buildah.version=1.41.3) 2026-03-10T07:55:21.655 INFO:journalctl@ceph.osd.5.vm08.stdout:Mar 10 07:55:21 vm08.local podman[124115]: 2026-03-10 07:55:21.490022424 +0000 UTC m=+0.081167435 container init c2c85af066cf98d26bcb3cc85e7c14f2003682a7d0dabd0e9e7468828a558a5c (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508-osd-5-deactivate, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.build-date=20260223, io.buildah.version=1.41.3, CEPH_REF=squid, org.label-schema.license=GPLv2) 2026-03-10T07:55:21.655 INFO:journalctl@ceph.osd.5.vm08.stdout:Mar 10 07:55:21 vm08.local podman[124115]: 2026-03-10 07:55:21.494294241 +0000 UTC m=+0.085439242 container start c2c85af066cf98d26bcb3cc85e7c14f2003682a7d0dabd0e9e7468828a558a5c (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508-osd-5-deactivate, OSD_FLAVOR=default, CEPH_REF=squid, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.build-date=20260223, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team , 
CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS) 2026-03-10T07:55:21.655 INFO:journalctl@ceph.osd.5.vm08.stdout:Mar 10 07:55:21 vm08.local podman[124115]: 2026-03-10 07:55:21.499385331 +0000 UTC m=+0.090530332 container attach c2c85af066cf98d26bcb3cc85e7c14f2003682a7d0dabd0e9e7468828a558a5c (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508-osd-5-deactivate, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.build-date=20260223, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, ceph=True, CEPH_REF=squid, org.label-schema.license=GPLv2) 2026-03-10T07:55:21.655 INFO:journalctl@ceph.osd.5.vm08.stdout:Mar 10 07:55:21 vm08.local podman[124115]: 2026-03-10 07:55:21.419920174 +0000 UTC m=+0.011065185 image pull 654f31e6858eb235bbece362255b685a945f2b6a367e2b88c4930c984fbb214c quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc 2026-03-10T07:55:21.655 INFO:journalctl@ceph.osd.5.vm08.stdout:Mar 10 07:55:21 vm08.local conmon[124127]: conmon c2c85af066cf98d26bcb : Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-c2c85af066cf98d26bcb3cc85e7c14f2003682a7d0dabd0e9e7468828a558a5c.scope/container/memory.events 2026-03-10T07:55:21.655 INFO:journalctl@ceph.osd.5.vm08.stdout:Mar 10 07:55:21 vm08.local podman[124115]: 2026-03-10 07:55:21.615279359 +0000 UTC m=+0.206424370 
container died c2c85af066cf98d26bcb3cc85e7c14f2003682a7d0dabd0e9e7468828a558a5c (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508-osd-5-deactivate, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20260223, OSD_FLAVOR=default, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git) 2026-03-10T07:55:21.655 INFO:journalctl@ceph.osd.5.vm08.stdout:Mar 10 07:55:21 vm08.local podman[124115]: 2026-03-10 07:55:21.64851953 +0000 UTC m=+0.239664541 container remove c2c85af066cf98d26bcb3cc85e7c14f2003682a7d0dabd0e9e7468828a558a5c (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508-osd-5-deactivate, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, io.buildah.version=1.41.3, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0, CEPH_REF=squid, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.build-date=20260223, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9) 2026-03-10T07:55:21.918 INFO:journalctl@ceph.osd.5.vm08.stdout:Mar 10 
07:55:21 vm08.local systemd[1]: ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508@osd.5.service: Deactivated successfully. 2026-03-10T07:55:21.918 INFO:journalctl@ceph.osd.5.vm08.stdout:Mar 10 07:55:21 vm08.local systemd[1]: ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508@osd.5.service: Unit process 124127 (conmon) remains running after unit stopped. 2026-03-10T07:55:21.918 INFO:journalctl@ceph.osd.5.vm08.stdout:Mar 10 07:55:21 vm08.local systemd[1]: ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508@osd.5.service: Unit process 124138 (podman) remains running after unit stopped. 2026-03-10T07:55:21.918 INFO:journalctl@ceph.osd.5.vm08.stdout:Mar 10 07:55:21 vm08.local systemd[1]: Stopped Ceph osd.5 for 12e9780e-1c55-11f1-8896-79f7c2e9b508. 2026-03-10T07:55:21.918 INFO:journalctl@ceph.osd.5.vm08.stdout:Mar 10 07:55:21 vm08.local systemd[1]: ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508@osd.5.service: Consumed 34.994s CPU time, 823.5M memory peak. 2026-03-10T07:55:21.918 INFO:journalctl@ceph.osd.5.vm08.stdout:Mar 10 07:55:21 vm08.local systemd[1]: Starting Ceph osd.5 for 12e9780e-1c55-11f1-8896-79f7c2e9b508... 
2026-03-10T07:55:22.407 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:55:22 vm05.local ceph-mon[130117]: pgmap v122: 65 pgs: 65 active+clean; 255 MiB data, 1.3 GiB used, 119 GiB / 120 GiB avail 2026-03-10T07:55:22.407 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:55:22 vm05.local ceph-mon[130117]: Health check failed: 1 osds down (OSD_DOWN) 2026-03-10T07:55:22.407 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:55:22 vm05.local ceph-mon[130117]: Health check failed: all OSDs are running squid or later but require_osd_release < squid (OSD_UPGRADE_FINISHED) 2026-03-10T07:55:22.407 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:55:22 vm05.local ceph-mon[130117]: osdmap e76: 6 total, 5 up, 6 in 2026-03-10T07:55:22.407 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:55:22 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:55:22.418 INFO:journalctl@ceph.osd.5.vm08.stdout:Mar 10 07:55:21 vm08.local podman[124221]: 2026-03-10 07:55:21.957533724 +0000 UTC m=+0.015225001 container create 6ae91b9c96e2c2d85bb2a4c2d95a8c1096a32b07a6f1cb1a19cbb7970a6a3be9 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508-osd-5-activate, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260223, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, io.buildah.version=1.41.3, ceph=True, org.label-schema.vendor=CentOS) 2026-03-10T07:55:22.418 
INFO:journalctl@ceph.osd.5.vm08.stdout:Mar 10 07:55:21 vm08.local podman[124221]: 2026-03-10 07:55:21.997251079 +0000 UTC m=+0.054942366 container init 6ae91b9c96e2c2d85bb2a4c2d95a8c1096a32b07a6f1cb1a19cbb7970a6a3be9 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508-osd-5-activate, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.opencontainers.image.authors=Ceph Release Team , FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20260223, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, ceph=True, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, CEPH_REF=squid) 2026-03-10T07:55:22.418 INFO:journalctl@ceph.osd.5.vm08.stdout:Mar 10 07:55:22 vm08.local podman[124221]: 2026-03-10 07:55:22.000058155 +0000 UTC m=+0.057749441 container start 6ae91b9c96e2c2d85bb2a4c2d95a8c1096a32b07a6f1cb1a19cbb7970a6a3be9 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508-osd-5-activate, CEPH_REF=squid, org.label-schema.build-date=20260223, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.name=CentOS Stream 9 Base Image, ceph=True, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, 
CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3) 2026-03-10T07:55:22.418 INFO:journalctl@ceph.osd.5.vm08.stdout:Mar 10 07:55:22 vm08.local podman[124221]: 2026-03-10 07:55:22.005291992 +0000 UTC m=+0.062983287 container attach 6ae91b9c96e2c2d85bb2a4c2d95a8c1096a32b07a6f1cb1a19cbb7970a6a3be9 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508-osd-5-activate, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team , CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, CEPH_REF=squid, org.label-schema.build-date=20260223, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, ceph=True, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.schema-version=1.0) 2026-03-10T07:55:22.418 INFO:journalctl@ceph.osd.5.vm08.stdout:Mar 10 07:55:22 vm08.local podman[124221]: 2026-03-10 07:55:21.951420881 +0000 UTC m=+0.009112167 image pull 654f31e6858eb235bbece362255b685a945f2b6a367e2b88c4930c984fbb214c quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc 2026-03-10T07:55:22.418 INFO:journalctl@ceph.osd.5.vm08.stdout:Mar 10 07:55:22 vm08.local ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508-osd-5-activate[124231]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-10T07:55:22.419 INFO:journalctl@ceph.osd.5.vm08.stdout:Mar 10 07:55:22 vm08.local bash[124221]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-10T07:55:22.419 INFO:journalctl@ceph.osd.5.vm08.stdout:Mar 10 07:55:22 vm08.local 
ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508-osd-5-activate[124231]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-10T07:55:22.419 INFO:journalctl@ceph.osd.5.vm08.stdout:Mar 10 07:55:22 vm08.local bash[124221]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-10T07:55:22.419 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:55:22 vm08.local ceph-mon[107898]: pgmap v122: 65 pgs: 65 active+clean; 255 MiB data, 1.3 GiB used, 119 GiB / 120 GiB avail 2026-03-10T07:55:22.419 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:55:22 vm08.local ceph-mon[107898]: Health check failed: 1 osds down (OSD_DOWN) 2026-03-10T07:55:22.419 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:55:22 vm08.local ceph-mon[107898]: Health check failed: all OSDs are running squid or later but require_osd_release < squid (OSD_UPGRADE_FINISHED) 2026-03-10T07:55:22.419 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:55:22 vm08.local ceph-mon[107898]: osdmap e76: 6 total, 5 up, 6 in 2026-03-10T07:55:22.419 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:55:22 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:55:22.844 INFO:journalctl@ceph.osd.5.vm08.stdout:Mar 10 07:55:22 vm08.local ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508-osd-5-activate[124231]: --> Failed to activate via raw: did not find any matching OSD to activate 2026-03-10T07:55:22.844 INFO:journalctl@ceph.osd.5.vm08.stdout:Mar 10 07:55:22 vm08.local ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508-osd-5-activate[124231]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-10T07:55:22.844 INFO:journalctl@ceph.osd.5.vm08.stdout:Mar 10 07:55:22 vm08.local bash[124221]: --> Failed to activate via raw: did not find any matching OSD to activate 2026-03-10T07:55:22.844 INFO:journalctl@ceph.osd.5.vm08.stdout:Mar 10 07:55:22 vm08.local bash[124221]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-10T07:55:22.844 
INFO:journalctl@ceph.osd.5.vm08.stdout:Mar 10 07:55:22 vm08.local ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508-osd-5-activate[124231]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-10T07:55:22.845 INFO:journalctl@ceph.osd.5.vm08.stdout:Mar 10 07:55:22 vm08.local bash[124221]: Running command: /usr/bin/ceph-authtool --gen-print-key 2026-03-10T07:55:22.845 INFO:journalctl@ceph.osd.5.vm08.stdout:Mar 10 07:55:22 vm08.local ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508-osd-5-activate[124231]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-5 2026-03-10T07:55:22.845 INFO:journalctl@ceph.osd.5.vm08.stdout:Mar 10 07:55:22 vm08.local bash[124221]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-5 2026-03-10T07:55:22.845 INFO:journalctl@ceph.osd.5.vm08.stdout:Mar 10 07:55:22 vm08.local ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508-osd-5-activate[124231]: Running command: /usr/bin/ceph-bluestore-tool --cluster=ceph prime-osd-dir --dev /dev/ceph-b125fe6d-7d41-44e2-95ed-86c595d15f65/osd-block-ef1ae183-14f2-4400-85d6-c48b79ef2819 --path /var/lib/ceph/osd/ceph-5 --no-mon-config 2026-03-10T07:55:22.845 INFO:journalctl@ceph.osd.5.vm08.stdout:Mar 10 07:55:22 vm08.local bash[124221]: Running command: /usr/bin/ceph-bluestore-tool --cluster=ceph prime-osd-dir --dev /dev/ceph-b125fe6d-7d41-44e2-95ed-86c595d15f65/osd-block-ef1ae183-14f2-4400-85d6-c48b79ef2819 --path /var/lib/ceph/osd/ceph-5 --no-mon-config 2026-03-10T07:55:22.845 INFO:journalctl@ceph.osd.5.vm08.stdout:Mar 10 07:55:22 vm08.local ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508-osd-5-activate[124231]: Running command: /usr/bin/ln -snf /dev/ceph-b125fe6d-7d41-44e2-95ed-86c595d15f65/osd-block-ef1ae183-14f2-4400-85d6-c48b79ef2819 /var/lib/ceph/osd/ceph-5/block 2026-03-10T07:55:23.122 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:55:23 vm08.local ceph-mon[107898]: osdmap e77: 6 total, 5 up, 6 in 2026-03-10T07:55:23.122 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:55:23 
vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:55:23.122 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:55:23 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:55:23.122 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:55:23 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T07:55:23.124 INFO:journalctl@ceph.osd.5.vm08.stdout:Mar 10 07:55:22 vm08.local bash[124221]: Running command: /usr/bin/ln -snf /dev/ceph-b125fe6d-7d41-44e2-95ed-86c595d15f65/osd-block-ef1ae183-14f2-4400-85d6-c48b79ef2819 /var/lib/ceph/osd/ceph-5/block 2026-03-10T07:55:23.124 INFO:journalctl@ceph.osd.5.vm08.stdout:Mar 10 07:55:22 vm08.local ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508-osd-5-activate[124231]: Running command: /usr/bin/chown -h ceph:ceph /var/lib/ceph/osd/ceph-5/block 2026-03-10T07:55:23.124 INFO:journalctl@ceph.osd.5.vm08.stdout:Mar 10 07:55:22 vm08.local bash[124221]: Running command: /usr/bin/chown -h ceph:ceph /var/lib/ceph/osd/ceph-5/block 2026-03-10T07:55:23.124 INFO:journalctl@ceph.osd.5.vm08.stdout:Mar 10 07:55:22 vm08.local ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508-osd-5-activate[124231]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-2 2026-03-10T07:55:23.124 INFO:journalctl@ceph.osd.5.vm08.stdout:Mar 10 07:55:22 vm08.local bash[124221]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-2 2026-03-10T07:55:23.124 INFO:journalctl@ceph.osd.5.vm08.stdout:Mar 10 07:55:22 vm08.local ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508-osd-5-activate[124231]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-5 2026-03-10T07:55:23.124 INFO:journalctl@ceph.osd.5.vm08.stdout:Mar 10 07:55:22 vm08.local bash[124221]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-5 2026-03-10T07:55:23.124 
INFO:journalctl@ceph.osd.5.vm08.stdout:Mar 10 07:55:22 vm08.local ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508-osd-5-activate[124231]: --> ceph-volume lvm activate successful for osd ID: 5 2026-03-10T07:55:23.124 INFO:journalctl@ceph.osd.5.vm08.stdout:Mar 10 07:55:22 vm08.local bash[124221]: --> ceph-volume lvm activate successful for osd ID: 5 2026-03-10T07:55:23.124 INFO:journalctl@ceph.osd.5.vm08.stdout:Mar 10 07:55:22 vm08.local conmon[124231]: conmon 6ae91b9c96e2c2d85bb2 : Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-6ae91b9c96e2c2d85bb2a4c2d95a8c1096a32b07a6f1cb1a19cbb7970a6a3be9.scope/container/memory.events 2026-03-10T07:55:23.124 INFO:journalctl@ceph.osd.5.vm08.stdout:Mar 10 07:55:22 vm08.local podman[124221]: 2026-03-10 07:55:22.871708338 +0000 UTC m=+0.929399634 container died 6ae91b9c96e2c2d85bb2a4c2d95a8c1096a32b07a6f1cb1a19cbb7970a6a3be9 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508-osd-5-activate, FROM_IMAGE=quay.io/centos/centos:stream9, io.buildah.version=1.41.3, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260223, OSD_FLAVOR=default, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, ceph=True, org.label-schema.vendor=CentOS, CEPH_REF=squid, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/) 2026-03-10T07:55:23.124 INFO:journalctl@ceph.osd.5.vm08.stdout:Mar 10 07:55:22 vm08.local podman[124221]: 2026-03-10 07:55:22.89634805 +0000 UTC m=+0.954039336 container remove 6ae91b9c96e2c2d85bb2a4c2d95a8c1096a32b07a6f1cb1a19cbb7970a6a3be9 
(image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508-osd-5-activate, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid, io.buildah.version=1.41.3, org.label-schema.build-date=20260223, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team , OSD_FLAVOR=default, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, ceph=True, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.schema-version=1.0) 2026-03-10T07:55:23.124 INFO:journalctl@ceph.osd.5.vm08.stdout:Mar 10 07:55:22 vm08.local podman[124472]: 2026-03-10 07:55:22.981401368 +0000 UTC m=+0.013950875 container create a4a8929822a27be3014918e92bd2cd6cbdcb35208c0788a66211d421d9c4c05e (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508-osd-5, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, CEPH_REF=squid, org.label-schema.license=GPLv2, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, org.label-schema.build-date=20260223, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team , CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS) 2026-03-10T07:55:23.124 INFO:journalctl@ceph.osd.5.vm08.stdout:Mar 10 07:55:23 vm08.local podman[124472]: 2026-03-10 07:55:23.019821797 +0000 UTC m=+0.052371304 
container init a4a8929822a27be3014918e92bd2cd6cbdcb35208c0788a66211d421d9c4c05e (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508-osd-5, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, OSD_FLAVOR=default, org.label-schema.build-date=20260223, io.buildah.version=1.41.3, ceph=True, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, CEPH_REF=squid, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/) 2026-03-10T07:55:23.124 INFO:journalctl@ceph.osd.5.vm08.stdout:Mar 10 07:55:23 vm08.local podman[124472]: 2026-03-10 07:55:23.022333398 +0000 UTC m=+0.054882905 container start a4a8929822a27be3014918e92bd2cd6cbdcb35208c0788a66211d421d9c4c05e (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508-osd-5, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team , io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid, ceph=True, OSD_FLAVOR=default, org.label-schema.build-date=20260223) 2026-03-10T07:55:23.124 INFO:journalctl@ceph.osd.5.vm08.stdout:Mar 10 07:55:23 vm08.local 
bash[124472]: a4a8929822a27be3014918e92bd2cd6cbdcb35208c0788a66211d421d9c4c05e 2026-03-10T07:55:23.124 INFO:journalctl@ceph.osd.5.vm08.stdout:Mar 10 07:55:23 vm08.local podman[124472]: 2026-03-10 07:55:22.975554333 +0000 UTC m=+0.008103850 image pull 654f31e6858eb235bbece362255b685a945f2b6a367e2b88c4930c984fbb214c quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc 2026-03-10T07:55:23.124 INFO:journalctl@ceph.osd.5.vm08.stdout:Mar 10 07:55:23 vm08.local systemd[1]: Started Ceph osd.5 for 12e9780e-1c55-11f1-8896-79f7c2e9b508. 2026-03-10T07:55:23.407 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:55:23 vm05.local ceph-mon[130117]: osdmap e77: 6 total, 5 up, 6 in 2026-03-10T07:55:23.407 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:55:23 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:55:23.407 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:55:23 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:55:23.407 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:55:23 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T07:55:23.718 INFO:journalctl@ceph.osd.5.vm08.stdout:Mar 10 07:55:23 vm08.local ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508-osd-5[124483]: 2026-03-10T07:55:23.589+0000 7fc0b70d6740 -1 Falling back to public interface 2026-03-10T07:55:24.124 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:55:24 vm08.local ceph-mon[107898]: pgmap v125: 65 pgs: 1 active+undersized, 17 peering, 5 stale+active+clean, 2 active+undersized+degraded, 40 active+clean; 255 MiB data, 1.3 GiB used, 119 GiB / 120 GiB avail; 3/261 objects degraded (1.149%) 2026-03-10T07:55:24.124 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:55:24 vm08.local ceph-mon[107898]: Health check 
failed: Reduced data availability: 7 pgs peering (PG_AVAILABILITY) 2026-03-10T07:55:24.124 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:55:24 vm08.local ceph-mon[107898]: Health check failed: Degraded data redundancy: 3/261 objects degraded (1.149%), 2 pgs degraded (PG_DEGRADED) 2026-03-10T07:55:24.124 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:55:24 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:55:24.124 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:55:24 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:55:24.407 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:55:24 vm05.local ceph-mon[130117]: pgmap v125: 65 pgs: 1 active+undersized, 17 peering, 5 stale+active+clean, 2 active+undersized+degraded, 40 active+clean; 255 MiB data, 1.3 GiB used, 119 GiB / 120 GiB avail; 3/261 objects degraded (1.149%) 2026-03-10T07:55:24.407 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:55:24 vm05.local ceph-mon[130117]: Health check failed: Reduced data availability: 7 pgs peering (PG_AVAILABILITY) 2026-03-10T07:55:24.407 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:55:24 vm05.local ceph-mon[130117]: Health check failed: Degraded data redundancy: 3/261 objects degraded (1.149%), 2 pgs degraded (PG_DEGRADED) 2026-03-10T07:55:24.407 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:55:24 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:55:24.407 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:55:24 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:55:25.668 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:55:25 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:55:25.668 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 
07:55:25 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:55:25.668 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:55:25 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:55:25.668 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:55:25 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:55:25.668 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:55:25 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T07:55:25.668 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:55:25 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T07:55:25.668 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:55:25 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:55:25.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:55:25 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:55:25.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:55:25 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:55:25.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:55:25 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:55:25.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:55:25 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:55:25.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:55:25 vm05.local ceph-mon[130117]: from='mgr.34104 
192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T07:55:25.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:55:25 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T07:55:25.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:55:25 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:55:26.702 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:55:26 vm05.local ceph-mon[130117]: pgmap v126: 65 pgs: 4 active+undersized, 17 peering, 2 stale+active+clean, 4 active+undersized+degraded, 38 active+clean; 255 MiB data, 1.3 GiB used, 119 GiB / 120 GiB avail; 7/261 objects degraded (2.682%) 2026-03-10T07:55:26.702 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:55:26 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T07:55:26.702 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:55:26 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T07:55:26.702 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:55:26 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T07:55:26.702 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:55:26 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T07:55:26.702 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:55:26 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "versions"}]: dispatch 
2026-03-10T07:55:26.702 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:55:26 vm05.local ceph-mon[130117]: Upgrade: Setting container_image for all osd 2026-03-10T07:55:26.702 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:55:26 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:55:26.702 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:55:26 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "config rm", "name": "container_image", "who": "osd.0"}]: dispatch 2026-03-10T07:55:26.702 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:55:26 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd='[{"prefix": "config rm", "name": "container_image", "who": "osd.0"}]': finished 2026-03-10T07:55:26.702 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:55:26 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "config rm", "name": "container_image", "who": "osd.1"}]: dispatch 2026-03-10T07:55:26.702 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:55:26 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd='[{"prefix": "config rm", "name": "container_image", "who": "osd.1"}]': finished 2026-03-10T07:55:26.702 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:55:26 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "config rm", "name": "container_image", "who": "osd.2"}]: dispatch 2026-03-10T07:55:26.702 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:55:26 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd='[{"prefix": "config rm", "name": "container_image", "who": "osd.2"}]': finished 2026-03-10T07:55:26.702 
INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:55:26 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "config rm", "name": "container_image", "who": "osd.3"}]: dispatch 2026-03-10T07:55:26.702 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:55:26 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd='[{"prefix": "config rm", "name": "container_image", "who": "osd.3"}]': finished 2026-03-10T07:55:26.702 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:55:26 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "config rm", "name": "container_image", "who": "osd.4"}]: dispatch 2026-03-10T07:55:26.702 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:55:26 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd='[{"prefix": "config rm", "name": "container_image", "who": "osd.4"}]': finished 2026-03-10T07:55:26.702 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:55:26 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "config rm", "name": "container_image", "who": "osd.5"}]: dispatch 2026-03-10T07:55:26.702 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:55:26 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd='[{"prefix": "config rm", "name": "container_image", "who": "osd.5"}]': finished 2026-03-10T07:55:26.702 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:55:26 vm05.local ceph-mon[130117]: Upgrade: Setting require_osd_release to 19 squid 2026-03-10T07:55:26.702 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:55:26 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "osd require-osd-release", "release": "squid"}]: dispatch 
2026-03-10T07:55:26.709 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:55:26 vm08.local ceph-mon[107898]: pgmap v126: 65 pgs: 4 active+undersized, 17 peering, 2 stale+active+clean, 4 active+undersized+degraded, 38 active+clean; 255 MiB data, 1.3 GiB used, 119 GiB / 120 GiB avail; 7/261 objects degraded (2.682%) 2026-03-10T07:55:26.709 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:55:26 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T07:55:26.709 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:55:26 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T07:55:26.709 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:55:26 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T07:55:26.709 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:55:26 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T07:55:26.709 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:55:26 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T07:55:26.709 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:55:26 vm08.local ceph-mon[107898]: Upgrade: Setting container_image for all osd 2026-03-10T07:55:26.709 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:55:26 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:55:26.709 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:55:26 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "config rm", 
"name": "container_image", "who": "osd.0"}]: dispatch 2026-03-10T07:55:26.709 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:55:26 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd='[{"prefix": "config rm", "name": "container_image", "who": "osd.0"}]': finished 2026-03-10T07:55:26.709 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:55:26 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "config rm", "name": "container_image", "who": "osd.1"}]: dispatch 2026-03-10T07:55:26.709 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:55:26 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd='[{"prefix": "config rm", "name": "container_image", "who": "osd.1"}]': finished 2026-03-10T07:55:26.709 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:55:26 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "config rm", "name": "container_image", "who": "osd.2"}]: dispatch 2026-03-10T07:55:26.709 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:55:26 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd='[{"prefix": "config rm", "name": "container_image", "who": "osd.2"}]': finished 2026-03-10T07:55:26.709 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:55:26 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "config rm", "name": "container_image", "who": "osd.3"}]: dispatch 2026-03-10T07:55:26.709 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:55:26 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd='[{"prefix": "config rm", "name": "container_image", "who": "osd.3"}]': finished 2026-03-10T07:55:26.709 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:55:26 
vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "config rm", "name": "container_image", "who": "osd.4"}]: dispatch 2026-03-10T07:55:26.709 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:55:26 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd='[{"prefix": "config rm", "name": "container_image", "who": "osd.4"}]': finished 2026-03-10T07:55:26.709 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:55:26 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "config rm", "name": "container_image", "who": "osd.5"}]: dispatch 2026-03-10T07:55:26.709 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:55:26 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd='[{"prefix": "config rm", "name": "container_image", "who": "osd.5"}]': finished 2026-03-10T07:55:26.709 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:55:26 vm08.local ceph-mon[107898]: Upgrade: Setting require_osd_release to 19 squid 2026-03-10T07:55:26.709 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:55:26 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "osd require-osd-release", "release": "squid"}]: dispatch 2026-03-10T07:55:27.019 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:27.018+0000 7ff99c67a700 1 -- 192.168.123.105:0/3247800350 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff994101120 msgr2=0x7ff994101500 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:55:27.020 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:27.018+0000 7ff99c67a700 1 --2- 192.168.123.105:0/3247800350 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff994101120 0x7ff994101500 secure :-1 s=READY pgs=80 cs=0 l=1 rev1=1 crypto rx=0x7ff990009b00 
tx=0x7ff990009e10 comp rx=0 tx=0).stop 2026-03-10T07:55:27.020 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:27.018+0000 7ff99c67a700 1 -- 192.168.123.105:0/3247800350 shutdown_connections 2026-03-10T07:55:27.020 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:27.018+0000 7ff99c67a700 1 --2- 192.168.123.105:0/3247800350 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7ff994101ad0 0x7ff994105b20 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:55:27.020 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:27.018+0000 7ff99c67a700 1 --2- 192.168.123.105:0/3247800350 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff994101120 0x7ff994101500 unknown :-1 s=CLOSED pgs=80 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:55:27.020 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:27.018+0000 7ff99c67a700 1 -- 192.168.123.105:0/3247800350 >> 192.168.123.105:0/3247800350 conn(0x7ff9940fc9b0 msgr2=0x7ff9940fedd0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:55:27.020 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:27.019+0000 7ff99c67a700 1 -- 192.168.123.105:0/3247800350 shutdown_connections 2026-03-10T07:55:27.020 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:27.019+0000 7ff99c67a700 1 -- 192.168.123.105:0/3247800350 wait complete. 
2026-03-10T07:55:27.020 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:27.019+0000 7ff99c67a700 1 Processor -- start 2026-03-10T07:55:27.020 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:27.019+0000 7ff99c67a700 1 -- start start 2026-03-10T07:55:27.020 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:27.019+0000 7ff99c67a700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7ff994101ad0 0x7ff99410fc00 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:55:27.020 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:27.019+0000 7ff99c67a700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff994110140 0x7ff9941145a0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:55:27.020 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:27.019+0000 7ff99c67a700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7ff994110650 con 0x7ff994110140 2026-03-10T07:55:27.020 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:27.019+0000 7ff99c67a700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7ff994110790 con 0x7ff994101ad0 2026-03-10T07:55:27.021 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:27.020+0000 7ff999c15700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff994110140 0x7ff9941145a0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:55:27.023 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:27.020+0000 7ff999c15700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff994110140 0x7ff9941145a0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am 
v2:192.168.123.105:38330/0 (socket says 192.168.123.105:38330) 2026-03-10T07:55:27.023 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:27.020+0000 7ff999c15700 1 -- 192.168.123.105:0/710470200 learned_addr learned my addr 192.168.123.105:0/710470200 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T07:55:27.023 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:27.023+0000 7ff99a416700 1 --2- 192.168.123.105:0/710470200 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7ff994101ad0 0x7ff99410fc00 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:55:27.024 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:27.023+0000 7ff999c15700 1 -- 192.168.123.105:0/710470200 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7ff994101ad0 msgr2=0x7ff99410fc00 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:55:27.024 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:27.023+0000 7ff999c15700 1 --2- 192.168.123.105:0/710470200 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7ff994101ad0 0x7ff99410fc00 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:55:27.024 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:27.023+0000 7ff999c15700 1 -- 192.168.123.105:0/710470200 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7ff9900097e0 con 0x7ff994110140 2026-03-10T07:55:27.025 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:27.023+0000 7ff999c15700 1 --2- 192.168.123.105:0/710470200 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff994110140 0x7ff9941145a0 secure :-1 s=READY pgs=81 cs=0 l=1 rev1=1 crypto rx=0x7ff98c00c370 tx=0x7ff98c00c730 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 
2026-03-10T07:55:27.025 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:27.024+0000 7ff98b7fe700 1 -- 192.168.123.105:0/710470200 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7ff98c00e050 con 0x7ff994110140 2026-03-10T07:55:27.025 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:27.024+0000 7ff99c67a700 1 -- 192.168.123.105:0/710470200 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7ff994114ae0 con 0x7ff994110140 2026-03-10T07:55:27.025 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:27.024+0000 7ff99c67a700 1 -- 192.168.123.105:0/710470200 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7ff994115000 con 0x7ff994110140 2026-03-10T07:55:27.028 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:27.024+0000 7ff98b7fe700 1 -- 192.168.123.105:0/710470200 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7ff98c00f040 con 0x7ff994110140 2026-03-10T07:55:27.028 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:27.024+0000 7ff98b7fe700 1 -- 192.168.123.105:0/710470200 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7ff98c013610 con 0x7ff994110140 2026-03-10T07:55:27.028 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:27.025+0000 7ff99c67a700 1 -- 192.168.123.105:0/710470200 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7ff978005320 con 0x7ff994110140 2026-03-10T07:55:27.029 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:27.026+0000 7ff98b7fe700 1 -- 192.168.123.105:0/710470200 <== mon.0 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 38) v1 ==== 100115+0+0 (secure 0 0 0) 0x7ff98c0090d0 con 0x7ff994110140 2026-03-10T07:55:27.029 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:27.026+0000 
7ff98b7fe700 1 --2- 192.168.123.105:0/710470200 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7ff9800779e0 0x7ff980079ea0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:55:27.029 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:27.026+0000 7ff98b7fe700 1 -- 192.168.123.105:0/710470200 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(78..78 src has 1..78) v4 ==== 6136+0+0 (secure 0 0 0) 0x7ff98c099620 con 0x7ff994110140 2026-03-10T07:55:27.029 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:27.029+0000 7ff99a416700 1 --2- 192.168.123.105:0/710470200 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7ff9800779e0 0x7ff980079ea0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:55:27.030 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:27.029+0000 7ff98b7fe700 1 -- 192.168.123.105:0/710470200 <== mon.0 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7ff98c061ed0 con 0x7ff994110140 2026-03-10T07:55:27.030 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:27.030+0000 7ff99a416700 1 --2- 192.168.123.105:0/710470200 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7ff9800779e0 0x7ff980079ea0 secure :-1 s=READY pgs=60 cs=0 l=1 rev1=1 crypto rx=0x7ff990000c00 tx=0x7ff990011040 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:55:27.193 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:27.192+0000 7ff99c67a700 1 -- 192.168.123.105:0/710470200 --> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7ff978000bf0 con 0x7ff9800779e0 
2026-03-10T07:55:27.195 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:27.194+0000 7ff98b7fe700 1 -- 192.168.123.105:0/710470200 <== mgr.34104 v2:192.168.123.105:6800/627686298 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+416 (secure 0 0 0) 0x7ff978000bf0 con 0x7ff9800779e0 2026-03-10T07:55:27.198 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:27.198+0000 7ff9897fa700 1 -- 192.168.123.105:0/710470200 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7ff9800779e0 msgr2=0x7ff980079ea0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:55:27.198 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:27.198+0000 7ff9897fa700 1 --2- 192.168.123.105:0/710470200 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7ff9800779e0 0x7ff980079ea0 secure :-1 s=READY pgs=60 cs=0 l=1 rev1=1 crypto rx=0x7ff990000c00 tx=0x7ff990011040 comp rx=0 tx=0).stop 2026-03-10T07:55:27.198 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:27.198+0000 7ff9897fa700 1 -- 192.168.123.105:0/710470200 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff994110140 msgr2=0x7ff9941145a0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:55:27.198 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:27.198+0000 7ff9897fa700 1 --2- 192.168.123.105:0/710470200 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff994110140 0x7ff9941145a0 secure :-1 s=READY pgs=81 cs=0 l=1 rev1=1 crypto rx=0x7ff98c00c370 tx=0x7ff98c00c730 comp rx=0 tx=0).stop 2026-03-10T07:55:27.199 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:27.198+0000 7ff9897fa700 1 -- 192.168.123.105:0/710470200 shutdown_connections 2026-03-10T07:55:27.199 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:27.198+0000 7ff9897fa700 1 --2- 192.168.123.105:0/710470200 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7ff994101ad0 0x7ff99410fc00 
unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:55:27.199 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:27.198+0000 7ff9897fa700 1 --2- 192.168.123.105:0/710470200 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7ff9800779e0 0x7ff980079ea0 unknown :-1 s=CLOSED pgs=60 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:55:27.199 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:27.198+0000 7ff9897fa700 1 --2- 192.168.123.105:0/710470200 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff994110140 0x7ff9941145a0 unknown :-1 s=CLOSED pgs=81 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:55:27.199 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:27.198+0000 7ff9897fa700 1 -- 192.168.123.105:0/710470200 >> 192.168.123.105:0/710470200 conn(0x7ff9940fc9b0 msgr2=0x7ff9940fd4b0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:55:27.199 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:27.199+0000 7ff9897fa700 1 -- 192.168.123.105:0/710470200 shutdown_connections 2026-03-10T07:55:27.199 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:27.199+0000 7ff9897fa700 1 -- 192.168.123.105:0/710470200 wait complete. 
2026-03-10T07:55:27.213 INFO:teuthology.orchestra.run.vm05.stdout:true 2026-03-10T07:55:27.273 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:27.273+0000 7faebbfff700 1 -- 192.168.123.105:0/3846190169 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7faebc1030a0 msgr2=0x7faebc103480 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:55:27.274 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:27.273+0000 7faebbfff700 1 --2- 192.168.123.105:0/3846190169 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7faebc1030a0 0x7faebc103480 secure :-1 s=READY pgs=82 cs=0 l=1 rev1=1 crypto rx=0x7faeac009b00 tx=0x7faeac009e10 comp rx=0 tx=0).stop 2026-03-10T07:55:27.274 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:27.274+0000 7faebbfff700 1 -- 192.168.123.105:0/3846190169 shutdown_connections 2026-03-10T07:55:27.274 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:27.274+0000 7faebbfff700 1 --2- 192.168.123.105:0/3846190169 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7faebc103a50 0x7faebc107aa0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:55:27.274 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:27.274+0000 7faebbfff700 1 --2- 192.168.123.105:0/3846190169 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7faebc1030a0 0x7faebc103480 unknown :-1 s=CLOSED pgs=82 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:55:27.274 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:27.274+0000 7faebbfff700 1 -- 192.168.123.105:0/3846190169 >> 192.168.123.105:0/3846190169 conn(0x7faebc0fe930 msgr2=0x7faebc100d50 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:55:27.274 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:27.274+0000 7faebbfff700 1 -- 192.168.123.105:0/3846190169 shutdown_connections 2026-03-10T07:55:27.274 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:27.274+0000 7faebbfff700 1 -- 192.168.123.105:0/3846190169 wait complete. 2026-03-10T07:55:27.275 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:27.275+0000 7faebbfff700 1 Processor -- start 2026-03-10T07:55:27.275 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:27.275+0000 7faebbfff700 1 -- start start 2026-03-10T07:55:27.275 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:27.275+0000 7faebbfff700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7faebc1030a0 0x7faebc19e830 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:55:27.276 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:27.276+0000 7faebaffd700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7faebc1030a0 0x7faebc19e830 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:55:27.276 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:27.276+0000 7faebaffd700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7faebc1030a0 0x7faebc19e830 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.105:38352/0 (socket says 192.168.123.105:38352) 2026-03-10T07:55:27.276 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:27.276+0000 7faebaffd700 1 -- 192.168.123.105:0/3395981345 learned_addr learned my addr 192.168.123.105:0/3395981345 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T07:55:27.276 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:27.276+0000 7faebbfff700 1 --2- 192.168.123.105:0/3395981345 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7faebc103a50 0x7faebc19ed70 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 
2026-03-10T07:55:27.276 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:27.276+0000 7faebbfff700 1 -- 192.168.123.105:0/3395981345 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7faebc19f400 con 0x7faebc1030a0 2026-03-10T07:55:27.276 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:27.276+0000 7faebbfff700 1 -- 192.168.123.105:0/3395981345 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7faebc198ac0 con 0x7faebc103a50 2026-03-10T07:55:27.276 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:27.276+0000 7faeba7fc700 1 --2- 192.168.123.105:0/3395981345 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7faebc103a50 0x7faebc19ed70 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:55:27.277 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:27.276+0000 7faebaffd700 1 -- 192.168.123.105:0/3395981345 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7faebc103a50 msgr2=0x7faebc19ed70 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:55:27.277 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:27.276+0000 7faebaffd700 1 --2- 192.168.123.105:0/3395981345 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7faebc103a50 0x7faebc19ed70 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:55:27.277 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:27.276+0000 7faebaffd700 1 -- 192.168.123.105:0/3395981345 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7faeac0097e0 con 0x7faebc1030a0 2026-03-10T07:55:27.277 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:27.277+0000 7faebaffd700 1 --2- 192.168.123.105:0/3395981345 >> 
[v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7faebc1030a0 0x7faebc19e830 secure :-1 s=READY pgs=83 cs=0 l=1 rev1=1 crypto rx=0x7faeac005f50 tx=0x7faeac004c30 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:55:27.277 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:27.277+0000 7faea3fff700 1 -- 192.168.123.105:0/3395981345 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7faeac01d070 con 0x7faebc1030a0 2026-03-10T07:55:27.277 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:27.277+0000 7faea3fff700 1 -- 192.168.123.105:0/3395981345 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7faeac00bc50 con 0x7faebc1030a0 2026-03-10T07:55:27.278 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:27.277+0000 7faea3fff700 1 -- 192.168.123.105:0/3395981345 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7faeac00f890 con 0x7faebc1030a0 2026-03-10T07:55:27.278 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:27.278+0000 7faebbfff700 1 -- 192.168.123.105:0/3395981345 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7faebc198d40 con 0x7faebc1030a0 2026-03-10T07:55:27.278 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:27.278+0000 7faebbfff700 1 -- 192.168.123.105:0/3395981345 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7faebc1991b0 con 0x7faebc1030a0 2026-03-10T07:55:27.279 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:27.278+0000 7faebbfff700 1 -- 192.168.123.105:0/3395981345 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7faebc10b3d0 con 0x7faebc1030a0 2026-03-10T07:55:27.282 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:27.282+0000 
7faea3fff700 1 -- 192.168.123.105:0/3395981345 <== mon.0 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 38) v1 ==== 100115+0+0 (secure 0 0 0) 0x7faeac00f9f0 con 0x7faebc1030a0 2026-03-10T07:55:27.282 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:27.282+0000 7faea3fff700 1 --2- 192.168.123.105:0/3395981345 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7faea4077700 0x7faea4079bc0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:55:27.282 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:27.282+0000 7faea3fff700 1 -- 192.168.123.105:0/3395981345 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(78..78 src has 1..78) v4 ==== 6136+0+0 (secure 0 0 0) 0x7faeac09bc80 con 0x7faebc1030a0 2026-03-10T07:55:27.283 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:27.282+0000 7faeba7fc700 1 --2- 192.168.123.105:0/3395981345 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7faea4077700 0x7faea4079bc0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:55:27.283 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:27.283+0000 7faeba7fc700 1 --2- 192.168.123.105:0/3395981345 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7faea4077700 0x7faea4079bc0 secure :-1 s=READY pgs=61 cs=0 l=1 rev1=1 crypto rx=0x7faebc0ff150 tx=0x7faeb000b410 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:55:27.283 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:27.283+0000 7faea3fff700 1 -- 192.168.123.105:0/3395981345 <== mon.0 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7faeac064530 con 0x7faebc1030a0 2026-03-10T07:55:27.419 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:27.419+0000 7faebbfff700 1 -- 192.168.123.105:0/3395981345 --> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7faebc104580 con 0x7faea4077700 2026-03-10T07:55:27.422 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:27.420+0000 7faea3fff700 1 -- 192.168.123.105:0/3395981345 <== mgr.34104 v2:192.168.123.105:6800/627686298 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+416 (secure 0 0 0) 0x7faebc104580 con 0x7faea4077700 2026-03-10T07:55:27.423 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:27.422+0000 7faebbfff700 1 -- 192.168.123.105:0/3395981345 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7faea4077700 msgr2=0x7faea4079bc0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:55:27.423 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:27.422+0000 7faebbfff700 1 --2- 192.168.123.105:0/3395981345 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7faea4077700 0x7faea4079bc0 secure :-1 s=READY pgs=61 cs=0 l=1 rev1=1 crypto rx=0x7faebc0ff150 tx=0x7faeb000b410 comp rx=0 tx=0).stop 2026-03-10T07:55:27.423 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:27.423+0000 7faebbfff700 1 -- 192.168.123.105:0/3395981345 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7faebc1030a0 msgr2=0x7faebc19e830 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:55:27.423 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:27.423+0000 7faebbfff700 1 --2- 192.168.123.105:0/3395981345 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7faebc1030a0 0x7faebc19e830 secure :-1 s=READY pgs=83 cs=0 l=1 rev1=1 crypto rx=0x7faeac005f50 tx=0x7faeac004c30 comp rx=0 tx=0).stop 2026-03-10T07:55:27.423 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:27.423+0000 7faebbfff700 1 -- 192.168.123.105:0/3395981345 shutdown_connections 2026-03-10T07:55:27.423 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:27.423+0000 7faebbfff700 1 --2- 192.168.123.105:0/3395981345 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7faea4077700 0x7faea4079bc0 unknown :-1 s=CLOSED pgs=61 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:55:27.423 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:27.423+0000 7faebbfff700 1 --2- 192.168.123.105:0/3395981345 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7faebc1030a0 0x7faebc19e830 unknown :-1 s=CLOSED pgs=83 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:55:27.424 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:27.423+0000 7faebbfff700 1 --2- 192.168.123.105:0/3395981345 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7faebc103a50 0x7faebc19ed70 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:55:27.424 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:27.423+0000 7faebbfff700 1 -- 192.168.123.105:0/3395981345 >> 192.168.123.105:0/3395981345 conn(0x7faebc0fe930 msgr2=0x7faebc0fffd0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:55:27.424 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:27.423+0000 7faebbfff700 1 -- 192.168.123.105:0/3395981345 shutdown_connections 2026-03-10T07:55:27.424 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:27.423+0000 7faebbfff700 1 -- 192.168.123.105:0/3395981345 wait complete. 
2026-03-10T07:55:27.519 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:27.518+0000 7f3c12b6c700 1 -- 192.168.123.105:0/3248549688 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3c0c1089a0 msgr2=0x7f3c0c10be70 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:55:27.519 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:27.518+0000 7f3c12b6c700 1 --2- 192.168.123.105:0/3248549688 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3c0c1089a0 0x7f3c0c10be70 secure :-1 s=READY pgs=84 cs=0 l=1 rev1=1 crypto rx=0x7f3c0400d3f0 tx=0x7f3c0400d700 comp rx=0 tx=0).stop 2026-03-10T07:55:27.519 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:27.519+0000 7f3c12b6c700 1 -- 192.168.123.105:0/3248549688 shutdown_connections 2026-03-10T07:55:27.519 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:55:27 vm05.local ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508-mon-vm05[130113]: 2026-03-10T07:55:27.418+0000 7f980f7e7640 -1 log_channel(cluster) log [ERR] : Health check failed: 1 filesystem is offline (MDS_ALL_DOWN) 2026-03-10T07:55:27.519 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:55:27 vm05.local ceph-mon[130117]: Health check cleared: OSD_UPGRADE_FINISHED (was: all OSDs are running squid or later but require_osd_release < squid) 2026-03-10T07:55:27.519 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:55:27 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd='[{"prefix": "osd require-osd-release", "release": "squid"}]': finished 2026-03-10T07:55:27.519 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:55:27 vm05.local ceph-mon[130117]: osdmap e78: 6 total, 5 up, 6 in 2026-03-10T07:55:27.519 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:55:27 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:55:27.519 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:55:27 vm05.local 
ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T07:55:27.519 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:55:27 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:55:27.519 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:55:27 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "auth get-or-create", "entity": "mds.cephfs.vm05.omfhnh", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]: dispatch 2026-03-10T07:55:27.519 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:55:27 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T07:55:27.519 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:55:27 vm05.local ceph-mon[130117]: from='osd.5 [v2:192.168.123.108:6816/2090596459,v1:192.168.123.108:6817/2090596459]' entity='osd.5' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["5"]}]: dispatch 2026-03-10T07:55:27.519 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:55:27 vm05.local ceph-mon[130117]: from='osd.5 ' entity='osd.5' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["5"]}]: dispatch 2026-03-10T07:55:27.519 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:55:27 vm05.local ceph-mon[130117]: Health check failed: 1 filesystem is degraded (FS_DEGRADED) 2026-03-10T07:55:27.519 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:55:27 vm05.local ceph-mon[130117]: Health check failed: 1 filesystem is offline (MDS_ALL_DOWN) 2026-03-10T07:55:27.519 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:55:27 vm05.local ceph-mon[130117]: from='osd.5 ' entity='osd.5' cmd='[{"prefix": "osd crush set-device-class", "class": "hdd", 
"ids": ["5"]}]': finished 2026-03-10T07:55:27.519 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:55:27 vm05.local ceph-mon[130117]: osdmap e79: 6 total, 5 up, 6 in 2026-03-10T07:55:27.519 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:55:27 vm05.local ceph-mon[130117]: Standby daemon mds.cephfs.vm05.pavqil assigned to filesystem cephfs as rank 0 2026-03-10T07:55:27.519 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:55:27 vm05.local ceph-mon[130117]: Health check cleared: MDS_ALL_DOWN (was: 1 filesystem is offline) 2026-03-10T07:55:27.519 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:55:27 vm05.local ceph-mon[130117]: from='osd.5 [v2:192.168.123.108:6816/2090596459,v1:192.168.123.108:6817/2090596459]' entity='osd.5' cmd=[{"prefix": "osd crush create-or-move", "id": 5, "weight":0.0195, "args": ["host=vm08", "root=default"]}]: dispatch 2026-03-10T07:55:27.519 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:55:27 vm05.local ceph-mon[130117]: fsmap cephfs:0/1 3 up:standby, 1 failed 2026-03-10T07:55:27.519 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:55:27 vm05.local ceph-mon[130117]: from='osd.5 ' entity='osd.5' cmd=[{"prefix": "osd crush create-or-move", "id": 5, "weight":0.0195, "args": ["host=vm08", "root=default"]}]: dispatch 2026-03-10T07:55:27.519 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:55:27 vm05.local ceph-mon[130117]: fsmap cephfs:1/1 {0=cephfs.vm05.pavqil=up:replay} 2 up:standby 2026-03-10T07:55:27.519 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:27.519+0000 7f3c12b6c700 1 --2- 192.168.123.105:0/3248549688 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3c0c1089a0 0x7f3c0c10be70 unknown :-1 s=CLOSED pgs=84 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:55:27.519 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:27.519+0000 7f3c12b6c700 1 --2- 192.168.123.105:0/3248549688 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f3c0c107ff0 
0x7f3c0c1083d0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:55:27.522 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:27.519+0000 7f3c12b6c700 1 -- 192.168.123.105:0/3248549688 >> 192.168.123.105:0/3248549688 conn(0x7f3c0c06ce20 msgr2=0x7f3c0c06d230 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:55:27.522 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:27.522+0000 7f3c12b6c700 1 -- 192.168.123.105:0/3248549688 shutdown_connections 2026-03-10T07:55:27.523 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:27.522+0000 7f3c12b6c700 1 -- 192.168.123.105:0/3248549688 wait complete. 2026-03-10T07:55:27.523 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:27.522+0000 7f3c12b6c700 1 Processor -- start 2026-03-10T07:55:27.523 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:27.522+0000 7f3c12b6c700 1 -- start start 2026-03-10T07:55:27.523 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:27.522+0000 7f3c12b6c700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f3c0c107ff0 0x7f3c0c1332d0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:55:27.523 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:27.522+0000 7f3c12b6c700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3c0c133810 0x7f3c0c133c90 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:55:27.523 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:27.522+0000 7f3c12b6c700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f3c0c07ef10 con 0x7f3c0c133810 2026-03-10T07:55:27.523 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:27.522+0000 7f3c12b6c700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f3c0c07f080 con 0x7f3c0c107ff0 2026-03-10T07:55:27.523 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:27.522+0000 7f3c0bfff700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3c0c133810 0x7f3c0c133c90 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:55:27.523 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:27.522+0000 7f3c0bfff700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3c0c133810 0x7f3c0c133c90 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.105:38380/0 (socket says 192.168.123.105:38380) 2026-03-10T07:55:27.523 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:27.522+0000 7f3c0bfff700 1 -- 192.168.123.105:0/336304576 learned_addr learned my addr 192.168.123.105:0/336304576 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T07:55:27.523 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:27.522+0000 7f3c10908700 1 --2- 192.168.123.105:0/336304576 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f3c0c107ff0 0x7f3c0c1332d0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:55:27.523 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:27.523+0000 7f3c0bfff700 1 -- 192.168.123.105:0/336304576 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f3c0c107ff0 msgr2=0x7f3c0c1332d0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:55:27.524 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:27.523+0000 7f3c0bfff700 1 --2- 192.168.123.105:0/336304576 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f3c0c107ff0 0x7f3c0c1332d0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:55:27.524 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:27.523+0000 7f3c0bfff700 1 -- 192.168.123.105:0/336304576 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f3c04007ed0 con 0x7f3c0c133810 2026-03-10T07:55:27.524 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:27.523+0000 7f3c10908700 1 --2- 192.168.123.105:0/336304576 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f3c0c107ff0 0x7f3c0c1332d0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request state changed! 2026-03-10T07:55:27.524 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:27.523+0000 7f3c0bfff700 1 --2- 192.168.123.105:0/336304576 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3c0c133810 0x7f3c0c133c90 secure :-1 s=READY pgs=85 cs=0 l=1 rev1=1 crypto rx=0x7f3c04003c30 tx=0x7f3c04003d10 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:55:27.524 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:27.524+0000 7f3c09ffb700 1 -- 192.168.123.105:0/336304576 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f3c0401c070 con 0x7f3c0c133810 2026-03-10T07:55:27.524 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:27.524+0000 7f3c12b6c700 1 -- 192.168.123.105:0/336304576 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f3c0c07f2b0 con 0x7f3c0c133810 2026-03-10T07:55:27.524 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:27.524+0000 7f3c12b6c700 1 -- 192.168.123.105:0/336304576 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f3c0c07f750 con 0x7f3c0c133810 2026-03-10T07:55:27.524 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:27.524+0000 7f3c09ffb700 1 -- 192.168.123.105:0/336304576 <== mon.0 v2:192.168.123.105:3300/0 2 ==== 
config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7f3c0400fcf0 con 0x7f3c0c133810 2026-03-10T07:55:27.525 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:27.524+0000 7f3c09ffb700 1 -- 192.168.123.105:0/336304576 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f3c04017dd0 con 0x7f3c0c133810 2026-03-10T07:55:27.525 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:27.525+0000 7f3c12b6c700 1 -- 192.168.123.105:0/336304576 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f3bf8005320 con 0x7f3c0c133810 2026-03-10T07:55:27.526 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:27.526+0000 7f3c09ffb700 1 -- 192.168.123.105:0/336304576 <== mon.0 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 38) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f3c04017420 con 0x7f3c0c133810 2026-03-10T07:55:27.526 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:27.526+0000 7f3c09ffb700 1 --2- 192.168.123.105:0/336304576 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f3bf40779e0 0x7f3bf4079ea0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:55:27.526 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:27.526+0000 7f3c10908700 1 --2- 192.168.123.105:0/336304576 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f3bf40779e0 0x7f3bf4079ea0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:55:27.526 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:27.526+0000 7f3c09ffb700 1 -- 192.168.123.105:0/336304576 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(79..79 src has 1..79) v4 ==== 6222+0+0 (secure 0 0 0) 0x7f3c04013070 con 0x7f3c0c133810 2026-03-10T07:55:27.530 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:27.529+0000 7f3c10908700 1 --2- 192.168.123.105:0/336304576 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f3bf40779e0 0x7f3bf4079ea0 secure :-1 s=READY pgs=63 cs=0 l=1 rev1=1 crypto rx=0x7f3bfc006fd0 tx=0x7f3bfc008040 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:55:27.530 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:27.530+0000 7f3c09ffb700 1 -- 192.168.123.105:0/336304576 <== mon.0 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f3c0400fe60 con 0x7f3c0c133810 2026-03-10T07:55:27.653 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:27.652+0000 7f3c12b6c700 1 -- 192.168.123.105:0/336304576 --> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] -- mgr_command(tid 0: {"prefix": "orch ps", "target": ["mon-mgr", ""]}) v1 -- 0x7f3bf8000bf0 con 0x7f3bf40779e0 2026-03-10T07:55:27.662 INFO:teuthology.orchestra.run.vm05.stdout:NAME HOST PORTS STATUS REFRESHED AGE MEM USE MEM LIM VERSION IMAGE ID CONTAINER ID 2026-03-10T07:55:27.662 INFO:teuthology.orchestra.run.vm05.stdout:alertmanager.vm05 vm05 *:9093,9094 running (4m) 66s ago 9m 23.3M - 0.25.0 c8568f914cd2 ac15d5f35994 2026-03-10T07:55:27.662 INFO:teuthology.orchestra.run.vm05.stdout:ceph-exporter.vm05 vm05 running (9m) 66s ago 9m 9.98M - 18.2.1 5be31c24972a 26c4db858175 2026-03-10T07:55:27.662 INFO:teuthology.orchestra.run.vm05.stdout:ceph-exporter.vm08 vm08 running (8m) 3s ago 8m 11.7M - 18.2.1 5be31c24972a 209e2398a09c 2026-03-10T07:55:27.662 INFO:teuthology.orchestra.run.vm05.stdout:crash.vm05 vm05 running (2m) 66s ago 9m 7847k - 19.2.3-678-ge911bdeb 654f31e6858e daa831c74cf4 2026-03-10T07:55:27.662 INFO:teuthology.orchestra.run.vm05.stdout:crash.vm08 vm08 running (2m) 3s ago 8m 8262k - 19.2.3-678-ge911bdeb 654f31e6858e 668ac55c9722 
2026-03-10T07:55:27.662 INFO:teuthology.orchestra.run.vm05.stdout:grafana.vm05 vm05 *:3000 running (3m) 66s ago 8m 81.7M - 10.4.0 c8b91775d855 6acb529ad951 2026-03-10T07:55:27.662 INFO:teuthology.orchestra.run.vm05.stdout:mds.cephfs.vm05.omfhnh vm05 running (7m) 66s ago 7m 186M - 18.2.1 5be31c24972a e23de179e09c 2026-03-10T07:55:27.662 INFO:teuthology.orchestra.run.vm05.stdout:mds.cephfs.vm05.pavqil vm05 running (7m) 66s ago 7m 17.6M - 18.2.1 5be31c24972a 5b9e5afa214c 2026-03-10T07:55:27.662 INFO:teuthology.orchestra.run.vm05.stdout:mds.cephfs.vm08.dgsaon vm08 running (6m) 3s ago 6m 18.7M - 18.2.1 5be31c24972a 1696aee522b5 2026-03-10T07:55:27.662 INFO:teuthology.orchestra.run.vm05.stdout:mds.cephfs.vm08.ybmbgd vm08 running (7m) 3s ago 7m 17.1M - 18.2.1 5be31c24972a 30b0e51cd2ed 2026-03-10T07:55:27.662 INFO:teuthology.orchestra.run.vm05.stdout:mgr.vm05.blexke vm05 *:8443,9283,8765 running (5m) 66s ago 9m 586M - 19.2.3-678-ge911bdeb 654f31e6858e 5467b6a3bd38 2026-03-10T07:55:27.662 INFO:teuthology.orchestra.run.vm05.stdout:mgr.vm08.orfpog vm08 *:8443,9283,8765 running (4m) 3s ago 8m 536M - 19.2.3-678-ge911bdeb 654f31e6858e 7a15d7811d5b 2026-03-10T07:55:27.662 INFO:teuthology.orchestra.run.vm05.stdout:mon.vm05 vm05 running (3m) 66s ago 9m 58.0M 2048M 19.2.3-678-ge911bdeb 654f31e6858e f02f076bb820 2026-03-10T07:55:27.662 INFO:teuthology.orchestra.run.vm05.stdout:mon.vm08 vm08 running (2m) 3s ago 8m 54.2M 2048M 19.2.3-678-ge911bdeb 654f31e6858e 73d9a504f360 2026-03-10T07:55:27.662 INFO:teuthology.orchestra.run.vm05.stdout:node-exporter.vm05 vm05 *:9100 running (4m) 66s ago 9m 10.3M - 1.7.0 72c9c2088986 7cd0b23b4118 2026-03-10T07:55:27.662 INFO:teuthology.orchestra.run.vm05.stdout:node-exporter.vm08 vm08 *:9100 running (4m) 3s ago 8m 10.1M - 1.7.0 72c9c2088986 3dd4d91d5881 2026-03-10T07:55:27.663 INFO:teuthology.orchestra.run.vm05.stdout:osd.0 vm05 running (2m) 66s ago 8m 201M 4096M 19.2.3-678-ge911bdeb 654f31e6858e b35fccc2a4d5 2026-03-10T07:55:27.663 
INFO:teuthology.orchestra.run.vm05.stdout:osd.1 vm05 running (89s) 66s ago 8m 95.0M 4096M 19.2.3-678-ge911bdeb 654f31e6858e e3fe4ad5c6a3 2026-03-10T07:55:27.663 INFO:teuthology.orchestra.run.vm05.stdout:osd.2 vm05 running (68s) 66s ago 7m 13.0M 4096M 19.2.3-678-ge911bdeb 654f31e6858e 108a77e324b8 2026-03-10T07:55:27.663 INFO:teuthology.orchestra.run.vm05.stdout:osd.3 vm08 running (46s) 3s ago 7m 137M 4096M 19.2.3-678-ge911bdeb 654f31e6858e 3f280bcfe0f5 2026-03-10T07:55:27.663 INFO:teuthology.orchestra.run.vm05.stdout:osd.4 vm08 running (25s) 3s ago 7m 129M 4096M 19.2.3-678-ge911bdeb 654f31e6858e 132c8d288b1e 2026-03-10T07:55:27.663 INFO:teuthology.orchestra.run.vm05.stdout:osd.5 vm08 running (4s) 3s ago 7m 12.8M 4096M 19.2.3-678-ge911bdeb 654f31e6858e a4a8929822a2 2026-03-10T07:55:27.663 INFO:teuthology.orchestra.run.vm05.stdout:prometheus.vm05 vm05 *:9095 running (4m) 66s ago 8m 51.4M - 2.51.0 1d3b7f56885b c59a6be07563 2026-03-10T07:55:27.663 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:27.660+0000 7f3c09ffb700 1 -- 192.168.123.105:0/336304576 <== mgr.34104 v2:192.168.123.105:6800/627686298 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+3552 (secure 0 0 0) 0x7f3bf8000bf0 con 0x7f3bf40779e0 2026-03-10T07:55:27.663 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:27.662+0000 7f3bf37fe700 1 -- 192.168.123.105:0/336304576 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f3bf40779e0 msgr2=0x7f3bf4079ea0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:55:27.663 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:27.662+0000 7f3bf37fe700 1 --2- 192.168.123.105:0/336304576 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f3bf40779e0 0x7f3bf4079ea0 secure :-1 s=READY pgs=63 cs=0 l=1 rev1=1 crypto rx=0x7f3bfc006fd0 tx=0x7f3bfc008040 comp rx=0 tx=0).stop 2026-03-10T07:55:27.663 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:27.662+0000 
7f3bf37fe700 1 -- 192.168.123.105:0/336304576 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3c0c133810 msgr2=0x7f3c0c133c90 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:55:27.663 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:27.662+0000 7f3bf37fe700 1 --2- 192.168.123.105:0/336304576 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3c0c133810 0x7f3c0c133c90 secure :-1 s=READY pgs=85 cs=0 l=1 rev1=1 crypto rx=0x7f3c04003c30 tx=0x7f3c04003d10 comp rx=0 tx=0).stop 2026-03-10T07:55:27.663 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:27.663+0000 7f3bf37fe700 1 -- 192.168.123.105:0/336304576 shutdown_connections 2026-03-10T07:55:27.663 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:27.663+0000 7f3bf37fe700 1 --2- 192.168.123.105:0/336304576 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f3c0c107ff0 0x7f3c0c1332d0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:55:27.663 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:27.663+0000 7f3bf37fe700 1 --2- 192.168.123.105:0/336304576 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f3bf40779e0 0x7f3bf4079ea0 unknown :-1 s=CLOSED pgs=63 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:55:27.663 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:27.663+0000 7f3bf37fe700 1 --2- 192.168.123.105:0/336304576 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3c0c133810 0x7f3c0c133c90 unknown :-1 s=CLOSED pgs=85 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:55:27.663 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:27.663+0000 7f3bf37fe700 1 -- 192.168.123.105:0/336304576 >> 192.168.123.105:0/336304576 conn(0x7f3c0c06ce20 msgr2=0x7f3c0c070640 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:55:27.664 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:27.664+0000 7f3bf37fe700 1 -- 192.168.123.105:0/336304576 shutdown_connections 2026-03-10T07:55:27.664 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:27.664+0000 7f3bf37fe700 1 -- 192.168.123.105:0/336304576 wait complete. 2026-03-10T07:55:27.668 INFO:journalctl@ceph.osd.5.vm08.stdout:Mar 10 07:55:27 vm08.local ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508-osd-5[124483]: 2026-03-10T07:55:27.229+0000 7fc0b70d6740 -1 osd.5 0 read_superblock omap replica is missing. 2026-03-10T07:55:27.668 INFO:journalctl@ceph.osd.5.vm08.stdout:Mar 10 07:55:27 vm08.local ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508-osd-5[124483]: 2026-03-10T07:55:27.392+0000 7fc0b70d6740 -1 osd.5 75 log_to_monitors true 2026-03-10T07:55:27.668 INFO:journalctl@ceph.osd.5.vm08.stdout:Mar 10 07:55:27 vm08.local ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508-osd-5[124483]: 2026-03-10T07:55:27.445+0000 7fc0aee70640 -1 osd.5 75 set_numa_affinity unable to identify public interface '' numa node: (2) No such file or directory 2026-03-10T07:55:27.668 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:55:27 vm08.local ceph-mon[107898]: Health check cleared: OSD_UPGRADE_FINISHED (was: all OSDs are running squid or later but require_osd_release < squid) 2026-03-10T07:55:27.668 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:55:27 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd='[{"prefix": "osd require-osd-release", "release": "squid"}]': finished 2026-03-10T07:55:27.668 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:55:27 vm08.local ceph-mon[107898]: osdmap e78: 6 total, 5 up, 6 in 2026-03-10T07:55:27.668 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:55:27 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:55:27.668 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:55:27 vm08.local ceph-mon[107898]: from='mgr.34104 
192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T07:55:27.668 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:55:27 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:55:27.668 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:55:27 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "auth get-or-create", "entity": "mds.cephfs.vm05.omfhnh", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]: dispatch 2026-03-10T07:55:27.668 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:55:27 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T07:55:27.668 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:55:27 vm08.local ceph-mon[107898]: from='osd.5 [v2:192.168.123.108:6816/2090596459,v1:192.168.123.108:6817/2090596459]' entity='osd.5' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["5"]}]: dispatch 2026-03-10T07:55:27.668 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:55:27 vm08.local ceph-mon[107898]: from='osd.5 ' entity='osd.5' cmd=[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["5"]}]: dispatch 2026-03-10T07:55:27.668 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:55:27 vm08.local ceph-mon[107898]: Health check failed: 1 filesystem is degraded (FS_DEGRADED) 2026-03-10T07:55:27.668 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:55:27 vm08.local ceph-mon[107898]: Health check failed: 1 filesystem is offline (MDS_ALL_DOWN) 2026-03-10T07:55:27.668 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:55:27 vm08.local ceph-mon[107898]: from='osd.5 ' entity='osd.5' cmd='[{"prefix": "osd crush set-device-class", "class": "hdd", "ids": ["5"]}]': finished 
2026-03-10T07:55:27.668 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:55:27 vm08.local ceph-mon[107898]: osdmap e79: 6 total, 5 up, 6 in 2026-03-10T07:55:27.668 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:55:27 vm08.local ceph-mon[107898]: Standby daemon mds.cephfs.vm05.pavqil assigned to filesystem cephfs as rank 0 2026-03-10T07:55:27.668 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:55:27 vm08.local ceph-mon[107898]: Health check cleared: MDS_ALL_DOWN (was: 1 filesystem is offline) 2026-03-10T07:55:27.668 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:55:27 vm08.local ceph-mon[107898]: from='osd.5 [v2:192.168.123.108:6816/2090596459,v1:192.168.123.108:6817/2090596459]' entity='osd.5' cmd=[{"prefix": "osd crush create-or-move", "id": 5, "weight":0.0195, "args": ["host=vm08", "root=default"]}]: dispatch 2026-03-10T07:55:27.668 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:55:27 vm08.local ceph-mon[107898]: fsmap cephfs:0/1 3 up:standby, 1 failed 2026-03-10T07:55:27.668 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:55:27 vm08.local ceph-mon[107898]: from='osd.5 ' entity='osd.5' cmd=[{"prefix": "osd crush create-or-move", "id": 5, "weight":0.0195, "args": ["host=vm08", "root=default"]}]: dispatch 2026-03-10T07:55:27.668 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:55:27 vm08.local ceph-mon[107898]: fsmap cephfs:1/1 {0=cephfs.vm05.pavqil=up:replay} 2 up:standby 2026-03-10T07:55:27.732 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:27.732+0000 7fac32bf4700 1 -- 192.168.123.105:0/3214155656 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fac2c10f420 msgr2=0x7fac2c10f800 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:55:27.732 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:27.732+0000 7fac32bf4700 1 --2- 192.168.123.105:0/3214155656 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fac2c10f420 0x7fac2c10f800 secure :-1 s=READY pgs=86 cs=0 l=1 rev1=1 
crypto rx=0x7fac20009b00 tx=0x7fac20009e10 comp rx=0 tx=0).stop 2026-03-10T07:55:27.733 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:27.732+0000 7fac32bf4700 1 -- 192.168.123.105:0/3214155656 shutdown_connections 2026-03-10T07:55:27.733 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:27.732+0000 7fac32bf4700 1 --2- 192.168.123.105:0/3214155656 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fac2c107d90 0x7fac2c108210 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:55:27.733 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:27.732+0000 7fac32bf4700 1 --2- 192.168.123.105:0/3214155656 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fac2c10f420 0x7fac2c10f800 unknown :-1 s=CLOSED pgs=86 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:55:27.733 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:27.732+0000 7fac32bf4700 1 -- 192.168.123.105:0/3214155656 >> 192.168.123.105:0/3214155656 conn(0x7fac2c06ce20 msgr2=0x7fac2c06d230 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:55:27.733 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:27.732+0000 7fac32bf4700 1 -- 192.168.123.105:0/3214155656 shutdown_connections 2026-03-10T07:55:27.733 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:27.732+0000 7fac32bf4700 1 -- 192.168.123.105:0/3214155656 wait complete. 
2026-03-10T07:55:27.733 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:27.733+0000 7fac32bf4700 1 Processor -- start 2026-03-10T07:55:27.733 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:27.733+0000 7fac32bf4700 1 -- start start 2026-03-10T07:55:27.733 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:27.733+0000 7fac32bf4700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fac2c107d90 0x7fac2c117ed0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:55:27.733 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:27.733+0000 7fac32bf4700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fac2c112ed0 0x7fac2c113350 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:55:27.733 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:27.733+0000 7fac32bf4700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fac2c113890 con 0x7fac2c107d90 2026-03-10T07:55:27.733 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:27.733+0000 7fac32bf4700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fac2c113a00 con 0x7fac2c112ed0 2026-03-10T07:55:27.733 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:27.733+0000 7fac2bfff700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fac2c112ed0 0x7fac2c113350 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:55:27.733 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:27.733+0000 7fac2bfff700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fac2c112ed0 0x7fac2c113350 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.108:3300/0 says I am 
v2:192.168.123.105:48760/0 (socket says 192.168.123.105:48760) 2026-03-10T07:55:27.733 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:27.733+0000 7fac2bfff700 1 -- 192.168.123.105:0/4229693500 learned_addr learned my addr 192.168.123.105:0/4229693500 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T07:55:27.733 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:27.733+0000 7fac30990700 1 --2- 192.168.123.105:0/4229693500 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fac2c107d90 0x7fac2c117ed0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:55:27.734 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:27.733+0000 7fac2bfff700 1 -- 192.168.123.105:0/4229693500 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fac2c107d90 msgr2=0x7fac2c117ed0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:55:27.734 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:27.733+0000 7fac2bfff700 1 --2- 192.168.123.105:0/4229693500 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fac2c107d90 0x7fac2c117ed0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:55:27.734 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:27.733+0000 7fac2bfff700 1 -- 192.168.123.105:0/4229693500 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fac200097e0 con 0x7fac2c112ed0 2026-03-10T07:55:27.734 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:27.734+0000 7fac2bfff700 1 --2- 192.168.123.105:0/4229693500 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fac2c112ed0 0x7fac2c113350 secure :-1 s=READY pgs=44 cs=0 l=1 rev1=1 crypto rx=0x7fac2400c3b0 tx=0x7fac2400c770 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 
2026-03-10T07:55:27.738 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:27.738+0000 7fac29ffb700 1 -- 192.168.123.105:0/4229693500 <== mon.1 v2:192.168.123.108:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fac2400e030 con 0x7fac2c112ed0 2026-03-10T07:55:27.738 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:27.738+0000 7fac29ffb700 1 -- 192.168.123.105:0/4229693500 <== mon.1 v2:192.168.123.108:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7fac2400f040 con 0x7fac2c112ed0 2026-03-10T07:55:27.738 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:27.738+0000 7fac29ffb700 1 -- 192.168.123.105:0/4229693500 <== mon.1 v2:192.168.123.108:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fac24007e50 con 0x7fac2c112ed0 2026-03-10T07:55:27.738 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:27.738+0000 7fac32bf4700 1 -- 192.168.123.105:0/4229693500 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fac2c113ce0 con 0x7fac2c112ed0 2026-03-10T07:55:27.739 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:27.738+0000 7fac32bf4700 1 -- 192.168.123.105:0/4229693500 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fac2c1b8690 con 0x7fac2c112ed0 2026-03-10T07:55:27.739 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:27.739+0000 7fac32bf4700 1 -- 192.168.123.105:0/4229693500 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fac2c04f2e0 con 0x7fac2c112ed0 2026-03-10T07:55:27.740 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:27.740+0000 7fac29ffb700 1 -- 192.168.123.105:0/4229693500 <== mon.1 v2:192.168.123.108:3300/0 4 ==== mgrmap(e 38) v1 ==== 100115+0+0 (secure 0 0 0) 0x7fac240090d0 con 0x7fac2c112ed0 2026-03-10T07:55:27.740 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:27.740+0000 
7fac29ffb700 1 --2- 192.168.123.105:0/4229693500 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7fac14077720 0x7fac14079be0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:55:27.740 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:27.740+0000 7fac29ffb700 1 -- 192.168.123.105:0/4229693500 <== mon.1 v2:192.168.123.108:3300/0 5 ==== osd_map(79..79 src has 1..79) v4 ==== 6222+0+0 (secure 0 0 0) 0x7fac24099680 con 0x7fac2c112ed0 2026-03-10T07:55:27.741 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:27.740+0000 7fac30990700 1 --2- 192.168.123.105:0/4229693500 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7fac14077720 0x7fac14079be0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:55:27.741 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:27.741+0000 7fac30990700 1 --2- 192.168.123.105:0/4229693500 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7fac14077720 0x7fac14079be0 secure :-1 s=READY pgs=64 cs=0 l=1 rev1=1 crypto rx=0x7fac20000c00 tx=0x7fac20011040 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:55:27.742 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:27.742+0000 7fac29ffb700 1 -- 192.168.123.105:0/4229693500 <== mon.1 v2:192.168.123.108:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7fac24061ee0 con 0x7fac2c112ed0 2026-03-10T07:55:27.901 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:27.901+0000 7fac32bf4700 1 -- 192.168.123.105:0/4229693500 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "versions"} v 0) v1 -- 0x7fac2c114520 con 0x7fac2c112ed0 2026-03-10T07:55:27.908 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:27.908+0000 7fac29ffb700 1 -- 192.168.123.105:0/4229693500 <== mon.1 v2:192.168.123.108:3300/0 7 ==== mon_command_ack([{"prefix": "versions"}]=0 v0) v1 ==== 56+0+708 (secure 0 0 0) 0x7fac24061630 con 0x7fac2c112ed0 2026-03-10T07:55:27.908 INFO:teuthology.orchestra.run.vm05.stdout:{ 2026-03-10T07:55:27.908 INFO:teuthology.orchestra.run.vm05.stdout: "mon": { 2026-03-10T07:55:27.908 INFO:teuthology.orchestra.run.vm05.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 2 2026-03-10T07:55:27.908 INFO:teuthology.orchestra.run.vm05.stdout: }, 2026-03-10T07:55:27.908 INFO:teuthology.orchestra.run.vm05.stdout: "mgr": { 2026-03-10T07:55:27.908 INFO:teuthology.orchestra.run.vm05.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 2 2026-03-10T07:55:27.908 INFO:teuthology.orchestra.run.vm05.stdout: }, 2026-03-10T07:55:27.908 INFO:teuthology.orchestra.run.vm05.stdout: "osd": { 2026-03-10T07:55:27.908 INFO:teuthology.orchestra.run.vm05.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 5 2026-03-10T07:55:27.908 INFO:teuthology.orchestra.run.vm05.stdout: }, 2026-03-10T07:55:27.908 INFO:teuthology.orchestra.run.vm05.stdout: "mds": { 2026-03-10T07:55:27.908 INFO:teuthology.orchestra.run.vm05.stdout: "ceph version 18.2.1 (7fe91d5d5842e04be3b4f514d6dd990c54b29c76) reef (stable)": 3 2026-03-10T07:55:27.908 INFO:teuthology.orchestra.run.vm05.stdout: }, 2026-03-10T07:55:27.908 INFO:teuthology.orchestra.run.vm05.stdout: "overall": { 2026-03-10T07:55:27.908 INFO:teuthology.orchestra.run.vm05.stdout: "ceph version 18.2.1 (7fe91d5d5842e04be3b4f514d6dd990c54b29c76) reef (stable)": 3, 2026-03-10T07:55:27.908 INFO:teuthology.orchestra.run.vm05.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 9 2026-03-10T07:55:27.908 
INFO:teuthology.orchestra.run.vm05.stdout: } 2026-03-10T07:55:27.908 INFO:teuthology.orchestra.run.vm05.stdout:} 2026-03-10T07:55:27.914 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:27.914+0000 7fac137fe700 1 -- 192.168.123.105:0/4229693500 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7fac14077720 msgr2=0x7fac14079be0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:55:27.914 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:27.914+0000 7fac137fe700 1 --2- 192.168.123.105:0/4229693500 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7fac14077720 0x7fac14079be0 secure :-1 s=READY pgs=64 cs=0 l=1 rev1=1 crypto rx=0x7fac20000c00 tx=0x7fac20011040 comp rx=0 tx=0).stop 2026-03-10T07:55:27.914 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:27.914+0000 7fac137fe700 1 -- 192.168.123.105:0/4229693500 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fac2c112ed0 msgr2=0x7fac2c113350 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:55:27.914 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:27.914+0000 7fac137fe700 1 --2- 192.168.123.105:0/4229693500 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fac2c112ed0 0x7fac2c113350 secure :-1 s=READY pgs=44 cs=0 l=1 rev1=1 crypto rx=0x7fac2400c3b0 tx=0x7fac2400c770 comp rx=0 tx=0).stop 2026-03-10T07:55:27.914 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:27.914+0000 7fac137fe700 1 -- 192.168.123.105:0/4229693500 shutdown_connections 2026-03-10T07:55:27.914 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:27.914+0000 7fac137fe700 1 --2- 192.168.123.105:0/4229693500 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7fac14077720 0x7fac14079be0 unknown :-1 s=CLOSED pgs=64 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:55:27.914 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:27.914+0000 7fac137fe700 1 --2- 192.168.123.105:0/4229693500 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fac2c107d90 0x7fac2c117ed0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:55:27.914 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:27.914+0000 7fac137fe700 1 --2- 192.168.123.105:0/4229693500 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fac2c112ed0 0x7fac2c113350 unknown :-1 s=CLOSED pgs=44 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:55:27.914 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:27.914+0000 7fac137fe700 1 -- 192.168.123.105:0/4229693500 >> 192.168.123.105:0/4229693500 conn(0x7fac2c06ce20 msgr2=0x7fac2c10d0f0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:55:27.914 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:27.914+0000 7fac137fe700 1 -- 192.168.123.105:0/4229693500 shutdown_connections 2026-03-10T07:55:27.915 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:27.914+0000 7fac137fe700 1 -- 192.168.123.105:0/4229693500 wait complete. 
2026-03-10T07:55:27.985 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:27.984+0000 7f03b3fde700 1 -- 192.168.123.105:0/2542729406 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f03ac107d90 msgr2=0x7f03ac108210 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:55:27.985 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:27.984+0000 7f03b3fde700 1 --2- 192.168.123.105:0/2542729406 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f03ac107d90 0x7f03ac108210 secure :-1 s=READY pgs=87 cs=0 l=1 rev1=1 crypto rx=0x7f03a400b3a0 tx=0x7f03a400b6b0 comp rx=0 tx=0).stop 2026-03-10T07:55:27.985 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:27.985+0000 7f03b3fde700 1 -- 192.168.123.105:0/2542729406 shutdown_connections 2026-03-10T07:55:27.985 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:27.985+0000 7f03b3fde700 1 --2- 192.168.123.105:0/2542729406 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f03ac107d90 0x7f03ac108210 unknown :-1 s=CLOSED pgs=87 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:55:27.985 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:27.985+0000 7f03b3fde700 1 --2- 192.168.123.105:0/2542729406 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f03ac10f420 0x7f03ac10f800 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:55:27.985 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:27.985+0000 7f03b3fde700 1 -- 192.168.123.105:0/2542729406 >> 192.168.123.105:0/2542729406 conn(0x7f03ac06ce20 msgr2=0x7f03ac06d230 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:55:27.985 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:27.985+0000 7f03b3fde700 1 -- 192.168.123.105:0/2542729406 shutdown_connections 2026-03-10T07:55:27.985 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:27.985+0000 7f03b3fde700 1 -- 192.168.123.105:0/2542729406 
wait complete. 2026-03-10T07:55:27.986 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:27.985+0000 7f03b3fde700 1 Processor -- start 2026-03-10T07:55:27.986 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:27.985+0000 7f03b3fde700 1 -- start start 2026-03-10T07:55:27.986 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:27.985+0000 7f03b3fde700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f03ac107d90 0x7f03ac119f90 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:55:27.986 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:27.985+0000 7f03b3fde700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f03ac10f420 0x7f03ac114f90 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:55:27.986 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:27.985+0000 7f03b3fde700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f03ac115560 con 0x7f03ac107d90 2026-03-10T07:55:27.986 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:27.985+0000 7f03b3fde700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f03ac1156d0 con 0x7f03ac10f420 2026-03-10T07:55:27.986 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:27.986+0000 7f03b1579700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f03ac10f420 0x7f03ac114f90 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:55:27.986 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:27.986+0000 7f03b1579700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f03ac10f420 0x7f03ac114f90 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.108:3300/0 
says I am v2:192.168.123.105:48772/0 (socket says 192.168.123.105:48772) 2026-03-10T07:55:27.986 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:27.986+0000 7f03b1579700 1 -- 192.168.123.105:0/3310954901 learned_addr learned my addr 192.168.123.105:0/3310954901 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T07:55:27.986 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:27.986+0000 7f03b1d7a700 1 --2- 192.168.123.105:0/3310954901 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f03ac107d90 0x7f03ac119f90 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:55:27.986 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:27.986+0000 7f03b1579700 1 -- 192.168.123.105:0/3310954901 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f03ac107d90 msgr2=0x7f03ac119f90 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:55:27.986 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:27.986+0000 7f03b1579700 1 --2- 192.168.123.105:0/3310954901 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f03ac107d90 0x7f03ac119f90 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:55:27.986 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:27.986+0000 7f03b1579700 1 -- 192.168.123.105:0/3310954901 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f03a400b050 con 0x7f03ac10f420 2026-03-10T07:55:27.986 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:27.986+0000 7f03b1579700 1 --2- 192.168.123.105:0/3310954901 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f03ac10f420 0x7f03ac114f90 secure :-1 s=READY pgs=45 cs=0 l=1 rev1=1 crypto rx=0x7f03a4015040 tx=0x7f03a4012710 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 
2026-03-10T07:55:27.986 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:27.986+0000 7f03b1d7a700 1 --2- 192.168.123.105:0/3310954901 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f03ac107d90 0x7f03ac119f90 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request state changed! 2026-03-10T07:55:27.987 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:27.986+0000 7f03a2ffd700 1 -- 192.168.123.105:0/3310954901 <== mon.1 v2:192.168.123.108:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f03a400e040 con 0x7f03ac10f420 2026-03-10T07:55:27.987 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:27.987+0000 7f03b3fde700 1 -- 192.168.123.105:0/3310954901 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f03ac115950 con 0x7f03ac10f420 2026-03-10T07:55:27.987 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:27.987+0000 7f03b3fde700 1 -- 192.168.123.105:0/3310954901 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f03ac076fe0 con 0x7f03ac10f420 2026-03-10T07:55:27.987 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:27.987+0000 7f03b3fde700 1 -- 192.168.123.105:0/3310954901 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f03ac10c5f0 con 0x7f03ac10f420 2026-03-10T07:55:27.987 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:27.987+0000 7f03a2ffd700 1 -- 192.168.123.105:0/3310954901 <== mon.1 v2:192.168.123.108:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7f03a4012e90 con 0x7f03ac10f420 2026-03-10T07:55:27.987 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:27.987+0000 7f03a2ffd700 1 -- 192.168.123.105:0/3310954901 <== mon.1 v2:192.168.123.108:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f03a4004830 con 0x7f03ac10f420 
2026-03-10T07:55:27.989 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:27.989+0000 7f03a2ffd700 1 -- 192.168.123.105:0/3310954901 <== mon.1 v2:192.168.123.108:3300/0 4 ==== mgrmap(e 38) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f03a40129b0 con 0x7f03ac10f420 2026-03-10T07:55:27.989 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:27.989+0000 7f03a2ffd700 1 --2- 192.168.123.105:0/3310954901 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f0398077920 0x7f0398079de0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:55:27.989 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:27.989+0000 7f03b1d7a700 1 --2- 192.168.123.105:0/3310954901 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f0398077920 0x7f0398079de0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:55:27.990 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:27.989+0000 7f03a2ffd700 1 -- 192.168.123.105:0/3310954901 <== mon.1 v2:192.168.123.108:3300/0 5 ==== osd_map(79..79 src has 1..79) v4 ==== 6222+0+0 (secure 0 0 0) 0x7f03a409bb60 con 0x7f03ac10f420 2026-03-10T07:55:27.990 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:27.990+0000 7f03b1d7a700 1 --2- 192.168.123.105:0/3310954901 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f0398077920 0x7f0398079de0 secure :-1 s=READY pgs=65 cs=0 l=1 rev1=1 crypto rx=0x7f039c007900 tx=0x7f039c008040 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:55:27.991 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:27.991+0000 7f03a2ffd700 1 -- 192.168.123.105:0/3310954901 <== mon.1 v2:192.168.123.108:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f03a40643c0 con 
0x7f03ac10f420 2026-03-10T07:55:28.135 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:28.135+0000 7f03b3fde700 1 -- 192.168.123.105:0/3310954901 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "fs dump"} v 0) v1 -- 0x7f03ac1164c0 con 0x7f03ac10f420 2026-03-10T07:55:28.136 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:28.136+0000 7f03a2ffd700 1 -- 192.168.123.105:0/3310954901 <== mon.1 v2:192.168.123.108:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump"}]=0 dumped fsmap epoch 14 v14) v1 ==== 76+0+1740 (secure 0 0 0) 0x7f03a4063b10 con 0x7f03ac10f420 2026-03-10T07:55:28.136 INFO:teuthology.orchestra.run.vm05.stdout:e14 2026-03-10T07:55:28.136 INFO:teuthology.orchestra.run.vm05.stdout:btime 2026-03-10T07:55:27:428564+0000 2026-03-10T07:55:28.136 INFO:teuthology.orchestra.run.vm05.stdout:enable_multiple, ever_enabled_multiple: 1,1 2026-03-10T07:55:28.136 INFO:teuthology.orchestra.run.vm05.stdout:default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2} 2026-03-10T07:55:28.136 INFO:teuthology.orchestra.run.vm05.stdout:legacy client fscid: 1 2026-03-10T07:55:28.136 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-10T07:55:28.136 INFO:teuthology.orchestra.run.vm05.stdout:Filesystem 'cephfs' (1) 2026-03-10T07:55:28.136 INFO:teuthology.orchestra.run.vm05.stdout:fs_name cephfs 2026-03-10T07:55:28.136 INFO:teuthology.orchestra.run.vm05.stdout:epoch 14 2026-03-10T07:55:28.136 INFO:teuthology.orchestra.run.vm05.stdout:flags 12 joinable allow_snaps allow_multimds_snaps 2026-03-10T07:55:28.136 INFO:teuthology.orchestra.run.vm05.stdout:created 2026-03-10T07:48:24.309293+0000 2026-03-10T07:55:28.136 INFO:teuthology.orchestra.run.vm05.stdout:modified 2026-03-10T07:55:27.428559+0000 2026-03-10T07:55:28.136 
INFO:teuthology.orchestra.run.vm05.stdout:tableserver 0 2026-03-10T07:55:28.136 INFO:teuthology.orchestra.run.vm05.stdout:root 0 2026-03-10T07:55:28.136 INFO:teuthology.orchestra.run.vm05.stdout:session_timeout 60 2026-03-10T07:55:28.136 INFO:teuthology.orchestra.run.vm05.stdout:session_autoclose 300 2026-03-10T07:55:28.136 INFO:teuthology.orchestra.run.vm05.stdout:max_file_size 1099511627776 2026-03-10T07:55:28.136 INFO:teuthology.orchestra.run.vm05.stdout:max_xattr_size 65536 2026-03-10T07:55:28.136 INFO:teuthology.orchestra.run.vm05.stdout:required_client_features {} 2026-03-10T07:55:28.136 INFO:teuthology.orchestra.run.vm05.stdout:last_failure 0 2026-03-10T07:55:28.136 INFO:teuthology.orchestra.run.vm05.stdout:last_failure_osd_epoch 79 2026-03-10T07:55:28.136 INFO:teuthology.orchestra.run.vm05.stdout:compat compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2} 2026-03-10T07:55:28.136 INFO:teuthology.orchestra.run.vm05.stdout:max_mds 1 2026-03-10T07:55:28.136 INFO:teuthology.orchestra.run.vm05.stdout:in 0 2026-03-10T07:55:28.136 INFO:teuthology.orchestra.run.vm05.stdout:up {0=14512} 2026-03-10T07:55:28.137 INFO:teuthology.orchestra.run.vm05.stdout:failed 2026-03-10T07:55:28.137 INFO:teuthology.orchestra.run.vm05.stdout:damaged 2026-03-10T07:55:28.137 INFO:teuthology.orchestra.run.vm05.stdout:stopped 2026-03-10T07:55:28.137 INFO:teuthology.orchestra.run.vm05.stdout:data_pools [3] 2026-03-10T07:55:28.137 INFO:teuthology.orchestra.run.vm05.stdout:metadata_pool 2 2026-03-10T07:55:28.137 INFO:teuthology.orchestra.run.vm05.stdout:inline_data enabled 2026-03-10T07:55:28.137 INFO:teuthology.orchestra.run.vm05.stdout:balancer 2026-03-10T07:55:28.137 INFO:teuthology.orchestra.run.vm05.stdout:bal_rank_mask -1 2026-03-10T07:55:28.137 
INFO:teuthology.orchestra.run.vm05.stdout:standby_count_wanted 1 2026-03-10T07:55:28.137 INFO:teuthology.orchestra.run.vm05.stdout:qdb_cluster leader: 0 members: 2026-03-10T07:55:28.137 INFO:teuthology.orchestra.run.vm05.stdout:[mds.cephfs.vm05.pavqil{0:14512} state up:replay seq 2 join_fscid=1 addr [v2:192.168.123.105:6828/427555544,v1:192.168.123.105:6829/427555544] compat {c=[1],r=[1],i=[7ff]}] 2026-03-10T07:55:28.137 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-10T07:55:28.137 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-10T07:55:28.137 INFO:teuthology.orchestra.run.vm05.stdout:Standby daemons: 2026-03-10T07:55:28.137 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-10T07:55:28.137 INFO:teuthology.orchestra.run.vm05.stdout:[mds.cephfs.vm08.ybmbgd{-1:14524} state up:standby seq 1 join_fscid=1 addr [v2:192.168.123.108:6824/3815585988,v1:192.168.123.108:6825/3815585988] compat {c=[1],r=[1],i=[7ff]}] 2026-03-10T07:55:28.137 INFO:teuthology.orchestra.run.vm05.stdout:[mds.cephfs.vm08.dgsaon{-1:24313} state up:standby seq 2 join_fscid=1 addr [v2:192.168.123.108:6826/2963085185,v1:192.168.123.108:6827/2963085185] compat {c=[1],r=[1],i=[7ff]}] 2026-03-10T07:55:28.139 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:28.138+0000 7f03a0ff9700 1 -- 192.168.123.105:0/3310954901 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f0398077920 msgr2=0x7f0398079de0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:55:28.139 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:28.138+0000 7f03a0ff9700 1 --2- 192.168.123.105:0/3310954901 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f0398077920 0x7f0398079de0 secure :-1 s=READY pgs=65 cs=0 l=1 rev1=1 crypto rx=0x7f039c007900 tx=0x7f039c008040 comp rx=0 tx=0).stop 2026-03-10T07:55:28.139 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:28.138+0000 7f03a0ff9700 1 -- 192.168.123.105:0/3310954901 >> 
[v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f03ac10f420 msgr2=0x7f03ac114f90 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:55:28.139 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:28.138+0000 7f03a0ff9700 1 --2- 192.168.123.105:0/3310954901 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f03ac10f420 0x7f03ac114f90 secure :-1 s=READY pgs=45 cs=0 l=1 rev1=1 crypto rx=0x7f03a4015040 tx=0x7f03a4012710 comp rx=0 tx=0).stop 2026-03-10T07:55:28.139 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:28.139+0000 7f03a0ff9700 1 -- 192.168.123.105:0/3310954901 shutdown_connections 2026-03-10T07:55:28.139 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:28.139+0000 7f03a0ff9700 1 --2- 192.168.123.105:0/3310954901 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f0398077920 0x7f0398079de0 unknown :-1 s=CLOSED pgs=65 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:55:28.139 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:28.139+0000 7f03a0ff9700 1 --2- 192.168.123.105:0/3310954901 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f03ac107d90 0x7f03ac119f90 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:55:28.139 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:28.139+0000 7f03a0ff9700 1 --2- 192.168.123.105:0/3310954901 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f03ac10f420 0x7f03ac114f90 unknown :-1 s=CLOSED pgs=45 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:55:28.139 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:28.139+0000 7f03a0ff9700 1 -- 192.168.123.105:0/3310954901 >> 192.168.123.105:0/3310954901 conn(0x7f03ac06ce20 msgr2=0x7f03ac070050 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:55:28.139 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:28.139+0000 7f03a0ff9700 1 -- 
192.168.123.105:0/3310954901 shutdown_connections 2026-03-10T07:55:28.139 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:28.139+0000 7f03a0ff9700 1 -- 192.168.123.105:0/3310954901 wait complete. 2026-03-10T07:55:28.140 INFO:teuthology.orchestra.run.vm05.stderr:dumped fsmap epoch 14 2026-03-10T07:55:28.221 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:28.221+0000 7f9cfb337700 1 -- 192.168.123.105:0/2262165052 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f9cf4107d90 msgr2=0x7f9cf4108210 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:55:28.221 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:28.221+0000 7f9cfb337700 1 --2- 192.168.123.105:0/2262165052 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f9cf4107d90 0x7f9cf4108210 secure :-1 s=READY pgs=88 cs=0 l=1 rev1=1 crypto rx=0x7f9cec00b3a0 tx=0x7f9cec00b6b0 comp rx=0 tx=0).stop 2026-03-10T07:55:28.221 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:28.221+0000 7f9cfb337700 1 -- 192.168.123.105:0/2262165052 shutdown_connections 2026-03-10T07:55:28.221 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:28.221+0000 7f9cfb337700 1 --2- 192.168.123.105:0/2262165052 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f9cf4107d90 0x7f9cf4108210 unknown :-1 s=CLOSED pgs=88 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:55:28.221 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:28.221+0000 7f9cfb337700 1 --2- 192.168.123.105:0/2262165052 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f9cf410f420 0x7f9cf410f800 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:55:28.221 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:28.221+0000 7f9cfb337700 1 -- 192.168.123.105:0/2262165052 >> 192.168.123.105:0/2262165052 conn(0x7f9cf406ce20 msgr2=0x7f9cf406d230 unknown :-1 s=STATE_NONE l=0).mark_down 
2026-03-10T07:55:28.221 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:28.221+0000 7f9cfb337700 1 -- 192.168.123.105:0/2262165052 shutdown_connections 2026-03-10T07:55:28.222 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:28.221+0000 7f9cfb337700 1 -- 192.168.123.105:0/2262165052 wait complete. 2026-03-10T07:55:28.222 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:28.221+0000 7f9cfb337700 1 Processor -- start 2026-03-10T07:55:28.222 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:28.222+0000 7f9cfb337700 1 -- start start 2026-03-10T07:55:28.222 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:28.222+0000 7f9cfb337700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f9cf4107d90 0x7f9cf4119f90 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:55:28.222 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:28.222+0000 7f9cfb337700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f9cf410f420 0x7f9cf4114f90 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:55:28.222 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:28.222+0000 7f9cfb337700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f9cf4115560 con 0x7f9cf4107d90 2026-03-10T07:55:28.223 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:28.222+0000 7f9cfb337700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f9cf41156d0 con 0x7f9cf410f420 2026-03-10T07:55:28.223 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:28.222+0000 7f9cf90d3700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f9cf4107d90 0x7f9cf4119f90 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:55:28.223 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:28.222+0000 7f9cf90d3700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f9cf4107d90 0x7f9cf4119f90 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.105:38436/0 (socket says 192.168.123.105:38436) 2026-03-10T07:55:28.223 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:28.222+0000 7f9cf90d3700 1 -- 192.168.123.105:0/947410007 learned_addr learned my addr 192.168.123.105:0/947410007 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T07:55:28.223 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:28.222+0000 7f9cf88d2700 1 --2- 192.168.123.105:0/947410007 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f9cf410f420 0x7f9cf4114f90 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:55:28.223 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:28.222+0000 7f9cf90d3700 1 -- 192.168.123.105:0/947410007 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f9cf410f420 msgr2=0x7f9cf4114f90 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:55:28.223 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:28.222+0000 7f9cf90d3700 1 --2- 192.168.123.105:0/947410007 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f9cf410f420 0x7f9cf4114f90 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:55:28.223 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:28.222+0000 7f9cf90d3700 1 -- 192.168.123.105:0/947410007 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f9cec00b050 con 0x7f9cf4107d90 2026-03-10T07:55:28.223 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:28.223+0000 
7f9cf90d3700 1 --2- 192.168.123.105:0/947410007 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f9cf4107d90 0x7f9cf4119f90 secure :-1 s=READY pgs=89 cs=0 l=1 rev1=1 crypto rx=0x7f9cf000b700 tx=0x7f9cf000bac0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:55:28.223 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:28.223+0000 7f9cea7fc700 1 -- 192.168.123.105:0/947410007 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f9cf0010840 con 0x7f9cf4107d90 2026-03-10T07:55:28.223 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:28.223+0000 7f9cfb337700 1 -- 192.168.123.105:0/947410007 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f9cf41159b0 con 0x7f9cf4107d90 2026-03-10T07:55:28.224 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:28.223+0000 7f9cfb337700 1 -- 192.168.123.105:0/947410007 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f9cf41b84d0 con 0x7f9cf4107d90 2026-03-10T07:55:28.224 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:28.223+0000 7f9cea7fc700 1 -- 192.168.123.105:0/947410007 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7f9cf0010e80 con 0x7f9cf4107d90 2026-03-10T07:55:28.224 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:28.224+0000 7f9cea7fc700 1 -- 192.168.123.105:0/947410007 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f9cf000d590 con 0x7f9cf4107d90 2026-03-10T07:55:28.224 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:28.224+0000 7f9cfb337700 1 -- 192.168.123.105:0/947410007 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f9cd8005320 con 0x7f9cf4107d90 2026-03-10T07:55:28.225 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:28.225+0000 7f9cea7fc700 1 -- 192.168.123.105:0/947410007 <== mon.0 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 38) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f9cf00109a0 con 0x7f9cf4107d90 2026-03-10T07:55:28.225 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:28.225+0000 7f9cea7fc700 1 --2- 192.168.123.105:0/947410007 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f9ce00779e0 0x7f9ce0079ea0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:55:28.226 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:28.225+0000 7f9cf88d2700 1 --2- 192.168.123.105:0/947410007 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f9ce00779e0 0x7f9ce0079ea0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:55:28.226 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:28.226+0000 7f9cea7fc700 1 -- 192.168.123.105:0/947410007 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(79..79 src has 1..79) v4 ==== 6222+0+0 (secure 0 0 0) 0x7f9cf00996d0 con 0x7f9cf4107d90 2026-03-10T07:55:28.226 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:28.226+0000 7f9cf88d2700 1 --2- 192.168.123.105:0/947410007 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f9ce00779e0 0x7f9ce0079ea0 secure :-1 s=READY pgs=66 cs=0 l=1 rev1=1 crypto rx=0x7f9cec009250 tx=0x7f9cec00bf90 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:55:28.227 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:28.227+0000 7f9cea7fc700 1 -- 192.168.123.105:0/947410007 <== mon.0 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f9cf0061f30 con 0x7f9cf4107d90 
2026-03-10T07:55:28.362 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:28.362+0000 7f9cfb337700 1 -- 192.168.123.105:0/947410007 --> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7f9cd8000bf0 con 0x7f9ce00779e0 2026-03-10T07:55:28.363 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:28.363+0000 7f9cea7fc700 1 -- 192.168.123.105:0/947410007 <== mgr.34104 v2:192.168.123.105:6800/627686298 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+416 (secure 0 0 0) 0x7f9cd8000bf0 con 0x7f9ce00779e0 2026-03-10T07:55:28.363 INFO:teuthology.orchestra.run.vm05.stdout:{ 2026-03-10T07:55:28.363 INFO:teuthology.orchestra.run.vm05.stdout: "target_image": "quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc", 2026-03-10T07:55:28.363 INFO:teuthology.orchestra.run.vm05.stdout: "in_progress": true, 2026-03-10T07:55:28.363 INFO:teuthology.orchestra.run.vm05.stdout: "which": "Upgrading all daemon types on all hosts", 2026-03-10T07:55:28.363 INFO:teuthology.orchestra.run.vm05.stdout: "services_complete": [ 2026-03-10T07:55:28.363 INFO:teuthology.orchestra.run.vm05.stdout: "osd", 2026-03-10T07:55:28.363 INFO:teuthology.orchestra.run.vm05.stdout: "mgr", 2026-03-10T07:55:28.363 INFO:teuthology.orchestra.run.vm05.stdout: "crash", 2026-03-10T07:55:28.363 INFO:teuthology.orchestra.run.vm05.stdout: "mon" 2026-03-10T07:55:28.363 INFO:teuthology.orchestra.run.vm05.stdout: ], 2026-03-10T07:55:28.363 INFO:teuthology.orchestra.run.vm05.stdout: "progress": "12/23 daemons upgraded", 2026-03-10T07:55:28.363 INFO:teuthology.orchestra.run.vm05.stdout: "message": "Currently upgrading mds daemons", 2026-03-10T07:55:28.363 INFO:teuthology.orchestra.run.vm05.stdout: "is_paused": false 2026-03-10T07:55:28.363 INFO:teuthology.orchestra.run.vm05.stdout:} 2026-03-10T07:55:28.366 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:28.366+0000 7f9cdffff700 1 -- 192.168.123.105:0/947410007 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f9ce00779e0 msgr2=0x7f9ce0079ea0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:55:28.366 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:28.366+0000 7f9cdffff700 1 --2- 192.168.123.105:0/947410007 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f9ce00779e0 0x7f9ce0079ea0 secure :-1 s=READY pgs=66 cs=0 l=1 rev1=1 crypto rx=0x7f9cec009250 tx=0x7f9cec00bf90 comp rx=0 tx=0).stop 2026-03-10T07:55:28.366 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:28.366+0000 7f9cdffff700 1 -- 192.168.123.105:0/947410007 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f9cf4107d90 msgr2=0x7f9cf4119f90 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:55:28.366 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:28.366+0000 7f9cdffff700 1 --2- 192.168.123.105:0/947410007 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f9cf4107d90 0x7f9cf4119f90 secure :-1 s=READY pgs=89 cs=0 l=1 rev1=1 crypto rx=0x7f9cf000b700 tx=0x7f9cf000bac0 comp rx=0 tx=0).stop 2026-03-10T07:55:28.366 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:28.366+0000 7f9cdffff700 1 -- 192.168.123.105:0/947410007 shutdown_connections 2026-03-10T07:55:28.366 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:28.366+0000 7f9cdffff700 1 --2- 192.168.123.105:0/947410007 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f9ce00779e0 0x7f9ce0079ea0 unknown :-1 s=CLOSED pgs=66 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:55:28.366 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:28.366+0000 7f9cdffff700 1 --2- 192.168.123.105:0/947410007 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f9cf4107d90 
0x7f9cf4119f90 unknown :-1 s=CLOSED pgs=89 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:55:28.366 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:28.366+0000 7f9cdffff700 1 --2- 192.168.123.105:0/947410007 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f9cf410f420 0x7f9cf4114f90 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:55:28.367 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:28.366+0000 7f9cdffff700 1 -- 192.168.123.105:0/947410007 >> 192.168.123.105:0/947410007 conn(0x7f9cf406ce20 msgr2=0x7f9cf4070050 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:55:28.367 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:28.366+0000 7f9cdffff700 1 -- 192.168.123.105:0/947410007 shutdown_connections 2026-03-10T07:55:28.367 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:28.367+0000 7f9cdffff700 1 -- 192.168.123.105:0/947410007 wait complete. 2026-03-10T07:55:28.449 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:28.447+0000 7f1a11e5c700 1 -- 192.168.123.105:0/3437683316 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f1a0c107ff0 msgr2=0x7f1a0c1083d0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:55:28.449 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:28.447+0000 7f1a11e5c700 1 --2- 192.168.123.105:0/3437683316 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f1a0c107ff0 0x7f1a0c1083d0 secure :-1 s=READY pgs=46 cs=0 l=1 rev1=1 crypto rx=0x7f19fc008790 tx=0x7f19fc00ae50 comp rx=0 tx=0).stop 2026-03-10T07:55:28.449 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:28.447+0000 7f1a11e5c700 1 -- 192.168.123.105:0/3437683316 shutdown_connections 2026-03-10T07:55:28.449 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:28.447+0000 7f1a11e5c700 1 --2- 192.168.123.105:0/3437683316 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] 
conn(0x7f1a0c1089a0 0x7f1a0c10be70 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:55:28.449 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:28.447+0000 7f1a11e5c700 1 --2- 192.168.123.105:0/3437683316 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f1a0c107ff0 0x7f1a0c1083d0 unknown :-1 s=CLOSED pgs=46 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:55:28.449 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:28.447+0000 7f1a11e5c700 1 -- 192.168.123.105:0/3437683316 >> 192.168.123.105:0/3437683316 conn(0x7f1a0c06ce20 msgr2=0x7f1a0c06d230 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:55:28.449 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:28.447+0000 7f1a11e5c700 1 -- 192.168.123.105:0/3437683316 shutdown_connections 2026-03-10T07:55:28.449 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:28.447+0000 7f1a11e5c700 1 -- 192.168.123.105:0/3437683316 wait complete. 
2026-03-10T07:55:28.449 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:28.448+0000 7f1a11e5c700 1 Processor -- start 2026-03-10T07:55:28.449 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:28.448+0000 7f1a11e5c700 1 -- start start 2026-03-10T07:55:28.449 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:28.448+0000 7f1a11e5c700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f1a0c1089a0 0x7f1a0c07cea0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:55:28.449 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:28.448+0000 7f1a11e5c700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f1a0c07d3e0 0x7f1a0c07d860 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:55:28.449 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:28.448+0000 7f1a11e5c700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f1a0c081a20 con 0x7f1a0c07d3e0 2026-03-10T07:55:28.449 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:28.448+0000 7f1a11e5c700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f1a0c081b90 con 0x7f1a0c1089a0 2026-03-10T07:55:28.449 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:28.448+0000 7f1a0b7fe700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f1a0c1089a0 0x7f1a0c07cea0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:55:28.449 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:28.448+0000 7f1a0b7fe700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f1a0c1089a0 0x7f1a0c07cea0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.108:3300/0 says I am 
v2:192.168.123.105:48800/0 (socket says 192.168.123.105:48800) 2026-03-10T07:55:28.449 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:28.448+0000 7f1a0b7fe700 1 -- 192.168.123.105:0/647322379 learned_addr learned my addr 192.168.123.105:0/647322379 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T07:55:28.449 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:28.449+0000 7f1a0affd700 1 --2- 192.168.123.105:0/647322379 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f1a0c07d3e0 0x7f1a0c07d860 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:55:28.449 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:28.449+0000 7f1a0b7fe700 1 -- 192.168.123.105:0/647322379 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f1a0c07d3e0 msgr2=0x7f1a0c07d860 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:55:28.449 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:28.449+0000 7f1a0b7fe700 1 --2- 192.168.123.105:0/647322379 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f1a0c07d3e0 0x7f1a0c07d860 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:55:28.449 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:28.449+0000 7f1a0b7fe700 1 -- 192.168.123.105:0/647322379 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f19fc008440 con 0x7f1a0c1089a0 2026-03-10T07:55:28.449 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:28.449+0000 7f1a0b7fe700 1 --2- 192.168.123.105:0/647322379 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f1a0c1089a0 0x7f1a0c07cea0 secure :-1 s=READY pgs=47 cs=0 l=1 rev1=1 crypto rx=0x7f19fc00bed0 tx=0x7f19fc00bfb0 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 
2026-03-10T07:55:28.450 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:28.449+0000 7f1a08ff9700 1 -- 192.168.123.105:0/647322379 <== mon.1 v2:192.168.123.108:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f19fc00a510 con 0x7f1a0c1089a0 2026-03-10T07:55:28.450 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:28.449+0000 7f1a11e5c700 1 -- 192.168.123.105:0/647322379 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f1a0c081e10 con 0x7f1a0c1089a0 2026-03-10T07:55:28.450 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:28.449+0000 7f1a11e5c700 1 -- 192.168.123.105:0/647322379 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f1a0c082360 con 0x7f1a0c1089a0 2026-03-10T07:55:28.450 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:28.450+0000 7f1a08ff9700 1 -- 192.168.123.105:0/647322379 <== mon.1 v2:192.168.123.108:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7f19fc00b870 con 0x7f1a0c1089a0 2026-03-10T07:55:28.450 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:28.450+0000 7f1a08ff9700 1 -- 192.168.123.105:0/647322379 <== mon.1 v2:192.168.123.108:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f19fc01cc40 con 0x7f1a0c1089a0 2026-03-10T07:55:28.451 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:28.451+0000 7f1a08ff9700 1 -- 192.168.123.105:0/647322379 <== mon.1 v2:192.168.123.108:3300/0 4 ==== mgrmap(e 38) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f19fc01c420 con 0x7f1a0c1089a0 2026-03-10T07:55:28.452 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:28.452+0000 7f1a08ff9700 1 --2- 192.168.123.105:0/647322379 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f19f4077910 0x7f19f4079dd0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:55:28.452 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:28.452+0000 7f1a08ff9700 1 -- 192.168.123.105:0/647322379 <== mon.1 v2:192.168.123.108:3300/0 5 ==== osd_map(80..80 src has 1..80) v4 ==== 6222+0+0 (secure 0 0 0) 0x7f19fc09a8d0 con 0x7f1a0c1089a0 2026-03-10T07:55:28.452 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:28.452+0000 7f1a11e5c700 1 -- 192.168.123.105:0/647322379 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f19f8005320 con 0x7f1a0c1089a0 2026-03-10T07:55:28.453 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:28.453+0000 7f1a0affd700 1 --2- 192.168.123.105:0/647322379 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f19f4077910 0x7f19f4079dd0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:55:28.453 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:28.453+0000 7f1a0affd700 1 --2- 192.168.123.105:0/647322379 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f19f4077910 0x7f19f4079dd0 secure :-1 s=READY pgs=67 cs=0 l=1 rev1=1 crypto rx=0x7f1a040095a0 tx=0x7f1a0400b040 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:55:28.460 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:28.456+0000 7f1a08ff9700 1 -- 192.168.123.105:0/647322379 <== mon.1 v2:192.168.123.108:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f19fc0630b0 con 0x7f1a0c1089a0 2026-03-10T07:55:28.639 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:28.639+0000 7f1a11e5c700 1 -- 192.168.123.105:0/647322379 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "health", "detail": "detail"} v 0) v1 -- 0x7f19f8005190 con 0x7f1a0c1089a0 
2026-03-10T07:55:28.640 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:28.639+0000 7f1a08ff9700 1 -- 192.168.123.105:0/647322379 <== mon.1 v2:192.168.123.108:3300/0 7 ==== mon_command_ack([{"prefix": "health", "detail": "detail"}]=0 v0) v1 ==== 74+0+1194 (secure 0 0 0) 0x7f19fc020090 con 0x7f1a0c1089a0 2026-03-10T07:55:28.640 INFO:teuthology.orchestra.run.vm05.stdout:HEALTH_WARN 1 filesystem is degraded; 1 filesystem with deprecated feature inline_data; Degraded data redundancy: 40/261 objects degraded (15.326%), 13 pgs degraded 2026-03-10T07:55:28.640 INFO:teuthology.orchestra.run.vm05.stdout:[WRN] FS_DEGRADED: 1 filesystem is degraded 2026-03-10T07:55:28.640 INFO:teuthology.orchestra.run.vm05.stdout: fs cephfs is degraded 2026-03-10T07:55:28.640 INFO:teuthology.orchestra.run.vm05.stdout:[WRN] FS_INLINE_DATA_DEPRECATED: 1 filesystem with deprecated feature inline_data 2026-03-10T07:55:28.640 INFO:teuthology.orchestra.run.vm05.stdout: fs cephfs has deprecated feature inline_data enabled. 
2026-03-10T07:55:28.640 INFO:teuthology.orchestra.run.vm05.stdout:[WRN] PG_DEGRADED: Degraded data redundancy: 40/261 objects degraded (15.326%), 13 pgs degraded 2026-03-10T07:55:28.640 INFO:teuthology.orchestra.run.vm05.stdout: pg 2.2 is active+undersized+degraded, acting [1,0] 2026-03-10T07:55:28.640 INFO:teuthology.orchestra.run.vm05.stdout: pg 2.3 is active+undersized+degraded, acting [2,1] 2026-03-10T07:55:28.640 INFO:teuthology.orchestra.run.vm05.stdout: pg 2.8 is active+undersized+degraded, acting [3,0] 2026-03-10T07:55:28.640 INFO:teuthology.orchestra.run.vm05.stdout: pg 2.b is active+undersized+degraded, acting [3,4] 2026-03-10T07:55:28.640 INFO:teuthology.orchestra.run.vm05.stdout: pg 2.f is active+undersized+degraded, acting [4,0] 2026-03-10T07:55:28.640 INFO:teuthology.orchestra.run.vm05.stdout: pg 2.14 is active+undersized+degraded, acting [3,4] 2026-03-10T07:55:28.640 INFO:teuthology.orchestra.run.vm05.stdout: pg 2.16 is active+undersized+degraded, acting [3,2] 2026-03-10T07:55:28.640 INFO:teuthology.orchestra.run.vm05.stdout: pg 2.18 is active+undersized+degraded, acting [4,3] 2026-03-10T07:55:28.640 INFO:teuthology.orchestra.run.vm05.stdout: pg 2.1a is active+undersized+degraded, acting [3,4] 2026-03-10T07:55:28.640 INFO:teuthology.orchestra.run.vm05.stdout: pg 2.1b is active+undersized+degraded, acting [1,0] 2026-03-10T07:55:28.640 INFO:teuthology.orchestra.run.vm05.stdout: pg 2.1c is active+undersized+degraded, acting [4,2] 2026-03-10T07:55:28.640 INFO:teuthology.orchestra.run.vm05.stdout: pg 2.1d is active+undersized+degraded, acting [3,0] 2026-03-10T07:55:28.640 INFO:teuthology.orchestra.run.vm05.stdout: pg 2.1e is active+undersized+degraded, acting [2,0] 2026-03-10T07:55:28.645 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:28.645+0000 7f19f27fc700 1 -- 192.168.123.105:0/647322379 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f19f4077910 msgr2=0x7f19f4079dd0 secure :-1 
s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:55:28.645 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:28.645+0000 7f19f27fc700 1 --2- 192.168.123.105:0/647322379 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f19f4077910 0x7f19f4079dd0 secure :-1 s=READY pgs=67 cs=0 l=1 rev1=1 crypto rx=0x7f1a040095a0 tx=0x7f1a0400b040 comp rx=0 tx=0).stop 2026-03-10T07:55:28.645 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:28.645+0000 7f19f27fc700 1 -- 192.168.123.105:0/647322379 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f1a0c1089a0 msgr2=0x7f1a0c07cea0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:55:28.645 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:28.645+0000 7f19f27fc700 1 --2- 192.168.123.105:0/647322379 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f1a0c1089a0 0x7f1a0c07cea0 secure :-1 s=READY pgs=47 cs=0 l=1 rev1=1 crypto rx=0x7f19fc00bed0 tx=0x7f19fc00bfb0 comp rx=0 tx=0).stop 2026-03-10T07:55:28.645 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:28.645+0000 7f19f27fc700 1 -- 192.168.123.105:0/647322379 shutdown_connections 2026-03-10T07:55:28.645 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:28.645+0000 7f19f27fc700 1 --2- 192.168.123.105:0/647322379 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f1a0c1089a0 0x7f1a0c07cea0 unknown :-1 s=CLOSED pgs=47 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:55:28.645 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:28.645+0000 7f19f27fc700 1 --2- 192.168.123.105:0/647322379 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f19f4077910 0x7f19f4079dd0 unknown :-1 s=CLOSED pgs=67 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:55:28.645 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:28.645+0000 7f19f27fc700 1 --2- 192.168.123.105:0/647322379 >> 
[v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f1a0c07d3e0 0x7f1a0c07d860 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:55:28.645 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:28.645+0000 7f19f27fc700 1 -- 192.168.123.105:0/647322379 >> 192.168.123.105:0/647322379 conn(0x7f1a0c06ce20 msgr2=0x7f1a0c071590 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:55:28.646 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:28.645+0000 7f19f27fc700 1 -- 192.168.123.105:0/647322379 shutdown_connections 2026-03-10T07:55:28.646 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:28.645+0000 7f19f27fc700 1 -- 192.168.123.105:0/647322379 wait complete. 2026-03-10T07:55:28.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:55:28 vm05.local ceph-mon[130117]: pgmap v128: 65 pgs: 15 active+undersized, 13 active+undersized+degraded, 37 active+clean; 255 MiB data, 1.3 GiB used, 119 GiB / 120 GiB avail; 40/261 objects degraded (15.326%) 2026-03-10T07:55:28.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:55:28 vm05.local ceph-mon[130117]: Upgrade: Updating mds.cephfs.vm05.omfhnh 2026-03-10T07:55:28.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:55:28 vm05.local ceph-mon[130117]: Deploying daemon mds.cephfs.vm05.omfhnh on vm05 2026-03-10T07:55:28.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:55:28 vm05.local ceph-mon[130117]: from='client.34250 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T07:55:28.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:55:28 vm05.local ceph-mon[130117]: from='client.34254 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T07:55:28.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:55:28 vm05.local ceph-mon[130117]: Health check cleared: PG_AVAILABILITY (was: Reduced data availability: 7 pgs 
peering) 2026-03-10T07:55:28.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:55:28 vm05.local ceph-mon[130117]: from='client.34258 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T07:55:28.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:55:28 vm05.local ceph-mon[130117]: from='client.? 192.168.123.105:0/4229693500' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T07:55:28.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:55:28 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm05.omfhnh"}]: dispatch 2026-03-10T07:55:28.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:55:28 vm05.local ceph-mon[130117]: from='client.? 192.168.123.105:0/3310954901' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch 2026-03-10T07:55:28.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:55:28 vm05.local ceph-mon[130117]: Health check cleared: OSD_DOWN (was: 1 osds down) 2026-03-10T07:55:28.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:55:28 vm05.local ceph-mon[130117]: osd.5 [v2:192.168.123.108:6816/2090596459,v1:192.168.123.108:6817/2090596459] boot 2026-03-10T07:55:28.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:55:28 vm05.local ceph-mon[130117]: osdmap e80: 6 total, 6 up, 6 in 2026-03-10T07:55:28.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:55:28 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "osd metadata", "id": 5}]: dispatch 2026-03-10T07:55:28.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:55:28 vm08.local ceph-mon[107898]: pgmap v128: 65 pgs: 15 active+undersized, 13 active+undersized+degraded, 37 active+clean; 255 MiB data, 1.3 GiB used, 119 GiB / 120 GiB avail; 40/261 objects degraded (15.326%) 2026-03-10T07:55:28.918 
INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:55:28 vm08.local ceph-mon[107898]: Upgrade: Updating mds.cephfs.vm05.omfhnh 2026-03-10T07:55:28.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:55:28 vm08.local ceph-mon[107898]: Deploying daemon mds.cephfs.vm05.omfhnh on vm05 2026-03-10T07:55:28.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:55:28 vm08.local ceph-mon[107898]: from='client.34250 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T07:55:28.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:55:28 vm08.local ceph-mon[107898]: from='client.34254 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T07:55:28.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:55:28 vm08.local ceph-mon[107898]: Health check cleared: PG_AVAILABILITY (was: Reduced data availability: 7 pgs peering) 2026-03-10T07:55:28.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:55:28 vm08.local ceph-mon[107898]: from='client.34258 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T07:55:28.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:55:28 vm08.local ceph-mon[107898]: from='client.? 192.168.123.105:0/4229693500' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T07:55:28.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:55:28 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm05.omfhnh"}]: dispatch 2026-03-10T07:55:28.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:55:28 vm08.local ceph-mon[107898]: from='client.? 
192.168.123.105:0/3310954901' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch 2026-03-10T07:55:28.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:55:28 vm08.local ceph-mon[107898]: Health check cleared: OSD_DOWN (was: 1 osds down) 2026-03-10T07:55:28.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:55:28 vm08.local ceph-mon[107898]: osd.5 [v2:192.168.123.108:6816/2090596459,v1:192.168.123.108:6817/2090596459] boot 2026-03-10T07:55:28.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:55:28 vm08.local ceph-mon[107898]: osdmap e80: 6 total, 6 up, 6 in 2026-03-10T07:55:28.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:55:28 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "osd metadata", "id": 5}]: dispatch 2026-03-10T07:55:29.579 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:55:29 vm05.local ceph-mon[130117]: from='client.34266 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T07:55:29.579 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:55:29 vm05.local ceph-mon[130117]: from='client.? 192.168.123.105:0/647322379' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch 2026-03-10T07:55:29.579 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:55:29 vm05.local ceph-mon[130117]: Health check update: Degraded data redundancy: 40/261 objects degraded (15.326%), 13 pgs degraded (PG_DEGRADED) 2026-03-10T07:55:29.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:55:29 vm08.local ceph-mon[107898]: from='client.34266 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T07:55:29.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:55:29 vm08.local ceph-mon[107898]: from='client.? 
192.168.123.105:0/647322379' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch 2026-03-10T07:55:29.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:55:29 vm08.local ceph-mon[107898]: Health check update: Degraded data redundancy: 40/261 objects degraded (15.326%), 13 pgs degraded (PG_DEGRADED) 2026-03-10T07:55:30.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:55:30 vm05.local ceph-mon[130117]: pgmap v131: 65 pgs: 2 peering, 14 active+undersized, 12 active+undersized+degraded, 37 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 4.0 MiB/s rd, 1 op/s; 37/261 objects degraded (14.176%) 2026-03-10T07:55:30.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:55:30 vm05.local ceph-mon[130117]: osdmap e81: 6 total, 6 up, 6 in 2026-03-10T07:55:30.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:55:30 vm08.local ceph-mon[107898]: pgmap v131: 65 pgs: 2 peering, 14 active+undersized, 12 active+undersized+degraded, 37 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 4.0 MiB/s rd, 1 op/s; 37/261 objects degraded (14.176%) 2026-03-10T07:55:30.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:55:30 vm08.local ceph-mon[107898]: osdmap e81: 6 total, 6 up, 6 in 2026-03-10T07:55:33.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:55:32 vm05.local ceph-mon[130117]: pgmap v133: 65 pgs: 2 peering, 13 active+undersized, 12 active+undersized+degraded, 38 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 7.4 MiB/s rd, 3 op/s; 37/261 objects degraded (14.176%) 2026-03-10T07:55:33.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:55:32 vm08.local ceph-mon[107898]: pgmap v133: 65 pgs: 2 peering, 13 active+undersized, 12 active+undersized+degraded, 38 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 7.4 MiB/s rd, 3 op/s; 37/261 objects degraded (14.176%) 2026-03-10T07:55:34.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:55:33 vm05.local 
ceph-mon[130117]: pgmap v134: 65 pgs: 2 peering, 63 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 26 MiB/s rd, 7 op/s 2026-03-10T07:55:34.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:55:33 vm05.local ceph-mon[130117]: Health check cleared: PG_DEGRADED (was: Degraded data redundancy: 37/261 objects degraded (14.176%), 12 pgs degraded) 2026-03-10T07:55:34.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:55:33 vm08.local ceph-mon[107898]: pgmap v134: 65 pgs: 2 peering, 63 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 26 MiB/s rd, 7 op/s 2026-03-10T07:55:34.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:55:33 vm08.local ceph-mon[107898]: Health check cleared: PG_DEGRADED (was: Degraded data redundancy: 37/261 objects degraded (14.176%), 12 pgs degraded) 2026-03-10T07:55:35.057 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:55:34 vm05.local ceph-mon[130117]: reconnect by client.24347 192.168.123.108:0/2923077033 after 0.002 2026-03-10T07:55:35.057 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:55:34 vm05.local ceph-mon[130117]: mds.? 
[v2:192.168.123.105:6828/427555544,v1:192.168.123.105:6829/427555544] up:reconnect 2026-03-10T07:55:35.057 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:55:34 vm05.local ceph-mon[130117]: fsmap cephfs:1/1 {0=cephfs.vm05.pavqil=up:reconnect} 2 up:standby 2026-03-10T07:55:35.057 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:55:34 vm05.local ceph-mon[130117]: reconnect by client.14542 192.168.144.1:0/2345317662 after 0.00500001 2026-03-10T07:55:35.057 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:55:34 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:55:35.057 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:55:34 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:55:35.057 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:55:34 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T07:55:35.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:55:34 vm08.local ceph-mon[107898]: reconnect by client.24347 192.168.123.108:0/2923077033 after 0.002 2026-03-10T07:55:35.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:55:34 vm08.local ceph-mon[107898]: mds.? 
[v2:192.168.123.105:6828/427555544,v1:192.168.123.105:6829/427555544] up:reconnect 2026-03-10T07:55:35.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:55:34 vm08.local ceph-mon[107898]: fsmap cephfs:1/1 {0=cephfs.vm05.pavqil=up:reconnect} 2 up:standby 2026-03-10T07:55:35.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:55:34 vm08.local ceph-mon[107898]: reconnect by client.14542 192.168.144.1:0/2345317662 after 0.00500001 2026-03-10T07:55:35.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:55:34 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:55:35.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:55:34 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:55:35.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:55:34 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T07:55:36.153 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:55:35 vm05.local ceph-mon[130117]: pgmap v135: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 26 MiB/s rd, 7 op/s 2026-03-10T07:55:36.153 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:55:35 vm05.local ceph-mon[130117]: mds.? [v2:192.168.123.105:6828/427555544,v1:192.168.123.105:6829/427555544] up:rejoin 2026-03-10T07:55:36.153 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:55:35 vm05.local ceph-mon[130117]: mds.? 
[v2:192.168.123.105:6826/4251520120,v1:192.168.123.105:6827/4251520120] up:boot 2026-03-10T07:55:36.153 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:55:35 vm05.local ceph-mon[130117]: fsmap cephfs:1/1 {0=cephfs.vm05.pavqil=up:rejoin} 3 up:standby 2026-03-10T07:55:36.153 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:55:35 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm05.omfhnh"}]: dispatch 2026-03-10T07:55:36.153 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:55:35 vm05.local ceph-mon[130117]: daemon mds.cephfs.vm05.pavqil is now active in filesystem cephfs as rank 0 2026-03-10T07:55:36.154 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:55:35 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:55:36.154 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:55:35 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:55:36.154 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:55:35 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:55:36.154 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:55:35 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:55:36.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:55:35 vm08.local ceph-mon[107898]: pgmap v135: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 26 MiB/s rd, 7 op/s 2026-03-10T07:55:36.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:55:35 vm08.local ceph-mon[107898]: mds.? [v2:192.168.123.105:6828/427555544,v1:192.168.123.105:6829/427555544] up:rejoin 2026-03-10T07:55:36.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:55:35 vm08.local ceph-mon[107898]: mds.? 
[v2:192.168.123.105:6826/4251520120,v1:192.168.123.105:6827/4251520120] up:boot 2026-03-10T07:55:36.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:55:35 vm08.local ceph-mon[107898]: fsmap cephfs:1/1 {0=cephfs.vm05.pavqil=up:rejoin} 3 up:standby 2026-03-10T07:55:36.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:55:35 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm05.omfhnh"}]: dispatch 2026-03-10T07:55:36.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:55:35 vm08.local ceph-mon[107898]: daemon mds.cephfs.vm05.pavqil is now active in filesystem cephfs as rank 0 2026-03-10T07:55:36.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:55:35 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:55:36.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:55:35 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:55:36.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:55:35 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:55:36.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:55:35 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:55:37.133 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:55:36 vm05.local ceph-mon[130117]: Health check cleared: FS_DEGRADED (was: 1 filesystem is degraded) 2026-03-10T07:55:37.133 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:55:36 vm05.local ceph-mon[130117]: mds.? 
[v2:192.168.123.105:6828/427555544,v1:192.168.123.105:6829/427555544] up:active 2026-03-10T07:55:37.133 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:55:36 vm05.local ceph-mon[130117]: fsmap cephfs:1 {0=cephfs.vm05.pavqil=up:active} 3 up:standby 2026-03-10T07:55:37.133 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:55:36 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:55:37.133 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:55:36 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:55:37.133 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:55:36 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T07:55:37.133 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:55:36 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T07:55:37.133 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:55:36 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:55:37.133 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:55:36 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T07:55:37.133 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:55:36 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T07:55:37.133 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:55:36 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T07:55:37.133 
INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:55:36 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T07:55:37.133 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:55:36 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T07:55:37.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:55:36 vm08.local ceph-mon[107898]: Health check cleared: FS_DEGRADED (was: 1 filesystem is degraded) 2026-03-10T07:55:37.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:55:36 vm08.local ceph-mon[107898]: mds.? [v2:192.168.123.105:6828/427555544,v1:192.168.123.105:6829/427555544] up:active 2026-03-10T07:55:37.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:55:36 vm08.local ceph-mon[107898]: fsmap cephfs:1 {0=cephfs.vm05.pavqil=up:active} 3 up:standby 2026-03-10T07:55:37.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:55:36 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:55:37.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:55:36 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:55:37.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:55:36 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T07:55:37.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:55:36 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T07:55:37.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:55:36 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 
2026-03-10T07:55:37.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:55:36 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T07:55:37.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:55:36 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T07:55:37.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:55:36 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T07:55:37.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:55:36 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T07:55:37.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:55:36 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T07:55:37.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:55:37 vm05.local ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508-mon-vm05[130113]: 2026-03-10T07:55:37.451+0000 7f980f7e7640 -1 log_channel(cluster) log [ERR] : Health check failed: 1 filesystem is offline (MDS_ALL_DOWN) 2026-03-10T07:55:38.407 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:55:37 vm05.local ceph-mon[130117]: pgmap v136: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 23 MiB/s rd, 122 B/s wr, 8 op/s 2026-03-10T07:55:38.407 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:55:37 vm05.local ceph-mon[130117]: Upgrade: Updating mds.cephfs.vm05.pavqil 2026-03-10T07:55:38.407 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:55:37 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 
2026-03-10T07:55:38.407 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:55:37 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "auth get-or-create", "entity": "mds.cephfs.vm05.pavqil", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]: dispatch 2026-03-10T07:55:38.407 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:55:37 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T07:55:38.407 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:55:37 vm05.local ceph-mon[130117]: Deploying daemon mds.cephfs.vm05.pavqil on vm05 2026-03-10T07:55:38.407 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:55:37 vm05.local ceph-mon[130117]: Health check failed: 1 filesystem is degraded (FS_DEGRADED) 2026-03-10T07:55:38.407 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:55:37 vm05.local ceph-mon[130117]: Health check failed: 1 filesystem is offline (MDS_ALL_DOWN) 2026-03-10T07:55:38.407 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:55:37 vm05.local ceph-mon[130117]: osdmap e82: 6 total, 6 up, 6 in 2026-03-10T07:55:38.407 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:55:37 vm05.local ceph-mon[130117]: Standby daemon mds.cephfs.vm08.ybmbgd assigned to filesystem cephfs as rank 0 2026-03-10T07:55:38.407 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:55:37 vm05.local ceph-mon[130117]: Health check cleared: MDS_ALL_DOWN (was: 1 filesystem is offline) 2026-03-10T07:55:38.408 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:55:37 vm05.local ceph-mon[130117]: fsmap cephfs:0/1 3 up:standby, 1 failed 2026-03-10T07:55:38.408 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:55:37 vm05.local ceph-mon[130117]: fsmap cephfs:1/1 {0=cephfs.vm08.ybmbgd=up:replay} 2 up:standby 2026-03-10T07:55:38.408 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 
07:55:37 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm05.pavqil"}]: dispatch 2026-03-10T07:55:38.418 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:55:37 vm08.local ceph-mon[107898]: pgmap v136: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 23 MiB/s rd, 122 B/s wr, 8 op/s 2026-03-10T07:55:38.418 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:55:37 vm08.local ceph-mon[107898]: Upgrade: Updating mds.cephfs.vm05.pavqil 2026-03-10T07:55:38.418 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:55:37 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:55:38.418 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:55:37 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "auth get-or-create", "entity": "mds.cephfs.vm05.pavqil", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]: dispatch 2026-03-10T07:55:38.418 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:55:37 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T07:55:38.418 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:55:37 vm08.local ceph-mon[107898]: Deploying daemon mds.cephfs.vm05.pavqil on vm05 2026-03-10T07:55:38.418 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:55:37 vm08.local ceph-mon[107898]: Health check failed: 1 filesystem is degraded (FS_DEGRADED) 2026-03-10T07:55:38.418 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:55:37 vm08.local ceph-mon[107898]: Health check failed: 1 filesystem is offline (MDS_ALL_DOWN) 2026-03-10T07:55:38.418 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:55:37 vm08.local ceph-mon[107898]: osdmap e82: 6 total, 6 up, 6 in 
2026-03-10T07:55:38.418 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:55:37 vm08.local ceph-mon[107898]: Standby daemon mds.cephfs.vm08.ybmbgd assigned to filesystem cephfs as rank 0 2026-03-10T07:55:38.418 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:55:37 vm08.local ceph-mon[107898]: Health check cleared: MDS_ALL_DOWN (was: 1 filesystem is offline) 2026-03-10T07:55:38.418 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:55:37 vm08.local ceph-mon[107898]: fsmap cephfs:0/1 3 up:standby, 1 failed 2026-03-10T07:55:38.418 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:55:37 vm08.local ceph-mon[107898]: fsmap cephfs:1/1 {0=cephfs.vm08.ybmbgd=up:replay} 2 up:standby 2026-03-10T07:55:38.418 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:55:37 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm05.pavqil"}]: dispatch 2026-03-10T07:55:41.060 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:55:40 vm05.local ceph-mon[130117]: pgmap v138: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 23 MiB/s rd, 110 B/s wr, 10 op/s 2026-03-10T07:55:41.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:55:40 vm08.local ceph-mon[107898]: pgmap v138: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 23 MiB/s rd, 110 B/s wr, 10 op/s 2026-03-10T07:55:43.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:55:42 vm05.local ceph-mon[130117]: pgmap v139: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 21 MiB/s rd, 204 B/s wr, 9 op/s 2026-03-10T07:55:43.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:55:42 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:55:43.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:55:42 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' 
entity='mgr.vm05.blexke' 2026-03-10T07:55:43.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:55:42 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T07:55:43.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:55:42 vm05.local ceph-mon[130117]: mds.? [v2:192.168.123.105:6828/426813062,v1:192.168.123.105:6829/426813062] up:boot 2026-03-10T07:55:43.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:55:42 vm05.local ceph-mon[130117]: mds.? [v2:192.168.123.108:6824/3815585988,v1:192.168.123.108:6825/3815585988] up:reconnect 2026-03-10T07:55:43.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:55:42 vm05.local ceph-mon[130117]: fsmap cephfs:1/1 {0=cephfs.vm08.ybmbgd=up:reconnect} 3 up:standby 2026-03-10T07:55:43.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:55:42 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm05.pavqil"}]: dispatch 2026-03-10T07:55:43.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:55:42 vm05.local ceph-mon[130117]: reconnect by client.24347 192.168.123.108:0/2923077033 after 0 2026-03-10T07:55:43.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:55:42 vm05.local ceph-mon[130117]: reconnect by client.14542 192.168.144.1:0/2345317662 after 0.001 2026-03-10T07:55:43.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:55:42 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:55:43.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:55:42 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T07:55:43.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:55:42 vm05.local ceph-mon[130117]: 
from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:55:43.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:55:42 vm08.local ceph-mon[107898]: pgmap v139: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 21 MiB/s rd, 204 B/s wr, 9 op/s 2026-03-10T07:55:43.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:55:42 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:55:43.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:55:42 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:55:43.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:55:42 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T07:55:43.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:55:42 vm08.local ceph-mon[107898]: mds.? [v2:192.168.123.105:6828/426813062,v1:192.168.123.105:6829/426813062] up:boot 2026-03-10T07:55:43.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:55:42 vm08.local ceph-mon[107898]: mds.? 
[v2:192.168.123.108:6824/3815585988,v1:192.168.123.108:6825/3815585988] up:reconnect 2026-03-10T07:55:43.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:55:42 vm08.local ceph-mon[107898]: fsmap cephfs:1/1 {0=cephfs.vm08.ybmbgd=up:reconnect} 3 up:standby 2026-03-10T07:55:43.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:55:42 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm05.pavqil"}]: dispatch 2026-03-10T07:55:43.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:55:42 vm08.local ceph-mon[107898]: reconnect by client.24347 192.168.123.108:0/2923077033 after 0 2026-03-10T07:55:43.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:55:42 vm08.local ceph-mon[107898]: reconnect by client.14542 192.168.144.1:0/2345317662 after 0.001 2026-03-10T07:55:43.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:55:42 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:55:43.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:55:42 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T07:55:43.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:55:42 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:55:43.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:55:43 vm05.local ceph-mon[130117]: mds.? 
[v2:192.168.123.108:6824/3815585988,v1:192.168.123.108:6825/3815585988] up:rejoin 2026-03-10T07:55:43.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:55:43 vm05.local ceph-mon[130117]: fsmap cephfs:1/1 {0=cephfs.vm08.ybmbgd=up:rejoin} 3 up:standby 2026-03-10T07:55:43.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:55:43 vm05.local ceph-mon[130117]: daemon mds.cephfs.vm08.ybmbgd is now active in filesystem cephfs as rank 0 2026-03-10T07:55:43.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:55:43 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:55:43.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:55:43 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:55:43.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:55:43 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:55:43.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:55:43 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:55:44.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:55:43 vm08.local ceph-mon[107898]: mds.? 
[v2:192.168.123.108:6824/3815585988,v1:192.168.123.108:6825/3815585988] up:rejoin 2026-03-10T07:55:44.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:55:43 vm08.local ceph-mon[107898]: fsmap cephfs:1/1 {0=cephfs.vm08.ybmbgd=up:rejoin} 3 up:standby 2026-03-10T07:55:44.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:55:43 vm08.local ceph-mon[107898]: daemon mds.cephfs.vm08.ybmbgd is now active in filesystem cephfs as rank 0 2026-03-10T07:55:44.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:55:43 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:55:44.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:55:43 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:55:44.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:55:43 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:55:44.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:55:43 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:55:44.868 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:55:44 vm08.local ceph-mon[107898]: pgmap v140: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 27 MiB/s rd, 204 B/s wr, 11 op/s 2026-03-10T07:55:44.868 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:55:44 vm08.local ceph-mon[107898]: Health check cleared: FS_DEGRADED (was: 1 filesystem is degraded) 2026-03-10T07:55:44.868 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:55:44 vm08.local ceph-mon[107898]: mds.? 
[v2:192.168.123.108:6824/3815585988,v1:192.168.123.108:6825/3815585988] up:active 2026-03-10T07:55:44.868 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:55:44 vm08.local ceph-mon[107898]: fsmap cephfs:1 {0=cephfs.vm08.ybmbgd=up:active} 3 up:standby 2026-03-10T07:55:44.868 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:55:44 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:55:44.868 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:55:44 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:55:44.868 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:55:44 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T07:55:44.868 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:55:44 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T07:55:44.869 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:55:44 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:55:44.869 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:55:44 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T07:55:44.869 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:55:44 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T07:55:44.869 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:55:44 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T07:55:44.869 
INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:55:44 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T07:55:44.869 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:55:44 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T07:55:44.869 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:55:44 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:55:44.869 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:55:44 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "auth get-or-create", "entity": "mds.cephfs.vm08.ybmbgd", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]: dispatch 2026-03-10T07:55:44.869 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:55:44 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T07:55:45.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:55:44 vm05.local ceph-mon[130117]: pgmap v140: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 27 MiB/s rd, 204 B/s wr, 11 op/s 2026-03-10T07:55:45.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:55:44 vm05.local ceph-mon[130117]: Health check cleared: FS_DEGRADED (was: 1 filesystem is degraded) 2026-03-10T07:55:45.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:55:44 vm05.local ceph-mon[130117]: mds.? 
[v2:192.168.123.108:6824/3815585988,v1:192.168.123.108:6825/3815585988] up:active 2026-03-10T07:55:45.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:55:44 vm05.local ceph-mon[130117]: fsmap cephfs:1 {0=cephfs.vm08.ybmbgd=up:active} 3 up:standby 2026-03-10T07:55:45.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:55:44 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:55:45.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:55:44 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:55:45.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:55:44 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T07:55:45.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:55:44 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T07:55:45.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:55:44 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:55:45.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:55:44 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T07:55:45.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:55:44 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T07:55:45.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:55:44 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T07:55:45.157 
INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:55:44 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T07:55:45.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:55:44 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T07:55:45.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:55:44 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:55:45.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:55:44 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "auth get-or-create", "entity": "mds.cephfs.vm08.ybmbgd", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]: dispatch 2026-03-10T07:55:45.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:55:44 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T07:55:45.657 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:55:45 vm05.local ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508-mon-vm05[130113]: 2026-03-10T07:55:45.162+0000 7f980f7e7640 -1 log_channel(cluster) log [ERR] : Health check failed: 1 filesystem is offline (MDS_ALL_DOWN) 2026-03-10T07:55:46.407 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:55:45 vm05.local ceph-mon[130117]: Upgrade: Updating mds.cephfs.vm08.ybmbgd 2026-03-10T07:55:46.407 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:55:45 vm05.local ceph-mon[130117]: Deploying daemon mds.cephfs.vm08.ybmbgd on vm08 2026-03-10T07:55:46.407 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:55:45 vm05.local ceph-mon[130117]: pgmap v141: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 26 
MiB/s rd, 204 B/s wr, 11 op/s 2026-03-10T07:55:46.407 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:55:45 vm05.local ceph-mon[130117]: Health check failed: 1 filesystem is degraded (FS_DEGRADED) 2026-03-10T07:55:46.407 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:55:45 vm05.local ceph-mon[130117]: Health check failed: 1 filesystem is offline (MDS_ALL_DOWN) 2026-03-10T07:55:46.407 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:55:45 vm05.local ceph-mon[130117]: osdmap e83: 6 total, 6 up, 6 in 2026-03-10T07:55:46.407 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:55:45 vm05.local ceph-mon[130117]: Standby daemon mds.cephfs.vm08.dgsaon assigned to filesystem cephfs as rank 0 2026-03-10T07:55:46.407 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:55:45 vm05.local ceph-mon[130117]: Health check cleared: MDS_ALL_DOWN (was: 1 filesystem is offline) 2026-03-10T07:55:46.407 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:55:45 vm05.local ceph-mon[130117]: fsmap cephfs:0/1 3 up:standby, 1 failed 2026-03-10T07:55:46.407 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:55:45 vm05.local ceph-mon[130117]: fsmap cephfs:1/1 {0=cephfs.vm08.dgsaon=up:replay} 2 up:standby 2026-03-10T07:55:46.418 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:55:46 vm08.local ceph-mon[107898]: Upgrade: Updating mds.cephfs.vm08.ybmbgd 2026-03-10T07:55:46.418 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:55:46 vm08.local ceph-mon[107898]: Deploying daemon mds.cephfs.vm08.ybmbgd on vm08 2026-03-10T07:55:46.418 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:55:46 vm08.local ceph-mon[107898]: pgmap v141: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 26 MiB/s rd, 204 B/s wr, 11 op/s 2026-03-10T07:55:46.418 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:55:46 vm08.local ceph-mon[107898]: Health check failed: 1 filesystem is degraded (FS_DEGRADED) 2026-03-10T07:55:46.418 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 
07:55:46 vm08.local ceph-mon[107898]: Health check failed: 1 filesystem is offline (MDS_ALL_DOWN) 2026-03-10T07:55:46.418 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:55:46 vm08.local ceph-mon[107898]: osdmap e83: 6 total, 6 up, 6 in 2026-03-10T07:55:46.418 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:55:46 vm08.local ceph-mon[107898]: Standby daemon mds.cephfs.vm08.dgsaon assigned to filesystem cephfs as rank 0 2026-03-10T07:55:46.418 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:55:46 vm08.local ceph-mon[107898]: Health check cleared: MDS_ALL_DOWN (was: 1 filesystem is offline) 2026-03-10T07:55:46.418 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:55:46 vm08.local ceph-mon[107898]: fsmap cephfs:0/1 3 up:standby, 1 failed 2026-03-10T07:55:46.418 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:55:46 vm08.local ceph-mon[107898]: fsmap cephfs:1/1 {0=cephfs.vm08.dgsaon=up:replay} 2 up:standby 2026-03-10T07:55:48.657 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:55:48 vm05.local ceph-mon[130117]: pgmap v143: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 32 MiB/s rd, 328 B/s wr, 14 op/s 2026-03-10T07:55:48.668 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:55:48 vm08.local ceph-mon[107898]: pgmap v143: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 32 MiB/s rd, 328 B/s wr, 14 op/s 2026-03-10T07:55:50.318 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:55:50 vm08.local ceph-mon[107898]: pgmap v144: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 39 MiB/s rd, 307 B/s wr, 16 op/s 2026-03-10T07:55:50.318 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:55:50 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:55:50.318 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:55:50 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 
2026-03-10T07:55:50.318 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:55:50 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T07:55:50.657 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:55:50 vm05.local ceph-mon[130117]: pgmap v144: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 39 MiB/s rd, 307 B/s wr, 16 op/s 2026-03-10T07:55:50.657 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:55:50 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:55:50.657 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:55:50 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:55:50.657 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:55:50 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T07:55:51.334 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:55:51 vm08.local ceph-mon[107898]: mds.? 
[v2:192.168.123.108:6824/1209244,v1:192.168.123.108:6825/1209244] up:boot 2026-03-10T07:55:51.334 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:55:51 vm08.local ceph-mon[107898]: fsmap cephfs:1/1 {0=cephfs.vm08.dgsaon=up:replay} 3 up:standby 2026-03-10T07:55:51.334 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:55:51 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm08.ybmbgd"}]: dispatch 2026-03-10T07:55:51.334 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:55:51 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:55:51.334 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:55:51 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:55:51.334 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:55:51 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:55:51.334 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:55:51 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:55:51.657 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:55:51 vm05.local ceph-mon[130117]: mds.? 
[v2:192.168.123.108:6824/1209244,v1:192.168.123.108:6825/1209244] up:boot 2026-03-10T07:55:51.657 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:55:51 vm05.local ceph-mon[130117]: fsmap cephfs:1/1 {0=cephfs.vm08.dgsaon=up:replay} 3 up:standby 2026-03-10T07:55:51.657 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:55:51 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm08.ybmbgd"}]: dispatch 2026-03-10T07:55:51.657 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:55:51 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:55:51.657 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:55:51 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:55:51.657 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:55:51 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:55:51.657 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:55:51 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:55:52.418 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:55:52 vm08.local ceph-mon[107898]: pgmap v145: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 40 MiB/s rd, 204 B/s wr, 15 op/s 2026-03-10T07:55:52.418 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:55:52 vm08.local ceph-mon[107898]: mds.? 
[v2:192.168.123.108:6826/2963085185,v1:192.168.123.108:6827/2963085185] up:reconnect 2026-03-10T07:55:52.418 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:55:52 vm08.local ceph-mon[107898]: fsmap cephfs:1/1 {0=cephfs.vm08.dgsaon=up:reconnect} 3 up:standby 2026-03-10T07:55:52.418 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:55:52 vm08.local ceph-mon[107898]: reconnect by client.14542 192.168.144.1:0/2345317662 after 0 2026-03-10T07:55:52.418 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:55:52 vm08.local ceph-mon[107898]: reconnect by client.24347 192.168.123.108:0/2923077033 after 0 2026-03-10T07:55:52.418 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:55:52 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:55:52.418 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:55:52 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:55:52.418 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:55:52 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T07:55:52.418 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:55:52 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T07:55:52.418 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:55:52 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:55:52.418 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:55:52 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T07:55:52.418 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:55:52 vm08.local ceph-mon[107898]: 
from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T07:55:52.418 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:55:52 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T07:55:52.418 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:55:52 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T07:55:52.418 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:55:52 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T07:55:52.657 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:55:52 vm05.local ceph-mon[130117]: pgmap v145: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 40 MiB/s rd, 204 B/s wr, 15 op/s 2026-03-10T07:55:52.657 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:55:52 vm05.local ceph-mon[130117]: mds.? 
[v2:192.168.123.108:6826/2963085185,v1:192.168.123.108:6827/2963085185] up:reconnect 2026-03-10T07:55:52.657 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:55:52 vm05.local ceph-mon[130117]: fsmap cephfs:1/1 {0=cephfs.vm08.dgsaon=up:reconnect} 3 up:standby 2026-03-10T07:55:52.657 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:55:52 vm05.local ceph-mon[130117]: reconnect by client.14542 192.168.144.1:0/2345317662 after 0 2026-03-10T07:55:52.657 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:55:52 vm05.local ceph-mon[130117]: reconnect by client.24347 192.168.123.108:0/2923077033 after 0 2026-03-10T07:55:52.657 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:55:52 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:55:52.657 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:55:52 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:55:52.657 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:55:52 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T07:55:52.657 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:55:52 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T07:55:52.657 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:55:52 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:55:52.657 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:55:52 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T07:55:52.657 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:55:52 vm05.local ceph-mon[130117]: 
from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T07:55:52.657 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:55:52 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T07:55:52.657 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:55:52 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T07:55:52.657 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:55:52 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T07:55:53.657 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:55:53 vm05.local ceph-mon[130117]: Upgrade: Waiting for mds.cephfs.vm08.dgsaon to be up:active (currently up:reconnect) 2026-03-10T07:55:53.657 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:55:53 vm05.local ceph-mon[130117]: mds.? [v2:192.168.123.108:6826/2963085185,v1:192.168.123.108:6827/2963085185] up:rejoin 2026-03-10T07:55:53.657 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:55:53 vm05.local ceph-mon[130117]: fsmap cephfs:1/1 {0=cephfs.vm08.dgsaon=up:rejoin} 3 up:standby 2026-03-10T07:55:53.657 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:55:53 vm05.local ceph-mon[130117]: daemon mds.cephfs.vm08.dgsaon is now active in filesystem cephfs as rank 0 2026-03-10T07:55:53.668 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:55:53 vm08.local ceph-mon[107898]: Upgrade: Waiting for mds.cephfs.vm08.dgsaon to be up:active (currently up:reconnect) 2026-03-10T07:55:53.668 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:55:53 vm08.local ceph-mon[107898]: mds.? 
[v2:192.168.123.108:6826/2963085185,v1:192.168.123.108:6827/2963085185] up:rejoin 2026-03-10T07:55:53.668 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:55:53 vm08.local ceph-mon[107898]: fsmap cephfs:1/1 {0=cephfs.vm08.dgsaon=up:rejoin} 3 up:standby 2026-03-10T07:55:53.668 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:55:53 vm08.local ceph-mon[107898]: daemon mds.cephfs.vm08.dgsaon is now active in filesystem cephfs as rank 0 2026-03-10T07:55:54.657 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:55:54 vm05.local ceph-mon[130117]: pgmap v146: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 28 MiB/s rd, 204 B/s wr, 12 op/s 2026-03-10T07:55:54.657 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:55:54 vm05.local ceph-mon[130117]: Health check cleared: FS_DEGRADED (was: 1 filesystem is degraded) 2026-03-10T07:55:54.657 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:55:54 vm05.local ceph-mon[130117]: mds.? [v2:192.168.123.108:6826/2963085185,v1:192.168.123.108:6827/2963085185] up:active 2026-03-10T07:55:54.657 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:55:54 vm05.local ceph-mon[130117]: fsmap cephfs:1 {0=cephfs.vm08.dgsaon=up:active} 3 up:standby 2026-03-10T07:55:54.668 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:55:54 vm08.local ceph-mon[107898]: pgmap v146: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 28 MiB/s rd, 204 B/s wr, 12 op/s 2026-03-10T07:55:54.668 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:55:54 vm08.local ceph-mon[107898]: Health check cleared: FS_DEGRADED (was: 1 filesystem is degraded) 2026-03-10T07:55:54.668 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:55:54 vm08.local ceph-mon[107898]: mds.? 
[v2:192.168.123.108:6826/2963085185,v1:192.168.123.108:6827/2963085185] up:active 2026-03-10T07:55:54.668 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:55:54 vm08.local ceph-mon[107898]: fsmap cephfs:1 {0=cephfs.vm08.dgsaon=up:active} 3 up:standby 2026-03-10T07:55:56.900 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:55:56 vm08.local ceph-mon[107898]: pgmap v147: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 25 MiB/s rd, 204 B/s wr, 12 op/s 2026-03-10T07:55:56.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:55:56 vm05.local ceph-mon[130117]: pgmap v147: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 25 MiB/s rd, 204 B/s wr, 12 op/s 2026-03-10T07:55:58.069 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:55:57 vm08.local ceph-mon[107898]: pgmap v148: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 21 MiB/s rd, 4.4 KiB/s wr, 14 op/s 2026-03-10T07:55:58.069 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:55:57 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:55:58.069 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:55:57 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T07:55:58.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:55:57 vm05.local ceph-mon[130117]: pgmap v148: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 21 MiB/s rd, 4.4 KiB/s wr, 14 op/s 2026-03-10T07:55:58.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:55:57 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:55:58.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:55:57 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 
cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T07:55:58.722 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:58.720+0000 7f613f59e700 1 -- 192.168.123.105:0/602652607 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f6140105be0 msgr2=0x7f6140105fc0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:55:58.722 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:58.721+0000 7f613f59e700 1 --2- 192.168.123.105:0/602652607 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f6140105be0 0x7f6140105fc0 secure :-1 s=READY pgs=92 cs=0 l=1 rev1=1 crypto rx=0x7f6130009b00 tx=0x7f6130009e10 comp rx=0 tx=0).stop 2026-03-10T07:55:58.722 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:58.722+0000 7f613f59e700 1 -- 192.168.123.105:0/602652607 shutdown_connections 2026-03-10T07:55:58.722 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:58.722+0000 7f613f59e700 1 --2- 192.168.123.105:0/602652607 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f61400684d0 0x7f6140068950 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:55:58.722 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:58.722+0000 7f613f59e700 1 --2- 192.168.123.105:0/602652607 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f6140105be0 0x7f6140105fc0 unknown :-1 s=CLOSED pgs=92 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:55:58.722 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:58.722+0000 7f613f59e700 1 -- 192.168.123.105:0/602652607 >> 192.168.123.105:0/602652607 conn(0x7f61400756b0 msgr2=0x7f6140075ac0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:55:58.722 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:58.722+0000 7f613f59e700 1 -- 192.168.123.105:0/602652607 shutdown_connections 2026-03-10T07:55:58.722 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:58.722+0000 7f613f59e700 1 -- 192.168.123.105:0/602652607 wait complete. 2026-03-10T07:55:58.723 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:58.723+0000 7f613f59e700 1 Processor -- start 2026-03-10T07:55:58.724 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:58.723+0000 7f613f59e700 1 -- start start 2026-03-10T07:55:58.724 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:58.724+0000 7f613f59e700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f61400684d0 0x7f614019c410 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:55:58.724 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:58.724+0000 7f613e59c700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f61400684d0 0x7f614019c410 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:55:58.725 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:58.724+0000 7f613e59c700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f61400684d0 0x7f614019c410 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.105:38890/0 (socket says 192.168.123.105:38890) 2026-03-10T07:55:58.725 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:58.724+0000 7f613e59c700 1 -- 192.168.123.105:0/2036565922 learned_addr learned my addr 192.168.123.105:0/2036565922 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T07:55:58.725 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:58.725+0000 7f613f59e700 1 --2- 192.168.123.105:0/2036565922 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f6140105be0 0x7f614019c950 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 
2026-03-10T07:55:58.725 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:58.725+0000 7f613f59e700 1 -- 192.168.123.105:0/2036565922 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f614019d030 con 0x7f61400684d0 2026-03-10T07:55:58.725 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:58.725+0000 7f613dd9b700 1 --2- 192.168.123.105:0/2036565922 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f6140105be0 0x7f614019c950 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:55:58.725 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:58.725+0000 7f613f59e700 1 -- 192.168.123.105:0/2036565922 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f6140196700 con 0x7f6140105be0 2026-03-10T07:55:58.726 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:58.725+0000 7f613e59c700 1 -- 192.168.123.105:0/2036565922 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f6140105be0 msgr2=0x7f614019c950 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:55:58.726 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:58.725+0000 7f613e59c700 1 --2- 192.168.123.105:0/2036565922 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f6140105be0 0x7f614019c950 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:55:58.726 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:58.725+0000 7f613e59c700 1 -- 192.168.123.105:0/2036565922 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f61300097e0 con 0x7f61400684d0 2026-03-10T07:55:58.726 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:58.726+0000 7f613e59c700 1 --2- 192.168.123.105:0/2036565922 >> 
[v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f61400684d0 0x7f614019c410 secure :-1 s=READY pgs=93 cs=0 l=1 rev1=1 crypto rx=0x7f6130005f50 tx=0x7f6130004c30 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:55:58.727 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:58.727+0000 7f612f7fe700 1 -- 192.168.123.105:0/2036565922 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f613001d070 con 0x7f61400684d0 2026-03-10T07:55:58.727 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:58.727+0000 7f612f7fe700 1 -- 192.168.123.105:0/2036565922 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7f613000bc50 con 0x7f61400684d0 2026-03-10T07:55:58.727 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:58.727+0000 7f612f7fe700 1 -- 192.168.123.105:0/2036565922 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f613000f890 con 0x7f61400684d0 2026-03-10T07:55:58.727 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:58.727+0000 7f613f59e700 1 -- 192.168.123.105:0/2036565922 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f6140196980 con 0x7f61400684d0 2026-03-10T07:55:58.728 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:58.727+0000 7f613f59e700 1 -- 192.168.123.105:0/2036565922 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f6140196e70 con 0x7f61400684d0 2026-03-10T07:55:58.728 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:58.728+0000 7f612d7fa700 1 -- 192.168.123.105:0/2036565922 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f614004f2e0 con 0x7f61400684d0 2026-03-10T07:55:58.732 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:58.732+0000 
7f612f7fe700 1 -- 192.168.123.105:0/2036565922 <== mon.0 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 38) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f6130022be0 con 0x7f61400684d0 2026-03-10T07:55:58.733 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:58.732+0000 7f612f7fe700 1 --2- 192.168.123.105:0/2036565922 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f612807bdf0 0x7f612807e2b0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:55:58.733 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:58.732+0000 7f612f7fe700 1 -- 192.168.123.105:0/2036565922 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(83..83 src has 1..83) v4 ==== 6394+0+0 (secure 0 0 0) 0x7f613009be90 con 0x7f61400684d0 2026-03-10T07:55:58.733 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:58.732+0000 7f612f7fe700 1 -- 192.168.123.105:0/2036565922 <== mon.0 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f613009c310 con 0x7f61400684d0 2026-03-10T07:55:58.733 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:58.733+0000 7f613dd9b700 1 --2- 192.168.123.105:0/2036565922 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f612807bdf0 0x7f612807e2b0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:55:58.736 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:58.736+0000 7f613dd9b700 1 --2- 192.168.123.105:0/2036565922 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f612807bdf0 0x7f612807e2b0 secure :-1 s=READY pgs=71 cs=0 l=1 rev1=1 crypto rx=0x7f6140197ce0 tx=0x7f613400b410 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:55:58.864 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:58.863+0000 7f612d7fa700 1 -- 192.168.123.105:0/2036565922 --> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7f6140197150 con 0x7f612807bdf0 2026-03-10T07:55:58.864 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:58.864+0000 7f612f7fe700 1 -- 192.168.123.105:0/2036565922 <== mgr.34104 v2:192.168.123.105:6800/627686298 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+416 (secure 0 0 0) 0x7f6140197150 con 0x7f612807bdf0 2026-03-10T07:55:58.871 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:58.871+0000 7f613f59e700 1 -- 192.168.123.105:0/2036565922 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f612807bdf0 msgr2=0x7f612807e2b0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:55:58.871 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:58.871+0000 7f613f59e700 1 --2- 192.168.123.105:0/2036565922 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f612807bdf0 0x7f612807e2b0 secure :-1 s=READY pgs=71 cs=0 l=1 rev1=1 crypto rx=0x7f6140197ce0 tx=0x7f613400b410 comp rx=0 tx=0).stop 2026-03-10T07:55:58.871 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:58.871+0000 7f613f59e700 1 -- 192.168.123.105:0/2036565922 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f61400684d0 msgr2=0x7f614019c410 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:55:58.871 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:58.871+0000 7f613f59e700 1 --2- 192.168.123.105:0/2036565922 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f61400684d0 0x7f614019c410 secure :-1 s=READY pgs=93 cs=0 l=1 rev1=1 crypto rx=0x7f6130005f50 tx=0x7f6130004c30 comp rx=0 tx=0).stop 2026-03-10T07:55:58.871 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:58.871+0000 7f613f59e700 1 -- 192.168.123.105:0/2036565922 shutdown_connections 2026-03-10T07:55:58.871 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:58.871+0000 7f613f59e700 1 --2- 192.168.123.105:0/2036565922 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f612807bdf0 0x7f612807e2b0 secure :-1 s=CLOSED pgs=71 cs=0 l=1 rev1=1 crypto rx=0x7f6140197ce0 tx=0x7f613400b410 comp rx=0 tx=0).stop 2026-03-10T07:55:58.871 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:58.871+0000 7f613f59e700 1 --2- 192.168.123.105:0/2036565922 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f61400684d0 0x7f614019c410 unknown :-1 s=CLOSED pgs=93 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:55:58.871 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:58.871+0000 7f613f59e700 1 --2- 192.168.123.105:0/2036565922 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f6140105be0 0x7f614019c950 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:55:58.871 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:58.871+0000 7f613f59e700 1 -- 192.168.123.105:0/2036565922 >> 192.168.123.105:0/2036565922 conn(0x7f61400756b0 msgr2=0x7f61400fdc30 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:55:58.872 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:58.872+0000 7f613f59e700 1 -- 192.168.123.105:0/2036565922 shutdown_connections 2026-03-10T07:55:58.872 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:58.872+0000 7f613f59e700 1 -- 192.168.123.105:0/2036565922 wait complete. 
2026-03-10T07:55:58.882 INFO:teuthology.orchestra.run.vm05.stdout:true 2026-03-10T07:55:58.940 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:58.939+0000 7ff3bef83700 1 -- 192.168.123.105:0/1131001617 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7ff3b80731c0 msgr2=0x7ff3b80735a0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:55:58.940 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:58.939+0000 7ff3bef83700 1 --2- 192.168.123.105:0/1131001617 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7ff3b80731c0 0x7ff3b80735a0 secure :-1 s=READY pgs=52 cs=0 l=1 rev1=1 crypto rx=0x7ff3a8009b30 tx=0x7ff3a8009e40 comp rx=0 tx=0).stop 2026-03-10T07:55:58.940 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:58.939+0000 7ff3bef83700 1 -- 192.168.123.105:0/1131001617 shutdown_connections 2026-03-10T07:55:58.940 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:58.939+0000 7ff3bef83700 1 --2- 192.168.123.105:0/1131001617 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff3b8073ae0 0x7ff3b810d190 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:55:58.940 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:58.939+0000 7ff3bef83700 1 --2- 192.168.123.105:0/1131001617 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7ff3b80731c0 0x7ff3b80735a0 unknown :-1 s=CLOSED pgs=52 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:55:58.940 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:58.939+0000 7ff3bef83700 1 -- 192.168.123.105:0/1131001617 >> 192.168.123.105:0/1131001617 conn(0x7ff3b80fc920 msgr2=0x7ff3b80fed40 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:55:58.940 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:58.940+0000 7ff3bef83700 1 -- 192.168.123.105:0/1131001617 shutdown_connections 2026-03-10T07:55:58.940 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:58.940+0000 7ff3bef83700 1 -- 192.168.123.105:0/1131001617 wait complete. 2026-03-10T07:55:58.940 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:58.940+0000 7ff3bef83700 1 Processor -- start 2026-03-10T07:55:58.940 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:58.940+0000 7ff3bef83700 1 -- start start 2026-03-10T07:55:58.941 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:58.940+0000 7ff3bef83700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff3b80731c0 0x7ff3b8198da0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:55:58.941 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:58.941+0000 7ff3bcd1f700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff3b80731c0 0x7ff3b8198da0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:55:58.941 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:58.941+0000 7ff3bcd1f700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff3b80731c0 0x7ff3b8198da0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.105:36756/0 (socket says 192.168.123.105:36756) 2026-03-10T07:55:58.941 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:58.941+0000 7ff3bef83700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7ff3b8073ae0 0x7ff3b81992e0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:55:58.941 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:58.941+0000 7ff3bef83700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7ff3b81999c0 con 0x7ff3b80731c0 2026-03-10T07:55:58.941 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:58.941+0000 7ff3bef83700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7ff3b819d750 con 0x7ff3b8073ae0 2026-03-10T07:55:58.941 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:58.941+0000 7ff3bcd1f700 1 -- 192.168.123.105:0/3613692361 learned_addr learned my addr 192.168.123.105:0/3613692361 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T07:55:58.941 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:58.941+0000 7ff3bcd1f700 1 -- 192.168.123.105:0/3613692361 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7ff3b8073ae0 msgr2=0x7ff3b81992e0 unknown :-1 s=STATE_CONNECTING_RE l=1).mark_down 2026-03-10T07:55:58.941 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:58.941+0000 7ff3bcd1f700 1 --2- 192.168.123.105:0/3613692361 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7ff3b8073ae0 0x7ff3b81992e0 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:55:58.941 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:58.941+0000 7ff3bcd1f700 1 -- 192.168.123.105:0/3613692361 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7ff3ac009710 con 0x7ff3b80731c0 2026-03-10T07:55:58.941 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:58.941+0000 7ff3bcd1f700 1 --2- 192.168.123.105:0/3613692361 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff3b80731c0 0x7ff3b8198da0 secure :-1 s=READY pgs=94 cs=0 l=1 rev1=1 crypto rx=0x7ff3a800bda0 tx=0x7ff3a800be80 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:55:58.942 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:58.941+0000 7ff3b5ffb700 1 -- 192.168.123.105:0/3613692361 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7ff3a801d070 
con 0x7ff3b80731c0 2026-03-10T07:55:58.942 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:58.941+0000 7ff3bef83700 1 -- 192.168.123.105:0/3613692361 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7ff3a80097e0 con 0x7ff3b80731c0 2026-03-10T07:55:58.942 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:58.942+0000 7ff3bef83700 1 -- 192.168.123.105:0/3613692361 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7ff3b819dd30 con 0x7ff3b80731c0 2026-03-10T07:55:58.943 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:58.942+0000 7ff3b5ffb700 1 -- 192.168.123.105:0/3613692361 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7ff3a800f460 con 0x7ff3b80731c0 2026-03-10T07:55:58.943 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:58.942+0000 7ff3b5ffb700 1 -- 192.168.123.105:0/3613692361 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7ff3a8021600 con 0x7ff3b80731c0 2026-03-10T07:55:58.943 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:58.943+0000 7ff3b5ffb700 1 -- 192.168.123.105:0/3613692361 <== mon.0 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 38) v1 ==== 100115+0+0 (secure 0 0 0) 0x7ff3a802b430 con 0x7ff3b80731c0 2026-03-10T07:55:58.943 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:58.943+0000 7ff3b5ffb700 1 --2- 192.168.123.105:0/3613692361 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7ff3a00778c0 0x7ff3a0079d80 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:55:58.944 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:58.943+0000 7ff3b7fff700 1 --2- 192.168.123.105:0/3613692361 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7ff3a00778c0 0x7ff3a0079d80 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 
crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:55:58.944 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:58.944+0000 7ff3b5ffb700 1 -- 192.168.123.105:0/3613692361 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(83..83 src has 1..83) v4 ==== 6394+0+0 (secure 0 0 0) 0x7ff3a809ae10 con 0x7ff3b80731c0 2026-03-10T07:55:58.944 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:58.944+0000 7ff3b7fff700 1 --2- 192.168.123.105:0/3613692361 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7ff3a00778c0 0x7ff3a0079d80 secure :-1 s=READY pgs=72 cs=0 l=1 rev1=1 crypto rx=0x7ff3b819a3c0 tx=0x7ff3ac009450 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:55:58.945 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:58.944+0000 7ff3bef83700 1 -- 192.168.123.105:0/3613692361 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7ff3a4005320 con 0x7ff3b80731c0 2026-03-10T07:55:58.947 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:58.947+0000 7ff3b5ffb700 1 -- 192.168.123.105:0/3613692361 <== mon.0 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7ff3a8064630 con 0x7ff3b80731c0 2026-03-10T07:55:59.076 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:59.076+0000 7ff3bef83700 1 -- 192.168.123.105:0/3613692361 --> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7ff3a4000bf0 con 0x7ff3a00778c0 2026-03-10T07:55:59.078 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:59.077+0000 7ff3b5ffb700 1 -- 192.168.123.105:0/3613692361 <== mgr.34104 v2:192.168.123.105:6800/627686298 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+416 (secure 0 
0 0) 0x7ff3a4000bf0 con 0x7ff3a00778c0 2026-03-10T07:55:59.080 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:59.079+0000 7ff3bef83700 1 -- 192.168.123.105:0/3613692361 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7ff3a00778c0 msgr2=0x7ff3a0079d80 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:55:59.080 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:59.079+0000 7ff3bef83700 1 --2- 192.168.123.105:0/3613692361 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7ff3a00778c0 0x7ff3a0079d80 secure :-1 s=READY pgs=72 cs=0 l=1 rev1=1 crypto rx=0x7ff3b819a3c0 tx=0x7ff3ac009450 comp rx=0 tx=0).stop 2026-03-10T07:55:59.080 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:59.079+0000 7ff3bef83700 1 -- 192.168.123.105:0/3613692361 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff3b80731c0 msgr2=0x7ff3b8198da0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:55:59.080 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:59.079+0000 7ff3bef83700 1 --2- 192.168.123.105:0/3613692361 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff3b80731c0 0x7ff3b8198da0 secure :-1 s=READY pgs=94 cs=0 l=1 rev1=1 crypto rx=0x7ff3a800bda0 tx=0x7ff3a800be80 comp rx=0 tx=0).stop 2026-03-10T07:55:59.080 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:59.080+0000 7ff3bef83700 1 -- 192.168.123.105:0/3613692361 shutdown_connections 2026-03-10T07:55:59.080 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:59.080+0000 7ff3bef83700 1 --2- 192.168.123.105:0/3613692361 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7ff3a00778c0 0x7ff3a0079d80 unknown :-1 s=CLOSED pgs=72 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:55:59.080 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:59.080+0000 7ff3bef83700 1 --2- 192.168.123.105:0/3613692361 >> 
[v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff3b80731c0 0x7ff3b8198da0 unknown :-1 s=CLOSED pgs=94 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:55:59.080 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:59.080+0000 7ff3bef83700 1 --2- 192.168.123.105:0/3613692361 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7ff3b8073ae0 0x7ff3b81992e0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:55:59.080 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:59.080+0000 7ff3bef83700 1 -- 192.168.123.105:0/3613692361 >> 192.168.123.105:0/3613692361 conn(0x7ff3b80fc920 msgr2=0x7ff3b81079d0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:55:59.081 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:59.080+0000 7ff3bef83700 1 -- 192.168.123.105:0/3613692361 shutdown_connections 2026-03-10T07:55:59.081 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:59.080+0000 7ff3bef83700 1 -- 192.168.123.105:0/3613692361 wait complete. 
2026-03-10T07:55:59.142 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:59.141+0000 7f327f1a6700 1 -- 192.168.123.105:0/1001698775 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f3278103cf0 msgr2=0x7f3278107d40 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:55:59.142 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:59.141+0000 7f327f1a6700 1 --2- 192.168.123.105:0/1001698775 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f3278103cf0 0x7f3278107d40 secure :-1 s=READY pgs=53 cs=0 l=1 rev1=1 crypto rx=0x7f3268009a60 tx=0x7f3268009d70 comp rx=0 tx=0).stop 2026-03-10T07:55:59.142 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:59.142+0000 7f327f1a6700 1 -- 192.168.123.105:0/1001698775 shutdown_connections 2026-03-10T07:55:59.142 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:59.142+0000 7f327f1a6700 1 --2- 192.168.123.105:0/1001698775 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f3278103cf0 0x7f3278107d40 unknown :-1 s=CLOSED pgs=53 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:55:59.142 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:59.142+0000 7f327f1a6700 1 --2- 192.168.123.105:0/1001698775 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3278103340 0x7f3278103720 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:55:59.142 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:59.142+0000 7f327f1a6700 1 -- 192.168.123.105:0/1001698775 >> 192.168.123.105:0/1001698775 conn(0x7f32780feb90 msgr2=0x7f3278100fb0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:55:59.142 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:59.142+0000 7f327f1a6700 1 -- 192.168.123.105:0/1001698775 shutdown_connections 2026-03-10T07:55:59.143 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:59.142+0000 7f327f1a6700 1 -- 192.168.123.105:0/1001698775 
wait complete. 2026-03-10T07:55:59.143 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:59.143+0000 7f327f1a6700 1 Processor -- start 2026-03-10T07:55:59.143 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:59.143+0000 7f327f1a6700 1 -- start start 2026-03-10T07:55:59.143 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:59.143+0000 7f327f1a6700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3278103340 0x7f3278198de0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:55:59.143 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:59.143+0000 7f327f1a6700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f3278103cf0 0x7f3278199320 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:55:59.144 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:59.143+0000 7f327f1a6700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f3278199a00 con 0x7f3278103340 2026-03-10T07:55:59.144 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:59.143+0000 7f327f1a6700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f327819d790 con 0x7f3278103cf0 2026-03-10T07:55:59.144 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:59.143+0000 7f327cf42700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3278103340 0x7f3278198de0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:55:59.144 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:59.143+0000 7f327cf42700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3278103340 0x7f3278198de0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 
says I am v2:192.168.123.105:36782/0 (socket says 192.168.123.105:36782) 2026-03-10T07:55:59.144 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:59.143+0000 7f327cf42700 1 -- 192.168.123.105:0/1725236624 learned_addr learned my addr 192.168.123.105:0/1725236624 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T07:55:59.144 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:59.143+0000 7f326ffff700 1 --2- 192.168.123.105:0/1725236624 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f3278103cf0 0x7f3278199320 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:55:59.144 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:59.143+0000 7f327cf42700 1 -- 192.168.123.105:0/1725236624 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f3278103cf0 msgr2=0x7f3278199320 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:55:59.144 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:59.143+0000 7f327cf42700 1 --2- 192.168.123.105:0/1725236624 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f3278103cf0 0x7f3278199320 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:55:59.144 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:59.144+0000 7f327cf42700 1 -- 192.168.123.105:0/1725236624 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f32740097e0 con 0x7f3278103340 2026-03-10T07:55:59.145 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:59.144+0000 7f327cf42700 1 --2- 192.168.123.105:0/1725236624 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3278103340 0x7f3278198de0 secure :-1 s=READY pgs=95 cs=0 l=1 rev1=1 crypto rx=0x7f327400efd0 tx=0x7f327400c5b0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 
2026-03-10T07:55:59.145 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:59.145+0000 7f326dffb700 1 -- 192.168.123.105:0/1725236624 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f3274009c00 con 0x7f3278103340 2026-03-10T07:55:59.146 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:59.145+0000 7f327f1a6700 1 -- 192.168.123.105:0/1725236624 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f3268009710 con 0x7f3278103340 2026-03-10T07:55:59.146 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:59.145+0000 7f327f1a6700 1 -- 192.168.123.105:0/1725236624 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f327819dd70 con 0x7f3278103340 2026-03-10T07:55:59.146 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:59.145+0000 7f326dffb700 1 -- 192.168.123.105:0/1725236624 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7f3274004500 con 0x7f3278103340 2026-03-10T07:55:59.146 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:59.145+0000 7f326dffb700 1 -- 192.168.123.105:0/1725236624 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f3274010710 con 0x7f3278103340 2026-03-10T07:55:59.146 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:59.146+0000 7f326dffb700 1 -- 192.168.123.105:0/1725236624 <== mon.0 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 38) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f3274004000 con 0x7f3278103340 2026-03-10T07:55:59.146 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:59.146+0000 7f326dffb700 1 --2- 192.168.123.105:0/1725236624 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f3260077990 0x7f3260079e50 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:55:59.147 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:59.146+0000 7f326ffff700 1 --2- 192.168.123.105:0/1725236624 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f3260077990 0x7f3260079e50 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:55:59.149 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:59.146+0000 7f326dffb700 1 -- 192.168.123.105:0/1725236624 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(83..83 src has 1..83) v4 ==== 6394+0+0 (secure 0 0 0) 0x7f3274014070 con 0x7f3278103340 2026-03-10T07:55:59.149 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:59.146+0000 7f326ffff700 1 --2- 192.168.123.105:0/1725236624 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f3260077990 0x7f3260079e50 secure :-1 s=READY pgs=73 cs=0 l=1 rev1=1 crypto rx=0x7f327819a400 tx=0x7f326800b580 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:55:59.149 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:59.146+0000 7f327f1a6700 1 -- 192.168.123.105:0/1725236624 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f3264005320 con 0x7f3278103340 2026-03-10T07:55:59.149 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:59.149+0000 7f326dffb700 1 -- 192.168.123.105:0/1725236624 <== mon.0 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f3274062910 con 0x7f3278103340 2026-03-10T07:55:59.264 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:59.264+0000 7f327f1a6700 1 -- 192.168.123.105:0/1725236624 --> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] -- mgr_command(tid 0: {"prefix": "orch ps", "target": ["mon-mgr", ""]}) v1 -- 0x7f3264000bf0 
con 0x7f3260077990 2026-03-10T07:55:59.269 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:59.269+0000 7f326dffb700 1 -- 192.168.123.105:0/1725236624 <== mgr.34104 v2:192.168.123.105:6800/627686298 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+3552 (secure 0 0 0) 0x7f3264000bf0 con 0x7f3260077990 2026-03-10T07:55:59.269 INFO:teuthology.orchestra.run.vm05.stdout:NAME HOST PORTS STATUS REFRESHED AGE MEM USE MEM LIM VERSION IMAGE ID CONTAINER ID 2026-03-10T07:55:59.269 INFO:teuthology.orchestra.run.vm05.stdout:alertmanager.vm05 vm05 *:9093,9094 running (4m) 16s ago 9m 23.4M - 0.25.0 c8568f914cd2 ac15d5f35994 2026-03-10T07:55:59.269 INFO:teuthology.orchestra.run.vm05.stdout:ceph-exporter.vm05 vm05 running (9m) 16s ago 9m 10.2M - 18.2.1 5be31c24972a 26c4db858175 2026-03-10T07:55:59.269 INFO:teuthology.orchestra.run.vm05.stdout:ceph-exporter.vm08 vm08 running (9m) 8s ago 9m 12.0M - 18.2.1 5be31c24972a 209e2398a09c 2026-03-10T07:55:59.269 INFO:teuthology.orchestra.run.vm05.stdout:crash.vm05 vm05 running (3m) 16s ago 9m 7847k - 19.2.3-678-ge911bdeb 654f31e6858e daa831c74cf4 2026-03-10T07:55:59.269 INFO:teuthology.orchestra.run.vm05.stdout:crash.vm08 vm08 running (3m) 8s ago 9m 8262k - 19.2.3-678-ge911bdeb 654f31e6858e 668ac55c9722 2026-03-10T07:55:59.269 INFO:teuthology.orchestra.run.vm05.stdout:grafana.vm05 vm05 *:3000 running (4m) 16s ago 9m 85.0M - 10.4.0 c8b91775d855 6acb529ad951 2026-03-10T07:55:59.269 INFO:teuthology.orchestra.run.vm05.stdout:mds.cephfs.vm05.omfhnh vm05 running (25s) 16s ago 7m 17.1M - 19.2.3-678-ge911bdeb 654f31e6858e fd612446ffa4 2026-03-10T07:55:59.269 INFO:teuthology.orchestra.run.vm05.stdout:mds.cephfs.vm05.pavqil vm05 running (17s) 16s ago 7m 12.7M - 19.2.3-678-ge911bdeb 654f31e6858e f8155f8c8aff 2026-03-10T07:55:59.269 INFO:teuthology.orchestra.run.vm05.stdout:mds.cephfs.vm08.dgsaon vm08 running (7m) 8s ago 7m 264M - 18.2.1 5be31c24972a 1696aee522b5 2026-03-10T07:55:59.269 
INFO:teuthology.orchestra.run.vm05.stdout:mds.cephfs.vm08.ybmbgd vm08 running (10s) 8s ago 7m 19.0M - 19.2.3-678-ge911bdeb 654f31e6858e 24039c2c65e7 2026-03-10T07:55:59.269 INFO:teuthology.orchestra.run.vm05.stdout:mgr.vm05.blexke vm05 *:8443,9283,8765 running (5m) 16s ago 10m 591M - 19.2.3-678-ge911bdeb 654f31e6858e 5467b6a3bd38 2026-03-10T07:55:59.269 INFO:teuthology.orchestra.run.vm05.stdout:mgr.vm08.orfpog vm08 *:8443,9283,8765 running (5m) 8s ago 9m 536M - 19.2.3-678-ge911bdeb 654f31e6858e 7a15d7811d5b 2026-03-10T07:55:59.269 INFO:teuthology.orchestra.run.vm05.stdout:mon.vm05 vm05 running (3m) 16s ago 10m 64.2M 2048M 19.2.3-678-ge911bdeb 654f31e6858e f02f076bb820 2026-03-10T07:55:59.269 INFO:teuthology.orchestra.run.vm05.stdout:mon.vm08 vm08 running (3m) 8s ago 9m 54.6M 2048M 19.2.3-678-ge911bdeb 654f31e6858e 73d9a504f360 2026-03-10T07:55:59.269 INFO:teuthology.orchestra.run.vm05.stdout:node-exporter.vm05 vm05 *:9100 running (5m) 16s ago 9m 10.5M - 1.7.0 72c9c2088986 7cd0b23b4118 2026-03-10T07:55:59.269 INFO:teuthology.orchestra.run.vm05.stdout:node-exporter.vm08 vm08 *:9100 running (5m) 8s ago 9m 10.1M - 1.7.0 72c9c2088986 3dd4d91d5881 2026-03-10T07:55:59.269 INFO:teuthology.orchestra.run.vm05.stdout:osd.0 vm05 running (2m) 16s ago 8m 205M 4096M 19.2.3-678-ge911bdeb 654f31e6858e b35fccc2a4d5 2026-03-10T07:55:59.269 INFO:teuthology.orchestra.run.vm05.stdout:osd.1 vm05 running (2m) 16s ago 8m 107M 4096M 19.2.3-678-ge911bdeb 654f31e6858e e3fe4ad5c6a3 2026-03-10T07:55:59.269 INFO:teuthology.orchestra.run.vm05.stdout:osd.2 vm05 running (99s) 16s ago 8m 111M 4096M 19.2.3-678-ge911bdeb 654f31e6858e 108a77e324b8 2026-03-10T07:55:59.270 INFO:teuthology.orchestra.run.vm05.stdout:osd.3 vm08 running (78s) 8s ago 8m 149M 4096M 19.2.3-678-ge911bdeb 654f31e6858e 3f280bcfe0f5 2026-03-10T07:55:59.270 INFO:teuthology.orchestra.run.vm05.stdout:osd.4 vm08 running (57s) 8s ago 8m 135M 4096M 19.2.3-678-ge911bdeb 654f31e6858e 132c8d288b1e 2026-03-10T07:55:59.270 
INFO:teuthology.orchestra.run.vm05.stdout:osd.5 vm08 running (36s) 8s ago 7m 101M 4096M 19.2.3-678-ge911bdeb 654f31e6858e a4a8929822a2 2026-03-10T07:55:59.270 INFO:teuthology.orchestra.run.vm05.stdout:prometheus.vm05 vm05 *:9095 running (4m) 16s ago 9m 52.1M - 2.51.0 1d3b7f56885b c59a6be07563 2026-03-10T07:55:59.271 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:59.271+0000 7f327f1a6700 1 -- 192.168.123.105:0/1725236624 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f3260077990 msgr2=0x7f3260079e50 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:55:59.271 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:59.271+0000 7f327f1a6700 1 --2- 192.168.123.105:0/1725236624 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f3260077990 0x7f3260079e50 secure :-1 s=READY pgs=73 cs=0 l=1 rev1=1 crypto rx=0x7f327819a400 tx=0x7f326800b580 comp rx=0 tx=0).stop 2026-03-10T07:55:59.272 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:59.271+0000 7f327f1a6700 1 -- 192.168.123.105:0/1725236624 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3278103340 msgr2=0x7f3278198de0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:55:59.272 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:59.271+0000 7f327f1a6700 1 --2- 192.168.123.105:0/1725236624 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3278103340 0x7f3278198de0 secure :-1 s=READY pgs=95 cs=0 l=1 rev1=1 crypto rx=0x7f327400efd0 tx=0x7f327400c5b0 comp rx=0 tx=0).stop 2026-03-10T07:55:59.272 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:59.272+0000 7f327f1a6700 1 -- 192.168.123.105:0/1725236624 shutdown_connections 2026-03-10T07:55:59.272 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:59.272+0000 7f327f1a6700 1 --2- 192.168.123.105:0/1725236624 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] 
conn(0x7f3260077990 0x7f3260079e50 unknown :-1 s=CLOSED pgs=73 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:55:59.272 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:59.272+0000 7f327f1a6700 1 --2- 192.168.123.105:0/1725236624 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3278103340 0x7f3278198de0 unknown :-1 s=CLOSED pgs=95 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:55:59.272 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:59.272+0000 7f327f1a6700 1 --2- 192.168.123.105:0/1725236624 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f3278103cf0 0x7f3278199320 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:55:59.272 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:59.272+0000 7f327f1a6700 1 -- 192.168.123.105:0/1725236624 >> 192.168.123.105:0/1725236624 conn(0x7f32780feb90 msgr2=0x7f3278100150 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:55:59.272 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:59.272+0000 7f327f1a6700 1 -- 192.168.123.105:0/1725236624 shutdown_connections 2026-03-10T07:55:59.272 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:59.272+0000 7f327f1a6700 1 -- 192.168.123.105:0/1725236624 wait complete. 
2026-03-10T07:55:59.333 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:59.332+0000 7f55d352e700 1 -- 192.168.123.105:0/1036176966 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f55cc0690e0 msgr2=0x7f55cc105b50 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:55:59.333 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:59.332+0000 7f55d352e700 1 --2- 192.168.123.105:0/1036176966 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f55cc0690e0 0x7f55cc105b50 secure :-1 s=READY pgs=54 cs=0 l=1 rev1=1 crypto rx=0x7f55c8009a60 tx=0x7f55c8009d70 comp rx=0 tx=0).stop 2026-03-10T07:55:59.335 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:59.334+0000 7f55d352e700 1 -- 192.168.123.105:0/1036176966 shutdown_connections 2026-03-10T07:55:59.335 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:59.334+0000 7f55d352e700 1 --2- 192.168.123.105:0/1036176966 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f55cc0690e0 0x7f55cc105b50 unknown :-1 s=CLOSED pgs=54 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:55:59.335 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:59.334+0000 7f55d352e700 1 --2- 192.168.123.105:0/1036176966 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f55cc068730 0x7f55cc068b10 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:55:59.335 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:59.334+0000 7f55d352e700 1 -- 192.168.123.105:0/1036176966 >> 192.168.123.105:0/1036176966 conn(0x7f55cc075960 msgr2=0x7f55cc075d70 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:55:59.336 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:59.336+0000 7f55d352e700 1 -- 192.168.123.105:0/1036176966 shutdown_connections 2026-03-10T07:55:59.336 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:59.336+0000 7f55d352e700 1 -- 192.168.123.105:0/1036176966 
wait complete. 2026-03-10T07:55:59.336 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:59.336+0000 7f55d352e700 1 Processor -- start 2026-03-10T07:55:59.336 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:59.336+0000 7f55d352e700 1 -- start start 2026-03-10T07:55:59.336 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:59.336+0000 7f55d352e700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f55cc068730 0x7f55cc199880 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:55:59.337 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:59.336+0000 7f55d352e700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f55cc0690e0 0x7f55cc199dc0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:55:59.337 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:59.336+0000 7f55d352e700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f55cc19f5a0 con 0x7f55cc068730 2026-03-10T07:55:59.337 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:59.336+0000 7f55d352e700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f55cc19f710 con 0x7f55cc0690e0 2026-03-10T07:55:59.337 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:59.337+0000 7f55d12ca700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f55cc068730 0x7f55cc199880 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:55:59.337 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:59.337+0000 7f55d12ca700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f55cc068730 0x7f55cc199880 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 
says I am v2:192.168.123.105:36800/0 (socket says 192.168.123.105:36800) 2026-03-10T07:55:59.337 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:59.337+0000 7f55d12ca700 1 -- 192.168.123.105:0/4104747827 learned_addr learned my addr 192.168.123.105:0/4104747827 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T07:55:59.337 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:59.337+0000 7f55d0ac9700 1 --2- 192.168.123.105:0/4104747827 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f55cc0690e0 0x7f55cc199dc0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:55:59.337 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:59.337+0000 7f55d0ac9700 1 -- 192.168.123.105:0/4104747827 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f55cc068730 msgr2=0x7f55cc199880 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:55:59.337 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:59.337+0000 7f55d0ac9700 1 --2- 192.168.123.105:0/4104747827 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f55cc068730 0x7f55cc199880 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:55:59.337 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:59.337+0000 7f55d0ac9700 1 -- 192.168.123.105:0/4104747827 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f55bc0097e0 con 0x7f55cc0690e0 2026-03-10T07:55:59.337 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:59.337+0000 7f55d12ca700 1 --2- 192.168.123.105:0/4104747827 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f55cc068730 0x7f55cc199880 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_auth_reply_more state changed! 
2026-03-10T07:55:59.338 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:59.337+0000 7f55d0ac9700 1 --2- 192.168.123.105:0/4104747827 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f55cc0690e0 0x7f55cc199dc0 secure :-1 s=READY pgs=55 cs=0 l=1 rev1=1 crypto rx=0x7f55c8000c00 tx=0x7f55c800f740 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:55:59.338 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:59.338+0000 7f55c27fc700 1 -- 192.168.123.105:0/4104747827 <== mon.1 v2:192.168.123.108:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f55c801d070 con 0x7f55cc0690e0 2026-03-10T07:55:59.338 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:59.338+0000 7f55d352e700 1 -- 192.168.123.105:0/4104747827 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f55c8009710 con 0x7f55cc0690e0 2026-03-10T07:55:59.338 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:59.338+0000 7f55d352e700 1 -- 192.168.123.105:0/4104747827 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f55cc19fc40 con 0x7f55cc0690e0 2026-03-10T07:55:59.338 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:59.338+0000 7f55c27fc700 1 -- 192.168.123.105:0/4104747827 <== mon.1 v2:192.168.123.108:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7f55c80037e0 con 0x7f55cc0690e0 2026-03-10T07:55:59.338 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:59.338+0000 7f55c27fc700 1 -- 192.168.123.105:0/4104747827 <== mon.1 v2:192.168.123.108:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f55c8017720 con 0x7f55cc0690e0 2026-03-10T07:55:59.339 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:59.339+0000 7f55c27fc700 1 -- 192.168.123.105:0/4104747827 <== mon.1 v2:192.168.123.108:3300/0 4 ==== mgrmap(e 38) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f55c8021af0 con 
0x7f55cc0690e0 2026-03-10T07:55:59.339 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:59.339+0000 7f55c27fc700 1 --2- 192.168.123.105:0/4104747827 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f55b80778c0 0x7f55b8079d80 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:55:59.340 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:59.339+0000 7f55d12ca700 1 --2- 192.168.123.105:0/4104747827 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f55b80778c0 0x7f55b8079d80 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:55:59.340 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:59.339+0000 7f55c27fc700 1 -- 192.168.123.105:0/4104747827 <== mon.1 v2:192.168.123.108:3300/0 5 ==== osd_map(83..83 src has 1..83) v4 ==== 6394+0+0 (secure 0 0 0) 0x7f55c809ac90 con 0x7f55cc0690e0 2026-03-10T07:55:59.340 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:59.340+0000 7f55b7fff700 1 -- 192.168.123.105:0/4104747827 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f55ac0052f0 con 0x7f55cc0690e0 2026-03-10T07:55:59.340 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:59.340+0000 7f55d12ca700 1 --2- 192.168.123.105:0/4104747827 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f55b80778c0 0x7f55b8079d80 secure :-1 s=READY pgs=74 cs=0 l=1 rev1=1 crypto rx=0x7f55bc009fd0 tx=0x7f55bc009500 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:55:59.343 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:59.343+0000 7f55c27fc700 1 -- 192.168.123.105:0/4104747827 <== mon.1 v2:192.168.123.108:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 
==== 72+0+195034 (secure 0 0 0) 0x7f55c8063470 con 0x7f55cc0690e0 2026-03-10T07:55:59.498 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:59.498+0000 7f55b7fff700 1 -- 192.168.123.105:0/4104747827 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "versions"} v 0) v1 -- 0x7f55ac0061d0 con 0x7f55cc0690e0 2026-03-10T07:55:59.498 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:59.498+0000 7f55c27fc700 1 -- 192.168.123.105:0/4104747827 <== mon.1 v2:192.168.123.108:3300/0 7 ==== mon_command_ack([{"prefix": "versions"}]=0 v0) v1 ==== 56+0+815 (secure 0 0 0) 0x7f55c8026070 con 0x7f55cc0690e0 2026-03-10T07:55:59.499 INFO:teuthology.orchestra.run.vm05.stdout:{ 2026-03-10T07:55:59.499 INFO:teuthology.orchestra.run.vm05.stdout: "mon": { 2026-03-10T07:55:59.499 INFO:teuthology.orchestra.run.vm05.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 2 2026-03-10T07:55:59.499 INFO:teuthology.orchestra.run.vm05.stdout: }, 2026-03-10T07:55:59.499 INFO:teuthology.orchestra.run.vm05.stdout: "mgr": { 2026-03-10T07:55:59.499 INFO:teuthology.orchestra.run.vm05.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 2 2026-03-10T07:55:59.499 INFO:teuthology.orchestra.run.vm05.stdout: }, 2026-03-10T07:55:59.499 INFO:teuthology.orchestra.run.vm05.stdout: "osd": { 2026-03-10T07:55:59.499 INFO:teuthology.orchestra.run.vm05.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 6 2026-03-10T07:55:59.499 INFO:teuthology.orchestra.run.vm05.stdout: }, 2026-03-10T07:55:59.499 INFO:teuthology.orchestra.run.vm05.stdout: "mds": { 2026-03-10T07:55:59.499 INFO:teuthology.orchestra.run.vm05.stdout: "ceph version 18.2.1 (7fe91d5d5842e04be3b4f514d6dd990c54b29c76) reef (stable)": 1, 2026-03-10T07:55:59.499 INFO:teuthology.orchestra.run.vm05.stdout: "ceph version 19.2.3-678-ge911bdeb 
(e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 3 2026-03-10T07:55:59.499 INFO:teuthology.orchestra.run.vm05.stdout: }, 2026-03-10T07:55:59.499 INFO:teuthology.orchestra.run.vm05.stdout: "overall": { 2026-03-10T07:55:59.500 INFO:teuthology.orchestra.run.vm05.stdout: "ceph version 18.2.1 (7fe91d5d5842e04be3b4f514d6dd990c54b29c76) reef (stable)": 1, 2026-03-10T07:55:59.500 INFO:teuthology.orchestra.run.vm05.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 13 2026-03-10T07:55:59.500 INFO:teuthology.orchestra.run.vm05.stdout: } 2026-03-10T07:55:59.500 INFO:teuthology.orchestra.run.vm05.stdout:} 2026-03-10T07:55:59.502 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:59.501+0000 7f55b7fff700 1 -- 192.168.123.105:0/4104747827 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f55b80778c0 msgr2=0x7f55b8079d80 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:55:59.502 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:59.501+0000 7f55b7fff700 1 --2- 192.168.123.105:0/4104747827 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f55b80778c0 0x7f55b8079d80 secure :-1 s=READY pgs=74 cs=0 l=1 rev1=1 crypto rx=0x7f55bc009fd0 tx=0x7f55bc009500 comp rx=0 tx=0).stop 2026-03-10T07:55:59.502 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:59.501+0000 7f55b7fff700 1 -- 192.168.123.105:0/4104747827 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f55cc0690e0 msgr2=0x7f55cc199dc0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:55:59.502 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:59.501+0000 7f55b7fff700 1 --2- 192.168.123.105:0/4104747827 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f55cc0690e0 0x7f55cc199dc0 secure :-1 s=READY pgs=55 cs=0 l=1 rev1=1 crypto rx=0x7f55c8000c00 tx=0x7f55c800f740 comp rx=0 tx=0).stop 2026-03-10T07:55:59.502 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:59.501+0000 7f55b7fff700 1 -- 192.168.123.105:0/4104747827 shutdown_connections 2026-03-10T07:55:59.502 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:59.501+0000 7f55b7fff700 1 --2- 192.168.123.105:0/4104747827 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f55b80778c0 0x7f55b8079d80 unknown :-1 s=CLOSED pgs=74 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:55:59.502 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:59.501+0000 7f55b7fff700 1 --2- 192.168.123.105:0/4104747827 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f55cc068730 0x7f55cc199880 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:55:59.502 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:59.501+0000 7f55b7fff700 1 --2- 192.168.123.105:0/4104747827 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f55cc0690e0 0x7f55cc199dc0 unknown :-1 s=CLOSED pgs=55 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:55:59.502 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:59.501+0000 7f55b7fff700 1 -- 192.168.123.105:0/4104747827 >> 192.168.123.105:0/4104747827 conn(0x7f55cc075960 msgr2=0x7f55cc1044a0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:55:59.502 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:59.502+0000 7f55b7fff700 1 -- 192.168.123.105:0/4104747827 shutdown_connections 2026-03-10T07:55:59.502 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:59.502+0000 7f55b7fff700 1 -- 192.168.123.105:0/4104747827 wait complete. 
2026-03-10T07:55:59.566 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:59.566+0000 7f2b8f13b700 1 -- 192.168.123.105:0/1563323329 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f2b88102a20 msgr2=0x7f2b8810af10 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:55:59.566 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:59.566+0000 7f2b8f13b700 1 --2- 192.168.123.105:0/1563323329 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f2b88102a20 0x7f2b8810af10 secure :-1 s=READY pgs=96 cs=0 l=1 rev1=1 crypto rx=0x7f2b7c009b00 tx=0x7f2b7c009e10 comp rx=0 tx=0).stop 2026-03-10T07:55:59.566 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:59.566+0000 7f2b8f13b700 1 -- 192.168.123.105:0/1563323329 shutdown_connections 2026-03-10T07:55:59.566 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:59.566+0000 7f2b8f13b700 1 --2- 192.168.123.105:0/1563323329 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f2b88102a20 0x7f2b8810af10 unknown :-1 s=CLOSED pgs=96 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:55:59.566 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:59.566+0000 7f2b8f13b700 1 --2- 192.168.123.105:0/1563323329 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f2b88102100 0x7f2b881024e0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:55:59.566 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:59.566+0000 7f2b8f13b700 1 -- 192.168.123.105:0/1563323329 >> 192.168.123.105:0/1563323329 conn(0x7f2b880fb830 msgr2=0x7f2b880fdc50 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:55:59.566 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:59.566+0000 7f2b8f13b700 1 -- 192.168.123.105:0/1563323329 shutdown_connections 2026-03-10T07:55:59.567 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:59.566+0000 7f2b8f13b700 1 -- 192.168.123.105:0/1563323329 
wait complete. 2026-03-10T07:55:59.567 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:59.567+0000 7f2b8f13b700 1 Processor -- start 2026-03-10T07:55:59.567 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:59.567+0000 7f2b8f13b700 1 -- start start 2026-03-10T07:55:59.567 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:59.567+0000 7f2b8f13b700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f2b88102100 0x7f2b881949a0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:55:59.567 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:59.567+0000 7f2b8f13b700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f2b88102a20 0x7f2b88194ee0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:55:59.567 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:59.567+0000 7f2b8f13b700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f2b88195570 con 0x7f2b88102a20 2026-03-10T07:55:59.568 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:59.567+0000 7f2b8f13b700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f2b88199380 con 0x7f2b88102100 2026-03-10T07:55:59.568 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:59.567+0000 7f2b87fff700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f2b88102a20 0x7f2b88194ee0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:55:59.568 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:59.567+0000 7f2b87fff700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f2b88102a20 0x7f2b88194ee0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 
says I am v2:192.168.123.105:36826/0 (socket says 192.168.123.105:36826) 2026-03-10T07:55:59.568 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:59.567+0000 7f2b87fff700 1 -- 192.168.123.105:0/269686206 learned_addr learned my addr 192.168.123.105:0/269686206 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T07:55:59.568 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:59.567+0000 7f2b8ced7700 1 --2- 192.168.123.105:0/269686206 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f2b88102100 0x7f2b881949a0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:55:59.568 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:59.568+0000 7f2b87fff700 1 -- 192.168.123.105:0/269686206 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f2b88102100 msgr2=0x7f2b881949a0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:55:59.568 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:59.568+0000 7f2b87fff700 1 --2- 192.168.123.105:0/269686206 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f2b88102100 0x7f2b881949a0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:55:59.568 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:59.568+0000 7f2b87fff700 1 -- 192.168.123.105:0/269686206 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f2b7c0097e0 con 0x7f2b88102a20 2026-03-10T07:55:59.568 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:59.568+0000 7f2b87fff700 1 --2- 192.168.123.105:0/269686206 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f2b88102a20 0x7f2b88194ee0 secure :-1 s=READY pgs=97 cs=0 l=1 rev1=1 crypto rx=0x7f2b7c009ad0 tx=0x7f2b7c003960 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 
2026-03-10T07:55:59.568 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:59.568+0000 7f2b85ffb700 1 -- 192.168.123.105:0/269686206 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f2b7c01d070 con 0x7f2b88102a20 2026-03-10T07:55:59.568 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:59.568+0000 7f2b8f13b700 1 -- 192.168.123.105:0/269686206 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f2b88199600 con 0x7f2b88102a20 2026-03-10T07:55:59.570 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:59.569+0000 7f2b8f13b700 1 -- 192.168.123.105:0/269686206 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f2b88199af0 con 0x7f2b88102a20 2026-03-10T07:55:59.570 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:59.569+0000 7f2b85ffb700 1 -- 192.168.123.105:0/269686206 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7f2b7c00bb40 con 0x7f2b88102a20 2026-03-10T07:55:59.570 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:59.569+0000 7f2b85ffb700 1 -- 192.168.123.105:0/269686206 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f2b7c021620 con 0x7f2b88102a20 2026-03-10T07:55:59.570 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:59.570+0000 7f2b8f13b700 1 -- 192.168.123.105:0/269686206 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f2b74005320 con 0x7f2b88102a20 2026-03-10T07:55:59.571 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:59.570+0000 7f2b85ffb700 1 -- 192.168.123.105:0/269686206 <== mon.0 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 38) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f2b7c003c30 con 0x7f2b88102a20 2026-03-10T07:55:59.573 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:59.573+0000 
7f2b85ffb700 1 --2- 192.168.123.105:0/269686206 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f2b700778e0 0x7f2b70079da0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:55:59.573 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:59.573+0000 7f2b8ced7700 1 --2- 192.168.123.105:0/269686206 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f2b700778e0 0x7f2b70079da0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:55:59.573 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:59.573+0000 7f2b85ffb700 1 -- 192.168.123.105:0/269686206 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(83..83 src has 1..83) v4 ==== 6394+0+0 (secure 0 0 0) 0x7f2b7c0a1d90 con 0x7f2b88102a20 2026-03-10T07:55:59.574 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:59.573+0000 7f2b8ced7700 1 --2- 192.168.123.105:0/269686206 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f2b700778e0 0x7f2b70079da0 secure :-1 s=READY pgs=75 cs=0 l=1 rev1=1 crypto rx=0x7f2b7800f830 tx=0x7f2b78009450 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:55:59.574 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:59.574+0000 7f2b85ffb700 1 -- 192.168.123.105:0/269686206 <== mon.0 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f2b7c063230 con 0x7f2b88102a20 2026-03-10T07:55:59.710 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:59.710+0000 7f2b8f13b700 1 -- 192.168.123.105:0/269686206 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "fs dump"} v 0) v1 -- 0x7f2b74005cc0 con 0x7f2b88102a20 2026-03-10T07:55:59.710 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:59.710+0000 7f2b85ffb700 1 -- 192.168.123.105:0/269686206 <== mon.0 v2:192.168.123.105:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump"}]=0 dumped fsmap epoch 28 v28) v1 ==== 76+0+1925 (secure 0 0 0) 0x7f2b7c0267a0 con 0x7f2b88102a20 2026-03-10T07:55:59.711 INFO:teuthology.orchestra.run.vm05.stdout:e28 2026-03-10T07:55:59.711 INFO:teuthology.orchestra.run.vm05.stdout:btime 2026-03-10T07:55:53:338844+0000 2026-03-10T07:55:59.711 INFO:teuthology.orchestra.run.vm05.stdout:enable_multiple, ever_enabled_multiple: 1,1 2026-03-10T07:55:59.711 INFO:teuthology.orchestra.run.vm05.stdout:default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2} 2026-03-10T07:55:59.711 INFO:teuthology.orchestra.run.vm05.stdout:legacy client fscid: 1 2026-03-10T07:55:59.711 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-10T07:55:59.711 INFO:teuthology.orchestra.run.vm05.stdout:Filesystem 'cephfs' (1) 2026-03-10T07:55:59.711 INFO:teuthology.orchestra.run.vm05.stdout:fs_name cephfs 2026-03-10T07:55:59.711 INFO:teuthology.orchestra.run.vm05.stdout:epoch 28 2026-03-10T07:55:59.711 INFO:teuthology.orchestra.run.vm05.stdout:flags 12 joinable allow_snaps allow_multimds_snaps 2026-03-10T07:55:59.711 INFO:teuthology.orchestra.run.vm05.stdout:created 2026-03-10T07:48:24.309293+0000 2026-03-10T07:55:59.711 INFO:teuthology.orchestra.run.vm05.stdout:modified 2026-03-10T07:55:53.338843+0000 2026-03-10T07:55:59.711 INFO:teuthology.orchestra.run.vm05.stdout:tableserver 0 2026-03-10T07:55:59.712 INFO:teuthology.orchestra.run.vm05.stdout:root 0 2026-03-10T07:55:59.712 INFO:teuthology.orchestra.run.vm05.stdout:session_timeout 60 2026-03-10T07:55:59.712 INFO:teuthology.orchestra.run.vm05.stdout:session_autoclose 300 2026-03-10T07:55:59.712 
INFO:teuthology.orchestra.run.vm05.stdout:max_file_size 1099511627776 2026-03-10T07:55:59.712 INFO:teuthology.orchestra.run.vm05.stdout:max_xattr_size 65536 2026-03-10T07:55:59.712 INFO:teuthology.orchestra.run.vm05.stdout:required_client_features {} 2026-03-10T07:55:59.712 INFO:teuthology.orchestra.run.vm05.stdout:last_failure 0 2026-03-10T07:55:59.712 INFO:teuthology.orchestra.run.vm05.stdout:last_failure_osd_epoch 83 2026-03-10T07:55:59.712 INFO:teuthology.orchestra.run.vm05.stdout:compat compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2} 2026-03-10T07:55:59.712 INFO:teuthology.orchestra.run.vm05.stdout:max_mds 1 2026-03-10T07:55:59.712 INFO:teuthology.orchestra.run.vm05.stdout:in 0 2026-03-10T07:55:59.712 INFO:teuthology.orchestra.run.vm05.stdout:up {0=24313} 2026-03-10T07:55:59.712 INFO:teuthology.orchestra.run.vm05.stdout:failed 2026-03-10T07:55:59.712 INFO:teuthology.orchestra.run.vm05.stdout:damaged 2026-03-10T07:55:59.712 INFO:teuthology.orchestra.run.vm05.stdout:stopped 2026-03-10T07:55:59.712 INFO:teuthology.orchestra.run.vm05.stdout:data_pools [3] 2026-03-10T07:55:59.712 INFO:teuthology.orchestra.run.vm05.stdout:metadata_pool 2 2026-03-10T07:55:59.712 INFO:teuthology.orchestra.run.vm05.stdout:inline_data enabled 2026-03-10T07:55:59.712 INFO:teuthology.orchestra.run.vm05.stdout:balancer 2026-03-10T07:55:59.712 INFO:teuthology.orchestra.run.vm05.stdout:bal_rank_mask -1 2026-03-10T07:55:59.712 INFO:teuthology.orchestra.run.vm05.stdout:standby_count_wanted 1 2026-03-10T07:55:59.712 INFO:teuthology.orchestra.run.vm05.stdout:qdb_cluster leader: 24313 members: 24313 2026-03-10T07:55:59.712 INFO:teuthology.orchestra.run.vm05.stdout:[mds.cephfs.vm08.dgsaon{0:24313} state up:active seq 114 join_fscid=1 addr 
[v2:192.168.123.108:6826/2963085185,v1:192.168.123.108:6827/2963085185] compat {c=[1],r=[1],i=[7ff]}] 2026-03-10T07:55:59.712 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-10T07:55:59.712 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-10T07:55:59.712 INFO:teuthology.orchestra.run.vm05.stdout:Standby daemons: 2026-03-10T07:55:59.712 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-10T07:55:59.712 INFO:teuthology.orchestra.run.vm05.stdout:[mds.cephfs.vm05.pavqil{-1:34274} state up:standby seq 1 join_fscid=1 addr [v2:192.168.123.105:6828/426813062,v1:192.168.123.105:6829/426813062] compat {c=[1],r=[1],i=[1fff]}] 2026-03-10T07:55:59.712 INFO:teuthology.orchestra.run.vm05.stdout:[mds.cephfs.vm05.omfhnh{-1:44255} state up:standby seq 1 join_fscid=1 addr [v2:192.168.123.105:6826/4251520120,v1:192.168.123.105:6827/4251520120] compat {c=[1],r=[1],i=[1fff]}] 2026-03-10T07:55:59.712 INFO:teuthology.orchestra.run.vm05.stdout:[mds.cephfs.vm08.ybmbgd{-1:44263} state up:standby seq 1 join_fscid=1 addr [v2:192.168.123.108:6824/1209244,v1:192.168.123.108:6825/1209244] compat {c=[1],r=[1],i=[1fff]}] 2026-03-10T07:55:59.713 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:59.713+0000 7f2b8f13b700 1 -- 192.168.123.105:0/269686206 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f2b700778e0 msgr2=0x7f2b70079da0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:55:59.714 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:59.713+0000 7f2b8f13b700 1 --2- 192.168.123.105:0/269686206 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f2b700778e0 0x7f2b70079da0 secure :-1 s=READY pgs=75 cs=0 l=1 rev1=1 crypto rx=0x7f2b7800f830 tx=0x7f2b78009450 comp rx=0 tx=0).stop 2026-03-10T07:55:59.714 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:59.713+0000 7f2b8f13b700 1 -- 192.168.123.105:0/269686206 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f2b88102a20 
msgr2=0x7f2b88194ee0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:55:59.714 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:59.713+0000 7f2b8f13b700 1 --2- 192.168.123.105:0/269686206 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f2b88102a20 0x7f2b88194ee0 secure :-1 s=READY pgs=97 cs=0 l=1 rev1=1 crypto rx=0x7f2b7c009ad0 tx=0x7f2b7c003960 comp rx=0 tx=0).stop 2026-03-10T07:55:59.714 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:59.714+0000 7f2b8f13b700 1 -- 192.168.123.105:0/269686206 shutdown_connections 2026-03-10T07:55:59.714 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:59.714+0000 7f2b8f13b700 1 --2- 192.168.123.105:0/269686206 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f2b88102100 0x7f2b881949a0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:55:59.714 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:59.714+0000 7f2b8f13b700 1 --2- 192.168.123.105:0/269686206 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f2b700778e0 0x7f2b70079da0 unknown :-1 s=CLOSED pgs=75 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:55:59.714 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:59.714+0000 7f2b8f13b700 1 --2- 192.168.123.105:0/269686206 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f2b88102a20 0x7f2b88194ee0 unknown :-1 s=CLOSED pgs=97 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:55:59.714 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:59.714+0000 7f2b8f13b700 1 -- 192.168.123.105:0/269686206 >> 192.168.123.105:0/269686206 conn(0x7f2b880fb830 msgr2=0x7f2b88105750 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:55:59.714 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:59.714+0000 7f2b8f13b700 1 -- 192.168.123.105:0/269686206 shutdown_connections 2026-03-10T07:55:59.714 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:59.714+0000 7f2b8f13b700 1 -- 192.168.123.105:0/269686206 wait complete. 2026-03-10T07:55:59.715 INFO:teuthology.orchestra.run.vm05.stderr:dumped fsmap epoch 28 2026-03-10T07:55:59.778 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:59.778+0000 7f24e7204700 1 -- 192.168.123.105:0/1399404424 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f24e0069c00 msgr2=0x7f24e0105aa0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:55:59.778 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:59.778+0000 7f24e7204700 1 --2- 192.168.123.105:0/1399404424 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f24e0069c00 0x7f24e0105aa0 secure :-1 s=READY pgs=98 cs=0 l=1 rev1=1 crypto rx=0x7f24d4009b00 tx=0x7f24d4009e10 comp rx=0 tx=0).stop 2026-03-10T07:55:59.778 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:59.778+0000 7f24e7204700 1 -- 192.168.123.105:0/1399404424 shutdown_connections 2026-03-10T07:55:59.778 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:59.778+0000 7f24e7204700 1 --2- 192.168.123.105:0/1399404424 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f24e0069c00 0x7f24e0105aa0 unknown :-1 s=CLOSED pgs=98 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:55:59.778 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:59.778+0000 7f24e7204700 1 --2- 192.168.123.105:0/1399404424 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f24e0069250 0x7f24e0069630 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:55:59.778 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:59.778+0000 7f24e7204700 1 -- 192.168.123.105:0/1399404424 >> 192.168.123.105:0/1399404424 conn(0x7f24e0077c10 msgr2=0x7f24e0078020 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:55:59.779 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:59.778+0000 7f24e7204700 1 -- 192.168.123.105:0/1399404424 shutdown_connections 2026-03-10T07:55:59.779 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:59.778+0000 7f24e7204700 1 -- 192.168.123.105:0/1399404424 wait complete. 2026-03-10T07:55:59.779 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:59.779+0000 7f24e7204700 1 Processor -- start 2026-03-10T07:55:59.779 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:59.779+0000 7f24e7204700 1 -- start start 2026-03-10T07:55:59.779 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:59.779+0000 7f24e7204700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f24e0069250 0x7f24e0198dd0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:55:59.779 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:59.779+0000 7f24e7204700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f24e0069c00 0x7f24e0199310 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:55:59.779 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:59.779+0000 7f24e7204700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f24e0199960 con 0x7f24e0069c00 2026-03-10T07:55:59.779 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:59.779+0000 7f24e7204700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f24e0199aa0 con 0x7f24e0069250 2026-03-10T07:55:59.780 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:59.779+0000 7f24e4fa0700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f24e0069250 0x7f24e0198dd0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:55:59.780 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:59.780+0000 7f24dffff700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f24e0069c00 0x7f24e0199310 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:55:59.780 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:59.780+0000 7f24dffff700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f24e0069c00 0x7f24e0199310 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.105:36836/0 (socket says 192.168.123.105:36836) 2026-03-10T07:55:59.780 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:59.780+0000 7f24dffff700 1 -- 192.168.123.105:0/3144089331 learned_addr learned my addr 192.168.123.105:0/3144089331 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T07:55:59.780 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:59.780+0000 7f24dffff700 1 -- 192.168.123.105:0/3144089331 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f24e0069250 msgr2=0x7f24e0198dd0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:55:59.780 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:59.780+0000 7f24dffff700 1 --2- 192.168.123.105:0/3144089331 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f24e0069250 0x7f24e0198dd0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:55:59.780 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:59.780+0000 7f24dffff700 1 -- 192.168.123.105:0/3144089331 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f24d40097e0 con 0x7f24e0069c00 2026-03-10T07:55:59.780 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:59.780+0000 7f24dffff700 1 --2- 
192.168.123.105:0/3144089331 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f24e0069c00 0x7f24e0199310 secure :-1 s=READY pgs=99 cs=0 l=1 rev1=1 crypto rx=0x7f24d4009fd0 tx=0x7f24d40049e0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:55:59.780 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:59.780+0000 7f24ddffb700 1 -- 192.168.123.105:0/3144089331 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f24d401d070 con 0x7f24e0069c00 2026-03-10T07:55:59.780 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:59.780+0000 7f24ddffb700 1 -- 192.168.123.105:0/3144089331 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7f24d400bc50 con 0x7f24e0069c00 2026-03-10T07:55:59.780 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:59.780+0000 7f24e7204700 1 -- 192.168.123.105:0/3144089331 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f24e019d890 con 0x7f24e0069c00 2026-03-10T07:55:59.781 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:59.780+0000 7f24ddffb700 1 -- 192.168.123.105:0/3144089331 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f24d400f780 con 0x7f24e0069c00 2026-03-10T07:55:59.781 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:59.781+0000 7f24e7204700 1 -- 192.168.123.105:0/3144089331 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f24e019dd80 con 0x7f24e0069c00 2026-03-10T07:55:59.783 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:59.781+0000 7f24e7204700 1 -- 192.168.123.105:0/3144089331 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f24e004f350 con 0x7f24e0069c00 2026-03-10T07:55:59.783 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:59.782+0000 7f24ddffb700 1 -- 192.168.123.105:0/3144089331 <== mon.0 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 38) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f24d4022470 con 0x7f24e0069c00 2026-03-10T07:55:59.784 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:59.783+0000 7f24ddffb700 1 --2- 192.168.123.105:0/3144089331 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f24c8077910 0x7f24c8079dd0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:55:59.784 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:59.783+0000 7f24ddffb700 1 -- 192.168.123.105:0/3144089331 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(83..83 src has 1..83) v4 ==== 6394+0+0 (secure 0 0 0) 0x7f24d409b310 con 0x7f24e0069c00 2026-03-10T07:55:59.784 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:59.783+0000 7f24e4fa0700 1 --2- 192.168.123.105:0/3144089331 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f24c8077910 0x7f24c8079dd0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:55:59.784 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:59.784+0000 7f24e4fa0700 1 --2- 192.168.123.105:0/3144089331 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f24c8077910 0x7f24c8079dd0 secure :-1 s=READY pgs=76 cs=0 l=1 rev1=1 crypto rx=0x7f24d000ba60 tx=0x7f24d0005d50 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:55:59.785 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:59.785+0000 7f24ddffb700 1 -- 192.168.123.105:0/3144089331 <== mon.0 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f24d4063a40 con 0x7f24e0069c00 
2026-03-10T07:55:59.917 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:59.917+0000 7f24e7204700 1 -- 192.168.123.105:0/3144089331 --> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7f24e019e080 con 0x7f24c8077910 2026-03-10T07:55:59.918 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:59.918+0000 7f24ddffb700 1 -- 192.168.123.105:0/3144089331 <== mgr.34104 v2:192.168.123.105:6800/627686298 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+416 (secure 0 0 0) 0x7f24e019e080 con 0x7f24c8077910 2026-03-10T07:55:59.918 INFO:teuthology.orchestra.run.vm05.stdout:{ 2026-03-10T07:55:59.918 INFO:teuthology.orchestra.run.vm05.stdout: "target_image": "quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc", 2026-03-10T07:55:59.918 INFO:teuthology.orchestra.run.vm05.stdout: "in_progress": true, 2026-03-10T07:55:59.918 INFO:teuthology.orchestra.run.vm05.stdout: "which": "Upgrading all daemon types on all hosts", 2026-03-10T07:55:59.918 INFO:teuthology.orchestra.run.vm05.stdout: "services_complete": [ 2026-03-10T07:55:59.918 INFO:teuthology.orchestra.run.vm05.stdout: "osd", 2026-03-10T07:55:59.918 INFO:teuthology.orchestra.run.vm05.stdout: "mgr", 2026-03-10T07:55:59.919 INFO:teuthology.orchestra.run.vm05.stdout: "crash", 2026-03-10T07:55:59.919 INFO:teuthology.orchestra.run.vm05.stdout: "mon" 2026-03-10T07:55:59.919 INFO:teuthology.orchestra.run.vm05.stdout: ], 2026-03-10T07:55:59.919 INFO:teuthology.orchestra.run.vm05.stdout: "progress": "15/23 daemons upgraded", 2026-03-10T07:55:59.919 INFO:teuthology.orchestra.run.vm05.stdout: "message": "Currently upgrading mds daemons", 2026-03-10T07:55:59.919 INFO:teuthology.orchestra.run.vm05.stdout: "is_paused": false 2026-03-10T07:55:59.919 INFO:teuthology.orchestra.run.vm05.stdout:} 2026-03-10T07:55:59.921 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:59.921+0000 7f24e7204700 1 -- 192.168.123.105:0/3144089331 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f24c8077910 msgr2=0x7f24c8079dd0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:55:59.921 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:59.921+0000 7f24e7204700 1 --2- 192.168.123.105:0/3144089331 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f24c8077910 0x7f24c8079dd0 secure :-1 s=READY pgs=76 cs=0 l=1 rev1=1 crypto rx=0x7f24d000ba60 tx=0x7f24d0005d50 comp rx=0 tx=0).stop 2026-03-10T07:55:59.921 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:59.921+0000 7f24e7204700 1 -- 192.168.123.105:0/3144089331 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f24e0069c00 msgr2=0x7f24e0199310 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:55:59.921 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:59.921+0000 7f24e7204700 1 --2- 192.168.123.105:0/3144089331 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f24e0069c00 0x7f24e0199310 secure :-1 s=READY pgs=99 cs=0 l=1 rev1=1 crypto rx=0x7f24d4009fd0 tx=0x7f24d40049e0 comp rx=0 tx=0).stop 2026-03-10T07:55:59.921 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:59.921+0000 7f24e7204700 1 -- 192.168.123.105:0/3144089331 shutdown_connections 2026-03-10T07:55:59.921 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:59.921+0000 7f24e7204700 1 --2- 192.168.123.105:0/3144089331 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f24e0069250 0x7f24e0198dd0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:55:59.921 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:59.921+0000 7f24e7204700 1 --2- 192.168.123.105:0/3144089331 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] 
conn(0x7f24c8077910 0x7f24c8079dd0 unknown :-1 s=CLOSED pgs=76 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:55:59.921 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:59.921+0000 7f24e7204700 1 --2- 192.168.123.105:0/3144089331 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f24e0069c00 0x7f24e0199310 unknown :-1 s=CLOSED pgs=99 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:55:59.921 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:59.921+0000 7f24e7204700 1 -- 192.168.123.105:0/3144089331 >> 192.168.123.105:0/3144089331 conn(0x7f24e0077c10 msgr2=0x7f24e0102a70 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:55:59.921 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:59.921+0000 7f24e7204700 1 -- 192.168.123.105:0/3144089331 shutdown_connections 2026-03-10T07:55:59.921 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:59.921+0000 7f24e7204700 1 -- 192.168.123.105:0/3144089331 wait complete. 
2026-03-10T07:55:59.983 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:59.982+0000 7f52bb59e700 1 -- 192.168.123.105:0/1910990652 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f52bc0684d0 msgr2=0x7f52bc068950 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:55:59.983 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:59.982+0000 7f52bb59e700 1 --2- 192.168.123.105:0/1910990652 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f52bc0684d0 0x7f52bc068950 secure :-1 s=READY pgs=56 cs=0 l=1 rev1=1 crypto rx=0x7f52ac009a60 tx=0x7f52ac009d70 comp rx=0 tx=0).stop 2026-03-10T07:55:59.983 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:55:59 vm05.local ceph-mon[130117]: pgmap v149: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 16 MiB/s rd, 4.2 KiB/s wr, 11 op/s 2026-03-10T07:55:59.983 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:55:59 vm05.local ceph-mon[130117]: from='client.34278 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T07:55:59.983 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:55:59 vm05.local ceph-mon[130117]: from='client.34282 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T07:55:59.983 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:55:59 vm05.local ceph-mon[130117]: from='client.34286 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T07:55:59.983 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:55:59 vm05.local ceph-mon[130117]: from='client.? 192.168.123.105:0/4104747827' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T07:55:59.983 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:55:59 vm05.local ceph-mon[130117]: from='client.? 
192.168.123.105:0/269686206' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch 2026-03-10T07:55:59.985 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:59.984+0000 7f52bb59e700 1 -- 192.168.123.105:0/1910990652 shutdown_connections 2026-03-10T07:55:59.985 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:59.984+0000 7f52bb59e700 1 --2- 192.168.123.105:0/1910990652 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f52bc0684d0 0x7f52bc068950 unknown :-1 s=CLOSED pgs=56 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:55:59.985 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:59.984+0000 7f52bb59e700 1 --2- 192.168.123.105:0/1910990652 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f52bc105be0 0x7f52bc105fc0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:55:59.985 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:59.984+0000 7f52bb59e700 1 -- 192.168.123.105:0/1910990652 >> 192.168.123.105:0/1910990652 conn(0x7f52bc0756b0 msgr2=0x7f52bc075ac0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:55:59.986 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:59.986+0000 7f52bb59e700 1 -- 192.168.123.105:0/1910990652 shutdown_connections 2026-03-10T07:55:59.986 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:59.986+0000 7f52bb59e700 1 -- 192.168.123.105:0/1910990652 wait complete. 
2026-03-10T07:55:59.986 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:59.986+0000 7f52bb59e700 1 Processor -- start 2026-03-10T07:55:59.986 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:59.986+0000 7f52bb59e700 1 -- start start 2026-03-10T07:55:59.987 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:59.986+0000 7f52bb59e700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f52bc0684d0 0x7f52bc0ffc80 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:55:59.987 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:59.986+0000 7f52bb59e700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f52bc105be0 0x7f52bc1001c0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:55:59.987 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:59.986+0000 7f52bb59e700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f52bc100700 con 0x7f52bc0684d0 2026-03-10T07:55:59.987 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:59.986+0000 7f52bb59e700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f52bc100870 con 0x7f52bc105be0 2026-03-10T07:55:59.987 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:59.987+0000 7f52b9d9b700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f52bc105be0 0x7f52bc1001c0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:55:59.987 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:59.987+0000 7f52b9d9b700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f52bc105be0 0x7f52bc1001c0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.108:3300/0 says I am 
v2:192.168.123.105:43574/0 (socket says 192.168.123.105:43574) 2026-03-10T07:55:59.987 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:59.987+0000 7f52b9d9b700 1 -- 192.168.123.105:0/57808901 learned_addr learned my addr 192.168.123.105:0/57808901 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T07:55:59.988 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:59.987+0000 7f52b9d9b700 1 -- 192.168.123.105:0/57808901 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f52bc0684d0 msgr2=0x7f52bc0ffc80 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:55:59.988 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:59.987+0000 7f52b9d9b700 1 --2- 192.168.123.105:0/57808901 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f52bc0684d0 0x7f52bc0ffc80 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:55:59.988 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:59.987+0000 7f52b9d9b700 1 -- 192.168.123.105:0/57808901 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f52a40097e0 con 0x7f52bc105be0 2026-03-10T07:55:59.988 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:59.988+0000 7f52b9d9b700 1 --2- 192.168.123.105:0/57808901 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f52bc105be0 0x7f52bc1001c0 secure :-1 s=READY pgs=57 cs=0 l=1 rev1=1 crypto rx=0x7f52ac00b5c0 tx=0x7f52ac00f740 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:55:59.989 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:59.989+0000 7f52b37fe700 1 -- 192.168.123.105:0/57808901 <== mon.1 v2:192.168.123.108:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f52ac01d070 con 0x7f52bc105be0 2026-03-10T07:55:59.989 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:59.989+0000 7f52b37fe700 1 -- 
192.168.123.105:0/57808901 <== mon.1 v2:192.168.123.108:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7f52ac00fd20 con 0x7f52bc105be0 2026-03-10T07:55:59.989 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:59.989+0000 7f52b37fe700 1 -- 192.168.123.105:0/57808901 <== mon.1 v2:192.168.123.108:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f52ac017760 con 0x7f52bc105be0 2026-03-10T07:55:59.989 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:59.989+0000 7f52bb59e700 1 -- 192.168.123.105:0/57808901 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f52ac009710 con 0x7f52bc105be0 2026-03-10T07:55:59.990 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:59.989+0000 7f52bb59e700 1 -- 192.168.123.105:0/57808901 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f52bc071a60 con 0x7f52bc105be0 2026-03-10T07:55:59.990 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:59.989+0000 7f52bb59e700 1 -- 192.168.123.105:0/57808901 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f52bc04f2e0 con 0x7f52bc105be0 2026-03-10T07:55:59.990 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:59.990+0000 7f52b37fe700 1 -- 192.168.123.105:0/57808901 <== mon.1 v2:192.168.123.108:3300/0 4 ==== mgrmap(e 38) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f52ac021410 con 0x7f52bc105be0 2026-03-10T07:55:59.991 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:59.991+0000 7f52b37fe700 1 --2- 192.168.123.105:0/57808901 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f52a80778c0 0x7f52a8079d80 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:55:59.991 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:59.991+0000 7f52b37fe700 1 -- 192.168.123.105:0/57808901 <== mon.1 
v2:192.168.123.108:3300/0 5 ==== osd_map(83..83 src has 1..83) v4 ==== 6394+0+0 (secure 0 0 0) 0x7f52ac09aba0 con 0x7f52bc105be0 2026-03-10T07:55:59.991 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:59.991+0000 7f52ba59c700 1 --2- 192.168.123.105:0/57808901 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f52a80778c0 0x7f52a8079d80 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:55:59.991 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:59.991+0000 7f52ba59c700 1 --2- 192.168.123.105:0/57808901 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f52a80778c0 0x7f52a8079d80 secure :-1 s=READY pgs=77 cs=0 l=1 rev1=1 crypto rx=0x7f52a40097b0 tx=0x7f52a4009700 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:55:59.993 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:55:59.993+0000 7f52b37fe700 1 -- 192.168.123.105:0/57808901 <== mon.1 v2:192.168.123.108:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f52ac063360 con 0x7f52bc105be0 2026-03-10T07:56:00.149 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:56:00.149+0000 7f52bb59e700 1 -- 192.168.123.105:0/57808901 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "health", "detail": "detail"} v 0) v1 -- 0x7f52bc04ea90 con 0x7f52bc105be0 2026-03-10T07:56:00.150 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:56:00.149+0000 7f52b37fe700 1 -- 192.168.123.105:0/57808901 <== mon.1 v2:192.168.123.108:3300/0 7 ==== mon_command_ack([{"prefix": "health", "detail": "detail"}]=0 v0) v1 ==== 74+0+201 (secure 0 0 0) 0x7f52ac0260e0 con 0x7f52bc105be0 2026-03-10T07:56:00.150 INFO:teuthology.orchestra.run.vm05.stdout:HEALTH_WARN 1 filesystem with deprecated feature inline_data 
2026-03-10T07:56:00.150 INFO:teuthology.orchestra.run.vm05.stdout:[WRN] FS_INLINE_DATA_DEPRECATED: 1 filesystem with deprecated feature inline_data 2026-03-10T07:56:00.150 INFO:teuthology.orchestra.run.vm05.stdout: fs cephfs has deprecated feature inline_data enabled. 2026-03-10T07:56:00.152 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:56:00.152+0000 7f52bb59e700 1 -- 192.168.123.105:0/57808901 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f52a80778c0 msgr2=0x7f52a8079d80 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:56:00.152 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:56:00.152+0000 7f52bb59e700 1 --2- 192.168.123.105:0/57808901 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f52a80778c0 0x7f52a8079d80 secure :-1 s=READY pgs=77 cs=0 l=1 rev1=1 crypto rx=0x7f52a40097b0 tx=0x7f52a4009700 comp rx=0 tx=0).stop 2026-03-10T07:56:00.153 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:56:00.152+0000 7f52bb59e700 1 -- 192.168.123.105:0/57808901 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f52bc105be0 msgr2=0x7f52bc1001c0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:56:00.153 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:56:00.152+0000 7f52bb59e700 1 --2- 192.168.123.105:0/57808901 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f52bc105be0 0x7f52bc1001c0 secure :-1 s=READY pgs=57 cs=0 l=1 rev1=1 crypto rx=0x7f52ac00b5c0 tx=0x7f52ac00f740 comp rx=0 tx=0).stop 2026-03-10T07:56:00.153 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:56:00.152+0000 7f52bb59e700 1 -- 192.168.123.105:0/57808901 shutdown_connections 2026-03-10T07:56:00.153 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:56:00.152+0000 7f52bb59e700 1 --2- 192.168.123.105:0/57808901 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f52a80778c0 0x7f52a8079d80 unknown :-1 
s=CLOSED pgs=77 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:56:00.153 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:56:00.152+0000 7f52bb59e700 1 --2- 192.168.123.105:0/57808901 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f52bc0684d0 0x7f52bc0ffc80 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:56:00.153 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:56:00.152+0000 7f52bb59e700 1 --2- 192.168.123.105:0/57808901 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f52bc105be0 0x7f52bc1001c0 unknown :-1 s=CLOSED pgs=57 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:56:00.153 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:56:00.152+0000 7f52bb59e700 1 -- 192.168.123.105:0/57808901 >> 192.168.123.105:0/57808901 conn(0x7f52bc0756b0 msgr2=0x7f52bc0fdb90 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:56:00.153 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:56:00.153+0000 7f52bb59e700 1 -- 192.168.123.105:0/57808901 shutdown_connections 2026-03-10T07:56:00.153 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:56:00.153+0000 7f52bb59e700 1 -- 192.168.123.105:0/57808901 wait complete. 
2026-03-10T07:56:00.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:55:59 vm08.local ceph-mon[107898]: pgmap v149: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 16 MiB/s rd, 4.2 KiB/s wr, 11 op/s 2026-03-10T07:56:00.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:55:59 vm08.local ceph-mon[107898]: from='client.34278 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T07:56:00.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:55:59 vm08.local ceph-mon[107898]: from='client.34282 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T07:56:00.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:55:59 vm08.local ceph-mon[107898]: from='client.34286 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T07:56:00.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:55:59 vm08.local ceph-mon[107898]: from='client.? 192.168.123.105:0/4104747827' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T07:56:00.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:55:59 vm08.local ceph-mon[107898]: from='client.? 192.168.123.105:0/269686206' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch 2026-03-10T07:56:01.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:56:00 vm05.local ceph-mon[130117]: from='client.34298 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T07:56:01.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:56:00 vm05.local ceph-mon[130117]: from='client.? 
192.168.123.105:0/57808901' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch 2026-03-10T07:56:01.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:56:00 vm08.local ceph-mon[107898]: from='client.34298 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T07:56:01.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:56:00 vm08.local ceph-mon[107898]: from='client.? 192.168.123.105:0/57808901' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch 2026-03-10T07:56:02.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:56:01 vm05.local ceph-mon[130117]: pgmap v150: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 6.3 MiB/s rd, 4.2 KiB/s wr, 6 op/s 2026-03-10T07:56:02.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:56:01 vm08.local ceph-mon[107898]: pgmap v150: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 6.3 MiB/s rd, 4.2 KiB/s wr, 6 op/s 2026-03-10T07:56:03.019 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:56:02 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T07:56:03.019 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:56:02 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:56:03.019 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:56:02 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:56:03.019 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:56:02 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T07:56:03.019 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:56:02 vm08.local 
ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T07:56:03.019 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:56:02 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:56:03.019 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:56:02 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T07:56:03.019 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:56:02 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T07:56:03.019 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:56:02 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T07:56:03.019 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:56:02 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T07:56:03.019 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:56:02 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T07:56:03.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:56:02 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T07:56:03.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:56:02 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:56:03.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:56:02 vm05.local ceph-mon[130117]: 
from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:56:03.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:56:02 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T07:56:03.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:56:02 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T07:56:03.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:56:02 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:56:03.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:56:02 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T07:56:03.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:56:02 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T07:56:03.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:56:02 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T07:56:03.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:56:02 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T07:56:03.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:56:02 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T07:56:03.657 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:56:03 vm05.local 
ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508-mon-vm05[130113]: 2026-03-10T07:56:03.317+0000 7f980f7e7640 -1 log_channel(cluster) log [ERR] : Health check failed: 1 filesystem is offline (MDS_ALL_DOWN) 2026-03-10T07:56:04.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:56:03 vm05.local ceph-mon[130117]: pgmap v151: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 5.3 MiB/s rd, 4.2 KiB/s wr, 6 op/s 2026-03-10T07:56:04.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:56:03 vm05.local ceph-mon[130117]: Upgrade: Updating mds.cephfs.vm08.dgsaon 2026-03-10T07:56:04.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:56:03 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:56:04.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:56:03 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "auth get-or-create", "entity": "mds.cephfs.vm08.dgsaon", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]: dispatch 2026-03-10T07:56:04.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:56:03 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T07:56:04.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:56:03 vm05.local ceph-mon[130117]: Deploying daemon mds.cephfs.vm08.dgsaon on vm08 2026-03-10T07:56:04.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:56:03 vm05.local ceph-mon[130117]: Health check failed: 1 filesystem is degraded (FS_DEGRADED) 2026-03-10T07:56:04.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:56:03 vm05.local ceph-mon[130117]: Health check failed: 1 filesystem is offline (MDS_ALL_DOWN) 2026-03-10T07:56:04.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:56:03 vm05.local ceph-mon[130117]: osdmap e84: 6 total, 6 up, 6 
in
2026-03-10T07:56:04.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:56:03 vm05.local ceph-mon[130117]: Standby daemon mds.cephfs.vm05.pavqil assigned to filesystem cephfs as rank 0
2026-03-10T07:56:04.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:56:03 vm05.local ceph-mon[130117]: Health check cleared: MDS_ALL_DOWN (was: 1 filesystem is offline)
2026-03-10T07:56:04.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:56:03 vm05.local ceph-mon[130117]: fsmap cephfs:0/1 3 up:standby, 1 failed
2026-03-10T07:56:04.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:56:03 vm05.local ceph-mon[130117]: fsmap cephfs:1/1 {0=cephfs.vm05.pavqil=up:replay} 2 up:standby
2026-03-10T07:56:04.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:56:03 vm08.local ceph-mon[107898]: pgmap v151: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 5.3 MiB/s rd, 4.2 KiB/s wr, 6 op/s
2026-03-10T07:56:04.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:56:03 vm08.local ceph-mon[107898]: Upgrade: Updating mds.cephfs.vm08.dgsaon
2026-03-10T07:56:04.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:56:03 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke'
2026-03-10T07:56:04.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:56:03 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "auth get-or-create", "entity": "mds.cephfs.vm08.dgsaon", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]}]: dispatch
2026-03-10T07:56:04.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:56:03 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-10T07:56:04.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:56:03 vm08.local ceph-mon[107898]: Deploying daemon mds.cephfs.vm08.dgsaon on vm08
2026-03-10T07:56:04.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:56:03 vm08.local ceph-mon[107898]: Health check failed: 1 filesystem is degraded (FS_DEGRADED)
2026-03-10T07:56:04.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:56:03 vm08.local ceph-mon[107898]: Health check failed: 1 filesystem is offline (MDS_ALL_DOWN)
2026-03-10T07:56:04.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:56:03 vm08.local ceph-mon[107898]: osdmap e84: 6 total, 6 up, 6 in
2026-03-10T07:56:04.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:56:03 vm08.local ceph-mon[107898]: Standby daemon mds.cephfs.vm05.pavqil assigned to filesystem cephfs as rank 0
2026-03-10T07:56:04.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:56:03 vm08.local ceph-mon[107898]: Health check cleared: MDS_ALL_DOWN (was: 1 filesystem is offline)
2026-03-10T07:56:04.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:56:03 vm08.local ceph-mon[107898]: fsmap cephfs:0/1 3 up:standby, 1 failed
2026-03-10T07:56:04.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:56:03 vm08.local ceph-mon[107898]: fsmap cephfs:1/1 {0=cephfs.vm05.pavqil=up:replay} 2 up:standby
2026-03-10T07:56:06.407 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:56:06 vm05.local ceph-mon[130117]: pgmap v153: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 822 KiB/s rd, 5.0 KiB/s wr, 4 op/s
2026-03-10T07:56:06.418 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:56:06 vm08.local ceph-mon[107898]: pgmap v153: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 822 KiB/s rd, 5.0 KiB/s wr, 4 op/s
2026-03-10T07:56:08.418 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:56:08 vm08.local ceph-mon[107898]: pgmap v154: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 11 MiB/s rd, 0 B/s wr, 4 op/s
2026-03-10T07:56:08.418 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:56:08 vm08.local ceph-mon[107898]: mds.? [v2:192.168.123.105:6828/426813062,v1:192.168.123.105:6829/426813062] up:reconnect
2026-03-10T07:56:08.418 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:56:08 vm08.local ceph-mon[107898]: fsmap cephfs:1/1 {0=cephfs.vm05.pavqil=up:reconnect} 2 up:standby
2026-03-10T07:56:08.418 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:56:08 vm08.local ceph-mon[107898]: reconnect by client.14542 192.168.144.1:0/2345317662 after 0.002
2026-03-10T07:56:08.418 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:56:08 vm08.local ceph-mon[107898]: reconnect by client.24347 192.168.123.108:0/2923077033 after 0.003
2026-03-10T07:56:08.595 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:56:08 vm05.local ceph-mon[130117]: pgmap v154: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 11 MiB/s rd, 0 B/s wr, 4 op/s
2026-03-10T07:56:08.595 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:56:08 vm05.local ceph-mon[130117]: mds.? [v2:192.168.123.105:6828/426813062,v1:192.168.123.105:6829/426813062] up:reconnect
2026-03-10T07:56:08.595 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:56:08 vm05.local ceph-mon[130117]: fsmap cephfs:1/1 {0=cephfs.vm05.pavqil=up:reconnect} 2 up:standby
2026-03-10T07:56:08.595 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:56:08 vm05.local ceph-mon[130117]: reconnect by client.14542 192.168.144.1:0/2345317662 after 0.002
2026-03-10T07:56:08.595 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:56:08 vm05.local ceph-mon[130117]: reconnect by client.24347 192.168.123.108:0/2923077033 after 0.003
2026-03-10T07:56:09.547 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:56:09 vm08.local ceph-mon[107898]: mds.? [v2:192.168.123.105:6828/426813062,v1:192.168.123.105:6829/426813062] up:rejoin
2026-03-10T07:56:09.547 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:56:09 vm08.local ceph-mon[107898]: fsmap cephfs:1/1 {0=cephfs.vm05.pavqil=up:rejoin} 2 up:standby
2026-03-10T07:56:09.547 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:56:09 vm08.local ceph-mon[107898]: daemon mds.cephfs.vm05.pavqil is now active in filesystem cephfs as rank 0
2026-03-10T07:56:09.547 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:56:09 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke'
2026-03-10T07:56:09.547 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:56:09 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke'
2026-03-10T07:56:09.547 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:56:09 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch
2026-03-10T07:56:09.657 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:56:09 vm05.local ceph-mon[130117]: mds.? [v2:192.168.123.105:6828/426813062,v1:192.168.123.105:6829/426813062] up:rejoin
2026-03-10T07:56:09.657 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:56:09 vm05.local ceph-mon[130117]: fsmap cephfs:1/1 {0=cephfs.vm05.pavqil=up:rejoin} 2 up:standby
2026-03-10T07:56:09.657 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:56:09 vm05.local ceph-mon[130117]: daemon mds.cephfs.vm05.pavqil is now active in filesystem cephfs as rank 0
2026-03-10T07:56:09.657 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:56:09 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke'
2026-03-10T07:56:09.657 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:56:09 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke'
2026-03-10T07:56:09.657 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:56:09 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch
2026-03-10T07:56:10.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:56:10 vm08.local ceph-mon[107898]: pgmap v155: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 25 MiB/s rd, 7 op/s
2026-03-10T07:56:10.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:56:10 vm08.local ceph-mon[107898]: Health check cleared: FS_DEGRADED (was: 1 filesystem is degraded)
2026-03-10T07:56:10.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:56:10 vm08.local ceph-mon[107898]: mds.? [v2:192.168.123.105:6828/426813062,v1:192.168.123.105:6829/426813062] up:active
2026-03-10T07:56:10.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:56:10 vm08.local ceph-mon[107898]: mds.? [v2:192.168.123.108:6826/3010391204,v1:192.168.123.108:6827/3010391204] up:boot
2026-03-10T07:56:10.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:56:10 vm08.local ceph-mon[107898]: fsmap cephfs:1 {0=cephfs.vm05.pavqil=up:active} 3 up:standby
2026-03-10T07:56:10.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:56:10 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm08.dgsaon"}]: dispatch
2026-03-10T07:56:10.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:56:10 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke'
2026-03-10T07:56:10.553 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:56:10 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke'
2026-03-10T07:56:10.657 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:56:10 vm05.local ceph-mon[130117]: pgmap v155: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 25 MiB/s rd, 7 op/s
2026-03-10T07:56:10.657 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:56:10 vm05.local ceph-mon[130117]: Health check cleared: FS_DEGRADED (was: 1 filesystem is degraded)
2026-03-10T07:56:10.657 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:56:10 vm05.local ceph-mon[130117]: mds.? [v2:192.168.123.105:6828/426813062,v1:192.168.123.105:6829/426813062] up:active
2026-03-10T07:56:10.657 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:56:10 vm05.local ceph-mon[130117]: mds.? [v2:192.168.123.108:6826/3010391204,v1:192.168.123.108:6827/3010391204] up:boot
2026-03-10T07:56:10.657 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:56:10 vm05.local ceph-mon[130117]: fsmap cephfs:1 {0=cephfs.vm05.pavqil=up:active} 3 up:standby
2026-03-10T07:56:10.657 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:56:10 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "mds metadata", "who": "cephfs.vm08.dgsaon"}]: dispatch
2026-03-10T07:56:10.657 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:56:10 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke'
2026-03-10T07:56:10.657 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:56:10 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke'
2026-03-10T07:56:11.418 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:56:11 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke'
2026-03-10T07:56:11.418 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:56:11 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke'
2026-03-10T07:56:11.418 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:56:11 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke'
2026-03-10T07:56:11.419 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:56:11 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke'
2026-03-10T07:56:11.419 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:56:11 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-10T07:56:11.419 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:56:11 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
2026-03-10T07:56:11.419 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:56:11 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke'
2026-03-10T07:56:11.419 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:56:11 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch
2026-03-10T07:56:11.419 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:56:11 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "versions"}]: dispatch
2026-03-10T07:56:11.419 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:56:11 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "versions"}]: dispatch
2026-03-10T07:56:11.419 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:56:11 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "versions"}]: dispatch
2026-03-10T07:56:11.419 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:56:11 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "versions"}]: dispatch
2026-03-10T07:56:11.419 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:56:11 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "versions"}]: dispatch
2026-03-10T07:56:11.419 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:56:11 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke'
2026-03-10T07:56:11.419 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:56:11 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mds.cephfs.vm05.omfhnh"}]: dispatch
2026-03-10T07:56:11.419 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:56:11 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd='[{"prefix": "config rm", "name": "container_image", "who": "mds.cephfs.vm05.omfhnh"}]': finished
2026-03-10T07:56:11.419 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:56:11 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mds.cephfs.vm05.pavqil"}]: dispatch
2026-03-10T07:56:11.419 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:56:11 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd='[{"prefix": "config rm", "name": "container_image", "who": "mds.cephfs.vm05.pavqil"}]': finished
2026-03-10T07:56:11.419 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:56:11 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mds.cephfs.vm08.dgsaon"}]: dispatch
2026-03-10T07:56:11.419 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:56:11 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd='[{"prefix": "config rm", "name": "container_image", "who": "mds.cephfs.vm08.dgsaon"}]': finished
2026-03-10T07:56:11.419 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:56:11 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mds.cephfs.vm08.ybmbgd"}]: dispatch
2026-03-10T07:56:11.419 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:56:11 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd='[{"prefix": "config rm", "name": "container_image", "who": "mds.cephfs.vm08.ybmbgd"}]': finished
2026-03-10T07:56:11.419 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:56:11 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "fs set", "fs_name": "cephfs", "var": "joinable", "val": "true"}]: dispatch
2026-03-10T07:56:11.657 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:56:11 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke'
2026-03-10T07:56:11.657 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:56:11 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke'
2026-03-10T07:56:11.657 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:56:11 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke'
2026-03-10T07:56:11.657 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:56:11 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke'
2026-03-10T07:56:11.657 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:56:11 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-10T07:56:11.657 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:56:11 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
2026-03-10T07:56:11.657 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:56:11 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke'
2026-03-10T07:56:11.657 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:56:11 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch
2026-03-10T07:56:11.657 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:56:11 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "versions"}]: dispatch
2026-03-10T07:56:11.657 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:56:11 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "versions"}]: dispatch
2026-03-10T07:56:11.657 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:56:11 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "versions"}]: dispatch
2026-03-10T07:56:11.657 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:56:11 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "versions"}]: dispatch
2026-03-10T07:56:11.657 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:56:11 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "versions"}]: dispatch
2026-03-10T07:56:11.657 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:56:11 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke'
2026-03-10T07:56:11.657 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:56:11 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mds.cephfs.vm05.omfhnh"}]: dispatch
2026-03-10T07:56:11.657 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:56:11 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd='[{"prefix": "config rm", "name": "container_image", "who": "mds.cephfs.vm05.omfhnh"}]': finished
2026-03-10T07:56:11.657 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:56:11 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mds.cephfs.vm05.pavqil"}]: dispatch
2026-03-10T07:56:11.657 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:56:11 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd='[{"prefix": "config rm", "name": "container_image", "who": "mds.cephfs.vm05.pavqil"}]': finished
2026-03-10T07:56:11.657 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:56:11 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mds.cephfs.vm08.dgsaon"}]: dispatch
2026-03-10T07:56:11.657 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:56:11 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd='[{"prefix": "config rm", "name": "container_image", "who": "mds.cephfs.vm08.dgsaon"}]': finished
2026-03-10T07:56:11.657 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:56:11 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mds.cephfs.vm08.ybmbgd"}]: dispatch
2026-03-10T07:56:11.657 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:56:11 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd='[{"prefix": "config rm", "name": "container_image", "who": "mds.cephfs.vm08.ybmbgd"}]': finished
2026-03-10T07:56:11.657 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:56:11 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "fs set", "fs_name": "cephfs", "var": "joinable", "val": "true"}]: dispatch
2026-03-10T07:56:12.657 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:56:12 vm05.local ceph-mon[130117]: pgmap v156: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 25 MiB/s rd, 102 B/s wr, 8 op/s
2026-03-10T07:56:12.657 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:56:12 vm05.local ceph-mon[130117]: Upgrade: Setting container_image for all mds
2026-03-10T07:56:12.657 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:56:12 vm05.local ceph-mon[130117]: Upgrade: Setting filesystem cephfs Joinable
2026-03-10T07:56:12.657 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:56:12 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
2026-03-10T07:56:12.657 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:56:12 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd='[{"prefix": "fs set", "fs_name": "cephfs", "var": "joinable", "val": "true"}]': finished
2026-03-10T07:56:12.657 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:56:12 vm05.local ceph-mon[130117]: fsmap cephfs:1 {0=cephfs.vm05.pavqil=up:active} 3 up:standby
2026-03-10T07:56:12.657 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:56:12 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "versions"}]: dispatch
2026-03-10T07:56:12.657 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:56:12 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke'
2026-03-10T07:56:12.657 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:56:12 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "versions"}]: dispatch
2026-03-10T07:56:12.657 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:56:12 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke'
2026-03-10T07:56:12.657 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:56:12 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "versions"}]: dispatch
2026-03-10T07:56:12.668 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:56:12 vm08.local ceph-mon[107898]: pgmap v156: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 25 MiB/s rd, 102 B/s wr, 8 op/s
2026-03-10T07:56:12.668 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:56:12 vm08.local ceph-mon[107898]: Upgrade: Setting container_image for all mds
2026-03-10T07:56:12.668 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:56:12 vm08.local ceph-mon[107898]: Upgrade: Setting filesystem cephfs Joinable
2026-03-10T07:56:12.668 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:56:12 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
2026-03-10T07:56:12.668 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:56:12 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd='[{"prefix": "fs set", "fs_name": "cephfs", "var": "joinable", "val": "true"}]': finished
2026-03-10T07:56:12.668 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:56:12 vm08.local ceph-mon[107898]: fsmap cephfs:1 {0=cephfs.vm05.pavqil=up:active} 3 up:standby
2026-03-10T07:56:12.668 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:56:12 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "versions"}]: dispatch
2026-03-10T07:56:12.668 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:56:12 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke'
2026-03-10T07:56:12.668 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:56:12 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "versions"}]: dispatch
2026-03-10T07:56:12.668 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:56:12 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke'
2026-03-10T07:56:12.668 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:56:12 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "versions"}]: dispatch
2026-03-10T07:56:14.380 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:56:14 vm08.local ceph-mon[107898]: Upgrade: Setting container_image for all rgw
2026-03-10T07:56:14.380 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:56:14 vm08.local ceph-mon[107898]: Upgrade: Setting container_image for all rbd-mirror
2026-03-10T07:56:14.380 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:56:14 vm08.local ceph-mon[107898]: Upgrade: Updating ceph-exporter.vm05 (1/2)
2026-03-10T07:56:14.380 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:56:14 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke'
2026-03-10T07:56:14.380 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:56:14 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "auth get-or-create", "entity": "client.ceph-exporter.vm05", "caps": ["mon", "profile ceph-exporter", "mon", "allow r", "mgr", "allow r", "osd", "allow r"]}]: dispatch
2026-03-10T07:56:14.380 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:56:14 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-10T07:56:14.380 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:56:14 vm08.local ceph-mon[107898]: Deploying daemon ceph-exporter.vm05 on vm05
2026-03-10T07:56:14.407 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:56:14 vm05.local ceph-mon[130117]: Upgrade: Setting container_image for all rgw
2026-03-10T07:56:14.407 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:56:14 vm05.local ceph-mon[130117]: Upgrade: Setting container_image for all rbd-mirror
2026-03-10T07:56:14.407 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:56:14 vm05.local ceph-mon[130117]: Upgrade: Updating ceph-exporter.vm05 (1/2)
2026-03-10T07:56:14.407 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:56:14 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke'
2026-03-10T07:56:14.407 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:56:14 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "auth get-or-create", "entity": "client.ceph-exporter.vm05", "caps": ["mon", "profile ceph-exporter", "mon", "allow r", "mgr", "allow r", "osd", "allow r"]}]: dispatch
2026-03-10T07:56:14.407 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:56:14 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-10T07:56:14.407 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:56:14 vm05.local ceph-mon[130117]: Deploying daemon ceph-exporter.vm05 on vm05
2026-03-10T07:56:15.287 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:56:15 vm08.local ceph-mon[107898]: pgmap v157: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 25 MiB/s rd, 204 B/s wr, 9 op/s
2026-03-10T07:56:15.287 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:56:15 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke'
2026-03-10T07:56:15.287 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:56:15 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke'
2026-03-10T07:56:15.287 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:56:15 vm08.local ceph-mon[107898]: Upgrade: Updating ceph-exporter.vm08 (2/2)
2026-03-10T07:56:15.287 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:56:15 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke'
2026-03-10T07:56:15.287 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:56:15 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "auth get-or-create", "entity": "client.ceph-exporter.vm08", "caps": ["mon", "profile ceph-exporter", "mon", "allow r", "mgr", "allow r", "osd", "allow r"]}]: dispatch
2026-03-10T07:56:15.287 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:56:15 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-10T07:56:15.287 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:56:15 vm08.local ceph-mon[107898]: Deploying daemon ceph-exporter.vm08 on vm08
2026-03-10T07:56:15.372 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:56:15 vm05.local ceph-mon[130117]: pgmap v157: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 25 MiB/s rd, 204 B/s wr, 9 op/s
2026-03-10T07:56:15.372 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:56:15 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke'
2026-03-10T07:56:15.372 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:56:15 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke'
2026-03-10T07:56:15.372 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:56:15 vm05.local ceph-mon[130117]: Upgrade: Updating ceph-exporter.vm08 (2/2)
2026-03-10T07:56:15.372 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:56:15 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke'
2026-03-10T07:56:15.372 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:56:15 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "auth get-or-create", "entity": "client.ceph-exporter.vm08", "caps": ["mon", "profile ceph-exporter", "mon", "allow r", "mgr", "allow r", "osd", "allow r"]}]: dispatch
2026-03-10T07:56:15.373 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:56:15 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-10T07:56:15.373 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:56:15 vm05.local ceph-mon[130117]: Deploying daemon ceph-exporter.vm08 on vm08
2026-03-10T07:56:16.657 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:56:16 vm05.local ceph-mon[130117]: pgmap v158: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 22 MiB/s rd, 178 B/s wr, 10 op/s
2026-03-10T07:56:16.657 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:56:16 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke'
2026-03-10T07:56:16.657 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:56:16 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke'
2026-03-10T07:56:16.657 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:56:16 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch
2026-03-10T07:56:16.657 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:56:16 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke'
2026-03-10T07:56:16.657 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:56:16 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke'
2026-03-10T07:56:16.668 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:56:16 vm08.local ceph-mon[107898]: pgmap v158: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 22 MiB/s rd, 178 B/s wr, 10 op/s
2026-03-10T07:56:16.668 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:56:16 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke'
2026-03-10T07:56:16.668 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:56:16 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke'
2026-03-10T07:56:16.668 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:56:16 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch
2026-03-10T07:56:16.668 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:56:16 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke'
2026-03-10T07:56:16.668 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:56:16 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke'
2026-03-10T07:56:17.584 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:56:17 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke'
2026-03-10T07:56:17.584 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:56:17 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke'
2026-03-10T07:56:17.584 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:56:17 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke'
2026-03-10T07:56:17.584 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:56:17 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke'
2026-03-10T07:56:17.584 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:56:17 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke'
2026-03-10T07:56:17.584 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:56:17 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke'
2026-03-10T07:56:17.584 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:56:17 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke'
2026-03-10T07:56:17.584 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:56:17 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke'
2026-03-10T07:56:17.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:56:17 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke'
2026-03-10T07:56:17.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:56:17 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke'
2026-03-10T07:56:17.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:56:17 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke'
2026-03-10T07:56:17.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:56:17 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke'
2026-03-10T07:56:17.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:56:17 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke'
2026-03-10T07:56:17.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:56:17 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke'
2026-03-10T07:56:17.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:56:17 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke'
2026-03-10T07:56:17.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:56:17 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke'
2026-03-10T07:56:18.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:56:18 vm08.local ceph-mon[107898]: pgmap v159: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 20 MiB/s rd, 4.2 KiB/s wr, 11 op/s
2026-03-10T07:56:18.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:56:18 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke'
2026-03-10T07:56:18.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:56:18 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke'
2026-03-10T07:56:18.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:56:18 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-10T07:56:18.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:56:18 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
2026-03-10T07:56:18.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:56:18 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke'
2026-03-10T07:56:18.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:56:18 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch
2026-03-10T07:56:18.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:56:18 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "versions"}]: dispatch
2026-03-10T07:56:18.919 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:56:18 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "versions"}]: dispatch
2026-03-10T07:56:18.919 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:56:18 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T07:56:18.919 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:56:18 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T07:56:18.919 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:56:18 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T07:56:18.919 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:56:18 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "fs set", "fs_name": "cephfs", "var": "joinable", "val": "true"}]: dispatch 2026-03-10T07:56:19.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:56:18 vm05.local ceph-mon[130117]: pgmap v159: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 20 MiB/s rd, 4.2 KiB/s wr, 11 op/s 2026-03-10T07:56:19.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:56:18 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:56:19.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:56:18 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:56:19.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:56:18 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T07:56:19.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:56:18 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "auth get", "entity": 
"client.admin"}]: dispatch 2026-03-10T07:56:19.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:56:18 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:56:19.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:56:18 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T07:56:19.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:56:18 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T07:56:19.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:56:18 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T07:56:19.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:56:18 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T07:56:19.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:56:18 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T07:56:19.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:56:18 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T07:56:19.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:56:18 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "fs set", "fs_name": "cephfs", "var": "joinable", "val": "true"}]: dispatch 2026-03-10T07:56:20.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:56:19 vm05.local ceph-mon[130117]: Upgrade: Setting filesystem cephfs Joinable 
2026-03-10T07:56:20.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:56:19 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd='[{"prefix": "fs set", "fs_name": "cephfs", "var": "joinable", "val": "true"}]': finished 2026-03-10T07:56:20.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:56:19 vm05.local ceph-mon[130117]: fsmap cephfs:1 {0=cephfs.vm05.pavqil=up:active} 3 up:standby 2026-03-10T07:56:20.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:56:19 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T07:56:20.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:56:19 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T07:56:20.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:56:19 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T07:56:20.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:56:19 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T07:56:20.158 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:56:19 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:56:20.158 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:56:19 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "config rm", "name": "container_image", "who": "client.ceph-exporter.vm05"}]: dispatch 2026-03-10T07:56:20.158 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:56:19 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd='[{"prefix": 
"config rm", "name": "container_image", "who": "client.ceph-exporter.vm05"}]': finished 2026-03-10T07:56:20.158 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:56:19 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "config rm", "name": "container_image", "who": "client.ceph-exporter.vm08"}]: dispatch 2026-03-10T07:56:20.158 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:56:19 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd='[{"prefix": "config rm", "name": "container_image", "who": "client.ceph-exporter.vm08"}]': finished 2026-03-10T07:56:20.158 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:56:19 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T07:56:20.158 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:56:19 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:56:20.158 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:56:19 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T07:56:20.158 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:56:19 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:56:20.158 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:56:19 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T07:56:20.158 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:56:19 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:56:20.158 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:56:19 vm05.local ceph-mon[130117]: from='mgr.34104 
192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T07:56:20.158 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:56:19 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T07:56:20.158 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:56:19 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T07:56:20.158 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:56:19 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T07:56:20.158 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:56:19 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T07:56:20.158 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:56:19 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T07:56:20.158 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:56:19 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mgr"}]: dispatch 2026-03-10T07:56:20.158 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:56:19 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd='[{"prefix": "config rm", "name": "container_image", "who": "mgr"}]': finished 2026-03-10T07:56:20.158 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:56:19 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mon"}]: dispatch 
2026-03-10T07:56:20.158 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:56:19 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd='[{"prefix": "config rm", "name": "container_image", "who": "mon"}]': finished 2026-03-10T07:56:20.158 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:56:19 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "config rm", "name": "container_image", "who": "client.crash"}]: dispatch 2026-03-10T07:56:20.158 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:56:19 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd='[{"prefix": "config rm", "name": "container_image", "who": "client.crash"}]': finished 2026-03-10T07:56:20.158 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:56:19 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "config rm", "name": "container_image", "who": "osd"}]: dispatch 2026-03-10T07:56:20.158 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:56:19 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd='[{"prefix": "config rm", "name": "container_image", "who": "osd"}]': finished 2026-03-10T07:56:20.158 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:56:19 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mds"}]: dispatch 2026-03-10T07:56:20.158 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:56:19 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd='[{"prefix": "config rm", "name": "container_image", "who": "mds"}]': finished 2026-03-10T07:56:20.158 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:56:19 vm05.local ceph-mon[130117]: from='mgr.34104 
192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "config rm", "name": "container_image", "who": "client.rgw"}]: dispatch 2026-03-10T07:56:20.158 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:56:19 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd='[{"prefix": "config rm", "name": "container_image", "who": "client.rgw"}]': finished 2026-03-10T07:56:20.158 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:56:19 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "config rm", "name": "container_image", "who": "client.rbd-mirror"}]: dispatch 2026-03-10T07:56:20.158 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:56:19 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd='[{"prefix": "config rm", "name": "container_image", "who": "client.rbd-mirror"}]': finished 2026-03-10T07:56:20.158 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:56:19 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mon"}]: dispatch 2026-03-10T07:56:20.158 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:56:19 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "config rm", "name": "container_image", "who": "client.ceph-exporter"}]: dispatch 2026-03-10T07:56:20.158 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:56:19 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd='[{"prefix": "config rm", "name": "container_image", "who": "client.ceph-exporter"}]': finished 2026-03-10T07:56:20.158 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:56:19 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": 
"config rm", "name": "container_image", "who": "client.iscsi"}]: dispatch 2026-03-10T07:56:20.158 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:56:19 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd='[{"prefix": "config rm", "name": "container_image", "who": "client.iscsi"}]': finished 2026-03-10T07:56:20.158 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:56:19 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "config rm", "name": "container_image", "who": "client.nfs"}]: dispatch 2026-03-10T07:56:20.158 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:56:19 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd='[{"prefix": "config rm", "name": "container_image", "who": "client.nfs"}]': finished 2026-03-10T07:56:20.158 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:56:19 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "config rm", "name": "container_image", "who": "client.nvmeof"}]: dispatch 2026-03-10T07:56:20.158 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:56:19 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd='[{"prefix": "config rm", "name": "container_image", "who": "client.nvmeof"}]': finished 2026-03-10T07:56:20.158 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:56:19 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mon"}]: dispatch 2026-03-10T07:56:20.158 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:56:19 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mon"}]: dispatch 2026-03-10T07:56:20.158 
INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:56:19 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mon"}]: dispatch 2026-03-10T07:56:20.158 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:56:19 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mon"}]: dispatch 2026-03-10T07:56:20.158 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:56:19 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mon"}]: dispatch 2026-03-10T07:56:20.158 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:56:19 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mon"}]: dispatch 2026-03-10T07:56:20.158 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:56:19 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix":"config-key del","key":"mgr/cephadm/upgrade_state"}]: dispatch 2026-03-10T07:56:20.158 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:56:19 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd='[{"prefix":"config-key del","key":"mgr/cephadm/upgrade_state"}]': finished 2026-03-10T07:56:20.158 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:56:19 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T07:56:20.158 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:56:19 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "config 
generate-minimal-conf"}]: dispatch 2026-03-10T07:56:20.158 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:56:19 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T07:56:20.158 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:56:19 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:56:20.158 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:56:19 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T07:56:20.158 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:56:19 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T07:56:20.158 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:56:19 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T07:56:20.158 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:56:19 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:56:20.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:56:19 vm08.local ceph-mon[107898]: Upgrade: Setting filesystem cephfs Joinable 2026-03-10T07:56:20.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:56:19 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd='[{"prefix": "fs set", "fs_name": "cephfs", "var": "joinable", "val": "true"}]': finished 2026-03-10T07:56:20.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:56:19 vm08.local ceph-mon[107898]: fsmap cephfs:1 {0=cephfs.vm05.pavqil=up:active} 3 up:standby 
2026-03-10T07:56:20.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:56:19 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T07:56:20.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:56:19 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T07:56:20.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:56:19 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T07:56:20.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:56:19 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T07:56:20.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:56:19 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:56:20.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:56:19 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "config rm", "name": "container_image", "who": "client.ceph-exporter.vm05"}]: dispatch 2026-03-10T07:56:20.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:56:19 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd='[{"prefix": "config rm", "name": "container_image", "who": "client.ceph-exporter.vm05"}]': finished 2026-03-10T07:56:20.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:56:19 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "config rm", "name": "container_image", "who": "client.ceph-exporter.vm08"}]: dispatch 2026-03-10T07:56:20.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 
07:56:19 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd='[{"prefix": "config rm", "name": "container_image", "who": "client.ceph-exporter.vm08"}]': finished 2026-03-10T07:56:20.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:56:19 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T07:56:20.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:56:19 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:56:20.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:56:19 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T07:56:20.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:56:19 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:56:20.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:56:19 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T07:56:20.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:56:19 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:56:20.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:56:19 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T07:56:20.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:56:19 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T07:56:20.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:56:19 vm08.local ceph-mon[107898]: from='mgr.34104 
192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T07:56:20.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:56:19 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T07:56:20.169 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:56:19 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T07:56:20.169 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:56:19 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T07:56:20.169 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:56:19 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mgr"}]: dispatch 2026-03-10T07:56:20.169 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:56:19 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd='[{"prefix": "config rm", "name": "container_image", "who": "mgr"}]': finished 2026-03-10T07:56:20.169 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:56:19 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mon"}]: dispatch 2026-03-10T07:56:20.169 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:56:19 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd='[{"prefix": "config rm", "name": "container_image", "who": "mon"}]': finished 2026-03-10T07:56:20.169 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:56:19 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 
cmd=[{"prefix": "config rm", "name": "container_image", "who": "client.crash"}]: dispatch 2026-03-10T07:56:20.169 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:56:19 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd='[{"prefix": "config rm", "name": "container_image", "who": "client.crash"}]': finished 2026-03-10T07:56:20.169 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:56:19 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "config rm", "name": "container_image", "who": "osd"}]: dispatch 2026-03-10T07:56:20.169 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:56:19 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd='[{"prefix": "config rm", "name": "container_image", "who": "osd"}]': finished 2026-03-10T07:56:20.169 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:56:19 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mds"}]: dispatch 2026-03-10T07:56:20.169 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:56:19 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd='[{"prefix": "config rm", "name": "container_image", "who": "mds"}]': finished 2026-03-10T07:56:20.169 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:56:19 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "config rm", "name": "container_image", "who": "client.rgw"}]: dispatch 2026-03-10T07:56:20.169 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:56:19 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd='[{"prefix": "config rm", "name": "container_image", "who": "client.rgw"}]': finished 2026-03-10T07:56:20.169 
INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:56:19 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "config rm", "name": "container_image", "who": "client.rbd-mirror"}]: dispatch 2026-03-10T07:56:20.169 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:56:19 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd='[{"prefix": "config rm", "name": "container_image", "who": "client.rbd-mirror"}]': finished 2026-03-10T07:56:20.169 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:56:19 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mon"}]: dispatch 2026-03-10T07:56:20.169 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:56:19 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "config rm", "name": "container_image", "who": "client.ceph-exporter"}]: dispatch 2026-03-10T07:56:20.169 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:56:19 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd='[{"prefix": "config rm", "name": "container_image", "who": "client.ceph-exporter"}]': finished 2026-03-10T07:56:20.169 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:56:19 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "config rm", "name": "container_image", "who": "client.iscsi"}]: dispatch 2026-03-10T07:56:20.169 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:56:19 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd='[{"prefix": "config rm", "name": "container_image", "who": "client.iscsi"}]': finished 2026-03-10T07:56:20.169 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:56:19 vm08.local 
ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "config rm", "name": "container_image", "who": "client.nfs"}]: dispatch 2026-03-10T07:56:20.169 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:56:19 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd='[{"prefix": "config rm", "name": "container_image", "who": "client.nfs"}]': finished 2026-03-10T07:56:20.169 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:56:19 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "config rm", "name": "container_image", "who": "client.nvmeof"}]: dispatch 2026-03-10T07:56:20.169 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:56:19 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd='[{"prefix": "config rm", "name": "container_image", "who": "client.nvmeof"}]': finished 2026-03-10T07:56:20.169 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:56:19 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mon"}]: dispatch 2026-03-10T07:56:20.169 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:56:19 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mon"}]: dispatch 2026-03-10T07:56:20.169 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:56:19 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mon"}]: dispatch 2026-03-10T07:56:20.169 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:56:19 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "config rm", 
"name": "container_image", "who": "mon"}]: dispatch 2026-03-10T07:56:20.169 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:56:19 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mon"}]: dispatch 2026-03-10T07:56:20.169 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:56:19 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "config rm", "name": "container_image", "who": "mon"}]: dispatch 2026-03-10T07:56:20.169 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:56:19 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix":"config-key del","key":"mgr/cephadm/upgrade_state"}]: dispatch 2026-03-10T07:56:20.169 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:56:19 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd='[{"prefix":"config-key del","key":"mgr/cephadm/upgrade_state"}]': finished 2026-03-10T07:56:20.169 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:56:19 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T07:56:20.169 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:56:19 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T07:56:20.169 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:56:19 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T07:56:20.169 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:56:19 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' 
entity='mgr.vm05.blexke' 2026-03-10T07:56:20.169 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:56:19 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T07:56:20.169 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:56:19 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T07:56:20.169 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:56:19 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T07:56:20.169 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:56:19 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:56:20.734 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:56:20 vm08.local ceph-mon[107898]: pgmap v160: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 12 MiB/s rd, 4.2 KiB/s wr, 8 op/s 2026-03-10T07:56:21.061 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:56:20 vm05.local ceph-mon[130117]: pgmap v160: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 12 MiB/s rd, 4.2 KiB/s wr, 8 op/s 2026-03-10T07:56:21.061 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:56:20 vm05.local ceph-mon[130117]: Upgrade: Setting container_image for all ceph-exporter 2026-03-10T07:56:21.061 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:56:20 vm05.local ceph-mon[130117]: Upgrade: Setting container_image for all iscsi 2026-03-10T07:56:21.061 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:56:20 vm05.local ceph-mon[130117]: Upgrade: Setting container_image for all nfs 2026-03-10T07:56:21.061 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:56:20 vm05.local ceph-mon[130117]: 
Upgrade: Setting container_image for all nvmeof 2026-03-10T07:56:21.061 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:56:20 vm05.local ceph-mon[130117]: Upgrade: Finalizing container_image settings 2026-03-10T07:56:21.061 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:56:20 vm05.local ceph-mon[130117]: Upgrade: Complete! 2026-03-10T07:56:21.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:56:20 vm08.local ceph-mon[107898]: Upgrade: Setting container_image for all ceph-exporter 2026-03-10T07:56:21.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:56:20 vm08.local ceph-mon[107898]: Upgrade: Setting container_image for all iscsi 2026-03-10T07:56:21.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:56:20 vm08.local ceph-mon[107898]: Upgrade: Setting container_image for all nfs 2026-03-10T07:56:21.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:56:20 vm08.local ceph-mon[107898]: Upgrade: Setting container_image for all nvmeof 2026-03-10T07:56:21.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:56:20 vm08.local ceph-mon[107898]: Upgrade: Finalizing container_image settings 2026-03-10T07:56:21.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:56:20 vm08.local ceph-mon[107898]: Upgrade: Complete! 
2026-03-10T07:56:23.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:56:22 vm05.local ceph-mon[130117]: pgmap v161: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 316 KiB/s rd, 4.2 KiB/s wr, 5 op/s 2026-03-10T07:56:23.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:56:22 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:56:23.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:56:22 vm08.local ceph-mon[107898]: pgmap v161: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 316 KiB/s rd, 4.2 KiB/s wr, 5 op/s 2026-03-10T07:56:23.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:56:22 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T07:56:25.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:56:24 vm05.local ceph-mon[130117]: pgmap v162: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 2.5 KiB/s rd, 4.1 KiB/s wr, 4 op/s 2026-03-10T07:56:25.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:56:24 vm08.local ceph-mon[107898]: pgmap v162: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 2.5 KiB/s rd, 4.1 KiB/s wr, 4 op/s 2026-03-10T07:56:27.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:56:26 vm05.local ceph-mon[130117]: pgmap v163: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 1.2 KiB/s rd, 4.0 KiB/s wr, 3 op/s 2026-03-10T07:56:27.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:56:26 vm08.local ceph-mon[107898]: pgmap v163: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 1.2 KiB/s rd, 4.0 KiB/s wr, 3 op/s 2026-03-10T07:56:28.058 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:56:27 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "osd blocklist ls", 
"format": "json"}]: dispatch 2026-03-10T07:56:28.070 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:56:27 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T07:56:29.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:56:28 vm05.local ceph-mon[130117]: pgmap v164: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 4.0 KiB/s wr, 1 op/s 2026-03-10T07:56:29.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:56:28 vm08.local ceph-mon[107898]: pgmap v164: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail; 4.0 KiB/s wr, 1 op/s 2026-03-10T07:56:30.219 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:56:30.217+0000 7f27a6ab5700 1 -- 192.168.123.105:0/558465825 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f27a0069220 msgr2=0x7f27a0069680 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:56:30.219 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:56:30.217+0000 7f27a6ab5700 1 --2- 192.168.123.105:0/558465825 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f27a0069220 0x7f27a0069680 secure :-1 s=READY pgs=101 cs=0 l=1 rev1=1 crypto rx=0x7f2794009b50 tx=0x7f2794009e60 comp rx=0 tx=0).stop 2026-03-10T07:56:30.219 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:56:30.218+0000 7f27a6ab5700 1 -- 192.168.123.105:0/558465825 shutdown_connections 2026-03-10T07:56:30.219 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:56:30.218+0000 7f27a6ab5700 1 --2- 192.168.123.105:0/558465825 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f27a0069220 0x7f27a0069680 unknown :-1 s=CLOSED pgs=101 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:56:30.219 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:56:30.218+0000 7f27a6ab5700 1 --2- 192.168.123.105:0/558465825 >> 
[v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f27a0106fc0 0x7f27a01073a0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:56:30.219 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:56:30.218+0000 7f27a6ab5700 1 -- 192.168.123.105:0/558465825 >> 192.168.123.105:0/558465825 conn(0x7f27a0076b60 msgr2=0x7f27a0076f70 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:56:30.219 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:56:30.218+0000 7f27a6ab5700 1 -- 192.168.123.105:0/558465825 shutdown_connections 2026-03-10T07:56:30.220 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:56:30.218+0000 7f27a6ab5700 1 -- 192.168.123.105:0/558465825 wait complete. 2026-03-10T07:56:30.220 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:56:30.219+0000 7f27a6ab5700 1 Processor -- start 2026-03-10T07:56:30.220 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:56:30.219+0000 7f27a6ab5700 1 -- start start 2026-03-10T07:56:30.220 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:56:30.219+0000 7f27a6ab5700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f27a0069220 0x7f27a0100040 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:56:30.220 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:56:30.219+0000 7f27a6ab5700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f27a0106fc0 0x7f27a0100580 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:56:30.221 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:56:30.219+0000 7f27a6ab5700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f27a0100ac0 con 0x7f27a0106fc0 2026-03-10T07:56:30.221 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:56:30.219+0000 7f27a6ab5700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 
0 v1 -- 0x7f27a0100c00 con 0x7f27a0069220 2026-03-10T07:56:30.221 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:56:30.219+0000 7f27a4851700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f27a0069220 0x7f27a0100040 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:56:30.221 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:56:30.219+0000 7f27a4851700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f27a0069220 0x7f27a0100040 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.108:3300/0 says I am v2:192.168.123.105:37762/0 (socket says 192.168.123.105:37762) 2026-03-10T07:56:30.221 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:56:30.219+0000 7f27a4851700 1 -- 192.168.123.105:0/3937609156 learned_addr learned my addr 192.168.123.105:0/3937609156 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T07:56:30.221 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:56:30.219+0000 7f27a4851700 1 -- 192.168.123.105:0/3937609156 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f27a0106fc0 msgr2=0x7f27a0100580 unknown :-1 s=STATE_CONNECTING l=1).mark_down 2026-03-10T07:56:30.221 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:56:30.220+0000 7f279ffff700 1 --2- 192.168.123.105:0/3937609156 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f27a0106fc0 0x7f27a0100580 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:56:30.221 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:56:30.220+0000 7f27a4851700 1 --2- 192.168.123.105:0/3937609156 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f27a0106fc0 0x7f27a0100580 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 
tx=0 comp rx=0 tx=0).stop 2026-03-10T07:56:30.221 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:56:30.220+0000 7f27a4851700 1 -- 192.168.123.105:0/3937609156 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f27940097e0 con 0x7f27a0069220 2026-03-10T07:56:30.222 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:56:30.220+0000 7f279ffff700 1 --2- 192.168.123.105:0/3937609156 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f27a0106fc0 0x7f27a0100580 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request state changed! 2026-03-10T07:56:30.222 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:56:30.220+0000 7f27a4851700 1 --2- 192.168.123.105:0/3937609156 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f27a0069220 0x7f27a0100040 secure :-1 s=READY pgs=61 cs=0 l=1 rev1=1 crypto rx=0x7f279000d8d0 tx=0x7f279000dc90 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:56:30.222 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:56:30.221+0000 7f279dffb700 1 -- 192.168.123.105:0/3937609156 <== mon.1 v2:192.168.123.108:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f2790009940 con 0x7f27a0069220 2026-03-10T07:56:30.223 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:56:30.221+0000 7f279dffb700 1 -- 192.168.123.105:0/3937609156 <== mon.1 v2:192.168.123.108:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7f2790010460 con 0x7f27a0069220 2026-03-10T07:56:30.223 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:56:30.221+0000 7f279dffb700 1 -- 192.168.123.105:0/3937609156 <== mon.1 v2:192.168.123.108:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f279000f5d0 con 0x7f27a0069220 2026-03-10T07:56:30.223 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:56:30.221+0000 7f27a6ab5700 1 -- 
192.168.123.105:0/3937609156 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f27a01a2e00 con 0x7f27a0069220 2026-03-10T07:56:30.223 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:56:30.221+0000 7f27a6ab5700 1 -- 192.168.123.105:0/3937609156 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f27a01a3210 con 0x7f27a0069220 2026-03-10T07:56:30.224 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:56:30.222+0000 7f279dffb700 1 -- 192.168.123.105:0/3937609156 <== mon.1 v2:192.168.123.108:3300/0 4 ==== mgrmap(e 38) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f27900105d0 con 0x7f27a0069220 2026-03-10T07:56:30.224 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:56:30.221+0000 7f27a6ab5700 1 -- 192.168.123.105:0/3937609156 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f27a01093e0 con 0x7f27a0069220 2026-03-10T07:56:30.224 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:56:30.222+0000 7f279dffb700 1 --2- 192.168.123.105:0/3937609156 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f27880778c0 0x7f2788079d80 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:56:30.224 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:56:30.222+0000 7f279dffb700 1 -- 192.168.123.105:0/3937609156 <== mon.1 v2:192.168.123.108:3300/0 5 ==== osd_map(84..84 src has 1..84) v4 ==== 6480+0+0 (secure 0 0 0) 0x7f2790099a00 con 0x7f27a0069220 2026-03-10T07:56:30.224 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:56:30.223+0000 7f279ffff700 1 --2- 192.168.123.105:0/3937609156 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f27880778c0 0x7f2788079d80 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 
2026-03-10T07:56:30.224 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:56:30.223+0000 7f279ffff700 1 --2- 192.168.123.105:0/3937609156 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f27880778c0 0x7f2788079d80 secure :-1 s=READY pgs=79 cs=0 l=1 rev1=1 crypto rx=0x7f279400b5c0 tx=0x7f27940058e0 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:56:30.226 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:56:30.225+0000 7f279dffb700 1 -- 192.168.123.105:0/3937609156 <== mon.1 v2:192.168.123.108:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f2790061ab0 con 0x7f27a0069220 2026-03-10T07:56:30.352 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:56:30.350+0000 7f27a6ab5700 1 -- 192.168.123.105:0/3937609156 --> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7f27a01015e0 con 0x7f27880778c0 2026-03-10T07:56:30.352 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:56:30.351+0000 7f279dffb700 1 -- 192.168.123.105:0/3937609156 <== mgr.34104 v2:192.168.123.105:6800/627686298 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+175 (secure 0 0 0) 0x7f27a01015e0 con 0x7f27880778c0 2026-03-10T07:56:30.354 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:56:30.353+0000 7f27a6ab5700 1 -- 192.168.123.105:0/3937609156 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f27880778c0 msgr2=0x7f2788079d80 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:56:30.354 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:56:30.353+0000 7f27a6ab5700 1 --2- 192.168.123.105:0/3937609156 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f27880778c0 0x7f2788079d80 secure :-1 s=READY pgs=79 cs=0 l=1 rev1=1 crypto 
rx=0x7f279400b5c0 tx=0x7f27940058e0 comp rx=0 tx=0).stop 2026-03-10T07:56:30.354 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:56:30.353+0000 7f27a6ab5700 1 -- 192.168.123.105:0/3937609156 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f27a0069220 msgr2=0x7f27a0100040 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:56:30.354 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:56:30.353+0000 7f27a6ab5700 1 --2- 192.168.123.105:0/3937609156 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f27a0069220 0x7f27a0100040 secure :-1 s=READY pgs=61 cs=0 l=1 rev1=1 crypto rx=0x7f279000d8d0 tx=0x7f279000dc90 comp rx=0 tx=0).stop 2026-03-10T07:56:30.355 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:56:30.353+0000 7f27a6ab5700 1 -- 192.168.123.105:0/3937609156 shutdown_connections 2026-03-10T07:56:30.355 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:56:30.353+0000 7f27a6ab5700 1 --2- 192.168.123.105:0/3937609156 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f27a0069220 0x7f27a0100040 unknown :-1 s=CLOSED pgs=61 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:56:30.355 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:56:30.353+0000 7f27a6ab5700 1 --2- 192.168.123.105:0/3937609156 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f27880778c0 0x7f2788079d80 unknown :-1 s=CLOSED pgs=79 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:56:30.355 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:56:30.353+0000 7f27a6ab5700 1 --2- 192.168.123.105:0/3937609156 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f27a0106fc0 0x7f27a0100580 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:56:30.355 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:56:30.353+0000 7f27a6ab5700 1 -- 192.168.123.105:0/3937609156 >> 
192.168.123.105:0/3937609156 conn(0x7f27a0076b60 msgr2=0x7f27a00fef60 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:56:30.355 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:56:30.353+0000 7f27a6ab5700 1 -- 192.168.123.105:0/3937609156 shutdown_connections 2026-03-10T07:56:30.355 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:56:30.354+0000 7f27a6ab5700 1 -- 192.168.123.105:0/3937609156 wait complete. 2026-03-10T07:56:30.419 DEBUG:teuthology.orchestra.run.vm05:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 12e9780e-1c55-11f1-8896-79f7c2e9b508 -e sha1=e911bdebe5c8faa3800735d1568fcdca65db60df -- bash -c 'ceph orch ps' 2026-03-10T07:56:30.609 INFO:teuthology.orchestra.run.vm05.stderr:Inferring config /var/lib/ceph/12e9780e-1c55-11f1-8896-79f7c2e9b508/mon.vm05/config 2026-03-10T07:56:30.853 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:56:30.851+0000 7f4a59a95700 1 -- 192.168.123.105:0/4008342753 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f4a54103cf0 msgr2=0x7f4a54107d40 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:56:30.853 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:56:30.851+0000 7f4a59a95700 1 --2- 192.168.123.105:0/4008342753 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f4a54103cf0 0x7f4a54107d40 secure :-1 s=READY pgs=102 cs=0 l=1 rev1=1 crypto rx=0x7f4a44009b00 tx=0x7f4a44009e10 comp rx=0 tx=0).stop 2026-03-10T07:56:30.853 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:56:30.852+0000 7f4a59a95700 1 -- 192.168.123.105:0/4008342753 shutdown_connections 2026-03-10T07:56:30.853 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:56:30.852+0000 7f4a59a95700 1 --2- 192.168.123.105:0/4008342753 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f4a54103cf0 0x7f4a54107d40 unknown :-1 s=CLOSED pgs=102 cs=0 l=1 rev1=1 crypto rx=0 tx=0 
comp rx=0 tx=0).stop 2026-03-10T07:56:30.853 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:56:30.852+0000 7f4a59a95700 1 --2- 192.168.123.105:0/4008342753 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f4a54103340 0x7f4a54103720 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:56:30.854 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:56:30.852+0000 7f4a59a95700 1 -- 192.168.123.105:0/4008342753 >> 192.168.123.105:0/4008342753 conn(0x7f4a540febd0 msgr2=0x7f4a54100ff0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:56:30.854 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:56:30.852+0000 7f4a59a95700 1 -- 192.168.123.105:0/4008342753 shutdown_connections 2026-03-10T07:56:30.854 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:56:30.852+0000 7f4a59a95700 1 -- 192.168.123.105:0/4008342753 wait complete. 2026-03-10T07:56:30.854 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:56:30.853+0000 7f4a59a95700 1 Processor -- start 2026-03-10T07:56:30.854 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:56:30.853+0000 7f4a59a95700 1 -- start start 2026-03-10T07:56:30.854 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:56:30.853+0000 7f4a59a95700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f4a54103340 0x7f4a54198db0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:56:30.855 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:56:30.853+0000 7f4a59a95700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f4a54103cf0 0x7f4a541992f0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:56:30.855 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:56:30.853+0000 7f4a59a95700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f4a541999d0 con 0x7f4a54103340 
2026-03-10T07:56:30.855 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:56:30.853+0000 7f4a59a95700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f4a5419d760 con 0x7f4a54103cf0 2026-03-10T07:56:30.855 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:56:30.853+0000 7f4a52ffd700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f4a54103cf0 0x7f4a541992f0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:56:30.855 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:56:30.854+0000 7f4a52ffd700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f4a54103cf0 0x7f4a541992f0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.108:3300/0 says I am v2:192.168.123.105:37786/0 (socket says 192.168.123.105:37786) 2026-03-10T07:56:30.855 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:56:30.854+0000 7f4a52ffd700 1 -- 192.168.123.105:0/79527645 learned_addr learned my addr 192.168.123.105:0/79527645 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T07:56:30.855 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:56:30.854+0000 7f4a52ffd700 1 -- 192.168.123.105:0/79527645 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f4a54103340 msgr2=0x7f4a54198db0 unknown :-1 s=STATE_CONNECTING l=1).mark_down 2026-03-10T07:56:30.855 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:56:30.854+0000 7f4a537fe700 1 --2- 192.168.123.105:0/79527645 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f4a54103340 0x7f4a54198db0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:56:30.855 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:56:30.854+0000 7f4a52ffd700 1 --2- 
192.168.123.105:0/79527645 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f4a54103340 0x7f4a54198db0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:56:30.855 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:56:30.854+0000 7f4a52ffd700 1 -- 192.168.123.105:0/79527645 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f4a440097e0 con 0x7f4a54103cf0 2026-03-10T07:56:30.855 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:56:30.854+0000 7f4a537fe700 1 --2- 192.168.123.105:0/79527645 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f4a54103340 0x7f4a54198db0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request state changed! 2026-03-10T07:56:30.856 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:56:30.854+0000 7f4a52ffd700 1 --2- 192.168.123.105:0/79527645 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f4a54103cf0 0x7f4a541992f0 secure :-1 s=READY pgs=62 cs=0 l=1 rev1=1 crypto rx=0x7f4a44009fd0 tx=0x7f4a44004970 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:56:30.856 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:56:30.855+0000 7f4a50ff9700 1 -- 192.168.123.105:0/79527645 <== mon.1 v2:192.168.123.108:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f4a4401d070 con 0x7f4a54103cf0 2026-03-10T07:56:30.856 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:56:30.855+0000 7f4a50ff9700 1 -- 192.168.123.105:0/79527645 <== mon.1 v2:192.168.123.108:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7f4a4400bc50 con 0x7f4a54103cf0 2026-03-10T07:56:30.856 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:56:30.855+0000 7f4a50ff9700 1 -- 192.168.123.105:0/79527645 <== mon.1 v2:192.168.123.108:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 
0x7f4a4400f650 con 0x7f4a54103cf0 2026-03-10T07:56:30.856 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:56:30.855+0000 7f4a59a95700 1 -- 192.168.123.105:0/79527645 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f4a5419d9e0 con 0x7f4a54103cf0 2026-03-10T07:56:30.856 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:56:30.855+0000 7f4a59a95700 1 -- 192.168.123.105:0/79527645 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f4a5419ded0 con 0x7f4a54103cf0 2026-03-10T07:56:30.857 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:56:30.855+0000 7f4a59a95700 1 -- 192.168.123.105:0/79527645 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f4a5404ea90 con 0x7f4a54103cf0 2026-03-10T07:56:30.858 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:56:30.856+0000 7f4a50ff9700 1 -- 192.168.123.105:0/79527645 <== mon.1 v2:192.168.123.108:3300/0 4 ==== mgrmap(e 38) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f4a44022470 con 0x7f4a54103cf0 2026-03-10T07:56:30.858 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:56:30.857+0000 7f4a50ff9700 1 --2- 192.168.123.105:0/79527645 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f4a400778c0 0x7f4a40079d80 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:56:30.858 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:56:30.857+0000 7f4a50ff9700 1 -- 192.168.123.105:0/79527645 <== mon.1 v2:192.168.123.108:3300/0 5 ==== osd_map(84..84 src has 1..84) v4 ==== 6480+0+0 (secure 0 0 0) 0x7f4a4409bd30 con 0x7f4a54103cf0 2026-03-10T07:56:30.858 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:56:30.857+0000 7f4a537fe700 1 --2- 192.168.123.105:0/79527645 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f4a400778c0 0x7f4a40079d80 unknown 
:-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:56:30.859 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:56:30.857+0000 7f4a537fe700 1 --2- 192.168.123.105:0/79527645 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f4a400778c0 0x7f4a40079d80 secure :-1 s=READY pgs=80 cs=0 l=1 rev1=1 crypto rx=0x7f4a3c005fd0 tx=0x7f4a3c005ee0 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:56:30.860 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:56:30.859+0000 7f4a50ff9700 1 -- 192.168.123.105:0/79527645 <== mon.1 v2:192.168.123.108:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f4a440644b0 con 0x7f4a54103cf0 2026-03-10T07:56:30.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:56:30 vm08.local ceph-mon[107898]: pgmap v165: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail 2026-03-10T07:56:30.978 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:56:30.976+0000 7f4a59a95700 1 -- 192.168.123.105:0/79527645 --> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] -- mgr_command(tid 0: {"prefix": "orch ps", "target": ["mon-mgr", ""]}) v1 -- 0x7f4a5419e1b0 con 0x7f4a400778c0 2026-03-10T07:56:30.982 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:56:30.981+0000 7f4a50ff9700 1 -- 192.168.123.105:0/79527645 <== mgr.34104 v2:192.168.123.105:6800/627686298 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+3576 (secure 0 0 0) 0x7f4a5419e1b0 con 0x7f4a400778c0 2026-03-10T07:56:30.983 INFO:teuthology.orchestra.run.vm05.stdout:NAME HOST PORTS STATUS REFRESHED AGE MEM USE MEM LIM VERSION IMAGE ID CONTAINER ID 2026-03-10T07:56:30.983 INFO:teuthology.orchestra.run.vm05.stdout:alertmanager.vm05 vm05 *:9093,9094 running (5m) 14s ago 10m 23.5M - 0.25.0 c8568f914cd2 
ac15d5f35994 2026-03-10T07:56:30.983 INFO:teuthology.orchestra.run.vm05.stdout:ceph-exporter.vm05 vm05 running (17s) 14s ago 10m 9773k - 19.2.3-678-ge911bdeb 654f31e6858e 918e0750b516 2026-03-10T07:56:30.983 INFO:teuthology.orchestra.run.vm05.stdout:ceph-exporter.vm08 vm08 running (15s) 14s ago 9m 9634k - 19.2.3-678-ge911bdeb 654f31e6858e a87dcf05815c 2026-03-10T07:56:30.983 INFO:teuthology.orchestra.run.vm05.stdout:crash.vm05 vm05 running (3m) 14s ago 10m 7847k - 19.2.3-678-ge911bdeb 654f31e6858e daa831c74cf4 2026-03-10T07:56:30.983 INFO:teuthology.orchestra.run.vm05.stdout:crash.vm08 vm08 running (3m) 14s ago 9m 8262k - 19.2.3-678-ge911bdeb 654f31e6858e 668ac55c9722 2026-03-10T07:56:30.983 INFO:teuthology.orchestra.run.vm05.stdout:grafana.vm05 vm05 *:3000 running (4m) 14s ago 9m 85.9M - 10.4.0 c8b91775d855 6acb529ad951 2026-03-10T07:56:30.983 INFO:teuthology.orchestra.run.vm05.stdout:mds.cephfs.vm05.omfhnh vm05 running (56s) 14s ago 8m 17.5M - 19.2.3-678-ge911bdeb 654f31e6858e fd612446ffa4 2026-03-10T07:56:30.983 INFO:teuthology.orchestra.run.vm05.stdout:mds.cephfs.vm05.pavqil vm05 running (49s) 14s ago 8m 96.6M - 19.2.3-678-ge911bdeb 654f31e6858e f8155f8c8aff 2026-03-10T07:56:30.983 INFO:teuthology.orchestra.run.vm05.stdout:mds.cephfs.vm08.dgsaon vm08 running (21s) 14s ago 8m 15.4M - 19.2.3-678-ge911bdeb 654f31e6858e 1bd3e50b1d3f 2026-03-10T07:56:30.983 INFO:teuthology.orchestra.run.vm05.stdout:mds.cephfs.vm08.ybmbgd vm08 running (41s) 14s ago 8m 21.8M - 19.2.3-678-ge911bdeb 654f31e6858e 24039c2c65e7 2026-03-10T07:56:30.983 INFO:teuthology.orchestra.run.vm05.stdout:mgr.vm05.blexke vm05 *:8443,9283,8765 running (6m) 14s ago 10m 596M - 19.2.3-678-ge911bdeb 654f31e6858e 5467b6a3bd38 2026-03-10T07:56:30.983 INFO:teuthology.orchestra.run.vm05.stdout:mgr.vm08.orfpog vm08 *:8443,9283,8765 running (5m) 14s ago 9m 536M - 19.2.3-678-ge911bdeb 654f31e6858e 7a15d7811d5b 2026-03-10T07:56:30.983 INFO:teuthology.orchestra.run.vm05.stdout:mon.vm05 vm05 running (4m) 14s ago 10m 
65.1M 2048M 19.2.3-678-ge911bdeb 654f31e6858e f02f076bb820 2026-03-10T07:56:30.983 INFO:teuthology.orchestra.run.vm05.stdout:mon.vm08 vm08 running (3m) 14s ago 9m 57.2M 2048M 19.2.3-678-ge911bdeb 654f31e6858e 73d9a504f360 2026-03-10T07:56:30.983 INFO:teuthology.orchestra.run.vm05.stdout:node-exporter.vm05 vm05 *:9100 running (5m) 14s ago 10m 10.5M - 1.7.0 72c9c2088986 7cd0b23b4118 2026-03-10T07:56:30.983 INFO:teuthology.orchestra.run.vm05.stdout:node-exporter.vm08 vm08 *:9100 running (5m) 14s ago 9m 10.1M - 1.7.0 72c9c2088986 3dd4d91d5881 2026-03-10T07:56:30.983 INFO:teuthology.orchestra.run.vm05.stdout:osd.0 vm05 running (3m) 14s ago 9m 205M 4096M 19.2.3-678-ge911bdeb 654f31e6858e b35fccc2a4d5 2026-03-10T07:56:30.983 INFO:teuthology.orchestra.run.vm05.stdout:osd.1 vm05 running (2m) 14s ago 9m 107M 4096M 19.2.3-678-ge911bdeb 654f31e6858e e3fe4ad5c6a3 2026-03-10T07:56:30.983 INFO:teuthology.orchestra.run.vm05.stdout:osd.2 vm05 running (2m) 14s ago 8m 117M 4096M 19.2.3-678-ge911bdeb 654f31e6858e 108a77e324b8 2026-03-10T07:56:30.983 INFO:teuthology.orchestra.run.vm05.stdout:osd.3 vm08 running (110s) 14s ago 8m 150M 4096M 19.2.3-678-ge911bdeb 654f31e6858e 3f280bcfe0f5 2026-03-10T07:56:30.983 INFO:teuthology.orchestra.run.vm05.stdout:osd.4 vm08 running (89s) 14s ago 8m 137M 4096M 19.2.3-678-ge911bdeb 654f31e6858e 132c8d288b1e 2026-03-10T07:56:30.983 INFO:teuthology.orchestra.run.vm05.stdout:osd.5 vm08 running (68s) 14s ago 8m 104M 4096M 19.2.3-678-ge911bdeb 654f31e6858e a4a8929822a2 2026-03-10T07:56:30.983 INFO:teuthology.orchestra.run.vm05.stdout:prometheus.vm05 vm05 *:9095 running (5m) 14s ago 9m 58.4M - 2.51.0 1d3b7f56885b c59a6be07563 2026-03-10T07:56:30.984 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:56:30.983+0000 7f4a59a95700 1 -- 192.168.123.105:0/79527645 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f4a400778c0 msgr2=0x7f4a40079d80 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:56:30.984 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:56:30.983+0000 7f4a59a95700 1 --2- 192.168.123.105:0/79527645 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f4a400778c0 0x7f4a40079d80 secure :-1 s=READY pgs=80 cs=0 l=1 rev1=1 crypto rx=0x7f4a3c005fd0 tx=0x7f4a3c005ee0 comp rx=0 tx=0).stop 2026-03-10T07:56:30.985 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:56:30.983+0000 7f4a59a95700 1 -- 192.168.123.105:0/79527645 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f4a54103cf0 msgr2=0x7f4a541992f0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:56:30.985 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:56:30.983+0000 7f4a59a95700 1 --2- 192.168.123.105:0/79527645 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f4a54103cf0 0x7f4a541992f0 secure :-1 s=READY pgs=62 cs=0 l=1 rev1=1 crypto rx=0x7f4a44009fd0 tx=0x7f4a44004970 comp rx=0 tx=0).stop 2026-03-10T07:56:30.985 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:56:30.984+0000 7f4a59a95700 1 -- 192.168.123.105:0/79527645 shutdown_connections 2026-03-10T07:56:30.985 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:56:30.984+0000 7f4a59a95700 1 --2- 192.168.123.105:0/79527645 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f4a400778c0 0x7f4a40079d80 unknown :-1 s=CLOSED pgs=80 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:56:30.985 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:56:30.984+0000 7f4a59a95700 1 --2- 192.168.123.105:0/79527645 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f4a54103340 0x7f4a54198db0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:56:30.985 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:56:30.984+0000 7f4a59a95700 1 --2- 192.168.123.105:0/79527645 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f4a54103cf0 
0x7f4a541992f0 unknown :-1 s=CLOSED pgs=62 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:56:30.985 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:56:30.984+0000 7f4a59a95700 1 -- 192.168.123.105:0/79527645 >> 192.168.123.105:0/79527645 conn(0x7f4a540febd0 msgr2=0x7f4a54100160 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:56:30.985 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:56:30.984+0000 7f4a59a95700 1 -- 192.168.123.105:0/79527645 shutdown_connections 2026-03-10T07:56:30.985 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:56:30.984+0000 7f4a59a95700 1 -- 192.168.123.105:0/79527645 wait complete. 2026-03-10T07:56:30.997 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:56:30 vm05.local ceph-mon[130117]: pgmap v165: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail 2026-03-10T07:56:31.051 DEBUG:teuthology.orchestra.run.vm05:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 12e9780e-1c55-11f1-8896-79f7c2e9b508 -e sha1=e911bdebe5c8faa3800735d1568fcdca65db60df -- bash -c 'ceph orch upgrade status' 2026-03-10T07:56:31.184 INFO:teuthology.orchestra.run.vm05.stderr:Inferring config /var/lib/ceph/12e9780e-1c55-11f1-8896-79f7c2e9b508/mon.vm05/config 2026-03-10T07:56:31.413 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:56:31.411+0000 7f78e0def700 1 -- 192.168.123.105:0/1192477623 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f78dc0690e0 msgr2=0x7f78dc105b50 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:56:31.413 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:56:31.411+0000 7f78e0def700 1 --2- 192.168.123.105:0/1192477623 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f78dc0690e0 0x7f78dc105b50 secure :-1 s=READY pgs=103 cs=0 l=1 rev1=1 crypto rx=0x7f78cc009b00 tx=0x7f78cc009e10 comp rx=0 tx=0).stop 
2026-03-10T07:56:31.413 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:56:31.412+0000 7f78e0def700 1 -- 192.168.123.105:0/1192477623 shutdown_connections 2026-03-10T07:56:31.414 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:56:31.412+0000 7f78e0def700 1 --2- 192.168.123.105:0/1192477623 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f78dc0690e0 0x7f78dc105b50 unknown :-1 s=CLOSED pgs=103 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:56:31.414 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:56:31.412+0000 7f78e0def700 1 --2- 192.168.123.105:0/1192477623 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f78dc068730 0x7f78dc068b10 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:56:31.414 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:56:31.412+0000 7f78e0def700 1 -- 192.168.123.105:0/1192477623 >> 192.168.123.105:0/1192477623 conn(0x7f78dc075960 msgr2=0x7f78dc075d70 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:56:31.414 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:56:31.412+0000 7f78e0def700 1 -- 192.168.123.105:0/1192477623 shutdown_connections 2026-03-10T07:56:31.414 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:56:31.412+0000 7f78e0def700 1 -- 192.168.123.105:0/1192477623 wait complete. 
2026-03-10T07:56:31.414 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:56:31.413+0000 7f78e0def700 1 Processor -- start 2026-03-10T07:56:31.414 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:56:31.413+0000 7f78e0def700 1 -- start start 2026-03-10T07:56:31.414 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:56:31.413+0000 7f78e0def700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f78dc068730 0x7f78dc19a7e0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:56:31.414 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:56:31.413+0000 7f78e0def700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f78dc0690e0 0x7f78dc19ad20 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:56:31.414 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:56:31.413+0000 7f78e0def700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f78dc19b3b0 con 0x7f78dc0690e0 2026-03-10T07:56:31.415 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:56:31.413+0000 7f78e0def700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f78dc194860 con 0x7f78dc068730 2026-03-10T07:56:31.415 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:56:31.413+0000 7f78da59c700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f78dc068730 0x7f78dc19a7e0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:56:31.415 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:56:31.413+0000 7f78d9d9b700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f78dc0690e0 0x7f78dc19ad20 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 
2026-03-10T07:56:31.415 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:56:31.413+0000 7f78da59c700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f78dc068730 0x7f78dc19a7e0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.108:3300/0 says I am v2:192.168.123.105:37804/0 (socket says 192.168.123.105:37804) 2026-03-10T07:56:31.415 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:56:31.413+0000 7f78da59c700 1 -- 192.168.123.105:0/3336905804 learned_addr learned my addr 192.168.123.105:0/3336905804 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T07:56:31.415 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:56:31.414+0000 7f78d9d9b700 1 -- 192.168.123.105:0/3336905804 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f78dc068730 msgr2=0x7f78dc19a7e0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:56:31.416 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:56:31.414+0000 7f78d9d9b700 1 --2- 192.168.123.105:0/3336905804 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f78dc068730 0x7f78dc19a7e0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:56:31.416 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:56:31.414+0000 7f78d9d9b700 1 -- 192.168.123.105:0/3336905804 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f78cc0097e0 con 0x7f78dc0690e0 2026-03-10T07:56:31.420 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:56:31.414+0000 7f78d9d9b700 1 --2- 192.168.123.105:0/3336905804 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f78dc0690e0 0x7f78dc19ad20 secure :-1 s=READY pgs=104 cs=0 l=1 rev1=1 crypto rx=0x7f78cc009fd0 tx=0x7f78cc004a50 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:56:31.420 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:56:31.414+0000 7f78d37fe700 1 -- 192.168.123.105:0/3336905804 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f78cc01d070 con 0x7f78dc0690e0 2026-03-10T07:56:31.420 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:56:31.414+0000 7f78d37fe700 1 -- 192.168.123.105:0/3336905804 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7f78cc022470 con 0x7f78dc0690e0 2026-03-10T07:56:31.420 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:56:31.414+0000 7f78e0def700 1 -- 192.168.123.105:0/3336905804 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f78dc194ae0 con 0x7f78dc0690e0 2026-03-10T07:56:31.420 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:56:31.414+0000 7f78e0def700 1 -- 192.168.123.105:0/3336905804 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f78dc194fd0 con 0x7f78dc0690e0 2026-03-10T07:56:31.420 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:56:31.415+0000 7f78d37fe700 1 -- 192.168.123.105:0/3336905804 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f78cc00f670 con 0x7f78dc0690e0 2026-03-10T07:56:31.420 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:56:31.415+0000 7f78d37fe700 1 -- 192.168.123.105:0/3336905804 <== mon.0 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 38) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f78cc00f850 con 0x7f78dc0690e0 2026-03-10T07:56:31.420 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:56:31.415+0000 7f78e0def700 1 -- 192.168.123.105:0/3336905804 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f78bc005320 con 0x7f78dc0690e0 2026-03-10T07:56:31.420 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:56:31.417+0000 7f78d37fe700 1 --2- 
192.168.123.105:0/3336905804 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f78c80800e0 0x7f78c80825a0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:56:31.420 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:56:31.417+0000 7f78d37fe700 1 -- 192.168.123.105:0/3336905804 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(84..84 src has 1..84) v4 ==== 6480+0+0 (secure 0 0 0) 0x7f78cc09b450 con 0x7f78dc0690e0 2026-03-10T07:56:31.420 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:56:31.418+0000 7f78da59c700 1 --2- 192.168.123.105:0/3336905804 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f78c80800e0 0x7f78c80825a0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:56:31.420 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:56:31.418+0000 7f78d37fe700 1 -- 192.168.123.105:0/3336905804 <== mon.0 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f78cc064810 con 0x7f78dc0690e0 2026-03-10T07:56:31.420 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:56:31.419+0000 7f78da59c700 1 --2- 192.168.123.105:0/3336905804 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f78c80800e0 0x7f78c80825a0 secure :-1 s=READY pgs=81 cs=0 l=1 rev1=1 crypto rx=0x7f78c4005950 tx=0x7f78c400b500 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:56:31.548 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:56:31.546+0000 7f78e0def700 1 -- 192.168.123.105:0/3336905804 --> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] -- mgr_command(tid 0: {"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}) v1 -- 0x7f78bc000bf0 con 0x7f78c80800e0 2026-03-10T07:56:31.548 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:56:31.547+0000 7f78d37fe700 1 -- 192.168.123.105:0/3336905804 <== mgr.34104 v2:192.168.123.105:6800/627686298 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+175 (secure 0 0 0) 0x7f78bc000bf0 con 0x7f78c80800e0 2026-03-10T07:56:31.549 INFO:teuthology.orchestra.run.vm05.stdout:{ 2026-03-10T07:56:31.549 INFO:teuthology.orchestra.run.vm05.stdout: "target_image": null, 2026-03-10T07:56:31.549 INFO:teuthology.orchestra.run.vm05.stdout: "in_progress": false, 2026-03-10T07:56:31.549 INFO:teuthology.orchestra.run.vm05.stdout: "which": "", 2026-03-10T07:56:31.549 INFO:teuthology.orchestra.run.vm05.stdout: "services_complete": [], 2026-03-10T07:56:31.549 INFO:teuthology.orchestra.run.vm05.stdout: "progress": null, 2026-03-10T07:56:31.549 INFO:teuthology.orchestra.run.vm05.stdout: "message": "", 2026-03-10T07:56:31.549 INFO:teuthology.orchestra.run.vm05.stdout: "is_paused": false 2026-03-10T07:56:31.549 INFO:teuthology.orchestra.run.vm05.stdout:} 2026-03-10T07:56:31.551 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:56:31.550+0000 7f78e0def700 1 -- 192.168.123.105:0/3336905804 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f78c80800e0 msgr2=0x7f78c80825a0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:56:31.552 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:56:31.550+0000 7f78e0def700 1 --2- 192.168.123.105:0/3336905804 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f78c80800e0 0x7f78c80825a0 secure :-1 s=READY pgs=81 cs=0 l=1 rev1=1 crypto rx=0x7f78c4005950 tx=0x7f78c400b500 comp rx=0 tx=0).stop 2026-03-10T07:56:31.552 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:56:31.551+0000 7f78e0def700 1 -- 192.168.123.105:0/3336905804 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f78dc0690e0 msgr2=0x7f78dc19ad20 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:56:31.552 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:56:31.551+0000 7f78e0def700 1 --2- 192.168.123.105:0/3336905804 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f78dc0690e0 0x7f78dc19ad20 secure :-1 s=READY pgs=104 cs=0 l=1 rev1=1 crypto rx=0x7f78cc009fd0 tx=0x7f78cc004a50 comp rx=0 tx=0).stop 2026-03-10T07:56:31.552 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:56:31.551+0000 7f78e0def700 1 -- 192.168.123.105:0/3336905804 shutdown_connections 2026-03-10T07:56:31.552 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:56:31.551+0000 7f78e0def700 1 --2- 192.168.123.105:0/3336905804 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f78dc068730 0x7f78dc19a7e0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:56:31.552 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:56:31.551+0000 7f78e0def700 1 --2- 192.168.123.105:0/3336905804 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f78c80800e0 0x7f78c80825a0 unknown :-1 s=CLOSED pgs=81 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:56:31.552 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:56:31.551+0000 7f78e0def700 1 --2- 192.168.123.105:0/3336905804 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f78dc0690e0 0x7f78dc19ad20 unknown :-1 s=CLOSED pgs=104 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:56:31.553 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:56:31.551+0000 7f78e0def700 1 -- 192.168.123.105:0/3336905804 >> 192.168.123.105:0/3336905804 conn(0x7f78dc075960 msgr2=0x7f78dc0feac0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:56:31.553 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:56:31.552+0000 7f78e0def700 1 -- 192.168.123.105:0/3336905804 shutdown_connections 2026-03-10T07:56:31.553 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:56:31.552+0000 7f78e0def700 1 -- 
192.168.123.105:0/3336905804 wait complete. 2026-03-10T07:56:31.592 DEBUG:teuthology.orchestra.run.vm05:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 12e9780e-1c55-11f1-8896-79f7c2e9b508 -e sha1=e911bdebe5c8faa3800735d1568fcdca65db60df -- bash -c 'ceph health detail' 2026-03-10T07:56:31.724 INFO:teuthology.orchestra.run.vm05.stderr:Inferring config /var/lib/ceph/12e9780e-1c55-11f1-8896-79f7c2e9b508/mon.vm05/config 2026-03-10T07:56:31.799 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:56:31 vm05.local ceph-mon[130117]: from='client.44297 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T07:56:31.965 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:56:31.963+0000 7f3a92c06700 1 -- 192.168.123.105:0/3080375185 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3a8c103340 msgr2=0x7f3a8c103720 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:56:31.966 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:56:31.963+0000 7f3a92c06700 1 --2- 192.168.123.105:0/3080375185 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3a8c103340 0x7f3a8c103720 secure :-1 s=READY pgs=105 cs=0 l=1 rev1=1 crypto rx=0x7f3a7c009b00 tx=0x7f3a7c009e10 comp rx=0 tx=0).stop 2026-03-10T07:56:31.966 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:56:31.964+0000 7f3a92c06700 1 -- 192.168.123.105:0/3080375185 shutdown_connections 2026-03-10T07:56:31.966 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:56:31.964+0000 7f3a92c06700 1 --2- 192.168.123.105:0/3080375185 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f3a8c103cf0 0x7f3a8c107d40 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:56:31.966 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:56:31.964+0000 7f3a92c06700 1 --2- 
192.168.123.105:0/3080375185 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3a8c103340 0x7f3a8c103720 unknown :-1 s=CLOSED pgs=105 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:56:31.966 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:56:31.964+0000 7f3a92c06700 1 -- 192.168.123.105:0/3080375185 >> 192.168.123.105:0/3080375185 conn(0x7f3a8c0feb90 msgr2=0x7f3a8c100fb0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:56:31.966 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:56:31.964+0000 7f3a92c06700 1 -- 192.168.123.105:0/3080375185 shutdown_connections 2026-03-10T07:56:31.966 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:56:31.964+0000 7f3a92c06700 1 -- 192.168.123.105:0/3080375185 wait complete. 2026-03-10T07:56:31.966 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:56:31.965+0000 7f3a92c06700 1 Processor -- start 2026-03-10T07:56:31.966 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:56:31.965+0000 7f3a92c06700 1 -- start start 2026-03-10T07:56:31.966 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:56:31.965+0000 7f3a92c06700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3a8c103340 0x7f3a8c198dd0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:56:31.966 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:56:31.965+0000 7f3a909a2700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3a8c103340 0x7f3a8c198dd0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:56:31.967 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:56:31.965+0000 7f3a909a2700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3a8c103340 0x7f3a8c198dd0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer 
v2:192.168.123.105:3300/0 says I am v2:192.168.123.105:59652/0 (socket says 192.168.123.105:59652) 2026-03-10T07:56:31.967 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:56:31.965+0000 7f3a92c06700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f3a8c103cf0 0x7f3a8c199310 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:56:31.967 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:56:31.965+0000 7f3a92c06700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f3a8c1999f0 con 0x7f3a8c103340 2026-03-10T07:56:31.967 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:56:31.965+0000 7f3a92c06700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f3a8c19d780 con 0x7f3a8c103cf0 2026-03-10T07:56:31.967 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:56:31.965+0000 7f3a909a2700 1 -- 192.168.123.105:0/3933871240 learned_addr learned my addr 192.168.123.105:0/3933871240 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T07:56:31.967 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:56:31.966+0000 7f3a8bfff700 1 --2- 192.168.123.105:0/3933871240 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f3a8c103cf0 0x7f3a8c199310 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:56:31.967 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:56:31.966+0000 7f3a909a2700 1 -- 192.168.123.105:0/3933871240 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f3a8c103cf0 msgr2=0x7f3a8c199310 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:56:31.967 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:56:31.966+0000 7f3a909a2700 1 --2- 192.168.123.105:0/3933871240 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f3a8c103cf0 
0x7f3a8c199310 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:56:31.967 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:56:31.966+0000 7f3a909a2700 1 -- 192.168.123.105:0/3933871240 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f3a7c0097e0 con 0x7f3a8c103340 2026-03-10T07:56:31.967 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:56:31.966+0000 7f3a8bfff700 1 --2- 192.168.123.105:0/3933871240 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f3a8c103cf0 0x7f3a8c199310 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_auth_reply_more state changed! 2026-03-10T07:56:31.967 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:56:31.966+0000 7f3a909a2700 1 --2- 192.168.123.105:0/3933871240 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3a8c103340 0x7f3a8c198dd0 secure :-1 s=READY pgs=106 cs=0 l=1 rev1=1 crypto rx=0x7f3a7c004c80 tx=0x7f3a7c0051f0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:56:31.967 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:56:31.966+0000 7f3a89ffb700 1 -- 192.168.123.105:0/3933871240 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f3a7c01d070 con 0x7f3a8c103340 2026-03-10T07:56:31.968 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:56:31.966+0000 7f3a89ffb700 1 -- 192.168.123.105:0/3933871240 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7f3a7c00bb40 con 0x7f3a8c103340 2026-03-10T07:56:31.968 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:56:31.966+0000 7f3a92c06700 1 -- 192.168.123.105:0/3933871240 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f3a8c19da00 con 0x7f3a8c103340 2026-03-10T07:56:31.968 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:56:31.967+0000 7f3a89ffb700 1 -- 192.168.123.105:0/3933871240 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f3a7c00f7c0 con 0x7f3a8c103340 2026-03-10T07:56:31.968 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:56:31.967+0000 7f3a92c06700 1 -- 192.168.123.105:0/3933871240 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f3a8c19def0 con 0x7f3a8c103340 2026-03-10T07:56:31.972 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:56:31.968+0000 7f3a89ffb700 1 -- 192.168.123.105:0/3933871240 <== mon.0 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 38) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f3a7c00f920 con 0x7f3a8c103340 2026-03-10T07:56:31.972 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:56:31.968+0000 7f3a92c06700 1 -- 192.168.123.105:0/3933871240 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f3a8c10b6e0 con 0x7f3a8c103340 2026-03-10T07:56:31.972 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:56:31.971+0000 7f3a89ffb700 1 --2- 192.168.123.105:0/3933871240 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f3a74077660 0x7f3a74079b20 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:56:31.972 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:56:31.971+0000 7f3a8bfff700 1 --2- 192.168.123.105:0/3933871240 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f3a74077660 0x7f3a74079b20 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:56:31.973 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:56:31.972+0000 7f3a8bfff700 1 --2- 192.168.123.105:0/3933871240 >> 
[v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f3a74077660 0x7f3a74079b20 secure :-1 s=READY pgs=82 cs=0 l=1 rev1=1 crypto rx=0x7f3a8c19a3f0 tx=0x7f3a8000b5f0 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:56:31.973 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:56:31.972+0000 7f3a89ffb700 1 -- 192.168.123.105:0/3933871240 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(84..84 src has 1..84) v4 ==== 6480+0+0 (secure 0 0 0) 0x7f3a7c09afd0 con 0x7f3a8c103340 2026-03-10T07:56:31.973 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:56:31.972+0000 7f3a89ffb700 1 -- 192.168.123.105:0/3933871240 <== mon.0 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f3a7c0a0050 con 0x7f3a8c103340 2026-03-10T07:56:32.128 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:56:32.126+0000 7f3a92c06700 1 -- 192.168.123.105:0/3933871240 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "health", "detail": "detail"} v 0) v1 -- 0x7f3a8c19a130 con 0x7f3a8c103340 2026-03-10T07:56:32.128 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:56:32.127+0000 7f3a89ffb700 1 -- 192.168.123.105:0/3933871240 <== mon.0 v2:192.168.123.105:3300/0 7 ==== mon_command_ack([{"prefix": "health", "detail": "detail"}]=0 v0) v1 ==== 74+0+201 (secure 0 0 0) 0x7f3a7c0636c0 con 0x7f3a8c103340 2026-03-10T07:56:32.129 INFO:teuthology.orchestra.run.vm05.stdout:HEALTH_WARN 1 filesystem with deprecated feature inline_data 2026-03-10T07:56:32.129 INFO:teuthology.orchestra.run.vm05.stdout:[WRN] FS_INLINE_DATA_DEPRECATED: 1 filesystem with deprecated feature inline_data 2026-03-10T07:56:32.129 INFO:teuthology.orchestra.run.vm05.stdout: fs cephfs has deprecated feature inline_data enabled. 
2026-03-10T07:56:32.131 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:56:32.130+0000 7f3a92c06700 1 -- 192.168.123.105:0/3933871240 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f3a74077660 msgr2=0x7f3a74079b20 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:56:32.131 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:56:32.130+0000 7f3a92c06700 1 --2- 192.168.123.105:0/3933871240 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f3a74077660 0x7f3a74079b20 secure :-1 s=READY pgs=82 cs=0 l=1 rev1=1 crypto rx=0x7f3a8c19a3f0 tx=0x7f3a8000b5f0 comp rx=0 tx=0).stop 2026-03-10T07:56:32.131 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:56:32.130+0000 7f3a92c06700 1 -- 192.168.123.105:0/3933871240 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3a8c103340 msgr2=0x7f3a8c198dd0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:56:32.131 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:56:32.130+0000 7f3a92c06700 1 --2- 192.168.123.105:0/3933871240 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3a8c103340 0x7f3a8c198dd0 secure :-1 s=READY pgs=106 cs=0 l=1 rev1=1 crypto rx=0x7f3a7c004c80 tx=0x7f3a7c0051f0 comp rx=0 tx=0).stop 2026-03-10T07:56:32.132 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:56:32.130+0000 7f3a92c06700 1 -- 192.168.123.105:0/3933871240 shutdown_connections 2026-03-10T07:56:32.132 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:56:32.130+0000 7f3a92c06700 1 --2- 192.168.123.105:0/3933871240 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f3a74077660 0x7f3a74079b20 unknown :-1 s=CLOSED pgs=82 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:56:32.132 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:56:32.130+0000 7f3a92c06700 1 --2- 192.168.123.105:0/3933871240 >> 
[v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3a8c103340 0x7f3a8c198dd0 unknown :-1 s=CLOSED pgs=106 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:56:32.132 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:56:32.130+0000 7f3a92c06700 1 --2- 192.168.123.105:0/3933871240 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f3a8c103cf0 0x7f3a8c199310 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:56:32.132 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:56:32.131+0000 7f3a92c06700 1 -- 192.168.123.105:0/3933871240 >> 192.168.123.105:0/3933871240 conn(0x7f3a8c0feb90 msgr2=0x7f3a8c100f30 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:56:32.132 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:56:32.131+0000 7f3a92c06700 1 -- 192.168.123.105:0/3933871240 shutdown_connections 2026-03-10T07:56:32.132 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:56:32.131+0000 7f3a92c06700 1 -- 192.168.123.105:0/3933871240 wait complete. 
2026-03-10T07:56:32.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:56:31 vm08.local ceph-mon[107898]: from='client.44297 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T07:56:32.191 DEBUG:teuthology.orchestra.run.vm05:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 12e9780e-1c55-11f1-8896-79f7c2e9b508 -e sha1=e911bdebe5c8faa3800735d1568fcdca65db60df -- bash -c 'ceph versions' 2026-03-10T07:56:32.327 INFO:teuthology.orchestra.run.vm05.stderr:Inferring config /var/lib/ceph/12e9780e-1c55-11f1-8896-79f7c2e9b508/mon.vm05/config 2026-03-10T07:56:32.573 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:56:32.571+0000 7f2c162b7700 1 -- 192.168.123.105:0/3638816216 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f2c10100b50 msgr2=0x7f2c10104a40 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:56:32.573 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:56:32.571+0000 7f2c162b7700 1 --2- 192.168.123.105:0/3638816216 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f2c10100b50 0x7f2c10104a40 secure :-1 s=READY pgs=107 cs=0 l=1 rev1=1 crypto rx=0x7f2c00009b00 tx=0x7f2c00009e10 comp rx=0 tx=0).stop 2026-03-10T07:56:32.574 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:56:32.571+0000 7f2c162b7700 1 -- 192.168.123.105:0/3638816216 shutdown_connections 2026-03-10T07:56:32.574 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:56:32.571+0000 7f2c162b7700 1 --2- 192.168.123.105:0/3638816216 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f2c10100b50 0x7f2c10104a40 secure :-1 s=CLOSED pgs=107 cs=0 l=1 rev1=1 crypto rx=0x7f2c00009b00 tx=0x7f2c00009e10 comp rx=0 tx=0).stop 2026-03-10T07:56:32.574 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:56:32.571+0000 7f2c162b7700 1 --2- 192.168.123.105:0/3638816216 >> 
[v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f2c101001a0 0x7f2c10100580 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:56:32.574 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:56:32.571+0000 7f2c162b7700 1 -- 192.168.123.105:0/3638816216 >> 192.168.123.105:0/3638816216 conn(0x7f2c10075960 msgr2=0x7f2c10075d70 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:56:32.574 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:56:32.572+0000 7f2c162b7700 1 -- 192.168.123.105:0/3638816216 shutdown_connections 2026-03-10T07:56:32.574 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:56:32.572+0000 7f2c162b7700 1 -- 192.168.123.105:0/3638816216 wait complete. 2026-03-10T07:56:32.574 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:56:32.573+0000 7f2c162b7700 1 Processor -- start 2026-03-10T07:56:32.574 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:56:32.573+0000 7f2c162b7700 1 -- start start 2026-03-10T07:56:32.574 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:56:32.573+0000 7f2c162b7700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f2c101001a0 0x7f2c10196e20 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:56:32.574 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:56:32.573+0000 7f2c162b7700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f2c10197360 0x7f2c1019b7d0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:56:32.574 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:56:32.573+0000 7f2c162b7700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f2c10197980 con 0x7f2c101001a0 2026-03-10T07:56:32.574 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:56:32.573+0000 7f2c162b7700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap 
magic: 0 v1 -- 0x7f2c10197af0 con 0x7f2c10197360 2026-03-10T07:56:32.575 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:56:32.573+0000 7f2c0ffff700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f2c101001a0 0x7f2c10196e20 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:56:32.575 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:56:32.573+0000 7f2c0ffff700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f2c101001a0 0x7f2c10196e20 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.105:59668/0 (socket says 192.168.123.105:59668) 2026-03-10T07:56:32.575 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:56:32.573+0000 7f2c0ffff700 1 -- 192.168.123.105:0/4268194790 learned_addr learned my addr 192.168.123.105:0/4268194790 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T07:56:32.575 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:56:32.573+0000 7f2c0f7fe700 1 --2- 192.168.123.105:0/4268194790 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f2c10197360 0x7f2c1019b7d0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:56:32.575 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:56:32.574+0000 7f2c0ffff700 1 -- 192.168.123.105:0/4268194790 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f2c10197360 msgr2=0x7f2c1019b7d0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:56:32.575 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:56:32.574+0000 7f2c0ffff700 1 --2- 192.168.123.105:0/4268194790 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f2c10197360 0x7f2c1019b7d0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 
rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:56:32.575 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:56:32.574+0000 7f2c0ffff700 1 -- 192.168.123.105:0/4268194790 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f2c000097e0 con 0x7f2c101001a0 2026-03-10T07:56:32.575 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:56:32.574+0000 7f2c0ffff700 1 --2- 192.168.123.105:0/4268194790 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f2c101001a0 0x7f2c10196e20 secure :-1 s=READY pgs=108 cs=0 l=1 rev1=1 crypto rx=0x7f2bf800ba70 tx=0x7f2bf800be30 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:56:32.575 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:56:32.574+0000 7f2c0d7fa700 1 -- 192.168.123.105:0/4268194790 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f2bf800c780 con 0x7f2c101001a0 2026-03-10T07:56:32.576 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:56:32.574+0000 7f2c162b7700 1 -- 192.168.123.105:0/4268194790 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f2c1019bdd0 con 0x7f2c101001a0 2026-03-10T07:56:32.576 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:56:32.575+0000 7f2c162b7700 1 -- 192.168.123.105:0/4268194790 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f2c1019c320 con 0x7f2c101001a0 2026-03-10T07:56:32.578 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:56:32.576+0000 7f2c0d7fa700 1 -- 192.168.123.105:0/4268194790 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7f2bf800cdc0 con 0x7f2c101001a0 2026-03-10T07:56:32.578 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:56:32.576+0000 7f2c0d7fa700 1 -- 192.168.123.105:0/4268194790 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map 
magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f2bf8012550 con 0x7f2c101001a0 2026-03-10T07:56:32.578 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:56:32.576+0000 7f2c0d7fa700 1 -- 192.168.123.105:0/4268194790 <== mon.0 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 38) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f2bf8012770 con 0x7f2c101001a0 2026-03-10T07:56:32.578 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:56:32.576+0000 7f2c0d7fa700 1 --2- 192.168.123.105:0/4268194790 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f2bfc07be70 0x7f2bfc07e330 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:56:32.578 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:56:32.576+0000 7f2c0d7fa700 1 -- 192.168.123.105:0/4268194790 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(84..84 src has 1..84) v4 ==== 6480+0+0 (secure 0 0 0) 0x7f2bf8099b70 con 0x7f2c101001a0 2026-03-10T07:56:32.578 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:56:32.577+0000 7f2c0f7fe700 1 --2- 192.168.123.105:0/4268194790 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f2bfc07be70 0x7f2bfc07e330 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:56:32.582 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:56:32.577+0000 7f2c162b7700 1 -- 192.168.123.105:0/4268194790 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f2bf0005320 con 0x7f2c101001a0 2026-03-10T07:56:32.582 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:56:32.580+0000 7f2c0f7fe700 1 --2- 192.168.123.105:0/4268194790 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f2bfc07be70 0x7f2bfc07e330 secure :-1 s=READY pgs=83 cs=0 l=1 rev1=1 crypto rx=0x7f2c00006010 tx=0x7f2c0000b540 comp rx=0 
tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:56:32.582 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:56:32.581+0000 7f2c0d7fa700 1 -- 192.168.123.105:0/4268194790 <== mon.0 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f2bf80624b0 con 0x7f2c101001a0 2026-03-10T07:56:32.747 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:56:32.745+0000 7f2c162b7700 1 -- 192.168.123.105:0/4268194790 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "versions"} v 0) v1 -- 0x7f2bf0006200 con 0x7f2c101001a0 2026-03-10T07:56:32.747 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:56:32.746+0000 7f2c0d7fa700 1 -- 192.168.123.105:0/4268194790 <== mon.0 v2:192.168.123.105:3300/0 7 ==== mon_command_ack([{"prefix": "versions"}]=0 v0) v1 ==== 56+0+633 (secure 0 0 0) 0x7f2bf8061c00 con 0x7f2c101001a0 2026-03-10T07:56:32.748 INFO:teuthology.orchestra.run.vm05.stdout:{ 2026-03-10T07:56:32.748 INFO:teuthology.orchestra.run.vm05.stdout: "mon": { 2026-03-10T07:56:32.748 INFO:teuthology.orchestra.run.vm05.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 2 2026-03-10T07:56:32.748 INFO:teuthology.orchestra.run.vm05.stdout: }, 2026-03-10T07:56:32.748 INFO:teuthology.orchestra.run.vm05.stdout: "mgr": { 2026-03-10T07:56:32.748 INFO:teuthology.orchestra.run.vm05.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 2 2026-03-10T07:56:32.748 INFO:teuthology.orchestra.run.vm05.stdout: }, 2026-03-10T07:56:32.748 INFO:teuthology.orchestra.run.vm05.stdout: "osd": { 2026-03-10T07:56:32.748 INFO:teuthology.orchestra.run.vm05.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 6 2026-03-10T07:56:32.748 INFO:teuthology.orchestra.run.vm05.stdout: }, 2026-03-10T07:56:32.748 
INFO:teuthology.orchestra.run.vm05.stdout: "mds": { 2026-03-10T07:56:32.748 INFO:teuthology.orchestra.run.vm05.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 4 2026-03-10T07:56:32.748 INFO:teuthology.orchestra.run.vm05.stdout: }, 2026-03-10T07:56:32.748 INFO:teuthology.orchestra.run.vm05.stdout: "overall": { 2026-03-10T07:56:32.748 INFO:teuthology.orchestra.run.vm05.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 14 2026-03-10T07:56:32.748 INFO:teuthology.orchestra.run.vm05.stdout: } 2026-03-10T07:56:32.748 INFO:teuthology.orchestra.run.vm05.stdout:} 2026-03-10T07:56:32.750 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:56:32.748+0000 7f2c162b7700 1 -- 192.168.123.105:0/4268194790 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f2bfc07be70 msgr2=0x7f2bfc07e330 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:56:32.750 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:56:32.748+0000 7f2c162b7700 1 --2- 192.168.123.105:0/4268194790 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f2bfc07be70 0x7f2bfc07e330 secure :-1 s=READY pgs=83 cs=0 l=1 rev1=1 crypto rx=0x7f2c00006010 tx=0x7f2c0000b540 comp rx=0 tx=0).stop 2026-03-10T07:56:32.750 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:56:32.749+0000 7f2c162b7700 1 -- 192.168.123.105:0/4268194790 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f2c101001a0 msgr2=0x7f2c10196e20 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:56:32.750 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:56:32.749+0000 7f2c162b7700 1 --2- 192.168.123.105:0/4268194790 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f2c101001a0 0x7f2c10196e20 secure :-1 s=READY pgs=108 cs=0 l=1 rev1=1 crypto rx=0x7f2bf800ba70 tx=0x7f2bf800be30 comp rx=0 tx=0).stop 
2026-03-10T07:56:32.750 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:56:32.749+0000 7f2c162b7700 1 -- 192.168.123.105:0/4268194790 shutdown_connections 2026-03-10T07:56:32.750 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:56:32.749+0000 7f2c162b7700 1 --2- 192.168.123.105:0/4268194790 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f2bfc07be70 0x7f2bfc07e330 unknown :-1 s=CLOSED pgs=83 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:56:32.750 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:56:32.749+0000 7f2c162b7700 1 --2- 192.168.123.105:0/4268194790 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f2c101001a0 0x7f2c10196e20 unknown :-1 s=CLOSED pgs=108 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:56:32.750 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:56:32.749+0000 7f2c162b7700 1 --2- 192.168.123.105:0/4268194790 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f2c10197360 0x7f2c1019b7d0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:56:32.750 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:56:32.749+0000 7f2c162b7700 1 -- 192.168.123.105:0/4268194790 >> 192.168.123.105:0/4268194790 conn(0x7f2c10075960 msgr2=0x7f2c100fdab0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:56:32.750 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:56:32.749+0000 7f2c162b7700 1 -- 192.168.123.105:0/4268194790 shutdown_connections 2026-03-10T07:56:32.751 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:56:32.749+0000 7f2c162b7700 1 -- 192.168.123.105:0/4268194790 wait complete. 
2026-03-10T07:56:32.796 DEBUG:teuthology.orchestra.run.vm05:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 12e9780e-1c55-11f1-8896-79f7c2e9b508 -e sha1=e911bdebe5c8faa3800735d1568fcdca65db60df -- bash -c 'echo "wait for servicemap items w/ changing names to refresh"' 2026-03-10T07:56:32.936 INFO:teuthology.orchestra.run.vm05.stderr:Inferring config /var/lib/ceph/12e9780e-1c55-11f1-8896-79f7c2e9b508/mon.vm05/config 2026-03-10T07:56:33.000 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:56:32 vm05.local ceph-mon[130117]: pgmap v166: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail 2026-03-10T07:56:33.000 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:56:32 vm05.local ceph-mon[130117]: from='client.44301 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T07:56:33.000 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:56:32 vm05.local ceph-mon[130117]: from='client.34312 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T07:56:33.000 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:56:32 vm05.local ceph-mon[130117]: from='client.? 192.168.123.105:0/3933871240' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch 2026-03-10T07:56:33.000 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:56:32 vm05.local ceph-mon[130117]: from='client.? 
192.168.123.105:0/4268194790' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T07:56:33.120 INFO:teuthology.orchestra.run.vm05.stdout:wait for servicemap items w/ changing names to refresh 2026-03-10T07:56:33.157 DEBUG:teuthology.orchestra.run.vm05:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 12e9780e-1c55-11f1-8896-79f7c2e9b508 -e sha1=e911bdebe5c8faa3800735d1568fcdca65db60df -- bash -c 'sleep 60' 2026-03-10T07:56:33.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:56:32 vm08.local ceph-mon[107898]: pgmap v166: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail 2026-03-10T07:56:33.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:56:32 vm08.local ceph-mon[107898]: from='client.44301 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T07:56:33.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:56:32 vm08.local ceph-mon[107898]: from='client.34312 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T07:56:33.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:56:32 vm08.local ceph-mon[107898]: from='client.? 192.168.123.105:0/3933871240' entity='client.admin' cmd=[{"prefix": "health", "detail": "detail"}]: dispatch 2026-03-10T07:56:33.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:56:32 vm08.local ceph-mon[107898]: from='client.? 
192.168.123.105:0/4268194790' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T07:56:33.290 INFO:teuthology.orchestra.run.vm05.stderr:Inferring config /var/lib/ceph/12e9780e-1c55-11f1-8896-79f7c2e9b508/mon.vm05/config 2026-03-10T07:56:35.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:56:34 vm05.local ceph-mon[130117]: pgmap v167: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail 2026-03-10T07:56:35.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:56:34 vm08.local ceph-mon[107898]: pgmap v167: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail 2026-03-10T07:56:37.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:56:36 vm05.local ceph-mon[130117]: pgmap v168: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail 2026-03-10T07:56:37.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:56:36 vm08.local ceph-mon[107898]: pgmap v168: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail 2026-03-10T07:56:39.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:56:38 vm05.local ceph-mon[130117]: pgmap v169: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail 2026-03-10T07:56:39.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:56:38 vm08.local ceph-mon[107898]: pgmap v169: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail 2026-03-10T07:56:40.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:56:39 vm05.local ceph-mon[130117]: pgmap v170: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail 2026-03-10T07:56:40.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:56:39 vm08.local ceph-mon[107898]: pgmap v170: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail 2026-03-10T07:56:42.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:56:41 vm05.local ceph-mon[130117]: pgmap v171: 65 pgs: 65 active+clean; 255 MiB 
data, 1.0 GiB used, 119 GiB / 120 GiB avail 2026-03-10T07:56:42.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:56:41 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T07:56:42.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:56:41 vm08.local ceph-mon[107898]: pgmap v171: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail 2026-03-10T07:56:42.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:56:41 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T07:56:44.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:56:43 vm05.local ceph-mon[130117]: pgmap v172: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail 2026-03-10T07:56:44.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:56:43 vm08.local ceph-mon[107898]: pgmap v172: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail 2026-03-10T07:56:45.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:56:45 vm08.local ceph-mon[107898]: pgmap v173: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail 2026-03-10T07:56:46.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:56:45 vm05.local ceph-mon[130117]: pgmap v173: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail 2026-03-10T07:56:48.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:56:47 vm05.local ceph-mon[130117]: pgmap v174: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail 2026-03-10T07:56:48.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:56:47 vm08.local ceph-mon[107898]: pgmap v174: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail 2026-03-10T07:56:50.157 
INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:56:49 vm05.local ceph-mon[130117]: pgmap v175: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail 2026-03-10T07:56:50.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:56:49 vm08.local ceph-mon[107898]: pgmap v175: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail 2026-03-10T07:56:52.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:56:51 vm05.local ceph-mon[130117]: pgmap v176: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail 2026-03-10T07:56:52.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:56:51 vm08.local ceph-mon[107898]: pgmap v176: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail 2026-03-10T07:56:54.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:56:53 vm08.local ceph-mon[107898]: pgmap v177: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail 2026-03-10T07:56:54.407 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:56:53 vm05.local ceph-mon[130117]: pgmap v177: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail 2026-03-10T07:56:56.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:56:55 vm08.local ceph-mon[107898]: pgmap v178: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail 2026-03-10T07:56:56.407 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:56:55 vm05.local ceph-mon[130117]: pgmap v178: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail 2026-03-10T07:56:57.407 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:56:56 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T07:56:57.418 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:56:56 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' 
entity='mgr.vm05.blexke' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
2026-03-10T07:56:58.309 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:56:57 vm05.local ceph-mon[130117]: pgmap v179: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail
2026-03-10T07:56:58.319 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:56:57 vm08.local ceph-mon[107898]: pgmap v179: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail
2026-03-10T07:57:00.407 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:56:59 vm05.local ceph-mon[130117]: pgmap v180: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail
2026-03-10T07:57:00.418 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:56:59 vm08.local ceph-mon[107898]: pgmap v180: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail
2026-03-10T07:57:02.407 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:57:01 vm05.local ceph-mon[130117]: pgmap v181: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail
2026-03-10T07:57:02.418 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:57:01 vm08.local ceph-mon[107898]: pgmap v181: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail
2026-03-10T07:57:04.407 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:57:03 vm05.local ceph-mon[130117]: pgmap v182: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail
2026-03-10T07:57:04.418 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:57:03 vm08.local ceph-mon[107898]: pgmap v182: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail
2026-03-10T07:57:06.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:57:05 vm08.local ceph-mon[107898]: pgmap v183: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail
2026-03-10T07:57:06.407 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:57:05 vm05.local ceph-mon[130117]: pgmap v183: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail
2026-03-10T07:57:08.407 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:57:07 vm05.local ceph-mon[130117]: pgmap v184: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail
2026-03-10T07:57:08.418 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:57:07 vm08.local ceph-mon[107898]: pgmap v184: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail
2026-03-10T07:57:10.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:57:09 vm05.local ceph-mon[130117]: pgmap v185: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail
2026-03-10T07:57:10.418 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:57:09 vm08.local ceph-mon[107898]: pgmap v185: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail
2026-03-10T07:57:12.407 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:57:11 vm05.local ceph-mon[130117]: pgmap v186: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail
2026-03-10T07:57:12.407 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:57:11 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
2026-03-10T07:57:12.418 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:57:11 vm08.local ceph-mon[107898]: pgmap v186: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail
2026-03-10T07:57:12.418 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:57:11 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
2026-03-10T07:57:14.407 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:57:13 vm05.local ceph-mon[130117]: pgmap v187: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail
2026-03-10T07:57:14.418 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:57:13 vm08.local ceph-mon[107898]: pgmap v187: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail
2026-03-10T07:57:16.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:57:15 vm08.local ceph-mon[107898]: pgmap v188: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail
2026-03-10T07:57:16.407 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:57:15 vm05.local ceph-mon[130117]: pgmap v188: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail
2026-03-10T07:57:18.407 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:57:17 vm05.local ceph-mon[130117]: pgmap v189: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail
2026-03-10T07:57:18.418 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:57:17 vm08.local ceph-mon[107898]: pgmap v189: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail
2026-03-10T07:57:20.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:57:19 vm05.local ceph-mon[130117]: pgmap v190: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail
2026-03-10T07:57:20.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:57:19 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch
2026-03-10T07:57:20.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:57:19 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke'
2026-03-10T07:57:20.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:57:19 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke'
2026-03-10T07:57:20.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:57:19 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-10T07:57:20.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:57:19 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
2026-03-10T07:57:20.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:57:19 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke'
2026-03-10T07:57:20.418 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:57:19 vm08.local ceph-mon[107898]: pgmap v190: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail
2026-03-10T07:57:20.418 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:57:19 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch
2026-03-10T07:57:20.418 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:57:19 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke'
2026-03-10T07:57:20.418 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:57:19 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke'
2026-03-10T07:57:20.418 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:57:19 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-10T07:57:20.418 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:57:19 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
2026-03-10T07:57:20.418 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:57:19 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke'
2026-03-10T07:57:22.407 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:57:21 vm05.local ceph-mon[130117]: pgmap v191: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail
2026-03-10T07:57:22.418 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:57:21 vm08.local ceph-mon[107898]: pgmap v191: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail
2026-03-10T07:57:24.407 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:57:24 vm05.local ceph-mon[130117]: pgmap v192: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail
2026-03-10T07:57:24.418 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:57:24 vm08.local ceph-mon[107898]: pgmap v192: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail
2026-03-10T07:57:26.407 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:57:26 vm05.local ceph-mon[130117]: pgmap v193: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail
2026-03-10T07:57:26.418 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:57:26 vm08.local ceph-mon[107898]: pgmap v193: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail
2026-03-10T07:57:27.407 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:57:27 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
2026-03-10T07:57:27.418 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:57:27 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
2026-03-10T07:57:28.308 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:57:28 vm05.local ceph-mon[130117]: pgmap v194: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail
2026-03-10T07:57:28.319 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:57:28 vm08.local ceph-mon[107898]: pgmap v194: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail
2026-03-10T07:57:30.407 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:57:30 vm05.local ceph-mon[130117]: pgmap v195: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail
2026-03-10T07:57:30.418 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:57:30 vm08.local ceph-mon[107898]: pgmap v195: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail
2026-03-10T07:57:32.407 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:57:32 vm05.local ceph-mon[130117]: pgmap v196: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail
2026-03-10T07:57:32.418 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:57:32 vm08.local ceph-mon[107898]: pgmap v196: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail
2026-03-10T07:57:33.499 DEBUG:teuthology.orchestra.run.vm05:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 12e9780e-1c55-11f1-8896-79f7c2e9b508 -e sha1=e911bdebe5c8faa3800735d1568fcdca65db60df -- bash -c 'ceph orch ps'
2026-03-10T07:57:33.651 INFO:teuthology.orchestra.run.vm05.stderr:Inferring config /var/lib/ceph/12e9780e-1c55-11f1-8896-79f7c2e9b508/mon.vm05/config
2026-03-10T07:57:33.925 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:33.923+0000 7f6fd78a8700 1 -- 192.168.123.105:0/2134191094 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f6fd0101280 msgr2=0x7f6fd0101660 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T07:57:33.925 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:33.923+0000 7f6fd78a8700 1 --2- 192.168.123.105:0/2134191094 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f6fd0101280 0x7f6fd0101660 secure :-1 s=READY pgs=109 cs=0 l=1 rev1=1 crypto rx=0x7f6fc0009b50 tx=0x7f6fc0009e60 comp rx=0 tx=0).stop
2026-03-10T07:57:33.925 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:33.924+0000 7f6fd78a8700 1 -- 192.168.123.105:0/2134191094 shutdown_connections
2026-03-10T07:57:33.925 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:33.924+0000 7f6fd78a8700 1 --2- 192.168.123.105:0/2134191094 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f6fd0101c30 0x7f6fd0105b70 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T07:57:33.925 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:33.924+0000 7f6fd78a8700 1 --2- 192.168.123.105:0/2134191094 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f6fd0101280 0x7f6fd0101660 unknown :-1 s=CLOSED pgs=109 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T07:57:33.925 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:33.924+0000 7f6fd78a8700 1 -- 192.168.123.105:0/2134191094 >> 192.168.123.105:0/2134191094 conn(0x7f6fd0078ed0 msgr2=0x7f6fd00792e0 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-10T07:57:33.926 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:33.924+0000 7f6fd78a8700 1 -- 192.168.123.105:0/2134191094 shutdown_connections
2026-03-10T07:57:33.926 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:33.924+0000 7f6fd78a8700 1 -- 192.168.123.105:0/2134191094 wait complete.
2026-03-10T07:57:33.926 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:33.925+0000 7f6fd78a8700 1 Processor -- start
2026-03-10T07:57:33.926 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:33.925+0000 7f6fd78a8700 1 -- start start
2026-03-10T07:57:33.926 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:33.925+0000 7f6fd78a8700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f6fd0101280 0x7f6fd0072980 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T07:57:33.926 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:33.925+0000 7f6fd78a8700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f6fd0101c30 0x7f6fd006d980 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T07:57:33.926 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:33.925+0000 7f6fd78a8700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f6fd006df50 con 0x7f6fd0101c30
2026-03-10T07:57:33.926 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:33.925+0000 7f6fd78a8700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f6fd006e0c0 con 0x7f6fd0101280
2026-03-10T07:57:33.926 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:33.925+0000 7f6fd4e43700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f6fd0101c30 0x7f6fd006d980 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-10T07:57:33.927 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:33.925+0000 7f6fd4e43700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f6fd0101c30 0x7f6fd006d980 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.105:42652/0 (socket says 192.168.123.105:42652)
2026-03-10T07:57:33.927 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:33.925+0000 7f6fd4e43700 1 -- 192.168.123.105:0/3868249056 learned_addr learned my addr 192.168.123.105:0/3868249056 (peer_addr_for_me v2:192.168.123.105:0/0)
2026-03-10T07:57:33.927 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:33.926+0000 7f6fd4e43700 1 -- 192.168.123.105:0/3868249056 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f6fd0101280 msgr2=0x7f6fd0072980 unknown :-1 s=STATE_CONNECTING l=1).mark_down
2026-03-10T07:57:33.927 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:33.926+0000 7f6fd4e43700 1 --2- 192.168.123.105:0/3868249056 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f6fd0101280 0x7f6fd0072980 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T07:57:33.927 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:33.926+0000 7f6fd4e43700 1 -- 192.168.123.105:0/3868249056 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f6fc00097e0 con 0x7f6fd0101c30
2026-03-10T07:57:33.927 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:33.926+0000 7f6fd4e43700 1 --2- 192.168.123.105:0/3868249056 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f6fd0101c30 0x7f6fd006d980 secure :-1 s=READY pgs=110 cs=0 l=1 rev1=1 crypto rx=0x7f6fcc00da10 tx=0x7f6fcc00ddd0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-10T07:57:33.928 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:33.926+0000 7f6fc67fc700 1 -- 192.168.123.105:0/3868249056 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f6fcc0049e0 con 0x7f6fd0101c30
2026-03-10T07:57:33.928 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:33.926+0000 7f6fc67fc700 1 -- 192.168.123.105:0/3868249056 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7f6fcc005500 con 0x7f6fd0101c30
2026-03-10T07:57:33.928 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:33.926+0000 7f6fc67fc700 1 -- 192.168.123.105:0/3868249056 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f6fcc009da0 con 0x7f6fd0101c30
2026-03-10T07:57:33.928 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:33.927+0000 7f6fd78a8700 1 -- 192.168.123.105:0/3868249056 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f6fd006e3a0 con 0x7f6fd0101c30
2026-03-10T07:57:33.928 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:33.927+0000 7f6fd78a8700 1 -- 192.168.123.105:0/3868249056 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f6fd01a2730 con 0x7f6fd0101c30
2026-03-10T07:57:33.930 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:33.928+0000 7f6fc67fc700 1 -- 192.168.123.105:0/3868249056 <== mon.0 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 38) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f6fcc005670 con 0x7f6fd0101c30
2026-03-10T07:57:33.930 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:33.929+0000 7f6fd78a8700 1 -- 192.168.123.105:0/3868249056 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f6fd01094e0 con 0x7f6fd0101c30
2026-03-10T07:57:33.933 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:33.929+0000 7f6fc67fc700 1 --2- 192.168.123.105:0/3868249056 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f6fbc077880 0x7f6fbc079d40 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T07:57:33.933 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:33.929+0000 7f6fc67fc700 1 -- 192.168.123.105:0/3868249056 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(84..84 src has 1..84) v4 ==== 6480+0+0 (secure 0 0 0) 0x7f6fcc099640 con 0x7f6fd0101c30
2026-03-10T07:57:33.933 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:33.932+0000 7f6fd5644700 1 --2- 192.168.123.105:0/3868249056 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f6fbc077880 0x7f6fbc079d40 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-10T07:57:33.933 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:33.932+0000 7f6fd5644700 1 --2- 192.168.123.105:0/3868249056 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f6fbc077880 0x7f6fbc079d40 secure :-1 s=READY pgs=84 cs=0 l=1 rev1=1 crypto rx=0x7f6fc000b5c0 tx=0x7f6fc0005fb0 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-10T07:57:33.933 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:33.932+0000 7f6fc67fc700 1 -- 192.168.123.105:0/3868249056 <== mon.0 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f6fcc061dc0 con 0x7f6fd0101c30
2026-03-10T07:57:34.057 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:34.055+0000 7f6fd78a8700 1 -- 192.168.123.105:0/3868249056 --> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] -- mgr_command(tid 0: {"prefix": "orch ps", "target": ["mon-mgr", ""]}) v1 -- 0x7f6fd006eeb0 con 0x7f6fbc077880
2026-03-10T07:57:34.062 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:34.061+0000 7f6fc67fc700 1 -- 192.168.123.105:0/3868249056 <== mgr.34104 v2:192.168.123.105:6800/627686298 1 ==== mgr_command_reply(tid 0: 0 ) v1 ==== 8+0+3576 (secure 0 0 0) 0x7f6fd006eeb0 con 0x7f6fbc077880
2026-03-10T07:57:34.063 INFO:teuthology.orchestra.run.vm05.stdout:NAME HOST PORTS STATUS REFRESHED AGE MEM USE MEM LIM VERSION IMAGE ID CONTAINER ID
2026-03-10T07:57:34.063 INFO:teuthology.orchestra.run.vm05.stdout:alertmanager.vm05 vm05 *:9093,9094 running (6m) 77s ago 11m 23.5M - 0.25.0 c8568f914cd2 ac15d5f35994
2026-03-10T07:57:34.063 INFO:teuthology.orchestra.run.vm05.stdout:ceph-exporter.vm05 vm05 running (80s) 77s ago 11m 9773k - 19.2.3-678-ge911bdeb 654f31e6858e 918e0750b516
2026-03-10T07:57:34.063 INFO:teuthology.orchestra.run.vm05.stdout:ceph-exporter.vm08 vm08 running (78s) 77s ago 10m 9634k - 19.2.3-678-ge911bdeb 654f31e6858e a87dcf05815c
2026-03-10T07:57:34.063 INFO:teuthology.orchestra.run.vm05.stdout:crash.vm05 vm05 running (4m) 77s ago 11m 7847k - 19.2.3-678-ge911bdeb 654f31e6858e daa831c74cf4
2026-03-10T07:57:34.063 INFO:teuthology.orchestra.run.vm05.stdout:crash.vm08 vm08 running (4m) 77s ago 10m 8262k - 19.2.3-678-ge911bdeb 654f31e6858e 668ac55c9722
2026-03-10T07:57:34.063 INFO:teuthology.orchestra.run.vm05.stdout:grafana.vm05 vm05 *:3000 running (5m) 77s ago 11m 85.9M - 10.4.0 c8b91775d855 6acb529ad951
2026-03-10T07:57:34.063 INFO:teuthology.orchestra.run.vm05.stdout:mds.cephfs.vm05.omfhnh vm05 running (2m) 77s ago 9m 17.5M - 19.2.3-678-ge911bdeb 654f31e6858e fd612446ffa4
2026-03-10T07:57:34.063 INFO:teuthology.orchestra.run.vm05.stdout:mds.cephfs.vm05.pavqil vm05 running (112s) 77s ago 9m 96.6M - 19.2.3-678-ge911bdeb 654f31e6858e f8155f8c8aff
2026-03-10T07:57:34.063 INFO:teuthology.orchestra.run.vm05.stdout:mds.cephfs.vm08.dgsaon vm08 running (85s) 77s ago 9m 15.4M - 19.2.3-678-ge911bdeb 654f31e6858e 1bd3e50b1d3f
2026-03-10T07:57:34.063 INFO:teuthology.orchestra.run.vm05.stdout:mds.cephfs.vm08.ybmbgd vm08 running (104s) 77s ago 9m 21.8M - 19.2.3-678-ge911bdeb 654f31e6858e 24039c2c65e7
2026-03-10T07:57:34.063 INFO:teuthology.orchestra.run.vm05.stdout:mgr.vm05.blexke vm05 *:8443,9283,8765 running (7m) 77s ago 11m 596M - 19.2.3-678-ge911bdeb 654f31e6858e 5467b6a3bd38
2026-03-10T07:57:34.063 INFO:teuthology.orchestra.run.vm05.stdout:mgr.vm08.orfpog vm08 *:8443,9283,8765 running (6m) 77s ago 10m 536M - 19.2.3-678-ge911bdeb 654f31e6858e 7a15d7811d5b
2026-03-10T07:57:34.063 INFO:teuthology.orchestra.run.vm05.stdout:mon.vm05 vm05 running (5m) 77s ago 11m 65.1M 2048M 19.2.3-678-ge911bdeb 654f31e6858e f02f076bb820
2026-03-10T07:57:34.063 INFO:teuthology.orchestra.run.vm05.stdout:mon.vm08 vm08 running (4m) 77s ago 10m 57.2M 2048M 19.2.3-678-ge911bdeb 654f31e6858e 73d9a504f360
2026-03-10T07:57:34.063 INFO:teuthology.orchestra.run.vm05.stdout:node-exporter.vm05 vm05 *:9100 running (6m) 77s ago 11m 10.5M - 1.7.0 72c9c2088986 7cd0b23b4118
2026-03-10T07:57:34.063 INFO:teuthology.orchestra.run.vm05.stdout:node-exporter.vm08 vm08 *:9100 running (6m) 77s ago 10m 10.1M - 1.7.0 72c9c2088986 3dd4d91d5881
2026-03-10T07:57:34.063 INFO:teuthology.orchestra.run.vm05.stdout:osd.0 vm05 running (4m) 77s ago 10m 205M 4096M 19.2.3-678-ge911bdeb 654f31e6858e b35fccc2a4d5
2026-03-10T07:57:34.063 INFO:teuthology.orchestra.run.vm05.stdout:osd.1 vm05 running (3m) 77s ago 10m 107M 4096M 19.2.3-678-ge911bdeb 654f31e6858e e3fe4ad5c6a3
2026-03-10T07:57:34.063 INFO:teuthology.orchestra.run.vm05.stdout:osd.2 vm05 running (3m) 77s ago 10m 117M 4096M 19.2.3-678-ge911bdeb 654f31e6858e 108a77e324b8
2026-03-10T07:57:34.063 INFO:teuthology.orchestra.run.vm05.stdout:osd.3 vm08 running (2m) 77s ago 9m 150M 4096M 19.2.3-678-ge911bdeb 654f31e6858e 3f280bcfe0f5
2026-03-10T07:57:34.063 INFO:teuthology.orchestra.run.vm05.stdout:osd.4 vm08 running (2m) 77s ago 9m 137M 4096M 19.2.3-678-ge911bdeb 654f31e6858e 132c8d288b1e
2026-03-10T07:57:34.063 INFO:teuthology.orchestra.run.vm05.stdout:osd.5 vm08 running (2m) 77s ago 9m 104M 4096M 19.2.3-678-ge911bdeb 654f31e6858e a4a8929822a2
2026-03-10T07:57:34.063 INFO:teuthology.orchestra.run.vm05.stdout:prometheus.vm05 vm05 *:9095 running (6m) 77s ago 10m 58.4M - 2.51.0 1d3b7f56885b c59a6be07563
2026-03-10T07:57:34.066 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:34.064+0000 7f6fd78a8700 1 -- 192.168.123.105:0/3868249056 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f6fbc077880 msgr2=0x7f6fbc079d40 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T07:57:34.066 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:34.064+0000 7f6fd78a8700 1 --2- 192.168.123.105:0/3868249056 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f6fbc077880 0x7f6fbc079d40 secure :-1 s=READY pgs=84 cs=0 l=1 rev1=1 crypto rx=0x7f6fc000b5c0 tx=0x7f6fc0005fb0 comp rx=0 tx=0).stop
2026-03-10T07:57:34.066 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:34.065+0000 7f6fd78a8700 1 -- 192.168.123.105:0/3868249056 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f6fd0101c30 msgr2=0x7f6fd006d980 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T07:57:34.066 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:34.065+0000 7f6fd78a8700 1 --2- 192.168.123.105:0/3868249056 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f6fd0101c30 0x7f6fd006d980 secure :-1 s=READY pgs=110 cs=0 l=1 rev1=1 crypto rx=0x7f6fcc00da10 tx=0x7f6fcc00ddd0 comp rx=0 tx=0).stop
2026-03-10T07:57:34.067 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:34.065+0000 7f6fd78a8700 1 -- 192.168.123.105:0/3868249056 shutdown_connections
2026-03-10T07:57:34.067 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:34.065+0000 7f6fd78a8700 1 --2- 192.168.123.105:0/3868249056 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f6fd0101280 0x7f6fd0072980 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T07:57:34.067 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:34.065+0000 7f6fd78a8700 1 --2- 192.168.123.105:0/3868249056 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f6fbc077880 0x7f6fbc079d40 unknown :-1 s=CLOSED pgs=84 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T07:57:34.067 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:34.065+0000 7f6fd78a8700 1 --2- 192.168.123.105:0/3868249056 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f6fd0101c30 0x7f6fd006d980 unknown :-1 s=CLOSED pgs=110 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T07:57:34.067 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:34.065+0000 7f6fd78a8700 1 -- 192.168.123.105:0/3868249056 >> 192.168.123.105:0/3868249056 conn(0x7f6fd0078ed0 msgr2=0x7f6fd00ffb10 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-10T07:57:34.067 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:34.065+0000 7f6fd78a8700 1 -- 192.168.123.105:0/3868249056 shutdown_connections
2026-03-10T07:57:34.067 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:34.065+0000 7f6fd78a8700 1 -- 192.168.123.105:0/3868249056 wait complete.
2026-03-10T07:57:34.134 DEBUG:teuthology.orchestra.run.vm05:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 12e9780e-1c55-11f1-8896-79f7c2e9b508 -e sha1=e911bdebe5c8faa3800735d1568fcdca65db60df -- bash -c 'ceph versions'
2026-03-10T07:57:34.283 INFO:teuthology.orchestra.run.vm05.stderr:Inferring config /var/lib/ceph/12e9780e-1c55-11f1-8896-79f7c2e9b508/mon.vm05/config
2026-03-10T07:57:34.308 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:57:34 vm05.local ceph-mon[130117]: pgmap v197: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail
2026-03-10T07:57:34.418 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:57:34 vm08.local ceph-mon[107898]: pgmap v197: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail
2026-03-10T07:57:34.541 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:34.539+0000 7fcf1d4f4700 1 -- 192.168.123.105:0/3906241524 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fcf18101c30 msgr2=0x7fcf18105ba0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T07:57:34.541 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:34.539+0000 7fcf1d4f4700 1 --2- 192.168.123.105:0/3906241524 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fcf18101c30 0x7fcf18105ba0 secure :-1 s=READY pgs=111 cs=0 l=1 rev1=1 crypto rx=0x7fcf08009b50 tx=0x7fcf08009e60 comp rx=0 tx=0).stop
2026-03-10T07:57:34.541 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:34.540+0000 7fcf1d4f4700 1 -- 192.168.123.105:0/3906241524 shutdown_connections
2026-03-10T07:57:34.541 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:34.540+0000 7fcf1d4f4700 1 --2- 192.168.123.105:0/3906241524 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fcf18101c30 0x7fcf18105ba0 unknown :-1 s=CLOSED pgs=111 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T07:57:34.541 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:34.540+0000 7fcf1d4f4700 1 --2- 192.168.123.105:0/3906241524 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fcf18101280 0x7fcf18101660 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T07:57:34.541 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:34.540+0000 7fcf1d4f4700 1 -- 192.168.123.105:0/3906241524 >> 192.168.123.105:0/3906241524 conn(0x7fcf18078ea0 msgr2=0x7fcf180792b0 unknown :-1 s=STATE_NONE l=0).mark_down
2026-03-10T07:57:34.541 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:34.540+0000 7fcf1d4f4700 1 -- 192.168.123.105:0/3906241524 shutdown_connections
2026-03-10T07:57:34.542 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:34.540+0000 7fcf1d4f4700 1 -- 192.168.123.105:0/3906241524 wait complete.
2026-03-10T07:57:34.542 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:34.541+0000 7fcf1d4f4700 1 Processor -- start
2026-03-10T07:57:34.542 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:34.541+0000 7fcf1d4f4700 1 -- start start
2026-03-10T07:57:34.542 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:34.541+0000 7fcf1d4f4700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fcf18101280 0x7fcf1819a810 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T07:57:34.542 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:34.541+0000 7fcf1d4f4700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fcf18101c30 0x7fcf1819ad50 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T07:57:34.542 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:34.541+0000 7fcf1d4f4700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fcf1819b3e0 con 0x7fcf18101280
2026-03-10T07:57:34.542 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:34.541+0000 7fcf1d4f4700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fcf18194890 con 0x7fcf18101c30
2026-03-10T07:57:34.543 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:34.541+0000 7fcf16ffd700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fcf18101280 0x7fcf1819a810 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-10T07:57:34.543 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:34.541+0000 7fcf16ffd700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fcf18101280 0x7fcf1819a810 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.105:42666/0 (socket says 192.168.123.105:42666)
2026-03-10T07:57:34.543 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:34.541+0000 7fcf16ffd700 1 -- 192.168.123.105:0/1893878542 learned_addr learned my addr 192.168.123.105:0/1893878542 (peer_addr_for_me v2:192.168.123.105:0/0)
2026-03-10T07:57:34.543 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:34.541+0000 7fcf0ffff700 1 --2- 192.168.123.105:0/1893878542 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fcf18101c30 0x7fcf1819ad50 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-10T07:57:34.543 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:34.542+0000 7fcf16ffd700 1 -- 192.168.123.105:0/1893878542 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fcf18101c30 msgr2=0x7fcf1819ad50 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down
2026-03-10T07:57:34.543 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:34.542+0000 7fcf16ffd700 1 --2- 192.168.123.105:0/1893878542 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fcf18101c30 0x7fcf1819ad50 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop
2026-03-10T07:57:34.543 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:34.542+0000 7fcf16ffd700 1 -- 192.168.123.105:0/1893878542 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fcf080097e0 con 0x7fcf18101280
2026-03-10T07:57:34.543 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:34.542+0000 7fcf16ffd700 1 --2- 192.168.123.105:0/1893878542 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fcf18101280 0x7fcf1819a810 secure :-1 s=READY pgs=112 cs=0 l=1 rev1=1 crypto rx=0x7fcf0000eb10 tx=0x7fcf0000ee20 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-10T07:57:34.545 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:34.543+0000 7fcf14ff9700 1 -- 192.168.123.105:0/1893878542 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fcf0000cc40 con 0x7fcf18101280
2026-03-10T07:57:34.545 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:34.543+0000 7fcf14ff9700 1 -- 192.168.123.105:0/1893878542 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7fcf0000cda0 con 0x7fcf18101280
2026-03-10T07:57:34.545 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:34.543+0000 7fcf14ff9700 1 -- 192.168.123.105:0/1893878542 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fcf00018810 con 0x7fcf18101280
2026-03-10T07:57:34.545 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:34.543+0000 7fcf1d4f4700 1 -- 192.168.123.105:0/1893878542 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fcf18194b70 con 0x7fcf18101280
2026-03-10T07:57:34.545 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:34.543+0000 7fcf1d4f4700 1 -- 192.168.123.105:0/1893878542 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fcf18194fe0 con 0x7fcf18101280
2026-03-10T07:57:34.548 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:34.544+0000 7fcf14ff9700 1 -- 192.168.123.105:0/1893878542 <== mon.0 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 38) v1 ==== 100115+0+0 (secure 0 0 0) 0x7fcf00018a80 con 0x7fcf18101280
2026-03-10T07:57:34.548 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:34.544+0000 7fcf1d4f4700 1 -- 192.168.123.105:0/1893878542 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fcf1804ea90 con 0x7fcf18101280
2026-03-10T07:57:34.548 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:34.547+0000 7fcf14ff9700 1 --2- 192.168.123.105:0/1893878542 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7fcef8077930 0x7fcef8079df0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect
2026-03-10T07:57:34.548 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:34.547+0000 7fcf14ff9700 1 -- 192.168.123.105:0/1893878542 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(84..84 src has 1..84) v4 ==== 6480+0+0 (secure 0 0 0) 0x7fcf00014070 con 0x7fcf18101280
2026-03-10T07:57:34.549 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:34.547+0000 7fcf14ff9700 1 -- 192.168.123.105:0/1893878542 <== mon.0 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7fcf00010ba0 con 0x7fcf18101280
2026-03-10T07:57:34.549 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:34.547+0000 7fcf0ffff700 1 --2- 192.168.123.105:0/1893878542 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7fcef8077930 0x7fcef8079df0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0
2026-03-10T07:57:34.549 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:34.548+0000 7fcf0ffff700 1 --2- 192.168.123.105:0/1893878542 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7fcef8077930 0x7fcef8079df0 secure :-1 s=READY pgs=85 cs=0 l=1 rev1=1 crypto rx=0x7fcf0800b5c0 tx=0x7fcf080058e0 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0
2026-03-10T07:57:34.711 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:34.709+0000 7fcf1d4f4700 1 -- 192.168.123.105:0/1893878542 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "versions"} v 0) v1 -- 0x7fcf18195e40 con 0x7fcf18101280
2026-03-10T07:57:34.711 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:34.710+0000 7fcf14ff9700 1 -- 192.168.123.105:0/1893878542 <== mon.0 v2:192.168.123.105:3300/0 7 ==== mon_command_ack([{"prefix": "versions"}]=0 v0) v1 ==== 56+0+633 (secure 0 0 0) 0x7fcf00063ca0 con 0x7fcf18101280
2026-03-10T07:57:34.711 INFO:teuthology.orchestra.run.vm05.stdout:{
2026-03-10T07:57:34.712 INFO:teuthology.orchestra.run.vm05.stdout:    "mon": {
2026-03-10T07:57:34.712 INFO:teuthology.orchestra.run.vm05.stdout:        "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 2
2026-03-10T07:57:34.712 INFO:teuthology.orchestra.run.vm05.stdout:    },
2026-03-10T07:57:34.712 INFO:teuthology.orchestra.run.vm05.stdout:    "mgr": {
2026-03-10T07:57:34.712 INFO:teuthology.orchestra.run.vm05.stdout:        "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 2
2026-03-10T07:57:34.712 INFO:teuthology.orchestra.run.vm05.stdout:    },
2026-03-10T07:57:34.712 INFO:teuthology.orchestra.run.vm05.stdout:    "osd": {
2026-03-10T07:57:34.712 INFO:teuthology.orchestra.run.vm05.stdout:        "ceph version 
19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 6 2026-03-10T07:57:34.712 INFO:teuthology.orchestra.run.vm05.stdout: }, 2026-03-10T07:57:34.712 INFO:teuthology.orchestra.run.vm05.stdout: "mds": { 2026-03-10T07:57:34.712 INFO:teuthology.orchestra.run.vm05.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 4 2026-03-10T07:57:34.712 INFO:teuthology.orchestra.run.vm05.stdout: }, 2026-03-10T07:57:34.712 INFO:teuthology.orchestra.run.vm05.stdout: "overall": { 2026-03-10T07:57:34.712 INFO:teuthology.orchestra.run.vm05.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)": 14 2026-03-10T07:57:34.712 INFO:teuthology.orchestra.run.vm05.stdout: } 2026-03-10T07:57:34.712 INFO:teuthology.orchestra.run.vm05.stdout:} 2026-03-10T07:57:34.714 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:34.712+0000 7fcf1d4f4700 1 -- 192.168.123.105:0/1893878542 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7fcef8077930 msgr2=0x7fcef8079df0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:57:34.714 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:34.712+0000 7fcf1d4f4700 1 --2- 192.168.123.105:0/1893878542 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7fcef8077930 0x7fcef8079df0 secure :-1 s=READY pgs=85 cs=0 l=1 rev1=1 crypto rx=0x7fcf0800b5c0 tx=0x7fcf080058e0 comp rx=0 tx=0).stop 2026-03-10T07:57:34.714 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:34.713+0000 7fcf1d4f4700 1 -- 192.168.123.105:0/1893878542 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fcf18101280 msgr2=0x7fcf1819a810 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:57:34.714 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:34.713+0000 7fcf1d4f4700 1 --2- 192.168.123.105:0/1893878542 >> 
[v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fcf18101280 0x7fcf1819a810 secure :-1 s=READY pgs=112 cs=0 l=1 rev1=1 crypto rx=0x7fcf0000eb10 tx=0x7fcf0000ee20 comp rx=0 tx=0).stop 2026-03-10T07:57:34.714 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:34.713+0000 7fcf1d4f4700 1 -- 192.168.123.105:0/1893878542 shutdown_connections 2026-03-10T07:57:34.714 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:34.713+0000 7fcf1d4f4700 1 --2- 192.168.123.105:0/1893878542 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7fcef8077930 0x7fcef8079df0 unknown :-1 s=CLOSED pgs=85 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:57:34.714 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:34.713+0000 7fcf1d4f4700 1 --2- 192.168.123.105:0/1893878542 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fcf18101280 0x7fcf1819a810 unknown :-1 s=CLOSED pgs=112 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:57:34.714 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:34.713+0000 7fcf1d4f4700 1 --2- 192.168.123.105:0/1893878542 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fcf18101c30 0x7fcf1819ad50 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:57:34.714 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:34.713+0000 7fcf1d4f4700 1 -- 192.168.123.105:0/1893878542 >> 192.168.123.105:0/1893878542 conn(0x7fcf18078ea0 msgr2=0x7fcf180ffae0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:57:34.714 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:34.713+0000 7fcf1d4f4700 1 -- 192.168.123.105:0/1893878542 shutdown_connections 2026-03-10T07:57:34.714 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:34.713+0000 7fcf1d4f4700 1 -- 192.168.123.105:0/1893878542 wait complete. 
2026-03-10T07:57:34.762 DEBUG:teuthology.orchestra.run.vm05:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 12e9780e-1c55-11f1-8896-79f7c2e9b508 -e sha1=e911bdebe5c8faa3800735d1568fcdca65db60df -- bash -c 'ceph versions | jq -e '"'"'.overall | length == 1'"'"'' 2026-03-10T07:57:34.904 INFO:teuthology.orchestra.run.vm05.stderr:Inferring config /var/lib/ceph/12e9780e-1c55-11f1-8896-79f7c2e9b508/mon.vm05/config 2026-03-10T07:57:35.177 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:35.175+0000 7f3de029b700 1 -- 192.168.123.105:0/3226818929 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3dd8102960 msgr2=0x7f3dd810ae50 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:57:35.177 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:35.175+0000 7f3de029b700 1 --2- 192.168.123.105:0/3226818929 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3dd8102960 0x7f3dd810ae50 secure :-1 s=READY pgs=113 cs=0 l=1 rev1=1 crypto rx=0x7f3dd4009b00 tx=0x7f3dd4009e10 comp rx=0 tx=0).stop 2026-03-10T07:57:35.178 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:35.176+0000 7f3de029b700 1 -- 192.168.123.105:0/3226818929 shutdown_connections 2026-03-10T07:57:35.178 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:35.176+0000 7f3de029b700 1 --2- 192.168.123.105:0/3226818929 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3dd8102960 0x7f3dd810ae50 unknown :-1 s=CLOSED pgs=113 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:57:35.178 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:35.176+0000 7f3de029b700 1 --2- 192.168.123.105:0/3226818929 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f3dd8102040 0x7f3dd8102420 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:57:35.178 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:35.176+0000 7f3de029b700 1 -- 192.168.123.105:0/3226818929 >> 192.168.123.105:0/3226818929 conn(0x7f3dd80fb830 msgr2=0x7f3dd80fdc50 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:57:35.178 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:35.176+0000 7f3de029b700 1 -- 192.168.123.105:0/3226818929 shutdown_connections 2026-03-10T07:57:35.178 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:35.176+0000 7f3de029b700 1 -- 192.168.123.105:0/3226818929 wait complete. 2026-03-10T07:57:35.178 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:35.177+0000 7f3de029b700 1 Processor -- start 2026-03-10T07:57:35.178 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:35.177+0000 7f3de029b700 1 -- start start 2026-03-10T07:57:35.178 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:35.177+0000 7f3de029b700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f3dd8102040 0x7f3dd8198e90 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:57:35.179 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:35.177+0000 7f3de029b700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3dd8102960 0x7f3dd81993d0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:57:35.179 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:35.177+0000 7f3de029b700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f3dd8199ab0 con 0x7f3dd8102960 2026-03-10T07:57:35.179 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:35.177+0000 7f3de029b700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f3dd819d840 con 0x7f3dd8102040 2026-03-10T07:57:35.179 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:35.177+0000 7f3ddd836700 1 --2- >> 
[v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3dd8102960 0x7f3dd81993d0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:57:35.179 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:35.177+0000 7f3ddd836700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3dd8102960 0x7f3dd81993d0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.105:42680/0 (socket says 192.168.123.105:42680) 2026-03-10T07:57:35.179 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:35.177+0000 7f3ddd836700 1 -- 192.168.123.105:0/382246109 learned_addr learned my addr 192.168.123.105:0/382246109 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T07:57:35.179 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:35.178+0000 7f3ddd836700 1 -- 192.168.123.105:0/382246109 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f3dd8102040 msgr2=0x7f3dd8198e90 unknown :-1 s=STATE_CONNECTING_RE l=1).mark_down 2026-03-10T07:57:35.179 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:35.178+0000 7f3ddd836700 1 --2- 192.168.123.105:0/382246109 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f3dd8102040 0x7f3dd8198e90 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:57:35.179 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:35.178+0000 7f3ddd836700 1 -- 192.168.123.105:0/382246109 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f3dd40097e0 con 0x7f3dd8102960 2026-03-10T07:57:35.179 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:35.178+0000 7f3ddd836700 1 --2- 192.168.123.105:0/382246109 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3dd8102960 
0x7f3dd81993d0 secure :-1 s=READY pgs=114 cs=0 l=1 rev1=1 crypto rx=0x7f3dd4009fd0 tx=0x7f3dd4004970 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:57:35.181 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:35.178+0000 7f3dcf7fe700 1 -- 192.168.123.105:0/382246109 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f3dd401d070 con 0x7f3dd8102960 2026-03-10T07:57:35.181 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:35.178+0000 7f3dcf7fe700 1 -- 192.168.123.105:0/382246109 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7f3dd4004b90 con 0x7f3dd8102960 2026-03-10T07:57:35.181 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:35.178+0000 7f3dcf7fe700 1 -- 192.168.123.105:0/382246109 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f3dd400f670 con 0x7f3dd8102960 2026-03-10T07:57:35.181 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:35.178+0000 7f3de029b700 1 -- 192.168.123.105:0/382246109 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f3dd819dac0 con 0x7f3dd8102960 2026-03-10T07:57:35.181 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:35.178+0000 7f3de029b700 1 -- 192.168.123.105:0/382246109 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f3dd819dfb0 con 0x7f3dd8102960 2026-03-10T07:57:35.182 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:35.179+0000 7f3de029b700 1 -- 192.168.123.105:0/382246109 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f3dd810a9b0 con 0x7f3dd8102960 2026-03-10T07:57:35.184 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:35.182+0000 7f3dcf7fe700 1 -- 192.168.123.105:0/382246109 <== mon.0 v2:192.168.123.105:3300/0 4 
==== mgrmap(e 38) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f3dd400bc50 con 0x7f3dd8102960 2026-03-10T07:57:35.184 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:35.183+0000 7f3dcf7fe700 1 --2- 192.168.123.105:0/382246109 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f3dc4077990 0x7f3dc4079e50 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:57:35.184 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:35.183+0000 7f3dcf7fe700 1 -- 192.168.123.105:0/382246109 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(84..84 src has 1..84) v4 ==== 6480+0+0 (secure 0 0 0) 0x7f3dd409bec0 con 0x7f3dd8102960 2026-03-10T07:57:35.184 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:35.183+0000 7f3dde037700 1 --2- 192.168.123.105:0/382246109 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f3dc4077990 0x7f3dc4079e50 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:57:35.185 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:35.184+0000 7f3dcf7fe700 1 -- 192.168.123.105:0/382246109 <== mon.0 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f3dd4063f70 con 0x7f3dd8102960 2026-03-10T07:57:35.185 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:35.184+0000 7f3dde037700 1 --2- 192.168.123.105:0/382246109 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f3dc4077990 0x7f3dc4079e50 secure :-1 s=READY pgs=86 cs=0 l=1 rev1=1 crypto rx=0x7f3dc8009730 tx=0x7f3dc8006cb0 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:57:35.347 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:57:35 vm05.local ceph-mon[130117]: from='client.34324 -' entity='client.admin' 
cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T07:57:35.347 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:57:35 vm05.local ceph-mon[130117]: from='client.? 192.168.123.105:0/1893878542' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T07:57:35.347 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:35.345+0000 7f3de029b700 1 -- 192.168.123.105:0/382246109 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "versions"} v 0) v1 -- 0x7f3dd804ea90 con 0x7f3dd8102960 2026-03-10T07:57:35.350 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:35.349+0000 7f3dcf7fe700 1 -- 192.168.123.105:0/382246109 <== mon.0 v2:192.168.123.105:3300/0 7 ==== mon_command_ack([{"prefix": "versions"}]=0 v0) v1 ==== 56+0+633 (secure 0 0 0) 0x7f3dd4064160 con 0x7f3dd8102960 2026-03-10T07:57:35.353 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:35.351+0000 7f3de029b700 1 -- 192.168.123.105:0/382246109 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f3dc4077990 msgr2=0x7f3dc4079e50 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:57:35.353 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:35.351+0000 7f3de029b700 1 --2- 192.168.123.105:0/382246109 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f3dc4077990 0x7f3dc4079e50 secure :-1 s=READY pgs=86 cs=0 l=1 rev1=1 crypto rx=0x7f3dc8009730 tx=0x7f3dc8006cb0 comp rx=0 tx=0).stop 2026-03-10T07:57:35.353 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:35.351+0000 7f3de029b700 1 -- 192.168.123.105:0/382246109 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3dd8102960 msgr2=0x7f3dd81993d0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:57:35.353 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:35.351+0000 7f3de029b700 1 --2- 192.168.123.105:0/382246109 >> 
[v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3dd8102960 0x7f3dd81993d0 secure :-1 s=READY pgs=114 cs=0 l=1 rev1=1 crypto rx=0x7f3dd4009fd0 tx=0x7f3dd4004970 comp rx=0 tx=0).stop 2026-03-10T07:57:35.353 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:35.351+0000 7f3de029b700 1 -- 192.168.123.105:0/382246109 shutdown_connections 2026-03-10T07:57:35.353 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:35.351+0000 7f3de029b700 1 --2- 192.168.123.105:0/382246109 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f3dd8102040 0x7f3dd8198e90 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:57:35.353 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:35.351+0000 7f3de029b700 1 --2- 192.168.123.105:0/382246109 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f3dc4077990 0x7f3dc4079e50 unknown :-1 s=CLOSED pgs=86 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:57:35.353 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:35.351+0000 7f3de029b700 1 --2- 192.168.123.105:0/382246109 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3dd8102960 0x7f3dd81993d0 unknown :-1 s=CLOSED pgs=114 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:57:35.353 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:35.351+0000 7f3de029b700 1 -- 192.168.123.105:0/382246109 >> 192.168.123.105:0/382246109 conn(0x7f3dd80fb830 msgr2=0x7f3dd8100460 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:57:35.353 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:35.351+0000 7f3de029b700 1 -- 192.168.123.105:0/382246109 shutdown_connections 2026-03-10T07:57:35.353 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:35.352+0000 7f3de029b700 1 -- 192.168.123.105:0/382246109 wait complete. 
2026-03-10T07:57:35.361 INFO:teuthology.orchestra.run.vm05.stdout:true 2026-03-10T07:57:35.412 DEBUG:teuthology.orchestra.run.vm05:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 12e9780e-1c55-11f1-8896-79f7c2e9b508 -e sha1=e911bdebe5c8faa3800735d1568fcdca65db60df -- bash -c 'ceph versions | jq -e '"'"'.overall | keys'"'"' | grep $sha1' 2026-03-10T07:57:35.418 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:57:35 vm08.local ceph-mon[107898]: from='client.34324 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch 2026-03-10T07:57:35.418 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:57:35 vm08.local ceph-mon[107898]: from='client.? 192.168.123.105:0/1893878542' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T07:57:35.555 INFO:teuthology.orchestra.run.vm05.stderr:Inferring config /var/lib/ceph/12e9780e-1c55-11f1-8896-79f7c2e9b508/mon.vm05/config 2026-03-10T07:57:35.807 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:35.805+0000 7f0b9dedb700 1 -- 192.168.123.105:0/302782351 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f0b98100600 msgr2=0x7f0b9810d2c0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:57:35.807 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:35.805+0000 7f0b9dedb700 1 --2- 192.168.123.105:0/302782351 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f0b98100600 0x7f0b9810d2c0 secure :-1 s=READY pgs=115 cs=0 l=1 rev1=1 crypto rx=0x7f0b88009b00 tx=0x7f0b88009e10 comp rx=0 tx=0).stop 2026-03-10T07:57:35.807 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:35.806+0000 7f0b9dedb700 1 -- 192.168.123.105:0/302782351 shutdown_connections 2026-03-10T07:57:35.807 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:35.806+0000 7f0b9dedb700 1 --2- 192.168.123.105:0/302782351 >> 
[v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f0b98100600 0x7f0b9810d2c0 unknown :-1 s=CLOSED pgs=115 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:57:35.807 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:35.806+0000 7f0b9dedb700 1 --2- 192.168.123.105:0/302782351 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f0b980ffc50 0x7f0b98100030 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:57:35.807 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:35.806+0000 7f0b9dedb700 1 -- 192.168.123.105:0/302782351 >> 192.168.123.105:0/302782351 conn(0x7f0b980fb830 msgr2=0x7f0b980fdc50 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:57:35.808 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:35.806+0000 7f0b9dedb700 1 -- 192.168.123.105:0/302782351 shutdown_connections 2026-03-10T07:57:35.808 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:35.807+0000 7f0b9dedb700 1 -- 192.168.123.105:0/302782351 wait complete. 
2026-03-10T07:57:35.808 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:35.807+0000 7f0b9dedb700 1 Processor -- start 2026-03-10T07:57:35.808 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:35.807+0000 7f0b9dedb700 1 -- start start 2026-03-10T07:57:35.808 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:35.807+0000 7f0b9dedb700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f0b980ffc50 0x7f0b98198dc0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:57:35.808 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:35.807+0000 7f0b9dedb700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f0b98100600 0x7f0b98199300 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:57:35.809 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:35.807+0000 7f0b9dedb700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f0b98199950 con 0x7f0b980ffc50 2026-03-10T07:57:35.809 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:35.807+0000 7f0b9dedb700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f0b98199a90 con 0x7f0b98100600 2026-03-10T07:57:35.809 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:35.807+0000 7f0b977fe700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f0b980ffc50 0x7f0b98198dc0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:57:35.809 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:35.808+0000 7f0b977fe700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f0b980ffc50 0x7f0b98198dc0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am 
v2:192.168.123.105:42696/0 (socket says 192.168.123.105:42696) 2026-03-10T07:57:35.809 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:35.808+0000 7f0b977fe700 1 -- 192.168.123.105:0/2275163266 learned_addr learned my addr 192.168.123.105:0/2275163266 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T07:57:35.809 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:35.808+0000 7f0b96ffd700 1 --2- 192.168.123.105:0/2275163266 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f0b98100600 0x7f0b98199300 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:57:35.809 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:35.808+0000 7f0b977fe700 1 -- 192.168.123.105:0/2275163266 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f0b98100600 msgr2=0x7f0b98199300 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:57:35.809 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:35.808+0000 7f0b977fe700 1 --2- 192.168.123.105:0/2275163266 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f0b98100600 0x7f0b98199300 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:57:35.809 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:35.808+0000 7f0b977fe700 1 -- 192.168.123.105:0/2275163266 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f0b880097e0 con 0x7f0b980ffc50 2026-03-10T07:57:35.809 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:35.808+0000 7f0b977fe700 1 --2- 192.168.123.105:0/2275163266 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f0b980ffc50 0x7f0b98198dc0 secure :-1 s=READY pgs=116 cs=0 l=1 rev1=1 crypto rx=0x7f0b8000cc60 tx=0x7f0b800074a0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 
2026-03-10T07:57:35.810 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:35.808+0000 7f0b94ff9700 1 -- 192.168.123.105:0/2275163266 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f0b80007af0 con 0x7f0b980ffc50 2026-03-10T07:57:35.811 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:35.808+0000 7f0b94ff9700 1 -- 192.168.123.105:0/2275163266 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7f0b80007c50 con 0x7f0b980ffc50 2026-03-10T07:57:35.811 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:35.808+0000 7f0b94ff9700 1 -- 192.168.123.105:0/2275163266 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f0b80018730 con 0x7f0b980ffc50 2026-03-10T07:57:35.811 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:35.809+0000 7f0b9dedb700 1 -- 192.168.123.105:0/2275163266 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f0b9819d8e0 con 0x7f0b980ffc50 2026-03-10T07:57:35.811 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:35.809+0000 7f0b9dedb700 1 -- 192.168.123.105:0/2275163266 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f0b9819de00 con 0x7f0b980ffc50 2026-03-10T07:57:35.814 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:35.810+0000 7f0b94ff9700 1 -- 192.168.123.105:0/2275163266 <== mon.0 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 38) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f0b8001f030 con 0x7f0b980ffc50 2026-03-10T07:57:35.814 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:35.810+0000 7f0b9dedb700 1 -- 192.168.123.105:0/2275163266 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f0b9810aa30 con 0x7f0b980ffc50 2026-03-10T07:57:35.814 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:35.811+0000 
7f0b94ff9700 1 --2- 192.168.123.105:0/2275163266 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f0b840778c0 0x7f0b84079d80 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:57:35.814 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:35.811+0000 7f0b94ff9700 1 -- 192.168.123.105:0/2275163266 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(84..84 src has 1..84) v4 ==== 6480+0+0 (secure 0 0 0) 0x7f0b80099c20 con 0x7f0b980ffc50 2026-03-10T07:57:35.814 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:35.811+0000 7f0b96ffd700 1 --2- 192.168.123.105:0/2275163266 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f0b840778c0 0x7f0b84079d80 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:57:35.814 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:35.812+0000 7f0b96ffd700 1 --2- 192.168.123.105:0/2275163266 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f0b840778c0 0x7f0b84079d80 secure :-1 s=READY pgs=87 cs=0 l=1 rev1=1 crypto rx=0x7f0b88000c00 tx=0x7f0b88005fb0 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:57:35.814 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:35.813+0000 7f0b94ff9700 1 -- 192.168.123.105:0/2275163266 <== mon.0 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f0b800623a0 con 0x7f0b980ffc50 2026-03-10T07:57:35.970 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:35.969+0000 7f0b9dedb700 1 -- 192.168.123.105:0/2275163266 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "versions"} v 0) v1 -- 0x7f0b9819a120 con 0x7f0b980ffc50 2026-03-10T07:57:35.971 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:35.969+0000 7f0b94ff9700 1 -- 192.168.123.105:0/2275163266 <== mon.0 v2:192.168.123.105:3300/0 7 ==== mon_command_ack([{"prefix": "versions"}]=0 v0) v1 ==== 56+0+633 (secure 0 0 0) 0x7f0b80061af0 con 0x7f0b980ffc50 2026-03-10T07:57:35.973 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:35.972+0000 7f0b9dedb700 1 -- 192.168.123.105:0/2275163266 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f0b840778c0 msgr2=0x7f0b84079d80 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:57:35.973 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:35.972+0000 7f0b9dedb700 1 --2- 192.168.123.105:0/2275163266 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f0b840778c0 0x7f0b84079d80 secure :-1 s=READY pgs=87 cs=0 l=1 rev1=1 crypto rx=0x7f0b88000c00 tx=0x7f0b88005fb0 comp rx=0 tx=0).stop 2026-03-10T07:57:35.973 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:35.972+0000 7f0b9dedb700 1 -- 192.168.123.105:0/2275163266 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f0b980ffc50 msgr2=0x7f0b98198dc0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:57:35.973 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:35.972+0000 7f0b9dedb700 1 --2- 192.168.123.105:0/2275163266 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f0b980ffc50 0x7f0b98198dc0 secure :-1 s=READY pgs=116 cs=0 l=1 rev1=1 crypto rx=0x7f0b8000cc60 tx=0x7f0b800074a0 comp rx=0 tx=0).stop 2026-03-10T07:57:35.973 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:35.972+0000 7f0b9dedb700 1 -- 192.168.123.105:0/2275163266 shutdown_connections 2026-03-10T07:57:35.973 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:35.972+0000 7f0b9dedb700 1 --2- 192.168.123.105:0/2275163266 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f0b840778c0 0x7f0b84079d80 
unknown :-1 s=CLOSED pgs=87 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:57:35.974 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:35.972+0000 7f0b9dedb700 1 --2- 192.168.123.105:0/2275163266 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f0b980ffc50 0x7f0b98198dc0 unknown :-1 s=CLOSED pgs=116 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:57:35.974 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:35.972+0000 7f0b9dedb700 1 --2- 192.168.123.105:0/2275163266 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f0b98100600 0x7f0b98199300 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:57:35.974 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:35.972+0000 7f0b9dedb700 1 -- 192.168.123.105:0/2275163266 >> 192.168.123.105:0/2275163266 conn(0x7f0b980fb830 msgr2=0x7f0b980fce10 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:57:35.974 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:35.972+0000 7f0b9dedb700 1 -- 192.168.123.105:0/2275163266 shutdown_connections 2026-03-10T07:57:35.974 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:35.972+0000 7f0b9dedb700 1 -- 192.168.123.105:0/2275163266 wait complete. 2026-03-10T07:57:35.983 INFO:teuthology.orchestra.run.vm05.stdout: "ceph version 19.2.3-678-ge911bdeb (e911bdebe5c8faa3800735d1568fcdca65db60df) squid (stable)" 2026-03-10T07:57:36.038 DEBUG:teuthology.parallel:result is None 2026-03-10T07:57:36.038 INFO:teuthology.run_tasks:Running task cephadm.shell... 
2026-03-10T07:57:36.042 INFO:tasks.cephadm:Running commands on role host.a host ubuntu@vm05.local 2026-03-10T07:57:36.042 DEBUG:teuthology.orchestra.run.vm05:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 12e9780e-1c55-11f1-8896-79f7c2e9b508 -- bash -c 'ceph fs dump' 2026-03-10T07:57:36.180 INFO:teuthology.orchestra.run.vm05.stderr:Inferring config /var/lib/ceph/12e9780e-1c55-11f1-8896-79f7c2e9b508/mon.vm05/config 2026-03-10T07:57:36.222 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:57:36 vm05.local ceph-mon[130117]: pgmap v198: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail 2026-03-10T07:57:36.222 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:57:36 vm05.local ceph-mon[130117]: from='client.? 192.168.123.105:0/382246109' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T07:57:36.222 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:57:36 vm05.local ceph-mon[130117]: from='client.? 
192.168.123.105:0/2275163266' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T07:57:36.405 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:36.404+0000 7f86df4ef700 1 -- 192.168.123.105:0/1280879685 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f86d8103cf0 msgr2=0x7f86d8107d40 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:57:36.406 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:36.404+0000 7f86df4ef700 1 --2- 192.168.123.105:0/1280879685 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f86d8103cf0 0x7f86d8107d40 secure :-1 s=READY pgs=117 cs=0 l=1 rev1=1 crypto rx=0x7f86c8009b50 tx=0x7f86c8009e60 comp rx=0 tx=0).stop 2026-03-10T07:57:36.406 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:36.404+0000 7f86df4ef700 1 -- 192.168.123.105:0/1280879685 shutdown_connections 2026-03-10T07:57:36.406 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:36.404+0000 7f86df4ef700 1 --2- 192.168.123.105:0/1280879685 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f86d8103cf0 0x7f86d8107d40 unknown :-1 s=CLOSED pgs=117 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:57:36.406 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:36.404+0000 7f86df4ef700 1 --2- 192.168.123.105:0/1280879685 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f86d8103340 0x7f86d8103720 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:57:36.406 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:36.404+0000 7f86df4ef700 1 -- 192.168.123.105:0/1280879685 >> 192.168.123.105:0/1280879685 conn(0x7f86d80feb90 msgr2=0x7f86d8100fb0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:57:36.406 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:36.404+0000 7f86df4ef700 1 -- 192.168.123.105:0/1280879685 shutdown_connections 2026-03-10T07:57:36.406 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:36.404+0000 7f86df4ef700 1 -- 192.168.123.105:0/1280879685 wait complete. 2026-03-10T07:57:36.406 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:36.405+0000 7f86df4ef700 1 Processor -- start 2026-03-10T07:57:36.406 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:36.405+0000 7f86df4ef700 1 -- start start 2026-03-10T07:57:36.406 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:36.405+0000 7f86df4ef700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f86d8103340 0x7f86d8198e50 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:57:36.407 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:36.405+0000 7f86dd28b700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f86d8103340 0x7f86d8198e50 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:57:36.407 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:36.406+0000 7f86dd28b700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f86d8103340 0x7f86d8198e50 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.105:42714/0 (socket says 192.168.123.105:42714) 2026-03-10T07:57:36.407 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:36.406+0000 7f86dd28b700 1 -- 192.168.123.105:0/1644627006 learned_addr learned my addr 192.168.123.105:0/1644627006 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T07:57:36.407 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:36.406+0000 7f86df4ef700 1 --2- 192.168.123.105:0/1644627006 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f86d8103cf0 0x7f86d8199390 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 
2026-03-10T07:57:36.407 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:36.406+0000 7f86df4ef700 1 -- 192.168.123.105:0/1644627006 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f86d8199a70 con 0x7f86d8103340 2026-03-10T07:57:36.407 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:36.406+0000 7f86df4ef700 1 -- 192.168.123.105:0/1644627006 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f86d819d800 con 0x7f86d8103cf0 2026-03-10T07:57:36.407 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:36.406+0000 7f86dd28b700 1 -- 192.168.123.105:0/1644627006 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f86d8103cf0 msgr2=0x7f86d8199390 unknown :-1 s=STATE_CONNECTING l=1).mark_down 2026-03-10T07:57:36.407 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:36.406+0000 7f86dd28b700 1 --2- 192.168.123.105:0/1644627006 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f86d8103cf0 0x7f86d8199390 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:57:36.407 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:36.406+0000 7f86dd28b700 1 -- 192.168.123.105:0/1644627006 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f86c80097e0 con 0x7f86d8103340 2026-03-10T07:57:36.408 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:36.407+0000 7f86dd28b700 1 --2- 192.168.123.105:0/1644627006 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f86d8103340 0x7f86d8198e50 secure :-1 s=READY pgs=118 cs=0 l=1 rev1=1 crypto rx=0x7f86d400ed70 tx=0x7f86d400c5b0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:57:36.409 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:36.407+0000 7f86ce7fc700 1 -- 192.168.123.105:0/1644627006 <== mon.0 
v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f86d400cd70 con 0x7f86d8103340 2026-03-10T07:57:36.409 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:36.407+0000 7f86ce7fc700 1 -- 192.168.123.105:0/1644627006 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7f86d4010910 con 0x7f86d8103340 2026-03-10T07:57:36.409 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:36.407+0000 7f86ce7fc700 1 -- 192.168.123.105:0/1644627006 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f86d4018980 con 0x7f86d8103340 2026-03-10T07:57:36.409 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:36.407+0000 7f86df4ef700 1 -- 192.168.123.105:0/1644627006 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f86d819dae0 con 0x7f86d8103340 2026-03-10T07:57:36.409 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:36.407+0000 7f86df4ef700 1 -- 192.168.123.105:0/1644627006 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f86d819e030 con 0x7f86d8103340 2026-03-10T07:57:36.412 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:36.409+0000 7f86ce7fc700 1 -- 192.168.123.105:0/1644627006 <== mon.0 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 38) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f86d4010a80 con 0x7f86d8103340 2026-03-10T07:57:36.412 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:36.409+0000 7f86df4ef700 1 -- 192.168.123.105:0/1644627006 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f86d810b740 con 0x7f86d8103340 2026-03-10T07:57:36.412 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:36.409+0000 7f86ce7fc700 1 --2- 192.168.123.105:0/1644627006 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f86c40778c0 
0x7f86c4079d80 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:57:36.413 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:36.409+0000 7f86ce7fc700 1 -- 192.168.123.105:0/1644627006 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(84..84 src has 1..84) v4 ==== 6480+0+0 (secure 0 0 0) 0x7f86d4014070 con 0x7f86d8103340 2026-03-10T07:57:36.413 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:36.412+0000 7f86dca8a700 1 --2- 192.168.123.105:0/1644627006 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f86c40778c0 0x7f86c4079d80 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:57:36.413 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:36.412+0000 7f86ce7fc700 1 -- 192.168.123.105:0/1644627006 <== mon.0 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f86d4062cf0 con 0x7f86d8103340 2026-03-10T07:57:36.413 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:36.412+0000 7f86dca8a700 1 --2- 192.168.123.105:0/1644627006 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f86c40778c0 0x7f86c4079d80 secure :-1 s=READY pgs=88 cs=0 l=1 rev1=1 crypto rx=0x7f86c800b5c0 tx=0x7f86c8005fb0 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:57:36.418 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:57:36 vm08.local ceph-mon[107898]: pgmap v198: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail 2026-03-10T07:57:36.418 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:57:36 vm08.local ceph-mon[107898]: from='client.? 
192.168.123.105:0/382246109' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T07:57:36.418 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:57:36 vm08.local ceph-mon[107898]: from='client.? 192.168.123.105:0/2275163266' entity='client.admin' cmd=[{"prefix": "versions"}]: dispatch 2026-03-10T07:57:36.554 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:36.552+0000 7f86df4ef700 1 -- 192.168.123.105:0/1644627006 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "fs dump"} v 0) v1 -- 0x7f86d819a1b0 con 0x7f86d8103340 2026-03-10T07:57:36.554 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:36.553+0000 7f86ce7fc700 1 -- 192.168.123.105:0/1644627006 <== mon.0 v2:192.168.123.105:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump"}]=0 dumped fsmap epoch 35 v35) v1 ==== 76+0+1969 (secure 0 0 0) 0x7f86d4062440 con 0x7f86d8103340 2026-03-10T07:57:36.557 INFO:teuthology.orchestra.run.vm05.stdout:e35 2026-03-10T07:57:36.557 INFO:teuthology.orchestra.run.vm05.stdout:btime 2026-03-10T07:56:18:951575+0000 2026-03-10T07:57:36.557 INFO:teuthology.orchestra.run.vm05.stdout:enable_multiple, ever_enabled_multiple: 1,1 2026-03-10T07:57:36.557 INFO:teuthology.orchestra.run.vm05.stdout:default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2} 2026-03-10T07:57:36.557 INFO:teuthology.orchestra.run.vm05.stdout:legacy client fscid: 1 2026-03-10T07:57:36.557 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-10T07:57:36.557 INFO:teuthology.orchestra.run.vm05.stdout:Filesystem 'cephfs' (1) 2026-03-10T07:57:36.557 INFO:teuthology.orchestra.run.vm05.stdout:fs_name cephfs 2026-03-10T07:57:36.557 INFO:teuthology.orchestra.run.vm05.stdout:epoch 35 2026-03-10T07:57:36.557 INFO:teuthology.orchestra.run.vm05.stdout:flags 12 
joinable allow_snaps allow_multimds_snaps 2026-03-10T07:57:36.557 INFO:teuthology.orchestra.run.vm05.stdout:created 2026-03-10T07:48:24.309293+0000 2026-03-10T07:57:36.557 INFO:teuthology.orchestra.run.vm05.stdout:modified 2026-03-10T07:56:18.018738+0000 2026-03-10T07:57:36.557 INFO:teuthology.orchestra.run.vm05.stdout:tableserver 0 2026-03-10T07:57:36.557 INFO:teuthology.orchestra.run.vm05.stdout:root 0 2026-03-10T07:57:36.557 INFO:teuthology.orchestra.run.vm05.stdout:session_timeout 60 2026-03-10T07:57:36.557 INFO:teuthology.orchestra.run.vm05.stdout:session_autoclose 300 2026-03-10T07:57:36.557 INFO:teuthology.orchestra.run.vm05.stdout:max_file_size 1099511627776 2026-03-10T07:57:36.557 INFO:teuthology.orchestra.run.vm05.stdout:max_xattr_size 65536 2026-03-10T07:57:36.557 INFO:teuthology.orchestra.run.vm05.stdout:required_client_features {} 2026-03-10T07:57:36.557 INFO:teuthology.orchestra.run.vm05.stdout:last_failure 0 2026-03-10T07:57:36.557 INFO:teuthology.orchestra.run.vm05.stdout:last_failure_osd_epoch 84 2026-03-10T07:57:36.557 INFO:teuthology.orchestra.run.vm05.stdout:compat compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2,11=minor log segments,12=quiesce subvolumes} 2026-03-10T07:57:36.557 INFO:teuthology.orchestra.run.vm05.stdout:max_mds 1 2026-03-10T07:57:36.557 INFO:teuthology.orchestra.run.vm05.stdout:in 0 2026-03-10T07:57:36.557 INFO:teuthology.orchestra.run.vm05.stdout:up {0=34274} 2026-03-10T07:57:36.557 INFO:teuthology.orchestra.run.vm05.stdout:failed 2026-03-10T07:57:36.557 INFO:teuthology.orchestra.run.vm05.stdout:damaged 2026-03-10T07:57:36.557 INFO:teuthology.orchestra.run.vm05.stdout:stopped 2026-03-10T07:57:36.557 INFO:teuthology.orchestra.run.vm05.stdout:data_pools [3] 2026-03-10T07:57:36.557 
INFO:teuthology.orchestra.run.vm05.stdout:metadata_pool 2 2026-03-10T07:57:36.557 INFO:teuthology.orchestra.run.vm05.stdout:inline_data enabled 2026-03-10T07:57:36.557 INFO:teuthology.orchestra.run.vm05.stdout:balancer 2026-03-10T07:57:36.558 INFO:teuthology.orchestra.run.vm05.stdout:bal_rank_mask -1 2026-03-10T07:57:36.558 INFO:teuthology.orchestra.run.vm05.stdout:standby_count_wanted 1 2026-03-10T07:57:36.558 INFO:teuthology.orchestra.run.vm05.stdout:qdb_cluster leader: 34274 members: 34274 2026-03-10T07:57:36.558 INFO:teuthology.orchestra.run.vm05.stdout:[mds.cephfs.vm05.pavqil{0:34274} state up:active seq 10 join_fscid=1 addr [v2:192.168.123.105:6828/426813062,v1:192.168.123.105:6829/426813062] compat {c=[1],r=[1],i=[1fff]}] 2026-03-10T07:57:36.558 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-10T07:57:36.558 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-10T07:57:36.558 INFO:teuthology.orchestra.run.vm05.stdout:Standby daemons: 2026-03-10T07:57:36.558 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-10T07:57:36.558 INFO:teuthology.orchestra.run.vm05.stdout:[mds.cephfs.vm05.omfhnh{-1:44255} state up:standby seq 1 join_fscid=1 addr [v2:192.168.123.105:6826/4251520120,v1:192.168.123.105:6827/4251520120] compat {c=[1],r=[1],i=[1fff]}] 2026-03-10T07:57:36.558 INFO:teuthology.orchestra.run.vm05.stdout:[mds.cephfs.vm08.ybmbgd{-1:44263} state up:standby seq 1 join_fscid=1 addr [v2:192.168.123.108:6824/1209244,v1:192.168.123.108:6825/1209244] compat {c=[1],r=[1],i=[1fff]}] 2026-03-10T07:57:36.558 INFO:teuthology.orchestra.run.vm05.stdout:[mds.cephfs.vm08.dgsaon{-1:44289} state up:standby seq 1 join_fscid=1 addr [v2:192.168.123.108:6826/3010391204,v1:192.168.123.108:6827/3010391204] compat {c=[1],r=[1],i=[1fff]}] 2026-03-10T07:57:36.558 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:36.557+0000 7f86df4ef700 1 -- 192.168.123.105:0/1644627006 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f86c40778c0 
msgr2=0x7f86c4079d80 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:57:36.558 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:36.557+0000 7f86df4ef700 1 --2- 192.168.123.105:0/1644627006 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f86c40778c0 0x7f86c4079d80 secure :-1 s=READY pgs=88 cs=0 l=1 rev1=1 crypto rx=0x7f86c800b5c0 tx=0x7f86c8005fb0 comp rx=0 tx=0).stop 2026-03-10T07:57:36.558 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:36.557+0000 7f86df4ef700 1 -- 192.168.123.105:0/1644627006 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f86d8103340 msgr2=0x7f86d8198e50 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:57:36.559 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:36.557+0000 7f86df4ef700 1 --2- 192.168.123.105:0/1644627006 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f86d8103340 0x7f86d8198e50 secure :-1 s=READY pgs=118 cs=0 l=1 rev1=1 crypto rx=0x7f86d400ed70 tx=0x7f86d400c5b0 comp rx=0 tx=0).stop 2026-03-10T07:57:36.559 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:36.558+0000 7f86df4ef700 1 -- 192.168.123.105:0/1644627006 shutdown_connections 2026-03-10T07:57:36.559 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:36.558+0000 7f86df4ef700 1 --2- 192.168.123.105:0/1644627006 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f86c40778c0 0x7f86c4079d80 unknown :-1 s=CLOSED pgs=88 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:57:36.559 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:36.558+0000 7f86df4ef700 1 --2- 192.168.123.105:0/1644627006 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f86d8103340 0x7f86d8198e50 unknown :-1 s=CLOSED pgs=118 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:57:36.559 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:36.558+0000 7f86df4ef700 
1 --2- 192.168.123.105:0/1644627006 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f86d8103cf0 0x7f86d8199390 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:57:36.559 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:36.558+0000 7f86df4ef700 1 -- 192.168.123.105:0/1644627006 >> 192.168.123.105:0/1644627006 conn(0x7f86d80feb90 msgr2=0x7f86d8100fa0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:57:36.559 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:36.558+0000 7f86df4ef700 1 -- 192.168.123.105:0/1644627006 shutdown_connections 2026-03-10T07:57:36.560 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:36.559+0000 7f86df4ef700 1 -- 192.168.123.105:0/1644627006 wait complete. 2026-03-10T07:57:36.560 INFO:teuthology.orchestra.run.vm05.stderr:dumped fsmap epoch 35 2026-03-10T07:57:36.620 INFO:teuthology.run_tasks:Running task fs.post_upgrade_checks... 2026-03-10T07:57:36.622 DEBUG:teuthology.orchestra.run.vm05:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell --fsid 12e9780e-1c55-11f1-8896-79f7c2e9b508 -- ceph fs dump --format=json 2026-03-10T07:57:36.758 INFO:teuthology.orchestra.run.vm05.stderr:Inferring config /var/lib/ceph/12e9780e-1c55-11f1-8896-79f7c2e9b508/mon.vm05/config 2026-03-10T07:57:36.998 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:36.995+0000 7f4d95f4f700 1 -- 192.168.123.105:0/3642134929 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f4d90100600 msgr2=0x7f4d9010d2c0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:57:36.998 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:36.995+0000 7f4d95f4f700 1 --2- 192.168.123.105:0/3642134929 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f4d90100600 0x7f4d9010d2c0 secure :-1 s=READY pgs=119 cs=0 l=1 rev1=1 crypto rx=0x7f4d80009b00 tx=0x7f4d80009e10 comp rx=0 tx=0).stop 2026-03-10T07:57:36.998 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:36.996+0000 7f4d95f4f700 1 -- 192.168.123.105:0/3642134929 shutdown_connections 2026-03-10T07:57:36.998 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:36.996+0000 7f4d95f4f700 1 --2- 192.168.123.105:0/3642134929 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f4d90100600 0x7f4d9010d2c0 unknown :-1 s=CLOSED pgs=119 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:57:36.998 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:36.996+0000 7f4d95f4f700 1 --2- 192.168.123.105:0/3642134929 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f4d900ffc50 0x7f4d90100030 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:57:36.998 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:36.996+0000 7f4d95f4f700 1 -- 192.168.123.105:0/3642134929 >> 192.168.123.105:0/3642134929 conn(0x7f4d900fb830 msgr2=0x7f4d900fdc50 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:57:36.998 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:36.996+0000 7f4d95f4f700 1 -- 192.168.123.105:0/3642134929 shutdown_connections 2026-03-10T07:57:36.998 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:36.996+0000 7f4d95f4f700 1 -- 192.168.123.105:0/3642134929 wait complete. 
2026-03-10T07:57:36.998 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:36.997+0000 7f4d95f4f700 1 Processor -- start 2026-03-10T07:57:36.998 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:36.997+0000 7f4d95f4f700 1 -- start start 2026-03-10T07:57:36.998 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:36.997+0000 7f4d95f4f700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f4d900ffc50 0x7f4d90198db0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:57:36.999 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:36.997+0000 7f4d95f4f700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f4d90100600 0x7f4d901992f0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:57:36.999 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:36.997+0000 7f4d95f4f700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f4d901999d0 con 0x7f4d900ffc50 2026-03-10T07:57:36.999 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:36.997+0000 7f4d95f4f700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f4d9019d760 con 0x7f4d90100600 2026-03-10T07:57:36.999 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:36.998+0000 7f4d8f7fe700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f4d900ffc50 0x7f4d90198db0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:57:36.999 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:36.998+0000 7f4d8f7fe700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f4d900ffc50 0x7f4d90198db0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am 
v2:192.168.123.105:42722/0 (socket says 192.168.123.105:42722) 2026-03-10T07:57:36.999 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:36.998+0000 7f4d8f7fe700 1 -- 192.168.123.105:0/4136815779 learned_addr learned my addr 192.168.123.105:0/4136815779 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T07:57:36.999 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:36.998+0000 7f4d8effd700 1 --2- 192.168.123.105:0/4136815779 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f4d90100600 0x7f4d901992f0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:57:37.000 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:36.998+0000 7f4d8f7fe700 1 -- 192.168.123.105:0/4136815779 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f4d90100600 msgr2=0x7f4d901992f0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:57:37.001 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:36.998+0000 7f4d8f7fe700 1 --2- 192.168.123.105:0/4136815779 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f4d90100600 0x7f4d901992f0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:57:37.004 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:36.998+0000 7f4d8f7fe700 1 -- 192.168.123.105:0/4136815779 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f4d800097e0 con 0x7f4d900ffc50 2026-03-10T07:57:37.004 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:36.998+0000 7f4d8f7fe700 1 --2- 192.168.123.105:0/4136815779 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f4d900ffc50 0x7f4d90198db0 secure :-1 s=READY pgs=120 cs=0 l=1 rev1=1 crypto rx=0x7f4d7800ba70 tx=0x7f4d7800bd80 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 
2026-03-10T07:57:37.004 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:36.998+0000 7f4d8cff9700 1 -- 192.168.123.105:0/4136815779 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f4d7800c700 con 0x7f4d900ffc50 2026-03-10T07:57:37.004 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:36.998+0000 7f4d8cff9700 1 -- 192.168.123.105:0/4136815779 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7f4d7800cd40 con 0x7f4d900ffc50 2026-03-10T07:57:37.004 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:36.998+0000 7f4d8cff9700 1 -- 192.168.123.105:0/4136815779 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f4d78012340 con 0x7f4d900ffc50 2026-03-10T07:57:37.004 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:36.998+0000 7f4d95f4f700 1 -- 192.168.123.105:0/4136815779 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f4d9019da40 con 0x7f4d900ffc50 2026-03-10T07:57:37.004 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:36.998+0000 7f4d95f4f700 1 -- 192.168.123.105:0/4136815779 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f4d9019df90 con 0x7f4d900ffc50 2026-03-10T07:57:37.004 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:37.000+0000 7f4d8cff9700 1 -- 192.168.123.105:0/4136815779 <== mon.0 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 38) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f4d780124e0 con 0x7f4d900ffc50 2026-03-10T07:57:37.004 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:37.000+0000 7f4d95f4f700 1 -- 192.168.123.105:0/4136815779 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f4d9010aa30 con 0x7f4d900ffc50 2026-03-10T07:57:37.004 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:37.001+0000 
7f4d8cff9700 1 --2- 192.168.123.105:0/4136815779 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f4d7c077910 0x7f4d7c079dd0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:57:37.004 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:37.001+0000 7f4d8cff9700 1 -- 192.168.123.105:0/4136815779 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(84..84 src has 1..84) v4 ==== 6480+0+0 (secure 0 0 0) 0x7f4d78099a80 con 0x7f4d900ffc50 2026-03-10T07:57:37.004 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:37.001+0000 7f4d8effd700 1 --2- 192.168.123.105:0/4136815779 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f4d7c077910 0x7f4d7c079dd0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:57:37.004 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:37.001+0000 7f4d8effd700 1 --2- 192.168.123.105:0/4136815779 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f4d7c077910 0x7f4d7c079dd0 secure :-1 s=READY pgs=89 cs=0 l=1 rev1=1 crypto rx=0x7f4d9019a3d0 tx=0x7f4d80005fd0 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:57:37.005 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:37.003+0000 7f4d8cff9700 1 -- 192.168.123.105:0/4136815779 <== mon.0 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f4d78062340 con 0x7f4d900ffc50 2026-03-10T07:57:37.142 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:57:37 vm05.local ceph-mon[130117]: from='client.? 
192.168.123.105:0/1644627006' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch 2026-03-10T07:57:37.142 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:37.140+0000 7f4d95f4f700 1 -- 192.168.123.105:0/4136815779 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "fs dump", "format": "json"} v 0) v1 -- 0x7f4d9004ea90 con 0x7f4d900ffc50 2026-03-10T07:57:37.144 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:37.143+0000 7f4d8cff9700 1 -- 192.168.123.105:0/4136815779 <== mon.0 v2:192.168.123.105:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump", "format": "json"}]=0 dumped fsmap epoch 35 v35) v1 ==== 94+0+5253 (secure 0 0 0) 0x7f4d78061a90 con 0x7f4d900ffc50 2026-03-10T07:57:37.144 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-10T07:57:37.144 INFO:teuthology.orchestra.run.vm05.stdout:{"epoch":35,"btime":"2026-03-10T07:56:18:951575+0000","default_fscid":1,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"feature_flags":{"enable_multiple":true,"ever_enabled_multiple":true},"standbys":[{"gid":44255,"name":"cephfs.vm05.omfhnh","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.105:6827/4251520120","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.105:6826","nonce":4251520120},{"type":"v1","addr":"192.168.123.105:6827","nonce":4251520120}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned 
encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce subvolumes"}},"epoch":16},{"gid":44263,"name":"cephfs.vm08.ybmbgd","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.108:6825/1209244","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.108:6824","nonce":1209244},{"type":"v1","addr":"192.168.123.108:6825","nonce":1209244}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce subvolumes"}},"epoch":25},{"gid":44289,"name":"cephfs.vm08.dgsaon","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.108:6827/3010391204","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.108:6826","nonce":3010391204},{"type":"v1","addr":"192.168.123.108:6827","nonce":3010391204}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce 
subvolumes"}},"epoch":33}],"filesystems":[{"mdsmap":{"epoch":35,"flags":18,"flags_state":{"joinable":true,"allow_snaps":true,"allow_multimds_snaps":true,"allow_standby_replay":false,"refuse_client_session":false,"refuse_standby_for_another_fs":false,"balance_automate":false},"ever_allowed_features":0,"explicitly_allowed_features":0,"created":"2026-03-10T07:48:24.309293+0000","modified":"2026-03-10T07:56:18.018738+0000","tableserver":0,"root":0,"session_timeout":60,"session_autoclose":300,"required_client_features":{},"max_file_size":1099511627776,"max_xattr_size":65536,"last_failure":0,"last_failure_osd_epoch":84,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce subvolumes"}},"max_mds":1,"in":[0],"up":{"mds_0":34274},"failed":[],"damaged":[],"stopped":[],"info":{"gid_34274":{"gid":34274,"name":"cephfs.vm05.pavqil","rank":0,"incarnation":30,"state":"up:active","state_seq":10,"addr":"192.168.123.105:6829/426813062","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.105:6828","nonce":426813062},{"type":"v1","addr":"192.168.123.105:6829","nonce":426813062}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log 
segments","feature_12":"quiesce subvolumes"}}}},"data_pools":[3],"metadata_pool":2,"enabled":true,"fs_name":"cephfs","balancer":"","bal_rank_mask":"-1","standby_count_wanted":1,"qdb_leader":34274,"qdb_cluster":[34274]},"id":1}]} 2026-03-10T07:57:37.147 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:37.146+0000 7f4d95f4f700 1 -- 192.168.123.105:0/4136815779 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f4d7c077910 msgr2=0x7f4d7c079dd0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:57:37.147 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:37.146+0000 7f4d95f4f700 1 --2- 192.168.123.105:0/4136815779 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f4d7c077910 0x7f4d7c079dd0 secure :-1 s=READY pgs=89 cs=0 l=1 rev1=1 crypto rx=0x7f4d9019a3d0 tx=0x7f4d80005fd0 comp rx=0 tx=0).stop 2026-03-10T07:57:37.147 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:37.146+0000 7f4d95f4f700 1 -- 192.168.123.105:0/4136815779 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f4d900ffc50 msgr2=0x7f4d90198db0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:57:37.147 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:37.146+0000 7f4d95f4f700 1 --2- 192.168.123.105:0/4136815779 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f4d900ffc50 0x7f4d90198db0 secure :-1 s=READY pgs=120 cs=0 l=1 rev1=1 crypto rx=0x7f4d7800ba70 tx=0x7f4d7800bd80 comp rx=0 tx=0).stop 2026-03-10T07:57:37.147 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:37.146+0000 7f4d95f4f700 1 -- 192.168.123.105:0/4136815779 shutdown_connections 2026-03-10T07:57:37.147 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:37.146+0000 7f4d95f4f700 1 --2- 192.168.123.105:0/4136815779 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f4d7c077910 0x7f4d7c079dd0 unknown :-1 s=CLOSED pgs=89 cs=0 l=1 rev1=1 
crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:57:37.147 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:37.146+0000 7f4d95f4f700 1 --2- 192.168.123.105:0/4136815779 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f4d900ffc50 0x7f4d90198db0 unknown :-1 s=CLOSED pgs=120 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:57:37.147 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:37.146+0000 7f4d95f4f700 1 --2- 192.168.123.105:0/4136815779 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f4d90100600 0x7f4d901992f0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:57:37.147 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:37.146+0000 7f4d95f4f700 1 -- 192.168.123.105:0/4136815779 >> 192.168.123.105:0/4136815779 conn(0x7f4d900fb830 msgr2=0x7f4d900fcdf0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:57:37.147 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:37.146+0000 7f4d95f4f700 1 -- 192.168.123.105:0/4136815779 shutdown_connections 2026-03-10T07:57:37.148 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:37.146+0000 7f4d95f4f700 1 -- 192.168.123.105:0/4136815779 wait complete. 2026-03-10T07:57:37.148 INFO:teuthology.orchestra.run.vm05.stderr:dumped fsmap epoch 35 2026-03-10T07:57:37.208 DEBUG:tasks.fs:checking fs fscid=1,name=cephfs state = {'epoch': 10, 'max_mds': 1, 'flags': 18} 2026-03-10T07:57:37.208 DEBUG:teuthology.orchestra.run.vm05:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell --fsid 12e9780e-1c55-11f1-8896-79f7c2e9b508 -- ceph fs dump --format=json 11 2026-03-10T07:57:37.356 INFO:teuthology.orchestra.run.vm05.stderr:Inferring config /var/lib/ceph/12e9780e-1c55-11f1-8896-79f7c2e9b508/mon.vm05/config 2026-03-10T07:57:37.418 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:57:37 vm08.local ceph-mon[107898]: from='client.? 
192.168.123.105:0/1644627006' entity='client.admin' cmd=[{"prefix": "fs dump"}]: dispatch 2026-03-10T07:57:37.605 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:37.603+0000 7f387ab59700 1 -- 192.168.123.105:0/1690773600 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3874103cf0 msgr2=0x7f3874107d40 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:57:37.605 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:37.603+0000 7f387ab59700 1 --2- 192.168.123.105:0/1690773600 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3874103cf0 0x7f3874107d40 secure :-1 s=READY pgs=121 cs=0 l=1 rev1=1 crypto rx=0x7f3868009b50 tx=0x7f3868009e60 comp rx=0 tx=0).stop 2026-03-10T07:57:37.605 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:37.604+0000 7f387ab59700 1 -- 192.168.123.105:0/1690773600 shutdown_connections 2026-03-10T07:57:37.605 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:37.604+0000 7f387ab59700 1 --2- 192.168.123.105:0/1690773600 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3874103cf0 0x7f3874107d40 unknown :-1 s=CLOSED pgs=121 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:57:37.605 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:37.604+0000 7f387ab59700 1 --2- 192.168.123.105:0/1690773600 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f3874103340 0x7f3874103720 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:57:37.605 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:37.604+0000 7f387ab59700 1 -- 192.168.123.105:0/1690773600 >> 192.168.123.105:0/1690773600 conn(0x7f38740feb90 msgr2=0x7f3874100fb0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:57:37.605 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:37.604+0000 7f387ab59700 1 -- 192.168.123.105:0/1690773600 shutdown_connections 2026-03-10T07:57:37.605 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:37.604+0000 7f387ab59700 1 -- 192.168.123.105:0/1690773600 wait complete. 2026-03-10T07:57:37.605 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:37.604+0000 7f387ab59700 1 Processor -- start 2026-03-10T07:57:37.606 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:37.605+0000 7f387ab59700 1 -- start start 2026-03-10T07:57:37.606 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:37.605+0000 7f387ab59700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f3874103340 0x7f3874198de0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:57:37.606 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:37.605+0000 7f387ab59700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3874103cf0 0x7f3874199320 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:57:37.606 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:37.605+0000 7f387ab59700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f3874199a00 con 0x7f3874103cf0 2026-03-10T07:57:37.606 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:37.605+0000 7f387ab59700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f387419d790 con 0x7f3874103340 2026-03-10T07:57:37.606 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:37.605+0000 7f38788f5700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f3874103340 0x7f3874198de0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:57:37.606 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:37.605+0000 7f3873fff700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3874103cf0 0x7f3874199320 unknown :-1 
s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:57:37.606 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:37.605+0000 7f38788f5700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f3874103340 0x7f3874198de0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.108:3300/0 says I am v2:192.168.123.105:44170/0 (socket says 192.168.123.105:44170) 2026-03-10T07:57:37.606 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:37.605+0000 7f38788f5700 1 -- 192.168.123.105:0/2691278701 learned_addr learned my addr 192.168.123.105:0/2691278701 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T07:57:37.607 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:37.605+0000 7f38788f5700 1 -- 192.168.123.105:0/2691278701 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3874103cf0 msgr2=0x7f3874199320 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:57:37.607 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:37.606+0000 7f38788f5700 1 --2- 192.168.123.105:0/2691278701 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3874103cf0 0x7f3874199320 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:57:37.607 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:37.606+0000 7f38788f5700 1 -- 192.168.123.105:0/2691278701 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f3864009710 con 0x7f3874103340 2026-03-10T07:57:37.607 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:37.606+0000 7f3873fff700 1 --2- 192.168.123.105:0/2691278701 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3874103cf0 0x7f3874199320 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 
tx=0).handle_auth_done state changed! 2026-03-10T07:57:37.607 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:37.606+0000 7f38788f5700 1 --2- 192.168.123.105:0/2691278701 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f3874103340 0x7f3874198de0 secure :-1 s=READY pgs=63 cs=0 l=1 rev1=1 crypto rx=0x7f386400ec80 tx=0x7f386400ef90 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:57:37.607 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:37.606+0000 7f3871ffb700 1 -- 192.168.123.105:0/2691278701 <== mon.1 v2:192.168.123.108:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f386400ccd0 con 0x7f3874103340 2026-03-10T07:57:37.607 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:37.606+0000 7f387ab59700 1 -- 192.168.123.105:0/2691278701 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f38680097e0 con 0x7f3874103340 2026-03-10T07:57:37.607 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:37.606+0000 7f387ab59700 1 -- 192.168.123.105:0/2691278701 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f387419de30 con 0x7f3874103340 2026-03-10T07:57:37.608 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:37.607+0000 7f3871ffb700 1 -- 192.168.123.105:0/2691278701 <== mon.1 v2:192.168.123.108:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7f3864004500 con 0x7f3874103340 2026-03-10T07:57:37.608 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:37.607+0000 7f3871ffb700 1 -- 192.168.123.105:0/2691278701 <== mon.1 v2:192.168.123.108:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f3864005330 con 0x7f3874103340 2026-03-10T07:57:37.609 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:37.608+0000 7f387ab59700 1 -- 192.168.123.105:0/2691278701 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- 
mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f387410b740 con 0x7f3874103340 2026-03-10T07:57:37.609 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:37.608+0000 7f3871ffb700 1 -- 192.168.123.105:0/2691278701 <== mon.1 v2:192.168.123.108:3300/0 4 ==== mgrmap(e 38) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f3864005540 con 0x7f3874103340 2026-03-10T07:57:37.609 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:37.608+0000 7f3871ffb700 1 --2- 192.168.123.105:0/2691278701 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f385c0778e0 0x7f385c079da0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:57:37.609 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:37.608+0000 7f3871ffb700 1 -- 192.168.123.105:0/2691278701 <== mon.1 v2:192.168.123.108:3300/0 5 ==== osd_map(84..84 src has 1..84) v4 ==== 6480+0+0 (secure 0 0 0) 0x7f3864014070 con 0x7f3874103340 2026-03-10T07:57:37.610 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:37.608+0000 7f3873fff700 1 --2- 192.168.123.105:0/2691278701 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f385c0778e0 0x7f385c079da0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:57:37.610 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:37.609+0000 7f3873fff700 1 --2- 192.168.123.105:0/2691278701 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f385c0778e0 0x7f385c079da0 secure :-1 s=READY pgs=90 cs=0 l=1 rev1=1 crypto rx=0x7f387419a400 tx=0x7f3868005fd0 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:57:37.612 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:37.611+0000 7f3871ffb700 1 -- 192.168.123.105:0/2691278701 <== mon.1 v2:192.168.123.108:3300/0 6 ==== 
mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f3864062920 con 0x7f3874103340 2026-03-10T07:57:37.754 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:37.752+0000 7f387ab59700 1 -- 192.168.123.105:0/2691278701 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "fs dump", "epoch": 11, "format": "json"} v 0) v1 -- 0x7f387419d950 con 0x7f3874103340 2026-03-10T07:57:37.754 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:37.753+0000 7f3871ffb700 1 -- 192.168.123.105:0/2691278701 <== mon.1 v2:192.168.123.108:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump", "epoch": 11, "format": "json"}]=0 dumped fsmap epoch 11 v35) v1 ==== 107+0+4908 (secure 0 0 0) 0x7f3864062070 con 0x7f3874103340 2026-03-10T07:57:37.754 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-10T07:57:37.755 INFO:teuthology.orchestra.run.vm05.stdout:{"epoch":11,"btime":"1970-01-01T00:00:00:000000+0000","default_fscid":1,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"feature_flags":{"enable_multiple":true,"ever_enabled_multiple":true},"standbys":[{"gid":14512,"name":"cephfs.vm05.pavqil","rank":-1,"incarnation":0,"state":"up:standby","state_seq":2,"addr":"192.168.123.105:6829/427555544","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.105:6828","nonce":427555544},{"type":"v1","addr":"192.168.123.105:6829","nonce":427555544}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in 
separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"epoch":11},{"gid":14524,"name":"cephfs.vm08.ybmbgd","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.108:6825/3815585988","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.108:6824","nonce":3815585988},{"type":"v1","addr":"192.168.123.108:6825","nonce":3815585988}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"epoch":10},{"gid":24313,"name":"cephfs.vm08.dgsaon","rank":-1,"incarnation":0,"state":"up:standby","state_seq":2,"addr":"192.168.123.108:6827/2963085185","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.108:6826","nonce":2963085185},{"type":"v1","addr":"192.168.123.108:6827","nonce":2963085185}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm 
v2"}},"epoch":11}],"filesystems":[{"mdsmap":{"epoch":11,"flags":18,"flags_state":{"joinable":true,"allow_snaps":true,"allow_multimds_snaps":true,"allow_standby_replay":false,"refuse_client_session":false,"refuse_standby_for_another_fs":false,"balance_automate":false},"ever_allowed_features":0,"explicitly_allowed_features":0,"created":"2026-03-10T07:48:24.309293+0000","modified":"2026-03-10T07:48:32.422856+0000","tableserver":0,"root":0,"session_timeout":60,"session_autoclose":300,"required_client_features":{},"max_file_size":1099511627776,"max_xattr_size":65536,"last_failure":0,"last_failure_osd_epoch":39,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"max_mds":1,"in":[0],"up":{"mds_0":24297},"failed":[],"damaged":[],"stopped":[],"info":{"gid_24297":{"gid":24297,"name":"cephfs.vm05.omfhnh","rank":0,"incarnation":9,"state":"up:rejoin","state_seq":4,"addr":"192.168.123.105:6827/723078808","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.105:6826","nonce":723078808},{"type":"v1","addr":"192.168.123.105:6827","nonce":723078808}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm 
v2"}}}},"data_pools":[3],"metadata_pool":2,"enabled":true,"fs_name":"cephfs","balancer":"","bal_rank_mask":"-1","standby_count_wanted":1,"qdb_leader":0,"qdb_cluster":[]},"id":1}]} 2026-03-10T07:57:37.757 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:37.756+0000 7f387ab59700 1 -- 192.168.123.105:0/2691278701 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f385c0778e0 msgr2=0x7f385c079da0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:57:37.757 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:37.756+0000 7f387ab59700 1 --2- 192.168.123.105:0/2691278701 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f385c0778e0 0x7f385c079da0 secure :-1 s=READY pgs=90 cs=0 l=1 rev1=1 crypto rx=0x7f387419a400 tx=0x7f3868005fd0 comp rx=0 tx=0).stop 2026-03-10T07:57:37.757 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:37.756+0000 7f387ab59700 1 -- 192.168.123.105:0/2691278701 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f3874103340 msgr2=0x7f3874198de0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:57:37.757 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:37.756+0000 7f387ab59700 1 --2- 192.168.123.105:0/2691278701 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f3874103340 0x7f3874198de0 secure :-1 s=READY pgs=63 cs=0 l=1 rev1=1 crypto rx=0x7f386400ec80 tx=0x7f386400ef90 comp rx=0 tx=0).stop 2026-03-10T07:57:37.757 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:37.756+0000 7f387ab59700 1 -- 192.168.123.105:0/2691278701 shutdown_connections 2026-03-10T07:57:37.757 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:37.756+0000 7f387ab59700 1 --2- 192.168.123.105:0/2691278701 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f3874103340 0x7f3874198de0 unknown :-1 s=CLOSED pgs=63 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:57:37.757 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:37.756+0000 7f387ab59700 1 --2- 192.168.123.105:0/2691278701 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f385c0778e0 0x7f385c079da0 unknown :-1 s=CLOSED pgs=90 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:57:37.757 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:37.756+0000 7f387ab59700 1 --2- 192.168.123.105:0/2691278701 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3874103cf0 0x7f3874199320 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:57:37.758 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:37.756+0000 7f387ab59700 1 -- 192.168.123.105:0/2691278701 >> 192.168.123.105:0/2691278701 conn(0x7f38740feb90 msgr2=0x7f3874100f40 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:57:37.758 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:37.756+0000 7f387ab59700 1 -- 192.168.123.105:0/2691278701 shutdown_connections 2026-03-10T07:57:37.758 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:37.756+0000 7f387ab59700 1 -- 192.168.123.105:0/2691278701 wait complete. 
2026-03-10T07:57:37.758 INFO:teuthology.orchestra.run.vm05.stderr:dumped fsmap epoch 11 2026-03-10T07:57:37.806 DEBUG:teuthology.orchestra.run.vm05:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell --fsid 12e9780e-1c55-11f1-8896-79f7c2e9b508 -- ceph fs dump --format=json 12 2026-03-10T07:57:37.945 INFO:teuthology.orchestra.run.vm05.stderr:Inferring config /var/lib/ceph/12e9780e-1c55-11f1-8896-79f7c2e9b508/mon.vm05/config 2026-03-10T07:57:38.191 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:38.189+0000 7f4b48112700 1 -- 192.168.123.105:0/3292317121 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f4b40103340 msgr2=0x7f4b40103720 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:57:38.191 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:38.189+0000 7f4b48112700 1 --2- 192.168.123.105:0/3292317121 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f4b40103340 0x7f4b40103720 secure :-1 s=READY pgs=122 cs=0 l=1 rev1=1 crypto rx=0x7f4b30009b00 tx=0x7f4b30009e10 comp rx=0 tx=0).stop 2026-03-10T07:57:38.191 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:38.189+0000 7f4b48112700 1 -- 192.168.123.105:0/3292317121 shutdown_connections 2026-03-10T07:57:38.191 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:38.189+0000 7f4b48112700 1 --2- 192.168.123.105:0/3292317121 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f4b40103cf0 0x7f4b40107d40 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:57:38.191 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:38.189+0000 7f4b48112700 1 --2- 192.168.123.105:0/3292317121 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f4b40103340 0x7f4b40103720 unknown :-1 s=CLOSED pgs=122 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:57:38.191 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:38.189+0000 7f4b48112700 
1 -- 192.168.123.105:0/3292317121 >> 192.168.123.105:0/3292317121 conn(0x7f4b400feb90 msgr2=0x7f4b40100fb0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:57:38.191 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:38.190+0000 7f4b48112700 1 -- 192.168.123.105:0/3292317121 shutdown_connections 2026-03-10T07:57:38.191 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:38.190+0000 7f4b48112700 1 -- 192.168.123.105:0/3292317121 wait complete. 2026-03-10T07:57:38.192 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:38.191+0000 7f4b48112700 1 Processor -- start 2026-03-10T07:57:38.192 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:38.191+0000 7f4b48112700 1 -- start start 2026-03-10T07:57:38.192 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:38.191+0000 7f4b48112700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f4b40103cf0 0x7f4b401990e0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:57:38.192 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:38.191+0000 7f4b48112700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f4b40199620 0x7f4b4019da90 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:57:38.193 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:38.191+0000 7f4b456ad700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f4b40199620 0x7f4b4019da90 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:57:38.193 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:38.191+0000 7f4b456ad700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f4b40199620 0x7f4b4019da90 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am 
v2:192.168.123.105:42756/0 (socket says 192.168.123.105:42756) 2026-03-10T07:57:38.193 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:38.191+0000 7f4b456ad700 1 -- 192.168.123.105:0/1963155101 learned_addr learned my addr 192.168.123.105:0/1963155101 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T07:57:38.193 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:38.192+0000 7f4b48112700 1 -- 192.168.123.105:0/1963155101 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f4b40199c40 con 0x7f4b40199620 2026-03-10T07:57:38.193 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:38.192+0000 7f4b48112700 1 -- 192.168.123.105:0/1963155101 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f4b40199db0 con 0x7f4b40103cf0 2026-03-10T07:57:38.193 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:38.192+0000 7f4b456ad700 1 -- 192.168.123.105:0/1963155101 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f4b40103cf0 msgr2=0x7f4b401990e0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:57:38.193 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:38.192+0000 7f4b45eae700 1 --2- 192.168.123.105:0/1963155101 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f4b40103cf0 0x7f4b401990e0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:57:38.194 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:38.192+0000 7f4b456ad700 1 --2- 192.168.123.105:0/1963155101 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f4b40103cf0 0x7f4b401990e0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:57:38.194 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:38.192+0000 7f4b456ad700 1 -- 192.168.123.105:0/1963155101 --> 
[v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f4b300097e0 con 0x7f4b40199620 2026-03-10T07:57:38.194 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:38.192+0000 7f4b45eae700 1 --2- 192.168.123.105:0/1963155101 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f4b40103cf0 0x7f4b401990e0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request state changed! 2026-03-10T07:57:38.194 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:38.192+0000 7f4b456ad700 1 --2- 192.168.123.105:0/1963155101 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f4b40199620 0x7f4b4019da90 secure :-1 s=READY pgs=123 cs=0 l=1 rev1=1 crypto rx=0x7f4b3c00ba70 tx=0x7f4b3c00be30 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:57:38.195 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:38.193+0000 7f4b36ffd700 1 -- 192.168.123.105:0/1963155101 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f4b3c00c760 con 0x7f4b40199620 2026-03-10T07:57:38.195 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:38.193+0000 7f4b36ffd700 1 -- 192.168.123.105:0/1963155101 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7f4b3c00cda0 con 0x7f4b40199620 2026-03-10T07:57:38.195 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:38.193+0000 7f4b36ffd700 1 -- 192.168.123.105:0/1963155101 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f4b3c012550 con 0x7f4b40199620 2026-03-10T07:57:38.195 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:38.193+0000 7f4b48112700 1 -- 192.168.123.105:0/1963155101 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f4b4019e090 con 0x7f4b40199620 2026-03-10T07:57:38.195 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:38.193+0000 7f4b48112700 1 -- 192.168.123.105:0/1963155101 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f4b4019e5e0 con 0x7f4b40199620 2026-03-10T07:57:38.198 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:38.194+0000 7f4b48112700 1 -- 192.168.123.105:0/1963155101 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f4b4010b6e0 con 0x7f4b40199620 2026-03-10T07:57:38.199 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:38.197+0000 7f4b36ffd700 1 -- 192.168.123.105:0/1963155101 <== mon.0 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 38) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f4b3c014440 con 0x7f4b40199620 2026-03-10T07:57:38.199 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:38.198+0000 7f4b36ffd700 1 --2- 192.168.123.105:0/1963155101 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f4b2c0778c0 0x7f4b2c079d80 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:57:38.199 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:38.198+0000 7f4b36ffd700 1 -- 192.168.123.105:0/1963155101 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(84..84 src has 1..84) v4 ==== 6480+0+0 (secure 0 0 0) 0x7f4b3c099670 con 0x7f4b40199620 2026-03-10T07:57:38.199 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:38.198+0000 7f4b36ffd700 1 -- 192.168.123.105:0/1963155101 <== mon.0 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f4b3c099a80 con 0x7f4b40199620 2026-03-10T07:57:38.199 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:38.198+0000 7f4b45eae700 1 --2- 192.168.123.105:0/1963155101 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f4b2c0778c0 0x7f4b2c079d80 
unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:57:38.200 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:38.199+0000 7f4b45eae700 1 --2- 192.168.123.105:0/1963155101 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f4b2c0778c0 0x7f4b2c079d80 secure :-1 s=READY pgs=91 cs=0 l=1 rev1=1 crypto rx=0x7f4b30009fd0 tx=0x7f4b30005fb0 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:57:38.337 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:57:38 vm05.local ceph-mon[130117]: pgmap v199: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail 2026-03-10T07:57:38.337 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:57:38 vm05.local ceph-mon[130117]: from='client.? 192.168.123.105:0/4136815779' entity='client.admin' cmd=[{"prefix": "fs dump", "format": "json"}]: dispatch 2026-03-10T07:57:38.337 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:57:38 vm05.local ceph-mon[130117]: from='client.? 
192.168.123.105:0/2691278701' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 11, "format": "json"}]: dispatch 2026-03-10T07:57:38.337 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:38.334+0000 7f4b48112700 1 -- 192.168.123.105:0/1963155101 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "fs dump", "epoch": 12, "format": "json"} v 0) v1 -- 0x7f4b40066eb0 con 0x7f4b40199620 2026-03-10T07:57:38.339 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:38.338+0000 7f4b36ffd700 1 -- 192.168.123.105:0/1963155101 <== mon.0 v2:192.168.123.105:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump", "epoch": 12, "format": "json"}]=0 dumped fsmap epoch 12 v35) v1 ==== 107+0+4908 (secure 0 0 0) 0x7f4b3c061f00 con 0x7f4b40199620 2026-03-10T07:57:38.339 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-10T07:57:38.341 INFO:teuthology.orchestra.run.vm05.stdout:{"epoch":12,"btime":"1970-01-01T00:00:00:000000+0000","default_fscid":1,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"feature_flags":{"enable_multiple":true,"ever_enabled_multiple":true},"standbys":[{"gid":14512,"name":"cephfs.vm05.pavqil","rank":-1,"incarnation":0,"state":"up:standby","state_seq":2,"addr":"192.168.123.105:6829/427555544","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.105:6828","nonce":427555544},{"type":"v1","addr":"192.168.123.105:6829","nonce":427555544}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate 
object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"epoch":11},{"gid":14524,"name":"cephfs.vm08.ybmbgd","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.108:6825/3815585988","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.108:6824","nonce":3815585988},{"type":"v1","addr":"192.168.123.108:6825","nonce":3815585988}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"epoch":10},{"gid":24313,"name":"cephfs.vm08.dgsaon","rank":-1,"incarnation":0,"state":"up:standby","state_seq":2,"addr":"192.168.123.108:6827/2963085185","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.108:6826","nonce":2963085185},{"type":"v1","addr":"192.168.123.108:6827","nonce":2963085185}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm 
v2"}},"epoch":11}],"filesystems":[{"mdsmap":{"epoch":12,"flags":18,"flags_state":{"joinable":true,"allow_snaps":true,"allow_multimds_snaps":true,"allow_standby_replay":false,"refuse_client_session":false,"refuse_standby_for_another_fs":false,"balance_automate":false},"ever_allowed_features":0,"explicitly_allowed_features":0,"created":"2026-03-10T07:48:24.309293+0000","modified":"2026-03-10T07:48:33.425187+0000","tableserver":0,"root":0,"session_timeout":60,"session_autoclose":300,"required_client_features":{},"max_file_size":1099511627776,"max_xattr_size":65536,"last_failure":0,"last_failure_osd_epoch":39,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"max_mds":1,"in":[0],"up":{"mds_0":24297},"failed":[],"damaged":[],"stopped":[],"info":{"gid_24297":{"gid":24297,"name":"cephfs.vm05.omfhnh","rank":0,"incarnation":9,"state":"up:active","state_seq":5,"addr":"192.168.123.105:6827/723078808","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.105:6826","nonce":723078808},{"type":"v1","addr":"192.168.123.105:6827","nonce":723078808}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm 
v2"}}}},"data_pools":[3],"metadata_pool":2,"enabled":true,"fs_name":"cephfs","balancer":"","bal_rank_mask":"-1","standby_count_wanted":1,"qdb_leader":0,"qdb_cluster":[]},"id":1}]} 2026-03-10T07:57:38.344 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:38.343+0000 7f4b48112700 1 -- 192.168.123.105:0/1963155101 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f4b2c0778c0 msgr2=0x7f4b2c079d80 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:57:38.345 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:38.343+0000 7f4b48112700 1 --2- 192.168.123.105:0/1963155101 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f4b2c0778c0 0x7f4b2c079d80 secure :-1 s=READY pgs=91 cs=0 l=1 rev1=1 crypto rx=0x7f4b30009fd0 tx=0x7f4b30005fb0 comp rx=0 tx=0).stop 2026-03-10T07:57:38.345 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:38.343+0000 7f4b48112700 1 -- 192.168.123.105:0/1963155101 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f4b40199620 msgr2=0x7f4b4019da90 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:57:38.345 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:38.343+0000 7f4b48112700 1 --2- 192.168.123.105:0/1963155101 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f4b40199620 0x7f4b4019da90 secure :-1 s=READY pgs=123 cs=0 l=1 rev1=1 crypto rx=0x7f4b3c00ba70 tx=0x7f4b3c00be30 comp rx=0 tx=0).stop 2026-03-10T07:57:38.345 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:38.343+0000 7f4b48112700 1 -- 192.168.123.105:0/1963155101 shutdown_connections 2026-03-10T07:57:38.345 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:38.343+0000 7f4b48112700 1 --2- 192.168.123.105:0/1963155101 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f4b40103cf0 0x7f4b401990e0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:57:38.345 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:38.343+0000 7f4b48112700 1 --2- 192.168.123.105:0/1963155101 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f4b2c0778c0 0x7f4b2c079d80 secure :-1 s=CLOSED pgs=91 cs=0 l=1 rev1=1 crypto rx=0x7f4b30009fd0 tx=0x7f4b30005fb0 comp rx=0 tx=0).stop 2026-03-10T07:57:38.345 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:38.343+0000 7f4b48112700 1 --2- 192.168.123.105:0/1963155101 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f4b40199620 0x7f4b4019da90 secure :-1 s=CLOSED pgs=123 cs=0 l=1 rev1=1 crypto rx=0x7f4b3c00ba70 tx=0x7f4b3c00be30 comp rx=0 tx=0).stop 2026-03-10T07:57:38.345 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:38.343+0000 7f4b48112700 1 -- 192.168.123.105:0/1963155101 >> 192.168.123.105:0/1963155101 conn(0x7f4b400feb90 msgr2=0x7f4b401002c0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:57:38.345 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:38.343+0000 7f4b48112700 1 -- 192.168.123.105:0/1963155101 shutdown_connections 2026-03-10T07:57:38.345 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:38.344+0000 7f4b48112700 1 -- 192.168.123.105:0/1963155101 wait complete. 2026-03-10T07:57:38.346 INFO:teuthology.orchestra.run.vm05.stderr:dumped fsmap epoch 12 2026-03-10T07:57:38.388 DEBUG:teuthology.orchestra.run.vm05:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell --fsid 12e9780e-1c55-11f1-8896-79f7c2e9b508 -- ceph fs dump --format=json 13 2026-03-10T07:57:38.418 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:57:38 vm08.local ceph-mon[107898]: pgmap v199: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail 2026-03-10T07:57:38.418 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:57:38 vm08.local ceph-mon[107898]: from='client.? 
192.168.123.105:0/4136815779' entity='client.admin' cmd=[{"prefix": "fs dump", "format": "json"}]: dispatch 2026-03-10T07:57:38.418 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:57:38 vm08.local ceph-mon[107898]: from='client.? 192.168.123.105:0/2691278701' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 11, "format": "json"}]: dispatch 2026-03-10T07:57:38.532 INFO:teuthology.orchestra.run.vm05.stderr:Inferring config /var/lib/ceph/12e9780e-1c55-11f1-8896-79f7c2e9b508/mon.vm05/config 2026-03-10T07:57:38.792 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:38.790+0000 7fd58762d700 1 -- 192.168.123.105:0/1919605508 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fd580105be0 msgr2=0x7fd580105fc0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:57:38.792 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:38.790+0000 7fd58762d700 1 --2- 192.168.123.105:0/1919605508 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fd580105be0 0x7fd580105fc0 secure :-1 s=READY pgs=124 cs=0 l=1 rev1=1 crypto rx=0x7fd56c009b00 tx=0x7fd56c009e10 comp rx=0 tx=0).stop 2026-03-10T07:57:38.792 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:38.791+0000 7fd58762d700 1 -- 192.168.123.105:0/1919605508 shutdown_connections 2026-03-10T07:57:38.792 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:38.791+0000 7fd58762d700 1 --2- 192.168.123.105:0/1919605508 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fd5800684d0 0x7fd580068950 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:57:38.792 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:38.791+0000 7fd58762d700 1 --2- 192.168.123.105:0/1919605508 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fd580105be0 0x7fd580105fc0 unknown :-1 s=CLOSED pgs=124 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:57:38.792 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:38.791+0000 7fd58762d700 1 -- 192.168.123.105:0/1919605508 >> 192.168.123.105:0/1919605508 conn(0x7fd5800756b0 msgr2=0x7fd580075ac0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:57:38.792 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:38.791+0000 7fd58762d700 1 -- 192.168.123.105:0/1919605508 shutdown_connections 2026-03-10T07:57:38.792 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:38.791+0000 7fd58762d700 1 -- 192.168.123.105:0/1919605508 wait complete. 2026-03-10T07:57:38.793 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:38.791+0000 7fd58762d700 1 Processor -- start 2026-03-10T07:57:38.793 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:38.792+0000 7fd58762d700 1 -- start start 2026-03-10T07:57:38.793 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:38.792+0000 7fd58762d700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fd5800684d0 0x7fd58019c610 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:57:38.793 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:38.792+0000 7fd58762d700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fd580105be0 0x7fd58019cb50 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:57:38.794 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:38.792+0000 7fd585e2a700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fd580105be0 0x7fd58019cb50 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:57:38.794 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:38.792+0000 7fd585e2a700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fd580105be0 0x7fd58019cb50 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 
tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.108:3300/0 says I am v2:192.168.123.105:44206/0 (socket says 192.168.123.105:44206) 2026-03-10T07:57:38.794 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:38.792+0000 7fd585e2a700 1 -- 192.168.123.105:0/1725819254 learned_addr learned my addr 192.168.123.105:0/1725819254 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T07:57:38.794 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:38.792+0000 7fd58762d700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fd58019d230 con 0x7fd5800684d0 2026-03-10T07:57:38.794 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:38.792+0000 7fd58762d700 1 -- 192.168.123.105:0/1725819254 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fd5801966b0 con 0x7fd580105be0 2026-03-10T07:57:38.794 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:38.792+0000 7fd58662b700 1 --2- 192.168.123.105:0/1725819254 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fd5800684d0 0x7fd58019c610 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:57:38.794 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:38.793+0000 7fd585e2a700 1 -- 192.168.123.105:0/1725819254 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fd5800684d0 msgr2=0x7fd58019c610 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:57:38.794 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:38.793+0000 7fd585e2a700 1 --2- 192.168.123.105:0/1725819254 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fd5800684d0 0x7fd58019c610 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:57:38.794 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:38.793+0000 7fd585e2a700 1 -- 
192.168.123.105:0/1725819254 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fd56c0097e0 con 0x7fd580105be0 2026-03-10T07:57:38.794 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:38.793+0000 7fd585e2a700 1 --2- 192.168.123.105:0/1725819254 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fd580105be0 0x7fd58019cb50 secure :-1 s=READY pgs=64 cs=0 l=1 rev1=1 crypto rx=0x7fd57400eb10 tx=0x7fd57400ee20 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:57:38.794 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:38.793+0000 7fd58662b700 1 --2- 192.168.123.105:0/1725819254 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fd5800684d0 0x7fd58019c610 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_auth_done state changed! 2026-03-10T07:57:38.794 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:38.793+0000 7fd57b7fe700 1 -- 192.168.123.105:0/1725819254 <== mon.1 v2:192.168.123.108:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fd57400cc40 con 0x7fd580105be0 2026-03-10T07:57:38.795 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:38.793+0000 7fd57b7fe700 1 -- 192.168.123.105:0/1725819254 <== mon.1 v2:192.168.123.108:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7fd57400cda0 con 0x7fd580105be0 2026-03-10T07:57:38.795 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:38.793+0000 7fd57b7fe700 1 -- 192.168.123.105:0/1725819254 <== mon.1 v2:192.168.123.108:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fd574018870 con 0x7fd580105be0 2026-03-10T07:57:38.795 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:38.793+0000 7fd58762d700 1 -- 192.168.123.105:0/1725819254 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fd580196990 con 0x7fd580105be0 
2026-03-10T07:57:38.795 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:38.793+0000 7fd58762d700 1 -- 192.168.123.105:0/1725819254 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fd580196ee0 con 0x7fd580105be0 2026-03-10T07:57:38.796 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:38.794+0000 7fd58762d700 1 -- 192.168.123.105:0/1725819254 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fd58004f2e0 con 0x7fd580105be0 2026-03-10T07:57:38.797 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:38.795+0000 7fd57b7fe700 1 -- 192.168.123.105:0/1725819254 <== mon.1 v2:192.168.123.108:3300/0 4 ==== mgrmap(e 38) v1 ==== 100115+0+0 (secure 0 0 0) 0x7fd5740189d0 con 0x7fd580105be0 2026-03-10T07:57:38.797 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:38.796+0000 7fd57b7fe700 1 --2- 192.168.123.105:0/1725819254 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7fd5700778c0 0x7fd570079d80 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:57:38.797 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:38.796+0000 7fd57b7fe700 1 -- 192.168.123.105:0/1725819254 <== mon.1 v2:192.168.123.108:3300/0 5 ==== osd_map(84..84 src has 1..84) v4 ==== 6480+0+0 (secure 0 0 0) 0x7fd574014070 con 0x7fd580105be0 2026-03-10T07:57:38.797 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:38.796+0000 7fd58662b700 1 --2- 192.168.123.105:0/1725819254 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7fd5700778c0 0x7fd570079d80 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:57:38.797 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:38.796+0000 7fd58662b700 1 --2- 192.168.123.105:0/1725819254 >> 
[v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7fd5700778c0 0x7fd570079d80 secure :-1 s=READY pgs=92 cs=0 l=1 rev1=1 crypto rx=0x7fd56c006010 tx=0x7fd56c00b560 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:57:38.799 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:38.798+0000 7fd57b7fe700 1 -- 192.168.123.105:0/1725819254 <== mon.1 v2:192.168.123.108:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7fd574063050 con 0x7fd580105be0 2026-03-10T07:57:38.941 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:38.939+0000 7fd58762d700 1 -- 192.168.123.105:0/1725819254 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "fs dump", "epoch": 13, "format": "json"} v 0) v1 -- 0x7fd58004ea90 con 0x7fd580105be0 2026-03-10T07:57:38.942 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:38.941+0000 7fd57b7fe700 1 -- 192.168.123.105:0/1725819254 <== mon.1 v2:192.168.123.108:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump", "epoch": 13, "format": "json"}]=0 dumped fsmap epoch 13 v35) v1 ==== 107+0+4118 (secure 0 0 0) 0x7fd5740627a0 con 0x7fd580105be0 2026-03-10T07:57:38.942 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-10T07:57:38.942 INFO:teuthology.orchestra.run.vm05.stdout:{"epoch":13,"btime":"2026-03-10T07:55:27:417427+0000","default_fscid":1,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm 
v2"}},"feature_flags":{"enable_multiple":true,"ever_enabled_multiple":true},"standbys":[{"gid":14512,"name":"cephfs.vm05.pavqil","rank":-1,"incarnation":0,"state":"up:standby","state_seq":2,"addr":"192.168.123.105:6829/427555544","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.105:6828","nonce":427555544},{"type":"v1","addr":"192.168.123.105:6829","nonce":427555544}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"epoch":11},{"gid":14524,"name":"cephfs.vm08.ybmbgd","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.108:6825/3815585988","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.108:6824","nonce":3815585988},{"type":"v1","addr":"192.168.123.108:6825","nonce":3815585988}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm 
v2"}},"epoch":10},{"gid":24313,"name":"cephfs.vm08.dgsaon","rank":-1,"incarnation":0,"state":"up:standby","state_seq":2,"addr":"192.168.123.108:6827/2963085185","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.108:6826","nonce":2963085185},{"type":"v1","addr":"192.168.123.108:6827","nonce":2963085185}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"epoch":11}],"filesystems":[{"mdsmap":{"epoch":13,"flags":18,"flags_state":{"joinable":true,"allow_snaps":true,"allow_multimds_snaps":true,"allow_standby_replay":false,"refuse_client_session":false,"refuse_standby_for_another_fs":false,"balance_automate":false},"ever_allowed_features":0,"explicitly_allowed_features":0,"created":"2026-03-10T07:48:24.309293+0000","modified":"2026-03-10T07:55:27.416998+0000","tableserver":0,"root":0,"session_timeout":60,"session_autoclose":300,"required_client_features":{},"max_file_size":1099511627776,"max_xattr_size":65536,"last_failure":0,"last_failure_osd_epoch":79,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm 
v2"}},"max_mds":1,"in":[0],"up":{},"failed":[0],"damaged":[],"stopped":[],"info":{},"data_pools":[3],"metadata_pool":2,"enabled":true,"fs_name":"cephfs","balancer":"","bal_rank_mask":"-1","standby_count_wanted":1,"qdb_leader":0,"qdb_cluster":[]},"id":1}]} 2026-03-10T07:57:38.944 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:38.943+0000 7fd58762d700 1 -- 192.168.123.105:0/1725819254 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7fd5700778c0 msgr2=0x7fd570079d80 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:57:38.944 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:38.943+0000 7fd58762d700 1 --2- 192.168.123.105:0/1725819254 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7fd5700778c0 0x7fd570079d80 secure :-1 s=READY pgs=92 cs=0 l=1 rev1=1 crypto rx=0x7fd56c006010 tx=0x7fd56c00b560 comp rx=0 tx=0).stop 2026-03-10T07:57:38.945 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:38.943+0000 7fd58762d700 1 -- 192.168.123.105:0/1725819254 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fd580105be0 msgr2=0x7fd58019cb50 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:57:38.945 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:38.944+0000 7fd58762d700 1 --2- 192.168.123.105:0/1725819254 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fd580105be0 0x7fd58019cb50 secure :-1 s=READY pgs=64 cs=0 l=1 rev1=1 crypto rx=0x7fd57400eb10 tx=0x7fd57400ee20 comp rx=0 tx=0).stop 2026-03-10T07:57:38.945 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:38.944+0000 7fd58762d700 1 -- 192.168.123.105:0/1725819254 shutdown_connections 2026-03-10T07:57:38.945 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:38.944+0000 7fd58762d700 1 --2- 192.168.123.105:0/1725819254 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7fd5700778c0 0x7fd570079d80 unknown :-1 
s=CLOSED pgs=92 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:57:38.945 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:38.944+0000 7fd58762d700 1 --2- 192.168.123.105:0/1725819254 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fd5800684d0 0x7fd58019c610 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:57:38.945 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:38.944+0000 7fd58762d700 1 --2- 192.168.123.105:0/1725819254 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fd580105be0 0x7fd58019cb50 unknown :-1 s=CLOSED pgs=64 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:57:38.945 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:38.944+0000 7fd58762d700 1 -- 192.168.123.105:0/1725819254 >> 192.168.123.105:0/1725819254 conn(0x7fd5800756b0 msgr2=0x7fd5800fdc30 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:57:38.945 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:38.944+0000 7fd58762d700 1 -- 192.168.123.105:0/1725819254 shutdown_connections 2026-03-10T07:57:38.946 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:38.944+0000 7fd58762d700 1 -- 192.168.123.105:0/1725819254 wait complete. 2026-03-10T07:57:38.947 INFO:teuthology.orchestra.run.vm05.stderr:dumped fsmap epoch 13 2026-03-10T07:57:39.015 DEBUG:teuthology.orchestra.run.vm05:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell --fsid 12e9780e-1c55-11f1-8896-79f7c2e9b508 -- ceph fs dump --format=json 14 2026-03-10T07:57:39.170 INFO:teuthology.orchestra.run.vm05.stderr:Inferring config /var/lib/ceph/12e9780e-1c55-11f1-8896-79f7c2e9b508/mon.vm05/config 2026-03-10T07:57:39.195 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:57:39 vm05.local ceph-mon[130117]: from='client.? 
192.168.123.105:0/1963155101' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 12, "format": "json"}]: dispatch 2026-03-10T07:57:39.195 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:57:39 vm05.local ceph-mon[130117]: from='client.? 192.168.123.105:0/1725819254' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 13, "format": "json"}]: dispatch 2026-03-10T07:57:39.414 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:39.412+0000 7fe0d32df700 1 -- 192.168.123.105:0/234992139 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fe0cc068730 msgr2=0x7fe0cc068b10 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:57:39.414 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:39.412+0000 7fe0d32df700 1 --2- 192.168.123.105:0/234992139 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fe0cc068730 0x7fe0cc068b10 secure :-1 s=READY pgs=125 cs=0 l=1 rev1=1 crypto rx=0x7fe0c0009b50 tx=0x7fe0c0009e60 comp rx=0 tx=0).stop 2026-03-10T07:57:39.414 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:39.413+0000 7fe0d32df700 1 -- 192.168.123.105:0/234992139 shutdown_connections 2026-03-10T07:57:39.414 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:39.413+0000 7fe0d32df700 1 --2- 192.168.123.105:0/234992139 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fe0cc0690e0 0x7fe0cc105b50 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:57:39.414 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:39.413+0000 7fe0d32df700 1 --2- 192.168.123.105:0/234992139 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fe0cc068730 0x7fe0cc068b10 unknown :-1 s=CLOSED pgs=125 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:57:39.414 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:39.413+0000 7fe0d32df700 1 -- 192.168.123.105:0/234992139 >> 192.168.123.105:0/234992139 conn(0x7fe0cc075960 
msgr2=0x7fe0cc075d70 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:57:39.415 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:39.413+0000 7fe0d32df700 1 -- 192.168.123.105:0/234992139 shutdown_connections 2026-03-10T07:57:39.415 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:39.413+0000 7fe0d32df700 1 -- 192.168.123.105:0/234992139 wait complete. 2026-03-10T07:57:39.415 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:39.414+0000 7fe0d32df700 1 Processor -- start 2026-03-10T07:57:39.415 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:39.414+0000 7fe0d32df700 1 -- start start 2026-03-10T07:57:39.416 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:39.415+0000 7fe0d32df700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fe0cc068730 0x7fe0cc198eb0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:57:39.416 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:39.415+0000 7fe0d32df700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fe0cc0690e0 0x7fe0cc1993f0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:57:39.416 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:39.415+0000 7fe0d32df700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fe0cc199ad0 con 0x7fe0cc0690e0 2026-03-10T07:57:39.416 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:39.415+0000 7fe0d32df700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fe0cc19d860 con 0x7fe0cc068730 2026-03-10T07:57:39.417 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:39.415+0000 7fe0d087a700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fe0cc0690e0 0x7fe0cc1993f0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 
tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:57:39.417 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:39.415+0000 7fe0d087a700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fe0cc0690e0 0x7fe0cc1993f0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.105:46150/0 (socket says 192.168.123.105:46150) 2026-03-10T07:57:39.417 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:39.415+0000 7fe0d087a700 1 -- 192.168.123.105:0/3247001825 learned_addr learned my addr 192.168.123.105:0/3247001825 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T07:57:39.417 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:39.415+0000 7fe0d107b700 1 --2- 192.168.123.105:0/3247001825 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fe0cc068730 0x7fe0cc198eb0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:57:39.417 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:39.415+0000 7fe0d087a700 1 -- 192.168.123.105:0/3247001825 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fe0cc068730 msgr2=0x7fe0cc198eb0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:57:39.417 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:39.415+0000 7fe0d087a700 1 --2- 192.168.123.105:0/3247001825 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fe0cc068730 0x7fe0cc198eb0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:57:39.417 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:39.415+0000 7fe0d087a700 1 -- 192.168.123.105:0/3247001825 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fe0c00097e0 con 0x7fe0cc0690e0 
2026-03-10T07:57:39.417 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:39.415+0000 7fe0d087a700 1 --2- 192.168.123.105:0/3247001825 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fe0cc0690e0 0x7fe0cc1993f0 secure :-1 s=READY pgs=126 cs=0 l=1 rev1=1 crypto rx=0x7fe0c800eaa0 tx=0x7fe0c800edb0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:57:39.417 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:39.415+0000 7fe0be7fc700 1 -- 192.168.123.105:0/3247001825 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fe0c800cc40 con 0x7fe0cc0690e0 2026-03-10T07:57:39.417 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:39.416+0000 7fe0be7fc700 1 -- 192.168.123.105:0/3247001825 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7fe0c800cda0 con 0x7fe0cc0690e0 2026-03-10T07:57:39.417 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:39.416+0000 7fe0be7fc700 1 -- 192.168.123.105:0/3247001825 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fe0c8010640 con 0x7fe0cc0690e0 2026-03-10T07:57:39.417 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:39.416+0000 7fe0d32df700 1 -- 192.168.123.105:0/3247001825 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fe0cc19db40 con 0x7fe0cc0690e0 2026-03-10T07:57:39.418 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:57:39 vm08.local ceph-mon[107898]: from='client.? 192.168.123.105:0/1963155101' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 12, "format": "json"}]: dispatch 2026-03-10T07:57:39.418 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:57:39 vm08.local ceph-mon[107898]: from='client.? 
192.168.123.105:0/1725819254' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 13, "format": "json"}]: dispatch 2026-03-10T07:57:39.418 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:39.416+0000 7fe0d32df700 1 -- 192.168.123.105:0/3247001825 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fe0cc19e060 con 0x7fe0cc0690e0 2026-03-10T07:57:39.421 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:39.417+0000 7fe0d32df700 1 -- 192.168.123.105:0/3247001825 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fe0cc1094e0 con 0x7fe0cc0690e0 2026-03-10T07:57:39.421 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:39.418+0000 7fe0be7fc700 1 -- 192.168.123.105:0/3247001825 <== mon.0 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 38) v1 ==== 100115+0+0 (secure 0 0 0) 0x7fe0c80107a0 con 0x7fe0cc0690e0 2026-03-10T07:57:39.421 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:39.420+0000 7fe0be7fc700 1 --2- 192.168.123.105:0/3247001825 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7fe0b8077870 0x7fe0b8079d30 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:57:39.422 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:39.421+0000 7fe0d107b700 1 --2- 192.168.123.105:0/3247001825 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7fe0b8077870 0x7fe0b8079d30 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:57:39.422 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:39.421+0000 7fe0d107b700 1 --2- 192.168.123.105:0/3247001825 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7fe0b8077870 0x7fe0b8079d30 secure :-1 s=READY pgs=93 cs=0 l=1 rev1=1 crypto rx=0x7fe0c0005b20 
tx=0x7fe0c0005a90 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:57:39.423 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:39.421+0000 7fe0be7fc700 1 -- 192.168.123.105:0/3247001825 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(84..84 src has 1..84) v4 ==== 6480+0+0 (secure 0 0 0) 0x7fe0c8014070 con 0x7fe0cc0690e0 2026-03-10T07:57:39.423 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:39.422+0000 7fe0be7fc700 1 -- 192.168.123.105:0/3247001825 <== mon.0 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7fe0c8061d20 con 0x7fe0cc0690e0 2026-03-10T07:57:39.562 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:39.561+0000 7fe0d32df700 1 -- 192.168.123.105:0/3247001825 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "fs dump", "epoch": 14, "format": "json"} v 0) v1 -- 0x7fe0cc04ea90 con 0x7fe0cc0690e0 2026-03-10T07:57:39.564 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:39.563+0000 7fe0be7fc700 1 -- 192.168.123.105:0/3247001825 <== mon.0 v2:192.168.123.105:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump", "epoch": 14, "format": "json"}]=0 dumped fsmap epoch 14 v35) v1 ==== 107+0+4129 (secure 0 0 0) 0x7fe0c80152d0 con 0x7fe0cc0690e0 2026-03-10T07:57:39.565 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-10T07:57:39.565 INFO:teuthology.orchestra.run.vm05.stdout:{"epoch":14,"btime":"2026-03-10T07:55:27:428564+0000","default_fscid":1,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm 
v2"}},"feature_flags":{"enable_multiple":true,"ever_enabled_multiple":true},"standbys":[{"gid":14524,"name":"cephfs.vm08.ybmbgd","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.108:6825/3815585988","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.108:6824","nonce":3815585988},{"type":"v1","addr":"192.168.123.108:6825","nonce":3815585988}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"epoch":10},{"gid":24313,"name":"cephfs.vm08.dgsaon","rank":-1,"incarnation":0,"state":"up:standby","state_seq":2,"addr":"192.168.123.108:6827/2963085185","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.108:6826","nonce":2963085185},{"type":"v1","addr":"192.168.123.108:6827","nonce":2963085185}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm 
v2"}},"epoch":11}],"filesystems":[{"mdsmap":{"epoch":14,"flags":18,"flags_state":{"joinable":true,"allow_snaps":true,"allow_multimds_snaps":true,"allow_standby_replay":false,"refuse_client_session":false,"refuse_standby_for_another_fs":false,"balance_automate":false},"ever_allowed_features":0,"explicitly_allowed_features":0,"created":"2026-03-10T07:48:24.309293+0000","modified":"2026-03-10T07:55:27.428559+0000","tableserver":0,"root":0,"session_timeout":60,"session_autoclose":300,"required_client_features":{},"max_file_size":1099511627776,"max_xattr_size":65536,"last_failure":0,"last_failure_osd_epoch":79,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"max_mds":1,"in":[0],"up":{"mds_0":14512},"failed":[],"damaged":[],"stopped":[],"info":{"gid_14512":{"gid":14512,"name":"cephfs.vm05.pavqil","rank":0,"incarnation":14,"state":"up:replay","state_seq":2,"addr":"192.168.123.105:6829/427555544","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.105:6828","nonce":427555544},{"type":"v1","addr":"192.168.123.105:6829","nonce":427555544}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm 
v2"}}}},"data_pools":[3],"metadata_pool":2,"enabled":true,"fs_name":"cephfs","balancer":"","bal_rank_mask":"-1","standby_count_wanted":1,"qdb_leader":0,"qdb_cluster":[]},"id":1}]} 2026-03-10T07:57:39.567 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:39.566+0000 7fe0d32df700 1 -- 192.168.123.105:0/3247001825 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7fe0b8077870 msgr2=0x7fe0b8079d30 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:57:39.567 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:39.566+0000 7fe0d32df700 1 --2- 192.168.123.105:0/3247001825 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7fe0b8077870 0x7fe0b8079d30 secure :-1 s=READY pgs=93 cs=0 l=1 rev1=1 crypto rx=0x7fe0c0005b20 tx=0x7fe0c0005a90 comp rx=0 tx=0).stop 2026-03-10T07:57:39.567 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:39.566+0000 7fe0d32df700 1 -- 192.168.123.105:0/3247001825 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fe0cc0690e0 msgr2=0x7fe0cc1993f0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:57:39.567 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:39.566+0000 7fe0d32df700 1 --2- 192.168.123.105:0/3247001825 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fe0cc0690e0 0x7fe0cc1993f0 secure :-1 s=READY pgs=126 cs=0 l=1 rev1=1 crypto rx=0x7fe0c800eaa0 tx=0x7fe0c800edb0 comp rx=0 tx=0).stop 2026-03-10T07:57:39.568 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:39.567+0000 7fe0d32df700 1 -- 192.168.123.105:0/3247001825 shutdown_connections 2026-03-10T07:57:39.568 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:39.567+0000 7fe0d32df700 1 --2- 192.168.123.105:0/3247001825 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fe0cc068730 0x7fe0cc198eb0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:57:39.568 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:39.567+0000 7fe0d32df700 1 --2- 192.168.123.105:0/3247001825 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7fe0b8077870 0x7fe0b8079d30 unknown :-1 s=CLOSED pgs=93 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:57:39.568 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:39.567+0000 7fe0d32df700 1 --2- 192.168.123.105:0/3247001825 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fe0cc0690e0 0x7fe0cc1993f0 unknown :-1 s=CLOSED pgs=126 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:57:39.568 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:39.567+0000 7fe0d32df700 1 -- 192.168.123.105:0/3247001825 >> 192.168.123.105:0/3247001825 conn(0x7fe0cc075960 msgr2=0x7fe0cc0feac0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:57:39.568 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:39.567+0000 7fe0d32df700 1 -- 192.168.123.105:0/3247001825 shutdown_connections 2026-03-10T07:57:39.568 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:39.567+0000 7fe0d32df700 1 -- 192.168.123.105:0/3247001825 wait complete. 
2026-03-10T07:57:39.569 INFO:teuthology.orchestra.run.vm05.stderr:dumped fsmap epoch 14 2026-03-10T07:57:39.635 DEBUG:teuthology.orchestra.run.vm05:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell --fsid 12e9780e-1c55-11f1-8896-79f7c2e9b508 -- ceph fs dump --format=json 15 2026-03-10T07:57:39.779 INFO:teuthology.orchestra.run.vm05.stderr:Inferring config /var/lib/ceph/12e9780e-1c55-11f1-8896-79f7c2e9b508/mon.vm05/config 2026-03-10T07:57:40.026 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:40.024+0000 7f4679d55700 1 -- 192.168.123.105:0/433463050 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f4674106fc0 msgr2=0x7f46741073a0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:57:40.026 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:40.024+0000 7f4679d55700 1 --2- 192.168.123.105:0/433463050 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f4674106fc0 0x7f46741073a0 secure :-1 s=READY pgs=127 cs=0 l=1 rev1=1 crypto rx=0x7f465c009b00 tx=0x7f465c009e10 comp rx=0 tx=0).stop 2026-03-10T07:57:40.026 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:40.024+0000 7f4679d55700 1 -- 192.168.123.105:0/433463050 shutdown_connections 2026-03-10T07:57:40.026 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:40.024+0000 7f4679d55700 1 --2- 192.168.123.105:0/433463050 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f4674069220 0x7f4674069680 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:57:40.026 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:40.024+0000 7f4679d55700 1 --2- 192.168.123.105:0/433463050 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f4674106fc0 0x7f46741073a0 unknown :-1 s=CLOSED pgs=127 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:57:40.026 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:40.024+0000 7f4679d55700 1 -- 
192.168.123.105:0/433463050 >> 192.168.123.105:0/433463050 conn(0x7f4674076b60 msgr2=0x7f4674076f70 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:57:40.026 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:40.025+0000 7f4679d55700 1 -- 192.168.123.105:0/433463050 shutdown_connections 2026-03-10T07:57:40.026 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:40.025+0000 7f4679d55700 1 -- 192.168.123.105:0/433463050 wait complete. 2026-03-10T07:57:40.026 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:40.025+0000 7f4679d55700 1 Processor -- start 2026-03-10T07:57:40.026 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:40.025+0000 7f4679d55700 1 -- start start 2026-03-10T07:57:40.027 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:40.025+0000 7f4679d55700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f4674069220 0x7f4674193a20 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:57:40.027 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:40.025+0000 7f4679d55700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f4674106fc0 0x7f4674193f60 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:57:40.027 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:40.026+0000 7f4679d55700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f4674197bb0 con 0x7f4674106fc0 2026-03-10T07:57:40.027 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:40.026+0000 7f4679d55700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f46741944a0 con 0x7f4674069220 2026-03-10T07:57:40.027 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:40.026+0000 7f46737fe700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f4674069220 0x7f4674193a20 unknown :-1 s=BANNER_CONNECTING 
pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:57:40.027 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:40.026+0000 7f46737fe700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f4674069220 0x7f4674193a20 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.108:3300/0 says I am v2:192.168.123.105:42904/0 (socket says 192.168.123.105:42904) 2026-03-10T07:57:40.027 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:40.026+0000 7f46737fe700 1 -- 192.168.123.105:0/1716454638 learned_addr learned my addr 192.168.123.105:0/1716454638 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T07:57:40.027 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:40.026+0000 7f46737fe700 1 -- 192.168.123.105:0/1716454638 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f4674106fc0 msgr2=0x7f4674193f60 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:57:40.027 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:40.026+0000 7f4672ffd700 1 --2- 192.168.123.105:0/1716454638 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f4674106fc0 0x7f4674193f60 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:57:40.027 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:40.026+0000 7f46737fe700 1 --2- 192.168.123.105:0/1716454638 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f4674106fc0 0x7f4674193f60 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:57:40.027 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:40.026+0000 7f46737fe700 1 -- 192.168.123.105:0/1716454638 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 
-- 0x7f465c0097e0 con 0x7f4674069220 2026-03-10T07:57:40.028 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:40.026+0000 7f4672ffd700 1 --2- 192.168.123.105:0/1716454638 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f4674106fc0 0x7f4674193f60 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request state changed! 2026-03-10T07:57:40.028 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:40.026+0000 7f46737fe700 1 --2- 192.168.123.105:0/1716454638 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f4674069220 0x7f4674193a20 secure :-1 s=READY pgs=65 cs=0 l=1 rev1=1 crypto rx=0x7f465c009ad0 tx=0x7f465c0052e0 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:57:40.028 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:40.027+0000 7f4670ff9700 1 -- 192.168.123.105:0/1716454638 <== mon.1 v2:192.168.123.108:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f465c01d070 con 0x7f4674069220 2026-03-10T07:57:40.028 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:40.027+0000 7f4679d55700 1 -- 192.168.123.105:0/1716454638 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f4674194720 con 0x7f4674069220 2026-03-10T07:57:40.029 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:40.027+0000 7f4679d55700 1 -- 192.168.123.105:0/1716454638 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f46741a7610 con 0x7f4674069220 2026-03-10T07:57:40.029 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:40.027+0000 7f4670ff9700 1 -- 192.168.123.105:0/1716454638 <== mon.1 v2:192.168.123.108:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7f465c00bc50 con 0x7f4674069220 2026-03-10T07:57:40.029 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:40.027+0000 7f4670ff9700 1 -- 
192.168.123.105:0/1716454638 <== mon.1 v2:192.168.123.108:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f465c00f670 con 0x7f4674069220 2026-03-10T07:57:40.029 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:40.028+0000 7f4679d55700 1 -- 192.168.123.105:0/1716454638 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f4654005320 con 0x7f4674069220 2026-03-10T07:57:40.029 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:40.028+0000 7f4670ff9700 1 -- 192.168.123.105:0/1716454638 <== mon.1 v2:192.168.123.108:3300/0 4 ==== mgrmap(e 38) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f465c00f8b0 con 0x7f4674069220 2026-03-10T07:57:40.030 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:40.028+0000 7f4670ff9700 1 --2- 192.168.123.105:0/1716454638 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f4660077910 0x7f4660079dd0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:57:40.030 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:40.028+0000 7f4670ff9700 1 -- 192.168.123.105:0/1716454638 <== mon.1 v2:192.168.123.108:3300/0 5 ==== osd_map(84..84 src has 1..84) v4 ==== 6480+0+0 (secure 0 0 0) 0x7f465c09b190 con 0x7f4674069220 2026-03-10T07:57:40.030 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:40.029+0000 7f4672ffd700 1 --2- 192.168.123.105:0/1716454638 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f4660077910 0x7f4660079dd0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:57:40.030 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:40.029+0000 7f4672ffd700 1 --2- 192.168.123.105:0/1716454638 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f4660077910 0x7f4660079dd0 secure :-1 s=READY 
pgs=94 cs=0 l=1 rev1=1 crypto rx=0x7f4674195210 tx=0x7f4664008040 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:57:40.034 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:40.032+0000 7f4670ff9700 1 -- 192.168.123.105:0/1716454638 <== mon.1 v2:192.168.123.108:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f465c063940 con 0x7f4674069220 2026-03-10T07:57:40.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:57:40 vm05.local ceph-mon[130117]: pgmap v200: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail 2026-03-10T07:57:40.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:57:40 vm05.local ceph-mon[130117]: from='client.? 192.168.123.105:0/3247001825' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 14, "format": "json"}]: dispatch 2026-03-10T07:57:40.183 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:40.181+0000 7f4679d55700 1 -- 192.168.123.105:0/1716454638 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "fs dump", "epoch": 15, "format": "json"} v 0) v1 -- 0x7f4654005190 con 0x7f4674069220 2026-03-10T07:57:40.183 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:40.182+0000 7f4670ff9700 1 -- 192.168.123.105:0/1716454638 <== mon.1 v2:192.168.123.108:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump", "epoch": 15, "format": "json"}]=0 dumped fsmap epoch 15 v35) v1 ==== 107+0+4134 (secure 0 0 0) 0x7f465c063090 con 0x7f4674069220 2026-03-10T07:57:40.184 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-10T07:57:40.184 INFO:teuthology.orchestra.run.vm05.stdout:{"epoch":15,"btime":"2026-03-10T07:55:33:889783+0000","default_fscid":1,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate 
object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"feature_flags":{"enable_multiple":true,"ever_enabled_multiple":true},"standbys":[{"gid":14524,"name":"cephfs.vm08.ybmbgd","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.108:6825/3815585988","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.108:6824","nonce":3815585988},{"type":"v1","addr":"192.168.123.108:6825","nonce":3815585988}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"epoch":10},{"gid":24313,"name":"cephfs.vm08.dgsaon","rank":-1,"incarnation":0,"state":"up:standby","state_seq":2,"addr":"192.168.123.108:6827/2963085185","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.108:6826","nonce":2963085185},{"type":"v1","addr":"192.168.123.108:6827","nonce":2963085185}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm 
v2"}},"epoch":11}],"filesystems":[{"mdsmap":{"epoch":15,"flags":18,"flags_state":{"joinable":true,"allow_snaps":true,"allow_multimds_snaps":true,"allow_standby_replay":false,"refuse_client_session":false,"refuse_standby_for_another_fs":false,"balance_automate":false},"ever_allowed_features":0,"explicitly_allowed_features":0,"created":"2026-03-10T07:48:24.309293+0000","modified":"2026-03-10T07:55:33.098647+0000","tableserver":0,"root":0,"session_timeout":60,"session_autoclose":300,"required_client_features":{},"max_file_size":1099511627776,"max_xattr_size":65536,"last_failure":0,"last_failure_osd_epoch":79,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"max_mds":1,"in":[0],"up":{"mds_0":14512},"failed":[],"damaged":[],"stopped":[],"info":{"gid_14512":{"gid":14512,"name":"cephfs.vm05.pavqil","rank":0,"incarnation":14,"state":"up:reconnect","state_seq":108,"addr":"192.168.123.105:6829/427555544","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.105:6828","nonce":427555544},{"type":"v1","addr":"192.168.123.105:6829","nonce":427555544}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm 
v2"}}}},"data_pools":[3],"metadata_pool":2,"enabled":true,"fs_name":"cephfs","balancer":"","bal_rank_mask":"-1","standby_count_wanted":1,"qdb_leader":0,"qdb_cluster":[]},"id":1}]} 2026-03-10T07:57:40.186 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:40.184+0000 7f4679d55700 1 -- 192.168.123.105:0/1716454638 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f4660077910 msgr2=0x7f4660079dd0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:57:40.186 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:40.184+0000 7f4679d55700 1 --2- 192.168.123.105:0/1716454638 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f4660077910 0x7f4660079dd0 secure :-1 s=READY pgs=94 cs=0 l=1 rev1=1 crypto rx=0x7f4674195210 tx=0x7f4664008040 comp rx=0 tx=0).stop 2026-03-10T07:57:40.186 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:40.184+0000 7f4679d55700 1 -- 192.168.123.105:0/1716454638 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f4674069220 msgr2=0x7f4674193a20 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:57:40.186 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:40.184+0000 7f4679d55700 1 --2- 192.168.123.105:0/1716454638 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f4674069220 0x7f4674193a20 secure :-1 s=READY pgs=65 cs=0 l=1 rev1=1 crypto rx=0x7f465c009ad0 tx=0x7f465c0052e0 comp rx=0 tx=0).stop 2026-03-10T07:57:40.186 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:40.185+0000 7f4679d55700 1 -- 192.168.123.105:0/1716454638 shutdown_connections 2026-03-10T07:57:40.186 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:40.185+0000 7f4679d55700 1 --2- 192.168.123.105:0/1716454638 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f4674069220 0x7f4674193a20 unknown :-1 s=CLOSED pgs=65 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:57:40.186 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:40.185+0000 7f4679d55700 1 --2- 192.168.123.105:0/1716454638 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f4660077910 0x7f4660079dd0 unknown :-1 s=CLOSED pgs=94 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:57:40.186 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:40.185+0000 7f4679d55700 1 --2- 192.168.123.105:0/1716454638 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f4674106fc0 0x7f4674193f60 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:57:40.186 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:40.185+0000 7f4679d55700 1 -- 192.168.123.105:0/1716454638 >> 192.168.123.105:0/1716454638 conn(0x7f4674076b60 msgr2=0x7f4674100360 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:57:40.186 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:40.185+0000 7f4679d55700 1 -- 192.168.123.105:0/1716454638 shutdown_connections 2026-03-10T07:57:40.186 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:40.185+0000 7f4679d55700 1 -- 192.168.123.105:0/1716454638 wait complete. 2026-03-10T07:57:40.187 INFO:teuthology.orchestra.run.vm05.stderr:dumped fsmap epoch 15 2026-03-10T07:57:40.247 DEBUG:teuthology.orchestra.run.vm05:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell --fsid 12e9780e-1c55-11f1-8896-79f7c2e9b508 -- ceph fs dump --format=json 16 2026-03-10T07:57:40.387 INFO:teuthology.orchestra.run.vm05.stderr:Inferring config /var/lib/ceph/12e9780e-1c55-11f1-8896-79f7c2e9b508/mon.vm05/config 2026-03-10T07:57:40.418 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:57:40 vm08.local ceph-mon[107898]: pgmap v200: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail 2026-03-10T07:57:40.418 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:57:40 vm08.local ceph-mon[107898]: from='client.? 
192.168.123.105:0/3247001825' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 14, "format": "json"}]: dispatch 2026-03-10T07:57:40.609 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:40.607+0000 7f7ba6f94700 1 -- 192.168.123.105:0/3009214555 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f7ba0103cf0 msgr2=0x7f7ba0107d40 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:57:40.609 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:40.607+0000 7f7ba6f94700 1 --2- 192.168.123.105:0/3009214555 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f7ba0103cf0 0x7f7ba0107d40 secure :-1 s=READY pgs=128 cs=0 l=1 rev1=1 crypto rx=0x7f7b94009b50 tx=0x7f7b94009e60 comp rx=0 tx=0).stop 2026-03-10T07:57:40.609 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:40.608+0000 7f7ba6f94700 1 -- 192.168.123.105:0/3009214555 shutdown_connections 2026-03-10T07:57:40.609 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:40.608+0000 7f7ba6f94700 1 --2- 192.168.123.105:0/3009214555 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f7ba0103cf0 0x7f7ba0107d40 unknown :-1 s=CLOSED pgs=128 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:57:40.609 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:40.608+0000 7f7ba6f94700 1 --2- 192.168.123.105:0/3009214555 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f7ba0103340 0x7f7ba0103720 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:57:40.609 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:40.608+0000 7f7ba6f94700 1 -- 192.168.123.105:0/3009214555 >> 192.168.123.105:0/3009214555 conn(0x7f7ba00feb90 msgr2=0x7f7ba0100fb0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:57:40.609 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:40.608+0000 7f7ba6f94700 1 -- 192.168.123.105:0/3009214555 shutdown_connections 
2026-03-10T07:57:40.609 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:40.608+0000 7f7ba6f94700 1 -- 192.168.123.105:0/3009214555 wait complete. 2026-03-10T07:57:40.610 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:40.609+0000 7f7ba6f94700 1 Processor -- start 2026-03-10T07:57:40.610 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:40.609+0000 7f7ba6f94700 1 -- start start 2026-03-10T07:57:40.610 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:40.609+0000 7f7ba6f94700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f7ba0103340 0x7f7ba0198e50 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:57:40.610 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:40.609+0000 7f7ba6f94700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f7ba0103cf0 0x7f7ba0199390 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:57:40.610 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:40.609+0000 7f7ba6f94700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f7ba0199a70 con 0x7f7ba0103cf0 2026-03-10T07:57:40.610 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:40.609+0000 7f7ba6f94700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f7ba019d800 con 0x7f7ba0103340 2026-03-10T07:57:40.610 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:40.609+0000 7f7ba4d30700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f7ba0103340 0x7f7ba0198e50 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:57:40.610 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:40.609+0000 7f7ba4d30700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f7ba0103340 
0x7f7ba0198e50 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.108:3300/0 says I am v2:192.168.123.105:42922/0 (socket says 192.168.123.105:42922) 2026-03-10T07:57:40.611 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:40.609+0000 7f7ba4d30700 1 -- 192.168.123.105:0/3842060489 learned_addr learned my addr 192.168.123.105:0/3842060489 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T07:57:40.611 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:40.609+0000 7f7b9ffff700 1 --2- 192.168.123.105:0/3842060489 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f7ba0103cf0 0x7f7ba0199390 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:57:40.611 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:40.610+0000 7f7ba4d30700 1 -- 192.168.123.105:0/3842060489 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f7ba0103cf0 msgr2=0x7f7ba0199390 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:57:40.611 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:40.610+0000 7f7ba4d30700 1 --2- 192.168.123.105:0/3842060489 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f7ba0103cf0 0x7f7ba0199390 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:57:40.611 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:40.610+0000 7f7ba4d30700 1 -- 192.168.123.105:0/3842060489 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f7b940097e0 con 0x7f7ba0103340 2026-03-10T07:57:40.611 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:40.610+0000 7f7b9ffff700 1 --2- 192.168.123.105:0/3842060489 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f7ba0103cf0 0x7f7ba0199390 unknown :-1 s=CLOSED 
pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_auth_done state changed! 2026-03-10T07:57:40.611 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:40.610+0000 7f7ba4d30700 1 --2- 192.168.123.105:0/3842060489 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f7ba0103340 0x7f7ba0198e50 secure :-1 s=READY pgs=66 cs=0 l=1 rev1=1 crypto rx=0x7f7b9000d8d0 tx=0x7f7b9000dbe0 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:57:40.611 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:40.610+0000 7f7b9dffb700 1 -- 192.168.123.105:0/3842060489 <== mon.1 v2:192.168.123.108:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f7b90009880 con 0x7f7ba0103340 2026-03-10T07:57:40.612 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:40.610+0000 7f7ba6f94700 1 -- 192.168.123.105:0/3842060489 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f7ba019dae0 con 0x7f7ba0103340 2026-03-10T07:57:40.612 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:40.611+0000 7f7ba6f94700 1 -- 192.168.123.105:0/3842060489 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f7ba019e030 con 0x7f7ba0103340 2026-03-10T07:57:40.613 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:40.611+0000 7f7b9dffb700 1 -- 192.168.123.105:0/3842060489 <== mon.1 v2:192.168.123.108:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7f7b90010460 con 0x7f7ba0103340 2026-03-10T07:57:40.613 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:40.611+0000 7f7b9dffb700 1 -- 192.168.123.105:0/3842060489 <== mon.1 v2:192.168.123.108:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f7b9000f5d0 con 0x7f7ba0103340 2026-03-10T07:57:40.614 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:40.612+0000 7f7b9dffb700 1 -- 192.168.123.105:0/3842060489 <== mon.1 
v2:192.168.123.108:3300/0 4 ==== mgrmap(e 38) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f7b9000f800 con 0x7f7ba0103340 2026-03-10T07:57:40.614 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:40.612+0000 7f7ba6f94700 1 -- 192.168.123.105:0/3842060489 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f7ba010b740 con 0x7f7ba0103340 2026-03-10T07:57:40.614 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:40.612+0000 7f7b9dffb700 1 --2- 192.168.123.105:0/3842060489 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f7b88077870 0x7f7b88079d30 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:57:40.614 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:40.612+0000 7f7b9dffb700 1 -- 192.168.123.105:0/3842060489 <== mon.1 v2:192.168.123.108:3300/0 5 ==== osd_map(84..84 src has 1..84) v4 ==== 6480+0+0 (secure 0 0 0) 0x7f7b90099910 con 0x7f7ba0103340 2026-03-10T07:57:40.614 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:40.613+0000 7f7b9ffff700 1 --2- 192.168.123.105:0/3842060489 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f7b88077870 0x7f7b88079d30 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:57:40.614 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:40.613+0000 7f7b9ffff700 1 --2- 192.168.123.105:0/3842060489 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f7b88077870 0x7f7b88079d30 secure :-1 s=READY pgs=95 cs=0 l=1 rev1=1 crypto rx=0x7f7ba019a470 tx=0x7f7b9400b540 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:57:40.616 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:40.615+0000 7f7b9dffb700 1 -- 192.168.123.105:0/3842060489 <== mon.1 
v2:192.168.123.108:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f7b90062010 con 0x7f7ba0103340 2026-03-10T07:57:40.758 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:40.756+0000 7f7ba6f94700 1 -- 192.168.123.105:0/3842060489 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "fs dump", "epoch": 16, "format": "json"} v 0) v1 -- 0x7f7ba004ea90 con 0x7f7ba0103340 2026-03-10T07:57:40.761 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-10T07:57:40.761 INFO:teuthology.orchestra.run.vm05.stdout:{"epoch":16,"btime":"2026-03-10T07:55:34:891992+0000","default_fscid":1,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"feature_flags":{"enable_multiple":true,"ever_enabled_multiple":true},"standbys":[{"gid":14524,"name":"cephfs.vm08.ybmbgd","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.108:6825/3815585988","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.108:6824","nonce":3815585988},{"type":"v1","addr":"192.168.123.108:6825","nonce":3815585988}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm 
v2"}},"epoch":10},{"gid":24313,"name":"cephfs.vm08.dgsaon","rank":-1,"incarnation":0,"state":"up:standby","state_seq":2,"addr":"192.168.123.108:6827/2963085185","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.108:6826","nonce":2963085185},{"type":"v1","addr":"192.168.123.108:6827","nonce":2963085185}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"epoch":11},{"gid":44255,"name":"cephfs.vm05.omfhnh","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.105:6827/4251520120","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.105:6826","nonce":4251520120},{"type":"v1","addr":"192.168.123.105:6827","nonce":4251520120}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce 
subvolumes"}},"epoch":16}],"filesystems":[{"mdsmap":{"epoch":16,"flags":18,"flags_state":{"joinable":true,"allow_snaps":true,"allow_multimds_snaps":true,"allow_standby_replay":false,"refuse_client_session":false,"refuse_standby_for_another_fs":false,"balance_automate":false},"ever_allowed_features":0,"explicitly_allowed_features":0,"created":"2026-03-10T07:48:24.309293+0000","modified":"2026-03-10T07:55:33.898455+0000","tableserver":0,"root":0,"session_timeout":60,"session_autoclose":300,"required_client_features":{},"max_file_size":1099511627776,"max_xattr_size":65536,"last_failure":0,"last_failure_osd_epoch":79,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"max_mds":1,"in":[0],"up":{"mds_0":14512},"failed":[],"damaged":[],"stopped":[],"info":{"gid_14512":{"gid":14512,"name":"cephfs.vm05.pavqil","rank":0,"incarnation":14,"state":"up:rejoin","state_seq":109,"addr":"192.168.123.105:6829/427555544","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.105:6828","nonce":427555544},{"type":"v1","addr":"192.168.123.105:6829","nonce":427555544}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm 
v2"}}}},"data_pools":[3],"metadata_pool":2,"enabled":true,"fs_name":"cephfs","balancer":"","bal_rank_mask":"-1","standby_count_wanted":1,"qdb_leader":0,"qdb_cluster":[]},"id":1}]} 2026-03-10T07:57:40.761 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:40.759+0000 7f7b9dffb700 1 -- 192.168.123.105:0/3842060489 <== mon.1 v2:192.168.123.108:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump", "epoch": 16, "format": "json"}]=0 dumped fsmap epoch 16 v35) v1 ==== 107+0+4982 (secure 0 0 0) 0x7f7b90061760 con 0x7f7ba0103340 2026-03-10T07:57:40.763 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:40.762+0000 7f7ba6f94700 1 -- 192.168.123.105:0/3842060489 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f7b88077870 msgr2=0x7f7b88079d30 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:57:40.763 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:40.762+0000 7f7ba6f94700 1 --2- 192.168.123.105:0/3842060489 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f7b88077870 0x7f7b88079d30 secure :-1 s=READY pgs=95 cs=0 l=1 rev1=1 crypto rx=0x7f7ba019a470 tx=0x7f7b9400b540 comp rx=0 tx=0).stop 2026-03-10T07:57:40.763 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:40.762+0000 7f7ba6f94700 1 -- 192.168.123.105:0/3842060489 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f7ba0103340 msgr2=0x7f7ba0198e50 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:57:40.763 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:40.762+0000 7f7ba6f94700 1 --2- 192.168.123.105:0/3842060489 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f7ba0103340 0x7f7ba0198e50 secure :-1 s=READY pgs=66 cs=0 l=1 rev1=1 crypto rx=0x7f7b9000d8d0 tx=0x7f7b9000dbe0 comp rx=0 tx=0).stop 2026-03-10T07:57:40.763 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:40.762+0000 7f7ba6f94700 1 -- 192.168.123.105:0/3842060489 
shutdown_connections 2026-03-10T07:57:40.763 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:40.762+0000 7f7ba6f94700 1 --2- 192.168.123.105:0/3842060489 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f7ba0103340 0x7f7ba0198e50 unknown :-1 s=CLOSED pgs=66 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:57:40.763 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:40.762+0000 7f7ba6f94700 1 --2- 192.168.123.105:0/3842060489 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f7b88077870 0x7f7b88079d30 unknown :-1 s=CLOSED pgs=95 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:57:40.763 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:40.762+0000 7f7ba6f94700 1 --2- 192.168.123.105:0/3842060489 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f7ba0103cf0 0x7f7ba0199390 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:57:40.763 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:40.762+0000 7f7ba6f94700 1 -- 192.168.123.105:0/3842060489 >> 192.168.123.105:0/3842060489 conn(0x7f7ba00feb90 msgr2=0x7f7ba0100fa0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:57:40.763 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:40.762+0000 7f7ba6f94700 1 -- 192.168.123.105:0/3842060489 shutdown_connections 2026-03-10T07:57:40.764 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:40.762+0000 7f7ba6f94700 1 -- 192.168.123.105:0/3842060489 wait complete. 
2026-03-10T07:57:40.764 INFO:teuthology.orchestra.run.vm05.stderr:dumped fsmap epoch 16 2026-03-10T07:57:40.833 DEBUG:teuthology.orchestra.run.vm05:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell --fsid 12e9780e-1c55-11f1-8896-79f7c2e9b508 -- ceph fs dump --format=json 17 2026-03-10T07:57:40.977 INFO:teuthology.orchestra.run.vm05.stderr:Inferring config /var/lib/ceph/12e9780e-1c55-11f1-8896-79f7c2e9b508/mon.vm05/config 2026-03-10T07:57:41.247 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:41.245+0000 7f6de12aa700 1 -- 192.168.123.105:0/730755976 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f6ddc103cf0 msgr2=0x7f6ddc107d40 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:57:41.247 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:41.245+0000 7f6de12aa700 1 --2- 192.168.123.105:0/730755976 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f6ddc103cf0 0x7f6ddc107d40 secure :-1 s=READY pgs=129 cs=0 l=1 rev1=1 crypto rx=0x7f6dc4009b00 tx=0x7f6dc4009e10 comp rx=0 tx=0).stop 2026-03-10T07:57:41.247 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:41.246+0000 7f6de12aa700 1 -- 192.168.123.105:0/730755976 shutdown_connections 2026-03-10T07:57:41.247 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:41.246+0000 7f6de12aa700 1 --2- 192.168.123.105:0/730755976 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f6ddc103cf0 0x7f6ddc107d40 unknown :-1 s=CLOSED pgs=129 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:57:41.247 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:41.246+0000 7f6de12aa700 1 --2- 192.168.123.105:0/730755976 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f6ddc103340 0x7f6ddc103720 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:57:41.247 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:41.246+0000 7f6de12aa700 1 -- 
192.168.123.105:0/730755976 >> 192.168.123.105:0/730755976 conn(0x7f6ddc0feb90 msgr2=0x7f6ddc100fb0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:57:41.247 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:41.246+0000 7f6de12aa700 1 -- 192.168.123.105:0/730755976 shutdown_connections 2026-03-10T07:57:41.247 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:41.246+0000 7f6de12aa700 1 -- 192.168.123.105:0/730755976 wait complete. 2026-03-10T07:57:41.248 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:41.247+0000 7f6de12aa700 1 Processor -- start 2026-03-10T07:57:41.248 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:41.247+0000 7f6de12aa700 1 -- start start 2026-03-10T07:57:41.248 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:41.247+0000 7f6de12aa700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f6ddc103340 0x7f6ddc198de0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:57:41.248 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:41.247+0000 7f6de12aa700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f6ddc103cf0 0x7f6ddc199320 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:57:41.248 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:41.247+0000 7f6de12aa700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f6ddc199a00 con 0x7f6ddc103340 2026-03-10T07:57:41.248 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:41.247+0000 7f6de12aa700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f6ddc19d790 con 0x7f6ddc103cf0 2026-03-10T07:57:41.248 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:41.247+0000 7f6dd3fff700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f6ddc103cf0 0x7f6ddc199320 unknown :-1 s=BANNER_CONNECTING 
pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:57:41.248 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:41.247+0000 7f6dd3fff700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f6ddc103cf0 0x7f6ddc199320 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.108:3300/0 says I am v2:192.168.123.105:42946/0 (socket says 192.168.123.105:42946) 2026-03-10T07:57:41.248 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:41.247+0000 7f6dd3fff700 1 -- 192.168.123.105:0/1865211490 learned_addr learned my addr 192.168.123.105:0/1865211490 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T07:57:41.249 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:41.247+0000 7f6ddaffd700 1 --2- 192.168.123.105:0/1865211490 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f6ddc103340 0x7f6ddc198de0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:57:41.249 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:41.247+0000 7f6dd3fff700 1 -- 192.168.123.105:0/1865211490 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f6ddc103340 msgr2=0x7f6ddc198de0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:57:41.249 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:41.247+0000 7f6dd3fff700 1 --2- 192.168.123.105:0/1865211490 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f6ddc103340 0x7f6ddc198de0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:57:41.249 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:41.248+0000 7f6dd3fff700 1 -- 192.168.123.105:0/1865211490 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 
-- 0x7f6dc40097e0 con 0x7f6ddc103cf0 2026-03-10T07:57:41.249 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:41.248+0000 7f6ddaffd700 1 --2- 192.168.123.105:0/1865211490 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f6ddc103340 0x7f6ddc198de0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request state changed! 2026-03-10T07:57:41.249 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:41.248+0000 7f6dd3fff700 1 --2- 192.168.123.105:0/1865211490 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f6ddc103cf0 0x7f6ddc199320 secure :-1 s=READY pgs=67 cs=0 l=1 rev1=1 crypto rx=0x7f6dc4005850 tx=0x7f6dc40048c0 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:57:41.249 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:41.248+0000 7f6dd8ff9700 1 -- 192.168.123.105:0/1865211490 <== mon.1 v2:192.168.123.108:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f6dc401d070 con 0x7f6ddc103cf0 2026-03-10T07:57:41.249 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:41.248+0000 7f6de12aa700 1 -- 192.168.123.105:0/1865211490 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f6ddc19da10 con 0x7f6ddc103cf0 2026-03-10T07:57:41.249 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:41.248+0000 7f6de12aa700 1 -- 192.168.123.105:0/1865211490 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f6ddc19df00 con 0x7f6ddc103cf0 2026-03-10T07:57:41.250 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:41.249+0000 7f6dd8ff9700 1 -- 192.168.123.105:0/1865211490 <== mon.1 v2:192.168.123.108:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7f6dc4022470 con 0x7f6ddc103cf0 2026-03-10T07:57:41.250 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:41.249+0000 7f6de12aa700 1 -- 
192.168.123.105:0/1865211490 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f6ddc10b740 con 0x7f6ddc103cf0 2026-03-10T07:57:41.250 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:41.249+0000 7f6dd8ff9700 1 -- 192.168.123.105:0/1865211490 <== mon.1 v2:192.168.123.108:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f6dc400f670 con 0x7f6ddc103cf0 2026-03-10T07:57:41.251 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:41.250+0000 7f6dd8ff9700 1 -- 192.168.123.105:0/1865211490 <== mon.1 v2:192.168.123.108:3300/0 4 ==== mgrmap(e 38) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f6dc40225e0 con 0x7f6ddc103cf0 2026-03-10T07:57:41.251 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:41.250+0000 7f6dd8ff9700 1 --2- 192.168.123.105:0/1865211490 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f6dbc077910 0x7f6dbc079dd0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:57:41.251 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:41.250+0000 7f6dd8ff9700 1 -- 192.168.123.105:0/1865211490 <== mon.1 v2:192.168.123.108:3300/0 5 ==== osd_map(84..84 src has 1..84) v4 ==== 6480+0+0 (secure 0 0 0) 0x7f6dc409b170 con 0x7f6ddc103cf0 2026-03-10T07:57:41.252 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:41.250+0000 7f6ddaffd700 1 --2- 192.168.123.105:0/1865211490 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f6dbc077910 0x7f6dbc079dd0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:57:41.252 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:41.251+0000 7f6ddaffd700 1 --2- 192.168.123.105:0/1865211490 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f6dbc077910 0x7f6dbc079dd0 secure :-1 s=READY 
pgs=96 cs=0 l=1 rev1=1 crypto rx=0x7f6dcc005fd0 tx=0x7f6dcc005ee0 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:57:41.253 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:41.252+0000 7f6dd8ff9700 1 -- 192.168.123.105:0/1865211490 <== mon.1 v2:192.168.123.108:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f6dc40639a0 con 0x7f6ddc103cf0 2026-03-10T07:57:41.395 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:57:41 vm05.local ceph-mon[130117]: from='client.? 192.168.123.105:0/1716454638' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 15, "format": "json"}]: dispatch 2026-03-10T07:57:41.395 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:57:41 vm05.local ceph-mon[130117]: from='client.? 192.168.123.105:0/3842060489' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 16, "format": "json"}]: dispatch 2026-03-10T07:57:41.395 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:41.393+0000 7f6de12aa700 1 -- 192.168.123.105:0/1865211490 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "fs dump", "epoch": 17, "format": "json"} v 0) v1 -- 0x7f6ddc04ea90 con 0x7f6ddc103cf0 2026-03-10T07:57:41.395 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:41.394+0000 7f6dd8ff9700 1 -- 192.168.123.105:0/1865211490 <== mon.1 v2:192.168.123.108:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump", "epoch": 17, "format": "json"}]=0 dumped fsmap epoch 17 v35) v1 ==== 107+0+4991 (secure 0 0 0) 0x7f6dc40630f0 con 0x7f6ddc103cf0 2026-03-10T07:57:41.397 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-10T07:57:41.397 INFO:teuthology.orchestra.run.vm05.stdout:{"epoch":17,"btime":"2026-03-10T07:55:35:900232+0000","default_fscid":1,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on 
dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"feature_flags":{"enable_multiple":true,"ever_enabled_multiple":true},"standbys":[{"gid":14524,"name":"cephfs.vm08.ybmbgd","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.108:6825/3815585988","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.108:6824","nonce":3815585988},{"type":"v1","addr":"192.168.123.108:6825","nonce":3815585988}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"epoch":10},{"gid":24313,"name":"cephfs.vm08.dgsaon","rank":-1,"incarnation":0,"state":"up:standby","state_seq":2,"addr":"192.168.123.108:6827/2963085185","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.108:6826","nonce":2963085185},{"type":"v1","addr":"192.168.123.108:6827","nonce":2963085185}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm 
v2"}},"epoch":11},{"gid":44255,"name":"cephfs.vm05.omfhnh","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.105:6827/4251520120","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.105:6826","nonce":4251520120},{"type":"v1","addr":"192.168.123.105:6827","nonce":4251520120}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce subvolumes"}},"epoch":16}],"filesystems":[{"mdsmap":{"epoch":17,"flags":18,"flags_state":{"joinable":true,"allow_snaps":true,"allow_multimds_snaps":true,"allow_standby_replay":false,"refuse_client_session":false,"refuse_standby_for_another_fs":false,"balance_automate":false},"ever_allowed_features":0,"explicitly_allowed_features":0,"created":"2026-03-10T07:48:24.309293+0000","modified":"2026-03-10T07:55:35.900231+0000","tableserver":0,"root":0,"session_timeout":60,"session_autoclose":300,"required_client_features":{},"max_file_size":1099511627776,"max_xattr_size":65536,"last_failure":0,"last_failure_osd_epoch":79,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm 
v2"}},"max_mds":1,"in":[0],"up":{"mds_0":14512},"failed":[],"damaged":[],"stopped":[],"info":{"gid_14512":{"gid":14512,"name":"cephfs.vm05.pavqil","rank":0,"incarnation":14,"state":"up:active","state_seq":110,"addr":"192.168.123.105:6829/427555544","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.105:6828","nonce":427555544},{"type":"v1","addr":"192.168.123.105:6829","nonce":427555544}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}}}},"data_pools":[3],"metadata_pool":2,"enabled":true,"fs_name":"cephfs","balancer":"","bal_rank_mask":"-1","standby_count_wanted":1,"qdb_leader":14512,"qdb_cluster":[14512]},"id":1}]} 2026-03-10T07:57:41.399 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:41.398+0000 7f6de12aa700 1 -- 192.168.123.105:0/1865211490 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f6dbc077910 msgr2=0x7f6dbc079dd0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:57:41.399 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:41.398+0000 7f6de12aa700 1 --2- 192.168.123.105:0/1865211490 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f6dbc077910 0x7f6dbc079dd0 secure :-1 s=READY pgs=96 cs=0 l=1 rev1=1 crypto rx=0x7f6dcc005fd0 tx=0x7f6dcc005ee0 comp rx=0 tx=0).stop 2026-03-10T07:57:41.400 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:41.398+0000 7f6de12aa700 1 -- 192.168.123.105:0/1865211490 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f6ddc103cf0 msgr2=0x7f6ddc199320 secure :-1 
s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:57:41.400 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:41.398+0000 7f6de12aa700 1 --2- 192.168.123.105:0/1865211490 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f6ddc103cf0 0x7f6ddc199320 secure :-1 s=READY pgs=67 cs=0 l=1 rev1=1 crypto rx=0x7f6dc4005850 tx=0x7f6dc40048c0 comp rx=0 tx=0).stop 2026-03-10T07:57:41.400 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:41.398+0000 7f6de12aa700 1 -- 192.168.123.105:0/1865211490 shutdown_connections 2026-03-10T07:57:41.400 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:41.398+0000 7f6de12aa700 1 --2- 192.168.123.105:0/1865211490 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f6dbc077910 0x7f6dbc079dd0 secure :-1 s=CLOSED pgs=96 cs=0 l=1 rev1=1 crypto rx=0x7f6dcc005fd0 tx=0x7f6dcc005ee0 comp rx=0 tx=0).stop 2026-03-10T07:57:41.400 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:41.398+0000 7f6de12aa700 1 --2- 192.168.123.105:0/1865211490 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f6ddc103340 0x7f6ddc198de0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:57:41.400 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:41.398+0000 7f6de12aa700 1 --2- 192.168.123.105:0/1865211490 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f6ddc103cf0 0x7f6ddc199320 secure :-1 s=CLOSED pgs=67 cs=0 l=1 rev1=1 crypto rx=0x7f6dc4005850 tx=0x7f6dc40048c0 comp rx=0 tx=0).stop 2026-03-10T07:57:41.400 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:41.398+0000 7f6de12aa700 1 -- 192.168.123.105:0/1865211490 >> 192.168.123.105:0/1865211490 conn(0x7f6ddc0feb90 msgr2=0x7f6ddc100f40 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:57:41.400 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:41.399+0000 7f6de12aa700 1 -- 192.168.123.105:0/1865211490 shutdown_connections 
2026-03-10T07:57:41.401 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:41.399+0000 7f6de12aa700 1 -- 192.168.123.105:0/1865211490 wait complete. 2026-03-10T07:57:41.401 INFO:teuthology.orchestra.run.vm05.stderr:dumped fsmap epoch 17 2026-03-10T07:57:41.418 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:57:41 vm08.local ceph-mon[107898]: from='client.? 192.168.123.105:0/1716454638' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 15, "format": "json"}]: dispatch 2026-03-10T07:57:41.418 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:57:41 vm08.local ceph-mon[107898]: from='client.? 192.168.123.105:0/3842060489' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 16, "format": "json"}]: dispatch 2026-03-10T07:57:41.463 DEBUG:teuthology.orchestra.run.vm05:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell --fsid 12e9780e-1c55-11f1-8896-79f7c2e9b508 -- ceph fs dump --format=json 18 2026-03-10T07:57:41.615 INFO:teuthology.orchestra.run.vm05.stderr:Inferring config /var/lib/ceph/12e9780e-1c55-11f1-8896-79f7c2e9b508/mon.vm05/config 2026-03-10T07:57:41.871 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:41.869+0000 7fd71c0e7700 1 -- 192.168.123.105:0/867715698 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fd714103cf0 msgr2=0x7fd714107d40 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:57:41.871 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:41.869+0000 7fd71c0e7700 1 --2- 192.168.123.105:0/867715698 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fd714103cf0 0x7fd714107d40 secure :-1 s=READY pgs=130 cs=0 l=1 rev1=1 crypto rx=0x7fd710009b00 tx=0x7fd710009e10 comp rx=0 tx=0).stop 2026-03-10T07:57:41.871 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:41.870+0000 7fd71c0e7700 1 -- 192.168.123.105:0/867715698 shutdown_connections 2026-03-10T07:57:41.871 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:41.870+0000 
7fd71c0e7700 1 --2- 192.168.123.105:0/867715698 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fd714103cf0 0x7fd714107d40 unknown :-1 s=CLOSED pgs=130 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:57:41.871 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:41.870+0000 7fd71c0e7700 1 --2- 192.168.123.105:0/867715698 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fd714103340 0x7fd714103720 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:57:41.871 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:41.870+0000 7fd71c0e7700 1 -- 192.168.123.105:0/867715698 >> 192.168.123.105:0/867715698 conn(0x7fd7140feb90 msgr2=0x7fd714100fb0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:57:41.871 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:41.870+0000 7fd71c0e7700 1 -- 192.168.123.105:0/867715698 shutdown_connections 2026-03-10T07:57:41.871 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:41.870+0000 7fd71c0e7700 1 -- 192.168.123.105:0/867715698 wait complete. 
2026-03-10T07:57:41.872 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:41.871+0000 7fd71c0e7700 1 Processor -- start 2026-03-10T07:57:41.872 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:41.871+0000 7fd71c0e7700 1 -- start start 2026-03-10T07:57:41.872 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:41.871+0000 7fd71c0e7700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fd714103340 0x7fd714198eb0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:57:41.873 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:41.871+0000 7fd71c0e7700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fd714103cf0 0x7fd7141993f0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:57:41.873 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:41.871+0000 7fd71c0e7700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fd714199ad0 con 0x7fd714103cf0 2026-03-10T07:57:41.873 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:41.871+0000 7fd71c0e7700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fd71419d860 con 0x7fd714103340 2026-03-10T07:57:41.873 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:41.871+0000 7fd719682700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fd714103cf0 0x7fd7141993f0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:57:41.873 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:41.871+0000 7fd719682700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fd714103cf0 0x7fd7141993f0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am 
v2:192.168.123.105:46220/0 (socket says 192.168.123.105:46220) 2026-03-10T07:57:41.873 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:41.871+0000 7fd719682700 1 -- 192.168.123.105:0/2731042778 learned_addr learned my addr 192.168.123.105:0/2731042778 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T07:57:41.873 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:41.872+0000 7fd719682700 1 -- 192.168.123.105:0/2731042778 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fd714103340 msgr2=0x7fd714198eb0 unknown :-1 s=STATE_CONNECTING l=1).mark_down 2026-03-10T07:57:41.873 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:41.872+0000 7fd719e83700 1 --2- 192.168.123.105:0/2731042778 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fd714103340 0x7fd714198eb0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:57:41.874 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:41.872+0000 7fd719682700 1 --2- 192.168.123.105:0/2731042778 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fd714103340 0x7fd714198eb0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:57:41.874 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:41.872+0000 7fd719682700 1 -- 192.168.123.105:0/2731042778 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fd7100097e0 con 0x7fd714103cf0 2026-03-10T07:57:41.874 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:41.872+0000 7fd719e83700 1 --2- 192.168.123.105:0/2731042778 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fd714103340 0x7fd714198eb0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request state changed! 
2026-03-10T07:57:41.874 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:41.872+0000 7fd719682700 1 --2- 192.168.123.105:0/2731042778 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fd714103cf0 0x7fd7141993f0 secure :-1 s=READY pgs=131 cs=0 l=1 rev1=1 crypto rx=0x7fd710005f50 tx=0x7fd710004970 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:57:41.874 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:41.872+0000 7fd70affd700 1 -- 192.168.123.105:0/2731042778 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fd71001d070 con 0x7fd714103cf0 2026-03-10T07:57:41.874 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:41.873+0000 7fd71c0e7700 1 -- 192.168.123.105:0/2731042778 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fd71419dae0 con 0x7fd714103cf0 2026-03-10T07:57:41.874 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:41.873+0000 7fd71c0e7700 1 -- 192.168.123.105:0/2731042778 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fd71419dfd0 con 0x7fd714103cf0 2026-03-10T07:57:41.875 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:41.873+0000 7fd70affd700 1 -- 192.168.123.105:0/2731042778 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7fd71000bc50 con 0x7fd714103cf0 2026-03-10T07:57:41.878 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:41.873+0000 7fd71c0e7700 1 -- 192.168.123.105:0/2731042778 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fd71410b690 con 0x7fd714103cf0 2026-03-10T07:57:41.878 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:41.875+0000 7fd70affd700 1 -- 192.168.123.105:0/2731042778 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 
(secure 0 0 0) 0x7fd71000f700 con 0x7fd714103cf0 2026-03-10T07:57:41.878 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:41.875+0000 7fd70affd700 1 -- 192.168.123.105:0/2731042778 <== mon.0 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 38) v1 ==== 100115+0+0 (secure 0 0 0) 0x7fd71000f8a0 con 0x7fd714103cf0 2026-03-10T07:57:41.878 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:41.875+0000 7fd70affd700 1 --2- 192.168.123.105:0/2731042778 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7fd700077730 0x7fd700079bf0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:57:41.878 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:41.875+0000 7fd70affd700 1 -- 192.168.123.105:0/2731042778 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(84..84 src has 1..84) v4 ==== 6480+0+0 (secure 0 0 0) 0x7fd71009c320 con 0x7fd714103cf0 2026-03-10T07:57:41.878 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:41.877+0000 7fd70affd700 1 -- 192.168.123.105:0/2731042778 <== mon.0 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7fd710064b50 con 0x7fd714103cf0 2026-03-10T07:57:41.879 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:41.878+0000 7fd719e83700 1 --2- 192.168.123.105:0/2731042778 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7fd700077730 0x7fd700079bf0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:57:41.879 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:41.878+0000 7fd719e83700 1 --2- 192.168.123.105:0/2731042778 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7fd700077730 0x7fd700079bf0 secure :-1 s=READY pgs=97 cs=0 l=1 rev1=1 crypto rx=0x7fd704007950 tx=0x7fd704008040 comp rx=0 tx=0).ready 
entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:57:42.015 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:42.013+0000 7fd71c0e7700 1 -- 192.168.123.105:0/2731042778 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "fs dump", "epoch": 18, "format": "json"} v 0) v1 -- 0x7fd71404ea90 con 0x7fd714103cf0 2026-03-10T07:57:42.016 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:42.015+0000 7fd70affd700 1 -- 192.168.123.105:0/2731042778 <== mon.0 v2:192.168.123.105:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump", "epoch": 18, "format": "json"}]=0 dumped fsmap epoch 18 v35) v1 ==== 107+0+4189 (secure 0 0 0) 0x7fd7100642a0 con 0x7fd714103cf0 2026-03-10T07:57:42.016 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-10T07:57:42.016 INFO:teuthology.orchestra.run.vm05.stdout:{"epoch":18,"btime":"2026-03-10T07:55:37:451725+0000","default_fscid":1,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"feature_flags":{"enable_multiple":true,"ever_enabled_multiple":true},"standbys":[{"gid":14524,"name":"cephfs.vm08.ybmbgd","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.108:6825/3815585988","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.108:6824","nonce":3815585988},{"type":"v1","addr":"192.168.123.108:6825","nonce":3815585988}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned 
encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"epoch":10},{"gid":24313,"name":"cephfs.vm08.dgsaon","rank":-1,"incarnation":0,"state":"up:standby","state_seq":2,"addr":"192.168.123.108:6827/2963085185","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.108:6826","nonce":2963085185},{"type":"v1","addr":"192.168.123.108:6827","nonce":2963085185}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"epoch":11},{"gid":44255,"name":"cephfs.vm05.omfhnh","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.105:6827/4251520120","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.105:6826","nonce":4251520120},{"type":"v1","addr":"192.168.123.105:6827","nonce":4251520120}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce 
subvolumes"}},"epoch":16}],"filesystems":[{"mdsmap":{"epoch":18,"flags":18,"flags_state":{"joinable":true,"allow_snaps":true,"allow_multimds_snaps":true,"allow_standby_replay":false,"refuse_client_session":false,"refuse_standby_for_another_fs":false,"balance_automate":false},"ever_allowed_features":0,"explicitly_allowed_features":0,"created":"2026-03-10T07:48:24.309293+0000","modified":"2026-03-10T07:55:37.451723+0000","tableserver":0,"root":0,"session_timeout":60,"session_autoclose":300,"required_client_features":{},"max_file_size":1099511627776,"max_xattr_size":65536,"last_failure":0,"last_failure_osd_epoch":82,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"max_mds":1,"in":[0],"up":{},"failed":[0],"damaged":[],"stopped":[],"info":{},"data_pools":[3],"metadata_pool":2,"enabled":true,"fs_name":"cephfs","balancer":"","bal_rank_mask":"-1","standby_count_wanted":1,"qdb_leader":0,"qdb_cluster":[]},"id":1}]} 2026-03-10T07:57:42.018 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:42.017+0000 7fd71c0e7700 1 -- 192.168.123.105:0/2731042778 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7fd700077730 msgr2=0x7fd700079bf0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:57:42.018 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:42.017+0000 7fd71c0e7700 1 --2- 192.168.123.105:0/2731042778 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7fd700077730 0x7fd700079bf0 secure :-1 s=READY pgs=97 cs=0 l=1 rev1=1 crypto rx=0x7fd704007950 tx=0x7fd704008040 comp rx=0 tx=0).stop 2026-03-10T07:57:42.019 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:42.017+0000 7fd71c0e7700 1 -- 192.168.123.105:0/2731042778 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fd714103cf0 msgr2=0x7fd7141993f0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:57:42.019 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:42.017+0000 7fd71c0e7700 1 --2- 192.168.123.105:0/2731042778 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fd714103cf0 0x7fd7141993f0 secure :-1 s=READY pgs=131 cs=0 l=1 rev1=1 crypto rx=0x7fd710005f50 tx=0x7fd710004970 comp rx=0 tx=0).stop 2026-03-10T07:57:42.019 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:42.018+0000 7fd71c0e7700 1 -- 192.168.123.105:0/2731042778 shutdown_connections 2026-03-10T07:57:42.019 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:42.018+0000 7fd71c0e7700 1 --2- 192.168.123.105:0/2731042778 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fd714103340 0x7fd714198eb0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:57:42.019 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:42.018+0000 7fd71c0e7700 1 --2- 192.168.123.105:0/2731042778 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7fd700077730 0x7fd700079bf0 unknown :-1 s=CLOSED pgs=97 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:57:42.019 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:42.018+0000 7fd71c0e7700 1 --2- 192.168.123.105:0/2731042778 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fd714103cf0 0x7fd7141993f0 unknown :-1 s=CLOSED pgs=131 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:57:42.019 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:42.018+0000 7fd71c0e7700 1 -- 192.168.123.105:0/2731042778 >> 192.168.123.105:0/2731042778 conn(0x7fd7140feb90 msgr2=0x7fd7141001b0 unknown :-1 s=STATE_NONE 
l=0).mark_down 2026-03-10T07:57:42.019 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:42.018+0000 7fd71c0e7700 1 -- 192.168.123.105:0/2731042778 shutdown_connections 2026-03-10T07:57:42.019 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:42.018+0000 7fd71c0e7700 1 -- 192.168.123.105:0/2731042778 wait complete. 2026-03-10T07:57:42.020 INFO:teuthology.orchestra.run.vm05.stderr:dumped fsmap epoch 18 2026-03-10T07:57:42.081 DEBUG:teuthology.orchestra.run.vm05:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell --fsid 12e9780e-1c55-11f1-8896-79f7c2e9b508 -- ceph fs dump --format=json 19 2026-03-10T07:57:42.221 INFO:teuthology.orchestra.run.vm05.stderr:Inferring config /var/lib/ceph/12e9780e-1c55-11f1-8896-79f7c2e9b508/mon.vm05/config 2026-03-10T07:57:42.277 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:57:42 vm05.local ceph-mon[130117]: pgmap v201: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail 2026-03-10T07:57:42.277 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:57:42 vm05.local ceph-mon[130117]: from='client.? 192.168.123.105:0/1865211490' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 17, "format": "json"}]: dispatch 2026-03-10T07:57:42.277 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:57:42 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T07:57:42.277 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:57:42 vm05.local ceph-mon[130117]: from='client.? 
192.168.123.105:0/2731042778' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 18, "format": "json"}]: dispatch 2026-03-10T07:57:42.418 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:57:42 vm08.local ceph-mon[107898]: pgmap v201: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail 2026-03-10T07:57:42.418 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:57:42 vm08.local ceph-mon[107898]: from='client.? 192.168.123.105:0/1865211490' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 17, "format": "json"}]: dispatch 2026-03-10T07:57:42.418 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:57:42 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T07:57:42.418 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:57:42 vm08.local ceph-mon[107898]: from='client.? 192.168.123.105:0/2731042778' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 18, "format": "json"}]: dispatch 2026-03-10T07:57:42.496 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:42.495+0000 7f60ee619700 1 -- 192.168.123.105:0/997261152 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f60e8103340 msgr2=0x7f60e8103720 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:57:42.496 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:42.495+0000 7f60ee619700 1 --2- 192.168.123.105:0/997261152 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f60e8103340 0x7f60e8103720 secure :-1 s=READY pgs=68 cs=0 l=1 rev1=1 crypto rx=0x7f60d0009a60 tx=0x7f60d0009d70 comp rx=0 tx=0).stop 2026-03-10T07:57:42.497 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:42.495+0000 7f60ee619700 1 -- 192.168.123.105:0/997261152 shutdown_connections 2026-03-10T07:57:42.497 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:42.495+0000 7f60ee619700 1 --2- 
192.168.123.105:0/997261152 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f60e8103cf0 0x7f60e8107d40 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:57:42.497 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:42.495+0000 7f60ee619700 1 --2- 192.168.123.105:0/997261152 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f60e8103340 0x7f60e8103720 unknown :-1 s=CLOSED pgs=68 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:57:42.497 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:42.495+0000 7f60ee619700 1 -- 192.168.123.105:0/997261152 >> 192.168.123.105:0/997261152 conn(0x7f60e80feb90 msgr2=0x7f60e8100fb0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:57:42.497 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:42.496+0000 7f60ee619700 1 -- 192.168.123.105:0/997261152 shutdown_connections 2026-03-10T07:57:42.497 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:42.496+0000 7f60ee619700 1 -- 192.168.123.105:0/997261152 wait complete. 
2026-03-10T07:57:42.497 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:42.496+0000 7f60ee619700 1 Processor -- start 2026-03-10T07:57:42.497 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:42.496+0000 7f60ee619700 1 -- start start 2026-03-10T07:57:42.497 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:42.496+0000 7f60ee619700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f60e8103340 0x7f60e8198d70 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:57:42.498 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:42.496+0000 7f60ee619700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f60e8103cf0 0x7f60e81992b0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:57:42.498 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:42.496+0000 7f60ee619700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f60e8199990 con 0x7f60e8103340 2026-03-10T07:57:42.498 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:42.496+0000 7f60ee619700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f60e819d720 con 0x7f60e8103cf0 2026-03-10T07:57:42.498 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:42.497+0000 7f60e77fe700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f60e8103cf0 0x7f60e81992b0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:57:42.498 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:42.497+0000 7f60e77fe700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f60e8103cf0 0x7f60e81992b0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.108:3300/0 says I am 
v2:192.168.123.105:42968/0 (socket says 192.168.123.105:42968) 2026-03-10T07:57:42.498 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:42.497+0000 7f60e77fe700 1 -- 192.168.123.105:0/3346508295 learned_addr learned my addr 192.168.123.105:0/3346508295 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T07:57:42.498 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:42.497+0000 7f60e7fff700 1 --2- 192.168.123.105:0/3346508295 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f60e8103340 0x7f60e8198d70 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:57:42.498 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:42.497+0000 7f60e7fff700 1 -- 192.168.123.105:0/3346508295 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f60e8103cf0 msgr2=0x7f60e81992b0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:57:42.498 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:42.497+0000 7f60e7fff700 1 --2- 192.168.123.105:0/3346508295 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f60e8103cf0 0x7f60e81992b0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:57:42.498 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:42.497+0000 7f60e7fff700 1 -- 192.168.123.105:0/3346508295 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f60d80097e0 con 0x7f60e8103340 2026-03-10T07:57:42.498 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:42.497+0000 7f60e77fe700 1 --2- 192.168.123.105:0/3346508295 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f60e8103cf0 0x7f60e81992b0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_auth_done state changed! 
2026-03-10T07:57:42.499 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:42.497+0000 7f60e7fff700 1 --2- 192.168.123.105:0/3346508295 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f60e8103340 0x7f60e8198d70 secure :-1 s=READY pgs=132 cs=0 l=1 rev1=1 crypto rx=0x7f60d0009a60 tx=0x7f60d000f690 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:57:42.500 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:42.498+0000 7f60e57fa700 1 -- 192.168.123.105:0/3346508295 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f60d001d070 con 0x7f60e8103340 2026-03-10T07:57:42.500 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:42.498+0000 7f60e57fa700 1 -- 192.168.123.105:0/3346508295 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7f60d000fc30 con 0x7f60e8103340 2026-03-10T07:57:42.500 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:42.498+0000 7f60e57fa700 1 -- 192.168.123.105:0/3346508295 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f60d00175e0 con 0x7f60e8103340 2026-03-10T07:57:42.500 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:42.498+0000 7f60ee619700 1 -- 192.168.123.105:0/3346508295 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f60d0009710 con 0x7f60e8103340 2026-03-10T07:57:42.500 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:42.498+0000 7f60ee619700 1 -- 192.168.123.105:0/3346508295 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f60e819dd00 con 0x7f60e8103340 2026-03-10T07:57:42.501 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:42.499+0000 7f60e57fa700 1 -- 192.168.123.105:0/3346508295 <== mon.0 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 38) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f60d0017740 con 
0x7f60e8103340 2026-03-10T07:57:42.501 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:42.500+0000 7f60e57fa700 1 --2- 192.168.123.105:0/3346508295 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f60d4077910 0x7f60d4079dd0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:57:42.501 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:42.500+0000 7f60e77fe700 1 --2- 192.168.123.105:0/3346508295 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f60d4077910 0x7f60d4079dd0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:57:42.502 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:42.501+0000 7f60ee619700 1 -- 192.168.123.105:0/3346508295 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f60e810b690 con 0x7f60e8103340 2026-03-10T07:57:42.502 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:42.501+0000 7f60e57fa700 1 -- 192.168.123.105:0/3346508295 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(84..84 src has 1..84) v4 ==== 6480+0+0 (secure 0 0 0) 0x7f60d009c060 con 0x7f60e8103340 2026-03-10T07:57:42.502 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:42.501+0000 7f60e77fe700 1 --2- 192.168.123.105:0/3346508295 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f60d4077910 0x7f60d4079dd0 secure :-1 s=READY pgs=98 cs=0 l=1 rev1=1 crypto rx=0x7f60e819a390 tx=0x7f60d8009500 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:57:42.505 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:42.504+0000 7f60e57fa700 1 -- 192.168.123.105:0/3346508295 <== mon.0 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 
==== 72+0+195034 (secure 0 0 0) 0x7f60d0064810 con 0x7f60e8103340 2026-03-10T07:57:42.644 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:42.642+0000 7f60ee619700 1 -- 192.168.123.105:0/3346508295 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "fs dump", "epoch": 19, "format": "json"} v 0) v1 -- 0x7f60e804ea90 con 0x7f60e8103340 2026-03-10T07:57:42.647 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:42.646+0000 7f60e57fa700 1 -- 192.168.123.105:0/3346508295 <== mon.0 v2:192.168.123.105:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump", "epoch": 19, "format": "json"}]=0 dumped fsmap epoch 19 v35) v1 ==== 107+0+4200 (secure 0 0 0) 0x7f60d0063f60 con 0x7f60e8103340 2026-03-10T07:57:42.647 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-10T07:57:42.647 INFO:teuthology.orchestra.run.vm05.stdout:{"epoch":19,"btime":"2026-03-10T07:55:37:457455+0000","default_fscid":1,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"feature_flags":{"enable_multiple":true,"ever_enabled_multiple":true},"standbys":[{"gid":24313,"name":"cephfs.vm08.dgsaon","rank":-1,"incarnation":0,"state":"up:standby","state_seq":2,"addr":"192.168.123.108:6827/2963085185","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.108:6826","nonce":2963085185},{"type":"v1","addr":"192.168.123.108:6827","nonce":2963085185}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned 
encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"epoch":11},{"gid":44255,"name":"cephfs.vm05.omfhnh","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.105:6827/4251520120","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.105:6826","nonce":4251520120},{"type":"v1","addr":"192.168.123.105:6827","nonce":4251520120}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce subvolumes"}},"epoch":16}],"filesystems":[{"mdsmap":{"epoch":19,"flags":18,"flags_state":{"joinable":true,"allow_snaps":true,"allow_multimds_snaps":true,"allow_standby_replay":false,"refuse_client_session":false,"refuse_standby_for_another_fs":false,"balance_automate":false},"ever_allowed_features":0,"explicitly_allowed_features":0,"created":"2026-03-10T07:48:24.309293+0000","modified":"2026-03-10T07:55:37.457132+0000","tableserver":0,"root":0,"session_timeout":60,"session_autoclose":300,"required_client_features":{},"max_file_size":1099511627776,"max_xattr_size":65536,"last_failure":0,"last_failure_osd_epoch":82,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file 
layout v2","feature_10":"snaprealm v2"}},"max_mds":1,"in":[0],"up":{"mds_0":14524},"failed":[],"damaged":[],"stopped":[],"info":{"gid_14524":{"gid":14524,"name":"cephfs.vm08.ybmbgd","rank":0,"incarnation":19,"state":"up:replay","state_seq":1,"addr":"192.168.123.108:6825/3815585988","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.108:6824","nonce":3815585988},{"type":"v1","addr":"192.168.123.108:6825","nonce":3815585988}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}}}},"data_pools":[3],"metadata_pool":2,"enabled":true,"fs_name":"cephfs","balancer":"","bal_rank_mask":"-1","standby_count_wanted":1,"qdb_leader":0,"qdb_cluster":[]},"id":1}]} 2026-03-10T07:57:42.649 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:42.648+0000 7f60ee619700 1 -- 192.168.123.105:0/3346508295 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f60d4077910 msgr2=0x7f60d4079dd0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:57:42.650 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:42.648+0000 7f60ee619700 1 --2- 192.168.123.105:0/3346508295 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f60d4077910 0x7f60d4079dd0 secure :-1 s=READY pgs=98 cs=0 l=1 rev1=1 crypto rx=0x7f60e819a390 tx=0x7f60d8009500 comp rx=0 tx=0).stop 2026-03-10T07:57:42.650 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:42.649+0000 7f60ee619700 1 -- 192.168.123.105:0/3346508295 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f60e8103340 msgr2=0x7f60e8198d70 
secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:57:42.650 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:42.649+0000 7f60ee619700 1 --2- 192.168.123.105:0/3346508295 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f60e8103340 0x7f60e8198d70 secure :-1 s=READY pgs=132 cs=0 l=1 rev1=1 crypto rx=0x7f60d0009a60 tx=0x7f60d000f690 comp rx=0 tx=0).stop 2026-03-10T07:57:42.650 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:42.649+0000 7f60ee619700 1 -- 192.168.123.105:0/3346508295 shutdown_connections 2026-03-10T07:57:42.650 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:42.649+0000 7f60ee619700 1 --2- 192.168.123.105:0/3346508295 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f60d4077910 0x7f60d4079dd0 unknown :-1 s=CLOSED pgs=98 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:57:42.650 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:42.649+0000 7f60ee619700 1 --2- 192.168.123.105:0/3346508295 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f60e8103340 0x7f60e8198d70 unknown :-1 s=CLOSED pgs=132 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:57:42.650 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:42.649+0000 7f60ee619700 1 --2- 192.168.123.105:0/3346508295 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f60e8103cf0 0x7f60e81992b0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:57:42.650 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:42.649+0000 7f60ee619700 1 -- 192.168.123.105:0/3346508295 >> 192.168.123.105:0/3346508295 conn(0x7f60e80feb90 msgr2=0x7f60e8100f30 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:57:42.651 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:42.649+0000 7f60ee619700 1 -- 192.168.123.105:0/3346508295 shutdown_connections 2026-03-10T07:57:42.651 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:42.650+0000 7f60ee619700 1 -- 192.168.123.105:0/3346508295 wait complete. 2026-03-10T07:57:42.652 INFO:teuthology.orchestra.run.vm05.stderr:dumped fsmap epoch 19 2026-03-10T07:57:42.713 DEBUG:teuthology.orchestra.run.vm05:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell --fsid 12e9780e-1c55-11f1-8896-79f7c2e9b508 -- ceph fs dump --format=json 20 2026-03-10T07:57:42.857 INFO:teuthology.orchestra.run.vm05.stderr:Inferring config /var/lib/ceph/12e9780e-1c55-11f1-8896-79f7c2e9b508/mon.vm05/config 2026-03-10T07:57:43.107 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:43.106+0000 7f5625e85700 1 -- 192.168.123.105:0/1028663078 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f5620103bc0 msgr2=0x7f5620107c10 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:57:43.107 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:43.106+0000 7f5625e85700 1 --2- 192.168.123.105:0/1028663078 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f5620103bc0 0x7f5620107c10 secure :-1 s=READY pgs=133 cs=0 l=1 rev1=1 crypto rx=0x7f5610009b00 tx=0x7f5610009e10 comp rx=0 tx=0).stop 2026-03-10T07:57:43.108 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:43.106+0000 7f5625e85700 1 -- 192.168.123.105:0/1028663078 shutdown_connections 2026-03-10T07:57:43.108 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:43.106+0000 7f5625e85700 1 --2- 192.168.123.105:0/1028663078 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f5620103bc0 0x7f5620107c10 unknown :-1 s=CLOSED pgs=133 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:57:43.108 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:43.106+0000 7f5625e85700 1 --2- 192.168.123.105:0/1028663078 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f5620103210 0x7f56201035f0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 
tx=0 comp rx=0 tx=0).stop 2026-03-10T07:57:43.108 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:43.106+0000 7f5625e85700 1 -- 192.168.123.105:0/1028663078 >> 192.168.123.105:0/1028663078 conn(0x7f56200fea60 msgr2=0x7f5620100e80 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:57:43.108 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:43.107+0000 7f5625e85700 1 -- 192.168.123.105:0/1028663078 shutdown_connections 2026-03-10T07:57:43.108 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:43.107+0000 7f5625e85700 1 -- 192.168.123.105:0/1028663078 wait complete. 2026-03-10T07:57:43.108 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:43.107+0000 7f5625e85700 1 Processor -- start 2026-03-10T07:57:43.109 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:43.108+0000 7f5625e85700 1 -- start start 2026-03-10T07:57:43.109 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:43.108+0000 7f5625e85700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f5620103210 0x7f5620198d30 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:57:43.109 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:43.108+0000 7f5625e85700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f5620103bc0 0x7f5620199270 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:57:43.109 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:43.108+0000 7f5625e85700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f5620199950 con 0x7f5620103bc0 2026-03-10T07:57:43.109 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:43.108+0000 7f5625e85700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f562019d6e0 con 0x7f5620103210 2026-03-10T07:57:43.109 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:43.108+0000 
7f561f7fe700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f5620103210 0x7f5620198d30 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:57:43.109 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:43.108+0000 7f561f7fe700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f5620103210 0x7f5620198d30 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.108:3300/0 says I am v2:192.168.123.105:42984/0 (socket says 192.168.123.105:42984) 2026-03-10T07:57:43.110 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:43.108+0000 7f561f7fe700 1 -- 192.168.123.105:0/209068972 learned_addr learned my addr 192.168.123.105:0/209068972 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T07:57:43.110 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:43.108+0000 7f561f7fe700 1 -- 192.168.123.105:0/209068972 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f5620103bc0 msgr2=0x7f5620199270 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:57:43.110 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:43.108+0000 7f561f7fe700 1 --2- 192.168.123.105:0/209068972 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f5620103bc0 0x7f5620199270 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:57:43.110 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:43.108+0000 7f561f7fe700 1 -- 192.168.123.105:0/209068972 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f56100097e0 con 0x7f5620103210 2026-03-10T07:57:43.110 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:43.108+0000 7f561f7fe700 1 --2- 192.168.123.105:0/209068972 >> 
[v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f5620103210 0x7f5620198d30 secure :-1 s=READY pgs=69 cs=0 l=1 rev1=1 crypto rx=0x7f560800eb10 tx=0x7f560800eed0 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:57:43.110 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:43.109+0000 7f561cff9700 1 -- 192.168.123.105:0/209068972 <== mon.1 v2:192.168.123.108:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f560800cca0 con 0x7f5620103210 2026-03-10T07:57:43.111 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:43.109+0000 7f5625e85700 1 -- 192.168.123.105:0/209068972 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f562019d9c0 con 0x7f5620103210 2026-03-10T07:57:43.111 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:43.109+0000 7f5625e85700 1 -- 192.168.123.105:0/209068972 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f562019df10 con 0x7f5620103210 2026-03-10T07:57:43.111 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:43.109+0000 7f561cff9700 1 -- 192.168.123.105:0/209068972 <== mon.1 v2:192.168.123.108:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7f560800ce00 con 0x7f5620103210 2026-03-10T07:57:43.111 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:43.109+0000 7f561cff9700 1 -- 192.168.123.105:0/209068972 <== mon.1 v2:192.168.123.108:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f56080189c0 con 0x7f5620103210 2026-03-10T07:57:43.112 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:43.110+0000 7f561cff9700 1 -- 192.168.123.105:0/209068972 <== mon.1 v2:192.168.123.108:3300/0 4 ==== mgrmap(e 38) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f5608018b20 con 0x7f5620103210 2026-03-10T07:57:43.112 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:43.110+0000 7f5625e85700 1 -- 
192.168.123.105:0/209068972 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f5600005320 con 0x7f5620103210 2026-03-10T07:57:43.112 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:43.111+0000 7f561cff9700 1 --2- 192.168.123.105:0/209068972 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f560c0778c0 0x7f560c079d80 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:57:43.112 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:43.111+0000 7f561cff9700 1 -- 192.168.123.105:0/209068972 <== mon.1 v2:192.168.123.108:3300/0 5 ==== osd_map(84..84 src has 1..84) v4 ==== 6480+0+0 (secure 0 0 0) 0x7f5608014070 con 0x7f5620103210 2026-03-10T07:57:43.112 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:43.111+0000 7f561effd700 1 --2- 192.168.123.105:0/209068972 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f560c0778c0 0x7f560c079d80 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:57:43.113 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:43.112+0000 7f561effd700 1 --2- 192.168.123.105:0/209068972 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f560c0778c0 0x7f560c079d80 secure :-1 s=READY pgs=99 cs=0 l=1 rev1=1 crypto rx=0x7f561000b5c0 tx=0x7f5610005fd0 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:57:43.115 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:43.114+0000 7f561cff9700 1 -- 192.168.123.105:0/209068972 <== mon.1 v2:192.168.123.108:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f5608063d70 con 0x7f5620103210 2026-03-10T07:57:43.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 
10 07:57:43 vm05.local ceph-mon[130117]: from='client.? 192.168.123.105:0/3346508295' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 19, "format": "json"}]: dispatch 2026-03-10T07:57:43.257 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:43.256+0000 7f5625e85700 1 -- 192.168.123.105:0/209068972 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "fs dump", "epoch": 20, "format": "json"} v 0) v1 -- 0x7f5600005190 con 0x7f5620103210 2026-03-10T07:57:43.258 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:43.257+0000 7f561cff9700 1 -- 192.168.123.105:0/209068972 <== mon.1 v2:192.168.123.108:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump", "epoch": 20, "format": "json"}]=0 dumped fsmap epoch 20 v35) v1 ==== 107+0+5053 (secure 0 0 0) 0x7f56080634c0 con 0x7f5620103210 2026-03-10T07:57:43.258 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-10T07:57:43.258 INFO:teuthology.orchestra.run.vm05.stdout:{"epoch":20,"btime":"2026-03-10T07:55:41:759478+0000","default_fscid":1,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"feature_flags":{"enable_multiple":true,"ever_enabled_multiple":true},"standbys":[{"gid":24313,"name":"cephfs.vm08.dgsaon","rank":-1,"incarnation":0,"state":"up:standby","state_seq":2,"addr":"192.168.123.108:6827/2963085185","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.108:6826","nonce":2963085185},{"type":"v1","addr":"192.168.123.108:6827","nonce":2963085185}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default 
file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"epoch":11},{"gid":34274,"name":"cephfs.vm05.pavqil","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.105:6829/426813062","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.105:6828","nonce":426813062},{"type":"v1","addr":"192.168.123.105:6829","nonce":426813062}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce subvolumes"}},"epoch":20},{"gid":44255,"name":"cephfs.vm05.omfhnh","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.105:6827/4251520120","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.105:6826","nonce":4251520120},{"type":"v1","addr":"192.168.123.105:6827","nonce":4251520120}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce 
subvolumes"}},"epoch":16}],"filesystems":[{"mdsmap":{"epoch":20,"flags":18,"flags_state":{"joinable":true,"allow_snaps":true,"allow_multimds_snaps":true,"allow_standby_replay":false,"refuse_client_session":false,"refuse_standby_for_another_fs":false,"balance_automate":false},"ever_allowed_features":0,"explicitly_allowed_features":0,"created":"2026-03-10T07:48:24.309293+0000","modified":"2026-03-10T07:55:41.637110+0000","tableserver":0,"root":0,"session_timeout":60,"session_autoclose":300,"required_client_features":{},"max_file_size":1099511627776,"max_xattr_size":65536,"last_failure":0,"last_failure_osd_epoch":82,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"max_mds":1,"in":[0],"up":{"mds_0":14524},"failed":[],"damaged":[],"stopped":[],"info":{"gid_14524":{"gid":14524,"name":"cephfs.vm08.ybmbgd","rank":0,"incarnation":19,"state":"up:reconnect","state_seq":109,"addr":"192.168.123.108:6825/3815585988","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.108:6824","nonce":3815585988},{"type":"v1","addr":"192.168.123.108:6825","nonce":3815585988}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm 
v2"}}}},"data_pools":[3],"metadata_pool":2,"enabled":true,"fs_name":"cephfs","balancer":"","bal_rank_mask":"-1","standby_count_wanted":1,"qdb_leader":0,"qdb_cluster":[]},"id":1}]} 2026-03-10T07:57:43.261 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:43.260+0000 7f5625e85700 1 -- 192.168.123.105:0/209068972 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f560c0778c0 msgr2=0x7f560c079d80 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:57:43.261 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:43.260+0000 7f5625e85700 1 --2- 192.168.123.105:0/209068972 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f560c0778c0 0x7f560c079d80 secure :-1 s=READY pgs=99 cs=0 l=1 rev1=1 crypto rx=0x7f561000b5c0 tx=0x7f5610005fd0 comp rx=0 tx=0).stop 2026-03-10T07:57:43.261 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:43.260+0000 7f5625e85700 1 -- 192.168.123.105:0/209068972 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f5620103210 msgr2=0x7f5620198d30 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:57:43.261 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:43.260+0000 7f5625e85700 1 --2- 192.168.123.105:0/209068972 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f5620103210 0x7f5620198d30 secure :-1 s=READY pgs=69 cs=0 l=1 rev1=1 crypto rx=0x7f560800eb10 tx=0x7f560800eed0 comp rx=0 tx=0).stop 2026-03-10T07:57:43.261 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:43.260+0000 7f5625e85700 1 -- 192.168.123.105:0/209068972 shutdown_connections 2026-03-10T07:57:43.261 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:43.260+0000 7f5625e85700 1 --2- 192.168.123.105:0/209068972 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f5620103210 0x7f5620198d30 unknown :-1 s=CLOSED pgs=69 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:57:43.261 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:43.260+0000 7f5625e85700 1 --2- 192.168.123.105:0/209068972 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f560c0778c0 0x7f560c079d80 unknown :-1 s=CLOSED pgs=99 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:57:43.261 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:43.260+0000 7f5625e85700 1 --2- 192.168.123.105:0/209068972 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f5620103bc0 0x7f5620199270 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:57:43.262 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:43.260+0000 7f5625e85700 1 -- 192.168.123.105:0/209068972 >> 192.168.123.105:0/209068972 conn(0x7f56200fea60 msgr2=0x7f5620100040 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:57:43.262 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:43.260+0000 7f5625e85700 1 -- 192.168.123.105:0/209068972 shutdown_connections 2026-03-10T07:57:43.262 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:43.260+0000 7f5625e85700 1 -- 192.168.123.105:0/209068972 wait complete. 2026-03-10T07:57:43.262 INFO:teuthology.orchestra.run.vm05.stderr:dumped fsmap epoch 20 2026-03-10T07:57:43.311 DEBUG:teuthology.orchestra.run.vm05:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell --fsid 12e9780e-1c55-11f1-8896-79f7c2e9b508 -- ceph fs dump --format=json 21 2026-03-10T07:57:43.418 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:57:43 vm08.local ceph-mon[107898]: from='client.? 
192.168.123.105:0/3346508295' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 19, "format": "json"}]: dispatch 2026-03-10T07:57:43.453 INFO:teuthology.orchestra.run.vm05.stderr:Inferring config /var/lib/ceph/12e9780e-1c55-11f1-8896-79f7c2e9b508/mon.vm05/config 2026-03-10T07:57:43.713 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:43.711+0000 7fc999fda700 1 -- 192.168.123.105:0/3176034918 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fc994103d70 msgr2=0x7fc994107dc0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:57:43.713 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:43.711+0000 7fc999fda700 1 --2- 192.168.123.105:0/3176034918 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fc994103d70 0x7fc994107dc0 secure :-1 s=READY pgs=134 cs=0 l=1 rev1=1 crypto rx=0x7fc984009b00 tx=0x7fc984009e10 comp rx=0 tx=0).stop 2026-03-10T07:57:43.713 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:43.712+0000 7fc999fda700 1 -- 192.168.123.105:0/3176034918 shutdown_connections 2026-03-10T07:57:43.713 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:43.712+0000 7fc999fda700 1 --2- 192.168.123.105:0/3176034918 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fc994103d70 0x7fc994107dc0 unknown :-1 s=CLOSED pgs=134 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:57:43.714 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:43.712+0000 7fc999fda700 1 --2- 192.168.123.105:0/3176034918 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fc9941033c0 0x7fc9941037a0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:57:43.714 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:43.712+0000 7fc999fda700 1 -- 192.168.123.105:0/3176034918 >> 192.168.123.105:0/3176034918 conn(0x7fc9940fec30 msgr2=0x7fc994101050 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:57:43.714 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:43.712+0000 7fc999fda700 1 -- 192.168.123.105:0/3176034918 shutdown_connections 2026-03-10T07:57:43.714 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:43.712+0000 7fc999fda700 1 -- 192.168.123.105:0/3176034918 wait complete. 2026-03-10T07:57:43.714 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:43.713+0000 7fc999fda700 1 Processor -- start 2026-03-10T07:57:43.714 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:43.713+0000 7fc999fda700 1 -- start start 2026-03-10T07:57:43.714 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:43.713+0000 7fc999fda700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fc9941033c0 0x7fc994198ec0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:57:43.715 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:43.713+0000 7fc9937fe700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fc9941033c0 0x7fc994198ec0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:57:43.715 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:43.713+0000 7fc9937fe700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fc9941033c0 0x7fc994198ec0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.105:46272/0 (socket says 192.168.123.105:46272) 2026-03-10T07:57:43.715 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:43.713+0000 7fc999fda700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fc994103d70 0x7fc994199400 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:57:43.715 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:43.713+0000 7fc999fda700 1 -- --> 
[v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fc994199ae0 con 0x7fc9941033c0 2026-03-10T07:57:43.715 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:43.713+0000 7fc999fda700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7fc99419d870 con 0x7fc994103d70 2026-03-10T07:57:43.715 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:43.713+0000 7fc9937fe700 1 -- 192.168.123.105:0/3067424383 learned_addr learned my addr 192.168.123.105:0/3067424383 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T07:57:43.715 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:43.714+0000 7fc992ffd700 1 --2- 192.168.123.105:0/3067424383 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fc994103d70 0x7fc994199400 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:57:43.715 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:43.714+0000 7fc9937fe700 1 -- 192.168.123.105:0/3067424383 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fc994103d70 msgr2=0x7fc994199400 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:57:43.715 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:43.714+0000 7fc9937fe700 1 --2- 192.168.123.105:0/3067424383 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fc994103d70 0x7fc994199400 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:57:43.715 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:43.714+0000 7fc9937fe700 1 -- 192.168.123.105:0/3067424383 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7fc9840097e0 con 0x7fc9941033c0 2026-03-10T07:57:43.715 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:43.714+0000 7fc9937fe700 1 --2- 
192.168.123.105:0/3067424383 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fc9941033c0 0x7fc994198ec0 secure :-1 s=READY pgs=135 cs=0 l=1 rev1=1 crypto rx=0x7fc97c00d8d0 tx=0x7fc97c00dc90 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:57:43.716 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:43.714+0000 7fc990ff9700 1 -- 192.168.123.105:0/3067424383 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fc97c009940 con 0x7fc9941033c0 2026-03-10T07:57:43.717 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:43.714+0000 7fc990ff9700 1 -- 192.168.123.105:0/3067424383 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7fc97c010460 con 0x7fc9941033c0 2026-03-10T07:57:43.717 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:43.714+0000 7fc999fda700 1 -- 192.168.123.105:0/3067424383 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7fc99419db50 con 0x7fc9941033c0 2026-03-10T07:57:43.717 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:43.714+0000 7fc999fda700 1 -- 192.168.123.105:0/3067424383 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7fc99419e0a0 con 0x7fc9941033c0 2026-03-10T07:57:43.717 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:43.715+0000 7fc990ff9700 1 -- 192.168.123.105:0/3067424383 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7fc97c00f5d0 con 0x7fc9941033c0 2026-03-10T07:57:43.718 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:43.716+0000 7fc990ff9700 1 -- 192.168.123.105:0/3067424383 <== mon.0 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 38) v1 ==== 100115+0+0 (secure 0 0 0) 0x7fc97c00f790 con 0x7fc9941033c0 2026-03-10T07:57:43.718 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:43.716+0000 
7fc990ff9700 1 --2- 192.168.123.105:0/3067424383 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7fc980077990 0x7fc980079e50 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:57:43.718 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:43.716+0000 7fc992ffd700 1 --2- 192.168.123.105:0/3067424383 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7fc980077990 0x7fc980079e50 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:57:43.718 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:43.717+0000 7fc990ff9700 1 -- 192.168.123.105:0/3067424383 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(84..84 src has 1..84) v4 ==== 6480+0+0 (secure 0 0 0) 0x7fc97c09aa10 con 0x7fc9941033c0 2026-03-10T07:57:43.718 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:43.717+0000 7fc992ffd700 1 --2- 192.168.123.105:0/3067424383 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7fc980077990 0x7fc980079e50 secure :-1 s=READY pgs=100 cs=0 l=1 rev1=1 crypto rx=0x7fc99419a4e0 tx=0x7fc98400b540 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:57:43.718 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:43.717+0000 7fc999fda700 1 -- 192.168.123.105:0/3067424383 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7fc974005320 con 0x7fc9941033c0 2026-03-10T07:57:43.721 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:43.720+0000 7fc990ff9700 1 -- 192.168.123.105:0/3067424383 <== mon.0 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7fc97c062ac0 con 0x7fc9941033c0 2026-03-10T07:57:43.859 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:43.857+0000 7fc999fda700 1 -- 192.168.123.105:0/3067424383 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "fs dump", "epoch": 21, "format": "json"} v 0) v1 -- 0x7fc974005190 con 0x7fc9941033c0 2026-03-10T07:57:43.861 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:43.860+0000 7fc990ff9700 1 -- 192.168.123.105:0/3067424383 <== mon.0 v2:192.168.123.105:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump", "epoch": 21, "format": "json"}]=0 dumped fsmap epoch 21 v35) v1 ==== 107+0+5050 (secure 0 0 0) 0x7fc97c020090 con 0x7fc9941033c0 2026-03-10T07:57:43.862 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-10T07:57:43.862 INFO:teuthology.orchestra.run.vm05.stdout:{"epoch":21,"btime":"2026-03-10T07:55:42:765097+0000","default_fscid":1,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"feature_flags":{"enable_multiple":true,"ever_enabled_multiple":true},"standbys":[{"gid":24313,"name":"cephfs.vm08.dgsaon","rank":-1,"incarnation":0,"state":"up:standby","state_seq":2,"addr":"192.168.123.108:6827/2963085185","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.108:6826","nonce":2963085185},{"type":"v1","addr":"192.168.123.108:6827","nonce":2963085185}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no 
anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"epoch":11},{"gid":34274,"name":"cephfs.vm05.pavqil","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.105:6829/426813062","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.105:6828","nonce":426813062},{"type":"v1","addr":"192.168.123.105:6829","nonce":426813062}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce subvolumes"}},"epoch":20},{"gid":44255,"name":"cephfs.vm05.omfhnh","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.105:6827/4251520120","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.105:6826","nonce":4251520120},{"type":"v1","addr":"192.168.123.105:6827","nonce":4251520120}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce 
subvolumes"}},"epoch":16}],"filesystems":[{"mdsmap":{"epoch":21,"flags":18,"flags_state":{"joinable":true,"allow_snaps":true,"allow_multimds_snaps":true,"allow_standby_replay":false,"refuse_client_session":false,"refuse_standby_for_another_fs":false,"balance_automate":false},"ever_allowed_features":0,"explicitly_allowed_features":0,"created":"2026-03-10T07:48:24.309293+0000","modified":"2026-03-10T07:55:41.770797+0000","tableserver":0,"root":0,"session_timeout":60,"session_autoclose":300,"required_client_features":{},"max_file_size":1099511627776,"max_xattr_size":65536,"last_failure":0,"last_failure_osd_epoch":82,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"max_mds":1,"in":[0],"up":{"mds_0":14524},"failed":[],"damaged":[],"stopped":[],"info":{"gid_14524":{"gid":14524,"name":"cephfs.vm08.ybmbgd","rank":0,"incarnation":19,"state":"up:rejoin","state_seq":110,"addr":"192.168.123.108:6825/3815585988","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.108:6824","nonce":3815585988},{"type":"v1","addr":"192.168.123.108:6825","nonce":3815585988}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm 
v2"}}}},"data_pools":[3],"metadata_pool":2,"enabled":true,"fs_name":"cephfs","balancer":"","bal_rank_mask":"-1","standby_count_wanted":1,"qdb_leader":0,"qdb_cluster":[]},"id":1}]} 2026-03-10T07:57:43.864 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:43.863+0000 7fc999fda700 1 -- 192.168.123.105:0/3067424383 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7fc980077990 msgr2=0x7fc980079e50 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:57:43.864 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:43.863+0000 7fc999fda700 1 --2- 192.168.123.105:0/3067424383 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7fc980077990 0x7fc980079e50 secure :-1 s=READY pgs=100 cs=0 l=1 rev1=1 crypto rx=0x7fc99419a4e0 tx=0x7fc98400b540 comp rx=0 tx=0).stop 2026-03-10T07:57:43.864 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:43.863+0000 7fc999fda700 1 -- 192.168.123.105:0/3067424383 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fc9941033c0 msgr2=0x7fc994198ec0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:57:43.864 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:43.863+0000 7fc999fda700 1 --2- 192.168.123.105:0/3067424383 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fc9941033c0 0x7fc994198ec0 secure :-1 s=READY pgs=135 cs=0 l=1 rev1=1 crypto rx=0x7fc97c00d8d0 tx=0x7fc97c00dc90 comp rx=0 tx=0).stop 2026-03-10T07:57:43.864 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:43.863+0000 7fc999fda700 1 -- 192.168.123.105:0/3067424383 shutdown_connections 2026-03-10T07:57:43.864 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:43.863+0000 7fc999fda700 1 --2- 192.168.123.105:0/3067424383 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7fc980077990 0x7fc980079e50 unknown :-1 s=CLOSED pgs=100 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 
2026-03-10T07:57:43.864 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:43.863+0000 7fc999fda700 1 --2- 192.168.123.105:0/3067424383 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7fc9941033c0 0x7fc994198ec0 unknown :-1 s=CLOSED pgs=135 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:57:43.864 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:43.863+0000 7fc999fda700 1 --2- 192.168.123.105:0/3067424383 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7fc994103d70 0x7fc994199400 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:57:43.864 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:43.863+0000 7fc999fda700 1 -- 192.168.123.105:0/3067424383 >> 192.168.123.105:0/3067424383 conn(0x7fc9940fec30 msgr2=0x7fc994100200 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:57:43.864 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:43.863+0000 7fc999fda700 1 -- 192.168.123.105:0/3067424383 shutdown_connections 2026-03-10T07:57:43.865 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:43.863+0000 7fc999fda700 1 -- 192.168.123.105:0/3067424383 wait complete. 2026-03-10T07:57:43.865 INFO:teuthology.orchestra.run.vm05.stderr:dumped fsmap epoch 21 2026-03-10T07:57:43.926 DEBUG:teuthology.orchestra.run.vm05:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell --fsid 12e9780e-1c55-11f1-8896-79f7c2e9b508 -- ceph fs dump --format=json 22 2026-03-10T07:57:44.085 INFO:teuthology.orchestra.run.vm05.stderr:Inferring config /var/lib/ceph/12e9780e-1c55-11f1-8896-79f7c2e9b508/mon.vm05/config 2026-03-10T07:57:44.115 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:57:44 vm05.local ceph-mon[130117]: pgmap v202: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail 2026-03-10T07:57:44.115 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:57:44 vm05.local ceph-mon[130117]: from='client.? 
192.168.123.105:0/209068972' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 20, "format": "json"}]: dispatch 2026-03-10T07:57:44.115 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:57:44 vm05.local ceph-mon[130117]: from='client.? 192.168.123.105:0/3067424383' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 21, "format": "json"}]: dispatch 2026-03-10T07:57:44.347 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:44.345+0000 7f7d35f1f700 1 -- 192.168.123.105:0/2817234167 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f7d300690e0 msgr2=0x7f7d30105b50 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:57:44.347 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:44.345+0000 7f7d35f1f700 1 --2- 192.168.123.105:0/2817234167 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f7d300690e0 0x7f7d30105b50 secure :-1 s=READY pgs=136 cs=0 l=1 rev1=1 crypto rx=0x7f7d20009b00 tx=0x7f7d20009e10 comp rx=0 tx=0).stop 2026-03-10T07:57:44.347 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:44.346+0000 7f7d35f1f700 1 -- 192.168.123.105:0/2817234167 shutdown_connections 2026-03-10T07:57:44.347 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:44.346+0000 7f7d35f1f700 1 --2- 192.168.123.105:0/2817234167 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f7d300690e0 0x7f7d30105b50 unknown :-1 s=CLOSED pgs=136 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:57:44.347 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:44.346+0000 7f7d35f1f700 1 --2- 192.168.123.105:0/2817234167 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f7d30068730 0x7f7d30068b10 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:57:44.347 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:44.346+0000 7f7d35f1f700 1 -- 192.168.123.105:0/2817234167 >> 192.168.123.105:0/2817234167 conn(0x7f7d30075960 
msgr2=0x7f7d30075d70 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:57:44.347 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:44.346+0000 7f7d35f1f700 1 -- 192.168.123.105:0/2817234167 shutdown_connections 2026-03-10T07:57:44.347 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:44.346+0000 7f7d35f1f700 1 -- 192.168.123.105:0/2817234167 wait complete. 2026-03-10T07:57:44.348 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:44.346+0000 7f7d35f1f700 1 Processor -- start 2026-03-10T07:57:44.348 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:44.347+0000 7f7d35f1f700 1 -- start start 2026-03-10T07:57:44.348 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:44.347+0000 7f7d35f1f700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f7d30068730 0x7f7d30198fa0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:57:44.349 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:44.347+0000 7f7d35f1f700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f7d300690e0 0x7f7d301994e0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:57:44.349 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:44.347+0000 7f7d35f1f700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f7d30199bc0 con 0x7f7d300690e0 2026-03-10T07:57:44.349 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:44.347+0000 7f7d35f1f700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f7d3019d950 con 0x7f7d30068730 2026-03-10T07:57:44.349 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:44.347+0000 7f7d2effd700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f7d300690e0 0x7f7d301994e0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 
tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:57:44.349 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:44.347+0000 7f7d2effd700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f7d300690e0 0x7f7d301994e0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.105:46286/0 (socket says 192.168.123.105:46286) 2026-03-10T07:57:44.349 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:44.347+0000 7f7d2effd700 1 -- 192.168.123.105:0/2931757326 learned_addr learned my addr 192.168.123.105:0/2931757326 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T07:57:44.349 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:44.347+0000 7f7d2effd700 1 -- 192.168.123.105:0/2931757326 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f7d30068730 msgr2=0x7f7d30198fa0 unknown :-1 s=STATE_CONNECTING_RE l=1).mark_down 2026-03-10T07:57:44.349 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:44.347+0000 7f7d2effd700 1 --2- 192.168.123.105:0/2931757326 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f7d30068730 0x7f7d30198fa0 unknown :-1 s=START_CONNECT pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:57:44.349 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:44.347+0000 7f7d2effd700 1 -- 192.168.123.105:0/2931757326 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f7d200097e0 con 0x7f7d300690e0 2026-03-10T07:57:44.349 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:44.348+0000 7f7d2effd700 1 --2- 192.168.123.105:0/2931757326 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f7d300690e0 0x7f7d301994e0 secure :-1 s=READY pgs=137 cs=0 l=1 rev1=1 crypto rx=0x7f7d20000c00 tx=0x7f7d20004970 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 
out_seq=0 2026-03-10T07:57:44.349 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:44.348+0000 7f7d2cff9700 1 -- 192.168.123.105:0/2931757326 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f7d2001d070 con 0x7f7d300690e0 2026-03-10T07:57:44.349 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:44.348+0000 7f7d2cff9700 1 -- 192.168.123.105:0/2931757326 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7f7d2000bc50 con 0x7f7d300690e0 2026-03-10T07:57:44.349 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:44.348+0000 7f7d2cff9700 1 -- 192.168.123.105:0/2931757326 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f7d2000f650 con 0x7f7d300690e0 2026-03-10T07:57:44.349 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:44.348+0000 7f7d35f1f700 1 -- 192.168.123.105:0/2931757326 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f7d3019dbd0 con 0x7f7d300690e0 2026-03-10T07:57:44.350 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:44.348+0000 7f7d35f1f700 1 -- 192.168.123.105:0/2931757326 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f7d3019e0c0 con 0x7f7d300690e0 2026-03-10T07:57:44.351 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:44.350+0000 7f7d35f1f700 1 -- 192.168.123.105:0/2931757326 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f7d3004ea90 con 0x7f7d300690e0 2026-03-10T07:57:44.354 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:44.350+0000 7f7d2cff9700 1 -- 192.168.123.105:0/2931757326 <== mon.0 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 38) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f7d20022470 con 0x7f7d300690e0 2026-03-10T07:57:44.354 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:44.350+0000 7f7d2cff9700 1 --2- 192.168.123.105:0/2931757326 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f7d1c0778c0 0x7f7d1c079d80 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:57:44.354 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:44.350+0000 7f7d2cff9700 1 -- 192.168.123.105:0/2931757326 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(84..84 src has 1..84) v4 ==== 6480+0+0 (secure 0 0 0) 0x7f7d2009b3c0 con 0x7f7d300690e0 2026-03-10T07:57:44.354 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:44.353+0000 7f7d2cff9700 1 -- 192.168.123.105:0/2931757326 <== mon.0 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f7d20063b40 con 0x7f7d300690e0 2026-03-10T07:57:44.354 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:44.353+0000 7f7d2f7fe700 1 --2- 192.168.123.105:0/2931757326 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f7d1c0778c0 0x7f7d1c079d80 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:57:44.357 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:44.353+0000 7f7d2f7fe700 1 --2- 192.168.123.105:0/2931757326 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f7d1c0778c0 0x7f7d1c079d80 secure :-1 s=READY pgs=101 cs=0 l=1 rev1=1 crypto rx=0x7f7d18009780 tx=0x7f7d18006cb0 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:57:44.418 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:57:44 vm08.local ceph-mon[107898]: pgmap v202: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail 2026-03-10T07:57:44.418 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 
07:57:44 vm08.local ceph-mon[107898]: from='client.? 192.168.123.105:0/209068972' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 20, "format": "json"}]: dispatch 2026-03-10T07:57:44.418 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:57:44 vm08.local ceph-mon[107898]: from='client.? 192.168.123.105:0/3067424383' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 21, "format": "json"}]: dispatch 2026-03-10T07:57:44.500 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:44.499+0000 7f7d35f1f700 1 -- 192.168.123.105:0/2931757326 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "fs dump", "epoch": 22, "format": "json"} v 0) v1 -- 0x7f7d30066e80 con 0x7f7d300690e0 2026-03-10T07:57:44.501 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:44.500+0000 7f7d2cff9700 1 -- 192.168.123.105:0/2931757326 <== mon.0 v2:192.168.123.105:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump", "epoch": 22, "format": "json"}]=0 dumped fsmap epoch 22 v35) v1 ==== 107+0+5059 (secure 0 0 0) 0x7f7d20063290 con 0x7f7d300690e0 2026-03-10T07:57:44.501 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-10T07:57:44.501 INFO:teuthology.orchestra.run.vm05.stdout:{"epoch":22,"btime":"2026-03-10T07:55:43:768247+0000","default_fscid":1,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm 
v2"}},"feature_flags":{"enable_multiple":true,"ever_enabled_multiple":true},"standbys":[{"gid":24313,"name":"cephfs.vm08.dgsaon","rank":-1,"incarnation":0,"state":"up:standby","state_seq":2,"addr":"192.168.123.108:6827/2963085185","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.108:6826","nonce":2963085185},{"type":"v1","addr":"192.168.123.108:6827","nonce":2963085185}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"epoch":11},{"gid":34274,"name":"cephfs.vm05.pavqil","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.105:6829/426813062","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.105:6828","nonce":426813062},{"type":"v1","addr":"192.168.123.105:6829","nonce":426813062}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce 
subvolumes"}},"epoch":20},{"gid":44255,"name":"cephfs.vm05.omfhnh","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.105:6827/4251520120","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.105:6826","nonce":4251520120},{"type":"v1","addr":"192.168.123.105:6827","nonce":4251520120}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce subvolumes"}},"epoch":16}],"filesystems":[{"mdsmap":{"epoch":22,"flags":18,"flags_state":{"joinable":true,"allow_snaps":true,"allow_multimds_snaps":true,"allow_standby_replay":false,"refuse_client_session":false,"refuse_standby_for_another_fs":false,"balance_automate":false},"ever_allowed_features":0,"explicitly_allowed_features":0,"created":"2026-03-10T07:48:24.309293+0000","modified":"2026-03-10T07:55:43.768246+0000","tableserver":0,"root":0,"session_timeout":60,"session_autoclose":300,"required_client_features":{},"max_file_size":1099511627776,"max_xattr_size":65536,"last_failure":0,"last_failure_osd_epoch":82,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm 
v2"}},"max_mds":1,"in":[0],"up":{"mds_0":14524},"failed":[],"damaged":[],"stopped":[],"info":{"gid_14524":{"gid":14524,"name":"cephfs.vm08.ybmbgd","rank":0,"incarnation":19,"state":"up:active","state_seq":111,"addr":"192.168.123.108:6825/3815585988","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.108:6824","nonce":3815585988},{"type":"v1","addr":"192.168.123.108:6825","nonce":3815585988}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}}}},"data_pools":[3],"metadata_pool":2,"enabled":true,"fs_name":"cephfs","balancer":"","bal_rank_mask":"-1","standby_count_wanted":1,"qdb_leader":14524,"qdb_cluster":[14524]},"id":1}]} 2026-03-10T07:57:44.504 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:44.502+0000 7f7d35f1f700 1 -- 192.168.123.105:0/2931757326 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f7d1c0778c0 msgr2=0x7f7d1c079d80 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:57:44.504 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:44.502+0000 7f7d35f1f700 1 --2- 192.168.123.105:0/2931757326 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f7d1c0778c0 0x7f7d1c079d80 secure :-1 s=READY pgs=101 cs=0 l=1 rev1=1 crypto rx=0x7f7d18009780 tx=0x7f7d18006cb0 comp rx=0 tx=0).stop 2026-03-10T07:57:44.504 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:44.503+0000 7f7d35f1f700 1 -- 192.168.123.105:0/2931757326 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f7d300690e0 msgr2=0x7f7d301994e0 secure :-1 
s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:57:44.504 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:44.503+0000 7f7d35f1f700 1 --2- 192.168.123.105:0/2931757326 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f7d300690e0 0x7f7d301994e0 secure :-1 s=READY pgs=137 cs=0 l=1 rev1=1 crypto rx=0x7f7d20000c00 tx=0x7f7d20004970 comp rx=0 tx=0).stop 2026-03-10T07:57:44.504 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:44.503+0000 7f7d35f1f700 1 -- 192.168.123.105:0/2931757326 shutdown_connections 2026-03-10T07:57:44.504 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:44.503+0000 7f7d35f1f700 1 --2- 192.168.123.105:0/2931757326 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f7d30068730 0x7f7d30198fa0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:57:44.504 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:44.503+0000 7f7d35f1f700 1 --2- 192.168.123.105:0/2931757326 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f7d1c0778c0 0x7f7d1c079d80 unknown :-1 s=CLOSED pgs=101 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:57:44.504 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:44.503+0000 7f7d35f1f700 1 --2- 192.168.123.105:0/2931757326 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f7d300690e0 0x7f7d301994e0 unknown :-1 s=CLOSED pgs=137 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:57:44.504 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:44.503+0000 7f7d35f1f700 1 -- 192.168.123.105:0/2931757326 >> 192.168.123.105:0/2931757326 conn(0x7f7d30075960 msgr2=0x7f7d30102b20 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:57:44.505 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:44.503+0000 7f7d35f1f700 1 -- 192.168.123.105:0/2931757326 shutdown_connections 2026-03-10T07:57:44.505 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:44.503+0000 7f7d35f1f700 1 -- 192.168.123.105:0/2931757326 wait complete. 2026-03-10T07:57:44.505 INFO:teuthology.orchestra.run.vm05.stderr:dumped fsmap epoch 22 2026-03-10T07:57:44.554 DEBUG:teuthology.orchestra.run.vm05:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell --fsid 12e9780e-1c55-11f1-8896-79f7c2e9b508 -- ceph fs dump --format=json 23 2026-03-10T07:57:44.706 INFO:teuthology.orchestra.run.vm05.stderr:Inferring config /var/lib/ceph/12e9780e-1c55-11f1-8896-79f7c2e9b508/mon.vm05/config 2026-03-10T07:57:44.961 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:44.959+0000 7f4e04799700 1 -- 192.168.123.105:0/3555960973 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f4dfc103cf0 msgr2=0x7f4dfc107d40 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:57:44.961 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:44.959+0000 7f4e04799700 1 --2- 192.168.123.105:0/3555960973 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f4dfc103cf0 0x7f4dfc107d40 secure :-1 s=READY pgs=138 cs=0 l=1 rev1=1 crypto rx=0x7f4df8009b00 tx=0x7f4df8009e10 comp rx=0 tx=0).stop 2026-03-10T07:57:44.961 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:44.960+0000 7f4e04799700 1 -- 192.168.123.105:0/3555960973 shutdown_connections 2026-03-10T07:57:44.961 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:44.960+0000 7f4e04799700 1 --2- 192.168.123.105:0/3555960973 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f4dfc103cf0 0x7f4dfc107d40 unknown :-1 s=CLOSED pgs=138 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:57:44.961 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:44.960+0000 7f4e04799700 1 --2- 192.168.123.105:0/3555960973 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f4dfc103340 0x7f4dfc103720 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 
tx=0 comp rx=0 tx=0).stop 2026-03-10T07:57:44.961 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:44.960+0000 7f4e04799700 1 -- 192.168.123.105:0/3555960973 >> 192.168.123.105:0/3555960973 conn(0x7f4dfc0febd0 msgr2=0x7f4dfc100ff0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:57:44.961 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:44.960+0000 7f4e04799700 1 -- 192.168.123.105:0/3555960973 shutdown_connections 2026-03-10T07:57:44.961 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:44.960+0000 7f4e04799700 1 -- 192.168.123.105:0/3555960973 wait complete. 2026-03-10T07:57:44.962 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:44.961+0000 7f4e04799700 1 Processor -- start 2026-03-10T07:57:44.962 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:44.961+0000 7f4e04799700 1 -- start start 2026-03-10T07:57:44.963 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:44.961+0000 7f4e04799700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f4dfc103340 0x7f4dfc198e40 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:57:44.963 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:44.962+0000 7f4e04799700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f4dfc103cf0 0x7f4dfc199380 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:57:44.963 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:44.962+0000 7f4e04799700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f4dfc199a60 con 0x7f4dfc103340 2026-03-10T07:57:44.963 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:44.962+0000 7f4e04799700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f4dfc19d7f0 con 0x7f4dfc103cf0 2026-03-10T07:57:44.963 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:44.962+0000 
7f4e01d34700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f4dfc103cf0 0x7f4dfc199380 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:57:44.963 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:44.962+0000 7f4e01d34700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f4dfc103cf0 0x7f4dfc199380 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.108:3300/0 says I am v2:192.168.123.105:43036/0 (socket says 192.168.123.105:43036) 2026-03-10T07:57:44.963 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:44.962+0000 7f4e01d34700 1 -- 192.168.123.105:0/2861301968 learned_addr learned my addr 192.168.123.105:0/2861301968 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T07:57:44.963 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:44.962+0000 7f4e02535700 1 --2- 192.168.123.105:0/2861301968 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f4dfc103340 0x7f4dfc198e40 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:57:44.964 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:44.962+0000 7f4e01d34700 1 -- 192.168.123.105:0/2861301968 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f4dfc103340 msgr2=0x7f4dfc198e40 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:57:44.964 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:44.962+0000 7f4e01d34700 1 --2- 192.168.123.105:0/2861301968 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f4dfc103340 0x7f4dfc198e40 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:57:44.964 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:44.962+0000 
7f4e01d34700 1 -- 192.168.123.105:0/2861301968 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f4df80097e0 con 0x7f4dfc103cf0 2026-03-10T07:57:44.964 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:44.962+0000 7f4e02535700 1 --2- 192.168.123.105:0/2861301968 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f4dfc103340 0x7f4dfc198e40 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request state changed! 2026-03-10T07:57:44.964 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:44.963+0000 7f4e01d34700 1 --2- 192.168.123.105:0/2861301968 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f4dfc103cf0 0x7f4dfc199380 secure :-1 s=READY pgs=70 cs=0 l=1 rev1=1 crypto rx=0x7f4df8005f50 tx=0x7f4df8004970 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:57:44.964 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:44.963+0000 7f4df37fe700 1 -- 192.168.123.105:0/2861301968 <== mon.1 v2:192.168.123.108:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f4df801d070 con 0x7f4dfc103cf0 2026-03-10T07:57:44.965 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:44.963+0000 7f4df37fe700 1 -- 192.168.123.105:0/2861301968 <== mon.1 v2:192.168.123.108:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7f4df800bc50 con 0x7f4dfc103cf0 2026-03-10T07:57:44.965 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:44.963+0000 7f4e04799700 1 -- 192.168.123.105:0/2861301968 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f4dfc19da70 con 0x7f4dfc103cf0 2026-03-10T07:57:44.965 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:44.963+0000 7f4e04799700 1 -- 192.168.123.105:0/2861301968 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f4dfc19df60 con 
0x7f4dfc103cf0 2026-03-10T07:57:44.965 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:44.964+0000 7f4e04799700 1 -- 192.168.123.105:0/2861301968 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f4dfc04ea90 con 0x7f4dfc103cf0 2026-03-10T07:57:44.965 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:44.964+0000 7f4df37fe700 1 -- 192.168.123.105:0/2861301968 <== mon.1 v2:192.168.123.108:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f4df8022620 con 0x7f4dfc103cf0 2026-03-10T07:57:44.966 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:44.965+0000 7f4df37fe700 1 -- 192.168.123.105:0/2861301968 <== mon.1 v2:192.168.123.108:3300/0 4 ==== mgrmap(e 38) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f4df800f530 con 0x7f4dfc103cf0 2026-03-10T07:57:44.967 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:44.966+0000 7f4df37fe700 1 --2- 192.168.123.105:0/2861301968 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f4de8077870 0x7f4de8079d30 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:57:44.967 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:44.966+0000 7f4df37fe700 1 -- 192.168.123.105:0/2861301968 <== mon.1 v2:192.168.123.108:3300/0 5 ==== osd_map(84..84 src has 1..84) v4 ==== 6480+0+0 (secure 0 0 0) 0x7f4df809bea0 con 0x7f4dfc103cf0 2026-03-10T07:57:44.968 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:44.966+0000 7f4e02535700 1 --2- 192.168.123.105:0/2861301968 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f4de8077870 0x7f4de8079d30 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:57:44.968 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:44.966+0000 7f4e02535700 1 --2- 192.168.123.105:0/2861301968 >> 
[v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f4de8077870 0x7f4de8079d30 secure :-1 s=READY pgs=102 cs=0 l=1 rev1=1 crypto rx=0x7f4dec007950 tx=0x7f4dec008040 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:57:44.969 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:44.968+0000 7f4df37fe700 1 -- 192.168.123.105:0/2861301968 <== mon.1 v2:192.168.123.108:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f4df80645a0 con 0x7f4dfc103cf0 2026-03-10T07:57:45.118 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:45.117+0000 7f4e04799700 1 -- 192.168.123.105:0/2861301968 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "fs dump", "epoch": 23, "format": "json"} v 0) v1 -- 0x7f4dfc066e80 con 0x7f4dfc103cf0 2026-03-10T07:57:45.124 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:45.123+0000 7f4df37fe700 1 -- 192.168.123.105:0/2861301968 <== mon.1 v2:192.168.123.108:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump", "epoch": 23, "format": "json"}]=0 dumped fsmap epoch 23 v35) v1 ==== 107+0+4254 (secure 0 0 0) 0x7f4df8005c00 con 0x7f4dfc103cf0 2026-03-10T07:57:45.125 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-10T07:57:45.125 INFO:teuthology.orchestra.run.vm05.stdout:{"epoch":23,"btime":"2026-03-10T07:55:45:162616+0000","default_fscid":1,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm 
v2"}},"feature_flags":{"enable_multiple":true,"ever_enabled_multiple":true},"standbys":[{"gid":24313,"name":"cephfs.vm08.dgsaon","rank":-1,"incarnation":0,"state":"up:standby","state_seq":2,"addr":"192.168.123.108:6827/2963085185","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.108:6826","nonce":2963085185},{"type":"v1","addr":"192.168.123.108:6827","nonce":2963085185}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"epoch":11},{"gid":34274,"name":"cephfs.vm05.pavqil","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.105:6829/426813062","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.105:6828","nonce":426813062},{"type":"v1","addr":"192.168.123.105:6829","nonce":426813062}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce 
subvolumes"}},"epoch":20},{"gid":44255,"name":"cephfs.vm05.omfhnh","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.105:6827/4251520120","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.105:6826","nonce":4251520120},{"type":"v1","addr":"192.168.123.105:6827","nonce":4251520120}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce subvolumes"}},"epoch":16}],"filesystems":[{"mdsmap":{"epoch":23,"flags":18,"flags_state":{"joinable":true,"allow_snaps":true,"allow_multimds_snaps":true,"allow_standby_replay":false,"refuse_client_session":false,"refuse_standby_for_another_fs":false,"balance_automate":false},"ever_allowed_features":0,"explicitly_allowed_features":0,"created":"2026-03-10T07:48:24.309293+0000","modified":"2026-03-10T07:55:45.162615+0000","tableserver":0,"root":0,"session_timeout":60,"session_autoclose":300,"required_client_features":{},"max_file_size":1099511627776,"max_xattr_size":65536,"last_failure":0,"last_failure_osd_epoch":83,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm 
v2"}},"max_mds":1,"in":[0],"up":{},"failed":[0],"damaged":[],"stopped":[],"info":{},"data_pools":[3],"metadata_pool":2,"enabled":true,"fs_name":"cephfs","balancer":"","bal_rank_mask":"-1","standby_count_wanted":1,"qdb_leader":0,"qdb_cluster":[]},"id":1}]} 2026-03-10T07:57:45.127 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:45.126+0000 7f4e04799700 1 -- 192.168.123.105:0/2861301968 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f4de8077870 msgr2=0x7f4de8079d30 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:57:45.127 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:45.126+0000 7f4e04799700 1 --2- 192.168.123.105:0/2861301968 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f4de8077870 0x7f4de8079d30 secure :-1 s=READY pgs=102 cs=0 l=1 rev1=1 crypto rx=0x7f4dec007950 tx=0x7f4dec008040 comp rx=0 tx=0).stop 2026-03-10T07:57:45.127 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:45.126+0000 7f4e04799700 1 -- 192.168.123.105:0/2861301968 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f4dfc103cf0 msgr2=0x7f4dfc199380 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:57:45.127 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:45.126+0000 7f4e04799700 1 --2- 192.168.123.105:0/2861301968 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f4dfc103cf0 0x7f4dfc199380 secure :-1 s=READY pgs=70 cs=0 l=1 rev1=1 crypto rx=0x7f4df8005f50 tx=0x7f4df8004970 comp rx=0 tx=0).stop 2026-03-10T07:57:45.127 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:45.126+0000 7f4e04799700 1 -- 192.168.123.105:0/2861301968 shutdown_connections 2026-03-10T07:57:45.127 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:45.126+0000 7f4e04799700 1 --2- 192.168.123.105:0/2861301968 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f4de8077870 0x7f4de8079d30 unknown :-1 
s=CLOSED pgs=102 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:57:45.127 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:45.126+0000 7f4e04799700 1 --2- 192.168.123.105:0/2861301968 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f4dfc103340 0x7f4dfc198e40 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:57:45.127 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:45.126+0000 7f4e04799700 1 --2- 192.168.123.105:0/2861301968 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f4dfc103cf0 0x7f4dfc199380 secure :-1 s=CLOSED pgs=70 cs=0 l=1 rev1=1 crypto rx=0x7f4df8005f50 tx=0x7f4df8004970 comp rx=0 tx=0).stop 2026-03-10T07:57:45.127 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:45.126+0000 7f4e04799700 1 -- 192.168.123.105:0/2861301968 >> 192.168.123.105:0/2861301968 conn(0x7f4dfc0febd0 msgr2=0x7f4dfc1001f0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:57:45.127 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:45.126+0000 7f4e04799700 1 -- 192.168.123.105:0/2861301968 shutdown_connections 2026-03-10T07:57:45.127 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:45.126+0000 7f4e04799700 1 -- 192.168.123.105:0/2861301968 wait complete. 2026-03-10T07:57:45.128 INFO:teuthology.orchestra.run.vm05.stderr:dumped fsmap epoch 23 2026-03-10T07:57:45.186 DEBUG:teuthology.orchestra.run.vm05:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell --fsid 12e9780e-1c55-11f1-8896-79f7c2e9b508 -- ceph fs dump --format=json 24 2026-03-10T07:57:45.347 INFO:teuthology.orchestra.run.vm05.stderr:Inferring config /var/lib/ceph/12e9780e-1c55-11f1-8896-79f7c2e9b508/mon.vm05/config 2026-03-10T07:57:45.370 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:57:45 vm05.local ceph-mon[130117]: from='client.? 
192.168.123.105:0/2931757326' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 22, "format": "json"}]: dispatch 2026-03-10T07:57:45.418 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:57:45 vm08.local ceph-mon[107898]: from='client.? 192.168.123.105:0/2931757326' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 22, "format": "json"}]: dispatch 2026-03-10T07:57:45.607 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:45.605+0000 7ff68c675700 1 -- 192.168.123.105:0/3754037475 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff684073500 msgr2=0x7ff684073960 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:57:45.607 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:45.605+0000 7ff68c675700 1 --2- 192.168.123.105:0/3754037475 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff684073500 0x7ff684073960 secure :-1 s=READY pgs=139 cs=0 l=1 rev1=1 crypto rx=0x7ff680009b30 tx=0x7ff680009e40 comp rx=0 tx=0).stop 2026-03-10T07:57:45.607 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:45.606+0000 7ff68c675700 1 -- 192.168.123.105:0/3754037475 shutdown_connections 2026-03-10T07:57:45.607 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:45.606+0000 7ff68c675700 1 --2- 192.168.123.105:0/3754037475 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff684073500 0x7ff684073960 unknown :-1 s=CLOSED pgs=139 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:57:45.607 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:45.606+0000 7ff68c675700 1 --2- 192.168.123.105:0/3754037475 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7ff684074dd0 0x7ff684072fc0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:57:45.607 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:45.606+0000 7ff68c675700 1 -- 192.168.123.105:0/3754037475 >> 192.168.123.105:0/3754037475 
conn(0x7ff684078ed0 msgr2=0x7ff6840792e0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:57:45.607 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:45.606+0000 7ff68c675700 1 -- 192.168.123.105:0/3754037475 shutdown_connections 2026-03-10T07:57:45.607 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:45.606+0000 7ff68c675700 1 -- 192.168.123.105:0/3754037475 wait complete. 2026-03-10T07:57:45.607 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:45.606+0000 7ff68c675700 1 Processor -- start 2026-03-10T07:57:45.607 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:45.606+0000 7ff68c675700 1 -- start start 2026-03-10T07:57:45.608 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:45.606+0000 7ff68c675700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff684073500 0x7ff68419d240 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:57:45.608 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:45.607+0000 7ff68a411700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff684073500 0x7ff68419d240 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:57:45.608 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:45.607+0000 7ff68a411700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff684073500 0x7ff68419d240 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.105:46316/0 (socket says 192.168.123.105:46316) 2026-03-10T07:57:45.608 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:45.607+0000 7ff68c675700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7ff684074dd0 0x7ff68419d780 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 
2026-03-10T07:57:45.608 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:45.607+0000 7ff68c675700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7ff68419de60 con 0x7ff684073500 2026-03-10T07:57:45.608 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:45.607+0000 7ff68c675700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7ff6841a1bf0 con 0x7ff684074dd0 2026-03-10T07:57:45.608 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:45.607+0000 7ff68a411700 1 -- 192.168.123.105:0/1797584833 learned_addr learned my addr 192.168.123.105:0/1797584833 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T07:57:45.608 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:45.607+0000 7ff68a411700 1 -- 192.168.123.105:0/1797584833 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7ff684074dd0 msgr2=0x7ff68419d780 unknown :-1 s=STATE_CONNECTING l=1).mark_down 2026-03-10T07:57:45.608 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:45.607+0000 7ff689c10700 1 --2- 192.168.123.105:0/1797584833 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7ff684074dd0 0x7ff68419d780 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:57:45.609 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:45.607+0000 7ff68a411700 1 --2- 192.168.123.105:0/1797584833 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7ff684074dd0 0x7ff68419d780 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:57:45.609 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:45.607+0000 7ff68a411700 1 -- 192.168.123.105:0/1797584833 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7ff6800097e0 con 0x7ff684073500 
2026-03-10T07:57:45.609 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:45.607+0000 7ff68a411700 1 --2- 192.168.123.105:0/1797584833 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff684073500 0x7ff68419d240 secure :-1 s=READY pgs=140 cs=0 l=1 rev1=1 crypto rx=0x7ff67400d900 tx=0x7ff67400dcc0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:57:45.609 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:45.608+0000 7ff67b7fe700 1 -- 192.168.123.105:0/1797584833 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7ff6740041d0 con 0x7ff684073500 2026-03-10T07:57:45.609 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:45.608+0000 7ff67b7fe700 1 -- 192.168.123.105:0/1797584833 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7ff674004d10 con 0x7ff684073500 2026-03-10T07:57:45.609 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:45.608+0000 7ff67b7fe700 1 -- 192.168.123.105:0/1797584833 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7ff67400b750 con 0x7ff684073500 2026-03-10T07:57:45.610 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:45.608+0000 7ff68c675700 1 -- 192.168.123.105:0/1797584833 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7ff6841a1ed0 con 0x7ff684073500 2026-03-10T07:57:45.610 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:45.608+0000 7ff68c675700 1 -- 192.168.123.105:0/1797584833 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7ff6841a23a0 con 0x7ff684073500 2026-03-10T07:57:45.611 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:45.610+0000 7ff68c675700 1 -- 192.168.123.105:0/1797584833 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": 
"get_command_descriptions"} v 0) v1 -- 0x7ff68404ea90 con 0x7ff684073500 2026-03-10T07:57:45.611 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:45.610+0000 7ff67b7fe700 1 -- 192.168.123.105:0/1797584833 <== mon.0 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 38) v1 ==== 100115+0+0 (secure 0 0 0) 0x7ff674004330 con 0x7ff684073500 2026-03-10T07:57:45.613 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:45.610+0000 7ff67b7fe700 1 --2- 192.168.123.105:0/1797584833 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7ff6700776c0 0x7ff670079b80 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:57:45.613 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:45.610+0000 7ff67b7fe700 1 -- 192.168.123.105:0/1797584833 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(84..84 src has 1..84) v4 ==== 6480+0+0 (secure 0 0 0) 0x7ff67401f030 con 0x7ff684073500 2026-03-10T07:57:45.613 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:45.611+0000 7ff689c10700 1 --2- 192.168.123.105:0/1797584833 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7ff6700776c0 0x7ff670079b80 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:57:45.613 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:45.611+0000 7ff689c10700 1 --2- 192.168.123.105:0/1797584833 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7ff6700776c0 0x7ff670079b80 secure :-1 s=READY pgs=103 cs=0 l=1 rev1=1 crypto rx=0x7ff680006010 tx=0x7ff680005790 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:57:45.614 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:45.613+0000 7ff67b7fe700 1 -- 192.168.123.105:0/1797584833 <== mon.0 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": 
"get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7ff674061ed0 con 0x7ff684073500 2026-03-10T07:57:45.757 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:45.756+0000 7ff68c675700 1 -- 192.168.123.105:0/1797584833 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "fs dump", "epoch": 24, "format": "json"} v 0) v1 -- 0x7ff68419e5d0 con 0x7ff684073500 2026-03-10T07:57:45.758 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:45.757+0000 7ff67b7fe700 1 -- 192.168.123.105:0/1797584833 <== mon.0 v2:192.168.123.105:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump", "epoch": 24, "format": "json"}]=0 dumped fsmap epoch 24 v35) v1 ==== 107+0+4265 (secure 0 0 0) 0x7ff674061620 con 0x7ff684073500 2026-03-10T07:57:45.758 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-10T07:57:45.758 INFO:teuthology.orchestra.run.vm05.stdout:{"epoch":24,"btime":"2026-03-10T07:55:45:167712+0000","default_fscid":1,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"feature_flags":{"enable_multiple":true,"ever_enabled_multiple":true},"standbys":[{"gid":34274,"name":"cephfs.vm05.pavqil","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.105:6829/426813062","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.105:6828","nonce":426813062},{"type":"v1","addr":"192.168.123.105:6829","nonce":426813062}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate 
object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce subvolumes"}},"epoch":20},{"gid":44255,"name":"cephfs.vm05.omfhnh","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.105:6827/4251520120","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.105:6826","nonce":4251520120},{"type":"v1","addr":"192.168.123.105:6827","nonce":4251520120}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce subvolumes"}},"epoch":16}],"filesystems":[{"mdsmap":{"epoch":24,"flags":18,"flags_state":{"joinable":true,"allow_snaps":true,"allow_multimds_snaps":true,"allow_standby_replay":false,"refuse_client_session":false,"refuse_standby_for_another_fs":false,"balance_automate":false},"ever_allowed_features":0,"explicitly_allowed_features":0,"created":"2026-03-10T07:48:24.309293+0000","modified":"2026-03-10T07:55:45.167709+0000","tableserver":0,"root":0,"session_timeout":60,"session_autoclose":300,"required_client_features":{},"max_file_size":1099511627776,"max_xattr_size":65536,"last_failure":0,"last_failure_osd_epoch":83,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned 
encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"max_mds":1,"in":[0],"up":{"mds_0":24313},"failed":[],"damaged":[],"stopped":[],"info":{"gid_24313":{"gid":24313,"name":"cephfs.vm08.dgsaon","rank":0,"incarnation":24,"state":"up:replay","state_seq":2,"addr":"192.168.123.108:6827/2963085185","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.108:6826","nonce":2963085185},{"type":"v1","addr":"192.168.123.108:6827","nonce":2963085185}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}}}},"data_pools":[3],"metadata_pool":2,"enabled":true,"fs_name":"cephfs","balancer":"","bal_rank_mask":"-1","standby_count_wanted":1,"qdb_leader":0,"qdb_cluster":[]},"id":1}]} 2026-03-10T07:57:45.761 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:45.760+0000 7ff68c675700 1 -- 192.168.123.105:0/1797584833 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7ff6700776c0 msgr2=0x7ff670079b80 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:57:45.761 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:45.760+0000 7ff68c675700 1 --2- 192.168.123.105:0/1797584833 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7ff6700776c0 0x7ff670079b80 secure :-1 s=READY pgs=103 cs=0 l=1 rev1=1 crypto rx=0x7ff680006010 tx=0x7ff680005790 comp rx=0 tx=0).stop 2026-03-10T07:57:45.761 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:45.760+0000 7ff68c675700 1 
-- 192.168.123.105:0/1797584833 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff684073500 msgr2=0x7ff68419d240 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:57:45.761 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:45.760+0000 7ff68c675700 1 --2- 192.168.123.105:0/1797584833 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff684073500 0x7ff68419d240 secure :-1 s=READY pgs=140 cs=0 l=1 rev1=1 crypto rx=0x7ff67400d900 tx=0x7ff67400dcc0 comp rx=0 tx=0).stop 2026-03-10T07:57:45.761 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:45.760+0000 7ff68c675700 1 -- 192.168.123.105:0/1797584833 shutdown_connections 2026-03-10T07:57:45.762 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:45.760+0000 7ff68c675700 1 --2- 192.168.123.105:0/1797584833 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7ff6700776c0 0x7ff670079b80 unknown :-1 s=CLOSED pgs=103 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:57:45.762 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:45.760+0000 7ff68c675700 1 --2- 192.168.123.105:0/1797584833 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff684073500 0x7ff68419d240 unknown :-1 s=CLOSED pgs=140 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:57:45.762 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:45.760+0000 7ff68c675700 1 --2- 192.168.123.105:0/1797584833 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7ff684074dd0 0x7ff68419d780 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:57:45.762 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:45.760+0000 7ff68c675700 1 -- 192.168.123.105:0/1797584833 >> 192.168.123.105:0/1797584833 conn(0x7ff684078ed0 msgr2=0x7ff68410f980 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:57:45.762 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:45.760+0000 7ff68c675700 1 -- 192.168.123.105:0/1797584833 shutdown_connections 2026-03-10T07:57:45.762 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:45.761+0000 7ff68c675700 1 -- 192.168.123.105:0/1797584833 wait complete. 2026-03-10T07:57:45.763 INFO:teuthology.orchestra.run.vm05.stderr:dumped fsmap epoch 24 2026-03-10T07:57:45.811 DEBUG:teuthology.orchestra.run.vm05:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell --fsid 12e9780e-1c55-11f1-8896-79f7c2e9b508 -- ceph fs dump --format=json 25 2026-03-10T07:57:45.955 INFO:teuthology.orchestra.run.vm05.stderr:Inferring config /var/lib/ceph/12e9780e-1c55-11f1-8896-79f7c2e9b508/mon.vm05/config 2026-03-10T07:57:46.210 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:46.207+0000 7f5ce4e3d700 1 -- 192.168.123.105:0/629617524 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f5ce0103d70 msgr2=0x7f5ce0107dc0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:57:46.210 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:46.207+0000 7f5ce4e3d700 1 --2- 192.168.123.105:0/629617524 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f5ce0103d70 0x7f5ce0107dc0 secure :-1 s=READY pgs=141 cs=0 l=1 rev1=1 crypto rx=0x7f5cd0009b00 tx=0x7f5cd0009e10 comp rx=0 tx=0).stop 2026-03-10T07:57:46.210 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:46.209+0000 7f5ce4e3d700 1 -- 192.168.123.105:0/629617524 shutdown_connections 2026-03-10T07:57:46.210 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:46.209+0000 7f5ce4e3d700 1 --2- 192.168.123.105:0/629617524 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f5ce0103d70 0x7f5ce0107dc0 unknown :-1 s=CLOSED pgs=141 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:57:46.210 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:46.209+0000 7f5ce4e3d700 1 --2- 
192.168.123.105:0/629617524 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f5ce01033c0 0x7f5ce01037a0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:57:46.210 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:46.209+0000 7f5ce4e3d700 1 -- 192.168.123.105:0/629617524 >> 192.168.123.105:0/629617524 conn(0x7f5ce00fec30 msgr2=0x7f5ce0101050 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:57:46.210 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:46.209+0000 7f5ce4e3d700 1 -- 192.168.123.105:0/629617524 shutdown_connections 2026-03-10T07:57:46.210 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:46.209+0000 7f5ce4e3d700 1 -- 192.168.123.105:0/629617524 wait complete. 2026-03-10T07:57:46.210 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:46.209+0000 7f5ce4e3d700 1 Processor -- start 2026-03-10T07:57:46.211 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:46.209+0000 7f5ce4e3d700 1 -- start start 2026-03-10T07:57:46.211 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:46.210+0000 7f5ce4e3d700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f5ce01033c0 0x7f5ce00752a0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:57:46.211 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:46.210+0000 7f5ce4e3d700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f5ce0103d70 0x7f5ce00757e0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:57:46.211 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:46.210+0000 7f5ce4e3d700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f5ce0079430 con 0x7f5ce01033c0 2026-03-10T07:57:46.211 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:46.210+0000 7f5ce4e3d700 1 -- --> 
[v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f5ce0075d20 con 0x7f5ce0103d70 2026-03-10T07:57:46.211 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:46.210+0000 7f5cde59c700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f5ce01033c0 0x7f5ce00752a0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:57:46.211 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:46.210+0000 7f5cde59c700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f5ce01033c0 0x7f5ce00752a0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.105:46330/0 (socket says 192.168.123.105:46330) 2026-03-10T07:57:46.211 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:46.210+0000 7f5cde59c700 1 -- 192.168.123.105:0/1128729968 learned_addr learned my addr 192.168.123.105:0/1128729968 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T07:57:46.211 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:46.210+0000 7f5cddd9b700 1 --2- 192.168.123.105:0/1128729968 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f5ce0103d70 0x7f5ce00757e0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:57:46.212 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:46.211+0000 7f5cde59c700 1 -- 192.168.123.105:0/1128729968 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f5ce0103d70 msgr2=0x7f5ce00757e0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:57:46.212 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:46.211+0000 7f5cde59c700 1 --2- 192.168.123.105:0/1128729968 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] 
conn(0x7f5ce0103d70 0x7f5ce00757e0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:57:46.212 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:46.211+0000 7f5cde59c700 1 -- 192.168.123.105:0/1128729968 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f5cd00097e0 con 0x7f5ce01033c0 2026-03-10T07:57:46.212 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:46.211+0000 7f5cde59c700 1 --2- 192.168.123.105:0/1128729968 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f5ce01033c0 0x7f5ce00752a0 secure :-1 s=READY pgs=142 cs=0 l=1 rev1=1 crypto rx=0x7f5cd400d8d0 tx=0x7f5cd400dbe0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:57:46.212 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:46.211+0000 7f5ccf7fe700 1 -- 192.168.123.105:0/1128729968 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f5cd4009880 con 0x7f5ce01033c0 2026-03-10T07:57:46.212 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:46.211+0000 7f5ce4e3d700 1 -- 192.168.123.105:0/1128729968 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f5ce0076000 con 0x7f5ce01033c0 2026-03-10T07:57:46.212 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:46.211+0000 7f5ce4e3d700 1 -- 192.168.123.105:0/1128729968 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f5ce01a7690 con 0x7f5ce01033c0 2026-03-10T07:57:46.214 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:46.211+0000 7f5ccf7fe700 1 -- 192.168.123.105:0/1128729968 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7f5cd4010460 con 0x7f5ce01033c0 2026-03-10T07:57:46.214 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:46.211+0000 7f5ccf7fe700 1 -- 
192.168.123.105:0/1128729968 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f5cd400f5d0 con 0x7f5ce01033c0 2026-03-10T07:57:46.214 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:46.213+0000 7f5ce4e3d700 1 -- 192.168.123.105:0/1128729968 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f5cc0005320 con 0x7f5ce01033c0 2026-03-10T07:57:46.217 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:46.213+0000 7f5ccf7fe700 1 -- 192.168.123.105:0/1128729968 <== mon.0 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 38) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f5cd40099e0 con 0x7f5ce01033c0 2026-03-10T07:57:46.217 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:46.216+0000 7f5ccf7fe700 1 --2- 192.168.123.105:0/1128729968 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f5cc8077990 0x7f5cc8079e50 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:57:46.217 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:46.216+0000 7f5ccf7fe700 1 -- 192.168.123.105:0/1128729968 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(84..84 src has 1..84) v4 ==== 6480+0+0 (secure 0 0 0) 0x7f5cd4099740 con 0x7f5ce01033c0 2026-03-10T07:57:46.218 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:46.217+0000 7f5cddd9b700 1 --2- 192.168.123.105:0/1128729968 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f5cc8077990 0x7f5cc8079e50 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:57:46.218 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:46.217+0000 7f5cddd9b700 1 --2- 192.168.123.105:0/1128729968 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f5cc8077990 0x7f5cc8079e50 secure :-1 s=READY 
pgs=104 cs=0 l=1 rev1=1 crypto rx=0x7f5ce0100410 tx=0x7f5cd000b540 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:57:46.218 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:46.217+0000 7f5ccf7fe700 1 -- 192.168.123.105:0/1128729968 <== mon.0 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f5cd4061ec0 con 0x7f5ce01033c0 2026-03-10T07:57:46.353 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:57:46 vm05.local ceph-mon[130117]: pgmap v203: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail 2026-03-10T07:57:46.353 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:57:46 vm05.local ceph-mon[130117]: from='client.? 192.168.123.105:0/2861301968' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 23, "format": "json"}]: dispatch 2026-03-10T07:57:46.353 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:57:46 vm05.local ceph-mon[130117]: from='client.? 
192.168.123.105:0/1797584833' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 24, "format": "json"}]: dispatch 2026-03-10T07:57:46.353 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:46.351+0000 7f5ce4e3d700 1 -- 192.168.123.105:0/1128729968 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "fs dump", "epoch": 25, "format": "json"} v 0) v1 -- 0x7f5cc0005190 con 0x7f5ce01033c0 2026-03-10T07:57:46.354 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:46.353+0000 7f5ccf7fe700 1 -- 192.168.123.105:0/1128729968 <== mon.0 v2:192.168.123.105:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump", "epoch": 25, "format": "json"}]=0 dumped fsmap epoch 25 v35) v1 ==== 107+0+5107 (secure 0 0 0) 0x7f5cd4061610 con 0x7f5ce01033c0 2026-03-10T07:57:46.355 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-10T07:57:46.355 INFO:teuthology.orchestra.run.vm05.stdout:{"epoch":25,"btime":"2026-03-10T07:55:50:319362+0000","default_fscid":1,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"feature_flags":{"enable_multiple":true,"ever_enabled_multiple":true},"standbys":[{"gid":34274,"name":"cephfs.vm05.pavqil","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.105:6829/426813062","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.105:6828","nonce":426813062},{"type":"v1","addr":"192.168.123.105:6829","nonce":426813062}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate 
object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce subvolumes"}},"epoch":20},{"gid":44255,"name":"cephfs.vm05.omfhnh","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.105:6827/4251520120","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.105:6826","nonce":4251520120},{"type":"v1","addr":"192.168.123.105:6827","nonce":4251520120}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce subvolumes"}},"epoch":16},{"gid":44263,"name":"cephfs.vm08.ybmbgd","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.108:6825/1209244","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.108:6824","nonce":1209244},{"type":"v1","addr":"192.168.123.108:6825","nonce":1209244}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce 
subvolumes"}},"epoch":25}],"filesystems":[{"mdsmap":{"epoch":24,"flags":18,"flags_state":{"joinable":true,"allow_snaps":true,"allow_multimds_snaps":true,"allow_standby_replay":false,"refuse_client_session":false,"refuse_standby_for_another_fs":false,"balance_automate":false},"ever_allowed_features":0,"explicitly_allowed_features":0,"created":"2026-03-10T07:48:24.309293+0000","modified":"2026-03-10T07:55:45.167709+0000","tableserver":0,"root":0,"session_timeout":60,"session_autoclose":300,"required_client_features":{},"max_file_size":1099511627776,"max_xattr_size":65536,"last_failure":0,"last_failure_osd_epoch":83,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"max_mds":1,"in":[0],"up":{"mds_0":24313},"failed":[],"damaged":[],"stopped":[],"info":{"gid_24313":{"gid":24313,"name":"cephfs.vm08.dgsaon","rank":0,"incarnation":24,"state":"up:replay","state_seq":2,"addr":"192.168.123.108:6827/2963085185","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.108:6826","nonce":2963085185},{"type":"v1","addr":"192.168.123.108:6827","nonce":2963085185}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm 
v2"}}}},"data_pools":[3],"metadata_pool":2,"enabled":true,"fs_name":"cephfs","balancer":"","bal_rank_mask":"-1","standby_count_wanted":1,"qdb_leader":0,"qdb_cluster":[]},"id":1}]} 2026-03-10T07:57:46.357 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:46.356+0000 7f5ce4e3d700 1 -- 192.168.123.105:0/1128729968 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f5cc8077990 msgr2=0x7f5cc8079e50 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:57:46.357 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:46.356+0000 7f5ce4e3d700 1 --2- 192.168.123.105:0/1128729968 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f5cc8077990 0x7f5cc8079e50 secure :-1 s=READY pgs=104 cs=0 l=1 rev1=1 crypto rx=0x7f5ce0100410 tx=0x7f5cd000b540 comp rx=0 tx=0).stop 2026-03-10T07:57:46.358 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:46.357+0000 7f5ce4e3d700 1 -- 192.168.123.105:0/1128729968 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f5ce01033c0 msgr2=0x7f5ce00752a0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:57:46.358 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:46.357+0000 7f5ce4e3d700 1 --2- 192.168.123.105:0/1128729968 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f5ce01033c0 0x7f5ce00752a0 secure :-1 s=READY pgs=142 cs=0 l=1 rev1=1 crypto rx=0x7f5cd400d8d0 tx=0x7f5cd400dbe0 comp rx=0 tx=0).stop 2026-03-10T07:57:46.358 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:46.357+0000 7f5ce4e3d700 1 -- 192.168.123.105:0/1128729968 shutdown_connections 2026-03-10T07:57:46.358 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:46.357+0000 7f5ce4e3d700 1 --2- 192.168.123.105:0/1128729968 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f5cc8077990 0x7f5cc8079e50 unknown :-1 s=CLOSED pgs=104 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 
2026-03-10T07:57:46.359 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:46.357+0000 7f5ce4e3d700 1 --2- 192.168.123.105:0/1128729968 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f5ce01033c0 0x7f5ce00752a0 unknown :-1 s=CLOSED pgs=142 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:57:46.359 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:46.358+0000 7f5ce4e3d700 1 --2- 192.168.123.105:0/1128729968 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f5ce0103d70 0x7f5ce00757e0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:57:46.359 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:46.358+0000 7f5ce4e3d700 1 -- 192.168.123.105:0/1128729968 >> 192.168.123.105:0/1128729968 conn(0x7f5ce00fec30 msgr2=0x7f5ce0107630 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:57:46.359 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:46.358+0000 7f5ce4e3d700 1 -- 192.168.123.105:0/1128729968 shutdown_connections 2026-03-10T07:57:46.359 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:46.358+0000 7f5ce4e3d700 1 -- 192.168.123.105:0/1128729968 wait complete. 2026-03-10T07:57:46.360 INFO:teuthology.orchestra.run.vm05.stderr:dumped fsmap epoch 25 2026-03-10T07:57:46.418 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:57:46 vm08.local ceph-mon[107898]: pgmap v203: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail 2026-03-10T07:57:46.418 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:57:46 vm08.local ceph-mon[107898]: from='client.? 192.168.123.105:0/2861301968' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 23, "format": "json"}]: dispatch 2026-03-10T07:57:46.418 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:57:46 vm08.local ceph-mon[107898]: from='client.? 
192.168.123.105:0/1797584833' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 24, "format": "json"}]: dispatch 2026-03-10T07:57:46.421 DEBUG:teuthology.orchestra.run.vm05:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell --fsid 12e9780e-1c55-11f1-8896-79f7c2e9b508 -- ceph fs dump --format=json 26 2026-03-10T07:57:46.559 INFO:teuthology.orchestra.run.vm05.stderr:Inferring config /var/lib/ceph/12e9780e-1c55-11f1-8896-79f7c2e9b508/mon.vm05/config 2026-03-10T07:57:46.798 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:46.796+0000 7f291d77f700 1 -- 192.168.123.105:0/1665910639 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f2918101a80 msgr2=0x7f2918105ad0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:57:46.798 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:46.796+0000 7f291d77f700 1 --2- 192.168.123.105:0/1665910639 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f2918101a80 0x7f2918105ad0 secure :-1 s=READY pgs=143 cs=0 l=1 rev1=1 crypto rx=0x7f2908009b00 tx=0x7f2908009e10 comp rx=0 tx=0).stop 2026-03-10T07:57:46.798 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:46.797+0000 7f291d77f700 1 -- 192.168.123.105:0/1665910639 shutdown_connections 2026-03-10T07:57:46.798 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:46.797+0000 7f291d77f700 1 --2- 192.168.123.105:0/1665910639 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f2918101a80 0x7f2918105ad0 unknown :-1 s=CLOSED pgs=143 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:57:46.798 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:46.797+0000 7f291d77f700 1 --2- 192.168.123.105:0/1665910639 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f29181010d0 0x7f29181014b0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:57:46.798 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:46.797+0000 7f291d77f700 1 -- 192.168.123.105:0/1665910639 >> 192.168.123.105:0/1665910639 conn(0x7f29180fc920 msgr2=0x7f29180fed40 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:57:46.798 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:46.797+0000 7f291d77f700 1 -- 192.168.123.105:0/1665910639 shutdown_connections 2026-03-10T07:57:46.799 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:46.798+0000 7f291d77f700 1 -- 192.168.123.105:0/1665910639 wait complete. 2026-03-10T07:57:46.799 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:46.798+0000 7f291d77f700 1 Processor -- start 2026-03-10T07:57:46.799 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:46.798+0000 7f291d77f700 1 -- start start 2026-03-10T07:57:46.800 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:46.799+0000 7f291d77f700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f29181010d0 0x7f291819a820 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:57:46.800 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:46.799+0000 7f291d77f700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f2918101a80 0x7f291819ad60 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:57:46.800 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:46.799+0000 7f291d77f700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f291819b3f0 con 0x7f2918101a80 2026-03-10T07:57:46.800 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:46.799+0000 7f291d77f700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f29181948a0 con 0x7f29181010d0 2026-03-10T07:57:46.800 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:46.799+0000 7f29167fc700 1 --2- >> 
[v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f2918101a80 0x7f291819ad60 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:57:46.800 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:46.799+0000 7f29167fc700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f2918101a80 0x7f291819ad60 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.105:46358/0 (socket says 192.168.123.105:46358) 2026-03-10T07:57:46.800 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:46.799+0000 7f29167fc700 1 -- 192.168.123.105:0/3952425265 learned_addr learned my addr 192.168.123.105:0/3952425265 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T07:57:46.800 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:46.799+0000 7f29167fc700 1 -- 192.168.123.105:0/3952425265 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f29181010d0 msgr2=0x7f291819a820 unknown :-1 s=STATE_CONNECTING l=1).mark_down 2026-03-10T07:57:46.800 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:46.799+0000 7f2916ffd700 1 --2- 192.168.123.105:0/3952425265 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f29181010d0 0x7f291819a820 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:57:46.801 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:46.799+0000 7f29167fc700 1 --2- 192.168.123.105:0/3952425265 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f29181010d0 0x7f291819a820 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:57:46.801 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:46.800+0000 7f29167fc700 1 -- 
192.168.123.105:0/3952425265 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f29080097e0 con 0x7f2918101a80 2026-03-10T07:57:46.801 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:46.800+0000 7f29167fc700 1 --2- 192.168.123.105:0/3952425265 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f2918101a80 0x7f291819ad60 secure :-1 s=READY pgs=144 cs=0 l=1 rev1=1 crypto rx=0x7f2908009fd0 tx=0x7f2908004970 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:57:46.802 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:46.800+0000 7f290ffff700 1 -- 192.168.123.105:0/3952425265 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f290801d070 con 0x7f2918101a80 2026-03-10T07:57:46.802 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:46.800+0000 7f290ffff700 1 -- 192.168.123.105:0/3952425265 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7f2908022470 con 0x7f2918101a80 2026-03-10T07:57:46.802 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:46.800+0000 7f290ffff700 1 -- 192.168.123.105:0/3952425265 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f290800f700 con 0x7f2918101a80 2026-03-10T07:57:46.802 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:46.800+0000 7f291d77f700 1 -- 192.168.123.105:0/3952425265 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f2918194b20 con 0x7f2918101a80 2026-03-10T07:57:46.802 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:46.800+0000 7f291d77f700 1 -- 192.168.123.105:0/3952425265 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f2918195010 con 0x7f2918101a80 2026-03-10T07:57:46.802 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:46.800+0000 7f2916ffd700 1 --2- 192.168.123.105:0/3952425265 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f29181010d0 0x7f291819a820 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request state changed! 2026-03-10T07:57:46.803 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:46.801+0000 7f291d77f700 1 -- 192.168.123.105:0/3952425265 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f29181094d0 con 0x7f2918101a80 2026-03-10T07:57:46.806 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:46.804+0000 7f290ffff700 1 -- 192.168.123.105:0/3952425265 <== mon.0 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 38) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f29080225e0 con 0x7f2918101a80 2026-03-10T07:57:46.806 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:46.805+0000 7f290ffff700 1 --2- 192.168.123.105:0/3952425265 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f290407bdf0 0x7f290407e2b0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:57:46.807 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:46.805+0000 7f290ffff700 1 -- 192.168.123.105:0/3952425265 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(84..84 src has 1..84) v4 ==== 6480+0+0 (secure 0 0 0) 0x7f290809bc90 con 0x7f2918101a80 2026-03-10T07:57:46.807 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:46.805+0000 7f2916ffd700 1 --2- 192.168.123.105:0/3952425265 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f290407bdf0 0x7f290407e2b0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:57:46.807 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:46.805+0000 7f290ffff700 
1 -- 192.168.123.105:0/3952425265 <== mon.0 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f290809c110 con 0x7f2918101a80 2026-03-10T07:57:46.807 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:46.806+0000 7f2916ffd700 1 --2- 192.168.123.105:0/3952425265 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f290407bdf0 0x7f290407e2b0 secure :-1 s=READY pgs=105 cs=0 l=1 rev1=1 crypto rx=0x7f2900005d90 tx=0x7f2900005d00 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:57:46.939 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:46.938+0000 7f291d77f700 1 -- 192.168.123.105:0/3952425265 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "fs dump", "epoch": 26, "format": "json"} v 0) v1 -- 0x7f291804ea90 con 0x7f2918101a80 2026-03-10T07:57:46.940 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:46.939+0000 7f290ffff700 1 -- 192.168.123.105:0/3952425265 <== mon.0 v2:192.168.123.105:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump", "epoch": 26, "format": "json"}]=0 dumped fsmap epoch 26 v35) v1 ==== 107+0+5112 (secure 0 0 0) 0x7f2908064410 con 0x7f2918101a80 2026-03-10T07:57:46.940 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-10T07:57:46.940 INFO:teuthology.orchestra.run.vm05.stdout:{"epoch":26,"btime":"2026-03-10T07:55:51:323580+0000","default_fscid":1,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm 
v2"}},"feature_flags":{"enable_multiple":true,"ever_enabled_multiple":true},"standbys":[{"gid":34274,"name":"cephfs.vm05.pavqil","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.105:6829/426813062","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.105:6828","nonce":426813062},{"type":"v1","addr":"192.168.123.105:6829","nonce":426813062}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce subvolumes"}},"epoch":20},{"gid":44255,"name":"cephfs.vm05.omfhnh","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.105:6827/4251520120","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.105:6826","nonce":4251520120},{"type":"v1","addr":"192.168.123.105:6827","nonce":4251520120}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce 
subvolumes"}},"epoch":16},{"gid":44263,"name":"cephfs.vm08.ybmbgd","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.108:6825/1209244","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.108:6824","nonce":1209244},{"type":"v1","addr":"192.168.123.108:6825","nonce":1209244}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce subvolumes"}},"epoch":25}],"filesystems":[{"mdsmap":{"epoch":26,"flags":18,"flags_state":{"joinable":true,"allow_snaps":true,"allow_multimds_snaps":true,"allow_standby_replay":false,"refuse_client_session":false,"refuse_standby_for_another_fs":false,"balance_automate":false},"ever_allowed_features":0,"explicitly_allowed_features":0,"created":"2026-03-10T07:48:24.309293+0000","modified":"2026-03-10T07:55:50.420420+0000","tableserver":0,"root":0,"session_timeout":60,"session_autoclose":300,"required_client_features":{},"max_file_size":1099511627776,"max_xattr_size":65536,"last_failure":0,"last_failure_osd_epoch":83,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm 
v2"}},"max_mds":1,"in":[0],"up":{"mds_0":24313},"failed":[],"damaged":[],"stopped":[],"info":{"gid_24313":{"gid":24313,"name":"cephfs.vm08.dgsaon","rank":0,"incarnation":24,"state":"up:reconnect","state_seq":112,"addr":"192.168.123.108:6827/2963085185","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.108:6826","nonce":2963085185},{"type":"v1","addr":"192.168.123.108:6827","nonce":2963085185}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}}}},"data_pools":[3],"metadata_pool":2,"enabled":true,"fs_name":"cephfs","balancer":"","bal_rank_mask":"-1","standby_count_wanted":1,"qdb_leader":0,"qdb_cluster":[]},"id":1}]} 2026-03-10T07:57:46.942 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:46.941+0000 7f291d77f700 1 -- 192.168.123.105:0/3952425265 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f290407bdf0 msgr2=0x7f290407e2b0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:57:46.942 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:46.941+0000 7f291d77f700 1 --2- 192.168.123.105:0/3952425265 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f290407bdf0 0x7f290407e2b0 secure :-1 s=READY pgs=105 cs=0 l=1 rev1=1 crypto rx=0x7f2900005d90 tx=0x7f2900005d00 comp rx=0 tx=0).stop 2026-03-10T07:57:46.942 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:46.941+0000 7f291d77f700 1 -- 192.168.123.105:0/3952425265 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f2918101a80 msgr2=0x7f291819ad60 secure :-1 
s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:57:46.942 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:46.941+0000 7f291d77f700 1 --2- 192.168.123.105:0/3952425265 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f2918101a80 0x7f291819ad60 secure :-1 s=READY pgs=144 cs=0 l=1 rev1=1 crypto rx=0x7f2908009fd0 tx=0x7f2908004970 comp rx=0 tx=0).stop 2026-03-10T07:57:46.943 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:46.941+0000 7f291d77f700 1 -- 192.168.123.105:0/3952425265 shutdown_connections 2026-03-10T07:57:46.943 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:46.941+0000 7f291d77f700 1 --2- 192.168.123.105:0/3952425265 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f29181010d0 0x7f291819a820 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:57:46.943 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:46.941+0000 7f291d77f700 1 --2- 192.168.123.105:0/3952425265 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f290407bdf0 0x7f290407e2b0 unknown :-1 s=CLOSED pgs=105 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:57:46.943 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:46.941+0000 7f291d77f700 1 --2- 192.168.123.105:0/3952425265 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f2918101a80 0x7f291819ad60 unknown :-1 s=CLOSED pgs=144 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:57:46.943 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:46.942+0000 7f291d77f700 1 -- 192.168.123.105:0/3952425265 >> 192.168.123.105:0/3952425265 conn(0x7f29180fc920 msgr2=0x7f29180fed20 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:57:46.943 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:46.942+0000 7f291d77f700 1 -- 192.168.123.105:0/3952425265 shutdown_connections 2026-03-10T07:57:46.943 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:46.942+0000 7f291d77f700 1 -- 192.168.123.105:0/3952425265 wait complete. 2026-03-10T07:57:46.944 INFO:teuthology.orchestra.run.vm05.stderr:dumped fsmap epoch 26 2026-03-10T07:57:47.004 DEBUG:teuthology.orchestra.run.vm05:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell --fsid 12e9780e-1c55-11f1-8896-79f7c2e9b508 -- ceph fs dump --format=json 27 2026-03-10T07:57:47.148 INFO:teuthology.orchestra.run.vm05.stderr:Inferring config /var/lib/ceph/12e9780e-1c55-11f1-8896-79f7c2e9b508/mon.vm05/config 2026-03-10T07:57:47.191 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:57:47 vm05.local ceph-mon[130117]: from='client.? 192.168.123.105:0/1128729968' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 25, "format": "json"}]: dispatch 2026-03-10T07:57:47.191 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:57:47 vm05.local ceph-mon[130117]: from='client.? 192.168.123.105:0/3952425265' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 26, "format": "json"}]: dispatch 2026-03-10T07:57:47.375 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:47.373+0000 7f4d11264700 1 -- 192.168.123.105:0/2317423728 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f4d0c103340 msgr2=0x7f4d0c103720 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:57:47.375 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:47.373+0000 7f4d11264700 1 --2- 192.168.123.105:0/2317423728 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f4d0c103340 0x7f4d0c103720 secure :-1 s=READY pgs=145 cs=0 l=1 rev1=1 crypto rx=0x7f4cf4009b00 tx=0x7f4cf4009e10 comp rx=0 tx=0).stop 2026-03-10T07:57:47.376 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:47.374+0000 7f4d11264700 1 -- 192.168.123.105:0/2317423728 shutdown_connections 2026-03-10T07:57:47.376 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:47.374+0000 7f4d11264700 1 --2- 
192.168.123.105:0/2317423728 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f4d0c103cf0 0x7f4d0c107d40 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:57:47.376 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:47.374+0000 7f4d11264700 1 --2- 192.168.123.105:0/2317423728 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f4d0c103340 0x7f4d0c103720 unknown :-1 s=CLOSED pgs=145 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:57:47.376 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:47.374+0000 7f4d11264700 1 -- 192.168.123.105:0/2317423728 >> 192.168.123.105:0/2317423728 conn(0x7f4d0c0febd0 msgr2=0x7f4d0c100ff0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:57:47.376 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:47.375+0000 7f4d11264700 1 -- 192.168.123.105:0/2317423728 shutdown_connections 2026-03-10T07:57:47.376 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:47.375+0000 7f4d11264700 1 -- 192.168.123.105:0/2317423728 wait complete. 
2026-03-10T07:57:47.376 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:47.375+0000 7f4d11264700 1 Processor -- start 2026-03-10T07:57:47.376 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:47.375+0000 7f4d11264700 1 -- start start 2026-03-10T07:57:47.376 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:47.375+0000 7f4d11264700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f4d0c103340 0x7f4d0c198e30 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:57:47.377 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:47.375+0000 7f4d11264700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f4d0c103cf0 0x7f4d0c199370 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:57:47.377 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:47.375+0000 7f4d11264700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f4d0c199a50 con 0x7f4d0c103340 2026-03-10T07:57:47.377 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:47.375+0000 7f4d11264700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f4d0c19d7e0 con 0x7f4d0c103cf0 2026-03-10T07:57:47.377 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:47.375+0000 7f4d0affd700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f4d0c103340 0x7f4d0c198e30 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:57:47.377 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:47.376+0000 7f4d0affd700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f4d0c103340 0x7f4d0c198e30 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am 
v2:192.168.123.105:46376/0 (socket says 192.168.123.105:46376) 2026-03-10T07:57:47.379 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:47.376+0000 7f4d0affd700 1 -- 192.168.123.105:0/2939084324 learned_addr learned my addr 192.168.123.105:0/2939084324 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T07:57:47.381 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:47.376+0000 7f4d0a7fc700 1 --2- 192.168.123.105:0/2939084324 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f4d0c103cf0 0x7f4d0c199370 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:57:47.381 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:47.376+0000 7f4d0affd700 1 -- 192.168.123.105:0/2939084324 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f4d0c103cf0 msgr2=0x7f4d0c199370 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:57:47.382 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:47.376+0000 7f4d0affd700 1 --2- 192.168.123.105:0/2939084324 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f4d0c103cf0 0x7f4d0c199370 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:57:47.382 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:47.376+0000 7f4d0affd700 1 -- 192.168.123.105:0/2939084324 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f4cf40097e0 con 0x7f4d0c103340 2026-03-10T07:57:47.382 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:47.376+0000 7f4d0affd700 1 --2- 192.168.123.105:0/2939084324 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f4d0c103340 0x7f4d0c198e30 secure :-1 s=READY pgs=146 cs=0 l=1 rev1=1 crypto rx=0x7f4cf40048c0 tx=0x7f4cf40048f0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 
2026-03-10T07:57:47.382 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:47.376+0000 7f4d03fff700 1 -- 192.168.123.105:0/2939084324 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f4cf401d070 con 0x7f4d0c103340 2026-03-10T07:57:47.382 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:47.376+0000 7f4d03fff700 1 -- 192.168.123.105:0/2939084324 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7f4cf4022470 con 0x7f4d0c103340 2026-03-10T07:57:47.382 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:47.376+0000 7f4d11264700 1 -- 192.168.123.105:0/2939084324 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f4d0c19da60 con 0x7f4d0c103340 2026-03-10T07:57:47.383 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:47.376+0000 7f4d11264700 1 -- 192.168.123.105:0/2939084324 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f4d0c19df50 con 0x7f4d0c103340 2026-03-10T07:57:47.383 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:47.377+0000 7f4d03fff700 1 -- 192.168.123.105:0/2939084324 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f4cf400f630 con 0x7f4d0c103340 2026-03-10T07:57:47.383 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:47.378+0000 7f4d11264700 1 -- 192.168.123.105:0/2939084324 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f4cec005320 con 0x7f4d0c103340 2026-03-10T07:57:47.383 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:47.380+0000 7f4d03fff700 1 -- 192.168.123.105:0/2939084324 <== mon.0 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 38) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f4cf400f850 con 0x7f4d0c103340 2026-03-10T07:57:47.383 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:47.381+0000 
7f4d03fff700 1 --2- 192.168.123.105:0/2939084324 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f4cf80779e0 0x7f4cf8079ea0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:57:47.383 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:47.381+0000 7f4d03fff700 1 -- 192.168.123.105:0/2939084324 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(84..84 src has 1..84) v4 ==== 6480+0+0 (secure 0 0 0) 0x7f4cf409c060 con 0x7f4d0c103340 2026-03-10T07:57:47.383 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:47.381+0000 7f4d03fff700 1 -- 192.168.123.105:0/2939084324 <== mon.0 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f4cf400ba90 con 0x7f4d0c103340 2026-03-10T07:57:47.383 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:47.382+0000 7f4d0a7fc700 1 --2- 192.168.123.105:0/2939084324 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f4cf80779e0 0x7f4cf8079ea0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:57:47.383 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:47.382+0000 7f4d0a7fc700 1 --2- 192.168.123.105:0/2939084324 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f4cf80779e0 0x7f4cf8079ea0 secure :-1 s=READY pgs=106 cs=0 l=1 rev1=1 crypto rx=0x7f4d0c19a450 tx=0x7f4cfc017040 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:57:47.418 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:57:47 vm08.local ceph-mon[107898]: from='client.? 
192.168.123.105:0/1128729968' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 25, "format": "json"}]: dispatch 2026-03-10T07:57:47.418 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:57:47 vm08.local ceph-mon[107898]: from='client.? 192.168.123.105:0/3952425265' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 26, "format": "json"}]: dispatch 2026-03-10T07:57:47.521 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:47.519+0000 7f4d11264700 1 -- 192.168.123.105:0/2939084324 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "fs dump", "epoch": 27, "format": "json"} v 0) v1 -- 0x7f4cec005190 con 0x7f4d0c103340 2026-03-10T07:57:47.521 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:47.520+0000 7f4d03fff700 1 -- 192.168.123.105:0/2939084324 <== mon.0 v2:192.168.123.105:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump", "epoch": 27, "format": "json"}]=0 dumped fsmap epoch 27 v35) v1 ==== 107+0+5109 (secure 0 0 0) 0x7f4cf406d020 con 0x7f4d0c103340 2026-03-10T07:57:47.523 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-10T07:57:47.523 INFO:teuthology.orchestra.run.vm05.stdout:{"epoch":27,"btime":"2026-03-10T07:55:52:329393+0000","default_fscid":1,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm 
v2"}},"feature_flags":{"enable_multiple":true,"ever_enabled_multiple":true},"standbys":[{"gid":34274,"name":"cephfs.vm05.pavqil","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.105:6829/426813062","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.105:6828","nonce":426813062},{"type":"v1","addr":"192.168.123.105:6829","nonce":426813062}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce subvolumes"}},"epoch":20},{"gid":44255,"name":"cephfs.vm05.omfhnh","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.105:6827/4251520120","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.105:6826","nonce":4251520120},{"type":"v1","addr":"192.168.123.105:6827","nonce":4251520120}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce 
subvolumes"}},"epoch":16},{"gid":44263,"name":"cephfs.vm08.ybmbgd","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.108:6825/1209244","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.108:6824","nonce":1209244},{"type":"v1","addr":"192.168.123.108:6825","nonce":1209244}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce subvolumes"}},"epoch":25}],"filesystems":[{"mdsmap":{"epoch":27,"flags":18,"flags_state":{"joinable":true,"allow_snaps":true,"allow_multimds_snaps":true,"allow_standby_replay":false,"refuse_client_session":false,"refuse_standby_for_another_fs":false,"balance_automate":false},"ever_allowed_features":0,"explicitly_allowed_features":0,"created":"2026-03-10T07:48:24.309293+0000","modified":"2026-03-10T07:55:51.333895+0000","tableserver":0,"root":0,"session_timeout":60,"session_autoclose":300,"required_client_features":{},"max_file_size":1099511627776,"max_xattr_size":65536,"last_failure":0,"last_failure_osd_epoch":83,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm 
v2"}},"max_mds":1,"in":[0],"up":{"mds_0":24313},"failed":[],"damaged":[],"stopped":[],"info":{"gid_24313":{"gid":24313,"name":"cephfs.vm08.dgsaon","rank":0,"incarnation":24,"state":"up:rejoin","state_seq":113,"addr":"192.168.123.108:6827/2963085185","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.108:6826","nonce":2963085185},{"type":"v1","addr":"192.168.123.108:6827","nonce":2963085185}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}}}},"data_pools":[3],"metadata_pool":2,"enabled":true,"fs_name":"cephfs","balancer":"","bal_rank_mask":"-1","standby_count_wanted":1,"qdb_leader":0,"qdb_cluster":[]},"id":1}]} 2026-03-10T07:57:47.525 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:47.524+0000 7f4d11264700 1 -- 192.168.123.105:0/2939084324 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f4cf80779e0 msgr2=0x7f4cf8079ea0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:57:47.525 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:47.524+0000 7f4d11264700 1 --2- 192.168.123.105:0/2939084324 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f4cf80779e0 0x7f4cf8079ea0 secure :-1 s=READY pgs=106 cs=0 l=1 rev1=1 crypto rx=0x7f4d0c19a450 tx=0x7f4cfc017040 comp rx=0 tx=0).stop 2026-03-10T07:57:47.525 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:47.524+0000 7f4d11264700 1 -- 192.168.123.105:0/2939084324 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f4d0c103340 msgr2=0x7f4d0c198e30 secure :-1 
s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:57:47.525 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:47.524+0000 7f4d11264700 1 --2- 192.168.123.105:0/2939084324 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f4d0c103340 0x7f4d0c198e30 secure :-1 s=READY pgs=146 cs=0 l=1 rev1=1 crypto rx=0x7f4cf40048c0 tx=0x7f4cf40048f0 comp rx=0 tx=0).stop 2026-03-10T07:57:47.525 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:47.524+0000 7f4d11264700 1 -- 192.168.123.105:0/2939084324 shutdown_connections 2026-03-10T07:57:47.526 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:47.524+0000 7f4d11264700 1 --2- 192.168.123.105:0/2939084324 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f4cf80779e0 0x7f4cf8079ea0 unknown :-1 s=CLOSED pgs=106 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:57:47.526 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:47.524+0000 7f4d11264700 1 --2- 192.168.123.105:0/2939084324 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f4d0c103340 0x7f4d0c198e30 unknown :-1 s=CLOSED pgs=146 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:57:47.526 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:47.524+0000 7f4d11264700 1 --2- 192.168.123.105:0/2939084324 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f4d0c103cf0 0x7f4d0c199370 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:57:47.526 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:47.524+0000 7f4d11264700 1 -- 192.168.123.105:0/2939084324 >> 192.168.123.105:0/2939084324 conn(0x7f4d0c0febd0 msgr2=0x7f4d0c100fc0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:57:47.526 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:47.524+0000 7f4d11264700 1 -- 192.168.123.105:0/2939084324 shutdown_connections 2026-03-10T07:57:47.526 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:47.525+0000 7f4d11264700 1 -- 192.168.123.105:0/2939084324 wait complete. 2026-03-10T07:57:47.527 INFO:teuthology.orchestra.run.vm05.stderr:dumped fsmap epoch 27 2026-03-10T07:57:47.568 DEBUG:teuthology.orchestra.run.vm05:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell --fsid 12e9780e-1c55-11f1-8896-79f7c2e9b508 -- ceph fs dump --format=json 28 2026-03-10T07:57:47.707 INFO:teuthology.orchestra.run.vm05.stderr:Inferring config /var/lib/ceph/12e9780e-1c55-11f1-8896-79f7c2e9b508/mon.vm05/config 2026-03-10T07:57:47.950 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:47.948+0000 7f3d09e4e700 1 -- 192.168.123.105:0/31908058 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3d04102040 msgr2=0x7f3d04102420 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:57:47.950 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:47.948+0000 7f3d09e4e700 1 --2- 192.168.123.105:0/31908058 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3d04102040 0x7f3d04102420 secure :-1 s=READY pgs=147 cs=0 l=1 rev1=1 crypto rx=0x7f3cec009b00 tx=0x7f3cec009e10 comp rx=0 tx=0).stop 2026-03-10T07:57:47.950 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:47.949+0000 7f3d09e4e700 1 -- 192.168.123.105:0/31908058 shutdown_connections 2026-03-10T07:57:47.950 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:47.949+0000 7f3d09e4e700 1 --2- 192.168.123.105:0/31908058 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f3d04102960 0x7f3d0410ae50 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:57:47.950 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:47.949+0000 7f3d09e4e700 1 --2- 192.168.123.105:0/31908058 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3d04102040 0x7f3d04102420 unknown :-1 s=CLOSED pgs=147 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp 
rx=0 tx=0).stop 2026-03-10T07:57:47.950 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:47.949+0000 7f3d09e4e700 1 -- 192.168.123.105:0/31908058 >> 192.168.123.105:0/31908058 conn(0x7f3d040fb830 msgr2=0x7f3d040fdc50 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:57:47.950 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:47.949+0000 7f3d09e4e700 1 -- 192.168.123.105:0/31908058 shutdown_connections 2026-03-10T07:57:47.950 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:47.949+0000 7f3d09e4e700 1 -- 192.168.123.105:0/31908058 wait complete. 2026-03-10T07:57:47.951 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:47.950+0000 7f3d09e4e700 1 Processor -- start 2026-03-10T07:57:47.951 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:47.950+0000 7f3d09e4e700 1 -- start start 2026-03-10T07:57:47.951 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:47.950+0000 7f3d09e4e700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f3d04102040 0x7f3d04194960 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:57:47.951 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:47.950+0000 7f3d09e4e700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3d04102960 0x7f3d04194ea0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:57:47.951 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:47.950+0000 7f3d09e4e700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f3d04195530 con 0x7f3d04102960 2026-03-10T07:57:47.951 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:47.950+0000 7f3d09e4e700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f3d04199340 con 0x7f3d04102040 2026-03-10T07:57:47.952 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:47.950+0000 7f3d02ffd700 1 --2- 
>> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3d04102960 0x7f3d04194ea0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:57:47.952 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:47.950+0000 7f3d02ffd700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3d04102960 0x7f3d04194ea0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.105:46392/0 (socket says 192.168.123.105:46392) 2026-03-10T07:57:47.952 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:47.950+0000 7f3d02ffd700 1 -- 192.168.123.105:0/3581318113 learned_addr learned my addr 192.168.123.105:0/3581318113 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T07:57:47.952 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:47.951+0000 7f3d037fe700 1 --2- 192.168.123.105:0/3581318113 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f3d04102040 0x7f3d04194960 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:57:47.952 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:47.951+0000 7f3d02ffd700 1 -- 192.168.123.105:0/3581318113 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f3d04102040 msgr2=0x7f3d04194960 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:57:47.952 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:47.951+0000 7f3d02ffd700 1 --2- 192.168.123.105:0/3581318113 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f3d04102040 0x7f3d04194960 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:57:47.952 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:47.951+0000 7f3d02ffd700 1 -- 
192.168.123.105:0/3581318113 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f3cec0097e0 con 0x7f3d04102960 2026-03-10T07:57:47.952 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:47.951+0000 7f3d02ffd700 1 --2- 192.168.123.105:0/3581318113 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3d04102960 0x7f3d04194ea0 secure :-1 s=READY pgs=148 cs=0 l=1 rev1=1 crypto rx=0x7f3cf400c8f0 tx=0x7f3cf400ccb0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:57:47.954 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:47.951+0000 7f3d00ff9700 1 -- 192.168.123.105:0/3581318113 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f3cf4007ab0 con 0x7f3d04102960 2026-03-10T07:57:47.954 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:47.951+0000 7f3d09e4e700 1 -- 192.168.123.105:0/3581318113 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f3d04199620 con 0x7f3d04102960 2026-03-10T07:57:47.954 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:47.951+0000 7f3d00ff9700 1 -- 192.168.123.105:0/3581318113 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7f3cf400ce90 con 0x7f3d04102960 2026-03-10T07:57:47.954 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:47.951+0000 7f3d00ff9700 1 -- 192.168.123.105:0/3581318113 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f3cf40186e0 con 0x7f3d04102960 2026-03-10T07:57:47.954 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:47.951+0000 7f3d09e4e700 1 -- 192.168.123.105:0/3581318113 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f3d04199b70 con 0x7f3d04102960 2026-03-10T07:57:47.958 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:47.953+0000 7f3d00ff9700 1 -- 192.168.123.105:0/3581318113 <== mon.0 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 38) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f3cf4007c10 con 0x7f3d04102960 2026-03-10T07:57:47.958 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:47.954+0000 7f3d00ff9700 1 --2- 192.168.123.105:0/3581318113 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f3cf00779e0 0x7f3cf0079ea0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:57:47.958 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:47.954+0000 7f3d00ff9700 1 -- 192.168.123.105:0/3581318113 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(84..84 src has 1..84) v4 ==== 6480+0+0 (secure 0 0 0) 0x7f3cf4099a70 con 0x7f3d04102960 2026-03-10T07:57:47.958 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:47.954+0000 7f3d09e4e700 1 -- 192.168.123.105:0/3581318113 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f3d04108550 con 0x7f3d04102960 2026-03-10T07:57:47.958 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:47.954+0000 7f3d037fe700 1 --2- 192.168.123.105:0/3581318113 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f3cf00779e0 0x7f3cf0079ea0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:57:47.958 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:47.955+0000 7f3d037fe700 1 --2- 192.168.123.105:0/3581318113 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f3cf00779e0 0x7f3cf0079ea0 secure :-1 s=READY pgs=107 cs=0 l=1 rev1=1 crypto rx=0x7f3cec00b5c0 tx=0x7f3cec005fb0 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:57:47.958 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:47.957+0000 7f3d00ff9700 1 -- 192.168.123.105:0/3581318113 <== mon.0 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f3cf40621f0 con 0x7f3d04102960 2026-03-10T07:57:48.104 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:48.101+0000 7f3d09e4e700 1 -- 192.168.123.105:0/3581318113 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "fs dump", "epoch": 28, "format": "json"} v 0) v1 -- 0x7f3d04066ed0 con 0x7f3d04102960 2026-03-10T07:57:48.104 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:48.102+0000 7f3d00ff9700 1 -- 192.168.123.105:0/3581318113 <== mon.0 v2:192.168.123.105:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump", "epoch": 28, "format": "json"}]=0 dumped fsmap epoch 28 v35) v1 ==== 107+0+5118 (secure 0 0 0) 0x7f3cf4061940 con 0x7f3d04102960 2026-03-10T07:57:48.105 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-10T07:57:48.105 INFO:teuthology.orchestra.run.vm05.stdout:{"epoch":28,"btime":"2026-03-10T07:55:53:338844+0000","default_fscid":1,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm 
v2"}},"feature_flags":{"enable_multiple":true,"ever_enabled_multiple":true},"standbys":[{"gid":34274,"name":"cephfs.vm05.pavqil","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.105:6829/426813062","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.105:6828","nonce":426813062},{"type":"v1","addr":"192.168.123.105:6829","nonce":426813062}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce subvolumes"}},"epoch":20},{"gid":44255,"name":"cephfs.vm05.omfhnh","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.105:6827/4251520120","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.105:6826","nonce":4251520120},{"type":"v1","addr":"192.168.123.105:6827","nonce":4251520120}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce 
subvolumes"}},"epoch":16},{"gid":44263,"name":"cephfs.vm08.ybmbgd","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.108:6825/1209244","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.108:6824","nonce":1209244},{"type":"v1","addr":"192.168.123.108:6825","nonce":1209244}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce subvolumes"}},"epoch":25}],"filesystems":[{"mdsmap":{"epoch":28,"flags":18,"flags_state":{"joinable":true,"allow_snaps":true,"allow_multimds_snaps":true,"allow_standby_replay":false,"refuse_client_session":false,"refuse_standby_for_another_fs":false,"balance_automate":false},"ever_allowed_features":0,"explicitly_allowed_features":0,"created":"2026-03-10T07:48:24.309293+0000","modified":"2026-03-10T07:55:53.338843+0000","tableserver":0,"root":0,"session_timeout":60,"session_autoclose":300,"required_client_features":{},"max_file_size":1099511627776,"max_xattr_size":65536,"last_failure":0,"last_failure_osd_epoch":83,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm 
v2"}},"max_mds":1,"in":[0],"up":{"mds_0":24313},"failed":[],"damaged":[],"stopped":[],"info":{"gid_24313":{"gid":24313,"name":"cephfs.vm08.dgsaon","rank":0,"incarnation":24,"state":"up:active","state_seq":114,"addr":"192.168.123.108:6827/2963085185","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.108:6826","nonce":2963085185},{"type":"v1","addr":"192.168.123.108:6827","nonce":2963085185}]},"join_fscid":1,"export_targets":[],"features":4540138322906710015,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}}}},"data_pools":[3],"metadata_pool":2,"enabled":true,"fs_name":"cephfs","balancer":"","bal_rank_mask":"-1","standby_count_wanted":1,"qdb_leader":24313,"qdb_cluster":[24313]},"id":1}]} 2026-03-10T07:57:48.106 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:48.105+0000 7f3d09e4e700 1 -- 192.168.123.105:0/3581318113 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f3cf00779e0 msgr2=0x7f3cf0079ea0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:57:48.106 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:48.105+0000 7f3d09e4e700 1 --2- 192.168.123.105:0/3581318113 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f3cf00779e0 0x7f3cf0079ea0 secure :-1 s=READY pgs=107 cs=0 l=1 rev1=1 crypto rx=0x7f3cec00b5c0 tx=0x7f3cec005fb0 comp rx=0 tx=0).stop 2026-03-10T07:57:48.106 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:48.105+0000 7f3d09e4e700 1 -- 192.168.123.105:0/3581318113 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3d04102960 msgr2=0x7f3d04194ea0 secure :-1 
s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:57:48.107 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:48.105+0000 7f3d09e4e700 1 --2- 192.168.123.105:0/3581318113 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3d04102960 0x7f3d04194ea0 secure :-1 s=READY pgs=148 cs=0 l=1 rev1=1 crypto rx=0x7f3cf400c8f0 tx=0x7f3cf400ccb0 comp rx=0 tx=0).stop 2026-03-10T07:57:48.107 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:48.106+0000 7f3d09e4e700 1 -- 192.168.123.105:0/3581318113 shutdown_connections 2026-03-10T07:57:48.107 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:48.106+0000 7f3d09e4e700 1 --2- 192.168.123.105:0/3581318113 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f3d04102040 0x7f3d04194960 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:57:48.107 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:48.106+0000 7f3d09e4e700 1 --2- 192.168.123.105:0/3581318113 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f3cf00779e0 0x7f3cf0079ea0 unknown :-1 s=CLOSED pgs=107 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:57:48.107 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:48.106+0000 7f3d09e4e700 1 --2- 192.168.123.105:0/3581318113 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3d04102960 0x7f3d04194ea0 unknown :-1 s=CLOSED pgs=148 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:57:48.107 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:48.106+0000 7f3d09e4e700 1 -- 192.168.123.105:0/3581318113 >> 192.168.123.105:0/3581318113 conn(0x7f3d040fb830 msgr2=0x7f3d04105690 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:57:48.107 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:48.106+0000 7f3d09e4e700 1 -- 192.168.123.105:0/3581318113 shutdown_connections 2026-03-10T07:57:48.108 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:48.106+0000 7f3d09e4e700 1 -- 192.168.123.105:0/3581318113 wait complete. 2026-03-10T07:57:48.108 INFO:teuthology.orchestra.run.vm05.stderr:dumped fsmap epoch 28 2026-03-10T07:57:48.174 DEBUG:teuthology.orchestra.run.vm05:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell --fsid 12e9780e-1c55-11f1-8896-79f7c2e9b508 -- ceph fs dump --format=json 29 2026-03-10T07:57:48.310 INFO:teuthology.orchestra.run.vm05.stderr:Inferring config /var/lib/ceph/12e9780e-1c55-11f1-8896-79f7c2e9b508/mon.vm05/config 2026-03-10T07:57:48.354 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:57:48 vm05.local ceph-mon[130117]: pgmap v204: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail 2026-03-10T07:57:48.354 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:57:48 vm05.local ceph-mon[130117]: from='client.? 192.168.123.105:0/2939084324' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 27, "format": "json"}]: dispatch 2026-03-10T07:57:48.354 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:57:48 vm05.local ceph-mon[130117]: from='client.? 192.168.123.105:0/3581318113' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 28, "format": "json"}]: dispatch 2026-03-10T07:57:48.418 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:57:48 vm08.local ceph-mon[107898]: pgmap v204: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail 2026-03-10T07:57:48.418 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:57:48 vm08.local ceph-mon[107898]: from='client.? 192.168.123.105:0/2939084324' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 27, "format": "json"}]: dispatch 2026-03-10T07:57:48.418 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:57:48 vm08.local ceph-mon[107898]: from='client.? 
192.168.123.105:0/3581318113' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 28, "format": "json"}]: dispatch 2026-03-10T07:57:48.533 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:48.531+0000 7f6ff52b5700 1 -- 192.168.123.105:0/1324442627 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f6ff0101120 msgr2=0x7f6ff0101500 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:57:48.533 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:48.531+0000 7f6ff52b5700 1 --2- 192.168.123.105:0/1324442627 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f6ff0101120 0x7f6ff0101500 secure :-1 s=READY pgs=149 cs=0 l=1 rev1=1 crypto rx=0x7f6fd8009b00 tx=0x7f6fd8009e10 comp rx=0 tx=0).stop 2026-03-10T07:57:48.533 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:48.532+0000 7f6ff52b5700 1 -- 192.168.123.105:0/1324442627 shutdown_connections 2026-03-10T07:57:48.533 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:48.532+0000 7f6ff52b5700 1 --2- 192.168.123.105:0/1324442627 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f6ff0101ad0 0x7f6ff0105b20 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:57:48.533 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:48.532+0000 7f6ff52b5700 1 --2- 192.168.123.105:0/1324442627 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f6ff0101120 0x7f6ff0101500 unknown :-1 s=CLOSED pgs=149 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:57:48.533 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:48.532+0000 7f6ff52b5700 1 -- 192.168.123.105:0/1324442627 >> 192.168.123.105:0/1324442627 conn(0x7f6ff00fc9b0 msgr2=0x7f6ff00fedd0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:57:48.533 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:48.532+0000 7f6ff52b5700 1 -- 192.168.123.105:0/1324442627 shutdown_connections 
2026-03-10T07:57:48.534 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:48.532+0000 7f6ff52b5700 1 -- 192.168.123.105:0/1324442627 wait complete. 2026-03-10T07:57:48.534 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:48.533+0000 7f6ff52b5700 1 Processor -- start 2026-03-10T07:57:48.534 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:48.533+0000 7f6ff52b5700 1 -- start start 2026-03-10T07:57:48.534 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:48.533+0000 7f6ff52b5700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f6ff0101120 0x7f6ff019ca10 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:57:48.534 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:48.533+0000 7f6ff52b5700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f6ff0101ad0 0x7f6ff019cf50 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:57:48.534 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:48.533+0000 7f6ff52b5700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f6ff019d550 con 0x7f6ff0101ad0 2026-03-10T07:57:48.534 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:48.533+0000 7f6ff52b5700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f6ff0196a90 con 0x7f6ff0101120 2026-03-10T07:57:48.534 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:48.533+0000 7f6fee7fc700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f6ff0101ad0 0x7f6ff019cf50 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:57:48.535 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:48.533+0000 7f6fee7fc700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f6ff0101ad0 
0x7f6ff019cf50 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.105:46412/0 (socket says 192.168.123.105:46412) 2026-03-10T07:57:48.535 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:48.533+0000 7f6fee7fc700 1 -- 192.168.123.105:0/1222114437 learned_addr learned my addr 192.168.123.105:0/1222114437 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T07:57:48.535 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:48.534+0000 7f6feeffd700 1 --2- 192.168.123.105:0/1222114437 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f6ff0101120 0x7f6ff019ca10 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:57:48.535 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:48.534+0000 7f6fee7fc700 1 -- 192.168.123.105:0/1222114437 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f6ff0101120 msgr2=0x7f6ff019ca10 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:57:48.535 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:48.534+0000 7f6fee7fc700 1 --2- 192.168.123.105:0/1222114437 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f6ff0101120 0x7f6ff019ca10 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:57:48.535 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:48.534+0000 7f6fee7fc700 1 -- 192.168.123.105:0/1222114437 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f6fd80097e0 con 0x7f6ff0101ad0 2026-03-10T07:57:48.535 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:48.534+0000 7f6fee7fc700 1 --2- 192.168.123.105:0/1222114437 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f6ff0101ad0 0x7f6ff019cf50 secure :-1 s=READY 
pgs=150 cs=0 l=1 rev1=1 crypto rx=0x7f6fe000c960 tx=0x7f6fe000cd20 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:57:48.536 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:48.534+0000 7f6fe7fff700 1 -- 192.168.123.105:0/1222114437 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f6fe0007a50 con 0x7f6ff0101ad0 2026-03-10T07:57:48.536 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:48.534+0000 7f6fe7fff700 1 -- 192.168.123.105:0/1222114437 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7f6fe0007bb0 con 0x7f6ff0101ad0 2026-03-10T07:57:48.536 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:48.534+0000 7f6fe7fff700 1 -- 192.168.123.105:0/1222114437 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f6fe00186a0 con 0x7f6ff0101ad0 2026-03-10T07:57:48.536 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:48.535+0000 7f6ff52b5700 1 -- 192.168.123.105:0/1222114437 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f6ff0196d70 con 0x7f6ff0101ad0 2026-03-10T07:57:48.537 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:48.535+0000 7f6ff52b5700 1 -- 192.168.123.105:0/1222114437 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f6ff0197290 con 0x7f6ff0101ad0 2026-03-10T07:57:48.537 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:48.536+0000 7f6fe7fff700 1 -- 192.168.123.105:0/1222114437 <== mon.0 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 38) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f6fe001f030 con 0x7f6ff0101ad0 2026-03-10T07:57:48.537 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:48.536+0000 7f6ff52b5700 1 -- 192.168.123.105:0/1222114437 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": 
"get_command_descriptions"} v 0) v1 -- 0x7f6ff0109450 con 0x7f6ff0101ad0 2026-03-10T07:57:48.540 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:48.539+0000 7f6fe7fff700 1 --2- 192.168.123.105:0/1222114437 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f6fdc077910 0x7f6fdc079dd0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:57:48.540 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:48.539+0000 7f6fe7fff700 1 -- 192.168.123.105:0/1222114437 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(84..84 src has 1..84) v4 ==== 6480+0+0 (secure 0 0 0) 0x7f6fe0099bb0 con 0x7f6ff0101ad0 2026-03-10T07:57:48.541 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:48.539+0000 7f6fe7fff700 1 -- 192.168.123.105:0/1222114437 <== mon.0 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f6fe009d2a0 con 0x7f6ff0101ad0 2026-03-10T07:57:48.541 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:48.540+0000 7f6feeffd700 1 --2- 192.168.123.105:0/1222114437 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f6fdc077910 0x7f6fdc079dd0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:57:48.541 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:48.540+0000 7f6feeffd700 1 --2- 192.168.123.105:0/1222114437 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f6fdc077910 0x7f6fdc079dd0 secure :-1 s=READY pgs=108 cs=0 l=1 rev1=1 crypto rx=0x7f6fd8006010 tx=0x7f6fd8005c00 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:57:48.677 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:48.675+0000 7f6ff52b5700 1 -- 192.168.123.105:0/1222114437 --> 
[v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "fs dump", "epoch": 29, "format": "json"} v 0) v1 -- 0x7f6ff0197fa0 con 0x7f6ff0101ad0 2026-03-10T07:57:48.677 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:48.676+0000 7f6fe7fff700 1 -- 192.168.123.105:0/1222114437 <== mon.0 v2:192.168.123.105:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump", "epoch": 29, "format": "json"}]=0 dumped fsmap epoch 29 v35) v1 ==== 107+0+4313 (secure 0 0 0) 0x7f6fe0062330 con 0x7f6ff0101ad0 2026-03-10T07:57:48.677 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-10T07:57:48.677 INFO:teuthology.orchestra.run.vm05.stdout:{"epoch":29,"btime":"2026-03-10T07:56:03:318085+0000","default_fscid":1,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"feature_flags":{"enable_multiple":true,"ever_enabled_multiple":true},"standbys":[{"gid":34274,"name":"cephfs.vm05.pavqil","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.105:6829/426813062","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.105:6828","nonce":426813062},{"type":"v1","addr":"192.168.123.105:6829","nonce":426813062}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce 
subvolumes"}},"epoch":20},{"gid":44255,"name":"cephfs.vm05.omfhnh","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.105:6827/4251520120","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.105:6826","nonce":4251520120},{"type":"v1","addr":"192.168.123.105:6827","nonce":4251520120}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce subvolumes"}},"epoch":16},{"gid":44263,"name":"cephfs.vm08.ybmbgd","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.108:6825/1209244","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.108:6824","nonce":1209244},{"type":"v1","addr":"192.168.123.108:6825","nonce":1209244}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce 
subvolumes"}},"epoch":25}],"filesystems":[{"mdsmap":{"epoch":29,"flags":18,"flags_state":{"joinable":true,"allow_snaps":true,"allow_multimds_snaps":true,"allow_standby_replay":false,"refuse_client_session":false,"refuse_standby_for_another_fs":false,"balance_automate":false},"ever_allowed_features":0,"explicitly_allowed_features":0,"created":"2026-03-10T07:48:24.309293+0000","modified":"2026-03-10T07:56:03.318082+0000","tableserver":0,"root":0,"session_timeout":60,"session_autoclose":300,"required_client_features":{},"max_file_size":1099511627776,"max_xattr_size":65536,"last_failure":0,"last_failure_osd_epoch":84,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"max_mds":1,"in":[0],"up":{},"failed":[0],"damaged":[],"stopped":[],"info":{},"data_pools":[3],"metadata_pool":2,"enabled":true,"fs_name":"cephfs","balancer":"","bal_rank_mask":"-1","standby_count_wanted":1,"qdb_leader":0,"qdb_cluster":[]},"id":1}]} 2026-03-10T07:57:48.678 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:48.677+0000 7f6ff52b5700 1 -- 192.168.123.105:0/1222114437 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f6fdc077910 msgr2=0x7f6fdc079dd0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:57:48.679 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:48.678+0000 7f6ff52b5700 1 --2- 192.168.123.105:0/1222114437 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f6fdc077910 0x7f6fdc079dd0 secure :-1 s=READY pgs=108 cs=0 l=1 rev1=1 crypto rx=0x7f6fd8006010 tx=0x7f6fd8005c00 comp rx=0 tx=0).stop 2026-03-10T07:57:48.679 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:48.678+0000 7f6ff52b5700 1 -- 192.168.123.105:0/1222114437 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f6ff0101ad0 msgr2=0x7f6ff019cf50 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:57:48.679 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:48.678+0000 7f6ff52b5700 1 --2- 192.168.123.105:0/1222114437 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f6ff0101ad0 0x7f6ff019cf50 secure :-1 s=READY pgs=150 cs=0 l=1 rev1=1 crypto rx=0x7f6fe000c960 tx=0x7f6fe000cd20 comp rx=0 tx=0).stop 2026-03-10T07:57:48.679 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:48.678+0000 7f6ff52b5700 1 -- 192.168.123.105:0/1222114437 shutdown_connections 2026-03-10T07:57:48.679 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:48.678+0000 7f6ff52b5700 1 --2- 192.168.123.105:0/1222114437 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f6ff0101120 0x7f6ff019ca10 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:57:48.679 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:48.678+0000 7f6ff52b5700 1 --2- 192.168.123.105:0/1222114437 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f6fdc077910 0x7f6fdc079dd0 unknown :-1 s=CLOSED pgs=108 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:57:48.680 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:48.679+0000 7f6ff52b5700 1 --2- 192.168.123.105:0/1222114437 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f6ff0101ad0 0x7f6ff019cf50 unknown :-1 s=CLOSED pgs=150 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:57:48.680 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:48.679+0000 7f6ff52b5700 1 -- 192.168.123.105:0/1222114437 >> 192.168.123.105:0/1222114437 conn(0x7f6ff00fc9b0 msgr2=0x7f6ff00fedd0 unknown :-1 s=STATE_NONE 
l=0).mark_down 2026-03-10T07:57:48.680 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:48.679+0000 7f6ff52b5700 1 -- 192.168.123.105:0/1222114437 shutdown_connections 2026-03-10T07:57:48.680 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:48.679+0000 7f6ff52b5700 1 -- 192.168.123.105:0/1222114437 wait complete. 2026-03-10T07:57:48.681 INFO:teuthology.orchestra.run.vm05.stderr:dumped fsmap epoch 29 2026-03-10T07:57:48.738 DEBUG:teuthology.orchestra.run.vm05:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell --fsid 12e9780e-1c55-11f1-8896-79f7c2e9b508 -- ceph fs dump --format=json 30 2026-03-10T07:57:48.881 INFO:teuthology.orchestra.run.vm05.stderr:Inferring config /var/lib/ceph/12e9780e-1c55-11f1-8896-79f7c2e9b508/mon.vm05/config 2026-03-10T07:57:49.115 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:49.113+0000 7ff90104a700 1 -- 192.168.123.105:0/1072643994 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff8fc102100 msgr2=0x7ff8fc1024e0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:57:49.115 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:49.113+0000 7ff90104a700 1 --2- 192.168.123.105:0/1072643994 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff8fc102100 0x7ff8fc1024e0 secure :-1 s=READY pgs=151 cs=0 l=1 rev1=1 crypto rx=0x7ff8e4009b00 tx=0x7ff8e4009e10 comp rx=0 tx=0).stop 2026-03-10T07:57:49.115 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:49.114+0000 7ff90104a700 1 -- 192.168.123.105:0/1072643994 shutdown_connections 2026-03-10T07:57:49.115 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:49.114+0000 7ff90104a700 1 --2- 192.168.123.105:0/1072643994 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7ff8fc102a20 0x7ff8fc10af10 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:57:49.115 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:49.114+0000 7ff90104a700 1 --2- 192.168.123.105:0/1072643994 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff8fc102100 0x7ff8fc1024e0 unknown :-1 s=CLOSED pgs=151 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:57:49.115 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:49.114+0000 7ff90104a700 1 -- 192.168.123.105:0/1072643994 >> 192.168.123.105:0/1072643994 conn(0x7ff8fc0fb830 msgr2=0x7ff8fc0fdc50 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:57:49.115 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:49.114+0000 7ff90104a700 1 -- 192.168.123.105:0/1072643994 shutdown_connections 2026-03-10T07:57:49.116 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:49.114+0000 7ff90104a700 1 -- 192.168.123.105:0/1072643994 wait complete. 2026-03-10T07:57:49.116 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:49.115+0000 7ff90104a700 1 Processor -- start 2026-03-10T07:57:49.116 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:49.115+0000 7ff90104a700 1 -- start start 2026-03-10T07:57:49.116 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:49.115+0000 7ff90104a700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7ff8fc102100 0x7ff8fc0ffd60 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:57:49.116 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:49.115+0000 7ff90104a700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff8fc102a20 0x7ff8fc1002a0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:57:49.116 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:49.115+0000 7ff90104a700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7ff8fc1007e0 con 0x7ff8fc102a20 2026-03-10T07:57:49.117 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:49.115+0000 7ff8fa59c700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff8fc102a20 0x7ff8fc1002a0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:57:49.117 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:49.115+0000 7ff8fa59c700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff8fc102a20 0x7ff8fc1002a0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.105:42926/0 (socket says 192.168.123.105:42926) 2026-03-10T07:57:49.117 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:49.115+0000 7ff8fa59c700 1 -- 192.168.123.105:0/898994329 learned_addr learned my addr 192.168.123.105:0/898994329 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T07:57:49.117 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:49.116+0000 7ff90104a700 1 -- 192.168.123.105:0/898994329 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7ff8fc100920 con 0x7ff8fc102100 2026-03-10T07:57:49.117 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:49.116+0000 7ff8fa59c700 1 -- 192.168.123.105:0/898994329 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7ff8fc102100 msgr2=0x7ff8fc0ffd60 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:57:49.117 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:49.116+0000 7ff8fa59c700 1 --2- 192.168.123.105:0/898994329 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7ff8fc102100 0x7ff8fc0ffd60 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:57:49.117 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:49.116+0000 7ff8fa59c700 1 -- 192.168.123.105:0/898994329 
--> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7ff8e40097e0 con 0x7ff8fc102a20 2026-03-10T07:57:49.117 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:49.116+0000 7ff8fa59c700 1 --2- 192.168.123.105:0/898994329 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff8fc102a20 0x7ff8fc1002a0 secure :-1 s=READY pgs=152 cs=0 l=1 rev1=1 crypto rx=0x7ff8ec00d900 tx=0x7ff8ec00dcc0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:57:49.118 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:49.116+0000 7ff8f3fff700 1 -- 192.168.123.105:0/898994329 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7ff8ec0098e0 con 0x7ff8fc102a20 2026-03-10T07:57:49.118 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:49.116+0000 7ff8f3fff700 1 -- 192.168.123.105:0/898994329 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7ff8ec010460 con 0x7ff8fc102a20 2026-03-10T07:57:49.118 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:49.116+0000 7ff8f3fff700 1 -- 192.168.123.105:0/898994329 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7ff8ec00f5d0 con 0x7ff8fc102a20 2026-03-10T07:57:49.118 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:49.116+0000 7ff90104a700 1 -- 192.168.123.105:0/898994329 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7ff8fc1a2da0 con 0x7ff8fc102a20 2026-03-10T07:57:49.118 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:49.116+0000 7ff90104a700 1 -- 192.168.123.105:0/898994329 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7ff8fc196bb0 con 0x7ff8fc102a20 2026-03-10T07:57:49.119 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:49.117+0000 7ff90104a700 1 -- 
192.168.123.105:0/898994329 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7ff8fc108610 con 0x7ff8fc102a20 2026-03-10T07:57:49.122 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:49.121+0000 7ff8f3fff700 1 -- 192.168.123.105:0/898994329 <== mon.0 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 38) v1 ==== 100115+0+0 (secure 0 0 0) 0x7ff8ec0105d0 con 0x7ff8fc102a20 2026-03-10T07:57:49.122 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:49.121+0000 7ff8f3fff700 1 --2- 192.168.123.105:0/898994329 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7ff8e80779e0 0x7ff8e8079ea0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:57:49.122 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:49.121+0000 7ff8f3fff700 1 -- 192.168.123.105:0/898994329 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(84..84 src has 1..84) v4 ==== 6480+0+0 (secure 0 0 0) 0x7ff8ec020030 con 0x7ff8fc102a20 2026-03-10T07:57:49.122 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:49.121+0000 7ff8f3fff700 1 -- 192.168.123.105:0/898994329 <== mon.0 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7ff8ec09a100 con 0x7ff8fc102a20 2026-03-10T07:57:49.123 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:49.121+0000 7ff8fad9d700 1 --2- 192.168.123.105:0/898994329 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7ff8e80779e0 0x7ff8e8079ea0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:57:49.123 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:49.122+0000 7ff8fad9d700 1 --2- 192.168.123.105:0/898994329 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] 
conn(0x7ff8e80779e0 0x7ff8e8079ea0 secure :-1 s=READY pgs=109 cs=0 l=1 rev1=1 crypto rx=0x7ff8e400b5c0 tx=0x7ff8e4009f90 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:57:49.262 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:57:49 vm05.local ceph-mon[130117]: from='client.? 192.168.123.105:0/1222114437' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 29, "format": "json"}]: dispatch 2026-03-10T07:57:49.263 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:49.261+0000 7ff90104a700 1 -- 192.168.123.105:0/898994329 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "fs dump", "epoch": 30, "format": "json"} v 0) v1 -- 0x7ff8fc04ea90 con 0x7ff8fc102a20 2026-03-10T07:57:49.263 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:49.262+0000 7ff8f3fff700 1 -- 192.168.123.105:0/898994329 <== mon.0 v2:192.168.123.105:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump", "epoch": 30, "format": "json"}]=0 dumped fsmap epoch 30 v35) v1 ==== 107+0+4392 (secure 0 0 0) 0x7ff8ec0625c0 con 0x7ff8fc102a20 2026-03-10T07:57:49.263 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-10T07:57:49.263 INFO:teuthology.orchestra.run.vm05.stdout:{"epoch":30,"btime":"2026-03-10T07:56:03:324551+0000","default_fscid":1,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm 
v2"}},"feature_flags":{"enable_multiple":true,"ever_enabled_multiple":true},"standbys":[{"gid":44255,"name":"cephfs.vm05.omfhnh","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.105:6827/4251520120","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.105:6826","nonce":4251520120},{"type":"v1","addr":"192.168.123.105:6827","nonce":4251520120}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce subvolumes"}},"epoch":16},{"gid":44263,"name":"cephfs.vm08.ybmbgd","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.108:6825/1209244","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.108:6824","nonce":1209244},{"type":"v1","addr":"192.168.123.108:6825","nonce":1209244}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce 
subvolumes"}},"epoch":25}],"filesystems":[{"mdsmap":{"epoch":30,"flags":18,"flags_state":{"joinable":true,"allow_snaps":true,"allow_multimds_snaps":true,"allow_standby_replay":false,"refuse_client_session":false,"refuse_standby_for_another_fs":false,"balance_automate":false},"ever_allowed_features":0,"explicitly_allowed_features":0,"created":"2026-03-10T07:48:24.309293+0000","modified":"2026-03-10T07:56:03.324548+0000","tableserver":0,"root":0,"session_timeout":60,"session_autoclose":300,"required_client_features":{},"max_file_size":1099511627776,"max_xattr_size":65536,"last_failure":0,"last_failure_osd_epoch":84,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce subvolumes"}},"max_mds":1,"in":[0],"up":{"mds_0":34274},"failed":[],"damaged":[],"stopped":[],"info":{"gid_34274":{"gid":34274,"name":"cephfs.vm05.pavqil","rank":0,"incarnation":30,"state":"up:replay","state_seq":1,"addr":"192.168.123.105:6829/426813062","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.105:6828","nonce":426813062},{"type":"v1","addr":"192.168.123.105:6829","nonce":426813062}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log 
segments","feature_12":"quiesce subvolumes"}}}},"data_pools":[3],"metadata_pool":2,"enabled":true,"fs_name":"cephfs","balancer":"","bal_rank_mask":"-1","standby_count_wanted":1,"qdb_leader":0,"qdb_cluster":[]},"id":1}]} 2026-03-10T07:57:49.266 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:49.264+0000 7ff90104a700 1 -- 192.168.123.105:0/898994329 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7ff8e80779e0 msgr2=0x7ff8e8079ea0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:57:49.266 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:49.264+0000 7ff90104a700 1 --2- 192.168.123.105:0/898994329 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7ff8e80779e0 0x7ff8e8079ea0 secure :-1 s=READY pgs=109 cs=0 l=1 rev1=1 crypto rx=0x7ff8e400b5c0 tx=0x7ff8e4009f90 comp rx=0 tx=0).stop 2026-03-10T07:57:49.266 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:49.264+0000 7ff90104a700 1 -- 192.168.123.105:0/898994329 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff8fc102a20 msgr2=0x7ff8fc1002a0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:57:49.266 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:49.264+0000 7ff90104a700 1 --2- 192.168.123.105:0/898994329 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff8fc102a20 0x7ff8fc1002a0 secure :-1 s=READY pgs=152 cs=0 l=1 rev1=1 crypto rx=0x7ff8ec00d900 tx=0x7ff8ec00dcc0 comp rx=0 tx=0).stop 2026-03-10T07:57:49.266 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:49.265+0000 7ff90104a700 1 -- 192.168.123.105:0/898994329 shutdown_connections 2026-03-10T07:57:49.266 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:49.265+0000 7ff90104a700 1 --2- 192.168.123.105:0/898994329 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7ff8fc102100 0x7ff8fc0ffd60 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 
tx=0).stop 2026-03-10T07:57:49.266 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:49.265+0000 7ff90104a700 1 --2- 192.168.123.105:0/898994329 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7ff8e80779e0 0x7ff8e8079ea0 unknown :-1 s=CLOSED pgs=109 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:57:49.266 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:49.265+0000 7ff90104a700 1 --2- 192.168.123.105:0/898994329 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7ff8fc102a20 0x7ff8fc1002a0 unknown :-1 s=CLOSED pgs=152 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:57:49.266 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:49.265+0000 7ff90104a700 1 -- 192.168.123.105:0/898994329 >> 192.168.123.105:0/898994329 conn(0x7ff8fc0fb830 msgr2=0x7ff8fc105750 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:57:49.266 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:49.265+0000 7ff90104a700 1 -- 192.168.123.105:0/898994329 shutdown_connections 2026-03-10T07:57:49.266 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:49.265+0000 7ff90104a700 1 -- 192.168.123.105:0/898994329 wait complete. 2026-03-10T07:57:49.267 INFO:teuthology.orchestra.run.vm05.stderr:dumped fsmap epoch 30 2026-03-10T07:57:49.327 DEBUG:teuthology.orchestra.run.vm05:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell --fsid 12e9780e-1c55-11f1-8896-79f7c2e9b508 -- ceph fs dump --format=json 31 2026-03-10T07:57:49.418 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:57:49 vm08.local ceph-mon[107898]: from='client.? 
192.168.123.105:0/1222114437' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 29, "format": "json"}]: dispatch 2026-03-10T07:57:49.459 INFO:teuthology.orchestra.run.vm05.stderr:Inferring config /var/lib/ceph/12e9780e-1c55-11f1-8896-79f7c2e9b508/mon.vm05/config 2026-03-10T07:57:49.689 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:49.687+0000 7f036e5d1700 1 -- 192.168.123.105:0/2685142438 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f03680684d0 msgr2=0x7f03680688b0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:57:49.689 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:49.687+0000 7f036e5d1700 1 --2- 192.168.123.105:0/2685142438 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f03680684d0 0x7f03680688b0 secure :-1 s=READY pgs=153 cs=0 l=1 rev1=1 crypto rx=0x7f0350009b30 tx=0x7f0350009e40 comp rx=0 tx=0).stop 2026-03-10T07:57:49.689 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:49.688+0000 7f036e5d1700 1 -- 192.168.123.105:0/2685142438 shutdown_connections 2026-03-10T07:57:49.689 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:49.688+0000 7f036e5d1700 1 --2- 192.168.123.105:0/2685142438 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f0368068df0 0x7f036810d5b0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:57:49.689 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:49.688+0000 7f036e5d1700 1 --2- 192.168.123.105:0/2685142438 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f03680684d0 0x7f03680688b0 unknown :-1 s=CLOSED pgs=153 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:57:49.689 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:49.688+0000 7f036e5d1700 1 -- 192.168.123.105:0/2685142438 >> 192.168.123.105:0/2685142438 conn(0x7f0368075960 msgr2=0x7f0368075d70 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:57:49.689 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:49.688+0000 7f036e5d1700 1 -- 192.168.123.105:0/2685142438 shutdown_connections 2026-03-10T07:57:49.689 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:49.688+0000 7f036e5d1700 1 -- 192.168.123.105:0/2685142438 wait complete. 2026-03-10T07:57:49.690 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:49.689+0000 7f036e5d1700 1 Processor -- start 2026-03-10T07:57:49.690 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:49.689+0000 7f036e5d1700 1 -- start start 2026-03-10T07:57:49.690 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:49.689+0000 7f036e5d1700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f0368068df0 0x7f0368198fd0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:57:49.690 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:49.689+0000 7f036e5d1700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f0368199510 0x7f036819d980 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:57:49.690 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:49.689+0000 7f036e5d1700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f0368199aa0 con 0x7f0368199510 2026-03-10T07:57:49.691 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:49.689+0000 7f036e5d1700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f0368199c10 con 0x7f0368068df0 2026-03-10T07:57:49.691 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:49.689+0000 7f03677fe700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f0368199510 0x7f036819d980 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:57:49.691 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:49.689+0000 7f03677fe700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f0368199510 0x7f036819d980 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.105:42940/0 (socket says 192.168.123.105:42940) 2026-03-10T07:57:49.691 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:49.689+0000 7f03677fe700 1 -- 192.168.123.105:0/1584265403 learned_addr learned my addr 192.168.123.105:0/1584265403 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T07:57:49.691 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:49.690+0000 7f03677fe700 1 -- 192.168.123.105:0/1584265403 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f0368068df0 msgr2=0x7f0368198fd0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:57:49.691 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:49.690+0000 7f0367fff700 1 --2- 192.168.123.105:0/1584265403 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f0368068df0 0x7f0368198fd0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:57:49.691 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:49.690+0000 7f03677fe700 1 --2- 192.168.123.105:0/1584265403 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f0368068df0 0x7f0368198fd0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:57:49.691 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:49.690+0000 7f03677fe700 1 -- 192.168.123.105:0/1584265403 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f03500097e0 con 0x7f0368199510 2026-03-10T07:57:49.691 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:49.690+0000 7f0367fff700 1 --2- 192.168.123.105:0/1584265403 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f0368068df0 0x7f0368198fd0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request state changed! 2026-03-10T07:57:49.691 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:49.690+0000 7f03677fe700 1 --2- 192.168.123.105:0/1584265403 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f0368199510 0x7f036819d980 secure :-1 s=READY pgs=154 cs=0 l=1 rev1=1 crypto rx=0x7f035800c930 tx=0x7f035800ccf0 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:57:49.692 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:49.690+0000 7f03657fa700 1 -- 192.168.123.105:0/1584265403 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f0358007ab0 con 0x7f0368199510 2026-03-10T07:57:49.692 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:49.690+0000 7f03657fa700 1 -- 192.168.123.105:0/1584265403 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7f0358007c10 con 0x7f0368199510 2026-03-10T07:57:49.692 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:49.690+0000 7f03657fa700 1 -- 192.168.123.105:0/1584265403 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f0358018730 con 0x7f0368199510 2026-03-10T07:57:49.692 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:49.690+0000 7f036e5d1700 1 -- 192.168.123.105:0/1584265403 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f036819df20 con 0x7f0368199510 2026-03-10T07:57:49.692 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:49.690+0000 7f036e5d1700 1 -- 192.168.123.105:0/1584265403 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] 
-- mon_subscribe({osdmap=0}) v3 -- 0x7f036819e390 con 0x7f0368199510 2026-03-10T07:57:49.696 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:49.691+0000 7f036e5d1700 1 -- 192.168.123.105:0/1584265403 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f036810ad20 con 0x7f0368199510 2026-03-10T07:57:49.697 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:49.694+0000 7f03657fa700 1 -- 192.168.123.105:0/1584265403 <== mon.0 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 38) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f035801f030 con 0x7f0368199510 2026-03-10T07:57:49.697 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:49.695+0000 7f03657fa700 1 --2- 192.168.123.105:0/1584265403 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f03540779e0 0x7f0354079ea0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:57:49.697 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:49.695+0000 7f03657fa700 1 -- 192.168.123.105:0/1584265403 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(84..84 src has 1..84) v4 ==== 6480+0+0 (secure 0 0 0) 0x7f0358099fb0 con 0x7f0368199510 2026-03-10T07:57:49.697 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:49.695+0000 7f03657fa700 1 -- 192.168.123.105:0/1584265403 <== mon.0 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f035809a430 con 0x7f0368199510 2026-03-10T07:57:49.697 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:49.695+0000 7f0367fff700 1 --2- 192.168.123.105:0/1584265403 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f03540779e0 0x7f0354079ea0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:57:49.697 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:49.696+0000 7f0367fff700 1 --2- 192.168.123.105:0/1584265403 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f03540779e0 0x7f0354079ea0 secure :-1 s=READY pgs=110 cs=0 l=1 rev1=1 crypto rx=0x7f0350000c00 tx=0x7f0350005c00 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:57:49.832 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:49.830+0000 7f036e5d1700 1 -- 192.168.123.105:0/1584265403 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "fs dump", "epoch": 31, "format": "json"} v 0) v1 -- 0x7f036804ea90 con 0x7f0368199510 2026-03-10T07:57:49.832 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:49.831+0000 7f03657fa700 1 -- 192.168.123.105:0/1584265403 <== mon.0 v2:192.168.123.105:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump", "epoch": 31, "format": "json"}]=0 dumped fsmap epoch 31 v35) v1 ==== 107+0+4395 (secure 0 0 0) 0x7f0358062730 con 0x7f0368199510 2026-03-10T07:57:49.833 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-10T07:57:49.833 INFO:teuthology.orchestra.run.vm05.stdout:{"epoch":31,"btime":"2026-03-10T07:56:07:202840+0000","default_fscid":1,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm 
v2"}},"feature_flags":{"enable_multiple":true,"ever_enabled_multiple":true},"standbys":[{"gid":44255,"name":"cephfs.vm05.omfhnh","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.105:6827/4251520120","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.105:6826","nonce":4251520120},{"type":"v1","addr":"192.168.123.105:6827","nonce":4251520120}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce subvolumes"}},"epoch":16},{"gid":44263,"name":"cephfs.vm08.ybmbgd","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.108:6825/1209244","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.108:6824","nonce":1209244},{"type":"v1","addr":"192.168.123.108:6825","nonce":1209244}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce 
subvolumes"}},"epoch":25}],"filesystems":[{"mdsmap":{"epoch":31,"flags":18,"flags_state":{"joinable":true,"allow_snaps":true,"allow_multimds_snaps":true,"allow_standby_replay":false,"refuse_client_session":false,"refuse_standby_for_another_fs":false,"balance_automate":false},"ever_allowed_features":0,"explicitly_allowed_features":0,"created":"2026-03-10T07:48:24.309293+0000","modified":"2026-03-10T07:56:07.152686+0000","tableserver":0,"root":0,"session_timeout":60,"session_autoclose":300,"required_client_features":{},"max_file_size":1099511627776,"max_xattr_size":65536,"last_failure":0,"last_failure_osd_epoch":84,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce subvolumes"}},"max_mds":1,"in":[0],"up":{"mds_0":34274},"failed":[],"damaged":[],"stopped":[],"info":{"gid_34274":{"gid":34274,"name":"cephfs.vm05.pavqil","rank":0,"incarnation":30,"state":"up:reconnect","state_seq":8,"addr":"192.168.123.105:6829/426813062","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.105:6828","nonce":426813062},{"type":"v1","addr":"192.168.123.105:6829","nonce":426813062}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log 
segments","feature_12":"quiesce subvolumes"}}}},"data_pools":[3],"metadata_pool":2,"enabled":true,"fs_name":"cephfs","balancer":"","bal_rank_mask":"-1","standby_count_wanted":1,"qdb_leader":0,"qdb_cluster":[]},"id":1}]} 2026-03-10T07:57:49.835 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:49.833+0000 7f036e5d1700 1 -- 192.168.123.105:0/1584265403 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f03540779e0 msgr2=0x7f0354079ea0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:57:49.835 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:49.834+0000 7f036e5d1700 1 --2- 192.168.123.105:0/1584265403 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f03540779e0 0x7f0354079ea0 secure :-1 s=READY pgs=110 cs=0 l=1 rev1=1 crypto rx=0x7f0350000c00 tx=0x7f0350005c00 comp rx=0 tx=0).stop 2026-03-10T07:57:49.835 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:49.834+0000 7f036e5d1700 1 -- 192.168.123.105:0/1584265403 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f0368199510 msgr2=0x7f036819d980 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:57:49.835 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:49.834+0000 7f036e5d1700 1 --2- 192.168.123.105:0/1584265403 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f0368199510 0x7f036819d980 secure :-1 s=READY pgs=154 cs=0 l=1 rev1=1 crypto rx=0x7f035800c930 tx=0x7f035800ccf0 comp rx=0 tx=0).stop 2026-03-10T07:57:49.835 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:49.834+0000 7f036e5d1700 1 -- 192.168.123.105:0/1584265403 shutdown_connections 2026-03-10T07:57:49.835 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:49.834+0000 7f036e5d1700 1 --2- 192.168.123.105:0/1584265403 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f0368068df0 0x7f0368198fd0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp 
rx=0 tx=0).stop 2026-03-10T07:57:49.836 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:49.834+0000 7f036e5d1700 1 --2- 192.168.123.105:0/1584265403 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f03540779e0 0x7f0354079ea0 unknown :-1 s=CLOSED pgs=110 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:57:49.836 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:49.835+0000 7f036e5d1700 1 --2- 192.168.123.105:0/1584265403 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f0368199510 0x7f036819d980 unknown :-1 s=CLOSED pgs=154 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:57:49.836 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:49.835+0000 7f036e5d1700 1 -- 192.168.123.105:0/1584265403 >> 192.168.123.105:0/1584265403 conn(0x7f0368075960 msgr2=0x7f03680feab0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:57:49.836 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:49.835+0000 7f036e5d1700 1 -- 192.168.123.105:0/1584265403 shutdown_connections 2026-03-10T07:57:49.836 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:49.835+0000 7f036e5d1700 1 -- 192.168.123.105:0/1584265403 wait complete. 
2026-03-10T07:57:49.837 INFO:teuthology.orchestra.run.vm05.stderr:dumped fsmap epoch 31 2026-03-10T07:57:49.896 DEBUG:teuthology.orchestra.run.vm05:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell --fsid 12e9780e-1c55-11f1-8896-79f7c2e9b508 -- ceph fs dump --format=json 32 2026-03-10T07:57:50.030 INFO:teuthology.orchestra.run.vm05.stderr:Inferring config /var/lib/ceph/12e9780e-1c55-11f1-8896-79f7c2e9b508/mon.vm05/config 2026-03-10T07:57:50.282 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:50.280+0000 7f51d0d2e700 1 -- 192.168.123.105:0/3388061647 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f51cc10f340 msgr2=0x7f51cc10f720 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:57:50.282 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:50.280+0000 7f51d0d2e700 1 --2- 192.168.123.105:0/3388061647 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f51cc10f340 0x7f51cc10f720 secure :-1 s=READY pgs=155 cs=0 l=1 rev1=1 crypto rx=0x7f51bc009b50 tx=0x7f51bc009e60 comp rx=0 tx=0).stop 2026-03-10T07:57:50.282 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:50.281+0000 7f51d0d2e700 1 -- 192.168.123.105:0/3388061647 shutdown_connections 2026-03-10T07:57:50.282 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:50.281+0000 7f51d0d2e700 1 --2- 192.168.123.105:0/3388061647 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f51cc10d0f0 0x7f51cc10d570 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:57:50.282 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:50.281+0000 7f51d0d2e700 1 --2- 192.168.123.105:0/3388061647 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f51cc10f340 0x7f51cc10f720 unknown :-1 s=CLOSED pgs=155 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:57:50.282 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:50.281+0000 7f51d0d2e700 
1 -- 192.168.123.105:0/3388061647 >> 192.168.123.105:0/3388061647 conn(0x7f51cc06ce20 msgr2=0x7f51cc06d230 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:57:50.282 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:50.281+0000 7f51d0d2e700 1 -- 192.168.123.105:0/3388061647 shutdown_connections 2026-03-10T07:57:50.283 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:50.281+0000 7f51d0d2e700 1 -- 192.168.123.105:0/3388061647 wait complete. 2026-03-10T07:57:50.283 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:50.281+0000 7f51d0d2e700 1 Processor -- start 2026-03-10T07:57:50.283 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:50.282+0000 7f51d0d2e700 1 -- start start 2026-03-10T07:57:50.283 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:50.282+0000 7f51d0d2e700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f51cc10d0f0 0x7f51cc11bf30 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:57:50.283 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:50.282+0000 7f51d0d2e700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f51cc10f340 0x7f51cc116f30 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:57:50.283 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:50.282+0000 7f51d0d2e700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f51cc117500 con 0x7f51cc10d0f0 2026-03-10T07:57:50.283 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:50.282+0000 7f51d0d2e700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f51cc117670 con 0x7f51cc10f340 2026-03-10T07:57:50.283 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:50.282+0000 7f51cb7fe700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f51cc10d0f0 0x7f51cc11bf30 unknown :-1 
s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:57:50.283 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:50.282+0000 7f51cb7fe700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f51cc10d0f0 0x7f51cc11bf30 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.105:42966/0 (socket says 192.168.123.105:42966) 2026-03-10T07:57:50.283 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:50.282+0000 7f51cb7fe700 1 -- 192.168.123.105:0/662251269 learned_addr learned my addr 192.168.123.105:0/662251269 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T07:57:50.283 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:50.282+0000 7f51caffd700 1 --2- 192.168.123.105:0/662251269 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f51cc10f340 0x7f51cc116f30 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:57:50.283 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:50.282+0000 7f51cb7fe700 1 -- 192.168.123.105:0/662251269 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f51cc10f340 msgr2=0x7f51cc116f30 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:57:50.283 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:50.282+0000 7f51cb7fe700 1 --2- 192.168.123.105:0/662251269 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f51cc10f340 0x7f51cc116f30 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:57:50.283 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:50.282+0000 7f51cb7fe700 1 -- 192.168.123.105:0/662251269 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- 
mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f51bc0097e0 con 0x7f51cc10d0f0 2026-03-10T07:57:50.283 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:50.282+0000 7f51cb7fe700 1 --2- 192.168.123.105:0/662251269 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f51cc10d0f0 0x7f51cc11bf30 secure :-1 s=READY pgs=156 cs=0 l=1 rev1=1 crypto rx=0x7f51bc005f50 tx=0x7f51bc00b870 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:57:50.284 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:50.283+0000 7f51c8ff9700 1 -- 192.168.123.105:0/662251269 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f51bc01d070 con 0x7f51cc10d0f0 2026-03-10T07:57:50.284 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:50.283+0000 7f51c8ff9700 1 -- 192.168.123.105:0/662251269 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7f51bc00bb40 con 0x7f51cc10d0f0 2026-03-10T07:57:50.284 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:50.283+0000 7f51c8ff9700 1 -- 192.168.123.105:0/662251269 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f51bc00f780 con 0x7f51cc10d0f0 2026-03-10T07:57:50.284 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:50.283+0000 7f51d0d2e700 1 -- 192.168.123.105:0/662251269 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f51cc1178f0 con 0x7f51cc10d0f0 2026-03-10T07:57:50.284 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:50.283+0000 7f51d0d2e700 1 -- 192.168.123.105:0/662251269 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f51cc1b8480 con 0x7f51cc10d0f0 2026-03-10T07:57:50.285 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:50.284+0000 7f51d0d2e700 1 -- 192.168.123.105:0/662251269 --> 
[v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f51cc04f2e0 con 0x7f51cc10d0f0 2026-03-10T07:57:50.288 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:50.287+0000 7f51c8ff9700 1 -- 192.168.123.105:0/662251269 <== mon.0 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 38) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f51bc022a50 con 0x7f51cc10d0f0 2026-03-10T07:57:50.289 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:50.287+0000 7f51c8ff9700 1 --2- 192.168.123.105:0/662251269 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f51b4077910 0x7f51b4079dd0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:57:50.289 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:50.288+0000 7f51c8ff9700 1 -- 192.168.123.105:0/662251269 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(84..84 src has 1..84) v4 ==== 6480+0+0 (secure 0 0 0) 0x7f51bc09bf00 con 0x7f51cc10d0f0 2026-03-10T07:57:50.289 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:50.288+0000 7f51caffd700 1 --2- 192.168.123.105:0/662251269 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f51b4077910 0x7f51b4079dd0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:57:50.289 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:50.288+0000 7f51caffd700 1 --2- 192.168.123.105:0/662251269 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f51b4077910 0x7f51b4079dd0 secure :-1 s=READY pgs=111 cs=0 l=1 rev1=1 crypto rx=0x7f51c0006fd0 tx=0x7f51c0008040 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:57:50.289 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:50.288+0000 7f51c8ff9700 1 -- 192.168.123.105:0/662251269 <== mon.0 
v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f51bc09c310 con 0x7f51cc10d0f0 2026-03-10T07:57:50.407 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:57:50 vm05.local ceph-mon[130117]: pgmap v205: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail 2026-03-10T07:57:50.407 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:57:50 vm05.local ceph-mon[130117]: from='client.? 192.168.123.105:0/898994329' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 30, "format": "json"}]: dispatch 2026-03-10T07:57:50.407 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:57:50 vm05.local ceph-mon[130117]: from='client.? 192.168.123.105:0/1584265403' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 31, "format": "json"}]: dispatch 2026-03-10T07:57:50.418 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:57:50 vm08.local ceph-mon[107898]: pgmap v205: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail 2026-03-10T07:57:50.418 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:57:50 vm08.local ceph-mon[107898]: from='client.? 192.168.123.105:0/898994329' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 30, "format": "json"}]: dispatch 2026-03-10T07:57:50.418 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:57:50 vm08.local ceph-mon[107898]: from='client.? 
192.168.123.105:0/1584265403' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 31, "format": "json"}]: dispatch 2026-03-10T07:57:50.426 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:50.425+0000 7f51d0d2e700 1 -- 192.168.123.105:0/662251269 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "fs dump", "epoch": 32, "format": "json"} v 0) v1 -- 0x7f51cc04ea90 con 0x7f51cc10d0f0 2026-03-10T07:57:50.427 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:50.425+0000 7f51c8ff9700 1 -- 192.168.123.105:0/662251269 <== mon.0 v2:192.168.123.105:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump", "epoch": 32, "format": "json"}]=0 dumped fsmap epoch 32 v35) v1 ==== 107+0+4392 (secure 0 0 0) 0x7f51bc064680 con 0x7f51cc10d0f0 2026-03-10T07:57:50.427 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-10T07:57:50.427 INFO:teuthology.orchestra.run.vm05.stdout:{"epoch":32,"btime":"2026-03-10T07:56:08:210283+0000","default_fscid":1,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"feature_flags":{"enable_multiple":true,"ever_enabled_multiple":true},"standbys":[{"gid":44255,"name":"cephfs.vm05.omfhnh","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.105:6827/4251520120","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.105:6826","nonce":4251520120},{"type":"v1","addr":"192.168.123.105:6827","nonce":4251520120}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate 
object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce subvolumes"}},"epoch":16},{"gid":44263,"name":"cephfs.vm08.ybmbgd","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.108:6825/1209244","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.108:6824","nonce":1209244},{"type":"v1","addr":"192.168.123.108:6825","nonce":1209244}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce subvolumes"}},"epoch":25}],"filesystems":[{"mdsmap":{"epoch":32,"flags":18,"flags_state":{"joinable":true,"allow_snaps":true,"allow_multimds_snaps":true,"allow_standby_replay":false,"refuse_client_session":false,"refuse_standby_for_another_fs":false,"balance_automate":false},"ever_allowed_features":0,"explicitly_allowed_features":0,"created":"2026-03-10T07:48:24.309293+0000","modified":"2026-03-10T07:56:07.209620+0000","tableserver":0,"root":0,"session_timeout":60,"session_autoclose":300,"required_client_features":{},"max_file_size":1099511627776,"max_xattr_size":65536,"last_failure":0,"last_failure_osd_epoch":84,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is 
stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce subvolumes"}},"max_mds":1,"in":[0],"up":{"mds_0":34274},"failed":[],"damaged":[],"stopped":[],"info":{"gid_34274":{"gid":34274,"name":"cephfs.vm05.pavqil","rank":0,"incarnation":30,"state":"up:rejoin","state_seq":9,"addr":"192.168.123.105:6829/426813062","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.105:6828","nonce":426813062},{"type":"v1","addr":"192.168.123.105:6829","nonce":426813062}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce subvolumes"}}}},"data_pools":[3],"metadata_pool":2,"enabled":true,"fs_name":"cephfs","balancer":"","bal_rank_mask":"-1","standby_count_wanted":1,"qdb_leader":0,"qdb_cluster":[]},"id":1}]} 2026-03-10T07:57:50.429 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:50.428+0000 7f51d0d2e700 1 -- 192.168.123.105:0/662251269 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f51b4077910 msgr2=0x7f51b4079dd0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:57:50.429 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:50.428+0000 7f51d0d2e700 1 --2- 192.168.123.105:0/662251269 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f51b4077910 0x7f51b4079dd0 secure :-1 s=READY pgs=111 cs=0 l=1 rev1=1 crypto rx=0x7f51c0006fd0 tx=0x7f51c0008040 comp rx=0 tx=0).stop 
2026-03-10T07:57:50.429 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:50.428+0000 7f51d0d2e700 1 -- 192.168.123.105:0/662251269 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f51cc10d0f0 msgr2=0x7f51cc11bf30 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:57:50.429 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:50.428+0000 7f51d0d2e700 1 --2- 192.168.123.105:0/662251269 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f51cc10d0f0 0x7f51cc11bf30 secure :-1 s=READY pgs=156 cs=0 l=1 rev1=1 crypto rx=0x7f51bc005f50 tx=0x7f51bc00b870 comp rx=0 tx=0).stop 2026-03-10T07:57:50.430 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:50.428+0000 7f51d0d2e700 1 -- 192.168.123.105:0/662251269 shutdown_connections 2026-03-10T07:57:50.430 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:50.428+0000 7f51d0d2e700 1 --2- 192.168.123.105:0/662251269 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f51b4077910 0x7f51b4079dd0 unknown :-1 s=CLOSED pgs=111 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:57:50.430 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:50.428+0000 7f51d0d2e700 1 --2- 192.168.123.105:0/662251269 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f51cc10d0f0 0x7f51cc11bf30 unknown :-1 s=CLOSED pgs=156 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:57:50.430 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:50.428+0000 7f51d0d2e700 1 --2- 192.168.123.105:0/662251269 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f51cc10f340 0x7f51cc116f30 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:57:50.430 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:50.429+0000 7f51d0d2e700 1 -- 192.168.123.105:0/662251269 >> 192.168.123.105:0/662251269 conn(0x7f51cc06ce20 msgr2=0x7f51cc070420 unknown :-1 
s=STATE_NONE l=0).mark_down 2026-03-10T07:57:50.430 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:50.429+0000 7f51d0d2e700 1 -- 192.168.123.105:0/662251269 shutdown_connections 2026-03-10T07:57:50.430 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:50.429+0000 7f51d0d2e700 1 -- 192.168.123.105:0/662251269 wait complete. 2026-03-10T07:57:50.431 INFO:teuthology.orchestra.run.vm05.stderr:dumped fsmap epoch 32 2026-03-10T07:57:50.487 DEBUG:teuthology.orchestra.run.vm05:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell --fsid 12e9780e-1c55-11f1-8896-79f7c2e9b508 -- ceph fs dump --format=json 33 2026-03-10T07:57:50.623 INFO:teuthology.orchestra.run.vm05.stderr:Inferring config /var/lib/ceph/12e9780e-1c55-11f1-8896-79f7c2e9b508/mon.vm05/config 2026-03-10T07:57:50.871 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:50.863+0000 7f649f51b700 1 -- 192.168.123.105:0/2245333515 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f6498073ae0 msgr2=0x7f649810d170 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:57:50.871 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:50.863+0000 7f649f51b700 1 --2- 192.168.123.105:0/2245333515 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f6498073ae0 0x7f649810d170 secure :-1 s=READY pgs=157 cs=0 l=1 rev1=1 crypto rx=0x7f6494009b00 tx=0x7f6494009e10 comp rx=0 tx=0).stop 2026-03-10T07:57:50.871 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:50.863+0000 7f649f51b700 1 -- 192.168.123.105:0/2245333515 shutdown_connections 2026-03-10T07:57:50.871 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:50.863+0000 7f649f51b700 1 --2- 192.168.123.105:0/2245333515 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f6498073ae0 0x7f649810d170 unknown :-1 s=CLOSED pgs=157 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:57:50.871 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:50.863+0000 7f649f51b700 1 --2- 192.168.123.105:0/2245333515 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f64980731c0 0x7f64980735a0 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:57:50.871 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:50.863+0000 7f649f51b700 1 -- 192.168.123.105:0/2245333515 >> 192.168.123.105:0/2245333515 conn(0x7f64980fc920 msgr2=0x7f64980fed40 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:57:50.871 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:50.864+0000 7f649f51b700 1 -- 192.168.123.105:0/2245333515 shutdown_connections 2026-03-10T07:57:50.871 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:50.864+0000 7f649f51b700 1 -- 192.168.123.105:0/2245333515 wait complete. 2026-03-10T07:57:50.871 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:50.865+0000 7f649f51b700 1 Processor -- start 2026-03-10T07:57:50.871 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:50.865+0000 7f649f51b700 1 -- start start 2026-03-10T07:57:50.871 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:50.865+0000 7f649f51b700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f64980731c0 0x7f649810cc90 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:57:50.871 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:50.865+0000 7f649f51b700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f6498073ae0 0x7f6498103be0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:57:50.872 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:50.865+0000 7f649f51b700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f649810d390 con 0x7f64980731c0 2026-03-10T07:57:50.872 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:50.865+0000 7f649f51b700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f649810d500 con 0x7f6498073ae0 2026-03-10T07:57:50.872 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:50.865+0000 7f649d2b7700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f64980731c0 0x7f649810cc90 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:57:50.872 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:50.865+0000 7f649d2b7700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f64980731c0 0x7f649810cc90 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.105:3300/0 says I am v2:192.168.123.105:42982/0 (socket says 192.168.123.105:42982) 2026-03-10T07:57:50.872 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:50.865+0000 7f649d2b7700 1 -- 192.168.123.105:0/641066098 learned_addr learned my addr 192.168.123.105:0/641066098 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T07:57:50.872 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:50.865+0000 7f649cab6700 1 --2- 192.168.123.105:0/641066098 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f6498073ae0 0x7f6498103be0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:57:50.872 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:50.865+0000 7f649d2b7700 1 -- 192.168.123.105:0/641066098 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f6498073ae0 msgr2=0x7f6498103be0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:57:50.872 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:50.865+0000 7f649d2b7700 1 --2- 
192.168.123.105:0/641066098 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f6498073ae0 0x7f6498103be0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:57:50.872 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:50.865+0000 7f649d2b7700 1 -- 192.168.123.105:0/641066098 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f64940097e0 con 0x7f64980731c0 2026-03-10T07:57:50.872 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:50.866+0000 7f649d2b7700 1 --2- 192.168.123.105:0/641066098 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f64980731c0 0x7f649810cc90 secure :-1 s=READY pgs=158 cs=0 l=1 rev1=1 crypto rx=0x7f648800ba70 tx=0x7f648800bd80 comp rx=0 tx=0).ready entity=mon.0 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:57:50.872 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:50.866+0000 7f648e7fc700 1 -- 192.168.123.105:0/641066098 <== mon.0 v2:192.168.123.105:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f648800c700 con 0x7f64980731c0 2026-03-10T07:57:50.872 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:50.866+0000 7f649f51b700 1 -- 192.168.123.105:0/641066098 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f6498104270 con 0x7f64980731c0 2026-03-10T07:57:50.872 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:50.866+0000 7f649f51b700 1 -- 192.168.123.105:0/641066098 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f6498104790 con 0x7f64980731c0 2026-03-10T07:57:50.872 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:50.868+0000 7f649f51b700 1 -- 192.168.123.105:0/641066098 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f649804ea90 con 0x7f64980731c0 
2026-03-10T07:57:50.872 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:50.869+0000 7f648e7fc700 1 -- 192.168.123.105:0/641066098 <== mon.0 v2:192.168.123.105:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7f648800cd40 con 0x7f64980731c0 2026-03-10T07:57:50.872 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:50.869+0000 7f648e7fc700 1 -- 192.168.123.105:0/641066098 <== mon.0 v2:192.168.123.105:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f6488012340 con 0x7f64980731c0 2026-03-10T07:57:50.872 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:50.869+0000 7f648e7fc700 1 -- 192.168.123.105:0/641066098 <== mon.0 v2:192.168.123.105:3300/0 4 ==== mgrmap(e 38) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f6488012580 con 0x7f64980731c0 2026-03-10T07:57:50.872 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:50.870+0000 7f648e7fc700 1 --2- 192.168.123.105:0/641066098 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f6484077a60 0x7f6484079f20 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:57:50.872 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:50.870+0000 7f648e7fc700 1 -- 192.168.123.105:0/641066098 <== mon.0 v2:192.168.123.105:3300/0 5 ==== osd_map(84..84 src has 1..84) v4 ==== 6480+0+0 (secure 0 0 0) 0x7f6488099aa0 con 0x7f64980731c0 2026-03-10T07:57:50.880 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:50.879+0000 7f649cab6700 1 --2- 192.168.123.105:0/641066098 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f6484077a60 0x7f6484079f20 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:57:50.881 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:50.879+0000 7f649cab6700 1 --2- 192.168.123.105:0/641066098 >> 
[v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f6484077a60 0x7f6484079f20 secure :-1 s=READY pgs=112 cs=0 l=1 rev1=1 crypto rx=0x7f64980fe060 tx=0x7f6494005ee0 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:57:50.881 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:50.880+0000 7f648e7fc700 1 -- 192.168.123.105:0/641066098 <== mon.0 v2:192.168.123.105:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f64880623e0 con 0x7f64980731c0 2026-03-10T07:57:51.019 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:51.017+0000 7f649f51b700 1 -- 192.168.123.105:0/641066098 --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_command({"prefix": "fs dump", "epoch": 33, "format": "json"} v 0) v1 -- 0x7f6498105110 con 0x7f64980731c0 2026-03-10T07:57:51.019 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:51.018+0000 7f648e7fc700 1 -- 192.168.123.105:0/641066098 <== mon.0 v2:192.168.123.105:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump", "epoch": 33, "format": "json"}]=0 dumped fsmap epoch 33 v35) v1 ==== 107+0+5253 (secure 0 0 0) 0x7f6488061b30 con 0x7f64980731c0 2026-03-10T07:57:51.020 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-10T07:57:51.020 INFO:teuthology.orchestra.run.vm05.stdout:{"epoch":33,"btime":"2026-03-10T07:56:09:217433+0000","default_fscid":1,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm 
v2"}},"feature_flags":{"enable_multiple":true,"ever_enabled_multiple":true},"standbys":[{"gid":44255,"name":"cephfs.vm05.omfhnh","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.105:6827/4251520120","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.105:6826","nonce":4251520120},{"type":"v1","addr":"192.168.123.105:6827","nonce":4251520120}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce subvolumes"}},"epoch":16},{"gid":44263,"name":"cephfs.vm08.ybmbgd","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.108:6825/1209244","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.108:6824","nonce":1209244},{"type":"v1","addr":"192.168.123.108:6825","nonce":1209244}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce 
subvolumes"}},"epoch":25},{"gid":44289,"name":"cephfs.vm08.dgsaon","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.108:6827/3010391204","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.108:6826","nonce":3010391204},{"type":"v1","addr":"192.168.123.108:6827","nonce":3010391204}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce subvolumes"}},"epoch":33}],"filesystems":[{"mdsmap":{"epoch":33,"flags":18,"flags_state":{"joinable":true,"allow_snaps":true,"allow_multimds_snaps":true,"allow_standby_replay":false,"refuse_client_session":false,"refuse_standby_for_another_fs":false,"balance_automate":false},"ever_allowed_features":0,"explicitly_allowed_features":0,"created":"2026-03-10T07:48:24.309293+0000","modified":"2026-03-10T07:56:09.217432+0000","tableserver":0,"root":0,"session_timeout":60,"session_autoclose":300,"required_client_features":{},"max_file_size":1099511627776,"max_xattr_size":65536,"last_failure":0,"last_failure_osd_epoch":84,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce 
subvolumes"}},"max_mds":1,"in":[0],"up":{"mds_0":34274},"failed":[],"damaged":[],"stopped":[],"info":{"gid_34274":{"gid":34274,"name":"cephfs.vm05.pavqil","rank":0,"incarnation":30,"state":"up:active","state_seq":10,"addr":"192.168.123.105:6829/426813062","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.105:6828","nonce":426813062},{"type":"v1","addr":"192.168.123.105:6829","nonce":426813062}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce subvolumes"}}}},"data_pools":[3],"metadata_pool":2,"enabled":true,"fs_name":"cephfs","balancer":"","bal_rank_mask":"-1","standby_count_wanted":1,"qdb_leader":34274,"qdb_cluster":[34274]},"id":1}]} 2026-03-10T07:57:51.023 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:51.021+0000 7f649f51b700 1 -- 192.168.123.105:0/641066098 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f6484077a60 msgr2=0x7f6484079f20 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:57:51.023 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:51.022+0000 7f649f51b700 1 --2- 192.168.123.105:0/641066098 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f6484077a60 0x7f6484079f20 secure :-1 s=READY pgs=112 cs=0 l=1 rev1=1 crypto rx=0x7f64980fe060 tx=0x7f6494005ee0 comp rx=0 tx=0).stop 2026-03-10T07:57:51.023 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:51.022+0000 7f649f51b700 1 -- 192.168.123.105:0/641066098 >> 
[v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f64980731c0 msgr2=0x7f649810cc90 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:57:51.023 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:51.022+0000 7f649f51b700 1 --2- 192.168.123.105:0/641066098 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f64980731c0 0x7f649810cc90 secure :-1 s=READY pgs=158 cs=0 l=1 rev1=1 crypto rx=0x7f648800ba70 tx=0x7f648800bd80 comp rx=0 tx=0).stop 2026-03-10T07:57:51.023 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:51.022+0000 7f649f51b700 1 -- 192.168.123.105:0/641066098 shutdown_connections 2026-03-10T07:57:51.023 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:51.022+0000 7f649f51b700 1 --2- 192.168.123.105:0/641066098 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f6484077a60 0x7f6484079f20 unknown :-1 s=CLOSED pgs=112 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:57:51.023 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:51.022+0000 7f649f51b700 1 --2- 192.168.123.105:0/641066098 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f64980731c0 0x7f649810cc90 unknown :-1 s=CLOSED pgs=158 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:57:51.024 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:51.022+0000 7f649f51b700 1 --2- 192.168.123.105:0/641066098 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f6498073ae0 0x7f6498103be0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:57:51.024 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:51.023+0000 7f649f51b700 1 -- 192.168.123.105:0/641066098 >> 192.168.123.105:0/641066098 conn(0x7f64980fc920 msgr2=0x7f6498197b20 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:57:51.024 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:51.023+0000 7f649f51b700 1 -- 
192.168.123.105:0/641066098 shutdown_connections 2026-03-10T07:57:51.024 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:51.023+0000 7f649f51b700 1 -- 192.168.123.105:0/641066098 wait complete. 2026-03-10T07:57:51.025 INFO:teuthology.orchestra.run.vm05.stderr:dumped fsmap epoch 33 2026-03-10T07:57:51.088 DEBUG:teuthology.orchestra.run.vm05:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell --fsid 12e9780e-1c55-11f1-8896-79f7c2e9b508 -- ceph fs dump --format=json 34 2026-03-10T07:57:51.153 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:57:51 vm05.local ceph-mon[130117]: from='client.? 192.168.123.105:0/662251269' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 32, "format": "json"}]: dispatch 2026-03-10T07:57:51.153 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:57:51 vm05.local ceph-mon[130117]: from='client.? 192.168.123.105:0/641066098' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 33, "format": "json"}]: dispatch 2026-03-10T07:57:51.233 INFO:teuthology.orchestra.run.vm05.stderr:Inferring config /var/lib/ceph/12e9780e-1c55-11f1-8896-79f7c2e9b508/mon.vm05/config 2026-03-10T07:57:51.418 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:57:51 vm08.local ceph-mon[107898]: from='client.? 192.168.123.105:0/662251269' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 32, "format": "json"}]: dispatch 2026-03-10T07:57:51.418 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:57:51 vm08.local ceph-mon[107898]: from='client.? 
192.168.123.105:0/641066098' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 33, "format": "json"}]: dispatch 2026-03-10T07:57:51.460 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:51.458+0000 7f3164e0d700 1 -- 192.168.123.105:0/3328640058 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f31600739d0 msgr2=0x7f316010d1f0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:57:51.460 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:51.458+0000 7f3164e0d700 1 --2- 192.168.123.105:0/3328640058 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f31600739d0 0x7f316010d1f0 secure :-1 s=READY pgs=159 cs=0 l=1 rev1=1 crypto rx=0x7f3150009b00 tx=0x7f3150009e10 comp rx=0 tx=0).stop 2026-03-10T07:57:51.460 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:51.459+0000 7f3164e0d700 1 -- 192.168.123.105:0/3328640058 shutdown_connections 2026-03-10T07:57:51.460 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:51.459+0000 7f3164e0d700 1 --2- 192.168.123.105:0/3328640058 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f31600739d0 0x7f316010d1f0 unknown :-1 s=CLOSED pgs=159 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:57:51.460 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:51.459+0000 7f3164e0d700 1 --2- 192.168.123.105:0/3328640058 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f31600730b0 0x7f3160073490 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:57:51.460 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:51.459+0000 7f3164e0d700 1 -- 192.168.123.105:0/3328640058 >> 192.168.123.105:0/3328640058 conn(0x7f31600fc920 msgr2=0x7f31600fed40 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:57:51.460 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:51.459+0000 7f3164e0d700 1 -- 192.168.123.105:0/3328640058 shutdown_connections 
2026-03-10T07:57:51.460 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:51.459+0000 7f3164e0d700 1 -- 192.168.123.105:0/3328640058 wait complete. 2026-03-10T07:57:51.460 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:51.459+0000 7f3164e0d700 1 Processor -- start 2026-03-10T07:57:51.461 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:51.459+0000 7f3164e0d700 1 -- start start 2026-03-10T07:57:51.461 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:51.460+0000 7f3164e0d700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f31600730b0 0x7f3160198d30 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:57:51.461 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:51.460+0000 7f3164e0d700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f31600739d0 0x7f3160199270 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:57:51.461 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:51.460+0000 7f3164e0d700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f3160199950 con 0x7f31600730b0 2026-03-10T07:57:51.461 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:51.460+0000 7f3164e0d700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f316019d6e0 con 0x7f31600739d0 2026-03-10T07:57:51.461 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:51.460+0000 7f315dd9b700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f31600739d0 0x7f3160199270 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:57:51.461 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:51.460+0000 7f315dd9b700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f31600739d0 
0x7f3160199270 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.108:3300/0 says I am v2:192.168.123.105:46526/0 (socket says 192.168.123.105:46526) 2026-03-10T07:57:51.461 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:51.460+0000 7f315dd9b700 1 -- 192.168.123.105:0/1811486866 learned_addr learned my addr 192.168.123.105:0/1811486866 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T07:57:51.461 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:51.460+0000 7f315e59c700 1 --2- 192.168.123.105:0/1811486866 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f31600730b0 0x7f3160198d30 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:57:51.461 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:51.460+0000 7f315dd9b700 1 -- 192.168.123.105:0/1811486866 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f31600730b0 msgr2=0x7f3160198d30 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:57:51.462 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:51.461+0000 7f315dd9b700 1 --2- 192.168.123.105:0/1811486866 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f31600730b0 0x7f3160198d30 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:57:51.462 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:51.461+0000 7f315dd9b700 1 -- 192.168.123.105:0/1811486866 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f31500097e0 con 0x7f31600739d0 2026-03-10T07:57:51.462 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:51.461+0000 7f315e59c700 1 --2- 192.168.123.105:0/1811486866 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f31600730b0 0x7f3160198d30 unknown :-1 s=CLOSED 
pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_auth_done state changed! 2026-03-10T07:57:51.462 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:51.461+0000 7f315dd9b700 1 --2- 192.168.123.105:0/1811486866 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f31600739d0 0x7f3160199270 secure :-1 s=READY pgs=71 cs=0 l=1 rev1=1 crypto rx=0x7f31500094d0 tx=0x7f3150004930 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:57:51.462 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:51.461+0000 7f31577fe700 1 -- 192.168.123.105:0/1811486866 <== mon.1 v2:192.168.123.108:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f315001d070 con 0x7f31600739d0 2026-03-10T07:57:51.462 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:51.461+0000 7f31577fe700 1 -- 192.168.123.105:0/1811486866 <== mon.1 v2:192.168.123.108:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7f3150022470 con 0x7f31600739d0 2026-03-10T07:57:51.462 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:51.461+0000 7f3164e0d700 1 -- 192.168.123.105:0/1811486866 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f316019d960 con 0x7f31600739d0 2026-03-10T07:57:51.463 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:51.461+0000 7f31577fe700 1 -- 192.168.123.105:0/1811486866 <== mon.1 v2:192.168.123.108:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f315000f6e0 con 0x7f31600739d0 2026-03-10T07:57:51.463 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:51.461+0000 7f3164e0d700 1 -- 192.168.123.105:0/1811486866 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f316019de50 con 0x7f31600739d0 2026-03-10T07:57:51.463 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:51.462+0000 7f3164e0d700 1 -- 192.168.123.105:0/1811486866 --> 
[v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f316010a8f0 con 0x7f31600739d0 2026-03-10T07:57:51.465 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:51.464+0000 7f31577fe700 1 -- 192.168.123.105:0/1811486866 <== mon.1 v2:192.168.123.108:3300/0 4 ==== mgrmap(e 38) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f31500225e0 con 0x7f31600739d0 2026-03-10T07:57:51.465 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:51.464+0000 7f31577fe700 1 --2- 192.168.123.105:0/1811486866 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f314c077870 0x7f314c079d30 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:57:51.465 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:51.464+0000 7f31577fe700 1 -- 192.168.123.105:0/1811486866 <== mon.1 v2:192.168.123.108:3300/0 5 ==== osd_map(84..84 src has 1..84) v4 ==== 6480+0+0 (secure 0 0 0) 0x7f3150068ac0 con 0x7f31600739d0 2026-03-10T07:57:51.466 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:51.464+0000 7f315e59c700 1 --2- 192.168.123.105:0/1811486866 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f314c077870 0x7f314c079d30 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:57:51.466 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:51.465+0000 7f315e59c700 1 --2- 192.168.123.105:0/1811486866 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f314c077870 0x7f314c079d30 secure :-1 s=READY pgs=113 cs=0 l=1 rev1=1 crypto rx=0x7f3148005950 tx=0x7f314800a300 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:57:51.467 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:51.466+0000 7f31577fe700 1 -- 192.168.123.105:0/1811486866 <== 
mon.1 v2:192.168.123.108:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f31500644e0 con 0x7f31600739d0 2026-03-10T07:57:51.602 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:51.601+0000 7f3164e0d700 1 -- 192.168.123.105:0/1811486866 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "fs dump", "epoch": 34, "format": "json"} v 0) v1 -- 0x7f316004ea90 con 0x7f31600739d0 2026-03-10T07:57:51.604 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:51.602+0000 7f31577fe700 1 -- 192.168.123.105:0/1811486866 <== mon.1 v2:192.168.123.108:3300/0 7 ==== mon_command_ack([{"prefix": "fs dump", "epoch": 34, "format": "json"}]=0 dumped fsmap epoch 34 v35) v1 ==== 107+0+5253 (secure 0 0 0) 0x7f3150063c30 con 0x7f31600739d0 2026-03-10T07:57:51.604 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-10T07:57:51.604 INFO:teuthology.orchestra.run.vm05.stdout:{"epoch":34,"btime":"2026-03-10T07:56:12:299655+0000","default_fscid":1,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2"}},"feature_flags":{"enable_multiple":true,"ever_enabled_multiple":true},"standbys":[{"gid":44255,"name":"cephfs.vm05.omfhnh","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.105:6827/4251520120","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.105:6826","nonce":4251520120},{"type":"v1","addr":"192.168.123.105:6827","nonce":4251520120}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file 
layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce subvolumes"}},"epoch":16},{"gid":44263,"name":"cephfs.vm08.ybmbgd","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.108:6825/1209244","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.108:6824","nonce":1209244},{"type":"v1","addr":"192.168.123.108:6825","nonce":1209244}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce subvolumes"}},"epoch":25},{"gid":44289,"name":"cephfs.vm08.dgsaon","rank":-1,"incarnation":0,"state":"up:standby","state_seq":1,"addr":"192.168.123.108:6827/3010391204","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.108:6826","nonce":3010391204},{"type":"v1","addr":"192.168.123.108:6827","nonce":3010391204}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log 
segments","feature_12":"quiesce subvolumes"}},"epoch":33}],"filesystems":[{"mdsmap":{"epoch":34,"flags":18,"flags_state":{"joinable":true,"allow_snaps":true,"allow_multimds_snaps":true,"allow_standby_replay":false,"refuse_client_session":false,"refuse_standby_for_another_fs":false,"balance_automate":false},"ever_allowed_features":0,"explicitly_allowed_features":0,"created":"2026-03-10T07:48:24.309293+0000","modified":"2026-03-10T07:56:11.301370+0000","tableserver":0,"root":0,"session_timeout":60,"session_autoclose":300,"required_client_features":{},"max_file_size":1099511627776,"max_xattr_size":65536,"last_failure":0,"last_failure_osd_epoch":84,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm v2","feature_11":"minor log segments","feature_12":"quiesce subvolumes"}},"max_mds":1,"in":[0],"up":{"mds_0":34274},"failed":[],"damaged":[],"stopped":[],"info":{"gid_34274":{"gid":34274,"name":"cephfs.vm05.pavqil","rank":0,"incarnation":30,"state":"up:active","state_seq":10,"addr":"192.168.123.105:6829/426813062","addrs":{"addrvec":[{"type":"v2","addr":"192.168.123.105:6828","nonce":426813062},{"type":"v1","addr":"192.168.123.105:6829","nonce":426813062}]},"join_fscid":1,"export_targets":[],"features":4540701547738038271,"flags":0,"compat":{"compat":{},"ro_compat":{},"incompat":{"feature_1":"base v0.20","feature_2":"client writeable ranges","feature_3":"default file layouts on dirs","feature_4":"dir inode in separate object","feature_5":"mds uses versioned encoding","feature_6":"dirfrag is stored in omap","feature_7":"mds uses inline data","feature_8":"no anchor table","feature_9":"file layout v2","feature_10":"snaprealm 
v2","feature_11":"minor log segments","feature_12":"quiesce subvolumes"}}}},"data_pools":[3],"metadata_pool":2,"enabled":true,"fs_name":"cephfs","balancer":"","bal_rank_mask":"-1","standby_count_wanted":1,"qdb_leader":34274,"qdb_cluster":[34274]},"id":1}]} 2026-03-10T07:57:51.606 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:51.605+0000 7f3164e0d700 1 -- 192.168.123.105:0/1811486866 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f314c077870 msgr2=0x7f314c079d30 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:57:51.606 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:51.605+0000 7f3164e0d700 1 --2- 192.168.123.105:0/1811486866 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f314c077870 0x7f314c079d30 secure :-1 s=READY pgs=113 cs=0 l=1 rev1=1 crypto rx=0x7f3148005950 tx=0x7f314800a300 comp rx=0 tx=0).stop 2026-03-10T07:57:51.606 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:51.605+0000 7f3164e0d700 1 -- 192.168.123.105:0/1811486866 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f31600739d0 msgr2=0x7f3160199270 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:57:51.606 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:51.605+0000 7f3164e0d700 1 --2- 192.168.123.105:0/1811486866 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f31600739d0 0x7f3160199270 secure :-1 s=READY pgs=71 cs=0 l=1 rev1=1 crypto rx=0x7f31500094d0 tx=0x7f3150004930 comp rx=0 tx=0).stop 2026-03-10T07:57:51.607 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:51.606+0000 7f3164e0d700 1 -- 192.168.123.105:0/1811486866 shutdown_connections 2026-03-10T07:57:51.607 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:51.606+0000 7f3164e0d700 1 --2- 192.168.123.105:0/1811486866 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f314c077870 0x7f314c079d30 unknown :-1 
s=CLOSED pgs=113 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:57:51.607 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:51.606+0000 7f3164e0d700 1 --2- 192.168.123.105:0/1811486866 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f31600730b0 0x7f3160198d30 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:57:51.607 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:51.606+0000 7f3164e0d700 1 --2- 192.168.123.105:0/1811486866 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f31600739d0 0x7f3160199270 unknown :-1 s=CLOSED pgs=71 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:57:51.607 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:51.606+0000 7f3164e0d700 1 -- 192.168.123.105:0/1811486866 >> 192.168.123.105:0/1811486866 conn(0x7f31600fc920 msgr2=0x7f3160107a30 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:57:51.607 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:51.606+0000 7f3164e0d700 1 -- 192.168.123.105:0/1811486866 shutdown_connections 2026-03-10T07:57:51.607 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:51.606+0000 7f3164e0d700 1 -- 192.168.123.105:0/1811486866 wait complete. 2026-03-10T07:57:51.608 INFO:teuthology.orchestra.run.vm05.stderr:dumped fsmap epoch 34 2026-03-10T07:57:51.667 DEBUG:teuthology.run_tasks:Unwinding manager ceph-fuse 2026-03-10T07:57:51.669 INFO:tasks.ceph_fuse:Unmounting ceph-fuse clients... 
2026-03-10T07:57:51.669 DEBUG:teuthology.orchestra.run.vm05:> set -ex 2026-03-10T07:57:51.669 DEBUG:teuthology.orchestra.run.vm05:> dd if=/proc/self/mounts of=/dev/stdout 2026-03-10T07:57:51.685 DEBUG:teuthology.orchestra.run.vm05:> set -ex 2026-03-10T07:57:51.685 DEBUG:teuthology.orchestra.run.vm05:> dd if=/proc/self/mounts of=/dev/stdout 2026-03-10T07:57:51.739 DEBUG:teuthology.orchestra.run.vm05:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell --fsid 12e9780e-1c55-11f1-8896-79f7c2e9b508 -- ceph osd blocklist ls 2026-03-10T07:57:51.910 INFO:teuthology.orchestra.run.vm05.stderr:Inferring config /var/lib/ceph/12e9780e-1c55-11f1-8896-79f7c2e9b508/mon.vm05/config 2026-03-10T07:57:52.176 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:52.174+0000 7f185bc66700 1 -- 192.168.123.105:0/3437911690 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f1854103cf0 msgr2=0x7f1854107d40 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:57:52.176 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:52.174+0000 7f185bc66700 1 --2- 192.168.123.105:0/3437911690 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f1854103cf0 0x7f1854107d40 secure :-1 s=READY pgs=160 cs=0 l=1 rev1=1 crypto rx=0x7f1850009b50 tx=0x7f1850009e60 comp rx=0 tx=0).stop 2026-03-10T07:57:52.176 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:52.175+0000 7f185bc66700 1 -- 192.168.123.105:0/3437911690 shutdown_connections 2026-03-10T07:57:52.176 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:52.175+0000 7f185bc66700 1 --2- 192.168.123.105:0/3437911690 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f1854103cf0 0x7f1854107d40 unknown :-1 s=CLOSED pgs=160 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:57:52.176 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:52.175+0000 7f185bc66700 1 --2- 192.168.123.105:0/3437911690 >> 
[v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f1854103340 0x7f1854103720 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:57:52.176 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:52.175+0000 7f185bc66700 1 -- 192.168.123.105:0/3437911690 >> 192.168.123.105:0/3437911690 conn(0x7f18540feb90 msgr2=0x7f1854100fb0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:57:52.176 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:52.175+0000 7f185bc66700 1 -- 192.168.123.105:0/3437911690 shutdown_connections 2026-03-10T07:57:52.176 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:52.175+0000 7f185bc66700 1 -- 192.168.123.105:0/3437911690 wait complete. 2026-03-10T07:57:52.177 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:52.176+0000 7f185bc66700 1 Processor -- start 2026-03-10T07:57:52.177 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:52.176+0000 7f185bc66700 1 -- start start 2026-03-10T07:57:52.177 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:52.176+0000 7f185bc66700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f1854103340 0x7f1854198de0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:57:52.177 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:52.176+0000 7f185bc66700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f1854103cf0 0x7f1854199320 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:57:52.177 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:52.176+0000 7f185bc66700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f1854199a00 con 0x7f1854103cf0 2026-03-10T07:57:52.177 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:52.176+0000 7f185bc66700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap 
magic: 0 v1 -- 0x7f185419d790 con 0x7f1854103340 2026-03-10T07:57:52.178 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:52.176+0000 7f1859a02700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f1854103340 0x7f1854198de0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:57:52.178 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:52.176+0000 7f1859a02700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f1854103340 0x7f1854198de0 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.108:3300/0 says I am v2:192.168.123.105:46556/0 (socket says 192.168.123.105:46556) 2026-03-10T07:57:52.178 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:52.176+0000 7f1859a02700 1 -- 192.168.123.105:0/3627766791 learned_addr learned my addr 192.168.123.105:0/3627766791 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T07:57:52.178 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:52.177+0000 7f1859a02700 1 -- 192.168.123.105:0/3627766791 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f1854103cf0 msgr2=0x7f1854199320 unknown :-1 s=STATE_CONNECTING l=1).mark_down 2026-03-10T07:57:52.178 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:52.177+0000 7f1859201700 1 --2- 192.168.123.105:0/3627766791 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f1854103cf0 0x7f1854199320 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:57:52.178 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:52.177+0000 7f1859a02700 1 --2- 192.168.123.105:0/3627766791 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f1854103cf0 0x7f1854199320 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 
crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:57:52.178 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:52.177+0000 7f1859a02700 1 -- 192.168.123.105:0/3627766791 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f18500097e0 con 0x7f1854103340 2026-03-10T07:57:52.178 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:52.177+0000 7f1859201700 1 --2- 192.168.123.105:0/3627766791 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f1854103cf0 0x7f1854199320 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).send_auth_request state changed! 2026-03-10T07:57:52.178 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:52.177+0000 7f1859a02700 1 --2- 192.168.123.105:0/3627766791 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f1854103340 0x7f1854198de0 secure :-1 s=READY pgs=72 cs=0 l=1 rev1=1 crypto rx=0x7f184400d8d0 tx=0x7f184400dc90 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:57:52.178 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:52.177+0000 7f184affd700 1 -- 192.168.123.105:0/3627766791 <== mon.1 v2:192.168.123.108:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f1844009940 con 0x7f1854103340 2026-03-10T07:57:52.179 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:52.177+0000 7f184affd700 1 -- 192.168.123.105:0/3627766791 <== mon.1 v2:192.168.123.108:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7f1844010460 con 0x7f1854103340 2026-03-10T07:57:52.179 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:52.177+0000 7f185bc66700 1 -- 192.168.123.105:0/3627766791 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f185419da70 con 0x7f1854103340 2026-03-10T07:57:52.179 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:52.177+0000 7f184affd700 1 -- 
192.168.123.105:0/3627766791 <== mon.1 v2:192.168.123.108:3300/0 3 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f184400f5d0 con 0x7f1854103340 2026-03-10T07:57:52.179 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:52.177+0000 7f185bc66700 1 -- 192.168.123.105:0/3627766791 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f185419dfc0 con 0x7f1854103340 2026-03-10T07:57:52.180 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:52.179+0000 7f185bc66700 1 -- 192.168.123.105:0/3627766791 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f185410b740 con 0x7f1854103340 2026-03-10T07:57:52.180 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:52.179+0000 7f184affd700 1 -- 192.168.123.105:0/3627766791 <== mon.1 v2:192.168.123.108:3300/0 4 ==== mgrmap(e 38) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f1844009aa0 con 0x7f1854103340 2026-03-10T07:57:52.180 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:52.179+0000 7f184affd700 1 --2- 192.168.123.105:0/3627766791 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f18400778e0 0x7f1840079da0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:57:52.180 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:52.179+0000 7f1859201700 1 --2- 192.168.123.105:0/3627766791 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f18400778e0 0x7f1840079da0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:57:52.180 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:52.179+0000 7f184affd700 1 -- 192.168.123.105:0/3627766791 <== mon.1 v2:192.168.123.108:3300/0 5 ==== osd_map(84..84 src has 1..84) v4 ==== 6480+0+0 (secure 0 0 0) 0x7f1844099ab0 con 0x7f1854103340 
2026-03-10T07:57:52.181 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:52.180+0000 7f1859201700 1 --2- 192.168.123.105:0/3627766791 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f18400778e0 0x7f1840079da0 secure :-1 s=READY pgs=114 cs=0 l=1 rev1=1 crypto rx=0x7f185000b5c0 tx=0x7f1850005fb0 comp rx=0 tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:57:52.184 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:52.182+0000 7f184affd700 1 -- 192.168.123.105:0/3627766791 <== mon.1 v2:192.168.123.108:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f1844061ae0 con 0x7f1854103340 2026-03-10T07:57:52.186 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:57:52 vm05.local ceph-mon[130117]: pgmap v206: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail 2026-03-10T07:57:52.186 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:57:52 vm05.local ceph-mon[130117]: from='client.? 
192.168.123.105:0/1811486866' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 34, "format": "json"}]: dispatch 2026-03-10T07:57:52.301 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:52.300+0000 7f185bc66700 1 -- 192.168.123.105:0/3627766791 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "osd blocklist ls"} v 0) v1 -- 0x7f185404ea90 con 0x7f1854103340 2026-03-10T07:57:52.302 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:52.301+0000 7f184affd700 1 -- 192.168.123.105:0/3627766791 <== mon.1 v2:192.168.123.108:3300/0 7 ==== mon_command_ack([{"prefix": "osd blocklist ls"}]=0 listed 39 entries v84) v1 ==== 81+0+2399 (secure 0 0 0) 0x7f1844020020 con 0x7f1854103340 2026-03-10T07:57:52.303 INFO:teuthology.orchestra.run.vm05.stdout:192.168.123.108:6824/3815585988 2026-03-11T07:55:45.162439+0000 2026-03-10T07:57:52.303 INFO:teuthology.orchestra.run.vm05.stdout:192.168.123.105:6828/427555544 2026-03-11T07:55:37.451544+0000 2026-03-10T07:57:52.303 INFO:teuthology.orchestra.run.vm05.stdout:192.168.123.108:6826/2963085185 2026-03-11T07:56:03.317867+0000 2026-03-10T07:57:52.303 INFO:teuthology.orchestra.run.vm05.stdout:192.168.123.105:0/1119586557 2026-03-11T07:52:26.626203+0000 2026-03-10T07:57:52.303 INFO:teuthology.orchestra.run.vm05.stdout:192.168.123.108:0/3218523886 2026-03-11T07:50:25.972395+0000 2026-03-10T07:57:52.303 INFO:teuthology.orchestra.run.vm05.stdout:192.168.123.105:6826/723078808 2026-03-11T07:55:27.416986+0000 2026-03-10T07:57:52.303 INFO:teuthology.orchestra.run.vm05.stdout:192.168.123.108:0/1436092746 2026-03-11T07:50:25.972395+0000 2026-03-10T07:57:52.303 INFO:teuthology.orchestra.run.vm05.stdout:192.168.123.108:6827/2963085185 2026-03-11T07:56:03.317867+0000 2026-03-10T07:57:52.303 INFO:teuthology.orchestra.run.vm05.stdout:192.168.123.105:0/3548984633 2026-03-11T07:45:55.496146+0000 2026-03-10T07:57:52.303 INFO:teuthology.orchestra.run.vm05.stdout:192.168.123.105:0/1043689565 
2026-03-11T07:52:26.626203+0000 2026-03-10T07:57:52.303 INFO:teuthology.orchestra.run.vm05.stdout:192.168.123.108:6828/2231834414 2026-03-11T07:50:25.972395+0000 2026-03-10T07:57:52.303 INFO:teuthology.orchestra.run.vm05.stdout:192.168.123.108:0/3163881539 2026-03-11T07:50:25.972395+0000 2026-03-10T07:57:52.303 INFO:teuthology.orchestra.run.vm05.stdout:192.168.123.105:0/3448907780 2026-03-11T07:50:08.386062+0000 2026-03-10T07:57:52.303 INFO:teuthology.orchestra.run.vm05.stdout:192.168.123.105:6800/2 2026-03-11T07:45:55.496146+0000 2026-03-10T07:57:52.303 INFO:teuthology.orchestra.run.vm05.stdout:192.168.123.108:0/4231951380 2026-03-11T07:50:25.972395+0000 2026-03-10T07:57:52.303 INFO:teuthology.orchestra.run.vm05.stdout:192.168.123.105:0/1261368764 2026-03-11T07:46:09.521991+0000 2026-03-10T07:57:52.303 INFO:teuthology.orchestra.run.vm05.stdout:192.168.123.108:0/3743050524 2026-03-11T07:50:25.972395+0000 2026-03-10T07:57:52.303 INFO:teuthology.orchestra.run.vm05.stdout:192.168.123.105:0/1529845979 2026-03-11T07:52:26.626203+0000 2026-03-10T07:57:52.303 INFO:teuthology.orchestra.run.vm05.stdout:192.168.123.105:0/4259283135 2026-03-11T07:46:43.266824+0000 2026-03-10T07:57:52.303 INFO:teuthology.orchestra.run.vm05.stdout:192.168.123.105:0/2921844242 2026-03-11T07:52:26.626203+0000 2026-03-10T07:57:52.303 INFO:teuthology.orchestra.run.vm05.stdout:192.168.123.105:0/3200565544 2026-03-11T07:50:08.386062+0000 2026-03-10T07:57:52.304 INFO:teuthology.orchestra.run.vm05.stdout:192.168.123.108:0/2575586894 2026-03-11T07:50:25.972395+0000 2026-03-10T07:57:52.304 INFO:teuthology.orchestra.run.vm05.stdout:192.168.123.105:6800/413688438 2026-03-11T07:52:26.626203+0000 2026-03-10T07:57:52.304 INFO:teuthology.orchestra.run.vm05.stdout:192.168.123.108:6829/2231834414 2026-03-11T07:50:25.972395+0000 2026-03-10T07:57:52.304 INFO:teuthology.orchestra.run.vm05.stdout:192.168.123.105:6801/2 2026-03-11T07:45:55.496146+0000 2026-03-10T07:57:52.304 
INFO:teuthology.orchestra.run.vm05.stdout:192.168.123.105:0/4118350925 2026-03-11T07:45:55.496146+0000 2026-03-10T07:57:52.304 INFO:teuthology.orchestra.run.vm05.stdout:192.168.123.105:0/597549933 2026-03-11T07:46:09.521991+0000 2026-03-10T07:57:52.304 INFO:teuthology.orchestra.run.vm05.stdout:192.168.123.105:0/2233014865 2026-03-11T07:50:08.386062+0000 2026-03-10T07:57:52.304 INFO:teuthology.orchestra.run.vm05.stdout:192.168.123.108:6824/3692157290 2026-03-11T07:48:30.401131+0000 2026-03-10T07:57:52.304 INFO:teuthology.orchestra.run.vm05.stdout:192.168.123.105:0/77887181 2026-03-11T07:46:43.266824+0000 2026-03-10T07:57:52.304 INFO:teuthology.orchestra.run.vm05.stdout:192.168.123.108:6825/3692157290 2026-03-11T07:48:30.401131+0000 2026-03-10T07:57:52.304 INFO:teuthology.orchestra.run.vm05.stdout:192.168.123.108:6825/3815585988 2026-03-11T07:55:45.162439+0000 2026-03-10T07:57:52.304 INFO:teuthology.orchestra.run.vm05.stdout:192.168.123.105:6801/413688438 2026-03-11T07:52:26.626203+0000 2026-03-10T07:57:52.304 INFO:teuthology.orchestra.run.vm05.stdout:192.168.123.105:6829/427555544 2026-03-11T07:55:37.451544+0000 2026-03-10T07:57:52.304 INFO:teuthology.orchestra.run.vm05.stdout:192.168.123.105:6827/723078808 2026-03-11T07:55:27.416986+0000 2026-03-10T07:57:52.304 INFO:teuthology.orchestra.run.vm05.stdout:192.168.123.105:0/2454546307 2026-03-11T07:45:55.496146+0000 2026-03-10T07:57:52.304 INFO:teuthology.orchestra.run.vm05.stdout:192.168.123.105:0/2989100693 2026-03-11T07:46:09.521991+0000 2026-03-10T07:57:52.304 INFO:teuthology.orchestra.run.vm05.stdout:192.168.123.105:0/1998618536 2026-03-11T07:46:43.266824+0000 2026-03-10T07:57:52.304 INFO:teuthology.orchestra.run.vm05.stdout:192.168.123.105:0/177716193 2026-03-11T07:50:08.386062+0000 2026-03-10T07:57:52.304 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:52.303+0000 7f185bc66700 1 -- 192.168.123.105:0/3627766791 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f18400778e0 
msgr2=0x7f1840079da0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:57:52.304 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:52.303+0000 7f185bc66700 1 --2- 192.168.123.105:0/3627766791 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f18400778e0 0x7f1840079da0 secure :-1 s=READY pgs=114 cs=0 l=1 rev1=1 crypto rx=0x7f185000b5c0 tx=0x7f1850005fb0 comp rx=0 tx=0).stop 2026-03-10T07:57:52.304 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:52.303+0000 7f185bc66700 1 -- 192.168.123.105:0/3627766791 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f1854103340 msgr2=0x7f1854198de0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:57:52.304 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:52.303+0000 7f185bc66700 1 --2- 192.168.123.105:0/3627766791 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f1854103340 0x7f1854198de0 secure :-1 s=READY pgs=72 cs=0 l=1 rev1=1 crypto rx=0x7f184400d8d0 tx=0x7f184400dc90 comp rx=0 tx=0).stop 2026-03-10T07:57:52.305 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:52.303+0000 7f185bc66700 1 -- 192.168.123.105:0/3627766791 shutdown_connections 2026-03-10T07:57:52.305 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:52.304+0000 7f185bc66700 1 --2- 192.168.123.105:0/3627766791 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f1854103340 0x7f1854198de0 unknown :-1 s=CLOSED pgs=72 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:57:52.305 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:52.304+0000 7f185bc66700 1 --2- 192.168.123.105:0/3627766791 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f18400778e0 0x7f1840079da0 unknown :-1 s=CLOSED pgs=114 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:57:52.305 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:52.304+0000 7f185bc66700 
1 --2- 192.168.123.105:0/3627766791 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f1854103cf0 0x7f1854199320 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:57:52.305 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:52.304+0000 7f185bc66700 1 -- 192.168.123.105:0/3627766791 >> 192.168.123.105:0/3627766791 conn(0x7f18540feb90 msgr2=0x7f1854100f40 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:57:52.305 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:52.304+0000 7f185bc66700 1 -- 192.168.123.105:0/3627766791 shutdown_connections 2026-03-10T07:57:52.305 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:52.304+0000 7f185bc66700 1 -- 192.168.123.105:0/3627766791 wait complete. 2026-03-10T07:57:52.306 INFO:teuthology.orchestra.run.vm05.stderr:listed 39 entries 2026-03-10T07:57:52.364 DEBUG:teuthology.orchestra.run.vm05:> set -ex 2026-03-10T07:57:52.364 DEBUG:teuthology.orchestra.run.vm05:> dd if=/proc/self/mounts of=/dev/stdout 2026-03-10T07:57:52.378 DEBUG:teuthology.orchestra.run.vm05:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell --fsid 12e9780e-1c55-11f1-8896-79f7c2e9b508 -- ceph osd blocklist ls 2026-03-10T07:57:52.418 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:57:52 vm08.local ceph-mon[107898]: pgmap v206: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail 2026-03-10T07:57:52.418 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:57:52 vm08.local ceph-mon[107898]: from='client.? 
192.168.123.105:0/1811486866' entity='client.admin' cmd=[{"prefix": "fs dump", "epoch": 34, "format": "json"}]: dispatch 2026-03-10T07:57:52.552 INFO:teuthology.orchestra.run.vm05.stderr:Inferring config /var/lib/ceph/12e9780e-1c55-11f1-8896-79f7c2e9b508/mon.vm05/config 2026-03-10T07:57:52.771 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:52.770+0000 7f3634d13700 1 -- 192.168.123.105:0/741474018 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3630102db0 msgr2=0x7f3630103190 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:57:52.772 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:52.770+0000 7f3634d13700 1 --2- 192.168.123.105:0/741474018 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3630102db0 0x7f3630103190 secure :-1 s=READY pgs=161 cs=0 l=1 rev1=1 crypto rx=0x7f3618009b50 tx=0x7f3618009e60 comp rx=0 tx=0).stop 2026-03-10T07:57:52.772 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:52.771+0000 7f3634d13700 1 -- 192.168.123.105:0/741474018 shutdown_connections 2026-03-10T07:57:52.772 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:52.771+0000 7f3634d13700 1 --2- 192.168.123.105:0/741474018 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f3630069180 0x7f3630069600 unknown :-1 s=CLOSED pgs=0 cs=0 l=0 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:57:52.772 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:52.771+0000 7f3634d13700 1 --2- 192.168.123.105:0/741474018 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3630102db0 0x7f3630103190 unknown :-1 s=CLOSED pgs=161 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:57:52.772 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:52.771+0000 7f3634d13700 1 -- 192.168.123.105:0/741474018 >> 192.168.123.105:0/741474018 conn(0x7f3630076b70 msgr2=0x7f3630076f80 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:57:52.772 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:52.771+0000 7f3634d13700 1 -- 192.168.123.105:0/741474018 shutdown_connections 2026-03-10T07:57:52.772 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:52.771+0000 7f3634d13700 1 -- 192.168.123.105:0/741474018 wait complete. 2026-03-10T07:57:52.772 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:52.771+0000 7f3634d13700 1 Processor -- start 2026-03-10T07:57:52.772 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:52.771+0000 7f3634d13700 1 -- start start 2026-03-10T07:57:52.773 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:52.772+0000 7f3634d13700 1 --2- >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3630069180 0x7f363019aad0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:57:52.773 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:52.772+0000 7f3634d13700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f363019b010 0x7f3630194b50 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:57:52.773 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:52.772+0000 7f3634d13700 1 -- --> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f363019b5e0 con 0x7f3630069180 2026-03-10T07:57:52.773 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:52.772+0000 7f3634d13700 1 -- --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_getmap magic: 0 v1 -- 0x7f3630195090 con 0x7f363019b010 2026-03-10T07:57:52.773 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:52.772+0000 7f362dd9b700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f363019b010 0x7f3630194b50 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:57:52.773 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:52.772+0000 7f362dd9b700 1 --2- >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f363019b010 0x7f3630194b50 unknown :-1 s=HELLO_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_hello peer v2:192.168.123.108:3300/0 says I am v2:192.168.123.105:46572/0 (socket says 192.168.123.105:46572) 2026-03-10T07:57:52.773 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:52.772+0000 7f362dd9b700 1 -- 192.168.123.105:0/1836288920 learned_addr learned my addr 192.168.123.105:0/1836288920 (peer_addr_for_me v2:192.168.123.105:0/0) 2026-03-10T07:57:52.773 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:52.772+0000 7f362e59c700 1 --2- 192.168.123.105:0/1836288920 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3630069180 0x7f363019aad0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:57:52.773 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:52.772+0000 7f362dd9b700 1 -- 192.168.123.105:0/1836288920 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3630069180 msgr2=0x7f363019aad0 unknown :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:57:52.773 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:52.772+0000 7f362dd9b700 1 --2- 192.168.123.105:0/1836288920 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3630069180 0x7f363019aad0 unknown :-1 s=AUTH_CONNECTING pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:57:52.773 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:52.772+0000 7f362dd9b700 1 -- 192.168.123.105:0/1836288920 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({config=0+,monmap=0+}) v3 -- 0x7f36180097e0 con 0x7f363019b010 2026-03-10T07:57:52.773 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:52.772+0000 7f362e59c700 1 --2- 192.168.123.105:0/1836288920 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3630069180 0x7f363019aad0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).handle_auth_reply_more state changed! 2026-03-10T07:57:52.774 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:52.773+0000 7f362dd9b700 1 --2- 192.168.123.105:0/1836288920 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f363019b010 0x7f3630194b50 secure :-1 s=READY pgs=73 cs=0 l=1 rev1=1 crypto rx=0x7f362000eb10 tx=0x7f362000eed0 comp rx=0 tx=0).ready entity=mon.1 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:57:52.774 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:52.773+0000 7f36277fe700 1 -- 192.168.123.105:0/1836288920 <== mon.1 v2:192.168.123.108:3300/0 1 ==== mon_map magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f362000cca0 con 0x7f363019b010 2026-03-10T07:57:52.774 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:52.773+0000 7f3634d13700 1 -- 192.168.123.105:0/1836288920 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({mgrmap=0+}) v3 -- 0x7f3630195370 con 0x7f363019b010 2026-03-10T07:57:52.774 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:52.773+0000 7f3634d13700 1 -- 192.168.123.105:0/1836288920 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_subscribe({osdmap=0}) v3 -- 0x7f36301958c0 con 0x7f363019b010 2026-03-10T07:57:52.775 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:52.773+0000 7f36277fe700 1 -- 192.168.123.105:0/1836288920 <== mon.1 v2:192.168.123.108:3300/0 2 ==== config(28 keys) v1 ==== 1147+0+0 (secure 0 0 0) 0x7f362000ce00 con 0x7f363019b010 2026-03-10T07:57:52.775 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:52.773+0000 7f36277fe700 1 -- 192.168.123.105:0/1836288920 <== mon.1 v2:192.168.123.108:3300/0 3 ==== mon_map 
magic: 0 v1 ==== 327+0+0 (secure 0 0 0) 0x7f36200189c0 con 0x7f363019b010 2026-03-10T07:57:52.775 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:52.773+0000 7f3634d13700 1 -- 192.168.123.105:0/1836288920 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "get_command_descriptions"} v 0) v1 -- 0x7f3630109510 con 0x7f363019b010 2026-03-10T07:57:52.776 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:52.774+0000 7f36277fe700 1 -- 192.168.123.105:0/1836288920 <== mon.1 v2:192.168.123.108:3300/0 4 ==== mgrmap(e 38) v1 ==== 100115+0+0 (secure 0 0 0) 0x7f3620018b20 con 0x7f363019b010 2026-03-10T07:57:52.776 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:52.775+0000 7f36277fe700 1 --2- 192.168.123.105:0/1836288920 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f361c07bd30 0x7f361c07e1f0 unknown :-1 s=NONE pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0).connect 2026-03-10T07:57:52.776 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:52.775+0000 7f36277fe700 1 -- 192.168.123.105:0/1836288920 <== mon.1 v2:192.168.123.108:3300/0 5 ==== osd_map(84..84 src has 1..84) v4 ==== 6480+0+0 (secure 0 0 0) 0x7f3620014070 con 0x7f363019b010 2026-03-10T07:57:52.776 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:52.775+0000 7f362e59c700 1 --2- 192.168.123.105:0/1836288920 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f361c07bd30 0x7f361c07e1f0 unknown :-1 s=BANNER_CONNECTING pgs=0 cs=0 l=1 rev1=0 crypto rx=0 tx=0 comp rx=0 tx=0)._handle_peer_banner_payload supported=3 required=0 2026-03-10T07:57:52.776 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:52.775+0000 7f362e59c700 1 --2- 192.168.123.105:0/1836288920 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f361c07bd30 0x7f361c07e1f0 secure :-1 s=READY pgs=115 cs=0 l=1 rev1=1 crypto rx=0x7f3618009b20 tx=0x7f36180058e0 comp rx=0 
tx=0).ready entity=mgr.34104 client_cookie=0 server_cookie=0 in_seq=0 out_seq=0 2026-03-10T07:57:52.777 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:52.776+0000 7f36277fe700 1 -- 192.168.123.105:0/1836288920 <== mon.1 v2:192.168.123.108:3300/0 6 ==== mon_command_ack([{"prefix": "get_command_descriptions"}]=0 v0) v1 ==== 72+0+195034 (secure 0 0 0) 0x7f3620062e80 con 0x7f363019b010 2026-03-10T07:57:52.895 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:52.894+0000 7f3634d13700 1 -- 192.168.123.105:0/1836288920 --> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] -- mon_command({"prefix": "osd blocklist ls"} v 0) v1 -- 0x7f363004ea90 con 0x7f363019b010 2026-03-10T07:57:52.896 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:52.895+0000 7f36277fe700 1 -- 192.168.123.105:0/1836288920 <== mon.1 v2:192.168.123.108:3300/0 7 ==== mon_command_ack([{"prefix": "osd blocklist ls"}]=0 listed 39 entries v84) v1 ==== 81+0+2399 (secure 0 0 0) 0x7f36200625d0 con 0x7f363019b010 2026-03-10T07:57:52.896 INFO:teuthology.orchestra.run.vm05.stdout:192.168.123.108:6824/3815585988 2026-03-11T07:55:45.162439+0000 2026-03-10T07:57:52.896 INFO:teuthology.orchestra.run.vm05.stdout:192.168.123.105:6828/427555544 2026-03-11T07:55:37.451544+0000 2026-03-10T07:57:52.896 INFO:teuthology.orchestra.run.vm05.stdout:192.168.123.108:6826/2963085185 2026-03-11T07:56:03.317867+0000 2026-03-10T07:57:52.896 INFO:teuthology.orchestra.run.vm05.stdout:192.168.123.105:0/1119586557 2026-03-11T07:52:26.626203+0000 2026-03-10T07:57:52.896 INFO:teuthology.orchestra.run.vm05.stdout:192.168.123.108:0/3218523886 2026-03-11T07:50:25.972395+0000 2026-03-10T07:57:52.896 INFO:teuthology.orchestra.run.vm05.stdout:192.168.123.105:6826/723078808 2026-03-11T07:55:27.416986+0000 2026-03-10T07:57:52.896 INFO:teuthology.orchestra.run.vm05.stdout:192.168.123.108:0/1436092746 2026-03-11T07:50:25.972395+0000 2026-03-10T07:57:52.896 
INFO:teuthology.orchestra.run.vm05.stdout:192.168.123.108:6827/2963085185 2026-03-11T07:56:03.317867+0000 2026-03-10T07:57:52.896 INFO:teuthology.orchestra.run.vm05.stdout:192.168.123.105:0/3548984633 2026-03-11T07:45:55.496146+0000 2026-03-10T07:57:52.896 INFO:teuthology.orchestra.run.vm05.stdout:192.168.123.105:0/1043689565 2026-03-11T07:52:26.626203+0000 2026-03-10T07:57:52.896 INFO:teuthology.orchestra.run.vm05.stdout:192.168.123.108:6828/2231834414 2026-03-11T07:50:25.972395+0000 2026-03-10T07:57:52.896 INFO:teuthology.orchestra.run.vm05.stdout:192.168.123.108:0/3163881539 2026-03-11T07:50:25.972395+0000 2026-03-10T07:57:52.896 INFO:teuthology.orchestra.run.vm05.stdout:192.168.123.105:0/3448907780 2026-03-11T07:50:08.386062+0000 2026-03-10T07:57:52.896 INFO:teuthology.orchestra.run.vm05.stdout:192.168.123.105:6800/2 2026-03-11T07:45:55.496146+0000 2026-03-10T07:57:52.896 INFO:teuthology.orchestra.run.vm05.stdout:192.168.123.108:0/4231951380 2026-03-11T07:50:25.972395+0000 2026-03-10T07:57:52.896 INFO:teuthology.orchestra.run.vm05.stdout:192.168.123.105:0/1261368764 2026-03-11T07:46:09.521991+0000 2026-03-10T07:57:52.896 INFO:teuthology.orchestra.run.vm05.stdout:192.168.123.108:0/3743050524 2026-03-11T07:50:25.972395+0000 2026-03-10T07:57:52.896 INFO:teuthology.orchestra.run.vm05.stdout:192.168.123.105:0/1529845979 2026-03-11T07:52:26.626203+0000 2026-03-10T07:57:52.896 INFO:teuthology.orchestra.run.vm05.stdout:192.168.123.105:0/4259283135 2026-03-11T07:46:43.266824+0000 2026-03-10T07:57:52.896 INFO:teuthology.orchestra.run.vm05.stdout:192.168.123.105:0/2921844242 2026-03-11T07:52:26.626203+0000 2026-03-10T07:57:52.896 INFO:teuthology.orchestra.run.vm05.stdout:192.168.123.105:0/3200565544 2026-03-11T07:50:08.386062+0000 2026-03-10T07:57:52.896 INFO:teuthology.orchestra.run.vm05.stdout:192.168.123.108:0/2575586894 2026-03-11T07:50:25.972395+0000 2026-03-10T07:57:52.896 INFO:teuthology.orchestra.run.vm05.stdout:192.168.123.105:6800/413688438 
2026-03-11T07:52:26.626203+0000 2026-03-10T07:57:52.896 INFO:teuthology.orchestra.run.vm05.stdout:192.168.123.108:6829/2231834414 2026-03-11T07:50:25.972395+0000 2026-03-10T07:57:52.896 INFO:teuthology.orchestra.run.vm05.stdout:192.168.123.105:6801/2 2026-03-11T07:45:55.496146+0000 2026-03-10T07:57:52.896 INFO:teuthology.orchestra.run.vm05.stdout:192.168.123.105:0/4118350925 2026-03-11T07:45:55.496146+0000 2026-03-10T07:57:52.896 INFO:teuthology.orchestra.run.vm05.stdout:192.168.123.105:0/597549933 2026-03-11T07:46:09.521991+0000 2026-03-10T07:57:52.896 INFO:teuthology.orchestra.run.vm05.stdout:192.168.123.105:0/2233014865 2026-03-11T07:50:08.386062+0000 2026-03-10T07:57:52.896 INFO:teuthology.orchestra.run.vm05.stdout:192.168.123.108:6824/3692157290 2026-03-11T07:48:30.401131+0000 2026-03-10T07:57:52.896 INFO:teuthology.orchestra.run.vm05.stdout:192.168.123.105:0/77887181 2026-03-11T07:46:43.266824+0000 2026-03-10T07:57:52.896 INFO:teuthology.orchestra.run.vm05.stdout:192.168.123.108:6825/3692157290 2026-03-11T07:48:30.401131+0000 2026-03-10T07:57:52.896 INFO:teuthology.orchestra.run.vm05.stdout:192.168.123.108:6825/3815585988 2026-03-11T07:55:45.162439+0000 2026-03-10T07:57:52.896 INFO:teuthology.orchestra.run.vm05.stdout:192.168.123.105:6801/413688438 2026-03-11T07:52:26.626203+0000 2026-03-10T07:57:52.896 INFO:teuthology.orchestra.run.vm05.stdout:192.168.123.105:6829/427555544 2026-03-11T07:55:37.451544+0000 2026-03-10T07:57:52.896 INFO:teuthology.orchestra.run.vm05.stdout:192.168.123.105:6827/723078808 2026-03-11T07:55:27.416986+0000 2026-03-10T07:57:52.896 INFO:teuthology.orchestra.run.vm05.stdout:192.168.123.105:0/2454546307 2026-03-11T07:45:55.496146+0000 2026-03-10T07:57:52.896 INFO:teuthology.orchestra.run.vm05.stdout:192.168.123.105:0/2989100693 2026-03-11T07:46:09.521991+0000 2026-03-10T07:57:52.896 INFO:teuthology.orchestra.run.vm05.stdout:192.168.123.105:0/1998618536 2026-03-11T07:46:43.266824+0000 2026-03-10T07:57:52.897 
INFO:teuthology.orchestra.run.vm05.stdout:192.168.123.105:0/177716193 2026-03-11T07:50:08.386062+0000 2026-03-10T07:57:52.897 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:52.896+0000 7f3634d13700 1 -- 192.168.123.105:0/1836288920 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f361c07bd30 msgr2=0x7f361c07e1f0 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:57:52.897 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:52.896+0000 7f3634d13700 1 --2- 192.168.123.105:0/1836288920 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f361c07bd30 0x7f361c07e1f0 secure :-1 s=READY pgs=115 cs=0 l=1 rev1=1 crypto rx=0x7f3618009b20 tx=0x7f36180058e0 comp rx=0 tx=0).stop 2026-03-10T07:57:52.897 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:52.896+0000 7f3634d13700 1 -- 192.168.123.105:0/1836288920 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f363019b010 msgr2=0x7f3630194b50 secure :-1 s=STATE_CONNECTION_ESTABLISHED l=1).mark_down 2026-03-10T07:57:52.897 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:52.896+0000 7f3634d13700 1 --2- 192.168.123.105:0/1836288920 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f363019b010 0x7f3630194b50 secure :-1 s=READY pgs=73 cs=0 l=1 rev1=1 crypto rx=0x7f362000eb10 tx=0x7f362000eed0 comp rx=0 tx=0).stop 2026-03-10T07:57:52.897 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:52.896+0000 7f3634d13700 1 -- 192.168.123.105:0/1836288920 shutdown_connections 2026-03-10T07:57:52.898 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:52.896+0000 7f3634d13700 1 --2- 192.168.123.105:0/1836288920 >> [v2:192.168.123.105:6800/627686298,v1:192.168.123.105:6801/627686298] conn(0x7f361c07bd30 0x7f361c07e1f0 unknown :-1 s=CLOSED pgs=115 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:57:52.898 
INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:52.896+0000 7f3634d13700 1 --2- 192.168.123.105:0/1836288920 >> [v2:192.168.123.105:3300/0,v1:192.168.123.105:6789/0] conn(0x7f3630069180 0x7f363019aad0 unknown :-1 s=CLOSED pgs=0 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:57:52.898 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:52.896+0000 7f3634d13700 1 --2- 192.168.123.105:0/1836288920 >> [v2:192.168.123.108:3300/0,v1:192.168.123.108:6789/0] conn(0x7f363019b010 0x7f3630194b50 unknown :-1 s=CLOSED pgs=73 cs=0 l=1 rev1=1 crypto rx=0 tx=0 comp rx=0 tx=0).stop 2026-03-10T07:57:52.898 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:52.896+0000 7f3634d13700 1 -- 192.168.123.105:0/1836288920 >> 192.168.123.105:0/1836288920 conn(0x7f3630076b70 msgr2=0x7f36300fe0b0 unknown :-1 s=STATE_NONE l=0).mark_down 2026-03-10T07:57:52.898 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:52.897+0000 7f3634d13700 1 -- 192.168.123.105:0/1836288920 shutdown_connections 2026-03-10T07:57:52.898 INFO:teuthology.orchestra.run.vm05.stderr:2026-03-10T07:57:52.897+0000 7f3634d13700 1 -- 192.168.123.105:0/1836288920 wait complete. 2026-03-10T07:57:52.899 INFO:teuthology.orchestra.run.vm05.stderr:listed 39 entries 2026-03-10T07:57:52.954 INFO:tasks.cephfs.fuse_mount:Running fusermount -u on ubuntu@vm05.local... 2026-03-10T07:57:52.954 INFO:teuthology.orchestra.run:Running command with timeout 300 2026-03-10T07:57:52.954 DEBUG:teuthology.orchestra.run.vm05:> sudo fusermount -u /home/ubuntu/cephtest/mnt.0 2026-03-10T07:57:52.978 INFO:teuthology.orchestra.run:waiting for 300 2026-03-10T07:57:53.407 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:57:53 vm05.local ceph-mon[130117]: from='client.? 192.168.123.105:0/3627766791' entity='client.admin' cmd=[{"prefix": "osd blocklist ls"}]: dispatch 2026-03-10T07:57:53.407 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:57:53 vm05.local ceph-mon[130117]: from='client.? 
192.168.123.105:0/1836288920' entity='client.admin' cmd=[{"prefix": "osd blocklist ls"}]: dispatch 2026-03-10T07:57:53.418 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:57:53 vm08.local ceph-mon[107898]: from='client.? 192.168.123.105:0/3627766791' entity='client.admin' cmd=[{"prefix": "osd blocklist ls"}]: dispatch 2026-03-10T07:57:53.418 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:57:53 vm08.local ceph-mon[107898]: from='client.? 192.168.123.105:0/1836288920' entity='client.admin' cmd=[{"prefix": "osd blocklist ls"}]: dispatch 2026-03-10T07:57:54.418 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:57:54 vm08.local ceph-mon[107898]: pgmap v207: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail 2026-03-10T07:57:54.657 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:57:54 vm05.local ceph-mon[130117]: pgmap v207: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail 2026-03-10T07:57:56.418 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:57:56 vm08.local ceph-mon[107898]: pgmap v208: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail 2026-03-10T07:57:56.657 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:57:56 vm05.local ceph-mon[130117]: pgmap v208: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail 2026-03-10T07:57:57.657 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:57:57 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T07:57:57.668 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:57:57 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T07:57:58.558 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:57:58 vm05.local ceph-mon[130117]: pgmap v209: 65 pgs: 65 active+clean; 255 MiB 
data, 1.0 GiB used, 119 GiB / 120 GiB avail 2026-03-10T07:57:58.569 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:57:58 vm08.local ceph-mon[107898]: pgmap v209: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail 2026-03-10T07:58:00.657 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:58:00 vm05.local ceph-mon[130117]: pgmap v210: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail 2026-03-10T07:58:00.668 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:58:00 vm08.local ceph-mon[107898]: pgmap v210: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail 2026-03-10T07:58:02.657 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:58:02 vm05.local ceph-mon[130117]: pgmap v211: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail 2026-03-10T07:58:02.668 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:58:02 vm08.local ceph-mon[107898]: pgmap v211: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail 2026-03-10T07:58:04.657 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:58:04 vm05.local ceph-mon[130117]: pgmap v212: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail 2026-03-10T07:58:04.668 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:58:04 vm08.local ceph-mon[107898]: pgmap v212: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail 2026-03-10T07:58:06.657 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:58:06 vm05.local ceph-mon[130117]: pgmap v213: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail 2026-03-10T07:58:06.668 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:58:06 vm08.local ceph-mon[107898]: pgmap v213: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail 2026-03-10T07:58:08.657 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:58:08 vm05.local ceph-mon[130117]: pgmap v214: 65 pgs: 65 active+clean; 255 MiB 
data, 1.0 GiB used, 119 GiB / 120 GiB avail 2026-03-10T07:58:08.668 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:58:08 vm08.local ceph-mon[107898]: pgmap v214: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail 2026-03-10T07:58:10.657 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:58:10 vm05.local ceph-mon[130117]: pgmap v215: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail 2026-03-10T07:58:10.668 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:58:10 vm08.local ceph-mon[107898]: pgmap v215: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail 2026-03-10T07:58:12.657 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:58:12 vm05.local ceph-mon[130117]: pgmap v216: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail 2026-03-10T07:58:12.657 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:58:12 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T07:58:12.668 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:58:12 vm08.local ceph-mon[107898]: pgmap v216: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail 2026-03-10T07:58:12.668 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:58:12 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T07:58:14.657 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:58:14 vm05.local ceph-mon[130117]: pgmap v217: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail 2026-03-10T07:58:14.668 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:58:14 vm08.local ceph-mon[107898]: pgmap v217: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail 2026-03-10T07:58:16.657 
INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:58:16 vm05.local ceph-mon[130117]: pgmap v218: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail
2026-03-10T07:58:16.668 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:58:16 vm08.local ceph-mon[107898]: pgmap v218: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail
2026-03-10T07:58:18.657 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:58:18 vm05.local ceph-mon[130117]: pgmap v219: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail
2026-03-10T07:58:18.668 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:58:18 vm08.local ceph-mon[107898]: pgmap v219: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail
2026-03-10T07:58:20.518 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:58:20 vm05.local ceph-mon[130117]: pgmap v220: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail
2026-03-10T07:58:20.519 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:58:20 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch
2026-03-10T07:58:20.519 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:58:20 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-10T07:58:20.519 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:58:20 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
2026-03-10T07:58:20.519 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:58:20 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke'
2026-03-10T07:58:20.668 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:58:20 vm08.local ceph-mon[107898]: pgmap v220: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail
2026-03-10T07:58:20.668 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:58:20 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch
2026-03-10T07:58:20.668 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:58:20 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-10T07:58:20.668 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:58:20 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
2026-03-10T07:58:20.668 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:58:20 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke'
2026-03-10T07:58:22.657 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:58:22 vm05.local ceph-mon[130117]: pgmap v221: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail
2026-03-10T07:58:22.668 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:58:22 vm08.local ceph-mon[107898]: pgmap v221: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail
2026-03-10T07:58:24.657 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:58:24 vm05.local ceph-mon[130117]: pgmap v222: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail
2026-03-10T07:58:24.668 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:58:24 vm08.local ceph-mon[107898]: pgmap v222: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail
2026-03-10T07:58:26.657 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:58:26 vm05.local ceph-mon[130117]: pgmap v223: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail
2026-03-10T07:58:26.668 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:58:26 vm08.local ceph-mon[107898]: pgmap v223: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail
2026-03-10T07:58:27.657 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:58:27 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
2026-03-10T07:58:27.668 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:58:27 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
2026-03-10T07:58:28.558 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:58:28 vm05.local ceph-mon[130117]: pgmap v224: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail
2026-03-10T07:58:28.569 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:58:28 vm08.local ceph-mon[107898]: pgmap v224: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail
2026-03-10T07:58:30.616 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:58:30 vm05.local ceph-mon[130117]: pgmap v225: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail
2026-03-10T07:58:30.668 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:58:30 vm08.local ceph-mon[107898]: pgmap v225: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail
2026-03-10T07:58:32.657 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:58:32 vm05.local ceph-mon[130117]: pgmap v226: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail
2026-03-10T07:58:32.668 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:58:32 vm08.local ceph-mon[107898]: pgmap v226: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail
2026-03-10T07:58:34.657 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:58:34 vm05.local ceph-mon[130117]: pgmap v227: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail
2026-03-10T07:58:34.668 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:58:34 vm08.local ceph-mon[107898]: pgmap v227: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail
2026-03-10T07:58:36.601 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:58:36 vm08.local ceph-mon[107898]: pgmap v228: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail
2026-03-10T07:58:36.657 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:58:36 vm05.local ceph-mon[130117]: pgmap v228: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail
2026-03-10T07:58:38.657 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:58:38 vm05.local ceph-mon[130117]: pgmap v229: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail
2026-03-10T07:58:38.668 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:58:38 vm08.local ceph-mon[107898]: pgmap v229: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail
2026-03-10T07:58:40.657 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:58:40 vm05.local ceph-mon[130117]: pgmap v230: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail
2026-03-10T07:58:40.668 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:58:40 vm08.local ceph-mon[107898]: pgmap v230: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail
2026-03-10T07:58:42.657 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:58:42 vm05.local ceph-mon[130117]: pgmap v231: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail
2026-03-10T07:58:42.657 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:58:42 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
2026-03-10T07:58:42.668 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:58:42 vm08.local ceph-mon[107898]: pgmap v231: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail
2026-03-10T07:58:42.668 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:58:42 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
2026-03-10T07:58:44.657 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:58:44 vm05.local ceph-mon[130117]: pgmap v232: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail
2026-03-10T07:58:44.668 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:58:44 vm08.local ceph-mon[107898]: pgmap v232: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail
2026-03-10T07:58:46.657 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:58:46 vm05.local ceph-mon[130117]: pgmap v233: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail
2026-03-10T07:58:46.668 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:58:46 vm08.local ceph-mon[107898]: pgmap v233: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail
2026-03-10T07:58:48.657 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:58:48 vm05.local ceph-mon[130117]: pgmap v234: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail
2026-03-10T07:58:48.668 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:58:48 vm08.local ceph-mon[107898]: pgmap v234: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail
2026-03-10T07:58:50.657 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:58:50 vm05.local ceph-mon[130117]: pgmap v235: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail
2026-03-10T07:58:50.668 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:58:50 vm08.local ceph-mon[107898]: pgmap v235: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail
2026-03-10T07:58:52.657 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:58:52 vm05.local ceph-mon[130117]: pgmap v236: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail
2026-03-10T07:58:52.668 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:58:52 vm08.local ceph-mon[107898]: pgmap v236: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail
2026-03-10T07:58:54.657 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:58:54 vm05.local ceph-mon[130117]: pgmap v237: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail
2026-03-10T07:58:54.668 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:58:54 vm08.local ceph-mon[107898]: pgmap v237: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail
2026-03-10T07:58:56.657 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:58:56 vm05.local ceph-mon[130117]: pgmap v238: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail
2026-03-10T07:58:56.668 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:58:56 vm08.local ceph-mon[107898]: pgmap v238: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail
2026-03-10T07:58:57.657 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:58:57 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
2026-03-10T07:58:57.668 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:58:57 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
2026-03-10T07:58:58.657 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:58:58 vm05.local ceph-mon[130117]: pgmap v239: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail
2026-03-10T07:58:58.668 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:58:58 vm08.local ceph-mon[107898]: pgmap v239: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail
2026-03-10T07:59:00.657 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:59:00 vm05.local ceph-mon[130117]: pgmap v240: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail
2026-03-10T07:59:00.668 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:59:00 vm08.local ceph-mon[107898]: pgmap v240: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail
2026-03-10T07:59:02.657 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:59:02 vm05.local ceph-mon[130117]: pgmap v241: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail
2026-03-10T07:59:02.668 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:59:02 vm08.local ceph-mon[107898]: pgmap v241: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail
2026-03-10T07:59:04.657 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:59:04 vm05.local ceph-mon[130117]: pgmap v242: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail
2026-03-10T07:59:04.668 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:59:04 vm08.local ceph-mon[107898]: pgmap v242: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail
2026-03-10T07:59:06.657 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:59:06 vm05.local ceph-mon[130117]: pgmap v243: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail
2026-03-10T07:59:06.668 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:59:06 vm08.local ceph-mon[107898]: pgmap v243: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail
2026-03-10T07:59:08.657 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:59:08 vm05.local ceph-mon[130117]: pgmap v244: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail
2026-03-10T07:59:08.668 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:59:08 vm08.local ceph-mon[107898]: pgmap v244: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail
2026-03-10T07:59:10.657 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:59:10 vm05.local ceph-mon[130117]: pgmap v245: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail
2026-03-10T07:59:10.668 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:59:10 vm08.local ceph-mon[107898]: pgmap v245: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail
2026-03-10T07:59:12.668 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:59:12 vm08.local ceph-mon[107898]: pgmap v246: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail
2026-03-10T07:59:12.668 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:59:12 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
2026-03-10T07:59:12.906 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:59:12 vm05.local ceph-mon[130117]: pgmap v246: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail
2026-03-10T07:59:12.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:59:12 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
2026-03-10T07:59:14.668 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:59:14 vm08.local ceph-mon[107898]: pgmap v247: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail
2026-03-10T07:59:14.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:59:14 vm05.local ceph-mon[130117]: pgmap v247: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail
2026-03-10T07:59:16.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:59:16 vm05.local ceph-mon[130117]: pgmap v248: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail
2026-03-10T07:59:16.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:59:16 vm08.local ceph-mon[107898]: pgmap v248: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail
2026-03-10T07:59:18.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:59:18 vm05.local ceph-mon[130117]: pgmap v249: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail
2026-03-10T07:59:18.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:59:18 vm08.local ceph-mon[107898]: pgmap v249: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail
2026-03-10T07:59:20.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:59:20 vm05.local ceph-mon[130117]: pgmap v250: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail
2026-03-10T07:59:20.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:59:20 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch
2026-03-10T07:59:20.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:59:20 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-10T07:59:20.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:59:20 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
2026-03-10T07:59:20.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:59:20 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke'
2026-03-10T07:59:20.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:59:20 vm08.local ceph-mon[107898]: pgmap v250: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail
2026-03-10T07:59:20.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:59:20 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch
2026-03-10T07:59:20.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:59:20 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-10T07:59:20.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:59:20 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
2026-03-10T07:59:20.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:59:20 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke'
2026-03-10T07:59:22.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:59:22 vm05.local ceph-mon[130117]: pgmap v251: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail
2026-03-10T07:59:22.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:59:22 vm08.local ceph-mon[107898]: pgmap v251: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail
2026-03-10T07:59:24.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:59:24 vm05.local ceph-mon[130117]: pgmap v252: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail
2026-03-10T07:59:24.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:59:24 vm08.local ceph-mon[107898]: pgmap v252: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail
2026-03-10T07:59:26.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:59:26 vm05.local ceph-mon[130117]: pgmap v253: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail
2026-03-10T07:59:26.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:59:26 vm08.local ceph-mon[107898]: pgmap v253: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail
2026-03-10T07:59:27.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:59:27 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
2026-03-10T07:59:27.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:59:27 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
2026-03-10T07:59:28.808 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:59:28 vm05.local ceph-mon[130117]: pgmap v254: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail
2026-03-10T07:59:28.819 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:59:28 vm08.local ceph-mon[107898]: pgmap v254: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail
2026-03-10T07:59:30.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:59:30 vm05.local ceph-mon[130117]: pgmap v255: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail
2026-03-10T07:59:30.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:59:30 vm08.local ceph-mon[107898]: pgmap v255: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail
2026-03-10T07:59:32.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:59:32 vm05.local ceph-mon[130117]: pgmap v256: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail
2026-03-10T07:59:32.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:59:32 vm08.local ceph-mon[107898]: pgmap v256: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail
2026-03-10T07:59:34.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:59:34 vm05.local ceph-mon[130117]: pgmap v257: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail
2026-03-10T07:59:34.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:59:34 vm08.local ceph-mon[107898]: pgmap v257: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail
2026-03-10T07:59:36.657 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:59:36 vm05.local ceph-mon[130117]: pgmap v258: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail
2026-03-10T07:59:36.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:59:36 vm08.local ceph-mon[107898]: pgmap v258: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail
2026-03-10T07:59:38.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:59:38 vm05.local ceph-mon[130117]: pgmap v259: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail
2026-03-10T07:59:38.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:59:38 vm08.local ceph-mon[107898]: pgmap v259: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail
2026-03-10T07:59:40.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:59:40 vm05.local ceph-mon[130117]: pgmap v260: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail
2026-03-10T07:59:40.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:59:40 vm08.local ceph-mon[107898]: pgmap v260: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail
2026-03-10T07:59:42.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:59:42 vm05.local ceph-mon[130117]: pgmap v261: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail
2026-03-10T07:59:42.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:59:42 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
2026-03-10T07:59:42.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:59:42 vm08.local ceph-mon[107898]: pgmap v261: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail
2026-03-10T07:59:42.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:59:42 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
2026-03-10T07:59:44.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:59:44 vm05.local ceph-mon[130117]: pgmap v262: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail
2026-03-10T07:59:44.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:59:44 vm08.local ceph-mon[107898]: pgmap v262: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail
2026-03-10T07:59:46.657 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:59:46 vm05.local ceph-mon[130117]: pgmap v263: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail
2026-03-10T07:59:46.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:59:46 vm08.local ceph-mon[107898]: pgmap v263: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail
2026-03-10T07:59:48.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:59:48 vm05.local ceph-mon[130117]: pgmap v264: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail
2026-03-10T07:59:48.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:59:48 vm08.local ceph-mon[107898]: pgmap v264: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail
2026-03-10T07:59:50.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:59:50 vm05.local ceph-mon[130117]: pgmap v265: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail
2026-03-10T07:59:50.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:59:50 vm08.local ceph-mon[107898]: pgmap v265: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail
2026-03-10T07:59:52.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:59:52 vm05.local ceph-mon[130117]: pgmap v266: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail
2026-03-10T07:59:52.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:59:52 vm08.local ceph-mon[107898]: pgmap v266: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail
2026-03-10T07:59:54.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:59:54 vm05.local ceph-mon[130117]: pgmap v267: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail
2026-03-10T07:59:54.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:59:54 vm08.local ceph-mon[107898]: pgmap v267: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail
2026-03-10T07:59:56.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:59:56 vm05.local ceph-mon[130117]: pgmap v268: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail
2026-03-10T07:59:56.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:59:56 vm08.local ceph-mon[107898]: pgmap v268: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail
2026-03-10T07:59:57.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:59:57 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
2026-03-10T07:59:57.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:59:57 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
2026-03-10T07:59:58.808 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 07:59:58 vm05.local ceph-mon[130117]: pgmap v269: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail
2026-03-10T07:59:58.819 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 07:59:58 vm08.local ceph-mon[107898]: pgmap v269: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail
2026-03-10T08:00:00.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:00:00 vm05.local ceph-mon[130117]: pgmap v270: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail
2026-03-10T08:00:00.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:00:00 vm05.local ceph-mon[130117]: overall HEALTH_WARN 1 filesystem with deprecated feature inline_data
2026-03-10T08:00:00.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:00:00 vm08.local ceph-mon[107898]: pgmap v270: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail
2026-03-10T08:00:00.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:00:00 vm08.local ceph-mon[107898]: overall HEALTH_WARN 1 filesystem with deprecated feature inline_data
2026-03-10T08:00:02.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:00:02 vm05.local ceph-mon[130117]: pgmap v271: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail
2026-03-10T08:00:02.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:00:02 vm08.local ceph-mon[107898]: pgmap v271: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail
2026-03-10T08:00:04.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:00:04 vm05.local ceph-mon[130117]: pgmap v272: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail
2026-03-10T08:00:04.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:00:04 vm08.local ceph-mon[107898]: pgmap v272: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail
2026-03-10T08:00:06.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:00:06 vm05.local ceph-mon[130117]: pgmap v273: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail
2026-03-10T08:00:06.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:00:06 vm08.local ceph-mon[107898]: pgmap v273: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail
2026-03-10T08:00:08.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:00:08 vm05.local ceph-mon[130117]: pgmap v274: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail
2026-03-10T08:00:08.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:00:08 vm08.local ceph-mon[107898]: pgmap v274: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail
2026-03-10T08:00:10.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:00:10 vm05.local ceph-mon[130117]: pgmap v275: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail
2026-03-10T08:00:10.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:00:10 vm08.local ceph-mon[107898]: pgmap v275: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail
2026-03-10T08:00:12.668 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:00:12 vm08.local ceph-mon[107898]: pgmap v276: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail
2026-03-10T08:00:12.668 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:00:12 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
2026-03-10T08:00:12.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:00:12 vm05.local ceph-mon[130117]: pgmap v276: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail
2026-03-10T08:00:12.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:00:12 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
2026-03-10T08:00:14.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:00:14 vm05.local ceph-mon[130117]: pgmap v277: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail
2026-03-10T08:00:14.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:00:14 vm08.local ceph-mon[107898]: pgmap v277: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail
2026-03-10T08:00:16.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:00:16 vm05.local ceph-mon[130117]: pgmap v278: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail
2026-03-10T08:00:16.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:00:16 vm08.local ceph-mon[107898]: pgmap v278: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail
2026-03-10T08:00:18.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:00:18 vm05.local ceph-mon[130117]: pgmap v279: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail
2026-03-10T08:00:18.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:00:18 vm08.local ceph-mon[107898]: pgmap v279: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail
2026-03-10T08:00:20.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:00:20 vm05.local ceph-mon[130117]: pgmap v280: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail
2026-03-10T08:00:20.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:00:20 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch
2026-03-10T08:00:20.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:00:20 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-10T08:00:20.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:00:20 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
2026-03-10T08:00:20.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:00:20 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke'
2026-03-10T08:00:20.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:00:20 vm08.local ceph-mon[107898]: pgmap v280: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail
2026-03-10T08:00:20.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:00:20 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch
2026-03-10T08:00:20.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:00:20 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-10T08:00:20.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:00:20 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
2026-03-10T08:00:20.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:00:20 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke'
2026-03-10T08:00:22.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:00:22 vm05.local ceph-mon[130117]: pgmap v281: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail
2026-03-10T08:00:22.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:00:22 vm08.local ceph-mon[107898]: pgmap v281: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail
2026-03-10T08:00:24.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:00:24 vm05.local ceph-mon[130117]: pgmap v282: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail
2026-03-10T08:00:24.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:00:24 vm08.local ceph-mon[107898]: pgmap v282: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail
2026-03-10T08:00:26.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:00:26 vm05.local ceph-mon[130117]: pgmap v283: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail
2026-03-10T08:00:26.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:00:26 vm08.local ceph-mon[107898]: pgmap v283: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail
2026-03-10T08:00:27.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:00:27 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
2026-03-10T08:00:27.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:00:27 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
2026-03-10T08:00:28.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:00:28 vm08.local ceph-mon[107898]: pgmap v284: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail
2026-03-10T08:00:29.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:00:28 vm05.local ceph-mon[130117]: pgmap v284: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail
2026-03-10T08:00:30.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:00:30 vm08.local ceph-mon[107898]: pgmap v285: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail
2026-03-10T08:00:31.060 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:00:30 vm05.local ceph-mon[130117]: pgmap v285: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail
2026-03-10T08:00:32.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:00:32 vm08.local ceph-mon[107898]: pgmap v286: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail
2026-03-10T08:00:33.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:00:32 vm05.local ceph-mon[130117]: pgmap v286: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail
2026-03-10T08:00:35.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:00:34 vm05.local ceph-mon[130117]: pgmap v287: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail
2026-03-10T08:00:35.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:00:34 vm08.local ceph-mon[107898]: pgmap v287: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail
2026-03-10T08:00:37.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:00:36 vm05.local ceph-mon[130117]: pgmap v288: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail
2026-03-10T08:00:37.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:00:36 vm08.local ceph-mon[107898]: pgmap v288: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail
2026-03-10T08:00:39.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:00:38 vm05.local ceph-mon[130117]: pgmap v289: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail
2026-03-10T08:00:39.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:00:38 vm08.local ceph-mon[107898]: pgmap v289: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail
2026-03-10T08:00:41.060 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:00:40 vm05.local ceph-mon[130117]: pgmap v290: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail
2026-03-10T08:00:41.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:00:40 vm08.local ceph-mon[107898]: pgmap v290: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail
2026-03-10T08:00:42.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:00:42 vm08.local ceph-mon[107898]: pgmap v291: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail
2026-03-10T08:00:42.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:00:42 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke'
cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T08:00:43.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:00:42 vm05.local ceph-mon[130117]: pgmap v291: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail 2026-03-10T08:00:43.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:00:42 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T08:00:45.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:00:44 vm05.local ceph-mon[130117]: pgmap v292: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail 2026-03-10T08:00:45.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:00:44 vm08.local ceph-mon[107898]: pgmap v292: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail 2026-03-10T08:00:47.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:00:46 vm05.local ceph-mon[130117]: pgmap v293: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail 2026-03-10T08:00:47.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:00:46 vm08.local ceph-mon[107898]: pgmap v293: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail 2026-03-10T08:00:49.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:00:48 vm05.local ceph-mon[130117]: pgmap v294: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail 2026-03-10T08:00:49.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:00:48 vm08.local ceph-mon[107898]: pgmap v294: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail 2026-03-10T08:00:51.060 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:00:50 vm05.local ceph-mon[130117]: pgmap v295: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail 2026-03-10T08:00:51.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:00:50 
vm08.local ceph-mon[107898]: pgmap v295: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail 2026-03-10T08:00:53.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:00:52 vm05.local ceph-mon[130117]: pgmap v296: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail 2026-03-10T08:00:53.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:00:52 vm08.local ceph-mon[107898]: pgmap v296: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail 2026-03-10T08:00:55.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:00:54 vm05.local ceph-mon[130117]: pgmap v297: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail 2026-03-10T08:00:55.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:00:54 vm08.local ceph-mon[107898]: pgmap v297: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail 2026-03-10T08:00:57.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:00:56 vm05.local ceph-mon[130117]: pgmap v298: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail 2026-03-10T08:00:57.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:00:56 vm08.local ceph-mon[107898]: pgmap v298: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail 2026-03-10T08:00:58.058 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:00:57 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T08:00:58.069 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:00:57 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T08:00:59.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:00:58 vm05.local ceph-mon[130117]: pgmap v299: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 
120 GiB avail 2026-03-10T08:00:59.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:00:58 vm08.local ceph-mon[107898]: pgmap v299: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail 2026-03-10T08:01:01.061 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:01:00 vm05.local ceph-mon[130117]: pgmap v300: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail 2026-03-10T08:01:01.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:01:00 vm08.local ceph-mon[107898]: pgmap v300: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail 2026-03-10T08:01:03.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:01:02 vm05.local ceph-mon[130117]: pgmap v301: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail 2026-03-10T08:01:03.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:01:02 vm08.local ceph-mon[107898]: pgmap v301: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail 2026-03-10T08:01:05.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:01:04 vm05.local ceph-mon[130117]: pgmap v302: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail 2026-03-10T08:01:05.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:01:04 vm08.local ceph-mon[107898]: pgmap v302: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail 2026-03-10T08:01:07.084 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:01:06 vm05.local ceph-mon[130117]: pgmap v303: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail 2026-03-10T08:01:07.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:01:06 vm08.local ceph-mon[107898]: pgmap v303: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail 2026-03-10T08:01:09.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:01:08 vm05.local ceph-mon[130117]: pgmap v304: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 
120 GiB avail 2026-03-10T08:01:09.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:01:08 vm08.local ceph-mon[107898]: pgmap v304: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail 2026-03-10T08:01:11.060 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:01:10 vm05.local ceph-mon[130117]: pgmap v305: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail 2026-03-10T08:01:11.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:01:10 vm08.local ceph-mon[107898]: pgmap v305: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail 2026-03-10T08:01:13.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:01:12 vm05.local ceph-mon[130117]: pgmap v306: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail 2026-03-10T08:01:13.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:01:12 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T08:01:13.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:01:12 vm08.local ceph-mon[107898]: pgmap v306: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail 2026-03-10T08:01:13.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:01:12 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T08:01:15.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:01:14 vm05.local ceph-mon[130117]: pgmap v307: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail 2026-03-10T08:01:15.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:01:14 vm08.local ceph-mon[107898]: pgmap v307: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail 2026-03-10T08:01:17.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:01:16 
vm05.local ceph-mon[130117]: pgmap v308: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail 2026-03-10T08:01:17.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:01:16 vm08.local ceph-mon[107898]: pgmap v308: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail 2026-03-10T08:01:19.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:01:18 vm05.local ceph-mon[130117]: pgmap v309: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail 2026-03-10T08:01:19.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:01:18 vm08.local ceph-mon[107898]: pgmap v309: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail 2026-03-10T08:01:21.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:01:20 vm05.local ceph-mon[130117]: pgmap v310: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail 2026-03-10T08:01:21.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:01:20 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T08:01:21.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:01:20 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T08:01:21.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:01:20 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T08:01:21.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:01:20 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T08:01:21.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:01:20 vm08.local ceph-mon[107898]: pgmap v310: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB 
used, 119 GiB / 120 GiB avail 2026-03-10T08:01:21.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:01:20 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T08:01:21.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:01:20 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T08:01:21.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:01:20 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T08:01:21.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:01:20 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T08:01:22.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:01:21 vm05.local ceph-mon[130117]: pgmap v311: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail 2026-03-10T08:01:22.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:01:21 vm08.local ceph-mon[107898]: pgmap v311: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail 2026-03-10T08:01:24.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:01:23 vm05.local ceph-mon[130117]: pgmap v312: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail 2026-03-10T08:01:24.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:01:23 vm08.local ceph-mon[107898]: pgmap v312: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail 2026-03-10T08:01:26.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:01:25 vm05.local ceph-mon[130117]: pgmap v313: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail 2026-03-10T08:01:26.168 
INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:01:25 vm08.local ceph-mon[107898]: pgmap v313: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail 2026-03-10T08:01:27.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:01:26 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T08:01:27.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:01:26 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T08:01:28.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:01:27 vm05.local ceph-mon[130117]: pgmap v314: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail 2026-03-10T08:01:28.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:01:27 vm08.local ceph-mon[107898]: pgmap v314: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail 2026-03-10T08:01:30.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:01:29 vm05.local ceph-mon[130117]: pgmap v315: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail 2026-03-10T08:01:30.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:01:29 vm08.local ceph-mon[107898]: pgmap v315: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail 2026-03-10T08:01:32.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:01:31 vm08.local ceph-mon[107898]: pgmap v316: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail 2026-03-10T08:01:32.323 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:01:31 vm05.local ceph-mon[130117]: pgmap v316: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail 2026-03-10T08:01:34.407 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:01:33 vm05.local ceph-mon[130117]: pgmap v317: 65 
pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail 2026-03-10T08:01:34.418 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:01:33 vm08.local ceph-mon[107898]: pgmap v317: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail 2026-03-10T08:01:36.407 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:01:35 vm05.local ceph-mon[130117]: pgmap v318: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail 2026-03-10T08:01:36.418 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:01:35 vm08.local ceph-mon[107898]: pgmap v318: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail 2026-03-10T08:01:38.407 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:01:37 vm05.local ceph-mon[130117]: pgmap v319: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail 2026-03-10T08:01:38.418 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:01:37 vm08.local ceph-mon[107898]: pgmap v319: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail 2026-03-10T08:01:40.407 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:01:39 vm05.local ceph-mon[130117]: pgmap v320: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail 2026-03-10T08:01:40.418 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:01:39 vm08.local ceph-mon[107898]: pgmap v320: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail 2026-03-10T08:01:42.407 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:01:41 vm05.local ceph-mon[130117]: pgmap v321: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail 2026-03-10T08:01:42.407 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:01:41 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T08:01:42.418 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 
08:01:41 vm08.local ceph-mon[107898]: pgmap v321: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail 2026-03-10T08:01:42.418 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:01:41 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T08:01:44.407 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:01:43 vm05.local ceph-mon[130117]: pgmap v322: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail 2026-03-10T08:01:44.418 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:01:43 vm08.local ceph-mon[107898]: pgmap v322: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail 2026-03-10T08:01:46.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:01:46 vm05.local ceph-mon[130117]: pgmap v323: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail 2026-03-10T08:01:46.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:01:46 vm08.local ceph-mon[107898]: pgmap v323: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail 2026-03-10T08:01:48.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:01:48 vm05.local ceph-mon[130117]: pgmap v324: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail 2026-03-10T08:01:48.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:01:48 vm08.local ceph-mon[107898]: pgmap v324: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail 2026-03-10T08:01:50.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:01:50 vm05.local ceph-mon[130117]: pgmap v325: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail 2026-03-10T08:01:50.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:01:50 vm08.local ceph-mon[107898]: pgmap v325: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail 2026-03-10T08:01:52.907 
INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:01:52 vm05.local ceph-mon[130117]: pgmap v326: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail 2026-03-10T08:01:52.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:01:52 vm08.local ceph-mon[107898]: pgmap v326: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail 2026-03-10T08:01:54.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:01:54 vm05.local ceph-mon[130117]: pgmap v327: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail 2026-03-10T08:01:54.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:01:54 vm08.local ceph-mon[107898]: pgmap v327: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail 2026-03-10T08:01:56.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:01:56 vm05.local ceph-mon[130117]: pgmap v328: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail 2026-03-10T08:01:56.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:01:56 vm08.local ceph-mon[107898]: pgmap v328: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail 2026-03-10T08:01:57.906 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:01:57 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T08:01:57.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:01:57 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T08:01:58.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:01:58 vm05.local ceph-mon[130117]: pgmap v329: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail 2026-03-10T08:01:58.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:01:58 vm08.local ceph-mon[107898]: pgmap v329: 65 
pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail 2026-03-10T08:02:00.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:02:00 vm05.local ceph-mon[130117]: pgmap v330: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail 2026-03-10T08:02:00.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:02:00 vm08.local ceph-mon[107898]: pgmap v330: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail 2026-03-10T08:02:02.907 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:02:02 vm05.local ceph-mon[130117]: pgmap v331: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail 2026-03-10T08:02:02.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:02:02 vm08.local ceph-mon[107898]: pgmap v331: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail 2026-03-10T08:02:04.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:02:04 vm08.local ceph-mon[107898]: pgmap v332: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail 2026-03-10T08:02:05.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:02:04 vm05.local ceph-mon[130117]: pgmap v332: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail 2026-03-10T08:02:07.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:02:06 vm05.local ceph-mon[130117]: pgmap v333: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail 2026-03-10T08:02:07.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:02:06 vm08.local ceph-mon[107898]: pgmap v333: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail 2026-03-10T08:02:09.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:02:08 vm05.local ceph-mon[130117]: pgmap v334: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail 2026-03-10T08:02:09.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:02:08 vm08.local ceph-mon[107898]: pgmap v334: 65 
pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail 2026-03-10T08:02:11.060 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:02:10 vm05.local ceph-mon[130117]: pgmap v335: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail 2026-03-10T08:02:11.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:02:10 vm08.local ceph-mon[107898]: pgmap v335: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail 2026-03-10T08:02:13.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:02:12 vm05.local ceph-mon[130117]: pgmap v336: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail 2026-03-10T08:02:13.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:02:12 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T08:02:13.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:02:12 vm08.local ceph-mon[107898]: pgmap v336: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail 2026-03-10T08:02:13.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:02:12 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch 2026-03-10T08:02:15.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:02:14 vm05.local ceph-mon[130117]: pgmap v337: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail 2026-03-10T08:02:15.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:02:14 vm08.local ceph-mon[107898]: pgmap v337: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail 2026-03-10T08:02:17.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:02:16 vm05.local ceph-mon[130117]: pgmap v338: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail 2026-03-10T08:02:17.168 
INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:02:16 vm08.local ceph-mon[107898]: pgmap v338: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail 2026-03-10T08:02:18.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:02:18 vm08.local ceph-mon[107898]: pgmap v339: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail 2026-03-10T08:02:19.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:02:18 vm05.local ceph-mon[130117]: pgmap v339: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail 2026-03-10T08:02:21.060 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:02:20 vm05.local ceph-mon[130117]: pgmap v340: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail 2026-03-10T08:02:21.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:02:20 vm08.local ceph-mon[107898]: pgmap v340: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail 2026-03-10T08:02:22.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:02:21 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch 2026-03-10T08:02:22.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:02:21 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch 2026-03-10T08:02:22.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:02:21 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch 2026-03-10T08:02:22.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:02:21 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' 2026-03-10T08:02:22.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:02:21 vm08.local ceph-mon[107898]: 
from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "config dump", "format": "json"}]: dispatch
2026-03-10T08:02:22.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:02:21 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "config generate-minimal-conf"}]: dispatch
2026-03-10T08:02:22.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:02:21 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "auth get", "entity": "client.admin"}]: dispatch
2026-03-10T08:02:22.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:02:21 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke'
2026-03-10T08:02:23.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:02:22 vm05.local ceph-mon[130117]: pgmap v341: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail
2026-03-10T08:02:23.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:02:22 vm08.local ceph-mon[107898]: pgmap v341: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail
2026-03-10T08:02:25.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:02:24 vm05.local ceph-mon[130117]: pgmap v342: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail
2026-03-10T08:02:25.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:02:24 vm08.local ceph-mon[107898]: pgmap v342: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail
2026-03-10T08:02:27.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:02:26 vm05.local ceph-mon[130117]: pgmap v343: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail
2026-03-10T08:02:27.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:02:26 vm08.local ceph-mon[107898]: pgmap v343: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail
2026-03-10T08:02:28.058 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:02:27 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
2026-03-10T08:02:28.069 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:02:27 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
2026-03-10T08:02:28.918 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:02:28 vm08.local ceph-mon[107898]: pgmap v344: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail
2026-03-10T08:02:29.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:02:28 vm05.local ceph-mon[130117]: pgmap v344: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail
2026-03-10T08:02:31.060 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:02:30 vm05.local ceph-mon[130117]: pgmap v345: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail
2026-03-10T08:02:31.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:02:30 vm08.local ceph-mon[107898]: pgmap v345: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail
2026-03-10T08:02:33.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:02:32 vm05.local ceph-mon[130117]: pgmap v346: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail
2026-03-10T08:02:33.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:02:32 vm08.local ceph-mon[107898]: pgmap v346: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail
2026-03-10T08:02:35.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:02:34 vm05.local ceph-mon[130117]: pgmap v347: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail
2026-03-10T08:02:35.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:02:34 vm08.local ceph-mon[107898]: pgmap v347: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail
2026-03-10T08:02:37.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:02:36 vm05.local ceph-mon[130117]: pgmap v348: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail
2026-03-10T08:02:37.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:02:36 vm08.local ceph-mon[107898]: pgmap v348: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail
2026-03-10T08:02:39.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:02:38 vm05.local ceph-mon[130117]: pgmap v349: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail
2026-03-10T08:02:39.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:02:38 vm08.local ceph-mon[107898]: pgmap v349: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail
2026-03-10T08:02:41.060 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:02:40 vm05.local ceph-mon[130117]: pgmap v350: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail
2026-03-10T08:02:41.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:02:40 vm08.local ceph-mon[107898]: pgmap v350: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail
2026-03-10T08:02:43.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:02:42 vm05.local ceph-mon[130117]: pgmap v351: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail
2026-03-10T08:02:43.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:02:42 vm05.local ceph-mon[130117]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
2026-03-10T08:02:43.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:02:42 vm08.local ceph-mon[107898]: pgmap v351: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail
2026-03-10T08:02:43.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:02:42 vm08.local ceph-mon[107898]: from='mgr.34104 192.168.123.105:0/1410448602' entity='mgr.vm05.blexke' cmd=[{"prefix": "osd blocklist ls", "format": "json"}]: dispatch
2026-03-10T08:02:45.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:02:44 vm05.local ceph-mon[130117]: pgmap v352: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail
2026-03-10T08:02:45.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:02:44 vm08.local ceph-mon[107898]: pgmap v352: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail
2026-03-10T08:02:46.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:02:45 vm05.local ceph-mon[130117]: pgmap v353: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail
2026-03-10T08:02:46.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:02:45 vm08.local ceph-mon[107898]: pgmap v353: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail
2026-03-10T08:02:48.157 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:02:47 vm05.local ceph-mon[130117]: pgmap v354: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail
2026-03-10T08:02:48.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:02:47 vm08.local ceph-mon[107898]: pgmap v354: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail
2026-03-10T08:02:50.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:02:49 vm08.local ceph-mon[107898]: pgmap v355: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail
2026-03-10T08:02:50.407 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:02:49 vm05.local ceph-mon[130117]: pgmap v355: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail
2026-03-10T08:02:52.028 ERROR:tasks.cephfs.fuse_mount:process failed to terminate after unmount. This probably indicates a bug within ceph-fuse.
2026-03-10T08:02:52.028 ERROR:teuthology.run_tasks:Manager failed: ceph-fuse
Traceback (most recent call last):
  File "/home/teuthos/teuthology/teuthology/run_tasks.py", line 160, in run_tasks
    suppress = manager.__exit__(*exc_info)
  File "/home/teuthos/.local/share/uv/python/cpython-3.10.19-linux-x86_64-gnu/lib/python3.10/contextlib.py", line 142, in __exit__
    next(self.gen)
  File "/home/teuthos/src/github.com_kshtsk_ceph_75a68fd8ca3f918fe9466b4c0bb385b7fc260a9b/qa/tasks/ceph_fuse.py", line 248, in task
    mount.umount_wait()
  File "/home/teuthos/src/github.com_kshtsk_ceph_75a68fd8ca3f918fe9466b4c0bb385b7fc260a9b/qa/tasks/cephfs/fuse_mount.py", line 403, in umount_wait
    run.wait([self.fuse_daemon], timeout)
  File "/home/teuthos/teuthology/teuthology/orchestra/run.py", line 479, in wait
    check_time()
  File "/home/teuthos/teuthology/teuthology/contextutil.py", line 134, in __call__
    raise MaxWhileTries(error_msg)
teuthology.exceptions.MaxWhileTries: reached maximum tries (50) after waiting for 300 seconds
2026-03-10T08:02:52.029 DEBUG:teuthology.run_tasks:Unwinding manager cephadm
2026-03-10T08:02:52.032 INFO:tasks.cephadm:Teardown begin
2026-03-10T08:02:52.032 ERROR:teuthology.contextutil:Saw exception from nested tasks
Traceback (most recent call last):
  File "/home/teuthos/teuthology/teuthology/contextutil.py", line 32, in nested
    yield vars
  File "/home/teuthos/src/github.com_kshtsk_ceph_75a68fd8ca3f918fe9466b4c0bb385b7fc260a9b/qa/tasks/cephadm.py", line 2252, in task
    yield
  File "/home/teuthos/teuthology/teuthology/run_tasks.py", line 160, in run_tasks
    suppress = manager.__exit__(*exc_info)
  File "/home/teuthos/.local/share/uv/python/cpython-3.10.19-linux-x86_64-gnu/lib/python3.10/contextlib.py", line 142, in __exit__
    next(self.gen)
  File "/home/teuthos/src/github.com_kshtsk_ceph_75a68fd8ca3f918fe9466b4c0bb385b7fc260a9b/qa/tasks/ceph_fuse.py", line 248, in task
    mount.umount_wait()
  File "/home/teuthos/src/github.com_kshtsk_ceph_75a68fd8ca3f918fe9466b4c0bb385b7fc260a9b/qa/tasks/cephfs/fuse_mount.py", line 403, in umount_wait
    run.wait([self.fuse_daemon], timeout)
  File "/home/teuthos/teuthology/teuthology/orchestra/run.py", line 479, in wait
    check_time()
  File "/home/teuthos/teuthology/teuthology/contextutil.py", line 134, in __call__
    raise MaxWhileTries(error_msg)
teuthology.exceptions.MaxWhileTries: reached maximum tries (50) after waiting for 300 seconds
2026-03-10T08:02:52.032 DEBUG:teuthology.orchestra.run.vm05:> sudo rm -f /etc/ceph/ceph.conf /etc/ceph/ceph.client.admin.keyring
2026-03-10T08:02:52.063 DEBUG:teuthology.orchestra.run.vm08:> sudo rm -f /etc/ceph/ceph.conf /etc/ceph/ceph.client.admin.keyring
2026-03-10T08:02:52.101 INFO:tasks.cephadm:Disabling cephadm mgr module
2026-03-10T08:02:52.101 DEBUG:teuthology.orchestra.run.vm05:> sudo /home/ubuntu/cephtest/cephadm --image quay.io/ceph/ceph:v18.2.1 shell -c /etc/ceph/ceph.conf -k /etc/ceph/ceph.client.admin.keyring --fsid 12e9780e-1c55-11f1-8896-79f7c2e9b508 -- ceph mgr module disable cephadm
2026-03-10T08:02:52.168 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:02:51 vm08.local ceph-mon[107898]: pgmap v356: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail
2026-03-10T08:02:52.252 INFO:teuthology.orchestra.run.vm05.stderr:Inferring config /var/lib/ceph/12e9780e-1c55-11f1-8896-79f7c2e9b508/mon.vm05/config
2026-03-10T08:02:52.275 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:02:51 vm05.local ceph-mon[130117]: pgmap v356: 65 pgs: 65 active+clean; 255 MiB data, 1.0 GiB used, 119 GiB / 120 GiB avail
2026-03-10T08:02:52.395 INFO:teuthology.orchestra.run.vm05.stderr:Error: statfs /etc/ceph/ceph.client.admin.keyring: no such file or directory
2026-03-10T08:02:52.410 DEBUG:teuthology.orchestra.run:got remote process result: 125
2026-03-10T08:02:52.410 INFO:tasks.cephadm:Cleaning up testdir ceph.* files...
2026-03-10T08:02:52.411 DEBUG:teuthology.orchestra.run.vm05:> rm -f /home/ubuntu/cephtest/seed.ceph.conf /home/ubuntu/cephtest/ceph.pub
2026-03-10T08:02:52.426 DEBUG:teuthology.orchestra.run.vm08:> rm -f /home/ubuntu/cephtest/seed.ceph.conf /home/ubuntu/cephtest/ceph.pub
2026-03-10T08:02:52.445 INFO:tasks.cephadm:Stopping all daemons...
2026-03-10T08:02:52.445 INFO:tasks.cephadm.mon.vm05:Stopping mon.vm05...
2026-03-10T08:02:52.445 DEBUG:teuthology.orchestra.run.vm05:> sudo systemctl stop ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508@mon.vm05
2026-03-10T08:02:52.577 INFO:journalctl@ceph.mon.vm05.vm05.stdout:Mar 10 08:02:52 vm05.local systemd[1]: Stopping Ceph mon.vm05 for 12e9780e-1c55-11f1-8896-79f7c2e9b508...
2026-03-10T08:02:52.743 DEBUG:teuthology.orchestra.run.vm05:> sudo pkill -f 'journalctl -f -n 0 -u ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508@mon.vm05.service'
2026-03-10T08:02:52.787 DEBUG:teuthology.orchestra.run:got remote process result: None
2026-03-10T08:02:52.787 INFO:tasks.cephadm.mon.vm05:Stopped mon.vm05
2026-03-10T08:02:52.787 INFO:tasks.cephadm.mon.vm08:Stopping mon.vm08...
2026-03-10T08:02:52.787 DEBUG:teuthology.orchestra.run.vm08:> sudo systemctl stop ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508@mon.vm08
2026-03-10T08:02:53.153 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:02:52 vm08.local systemd[1]: Stopping Ceph mon.vm08 for 12e9780e-1c55-11f1-8896-79f7c2e9b508...
2026-03-10T08:02:53.153 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:02:52 vm08.local ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508-mon-vm08[107894]: 2026-03-10T08:02:52.901+0000 7fd8ec12f640 -1 received signal: Terminated from /run/podman-init -- /usr/bin/ceph-mon -n mon.vm08 -f --setuser ceph --setgroup ceph --default-log-to-file=false --default-log-to-journald=true --default-log-to-stderr=false --default-mon-cluster-log-to-file=false --default-mon-cluster-log-to-journald=true --default-mon-cluster-log-to-stderr=false (PID: 1) UID: 0
2026-03-10T08:02:53.153 INFO:journalctl@ceph.mon.vm08.vm08.stdout:Mar 10 08:02:52 vm08.local ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508-mon-vm08[107894]: 2026-03-10T08:02:52.901+0000 7fd8ec12f640 -1 mon.vm08@1(peon) e3 *** Got Signal Terminated ***
2026-03-10T08:02:53.243 DEBUG:teuthology.orchestra.run.vm08:> sudo pkill -f 'journalctl -f -n 0 -u ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508@mon.vm08.service'
2026-03-10T08:02:53.283 DEBUG:teuthology.orchestra.run:got remote process result: None
2026-03-10T08:02:53.283 INFO:tasks.cephadm.mon.vm08:Stopped mon.vm08
2026-03-10T08:02:53.283 INFO:tasks.cephadm.osd.0:Stopping osd.0...
2026-03-10T08:02:53.283 DEBUG:teuthology.orchestra.run.vm05:> sudo systemctl stop ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508@osd.0
2026-03-10T08:02:53.327 INFO:journalctl@ceph.osd.0.vm05.stdout:Mar 10 08:02:53 vm05.local systemd[1]: Stopping Ceph osd.0 for 12e9780e-1c55-11f1-8896-79f7c2e9b508...
2026-03-10T08:02:53.657 INFO:journalctl@ceph.osd.0.vm05.stdout:Mar 10 08:02:53 vm05.local ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508-osd-0[139106]: 2026-03-10T08:02:53.390+0000 7fb1af517640 -1 received signal: Terminated from /run/podman-init -- /usr/bin/ceph-osd -n osd.0 -f --setuser ceph --setgroup ceph --default-log-to-file=false --default-log-to-journald=true --default-log-to-stderr=false (PID: 1) UID: 0
2026-03-10T08:02:53.657 INFO:journalctl@ceph.osd.0.vm05.stdout:Mar 10 08:02:53 vm05.local ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508-osd-0[139106]: 2026-03-10T08:02:53.390+0000 7fb1af517640 -1 osd.0 84 *** Got signal Terminated ***
2026-03-10T08:02:53.657 INFO:journalctl@ceph.osd.0.vm05.stdout:Mar 10 08:02:53 vm05.local ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508-osd-0[139106]: 2026-03-10T08:02:53.390+0000 7fb1af517640 -1 osd.0 84 *** Immediate shutdown (osd_fast_shutdown=true) ***
2026-03-10T08:02:58.692 INFO:journalctl@ceph.osd.0.vm05.stdout:Mar 10 08:02:58 vm05.local podman[170055]: 2026-03-10 08:02:58.437550896 +0000 UTC m=+5.058933462 container died b35fccc2a4d5cd00d06d04a0b2dbd1e01886793ec7ed9fd823a33bdc2c8ed084 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508-osd-0, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, ceph=True, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20260223, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team , CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, io.buildah.version=1.41.3, FROM_IMAGE=quay.io/centos/centos:stream9)
2026-03-10T08:02:58.692 INFO:journalctl@ceph.osd.0.vm05.stdout:Mar 10 08:02:58 vm05.local podman[170055]: 2026-03-10 08:02:58.467724506 +0000 UTC m=+5.089107072 container remove b35fccc2a4d5cd00d06d04a0b2dbd1e01886793ec7ed9fd823a33bdc2c8ed084 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508-osd-0, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, ceph=True, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team , OSD_FLAVOR=default, CEPH_REF=squid, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, io.buildah.version=1.41.3, org.label-schema.build-date=20260223)
2026-03-10T08:02:58.692 INFO:journalctl@ceph.osd.0.vm05.stdout:Mar 10 08:02:58 vm05.local bash[170055]: ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508-osd-0
2026-03-10T08:02:58.692 INFO:journalctl@ceph.osd.0.vm05.stdout:Mar 10 08:02:58 vm05.local podman[170122]: 2026-03-10 08:02:58.639381543 +0000 UTC m=+0.031520049 container create bcf28c2c3aa483e7233e8d939d4f437fe09330270b5789390a6652118b1c71c2 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508-osd-0-deactivate, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team , CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, CEPH_REF=squid, org.label-schema.license=GPLv2, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.build-date=20260223, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, ceph=True, OSD_FLAVOR=default, io.buildah.version=1.41.3, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git)
2026-03-10T08:02:58.692 INFO:journalctl@ceph.osd.0.vm05.stdout:Mar 10 08:02:58 vm05.local podman[170122]: 2026-03-10 08:02:58.691417515 +0000 UTC m=+0.083556031 container init bcf28c2c3aa483e7233e8d939d4f437fe09330270b5789390a6652118b1c71c2 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508-osd-0-deactivate, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.41.3, ceph=True, CEPH_REF=squid, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.build-date=20260223, org.opencontainers.image.authors=Ceph Release Team , CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default)
2026-03-10T08:02:58.926 DEBUG:teuthology.orchestra.run.vm05:> sudo pkill -f 'journalctl -f -n 0 -u ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508@osd.0.service'
2026-03-10T08:02:58.963 INFO:journalctl@ceph.osd.0.vm05.stdout:Mar 10 08:02:58 vm05.local podman[170122]: 2026-03-10 08:02:58.69715843 +0000 UTC m=+0.089296936 container start bcf28c2c3aa483e7233e8d939d4f437fe09330270b5789390a6652118b1c71c2 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508-osd-0-deactivate, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260223, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.license=GPLv2, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team , CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid)
2026-03-10T08:02:58.963 INFO:journalctl@ceph.osd.0.vm05.stdout:Mar 10 08:02:58 vm05.local podman[170122]: 2026-03-10 08:02:58.716457445 +0000 UTC m=+0.108595951 container attach bcf28c2c3aa483e7233e8d939d4f437fe09330270b5789390a6652118b1c71c2 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508-osd-0-deactivate, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.build-date=20260223, OSD_FLAVOR=default, CEPH_REF=squid)
2026-03-10T08:02:58.963 INFO:journalctl@ceph.osd.0.vm05.stdout:Mar 10 08:02:58 vm05.local podman[170122]: 2026-03-10 08:02:58.619931437 +0000 UTC m=+0.012069953 image pull 654f31e6858eb235bbece362255b685a945f2b6a367e2b88c4930c984fbb214c quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc
2026-03-10T08:02:58.963 INFO:journalctl@ceph.osd.0.vm05.stdout:Mar 10 08:02:58 vm05.local conmon[170132]: conmon bcf28c2c3aa483e7233e : Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-bcf28c2c3aa483e7233e8d939d4f437fe09330270b5789390a6652118b1c71c2.scope/container/memory.events
2026-03-10T08:02:58.963 INFO:journalctl@ceph.osd.0.vm05.stdout:Mar 10 08:02:58 vm05.local podman[170122]: 2026-03-10 08:02:58.866773767 +0000 UTC m=+0.258912273 container died bcf28c2c3aa483e7233e8d939d4f437fe09330270b5789390a6652118b1c71c2 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508-osd-0-deactivate, CEPH_REF=squid, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, org.opencontainers.image.authors=Ceph Release Team , io.buildah.version=1.41.3, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.build-date=20260223, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, OSD_FLAVOR=default)
2026-03-10T08:02:58.963 INFO:journalctl@ceph.osd.0.vm05.stdout:Mar 10 08:02:58 vm05.local podman[170122]: 2026-03-10 08:02:58.904361703 +0000 UTC m=+0.296500210 container remove bcf28c2c3aa483e7233e8d939d4f437fe09330270b5789390a6652118b1c71c2 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508-osd-0-deactivate, org.label-schema.license=GPLv2, org.label-schema.build-date=20260223, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, io.buildah.version=1.41.3, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/)
2026-03-10T08:02:58.963 INFO:journalctl@ceph.osd.0.vm05.stdout:Mar 10 08:02:58 vm05.local systemd[1]: ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508@osd.0.service: Deactivated successfully.
2026-03-10T08:02:58.963 INFO:journalctl@ceph.osd.0.vm05.stdout:Mar 10 08:02:58 vm05.local systemd[1]: ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508@osd.0.service: Unit process 170132 (conmon) remains running after unit stopped.
2026-03-10T08:02:58.963 INFO:journalctl@ceph.osd.0.vm05.stdout:Mar 10 08:02:58 vm05.local systemd[1]: Stopped Ceph osd.0 for 12e9780e-1c55-11f1-8896-79f7c2e9b508.
2026-03-10T08:02:58.963 INFO:journalctl@ceph.osd.0.vm05.stdout:Mar 10 08:02:58 vm05.local systemd[1]: ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508@osd.0.service: Consumed 12.442s CPU time, 561.2M memory peak.
2026-03-10T08:02:58.977 DEBUG:teuthology.orchestra.run:got remote process result: None
2026-03-10T08:02:58.977 INFO:tasks.cephadm.osd.0:Stopped osd.0
2026-03-10T08:02:58.977 INFO:tasks.cephadm.osd.1:Stopping osd.1...
2026-03-10T08:02:58.977 DEBUG:teuthology.orchestra.run.vm05:> sudo systemctl stop ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508@osd.1
2026-03-10T08:02:59.313 INFO:journalctl@ceph.osd.1.vm05.stdout:Mar 10 08:02:59 vm05.local systemd[1]: Stopping Ceph osd.1 for 12e9780e-1c55-11f1-8896-79f7c2e9b508...
2026-03-10T08:02:59.314 INFO:journalctl@ceph.osd.1.vm05.stdout:Mar 10 08:02:59 vm05.local ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508-osd-1[144194]: 2026-03-10T08:02:59.160+0000 7f266f834640 -1 received signal: Terminated from /run/podman-init -- /usr/bin/ceph-osd -n osd.1 -f --setuser ceph --setgroup ceph --default-log-to-file=false --default-log-to-journald=true --default-log-to-stderr=false (PID: 1) UID: 0
2026-03-10T08:02:59.314 INFO:journalctl@ceph.osd.1.vm05.stdout:Mar 10 08:02:59 vm05.local ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508-osd-1[144194]: 2026-03-10T08:02:59.160+0000 7f266f834640 -1 osd.1 84 *** Got signal Terminated ***
2026-03-10T08:02:59.314 INFO:journalctl@ceph.osd.1.vm05.stdout:Mar 10 08:02:59 vm05.local ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508-osd-1[144194]: 2026-03-10T08:02:59.160+0000 7f266f834640 -1 osd.1 84 *** Immediate shutdown (osd_fast_shutdown=true) ***
2026-03-10T08:03:04.458 INFO:journalctl@ceph.osd.1.vm05.stdout:Mar 10 08:03:04 vm05.local podman[170227]: 2026-03-10 08:03:04.204580574 +0000 UTC m=+5.065391421 container died e3fe4ad5c6a3b0fbeba0080c9d5bed5b344f914552da3fbec9b9b3cc58f959c1 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508-osd-1, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.schema-version=1.0, org.label-schema.build-date=20260223, ceph=True, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/)
2026-03-10T08:03:04.458 INFO:journalctl@ceph.osd.1.vm05.stdout:Mar 10 08:03:04 vm05.local podman[170227]: 2026-03-10 08:03:04.237803322 +0000 UTC m=+5.098614169 container remove e3fe4ad5c6a3b0fbeba0080c9d5bed5b344f914552da3fbec9b9b3cc58f959c1 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508-osd-1, OSD_FLAVOR=default, CEPH_REF=squid, io.buildah.version=1.41.3, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20260223, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True)
2026-03-10T08:03:04.458 INFO:journalctl@ceph.osd.1.vm05.stdout:Mar 10 08:03:04 vm05.local bash[170227]: ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508-osd-1
2026-03-10T08:03:04.458 INFO:journalctl@ceph.osd.1.vm05.stdout:Mar 10 08:03:04 vm05.local podman[170294]: 2026-03-10 08:03:04.415764198 +0000 UTC m=+0.019957037 container create ddba337efb7deb47c0df67ed1fe083caebecd6352eb53a71526f1da622a2220d (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508-osd-1-deactivate, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team , ceph=True, org.label-schema.build-date=20260223)
2026-03-10T08:03:04.686 DEBUG:teuthology.orchestra.run.vm05:> sudo pkill -f 'journalctl -f -n 0 -u ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508@osd.1.service'
2026-03-10T08:03:04.714 INFO:journalctl@ceph.osd.1.vm05.stdout:Mar 10 08:03:04 vm05.local podman[170294]: 2026-03-10 08:03:04.465617484 +0000 UTC m=+0.069810333 container init ddba337efb7deb47c0df67ed1fe083caebecd6352eb53a71526f1da622a2220d (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508-osd-1-deactivate, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team , ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, org.label-schema.vendor=CentOS, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.build-date=20260223, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git)
2026-03-10T08:03:04.714 INFO:journalctl@ceph.osd.1.vm05.stdout:Mar 10 08:03:04 vm05.local podman[170294]: 2026-03-10 08:03:04.469701337 +0000 UTC m=+0.073894167 container start ddba337efb7deb47c0df67ed1fe083caebecd6352eb53a71526f1da622a2220d (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508-osd-1-deactivate, org.label-schema.license=GPLv2, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20260223, OSD_FLAVOR=default, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.vendor=CentOS, ceph=True, CEPH_REF=squid, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team , org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9)
2026-03-10T08:03:04.714 INFO:journalctl@ceph.osd.1.vm05.stdout:Mar 10 08:03:04 vm05.local podman[170294]: 2026-03-10 08:03:04.470956687 +0000 UTC m=+0.075149526 container attach ddba337efb7deb47c0df67ed1fe083caebecd6352eb53a71526f1da622a2220d (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508-osd-1-deactivate, org.label-schema.build-date=20260223, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.opencontainers.image.authors=Ceph Release Team , CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.vendor=CentOS, OSD_FLAVOR=default, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid)
2026-03-10T08:03:04.714 INFO:journalctl@ceph.osd.1.vm05.stdout:Mar 10 08:03:04 vm05.local podman[170294]: 2026-03-10 08:03:04.407708379 +0000 UTC m=+0.011901229 image pull 654f31e6858eb235bbece362255b685a945f2b6a367e2b88c4930c984fbb214c quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc
2026-03-10T08:03:04.714 INFO:journalctl@ceph.osd.1.vm05.stdout:Mar 10 08:03:04 vm05.local conmon[170305]: conmon ddba337efb7deb47c0df : Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-ddba337efb7deb47c0df67ed1fe083caebecd6352eb53a71526f1da622a2220d.scope/container/memory.events
2026-03-10T08:03:04.714 INFO:journalctl@ceph.osd.1.vm05.stdout:Mar 10 08:03:04 vm05.local podman[170294]: 2026-03-10 08:03:04.646943359 +0000 UTC m=+0.251136198 container died ddba337efb7deb47c0df67ed1fe083caebecd6352eb53a71526f1da622a2220d (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508-osd-1-deactivate, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team , GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20260223, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, ceph=True, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git)
2026-03-10T08:03:04.714 INFO:journalctl@ceph.osd.1.vm05.stdout:Mar 10 08:03:04 vm05.local podman[170294]: 2026-03-10 08:03:04.670149827 +0000 UTC m=+0.274342666 container remove ddba337efb7deb47c0df67ed1fe083caebecd6352eb53a71526f1da622a2220d (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508-osd-1-deactivate, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, OSD_FLAVOR=default, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.authors=Ceph Release Team , ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260223, org.label-schema.license=GPLv2)
2026-03-10T08:03:04.714 INFO:journalctl@ceph.osd.1.vm05.stdout:Mar 10 08:03:04 vm05.local systemd[1]: ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508@osd.1.service: Deactivated successfully.
2026-03-10T08:03:04.714 INFO:journalctl@ceph.osd.1.vm05.stdout:Mar 10 08:03:04 vm05.local systemd[1]: ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508@osd.1.service: Unit process 170305 (conmon) remains running after unit stopped.
2026-03-10T08:03:04.714 INFO:journalctl@ceph.osd.1.vm05.stdout:Mar 10 08:03:04 vm05.local systemd[1]: ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508@osd.1.service: Unit process 170313 (podman) remains running after unit stopped.
2026-03-10T08:03:04.714 INFO:journalctl@ceph.osd.1.vm05.stdout:Mar 10 08:03:04 vm05.local systemd[1]: Stopped Ceph osd.1 for 12e9780e-1c55-11f1-8896-79f7c2e9b508.
2026-03-10T08:03:04.714 INFO:journalctl@ceph.osd.1.vm05.stdout:Mar 10 08:03:04 vm05.local systemd[1]: ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508@osd.1.service: Consumed 5.891s CPU time, 594.5M memory peak.
2026-03-10T08:03:04.727 DEBUG:teuthology.orchestra.run:got remote process result: None
2026-03-10T08:03:04.727 INFO:tasks.cephadm.osd.1:Stopped osd.1
2026-03-10T08:03:04.727 INFO:tasks.cephadm.osd.2:Stopping osd.2...
2026-03-10T08:03:04.727 DEBUG:teuthology.orchestra.run.vm05:> sudo systemctl stop ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508@osd.2
2026-03-10T08:03:05.157 INFO:journalctl@ceph.osd.2.vm05.stdout:Mar 10 08:03:04 vm05.local systemd[1]: Stopping Ceph osd.2 for 12e9780e-1c55-11f1-8896-79f7c2e9b508...
2026-03-10T08:03:05.157 INFO:journalctl@ceph.osd.2.vm05.stdout:Mar 10 08:03:04 vm05.local ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508-osd-2[148987]: 2026-03-10T08:03:04.880+0000 7f97608bd640 -1 received signal: Terminated from /run/podman-init -- /usr/bin/ceph-osd -n osd.2 -f --setuser ceph --setgroup ceph --default-log-to-file=false --default-log-to-journald=true --default-log-to-stderr=false (PID: 1) UID: 0 2026-03-10T08:03:05.157 INFO:journalctl@ceph.osd.2.vm05.stdout:Mar 10 08:03:04 vm05.local ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508-osd-2[148987]: 2026-03-10T08:03:04.880+0000 7f97608bd640 -1 osd.2 84 *** Got signal Terminated *** 2026-03-10T08:03:05.157 INFO:journalctl@ceph.osd.2.vm05.stdout:Mar 10 08:03:04 vm05.local ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508-osd-2[148987]: 2026-03-10T08:03:04.880+0000 7f97608bd640 -1 osd.2 84 *** Immediate shutdown (osd_fast_shutdown=true) *** 2026-03-10T08:03:10.207 INFO:journalctl@ceph.osd.2.vm05.stdout:Mar 10 08:03:09 vm05.local podman[170389]: 2026-03-10 08:03:09.930459025 +0000 UTC m=+5.061928861 container died 108a77e324b80ab459e853640e5097ae7e1f46f7cb10342fee8da863d82fc639 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508-osd-2, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, org.label-schema.build-date=20260223, ceph=True, io.buildah.version=1.41.3, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team , CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, FROM_IMAGE=quay.io/centos/centos:stream9) 2026-03-10T08:03:10.207 INFO:journalctl@ceph.osd.2.vm05.stdout:Mar 10 
08:03:09 vm05.local podman[170389]: 2026-03-10 08:03:09.956179248 +0000 UTC m=+5.087649084 container remove 108a77e324b80ab459e853640e5097ae7e1f46f7cb10342fee8da863d82fc639 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508-osd-2, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260223, org.label-schema.vendor=CentOS, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, CEPH_REF=squid, org.opencontainers.image.authors=Ceph Release Team , io.buildah.version=1.41.3, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/) 2026-03-10T08:03:10.207 INFO:journalctl@ceph.osd.2.vm05.stdout:Mar 10 08:03:09 vm05.local bash[170389]: ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508-osd-2 2026-03-10T08:03:10.207 INFO:journalctl@ceph.osd.2.vm05.stdout:Mar 10 08:03:10 vm05.local podman[170455]: 2026-03-10 08:03:10.115982106 +0000 UTC m=+0.021006149 container create ac6ff5a61e0a04f9598d6206ceaf60a3c69d11f8b8afaaed2c88ef8c513e9805 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508-osd-2-deactivate, org.opencontainers.image.authors=Ceph Release Team , ceph=True, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, io.buildah.version=1.41.3, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260223, CEPH_REF=squid, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, 
org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.vendor=CentOS) 2026-03-10T08:03:10.207 INFO:journalctl@ceph.osd.2.vm05.stdout:Mar 10 08:03:10 vm05.local podman[170455]: 2026-03-10 08:03:10.162427864 +0000 UTC m=+0.067451906 container init ac6ff5a61e0a04f9598d6206ceaf60a3c69d11f8b8afaaed2c88ef8c513e9805 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508-osd-2-deactivate, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, ceph=True, org.label-schema.license=GPLv2, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_REF=squid, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team , io.buildah.version=1.41.3, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20260223, OSD_FLAVOR=default) 2026-03-10T08:03:10.207 INFO:journalctl@ceph.osd.2.vm05.stdout:Mar 10 08:03:10 vm05.local podman[170455]: 2026-03-10 08:03:10.165487301 +0000 UTC m=+0.070511343 container start ac6ff5a61e0a04f9598d6206ceaf60a3c69d11f8b8afaaed2c88ef8c513e9805 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508-osd-2-deactivate, io.buildah.version=1.41.3, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.build-date=20260223, 
org.label-schema.schema-version=1.0, OSD_FLAVOR=default, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_REF=squid, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team ) 2026-03-10T08:03:10.207 INFO:journalctl@ceph.osd.2.vm05.stdout:Mar 10 08:03:10 vm05.local podman[170455]: 2026-03-10 08:03:10.166504605 +0000 UTC m=+0.071528647 container attach ac6ff5a61e0a04f9598d6206ceaf60a3c69d11f8b8afaaed2c88ef8c513e9805 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508-osd-2-deactivate, org.label-schema.vendor=CentOS, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.41.3, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, CEPH_REF=squid, OSD_FLAVOR=default, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.build-date=20260223) 2026-03-10T08:03:10.207 INFO:journalctl@ceph.osd.2.vm05.stdout:Mar 10 08:03:10 vm05.local podman[170455]: 2026-03-10 08:03:10.107587033 +0000 UTC m=+0.012611085 image pull 654f31e6858eb235bbece362255b685a945f2b6a367e2b88c4930c984fbb214c quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc 2026-03-10T08:03:10.372 DEBUG:teuthology.orchestra.run.vm05:> sudo pkill -f 'journalctl -f -n 0 -u ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508@osd.2.service' 2026-03-10T08:03:10.413 DEBUG:teuthology.orchestra.run:got remote process result: None 2026-03-10T08:03:10.414 
INFO:tasks.cephadm.osd.2:Stopped osd.2 2026-03-10T08:03:10.414 INFO:tasks.cephadm.osd.3:Stopping osd.3... 2026-03-10T08:03:10.414 DEBUG:teuthology.orchestra.run.vm08:> sudo systemctl stop ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508@osd.3 2026-03-10T08:03:10.918 INFO:journalctl@ceph.osd.3.vm08.stdout:Mar 10 08:03:10 vm08.local systemd[1]: Stopping Ceph osd.3 for 12e9780e-1c55-11f1-8896-79f7c2e9b508... 2026-03-10T08:03:10.918 INFO:journalctl@ceph.osd.3.vm08.stdout:Mar 10 08:03:10 vm08.local ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508-osd-3[115614]: 2026-03-10T08:03:10.589+0000 7f765789a640 -1 received signal: Terminated from /run/podman-init -- /usr/bin/ceph-osd -n osd.3 -f --setuser ceph --setgroup ceph --default-log-to-file=false --default-log-to-journald=true --default-log-to-stderr=false (PID: 1) UID: 0 2026-03-10T08:03:10.918 INFO:journalctl@ceph.osd.3.vm08.stdout:Mar 10 08:03:10 vm08.local ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508-osd-3[115614]: 2026-03-10T08:03:10.589+0000 7f765789a640 -1 osd.3 84 *** Got signal Terminated *** 2026-03-10T08:03:10.918 INFO:journalctl@ceph.osd.3.vm08.stdout:Mar 10 08:03:10 vm08.local ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508-osd-3[115614]: 2026-03-10T08:03:10.589+0000 7f765789a640 -1 osd.3 84 *** Immediate shutdown (osd_fast_shutdown=true) *** 2026-03-10T08:03:15.901 INFO:journalctl@ceph.osd.3.vm08.stdout:Mar 10 08:03:15 vm08.local podman[138801]: 2026-03-10 08:03:15.623105909 +0000 UTC m=+5.092584039 container died 3f280bcfe0f5b5098bed159e3c001472fdb361f091785107b6801e1875f117e0 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508-osd-3, org.label-schema.build-date=20260223, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, ceph=True, 
org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, CEPH_REF=squid, io.buildah.version=1.41.3, org.opencontainers.image.authors=Ceph Release Team ) 2026-03-10T08:03:15.901 INFO:journalctl@ceph.osd.3.vm08.stdout:Mar 10 08:03:15 vm08.local podman[138801]: 2026-03-10 08:03:15.650776521 +0000 UTC m=+5.120254651 container remove 3f280bcfe0f5b5098bed159e3c001472fdb361f091785107b6801e1875f117e0 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508-osd-3, OSD_FLAVOR=default, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.41.3, org.label-schema.build-date=20260223, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.opencontainers.image.authors=Ceph Release Team , CEPH_REF=squid, ceph=True, FROM_IMAGE=quay.io/centos/centos:stream9) 2026-03-10T08:03:15.901 INFO:journalctl@ceph.osd.3.vm08.stdout:Mar 10 08:03:15 vm08.local bash[138801]: ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508-osd-3 2026-03-10T08:03:15.901 INFO:journalctl@ceph.osd.3.vm08.stdout:Mar 10 08:03:15 vm08.local podman[138866]: 2026-03-10 08:03:15.80887603 +0000 UTC m=+0.020261986 container create 151ccb8e2b259dc0d36a257825105f2edf3ba4ebc9dff45b8e086b2976d37182 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508-osd-3-deactivate, 
CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.name=CentOS Stream 9 Base Image, OSD_FLAVOR=default, ceph=True, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, org.label-schema.build-date=20260223, FROM_IMAGE=quay.io/centos/centos:stream9) 2026-03-10T08:03:15.901 INFO:journalctl@ceph.osd.3.vm08.stdout:Mar 10 08:03:15 vm08.local podman[138866]: 2026-03-10 08:03:15.852143575 +0000 UTC m=+0.063529542 container init 151ccb8e2b259dc0d36a257825105f2edf3ba4ebc9dff45b8e086b2976d37182 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508-osd-3-deactivate, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.license=GPLv2, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260223, org.opencontainers.image.authors=Ceph Release Team , GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid) 2026-03-10T08:03:15.901 INFO:journalctl@ceph.osd.3.vm08.stdout:Mar 10 08:03:15 vm08.local podman[138866]: 2026-03-10 08:03:15.855394 +0000 UTC m=+0.066779956 container start 151ccb8e2b259dc0d36a257825105f2edf3ba4ebc9dff45b8e086b2976d37182 
(image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508-osd-3-deactivate, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team , GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, CEPH_REF=squid, ceph=True, org.label-schema.build-date=20260223, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.schema-version=1.0, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9) 2026-03-10T08:03:15.901 INFO:journalctl@ceph.osd.3.vm08.stdout:Mar 10 08:03:15 vm08.local podman[138866]: 2026-03-10 08:03:15.858292556 +0000 UTC m=+0.069678521 container attach 151ccb8e2b259dc0d36a257825105f2edf3ba4ebc9dff45b8e086b2976d37182 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508-osd-3-deactivate, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260223, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, io.buildah.version=1.41.3, org.opencontainers.image.authors=Ceph Release Team , CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, CEPH_REF=squid, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, OSD_FLAVOR=default, org.label-schema.license=GPLv2) 2026-03-10T08:03:15.901 INFO:journalctl@ceph.osd.3.vm08.stdout:Mar 10 08:03:15 vm08.local podman[138866]: 2026-03-10 08:03:15.800648982 +0000 UTC 
m=+0.012034938 image pull 654f31e6858eb235bbece362255b685a945f2b6a367e2b88c4930c984fbb214c quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc 2026-03-10T08:03:16.036 DEBUG:teuthology.orchestra.run.vm08:> sudo pkill -f 'journalctl -f -n 0 -u ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508@osd.3.service' 2026-03-10T08:03:16.073 DEBUG:teuthology.orchestra.run:got remote process result: None 2026-03-10T08:03:16.073 INFO:tasks.cephadm.osd.3:Stopped osd.3 2026-03-10T08:03:16.074 INFO:tasks.cephadm.osd.4:Stopping osd.4... 2026-03-10T08:03:16.074 DEBUG:teuthology.orchestra.run.vm08:> sudo systemctl stop ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508@osd.4 2026-03-10T08:03:16.156 INFO:journalctl@ceph.osd.4.vm08.stdout:Mar 10 08:03:16 vm08.local systemd[1]: Stopping Ceph osd.4 for 12e9780e-1c55-11f1-8896-79f7c2e9b508... 2026-03-10T08:03:16.418 INFO:journalctl@ceph.osd.4.vm08.stdout:Mar 10 08:03:16 vm08.local ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508-osd-4[120174]: 2026-03-10T08:03:16.230+0000 7f2cdd67b640 -1 received signal: Terminated from /run/podman-init -- /usr/bin/ceph-osd -n osd.4 -f --setuser ceph --setgroup ceph --default-log-to-file=false --default-log-to-journald=true --default-log-to-stderr=false (PID: 1) UID: 0 2026-03-10T08:03:16.418 INFO:journalctl@ceph.osd.4.vm08.stdout:Mar 10 08:03:16 vm08.local ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508-osd-4[120174]: 2026-03-10T08:03:16.230+0000 7f2cdd67b640 -1 osd.4 84 *** Got signal Terminated *** 2026-03-10T08:03:16.418 INFO:journalctl@ceph.osd.4.vm08.stdout:Mar 10 08:03:16 vm08.local ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508-osd-4[120174]: 2026-03-10T08:03:16.230+0000 7f2cdd67b640 -1 osd.4 84 *** Immediate shutdown (osd_fast_shutdown=true) *** 2026-03-10T08:03:21.279 INFO:journalctl@ceph.osd.4.vm08.stdout:Mar 10 08:03:20 vm08.local ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508-osd-4[120174]: 2026-03-10T08:03:20.958+0000 7f2cd9482640 -1 osd.4 84 heartbeat_check: no reply from 
192.168.123.105:6806 osd.0 since back 2026-03-10T08:02:56.780979+0000 front 2026-03-10T08:02:56.780937+0000 (oldest deadline 2026-03-10T08:03:20.280707+0000) 2026-03-10T08:03:21.541 INFO:journalctl@ceph.osd.4.vm08.stdout:Mar 10 08:03:21 vm08.local podman[138960]: 2026-03-10 08:03:21.279581439 +0000 UTC m=+5.064731075 container died 132c8d288b1e2635dc27692f5dbd57580b708e251146e04ffae18c1fa83d9fb1 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508-osd-4, org.label-schema.vendor=CentOS, CEPH_REF=squid, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.build-date=20260223, OSD_FLAVOR=default) 2026-03-10T08:03:21.541 INFO:journalctl@ceph.osd.4.vm08.stdout:Mar 10 08:03:21 vm08.local podman[138960]: 2026-03-10 08:03:21.313622224 +0000 UTC m=+5.098771849 container remove 132c8d288b1e2635dc27692f5dbd57580b708e251146e04ffae18c1fa83d9fb1 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508-osd-4, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.license=GPLv2, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, ceph=True, io.buildah.version=1.41.3, OSD_FLAVOR=default, 
FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, CEPH_REF=squid, org.label-schema.vendor=CentOS, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.build-date=20260223, org.label-schema.name=CentOS Stream 9 Base Image) 2026-03-10T08:03:21.541 INFO:journalctl@ceph.osd.4.vm08.stdout:Mar 10 08:03:21 vm08.local bash[138960]: ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508-osd-4 2026-03-10T08:03:21.541 INFO:journalctl@ceph.osd.4.vm08.stdout:Mar 10 08:03:21 vm08.local podman[139036]: 2026-03-10 08:03:21.486816998 +0000 UTC m=+0.021448477 container create f29349ab57b964262f2eafae6bb029e3558c06875cc3ff41d0bbd30da9abefd5 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508-osd-4-deactivate, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, ceph=True, org.opencontainers.image.authors=Ceph Release Team , org.opencontainers.image.documentation=https://docs.ceph.com/, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, CEPH_REF=squid, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260223, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/) 2026-03-10T08:03:21.730 DEBUG:teuthology.orchestra.run.vm08:> sudo pkill -f 'journalctl -f -n 0 -u ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508@osd.4.service' 2026-03-10T08:03:21.773 DEBUG:teuthology.orchestra.run:got remote process result: None 2026-03-10T08:03:21.773 INFO:tasks.cephadm.osd.4:Stopped osd.4 2026-03-10T08:03:21.773 INFO:tasks.cephadm.osd.5:Stopping osd.5... 
2026-03-10T08:03:21.773 DEBUG:teuthology.orchestra.run.vm08:> sudo systemctl stop ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508@osd.5 2026-03-10T08:03:22.168 INFO:journalctl@ceph.osd.5.vm08.stdout:Mar 10 08:03:21 vm08.local systemd[1]: Stopping Ceph osd.5 for 12e9780e-1c55-11f1-8896-79f7c2e9b508... 2026-03-10T08:03:22.168 INFO:journalctl@ceph.osd.5.vm08.stdout:Mar 10 08:03:21 vm08.local ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508-osd-5[124483]: 2026-03-10T08:03:21.935+0000 7fc0b406b640 -1 received signal: Terminated from /run/podman-init -- /usr/bin/ceph-osd -n osd.5 -f --setuser ceph --setgroup ceph --default-log-to-file=false --default-log-to-journald=true --default-log-to-stderr=false (PID: 1) UID: 0 2026-03-10T08:03:22.168 INFO:journalctl@ceph.osd.5.vm08.stdout:Mar 10 08:03:21 vm08.local ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508-osd-5[124483]: 2026-03-10T08:03:21.936+0000 7fc0b406b640 -1 osd.5 84 *** Got signal Terminated *** 2026-03-10T08:03:22.168 INFO:journalctl@ceph.osd.5.vm08.stdout:Mar 10 08:03:21 vm08.local ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508-osd-5[124483]: 2026-03-10T08:03:21.936+0000 7fc0b406b640 -1 osd.5 84 *** Immediate shutdown (osd_fast_shutdown=true) *** 2026-03-10T08:03:23.168 INFO:journalctl@ceph.osd.5.vm08.stdout:Mar 10 08:03:22 vm08.local ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508-osd-5[124483]: 2026-03-10T08:03:22.763+0000 7fc0b0673640 -1 osd.5 84 heartbeat_check: no reply from 192.168.123.105:6806 osd.0 since back 2026-03-10T08:02:56.728617+0000 front 2026-03-10T08:02:56.728611+0000 (oldest deadline 2026-03-10T08:03:22.028228+0000) 2026-03-10T08:03:24.116 INFO:journalctl@ceph.osd.5.vm08.stdout:Mar 10 08:03:23 vm08.local ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508-osd-5[124483]: 2026-03-10T08:03:23.802+0000 7fc0b0673640 -1 osd.5 84 heartbeat_check: no reply from 192.168.123.105:6806 osd.0 since back 2026-03-10T08:02:56.728617+0000 front 2026-03-10T08:02:56.728611+0000 (oldest deadline 2026-03-10T08:03:22.028228+0000) 2026-03-10T08:03:25.168 
INFO:journalctl@ceph.osd.5.vm08.stdout:Mar 10 08:03:24 vm08.local ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508-osd-5[124483]: 2026-03-10T08:03:24.780+0000 7fc0b0673640 -1 osd.5 84 heartbeat_check: no reply from 192.168.123.105:6806 osd.0 since back 2026-03-10T08:02:56.728617+0000 front 2026-03-10T08:02:56.728611+0000 (oldest deadline 2026-03-10T08:03:22.028228+0000) 2026-03-10T08:03:26.168 INFO:journalctl@ceph.osd.5.vm08.stdout:Mar 10 08:03:25 vm08.local ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508-osd-5[124483]: 2026-03-10T08:03:25.803+0000 7fc0b0673640 -1 osd.5 84 heartbeat_check: no reply from 192.168.123.105:6806 osd.0 since back 2026-03-10T08:02:56.728617+0000 front 2026-03-10T08:02:56.728611+0000 (oldest deadline 2026-03-10T08:03:22.028228+0000) 2026-03-10T08:03:27.168 INFO:journalctl@ceph.osd.5.vm08.stdout:Mar 10 08:03:26 vm08.local ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508-osd-5[124483]: 2026-03-10T08:03:26.785+0000 7fc0b0673640 -1 osd.5 84 heartbeat_check: no reply from 192.168.123.105:6806 osd.0 since back 2026-03-10T08:02:56.728617+0000 front 2026-03-10T08:02:56.728611+0000 (oldest deadline 2026-03-10T08:03:22.028228+0000) 2026-03-10T08:03:27.169 INFO:journalctl@ceph.osd.5.vm08.stdout:Mar 10 08:03:26 vm08.local ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508-osd-5[124483]: 2026-03-10T08:03:26.785+0000 7fc0b0673640 -1 osd.5 84 heartbeat_check: no reply from 192.168.123.105:6814 osd.1 since back 2026-03-10T08:03:02.028755+0000 front 2026-03-10T08:03:02.028797+0000 (oldest deadline 2026-03-10T08:03:26.728519+0000) 2026-03-10T08:03:27.169 INFO:journalctl@ceph.osd.5.vm08.stdout:Mar 10 08:03:26 vm08.local podman[139132]: 2026-03-10 08:03:26.971547427 +0000 UTC m=+5.049013344 container died a4a8929822a27be3014918e92bd2cd6cbdcb35208c0788a66211d421d9c4c05e (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508-osd-5, org.label-schema.build-date=20260223, 
FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, OSD_FLAVOR=default, org.label-schema.vendor=CentOS, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.license=GPLv2, org.opencontainers.image.authors=Ceph Release Team , CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, org.label-schema.schema-version=1.0) 2026-03-10T08:03:27.169 INFO:journalctl@ceph.osd.5.vm08.stdout:Mar 10 08:03:26 vm08.local podman[139132]: 2026-03-10 08:03:26.996135208 +0000 UTC m=+5.073601125 container remove a4a8929822a27be3014918e92bd2cd6cbdcb35208c0788a66211d421d9c4c05e (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508-osd-5, org.label-schema.name=CentOS Stream 9 Base Image, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, OSD_FLAVOR=default, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, CEPH_REF=squid, org.label-schema.build-date=20260223, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True, org.opencontainers.image.authors=Ceph Release Team , FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS) 2026-03-10T08:03:27.169 INFO:journalctl@ceph.osd.5.vm08.stdout:Mar 10 08:03:26 vm08.local bash[139132]: ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508-osd-5 2026-03-10T08:03:27.418 DEBUG:teuthology.orchestra.run.vm08:> sudo pkill -f 'journalctl -f -n 0 -u ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508@osd.5.service' 2026-03-10T08:03:27.445 
INFO:journalctl@ceph.osd.5.vm08.stdout:Mar 10 08:03:27 vm08.local podman[139201]: 2026-03-10 08:03:27.193916952 +0000 UTC m=+0.043318782 container create 1d09c1b3b608e34afa8589fba1cb48ed53ec22a09d9ab90ecb37c3b406627c49 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508-osd-5-deactivate, org.label-schema.schema-version=1.0, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team , CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.build-date=20260223, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_REF=squid, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True) 2026-03-10T08:03:27.445 INFO:journalctl@ceph.osd.5.vm08.stdout:Mar 10 08:03:27 vm08.local podman[139201]: 2026-03-10 08:03:27.244106744 +0000 UTC m=+0.093508584 container init 1d09c1b3b608e34afa8589fba1cb48ed53ec22a09d9ab90ecb37c3b406627c49 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508-osd-5-deactivate, org.label-schema.name=CentOS Stream 9 Base Image, FROM_IMAGE=quay.io/centos/centos:stream9, ceph=True, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.opencontainers.image.authors=Ceph Release Team , io.buildah.version=1.41.3, OSD_FLAVOR=default, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, 
org.label-schema.license=GPLv2, org.label-schema.build-date=20260223) 2026-03-10T08:03:27.445 INFO:journalctl@ceph.osd.5.vm08.stdout:Mar 10 08:03:27 vm08.local podman[139201]: 2026-03-10 08:03:27.24751873 +0000 UTC m=+0.096920551 container start 1d09c1b3b608e34afa8589fba1cb48ed53ec22a09d9ab90ecb37c3b406627c49 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508-osd-5-deactivate, io.buildah.version=1.41.3, ceph=True, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, CEPH_REF=squid, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, OSD_FLAVOR=default, org.opencontainers.image.authors=Ceph Release Team , org.opencontainers.image.documentation=https://docs.ceph.com/, FROM_IMAGE=quay.io/centos/centos:stream9, org.label-schema.license=GPLv2, org.label-schema.build-date=20260223, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, org.label-schema.name=CentOS Stream 9 Base Image) 2026-03-10T08:03:27.445 INFO:journalctl@ceph.osd.5.vm08.stdout:Mar 10 08:03:27 vm08.local podman[139201]: 2026-03-10 08:03:27.248771415 +0000 UTC m=+0.098173245 container attach 1d09c1b3b608e34afa8589fba1cb48ed53ec22a09d9ab90ecb37c3b406627c49 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508-osd-5-deactivate, CEPH_REF=squid, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.license=GPLv2, ceph=True, OSD_FLAVOR=default, org.label-schema.build-date=20260223, org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.name=CentOS Stream 9 Base Image, 
FROM_IMAGE=quay.io/centos/centos:stream9, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, io.buildah.version=1.41.3) 2026-03-10T08:03:27.445 INFO:journalctl@ceph.osd.5.vm08.stdout:Mar 10 08:03:27 vm08.local podman[139201]: 2026-03-10 08:03:27.164636507 +0000 UTC m=+0.014038347 image pull 654f31e6858eb235bbece362255b685a945f2b6a367e2b88c4930c984fbb214c quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc 2026-03-10T08:03:27.445 INFO:journalctl@ceph.osd.5.vm08.stdout:Mar 10 08:03:27 vm08.local podman[139201]: 2026-03-10 08:03:27.386181189 +0000 UTC m=+0.235583019 container died 1d09c1b3b608e34afa8589fba1cb48ed53ec22a09d9ab90ecb37c3b406627c49 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508-osd-5-deactivate, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, ceph=True, org.opencontainers.image.documentation=https://docs.ceph.com/, CEPH_REF=squid, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, OSD_FLAVOR=default, FROM_IMAGE=quay.io/centos/centos:stream9, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.build-date=20260223, org.label-schema.vendor=CentOS) 2026-03-10T08:03:27.445 INFO:journalctl@ceph.osd.5.vm08.stdout:Mar 10 08:03:27 vm08.local podman[139201]: 2026-03-10 08:03:27.404761687 +0000 UTC m=+0.254163517 container remove 1d09c1b3b608e34afa8589fba1cb48ed53ec22a09d9ab90ecb37c3b406627c49 (image=quay.ceph.io/ceph-ci/ceph@sha256:8fda260ab1d2d3118a1622f7df75f44f285dfe74e71793626152a711c12bf2cc, name=ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508-osd-5-deactivate, CEPH_REF=squid, 
CEPH_SHA1=e911bdebe5c8faa3800735d1568fcdca65db60df, org.label-schema.build-date=20260223, FROM_IMAGE=quay.io/centos/centos:stream9, OSD_FLAVOR=default, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.opencontainers.image.authors=Ceph Release Team , org.label-schema.vendor=CentOS, org.opencontainers.image.documentation=https://docs.ceph.com/, org.label-schema.name=CentOS Stream 9 Base Image, CEPH_GIT_REPO=https://github.com/ceph/ceph-ci.git, GANESHA_REPO_BASEURL=https://buildlogs.centos.org/centos/$releasever-stream/storage/$basearch/nfsganesha-5/, ceph=True) 2026-03-10T08:03:27.445 INFO:journalctl@ceph.osd.5.vm08.stdout:Mar 10 08:03:27 vm08.local systemd[1]: ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508@osd.5.service: Deactivated successfully. 2026-03-10T08:03:27.445 INFO:journalctl@ceph.osd.5.vm08.stdout:Mar 10 08:03:27 vm08.local systemd[1]: ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508@osd.5.service: Unit process 139211 (conmon) remains running after unit stopped. 2026-03-10T08:03:27.445 INFO:journalctl@ceph.osd.5.vm08.stdout:Mar 10 08:03:27 vm08.local systemd[1]: Stopped Ceph osd.5 for 12e9780e-1c55-11f1-8896-79f7c2e9b508. 2026-03-10T08:03:27.445 INFO:journalctl@ceph.osd.5.vm08.stdout:Mar 10 08:03:27 vm08.local systemd[1]: ceph-12e9780e-1c55-11f1-8896-79f7c2e9b508@osd.5.service: Consumed 4.632s CPU time, 546.2M memory peak. 
2026-03-10T08:03:27.461 DEBUG:teuthology.orchestra.run:got remote process result: None 2026-03-10T08:03:27.461 INFO:tasks.cephadm.osd.5:Stopped osd.5 2026-03-10T08:03:27.461 DEBUG:teuthology.orchestra.run.vm05:> sudo /home/ubuntu/cephtest/cephadm rm-cluster --fsid 12e9780e-1c55-11f1-8896-79f7c2e9b508 --force --keep-logs 2026-03-10T08:03:27.583 INFO:teuthology.orchestra.run.vm05.stdout:Deleting cluster with fsid: 12e9780e-1c55-11f1-8896-79f7c2e9b508 2026-03-10T08:03:29.224 INFO:tasks.cephfs.fuse_mount.ceph-fuse.0.vm05.stderr:ceph-fuse[96652]: fuse finished with error 0 and tester_r 0 2026-03-10T08:03:39.972 DEBUG:teuthology.orchestra.run.vm08:> sudo /home/ubuntu/cephtest/cephadm rm-cluster --fsid 12e9780e-1c55-11f1-8896-79f7c2e9b508 --force --keep-logs 2026-03-10T08:03:40.095 INFO:teuthology.orchestra.run.vm08.stdout:Deleting cluster with fsid: 12e9780e-1c55-11f1-8896-79f7c2e9b508 2026-03-10T08:03:45.328 DEBUG:teuthology.orchestra.run.vm05:> sudo rm -f /etc/ceph/ceph.conf /etc/ceph/ceph.client.admin.keyring 2026-03-10T08:03:45.358 DEBUG:teuthology.orchestra.run.vm08:> sudo rm -f /etc/ceph/ceph.conf /etc/ceph/ceph.client.admin.keyring 2026-03-10T08:03:45.393 INFO:tasks.cephadm:Archiving crash dumps... 2026-03-10T08:03:45.393 DEBUG:teuthology.misc:Transferring archived files from vm05:/var/lib/ceph/12e9780e-1c55-11f1-8896-79f7c2e9b508/crash to /archive/kyr-2026-03-10_01:00:38-orch-squid-none-default-vps/954/remote/vm05/crash 2026-03-10T08:03:45.393 DEBUG:teuthology.orchestra.run.vm05:> sudo tar c -f - -C /var/lib/ceph/12e9780e-1c55-11f1-8896-79f7c2e9b508/crash -- . 
2026-03-10T08:03:45.433 INFO:teuthology.orchestra.run.vm05.stderr:tar: /var/lib/ceph/12e9780e-1c55-11f1-8896-79f7c2e9b508/crash: Cannot open: No such file or directory 2026-03-10T08:03:45.433 INFO:teuthology.orchestra.run.vm05.stderr:tar: Error is not recoverable: exiting now 2026-03-10T08:03:45.434 DEBUG:teuthology.misc:Transferring archived files from vm08:/var/lib/ceph/12e9780e-1c55-11f1-8896-79f7c2e9b508/crash to /archive/kyr-2026-03-10_01:00:38-orch-squid-none-default-vps/954/remote/vm08/crash 2026-03-10T08:03:45.434 DEBUG:teuthology.orchestra.run.vm08:> sudo tar c -f - -C /var/lib/ceph/12e9780e-1c55-11f1-8896-79f7c2e9b508/crash -- . 2026-03-10T08:03:45.459 INFO:teuthology.orchestra.run.vm08.stderr:tar: /var/lib/ceph/12e9780e-1c55-11f1-8896-79f7c2e9b508/crash: Cannot open: No such file or directory 2026-03-10T08:03:45.459 INFO:teuthology.orchestra.run.vm08.stderr:tar: Error is not recoverable: exiting now 2026-03-10T08:03:45.460 INFO:tasks.cephadm:Checking cluster log for badness... 
2026-03-10T08:03:45.460 DEBUG:teuthology.orchestra.run.vm05:> sudo egrep '\[ERR\]|\[WRN\]|\[SEC\]' /var/log/ceph/12e9780e-1c55-11f1-8896-79f7c2e9b508/ceph.log | egrep -v '\(MDS_ALL_DOWN\)' | egrep -v '\(MDS_UP_LESS_THAN_MAX\)' | egrep -v FS_DEGRADED | egrep -v 'filesystem is degraded' | egrep -v FS_INLINE_DATA_DEPRECATED | egrep -v FS_WITH_FAILED_MDS | egrep -v MDS_ALL_DOWN | egrep -v 'filesystem is offline' | egrep -v 'is offline because no MDS' | egrep -v MDS_DAMAGE | egrep -v MDS_DEGRADED | egrep -v MDS_FAILED | egrep -v MDS_INSUFFICIENT_STANDBY | egrep -v MDS_UP_LESS_THAN_MAX | egrep -v 'online, but wants' | egrep -v 'filesystem is online with fewer MDS than max_mds' | egrep -v POOL_APP_NOT_ENABLED | egrep -v 'do not have an application enabled' | egrep -v 'overall HEALTH_' | egrep -v 'Replacing daemon' | egrep -v 'deprecated feature inline_data' | egrep -v MGR_MODULE_ERROR | egrep -v OSD_DOWN | egrep -v 'osds down' | egrep -v 'overall HEALTH_' | egrep -v '\(OSD_DOWN\)' | egrep -v '\(OSD_' | egrep -v 'but it is still running' | egrep -v 'is not responding' | egrep -v MON_DOWN | egrep -v PG_AVAILABILITY | egrep -v PG_DEGRADED | egrep -v 'Reduced data availability' | egrep -v 'Degraded data redundancy' | egrep -v 'pg .* is stuck inactive' | egrep -v 'pg .* is .*degraded' | egrep -v 'pg .* is stuck peering' | head -n 1 2026-03-10T08:03:45.544 INFO:tasks.cephadm:Compressing logs... 
2026-03-10T08:03:45.544 DEBUG:teuthology.orchestra.run.vm05:> time sudo find /var/log/ceph /var/log/rbd-target-api -name '*.log' -print0 | sudo xargs --max-args=1 --max-procs=0 --verbose -0 --no-run-if-empty -- gzip -5 --verbose -- 2026-03-10T08:03:45.546 DEBUG:teuthology.orchestra.run.vm08:> time sudo find /var/log/ceph /var/log/rbd-target-api -name '*.log' -print0 | sudo xargs --max-args=1 --max-procs=0 --verbose -0 --no-run-if-empty -- gzip -5 --verbose -- 2026-03-10T08:03:45.570 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/cephadm.log 2026-03-10T08:03:45.570 INFO:teuthology.orchestra.run.vm05.stderr:find: ‘/var/log/rbd-target-api’: No such file or directory 2026-03-10T08:03:45.571 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/12e9780e-1c55-11f1-8896-79f7c2e9b508/ceph-mon.vm05.log 2026-03-10T08:03:45.571 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/12e9780e-1c55-11f1-8896-79f7c2e9b508/ceph.log 2026-03-10T08:03:45.572 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/cephadm.log: /var/log/ceph/12e9780e-1c55-11f1-8896-79f7c2e9b508/ceph-mon.vm05.log: gzip -5 --verbose -- /var/log/ceph/12e9780e-1c55-11f1-8896-79f7c2e9b508/ceph-mgr.vm05.blexke.log 2026-03-10T08:03:45.573 INFO:teuthology.orchestra.run.vm08.stderr:gzip -5 --verbose -- /var/log/ceph/cephadm.log 2026-03-10T08:03:45.573 INFO:teuthology.orchestra.run.vm08.stderr:find: ‘/var/log/rbd-target-api’: No such file or directory 2026-03-10T08:03:45.574 INFO:teuthology.orchestra.run.vm08.stderr:gzip -5 --verbose -- /var/log/ceph/12e9780e-1c55-11f1-8896-79f7c2e9b508/ceph-volume.log 2026-03-10T08:03:45.575 INFO:teuthology.orchestra.run.vm08.stderr:/var/log/ceph/cephadm.log: gzip -5 --verbose -- /var/log/ceph/12e9780e-1c55-11f1-8896-79f7c2e9b508/ceph-client.ceph-exporter.vm08.log 2026-03-10T08:03:45.576 INFO:teuthology.orchestra.run.vm08.stderr:/var/log/ceph/12e9780e-1c55-11f1-8896-79f7c2e9b508/ceph-volume.log: 92.6% -- 
replaced with /var/log/ceph/cephadm.log.gz 2026-03-10T08:03:45.576 INFO:teuthology.orchestra.run.vm08.stderr:gzip -5 --verbose -- /var/log/ceph/12e9780e-1c55-11f1-8896-79f7c2e9b508/ceph-mgr.vm08.orfpog.log 2026-03-10T08:03:45.577 INFO:teuthology.orchestra.run.vm08.stderr:/var/log/ceph/12e9780e-1c55-11f1-8896-79f7c2e9b508/ceph-client.ceph-exporter.vm08.log: 93.9% -- replaced with /var/log/ceph/12e9780e-1c55-11f1-8896-79f7c2e9b508/ceph-client.ceph-exporter.vm08.log.gz 2026-03-10T08:03:45.578 INFO:teuthology.orchestra.run.vm08.stderr:gzip -5 --verbose -- /var/log/ceph/12e9780e-1c55-11f1-8896-79f7c2e9b508/ceph-mon.vm08.log 2026-03-10T08:03:45.578 INFO:teuthology.orchestra.run.vm08.stderr:/var/log/ceph/12e9780e-1c55-11f1-8896-79f7c2e9b508/ceph-mgr.vm08.orfpog.log: gzip -5 --verbose -- /var/log/ceph/12e9780e-1c55-11f1-8896-79f7c2e9b508/ceph.log 2026-03-10T08:03:45.580 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/12e9780e-1c55-11f1-8896-79f7c2e9b508/ceph.log: 87.3% -- replaced with /var/log/ceph/12e9780e-1c55-11f1-8896-79f7c2e9b508/ceph.log.gz 2026-03-10T08:03:45.583 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/12e9780e-1c55-11f1-8896-79f7c2e9b508/ceph.audit.log 2026-03-10T08:03:45.583 INFO:teuthology.orchestra.run.vm08.stderr: 93.6% -- replaced with /var/log/ceph/12e9780e-1c55-11f1-8896-79f7c2e9b508/ceph-volume.log.gz 2026-03-10T08:03:45.584 INFO:teuthology.orchestra.run.vm08.stderr:/var/log/ceph/12e9780e-1c55-11f1-8896-79f7c2e9b508/ceph-mon.vm08.log: gzip -5 --verbose -- /var/log/ceph/12e9780e-1c55-11f1-8896-79f7c2e9b508/ceph.audit.log 2026-03-10T08:03:45.585 INFO:teuthology.orchestra.run.vm08.stderr:/var/log/ceph/12e9780e-1c55-11f1-8896-79f7c2e9b508/ceph.log: 87.3% -- replaced with /var/log/ceph/12e9780e-1c55-11f1-8896-79f7c2e9b508/ceph.log.gz 2026-03-10T08:03:45.586 INFO:teuthology.orchestra.run.vm08.stderr:gzip -5 --verbose -- /var/log/ceph/12e9780e-1c55-11f1-8896-79f7c2e9b508/ceph.cephadm.log 2026-03-10T08:03:45.586 
INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/12e9780e-1c55-11f1-8896-79f7c2e9b508/ceph-mgr.vm05.blexke.log: 90.9% -- replaced with /var/log/ceph/cephadm.log.gz 2026-03-10T08:03:45.588 INFO:teuthology.orchestra.run.vm08.stderr:/var/log/ceph/12e9780e-1c55-11f1-8896-79f7c2e9b508/ceph.audit.log: 91.4% -- replaced with /var/log/ceph/12e9780e-1c55-11f1-8896-79f7c2e9b508/ceph.audit.log.gz 2026-03-10T08:03:45.589 INFO:teuthology.orchestra.run.vm08.stderr:gzip -5 --verbose -- /var/log/ceph/12e9780e-1c55-11f1-8896-79f7c2e9b508/ceph-osd.3.log 2026-03-10T08:03:45.589 INFO:teuthology.orchestra.run.vm08.stderr:/var/log/ceph/12e9780e-1c55-11f1-8896-79f7c2e9b508/ceph.cephadm.log: 85.2% -- replaced with /var/log/ceph/12e9780e-1c55-11f1-8896-79f7c2e9b508/ceph.cephadm.log.gz 2026-03-10T08:03:45.589 INFO:teuthology.orchestra.run.vm08.stderr:gzip -5 --verbose -- /var/log/ceph/12e9780e-1c55-11f1-8896-79f7c2e9b508/ceph-osd.4.log 2026-03-10T08:03:45.590 INFO:teuthology.orchestra.run.vm08.stderr:/var/log/ceph/12e9780e-1c55-11f1-8896-79f7c2e9b508/ceph-osd.3.log: gzip -5 --verbose -- /var/log/ceph/12e9780e-1c55-11f1-8896-79f7c2e9b508/ceph-osd.5.log 2026-03-10T08:03:45.595 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/12e9780e-1c55-11f1-8896-79f7c2e9b508/ceph.cephadm.log 2026-03-10T08:03:45.598 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/12e9780e-1c55-11f1-8896-79f7c2e9b508/ceph.audit.log: 91.2% -- replaced with /var/log/ceph/12e9780e-1c55-11f1-8896-79f7c2e9b508/ceph.audit.log.gz 2026-03-10T08:03:45.599 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/12e9780e-1c55-11f1-8896-79f7c2e9b508/ceph-volume.log 2026-03-10T08:03:45.600 INFO:teuthology.orchestra.run.vm08.stderr:/var/log/ceph/12e9780e-1c55-11f1-8896-79f7c2e9b508/ceph-osd.4.log: gzip -5 --verbose -- /var/log/ceph/12e9780e-1c55-11f1-8896-79f7c2e9b508/ceph-mds.cephfs.vm08.ybmbgd.log 2026-03-10T08:03:45.600 
INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/12e9780e-1c55-11f1-8896-79f7c2e9b508/ceph.cephadm.log: 85.2% -- replaced with /var/log/ceph/12e9780e-1c55-11f1-8896-79f7c2e9b508/ceph.cephadm.log.gz 2026-03-10T08:03:45.604 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/12e9780e-1c55-11f1-8896-79f7c2e9b508/ceph-client.ceph-exporter.vm05.log 2026-03-10T08:03:45.611 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/12e9780e-1c55-11f1-8896-79f7c2e9b508/ceph-volume.log: gzip -5 --verbose -- /var/log/ceph/12e9780e-1c55-11f1-8896-79f7c2e9b508/ceph-osd.0.log 2026-03-10T08:03:45.612 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/12e9780e-1c55-11f1-8896-79f7c2e9b508/ceph-client.ceph-exporter.vm05.log: 93.9% -- replaced with /var/log/ceph/12e9780e-1c55-11f1-8896-79f7c2e9b508/ceph-client.ceph-exporter.vm05.log.gz 2026-03-10T08:03:45.614 INFO:teuthology.orchestra.run.vm08.stderr:/var/log/ceph/12e9780e-1c55-11f1-8896-79f7c2e9b508/ceph-osd.5.log: gzip -5 --verbose -- /var/log/ceph/12e9780e-1c55-11f1-8896-79f7c2e9b508/ceph-mds.cephfs.vm08.dgsaon.log 2026-03-10T08:03:45.619 INFO:teuthology.orchestra.run.vm05.stderr: 93.5% -- replaced with /var/log/ceph/12e9780e-1c55-11f1-8896-79f7c2e9b508/ceph-volume.log.gz 2026-03-10T08:03:45.622 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /var/log/ceph/12e9780e-1c55-11f1-8896-79f7c2e9b508/ceph-osd.1.log 2026-03-10T08:03:45.626 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/12e9780e-1c55-11f1-8896-79f7c2e9b508/ceph-osd.0.log: gzip -5 --verbose -- /var/log/ceph/12e9780e-1c55-11f1-8896-79f7c2e9b508/ceph-osd.2.log 2026-03-10T08:03:45.626 INFO:teuthology.orchestra.run.vm08.stderr:/var/log/ceph/12e9780e-1c55-11f1-8896-79f7c2e9b508/ceph-mds.cephfs.vm08.ybmbgd.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.1.log 2026-03-10T08:03:45.633 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/12e9780e-1c55-11f1-8896-79f7c2e9b508/ceph-osd.1.log: gzip -5 --verbose -- 
/var/log/ceph/12e9780e-1c55-11f1-8896-79f7c2e9b508/ceph-mds.cephfs.vm05.omfhnh.log 2026-03-10T08:03:45.636 INFO:teuthology.orchestra.run.vm08.stderr:/var/log/ceph/12e9780e-1c55-11f1-8896-79f7c2e9b508/ceph-mds.cephfs.vm08.dgsaon.log: /var/log/ceph/ceph-client.1.log: 89.2% -- replaced with /var/log/ceph/12e9780e-1c55-11f1-8896-79f7c2e9b508/ceph-mgr.vm08.orfpog.log.gz 2026-03-10T08:03:45.638 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/12e9780e-1c55-11f1-8896-79f7c2e9b508/ceph-osd.2.log: gzip -5 --verbose -- /var/log/ceph/12e9780e-1c55-11f1-8896-79f7c2e9b508/ceph-mds.cephfs.vm05.pavqil.log 2026-03-10T08:03:45.653 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/12e9780e-1c55-11f1-8896-79f7c2e9b508/ceph-mds.cephfs.vm05.omfhnh.log: gzip -5 --verbose -- /var/log/ceph/ceph-client.0.log 2026-03-10T08:03:46.175 INFO:teuthology.orchestra.run.vm05.stderr:/var/log/ceph/12e9780e-1c55-11f1-8896-79f7c2e9b508/ceph-mds.cephfs.vm05.pavqil.log: /var/log/ceph/ceph-client.0.log: 89.4% -- replaced with /var/log/ceph/12e9780e-1c55-11f1-8896-79f7c2e9b508/ceph-mgr.vm05.blexke.log.gz 2026-03-10T08:03:46.208 INFO:teuthology.orchestra.run.vm08.stderr: 92.3% -- replaced with /var/log/ceph/12e9780e-1c55-11f1-8896-79f7c2e9b508/ceph-mon.vm08.log.gz 2026-03-10T08:03:47.156 INFO:teuthology.orchestra.run.vm05.stderr: 90.6% -- replaced with /var/log/ceph/12e9780e-1c55-11f1-8896-79f7c2e9b508/ceph-mon.vm05.log.gz 2026-03-10T08:03:53.117 INFO:teuthology.orchestra.run.vm08.stderr: 93.8% -- replaced with /var/log/ceph/12e9780e-1c55-11f1-8896-79f7c2e9b508/ceph-osd.4.log.gz 2026-03-10T08:03:53.888 INFO:teuthology.orchestra.run.vm05.stderr: 93.9% -- replaced with /var/log/ceph/12e9780e-1c55-11f1-8896-79f7c2e9b508/ceph-osd.2.log.gz 2026-03-10T08:03:54.640 INFO:teuthology.orchestra.run.vm05.stderr: 93.8% -- replaced with /var/log/ceph/12e9780e-1c55-11f1-8896-79f7c2e9b508/ceph-osd.0.log.gz 2026-03-10T08:03:54.869 INFO:teuthology.orchestra.run.vm08.stderr: 93.9% -- replaced with 
/var/log/ceph/12e9780e-1c55-11f1-8896-79f7c2e9b508/ceph-osd.5.log.gz 2026-03-10T08:03:55.482 INFO:teuthology.orchestra.run.vm05.stderr: 93.9% -- replaced with /var/log/ceph/12e9780e-1c55-11f1-8896-79f7c2e9b508/ceph-osd.1.log.gz 2026-03-10T08:03:56.068 INFO:teuthology.orchestra.run.vm08.stderr: 93.9% -- replaced with /var/log/ceph/12e9780e-1c55-11f1-8896-79f7c2e9b508/ceph-osd.3.log.gz 2026-03-10T08:03:56.554 INFO:teuthology.orchestra.run.vm08.stderr: 94.9% -- replaced with /var/log/ceph/12e9780e-1c55-11f1-8896-79f7c2e9b508/ceph-mds.cephfs.vm08.ybmbgd.log.gz 2026-03-10T08:03:56.700 INFO:teuthology.orchestra.run.vm08.stderr: 94.9% -- replaced with /var/log/ceph/12e9780e-1c55-11f1-8896-79f7c2e9b508/ceph-mds.cephfs.vm08.dgsaon.log.gz 2026-03-10T08:04:01.316 INFO:teuthology.orchestra.run.vm08.stderr:gzip: /var/log/ceph/ceph-client.1.log: file size changed while zipping 2026-03-10T08:04:01.316 INFO:teuthology.orchestra.run.vm08.stderr: 93.4% -- replaced with /var/log/ceph/ceph-client.1.log.gz 2026-03-10T08:04:01.319 INFO:teuthology.orchestra.run.vm08.stderr: 2026-03-10T08:04:01.319 INFO:teuthology.orchestra.run.vm08.stderr:real 0m15.756s 2026-03-10T08:04:01.319 INFO:teuthology.orchestra.run.vm08.stderr:user 0m25.670s 2026-03-10T08:04:01.319 INFO:teuthology.orchestra.run.vm08.stderr:sys 0m1.053s 2026-03-10T08:04:03.450 INFO:teuthology.orchestra.run.vm05.stderr: 94.9% -- replaced with /var/log/ceph/12e9780e-1c55-11f1-8896-79f7c2e9b508/ceph-mds.cephfs.vm05.pavqil.log.gz 2026-03-10T08:04:04.099 INFO:teuthology.orchestra.run.vm05.stderr:gzip: /var/log/ceph/ceph-client.0.log: file size changed while zipping 2026-03-10T08:04:04.306 INFO:teuthology.orchestra.run.vm05.stderr: 93.5% -- replaced with /var/log/ceph/ceph-client.0.log.gz 2026-03-10T08:04:56.710 INFO:teuthology.orchestra.run.vm05.stderr: 92.9% -- replaced with /var/log/ceph/12e9780e-1c55-11f1-8896-79f7c2e9b508/ceph-mds.cephfs.vm05.omfhnh.log.gz 2026-03-10T08:04:56.713 INFO:teuthology.orchestra.run.vm05.stderr: 
2026-03-10T08:04:56.713 INFO:teuthology.orchestra.run.vm05.stderr:real 1m11.153s 2026-03-10T08:04:56.713 INFO:teuthology.orchestra.run.vm05.stderr:user 1m22.378s 2026-03-10T08:04:56.713 INFO:teuthology.orchestra.run.vm05.stderr:sys 0m5.540s 2026-03-10T08:04:56.713 INFO:tasks.cephadm:Archiving logs... 2026-03-10T08:04:56.713 DEBUG:teuthology.misc:Transferring archived files from vm05:/var/log/ceph to /archive/kyr-2026-03-10_01:00:38-orch-squid-none-default-vps/954/remote/vm05/log 2026-03-10T08:04:56.714 DEBUG:teuthology.orchestra.run.vm05:> sudo tar c -f - -C /var/log/ceph -- . 2026-03-10T08:05:00.994 DEBUG:teuthology.misc:Transferring archived files from vm08:/var/log/ceph to /archive/kyr-2026-03-10_01:00:38-orch-squid-none-default-vps/954/remote/vm08/log 2026-03-10T08:05:00.994 DEBUG:teuthology.orchestra.run.vm08:> sudo tar c -f - -C /var/log/ceph -- . 2026-03-10T08:05:02.213 INFO:tasks.cephadm:Removing cluster... 2026-03-10T08:05:02.213 DEBUG:teuthology.orchestra.run.vm05:> sudo /home/ubuntu/cephtest/cephadm rm-cluster --fsid 12e9780e-1c55-11f1-8896-79f7c2e9b508 --force 2026-03-10T08:05:02.318 INFO:teuthology.orchestra.run.vm05.stdout:Deleting cluster with fsid: 12e9780e-1c55-11f1-8896-79f7c2e9b508 2026-03-10T08:05:03.172 DEBUG:teuthology.orchestra.run.vm08:> sudo /home/ubuntu/cephtest/cephadm rm-cluster --fsid 12e9780e-1c55-11f1-8896-79f7c2e9b508 --force 2026-03-10T08:05:03.277 INFO:teuthology.orchestra.run.vm08.stdout:Deleting cluster with fsid: 12e9780e-1c55-11f1-8896-79f7c2e9b508 2026-03-10T08:05:03.550 INFO:tasks.cephadm:Removing cephadm ... 
2026-03-10T08:05:03.550 DEBUG:teuthology.orchestra.run.vm05:> rm -rf /home/ubuntu/cephtest/cephadm 2026-03-10T08:05:03.568 DEBUG:teuthology.orchestra.run.vm08:> rm -rf /home/ubuntu/cephtest/cephadm 2026-03-10T08:05:03.584 INFO:tasks.cephadm:Teardown complete 2026-03-10T08:05:03.584 DEBUG:teuthology.run_tasks:Unwinding manager install 2026-03-10T08:05:03.587 ERROR:teuthology.contextutil:Saw exception from nested tasks Traceback (most recent call last): File "/home/teuthos/teuthology/teuthology/contextutil.py", line 32, in nested yield vars File "/home/teuthos/teuthology/teuthology/task/install/__init__.py", line 644, in task yield File "/home/teuthos/teuthology/teuthology/run_tasks.py", line 160, in run_tasks suppress = manager.__exit__(*exc_info) File "/home/teuthos/.local/share/uv/python/cpython-3.10.19-linux-x86_64-gnu/lib/python3.10/contextlib.py", line 142, in __exit__ next(self.gen) File "/home/teuthos/src/github.com_kshtsk_ceph_75a68fd8ca3f918fe9466b4c0bb385b7fc260a9b/qa/tasks/ceph_fuse.py", line 248, in task mount.umount_wait() File "/home/teuthos/src/github.com_kshtsk_ceph_75a68fd8ca3f918fe9466b4c0bb385b7fc260a9b/qa/tasks/cephfs/fuse_mount.py", line 403, in umount_wait run.wait([self.fuse_daemon], timeout) File "/home/teuthos/teuthology/teuthology/orchestra/run.py", line 479, in wait check_time() File "/home/teuthos/teuthology/teuthology/contextutil.py", line 134, in __call__ raise MaxWhileTries(error_msg) teuthology.exceptions.MaxWhileTries: reached maximum tries (50) after waiting for 300 seconds 2026-03-10T08:05:03.587 INFO:teuthology.task.install.util:Removing shipped files: /home/ubuntu/cephtest/valgrind.supp /usr/bin/daemon-helper /usr/bin/adjust-ulimits /usr/bin/stdin-killer... 
2026-03-10T08:05:03.587 DEBUG:teuthology.orchestra.run.vm05:> sudo rm -f -- /home/ubuntu/cephtest/valgrind.supp /usr/bin/daemon-helper /usr/bin/adjust-ulimits /usr/bin/stdin-killer 2026-03-10T08:05:03.610 DEBUG:teuthology.orchestra.run.vm08:> sudo rm -f -- /home/ubuntu/cephtest/valgrind.supp /usr/bin/daemon-helper /usr/bin/adjust-ulimits /usr/bin/stdin-killer 2026-03-10T08:05:03.662 INFO:teuthology.task.install.rpm:Removing packages: ceph-radosgw, ceph-test, ceph, ceph-base, cephadm, ceph-immutable-object-cache, ceph-mgr, ceph-mgr-dashboard, ceph-mgr-diskprediction-local, ceph-mgr-rook, ceph-mgr-cephadm, ceph-fuse, librados-devel, libcephfs2, libcephfs-devel, librados2, librbd1, python3-rados, python3-rgw, python3-cephfs, python3-rbd, rbd-fuse, rbd-mirror, rbd-nbd on rpm system. 2026-03-10T08:05:03.662 DEBUG:teuthology.orchestra.run.vm05:> 2026-03-10T08:05:03.662 DEBUG:teuthology.orchestra.run.vm05:> for d in ceph-radosgw ceph-test ceph ceph-base cephadm ceph-immutable-object-cache ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-cephadm ceph-fuse librados-devel libcephfs2 libcephfs-devel librados2 librbd1 python3-rados python3-rgw python3-cephfs python3-rbd rbd-fuse rbd-mirror rbd-nbd ; do 2026-03-10T08:05:03.662 DEBUG:teuthology.orchestra.run.vm05:> sudo yum -y remove $d || true 2026-03-10T08:05:03.662 DEBUG:teuthology.orchestra.run.vm05:> done 2026-03-10T08:05:03.668 INFO:teuthology.task.install.rpm:Removing packages: ceph-radosgw, ceph-test, ceph, ceph-base, cephadm, ceph-immutable-object-cache, ceph-mgr, ceph-mgr-dashboard, ceph-mgr-diskprediction-local, ceph-mgr-rook, ceph-mgr-cephadm, ceph-fuse, librados-devel, libcephfs2, libcephfs-devel, librados2, librbd1, python3-rados, python3-rgw, python3-cephfs, python3-rbd, rbd-fuse, rbd-mirror, rbd-nbd on rpm system. 
2026-03-10T08:05:03.668 DEBUG:teuthology.orchestra.run.vm08:> 2026-03-10T08:05:03.668 DEBUG:teuthology.orchestra.run.vm08:> for d in ceph-radosgw ceph-test ceph ceph-base cephadm ceph-immutable-object-cache ceph-mgr ceph-mgr-dashboard ceph-mgr-diskprediction-local ceph-mgr-rook ceph-mgr-cephadm ceph-fuse librados-devel libcephfs2 libcephfs-devel librados2 librbd1 python3-rados python3-rgw python3-cephfs python3-rbd rbd-fuse rbd-mirror rbd-nbd ; do 2026-03-10T08:05:03.668 DEBUG:teuthology.orchestra.run.vm08:> sudo yum -y remove $d || true 2026-03-10T08:05:03.668 DEBUG:teuthology.orchestra.run.vm08:> done 2026-03-10T08:05:03.915 INFO:teuthology.orchestra.run.vm08.stdout:Dependencies resolved. 2026-03-10T08:05:03.916 INFO:teuthology.orchestra.run.vm08.stdout:================================================================================ 2026-03-10T08:05:03.916 INFO:teuthology.orchestra.run.vm08.stdout: Package Architecture Version Repository Size 2026-03-10T08:05:03.916 INFO:teuthology.orchestra.run.vm08.stdout:================================================================================ 2026-03-10T08:05:03.916 INFO:teuthology.orchestra.run.vm08.stdout:Removing: 2026-03-10T08:05:03.916 INFO:teuthology.orchestra.run.vm08.stdout: ceph-radosgw x86_64 2:18.2.1-0.el9 @ceph 31 M 2026-03-10T08:05:03.916 INFO:teuthology.orchestra.run.vm08.stdout:Removing unused dependencies: 2026-03-10T08:05:03.916 INFO:teuthology.orchestra.run.vm08.stdout: mailcap noarch 2.1.49-5.el9 @baseos 78 k 2026-03-10T08:05:03.916 INFO:teuthology.orchestra.run.vm08.stdout: 2026-03-10T08:05:03.916 INFO:teuthology.orchestra.run.vm08.stdout:Transaction Summary 2026-03-10T08:05:03.916 INFO:teuthology.orchestra.run.vm08.stdout:================================================================================ 2026-03-10T08:05:03.916 INFO:teuthology.orchestra.run.vm08.stdout:Remove 2 Packages 2026-03-10T08:05:03.916 INFO:teuthology.orchestra.run.vm08.stdout: 2026-03-10T08:05:03.916 
INFO:teuthology.orchestra.run.vm08.stdout:Freed space: 31 M 2026-03-10T08:05:03.916 INFO:teuthology.orchestra.run.vm08.stdout:Running transaction check 2026-03-10T08:05:03.921 INFO:teuthology.orchestra.run.vm08.stdout:Transaction check succeeded. 2026-03-10T08:05:03.921 INFO:teuthology.orchestra.run.vm08.stdout:Running transaction test 2026-03-10T08:05:03.936 INFO:teuthology.orchestra.run.vm08.stdout:Transaction test succeeded. 2026-03-10T08:05:03.936 INFO:teuthology.orchestra.run.vm08.stdout:Running transaction 2026-03-10T08:05:03.945 INFO:teuthology.orchestra.run.vm05.stdout:Dependencies resolved. 2026-03-10T08:05:03.946 INFO:teuthology.orchestra.run.vm05.stdout:================================================================================ 2026-03-10T08:05:03.946 INFO:teuthology.orchestra.run.vm05.stdout: Package Architecture Version Repository Size 2026-03-10T08:05:03.946 INFO:teuthology.orchestra.run.vm05.stdout:================================================================================ 2026-03-10T08:05:03.946 INFO:teuthology.orchestra.run.vm05.stdout:Removing: 2026-03-10T08:05:03.946 INFO:teuthology.orchestra.run.vm05.stdout: ceph-radosgw x86_64 2:18.2.1-0.el9 @ceph 31 M 2026-03-10T08:05:03.946 INFO:teuthology.orchestra.run.vm05.stdout:Removing unused dependencies: 2026-03-10T08:05:03.946 INFO:teuthology.orchestra.run.vm05.stdout: mailcap noarch 2.1.49-5.el9 @baseos 78 k 2026-03-10T08:05:03.946 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-10T08:05:03.946 INFO:teuthology.orchestra.run.vm05.stdout:Transaction Summary 2026-03-10T08:05:03.946 INFO:teuthology.orchestra.run.vm05.stdout:================================================================================ 2026-03-10T08:05:03.946 INFO:teuthology.orchestra.run.vm05.stdout:Remove 2 Packages 2026-03-10T08:05:03.946 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-10T08:05:03.947 INFO:teuthology.orchestra.run.vm05.stdout:Freed space: 31 M 2026-03-10T08:05:03.947 
INFO:teuthology.orchestra.run.vm05.stdout:Running transaction check 2026-03-10T08:05:03.951 INFO:teuthology.orchestra.run.vm05.stdout:Transaction check succeeded. 2026-03-10T08:05:03.951 INFO:teuthology.orchestra.run.vm05.stdout:Running transaction test 2026-03-10T08:05:03.965 INFO:teuthology.orchestra.run.vm05.stdout:Transaction test succeeded. 2026-03-10T08:05:03.965 INFO:teuthology.orchestra.run.vm05.stdout:Running transaction 2026-03-10T08:05:03.969 INFO:teuthology.orchestra.run.vm08.stdout: Preparing : 1/1 2026-03-10T08:05:03.991 INFO:teuthology.orchestra.run.vm08.stdout: Running scriptlet: ceph-radosgw-2:18.2.1-0.el9.x86_64 1/2 2026-03-10T08:05:03.992 INFO:teuthology.orchestra.run.vm08.stdout:Glob pattern passed to enable, but globs are not supported for this. 2026-03-10T08:05:03.992 INFO:teuthology.orchestra.run.vm08.stdout:Invalid unit name "ceph-radosgw@*.service" escaped as "ceph-radosgw@\x2a.service". 2026-03-10T08:05:03.992 INFO:teuthology.orchestra.run.vm08.stdout:Removed "/etc/systemd/system/multi-user.target.wants/ceph-radosgw.target". 2026-03-10T08:05:03.992 INFO:teuthology.orchestra.run.vm08.stdout:Removed "/etc/systemd/system/ceph.target.wants/ceph-radosgw.target". 2026-03-10T08:05:03.992 INFO:teuthology.orchestra.run.vm08.stdout: 2026-03-10T08:05:03.993 INFO:teuthology.orchestra.run.vm08.stdout: Erasing : ceph-radosgw-2:18.2.1-0.el9.x86_64 1/2 2026-03-10T08:05:03.999 INFO:teuthology.orchestra.run.vm05.stdout: Preparing : 1/1 2026-03-10T08:05:04.003 INFO:teuthology.orchestra.run.vm08.stdout: Running scriptlet: ceph-radosgw-2:18.2.1-0.el9.x86_64 1/2 2026-03-10T08:05:04.017 INFO:teuthology.orchestra.run.vm08.stdout: Erasing : mailcap-2.1.49-5.el9.noarch 2/2 2026-03-10T08:05:04.022 INFO:teuthology.orchestra.run.vm05.stdout: Running scriptlet: ceph-radosgw-2:18.2.1-0.el9.x86_64 1/2 2026-03-10T08:05:04.022 INFO:teuthology.orchestra.run.vm05.stdout:Glob pattern passed to enable, but globs are not supported for this. 
2026-03-10T08:05:04.022 INFO:teuthology.orchestra.run.vm05.stdout:Invalid unit name "ceph-radosgw@*.service" escaped as "ceph-radosgw@\x2a.service". 2026-03-10T08:05:04.022 INFO:teuthology.orchestra.run.vm05.stdout:Removed "/etc/systemd/system/multi-user.target.wants/ceph-radosgw.target". 2026-03-10T08:05:04.022 INFO:teuthology.orchestra.run.vm05.stdout:Removed "/etc/systemd/system/ceph.target.wants/ceph-radosgw.target". 2026-03-10T08:05:04.022 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-10T08:05:04.024 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : ceph-radosgw-2:18.2.1-0.el9.x86_64 1/2 2026-03-10T08:05:04.031 INFO:teuthology.orchestra.run.vm05.stdout: Running scriptlet: ceph-radosgw-2:18.2.1-0.el9.x86_64 1/2 2026-03-10T08:05:04.048 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : mailcap-2.1.49-5.el9.noarch 2/2 2026-03-10T08:05:04.085 INFO:teuthology.orchestra.run.vm08.stdout: Running scriptlet: mailcap-2.1.49-5.el9.noarch 2/2 2026-03-10T08:05:04.085 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : ceph-radosgw-2:18.2.1-0.el9.x86_64 1/2 2026-03-10T08:05:04.107 INFO:teuthology.orchestra.run.vm05.stdout: Running scriptlet: mailcap-2.1.49-5.el9.noarch 2/2 2026-03-10T08:05:04.107 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : ceph-radosgw-2:18.2.1-0.el9.x86_64 1/2 2026-03-10T08:05:04.134 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : mailcap-2.1.49-5.el9.noarch 2/2 2026-03-10T08:05:04.134 INFO:teuthology.orchestra.run.vm08.stdout: 2026-03-10T08:05:04.134 INFO:teuthology.orchestra.run.vm08.stdout:Removed: 2026-03-10T08:05:04.134 INFO:teuthology.orchestra.run.vm08.stdout: ceph-radosgw-2:18.2.1-0.el9.x86_64 mailcap-2.1.49-5.el9.noarch 2026-03-10T08:05:04.134 INFO:teuthology.orchestra.run.vm08.stdout: 2026-03-10T08:05:04.134 INFO:teuthology.orchestra.run.vm08.stdout:Complete! 
2026-03-10T08:05:04.149 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : mailcap-2.1.49-5.el9.noarch 2/2
2026-03-10T08:05:04.149 INFO:teuthology.orchestra.run.vm05.stdout:
2026-03-10T08:05:04.149 INFO:teuthology.orchestra.run.vm05.stdout:Removed:
2026-03-10T08:05:04.149 INFO:teuthology.orchestra.run.vm05.stdout: ceph-radosgw-2:18.2.1-0.el9.x86_64 mailcap-2.1.49-5.el9.noarch
2026-03-10T08:05:04.149 INFO:teuthology.orchestra.run.vm05.stdout:
2026-03-10T08:05:04.149 INFO:teuthology.orchestra.run.vm05.stdout:Complete!
2026-03-10T08:05:04.364 INFO:teuthology.orchestra.run.vm05.stdout:Dependencies resolved.
2026-03-10T08:05:04.365 INFO:teuthology.orchestra.run.vm05.stdout:================================================================================
2026-03-10T08:05:04.365 INFO:teuthology.orchestra.run.vm05.stdout: Package Architecture Version Repository Size
2026-03-10T08:05:04.365 INFO:teuthology.orchestra.run.vm05.stdout:================================================================================
2026-03-10T08:05:04.365 INFO:teuthology.orchestra.run.vm05.stdout:Removing:
2026-03-10T08:05:04.365 INFO:teuthology.orchestra.run.vm05.stdout: ceph-test x86_64 2:18.2.1-0.el9 @ceph 164 M
2026-03-10T08:05:04.365 INFO:teuthology.orchestra.run.vm05.stdout:Removing unused dependencies:
2026-03-10T08:05:04.365 INFO:teuthology.orchestra.run.vm05.stdout: libxslt x86_64 1.1.34-12.el9 @appstream 743 k
2026-03-10T08:05:04.365 INFO:teuthology.orchestra.run.vm05.stdout: socat x86_64 1.7.4.1-8.el9 @appstream 1.1 M
2026-03-10T08:05:04.365 INFO:teuthology.orchestra.run.vm05.stdout: xmlstarlet x86_64 1.6.1-20.el9 @appstream 195 k
2026-03-10T08:05:04.365 INFO:teuthology.orchestra.run.vm05.stdout:
2026-03-10T08:05:04.365 INFO:teuthology.orchestra.run.vm05.stdout:Transaction Summary
2026-03-10T08:05:04.365 INFO:teuthology.orchestra.run.vm05.stdout:================================================================================
2026-03-10T08:05:04.365 INFO:teuthology.orchestra.run.vm05.stdout:Remove 4 Packages
2026-03-10T08:05:04.365 INFO:teuthology.orchestra.run.vm05.stdout:
2026-03-10T08:05:04.365 INFO:teuthology.orchestra.run.vm05.stdout:Freed space: 166 M
2026-03-10T08:05:04.365 INFO:teuthology.orchestra.run.vm05.stdout:Running transaction check
2026-03-10T08:05:04.368 INFO:teuthology.orchestra.run.vm08.stdout:Dependencies resolved.
2026-03-10T08:05:04.368 INFO:teuthology.orchestra.run.vm05.stdout:Transaction check succeeded.
2026-03-10T08:05:04.368 INFO:teuthology.orchestra.run.vm05.stdout:Running transaction test
2026-03-10T08:05:04.368 INFO:teuthology.orchestra.run.vm08.stdout:================================================================================
2026-03-10T08:05:04.368 INFO:teuthology.orchestra.run.vm08.stdout: Package Architecture Version Repository Size
2026-03-10T08:05:04.369 INFO:teuthology.orchestra.run.vm08.stdout:================================================================================
2026-03-10T08:05:04.369 INFO:teuthology.orchestra.run.vm08.stdout:Removing:
2026-03-10T08:05:04.369 INFO:teuthology.orchestra.run.vm08.stdout: ceph-test x86_64 2:18.2.1-0.el9 @ceph 164 M
2026-03-10T08:05:04.369 INFO:teuthology.orchestra.run.vm08.stdout:Removing unused dependencies:
2026-03-10T08:05:04.369 INFO:teuthology.orchestra.run.vm08.stdout: libxslt x86_64 1.1.34-12.el9 @appstream 743 k
2026-03-10T08:05:04.369 INFO:teuthology.orchestra.run.vm08.stdout: socat x86_64 1.7.4.1-8.el9 @appstream 1.1 M
2026-03-10T08:05:04.369 INFO:teuthology.orchestra.run.vm08.stdout: xmlstarlet x86_64 1.6.1-20.el9 @appstream 195 k
2026-03-10T08:05:04.369 INFO:teuthology.orchestra.run.vm08.stdout:
2026-03-10T08:05:04.369 INFO:teuthology.orchestra.run.vm08.stdout:Transaction Summary
2026-03-10T08:05:04.369 INFO:teuthology.orchestra.run.vm08.stdout:================================================================================
2026-03-10T08:05:04.369 INFO:teuthology.orchestra.run.vm08.stdout:Remove 4 Packages
2026-03-10T08:05:04.369 INFO:teuthology.orchestra.run.vm08.stdout:
2026-03-10T08:05:04.369 INFO:teuthology.orchestra.run.vm08.stdout:Freed space: 166 M
2026-03-10T08:05:04.369 INFO:teuthology.orchestra.run.vm08.stdout:Running transaction check
2026-03-10T08:05:04.372 INFO:teuthology.orchestra.run.vm08.stdout:Transaction check succeeded.
2026-03-10T08:05:04.372 INFO:teuthology.orchestra.run.vm08.stdout:Running transaction test
2026-03-10T08:05:04.394 INFO:teuthology.orchestra.run.vm05.stdout:Transaction test succeeded.
2026-03-10T08:05:04.394 INFO:teuthology.orchestra.run.vm05.stdout:Running transaction
2026-03-10T08:05:04.398 INFO:teuthology.orchestra.run.vm08.stdout:Transaction test succeeded.
2026-03-10T08:05:04.398 INFO:teuthology.orchestra.run.vm08.stdout:Running transaction
2026-03-10T08:05:04.451 INFO:teuthology.orchestra.run.vm08.stdout: Preparing : 1/1
2026-03-10T08:05:04.451 INFO:teuthology.orchestra.run.vm05.stdout: Preparing : 1/1
2026-03-10T08:05:04.457 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : ceph-test-2:18.2.1-0.el9.x86_64 1/4
2026-03-10T08:05:04.458 INFO:teuthology.orchestra.run.vm08.stdout: Erasing : ceph-test-2:18.2.1-0.el9.x86_64 1/4
2026-03-10T08:05:04.460 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : xmlstarlet-1.6.1-20.el9.x86_64 2/4
2026-03-10T08:05:04.461 INFO:teuthology.orchestra.run.vm08.stdout: Erasing : xmlstarlet-1.6.1-20.el9.x86_64 2/4
2026-03-10T08:05:04.464 INFO:teuthology.orchestra.run.vm08.stdout: Erasing : libxslt-1.1.34-12.el9.x86_64 3/4
2026-03-10T08:05:04.465 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : libxslt-1.1.34-12.el9.x86_64 3/4
2026-03-10T08:05:04.480 INFO:teuthology.orchestra.run.vm08.stdout: Erasing : socat-1.7.4.1-8.el9.x86_64 4/4
2026-03-10T08:05:04.481 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : socat-1.7.4.1-8.el9.x86_64 4/4
2026-03-10T08:05:04.543 INFO:teuthology.orchestra.run.vm08.stdout: Running scriptlet: socat-1.7.4.1-8.el9.x86_64 4/4
2026-03-10T08:05:04.543 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : ceph-test-2:18.2.1-0.el9.x86_64 1/4
2026-03-10T08:05:04.543 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : libxslt-1.1.34-12.el9.x86_64 2/4
2026-03-10T08:05:04.543 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : socat-1.7.4.1-8.el9.x86_64 3/4
2026-03-10T08:05:04.557 INFO:teuthology.orchestra.run.vm05.stdout: Running scriptlet: socat-1.7.4.1-8.el9.x86_64 4/4
2026-03-10T08:05:04.557 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : ceph-test-2:18.2.1-0.el9.x86_64 1/4
2026-03-10T08:05:04.557 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : libxslt-1.1.34-12.el9.x86_64 2/4
2026-03-10T08:05:04.557 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : socat-1.7.4.1-8.el9.x86_64 3/4
2026-03-10T08:05:04.587 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : xmlstarlet-1.6.1-20.el9.x86_64 4/4
2026-03-10T08:05:04.587 INFO:teuthology.orchestra.run.vm08.stdout:
2026-03-10T08:05:04.587 INFO:teuthology.orchestra.run.vm08.stdout:Removed:
2026-03-10T08:05:04.587 INFO:teuthology.orchestra.run.vm08.stdout: ceph-test-2:18.2.1-0.el9.x86_64 libxslt-1.1.34-12.el9.x86_64
2026-03-10T08:05:04.587 INFO:teuthology.orchestra.run.vm08.stdout: socat-1.7.4.1-8.el9.x86_64 xmlstarlet-1.6.1-20.el9.x86_64
2026-03-10T08:05:04.587 INFO:teuthology.orchestra.run.vm08.stdout:
2026-03-10T08:05:04.587 INFO:teuthology.orchestra.run.vm08.stdout:Complete!
2026-03-10T08:05:04.609 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : xmlstarlet-1.6.1-20.el9.x86_64 4/4
2026-03-10T08:05:04.609 INFO:teuthology.orchestra.run.vm05.stdout:
2026-03-10T08:05:04.609 INFO:teuthology.orchestra.run.vm05.stdout:Removed:
2026-03-10T08:05:04.609 INFO:teuthology.orchestra.run.vm05.stdout: ceph-test-2:18.2.1-0.el9.x86_64 libxslt-1.1.34-12.el9.x86_64
2026-03-10T08:05:04.609 INFO:teuthology.orchestra.run.vm05.stdout: socat-1.7.4.1-8.el9.x86_64 xmlstarlet-1.6.1-20.el9.x86_64
2026-03-10T08:05:04.609 INFO:teuthology.orchestra.run.vm05.stdout:
2026-03-10T08:05:04.609 INFO:teuthology.orchestra.run.vm05.stdout:Complete!
2026-03-10T08:05:04.806 INFO:teuthology.orchestra.run.vm08.stdout:Dependencies resolved.
2026-03-10T08:05:04.807 INFO:teuthology.orchestra.run.vm08.stdout:================================================================================
2026-03-10T08:05:04.807 INFO:teuthology.orchestra.run.vm08.stdout: Package Arch Version Repository Size
2026-03-10T08:05:04.807 INFO:teuthology.orchestra.run.vm08.stdout:================================================================================
2026-03-10T08:05:04.807 INFO:teuthology.orchestra.run.vm08.stdout:Removing:
2026-03-10T08:05:04.807 INFO:teuthology.orchestra.run.vm08.stdout: ceph x86_64 2:18.2.1-0.el9 @ceph 0
2026-03-10T08:05:04.807 INFO:teuthology.orchestra.run.vm08.stdout:Removing unused dependencies:
2026-03-10T08:05:04.807 INFO:teuthology.orchestra.run.vm08.stdout: ceph-mds x86_64 2:18.2.1-0.el9 @ceph 6.5 M
2026-03-10T08:05:04.807 INFO:teuthology.orchestra.run.vm08.stdout: ceph-mon x86_64 2:18.2.1-0.el9 @ceph 20 M
2026-03-10T08:05:04.807 INFO:teuthology.orchestra.run.vm08.stdout: ceph-osd x86_64 2:18.2.1-0.el9 @ceph 61 M
2026-03-10T08:05:04.807 INFO:teuthology.orchestra.run.vm08.stdout: ledmon-libs x86_64 1.1.0-3.el9 @baseos 80 k
2026-03-10T08:05:04.807 INFO:teuthology.orchestra.run.vm08.stdout: libconfig x86_64 1.7.2-9.el9 @baseos 220 k
2026-03-10T08:05:04.807 INFO:teuthology.orchestra.run.vm08.stdout: libstoragemgmt x86_64 1.10.1-1.el9 @appstream 685 k
2026-03-10T08:05:04.807 INFO:teuthology.orchestra.run.vm08.stdout: python3-libstoragemgmt x86_64 1.10.1-1.el9 @appstream 832 k
2026-03-10T08:05:04.807 INFO:teuthology.orchestra.run.vm08.stdout:
2026-03-10T08:05:04.807 INFO:teuthology.orchestra.run.vm08.stdout:Transaction Summary
2026-03-10T08:05:04.807 INFO:teuthology.orchestra.run.vm08.stdout:================================================================================
2026-03-10T08:05:04.807 INFO:teuthology.orchestra.run.vm08.stdout:Remove 8 Packages
2026-03-10T08:05:04.807 INFO:teuthology.orchestra.run.vm08.stdout:
2026-03-10T08:05:04.807 INFO:teuthology.orchestra.run.vm08.stdout:Freed space: 89 M
2026-03-10T08:05:04.807 INFO:teuthology.orchestra.run.vm08.stdout:Running transaction check
2026-03-10T08:05:04.810 INFO:teuthology.orchestra.run.vm08.stdout:Transaction check succeeded.
2026-03-10T08:05:04.810 INFO:teuthology.orchestra.run.vm08.stdout:Running transaction test
2026-03-10T08:05:04.832 INFO:teuthology.orchestra.run.vm08.stdout:Transaction test succeeded.
2026-03-10T08:05:04.832 INFO:teuthology.orchestra.run.vm05.stdout:Dependencies resolved.
2026-03-10T08:05:04.832 INFO:teuthology.orchestra.run.vm08.stdout:Running transaction
2026-03-10T08:05:04.833 INFO:teuthology.orchestra.run.vm05.stdout:================================================================================
2026-03-10T08:05:04.833 INFO:teuthology.orchestra.run.vm05.stdout: Package Arch Version Repository Size
2026-03-10T08:05:04.833 INFO:teuthology.orchestra.run.vm05.stdout:================================================================================
2026-03-10T08:05:04.833 INFO:teuthology.orchestra.run.vm05.stdout:Removing:
2026-03-10T08:05:04.833 INFO:teuthology.orchestra.run.vm05.stdout: ceph x86_64 2:18.2.1-0.el9 @ceph 0
2026-03-10T08:05:04.833 INFO:teuthology.orchestra.run.vm05.stdout:Removing unused dependencies:
2026-03-10T08:05:04.833 INFO:teuthology.orchestra.run.vm05.stdout: ceph-mds x86_64 2:18.2.1-0.el9 @ceph 6.5 M
2026-03-10T08:05:04.833 INFO:teuthology.orchestra.run.vm05.stdout: ceph-mon x86_64 2:18.2.1-0.el9 @ceph 20 M
2026-03-10T08:05:04.833 INFO:teuthology.orchestra.run.vm05.stdout: ceph-osd x86_64 2:18.2.1-0.el9 @ceph 61 M
2026-03-10T08:05:04.833 INFO:teuthology.orchestra.run.vm05.stdout: ledmon-libs x86_64 1.1.0-3.el9 @baseos 80 k
2026-03-10T08:05:04.833 INFO:teuthology.orchestra.run.vm05.stdout: libconfig x86_64 1.7.2-9.el9 @baseos 220 k
2026-03-10T08:05:04.833 INFO:teuthology.orchestra.run.vm05.stdout: libstoragemgmt x86_64 1.10.1-1.el9 @appstream 685 k
2026-03-10T08:05:04.833 INFO:teuthology.orchestra.run.vm05.stdout: python3-libstoragemgmt x86_64 1.10.1-1.el9 @appstream 832 k
2026-03-10T08:05:04.833 INFO:teuthology.orchestra.run.vm05.stdout:
2026-03-10T08:05:04.833 INFO:teuthology.orchestra.run.vm05.stdout:Transaction Summary
2026-03-10T08:05:04.833 INFO:teuthology.orchestra.run.vm05.stdout:================================================================================
2026-03-10T08:05:04.833 INFO:teuthology.orchestra.run.vm05.stdout:Remove 8 Packages
2026-03-10T08:05:04.833 INFO:teuthology.orchestra.run.vm05.stdout:
2026-03-10T08:05:04.834 INFO:teuthology.orchestra.run.vm05.stdout:Freed space: 89 M
2026-03-10T08:05:04.834 INFO:teuthology.orchestra.run.vm05.stdout:Running transaction check
2026-03-10T08:05:04.836 INFO:teuthology.orchestra.run.vm05.stdout:Transaction check succeeded.
2026-03-10T08:05:04.836 INFO:teuthology.orchestra.run.vm05.stdout:Running transaction test
2026-03-10T08:05:04.861 INFO:teuthology.orchestra.run.vm05.stdout:Transaction test succeeded.
2026-03-10T08:05:04.861 INFO:teuthology.orchestra.run.vm05.stdout:Running transaction
2026-03-10T08:05:04.873 INFO:teuthology.orchestra.run.vm08.stdout: Preparing : 1/1
2026-03-10T08:05:04.875 INFO:teuthology.orchestra.run.vm08.stdout: Erasing : ceph-2:18.2.1-0.el9.x86_64 1/8
2026-03-10T08:05:04.897 INFO:teuthology.orchestra.run.vm08.stdout: Running scriptlet: ceph-osd-2:18.2.1-0.el9.x86_64 2/8
2026-03-10T08:05:04.897 INFO:teuthology.orchestra.run.vm08.stdout:Glob pattern passed to enable, but globs are not supported for this.
2026-03-10T08:05:04.897 INFO:teuthology.orchestra.run.vm08.stdout:Invalid unit name "ceph-osd@*.service" escaped as "ceph-osd@\x2a.service".
2026-03-10T08:05:04.897 INFO:teuthology.orchestra.run.vm08.stdout:Removed "/etc/systemd/system/multi-user.target.wants/ceph-osd.target".
2026-03-10T08:05:04.897 INFO:teuthology.orchestra.run.vm08.stdout:Removed "/etc/systemd/system/ceph.target.wants/ceph-osd.target".
2026-03-10T08:05:04.897 INFO:teuthology.orchestra.run.vm08.stdout:
2026-03-10T08:05:04.900 INFO:teuthology.orchestra.run.vm08.stdout: Erasing : ceph-osd-2:18.2.1-0.el9.x86_64 2/8
2026-03-10T08:05:04.901 INFO:teuthology.orchestra.run.vm05.stdout: Preparing : 1/1
2026-03-10T08:05:04.903 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : ceph-2:18.2.1-0.el9.x86_64 1/8
2026-03-10T08:05:04.910 INFO:teuthology.orchestra.run.vm08.stdout: Running scriptlet: ceph-osd-2:18.2.1-0.el9.x86_64 2/8
2026-03-10T08:05:04.924 INFO:teuthology.orchestra.run.vm05.stdout: Running scriptlet: ceph-osd-2:18.2.1-0.el9.x86_64 2/8
2026-03-10T08:05:04.924 INFO:teuthology.orchestra.run.vm05.stdout:Glob pattern passed to enable, but globs are not supported for this.
2026-03-10T08:05:04.924 INFO:teuthology.orchestra.run.vm05.stdout:Invalid unit name "ceph-osd@*.service" escaped as "ceph-osd@\x2a.service".
2026-03-10T08:05:04.924 INFO:teuthology.orchestra.run.vm05.stdout:Removed "/etc/systemd/system/multi-user.target.wants/ceph-osd.target".
2026-03-10T08:05:04.924 INFO:teuthology.orchestra.run.vm05.stdout:Removed "/etc/systemd/system/ceph.target.wants/ceph-osd.target".
2026-03-10T08:05:04.924 INFO:teuthology.orchestra.run.vm05.stdout:
2026-03-10T08:05:04.925 INFO:teuthology.orchestra.run.vm08.stdout: Running scriptlet: libstoragemgmt-1.10.1-1.el9.x86_64 3/8
2026-03-10T08:05:04.925 INFO:teuthology.orchestra.run.vm08.stdout:Removed "/etc/systemd/system/multi-user.target.wants/libstoragemgmt.service".
2026-03-10T08:05:04.925 INFO:teuthology.orchestra.run.vm08.stdout:
2026-03-10T08:05:04.927 INFO:teuthology.orchestra.run.vm08.stdout: Erasing : libstoragemgmt-1.10.1-1.el9.x86_64 3/8
2026-03-10T08:05:04.927 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : ceph-osd-2:18.2.1-0.el9.x86_64 2/8
2026-03-10T08:05:04.936 INFO:teuthology.orchestra.run.vm05.stdout: Running scriptlet: ceph-osd-2:18.2.1-0.el9.x86_64 2/8
2026-03-10T08:05:04.947 INFO:teuthology.orchestra.run.vm08.stdout: Running scriptlet: libstoragemgmt-1.10.1-1.el9.x86_64 3/8
2026-03-10T08:05:04.950 INFO:teuthology.orchestra.run.vm05.stdout: Running scriptlet: libstoragemgmt-1.10.1-1.el9.x86_64 3/8
2026-03-10T08:05:04.950 INFO:teuthology.orchestra.run.vm05.stdout:Removed "/etc/systemd/system/multi-user.target.wants/libstoragemgmt.service".
2026-03-10T08:05:04.950 INFO:teuthology.orchestra.run.vm05.stdout:
2026-03-10T08:05:04.950 INFO:teuthology.orchestra.run.vm08.stdout: Erasing : python3-libstoragemgmt-1.10.1-1.el9.x86_64 4/8
2026-03-10T08:05:04.951 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : libstoragemgmt-1.10.1-1.el9.x86_64 3/8
2026-03-10T08:05:04.953 INFO:teuthology.orchestra.run.vm08.stdout: Erasing : ledmon-libs-1.1.0-3.el9.x86_64 5/8
2026-03-10T08:05:04.954 INFO:teuthology.orchestra.run.vm08.stdout: Erasing : libconfig-1.7.2-9.el9.x86_64 6/8
2026-03-10T08:05:04.974 INFO:teuthology.orchestra.run.vm05.stdout: Running scriptlet: libstoragemgmt-1.10.1-1.el9.x86_64 3/8
2026-03-10T08:05:04.975 INFO:teuthology.orchestra.run.vm08.stdout: Running scriptlet: ceph-mds-2:18.2.1-0.el9.x86_64 7/8
2026-03-10T08:05:04.975 INFO:teuthology.orchestra.run.vm08.stdout:Glob pattern passed to enable, but globs are not supported for this.
2026-03-10T08:05:04.975 INFO:teuthology.orchestra.run.vm08.stdout:Invalid unit name "ceph-mds@*.service" escaped as "ceph-mds@\x2a.service".
2026-03-10T08:05:04.975 INFO:teuthology.orchestra.run.vm08.stdout:Removed "/etc/systemd/system/multi-user.target.wants/ceph-mds.target".
2026-03-10T08:05:04.975 INFO:teuthology.orchestra.run.vm08.stdout:Removed "/etc/systemd/system/ceph.target.wants/ceph-mds.target".
2026-03-10T08:05:04.975 INFO:teuthology.orchestra.run.vm08.stdout:
2026-03-10T08:05:04.976 INFO:teuthology.orchestra.run.vm08.stdout: Erasing : ceph-mds-2:18.2.1-0.el9.x86_64 7/8
2026-03-10T08:05:04.977 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : python3-libstoragemgmt-1.10.1-1.el9.x86_64 4/8
2026-03-10T08:05:04.979 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : ledmon-libs-1.1.0-3.el9.x86_64 5/8
2026-03-10T08:05:04.981 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : libconfig-1.7.2-9.el9.x86_64 6/8
2026-03-10T08:05:04.985 INFO:teuthology.orchestra.run.vm08.stdout: Running scriptlet: ceph-mds-2:18.2.1-0.el9.x86_64 7/8
2026-03-10T08:05:05.004 INFO:teuthology.orchestra.run.vm05.stdout: Running scriptlet: ceph-mds-2:18.2.1-0.el9.x86_64 7/8
2026-03-10T08:05:05.004 INFO:teuthology.orchestra.run.vm05.stdout:Glob pattern passed to enable, but globs are not supported for this.
2026-03-10T08:05:05.004 INFO:teuthology.orchestra.run.vm05.stdout:Invalid unit name "ceph-mds@*.service" escaped as "ceph-mds@\x2a.service".
2026-03-10T08:05:05.004 INFO:teuthology.orchestra.run.vm05.stdout:Removed "/etc/systemd/system/multi-user.target.wants/ceph-mds.target".
2026-03-10T08:05:05.004 INFO:teuthology.orchestra.run.vm05.stdout:Removed "/etc/systemd/system/ceph.target.wants/ceph-mds.target".
2026-03-10T08:05:05.004 INFO:teuthology.orchestra.run.vm05.stdout:
2026-03-10T08:05:05.004 INFO:teuthology.orchestra.run.vm08.stdout: Running scriptlet: ceph-mon-2:18.2.1-0.el9.x86_64 8/8
2026-03-10T08:05:05.004 INFO:teuthology.orchestra.run.vm08.stdout:Glob pattern passed to enable, but globs are not supported for this.
2026-03-10T08:05:05.004 INFO:teuthology.orchestra.run.vm08.stdout:Invalid unit name "ceph-mon@*.service" escaped as "ceph-mon@\x2a.service".
2026-03-10T08:05:05.005 INFO:teuthology.orchestra.run.vm08.stdout:Removed "/etc/systemd/system/multi-user.target.wants/ceph-mon.target".
2026-03-10T08:05:05.005 INFO:teuthology.orchestra.run.vm08.stdout:Removed "/etc/systemd/system/ceph.target.wants/ceph-mon.target".
2026-03-10T08:05:05.005 INFO:teuthology.orchestra.run.vm08.stdout:
2026-03-10T08:05:05.005 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : ceph-mds-2:18.2.1-0.el9.x86_64 7/8
2026-03-10T08:05:05.005 INFO:teuthology.orchestra.run.vm08.stdout: Erasing : ceph-mon-2:18.2.1-0.el9.x86_64 8/8
2026-03-10T08:05:05.033 INFO:teuthology.orchestra.run.vm05.stdout: Running scriptlet: ceph-mds-2:18.2.1-0.el9.x86_64 7/8
2026-03-10T08:05:05.057 INFO:teuthology.orchestra.run.vm05.stdout: Running scriptlet: ceph-mon-2:18.2.1-0.el9.x86_64 8/8
2026-03-10T08:05:05.057 INFO:teuthology.orchestra.run.vm05.stdout:Glob pattern passed to enable, but globs are not supported for this.
2026-03-10T08:05:05.057 INFO:teuthology.orchestra.run.vm05.stdout:Invalid unit name "ceph-mon@*.service" escaped as "ceph-mon@\x2a.service".
2026-03-10T08:05:05.057 INFO:teuthology.orchestra.run.vm05.stdout:Removed "/etc/systemd/system/multi-user.target.wants/ceph-mon.target".
2026-03-10T08:05:05.057 INFO:teuthology.orchestra.run.vm05.stdout:Removed "/etc/systemd/system/ceph.target.wants/ceph-mon.target".
2026-03-10T08:05:05.057 INFO:teuthology.orchestra.run.vm05.stdout:
2026-03-10T08:05:05.058 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : ceph-mon-2:18.2.1-0.el9.x86_64 8/8
2026-03-10T08:05:05.121 INFO:teuthology.orchestra.run.vm08.stdout: Running scriptlet: ceph-mon-2:18.2.1-0.el9.x86_64 8/8
2026-03-10T08:05:05.121 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : ceph-2:18.2.1-0.el9.x86_64 1/8
2026-03-10T08:05:05.121 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : ceph-mds-2:18.2.1-0.el9.x86_64 2/8
2026-03-10T08:05:05.121 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : ceph-mon-2:18.2.1-0.el9.x86_64 3/8
2026-03-10T08:05:05.121 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : ceph-osd-2:18.2.1-0.el9.x86_64 4/8
2026-03-10T08:05:05.121 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : ledmon-libs-1.1.0-3.el9.x86_64 5/8
2026-03-10T08:05:05.121 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : libconfig-1.7.2-9.el9.x86_64 6/8
2026-03-10T08:05:05.121 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : libstoragemgmt-1.10.1-1.el9.x86_64 7/8
2026-03-10T08:05:05.148 INFO:teuthology.orchestra.run.vm05.stdout: Running scriptlet: ceph-mon-2:18.2.1-0.el9.x86_64 8/8
2026-03-10T08:05:05.148 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : ceph-2:18.2.1-0.el9.x86_64 1/8
2026-03-10T08:05:05.148 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : ceph-mds-2:18.2.1-0.el9.x86_64 2/8
2026-03-10T08:05:05.148 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : ceph-mon-2:18.2.1-0.el9.x86_64 3/8
2026-03-10T08:05:05.148 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : ceph-osd-2:18.2.1-0.el9.x86_64 4/8
2026-03-10T08:05:05.148 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : ledmon-libs-1.1.0-3.el9.x86_64 5/8
2026-03-10T08:05:05.148 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : libconfig-1.7.2-9.el9.x86_64 6/8
2026-03-10T08:05:05.148 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : libstoragemgmt-1.10.1-1.el9.x86_64 7/8
2026-03-10T08:05:05.179 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-libstoragemgmt-1.10.1-1.el9.x86_64 8/8
2026-03-10T08:05:05.179 INFO:teuthology.orchestra.run.vm08.stdout:
2026-03-10T08:05:05.179 INFO:teuthology.orchestra.run.vm08.stdout:Removed:
2026-03-10T08:05:05.179 INFO:teuthology.orchestra.run.vm08.stdout: ceph-2:18.2.1-0.el9.x86_64 ceph-mds-2:18.2.1-0.el9.x86_64
2026-03-10T08:05:05.179 INFO:teuthology.orchestra.run.vm08.stdout: ceph-mon-2:18.2.1-0.el9.x86_64 ceph-osd-2:18.2.1-0.el9.x86_64
2026-03-10T08:05:05.179 INFO:teuthology.orchestra.run.vm08.stdout: ledmon-libs-1.1.0-3.el9.x86_64 libconfig-1.7.2-9.el9.x86_64
2026-03-10T08:05:05.179 INFO:teuthology.orchestra.run.vm08.stdout: libstoragemgmt-1.10.1-1.el9.x86_64 python3-libstoragemgmt-1.10.1-1.el9.x86_64
2026-03-10T08:05:05.179 INFO:teuthology.orchestra.run.vm08.stdout:
2026-03-10T08:05:05.179 INFO:teuthology.orchestra.run.vm08.stdout:Complete!
2026-03-10T08:05:05.207 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-libstoragemgmt-1.10.1-1.el9.x86_64 8/8
2026-03-10T08:05:05.207 INFO:teuthology.orchestra.run.vm05.stdout:
2026-03-10T08:05:05.207 INFO:teuthology.orchestra.run.vm05.stdout:Removed:
2026-03-10T08:05:05.207 INFO:teuthology.orchestra.run.vm05.stdout: ceph-2:18.2.1-0.el9.x86_64 ceph-mds-2:18.2.1-0.el9.x86_64
2026-03-10T08:05:05.207 INFO:teuthology.orchestra.run.vm05.stdout: ceph-mon-2:18.2.1-0.el9.x86_64 ceph-osd-2:18.2.1-0.el9.x86_64
2026-03-10T08:05:05.207 INFO:teuthology.orchestra.run.vm05.stdout: ledmon-libs-1.1.0-3.el9.x86_64 libconfig-1.7.2-9.el9.x86_64
2026-03-10T08:05:05.207 INFO:teuthology.orchestra.run.vm05.stdout: libstoragemgmt-1.10.1-1.el9.x86_64 python3-libstoragemgmt-1.10.1-1.el9.x86_64
2026-03-10T08:05:05.207 INFO:teuthology.orchestra.run.vm05.stdout:
2026-03-10T08:05:05.207 INFO:teuthology.orchestra.run.vm05.stdout:Complete!
2026-03-10T08:05:05.395 INFO:teuthology.orchestra.run.vm08.stdout:Dependencies resolved.
2026-03-10T08:05:05.400 INFO:teuthology.orchestra.run.vm08.stdout:================================================================================
2026-03-10T08:05:05.400 INFO:teuthology.orchestra.run.vm08.stdout: Package Arch Version Repository Size
2026-03-10T08:05:05.400 INFO:teuthology.orchestra.run.vm08.stdout:================================================================================
2026-03-10T08:05:05.400 INFO:teuthology.orchestra.run.vm08.stdout:Removing:
2026-03-10T08:05:05.400 INFO:teuthology.orchestra.run.vm08.stdout: ceph-base x86_64 2:18.2.1-0.el9 @ceph 22 M
2026-03-10T08:05:05.400 INFO:teuthology.orchestra.run.vm08.stdout:Removing dependent packages:
2026-03-10T08:05:05.400 INFO:teuthology.orchestra.run.vm08.stdout: ceph-immutable-object-cache x86_64 2:18.2.1-0.el9 @ceph 395 k
2026-03-10T08:05:05.400 INFO:teuthology.orchestra.run.vm08.stdout: ceph-mgr x86_64 2:18.2.1-0.el9 @ceph 4.5 M
2026-03-10T08:05:05.400 INFO:teuthology.orchestra.run.vm08.stdout: ceph-mgr-cephadm noarch 2:18.2.1-0.el9 @ceph-noarch 678 k
2026-03-10T08:05:05.400 INFO:teuthology.orchestra.run.vm08.stdout: ceph-mgr-dashboard noarch 2:18.2.1-0.el9 @ceph-noarch 7.6 M
2026-03-10T08:05:05.400 INFO:teuthology.orchestra.run.vm08.stdout: ceph-mgr-diskprediction-local noarch 2:18.2.1-0.el9 @ceph-noarch 66 M
2026-03-10T08:05:05.400 INFO:teuthology.orchestra.run.vm08.stdout: ceph-mgr-rook noarch 2:18.2.1-0.el9 @ceph-noarch 574 k
2026-03-10T08:05:05.400 INFO:teuthology.orchestra.run.vm08.stdout: rbd-mirror x86_64 2:18.2.1-0.el9 @ceph 12 M
2026-03-10T08:05:05.400 INFO:teuthology.orchestra.run.vm08.stdout:Removing unused dependencies:
2026-03-10T08:05:05.400 INFO:teuthology.orchestra.run.vm08.stdout: ceph-common x86_64 2:18.2.1-0.el9 @ceph 70 M
2026-03-10T08:05:05.400 INFO:teuthology.orchestra.run.vm08.stdout: ceph-grafana-dashboards noarch 2:18.2.1-0.el9 @ceph-noarch 319 k
2026-03-10T08:05:05.400 INFO:teuthology.orchestra.run.vm08.stdout: ceph-mgr-modules-core noarch 2:18.2.1-0.el9 @ceph-noarch 1.4 M
2026-03-10T08:05:05.400 INFO:teuthology.orchestra.run.vm08.stdout: ceph-prometheus-alerts noarch 2:18.2.1-0.el9 @ceph-noarch 40 k
2026-03-10T08:05:05.400 INFO:teuthology.orchestra.run.vm08.stdout: ceph-selinux x86_64 2:18.2.1-0.el9 @ceph 138 k
2026-03-10T08:05:05.400 INFO:teuthology.orchestra.run.vm08.stdout: flexiblas x86_64 3.0.4-9.el9 @appstream 68 k
2026-03-10T08:05:05.400 INFO:teuthology.orchestra.run.vm08.stdout: flexiblas-netlib x86_64 3.0.4-9.el9 @appstream 11 M
2026-03-10T08:05:05.400 INFO:teuthology.orchestra.run.vm08.stdout: flexiblas-openblas-openmp x86_64 3.0.4-9.el9 @appstream 39 k
2026-03-10T08:05:05.400 INFO:teuthology.orchestra.run.vm08.stdout: fmt x86_64 8.1.1-5.el9 @epel 337 k
2026-03-10T08:05:05.400 INFO:teuthology.orchestra.run.vm08.stdout: libcephsqlite x86_64 2:18.2.1-0.el9 @ceph 434 k
2026-03-10T08:05:05.400 INFO:teuthology.orchestra.run.vm08.stdout: libgfortran x86_64 11.5.0-14.el9 @baseos 2.8 M
2026-03-10T08:05:05.401 INFO:teuthology.orchestra.run.vm08.stdout: liboath x86_64 2.6.12-1.el9 @epel 94 k
2026-03-10T08:05:05.401 INFO:teuthology.orchestra.run.vm08.stdout: libquadmath x86_64 11.5.0-14.el9 @baseos 330 k
2026-03-10T08:05:05.401 INFO:teuthology.orchestra.run.vm08.stdout: libradosstriper1 x86_64 2:18.2.1-0.el9 @ceph 1.5 M
2026-03-10T08:05:05.401 INFO:teuthology.orchestra.run.vm08.stdout: openblas x86_64 0.3.29-1.el9 @appstream 112 k
2026-03-10T08:05:05.401 INFO:teuthology.orchestra.run.vm08.stdout: openblas-openmp x86_64 0.3.29-1.el9 @appstream 46 M
2026-03-10T08:05:05.401 INFO:teuthology.orchestra.run.vm08.stdout: python3-asyncssh noarch 2.13.2-5.el9 @epel 3.9 M
2026-03-10T08:05:05.401 INFO:teuthology.orchestra.run.vm08.stdout: python3-autocommand noarch 2.2.2-8.el9 @epel 82 k
2026-03-10T08:05:05.401 INFO:teuthology.orchestra.run.vm08.stdout: python3-babel noarch 2.9.1-2.el9 @appstream 27 M
2026-03-10T08:05:05.401 INFO:teuthology.orchestra.run.vm08.stdout: python3-backports-tarfile noarch 1.2.0-1.el9 @epel 254 k
2026-03-10T08:05:05.401 INFO:teuthology.orchestra.run.vm08.stdout: python3-bcrypt x86_64 3.2.2-1.el9 @epel 87 k
2026-03-10T08:05:05.401 INFO:teuthology.orchestra.run.vm08.stdout: python3-cachetools noarch 4.2.4-1.el9 @epel 93 k
2026-03-10T08:05:05.401 INFO:teuthology.orchestra.run.vm08.stdout: python3-ceph-common x86_64 2:18.2.1-0.el9 @ceph 610 k
2026-03-10T08:05:05.401 INFO:teuthology.orchestra.run.vm08.stdout: python3-certifi noarch 2023.05.07-4.el9 @epel 6.3 k
2026-03-10T08:05:05.401 INFO:teuthology.orchestra.run.vm08.stdout: python3-cffi x86_64 1.14.5-5.el9 @baseos 1.0 M
2026-03-10T08:05:05.401 INFO:teuthology.orchestra.run.vm08.stdout: python3-chardet noarch 4.0.0-5.el9 @anaconda 1.4 M
2026-03-10T08:05:05.401 INFO:teuthology.orchestra.run.vm08.stdout: python3-cheroot noarch 10.0.1-4.el9 @epel 682 k
2026-03-10T08:05:05.401 INFO:teuthology.orchestra.run.vm08.stdout: python3-cherrypy noarch 18.6.1-2.el9 @epel 1.1 M
2026-03-10T08:05:05.401 INFO:teuthology.orchestra.run.vm08.stdout: python3-cryptography x86_64 36.0.1-5.el9 @baseos 4.5 M
2026-03-10T08:05:05.401 INFO:teuthology.orchestra.run.vm08.stdout: python3-devel x86_64 3.9.25-3.el9 @appstream 765 k
2026-03-10T08:05:05.401 INFO:teuthology.orchestra.run.vm08.stdout: python3-google-auth noarch 1:2.45.0-1.el9 @epel 1.4 M
2026-03-10T08:05:05.401 INFO:teuthology.orchestra.run.vm08.stdout: python3-idna noarch 2.10-7.el9.1 @anaconda 513 k
2026-03-10T08:05:05.401 INFO:teuthology.orchestra.run.vm08.stdout: python3-jaraco noarch 8.2.1-3.el9 @epel 3.7 k
2026-03-10T08:05:05.401 INFO:teuthology.orchestra.run.vm08.stdout: python3-jaraco-classes noarch 3.2.1-5.el9 @epel 24 k
2026-03-10T08:05:05.401 INFO:teuthology.orchestra.run.vm08.stdout: python3-jaraco-collections noarch 3.0.0-8.el9 @epel 55 k
2026-03-10T08:05:05.401 INFO:teuthology.orchestra.run.vm08.stdout: python3-jaraco-context noarch 6.0.1-3.el9 @epel 31 k
2026-03-10T08:05:05.401 INFO:teuthology.orchestra.run.vm08.stdout: python3-jaraco-functools noarch 3.5.0-2.el9 @epel 33 k
2026-03-10T08:05:05.401 INFO:teuthology.orchestra.run.vm08.stdout: python3-jaraco-text noarch 4.0.0-2.el9 @epel 51 k
2026-03-10T08:05:05.401 INFO:teuthology.orchestra.run.vm08.stdout: python3-jinja2 noarch 2.11.3-8.el9 @appstream 1.1 M
2026-03-10T08:05:05.401 INFO:teuthology.orchestra.run.vm08.stdout: python3-jsonpatch noarch 1.21-16.el9 @koji-override-0 55 k
2026-03-10T08:05:05.401 INFO:teuthology.orchestra.run.vm08.stdout: python3-jsonpointer noarch 2.0-4.el9 @koji-override-0 34 k
2026-03-10T08:05:05.401 INFO:teuthology.orchestra.run.vm08.stdout: python3-jwt noarch 2.4.0-1.el9 @epel 103 k
2026-03-10T08:05:05.401 INFO:teuthology.orchestra.run.vm08.stdout: python3-jwt+crypto noarch 2.4.0-1.el9 @epel 5.4 k
2026-03-10T08:05:05.401 INFO:teuthology.orchestra.run.vm08.stdout: python3-kubernetes noarch 1:26.1.0-3.el9 @epel 21 M
2026-03-10T08:05:05.401 INFO:teuthology.orchestra.run.vm08.stdout: python3-logutils noarch 0.3.5-21.el9 @epel 126 k
2026-03-10T08:05:05.401 INFO:teuthology.orchestra.run.vm08.stdout: python3-mako noarch 1.1.4-6.el9 @appstream 534 k
2026-03-10T08:05:05.401 INFO:teuthology.orchestra.run.vm08.stdout: python3-markupsafe x86_64 1.1.1-12.el9 @appstream 60 k
2026-03-10T08:05:05.401 INFO:teuthology.orchestra.run.vm08.stdout: python3-more-itertools noarch 8.12.0-2.el9 @epel 378 k
2026-03-10T08:05:05.401 INFO:teuthology.orchestra.run.vm08.stdout: python3-natsort noarch 7.1.1-5.el9 @epel 215 k
2026-03-10T08:05:05.401 INFO:teuthology.orchestra.run.vm08.stdout: python3-numpy x86_64 1:1.23.5-2.el9 @appstream 30 M
2026-03-10T08:05:05.401 INFO:teuthology.orchestra.run.vm08.stdout: python3-numpy-f2py x86_64 1:1.23.5-2.el9 @appstream 1.7 M
2026-03-10T08:05:05.401 INFO:teuthology.orchestra.run.vm08.stdout: python3-oauthlib noarch 3.1.1-5.el9 @koji-override-0 888 k
2026-03-10T08:05:05.401 INFO:teuthology.orchestra.run.vm08.stdout: python3-pecan noarch 1.4.2-3.el9 @epel 1.3 M
2026-03-10T08:05:05.401 INFO:teuthology.orchestra.run.vm08.stdout: python3-ply noarch 3.11-14.el9 @baseos 430 k
2026-03-10T08:05:05.401 INFO:teuthology.orchestra.run.vm08.stdout: python3-portend noarch 3.1.0-2.el9 @epel 20 k
2026-03-10T08:05:05.401 INFO:teuthology.orchestra.run.vm08.stdout: python3-prettytable noarch 0.7.2-27.el9 @koji-override-0 166 k
2026-03-10T08:05:05.401 INFO:teuthology.orchestra.run.vm08.stdout: python3-pyOpenSSL noarch 21.0.0-1.el9 @epel 389 k
2026-03-10T08:05:05.401 INFO:teuthology.orchestra.run.vm08.stdout: python3-pyasn1 noarch 0.4.8-7.el9 @appstream 622 k
2026-03-10T08:05:05.401 INFO:teuthology.orchestra.run.vm08.stdout: python3-pyasn1-modules noarch 0.4.8-7.el9 @appstream 1.0 M
2026-03-10T08:05:05.401 INFO:teuthology.orchestra.run.vm08.stdout: python3-pycparser noarch 2.20-6.el9 @baseos 745 k
2026-03-10T08:05:05.401 INFO:teuthology.orchestra.run.vm08.stdout: python3-pysocks noarch 1.7.1-12.el9 @anaconda 88 k
2026-03-10T08:05:05.401 INFO:teuthology.orchestra.run.vm08.stdout: python3-pytz noarch 2021.1-5.el9 @koji-override-0 176 k
2026-03-10T08:05:05.401 INFO:teuthology.orchestra.run.vm08.stdout: python3-repoze-lru noarch 0.7-16.el9 @epel 83 k
2026-03-10T08:05:05.401 INFO:teuthology.orchestra.run.vm08.stdout: python3-requests noarch 2.25.1-10.el9 @baseos 405 k
2026-03-10T08:05:05.401 INFO:teuthology.orchestra.run.vm08.stdout: python3-requests-oauthlib noarch 1.3.0-12.el9 @appstream 119 k
2026-03-10T08:05:05.401 INFO:teuthology.orchestra.run.vm08.stdout: python3-routes noarch 2.5.1-5.el9 @epel 459 k
2026-03-10T08:05:05.401 INFO:teuthology.orchestra.run.vm08.stdout: python3-rsa noarch 4.9-2.el9 @epel 202 k
2026-03-10T08:05:05.401 INFO:teuthology.orchestra.run.vm08.stdout: python3-scipy x86_64 1.9.3-2.el9 @appstream 76 M
2026-03-10T08:05:05.401 INFO:teuthology.orchestra.run.vm08.stdout: python3-tempora noarch 5.0.0-2.el9 @epel 96 k
2026-03-10T08:05:05.401 INFO:teuthology.orchestra.run.vm08.stdout: python3-toml noarch 0.10.2-6.el9 @appstream 99 k
2026-03-10T08:05:05.401 INFO:teuthology.orchestra.run.vm08.stdout: python3-typing-extensions noarch 4.15.0-1.el9 @epel 447 k
2026-03-10T08:05:05.401 INFO:teuthology.orchestra.run.vm08.stdout: python3-urllib3 noarch 1.26.5-7.el9 @baseos 746 k
2026-03-10T08:05:05.402 INFO:teuthology.orchestra.run.vm08.stdout: python3-webob noarch 1.8.8-2.el9 @epel 1.2 M
2026-03-10T08:05:05.402 INFO:teuthology.orchestra.run.vm08.stdout: python3-websocket-client noarch 1.2.3-2.el9 @epel 319 k
2026-03-10T08:05:05.402 INFO:teuthology.orchestra.run.vm08.stdout: python3-werkzeug noarch 2.0.3-3.el9.1 @epel 1.9 M
2026-03-10T08:05:05.402 INFO:teuthology.orchestra.run.vm08.stdout: python3-zc-lockfile noarch 2.0-10.el9 @epel 35 k
2026-03-10T08:05:05.402 INFO:teuthology.orchestra.run.vm08.stdout:
2026-03-10T08:05:05.402 INFO:teuthology.orchestra.run.vm08.stdout:Transaction Summary
2026-03-10T08:05:05.402 INFO:teuthology.orchestra.run.vm08.stdout:================================================================================
2026-03-10T08:05:05.402 INFO:teuthology.orchestra.run.vm08.stdout:Remove 84 Packages
2026-03-10T08:05:05.402 INFO:teuthology.orchestra.run.vm08.stdout:
2026-03-10T08:05:05.402 INFO:teuthology.orchestra.run.vm08.stdout:Freed space: 434 M
2026-03-10T08:05:05.402 INFO:teuthology.orchestra.run.vm08.stdout:Running transaction check
2026-03-10T08:05:05.426 INFO:teuthology.orchestra.run.vm08.stdout:Transaction check succeeded.
2026-03-10T08:05:05.426 INFO:teuthology.orchestra.run.vm08.stdout:Running transaction test
2026-03-10T08:05:05.434 INFO:teuthology.orchestra.run.vm05.stdout:Dependencies resolved.
2026-03-10T08:05:05.439 INFO:teuthology.orchestra.run.vm05.stdout:================================================================================
2026-03-10T08:05:05.440 INFO:teuthology.orchestra.run.vm05.stdout: Package Arch Version Repository Size
2026-03-10T08:05:05.440 INFO:teuthology.orchestra.run.vm05.stdout:================================================================================
2026-03-10T08:05:05.440 INFO:teuthology.orchestra.run.vm05.stdout:Removing:
2026-03-10T08:05:05.440 INFO:teuthology.orchestra.run.vm05.stdout: ceph-base x86_64 2:18.2.1-0.el9 @ceph 22 M
2026-03-10T08:05:05.440 INFO:teuthology.orchestra.run.vm05.stdout:Removing dependent packages:
2026-03-10T08:05:05.440 INFO:teuthology.orchestra.run.vm05.stdout: ceph-immutable-object-cache x86_64 2:18.2.1-0.el9 @ceph 395 k
2026-03-10T08:05:05.440 INFO:teuthology.orchestra.run.vm05.stdout: ceph-mgr x86_64 2:18.2.1-0.el9 @ceph 4.5 M
2026-03-10T08:05:05.440 INFO:teuthology.orchestra.run.vm05.stdout: ceph-mgr-cephadm noarch 2:18.2.1-0.el9 @ceph-noarch 678 k
2026-03-10T08:05:05.440 INFO:teuthology.orchestra.run.vm05.stdout: ceph-mgr-dashboard noarch 2:18.2.1-0.el9 @ceph-noarch 7.6 M
2026-03-10T08:05:05.440 INFO:teuthology.orchestra.run.vm05.stdout: ceph-mgr-diskprediction-local noarch 2:18.2.1-0.el9 @ceph-noarch 66 M
2026-03-10T08:05:05.440 INFO:teuthology.orchestra.run.vm05.stdout: ceph-mgr-rook noarch 2:18.2.1-0.el9 @ceph-noarch 574 k
2026-03-10T08:05:05.440 INFO:teuthology.orchestra.run.vm05.stdout: rbd-mirror x86_64 2:18.2.1-0.el9 @ceph 12 M
2026-03-10T08:05:05.440 INFO:teuthology.orchestra.run.vm05.stdout:Removing unused dependencies:
2026-03-10T08:05:05.440 INFO:teuthology.orchestra.run.vm05.stdout: ceph-common x86_64 2:18.2.1-0.el9 @ceph 70 M
2026-03-10T08:05:05.440 INFO:teuthology.orchestra.run.vm05.stdout: ceph-grafana-dashboards noarch 2:18.2.1-0.el9 @ceph-noarch 319 k
2026-03-10T08:05:05.440 INFO:teuthology.orchestra.run.vm05.stdout: ceph-mgr-modules-core noarch 2:18.2.1-0.el9 @ceph-noarch 1.4 M
2026-03-10T08:05:05.440 INFO:teuthology.orchestra.run.vm05.stdout: ceph-prometheus-alerts noarch 2:18.2.1-0.el9 @ceph-noarch 40 k
2026-03-10T08:05:05.440 INFO:teuthology.orchestra.run.vm05.stdout: ceph-selinux x86_64 2:18.2.1-0.el9 @ceph 138 k
2026-03-10T08:05:05.440 INFO:teuthology.orchestra.run.vm05.stdout: flexiblas x86_64 3.0.4-9.el9 @appstream 68 k
2026-03-10T08:05:05.440 INFO:teuthology.orchestra.run.vm05.stdout: flexiblas-netlib x86_64 3.0.4-9.el9 @appstream 11 M
2026-03-10T08:05:05.440 INFO:teuthology.orchestra.run.vm05.stdout: flexiblas-openblas-openmp x86_64 3.0.4-9.el9 @appstream 39 k
2026-03-10T08:05:05.440 INFO:teuthology.orchestra.run.vm05.stdout: fmt x86_64 8.1.1-5.el9 @epel 337 k
2026-03-10T08:05:05.440 INFO:teuthology.orchestra.run.vm05.stdout: libcephsqlite x86_64 2:18.2.1-0.el9 @ceph 434 k
2026-03-10T08:05:05.440 INFO:teuthology.orchestra.run.vm05.stdout: libgfortran x86_64 11.5.0-14.el9 @baseos 2.8 M
2026-03-10T08:05:05.440 INFO:teuthology.orchestra.run.vm05.stdout: liboath x86_64 2.6.12-1.el9 @epel 94 k
2026-03-10T08:05:05.440 INFO:teuthology.orchestra.run.vm05.stdout: libquadmath x86_64 11.5.0-14.el9 @baseos 330 k
2026-03-10T08:05:05.440 INFO:teuthology.orchestra.run.vm05.stdout: libradosstriper1 x86_64 2:18.2.1-0.el9 @ceph 1.5 M
2026-03-10T08:05:05.440 INFO:teuthology.orchestra.run.vm05.stdout: openblas x86_64 0.3.29-1.el9 @appstream 112 k
2026-03-10T08:05:05.440 INFO:teuthology.orchestra.run.vm05.stdout: openblas-openmp x86_64 0.3.29-1.el9 @appstream 46 M
2026-03-10T08:05:05.440 INFO:teuthology.orchestra.run.vm05.stdout: python3-asyncssh noarch 2.13.2-5.el9 @epel 3.9 M
2026-03-10T08:05:05.440 INFO:teuthology.orchestra.run.vm05.stdout: python3-autocommand noarch 2.2.2-8.el9 @epel 82 k
2026-03-10T08:05:05.440 INFO:teuthology.orchestra.run.vm05.stdout: python3-babel noarch 2.9.1-2.el9 @appstream 27 M
2026-03-10T08:05:05.440 INFO:teuthology.orchestra.run.vm05.stdout: python3-backports-tarfile noarch 1.2.0-1.el9 @epel 254 k
2026-03-10T08:05:05.440 INFO:teuthology.orchestra.run.vm05.stdout: python3-bcrypt x86_64 3.2.2-1.el9 @epel 87 k
2026-03-10T08:05:05.440 INFO:teuthology.orchestra.run.vm05.stdout: python3-cachetools noarch 4.2.4-1.el9 @epel 93 k
2026-03-10T08:05:05.440 INFO:teuthology.orchestra.run.vm05.stdout: python3-ceph-common x86_64 2:18.2.1-0.el9 @ceph 610 k
2026-03-10T08:05:05.440 INFO:teuthology.orchestra.run.vm05.stdout: python3-certifi noarch 2023.05.07-4.el9 @epel 6.3 k
2026-03-10T08:05:05.440 INFO:teuthology.orchestra.run.vm05.stdout: python3-cffi x86_64 1.14.5-5.el9 @baseos 1.0 M
2026-03-10T08:05:05.440 INFO:teuthology.orchestra.run.vm05.stdout: python3-chardet noarch 4.0.0-5.el9 @anaconda 1.4 M
2026-03-10T08:05:05.440 INFO:teuthology.orchestra.run.vm05.stdout: python3-cheroot noarch 10.0.1-4.el9 @epel 682 k
2026-03-10T08:05:05.440 INFO:teuthology.orchestra.run.vm05.stdout: python3-cherrypy noarch 18.6.1-2.el9 @epel 1.1 M
2026-03-10T08:05:05.440 INFO:teuthology.orchestra.run.vm05.stdout: python3-cryptography x86_64 36.0.1-5.el9 @baseos 4.5 M
2026-03-10T08:05:05.440 INFO:teuthology.orchestra.run.vm05.stdout: python3-devel x86_64 3.9.25-3.el9 @appstream 765 k
2026-03-10T08:05:05.440 INFO:teuthology.orchestra.run.vm05.stdout: python3-google-auth noarch 1:2.45.0-1.el9 @epel 1.4 M
2026-03-10T08:05:05.440 INFO:teuthology.orchestra.run.vm05.stdout: python3-idna noarch 2.10-7.el9.1 @anaconda 513 k
2026-03-10T08:05:05.440 INFO:teuthology.orchestra.run.vm05.stdout: python3-jaraco noarch 8.2.1-3.el9 @epel 3.7 k
2026-03-10T08:05:05.440 INFO:teuthology.orchestra.run.vm05.stdout: python3-jaraco-classes noarch 3.2.1-5.el9 @epel 24 k
2026-03-10T08:05:05.440 INFO:teuthology.orchestra.run.vm05.stdout: python3-jaraco-collections noarch 3.0.0-8.el9 @epel 55 k
2026-03-10T08:05:05.440 INFO:teuthology.orchestra.run.vm05.stdout: python3-jaraco-context noarch 6.0.1-3.el9 @epel 31 k
2026-03-10T08:05:05.440 INFO:teuthology.orchestra.run.vm05.stdout: python3-jaraco-functools noarch 3.5.0-2.el9 @epel 33 k
2026-03-10T08:05:05.440 INFO:teuthology.orchestra.run.vm05.stdout: python3-jaraco-text noarch 4.0.0-2.el9 @epel 51 k
2026-03-10T08:05:05.440 INFO:teuthology.orchestra.run.vm05.stdout: python3-jinja2 noarch 2.11.3-8.el9 @appstream 1.1 M
2026-03-10T08:05:05.440 INFO:teuthology.orchestra.run.vm05.stdout: python3-jsonpatch noarch 1.21-16.el9 @koji-override-0 55 k
2026-03-10T08:05:05.440 INFO:teuthology.orchestra.run.vm05.stdout: python3-jsonpointer noarch 2.0-4.el9 @koji-override-0 34 k
2026-03-10T08:05:05.440 INFO:teuthology.orchestra.run.vm05.stdout: python3-jwt noarch 2.4.0-1.el9 @epel 103 k
2026-03-10T08:05:05.440 INFO:teuthology.orchestra.run.vm05.stdout: python3-jwt+crypto noarch 2.4.0-1.el9 @epel 5.4 k
2026-03-10T08:05:05.440 INFO:teuthology.orchestra.run.vm05.stdout: python3-kubernetes noarch 1:26.1.0-3.el9 @epel 21 M
2026-03-10T08:05:05.440 INFO:teuthology.orchestra.run.vm05.stdout: python3-logutils noarch 0.3.5-21.el9 @epel 126 k
2026-03-10T08:05:05.440 INFO:teuthology.orchestra.run.vm05.stdout: python3-mako noarch 1.1.4-6.el9 @appstream 534 k
2026-03-10T08:05:05.440 INFO:teuthology.orchestra.run.vm05.stdout: python3-markupsafe x86_64 1.1.1-12.el9 @appstream 60 k
2026-03-10T08:05:05.441 INFO:teuthology.orchestra.run.vm05.stdout: python3-more-itertools noarch 8.12.0-2.el9 @epel 378 k
2026-03-10T08:05:05.441 INFO:teuthology.orchestra.run.vm05.stdout: python3-natsort noarch 7.1.1-5.el9 @epel 215 k
2026-03-10T08:05:05.441 INFO:teuthology.orchestra.run.vm05.stdout: python3-numpy x86_64 1:1.23.5-2.el9 @appstream 30 M
2026-03-10T08:05:05.441 INFO:teuthology.orchestra.run.vm05.stdout: python3-numpy-f2py x86_64 1:1.23.5-2.el9 @appstream 1.7 M
2026-03-10T08:05:05.441 INFO:teuthology.orchestra.run.vm05.stdout: python3-oauthlib noarch 3.1.1-5.el9 @koji-override-0 888 k
2026-03-10T08:05:05.441 INFO:teuthology.orchestra.run.vm05.stdout: python3-pecan noarch 1.4.2-3.el9 @epel 1.3 M
2026-03-10T08:05:05.441 INFO:teuthology.orchestra.run.vm05.stdout: python3-ply noarch 3.11-14.el9 @baseos 430 k
2026-03-10T08:05:05.441 INFO:teuthology.orchestra.run.vm05.stdout: python3-portend noarch 3.1.0-2.el9 @epel 20 k
2026-03-10T08:05:05.441 INFO:teuthology.orchestra.run.vm05.stdout: python3-prettytable noarch 0.7.2-27.el9 @koji-override-0 166 k
2026-03-10T08:05:05.441 INFO:teuthology.orchestra.run.vm05.stdout: python3-pyOpenSSL noarch 21.0.0-1.el9 @epel 389 k
2026-03-10T08:05:05.441 INFO:teuthology.orchestra.run.vm05.stdout: python3-pyasn1 noarch 0.4.8-7.el9 @appstream 622 k
2026-03-10T08:05:05.441 INFO:teuthology.orchestra.run.vm05.stdout: python3-pyasn1-modules noarch 0.4.8-7.el9 @appstream 1.0 M
2026-03-10T08:05:05.441 INFO:teuthology.orchestra.run.vm05.stdout: python3-pycparser noarch 2.20-6.el9 @baseos 745 k
2026-03-10T08:05:05.441 INFO:teuthology.orchestra.run.vm05.stdout: python3-pysocks noarch 1.7.1-12.el9 @anaconda 88 k
2026-03-10T08:05:05.441 INFO:teuthology.orchestra.run.vm05.stdout: python3-pytz noarch 2021.1-5.el9 @koji-override-0 176 k
2026-03-10T08:05:05.441 INFO:teuthology.orchestra.run.vm05.stdout: python3-repoze-lru noarch 0.7-16.el9 @epel 83 k
2026-03-10T08:05:05.441 INFO:teuthology.orchestra.run.vm05.stdout: python3-requests noarch 2.25.1-10.el9 @baseos 405 k
2026-03-10T08:05:05.441 INFO:teuthology.orchestra.run.vm05.stdout: python3-requests-oauthlib noarch 1.3.0-12.el9 @appstream 119 k
2026-03-10T08:05:05.441 INFO:teuthology.orchestra.run.vm05.stdout: python3-routes noarch 2.5.1-5.el9 @epel 459 k
2026-03-10T08:05:05.441 INFO:teuthology.orchestra.run.vm05.stdout: python3-rsa noarch 4.9-2.el9 @epel 202 k
2026-03-10T08:05:05.441 INFO:teuthology.orchestra.run.vm05.stdout: python3-scipy x86_64 1.9.3-2.el9 @appstream 76 M
2026-03-10T08:05:05.441 INFO:teuthology.orchestra.run.vm05.stdout: python3-tempora noarch 5.0.0-2.el9 @epel 96 k
2026-03-10T08:05:05.441 INFO:teuthology.orchestra.run.vm05.stdout: python3-toml noarch 0.10.2-6.el9 @appstream 99 k
2026-03-10T08:05:05.441 INFO:teuthology.orchestra.run.vm05.stdout: python3-typing-extensions noarch 4.15.0-1.el9 @epel 447 k
2026-03-10T08:05:05.441 INFO:teuthology.orchestra.run.vm05.stdout: python3-urllib3 noarch 1.26.5-7.el9 @baseos 746 k
2026-03-10T08:05:05.441 INFO:teuthology.orchestra.run.vm05.stdout: python3-webob noarch 1.8.8-2.el9 @epel 1.2 M
2026-03-10T08:05:05.441 INFO:teuthology.orchestra.run.vm05.stdout: python3-websocket-client noarch 1.2.3-2.el9 @epel 319 k
2026-03-10T08:05:05.441 INFO:teuthology.orchestra.run.vm05.stdout: python3-werkzeug noarch 2.0.3-3.el9.1 @epel 1.9 M
2026-03-10T08:05:05.441 INFO:teuthology.orchestra.run.vm05.stdout: python3-zc-lockfile noarch 2.0-10.el9 @epel 35 k
2026-03-10T08:05:05.441 INFO:teuthology.orchestra.run.vm05.stdout:
2026-03-10T08:05:05.441 INFO:teuthology.orchestra.run.vm05.stdout:Transaction Summary
2026-03-10T08:05:05.441 INFO:teuthology.orchestra.run.vm05.stdout:================================================================================
2026-03-10T08:05:05.441 INFO:teuthology.orchestra.run.vm05.stdout:Remove 84 Packages
2026-03-10T08:05:05.441 INFO:teuthology.orchestra.run.vm05.stdout:
2026-03-10T08:05:05.441 INFO:teuthology.orchestra.run.vm05.stdout:Freed space: 434 M
2026-03-10T08:05:05.441 INFO:teuthology.orchestra.run.vm05.stdout:Running transaction check
2026-03-10T08:05:05.465 INFO:teuthology.orchestra.run.vm05.stdout:Transaction check succeeded.
2026-03-10T08:05:05.466 INFO:teuthology.orchestra.run.vm05.stdout:Running transaction test
2026-03-10T08:05:05.549 INFO:teuthology.orchestra.run.vm08.stdout:Transaction test succeeded.
2026-03-10T08:05:05.549 INFO:teuthology.orchestra.run.vm08.stdout:Running transaction
2026-03-10T08:05:05.582 INFO:teuthology.orchestra.run.vm05.stdout:Transaction test succeeded.
2026-03-10T08:05:05.582 INFO:teuthology.orchestra.run.vm05.stdout:Running transaction
2026-03-10T08:05:05.697 INFO:teuthology.orchestra.run.vm08.stdout: Preparing : 1/1
2026-03-10T08:05:05.697 INFO:teuthology.orchestra.run.vm08.stdout: Erasing : ceph-mgr-rook-2:18.2.1-0.el9.noarch 1/84
2026-03-10T08:05:05.759 INFO:teuthology.orchestra.run.vm08.stdout: Running scriptlet: ceph-mgr-rook-2:18.2.1-0.el9.noarch 1/84
2026-03-10T08:05:05.776 INFO:teuthology.orchestra.run.vm08.stdout: Running scriptlet: ceph-mgr-2:18.2.1-0.el9.x86_64 2/84
2026-03-10T08:05:05.776 INFO:teuthology.orchestra.run.vm08.stdout:Glob pattern passed to enable, but globs are not supported for this.
2026-03-10T08:05:05.776 INFO:teuthology.orchestra.run.vm08.stdout:Invalid unit name "ceph-mgr@*.service" escaped as "ceph-mgr@\x2a.service".
2026-03-10T08:05:05.776 INFO:teuthology.orchestra.run.vm08.stdout:Removed "/etc/systemd/system/multi-user.target.wants/ceph-mgr.target".
2026-03-10T08:05:05.776 INFO:teuthology.orchestra.run.vm08.stdout:Removed "/etc/systemd/system/ceph.target.wants/ceph-mgr.target".
2026-03-10T08:05:05.776 INFO:teuthology.orchestra.run.vm08.stdout:
2026-03-10T08:05:05.776 INFO:teuthology.orchestra.run.vm08.stdout: Erasing : ceph-mgr-2:18.2.1-0.el9.x86_64 2/84
2026-03-10T08:05:05.788 INFO:teuthology.orchestra.run.vm08.stdout: Running scriptlet: ceph-mgr-2:18.2.1-0.el9.x86_64 2/84
2026-03-10T08:05:05.796 INFO:teuthology.orchestra.run.vm08.stdout: Erasing : ceph-mgr-modules-core-2:18.2.1-0.el9.noarch 3/84
2026-03-10T08:05:05.796 INFO:teuthology.orchestra.run.vm08.stdout: Erasing : ceph-mgr-dashboard-2:18.2.1-0.el9.noarch 4/84
2026-03-10T08:05:05.853 INFO:teuthology.orchestra.run.vm08.stdout: Running scriptlet: ceph-mgr-dashboard-2:18.2.1-0.el9.noarch 4/84
2026-03-10T08:05:05.863 INFO:teuthology.orchestra.run.vm08.stdout: Erasing : python3-kubernetes-1:26.1.0-3.el9.noarch 5/84
2026-03-10T08:05:05.867 INFO:teuthology.orchestra.run.vm08.stdout: Erasing : python3-requests-oauthlib-1.3.0-12.el9.noarch 6/84
2026-03-10T08:05:05.867 INFO:teuthology.orchestra.run.vm08.stdout: Erasing : ceph-mgr-cephadm-2:18.2.1-0.el9.noarch 7/84
2026-03-10T08:05:05.875 INFO:teuthology.orchestra.run.vm05.stdout: Preparing : 1/1
2026-03-10T08:05:05.875 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : ceph-mgr-rook-2:18.2.1-0.el9.noarch 1/84
2026-03-10T08:05:05.880 INFO:teuthology.orchestra.run.vm08.stdout: Running scriptlet: ceph-mgr-cephadm-2:18.2.1-0.el9.noarch 7/84
2026-03-10T08:05:05.897 INFO:teuthology.orchestra.run.vm05.stdout: Running scriptlet: ceph-mgr-rook-2:18.2.1-0.el9.noarch 1/84
2026-03-10T08:05:05.900 INFO:teuthology.orchestra.run.vm08.stdout: Erasing : python3-cherrypy-18.6.1-2.el9.noarch 8/84
2026-03-10T08:05:05.904 INFO:teuthology.orchestra.run.vm08.stdout: Erasing : python3-cheroot-10.0.1-4.el9.noarch 9/84
2026-03-10T08:05:05.908 INFO:teuthology.orchestra.run.vm08.stdout: Erasing : python3-jaraco-collections-3.0.0-8.el9.noarch 10/84
2026-03-10T08:05:05.914 INFO:teuthology.orchestra.run.vm08.stdout: Erasing : python3-jaraco-text-4.0.0-2.el9.noarch 11/84
2026-03-10T08:05:05.920 INFO:teuthology.orchestra.run.vm05.stdout: Running scriptlet: ceph-mgr-2:18.2.1-0.el9.x86_64 2/84
2026-03-10T08:05:05.920 INFO:teuthology.orchestra.run.vm05.stdout:Glob pattern passed to enable, but globs are not supported for this.
2026-03-10T08:05:05.920 INFO:teuthology.orchestra.run.vm05.stdout:Invalid unit name "ceph-mgr@*.service" escaped as "ceph-mgr@\x2a.service".
2026-03-10T08:05:05.920 INFO:teuthology.orchestra.run.vm05.stdout:Removed "/etc/systemd/system/multi-user.target.wants/ceph-mgr.target".
2026-03-10T08:05:05.920 INFO:teuthology.orchestra.run.vm05.stdout:Removed "/etc/systemd/system/ceph.target.wants/ceph-mgr.target".
2026-03-10T08:05:05.920 INFO:teuthology.orchestra.run.vm05.stdout:
2026-03-10T08:05:05.921 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : ceph-mgr-2:18.2.1-0.el9.x86_64 2/84
2026-03-10T08:05:05.937 INFO:teuthology.orchestra.run.vm08.stdout: Erasing : python3-jinja2-2.11.3-8.el9.noarch 12/84
2026-03-10T08:05:05.941 INFO:teuthology.orchestra.run.vm05.stdout: Running scriptlet: ceph-mgr-2:18.2.1-0.el9.x86_64 2/84
2026-03-10T08:05:05.949 INFO:teuthology.orchestra.run.vm08.stdout: Erasing : python3-requests-2.25.1-10.el9.noarch 13/84
2026-03-10T08:05:05.950 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : ceph-mgr-modules-core-2:18.2.1-0.el9.noarch 3/84
2026-03-10T08:05:05.950 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : ceph-mgr-dashboard-2:18.2.1-0.el9.noarch 4/84
2026-03-10T08:05:05.966 INFO:teuthology.orchestra.run.vm08.stdout: Erasing : python3-google-auth-1:2.45.0-1.el9.noarch 14/84
2026-03-10T08:05:05.973 INFO:teuthology.orchestra.run.vm08.stdout: Erasing : python3-pecan-1.4.2-3.el9.noarch 15/84
2026-03-10T08:05:05.983 INFO:teuthology.orchestra.run.vm08.stdout: Erasing : python3-rsa-4.9-2.el9.noarch 16/84
2026-03-10T08:05:05.990 INFO:teuthology.orchestra.run.vm08.stdout: Erasing : python3-pyasn1-modules-0.4.8-7.el9.noarch 17/84
2026-03-10T08:05:06.011 INFO:teuthology.orchestra.run.vm05.stdout: Running scriptlet: ceph-mgr-dashboard-2:18.2.1-0.el9.noarch 4/84
2026-03-10T08:05:06.018 INFO:teuthology.orchestra.run.vm08.stdout: Erasing : python3-urllib3-1.26.5-7.el9.noarch 18/84
2026-03-10T08:05:06.020 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : python3-kubernetes-1:26.1.0-3.el9.noarch 5/84
2026-03-10T08:05:06.025 INFO:teuthology.orchestra.run.vm08.stdout: Erasing : python3-babel-2.9.1-2.el9.noarch 19/84
2026-03-10T08:05:06.025 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : python3-requests-oauthlib-1.3.0-12.el9.noarch 6/84
2026-03-10T08:05:06.026 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : ceph-mgr-cephadm-2:18.2.1-0.el9.noarch 7/84
2026-03-10T08:05:06.028 INFO:teuthology.orchestra.run.vm08.stdout: Erasing : python3-jaraco-classes-3.2.1-5.el9.noarch 20/84
2026-03-10T08:05:06.037 INFO:teuthology.orchestra.run.vm08.stdout: Erasing : python3-pyOpenSSL-21.0.0-1.el9.noarch 21/84
2026-03-10T08:05:06.037 INFO:teuthology.orchestra.run.vm05.stdout: Running scriptlet: ceph-mgr-cephadm-2:18.2.1-0.el9.noarch 7/84
2026-03-10T08:05:06.044 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : python3-cherrypy-18.6.1-2.el9.noarch 8/84
2026-03-10T08:05:06.044 INFO:teuthology.orchestra.run.vm08.stdout: Erasing : python3-asyncssh-2.13.2-5.el9.noarch 22/84
2026-03-10T08:05:06.044 INFO:teuthology.orchestra.run.vm08.stdout: Erasing : ceph-mgr-diskprediction-local-2:18.2.1-0.el9.noarc 23/84
2026-03-10T08:05:06.046 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : python3-cheroot-10.0.1-4.el9.noarch 9/84
2026-03-10T08:05:06.049 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : python3-jaraco-collections-3.0.0-8.el9.noarch 10/84
2026-03-10T08:05:06.052 INFO:teuthology.orchestra.run.vm08.stdout: Running scriptlet: ceph-mgr-diskprediction-local-2:18.2.1-0.el9.noarc 23/84
2026-03-10T08:05:06.054 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : python3-jaraco-text-4.0.0-2.el9.noarch 11/84
2026-03-10T08:05:06.062 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : python3-jinja2-2.11.3-8.el9.noarch 12/84
2026-03-10T08:05:06.071 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : python3-requests-2.25.1-10.el9.noarch 13/84
2026-03-10T08:05:06.084 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : python3-google-auth-1:2.45.0-1.el9.noarch 14/84
2026-03-10T08:05:06.089 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : python3-pecan-1.4.2-3.el9.noarch 15/84
2026-03-10T08:05:06.101 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : python3-rsa-4.9-2.el9.noarch 16/84
2026-03-10T08:05:06.108 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : python3-pyasn1-modules-0.4.8-7.el9.noarch 17/84
2026-03-10T08:05:06.140 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : python3-urllib3-1.26.5-7.el9.noarch 18/84
2026-03-10T08:05:06.147 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : python3-babel-2.9.1-2.el9.noarch 19/84
2026-03-10T08:05:06.150 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : python3-jaraco-classes-3.2.1-5.el9.noarch 20/84
2026-03-10T08:05:06.151 INFO:teuthology.orchestra.run.vm08.stdout: Erasing : python3-jsonpatch-1.21-16.el9.noarch 24/84
2026-03-10T08:05:06.160 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : python3-pyOpenSSL-21.0.0-1.el9.noarch 21/84
2026-03-10T08:05:06.169 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : python3-asyncssh-2.13.2-5.el9.noarch 22/84
2026-03-10T08:05:06.169 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : ceph-mgr-diskprediction-local-2:18.2.1-0.el9.noarc 23/84
2026-03-10T08:05:06.177 INFO:teuthology.orchestra.run.vm05.stdout: Running scriptlet: ceph-mgr-diskprediction-local-2:18.2.1-0.el9.noarc 23/84
2026-03-10T08:05:06.180 INFO:teuthology.orchestra.run.vm08.stdout: Erasing : python3-scipy-1.9.3-2.el9.x86_64 25/84
2026-03-10T08:05:06.186 INFO:teuthology.orchestra.run.vm08.stdout: Erasing : python3-numpy-f2py-1:1.23.5-2.el9.x86_64 26/84
2026-03-10T08:05:06.192 INFO:teuthology.orchestra.run.vm08.stdout: Erasing : python3-bcrypt-3.2.2-1.el9.x86_64 27/84
2026-03-10T08:05:06.197 INFO:teuthology.orchestra.run.vm08.stdout: Erasing : python3-mako-1.1.4-6.el9.noarch 28/84
2026-03-10T08:05:06.199 INFO:teuthology.orchestra.run.vm08.stdout: Erasing : python3-jaraco-context-6.0.1-3.el9.noarch 29/84
2026-03-10T08:05:06.202 INFO:teuthology.orchestra.run.vm08.stdout: Erasing : python3-portend-3.1.0-2.el9.noarch 30/84
2026-03-10T08:05:06.204 INFO:teuthology.orchestra.run.vm08.stdout: Erasing : python3-tempora-5.0.0-2.el9.noarch 31/84
2026-03-10T08:05:06.206 INFO:teuthology.orchestra.run.vm08.stdout: Erasing : python3-jaraco-functools-3.5.0-2.el9.noarch 32/84
2026-03-10T08:05:06.209 INFO:teuthology.orchestra.run.vm08.stdout: Erasing : python3-jwt-2.4.0-1.el9.noarch 33/84
2026-03-10T08:05:06.212 INFO:teuthology.orchestra.run.vm08.stdout: Erasing : python3-jwt+crypto-2.4.0-1.el9.noarch 34/84
2026-03-10T08:05:06.225 INFO:teuthology.orchestra.run.vm08.stdout: Erasing : python3-routes-2.5.1-5.el9.noarch 35/84
2026-03-10T08:05:06.233 INFO:teuthology.orchestra.run.vm08.stdout: Erasing : python3-cryptography-36.0.1-5.el9.x86_64 36/84
2026-03-10T08:05:06.237 INFO:teuthology.orchestra.run.vm08.stdout: Erasing : python3-cffi-1.14.5-5.el9.x86_64 37/84
2026-03-10T08:05:06.283 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : python3-jsonpatch-1.21-16.el9.noarch 24/84
2026-03-10T08:05:06.285 INFO:teuthology.orchestra.run.vm08.stdout: Erasing : python3-pycparser-2.20-6.el9.noarch 38/84
2026-03-10T08:05:06.303 INFO:teuthology.orchestra.run.vm08.stdout: Erasing : python3-numpy-1:1.23.5-2.el9.x86_64 39/84
2026-03-10T08:05:06.305 INFO:teuthology.orchestra.run.vm08.stdout: Erasing : flexiblas-netlib-3.0.4-9.el9.x86_64 40/84
2026-03-10T08:05:06.308 INFO:teuthology.orchestra.run.vm08.stdout: Erasing : flexiblas-openblas-openmp-3.0.4-9.el9.x86_64 41/84
2026-03-10T08:05:06.310 INFO:teuthology.orchestra.run.vm08.stdout: Erasing : openblas-openmp-0.3.29-1.el9.x86_64 42/84
2026-03-10T08:05:06.312 INFO:teuthology.orchestra.run.vm08.stdout: Erasing : libgfortran-11.5.0-14.el9.x86_64 43/84
2026-03-10T08:05:06.315 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : python3-scipy-1.9.3-2.el9.x86_64 25/84
2026-03-10T08:05:06.321 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : python3-numpy-f2py-1:1.23.5-2.el9.x86_64 26/84
2026-03-10T08:05:06.328 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : python3-bcrypt-3.2.2-1.el9.x86_64 27/84
2026-03-10T08:05:06.332 INFO:teuthology.orchestra.run.vm08.stdout: Running scriptlet: rbd-mirror-2:18.2.1-0.el9.x86_64 44/84
2026-03-10T08:05:06.332 INFO:teuthology.orchestra.run.vm08.stdout:Glob pattern passed to enable, but globs are not supported for this.
2026-03-10T08:05:06.332 INFO:teuthology.orchestra.run.vm08.stdout:Invalid unit name "ceph-rbd-mirror@*.service" escaped as "ceph-rbd-mirror@\x2a.service".
2026-03-10T08:05:06.332 INFO:teuthology.orchestra.run.vm08.stdout:Removed "/etc/systemd/system/multi-user.target.wants/ceph-rbd-mirror.target".
2026-03-10T08:05:06.332 INFO:teuthology.orchestra.run.vm08.stdout:Removed "/etc/systemd/system/ceph.target.wants/ceph-rbd-mirror.target".
2026-03-10T08:05:06.332 INFO:teuthology.orchestra.run.vm08.stdout:
2026-03-10T08:05:06.332 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : python3-mako-1.1.4-6.el9.noarch 28/84
2026-03-10T08:05:06.332 INFO:teuthology.orchestra.run.vm08.stdout: Erasing : rbd-mirror-2:18.2.1-0.el9.x86_64 44/84
2026-03-10T08:05:06.334 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : python3-jaraco-context-6.0.1-3.el9.noarch 29/84
2026-03-10T08:05:06.337 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : python3-portend-3.1.0-2.el9.noarch 30/84
2026-03-10T08:05:06.340 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : python3-tempora-5.0.0-2.el9.noarch 31/84
2026-03-10T08:05:06.341 INFO:teuthology.orchestra.run.vm08.stdout: Running scriptlet: rbd-mirror-2:18.2.1-0.el9.x86_64 44/84
2026-03-10T08:05:06.342 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : python3-jaraco-functools-3.5.0-2.el9.noarch 32/84
2026-03-10T08:05:06.344 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : python3-jwt-2.4.0-1.el9.noarch 33/84
2026-03-10T08:05:06.348 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : python3-jwt+crypto-2.4.0-1.el9.noarch 34/84
2026-03-10T08:05:06.362 INFO:teuthology.orchestra.run.vm08.stdout: Running scriptlet: ceph-immutable-object-cache-2:18.2.1-0.el9.x86_64 45/84
2026-03-10T08:05:06.362 INFO:teuthology.orchestra.run.vm08.stdout:Glob pattern passed to enable, but globs are not supported for this.
2026-03-10T08:05:06.362 INFO:teuthology.orchestra.run.vm08.stdout:Invalid unit name "ceph-immutable-object-cache@*.service" escaped as "ceph-immutable-object-cache@\x2a.service".
2026-03-10T08:05:06.362 INFO:teuthology.orchestra.run.vm08.stdout:
2026-03-10T08:05:06.362 INFO:teuthology.orchestra.run.vm08.stdout: Erasing : ceph-immutable-object-cache-2:18.2.1-0.el9.x86_64 45/84
2026-03-10T08:05:06.363 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : python3-routes-2.5.1-5.el9.noarch 35/84
2026-03-10T08:05:06.369 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : python3-cryptography-36.0.1-5.el9.x86_64 36/84
2026-03-10T08:05:06.370 INFO:teuthology.orchestra.run.vm08.stdout: Running scriptlet: ceph-immutable-object-cache-2:18.2.1-0.el9.x86_64 45/84
2026-03-10T08:05:06.372 INFO:teuthology.orchestra.run.vm08.stdout: Erasing : openblas-0.3.29-1.el9.x86_64 46/84
2026-03-10T08:05:06.373 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : python3-cffi-1.14.5-5.el9.x86_64 37/84
2026-03-10T08:05:06.374 INFO:teuthology.orchestra.run.vm08.stdout: Erasing : flexiblas-3.0.4-9.el9.x86_64 47/84
2026-03-10T08:05:06.377 INFO:teuthology.orchestra.run.vm08.stdout: Erasing : python3-ply-3.11-14.el9.noarch 48/84
2026-03-10T08:05:06.380 INFO:teuthology.orchestra.run.vm08.stdout: Erasing : python3-repoze-lru-0.7-16.el9.noarch 49/84
2026-03-10T08:05:06.382 INFO:teuthology.orchestra.run.vm08.stdout: Erasing : python3-jaraco-8.2.1-3.el9.noarch 50/84
2026-03-10T08:05:06.384 INFO:teuthology.orchestra.run.vm08.stdout: Erasing : python3-more-itertools-8.12.0-2.el9.noarch 51/84
2026-03-10T08:05:06.387 INFO:teuthology.orchestra.run.vm08.stdout: Erasing : python3-toml-0.10.2-6.el9.noarch 52/84
2026-03-10T08:05:06.390 INFO:teuthology.orchestra.run.vm08.stdout: Erasing : python3-pytz-2021.1-5.el9.noarch 53/84
2026-03-10T08:05:06.397 INFO:teuthology.orchestra.run.vm08.stdout: Erasing : python3-backports-tarfile-1.2.0-1.el9.noarch 54/84
2026-03-10T08:05:06.402 INFO:teuthology.orchestra.run.vm08.stdout: Erasing : python3-devel-3.9.25-3.el9.x86_64 55/84
2026-03-10T08:05:06.404 INFO:teuthology.orchestra.run.vm08.stdout: Erasing : python3-jsonpointer-2.0-4.el9.noarch 56/84
2026-03-10T08:05:06.406 INFO:teuthology.orchestra.run.vm08.stdout: Erasing : python3-typing-extensions-4.15.0-1.el9.noarch 57/84
2026-03-10T08:05:06.409 INFO:teuthology.orchestra.run.vm08.stdout: Erasing : python3-idna-2.10-7.el9.1.noarch 58/84
2026-03-10T08:05:06.414 INFO:teuthology.orchestra.run.vm08.stdout: Erasing : python3-pysocks-1.7.1-12.el9.noarch 59/84
2026-03-10T08:05:06.419 INFO:teuthology.orchestra.run.vm08.stdout: Erasing : python3-pyasn1-0.4.8-7.el9.noarch 60/84
2026-03-10T08:05:06.423 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : python3-pycparser-2.20-6.el9.noarch 38/84
2026-03-10T08:05:06.424 INFO:teuthology.orchestra.run.vm08.stdout: Erasing : python3-logutils-0.3.5-21.el9.noarch 61/84
2026-03-10T08:05:06.428 INFO:teuthology.orchestra.run.vm08.stdout: Erasing : python3-webob-1.8.8-2.el9.noarch 62/84
2026-03-10T08:05:06.436 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : python3-numpy-1:1.23.5-2.el9.x86_64 39/84
2026-03-10T08:05:06.436 INFO:teuthology.orchestra.run.vm08.stdout: Erasing : python3-cachetools-4.2.4-1.el9.noarch 63/84
2026-03-10T08:05:06.438 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : flexiblas-netlib-3.0.4-9.el9.x86_64 40/84
2026-03-10T08:05:06.440 INFO:teuthology.orchestra.run.vm08.stdout: Erasing : python3-chardet-4.0.0-5.el9.noarch 64/84
2026-03-10T08:05:06.441 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : flexiblas-openblas-openmp-3.0.4-9.el9.x86_64 41/84
2026-03-10T08:05:06.442 INFO:teuthology.orchestra.run.vm08.stdout: Erasing : python3-autocommand-2.2.2-8.el9.noarch 65/84
2026-03-10T08:05:06.443 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : openblas-openmp-0.3.29-1.el9.x86_64 42/84
2026-03-10T08:05:06.445 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : libgfortran-11.5.0-14.el9.x86_64 43/84
2026-03-10T08:05:06.446 INFO:teuthology.orchestra.run.vm08.stdout: Erasing : python3-zc-lockfile-2.0-10.el9.noarch 66/84
2026-03-10T08:05:06.455 INFO:teuthology.orchestra.run.vm08.stdout: Erasing : python3-natsort-7.1.1-5.el9.noarch 67/84
2026-03-10T08:05:06.461 INFO:teuthology.orchestra.run.vm08.stdout: Erasing : python3-oauthlib-3.1.1-5.el9.noarch 68/84
2026-03-10T08:05:06.464 INFO:teuthology.orchestra.run.vm08.stdout: Erasing : python3-websocket-client-1.2.3-2.el9.noarch 69/84
2026-03-10T08:05:06.466 INFO:teuthology.orchestra.run.vm08.stdout: Erasing : python3-certifi-2023.05.07-4.el9.noarch 70/84
2026-03-10T08:05:06.466 INFO:teuthology.orchestra.run.vm05.stdout: Running scriptlet: rbd-mirror-2:18.2.1-0.el9.x86_64 44/84
2026-03-10T08:05:06.466 INFO:teuthology.orchestra.run.vm05.stdout:Glob pattern passed to enable, but globs are not supported for this.
2026-03-10T08:05:06.466 INFO:teuthology.orchestra.run.vm05.stdout:Invalid unit name "ceph-rbd-mirror@*.service" escaped as "ceph-rbd-mirror@\x2a.service".
2026-03-10T08:05:06.466 INFO:teuthology.orchestra.run.vm05.stdout:Removed "/etc/systemd/system/multi-user.target.wants/ceph-rbd-mirror.target".
2026-03-10T08:05:06.466 INFO:teuthology.orchestra.run.vm05.stdout:Removed "/etc/systemd/system/ceph.target.wants/ceph-rbd-mirror.target".
2026-03-10T08:05:06.466 INFO:teuthology.orchestra.run.vm05.stdout:
2026-03-10T08:05:06.467 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : rbd-mirror-2:18.2.1-0.el9.x86_64 44/84
2026-03-10T08:05:06.467 INFO:teuthology.orchestra.run.vm08.stdout: Erasing : ceph-grafana-dashboards-2:18.2.1-0.el9.noarch 71/84
2026-03-10T08:05:06.474 INFO:teuthology.orchestra.run.vm08.stdout: Erasing : ceph-prometheus-alerts-2:18.2.1-0.el9.noarch 72/84
2026-03-10T08:05:06.475 INFO:teuthology.orchestra.run.vm05.stdout: Running scriptlet: rbd-mirror-2:18.2.1-0.el9.x86_64 44/84
2026-03-10T08:05:06.479 INFO:teuthology.orchestra.run.vm08.stdout: Erasing : python3-werkzeug-2.0.3-3.el9.1.noarch 73/84
2026-03-10T08:05:06.494 INFO:teuthology.orchestra.run.vm05.stdout: Running scriptlet: ceph-immutable-object-cache-2:18.2.1-0.el9.x86_64 45/84
2026-03-10T08:05:06.494 INFO:teuthology.orchestra.run.vm05.stdout:Glob pattern passed to enable, but globs are not supported for this.
2026-03-10T08:05:06.494 INFO:teuthology.orchestra.run.vm05.stdout:Invalid unit name "ceph-immutable-object-cache@*.service" escaped as "ceph-immutable-object-cache@\x2a.service".
2026-03-10T08:05:06.494 INFO:teuthology.orchestra.run.vm05.stdout:
2026-03-10T08:05:06.494 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : ceph-immutable-object-cache-2:18.2.1-0.el9.x86_64 45/84
2026-03-10T08:05:06.499 INFO:teuthology.orchestra.run.vm08.stdout: Running scriptlet: ceph-base-2:18.2.1-0.el9.x86_64 74/84
2026-03-10T08:05:06.499 INFO:teuthology.orchestra.run.vm08.stdout:Removed "/etc/systemd/system/ceph.target.wants/ceph-crash.service".
2026-03-10T08:05:06.499 INFO:teuthology.orchestra.run.vm08.stdout:
2026-03-10T08:05:06.502 INFO:teuthology.orchestra.run.vm05.stdout: Running scriptlet: ceph-immutable-object-cache-2:18.2.1-0.el9.x86_64 45/84
2026-03-10T08:05:06.504 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : openblas-0.3.29-1.el9.x86_64 46/84
2026-03-10T08:05:06.505 INFO:teuthology.orchestra.run.vm08.stdout: Erasing : ceph-base-2:18.2.1-0.el9.x86_64 74/84
2026-03-10T08:05:06.506 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : flexiblas-3.0.4-9.el9.x86_64 47/84
2026-03-10T08:05:06.509 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : python3-ply-3.11-14.el9.noarch 48/84
2026-03-10T08:05:06.511 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : python3-repoze-lru-0.7-16.el9.noarch 49/84
2026-03-10T08:05:06.513 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : python3-jaraco-8.2.1-3.el9.noarch 50/84
2026-03-10T08:05:06.515 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : python3-more-itertools-8.12.0-2.el9.noarch 51/84
2026-03-10T08:05:06.518 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : python3-toml-0.10.2-6.el9.noarch 52/84
2026-03-10T08:05:06.522 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : python3-pytz-2021.1-5.el9.noarch 53/84
2026-03-10T08:05:06.523 INFO:teuthology.orchestra.run.vm08.stdout: Running scriptlet: ceph-base-2:18.2.1-0.el9.x86_64 74/84
2026-03-10T08:05:06.523 INFO:teuthology.orchestra.run.vm08.stdout: Erasing : ceph-selinux-2:18.2.1-0.el9.x86_64 75/84
2026-03-10T08:05:06.530 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : python3-backports-tarfile-1.2.0-1.el9.noarch 54/84
2026-03-10T08:05:06.535 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : python3-devel-3.9.25-3.el9.x86_64 55/84
2026-03-10T08:05:06.537 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : python3-jsonpointer-2.0-4.el9.noarch 56/84
2026-03-10T08:05:06.539 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : python3-typing-extensions-4.15.0-1.el9.noarch 57/84
2026-03-10T08:05:06.542 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : python3-idna-2.10-7.el9.1.noarch 58/84
2026-03-10T08:05:06.547 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : python3-pysocks-1.7.1-12.el9.noarch 59/84
2026-03-10T08:05:06.551 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : python3-pyasn1-0.4.8-7.el9.noarch 60/84
2026-03-10T08:05:06.556 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : python3-logutils-0.3.5-21.el9.noarch 61/84
2026-03-10T08:05:06.561 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : python3-webob-1.8.8-2.el9.noarch 62/84
2026-03-10T08:05:06.567 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : python3-cachetools-4.2.4-1.el9.noarch 63/84
2026-03-10T08:05:06.571 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : python3-chardet-4.0.0-5.el9.noarch 64/84
2026-03-10T08:05:06.574 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : python3-autocommand-2.2.2-8.el9.noarch 65/84
2026-03-10T08:05:06.577 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : python3-zc-lockfile-2.0-10.el9.noarch 66/84
2026-03-10T08:05:06.586 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : python3-natsort-7.1.1-5.el9.noarch 67/84
2026-03-10T08:05:06.592 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : python3-oauthlib-3.1.1-5.el9.noarch 68/84
2026-03-10T08:05:06.595 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : python3-websocket-client-1.2.3-2.el9.noarch 69/84
2026-03-10T08:05:06.598 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : python3-certifi-2023.05.07-4.el9.noarch 70/84
2026-03-10T08:05:06.599 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : ceph-grafana-dashboards-2:18.2.1-0.el9.noarch 71/84
2026-03-10T08:05:06.605 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : ceph-prometheus-alerts-2:18.2.1-0.el9.noarch 72/84
2026-03-10T08:05:06.608 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : python3-werkzeug-2.0.3-3.el9.1.noarch 73/84
2026-03-10T08:05:06.626 INFO:teuthology.orchestra.run.vm05.stdout: Running scriptlet: ceph-base-2:18.2.1-0.el9.x86_64 74/84
2026-03-10T08:05:06.626 INFO:teuthology.orchestra.run.vm05.stdout:Removed "/etc/systemd/system/ceph.target.wants/ceph-crash.service".
2026-03-10T08:05:06.626 INFO:teuthology.orchestra.run.vm05.stdout:
2026-03-10T08:05:06.633 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : ceph-base-2:18.2.1-0.el9.x86_64 74/84
2026-03-10T08:05:06.650 INFO:teuthology.orchestra.run.vm05.stdout: Running scriptlet: ceph-base-2:18.2.1-0.el9.x86_64 74/84
2026-03-10T08:05:06.650 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : ceph-selinux-2:18.2.1-0.el9.x86_64 75/84
2026-03-10T08:05:12.204 INFO:teuthology.orchestra.run.vm05.stdout: Running scriptlet: ceph-selinux-2:18.2.1-0.el9.x86_64 75/84
2026-03-10T08:05:12.205 INFO:teuthology.orchestra.run.vm05.stdout:skipping the directory /sys
2026-03-10T08:05:12.205 INFO:teuthology.orchestra.run.vm05.stdout:skipping the directory /proc
2026-03-10T08:05:12.205 INFO:teuthology.orchestra.run.vm05.stdout:skipping the directory /mnt
2026-03-10T08:05:12.205 INFO:teuthology.orchestra.run.vm05.stdout:skipping the directory /var/tmp
2026-03-10T08:05:12.205 INFO:teuthology.orchestra.run.vm05.stdout:skipping the directory /home
2026-03-10T08:05:12.205 INFO:teuthology.orchestra.run.vm05.stdout:skipping the directory /root
2026-03-10T08:05:12.205 INFO:teuthology.orchestra.run.vm05.stdout:skipping the directory /tmp
2026-03-10T08:05:12.205 INFO:teuthology.orchestra.run.vm05.stdout:
2026-03-10T08:05:12.215 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : ceph-common-2:18.2.1-0.el9.x86_64 76/84
2026-03-10T08:05:12.221 INFO:teuthology.orchestra.run.vm08.stdout: Running scriptlet: ceph-selinux-2:18.2.1-0.el9.x86_64 75/84
2026-03-10T08:05:12.221 INFO:teuthology.orchestra.run.vm08.stdout:skipping the directory /sys
2026-03-10T08:05:12.221 INFO:teuthology.orchestra.run.vm08.stdout:skipping the directory /proc
2026-03-10T08:05:12.221 INFO:teuthology.orchestra.run.vm08.stdout:skipping the directory /mnt
2026-03-10T08:05:12.221 INFO:teuthology.orchestra.run.vm08.stdout:skipping the directory /var/tmp
2026-03-10T08:05:12.221 INFO:teuthology.orchestra.run.vm08.stdout:skipping the directory /home
2026-03-10T08:05:12.221 INFO:teuthology.orchestra.run.vm08.stdout:skipping the directory /root
2026-03-10T08:05:12.221 INFO:teuthology.orchestra.run.vm08.stdout:skipping the directory /tmp
2026-03-10T08:05:12.221 INFO:teuthology.orchestra.run.vm08.stdout:
2026-03-10T08:05:12.228 INFO:teuthology.orchestra.run.vm08.stdout: Erasing : ceph-common-2:18.2.1-0.el9.x86_64 76/84
2026-03-10T08:05:12.241 INFO:teuthology.orchestra.run.vm05.stdout: Running scriptlet: ceph-common-2:18.2.1-0.el9.x86_64 76/84
2026-03-10T08:05:12.244 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : python3-ceph-common-2:18.2.1-0.el9.x86_64 77/84
2026-03-10T08:05:12.245 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : python3-prettytable-0.7.2-27.el9.noarch 78/84
2026-03-10T08:05:12.247 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : liboath-2.6.12-1.el9.x86_64 79/84
2026-03-10T08:05:12.247 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : libradosstriper1-2:18.2.1-0.el9.x86_64 80/84
2026-03-10T08:05:12.255 INFO:teuthology.orchestra.run.vm08.stdout: Running scriptlet: ceph-common-2:18.2.1-0.el9.x86_64 76/84
2026-03-10T08:05:12.262 INFO:teuthology.orchestra.run.vm08.stdout: Erasing : python3-ceph-common-2:18.2.1-0.el9.x86_64 77/84
2026-03-10T08:05:12.264 INFO:teuthology.orchestra.run.vm08.stdout: Erasing : python3-prettytable-0.7.2-27.el9.noarch 78/84
2026-03-10T08:05:12.265 INFO:teuthology.orchestra.run.vm05.stdout: Running scriptlet: libradosstriper1-2:18.2.1-0.el9.x86_64 80/84
2026-03-10T08:05:12.266 INFO:teuthology.orchestra.run.vm08.stdout: Erasing : liboath-2.6.12-1.el9.x86_64 79/84
2026-03-10T08:05:12.266 INFO:teuthology.orchestra.run.vm08.stdout: Erasing : libradosstriper1-2:18.2.1-0.el9.x86_64 80/84
2026-03-10T08:05:12.267 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : fmt-8.1.1-5.el9.x86_64 81/84
2026-03-10T08:05:12.270 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : libquadmath-11.5.0-14.el9.x86_64 82/84
2026-03-10T08:05:12.273 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : python3-markupsafe-1.1.1-12.el9.x86_64 83/84
2026-03-10T08:05:12.273 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : libcephsqlite-2:18.2.1-0.el9.x86_64 84/84
2026-03-10T08:05:12.281 INFO:teuthology.orchestra.run.vm08.stdout: Running scriptlet: libradosstriper1-2:18.2.1-0.el9.x86_64 80/84
2026-03-10T08:05:12.284 INFO:teuthology.orchestra.run.vm08.stdout: Erasing : fmt-8.1.1-5.el9.x86_64 81/84
2026-03-10T08:05:12.286 INFO:teuthology.orchestra.run.vm08.stdout: Erasing : libquadmath-11.5.0-14.el9.x86_64 82/84
2026-03-10T08:05:12.289 INFO:teuthology.orchestra.run.vm08.stdout: Erasing : python3-markupsafe-1.1.1-12.el9.x86_64 83/84
2026-03-10T08:05:12.289 INFO:teuthology.orchestra.run.vm08.stdout: Erasing : libcephsqlite-2:18.2.1-0.el9.x86_64 84/84
2026-03-10T08:05:12.376 INFO:teuthology.orchestra.run.vm05.stdout: Running scriptlet: libcephsqlite-2:18.2.1-0.el9.x86_64 84/84
2026-03-10T08:05:12.376 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : ceph-base-2:18.2.1-0.el9.x86_64 1/84
2026-03-10T08:05:12.376 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : ceph-common-2:18.2.1-0.el9.x86_64 2/84
2026-03-10T08:05:12.376 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : ceph-grafana-dashboards-2:18.2.1-0.el9.noarch 3/84
2026-03-10T08:05:12.376 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : ceph-immutable-object-cache-2:18.2.1-0.el9.x86_64 4/84
2026-03-10T08:05:12.376 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : ceph-mgr-2:18.2.1-0.el9.x86_64 5/84
2026-03-10T08:05:12.376 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : ceph-mgr-cephadm-2:18.2.1-0.el9.noarch 6/84
2026-03-10T08:05:12.376 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : ceph-mgr-dashboard-2:18.2.1-0.el9.noarch 7/84
2026-03-10T08:05:12.376 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : ceph-mgr-diskprediction-local-2:18.2.1-0.el9.noarc 8/84
2026-03-10T08:05:12.376 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : ceph-mgr-modules-core-2:18.2.1-0.el9.noarch 9/84
2026-03-10T08:05:12.376 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : ceph-mgr-rook-2:18.2.1-0.el9.noarch 10/84
2026-03-10T08:05:12.376 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : ceph-prometheus-alerts-2:18.2.1-0.el9.noarch 11/84
2026-03-10T08:05:12.376 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : ceph-selinux-2:18.2.1-0.el9.x86_64 12/84
2026-03-10T08:05:12.376 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : flexiblas-3.0.4-9.el9.x86_64 13/84
2026-03-10T08:05:12.376 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : flexiblas-netlib-3.0.4-9.el9.x86_64 14/84
2026-03-10T08:05:12.378 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : flexiblas-openblas-openmp-3.0.4-9.el9.x86_64 15/84
2026-03-10T08:05:12.378 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : fmt-8.1.1-5.el9.x86_64 16/84
2026-03-10T08:05:12.378 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : libcephsqlite-2:18.2.1-0.el9.x86_64 17/84
2026-03-10T08:05:12.378 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : libgfortran-11.5.0-14.el9.x86_64 18/84
2026-03-10T08:05:12.378 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : liboath-2.6.12-1.el9.x86_64 19/84
2026-03-10T08:05:12.378 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : libquadmath-11.5.0-14.el9.x86_64 20/84
2026-03-10T08:05:12.378 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : libradosstriper1-2:18.2.1-0.el9.x86_64 21/84
2026-03-10T08:05:12.378 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : openblas-0.3.29-1.el9.x86_64 22/84
2026-03-10T08:05:12.378 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : openblas-openmp-0.3.29-1.el9.x86_64 23/84
2026-03-10T08:05:12.378 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-asyncssh-2.13.2-5.el9.noarch 24/84
2026-03-10T08:05:12.378 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-autocommand-2.2.2-8.el9.noarch 25/84
2026-03-10T08:05:12.378 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-babel-2.9.1-2.el9.noarch 26/84
2026-03-10T08:05:12.378 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-backports-tarfile-1.2.0-1.el9.noarch 27/84
2026-03-10T08:05:12.378 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-bcrypt-3.2.2-1.el9.x86_64 28/84
2026-03-10T08:05:12.378 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-cachetools-4.2.4-1.el9.noarch 29/84
2026-03-10T08:05:12.378 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-ceph-common-2:18.2.1-0.el9.x86_64 30/84
2026-03-10T08:05:12.378 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-certifi-2023.05.07-4.el9.noarch 31/84
2026-03-10T08:05:12.379 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-cffi-1.14.5-5.el9.x86_64 32/84
2026-03-10T08:05:12.379 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-chardet-4.0.0-5.el9.noarch 33/84
2026-03-10T08:05:12.379 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-cheroot-10.0.1-4.el9.noarch 34/84
2026-03-10T08:05:12.379 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-cherrypy-18.6.1-2.el9.noarch 35/84
2026-03-10T08:05:12.379 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-cryptography-36.0.1-5.el9.x86_64 36/84
2026-03-10T08:05:12.379 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-devel-3.9.25-3.el9.x86_64 37/84
2026-03-10T08:05:12.379 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-google-auth-1:2.45.0-1.el9.noarch 38/84
2026-03-10T08:05:12.379 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-idna-2.10-7.el9.1.noarch 39/84
2026-03-10T08:05:12.379 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-jaraco-8.2.1-3.el9.noarch 40/84
2026-03-10T08:05:12.379 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-jaraco-classes-3.2.1-5.el9.noarch 41/84
2026-03-10T08:05:12.379 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-jaraco-collections-3.0.0-8.el9.noarch 42/84
2026-03-10T08:05:12.379 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-jaraco-context-6.0.1-3.el9.noarch 43/84
2026-03-10T08:05:12.379 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-jaraco-functools-3.5.0-2.el9.noarch 44/84
2026-03-10T08:05:12.379 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-jaraco-text-4.0.0-2.el9.noarch 45/84
2026-03-10T08:05:12.379 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-jinja2-2.11.3-8.el9.noarch 46/84
2026-03-10T08:05:12.379 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-jsonpatch-1.21-16.el9.noarch 47/84
2026-03-10T08:05:12.379 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-jsonpointer-2.0-4.el9.noarch 48/84
2026-03-10T08:05:12.379 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-jwt-2.4.0-1.el9.noarch 49/84
2026-03-10T08:05:12.379 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-jwt+crypto-2.4.0-1.el9.noarch 50/84
2026-03-10T08:05:12.379 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-kubernetes-1:26.1.0-3.el9.noarch 51/84
2026-03-10T08:05:12.379 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-logutils-0.3.5-21.el9.noarch 52/84
2026-03-10T08:05:12.379 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-mako-1.1.4-6.el9.noarch 53/84
2026-03-10T08:05:12.379 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-markupsafe-1.1.1-12.el9.x86_64 54/84
2026-03-10T08:05:12.379 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-more-itertools-8.12.0-2.el9.noarch 55/84
2026-03-10T08:05:12.379 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-natsort-7.1.1-5.el9.noarch 56/84
2026-03-10T08:05:12.379 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-numpy-1:1.23.5-2.el9.x86_64 57/84
2026-03-10T08:05:12.379 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-numpy-f2py-1:1.23.5-2.el9.x86_64 58/84
2026-03-10T08:05:12.379 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-oauthlib-3.1.1-5.el9.noarch 59/84
2026-03-10T08:05:12.379 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-pecan-1.4.2-3.el9.noarch 60/84
2026-03-10T08:05:12.379 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-ply-3.11-14.el9.noarch 61/84
2026-03-10T08:05:12.379 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-portend-3.1.0-2.el9.noarch 62/84
2026-03-10T08:05:12.379 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-prettytable-0.7.2-27.el9.noarch 63/84
2026-03-10T08:05:12.379 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-pyOpenSSL-21.0.0-1.el9.noarch 64/84
2026-03-10T08:05:12.379 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-pyasn1-0.4.8-7.el9.noarch 65/84
2026-03-10T08:05:12.379 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-pyasn1-modules-0.4.8-7.el9.noarch 66/84
2026-03-10T08:05:12.379 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-pycparser-2.20-6.el9.noarch 67/84
2026-03-10T08:05:12.379 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-pysocks-1.7.1-12.el9.noarch 68/84
2026-03-10T08:05:12.379 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-pytz-2021.1-5.el9.noarch 69/84
2026-03-10T08:05:12.379 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-repoze-lru-0.7-16.el9.noarch 70/84
2026-03-10T08:05:12.379 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-requests-2.25.1-10.el9.noarch 71/84
2026-03-10T08:05:12.379 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-requests-oauthlib-1.3.0-12.el9.noarch 72/84
2026-03-10T08:05:12.379 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-routes-2.5.1-5.el9.noarch 73/84
2026-03-10T08:05:12.379 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-rsa-4.9-2.el9.noarch 74/84
2026-03-10T08:05:12.379 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-scipy-1.9.3-2.el9.x86_64 75/84
2026-03-10T08:05:12.379 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-tempora-5.0.0-2.el9.noarch 76/84
2026-03-10T08:05:12.379 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-toml-0.10.2-6.el9.noarch 77/84
2026-03-10T08:05:12.379 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-typing-extensions-4.15.0-1.el9.noarch 78/84
2026-03-10T08:05:12.379 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-urllib3-1.26.5-7.el9.noarch 79/84
2026-03-10T08:05:12.379 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-webob-1.8.8-2.el9.noarch 80/84
2026-03-10T08:05:12.379 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-websocket-client-1.2.3-2.el9.noarch 81/84
2026-03-10T08:05:12.379 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-werkzeug-2.0.3-3.el9.1.noarch 82/84
2026-03-10T08:05:12.379 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-zc-lockfile-2.0-10.el9.noarch 83/84
2026-03-10T08:05:12.388 INFO:teuthology.orchestra.run.vm08.stdout: Running scriptlet: libcephsqlite-2:18.2.1-0.el9.x86_64 84/84
2026-03-10T08:05:12.388 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : ceph-base-2:18.2.1-0.el9.x86_64 1/84
2026-03-10T08:05:12.388 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : ceph-common-2:18.2.1-0.el9.x86_64 2/84
2026-03-10T08:05:12.388 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : ceph-grafana-dashboards-2:18.2.1-0.el9.noarch 3/84
2026-03-10T08:05:12.388 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : ceph-immutable-object-cache-2:18.2.1-0.el9.x86_64 4/84
2026-03-10T08:05:12.388 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : ceph-mgr-2:18.2.1-0.el9.x86_64 5/84
2026-03-10T08:05:12.388 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : ceph-mgr-cephadm-2:18.2.1-0.el9.noarch 6/84
2026-03-10T08:05:12.388 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : ceph-mgr-dashboard-2:18.2.1-0.el9.noarch 7/84
2026-03-10T08:05:12.388 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : ceph-mgr-diskprediction-local-2:18.2.1-0.el9.noarc 8/84
2026-03-10T08:05:12.388 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : ceph-mgr-modules-core-2:18.2.1-0.el9.noarch 9/84
2026-03-10T08:05:12.388 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : ceph-mgr-rook-2:18.2.1-0.el9.noarch 10/84
2026-03-10T08:05:12.388 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : ceph-prometheus-alerts-2:18.2.1-0.el9.noarch 11/84
2026-03-10T08:05:12.388 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : ceph-selinux-2:18.2.1-0.el9.x86_64 12/84
2026-03-10T08:05:12.388 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : flexiblas-3.0.4-9.el9.x86_64 13/84
2026-03-10T08:05:12.388 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : flexiblas-netlib-3.0.4-9.el9.x86_64 14/84
2026-03-10T08:05:12.390 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : flexiblas-openblas-openmp-3.0.4-9.el9.x86_64 15/84
2026-03-10T08:05:12.390 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : fmt-8.1.1-5.el9.x86_64 16/84
2026-03-10T08:05:12.390 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : libcephsqlite-2:18.2.1-0.el9.x86_64 17/84
2026-03-10T08:05:12.390 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : libgfortran-11.5.0-14.el9.x86_64 18/84
2026-03-10T08:05:12.390 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : liboath-2.6.12-1.el9.x86_64 19/84
2026-03-10T08:05:12.390 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : libquadmath-11.5.0-14.el9.x86_64 20/84
2026-03-10T08:05:12.390 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : libradosstriper1-2:18.2.1-0.el9.x86_64 21/84
2026-03-10T08:05:12.390 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : openblas-0.3.29-1.el9.x86_64 22/84
2026-03-10T08:05:12.390 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : openblas-openmp-0.3.29-1.el9.x86_64 23/84
2026-03-10T08:05:12.390 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-asyncssh-2.13.2-5.el9.noarch 24/84
2026-03-10T08:05:12.390 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-autocommand-2.2.2-8.el9.noarch 25/84
2026-03-10T08:05:12.390 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-babel-2.9.1-2.el9.noarch 26/84
2026-03-10T08:05:12.390 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-backports-tarfile-1.2.0-1.el9.noarch 27/84
2026-03-10T08:05:12.390 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-bcrypt-3.2.2-1.el9.x86_64 28/84
2026-03-10T08:05:12.390 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-cachetools-4.2.4-1.el9.noarch 29/84
2026-03-10T08:05:12.390 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-ceph-common-2:18.2.1-0.el9.x86_64 30/84
2026-03-10T08:05:12.390 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-certifi-2023.05.07-4.el9.noarch 31/84
2026-03-10T08:05:12.391 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-cffi-1.14.5-5.el9.x86_64 32/84
2026-03-10T08:05:12.391 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-chardet-4.0.0-5.el9.noarch 33/84
2026-03-10T08:05:12.391 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-cheroot-10.0.1-4.el9.noarch 34/84
2026-03-10T08:05:12.391 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-cherrypy-18.6.1-2.el9.noarch 35/84
2026-03-10T08:05:12.391 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-cryptography-36.0.1-5.el9.x86_64 36/84
2026-03-10T08:05:12.391 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-devel-3.9.25-3.el9.x86_64 37/84
2026-03-10T08:05:12.391 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-google-auth-1:2.45.0-1.el9.noarch 38/84
2026-03-10T08:05:12.391 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-idna-2.10-7.el9.1.noarch 39/84
2026-03-10T08:05:12.391 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-jaraco-8.2.1-3.el9.noarch 40/84
2026-03-10T08:05:12.391 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-jaraco-classes-3.2.1-5.el9.noarch 41/84
2026-03-10T08:05:12.391 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-jaraco-collections-3.0.0-8.el9.noarch 42/84
2026-03-10T08:05:12.391 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-jaraco-context-6.0.1-3.el9.noarch 43/84
2026-03-10T08:05:12.391 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-jaraco-functools-3.5.0-2.el9.noarch 44/84
2026-03-10T08:05:12.391 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-jaraco-text-4.0.0-2.el9.noarch 45/84
2026-03-10T08:05:12.391 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-jinja2-2.11.3-8.el9.noarch 46/84
2026-03-10T08:05:12.391 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-jsonpatch-1.21-16.el9.noarch 47/84
2026-03-10T08:05:12.391 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-jsonpointer-2.0-4.el9.noarch 48/84
2026-03-10T08:05:12.391 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-jwt-2.4.0-1.el9.noarch 49/84
2026-03-10T08:05:12.391 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-jwt+crypto-2.4.0-1.el9.noarch 50/84
2026-03-10T08:05:12.391 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-kubernetes-1:26.1.0-3.el9.noarch 51/84
2026-03-10T08:05:12.391 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-logutils-0.3.5-21.el9.noarch 52/84
2026-03-10T08:05:12.391 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-mako-1.1.4-6.el9.noarch 53/84
2026-03-10T08:05:12.391 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-markupsafe-1.1.1-12.el9.x86_64 54/84
2026-03-10T08:05:12.391 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-more-itertools-8.12.0-2.el9.noarch 55/84
2026-03-10T08:05:12.391 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-natsort-7.1.1-5.el9.noarch 56/84
2026-03-10T08:05:12.391 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-numpy-1:1.23.5-2.el9.x86_64 57/84
2026-03-10T08:05:12.391 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-numpy-f2py-1:1.23.5-2.el9.x86_64 58/84
2026-03-10T08:05:12.391 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-oauthlib-3.1.1-5.el9.noarch 59/84
2026-03-10T08:05:12.391 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-pecan-1.4.2-3.el9.noarch 60/84
2026-03-10T08:05:12.391 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-ply-3.11-14.el9.noarch 61/84
2026-03-10T08:05:12.391 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-portend-3.1.0-2.el9.noarch 62/84
2026-03-10T08:05:12.391 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-prettytable-0.7.2-27.el9.noarch 63/84
2026-03-10T08:05:12.391 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-pyOpenSSL-21.0.0-1.el9.noarch 64/84
2026-03-10T08:05:12.391 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-pyasn1-0.4.8-7.el9.noarch 65/84
2026-03-10T08:05:12.391 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-pyasn1-modules-0.4.8-7.el9.noarch 66/84
2026-03-10T08:05:12.391 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-pycparser-2.20-6.el9.noarch 67/84
2026-03-10T08:05:12.391 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-pysocks-1.7.1-12.el9.noarch 68/84
2026-03-10T08:05:12.391 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-pytz-2021.1-5.el9.noarch 69/84
2026-03-10T08:05:12.391 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-repoze-lru-0.7-16.el9.noarch 70/84
2026-03-10T08:05:12.391 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-requests-2.25.1-10.el9.noarch 71/84
2026-03-10T08:05:12.391 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-requests-oauthlib-1.3.0-12.el9.noarch 72/84
2026-03-10T08:05:12.391 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-routes-2.5.1-5.el9.noarch 73/84
2026-03-10T08:05:12.391 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-rsa-4.9-2.el9.noarch 74/84
2026-03-10T08:05:12.391 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-scipy-1.9.3-2.el9.x86_64 75/84
2026-03-10T08:05:12.391 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-tempora-5.0.0-2.el9.noarch 76/84
2026-03-10T08:05:12.392 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-toml-0.10.2-6.el9.noarch 77/84
2026-03-10T08:05:12.392 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-typing-extensions-4.15.0-1.el9.noarch 78/84
2026-03-10T08:05:12.392 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-urllib3-1.26.5-7.el9.noarch 79/84
2026-03-10T08:05:12.392 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-webob-1.8.8-2.el9.noarch 80/84
2026-03-10T08:05:12.392 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-websocket-client-1.2.3-2.el9.noarch 81/84
2026-03-10T08:05:12.392 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-werkzeug-2.0.3-3.el9.1.noarch 82/84
2026-03-10T08:05:12.392 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-zc-lockfile-2.0-10.el9.noarch 83/84
2026-03-10T08:05:12.449 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : rbd-mirror-2:18.2.1-0.el9.x86_64 84/84
2026-03-10T08:05:12.450 INFO:teuthology.orchestra.run.vm05.stdout:
2026-03-10T08:05:12.450 INFO:teuthology.orchestra.run.vm05.stdout:Removed:
2026-03-10T08:05:12.450 INFO:teuthology.orchestra.run.vm05.stdout: ceph-base-2:18.2.1-0.el9.x86_64
2026-03-10T08:05:12.450 INFO:teuthology.orchestra.run.vm05.stdout: ceph-common-2:18.2.1-0.el9.x86_64
2026-03-10T08:05:12.450 INFO:teuthology.orchestra.run.vm05.stdout: ceph-grafana-dashboards-2:18.2.1-0.el9.noarch
2026-03-10T08:05:12.450 INFO:teuthology.orchestra.run.vm05.stdout: ceph-immutable-object-cache-2:18.2.1-0.el9.x86_64
2026-03-10T08:05:12.450 INFO:teuthology.orchestra.run.vm05.stdout: ceph-mgr-2:18.2.1-0.el9.x86_64
2026-03-10T08:05:12.450 INFO:teuthology.orchestra.run.vm05.stdout: ceph-mgr-cephadm-2:18.2.1-0.el9.noarch
2026-03-10T08:05:12.450 INFO:teuthology.orchestra.run.vm05.stdout: ceph-mgr-dashboard-2:18.2.1-0.el9.noarch
2026-03-10T08:05:12.450 INFO:teuthology.orchestra.run.vm05.stdout: ceph-mgr-diskprediction-local-2:18.2.1-0.el9.noarch
2026-03-10T08:05:12.450 INFO:teuthology.orchestra.run.vm05.stdout: ceph-mgr-modules-core-2:18.2.1-0.el9.noarch
2026-03-10T08:05:12.450 INFO:teuthology.orchestra.run.vm05.stdout: ceph-mgr-rook-2:18.2.1-0.el9.noarch
2026-03-10T08:05:12.450 INFO:teuthology.orchestra.run.vm05.stdout: ceph-prometheus-alerts-2:18.2.1-0.el9.noarch
2026-03-10T08:05:12.450 INFO:teuthology.orchestra.run.vm05.stdout: ceph-selinux-2:18.2.1-0.el9.x86_64
2026-03-10T08:05:12.450 INFO:teuthology.orchestra.run.vm05.stdout: flexiblas-3.0.4-9.el9.x86_64
2026-03-10T08:05:12.450 INFO:teuthology.orchestra.run.vm05.stdout: flexiblas-netlib-3.0.4-9.el9.x86_64
2026-03-10T08:05:12.450 INFO:teuthology.orchestra.run.vm05.stdout: flexiblas-openblas-openmp-3.0.4-9.el9.x86_64
2026-03-10T08:05:12.450 INFO:teuthology.orchestra.run.vm05.stdout: fmt-8.1.1-5.el9.x86_64
2026-03-10T08:05:12.450 INFO:teuthology.orchestra.run.vm05.stdout: libcephsqlite-2:18.2.1-0.el9.x86_64
2026-03-10T08:05:12.450 INFO:teuthology.orchestra.run.vm05.stdout: libgfortran-11.5.0-14.el9.x86_64
2026-03-10T08:05:12.450 INFO:teuthology.orchestra.run.vm05.stdout: liboath-2.6.12-1.el9.x86_64
2026-03-10T08:05:12.450 INFO:teuthology.orchestra.run.vm05.stdout: libquadmath-11.5.0-14.el9.x86_64
2026-03-10T08:05:12.450 INFO:teuthology.orchestra.run.vm05.stdout: libradosstriper1-2:18.2.1-0.el9.x86_64
2026-03-10T08:05:12.450 INFO:teuthology.orchestra.run.vm05.stdout: openblas-0.3.29-1.el9.x86_64
2026-03-10T08:05:12.450 INFO:teuthology.orchestra.run.vm05.stdout: openblas-openmp-0.3.29-1.el9.x86_64
2026-03-10T08:05:12.450 INFO:teuthology.orchestra.run.vm05.stdout: python3-asyncssh-2.13.2-5.el9.noarch
2026-03-10T08:05:12.450 INFO:teuthology.orchestra.run.vm05.stdout: python3-autocommand-2.2.2-8.el9.noarch
2026-03-10T08:05:12.450 INFO:teuthology.orchestra.run.vm05.stdout: python3-babel-2.9.1-2.el9.noarch
2026-03-10T08:05:12.450 INFO:teuthology.orchestra.run.vm05.stdout: python3-backports-tarfile-1.2.0-1.el9.noarch
2026-03-10T08:05:12.450 INFO:teuthology.orchestra.run.vm05.stdout: python3-bcrypt-3.2.2-1.el9.x86_64
2026-03-10T08:05:12.450 INFO:teuthology.orchestra.run.vm05.stdout: python3-cachetools-4.2.4-1.el9.noarch
2026-03-10T08:05:12.450 INFO:teuthology.orchestra.run.vm05.stdout: python3-ceph-common-2:18.2.1-0.el9.x86_64
2026-03-10T08:05:12.450 INFO:teuthology.orchestra.run.vm05.stdout: python3-certifi-2023.05.07-4.el9.noarch
2026-03-10T08:05:12.450 INFO:teuthology.orchestra.run.vm05.stdout: python3-cffi-1.14.5-5.el9.x86_64
2026-03-10T08:05:12.450 INFO:teuthology.orchestra.run.vm05.stdout: python3-chardet-4.0.0-5.el9.noarch
2026-03-10T08:05:12.450 INFO:teuthology.orchestra.run.vm05.stdout: python3-cheroot-10.0.1-4.el9.noarch
2026-03-10T08:05:12.450 INFO:teuthology.orchestra.run.vm05.stdout: python3-cherrypy-18.6.1-2.el9.noarch
2026-03-10T08:05:12.450 INFO:teuthology.orchestra.run.vm05.stdout: python3-cryptography-36.0.1-5.el9.x86_64
2026-03-10T08:05:12.450 INFO:teuthology.orchestra.run.vm05.stdout: python3-devel-3.9.25-3.el9.x86_64
2026-03-10T08:05:12.450
INFO:teuthology.orchestra.run.vm05.stdout: python3-google-auth-1:2.45.0-1.el9.noarch 2026-03-10T08:05:12.450 INFO:teuthology.orchestra.run.vm05.stdout: python3-idna-2.10-7.el9.1.noarch 2026-03-10T08:05:12.450 INFO:teuthology.orchestra.run.vm05.stdout: python3-jaraco-8.2.1-3.el9.noarch 2026-03-10T08:05:12.450 INFO:teuthology.orchestra.run.vm05.stdout: python3-jaraco-classes-3.2.1-5.el9.noarch 2026-03-10T08:05:12.450 INFO:teuthology.orchestra.run.vm05.stdout: python3-jaraco-collections-3.0.0-8.el9.noarch 2026-03-10T08:05:12.450 INFO:teuthology.orchestra.run.vm05.stdout: python3-jaraco-context-6.0.1-3.el9.noarch 2026-03-10T08:05:12.450 INFO:teuthology.orchestra.run.vm05.stdout: python3-jaraco-functools-3.5.0-2.el9.noarch 2026-03-10T08:05:12.450 INFO:teuthology.orchestra.run.vm05.stdout: python3-jaraco-text-4.0.0-2.el9.noarch 2026-03-10T08:05:12.450 INFO:teuthology.orchestra.run.vm05.stdout: python3-jinja2-2.11.3-8.el9.noarch 2026-03-10T08:05:12.450 INFO:teuthology.orchestra.run.vm05.stdout: python3-jsonpatch-1.21-16.el9.noarch 2026-03-10T08:05:12.450 INFO:teuthology.orchestra.run.vm05.stdout: python3-jsonpointer-2.0-4.el9.noarch 2026-03-10T08:05:12.450 INFO:teuthology.orchestra.run.vm05.stdout: python3-jwt-2.4.0-1.el9.noarch 2026-03-10T08:05:12.450 INFO:teuthology.orchestra.run.vm05.stdout: python3-jwt+crypto-2.4.0-1.el9.noarch 2026-03-10T08:05:12.450 INFO:teuthology.orchestra.run.vm05.stdout: python3-kubernetes-1:26.1.0-3.el9.noarch 2026-03-10T08:05:12.450 INFO:teuthology.orchestra.run.vm05.stdout: python3-logutils-0.3.5-21.el9.noarch 2026-03-10T08:05:12.450 INFO:teuthology.orchestra.run.vm05.stdout: python3-mako-1.1.4-6.el9.noarch 2026-03-10T08:05:12.450 INFO:teuthology.orchestra.run.vm05.stdout: python3-markupsafe-1.1.1-12.el9.x86_64 2026-03-10T08:05:12.450 INFO:teuthology.orchestra.run.vm05.stdout: python3-more-itertools-8.12.0-2.el9.noarch 2026-03-10T08:05:12.451 INFO:teuthology.orchestra.run.vm05.stdout: python3-natsort-7.1.1-5.el9.noarch 2026-03-10T08:05:12.451 
INFO:teuthology.orchestra.run.vm05.stdout: python3-numpy-1:1.23.5-2.el9.x86_64 2026-03-10T08:05:12.451 INFO:teuthology.orchestra.run.vm05.stdout: python3-numpy-f2py-1:1.23.5-2.el9.x86_64 2026-03-10T08:05:12.451 INFO:teuthology.orchestra.run.vm05.stdout: python3-oauthlib-3.1.1-5.el9.noarch 2026-03-10T08:05:12.451 INFO:teuthology.orchestra.run.vm05.stdout: python3-pecan-1.4.2-3.el9.noarch 2026-03-10T08:05:12.451 INFO:teuthology.orchestra.run.vm05.stdout: python3-ply-3.11-14.el9.noarch 2026-03-10T08:05:12.451 INFO:teuthology.orchestra.run.vm05.stdout: python3-portend-3.1.0-2.el9.noarch 2026-03-10T08:05:12.451 INFO:teuthology.orchestra.run.vm05.stdout: python3-prettytable-0.7.2-27.el9.noarch 2026-03-10T08:05:12.451 INFO:teuthology.orchestra.run.vm05.stdout: python3-pyOpenSSL-21.0.0-1.el9.noarch 2026-03-10T08:05:12.451 INFO:teuthology.orchestra.run.vm05.stdout: python3-pyasn1-0.4.8-7.el9.noarch 2026-03-10T08:05:12.451 INFO:teuthology.orchestra.run.vm05.stdout: python3-pyasn1-modules-0.4.8-7.el9.noarch 2026-03-10T08:05:12.451 INFO:teuthology.orchestra.run.vm05.stdout: python3-pycparser-2.20-6.el9.noarch 2026-03-10T08:05:12.451 INFO:teuthology.orchestra.run.vm05.stdout: python3-pysocks-1.7.1-12.el9.noarch 2026-03-10T08:05:12.451 INFO:teuthology.orchestra.run.vm05.stdout: python3-pytz-2021.1-5.el9.noarch 2026-03-10T08:05:12.451 INFO:teuthology.orchestra.run.vm05.stdout: python3-repoze-lru-0.7-16.el9.noarch 2026-03-10T08:05:12.451 INFO:teuthology.orchestra.run.vm05.stdout: python3-requests-2.25.1-10.el9.noarch 2026-03-10T08:05:12.451 INFO:teuthology.orchestra.run.vm05.stdout: python3-requests-oauthlib-1.3.0-12.el9.noarch 2026-03-10T08:05:12.451 INFO:teuthology.orchestra.run.vm05.stdout: python3-routes-2.5.1-5.el9.noarch 2026-03-10T08:05:12.451 INFO:teuthology.orchestra.run.vm05.stdout: python3-rsa-4.9-2.el9.noarch 2026-03-10T08:05:12.451 INFO:teuthology.orchestra.run.vm05.stdout: python3-scipy-1.9.3-2.el9.x86_64 2026-03-10T08:05:12.451 
INFO:teuthology.orchestra.run.vm05.stdout: python3-tempora-5.0.0-2.el9.noarch 2026-03-10T08:05:12.451 INFO:teuthology.orchestra.run.vm05.stdout: python3-toml-0.10.2-6.el9.noarch 2026-03-10T08:05:12.451 INFO:teuthology.orchestra.run.vm05.stdout: python3-typing-extensions-4.15.0-1.el9.noarch 2026-03-10T08:05:12.451 INFO:teuthology.orchestra.run.vm05.stdout: python3-urllib3-1.26.5-7.el9.noarch 2026-03-10T08:05:12.451 INFO:teuthology.orchestra.run.vm05.stdout: python3-webob-1.8.8-2.el9.noarch 2026-03-10T08:05:12.451 INFO:teuthology.orchestra.run.vm05.stdout: python3-websocket-client-1.2.3-2.el9.noarch 2026-03-10T08:05:12.451 INFO:teuthology.orchestra.run.vm05.stdout: python3-werkzeug-2.0.3-3.el9.1.noarch 2026-03-10T08:05:12.451 INFO:teuthology.orchestra.run.vm05.stdout: python3-zc-lockfile-2.0-10.el9.noarch 2026-03-10T08:05:12.451 INFO:teuthology.orchestra.run.vm05.stdout: rbd-mirror-2:18.2.1-0.el9.x86_64 2026-03-10T08:05:12.451 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-10T08:05:12.451 INFO:teuthology.orchestra.run.vm05.stdout:Complete! 
2026-03-10T08:05:12.471 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : rbd-mirror-2:18.2.1-0.el9.x86_64 84/84 2026-03-10T08:05:12.472 INFO:teuthology.orchestra.run.vm08.stdout: 2026-03-10T08:05:12.472 INFO:teuthology.orchestra.run.vm08.stdout:Removed: 2026-03-10T08:05:12.472 INFO:teuthology.orchestra.run.vm08.stdout: ceph-base-2:18.2.1-0.el9.x86_64 2026-03-10T08:05:12.472 INFO:teuthology.orchestra.run.vm08.stdout: ceph-common-2:18.2.1-0.el9.x86_64 2026-03-10T08:05:12.472 INFO:teuthology.orchestra.run.vm08.stdout: ceph-grafana-dashboards-2:18.2.1-0.el9.noarch 2026-03-10T08:05:12.472 INFO:teuthology.orchestra.run.vm08.stdout: ceph-immutable-object-cache-2:18.2.1-0.el9.x86_64 2026-03-10T08:05:12.472 INFO:teuthology.orchestra.run.vm08.stdout: ceph-mgr-2:18.2.1-0.el9.x86_64 2026-03-10T08:05:12.472 INFO:teuthology.orchestra.run.vm08.stdout: ceph-mgr-cephadm-2:18.2.1-0.el9.noarch 2026-03-10T08:05:12.472 INFO:teuthology.orchestra.run.vm08.stdout: ceph-mgr-dashboard-2:18.2.1-0.el9.noarch 2026-03-10T08:05:12.472 INFO:teuthology.orchestra.run.vm08.stdout: ceph-mgr-diskprediction-local-2:18.2.1-0.el9.noarch 2026-03-10T08:05:12.472 INFO:teuthology.orchestra.run.vm08.stdout: ceph-mgr-modules-core-2:18.2.1-0.el9.noarch 2026-03-10T08:05:12.472 INFO:teuthology.orchestra.run.vm08.stdout: ceph-mgr-rook-2:18.2.1-0.el9.noarch 2026-03-10T08:05:12.472 INFO:teuthology.orchestra.run.vm08.stdout: ceph-prometheus-alerts-2:18.2.1-0.el9.noarch 2026-03-10T08:05:12.472 INFO:teuthology.orchestra.run.vm08.stdout: ceph-selinux-2:18.2.1-0.el9.x86_64 2026-03-10T08:05:12.472 INFO:teuthology.orchestra.run.vm08.stdout: flexiblas-3.0.4-9.el9.x86_64 2026-03-10T08:05:12.472 INFO:teuthology.orchestra.run.vm08.stdout: flexiblas-netlib-3.0.4-9.el9.x86_64 2026-03-10T08:05:12.472 INFO:teuthology.orchestra.run.vm08.stdout: flexiblas-openblas-openmp-3.0.4-9.el9.x86_64 2026-03-10T08:05:12.472 INFO:teuthology.orchestra.run.vm08.stdout: fmt-8.1.1-5.el9.x86_64 2026-03-10T08:05:12.472 
INFO:teuthology.orchestra.run.vm08.stdout: libcephsqlite-2:18.2.1-0.el9.x86_64 2026-03-10T08:05:12.472 INFO:teuthology.orchestra.run.vm08.stdout: libgfortran-11.5.0-14.el9.x86_64 2026-03-10T08:05:12.472 INFO:teuthology.orchestra.run.vm08.stdout: liboath-2.6.12-1.el9.x86_64 2026-03-10T08:05:12.472 INFO:teuthology.orchestra.run.vm08.stdout: libquadmath-11.5.0-14.el9.x86_64 2026-03-10T08:05:12.472 INFO:teuthology.orchestra.run.vm08.stdout: libradosstriper1-2:18.2.1-0.el9.x86_64 2026-03-10T08:05:12.472 INFO:teuthology.orchestra.run.vm08.stdout: openblas-0.3.29-1.el9.x86_64 2026-03-10T08:05:12.472 INFO:teuthology.orchestra.run.vm08.stdout: openblas-openmp-0.3.29-1.el9.x86_64 2026-03-10T08:05:12.472 INFO:teuthology.orchestra.run.vm08.stdout: python3-asyncssh-2.13.2-5.el9.noarch 2026-03-10T08:05:12.472 INFO:teuthology.orchestra.run.vm08.stdout: python3-autocommand-2.2.2-8.el9.noarch 2026-03-10T08:05:12.472 INFO:teuthology.orchestra.run.vm08.stdout: python3-babel-2.9.1-2.el9.noarch 2026-03-10T08:05:12.472 INFO:teuthology.orchestra.run.vm08.stdout: python3-backports-tarfile-1.2.0-1.el9.noarch 2026-03-10T08:05:12.472 INFO:teuthology.orchestra.run.vm08.stdout: python3-bcrypt-3.2.2-1.el9.x86_64 2026-03-10T08:05:12.472 INFO:teuthology.orchestra.run.vm08.stdout: python3-cachetools-4.2.4-1.el9.noarch 2026-03-10T08:05:12.472 INFO:teuthology.orchestra.run.vm08.stdout: python3-ceph-common-2:18.2.1-0.el9.x86_64 2026-03-10T08:05:12.472 INFO:teuthology.orchestra.run.vm08.stdout: python3-certifi-2023.05.07-4.el9.noarch 2026-03-10T08:05:12.472 INFO:teuthology.orchestra.run.vm08.stdout: python3-cffi-1.14.5-5.el9.x86_64 2026-03-10T08:05:12.472 INFO:teuthology.orchestra.run.vm08.stdout: python3-chardet-4.0.0-5.el9.noarch 2026-03-10T08:05:12.472 INFO:teuthology.orchestra.run.vm08.stdout: python3-cheroot-10.0.1-4.el9.noarch 2026-03-10T08:05:12.472 INFO:teuthology.orchestra.run.vm08.stdout: python3-cherrypy-18.6.1-2.el9.noarch 2026-03-10T08:05:12.472 INFO:teuthology.orchestra.run.vm08.stdout: 
python3-cryptography-36.0.1-5.el9.x86_64 2026-03-10T08:05:12.472 INFO:teuthology.orchestra.run.vm08.stdout: python3-devel-3.9.25-3.el9.x86_64 2026-03-10T08:05:12.472 INFO:teuthology.orchestra.run.vm08.stdout: python3-google-auth-1:2.45.0-1.el9.noarch 2026-03-10T08:05:12.472 INFO:teuthology.orchestra.run.vm08.stdout: python3-idna-2.10-7.el9.1.noarch 2026-03-10T08:05:12.472 INFO:teuthology.orchestra.run.vm08.stdout: python3-jaraco-8.2.1-3.el9.noarch 2026-03-10T08:05:12.472 INFO:teuthology.orchestra.run.vm08.stdout: python3-jaraco-classes-3.2.1-5.el9.noarch 2026-03-10T08:05:12.472 INFO:teuthology.orchestra.run.vm08.stdout: python3-jaraco-collections-3.0.0-8.el9.noarch 2026-03-10T08:05:12.472 INFO:teuthology.orchestra.run.vm08.stdout: python3-jaraco-context-6.0.1-3.el9.noarch 2026-03-10T08:05:12.472 INFO:teuthology.orchestra.run.vm08.stdout: python3-jaraco-functools-3.5.0-2.el9.noarch 2026-03-10T08:05:12.472 INFO:teuthology.orchestra.run.vm08.stdout: python3-jaraco-text-4.0.0-2.el9.noarch 2026-03-10T08:05:12.472 INFO:teuthology.orchestra.run.vm08.stdout: python3-jinja2-2.11.3-8.el9.noarch 2026-03-10T08:05:12.472 INFO:teuthology.orchestra.run.vm08.stdout: python3-jsonpatch-1.21-16.el9.noarch 2026-03-10T08:05:12.472 INFO:teuthology.orchestra.run.vm08.stdout: python3-jsonpointer-2.0-4.el9.noarch 2026-03-10T08:05:12.472 INFO:teuthology.orchestra.run.vm08.stdout: python3-jwt-2.4.0-1.el9.noarch 2026-03-10T08:05:12.472 INFO:teuthology.orchestra.run.vm08.stdout: python3-jwt+crypto-2.4.0-1.el9.noarch 2026-03-10T08:05:12.472 INFO:teuthology.orchestra.run.vm08.stdout: python3-kubernetes-1:26.1.0-3.el9.noarch 2026-03-10T08:05:12.472 INFO:teuthology.orchestra.run.vm08.stdout: python3-logutils-0.3.5-21.el9.noarch 2026-03-10T08:05:12.472 INFO:teuthology.orchestra.run.vm08.stdout: python3-mako-1.1.4-6.el9.noarch 2026-03-10T08:05:12.472 INFO:teuthology.orchestra.run.vm08.stdout: python3-markupsafe-1.1.1-12.el9.x86_64 2026-03-10T08:05:12.472 INFO:teuthology.orchestra.run.vm08.stdout: 
python3-more-itertools-8.12.0-2.el9.noarch 2026-03-10T08:05:12.473 INFO:teuthology.orchestra.run.vm08.stdout: python3-natsort-7.1.1-5.el9.noarch 2026-03-10T08:05:12.473 INFO:teuthology.orchestra.run.vm08.stdout: python3-numpy-1:1.23.5-2.el9.x86_64 2026-03-10T08:05:12.473 INFO:teuthology.orchestra.run.vm08.stdout: python3-numpy-f2py-1:1.23.5-2.el9.x86_64 2026-03-10T08:05:12.473 INFO:teuthology.orchestra.run.vm08.stdout: python3-oauthlib-3.1.1-5.el9.noarch 2026-03-10T08:05:12.473 INFO:teuthology.orchestra.run.vm08.stdout: python3-pecan-1.4.2-3.el9.noarch 2026-03-10T08:05:12.473 INFO:teuthology.orchestra.run.vm08.stdout: python3-ply-3.11-14.el9.noarch 2026-03-10T08:05:12.473 INFO:teuthology.orchestra.run.vm08.stdout: python3-portend-3.1.0-2.el9.noarch 2026-03-10T08:05:12.473 INFO:teuthology.orchestra.run.vm08.stdout: python3-prettytable-0.7.2-27.el9.noarch 2026-03-10T08:05:12.473 INFO:teuthology.orchestra.run.vm08.stdout: python3-pyOpenSSL-21.0.0-1.el9.noarch 2026-03-10T08:05:12.473 INFO:teuthology.orchestra.run.vm08.stdout: python3-pyasn1-0.4.8-7.el9.noarch 2026-03-10T08:05:12.473 INFO:teuthology.orchestra.run.vm08.stdout: python3-pyasn1-modules-0.4.8-7.el9.noarch 2026-03-10T08:05:12.473 INFO:teuthology.orchestra.run.vm08.stdout: python3-pycparser-2.20-6.el9.noarch 2026-03-10T08:05:12.473 INFO:teuthology.orchestra.run.vm08.stdout: python3-pysocks-1.7.1-12.el9.noarch 2026-03-10T08:05:12.473 INFO:teuthology.orchestra.run.vm08.stdout: python3-pytz-2021.1-5.el9.noarch 2026-03-10T08:05:12.473 INFO:teuthology.orchestra.run.vm08.stdout: python3-repoze-lru-0.7-16.el9.noarch 2026-03-10T08:05:12.473 INFO:teuthology.orchestra.run.vm08.stdout: python3-requests-2.25.1-10.el9.noarch 2026-03-10T08:05:12.473 INFO:teuthology.orchestra.run.vm08.stdout: python3-requests-oauthlib-1.3.0-12.el9.noarch 2026-03-10T08:05:12.473 INFO:teuthology.orchestra.run.vm08.stdout: python3-routes-2.5.1-5.el9.noarch 2026-03-10T08:05:12.473 INFO:teuthology.orchestra.run.vm08.stdout: 
python3-rsa-4.9-2.el9.noarch 2026-03-10T08:05:12.473 INFO:teuthology.orchestra.run.vm08.stdout: python3-scipy-1.9.3-2.el9.x86_64 2026-03-10T08:05:12.473 INFO:teuthology.orchestra.run.vm08.stdout: python3-tempora-5.0.0-2.el9.noarch 2026-03-10T08:05:12.473 INFO:teuthology.orchestra.run.vm08.stdout: python3-toml-0.10.2-6.el9.noarch 2026-03-10T08:05:12.473 INFO:teuthology.orchestra.run.vm08.stdout: python3-typing-extensions-4.15.0-1.el9.noarch 2026-03-10T08:05:12.473 INFO:teuthology.orchestra.run.vm08.stdout: python3-urllib3-1.26.5-7.el9.noarch 2026-03-10T08:05:12.473 INFO:teuthology.orchestra.run.vm08.stdout: python3-webob-1.8.8-2.el9.noarch 2026-03-10T08:05:12.473 INFO:teuthology.orchestra.run.vm08.stdout: python3-websocket-client-1.2.3-2.el9.noarch 2026-03-10T08:05:12.473 INFO:teuthology.orchestra.run.vm08.stdout: python3-werkzeug-2.0.3-3.el9.1.noarch 2026-03-10T08:05:12.473 INFO:teuthology.orchestra.run.vm08.stdout: python3-zc-lockfile-2.0-10.el9.noarch 2026-03-10T08:05:12.473 INFO:teuthology.orchestra.run.vm08.stdout: rbd-mirror-2:18.2.1-0.el9.x86_64 2026-03-10T08:05:12.473 INFO:teuthology.orchestra.run.vm08.stdout: 2026-03-10T08:05:12.473 INFO:teuthology.orchestra.run.vm08.stdout:Complete! 2026-03-10T08:05:12.666 INFO:teuthology.orchestra.run.vm05.stdout:Dependencies resolved. 
2026-03-10T08:05:12.666 INFO:teuthology.orchestra.run.vm05.stdout:================================================================================ 2026-03-10T08:05:12.666 INFO:teuthology.orchestra.run.vm05.stdout: Package Architecture Version Repository Size 2026-03-10T08:05:12.666 INFO:teuthology.orchestra.run.vm05.stdout:================================================================================ 2026-03-10T08:05:12.666 INFO:teuthology.orchestra.run.vm05.stdout:Removing: 2026-03-10T08:05:12.666 INFO:teuthology.orchestra.run.vm05.stdout: cephadm noarch 2:18.2.1-0.el9 @ceph-noarch 213 k 2026-03-10T08:05:12.666 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-10T08:05:12.666 INFO:teuthology.orchestra.run.vm05.stdout:Transaction Summary 2026-03-10T08:05:12.667 INFO:teuthology.orchestra.run.vm05.stdout:================================================================================ 2026-03-10T08:05:12.667 INFO:teuthology.orchestra.run.vm05.stdout:Remove 1 Package 2026-03-10T08:05:12.667 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-10T08:05:12.667 INFO:teuthology.orchestra.run.vm05.stdout:Freed space: 213 k 2026-03-10T08:05:12.667 INFO:teuthology.orchestra.run.vm05.stdout:Running transaction check 2026-03-10T08:05:12.668 INFO:teuthology.orchestra.run.vm05.stdout:Transaction check succeeded. 2026-03-10T08:05:12.668 INFO:teuthology.orchestra.run.vm05.stdout:Running transaction test 2026-03-10T08:05:12.669 INFO:teuthology.orchestra.run.vm05.stdout:Transaction test succeeded. 2026-03-10T08:05:12.670 INFO:teuthology.orchestra.run.vm05.stdout:Running transaction 2026-03-10T08:05:12.682 INFO:teuthology.orchestra.run.vm08.stdout:Dependencies resolved. 
2026-03-10T08:05:12.683 INFO:teuthology.orchestra.run.vm08.stdout:================================================================================ 2026-03-10T08:05:12.683 INFO:teuthology.orchestra.run.vm08.stdout: Package Architecture Version Repository Size 2026-03-10T08:05:12.683 INFO:teuthology.orchestra.run.vm08.stdout:================================================================================ 2026-03-10T08:05:12.683 INFO:teuthology.orchestra.run.vm08.stdout:Removing: 2026-03-10T08:05:12.683 INFO:teuthology.orchestra.run.vm08.stdout: cephadm noarch 2:18.2.1-0.el9 @ceph-noarch 213 k 2026-03-10T08:05:12.683 INFO:teuthology.orchestra.run.vm08.stdout: 2026-03-10T08:05:12.683 INFO:teuthology.orchestra.run.vm08.stdout:Transaction Summary 2026-03-10T08:05:12.683 INFO:teuthology.orchestra.run.vm08.stdout:================================================================================ 2026-03-10T08:05:12.683 INFO:teuthology.orchestra.run.vm08.stdout:Remove 1 Package 2026-03-10T08:05:12.683 INFO:teuthology.orchestra.run.vm08.stdout: 2026-03-10T08:05:12.683 INFO:teuthology.orchestra.run.vm08.stdout:Freed space: 213 k 2026-03-10T08:05:12.683 INFO:teuthology.orchestra.run.vm08.stdout:Running transaction check 2026-03-10T08:05:12.685 INFO:teuthology.orchestra.run.vm08.stdout:Transaction check succeeded. 2026-03-10T08:05:12.685 INFO:teuthology.orchestra.run.vm08.stdout:Running transaction test 2026-03-10T08:05:12.685 INFO:teuthology.orchestra.run.vm05.stdout: Preparing : 1/1 2026-03-10T08:05:12.685 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : cephadm-2:18.2.1-0.el9.noarch 1/1 2026-03-10T08:05:12.686 INFO:teuthology.orchestra.run.vm08.stdout:Transaction test succeeded. 
2026-03-10T08:05:12.686 INFO:teuthology.orchestra.run.vm08.stdout:Running transaction 2026-03-10T08:05:12.702 INFO:teuthology.orchestra.run.vm08.stdout: Preparing : 1/1 2026-03-10T08:05:12.703 INFO:teuthology.orchestra.run.vm08.stdout: Erasing : cephadm-2:18.2.1-0.el9.noarch 1/1 2026-03-10T08:05:12.789 INFO:teuthology.orchestra.run.vm05.stdout: Running scriptlet: cephadm-2:18.2.1-0.el9.noarch 1/1 2026-03-10T08:05:12.821 INFO:teuthology.orchestra.run.vm08.stdout: Running scriptlet: cephadm-2:18.2.1-0.el9.noarch 1/1 2026-03-10T08:05:12.824 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : cephadm-2:18.2.1-0.el9.noarch 1/1 2026-03-10T08:05:12.824 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-10T08:05:12.824 INFO:teuthology.orchestra.run.vm05.stdout:Removed: 2026-03-10T08:05:12.825 INFO:teuthology.orchestra.run.vm05.stdout: cephadm-2:18.2.1-0.el9.noarch 2026-03-10T08:05:12.825 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-10T08:05:12.825 INFO:teuthology.orchestra.run.vm05.stdout:Complete! 2026-03-10T08:05:12.871 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : cephadm-2:18.2.1-0.el9.noarch 1/1 2026-03-10T08:05:12.871 INFO:teuthology.orchestra.run.vm08.stdout: 2026-03-10T08:05:12.871 INFO:teuthology.orchestra.run.vm08.stdout:Removed: 2026-03-10T08:05:12.871 INFO:teuthology.orchestra.run.vm08.stdout: cephadm-2:18.2.1-0.el9.noarch 2026-03-10T08:05:12.871 INFO:teuthology.orchestra.run.vm08.stdout: 2026-03-10T08:05:12.871 INFO:teuthology.orchestra.run.vm08.stdout:Complete! 2026-03-10T08:05:13.010 INFO:teuthology.orchestra.run.vm05.stdout:No match for argument: ceph-immutable-object-cache 2026-03-10T08:05:13.011 INFO:teuthology.orchestra.run.vm05.stderr:No packages marked for removal. 2026-03-10T08:05:13.013 INFO:teuthology.orchestra.run.vm05.stdout:Dependencies resolved. 2026-03-10T08:05:13.014 INFO:teuthology.orchestra.run.vm05.stdout:Nothing to do. 2026-03-10T08:05:13.014 INFO:teuthology.orchestra.run.vm05.stdout:Complete! 
2026-03-10T08:05:13.059 INFO:teuthology.orchestra.run.vm08.stdout:No match for argument: ceph-immutable-object-cache 2026-03-10T08:05:13.059 INFO:teuthology.orchestra.run.vm08.stderr:No packages marked for removal. 2026-03-10T08:05:13.062 INFO:teuthology.orchestra.run.vm08.stdout:Dependencies resolved. 2026-03-10T08:05:13.062 INFO:teuthology.orchestra.run.vm08.stdout:Nothing to do. 2026-03-10T08:05:13.062 INFO:teuthology.orchestra.run.vm08.stdout:Complete! 2026-03-10T08:05:13.177 INFO:teuthology.orchestra.run.vm05.stdout:No match for argument: ceph-mgr 2026-03-10T08:05:13.177 INFO:teuthology.orchestra.run.vm05.stderr:No packages marked for removal. 2026-03-10T08:05:13.180 INFO:teuthology.orchestra.run.vm05.stdout:Dependencies resolved. 2026-03-10T08:05:13.180 INFO:teuthology.orchestra.run.vm05.stdout:Nothing to do. 2026-03-10T08:05:13.180 INFO:teuthology.orchestra.run.vm05.stdout:Complete! 2026-03-10T08:05:13.228 INFO:teuthology.orchestra.run.vm08.stdout:No match for argument: ceph-mgr 2026-03-10T08:05:13.228 INFO:teuthology.orchestra.run.vm08.stderr:No packages marked for removal. 2026-03-10T08:05:13.231 INFO:teuthology.orchestra.run.vm08.stdout:Dependencies resolved. 2026-03-10T08:05:13.232 INFO:teuthology.orchestra.run.vm08.stdout:Nothing to do. 2026-03-10T08:05:13.232 INFO:teuthology.orchestra.run.vm08.stdout:Complete! 2026-03-10T08:05:13.343 INFO:teuthology.orchestra.run.vm05.stdout:No match for argument: ceph-mgr-dashboard 2026-03-10T08:05:13.343 INFO:teuthology.orchestra.run.vm05.stderr:No packages marked for removal. 2026-03-10T08:05:13.346 INFO:teuthology.orchestra.run.vm05.stdout:Dependencies resolved. 2026-03-10T08:05:13.346 INFO:teuthology.orchestra.run.vm05.stdout:Nothing to do. 2026-03-10T08:05:13.346 INFO:teuthology.orchestra.run.vm05.stdout:Complete! 
2026-03-10T08:05:13.399 INFO:teuthology.orchestra.run.vm08.stdout:No match for argument: ceph-mgr-dashboard 2026-03-10T08:05:13.399 INFO:teuthology.orchestra.run.vm08.stderr:No packages marked for removal. 2026-03-10T08:05:13.401 INFO:teuthology.orchestra.run.vm08.stdout:Dependencies resolved. 2026-03-10T08:05:13.402 INFO:teuthology.orchestra.run.vm08.stdout:Nothing to do. 2026-03-10T08:05:13.402 INFO:teuthology.orchestra.run.vm08.stdout:Complete! 2026-03-10T08:05:13.514 INFO:teuthology.orchestra.run.vm05.stdout:No match for argument: ceph-mgr-diskprediction-local 2026-03-10T08:05:13.514 INFO:teuthology.orchestra.run.vm05.stderr:No packages marked for removal. 2026-03-10T08:05:13.517 INFO:teuthology.orchestra.run.vm05.stdout:Dependencies resolved. 2026-03-10T08:05:13.518 INFO:teuthology.orchestra.run.vm05.stdout:Nothing to do. 2026-03-10T08:05:13.518 INFO:teuthology.orchestra.run.vm05.stdout:Complete! 2026-03-10T08:05:13.647 INFO:teuthology.orchestra.run.vm08.stdout:No match for argument: ceph-mgr-diskprediction-local 2026-03-10T08:05:13.647 INFO:teuthology.orchestra.run.vm08.stderr:No packages marked for removal. 2026-03-10T08:05:13.650 INFO:teuthology.orchestra.run.vm08.stdout:Dependencies resolved. 2026-03-10T08:05:13.650 INFO:teuthology.orchestra.run.vm08.stdout:Nothing to do. 2026-03-10T08:05:13.650 INFO:teuthology.orchestra.run.vm08.stdout:Complete! 2026-03-10T08:05:13.767 INFO:teuthology.orchestra.run.vm05.stdout:No match for argument: ceph-mgr-rook 2026-03-10T08:05:13.767 INFO:teuthology.orchestra.run.vm05.stderr:No packages marked for removal. 2026-03-10T08:05:13.770 INFO:teuthology.orchestra.run.vm05.stdout:Dependencies resolved. 2026-03-10T08:05:13.770 INFO:teuthology.orchestra.run.vm05.stdout:Nothing to do. 2026-03-10T08:05:13.770 INFO:teuthology.orchestra.run.vm05.stdout:Complete! 
2026-03-10T08:05:13.820 INFO:teuthology.orchestra.run.vm08.stdout:No match for argument: ceph-mgr-rook 2026-03-10T08:05:13.820 INFO:teuthology.orchestra.run.vm08.stderr:No packages marked for removal. 2026-03-10T08:05:13.823 INFO:teuthology.orchestra.run.vm08.stdout:Dependencies resolved. 2026-03-10T08:05:13.823 INFO:teuthology.orchestra.run.vm08.stdout:Nothing to do. 2026-03-10T08:05:13.823 INFO:teuthology.orchestra.run.vm08.stdout:Complete! 2026-03-10T08:05:13.949 INFO:teuthology.orchestra.run.vm05.stdout:No match for argument: ceph-mgr-cephadm 2026-03-10T08:05:13.949 INFO:teuthology.orchestra.run.vm05.stderr:No packages marked for removal. 2026-03-10T08:05:13.952 INFO:teuthology.orchestra.run.vm05.stdout:Dependencies resolved. 2026-03-10T08:05:13.953 INFO:teuthology.orchestra.run.vm05.stdout:Nothing to do. 2026-03-10T08:05:13.953 INFO:teuthology.orchestra.run.vm05.stdout:Complete! 2026-03-10T08:05:13.994 INFO:teuthology.orchestra.run.vm08.stdout:No match for argument: ceph-mgr-cephadm 2026-03-10T08:05:13.994 INFO:teuthology.orchestra.run.vm08.stderr:No packages marked for removal. 2026-03-10T08:05:13.997 INFO:teuthology.orchestra.run.vm08.stdout:Dependencies resolved. 2026-03-10T08:05:13.997 INFO:teuthology.orchestra.run.vm08.stdout:Nothing to do. 2026-03-10T08:05:13.997 INFO:teuthology.orchestra.run.vm08.stdout:Complete! 2026-03-10T08:05:14.130 INFO:teuthology.orchestra.run.vm05.stdout:Dependencies resolved. 
2026-03-10T08:05:14.130 INFO:teuthology.orchestra.run.vm05.stdout:================================================================================ 2026-03-10T08:05:14.130 INFO:teuthology.orchestra.run.vm05.stdout: Package Architecture Version Repository Size 2026-03-10T08:05:14.130 INFO:teuthology.orchestra.run.vm05.stdout:================================================================================ 2026-03-10T08:05:14.130 INFO:teuthology.orchestra.run.vm05.stdout:Removing: 2026-03-10T08:05:14.130 INFO:teuthology.orchestra.run.vm05.stdout: ceph-fuse x86_64 2:18.2.1-0.el9 @ceph 2.5 M 2026-03-10T08:05:14.131 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-10T08:05:14.131 INFO:teuthology.orchestra.run.vm05.stdout:Transaction Summary 2026-03-10T08:05:14.131 INFO:teuthology.orchestra.run.vm05.stdout:================================================================================ 2026-03-10T08:05:14.131 INFO:teuthology.orchestra.run.vm05.stdout:Remove 1 Package 2026-03-10T08:05:14.131 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-10T08:05:14.131 INFO:teuthology.orchestra.run.vm05.stdout:Freed space: 2.5 M 2026-03-10T08:05:14.131 INFO:teuthology.orchestra.run.vm05.stdout:Running transaction check 2026-03-10T08:05:14.132 INFO:teuthology.orchestra.run.vm05.stdout:Transaction check succeeded. 2026-03-10T08:05:14.132 INFO:teuthology.orchestra.run.vm05.stdout:Running transaction test 2026-03-10T08:05:14.141 INFO:teuthology.orchestra.run.vm05.stdout:Transaction test succeeded. 2026-03-10T08:05:14.141 INFO:teuthology.orchestra.run.vm05.stdout:Running transaction 2026-03-10T08:05:14.170 INFO:teuthology.orchestra.run.vm08.stdout:Dependencies resolved. 
2026-03-10T08:05:14.170 INFO:teuthology.orchestra.run.vm08.stdout:================================================================================ 2026-03-10T08:05:14.170 INFO:teuthology.orchestra.run.vm08.stdout: Package Architecture Version Repository Size 2026-03-10T08:05:14.170 INFO:teuthology.orchestra.run.vm08.stdout:================================================================================ 2026-03-10T08:05:14.170 INFO:teuthology.orchestra.run.vm08.stdout:Removing: 2026-03-10T08:05:14.170 INFO:teuthology.orchestra.run.vm08.stdout: ceph-fuse x86_64 2:18.2.1-0.el9 @ceph 2.5 M 2026-03-10T08:05:14.170 INFO:teuthology.orchestra.run.vm08.stdout: 2026-03-10T08:05:14.170 INFO:teuthology.orchestra.run.vm08.stdout:Transaction Summary 2026-03-10T08:05:14.170 INFO:teuthology.orchestra.run.vm08.stdout:================================================================================ 2026-03-10T08:05:14.170 INFO:teuthology.orchestra.run.vm08.stdout:Remove 1 Package 2026-03-10T08:05:14.170 INFO:teuthology.orchestra.run.vm08.stdout: 2026-03-10T08:05:14.170 INFO:teuthology.orchestra.run.vm08.stdout:Freed space: 2.5 M 2026-03-10T08:05:14.170 INFO:teuthology.orchestra.run.vm08.stdout:Running transaction check 2026-03-10T08:05:14.172 INFO:teuthology.orchestra.run.vm08.stdout:Transaction check succeeded. 2026-03-10T08:05:14.172 INFO:teuthology.orchestra.run.vm08.stdout:Running transaction test 2026-03-10T08:05:14.175 INFO:teuthology.orchestra.run.vm05.stdout: Preparing : 1/1 2026-03-10T08:05:14.181 INFO:teuthology.orchestra.run.vm08.stdout:Transaction test succeeded. 
2026-03-10T08:05:14.181 INFO:teuthology.orchestra.run.vm08.stdout:Running transaction 2026-03-10T08:05:14.189 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : ceph-fuse-2:18.2.1-0.el9.x86_64 1/1 2026-03-10T08:05:14.205 INFO:teuthology.orchestra.run.vm08.stdout: Preparing : 1/1 2026-03-10T08:05:14.220 INFO:teuthology.orchestra.run.vm08.stdout: Erasing : ceph-fuse-2:18.2.1-0.el9.x86_64 1/1 2026-03-10T08:05:14.246 INFO:teuthology.orchestra.run.vm05.stdout: Running scriptlet: ceph-fuse-2:18.2.1-0.el9.x86_64 1/1 2026-03-10T08:05:14.282 INFO:teuthology.orchestra.run.vm08.stdout: Running scriptlet: ceph-fuse-2:18.2.1-0.el9.x86_64 1/1 2026-03-10T08:05:14.289 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : ceph-fuse-2:18.2.1-0.el9.x86_64 1/1 2026-03-10T08:05:14.289 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-10T08:05:14.289 INFO:teuthology.orchestra.run.vm05.stdout:Removed: 2026-03-10T08:05:14.289 INFO:teuthology.orchestra.run.vm05.stdout: ceph-fuse-2:18.2.1-0.el9.x86_64 2026-03-10T08:05:14.289 INFO:teuthology.orchestra.run.vm05.stdout: 2026-03-10T08:05:14.289 INFO:teuthology.orchestra.run.vm05.stdout:Complete! 2026-03-10T08:05:14.329 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : ceph-fuse-2:18.2.1-0.el9.x86_64 1/1 2026-03-10T08:05:14.329 INFO:teuthology.orchestra.run.vm08.stdout: 2026-03-10T08:05:14.329 INFO:teuthology.orchestra.run.vm08.stdout:Removed: 2026-03-10T08:05:14.329 INFO:teuthology.orchestra.run.vm08.stdout: ceph-fuse-2:18.2.1-0.el9.x86_64 2026-03-10T08:05:14.329 INFO:teuthology.orchestra.run.vm08.stdout: 2026-03-10T08:05:14.329 INFO:teuthology.orchestra.run.vm08.stdout:Complete! 2026-03-10T08:05:14.478 INFO:teuthology.orchestra.run.vm05.stdout:Dependencies resolved. 
2026-03-10T08:05:14.478 INFO:teuthology.orchestra.run.vm05.stdout:================================================================================
2026-03-10T08:05:14.478 INFO:teuthology.orchestra.run.vm05.stdout: Package Architecture Version Repository Size
2026-03-10T08:05:14.478 INFO:teuthology.orchestra.run.vm05.stdout:================================================================================
2026-03-10T08:05:14.478 INFO:teuthology.orchestra.run.vm05.stdout:Removing:
2026-03-10T08:05:14.478 INFO:teuthology.orchestra.run.vm05.stdout: librados-devel x86_64 2:18.2.1-0.el9 @ceph 456 k
2026-03-10T08:05:14.478 INFO:teuthology.orchestra.run.vm05.stdout:Removing dependent packages:
2026-03-10T08:05:14.478 INFO:teuthology.orchestra.run.vm05.stdout: libcephfs-devel x86_64 2:18.2.1-0.el9 @ceph 139 k
2026-03-10T08:05:14.478 INFO:teuthology.orchestra.run.vm05.stdout:
2026-03-10T08:05:14.478 INFO:teuthology.orchestra.run.vm05.stdout:Transaction Summary
2026-03-10T08:05:14.478 INFO:teuthology.orchestra.run.vm05.stdout:================================================================================
2026-03-10T08:05:14.478 INFO:teuthology.orchestra.run.vm05.stdout:Remove 2 Packages
2026-03-10T08:05:14.478 INFO:teuthology.orchestra.run.vm05.stdout:
2026-03-10T08:05:14.478 INFO:teuthology.orchestra.run.vm05.stdout:Freed space: 595 k
2026-03-10T08:05:14.479 INFO:teuthology.orchestra.run.vm05.stdout:Running transaction check
2026-03-10T08:05:14.480 INFO:teuthology.orchestra.run.vm05.stdout:Transaction check succeeded.
2026-03-10T08:05:14.480 INFO:teuthology.orchestra.run.vm05.stdout:Running transaction test
2026-03-10T08:05:14.490 INFO:teuthology.orchestra.run.vm05.stdout:Transaction test succeeded.
2026-03-10T08:05:14.490 INFO:teuthology.orchestra.run.vm05.stdout:Running transaction
2026-03-10T08:05:14.515 INFO:teuthology.orchestra.run.vm05.stdout: Preparing : 1/1
2026-03-10T08:05:14.517 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : libcephfs-devel-2:18.2.1-0.el9.x86_64 1/2
2026-03-10T08:05:14.523 INFO:teuthology.orchestra.run.vm08.stdout:Dependencies resolved.
2026-03-10T08:05:14.523 INFO:teuthology.orchestra.run.vm08.stdout:================================================================================
2026-03-10T08:05:14.523 INFO:teuthology.orchestra.run.vm08.stdout: Package Architecture Version Repository Size
2026-03-10T08:05:14.523 INFO:teuthology.orchestra.run.vm08.stdout:================================================================================
2026-03-10T08:05:14.523 INFO:teuthology.orchestra.run.vm08.stdout:Removing:
2026-03-10T08:05:14.523 INFO:teuthology.orchestra.run.vm08.stdout: librados-devel x86_64 2:18.2.1-0.el9 @ceph 456 k
2026-03-10T08:05:14.523 INFO:teuthology.orchestra.run.vm08.stdout:Removing dependent packages:
2026-03-10T08:05:14.523 INFO:teuthology.orchestra.run.vm08.stdout: libcephfs-devel x86_64 2:18.2.1-0.el9 @ceph 139 k
2026-03-10T08:05:14.523 INFO:teuthology.orchestra.run.vm08.stdout:
2026-03-10T08:05:14.523 INFO:teuthology.orchestra.run.vm08.stdout:Transaction Summary
2026-03-10T08:05:14.523 INFO:teuthology.orchestra.run.vm08.stdout:================================================================================
2026-03-10T08:05:14.523 INFO:teuthology.orchestra.run.vm08.stdout:Remove 2 Packages
2026-03-10T08:05:14.523 INFO:teuthology.orchestra.run.vm08.stdout:
2026-03-10T08:05:14.523 INFO:teuthology.orchestra.run.vm08.stdout:Freed space: 595 k
2026-03-10T08:05:14.523 INFO:teuthology.orchestra.run.vm08.stdout:Running transaction check
2026-03-10T08:05:14.525 INFO:teuthology.orchestra.run.vm08.stdout:Transaction check succeeded.
2026-03-10T08:05:14.525 INFO:teuthology.orchestra.run.vm08.stdout:Running transaction test
2026-03-10T08:05:14.532 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : librados-devel-2:18.2.1-0.el9.x86_64 2/2
2026-03-10T08:05:14.535 INFO:teuthology.orchestra.run.vm08.stdout:Transaction test succeeded.
2026-03-10T08:05:14.535 INFO:teuthology.orchestra.run.vm08.stdout:Running transaction
2026-03-10T08:05:14.563 INFO:teuthology.orchestra.run.vm08.stdout: Preparing : 1/1
2026-03-10T08:05:14.565 INFO:teuthology.orchestra.run.vm08.stdout: Erasing : libcephfs-devel-2:18.2.1-0.el9.x86_64 1/2
2026-03-10T08:05:14.579 INFO:teuthology.orchestra.run.vm08.stdout: Erasing : librados-devel-2:18.2.1-0.el9.x86_64 2/2
2026-03-10T08:05:14.592 INFO:teuthology.orchestra.run.vm05.stdout: Running scriptlet: librados-devel-2:18.2.1-0.el9.x86_64 2/2
2026-03-10T08:05:14.592 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : libcephfs-devel-2:18.2.1-0.el9.x86_64 1/2
2026-03-10T08:05:14.633 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : librados-devel-2:18.2.1-0.el9.x86_64 2/2
2026-03-10T08:05:14.634 INFO:teuthology.orchestra.run.vm05.stdout:
2026-03-10T08:05:14.634 INFO:teuthology.orchestra.run.vm05.stdout:Removed:
2026-03-10T08:05:14.634 INFO:teuthology.orchestra.run.vm05.stdout: libcephfs-devel-2:18.2.1-0.el9.x86_64 librados-devel-2:18.2.1-0.el9.x86_64
2026-03-10T08:05:14.634 INFO:teuthology.orchestra.run.vm05.stdout:
2026-03-10T08:05:14.634 INFO:teuthology.orchestra.run.vm05.stdout:Complete!
2026-03-10T08:05:14.652 INFO:teuthology.orchestra.run.vm08.stdout: Running scriptlet: librados-devel-2:18.2.1-0.el9.x86_64 2/2
2026-03-10T08:05:14.652 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : libcephfs-devel-2:18.2.1-0.el9.x86_64 1/2
2026-03-10T08:05:14.702 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : librados-devel-2:18.2.1-0.el9.x86_64 2/2
2026-03-10T08:05:14.702 INFO:teuthology.orchestra.run.vm08.stdout:
2026-03-10T08:05:14.702 INFO:teuthology.orchestra.run.vm08.stdout:Removed:
2026-03-10T08:05:14.702 INFO:teuthology.orchestra.run.vm08.stdout: libcephfs-devel-2:18.2.1-0.el9.x86_64 librados-devel-2:18.2.1-0.el9.x86_64
2026-03-10T08:05:14.702 INFO:teuthology.orchestra.run.vm08.stdout:
2026-03-10T08:05:14.702 INFO:teuthology.orchestra.run.vm08.stdout:Complete!
2026-03-10T08:05:14.835 INFO:teuthology.orchestra.run.vm05.stdout:Dependencies resolved.
2026-03-10T08:05:14.835 INFO:teuthology.orchestra.run.vm05.stdout:================================================================================
2026-03-10T08:05:14.835 INFO:teuthology.orchestra.run.vm05.stdout: Package Arch Version Repository Size
2026-03-10T08:05:14.835 INFO:teuthology.orchestra.run.vm05.stdout:================================================================================
2026-03-10T08:05:14.835 INFO:teuthology.orchestra.run.vm05.stdout:Removing:
2026-03-10T08:05:14.835 INFO:teuthology.orchestra.run.vm05.stdout: libcephfs2 x86_64 2:18.2.1-0.el9 @ceph 1.9 M
2026-03-10T08:05:14.835 INFO:teuthology.orchestra.run.vm05.stdout:Removing dependent packages:
2026-03-10T08:05:14.835 INFO:teuthology.orchestra.run.vm05.stdout: python3-cephfs x86_64 2:18.2.1-0.el9 @ceph 505 k
2026-03-10T08:05:14.835 INFO:teuthology.orchestra.run.vm05.stdout:Removing unused dependencies:
2026-03-10T08:05:14.835 INFO:teuthology.orchestra.run.vm05.stdout: python3-ceph-argparse x86_64 2:18.2.1-0.el9 @ceph 186 k
2026-03-10T08:05:14.835 INFO:teuthology.orchestra.run.vm05.stdout:
2026-03-10T08:05:14.835 INFO:teuthology.orchestra.run.vm05.stdout:Transaction Summary
2026-03-10T08:05:14.835 INFO:teuthology.orchestra.run.vm05.stdout:================================================================================
2026-03-10T08:05:14.835 INFO:teuthology.orchestra.run.vm05.stdout:Remove 3 Packages
2026-03-10T08:05:14.836 INFO:teuthology.orchestra.run.vm05.stdout:
2026-03-10T08:05:14.836 INFO:teuthology.orchestra.run.vm05.stdout:Freed space: 2.5 M
2026-03-10T08:05:14.836 INFO:teuthology.orchestra.run.vm05.stdout:Running transaction check
2026-03-10T08:05:14.837 INFO:teuthology.orchestra.run.vm05.stdout:Transaction check succeeded.
2026-03-10T08:05:14.837 INFO:teuthology.orchestra.run.vm05.stdout:Running transaction test
2026-03-10T08:05:14.848 INFO:teuthology.orchestra.run.vm05.stdout:Transaction test succeeded.
2026-03-10T08:05:14.848 INFO:teuthology.orchestra.run.vm05.stdout:Running transaction
2026-03-10T08:05:14.873 INFO:teuthology.orchestra.run.vm05.stdout: Preparing : 1/1
2026-03-10T08:05:14.875 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : python3-cephfs-2:18.2.1-0.el9.x86_64 1/3
2026-03-10T08:05:14.876 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : python3-ceph-argparse-2:18.2.1-0.el9.x86_64 2/3
2026-03-10T08:05:14.876 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : libcephfs2-2:18.2.1-0.el9.x86_64 3/3
2026-03-10T08:05:14.891 INFO:teuthology.orchestra.run.vm08.stdout:Dependencies resolved.
2026-03-10T08:05:14.892 INFO:teuthology.orchestra.run.vm08.stdout:================================================================================
2026-03-10T08:05:14.892 INFO:teuthology.orchestra.run.vm08.stdout: Package Arch Version Repository Size
2026-03-10T08:05:14.892 INFO:teuthology.orchestra.run.vm08.stdout:================================================================================
2026-03-10T08:05:14.892 INFO:teuthology.orchestra.run.vm08.stdout:Removing:
2026-03-10T08:05:14.892 INFO:teuthology.orchestra.run.vm08.stdout: libcephfs2 x86_64 2:18.2.1-0.el9 @ceph 1.9 M
2026-03-10T08:05:14.892 INFO:teuthology.orchestra.run.vm08.stdout:Removing dependent packages:
2026-03-10T08:05:14.892 INFO:teuthology.orchestra.run.vm08.stdout: python3-cephfs x86_64 2:18.2.1-0.el9 @ceph 505 k
2026-03-10T08:05:14.892 INFO:teuthology.orchestra.run.vm08.stdout:Removing unused dependencies:
2026-03-10T08:05:14.892 INFO:teuthology.orchestra.run.vm08.stdout: python3-ceph-argparse x86_64 2:18.2.1-0.el9 @ceph 186 k
2026-03-10T08:05:14.892 INFO:teuthology.orchestra.run.vm08.stdout:
2026-03-10T08:05:14.892 INFO:teuthology.orchestra.run.vm08.stdout:Transaction Summary
2026-03-10T08:05:14.892 INFO:teuthology.orchestra.run.vm08.stdout:================================================================================
2026-03-10T08:05:14.892 INFO:teuthology.orchestra.run.vm08.stdout:Remove 3 Packages
2026-03-10T08:05:14.892 INFO:teuthology.orchestra.run.vm08.stdout:
2026-03-10T08:05:14.892 INFO:teuthology.orchestra.run.vm08.stdout:Freed space: 2.5 M
2026-03-10T08:05:14.893 INFO:teuthology.orchestra.run.vm08.stdout:Running transaction check
2026-03-10T08:05:14.894 INFO:teuthology.orchestra.run.vm08.stdout:Transaction check succeeded.
2026-03-10T08:05:14.894 INFO:teuthology.orchestra.run.vm08.stdout:Running transaction test
2026-03-10T08:05:14.906 INFO:teuthology.orchestra.run.vm08.stdout:Transaction test succeeded.
2026-03-10T08:05:14.906 INFO:teuthology.orchestra.run.vm08.stdout:Running transaction
2026-03-10T08:05:14.933 INFO:teuthology.orchestra.run.vm08.stdout: Preparing : 1/1
2026-03-10T08:05:14.935 INFO:teuthology.orchestra.run.vm05.stdout: Running scriptlet: libcephfs2-2:18.2.1-0.el9.x86_64 3/3
2026-03-10T08:05:14.935 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : libcephfs2-2:18.2.1-0.el9.x86_64 1/3
2026-03-10T08:05:14.935 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-ceph-argparse-2:18.2.1-0.el9.x86_64 2/3
2026-03-10T08:05:14.936 INFO:teuthology.orchestra.run.vm08.stdout: Erasing : python3-cephfs-2:18.2.1-0.el9.x86_64 1/3
2026-03-10T08:05:14.938 INFO:teuthology.orchestra.run.vm08.stdout: Erasing : python3-ceph-argparse-2:18.2.1-0.el9.x86_64 2/3
2026-03-10T08:05:14.938 INFO:teuthology.orchestra.run.vm08.stdout: Erasing : libcephfs2-2:18.2.1-0.el9.x86_64 3/3
2026-03-10T08:05:14.971 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-cephfs-2:18.2.1-0.el9.x86_64 3/3
2026-03-10T08:05:14.971 INFO:teuthology.orchestra.run.vm05.stdout:
2026-03-10T08:05:14.971 INFO:teuthology.orchestra.run.vm05.stdout:Removed:
2026-03-10T08:05:14.971 INFO:teuthology.orchestra.run.vm05.stdout: libcephfs2-2:18.2.1-0.el9.x86_64
2026-03-10T08:05:14.971 INFO:teuthology.orchestra.run.vm05.stdout: python3-ceph-argparse-2:18.2.1-0.el9.x86_64
2026-03-10T08:05:14.971 INFO:teuthology.orchestra.run.vm05.stdout: python3-cephfs-2:18.2.1-0.el9.x86_64
2026-03-10T08:05:14.971 INFO:teuthology.orchestra.run.vm05.stdout:
2026-03-10T08:05:14.971 INFO:teuthology.orchestra.run.vm05.stdout:Complete!
2026-03-10T08:05:15.003 INFO:teuthology.orchestra.run.vm08.stdout: Running scriptlet: libcephfs2-2:18.2.1-0.el9.x86_64 3/3
2026-03-10T08:05:15.003 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : libcephfs2-2:18.2.1-0.el9.x86_64 1/3
2026-03-10T08:05:15.003 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-ceph-argparse-2:18.2.1-0.el9.x86_64 2/3
2026-03-10T08:05:15.042 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-cephfs-2:18.2.1-0.el9.x86_64 3/3
2026-03-10T08:05:15.042 INFO:teuthology.orchestra.run.vm08.stdout:
2026-03-10T08:05:15.042 INFO:teuthology.orchestra.run.vm08.stdout:Removed:
2026-03-10T08:05:15.042 INFO:teuthology.orchestra.run.vm08.stdout: libcephfs2-2:18.2.1-0.el9.x86_64
2026-03-10T08:05:15.042 INFO:teuthology.orchestra.run.vm08.stdout: python3-ceph-argparse-2:18.2.1-0.el9.x86_64
2026-03-10T08:05:15.042 INFO:teuthology.orchestra.run.vm08.stdout: python3-cephfs-2:18.2.1-0.el9.x86_64
2026-03-10T08:05:15.042 INFO:teuthology.orchestra.run.vm08.stdout:
2026-03-10T08:05:15.042 INFO:teuthology.orchestra.run.vm08.stdout:Complete!
2026-03-10T08:05:15.139 INFO:teuthology.orchestra.run.vm05.stdout:No match for argument: libcephfs-devel
2026-03-10T08:05:15.139 INFO:teuthology.orchestra.run.vm05.stderr:No packages marked for removal.
2026-03-10T08:05:15.142 INFO:teuthology.orchestra.run.vm05.stdout:Dependencies resolved.
2026-03-10T08:05:15.143 INFO:teuthology.orchestra.run.vm05.stdout:Nothing to do.
2026-03-10T08:05:15.143 INFO:teuthology.orchestra.run.vm05.stdout:Complete!
2026-03-10T08:05:15.226 INFO:teuthology.orchestra.run.vm08.stdout:No match for argument: libcephfs-devel
2026-03-10T08:05:15.226 INFO:teuthology.orchestra.run.vm08.stderr:No packages marked for removal.
2026-03-10T08:05:15.229 INFO:teuthology.orchestra.run.vm08.stdout:Dependencies resolved.
2026-03-10T08:05:15.230 INFO:teuthology.orchestra.run.vm08.stdout:Nothing to do.
2026-03-10T08:05:15.230 INFO:teuthology.orchestra.run.vm08.stdout:Complete!
2026-03-10T08:05:15.318 INFO:teuthology.orchestra.run.vm05.stdout:Dependencies resolved.
2026-03-10T08:05:15.319 INFO:teuthology.orchestra.run.vm05.stdout:================================================================================
2026-03-10T08:05:15.319 INFO:teuthology.orchestra.run.vm05.stdout: Package Arch Version Repository Size
2026-03-10T08:05:15.319 INFO:teuthology.orchestra.run.vm05.stdout:================================================================================
2026-03-10T08:05:15.319 INFO:teuthology.orchestra.run.vm05.stdout:Removing:
2026-03-10T08:05:15.319 INFO:teuthology.orchestra.run.vm05.stdout: librados2 x86_64 2:18.2.1-0.el9 @ceph 12 M
2026-03-10T08:05:15.319 INFO:teuthology.orchestra.run.vm05.stdout:Removing dependent packages:
2026-03-10T08:05:15.319 INFO:teuthology.orchestra.run.vm05.stdout: python3-rados x86_64 2:18.2.1-0.el9 @ceph 1.1 M
2026-03-10T08:05:15.319 INFO:teuthology.orchestra.run.vm05.stdout: python3-rbd x86_64 2:18.2.1-0.el9 @ceph 1.1 M
2026-03-10T08:05:15.319 INFO:teuthology.orchestra.run.vm05.stdout: python3-rgw x86_64 2:18.2.1-0.el9 @ceph 269 k
2026-03-10T08:05:15.319 INFO:teuthology.orchestra.run.vm05.stdout: qemu-kvm-block-rbd x86_64 17:10.1.0-15.el9 @appstream 37 k
2026-03-10T08:05:15.319 INFO:teuthology.orchestra.run.vm05.stdout: rbd-fuse x86_64 2:18.2.1-0.el9 @ceph 226 k
2026-03-10T08:05:15.319 INFO:teuthology.orchestra.run.vm05.stdout: rbd-nbd x86_64 2:18.2.1-0.el9 @ceph 494 k
2026-03-10T08:05:15.319 INFO:teuthology.orchestra.run.vm05.stdout:Removing unused dependencies:
2026-03-10T08:05:15.319 INFO:teuthology.orchestra.run.vm05.stdout: boost-program-options x86_64 1.75.0-13.el9 @appstream 276 k
2026-03-10T08:05:15.319 INFO:teuthology.orchestra.run.vm05.stdout: gperftools-libs x86_64 2.9.1-3.el9 @epel 1.4 M
2026-03-10T08:05:15.319 INFO:teuthology.orchestra.run.vm05.stdout: libarrow x86_64 9.0.0-15.el9 @epel 18 M
2026-03-10T08:05:15.320 INFO:teuthology.orchestra.run.vm05.stdout: libarrow-doc noarch 9.0.0-15.el9 @epel 122 k
2026-03-10T08:05:15.320 INFO:teuthology.orchestra.run.vm05.stdout: libpmemobj x86_64 1.12.1-1.el9 @appstream 383 k
2026-03-10T08:05:15.320 INFO:teuthology.orchestra.run.vm05.stdout: librabbitmq x86_64 0.11.0-7.el9 @appstream 102 k
2026-03-10T08:05:15.320 INFO:teuthology.orchestra.run.vm05.stdout: librbd1 x86_64 2:18.2.1-0.el9 @ceph 12 M
2026-03-10T08:05:15.320 INFO:teuthology.orchestra.run.vm05.stdout: librdkafka x86_64 1.6.1-102.el9 @appstream 2.0 M
2026-03-10T08:05:15.320 INFO:teuthology.orchestra.run.vm05.stdout: librgw2 x86_64 2:18.2.1-0.el9 @ceph 15 M
2026-03-10T08:05:15.320 INFO:teuthology.orchestra.run.vm05.stdout: libunwind x86_64 1.6.2-1.el9 @epel 170 k
2026-03-10T08:05:15.320 INFO:teuthology.orchestra.run.vm05.stdout: lttng-ust x86_64 2.12.0-6.el9 @appstream 1.0 M
2026-03-10T08:05:15.320 INFO:teuthology.orchestra.run.vm05.stdout: parquet-libs x86_64 9.0.0-15.el9 @epel 2.8 M
2026-03-10T08:05:15.320 INFO:teuthology.orchestra.run.vm05.stdout: re2 x86_64 1:20211101-20.el9 @epel 472 k
2026-03-10T08:05:15.320 INFO:teuthology.orchestra.run.vm05.stdout: thrift x86_64 0.15.0-4.el9 @epel 4.8 M
2026-03-10T08:05:15.320 INFO:teuthology.orchestra.run.vm05.stdout:
2026-03-10T08:05:15.320 INFO:teuthology.orchestra.run.vm05.stdout:Transaction Summary
2026-03-10T08:05:15.320 INFO:teuthology.orchestra.run.vm05.stdout:================================================================================
2026-03-10T08:05:15.320 INFO:teuthology.orchestra.run.vm05.stdout:Remove 21 Packages
2026-03-10T08:05:15.320 INFO:teuthology.orchestra.run.vm05.stdout:
2026-03-10T08:05:15.320 INFO:teuthology.orchestra.run.vm05.stdout:Freed space: 74 M
2026-03-10T08:05:15.320 INFO:teuthology.orchestra.run.vm05.stdout:Running transaction check
2026-03-10T08:05:15.323 INFO:teuthology.orchestra.run.vm05.stdout:Transaction check succeeded.
2026-03-10T08:05:15.323 INFO:teuthology.orchestra.run.vm05.stdout:Running transaction test
2026-03-10T08:05:15.344 INFO:teuthology.orchestra.run.vm05.stdout:Transaction test succeeded.
2026-03-10T08:05:15.344 INFO:teuthology.orchestra.run.vm05.stdout:Running transaction
2026-03-10T08:05:15.384 INFO:teuthology.orchestra.run.vm05.stdout: Preparing : 1/1
2026-03-10T08:05:15.387 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : rbd-nbd-2:18.2.1-0.el9.x86_64 1/21
2026-03-10T08:05:15.389 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : rbd-fuse-2:18.2.1-0.el9.x86_64 2/21
2026-03-10T08:05:15.392 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : python3-rgw-2:18.2.1-0.el9.x86_64 3/21
2026-03-10T08:05:15.392 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : librgw2-2:18.2.1-0.el9.x86_64 4/21
2026-03-10T08:05:15.406 INFO:teuthology.orchestra.run.vm05.stdout: Running scriptlet: librgw2-2:18.2.1-0.el9.x86_64 4/21
2026-03-10T08:05:15.414 INFO:teuthology.orchestra.run.vm08.stdout:Dependencies resolved.
2026-03-10T08:05:15.415 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : parquet-libs-9.0.0-15.el9.x86_64 5/21
2026-03-10T08:05:15.416 INFO:teuthology.orchestra.run.vm08.stdout:================================================================================
2026-03-10T08:05:15.416 INFO:teuthology.orchestra.run.vm08.stdout: Package Arch Version Repository Size
2026-03-10T08:05:15.416 INFO:teuthology.orchestra.run.vm08.stdout:================================================================================
2026-03-10T08:05:15.416 INFO:teuthology.orchestra.run.vm08.stdout:Removing:
2026-03-10T08:05:15.416 INFO:teuthology.orchestra.run.vm08.stdout: librados2 x86_64 2:18.2.1-0.el9 @ceph 12 M
2026-03-10T08:05:15.416 INFO:teuthology.orchestra.run.vm08.stdout:Removing dependent packages:
2026-03-10T08:05:15.416 INFO:teuthology.orchestra.run.vm08.stdout: python3-rados x86_64 2:18.2.1-0.el9 @ceph 1.1 M
2026-03-10T08:05:15.416 INFO:teuthology.orchestra.run.vm08.stdout: python3-rbd x86_64 2:18.2.1-0.el9 @ceph 1.1 M
2026-03-10T08:05:15.416 INFO:teuthology.orchestra.run.vm08.stdout: python3-rgw x86_64 2:18.2.1-0.el9 @ceph 269 k
2026-03-10T08:05:15.416 INFO:teuthology.orchestra.run.vm08.stdout: qemu-kvm-block-rbd x86_64 17:10.1.0-15.el9 @appstream 37 k
2026-03-10T08:05:15.416 INFO:teuthology.orchestra.run.vm08.stdout: rbd-fuse x86_64 2:18.2.1-0.el9 @ceph 226 k
2026-03-10T08:05:15.416 INFO:teuthology.orchestra.run.vm08.stdout: rbd-nbd x86_64 2:18.2.1-0.el9 @ceph 494 k
2026-03-10T08:05:15.416 INFO:teuthology.orchestra.run.vm08.stdout:Removing unused dependencies:
2026-03-10T08:05:15.416 INFO:teuthology.orchestra.run.vm08.stdout: boost-program-options x86_64 1.75.0-13.el9 @appstream 276 k
2026-03-10T08:05:15.416 INFO:teuthology.orchestra.run.vm08.stdout: gperftools-libs x86_64 2.9.1-3.el9 @epel 1.4 M
2026-03-10T08:05:15.416 INFO:teuthology.orchestra.run.vm08.stdout: libarrow x86_64 9.0.0-15.el9 @epel 18 M
2026-03-10T08:05:15.416 INFO:teuthology.orchestra.run.vm08.stdout: libarrow-doc noarch 9.0.0-15.el9 @epel 122 k
2026-03-10T08:05:15.416 INFO:teuthology.orchestra.run.vm08.stdout: libpmemobj x86_64 1.12.1-1.el9 @appstream 383 k
2026-03-10T08:05:15.416 INFO:teuthology.orchestra.run.vm08.stdout: librabbitmq x86_64 0.11.0-7.el9 @appstream 102 k
2026-03-10T08:05:15.416 INFO:teuthology.orchestra.run.vm08.stdout: librbd1 x86_64 2:18.2.1-0.el9 @ceph 12 M
2026-03-10T08:05:15.416 INFO:teuthology.orchestra.run.vm08.stdout: librdkafka x86_64 1.6.1-102.el9 @appstream 2.0 M
2026-03-10T08:05:15.416 INFO:teuthology.orchestra.run.vm08.stdout: librgw2 x86_64 2:18.2.1-0.el9 @ceph 15 M
2026-03-10T08:05:15.416 INFO:teuthology.orchestra.run.vm08.stdout: libunwind x86_64 1.6.2-1.el9 @epel 170 k
2026-03-10T08:05:15.416 INFO:teuthology.orchestra.run.vm08.stdout: lttng-ust x86_64 2.12.0-6.el9 @appstream 1.0 M
2026-03-10T08:05:15.416 INFO:teuthology.orchestra.run.vm08.stdout: parquet-libs x86_64 9.0.0-15.el9 @epel 2.8 M
2026-03-10T08:05:15.417 INFO:teuthology.orchestra.run.vm08.stdout: re2 x86_64 1:20211101-20.el9 @epel 472 k
2026-03-10T08:05:15.417 INFO:teuthology.orchestra.run.vm08.stdout: thrift x86_64 0.15.0-4.el9 @epel 4.8 M
2026-03-10T08:05:15.417 INFO:teuthology.orchestra.run.vm08.stdout:
2026-03-10T08:05:15.417 INFO:teuthology.orchestra.run.vm08.stdout:Transaction Summary
2026-03-10T08:05:15.417 INFO:teuthology.orchestra.run.vm08.stdout:================================================================================
2026-03-10T08:05:15.417 INFO:teuthology.orchestra.run.vm08.stdout:Remove 21 Packages
2026-03-10T08:05:15.417 INFO:teuthology.orchestra.run.vm08.stdout:
2026-03-10T08:05:15.417 INFO:teuthology.orchestra.run.vm08.stdout:Freed space: 74 M
2026-03-10T08:05:15.417 INFO:teuthology.orchestra.run.vm08.stdout:Running transaction check
2026-03-10T08:05:15.417 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : python3-rbd-2:18.2.1-0.el9.x86_64 6/21
2026-03-10T08:05:15.419 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : python3-rados-2:18.2.1-0.el9.x86_64 7/21
2026-03-10T08:05:15.421 INFO:teuthology.orchestra.run.vm08.stdout:Transaction check succeeded.
2026-03-10T08:05:15.421 INFO:teuthology.orchestra.run.vm08.stdout:Running transaction test
2026-03-10T08:05:15.422 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : qemu-kvm-block-rbd-17:10.1.0-15.el9.x86_64 8/21
2026-03-10T08:05:15.422 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : librbd1-2:18.2.1-0.el9.x86_64 9/21
2026-03-10T08:05:15.436 INFO:teuthology.orchestra.run.vm05.stdout: Running scriptlet: librbd1-2:18.2.1-0.el9.x86_64 9/21
2026-03-10T08:05:15.436 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : librados2-2:18.2.1-0.el9.x86_64 10/21
2026-03-10T08:05:15.436 INFO:teuthology.orchestra.run.vm05.stdout:warning: file /etc/ceph: remove failed: No such file or directory
2026-03-10T08:05:15.436 INFO:teuthology.orchestra.run.vm05.stdout:
2026-03-10T08:05:15.443 INFO:teuthology.orchestra.run.vm08.stdout:Transaction test succeeded.
2026-03-10T08:05:15.444 INFO:teuthology.orchestra.run.vm08.stdout:Running transaction
2026-03-10T08:05:15.451 INFO:teuthology.orchestra.run.vm05.stdout: Running scriptlet: librados2-2:18.2.1-0.el9.x86_64 10/21
2026-03-10T08:05:15.453 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : libarrow-9.0.0-15.el9.x86_64 11/21
2026-03-10T08:05:15.455 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : gperftools-libs-2.9.1-3.el9.x86_64 12/21
2026-03-10T08:05:15.457 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : libarrow-doc-9.0.0-15.el9.noarch 13/21
2026-03-10T08:05:15.460 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : libunwind-1.6.2-1.el9.x86_64 14/21
2026-03-10T08:05:15.463 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : re2-1:20211101-20.el9.x86_64 15/21
2026-03-10T08:05:15.466 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : lttng-ust-2.12.0-6.el9.x86_64 16/21
2026-03-10T08:05:15.469 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : thrift-0.15.0-4.el9.x86_64 17/21
2026-03-10T08:05:15.471 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : libpmemobj-1.12.1-1.el9.x86_64 18/21
2026-03-10T08:05:15.472 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : boost-program-options-1.75.0-13.el9.x86_64 19/21
2026-03-10T08:05:15.474 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : librabbitmq-0.11.0-7.el9.x86_64 20/21
2026-03-10T08:05:15.488 INFO:teuthology.orchestra.run.vm05.stdout: Erasing : librdkafka-1.6.1-102.el9.x86_64 21/21
2026-03-10T08:05:15.489 INFO:teuthology.orchestra.run.vm08.stdout: Preparing : 1/1
2026-03-10T08:05:15.492 INFO:teuthology.orchestra.run.vm08.stdout: Erasing : rbd-nbd-2:18.2.1-0.el9.x86_64 1/21
2026-03-10T08:05:15.494 INFO:teuthology.orchestra.run.vm08.stdout: Erasing : rbd-fuse-2:18.2.1-0.el9.x86_64 2/21
2026-03-10T08:05:15.497 INFO:teuthology.orchestra.run.vm08.stdout: Erasing : python3-rgw-2:18.2.1-0.el9.x86_64 3/21
2026-03-10T08:05:15.497 INFO:teuthology.orchestra.run.vm08.stdout: Erasing : librgw2-2:18.2.1-0.el9.x86_64 4/21
2026-03-10T08:05:15.510 INFO:teuthology.orchestra.run.vm08.stdout: Running scriptlet: librgw2-2:18.2.1-0.el9.x86_64 4/21
2026-03-10T08:05:15.512 INFO:teuthology.orchestra.run.vm08.stdout: Erasing : parquet-libs-9.0.0-15.el9.x86_64 5/21
2026-03-10T08:05:15.514 INFO:teuthology.orchestra.run.vm08.stdout: Erasing : python3-rbd-2:18.2.1-0.el9.x86_64 6/21
2026-03-10T08:05:15.516 INFO:teuthology.orchestra.run.vm08.stdout: Erasing : python3-rados-2:18.2.1-0.el9.x86_64 7/21
2026-03-10T08:05:15.518 INFO:teuthology.orchestra.run.vm08.stdout: Erasing : qemu-kvm-block-rbd-17:10.1.0-15.el9.x86_64 8/21
2026-03-10T08:05:15.518 INFO:teuthology.orchestra.run.vm08.stdout: Erasing : librbd1-2:18.2.1-0.el9.x86_64 9/21
2026-03-10T08:05:15.533 INFO:teuthology.orchestra.run.vm08.stdout: Running scriptlet: librbd1-2:18.2.1-0.el9.x86_64 9/21
2026-03-10T08:05:15.533 INFO:teuthology.orchestra.run.vm08.stdout: Erasing : librados2-2:18.2.1-0.el9.x86_64 10/21
2026-03-10T08:05:15.533 INFO:teuthology.orchestra.run.vm08.stdout:warning: file /etc/ceph: remove failed: No such file or directory
2026-03-10T08:05:15.533 INFO:teuthology.orchestra.run.vm08.stdout:
2026-03-10T08:05:15.546 INFO:teuthology.orchestra.run.vm08.stdout: Running scriptlet: librados2-2:18.2.1-0.el9.x86_64 10/21
2026-03-10T08:05:15.549 INFO:teuthology.orchestra.run.vm08.stdout: Erasing : libarrow-9.0.0-15.el9.x86_64 11/21
2026-03-10T08:05:15.552 INFO:teuthology.orchestra.run.vm08.stdout: Erasing : gperftools-libs-2.9.1-3.el9.x86_64 12/21
2026-03-10T08:05:15.554 INFO:teuthology.orchestra.run.vm08.stdout: Erasing : libarrow-doc-9.0.0-15.el9.noarch 13/21
2026-03-10T08:05:15.555 INFO:teuthology.orchestra.run.vm05.stdout: Running scriptlet: librdkafka-1.6.1-102.el9.x86_64 21/21
2026-03-10T08:05:15.555 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : boost-program-options-1.75.0-13.el9.x86_64 1/21
2026-03-10T08:05:15.555 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : gperftools-libs-2.9.1-3.el9.x86_64 2/21
2026-03-10T08:05:15.555 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : libarrow-9.0.0-15.el9.x86_64 3/21
2026-03-10T08:05:15.555 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : libarrow-doc-9.0.0-15.el9.noarch 4/21
2026-03-10T08:05:15.556 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : libpmemobj-1.12.1-1.el9.x86_64 5/21
2026-03-10T08:05:15.556 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : librabbitmq-0.11.0-7.el9.x86_64 6/21
2026-03-10T08:05:15.556 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : librados2-2:18.2.1-0.el9.x86_64 7/21
2026-03-10T08:05:15.556 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : librbd1-2:18.2.1-0.el9.x86_64 8/21
2026-03-10T08:05:15.556 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : librdkafka-1.6.1-102.el9.x86_64 9/21
2026-03-10T08:05:15.556 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : librgw2-2:18.2.1-0.el9.x86_64 10/21
2026-03-10T08:05:15.556 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : libunwind-1.6.2-1.el9.x86_64 11/21
2026-03-10T08:05:15.556 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : lttng-ust-2.12.0-6.el9.x86_64 12/21
2026-03-10T08:05:15.556 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : parquet-libs-9.0.0-15.el9.x86_64 13/21
2026-03-10T08:05:15.556 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-rados-2:18.2.1-0.el9.x86_64 14/21
2026-03-10T08:05:15.556 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-rbd-2:18.2.1-0.el9.x86_64 15/21
2026-03-10T08:05:15.556 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : python3-rgw-2:18.2.1-0.el9.x86_64 16/21
2026-03-10T08:05:15.556 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : qemu-kvm-block-rbd-17:10.1.0-15.el9.x86_64 17/21
2026-03-10T08:05:15.556 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : rbd-fuse-2:18.2.1-0.el9.x86_64 18/21
2026-03-10T08:05:15.556 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : rbd-nbd-2:18.2.1-0.el9.x86_64 19/21
2026-03-10T08:05:15.556 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : re2-1:20211101-20.el9.x86_64 20/21
2026-03-10T08:05:15.557 INFO:teuthology.orchestra.run.vm08.stdout: Erasing : libunwind-1.6.2-1.el9.x86_64 14/21
2026-03-10T08:05:15.562 INFO:teuthology.orchestra.run.vm08.stdout: Erasing : re2-1:20211101-20.el9.x86_64 15/21
2026-03-10T08:05:15.565 INFO:teuthology.orchestra.run.vm08.stdout: Erasing : lttng-ust-2.12.0-6.el9.x86_64 16/21
2026-03-10T08:05:15.568 INFO:teuthology.orchestra.run.vm08.stdout: Erasing : thrift-0.15.0-4.el9.x86_64 17/21
2026-03-10T08:05:15.571 INFO:teuthology.orchestra.run.vm08.stdout: Erasing : libpmemobj-1.12.1-1.el9.x86_64 18/21
2026-03-10T08:05:15.582 INFO:teuthology.orchestra.run.vm08.stdout: Erasing : boost-program-options-1.75.0-13.el9.x86_64 19/21
2026-03-10T08:05:15.584 INFO:teuthology.orchestra.run.vm08.stdout: Erasing : librabbitmq-0.11.0-7.el9.x86_64 20/21
2026-03-10T08:05:15.598 INFO:teuthology.orchestra.run.vm08.stdout: Erasing : librdkafka-1.6.1-102.el9.x86_64 21/21
2026-03-10T08:05:15.611 INFO:teuthology.orchestra.run.vm05.stdout: Verifying : thrift-0.15.0-4.el9.x86_64 21/21
2026-03-10T08:05:15.611 INFO:teuthology.orchestra.run.vm05.stdout:
2026-03-10T08:05:15.611 INFO:teuthology.orchestra.run.vm05.stdout:Removed:
2026-03-10T08:05:15.611 INFO:teuthology.orchestra.run.vm05.stdout: boost-program-options-1.75.0-13.el9.x86_64
2026-03-10T08:05:15.611 INFO:teuthology.orchestra.run.vm05.stdout: gperftools-libs-2.9.1-3.el9.x86_64
2026-03-10T08:05:15.611 INFO:teuthology.orchestra.run.vm05.stdout: libarrow-9.0.0-15.el9.x86_64
2026-03-10T08:05:15.611 INFO:teuthology.orchestra.run.vm05.stdout: libarrow-doc-9.0.0-15.el9.noarch
2026-03-10T08:05:15.611 INFO:teuthology.orchestra.run.vm05.stdout: libpmemobj-1.12.1-1.el9.x86_64
2026-03-10T08:05:15.611 INFO:teuthology.orchestra.run.vm05.stdout: librabbitmq-0.11.0-7.el9.x86_64
2026-03-10T08:05:15.611 INFO:teuthology.orchestra.run.vm05.stdout: librados2-2:18.2.1-0.el9.x86_64
2026-03-10T08:05:15.611 INFO:teuthology.orchestra.run.vm05.stdout: librbd1-2:18.2.1-0.el9.x86_64
2026-03-10T08:05:15.612 INFO:teuthology.orchestra.run.vm05.stdout: librdkafka-1.6.1-102.el9.x86_64
2026-03-10T08:05:15.612 INFO:teuthology.orchestra.run.vm05.stdout: librgw2-2:18.2.1-0.el9.x86_64
2026-03-10T08:05:15.612 INFO:teuthology.orchestra.run.vm05.stdout: libunwind-1.6.2-1.el9.x86_64
2026-03-10T08:05:15.612 INFO:teuthology.orchestra.run.vm05.stdout: lttng-ust-2.12.0-6.el9.x86_64
2026-03-10T08:05:15.612 INFO:teuthology.orchestra.run.vm05.stdout: parquet-libs-9.0.0-15.el9.x86_64
2026-03-10T08:05:15.612 INFO:teuthology.orchestra.run.vm05.stdout: python3-rados-2:18.2.1-0.el9.x86_64
2026-03-10T08:05:15.612 INFO:teuthology.orchestra.run.vm05.stdout: python3-rbd-2:18.2.1-0.el9.x86_64
2026-03-10T08:05:15.612 INFO:teuthology.orchestra.run.vm05.stdout: python3-rgw-2:18.2.1-0.el9.x86_64
2026-03-10T08:05:15.612 INFO:teuthology.orchestra.run.vm05.stdout: qemu-kvm-block-rbd-17:10.1.0-15.el9.x86_64
2026-03-10T08:05:15.612 INFO:teuthology.orchestra.run.vm05.stdout: rbd-fuse-2:18.2.1-0.el9.x86_64
2026-03-10T08:05:15.612 INFO:teuthology.orchestra.run.vm05.stdout: rbd-nbd-2:18.2.1-0.el9.x86_64
2026-03-10T08:05:15.612 INFO:teuthology.orchestra.run.vm05.stdout: re2-1:20211101-20.el9.x86_64
2026-03-10T08:05:15.612 INFO:teuthology.orchestra.run.vm05.stdout: thrift-0.15.0-4.el9.x86_64
2026-03-10T08:05:15.612 INFO:teuthology.orchestra.run.vm05.stdout:
2026-03-10T08:05:15.612 INFO:teuthology.orchestra.run.vm05.stdout:Complete!
2026-03-10T08:05:15.669 INFO:teuthology.orchestra.run.vm08.stdout: Running scriptlet: librdkafka-1.6.1-102.el9.x86_64 21/21
2026-03-10T08:05:15.670 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : boost-program-options-1.75.0-13.el9.x86_64 1/21
2026-03-10T08:05:15.670 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : gperftools-libs-2.9.1-3.el9.x86_64 2/21
2026-03-10T08:05:15.670 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : libarrow-9.0.0-15.el9.x86_64 3/21
2026-03-10T08:05:15.670 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : libarrow-doc-9.0.0-15.el9.noarch 4/21
2026-03-10T08:05:15.670 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : libpmemobj-1.12.1-1.el9.x86_64 5/21
2026-03-10T08:05:15.670 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : librabbitmq-0.11.0-7.el9.x86_64 6/21
2026-03-10T08:05:15.670 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : librados2-2:18.2.1-0.el9.x86_64 7/21
2026-03-10T08:05:15.670 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : librbd1-2:18.2.1-0.el9.x86_64 8/21
2026-03-10T08:05:15.670 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : librdkafka-1.6.1-102.el9.x86_64 9/21
2026-03-10T08:05:15.670 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : librgw2-2:18.2.1-0.el9.x86_64 10/21
2026-03-10T08:05:15.670 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : libunwind-1.6.2-1.el9.x86_64 11/21
2026-03-10T08:05:15.670 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : lttng-ust-2.12.0-6.el9.x86_64 12/21
2026-03-10T08:05:15.670 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : parquet-libs-9.0.0-15.el9.x86_64 13/21
2026-03-10T08:05:15.670 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-rados-2:18.2.1-0.el9.x86_64 14/21
2026-03-10T08:05:15.670 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-rbd-2:18.2.1-0.el9.x86_64 15/21
2026-03-10T08:05:15.670 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : python3-rgw-2:18.2.1-0.el9.x86_64 16/21
2026-03-10T08:05:15.670 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : qemu-kvm-block-rbd-17:10.1.0-15.el9.x86_64 17/21
2026-03-10T08:05:15.670 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : rbd-fuse-2:18.2.1-0.el9.x86_64 18/21
2026-03-10T08:05:15.670 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : rbd-nbd-2:18.2.1-0.el9.x86_64 19/21
2026-03-10T08:05:15.670 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : re2-1:20211101-20.el9.x86_64 20/21
2026-03-10T08:05:15.719 INFO:teuthology.orchestra.run.vm08.stdout: Verifying : thrift-0.15.0-4.el9.x86_64 21/21
2026-03-10T08:05:15.719 INFO:teuthology.orchestra.run.vm08.stdout:
2026-03-10T08:05:15.719 INFO:teuthology.orchestra.run.vm08.stdout:Removed:
2026-03-10T08:05:15.719 INFO:teuthology.orchestra.run.vm08.stdout: boost-program-options-1.75.0-13.el9.x86_64
2026-03-10T08:05:15.719 INFO:teuthology.orchestra.run.vm08.stdout: gperftools-libs-2.9.1-3.el9.x86_64
2026-03-10T08:05:15.719 INFO:teuthology.orchestra.run.vm08.stdout: libarrow-9.0.0-15.el9.x86_64
2026-03-10T08:05:15.719 INFO:teuthology.orchestra.run.vm08.stdout: libarrow-doc-9.0.0-15.el9.noarch
2026-03-10T08:05:15.719 INFO:teuthology.orchestra.run.vm08.stdout: libpmemobj-1.12.1-1.el9.x86_64
2026-03-10T08:05:15.719 INFO:teuthology.orchestra.run.vm08.stdout: librabbitmq-0.11.0-7.el9.x86_64
2026-03-10T08:05:15.719
INFO:teuthology.orchestra.run.vm08.stdout: librados2-2:18.2.1-0.el9.x86_64 2026-03-10T08:05:15.719 INFO:teuthology.orchestra.run.vm08.stdout: librbd1-2:18.2.1-0.el9.x86_64 2026-03-10T08:05:15.719 INFO:teuthology.orchestra.run.vm08.stdout: librdkafka-1.6.1-102.el9.x86_64 2026-03-10T08:05:15.719 INFO:teuthology.orchestra.run.vm08.stdout: librgw2-2:18.2.1-0.el9.x86_64 2026-03-10T08:05:15.719 INFO:teuthology.orchestra.run.vm08.stdout: libunwind-1.6.2-1.el9.x86_64 2026-03-10T08:05:15.719 INFO:teuthology.orchestra.run.vm08.stdout: lttng-ust-2.12.0-6.el9.x86_64 2026-03-10T08:05:15.719 INFO:teuthology.orchestra.run.vm08.stdout: parquet-libs-9.0.0-15.el9.x86_64 2026-03-10T08:05:15.719 INFO:teuthology.orchestra.run.vm08.stdout: python3-rados-2:18.2.1-0.el9.x86_64 2026-03-10T08:05:15.719 INFO:teuthology.orchestra.run.vm08.stdout: python3-rbd-2:18.2.1-0.el9.x86_64 2026-03-10T08:05:15.719 INFO:teuthology.orchestra.run.vm08.stdout: python3-rgw-2:18.2.1-0.el9.x86_64 2026-03-10T08:05:15.719 INFO:teuthology.orchestra.run.vm08.stdout: qemu-kvm-block-rbd-17:10.1.0-15.el9.x86_64 2026-03-10T08:05:15.719 INFO:teuthology.orchestra.run.vm08.stdout: rbd-fuse-2:18.2.1-0.el9.x86_64 2026-03-10T08:05:15.719 INFO:teuthology.orchestra.run.vm08.stdout: rbd-nbd-2:18.2.1-0.el9.x86_64 2026-03-10T08:05:15.719 INFO:teuthology.orchestra.run.vm08.stdout: re2-1:20211101-20.el9.x86_64 2026-03-10T08:05:15.719 INFO:teuthology.orchestra.run.vm08.stdout: thrift-0.15.0-4.el9.x86_64 2026-03-10T08:05:15.719 INFO:teuthology.orchestra.run.vm08.stdout: 2026-03-10T08:05:15.719 INFO:teuthology.orchestra.run.vm08.stdout:Complete! 2026-03-10T08:05:15.813 INFO:teuthology.orchestra.run.vm05.stdout:No match for argument: librbd1 2026-03-10T08:05:15.813 INFO:teuthology.orchestra.run.vm05.stderr:No packages marked for removal. 2026-03-10T08:05:15.815 INFO:teuthology.orchestra.run.vm05.stdout:Dependencies resolved. 2026-03-10T08:05:15.816 INFO:teuthology.orchestra.run.vm05.stdout:Nothing to do. 
2026-03-10T08:05:15.816 INFO:teuthology.orchestra.run.vm05.stdout:Complete! 2026-03-10T08:05:15.940 INFO:teuthology.orchestra.run.vm08.stdout:No match for argument: librbd1 2026-03-10T08:05:15.940 INFO:teuthology.orchestra.run.vm08.stderr:No packages marked for removal. 2026-03-10T08:05:15.943 INFO:teuthology.orchestra.run.vm08.stdout:Dependencies resolved. 2026-03-10T08:05:15.944 INFO:teuthology.orchestra.run.vm08.stdout:Nothing to do. 2026-03-10T08:05:15.944 INFO:teuthology.orchestra.run.vm08.stdout:Complete! 2026-03-10T08:05:16.004 INFO:teuthology.orchestra.run.vm05.stdout:No match for argument: python3-rados 2026-03-10T08:05:16.004 INFO:teuthology.orchestra.run.vm05.stderr:No packages marked for removal. 2026-03-10T08:05:16.007 INFO:teuthology.orchestra.run.vm05.stdout:Dependencies resolved. 2026-03-10T08:05:16.008 INFO:teuthology.orchestra.run.vm05.stdout:Nothing to do. 2026-03-10T08:05:16.008 INFO:teuthology.orchestra.run.vm05.stdout:Complete! 2026-03-10T08:05:16.126 INFO:teuthology.orchestra.run.vm08.stdout:No match for argument: python3-rados 2026-03-10T08:05:16.126 INFO:teuthology.orchestra.run.vm08.stderr:No packages marked for removal. 2026-03-10T08:05:16.129 INFO:teuthology.orchestra.run.vm08.stdout:Dependencies resolved. 2026-03-10T08:05:16.129 INFO:teuthology.orchestra.run.vm08.stdout:Nothing to do. 2026-03-10T08:05:16.129 INFO:teuthology.orchestra.run.vm08.stdout:Complete! 2026-03-10T08:05:16.177 INFO:teuthology.orchestra.run.vm05.stdout:No match for argument: python3-rgw 2026-03-10T08:05:16.177 INFO:teuthology.orchestra.run.vm05.stderr:No packages marked for removal. 2026-03-10T08:05:16.180 INFO:teuthology.orchestra.run.vm05.stdout:Dependencies resolved. 2026-03-10T08:05:16.181 INFO:teuthology.orchestra.run.vm05.stdout:Nothing to do. 2026-03-10T08:05:16.181 INFO:teuthology.orchestra.run.vm05.stdout:Complete! 
2026-03-10T08:05:16.305 INFO:teuthology.orchestra.run.vm08.stdout:No match for argument: python3-rgw 2026-03-10T08:05:16.305 INFO:teuthology.orchestra.run.vm08.stderr:No packages marked for removal. 2026-03-10T08:05:16.308 INFO:teuthology.orchestra.run.vm08.stdout:Dependencies resolved. 2026-03-10T08:05:16.309 INFO:teuthology.orchestra.run.vm08.stdout:Nothing to do. 2026-03-10T08:05:16.309 INFO:teuthology.orchestra.run.vm08.stdout:Complete! 2026-03-10T08:05:16.355 INFO:teuthology.orchestra.run.vm05.stdout:No match for argument: python3-cephfs 2026-03-10T08:05:16.355 INFO:teuthology.orchestra.run.vm05.stderr:No packages marked for removal. 2026-03-10T08:05:16.358 INFO:teuthology.orchestra.run.vm05.stdout:Dependencies resolved. 2026-03-10T08:05:16.358 INFO:teuthology.orchestra.run.vm05.stdout:Nothing to do. 2026-03-10T08:05:16.358 INFO:teuthology.orchestra.run.vm05.stdout:Complete! 2026-03-10T08:05:16.483 INFO:teuthology.orchestra.run.vm08.stdout:No match for argument: python3-cephfs 2026-03-10T08:05:16.483 INFO:teuthology.orchestra.run.vm08.stderr:No packages marked for removal. 2026-03-10T08:05:16.486 INFO:teuthology.orchestra.run.vm08.stdout:Dependencies resolved. 2026-03-10T08:05:16.487 INFO:teuthology.orchestra.run.vm08.stdout:Nothing to do. 2026-03-10T08:05:16.487 INFO:teuthology.orchestra.run.vm08.stdout:Complete! 2026-03-10T08:05:16.524 INFO:teuthology.orchestra.run.vm05.stdout:No match for argument: python3-rbd 2026-03-10T08:05:16.524 INFO:teuthology.orchestra.run.vm05.stderr:No packages marked for removal. 2026-03-10T08:05:16.527 INFO:teuthology.orchestra.run.vm05.stdout:Dependencies resolved. 2026-03-10T08:05:16.528 INFO:teuthology.orchestra.run.vm05.stdout:Nothing to do. 2026-03-10T08:05:16.528 INFO:teuthology.orchestra.run.vm05.stdout:Complete! 2026-03-10T08:05:16.646 INFO:teuthology.orchestra.run.vm08.stdout:No match for argument: python3-rbd 2026-03-10T08:05:16.646 INFO:teuthology.orchestra.run.vm08.stderr:No packages marked for removal. 
2026-03-10T08:05:16.649 INFO:teuthology.orchestra.run.vm08.stdout:Dependencies resolved. 2026-03-10T08:05:16.649 INFO:teuthology.orchestra.run.vm08.stdout:Nothing to do. 2026-03-10T08:05:16.649 INFO:teuthology.orchestra.run.vm08.stdout:Complete! 2026-03-10T08:05:16.695 INFO:teuthology.orchestra.run.vm05.stdout:No match for argument: rbd-fuse 2026-03-10T08:05:16.695 INFO:teuthology.orchestra.run.vm05.stderr:No packages marked for removal. 2026-03-10T08:05:16.698 INFO:teuthology.orchestra.run.vm05.stdout:Dependencies resolved. 2026-03-10T08:05:16.699 INFO:teuthology.orchestra.run.vm05.stdout:Nothing to do. 2026-03-10T08:05:16.699 INFO:teuthology.orchestra.run.vm05.stdout:Complete! 2026-03-10T08:05:16.808 INFO:teuthology.orchestra.run.vm08.stdout:No match for argument: rbd-fuse 2026-03-10T08:05:16.808 INFO:teuthology.orchestra.run.vm08.stderr:No packages marked for removal. 2026-03-10T08:05:16.811 INFO:teuthology.orchestra.run.vm08.stdout:Dependencies resolved. 2026-03-10T08:05:16.812 INFO:teuthology.orchestra.run.vm08.stdout:Nothing to do. 2026-03-10T08:05:16.812 INFO:teuthology.orchestra.run.vm08.stdout:Complete! 2026-03-10T08:05:16.860 INFO:teuthology.orchestra.run.vm05.stdout:No match for argument: rbd-mirror 2026-03-10T08:05:16.860 INFO:teuthology.orchestra.run.vm05.stderr:No packages marked for removal. 2026-03-10T08:05:16.863 INFO:teuthology.orchestra.run.vm05.stdout:Dependencies resolved. 2026-03-10T08:05:16.864 INFO:teuthology.orchestra.run.vm05.stdout:Nothing to do. 2026-03-10T08:05:16.864 INFO:teuthology.orchestra.run.vm05.stdout:Complete! 2026-03-10T08:05:16.975 INFO:teuthology.orchestra.run.vm08.stdout:No match for argument: rbd-mirror 2026-03-10T08:05:16.975 INFO:teuthology.orchestra.run.vm08.stderr:No packages marked for removal. 2026-03-10T08:05:16.978 INFO:teuthology.orchestra.run.vm08.stdout:Dependencies resolved. 2026-03-10T08:05:16.979 INFO:teuthology.orchestra.run.vm08.stdout:Nothing to do. 
2026-03-10T08:05:16.979 INFO:teuthology.orchestra.run.vm08.stdout:Complete! 2026-03-10T08:05:17.039 INFO:teuthology.orchestra.run.vm05.stdout:No match for argument: rbd-nbd 2026-03-10T08:05:17.039 INFO:teuthology.orchestra.run.vm05.stderr:No packages marked for removal. 2026-03-10T08:05:17.042 INFO:teuthology.orchestra.run.vm05.stdout:Dependencies resolved. 2026-03-10T08:05:17.043 INFO:teuthology.orchestra.run.vm05.stdout:Nothing to do. 2026-03-10T08:05:17.043 INFO:teuthology.orchestra.run.vm05.stdout:Complete! 2026-03-10T08:05:17.065 DEBUG:teuthology.orchestra.run.vm05:> sudo yum clean all 2026-03-10T08:05:17.146 INFO:teuthology.orchestra.run.vm08.stdout:No match for argument: rbd-nbd 2026-03-10T08:05:17.146 INFO:teuthology.orchestra.run.vm08.stderr:No packages marked for removal. 2026-03-10T08:05:17.149 INFO:teuthology.orchestra.run.vm08.stdout:Dependencies resolved. 2026-03-10T08:05:17.150 INFO:teuthology.orchestra.run.vm08.stdout:Nothing to do. 2026-03-10T08:05:17.150 INFO:teuthology.orchestra.run.vm08.stdout:Complete! 
2026-03-10T08:05:17.170 DEBUG:teuthology.orchestra.run.vm08:> sudo yum clean all 2026-03-10T08:05:17.188 INFO:teuthology.orchestra.run.vm05.stdout:56 files removed 2026-03-10T08:05:17.210 DEBUG:teuthology.orchestra.run.vm05:> sudo rm -f /etc/yum.repos.d/ceph.repo 2026-03-10T08:05:17.234 DEBUG:teuthology.orchestra.run.vm05:> sudo yum clean expire-cache 2026-03-10T08:05:17.305 INFO:teuthology.orchestra.run.vm08.stdout:56 files removed 2026-03-10T08:05:17.331 DEBUG:teuthology.orchestra.run.vm08:> sudo rm -f /etc/yum.repos.d/ceph.repo 2026-03-10T08:05:17.355 DEBUG:teuthology.orchestra.run.vm08:> sudo yum clean expire-cache 2026-03-10T08:05:17.393 INFO:teuthology.orchestra.run.vm05.stdout:Cache was expired 2026-03-10T08:05:17.393 INFO:teuthology.orchestra.run.vm05.stdout:0 files removed 2026-03-10T08:05:17.416 DEBUG:teuthology.parallel:result is None 2026-03-10T08:05:17.512 INFO:teuthology.orchestra.run.vm08.stdout:Cache was expired 2026-03-10T08:05:17.513 INFO:teuthology.orchestra.run.vm08.stdout:0 files removed 2026-03-10T08:05:17.533 DEBUG:teuthology.parallel:result is None 2026-03-10T08:05:17.533 INFO:teuthology.task.install:Removing ceph sources lists on ubuntu@vm05.local 2026-03-10T08:05:17.533 INFO:teuthology.task.install:Removing ceph sources lists on ubuntu@vm08.local 2026-03-10T08:05:17.533 DEBUG:teuthology.orchestra.run.vm05:> sudo rm -f /etc/yum.repos.d/ceph.repo 2026-03-10T08:05:17.533 DEBUG:teuthology.orchestra.run.vm08:> sudo rm -f /etc/yum.repos.d/ceph.repo 2026-03-10T08:05:17.557 DEBUG:teuthology.orchestra.run.vm05:> sudo mv -f /etc/yum/pluginconf.d/priorities.conf.orig /etc/yum/pluginconf.d/priorities.conf 2026-03-10T08:05:17.560 DEBUG:teuthology.orchestra.run.vm08:> sudo mv -f /etc/yum/pluginconf.d/priorities.conf.orig /etc/yum/pluginconf.d/priorities.conf 2026-03-10T08:05:17.620 DEBUG:teuthology.parallel:result is None 2026-03-10T08:05:17.625 DEBUG:teuthology.parallel:result is None 2026-03-10T08:05:17.625 DEBUG:teuthology.run_tasks:Unwinding manager 
clock 2026-03-10T08:05:17.628 INFO:teuthology.task.clock:Checking final clock skew... 2026-03-10T08:05:17.628 DEBUG:teuthology.orchestra.run.vm05:> PATH=/usr/bin:/usr/sbin ntpq -p || PATH=/usr/bin:/usr/sbin chronyc sources || true 2026-03-10T08:05:17.662 DEBUG:teuthology.orchestra.run.vm08:> PATH=/usr/bin:/usr/sbin ntpq -p || PATH=/usr/bin:/usr/sbin chronyc sources || true 2026-03-10T08:05:17.674 INFO:teuthology.orchestra.run.vm05.stderr:bash: line 1: ntpq: command not found 2026-03-10T08:05:17.679 INFO:teuthology.orchestra.run.vm08.stderr:bash: line 1: ntpq: command not found 2026-03-10T08:05:17.786 INFO:teuthology.orchestra.run.vm08.stdout:MS Name/IP address Stratum Poll Reach LastRx Last sample 2026-03-10T08:05:17.786 INFO:teuthology.orchestra.run.vm08.stdout:=============================================================================== 2026-03-10T08:05:17.786 INFO:teuthology.orchestra.run.vm08.stdout:^+ server1a.meinberg.de 2 7 377 30 -99us[ -99us] +/- 56ms 2026-03-10T08:05:17.786 INFO:teuthology.orchestra.run.vm08.stdout:^+ time2.sebhosting.de 2 6 377 30 -644us[ -644us] +/- 16ms 2026-03-10T08:05:17.786 INFO:teuthology.orchestra.run.vm08.stdout:^- rdbl.cntx.de 2 7 377 34 +4670us[+4632us] +/- 51ms 2026-03-10T08:05:17.786 INFO:teuthology.orchestra.run.vm08.stdout:^* ntp0.rrze.uni-erlangen.de 1 7 377 30 +831us[ +793us] +/- 14ms 2026-03-10T08:05:17.786 INFO:teuthology.orchestra.run.vm05.stdout:MS Name/IP address Stratum Poll Reach LastRx Last sample 2026-03-10T08:05:17.786 INFO:teuthology.orchestra.run.vm05.stdout:=============================================================================== 2026-03-10T08:05:17.786 INFO:teuthology.orchestra.run.vm05.stdout:^+ time2.sebhosting.de 2 7 377 32 -642us[ -642us] +/- 16ms 2026-03-10T08:05:17.786 INFO:teuthology.orchestra.run.vm05.stdout:^- rdbl.cntx.de 2 7 377 33 +1732us[+1732us] +/- 50ms 2026-03-10T08:05:17.786 INFO:teuthology.orchestra.run.vm05.stdout:^* ntp0.rrze.uni-erlangen.de 1 7 377 34 +990us[ +985us] +/- 14ms 
2026-03-10T08:05:17.786 INFO:teuthology.orchestra.run.vm05.stdout:^+ server1a.meinberg.de 2 7 377 38 -56us[ -62us] +/- 56ms 2026-03-10T08:05:17.788 DEBUG:teuthology.run_tasks:Unwinding manager ansible.cephlab 2026-03-10T08:05:17.791 INFO:teuthology.task.ansible:Skipping ansible cleanup... 2026-03-10T08:05:17.791 DEBUG:teuthology.run_tasks:Unwinding manager selinux 2026-03-10T08:05:17.794 DEBUG:teuthology.run_tasks:Unwinding manager pcp 2026-03-10T08:05:17.797 DEBUG:teuthology.run_tasks:Unwinding manager internal.timer 2026-03-10T08:05:17.800 INFO:teuthology.task.internal:Duration was 1401.184036 seconds 2026-03-10T08:05:17.801 DEBUG:teuthology.run_tasks:Unwinding manager internal.syslog 2026-03-10T08:05:17.803 INFO:teuthology.task.internal.syslog:Shutting down syslog monitoring... 2026-03-10T08:05:17.804 DEBUG:teuthology.orchestra.run.vm05:> sudo rm -f -- /etc/rsyslog.d/80-cephtest.conf && sudo service rsyslog restart 2026-03-10T08:05:17.830 DEBUG:teuthology.orchestra.run.vm08:> sudo rm -f -- /etc/rsyslog.d/80-cephtest.conf && sudo service rsyslog restart 2026-03-10T08:05:17.873 INFO:teuthology.orchestra.run.vm05.stderr:Redirecting to /bin/systemctl restart rsyslog.service 2026-03-10T08:05:17.875 INFO:teuthology.orchestra.run.vm08.stderr:Redirecting to /bin/systemctl restart rsyslog.service 2026-03-10T08:05:18.189 INFO:teuthology.task.internal.syslog:Checking logs for errors... 
2026-03-10T08:05:18.189 DEBUG:teuthology.task.internal.syslog:Checking ubuntu@vm05.local 2026-03-10T08:05:18.189 DEBUG:teuthology.orchestra.run.vm05:> grep -E --binary-files=text '\bBUG\b|\bINFO\b|\bDEADLOCK\b' /home/ubuntu/cephtest/archive/syslog/kern.log | grep -v 'task .* blocked for more than .* seconds' | grep -v 'lockdep is turned off' | grep -v 'trying to register non-static key' | grep -v 'DEBUG: fsize' | grep -v CRON | grep -v 'BUG: bad unlock balance detected' | grep -v 'inconsistent lock state' | grep -v '*** DEADLOCK ***' | grep -v 'INFO: possible irq lock inversion dependency detected' | grep -v 'INFO: NMI handler (perf_event_nmi_handler) took too long to run' | grep -v 'INFO: recovery required on readonly' | grep -v 'ceph-create-keys: INFO' | grep -v INFO:ceph-create-keys | grep -v 'Loaded datasource DataSourceOpenStack' | grep -v 'container-storage-setup: INFO: Volume group backing root filesystem could not be determined' | grep -E -v '\bsalt-master\b|\bsalt-minion\b|\bsalt-api\b' | grep -v ceph-crash | grep -E -v '\btcmu-runner\b.*\bINFO\b' | head -n 1 2026-03-10T08:05:18.213 DEBUG:teuthology.task.internal.syslog:Checking ubuntu@vm08.local 2026-03-10T08:05:18.213 DEBUG:teuthology.orchestra.run.vm08:> grep -E --binary-files=text '\bBUG\b|\bINFO\b|\bDEADLOCK\b' /home/ubuntu/cephtest/archive/syslog/kern.log | grep -v 'task .* blocked for more than .* seconds' | grep -v 'lockdep is turned off' | grep -v 'trying to register non-static key' | grep -v 'DEBUG: fsize' | grep -v CRON | grep -v 'BUG: bad unlock balance detected' | grep -v 'inconsistent lock state' | grep -v '*** DEADLOCK ***' | grep -v 'INFO: possible irq lock inversion dependency detected' | grep -v 'INFO: NMI handler (perf_event_nmi_handler) took too long to run' | grep -v 'INFO: recovery required on readonly' | grep -v 'ceph-create-keys: INFO' | grep -v INFO:ceph-create-keys | grep -v 'Loaded datasource DataSourceOpenStack' | grep -v 'container-storage-setup: INFO: Volume group backing root 
filesystem could not be determined' | grep -E -v '\bsalt-master\b|\bsalt-minion\b|\bsalt-api\b' | grep -v ceph-crash | grep -E -v '\btcmu-runner\b.*\bINFO\b' | head -n 1 2026-03-10T08:05:18.243 INFO:teuthology.task.internal.syslog:Gathering journactl... 2026-03-10T08:05:18.243 DEBUG:teuthology.orchestra.run.vm05:> sudo journalctl > /home/ubuntu/cephtest/archive/syslog/journalctl.log 2026-03-10T08:05:18.255 DEBUG:teuthology.orchestra.run.vm08:> sudo journalctl > /home/ubuntu/cephtest/archive/syslog/journalctl.log 2026-03-10T08:05:18.972 INFO:teuthology.task.internal.syslog:Compressing syslogs... 2026-03-10T08:05:18.973 DEBUG:teuthology.orchestra.run.vm05:> find /home/ubuntu/cephtest/archive/syslog -name '*.log' -print0 | sudo xargs -0 --max-args=1 --max-procs=0 --verbose --no-run-if-empty -- gzip -5 --verbose -- 2026-03-10T08:05:18.974 DEBUG:teuthology.orchestra.run.vm08:> find /home/ubuntu/cephtest/archive/syslog -name '*.log' -print0 | sudo xargs -0 --max-args=1 --max-procs=0 --verbose --no-run-if-empty -- gzip -5 --verbose -- 2026-03-10T08:05:18.999 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /home/ubuntu/cephtest/archive/syslog/kern.log 2026-03-10T08:05:18.999 INFO:teuthology.orchestra.run.vm05.stderr:gzip -5 --verbose -- /home/ubuntu/cephtest/archive/syslog/misc.log 2026-03-10T08:05:19.000 INFO:teuthology.orchestra.run.vm05.stderr:/home/ubuntu/cephtest/archive/syslog/kern.log: gzip 0.0% -- replaced with /home/ubuntu/cephtest/archive/syslog/kern.log.gz 2026-03-10T08:05:19.000 INFO:teuthology.orchestra.run.vm05.stderr: -5 --verbose -- /home/ubuntu/cephtest/archive/syslog/journalctl.log 2026-03-10T08:05:19.000 INFO:teuthology.orchestra.run.vm05.stderr:/home/ubuntu/cephtest/archive/syslog/misc.log: 0.0% -- replaced with /home/ubuntu/cephtest/archive/syslog/misc.log.gz 2026-03-10T08:05:19.000 INFO:teuthology.orchestra.run.vm08.stderr:gzip -5 --verbose -- /home/ubuntu/cephtest/archive/syslog/kern.log 2026-03-10T08:05:19.000 
INFO:teuthology.orchestra.run.vm08.stderr:gzip -5 --verbose -- /home/ubuntu/cephtest/archive/syslog/misc.log 2026-03-10T08:05:19.000 INFO:teuthology.orchestra.run.vm08.stderr:/home/ubuntu/cephtest/archive/syslog/kern.log: 0.0% -- replaced with /home/ubuntu/cephtest/archive/syslog/kern.log.gz 2026-03-10T08:05:19.001 INFO:teuthology.orchestra.run.vm08.stderr:gzip -5 --verbose -- /home/ubuntu/cephtest/archive/syslog/journalctl.log 2026-03-10T08:05:19.001 INFO:teuthology.orchestra.run.vm08.stderr:/home/ubuntu/cephtest/archive/syslog/misc.log: 0.0% -- replaced with /home/ubuntu/cephtest/archive/syslog/misc.log.gz 2026-03-10T08:05:19.167 INFO:teuthology.orchestra.run.vm08.stderr:/home/ubuntu/cephtest/archive/syslog/journalctl.log: 98.0% -- replaced with /home/ubuntu/cephtest/archive/syslog/journalctl.log.gz 2026-03-10T08:05:19.175 INFO:teuthology.orchestra.run.vm05.stderr:/home/ubuntu/cephtest/archive/syslog/journalctl.log: 96.9% -- replaced with /home/ubuntu/cephtest/archive/syslog/journalctl.log.gz 2026-03-10T08:05:19.176 DEBUG:teuthology.run_tasks:Unwinding manager internal.sudo 2026-03-10T08:05:19.179 INFO:teuthology.task.internal:Restoring /etc/sudoers... 
2026-03-10T08:05:19.180 DEBUG:teuthology.orchestra.run.vm05:> sudo mv -f /etc/sudoers.orig.teuthology /etc/sudoers 2026-03-10T08:05:19.244 DEBUG:teuthology.orchestra.run.vm08:> sudo mv -f /etc/sudoers.orig.teuthology /etc/sudoers 2026-03-10T08:05:19.268 DEBUG:teuthology.run_tasks:Unwinding manager internal.coredump 2026-03-10T08:05:19.272 DEBUG:teuthology.orchestra.run.vm05:> sudo sysctl -w kernel.core_pattern=core && sudo bash -c 'for f in `find /home/ubuntu/cephtest/archive/coredump -type f`; do file $f | grep -q systemd-sysusers && rm $f || true ; done' && rmdir --ignore-fail-on-non-empty -- /home/ubuntu/cephtest/archive/coredump 2026-03-10T08:05:19.286 DEBUG:teuthology.orchestra.run.vm08:> sudo sysctl -w kernel.core_pattern=core && sudo bash -c 'for f in `find /home/ubuntu/cephtest/archive/coredump -type f`; do file $f | grep -q systemd-sysusers && rm $f || true ; done' && rmdir --ignore-fail-on-non-empty -- /home/ubuntu/cephtest/archive/coredump 2026-03-10T08:05:19.312 INFO:teuthology.orchestra.run.vm05.stdout:kernel.core_pattern = core 2026-03-10T08:05:19.334 INFO:teuthology.orchestra.run.vm08.stdout:kernel.core_pattern = core 2026-03-10T08:05:19.350 DEBUG:teuthology.orchestra.run.vm05:> test -e /home/ubuntu/cephtest/archive/coredump 2026-03-10T08:05:19.384 DEBUG:teuthology.orchestra.run:got remote process result: 1 2026-03-10T08:05:19.385 DEBUG:teuthology.orchestra.run.vm08:> test -e /home/ubuntu/cephtest/archive/coredump 2026-03-10T08:05:19.404 DEBUG:teuthology.orchestra.run:got remote process result: 1 2026-03-10T08:05:19.405 DEBUG:teuthology.run_tasks:Unwinding manager internal.archive 2026-03-10T08:05:19.409 INFO:teuthology.task.internal:Transferring archived files... 
2026-03-10T08:05:19.409 DEBUG:teuthology.misc:Transferring archived files from vm05:/home/ubuntu/cephtest/archive to /archive/kyr-2026-03-10_01:00:38-orch-squid-none-default-vps/954/remote/vm05 2026-03-10T08:05:19.409 DEBUG:teuthology.orchestra.run.vm05:> sudo tar c -f - -C /home/ubuntu/cephtest/archive -- . 2026-03-10T08:05:19.456 DEBUG:teuthology.misc:Transferring archived files from vm08:/home/ubuntu/cephtest/archive to /archive/kyr-2026-03-10_01:00:38-orch-squid-none-default-vps/954/remote/vm08 2026-03-10T08:05:19.456 DEBUG:teuthology.orchestra.run.vm08:> sudo tar c -f - -C /home/ubuntu/cephtest/archive -- . 2026-03-10T08:05:19.485 INFO:teuthology.task.internal:Removing archive directory... 2026-03-10T08:05:19.485 DEBUG:teuthology.orchestra.run.vm05:> rm -rf -- /home/ubuntu/cephtest/archive 2026-03-10T08:05:19.496 DEBUG:teuthology.orchestra.run.vm08:> rm -rf -- /home/ubuntu/cephtest/archive 2026-03-10T08:05:19.541 DEBUG:teuthology.run_tasks:Unwinding manager internal.archive_upload 2026-03-10T08:05:19.545 INFO:teuthology.task.internal:Not uploading archives. 2026-03-10T08:05:19.545 DEBUG:teuthology.run_tasks:Unwinding manager internal.base 2026-03-10T08:05:19.548 INFO:teuthology.task.internal:Tidying up after the test... 
2026-03-10T08:05:19.548 DEBUG:teuthology.orchestra.run.vm05:> find /home/ubuntu/cephtest -ls ; rmdir -- /home/ubuntu/cephtest 2026-03-10T08:05:19.551 DEBUG:teuthology.orchestra.run.vm08:> find /home/ubuntu/cephtest -ls ; rmdir -- /home/ubuntu/cephtest 2026-03-10T08:05:19.567 INFO:teuthology.orchestra.run.vm05.stdout: 8532146 0 drwxr-xr-x 3 ubuntu ubuntu 19 Mar 10 08:05 /home/ubuntu/cephtest 2026-03-10T08:05:19.567 INFO:teuthology.orchestra.run.vm05.stdout: 84067799 0 d--------- 2 ubuntu ubuntu 6 Mar 10 07:48 /home/ubuntu/cephtest/mnt.0 2026-03-10T08:05:19.567 INFO:teuthology.orchestra.run.vm05.stderr:find: ‘/home/ubuntu/cephtest/mnt.0’: Permission denied 2026-03-10T08:05:19.568 INFO:teuthology.orchestra.run.vm05.stderr:rmdir: failed to remove '/home/ubuntu/cephtest': Directory not empty 2026-03-10T08:05:19.583 DEBUG:teuthology.orchestra.run:got remote process result: 1 2026-03-10T08:05:19.583 ERROR:teuthology.run_tasks:Manager failed: internal.base Traceback (most recent call last): File "/home/teuthos/teuthology/teuthology/task/internal/__init__.py", line 48, in base yield File "/home/teuthos/teuthology/teuthology/run_tasks.py", line 160, in run_tasks suppress = manager.__exit__(*exc_info) File "/home/teuthos/.local/share/uv/python/cpython-3.10.19-linux-x86_64-gnu/lib/python3.10/contextlib.py", line 142, in __exit__ next(self.gen) File "/home/teuthos/src/github.com_kshtsk_ceph_75a68fd8ca3f918fe9466b4c0bb385b7fc260a9b/qa/tasks/ceph_fuse.py", line 248, in task mount.umount_wait() File "/home/teuthos/src/github.com_kshtsk_ceph_75a68fd8ca3f918fe9466b4c0bb385b7fc260a9b/qa/tasks/cephfs/fuse_mount.py", line 403, in umount_wait run.wait([self.fuse_daemon], timeout) File "/home/teuthos/teuthology/teuthology/orchestra/run.py", line 479, in wait check_time() File "/home/teuthos/teuthology/teuthology/contextutil.py", line 134, in __call__ raise MaxWhileTries(error_msg) teuthology.exceptions.MaxWhileTries: reached maximum tries (50) after waiting for 300 seconds During 
handling of the above exception, another exception occurred: Traceback (most recent call last): File "/home/teuthos/teuthology/teuthology/run_tasks.py", line 160, in run_tasks suppress = manager.__exit__(*exc_info) File "/home/teuthos/.local/share/uv/python/cpython-3.10.19-linux-x86_64-gnu/lib/python3.10/contextlib.py", line 153, in __exit__ self.gen.throw(typ, value, traceback) File "/home/teuthos/teuthology/teuthology/task/internal/__init__.py", line 53, in base run.wait( File "/home/teuthos/teuthology/teuthology/orchestra/run.py", line 485, in wait proc.wait() File "/home/teuthos/teuthology/teuthology/orchestra/run.py", line 161, in wait self._raise_for_status() File "/home/teuthos/teuthology/teuthology/orchestra/run.py", line 181, in _raise_for_status raise CommandFailedError( teuthology.exceptions.CommandFailedError: Command failed on vm05 with status 1: 'find /home/ubuntu/cephtest -ls ; rmdir -- /home/ubuntu/cephtest' 2026-03-10T08:05:19.583 DEBUG:teuthology.run_tasks:Unwinding manager console_log 2026-03-10T08:05:19.587 DEBUG:teuthology.run_tasks:Exception was not quenched, exiting: MaxWhileTries: reached maximum tries (50) after waiting for 300 seconds 2026-03-10T08:05:19.589 INFO:teuthology.run:Summary data: description: orch/cephadm/mds_upgrade_sequence/{bluestore-bitmap centos_9.stream conf/{client mds mgr mon osd} fail_fs/yes kernel overrides/{ignorelist_health ignorelist_upgrade ignorelist_wrongly_marked_down pg-warn pg_health syntax} roles tasks/{0-from/reef/{v18.2.1} 1-volume/{0-create 1-ranks/1 2-allow_standby_replay/no 3-inline/yes 4-verify} 2-client/fuse 3-upgrade-mgr-staggered 4-config-upgrade/{fail_fs} 5-upgrade-with-workload 6-verify}} duration: 1401.1840362548828 failure_reason: reached maximum tries (50) after waiting for 300 seconds flavor: default owner: kyr status: fail success: false 2026-03-10T08:05:19.589 DEBUG:teuthology.report:Pushing job info to http://localhost:8080 2026-03-10T08:05:19.617 INFO:teuthology.run:FAIL